WorldWideScience

Sample records for rate testing methodology

  1. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    Science.gov (United States)

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2017-09-26

    Standard methodologies of heart rate variability analysis and physiological interpretation as a marker of autonomic nervous system condition have been largely published at rest, but not so much during exercise. A methodological framework for heart rate variability (HRV) analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; where the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates obtained with the proposed methodology differ substantially from those obtained with the standard fixed band: for medium and high exercise levels and during recovery, HF power is 20 to 40% higher. When cycling, HF power increases around 40% with respect to running, while CC power is around 20% stronger in running.
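
    The core of such an analysis is integrating spectral power in a high-frequency band that tracks the respiratory rate rather than the fixed 0.15-0.4 Hz band. Below is a minimal sketch of that comparison, assuming an evenly resampled RR-interval (tachogram) signal and a known respiratory frequency; the variable names and the 0.125 Hz band half-width are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from scipy.signal import welch

def band_power(f, pxx, f_lo, f_hi):
    """Integrate the PSD between f_lo and f_hi (trapezoidal rule)."""
    mask = (f >= f_lo) & (f <= f_hi)
    return np.trapz(pxx[mask], f[mask])

def hf_power(rr_resampled, fs, resp_freq=None, half_width=0.125):
    """HF power of an evenly resampled RR series (s), sampled at fs Hz.

    If resp_freq (Hz) is given, the HF band is centred on it;
    otherwise the standard fixed 0.15-0.4 Hz band is used.
    """
    f, pxx = welch(rr_resampled - rr_resampled.mean(), fs=fs,
                   nperseg=min(256, len(rr_resampled)))
    if resp_freq is None:                       # standard fixed band
        return band_power(f, pxx, 0.15, 0.40)
    return band_power(f, pxx, resp_freq - half_width, resp_freq + half_width)

# toy example: 4 Hz resampled tachogram with a respiratory component at 0.55 Hz,
# i.e. above the fixed HF band, as happens during exercise
fs = 4.0
t = np.arange(0, 120, 1 / fs)
rr = 0.45 + 0.01 * np.sin(2 * np.pi * 0.55 * t) + 0.002 * np.random.randn(t.size)
print("fixed band   :", hf_power(rr, fs))
print("resp-centred :", hf_power(rr, fs, resp_freq=0.55))
```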

  2. Single Event Test Methodologies and System Error Rate Analysis for Triple Modular Redundant Field Programmable Gate Arrays

    Science.gov (United States)

    Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael

    2010-01-01

    We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upset (MBU) are also discussed.
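
    The qualitative idea behind such a model is that a TMR-protected design fails only when configuration upsets accumulate in more than one redundant copy of the same logic between repair (scrub) cycles. The sketch below is an illustrative two-of-three upset probability estimate, not the paper's model; the bit counts, per-bit cross section, and scrub interval are invented parameters.

```python
import math

def domain_upset_mean(flux, xs_bit, bits_per_domain, scrub_interval_s):
    """Expected upsets in one TMR replica during one scrub interval (Poisson mean)."""
    return flux * xs_bit * bits_per_domain * scrub_interval_s

def tmr_failure_rate(flux, xs_bit, bits_per_domain, n_domains, scrub_interval_s):
    """Rough system failure rate: a failure needs upsets in >= 2 of the 3
    replicas of the same domain within one scrub interval."""
    lam = domain_upset_mean(flux, xs_bit, bits_per_domain, scrub_interval_s)
    p_one = 1.0 - math.exp(-lam)                 # P(at least one upset in a replica)
    p_two_of_three = 3 * p_one**2 * (1 - p_one) + p_one**3
    return n_domains * p_two_of_three / scrub_interval_s   # failures per second

# invented numbers, for illustration only
print(tmr_failure_rate(flux=1e5,            # particles / cm^2 / s (beam)
                       xs_bit=1e-14,        # cm^2 per configuration bit
                       bits_per_domain=3e4,
                       n_domains=100,
                       scrub_interval_s=0.01))
```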

  3. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, R.; Harrell, J.

    1996-12-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

  4. Test reactor risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks associated with a loss of primary coolant flow in the Engineering Test Reactor.

  5. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  6. NREL module energy rating methodology

    Energy Technology Data Exchange (ETDEWEB)

    Whitaker, C.; Newmiller, J.; Kroposki, B. [National Renewable Energy Laboratory, Golden, CO (United States)] [and others]

    1995-11-01

    The goals of this project were to develop a tool for evaluating one module in different climates and for comparing different modules; to provide a Q&D method for estimating periodic energy production; to provide an achievable module rating; to provide an incentive for manufacturers to optimize modules for non-STC conditions; and to conduct a consensus-based, NREL-sponsored activity. The approach taken was to simulate module energy for five reference days of various weather conditions. A performance model was developed.

  7. Heart rate recovery: Short review of methodology

    Directory of Open Access Journals (Sweden)

    Đurić Biljana

    2016-01-01

    Determination of the heart rate recovery (HRR) after a session of physical activity represents a valuable parameter for the investigation of autonomic balance and its dynamics, both in the general population and in the population of elite athletes. However, the methodology for its determination and analysis is still not entirely specified. It is necessary to define an adequate protocol for the cardiopulmonary exercise test by choosing an adequate ergometer (treadmill, ergo-bicycle or step bench). Organization of the recovery period (active or passive) after the exercise session is also very important, because its protocol interferes significantly with the value of HRR. Interpretation of the obtained HRR values varies a lot, and the researcher has freedom to choose the most adequate way, in accordance with the objectives of the study. The following paper presents a short review of the determination, interpretation and analysis of HRR, followed by the latest recommendations.
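
    In practice HRR is usually reported as the drop in heart rate from the value at exercise cessation to the value a fixed time into recovery (commonly 60 s). A minimal sketch, assuming a beat-wise heart-rate series with time stamps, is given below; the 60 s window is simply the most common convention, not a recommendation from the review.

```python
import numpy as np

def heart_rate_recovery(t_s, hr_bpm, t_end_exercise_s, delay_s=60.0):
    """HRR = HR at end of exercise minus HR `delay_s` seconds into recovery."""
    t = np.asarray(t_s)
    hr = np.asarray(hr_bpm)
    hr_peak = hr[np.argmin(np.abs(t - t_end_exercise_s))]
    hr_recovery = hr[np.argmin(np.abs(t - (t_end_exercise_s + delay_s)))]
    return hr_peak - hr_recovery

# toy example: exponential-looking recovery from 180 bpm towards 100 bpm
t = np.arange(0, 300, 1.0)
hr = np.where(t < 120, 180.0, 100.0 + 80.0 * np.exp(-(t - 120) / 80.0))
print("HRR60 =", round(heart_rate_recovery(t, hr, t_end_exercise_s=120), 1), "bpm")
```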

  8. Diesel injector fouling bench test methodology

    Science.gov (United States)

    Stavinoha, Leon L.; Yost, Douglas M.; Lestz, Sidney J.

    1992-06-01

    Compared to conventional compression ignition (CI) engine operation with the fuel being delivered at approximately 149 C (300 F), adiabatic engine operation potentially may deliver the fuel at temperatures as high as 260 C (500 F). Hypergolic CI engine combustion systems now in theoretical design stages will deliver fuel at temperatures approaching 427 to 538 C (800 to 1000 F). The ability of a fuel to resist formation of deposits on internal injector system surfaces is a form of thermal oxidative stability for which test methodology will be required. The Injector Fouling Bench Test (IFBT) methodology evaluated in this report will assist in defining fuel contribution to injector fouling and control of fuel thermal stability in procurement specifications. The major observations from this project are discussed. Forty-hour cyclic IFBT tests employing both Bosch APE 113 and Detroit Diesel (DD) N70 injectors are viable procedures for evaluating fuel effects on injector fouling. Cyclic operation appears to be superior to steady-state operation for both types of injectors. Eighty-hour cyclic tests are more discriminating than 40-hour cyclic tests using the Bosch APE 113 injectors. JFTOT tests of fuels provide directional information on thermal stability-related deposits and filter plugging but show only limited correlation with IFBT DD N70 ratings, and none with IFBT Bosch APE 113 injector ratings. Deposition on injector pintles was more realistically rated by optical microscopy and Scanning Electron Microscopy (SEM) than by conventional visual and bench rating methods. High-sulfur fuel readily caused sticking of Detroit Diesel injectors. Injector sticking is an important mode of injector fouling.

  9. SOC Testing Methodology and Practice

    CERN Document Server

    Wu, Cheng-Wen

    2011-01-01

    On a commercial digital still camera (DSC) controller chip we apply a novel SOC test integration platform, solving real problems in test scheduling, test IO reduction, timing of functional test, scan IO sharing, embedded memory built-in self-test (BIST), etc. The chip has been fabricated and tested successfully by our approach. Test results show that low test integration cost, short test time, and small area overhead can be achieved. To support SOC testing, a memory BIST compiler and an SOC testing integration system have been developed.

  10. IRST testing methodologies: Maritime Infrared Background Simulator

    NARCIS (Netherlands)

    Schwering, P.B.W.

    2006-01-01

    In this paper we discuss methodologies to incorporate the effects of environments and scenarios in the testing of IRST systems. The proposed methodology is based on experience with sea based IRST trials combining the possibilities of performance assessment in required scenarios to the real

  11. Methodology of oral sensory tests.

    Science.gov (United States)

    Jacobs, R; Wu, C-H; Van Loven, K; Desnyder, M; Kolenaar, B; Van Steenberghed, D

    2002-08-01

    Different methods of oral sensory tests including light touch sensation, two-point discrimination, vibrotactile function and thermal sensation were compared. Healthy subjects were tested to assess the results obtained from two psychophysical approaches, namely the staircase and the ascending & descending method of limits for light touch sensation and two-point discrimination. Both methods appeared to be reliable for examining oral sensory function. The effect of topical anaesthesia was also evaluated but no conclusion could be drawn as too few subjects were involved. Newly developed simple testing tools for two-point discrimination and thermal sensation in a clinical situation were developed prior to this study and tested for their reproducibility. Thermal sensation could be reliably detected in repeated trials. Although the hand-held instruments have some drawbacks, the outcome of these instruments in a clinical environment is suitable for assessing oral sensory function. Three different frequencies (32, 128 and 256 Hz) were used to estimate the vibrotactile function. Different threshold levels were found at different frequencies.

  12. Green Methodologies to Test Hydrocarbon Reservoirs

    Directory of Open Access Journals (Sweden)

    Francesca Verga

    2010-01-01

    Problem statement: The definition and the economic viability of the best development strategy of a hydrocarbon reservoir mainly depend on the quantity and type of fluids and on the well productivity. Well testing, consisting in producing hydrocarbon to the surface while measuring the pressure variations induced in the reservoir, has been used for decades to determine the fluid nature and well potential. In exploration and appraisal scenarios the hydrocarbons produced during a test are flared, contributing to the emissions of greenhouse gases. Approach: Due to more stringent environmental regulations and a general need for reduced operating expenses, the current industry drivers in today’s formation evaluation methodologies demand short, safe, cost-effective and environmentally friendly test procedures, especially when conventional tests are prohibitively expensive, logistically not feasible or no surface emissions are allowed. Different methods have been proposed or resuscitated in the last years, such as wireline formation tests, closed chamber tests, production/reinjection tests and injection tests, as viable alternatives to conventional well testing. Results: While various short-term tests, test procedures and interpretation methods are apparently available for conducting successful tests without hydrocarbon production at the surface, clarity is lacking for specific applications of these techniques. An attempt to clarify the advantages and limitations of each methodology, particularly with respect to the main testing target, is pursued in this study. Specific insight is provided on injection testing, which is one of the most promising methodologies to replace traditional well testing in reservoir characterization, except for the possibility to sample the formation fluids. Conclusion/Recommendations: Not a single method but a combination of methodologies, in particular injection testing and wireline formation testing, is the most promising

  13. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  14. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
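
    The essence of such an energy rating is summing modelled module power over the hourly weather profile of each reference day. The sketch below uses a deliberately simple power model (linear in irradiance with a temperature coefficient); the reference-day data and coefficients are placeholders and do not represent NREL's actual methodology.

```python
def module_power(g_wm2, t_cell_c, p_stc_w=100.0, gamma_per_c=-0.004):
    """Very simple PV power model: linear in irradiance, with a
    temperature coefficient relative to a 25 C cell temperature."""
    return p_stc_w * (g_wm2 / 1000.0) * (1.0 + gamma_per_c * (t_cell_c - 25.0))

def reference_day_energy(hours):
    """hours: list of (irradiance W/m^2, cell temperature C), one entry per hour."""
    return sum(module_power(g, t) for g, t in hours)   # watt-hours

# placeholder 'hot sunny' reference day, daylight hours only
hot_sunny = [(200, 30), (500, 40), (800, 50), (900, 55), (700, 48), (300, 35)]
print("Energy over reference day: %.1f Wh" % reference_day_energy(hot_sunny))
```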

  15. Designing a Test Fixture with DFSS Methodology

    Directory of Open Access Journals (Sweden)

    Charles G. Kibbe

    2016-01-01

    This paper addresses the application of Design for Six Sigma (DFSS) methodology to the design of a marine riser joint hydraulic line test fixture. The original test fixture was evaluated using Value Stream Mapping (VSM) and appropriate Lean design tools such as 3D Modeling and Finite Element Analysis (FEA). A new test fixture was developed which resulted in improving the process cycle efficiency for the test from 25% to 50%, leading to a 50% reduction in test cost. Handling of the new test fixture is greatly improved as compared to the original fixture.

  16. Alternative methodology for Scott-Knott test

    Directory of Open Access Journals (Sweden)

    Leonardo Lopes Bhering

    2008-01-01

    The test proposed by Scott and Knott (1974), a procedure for grouping means, is an effective alternative to perform procedures of multiple comparisons without ambiguity. This study aimed to propose a modification related to the partitioning and grouping of means in the said procedure, to obtain results without ambiguity among treatments, organized in more homogeneous groups. In the proposed methodology, treatments that did not participate in the initial group are joined for a new analysis, which allows for a better group distribution. In a comparative study, four experiments were simulated in a randomized complete block design. The first consisted of 10 treatments and the other 3 of 100 treatments. All experiments were performed with three replications at a significance level of 0.05 for the means grouping test. Only in the third of the experiments with 100 treatments did the groups formed by Scott-Knott not differ from the methodology proposed here. The proposed methodology is considered effective, aiming at the identification of elite cultivar groups for recommendation.
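
    The Scott-Knott procedure repeatedly splits an ordered set of treatment means at the point that maximizes the between-group sum of squares and keeps a split while a likelihood-ratio-type statistic is significant. The sketch below implements only the split-search step (finding the best partition point of sorted means); the significance test and the regrouping modification proposed in the article are omitted, and the data are invented.

```python
import numpy as np

def best_split(means_sorted):
    """Return (index, between-group sum of squares) of the best binary split
    of an ordered vector of treatment means, as used in Scott-Knott-type tests."""
    m = np.asarray(means_sorted, dtype=float)
    grand = m.mean()
    best_i, best_b0 = None, -np.inf
    for i in range(1, len(m)):                 # split into m[:i] and m[i:]
        g1, g2 = m[:i], m[i:]
        b0 = len(g1) * (g1.mean() - grand) ** 2 + len(g2) * (g2.mean() - grand) ** 2
        if b0 > best_b0:
            best_i, best_b0 = i, b0
    return best_i, best_b0

means = sorted([4.1, 4.3, 4.4, 6.0, 6.2, 6.1, 8.9, 9.2])
i, b0 = best_split(means)
print("groups:", means[:i], means[i:], " B0 =", round(b0, 3))
```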

  17. Improved methodology for generating controlled test atmospheres.

    Science.gov (United States)

    Miller, R R; Letts, R L; Potts, W J; McKenna, M J

    1980-11-01

    Improved methodology has been developed for generating controlled test atmospheres. Vaporization of volatile liquids is accomplished in a 28 mm (O.D.) glass J-tube in conjunction with a compressed air flameless heat torch, a pressure-sensitive switch, and a positive displacement piston pump. The vaporization system has been very reliable with a variety of test materials in studies ranging from a few days to several months. The J-tube vaporization assembly minimizes the possibility of thermal decomposition of the test material and affords a better margin of safety when vaporizing potentially explosive materials.

  18. Assessment of variations in wear test methodology.

    Science.gov (United States)

    Gouvêa, Cresus V D; Weig, Karin; Filho, Thales R M; Barros, Renata N

    2010-01-01

    The properties of composite resin for dental fillings have been improved by development, but its weakness continues to be its wear strength. Several tests have been proposed to evaluate wear in composite resin materials. The aim of this study was to verify how polishing and the type of abrasive can influence the wear rate of composite resin. The test was carried out on two groups, each employing an ormocer and a hybrid composite: in one group the composites were polished with abrasive paper only, and in the other group the composites were polished with the same abrasive paper plus a 1 microm and 0.25 microm grit diamond paste. A three-body wear test was performed using the metal sphere of the wear test machine, the composite and an abrasive. A diamond paste and an aluminum oxide dispersion were used as abrasives. Analysis of the results showed that there was no difference between polishing techniques, but revealed a difference between abrasives.

  19. A performance-based methodology for rating remediation systems

    Energy Technology Data Exchange (ETDEWEB)

    Rudin, M.J.; O'Brien, M.C.; Richardson, J.G.; Morrison, J.L.; Morneau, R.A. (Idaho National Engineering Lab., Idaho Falls, ID (United States))

    1993-10-01

    A methodology for evaluating and rating candidate remediation systems has been developed within the Buried Waste Integrated Demonstration (BWID) Systems Analysis Project at the Idaho National Engineering Laboratory (INEL). Called the performance-based technology selection filter (PBTSF), the methodology provides a formalized process to score systems based upon performance measures, and regulatory and technical requirements. The results are auditable and can be validated with field data.
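
    A performance-based selection filter of this kind boils down to scoring each candidate system against weighted performance measures and requirements. The short sketch below shows a generic weighted-sum scoring pass; the measure names and weights are illustrative only and are not taken from the BWID report.

```python
# Illustrative weighted scoring of candidate remediation systems.
weights = {"retrieval_rate": 0.4, "contamination_control": 0.35, "cost": 0.25}

candidates = {
    "System A": {"retrieval_rate": 0.8, "contamination_control": 0.6, "cost": 0.7},
    "System B": {"retrieval_rate": 0.6, "contamination_control": 0.9, "cost": 0.8},
}

def score(system_measures, weights):
    """Weighted sum of normalized (0-1) performance measures."""
    return sum(weights[k] * system_measures[k] for k in weights)

for name, measures in candidates.items():
    print(name, "->", round(score(measures, weights), 3))
```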

  20. Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development

    Science.gov (United States)

    1986-10-01

    J.E., "F-18 Composites Development Tests," N00019- 79-C-0044 (January 1981). 3. Stenberg , K.V., et al., "YAV-8B Composite Wing Development," Volumes I...Louis, MO 63166 (Attn: K. Stenberg , R. Garrett, R. Riley, J. Doerr). . . 4 MCDONNELL-DOUGLAS CORP., Long Beach, CA 90846 (Attn: J. Palmer

  1. Methodological choices affect cancer incidence rates: a cohort study

    OpenAIRE

    Brooke, Hannah L; Talbäck, Mats; Feychting, Maria; Ljung, Rickard

    2017-01-01

    Background Incidence rates are fundamental to epidemiology, but their magnitude and interpretation depend on methodological choices. We aimed to examine the extent to which the definition of the study population affects cancer incidence rates. Methods All primary cancer diagnoses in Sweden between 1958 and 2010 were identified from the national Cancer Register. Age-standardized and age-specific incidence rates of 29 cancer subtypes between 2000 and 2010 were calculated using four definitions ...
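
    Age-standardized incidence rates weight the age-specific rates of the study population by a fixed standard population, so the choice of denominator (the study-population definition) directly changes the resulting rates. A minimal sketch of direct standardization, with invented counts and a made-up standard population, is shown below.

```python
# Direct age standardization: ASR = sum_i w_i * (cases_i / person_years_i)
age_groups   = ["0-39", "40-59", "60-79", "80+"]
cases        = [ 20,    150,    600,    300]       # invented counts
person_years = [4.0e5,  2.5e5,  1.5e5,  0.4e5]     # invented denominators
std_weights  = [0.55,   0.25,   0.15,   0.05]      # fictional standard population

age_specific = [c / py for c, py in zip(cases, person_years)]
asr = sum(w * r for w, r in zip(std_weights, age_specific))
print("ASR per 100,000 person-years:", round(asr * 1e5, 1))
```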

  2. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Current methodologies of information systems penetration testing focus mainly on a high-level and technical description of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. It often results in a situation where the tests are badly planned and managed and the vulnerabilities found are unsystematically remediated. The goal of this article is to present a new methodology called PETA which is focused mainly on the management of penetration tests. Development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management represented by COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  3. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  4. Standardization Activities in TMF Test Methodologies

    Science.gov (United States)

    Verrilli, M. J.; Castelli, M. G.; Bressers, J.; Oehmke, R. L. T.

    1996-01-01

    No standard test practice currently exists for strain-controlled thermomechanical fatigue (TMF) testing. This paper discusses recent activities which lay the foundation for standardization of TMF test methods. Specifically, the paper documents the results of two interlaboratory TMF test programs, identifies key TMF symposia and workshops, and discusses efforts toward drafting a TMF standard test practice.

  5. Rate of force development: physiological and methodological considerations.

    Science.gov (United States)

    Maffiuletti, Nicola A; Aagaard, Per; Blazevich, Anthony J; Folland, Jonathan; Tillin, Neale; Duchateau, Jacques

    2016-06-01

    The evaluation of rate of force development during rapid contractions has recently become quite popular for characterising explosive strength of athletes, elderly individuals and patients. The main aims of this narrative review are to describe the neuromuscular determinants of rate of force development and to discuss various methodological considerations inherent to its evaluation for research and clinical purposes. Rate of force development (1) seems to be mainly determined by the capacity to produce maximal voluntary activation in the early phase of an explosive contraction (first 50-75 ms), particularly as a result of increased motor unit discharge rate; (2) can be improved by both explosive-type and heavy-resistance strength training in different subject populations, mainly through an improvement in rapid muscle activation; (3) is quite difficult to evaluate in a valid and reliable way. Therefore, we provide evidence-based practical recommendations for rational quantification of rate of force development in both laboratory and clinical settings.
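
    Rate of force development is typically quantified as the slope of the force-time curve over early time windows (for example 0-50 ms and 0-100 ms) from contraction onset. A minimal sketch, assuming a sampled force signal and a simple onset threshold, is below; the 2.5 N onset threshold and window lengths are common choices, not prescriptions from the review.

```python
import numpy as np

def rfd(force_n, fs_hz, onset_threshold_n=2.5, window_ms=50):
    """Mean slope of force (N/s) over `window_ms` after contraction onset."""
    force = np.asarray(force_n, dtype=float)
    onset = np.argmax(force > onset_threshold_n)     # first sample above threshold
    n = int(round(window_ms / 1000.0 * fs_hz))
    return (force[onset + n] - force[onset]) / (window_ms / 1000.0)

# toy explosive contraction sampled at 1 kHz: onset at 50 ms, plateau at 800 N
fs = 1000
t = np.arange(0, 0.3, 1 / fs)
force = np.clip((t - 0.05) * 10000, 0, 800)
print("RFD 0-50 ms  :", rfd(force, fs, window_ms=50), "N/s")
print("RFD 0-100 ms :", rfd(force, fs, window_ms=100), "N/s")
```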

  6. The ROF+ methodology for grease life testing

    NARCIS (Netherlands)

    Lugt, Pieter Martin; van den Kommer, A.; Lindgren, H.; Deinhofer, L.

    2013-01-01

    Very often, the service life of grease lubricated rolling bearings is determined by the so-called “grease life”. This can be tested on R0F test rigs, which have been available for this purpose for more than 40 years. Recently, this technology has been updated and is now called R0F+. The R0F+ can be used a

  7. A Structured Design for Test Methodology: A Case Study

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper is a case study of a structured Design for Test (DFT) methodology that was adopted for our ASIC project used in a High-Definition TV system. The methodology includes the following features: full-scan for chip test, test point insertion, path delay test, embedded SRAM Built-In Self-Test (BIST), and the implementation of the IEEE 1149.1 Standard. The paper discusses details of the ASIC design and technology. Our chip has more than 2 million transistors and a large embedded memory, which brought us extra test challenges.

  8. Methodology of diagnostic tests in hepatology

    DEFF Research Database (Denmark)

    Christensen, Erik

    2009-01-01

    The performance of diagnostic tests can be assessed by a number of methods. These include sensitivity, specificity, positive and negative predictive values, likelihood ratios and receiver operating characteristic (ROC) curves. This paper describes the methods and explains which information they pr
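
    The metrics listed (sensitivity, specificity, predictive values, likelihood ratios) all follow from a 2x2 table of test results against the reference standard. A minimal worked sketch with invented counts is given below.

```python
# 2x2 table of a diagnostic test against a reference standard (invented counts)
tp, fp, fn, tn = 80, 30, 20, 170

sensitivity = tp / (tp + fn)            # P(test+ | disease)
specificity = tn / (tn + fp)            # P(test- | no disease)
ppv = tp / (tp + fp)                    # P(disease | test+)
npv = tn / (tn + fn)                    # P(no disease | test-)
lr_pos = sensitivity / (1 - specificity)
lr_neg = (1 - sensitivity) / specificity

print(f"Se={sensitivity:.2f} Sp={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```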

  9. Low-Energy Proton Testing Methodology

    Science.gov (United States)

    Pellish, Jonathan A.; Marshall, Paul W.; Heidel, David F.; Schwank, James R.; Shaneyfelt, Marty R.; Xapsos, M.A.; Ladbury, Raymond L.; LaBel, Kenneth A.; Berg, Melanie; Kim, Hak S.; Phan, Anthony; Friendlich, M.R.; Rodbell, Kenneth P.; Hakey, Mark C.; Dodd, Paul E.; Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Sierawski, B.D.

    2009-01-01

    Use of low-energy protons and high-energy light ions is becoming necessary to investigate current-generation SEU thresholds. Systematic errors can dominate measurements made with low-energy protons. Range and energy straggling contribute to systematic error. Low-energy proton testing is not a step-and-repeat process. Low-energy protons and high-energy light ions can be used to measure SEU cross section of single sensitive features; important for simulation.
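
    Measuring an SEU cross section reduces to dividing observed upset counts by the particle fluence delivered to the device, with Poisson counting uncertainty. The sketch below shows only that arithmetic; the counts and fluence are invented.

```python
import math

def seu_cross_section(n_events, fluence_per_cm2):
    """Cross section (cm^2) with a simple 1-sigma Poisson counting error."""
    sigma = n_events / fluence_per_cm2
    err = math.sqrt(n_events) / fluence_per_cm2 if n_events > 0 else float("nan")
    return sigma, err

xs, err = seu_cross_section(n_events=42, fluence_per_cm2=1.0e10)
print(f"sigma = {xs:.2e} +/- {err:.2e} cm^2")
```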

  10. Methodology Investigation Characterization of Test Environment.

    Science.gov (United States)

    1979-08-01

    ...the Chagres River at the east end of the area and near the mouth of the Frijoles River in the western portion. Unit 2 - Tilted, coarse-grained bedded sedimentary rock (shale, sandstone, and mudstones) forming rolling lowlands; the low area in the Frijoles River Basin is formed on these softer sediments ... forms the Rio Frijoles Basin, and the western and southern flanks of the KV unit along the western edge of the test area. This unit also comprises

  11. Comparative performance of image fusion methodologies in eddy current testing

    Directory of Open Access Journals (Sweden)

    S. Thirunavukkarasu

    2012-12-01

    Image fusion methodologies have been studied for improving the detectability of eddy current Nondestructive Testing (NDT). Pixel-level image fusion has been performed on C-scan eddy current images of a sub-surface defect at two different frequencies. Multi-resolution analysis based Laplacian pyramid and wavelet fusion methodologies, statistical inference based Bayesian fusion and Principal Component Analysis (PCA) based fusion methodologies have been studied towards improving the detectability of defects. The performance of the fusion methodologies has been compared using image metrics such as SNR and entropy. The Bayesian fusion methodology has shown better performance than the other methodologies, with a 33.75 dB improvement in SNR and an improvement of 3.22 in entropy.
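
    The two figures of merit used to compare the fusion schemes, entropy and SNR, can be computed directly from the fused image (and, for SNR, a chosen signal and noise region). A minimal sketch on a synthetic 8-bit image is below; how the regions are chosen in the article is not reproduced here, and the SNR definition used is one common convention.

```python
import numpy as np

def image_entropy(img_uint8):
    """Shannon entropy (bits) of an 8-bit image's grey-level histogram."""
    hist = np.bincount(img_uint8.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def snr_db(signal_region, noise_region):
    """SNR in dB: peak-to-peak signal amplitude over noise standard deviation."""
    return 20.0 * np.log10(np.ptp(signal_region) / np.std(noise_region))

rng = np.random.default_rng(0)
img = rng.integers(90, 110, size=(64, 64)).astype(np.uint8)   # noisy background
img[28:36, 28:36] += 80                                        # synthetic defect indication
print("entropy:", round(image_entropy(img), 2), "bits")
print("SNR    :", round(snr_db(img[28:36, 28:36], img[:16, :16]), 1), "dB")
```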

  12. Applying Lean Six Sigma methodology to reduce cesarean section rate.

    Science.gov (United States)

    Chai, Ze-Ying; Hu, Hua-Min; Ren, Xiu-Ling; Zeng, Bao-Jin; Zheng, Ling-Zhi; Qi, Feng

    2017-06-01

    This study aims to reduce the cesarean section rate and increase the rate of vaginal delivery. By using Lean Six Sigma (LSS) methodology, the cesarean section rate was investigated and analyzed through a 5-phase roadmap consisting of Define, Measure, Analyze, Improve, and Control. The principal causes of cesarean section were identified, improvement measures were implemented, and the rate of cesarean section before and after intervention was compared. After patients with a valid medical reason for cesarean were excluded, the main causes of cesarean section were maternal request, labor pain, parturient women assessment, and labor observation. A series of measures was implemented, including an improved parturient women assessment system, strengthened pregnancy nutrition guidance, implementation of painless labor techniques, enhanced midwifery team building, and promotion of childbirth-assist skills. Ten months after introduction of the improvement measures, the cesarean section rate decreased from 41.83% to 32.00%, and the Six Sigma score (i.e., Z value) increased from 1.706 to 1.967. © 2016 John Wiley & Sons, Ltd.
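
    The Six Sigma scores quoted can be reproduced from the cesarean rates themselves: treating each cesarean section as a "defect", the short-term Z value is the standard normal quantile of the yield plus the conventional 1.5-sigma shift. The sketch below checks the two figures from the abstract under that convention (an assumption on our part about how the Z values were computed).

```python
from scipy.stats import norm

def sigma_level(defect_rate, shift=1.5):
    """Short-term sigma (Z) value for a given defect rate, using the
    conventional 1.5-sigma long-term/short-term shift."""
    return norm.ppf(1.0 - defect_rate) + shift

print(round(sigma_level(0.4183), 3))   # ~1.706, before intervention
print(round(sigma_level(0.3200), 3))   # ~1.97, close to the reported 1.967
```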

  13. Loading Rate for Modulus of Rupture Test

    Institute of Scientific and Technical Information of China (English)

    QUMing; ZHANGYong-fang

    1996-01-01

    The relationship among load rate, strain rate and stress rate for the modulus of rupture test, and the way of applying load at a given stress rate using both a hydraulic compression testing machine and a mechanical compression testing machine, have been described. The test results are identical with selected strain rate loading and stress rate loading.
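
    For a standard three-point bend modulus-of-rupture specimen, stress rate and load rate are proportional through the specimen geometry, and strain rate follows from the elastic modulus, which is presumably the kind of relationship the abstract refers to. A hedged sketch with the usual beam formula and invented dimensions follows; the geometry and material values are placeholders.

```python
# Three-point bend, rectangular beam: sigma = 3 P L / (2 b d^2)
# => stress rate = 3 L / (2 b d^2) * load rate,  strain rate = stress rate / E

def stress_rate(load_rate_n_per_s, span_mm, width_mm, depth_mm):
    """Stress rate in MPa/s for a given load rate in N/s."""
    return 3.0 * span_mm * load_rate_n_per_s / (2.0 * width_mm * depth_mm ** 2)

span, width, depth = 100.0, 10.0, 10.0        # mm, invented specimen
e_modulus = 30_000.0                          # MPa, invented material

load_rate = 50.0                              # N/s, set on the testing machine
s_rate = stress_rate(load_rate, span, width, depth)
print("stress rate: %.2f MPa/s, strain rate: %.2e 1/s" % (s_rate, s_rate / e_modulus))
```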

  14. A Tester-Assisted Methodology for Test Redundancy Detection

    Directory of Open Access Journals (Sweden)

    Negar Koochakzadeh

    2010-01-01

    Test redundancy detection reduces test maintenance costs and also ensures the integrity of test suites. One of the most widely used approaches for this purpose is based on coverage information. In a recent work, we have shown that although this information can be useful in detecting redundant tests, it may suffer from a large number of false-positive errors, that is, a test case being identified as redundant while it is really not. In this paper, we propose a semiautomated methodology to derive a reduced test suite from a given test suite, while keeping the fault detection effectiveness unchanged. To evaluate the methodology, we apply the mutation analysis technique to measure the fault detection effectiveness of the reduced test suite of a real Java project. The results confirm that the proposed manual interactive inspection process leads to a reduced test suite with the same fault detection ability as the original test suite.
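
    The coverage-based heuristic the paper starts from flags a test as potentially redundant when the code it covers is a subset of what the rest of the suite already covers; the manual inspection step then weeds out false positives. A minimal sketch of the subset check, on made-up coverage sets, is shown below.

```python
# Coverage-based candidate redundancy: a test is a candidate if every item
# it covers is also covered by the union of the other tests.
coverage = {
    "testA": {"m1", "m2", "m3"},
    "testB": {"m2", "m3"},          # covered elsewhere -> redundancy candidate
    "testC": {"m3", "m4"},
}

def redundancy_candidates(coverage):
    candidates = []
    for name, covered in coverage.items():
        others = set().union(*(c for n, c in coverage.items() if n != name))
        if covered <= others:
            candidates.append(name)
    return candidates

print(redundancy_candidates(coverage))   # ['testB'] -- still needs manual inspection
```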

  15. Development of nondestructive testing/evaluation methodology for MEMS

    Science.gov (United States)

    Zunino, James L., III; Skelton, Donald R.; Marinis, Ryan T.; Klempner, Adam R.; Hefti, Peter; Pryputniewicz, Ryszard J.

    2008-02-01

    Development of MEMS constitutes one of the most challenging tasks in today's micromechanics. In addition to design, analysis, and fabrication capabilities, this task also requires advanced test methodologies for determination of functional characteristics of MEMS to enable refinement and optimization of their designs as well as for demonstration of their reliability. Until recently, this characterization was hindered by lack of a readily available methodology. However, using recent advances in photonics, electronics, and computer technology, it was possible to develop a NonDestructive Testing (NDT) methodology suitable for evaluation of MEMS. In this paper, an optoelectronic methodology for NDT of MEMS is described and its application is illustrated with representative examples; this description represents work in progress and the results are preliminary. This methodology provides quantitative full-field-of-view measurements in near real-time with high spatial resolution and nanometer accuracy. By quantitatively characterizing performance of MEMS, under different vibration, thermal, and other operating conditions, specific suggestions for their improvements can be made. Then, using the methodology, we can verify the effects of these improvements. In this way, we can develop better understanding of functional characteristics of MEMS, which will ensure that they are operated at optimum performance, are durable, and are reliable.

  16. Proposed Objective Odor Control Test Methodology for Waste Containment

    Science.gov (United States)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a limit of quantitation of 15 PPB.

  17. Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship

    Science.gov (United States)

    2016-09-15

    Report excerpt: ...(marksmanship performance) mirror those which would be captured in a live fire evaluation. (Final report; dates covered: October 2014 – August 2015.)

  18. Rating methodological quality: toward improved assessment and investigation.

    Science.gov (United States)

    Moyer, Anne; Finney, John W

    2005-01-01

    Assessing methodological quality is considered essential in deciding what investigations to include in research syntheses and in detecting potential sources of bias in meta-analytic results. Quality assessment is also useful in characterizing the strengths and limitations of the research in an area of study. Although numerous instruments to measure research quality have been developed, they have lacked empirically-supported components. In addition, different summary quality scales have yielded different findings when they were used to weight treatment effect estimates for the same body of research. Suggestions for developing improved quality instruments include: distinguishing distinct domains of quality, such as internal validity, external validity, the completeness of the study report, and adherence to ethical practices; focusing on individual aspects, rather than domains of quality; and focusing on empirically-verified criteria. Other ways to facilitate the constructive use of quality assessment are to improve and standardize the reporting of research investigations, so that the quality of studies can be more equitably and thoroughly compared, and to identify optimal methods for incorporating study quality ratings into meta-analyses.

  19. A methodology to study cyclic debond growth at constant mode-mixity and energy release rate

    DEFF Research Database (Denmark)

    Quispitupa, Amilcar; Berggreen, Christian; Carlsson, Leif A.

    2010-01-01

    It is well known that face/core debond crack propagation is governed by the critical energy release rate (fracture toughness) and mode-mixity at the crack tip. Thus, the current study focuses on the development of a methodology to perform fatigue crack growth experiments of debonded sandwich...... and better control of loading conditions at the crack tip will be the most relevant outcomes of using the proposed fatigue test method....

  20. ESR (Erythrocyte Sedimentation Rate) Test

    Science.gov (United States)


  1. Diagnostic Tests for Alzheimer's Disease: Rationale, Methodology, and Challenges

    Directory of Open Access Journals (Sweden)

    S. E. Mason

    2010-01-01

    There has been a large increase in the amount of research seeking to define or diagnose Alzheimer's disease before patients develop dementia. If successful, this would principally have clinical benefits both in terms of treatment as well as risk modification. Moreover, a better method for diagnosing predementia disease would assist research which seeks to develop such treatments and risk modification strategies. The evidence-based definition of a diagnostic test's accuracy is fundamental to achieve the above goals and to address this, the Cochrane Collaboration has established a Diagnostic Test Accuracy group dedicated to examining the utility and accuracy of proposed tests in dementia and cognitive impairment. We present here the assumptions and observations underpinning the chosen methodology as well as the initial methodological approach decided upon.

  2. Acceptance testing for PACS: from methodology to design to implementation

    Science.gov (United States)

    Liu, Brent J.; Huang, H. K.

    2004-04-01

    Acceptance Testing (AT) is a crucial step in the implementation process of a PACS within a clinical environment. AT determines whether the PACS is ready for clinical use and marks the official sign-off of the PACS product. Most PACS vendors have AT plans; however, these plans do not provide a complete and robust evaluation of the full system. In addition, different sites will have different special requirements that vendor AT plans do not cover. The purpose of this paper is to introduce a protocol for AT design and present case studies of AT performed on clinical PACS. A methodology is presented that includes identifying testing components within PACS, quality assurance for both functionality and performance, and technical testing focusing on key single points-of-failure within the PACS product. Tools and resources that provide assistance in performing AT are discussed. In addition, implementation of the AT within the clinical environment and the overall implementation timeline of the PACS process are presented. Finally, case studies of actual AT of clinical PACS performed in the healthcare environment are reviewed. The methodology for designing and implementing a robust AT plan for PACS was documented and has been used in PACS acceptance tests at several sites. This methodology can be applied to any PACS and can be used as a validation for the PACS product being acquired by radiology departments and hospitals. A methodology for AT design and implementation was presented that can be applied to future PACS installations. A robust AT plan for a PACS installation can increase both the utilization and satisfaction of a successful implementation of a PACS product that benefits both vendor and customer.

  3. Counter unmanned aerial system testing and evaluation methodology

    Science.gov (United States)

    Kouhestani, C.; Woo, B.; Birch, G.

    2017-05-01

    Unmanned aerial systems (UAS) are increasing in flight times, ease of use, and payload sizes. Detection, classification, tracking, and neutralization of UAS is a necessary capability for infrastructure and facility protection. We discuss test and evaluation methodology developed at Sandia National Laboratories to establish a consistent, defendable, and unbiased means for evaluating counter unmanned aerial system (CUAS) technologies. The test approach described identifies test strategies, performance metrics, UAS types tested, key variables, and the necessary data analysis to accurately quantify the capabilities of CUAS technologies. The tests conducted, as defined by this approach, will allow for the determination of quantifiable limitations, strengths, and weaknesses in terms of detection, tracking, classification, and neutralization. Communicating the results of this testing in such a manner informs decisions by government sponsors and stakeholders that can be used to guide future investments and inform procurement, deployment, and advancement of such systems into their specific venues.

  4. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    Science.gov (United States)

    2016-05-01

    ...it can be used as an additional glass characterization method. If the SOC of a glass sample is known, the stress state of a glass specimen can be evaluated through photoelastic methods both qualitatively and quantitatively. (ARL-TN-0756, US Army Research Laboratory, May 2016; approved for public release, distribution unlimited.)

  5. Development of tools, technologies, and methodologies for imaging sensor testing

    Science.gov (United States)

    Lowry, H.; Bynum, K.; Steely, S.; Nicholson, R.; Horne, H.

    2013-05-01

    Ground testing of space- and air-borne imaging sensor systems is supported by Vis-to-LWIR imaging sensor calibration and characterization, as well as hardware-in-the-loop (HWIL) simulation with high-fidelity complex scene projection to validate sensor mission performance. To accomplish this successfully, there must be the development of tools, technologies, and methodologies that are used in space simulation chambers for such testing. This paper provides an overview of such efforts being investigated and implemented at Arnold Engineering Development Complex (AEDC).

  6. Auditing HIV Testing Rates across Europe

    DEFF Research Database (Denmark)

    Raben, D; Mocroft, A; Rayment, M

    2015-01-01

    European guidelines recommend the routine offer of an HIV test in patients with a number of AIDS-defining and non-AIDS conditions believed to share an association with HIV; so called indicator conditions (IC). Adherence with this guidance across Europe is not known. We audited HIV testing behaviour...... candidiasis. Observed HIV-positive rates were applied by region and IC to estimate the number of HIV diagnoses potentially missed. Outcomes examined were: HIV test rate (% of total patients with IC), HIV test accepted (% of tests performed/% of tests offered) and new HIV diagnosis rate (%). There were 49...... audits from 23 centres, representing 7037 patients. The median test rate across audits was 72% (IQR 32-97), lowest in Northern Europe (median 44%, IQR 22-68%) and highest in Eastern Europe (median 99%, IQR 86-100). Uptake of testing was close to 100% in all regions. The median HIV+ rate was 0.9% (IQR 0...

  7. Testing Strategies and Methodologies for the Max Launch Abort System

    Science.gov (United States)

    Schaible, Dawn M.; Yuchnovicz, Daniel E.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) was tasked to develop an alternate, tower-less launch abort system (LAS) as risk mitigation for the Orion Project. The successful pad abort flight demonstration test in July 2009 of the "Max" launch abort system (MLAS) provided data critical to the design of future LASs, while demonstrating the Agency's ability to rapidly design, build and fly full-scale hardware at minimal cost in a "virtual" work environment. Limited funding and an aggressive schedule presented a challenge for testing of the complex MLAS system. The successful pad abort flight demonstration test was attributed to the project's systems engineering and integration process, which included: a concise definition of, and an adherence to, flight test objectives; a solid operational concept; well-defined performance requirements; and a test program tailored to reducing the highest flight test risks. The testing ranged from wind tunnel validation of computational fluid dynamic simulations to component ground tests of the highest risk subsystems. This paper provides an overview of the testing/risk management approach and methodologies used to understand and reduce the areas of highest risk - resulting in a successful flight demonstration test.

  8. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-08-01

    Dynamic mechanical analysis instrumentation was used for the development of specific test methodology for the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperature in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system was negative because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  9. Retroreflector Array for Test Environments (RATE) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Research Support Instruments, Inc. (RSI) proposes to develop the Retroreflector Array for Test Environments (RATE), an innovative technology that will...

  10. Interest Rate Risk Management using Duration Gap Methodology

    Directory of Open Access Journals (Sweden)

    Dan Armeanu

    2008-01-01

    ...should be measured and managed within an asset-liability management framework. The article then takes a short look at methods for measuring interest rate risk and after that explains and demonstrates how the Duration Gap Model can be used for managing interest rate risk in banks.

  11. On the methodology of energy-GDP Granger causality tests

    Energy Technology Data Exchange (ETDEWEB)

    Beaudreau, Bernard C. [Department of Economics, Universite Laval, Quebec (Canada)

    2010-09-15

    Despite their growing technical sophistication and empirical breadth, Granger energy-GDP causality tests remain inconclusive, leaving unresolved the increasingly relevant debate over the role of energy or energy growth in economic growth. While historians and growth theorists point to the development of the steam engine, the electromagnetic motor and the ensuing energy deepening as a key contributing factor in economic growth, the existing tests provide little support for this view. This paper examines this debate, focusing particular attention on the underlying methodology, specifically on the measures of energy used. It is argued that existing tests, by regressing GDP growth on current and lagged levels of energy consumption growth, do not capture the essence of the historical record and recent work on energy and development. A new energy metric in the form of energy availability is presented and discussed in detail. (author)
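
    For reference, the standard bivariate energy-GDP Granger test the paper critiques regresses GDP growth on its own lags plus lagged energy-growth terms and F-tests the latter. A minimal sketch with simulated series, using statsmodels' built-in routine, is below; it does not implement the paper's proposed energy-availability metric, and the simulated coefficients are arbitrary.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 200
energy_growth = rng.normal(0, 1, n)
gdp_growth = np.zeros(n)
for t in range(2, n):                      # GDP growth depends on lagged energy growth
    gdp_growth[t] = 0.4 * energy_growth[t - 1] + rng.normal(0, 1)

# column order: [effect, cause] -- tests whether the 2nd column Granger-causes the 1st
data = pd.DataFrame({"gdp_growth": gdp_growth, "energy_growth": energy_growth})
res = grangercausalitytests(data[["gdp_growth", "energy_growth"]], maxlag=2)
print("lag-1 F-test p-value:", round(res[1][0]["ssr_ftest"][1], 4))
```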

  12. Interest Rate Risk Management using Duration Gap Methodology

    Directory of Open Access Journals (Sweden)

    Dan Armeanu

    2008-01-01

    The world for financial institutions has changed during the last 20 years, and has become riskier and more competition-driven. After the deregulation of the financial market, banks had to take on extensive risk in order to earn sufficient returns. Interest rate volatility has increased dramatically over the past twenty-five years, and for that reason an efficient management of this interest rate risk is strongly required. In the last years banks developed a variety of methods for measuring and managing interest rate risk. Of these, the most frequently used in real banking life and recommended by the Basel Committee are based on: the Repricing Model or Funding Gap Model, the Maturity Gap Model, the Duration Gap Model, and Static and Dynamic Simulation. The purpose of this article is to give a good understanding of the duration gap model used for managing interest rate risk. The article starts with an overview of interest rate risk and explains how this type of risk should be measured and managed within an asset-liability management framework. Then the article takes a short look at methods for measuring interest rate risk and after that explains and demonstrates how the Duration Gap Model can be used for managing interest rate risk in banks.
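
    The duration gap model the article demonstrates estimates the change in the economic value of equity from a parallel rate shock as a function of asset and liability durations. A small worked sketch with invented balance-sheet numbers follows.

```python
# Duration gap model: dEquity ~= -[D_A - (L/A) * D_L] * A * dy / (1 + y)
assets, liabilities = 1_000.0, 900.0      # invented balance sheet (millions)
d_assets, d_liabilities = 4.0, 2.5        # durations in years
y, dy = 0.05, 0.01                        # yield level and +100 bp parallel shock

duration_gap = d_assets - (liabilities / assets) * d_liabilities
d_equity = -duration_gap * assets * dy / (1 + y)

print(f"duration gap            : {duration_gap:.2f} years")
print(f"change in value of equity: {d_equity:.1f} (same units as assets)")
```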

  13. Simplifying multivariate survival analysis using global score test methodology

    Science.gov (United States)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz

    2015-12-01

    In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve multiple endpoints, and this situation further complicates the analysis of survival data. In the case of tumor patients, endpoints concerning survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For each patient, these endpoints are correlated, and the estimation of the correlation between two score statistics is fundamental in the derivation of the overall treatment advantage. In this paper, the bivariate survival analysis method using the global score test methodology is extended to the multivariate setting.

  14. Practical remarks on the heart rate and saturation measurement methodology

    Science.gov (United States)

    Kowal, M.; Kubal, S.; Piotrowski, P.; Staniec, K.

    2017-05-01

    A surface reflection-based method for measuring heart rate and saturation has been introduced as one having a significant advantage over legacy methods in that it lends itself for use in special applications such as those where a person’s mobility is of prime importance (e.g. during a miner’s work), excluding the use of traditional clips. Then, a complete ATmega1281-based microcontroller platform is described for performing the computational tasks of signal processing and wireless transmission. The next section provides remarks regarding the basic signal processing rules, beginning with raw voltage samples of converted optical signals, their acquisition, storage and smoothing. This section ends with practical remarks demonstrating an exponential dependence between the minimum measurable heart rate and the readout resolution at different sampling frequencies for different cases of averaging depth (in bits). The following section is devoted strictly to the heart rate and hemoglobin oxygenation (saturation) measurement with the use of the presented platform, referenced to measurements obtained with a stationary certified pulse oximeter.
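
    Reflectance pulse oximetry of this kind ultimately derives heart rate from the pulsatile (AC) component of the photoplethysmogram and saturation from the red/infrared "ratio of ratios". The sketch below shows that last step with a generic linear calibration; the calibration constants are textbook-style placeholders, not values used on the described ATmega1281 platform.

```python
import numpy as np

def ratio_of_ratios(red, infrared):
    """R = (AC_red/DC_red) / (AC_ir/DC_ir) from raw PPG sample arrays."""
    ac = lambda x: np.ptp(x)          # peak-to-peak pulsatile component
    dc = lambda x: np.mean(x)         # baseline component
    return (ac(red) / dc(red)) / (ac(infrared) / dc(infrared))

def spo2_estimate(r, a=110.0, b=25.0):
    """Common linear empirical calibration SpO2 ~= a - b*R (placeholder constants)."""
    return a - b * r

# toy PPG windows (arbitrary ADC units), heart rate ~72 bpm
t = np.linspace(0, 2, 200)
red = 2000 + 30 * np.sin(2 * np.pi * 1.2 * t)
ir  = 2500 + 60 * np.sin(2 * np.pi * 1.2 * t)
r = ratio_of_ratios(red, ir)
print("R = %.2f, SpO2 ~= %.1f %%" % (r, spo2_estimate(r)))
```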

  15. Efficient testing methodologies for microcameras in a gigapixel imaging system

    Science.gov (United States)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging--based on a monocentric optical design--promises revolutionary advances in diverse imaging applications by enabling high resolution, real-time image capture over a wide field-of-view (FOV), including sport broadcast, wide-field microscopy, astronomy, and security surveillance. Recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over a FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is a key to successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference: Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386-389 (2012).

  16. Testing methodologies and systems for semiconductor optical amplifiers

    Science.gov (United States)

    Wieckowski, Michael

    Semiconductor optical amplifiers (SOAs) are gaining increased prominence in both optical communication systems and high-speed optical processing systems, due primarily to their unique nonlinear characteristics. This, in turn, has raised questions regarding their lifetime performance reliability and has generated a demand for effective testing techniques. This is especially critical for industries utilizing SOAs as components for system-in-package products. It is important to note that very little research to date has been conducted in this area, even though production volume and market demand have continued to increase. In this thesis, the reliability of dilute-mode InP semiconductor optical amplifiers is studied experimentally and theoretically. The aging characteristics of the production level devices are demonstrated and the necessary techniques to accurately characterize them are presented. In addition, this work proposes a new methodology for characterizing the optical performance of these devices using measurements in the electrical domain. It is shown that optical performance degradation, specifically with respect to gain, can be directly qualified through measurements of electrical subthreshold differential resistance. This metric exhibits a linear proportionality to the defect concentration in the active region, and as such, can be used for prescreening devices before employing traditional optical testing methods. A complete theoretical analysis is developed in this work to explain this relationship based upon the device's current-voltage curve and its associated leakage and recombination currents. These results are then extended to realize new techniques for testing semiconductor optical amplifiers and other similarly structured devices. These techniques can be employed after fabrication and during packaged operation through the use of a proposed stand-alone testing system, or using a proposed integrated CMOS self-testing circuit. Both methods are capable
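
    The proposed electrical-domain metric is the differential resistance dV/dI extracted from the subthreshold portion of the device's I-V curve. A minimal numerical sketch on a synthetic diode-like characteristic is below; the diode parameters and the chosen subthreshold window are illustrative, not taken from the thesis.

```python
import numpy as np

def differential_resistance(v, i):
    """dV/dI along an I-V curve by numerical differentiation (ohms)."""
    return np.gradient(v, i)

# synthetic diode-like I-V curve with a weak parallel leakage path (illustrative)
v = np.linspace(0.1, 1.0, 200)
i = 1e-12 * (np.exp(v / 0.052) - 1.0) + v / 1e7

r_diff = differential_resistance(v, i)
subthreshold = v < 0.6                         # illustrative subthreshold window
print("mean subthreshold dV/dI: %.3g ohms" % r_diff[subthreshold].mean())
```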

  17. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    Full Text Available The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
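
    As a concrete illustration of why a projection-aware GSD estimate matters, the sketch below computes the radial GSD of an equidistant fisheye (r = f·θ) imaging a flat wall at distance D; for that projection the GSD grows as 1/cos²θ away from the optical axis. This is a generic textbook derivation, not the general formulation presented in the paper, and the pixel pitch, focal length, and distance are hypothetical values.

```python
# Illustrative sketch: radial GSD of an equidistant fisheye (r = f*theta)
# imaging a planar object at distance D. Values are hypothetical, and the
# formula is a generic derivation, not the paper's general formulation.
import numpy as np

def fisheye_gsd_on_plane(pixel_pitch_mm, focal_mm, distance_m, theta_deg):
    """Radial ground sampling distance (m/pixel) at field angle theta."""
    theta = np.radians(theta_deg)
    # R = D*tan(theta), r = f*theta  =>  dR/dr = D / (f * cos(theta)**2)
    return (pixel_pitch_mm / focal_mm) * distance_m / np.cos(theta) ** 2

for angle in (0, 30, 60, 80):
    gsd = fisheye_gsd_on_plane(pixel_pitch_mm=0.004, focal_mm=8.0,
                               distance_m=0.7, theta_deg=angle)
    print(f"theta = {angle:2d} deg  ->  GSD = {gsd * 1000:.2f} mm/pixel")
```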

  18. MUSIC-Expected maximization gaussian mixture methodology for clustering and detection of task-related neuronal firing rates.

    Science.gov (United States)

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2017-01-15

    Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates. Copyright © 2016 Elsevier B.V. All rights reserved.
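
    To make the pipeline concrete, the sketch below extracts a dominant-frequency feature from simulated firing-rate profiles with a basic MUSIC pseudospectrum and clusters the features with a Gaussian mixture fitted by EM. It is a simplified stand-in for the published method: the embedding dimension, model order, and synthetic rates are all assumptions, not the paper's settings.

```python
# Illustrative sketch of the MUSIC + EM-GMM idea on synthetic firing-rate
# profiles. Embedding size, model order, and data are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def music_peak_frequency(rate, m=20, n_signal=2, n_grid=256):
    """Return the dominant normalized frequency of a firing-rate profile."""
    x = rate - rate.mean()
    X = np.array([x[i:i + m] for i in range(len(x) - m)])   # lag-embedded data
    R = X.T @ X / len(X)                                    # covariance estimate
    w, V = np.linalg.eigh(R)                                # ascending eigenvalues
    En = V[:, :m - n_signal]                                # noise subspace
    freqs = np.linspace(0.01, 0.5, n_grid)                  # cycles/sample
    k = np.arange(m)
    spectrum = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * k)                      # steering vector
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return freqs[int(np.argmax(spectrum))]

# Two synthetic groups: slow task-locked modulation vs. faster modulation
rng = np.random.default_rng(0)
t = np.arange(400)
rates = [10 + 5 * np.sin(2 * np.pi * 0.03 * t) + rng.normal(0, 1, t.size) for _ in range(15)]
rates += [10 + 5 * np.sin(2 * np.pi * 0.12 * t) + rng.normal(0, 1, t.size) for _ in range(15)]

features = np.array([[music_peak_frequency(r)] for r in rates])
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(features)
print(labels)
```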

  19. Non-animal methodologies within biomedical research and toxicity testing.

    Science.gov (United States)

    Knight, Andrew

    2008-01-01

    Laboratory animal models are limited by scientific constraints on human applicability, and increasing regulatory restrictions, driven by social concerns. Reliance on laboratory animals also incurs marked - and in some cases, prohibitive - logistical challenges, within high-throughput chemical testing programmes, such as those currently underway within Europe and the US. However, a range of non-animal methodologies is available within biomedical research and toxicity testing. These include: mechanisms to enhance the sharing and assessment of existing data prior to conducting further studies, and physicochemical evaluation and computerised modelling, including the use of structure-activity relationships and expert systems. Minimally-sentient animals from lower phylogenetic orders or early developmental vertebral stages may be used, as well as microorganisms and higher plants. A variety of tissue cultures, including immortalised cell lines, embryonic and adult stem cells, and organotypic cultures, are also available. In vitro assays utilising bacterial, yeast, protozoal, mammalian or human cell cultures exist for a wide range of toxic and other endpoints. These may be static or perfused, and may be used individually, or combined within test batteries. Human hepatocyte cultures and metabolic activation systems offer potential assessment of metabolite activity and organ-organ interaction. Microarray technology may allow genetic expression profiling, increasing the speed of toxin detection, well prior to more invasive endpoints. Enhanced human clinical trials utilising microdosing, staggered dosing, and more representative study populations and durations, as well as surrogate human tissues, advanced imaging modalities and human epidemiological, sociological and psychological studies, may increase our understanding of illness aetiology and pathogenesis, and facilitate the development of safe and effective pharmacologic interventions. Particularly when human tissues

  20. Assessment of RAMONA-3B methodology with FRIGG dynamic tests

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, U.S.; Neymotin, L.Y.; Wulff, W.

    1990-12-31

    The computer codes used at Brookhaven National Laboratory to compute BWR safety parameters are the Engineering Plant Analyzer (EPA) and RAMONA-3B/MOD1. Both codes use the same methodology for modeling thermal-hydraulic phenomena: a drift-flux formulation, two-phase multipliers for the wall friction and form loss calculations, and the momentum integral approach for spatial integration of the loop momentum equations. Both codes use explicit integration methods for solving ordinary differential equations. It is concluded that both codes are capable of modelling the instability problems of a BWR. The accuracy of the thermal-hydraulic code predictions was assessed by modelling oscillatory FRIGG tests. Nodalization studies showed that 24 axial nodes were sufficient for a converged solution; 12 axial nodes produced an error of 4.4% in the gain of the power-to-flow transfer function. The code consistently predicted the effects of power and inlet subcooling on gain and system resonance frequency. The comparisons showed that the code predicted the peak gains with a mean difference from experiments of 7% ± 30% for all the tests modeled. The uncertainty in the experimental data is −11% to +12%. The mean difference in the predicted frequency at the peak gain is −6% ± 14%.

  2. Methodological tests of a heterotrophy index for aquatic ecosystems

    Directory of Open Access Journals (Sweden)

    R. M. Antonio

    Full Text Available Experiments in glucose mineralization were carried out to investigate the effects caused by natural forcing functions on both the decomposition rates and heterotrophy capacity of aquatic ecosystems. In addition, the methodology used could show connections between mineralization rates measured in both laboratory and field work with those measured in aquatic ecosystems. Water samples from Infernão lagoon (21º35'S and 47º51'W) were collected, filtered, enriched with glucose, and incubated under aerobic and anaerobic conditions. The glucose concentration variation, dissolved oxygen (DO) consumption, pH, electric conductivity, and total CO2 amount in the water were determined for sixteen days. In the period with intense oxygen consumption there was also an evident glucose demand and the dissolved oxygen consumption rate was approximately the same as that for glucose mineralization. The process in the aerobic chambers was 2.2 times faster than that in the anaerobic chambers. An initial acidification of the water samples, probably due to microbial carbonic acid liberation, was noted. A rise in pH values was also observed at the end of the process. The electric conductivity was low for both aerobic and anaerobic chambers, indicating a probable ion uptake by microbial organisms due to the presence of carbon sources. The glucose content variations corresponded to both CO2 formation and dissolved oxygen consumption. It was estimated that 19.4% of the initial glucose content turned into CO2 and the remaining 80.6% into humic compounds and microbial biomass. This experiment showed that glucose can be used as a substrate indicating the heterotrophy of a given aquatic ecosystem.

  3. Methodological tests of a heterotrophy index for aquatic ecosystems.

    Science.gov (United States)

    Antonio, R M; Bianchini Júnior, I

    2003-08-01

    Experiments in glucose mineralization were carried out to investigate the effects caused by natural forcing functions on both the decomposition rates and heterotrophy capacity of aquatic ecosystems. In addition, the methodology used could show connections between mineralization rates measured in both laboratory and field work with those measured in aquatic ecosystems. Water samples from Infernão lagoon (21 degrees 35'S and 47 degrees 51'W) were collected, filtered, enriched with glucose, and incubated under aerobic and anaerobic conditions. The glucose concentration variation, dissolved oxygen (DO) consumption, pH, electric conductivity, and total CO2 amount in the water were determined for sixteen days. In the period with intense oxygen consumption there was also an evident glucose demand and the dissolved oxygen consumption rate was approximately the same as that for glucose mineralization. The process in the aerobic chambers was 2.2 times faster than that in the anaerobic chambers. An initial acidification of the water samples, probably due to microbial carbonic acid liberation, was noted. A rise in pH values was also observed at the end of the process. The electric conductivity was low for both aerobic and anaerobic chambers, indicating a probable ion uptake by microbial organisms due to the presence of carbon sources. The glucose content variations corresponded to both CO2 formation and dissolved oxygen consumption. It was estimated that 19.4% of the initial glucose content turned into CO2 and the remaining 80.6% into humic compounds and microbial biomass. This experiment showed that glucose can be used as a substrate indicating the heterotrophy of a given aquatic ecosystem.
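
    For readers who want to reproduce this kind of rate estimate, the sketch below fits a first-order decay constant to a glucose concentration time series and compares aerobic against anaerobic chambers. The kinetic model choice and the synthetic data are assumptions, not the authors' fitted values.

```python
# Illustrative sketch: first-order mineralization rate constants fitted to
# glucose concentration time series. Model choice and data are assumptions.
import numpy as np

def first_order_rate(days, conc):
    """Fit C(t) = C0 * exp(-k t) by linear regression on log-concentration."""
    slope, _ = np.polyfit(days, np.log(conc), 1)
    return -slope  # k (1/day)

days = np.arange(0, 17, 2)                  # 16-day incubation, sampled every 2 days
aerobic = 50 * np.exp(-0.30 * days)         # mg/L, synthetic
anaerobic = 50 * np.exp(-0.136 * days)

k_aer = first_order_rate(days, aerobic)
k_ana = first_order_rate(days, anaerobic)
print(f"k_aerobic = {k_aer:.2f}/d, k_anaerobic = {k_ana:.2f}/d, ratio = {k_aer / k_ana:.1f}")
```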

  4. Evaluation of Mapping Methodologies at a Legacy Test Site

    Science.gov (United States)

    Sussman, A. J.; Schultz-Fellenz, E. S.; Roback, R. C.; Kelley, R. E.; Drellack, S.; Reed, D.; Miller, E.; Cooper, D. I.; Sandoval, M.; Wang, R.

    2013-12-01

    On June 12th, 1985, a nuclear test with an announced yield between 20-150 kt was detonated in rhyolitic lava in a vertical emplacement borehole at a depth of 608 m below the surface. This test did not collapse to the surface and form a crater, but rather resulted in a subsurface collapse with more subtle surface expressions of deformation, providing an opportunity to evaluate the site using a number of surface mapping methodologies. The site was investigated over a two-year time span by several mapping teams. In order to determine the most time efficient and accurate approach for mapping post-shot surface features at a legacy test site, a number of different techniques were employed. The site was initially divided into four quarters, with teams applying various methodologies, techniques, and instrumentations to each quarter. Early methods included transect lines and site gridding with a Brunton pocket transit, flagging tape, measuring tape, and stakes; surveying using a hand-held personal GPS to locate observed features with an accuracy of ± 5-10 m; and extensive photo-documentation. More recent methods have incorporated the use of near survey grade GPS devices to allow careful location and mapping of surface features. Initially, gridding was employed along with the high resolution GPS surveys, but this was found to be time consuming and of little observational value. Raw visual observation (VOB) data included GPS coordinates for artifacts or features of interest, field notes, and photographs. A categorization system was used to organize the myriad of items, in order to aid in database searches and for visual presentation of findings. The collected data set was imported into a geographic information system (GIS) as points, lines, or polygons and overlain onto a digital color orthophoto map of the test site. Once these data were mapped, spectral data were collected using a high resolution field spectrometer. In addition to geo-locating the field observations with 10cm

  5. Methodology for testing subcomponents; background and motivation for subcomponent testing of wind turbine rotor blades

    DEFF Research Database (Denmark)

    Antoniou, Alexandros; Branner, Kim; Lekou, D.J.

    2016-01-01

    that cannot be verified through the currently followed testing procedures and recommend ways to overcome these limitations. The work is performed within Work-Package WP7.1 entitled “Improved and validated wind turbine structural reliability - Efficient blade structure” of the IRPWIND programme. The numerical...... for blade design, highlighting the current state of the art. The review of the full-scale blade testing procedure is performed under Section 3, followed by the discussion on the issues of verification of design and manufacture performed through testing. Finally, methodologies for testing blade subcomponents...

  6. Symptomatic hyponatremia during glomerular filtration rate testing

    OpenAIRE

    2010-01-01

    Hyponatremia affects nearly one in five of all hospitalized patients. Severe hyponatremia is associated with significant morbidity and mortality, and is therefore important to recognize. Prior reports have linked duloxetine with hyponatremia, but it is uncommon. In this case report, we describe a research subject taking duloxetine who developed severe symptomatic hyponatremia during glomerular filtration rate testing despite having undergone such testing uneventfully in the past.

  7. Methodology of Testing Shot Blasting Machines in Industrial Conditions

    Directory of Open Access Journals (Sweden)

    R. Wrona

    2012-04-01

    Full Text Available Shot blasting machines are widely used for automated surface treatment and finishing of castings. In shot blasting processes the stream of shots is generated and shaped by blasting turbines, making up a kinetic and dynamic system comprising a separating rotor, an adapting sleeve and a propelling rotor provided with blades. The shot blasting performance, i.e. the quality of shot-treated surfaces, depends on the actual design and operational parameters of the unit, whilst the values of the relevant parameters are associated with the geometry of turbine components and the level of its integration with the separator system. The circulation of the blasting medium becomes the integrating factor of the process line, starting from the hopper, through the propeller turbine, casting treatment, separation of contaminated abrasive mixture, to its recycling and reuse. Inferior quality of the abrasive agent (shot) and insufficient purity of the abrasive mixture are responsible for low effectiveness of shot blasting. However, most practitioners fail to fully recognise the importance of proper diagnostics of the shot blasting process in industrial conditions. The wearing of major machine components and of the blasting agent and the quality of shot-treated surfaces are often misinterpreted, hence the need to take into account all factors involved in the process within the frame of a comprehensive methodology. This paper is an attempt to formulate and apply the available testing methods to the engineering practice in industrial conditions.

  8. Nondestructive Semistatic Testing Methodology for Assessing Fish Textural Characteristics via Closed-Form Mathematical Expressions

    Directory of Open Access Journals (Sweden)

    D. Dimogianopoulos

    2017-01-01

    Full Text Available This paper presents a novel methodology based on semistatic nondestructive testing of fish for the analytical computation of its textural characteristics via closed-form mathematical expressions. The novelty is that, unlike alternatives, explicit values for both stiffness and viscoelastic textural attributes may be computed, even if fish of different size/weight are tested. Furthermore, the testing procedure may be adapted to the specifications (sampling rate and accuracy) of the available equipment. The experimental testing involves a fish placed on the pan of a digital weigh scale, which is subsequently tested with a ramp-like load profile in a custom-made installation. The ramp slope is (to some extent) adjustable according to the specifications (sampling rate and accuracy) of the equipment. The scale’s reaction to fish loading, namely, the reactive force, is collected throughout time and is shown to depend on the fish textural attributes according to a closed-form mathematical formula. The latter is subsequently used along with collected data in order to compute these attributes rapidly and effectively. Four whole raw sea bass (Dicentrarchus labrax) of various sizes and textures were tested. Changes in texture, related to different viscoelastic characteristics among the four fish, were correctly detected and quantified using the proposed methodology.
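
    The closed-form idea can be illustrated with the simplest viscoelastic model: for a Kelvin-Voigt solid driven by a displacement ramp x(t) = v·t, the reaction force is F(t) = k·v·t + c·v, so a straight-line fit to the recorded force yields the stiffness k from the slope and the damping c from the intercept. This is a generic illustration, not the paper's specific expressions, and the ramp speed and data below are hypothetical.

```python
# Illustrative sketch: stiffness and viscous damping from a ramp-loading test,
# assuming a Kelvin-Voigt model F(t) = k*v*t + c*v. Not the paper's formulas;
# ramp speed and data are hypothetical.
import numpy as np

def kelvin_voigt_from_ramp(t_s, force_n, ramp_speed_m_s):
    """Return (stiffness N/m, damping N*s/m) from reaction force under a ramp."""
    slope, intercept = np.polyfit(t_s, force_n, 1)
    return slope / ramp_speed_m_s, intercept / ramp_speed_m_s

v = 0.001                                   # 1 mm/s ramp (assumed)
t = np.linspace(0, 10, 500)                 # s
rng = np.random.default_rng(1)
force = 800 * v * t + 50 * v + rng.normal(0, 0.002, t.size)   # synthetic scale readings

k, c = kelvin_voigt_from_ramp(t, force, v)
print(f"stiffness ~ {k:.0f} N/m, damping ~ {c:.0f} N*s/m")
```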

  9. Introduction to Investigation And Utilizing Lean Test Metrics In Agile Software Testing Methodologies

    Directory of Open Access Journals (Sweden)

    Padmaraj Nidagundi

    2016-04-01

    Full Text Available The growth of the software development industry has pushed new development methodologies aimed at delivering error-free software that fulfils the product's business value to its end users. Advances in tools and technology have brought automation to the development and software testing process, and have also increased the demand for fast testing and delivery of software to end customers. The shift from traditional software development methodologies to the agile software development trend has brought new philosophies, dimensions, and processes, together with new tools that make the process easier. Agile development processes (Scrum, XP, FDD, BDD, ATDD, ASD, DSDM, Kanban, Crystal) and the Lean process also face a software testing model crisis: fast development life cycles and fast delivery to end users without appropriate test metrics make the software testing process slow and increase risk. Analysing the metrics used in the software testing process and setting the right lean test metrics help to improve software quality effectively in an agile process.

  10. Methodology for Flight Relevant Arc-Jet Testing of Flexible Thermal Protection Systems

    Science.gov (United States)

    Mazaheri, Alireza; Bruce, Walter E., III; Mesick, Nathaniel J.; Sutton, Kenneth

    2013-01-01

    A methodology to correlate flight aeroheating environments to the arc-jet environment is presented. For a desired hot-wall flight heating rate, the methodology provides the arc-jet bulk enthalpy for the corresponding cold-wall heating rate. A series of analyses were conducted to examine the effects of the test sample model holder geometry on the overall performance of the test sample. The analyses were compared with arc-jet test samples, and challenges and issues are presented. The transient flight environment was calculated for the Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Earth Atmospheric Reentry Test (HEART) vehicle, which is a planned demonstration vehicle using a large inflatable, flexible thermal protection system to reenter the Earth's atmosphere from the International Space Station. A series of correlations were developed to define the relevant arc-jet test environment that properly approximates the HEART flight environment. The computed arc-jet environments were compared with the measured arc-jet values to define the uncertainty of the correlated environment. The results show that for a given flight surface heat flux and a fully-catalytic TPS, the flight-relevant arc-jet heat flux increases with the arc-jet bulk enthalpy, while for a non-catalytic TPS the arc-jet heat flux decreases with the bulk enthalpy.
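
    The hot-wall/cold-wall correlation rests on the film-coefficient form q = C_H (h_r - h_w): for a given flight hot-wall heat flux and wall enthalpy, the equivalent cold-wall arc-jet heat flux at a bulk enthalpy h_b follows from the enthalpy ratio. The sketch below uses this standard textbook approximation with a cold wall near room temperature (h_w,cw ≈ 0); it is not the specific correlation developed for HEART, and all numbers are hypothetical.

```python
# Illustrative sketch of the standard film-coefficient hot-wall correction:
#   q = C_H * (h_b - h_w)  =>  q_cw = q_hw * (h_b - h_w_cw) / (h_b - h_w_hw)
# Not the HEART-specific correlation; all values are hypothetical.
def cold_wall_heat_flux(q_hot_wall, h_bulk, h_wall_hot, h_wall_cold=0.0):
    """Convert a hot-wall heat flux (W/cm^2) to the equivalent cold-wall value."""
    return q_hot_wall * (h_bulk - h_wall_cold) / (h_bulk - h_wall_hot)

q_hw = 10.0          # W/cm^2, flight hot-wall heating rate (assumed)
h_bulk = 12.0        # MJ/kg, arc-jet bulk enthalpy (assumed)
h_wall = 1.2         # MJ/kg, air enthalpy at the hot-wall temperature (assumed)

print(f"equivalent cold-wall heat flux ~ {cold_wall_heat_flux(q_hw, h_bulk, h_wall):.1f} W/cm^2")
```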

  11. TESTING MONETARY EXCHANGE RATE MODELS WITH PANEL COINTEGRATION TESTS

    Directory of Open Access Journals (Sweden)

    Szabo Andrea

    2015-07-01

    Full Text Available The monetary exchange rate models explain the long run behaviour of the nominal exchange rate. Their central assertion is that there is a long run equilibrium relationship between the nominal exchange rate and monetary macro-fundamentals. Although these models are essential tools of international macroeconomics, their empirical validity is ambiguous. Previously, time series testing was prevalent in the literature, but it did not bring convincing results. The power of the unit root and cointegration tests is too low to reject the null hypothesis of no cointegration between the variables. This power can be enhanced by arranging our data in a panel data set, which allows us to analyse several time series simultaneously and increases the number of observations. We conducted a weak empirical test of the monetary exchange rate models by testing the existence of cointegration between the variables in three panels. We investigated 6, 10 and 15 OECD countries during the following periods: 1976Q1-2011Q4, 1985Q1-2011Q4 and 1996Q1-2011Q4. We tested the reduced form of the monetary exchange rate models in three specifications: two restricted models and an unrestricted model. Since cointegration can only be interpreted among non-stationary processes, we investigated the order of integration of our variables with the IPS, Fisher-ADF and Fisher-PP panel unit root tests and the Hadri panel stationarity test. All the variables can be unit root processes; therefore we analysed cointegration with the Pedroni and Kao panel cointegration tests. The restricted models performed better than the unrestricted one, and we obtained the best results with the 1985Q1-2011Q4 panel. The Kao test rejects the null hypothesis of no cointegration between the variables in all specifications and all panels, but the Pedroni test does not show such a positive picture. Hence we found only moderate support for the monetary exchange rate models.
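
    As an illustration of the panel logic, a Fisher-type (Maddala-Wu) panel unit root test combines the p-values p_i of individual ADF tests across N countries into λ = -2 Σ ln p_i, which is χ²-distributed with 2N degrees of freedom under the joint null of a unit root in every series. The sketch below implements that combination on simulated exchange-rate-like series; it is not the authors' estimation code or dataset.

```python
# Illustrative sketch of a Fisher-type (Maddala-Wu) panel unit root test:
# combine per-country ADF p-values as lambda = -2 * sum(ln p_i) ~ chi2(2N).
# Simulated data; not the authors' dataset or code.
import numpy as np
from scipy.stats import chi2
from statsmodels.tsa.stattools import adfuller

def fisher_panel_unit_root(series_list):
    """Return (statistic, p-value) for the joint null of a unit root in all series."""
    pvals = [adfuller(s, autolag="AIC")[1] for s in series_list]
    stat = -2.0 * np.sum(np.log(pvals))
    return stat, chi2.sf(stat, df=2 * len(pvals))

rng = np.random.default_rng(0)
T, N = 144, 10                                  # ~36 years of quarterly data, 10 countries
panel = [np.cumsum(rng.normal(size=T)) for _ in range(N)]   # random walks (null is true)

stat, p = fisher_panel_unit_root(panel)
print(f"Fisher statistic = {stat:.1f}, p-value = {p:.3f}")
```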

  12. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  13. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  14. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    Science.gov (United States)

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  16. University Rating: Ideology and Methodology of «League Charts» (foreign practice)

    Directory of Open Access Journals (Sweden)

    I V Trotsuk

    2008-03-01

    Full Text Available The article represents the first part of an analytical review of the ideology, methodology and practice of constructing university ratings. The author singles out the basic approaches to the task, whose specificity is conditioned by differences in national educational systems and by the respective emphases in the functional aims of ratings (part of the professional accreditation of a university, a factor in applicants' choices, and/or an instrument of international competition).

  17. Servo drive chain evaluation test set-up and configuration methodology

    Directory of Open Access Journals (Sweden)

    R. Heera Singh

    2016-06-01

    Full Text Available The evaluation test bench set-up for a servo drive chain is considered; it consists of various servo sub-modules, viz. a generic motion controller, servo drive amplifier, brushless AC servo motor, torque coupler, gear reducer, and a shaft encoder assembly for position feedback. The module interfaces are established for efficient use in commissioning, diagnosing and qualifying the antenna tracking chain. The design methodology is demonstrated and system specifications are derived. Design specifications of the drive chain are configured through software tools to optimize the rate loop and position loop. The transient behaviour and response of the servo system using a Proportional-Integral-Derivative controller are analyzed in both the time and frequency domains. Stability conditions are simulated and verified. The test set-up is energised, test results for different inputs are verified, and the following error is minimised by tuning/optimising the system.
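
    The rate-loop/position-loop tuning described above can be prototyped in a few lines of simulation: a discrete PID controller driving a first-order velocity model (integrated once to position) already exposes the transient behaviour and following error mentioned in the abstract. The plant constants and gains below are hypothetical, chosen only to make the example run; they are not the paper's values.

```python
# Illustrative sketch: discrete PID position loop around a first-order velocity
# plant. Gains and plant constants are hypothetical, not the paper's values.
import numpy as np

def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.001, t_end=2.0, tau=0.05, gain=10.0):
    """Return (time, position) for a step command; plant: tau*dv/dt + v = gain*u."""
    n = int(t_end / dt)
    pos = vel = integ = prev_pos = 0.0
    history = np.zeros(n)
    for i in range(n):
        err = setpoint - pos
        integ += err * dt
        deriv = -(pos - prev_pos) / dt      # derivative on measurement (avoids setpoint kick)
        u = kp * err + ki * integ + kd * deriv
        vel += dt * (gain * u - vel) / tau  # first-order velocity (rate) response
        prev_pos = pos
        pos += vel * dt                     # integrate velocity to position
        history[i] = pos
    return np.arange(n) * dt, history

t, y = simulate_pid(kp=8.0, ki=2.0, kd=0.3)
print(f"overshoot = {100 * max(y.max() - 1, 0):.1f} %, final error = {1 - y[-1]:.4f}")
```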

  18. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objective of this study is to develop a methodology for assessing optimized Surveillance Test Intervals (STI) and Allowed Outage Times (AOT) using PSA methods that can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that remedies the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method.
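
    The trade-off behind STI optimization can be illustrated with the standard point estimate for a periodically tested standby component: average unavailability U(T) ≈ λT/2 + τ/T, minimized at T = sqrt(2τ/λ). The sketch below evaluates this textbook relation for hypothetical parameters; it is not the risk model developed in the study.

```python
# Illustrative sketch: average unavailability of a periodically tested standby
# component, U(T) ~ lambda*T/2 + tau/T, and the test interval minimizing it.
# Generic PSA textbook relation; parameter values are hypothetical.
import numpy as np

def unavailability(test_interval_h, failure_rate_per_h, test_duration_h):
    return failure_rate_per_h * test_interval_h / 2.0 + test_duration_h / test_interval_h

lam, tau = 1.0e-5, 2.0                     # assumed standby failure rate and test outage
intervals = np.array([24 * 7, 24 * 30, 24 * 91, 24 * 182], dtype=float)  # weekly .. semi-annual

for t in intervals:
    print(f"T = {t / 24:5.0f} d  ->  U = {unavailability(t, lam, tau):.2e}")

t_opt = np.sqrt(2.0 * tau / lam)           # dU/dT = 0
print(f"optimum interval ~ {t_opt / 24:.0f} days")
```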

  19. Two methodologies for physical penetration testing using social engineering

    NARCIS (Netherlands)

    Dimkov, Trajce; Pieters, Wolter; Hartel, Pieter

    2010-01-01

    Penetration tests on IT systems are sometimes coupled with physical penetration tests and social engineering. In physical penetration tests where social engineering is allowed, the penetration tester directly interacts with the employees. These interactions are usually based on deception and if not

  20. Test methodology and characterization of batteries for remote power applications

    Energy Technology Data Exchange (ETDEWEB)

    Manninen, L.M.; Tuominen, E.; Lund, P.H.

    1997-12-31

    Battery storage is an integral subcomponent of many remote autonomous energy systems. An accurate battery model is essential for the analysis of the system performance. The accuracy of the performance estimate is therefore often dependent on how well the behaviour of the battery is understood. This paper presents computational submodels that predict the voltage vs. current behaviour and internal losses of a vented lead acid battery and illustrates their utilization in practical simulation. A complete and compact methodology for the determination of the battery model parameters that is easily adaptable for different battery types is also presented. The method can be applied routinely. The required instrumentation is minimal; only battery voltage, current and temperature are recorded. The model parameters for a vented lead acid battery determined with this method are also given. (orig.) 26 refs.
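
    A minimal illustration of the kind of voltage-versus-current submodel described above is an open-circuit-voltage-plus-internal-resistance model with coulomb-counted state of charge. The parameter values below are hypothetical and the model is a generic simplification, not the paper's lead-acid submodels.

```python
# Illustrative sketch: generic OCV + internal-resistance battery model with
# coulomb counting. Parameters are hypothetical; this is not the paper's
# vented lead-acid submodel.
import numpy as np

OCV_TABLE_SOC = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
OCV_TABLE_V = np.array([11.6, 11.9, 12.2, 12.5, 12.7])   # 12 V lead-acid-like values

def terminal_voltage(soc, current_a, r_int_ohm=0.02):
    """Terminal voltage for a given state of charge and discharge current (A > 0)."""
    ocv = np.interp(soc, OCV_TABLE_SOC, OCV_TABLE_V)
    return ocv - current_a * r_int_ohm

def simulate_discharge(capacity_ah=100.0, current_a=20.0, hours=3.0, dt_h=0.1):
    soc, log = 1.0, []
    for _ in range(int(hours / dt_h)):
        log.append((soc, terminal_voltage(soc, current_a)))
        soc -= current_a * dt_h / capacity_ah            # coulomb counting
    return log

for soc, v in simulate_discharge()[::10]:
    print(f"SOC = {soc:.2f}  V = {v:.2f} V")
```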

  1. A Novel HW/SW Based NoC Router Self-Testing Methodology

    OpenAIRE

    Nazari, Masoom; Lighvan, Mina Zolfy; Koozekonani, Ziaeddin Daie; Sadeghi, Ali

    2016-01-01

    Network-on-Chip (NoC) architecture has been proposed to solve the global communication problem of complex Systems-on-Chips (SoCs). However, NoC testing remains a major challenge. In this article, we propose a novel test architecture for NoC router testing. The proposed test architecture combines the advantages of the Software-Based Self-Testing (SBST) and Built-In Self-Test (BIST) methodologies. In this methodology, we propose custom test instructions based on the ISA of the NoC processor...

  2. METHODOLOGY FOR DETERMINING INDICATORS OF DIESEL 4DTNA1 DURING ROAD TESTING

    Directory of Open Access Journals (Sweden)

    Gritsuk, O.

    2013-06-01

    Full Text Available The paper presents a methodology for determining the technical and economic parameters of an automotive diesel engine during road testing. Recent testing techniques used during tests of the small-class bus RUTA-25d, equipped with the domestic 4DTNA1 automotive diesel engine, are shown. The characteristics of the test site used for the road testing are described.

  3. Methodological issues in testing the marginal productivity theory

    NARCIS (Netherlands)

    P.T. Gottschalk (Peter); J. Tinbergen (Jan)

    1982-01-01

    textabstractPrevious tests of the marginal productivity theory have been criticized on several grounds reviewed by the authors. One important deficiency has been the small number of factor inputs entered in the production functions. In 1978 Gottschalk suggested a method to estimate production functi

  4. Two methodologies for physical penetration testing using social engineering

    NARCIS (Netherlands)

    Dimkov, Trajce; Pieters, Wolter; Hartel, Pieter

    2009-01-01

    During a penetration test on the physical security of an organization, if social engineering is used, the penetration tester directly interacts with the employees. These interactions are usually based on deception and if not done properly can upset the employees, violate their privacy or damage thei

  5. Adaptive and robust active vibration control methodology and tests

    CERN Document Server

    Landau, Ioan Doré; Castellanos-Silva, Abraham; Constantinescu, Aurelian

    2017-01-01

    This book approaches the design of active vibration control systems from the perspective of today’s ideas of computer control. It formulates the various design problems encountered in the active management of vibration as control problems and searches for the most appropriate tools to solve them. The experimental validation of the solutions proposed on relevant test benches is also addressed. To promote the widespread acceptance of these techniques, the presentation eliminates unnecessary theoretical developments (which can be found elsewhere) and focuses on algorithms and their use. The solutions proposed cannot be fully understood and creatively exploited without a clear understanding of the basic concepts and methods, so these are considered in depth. The focus is on enhancing motivations, algorithm presentation and experimental evaluation. MATLAB® routines, Simulink® diagrams and bench-test data are available for download and encourage easy assimilation of the experimental and exemplary material. Thre...

  6. Testing for Equivalence: A Methodology for Computational Cognitive Modelling

    Science.gov (United States)

    Stewart, Terrence; West, Robert

    2010-12-01

    The equivalence test (Stewart and West, 2007; Stewart, 2007) is a statistical measure for evaluating the similarity between a model and the system being modelled. It is designed to avoid over-fitting and to generate an easily interpretable summary of the quality of a model. We apply the equivalence test to two tasks: Repeated Binary Choice (Erev et al., 2010) and Dynamic Stocks and Flows (Gonzalez and Dutt, 2007). In the first case, we find a broad range of statistically equivalent models (and win a prediction competition) while identifying particular aspects of the task that are not yet adequately captured. In the second case, we re-evaluate results from the Dynamic Stocks and Flows challenge, demonstrating how our method emphasizes the breadth of coverage of a model and how it can be used for comparing different models. We argue that the explanatory power of models hinges on numerical similarity to empirical data over a broad set of measures.

  7. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Science.gov (United States)

    Izzuddin, Nur; Sunarsih, Priyanto, Agoes

    2015-05-01

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests to track fuel savings in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The input and output form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving achieved by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users in analyzing different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
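
    The propeller side of such a simulator is commonly built from non-dimensional open-water coefficients: thrust T = ρ n² D⁴ K_T and torque Q = ρ n² D⁵ K_Q, with K_T and K_Q evaluated at the advance ratio J = V_a/(nD). The sketch below shows that standard calculation with made-up coefficient curves; it is not the propeller model used in the paper.

```python
# Illustrative sketch: thrust and torque from open-water propeller coefficients
#   J = Va/(n*D), T = rho*n^2*D^4*KT(J), Q = rho*n^2*D^5*KQ(J).
# The KT/KQ curves and input values are made up, not the paper's model.
import numpy as np

RHO = 1025.0                             # kg/m^3, seawater

def kt(j): return 0.45 - 0.30 * j        # assumed open-water curves
def kq(j): return 0.060 - 0.035 * j

def propeller_loads(speed_ms, rps, diameter_m, wake_fraction=0.2):
    va = speed_ms * (1.0 - wake_fraction)    # advance speed at the propeller
    j = va / (rps * diameter_m)
    thrust = RHO * rps**2 * diameter_m**4 * kt(j)
    torque = RHO * rps**2 * diameter_m**5 * kq(j)
    power = 2.0 * np.pi * rps * torque       # delivered power
    return j, thrust, torque, power

j, t, q, p = propeller_loads(speed_ms=6.0, rps=2.5, diameter_m=3.0)
print(f"J = {j:.2f}, thrust = {t/1e3:.0f} kN, torque = {q/1e3:.0f} kNm, power = {p/1e3:.0f} kW")
```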

  8. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)

    2015-05-15

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests to track fuel savings in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The input and output form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving achieved by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users in analyzing different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.

  9. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  10. Dynamic tensile testing for determining the stress-strain curve at different strain rate

    OpenAIRE

    Mansilla, A; Regidor, A.; García, D.; Negro, A

    2001-01-01

    A detailed discussion of high strain-rate tensile testing is presented. A comparative analysis of different ways to measure stress and strain is made. The experimental stress-strain curves have been suitably interpreted to distinguish between the real behaviour of the material and the influence of the testing methodology itself. A special two-section flat specimen design was developed through FEA computer modelling. The mechanical properties as a function of strain rate were experimentally obt...

  11. Introduction and summary of the 13th meeting of the Scientific Group on Methodologies for the Safety Evaluation of Chemicals (SGOMSEC): alternative testing methodologies.

    OpenAIRE

    Stokes, W S; Marafante, E

    1998-01-01

    A workshop on alternative toxicological testing methodologies was convened by the Scientific Group on Methodologies for the Safety Evaluation of Chemicals (SGOMSEC) 26-31 January 1997 in Ispra, Italy, at the European Centre for the Validation of Alternative Methods. The purpose of the workshop was to assess the current status of alternative testing methodologies available to evaluate adverse human health and environmental effects of chemicals. Another objective of the workshop was to identify...

  12. Nutrients interaction investigation to improve Monascus purpureus FTC5391 growth rate using Response Surface Methodology and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Mohamad, R.

    2013-01-01

    Full Text Available Aims: Two vital factors, suitable environmental conditions and nutrients as a source of energy, are required for the successful growth and reproduction of microorganisms. Manipulation of nutritional requirements is the simplest and most effective strategy to stimulate and enhance the activity of microorganisms. Methodology and Results: In this study, response surface methodology (RSM) and an artificial neural network (ANN) were employed to optimize the carbon and nitrogen sources in order to improve the growth rate of Monascus purpureus FTC5391, a new local isolate. The best models for optimization of growth rate were a multilayer full feed-forward incremental back propagation network and a modified response surface model using backward elimination. The optimum condition for cell mass production was: sucrose 2.5%, yeast extract 0.045%, casamino acid 0.275%, sodium nitrate 0.48%, potato starch 0.045%, dextrose 1%, potassium nitrate 0.57%. The experimental cell mass production using this optimal condition was 21 mg/plate/12 days, which was 2.2-fold higher than with the standard condition (sucrose 5%, yeast extract 0.15%, casamino acid 0.25%, sodium nitrate 0.3%, potato starch 0.2%, dextrose 1%, potassium nitrate 0.3%). Conclusion, significance and impact of study: The results of RSM and ANN showed that all carbon and nitrogen sources tested had a significant effect on growth rate (P-value < 0.05). In addition, the use of RSM and ANN alongside each other provided a proper growth prediction model.
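
    The response-surface side of this optimization amounts to fitting a quadratic model to designed experiments and pruning insignificant terms. The sketch below fits a two-factor quadratic surface with ordinary least squares and removes terms by backward elimination at p > 0.05; the factors, data, and threshold are illustrative, not the study's seven-factor design.

```python
# Illustrative sketch: quadratic response-surface fit with backward elimination
# (p > 0.05). Two factors and synthetic data only; not the study's design.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.uniform(0, 5, 30)        # e.g. sucrose %
x2 = rng.uniform(0, 1, 30)        # e.g. sodium nitrate %
y = 5 + 4 * x1 - 0.6 * x1**2 + 8 * x2 + rng.normal(0, 0.5, 30)   # synthetic growth response

X = pd.DataFrame({"x1": x1, "x2": x2, "x1^2": x1**2, "x2^2": x2**2, "x1*x2": x1 * x2})
X = sm.add_constant(X)

while True:                                       # backward elimination loop
    model = sm.OLS(y, X).fit()
    pvals = model.pvalues.drop("const")
    if pvals.empty or pvals.max() <= 0.05:
        break
    X = X.drop(columns=pvals.idxmax())            # drop the least significant term

print(model.summary().tables[1])
```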

  13. Development of methodology for community level toxicity testing using the fathead minnow seven day survival-growth impairment test

    OpenAIRE

    Lauth, John Robert

    1990-01-01

    Single species toxicity tests are widely used to assess the potential effects of a toxicant on aquatic life. Increasingly, it is necessary to understand how the results of these tests relate to toxicant effects in natural communities. This dissertation presents the methodology and validation for a community level toxicity test that bridges the gap between single species tests and natural community responses. The research involved control of environmental parameters, improvement...

  14. Methodology for assessing the impacts of alternative rate designs on industrial energy use. Draft report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-11

    A task was undertaken to develop a method for analyzing industrial user responses to alternative rate designs. The method described considers the fuel switching and conservation responses of industrial users and the impact on a hypothetical utility regarding revenue stability, annual gas demand, and seasonal fluctuations. Twenty-seven hypothetical industrial plant types have been specified. For each combustor in the plant, the fuel consumption by season, initial fuel type, fuel switching costs, conservation costs, and amount of fuel conservable are provided. The decision making takes place at the plant level and is aggregated to determine the impact on the utility. Section 2 discusses the factors affecting an industrial user's response to alternative rate designs. Section 3 describes the methodology, including an overview of the model and an example industrial user's response to a set of fuel prices. The data describing the 27 hypothetical firms are given in an appendix.

  15. 78 FR 4855 - Random Drug Testing Rate for Covered Crewmembers

    Science.gov (United States)

    2013-01-23

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2013 minimum random drug testing rate at 25 percent of covered crewmembers. The Coast Guard will continue to...

  16. 76 FR 79204 - Random Drug Testing Rate for Covered Crewmembers

    Science.gov (United States)

    2011-12-21

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2012 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random drug...

  17. 76 FR 1448 - Random Drug Testing Rate for Covered Crewmembers

    Science.gov (United States)

    2011-01-10

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2011 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random drug...

  18. 21 CFR 864.6700 - Erythrocyte sedimentation rate test.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Erythrocyte sedimentation rate test. 864.6700... (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Manual Hematology Devices § 864.6700 Erythrocyte sedimentation rate test. (a) Identification. An erythrocyte sedimentation rate test is a device that...

  19. Methodology for estimating radiation dose rates to freshwater biota exposed to radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, B.G.; Frank, M.L.; O'Neal, B.R.

    1993-08-01

    The purpose of this report is to present a methodology for evaluating the potential for aquatic biota to incur effects from exposure to chronic low-level radiation in the environment. Aquatic organisms inhabiting an environment contaminated with radioactivity receive external radiation from radionuclides in water, sediment, and from other biota such as vegetation. Aquatic organisms receive internal radiation from radionuclides ingested via food and water and, in some cases, from radionuclides absorbed through the skin and respiratory organs. Dose rate equations, which have been developed previously, are presented for estimating the radiation dose rate to representative aquatic organisms from alpha, beta, and gamma irradiation from external and internal sources. Tables containing parameter values for calculating radiation doses from selected alpha, beta, and gamma emitters are presented in the appendix to facilitate dose rate calculations. The risk of detrimental effects to aquatic biota from radiation exposure is evaluated by comparing the calculated radiation dose rate to biota to the U.S. Department of Energy's (DOE's) recommended dose rate limit of 0.4 mGy h⁻¹ (1 rad d⁻¹). A dose rate no greater than 0.4 mGy h⁻¹ to the most sensitive organisms should ensure the protection of populations of aquatic organisms. DOE's recommended dose rate is based on a number of published reviews on the effects of radiation on aquatic organisms that are summarized in the National Council on Radiation Protection and Measurements Report No. 109 (NCRP 1991). DOE recommends that if the results of radiological models or dosimetric measurements indicate that a radiation dose rate of 0.1 mGy h⁻¹ will be exceeded, then a more detailed evaluation of the potential ecological consequences of radiation exposure to endemic populations should be conducted.
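
    The internal-dose part of such a methodology reduces, in its simplest form, to converting a tissue radionuclide concentration and an absorbed decay energy into an absorbed dose rate: 1 Bq/kg depositing 1 MeV per decay corresponds to about 5.77 × 10⁻⁴ µGy/h. The sketch below applies that unit conversion and compares the result with the 0.4 mGy/h guideline; the concentration and energy values are hypothetical, and the geometry and absorbed-fraction corrections of the report are omitted.

```python
# Illustrative sketch: internal absorbed dose rate from a whole-body
# radionuclide concentration, assuming all decay energy is absorbed.
#   1 Bq/kg * 1 MeV/decay = 1.602e-13 Gy/s ~ 5.77e-4 uGy/h
# Nuclide data and concentration are hypothetical; absorbed fractions and
# external pathways from the report are omitted.
MEV_TO_UGY_PER_H = 1.602e-13 * 3600 * 1e6     # (Gy/s per Bq/kg*MeV) -> uGy/h
DOSE_LIMIT_UGY_PER_H = 400.0                  # DOE guideline, 0.4 mGy/h

def internal_dose_rate(conc_bq_per_kg, energy_mev_per_decay):
    """Absorbed dose rate (uGy/h) for a uniformly distributed emitter."""
    return conc_bq_per_kg * energy_mev_per_decay * MEV_TO_UGY_PER_H

dose = internal_dose_rate(conc_bq_per_kg=5.0e4, energy_mev_per_decay=0.66)
print(f"dose rate = {dose:.1f} uGy/h "
      f"({dose / DOSE_LIMIT_UGY_PER_H:.1%} of the 0.4 mGy/h guideline)")
```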

  1. Building Energy Simulation Test for Existing Homes (BESTEST-EX) Methodology: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Polly, B.; Bianchi, M.; Neymark, J.

    2011-11-01

    The test suite represents a set of cases applying the new Building Energy Simulation Test for Existing Homes (BESTEST-EX) Methodology developed by NREL (Judkoff et al. 2010a). The NREL team developed the test cases in consultation with the home retrofit industry (BESTEST-EX Working Group 2009), and adjusted the test specifications in accordance with information supplied by a participant with access to large utility bill datasets (Blasnik 2009).

  2. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology.
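
    The hypotheses at stake concern the scaling exponent of a behavioral series. As a point of reference, and explicitly not the BAS procedure itself, the sketch below estimates that exponent in the conventional way, from the slope of the log-log periodogram, for a white-noise series whose true exponent is 0.

```python
# Illustrative sketch: conventional periodogram-slope estimate of a scaling
# exponent (power ~ 1/f^alpha). This is a standard reference method, not the
# Bayesian assessment of scaling described in the paper.
import numpy as np
from scipy.signal import periodogram

def scaling_exponent(series):
    """Estimate alpha from the low-frequency slope of the log-log periodogram."""
    freqs, power = periodogram(series, detrend="linear")
    mask = (freqs > 0) & (freqs < 0.25)            # keep the low-frequency range
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(power[mask]), 1)
    return -slope                                  # ~0 for white noise, ~1 for 1/f noise

rng = np.random.default_rng(0)
white = rng.normal(size=4096)
print(f"estimated alpha for white noise: {scaling_exponent(white):.2f}")
```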

  3. Test methodology for elemental sulfur resistant advanced materials for oil and gas field equipment

    Energy Technology Data Exchange (ETDEWEB)

    Steinbeck, G. [Verein Deutscher Eisenhuettenleute, Duesseldorf (Germany); Bruckhoff, W. [BEB Erdgas und Erdoel GmbH, Hannover (Germany); Koehler, M. [Krupp-VDM AG, Werdohl (Germany); Schlerkmann, H. [Mannesmann Forschungsinstitut, Duisburg (Germany); Schmitt, G. [Iserlohn Polytechnic (Germany). Lab. for Corrosion Protection

    1995-10-01

    The great variety of methodologies for testing the performance of advanced materials for resistance to elemental sulfur in oil and gas industry prompted the Technical Committee for Corrosion of the German Iron and Steel Institute (VDEh) to define recommended test procedures. These procedures have already found wide acceptance in the German materials and oil and gas industry.

  4. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional moveme

  5. 75 FR 76078 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2010-12-07

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Random Drug Testing Rate AGENCY... percentage rate for random drug testing. SUMMARY: PHMSA has determined that the minimum random drug testing... percentage of covered employees for random drug testing. Pursuant to 49 CFR 199.105(c)(2), (3), and (4), the...

  6. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2012-01-18

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Random Drug Testing Rate AGENCY... Percentage Rate for Random Drug Testing. SUMMARY: PHMSA has determined that the minimum random drug testing... percentage of covered employees for random drug testing. Pursuant to 49 CFR 199.105(c)(2), (3), and (4), the...

  7. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2010-02-26

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Random Drug Testing Rate AGENCY... Percentage Rate for Random Drug Testing. SUMMARY: PHMSA has determined that the minimum random drug testing... percentage of covered employees for random drug testing. Pursuant to 49 CFR 199.105(c)(2), (3), and (4), the...

  8. Experimental Tests of Local Cosmological Expansion Rates

    CERN Document Server

    Widom, A; Srivastava, Y

    2015-01-01

    Cosmological expansion on a local scale is usually neglected in part due to its smallness, and in part due to components of bound systems (especially those bound by non-gravitational forces such as atoms and nuclei) not following the geodesics of the cosmological metric. However, it is interesting to ask whether or not experimental tests of cosmological expansion on a local scale (well within our own galaxy) might be experimentally accessible in some manner. We point out, using the Pioneer satellites as an example, that current satellite technology allows for this possibility within time scales of less than one human lifetime.

  9. Field Cone Penetration Tests with Various Penetration Rates - Test Results

    DEFF Research Database (Denmark)

    Poulsen, Rikke; Nielsen, Benjaminn Nordahl; Ibsen, Lars Bo

The test site is located at Nordre Ringgade near the town of Dronninglund in northern Jutland, Denmark. The site area is relatively flat and was chosen because it covers approximately 3 ha and contains a relatively thick deposit of silty soils. Furthermore, the groundwater table was encountered at approximately 0.2-0.6 m below ground level. The soil stratigraphy of the test site was identified from geotechnical boring results before the tests started. The borings indicated that the site consists of sandy silt with clay stripes from approximately 4.0 to 10 m. Near the top, the silty soil is very sandy with few clay stripes; the clay stripes gradually increase, so that from approximately 10 m the soil consists of clay with sandy silt stripes. Large soil samples were also collected from the test site in order to determine basic soil properties in the laboratory.

  10. Methodology for Estimating Radiation Dose Rates to Freshwater Biota Exposed to Radionuclides in the Environment

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, B.G.

    1993-01-01

    The purpose of this report is to present a methodology for evaluating the potential for aquatic biota to incur effects from exposure to chronic low-level radiation in the environment. Aquatic organisms inhabiting an environment contaminated with radioactivity receive external radiation from radionuclides in water, sediment, and from other biota such as vegetation. Aquatic organisms receive internal radiation from radionuclides ingested via food and water and, in some cases, from radionuclides absorbed through the skin and respiratory organs. Dose rate equations, which have been developed previously, are presented for estimating the radiation dose rate to representative aquatic organisms from alpha, beta, and gamma irradiation from external and internal sources. Tables containing parameter values for calculating radiation doses from selected alpha, beta, and gamma emitters are presented in the appendix to facilitate dose rate calculations. The risk of detrimental effects to aquatic biota from radiation exposure is evaluated by comparing the calculated radiation dose rate to biota to the U.S. Department of Energy's (DOE's) recommended dose rate limit of 0.4 mGy h{sup -1} (1 rad d{sup -1}). A dose rate no greater than 0.4 mGy h{sup -1} to the most sensitive organisms should ensure the protection of populations of aquatic organisms. DOE's recommended dose rate is based on a number of published reviews on the effects of radiation on aquatic organisms that are summarized in the National Council on Radiation Protection and Measurements Report No. 109 (NCRP 1991). The literature identifies the developing eggs and young of some species of teleost fish as the most radiosensitive organisms. DOE recommends that if the results of radiological models or dosimetric measurements indicate that a radiation dose rate of 0.1 mGy h{sup -1} will be exceeded, then a more detailed evaluation of the potential ecological consequences of radiation exposure to endemic
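
    The screening logic summarized in this abstract (compare a calculated dose rate against the 0.1 mGy/h screening level and the 0.4 mGy/h DOE limit) can be illustrated with a minimal sketch; the dose-rate numbers below are hypothetical placeholders for values that would come from the report's dose rate equations.

```python
# Sketch of the screening comparison described in the abstract (values are illustrative).
# The dose rates themselves would come from the report's dose rate equations and tables.

DOE_LIMIT_MGY_PER_H = 0.4       # DOE recommended dose rate limit (1 rad/d)
SCREENING_MGY_PER_H = 0.1       # level above which a more detailed evaluation is recommended

def screen_biota_dose_rate(total_dose_rate_mgy_per_h: float) -> str:
    """Classify a calculated dose rate to the most sensitive aquatic organism."""
    if total_dose_rate_mgy_per_h > DOE_LIMIT_MGY_PER_H:
        return "exceeds DOE limit: detrimental effects to populations cannot be ruled out"
    if total_dose_rate_mgy_per_h > SCREENING_MGY_PER_H:
        return "above screening level: detailed ecological evaluation recommended"
    return "below screening level: populations considered protected"

# Hypothetical example: external (water + sediment) plus internal contributions, in mGy/h.
external, internal = 0.03, 0.05   # illustrative numbers only
print(screen_biota_dose_rate(external + internal))
```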

  12. Testing the hypothesis of the natural suicide rates: Further evidence from OECD data

    DEFF Research Database (Denmark)

    Andres, Antonio Rodriguez; Halicioglu, Ferda

    2011-01-01

made ideal from the point of view of suicide (Yang and Lester, 1991). This research relates the suicide rates to harmonized unemployment and divorce rates to test the natural-rate hypothesis statistically. We also address methodological flaws in earlier suicide studies by employing autoregressive.......64) and Japan has the highest (13.98) natural rate of suicides. In terms of the male natural suicide rates, the United Kingdom ranks the lowest (4.73) and Belgium ranks the top (15.44). As for the female natural suicide rates, Japan takes the lead (16.76) and Italy has the lowest (5.60). The results are also...

  13. The curious case of the coding and self-ratings mismatches: A methodological and theoretical detective story

    DEFF Research Database (Denmark)

    Panattoni, Katherine W.; McLean, Kate C.

    2017-01-01

    In this paper, we investigate methodological and theoretical constraints implicated by findings of low correlations between researcher codings and participant ratings of conceptually similar narrative features. We discuss potential explanations for these puzzling mismatches from a measurement per...

  14. Development of Clinical Rating Criteria for Tests of Lumbopelvic Stability

    Directory of Open Access Journals (Sweden)

    Margaret A. Perrott

    2012-01-01

    Objective. To develop rating criteria for three clinical tests of LPS. Design. Qualitative research: focus group. Method. A focus group of five expert physiotherapists used qualitative methods to develop rating criteria for the three clinical tests. Results. Detailed rating criteria were established for the three tests. Each key factor considered important for LPS had characteristics described that represented both good and poor LPS. Conclusion. This study established rating criteria that may be used to clinically assess LPS.

  15. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

The objective of this study is to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of the study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basis for developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method. In the second year of the study, sensitivity analyses of the component failure factors were performed on the basis of the assessment methodology from the first year, and the interaction modelling of STI and AOT was quantified. The reliability assessment methodology for the diesel generator was also reviewed and applied to the PSA code.

  16. Evaluation of the Performance of the PVUSA Rating Methodology Applied to Dual Junction PV Technology: Preprint (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Myers, D. R.

    2009-07-01

The PVUSA (Photovoltaics for Utility Scale Applications) project in the 1990s developed a rating methodology for PV performance evaluation which has become popular and has even been incorporated into concentrating PV rating standards. This report applies that method to a rack-mounted dual-junction PV system and produces a system rating.

  17. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

Predictive methodologies for testing expected-return models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way; empirical studies of Brazilian stock market data are generally concentrated only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps - time-series and cross-section regressions - with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital market, but there is evidence of a value effect and of the relevance of the market factor for explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and because this subject is still incipient and polemic in the Brazilian academic environment.

  18. PCP METHODOLOGY FOR DETERMINING DOSE RATES FOR SMALL GRAM QUANTITIES IN SHIPPING PACKAGINGS

    Energy Technology Data Exchange (ETDEWEB)

    Nathan, S.

    2011-08-23

The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials, are significantly less hazardous than large amounts of the same materials. This study describes a methodology designed to estimate an SGQ for several neutron- and gamma-emitting isotopes that can be shipped in a package compliant with the external radiation level limits of 10 CFR Part 71. These regulations require that packaging for the shipment of radioactive materials perform, under both normal and accident conditions, the essential functions of material containment, subcriticality, and maintenance of external radiation levels within regulatory limits. 10 CFR 71.33(b)(1), (2) and (3) state that radioactive and fissile materials must be identified and that their maximum quantity and chemical and physical forms be included in an application. Furthermore, the U.S. Federal Regulations require that the application contain an evaluation demonstrating that the package (i.e., the packaging and its contents) satisfies the external radiation standards for all packages (10 CFR 71.31(2), 71.35(a) and 71.47). By placing the contents in a He leak-tight containment vessel and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes emit sufficiently strong photon radiation that small amounts of material can yield a large external dose rate. Quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify the external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings described in this report provides bounding mass limits for a set of proposed SGQ isotopes. Methodology calculations were performed to estimate external radiation levels

  19. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Cho, Jae Seon; Huh, Chang Wook; Kim, Do Hyoung; Kim, Ju Youl; Kim, Yoon Ik; Yang, Hui Chang; Park, Kang Min [Seoul National Univ., Seoul (Korea, Republic of)

    1998-03-15

The objective of this study is to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement current deterministic methods and improve the safety of Korean nuclear power plants. In this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basis for developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method. Sensitivity analyses of the component failure factors are performed on the basis of the assessment methodology, and the interaction of STI and AOT is quantified. The reliability assessment methodology for the diesel generator is also reviewed and applied to the PSA code. A qualitative assessment of the STI/AOT for the RPS/ESFAS, the most safety-significant systems in the nuclear power plant, is also performed.

  20. Ground validation of DPR precipitation rate over Italy using H-SAF validation methodology

    Science.gov (United States)

    Puca, Silvia; Petracca, Marco; Sebastianelli, Stefano; Vulpiani, Gianfranco

    2017-04-01

The H-SAF project (Satellite Application Facility on Support to Operational Hydrology and Water Management, funded by EUMETSAT) is aimed at retrieving key hydrological parameters such as precipitation, soil moisture and snow cover. Within the H-SAF consortium, the Product Precipitation Validation Group (PPVG) evaluates the accuracy of instantaneous and accumulated precipitation products with respect to ground radar and rain gauge data, adopting the same methodology (a Unique Common Code) throughout Europe. The adopted validation methodology can be summarized in the following steps: (1) quality control of ground data (radar and rain gauge); (2) spatial interpolation of rain gauge measurements; (3) up-scaling of radar data to the satellite native grid; (4) temporal comparison of satellite and ground-based precipitation products; and (5) production and evaluation of continuous and multi-categorical statistical scores for long time series and case studies. The statistical scores are evaluated on the satellite product's native grid. With the advent of the GPM era starting in March 2014, more global precipitation products have become available. The validation methodology developed in H-SAF can easily be applied to different precipitation products. In this work, we have validated instantaneous precipitation data estimated from the DPR (Dual-frequency Precipitation Radar) instrument onboard the GPM-CO (Global Precipitation Measurement Core Observatory) satellite. In particular, we have analyzed the near-surface and estimated precipitation fields collected in the 2A Level for 3 different scans (NS, MS and HS). The Italian radar mosaic managed by the National Department of Civil Protection, available operationally every 10 minutes, is used as ground reference data. The results obtained highlight the capability of the DPR to properly identify precipitation areas, with higher accuracy in estimating stratiform precipitation (especially for the HS). An
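
    To make the multi-categorical scoring step concrete, a minimal sketch of typical dichotomous verification scores (probability of detection, false alarm ratio, critical success index) over matched satellite and ground rain-rate grids is shown below; the 0.1 mm/h rain/no-rain threshold and the array names are assumptions, not details of the H-SAF Unique Common Code.

```python
import numpy as np

def categorical_scores(sat_rr, ground_rr, threshold=0.1):
    """Dichotomous verification scores for matched satellite and ground rain-rate grids.

    sat_rr, ground_rr : arrays of rain rate (mm/h) on the same (satellite-native) grid.
    threshold         : rain / no-rain threshold in mm/h (assumed value).
    """
    sat_rain = sat_rr >= threshold
    gnd_rain = ground_rr >= threshold
    hits = np.sum(sat_rain & gnd_rain)
    false_alarms = np.sum(sat_rain & ~gnd_rain)
    misses = np.sum(~sat_rain & gnd_rain)
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# Illustrative box of matched rain-rate estimates (mm/h)
sat = np.array([[0.0, 1.2], [3.4, 0.0]])
gnd = np.array([[0.0, 0.8], [2.9, 0.4]])
print(categorical_scores(sat, gnd))
```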

  1. The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.

    Science.gov (United States)

    Greer, John T.

    Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…

  2. Comparison of two bond strength testing methodologies for bilayered all-ceramics

    NARCIS (Netherlands)

    Dundar, Mine; Ozcan, Mutlu; Gokce, Bulent; Comlekoglu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    Objectives. This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Methods. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite to

  3. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    Science.gov (United States)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
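
    The Weibull size effect mentioned in this abstract follows from weakest-link statistics: at a given stress, the failure probability grows with the stressed volume or area. A minimal illustration, with assumed two-parameter Weibull values (not taken from the report), of how failure probability and characteristic strength scale with specimen size:

```python
import math

# Assumed two-parameter Weibull description of brittle (weakest-link) strength.
m = 10.0            # Weibull modulus (assumed)
sigma0 = 1.5e9      # characteristic strength of the reference area A0, Pa (assumed)
A0 = 1.0e-8         # reference stressed area, m^2 (assumed)

def failure_probability(sigma, area):
    """P_f for a uniformly stressed area under weakest-link (Weibull) statistics."""
    return 1.0 - math.exp(-(area / A0) * (sigma / sigma0) ** m)

def characteristic_strength(area):
    """Stress at which P_f = 1 - 1/e; larger specimens are weaker on average."""
    return sigma0 * (A0 / area) ** (1.0 / m)

print(failure_probability(1.2e9, 4e-8))       # larger (e.g. bulge-test) specimen
print(characteristic_strength(4e-8) / 1e9)    # GPa, lower than sigma0 for area > A0
```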

  4. USING STRESS TESTING METHODOLOGY IN EVALUATING BANKING INSTITUTION’S EXPOSURE TO RISK

    OpenAIRE

    Ioan TRENCA; Simona MUTU; Maria-Miruna POCHEA

    2010-01-01

    In order to correctly estimate the unpredictable effects on their transaction portfolios, the banks developed stress testing methods which turned out to be a very important tool in the bank supervision process. Moreover, the supervision authorities started using stress-testing methods for evaluating systemic risk and for determining the adequacy degree of capital in the banking sector. Taking into account the importance of these simulations, the present paper presents methodologies with which...

  5. Range Safety Real-time System for Satellite Launch Vehicle Missions–Testing Methodologies

    Directory of Open Access Journals (Sweden)

    R. Varaprasad

    2006-11-01

A real-time system plays a critical role in range safety decision-making in a satellite launch mission. Real-time software, the heart of such systems, is becoming an issue of criticality. Emphasis is being laid on the development of a reliable, robust, and operational system. This paper delineates the prudent testing methodologies implemented to test the real-time system.

  6. Methodology to identify risk-significant components for inservice inspection and testing

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.; Kido, C.; Phillips, J.H.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues.

  7. Smartphone-enabled pulse rate variability: an alternative methodology for the collection of heart rate variability in psychophysiological research.

    Science.gov (United States)

    Heathers, James A J

    2013-09-01

    Heart rate variability (HRV) is widely used to assess autonomic nervous system (ANS) function. It is traditionally collected from a dedicated laboratory electrocardiograph (ECG). This presents a barrier to collecting the large samples necessary to maintain the statistical power of between-subject psychophysiological comparisons. An alternative to ECG involves an optical pulse sensor or photoplethysmograph run from a smartphone or similar portable device: smartphone pulse rate variability (SPRV). Experiment 1 determined the simultaneous accuracy between ECG and SPRV systems in n = 10 participants at rest. Raw SPRV values showed a consistent positive bias, which was successfully attenuated with correction. Experiment 2 tested an additional n = 10 participants at rest, during attentional load, and during mild stress (exercise). Accuracy was maintained, but slightly attenuated during exercise. The best correction method maintained an accuracy of +/-2% for low-frequency spectral power, and +/-5% for high-frequency spectral power over all points. Thus, the SPRV system records a pulse-to-pulse approximation of an ECG-derived heart rate series that is sufficiently accurate to perform time- and frequency-domain analysis of its variability, as well as accurately reflecting change in autonomic output provided by typical psychophysiological stimuli. This represents a novel method by which an accurate approximation of HRV may be collected for large-sample or naturalistic cardiac psychophysiological research.
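
    A minimal sketch of the frequency-domain step mentioned in this abstract: computing low-frequency (LF) and high-frequency (HF) spectral power from a pulse-to-pulse (or R-to-R) interval series. The resampling rate, band limits and use of Welch's method are conventional HRV choices, not details taken from the SPRV paper, and the example series is synthetic.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf_power(rr_ms, fs=4.0):
    """LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) power from RR/pulse intervals in ms."""
    rr_s = np.asarray(rr_ms) / 1000.0
    t = np.cumsum(rr_s)                                   # beat times (s)
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_ms, kind="cubic")(t_even)    # evenly resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    return lf, hf   # ms^2

# Illustrative series: 0.25 Hz (respiratory) modulation around an 800 ms mean interval
beats = 800 + 40 * np.sin(2 * np.pi * 0.25 * np.arange(300) * 0.8)
print(lf_hf_power(beats))
```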

  8. 49 CFR 1109.4 - Mandatory mediation in rate cases to be considered under the stand-alone cost methodology.

    Science.gov (United States)

    2010-10-01

    ... § 1109.4 Mandatory mediation in rate cases to be considered under the stand-alone cost methodology. (a) A..., the Board will assign a mediator to the case. Within 5 business days of the assignment to mediate, the... 49 Transportation 8 2010-10-01 2010-10-01 false Mandatory mediation in rate cases to be considered...

  9. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow

    Science.gov (United States)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.

    2016-11-01

    Hydro-abrasive resistance is an important property requirement for hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance. This can be achieved by performing accelerated and representative laboratory tests. In case of severe erosion induced by a penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to the ones generated at the penstock lower generatrix in actual flow conditions. Thirteen preselected coating solutions were first been tested during a 45 hours erosion test. A ranking of the thirteen coating solutions was then determined after characterisation. To complete this first evaluation and to determine the wear kinetic of the four best coating solutions, additional erosion tests were conducted with a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, some trial tests were carried out on penstock samples to check the application method of selected coating systems. The paper gives some perspectives related to erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  10. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    Science.gov (United States)

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

While studies have used acoustic sensors to determine fracture initiation time in biomechanical studies, a systematic procedure has not been established to process acoustic signals. The objective of the study was to develop a methodology to condition distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded (load cell), and fracture was documented using CT. Compression fracture occurred at L1 while the other vertebrae were intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels. Signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra while signals from other vertebrae were silent. The bursting time was associated with the time of fracture initiation. Force at fracture was determined using this time and the force-time data. The methodology is independent of selecting parameters a priori, such as fixing a voltage level(s), bandpass frequency and/or using the force-time signal, and allows determination of force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments.
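
    A minimal sketch of this signal-conditioning chain, assuming arrays `time`, `voltage` (one acoustic channel) and `force` (load cell) sampled at `fs`; the band edges and burst threshold are placeholders for the values that would be chosen from the FFT of the actual traces, not the authors' calibrated settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def fracture_force(time, voltage, force, fs, band=(1.0e5, 3.0e5), k=8.0):
    """Bandpass the acoustic signal, detect the first burst, and read force at that time.

    band : (low, high) Hz chosen from the FFT of the raw trace (placeholder values).
    k    : burst threshold as a multiple of the pre-impact envelope level (assumed).
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, voltage)
    envelope = np.abs(hilbert(filtered))
    baseline = envelope[: int(0.001 * fs)].mean()        # first 1 ms taken as quiet baseline
    burst_idx = np.argmax(envelope > k * baseline)       # first sample exceeding threshold
    return time[burst_idx], force[burst_idx]

# Usage (with hypothetical arrays):
# t_frac, f_frac = fracture_force(time, voltage, force, fs=1e6)
```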

  11. 40 CFR 280.104 - Local government bond rating test.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Local government bond rating test. 280... STORAGE TANKS (UST) Financial Responsibility § 280.104 Local government bond rating test. (a) A general purpose local government owner or operator and/or local government serving as a guarantor may satisfy...

  12. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S. [CISE SpA, Milan (Italy); Crudeli, R. [ENEL SpA, Milan (Italy)

    1998-12-31

CISE and ENEL have been involved for years in a creep test miniaturization project aimed at obtaining a quasi non-destructive test with the same reliability as the standard creep test. The goal can be reached with the 'small punch creep test', which combines all the required characteristics: quasi non-destructive disk specimens extracted from either the external or the internal side of components, then accurately machined and tested on a small and inexpensive apparatus. CISE has developed a complete small punch creep procedure that involves a dedicated test facility and correlation laws comparable with the more widely used isostress methodology for residual life evaluation of ex-service high-temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance management. Further work is needed to validate the small punch methodology and to calibrate the relationship for the most widely used high-temperature structural materials. First results of a comparative study on ASTM A355 P12 ex-service pipe material are presented, together with a description of the small punch apparatus built at CISE. (orig.) 6 refs.

  13. Test Methodology to Evaluate the Safety of Materials Using Spark Incendivity

    Science.gov (United States)

    Buhler, Charles; Calle, Carlos; Clements, Sid; Ritz, Mindy; Starnes, Jeff

    2007-01-01

    For many years scientists and engineers have been searching for the proper test method to evaluate an electrostatic risk for materials used in hazardous environments. A new test standard created by the International Electrotechnical Commission is a promising addition to conventional test methods used throughout industry. The purpose of this paper is to incorporate this test into a proposed new methodology for the evaluation of materials exposed to flammable environments. However, initial testing using this new standard has uncovered some unconventional behavior in materials that conventional test methods were thought to have reconciled. For example some materials tested at higher humidities were more susceptible to incendive discharges than at lower humidity even though the surface resistivity was lower.

  14. An Application of the Coda Methodology for Moment-Rate Spectra Using Broadband Stations in Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Eken, T; Mayeda, K; Hofstetter, A; Gok, R; Orgulu, G; Turkelli, N

    2004-02-03

    A recently developed coda magnitude methodology was applied to selected broadband stations in Turkey for the purpose of testing the coda method in a large, laterally complex region. As found in other, albeit smaller regions, coda envelope amplitude measurements are significantly less variable than distance-corrected direct wave measurements (i.e., L{sub g} and surface waves) by roughly a factor 3-to-4. Despite strong lateral crustal heterogeneity in Turkey, we found that the region could be adequately modeled assuming a simple 1-D, radially symmetric path correction for 10 narrow frequency bands ranging between 0.02 to 2.0 Hz. For higher frequencies however, 2-D path corrections will be necessary and will be the subject of a future study. After calibrating the stations ISP, ISKB, and MALT for local and regional distances, single-station moment-magnitude estimates (M{sub w}) derived from the coda spectra were in excellent agreement with those determined from multi-station waveform modeling inversions of long-period data, exhibiting a data standard deviation of 0.17. Though the calibration was validated using large events, the results of the calibration will extend M{sub w} estimates to significantly smaller events which could not otherwise be waveform modeled due to poor signal-to-noise ratio at long periods and sparse station coverage. The successful application of the method is remarkable considering the significant lateral complexity in Turkey and the simple assumptions used in the coda method.

  15. An Application of the Coda Methodology for Moment-Rate Spectra Using Broadband Stations in Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Eken Tuna, Kevin Mayeda, Abraham Hofstetter, Rengin Gok, Gonca Orgulu, Niyazi Turkelli

    2004-07-11

    A recently developed coda magnitude methodology was applied to selected broadband stations in Turkey for the purpose of testing the coda method in a large, laterally complex region. As found in other, albeit smaller regions, coda envelope amplitude measurements are significantly less variable than distance-corrected direct wave measurements (i.e., L{sub g} and surface waves) by roughly a factor 3-to-4. Despite strong lateral crustal heterogeneity in Turkey, they found that the region could be adequately modeled assuming a simple 1-D, radially symmetric path correction. After calibrating the stations ISP, ISKB and MALT for local and regional distances, single-station moment-magnitude estimates (M{sub W}) derived from the coda spectra were in excellent agreement with those determined from multistation waveform modeling inversions, exhibiting a data standard deviation of 0.17. Though the calibration was validated using large events, the results of the calibration will extend M{sub W} estimates to significantly smaller events which could not otherwise be waveform modeled. The successful application of the method is remarkable considering the significant lateral complexity in Turkey and the simple assumptions used in the coda method.

  16. Material Removal Rate, Electrode Wear Rate, and Surface Roughness Evaluation in Die Sinking EDM with Hollow Tool through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Teepu Sultan

    2014-01-01

Electrical discharge machining is one of the earliest nontraditional machining processes, extensively used in industry for processing parts having unusual profiles with reasonable precision. In the present work, an attempt has been made to model material removal rate, electrode wear rate, and surface roughness through response surface methodology in a die-sinking EDM process. The optimization was performed in two steps, using one factor at a time for preliminary evaluation and a Box-Behnken design involving three variables at three levels for determination of the critical experimental conditions. Pulse on time, pulse off time, and peak current were varied during the tests, while a copper electrode with a tubular cross section was employed to machine through holes in an EN 353 steel alloy workpiece. The results of analysis of variance indicated that the proposed mathematical models can adequately describe the performances within the limits of the factors being studied. The experimental and predicted values were in good agreement. Surface topography is revealed with the help of scanning electron microscope micrographs.
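
    A minimal sketch of the response-surface step: fitting a full second-order model for MRR in the three coded factors (pulse on time, pulse off time, peak current) from a Box-Behnken-style design. The run layout is the standard three-factor Box-Behnken pattern, but the response values are placeholders; the authors' actual runs and coefficients are not reproduced here.

```python
import numpy as np

# Coded factor levels (-1, 0, +1) for Ton, Toff, Ip in a 3-factor Box-Behnken layout,
# with a placeholder response column (MRR); real runs would come from the experiments.
X_raw = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                  [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                  [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                  [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
mrr = np.array([2.1, 3.4, 1.9, 3.0, 1.8, 2.9, 2.6, 3.8,
                2.0, 1.7, 3.1, 2.8, 2.5, 2.4, 2.6])   # illustrative values only

def quadratic_design_matrix(X):
    """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3 (full second-order model)."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X_raw), mrr, rcond=None)
predicted = quadratic_design_matrix(X_raw) @ coeffs
r2 = 1 - np.sum((mrr - predicted) ** 2) / np.sum((mrr - mrr.mean()) ** 2)
print(coeffs.round(3), round(r2, 3))
```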

  17. Prediction of material removal rate and surface roughness for wire electrical discharge machining of nickel using response surface methodology

    Directory of Open Access Journals (Sweden)

    Thangam Chinnadurai

    2016-12-01

This study investigates the effects of the process parameters, namely peak current (Ip), pulse on time (Ton), pulse off time (Toff), water pressure (Wp), wire feed rate (Wf), wire tension (Wt), servo voltage (Sv) and servo feed setting (Sfs), on the material removal rate (MRR) and surface roughness (SR) for wire electrical discharge machining (wire-EDM) of nickel using the Taguchi method. Response surface methodology (RSM) is adopted to evolve mathematical relationships between the wire-cutting process parameters and the output responses, in order to determine the input parameters that lead to the desired optimal cutting quality. Besides, using response surface plots, the interaction effects of the process parameters on the responses are analyzed and discussed. The statistical software Minitab is used to establish the design and to obtain the regression equations. The developed mathematical models are tested by the analysis-of-variance (ANOVA) method to check their appropriateness and suitability. Finally, a comparison is made between measured and calculated results, which are in good agreement. This indicates that the developed models can predict the responses accurately and precisely within the limits of the cutting parameters being used.

  18. Metodologia de rating em cooperativas agropecuárias: um estudo de caso Rating methodology in agricultural cooperatives: a case study

    Directory of Open Access Journals (Sweden)

    Davi Rogério de Moura Costa

    2009-12-01

…information asymmetry between cooperatives and financial market agents. The study therefore develops a rating methodology for agricultural cooperatives in Brazil, in order to reduce the moral hazard and adverse selection that create business and contractual inefficiency between cooperatives and financial market agents. The 'five C' method was used, which considers organizational financial capacity, corporate governance characteristics, relations with members, and commodity market characteristics, among others. A weight and an evaluation were associated with each variable. The result was applied to an agricultural cooperative case study, which concluded that the methodology is applicable and that the results, evaluations and weights could be discussed in cooperative rating committees. As a final consideration, an agenda is discussed for new applications of this methodology, for testing it in other agricultural cooperative organizations and for consolidating it as a financial market information signal.
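
    A minimal sketch of the weighted-score idea described above ('five C' categories, each with a weight and an evaluation). The category names follow the abstract, but the weights, scores and rating cut-offs below are purely illustrative, not the study's calibrated values.

```python
# Illustrative weighted rating aggregation; weights, scores and cut-offs are assumptions.
weights = {                       # must sum to 1.0
    "financial_capacity": 0.30,
    "corporate_governance": 0.25,
    "member_relations": 0.20,
    "commodity_market": 0.15,
    "other": 0.10,
}
scores = {                        # evaluation of one cooperative on a 0-10 scale (hypothetical)
    "financial_capacity": 7.5,
    "corporate_governance": 6.0,
    "member_relations": 8.0,
    "commodity_market": 5.5,
    "other": 6.5,
}

composite = sum(weights[c] * scores[c] for c in weights)
rating = "A" if composite >= 8 else "B" if composite >= 6 else "C"
print(round(composite, 2), rating)
```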

  19. Framed bit error rate testing for 100G ethernet equipment

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert

    2010-01-01

Internet users' behavioural patterns are migrating towards bandwidth-intensive applications, which require a corresponding capacity extension. The emerging 100 Gigabit Ethernet (GE) technology is a promising candidate for providing a ten-fold increase of today's available Internet transmission rate. As the need for 100 Gigabit Ethernet equipment rises, so does the need for equipment which can properly test these systems during development, deployment and use. This paper presents early results from a work-in-progress academia-industry collaboration project and elaborates on the challenges of performing bit error rate testing at 100 Gbps. In particular, we show how bit error rate testing (BERT) can be performed over an aggregated 100G Attachment Unit Interface (CAUI) by encapsulating the test data in Ethernet frames at line speed. Our results show that framed bit error rate testing can...
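
    The framed BER idea reduces to carrying a known test pattern as Ethernet frame payloads and counting flipped payload bits across received frames. A minimal sketch of that bookkeeping is shown below; the payload generator and the single-bit-error channel are simplified stand-ins, not the project's actual 100G/CAUI implementation.

```python
import random

def prbs_payload(n_bytes, seed=0xACE1):
    """Simplified pseudo-random payload generator (stand-in for a real PRBS)."""
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n_bytes))

def bit_errors(sent: bytes, received: bytes) -> int:
    """Count differing bits between two equal-length payloads."""
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

# Hypothetical test: 1000 frames of 1500-byte payload; the channel flips a few bits.
frames_sent = [prbs_payload(1500, seed=i) for i in range(1000)]
frames_rcvd = []
for f in frames_sent:
    corrupted = bytearray(f)
    if random.random() < 0.01:                       # 1% of frames get a single bit error
        idx = random.randrange(len(corrupted))
        corrupted[idx] ^= 1 << random.randrange(8)
    frames_rcvd.append(bytes(corrupted))

errors = sum(bit_errors(s, r) for s, r in zip(frames_sent, frames_rcvd))
total_bits = sum(len(f) * 8 for f in frames_sent)
print(f"BER = {errors / total_bits:.2e}")
```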

  20. Identification of neutron irradiation induced strain rate sensitivity change using inverse FEM analysis of Charpy test

    Science.gov (United States)

    Haušild, Petr; Materna, Aleš; Kytka, Miloš

    2015-04-01

A simple methodology for obtaining additional information about the mechanical behaviour of neutron-irradiated WWER 440 reactor pressure vessel steel was developed. Using inverse identification, the instrumented Charpy test data records were compared with finite element computations in order to estimate the strain rate sensitivity of 15Ch2MFA steel irradiated with different neutron fluences. The results are interpreted in terms of the activation volume change.

  1. Assessment of fetal maturation age by heart rate variability measures using random forest methodology.

    Science.gov (United States)

    Tetschke, F; Schneider, U; Schleussner, E; Witte, O W; Hoyer, D

    2016-03-01

Fetal maturation age assessment based on heart rate variability (HRV) is a predestined tool in prenatal diagnosis. To date, almost linear maturation characteristic curves are used in univariate and multivariate models. Models using complex multivariate maturation characteristic curves are pending. To address this problem, we use Random Forest (RF) to assess fetal maturation age and compare RF with linear, multivariate age regression. We include previously developed HRV indices such as traditional time- and frequency-domain indices and complexity indices of multiple scales. We found that fetal maturation was best assessed by complexity indices of short scales and skewness in state-dependent datasets (quiet sleep, active sleep) as well as in state-independent recordings. Additionally, increasing fluctuation amplitude contributed to the model in the active sleep state. None of the traditional linear HRV parameters contributed to the RF models. Compared to linear, multivariate regression, the mean prediction of gestational age (GA) is more accurate with RF (quiet state: R(2)=0.617 vs. R(2)=0.461; active state: R(2)=0.521 vs. R(2)=0.436; state independent: R(2)=0.583 vs. R(2)=0.548). We conclude that classification and regression tree models such as the RF methodology are appropriate for the evaluation of fetal maturation age. The decisive role of adjustments between different time scales of complexity may essentially extend previous analysis concepts mainly based on rhythms and univariate complexity indices. These system characteristics may have implications for a better understanding and accessibility of the maturing complex autonomic control and its disturbance.
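
    A minimal sketch of the model comparison described above: Random Forest regression versus multivariate linear regression of gestational age on HRV indices. The feature names and data are synthetic placeholders; only the comparison structure follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Placeholder design matrix: rows = fetal recordings, columns = HRV indices
# (e.g. short-scale complexity, skewness, fluctuation amplitude); y = gestational age (weeks).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))
y = 24 + 3 * X[:, 0] + 2 * X[:, 1] ** 2 + rng.normal(scale=1.0, size=120)  # synthetic, nonlinear

rf = RandomForestRegressor(n_estimators=300, random_state=0)
lin = LinearRegression()

r2_rf = cross_val_score(rf, X, y, cv=5, scoring="r2").mean()
r2_lin = cross_val_score(lin, X, y, cv=5, scoring="r2").mean()
print(f"Random Forest R^2 = {r2_rf:.3f}, linear regression R^2 = {r2_lin:.3f}")
```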

  2. Development of a methodology for selecting office building upgrading solutions based on a test survey in European buildings

    Energy Technology Data Exchange (ETDEWEB)

    Wittchen, K.B.; Brandt, E. [Danish Building and Urban Research, Horsholm (Denmark)

    2001-07-01

    The potential for using the TOBUS methodology to select office building upgrading solutions have been investigated during field tests in 15 European office buildings in 5 European countries. The 15 office buildings represent a variety of building traditions, architectural designs, construction periods and energy and indoor performance. The buildings were audited following the TOBUS methodology developed within the project. The results of the test surveys were primarily used to improve the TOBUS methodology and secondly to suggest general upgrading solutions and energy retrofit measures for the surveyed buildings. This paper describes the development of the TOBUS methodology based on the 15 test surveys. (author)

  3. Methodologies for Combined Loads Tests Using a Multi-Actuator Test Machine

    Science.gov (United States)

    Rouse, Marshall

    2013-01-01

The NASA Langley COmbined Loads Test System (COLTS) Facility was designed to accommodate a range of fuselage structures and wing sections and subject them to both quasi-static and cyclic loading conditions. Structural tests have been conducted in COLTS that address structural integrity issues of metallic and fiber-reinforced composite aerospace structures in support of NASA programs (i.e., the Aircraft Structural Integrity Program (ASIP), the High-Speed Research Program and the Supersonic Project, the NASA Engineering and Safety Center (NESC) Composite Crew Module Project, and the Environmentally Responsible Aviation Program). This paper presents experimental results for curved panels subjected to mechanical and internal pressure loads using a D-box test fixture. Also, results are presented that describe the use of a checkout beam for the development of testing procedures for a combined mechanical and pressure loading test of a multi-bay box. The multi-bay box test will be used to experimentally verify the structural performance of the multi-bay box in support of the Environmentally Responsible Aviation Project at NASA Langley.

  4. Strength of wood versus rate of testing - A theoretical approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    2007-01-01

Strength of wood is normally measured in ramp load experiments. Experience shows that strength increases with increasing rate of testing. This feature is considered theoretically in this paper. It is shown that the influence of testing rate is a phenomenon which depends on the quality of the considered wood. Low-quality wood shows less influence of testing rate. This observation agrees with the well-known statement made by Borg Madsen that weak wood subjected to a constant load has a longer lifetime than strong wood. In general, the influence of testing rate on strength increases with increasing moisture content. This phenomenon applies irrespective of the considered wood quality, such that the above-mentioned order-of-magnitude observations between low- and high-quality wood are kept.

  5. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Science.gov (United States)

    2010-01-01

    ... PROD Mortgage Product Type A0 Age immediately prior to start of Stress Test, in months (weighted... Variable Description Mortgage Product Type A0 Age immediately prior to start of Stress Test, in months... Enterprise projections Principal Payment Window Starting Date, Up-Rate Scenario The month in the Stress Test...

  6. Increasing the competitiveness of maintenance contract rates by using an alternative methodology for the calculation of average vehicle maintenance costs

    Directory of Open Access Journals (Sweden)

    Stephen Carstens

    2008-11-01

Companies tend to outsource transport to fleet management companies to increase efficiencies if transport is a non-core activity. The provision of fleet management services on contract introduces a certain amount of financial risk to the fleet management company, specifically for fixed-rate maintenance contracts. The quoted rate needs to be sufficient and also competitive in the market. Currently, the quoted maintenance rates are based on the maintenance specifications of the manufacturer and the risk management approach of the fleet management company. This is usually reflected in a contingency that is included in the quoted maintenance rate. An alternative methodology for calculating the average maintenance cost for a vehicle fleet is proposed, based on the actual maintenance expenditures of the vehicles and accepted statistical techniques. The proposed methodology results in accurate estimates (and associated confidence limits) of the true average maintenance cost and can be used as a basis for the maintenance quote.

  7. High strain rate compression testing of glass fibre reinforced polypropylene

    Directory of Open Access Journals (Sweden)

    Cloete T.J.

    2012-08-01

This paper details an investigation of the high strain rate compression testing of GFPP with the split Hopkinson pressure bar (SHPB) in the through-thickness and in-plane directions. GFPP posed challenges to SHPB testing as it fails at relatively high stresses while having relatively low moduli and hence low mechanical impedance. The modifications to specimen geometry and incident pulse shaping required to gather valid test results, in which specimen equilibrium was achieved for SHPB tests on GFPP, are presented. In addition to conventional SHPB tests to failure, SHPB experiments were designed to achieve specimen equilibration at small strains, which permitted the capture of high strain rate elastic modulus data. The strain rate dependency of GFPP's failure strengths in the in-plane and through-thickness directions is modelled using a logarithmic law.
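
    For context, the conventional one-wave SHPB data reduction (not spelled out in the abstract) computes specimen stress, strain rate and strain from the bar strain signals. A minimal sketch under the usual assumptions of specimen stress equilibrium and elastic bars; the numbers in the usage comment are hypothetical.

```python
import numpy as np

def shpb_one_wave(eps_t, eps_r, dt, E_bar, A_bar, A_spec, L_spec, c0):
    """Classical one-wave SHPB reduction (assumes specimen stress equilibrium).

    eps_t : transmitted-bar strain history
    eps_r : reflected-bar strain history
    dt    : sample interval (s); E_bar, A_bar : bar modulus (Pa) and cross-section (m^2)
    A_spec, L_spec : specimen cross-section (m^2) and length (m); c0 : bar wave speed (m/s)
    """
    stress = E_bar * (A_bar / A_spec) * np.asarray(eps_t)      # specimen stress
    strain_rate = -2.0 * c0 / L_spec * np.asarray(eps_r)       # specimen strain rate
    strain = np.cumsum(strain_rate) * dt                       # integrated specimen strain
    return stress, strain_rate, strain

# Usage with hypothetical gauge records:
# sigma, d_eps, eps = shpb_one_wave(eps_t, eps_r, dt=1e-7, E_bar=200e9,
#                                   A_bar=3.14e-4, A_spec=7.85e-5, L_spec=5e-3, c0=5000.0)
```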

  8. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Korsah, K.; Turner, G.W.; Mullens, J.A. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  9. Heart rate-based lactate minimum test: a reproducible method.

    NARCIS (Netherlands)

    Strupler, M.; Muller, G.; Perret, C.

    2009-01-01

    OBJECTIVE: To find the individual intensity for aerobic endurance training, the lactate minimum test (LMT) seems to be a promising method. LMTs described in the literature consist of speed or work rate-based protocols, but for training prescription in daily practice mostly heart rate is used. The

  11. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS

    Science.gov (United States)

    Jadaan, Osama M.

    2003-01-01

This effort is to investigate probabilistic life prediction methodologies for MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. This includes completion of a literature survey regarding the Weibull size effect in MEMS and strength testing techniques. Also of interest is the design of a proper test for the Weibull size effect in tensile specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. Another potential item of interest is analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. Along these lines, work may also be performed on transient fatigue life prediction methodologies.

  12. Fighter pilots' heart rate, heart rate variation and performance during an instrument flight rules proficiency test.

    Science.gov (United States)

    Mansikka, Heikki; Virtanen, Kai; Harris, Don; Simola, Petteri

    2016-09-01

    Increased task demand will increase the pilot mental workload (PMWL). When PMWL is increased, mental overload may occur resulting in degraded performance. During pilots' instrument flight rules (IFR) proficiency test, PMWL is typically not measured. Therefore, little is known about workload during the proficiency test and pilots' potential to cope with higher task demands than those experienced during the test. In this study, fighter pilots' performance and PMWL was measured during a real IFR proficiency test in an F/A-18 simulator. PMWL was measured using heart rate (HR) and heart rate variation (HRV). Performance was rated using Finnish Air Force's official rating scales. Results indicated that HR and HRV differentiate varying task demands in situations where variations in performance are insignificant. It was concluded that during a proficiency test, PMWL should be measured together with the task performance measurement.

  13. A methodology to quantify the differences between alternative methods of heart rate variability measurement.

    Science.gov (United States)

    García-González, M A; Fernández-Chimeno, M; Guede-Fernández, F; Ferrer-Mileo, V; Argelagós-Palau, A; Álvarez-Gómez, L; Parrado, E; Moreno, J; Capdevila, L; Ramos-Castro, J

    2016-01-01

    This work proposes a systematic procedure to report the differences between heart rate variability time series obtained from alternative measurements reporting the spread and mean of the differences as well as the agreement between measuring procedures and quantifying how stationary, random and normal the differences between alternative measurements are. A description of the complete automatic procedure to obtain a differences time series (DTS) from two alternative methods, a proposal of a battery of statistical tests, and a set of statistical indicators to better describe the differences in RR interval estimation are also provided. Results show that the spread and agreement depend on the choice of alternative measurements and that the DTS cannot be considered generally as a white or as a normally distributed process. Nevertheless, in controlled measurements the DTS can be considered as a stationary process.
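
    A minimal sketch of the kind of summary such a procedure reports for a differences time series (DTS): mean and spread of the differences, Bland-Altman-style limits of agreement, and simple normality and stationarity checks. The specific test battery and indicators of the paper are not reproduced; the routines used here are generic SciPy/statsmodels functions, and the data are synthetic.

```python
import numpy as np
from scipy.stats import shapiro
from statsmodels.tsa.stattools import adfuller

def summarize_dts(rr_a_ms, rr_b_ms):
    """Summarize the differences between two aligned RR-interval series (ms)."""
    d = np.asarray(rr_a_ms, float) - np.asarray(rr_b_ms, float)   # differences time series
    mean, sd = d.mean(), d.std(ddof=1)
    loa = (mean - 1.96 * sd, mean + 1.96 * sd)                    # limits of agreement
    normal_p = shapiro(d).pvalue                                  # normality of the differences
    stationary_p = adfuller(d)[1]                                 # ADF: low p suggests stationarity
    return {"mean": mean, "sd": sd, "loa": loa,
            "shapiro_p": normal_p, "adf_p": stationary_p}

# Hypothetical aligned measurements from two devices
rng = np.random.default_rng(1)
rr_ecg = 800 + rng.normal(0, 30, 500)
rr_alt = rr_ecg + rng.normal(2, 5, 500)      # small bias and jitter in the alternative device
print(summarize_dts(rr_alt, rr_ecg))
```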

  14. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris; MacAlpine, Sara; Marion, Bill; Toor, Fatima; Asgharzadeh, Amir; Stein, Joshua S.

    2016-11-21

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.
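
    One common way to turn the quoted front/rear reference irradiances into a single figure is an equivalent-irradiance (bifacial gain) calculation. The sketch below uses an assumed bifaciality factor and hypothetical module power, and is not the procedure of the draft standard discussed in the paper.

```python
# Illustrative bifacial rating arithmetic; module numbers and bifaciality factor are assumed.
G_FRONT = 1000.0          # front reference irradiance, W/m^2 (from the reference condition)
G_REAR = 135.0            # rear reference irradiance, W/m^2 (mid-point of the 130-140 range)

p_front_stc = 300.0       # front-side power at 1000 W/m^2, W (hypothetical module)
bifaciality = 0.9         # rear-to-front efficiency ratio (assumed)

# Equivalent single-sided irradiance and a simple linear power estimate
g_equivalent = G_FRONT + bifaciality * G_REAR
p_bifacial = p_front_stc * g_equivalent / G_FRONT
print(f"equivalent irradiance = {g_equivalent:.0f} W/m^2, rated power ~ {p_bifacial:.0f} W")
```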

  15. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris; MacAlpine, Sara; Marion, Bill; Toor, Fatima; Asgharzadeh, Amir; Stein, Joshua S.

    2016-06-16

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.

  16. 78 FR 78275 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014

    Science.gov (United States)

    2013-12-26

    ... Federal Railroad Administration 49 CFR Part 219 Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014 AGENCY: Federal Railroad Administration (FRA), DOT. ACTION: Notice of determination... therefore determined that the minimum annual random drug testing rate for the period January 1, 2014...

  17. 75 FR 79308 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011

    Science.gov (United States)

    2010-12-20

    ... Federal Railroad Administration 49 CFR Part 219 Alcohol and Drug Testing: Determination of Minimum Random... rail industry random testing positive rates were .037 percent for drugs and .014 percent for alcohol. Because the industry-wide random drug testing positive rate has remained below 1.0 percent for the last...

  18. The effect of instructional methodology on high school students natural sciences standardized tests scores

    Science.gov (United States)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction in order to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine differences in science achievement by ethnicity, gender, and socioeconomic status for each teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and gives students a deeper understanding of science content, in line with the school's mission and goals.
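
    A minimal sketch of the ANCOVA step described above, using statsmodels; the column names (posttest, pretest, method) and the synthetic data are hypothetical, and the model is only meant to show the general form of the comparison.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic example: posttest scores by teaching method, adjusted for a pretest covariate.
rng = np.random.default_rng(1)
n = 120
method = rng.choice(["didactic", "inquiry"], size=n)
pretest = rng.normal(60, 10, size=n)
posttest = pretest + np.where(method == "inquiry", 5.0, 0.0) + rng.normal(0, 8, size=n)
df = pd.DataFrame({"posttest": posttest, "pretest": pretest, "method": method})

# ANCOVA: compare posttest scores by teaching method while controlling for pretest.
model = smf.ols("posttest ~ pretest + C(method)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```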

  19. Acceptance test of an activity meter to be used as reference in a calibration methodology establishment

    Energy Technology Data Exchange (ETDEWEB)

    Correa, Eduardo L.; Kuahara, Lilian T.; Potiens, Maria da Penha A., E-mail: educorrea1905@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2014-07-01

    Nuclear medicine is an area of medical physics in which radiopharmaceuticals are used in diagnostic procedures. These radioactive elements are administered to the patient, the emitted radiation is detected by scanning equipment connected to computer software, and the image is reconstructed. In order to operate, a nuclear medicine service must have calibrated radiation detectors. However, no activity meter calibration methodology exists in Brazil, which leads to large measurement uncertainties. The goal of this study is to present the acceptance test results of an activity meter to be used as a reference in establishing a new calibration methodology. A Capintec activity meter, model CRC-25R, was checked using three control sources (¹³⁷Cs, ⁵⁷Co, ¹³³Ba). The tests were based on the CNEN-NN 3.05 standard, the manufacturer's manual, TRS-454 and TECDOC 602, and included physical inspection, chamber voltage, zero adjustment, background response, data check and repeatability. The linearity and geometry tests could not be made, because the laboratory where the activity meter is located is not authorized to receive unsealed radioactive sources. The equipment presented good behavior. All the results are within the ranges given by national and international standards, and the equipment is now being used in the laboratory and periodically passes the quality control tests. (author)

  20. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.
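
    As a schematic illustration of testing for variation in extinction probabilities, the sketch below arranges per-interval counts of extinctions and survivals into a contingency table and applies a chi-square homogeneity test; the counts are invented and the approach is a simplification of the stratigraphic-range method described above.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts per geologic interval: taxa that went extinct vs. taxa that survived.
extinct = np.array([12, 8, 30, 9])
survived = np.array([88, 92, 70, 91])

# Per-interval extinction probability estimates.
p_hat = extinct / (extinct + survived)
print("estimated extinction probabilities:", np.round(p_hat, 3))

# Chi-square test of the hypothesis that the extinction probability is constant over intervals.
chi2, p_value, dof, _ = chi2_contingency(np.vstack([extinct, survived]))
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```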

  1. MELT RATE FURNACE TESTING FOR SLUDGE BATCH 5 FRIT OPTIMIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D; Fox, K; Pickenheim, B; Stone, M

    2008-10-03

    Savannah River National Laboratory (SRNL) was requested to provide the Defense Waste Processing Facility (DWPF) with a frit composition for Sludge Batch 5 (SB5) to optimize processing. A series of experiments were designed for testing in the Melt Rate Furnace (MRF). This dry fed tool can be used to quickly determine relative melt rates for a large number of candidate frit compositions and lead to a selection for further testing. Simulated Sludge Receipt and Adjustment Tank (SRAT) product was made according to the most recent SB5 sludge projections and a series of tests were conducted with frits that covered a range of boron and alkali ratios. Several frits with relatively large projected operating windows indicated melt rates that would not severely impact production. As seen with previous MRF testing, increasing the boron concentration had positive impacts on melt rate in the SB5 system. However, there appear to be maximum values for both boron and sodium above which there is a negative effect on melt rate. Based on these data and compositional trends, Frit 418 and a specially designed frit (Frit 550) have been selected for additional melt rate testing. Frit 418 and Frit 550 will be run in the Slurry Fed Melt Rate Furnace (SMRF), which is capable of distinguishing rheological properties not detected by the MRF. Frit 418 will be used initially for SB5 processing in DWPF (given its robustness to compositional uncertainty). The Frit 418-SB5 system will provide a baseline from which potential melt rate advantages of Frit 550 can be gauged. The data from SMRF testing will be used to determine whether Frit 550 should be recommended for implementation in DWPF.

  2. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  3. Spacecraft Parachute Recovery System Testing from a Failure Rate Perspective

    Science.gov (United States)

    Stewart, Christine E.

    2013-01-01

    Spacecraft parachute recovery systems, especially those with a parachute cluster, require testing to identify and reduce failures. This is especially important when the spacecraft in question is human-rated. Due to the recent effort to make spaceflight affordable, the importance of determining a minimum requirement for testing has increased. The number of tests required to achieve a mature design, with a relatively constant failure rate, can be estimated from a review of previous complex spacecraft recovery systems. Examination of the Apollo parachute testing and the Shuttle Solid Rocket Booster recovery chute system operation will clarify at which point in those programs the system reached maturity. This examination will also clarify the risks inherent in not performing a sufficient number of tests prior to operation with humans on-board. When looking at complex parachute systems used in spaceflight landing systems, a pattern begins to emerge regarding the need for a minimum amount of testing required to wring out the failure modes and reduce the failure rate of the parachute system to an acceptable level for human spaceflight. Not only is a sufficient amount of system-level testing required, but also the ability to update the design as failure modes are found, in order to drive the failure rate of the system down to an acceptable level. In addition, sufficient data and images are necessary to identify incipient failure modes or to identify failure causes when a system failure occurs. In order to demonstrate the need for sufficient system-level testing before an acceptable failure rate is reached, the Apollo Earth Landing System (ELS) test program and the Shuttle Solid Rocket Booster Recovery System failure history will be examined, and some experiences with the Orion Capsule Parachute Assembly System will be noted.

  4. A methodology for use of digital image correlation for hot mix asphalt testing

    Science.gov (United States)

    Ramos, Estefany

    Digital Image Correlation (DIC) is a relatively new technology which aids in the measurement of material properties without the need for installation of sensors. DIC is a noncontact measuring technique that requires the specimen to be marked with a random speckled pattern and to be photographed during the test. The photographs are then post-processed based on the location of the pattern throughout the test. DIC can aid in calculating properties that would otherwise be too difficult to obtain even with other measuring instruments. The objective of this thesis is to discuss the methodology and validate the use of DIC in different hot mix asphalt (HMA) tests, such as the Overlay Tester (OT) test, the Indirect Tensile (IDT) test, and the Semicircular Bending (SCB) test. The DIC system provides displacements and strains on any visible surface. The properly calibrated 2-D or 3-D DIC data can be used to understand the complex stress and strain distributions and the modes of the initiation and propagation of cracks. The use of this observational method will lead to further understanding of the complex boundary conditions of the different tests, therefore allowing it to be implemented in the analysis of other materials. The use of digital image correlation will bring insight and knowledge into what is happening during a test.
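
    To illustrate how full-field DIC displacements translate into strains, the sketch below computes 2-D Green-Lagrange strain components from a gridded displacement field using numerical gradients; it is a generic post-processing step under assumed inputs, not the specific DIC software used in the thesis.

```python
import numpy as np

def green_lagrange_strain(u, v, dx, dy):
    """Green-Lagrange strain components from in-plane displacements u(x, y) and
    v(x, y) sampled on a regular grid with spacings dx and dy."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    exx = du_dx + 0.5 * (du_dx**2 + dv_dx**2)
    eyy = dv_dy + 0.5 * (du_dy**2 + dv_dy**2)
    exy = 0.5 * (du_dy + dv_dx) + 0.5 * (du_dx * du_dy + dv_dx * dv_dy)
    return exx, eyy, exy

# Synthetic example: a uniform 2% stretch in x with a small amount of shear.
y, x = np.mgrid[0:50, 0:50].astype(float)
u = 0.02 * x + 0.005 * y
v = np.zeros_like(x)
exx, eyy, exy = green_lagrange_strain(u, v, dx=1.0, dy=1.0)
print(exx.mean(), exy.mean())
```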

  5. A methodology for the analysis of a thermal-hydraulic phenomenon investigated in a test facility

    Energy Technology Data Exchange (ETDEWEB)

    D`Auria, F. [Dept. of Mechanical and Nuclear Constructions, Pisa Univ. (Italy); Faluomi, V. [Dept. of Mechanical and Nuclear Constructions, Pisa Univ. (Italy); Aksan, N. [Lab. for Thermal-Hydraulics, Paul Scherrer Inst., Villigen (Switzerland)

    1995-08-01

    A methodology for analysing non-homogeneous sets of experimental data for a selected phenomenon from separate effect test facilities and integral test facilities is presented in this paper. The critical heat flux from the validation matrices was chosen as the phenomenon to be studied; the results obtained in three test facilities are analysed. The method presented is applied for estimating the accuracy with which a thermalhydraulic transient code can predict the critical heat flux in an actual nuclear power plant. (orig.) [German abstract] The subject of this contribution is a method for analysing non-homogeneous data sets obtained in the experimental investigation of a given thermal-hydraulic phenomenon in separate-effect or integral test facilities. The phenomenon investigated here is the critical heat flux; experimental data from three test facilities are analysed. The method is used to estimate the accuracy with which a thermal-hydraulic transient code can predict the critical heat flux in a nuclear power plant. (orig.)

  6. Establishing a Ballistic Test Methodology for Documenting the Containment Capability of Small Gas Turbine Engine Compressors

    Science.gov (United States)

    Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.

    2009-01-01

    A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction and speed that would occur in a blade-out event. High-speed digital-video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full field displacement and strain information at the back side of the impact point.

  7. TESTING FOR MULTIPLE STRUCTURAL BREAKS: AN APPLICATION OF BAI-PERRON TEST TO THE NOMINAL INTEREST RATES AND INFLATION IN TURKEY

    Directory of Open Access Journals (Sweden)

    GÜLCAN ÖNEL

    2013-06-01

    Full Text Available This paper aims to test for multiple structural breaks in the nominal interest rate and the inflation rate using the methodology developed by Bai and Perron (1998). Monthly data on the Turkish 90-day time-deposit interest rate and the consumer price index inflation rate over the period 1980:1-2004:12 are used. The empirical results give little evidence of mean breaks in the interest rate series. However, the data on inflation rates are consistent with two breaks, located at 1987:9 and 2000:2.
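
    The Bai-Perron procedure itself is not reproduced here; as a rough stand-in, the sketch below uses the ruptures change-point package (assumed to be installed) to locate mean breaks in a simulated inflation-like series, which conveys the flavour of the analysis rather than the exact method of the paper.

```python
import numpy as np
import ruptures as rpt

# Simulated monthly series with two mean shifts (a stand-in for an inflation rate series).
rng = np.random.default_rng(2)
segment_means = [2.0, 5.0, 1.0]
signal = np.concatenate([m + rng.normal(0, 0.5, 100) for m in segment_means])

# Binary segmentation with a least-squares (mean-shift) cost, asking for two breaks.
algo = rpt.Binseg(model="l2").fit(signal)
break_points = algo.predict(n_bkps=2)   # indices that end each estimated segment
print("estimated break points:", break_points)
```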

  8. Uniaxial tension test on Rubber at constant true strain rate

    Directory of Open Access Journals (Sweden)

    Sourne H.L.

    2012-08-01

    Full Text Available Elastomers are widely used for damping parts in different industrial contexts because of their remarkable dissipation properties. Indeed, they can undergo severe mechanical loading conditions, i.e., high strain rates and large strains. Nevertheless, the mechanical response of these materials can vary from purely rubber-like to glassy depending on the strain rate undergone. Classically, uniaxial tension tests are made in order to find a relation between the stress and the strain in the material at various strain rates. However, even when the strain rate is intended to be constant, it is usually the nominal strain rate that is held constant. Here we develop a test at constant true strain rate, i.e. the strain rate that is experienced by the material. In order to do such a test, the displacement imposed by the machine is an exponential function of time. This test has been performed with a high speed hydraulic machine for strain rates between 0.01/s and 100/s. A specific specimen has been designed, yielding a uniform strain field (and so a uniform stress field). Furthermore, an instrumented aluminum bar has been used to take into account dynamic effects in the measurement of the applied force. A high speed camera enables the determination of strain in the sample using a point-tracking technique. Using this method, the stress-strain curve of a rubber-like material during a loading-unloading cycle has been determined, up to a stretch ratio λ = 2.5. The influence of the true strain rate both on stiffness and on dissipation of the material is then discussed.
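
    The exponential displacement programme mentioned above follows directly from the definition of true strain; a short derivation (with L0 the initial gauge length and a prescribed constant true strain rate) is:

```latex
\varepsilon_{\mathrm{true}}(t) = \ln\frac{L(t)}{L_0},
\qquad
\dot{\varepsilon}_{\mathrm{true}} = \frac{\dot{L}(t)}{L(t)} = \mathrm{const}
\;\Longrightarrow\;
L(t) = L_0\, e^{\dot{\varepsilon} t},
\qquad
u(t) = L(t) - L_0 = L_0\left(e^{\dot{\varepsilon} t} - 1\right).
```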

  9. Uptake of newer methodological developments and the deployment of meta-analysis in diagnostic test research: a systematic review

    OpenAIRE

    Quigley Muireann; Willis Brian H

    2011-01-01

    Abstract Background The last decade has seen a number of methodological developments in meta-analysis of diagnostic test studies. However, it is unclear whether such developments have permeated the wider research community and on which applications they are being deployed. The objective was to assess the uptake and deployment of the main methodological developments in the meta-analysis of diagnostic tests, and identify the tests and target disorders most commonly evaluated by meta-analysis. M...

  10. A FAMILY OF SUMMARY CHI SQUARE TESTS FOR COMPARING SURVIVAL RATES RATHER THAN CONDITIONAL PROBABILITIES DYING

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Objective This paper proposes a family of summary chi-square tests for comparing survival rates at all points of time between two groups. Methods They are derived, respectively, from the Peto et al. expression for the log-rank test, the Mantel-Haenszel expression for the log-rank test, and the generalized Wilcoxon test, by using the homogenetic effective sample size in place of the number at risk and the corresponding numerator of the conditional probability of surviving in place of the number of deaths. Results After such derivations they become clearer in clinical significance, more powerful, and free from the proportional hazards assumption. Conclusion These tests can be employed in analyzing clinical data on cancer. A worked example illustrates the methodology.

  11. High rate tests of the LHCb RICH Upgrade system

    CERN Multimedia

    Blago, Michele Piero

    2016-01-01

    One of the biggest challenges for the upgrade of the LHCb RICH detectors from 2020 is to read out the photon detectors at the full 40 MHz rate of the LHC proton-proton collisions. A test facility has been set up at CERN to investigate the behaviour of the Multi-Anode PMTs (MaPMTs), which have been proposed for the upgrade, and of their readout electronics at high trigger rates. The MaPMTs are illuminated with a monochromatic laser that can be triggered independently of the readout electronics. A first series of tests, including threshold scans, is performed at low trigger rates (20 kHz) for both the readout and the laser, in order to characterise the behaviour of the system under test. Then the trigger rate is increased in two separate steps. First the MaPMTs are exposed to high illumination by triggering the pulsed laser at a high (20 MHz) repetition rate while the DAQ is read out at the same low rate as before. In this way the performance of the MaPMTs and the attached electronics can be evaluated ...

  12. Methodological and experimental study of the relationship between displacement rate of landslide and GNSS strategy for deformation monitoring

    Science.gov (United States)

    Giordan, Daniele; Piras, Marco; Allasia, Paolo; Dabove, Paolo

    2016-04-01

    The use of GNSS for landslide monitoring is not a novelty. In the field of large slope instabilities, where the phenomena are usually wide and the use of complex monitoring networks is needed, continuous monitoring is often required. In this case, the installed GNSS solution is composed of a dual-frequency receiver, solar power, and a radio connection to a ground station, where the measurement sessions of the rovers are collected and processed. The management of the collected data is the most critical aspect, because the commonly used approach assumes a fixed position of the GNSS antenna during the acquisition time window. When the landslide displacement rate is low, the position shift of the point can be considered insignificant, but as the velocity increases, the processing of the GNSS time series becomes crucial to obtaining reliable and sufficiently accurate measurements. Starting from real case studies such as the Italian large slope instabilities of Montaguto (Avellino, Italy) and Mont de La Saxe (Courmayeur, Italy), we focused on the presence of different kinematic domains with dissimilar displacement behaviors and velocities. In particular, the velocities registered during the main active periods range from several millimeters/day up to several meters/day, so the strategies for processing the GNSS data must be very different. The methodology for data acquisition (continuous or windowed) and its duration, the type of receivers and antennas used (single or dual frequency, GPS or GNSS, mass market or geodetic), the data processing strategies (i.e. single epoch, kinematic), and possibly GNSS network services are fundamental factors which may favor one or another solution, according to time, economy and infrastructure readiness in the field. In the greater part of these studies, the choices were made based on the experience of those responsible under similar conditions. Starting from the behavior of the real cases previously cited

  13. Testing spectral models for stellar populations with star clusters: I. Methodology

    CERN Document Server

    Fernandes, Roberto Cid

    2009-01-01

    High resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of stellar population of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well studied star clusters from the work of Leonardi & Rose (2003) spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic clouds. This paper concentrates on methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the MILES library. Best-fit and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal un...

  14. Methodology to determine failure characteristics of planar soft tissues using a dynamic tensile test.

    Science.gov (United States)

    Jacquemoud, C; Bruyere-Garnier, K; Coret, M

    2007-01-01

    Predicting the injury risk in automotive collisions requires accurate knowledge of human tissues, more particularly their mechanical properties under dynamic loadings. The present methodology aims to determine the failure characteristics of planar soft tissues such as skin, hollow organs and large vessel walls. This consists of a dynamic tensile test, which implies high testing velocities close to those in automotive collisions. To proceed, I-shaped tissue samples are subjected to dynamic tensile tests using a customized tensile device based on the drop test principle. Data acquisition has especially been adapted to heterogeneous and soft biological tissues, given that the standard measurement systems (considered to be global) have been complemented with a non-contact, full-field strain measurement (considered to be local). This local measurement technique, called the Image Correlation Method (ICM), provides an accurate strain analysis by revealing strain concentrations and avoids damaging the tissue. The methodology has first been applied to human forehead skin and can be further expanded to other planar soft tissues. The failure characteristics for the skin in terms of ultimate stress are 3 MPa +/- 1.5 MPa. The ultimate global longitudinal strains are equal to 9.5%+/-1.9% (Green-Lagrange strain), which contrasts with the ultimate local longitudinal strain values of 24.0%+/-5.3% (Green-Lagrange strain). This difference is a consequence of the tissue heterogeneity, clearly illustrated by the heterogeneous distribution of the local strain field. All data will assist in developing the tissue constitutive law that will be implemented in finite element models.

  15. Standard test method for measurement of fatigue crack growth rates

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2015-01-01

    1.1 This test method covers the determination of fatigue crack growth rates from near-threshold to Kmax controlled instability. Results are expressed in terms of the crack-tip stress-intensity factor range (ΔK), defined by the theory of linear elasticity. 1.2 Several different test procedures are provided, the optimum test procedure being primarily dependent on the magnitude of the fatigue crack growth rate to be measured. 1.3 Materials that can be tested by this test method are not limited by thickness or by strength so long as specimens are of sufficient thickness to preclude buckling and of sufficient planar size to remain predominantly elastic during testing. 1.4 A range of specimen sizes with proportional planar dimensions is provided, but size is variable to be adjusted for yield strength and applied force. Specimen thickness may be varied independent of planar size. 1.5 The details of the various specimens and test configurations are shown in Annex A1-Annex A3. Specimen configurations other than t...
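
    For context, and as a common convention rather than part of the test method's scope quoted above, fatigue crack growth rate data of this kind are frequently correlated with the stress-intensity factor range through the Paris-Erdogan relation, with C and m as fitted material constants:

```latex
\frac{da}{dN} = C\,(\Delta K)^{m}
```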

  16. Research to reduce the suicide rate among older adults: methodology roadblocks and promising paradigms.

    Science.gov (United States)

    Szanto, Katalin; Lenze, Eric J; Waern, Margda; Duberstein, Paul; Bruce, Martha L; Epstein-Lubow, Gary; Conwell, Yeates

    2013-06-01

    The National Institute of Mental Health and the National Action Alliance for Suicide Prevention have requested input into the development of a national suicide research agenda. In response, a working group of the American Association for Geriatric Psychiatry has prepared recommendations to ensure that the suicide prevention dialogue includes older adults, a large and fast-growing population at high risk of suicide. In this Open Forum, the working group describes three methodology roadblocks to research into suicide prevention among elderly persons and three paradigms that might provide directions for future research into suicide prevention strategies for older adults.

  17. Leach test methodology for the Waste/Rock Interactions Technology Program

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, D.J.; McVay, G.L.; Coles, D.G.

    1980-05-01

    Experimental leach studies in the WRIT Program have two primary functions. The first is to determine radionuclide release from waste forms in laboratory environments which attempt to simulate repository conditions. The second is to elucidate leach mechanisms which can ultimately be incorporated into near-field transport models. The tests have been utilized to generate rates of removal of elements from various waste forms and to provide specimens for surface analysis. Correlation between constituents released to the solution and corresponding solid state profiles is invaluable in the development of a leach mechanism. Several test methods are employed in our studies which simulate various proposed leach incident scenarios. Static tests include low temperature (below 100°C) and high temperature (above 100°C) hydrothermal tests. These tests reproduce nonflow or low-flow repository conditions and can be used to compare materials and leach solution effects. The dynamic tests include single-pass, continuous-flow (SPCF) and solution-change (IAEA)-type tests in which the leach solutions are changed at specific time intervals. These tests simulate repository conditions of higher flow rates and can also be used to compare materials and leach solution effects under dynamic conditions. The modified IAEA test is somewhat simpler to use than the one-pass flow and gives adequate results for comparative purposes. The static leach test models the condition of near-zero flow in a repository and provides information on element readsorption and solubility limits. The SPCF test is used to study the effects of flowing solutions at velocities that may be anticipated for geologic groundwaters within breached repositories. These two testing methods, coupled with the use of autoclaves, constitute the current thrust of WRIT leach testing.

  18. A systematic review of mosquito coils and passive emanators: defining recommendations for spatial repellency testing methodologies

    Directory of Open Access Journals (Sweden)

    Ogoma Sheila B

    2012-12-01

    Full Text Available Mosquito coils, vaporizer mats and emanators confer protection against mosquito bites through the spatial action of emanated vapor or airborne pyrethroid particles. These products dominate the pest control market; therefore, it is vital to characterize the mosquito responses elicited by the chemical actives and their potential for disease prevention. The aim of this review was to determine the effects of mosquito coils and emanators on mosquito responses that reduce human-vector contact, and to propose a scientific consensus on the terminologies and methodologies used for evaluating product formats that could contain spatial chemical actives, including indoor residual spraying (IRS), long-lasting insecticide-treated nets (LLINs) and insecticide-treated materials (ITMs). The PubMed (National Centre for Biotechnology Information (NCBI), U.S. National Library of Medicine, NIH), MEDLINE, LILAC, Cochrane Library, IBECS and Armed Forces Pest Management Board Literature Retrieval System search engines were used to identify studies of pyrethroid-based coils and emanators with the keywords "Mosquito coils", "Mosquito emanators" and "Spatial repellents". It was concluded that there is a need to improve the statistical reporting of studies and to reach consensus on the methodologies and terminologies used, through standardized testing guidelines. Despite differing evaluation methodologies, the data showed that coils and emanators induce mortality, deterrence and repellency, as well as reducing the ability of mosquitoes to feed on humans. The available data on efficacy outdoors, dose–response relationships and the effective distance of coils and emanators are inadequate for developing a target product profile (TPP), which will be required for such chemicals before optimized implementation can occur for maximum benefits in disease control.

  19. Organizational Benchmarks for Test Utilization Performance: An Example Based on Positivity Rates for Genetic Tests.

    Science.gov (United States)

    Rudolf, Joseph; Jackson, Brian R; Wilson, Andrew R; Smock, Kristi J; Schmidt, Robert L

    2017-04-01

    Health care organizations are under increasing pressure to deliver value by improving test utilization management. Many factors, including organizational factors, could affect utilization performance. Past research has focused on the impact of specific interventions in single organizations. The impact of organizational factors is unknown. The objective of this study is to determine whether testing patterns are subject to organizational effects, i.e., whether utilization patterns for individual tests are correlated within organizations. Comparative analysis of ordering patterns (positivity rates for three genetic tests) across 659 organizations. Hierarchical regression was used to assess the impact of organizational factors after controlling for test-level factors (mutation prevalence) and hospital bed size. Test positivity rates were correlated within organizations. Organizations have a statistically significant impact on the positivity rate of three genetic tests.
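
    A minimal sketch of a hierarchical (mixed-effects) analysis of positivity rates clustered by organization, using statsmodels; the column names, the simple linear specification and the synthetic data are assumptions for illustration, not the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: positivity rates for tests nested within organizations.
rng = np.random.default_rng(3)
n_orgs, tests_per_org = 50, 3
org = np.repeat(np.arange(n_orgs), tests_per_org)
org_effect = np.repeat(rng.normal(0, 0.05, n_orgs), tests_per_org)   # shared organizational effect
prevalence = rng.uniform(0.05, 0.30, n_orgs * tests_per_org)
positivity = 0.5 * prevalence + org_effect + rng.normal(0, 0.02, n_orgs * tests_per_org)
df = pd.DataFrame({"org": org, "prevalence": prevalence, "positivity": positivity})

# A random intercept per organization captures within-organization correlation of positivity rates.
model = smf.mixedlm("positivity ~ prevalence", df, groups=df["org"]).fit()
print(model.summary())
```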

  20. High Strain Rate Compressive Tests on Woven Graphite Epoxy Composites

    Science.gov (United States)

    Allazadeh, Mohammad Reza; Wosu, Sylvanus N.

    2011-08-01

    The behavior of composite materials may be different when they are subjected to high strain rate loads. The penetrating split Hopkinson pressure bar (P-SHPB) is a method of imposing high strain rates on a specimen in laboratory experiments. This research work studied the response of thin circular specimens, made out of woven graphite epoxy composites, to high strain rate impact loads. The stress-strain relationships and behavior of the specimens were investigated during compressive dynamic tests for strain rates as high as 3200 s-1. One-dimensional analysis was used for the analytical calculations, since the experiments satisfied the bar diameter-to-length ratio condition for impact load experiments. The mechanics of dynamic failure was studied, and the results showed, via the energy absorbed by the specimen, the factors that govern the failure mode under high-strain deformation. In this paper, the relation of particle velocity to perforation depth is discussed for woven graphite epoxy specimens.
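
    The one-dimensional SHPB analysis referred to above reduces, under the usual specimen force-equilibrium assumption, to the textbook relations between the measured bar strains and the specimen response (c0 the bar wave speed, Ls the specimen length, E and Ab the bar modulus and cross-section, As the specimen cross-section, and εr, εt the reflected and transmitted strain pulses); these standard forms are quoted here for orientation only:

```latex
\dot{\varepsilon}_s(t) = -\frac{2\,c_0}{L_s}\,\varepsilon_r(t),
\qquad
\varepsilon_s(t) = -\frac{2\,c_0}{L_s}\int_0^{t}\varepsilon_r(\tau)\,d\tau,
\qquad
\sigma_s(t) = E\,\frac{A_b}{A_s}\,\varepsilon_t(t).
```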

  1. 77 FR 75896 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2013

    Science.gov (United States)

    2012-12-26

    ... Federal Railroad Administration 49 CFR Part 219 Alcohol and Drug Testing: Determination of Minimum Random.... According to data from FRA's Management Information System, the rail industry's random drug testing positive... (Administrator) has therefore determined that the minimum annual random drug testing rate for the period January...

  2. 76 FR 80781 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2012

    Science.gov (United States)

    2011-12-27

    ... Federal Railroad Administration 49 CFR Part 219 RIN 2130-AA81 Alcohol and Drug Testing: Determination of... random drug testing ] positive rate has remained below 1.0 percent for the last two years. The Federal Railroad Administrator (Administrator) has therefore determined that the minimum annual random drug testing...

  3. 75 FR 1547 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2010

    Science.gov (United States)

    2010-01-12

    ... Federal Railroad Administration 49 CFR Part 219 RIN 2130-AA81 Alcohol and Drug Testing: Determination of... percent for drugs and 0.15 percent for alcohol. Because the industry-wide random drug testing positive... (Administrator) has determined that the minimum annual random drug testing rate for the period January 1, 2010...

  4. The interest rate and capital durability, and the importance of methodological pluralism

    NARCIS (Netherlands)

    van Arkel, R.; Vermeylen, K.

    2013-01-01

    Champions of sustainable growth often call for more durable production technologies with less capital depreciation. As investment in more durable capital is encouraged by lower interest rates, we investigate whether policy makers can steer the economy towards a path with low interest rates in order

  5. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Science.gov (United States)

    2010-10-01

    ... data that account for the relative resource utilization of different resident types; and (v) Medicare... associated case-mix indices that account for the relative resource utilization of different patient types... prospective payment rates. (a) Data used. (1) To calculate the prospective payment rates, CMS uses—...

  6. Using Rasch Rating Scale Methodology to Examine a Behavioral Screener for Preschoolers at Risk

    Science.gov (United States)

    DiStefano, Christine; Greer, Fred W.; Kamphaus, R. W.; Brown, William H.

    2014-01-01

    A screening instrument used to identify young children at risk for behavioral and emotional difficulties, the Behavioral and Emotional Screening System Teacher Rating Scale-Preschool was examined. The Rasch Rating Scale Method was used to provide additional information about psychometric properties of items, respondents, and the response scale.…

  7. 42 CFR 422.312 - Announcement of annual capitation rate, benchmarks, and methodology changes.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Announcement of annual capitation rate, benchmarks... Payments to Medicare Advantage Organizations § 422.312 Announcement of annual capitation rate, benchmarks... description of the risk and other factors. (3) Regional benchmark announcement. Before the beginning of...

  8. Flight Test Techniques for Quantifying Pitch Rate and Angle of Attack Rate Dependencies

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.; Murri, Daniel G.

    2017-01-01

    Three different types of maneuvers were designed to separately quantify pitch rate and angle of attack rate contributions to the nondimensional aerodynamic pitching moment coefficient. These maneuvers combined pilot inputs and automatic multisine excitations, and were flown with the subscale T-2 and Bat-4 airplanes using the NASA AirSTAR flight test facility. Stability and control derivatives, in particular Cmq and Cmα̇, were accurately estimated from the flight test data. These maneuvers can be performed with many types of aircraft, and the results can be used to increase simulation prediction fidelity and facilitate more accurate comparisons with wind tunnel experiments or numerical investigations.
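
    A minimal sketch of how such derivatives can be estimated from flight data by linear least squares in an equation-error formulation; the regressor set, reference values and synthetic signals are assumptions for illustration, not the AirSTAR processing chain.

```python
import numpy as np

# Synthetic flight-test-like data; the maneuvers described above are designed so that the
# pitch-rate and angle-of-attack-rate regressors are decorrelated, which is assumed here.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 20.0, 2000)
V, c_bar = 60.0, 1.2                               # airspeed [m/s], mean aerodynamic chord [m]
alpha = 0.05 * np.sin(2 * np.pi * 0.4 * t)         # angle of attack [rad]
alpha_dot = np.gradient(alpha, t)                  # angle of attack rate [rad/s]
q = 0.10 * np.sin(2 * np.pi * 0.9 * t + 0.7)       # pitch rate [rad/s], independent excitation
delta_e = 0.02 * np.sin(2 * np.pi * 1.3 * t)       # elevator deflection [rad]

# "True" derivatives used only to simulate the pitching-moment coefficient.
theta_true = np.array([0.01, -0.6, -12.0, -4.0, -1.1])   # Cm0, Cm_alpha, Cm_q, Cm_alphadot, Cm_de
X = np.column_stack([np.ones_like(t), alpha, q * c_bar / (2 * V),
                     alpha_dot * c_bar / (2 * V), delta_e])
Cm = X @ theta_true + 0.0005 * rng.standard_normal(t.size)

# Equation-error least-squares estimate of the derivatives.
theta_hat, *_ = np.linalg.lstsq(X, Cm, rcond=None)
print("estimated [Cm0, Cm_alpha, Cm_q, Cm_alphadot, Cm_de]:", np.round(theta_hat, 2))
```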

  9. Testing the rate isomorphy hypothesis using five statistical methods

    Institute of Scientific and Technical Information of China (English)

    Xian-Ju Kuang; Megha N. Parajulee; Pei-Jian Shi; Feng Ge; Fang-Sen Xue

    2012-01-01

    Organisms are said to be in developmental rate isomorphy when the proportions of developmental stage durations are unaffected by temperature. Comprehensive stage-specific developmental data were generated on the cabbage beetle, Colaphellus bowringi Baly (Coleoptera: Chrysomelidae), at eight temperatures ranging from 16℃ to 30℃ (in 2℃ increments), and five analytical methods were used to test the rate isomorphy hypothesis, including: (i) direct comparison of lower developmental thresholds with standard errors based on the traditional linear equation describing developmental rate as a linear function of temperature; (ii) analysis of covariance to compare the lower developmental thresholds of different stages based on the Ikemoto-Takai linear equation; (iii) testing the significance of the slope term in the regression of arcsin(√p) versus temperature, where p is the ratio of the developmental duration of a particular developmental stage to the entire pre-imaginal developmental duration for one insect or mite species; (iv) analysis of variance to test for significant differences between the ratios of developmental stage durations to that of pre-imaginal development; and (v) checking whether there is an element less than a given level of significance in the p-value matrix of the rotating regression line. The results revealed no significant difference among the lower developmental thresholds or among the aforementioned ratios, and thus convincingly confirmed the rate isomorphy hypothesis.
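
    Method (iii) above can be sketched directly: regress arcsin(√p) on temperature and test whether the slope differs from zero; the ratios below are invented for illustration, and a non-significant slope is consistent with rate isomorphy.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical ratios p = (stage duration) / (total pre-imaginal duration)
# for one developmental stage at the eight rearing temperatures.
temperature = np.array([16, 18, 20, 22, 24, 26, 28, 30], dtype=float)   # degrees C
p = np.array([0.21, 0.20, 0.22, 0.21, 0.20, 0.21, 0.22, 0.21])

# Rate isomorphy predicts a zero slope of arcsin(sqrt(p)) versus temperature.
y = np.arcsin(np.sqrt(p))
fit = linregress(temperature, y)
print(f"slope = {fit.slope:.5f}, p-value = {fit.pvalue:.3f}")
```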

  10. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    Full Text Available The development of new and the modernization of existing aviation equipment specimens of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the development challenges of automated control systems for aviation equipment test operations are presented. Such automated control systems are, in essence, automated data banks. The key role of the development of the flight test automated control system in the creation of automated control systems for aviation equipment test operations is substantiated. The approach to integrating mobile modular measuring complexes, and the need for national methodologies and technological standards for database system design concepts, are grounded. The database system, as a central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and frequency required for monitoring the state of the controlled object. It is the database system that provides the supervisory unit with actual data corresponding to specific moments in time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the ability to develop the systems being created. The introduced conclusions and suggestions can be used in the

  11. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul S [Los Alamos National Laboratory; Morgan, Keith S [Los Alamos National Laboratory; Caffrey, Michael P [Los Alamos National Laboratory

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
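
    To illustrate the fault-injection tier of such a methodology in the abstract, the toy sketch below flips a random bit in one of three redundant copies of a state word and checks whether majority voting still recovers the correct value; it is a software model for intuition only, not the Xilinx-specific flow described above.

```python
import random

def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority voter, as used in TMR."""
    return (a & b) | (b & c) | (a & c)

def inject_fault(word, width=32):
    """Flip one randomly chosen bit in a width-bit word."""
    return word ^ (1 << random.randrange(width))

random.seed(0)
golden = 0xDEADBEEF
trials, errors = 10000, 0
for _ in range(trials):
    copies = [golden, golden, golden]
    victim = random.randrange(3)              # upset a single redundant domain
    copies[victim] = inject_fault(copies[victim])
    if majority_vote(*copies) != golden:
        errors += 1
print(f"output errors with single-domain upsets: {errors}/{trials}")
```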

  12. Development, testing and implementation of an emergency services methodology in Alberta.

    Science.gov (United States)

    Eliasoph, H; Ashdown, C

    1995-01-01

    Alberta was the first province in Canada to mandate reporting of hospital-based emergency services. This reporting is based on a workload measurement system that groups emergency visits into five discrete workload levels/classes driven by ICD-9-CM diagnoses. Other related workload measurement variables are incorporated, including admissions, transfers, maintenance monitoring, nursing and non-nursing patient support activities, trips, staff replacement, and personal fatigue and delay. The methodology used to design the reporting system has been subjected to extensive testing, auditing and refinement. The results of one year of province-wide data collection yielded approximately 1.5 million emergency visits. These data reveal consistent patterns/trends of workload that vary by hospital size and type. Although this information can assist in utilization management efforts to predict and compare workload and staffing levels, the impetus for establishing this system derived from its potential for funding hospital-based emergency services. This would be the first time that such services would be funded on a systemic, system-wide basis whereby hospitals would be reimbursed in relation to workload. This proposed funding system would distribute available funding in a consistent, fair and equitable manner across all hospitals providing a similar set of services, thus achieving one of the key goals of the Alberta Acute Care Funding Plan. Ultimately, this proposed funding methodology would be integrated into a broader Ambulatory Care Funding system currently being developed in Alberta.

  13. Inverse modeling of emissions for local photooxidant pollution: testing a new methodology with kriging constraints

    Energy Technology Data Exchange (ETDEWEB)

    Pison, I.; Blond, N. [Paris-7 Univ., Creteil (France). LISA, CNRS; Menut, L. [Ecole Polytechnique, Palaiseau (France). LMD/IPSL

    2006-07-01

    A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air quality forecast in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inverse developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements. (orig.)

  14. Inverse modeling of emissions for local photooxidant pollution: Testing a new methodology with kriging constraints

    Directory of Open Access Journals (Sweden)

    I. Pison

    2006-07-01

    Full Text Available A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air quality forecast in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inverse developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements.

  15. High Strain Rate Testing of Welded DOP-26 Iridium

    Energy Technology Data Exchange (ETDEWEB)

    Schneibel, J. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, R. G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carmichael, C. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fox, E. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ulrich, G. B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); George, E. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The iridium alloy DOP-26 is used to produce Clad Vent Set cups that protect the radioactive fuel in radioisotope thermoelectric generators (RTGs) which provide electric power for spacecraft and rovers. In a previous study, the tensile properties of DOP-26 were measured over a wide range of strain rates and temperatures and reported in ORNL/TM-2007/81. While that study established the properties of the base material, the fabrication of the heat sources requires welding, and the mechanical properties of welded DOP-26 have not been extensively characterized in the past. Therefore, this study was undertaken to determine the mechanical properties of DOP-26 specimens containing a transverse weld in the center of their gage sections. Tensile tests were performed at room temperature, 750, 900, and 1090°C and engineering strain rates of 1×10-3 and 10 s-1. Room temperature testing was performed in air, while testing at elevated temperatures was performed in a vacuum better than 1×10-4 Torr. The welded specimens had a significantly higher yield stress, by up to a factor of ~2, than the non-welded base material. The yield stress did not depend on the strain rate except at 1090°C, where it was slightly higher for the faster strain rate. The ultimate tensile stress, on the other hand, was significantly higher for the faster strain rate at temperatures of 750°C and above. At 750°C and above, the specimens deformed at 1×10-3 s-1 showed pronounced necking resulting sometimes in perfect chisel-edge fracture. The specimens deformed at 10 s-1 exhibited this fracture behavior only at the highest test temperature, 1090°C. Fracture occurred usually in the fusion zone of the weld and was, in most cases, primarily intergranular.

  16. Optical Methods For Automatic Rating Of Engine Test Components

    Science.gov (United States)

    Pritchard, James R.; Moss, Brian C.

    1989-03-01

    In recent years, increasing commercial and legislative pressure on automotive engine manufacturers, including increased oil drain intervals, cleaner exhaust emissions and high specific power outputs, has led to increasing demands on lubricating oil performance. Lubricant performance is defined by bench engine tests run under closely controlled conditions. After testing, engines are dismantled and the parts are rated for wear and deposit accumulation. This rating must be consistently carried out in laboratories throughout the world in order to ensure lubricant quality meeting the specified standards. To this end, rating technicians evaluate components, following closely defined procedures. This process is time-consuming, inaccurate and subject to drift, requiring regular recalibration of raters by means of international rating workshops. This paper describes two instruments for automatic rating of engine parts. The first uses a laser to determine the degree of polishing of the engine cylinder bore, caused by the reciprocating action of the piston. This instrument has been developed to prototype stage by the NDT Centre at Harwell under contract to Exxon Chemical, and is planned for production within the next twelve months. The second instrument uses red and green filtered light to determine the type, quality and position of deposit formed on the piston surfaces. The latter device has undergone a feasibility study, but no prototype exists.

  17. DILATANCY BEHAVIOR IN CONSTANT STRAIN RATE CONSOLIDATION TEST

    Directory of Open Access Journals (Sweden)

    Berty Sompie

    2006-01-01

    Full Text Available Using remolded young clay, this paper shows that much of the time-dependent behavior in the standard consolidation (SC) and constant strain rate consolidation (CSRC) tests can be represented systematically by a simple assumption concerning the time dependency of dilatancy. In the SC test, little dilatancy takes place at the first stage of each loading step, and dilatancy begins to occur several minutes after step loading. In the CSRC test, some time after the stress state has entered the normally consolidated region, dilatancy tends to occur rapidly with the increase in stress ratio. Since most of the dilatancy has taken place at the earlier stage of consolidation, little dilatancy occurs at the later stage of the CSRC process. This tendency makes the specimen stiffer with the passage of time, and makes the vertical pressure and pore pressure increase substantially at the last stage of the CSRC process. Consideration of such behavior may be effective for correctly interpreting the results of the CSRC test.

  18. Divergence of conserved non-coding sequences: rate estimates and relative rate tests.

    Science.gov (United States)

    Wagner, Günter P; Fried, Claudia; Prohaska, Sonja J; Stadler, Peter F

    2004-11-01

    In many eukaryotic genomes only a small fraction of the DNA codes for proteins, but the non-protein coding DNA harbors important genetic elements directing the development and the physiology of the organisms, like promoters, enhancers, insulators, and micro-RNA genes. The molecular evolution of these genetic elements is difficult to study because their functional significance is hard to deduce from sequence information alone. Here we propose an approach to the study of the rate of evolution of functional non-coding sequences at a macro-evolutionary scale. We identify functionally important non-coding sequences as Conserved Non-Coding Nucleotide (CNCN) sequences from the comparison of two outgroup species. The CNCN sequences so identified are then compared to their homologous sequences in a pair of ingroup species, and we monitor the degree of modification these sequences suffered in the two ingroup lineages. We propose a method to test for rate differences in the modification of CNCN sequences among the two ingroup lineages, as well as a method to estimate their rate of modification. We apply this method to the full sequences of the HoxA clusters from six gnathostome species: a shark, Heterodontus francisci; a basal ray finned fish, Polypterus senegalus; the amphibian, Xenopus tropicalis; as well as three mammalian species, human, rat and mouse. The results show that the evolutionary rate of CNCN sequences is not distinguishable among the three mammalian lineages, while the Xenopus lineage has a significantly increased rate of evolution. Furthermore the estimates of the rate parameters suggest that in the stem lineage of mammals the rate of CNCN sequence evolution was more than twice the rate observed within the placental amniotes clade, suggesting a high rate of evolution of cis-regulatory elements during the origin of amniotes and mammals. We conclude that the proposed methods can be used for testing hypotheses about the rate and pattern of evolution of putative
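
    A rough sketch of a relative-rate comparison in the spirit described above: counts of CNCN positions modified uniquely along each ingroup lineage (relative to the outgroup-defined ancestral state) are compared with a chi-square test for equal rates. The counts are hypothetical, and this simplified Tajima-style test is a stand-in for, not a reproduction of, the authors' procedure.

```python
from scipy.stats import chisquare

# Hypothetical counts of CNCN sites modified only in lineage 1 or only in lineage 2.
changes_lineage_1 = 180   # e.g. the Xenopus lineage
changes_lineage_2 = 95    # e.g. a mammalian lineage

# Under equal rates, lineage-specific changes are expected in a 1:1 ratio.
stat, p_value = chisquare([changes_lineage_1, changes_lineage_2])
print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")
```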

  19. A methodology for post-occupancy evaluation of ventilation rates in schools

    OpenAIRE

    Mumovic, D.; Davies, M.; Ridley, I.; Altamirano-Medina, H.; Oreszczyn, T

    2009-01-01

    The importance of maintaining adequate indoor air quality in schools is recognised as a contributing factor to pupils' learning performance. This paper describes a series of field measurements investigating the ventilation rates in four recently built secondary schools in England. All schools were assessed for compliance with the recently adopted Building Bulletin 101, which defines a set of criteria for ventilation rates and indoor air quality in new school buildings. U...

  20. Testing spectral models for stellar populations with star clusters - I. Methodology

    Science.gov (United States)

    Cid Fernandes, Roberto; González Delgado, Rosa M.

    2010-04-01

    High-resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of stellar population of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well-studied star clusters from the work of Leonardi and Rose spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic Clouds. This paper concentrates on the methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the Medium-resolution INT Library of Empirical Spectra. Best-fitting and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal uncertainties in t,Z and AV. In some cases, the spectral fits indicate that the models lack a blue old population, probably associated with the horizontal branch. This methodology, which is mostly based on the publicly available code STARLIGHT, is extended to other sets of models in Paper II, where a comparison with properties derived from spatially resolved data (colour-magnitude diagrams) is presented. The global aim of these two papers is to provide guidance to users of evolutionary synthesis models and empirical feedback to model makers.

  1. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
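
    The paper derives a general GSD formulation for arbitrary optical projections; that derivation is not reproduced here. As a simplified, hedged sketch (assuming ideal projection models and a flat target parallel to the sensor at distance D), the radial GSD of a rectilinear lens (r = f*tan(theta)) is constant across the frame, while for an equidistant fisheye (r = f*theta) it grows with the off-axis angle:

        import numpy as np

        def gsd_rectilinear(D, f, pixel_pitch):
            # Fronto-parallel plane: GSD is constant, D * p / f.
            return D * pixel_pitch / f

        def gsd_equidistant_fisheye(D, f, pixel_pitch, theta):
            # Equidistant projection r = f*theta; ground radius rho = D*tan(theta),
            # so d(rho)/dr = D / (f * cos(theta)**2).
            return D * pixel_pitch / (f * np.cos(theta) ** 2)

        # Illustrative numbers (hypothetical, not taken from the Minguzzi survey):
        D, f, p = 0.7, 0.008, 4.5e-6   # target distance (m), focal length (m), pixel pitch (m)
        print("rectilinear GSD:", round(gsd_rectilinear(D, f, p) * 1000, 3), "mm/px")
        for deg in (0, 30, 60):
            gsd = gsd_equidistant_fisheye(D, f, p, np.radians(deg))
            print(f"fisheye GSD at {deg:2d} deg off-axis:", round(gsd * 1000, 3), "mm/px")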

  2. TESTING FOR LONG MEMORY IN THE ASIAN FOREIGN EXCHANGE RATES

    Institute of Scientific and Technical Information of China (English)

    Abdol S. SOOFI; Shouyang WANG; Yuqin ZHANG

    2006-01-01

    In this paper, we use the plug-in and Whittle methods that are based on spectral regression analysis to test for the long memory property in 12 Asian/dollar daily exchange rates. The results according to the plug-in method show that with the exception of Chinese renminbi all series may have long memory properties. The results based on the Whittle method, on the other hand, show that only Japanese yen and Malaysian ringgit may have long memory properties. It is well known that inference about the differencing parameter, d, in presence of structural break in a series entails considerable difficulties. Therefore, given the financial crisis of 1997-1998 in Asia, further tests for unravelling of the memory property and presence of structural break in the exchange rate series are required.
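
    As a hedged illustration of the spectral-regression family of long-memory estimators mentioned above, the sketch below implements the Geweke-Porter-Hudak (GPH) log-periodogram estimator of the differencing parameter d; it is a close relative of the plug-in approach, not necessarily the exact procedure used in the paper, and the bandwidth choice is an assumption:

        import numpy as np

        def gph_estimate_d(x, bandwidth_power=0.5):
            """Log-periodogram (GPH) estimate of the long-memory parameter d:
            regress log I(lambda_j) on log(4 sin^2(lambda_j / 2)) over the
            lowest m = n**bandwidth_power Fourier frequencies; d = -slope."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            m = int(n ** bandwidth_power)
            freqs = 2 * np.pi * np.arange(1, m + 1) / n
            dft = np.fft.fft(x - x.mean())
            periodogram = np.abs(dft[1:m + 1]) ** 2 / (2 * np.pi * n)
            regressor = np.log(4 * np.sin(freqs / 2) ** 2)
            slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
            return -slope        # d > 0 suggests long memory

        # Sanity check on simulated white noise: the estimate should be near 0.
        rng = np.random.default_rng(0)
        print(round(gph_estimate_d(rng.standard_normal(1500)), 3))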

  3. A Methodology for Long-Term Forecasts of Air Force Pilot Retention Rates: A Management Perspective

    Science.gov (United States)

    1990-09-01

    ...many runs indicates negative autocorrelation (27:8.3). When a runs test is used in this manner, it is known as a Wald-Wolfowitz runs test (4:350). The ... Wald-Wolfowitz test statistic, T, is the total number of runs observed. A table of values for this statistic must then be consulted to determine if the ... magnitude, and then plots them about their mean (residuals are standardized by dividing them by the mean of the regression's squared error [MSE]). The ...
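
    The fragments above refer to the Wald-Wolfowitz runs test applied to regression residuals. A minimal, hedged sketch of the large-sample version of that test (using the normal approximation instead of the tabulated critical values the report consults) might look like:

        import numpy as np
        from math import erf, sqrt

        def runs_test(residuals):
            """Wald-Wolfowitz runs test on the signs of residuals.
            Returns (number of runs, z statistic, two-sided p-value)."""
            signs = np.sign(residuals)
            signs = signs[signs != 0]                  # drop exact zeros
            n_pos = int(np.sum(signs > 0))
            n_neg = int(np.sum(signs < 0))
            n = n_pos + n_neg
            runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
            mean_runs = 2.0 * n_pos * n_neg / n + 1.0
            var_runs = (2.0 * n_pos * n_neg * (2.0 * n_pos * n_neg - n)
                        / (n ** 2 * (n - 1)))
            z = (runs - mean_runs) / sqrt(var_runs)
            p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
            return runs, z, p

        # Too few runs suggests positive autocorrelation; too many, negative.
        residuals = np.array([1.2, 0.8, 0.5, -0.3, -0.9, -1.1, 0.4, 0.7, -0.2, -0.6])
        print(runs_test(residuals))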

  4. A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.

    2016-11-01

    In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. Studied in this paper was the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.

  5. Parents rate the ratings: a test of the validity of the American movie, television, and video game ratings.

    Science.gov (United States)

    Walsh, D A; Gentile, D A; Van Brederode, T M

    2002-02-01

    Numerous studies have documented the potential effects on young audiences of violent content in media products, including movies, television programs, and computer and video games. Similar studies have evaluated the effects associated with sexual content and messages. Cumulatively, these effects represent a significant public health risk for increased aggressive and violent behavior, spread of sexually transmitted diseases, and pediatric pregnancy. In partial response to these risks and to public and legislative pressure, the movie, television, and gaming industries have implemented ratings systems intended to provide information about the content and appropriate audiences for different films, shows, and games. We conducted a panel study to test the validity of the current movie, television, and video game rating systems. Participants used the KidScore media evaluation tool, which evaluates films, television shows, and video and computer games on 10 aspects, including the appropriateness of the media product for children on the basis of age. Results revealed that when an entertainment industry rates a product as inappropriate for children, parent raters agree that it is inappropriate for children. However, parent raters disagree with industry usage of many of the ratings designating material suitable for children of different ages. Products rated as appropriate for adolescents are of the greatest concern. The level of disagreement varies from industry to industry and even from rating to rating. Analysis indicates that the amount of violent content and portrayals of violence are the primary markers for disagreement between parent raters and industry ratings. Short-term and long-term recommendations are suggested.

  6. ROBUST REPETITIVE CONTROL FOR IMPROVING RATE SMOOTHNESS OF TEST TURNTABLE

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; ZENG Ming; SU Bao-ku

    2005-01-01

    A robust repetitive control scheme is used to improve the rate smoothness of a brushless DC motor (BLDCM) driven test turntable. The method synthesizes variable structure control (VSC) laws and repetitive control (RC) laws in a complementary manner. The VSC strategy can stabilize the system and suppress uncertainties, such as the aperiodic disturbance and noises, while RC strategy can eliminate the periodic rate fluctuation in a steady state. The convergence of the repetitive learning process is also guaranteed by VSC. A general nonlinear system model is discussed. The model can be considered as an extension of BLDCMs. The stability and asymptotic position tracking performance are validated by using Lyapunov functions. Simulation results show the effectiveness of the proposed approach for improving the rate smoothness.

  7. Fatigue crack growth rate test using a frequency sweep method

    Institute of Scientific and Technical Information of China (English)

    Xun ZHOU; Xiao-li YU

    2008-01-01

    Fatigue crack propagation characteristics of a diesel engine crankshaft are studied by measuring the fatigue crack growth rate using a frequency sweep method on a resonant fatigue test rig. Based on the phenomenon that the system frequency will change when the crack becomes large, this method can be directly applied to a complex component or structure. Finite element analyses (FEAs) are performed to calibrate the relation between the frequency change and the crack size, and to obtain the natural frequency of the test rig and the stress intensity factor (SIF) of growing cracks. The crack growth rate, i.e. the da/dN–ΔK relation, for each crack size is obtained by combining the testing-time monitored data and FEA results. The results show that the crack growth rate of the engine crankshaft, which is a component with complex geometry and special surface treatment, is quite different from that of a pure material. There is an apparent turning point in the Paris regime of the curve. The cause of the fatigue crack growth is also discussed.
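
    The da/dN–ΔK relation referred to above is conventionally described by the Paris law, da/dN = C·(ΔK)^m. As a hedged sketch with hypothetical data (not the crankshaft measurements), the Paris constants can be obtained from a log-log regression; a kink like the turning point reported for the crankshaft would appear as two segments with different fitted slopes:

        import numpy as np

        # Hypothetical (delta_K [MPa*sqrt(m)], da/dN [m/cycle]) pairs from a growth-rate test.
        delta_K = np.array([12.0, 15.0, 18.0, 22.0, 27.0, 33.0])
        dadN = np.array([2.1e-9, 5.0e-9, 1.1e-8, 2.6e-8, 6.3e-8, 1.5e-7])

        # Paris law: da/dN = C * delta_K**m  =>  log(da/dN) = log(C) + m*log(delta_K)
        m, logC = np.polyfit(np.log(delta_K), np.log(dadN), 1)
        print(f"Paris exponent m = {m:.2f}, coefficient C = {np.exp(logC):.3e}")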

  8. Rates of and reasons for condemnation of poultry carcases: harmonised methodology at the slaughterhouse.

    Science.gov (United States)

    Salines, M; Allain, V; Roul, H; Magras, C; Le Bouquin, S

    2017-05-27

    European hygiene regulations require the condemnation of any unsafe food. However, there is little information identifying and quantifying condemnation of poultry carcases at slaughterhouses. We present an in-depth view of rates of and reasons for the condemnation of broiler, turkey, meat duck, force-feeding duck and guinea fowl carcases in France. The experiment was conducted in 10 slaughterhouses. For one year, all condemnations were recorded on a standard form following a national reference system. The rates of and reasons for condemnation, as well as factors influencing variation, were investigated. The global condemnation rates were 1.04 per cent for broilers, 1.85 per cent for turkeys, 1.23 per cent for meat ducks, 1.42 per cent for force-feeding ducks and 1.20 per cent for guinea fowl. Condemnation rates depended on several factors including slaughterhouse, animal gender and season. Reasons for condemnation varied with species, for example, the three main reasons for broilers were cachexia (41.8 per cent of condemned batches), generalised congestion (29.3 per cent) and non-purulent cutaneous lesions (14.2 per cent) versus conformation abnormalities (58.6 per cent), cachexia (14.61 per cent) and ascites (14.56 per cent) for meat ducks. While the condemnation rates can be considered low for all species, the difference between the rates of and reasons for condemnation highlights the need to conduct species-specific epidemiological studies to improve the sanitary situation of poultry production. British Veterinary Association.

  9. DETERMINANTS OF GROWTH RATE: SOME METHODOLOGICAL ISSUES WITH DATA FROM FIJI

    OpenAIRE

    Bhaskara Rao; Maheshwar Rao

    2005-01-01

    Compared to many cross-country studies on the determinants of growth rate, time series approaches are relatively few and limited in scope. However, time series studies are useful for country-specific policies. But in many recent works ad hoc specifications have been used to analyze the contribution of various factors to growth. This paper examines the specification and estimation issues in the time series approach and provides some guidelines. Our approach is used to illustrate the effects of...

  10. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  11. A Methodological Report: Adapting the 505 Change-of-Direction Speed Test Specific to American Football.

    Science.gov (United States)

    Lockie, Robert G; Farzad, Jalilvand; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A

    2017-02-01

    Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.
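
    The reliability and usefulness statistics quoted above (typical error, coefficient of variation and the smallest worthwhile change with a 0.5 multiplier) follow standard test-retest formulas. A hedged sketch with hypothetical paired A505 times, not the study's data:

        import numpy as np

        # Hypothetical session-1 and session-2 A505 times (s) for the same athletes.
        t1 = np.array([2.41, 2.55, 2.38, 2.62, 2.47, 2.50, 2.58, 2.44, 2.52, 2.60])
        t2 = np.array([2.44, 2.52, 2.41, 2.58, 2.49, 2.47, 2.61, 2.42, 2.55, 2.57])

        diff = t2 - t1
        typical_error = diff.std(ddof=1) / np.sqrt(2)              # TE
        cv_percent = 100 * typical_error / np.mean((t1 + t2) / 2)  # CV
        between_sd = ((t1 + t2) / 2).std(ddof=1)
        swc_05 = 0.5 * between_sd                                  # SWC(0.5), as in the abstract

        print(f"TE = {typical_error:.3f} s, CV = {cv_percent:.2f}%")
        print(f"SWC(0.5) = {swc_05:.3f} s; moderate changes detectable: {swc_05 > typical_error}")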

  12. Hydrogen Embrittlement - Loading Rate Effects in Fracture Mechanics Testing

    NARCIS (Netherlands)

    Koers, R.W.J.; Krom, A.H.M.; Bakker, A.

    2001-01-01

    The fitness for purpose methodology is more and more used in the oil and gas industry to evaluate the significance of pre-existing flaws and material deficiencies with regard to the suitability of continued operation of equipment. In this methodology, traditional fracture mechanics is integrated wit

  13. Accelerated lifetime testing methodology for lifetime estimation of Lithium-ion batteries used in augmented wind power plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2013-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process can be used for the parametrization of a performance-degradation lifetime model. In the proposed...... methodology both calendar and cycling lifetime tests are considered since both components are influencing the lifetime of Lithium-ion batteries. The methodology proposes also a lifetime model verification stage, where Lithium-ion battery cells are tested at normal operating conditions using an application...

  14. High false positive rates in common sensory threshold tests.

    Science.gov (United States)

    Running, Cordelia A

    2015-02-01

    Large variability in thresholds to sensory stimuli is observed frequently even in healthy populations. Much of this variability is attributed to genetics and day-to-day fluctuation in sensitivity. However, false positives are also contributing to the variability seen in these tests. In this study, random number generation was used to simulate responses in threshold methods using different "stopping rules": ascending 2-alternative forced choice (AFC) with 5 correct responses; ascending 3-AFC with 3 or 4 correct responses; staircase 2-AFC with 1 incorrect up and 2 incorrect down, as well as 1 up 4 down and 5 or 7 reversals; staircase 3-AFC with 1 up 2 down and 5 or 7 reversals. Formulas are presented for rates of false positives in the ascending methods, and curves were generated for the staircase methods. Overall, the staircase methods generally had lower false positive rates, but these methods were influenced even more by number of presentations than ascending methods. Generally, the high rates of error in all these methods should encourage researchers to conduct multiple tests per individual and/or select a method that can correct for false positives, such as fitting a logistic curve to a range of responses.
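
    The record above notes that the false positive rate of ascending forced-choice procedures can be written in closed form. As a hedged back-of-the-envelope sketch (the paper's own formulas are not reproduced, and treating steps as independent guesses is a simplification): an insensitive subject passes a single step of an m-alternative forced choice requiring k consecutive correct answers with probability (1/m)^k, and records a spurious threshold somewhere in n steps with probability 1 - (1 - (1/m)^k)^n:

        def false_positive_rate(m_alternatives, k_correct, n_steps):
            """Chance that pure guessing passes at least one of n ascending steps."""
            p_single = (1.0 / m_alternatives) ** k_correct
            return 1.0 - (1.0 - p_single) ** n_steps

        # Ascending 2-AFC, 5 correct in a row, 8 concentration steps:
        print(round(false_positive_rate(2, 5, 8), 3))   # ~0.22
        # Ascending 3-AFC, 3 correct in a row, 8 steps:
        print(round(false_positive_rate(3, 3, 8), 3))   # ~0.26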

  15. Integration and software for thermal test of heat rate sensors

    Science.gov (United States)

    Wojciechowski, C. J.; Shrider, K. R.

    1982-04-01

    A minicomputer-controlled radiant test facility is described which was developed and calibrated in an effort to verify analytical thermal models of instrumentation islands installed aboard the space shuttle external tank to measure thermal flight parameters during ascent. Software was provided for the facility as well as for development tests on the SRB actuator tail stock. Additional testing was conducted with the test facility to determine the temperature and heat flux rate and loads required to effect a change of color in the ET tank external paint. This requirement resulted from the review of photographs taken of the ET at separation from the orbiter which showed that 75% of the external tank paint coating had not changed color from its original white color. The paint on the remaining 25% of the tank was either brown or black, indicating that it had degraded due to heating or that the spray-on foam insulation had receded in these areas. The operational capability of the facility as well as the various tests which were conducted and their results are discussed.

  16. Methodological Issues in Analog Acceptability Research: Are Teachers' Acceptability Ratings of Assessment Methods Influenced by Experimental Design?

    Science.gov (United States)

    Eckert, Tanya L.; Shapiro, Edward S.

    1999-01-01

    Explores the relationship between experimental design method and teacher-rated acceptability of two analog approaches for assessing academic skills problems. Comparisons indicate that curriculum-based assessment was consistently rated as a more acceptable method of assessment than published, norm-referenced tests. Results are discussed in relation…

  18. Methodological factors in determining risk of dementia after TIA and stroke: (III) Applicability of cognitive tests

    Science.gov (United States)

    Pendlebury, Sarah T; Klaus, Stephen P; Thomson, Ross J; Mehta, Ziyah; Wharton, Rose M; Rothwell, Peter M

    2017-01-01

    Background and Purpose Cognitive assessment is recommended after stroke but there are few data on the applicability of short cognitive tests to the full spectrum of patients. We therefore determined the rates, causes and associates of untestability in a population-based study of all TIA and stroke. Methods Patients with TIA or stroke prospectively recruited (2002-2007) into the Oxford Vascular Study had ≥1 short cognitive test (mini-mental-state examination (MMSE), telephone interview of cognitive status (TICSM), Montreal cognitive assessment (MOCA), and abbreviated mental test score (AMTS)) at baseline and on follow-up to 5 years. Results Among 1097 consecutively assessed survivors (mean age/sd 74.8/12.1 years, 378 TIA), the numbers testable with a short cognitive test at baseline, 1, 6, 12 and 60 months were 835/1097 (76%), 778/947 (82%), 756/857 (88%), 692/792 (87%) and 472/567 (83%). 88% (331/378) of assessed TIA patients were testable at baseline compared to only 46% (133/290) of those with major stroke (panarthria/hemiparesis=84 (32%), drowsiness=58 (22%) and acute confusion=11 (4%)), whereas sensory deficits caused relatively more problems with testing at later time points (24/63 (38%) at 5 years). Conclusions Substantial numbers of patients with TIA and stroke are untestable with short cognitive tests. Future studies should report data on untestable patients and those with problems with testing in whom the likelihood of dementia is high. PMID:26463688

  19. Dynamic testing of learning potential in adults with cognitive impairments: A systematic review of methodology and predictive value.

    NARCIS (Netherlands)

    Boosman, H.; Bovend'Eerdt, T.J.; Visser-Meily, J.M.; Nijboer, T.C.W.; Van heugten, C.M.

    2016-01-01

    Dynamic testing includes procedures that examine the effects of brief training on test performance where pre- to post-training change reflects patients' learning potential. The objective of this systematic review was to provide clinicians and researchers insight into the concept and methodology of

  20. Development and validation of a Chinese music quality rating test.

    Science.gov (United States)

    Cai, Yuexin; Zhao, Fei; Zheng, Yiqing

    2013-09-01

    The present study aims to develop and validate a Chinese music quality rating test (MQRT). In Experiment 1, 22 music pieces were initially selected and paired as a 'familiar music piece' and 'unfamiliar music piece' based on familiarities amongst the general public in the categories of classical music (6), Chinese folk music (8), and pop music (8). Following the selection criteria, one pair of music pieces from each music category was selected and used for the MQRT in Experiment 2. In Experiment 2, the MQRT was validated using these music pieces in the categories 'Pleasantness', 'Naturalness', 'Fullness', 'Roughness', and 'Sharpness'. Seventy-two adult participants and 30 normal-hearing listeners were recruited in Experiments 1 and 2, respectively. Significant differences between the familiar and unfamiliar music pieces were found in respect of pleasantness rating for folk and pop music pieces as well as in sharpness rating for pop music pieces. The comparison of music category effect on MQRT found significant differences in pleasantness, fullness, and sharpness ratings. The Chinese MQRT developed in the present study is an effective tool for assessing music quality.

  1. Methodology to Achieve Enhanced Data Transmission Rate using Li-Fi in VLC Technology

    Directory of Open Access Journals (Sweden)

    Md. Shahadat Hossain

    2014-12-01

    Full Text Available Li-Fi (Light Fidelity) or optical Wi-Fi is the transmission of data using light waves, by varying the light intensity faster than the human eye can follow, using Visible Light Communication (VLC) technology in free space. This is akin to “Data Through Illumination”. VLC uses rapid pulses of light to transmit information wirelessly. VLC using LEDs is emerging as a key technology for a ubiquitous communication system, because LED has the advantages of fast switching, long life expectancy, being less expensive and being visible light that is safe for the human body. LEDs are different from the other kinds of lamps because they are semiconductors. This characteristic gives them the capability to switch on and off within a few nanoseconds, or billionths of a second. Converted into data rates, this corresponds to 1 Gbits/s or more. By comparison, Wi-Fi can reach data rates of at best 100 Mbits/s, at least 10 times lower. Here we show some new conceptual methods by which we can transmit data in parallel using VLC technology. If this application is put into use, every bulb can be used like a Wi-Fi hot spot to transmit data at ultra-high speed, such as more than 10 Gbits/s.

  2. A Methodology for the Optimization of Flow Rate Injection to Looped Water Distribution Networks through Multiple Pumping Stations

    Directory of Open Access Journals (Sweden)

    Christian León-Celi

    2016-12-01

    Full Text Available The optimal function of a water distribution network is reached when the consumer demands are satisfied using the lowest quantity of energy, maintaining the minimal pressure required at the same time. One way to achieve this is through optimization of flow rate injection based on the use of the setpoint curve concept. In order to obtain that, a methodology is proposed. It allows for the assessment of the flow rate and pressure head that each pumping station has to provide for the proper functioning of the network while the minimum power consumption is kept. The methodology can be addressed in two ways: the discrete method and the continuous method. In the first method, a finite set of combinations is evaluated between pumping stations. In the continuous method, the search for the optimal solution is performed using optimization algorithms. In this paper, Hooke–Jeeves and Nelder–Mead algorithms are used. Both the hydraulics and the objective function used by the optimization are solved through EPANET and its Toolkit. Two case studies are evaluated, and the results of the application of the different methods are discussed.
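
    As a hedged, self-contained sketch of the continuous method described above (the EPANET Toolkit hydraulic solver is replaced by a hypothetical stub, and the pump-power and pressure functions are purely illustrative), the flow split between two pumping stations can be searched with Nelder-Mead:

        import numpy as np
        from scipy.optimize import minimize

        TOTAL_DEMAND = 120.0    # L/s, hypothetical network demand
        MIN_PRESSURE = 20.0     # m, required minimum pressure head

        def simulate_network(q1, q2):
            """Stand-in for an EPANET run: returns (minimum node pressure, pumping power).
            Purely illustrative; a real implementation would call the EPANET Toolkit."""
            min_pressure = 35.0 - 0.001 * q1 ** 2 - 0.0015 * q2 ** 2
            power = 0.12 * q1 ** 1.8 + 0.09 * q2 ** 1.9       # hypothetical station curves
            return min_pressure, power

        def objective(x):
            q1 = float(np.clip(x[0], 0.0, TOTAL_DEMAND))
            q2 = TOTAL_DEMAND - q1                            # total demand must be met
            pressure, power = simulate_network(q1, q2)
            penalty = 1e4 * max(0.0, MIN_PRESSURE - pressure) ** 2
            return power + penalty

        res = minimize(objective, x0=[TOTAL_DEMAND / 2], method="Nelder-Mead")
        q1 = float(np.clip(res.x[0], 0.0, TOTAL_DEMAND))
        print(f"station 1: {q1:.1f} L/s, station 2: {TOTAL_DEMAND - q1:.1f} L/s")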

  3. Methodology for Life Testing of Refractory Metal/Sodium Heat Pipes

    Science.gov (United States)

    Martin, James J.; Reid, Robert S.

    2006-01-01

    The focus of this work was to establish an approach to generate carefully controlled data that can conclusively establish heat pipe operating life with material-fluid combinations capable of extended operation. To accomplish this goal acceleration is required to compress 10 years of operational life into 3 years of laboratory testing through a combination of increased temperature and mass fluence. Specific test series have been identified, based on American Society for Testing and Materials (ASTM) specifications, to investigate long term corrosion rates. The refractory metal selected for demonstration purposes is a Molybdenum-44.5%Rhenium alloy formed by powder metallurgy. The heat pipe makes use of an annular crescent wick design formed by hot isostatic pressing of Molybdenum-Rhenium wire mesh. The heat pipes are filled using vacuum distillation and purity sampling is considered. Testing of these units is round-the-clock with 6-month destructive and non-destructive inspection intervals to identify the onset and level of corrosion. Non-contact techniques are employed for providing power to the evaporator (radio frequency induction heating at 1 to 5 kW per unit) and calorimetry at the condenser (static gas gap coupled water cooled calorimeter). The planned operating temperature range would extend from 1123 to 1323 K. Accomplishments prior to project cancellation included successful demonstration of the heat pipe wick fabrication technique, establishment of all engineering designs, baselined operational test requirements and procurement/assembly of supporting test hardware systems.

  4. Optimization of Antifungal Extracts from Ficus hirta Fruits Using Response Surface Methodology and Antifungal Activity Tests

    Directory of Open Access Journals (Sweden)

    Chuying Chen

    2015-10-01

    Full Text Available The fruits of Ficus hirta (FH) display strong antifungal activity against Penicillium italicum and Penicillium digitatum. In order to optimize the extraction conditions of antifungal extracts from FH fruit, various extraction parameters, such as ethanol concentration, extraction time, solvent to solid ratio and temperature, were chosen to identify their effects on the diameters of inhibition zones (DIZs) against these two Penicillium molds. Response surface methodology (RSM) was applied to obtain the optimal combination of these parameters. Results showed that the optimal extraction parameters for maximum antifungal activity were: 90% (v/v) ethanol concentration, 65 min extraction time, 31 mL/g solvent to solid ratio and 51 °C temperature. Under the abovementioned extraction conditions, the DIZ values obtained experimentally were 57.17 ± 0.75 and 39.33 ± 0.82 mm, which were very close to the values of 57.26 and 39.29 mm predicted by the model. Further, nine kinds of phytopathogens were tested in vitro to explore the antifungal activity of the FH extracts. It was found for the first time that the FH extracts showed significant inhibition of the growth of P. italicum, A. citri, P. vexans, P. cytosporella and P. digitatum.

  5. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia)]; Azwan, Zairul, E-mail: zairulazwan@gmail.com; Raduan, Farhana, E-mail: farhanaraduan@gmail.com; Sagap, Ismail, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia)]; Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  6. F-Ratio Test and Hypothesis Weighting: A Methodology to Optimize Feature Vector Size

    Directory of Open Access Journals (Sweden)

    R. M. Dünki

    2011-01-01

    Full Text Available Reducing a feature vector to an optimized dimensionality is a common problem in biomedical signal analysis. This analysis retrieves the characteristics of the time series and its associated measures with an adequate methodology followed by an appropriate statistical assessment of these measures (e.g., spectral power or fractal dimension). As a step towards such a statistical assessment, we present a data resampling approach. The techniques allow estimating σ²(F), that is, the variance of an F-value from variance analysis. Three test statistics are derived from the so-called F-ratio σ²(F)/F². A Bayesian formalism assigns weights to hypotheses and their corresponding measures considered (hypothesis weighting). This leads to complete, partial, or noninclusion of these measures into an optimized feature vector. We thus distinguished the EEG of healthy probands from the EEG of patients diagnosed as schizophrenic. A reliable discriminance performance of 81% based on Takens' χ, α-, and δ-power was found.

  7. English language teaching methodology in a call classroom: Testing and evaluating traditional grammar instruction

    Directory of Open Access Journals (Sweden)

    Đorđević Jasmina P.

    2016-01-01

    Full Text Available Pursuant to the revised Research Policy issued by the Executive Committee of the European Association for Computer Assisted Language Learning, research in the area of Computer-Assisted Language Learning (CALL) should guide language pedagogies rather than the availability of new technologies and functionalities. Accordingly, the aim of this research was to test a combination of methods and techniques rooted in traditional grammar instruction by alternating them in a conventional classroom setting (based on paper and a whiteboard) and an experimental CALL setting. The hypothesis was that if the classroom activities were anchored in traditional grammar teaching methodology, the CALL environment would prove as comprehensive as the conventional classroom, thus stimulating and yielding positive results. By means of grammar-based teaching and communicative learning, the grammar item 'meaning of modal verbs' was taught. Based on a quasi-experiment and the principles of a repeated measures research design, the performance of 50 students at two English language departments was alternately measured in both the conventional and the experimental classroom settings in several subsequent instances. The analysis of the data resulted in the conclusion that the overall performance of the participants in the experimental setting exceeded the performance in the conventional setting.

  8. First accelerated ageing cycling test on super capacitors for transportation applications: methodology, first results

    Energy Technology Data Exchange (ETDEWEB)

    Coquery, G.; Lallemand, R.; Kauv, J. [Institut National de Recherche sur les Transports et leur Securite (INRETS), Lab. des Technologies Nouvelles, 94 - Arcueil (France); Monts, A. de; Soucaze-Guillous, B. [Societe Nationale des Chemins de fer Francais (SNCF), Dir. de la Recherche, 75 - Paris (France); Chabas, J.; Darnault, A. [VALEO Electrical Energy Management, 94 - Creteil (France)

    2004-07-01

    Automotive and railway electrical traction systems are subjected to high power cycles due to urban mission profiles. In order to increase energy efficiency and reduce global pollutant emissions, buffer energy storage by means of super-capacitors offers major advantages for optimising the traction energy management with the highest level of regenerative braking energy. Super-capacitors are promising for the energy management between the traction chain and the electrical supply system. Because the effects of charge-discharge cycles are a strong limitation on battery lifetime, it was decided to evaluate the capabilities of this technology with respect to the ageing effects caused by the strong cycling power stresses linked to urban traffic. Today there are no publications discussing these phenomena, yet understanding them is fundamental to the design of super-capacitor assemblies. The first step was to establish a typical mission profile representative of transportation working conditions, to propose a preliminary test plan, and to define a measurement methodology. The presented investigations are linked to research projects on automotive and railway applications. (authors)

  9. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions:
    o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set?
    o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture?
    o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach?
    o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from using the Target Organ System Approach compare to the benefits available from the current HCN-based approach?

  10. Developing an Item Bank for Use in Testing in Africa: Theory and Methodology

    Science.gov (United States)

    Furtuna, Daniela

    2014-01-01

    The author describes the steps taken by a research team, of which she was part, to develop a specific methodology for assessing student attainment in primary school, working with the Programme for the Analysis of Education Systems (PASEC) of the Conference of Ministers of Education of French-speaking Countries (CONFEMEN). This methodology provides…

  11. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    Science.gov (United States)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop, Silver Spring, MD, NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine proposed thermal vacuum test conditions and the issues encountered attempting to satisfy this requirement.

  12. Development and interval testing of a naturalistic driving methodology to evaluate driving behavior in clinical research

    Science.gov (United States)

    Babulal, Ganesh M.; Addison, Aaron; Ghoshal, Nupur; Stout, Sarah H.; Vernon, Elizabeth K.; Sellan, Mark; Roe, Catherine M.

    2016-01-01

    Background: The number of older adults in the United States will double by 2056. Additionally, the number of licensed drivers will increase along with extended driving-life expectancy. Motor vehicle crashes are a leading cause of injury and death in older adults. Alzheimer’s disease (AD) also negatively impacts driving ability and increases crash risk. Conventional methods to evaluate driving ability are limited in predicting decline among older adults. Innovations in GPS hardware and software can monitor driving behavior in the actual environments people drive in. Commercial off-the-shelf (COTS) devices are affordable, easy to install and capture large volumes of data in real-time. However, adapting these methodologies for research can be challenging. This study sought to adapt a COTS device and determine an interval that produced accurate data on the actual route driven for use in future studies involving older adults with and without AD.  Methods: Three subjects drove a single course in different vehicles at different intervals (30, 60 and 120 seconds), at different times of day, morning (9:00-11:59AM), afternoon (2:00-5:00PM) and night (7:00-10pm). The nine datasets were examined to determine the optimal collection interval. Results: Compared to the 120-second and 60-second intervals, the 30-second interval was optimal in capturing the actual route driven along with the lowest number of incorrect paths and affordability weighing considerations for data storage and curation. Discussion: Use of COTS devices offers minimal installation efforts, unobtrusive monitoring and discreet data extraction.  However, these devices require strict protocols and controlled testing for adoption into research paradigms.  After reliability and validity testing, these devices may provide valuable insight into daily driving behaviors and intraindividual change over time for populations of older adults with and without AD.  Data can be aggregated over time to look at changes or

  13. Accelerated Lifetime Testing Methodology for Lifetime Estimation of Lithium-ion Batteries used in Augmented Wind Power Plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina;

    2014-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... both the capacity fade and the power capability decrease of the selected Lithium-ion battery cells. In the proposed methodology both calendar and cycling lifetime tests were considered since both components are influencing the lifetime of Lithium-ion batteries. Furthermore, the proposed methodology...

  14. Critical assessment of jet erosion test methodologies for cohesive soil and sediment

    Science.gov (United States)

    Karamigolbaghi, Maliheh; Ghaneeizad, Seyed Mohammad; Atkinson, Joseph F.; Bennett, Sean J.; Wells, Robert R.

    2017-10-01

    The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear stress and erodibility coefficient of soil. These include the Blaisdell, Iterative, and Scour Depth Methods, and all have been organized into easy to use spreadsheet routines. The analytical framework of the JET and its associated methods, however, are based on many assumptions that may not be satisfied in field and laboratory settings. The main objective of this study is to critically assess this analytical framework and these methodologies. Part of this assessment is to include the effect of flow confinement on the JET. The possible relationship between the derived erodibility coefficient and critical shear stress, a practical tool in soil erosion assessment, is examined, and a review of the deficiencies in the JET methodology also is presented. Using a large database of JET results from the United States and data from literature, it is shown that each method can generate an acceptable curve fit through the scour depth measurements as a function of time. The analysis shows, however, that the Scour Depth and Iterative Methods may result in physically unrealistic values for the erosion parameters. The effect of flow confinement of the impinging jet increases the derived critical shear stress and decreases the erodibility coefficient by a factor of 2.4 relative to unconfined flow assumption. For a given critical shear stress, the length of time over which scour depth data are collected also affects the calculation of erosion parameters. In general, there is a lack of consensus relating the derived soil erodibility coefficient to the derived critical shear stress. Although empirical relationships are statistically significant, the calculated erodibility coefficient for a
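
    The JET analysis above rests on the linear excess shear stress law, erosion rate = k_d·(τ − τ_c). As a hedged sketch with hypothetical measurements (none of the Blaisdell, Iterative or Scour Depth fitting procedures is reproduced), k_d and τ_c can be fitted directly to observed erosion rates:

        import numpy as np
        from scipy.optimize import curve_fit

        def excess_shear(tau, k_d, tau_c):
            """Linear excess shear stress law: rate = k_d * max(tau - tau_c, 0)."""
            return k_d * np.clip(tau - tau_c, 0.0, None)

        # Hypothetical applied stresses (Pa) and measured erosion rates (mm/h).
        tau = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 16.0])
        rate = np.array([0.0, 0.8, 2.1, 3.9, 6.2, 8.8])

        (k_d, tau_c), _ = curve_fit(excess_shear, tau, rate, p0=[0.5, 1.0])
        print(f"k_d = {k_d:.2f} mm/h per Pa, tau_c = {tau_c:.2f} Pa")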

  15. Average reservoir pressure determination for homogeneous and naturally fractured formations from multi-rate testing with the TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto; Ibagon, Oscar Eduardo; Montealegre-M, Matilde [Universidad Surcolombiana, Av. Pastrana-Cra. 1, Neiva, Huila (Colombia)

    2007-11-15

    Average reservoir pressure is an important parameter which is utilized in almost all reservoir and production engineering studies. It also plays a relevant role in the majority of well intervention jobs, field appraisal, well sizing and equipment and surface facilities design. The estimation of the average reservoir pressure is normally obtained from buildup tests. However, it has a tremendous economic impact caused by shutting-in the well during the entire test. Since buildup tests are the most particular case of multi-rate tests, these are also used for estimation of the average reservoir pressure. Among them, two-rate tests present drawbacks because it is operationally difficult to keep constant the flow rates. Conventional methods for determination of the average reservoir pressure can be readily extended to multi-rate tests once the rigorous time is converted to equivalent time by time superposition. In this article a new, easy and practical methodology is presented for the determination of the average pressure in both homogeneous and naturally fractured reservoirs from multi-rate tests conducted in vertical oil wells located inside a close drainage region. The methodology which follows the philosophy of the TDS technique uses a normalized pressure and pressure derivative point found on any arbitrary point during the pseudosteady-state flow regime to readily provide the average reservoir pressure value. For verification of the effectiveness of the proposed solution, several field and simulated examples were worked out. We found that the average reservoir pressure results obtained from the proposed methodology match very well with those estimated from either conventional techniques or simulations. (author)
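
    The TDS-style reading described above starts from the rate-normalized pressure and its logarithmic derivative; the technique's specific average-pressure relations are not reproduced here. A hedged sketch of that preprocessing step, with hypothetical multi-rate data:

        import numpy as np

        # Hypothetical multi-rate test data: time (h), pressure drop (psi), rate (STB/D).
        t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
        delta_p = np.array([120.0, 150.0, 185.0, 230.0, 290.0, 380.0, 540.0])
        q = np.array([500.0, 500.0, 480.0, 480.0, 450.0, 450.0, 450.0])

        # Rate-normalized pressure and a simple two-point derivative with respect to ln(t)
        # (commercial software typically uses smoothed Bourdet windows instead).
        p_norm = delta_p / q
        derivative = np.gradient(p_norm, np.log(t))

        for ti, pn, der in zip(t, p_norm, derivative):
            print(f"t = {ti:5.1f} h   dp/q = {pn:.3f}   t*d(dp/q)/dt = {der:.3f}")
        # Pseudosteady state shows up as a unit slope of the derivative on a log-log
        # plot; the TDS relations are then applied at a point read from that line.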

  17. Comparison of two methodologies used to estimate erosion rates in Mediterranean ecosystems: (137)Cs and exposed tree roots.

    Science.gov (United States)

    Rubio-Delgado, J; Guillén, J; Corbacho, J A; Gómez-Gutiérrez, Á; Baeza, A; Schnabel, S

    2017-12-15

    The (137)Cs deposited in soil and exposed tree roots have been widely applied to estimate medium-term soil erosion rates. However, comparative studies between these methods are scarce. For this purpose, three hillsides in two Mediterranean dehesas (rangeland with disperse tree cover) were selected. Regarding the (137)Cs technique, a reference site close to the study areas and with similar altitude and rainfall was selected. In order to reduce uncertainties related to the use of point soil profiles, all those collected in an area were combined to form a representative composite profile. The total inventory was 2790±50 Bq/m(2), and the relaxation coefficient indicated it was an undisturbed soil. The radiocaesium inventory in the study areas was 14-23% lower than in the reference area. The erosion rates for (137)Cs were in the range 20.9-38.1 t ha(-1) y(-1). The exposed root technique was applied to holm oak trees (age about 90 years), and the erosion rates were in the range 22-34 t ha(-1) y(-1). The ratio between exposed root and (137)Cs techniques was 1.02±0.11 (S.D.) within the range 0.89-1.2. Both methods produced very similar results, both for the mean erosion rate and for the relative difference between the hillslope sections, i.e. they displayed the same spatial variation in the study areas. As the accounting time for these two techniques is different, 50 and 90 years for (137)Cs and exposed roots respectively, results suggest that no change in mid-term erosion rates was implied for these areas for almost a century. The use of (137)Cs and exposed roots methodology for the determination of mean erosion rates can be reproduced in other ecosystems, but a careful selection of the reference site for (137)Cs is essential. Copyright © 2017 Elsevier B.V. All rights reserved.
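
    Converting a measured (137)Cs inventory reduction into a soil loss rate requires a conversion model. The simple proportional model sketched below is only an illustration and is not necessarily the model used in the study; all parameter values are hypothetical:

        def proportional_model(inventory_loss_percent, bulk_density, mixing_depth, years):
            """Simple proportional (137)Cs conversion model.
            bulk_density in kg/m^3, mixing depth in m, years since the mid-1960s
            fallout peak; returns soil loss in t ha^-1 y^-1."""
            areal_mass = bulk_density * mixing_depth              # kg/m^2 in the mixed layer
            loss_kg_m2_yr = areal_mass * inventory_loss_percent / 100.0 / years
            return 10.0 * loss_kg_m2_yr                           # 1 kg/m^2 = 10 t/ha

        # Hypothetical dehesa values (illustrative only):
        print(proportional_model(inventory_loss_percent=18, bulk_density=1350,
                                 mixing_depth=0.25, years=50))    # ~12 t/ha/y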

  18. Implementation of Prognostic Methodologies to Cryogenic Propellant Loading Test-bed

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics methodologies determine the health state of a system and predict the end of life and remaining useful life. This information enables operators to take...

  19. Voltage stress effects on microcircuit accelerated life test failure rates

    Science.gov (United States)

    Johnson, G. M.

    1976-01-01

    The applicability of Arrhenius and Eyring reaction rate models for describing microcircuit aging characteristics as a function of junction temperature and applied voltage was evaluated. The results of a matrix of accelerated life tests with a single metal oxide semiconductor microcircuit operated at six different combinations of temperature and voltage were used to evaluate the models. A total of 450 devices from two different lots were tested at ambient temperatures between 200 C and 250 C and applied voltages between 5 Vdc and 15 Vdc. A statistical analysis of the surface related failure data resulted in bimodal failure distributions comprising two lognormal distributions; a 'freak' distribution observed early in time, and a 'main' distribution observed later in time. The Arrhenius model was shown to provide a good description of device aging as a function of temperature at a fixed voltage. The Eyring model also appeared to provide a reasonable description of main distribution device aging as a function of temperature and voltage. Circuit diagrams are shown.
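
    The Arrhenius model referred to above relates median device life to junction temperature, t50 ∝ exp(Ea/kT), so the ratio of lives at two temperatures gives a thermal acceleration factor. A hedged sketch with an assumed activation energy (illustrative, not the study's fitted value); the Eyring form adds a voltage-dependent factor on top of this, with an exponent that must likewise be fitted from data:

        import math

        BOLTZMANN_EV = 8.617e-5      # eV/K

        def arrhenius_acceleration(ea_ev, t_use_c, t_stress_c):
            """Acceleration factor between a use temperature and a stress temperature."""
            t_use = t_use_c + 273.15
            t_stress = t_stress_c + 273.15
            return math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_stress))

        # Assumed Ea = 1.0 eV (a typical value for surface-related mechanisms; hypothetical here).
        print(f"{arrhenius_acceleration(ea_ev=1.0, t_use_c=55.0, t_stress_c=225.0):.2e}")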

  20. Evaluation of the proposed FDA pilot dose-response methodology for topical corticosteroid bioequivalence testing.

    Science.gov (United States)

    Demana, P H; Smith, E W; Walker, R B; Haigh, J M; Kanfer, I

    1997-03-01

    The American FDA has recently released a Guidance document for topical corticosteroid bioequivalence testing. The purpose of this study was to evaluate the recommendations of this document for appropriateness. The new specifications require a dose-vasoconstriction response estimation by the use of a Minolta chromameter in a preliminary pilot study to determine the parameters for use in a pivotal bioequivalence study. The visually-assessed human skin blanching assay methodology routinely practiced in our laboratories was modified to comply with the requirements of the pilot study so that visual and chromameter data could be compared. Two different cream formulations, each containing 0.12% betamethasone 17-valerate, were used for this comparison. Visual data showed the expected rank order of AUC values for most dose durations whereas the chromameter data did not show similar results. The expected rank order of AUC values for both chromameter and visual data was not observed at very short dose durations. In fitting the data to pharmacodynamic models, equivalent goodness of fit criteria were obtained when several different parameter estimates were used in the model definition, however the visual data were best described by the sigmoid Emax model while the chromameter data were best described by the simple Emax model. The Emax values predicted by the models were close to the observed values for both data sets and in addition, excellent correlation between the AUC values and the maximum blanching response (Rmax) (r > 0.95) was noted for both methods of assessment. The chromameter ED50 values determined in this study were approximately 2 hours for both preparations. At this dose duration the instrument would not be sensitive enough to distinguish between weak blanching responses and normal skin for bioequivalence assessment purposes.
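
    The abstract above fits blanching response data to simple and sigmoid Emax pharmacodynamic models. A hedged sketch of such a fit with hypothetical AUC-versus-dose-duration data (not the study's measurements):

        import numpy as np
        from scipy.optimize import curve_fit

        def simple_emax(dose, emax, ed50):
            return emax * dose / (ed50 + dose)

        def sigmoid_emax(dose, emax, ed50, n):
            return emax * dose ** n / (ed50 ** n + dose ** n)

        # Hypothetical dose durations (h) and blanching AUC responses.
        dose = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0])
        auc = np.array([8.0, 15.0, 27.0, 41.0, 55.0, 60.0])

        (emax_s, ed50_s), _ = curve_fit(simple_emax, dose, auc, p0=[60.0, 2.0])
        (emax_g, ed50_g, n_g), _ = curve_fit(sigmoid_emax, dose, auc, p0=[60.0, 2.0, 1.0])
        print(f"simple Emax:  Emax = {emax_s:.1f}, ED50 = {ed50_s:.2f} h")
        print(f"sigmoid Emax: Emax = {emax_g:.1f}, ED50 = {ed50_g:.2f} h, n = {n_g:.2f}")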

  1. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    Science.gov (United States)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiple-specimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clasped on the shaker armature with specially-designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses.

  2. Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models

    Science.gov (United States)

    Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock

    2017-05-01

    The global economy has slowed in recent years, a trend manifested by greater exchange-rate volatility on the international commodity market. This study analyzes several prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH and GARCH models, in conjunction with a stationarity test and direct testing for heteroskedasticity in the residual diagnostics. All forecasting models utilized monthly data from 1990 to 2015, a total of 312 observations, used to forecast both short-term and long-term exchange rates. The forecasting power statistics suggested that the forecasting performance of the ARIMA (1, 1, 1) model is more efficient than that of the ARCH (1) and GARCH (1, 1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis indicates a decrease in the exchange rate in June 2016 (RM 4.27 per USD) compared with December 2015. A more appropriate exchange rate forecasting method is vital to aid decision-making and planning for sustainable commodity production in the world economy.
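
    As a rough illustration of the model comparison summarized above, the sketch below fits an ARIMA(1, 1, 1) model and a constant-mean GARCH(1, 1) model to a monthly exchange-rate series and compares their out-of-sample RMSE over a 12-month hold-out. It assumes the Python statsmodels and arch packages and uses a synthetic stand-in series; it is not the authors' code or data.

```python
# Sketch: compare ARIMA(1,1,1) and GARCH(1,1) forecasts of a monthly exchange
# rate by hold-out RMSE. The series below is synthetic; with real data, replace
# `rate` with the monthly MYR/USD series (312 observations, 1990-2015).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(0)
rate = pd.Series(3.0 + np.cumsum(rng.normal(0.0, 0.02, 312)))  # stand-in series

train, test = rate[:-12], rate[-12:]            # hold out the last 12 months

# ARIMA(1,1,1) fitted on the level of the series
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=12)

# Constant-mean GARCH(1,1) on percentage returns; mean forecast mapped back to levels
returns = 100 * train.pct_change().dropna()
garch_fit = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
mean_ret = garch_fit.params["mu"] / 100
garch_fc = train.iloc[-1] * (1 + mean_ret) ** np.arange(1, 13)

def rmse(forecast):
    return float(np.sqrt(np.mean((np.asarray(forecast) - test.to_numpy()) ** 2)))

print("ARIMA RMSE:", rmse(arima_fc), "GARCH RMSE:", rmse(garch_fc))
```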

  3. Dynamic testing of learning potential in adults with cognitive impairments: A systematic review of methodology and predictive value.

    Science.gov (United States)

    Boosman, Hileen; Bovend'Eerdt, Thamar J H; Visser-Meily, Johanna M A; Nijboer, Tanja C W; van Heugten, Caroline M

    2016-09-01

    Dynamic testing includes procedures that examine the effects of brief training on test performance where pre- to post-training change reflects patients' learning potential. The objective of this systematic review was to provide clinicians and researchers insight into the concept and methodology of dynamic testing and to explore its predictive validity in adult patients with cognitive impairments. The following electronic databases were searched: PubMed, PsychINFO, and Embase/Medline. Of 1141 potentially relevant articles, 24 studies met the inclusion criteria. The mean methodological quality score was 4.6 of 8. Eleven different dynamic tests were used. The majority of studies used dynamic versions of the Wisconsin Card Sorting Test. The training mostly consisted of a combination of performance feedback, reinforcement, expanded instruction, or strategy training. Learning potential was quantified using numerical (post-test score, difference score, gain score, regression residuals) and categorical (groups) indices. In five of six longitudinal studies, learning potential significantly predicted rehabilitation outcome. Three of four studies supported the added value of dynamic testing over conventional testing in predicting rehabilitation outcome. This review provides preliminary support that dynamic tests can provide a valuable addition to conventional tests to assess patients' abilities. Although promising, there was a large variability in methods used for dynamic testing and, therefore, it remains unclear which dynamic testing methods are most appropriate for patients with cognitive impairments. More research is warranted to further evaluate and refine dynamic testing methodology and to further elucidate its predictive validity concerning rehabilitation outcomes relative to other cognitive and functional status indices.

  4. Reliability of commercially available hydrogen sensors for detection of hydrogen at critical concentrations: Part I - Testing facility and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Boon-Brett, L.; Castello, P.; Harskamp, F. [European Commission, DG Joint Research Centre, Institute for Energy - Cleaner Energy Unit, P.O. Box 2, 1755 ZG Petten (Netherlands); Bousek, J. [European Commission, DG Joint Research Centre, Institute for Energy - Cleaner Energy Unit, P.O. Box 2, 1755 ZG Petten (Netherlands); Faculty of Electrical Engineering and Communication, Brno University of Technology, Udolni 244/53, 602 00 Brno (Czech Republic); Salyk, O. [Faculty of Electrical Engineering and Communication, Brno University of Technology, Udolni 244/53, 602 00 Brno (Czech Republic); Aldea, L.; Tinaut, F. [Fundacion Cidaut, Investigacion y Desarrollo en Transporte y Energia (CIDAUT), Parque Tecnologico de Boecillo, 47151 Boecillo, Valladolid (Spain)

    2008-12-15

    A facility for testing the performance of hydrogen safety sensors under a wide range of ambient conditions is described. A specific test protocol was developed to test sensors under conditions which could reasonably be expected during the sensors' service life. The tests were based on those described in IEC 61779 and were adapted following consultation with car manufacturers and after careful consideration of the sensors' expected service environmental conditions. The protocol was evaluated by using it to test a large number of commercially available sensors. Observations made and experience gained during the testing campaign allowed the test protocol to be fine-tuned bearing in mind the sensor performance and behaviour during tests. The result of this work is an experimentally evaluated methodology which may be used as a guideline for testing the suitability of hydrogen sensors for automotive applications. (author)

  5. A proposed hardness assurance test methodology for bipolar linear circuits and devices in a space ionizing radiation environment

    Energy Technology Data Exchange (ETDEWEB)

    Pease, R.L. [RLP Research, Albuquerque, NM (United States); Brown, D.B. [Naval Research Lab., Washington, DC (United States); Cohn, L. [Defense Special Weapons Agency, Alexandria, VA (United States)] [and others]

    1997-04-01

    A hardness assurance test approach has been developed for bipolar linear circuits and devices in space. It consists of a screen for dose rate sensitivity and a characterization test method to develop the conditions for a lot acceptance test at high dose rate.

  6. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    Science.gov (United States)

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  7. Effects of strain rate, test temperature and test environment on tensile properties of vanadium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Gubbi, A.N.; Rowcliffe, A.F.; Eatherly, W.S.; Gibson, L.T. [Oak Ridge National Lab., TN (United States)

    1996-10-01

    Tensile testing was carried out on SS-3 tensile specimens punched from 0.762-mm-thick sheets of the large heat of V-4Cr-4Ti and small heats of V-3Cr-3Ti and V-6Cr-6Ti. The tensile specimens were annealed at 1000 °C for 2 h to obtain a fully recrystallized, fine grain microstructure with a grain size in the range of 10–19 μm. Room temperature tests at strain rates ranging from 10⁻³ to 5 × 10⁻¹/s were carried out in air; elevated temperature testing up to 700 °C was conducted in a vacuum better than 1 × 10⁻⁵ torr (<10⁻³ Pa). To study the effect of atomic hydrogen on ductility, tensile tests were conducted at room temperature in an ultra-high-vacuum (UHV) chamber with a hydrogen leak system.

  8. Testing Theories of Transfer Using Error Rate Learning Curves.

    Science.gov (United States)

    Koedinger, Kenneth R; Yudelson, Michael V; Pavlik, Philip I

    2016-07-01

    We analyze naturally occurring datasets from student use of educational technologies to explore a long-standing question of the scope of transfer of learning. We contrast a faculty theory of broad transfer with a component theory of more constrained transfer. To test these theories, we develop statistical models of them. These models use latent variables to represent mental functions that are changed while learning to cause a reduction in error rates for new tasks. Strong versions of these models provide a common explanation for the variance in task difficulty and transfer. Weak versions decouple difficulty and transfer explanations by describing task difficulty with parameters for each unique task. We evaluate these models in terms of both their prediction accuracy on held-out data and their power in explaining task difficulty and learning transfer. In comparisons across eight datasets, we find that the component models provide both better predictions and better explanations than the faculty models. Weak model variations tend to improve generalization across students, but hurt generalization across items and make a sacrifice to explanatory power. More generally, the approach could be used to identify malleable components of cognitive functions, such as spatial reasoning or executive functions. Copyright © 2016 Cognitive Science Society, Inc.
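
    The component-transfer idea summarized above is commonly operationalized as a logistic learning-curve model in which error probability falls with the number of prior practice opportunities on each hypothesized knowledge component. The sketch below fits such an Additive-Factors-style model to synthetic log data; the column names, the simulated data, and the simplified specification (no per-student intercepts) are assumptions for illustration, not the models evaluated in the paper.

```python
# Sketch: an Additive Factors Model (AFM)-style logistic learning curve.
# error ~ knowledge component (KC) difficulty + per-KC learning slope * opportunity
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for student in range(30):
    ability = rng.normal(0.0, 0.5)
    for kc, (difficulty, learn_rate) in {"add": (0.0, 0.8), "sub": (0.7, 0.5)}.items():
        for opp in range(1, 6):                       # 1st..5th practice opportunity
            logit = difficulty - ability - learn_rate * (opp - 1)
            p_err = 1.0 / (1.0 + np.exp(-logit))
            rows.append({"kc": kc, "opportunity": opp,
                         "error": int(rng.random() < p_err)})
log = pd.DataFrame(rows)

afm = smf.logit("error ~ C(kc) + C(kc):opportunity", data=log).fit(disp=0)
print(afm.params)  # negative KC:opportunity slopes = fewer errors with practice on that component
```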

  9. A durability test rig and methodology for erosion-resistant blade coatings in turbomachinery

    Science.gov (United States)

    Leithead, Sean Gregory

    A durability test rig for erosion-resistant gas turbine engine compressor blade coatings was designed, completed and commissioned. Bare and coated 17-4PH steel V103-profile blades were rotated at up to 11500 rpm and impacted with garnet sand for 5 hours at an average concentration of 2.51 g/m³ of air, at a blade leading edge Mach number of 0.50. The rig was determined to be an acceptable first-stage axial compressor representation. Two types of 16 μm-thick coatings were tested: Titanium Nitride (TiN) and Chromium-Aluminum-Titanium Nitride (CrAlTiN), both applied using an Arc Physical Vapour Deposition technique at the National Research Council in Ottawa, Canada. A Leithead-Allan-Zhao (LAZ) score was created to compare the durability performance of uncoated and coated blades based on mass loss and blade dimension changes. The bare blades' LAZ score was set as a benchmark of 1.00. The TiN-coated and CrAlTiN-coated blades obtained LAZ scores of 0.69 and 0.41, respectively; a lower score means a more erosion-resistant coating. Major modes of blade wear included the trailing edge, the leading edge and the rear suction surface. Trailing edge thickness was reduced, the leading edge became blunt, and the rear suction surface was scrubbed by overtip and recirculation zone vortices. It was found that the erosion effects of vortex flow were significant. Erosion damage due to reflected particles was not present due to the low blade solidity of 0.7. The rig is best suited for studying the performance of erosion-resistant coatings after they are proven effective in ASTM standardized testing. Keywords: erosion, compressor, coatings, turbomachinery, erosion rate, blade, experimental, gas turbine engine

  10. A rapid, non-destructive methodology to monitor activity of sulfide-induced corrosion of concrete based on H2S uptake rate.

    Science.gov (United States)

    Sun, Xiaoyan; Jiang, Guangming; Bond, Philip L; Wells, Tony; Keller, Jurg

    2014-08-01

    Many existing methods to monitor the corrosion of concrete in sewers are either very slow or destructive measurements. To overcome these limitations, a rapid, non-invasive methodology was developed to monitor the sulfide-induced corrosion process on concrete through the measurement of the H2S uptake rates of concrete at various corrosion stages. The H2S uptake rate for a concrete coupon was determined by measuring the gaseous H2S concentrations over time in a temperature- and humidity-controlled gas-tight reactor. The reliability of this method was evaluated by carrying out repeated tests on different concrete coupons previously exposed to 50 ppm of H2S, at 30 °C and 100% relative humidity for over 32 months. The H2S uptake measurements showed good reproducibility. It was also shown that a severely corroded coupon exhibited higher sulfide uptake rates than a less corroded coupon. This could be explained by the corrosion layer in the more corroded coupon having a higher biological sulfide oxidation activity than the less corroded coupon. Additionally, temperature changes had a stronger effect on the uptake rate of the heavily corroded coupon compared to the less corroded coupon. A corrosion rate of 8.9 ± 0.5 mm/year, estimated from the H2S uptake results, agreed well with the corrosion rate observed in real sewers under similar conditions. The method could be applied to investigate important factors affecting sulfide-induced concrete corrosion, particularly temperature, fluctuating gaseous H2S concentrations, oxygen concentrations, surface pH and relative humidity.
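
    As a rough sketch of how an uptake rate can be derived from the kind of closed-reactor measurement described above, the snippet below fits the initial decline of gaseous H2S concentration and converts the slope into a surface-normalized uptake rate. The reactor volume, coupon area, example readings and the ppm-to-mass conversion at 30 °C are illustrative assumptions, not values from the study.

```python
# Sketch: estimating an H2S uptake rate from the decline of gaseous H2S
# concentration measured over time in a closed, gas-tight reactor.
import numpy as np

t_min = np.array([0, 5, 10, 15, 20, 25])                 # time since dosing, minutes (assumed)
c_ppm = np.array([50.0, 44.8, 40.1, 35.9, 32.2, 28.8])   # measured H2S, ppm (assumed)

V_reactor = 0.010     # m^3, headspace volume (assumed)
A_coupon = 0.010      # m^2, exposed concrete surface (assumed)

# Linear fit over the near-linear initial decline gives dC/dt in ppm/min
slope_ppm_per_min = np.polyfit(t_min, c_ppm, 1)[0]

# Convert ppm to mass concentration: at ~30 degC and 1 atm,
# 1 ppm H2S ~= 34.1 g/mol / 24.9 L/mol ~= 1.37 mg/m^3.
mg_per_m3_per_ppm = 34.1 / 24.9
uptake_mg_per_m2_h = (-slope_ppm_per_min * mg_per_m3_per_ppm
                      * V_reactor / A_coupon * 60)
print(f"H2S uptake rate ~ {uptake_mg_per_m2_h:.2f} mg m^-2 h^-1")
```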

  11. Methodology for the validation of analytical methods involved in uniformity of dosage units tests.

    Science.gov (United States)

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2013-01-14

    Validation of analytical methods is required prior to their routine use. In addition, the current implementation of the Quality by Design (QbD) framework in the pharmaceutical industry aims at improving the quality of the end products starting from their early design stage. However, neither regulatory guidelines nor the published methodologies for assessing method validation propose decision methodologies that effectively take into account the final purpose of the developed analytical methods. In this work a solution is proposed for the specific case of validating analytical methods involved in the assessment of the content uniformity or uniformity of dosage units of a batch of pharmaceutical drug products, as described in the European or US pharmacopoeias. This methodology uses statistical tolerance intervals as decision tools. Moreover, it adequately defines the Analytical Target Profile of analytical methods in order to obtain methods that allow correct decisions to be made about content uniformity or uniformity of dosage units with high probability. The applicability of the proposed methodology is further illustrated using an HPLC-UV assay as well as a near-infrared spectrophotometric method.
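
    To make the tolerance-interval decision tool concrete, the sketch below computes a two-sided normal tolerance interval using Howe's approximation for a small set of assay recoveries. The coverage, confidence level and data are illustrative choices and are not taken from the paper.

```python
# Sketch: two-sided normal tolerance interval (Howe's approximation), the kind
# of interval used as a decision tool when validating a content-uniformity assay.
import numpy as np
from scipy import stats

def tolerance_interval(x, coverage=0.9, confidence=0.95):
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, df=n - 1)          # lower chi-square quantile
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)            # Howe (1969) factor
    return mean - k * sd, mean + k * sd

# Example: recoveries (% of label claim) from a validation run (made-up values)
recoveries = [98.7, 99.5, 101.2, 100.4, 99.1, 100.8, 99.9, 100.2]
lo, hi = tolerance_interval(recoveries)
print(f"90%-coverage / 95%-confidence tolerance interval: [{lo:.1f}, {hi:.1f}] %")
```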

  12. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms.

  13. Electric Propulsion Test and Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST)

    Science.gov (United States)

    2016-04-14

    Only briefing-chart fragments are available for this record: payload to orbit; rapid, sustainable repositioning and station-keeping; smaller, low-cost launch vehicle and dual launch; mission enabling… voltage while measuring thruster current and oscillation telemetry; evaluate sensitivity to changes in pressure and input parameters. New RDT&E methodology: plot an I-V-B map with a color scale for telemetry (e.g. current oscillations) to assess global trends and facility interactions; extrapolate to…

  14. Observational tests of Galileon gravity with growth rate

    Science.gov (United States)

    Hirano, Koichi

    2016-10-01

    We compare observational data on the growth rate of structure with the predictions of Galileon theory. For the same value of the energy density parameter Ω_{m,0}, the growth rate in Galileon models is enhanced compared with the ΛCDM case, due to the enhancement of Newton's constant. The smaller Ω_{m,0} is, the more suppressed the growth rate is. Hence the best-fit value of Ω_{m,0} in the Galileon model is 0.16 from the growth rate data alone, which is considerably smaller than the value obtained from observations of type Ia supernovae, the cosmic microwave background and baryon acoustic oscillations. We also find an upper limit on the Brans-Dicke parameter of ω < -1000 (1σ) from the growth rate data. In this paper, specific Galileon models are considered, not the entire class. More and better growth rate data are required to distinguish between dark energy and modified gravity.

  15. Testing the results of municipal mixed-use zoning ordinances: a novel methodological approach.

    Science.gov (United States)

    Cannon, Carol L; Thomas, Sue; Treffers, Ryan D; Paschall, Mallie J; Heumann, Lauren; Mann, Gregory W; Dunkell, Dashiell O; Nauenberg, Saskia

    2013-08-01

    Municipal mixed-use zoning (MUZ) is one public health strategy to create more walkable neighborhoods by reducing the separation of daily activities. This study uses a novel data-gathering methodology to evaluate municipal zoning ordinances in twenty-two California cities in conjunction with the walkability potential of resulting mixed-use zones, to explore the extent to which variations in uses mandated by MUZ ordinances are correlated with variations in walking opportunities. We find that, after controlling for population, socioeconomic status, and zone size, significant relationships exist between the range and precision of uses mandated by MUZ ordinances and the mixture and breadth of walking destinations in these zones. The study also demonstrates that analysis of municipal zoning codes and a novel data-gathering methodology yield valid data. The analysis of MUZ ordinances is a significant complement to other approaches to measuring walkability and can be used across cities.

  16. Extending Synthetic Validation Methodology to Assess Occupational Similarities Within Job Sets and to Select Classification Tests

    Science.gov (United States)

    1991-12-01

    The objective of the project was accomplished in two ways: (1) by using a paper-and-pencil job analysis questionnaire and developing a PC-based technology… evaluation of methodologies for achieving two distinct sub-objectives: developing technology to cluster similar jobs or quantify the similarity… Radio Operator (SYNVAL), Mobile Subscriber Equipment Transmission System Operator (SYNVAL), Fire Control Instrument Repairer (JSURT), Small Arms Repairer (JSRRT)…

  17. Low HIV testing rates among tuberculosis patients in Kampala, Uganda

    Directory of Open Access Journals (Sweden)

    Cobelens Frank

    2010-03-01

    Full Text Available Abstract Background: HIV testing among tuberculosis patients is critical in improving morbidity and mortality, as those found to be HIV positive will be offered a continuum of care, including ART if indicated. We conducted a cross-sectional study in three Kampala City primary care clinics to assess the level of HIV test uptake among newly diagnosed pulmonary tuberculosis (PTB) patients, to assess patient and health worker factors associated with HIV test uptake, and to determine factors associated with HIV test uptake at the primary care clinics. Methods: Adult patients who had been diagnosed with smear-positive PTB at a primary care clinic or at the referral hospital and who were being treated at any of the three clinics were interviewed. Associations between having taken the test as the main outcome and explanatory variables were assessed by multivariate logistic regression. Results: Between April and October 2007, 112 adults were included in the study. An HIV test had been offered to 74 (66%). Of those offered, 61 (82%) had accepted the test; 45 (74%) had eventually been tested; and 32 (29%) had received their test results. Patients who were… Conclusions: The overall HIV test uptake was surprisingly low at 40%. The HIV test uptake was significantly higher among TB patients who were identified at hospital, among females, and among the unemployed.

  18. Predicting pilot error: testing a new methodology and a multi-methods and analysts approach.

    Science.gov (United States)

    Stanton, Neville A; Salmon, Paul; Harris, Don; Marshall, Andrew; Demagalski, Jason; Young, Mark S; Waldmann, Thomas; Dekker, Sidney

    2009-05-01

    The Human Error Template (HET) is a recently developed methodology for predicting design-induced pilot error. This article describes a validation study undertaken to compare the performance of HET against three contemporary Human Error Identification (HEI) approaches when used to predict pilot errors for an approach and landing task, and also to compare individual analyst error predictions with an approach intended to enhance error prediction sensitivity: the multiple analysts and methods approach, whereby predictions from multiple analysts using a range of HEI techniques are pooled. The findings indicate that, of the four methodologies used in isolation, analysts using the HET methodology offered the most accurate error predictions, and also that the multiple analysts and methods approach was more successful overall in terms of error prediction sensitivity than the other three methods, but not more successful than the HET approach. The results suggest that when predicting design-induced error, it is appropriate to use a toolkit of different HEI approaches and multiple analysts in order to heighten error prediction sensitivity.

  19. The RADAR Test Methodology: Evaluating a Multi-Task Machine Learning System with Humans in the Loop

    Science.gov (United States)

    2006-10-01

    …subject in the form of static web pages easily accessible from the subject's home page (Figure 2, top and middle). Other static web content included a conference planning manual (complete with documentation of standard task constraints) and a PDF of the… Figure 2 shows static web and vendor portal examples.

  20. Test of The Weak Form Efficient Market Hypothesis for The Istanbul Stock Exchange By Markov Chains Methodology

    OpenAIRE

    KILIÇ, Öğr.Gör.Dr. Süleyman Bilgin

    2013-01-01

    In this study, Markov chain methodology is used to test whether or not the daily returns of the Istanbul Stock Exchange (ISE) 100 index follow a martingale (random walk) process. If the Weak Form Efficient Market Hypothesis (EMH) holds in any stock market, stock prices or returns follow a random walk process. The random walk theory asserts that price movements will not follow any patterns or trends and that past price movements cannot be used to predict future price movements; hence technic…
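
    A minimal version of the Markov-chain check described above can be sketched as follows: daily returns are coded as up/down states, a first-order transition matrix is estimated, and independence of consecutive states is tested with a chi-square test. The return series is synthetic rather than ISE-100 data, and this two-state test is a simplified stand-in for the author's procedure.

```python
# Sketch: Markov-chain check of the random-walk (weak-form EMH) idea.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(42)
returns = rng.normal(0, 0.01, 2500)          # placeholder for ISE-100 daily returns

states = (returns > 0).astype(int)           # 1 = up day, 0 = down day
counts = np.zeros((2, 2), dtype=int)
for prev, curr in zip(states[:-1], states[1:]):
    counts[prev, curr] += 1                  # transition frequency matrix

transition = counts / counts.sum(axis=1, keepdims=True)
chi2, pval, _, _ = chi2_contingency(counts)

print("Estimated transition matrix:\n", np.round(transition, 3))
print(f"chi2 = {chi2:.2f}, p = {pval:.3f}")
# A large p-value is consistent with a martingale/random-walk (weak-form efficient) market.
```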

  1. How to Use the DX SYSTEM of Diagnostic Testing. Methodology Project.

    Science.gov (United States)

    McArthur, David; Cabello, Beverly

    The DX SYSTEM of Diagnostic Testing is an easy-to-use computerized system for developing and administering diagnostic tests. A diagnostic test measures a student's mastery of a specific domain (skill or content area). It examines the necessary subskills hierarchically from the most to the least complex. The DX SYSTEM features tailored testing with…

  2. Hydrogen embrittlement of duplex steel tested using slow strain rate test

    Directory of Open Access Journals (Sweden)

    P. Vaňova

    2014-04-01

    Full Text Available This paper deals with hydrogen embrittlement of austenitic-ferritic 2205 duplex steel studied using the slow strain rate test (SSRT). The original material was heat treated at 700 °C for 5 hours followed by air cooling, with the aim of provoking sigma-phase precipitation and embrittlement of the material. Samples of both states were electrolytically saturated with hydrogen in a 0.1 N solution of sulfuric acid (H2SO4) with an addition of KSCN for 24 hours. Hydrogen embrittlement appeared on the fracture surfaces of the tested tensile bars as quasi-cleavage damage around their perimeter. From the established depth of hydrogen charging, the diffusion coefficient of hydrogen was estimated both for the duplex steel with ferritic-austenitic structure and for the structure containing the sigma phase.
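
    Assuming a simple one-dimensional diffusion picture in which the hydrogen-affected depth scales roughly as x ≈ √(4Dt), an effective diffusion coefficient can be back-calculated from the charging time and the observed quasi-cleavage rim depth, as sketched below. Both the scaling relation and the depth value are assumptions for illustration, not the estimation procedure or data of the paper.

```python
# Sketch: back-of-the-envelope effective hydrogen diffusion coefficient from
# the charging time and the depth of the hydrogen-affected (quasi-cleavage) rim,
# assuming a characteristic penetration depth x ~ sqrt(4 D t).
charging_time_s = 24 * 3600          # 24 h electrolytic charging
depth_m = 120e-6                     # observed quasi-cleavage rim depth (assumed value)

D_eff = depth_m**2 / (4 * charging_time_s)
print(f"Effective diffusion coefficient ~ {D_eff:.2e} m^2/s")
```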

  3. Proposed test method for determining discharge rates from water closets

    DEFF Research Database (Denmark)

    Nielsen, V.; Fjord Jensen, T.

    At present the rates at which discharge takes place from sanitary appliances are mostly known only in the form of estimated average values. SBI has developed a measuring method enabling determination of the exact rate of discharge from a sanitary appliance as a function of time. The method depends on the application of a calibrated measuring vessel, the volume of water in the vessel being measured at a given moment by means of a transducer and recorded by a UV recorder which is able to follow very rapid variations. In the article the apparatus is described in detail, and an example is given of the measurement of the rate of discharge from a WC.

  4. Development of a performance-rating scale for a nutrition knowledge test developed for adolescents.

    Science.gov (United States)

    Whati, Lindiwe; Senekal, Marjanne; Steyn, Nelia P; Lombard, Carl; Nel, Johanna

    2009-10-01

    The objectives of the present study were to (i) develop and validate a norm-referenced performance-rating scale to interpret a nutrition knowledge test developed for urban adolescents and (ii) develop a prototype for other researchers to follow when developing nutrition knowledge tests. For norm development the nutrition knowledge test (questionnaire) was administered to a sample representative of the questionnaire target group, referred to as the norm group. These included 512 adolescents in grades 8 (n 158), 10 (n 149) and 12 (n 205) at three randomly selected schools in Soweto and Johannesburg. The performance scores (in percentages) obtained by the norm group were transformed to Z-scores which were categorised into stanines using established Z-score cut-off points. For validation purposes the questionnaire was completed by 148 volunteers: sixty university dietetics students, nineteen non-nutrition university students and sixty-nine primary-school teachers. As required of an ideal norm group, the Z-scores formed a normal distribution (a bell-shaped curve). To facilitate interpretation of the results, the Z-score cut-off points for these categories were transformed back to performance scores (percentages) so that the performance of a testee could be interpreted directly from his/her performance in percentage. As is recommended, the nine stanine categories were reduced to five: very poor, fair/below average, good/average, very good/above average and excellent. The discriminatory validity of the norms was substantiated by showing that groups with known nutrition knowledge levels were rated appropriately and that the performance ratings of these groups differed significantly, with university dietetics students scoring 98.3%, primary-school teachers 20.3% and non-nutrition university students 31.6%. The norm-referenced performance-rating scale can be used with confidence to interpret the performance score achieved by a testee on the nutrition knowledge test developed
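
    The score-transformation chain described above (percentage score to Z-score, Z-score to stanine, stanines collapsed to five performance labels) can be sketched as follows. The half-standard-deviation stanine boundaries and the particular nine-to-five mapping are conventional choices assumed here; the published scale's exact cut-off points may differ.

```python
# Sketch: percentage scores -> Z-scores -> stanines -> collapsed performance labels.
import numpy as np

def to_stanine(scores):
    scores = np.asarray(scores, dtype=float)
    z = (scores - scores.mean()) / scores.std(ddof=1)
    bounds = [-1.75, -1.25, -0.75, -0.25, 0.25, 0.75, 1.25, 1.75]   # stanines 1..9
    return z, np.digitize(z, bounds) + 1

# Assumed collapsing of nine stanines into the five labels named in the abstract
labels = {1: "very poor", 2: "very poor", 3: "fair/below average",
          4: "good/average", 5: "good/average", 6: "good/average",
          7: "very good/above average", 8: "excellent", 9: "excellent"}

norm_group = np.random.default_rng(7).normal(55, 15, 512)   # placeholder norm-group scores (%)
z, stanine = to_stanine(norm_group)
print(labels[int(stanine[0])], f"(z = {z[0]:.2f}, stanine = {int(stanine[0])})")
```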

  5. Estimating rates of biologically driven coral reef framework production and erosion: a new census-based carbonate budget methodology and applications to the reefs of Bonaire

    Science.gov (United States)

    Perry, C. T.; Edinger, E. N.; Kench, P. S.; Murphy, G. N.; Smithers, S. G.; Steneck, R. S.; Mumby, P. J.

    2012-09-01

    Census-based approaches can provide important measures of the ecological processes controlling reef carbonate production states. Here, we describe a rapid, non-destructive approach to carbonate budget assessments, termed ReefBudget that is census-based and which focuses on quantifying the relative contributions made by different biological carbonate producer/eroder groups to net reef framework carbonate production. The methodology is presently designed only for Caribbean sites, but has potential to be adapted for use in other regions. Rates are calculated using data on organism cover and abundance, combined with annual extension or production rate measures. Set against this are estimates of the rates at which bioeroding species of fish, urchins and internal substrate borers erode reef framework. Resultant data provide a measure of net rates of biologically driven carbonate production (kg CaCO3 m-2 year-1). These data have potential to be integrated into ecological assessments of reef state, to aid monitoring of temporal (same-site) changes in rates of biological carbonate production and to provide insights into the key ecological drivers of reef growth or erosion as a function of environmental change. Individual aspects of the budget methodology can also be used alongside other census approaches if deemed appropriate for specific study aims. Furthermore, the methodology spreadsheets are user-changeable, allowing local or new process/rate data to be integrated into calculations. Application of the methodology is considered at sites around Bonaire. Highest net rates of carbonate production, +9.52 to +2.30 kg CaCO3 m-2 year-1, were calculated at leeward sites, whilst lower rates, +0.98 to -0.98 kg CaCO3 m-2 year-1, were calculated at windward sites. Data are within the ranges calculated in previous budget studies and provide confidence in the production estimates the methodology generates.
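
    The budget arithmetic described above reduces to gross production summed over carbonate-producer groups minus gross erosion summed over bioeroder groups, in kg CaCO3 m-2 year-1. The sketch below illustrates that accounting with made-up cover values and rates; it does not use ReefBudget's calibrated rate data.

```python
# Sketch: census-based net carbonate production = gross production - gross erosion.
producers = {   # (fractional benthic cover, production rate in kg CaCO3 m^-2 yr^-1 at full cover)
    "branching coral": (0.20, 10.0),
    "massive coral":   (0.15, 4.0),
    "crustose coralline algae": (0.10, 1.5),
}
eroders = {     # (abundance per m^2, erosion rate in kg CaCO3 per individual per yr)
    "parrotfish": (0.5, 1.0),
    "urchins":    (2.0, 0.1),
}

gross_production = sum(cover * rate for cover, rate in producers.values())
gross_erosion = sum(density * rate for density, rate in eroders.values())
net = gross_production - gross_erosion
print(f"Net carbonate production ~ {net:+.2f} kg CaCO3 m^-2 yr^-1")
```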

  6. High Strain-Rate Testing of Mechanical Couplers

    Science.gov (United States)

    2009-09-01

    …tensile strength equal to or greater than that of the control bar, but did not achieve the ductility of the control bar. Specimen UHC 9 failed close to… than the Grade 60 bar, but only slightly so at the rapid rate. Upset head system: the upset head coupler (UHC) system performed very well under the… The average performance of the UHC system under the intermediate strain-rate loading condition produced 99% of the dynamic ultimate strength, 61% of the…

  7. Usability testing: a review of some methodological and technical aspects of the method

    CERN Document Server

    Bastien, J M Christian

    2010-01-01

    The aim of this paper is to review some work conducted in the field of user testing that aims at specifying or clarifying the test procedures and at defining and developing tools to help conduct user tests. The topics that have been selected were considered relevant for evaluating applications in the field of medical and health care informatics. These topics are: the number of participants that should take part in a user test, the test procedure, remote usability evaluation, usability testing tools, and evaluating mobile applications.

  8. Methodology for uncertainty calculation of net total cooling effect estimation for rating room air conditioners and packaged terminal air conditioners

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca Diaz, Nestor [Universidad Tecnologica de Pereira, Facultad de Ingenieria Mecanica, Pereira (Colombia); University of Liege, Campus du Sart Tilman, Bat: B49, P33, B-4000 Liege (Belgium)

    2009-09-15

    This article presents a general procedure for calculating the uncertainty of the net total cooling effect estimated when rating room air conditioners and packaged terminal air conditioners, by means of measurements carried out in a test bench specially designed for this purpose. The uncertainty analysis presented in this work seeks to establish a degree of confidence in the experimental results. It is particularly important considering that the international standards related to this type of analysis are ambiguous on this subject. The uncertainty analysis is, moreover, an indispensable requirement of international standard ISO 17025 [ISO, 2005. International Standard 17025. General Requirements for the Competence of Testing and Calibration Laboratories. International Organization for Standardization, Geneva.], which must be applied to obtain the required quality levels according to the World Trade Organization (WTO). (author)
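
    As an illustration of the kind of combined-uncertainty calculation the article describes, the sketch below propagates component uncertainties through a simple cooling-capacity relation Q = m_dot * delta_h by root-sum-square, in the GUM style. The relation, the values and the uncertainties are placeholders, not those of the actual test bench.

```python
# Sketch: GUM-style combined standard uncertainty for a net cooling-effect estimate,
# propagated through Q = m_dot * delta_h with sensitivity coefficients and RSS.
import numpy as np

m_dot, u_m = 0.20, 0.004          # air mass flow rate (kg/s) and its standard uncertainty (assumed)
dh, u_dh = 20.0, 0.5              # enthalpy difference (kJ/kg) and its standard uncertainty (assumed)

Q = m_dot * dh                                    # net total cooling effect, kW
sens = np.array([dh, m_dot])                      # dQ/d(m_dot), dQ/d(dh)
u = np.array([u_m, u_dh])
u_Q = np.sqrt(np.sum((sens * u) ** 2))            # combined standard uncertainty
print(f"Q = {Q:.2f} kW, expanded uncertainty U = {2 * u_Q:.2f} kW (k = 2, ~95% coverage)")
```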

  9. A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume I.

    Science.gov (United States)

    1982-04-01

    Only front-matter fragments are available for this record: an acknowledgement of R. J. Sutton (Principal Technical Specialist, Advanced Programs) and E. L. Weiner (Engineering Contract…) for successful data processing and evaluation, and table-of-contents entries covering the discussion of results and summaries of experimental CHAS/SATS data for program test panels 1-4.

  10. Single Group, Pre- and Post-Test Research Designs: Some Methodological Concerns

    Science.gov (United States)

    Marsden, Emma; Torgerson, Carole J.

    2012-01-01

    This article provides two illustrations of some of the factors that can influence findings from pre- and post-test research designs in evaluation studies, including regression to the mean (RTM), maturation, history and test effects. The first illustration involves a re-analysis of data from a study by Marsden (2004), in which pre-test scores are…

  11. Facilitating the Interpretation of English Language Proficiency Scores: Combining Scale Anchoring and Test Score Mapping Methodologies

    Science.gov (United States)

    Powers, Donald; Schedl, Mary; Papageorgiou, Spiros

    2017-01-01

    The aim of this study was to develop, for the benefit of both test takers and test score users, enhanced "TOEFL ITP"® test score reports that go beyond the simple numerical scores that are currently reported. To do so, we applied traditional scale anchoring (proficiency scaling) to item difficulty data in order to develop performance…

  13. High Strain Rate Compression Testing of Ceramics and Ceramic Composites.

    Energy Technology Data Exchange (ETDEWEB)

    Blumenthal, W. R. (William R.)

    2005-01-01

    The compressive deformation and failure behavior of ceramics and ceramic-metal composites for armor applications has been studied as a function of strain rate at Los Alamos National Laboratory since the late 1980s. High strain rate (~10³ s⁻¹) uniaxial compression loading can be achieved using the Kolsky-split-Hopkinson pressure bar (SHPB) technique, but special methods must be used to obtain valid strength results. This paper reviews these methods and the limitations of the Kolsky-SHPB technique for this class of materials. The Kolsky-split-Hopkinson pressure bar (Kolsky-SHPB) technique was originally developed to characterize the mechanical behavior of ductile materials such as metals and polymers where the results can be used to develop strain-rate and temperature-dependent constitutive behavior models that empirically describe macroscopic plastic flow. The flow behavior of metals and polymers is generally controlled by thermally-activated and rate-dependent dislocation motion or polymer chain motion in response to shear stresses. Conversely, the macroscopic mechanical behavior of dense, brittle, ceramic-based materials is dominated by elastic deformation terminated by rapid failure associated with the propagation of defects in the material in response to resolved tensile stresses. This behavior is usually characterized by a distribution of macroscopically measured failure strengths and strains. The basis for any strain-rate dependence observed in the failure strength must originate from rate-dependence in the damage and fracture process, since uniform, uniaxial elastic behavior is rate-independent (e.g. inertial effects on crack growth). The study of microscopic damage and fracture processes and their rate-dependence under dynamic loading conditions is a difficult experimental challenge that is not addressed in this paper. The purpose of this paper is to review the methods that have been developed at the Los Alamos National Laboratory to

  14. Methodology comparison for gamma-heating calculations in material-testing reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A. [CEA, DEN, DER, Cadarache F-13108 Saint Paul les Durance (France); Reynard-Carette, C. [Aix Marseille Universite, CNRS, Universite de Toulon, IM2NP UMR 7334, 13397, Marseille (France)

    2015-07-01

    The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for physical models describing the behavior of material, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform an advanced suitable analysis of future experimental results. From a broader point of view, JHR global attractiveness as a MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating with well-controlled bias and associated uncertainty as low as possible is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand and bias coming from physical approximations assumed by computer codes and by general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. In this presentation, we focus on this latter kind of bias. Nuclear

  15. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood.

  16. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report concern two issues mainly: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations.

  17. Friction at seismic slip rates: testing thermal weakening models experimentally

    Science.gov (United States)

    Nielsen, S. B.; Spagnuolo, E.; Violay, M.; Di Toro, G.

    2013-12-01

    Recent experiments systematically explore rock friction under crustal earthquake conditions (fast slip rates of the order of 1 m/s)… We design an efficient and accurate wavenumber approximation for a solution of the temperature evolution on the fault. Finally, we propose a compact and practical model, based on a small number of memory variables, for the implementation of thermal weakening friction in seismic fault simulations.

  18. High rate tests of microstrip gas chambers for CMS

    CERN Document Server

    Malina, R F; Bell, B; Bellazzini, R; Bozzo, M; Brand, C; Brez, A; Caner, A; Cattai, A; Chorowicz, V; Contardo, D; Latronico, L; Lumb, N; Magazzù, C; Martín, J; Massai, M M; Mirabito, L; Morelli, A; Raffo, R; Rolandi, Luigi; Smadja, G; Spandre, G; Spezziga, M; Tsirou, A L

    1999-01-01

    Microstrip gas chambers (MSGC's) have been proposed for equipping the outer region of the tracker of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC). The MSGC's have undergone extensive development and tests during the last few years and their performance is well established. An important issue that still has to be addressed is whether MSGC's can maintain their characteristics after a long exposure to an intense flux of particles, similar to that expected at the LHC. We report results from the most recent beam test addressing this topic. (9 refs).

  19. Electronic palliative care coordination systems: Devising and testing a methodology for evaluating documentation.

    Science.gov (United States)

    Allsop, Matthew J; Kite, Suzanne; McDermott, Sarah; Penn, Naomi; Millares-Martin, Pablo; Bennett, Michael I

    2017-05-01

    The need to improve coordination of care at end of life has driven electronic palliative care coordination systems implementation across the United Kingdom and internationally. No approaches for evaluating electronic palliative care coordination systems use in practice have been developed. This study outlines and applies an evaluation framework for examining how and when electronic documentation of advance care planning is occurring in end of life care services. A pragmatic, formative process evaluation approach was adopted. The evaluation drew on the Project Review and Objective Evaluation methodology to guide the evaluation framework design, focusing on clinical processes. Data were extracted from electronic palliative care coordination systems for 82 of 108 general practices across a large UK city. All deaths ( n = 1229) recorded on electronic palliative care coordination systems between April 2014 and March 2015 were included to determine the proportion of all deaths recorded, median number of days prior to death that key information was recorded and observations about routine data use. The evaluation identified 26.8% of all deaths recorded on electronic palliative care coordination systems. The median number of days to death was calculated for initiation of an electronic palliative care coordination systems record (31 days), recording a patient's preferred place of death (8 days) and entry of Do Not Attempt Cardiopulmonary Resuscitation decisions (34 days). Where preferred and actual place of death was documented, these were matching for 75% of patients. Anomalies were identified in coding used during data entry on electronic palliative care coordination systems. This study reports the first methodology for evaluating how and when electronic palliative care coordination systems documentation is occurring. It raises questions about what can be drawn from routine data collected through electronic palliative care coordination systems and outlines

  20. Evaluating Cross-National Metrics of Tertiary Graduation Rates for OECD Countries: A Case for Increasing Methodological Congruence and Data Comparability

    Science.gov (United States)

    Heuser, Brian L.; Drake, Timothy A.; Owens, Taya L.

    2013-01-01

    By examining the different methods and processes by which national data gathering agencies compile and submit their findings to the Organization for Economic Cooperation and Development (OECD), the authors (1) assess the methodological challenges of accurately reporting tertiary completion and graduation rates cross-nationally and (2) examine the…

  1. Melt Rate Improvement for DWPF MB3: Melt Rate Furnace Testing

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M.E.

    2001-07-24

    The Defense Waste Processing Facility (DWPF) would like to increase its canister production rate. The goal of this study is to improve the melt rate in DWPF specifically for Macrobatch 3. However, the knowledge gained may result in improved melting efficiencies translating to future DWPF macrobatches and in higher throughput for other Department of Energy (DOE) melters. Increased melting efficiencies decrease overall operational costs by reducing the immobilization campaign time for a particular waste stream. For melt rate limited systems, a small increase in melting efficiency translates into significant hard dollar savings by reducing life cycle operational costs.

  2. Comparative Study of Impedance Eduction Methods. Part 1; DLR Tests and Methodology

    Science.gov (United States)

    Busse-Gerstengarbe, Stefan; Bake, Friedrich; Enghardt, Lars; Jones, Michael G.

    2013-01-01

    The absorption efficiency of acoustic liners used in aircraft engines is characterized by the acoustic impedance. Worldwide, many grazing flow test rigs and eduction methods are available that provide values for that impedance. However, a direct comparison and assessment of the data from the different rigs and methods is often not possible because test objects and test conditions are quite different. Only a few papers provide a direct comparison. Therefore, this paper, together with a companion paper, presents data measured with a reference test object under similar conditions in the DLR and NASA grazing flow test rigs. Additionally, by applying the in-house methods Liner Impedance Non-Uniform flow Solving algorithm (LINUS, DLR) and Convected Helmholtz Equation approach (CHE, NASA) to the data sets, similarities and differences due to the underlying theory are identified and discussed.

  3. Methodology for determining whether an increase in a state's child poverty rate is the result of the TANF program--Administration for Children and Families, HHS. Proposed rule.

    Science.gov (United States)

    1998-09-23

    The Administration for Children and Families is proposing a methodology to determine the child poverty rate in each State. If a State experiences an increase in its child poverty rate of 5 percent or more as a result of its Temporary Assistance for Needy Families (TANF) program, the State must submit and implement a corrective action plan. This requirement is a part of the new welfare reform block grant program enacted in 1996.

  4. First impression versus extended usage: a comparison of product testing methodologies for perfume.

    Science.gov (United States)

    Shalofsky, I

    1993-04-01

    Synopsis In the fine fragrance industry, unlike many other fast moving consumer goods (fmcg) industries, systematic consumer product-testing has usually been conspicuous by its absence. The reasons are varied, including perfume's own traditions rooted in fashion rather than in marketing, the reluctance of perfumers to see their creations tested, the frequently (and perhaps surprisingly) short lead times accorded for new product development and, of course, costs. When consumer product-testing is carried out, it is often limited, for these same reasons, to 'sniff-testing', which, in the perfume industry, is equivalent to 'first impression' testing. This paper suggests that such sniff-testing may not only be unreliable, but perhaps more unreliable for the perfume category than has been realized hitherto. Reference is made to two consumer research studies on perfume, a qualitative project in France, followed by a quantitative exercise in the UK. A comparison is made between in-home test and sniff-test results for the same set of perfumes, which illustrates the limitations of sniff-testing in general, and the misleading results that it may produce, in particular. A major implication is that perfume is one product category which should be tested in extended usage, and not just for 'first impressions'. Résumé (translated from French) In the fine fragrance industry, unlike other fast-moving consumer goods industries, systematic consumer testing is rarely used. The reasons are varied: perfume's own traditions, oriented towards fashion rather than marketing; perfumers' refusal to see their creations subjected to testing; the surprisingly short lead times for developing a new product; and, of course, cost. When a consumer test is carried out, it is generally limited, for these same reasons, to a 'sniff' test, which, in the perfume industry, is equivalent to a 'first impression' test. This article…

  5. Nitric-glycolic flowsheet testing for maximum hydrogen generation rate

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Newell, J. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Williams, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-03-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site is developing for implementation a flowsheet with a new reductant to replace formic acid. Glycolic acid has been tested over the past several years and found to effectively replace the function of formic acid in the DWPF chemical process. The nitric-glycolic flowsheet reduces mercury, significantly lowers the chemical generation of hydrogen and ammonia, allows purge reduction in the Sludge Receipt and Adjustment Tank (SRAT), stabilizes the pH and chemistry in the SRAT and the Slurry Mix Evaporator (SME), allows for effective adjustment of the SRAT/SME rheology, and is favorable with respect to melter flammability. The objective of this work was to perform DWPF Chemical Process Cell (CPC) testing at conditions that would bound the catalytic hydrogen production for the nitric-glycolic flowsheet.

  6. Evaluation of leakage from fume hoods using tracer gas, tracer nanoparticles and nanopowder handling test methodologies.

    Science.gov (United States)

    Dunn, Kevin H; Tsai, Candace Su-Jung; Woskie, Susan R; Bennett, James S; Garcia, Alberto; Ellenbecker, Michael J

    2014-01-01

    The most commonly reported control used to minimize workplace exposures to nanomaterials is the chemical fume hood. Studies have shown, however, that significant releases of nanoparticles can occur when materials are handled inside fume hoods. This study evaluated the performance of a new commercially available nano fume hood using three different test protocols. Tracer gas, tracer nanoparticle, and nanopowder handling protocols were used to evaluate the hood. A static test procedure using tracer gas (sulfur hexafluoride) and nanoparticles as well as an active test using an operator handling nanoalumina were conducted. A commercially available particle generator was used to produce sodium chloride tracer nanoparticles. Containment effectiveness was evaluated by sampling both in the breathing zone (BZ) of a mannequin and operator as well as across the hood opening. These containment tests were conducted across a range of hood face velocities (60, 80, and 100 ft/min) and with the room ventilation system turned off and on. For the tracer gas and tracer nanoparticle tests, leakage was much more prominent on the left side of the hood (closest to the room supply air diffuser) although some leakage was noted on the right side and in the BZ sample locations. During the tracer gas and tracer nanoparticle tests, leakage was primarily noted when the room air conditioner was on for both the low and medium hood exhaust airflows. When the room air conditioner was turned off, the static tracer gas tests showed good containment across most test conditions. The tracer gas and nanoparticle test results were well correlated showing hood leakage under the same conditions and at the same sample locations. The impact of a room air conditioner was demonstrated with containment being adversely impacted during the use of room air ventilation. The tracer nanoparticle approach is a simple method requiring minimal setup and instrumentation. However, the method requires the reduction in

  7. High-Strain Rate Testing of Gun Propellants

    Science.gov (United States)

    1988-12-01

    formulations under high loading rates have been studied previously (see Fong (1985); … et al. (1981); Schubert and Schmitt (1973); Greidanus (1976)… The transmission of a wave was described by Davies and Hunter (1963) and by Hoge (1970). Impedance is defined as Z = A(ρE)^(1/2), where A is the area, ρ is… σA = ma, with m = ρA dx and a = ∂²u/∂t²; assembling these, ∂σ/∂x = ρ ∂²u/∂t². For isotropic elastic materials, σ = Eε, where ε = ∂u/∂x. The partial…

  8. Methodology for the development of normative data for ten Spanish-language neuropsychological tests in eleven Latin American countries.

    Science.gov (United States)

    Guàrdia-Olmos, Joan; Peró-Cebollero, Maribel; Rivera, Diego; Arango-Lasprilla, Juan Carlos

    2015-01-01

    Within the field of neuropsychology, there is a significant lack of normative data for individuals in Latin America. To describe the methodology utilized to obtain the data and create norms for 10 Spanish-language neuropsychological tests administered in 11 Latin-American countries in a sample of 3,977 healthy individuals between the ages 18 and 90. The same data manipulation process was applied to the data collected (regardless of the scale or country) using a regression-based procedure that takes into account sex, age, and educational influences on neuropsychological test scores. Following this procedure, we were able to generate age, education, and sex (if relevant) based norms for each test in each of the 11 countries studied. These norms are presented in the 10 articles that comprise this special issue.
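
    A minimal sketch of the regression-based norming idea described above is given below; it is not the authors' exact model, and the coefficients and demographic data are synthetic. A raw score is converted to a percentile relative to the prediction from age, education and sex.

```python
# Minimal sketch of regression-based norming, assuming the general approach
# described above (not the authors' exact model): raw test scores are regressed
# on age, education and sex, and an individual's score is expressed as a
# standardized residual / percentile relative to that prediction.
# The demographic data below are synthetic.

import math
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(18, 90, n)
education = rng.uniform(0, 20, n)          # years of schooling
sex = rng.integers(0, 2, n)                # 0 = female, 1 = male (coding is arbitrary)
score = 60 - 0.2 * age + 1.5 * education + 2.0 * sex + rng.normal(0, 5, n)

# Fit score = b0 + b1*age + b2*education + b3*sex by ordinary least squares.
X = np.column_stack([np.ones(n), age, education, sex])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
residuals = score - X @ beta
sd_resid = residuals.std(ddof=X.shape[1])

def normed_percentile(raw, age_i, edu_i, sex_i):
    """Percentile of a raw score relative to demographically matched peers."""
    predicted = beta @ np.array([1.0, age_i, edu_i, sex_i])
    z = (raw - predicted) / sd_resid
    return 100 * 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF

print(f"Example: raw 55 at age 70, 12 y education -> {normed_percentile(55, 70, 12, 0):.1f}th percentile")
```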

  9. A scuba diving direct sediment sampling methodology on benthic transects in glacial lakes: procedure description, safety measures, and tests results.

    Science.gov (United States)

    Pardo, Alfonso

    2014-11-01

    This work presents an in situ sediment sampling method on benthic transects, specifically intended for scientific scuba diver teams. It was originally designed and developed to sample benthic surface and subsurface sediments and subaqueous soils in glacial lakes up to a maximum depth of 25 m. Tests were conducted on the Sabocos and Baños tarns (i.e., cirque glacial lakes) in the Spanish Pyrenees. Two 100 m transects, ranging from 24.5 to 0 m of depth in Sabocos and 14 m to 0 m deep in Baños, were conducted. In each test, 10 sediment samples of 1 kg each were successfully collected and transported to the surface. This sampling method proved operative even in low visibility conditions. Additional sampling tests were conducted in the Sabocos and Truchas tarns. This sampling methodology can be easily adapted to accomplish underwater sampling campaigns in nonglacial lakes and other continental water or marine environments.

  10. Methodology for the Assessment of 3D Conduction Effects in an Aerothermal Wind Tunnel Test

    Science.gov (United States)

    Oliver, Anthony Brandon

    2010-01-01

    This slide presentation reviews a method for the assessment of three-dimensional conduction effects during testing in an aerothermal wind tunnel. The test objectives were to duplicate and extend tests that were performed during the 1960s on thermal conduction in protuberances on a flat plate. Slides review the 1D versus 3D conduction data reduction error, the analysis process, the CFD-based analysis, the loose coupling method that simulates a wind tunnel test run, verification of the CFD solution, grid convergence, Mach number trends, size trends, and a summary of the CFD conduction analysis. Other slides show comparisons to pretest CFD at Mach 1.5 and 2.16 and the geometries of the models and grids.

  11. Methodology to improve design of accelerated life tests in civil engineering projects.

    Directory of Open Access Journals (Sweden)

    Jing Lin

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to the pipe in a subsea pipeline. However, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  12. Methodology to improve design of accelerated life tests in civil engineering projects.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to the pipe in subsea pipeline. However, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  13. Effectiveness Analysis of a Non-Destructive Single Event Burnout Test Methodology

    CERN Document Server

    Oser, P; Spiezia, G; Fadakis, E; Foucard, G; Peronnard, P; Masi, A; Gaillard, R

    2014-01-01

    It is essential to characterize power MOSFETs regarding their tolerance to destructive Single Event Burnouts (SEB). Therefore, several non-destructive test methods have been developed to evaluate the SEB cross-section of power devices. A power MOSFET has been evaluated using a test circuit designed according to standard non-destructive test methods discussed in the literature. Guidelines suggest a prior adaptation of auxiliary components to the device sensitivity before the radiation test. With the first value chosen for the decoupling capacitor, the external component initiated destructive events and affected the evaluation of the cross-section. As a result, the influence of auxiliary components on the device cross-section was studied. This paper presents the obtained experimental results, supported by SPICE simulations, to evaluate and discuss how the circuit effectiveness depends on the external components.

  14. EVALUATION OF LEAKAGE FROM FUME HOODS USING TRACER GAS, TRACER NANOPARTICLES AND NANOPOWDER HANDLING TEST METHODOLOGIES

    OpenAIRE

    Dunn, Kevin H.; Tsai, Candace Su-Jung; Woskie, Susan R.; Bennett, James S.; Garcia, Alberto; Ellenbecker, Michael J.

    2014-01-01

    The most commonly reported control used to minimize workplace exposures to nanomaterials is the chemical fume hood. Studies have shown, however, that significant releases of nanoparticles can occur when materials are handled inside fume hoods. This study evaluated the performance of a new commercially available nano fume hood using three different test protocols. Tracer gas, tracer nanoparticle, and nanopowder handling protocols were used to evaluate the hood. A static test procedure using tr...

  15. Methodological issues in product evaluation: the influence of testing environment and task scenario.

    Science.gov (United States)

    Sauer, Juergen; Sonderegger, Andreas

    2011-03-01

    This article examines the utility of two commonly used approaches in the evaluation of interactive consumer products: lab-based testing and single task scenarios. These are compared to two more complex and resource-demanding approaches (field-based testing and dual task scenarios) with regard to the test results they produce. An experiment with N = 80 users was carried out, employing a 2 (laboratory vs. field) by 2 (single task vs. dual task scenario) by 2 (on-product information: present vs. absent) between-subjects design. On-product information (advising users to save water and electricity during kettle usage) represented the intervention, of which the effects on user behaviour were compared under the different experimental conditions. The main finding was that the impact of on-product information on user behaviour was strongest in the lab-based testing environment using a single task scenario (i.e., most economical testing condition), compared to the three other experimental conditions. The work found similar effects for self-report measures. The findings of the study point to the risk that the effects of system redesign on user behaviour may be overestimated if low-fidelity testing approaches are employed. The relevance of these findings for other application areas is also discussed (e.g., design of warnings).

  16. The spin rates of O stars in WR + O binaries. I. Motivation, methodology and first results from SALT

    CERN Document Server

    Shara, Michael M; Vanbeveren, Dany; Moffat, Anthony F J; Zurek, David; Crause, Lisa

    2015-01-01

    The remarkable observation that many single O stars spin very rapidly can be explained if they accreted angular momentum from a mass-transferring companion before that star blew up as a supernova. To test this hypothesis we have measured the spin rates of eight O stars in Wolf-Rayet (WR) + O binaries, increasing the total sample size of such O stars' measured spins from two to ten. The average v sin i for the sample of 10 O stars in these binaries is a strongly super-synchronous rate of 237 km/s, with individual star's values ranging from 129 to 331 km/s. Polarimetric and other determinations of these systems' sin i allow us to determine an average equatorial rotation velocity of 290 km/s for these 10 O stars, with individual star's velocities ranging from 140 to 496 km/s. This is strong observational evidence that Roche lobe overflow mass transfer from a WR progenitor companion has played a critical role in the evolution of WR+OB binaries. While theory predicts that this mass transfer rapidly spins-up the O-...

  17. A new methodology to test galaxy formation models using the dependence of clustering on stellar mass

    Science.gov (United States)

    Campbell, David J. R.; Baugh, Carlton M.; Mitchell, Peter D.; Helly, John C.; Gonzalez-Perez, Violeta; Lacey, Cedric G.; Lagos, Claudia del P.; Simha, Vimal; Farrow, Daniel J.

    2015-09-01

    We present predictions for the two-point correlation function of galaxy clustering as a function of stellar mass, computed using two new versions of the GALFORM semi-analytic galaxy formation model. These models make use of a high resolution, large volume N-body simulation, set in the 7-year Wilkinson Microwave Anisotropy Probe cosmology. One model uses a universal stellar initial mass function (IMF), while the other assumes different IMFs for quiescent star formation and bursts. Particular consideration is given to how the assumptions required to estimate the stellar masses of observed galaxies (such as the choice of IMF, stellar population synthesis model, and dust extinction) influence the perceived dependence of galaxy clustering on stellar mass. Broad-band spectral energy distribution fitting is carried out to estimate stellar masses for the model galaxies in the same manner as in observational studies. We show clear differences between the clustering signals computed using the true and estimated model stellar masses. As such, we highlight the importance of applying our methodology to compare theoretical models to observations. We introduce an alternative scheme for the calculation of the merger time-scales for satellite galaxies in GALFORM, which takes into account the dark matter subhalo information from the simulation. This reduces the amplitude of small-scale clustering. The new merger scheme offers improved or similar agreement with observational clustering measurements, over the redshift range 0 Public Extragalactic Redshift Survey, depending on the GALFORM model used.

  18. Comparison of test methodologies for foot-and-mouth disease virus serotype A vaccine matching.

    Science.gov (United States)

    Tekleghiorghis, Tesfaalem; Weerdmeester, Klaas; van Hemert-Kluitenberg, Froukje; Moormann, Rob J M; Dekker, Aldo

    2014-05-01

    Vaccination has been one of the most important interventions in disease prevention and control. The impact of vaccination largely depends on the quality and suitability of the chosen vaccine. To determine the suitability of a vaccine strain, antigenic matching is usually studied by in vitro analysis. In this study, we performed three in vitro test methods to determine which one gives the lowest variability and the highest discriminatory capacity. Binary ethylenimine inactivated vaccines, prepared from 10 different foot-and-mouth disease (FMD) virus serotype A strains, were used to vaccinate cattle (5 animals for each strain). The antibody titers in blood serum samples 3 weeks postvaccination (w.p.v.) were determined by a virus neutralization test, neutralization index test, and liquid-phase blocking enzyme-linked immunosorbent assay (ELISA). The titers were then used to calculate relationship coefficient (r1) values. These r1 values were compared to the genetic lineage using receiver operating characteristic (ROC) analysis. In the two neutralization test methods, the median titers observed against the test strains differed considerably, and the sera of the vaccinated animals did not always show the highest titers against their respective homologous virus strains. When the titers were corrected for test strain effect (scaling), the variability (standard error of the mean per vaccinated group) increased because the results were on a different scale, but the discriminatory capacity improved. An ROC analysis of the r1 value calculated on both observed and scaled titers showed that only r1 values of the liquid-phase blocking ELISA gave a consistent statistically significant result. Under the conditions of the present study, the liquid-phase blocking ELISA showed less variation and still had a higher discriminatory capacity than the other tests.
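
    For illustration, the sketch below computes an r1 value under the conventional definition (median heterologous titre divided by median homologous titre of sera from vaccinated animals); the titres and the 0.3 cut-off used here are illustrative assumptions, not values or criteria taken from the study.

```python
# Minimal sketch of an r1 "vaccine matching" calculation, assuming the
# conventional definition (heterologous titre divided by homologous titre of
# serum from vaccinated animals). The titres and the 0.3 cut-off below are
# illustrative, not values from the study.

from statistics import median

# Reciprocal serum titres of one vaccinated group (hypothetical numbers).
homologous_titres = [256, 181, 362, 256, 128]   # against the vaccine strain
heterologous_titres = [64, 45, 90, 64, 32]      # against a field test strain

def r1(het_titres, hom_titres):
    """r1 = median heterologous titre / median homologous titre."""
    return median(het_titres) / median(hom_titres)

value = r1(heterologous_titres, homologous_titres)
match = "acceptable antigenic match" if value >= 0.3 else "poor antigenic match"
print(f"r1 = {value:.2f} -> {match}")
```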

  19. Noninvasive Fetal Trisomy (NIFTY) test: an advanced noninvasive prenatal diagnosis methodology for fetal autosomal and sex chromosomal aneuploidies

    Directory of Open Access Journals (Sweden)

    Jiang Fuman

    2012-12-01

    Background: Conventional prenatal screening tests, such as maternal serum tests and ultrasound scans, have limited resolution and accuracy. Methods: We developed an advanced noninvasive prenatal diagnosis method based on massively parallel sequencing. The Noninvasive Fetal Trisomy (NIFTY) test combines an optimized Student’s t-test with a locally weighted polynomial regression and binary hypotheses. We applied the NIFTY test to 903 pregnancies and compared the diagnostic results with those of full karyotyping. Results: 16 of 16 trisomy 21, 12 of 12 trisomy 18, two of two trisomy 13, three of four 45,X, one of one XYY and two of two XXY abnormalities were correctly identified, but one false positive case of trisomy 18 and one false negative case of 45,X were observed. The test performed with 100% sensitivity and 99.9% specificity for autosomal aneuploidies and 85.7% sensitivity and 99.9% specificity for sex chromosomal aneuploidies. Compared with three previously reported z-score approaches with/without GC-bias removal and with internal control, the NIFTY test was more accurate and robust for the detection of both autosomal and sex chromosomal aneuploidies in fetuses. Conclusion: Our study demonstrates a powerful and reliable methodology for noninvasive prenatal diagnosis.
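
    For context, the sketch below shows the simpler z-score baseline referred to above, not the NIFTY t-test and regression pipeline itself: the fraction of reads mapping to chromosome 21 is compared against a reference set of euploid pregnancies. All read counts and the z > 3 cut-off are assumptions for illustration.

```python
# Minimal sketch of the z-score baseline mentioned above (not the NIFTY
# t-test/regression pipeline): the fraction of sequencing reads mapping to
# chromosome 21 is compared against a reference set of euploid pregnancies.
# All read counts are synthetic.

import numpy as np

rng = np.random.default_rng(1)

def chr21_fraction(chr21_reads, total_reads):
    return chr21_reads / total_reads

# Reference set: euploid pregnancies (synthetic counts around a 1.3% chr21 share).
ref_fracs = np.array([chr21_fraction(rng.poisson(130_000), 10_000_000) for _ in range(100)])
mu, sigma = ref_fracs.mean(), ref_fracs.std(ddof=1)

def z_score(chr21_reads, total_reads):
    return (chr21_fraction(chr21_reads, total_reads) - mu) / sigma

# A test sample with a modestly elevated chr21 representation (synthetic).
z = z_score(138_000, 10_000_000)
print(f"z = {z:.2f} -> {'trisomy 21 suspected' if z > 3 else 'no call'}")  # z > 3 is a common cut-off
```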

  20. Clinical methodologies and incidence of appropriate statistical testing in orthopaedic spine literature. Are statistics misleading?

    Science.gov (United States)

    Vrbos, L A; Lorenz, M A; Peabody, E H; McGregor, M

    1993-06-15

    An analysis of 300 randomly drawn orthopaedic spine articles, published between 1970 and 1990, was performed to assess the quality of biostatistical testing and research design reported in the literature. Of the 300 articles, 269 dealt with topics of an experimental nature, while 31 documented descriptive studies. Statistical deficiencies were identified in 54.0% of the total articles. Conclusions drawn as the result of misleading significance values occurred in 124 experimental studies (46%) while 96 failed to document the form of analysis chosen (35.7%). Statistical testing was not documented in 34 studies (12.6%), while 20 (7.4%) employed analyses considered inappropriate for the specific design structure.

  1. 42 CFR 413.220 - Methodology for calculating the per-treatment base rate under the ESRD prospective payment system...

    Science.gov (United States)

    2010-10-01

    ... Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE... Disease (ESRD) Services and Organ Procurement Costs § 413.220 Methodology for calculating the per... factor to account for the most recent estimate of increases in the prices of an appropriate market...

  2. Fatigue in high speed aluminium craft: Evaluating a design methodology for estimating the fatigue life using large scale tests and full scale trials

    NARCIS (Netherlands)

    Drummen, I.; Schiere, M.; Tuitman, J.T.

    2013-01-01

    Within the VOMAS project, a methodology has been developed to estimate the fatigue life of high-speed aluminium crafts. This paper presents the large scale test and full scale trials which were done to acquire data for evaluating the developed methodology and presents results of this evaluation. Dur

  3. Troubleshooting Assessment and Enhancement (TAE) Program: Theoretical, Methodological, Test and Evaluation Issues

    Science.gov (United States)

    1991-04-01

    Related to Troubleshooting Proficiency. Henneman and Rouse (1984) also specify prescriptive measures which appear related to problem solving skill and...studies: ACT evaluative reasoning, reading, problem solving skill in social sciences. (4) Natural sciences: ACT natural science test. (5) Composite of

  4. A Methodology to Establish the Criticality of Attributes in Operational Tests

    Science.gov (United States)

    1975-10-01

    Requirements for the Degree Master of Science in Operations Research, Georgia Institute of Technology, October 1975. Reproduced by the National Technical ... and that the covariance matrices for the two populations are equal, an F statistic derived from D² can be used to test H0: Δ² = 0, or equivalently H0

  5. A comparison of two skin test methodologies and allergens from two different manufacturers

    NARCIS (Netherlands)

    Rhodius, R; Wickens, K; Cheng, S; Crane, J

    2002-01-01

    Background: Skin prick tests (SPTs) are a frequently used method for evaluation of atopy. A variety of standard allergen preparations are available, together with a number of different methods of application. Objective: The objective of this study was to compare SPT reactivity 1) using Soluprick SQ

  6. Methodology and instrumentation for testing the weak equivalence principle in stratospheric free fall

    Science.gov (United States)

    Iafolla, V.; Nozzoli, S.; Lorenzini, E. C.; Milyukov, V.

    1998-12-01

    The use of the GiZero free-fall facility for testing the weak equivalence principle is discussed in this article. GiZero consists of a vacuum capsule, released from a balloon at an altitude of 40 km, which shields an experimental apparatus free falling inside the capsule itself. The expected residual acceleration external to the detector is 10^-12 g (with g the Earth's gravitational acceleration) for the 30 s free fall. A common-mode rejection factor of about 10^-4 reduces the residual noise differential output to only 10^-16 g. The gravity detector is a differential accelerometer with two test masses with coincident centers of mass (i.e., zero baseline) with capacitive pick-ups. Preparatory experiments have been conducted in the laboratory with a precursor detector by measuring controlled gravity signals, at low frequency, and by observing the luni-solar tides. The estimated accuracy in testing the weak equivalence principle, with a 95% confidence level, is 5×10^-15 in a 30 s free fall. When compared to orbital free-fall experiments, the GiZero experiment can be considered as a valid compromise which is able to satisfy the requirement for significantly improving the experimental accuracy in testing the equivalence principle, with a substantially lower cost, the ability to recover the detector and to repeat the experiment at relatively short time intervals.

  7. Methodology Investigation. A Comparison of Reliability Tests for Avionics Equipment Onboard U.S. Army Helicopters

    Science.gov (United States)

    1975-04-01

    electric motor driven cam mechanism. Operation of the motor was controlled by a timer. The heating and cooling autoclave is not shown in the photograph ... the laboratory methods is least expensive is debatable. Table 4.0-II gives estimated relative cost and calendar time requirements by type of test.

  8. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    Science.gov (United States)

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  9. Some Methodological Issues with "Draw a Scientist Tests" among Young Children

    Science.gov (United States)

    Losh, Susan C.; Wilke, Ryan; Pop, Margareta

    2008-01-01

    Children's stereotypes about scientists have been postulated to affect student science identity and interest in science. Findings from prior studies using "Draw a Scientist Test" methods suggest that students see scientists as largely white, often unattractive, men; one consequence may be that girls and minority students feel a science career is…

  10. A methodological proposal to research patients’ demands and pre-test probabilities using paper forms in primary care settings

    Directory of Open Access Journals (Sweden)

    Gustavo Diniz Ferreira Gusso

    2013-04-01

    Objective: The purpose of this study is to present a methodology for assessing patients’ demands and calculating pre-test probabilities using paper forms in primary care. Method: Most developing countries do not use Electronic Health Records (EHR) in primary care settings. This makes it difficult to access information on what occurs within the health center’s working process. Basically, there are two methodologies to assess patients’ demands and the problems or diagnoses recorded by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the ‘reason for the appointment’ with ‘the problem registered’ by doctors. Paper forms were developed taking this concept as central. All appointments were classified by the International Classification of Primary Care (ICPC). Discussion: Even in paper form, confrontation between ‘reason for the appointment’ and ‘problem registered’ is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can be easily reproduced in any health center and enables a better understanding of the population profile. The prevalence of many illnesses and diseases is not known in each setting, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers adequate technology for primary health care workers, with the potential to transform each health center into a research-led practice, contributing directly to patient care.
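
    The confrontation of ‘reason for the appointment’ with ‘problem registered’ can be reduced to a simple tally; the sketch below shows one way to do it, with hypothetical ICPC-coded records rather than data from the study.

```python
# Minimal sketch of the confrontation described above: estimating a pre-test
# probability as the share of appointments with a given reason for encounter
# (ICPC code) that end with a given registered problem. The records below are
# hypothetical.

from collections import Counter, defaultdict

# (reason_for_encounter, problem_registered) pairs, one per appointment.
appointments = [
    ("R05 cough", "R74 acute URI"),
    ("R05 cough", "R74 acute URI"),
    ("R05 cough", "R78 acute bronchitis"),
    ("R05 cough", "R96 asthma"),
    ("D02 abdominal pain", "D73 gastroenteritis"),
]

by_reason = defaultdict(Counter)
for reason, problem in appointments:
    by_reason[reason][problem] += 1

def pretest_probability(reason, problem):
    counts = by_reason[reason]
    total = sum(counts.values())
    return counts[problem] / total if total else 0.0

print(f"P(acute URI | cough) = {pretest_probability('R05 cough', 'R74 acute URI'):.2f}")
```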

  11. Development of Improved Accelerated Corrosion Qualification Test Methodology for Aerospace Materials

    Science.gov (United States)

    2014-11-01

    polyurethane topcoat observed (FTIR analysis) • UV and ozone under constant salt fog on coated panels in the laboratory was much more damaging than 2 years ... irradiation and ozone gas • cumulative damage model for predicting atmospheric corrosion rates of 1010 steel was developed using inputs from weather ... data: temperature, relative humidity (%RH), and atmospheric contaminant (chloride, SO2, and ozone) levels. Materials: silver, Al alloy 7075, Al alloy ...

  12. Servohydraulic methods for mechanical testing in the Sub-Hopkinson rate regime up to strain rates of 500 1/s.

    Energy Technology Data Exchange (ETDEWEB)

    Crenshaw, Thomas B.; Boyce, Brad Lee

    2005-10-01

    Tensile and compressive stress-strain experiments on metals at strain rates in the range of 1-1000 1/s are relevant to many applications such as gravity-dropped munitions and airplane accidents. While conventional test methods cover strain rates up to ~10 s^-1 and split-Hopkinson and other techniques cover strain rates in excess of ~1000 s^-1, there are no well defined techniques for the intermediate or "Sub-Hopkinson" strain-rate regime. The current work outlines many of the challenges in testing in the Sub-Hopkinson regime, and establishes methods for addressing these challenges. The resulting technique for obtaining intermediate rate stress-strain data is demonstrated in tension on a high-strength, high-toughness steel alloy (Hytuf) that could be a candidate alloy for earth penetrating munitions and in compression on a Au-Cu braze alloy.
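
    The basic arithmetic behind such intermediate-rate tests can be illustrated as below; this is only a sketch of nominal engineering quantities (actuator velocity for a target strain rate, stress and strain from load and displacement) with hypothetical specimen dimensions, not the servohydraulic procedure developed in the report.

```python
# Minimal sketch of the nominal-rate arithmetic behind such tests (not the
# report's servohydraulic procedure itself): the target actuator velocity for a
# desired engineering strain rate, and engineering stress/strain from a
# load-displacement record. Specimen dimensions are hypothetical.

gauge_length = 0.025      # m
diameter = 0.005          # m
area = 3.141592653589793 * (diameter / 2) ** 2

def actuator_velocity(strain_rate):
    """Crosshead velocity needed for a nominal engineering strain rate (1/s)."""
    return strain_rate * gauge_length

for rate in (1, 10, 100, 500):
    print(f"{rate:>4} 1/s -> {actuator_velocity(rate):.2f} m/s")

def engineering_stress_strain(load_N, displacement_m):
    return load_N / area, displacement_m / gauge_length

stress, strain = engineering_stress_strain(15_000, 0.001)
print(f"sigma = {stress/1e6:.0f} MPa at strain = {strain:.3f}")
```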

  13. Aging monitoring methodology for built-In self-test applications

    OpenAIRE

    Coelho, João Ricardo dos Santos

    2013-01-01

    Master's dissertation, Electrical and Electronic Engineering, Instituto Superior de Engenharia, Universidade do Algarve, 2013. The high integration level achieved, as well as complexity and performance enhancements in new nanometer technologies, makes IC (integrated circuit) products very difficult to test. Moreover, long-term operation brings cumulative aging degradation, due to new processes and materials that lead to emerging defect phenomena, and the consequence is products with increased...

  14. The Study of Productivity Measurement and Incentive Methodology (Phase III - Paper Test). Volume 1

    Science.gov (United States)

    1986-03-14

    goal is to prepare an implementation report/manual that guides others in execution of recommendations and alternative approaches identified in...be machine, assembly and/or test labor, a contractor can submit an IMIP to automate a manual process for simple, average and/or complex machine shop...structure has created the need to address both production and non-production costs. VAPO has, therefore, expanded the application of measurements

  15. Critique of Test Methodologies for Biological Agent Detection and Identification Systems for Military and First Responders

    Science.gov (United States)

    2002-01-01

    Critical Control Point ( HACCP ) programs within the food industry worldwide (Cutter et al., 1996). Additionally, the ATP test with Luciferin Luciferase...Cutter, C. N., W. J. Dorsa, and G. R. Siragusa. 1996. A rapid microbial ATP bioluminescence assay for meat carcasses. Dairy, Food & Environ. San. 16...No. 73. In program and abstract book, 83rd Annual Meeting of IAMFES, p. 50. 6. Northcutt, J., and S. M. Russell. 1996. Making HACCP happen in your

  16. Methodology Plan for Minimum Resolvable Temperature Difference (MRTD) Testing of Aircraft Installed Sensors

    Science.gov (United States)

    2011-03-23

    a. Detector type: multi-element MCT SPRITE. b. Wavelength: long wave, 8-12 µm. c. Cooling system: integrated Stirling cooler. d. Cooldown...protective lid on the front of the sensor. Self-test imagery will be presented on the RS-170 monitor until the sensor’s Stirling cooler reaches its...ABSTRACT: The development, modernization, and integration of electro-optical sensors into U.S. Army aviation weapon systems necessitate a conclusive

  17. Methodology for the design, production, and test of plastic optical displacement sensors

    Science.gov (United States)

    Rahlves, Maik; Kelb, Christian; Reithmeier, Eduard; Roth, Bernhard

    2016-08-01

    Optical displacement sensors made entirely from plastic materials offer various advantages such as biocompatibility and high flexibility compared to their commonly used electrical and glass-based counterparts. In addition, various low-cost and large-scale fabrication techniques can potentially be utilized for their fabrication. In this work we present a toolkit for the design, production, and test of such sensors. Using the introduced methods, we demonstrate the development of a simple all-optical displacement sensor based on multimode plastic waveguides. The system consists of polymethylmethacrylate and cyclic olefin polymer which serve as cladding and core materials, respectively. We discuss several numerical models which are useful for the design and simulation of the displacement sensors as well as two manufacturing methods capable of mass-producing such devices. Prior to fabrication, the sensor layout and performance are evaluated by means of a self-implemented ray-optical simulation which can be extended to various other types of sensor concepts. Furthermore, we discuss optical and mechanical test procedures as well as a high-precision tensile testing machine especially suited for the characterization of the opto-mechanical performance of such plastic optical displacement sensors.

  18. Two-step sensitivity testing of parametrized and regionalized life cycle assessments: methodology and case study.

    Science.gov (United States)

    Mutel, Christopher L; de Baan, Laura; Hellweg, Stefanie

    2013-06-04

    Comprehensive sensitivity analysis is a significant tool to interpret and improve life cycle assessment (LCA) models, but is rarely performed. Sensitivity analysis will increase in importance as inventory databases become regionalized, increasing the number of system parameters, and parametrized, adding complexity through variables and nonlinear formulas. We propose and implement a new two-step approach to sensitivity analysis. First, we identify parameters with high global sensitivities for further examination and analysis with a screening step, the method of elementary effects. Second, the more computationally intensive contribution to variance test is used to quantify the relative importance of these parameters. The two-step sensitivity test is illustrated on a regionalized, nonlinear case study of the biodiversity impacts from land use of cocoa production, including a worldwide cocoa products trade model. Our simplified trade model can be used for transformable commodities where one is assessing market shares that vary over time. In the case study, the highly uncertain characterization factors for the Ivory Coast and Ghana contributed more than 50% of variance for almost all countries and years examined. The two-step sensitivity test allows for the interpretation, understanding, and improvement of large, complex, and nonlinear LCA systems.
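
    The two-step idea can be illustrated on a toy model as below; this is not the authors' LCA implementation. The first step is a Morris-style elementary-effects screen, the second a crude contribution-to-variance estimate obtained by freezing one parameter at a time.

```python
# Minimal sketch of the two-step idea described above on a toy model: a
# Morris-style elementary-effects screen to rank parameters, then a crude
# contribution-to-variance estimate for the top-ranked ones. This is not the
# authors' LCA implementation; the model and ranges are synthetic.

import numpy as np

rng = np.random.default_rng(2)

def model(x):
    # Toy "impact score": strongly nonlinear in x0, weak in x2.
    return x[0] ** 2 + 0.5 * x[1] + 0.01 * x[2] + 0.3 * x[0] * x[1]

k, trajectories, delta = 3, 20, 0.1

# Step 1: elementary effects (one-at-a-time perturbations from random base points).
effects = [[] for _ in range(k)]
for _ in range(trajectories):
    base = rng.uniform(0, 1, k)
    f0 = model(base)
    for i in range(k):
        pert = base.copy()
        pert[i] = min(pert[i] + delta, 1.0)
        effects[i].append((model(pert) - f0) / (pert[i] - base[i]))

mu_star = [np.mean(np.abs(e)) for e in effects]   # Morris mu*: mean absolute effect
ranking = np.argsort(mu_star)[::-1]
print("screening ranking (most influential first):", ranking.tolist())

# Step 2: crude contribution to variance by Monte Carlo, freezing one parameter at a time.
X = rng.uniform(0, 1, (5000, k))
total_var = np.var([model(x) for x in X])
for i in ranking[:2]:
    Xf = X.copy()
    Xf[:, i] = 0.5                                # freeze parameter i at its midpoint
    frozen_var = np.var([model(x) for x in Xf])
    print(f"parameter {i}: ~{100 * (1 - frozen_var / total_var):.0f}% of output variance")
```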

  19. Test Methodology of Reproducing Fuel Rod Failure by Debris Fretting Wear

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Oh Joon; Park, Nam Gyu; Kim, Jae Ik [KEPCO NF, Daejeon (Korea, Republic of)

    2015-10-15

    A test was conducted with simple debris to reproduce debris fretting wear. 68% of the fuel rod cladding thickness was worn away by Inconel debris in 75 hours. The test result shows that a simple link system is useful to accommodate debris oscillation, and that mid-grid mixing vanes could be a source of debris forcing. Additional tests will be conducted in future work with various debris, such as wire brush and metal chips, which are suspected to generate actual debris fretting wear. Debris fretting is one of the most common causes of nuclear fuel rod failure. Even though most nuclear fuels have a debris protection system, debris still causes fuel rod failures. From 1994 to 2006, debris fretting failure accounted for around 11% of total fuel failures. In 2006-2010, the portion attributable to debris rose to over 13%. The total number of fuel rod failures is decreasing, but the portion due to debris fretting wear is growing with time. Therefore, reproducing and identifying the mechanism of fuel rod failure by debris fretting wear is needed to improve the reliability of the nuclear fuel.

  20. 7 CFR 91.37 - Standard hourly fee rate for laboratory testing, analysis, and other services.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Standard hourly fee rate for laboratory testing... Charges § 91.37 Standard hourly fee rate for laboratory testing, analysis, and other services. (a) The standard hourly fee rate in this section for the individual laboratory analyses cover the costs of...

  1. 49 CFR 219.602 - FRA Administrator's determination of random drug testing rate.

    Science.gov (United States)

    2010-10-01

    ... Random Alcohol and Drug Testing Programs § 219.602 FRA Administrator's determination of random drug... percentage rate for random drug testing must be 50 percent of covered employees. (b) The FRA Administrator's decision to increase or decrease the minimum annual percentage rate for random drug testing is based on the...

  2. Design and pilot testing of a dietary assessment methodology for children at school

    DEFF Research Database (Denmark)

    Hansen, Mette; Laursen, Rikke; Mikkelsen, Bent Egberg

    , the Danish Dietary Recommendations, and inspired by other successful studies, a self-administered questionnaire investigating children’s eating habits was designed. After testing by an Expert Evaluation Panel and Think Aloud Interviews adjustments were integrated. Conclusion: If special attention is given...... to literacy skills and cognitive development, children in Danish 6th grade classes can be used as respondents in studies of the relation between food procurement policies and eating practice. The study suggests that a Cross-Sectional design is a satisfactory method to investigate the association between...

  3. Point-of-Care Hemoglobin/Hematocrit Testing: Comparison of Methodology and Technology.

    Science.gov (United States)

    Maslow, Andrew; Bert, Arthur; Singh, Arun; Sweeney, Joseph

    2016-04-01

    Point-of-care (POC) testing allows rapid assessment of hemoglobin (Hgb) and hematocrit (Hct) values. This study compared 3 POC testing devices--the Radical-7 pulse oximeter (Radical-7, Neuchâtel, Switzerland), the i-STAT (Abbott Point of Care, Princeton, NJ), and the GEM 4000 (Instrumentation Laboratory, Bedford, MA)--to the hospital reference device, the UniCel DxH 800 (Beckman Coulter, Brea, CA) in cardiac surgery patients. Prospective study. Tertiary care cardiovascular center. Twenty-four consecutive elective adult cardiac surgery patients. Hgb and Hct values were measured using 3 POC devices (the Radical-7, i-STAT, and GEM 4000) and a reference laboratory device (UniCel DxH 800). Data were collected simultaneously before surgery, after heparin administration, after heparin reversal with protamine, and after sternal closure. Data were analyzed using bias analyses. POC testing data were compared with that of the reference laboratory device. Hgb levels ranged from 6.8 to 15.1 g/dL, and Hct levels ranged from 20.1% to 43.8%. The overall mean bias was lowest with the i-STAT (Hct, 0.22%; Hgb 0.05 g/dL) compared with the GEM 4000 (Hct, 2.15%; Hgb, 0.63 g/dL) and the Radical-7 (Hgb 1.16 g/dL). The range of data for the i-STAT and Radical-7 was larger than that with the GEM 4000, and the pattern or slopes changed significantly with the i-STAT and Radical-7, whereas that of the GEM 4000 remained relatively stable. The GEM 4000 demonstrated a consistent overestimation of laboratory data, which tended to improve after bypass and at lower Hct/Hgb levels. The i-STAT bias changed from overestimation to underestimation, the latter in the post-cardiopulmonary bypass period and at lower Hct/Hgb levels. By contrast, the Radical-7 biases increased during the surgical procedure and in the lower ranges of Hgb. Important clinical differences and limitations were found among the 3 POC testing devices that should caution clinicians from relying on these data as sole determinants of
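
    Assuming the "bias analyses" refer to a Bland-Altman-style comparison, the sketch below computes the mean bias and 95% limits of agreement between paired POC and laboratory hemoglobin values; the data are synthetic.

```python
# Minimal sketch of a Bland-Altman-style bias analysis, assuming that is what
# "bias analyses" refers to above: mean bias and 95% limits of agreement
# between paired POC and laboratory hemoglobin values. Data are synthetic.

import numpy as np

rng = np.random.default_rng(3)
lab = rng.uniform(7, 15, 40)                    # reference Hgb, g/dL
poc = lab + 0.6 + rng.normal(0, 0.8, 40)        # POC device with +0.6 g/dL bias (synthetic)

diff = poc - lab
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)      # 95% limits of agreement

print(f"mean bias = {bias:+.2f} g/dL, limits of agreement = [{loa[0]:+.2f}, {loa[1]:+.2f}] g/dL")
```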

  4. Quality control tests of an activity meter to be used as reference for an in situ calibration methodology

    Energy Technology Data Exchange (ETDEWEB)

    Correa, Eduardo de L.; Kuahara, Lilian T.; Potiens, Maria da Penha A., E-mail: educorrea1905@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    Nuclear medicine is a medical speciality involving the application of radioactive isotopes in the diagnosis and/or treatment of disease. In order to ensure that the radiation dose applied to the patient is adequate, the radiopharmaceutical activity must be adequately measured. This work was performed to analyze the behavior of a Capintec NPL-CRC activity meter to be used as a reference for the implementation of a methodology for in situ calibration of nuclear medicine equipment. The daily quality control tests were performed, such as auto zero, background, system test, accuracy test and constancy test, together with determination of repeatability and intermediate measurement precision using 137Cs, 57Co and 133Ba sources. Furthermore, this equipment was used to confirm the activities of the check sources produced at IPEN and used by the laboratory that produces the radiopharmaceuticals sent to the nuclear medicine services. The results showed good behavior of this equipment. The maximum variation obtained in the accuracy test was 1.81% for the 57Co source. For 137Cs this variation was 4.59%, and for 133Ba, 11.83%. The high value obtained in the last case indicates the need for a correction that can be obtained by calibration methods. The results obtained using different reference sources showed good repeatability, with a maximum variation of 1.38%. (author)

  5. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist

    OpenAIRE

    Terwee, C. B.; Mokkink, L.B.; Knol, D L; Ostelo, R. W. J. G.; Bouter, L. M.; Vet, de, E.

    2011-01-01

    Background The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5–18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. Methods The scoring system was devel...

  6. Patient-specific rigorously methodological test of the mean phase coherence

    DEFF Research Database (Denmark)

    Henriksen, Jonas

    respectively together with at least 24 h of interictal data from each individual. With a variant of the leave-one-out training and test method, the optimal amount of training seizures was determined. Results: The generic test on the FSPEEG database resulted in a {sensitivity, false prediction ratio} of {0.55, 0.3/h}. For the 4 new patients the generic results were {0.91, 0.55/h}, {0.85, 0.71/h}, {0.29, 0.49/h}, and {0, 0.36/h} which is comparable to the FSPEEG database. The patient-specific approach yielded {0.81, 0.17/h}, {0.57, 0.13/h}, {0.81, 0.13/h}, and {0.60, 0.50/h} respectively, meaning... that the patient-specific approach resulted in a mean improvement of {0.19, 0.30/h}. It was found that 4 seizures were optimal for training. Conclusions: By making the seizure prediction algorithm patient-specific, great improvements were obtained in this preliminary study. 3 out of 4 patients showed clinically...

  7. Influence of counting methodology on erythrocyte ratios in the mouse micronucleus test.

    Science.gov (United States)

    LeBaron, Matthew J; Schisler, Melissa R; Torous, Dorothea K; Dertinger, Stephen D; Gollapudi, B Bhaskar

    2013-04-01

    The mammalian erythrocyte micronucleus test is widely used to investigate the potential interaction of a test substance with chromosomes or mitotic apparatus of replicating erythroblasts. In addition to the primary endpoint, micronucleated erythrocyte frequency, the proportion of immature erythrocytes is measured to assess the influence of treatment on erythropoiesis. The guideline recommendation for an acceptable limit of the immature erythrocyte fraction of not < 20% of the controls was based on traditional scoring methods that consider RNA content. Flow-based sample analysis (e.g., MicroFlow®) characterizes a subpopulation of RNA-containing reticulocytes (RETs) based on CD71 (transferrin receptor) expression. As CD71+ cells represent a younger cohort of RETs, we hypothesized that this subpopulation may be more responsive than the RNA+ fraction for acute exposures. This study evaluated RET population in the peripheral blood of two strains of mice treated by oral gavage with three clastogens (cyclophosphamide, N-ethyl-N-nitrosourea, and methyl methanesulfonate). Although CD71+ frequencies correlated with RNA-based counts, the relative treatment-related reductions were substantially greater. Accordingly, when using the flow cytometry-based CD71+ values for scoring RETs in an acute treatment design, it is suggested that a target value ≥ 5% CD71+ reticulocytes (i.e., 95% depression in reticulocytes proportion) be considered as acceptable for a valid assay.

  8. Toothpick test: a methodology for the detection of RR soybean plants

    Directory of Open Access Journals (Sweden)

    Fabiana Mota da Silva

    Due to the large increase in the area cultivated with genetically modified soybean in Brazil, it has become necessary to determine methods that are fast and efficient for detecting these cultivars. The aim of this work was to test the efficiency of the toothpick method in the detection of RR soybean plants, as well as to distinguish between cultivars, for sensitivity caused by herbicide. Ten transgenic soybean cultivars, resistant to the active ingredient glyphosate, and ten conventional soybean cultivars were used. Toothpicks soaked in glyphosate were applied to all the plants at stage V6 and evaluations were made at 2, 4, 6, 8 and 10 days after application (DAA). The effects of the glyphosate on the cultivars, and the symptoms of phytotoxicity caused in the transgenic plants were evaluated by means of grading scales. The toothpick test is effective in identifying RR soybean cultivars and also in separating them into groups by sensitivity to the symptoms caused by the glyphosate.

  9. Optical tests of 200mm MWIR polarizer wafers: methodology and results

    Science.gov (United States)

    Erbach, Peter S.; Pezzaniti, J. Larry; Reinhardt, John C.; Chenault, David B.; Goldstein, Dennis H.

    2012-06-01

    Wiregrid polarizers are commonly employed as optical components in polarization sensitive imaging systems in the infrared waveband. Achieving acceptable performance from wiregrid polarizers typically requires small feature sizes and small periods, large aspect ratios, and subtle control over duty cycle. In many cases, the metrics mentioned above can be realized with manufacturing techniques developed in the semiconductor industry. However, metrology techniques commonly utilized in the semiconductor industry are not necessarily conducive to measuring the effective performance across a large substrate. These techniques typically allow testing or inspection of only very small scale representations of the subwavelength features on the wiregrid polarizers. These techniques - for example the scanning electron micrograph, or SEM - may also damage the wiregrid polarizer. In this paper we present a non-destructive optical imaging method for measuring the performance of the entire infrared wiregrid polarizer produced on a 200mm substrate. This test method allows the users to see large scale errors present during the fabrication process that may not be visible with other metrology techniques. In addition, this technique directly correlates polarizer performance to manufacturing errors.

  10. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.

  11. Twins eye study in Tasmania (TEST): rationale and methodology to recruit and examine twins.

    Science.gov (United States)

    Mackey, David A; Mackinnon, Jane R; Brown, Shayne A; Kearns, Lisa S; Ruddle, Jonathan B; Sanfilippo, Paul G; Sun, Cong; Hammond, Christopher J; Young, Terri L; Martin, Nicholas G; Hewitt, Alex W

    2009-10-01

    Visual impairment is a leading cause of morbidity and poor quality of life in our community. Unravelling the mechanisms underpinning important blinding diseases could allow preventative or curative steps to be implemented. Twin siblings provide a unique opportunity in biology to discover genes associated with numerous eye diseases and ocular biometry. Twins are particularly useful for quantitative trait analysis through genome-wide association and linkage studies. Although many studies involving twins rely on twin registries, we present our approach to the Twins Eye Study in Tasmania to provide insight into possible recruitment strategies, expected participation rates and potential examination strategies that can be considered by other researchers for similar studies. Five separate avenues for cohort recruitment were adopted: (1) piggy-backing existing studies where twins had been recruited, (2) utilizing the national twin registry, (3) word-of-mouth and local media publicity, (4) directly approaching schools, and finally (5) collaborating with other research groups studying twins.

  12. Downscaling the in vitro test of fungal bioremediation of polycyclic aromatic hydrocarbons: methodological approach.

    Science.gov (United States)

    Drevinskas, Tomas; Mickienė, Rūta; Maruška, Audrius; Stankevičius, Mantas; Tiso, Nicola; Mikašauskaitė, Jurgita; Ragažinskienė, Ona; Levišauskas, Donatas; Bartkuvienė, Violeta; Snieškienė, Vilija; Stankevičienė, Antanina; Polcaro, Chiara; Galli, Emanuela; Donati, Enrica; Tekorius, Tomas; Kornyšova, Olga; Kaškonienė, Vilma

    2016-02-01

    The miniaturization and optimization of a white rot fungal bioremediation experiment is described in this paper. The optimized procedure allows determination of the degradation kinetics of anthracene. The miniaturized procedure requires only 2.5 ml of culture medium. The experiment is more precise, robust, and better controlled comparing it to classical tests in flasks. Using this technique, different parts, i.e., the culture medium, the fungi, and the cotton seal, can be analyzed. A simple sample preparation speeds up the analytical process. Experiments performed show degradation of anthracene up to approximately 60% by Irpex lacteus and up to approximately 40% by Pleurotus ostreatus in 25 days. Bioremediation of anthracene by the consortium of I. lacteus and P. ostreatus shows the biodegradation of anthracene up to approximately 56% in 23 days. At the end of the experiment, the surface tension of culture medium decreased comparing it to the blank, indicating generation of surfactant compounds.

  13. Comparative Study of Impedance Eduction Methods, Part 2: NASA Tests and Methodology

    Science.gov (United States)

    Jones, Michael G.; Watson, Willie R.; Howerton, Brian M.; Busse-Gerstengarbe, Stefan

    2013-01-01

    A number of methods have been developed at NASA Langley Research Center for eduction of the acoustic impedance of sound-absorbing liners mounted in the wall of a flow duct. This investigation uses methods based on the Pridmore-Brown and convected Helmholtz equations to study the acoustic behavior of a single-layer, conventional liner fabricated by the German Aerospace Center and tested in the NASA Langley Grazing Flow Impedance Tube. Two key assumptions are explored in this portion of the investigation. First, a comparison of results achieved with uniform-flow and shear-flow impedance eduction methods is considered. Also, an approach based on the Prony method is used to extend these methods from single-mode to multi-mode implementations. Finally, a detailed investigation into the effects of harmonic distortion on the educed impedance is performed, and the results are used to develop guidelines regarding acceptable levels of harmonic distortion
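
    The generic Prony idea referenced above can be sketched as follows; this is not the NASA Langley implementation, and the mode count, sign convention and sample data are assumptions. Equally spaced complex wall-pressure samples are fit as a sum of exponentials, and the roots of the linear-prediction polynomial give the axial wavenumbers of the propagating modes.

```python
# Minimal sketch of the generic Prony idea referenced above (not the NASA
# Langley implementation): equally spaced complex wall-pressure samples are fit
# as a sum of exponentials, and the roots of the linear-prediction polynomial
# give the axial wavenumbers of the propagating modes. The mode count, sign
# convention and sample data are assumptions.

import numpy as np

dx = 0.01                                   # axial microphone spacing, m
x = np.arange(0, 40) * dx
k_true = np.array([30 - 5j, 80 - 20j])      # two modal wavenumbers (synthetic)
amps = np.array([1.0, 0.4])
p = (amps[None, :] * np.exp(-1j * k_true[None, :] * x[:, None])).sum(axis=1)

M = 2                                       # number of modes assumed present
# Linear prediction: p[n] = -(c1*p[n-1] + ... + cM*p[n-M]); solve for c by least squares.
A = np.column_stack([p[M - 1 - j:len(p) - 1 - j] for j in range(M)])
rhs = -p[M:]
c, *_ = np.linalg.lstsq(A, rhs, rcond=None)

roots = np.roots(np.concatenate(([1.0], c)))        # z_m = exp(-i*k_m*dx)
k_est = 1j * np.log(roots) / dx                     # invert, up to the assumed sign convention
print("recovered wavenumbers:", np.sort_complex(k_est))
```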

  14. Methodology to evaluate the crack growth rate by stress corrosion cracking in dissimilar metals weld in simulated environment of PWR nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Paula, Raphael G.; Figueiredo, Celia A.; Rabelo, Emerson G., E-mail: raphaelmecanica@gmail.com, E-mail: caf@cdtn.br, E-mail: egr@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-07-01

    Inconel alloy weld metal is widely used to join dissimilar metals in nuclear reactor applications. Failures of welded components have recently been observed in plants, which has triggered an international effort to determine reliable data on the stress corrosion cracking behavior of this material in the reactor environment. The objective of this work is to develop a methodology to determine the crack growth rate caused by stress corrosion in Inconel alloy 182, using compact tensile (CT) specimens in a simulated PWR environment. (author)

  15. Test Method for High β Particle Emission Rate of 63Ni Source Plate

    Directory of Open Access Journals (Sweden)

    ZHANG Li-feng

    2015-01-01

    To address the difficulty of measuring the β particle emission rate of Ni-63 source plates used in Ni-63 betavoltaic batteries, a relative test method based on scintillation current was established according to the measurement principle of the scintillation detector. The β particle emission rate of a homemade Ni-63 source plate was tested by the method, and the test results were analysed and evaluated; it was initially concluded that the scintillation current method is a feasible way of testing the β particle emission rate of Ni-63 source plates with a high β particle emission rate.

  16. ADAPTIVE CONTROL APPROACH FOR PERIODIC RATE RIPPLES IN INERTIAL GUIDANCE TEST EQUIPMENT

    Institute of Scientific and Technical Information of China (English)

    XU Guo-zhu; LIU Yue; ZHANG Yuan-sheng

    2005-01-01

    The rate performance of inertial guidance test equipment (IGTE) affects the performance testing of inertial devices and navigation systems. Periodic rate ripples caused by periodic disturbances have a great effect on the rate performance of rotation test tables. The mechanism causing rate ripples in IGTE is analyzed. Based on a nonlinear self-adaptive control system approach, a control system scheme is proposed to eliminate the periodic disturbances despite their uncertainty. Experimental results show that the periodic rate ripple can be controlled efficiently.

  17. The Relation of Arm Exercise Peak Heart Rate to Stress Test Results and Outcome

    National Research Council Canada - National Science Library

    Xian, Hong; Liu, Weijian; Marshall, Cynthia; Chandiramani, Pooja; Bainter, Emily; Martin, III, Wade H

    2016-01-01

    PURPOSE: Arm exercise is an alternative to pharmacologic stress testing for >50% of patients unable to perform treadmill exercise, but no data exist regarding the effect of attained peak arm exercise heart rate on test sensitivity...

  18. Mapping and Imaging Methodologies within the Comprehensive Test Ban Treaty's On-Site Inspection Framework

    Science.gov (United States)

    Hawkins, W.; Sussman, A. J.; Kelley, R. E.; Wohletz, K. H.; Schultz-Fellenz, E. S.

    2013-12-01

    On-site inspection (OSI) is the final verification measure of the Comprehensive Nuclear Test Ban Treaty (CTBT). OSIs rely heavily on geologic and geophysical investigations. The objective is to apply methods that are effective, efficient and minimally intrusive. We present a general overview of the OSI as provisioned in the CTBT, specifying the allowed techniques and the timeline for their application. A CTBT OSI relies on many geological, geophysical and radiological methods. The search area for an OSI is mostly defined by uncertainty in the location of a suspect event detected by the International Monitoring System (IMS) and reported through the International Data Center and can be as large as 1000 km2. Thus OSI methods are fundamentally divided into general survey methods that narrow the search area and more focused, detailed survey methods to look for evidence of a potential underground explosion and try to find its location within an area of several km2. The purpose and goal of a CTBT OSI, as specified in the Article IV of the Treaty, is 'to clarify whether a nuclear explosion has been carried out in violation of the Treaty' and to 'gather any facts which might assist in identifying any possible violator.' Through the use of visual, geophysical, and radiological techniques, OSIs can detect and characterize anomalies and artifacts related to the event that triggered the inspection. In the context of an OSI, an 'observable' is a physical property that is important to recognize and document because of its relevance to the purpose of the inspection. Potential observables include: (1) visual observables such as ground/environmental disturbances and manmade features, (2) geophysical techniques that provide measurements of altered and damaged ground and buried artifacts, and (3) radiological measurements on samples. Information provided in this presentation comes from observations associated with historical testing activities that were not intended to go undetected

  19. Reliability of case definitions for public health surveillance assessed by Round-Robin test methodology

    Directory of Open Access Journals (Sweden)

    Claus Hermann

    2006-05-01

    Background: Case definitions have been recognized to be important elements of public health surveillance systems. They are to assure comparability and consistency of surveillance data and have a crucial impact on the sensitivity and the positive predictive value of a surveillance system. The reliability of case definitions has rarely been investigated systematically. Methods: We conducted a Round-Robin test by asking all 425 local health departments (LHD) and the 16 state health departments (SHD) in Germany to classify a selection of 68 case examples using case definitions. By multivariate analysis we investigated factors linked to classification agreement with a gold standard, which was defined by an expert panel. Results: A total of 7870 classifications were done by 396 LHD (93%) and all SHD. Reporting sensitivity was 90.0%, positive predictive value 76.6%. Polio case examples had the lowest reporting precision, salmonellosis case examples the highest (OR = 0.008; CI: 0.005–0.013). Case definitions with a check-list format of clinical criteria resulted in higher reporting precision than case definitions with a narrative description (OR = 3.08; CI: 2.47–3.83). Reporting precision was higher among SHD compared to LHD (OR = 1.52; CI: 1.14–2.02). Conclusion: Our findings led to a systematic revision of the German case definitions and build the basis for general recommendations for the creation of case definitions. These include, among others, that testable yes/no criteria in a check-list format are likely to improve reliability, and that software used for data transmission should be designed in strict accordance with the case definitions. The findings of this study are largely applicable to case definitions in many other countries or international networks, as they share the same structural and editorial characteristics of the case definitions evaluated in this study before their revision.

  20. Adjoint-tomography for a Local Surface Structure: Methodology and a Blind Test

    Science.gov (United States)

    Kubina, Filip; Michlik, Filip; Moczo, Peter; Kristek, Jozef; Stripajova, Svetlana

    2017-04-01

    We have developed a multiscale full-waveform adjoint-tomography method for local surface sedimentary structures with complicated interference wavefields. The local surface sedimentary basins and valleys are often responsible for anomalous earthquake ground motions and corresponding damage in earthquakes. In many cases only a relatively small number of records from a few local earthquakes is available for a site of interest. Consequently, prediction of earthquake ground motion at the site has to include numerical modeling for a realistic model of the local structure. Though limited, the information about the local structure encoded in the records is important and irreplaceable. It is therefore reasonable to have a method capable of using the limited information in records for improving a model of the local structure. A local surface structure and its interference wavefield require a specific multiscale approach. In order to verify our inversion method, we performed a blind test. We obtained synthetic seismograms at 8 receivers for 2 local sources, a complete description of the sources, positions of the receivers and material parameters of the bedrock. We considered the simplest possible starting model - a homogeneous halfspace made of the bedrock. Using our inversion method we obtained an inverted model. Given the starting model, synthetic seismograms simulated for the inverted model are surprisingly close to the synthetic seismograms simulated for the true structure in the target frequency range up to 4.5 Hz. We quantify the level of agreement between the true and inverted seismograms using the L2 and time-frequency misfits, and, more importantly for earthquake-engineering applications, also using the goodness-of-fit criteria based on the earthquake-engineering characteristics of earthquake ground motion. We also verified the inverted model for other source-receiver configurations not used in the inversion.

  1. Proposal of methodology and test protocol for evaluating and qualifying pH measuring devices

    Directory of Open Access Journals (Sweden)

    Niza Helena de Almeida

    2006-01-01

    Full Text Available We present a proposal for evaluating and qualifying pH measuring devices based on the requirements of relevant standards. The proposal is based on ASTM E70, NBR 7353, JIS Z 8805, BS 3145, DIN 19268, NBR ISO 17025 and other standards, as well as the results of field research carried out in conjunction with professionals performing pH measurements in public health laboratories. Evaluation is performed with the aid of a form which records data from inspection of and tests on the measuring system. The form gives acceptable variations in the parameters being tested and allows a conclusion to be reached regarding acceptability of the system. Using the proposed protocol allows definition of suitable analysis criteria, while taking into account the influences to which pH measurement is subject and the need for correct results. This is particularly true when analyzing products already on the market, thus underlining the protocol's importance to the public health area.

  2. Field Test Evaluation of Effect on Cone Resistance Caused by Change in Penetration Rate

    DEFF Research Database (Denmark)

    Poulsen, Rikke; Nielsen, Benjaminn Nordahl; Ibsen, Lars Bo

    2012-01-01

    This paper presents how a change in cone penetration rate affects the measured cone resistance during cone penetration testing in silty soils. Regardless of soil type, the standard rate of penetration is 20 mm/s and it is generally accepted that undrained penetration occurs in clay while drained penetration occurs in sand. In intermediate soils such as silty soils, the standard cone penetration rate may result in drainage conditions varying from undrained to partially or fully drained conditions. Field cone penetration tests have been conducted with different penetration rates on a test site in Dronninglund where the subsoil primarily consists of sandy silt. A total of 15 cone penetration tests with penetration rates varying from 60 to 0.5 mm/s and two geotechnical borings were conducted at the test site. Large soil samples were also collected from the test site in order to determine soil properties in the laboratory.

  3. Dose-rate and total-dose radiation testing of the Texas Instruments TMS320C30 32-bit floating point digital signal processor. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Siy, P.F.; Carter, J.T.; D' Addario, L.R.; Loeber, D.A.

    1991-08-01

    The MITRE Corporation has performed in-flux radiation testing of the Texas Instruments TMS320C30 32-bit floating point digital signal processor in both total dose and dose rate radiation environments. This test effort has provided data relating to the applicability of the TMS320C30 in systems with total dose and/or dose rate survivability requirements. In order to accomplish these tests, the MITRE Corporation developed custom hardware and software for in-flux radiation testing. This paper summarizes the effort by providing an overview of the TMS320C30, MITRE's test methodology, test facilities, statistical analysis, and full coverage of the test results. (Author)

  4. Optimization of the general acceptability through affective tests and response surface methodology of a dry cacao powder mixture based beverage

    Directory of Open Access Journals (Sweden)

    Elena Chau Loo Kung

    2013-09-01

    Full Text Available This research work had as its main objective optimizing the general acceptability, through affective tests and response surface methodology, of a dry cacao powder mixture based beverage. We obtained formulations of mixtures with cacao powder concentrations of 15%, 17.5% and 20% and lecithin concentrations of 0.1%, 0.3% and 0.5%, maintaining a constant content of sugar (25%) and vanillin (1%), including cacao powder with different pH values: natural (pH 5) and alkalinized (pH 6.5 and pH 8), with water added by difference to 100%, generating a total of fifteen treatments to be evaluated according to the Box-Behnken design for three factors. The treatments underwent satisfaction level tests to establish the general acceptability. The treatment that included cacao powder at a concentration of 17.5%, pH 6.5 and a lecithin concentration of 0.3% obtained the best levels of acceptability. The software Statgraphics Plus 5.1 was used to obtain the treatment with maximum acceptability, which corresponded to cacao powder with pH 6.81 at a concentration of 18.24% and soy lecithin at 0.28%, consistent with what was obtained in the satisfaction level tests. Finally, the optimum formulation was characterized physico-chemically and microbiologically, as well as evaluated sensorially, obtaining an acceptability of 6.17.
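
    An illustrative sketch of the response-surface step described above: fit a second-order polynomial to acceptability ratings from a three-factor designed experiment and locate the predicted optimum. The design points, ratings and bounds are placeholders, not the study's data.

```python
# Sketch of response-surface optimization for three factors (assumed placeholder data).
import numpy as np
from scipy.optimize import minimize

# factors: cacao concentration (%), pH, lecithin (%)
X = np.array([
    [15.0, 5.0, 0.3], [20.0, 5.0, 0.3], [15.0, 8.0, 0.3], [20.0, 8.0, 0.3],
    [15.0, 6.5, 0.1], [20.0, 6.5, 0.1], [15.0, 6.5, 0.5], [20.0, 6.5, 0.5],
    [17.5, 5.0, 0.1], [17.5, 8.0, 0.1], [17.5, 5.0, 0.5], [17.5, 8.0, 0.5],
    [17.5, 6.5, 0.3], [17.5, 6.5, 0.3], [17.5, 6.5, 0.3],
])
y = np.array([5.1, 5.3, 5.0, 5.2, 5.5, 5.6, 5.4, 5.5, 5.2, 5.1, 5.3, 5.2, 6.1, 6.2, 6.0])

def quad_terms(x):
    a, b, c = x
    return np.array([1, a, b, c, a*b, a*c, b*c, a*a, b*b, c*c])

A = np.array([quad_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)            # second-order model fit

res = minimize(lambda x: -quad_terms(x) @ coef,          # maximize predicted rating
               x0=[17.5, 6.5, 0.3],
               bounds=[(15, 20), (5, 8), (0.1, 0.5)])
print("predicted optimum (cacao %, pH, lecithin %):", res.x)
```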

  5. The spin rates of O stars in WR + O binaries - I. Motivation, methodology, and first results from SALT

    Science.gov (United States)

    Shara, Michael M.; Crawford, Steven M.; Vanbeveren, Dany; Moffat, Anthony F. J.; Zurek, David; Crause, Lisa

    2017-01-01

    The black holes (BH) in merging BH-BH binaries are likely progeny of binary O stars. Their properties, including their spins, will be strongly influenced by the evolution of their progenitor O stars. The remarkable observation that many single O stars spin very rapidly can be explained if they accreted angular momentum from a mass-transferring, O-type or Wolf-Rayet (WR) companion before that star blew up as a supernova. To test this prediction, we have measured the spin rates of eight O stars in WR + O binaries, increasing the total sample size of such O stars' measured spins from 2 to 10. Polarimetric and other determinations of these systems' sin i allow us to determine an average equatorial rotation velocity from He I (He II) lines of ve = 348 (173) km s-1 for these O stars, with individual star's ve from He I (He II) lines ranging from 482 (237) to 290 (91) km s-1. We argue that the ˜100 per cent difference between He I and He II speeds is due to gravity darkening. Supersynchronous spins, now observed in all 10 O stars in WR + O binaries where it has been measured, are strong observational evidence that Roche lobe overflow mass transfer from a WR progenitor companion has played a critical role in the evolution of WR + OB binaries. While theory predicts that this mass transfer rapidly spins up the O-type mass gainer to a nearly breakup rotational velocity of ve ˜ 530 km s-1, the observed average ve of the O-type stars in our sample is 65 per cent of that speed. This demonstrates that, even over the relatively short WR-phase time-scale, tidal and/or other effects causing rotational spin-down must be efficient. A challenge to tidal synchronization theory is that the two longest period binaries in our sample (with periods of 29.7 and 78.5 d) unexpectedly display supersynchronous rotation.

  6. 78 FR 41129 - Market Test of Experimental Product - International Merchandise Return Service-Non-Published Rates

    Science.gov (United States)

    2013-07-09

    ... of Experimental Product -- International Merchandise Return Service--Non-Published Rates AGENCY: U.S... International Merchandise Return Service--Non-Published Rates in accordance with statutory requirements. DATES... will begin a market test of its International Merchandise Return Service (IMRS) Non-published Rate...

  7. In vitro susceptibilities of caprine Mycoplasma agalactiae field isolates to six antimicrobial agents using the E test methodology.

    Science.gov (United States)

    Filioussis, George; Petridou, Evanthia; Giadinis, Nektarios D; Kritas, Spyridon K

    2014-12-01

    The minimum inhibitory concentrations (MICs) of enrofloxacin, ciprofloxacin, spectinomycin, tetracycline, spiramycin and erythromycin against 30 caprine Greek isolates of Mycoplasma agalactiae were determined using E test methodology. The E test strips were placed on Eaton's agar medium without antimicrobials and phenol red. MICs were then read by determining where the growth inhibition zone intersected with the MIC scale on the strip. An MIC value of 8 µg/mL was considered as a guide to mycoplasma resistance. All isolates were sensitive to fluoroquinolones (MIC50, 0.19 µg/mL; MIC90, 0.38 µg/mL; highest MIC, 0.5 µg/mL), spectinomycin (MIC50, 0.5 µg/mL; MIC90, 1 µg/mL; highest MIC, 1 µg/mL), and spiramycin (MIC50, 1 µg/mL; MIC90, 1.5 µg/mL; highest MIC, 2 µg/mL). Two strains exhibited resistance to tetracycline (MIC 32 µg/mL) but these were not found to carry any of the tet(M), tet(O), and tet(S) resistance genes. Finally all isolates expressed resistance to erythromycin (MIC50, 128 µg/mL; MIC90, >256 µg/mL).
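
    A hedged sketch of how MIC50 and MIC90 summary values are derived from a set of individual E-test readings. The MIC list below is illustrative, not the 30 field isolates of the study, and the rank rule shown is one conventional choice.

```python
# Sketch: MIC50/MIC90 from a list of per-isolate MIC readings (illustrative values).
import math

def mic_percentile(mics: list[float], pct: float) -> float:
    """MIC at or below which pct% of isolates are inhibited (conventional rank rule)."""
    ordered = sorted(mics)
    rank = math.ceil(pct / 100 * len(ordered))   # e.g. 50% of 30 isolates -> 15th value
    return ordered[rank - 1]

mics = [0.125, 0.19, 0.19, 0.19, 0.25, 0.25, 0.38, 0.38, 0.38, 0.5]
print("MIC50 =", mic_percentile(mics, 50), "µg/mL")
print("MIC90 =", mic_percentile(mics, 90), "µg/mL")
```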

  8. Methodology for the design of accelerated stress tests for non-precious metal catalysts in fuel cell cathodes

    Science.gov (United States)

    Sharabi, Ronit; Wijsboom, Yair Haim; Borchtchoukova, Nino; Finkelshtain, Gennadi; Elbaz, Lior

    2016-12-01

    In this work we propose systematic methods for testing non-precious group metal catalyst and support degradation in alkaline fuel cell cathodes. In this case study, we used a cathode composed of a pyrolyzed non-precious metal catalyst (NPMC) on activated carbon. The vulnerabilities of the cathode components were studied in order to develop the methodology and design an accelerated stress test (AST) for an NPMC-based cathode in an alkaline environment. Cyclic voltammetry (CV), chronoamperometry (CA) and impedance spectroscopy (EIS) were used to characterize the electrochemical behavior of the cathode and to follow the changes that occur as a result of exposing the cathodes to extreme operating conditions. A rotating ring disk electrode (RRDE) was used to study the cathode kinetics; Raman spectroscopy and X-ray fluorescence (XRF) were used to study the structural changes in the electrode surface as well as depletion of the catalyst's active sites from the electrode. The changes in the composition of the electrode and catalyst were detected using X-ray diffraction (XRD). For the first time, we show that NPMCs degrade rapidly at low operating potentials whereas the support degrades at high operating potentials, and we developed a tailor-made AST to take these findings into account.

  9. Evaluation of soil erosion rates in the southern half of the Russian Plain: methodology and initial results

    Science.gov (United States)

    Golosov, Valentin; Gusarov, Artem; Litvin, Leonid; Yermolaev, Oleg; Chizhikova, Nelly; Safina, Guzel; Kiryukhina, Zoya

    2017-03-01

    The Russian Plain (RP) is divided into two principally different parts. The northern half of the RP is a predominantly forested area with a low proportion of arable fields. In contrast, the southern half of the RP has a very high proportion of arable land. During the last 30 years, this agricultural region of the RP has experienced considerable land use transformation, and changes in precipitation due to climate change have altered soil erosion rates. This paper describes the use of erosion model calculations and GIS spatial analytical methods for the evaluation of trends in erosion rates in the RP. Climate change (RIHMI World Data Center, 2016), land use transformation and crop rotation modification (Rosstat, 2016; R Core Team, 2016) are the main factors governing erosion rates in the region during recent decades. It was determined that mean annual erosion rates have decreased from 7.3 to 4.1 t ha-1 yr-1 in the forest zone, mostly because of the serious reduction in the surface runoff coefficient during periods of snowmelt. At the same time, erosion rates have increased from 3.9 to 4.6 t ha-1 yr-1 in the steppe zone due to the increasing frequency of heavy rainstorms.

  10. Existing and Past Methods of Test and Rating Standards Related to Integrated Heat Pump Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Reedy, Wayne R. [Sentech, Inc.

    2010-07-01

    This report evaluates existing and past US methods of test and rating standards related to electrically operated air, water, and ground source air conditioners and heat pumps, 65,000 Btu/hr and under in capacity, that potentially incorporate a potable water heating function. Two AHRI (formerly ARI) standards and three DOE waivers were identified as directly related. Six other AHRI standards related to the test and rating of base units were identified as of interest, as they would form the basis of any new comprehensive test procedure. Numerous other AHRI and ASHRAE component test standards were also identified as perhaps being of help in developing a comprehensive test procedure.

  11. Study of creep behaviour in P-doped copper with slow strain rate tensile tests

    Energy Technology Data Exchange (ETDEWEB)

    Xuexing Yao; Sandstroem, Rolf [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Materials Science and Engineering

    2000-08-01

    Pure copper with an addition of phosphorus is planned to be used to construct the canisters for spent nuclear fuel. The copper canisters can be exposed to a creep deformation of up to 2-4% at service temperatures. Ordinary creep strain tests with dead weight loading are generally employed to study creep behaviour; however, it is reported that an initial plastic deformation of 5-15% takes place when loading the creep specimens at lower temperatures. The slow strain rate tensile test is an alternative way to study the creep deformation behaviour of materials. The ordinary creep test and the slow strain rate tensile test give the same information in the secondary creep stage. The advantage of the tensile test is that the starting phase is much more controlled than in a creep test. In a tensile test the initial deformation behaviour can be determined and the initial strain of less than 5% can be modelled. In this study slow strain rate tensile tests at strain rates of 10⁻⁴, 10⁻⁵, 10⁻⁶, and 10⁻⁷/s at 75, 125 and 175 degrees C have been performed on P-doped pure Cu to supplement creep data from conventional creep tests. The deformation behaviour has successfully been modelled. It is shown that slow strain rate tensile tests can be implemented to study the creep deformation behaviour of pure Cu.

  12. SITE project. Phase 1: Continuous data bit-error-rate testing

    Science.gov (United States)

    Fujikawa, Gene; Kerczewski, Robert J.

    1992-01-01

    The Systems Integration, Test, and Evaluation (SITE) Project at NASA LeRC encompasses a number of research and technology areas of satellite communications systems. Phase 1 of this project established a complete satellite link simulator system. The evaluation of proof-of-concept microwave devices, radiofrequency (RF) and bit-error-rate (BER) testing of hardware, testing of remote airlinks, and other tests were performed as part of this first testing phase. This final report covers the test results produced in phase 1 of the SITE Project. The data presented include 20-GHz high-power-amplifier testing, 30-GHz low-noise-receiver testing, amplitude equalization, transponder baseline testing, switch matrix tests, and continuous-wave and modulated interference tests. The report also presents the methods used to measure the RF and BER performance of the complete system. Correlations of the RF and BER data are summarized to note the effects of the RF responses on the BER.
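
    A minimal sketch of the kind of bit-error-rate computation underlying such link testing: compare a received bit stream against the transmitted pattern and report the error ratio. The pseudo-random pattern, length and flip probability below are illustrative assumptions, not the SITE hardware or test plan.

```python
# Sketch of a BER measurement: count bit mismatches between transmitted and received streams.
import random

def bit_error_rate(tx: list[int], rx: list[int]) -> float:
    errors = sum(t != r for t, r in zip(tx, rx))
    return errors / len(tx)

random.seed(0)
tx = [random.randint(0, 1) for _ in range(1_000_000)]   # pseudo-random test pattern
rx = [b ^ (random.random() < 1e-5) for b in tx]         # channel flips roughly 1 bit in 1e5
print(f"measured BER = {bit_error_rate(tx, rx):.2e}")
```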

  13. Optimization of aeration and agitation rate for lipid and gamma linolenic acid production by Cunninghamella bainieri 2A1 in submerged fermentation using response surface methodology.

    Science.gov (United States)

    Saad, Normah; Abdeshahian, Peyman; Kalil, Mohd Sahaid; Yusoff, Wan Mohtar Wan; Hamid, Aidil Abdul

    2014-01-01

    The locally isolated filamentous fungus Cunninghamella bainieri 2A1 was cultivated in a 5 L bioreactor to produce lipid and gamma-linolenic acid (GLA). The optimization was carried out using response surface methodology based on a central composite design. A statistical model, a second-order polynomial model, was adjusted to the experimental data to evaluate the effect of key operating variables, including aeration rate and agitation speed, on lipid production. Process analysis showed that the linear and quadratic effects of agitation intensity significantly influenced the lipid production process (P < 0.01). The quadratic model also indicated that the interaction between aeration rate and agitation speed had a highly significant effect on lipid production (P < 0.01). Experimental results showed that a lipid content of 38.71% was produced in optimum conditions using an airflow rate and agitation speed of 0.32 vvm and 599 rpm, respectively. Similar results revealed that 0.058 g/g gamma-linolenic acid was produced in optimum conditions where a 1.0 vvm aeration rate and a 441.45 rpm agitation rate were used. The regression model confirmed that aeration and agitation were of prime importance for optimum production of lipid in the bioreactor.

  14. Validation of Test Methods for Air Leak Rate Verification of Spaceflight Hardware

    Science.gov (United States)

    Oravec, Heather Ann; Daniels, Christopher C.; Mather, Janice L.

    2017-01-01

    As deep space exploration continues to be the goal of NASA's human spaceflight program, verification of the performance of spaceflight hardware becomes increasingly critical. Suitable test methods for verifying the leak rate of sealing systems are identified in program qualification testing requirements. One acceptable method for verifying the air leak rate of gas pressure seals is the tracer gas leak detector method. In this method, a tracer gas (commonly helium) leaks past the test seal and is transported to the leak detector where the leak rate is quantified. To predict the air leak rate, a helium-to-air conversion factor is applied depending on the magnitude of the helium flow rate. The conversion factor is based on either the molecular mass ratio or the ratio of the dynamic viscosities. The current work was aimed at validating this approach for permeation-level leak rates using a series of tests with a silicone elastomer O-ring. An established pressure decay method with constant differential pressure was used to evaluate both the air and helium leak rates of the O-ring under similar temperature and pressure conditions. The results from the pressure decay tests showed, for the elastomer O-ring, that neither the molecular flow nor the viscous flow helium-to-air conversion factors were applicable. Leak rate tests were also performed using nitrogen and argon as the test gas. Molecular mass and viscosity based helium-to-test gas conversion factors were applied, but did not correctly predict the measured leak rates of either gas. To further this study, the effect of pressure boundary conditions was investigated. Often, pressure decay leak rate tests are performed at a differential pressure of 101.3 kPa with atmospheric pressure on the downstream side of the test seal. In space applications, the differential pressure is similar, but with vacuum as the downstream pressure. The same O-ring was tested at four unique differential pressures ranging from 34.5 to 137.9 kPa.
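
    A hedged sketch of the two textbook helium-to-air conversion factors discussed above, one for free-molecular flow (scaling with the inverse square root of molar mass) and one for viscous flow (scaling with the inverse of dynamic viscosity). The property values are standard handbook numbers near room temperature and are assumptions, not values from the study.

```python
# Sketch: molecular-flow and viscous-flow helium-to-air leak rate conversion factors.
import math

M_HE, M_AIR = 4.0026, 28.966         # molar masses, g/mol
MU_HE, MU_AIR = 1.96e-5, 1.81e-5     # dynamic viscosities, Pa*s (~20 C, handbook values)

# Free-molecular flow: leak rate scales with 1/sqrt(M)
molecular_factor = math.sqrt(M_HE / M_AIR)    # predicted air leak = helium leak * factor

# Viscous (laminar) flow: leak rate scales with 1/viscosity
viscous_factor = MU_HE / MU_AIR               # predicted air leak = helium leak * factor

print(f"molecular-flow He->air factor ~ {molecular_factor:.3f}")
print(f"viscous-flow  He->air factor ~ {viscous_factor:.3f}")
```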

  15. Refinement of the wedge bar technique for compression tests at intermediate strain rates

    Directory of Open Access Journals (Sweden)

    Stander M.

    2012-08-01

    Full Text Available A refined development of the wedge-bar technique [1] for compression tests at intermediate strain rates is presented. The concept uses a wedge mechanism to compress small cylindrical specimens at strain rates in the order of 10 s−1 to strains of up to 0.3. Co-linear elastic impact principles are used to accelerate the actuation mechanism from rest to test speed in under 300 μs while maintaining near uniform strain rates for up to 30 ms, i.e. the transient phase of the test is less than 1% of the total test duration. In particular, new load frame, load cell and sliding anvil designs are presented and shown to significantly reduce the noise generated during testing. Typical dynamic test results for a selection of metals and polymers are reported and compared with quasistatic and split Hopkinson pressure bar results.

  16. Accuracy of cited "facts" in medical research articles: A review of study methodology and recalculation of quotation error rate.

    Science.gov (United States)

    Mogull, Scott A

    2017-01-01

    Previous reviews estimated that approximately 20 to 25% of assertions cited from original research articles, or "facts," are inaccurately quoted in the medical literature. These reviews noted that the original studies were dissimilar and only began to compare the methods of the original studies. The aim of this review is to examine the methods of the original studies and provide a more specific rate of incorrectly cited assertions, or quotation errors, in original research articles published in medical journals. Additionally, the estimate of quotation errors calculated here is based on the ratio of quotation errors to quotations examined (a percent) rather than the more prevalent and weighted metric of quotation errors to the references selected. Overall, this resulted in a lower estimate of the quotation error rate in original medical research articles. A total of 15 studies met the criteria for inclusion in the primary quantitative analysis. Quotation errors were divided into two categories: content ("factual") or source (improper indirect citation) errors. Content errors were further subdivided into major and minor errors depending on the degree that the assertion differed from the original source. The rate of quotation errors recalculated here is 14.5% (10.5% to 18.6% at a 95% confidence interval). These content errors are predominantly, 64.8% (56.1% to 73.5% at a 95% confidence interval), major errors or cited assertions in which the referenced source either fails to substantiate, is unrelated to, or contradicts the assertion. Minor errors, which are an oversimplification, overgeneralization, or trivial inaccuracies, are 35.2% (26.5% to 43.9% at a 95% confidence interval). Additionally, improper secondary (or indirect) citations, which are distinguished from calculations of quotation accuracy, occur at a rate of 10.4% (3.4% to 17.5% at a 95% confidence interval).
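
    An illustrative sketch of the recalculated metric described above: a quotation-error rate expressed as errors per quotations examined, with a normal-approximation 95% confidence interval. The counts are hypothetical, not the review's pooled data.

```python
# Sketch: error rate as a proportion with a Wald-style 95% confidence interval (hypothetical counts).
import math

def proportion_ci(errors: int, examined: int, z: float = 1.96) -> tuple[float, float, float]:
    p = errors / examined
    half_width = z * math.sqrt(p * (1 - p) / examined)
    return p, p - half_width, p + half_width

p, lo, hi = proportion_ci(errors=87, examined=600)
print(f"quotation error rate = {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```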

  17. A methodological approach to rapid assessment of a river flood in coastal waters. First test in the Po River delta

    Science.gov (United States)

    Campanelli, Alessandra; Bellafiore, Debora; Bensi, Manuel; Bignami, Francesco; Caccamo, Giuseppe; Celussi, Mauro; Del Negro, Paola; Ferrarin, Christian; Marini, Mauro; Paschini, Elio; Zaggia, Luca

    2014-05-01

    As part of the actions of the flagship project RITMARE (Ricerca ITaliana per il MARE), a daily oceanographic survey was performed on 29th November 2013 in front of the Po River delta (Northern Adriatic Sea). The Po river affects a large part of the Northern Adriatic Sea with strong implications on the circulation and functionality of the basin. Physical-chemical and biological properties of coastal waters were investigated after a moderate flood occurred around 25th-27th November. The cruise activities, carried out using a small research boat, were mainly focused on testing a methodological approach to investigate environmental variability after a flood event in the framework of rapid assessment. The effects of the flood on the coastal waters have been evaluated in the field using operational forecasts and real-time satellite imagery to assist field measurements and samplings. Surface satellite chlorophyll maps and surface salinity and current maps obtained from a numerical model forced by meteorological forecast and river data were analyzed to better identify the Po plume dispersion during and after the event in order to better locate offshore monitoring stations at sea. Profiles of Temperature, Salinity, Turbidity, Fluorescence and Colored Dissolved Organic Matter (CDOM) throughout the water column were collected at 7 stations in front of the Po River delta. Sea surface water samples were also collected for the analysis of nutrients, Dissolved Organic Carbon (DOC) and CDOM (surface and bottom). The CDOM regulates the penetration of UV light throughout the water column and mediates photochemical reactions, playing an important role in many marine biogeochemical processes. Satellite images showed a strong color front that separates the higher-chlorophyll coastal water from the more oligotrophic mid-basin and eastern boundary Adriatic waters. In front of the river mouth, the surface layer was characterized by low salinity (14-15) and high turbidity (8-11 NTU).

  18. The Parthenogenetic Cosmopolitan Chironomid, Paratanytarsus grimmii, as a New Standard Test Species for Ecotoxicology: Culturing Methodology and Sensitivity to Aqueous Pollutants.

    Science.gov (United States)

    Gagliardi, Bryant S; Long, Sara M; Pettigrove, Vincent J; Hoffmann, Ary A

    2015-09-01

    Chironomids from the genus Chironomus are widely used in laboratory ecotoxicology, but are prone to inbreeding depression, which can compromise test results. The standard Chironomus test species (C. riparius, C. dilutus and C. yoshimatsui) are also not cosmopolitan, making it difficult to compare results between geographic regions. In contrast, the chironomid Paratanytarsus grimmii is cosmopolitan, and not susceptible to inbreeding depression because it reproduces asexually by apomictic parthenogenesis. However, there is no standardised culturing methodology for P. grimmii, and a lack of acute toxicity data for common pollutants (metals and pesticides). In this study, we developed a reliable culturing methodology for P. grimmii. We also determined 24-h first instar LC50s for the metals Cu, Pb, Zn, Cd and the insecticide imidacloprid. By developing this culturing methodology and generating the first acute metal and imidacloprid LC50s for P. grimmii, we provide a basis for using P. grimmii in routine ecotoxicological testing.

  19. An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution

    Science.gov (United States)

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-01-01

    Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value. PMID:27347949

  20. An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution

    Directory of Open Access Journals (Sweden)

    Yanqing Hou

    2016-06-01

    Full Text Available Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value.

  1. An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution.

    Science.gov (United States)

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-06-23

    Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value.
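
    A hedged sketch of how a ratio-test acceptance decision is applied once a critical value has been chosen (for instance from FFRT fitting functions such as those described above). The critical value and residual quadratic forms used here are placeholders, not FFRT-derived numbers.

```python
# Sketch: accept/reject the integer-fixed GNSS solution with a ratio test (placeholder values).
def accept_fixed_solution(q_best: float, q_second: float, critical_value: float) -> bool:
    """Accept the fixed solution when the ratio of the second-best to best candidate's
    residual quadratic form reaches the chosen critical value."""
    ratio = q_second / q_best
    return ratio >= critical_value

# Example residual quadratic forms of the two best integer candidates
print(accept_fixed_solution(q_best=0.8, q_second=2.9, critical_value=2.5))   # True  -> fix
print(accept_fixed_solution(q_best=0.8, q_second=1.5, critical_value=2.5))   # False -> keep float
```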

  2. How should a rainfall-runoff model be parameterized in an almost ungauged catchment? A methodology tested on 609 catchments

    Science.gov (United States)

    Rojas-Serna, Claudia; Lebecherel, Laure; Perrin, Charles; Andréassian, Vazken; Oudin, Ludovic

    2016-06-01

    This paper examines catchments that are almost ungauged, i.e., catchments for which only a small number of point flow measurements are available. In these catchments, hydrologists may still need to simulate continuous streamflow time series using a rainfall-runoff model, and the methodology presented here allows using few point measurements for model parameterization. The method combines regional information (parameter sets of neighboring gauged stations) and local information (contributed by the point measurements) within a framework where the relative weight of each source of information is made dependent on the number of point measurements available. This approach is tested with two different hydrological models on a set of 609 catchments in France. The results show that on average a few flow measurements can significantly improve the simulation efficiency, and that 10 measurements can reduce the performance gap between the gauged and ungauged situations by more than 50%. The added value of regional information progressively decreases until being almost insignificant when sufficient flow measurements are available. Model parameters tend to come closer to the values obtained by calibration in fully gauged conditions as the number of point flow measurements increases.
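
    A hedged sketch of the weighting idea described above: blend a regional parameter estimate with a locally calibrated one, giving the local estimate more weight as more point flow measurements become available. The weighting function, parameter vectors and the "GR4J-like" interpretation are illustrative assumptions, not the paper's formulation.

```python
# Sketch: weight local vs. regional rainfall-runoff parameters by number of point measurements.
import numpy as np

def blended_parameters(regional: np.ndarray, local: np.ndarray, n_measurements: int,
                       half_weight_n: int = 10) -> np.ndarray:
    w_local = n_measurements / (n_measurements + half_weight_n)   # reaches 0.5 at 10 measurements
    return w_local * local + (1.0 - w_local) * regional

regional = np.array([350.0, 0.0, 90.0, 1.7])   # e.g. GR4J-like parameter vector (placeholder)
local = np.array([420.0, -1.2, 60.0, 2.3])     # calibrated on a few point measurements (placeholder)
print(blended_parameters(regional, local, n_measurements=10))
```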

  3. A Methodological Framework for Assessing Agents, Proximate Drivers and Underlying Causes of Deforestation: Field Test Results from Southern Cameroon

    Directory of Open Access Journals (Sweden)

    Sophia Carodenuto

    2015-01-01

    Full Text Available The international debates on REDD+ and the expectations to receive results-based payments through international climate finance have triggered considerable political efforts to address deforestation and forest degradation in many potential beneficiary countries. Whether a country will receive such REDD+ payments is largely contingent on its ability to effectively address the relevant drivers, and to govern the context-dependent agents and forces responsible for forest loss or degradation. Currently, many REDD+ countries are embarking on the necessary analytical steps for their national REDD+ strategies. In this context, a comprehensive understanding of drivers and their underlying causes is a fundamental prerequisite for developing effective policy responses. We developed a methodological framework for assessing the drivers and underlying causes of deforestation and use the Fako Division in Southern Cameroon as a case study to test this approach. The steps described in this paper can be adapted to other geographical contexts, and the results of such assessments can be used to inform policy makers and other stakeholders.

  4. Development of the Speech Intelligibility Rating (SIR) test for hearing aid comparisons.

    Science.gov (United States)

    Cox, R M; McDaniel, D M

    1989-06-01

    The Speech Intelligibility Rating (SIR) Test has been developed for use in clinical comparisons of hearing aid conditions. After listening to a short passage of connected speech, subjects generate a rating proportional to its intelligibility using an equal-appearing interval scale from 0 to 10. Before test passages are presented, the signal-to-babble ratio (SBR) is adjusted to a level that elicits intelligibility ratings of 7-8 for a "setup" passage. Then, with SBR held constant, three or more test passages are rated and the results averaged for each aided condition. This paper describes the generation of recorded test materials and their investigation using normally hearing listeners. Based on these data, a critical difference of about 2 scale intervals is recommended. A future paper will deal with results for hearing-impaired subjects.

  5. Performance and Abuse Testing of 5 Year Old Low Rate and Medium Rate Lithium Thionyl Chloride Cells

    Science.gov (United States)

    Frerker, Rick; Zhang, Wenlin; Jeevarajan, Judith; Bragg, Bobby J.

    2001-01-01

    Most cells survived the 3 amp (A) over-discharge at room temperature for 2 hours. The cell that failed was the LTC-114 after high rate discharge of 500 mA, similar to the results of the 1 A over-discharge test. Most cells opened during the 0.05 Ohm short circuit test without incident, but three LTC-111 cells exploded, apparently due to the lack of a thermal cutoff switch. The LTC-114 cells exposed to a hard short of 0.05 Ohms recovered but the LTC-114 cells exposed to a soft short of 1 Ohm did not. This is probably due to the activation of a resettable fuse during a hard short. Fresh cells tend to survive exposure to higher temperatures than cells previously discharged at high rate (1 Amp). LTC-111 cells tend to vent at lower temperatures than all the LTC-114 cells and the LTC-115 cells that were previously discharged at rates exceeding 1 Amp.

  6. Interventions to Improve Rate of Diabetes Testing Postpartum in Women With Gestational Diabetes Mellitus.

    Science.gov (United States)

    Hamel, Maureen S; Werner, Erika F

    2017-02-01

    Gestational diabetes mellitus (GDM) is one of the most common medical complications of pregnancy. In the USA, four million women are screened annually for GDM in pregnancy in part to improve pregnancy outcomes but also because diagnosis predicts a high risk of future type 2 diabetes mellitus (T2DM). Therefore, among women with GDM, postpartum care should be focused on T2DM prevention. This review describes the current literature aimed to increase postpartum diabetes testing among women with GDM. Data suggest that proactive patient contact via a health educator, a phone call, or even postal mail is associated with higher rates of postpartum diabetes testing. There may also be utility to changing the timing of postpartum diabetes testing. Despite the widespread knowledge regarding the importance of postpartum testing for women with GDM, testing rates remain low. Alternative testing strategies and large randomized trials addressing postpartum testing are warranted.

  7. Field Test Evaluation of Effect on Cone Resistance Caused by Change in Penetration Rate

    DEFF Research Database (Denmark)

    Poulsen, Rikke; Nielsen, Benjaminn Nordahl; Ibsen, Lars Bo

    2012-01-01

    This paper presents how a change in cone penetration rate affects the measured cone resistance during cone penetration testing in silty soils. Regardless of soil type, the standard rate of penetration is 20 mm/s and it is generally accepted that undrained penetration occurs in clay while drained penetration occurs in sand. Large soil samples were collected from the test site in order to determine soil properties in the laboratory. A change in the measured cone resistance occurs by lowering the penetration rate. This is caused by the changes in drainage conditions. Compared to the normal penetration rate of 20 mm/s, this paper illustrates that lowering the penetration rate leads to an increase in the cone resistance from 1...

  8. 40 CFR 53.53 - Test for flow rate accuracy, regulation, measurement accuracy, and cut-off.

    Science.gov (United States)

    2010-07-01

    ... definitions. (1) Sample flow rate means the quantitative volumetric flow rate of the air stream caused by the... the flow rate cut-off test, download the archived data from the test sampler and verify that the...

  9. A rapid in situ respiration test for measuring aerobic biodegradation rates of hydrocarbons in soil.

    Science.gov (United States)

    Hinchee, R E; Ong, S K

    1992-10-01

    An in situ test method to measure the aerobic biodegradation rates of hydrocarbons in contaminated soil is presented. The test method provides an initial assessment of bioventing as a remediation technology for hydrocarbon-contaminated soil. The in situ respiration test consists of ventilating the contaminated soil of the unsaturated zone with air and periodically monitoring the depletion of oxygen (O2) and production of carbon dioxide (CO2) over time after the air is turned off. The test is simple to implement and generally takes about four to five days to complete. The test was applied at eight hydrocarbon-contaminated sites of different geological and climatic conditions. These sites were contaminated with petroleum products or petroleum fuels, except for two sites where the contaminants were primarily polycyclic aromatic hydrocarbons. Oxygen utilization rates for the eight sites ranged from 0.02 to 0.99 percent O2/hour. Estimated biodegradation rates ranged from 0.4 to 19 mg/kg of soil/day. These rates were similar to the biodegradation rates obtained from field and pilot studies using mass balance methods. Estimated biodegradation rates based on O2 utilization were generally more reliable (especially for alkaline soils) than rates based on CO2 production. CO2 produced from microbial respiration was probably converted to carbonate under alkaline conditions.
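
    A hedged sketch of the kind of conversion used to turn a measured in situ O2 utilization rate into an approximate hydrocarbon biodegradation rate. The air-filled porosity, bulk density, oxygen density and hexane-equivalent stoichiometric ratio are illustrative assumptions, not values from the study.

```python
# Sketch: O2 utilization rate (% O2/hour) -> approximate biodegradation rate (mg/kg soil/day).
def biodegradation_rate(
    o2_rate_pct_per_h: float,    # measured O2 utilization, % O2 per hour
    air_filled_porosity: float,  # L air per L soil (assumed)
    bulk_density: float,         # kg soil per L soil (assumed)
    o2_density: float = 1330.0,  # mg O2 per L gas near 20 C (assumed)
    hc_per_o2: float = 1 / 3.5,  # mg hydrocarbon per mg O2, hexane-like stoichiometry (assumed)
) -> float:
    """Approximate biodegradation rate in mg hydrocarbon / kg soil / day."""
    air_volume = air_filled_porosity / bulk_density                       # L air per kg soil
    o2_per_day = o2_rate_pct_per_h / 100 * 24 * air_volume * o2_density   # mg O2 / kg soil / day
    return o2_per_day * hc_per_o2

print(f"{biodegradation_rate(0.5, air_filled_porosity=0.3, bulk_density=1.5):.1f} mg/kg/day")
```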

  10. A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates

    Science.gov (United States)

    An, Qian; Kang, Jian; Song, Ruiguang; Hall, H. Irene

    2016-01-01

    Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV infected person seeks a test for HIV during a particular time interval, given that no previous positive test has been obtained prior to the start of that interval, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases stratified by the HIV infections at different years are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. PMID:26567891
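
    A hedged generative sketch of the two-level structure described above: latent yearly infections are Poisson, and members of an infected cohort are subsequently split each year among AIDS-free HIV diagnosis, AIDS diagnosis, or remaining undiagnosed, with probabilities driven by a testing rate. All parameter values are invented for illustration and this is a forward simulation only, not the paper's posterior computation.

```python
# Sketch: Poisson latent infections (level 1) followed for several years by a
# multinomial split into diagnosis outcomes (level 2). All rates are assumed.
import numpy as np

rng = np.random.default_rng(1)
n_infected = rng.poisson(40_000)   # level 1: latent infections in one year (assumed intensity)
testing_rate = 0.25                # yearly prob. an undiagnosed infected person seeks an HIV test
aids_rate = 0.08                   # yearly prob. of progressing to AIDS while undiagnosed

undiagnosed = n_infected
for year in range(1, 6):           # level 2: follow the cohort forward for five years
    hiv_dx, aids_dx, undiagnosed = rng.multinomial(
        undiagnosed, [testing_rate, aids_rate, 1 - testing_rate - aids_rate])
    print(f"year {year}: AIDS-free HIV dx {hiv_dx}, AIDS dx {aids_dx}, undiagnosed {undiagnosed}")
```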

  11. Classroom Test Report: Compressed Speech Tapes--An Effective Teaching Tool for Increasing Rate and Comprehension.

    Science.gov (United States)

    Berg, Jerry

    Compressed speech tapes--recordings of a voice reading a selection at the normal rate of approximately 150 words per minute that are speeded up to as much as 400 words a minute--can be used successfully with students to provide practice in improving reading rate and comprehension. In a field test conducted with 23 high school juniors and seniors…

  12. Optimization of Aeration and Agitation Rate for Lipid and Gamma Linolenic Acid Production by Cunninghamella bainieri 2A1 in Submerged Fermentation Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Normah Saad

    2014-01-01

    Full Text Available The locally isolated filamentous fungus Cunninghamella bainieri 2A1 was cultivated in a 5 L bioreactor to produce lipid and gamma-linolenic acid (GLA). The optimization was carried out using response surface methodology based on a central composite design. A statistical model, second-order polynomial model, was adjusted to the experimental data to evaluate the effect of key operating variables, including aeration rate and agitation speed on lipid production. Process analysis showed that linear and quadratic effect of agitation intensity significantly influenced lipid production process (P<0.01). The quadratic model also indicated that the interaction between aeration rate and agitation speed had a highly significant effect on lipid production (P<0.01). Experimental results showed that a lipid content of 38.71% was produced in optimum conditions using an airflow rate and agitation speed of 0.32 vvm and 599 rpm, respectively. Similar results revealed that 0.058 (g/g) gamma-linolenic acid was produced in optimum conditions where 1.0 vvm aeration rate and 441.45 rpm agitation rate were used. The regression model confirmed that aeration and agitation were of prime importance for optimum production of lipid in the bioreactor.

  13. Temperature and strain rate effects in high strength high conductivity copper alloys tested in air

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, D.J. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-03-01

    The tensile properties of the three candidate alloys GlidCop™ Al25, CuCrZr, and CuNiBe are known to be sensitive to testing conditions such as strain rate and test temperature. This study was conducted on GlidCop Al25 (2 conditions) and Hycon 3HP (3 conditions) to ascertain the effect of test temperature and strain rate when tested in open air. The results show that the yield strength and elongation of the GlidCop Al25 alloys exhibit a strain rate dependence that increases with temperature. Both the GlidCop and the Hycon 3HP exhibited an increase in strength as the strain rate increased, but the GlidCop alloys proved to be the most strain rate sensitive. The GlidCop alloys failed in a ductile manner irrespective of the test conditions; however, their strength and uniform elongation decreased with increasing test temperature, and the uniform elongation also decreased dramatically at the lower strain rates. The Hycon 3HP alloys proved to be extremely sensitive to test temperature, rapidly losing their strength and ductility when the temperature increased above 250 C. As the test temperature increased and the strain rate decreased, the fracture mode shifted from a ductile transgranular failure to a ductile intergranular failure with very localized ductility. This latter observation is based on the presence of dimples on the grain facets, indicating that some ductile deformation occurred near the grain boundaries. The material failed without any reduction in area at 450 C and 3.9 × 10⁻⁴ s⁻¹, and in several cases failed prematurely.

  14. Relative User Ratings of MMPI-2 Computer-Based Test Interpretations

    Science.gov (United States)

    Williams, John E.; Weed, Nathan C.

    2004-01-01

    There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…

  15. False-positive rates associated with the use of multiple performance and symptom validity tests.

    Science.gov (United States)

    Larrabee, Glenn J

    2014-06-01

    Performance validity test (PVT) error rates using Monte Carlo simulation reported by Berthelson and colleagues (in False positive diagnosis of malingering due to the use of multiple effort tests. Brain Injury, 27, 909-916, 2013) were compared with PVT and symptom validity test (SVT) failure rates in two nonmalingering clinical samples. At a per-test false-positive rate of 10%, Monte Carlo simulation overestimated error rates for: (i) failure of ≥2 out of 5 PVTs/SVT for Larrabee (in Detection of malingering using atypical performance patterns on standard neuropsychological tests. The Clinical Neuropsychologist, 17, 410-425, 2003) and ACS (Pearson, Advanced clinical solutions for use with WAIS-IV and WMS-IV. San Antonio: Pearson Education, 2009) and (ii) failure of ≥2 out of 7 PVTs/SVT for Larrabee (Detection of malingering using atypical performance patterns on standard neuropsychological tests. The Clinical Neuropsychologist, 17, 410-425, 2003; Malingering scales for the Continuous Recognition Memory Test and Continuous Visual Memory Test. The Clinical Neuropsychologist, 23, 167-180, 2009 combined). Monte Carlo overestimation is likely because PVT performances are atypical in pattern or degree for what occurs in actual neurologic, psychiatric, or developmental disorders. Consequently, PVT scores form skewed distributions with performance at ceiling and restricted range, rather than forming a standard normal distribution with mean of 0 and standard deviation of 1.0. These results support the practice of using ≥2 PVT/SVT failures as representing probable invalid clinical presentation.
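
    An illustrative sketch of the independence-based calculation that the abstract argues overestimates real-world false positives: the binomial probability of failing at least 2 of k independent validity tests when each has a 10% per-test false-positive rate. The numbers of tests shown correspond to the 5- and 7-test batteries mentioned above.

```python
# Sketch: P(at least m of k independent tests failed) under a per-test false-positive rate.
from math import comb

def p_fail_at_least(k_tests: int, m_failures: int, per_test_rate: float) -> float:
    return sum(comb(k_tests, j) * per_test_rate**j * (1 - per_test_rate)**(k_tests - j)
               for j in range(m_failures, k_tests + 1))

for k in (5, 7):
    print(f"P(>=2 of {k} failed | p=0.10, independence) = {p_fail_at_least(k, 2, 0.10):.3f}")
```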

  16. STATISTICAL INFERENCE OF WEIBULL DISTRIBUTION FOR TAMPERED FAILURE RATE MODEL IN PROGRESSIVE STRESS ACCELERATED LIFE TESTING

    Institute of Scientific and Technical Information of China (English)

    WANG Ronghua; FEI Heliang

    2004-01-01

    In this note, the tampered failure rate model is generalized from the step-stress accelerated life testing setting to progressive stress accelerated life testing for the first time. For the parametric setting in which the lifetime distribution is Weibull with a scale parameter satisfying the inverse power law equation, maximum likelihood estimation is investigated.

  17. Do firms share the same functional form of their growth rate distribution? A new statistical test

    CERN Document Server

    Lunardi, Josè T; Lillo, Fabrizio; Mantegna, Rosario N; Gallegati, Mauro

    2011-01-01

    We introduce a new statistical test of the hypothesis that a balanced panel of firms have the same growth rate distribution or, more generally, that they share the same functional form of growth rate distribution. We applied the test to European Union and US publicly quoted manufacturing firms data, considering functional forms belonging to the Subbotin family of distributions. While our hypotheses are rejected for the vast majority of sets at the sector level, we cannot reject them at the subsector level, indicating that homogeneous panels of firms could be described by a common functional form of growth rate distribution.

  18. A Test of a Strong Ground Motion Prediction Methodology for the 7 September 1999, Mw=6.0 Athens Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L; Ioannidou, E; Voulgaris, N; Kalogeras, I; Savy, J; Foxall, W; Stavrakakis, G

    2004-08-06

    We test a methodology to predict the range of ground-motion hazard for a fixed magnitude earthquake along a specific fault or within a specific source volume, and we demonstrate how to incorporate this into probabilistic seismic hazard analyses (PSHA). We modeled ground motion with empirical Green's functions. We tested our methodology with the 7 September 1999, Mw=6.0 Athens earthquake: we (1) developed constraints on rupture parameters based on prior knowledge of earthquake rupture processes and sources in the region; (2) generated impulsive point shear source empirical Green's functions by deconvolving out the source contribution of M < 4.0 aftershocks; (3) used aftershocks that occurred throughout the area and not necessarily along the fault to be modeled; (4) ran a sufficient number of scenario earthquakes to span the full variability of ground motion possible; (5) found that our distribution of synthesized ground motions spans what actually occurred and their distribution is realistically narrow; (6) determined that one of our source models generates records that match observed time histories well; (7) found that certain combinations of rupture parameters produced 'extreme' ground motions at some stations; (8) identified that the 'best fitting' rupture models occurred in the vicinity of 38.05° N, 23.60° E with the center of rupture near 12 km depth and near-unilateral rupture towards the areas of high damage, which is consistent with independent investigations; and (9) synthesized strong motion records in high damage areas for which records from the earthquake were not available. We then developed a demonstration PSHA for a source region near Athens utilizing synthesized ground motion rather than traditional attenuation. We synthesized 500 earthquakes distributed throughout the source zone likely to have Mw=6.0 earthquakes near Athens. We assumed an average return period of 1000 years for this

  19. On-Line Flutter Prediction Tool for Wind Tunnel Flutter Testing using Parameter Varying Estimation Methodology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes to develop an on-line flutter prediction tool using the parameter varying estimation (PVE) methodology, called the PVE Toolbox,...

  20. Bit error rate testing of fiber optic data links for MMIC-based phased array antennas

    Science.gov (United States)

    Shalkhauser, K. A.; Kunath, R. R.; Daryoush, A. S.

    1990-06-01

    The measured bit-error-rate (BER) performance of a fiber optic data link to be used in satellite communications systems is presented and discussed. In the testing, the link was measured for its ability to carry high burst rate, serial-minimum shift keyed (SMSK) digital data similar to those used in actual space communications systems. The fiber optic data link, as part of a dual-segment injection-locked RF fiber optic link system, offers a means to distribute these signals to the many radiating elements of a phased array antenna. Test procedures, experimental arrangements, and test results are presented.

  1. A NONPARAMETRIC PROCEDURE OF THE SAMPLE SIZE DETERMINATION FOR SURVIVAL RATE TEST

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Objective This paper proposes a nonparametric procedure of the sample size determination for survival rate test. Methods Using the classical asymptotic normal procedure yields the required homogenetic effective sample size and using the inverse operation with the prespecified value of the survival function of censoring times yields the required sample size. Results It is matched with the rate test for censored data, does not involve survival distributions, and reduces to its classical counterpart when there is no censoring. The observed power of the test coincides with the prescribed power under usual clinical conditions. Conclusion It can be used for planning survival studies of chronic diseases.
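
    A hedged sketch of the two-step logic described above: compute an effective sample size for a test of a survival rate by a classical asymptotic normal procedure, then inflate it for censoring by dividing by an assumed probability of remaining uncensored at the evaluation time. The specific formula, inputs and scipy dependency are generic illustrations, not the paper's exact procedure.

```python
# Sketch: asymptotic-normal effective sample size, then inflation for censoring (generic formula).
from math import ceil, sqrt
from scipy.stats import norm

def required_sample_size(p0: float, p1: float, alpha: float, power: float,
                         p_uncensored: float) -> int:
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    n_eff = ((z_a * sqrt(p0 * (1 - p0)) + z_b * sqrt(p1 * (1 - p1))) / (p1 - p0)) ** 2
    return ceil(n_eff / p_uncensored)    # enlarge the sample to offset expected censoring

print(required_sample_size(p0=0.60, p1=0.75, alpha=0.05, power=0.80, p_uncensored=0.85))
```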

  2. Improving NCLEX-RN pass rates by implementing a testing policy.

    Science.gov (United States)

    Schroeder, Jean

    2013-01-01

    To improve the National Council Licensure Examination for Registered Nurses (NCLEX-RN) pass rates and to address the National League for Nursing Accrediting Commission's outcomes standard, a testing policy was developed and implemented at an associate degree of nursing (ADN) program located in a suburb south of Denver, CO. This article describes the testing policy strategies that were implemented by the ADN faculty to evaluate the curriculum. Strategies used for internal curriculum evaluation addressed test item writing, test blueprinting, and the use of item analysis data to evaluate and improve faculty-designed exams. Strategies used for external curriculum evaluation employed the use of HESI specialty exams that were administered at the completion of each course and HESI Exit Exams that were administered at the completion of the first and second years of the curriculum. These strategies were formalized with the development of a testing policy manual that described the procedures used to implement internal and external curriculum evaluation. To measure the effectiveness of the testing policy, NCLEX-RN outcomes were compared before and after implementing the testing policy. Findings indicated that the mean NCLEX-RN pass rate for the 5 years following implementation of the testing policy was significantly higher than the mean NCLEX-RN pass rate for the 5 years preceding implementation of the testing policy.

  3. How to test brain and brain simulant at ballistic and blast strain rates.

    Science.gov (United States)

    Zhang, Jiangyue; Song, Bo; Pintar, Frank A; Yoganandan, Narayan; Chen, Weinong; Gennarelli, Thomas A

    2008-01-01

    Mechanical properties of brain tissue and brain simulant at strain rates in the range of 1000 s-1 are essential for computational simulation of intracranial responses in ballistic and blast traumatic brain injury. Testing these ultra-soft materials at high strain rates is a challenge for most conventional material testing methods. The current study developed a modified split Hopkinson bar technique that combines several improvements to the conventional split Hopkinson bar: a low-impedance aluminum bar, semiconductor strain gauges, a pulse-shaping technique, and an annular specimen. Feasibility tests were conducted using a brain simulant, Sylgard 527. Stress-strain curves of the simulant were successfully obtained at strain rates of 2600 and 2700 s-1 for strain levels up to 60%. This confirmed the applicability of the Hopkinson bar for mechanical property testing of brain tissue in the ballistic and blast domain.
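
    For readers unfamiliar with split Hopkinson bar data reduction, the classical one-wave relations used to turn the reflected and transmitted bar strains into specimen strain rate, strain and stress can be sketched as below. This is a generic illustration, not the authors' processing code, and the pulse shapes and bar/specimen dimensions are placeholders.

```python
import numpy as np

def shpb_stress_strain(eps_r, eps_t, dt, bar_modulus, bar_area,
                       spec_area, spec_length, bar_wave_speed):
    """Classical (one-wave) split Hopkinson pressure bar data reduction.

    eps_r, eps_t : reflected and transmitted bar strain histories (arrays)
    Returns specimen strain rate, strain, and stress histories.
    """
    strain_rate = -2.0 * bar_wave_speed / spec_length * eps_r
    strain = np.cumsum(strain_rate) * dt                  # simple time integration
    stress = bar_modulus * bar_area / spec_area * eps_t   # force equilibrium assumed
    return strain_rate, strain, stress

# Illustrative (placeholder) pulses for an aluminum bar and a thin annular specimen:
t = np.linspace(0.0, 200e-6, 400)
eps_r = -0.002 * np.sin(np.pi * t / 200e-6)
eps_t = 0.0004 * np.sin(np.pi * t / 200e-6)
rate, strain, stress = shpb_stress_strain(eps_r, eps_t, dt=t[1] - t[0],
                                          bar_modulus=70e9, bar_area=2.85e-4,
                                          spec_area=1.0e-4, spec_length=2.0e-3,
                                          bar_wave_speed=5000.0)
```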

  4. Development of a calibration methodology and tests of kerma area product meters; Desenvolvimento de uma metodologia de calibracao e testes de medidores de produto Kerma-Area

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Nathalia Almeida

    2013-07-01

    The quantity kerma-area product (PKA) is important for establishing reference levels in diagnostic radiology examinations. This quantity can be obtained using a PKA meter. Such meters are essential for evaluating the radiation dose in radiological procedures and are a good indicator that the dose limit to the patient's skin is not exceeded. These meters often come fixed to the X-ray equipment, which makes their calibration difficult. In this work, a methodology for the calibration of PKA meters was developed. The instrument used for this purpose was the Patient Dose Calibrator (PDC), which was developed to be used as a reference to check the calibration of PKA and air-kerma meters used for patient dosimetry and to verify the consistency and behavior of automatic exposure control systems. Because it is a new instrument that is not yet used as reference equipment for calibration in Brazil, quality control of the PDC was also performed, comprising characterization tests, calibration, and an evaluation of its energy dependence. The tests showed that the PDC can be used as a reference instrument and that the calibration must be performed in situ, so that the characteristics of each X-ray unit where the PKA meters are used are taken into account. The calibration was then performed with portable PKA meters and on an interventional radiology unit with a fixed PKA meter. The results were good and demonstrated the need for calibration of these meters and the importance of in situ calibration with a reference meter. (author)

  5. AMOVA ["Accumulative Manifold Validation Analysis"]: An Advanced Statistical Methodology Designed to Measure and Test the Validity, Reliability, and Overall Efficacy of Inquiry-Based Psychometric Instruments

    Science.gov (United States)

    Osler, James Edward, II

    2015-01-01

    This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a type of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…

  6. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specifically addressed are…

  7. The relationship of regional hemoglobin A1c testing and amputation rate among patients with diabetes.

    Science.gov (United States)

    Newhall, Karina A; Bekelis, Kimon; Suckow, Bjoern D; Gottlieb, Daniel J; Farber, Adrienne E; Goodney, Philip P; Skinner, Jonathan S

    2017-04-01

    Objective The risk of leg amputation among patients with diabetes has declined over the past decade, while use of preventative measures, such as hemoglobin A1c monitoring, has increased. However, the relationship between hemoglobin A1c testing and amputation risk remains unclear. Methods We examined annual rates of hemoglobin A1c testing and major leg amputation among Medicare patients with diabetes from 2003 to 2012 across 306 hospital referral regions. We created linear regression models to study associations between hemoglobin A1c testing and lower extremity amputation. Results From 2003 to 2012, the proportion of patients who received hemoglobin A1c testing increased 10% (74% to 84%), while their rate of lower extremity amputation decreased 50% (430 to 232/100,000 beneficiaries). Regional hemoglobin A1c testing weakly correlated with crude amputation rate in both years (2003 R = -0.20, 2012 R = -0.21), and further weakened with adjustment for age, sex, and disability status (2003 R = -0.11, 2012 R = -0.17). In a multivariable model of 2012 amputation rates, hemoglobin A1c testing was not a significant predictor. Conclusion Lower extremity amputation among patients with diabetes nearly halved over the past decade but only weakly correlated with hemoglobin A1c testing throughout the study period. Better metrics are needed to understand the relationship between preventative care and amputation.
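
    The region-level analysis described above amounts to correlating a testing rate with an amputation rate across 306 regions and fitting a simple linear model. A minimal sketch of that kind of computation is given below; the synthetic data and effect size are purely illustrative and bear no relation to the Medicare results.

```python
import numpy as np

# Hypothetical region-level data: fraction tested and amputations per 100,000.
rng = np.random.default_rng(0)
pct_tested = rng.uniform(0.70, 0.90, size=306)
amputation_rate = 300 - 150 * pct_tested + rng.normal(0, 40, size=306)

# Pearson correlation across regions.
r = np.corrcoef(pct_tested, amputation_rate)[0, 1]

# Simple least-squares fit of amputation rate on the regional testing fraction.
slope, intercept = np.polyfit(pct_tested, amputation_rate, deg=1)
print(f"r = {r:.2f}, slope = {slope:.1f} per unit change in testing fraction")
```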

  8. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures. However, special problems when making thermal-neutron fluence rate measurements in high-...
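
    The basic activation relation that underlies all of these techniques can be illustrated with a short calculation: the measured foil activity is corrected for build-up during irradiation and for decay before counting, then divided by the number of target atoms and the activation cross-section. The sketch below is a simplified illustration only (no self-shielding, cadmium-ratio or spectrum corrections, as the standard would require); the numerical values are placeholders loosely based on a gold foil.

```python
import math

def thermal_fluence_rate(activity_bq, n_target_atoms, sigma_cm2,
                         half_life_s, t_irradiation_s, t_decay_s):
    """Thermal-neutron fluence rate (n cm^-2 s^-1) from a foil activation measurement.

    activity_bq    : activity measured at counting time
    n_target_atoms : number of target atoms in the foil
    sigma_cm2      : thermal activation cross-section (cm^2)
    """
    lam = math.log(2.0) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irradiation_s)   # build-up during irradiation
    decay = math.exp(-lam * t_decay_s)                     # decay before counting
    return activity_bq / (n_target_atoms * sigma_cm2 * saturation * decay)

# Illustrative (placeholder) gold-foil numbers:
phi = thermal_fluence_rate(activity_bq=5.0e4, n_target_atoms=3.0e19,
                           sigma_cm2=98.65e-24, half_life_s=2.7 * 86400,
                           t_irradiation_s=3600, t_decay_s=1800)
print(f"thermal fluence rate ~ {phi:.3e} n/cm^2/s")
```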

  9. The Term Structure of Interest Rates and its Impact on the Liability Adequacy Test for Insurance Companies in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio Aurelio Duarte

    2015-08-01

    The Brazilian regulation for applying the Liability Adequacy Test (LAT) to technical provisions in insurance companies requires that the current estimate be discounted by a term structure of interest rates (hereafter TSIR). This article aims to analyze the LAT results derived from the use of various models to build the TSIR: the cubic spline interpolation technique, Svensson's model (adopted by the regulator) and Vasicek's model. In order to achieve the proposed objective, the rates from BM&FBOVESPA trading days were used to model the TSIR and, consequently, to discount the cash flows of the insurance company. The results indicate that: (i) the LAT is sensitive to the choice of the model used to build the TSIR; (ii) this sensitivity increases with cash flow longevity; (iii) the adoption of an ultimate forward rate (UFR) for the Brazilian insurance market should be evaluated by the regulator in order to stabilize the trajectory of the yield curve at longer maturities. The technical provision is among the main solvency items of insurance companies, and the LAT result is a significant indicator of the quality of this provision, as it evaluates its sufficiency or insufficiency. Thus, this article bridges a gap in the Brazilian actuarial literature, introducing the main methodologies available for modeling the yield curve and a practical application to analyze the impact of this choice on the LAT.
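
    Of the three curve-building approaches compared, Svensson's model has a simple closed form for the spot rate, so the discounting step can be sketched directly. The parameter values below are placeholders, not the regulator's calibration, and the cash flows are illustrative.

```python
import numpy as np

def svensson_yield(t, b0, b1, b2, b3, tau1, tau2):
    """Svensson (1994) spot-rate curve; t is time to maturity in years (t > 0)."""
    t = np.asarray(t, dtype=float)
    x1, x2 = t / tau1, t / tau2
    f1 = (1 - np.exp(-x1)) / x1
    f2 = f1 - np.exp(-x1)
    f3 = (1 - np.exp(-x2)) / x2 - np.exp(-x2)
    return b0 + b1 * f1 + b2 * f2 + b3 * f3

# Illustrative parameter set and liability cash flows (placeholders only):
maturities = np.array([0.5, 1, 2, 5, 10, 20, 30])
curve = svensson_yield(maturities, b0=0.11, b1=-0.02, b2=0.01, b3=0.005,
                       tau1=1.5, tau2=8.0)
cash_flows = np.full_like(maturities, 100.0)
present_value = np.sum(cash_flows * np.exp(-curve * maturities))
print(f"current estimate (discounted cash flows) = {present_value:.2f}")
```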

  10. Understanding High Recession Rates of Carbon Ablators Seen in Shear Tests in an Arc Jet

    Science.gov (United States)

    Driver, David M.; Olson, Michael W.; Barnhardt, Michael D.; MacLean, Matthew

    2010-01-01

    High rates of recession in arc jet shear tests of Phenolic Impregnated Carbon Ablator (PICA) inspired a series of tests and analyses of FiberForm (a carbon preform used in the fabrication of PICA). Arc jet tests were performed on FiberForm in both air and pure nitrogen, in stagnation and shear configurations. The nitrogen tests showed little or no recession, while the air tests of FiberForm showed recession rates similar to those of PICA (when adjusted for the difference in density). While mechanical erosion cannot be ruled out, this is the first step in doing so. Analysis using a carbon oxidation boundary condition within DPLR was used to predict the recession rate of FiberForm. The analysis indicates that much of the anomalous recession behavior seen in shear tests may simply be an artifact of the non-flight-like test configuration (copper upstream of the test article), a result of dissimilar enthalpy and oxygen concentration profiles over the copper. Shape change effects were also investigated and shown to be relatively small.

  11. Robust Adaptive Rate-Optimal Testing for the White Noise Hypothesis

    CERN Document Server

    Guay, Alain; Lazarova, Stepana

    2011-01-01

    A new test is proposed for the weak white noise null hypothesis. The test is based on an automatic choice of the order for a Box-Pierce or Hong test statistic. The simplest version of the test uses Lobato (2001) or Kuan and Lee (2006) HAC critical values, but the procedure is flexible enough to improve the detection properties of any prescribed test. This allows, for instance, calibrating the test for optimal detection of specific alternatives as in Delgado and Velasco (2010a). The data-driven order choice is tailored to give a test which achieves adaptive rate-optimality against several classes of alternatives, namely (i) alternatives with a large enough number of autocorrelation coefficients converging to 0 faster than the parametric rate; (ii) alternatives with a "peak and valley" spectral density function. A simulation experiment leads to prefer the Box-Pierce version of the test, both under the null and the alternative. An application to daily exchange rate returns illustrates the usefulness of the prop...
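
    A minimal illustration of the kind of portmanteau statistic the automatic order choice is built on is given below: Ljung-Box statistics are computed over a range of orders and the most significant order is picked. This naive selection is only a sketch and inflates the test size, which is exactly what the paper's data-driven calibration and HAC critical values are designed to handle.

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(x, max_lag):
    """Ljung-Box portmanteau statistics Q(p) and p-values for p = 1..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    acov = np.correlate(x, x, mode="full")[n - 1:] / n     # biased autocovariances
    acf = acov[1:max_lag + 1] / acov[0]
    q = n * (n + 2) * np.cumsum(acf ** 2 / (n - np.arange(1, max_lag + 1)))
    pvals = chi2.sf(q, df=np.arange(1, max_lag + 1))
    return q, pvals

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    returns = rng.standard_normal(1000)        # white-noise placeholder series
    q, p = ljung_box(returns, max_lag=20)
    p_star = int(np.argmin(p)) + 1             # crude "automatic" order choice
    print(f"p* = {p_star}, Q = {q[p_star - 1]:.2f}, p-value = {p[p_star - 1]:.3f}")
```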

  12. Interlaboratory test study for ASTM E 2008 volatility rate by thermogravimetry

    Energy Technology Data Exchange (ETDEWEB)

    Kwok, Q.S.M.; Vachon, M.; Jones, D.E.G.

    2003-11-01

    The Canadian Explosives Research Laboratory (CERL) led an interlaboratory test (ILT) to assess the stability of solids and liquids at given temperatures using thermogravimetry under specific experimental conditions. The objective was to determine the number of repetitive measurements needed on fresh specimens in order to satisfy end-use requirements. The study involved isothermal and constant heating rate tests to determine the volatility rates for camphor at 333 K and for squalane at 573 K using ASTM Standard Test Method E 2008, Volatility Rate by Thermogravimetry. This paper lists the participating laboratories, the scientists, and their locations. Each laboratory conducted mass and temperature calibrations according to ASTM Standard E 1582 and the manufacturer's recommendations. Five replicates were obtained from each laboratory, and the volatility rates for water were determined at 323 and 353 K using the Method B Constant Heating Rate Test. The results from 8 laboratories were statistically analyzed using the ASTM E 691 Interlaboratory Data Analysis Software. The report includes a table of results for volatility rates for camphor, squalane, water at 323 K and water at 353 K. 4 tabs., 9 appendices.

  13. Advances in VLSI testing at MultiGb per second rates

    Directory of Open Access Journals (Sweden)

    Topisirović Dragan

    2005-01-01

    Today's high-performance manufacturing of digital systems requires VLSI testing at speeds of multiple gigabits per second (multi-Gbps). Testing at Gbps rates needs high transfer rates among channels and functional units, and requires readdressing data format and communication within a serial mode. This implies that a physical phenomenon, jitter, becomes essential to tester operation. This establishes a functional and design shift, which in turn dictates a corresponding shift in test and DFT (Design for Testability) methods. We review various approaches here and discuss the tradeoffs in testing actual devices. For industry, the volume-production testing of multigigahertz devices poses economic challenges. A particular solution based on conventional ATE (Automated Test Equipment) resources will be discussed; it allows accurate testing of ICs with many channels, and these systems can test ICs at 2.5 Gbps over 144 channels, with planned extensions that will reach test rates exceeding 5 Gbps. Yield improvement requires understanding failures and identifying potential sources of yield loss. This text focuses on the diagnosis of random logic circuits and the classification of faults. An interesting scan-based diagnosis flow, which leverages the ATPG (Automatic Test Pattern Generator) patterns originally generated for fault coverage, will be described. This flow shows an adequate link between the design automation tools and the testers, and a correlation between the ATPG patterns and the tester failure reports.

  14. Specificity and false positive rates of the Test of Memory Malingering, Rey 15-item Test, and Rey Word Recognition Test among forensic inpatients with intellectual disabilities.

    Science.gov (United States)

    Love, Christopher M; Glassmire, David M; Zanolini, Shanna Jordan; Wolf, Amanda

    2014-10-01

    This study evaluated the specificity and false positive (FP) rates of the Rey 15-Item Test (FIT), Word Recognition Test (WRT), and Test of Memory Malingering (TOMM) in a sample of 21 forensic inpatients with mild intellectual disability (ID). The FIT demonstrated an FP rate of 23.8% with the standard quantitative cutoff score. Certain qualitative error types on the FIT showed promise and had low FP rates. The WRT obtained an FP rate of 0.0% with previously reported cutoff scores. Finally, the TOMM demonstrated low FP rates of 4.8% and 0.0% on Trial 2 and the Retention Trial, respectively, when applying the standard cutoff score. FP rates are reported for a range of cutoff scores and compared with published research on individuals diagnosed with ID. Results indicated that although the quantitative variables on the FIT had unacceptably high FP rates, the TOMM and WRT had low FP rates, increasing the confidence clinicians can place in scores reflecting poor effort on these measures during ID evaluations.

  15. Adequacy of the tetrazolium test methodology for Hymenachne amplexicaulis seeds

    Directory of Open Access Journals (Sweden)


    2012-10-01

    Hymenachne amplexicaulis is a grass considered a weed in flooded rice. Its seeds measure about three millimeters in length, and a panicle can produce a large number of viable seeds. This study aimed to establish an appropriate methodology for evaluating the viability of H. amplexicaulis seeds using the tetrazolium test. The trials, in a completely randomized design, were conducted in the Herbology Didactic Laboratory and the Didactic and Research Seed Laboratory of the Universidade Federal de Santa Maria in 2010. For the germination test, seeds protected by their glumes and seeds without them (naked) were exposed to potassium nitrate at a concentration of 0.2% or to distilled water (control). For the tetrazolium test, naked seeds were immersed directly in water for periods of 6, 12 and 24 hours at 23°C ± 1°C. Subsequently, the seeds were incubated in a solution of 2,3,5-triphenyl tetrazolium chloride at a concentration of 0.5% at this temperature, and the evolution of the coloration was monitored. The seeds were classified as viable or non-viable according to predetermined parameters, followed by data analysis. Imbibition of seeds without glumes for six hours, with subsequent immersion in a 0.5% tetrazolium solution for four hours, was effective in determining the viability of the seeds.

  16. Evaluation of susceptibility of high strength steels to delayed fracture by using cyclic corrosion test and slow strain rate test

    Energy Technology Data Exchange (ETDEWEB)

    Li Songjie [School of Materials Science and Engineering, University of Science and Technology Beijing, No. 30 Xueyuan Road, Hidian Zone, Beijing 100083 (China); Structural Metals Center, National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, Ibaraki 305-0047 (Japan); Zhang Zuogui [Structural Metals Center, National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, Ibaraki 305-0047 (Japan); Akiyama, Eiji [Structural Metals Center, National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, Ibaraki 305-0047 (Japan)], E-mail: AKIYAMA.Eiji@nims.go.jp; Tsuzaki, Kaneaki [Structural Metals Center, National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, Ibaraki 305-0047 (Japan); Zhang Boping [School of Materials Science and Engineering, University of Science and Technology Beijing, No. 30 Xueyuan Road, Hidian Zone, Beijing 100083 (China)

    2010-05-15

    To evaluate the susceptibility of high strength steels to delayed fracture, slow strain rate tests (SSRT) of notched bar specimens of AISI 4135 with tensile strengths of 1300 and 1500 MPa and of a boron-bearing steel with 1300 MPa were performed after a cyclic corrosion test (CCT). During the SSRT the humidity around the specimen was kept high to retain the absorbed diffusible hydrogen. The fracture stresses of the AISI 4135 steels decreased with increasing diffusible hydrogen content, which rose with the number of CCT cycles. Their delayed fracture susceptibilities could be successfully evaluated by considering both the influence of hydrogen content on mechanical properties and hydrogen entry.

  17. PVRC/MPC Round Robin Tests for the Low Toughness High-Copper 72W Weld Using Master Curve Methodology of PCVN Specimens

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Bong-Sang; Hong, Jun Hwa; Yang, Won Jon

    2000-06-01

    This report summarizes the results obtained from the Korean contribution to the PVRC/MPC cooperative program on 'Round Robin Tests for the Low Toughness High-Copper 72W Weld Using Master Curve Methodology of PCVN Specimens'. The mandatory part of this program is to perform fracture toughness (K_Jc) tests on the low toughness 72W weld at three different temperatures using pre-cracked Charpy specimens. The purpose of the tests is to verify the specimen size requirements in ASTM E 1921, 'Standard test method for determination of reference temperature, T_0, for ferritic steels in the transition range'.
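
    For orientation, the master-curve relations standardized in ASTM E 1921 are easy to state: the median toughness follows a fixed exponential shape anchored at the reference temperature T_0, and measured K_Jc values are size-adjusted to the 1T thickness before T_0 is estimated. The sketch below shows those two standard formulas with illustrative numbers; it is not the round-robin analysis itself.

```python
import math

def kjc_median(temp_c: float, t0_c: float) -> float:
    """ASTM E 1921 master-curve median toughness for 1T specimens, MPa*sqrt(m)."""
    return 30.0 + 70.0 * math.exp(0.019 * (temp_c - t0_c))

def kjc_size_adjust(kjc_x: float, b_x_mm: float, b_1t_mm: float = 25.4) -> float:
    """Convert a K_Jc value measured on thickness b_x to its 1T equivalent."""
    return 20.0 + (kjc_x - 20.0) * (b_x_mm / b_1t_mm) ** 0.25

# Illustrative pre-cracked Charpy (10 mm thick) result and median-curve value:
print(kjc_size_adjust(kjc_x=120.0, b_x_mm=10.0))
print(kjc_median(temp_c=-60.0, t0_c=-50.0))
```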

  18. Defining the development of strength and anaerobic abilities in jump tests: classification, measurement methodology, and estimation norms for the standing high jump

    Directory of Open Access Journals (Sweden)

    Leonid Serhiyenko

    2015-10-01

    Purpose: to define the methodology for carrying out the standing high jump test and to systematize general approaches to measuring strength and anaerobic abilities in humans. Material and Methods: methods of theoretical analysis and generalization and methods of searching and studying scientific information were used. Results: a classification of standing high jumps was established that differentiates jumps according to the way they are performed and the motor abilities they assess. Conclusion: the methodology for performing different kinds of jumps is described.

  19. Bethesda System reporting rates for conventional Papanicolaou tests and liquid-based cytology in a large Chinese, College of American Pathologists-certified independent medical laboratory: analysis of 1,394,389 Papanicolaou test reports.

    Science.gov (United States)

    Zheng, Baowen; Austin, R Marshall; Liang, Xiaoman; Li, Zaibo; Chen, Congde; Yan, Shanshan; Zhao, Chengquan

    2015-03-01

    Reports that use the Bethesda System categories for Chinese Papanicolaou test results are rare. To document and analyze rates reported in the Bethesda System for conventional Papanicolaou tests and liquid-based cytology between 2007 and 2012 in China's largest College of American Pathologists-accredited laboratory. Results from 1,394,389 Papanicolaou tests, rendered between 2007 and 2012 by the Guangzhou Kingmed Diagnostics Cytology Laboratory, were documented by the Bethesda System report categories and Papanicolaou test methodology, which included both conventional Papanicolaou tests and 4 different liquid-based cytology preparations. Results were documented for 326,297 conventional Papanicolaou tests and 1,068,092 liquid-based cytology specimens, which included 928,884 ThinPrep (Hologic, Bedford, Massachusetts), 63,465 SurePath (BD Diagnostics, Franklin Lakes, New Jersey), 50,422 Liqui-Prep (LGM International, Melbourne, Florida), and 25,321 Lituo liquid-cytology (Lituo Biotechnology Co, Hunan, China) specimens. Abnormality rates reported were significantly higher with liquid-based cytology than they were with conventional Papanicolaou tests in all the Bethesda System categories (P < .001). Reporting rates were within the 2006 benchmark ranges from the College of American Pathologists, except for atypical glandular cells (low) and unsatisfactory rates for conventional Papanicolaou tests (low). Participation in the international College of American Pathologists Laboratory Accreditation Program provides laboratory quality standards not otherwise available in many international settings.

  20. A facility for the test of large area muon chambers at high rates

    CERN Document Server

    Agosteo, S; Belli, G; Bonifas, A; Carabelli, V; Gatignon, L; Hessey, N P; Maggi, M; Peigneux, J P; Reithler, H; Silari, Marco; Vitulo, P; Wegner, M

    2000-01-01

    Operation of large area muon detectors at the future Large Hadron Collider (LHC) will be characterized by large sustained hit rates over the whole area, reaching the range of kHz/cm2. We describe a dedicated test zone built at CERN to test the performance and the aging of the muon chambers currently under development. A radioactive source delivers photons causing the sustained rate of random hits, while a narrow beam of high energy muons is used to directly calibrate the detector performance. A system of remotely controlled lead filters serves to vary the rate of photons over four orders of magnitude, to allow the study of performance as a function of rate.

  1. Continuous Time Models of Interest Rate: Testing the Mexican Data (1998-2006)

    OpenAIRE

    Jose Luis de la Cruz; Elizabeth Ortega.

    2007-01-01

    Distinct parametric models in continuous time for interest rates are tested by means of a comparative analysis of the implied parametric and nonparametric densities. In this research the statistic developed by Ait-Sahalia (1996a) has been applied to the Mexican CETES (28-day) interest rate over the period 1998-2006. With this technique, the discrete approximation to the continuous model is unnecessary even when the data are discrete. The results allow us to affirm that the models of interest ...

  2. The rate test of speciation: estimating the likelihood of non-allopatric speciation from reproductive isolation rates in Drosophila.

    Science.gov (United States)

    Yukilevich, Roman

    2014-04-01

    Among the most debated subjects in speciation is the question of its mode. Although allopatric (geographical) speciation is assumed the null model, the importance of parapatric and sympatric speciation is extremely difficult to assess and remains controversial. Here I develop a novel approach to distinguish these modes of speciation by studying the evolution of reproductive isolation (RI) among taxa. I focus on the Drosophila genus, for which measures of RI are known. First, I incorporate RI into age-range correlations. Plots show that almost all cases of weak RI are between allopatric taxa whereas sympatric taxa have strong RI. This either implies that most reproductive isolation (RI) was initiated in allopatry or that RI evolves too rapidly in sympatry to be captured at incipient stages. To distinguish between these explanations, I develop a new "rate test of speciation" that estimates the likelihood of non-allopatric speciation given the distribution of RI rates in allopatry versus sympatry. Most sympatric taxa were found to have likely initiated RI in allopatry. However, two putative candidate species pairs for non-allopatric speciation were identified (5% of known Drosophila). In total, this study shows how using RI measures can greatly inform us about the geographical mode of speciation in nature.

  3. The corrosion rate of copper in a bentonite test package measured with electric resistance sensors

    Energy Technology Data Exchange (ETDEWEB)

    Rosborg, Bo [Division of Surface and Corrosion Science, KTH, Stockholm (Sweden); Kosec, Tadeja; Kranjc, Andrej; Kuhar, Viljem; Legat, Andraz [Slovenian National Building and Civil Engineering Institute, Ljubljana (Slovenia)

    2012-12-15

    LOT1 test parcel A2 was exposed for six years in the Aespoe Hard Rock Laboratory, which offers a realistic environment for the conditions that will prevail in a deep repository for high-level radioactive waste disposal in Sweden. The test parcel contained copper electrodes for real-time corrosion monitoring in bentonite ring 36, where the temperature was 24 °C, and copper coupons in bentonite rings 22 and 30, where the temperature was higher. After retrieval of the test parcel in January 2006, a bentonite test package consisting of bentonite rings 35-37 was placed in a container and sealed with a thick layer of paraffin. Later the same year new copper electrodes were installed in the test package. In January 2007 electric resistance (ER) sensors of pure copper with a thickness of 35 μm were also installed in the test package, mainly to facilitate the interpretation of the results from the real-time corrosion monitoring with electrochemical techniques. The ER measurements have shown that the corrosion rate of pure copper exposed in an oxic bentonite/saline groundwater environment at room temperature decreases slowly with time to low but measurable values. The corrosion rates estimated from the regularly performed EIS measurements replicate the ER data. Thus, for this oxic environment, in which copper acquires corrosion potentials of the order of 200 mV (SHE) or higher, electrochemical measurements provide believable data. Comparing the recorded ER data with an estimate of the average corrosion rate based on comparing cross-sections from exposed and protected sensor elements, it is obvious that the former overestimates the actual corrosion rate, which is understandable. It seems as if electrochemical measurements can provide a better estimate of the corrosion rate; however, this is quite dependent on the use of proper measuring frequencies and evaluation methods. In this respect ER measurements are more reliable. It has been shown that real-time corrosion
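
    The principle of the ER sensors mentioned above can be illustrated with a short calculation: the resistance of the exposed element, normalized by a protected reference element to compensate for temperature, is inversely proportional to its remaining thickness, so the resistance ratio gives the metal loss and hence an average corrosion rate. The readings below are placeholders, not LOT1 data.

```python
def er_corrosion_rate(d0_um, ratio_t0, ratio_t, elapsed_days):
    """Metal loss and average corrosion rate from an electric-resistance sensor.

    ratio_t0, ratio_t : exposed/reference element resistance ratio at the start
                        and at the current reading; the ratio rises as the
                        exposed element thins.
    """
    d_remaining = d0_um * ratio_t0 / ratio_t
    metal_loss_um = d0_um - d_remaining
    rate_um_per_year = metal_loss_um / elapsed_days * 365.25
    return metal_loss_um, rate_um_per_year

# Illustrative readings for a 35 um copper element after two years of exposure:
loss, rate = er_corrosion_rate(d0_um=35.0, ratio_t0=1.000, ratio_t=1.012,
                               elapsed_days=730)
print(f"metal loss = {loss:.2f} um, corrosion rate = {rate:.2f} um/year")
```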

  4. DISCREPANCIES IN THE REGRESSION MODELLING OF RECRYSTALLIZATION RATE AS USING THE DATA FROM PHYSICAL SIMULATION TESTS

    Institute of Scientific and Technical Information of China (English)

    L.P. Karjalainen; M.C. Somani; S.F. Medina

    2004-01-01

    The analysis of numerous experimental equations published in the literature reveals a wide scatter in the predictions for the static recrystallization kinetics of steels. The powers of the deformation variables, strain and strain rate, as well as the power of the grain size, vary among these equations. These differences are highlighted and typical values are compared between torsion and compression tests. Potential errors in physical simulation testing are discussed.

  5. An Evaluation of the Euroncap Crash Test Safety Ratings in the Real World

    OpenAIRE

    2007-01-01

    We investigated whether the rating obtained in the EuroNCAP test procedures correlates with injury protection to vehicle occupants in real crashes using data in the UK Cooperative Crash Injury Study (CCIS) database from 1996 to 2005. Multivariate Poisson regression models were developed, using the Abbreviated Injury Scale (AIS) score by body region as the dependent variable and the EuroNCAP score for that particular body region, seat belt use, mass ratio and Equivalent Test Speed (ETS) as ind...
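
    A minimal sketch of the modelling approach described (a Poisson regression of the body-region AIS score on the EuroNCAP score and crash covariates) is given below. The column names and the handful of rows are entirely hypothetical; they only show the shape of such a model, assuming the statsmodels formula interface.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical occupant-level data; column names and values are illustrative only.
crashes = pd.DataFrame({
    "ais_head":      [0, 1, 2, 0, 3, 1, 0, 2],
    "euroncap_head": [3.2, 2.1, 1.0, 3.8, 0.5, 2.5, 3.5, 1.2],
    "belted":        [1, 1, 0, 1, 0, 1, 1, 0],
    "mass_ratio":    [1.0, 0.8, 0.7, 1.1, 0.6, 0.9, 1.2, 0.75],
    "ets_kmh":       [40, 55, 60, 35, 70, 50, 30, 65],
})

# Poisson GLM with the AIS score as the count-like dependent variable.
model = smf.glm("ais_head ~ euroncap_head + belted + mass_ratio + ets_kmh",
                data=crashes, family=sm.families.Poisson()).fit()
print(model.summary())
```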

  6. Pharmacological and methodological aspects of the separation-induced vocalization test in guinea pig pups; a systematic review and meta-analysis.

    Science.gov (United States)

    Groenink, Lucianne; Verdouw, P Monika; Bakker, Brenda; Wever, Kimberley E

    2015-04-15

    The separation-induced vocalization test in guinea pig pups is one of many tests that have been used to screen for anxiolytic-like properties of drugs. The test is based on the cross-species phenomenon that infants emit distress calls when placed in social isolation. Here we report a systematic review and meta-analysis of pharmacological intervention in the separation-induced vocalization test in guinea pig pups. Electronic databases were searched for original research articles, yielding 32 studies that met inclusion criteria. We extracted data on pharmacological intervention, animal and methodological characteristics, and study quality indicators. Meta-analysis showed that the different drug classes in clinical use for the treatment of anxiety disorders have comparable effects on vocalization behaviour, irrespective of their mechanism of action. Of the experimental drugs, nociceptin (NOP) receptor agonists proved very effective in this test. Analysis further indicated that the commonly used read-outs, total number and total duration of vocalizations, are equally valid. With regard to methodological characteristics, repeated testing of pups as well as selecting pups with moderate or high levels of vocalization were associated with larger treatment effects. Finally, reporting of study methodology, randomization and blinding was poor, and Egger's test for small study effects showed that publication bias likely occurred. This review illustrates the value of systematic reviews and meta-analyses in improving translational value and methodological aspects of animal models. It further shows the urgent need to implement existing publication guidelines to maximize the output and impact of experimental animal studies.

  7. Can students' reasons for choosing set answers to ethical vignettes be reliably rated? Development and testing of a method.

    Science.gov (United States)

    Goldie, John; Schwartz, Lisa; McConnachie, Alex; Jolly, Brian; Morrison, Jillian

    2004-12-01

    Although ethics is an important part of modern curricula, measures of students' ethical disposition have not been easy to develop. A potential method is to assess students' written justifications for selecting one option from a preset range of answers to vignettes and compare these justifications with predetermined 'expert' consensus. We describe the development of and reliability estimation for such a method -- the Ethics in Health Care Instrument (EHCI). Seven raters classified the responses of ten subjects to nine vignettes, on two occasions. The first stage of analysis involved raters' judging how consistent with consensus were subjects' justifications using generalizability theory, and then rating consensus responses on the action justification and values recognition hierarchies. The inter-rater reliability was 0.39 for the initial rating. Differential performance on questions was identified as the largest source of variance. Hence reliability was investigated also for students' total scores over the nine consensus vignettes. Rater effects were the largest source of variance identified. Examination of rater performance showed lack of rater consistency. D-studies were performed which showed acceptable reliability could nevertheless be obtained using four raters per EHCI. This study suggests that the EHCI has potential as an assessment instrument although further testing is required of all components of the methodology.

  8. Methodological study on the microbial limit test of tinidazole suppositories

    Institute of Scientific and Technical Information of China (English)

    蒋彦洁; 凌明; 张笑颜

    2014-01-01

    Objective To study the microbial limit test method for tinidazole suppositories. Methods Methodological validation was performed according to the appendix of the Chinese Pharmacopoeia (2010 edition). A low-speed centrifugation plus membrane filtration method was applied for bacterial counting, membrane filtration was used for fungus and yeast counting, and both methods were used for the control bacteria check. Results The recovery rates of bacteria, fungi, and yeasts were all above 70.0%. In the control bacteria check, all positive test strains were detected, and the negative controls showed no growth. Conclusion Both the membrane filtration method and the low-speed centrifugation plus membrane filtration method can eliminate the bacteriostatic effect of tinidazole suppositories, and they are applicable to the microbial limit test of this preparation.

  9. Test-retest reproducibility of heart rate recovery after treadmill exercise.

    Science.gov (United States)

    Yawn, Barbara P; Ammar, K Afzal; Thomas, Randal; Wollan, Peter C

    2003-01-01

    Slowed heart rate recovery (HRR) of less than 12 beats per minute in the first minute after an exercise stress test has been suggested as a useful addition to the criteria currently used to assess exercise stress test results. Although HRR has been tested in large populations, the short-term test-retest stability (reproducibility) of abnormal HRR for an individual has not been assessed. The study was a retrospective comparison of medical record information using a community-practice-based sample of 90 patients undergoing 2 exercise stress tests separated by 18 weeks or less. Concordance of abnormal HRR results on the first and second stress tests were assessed for individual patients using definitions of abnormal HRR from the medical literature. Individual patient's HRR was markedly variable from the first to second stress test. In this sample, no definition of abnormal HRR provided more than 55% concordance between results from the first and second stress tests. These preliminary data suggest that HRR appears to have limited short-term test-retest stability or reproducibility and therefore might not be a reliable addition to current results of exercise stress tests.

  10. Can the anaerobic potentially mineralizable nitrogen test improve predictions of fertilizer nitrogen rates in the Cornbelt?

    Science.gov (United States)

    Correctly estimating the amount of mineralizable nitrogen (N) can enhance nitrogen use efficiency. The anaerobic potentially mineralizable nitrogen (PMNAn) test is a tool that may help improve predictions of N uptake, grain yield, and the economical optimum nitrogen rate (EONR) of corn (Zea mays L...

  11. A new ambiguity acceptance test threshold determination method with controllable failure rate

    Science.gov (United States)

    Wang, Lei; Verhagen, Sandra

    2015-04-01

    The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination method is still not well understood. Currently, the threshold is determined with the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to efficiently determine the threshold in a reasonable way. In this study, a new threshold determination method named the threshold function method is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach by a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulation data to avoid nuisance biases and unrealistic stochastic model impact. The results indicate the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes the fixed failure rate threshold determination method feasible for real-time applications.
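
    Two ingredients of the proposed method can be sketched concretely: the integer bootstrapping success rate, which has a standard closed form in terms of the conditional standard deviations of the decorrelated ambiguities, and a rational function mapping that success rate to a difference-test threshold. The rational-function coefficients below are placeholders, not the calibrated values from the paper.

```python
import numpy as np
from scipy.stats import norm

def bootstrap_success_rate(cond_std):
    """Integer bootstrapping success rate from the conditional standard
    deviations of the (decorrelated) ambiguities."""
    cond_std = np.asarray(cond_std, dtype=float)
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0))

def threshold_function(success_rate, a0=0.1, a1=0.9, b1=4.0):
    """Hypothetical rational-function mapping from success rate to the
    difference-test threshold; the coefficients are placeholders."""
    return (a0 + a1 * success_rate) / (1.0 + b1 * success_rate)

cond_std = [0.05, 0.08, 0.12]        # illustrative conditional std devs (cycles)
p_ib = bootstrap_success_rate(cond_std)
mu = threshold_function(p_ib)
print(f"IB success rate = {p_ib:.4f}, difference-test threshold = {mu:.3f}")
```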

  12. Analyzing Effect of Demand Rate on Safety of Systems with Periodic Proof-tests

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative safety assessment of safety systems plays an important role in decision making at all stages of the system lifecycle, i.e., design, deployment and phase-out. Most safety assessment methods consider only system parameters, such as configuration, hazard rate, coverage, repair rate, etc., along with periodic proof-tests (or inspections). Not considering the demand rate gives a pessimistic safety estimate for applications with a low demand rate, such as nuclear power plants, chemical plants, etc. In this paper, a basic model of IEC 61508 is used. The basic model is extended to incorporate the process demand and the behavior of the electronic- and/or computer-based system following diagnosis or proof-test. A new safety index, the probability of failure on actual demand (PFAD), based on the extended model and the demand rate, is proposed. The periodic proof-test makes the model semi-Markovian, so a piece-wise continuous time Markov chain (CTMC) based method is used to derive the mean state probabilities of elementary or aggregated states. Methods to determine the probability of failure on demand (PFD; IEC 61508) and PFAD based on these state probabilities are described. In an example, the safety indices PFD and PFAD are compared.
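
    To make the low-demand argument concrete, the sketch below computes a familiar IEC 61508 style average PFD for a single channel and then combines it with a demand rate to get the frequency of a demand arriving while the function is unavailable. This is only a rough single-channel illustration of why the demand rate matters; it is not the extended CTMC model or the PFAD definition developed in the paper, and all rates are placeholders.

```python
import math

# Simplified 1oo1, low-demand illustration (placeholder rates throughout).
lambda_du = 2.0e-6              # dangerous undetected failure rate, per hour
proof_test_interval = 8760.0    # hours between proof-tests (one year)
demand_rate = 0.1 / 8760.0      # demands per hour (about one demand per 10 years)

# Average probability of failure on demand over a proof-test interval.
pfd_avg = lambda_du * proof_test_interval / 2.0

# Demand-aware view: rate of a demand coinciding with an undetected failure,
# and the probability of at least one such event over a mission time.
hazard_rate = demand_rate * pfd_avg
mission_hours = 20 * 8760.0
p_hazard = 1.0 - math.exp(-hazard_rate * mission_hours)

print(f"PFD_avg = {pfd_avg:.2e}")
print(f"hazardous-event rate = {hazard_rate:.2e} per hour; "
      f"P(at least one in 20 years) = {p_hazard:.2e}")
```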

  13. Mutation Rate at Commonly Used Forensic STR Loci: Paternity Testing Experience

    Directory of Open Access Journals (Sweden)

    Faruk Aşıcıoğlu

    2004-01-01

    Paternity tests are carried out by the analysis of hypervariable short tandem repeat (STR) DNA loci. These microsatellite sequences mutate at a higher rate than bulk DNA. The occurrence of germline mutations at STR loci poses problems for the interpretation of the resulting genetic profiles. We recently analyzed 59-159 parent/child allele transfers at 13 microsatellite loci. We identified 12 mutations at 7 microsatellite loci; no mutations occurred at the other 6 loci. The highest mutation rate was observed at the D8S1179 locus, with 5 mutations at different alleles. The events always involved a single repeat unit. The mutation rate was between 0 and 1.5 x 10-2 per locus per gamete per generation. Mutation events are crucial for forensic DNA testing, and the accumulation of STR mutation data is extremely important for the interpretation of genetic profiles.
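
    Because the per-locus counts are small, mutation rate estimates of this kind are usually reported with exact Poisson confidence intervals. The sketch below shows that calculation; the counts used are illustrative, not the study's per-locus data.

```python
from scipy.stats import chi2

def mutation_rate_ci(mutations: int, meioses: int, conf: float = 0.95):
    """Point estimate and exact Poisson confidence interval for a
    per-meiosis (per locus per gamete per generation) mutation rate."""
    rate = mutations / meioses
    alpha = 1.0 - conf
    lower = 0.0 if mutations == 0 else chi2.ppf(alpha / 2, 2 * mutations) / (2 * meioses)
    upper = chi2.ppf(1 - alpha / 2, 2 * (mutations + 1)) / (2 * meioses)
    return rate, lower, upper

# Illustrative counts (not the study's per-locus data):
print(mutation_rate_ci(mutations=5, meioses=350))
```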

  14. Heart rate deflection point relates to second ventilatory threshold in a tennis test.

    Science.gov (United States)

    Baiget, Ernest; Fernández-Fernández, Jaime; Iglesias, Xavier; Rodríguez, Ferran A

    2015-03-01

    The relationship between heart rate deflection point (HRDP) and the second ventilatory threshold (VT2) has been studied in continuous sports, but never in a tennis-specific test. The aim of the study was to assess the relationships between the HRDP and the VT2, and between maximal test performance and maximal oxygen uptake (VO2max), in an on-court specific endurance tennis test. Thirty-five high-level tennis players performed a progressive tennis-specific field test to exhaustion to determine HRDP, VT2, and VO2max. Ventilatory gas exchange parameters were continuously recorded by a portable telemetric breath-by-breath gas exchange measurement system. The heart rate deflection point was identified as the point at which the slope values of the linear portion of the time/heart rate (HR) relationship began to decline, and was successfully determined in 91.4% of the players. High correlations (r = 0.79-0.96) were found between the HRDP- and VT2-related variables. The results suggest that this specific incremental tennis test can be used to determine the VT2, and that the BallfHRDP can be used as a practical performance variable to prescribe on-court specific aerobic training at or near the VT2.

  15. Effects of head-down bed rest on complex heart rate variability: Response to LBNP testing

    Science.gov (United States)

    Goldberger, Ary L.; Mietus, Joseph E.; Rigney, David R.; Wood, Margie L.; Fortney, Suzanne M.

    1994-01-01

    Head-down bed rest is used to model physiological changes during spaceflight. We postulated that bed rest would decrease the degree of complex physiological heart rate variability. We analyzed continuous heart rate data from digitized Holter recordings in eight healthy female volunteers (age 28-34 yr) who underwent a 13-day 6 deg head-down bed rest study with serial lower body negative pressure (LBNP) trials. Heart rate variability was measured on 4-min data sets using conventional time and frequency domain measures as well as with a new measure of signal 'complexity' (approximate entropy). Data were obtained pre-bed rest (control), during bed rest (day 4 and day 9 or 11), and 2 days post-bed rest (recovery). Tolerance to LBNP was significantly reduced on both bed rest days vs. pre-bed rest. Heart rate variability was assessed at peak LBNP. Heart rate approximate entropy was significantly decreased at day 4 and day 9 or 11, returning toward normal during recovery. Heart rate standard deviation and the ratio of high- to low-frequency power did not change significantly. We conclude that short-term bed rest is associated with a decrease in the complex variability of heart rate during LBNP testing in healthy young adult women. Measurement of heart rate complexity, using a method derived from nonlinear dynamics ('chaos theory'), may provide a sensitive marker of this loss of physiological variability, complementing conventional time and frequency domain statistical measures.
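
    Approximate entropy, the complexity measure used here, follows Pincus's template-matching definition and is short to implement. The sketch below is a generic implementation with the common parameter choices m = 2 and r = 0.2 SD; the two example signals are synthetic and only illustrate that a regular signal scores lower than an irregular one.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy (Pincus) of a 1-D series; tolerance r = r_factor * SD(x)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()

    def phi(m):
        # Build the (n - m + 1) overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)     # self-matches included, as in Pincus
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    regular = np.sin(np.linspace(0, 20 * np.pi, 240))   # highly regular signal
    irregular = rng.standard_normal(240)                # irregular signal
    print(approximate_entropy(regular), approximate_entropy(irregular))
```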

  16. The rate of invasive testing for trisomy 21 is reduced after implementation of NIPT.

    Science.gov (United States)

    Bjerregaard, Louise; Stenbakken, Anne Betsagoo; Andersen, Camilla Skov; Kristensen, Line; Jensen, Cecilie Vibeke; Skovbo, Peter; Sørensen, Anne Nødgaard

    2017-04-01

    The non-invasive prenatal test (NIPT) was introduced in the North Denmark Region in March 2013. NIPT is offered as an alternative to invasive tests if the combined first-trimester risk of trisomy 21 (T21) is ≥ 1:300. The purpose of this study was to investigate the effect of NIPT implementation among high-risk pregnancies in a region with existing first-trimester combined screening for T21. The primary objective was to examine the effect on the invasive testing rate. This was a retrospective observational study including high-risk singleton pregnancies in the North Denmark Region. The women were included in two periods, i.e. before and after the implementation of NIPT, respectively. Group 1 (before NIPT): n = 253 and Group 2 (after NIPT): n = 302. After NIPT implementation, the invasive testing rate fell from 70% to 48% (p < 0.01), and the number of high-risk women refusing further testing dropped from 26% to 3% (p < 0.01). NIPT successfully detected four cases of T21; however, two out of three sex-chromosomal abnormalities were false positives. No false negative NIPT results were revealed in this study. In the North Denmark Region, the implementation of NIPT in high-risk pregnancies significantly reduced the rate of invasive testing. However, the proportion of high-risk women who opted for prenatal tests increased, as the majority of women who previously refused further testing now opted for the NIPT. The study was approved by the Danish Data Protection Agency (No. 2015-104).

  17. Study on rate ripple and its adaptive suppression method in inertia guidance test equipment

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yuan-sheng; XU Guo-zhu; LIU Yue

    2007-01-01

    This paper discusses the causes of rate ripple in inertial guidance test equipment (IGTE) and systematically analyses their effects on the rate ripple of the IGTE. The analysis shows that the rate ripple caused by the periodic errors of the inductosyn and angular encoder is larger at high speed than that caused by magnetic ripple torque and friction torque, and that it cannot be eliminated by adjusting the control parameters of the system. Based on nonlinear adaptive control theory, the paper puts forward a new control scheme to eliminate the rate ripple caused by the periodic errors of the inductosyn and angular encoder, develops the adaptive control laws, and carries out simulations and tests. The experimental results show a significant improvement on these tables for periodic disturbances under the designed scheme. With an input rate of 200°/s, the rate ripple falls from 5°/s to 0.4°/s within about 6 s of adaptive adjustment, roughly one-twelfth of its value before adaptation, which cannot be reached by common classical controls. The experimental results agree with the simulation, which proves the validity and practicability of the scheme.

  18. Postpartum Diabetes Testing Rates after Gestational Diabetes Mellitus in Canadian Women: A Population-Based Study.

    Science.gov (United States)

    Butalia, Sonia; Donovan, Lois; Savu, Anamaria; Johnson, Jeffrey; Edwards, Alun; Kaul, Padma

    2017-05-12

    We assessed the rate and type of postpartum glycemic testing in women with impaired glucose tolerance of pregnancy (IGTp) and gestational diabetes mellitus (GDM). We examined whether the likelihood of testing was modulated by patients' characteristics and pregnancy outcomes. Our population-level cohort study included data from 132,905 pregnancies between October 1, 2008, and December 31, 2011, in Alberta, Canada. Laboratory data within 270 days before and 1 year after delivery were used to identify pregnancies involving IGTp/GDM and postpartum glycemic testing, respectively. Logistic regression was used to identify maternal and pregnancy factors associated with postpartum testing. A total of 8,703 pregnancies were affected by IGTp (n=3669) or GDM (n=5034) as defined by the prevailing Canadian Diabetes Association 2008 Clinical Practice Guidelines for the Prevention and Management of Diabetes in Canada. By 1 year postpartum, 55.1% had undergone glycemic assessments. Of those, 59.7% had had 75 g oral glucose tolerance tests, 17.4% had had glycated hemoglobin tests without oral glucose tolerance tests and 22.9% had had only fasting or random glucose tests. Women with IGTp or GDM, respectively, who were younger, smokers and residing in rural areas and whose labours were not induced were less likely to be tested postpartum. Having large for gestational age infants was also associated with a lower likelihood of postpartum testing in women with GDM. Despite a universal health-care system in Canada, many women with IGTp or GDM do not undergo postpartum glucose testing. Maternal and pregnancy characteristics influence postpartum testing and provide valuable information for creating targeted strategies to improve postpartum testing in this group of high-risk women. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.

  19. The use of heart rates and graded maximal test values to determine rugby union game intensities.

    Science.gov (United States)

    Sparks, Martinique; Coetzee, Ben

    2013-02-01

    The aim of this study was to determine the intensities of university rugby union games using heart rates and graded maximal test values. Twenty-one rugby players performed a standard incremental maximal oxygen uptake (VO2max) test to the point of exhaustion in the weeks between 3 rugby matches. The heart rates that corresponded to the first and second ventilatory thresholds were used to classify the heart rates into low-, moderate-, and high-intensity zones. The heart rates recorded through heart rate telemetry during the matches were then categorized into the different zones. The average heart rates for the different intensity zones as well as the percentages of the maximum heart rate (HRmax) were as follows: low, 141-152 b·min(-1) (76.2-82.0% HRmax); moderate, 153-169 b·min(-1) (82.7-91.4% HRmax); and high, 170-182 b·min(-1) (91.9-100% HRmax). The percentages of time players spent in the different intensity zones were as follows: 22.8% in the low-intensity, 33.6% in the moderate-intensity, and 43.6% in the high-intensity zone. The dependent t-test revealed significant differences in the time spent in the different intensity zones during university rugby union games. It also revealed that university rugby games are characterized by significantly more high-intensity activity than was previously reported by other rugby match-analysis studies. Thus, sport scientists and conditioning coaches should concentrate more on high-intensity activities for longer periods during training sessions.
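
    Once the two ventilatory-threshold heart rates are known, the time-in-zone percentages reported above reduce to classifying each recorded heart rate sample. A minimal sketch is given below; the simulated heart rate trace is illustrative, and the zone boundaries simply reuse the 153 and 170 b·min(-1) values quoted in the abstract.

```python
import numpy as np

def time_in_zones(heart_rates, vt1_hr, vt2_hr):
    """Percentage of game time spent in low/moderate/high intensity zones,
    using the heart rates at the first and second ventilatory thresholds
    as zone boundaries."""
    hr = np.asarray(heart_rates, dtype=float)
    low = np.mean(hr < vt1_hr) * 100
    moderate = np.mean((hr >= vt1_hr) & (hr < vt2_hr)) * 100
    high = np.mean(hr >= vt2_hr) * 100
    return low, moderate, high

# Illustrative one-sample-per-second heart rate trace for an 80-minute match:
rng = np.random.default_rng(2)
hr_trace = rng.normal(165, 12, size=80 * 60)
print(time_in_zones(hr_trace, vt1_hr=153, vt2_hr=170))
```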

  20. Survival and recovery of Aeromonas hydrophila in water: development of methodology for testing bottled water in Canada.

    Science.gov (United States)

    Warburton, D W; McCormick, J K; Bowen, B

    1994-02-01

    Proposed changes to the Regulations for bottled water in the Food and Drugs Act of Canada include criteria for Aeromonas hydrophila (0 colony-forming units/100 mL water). The development of the methodology used to support these proposed Regulations and the survival of A. hydrophila in inoculated water are described. The methodology used for the isolation of A. hydrophila includes the use of hydrophobic grid membrane filters (HGMF), a resuscitation step on tryptic soy agar, and selective plating on membrane-Aeromonas-trehalose agar and Aeromonas medium. Aeromonas hydrophila proliferated and survived in inoculated water for up to 60 days or longer, depending on the other contaminating bacteria. The presence of Pseudomonas aeruginosa enhanced the survival of A. hydrophila and enabled this bacterium to survive for more than 60 days.

  1. Remote sensing and hydrogeological methodologies for irrigation canal leakage detection: the Osasco and Fossano test sites (NorthWestern Italy)

    Science.gov (United States)

    Perotti, Luigi; Clemente, Paolo; De Luca, Domenico Antonio; Dino, Giovanna; Lasagna, Manuela

    2013-04-01

    irrigation channel were conducted. The canal seepage rates were then estimated using inflow-outflow tests and tests with a double tracer, an adaptation of the QUEST method (Rieckermann and Gujer, 2002). This approach allowed an experimental calibration and validation of the satellite image analysis. The applied multidisciplinary approach seems to be a promising way to rapidly screen irrigation channels for water losses. References: Hotchkiss, R.H., Wingert, C.B., Kelly, W.E., 2001. Determining irrigation canal seepage with electrical resistivity. ASCE J. Irrig. Drain 127, 20-26. Huang, Y., Fipps, G., 2002. Thermal Imaging of Canals for Remote Detection of Leaks: Evaluation in the United Irrigation District. Technical Report, Biological and Agricultural Engineering Department, Texas A&M University. Huang, Y., Fipps, G., Maas, S., Fletcher, R., 2005. Airborne multispectral remote sensing imaging for detecting irrigation canal leaks in the lower Rio Grande valley. 20th Biennial Workshop on Aerial Photography, Videography, and High Resolution Digital Imagery for Resource Assessment, October 4-6, Weslaco, Texas. Rieckermann, J., Gujer, W., 2002. Quantifying Exfiltration from Leaky Sewers with Artificial Tracers. Proceedings of the International Conference on "Sewer Operation and Maintenance 2002", Bradford, UK.

  2. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    Science.gov (United States)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. Yet the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the overall framework. The first approach is the comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  3. Testing the Relationship between Interest Rates Volatility and Market Capitalization: the case of Mauritius

    Directory of Open Access Journals (Sweden)

    Edesiri Godsday Okoro

    2014-12-01

    Full Text Available This paper tests the relationship between interest rate volatility and market capitalization in Mauritius. Using annual time series data sourced from the Financial Services Commission Annual Statistical Bulletin of Mauritius for the period 2006 through 2010, interest rate volatility and market capitalization were estimated in a non-linear model using the Vector Auto-regression (VAR) technique. The study found that interest rate volatility has a significant, though negative, effect on the level of market capitalization, implying a negative relationship between interest rate volatility and market capitalization. Thus, if market capitalization is affected by interest rates, the economy becomes highly susceptible to volatile external distress, which poses dangers for the economic survival of Mauritius. On this basis we recommend an effective policy aimed at stabilizing macroeconomic variables such as interest rates, while focusing on alternative measures of promoting market capitalization if aggregate economic growth is to be harnessed. Policymakers should design the optimal policy mix that would help the nation cope efficiently with the economic and social costs of the external distress accompanying fluctuating interest rates in Mauritius.
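
    As a rough illustration of the modelling step, a vector auto-regression of the two series could be fitted as sketched below; the file name and column names are hypothetical, and the study's own specification (including any non-linear transformation of the variables) is not reproduced.

```python
# Minimal VAR sketch (hypothetical data file and column names) in the spirit
# of the study's approach: interest-rate volatility and market capitalization
# modelled jointly, lag order chosen by AIC.
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("mauritius_macro.csv", index_col="year")  # hypothetical file
data = df[["interest_rate_volatility", "market_cap"]].dropna()

model = VAR(data)
results = model.fit(maxlags=2, ic="aic")
print(results.summary())
# The sign and significance of the volatility coefficients in the market_cap
# equation indicate the direction of the estimated effect.
```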

  4. Precision of test methods : determination of repeatability and reproducibility by inter-laboratory tests : application in development and assessment of dairy methodology

    NARCIS (Netherlands)

    Werdmuller, G.A.; Ruig, de W.G.

    1983-01-01

    The International Standard ISO 5725 is sometimes considered too extensive for practical use in the development and assessment of dairy methodology. Therefore the Joint IDF/ISO/AOAC Group of Experts E 30 (Statistics of Analytical Data) decided upon a simplified description, based on ISO 5725, to
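
    The core precision statistics behind ISO 5725 can be illustrated with a balanced one-way layout. The following sketch uses toy numbers, assumes a balanced design, and omits the outlier screening of the full procedure; it computes repeatability and reproducibility limits from inter-laboratory replicates.

```python
# Compact illustration of repeatability (r) and reproducibility (R) limits
# from an inter-laboratory test with p labs and n replicates per lab.
import numpy as np

def precision_estimates(results: np.ndarray):
    """results: shape (p_labs, n_replicates) for one level of the material."""
    p, n = results.shape
    lab_means = results.mean(axis=1)
    grand_mean = results.mean()
    ms_within = ((results - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
    ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)
    s_r2 = ms_within                                   # repeatability variance
    s_L2 = max((ms_between - ms_within) / n, 0.0)      # between-lab component
    s_R2 = s_r2 + s_L2                                 # reproducibility variance
    return 2.8 * np.sqrt(s_r2), 2.8 * np.sqrt(s_R2)    # limits r and R

data = np.array([[4.44, 4.39], [4.03, 4.23], [4.42, 4.36], [4.17, 4.22]])  # toy data
r, R = precision_estimates(data)
print(f"repeatability r = {r:.3f}, reproducibility R = {R:.3f}")
```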

  5. The relationships between exercise intensity, heart rate, and blood pressure during an incremental isometric exercise test.

    Science.gov (United States)

    Wiles, Jonathan D; Allum, Simon R; Coleman, Damian A; Swaine, Ian L

    2008-01-15

    Currently, it is not possible to prescribe isometric exercise at an intensity that corresponds to given heart rates or systolic blood pressures. This might be useful in optimizing the effects of isometric exercise training. Therefore, the aim of this study was to explore the relationships between isometric exercise intensity and both heart rate and systolic blood pressure during repeated incremental isometric exercise tests. Fifteen participants performed seated isometric double-leg knee extension, during which maximum voluntary contraction (MVC) was assessed using an isokinetic dynamometer. From this, a corresponding peak electromyographic activity (EMG(peak)) was determined. Subsequently, participants performed two incremental isometric exercise tests (at least 48 h apart) at 10, 15, 20, 25, and 30% EMG(peak), during which steady-state heart rate and systolic blood pressure were recorded. In all participants, there were significant linear relationships between %EMG(peak) and heart rate (r at least 0.91) and between %EMG(peak) and systolic blood pressure (r at least 0.92), and the repeated tests showed no significant differences in slopes (P > 0.50) or elevations (P > 0.10) for either relationship. Therefore, these linear relationships could be used to identify isometric exercise training intensities that correspond to precise heart rates or systolic blood pressures. Training performed in this way might provide greater insight into the underlying mechanisms for the cardiovascular adaptations that are known to occur as a result.
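
    In practice, prescribing an intensity from such a relationship amounts to fitting a per-participant regression and inverting it. The sketch below uses synthetic numbers, not the study's data.

```python
# Fit the linear heart-rate-versus-%EMGpeak relationship from an incremental
# isometric test, then invert it to find the %EMGpeak expected to elicit a
# target heart rate. Values are toy examples.
import numpy as np

emg_pct = np.array([10, 15, 20, 25, 30], dtype=float)   # %EMGpeak stages
hr_bpm  = np.array([78, 86, 95, 104, 112], dtype=float) # steady-state HR (toy)

slope, intercept = np.polyfit(emg_pct, hr_bpm, deg=1)
r = np.corrcoef(emg_pct, hr_bpm)[0, 1]

target_hr = 100.0
target_emg = (target_hr - intercept) / slope
print(f"HR = {slope:.2f} * %EMGpeak + {intercept:.1f}  (r = {r:.2f})")
print(f"Prescribe ~{target_emg:.1f} %EMGpeak to target {target_hr:.0f} bpm")
```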

  6. EFFECT OF PRANAYAMA ON BLOOD PRESSURE AND HEART RATE IN HYPERREACTOR TO COLD PRESSOR TEST

    Directory of Open Access Journals (Sweden)

    Krishan Bihari

    2014-07-01

    Full Text Available INTRODUCTION: Stress is a dangerous and significant problem worldwide, affecting physical, mental, behavioral, and emotional health. Yoga has been reported to control stress, to be beneficial in treating stress-related disorders, to improve autonomic functions, lower blood pressure, increase strength and flexibility of muscles, improve the sense of well-being, slow the ageing process, control breathing, reduce signs of oxidative stress, and improve spiritual growth. AIMS: The aim of the present study was to investigate whether regular practice of yoga for three months can reduce the cardiovascular hyper-reactivity induced by the cold pressor test. MATERIALS AND METHODS: The study group comprised 62 healthy male subjects aged 17-27 years. Initially there were 30 hyper-reactors to the cold pressor test. The hyper-reactivity of 23 volunteers (76.66%) converted to hypo-reactivity after three months of yoga therapy. Other parameters such as basal blood pressure, rise in blood pressure, pulse rate, and respiratory rate were also significantly reduced. STATISTICAL ANALYSIS: A two-tailed Student's t-test was applied using the standard formulas. RESULTS: Regular practice of yoga for three months significantly reduced the cardiovascular hyper-reactivity in terms of basal blood pressure, rise in blood pressure after one minute of cold stress, heart rate, and respiratory rate. CONCLUSION: Regular practice of yoga for three months reduced the cardiovascular hyper-reactivity to the cold pressor test in subjects who were hyper-reactive to cold stress, possibly by inducing parasympathetic predominance and cortico-hypothalamo-medullary inhibition.
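
    The reported comparison is essentially a two-tailed paired t-test on pre- versus post-intervention values. A minimal sketch with synthetic values (not the study's measurements) follows.

```python
# Paired, two-tailed comparison of a cardiovascular variable measured in the
# same subjects before and after an intervention (toy numbers).
import numpy as np
from scipy import stats

pre  = np.array([128, 134, 126, 140, 132, 138], dtype=float)  # e.g. SBP before
post = np.array([122, 128, 125, 131, 127, 130], dtype=float)  # after 3 months

t_stat, p_value = stats.ttest_rel(pre, post)  # two-tailed by default
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```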

  7. The examination of the heart rate regulation in healthy people with the stochastic tests methods

    Directory of Open Access Journals (Sweden)

    Aksana Kotava

    2013-09-01

    Full Text Available Background: The determination of relations between the complexity of cardiovascular system regulation and the complexity of the test signal is not a fully solved problem. This uncertainty can be reduced by using stochastic test signals whose power values change randomly. Aim of research: To compare the reaction of the cardiovascular system during deterministic and random loads. Material and methods: Two types of physical load were used: the traditional bicycle ergometer test with a stepwise increasing load and 3-minute step duration, and a test with a stochastic, pseudonormally distributed load and 30-second step duration. Results: The average load required to achieve a submaximal heart rate was 509 W for the traditional test and 445 W for the stochastic test. The time to reach the submaximal heart rate during the stepwise increasing load was 7 min, whereas during the stochastic load it was significantly shorter at 5 min. The results show that the limit of efficiency of the cardiovascular system is reached faster during the stochastic load test than during the deterministic load test. Conclusions: Stress tests using random loads can be useful for athlete training. The use of stochastic loads may also be effective during rehabilitation of patients with cardiovascular diseases; for instance, the physical load time at each stage can be increased in order to reach a steady state. The proposed study also confirms the promise of non-linear and stochastic methods in the diagnosis of cardiovascular diseases.

  8. Relation of heart rate recovery after exercise testing to coronary artery calcification.

    Science.gov (United States)

    Jae, Sae Young; Kurl, Sudhir; Laukkanen, Jari A; Yoon, Eun Sun; Choi, Yoon-Ho; Fernhall, Bo; Franklin, Barry A

    2017-08-01

    We examined whether slow heart rate recovery (HRR) after exercise testing, as an estimate of impaired autonomic function, is related to coronary artery calcification (CAC), an emerging marker of coronary atherosclerosis. We evaluated 2088 men who participated in a health-screening program that included measures of CAC and peak or symptom-limited cardiopulmonary exercise testing. HRR was calculated as the difference between peak heart rate (HR) during exercise testing and the HR at 2 min of recovery after peak exercise. We measured CAC using multidetector computed tomography to calculate the Agatston coronary artery calcium score. Advanced CAC was defined as a mean CAC >75th percentile for each age group. HRR was negatively correlated with CAC (r = -.14). Each 1 bpm decrease in HRR was associated with a 1% increase in advanced CAC after adjusting for potential confounders. An attenuated HRR after exercise testing is associated with advanced CAC, independent of coronary risk factors and other related hemodynamic responses. KEY MESSAGES Slow heart rate recovery (HRR) after maximal exercise testing, indicating decreased autonomic function, is associated with an increased risk of cardiovascular events and mortality. Slow HRR has been linked with the occurrence of malignant ventricular arrhythmias, but it remains unclear whether slow HRR is associated with an increased risk of coronary artery calcification (CAC), an emerging marker of coronary atherosclerosis. An attenuated HRR after exercise testing was associated with advanced CAC, independent of coronary risk factors and other potential hemodynamic confounders, supporting the hypothesis that slow HRR is related to the burden of atherosclerotic coronary artery disease.
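
    HRR itself is a simple difference, as defined in the abstract. A small illustrative helper, assuming hypothetical 1 Hz heart-rate recordings, follows.

```python
# HRR per the abstract's definition: peak HR during the test minus HR at
# 2 minutes of recovery. Input arrays are synthetic, sampled at 1 Hz.
import numpy as np

def heart_rate_recovery(hr_exercise: np.ndarray, hr_recovery: np.ndarray,
                        recovery_seconds: int = 120) -> float:
    peak_hr = hr_exercise.max()
    hr_at_2min = hr_recovery[recovery_seconds]   # assumes 1 Hz sampling
    return float(peak_hr - hr_at_2min)

hr_ex = np.linspace(90, 178, 600)                       # toy ramp to peak HR
hr_rec = 178 - 40 * (1 - np.exp(-np.arange(300) / 80))  # toy recovery curve
print(f"HRR(2 min) = {heart_rate_recovery(hr_ex, hr_rec):.0f} bpm")
```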

  9. WTP Waste Feed Qualification: Hydrogen Generation Rate Measurement Apparatus Testing Report

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Newell, J. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Smith, T. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-06-01

    The generation rate of hydrogen gas in the Hanford tank waste will be measured during the qualification of the staged tank waste for processing in the Hanford Tank Waste Treatment and Immobilization Plant. Based on a review of past practices in measurement of the hydrogen generation rate, an apparatus to perform this measurement has been designed and tested for use during waste feed qualification. The hydrogen generation rate measurement apparatus (HGRMA) described in this document and shown in Figure 0-1 utilized a 100 milliliter sample in a continuously-purged, continuously-stirred vessel, with measurement of hydrogen concentration in the vent gas. The vessel and lid had a combined 220 milliliters of headspace. The vent gas system included a small condenser to prevent excessive evaporative losses from the sample during the test, as well as a demister and filter to prevent particle migration from the sample to the gas chromatography system. The gas chromatograph was an online automated instrument with a large-volume sample-injection system to allow measurement of very low hydrogen concentrations. This instrument automatically sampled the vent gas from the hydrogen generation rate measurement apparatus every five minutes and performed data regression in real time. The fabrication of the hydrogen generation rate measurement apparatus was in accordance with twenty-three (23) design requirements documented in the conceptual design package, as well as seven (7) required developmental activities documented in the task plan associated with this work scope. The HGRMA was initially tested for proof of concept with physical simulants, and a remote demonstration of the system was performed in the Savannah River National Laboratory Shielded Cells Mockup Facility. Final verification testing was performed using non-radioactive simulants of the Hanford tank waste. Three different simulants were tested to bound the rheological properties expected during waste feed qualification testing.
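
    Conceptually, the measurement rests on a steady-state mass balance: purge flow times vent-gas hydrogen fraction gives the hydrogen release rate, which is then normalized by sample volume. The sketch below is a generic illustration under ideal-gas assumptions, not the qualified SRNL calculation or its correction factors.

```python
# Generic steady-state mass-balance estimate of hydrogen generation rate from
# a purged, stirred vessel; all numbers are illustrative.

R = 0.082057  # L*atm/(mol*K)

def hydrogen_generation_rate(purge_flow_L_min: float, h2_ppm_vent: float,
                             sample_volume_L: float,
                             temp_K: float = 298.15, pressure_atm: float = 1.0):
    """Return H2 generation rate in mol per litre of sample per hour."""
    h2_flow_L_min = purge_flow_L_min * h2_ppm_vent * 1e-6   # L/min of H2
    h2_mol_per_hr = h2_flow_L_min * 60.0 * pressure_atm / (R * temp_K)
    return h2_mol_per_hr / sample_volume_L

rate = hydrogen_generation_rate(purge_flow_L_min=0.020, h2_ppm_vent=15.0,
                                sample_volume_L=0.100)
print(f"HGR ~ {rate:.2e} mol H2 / (L*h)")
```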

  10. Medium optimization to improve the flocculation rate of a novel compound bioflocculant, CBF-256, using response surface methodology and flocculation characters.

    Science.gov (United States)

    Ren, Dunjian; Li, Hongyang; Pu, Yuewu; Yi, Lvyun

    2013-01-01

    A novel compound bioflocculant, CBF-256, was obtained using three bacterial strains, Bacillus sp., Enterobacter sp., and Aeromonas sp., which were screened from the activated sludge of a printing and dyeing wastewater treatment plant. Response surface methodology was employed to optimize the fermentation medium dosage to improve the flocculation rate of CBF-256, which increased from 69.00% to 82.65%. In addition, the yield of the compound bioflocculant increased from 2.31 g·L(-1) to 2.84 g·L(-1). The flocculating efficiency distribution of the components of the culture broth indicated that the supernatant was the most effective component in the flocculation process. Fourier transform infrared spectroscopy and scanning electron microscopy were used to analyze the fermentation medium and the composite bacteria. The compound flocculant was produced easily, and during the flocculation process all the flocculation ingredients settled into the remaining sludge along with the bacteria screened from the activated sludge, without causing secondary pollution.
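
    The response-surface step amounts to fitting a second-order model to a designed experiment and locating its predicted maximum. The sketch below uses hypothetical coded factors and toy flocculation rates, not the study's medium components.

```python
# Fit a second-order (quadratic) response surface to flocculation-rate data
# from a two-factor designed experiment and locate the predicted optimum.
import numpy as np

# coded levels of two hypothetical medium factors
x1 = np.array([-1, -1,  1,  1, 0, 0, 0, -1.41, 1.41, 0, 0])
x2 = np.array([-1,  1, -1,  1, 0, 0, 0, 0, 0, -1.41, 1.41])
y  = np.array([70, 72, 74, 78, 82, 81, 83, 69, 75, 71, 76], dtype=float)  # toy %

X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

g1, g2 = np.meshgrid(np.linspace(-1.5, 1.5, 301), np.linspace(-1.5, 1.5, 301))
pred = (beta[0] + beta[1] * g1 + beta[2] * g2 + beta[3] * g1 * g2
        + beta[4] * g1**2 + beta[5] * g2**2)
i, j = np.unravel_index(np.argmax(pred), pred.shape)
print(f"Predicted optimum at x1={g1[i, j]:.2f}, x2={g2[i, j]:.2f}, "
      f"flocculation rate ~ {pred[i, j]:.1f}%")
```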

  11. Phillips tests methods to improve drawdown and producing rates in Venezuela fire flood

    Energy Technology Data Exchange (ETDEWEB)

    Meldau, R.F.; Lumpkin, W.B.

    1974-08-01

    Phillips Petroleum Co. conducted a 4-yr test of fire flooding in the Morichal field, Venezuela. The field, located at the northern edge of the Orinoco heavy-oil belt, produces 8° to 12° API crude at high rates from poorly consolidated sands below 3,000 ft. The pilot performed well with no channeling, corrosion, emulsion, or sanding problems. However, combustion-gas production in some wells seriously reduced pump efficiency and drawdown. A number of field methods were tested to improve drawdown and production rates in pumping wells. A downhole gas separator made a significant improvement; foam suppressants, pump speed, pump displacement, and reduction of casing pressure had little or no effect. Field data show that flowing, rather than pumping, fire-flood producers can give better drawdown under some conditions.

  12. TEST OF THE CATCH-UP HYPOTHESIS IN AFRICAN AGRICULTURAL GROWTH RATES

    Directory of Open Access Journals (Sweden)

    Kalu Ukpai IFEGWU

    2015-11-01

    Full Text Available The paper tested the catch-up hypothesis in the agricultural growth rates of twenty-six African countries. The panel data used were drawn from the Food and Agriculture Organization Statistics (FAOSTAT) of the United Nations. The Data Envelopment Analysis method for measuring productivity was used to estimate productivity growth rates. A cross-section framework consisting of sigma-convergence and beta-convergence was employed to test the catching-up process. Catching up is said to exist if the value of beta is negative and significant. Since catching up does not necessarily imply a narrowing of national productivity inequalities, sigma-convergence, which measures inequality, was estimated for the same variables. The results showed evidence of the catch-up process, but failed to find a narrowing of productivity inequalities among countries.
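
    The two convergence tests can be sketched as follows, using a synthetic panel rather than the study's FAOSTAT-based productivity estimates: beta-convergence regresses growth on the initial level, and sigma-convergence tracks cross-country dispersion over time.

```python
# Beta-convergence: negative, significant slope of growth on initial level
# implies catch-up. Sigma-convergence: a shrinking cross-sectional dispersion
# implies narrowing inequality. Data below are synthetic log-productivity.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 26, 20
tfp = (np.cumsum(rng.normal(0.01, 0.03, (n_countries, n_years)), axis=1)
       + rng.normal(0, 0.2, (n_countries, 1)))          # toy log-TFP panel

growth = (tfp[:, -1] - tfp[:, 0]) / (n_years - 1)
beta, intercept = np.polyfit(tfp[:, 0], growth, 1)
print(f"beta = {beta:.3f} (catch-up if negative and significant)")

sigma = tfp.std(axis=0)
sigma_trend = np.polyfit(np.arange(n_years), sigma, 1)[0]
print(f"trend in dispersion = {sigma_trend:.4f} (narrowing if negative)")
```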

  13. Exchange rate volatility and export growth in India: An ARDL bounds testing approach

    Directory of Open Access Journals (Sweden)

    P. Srinivasan

    2013-07-01

    Full Text Available This paper empirically investigates the impact of exchange rate volatility on real exports in India using the ARDL bounds testing procedure proposed by Pesaran et al. (2001). Using annual time series data, the empirical analysis was carried out for the period 1970 to 2011. The results confirm that real exports are cointegrated with exchange rate volatility, the real exchange rate, gross domestic product and foreign economic activity. Our findings indicate that exchange rate volatility has a significant negative impact on real exports both in the short run and the long run, implying that higher exchange rate fluctuation tends to reduce real exports in India. Besides, the real exchange rate has a negative short-run and a positive long-run effect on real exports. The empirical results reveal that GDP has a positive and significant impact on India's real exports in the long run, but the impact turns out to be insignificant in the short run. In addition, foreign economic activity exerts a significant negative and positive impact on real exports in the short run and long run, respectively.
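
    The bounds-testing idea can be illustrated with a stripped-down unrestricted error-correction regression. The sketch below uses synthetic data, only two variables and one lag, and does not compute the Pesaran et al. (2001) critical bounds against which the F-statistic must be compared.

```python
# Toy unrestricted error-correction regression: the joint F-test on the
# lagged level terms is the bounds-test statistic. The paper's full model
# includes more regressors and lags chosen by information criteria.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60
vol = np.cumsum(rng.normal(size=n))                # toy exchange-rate volatility
exp_ = 0.5 * vol + np.cumsum(rng.normal(size=n))   # toy real exports

df = pd.DataFrame({"exports": exp_, "vol": vol})
df["d_exports"] = df["exports"].diff()
df["d_vol"] = df["vol"].diff()
df["lag_exports"] = df["exports"].shift(1)
df["lag_vol"] = df["vol"].shift(1)
df["d_exports_l1"] = df["d_exports"].shift(1)
df = df.dropna()

X = sm.add_constant(df[["d_exports_l1", "d_vol", "lag_exports", "lag_vol"]])
res = sm.OLS(df["d_exports"], X).fit()
f_res = res.f_test("lag_exports = 0, lag_vol = 0")  # joint test on levels
print("Bounds-test F statistic on lagged levels:")
print(f_res)
```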

  14. Field Tests to Investigate the Penetration Rate of Piles Driven by Vibratory Installation

    Directory of Open Access Journals (Sweden)

    Zhaohui Qin

    2017-01-01

    Full Text Available Factors directly affecting the penetration rate of piles installed by the vibratory driving technique are summarized and classified into seven aspects, which are driving force, resistance, vibratory amplitude, energy consumption, speeding up at the beginning, keeping the pile plumb, and slowing down at the end, based on the mechanism and engineering practice of vibratory pile driving. In order to find out how these factors affect the penetration rate of the pile through the three major actors of vibratory pile driving, namely (i) the pile to be driven, (ii) the selected driving system, and (iii) the imposed soil conditions, field tests on steel sheet piles driven by the vibratory technique in different soil conditions were conducted. The penetration rates of three different sheet pile types, with up to four different lengths, installed using two different vibratory driving systems are documented. Piles with different lengths and types, driven with or without a clutch, have different penetration rates. The working parameters of the vibratory hammer, such as driving force and vibratory amplitude, have a great influence on the penetration rate of the pile, especially at the later stages of the sinking process. The penetration rate of piles driven in different soil conditions varies because of differences in penetration resistance, including shaft friction and toe resistance.

  15. RateMyProfessors.com: Testing Assumptions about Student Use and Misuse

    Science.gov (United States)

    Bleske-Rechek, April; Michels, Kelsey

    2010-01-01

    Since its inception in 1999, the RateMyProfessors.com (RMP.com) website has grown in popularity and, with that, notoriety. In this research we tested three assumptions about the website: (1) Students use RMP.com to either rant or rave; (2) Students who post on RMP.com are different from students who do not post; and (3) Students reward easiness by…

  16. Correlation analysis of gamma dose rate from natural radiation in the test field

    Directory of Open Access Journals (Sweden)

    Avdic Senada

    2016-01-01

    Full Text Available This paper deals with a correlation analysis of the gamma dose rate measured in a test field with five distinctive soil samples from minefields in the Federation of Bosnia and Herzegovina. The measurements of the ambient dose equivalent rate, due to the radionuclides present in each of the soil samples, were performed with a RADIAGEM 2000 portable survey meter placed on the ground and 1 m above the ground. Gamma spectrometric analysis of the same soil samples was carried out with a GAMMA-RAD5 spectrometer. This study showed that there is a high correlation between the absorbed dose rate evaluated from soil radioactivity and the corresponding results obtained with the survey meter placed on the ground. The correlation analysis indicated that the survey meter, due to its narrow energy range, is not suitable for examining the contribution of cosmic radiation.

  17. Test context affects recollection and familiarity ratings: implications for measuring recognition experiences.

    Science.gov (United States)

    Tousignant, Cody; Bodner, Glen E

    2012-06-01

    The binary remember/know task requires participants to dichotomize their subjective recognition experiences into those with recollection and those only with familiarity. Many variables have produced dissociative effects on remember/know judgments. In contrast, having participants make independent recollection/familiarity ratings has consistently produced parallel effects, suggesting the dissociations may be artifacts of using binary judgments. Bodner and Lindsay (2003) reported a test-list context effect with binary judgments: Increased remembering but decreased knowing for a set of critical items tested with a set of less-memorable (vs. more-memorable) items. Here we report a parallel effect of test-list context on recollection and familiarity ratings, induced by a shift in response bias. We argue that independent ratings are preferable to binary judgments because they allow participants to directly report the co-occurrence of recollection and familiarity for each item. Implications for the measurement of self-reported recognition experiences, and for accounts of recognition memory, are discussed.

  18. A validity test of movie, television, and video-game ratings.

    Science.gov (United States)

    Walsh, D A; Gentile, D A

    2001-06-01

    Numerous studies have documented the potential effects on young audiences of violent content in media products, including movies, television programs, and computer and video games. Similar studies have evaluated the effects associated with sexual content and messages. Cumulatively, these effects represent a significant public health risk for increased aggressive and violent behavior, spread of sexually transmitted diseases, and pediatric pregnancy. In partial response to these risks and to public and legislative pressure, the movie, television, and gaming industries have implemented ratings systems intended to provide information about the content and appropriate audiences for different films, shows, and games. The objective of this panel study was to test the validity of the current movie, television, and video game rating systems. Participants used the KidScore media evaluation tool, which evaluates films, television shows, and video games on 10 aspects, including the appropriateness of the media product for children based on age. When an entertainment industry rates a product as inappropriate for children, parent raters agree that it is inappropriate for children. However, parent raters disagree with industry usage of many of the ratings designating material suitable for children of different ages. Products rated as appropriate for adolescents are of the greatest concern. The level of disagreement varies from industry to industry and even from rating to rating. Analysis indicates that the amount of violent content and portrayals of violence are the primary markers for disagreement between parent raters and industry ratings. As one part of a solution to the complex public health problems posed by violent and sexually explicit media products, ratings can have value if used with caution. Parents and caregivers relying on the ratings systems to guide their children's use of media products should continue to monitor content independently. Industry ratings systems should be revised with input

  19. WTP Waste Feed Qualification: Hydrogen Generation Rate Measurement Apparatus Testing Report

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Newell, J. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Smith, T. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-06-01

    The generation rate of hydrogen gas in the Hanford tank waste will be measured during the qualification of the staged tank waste for processing in the Hanford Tank Waste Treatment and Immobilization Plant. Based on a review of past practices in measurement of the hydrogen generation rate, an apparatus to perform this measurement has been designed and tested for use during waste feed qualification. The hydrogen generation rate measurement apparatus (HGRMA) described in this document utilized a 100 milliliter sample in a continuously-purged, continuously-stirred vessel, with measurement of hydrogen concentration in the vent gas. The vessel and lid had a combined 220 milliliters of headspace. The vent gas system included a small condenser to prevent excessive evaporative losses from the sample during the test, as well as a demister and filter to prevent particle migration from the sample to the gas chromatography system. The gas chromatograph was an online automated instrument with a large-volume sample-injection system to allow measurement of very low hydrogen concentrations. This instrument automatically sampled the vent gas from the hydrogen generation rate measurement apparatus every five minutes and performed data regression in real time. The fabrication of the hydrogen generation rate measurement apparatus was in accordance with twenty-three (23) design requirements documented in the conceptual design package, as well as seven (7) required developmental activities documented in the task plan associated with this work scope. The HGRMA was initially tested for proof of concept with physical simulants, and a remote demonstration of the system was performed in the Savannah River National Laboratory Shielded Cells Mockup Facility. Final verification testing was performed using non-radioactive simulants of the Hanford tank waste. Three different simulants were tested to bound the rheological properties expected during waste feed qualification testing.

  20. Effects of Classroom Ventilation Rate and Temperature on Students' Test Scores.

    Directory of Open Access Journals (Sweden)

    Ulla Haverinen-Shaughnessy

    Full Text Available Using a multilevel approach, we estimated the effects of classroom ventilation rate and temperature on academic achievement. The analysis is based on measurement data from a 70-elementary-school district (140 fifth-grade classrooms) in the Southwestern United States, and student-level data (N = 3109) on socioeconomic variables and standardized test scores. There was a statistically significant association between ventilation rates and mathematics scores, and it was stronger when the six classrooms with high ventilation rates that were indicated as outliers were filtered (> 7.1 l/s per person). The association remained significant when prior-year test scores were included in the model, resulting in less unexplained variability. Students' mean mathematics scores (average 2286 points) were increased by up to eleven points (0.5%) per each liter per second per person increase in ventilation rate within the range of 0.9-7.1 l/s per person (estimated effect size 74 points). There was an additional increase of 12-13 points per each 1°C decrease in temperature within the observed range of 20-25°C (estimated effect size 67 points). Effects of similar magnitude but higher variability were observed for reading and science scores. In conclusion, maintaining adequate ventilation and thermal comfort in classrooms could significantly improve the academic achievement of students.
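
    A random-intercept version of such a multilevel model can be sketched with statsmodels. The data file and column names below are hypothetical, and the published analysis used a richer specification with student-level covariates.

```python
# Random-intercept model of student test scores on classroom ventilation rate
# and temperature; the grouping variable captures clustering by classroom.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("classroom_scores.csv")   # hypothetical: one row per student

model = smf.mixedlm("math_score ~ ventilation_l_s_person + temperature_c",
                    data=df, groups=df["classroom_id"])
result = model.fit()
print(result.summary())
# The coefficient on ventilation_l_s_person estimates the score change per
# additional litre per second per person, net of the classroom random effect.
```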

  1. New tests of the distal speech rate effect: examining cross-linguistic generalization.

    Science.gov (United States)

    Dilley, Laura C; Morrill, Tuuli H; Banzina, Elina

    2013-01-01

    Recent findings [Dilley and Pitt, 2010. Psych. Science. 21, 1664-1670] have shown that manipulating context speech rate in English can cause entire syllables to disappear or appear perceptually. The current studies tested two rate-based explanations of this phenomenon while attempting to replicate and extend these findings to another language, Russian. In Experiment 1, native Russian speakers listened to Russian sentences which had been subjected to rate manipulations and performed a lexical report task. Experiment 2 investigated speech rate effects in cross-language speech perception; non-native speakers of Russian of both high and low proficiency were tested on the same Russian sentences as in Experiment 1. They decided between two lexical interpretations of a critical portion of the sentence, where one choice contained more phonological material than the other (e.g., /stərʌ'na/ "side" vs. /strʌ'na/ "country"). In both experiments, with native and non-native speakers of Russian, context speech rate and the relative duration of the critical sentence portion were found to influence the amount of phonological material perceived. The results support the generalized rate normalization hypothesis, according to which the content perceived in a spectrally ambiguous stretch of speech depends on the duration of that content relative to the surrounding speech, while showing that the findings of Dilley and Pitt (2010) extend to a variety of morphosyntactic contexts and a new language, Russian. Findings indicate that relative timing cues across an utterance can be critical to accurate lexical perception by both native and non-native speakers.

  2. New tests of the distal speech rate effect: Examining cross-linguistic generalization

    Directory of Open Access Journals (Sweden)

    Laura eDilley

    2013-12-01

    Full Text Available Recent findings [Dilley and Pitt, 2010. Psych. Science. 21, 1664-1670] have shown that manipulating context speech rate in English can cause entire syllables to disappear or appear perceptually. The current studies tested two rate-based explanations of this phenomenon while attempting to replicate and extend these findings to another language, Russian. In Experiment 1, native Russian speakers listened to Russian sentences which had been subjected to rate manipulations and performed a lexical report task. Experiment 2 investigated speech rate effects in cross-language speech perception; non-native speakers of Russian of both high and low proficiency were tested on the same Russian sentences as in Experiment 1. They decided between two lexical interpretations of a critical portion of the sentence, where one choice contained more phonological material than the other (e.g., /stərʌ'na/ "side" vs. /strʌ'na/ "country"). In both experiments, with native and non-native speakers of Russian, context speech rate and the relative duration of the critical sentence portion were found to influence the amount of phonological material perceived. The results support the generalized rate normalization hypothesis, according to which the content perceived in a spectrally ambiguous stretch of speech depends on the duration of that content relative to the surrounding speech, while showing that the findings of Dilley and Pitt (2010) extend to a variety of morphosyntactic contexts and a new language, Russian. Findings indicate that relative timing cues across an utterance can be critical to accurate lexical perception by both native and non-native speakers.

  3. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001), such as individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multistate models were ad hoc, and non optimal, involving use of standard tests for single-state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  4. Life cycle assessment of biomass production: Development of a methodology to improve the environmental indicators and testing with fiber sorghum energy crop

    Energy Technology Data Exchange (ETDEWEB)

    Buratti, Cinzia; Fantozzi, Francesco [University of Perugia, Biomass Research Centre, Via G. Duranti, 06125 Perugia (Italy)

    2010-10-15

    At the Biomass Research Centre of the University of Perugia, LCA studies were carried out on different biomass chains using detailed software (SimaPro 7.0) and the EcoIndicator 99 model in order to evaluate the global burden. The results showed that EcoIndicator 99 lacks some important features (e.g. freshwater consumption, nutrient emissions into water, soil erosion) that are necessary for evaluating the environmental load of energy crops. Therefore a new LCA methodology, tailored to biomass production, was developed, in which all resources used and emissions into the environment were divided into the following impact categories: depletion of abiotic resources, freshwater consumption, climate change, land use, acidification, eutrophication, human toxicity, ecotoxicity, and soil erosion. The impact assessment methodology was tested on fiber sorghum crop production, adopting two different agricultural techniques differing mainly in irrigation management and employing data from experimental fields in the Umbria Region (Italy). The results showed a more reliable approach to the impact assessment of the biomass cultivation phase. (author)

  5. Slow Strain Rate Testing for Hydrogen Embrittlement Susceptibility of Alloy 718 in Substitute Ocean Water

    Science.gov (United States)

    LaCoursiere, M. P.; Aidun, D. K.; Morrison, D. J.

    2017-05-01

    The hydrogen embrittlement susceptibility of near-peak-aged UNS N07718 (Alloy 718) was evaluated by performing slow strain rate tests at room temperature in air and substitute ocean water. Tests in substitute ocean water were accomplished in an environmental cell that enabled in situ cathodic charging under an applied potential of -1.1 V versus SCE. Some specimens were cathodically precharged for 4 or 16 weeks at the same potential in a 3.5 wt.% NaCl-distilled water solution at 50 °C. Unprecharged specimens tested in substitute ocean water exhibited only moderate embrittlement with plastic strain to failure decreasing by about 20% compared to unprecharged specimens tested in air. However, precharged specimens exhibited significant embrittlement with plastic strain to failure decreasing by about 70%. Test environment (air or substitute ocean water with in situ charging) and precharge time (4 or 16 weeks) had little effect on the results of the precharged specimens. Fracture surfaces of precharged specimens were typical of hydrogen embrittlement and consisted of an outer brittle ring related to the region in which hydrogen infused during precharging, a finely dimpled transition zone probably related to the region where hydrogen was drawn in by dislocation transport, and a central highly dimpled ductile region. Fracture surfaces of unprecharged specimens tested in substitute ocean water consisted of a finely dimpled outer ring and heavily dimpled central region typical of ductile fracture.

  6. Testing of currency substitution effect on exchange rate volatility in Serbia

    Directory of Open Access Journals (Sweden)

    Petrović Predrag

    2016-01-01

    Full Text Available Despite numerous different definitions existing in the literature, currency substitution is generally understood as a phenomenon in which domestic residents prefer to use foreign currency rather than the domestic currency. The main reasons for this phenomenon include high and volatile inflation, strong depreciation of the national currency, and a high interest rate differential in favour of the foreign currency. Currency substitution, as a monetary phenomenon, is widespread in Latin American, Eastern European and some Asian countries. This paper is dedicated to the influence of currency substitution on exchange rate volatility in Serbia. The research included testing of three hypotheses: (i) currency substitution positively affects depreciation rate volatility, (ii) depreciation rate volatility responds more strongly to past negative than to past positive depreciation shocks, and (iii) currency substitution positively affects the expected depreciation rate. The analysis was implemented for the period 2002:m1-2015:m12 (2004:m1-2015:m12), applying a modified EGARCH-M model. Based on the obtained results, all three hypotheses are decisively rejected, regardless of the manner in which currency substitution is quantified.

  7. Mathematical analysis of the heart rate performance curve during incremental exercise testing.

    Science.gov (United States)

    Rosic, G; Pantovic, S; Niciforovic, J; Colovic, V; Rankovic, V; Obradovic, Z; Rosic, Mirko

    2011-03-01

    In this study we performed laboratory treadmill protocols of increasing load. Heart rate was continuously recorded, and blood lactate concentration was measured for determination of the lactate threshold by means of the LTD-max and LT4.0 methods. Our results indicate that the shape of the heart rate performance curve (HRPC) during incremental testing depends on the applied exercise protocol (changes in initial speed and in the step of running speed increase, with constant stage duration). Depending on the applied protocol, the HRPC can be described by a linear, polynomial (S-shaped), or exponential mathematical expression. We present a mathematical procedure for estimating heart rate threshold points at the level of LTD-max and LT4.0 by means of an exponential curve and its relative deflection from the initial trend line (the tangent line to the exponential curve at the point of the starting heart rate). The relative deflection of the exponential curve from the initial trend line at the level of LTD-max and/or LT4.0 can be defined based on the slope of the initial trend line. Using originally developed software that allows mathematical analysis of the heart rate-load relation, LTD-max and/or LT4.0 can be estimated without direct measurement of blood lactate concentration.
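
    A minimal version of the described procedure is to fit an exponential heart-rate performance curve, construct the tangent (initial trend) line at the starting heart rate, and evaluate the relative deflection at a given load. The functional form and numbers below are illustrative assumptions, not the authors' software.

```python
# Fit an exponential HR performance curve, build the tangent line at zero
# load, and report the relative deflection of the curve from that line.
import numpy as np
from scipy.optimize import curve_fit

def hrpc(load, hr0, a, k):
    return hr0 + a * (1.0 - np.exp(-k * load))

load = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)        # toy stages (km/h)
hr = np.array([80, 106, 126, 142, 154, 162, 168], dtype=float)

(hr0, a, k), _ = curve_fit(hrpc, load, hr, p0=(80, 100, 0.1))

slope0 = a * k                                 # slope of the tangent at load = 0
trend = lambda x: hr0 + slope0 * x             # initial trend line

x_eval = 10.0
deflection = (trend(x_eval) - hrpc(x_eval, hr0, a, k)) / trend(x_eval)
print(f"Relative deflection at load {x_eval}: {100 * deflection:.1f}%")
```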

  8. 77 FR 72905 - Pipeline Safety: Random Drug Testing Rate; Contractor MIS Reporting; and Obtaining DAMIS Sign-In...

    Science.gov (United States)

    2012-12-06

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Random Drug Testing Rate... for random drug testing, reminder for operators to report contractor MIS data, and new method for... minimum random drug testing rate for covered employees will remain at 25 percent during calendar year 2013...

  9. [Testing of germination rate of hybrid rice seeds based on near-infrared reflectance spectroscopy].

    Science.gov (United States)

    Li, Yi-nian; Jiang, Dan; Liu, Ying-ying; Ding, Wei-min; Ding, Qi-shuo; Zha, Liang-yu

    2014-06-01

    At present, the germination rate of rice seeds is measured according to the technical stipulation for germination testing of agricultural crop seeds. This stipulation has several drawbacks, such as a long experimental period, high cost, and high professional requirements. A rapid and non-invasive method was therefore put forward to measure the germination rate of hybrid rice seeds based on near-infrared reflectance spectroscopy. Two varieties of hybrid rice seeds were aged artificially at 45 °C and 100% humidity for 0, 24, 48, 72, 96, 120 and 144 h. Spectral data of 280 samples of the 2 varieties of hybrid rice seeds with different aging times were acquired individually with a near-infrared spectrum analyzer. The spectral data were randomly divided into a calibration set (168 samples) and a prediction set (112 samples). The germination rate of rice seeds with different aging times was tested. A regression model was established using partial least squares (PLS). The effect of different spectral bands on the accuracy of the models was analyzed, and the effect of different spectral preprocessing methods on the accuracy of the models was also compared. The optimal model was achieved over the whole band range using standardization and orthogonal signal correction (OSC) preprocessing algorithms with CM2000 software: the coefficients of determination of the calibration set (Rc) and the prediction set (Rp) were 0.965 and 0.931, respectively, and the standard errors of the calibration set (SEC) and the prediction set (SEP) were 1.929 and 2.899, respectively. The relative error between tested and predicted values for the prediction set was below 4.2%. The experimental results show that it is feasible to detect rice germination rate rapidly and nondestructively using near-infrared spectroscopy analysis technology.
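
    The modelling step is a standard PLS regression of germination rate on spectra. The sketch below uses synthetic spectra and omits the standardization and orthogonal signal correction preprocessing applied in the study.

```python
# Partial least squares regression of germination rate on (synthetic) NIR
# spectra, split into calibration and prediction sets of the same sizes as
# in the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
spectra = rng.normal(size=(280, 700))                             # toy absorbances
germination = 95 - 10 * spectra[:, 100] + rng.normal(0, 2, 280)   # toy target (%)

X_cal, X_pred, y_cal, y_true = train_test_split(
    spectra, germination, train_size=168, random_state=0)

pls = PLSRegression(n_components=8)
pls.fit(X_cal, y_cal)
y_hat = pls.predict(X_pred).ravel()
print(f"R^2 (prediction set) = {r2_score(y_true, y_hat):.3f}")
```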

  10. Experimental test of the heating and cooling rate effect on blocking temperatures

    Science.gov (United States)

    Berndt, Thomas; Paterson, Greig A.; Cao, Changqian; Muxworthy, Adrian R.

    2017-07-01

    The cooling rates at which rocks acquire thermoremanent magnetizations (TRMs) affect their unblocking temperatures in thermal demagnetization experiments; similarly, the heating rates at which the thermal demagnetization experiments are done also affect the unblocking temperature. We have tested the effects of variable cooling and heating rates on the unblocking temperatures of two natural non-interacting, magnetically uniform (single-domain, SD) (titano)magnetite samples and a synthetic SD magnetoferritin sample. While previous studies have only considered unblocking temperatures for stepwise thermal demagnetization data (i.e. the room-temperature magnetization after incremental heating), in this work we derive an expression for continuous thermal demagnetization of both TRMs and viscous remanent magnetizations (VRMs) and relate the heating rate to an effective equivalent hold time of a stepwise thermal demagnetization experiment. Through our analysis we reach four main conclusions. First, the theoretical expressions for the heating/cooling rate effect do not accurately predict experimentally observed blocking temperatures. Empirically, the relation can be modified by incorporating a factor that amplifies both the temperature and the heating rate dependence of the heating/cooling rate effect. Using these correction factors, Pullaiah nomograms can accurately predict blocking temperatures of both TRMs and VRMs for continuous heating/cooling. Second, demagnetization temperatures are approximately predicted by published 'Pullaiah nomograms', but blocking occurs gradually over temperature intervals of 5-40 K. Third, the theoretically predicted temperatures correspond to ∼54-82 per cent blocking, depending on the sample. Fourth, the blocking temperatures can be used to obtain estimates of the atomic attempt time τ0, which were found to be 3 × 10⁻¹⁰ s for large-grained (titano)magnetite, 1 × 10⁻¹³ s for small-grained (titano)magnetite below the Verwey transition and 9
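
    A simplified Pullaiah-type time-temperature relation, which ignores the temperature dependence of spontaneous magnetization and microcoercivity that the full treatment and the paper's empirical corrections address, can be solved numerically as sketched below; the example values are illustrative.

```python
# Simplified blocking-temperature equivalence for single-domain grains:
# T1 * ln(t1 / tau0) = T2 * ln(t2 / tau0). Given blocking at T1 over time
# scale t1, solve for the blocking temperature T2 on time scale t2.
import numpy as np
from scipy.optimize import brentq

def equivalent_blocking_T(T1_K: float, t1_s: float, t2_s: float,
                          tau0_s: float = 1e-9) -> float:
    lhs = T1_K * np.log(t1_s / tau0_s)
    return brentq(lambda T2: T2 * np.log(t2_s / tau0_s) - lhs, 1.0, 2000.0)

# e.g. blocking at 500 K over a 1-hour laboratory hold; equivalent temperature
# for a 1 Myr geological time scale (illustrative numbers only)
T2 = equivalent_blocking_T(T1_K=500.0, t1_s=3600.0, t2_s=3.15e13)
print(f"Equivalent blocking temperature over 1 Myr ~ {T2:.0f} K")
```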

  11. Testing the impact of local alcohol licencing policies on reported crime rates in England

    Science.gov (United States)

    De Vocht, F; Heron, J; Campbell, R; Egan, M; Mooney, J D; Angus, C; Brennan, A; Hickman, M

    2017-01-01

    Background: Excessive alcohol use contributes to public nuisance, antisocial behaviour, and domestic, interpersonal and sexual violence. We test whether licencing policies aimed at restricting its spatial and/or temporal availability, including cumulative impact zones, are associated with reductions in alcohol-related crime. Methods: Reported crimes at English lower tier local authority (LTLA) level were used to calculate the rates of reported crimes, including alcohol-attributable rates of sexual offences and violence against the person, and public order offences. Financial fraud was included as a control crime not directly associated with alcohol abuse. Each area was classified as to its cumulative licensing policy intensity for 2009–2015 and categorised as 'passive', low, medium or high. Crime rates adjusted for area deprivation, outlet density, alcohol-related hospital admissions and population size at baseline were analysed using hierarchical (log-rate) growth modelling. Results: 284 of 326 LTLAs could be linked and had complete data. From 2009 to 2013, alcohol-related violent and sexual crime and public order offence rates declined faster in areas with more 'intense' policies (by about 1.2, 0.10 and 1.7 per 1000 people, compared with 0.6, 0.01 and 1.0 per 1000 people in 'passive' areas, respectively). Post-2013, the recorded rates increased again. No trends were observed for financial fraud. Conclusions: Local areas in England with more intense alcohol licensing policies had a stronger decline in rates of violent crimes, sexual crimes and public order offences in the period up to 2013, of the order of 4-6% greater compared with areas where these policies were not in place, but not thereafter. PMID: 27514936

  12. An evaluation of a Low-Dose-Rate (LDR) brachytherapy procedure using a systems engineering & error analysis methodology for health care (SEABH) - (SAVE)

    LENUS (Irish Health Repository)

    Chadwick, Liam

    2012-03-12

    Health Care Failure Modes and Effects Analysis (HFMEA®) is an established tool for risk assessment in health care. A number of deficiencies have been identified in the method. A new method called the Systems and Error Analysis Bundle for Health Care (SEABH) was developed to address these deficiencies. SEABH has been applied to a number of medical processes as part of its validation and testing. One of these, a Low Dose Rate (LDR) prostate brachytherapy procedure, is reported in this paper. The case study supported the validity of SEABH with respect to its capacity to address the weaknesses of HFMEA®.

  13. Research on the Capability Test Methodology of U.S. Troops in Joint Tests and Its Inspirations

    Institute of Scientific and Technical Information of China (English)

    刘盛铭; 冯书兴

    2015-01-01

    Considering that current research on joint test and evaluation methodology for weaponry is not deep enough, this paper studies the capability test methodology used by U.S. forces in joint test and evaluation, as well as the process steps of the methodology, including formation of the test and evaluation strategy, description of test characteristics, test planning, realization of the distributed simulation environment, test operation management, and capability evaluation. The paper points out that this methodology is compatible with inheritance, driven by models, implemented by threads, and flexible, rigorous and standardized, and that it has been tested in practice. Since formulating a joint test methodology is a systems engineering effort requiring many participants, joint test and evaluation methodology is examined from the following respects: method creation, method development, method implementation, and method application.

  14. Standard Test Method for Measuring Heat Transfer Rate Using a Thin-Skin Calorimeter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the design and use of a thin metallic calorimeter for measuring heat transfer rate (also called heat flux). Thermocouples are attached to the unexposed surface of the calorimeter. A one-dimensional heat flow analysis is used for calculating the heat transfer rate from the temperature measurements. Applications include aerodynamic heating, laser and radiation power measurements, and fire safety testing. 1.2 Advantages 1.2.1 Simplicity of Construction - The calorimeter may be constructed from a number of materials. The size and shape can often be made to match the actual application. Thermocouples may be attached to the metal by spot, electron beam, or laser welding. 1.2.2 Heat transfer rate distributions may be obtained if metals with low thermal conductivity, such as some stainless steels, are used. 1.2.3 The calorimeters can be fabricated with smooth surfaces, without insulators or plugs and the attendant temperature discontinuities, to provide more realistic flow conditions for ...
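
    The underlying one-dimensional relation is that the absorbed heat flux equals the skin's heat capacity per unit area times the rate of temperature rise, q = rho * c * delta * dT/dt. A short illustrative calculation follows; the material properties and temperature history are toy values, not taken from the standard.

```python
# Thin-skin calorimeter data reduction: q = rho * c * delta * dT/dt.
import numpy as np

rho = 7900.0      # kg/m^3, e.g. a stainless steel (illustrative)
c = 500.0         # J/(kg*K)
delta = 0.0008    # m, skin thickness

t = np.linspace(0.0, 2.0, 201)          # s
T_back = 300.0 + 45.0 * t               # K, toy back-face temperature history

dT_dt = np.gradient(T_back, t)
q = rho * c * delta * dT_dt             # W/m^2
print(f"Mean heat flux over the run: {q.mean() / 1000:.1f} kW/m^2")
```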

  15. Standard Test Method for Measuring Neutron Fluence Rate by Radioactivation of Cobalt and Silver

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers a suitable means of obtaining the thermal neutron fluence rate, or fluence, in well moderated nuclear reactor environments where the use of cadmium as a thermal neutron shield, as described in Method E262, is undesirable because of potential spectrum perturbations or of temperatures above the melting point of cadmium. 1.2 This test method describes a means of measuring a Westcott neutron fluence rate (Note 1) by activation of cobalt- and silver-foil monitors (see Terminology E170). The reaction 59Co(n,γ)60Co results in a well-defined gamma emitter having a half-life of 1925.28 days (1). The reaction 109Ag(n,γ)110mAg results in a nuclide with a complex decay scheme which is well known, having a half-life of 249.76 days (1). Both cobalt and silver are available either in very pure form or alloyed with other metals such as aluminum. A reference source of cobalt in aluminum alloy to serve as a neutron fluence rate monitor wire standard is available from the National Institute ...

  16. Subcritical crack growth in oxide and non-oxide ceramics using the Constant Stress Rate Test

    Directory of Open Access Journals (Sweden)

    Agnieszka Wojteczko

    2015-12-01

    Full Text Available Fracture toughness is one of the most important parameters for describing ceramics. In some cases, however, material failure occurs at stresses lower than those implied by the KIc parameter, so determining fracture toughness alone proves insufficient. This may be due to environmental factors, such as humidity, which can cause subcritical crack propagation in a material. Therefore, it is very important to estimate crack growth velocities in order to predict the lifetime of ceramics used under specific conditions. The Constant Stress Rate Test is an indirect method of estimating subcritical crack growth parameters. Calculations are made using strength data, thus avoiding direct crack measurement. The growth of flaws reduces material strength: if the subcritical crack growth phenomenon occurs, the critical crack length increases with decreasing stress rate, because a flaw has more time to grow before critical crack propagation at KIc takes place, and measured strength correspondingly decreases. Subcritical crack growth is particularly dangerous for oxide ceramics due to chemical interactions resulting from exposure to humidity. This paper presents the results of the Constant Stress Rate Test performed on alumina, zirconia, silicon carbide and silicon nitride in order to demonstrate the differences in the course of the subcritical crack propagation phenomenon.
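
    The usual data reduction for this test uses the fact that, under power-law subcritical crack growth, the slope of log(strength) versus log(stress rate) equals 1/(n+1). The sketch below uses toy strengths and omits the censoring and uncertainty treatment of formal standards such as ASTM C1368.

```python
# Estimate the subcritical crack growth exponent n from constant stress rate
# ("dynamic fatigue") strength data: slope of log(strength) vs log(rate)
# equals 1/(n+1). Values are illustrative only.
import numpy as np

stress_rates = np.array([0.1, 1.0, 10.0, 100.0])          # MPa/s
mean_strengths = np.array([310.0, 330.0, 352.0, 376.0])   # MPa, toy values

slope, intercept = np.polyfit(np.log10(stress_rates),
                              np.log10(mean_strengths), 1)
n = 1.0 / slope - 1.0
print(f"slope = {slope:.4f}  ->  SCG exponent n ~ {n:.1f}")
```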

  17. Integration and software for thermal test of heat rate sensors. [space shuttle external tank

    Science.gov (United States)

    Wojciechowski, C. J.; Shrider, K. R.

    1982-01-01

    A minicomputer-controlled radiant test facility is described which was developed and calibrated in an effort to verify analytical thermal models of instrumentation islands installed aboard the space shuttle external tank to measure thermal flight parameters during ascent. Software was provided for the facility as well as for development tests on the SRB actuator tail stock. Additional testing was conducted with the test facility to determine the temperature, heat flux rates, and loads required to effect a change of color in the ET external paint. This requirement resulted from the review of photographs taken of the ET at separation from the orbiter, which showed that 75% of the external tank paint coating had not changed color from its original white. The paint on the remaining 25% of the tank was either brown or black, indicating that it had degraded due to heating or that the spray-on foam insulation had receded in these areas. The operational capability of the facility, as well as the various tests which were conducted and their results, are discussed.

  18. Heart rate recovery after aerobic and anaerobic tests: is there an influence of anaerobic speed reserve?

    Science.gov (United States)

    Del Rosso, Sebastián; Nakamura, Fabio Y; Boullosa, Daniel A

    2017-05-01

    The present study assessed whether differences in metabolic profile, inferred from the anaerobic speed reserve (ASR), would influence the dynamics of heart rate recovery (HRR) after two modes of exercise. Thirty-nine physical education students (14 females and 25 males) volunteered for this study. Participants carried out three separate testing sessions to assess maximal sprinting speed (MSS, 1st session), repeated sprint ability (RSA, 2nd session) and maximal aerobic speed (MAS) using the Université de Montréal Track Test (UMTT, 3rd session). ASR was defined as the difference between MSS and MAS. Heart rate was continuously registered throughout the tests and during the 5-min post-test recovery. To evaluate the influence of ASR on post-exercise HRR, comparisons between ASR-based groups (high ASR vs. low ASR) and sex groups (males vs. females) were performed. Significant differences (P < 0.05) were found between the high-ASR and low-ASR groups of the same sex for indices of relative HRR after the RSA and UMTT. In addition, after the RSA test, males in the high-ASR group had significantly slower HRR kinetics compared with males in the low-ASR group (P < 0.05) and females in the high-ASR group (P < 0.05), whereas females in the high-ASR group had faster HRR kinetics compared with females in the low-ASR group (P < 0.05). Our results showed that in males, post-exercise HRR could be related to the ASR, whereas in females the influence of ASR is less clear.

  19. Stress corrosion cracking of alloy 600 using the constant strain rate test

    Energy Technology Data Exchange (ETDEWEB)

    Bulischeck, T. S.; van Rooyen, D.

    1980-01-01

    The most recent corrosion problem experienced in nuclear steam generators tubed with Inconel alloy 600 is a phenomenon labeled "denting". Denting has been found in various degrees of severity in many operating pressurized water reactors. Laboratory investigations have shown that Inconel 600 exhibits intergranular SCC when subjected to high stresses and exposed to deoxygenated water at elevated temperatures. A research project was initiated at Brookhaven National Laboratory in an attempt to improve the qualitative and quantitative understanding of factors influencing SCC in high-temperature, service-related environments. An effort is also being made to develop an accelerated test method which could be used to predict the service life of tubes which have been deformed or are actively denting. Several heats of commercial Inconel 600 tubing were procured for testing in deaerated pure and primary water at temperatures from 290 to 365°C. U-bend type specimens were used to determine the crack initiation times which may be expected for tubes where denting has occurred but is arrested, and to provide baseline data for judging the accelerating effects of the slow strain rate method. Constant extension rate tests were employed to determine the crack velocities experienced in the crack propagation stage and to predict failure times of tubes which are actively denting. 8 refs., 17 figs., 5 tabs.

  20. Conformance contrast testing between rates of pulmonary tuberculosis in Ecuadorian border areas

    Directory of Open Access Journals (Sweden)

    Claudia Ortiz-Rico

    2015-11-01

    Objective. To estimate rates of respiratory symptomatic subjects and the incidence rate of pulmonary tuberculosis in two border areas of Ecuador, and to contrast them with official figures. Materials and methods. Cross-sectional surveys were conducted in the southeastern (SEBA) and the Andean southern (ASBA) Ecuadorian border areas, in 1 598 and 2 419 persons, respectively, aged over 15 years and recruited over periods of three weeks. In identified respiratory symptomatic cases, a sputum sample was taken for smear testing. The results (odds ratios and their respective 95% confidence intervals) were compared with local and national official figures using maximum likelihood contrasts. Results. The rates of respiratory symptomatic subjects (7.7% and 5.9% in the SEBA and ASBA, respectively) and of pulmonary tuberculosis (cumulative incidence rates of 125 and 140 per 100 000 inhabitants, in the same order) were significantly greater than the official figures (of 0.98 and 0.99% for respiratory symptomatic subjects in the SEBA and ASBA, respectively, and of 38.23 per 100 000 inhabitants for pulmonary tuberculosis in Ecuador as a whole) (p<0.001). Conclusion. It is necessary to reinforce both active case finding for respiratory symptomatic subjects and epidemiological surveillance of pulmonary tuberculosis in Ecuadorian border regions.

  1. The predictive analysis of wear work-rates in wear test rigs

    Energy Technology Data Exchange (ETDEWEB)

    Phalippou, C.; Delaune, X.

    1996-12-31

    Impact and sliding wear in components is classically studied, as far as the wear laws are concerned, in specific wear test rigs that simulate the vibratory motion induced by the flow. In this paper, an experimental and numerical study on the impact forces and wear work-rates of a typical AECL rig is presented. The mode shapes and frequencies are measured and compared with finite element computations. Impact and sliding motions between the wear specimens are calculated and compared to the experimental results. Impact forces, mean values of wear work-rates as well as the specimen relative motions are found to be close to the experimental data. (authors). 14 refs., 9 figs., 5 tabs.

  2. Food preference, keeper ratings, and reinforcer effectiveness in exotic animals: the value of systematic testing.

    Science.gov (United States)

    Gaalema, Diann E; Perdue, Bonnie M; Kelling, Angela S

    2011-01-01

    Food preference describes the behavior of selecting between items for consumption; reinforcer effectiveness is the functional effect of that item in controlling behavior. Food preference and reinforcer effectiveness were examined in giant pandas (Ailuropoda melanoleuca) and African elephants (Loxodonta africana). A pairwise comparison between food items was used to assess food preference. High-, moderate-, and low-preference items were selected and tested for reinforcer effectiveness. High-preference items controlled behavior more effectively than less-preferred items. Caregiver ratings of food preferences were also collected for each subject, but these reports did not necessarily coincide with actual subject preferences. Caregiver ratings correlated with the food preferences of only 1 individual of each species; thus, preferences of 1 nonhuman animal may be falsely generalized to all animals of that species. Results suggest that food choice and reinforcer effectiveness should be investigated empirically and not rely on anecdotal reports.

  3. A field test comparison of hiking stick use on heartrate and rating of perceived exertion.

    Science.gov (United States)

    Jacobson, B H; Wright, T

    1998-10-01

    The purpose of this study was to compare heartrate while carrying a load and rating of perceived exertion with and without hiking sticks while ascending and descending a slope. 11 novice, moderately fit volunteers, ages 18 to 21 years (M = 19.3 yr.) completed two alternate 50-meter, uphill and downhill hikes on a 40 degree slope during randomly ordered trials with and without fitted hiking sticks and backpacks (15 kg). Paired t-test comparisons for the 4 trials indicated that mean heartrate was significantly lower only following the first ascent for those using hiking sticks compared with those without sticks. Rating of perceived exertion was also significantly lower. Heartrate may be lower at the onset of climbing using hiking sticks, but as the duration of the hike is extended, heartrates become comparable, presumably due to the transfer of energy utilization from the legs to the upper body.

  4. Thermal-Hydraulics and Electrochemistry of a Boiling Solution in a Porous Sludge Pile: A Test Methodology

    Energy Technology Data Exchange (ETDEWEB)

    R.F. Voelker

    2001-05-03

    When boiling occurs in a pile of porous corrosion products (sludge), chemical species can concentrate. These species can react with the corrosion products and transform the sludge into a rock hard mass and/or create a corrosive environment. In-situ measurements are required to improve the understanding of this process, and the thermal-hydraulic and electrochemical environment in the pile. A test method is described that utilizes a water heated instrumented tube array in an autoclave to perform the in-situ measurements. As a proof of method feasibility, tests were performed in an alkaline phosphate solution. The test data is discussed. Temperature changes and electrochemical potential shifts were used to indicate when chemicals concentrate and if/when the pile hardens. Post-test examinations confirmed hardening occurred. Experiments were performed to reverse the hardening process. A one-dimensional model, utilizing capillary forces, was developed to understand the thermal-hydraulic measurements.

  5. Rating leniency and halo in multisource feedback ratings: testing cultural assumptions of power distance and individualism-collectivism.

    Science.gov (United States)

    Ng, Kok-Yee; Koh, Christine; Ang, Soon; Kennedy, Jeffrey C; Chan, Kim-Yin

    2011-09-01

    This study extends multisource feedback research by assessing the effects of rater source and raters' cultural value orientations on rating bias (leniency and halo). Using a motivational perspective of performance appraisal, the authors posit that subordinate raters, followed by peers, will exhibit more rating bias than superiors. More important, given that multisource feedback systems were premised on low power distance and individualistic cultural assumptions, the authors expect raters' power distance and individualism-collectivism orientations to moderate the effects of rater source on rating bias. Hierarchical linear modeling on data collected from 1,447 superiors, peers, and subordinates who provided developmental feedback to 172 military officers shows that (a) subordinates exhibit the most rating leniency, followed by peers and superiors; (b) subordinates demonstrate more halo than superiors and peers, whereas superiors and peers do not differ; (c) the effects of power distance on leniency and halo are stronger for subordinates than for peers and superiors; (d) the effects of collectivism on leniency were stronger for subordinates and peers than for superiors; effects on halo were stronger for subordinates than superiors, but these effects did not differ for subordinates and peers. The present findings highlight the role of raters' cultural values in multisource feedback ratings.

  6. Reflections on methodological approaches and conceptual contributions in a program of caregiving research: development and testing of Wuest's theory of family caregiving.

    Science.gov (United States)

    Wuest, Judith; Hodgins, Marilyn J

    2011-02-01

    Caregiving by family members, particularly women, is a societal expectation that is intensifying in the context of an aging population and health care restructuring. Our program of caregiving research spans two decades, moving from inductive theory development using grounded theory methods to deductive theory testing. In this article, we reflect on the serendipitous development of this program of research methodologically and conceptually. We summarize the key conceptual contributions that the program has made to caregiving knowledge, particularly with respect to the past relationship between care recipient and caregiver, obligation to care, caregiver agency, and relationships between caregivers and the health care system.

  7. Testing the rated parameters of new blowers in the Stara Jama coal mine

    Energy Technology Data Exchange (ETDEWEB)

    Zugic, M.

    1987-04-01

    Presents the findings of a joint commission of specialists from the Mining Institute in Belgrade, the Zenica coal mine and the Korfman company at the trials of two KGL-160 blowers manufactured by Korfman in the FRG. These axial blowers were being offered as replacements for old blower equipment at the Stara Jama coal mine. Describes tests carried out by the commission under various operating conditions and with differing settings of the blower blades. Special note was made of electrical consumption, noise and vibration. Details the measurement procedures together with the instrumentation used and presents the results in three tables and one diagram. Concludes that both of the blowers tested fully met their design ratings and also complied with all the operational requirements of the Stara Jama mine. 3 refs.

  8. An evaluation of the EuroNCAP crash test safety ratings in the real world.

    Science.gov (United States)

    Segui-Gomez, Maria; Lopez-Valdes, Francisco J; Frampton, Richard

    2007-01-01

    We investigated whether the rating obtained in the EuroNCAP test procedures correlates with injury protection to vehicle occupants in real crashes using data in the UK Cooperative Crash Injury Study (CCIS) database from 1996 to 2005. Multivariate Poisson regression models were developed, using the Abbreviated Injury Scale (AIS) score by body region as the dependent variable and the EuroNCAP score for that particular body region, seat belt use, mass ratio and Equivalent Test Speed (ETS) as independent variables. Our models identified statistically significant relationships between injury severity and safety belt use, mass ratio and ETS. We could not identify any statistically significant relationships between the EuroNCAP body region scores and real injury outcome except for the protection to pelvis-femur-knee in frontal impacts where scoring "green" is significantly better than scoring "yellow" or "red".

  9. Performance of high-rate TRD prototypes for the CBM experiment in test beam and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Klein-Boesing, Melanie [Institut fuer Kernphysik, Muenster (Germany)

    2008-07-01

    The goal of the future Compressed Baryonic Matter (CBM) experiment is to explore the QCD phase diagram in the region of high baryon densities not covered by other experiments. Among other detectors, it will employ a Transition Radiation Detector (TRD) for tracking of charged particles and electron identification. To meet the demands for tracking and for electron identification at large particle densities and very high interaction rates, high efficiency TRD prototypes have been developed. These prototypes with double-sided pad plane electrodes based on Multiwire Proportional Chambers (MWPC) have been tested at GSI and implemented in the simulation framework of CBM. Results of the performance in a test beam and in simulations are shown. In addition, we present a study of the performance of CBM for electron identification and dilepton reconstruction with this new detector layout.

  10. Conditions for testing the corrosion rates of ceramics in coal gasification systems

    Energy Technology Data Exchange (ETDEWEB)

    Hurley, J.P.; Nowok, J.W. [Univ. of North Dakota, Grand Forks, ND (United States)

    1996-08-01

    Coal gasifier operating conditions and gas and ash compositions affect the corrosion rates of ceramics used for construction in three ways: (1) through direct corrosion of the materials, (2) by affecting the concentration and chemical form of the primary corrodents, and (3) by affecting the mass transport rate of the primary corrodents. To perform an accurate corrosion test on a system material, the researcher must include all relevant corrodents and simulate conditions in the gasifier as closely as possible. In this paper, the authors present suggestions for conditions to be used in such corrosion tests. Two main types of corrosion conditions are discussed: those existing in hot-gas cleanup systems where vapor and dry ash may contribute to corrosion and those experienced by high-temperature heat exchangers and refractories where the main corrodent will be coal ash slag. Only the fluidized-bed gasification systems such as the Sierra Pacific Power Company Pinon Pine Power Project system are proposing the use of ceramic filters for particulate cleanup. The gasifier is an air-blown 102-MWe unit employing a Westinghouse™ ceramic particle filter system operating at up to 1100 °F at 300 psia. Expected gas compositions in the filter will be approximately 25% CO, 15% H₂, 5% CO₂, 5% H₂O, and 50% N₂. Vapor-phase sodium chloride concentrations are expected to be 10 to 100 times the levels in combustion systems at similar temperatures, but in general the concentrations of the minor primary and secondary corrodents are not well understood. Slag corrosiveness will depend on its composition as well as viscosity. For a laboratory test, the slag must be in a thermodynamically stable form before the beginning of the corrosion test to assure that no inappropriate reactions are allowed to occur. Ideally, the slag would be flowing, and the appropriate atmosphere must be used to assure realistic slag viscosity.

  11. Influence of running stride frequency in heart rate variability analysis during treadmill exercise testing.

    Science.gov (United States)

    Bailón, Raquel; Garatachea, Nuria; de la Iglesia, Ignacio; Casajús, Jose Antonio; Laguna, Pablo

    2013-07-01

    The analysis and interpretation of heart rate variability (HRV) during exercise is challenging not only because of the non-stationary nature of exercise, the time-varying mean heart rate, and the fact that respiratory frequency can exceed 0.4 Hz, but also because of other factors, such as the component centered at the pedaling frequency observed in maximal cycling tests, which may confuse the interpretation of HRV analysis. The objectives of this study are to test the hypothesis that a component centered at the running stride frequency (SF) appears in the HRV of subjects during maximal treadmill exercise testing, and to study its influence on the interpretation of the low-frequency (LF) and high-frequency (HF) components of HRV during exercise. The HRV of 23 subjects during maximal treadmill exercise testing is analyzed. The instantaneous power of different HRV components is computed from the smoothed pseudo-Wigner-Ville distribution of the modulating signal assumed to carry information from the autonomic nervous system, which is estimated based on the time-varying integral pulse frequency modulation model. Besides the LF and HF components, a component centered at the running SF, together with its aliases, is revealed. The power associated with the SF component and its aliases represents 22±7% (median ± median absolute deviation) of the total HRV power in all the subjects. Normalized LF power decreases as the exercise intensity increases, while normalized HF power increases. The power associated with the SF does not change significantly with exercise intensity. Consideration of the running SF component and its aliases is very important in HRV analysis, since stride frequency aliases may overlap with the LF and HF components.
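
    Since the aliasing argument is the technical core here, a brief hedged illustration may help: the RR series is intrinsically sampled at the mean heart rate, so a stride-frequency component above half that rate folds back into the analyzed band. The sketch below uses assumed values (mean heart rate, stride frequency) and a generic folding formula; it is not the authors' time-frequency implementation.

```python
# Hedged illustration of why stride-frequency (SF) components alias in HRV:
# the RR series is effectively sampled at the mean heart rate, so any
# modulation above half that rate folds back into the 0 to HR_mean/2 band,
# where it can overlap the LF (0.04-0.15 Hz) or HF bands. Values are
# hypothetical, not taken from the study.

def alias_frequency(f, fs):
    """Fold frequency f (Hz) into the band [0, fs/2] for sampling rate fs (Hz)."""
    return abs(f - round(f / fs) * fs)

hr_mean_bpm = 165.0          # mean heart rate during running (assumed)
fs = hr_mean_bpm / 60.0      # effective sampling rate of the RR series (Hz)
stride_freq = 2.8            # running stride frequency (Hz), assumed

sf_alias = alias_frequency(stride_freq, fs)
print(f"Effective sampling rate: {fs:.2f} Hz (Nyquist {fs/2:.2f} Hz)")
print(f"Stride frequency {stride_freq:.2f} Hz aliases to {sf_alias:.2f} Hz")
print("Falls inside the LF band (0.04-0.15 Hz)?", 0.04 <= sf_alias <= 0.15)
print("Falls inside the HF band (0.15-0.40 Hz)?", 0.15 <= sf_alias <= 0.40)
```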

  12. Effects of Age, Exercise Duration, and Test Conditions on Heart Rate Variability in Young Endurance Horses.

    Science.gov (United States)

    Younes, Mohamed; Robert, Céline; Barrey, Eric; Cottin, François

    2016-01-01

    Although cardiac recovery is an important criterion for ranking horses in endurance competitions, heart rate variability (HRV) has hardly ever been studied in the context of this equestrian discipline. In the present study, we sought to determine whether HRV is affected by parameters such as age, exercise duration and test site. Accordingly, HRV might be used to select endurance horses with the fastest cardiac recovery. The main objective of the present study was to determine the effects of age, exercise duration, and test site on HRV variables at rest and during exercise and recovery in young Arabian endurance horses. Over a 3-year period, 77 young Arabian horses aged 4-6 years performed one or more exercise tests (consisting of a warm-up, cantering at 22 km·h⁻¹ and a final 500 m gallop at full speed) at four different sites. Beat-to-beat RR intervals were continuously recorded and then analyzed (using a time-frequency approach) to determine the instantaneous HRV components before, during and after the test. At rest, the root-mean-square of successive differences in RR intervals (RMSSD) was higher in the 4-year-olds (54.4 ± 14.5 ms) than in the 5- or 6-year-olds (44.9 ± 15.5 and 49.1 ± 11.7 ms, respectively). During the first 15 min of exercise (period T), the heart rate (HR) and RMSSD decreased with age. In 6-year-olds, RMSSD decreased as the exercise duration increased (T: 3.0 ± 1.4 vs. 2T: 3.6 ± 2.2 vs. 3T: 2.8 ± 1.0). During recovery, RMSSD was negatively correlated with the cardiac recovery time (CRT) and the recovery heart rate (RHR; R = -0.56 and -0.53, respectively; both statistically significant). During exercise and recovery, RMSSD and several HRV variables differed significantly as a function of the test conditions. HRV in endurance horses appears to be strongly influenced by age and environmental factors (such as ambient temperature, ambient humidity, and track quality). Nevertheless, RMSSD can be used to select endurance horses with the fastest cardiac recovery.

  13. PRELIMINARY FRIT DEVELOPMENT AND MELT RATE TESTING FOR SLUDGE BATCH 6 (SB6)

    Energy Technology Data Exchange (ETDEWEB)

    Fox, K.; Miller, D.; Edwards, T.

    2009-07-21

    The Liquid Waste Organization (LWO) provided the Savannah River National Laboratory (SRNL) with a Sludge Batch 6 (SB6) composition projection in March 2009. Based on this projection, frit development efforts were undertaken to gain insight into compositional effects on the predicted and measured properties of the glass waste form and to gain insight into frit components that may lead to improved melt rate for SB6-like compositions. A series of SB6-based glasses was selected, fabricated and characterized in this study to better understand the ability of frit compositions to accommodate uncertainty in the projected SB6 composition. Acceptable glasses (compositions where the Product Composition Control System (PCCS) Measurement Acceptability Region (MAR) predicted acceptable properties, good chemical durability was measured, and no detrimental nepheline crystallization was observed) can be made using Frit 418 with SB6 over a range of Na₂O and Al₂O₃ concentrations. However, the ability to accommodate variation in the sludge composition limits the ability to utilize alternative frits for potential improvements in melt rate. Frit 535, which may offer improvements in melt rate due to its increased B₂O₃ concentration, produced acceptable glasses with the baseline SB6 composition at waste loadings of 34 and 42%. However, the PCCS MAR results showed that it is not as robust as Frit 418 in accommodating variation in the sludge composition. Preliminary melt rate testing was completed in the Melt Rate Furnace (MRF) with four candidate frits for SB6. These four frits were selected to evaluate the impacts of B₂O₃ and Na₂O concentrations in the frit relative to those of Frit 418, although they are not necessarily candidates for SB6 vitrification. Higher concentrations of B₂O₃ in the frit relative to that of Frit 418 appeared to improve melt rate. However, when a higher concentration of B₂O₃ was coupled

  14. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    Science.gov (United States)

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…
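
    Randomized rounding itself is a standard technique, so a minimal sketch may clarify the idea even though the paper's exact formulation is not reproduced here: assume an LP relaxation has produced fractional selection values for candidate items, include each item with that probability, and repair the draw to the required sheet length. All item names and values below are hypothetical.

```python
# Minimal sketch of randomized rounding for test-sheet composition, under
# assumptions (not the paper's actual formulation): suppose an LP relaxation
# has already produced fractional selection values x_i in [0, 1] for each
# candidate item. Each item is included with probability x_i, and the draw is
# then repaired to the target sheet size.
import random

def randomized_round(x, target_size, rng=random.Random(0)):
    """Round fractional selections x (dict item -> value in [0,1]) to a
    0/1 selection of exactly target_size items."""
    chosen = [i for i, xi in x.items() if rng.random() < xi]
    # Repair step: adjust to the required sheet length, preferring items
    # with the largest fractional values.
    if len(chosen) > target_size:
        chosen.sort(key=lambda i: x[i], reverse=True)
        chosen = chosen[:target_size]
    elif len(chosen) < target_size:
        rest = sorted((i for i in x if i not in chosen),
                      key=lambda i: x[i], reverse=True)
        chosen += rest[:target_size - len(chosen)]
    return sorted(chosen)

# Hypothetical fractional LP solution over 8 candidate items
x = {"q1": 0.9, "q2": 0.1, "q3": 0.7, "q4": 0.4,
     "q5": 0.05, "q6": 0.8, "q7": 0.3, "q8": 0.75}
print(randomized_round(x, target_size=4))
```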

  15. Rates of testing for HIV in the presence of serodiscordant UAI among HIV-negative gay men in committed relationships.

    Science.gov (United States)

    Chakravarty, Deepalika; Hoff, Colleen C; Neilands, Torsten B; Darbes, Lynae A

    2012-10-01

    We examined testing rates for HIV-negative men (N = 752) from a sample of gay male couples. Approximately half (52 %) tested in the past year. Among men who had engaged in sexual risk behavior in the past 3 months, 27 % tested within that period and 65 % within the past year. For men in concordant relationships these rates were 25 and 60 %, for men in serodiscordant relationships they were 34 and 72 %. MSM in primary relationships are testing at lower rates than the general MSM population, even after potential exposure to HIV. Testing and prevention messages for MSM should factor in relationship status.

  16. A methodological framework to distinguish spectrum effects from spectrum biases and to assess diagnostic and screening test accuracy for patient populations: Application to the Papanicolaou cervical cancer smear test

    Directory of Open Access Journals (Sweden)

    Coste Joël

    2008-02-01

    Background: A spectrum effect was defined as differences in the sensitivity or specificity of a diagnostic test according to the patient's characteristics or disease features. A spectrum effect can lead to a spectrum bias when subgroup variations in sensitivity or specificity also affect the likelihood ratios and thus post-test probabilities. We propose and illustrate a methodological framework to distinguish spectrum effects from spectrum biases. Methods: Data were collected for 1781 women having had a cervical smear test and colposcopy followed by biopsy if abnormalities were detected (the reference standard). Logistic models were constructed to evaluate both the sensitivity and specificity, and the likelihood ratios, of the test and to identify factors independently affecting the test's characteristics. Results: For both tests, human papillomavirus test, study setting and age affected sensitivity or specificity of the smear test (spectrum effect), but only human papillomavirus test and study setting modified the likelihood ratios (spectrum bias) for clinical reading, whereas only human papillomavirus test and age modified the likelihood ratios (spectrum bias) for "optimized" interpretation. Conclusion: Fitting sensitivity, specificity and likelihood ratios simultaneously allows the identification of covariates that independently affect diagnostic or screening test results and distinguishes spectrum effect from spectrum bias. We recommend this approach for the development of new tests, and for reporting test accuracy for different patient populations.
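
    A small numerical illustration of the distinction the authors draw may be useful. The sketch below, with hypothetical 2x2 counts, computes sensitivity, specificity and likelihood ratios per covariate subgroup: differing Se/Sp across subgroups indicates a spectrum effect, and only if the likelihood ratios (and thus post-test probabilities) also shift does it become a spectrum bias. It is not the study's logistic-model implementation.

```python
# Hedged illustration (hypothetical counts, not the study's data) of the
# distinction between spectrum effect and spectrum bias.

def test_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and likelihood ratios from 2x2 counts."""
    se = tp / (tp + fn)
    sp = tn / (tn + fp)
    lr_pos = se / (1 - sp)
    lr_neg = (1 - se) / sp
    return se, sp, lr_pos, lr_neg

# Hypothetical subgroups defined by a covariate (e.g. HPV status)
subgroups = {
    "covariate = 0": dict(tp=80, fn=20, tn=180, fp=20),
    "covariate = 1": dict(tp=95, fn=5, tn=140, fp=60),
}

for name, counts in subgroups.items():
    se, sp, lrp, lrn = test_metrics(**counts)
    print(f"{name}: Se={se:.2f} Sp={sp:.2f} LR+={lrp:.2f} LR-={lrn:.2f}")
# Here Se and Sp differ between subgroups (spectrum effect); because the
# likelihood ratios also shift, the covariate would additionally produce a
# spectrum bias in this toy example.
```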

  17. Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    Science.gov (United States)

    Miller, James; Leggett, Jay; Kramer-White, Julie

    2008-01-01

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.

  18. Metodologia alternativa para condução do teste de envelhecimento acelerado em sementes de milho Alternative methodology for the accelerated aging test for corn seeds

    Directory of Open Access Journals (Sweden)

    Sonia Regina Mudrovitsch de Bittencourt

    2012-08-01

    Vigor tests are routinely used in internal quality control programs by seed companies. For this purpose, it is necessary to choose efficient methods that provide quick answers for decisions related to the handling, discarding and marketing of seed lots. This research aimed to verify whether the execution period of the accelerated aging test (AA) for corn seeds can be shortened by using the tetrazolium test (TZ; viability and vigor) instead of the germination test (GT) to evaluate seed performance after aging, in 10 seed lots of seven corn genotypes, with and without fungicide treatment. The data obtained with the proposed methodology (AA+TZ) were compared with the values determined by the accelerated aging test performed with the traditional methodology (AA+GT). The use of the tetrazolium test (vigor), associated with the accelerated aging test, provided information similar to that given by the germination test used for the same purpose, reducing from eight to three days the time needed to obtain results for corn seeds.

  19. Leak rate and burst test data for McGuire Unit 1 steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Sherburne, P.A. [B& W Nuclear Service Co., Lynchburg, VA (United States); Frye, C.R. [Babcock & Wilcox Co., Lynchburg, VA (United States); Mayes, D.B. [Duke Power Co., Charlotte, NC (United States)

    1992-12-31

    To support the development of tube plugging criteria that would allow tubes with through-wall cracks to remain in service, sections of 12 tubes were removed from the McGuire Unit-1 steam generators. These tubes were sent to B&W Nuclear Service Company for metallographic examination and for determination of burst pressure and leak rate at both operating and faulted conditions. Primary water stress corrosion cracking (PWSCC) had degraded these tubes in the tube-to-tubesheet roll transitions. To measure primary-to-secondary leakage at pressures and temperatures equivalent to those in the McGuire Unit-1 steam generators, an autoclave-based test loop was designed and installed at the Babcock & Wilcox Lynchburg Research Center. Sections of the tube containing the roll transitions were then installed in the autoclave and actual primary-to-secondary leakage was measured at 288 °C (550 °F) and at 9 and 18.3 MPa (1300 and 2650 psi) pressure differentials. Following the leak test, the tubes were pressurized internally until the tube wall ruptured. Leak rate, burst pressure, and eddy-current information were then correlated with the through-wall crack lengths as determined by metallographic examination. Results confirm the ability to measure the crack length with eddy-current techniques. Results also support analytical and empirical models developed by the nuclear industry in calculating critical crack lengths in roll transitions.

  20. A generalized Forchheimer radial flow model for constant-rate tests

    Science.gov (United States)

    Liu, Ming-Ming; Chen, Yi-Feng; Zhan, Hongbin; Hu, Ran; Zhou, Chuang-Bing

    2017-09-01

    Models used for data interpretation of constant-rate tests (CRTs) are commonly derived with the assumption of Darcian flow in an idealized integer flow dimension, where the non-Darcian nature of fluid flow and the complexity of flow geometry are disregarded. In this study, a Forchheimer's law-based analytical model is proposed with the assumption of buildup (or drawdown) decomposition for characterizing the non-Darcian flow in a generalized radial formation where the flow dimension n may become non-integer. The proposed model immediately reduces to Barker's (1988) model for Darcian flow in the generalized radial formation and to Mathias et al.'s (2008) model for non-Darcian flow in a two-dimensional confined aquifer. A comparison with numerical simulations shows that the proposed model behaves well at late times for flow dimension n > 1.5. The proposed model is finally applied for data interpretation of the constant-rate pumping tests performed at Ploemeur (Le Borgne et al., 2004), showing that the intrinsic hydraulic conductivity of formations will be underestimated and the specific storage will be overestimated if the non-Darcian effect is ignored. The proposed model is an extension of the generalized radial flow (GRF) model based on Forchheimer's law, which would be of significance for data interpretation of CRTs in aquifers of complex flow geometry in which non-Darcian flow occurs.
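
    The two ingredients the proposed model combines can be illustrated numerically, without reproducing the paper's analytical solution: Barker's generalized radial flow geometry, in which the flow area at radius r scales as r^(n-1), and Forchheimer's law for the head gradient. The sketch below uses one common head form of Forchheimer's law and assumed parameter values (the beta value is chosen only to make the non-Darcian term visible near the well).

```python
# A minimal numerical sketch (not the paper's analytical solution) of the two
# ingredients combined in a Forchheimer-based GRF model: the generalized
# radial flow area A(r) = alpha_n * b**(3-n) * r**(n-1), with
# alpha_n = 2*pi**(n/2)/Gamma(n/2), and Forchheimer's law in head form,
# -dh/dr = q/K + beta*q**2. All parameter values are assumptions.
import math

def flow_area(r, n, b):
    """Flow area of the generalized radial flow (GRF) geometry at radius r."""
    alpha_n = 2.0 * math.pi ** (n / 2.0) / math.gamma(n / 2.0)
    return alpha_n * b ** (3.0 - n) * r ** (n - 1.0)

def forchheimer_gradient(Q, r, n, b, K, beta):
    """Head gradient magnitude -dh/dr for a constant pumping rate Q
    (steady-state, incompressible approximation)."""
    q = Q / flow_area(r, n, b)          # specific discharge (m/s)
    return q / K + beta * q ** 2        # Darcian term + non-Darcian term

# Assumed parameters: pumping rate, extent b, conductivity, Forchheimer beta
Q, b, K, beta = 5.0e-3, 10.0, 1.0e-4, 5.0e7
for n in (1.5, 2.0, 2.5):
    g_near = forchheimer_gradient(Q, r=0.1, n=n, b=b, K=K, beta=beta)
    g_far = forchheimer_gradient(Q, r=10.0, n=n, b=b, K=K, beta=beta)
    print(f"n={n}: -dh/dr at r=0.1 m: {g_near:.3e}, at r=10 m: {g_far:.3e}")
# With these assumed values the quadratic (non-Darcian) term dominates near
# the well and fades with distance, which is the regime in which ignoring it
# would bias the inferred hydraulic conductivity.
```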

  1. Could sperm aneuploidy rate determination be used as a predictive test before intracytoplasmic sperm injection?

    Science.gov (United States)

    Petit, François M; Frydman, Nelly; Benkhalifa, Moncef; Le Du, Anne; Aboura, Azzedine; Fanchin, Renato; Frydman, Rene; Tachdjian, Gerard

    2005-01-01

    Chromosome abnormalities in embryos are a major cause of implantation and development failures. Some couples with normal karyotypes have repeated implantation failures after intracytoplasmic sperm injection (ICSI). In order to evaluate patients at risk for genetic ICSI failures and the validity of sperm aneuploidy analysis, we have studied cytogenetic abnormalities in sperm from ICSI patients. Twenty-nine patients with normal karyotypes were included. Ten patients had at least 4 ICSI treatments without pregnancy (group A). Nine patients had a pregnancy after 1 to 3 ICSI treatments (group B). Ten fertile men with normal semen parameters were studied as controls (group C). Fluorescent in situ hybridization (FISH) was used for sperm nucleus cytogenetic analysis using chromosome 8, 9, 13, 18, 21, X, and Y specific probes. Aneuploidy for each chromosome and diploidy rates were significantly higher in group A than in group B, and in group B than in group C (P < .05). Considering each patient in groups A and B, the aneuploidy rate for each chromosome was too variable to be considered a significant test. We therefore proposed analysis of the total sperm aneuploidy. The chromosomal sperm nuclei profile could be used as a predictive biological test before ICSI in order to improve genetic counseling for oligoasthenoteratozoospermia patients.

  2. Changes in Heart Rate Variability during a tonic immobility test in quail.

    Science.gov (United States)

    Valance, Dorothée; Després, Gérard; Richard, Sabine; Constantin, Paul; Mignon-Grasteau, Sandrine; Leman, Samuel; Boissy, Alain; Faure, Jean-Michel; Leterrier, Christine

    2008-02-27

    Tonic immobility (TI) is an unlearned fear response induced by a brief physical restraint and characterized by a marked autonomic nervous system involvement. This experiment aimed at studying the relative involvement of both autonomic sub-systems, the sympathetic and parasympathetic nervous systems, during TI, by analyzing Heart Rate Variability. Quail selected genetically for long (LTI) or short (STI) TI duration and quail from a control line (CTI) were used. The animals were surgically fitted with a telemetric device to record electrocardiograms before and during a TI test. Heart rate did not differ between lines at rest. The induction of TI, whether effective or not, induced an increase in HR characterized by a shift of the sympathovagal balance towards a higher sympathetic dominance. Parasympathetic activity was lower during effective than during non-effective inductions in CTI quail. During TI, the increase in sympathetic dominance was initially maintained and then declined, while relative parasympathetic activity remained low, especially in CTI and STI lines. The end of tonic immobility was characterized by a rise in overall autonomic activity in all lines and an increase in parasympathetic influence in CTI and STI quail. To conclude, the susceptibility to TI cannot be explained only by autonomic reflex changes. It is probably strongly related to the perception of the test by the quail. During TI, the differences between lines in autonomic responses probably reflect behavioural differences in the fear response.

  3. Methodological and Theoretical Issues in the Adaptation of Sign Language Tests: An Example from the Adaptation of a Test to German Sign Language

    Science.gov (United States)

    Haug, Tobias

    2012-01-01

    Despite the current need for reliable and valid test instruments in different countries in order to monitor the sign language acquisition of deaf children, very few tests are commercially available that offer strong evidence for their psychometric properties. This mirrors the current state of affairs for many sign languages, where very little…

  4. Evaluating the Effects of Restraint Systems on 4WD Testing Methodologies: A Collaborative Effort between the NVFEL and ANL

    Science.gov (United States)

    Testing vehicles for emissions and fuel economy has traditionally been conducted with a single-axle chassis dynamometer. The 2006 SAE All Wheel Drive Symposium cited four wheel drive (4WD) and all wheel drive (AWD) sales as climbing from 20% toward 30% of a motor vehicle mar...

  5. In Vitro Antifungal Susceptibility Testing of Candida Isolates with the EUCAST Methodology, a New Method for ECOFF Determination.

    Science.gov (United States)

    Meletiadis, J; Curfs-Breuker, I; Meis, J F; Mouton, J W

    2017-04-01

    The in vitro susceptibilities of 1,099 molecularly identified clinical Candida isolates against 8 antifungal drugs were determined using the EUCAST microdilution method. A new simple, objective, and mathematically solid method for determining epidemiological cutoff values (ECOFFs) was developed by derivatizing the MIC distribution and determining the derivatized ECOFF (dECOFF) as the highest MIC with the maximum second derivative. The dECOFFs were similar (95% agreement within 1 dilution) to the EUCAST ECOFFs. Overall, low non-wild-type/resistance rates were found. The highest rates were found for azoles with C. parapsilosis (2.7 to 9.8%), C. albicans (7%), and C. glabrata (1.7 to 2.3%) and for echinocandins with C. krusei (3.3%), C. albicans (1%), and C. tropicalis (1.7%). Copyright © 2017 American Society for Microbiology.
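
    The dECOFF construction can be sketched with discrete differences, although the exact derivatization and smoothing used by the authors are not given here. The illustration below uses a hypothetical wild-type MIC distribution over two-fold dilutions and takes the dECOFF as the highest MIC at which the second difference of the counts is maximal.

```python
# Hedged sketch of the dECOFF idea described in the abstract: "derivatize"
# the MIC distribution with discrete differences and take as the dECOFF the
# highest MIC whose second derivative is maximal. Counts are hypothetical.

# Hypothetical wild-type MIC distribution (mg/L -> number of isolates)
mic_counts = {0.016: 2, 0.03: 10, 0.06: 45, 0.125: 120,
              0.25: 60, 0.5: 8, 1.0: 1, 2.0: 0}

mics = sorted(mic_counts)                   # two-fold dilution steps
counts = [mic_counts[m] for m in mics]

# Discrete first and second differences along the log2-spaced MIC axis
d1 = [counts[i + 1] - counts[i] for i in range(len(counts) - 1)]
d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]   # aligned with mics[1:-1]

max_d2 = max(d2)
# "highest MIC with the maximum second derivative"
decoff = max(m for m, v in zip(mics[1:-1], d2) if v == max_d2)
print(f"second differences: {d2}")
print(f"dECOFF (sketch): {decoff} mg/L")
```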

  6. NASA-STD-6001B Test 7: Impact of Test Methodology and Detection Advancements on the Obsolescence of Historical Offgas Data

    Science.gov (United States)

    Buchanan, Vanessa D.; Woods, Brenton; Harper, Susana A.; Beeson, Harold D.; Perez, Horacio; Ryder, Valerie; Tapia, Alma S.; Pedley, Michael D.

    2017-01-01

    NASA-STD-6001B states "all nonmetals tested in accordance with NASA-STD-6001 should be retested every 10 years or as required by the responsible program/project." The retesting of materials helps ensure the most accurate data are used in material selection. Manufacturer formulas and processes can change over time, sometimes without an update to product number and material information. Material performance in certain NASA-STD-6001 tests can be particularly vulnerable to these changes, such as material offgas (Test 7). In addition, Test 7 analysis techniques at NASA White Sands Test Facility were dramatically enhanced in the early 1990s, resulting in improved detection capabilities. Low level formaldehyde identification was improved again in 2004. Understanding the limitations in offgas analysis data prior to 1990 puts into question the validity and current applicability of that data. Case studies on Super Koropon (Registered trademark) and Aeroglaze (Registered trademark) topcoat highlight the importance of material retesting.

  7. Venting Design for Di-tert-butyl Peroxide Runaway Reaction Based on Accelerating Rate Calorimeter Test

    Institute of Scientific and Technical Information of China (English)

    魏彤彤; 蒋慧灵

    2012-01-01

    In order to design the relief system size for di-tert-butyl peroxide (DTBP) storage tanks, the runaway reaction of DTBP was simulated by accelerating rate calorimetry (ARC). The results indicated that under adiabatic conditions the initial exothermic temperature was 102.6 °C, the maximum self-heating rate was 3.095×10⁷ °C·min⁻¹, the maximum self-heating temperature was 375.9 °C, and the pressure produced per unit mass was 4.512 MPa·g⁻¹. Judged by the ARC test, the emergency relief system for DTBP is a hybrid system. Based on the Design Institute for Emergency Relief Systems (DIERS) method, the releasing mass flow rate W was determined by Leung's method, and the mass velocity G was calculated by two modified Omega methods. The two relief sizes calculated by the monograph Omega method and the arithmetic Omega method are close, with only 0.63% relative error. The monograph Omega method is more convenient to apply.

  8. High Strain Rate Testing of Rocks using a Split-Hopkinson-Pressure Bar

    Science.gov (United States)

    Zwiessler, Ruprecht; Kenkmann, Thomas; Poelchau, Michael; Nau, Siegfried; Hess, Sebastian

    2016-04-01

    Dynamic mechanical testing of rocks is important to define the onset of rate dependency of brittle failure. The strain rate dependency occurs through the propagation velocity limit (Rayleigh wave speed) of cracks and their reduced ability to coalesce, which, in turn, significantly increases the strength of the rock. We use a newly developed, pressurized-air-driven Split-Hopkinson-Pressure Bar (SHPB) that is specifically designed for high strain rate testing of rocks, consisting of several 10 to 50 cm long strikers and bar components of 50 mm in diameter and 2.5 meters in length each. The whole setup, composed of striker, incident and transmission bar, is available in aluminum, titanium and maraging steel to minimize the acoustic impedance contrast, determined by the change of density and speed of sound, to the specific rock under investigation. Dynamic mechanical parameters are obtained in compression as well as in spallation configuration, covering a wide spectrum from intermediate to high strain rates (10⁰ to 10³ s⁻¹). In SHPB experiments [1], one-dimensional longitudinal compressive pulses of diverse shapes and lengths - formed with pulse shapers - are used to generate a variety of loading histories under 1D states of stress in cylindrical rock samples, in order to measure the respective stress-strain response at specific strain rates. Subsequent microstructural analysis of the deformed samples is aimed at quantification of fracture orientation, fracture pattern, fracture density, and fracture surface properties as a function of the loading rate. Linking mechanical and microstructural data to natural dynamic deformation processes has relevance for the understanding of earthquakes, landslides, and impacts, and has several rock engineering applications. For instance, experiments on dynamic fragmentation help to unravel super-shear rupture events that pervasively pulverize rocks up to several hundred meters from the fault core [2, 3, 4]. The dynamic, strain
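
    The classical one-wave reduction of SHPB signals is standard and may help make the quoted strain-rate range concrete; the sketch below applies the usual Kolsky relations to made-up reflected and transmitted strain histories, and is not the facility's actual processing chain.

```python
# A hedged sketch of the classical one-wave Split-Hopkinson-Pressure-Bar
# reduction (Kolsky analysis): with the reflected and transmitted bar strain
# signals, the specimen strain rate, strain and stress follow from
#   strain_rate(t) = -2*c0*eps_r(t)/L_s   and   sigma(t) = E_b*(A_b/A_s)*eps_t(t).
# Signal values and dimensions below are made up for illustration.
import numpy as np

def shpb_one_wave(eps_r, eps_t, dt, c0, L_s, E_b, A_b, A_s):
    """One-wave SHPB analysis.
    eps_r, eps_t : reflected/transmitted bar strain histories (arrays)
    dt           : sampling interval (s)
    c0           : bar wave speed (m/s);  L_s : specimen length (m)
    E_b          : bar Young's modulus (Pa); A_b, A_s : bar/specimen areas (m^2)
    """
    strain_rate = -2.0 * c0 * eps_r / L_s                 # specimen strain rate
    strain = np.cumsum(strain_rate) * dt                  # specimen strain
    stress = E_b * (A_b / A_s) * eps_t                    # specimen stress
    return strain_rate, strain, stress

# Hypothetical signals: 200 samples at 1 MHz
t = np.arange(200) * 1e-6
eps_r = -4e-4 * np.sin(np.pi * t / t[-1])     # reflected pulse
eps_t = 3e-4 * np.sin(np.pi * t / t[-1])      # transmitted pulse

rate, strain, stress = shpb_one_wave(eps_r, eps_t, dt=1e-6, c0=5000.0,
                                     L_s=0.025, E_b=200e9,
                                     A_b=np.pi * 0.025**2, A_s=np.pi * 0.0125**2)
print(f"peak strain rate ~ {rate.max():.0f} 1/s, peak stress ~ {stress.max()/1e6:.0f} MPa")
```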

  9. The mathematical analysis of the heart rate and blood lactate curves during incremental exercise testing.

    Science.gov (United States)

    Rosic, Mirko; Ilic, V; Obradovic, Z; Pantovic, S; Rosic, G

    2011-12-01

    This paper describes a new mathematical approach for the analysis of HR (heart rate) and BL (blood lactate) curves during incremental exercise testing, using the HR/BL curve and its derivatives and taking into account the native shape of all curves, without any linear approximation. Using this approach, the results indicate the appearance of three characteristic points (A, B and C) on the HR/BL curve. Point A corresponds to the load (12.73 ± 0.46 km·h⁻¹) at which BL starts to increase above resting levels (0.9 ± 0.06 mM), and is analogous to Lactate Turn Point 1 (LTP1). Point C corresponds to a BL of approximately 4 mM, and is analogous to LTP2. Point B corresponds to the load (16.32 ± 0.49 km·h⁻¹) at which the moderate increase turns into a more pronounced increase in BL. This point has not been previously recognized in the literature. We speculate that it represents an attenuation of the increase in left ventricular ejection fraction (LVEF), accompanied by a decrease in diastolic time duration during incremental exercise testing. The proposed mathematical approach allows precise determination of lactate turn points during incremental exercise testing.
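
    For illustration only, the characteristic points can be approximated numerically from incremental-test data; the sketch below uses invented load and blood lactate values and simple finite-difference derivatives, whereas the paper works on the native HR/BL curve and its derivatives without linear approximation.

```python
# Hedged numerical sketch of locating the three characteristic points on
# incremental-test data. Purely for illustration, point A is taken where
# blood lactate (BL) first rises above resting level, point B where the
# second derivative of BL with respect to load is largest (onset of the
# steeper rise), and point C where BL crosses 4 mM. Data are invented.
import numpy as np

load = np.array([8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18], dtype=float)   # km/h
bl   = np.array([0.9, 0.95, 1.0, 1.05, 1.1, 1.4, 1.8, 2.3, 3.1, 4.3, 6.0])  # mM

rest_bl = bl[0]
d1 = np.gradient(bl, load)           # first derivative of BL vs load
d2 = np.gradient(d1, load)           # second derivative

point_a = load[np.argmax(bl > rest_bl + 0.2)]        # first clear rise above rest
point_b = load[np.argmax(d2)]                        # strongest change of slope
point_c = float(np.interp(4.0, bl, load))            # BL = 4 mM crossing (LTP2 analogue)

print(f"A ~ {point_a:.1f} km/h, B ~ {point_b:.1f} km/h, C ~ {point_c:.1f} km/h")
```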

  10. Reliability of heart rate variability threshold and parasympathetic reactivation after a submaximal exercise test

    Directory of Open Access Journals (Sweden)

    Carlos Janssen Gomes da Cruz

    The objective of this study was to evaluate the reproducibility of the heart rate variability threshold (HRVT) and of parasympathetic reactivation in physically active men (n = 16, 24.3 ± 5.1 years). During the test, HRVT was assessed by SD1 and r-MSSD dynamics. Immediately after exercise, r-MSSD was analyzed in segments of 60 seconds for a period of five minutes. High absolute and relative reproducibility of HRVT was observed, as assessed by SD1 and r-MSSD dynamics (ICC = 0.92, CV = 10.8, SEM = 5.8). During the recovery phase, moderate to high reproducibility was observed for r-MSSD from the first to the fifth minute (ICC = 0.69-0.95, CV = 7.5-14.2, SEM = 0.07-1.35). We conclude that HRVT and r-MSSD analysis after a submaximal stress test are highly reproducible measures that might be used to assess the acute and chronic effects of exercise training on cardiac autonomic modulation during and/or after a submaximal stress test.
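
    The two indices the analysis relies on are easy to state explicitly; the sketch below computes r-MSSD and the Poincaré SD1 (which for practical purposes equals RMSSD divided by the square root of 2) from an assumed RR segment, and is not the study's processing pipeline.

```python
# Minimal sketch (assumed data, not the study's) of the two HRV indices used
# in the abstract: r-MSSD, the root mean square of successive RR-interval
# differences, and SD1, the short-term Poincare-plot dispersion.
import math

def rmssd(rr_ms):
    """Root mean square of successive differences of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def sd1(rr_ms):
    """Poincare SD1; for practical purposes equal to RMSSD / sqrt(2)."""
    return rmssd(rr_ms) / math.sqrt(2)

# Hypothetical 60-s RR segment recorded after a submaximal test (ms)
rr = [610, 622, 615, 640, 655, 630, 648, 660, 642, 658, 671, 650]
print(f"RMSSD = {rmssd(rr):.1f} ms, SD1 = {sd1(rr):.1f} ms")
```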

  11. Development of a Flight Test Methodology for a U.S. Navy Half-Scale Unmanned Air Vehicle

    Science.gov (United States)

    1989-03-01

    aircraft longitudinal center of gravity (CG) and were used to help stabilize the aircraft pitch and roll axes during flight testing and to lighten...consisted of an 18-ounce fuel tank, a fuselage mounted fueling connection and a Perry Regulated fuel pump. The fuel tank was mounted on the aircraft ... longitudinal CG so as to minimize CG movement during flight. Because the fuel tank was located approximately 15 inches ...

  12. A Comparison of Validity Rates between Paper-and-Pencil and Computerized Testing with the MMPI-2

    Science.gov (United States)

    Blazek, Nicole L.; Forbey, Johnathan D.

    2011-01-01

    Although the use of computerized testing in psychopathology assessment has increased in recent years, limited research has examined the impact of this format in terms of potential differences in test validity rates. The current study explores potential differences in the rates of valid and invalid Minnesota Multiphasic Personality Inventory--2…

  13. The network adjustment aimed for the campaigned gravity survey using a Bayesian approach: methodology and model test

    Science.gov (United States)

    Chen, Shi; Liao, Xu; Ma, Hongsheng; Zhou, Longquan; Wang, Xingzhou; Zhuang, Jiancang

    2017-04-01

    The relative gravimeter, which generally uses a zero-length spring as the gravity sensor, is still the first choice in terrestrial gravity measurement because of its efficiency and low cost. Because the drift rate of the instrument changes with time and from meter to meter, estimating it requires returning to base stations, or to stations with known gravity values, for repeated measurements at regular intervals of a few hours during a practical survey. However, in campaigned gravity surveys over large regions, where stations are several to tens of kilometers apart, frequent returns for closure measurements greatly reduce survey efficiency and are extremely time-consuming. In this paper, we propose a new gravity data adjustment method for estimating the meter drift by means of Bayesian statistical inference. In our approach, we assume that the drift rate is a smooth function of time. Trade-off parameters are used to control the fitting residuals, and Akaike's Bayesian Information Criterion (ABIC) is employed to estimate these trade-off parameters. Comparison and analysis of simulated data between the classical and the Bayesian adjustment show that our method is robust and adapts to irregular, non-linear meter drift. Finally, we applied this approach to real campaigned gravity data from North China. Our adjustment method is suitable for recovering the time-varying drift-rate function of each meter and for detecting abnormal meter drift during the survey. We also define an alternative error estimate for the inverted gravity value at each station on the basis of marginal distribution theory. Acknowledgment: This research is supported by Science Foundation Institute of Geophysics, CEA from the Ministry of Science and Technology of China (Nos. DQJB16A05; DQJB16B07), China National Special Fund for Earthquake
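
    As a rough, toy-scale illustration of the idea (not the authors' formulation), the sketch below jointly estimates station gravity values and a piecewise-linear meter drift from synthetic relative-gravimeter readings, with a second-difference smoothness prior whose trade-off parameter is chosen by minimizing an ABIC-like criterion (-2 log marginal likelihood + 2 x number of hyperparameters) for a Gaussian linear model. All parameterizations, priors and numbers are assumptions.

```python
# Toy sketch of a Bayesian drift adjustment with ABIC-style hyperparameter
# selection. Model: reading = station value + drift(time) + noise, with the
# drift parameterized by values at time nodes and a smoothness prior on its
# second differences. Everything below is assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic campaign: one meter, 4 stations, slowly varying nonlinear drift
t_obs = np.sort(rng.uniform(0.0, 10.0, 40))                  # observation times (h)
stations = rng.integers(0, 4, t_obs.size)                    # visited station index
g_true = np.array([0.0, 0.35, -0.20, 0.10])                  # mGal, station 0 is datum
drift_true = 0.02 * t_obs + 0.015 * np.sin(0.6 * t_obs)      # mGal
y = g_true[stations] + drift_true + rng.normal(0.0, 0.005, t_obs.size)

# Unknowns: g[1..3] plus drift values at equally spaced time nodes
nodes = np.linspace(0.0, 10.0, 11)
n_g, n_d = 3, nodes.size
A = np.zeros((t_obs.size, n_g + n_d))
for k, (tk, sk) in enumerate(zip(t_obs, stations)):
    if sk > 0:
        A[k, sk - 1] = 1.0                                   # station value (station 0 fixed)
    j = max(min(np.searchsorted(nodes, tk) - 1, n_d - 2), 0) # linear drift interpolation
    w = (tk - nodes[j]) / (nodes[j + 1] - nodes[j])
    A[k, n_g + j] += 1.0 - w
    A[k, n_g + j + 1] += w

# Smoothness prior on drift nodes: lam^2 * ||second differences||^2
D = np.zeros((n_d - 2, n_g + n_d))
for i in range(n_d - 2):
    D[i, n_g + i:n_g + i + 3] = [1.0, -2.0, 1.0]

def abic(lam, sigma=0.005, eps=1e-6):
    """ABIC-like criterion for prior precision (lam^2 D'D + eps I)/sigma^2."""
    P = lam**2 * D.T @ D + eps * np.eye(A.shape[1])
    M = A.T @ A + P
    m_hat = np.linalg.solve(M, A.T @ y)
    quad = (y @ y - y @ A @ m_hat) / sigma**2
    log_evidence = -0.5 * (y.size * np.log(2 * np.pi * sigma**2)
                           + np.linalg.slogdet(M)[1]
                           - np.linalg.slogdet(P)[1] + quad)
    return -2.0 * log_evidence + 2.0, m_hat   # one hyperparameter (lam)

lams = [1.0, 10.0, 100.0, 1000.0]
best = min((abic(l)[0], l) for l in lams)
print("ABIC per lambda:", [(l, round(abic(l)[0], 1)) for l in lams])
print("selected lambda:", best[1],
      "estimated station values:", np.round(abic(best[1])[1][:n_g], 3))
```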

  14. Effect of Rating Scales and Test Parts of Body on the Evaluation Results of Fabric-evoked Prickle

    Institute of Scientific and Technical Information of China (English)

    WANG Ge-hui; RON Postle; ZHANG Wei-yuan

    2006-01-01

    The effects of rating scale and test part of the body on fabric-evoked prickle evaluation results are studied by carrying out subjective evaluation tests under controlled environmental conditions of (24±1) °C and (65±5)% RH. Ten female college students aged about 20, who had participated in preliminary training on subjective prickle evaluation, were chosen as subjects. The prickle of a range of 9 light-weight worsted woven wool and wool-blend fabrics and a cotton fabric was evaluated using a 1-5 rating scale and a 0-10 rating scale at different parts of the body, namely the forearm, the upper arm ball and the back of the neck. The test results were statistically analyzed. It is found that there is a significant correlation between the evaluation results of the 1-5 rating scale and those of the 0-10 rating scale. It is also found that there are highly significant correlations between the evaluation results of the forearm prickle test and the neck back prickle test, between the neck back prickle test and the upper arm ball prickle test, and between the forearm prickle test and the upper arm ball prickle test. It is suggested that the forearm prickle test is preferable for evaluating fabric-evoked prickle because of its convenience and sensitivity.

  15. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  16. [Optimization of reaction conditions and methodological investigation on microtox-based fast testing system for traditional Chinese medicine injection].

    Science.gov (United States)

    Gao, Hong-Li; Li, Xiao-Rong; Yan, Liang-Chun; Zhao, Jun-Ning

    2016-05-01

    Vibrio fischeri CS234 was used to establish and optimize a Microtox assay system, laying a foundation for the application of this method in comprehensive acute toxicity testing of traditional Chinese medicine (TCM) injections. Firstly, the Plackett-Burman method was used to optimize the factors affecting Vibrio fischeri CS234 luminescence. Secondly, ZnSO₄·7H₂O was chosen as the reference substance to establish its reaction system with quality control samples. The optimal luminescence conditions were as follows: at a temperature of (15±1) °C, Vibrio fischeri CS234 lyophilized powder was equilibrated for 15 min, then 1 mL of resuscitation fluid was added and blended for 10 min; 100 μL of the bacterial suspension was taken to measure the initial luminescence intensity, and then 1 mL of resuscitation fluid or test sample was immediately added; after reaction for 10 min, the corresponding luminescence intensity was measured again. The resuscitation diluent, osmotic pressure regulator and ZnSO₄·7H₂O stock solution showed no interference with the determination of Vibrio fischeri CS234 luminescence intensity, so the method has good specificity. The within- and between-batch precisions of the quality control and lower limit of quantification (LLOQ) samples met the testing needs of the vast majority of traditional Chinese medicine injections. The Vibrio fischeri strain CS234 assay system was specific, stable, sensitive, accurate and adaptable after optimization, and is therefore suitable for the comprehensive acute toxicity assessment of TCM injections. Copyright© by the Chinese Pharmaceutical Association.

  17. Heart Rate Variability Analysis in Revascularized Individuals Submitted to an Anaerobic Potency Test

    Directory of Open Access Journals (Sweden)

    Geraldo Mendes Gutian Jr

    2007-10-01

    The objective of this study was to analyze the behavior of autonomic modulation before, during and after the Modified Wingate Test (WanMT), through the analysis of Heart Rate Variability (HRV). Six volunteers between the ages of 40 and 70, post-revascularization procedures (angioplasty and/or surgery, mean duration 10 months), were submitted to supervised training for at least 10 to 14 months. The following protocol, divided into 5 phases, was used: 1) Rest Phase (RP): 180 seconds; 2) Submaximum Phase (SP): 30 seconds; 3) Maximum Phase (MP): 30 seconds; 4) Active Recuperation Phase (ARP): 120 seconds; and 5) Passive Recuperation Phase (PRP): 180 seconds. For the WanMT, we selected a load of 3.75% of body weight for all volunteers. To analyze the HRV, we used the following parameters: the RR interval, MNN, SDNN, RMSSD and PNN50. We only observed results for the group according to the RMSSD parameter during the rest phase of the test protocol, in which the group remained in vagal presence, and during all other phases in vagal depression. However, when we analyzed the PNN50, we observed that the group was in medium vagal presence during all phases of the test, though there was no statistically significant difference (p > 0.05) between the phases. Therefore, we can say that all of the individuals had a similar profile in the autonomic response to the WanMT, as confirmed by the parameters studied in the analysis of HRV in the time domain.

  18. Software Supportability Risk Assessment in OT&E (Operational Test and Evaluation): An Evaluation of Risk Methodologies.

    Science.gov (United States)

    1984-08-31

    ... 1801 Randolph Road, S.E., Albuquerque, New Mexico 87106, to the Air Force Operational Test and Evaluation Center, Kirtland Air Force Base, New Mexico 87117. The metrics described in section 3.4 should be reported in a storyboard fashion; this technique would act as a summary of the risk assessment process.

  19. Documentation of tests on particle size methodologies for laser diffraction compared to traditional sieving and sedimentation analysis

    DEFF Research Database (Denmark)

    Rasmussen, Charlotte; Dalsgaard, Kristian

    Sieving and sedimentation analyses by pipette or hydrometer are historically the traditional methods for determining particle size distributions (PSD). A more informative and faster alternative has for years been laser diffraction (LD). From 2003 to 2013 the authors of this paper have worked...... intensively with PSD and performed various tests and investigations, using LD, sedimentation (by pipette) and sieving. The aim was to improve and understand the relationship between these various techniques, pre-treatment effects and preferably find a unifying correlation factor. As a result, method...... content and expected PSD....

  20. Mean Abnormal Result Rate: Proof of Concept of a New Metric for Benchmarking Selectivity in Laboratory Test Ordering.

    Science.gov (United States)

    Naugler, Christopher T; Guo, Maggie

    2016-04-01

    There is a need to develop and validate new metrics to assess the appropriateness of laboratory test requests. The mean abnormal result rate (MARR) is a proposed measure of ordering selectivity, the premise being that higher mean abnormal rates represent more selective test ordering. As a validation of this metric, we compared the abnormal rate of lab tests with the number of tests ordered on the same requisition. We hypothesized that requisitions with larger numbers of requested tests represent less selective test ordering and would therefore have a lower overall abnormal rate. We examined 3,864,083 tests ordered on 451,895 requisitions and found that the MARR decreased from about 25% if one test was ordered to about 7% if nine or more tests were ordered, consistent with less selectivity when more tests were ordered. We then examined the MARR for community-based testing for 1,340 family physicians and found both a wide variation in MARR and an inverse relationship between the total tests ordered per year per physician and the physician-specific MARR. The proposed metric represents a new utilization metric for benchmarking the relative selectivity of test orders among physicians. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
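
    The metric itself is simple enough to state in code; the sketch below, with hypothetical records and field names, computes a physician-level MARR and the abnormal rate grouped by the number of tests per requisition, mirroring the validation described in the abstract.

```python
# Hedged sketch of the proposed metric: the mean abnormal result rate (MARR)
# for a physician is the fraction of that physician's ordered tests whose
# results are abnormal; the validation compares the abnormal rate against the
# number of tests on the same requisition. Records below are hypothetical.
from collections import defaultdict

# (physician_id, requisition_id, abnormal_flag) for each resulted test
results = [
    ("dr_a", "r1", True), ("dr_a", "r1", False), ("dr_a", "r2", False),
    ("dr_b", "r3", True), ("dr_b", "r3", True), ("dr_b", "r3", False),
    ("dr_b", "r4", False), ("dr_b", "r4", False), ("dr_b", "r4", False),
]

def marr_by_physician(rows):
    """Mean abnormal result rate per ordering physician."""
    totals, abnormals = defaultdict(int), defaultdict(int)
    for physician, _req, abnormal in rows:
        totals[physician] += 1
        abnormals[physician] += int(abnormal)
    return {p: abnormals[p] / totals[p] for p in totals}

def abnormal_rate_by_requisition_size(rows):
    """Abnormal rate grouped by the number of tests on the requisition."""
    by_req = defaultdict(list)
    for _phys, req, abnormal in rows:
        by_req[req].append(abnormal)
    grouped = defaultdict(lambda: [0, 0])      # size -> [abnormal, total]
    for flags in by_req.values():
        grouped[len(flags)][0] += sum(flags)
        grouped[len(flags)][1] += len(flags)
    return {size: a / n for size, (a, n) in grouped.items()}

print("MARR per physician:", marr_by_physician(results))
print("Abnormal rate vs tests per requisition:", abnormal_rate_by_requisition_size(results))
```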