WorldWideScience

Sample records for ratio test methodology

  1. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  2. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  3. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor

  4. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8-page document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Biometrics is now used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is currently no specific methodology for testing this influence...

  5. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
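
    The abstract does not reproduce the statistic itself; below is a minimal sketch of one way a Laplace-based likelihood ratio test for equal scale across groups could look, assuming residuals centred at zero and the Laplace MLE of scale (the mean absolute deviation). It illustrates the idea, not necessarily the authors' exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    def laplace_lr_heteroscedasticity(groups):
        """LR test for equal Laplace scale across groups.

        groups: list of 1-D arrays of (zero-centred) residuals.
        Returns the LR statistic and an asymptotic chi-square p-value.
        """
        groups = [np.asarray(g, dtype=float) for g in groups]
        n_j = np.array([len(g) for g in groups])
        # Laplace MLE of the scale parameter is the mean absolute deviation
        b_j = np.array([np.mean(np.abs(g)) for g in groups])   # per-group scales
        b_0 = np.mean(np.abs(np.concatenate(groups)))           # pooled scale
        # 2 * (unrestricted log-likelihood - restricted log-likelihood)
        lr = 2.0 * (n_j.sum() * np.log(b_0) - np.sum(n_j * np.log(b_j)))
        p_value = stats.chi2.sf(lr, df=len(groups) - 1)
        return lr, p_value

    # Example: two groups of residuals with different spread
    rng = np.random.default_rng(0)
    print(laplace_lr_heteroscedasticity([rng.laplace(0, 1, 200), rng.laplace(0, 2, 200)]))
    ```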

  6. Dairy cattle sustainability using the emergy methodology: Environmental loading ratio

    Directory of Open Access Journals (Sweden)

    Edmar Eduardo Bassan Mendes

    2012-12-01

    Full Text Available The dairy cattle activity in São Paulo State has been depressed in recent years, evidenced by a reduction of 35.47% in the dairy herd between 1996 and 2008 (LUPA) and of 29.73% in milk production between the IBGE censuses of 1995 and 2006. The activity remains in the Agricultural Production Units (UPA) that have adopted more intensive systems of milk production, using animals of high genetic potential, management-intensive rotational grazing or agricultural inputs, with the objective of profit maximization. In the face of environmental pressures, the problem is to know the degree of sustainability of milk production. The objective of this work was to analyze the production of milk from a farm in the municipality of Guzolândia, São Paulo State, during the period 2005/2011, using the emergy methodology to assess the sustainability of the system, calculated by the Environmental Loading Ratio (ELR). The UPA Alto da Araúna is dedicated to dairy cattle, adopting the semi-intensive type B system of milk production; it produces on average 650 liters of milk per day with 45 lactating cows, using 30 ha of pasture with supplemental feed and silage. It has sandy soil, classified as a red-yellow latosol, ortho phase, with gently rolling slopes. The UPA is administered with a business structure, aiming at profit maximization and minimization of environmental impacts, seeking to keep the activity economically viable while preserving the environment. Currently, administrative decisions have the support of operational controls that collect and record the information necessary to generate animal and agricultural indexes that evaluate the performance of the UPA, in addition to managerial accounting records that generate cash flow information used to evaluate the economic efficiency of the UPA. The Environmental Loading Ratio, ELR = (N + F)/R, is obtained as the ratio of non-renewable natural resources (N) plus economic resources (F) to total renewable emergy (R). It is an indicator of the
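
    A minimal numerical sketch of the ELR defined above; the emergy flows are purely illustrative and are not the farm's reported values.

    ```python
    def environmental_loading_ratio(n_nonrenewable, f_economic, r_renewable):
        """ELR = (N + F) / R, as defined in the abstract above."""
        return (n_nonrenewable + f_economic) / r_renewable

    # Hypothetical emergy flows in seJ/yr (illustrative numbers only)
    print(environmental_loading_ratio(n_nonrenewable=2.0e15,
                                      f_economic=5.0e15,
                                      r_renewable=3.0e15))
    # -> ~2.33, i.e. non-renewable and purchased inputs are about 2.3x the renewable ones
    ```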

  7. Aerospace Payloads Leak Test Methodology

    Science.gov (United States)

    Lvovsky, Oleg; Grayson, Cynthia M.

    2010-01-01

    Pressurized and sealed aerospace payloads can leak on orbit. When dealing with toxic or hazardous materials, requirements for fluid and gas leakage rates have to be properly established and, most importantly, reliably verified using the best Nondestructive Test (NDT) method available. Such verification can be implemented through the application of various leak test methods, which are the subject of this paper, with the purpose of showing what approach to payload leakage rate requirement verification is taken by the National Aeronautics and Space Administration (NASA). The scope of this paper is mostly a detailed description of the 14 recommended leak test methods.

  8. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can contrast the different missingness mechanisms with each other, e.g. by likelihood ratio tests. The finite sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missing mechanisms.

  9. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes a methodology for developing new test methods and for forming solutions that support their development. The methodology is based on individual elements of the systems and process approaches, which contribute to an effective research strategy for the object under test, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and their mutual influence, which makes it possible to solve the assigned tasks and achieve the goal. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method underlying the decision-forming model is considered. The methodology provides for recording a model of a new test method as a finite set of objects that represent the characteristics significant for the test method. A causal relationship is then established between the objects, and the values of the fitness indicators, the observability of the method, and the metrological tolerance for each indicator are set. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.

  10. Testing Methodology in the Student Learning Process

    Science.gov (United States)

    Gorbunova, Tatiana N.

    2017-01-01

    The subject of the research is building methodologies to evaluate student knowledge by testing. The author points to the importance of feedback about the mastering level in the learning process. Testing is considered as a tool. The object of the study is to create test system models for defence practice problems. Special attention is paid…

  11. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  12. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
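
    The simplified statistic itself is not reproduced in this excerpt; for reference, here is a small sketch of the traditional likelihood ratio (G-squared) statistic that the paper starts from.

    ```python
    import numpy as np
    from scipy import stats

    def g_squared(observed, expected):
        """Traditional LR statistic: G^2 = 2 * sum(O * ln(O / E))."""
        observed = np.asarray(observed, dtype=float)
        expected = np.asarray(expected, dtype=float)
        mask = observed > 0                       # 0 * ln(0) is taken as 0
        return 2.0 * np.sum(observed[mask] * np.log(observed[mask] / expected[mask]))

    obs = [18, 22, 30, 30]
    exp = [25, 25, 25, 25]                        # equal-probability null hypothesis
    g2 = g_squared(obs, exp)
    print(g2, stats.chi2.sf(g2, df=len(obs) - 1))
    ```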

  13. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.

  14. Methodology of diagnostic tests in hepatology

    DEFF Research Database (Denmark)

    Christensen, Erik

    2009-01-01

    The performance of diagnostic tests can be assessed by a number of methods. These include sensitivity, specificity, positive and negative predictive values, likelihood ratios and receiver operating characteristic (ROC) curves. This paper describes the methods and explains which information they provide. Sensitivity and specificity provide measures of the diagnostic accuracy of a test in diagnosing the condition. The positive and negative predictive values estimate the probability of the condition from the test outcome and the condition's prevalence. The likelihood ratios bring together ... and plotting sensitivity as a function of 1-specificity. The ROC curve can be used to define optimal cut-off values for a test, to assess the diagnostic accuracy of the test, and to compare the usefulness of different tests in the same patients. Under certain conditions it may be possible to utilize a test...
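
    As a rough illustration of the quantities listed in this abstract, the sketch below computes sensitivity, specificity, predictive values and likelihood ratios from a hypothetical 2x2 table; the counts are invented for illustration only.

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, predictive values and likelihood ratios
        from a 2x2 table of test results versus true disease status."""
        sens = tp / (tp + fn)              # P(test positive | disease present)
        spec = tn / (tn + fp)              # P(test negative | disease absent)
        ppv = tp / (tp + fp)               # depends on prevalence in the sample
        npv = tn / (tn + fn)
        lr_pos = sens / (1 - spec)         # positive likelihood ratio
        lr_neg = (1 - sens) / spec         # negative likelihood ratio
        return dict(sensitivity=sens, specificity=spec, PPV=ppv, NPV=npv,
                    LR_plus=lr_pos, LR_minus=lr_neg)

    # Hypothetical counts for one cut-off of a diagnostic marker
    print(diagnostic_metrics(tp=90, fp=20, fn=10, tn=180))
    ```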

  15. Methodology for testing metal detectors using variables test data

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, D.D.; Murray, D.W.

    1993-08-01

    By extracting and analyzing measurement (variables) data from portal metal detectors whenever possible, instead of the more typical "alarm"/"no-alarm" (attributes or binomial) data, we can be more informed about metal detector health with fewer tests. The testing methodology discussed in this report is an alternative to typical binomial testing and in many ways is far superior.

  16. Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development

    Science.gov (United States)

    1986-10-01

    parameter, sample size and fatigue test duration. The required inputs are: (1) the residual strength Weibull shape parameter (ALPR); (2) the fatigue life Weibull shape parameter (ALPL); (3) the sample size (N); and (4) the fatigue test duration (T). (The remainder of this record excerpt is a Fortran input-prompt listing from the report.)

  17. 34 CFR Appendix A to Subpart L of... - Ratio Methodology for Proprietary Institutions

    Science.gov (United States)

    2010-07-01

    Title 34 (Education), Appendix A to Subpart L of Part 668 - Ratio Methodology for Proprietary Institutions. Education Regulations of the Offices of the Department of Education (Continued); Office of Postsecondary Education, Department of Education; Student Assistance General Provisions...

  18. 34 CFR Appendix B to Subpart L of... - Ratio Methodology for Private Non-Profit Institutions

    Science.gov (United States)

    2010-07-01

    Title 34 (Education), Appendix B to Subpart L of Part 668 - Ratio Methodology for Private Non-Profit Institutions. Education Regulations of the Offices of the Department of Education (Continued); Office of Postsecondary Education, Department of Education; Student Assistance General Provisions...

  19. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Full Text Available Current methodologies for information systems penetration testing focus mainly on high-level and technical descriptions of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. This often results in a situation where the tests are badly planned and managed and the vulnerabilities found are remediated unsystematically. The goal of this article is to present a new methodology called PETA, which is focused mainly on the management of penetration tests. Development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management represented by the COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  20. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  1. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    Science.gov (United States)

    2016-05-01

    ARL-TN-0756, May 2016, US Army Research Laboratory. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation, by Clayton M Weiss (Oak Ridge Institute for Science and Education (ORISE), Belcamp, MD) and Parimal J Patel (Weapons and Materials Research Directorate, ARL). Approved for public release; distribution is unlimited.

  2. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since an RPS failure can cause fatal damage in accident cases, the system is classified as safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program, and the second is a level-of-testing step composed of unit testing, integration testing and system testing. At each testing level we test the embedded software with selected test cases after a test item identification step. Using this testing methodology, the embedded software can be tested effectively while reducing cost and time

  3. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.

  4. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  5. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  6. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment have been implemented on a SUN workstation.

  7. Mobile Usability Testing in Healthcare: Methodological Approaches.

    Science.gov (United States)

    Borycki, Elizabeth M; Monkman, Helen; Griffith, Janessa; Kushniruk, Andre W

    2015-01-01

    The use of mobile devices and healthcare applications is increasing exponentially worldwide. This has led to the need for the healthcare industry to develop a better understanding of the impact of the usability of mobile software and hardware upon consumer and health professional adoption and use of these technologies. There are many methodological approaches that can be employed in conducting usability evaluation of mobile technologies. More obtrusive approaches to collecting study data may change study participant behaviour, leading to study results that are less consistent with how the technologies will be used in the real world. Alternatively, less obtrusive methods used in evaluating the usability of mobile software and hardware in in-situ and laboratory settings can lead to less detailed information being collected about how an individual interacts with both the software and hardware. In this paper we review and discuss several innovative mobile usability evaluation methods on a continuum from least to most obtrusive and their effects on the quality of the usability data collected. The strengths and limitations of the methods are also discussed.

  8. Comparison of heat-testing methodology.

    Science.gov (United States)

    Bierma, Mark M; McClanahan, Scott; Baisden, Michael K; Bowles, Walter R

    2012-08-01

    Patients with irreversible pulpitis occasionally present with a chief complaint of sensitivity to heat. To appropriately diagnose the offending tooth, a variety of techniques have been developed to reproduce this chief complaint. Such techniques cause temperature increases that are potentially damaging to the pulp. Newer electronic instruments control the temperature of a heat-testing tip that is placed directly against a tooth. The aim of this study was to determine which method produced the most consistent and safe temperature increase within the pulp. This consistency facilitates the clinician's ability to differentiate between a normal pulp and irreversible pulpitis. Four operators applied the following methods to each of 4 extracted maxillary premolars (for a total of 16 trials per method): heated gutta-percha, heated ball burnisher, hot water, and a System B unit or Elements unit with a heat-testing tip. Each test was performed for 60 seconds, and the temperatures were recorded via a thermocouple in the pulp chamber. Analysis of the data was performed by using the intraclass correlation coefficient. The least consistent warming was found with hot water. The heat-testing tip also demonstrated greater consistency between operators compared with the other methods. Hot water and the heated ball burnisher caused temperature increases high enough to damage pulp tissue. The Elements unit with a heat-testing tip provides the most consistent warming of the dental pulp. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  9. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
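
    For reference, a generic sketch of Wald's sequential probability ratio test, to which the abstract says the compound hypothesis test reduces. The textbook Wald thresholds below stand in for the paper's mission-specific false-alarm and missed-detection tolerances; this is not the paper's exact formulation.

    ```python
    import math

    def wald_sprt(llr_stream, alpha=0.01, beta=0.01):
        """Wald's sequential probability ratio test (generic sketch).

        llr_stream: iterable of per-observation log-likelihood ratios
                    log[ p(x | H1) / p(x | H0) ].
        alpha, beta: target false-alarm and missed-detection probabilities.
        Returns 'accept H1', 'accept H0', or 'continue' if data run out.
        """
        upper = math.log((1.0 - beta) / alpha)   # crossing this decides for H1
        lower = math.log(beta / (1.0 - alpha))   # crossing this decides for H0
        total = 0.0
        for llr in llr_stream:
            total += llr
            if total >= upper:
                return "accept H1"
            if total <= lower:
                return "accept H0"
        return "continue"
    ```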

  10. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family, the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast, there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n^-3...

  11. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  12. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. integrating these individual decisions. The procedure is recommended for use when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to achieve his final conclusion with a high confidence level. (Author).

  13. Equipment qualification testing methodology research at Sandia Laboratories

    International Nuclear Information System (INIS)

    Jeppesen, D.

    1983-01-01

    The Equipment Qualification Research Testing (EQRT) program is an evolutionary outgrowth of the Qualification Testing Evaluation (QTE) program at Sandia. The primary emphasis of the program has been qualification methodology research. The EQRT program offers to the industry a research-oriented perspective on qualification-related component performance, as well as refinements to component testing standards which are based upon actual component testing research

  14. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

    The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.

  15. An Intersection–Union Test for the Sharpe Ratio

    Directory of Open Access Journals (Sweden)

    Gabriel Frahm

    2018-04-01

    Full Text Available An intersection–union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection–union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G–7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance.

  16. Proposed Objective Odor Control Test Methodology for Waste Containment

    Science.gov (United States)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a limit of quantitation of 15 ppb.

  17. Probabilistic fatigue life prediction methodology for notched components based on simple smooth fatigue tests

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z. R.; Li, Z. X. [Dept.of Engineering Mechanics, Jiangsu Key Laboratory of Engineering Mechanics, Southeast University, Nanjing (China); Hu, X. T.; Xin, P. P.; Song, Y. D. [State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing University of Aeronautics and Astronautics, Nanjing (China)

    2017-01-15

    A methodology for probabilistic fatigue life prediction of notched components based on smooth specimens is presented. Weakest-link theory incorporating the Walker strain model has been utilized in this approach. The effects of stress ratio and stress gradient have been considered. The Weibull distribution and the median rank estimator are used to describe fatigue statistics. Fatigue tests under different stress ratios were conducted on smooth and notched specimens of the titanium alloy TC-1-1. The proposed procedures were checked against the test data for TC-1-1 notched specimens. Predictions at a 50% survival rate all fall within a factor-of-two scatter band of the test results.

  18. Improvement in decay ratio calculation in LAPUR5 methodology for BWR instability

    International Nuclear Information System (INIS)

    Li Hsuannien; Yang Tzungshiue; Shih Chunkuan; Wang Jongrong; Lin Haotzu

    2009-01-01

    LAPUR5, based on a frequency domain approach, is a computer code that analyzes core stability and calculates decay ratios (DRs) of boiling water nuclear reactors. In the current methodology, one set of parameters (three friction multipliers and one density reactivity coefficient multiplier) is chosen for the LAPUR5 input files, LAPURX and LAPURW. The calculation stops, and the DR for this particular set of parameters is obtained, when the convergence criteria (pressure, mass flow rate) are first met. However, there are other sets of parameters which could also meet the same convergence criteria without being identified. In order to cover these ranges of parameters, we developed an improved procedure for calculating the DR in LAPUR5. First, we define the ranges and increments of the dominant input parameters in the input files for the DR loop search. After LAPUR5 program execution, we can obtain the DRs for every set of parameters that satisfies the convergence criteria in one single operation. The loop search procedure covers the steps of preparing the LAPURX and LAPURW input files. As a demonstration, we looked into the reload design of Kuosheng Unit 2 Cycle 22. We found that the global DR has a maximum at an exposure of 9070 MWd/t and the regional DR has a maximum at an exposure of 5770 MWd/t. It should be noted that the regional DR turns out to be larger than the global one for exposures less than 5770 MWd/t. Furthermore, either the global or the regional DR obtained by the loop search method is greater than the corresponding value from our previous approach. It is concluded that the loop search method can reduce human error and save labor compared with the previous version of the LAPUR5 methodology. The maximum DR can now be effectively obtained for given plant operating conditions, and a more precise stability boundary, with less uncertainty, can be plotted on the plant power/flow map. (author)

  19. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  20. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose X_1, X_2, ..., X_n is a random sample from an N_p(mu, V) distribution. Consider H_0: mu_1 = mu_2 = ... = mu_p = 0 and H_1: mu_i >= 0 for i = 1, 2, ..., p; let H_1 - H_0 denote the hypothesis that H_1 holds but H_0 does not, and let ~H_0 denote the hypothesis that H_0 does not hold. Because the likelihood ratio test (LRT) of H_0 versus H_1 - H_0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H_0 if the usual test of H_0 versus ~H_0 rejects H_0 with significance level 2*alpha and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. Given the LRT's desired scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests recommended in Chongcharoen and Wright (2007, 2006).
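
    As a rough illustration of the Follmann-type rule quoted above, the sketch below rejects H_0 when a Hotelling T^2 test of H_0 versus ~H_0 rejects at level 2*alpha and the (unweighted) sum of sample means is positive; the covariance is estimated from the sample. The published tests differ in how the sum is weighted and in other details, so treat this only as an illustrative sketch.

    ```python
    import numpy as np
    from scipy import stats

    def follmann_type_test(X, alpha=0.05):
        """Follmann-type one-sided test sketch for H0: mu = 0.

        X is an (n, p) data matrix.  Reject H0 when the Hotelling T^2 test
        rejects at level 2*alpha AND the sum of the sample means is positive.
        """
        n, p = X.shape
        xbar = X.mean(axis=0)
        S = np.cov(X, rowvar=False)
        t2 = n * xbar @ np.linalg.solve(S, xbar)        # Hotelling T^2 statistic
        f_stat = (n - p) / (p * (n - 1)) * t2           # F(p, n-p) transformation
        p_two_sided = stats.f.sf(f_stat, p, n - p)
        return (p_two_sided < 2 * alpha) and (xbar.sum() > 0)

    rng = np.random.default_rng(1)
    print(follmann_type_test(rng.multivariate_normal([0.3, 0.2], np.eye(2), size=50)))
    ```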

  1. HIV / AIDS prevalence testing - merits, methodology and outcomes ...

    African Journals Online (AJOL)

    HIV / AIDS prevalence testing - merits, methodology and outcomes of a survey conducted at a large mining organisation in South Africa. ... These baseline prevalence data also provide an opportunity for monitoring of proposed interventions using cross-sectional surveys at designated intervals in the future. South African ...

  2. Evaluation and testing methodology for evolving entertainment systems

    NARCIS (Netherlands)

    Jurgelionis, A.; Bellotti, F.; IJsselsteijn, W.A.; Kort, de Y.A.W.; Bernhaupt, R.; Tscheligi, M.

    2007-01-01

    This paper presents a testing and evaluation methodology for evolving pervasive gaming and multimedia systems. We introduce the Games@Large system, a complex gaming and multimedia architecture comprised of a multitude of elements: heterogeneous end user devices, wireless and wired network

  3. Performance Testing Methodology for Safety-Critical Programmable Logic Controller

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Ji Hyeon; Kim, Sung Ho; Sohn, Se Do

    2009-01-01

    The Programmable Logic Controller (PLC) for use in Nuclear Power Plant safety-related applications is being developed and tested for the first time in Korea. This safety-related PLC is being developed to the requirements of regulatory guidelines and industry standards for safety systems. To confirm that the quality of the developed PLC is sufficient for use in safety-critical systems, document reviews and various product tests were performed on the development documents for S/W, H/W, and V/V. This paper presents the performance testing methodology and its effectiveness for the PLC platform, as conducted by KOPEC

  4. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  5. Evaluation methodologies for security testing biometric systems beyond technological evaluation

    OpenAIRE

    Fernández Saavedra, María Belén

    2013-01-01

    The main objective of this PhD Thesis is the specification of formal evaluation methodologies for testing the security level achieved by biometric systems when these are working under specific boundary conditions. This analysis is conducted through the calculation of the basic technical biometric system performance and its possible variations. To that end, the following two relevant contributions have been developed. The first contribution is the definition of two independent biometric performance ...

  6. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  7. Progress on qualification testing methodology study of electric cables

    International Nuclear Information System (INIS)

    Yoshida, K.; Seguchi, T.; Okada, S.; Ito, M.; Kusama, Y.; Yagi, T.; Yoshikawa, M.

    1983-01-01

    Many instrumentation, control and power cables are installed in nuclear power plants, and these cables contain a large amount of organic polymers as insulating and jacketing materials. They are exposed simultaneously to radiation at high dose rate, steam at high temperature, and chemical (or water) spray when a LOCA occurs. Under such conditions, the polymers tend to lose their original properties. For reactor safety, the cables should remain functional even if they are subjected to a loss-of-coolant accident (LOCA) at the end of their intended service life. In Japan, cable manufacturers qualify their cables according to the proposed test standard issued by IEEJ in 1982, but the standard still has many unsolved problems or uncertainties, which have been dealt with tentatively through manufacturer-user agreement. The objectives of this research are to study methodologies for qualification testing of electric wires and cables, and to provide improved technical bases for modification of the standard. Research activities are divided into the Accident (LOCA) Testing Methodology and the Accelerated Aging Methodology

  8. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    Science.gov (United States)

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research of this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress strain curves of the perpendicular loading axes. Results for this methodology show images of a tissue specimen loaded and a finite verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.

  9. Testing Strategies and Methodologies for the Max Launch Abort System

    Science.gov (United States)

    Schaible, Dawn M.; Yuchnovicz, Daniel E.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) was tasked to develop an alternate, tower-less launch abort system (LAS) as risk mitigation for the Orion Project. The successful pad abort flight demonstration test in July 2009 of the "Max" launch abort system (MLAS) provided data critical to the design of future LASs, while demonstrating the Agency's ability to rapidly design, build and fly full-scale hardware at minimal cost in a "virtual" work environment. Limited funding and an aggressive schedule presented a challenge for testing of the complex MLAS system. The successful pad abort flight demonstration test was attributed to the project's systems engineering and integration process, which included: a concise definition of, and adherence to, flight test objectives; a solid operational concept; well-defined performance requirements; and a test program tailored to reducing the highest flight test risks. The testing ranged from wind tunnel validation of computational fluid dynamic simulations to component ground tests of the highest-risk subsystems. This paper provides an overview of the testing/risk management approach and methodologies used to understand and reduce the areas of highest risk, resulting in a successful flight demonstration test.

  10. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    Groenendijk, Patrick A.; Lucas, André; Vries, de Casper G.

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons simultaneously, and the
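
    A rough sketch of the two kinds of ratio statistics the abstract advocates, evaluated at a single horizon q. The sqrt(q) normalisation of the absolute moment ratio is an assumption chosen so that the statistic is near one for i.i.d. Gaussian returns; the paper's exact standardisation and joint multi-horizon treatment are not reproduced here.

    ```python
    import numpy as np

    def variance_ratio(returns, q):
        """Classical variance ratio: Var of q-period returns over q * Var of
        1-period returns; close to 1 for i.i.d. increments."""
        r = np.asarray(returns, dtype=float)
        rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period returns
        return rq.var(ddof=1) / (q * r.var(ddof=1))

    def absolute_moment_ratio(returns, q):
        """Analogous ratio built from first absolute moments; the sqrt(q)
        normalisation makes it roughly 1 for i.i.d. Gaussian returns."""
        r = np.asarray(returns, dtype=float)
        rq = np.convolve(r, np.ones(q), mode="valid")
        return (np.mean(np.abs(rq - rq.mean()))
                / (np.sqrt(q) * np.mean(np.abs(r - r.mean()))))

    rng = np.random.default_rng(2)
    r = rng.standard_t(df=3, size=5000) * 0.01          # fat-tailed "returns"
    print([round(variance_ratio(r, q), 3) for q in (2, 5, 10)])
    print([round(absolute_moment_ratio(r, q), 3) for q in (2, 5, 10)])
    ```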

  11. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.

  12. A micro focus with macro impact: Exploration of initial abstraction coefficient ratio (λ) in Soil Conservation Curve Number (CN) methodology

    International Nuclear Information System (INIS)

    Ling, L; Yusop, Z

    2014-01-01

    Researchers started to cross-examine the United States Department of Agriculture (USDA) Soil Conservation Service (SCS) Curve Number (CN) methodology after the technique produced inconsistent results throughout the world. Field data from recent decades increasingly contradict the initial abstraction coefficient ratio value proposed by SCS in 1954. Physiographic conditions were identified as vital influencing factors to be considered under this methodology, and practitioners of this method are encouraged to validate and derive region-specific relationships and to employ the method with caution
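
    For context, the sketch below shows the standard SCS-CN runoff relation in which the initial abstraction coefficient ratio appears (Ia = lambda * S); the rainfall depth and curve number used in the example are illustrative, not taken from the paper.

    ```python
    def scs_runoff_depth(P, CN, lam=0.2):
        """SCS-CN direct runoff depth (mm) for a rainfall depth P (mm).

        S  : potential maximum retention, S = 25400/CN - 254  (mm)
        Ia : initial abstraction, Ia = lam * S; the classical SCS value is
             lam = 0.2, the ratio questioned in the abstract above.
        Q  : runoff, Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0.
        """
        S = 25400.0 / CN - 254.0
        Ia = lam * S
        if P <= Ia:
            return 0.0
        return (P - Ia) ** 2 / (P - Ia + S)

    # Same storm and curve number, two initial abstraction ratios
    print(scs_runoff_depth(P=80.0, CN=75, lam=0.2),
          scs_runoff_depth(P=80.0, CN=75, lam=0.05))
    ```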

  13. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Hendriks, W H

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  14. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F.J.W.C.; Doorn, van D.A.; Schonewille, J.T.; Riet, van M.M.J.; Visser, P.; Blok, M.C.; Hendriks, W.H.

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  15. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  16. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.

  17. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    Science.gov (United States)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Collection of data occurred during one week at two dairy farms in central California (June 2016). Each farm varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site with a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.

  18. Efficient testing methodologies for microcameras in a gigapixel imaging system

    Science.gov (United States)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging, based on a monocentric optical design, promises revolutionary advances in diverse imaging applications by enabling high-resolution, real-time image capture over a wide field-of-view (FOV), including sport broadcast, wide-field microscopy, astronomy, and security surveillance. The recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over a FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is key to the successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate the imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference: Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386-389 (2012).

  19. Testing methodologies and systems for semiconductor optical amplifiers

    Science.gov (United States)

    Wieckowski, Michael

    Semiconductor optical amplifiers (SOAs) are gaining increased prominence in both optical communication systems and high-speed optical processing systems, due primarily to their unique nonlinear characteristics. This, in turn, has raised questions regarding their lifetime performance reliability and has generated a demand for effective testing techniques. This is especially critical for industries utilizing SOAs as components for system-in-package products. It is important to note that very little research to date has been conducted in this area, even though production volume and market demand have continued to increase. In this thesis, the reliability of dilute-mode InP semiconductor optical amplifiers is studied experimentally and theoretically. The aging characteristics of the production-level devices are demonstrated and the techniques necessary to accurately characterize them are presented. In addition, this work proposes a new methodology for characterizing the optical performance of these devices using measurements in the electrical domain. It is shown that optical performance degradation, specifically with respect to gain, can be directly quantified through measurements of electrical subthreshold differential resistance. This metric exhibits a linear proportionality to the defect concentration in the active region, and as such, can be used for prescreening devices before employing traditional optical testing methods. A complete theoretical analysis is developed in this work to explain this relationship based upon the device's current-voltage curve and its associated leakage and recombination currents. These results are then extended to realize new techniques for testing semiconductor optical amplifiers and other similarly structured devices. These techniques can be employed after fabrication and during packaged operation through the use of a proposed stand-alone testing system, or using a proposed integrated CMOS self-testing circuit. Both methods are capable

  20. Total Protein and Albumin/Globulin Ratio Test

    Science.gov (United States)

    ... of the various types of proteins in the liquid (serum or plasma) portion of the blood. Two ...

  1. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    Full Text Available The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today’s era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
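
    For orientation, the familiar pinhole (rectilinear) rule behind survey planning is GSD = pixel pitch × object distance / focal length; the article's contribution is a generalization of this kind of rule to fisheye and other projections, which is not reproduced here. A minimal sketch of the rectilinear baseline with illustrative numbers follows.

```python
# Baseline GSD estimate for a rectilinear (pinhole) camera; illustrative only.
# The article generalizes this kind of rule to other optical projections
# (fisheye lenses), which is not reproduced here.

def gsd_rectilinear(pixel_pitch_mm, focal_length_mm, object_distance_m):
    """Object-space sampling distance in millimetres per pixel."""
    return pixel_pitch_mm * (object_distance_m * 1000.0) / focal_length_mm

# Example: 4.4 µm pixels, 15 mm lens, wall at 0.7 m (a narrow staircase)
print(round(gsd_rectilinear(0.0044, 15.0, 0.7), 3), "mm/pixel")
```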

  2. Jet-Surface Interaction - High Aspect Ratio Nozzle Test: Test Summary

    Science.gov (United States)

    Brown, Clifford A.

    2016-01-01

    The Jet-Surface Interaction High Aspect Ratio Nozzle Test was conducted in the Aero-Acoustic Propulsion Laboratory at the NASA Glenn Research Center in the fall of 2015. There were four primary goals specified for this test: (1) extend the current noise database for rectangular nozzles to higher aspect ratios, (2) verify data previously acquired at small scale with data from a larger model, (3) acquire jet-surface interaction noise data suitable for creating and verifying empirical noise models, and (4) investigate the effect of nozzle septa on the jet-mixing and jet-surface interaction noise. These slides give a summary of the test with representative results for each goal.

  3. Hydrologic testing methodology and results from deep basalt boreholes

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A.; Jackson, R.L.; Pidcoe, W.W.

    1982-05-01

    The objective of the hydrologic field-testing program is to provide data for characterization of the groundwater systems within the Pasco Basin that are significant to understanding waste isolation. The effort is directed toward characterizing the areal and vertical distributions of hydraulic head, hydraulic properties, and hydrochemistry. Data obtained from these studies provide input for numerical modeling of groundwater flow and solute transport. These models are then used for evaluating potential waste migration as a function of space and time. The groundwater system beneath the Hanford Site and surrounding area consists of a thick, accordantly layered sequence of basalt flows and associated sedimentary interbeds that primarily occur in the upper part of the Columbia River basalt. Permeable horizons of the sequence are associated with the interbeds and the interflow zones within the basalt. The columnar interiors of a flow act as low-permeability aquitards, separating the more-permeable interflows or interbeds. This paper discusses the hydrologic field data-gathering activities, specifically, field-testing methodology and test results from deep basalt boreholes

  4. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure and degradation, and the timely correction of deterioration. Because of the large number of such activities, maintaining an emphasis on plant safety and allocating resources become difficult. The probabilistic model and methodology can be effectively used to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) has been performed and is now being revised to take into account the as-built information, it is recommended to implement into the model the features necessary to support further PSA applications, especially those related to Test and Maintenance optimization. Methods need to be developed in order to apply the PSA model, including risk information together with other needed information, for Test and Maintenance optimization. Also, in parallel with the CPSE study updating, the software interface for the PSA model is under development (Risk Monitor Software class); methods and models need to be developed so that it can be used for qualified monitoring of the efficiency of the Test and Maintenance Strategy. Similarly, the Data Collection System needs to be appropriate for the ongoing implementation of a risk-based Test and Maintenance Strategy. (author). 4 refs, 1 fig

  5. Non-animal methodologies within biomedical research and toxicity testing.

    Science.gov (United States)

    Knight, Andrew

    2008-01-01

    Laboratory animal models are limited by scientific constraints on human applicability, and increasing regulatory restrictions, driven by social concerns. Reliance on laboratory animals also incurs marked - and in some cases, prohibitive - logistical challenges, within high-throughput chemical testing programmes, such as those currently underway within Europe and the US. However, a range of non-animal methodologies is available within biomedical research and toxicity testing. These include: mechanisms to enhance the sharing and assessment of existing data prior to conducting further studies, and physicochemical evaluation and computerised modelling, including the use of structure-activity relationships and expert systems. Minimally-sentient animals from lower phylogenetic orders or early developmental vertebrate stages may be used, as well as microorganisms and higher plants. A variety of tissue cultures, including immortalised cell lines, embryonic and adult stem cells, and organotypic cultures, are also available. In vitro assays utilising bacterial, yeast, protozoal, mammalian or human cell cultures exist for a wide range of toxic and other endpoints. These may be static or perfused, and may be used individually, or combined within test batteries. Human hepatocyte cultures and metabolic activation systems offer potential assessment of metabolite activity and organ-organ interaction. Microarray technology may allow genetic expression profiling, increasing the speed of toxin detection, well prior to more invasive endpoints. Enhanced human clinical trials utilising microdosing, staggered dosing, and more representative study populations and durations, as well as surrogate human tissues, advanced imaging modalities and human epidemiological, sociological and psychological studies, may increase our understanding of illness aetiology and pathogenesis, and facilitate the development of safe and effective pharmacologic interventions. Particularly when human tissues

  6. Pearce element ratios: A paradigm for testing hypotheses

    Science.gov (United States)

    Russell, J. K.; Nicholls, Jim; Stanley, Clifford R.; Pearce, T. H.

    Science moves forward with the development of new ideas that are encapsulated by hypotheses whose aim is to explain the structure of data sets or to expand existing theory. These hypotheses remain conjecture until they have been tested. In fact, Karl Popper advocated that a scientist's job does not finish with the creation of an idea but, rather, begins with the testing of the related hypotheses. In Popper's [1959] advocation it is implicit that there be tools with which we can test our hypotheses. Consequently, the development of rigorous tests for conceptual models plays a major role in maintaining the integrity of scientific endeavor [e.g., Greenwood, 1989].

  7. 21 CFR 862.1455 - Lecithin/sphingomyelin ratio in amniotic fluid test system.

    Science.gov (United States)

    2010-04-01

    Clinical Chemistry Test Systems § 862.1455 Lecithin/sphingomyelin ratio in amniotic fluid test system. (a) Identification. A lecithin/sphingomyelin ratio in amniotic fluid test system is a device intended to measure the ...

  8. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios.

    Science.gov (United States)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Visser, P; Blok, M C; Hendriks, W H

    2017-11-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE) basis) diets (i.e. 22 MJ NE/day) with increasing proportions of a pelleted concentrate (C) in relation to haylage (H). The absolute amounts of diet dry matter fed per day were 4.48 kg of H (100H), 3.36 and 0.73 kg of H and C (75H25C), 2.24 and 1.45 kg of H and C (50H50C) and 1.12 and 2.17 kg of H and C (25H75C). Diets were supplemented with minerals, vitamins and TiO2 (3.7 g Ti/day). Voluntary voided faeces were quantitatively collected daily during 10 consecutive days and analysed for moisture, ash, ADL, acid-insoluble ash (AIA) and Ti. A minimum faeces collection period of 6 consecutive days, along with a 14-day period to adapt the animals to the diets and become accustomed to the collection procedure, is recommended to obtain accurate estimations on dry matter digestibility and organic matter digestibility (OMD) in equids fed haylage-based diets supplemented with concentrate. In addition, the recovery of AIA, ADL and Ti was determined and evaluated. Mean faecal recovery over 10 consecutive days across diets for AIA, ADL and Ti was 124.9% (SEM 2.9), 108.7% (SEM 2.0) and 97.5% (SEM 0.9), respectively. Cumulative faecal recovery of AIA significantly differed between treatments, indicating that AIA is inadequate to estimate the OMD in equines. In addition, evaluation of the CV of mean cumulative faecal recoveries obtained by AIA, ADL and Ti showed greater variations in faecal excretion of AIA (9.1) and ADL (7.4) than Ti (3.7). The accuracy of prediction of OMD was higher with the use of Ti than ADL. The use of Ti is preferred as a marker in digestibility trials in equines fed haylage-based diets supplemented with increasing amounts of pelleted concentrate.
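
    Marker-based digestibility estimates of the kind evaluated here follow standard arithmetic: faecal marker recovery is excreted marker divided by ingested marker, and apparent digestibility is obtained from the feed-to-faeces ratio of marker and nutrient concentrations. The sketch below uses made-up numbers, not the study's data.

```python
# Standard marker-based digestibility arithmetic (illustrative values only;
# not data from the pony study described above).

def faecal_recovery(marker_intake_g_d, marker_excreted_g_d):
    """Faecal marker recovery (%) over the collection period."""
    return 100.0 * marker_excreted_g_d / marker_intake_g_d

def apparent_digestibility(marker_feed, marker_faeces, nutrient_feed, nutrient_faeces):
    """Apparent digestibility (%) of a nutrient from marker and nutrient
    concentrations (same units in feed and faeces, e.g. g/kg dry matter)."""
    return 100.0 * (1.0 - (marker_feed / marker_faeces) * (nutrient_faeces / nutrient_feed))

# Example: 3.7 g Ti/day fed, 3.6 g Ti/day recovered in faeces
print(round(faecal_recovery(3.7, 3.6), 1), "% Ti recovery")
# Example OM digestibility: marker 1.0 vs 3.2 g/kg DM, OM 920 vs 780 g/kg DM
print(round(apparent_digestibility(1.0, 3.2, 920.0, 780.0), 1), "% OMD")
```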

  9. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  10. The diagnostic odds ratio: a single indicator of test performance

    NARCIS (Netherlands)

    Glas, Afina S.; Lijmer, Jeroen G.; Prins, Martin H.; Bonsel, Gouke J.; Bossuyt, Patrick M. M.

    2003-01-01

    Diagnostic testing can be used to discriminate subjects with a target disorder from subjects without it. Several indicators of diagnostic performance have been proposed, such as sensitivity and specificity. Using paired indicators can be a disadvantage in comparing the performance of competing
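
    For context on the record's title, the diagnostic odds ratio collapses the paired indicators into a single number: DOR = (sensitivity/(1 - sensitivity)) / ((1 - specificity)/specificity), which equals LR+/LR-. A small illustrative calculation (not code from the cited paper) follows.

```python
# Diagnostic odds ratio from sensitivity and specificity (illustrative sketch).

def diagnostic_odds_ratio(sensitivity, specificity):
    """DOR = odds of a positive test among the diseased / odds among the non-diseased."""
    lr_pos = sensitivity / (1.0 - specificity)       # positive likelihood ratio
    lr_neg = (1.0 - sensitivity) / specificity       # negative likelihood ratio
    return lr_pos / lr_neg

# Example: a test with 90% sensitivity and 80% specificity
print(round(diagnostic_odds_ratio(0.90, 0.80), 1))   # 36.0
```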

  11. Urine Test: Microalbumin-to-Creatinine Ratio (For Parents)

    Science.gov (United States)

    ... could interfere with test results. Be sure to review all your child's medications with your doctor. The Procedure Your child will be asked to urinate (pee) into a clean sample cup in the doctor's office or at home. Collecting the specimen should only take a few minutes. If your child isn' ...

  12. Methodology for testing subcomponents; background and motivation for subcomponent testing of wind turbine rotor blades

    DEFF Research Database (Denmark)

    Antoniou, Alexandros; Branner, Kim; Lekou, D.J.

    2016-01-01

    This report aims to provide an overview of the design methodology followed by wind turbine blade structural designers, along with the testing procedure on full scale blades which are followed by testing laboratories for blade manufacturers as required by the relevant standards and certification bodies' recommendations for design and manufacturing verification. The objective of the report is not to criticize the design methodology or testing procedure and the standards thereof followed in the wind energy community, but to identify those items offered by state of the art structural design tools... The investigations performed are based on the INNWIND.EU reference 10MW horizontal axis wind turbine [1]. The structural properties and material and layout definition used within IRPWIND are defined in the INNWIND.EU report [2]. The layout of the report includes a review of the structural analysis models used

  13. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as an accelerated testing methodology in constant stress-rate (dynamic fatigue) testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for a glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, a SiC whisker-reinforced composite silicon nitride and a 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism associated with such a considerable strength increase or decrease.

  14. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  15. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    Science.gov (United States)

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ² distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
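
    Comparing empirical null percentiles with a limiting χ² distribution, as in the simulations above, follows a generic recipe: simulate data under the null many times, compute the statistic, and compare empirical quantiles with the χ² quantiles. The sketch below illustrates that recipe with a deliberately simple statistic (a one-sample normal-mean LRT), not the separability RST or LRT of the paper.

```python
# Generic Monte Carlo check of a test statistic's null distribution against its
# limiting chi-square law, illustrated with a simple one-sample normal-mean LRT
# (NOT the separability tests of the cited paper).
import numpy as np
from scipy import stats

def lrt_mean_zero(x):
    """-2 log LR for H0: mean = 0 vs a free mean, normal data with unknown variance."""
    n = len(x)
    s2_h0 = np.mean(x ** 2)              # variance MLE under H0 (mean fixed at 0)
    s2_h1 = np.var(x)                    # variance MLE under H1 (mean free)
    return n * (np.log(s2_h0) - np.log(s2_h1))

rng = np.random.default_rng(1)
n, reps = 10, 20000
stat = np.array([lrt_mean_zero(rng.normal(size=n)) for _ in range(reps)])

# Compare empirical null percentiles with the limiting chi-square(1) quantiles
for q in (0.90, 0.95, 0.99):
    print(q, round(np.quantile(stat, q), 3), "vs chi2(1):", round(stats.chi2.ppf(q, df=1), 3))
```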

  16. Test Methodologies for Hydrogen Sensor Performance Assessment: Chamber vs. Flow Through Test Apparatus: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hartmann, Kevin S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Schmidt, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cebolla, Rafeal O [Joint Research Centre, Petten, the Netherlands; Weidner, Eveline [Joint Research Centre, Petten, the Netherlands; Bonato, Christian [Joint Research Centre, Petten, the Netherlands

    2017-11-06

    Certification of hydrogen sensors to standards often prescribes using large-volume test chambers [1, 2]. However, feedback from stakeholders such as sensor manufacturers and end-users indicate that chamber test methods are often viewed as too slow and expensive for routine assessment. Flow through test methods potentially are an efficient, cost-effective alternative for sensor performance assessment. A large number of sensors can be simultaneously tested, in series or in parallel, with an appropriate flow through test fixture. The recent development of sensors with response times of less than 1s mandates improvements in equipment and methodology to properly capture the performance of this new generation of fast sensors; flow methods are a viable approach for accurate response and recovery time determinations, but there are potential drawbacks. According to ISO 26142 [1], flow through test methods may not properly simulate ambient applications. In chamber test methods, gas transport to the sensor can be dominated by diffusion which is viewed by some users as mimicking deployment in rooms and other confined spaces. Alternatively, in flow through methods, forced flow transports the gas to the sensing element. The advective flow dynamics may induce changes in the sensor behaviour relative to the quasi-quiescent condition that may prevail in chamber test methods. One goal of the current activity in the JRC and NREL sensor laboratories [3, 4] is to develop a validated flow through apparatus and methods for hydrogen sensor performance testing. In addition to minimizing the impact on sensor behaviour induced by differences in flow dynamics, challenges associated with flow through methods include the ability to control environmental parameters (humidity, pressure and temperature) during the test and changes in the test gas composition induced by chemical reactions with upstream sensors. Guidelines on flow through test apparatus design and protocols for the evaluation of
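
    In the simplest reading, response and recovery time determination reduces to locating when the sensor signal crosses fixed fractions (for example 10% and 90%) of its step change after the test gas is applied or removed. The sketch below shows that bookkeeping for a t90 response time; it is an illustration, not the JRC/NREL protocol.

```python
# Hedged sketch: t90 response time from a sampled sensor signal after a step
# change in test-gas concentration (illustrative, not the JRC/NREL protocol).
import numpy as np

def t90(time_s, signal, baseline, final_value):
    """Time (from the start of the record) at which the signal first reaches
    90% of the step from baseline to final_value."""
    threshold = baseline + 0.9 * (final_value - baseline)
    idx = np.argmax(np.asarray(signal) >= threshold)   # first sample at/above threshold
    return float(np.asarray(time_s)[idx])

# Example: first-order sensor with a 2 s time constant exposed at t = 0
t = np.linspace(0, 20, 2001)
y = 1.0 - np.exp(-t / 2.0)
print(round(t90(t, y, baseline=0.0, final_value=1.0), 2), "s")   # ≈ 2 * ln(10) ≈ 4.61 s
```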

  17. Two methodologies for physical penetration testing using social engineering

    NARCIS (Netherlands)

    Dimkov, T.; van Cleeff, A.; Pieters, Wolter; Hartel, Pieter H.

    2010-01-01

    Penetration tests on IT systems are sometimes coupled with physical penetration tests and social engineering. In physical penetration tests where social engineering is allowed, the penetration tester directly interacts with the employees. These interactions are usually based on deception and if not

  18. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system confirms the feasibility of this method

  19. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objectives of this study are the development of a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system confirms the feasibility of this method.

  20. Strategic alternatives ranking methodology: Multiple RCRA incinerator evaluation test case

    International Nuclear Information System (INIS)

    Baker, G.; Thomson, R.D.; Reece, J.; Springer, L.; Main, D.

    1988-01-01

    This paper presents an important process approach that permits the quantification and ranking of multiple alternatives being considered in remedial actions or hazardous waste strategies. This process is a methodology for evaluating programmatic options in support of site selection or environmental analyses. Political or other less tangible motivations for alternatives may be quantified by establishing the range of significant variables, weighting their importance, and establishing specific criteria for scoring individual alternatives. An application of the process to a recent AFLC program permitted ranking incineration alternatives from a list of over 130 options. The process forced participation by the organizations to be affected, allowed a consensus of opinion to be achieved, allowed complete flexibility to evaluate factor sensitivity, and resulted in strong, quantifiable support for any subsequent site-selection action in NEPA documents

  1. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  2. Improvement of test methodology for evaluating diesel fuel stability

    Energy Technology Data Exchange (ETDEWEB)

    Gutman, M.; Tartakovsky, L.; Kirzhner, Y.; Zvirin, Y. [Internal Combustion Engines Lab., Haifa (Israel); Luria, D. [Fuel Authority, Tel Aviv (Israel); Weiss, A.; Shuftan, M. [Israel Defence Forces, Tel Aviv (Israel)

    1995-05-01

    The storage stability of diesel fuel has been extensively investigated for many years under laboratory conditions. Although continuous efforts have been made to improve testing techniques, there does not yet exist a generally accepted correlation between laboratory methods (such as chemical analysis of the fuel) and actual diesel engine tests. A testing method was developed by the Technion Internal Combustion Engines Laboratory (TICEL) in order to address this problem. The test procedure was designed to simulate diesel engine operation under field conditions. It is based on running a laboratory-modified single cylinder diesel engine for 50 h under cycling operating conditions. The overall rating of each test is based on individual evaluation of the deposits and residue formation in the fuel filter, nozzle body and needle, piston head, piston rings, exhaust valve, and combustion chamber (six parameters). Two methods for analyzing the test results were used: objective, based on measured data, and subjective, based on visual evaluation of these deposits by a group of experts. Only the residue level in the fuel filter was evaluated quantitatively by measured results. In order to achieve higher accuracy of the method, the test procedure was improved by introducing the measured results of nozzle fouling as an additional, objective (seventh) evaluation parameter. This factor is evaluated on the basis of the change in the air flow rate through the nozzle before and after the complete engine test. Other improvements in the method include the use of a nozzle assembly photograph in the test evaluation, and representation of all seven parameters on a continuous scale instead of the discrete scale used previously, in order to achieve higher accuracy. This paper also contains the results obtained by application of this improved fuel stability test to a diesel fuel stored for a five-year period.

  3. Methodology for Life Testing of Refractory Metal / Sodium Heat Pipes

    International Nuclear Information System (INIS)

    Martin, James J.; Reid, Robert S.

    2006-01-01

    This work establishes an approach to generate carefully controlled data to find heat pipe operating life with material-fluid combinations capable of extended operation. To accomplish this goal, acceleration is required to compress 10 years of operational life into 3 years of laboratory testing through a combination of increased temperature and mass fluence. Specific test series have been identified, based on American Society for Testing and Materials (ASTM) specifications, to investigate long-term corrosion rates. The refractory metal selected for demonstration purposes is a molybdenum-44.5% rhenium alloy formed by powder metallurgy. The heat pipes each have an annular crescent wick formed by hot isostatic pressing of molybdenum-rhenium wire mesh. The heat pipes are filled by vacuum distillation with purity sampling of the completed assembly. Round-the-clock heat pipe tests with 6-month destructive and non-destructive inspection intervals are conducted to identify the onset and level of corrosion. Non-contact techniques are employed to provide power to the evaporator (radio frequency induction heating at 1 to 5 kW per heat pipe) and calorimetry at the condenser (static gas gap coupled water cooled calorimeter). The planned operating temperature range extends from 1123 to 1323 K. Accomplishments before project cancellation included successful development of the heat pipe wick fabrication technique, establishment of all engineering designs, baseline operational test requirements, and procurement/assembly of supporting test hardware systems. (authors)

  4. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
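
    The "summation of log-transformed squared t-statistics" structure described above can be sketched for the one-sample case as follows; the exact form and asymptotic scaling used in the paper may differ, so this is an assumption-laden illustration only.

```python
# Sketch of a one-sample diagonal likelihood ratio statistic: a sum of
# log-transformed squared t-statistics, one per coordinate, assuming a diagonal
# covariance matrix. The cited paper's exact form and scaling may differ.
import numpy as np

def diagonal_lrt_statistic(x, mu0):
    """x: (n, p) data matrix; mu0: length-p hypothesized mean vector.
    Returns -2 log LR under a diagonal-covariance normal model."""
    x = np.asarray(x, dtype=float)
    n, p = x.shape
    t = np.sqrt(n) * (x.mean(axis=0) - mu0) / x.std(axis=0, ddof=1)   # per-coordinate t-statistics
    return float(n * np.log(1.0 + t ** 2 / (n - 1)).sum())

# Example with dimension p much larger than sample size n
rng = np.random.default_rng(2)
x = rng.normal(size=(20, 500))
print(round(diagonal_lrt_statistic(x, np.zeros(500)), 1))
```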

  5. Development and testing of the methodology for performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

    The U.S. Department of Energy (DOE) is in the process of implementing a set of materials control and accountability (MC&A) performance requirements. These graded requirements set a uniform level of performance for similar materials at various facilities against the threat of an insider adversary stealing special nuclear material (SNM). These requirements are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a specified value, and they include defense-in-depth requirements. The DOE has conducted an extensive effort over the last 2 1/2 yr to develop a practical methodology to be used in evaluating facility performance against the performance requirements specified in DOE order 5633.3. The major participants in the development process have been the Office of Safeguards and Security (OSS), Brookhaven National Laboratory, and Los Alamos National Laboratory. The process has included careful reviews of related evaluation systems, a review of the intent of the requirements in the order, and site visits to most of the major facilities in the DOE complex. As a result of this extensive effort to develop guidance for the MC&A performance requirements, OSS was able to provide a practical method that will allow facilities to evaluate the performance of their safeguards systems against the performance requirements. In addition, the evaluations can be validated by the cognizant operations offices in a systematic manner

  6. BRAF mutation testing in solid tumors: a methodological comparison.

    Science.gov (United States)

    Weyant, Grace W; Wisotzkey, Jeffrey D; Benko, Floyd A; Donaldson, Keri J

    2014-09-01

    Solid tumor genotyping has become standard of care for the characterization of proto-oncogene mutational status, which has traditionally been accomplished with Sanger sequencing. However, companion diagnostic assays and comparable laboratory-developed tests are becoming increasingly popular, such as the cobas 4800 BRAF V600 Mutation Test and the INFINITI KRAS-BRAF assay, respectively. This study evaluates and validates the analytical performance of the INFINITI KRAS-BRAF assay and compares concordance of BRAF status with two reference assays, the cobas test and Sanger sequencing. DNA extraction from FFPE tissue specimens was performed followed by multiplex PCR amplification and fluorescent label incorporation using allele-specific primer extension. Hybridization to a microarray, signal detection, and analysis were then performed. The limits of detection were determined by testing dilutions of mutant BRAF alleles within wild-type background DNA, and accuracy was calculated based on these results. The INFINITI KRAS-BRAF assay produced 100% concordance with the cobas test and Sanger sequencing and had sensitivity equivalent to the cobas assay. The INFINITI assay is repeatable with at least 95% accuracy in the detection of mutant and wild-type BRAF alleles. These results confirm that the INFINITI KRAS-BRAF assay is comparable to traditional sequencing and the Food and Drug Administration-approved companion diagnostic assay for the detection of BRAF mutations. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  7. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined in terms of the ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. One of the major concerns in using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This makes an NI trial impractical, particularly when using a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced by using the proposed ratio test for a fraction retention NI hypothesis.

  8. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    Systems in Package (SiP) play an important role in portable, aerospace and military electronics owing to their microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and malfunction location as system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with boundary scan theory for PCB circuits and embedded core testing, a specific testing methodology and process are proposed. The hardware requirements of the SiP system under test are specified, and the hardware platform for the testing has been constructed. The testing methodology offers high test efficiency and accurate malfunction location.

  9. Methodological issues in testing the marginal productivity theory

    NARCIS (Netherlands)

    P.T. Gottschalk (Peter); J. Tinbergen (Jan)

    1982-01-01

    textabstractPrevious tests of the marginal productivity theory have been criticized on several grounds reviewed by the authors. One important deficiency has been the small number of factor inputs entered in the production functions. In 1978 Gottschalk suggested a method to estimate production

  10. Two methodologies for physical penetration testing using social engineering

    NARCIS (Netherlands)

    Dimkov, T.; Pieters, Wolter; Hartel, Pieter H.

    2009-01-01

    During a penetration test on the physical security of an organization, if social engineering is used, the penetration tester directly interacts with the employees. These interactions are usually based on deception and if not done properly can upset the employees, violate their privacy or damage

  11. Escherichia coli. A sanitary methodology for faecal water pollution tests

    International Nuclear Information System (INIS)

    Bonadonna, L.

    2001-01-01

    Among the traditional indicators of faecal water pollution, Escherichia coli has been shown to fit best with the definition of an indicator organism. Until now, its recovery has been time-consuming and has required confirmation tests. In this report more rapid and direct methods, based on enzymatic reactions, are presented

  12. Test cases for interface tracking methods: methodology and current status

    International Nuclear Information System (INIS)

    Lebaigue, O.; Jamet, D.; Lemonnier, E.

    2004-01-01

    Full text of publication follows: In the past decade, a large number of new methods have been developed to deal with interfaces in the numerical simulation of two-phase flows. We have collected a set of 36 test cases, which can be seen as a tool to help engineers and researchers select the most appropriate method(s) for their specific fields of application. This set can be used: - To perform an initial evaluation of the capabilities of available methods with regard to the specificity of the final application and the most important features to be recovered from the simulation. - To measure the maximum mesh size to be used for a given physical problem in order to obtain an accurate enough solution. - To assess and quantify the performances of a selected method equipped with its set of physical models. The computation of a well-documented test case allows estimating the error due to the numerical technique by comparison with reference solutions. This process is compulsory to gain confidence and credibility on the prediction capabilities of a numerical method and its physical models. - To broaden the capabilities of a given numerical technique. The test cases may be used to identify the need for improvement of the overall numerical scheme or to determine the physical part of the model which is responsible for the observed limitations. Each test case falls within one of the following categories: - Analytical solutions of well-known sets of equations corresponding to simple geometrical situations. - Reference numerical solutions of moderately complex problems, produced by accurate methods (e.g., boundary-fitted coordinate method) on refined meshes. - Separate-effects analytical experiments. The presentation will suggest how to use the test cases for assessing the physical models and the numerical methods. The expected fallout of using test cases is indeed on the one hand to identify the merits of existing methods and on the other hand to orient further research towards

  13. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts; some of them are "true zeros" indicating that the drug-adverse event pairs cannot occur, and these are distinguished from the other, modeled zero counts, which simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation and maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
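
    The zero-inflated Poisson building block behind this method has pmf P(Y=0) = π + (1 - π)e^(-λ) and P(Y=k) = (1 - π)e^(-λ)λ^k/k! for k ≥ 1, and its parameters are typically estimated with an EM algorithm that treats the structural-zero indicator as missing data. A minimal sketch of that estimation step is given below; it is not the authors' signal-detection implementation.

```python
# Minimal EM fit of a zero-inflated Poisson model (pi = excess-zero probability,
# lam = Poisson mean). Illustrative sketch only; not the authors' implementation.
import numpy as np

def fit_zip(y, n_iter=200):
    y = np.asarray(y, dtype=float)
    pi, lam = 0.5, max(y.mean(), 1e-6)          # crude starting values
    for _ in range(n_iter):
        # E-step: probability that each observed zero is a structural ("true") zero
        p_zero = pi + (1.0 - pi) * np.exp(-lam)
        z = np.where(y == 0, pi / p_zero, 0.0)
        # M-step: update the mixing weight and the Poisson mean from soft assignments
        pi = z.mean()
        lam = ((1.0 - z) * y).sum() / (1.0 - z).sum()
    return pi, lam

# Example: simulate counts with 30% structural zeros and Poisson(2.5) otherwise
rng = np.random.default_rng(3)
structural = rng.random(5000) < 0.3
y = np.where(structural, 0, rng.poisson(2.5, size=5000))
print([round(v, 3) for v in fit_zip(y)])   # ≈ (0.3, 2.5)
```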

  14. Methodology of thermalhydraulic tests of fuel assemblies for WWER-1000

    International Nuclear Information System (INIS)

    Archipov, A.; Kolochko, V.N.

    2001-01-01

    At present 11 units with WWER-1000 reactors are in operation in Ukraine. The NPPs are supplied with nuclear fuel from Russia; the fuel assemblies are fabricated in Russia and delivered to the Ukrainian NPPs. However, current worldwide trends in nuclear energy development assume a diversification of nuclear fuel vendors. Therefore, the creation of Ukraine's own nuclear fuel cycle is envisaged in the strategy of nuclear energy development of Ukraine. As part of the fuel assembly fabrication process, a complex of thermalhydraulic tests should be carried out to confirm the design characteristics of the fuel assemblies before they are loaded into the reactor facility. The experimental basis and scientific infrastructure for arranging the thermalhydraulic tests and implementing the programs and procedures for core equipment examination are under consideration. (author)

  15. Adaptive and robust active vibration control methodology and tests

    CERN Document Server

    Landau, Ioan Doré; Castellanos-Silva, Abraham; Constantinescu, Aurelian

    2017-01-01

    This book approaches the design of active vibration control systems from the perspective of today’s ideas of computer control. It formulates the various design problems encountered in the active management of vibration as control problems and searches for the most appropriate tools to solve them. The experimental validation of the solutions proposed on relevant tests benches is also addressed. To promote the widespread acceptance of these techniques, the presentation eliminates unnecessary theoretical developments (which can be found elsewhere) and focuses on algorithms and their use. The solutions proposed cannot be fully understood and creatively exploited without a clear understanding of the basic concepts and methods, so these are considered in depth. The focus is on enhancing motivations, algorithm presentation and experimental evaluation. MATLAB® routines, Simulink® diagrams and bench-test data are available for download and encourage easy assimilation of the experimental and exemplary material. Thre...

  16. Alvar engine. An engine with variable compression ratio. Experiments and tests

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    1998-09-01

    This report is focused on tests with Variable Compression Ratio (VCR) engines, according to the Alvar engine principle. Variable compression ratio means an engine design in which it is possible to change the nominal compression ratio. The purpose is to increase fuel efficiency at part load by increasing the compression ratio. At maximum load, and possibly with supercharging (for example with a turbocharger), it is not possible to keep a high compression ratio because of the knock phenomenon. Knock is a shock wave caused by self-ignition of the fuel-air mixture. If knock occurs, the engine will be exposed to a destructive load. For these reasons it would be an advantage to be able to change the compression ratio continuously as the load changes. The Alvar engine provides a solution for variable compression ratio based on well-known engine components. This paper provides information about efficiency and emission characteristics from tests with two Alvar engines. Results from tests with a phase shift mechanism (for automatic compression ratio control) for the Alvar engine are also reviewed. Examination paper. 5 refs, 23 figs, 2 tabs, 5 appendices
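
    The part-load benefit described above follows from the ideal Otto-cycle relation η = 1 - r^(1-γ), where r is the compression ratio and γ the ratio of specific heats. The back-of-the-envelope sketch below (a textbook relation, not a result from the report) shows why raising r at part load is attractive.

```python
# Ideal Otto-cycle thermal efficiency as a function of compression ratio
# (textbook relation, not a result from the Alvar engine report).

def otto_efficiency(compression_ratio, gamma=1.35):
    return 1.0 - compression_ratio ** (1.0 - gamma)

for r in (8, 10, 12, 14):
    print(f"r = {r:2d}: eta = {otto_efficiency(r):.3f}")
```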

  17. An evaluation of damping ratios for HVAC duct systems using vibration test data

    International Nuclear Information System (INIS)

    Gunyasu, K.; Horimizu, Y.; Kawakami, A.; Iokibe, H.; Yamazaki, T.

    1988-01-01

    The function of Heating, Ventilating and Air Conditioning (HVAC) systems, including HVAC duct systems, must be maintained to keep safety-related equipment in nuclear power plants operating during earthquake excitations. Therefore, it is important to carry out seismic design for HVAC duct systems. In previous aseismic design for HVAC duct systems, a 0.5% damping ratio has been used in Japan. In recent years, vibration tests on actual duct systems in nuclear power plants and on mockup duct systems were performed in order to investigate damping ratios for HVAC duct systems. Based on the results, it was confirmed that the damping ratios for HVAC duct systems evaluated from these tests were much greater than the 0.5% damping ratio used in the previous aseismic design in Japan. A new damping ratio of 2.5% was proposed for aseismic design. The present paper describes the results of the above mentioned investigation
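
    Damping ratios are commonly extracted from measured frequency response peaks with the half-power bandwidth method, ζ ≈ (f2 - f1)/(2·fn), where f1 and f2 bracket the resonance peak at 1/√2 of its amplitude. The sketch below is a generic illustration of that estimate, not the procedure used in the cited tests.

```python
# Generic half-power bandwidth estimate of a damping ratio from a measured
# frequency response peak (illustrative; not the cited test procedure).
import numpy as np

def half_power_damping(freqs_hz, amplitude):
    freqs_hz = np.asarray(freqs_hz, dtype=float)
    amplitude = np.asarray(amplitude, dtype=float)
    i_peak = int(np.argmax(amplitude))
    half_power = amplitude[i_peak] / np.sqrt(2.0)
    above = amplitude >= half_power
    f1 = freqs_hz[above][0]      # lowest frequency still above half power
    f2 = freqs_hz[above][-1]     # highest frequency still above half power
    return (f2 - f1) / (2.0 * freqs_hz[i_peak])

# Example: single-degree-of-freedom response with a known 2.5% damping ratio
f = np.linspace(5, 15, 4001)
fn, zeta = 10.0, 0.025
h = 1.0 / np.sqrt((1 - (f / fn) ** 2) ** 2 + (2 * zeta * f / fn) ** 2)
print(round(half_power_damping(f, h), 4))   # ≈ 0.025
```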

  18. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    Science.gov (United States)

    Catalano, Ralph A

    2003-09-01

    Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  19. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc.) that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red); furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient); finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond).

  20. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    Science.gov (United States)

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  1. Testing Methodology of Breaking into Secured Storages of Mobile Operational System Google Android

    Directory of Open Access Journals (Sweden)

    Elena Vyacheslavovna Elistratova

    2013-02-01

    Full Text Available The methodology is developed for carrying out the test of breaking into internal storages of mobile operational system Google Android in order to detect security threats for personal data.

  2. The efficiency of the crude oil markets: Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie, E-mail: acharles@audencia.co [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier, E-mail: olivier.darne@univ-nantes.f [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable.
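
    The variance ratio statistic underlying these tests compares the variance of k-period returns with k times the variance of one-period returns; under a random walk the ratio is 1. A plain homoskedastic (Lo-MacKinlay style) sketch is given below; the rank/sign and wild-bootstrap refinements cited in the record are not reproduced.

```python
# Plain variance ratio VR(k) for a price series: variance of k-period log returns
# over k times the variance of 1-period log returns (Lo-MacKinlay style,
# homoskedastic case). The rank/sign and wild-bootstrap variants are not shown.
import numpy as np

def variance_ratio(prices, k):
    log_p = np.log(np.asarray(prices, dtype=float))
    r1 = np.diff(log_p)                      # 1-period log returns
    rk = log_p[k:] - log_p[:-k]              # overlapping k-period log returns
    mu = r1.mean()
    var1 = np.mean((r1 - mu) ** 2)
    vark = np.mean((rk - k * mu) ** 2) / k
    return vark / var1

# Example: a simulated random walk should give VR(k) close to 1 for all k
rng = np.random.default_rng(4)
prices = 50 * np.exp(np.cumsum(rng.normal(0, 0.02, 5000)))
print([round(variance_ratio(prices, k), 3) for k in (2, 5, 10)])
```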

  3. The efficiency of the crude oil markets. Evidence from variance ratio tests

    International Nuclear Information System (INIS)

    Charles, Amelie; Darne, Olivier

    2009-01-01

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable. (author)

  4. The efficiency of the crude oil markets. Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable. (author)

  5. Advanced Test Reactor probabilistic risk assessment methodology and results summary

    International Nuclear Information System (INIS)

    Eide, S.A.; Atkinson, S.A.; Thatcher, T.A.

    1992-01-01

    The Advanced Test Reactor (ATR) probabilistic risk assessment (PRA) Level 1 report documents a comprehensive and state-of-the-art study to establish and reduce the risk associated with operation of the ATR, expressed as a mean frequency of fuel damage. The ATR Level 1 PRA effort is unique and outstanding because of its consistent and state-of-the-art treatment of all facets of the risk study, its comprehensive and cost-effective risk reduction effort while the risk baseline was being established, and its thorough and comprehensive documentation. The PRA includes many improvements to the state-of-the-art, including the following: establishment of a comprehensive generic data base for component failures, treatment of initiating event frequencies given significant plant improvements in recent years, performance of efficient identification and screening of fire and flood events using code-assisted vital area analysis, identification and treatment of significant seismic-fire-flood-wind interactions, and modeling of large loss-of-coolant accidents (LOCAs) and experiment loop ruptures leading to direct damage of the ATR core. 18 refs

  6. Estimating negative likelihood ratio confidence when test sensitivity is 100%: A bootstrapping approach.

    Science.gov (United States)

    Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B

    2017-08-01

    Objectives Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the Score CI (as implemented in the StatXact, SAS, and R PropCI package), and (4) the bootstrapping technique. Results The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease, and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0,0.073), StatXact (0,0.064), SAS Score method (0,0.057), R PropCI (0,0.062), and bootstrap (0,0.048). Similar trends were observed for other sample sizes. Conclusions When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and accompanying free open-source R package were developed to yield realistic estimates easily. This
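    To make the bootstrapping idea above concrete, the Python fragment below sketches one way to implement it; the exact algorithm in the bootLR package may differ, and the assumption that the lowest population sensitivity with median sample sensitivity 100% solves p ** n = 0.5 is an editorial reading of the abstract. The sample sizes and specificity mirror the worked example quoted above.

      import numpy as np

      def neg_lr_upper_bound(n_dis, n_nodis, spec_hat, n_boot=100_000, seed=1):
          """Bootstrap sketch of the upper 95% bound for the negative LR when
          all n_dis diseased patients test positive (sample sensitivity 100%)."""
          rng = np.random.default_rng(seed)
          p_low = 0.5 ** (1.0 / n_dis)                 # lowest sensitivity with median sample sensitivity 100%
          sens = rng.binomial(n_dis, p_low, n_boot) / n_dis
          spec = rng.binomial(n_nodis, spec_hat, n_boot) / n_nodis
          spec = np.clip(spec, 1e-12, None)            # guard against zero-specificity draws
          neg_lr = (1.0 - sens) / spec                 # LR- = (1 - sensitivity) / specificity
          return p_low, np.quantile(neg_lr, 0.975)

      p_low, upper = neg_lr_upper_bound(n_dis=100, n_nodis=100, spec_hat=0.60)
      print(f"lowest population sensitivity: {p_low:.4f}")   # about 0.9931, matching the abstract
      print(f"negative LR 95% CI: (0, {upper:.3f})")         # in the neighborhood of the reported 0.048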

  7. Person fit for test speededness: normal curvatures, likelihood ratio tests and empirical Bayes estimates

    NARCIS (Netherlands)

    Goegebeur, Y.; de Boeck, P.; Molenberghs, G.

    2010-01-01

    The local influence diagnostics, proposed by Cook (1986), provide a flexible way to assess the impact of minor model perturbations on key model parameters’ estimates. In this paper, we apply the local influence idea to the detection of test speededness in a model describing nonresponse in test data,

  8. Tests and Confidence Intervals for an Extended Variance Component Using the Modified Likelihood Ratio Statistic

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Frydenberg, Morten; Jensen, Jens Ledet

    2005-01-01

    The large deviation modified likelihood ratio statistic is studied for testing a variance component equal to a specified value. Formulas are presented in the general balanced case, whereas in the unbalanced case only the one-way random effects model is studied. Simulation studies are presented, showing that the normal approximation to the large deviation modified likelihood ratio statistic gives confidence intervals for variance components with coverage probabilities very close to the nominal confidence coefficient.

  9. Graphite Isotope Ratio Method Development Report: Irradiation Test Demonstration of Uranium as a Low Fluence Indicator

    International Nuclear Information System (INIS)

    Reid, B.D.; Gerlach, D.C.; Love, E.F.; McNeece, J.P.; Livingston, J.V.; Greenwood, L.R.; Petersen, S.L.; Morgan, W.C.

    1999-01-01

    This report describes an irradiation test designed to investigate the suitability of uranium as a graphite isotope ratio method (GIRM) low fluence indicator. GIRM is a demonstrated concept that gives a graphite-moderated reactor's lifetime production based on measuring changes in the isotopic ratio of elements known to exist in trace quantities within reactor-grade graphite. Appendix I of this report provides a tutorial on the GIRM concept

  10. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    Science.gov (United States)

    Juarez, Alfredo; Harper, Susana Tapia

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  11. Evaluation of constraint methodologies applied to a shallow-flaw cruciform bend specimen tested under biaxial loading conditions

    International Nuclear Information System (INIS)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-01-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant depth), shallow surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies, namely the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading biaxiality on fracture toughness; the conventional maximum principal stress criterion indicated no effect

  12. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

  13. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Science.gov (United States)

    Jaspers, Monique W M

    2009-05-01

    Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but it requires high skills and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. Major drawbacks of the cognitive walkthrough are the required level of detail of task and user background descriptions for an adequate application of the latest version of the technique. The think aloud is a very direct method to gain deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise both in cognitive ergonomics and in the computer system application domain. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed any significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  14. Jet-Surface Interaction: High Aspect Ratio Nozzle Test, Nozzle Design and Preliminary Data

    Science.gov (United States)

    Brown, Clifford; Dippold, Vance

    2015-01-01

    The Jet-Surface Interaction High Aspect Ratio (JSI-HAR) nozzle test is part of an ongoing effort to measure and predict the noise created when an aircraft engine exhausts close to an airframe surface. The JSI-HAR test is focused on parameters derived from the Turbo-electric Distributed Propulsion (TeDP) concept aircraft which include a high-aspect ratio mailslot exhaust nozzle, internal septa, and an aft deck. The size and mass flow rate limits of the test rig also limited the test nozzle to a 16:1 aspect ratio, half the approximately 32:1 on the TeDP concept. Also, unlike the aircraft, the test nozzle must transition from a single round duct on the High Flow Jet Exit Rig, located in the AeroAcoustic Propulsion Laboratory at the NASA Glenn Research Center, to the rectangular shape at the nozzle exit. A parametric nozzle design method was developed to design three low noise round-to-rectangular transitions, with 8:1, 12:1, and 16:1 aspect ratios, that minimize flow separations and shocks while providing a flat flow profile at the nozzle exit. These designs were validated using the WIND-US CFD code. A preliminary analysis of the test data shows that the actual flow profile is close to that predicted and that the noise results appear consistent with data from previous, smaller scale, tests. The JSI-HAR test is ongoing through October 2015. The results shown in the presentation are intended to provide an overview of the test and a first look at the preliminary results.

  15. Testing Measurement Invariance Using MIMIC: Likelihood Ratio Test with a Critical Value Adjustment

    Science.gov (United States)

    Kim, Eun Sook; Yoon, Myeongsun; Lee, Taehun

    2012-01-01

    Multiple-indicators multiple-causes (MIMIC) modeling is often used to test a latent group mean difference while assuming the equivalence of factor loadings and intercepts over groups. However, this study demonstrated that MIMIC was insensitive to the presence of factor loading noninvariance, which implies that factor loading invariance should be…

  16. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. Leakage localisation methodology is based on pressure sensitivity matrix. Sensitivity is normalised and binarised using a common threshold for all nodes, so a signatures matrix is obtained. A pressure sensor optimal distribution methodology is developed too, but it is not used in the real test. To validate this...

  18. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
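    To make the phrase "summation of the log-transformed squared t-statistics" concrete, the Python sketch below shows an illustrative one-sample form of such a statistic under a diagonal working model; the standardization the paper uses to obtain asymptotic normality is not reproduced, so this is a sketch of the general shape rather than the paper's exact test.

      import numpy as np

      def diagonal_lrt_statistic(X, mu0=0.0):
          """Illustrative one-sample diagonal likelihood-ratio statistic.

          For coordinate j of an n x p data matrix X, the -2 log likelihood ratio
          under normality with unknown variance is n * log(1 + t_j^2 / (n - 1)),
          where t_j is the ordinary one-sample t-statistic; the statistic below
          is the sum of these log-transformed squared t-statistics.
          """
          X = np.asarray(X, dtype=float)
          n, p = X.shape
          t = np.sqrt(n) * (X.mean(axis=0) - mu0) / X.std(axis=0, ddof=1)
          return np.sum(n * np.log1p(t ** 2 / (n - 1)))

      rng = np.random.default_rng(2)
      stat = diagonal_lrt_statistic(rng.normal(size=(30, 500)))   # n = 30 samples, p = 500 features
      print(f"diagonal LRT statistic: {stat:.1f}")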

  19. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional

  20. Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.

    Science.gov (United States)

    Osterlind, Steven J.; Martois, John S.

    This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…

  1. Evidence Based Medicine; Positive and Negative Likelihood Ratios of Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Alireza Baratloo

    2015-10-01

    Full Text Available In the previous two parts of the educational manuscript series in Emergency, we explained some screening characteristics of diagnostic tests including accuracy, sensitivity, specificity, and positive and negative predictive values. In the 3rd part we aimed to explain the positive and negative likelihood ratio (LR) as one of the most reliable performance measures of a diagnostic test. To better understand this characteristic of a test, it is first necessary to fully understand the concepts of sensitivity and specificity. So we strongly advise you to review the 1st part of this series again. In short, the likelihood ratios are about the percentage of people with and without a disease but having the same test result. The prevalence of a disease can directly influence the screening characteristics of a diagnostic test, especially its sensitivity and specificity. LR was developed to eliminate this effect. The pre-test odds of a disease multiplied by the positive or negative LR give the post-test odds, from which the post-test probability can be estimated. Therefore, LR is the most important characteristic of a test for ruling a diagnosis out or in. A positive likelihood ratio > 1 means a higher probability of the disease being present in a patient with a positive test. The further from 1, either higher or lower, the stronger the evidence to rule in or rule out the disease, respectively. It is obvious that tests with LR close to one are less practical. On the other hand, an LR further from one will have more value for application in medicine. Usually tests with LR < 0.1 or LR > 10 are considered suitable for application in routine practice.
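    The relationships described above fit in a few lines of Python; the sensitivity, specificity, and prevalence below are hypothetical numbers chosen only for illustration.

      def likelihood_ratios(sensitivity, specificity):
          """Positive and negative likelihood ratios of a diagnostic test."""
          lr_pos = sensitivity / (1.0 - specificity)
          lr_neg = (1.0 - sensitivity) / specificity
          return lr_pos, lr_neg

      def post_test_probability(pre_test_prob, lr):
          """Convert a pre-test probability to a post-test probability via odds and an LR."""
          pre_odds = pre_test_prob / (1.0 - pre_test_prob)
          post_odds = pre_odds * lr
          return post_odds / (1.0 + post_odds)

      # Hypothetical test: sensitivity 90%, specificity 80%, disease prevalence 20%
      lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)
      print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.3f}")    # 4.5 and 0.125
      print(f"post-test probability after a positive result: {post_test_probability(0.20, lr_pos):.2f}")   # ~0.53
      print(f"post-test probability after a negative result: {post_test_probability(0.20, lr_neg):.2f}")   # ~0.03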

  2. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    Science.gov (United States)

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  3. Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.

    Science.gov (United States)

    South, Scott J.

    1988-01-01

    Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…

  4. Do exchange rates follow random walks? A variance ratio test of the ...

    African Journals Online (AJOL)

    The random-walk hypothesis in foreign-exchange rates market is one of the most researched areas, particularly in developed economies. However, emerging markets in sub-Saharan Africa have received little attention in this regard. This study applies Lo and MacKinlay's (1988) conventional variance ratio test and Wright's ...

  5. A methodology to investigate size scale effects in crystalline plasticity using uniaxial compression testing

    International Nuclear Information System (INIS)

    Uchic, Michael D.; Dimiduk, Dennis M.

    2005-01-01

    A methodology for performing uniaxial compression tests on samples having micron-size dimensions is presented. Sample fabrication is accomplished using focused ion beam milling to create cylindrical samples of uniform cross-section that remain attached to the bulk substrate at one end. Once fabricated, samples are tested in uniaxial compression using a nanoindentation device outfitted with a flat tip, and a stress-strain curve is obtained. The methodology can be used to examine the plastic response of samples of different sizes that are from the same bulk material. In this manner, dimensional size effects at the micron scale can be explored for single crystals, using a readily interpretable test that minimizes imposed stretch and bending gradients. The methodology was applied to a single-crystal Ni superalloy and a transition from bulk-like to size-affected behavior was observed for samples 5 μm in diameter and smaller

  6. The effects of multiple features of alternatively spliced exons on the KA/KS ratio test

    Directory of Open Access Journals (Sweden)

    Chen Feng-Chi

    2006-05-01

    Full Text Available Abstract Background The evolution of alternatively spliced exons (ASEs) is of primary interest because these exons are suggested to be a major source of functional diversity of proteins. Many exon features have been suggested to affect the evolution of ASEs. However, previous studies have relied on the KA/KS ratio test without taking into consideration information sufficiency (i.e., exon length > 75 bp, cross-species divergence > 5%) of the studied exons, leading to potentially biased interpretations. Furthermore, which exon feature dominates the results of the KA/KS ratio test and whether multiple exon features have additive effects have remained unexplored. Results In this study, we collect two different datasets for analysis – the ASE dataset (which includes lineage-specific ASEs and conserved ASEs) and the ACE dataset (which includes only conserved ASEs). We first show that information sufficiency can significantly affect the interpretation of the relationship between exon features and the KA/KS ratio test results. After discarding exons with insufficient information, we use a Boolean method to analyze the relationship between test results and four exon features (namely length, protein domain overlapping, inclusion level, and exonic splicing enhancer (ESE) frequency) for the ASE dataset. We demonstrate that length and protein domain overlapping are dominant factors, and they have similar impacts on test results of ASEs. In addition, despite the weak impacts of inclusion level and ESE motif frequency when considered individually, the combination of these two factors still has minor additive effects on test results. However, the ACE dataset shows a slightly different result in that inclusion level has a marginally significant effect on test results. Lineage-specific ASEs may have contributed to the difference. Overall, in both ASEs and ACEs, protein domain overlapping is the most dominant exon feature while ESE frequency is the weakest one in affecting

  7. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained high scattering austenitic stainless steel. The proposed methodology is comprised of the Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum of selected IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
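    As a rough sketch of the reconstruction and signal-to-noise steps described above, the Python fragment below selects a range of IMFs and estimates the SNR; the EEMD decomposition itself and the paper's signal minimisation algorithm are not reproduced, and the array shapes and defect gate are assumptions made for illustration.

      import numpy as np

      def reconstruct_from_imfs(imfs, first, last):
          """Reconstruct a signal from a selected range of IMFs.

          `imfs` is assumed to be an (n_imfs, n_samples) array produced by an
          EEMD implementation (not included here)."""
          return imfs[first:last + 1].sum(axis=0)

      def snr_db(signal, gate):
          """Crude SNR estimate in dB: peak amplitude inside a defect gate
          versus the RMS level outside it (gate is a boolean mask)."""
          peak = np.max(np.abs(signal[gate]))
          noise_rms = np.sqrt(np.mean(signal[~gate] ** 2))
          return 20.0 * np.log10(peak / noise_rms)

      # Usage sketch (shapes assumed):
      #   imfs = <EEMD output of shape (n_imfs, n_samples)>
      #   gate = <boolean mask marking the expected defect echo window>
      #   recon = reconstruct_from_imfs(imfs, first=2, last=4)
      #   print(f"SNR after reconstruction: {snr_db(recon, gate):.1f} dB")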

  8. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    The objectives of this study are the development of a methodology for assessing the optimization of Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the first-year assessment methodology, and the interaction between STI and AOT is modeled and quantified. In addition, the reliability assessment methodology for the diesel generator is reviewed and applied to the PSA code.

  9. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    The objectives of this study are the development of a methodology for assessing the optimization of Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the first-year assessment methodology, and the interaction between STI and AOT is modeled and quantified. In addition, the reliability assessment methodology for the diesel generator is reviewed and applied to the PSA code.

  10. A numerical test method of California bearing ratio on graded crushed rocks using particle flow modeling

    Directory of Open Access Journals (Sweden)

    Yingjun Jiang

    2015-04-01

    Full Text Available In order to better understand the mechanical properties of graded crushed rocks (GCRs and to optimize the relevant design, a numerical test method based on the particle flow modeling technique PFC2D is developed for the California bearing ratio (CBR test on GCRs. The effects of different testing conditions and micro-mechanical parameters used in the model on the CBR numerical results have been systematically studied. The reliability of the numerical technique is verified. The numerical results suggest that the influences of the loading rate and Poisson's ratio on the CBR numerical test results are not significant. As such, a loading rate of 1.0–3.0 mm/min, a piston diameter of 5 cm, a specimen height of 15 cm and a specimen diameter of 15 cm are adopted for the CBR numerical test. The numerical results reveal that the CBR values increase with the friction coefficient at the contact and shear modulus of the rocks, while the influence of Poisson's ratio on the CBR values is insignificant. The close agreement between the CBR numerical results and experimental results suggests that the numerical simulation of the CBR values is promising to help assess the mechanical properties of GCRs and to optimize the grading design. Besides, the numerical study can provide useful insights on the mesoscopic mechanism.

  11. A likelihood ratio test for species membership based on DNA sequence data

    DEFF Research Database (Denmark)

    Matz, Mikhail V.; Nielsen, Rasmus

    2005-01-01

    DNA barcoding as an approach for species identification is rapidly increasing in popularity. However, it remains unclear which statistical procedures should accompany the technique to provide a measure of uncertainty. Here we describe a likelihood ratio test which can be used to test if a sampled sequence is a member of an a priori specified species. We investigate the performance of the test using coalescence simulations, as well as using the real data from butterflies and frogs representing two kinds of challenge for DNA barcoding: extremely low and extremely high levels of sequence variability.

  12. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary

    International Nuclear Information System (INIS)

    Brucher, Wenzel; Koch, Wolfgang; Pretzsch, Gunter Guido; Loiseau, Olivier; Mo, Tin; Billone, Michael C.; Autrusson, Bruno A.; Young, F. I.; Coats, Richard Lee; Burtseva, Tatiana; Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver; Thompson, Nancy Slater; Hibbs, Russell S.; Gregson, Michael Warren; Lange, Florentin; Molecke, Martin Alan; Tsai, Han-Chung

    2005-01-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  13. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary.

    Energy Technology Data Exchange (ETDEWEB)

    Brucher, Wenzel (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Koch, Wolfgang (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Pretzsch, Gunter Guido (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Loiseau, Olivier (Institut de Radioprotection et de Surete Nucleaire, France); Mo, Tin (U.S. Nuclear Regulatory Commission, Washington, DC); Billone, Michael C. (Argonne National Laboratory, Argonne, IL); Autrusson, Bruno A. (Institut de Radioprotection et de Surete Nucleaire, France); Young, F. I. (U.S. Nuclear Regulatory Commission, Washington, DC); Coats, Richard Lee; Burtseva, Tatiana (Argonne National Laboratory, Argonne, IL); Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Thompson, Nancy Slater (U.S. Department of Energy, Washington, DC); Hibbs, Russell S. (U.S. Department of Energy, Washington, DC); Gregson, Michael Warren; Lange, Florentin (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Molecke, Martin Alan; Tsai, Han-Chung (Argonne National Laboratory, Argonne, IL)

    2005-07-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  14. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected return models are widely used in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data concentrate only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor, and 4-factor models using a predictive methodology, considering two steps, time-series and cross-sectional regressions, with standard errors obtained by the technique of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly owing to the originality of the methodology in the local market and the fact that this subject is still incipient and controversial in the Brazilian academic environment.
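    For readers unfamiliar with the two-step procedure mentioned above, a compact Python sketch of a Fama-MacBeth style test is given below. The simulated factors and returns are placeholders, and a genuinely predictive design would estimate the betas on a window prior to the cross-sectional regressions rather than on the same sample.

      import numpy as np

      def fama_macbeth(excess_returns, factors):
          """Two-step test in the spirit of Fama and MacBeth (1973).

          Step 1: time-series regressions estimate each asset's factor loadings.
          Step 2: period-by-period cross-sectional regressions of returns on the
          loadings; risk premia are the time averages of the cross-sectional
          slopes, with standard errors from the time variation of those slopes.
          """
          R = np.asarray(excess_returns, dtype=float)   # (T, N) asset excess returns
          F = np.asarray(factors, dtype=float)          # (T, K) factor returns
          T, N = R.shape
          X = np.column_stack([np.ones(T), F])
          betas = np.linalg.lstsq(X, R, rcond=None)[0][1:].T       # (N, K) loadings
          Z = np.column_stack([np.ones(N), betas])
          lambdas = np.array([np.linalg.lstsq(Z, R[t], rcond=None)[0] for t in range(T)])
          premia = lambdas.mean(axis=0)
          se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)            # Fama-MacBeth standard errors
          return premia, premia / se

      # Example with simulated data: 3 factors, 25 portfolios, 240 months
      rng = np.random.default_rng(3)
      F = rng.normal(0.005, 0.03, size=(240, 3))
      B = rng.normal(1.0, 0.3, size=(25, 3))
      R = F @ B.T + rng.normal(0.0, 0.02, size=(240, 25))
      premia, tstats = fama_macbeth(R, F)
      print("risk premia:", np.round(premia, 4))
      print("t-statistics:", np.round(tstats, 2))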

  15. Chloride accelerated test: influence of silica fume, water/binder ratio and concrete cover thickness

    Directory of Open Access Journals (Sweden)

    E. Pereira

    Full Text Available In developed countries like the UK, France, Italy and Germany, it is estimated that spending on maintenance and repair is practically the same as investment in new construction. Therefore, this paper aims to study different ways of interfering with the corrosion kinetics using an accelerated corrosion test (CAIM) that simulates chloride attack. The three variables are: concrete cover thickness, use of silica fume and the water/binder ratio. It was found, by analysis of variance of the weight loss of the steel bars and of the chloride content in the concrete cover, that all three variables have a significant influence. Also, the results indicate that the addition of silica fume is the path to improving the corrosion protection of low water/binder ratio concretes (around 0.4), while increasing the concrete cover thickness is the most effective solution for increasing the protection of high water/binder ratio concretes (above 0.5).

  16. Orthogonal series generalized likelihood ratio test for failure detection and isolation. [for aircraft control

    Science.gov (United States)

    Hall, Steven R.; Walker, Bruce K.

    1990-01-01

    A new failure detection and isolation algorithm for linear dynamic systems is presented. This algorithm, the Orthogonal Series Generalized Likelihood Ratio (OSGLR) test, is based on the assumption that the failure modes of interest can be represented by truncated series expansions. This assumption leads to a failure detection algorithm with several desirable properties. Computer simulation results are presented for the detection of the failures of actuators and sensors of a C-130 aircraft. The results show that the OSGLR test generally performs as well as the GLR test in terms of time to detect a failure and is more robust to failure mode uncertainty. However, the OSGLR test is also somewhat more sensitive to modeling errors than the GLR test.

  17. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    Science.gov (United States)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A previously designed 16-bit correlator, NPS CORN88, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  18. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Cho, Jae Seon; Huh, Chang Wook; Kim, Do Hyoung; Kim, Ju Youl; Kim, Yoon Ik; Yang, Hui Chang; Park, Kang Min [Seoul National Univ., Seoul (Korea, Republic of)

    1998-03-15

    The objectives of this study are the development of a methodology for assessing the optimization of Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of Korean nuclear power plant safety. In this study, a survey of the assessment methodologies, models and results of domestic and international research is performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method. Sensitivity analyses of the component failure factors are performed on the basis of this methodology, and the interaction between STI and AOT is modeled and quantified. The reliability assessment methodology for the diesel generator is reviewed and applied to the PSA code. A qualitative assessment of the STI/AOT of RPS/ESFAS, the most safety-significant systems in the nuclear power plant, is performed.

  19. The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.

    Science.gov (United States)

    Greer, John T.

    Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…

  20. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  1. Comparison of two bond strength testing methodologies for bilayered all-ceramics

    NARCIS (Netherlands)

    Dundar, Mine; Ozcan, Mutlu; Gokce, Bulent; Comlekoglu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    Objectives. This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Methods. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite to

  2. Determining tropical cyclone inland flooding loss on a large scale through a new flood peak ratio-based methodology

    International Nuclear Information System (INIS)

    Czajkowski, Jeffrey; Michel-Kerjan, Erwann; Villarini, Gabriele; Smith, James A

    2013-01-01

    In recent years, the United States has been severely affected by numerous tropical cyclones (TCs) which have caused massive damages. While media attention mainly focuses on coastal losses from storm surge, these TCs have inflicted significant devastation inland as well. Yet, little is known about the relationship between TC-related inland flooding and economic losses. Here we introduce a novel methodology that first successfully characterizes the spatial extent of inland flooding, and then quantifies its relationship with flood insurance claims. Hurricane Ivan in 2004 is used as illustration. We empirically demonstrate in a number of ways that our quantified inland flood magnitude produces a very good representation of the number of inland flood insurance claims experienced. These results highlight the new technological capabilities that can lead to a better risk assessment of inland TC flood. This new capacity will be of tremendous value to a number of public and private sector stakeholders dealing with disaster preparedness. (letter)

  3. Methodology to identify risk-significant components for inservice inspection and testing

    International Nuclear Information System (INIS)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.; Kido, C.; Phillips, J.H.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues

  4. Methodology and application of 13C breath test in gastroenterology practice

    International Nuclear Information System (INIS)

    Yan Weili; Jiang Yibin

    2002-01-01

    The 13C breath test has been widely used in research on nutrition, pharmacology and gastroenterology owing to properties such as safety and non-invasiveness. The author describes the principle and methodology of the 13C breath test and its application in the detection of Helicobacter pylori infection in the stomach and of small bowel bacterial overgrowth, and in the measurement of gastric emptying, pancreatic exocrine function and liver function with various substrates.

  5. Effect of home testing of international normalized ratio on clinical events.

    Science.gov (United States)

    Matchar, David B; Jacobson, Alan; Dolor, Rowena; Edson, Robert; Uyeda, Lauren; Phibbs, Ciaran S; Vertrees, Julia E; Shih, Mei-Chiung; Holodniy, Mark; Lavori, Philip

    2010-10-21

    Warfarin anticoagulation reduces thromboembolic complications in patients with atrial fibrillation or mechanical heart valves, but effective management is complex, and the international normalized ratio (INR) is often outside the target range. As compared with venous plasma testing, point-of-care INR measuring devices allow greater testing frequency and patient involvement and may improve clinical outcomes. We randomly assigned 2922 patients who were taking warfarin because of mechanical heart valves or atrial fibrillation and who were competent in the use of point-of-care INR devices to either weekly self-testing at home or monthly high-quality testing in a clinic. The primary end point was the time to a first major event (stroke, major bleeding episode, or death). The patients were followed for 2.0 to 4.75 years, for a total of 8730 patient-years of follow-up. The time to the first primary event was not significantly longer in the self-testing group than in the clinic-testing group (hazard ratio, 0.88; 95% confidence interval, 0.75 to 1.04; P=0.14). The two groups had similar rates of clinical outcomes except that the self-testing group reported more minor bleeding episodes. Over the entire follow-up period, the self-testing group had a small but significant improvement in the percentage of time during which the INR was within the target range (absolute difference between groups, 3.8 percentage points; P<0.001). At 2 years of follow-up, the self-testing group also had a small but significant improvement in patient satisfaction with anticoagulation therapy (P=0.002) and quality of life (P<0.001). As compared with monthly high-quality clinic testing, weekly self-testing did not delay the time to a first stroke, major bleeding episode, or death to the extent suggested by prior studies. These results do not support the superiority of self-testing over clinic testing in reducing the risk of stroke, major bleeding episode, and death among patients taking warfarin

  6. Reference Performance Test Methodology for Degradation Assessment of Lithium-Sulfur Batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel-Ioan; Purkayastha, Rajlakshmi

    2018-01-01

    Lithium-Sulfur (Li-S) is an emerging battery technology receiving a growing amount of attention due to its potentially high gravimetric energy density, safety, and low production cost. However, there are still some obstacles preventing its swift commercialization. Li-S batteries are driven by different electrochemical processes than commonly used Lithium-ion batteries, which often results in very different behavior. Therefore, the testing and modeling of these systems have to be adjusted to reflect their unique behavior and to prevent possible bias. A methodology for a Reference Performance Test (RPT) for Li-S batteries is proposed in this study to point out Li-S battery features and to provide guidance to users on how to deal with them, possibly feeding into standardization. The proposed test methodology is demonstrated for 3.4 Ah Li-S cells aged under different conditions.

  7. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S. [CISE SpA, Milan (Italy); Crudeli, R. [ENEL SpA, Milan (Italy)

    1998-12-31

    CISE and ENEL have been involved for years in a miniaturized creep test methodology project aimed at obtaining a quasi non-destructive test with the same reliability as the standard creep test. The goal can be reached with the 'small punch creep test', which combines all the required characteristics: quasi non-destructive disk specimens extracted from either the external or internal side of components, then accurately machined and tested on small and inexpensive apparatus. CISE has developed a complete small punch creep procedure involving a dedicated test facility and correlation laws comparable with the more widespread isostress methodology for residual life evaluation of ex-service high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance management. Further work is needed to validate the small punch methodology and to calibrate the relationship for the most commonly used high temperature structural materials. First results of a comparative study on ASTM A355 P12 ex-service pipe material are presented, together with a description of the small punch apparatus built at CISE. (orig.) 6 refs.

  8. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S [CISE SpA, Milan (Italy); Crudeli, R [ENEL SpA, Milan (Italy)

    1999-12-31

    CISE and ENEL have been involved for years in a miniaturized creep test methodology project aimed at obtaining a quasi non-destructive test with the same reliability as the standard creep test. The goal can be reached with the 'small punch creep test', which combines all the required characteristics: quasi non-destructive disk specimens extracted from either the external or internal side of components, then accurately machined and tested on small and inexpensive apparatus. CISE has developed a complete small punch creep procedure involving a dedicated test facility and correlation laws comparable with the more widespread isostress methodology for residual life evaluation of ex-service high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance management. Further work is needed to validate the small punch methodology and to calibrate the relationship for the most commonly used high temperature structural materials. First results of a comparative study on ASTM A355 P12 ex-service pipe material are presented, together with a description of the small punch apparatus built at CISE. (orig.) 6 refs.

  9. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve Type I and Type II error rates comparable to those of the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure.
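
    As an illustration of the sequential idea described above, the following minimal sketch implements Wald's SPRT for a stream of binary exceedance indicators; the hypothesized exceedance proportions p0 and p1 and the error rates alpha and beta are placeholder values chosen for illustration, not those used by the authors.

      import math

      def sprt_bernoulli(samples, p0=0.10, p1=0.25, alpha=0.05, beta=0.20):
          """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1 samples.

          Returns a decision string and the number of samples actually used.
          Thresholds follow Wald's classical approximations.
          """
          upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
          lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
          llr = 0.0
          for n, x in enumerate(samples, start=1):
              if x:  # exceedance observed
                  llr += math.log(p1 / p0)
              else:  # compliant sample
                  llr += math.log((1 - p1) / (1 - p0))
              if llr >= upper:
                  return "H1 (impaired)", n
              if llr <= lower:
                  return "H0 (not impaired)", n
          return "undecided", len(samples)

      # Example: a monitoring record with a single exceedance among 20 samples;
      # the test stops early, before all 20 samples are needed.
      print(sprt_bernoulli([0, 0, 1, 0, 0, 0, 0, 0, 0, 0,
                            0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))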

  10. Tests of Full-Scale Helicopter Rotors at High Advancing Tip Mach Numbers and Advance Ratios

    Science.gov (United States)

    Biggers, James C.; McCloud, John L., III; Stroub, Robert H.

    2015-01-01

    As a continuation of the studies of reference 1, three full-scale helicopter rotors have been tested in the Ames Research Center 40- by 80-foot wind tunnel. All three of them were two-bladed, teetering rotors. One of the rotors incorporated the NACA 0012 airfoil section over the entire length of the blade. This rotor was tested at advance ratios up to 1.05. Both of the other rotors were tapered in thickness and incorporated leading-edge camber over the outer 20 percent of the blade radius. The larger of these rotors was tested at advancing tip Mach numbers up to 1.02. Data were obtained for a wide range of lift and propulsive force, and are presented without discussion.

  11. A test procedure for determining the influence of stress ratio on fatigue crack growth

    Science.gov (United States)

    Fitzgerald, J. H.; Wei, R. P.

    1974-01-01

    A test procedure is outlined by which the rate of fatigue crack growth over a range of stress ratios and stress intensities can be determined expeditiously using a small number of specimens. This procedure was developed to avoid or circumvent the effects of load interactions on fatigue crack growth, and was used to develop data on a mill annealed Ti-6Al-4V alloy plate. Experimental data suggest that the rates of fatigue crack growth among the various stress ratios may be correlated in terms of an effective stress intensity range at given values of K_max. This procedure is not to be used, however, for determining the corrosion fatigue crack growth characteristics of alloys when nonsteady-state effects are significant.
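
    The abstract does not state the correlation itself; one commonly used form of such a correlation (Walker's relation), shown here purely as an illustrative assumption rather than the authors' expression, writes the growth rate in terms of an effective stress intensity range:

      \[ \frac{da}{dN} = C\,(\Delta K_{\mathrm{eff}})^{m}, \qquad
         \Delta K_{\mathrm{eff}} = K_{\max}\,(1-R)^{\gamma} = \frac{\Delta K}{(1-R)^{1-\gamma}}, \]

    where R is the stress ratio, ΔK = (1-R) K_max is the applied stress intensity range, and C, m, and γ are fitted material constants.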

  12. The TL,NO/TL,CO ratio in pulmonary function test interpretation.

    Science.gov (United States)

    Hughes, J Michael B; van der Lee, Ivo

    2013-02-01

    The transfer factor of the lung for nitric oxide (T(L,NO)) is a new test for pulmonary gas exchange. The procedure is similar to the already well-established transfer factor of the lung for carbon monoxide (T(L,CO)). Physiologically, T(L,NO) predominantly measures the diffusion pathway from the alveoli to capillary plasma. In the Roughton-Forster equation, T(L,NO) acts as a surrogate for the membrane diffusing capacity (D(M)). The red blood cell resistance to carbon monoxide uptake accounts for ~50% of the total resistance from gas to blood, but it is much less for nitric oxide. T(L,NO) and T(L,CO) can be measured simultaneously with the single breath technique, and D(M) and pulmonary capillary blood volume (V(c)) can be estimated. T(L,NO), unlike T(L,CO), is independent of oxygen tension and haematocrit. The T(L,NO)/T(L,CO) ratio is weighted towards the D(M)/V(c) ratio and towards α, where α is the ratio of physical diffusivities of NO to CO (α=1.97). The T(L,NO)/T(L,CO) ratio is increased in heavy smokers, with and without computed tomography evidence of emphysema, and reduced in the voluntary restriction of lung expansion; it is expected to be reduced in chronic heart failure. The T(L,NO)/T(L,CO) ratio is a new index of gas exchange that may give additional insights into pulmonary pathology beyond those offered by the derived quantities D(M) and V(c), with their built-in assumptions.
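
    For readers unfamiliar with the notation, the Roughton-Forster partition referred to above is conventionally written as follows (a standard textbook form, reproduced here for convenience rather than quoted from the paper):

      \[ \frac{1}{T_{L,CO}} = \frac{1}{D_{M,CO}} + \frac{1}{\theta_{CO}\,V_c}, \qquad
         T_{L,NO} \approx D_{M,NO} = \alpha\, D_{M,CO}, \]

    where θ_CO is the specific conductance of blood for CO and α ≈ 1.97 is the NO/CO ratio of physical diffusivities; because the blood resistance term for NO is very small, T(L,NO) is dominated by the membrane term, which is why the T(L,NO)/T(L,CO) ratio tracks D(M)/V(c).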

  13. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    Science.gov (United States)

    2018-02-01


  14. Methodology for Mechanical Property Testing on Fuel Cladding Using an Expanded Plug Wedge Test

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jy-An John [ORNL; Jiang, Hao [ORNL

    2013-08-01

    To determine the tensile properties of irradiated fuel cladding in a hot cell, a simple test was developed at ORNL and is described fully in US Patent Application 20060070455, Expanded plug method for developing circumferential mechanical properties of tubular materials. This method is designed for testing fuel rod cladding ductility in a hot cell utilizing an expandable plug to stretch a small ring of irradiated cladding material. The specimen strain is determined using the measured diametrical expansion of the ring. This method removes many complexities associated with specimen preparation and testing. The advantages are the simplicity of the test component assembly and measurement in the hot cell and the direct measurement of specimen strain. It was also found that cladding strength could be determined from the test results. The basic approach of this test method is to apply an axial compressive load to a cylindrical plug of polyurethane (or other materials) fitted inside a short ring of the test material to achieve radial expansion of the specimen. The diameter increase of the specimen is used to calculate the circumferential strain accrued during the test. The other two basic measurements are the total applied load and the amount of plug compression (extension). A simple procedure is used to convert the load versus circumferential strain data from the ring tests into material pseudo-stress-strain curves. However, several deficiencies exist in this expanded-plug loading ring test, which impact the accuracy of test results and can introduce shear failure of the specimen due to the large axial compressive stress inherited from the expanded-plug loading. First, a highly non-uniform stress and strain distribution results in the gage section of the cladding. To ensure reliable testing and test repeatability, the potential for highly non-uniform stress distribution or displacement/strain deformation has to be eliminated at the gage section of the specimen. Second, significant
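
    As a rough illustration of the data reduction described above (not the patented ORNL procedure itself), the hoop strain of the ring follows from the measured diameter increase, and the load history can then be re-plotted against it; the geometry and load values below are placeholders.

      def ring_hoop_strain(d0_mm, d_mm):
          """Circumferential (hoop) strain of the ring from its diameter change."""
          return (d_mm - d0_mm) / d0_mm

      def pseudo_stress_strain(loads_N, diameters_mm, d0_mm, wall_mm, height_mm):
          """Pair each applied load with the hoop strain accrued at that point.

          The 'pseudo-stress' here simply normalizes the load by the two wall
          cross-sections carrying the expansion; the actual conversion procedure
          used at ORNL is more involved and is not reproduced here.
          """
          area = 2.0 * wall_mm * height_mm  # mm^2, both ligaments of the ring
          return [(load / area, ring_hoop_strain(d0_mm, d))  # (MPa, strain)
                  for load, d in zip(loads_N, diameters_mm)]

      # Placeholder geometry and data, for illustration only
      print(pseudo_stress_strain(loads_N=[0, 500, 1000, 1500],
                                 diameters_mm=[9.50, 9.52, 9.58, 9.70],
                                 d0_mm=9.50, wall_mm=0.6, height_mm=3.0))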

  15. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test so as to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.

  16. The Linear Logistic Test Model (LLTM) as the methodological foundation of item generating rules for a new verbal reasoning test

    Directory of Open Access Journals (Sweden)

    HERBERT POINSTINGL

    2009-06-01

    Based on the demand for new verbal reasoning tests to enrich the psychological test inventory, a pilot version of a new test was analysed: the 'Family Relation Reasoning Test' (FRRT; Poinstingl, Kubinger, Skoda & Schechtner, forthcoming), in which several basic cognitive operations (logical rules) have been embedded/implemented. Given family relationships of varying complexity embedded in short stories, testees had to logically conclude the correct relationship between two individuals within a family. Using empirical data, the linear logistic test model (LLTM; Fischer, 1972), a special case of the Rasch model, was used to test the construct validity of the test: the hypothetically assumed basic cognitive operations had to explain the Rasch model's item difficulty parameters. After being shaped into the LLTM's matrix of weights (q_ij), none of these operations were corroborated by means of Andersen's Likelihood Ratio Test.
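
    For context, the LLTM referenced above constrains the Rasch item difficulties to a weighted sum of basic cognitive operations; in its standard formulation (reproduced here for convenience):

      \[ P(X_{vi}=1 \mid \theta_v, \beta_i) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)}, \qquad
         \beta_i = \sum_{j} q_{ij}\, \eta_j + c, \]

    where θ_v is the ability of testee v, β_i the difficulty of item i, q_ij the hypothesized weight of cognitive operation j in item i, η_j the basic parameter of that operation, and c a normalization constant.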

  17. The Likelihood Ratio Test of Common Factors under Non-Ideal Conditions

    Directory of Open Access Journals (Sweden)

    Ana M. Angulo

    2011-01-01

    The spatial Durbin model occupies an interesting position in spatial econometrics. It is the reduced form of a cross-sectional model with dependence in the errors, and it can be used as the nesting equation in a more general model-selection approach. In particular, the Likelihood Ratio known as the Common Factors test (LRCOM) can be obtained from this equation. As shown in Mur and Angulo (2006), this test has good properties if the model is correctly specified. However, to our knowledge, there are no references in the literature on the behaviour of this test under non-ideal conditions. Specifically, we study the behaviour of the test in the presence of heteroscedasticity, non-normality, endogeneity, dense contact matrices and non-linearity. Our results offer a positive view of the Common Factors test, which appears to be a useful technique in the toolbox of contemporary spatial econometrics.
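
    In the usual notation (provided here as background, not quoted from the paper), the common factor test compares the spatial Durbin model with the restriction under which it collapses to a spatial error model:

      \[ y = \rho W y + X\beta + W X \gamma + \varepsilon, \qquad
         H_0^{COM}:\ \gamma = -\rho\beta, \]

    where W is the spatial weights (contact) matrix; under the restriction the model reduces to y = Xβ + u with u = ρWu + ε, and LRCOM is the likelihood ratio between the restricted and unrestricted specifications.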

  18. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test or its single block version may find applications in many areas as in psychology, education, medicine, genetics and they are important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  19. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
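
    A minimal sketch of the kind of simulation described (with illustrative parameters, not those used in SEQTEST): Poisson counts are drawn interval by interval and fed into an SPRT for background-only versus background-plus-source count rates, and the decision and number of intervals used are recorded over many trials.

      import math, random

      def sprt_poisson_trial(true_rate, b=5.0, s=5.0, alpha=0.05, beta=0.05,
                             max_intervals=100):
          """Run one SPRT trial on simulated Poisson counts.

          H0: mean counts per interval = b (background only)
          H1: mean counts per interval = b + s (source present)
          Returns (detected?, number of counting intervals used).
          """
          upper = math.log((1 - beta) / alpha)
          lower = math.log(beta / (1 - alpha))
          llr = 0.0
          for n in range(1, max_intervals + 1):
              # Draw a Poisson count with the Knuth method (fine for small means)
              limit, k, p = math.exp(-true_rate), 0, 1.0
              while True:
                  p *= random.random()
                  if p <= limit:
                      break
                  k += 1
              # Poisson log-likelihood ratio increment for this interval
              llr += k * math.log((b + s) / b) - s
              if llr >= upper:
                  return True, n
              if llr <= lower:
                  return False, n
          return False, max_intervals

      # Estimate detection probability and mean trial length with a source present
      trials = [sprt_poisson_trial(true_rate=10.0) for _ in range(2000)]
      p_detect = sum(d for d, _ in trials) / len(trials)
      mean_len = sum(n for _, n in trials) / len(trials)
      print(f"detection probability ~ {p_detect:.3f}, mean intervals ~ {mean_len:.1f}")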

  20. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  1. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu; Harrou, Fouzi; Sun, Ying

    2017-01-01

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, an MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.
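
    As background on the GLR step (a generic mean-shift form, not necessarily the authors' exact statistic): for approximately independent Gaussian residuals with known variance, the GLR test of "no anomaly" against "an unknown constant mean shift" reduces to thresholding the squared, normalized sample mean of a residual window.

      def glr_mean_shift(residuals, sigma, threshold=6.63):
          """GLR statistic for a mean shift in Gaussian residuals with known sigma.

          H0: zero-mean residuals; H1: unknown constant mean shift.
          2*log(GLR) = n * mean^2 / sigma^2, compared against a chi-square(1)
          quantile (6.63 is roughly the 99% point).
          """
          n = len(residuals)
          mean = sum(residuals) / n
          stat = n * mean * mean / (sigma * sigma)
          return stat, stat > threshold

      # Example on a hypothetical residual window from an MSPLS model
      window = [0.1, -0.2, 0.4, 0.9, 1.1, 0.8, 1.2, 0.7]
      stat, is_anomaly = glr_mean_shift(window, sigma=0.5)
      print(f"GLR statistic = {stat:.2f}, anomaly flagged: {is_anomaly}")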

  2. The Leeb Hardness Test for Rock: An Updated Methodology and UCS Correlation

    Science.gov (United States)

    Corkum, A. G.; Asiri, Y.; El Naggar, H.; Kinakin, D.

    2018-03-01

    The Leeb hardness test (LHT, with test value L_D) is a rebound hardness test, originally developed for metals, that has been correlated with the Unconfined Compressive Strength (test value σ_c) of rock by several authors. The tests can be carried out rapidly, conveniently and nondestructively on core and block samples or on rock outcrops. This makes the relatively small LHT device convenient for field tests. The present study compiles test data from literature sources and presents new laboratory testing carried out by the authors to develop a substantially expanded database with wide-ranging rock types. In addition, the number of impacts that should be averaged to comprise a "test result" was revisited along with the issue of test specimen size. Correlation for L_D and σ_c for various rock types is provided along with recommended testing methodology. The accuracy of correlated σ_c estimates was assessed and reasonable correlations were observed between L_D and σ_c. The study findings show that LHT can be useful particularly for field estimation of σ_c and offers a significant improvement over the conventional field estimation methods outlined by the ISRM (e.g., hammer blows). This test is rapid and simple, with relatively low equipment costs, and provides a reasonably accurate estimate of σ_c.

  3. Nondestructive Semistatic Testing Methodology for Assessing Fish Textural Characteristics via Closed-Form Mathematical Expressions

    Directory of Open Access Journals (Sweden)

    D. Dimogianopoulos

    2017-01-01

    This paper presents a novel methodology based on semistatic nondestructive testing of fish for the analytical computation of its textural characteristics via closed-form mathematical expressions. The novelty is that, unlike alternatives, explicit values for both stiffness and viscoelastic textural attributes may be computed, even if fish of different size/weight are tested. Furthermore, the testing procedure may be adapted to the specifications (sampling rate and accuracy) of the available equipment. The experimental testing involves a fish placed on the pan of a digital weigh scale, which is subsequently tested with a ramp-like load profile in a custom-made installation. The ramp slope is (to some extent) adjustable according to the specifications (sampling rate and accuracy) of the equipment. The scale's reaction to fish loading, namely, the reactive force, is collected throughout time and is shown to depend on the fish textural attributes according to a closed-form mathematical formula. The latter is subsequently used along with the collected data in order to compute these attributes rapidly and effectively. Four whole raw sea bass (Dicentrarchus labrax) of various sizes and textures were tested. Changes in texture, related to different viscoelastic characteristics among the four fish, were correctly detected and quantified using the proposed methodology.

  4. Buffer Construction Methodology in Demonstration Test For Cavern Type Disposal Facility

    International Nuclear Information System (INIS)

    Yoshihiro, Akiyama; Takahiro, Nakajima; Katsuhide, Matsumura; Kenji, Terada; Takao, Tsuboya; Kazuhiro, Onuma; Tadafumi, Fujiwara

    2009-01-01

    A number of studies concerning a cavern type disposal facility have been carried out for the disposal of low level radioactive waste mainly generated by power plant decommissioning in Japan. The disposal facility is composed of an engineered barrier system with a concrete pit and bentonite buffer, and is planned to be constructed at a sub-surface depth of 50 - 100 meters. Though the previous studies have mainly used laboratory and mock-up tests, we conducted a demonstration test in a full-size cavern. The main objectives of the test were to study the construction methodology and to confirm the quality of the engineered barrier system. The demonstration test was planned as the construction of a full scale mock-up. This paper focuses on a buffer construction test carried out to evaluate the construction methodology and quality control. Bentonite material was compacted to 1.6 Mg/m3 in situ by a large vibrating roller in this test. Through the construction of the buffer part, a density of 1.6 Mg/m3 was achieved, and data on workability and quality were collected. (authors)

  5. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Korsah, K.; Turner, G.W.; Mullens, J.A. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  6. Environmental testing of a prototypic digital safety channel, phase I: System design and test methodology

    International Nuclear Information System (INIS)

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-01-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the U.S. Nuclear Regulatory Commission. The goal of this program is to establish the technical basis for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR) and AP600. It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  7. Aircraft control surface failure detection and isolation using the OSGLR test. [orthogonal series generalized likelihood ratio]

    Science.gov (United States)

    Bonnice, W. F.; Motyka, P.; Wagner, E.; Hall, S. R.

    1986-01-01

    The performance of the orthogonal series generalized likelihood ratio (OSGLR) test in detecting and isolating commercial aircraft control surface and actuator failures is evaluated. A modification to incorporate age-weighting which significantly reduces the sensitivity of the algorithm to modeling errors is presented. The steady-state implementation of the algorithm based on a single linear model valid for a cruise flight condition is tested using a nonlinear aircraft simulation. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection and isolation performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling on dynamic pressure and flap deflection is examined. Based on this testing, the OSGLR algorithm should be capable of detecting control surface failures that would affect the safe operation of a commercial aircraft. Isolation may be difficult if there are several surfaces which produce similar effects on the aircraft. Extending the algorithm over the entire operating envelope of a commercial aircraft appears feasible.

  8. International normalized ratio self-testing and self-management: improving patient outcomes

    Directory of Open Access Journals (Sweden)

    Pozzi M

    2016-10-01

    Long term oral anticoagulation with vitamin K antagonists is a risk factor for hemorrhagic or thromboembolic complications. Periodic laboratory testing of the international normalized ratio (INR) and a subsequent dose adjustment are therefore mandatory. The use of home testing devices to measure INR has been suggested as a potential way to improve the comfort and compliance of the patients and their families, the frequency of monitoring and, finally, the management and safety of long-term oral anticoagulation. In pediatric patients, increased doses to obtain and maintain the therapeutic target INR, more frequent adjustments and INR testing, multiple medication, inconstant nutritional intake, difficult venepunctures, and the need to go to the laboratory for testing (interruption of school and parents' work attendance) highlight those difficulties. After reviewing the most relevant published studies of self-testing and self-management of INR for adult patients and children on oral anticoagulation, it seems that these are valuable and effective strategies of INR control. Despite an unclear relationship between INR control and clinical effects, these self-strategies provide a better control of the anticoagulant effect, improve patients' and their families' quality of life, and are an appealing solution in terms of cost-effectiveness. Structured education and knowledge evaluation by trained health care professionals is required for children, to be able to adjust their dose treatment safely and accurately. However

  9. Comparison of Urine Albumin-to-Creatinine Ratio (ACR) Between ACR Strip Test and Quantitative Test in Prediabetes and Diabetes

    Science.gov (United States)

    Cho, Seon; Kim, Suyoung; Cho, Han-Ik

    2017-01-01

    Background: Albuminuria is generally known as a sensitive marker of renal and cardiovascular dysfunction. It can be used to help predict the occurrence of nephropathy and cardiovascular disorders in diabetes. Individuals with prediabetes have a tendency to develop macrovascular and microvascular pathology, resulting in an increased risk of retinopathy, cardiovascular diseases, and chronic renal diseases. We evaluated the clinical value of a strip test for measuring the urinary albumin-to-creatinine ratio (ACR) in prediabetes and diabetes. Methods: Spot urine samples were obtained from 226 prediabetic and 275 diabetic subjects during regular health checkups. Urinary ACR was measured by using strip and laboratory quantitative tests. Results: The positive rates of albuminuria measured by using the ACR strip test were 15.5% (microalbuminuria, 14.6%; macroalbuminuria, 0.9%) and 30.5% (microalbuminuria, 25.1%; macroalbuminuria, 5.5%) in prediabetes and diabetes, respectively. In the prediabetic population, the sensitivity, specificity, positive predictive value, negative predictive value, and overall accuracy of the ACR strip method were 92.0%, 94.0%, 65.7%, 99.0%, and 93.8%, respectively; the corresponding values in the diabetic population were 80.0%, 91.6%, 81.0%, 91.1%, and 88.0%, respectively. The median [interquartile range] ACR values in the strip tests for measurement ranges of <30, 30-300, and >300 mg/g were 9.4 [6.3-15.4], 46.9 [26.5-87.7], and 368.8 [296.2-575.2] mg/g, respectively, using the laboratory method. Conclusions: The ACR strip test showed high sensitivity, specificity, and negative predictive value, suggesting that the test can be used to screen for albuminuria in cases of prediabetes and diabetes. PMID:27834062
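
    For reference, the performance figures quoted above follow directly from a 2x2 comparison of the strip test against the laboratory method; a minimal helper (illustrative only, not the authors' code) is:

      def diagnostic_metrics(tp, fp, tn, fn):
          """Standard screening-test metrics from a 2x2 confusion matrix."""
          return {
              "sensitivity": tp / (tp + fn),              # true positive rate
              "specificity": tn / (tn + fp),              # true negative rate
              "ppv": tp / (tp + fp),                      # positive predictive value
              "npv": tn / (tn + fn),                      # negative predictive value
              "accuracy": (tp + tn) / (tp + fp + tn + fn),
          }

      # Hypothetical counts, for illustration only (not the study's data)
      print(diagnostic_metrics(tp=30, fp=10, tn=150, fn=5))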

  10. Testing and Performance Verification of a High Bypass Ratio Turbofan Rotor in an Internal Flow Component Test Facility

    Science.gov (United States)

    VanZante, Dale E.; Podboy, Gary G.; Miller, Christopher J.; Thorp, Scott A.

    2009-01-01

    A 1/5 scale model rotor representative of a current technology, high bypass ratio, turbofan engine was installed and tested in the W8 single-stage, high-speed, compressor test facility at NASA Glenn Research Center (GRC). The same fan rotor was tested previously in the GRC 9x15 Low Speed Wind Tunnel as a fan module consisting of the rotor and outlet guide vanes mounted in a flight-like nacelle. The W8 test verified that the aerodynamic performance and detailed flow field of the rotor as installed in W8 were representative of the wind tunnel fan module installation. Modifications to W8 were necessary to ensure that this internal flow facility would have a flow field at the test package that is representative of flow conditions in the wind tunnel installation. Inlet flow conditioning was designed and installed in W8 to lower the fan face turbulence intensity to less than 1.0 percent in order to better match the wind tunnel operating environment. Also, inlet bleed was added to thin the casing boundary layer to be more representative of a flight nacelle boundary layer. On the 100 percent speed operating line the fan pressure rise and mass flow rate agreed with the wind tunnel data to within 1 percent. Detailed hot film surveys of the inlet flow, inlet boundary layer and fan exit flow were compared to results from the wind tunnel. The effect of inlet casing boundary layer thickness on fan performance was quantified. Challenges and lessons learned from testing this high flow, low static pressure rise fan in an internal flow facility are discussed.

  11. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    Science.gov (United States)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. Detection algorithms employed play a crucial role in fulfilling the detection component in the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follows a Gaussian distribution. This paper explores the potential for detection performance advantages when operating in the Fourier domain of long exposure images of small and/or dim space objects from ground based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image only contains background noise. The detection algorithm tests each pixel point of the Fourier transformed images to make the determination if an object is present based on the criteria threshold found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.
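
    A highly simplified sketch of the idea (white Gaussian noise, known object template, illustrative threshold; not the authors' algorithm): the image is correlated with the template via FFTs and each output point is compared against a detection threshold, which is what the likelihood ratio test reduces to under these assumptions.

      import numpy as np

      def fourier_domain_detect(image, template, noise_sigma, threshold):
          """Matched-filter detection statistic computed via the Fourier domain.

          For white Gaussian noise and a known template, the likelihood ratio
          test reduces to thresholding the normalized correlation of the image
          with the template, evaluated here with FFTs.
          """
          f_img = np.fft.fft2(image)
          f_tpl = np.fft.fft2(template, s=image.shape)
          corr = np.real(np.fft.ifft2(f_img * np.conj(f_tpl)))
          stat = corr / (noise_sigma * np.sqrt(np.sum(template ** 2)))
          return stat, np.argwhere(stat > threshold)

      # Toy example: a moderately bright Gaussian blob in unit-variance noise
      rng = np.random.default_rng(0)
      profile = (np.arange(15) - 7) ** 2 / 8.0
      tpl = np.exp(-profile[:, None] - profile[None, :])
      img = rng.normal(0.0, 1.0, (128, 128))
      img[60:75, 60:75] += 2.0 * tpl
      stat, hits = fourier_domain_detect(img, tpl, noise_sigma=1.0, threshold=5.0)
      print(len(hits), "pixel(s) above threshold")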

  12. Methodology for testing a system for remote monitoring and control on auxiliary machines in electric vehicles

    Directory of Open Access Journals (Sweden)

    Dimitrov Vasil

    2017-01-01

    A laboratory system for remote monitoring and control of an asynchronous motor controlled by a soft starter and contemporary measuring and control devices has been developed and built. This laboratory system is used for research and in teaching. A study of the principles of operation, setup and examination of intelligent energy meters, soft starters and PLCs has been made, as knowledge of the relevant software products is necessary. This is of great importance because systems for remote monitoring and control of energy consumption, efficiency and proper operation of the controlled objects are very often used in different spheres of industry, in building automation, transport, electricity distribution networks, etc. Their implementation in electric vehicles for remote monitoring and control of auxiliary machines is also possible and very useful. In this paper, a methodology of tests is developed and some experiments are presented. Thus, an experimental verification of the developed methodology is made.

  13. Laboratory test on maximum and minimum void ratio of tropical sand matrix soils

    Science.gov (United States)

    Othman, B. A.; Marto, A.

    2018-04-01

    Sand is generally known as a loose granular material which has a grain size finer than gravel and coarser than silt and can be very angular to well-rounded in shape. The presence of various amounts of fines, which also influence the loosest and densest states of sand in its natural condition, is well known to contribute to the deformation and loss of shear strength of soil. This paper presents the effect of a range of fines contents on the minimum void ratio e_min and maximum void ratio e_max of sand matrix soils. Laboratory tests to determine e_min and e_max of sand matrix soils were conducted using a non-standard method introduced by previous researchers. Clean sand was obtained from a natural mining site at Johor, Malaysia. A set of 3 different sizes of sand (fine sand, medium sand, and coarse sand) were mixed with 0% to 40% by weight of low plasticity fines (kaolin). Results showed that, generally, e_min and e_max decreased with increasing fines content up to a minimum value in the range of 0% to 30%, and then increased thereafter.

  14. A hypothesis-testing framework for studies investigating ontogenetic niche shifts using stable isotope ratios.

    Directory of Open Access Journals (Sweden)

    Caroline M Hammerschlag-Peyer

    Ontogenetic niche shifts occur across diverse taxonomic groups, and can have critical implications for population dynamics, community structure, and ecosystem function. In this study, we provide a hypothesis-testing framework combining univariate and multivariate analyses to examine ontogenetic niche shifts using stable isotope ratios. This framework is based on three distinct ontogenetic niche shift scenarios, i.e., (1) no niche shift, (2) niche expansion/reduction, and (3) discrete niche shift between size classes. We developed criteria for identifying each scenario, based on three important resource use characteristics, i.e., niche width, niche position, and niche overlap. We provide an empirical example for each ontogenetic niche shift scenario, illustrating differences in resource use characteristics among different organisms. The present framework provides a foundation for future studies on ontogenetic niche shifts, and can also be applied to examine resource variability among other population sub-groupings (e.g., by sex or phenotype).

  15. Glass-surface area to solution-volume ratio and its implications to accelerated leach testing

    International Nuclear Information System (INIS)

    Pederson, L.R.; Buckwalter, C.Q.; McVay, G.L.; Riddle, B.L.

    1982-10-01

    The value of glass surface area to solution volume ratio (SA/V) can strongly influence the leaching rate of PNL 76-68 glass. The leaching rate is largely governed by silicon solubility constraints. Silicic acid in solution reduced the elemental release of all glass components. No components are leached to depths greater than that of silicon. The presence of the reaction layer had no measurable effect on the rate of leaching. Accelerated leach testing is possible since PNL 76-68 glass leaching is solubility-controlled (except at very low SA/V values). A series of glasses leached with SA/V x time = constant will yield identical elemental release

  16. AXIAL RATIO OF EDGE-ON SPIRAL GALAXIES AS A TEST FOR BRIGHT RADIO HALOS

    International Nuclear Information System (INIS)

    Singal, J.; Jones, E.; Dunlap, H.; Kogut, A.

    2015-01-01

    We use surface brightness contour maps of nearby edge-on spiral galaxies to determine whether extended bright radio halos are common. In particular, we test a recent model of the spatial structure of the diffuse radio continuum by Subrahmanyan and Cowsik which posits that a substantial fraction of the observed high-latitude surface brightness originates from an extended Galactic halo of uniform emissivity. Measurements of the axial ratio of emission contours within a sample of normal spiral galaxies at 1500 MHz and below show no evidence for such a bright, extended radio halo. Either the Galaxy is atypical compared to nearby quiescent spirals or the bulk of the observed high-latitude emission does not originate from this type of extended halo. (letters)

  17. The effect of instructional methodology on high school students natural sciences standardized tests scores

    Science.gov (United States)

    Powell, P. E.

    Educators have recently come to consider inquiry based instruction a more effective method of instruction than didactic instruction. Experience based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi experimental quantitative study was comprised of two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when taught using inquiry based instruction. Subpopulation analyses indicated all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.

  18. Establishing a Ballistic Test Methodology for Documenting the Containment Capability of Small Gas Turbine Engine Compressors

    Science.gov (United States)

    Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.

    2009-01-01

    A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction and speed that would occur in a blade-out event. High-speed, digital-video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full-field displacement and strain information at the back side of the impact point.

  19. Measurements of integrated components' parameters versus irradiation doses gamma radiation (60Co) dosimetry-methodology-tests

    International Nuclear Information System (INIS)

    Fuan, J.

    1991-01-01

    This paper describes the methodology used for the irradiation of integrated components and the measurement of their parameters, using quality assurance of dosimetry: - Measurement of the integrated dose using the competences of the Laboratoire Central des Industries Electriques (LCIE): - Measurement of the irradiation dose versus source/component distance, using calibrated equipment. - Use of ALANINE dosimeters, placed on the support of the irradiated components. - Assembly and polarization of components during the irradiations. Selection of the irradiator. - Measurement of the irradiated components' parameters, using the competences of the following companies: - GenRad: GR130 test equipment located at DEIN/SIR-CEN SACLAY. - Laboratoire Central des Industries Electriques (LCIE): GR125 test equipment and its associated test programmes.

  20. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1993-01-01

    The use of multidisciplinary teams to develop Type B shipping containers improves the quality and reliability of these reusable packagings. Including the people involved in all aspects of the design, certification and use of the package leads to more innovative, user-friendly containers. Concurrent use of testing and analysis allows engineers to more fully characterize a shipping container's responses to the environments given in the regulations, and provides a strong basis for certification. The combination of the input and output of these efforts should provide a general methodology that designers of Type B radioactive material shipping containers can utilize to optimize and certify their designs. (J.P.N.)

  1. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    Science.gov (United States)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensure their optimum power production. Those properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and an international consensus on the appropriate techniques are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  2. Monitoring HIV Testing in the United States: Consequences of Methodology Changes to National Surveys.

    Directory of Open Access Journals (Sweden)

    Michelle M Van Handel

    In 2011, the National Health Interview Survey (NHIS), an in-person household interview, revised the human immunodeficiency virus (HIV) section of the survey and the Behavioral Risk Factor Surveillance System (BRFSS), a telephone-based survey, added cellphone numbers to its sampling frame. We sought to determine how these changes might affect assessment of HIV testing trends. We used linear regression with pairwise contrasts on 2003-2013 data from NHIS and BRFSS to compare the percentages of persons aged 18-64 years who reported HIV testing in landline versus cellphone-only households before and after 2011, when NHIS revised its in-person questionnaire and BRFSS added cellphone numbers to its telephone-based sample. In NHIS, the percentage of persons in cellphone-only households increased 13-fold from 2003 to 2013. The percentage ever tested for HIV was 6%-10% higher among persons in cellphone-only than landline households. The percentage ever tested for HIV increased significantly from 40.2% in 2003 to 45.0% in 2010, but was significantly lower in 2011 (40.6%) and 2012 (39.7%). In BRFSS, the percentage ever tested decreased significantly from 45.9% in 2003 to 40.2% in 2010, but increased to 42.9% in 2011 and 43.5% in 2013. HIV testing estimates were lower after the NHIS questionnaire changes but higher after the BRFSS methodology changes. Data before and after 2011 are not comparable, complicating the assessment of trends.

  3. Comparison of two bond strength testing methodologies for bilayered all-ceramics.

    Science.gov (United States)

    Dündar, Mine; Ozcan, Mutlu; Gökçe, Bülent; Cömlekoğlu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    2007-05-01

    This study compared the shear bond strength (SBS) and microtensile bond strength (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Four different ceramic veneer/core combinations, three of which were feldspathic veneers and one a fluor-apatite veneer bonded to their respective corresponding cores, namely leucite-reinforced ceramic (IPS Empress, Ivoclar), low leucite-reinforced ceramic (Finesse, Ceramco), glass-infiltrated alumina (In-Ceram Alumina, Vita) and lithium disilicate (IPS Empress 2, Ivoclar), were used for the SBS and MTBS tests. Ceramic cores (N=40, n=10/group for the SBS test method; N=5 blocks/group for the MTBS test method) were fabricated according to the manufacturers' instructions (for SBS: thickness, 3 mm; diameter, 5 mm; for MTBS: 10 mm x 10 mm x 2 mm) and ultrasonically cleaned. The veneering ceramics (thickness: 2 mm) were vibrated and condensed in stainless steel moulds and fired onto the core ceramic materials. After trying the specimens in the mould for minor adjustments, they were again ultrasonically cleaned and embedded in PMMA. The specimens were stored in distilled water at 37 degrees C for 1 week and bond strength tests were performed in universal testing machines (cross-head speed: 1 mm/min). The bond strengths (MPa +/- S.D.) and modes of failure were recorded. Significant differences between the two test methods and among the all-ceramic types were observed (P<0.05) (2-way ANOVA, Tukey's test and Bonferroni). The mean SBS value for veneering ceramic bonded to lithium disilicate was significantly higher (41+/-8 MPa) than those to low leucite (28+/-4 MPa), glass-infiltrated (26+/-4 MPa) and leucite-reinforced (23+/-3 MPa) ceramics, while the mean MTBS for the low leucite ceramic was significantly higher (15+/-2 MPa) than those of the leucite (12+/-2 MPa), glass-infiltrated (9+/-1 MPa) and lithium disilicate (9+/-1 MPa) ceramics (ANOVA, P<0.05). Both the testing methodology and the differences in chemical compositions of the core and veneering ceramics

  4. Testing effective quantum gravity with gravitational waves from extreme mass ratio inspirals

    International Nuclear Information System (INIS)

    Yunes, N; Sopuerta, C F

    2010-01-01

    Testing deviations from GR is one of the main goals of the proposed Laser Interferometer Space Antenna. For the first time, we consistently compute the generation of gravitational waves from extreme-mass ratio inspirals (stellar compact objects into supermassive black holes) in a well-motivated alternative theory of gravity that to date remains weakly constrained by double binary pulsar observations. The theory we concentrate on is Chern-Simons (CS) modified gravity, a 4-D, effective theory that is motivated both by string theory and loop quantum gravity, and which enhances the Einstein-Hilbert action through the addition of a dynamical scalar field and the parity-violating Pontryagin density. We show that although point particles continue to follow geodesics in the modified theory, the background about which they inspiral is a modification of the Kerr metric, which imprints a CS correction on the gravitational waves emitted. CS modified gravitational waves are sufficiently different from the general relativistic expectation that they lead to significant dephasing after 3 weeks of evolution; such dephasing will probably not prevent detection of these signals, but will instead lead to a systematic error in the determination of parameters. We end with a study of radiation reaction in the modified theory and show that, to leading order, energy-momentum emission is not CS modified, except possibly for the subdominant effect of scalar-field emission. The inclusion of radiation reaction will allow for tests of CS modified gravity with space-borne detectors that might be two orders of magnitude larger than current binary pulsar bounds.

  5. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. Then we applied a decision-making model, the sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781

  6. Failure modes induced by natural radiation environments on DRAM memories: study, test methodology and mitigation technique

    International Nuclear Information System (INIS)

    Bougerol, A.

    2011-05-01

    DRAMs are frequently used in space and aeronautic systems. Their sensitivity to cosmic radiation has to be known in order to satisfy reliability requirements for critical applications. These evaluations are traditionally done with particle accelerators. However, devices become more complex with technology integration, and therefore new effects appear, requiring longer and more expensive tests. There is a complementary solution: the pulsed laser, which triggers effects similar to those of particles. Thanks to these two test tools, the main DRAM radiation failure modes were studied: SEUs (Single Event Upset) in memory blocks, and SEFIs (Single Event Functional Interrupt) in peripheral circuits. This work demonstrates the influence of test patterns on SEU and SEFI sensitivities depending on the technology used. In addition, this study identifies the origin of the most frequent type of SEFIs. Moreover, laser techniques were developed to quantify the sensitive surfaces of the different effects. This work led to a new test methodology for industry, in order to optimize test cost and efficiency using both pulsed laser beams and particle accelerators. Finally, a new fault tolerance technique is proposed: based on the radiation immunity of DRAM cells when discharged, this technique allows all bits of a logic word to be corrected. (author)

  7. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    The development of new, and modernization of existing, aviation equipment specimens of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the development challenges of automated control systems for aviation equipment test operations are presented. Such automated control systems are in essence an automated data bank. The key role of the development of a flight test automated control system in the process of creating automated control systems for aviation equipment test operations is substantiated. The approach of integrating mobile modular measuring complexes and the need for national methodologies and technological standards for database system design concepts are grounded. The database system, as the central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and required frequency of monitoring the controlled object's state. It is the database system that provides the supervisory unit with actual data corresponding to specific moments in time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the ability to develop the systems being created. The introduced conclusions and suggestions can be used in the

  8. Simplified Abrasion Test Methodology for Candidate EVA Glove Lay-Ups

    Science.gov (United States)

    Rabel, Emily; Aitchison, Lindsay

    2015-01-01

    During the Apollo Program, space suit outer-layer fabrics were badly abraded after performing just a few extravehicular activities (EVAs). For example, the Apollo 12 commander reported abrasive wear on the boots that penetrated the outer-layer fabric into the thermal protection layers after less than 8 hrs of surface operations. Current plans for exploration planetary space suits require the space suits to support hundreds of hours of EVA on a lunar or Martian surface, creating a challenge for space suit designers to utilize materials advances made over the last 40 years and improve on the space suit fabrics used in the Apollo Program. Over the past 25 years the NASA Johnson Space Center Crew and Thermal Systems Division has focused on tumble testing as a means of simulating wear on the outer layer of the space suit fabric. Most recently, in 2009, testing was performed on 4 different candidate outer layers to gather baseline data for future use in the design of planetary space suit outer layers. In support of the High Performance EVA Glove Element of the Next Generation Life Support Project, a new test configuration was recently attempted which requires 10% of the fabric per replicate compared to that needed in 2009. The smaller fabric samples allowed for reduced per-sample cost and the flexibility to test small samples from manufacturers without the overhead of a completed production run. Data collected from this iteration were compared to those taken in 2009 to validate the new test method. In addition, the method also evaluated the fabrics and fabric layups used in a prototype thermal micrometeoroid garment (TMG) developed for EVA gloves under the NASA High Performance EVA Glove Project. This paper provides a review of previous abrasion studies on space suit fabrics, details the methodologies used for abrasion testing in this particular study, presents the results of the validation study, and reports the results of the TMG testing.

  9. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul S.; Morgan, Keith S.; Caffrey, Michael P.

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-tolerant memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
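    The record above mentions triple-modular redundancy (TMR) and fault injection. As a purely illustrative aid, and not the paper's Xilinx tool flow, the following Python sketch shows the two ideas at the bit level: three redundant copies of a computation are voted bitwise, and a single injected upset in any one copy is masked. The stand-in "user design" and all names are hypothetical.

    # Conceptual sketch only: bitwise majority voting (TMR) and single-bit fault
    # injection in Python. The paper's methodology targets SRAM-based FPGA designs;
    # nothing here is taken from that tool flow.
    import random

    def majority_vote(a: int, b: int, c: int) -> int:
        """Bitwise 2-of-3 majority vote over three redundant copies."""
        return (a & b) | (a & c) | (b & c)

    def inject_fault(value: int, width: int = 32) -> int:
        """Flip one random bit to emulate a single-event upset (SEU)."""
        return value ^ (1 << random.randrange(width))

    def tmr_run(module, inputs, faulty_copy=None):
        """Run three copies of `module`; optionally corrupt one copy's output."""
        outputs = [module(x) for x in (inputs, inputs, inputs)]
        if faulty_copy is not None:
            outputs[faulty_copy] = inject_fault(outputs[faulty_copy])
        return majority_vote(*outputs)

    if __name__ == "__main__":
        module = lambda x: (x * 3 + 7) & 0xFFFFFFFF   # hypothetical stand-in user design
        golden = module(12345)
        # A single upset in any one copy is masked by the voter.
        assert all(tmr_run(module, 12345, faulty_copy=k) == golden for k in range(3))
        print("single upsets masked:", golden)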

  10. Test methodology and technology of fracture toughness for small size specimens

    Energy Technology Data Exchange (ETDEWEB)

    Wakai, E.; Takada, F.; Ishii, T.; Ando, M. [Japan Atomic Energy Agency, Naga-gun, Ibaraki-ken (Japan); Matsukawa, S. [JNE Techno-Research Co., Kanagawa-ken (Japan)

    2007-07-01

    Full text of publication follows: Small specimen test technology (SSTT) is required to investigate mechanical properties given the limited effective irradiation volumes available in test reactors and accelerator-based neutron and charged-particle sources. Test methodology guidelines and manufacturing processes for very small specimens have not been established and still need to be formulated. Technology to control the load and displacement precisely is also required when testing in the high-dose radiation environment produced by the specimens. The objective of this study is to examine the test technology and methodology of fracture toughness for very small specimens. A new bend test machine installed in a hot cell has been manufactured to obtain the fracture toughness and DBTT (ductile-brittle transition temperature) of reduced-activation ferritic/martensitic steels for small bend specimens of t/2-1/3PCCVN (pre-cracked 1/3 size Charpy V-notch) with 20 mm length and DFMB (deformation and fracture mini bend specimen) with 9 mm length. The new machine can operate at temperatures from -196 deg. C to 400 deg. C using the unloading compliance method. Neutron irradiation was also performed at about 250 deg. C to about 2 dpa in JMTR. After the irradiation, fracture toughness and DBTT were examined using the machine. The displacement measurements from the linear gauge on the crosshead and the DVRT on the specimen were carefully cross-checked. The conditions for fatigue pre-cracking during specimen preparation were also examined and were found to depend on the shape and size of the specimens. Fracture toughness and DBTT of F82H steel for t/2-1/3PCCVN, DFMB and 0.18DCT specimens before irradiation were examined as a function of temperature. The DBTT of the smaller DFMB specimens was lower than that of the larger t/2-1/3PCCVN and 0.18DCT specimens. The changes of fracture toughness and DBTT due to irradiation were also

  11. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    International Nuclear Information System (INIS)

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods, provided in Department of Energy guidelines, and failure criteria, contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins

  12. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken within the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  13. Inverse modeling of emissions for local photooxidant pollution: Testing a new methodology with kriging constraints

    Directory of Open Access Journals (Sweden)

    I. Pison

    2006-07-01

    Full Text Available A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air quality forecast in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inverse developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements.

  14. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology for surveying very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are required. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant field-of-view advantage compared with rectilinear lenses. This advantage alone can be crucial in reducing the total number of photos and, as a consequence, in obtaining manageable data, simplifying the survey phase and significantly reducing the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real, complex case study, the photogrammetric survey of the Minguzzi Staircase, was performed in order to test and stress the proposed methodology and to handle a fisheye-based survey from beginning to end. It is a complex service spiral staircase located in the Duomo di Milano, with a total height of 25 meters, characterized by a narrow walkable space about 70 centimetres wide.
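    The abstract refers to a general GSD formulation valid for every optical projection; that formulation is not reproduced here. The Python sketch below only illustrates the idea for one assumed model, an ideal equidistant fisheye (r = f·θ) imaging a flat surface at perpendicular distance D; the lens, sensor and distance values are hypothetical.

    # Hedged sketch: radial GSD for an ideal equidistant fisheye (r = f * theta)
    # imaging a flat target at perpendicular distance D. This is NOT the paper's
    # general formulation for arbitrary projections.
    import math

    def gsd_equidistant(pixel_pitch_mm: float, focal_mm: float,
                        distance_m: float, off_axis_deg: float) -> float:
        """Radial ground sampling distance (m/pixel) on a flat plane.

        For r = f*theta, one pixel subtends d(theta) = p/f; a plane at distance D
        maps theta to x = D*tan(theta), so dx = D/cos^2(theta) * d(theta).
        """
        theta = math.radians(off_axis_deg)
        return distance_m * (pixel_pitch_mm / focal_mm) / math.cos(theta) ** 2

    if __name__ == "__main__":
        # e.g. 0.004 mm pixels, 8 mm fisheye, wall 0.7 m away (narrow staircase)
        for angle in (0, 30, 60, 80):
            gsd_mm = gsd_equidistant(0.004, 8.0, 0.7, angle) * 1000
            print(f"{angle:2d} deg -> {gsd_mm:.2f} mm/px")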

  15. A Methodology for Evaluation of Inservice Test Intervals for Pumps and Motor-Operated Valves

    International Nuclear Information System (INIS)

    Cox, D.F.; Haynes, H.D.; McElhaney, K.L.; Otaduy, P.J.; Staunton, R.H.; Vesely, W.E.

    1999-01-01

    Recent nuclear industry reevaluation of component inservice testing (IST) requirements is resulting in requests for IST interval extensions and changes to traditional IST programs. To evaluate these requests, long-term component performance and the methods for mitigating degradation need to be understood. Determining the appropriate IST intervals, along with component testing, monitoring, trending, and maintenance effects, has become necessary. This study provides guidelines to support the evaluation of IST intervals for pumps and motor-operated valves (MOVs). It presents specific engineering information pertinent to the performance and monitoring/testing of pumps and MOVs, provides an analytical methodology for assessing the bounding effects of aging on component margin behavior, and identifies basic elements of an overall program to help ensure component operability. Guidance for assessing probabilistic methods and the risk importance and safety consequences of the performance of pumps and MOVs has not been specifically included within the scope of this report, but these elements may be included in licensee change requests

  16. Testing a SEA methodology for the energy sector: a waste incineration tax proposal

    International Nuclear Information System (INIS)

    Nilsson, Maans; Bjoerklund, Anna; Finnveden, Goeran; Johansson, Jessica

    2005-01-01

    Most Strategic Environmental Assessment (SEA) research has been preoccupied with SEA as a procedure and there are relatively few developments and tests of analytical methodologies. This paper applies and tests an analytical framework for an energy sector SEA. In a case study on a policy proposal for waste-to-energy taxation in Sweden, it studies changes in the energy system as a result of implementing the suggested tax by testing three analytical pathways: an LCA pathway, a site-dependent pathway, and a qualitative pathway. In addition, several valuation methods are applied. The assessment indicates that there are some overall environmental benefits to introducing a tax, but that benefits are modest compared to the potential. The methods are discussed in relation to characteristics for effective policy learning and knowledge uptake. The application shows that in many ways they complement each other rather than substitute for each other. The qualitative pathway is useful for raising awareness and getting a comprehensive view of environmental issues, but has limited potential for decision support. The precision increased as we went to LCA and to site-dependent analysis, and a hierarchy emerged in which the qualitative pathway filled rudimentary functions whereas the site-dependent analysis gave more advanced decision support. All methods had limited potential in supporting a choice between alternatives unless data was aggregated through a valuation exercise

  17. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetimes, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. The actual degradation encountered in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of the decision variable under constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example of reliability demonstration for an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs aimed at more accurate reliability estimation in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
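    As a rough illustration of the demonstration idea described above, comparing accumulated degradation against an allowable cumulative level, the following Python sketch simulates a stationary Gamma degradation process for several units and applies a simple pass/fail rule. It is not the paper's optimal ADT design; the parameters, threshold and decision rule are assumptions made purely for demonstration.

    # Illustrative sketch (not the paper's algorithm): simulate Gamma-process
    # degradation paths and pass the demonstration if accumulated degradation
    # stays below an allowable level. All values below are made up.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_gamma_degradation(shape_rate: float, scale: float,
                                   dt: float, n_steps: int) -> np.ndarray:
        """Cumulative degradation path; increments ~ Gamma(shape_rate*dt, scale)."""
        increments = rng.gamma(shape_rate * dt, scale, size=n_steps)
        return np.cumsum(increments)

    def demonstrate(paths: np.ndarray, allowable: float) -> bool:
        """Pass if every tested unit stays below the allowable cumulative degradation."""
        return bool(np.all(paths[:, -1] <= allowable))

    if __name__ == "__main__":
        n_units, n_steps, dt = 8, 500, 1.0
        paths = np.vstack([simulate_gamma_degradation(0.02, 0.5, dt, n_steps)
                           for _ in range(n_units)])
        print("demonstration passed:", demonstrate(paths, allowable=8.0))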

  18. Response surface methodology based optimization of diesel–n-butanol –cotton oil ternary blend ratios to improve engine performance and exhaust emission characteristics

    International Nuclear Information System (INIS)

    Atmanlı, Alpaslan; Yüksel, Bedri; İleri, Erol; Deniz Karaoglan, A.

    2015-01-01

    Highlights: • RSM-based optimization of the optimum blend ratio of diesel fuel, n-butanol and cotton oil was performed. • 65.5 vol.% diesel fuel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC) was determined. • DnBC decreased brake torque, brake power, BTE and BMEP, while increasing BSFC. • DnBC decreased NOx, CO and HC emissions. - Abstract: Many studies report that 20% biodiesel is the optimum concentration for biodiesel–diesel fuel blends to improve performance. The present work focuses on finding the optimum blend ratios of diesel fuel, n-butanol, and cotton oil for diesel engine applications by using the response surface method (RSM). Experimental test fuels were prepared by choosing 7 different concentrations for which phase decomposition did not occur in the −10 °C phase diagram. Experiments were carried out at full load and the constant speed (2200 rpm) of maximum brake torque to determine engine performance and emission parameters. Based on the engine test results, optimization was performed using RSM, considering engine performance and exhaust emission parameters, to identify the component concentrations in the optimum ternary blend. Confirmation tests were employed to compare the output concentration values identified by the optimization. The real experimental results and the actual R² values, which show the relation between the optimization outputs and the real experiments, were in high accordance. The optimum component concentration was determined as 65.5 vol.% diesel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC). According to the engine performance tests, the brake torque, brake power, BTE and BMEP of DnBC decreased while BSFC increased compared with those of diesel fuel. The NOx, CO and HC emissions of DnBC decreased drastically, by 11.33%, 45.17% and 81.45%, respectively.
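    The Python sketch below illustrates the generic response-surface step described above: fit a second-order polynomial to two blend-ratio factors and search it for an optimum. The data points, factor ranges and response values are synthetic placeholders, not the engine measurements of the study.

    # Hedged RSM sketch: fit a quadratic surface to (n-butanol, cotton oil) fractions
    # (diesel being the remainder) and grid-search a minimum-response blend.
    import numpy as np

    def design_matrix(x1, x2):
        """Second-order RSM model: 1, x1, x2, x1^2, x2^2, x1*x2."""
        return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

    # synthetic placeholder data (volume fractions and e.g. BSFC in g/kWh)
    x1 = np.array([0.05, 0.10, 0.20, 0.25, 0.30, 0.15, 0.20])
    x2 = np.array([0.05, 0.10, 0.10, 0.15, 0.05, 0.20, 0.05])
    y  = np.array([262., 255., 251., 256., 259., 258., 252.])

    coef, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)

    # grid search for the minimum predicted response within the tested region
    g1, g2 = np.meshgrid(np.linspace(0.05, 0.30, 101), np.linspace(0.05, 0.20, 101))
    pred = design_matrix(g1.ravel(), g2.ravel()) @ coef
    k = int(np.argmin(pred))
    print("predicted optimum: n-butanol=%.3f, cotton oil=%.3f, response=%.1f"
          % (g1.ravel()[k], g2.ravel()[k], pred[k]))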

  19. A note on imperfect hedging: a method for testing stability of the hedge ratio

    Directory of Open Access Journals (Sweden)

    Michal Černý

    2012-01-01

    Full Text Available Companies producing, processing and consuming commodities in the production process often hedge their commodity exposures using derivative strategies based on different, highly correlated underlying commodities. Once the open position in a commodity is hedged using a derivative position with another underlying commodity, the appropriate hedge ratio must be determined in order for the hedge relationship to be as effective as possible. However, it is questionable whether the hedge ratio determined at the inception of the risk management strategy remains stable over the whole period for which the hedging strategy exists. Usually it is assumed that in the short run, the relationship (say, correlation) between the two commodities remains stable, while in the long run it may vary. We propose a method, based on the statistical theory of stability, for on-line detection of whether market movements in the prices of the commodities involved in the hedge relationship indicate that the hedge ratio may have been subject to a recent change. A change in the hedge ratio decreases the effectiveness of the original hedge relationship and creates a new open position. The proposed method should inform the risk manager that it may be reasonable to adjust the derivative strategy in a way that reflects the market conditions after the change in the hedge ratio.
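    The paper proposes a formal statistical stability test, which is not reproduced here. As a simple caricature of the monitoring problem it addresses, the Python sketch below estimates the hedge ratio by rolling OLS on synthetic returns and flags when the estimate drifts away from its inception value; the window length and alert threshold are arbitrary assumptions.

    # Illustrative monitoring sketch only; not the paper's stability test.
    import numpy as np

    rng = np.random.default_rng(1)

    def rolling_hedge_ratio(r_asset, r_proxy, window=60):
        """OLS slope cov(asset, proxy)/var(proxy) over a moving window."""
        ratios = []
        for t in range(window, len(r_asset) + 1):
            c = np.cov(r_asset[t - window:t], r_proxy[t - window:t])
            ratios.append(c[0, 1] / c[1, 1])
        return np.array(ratios)

    # synthetic returns whose true ratio shifts from 0.9 to 0.6 halfway through
    n = 500
    r_proxy = rng.normal(0, 0.01, n)
    beta = np.where(np.arange(n) < n // 2, 0.9, 0.6)
    r_asset = beta * r_proxy + rng.normal(0, 0.003, n)

    ratios = rolling_hedge_ratio(r_asset, r_proxy)
    inception = ratios[0]
    alerts = np.where(np.abs(ratios - inception) > 0.15)[0]
    print("first alert at rolling index:", alerts[0] if alerts.size else "none")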

  20. The role of the epoxy resin: Curing agent ratio in composite interfacial strength by single fibre microbond test

    DEFF Research Database (Denmark)

    Minty, Ross; Thomason, James L.; Petersen, Helga Nørgaard

    2015-01-01

    This paper focuses on an investigation into the role of the epoxy resin: curing agent ratio in composite interfacial shear strength of glass fibre composites. The procedure involved changing the percentage of curing agent (Triethylenetetramine [TETA]) used in the mixture with several different...... percentages used, ranging from 4% up to 30%, including the stoichiometric ratio. It was found by using the microbond test, that there may exist a relationship between the epoxy resin to curing agent ratio and the level of adhesion between the reinforcing fibre and the polymer matrix of the composite....

  1. Testing of January Anomaly at ISE-100 Index with Power Ratio Method

    Directory of Open Access Journals (Sweden)

    Şule Yüksel Yiğiter

    2015-12-01

    Full Text Available Abstract: According to the Efficient Market Hypothesis, it is not possible for investors who all have equal access to information to earn higher returns. However, several studies have demonstrated an effect of time on returns and reached conclusions that conflict with the hypothesis. In this context, one of the most important documented anomalies is the January anomaly. In this study, the presence of a January effect in the BIST-100 index over the 2008-2014 period is investigated by using the power ratio method. The analysis results confirm the presence of a January anomaly in the BIST-100 index within the specified period. Keywords: Efficient Markets Hypothesis, January Month Anomaly, Power Ratio Method. JEL Classification Codes: G1, C22

  2. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    Energy Technology Data Exchange (ETDEWEB)

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: Shake table tests of precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments have been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of the uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  3. A Methodological Report: Adapting the 505 Change-of-Direction Speed Test Specific to American Football.

    Science.gov (United States)

    Lockie, Robert G; Farzad, Jalilvand; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A

    2017-02-01

    Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.

  4. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)

    2015-05-15

    For a vessel operating in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests to track fuel savings in real time. Accordingly, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. Mathematical models of the marine diesel engine and the propeller are used in the simulation to estimate the fuel rate, engine rotating speed, and propeller thrust and torque, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel savings achieved by employing a new mathematical model of the turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
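    For background, a simulator of this kind typically rests on the standard open-water propeller relations T = KT·ρ·n²·D⁴ and Q = KQ·ρ·n²·D⁵ with advance ratio J = Va/(n·D). The Python sketch below evaluates these relations with placeholder KT(J) and KQ(J) curves; it is not the engine or propeller model used in the paper, and all numbers are illustrative.

    # Standard open-water propeller relations with placeholder coefficient curves.
    RHO_SEAWATER = 1025.0  # kg/m^3

    def advance_ratio(v_advance: float, n_rps: float, diameter: float) -> float:
        return v_advance / (n_rps * diameter)

    def thrust_torque(n_rps: float, diameter: float, j: float):
        """Thrust (N) and torque (N*m) from non-dimensional KT(J), KQ(J)."""
        kt = 0.45 - 0.40 * j            # placeholder linearised open-water curves
        kq = 0.060 - 0.050 * j
        thrust = kt * RHO_SEAWATER * n_rps**2 * diameter**4
        torque = kq * RHO_SEAWATER * n_rps**2 * diameter**5
        return thrust, torque

    if __name__ == "__main__":
        n, d, va = 2.0, 3.5, 4.0        # rev/s, m, m/s (illustrative values)
        j = advance_ratio(va, n, d)
        t, q = thrust_torque(n, d, j)
        print(f"J={j:.2f}  thrust={t/1e3:.1f} kN  torque={q/1e3:.1f} kN*m")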

  5. Accelerated lifetime testing methodology for lifetime estimation of Lithium-ion batteries used in augmented wind power plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2013-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process can be used for the parametrization of a performance-degradation lifetime model. In the proposed...... methodology both calendar and cycling lifetime tests are considered since both components are influencing the lifetime of Lithium-ion batteries. The methodology proposes also a lifetime model verification stage, where Lithium-ion battery cells are tested at normal operating conditions using an application...

  6. A test of the mean density approximation for Lennard-Jones mixtures with large size ratios

    International Nuclear Information System (INIS)

    Ely, J.F.

    1986-01-01

    The mean density approximation for mixture radial distribution functions plays a central role in modern corresponding-states theories. This approximation is reasonably accurate for systems that do not differ widely in size and energy ratios and which are nearly equimolar. As the size ratio increases, however, or if one approaches an infinite dilution of one of the components, the approximation becomes progressively worse, especially for the small molecule pair. In an attempt to better understand and improve this approximation, isothermal molecular dynamics simulations have been performed on a series of Lennard-Jones mixtures. Thermodynamic properties, including the mixture radial distribution functions, have been obtained at seven compositions ranging from 5 to 95 mol%. In all cases the size ratio was fixed at two and three energy ratios were investigated, ε22/ε11 = 0.5, 1.0, and 1.5. The results of the simulations are compared with the mean density approximation and a modification to integrals evaluated with the mean density approximation is proposed

  7. Dynamic moduli and damping ratios of soil evaluated from pressuremeter test

    International Nuclear Information System (INIS)

    Yoshida, Yasuo; Ezashi, Yasuyuki; Kokusho, Takaji; Nishi, Yoshikazu

    1984-01-01

    Dynamic and static properties of soils are investigated using newly developed in-situ test equipment, which imposes repeated dynamic pressure on the borehole wall at any depth over a wide range of strain amplitudes. This paper mainly describes the shear modulus and damping characteristics of soils obtained by using the equipment at several sites covering a wide variety of soils. The test results are compared with those obtained by other test methods, such as the dynamic triaxial test, the simple shear test and the shear wave velocity test, and their relationships to each other are discussed, which demonstrates the efficiency of this in-situ test. (author)

  8. QUASARS ARE NOT LIGHT BULBS: TESTING MODELS OF QUASAR LIFETIMES WITH THE OBSERVED EDDINGTON RATIO DISTRIBUTION

    International Nuclear Information System (INIS)

    Hopkins, Philip F.; Hernquist, Lars

    2009-01-01

    We use the observed distribution of Eddington ratios as a function of supermassive black hole (BH) mass to constrain models of quasar/active galactic nucleus (AGN) lifetimes and light curves. Given the observed (well constrained) AGN luminosity function, a particular model for AGN light curves L(t) or, equivalently, the distribution of AGN lifetimes (time above a given luminosity t(>L)) translates directly and uniquely (without further assumptions) to a predicted distribution of Eddington ratios at each BH mass. Models for self-regulated BH growth, in which feedback produces a self-regulating 'decay' or 'blowout' phase after the AGN reaches some peak luminosity/BH mass and begins to expel gas and shut down accretion, make specific predictions for the light curves/lifetimes, distinct from, e.g., the expected distribution if AGN simply shut down by gas starvation (without feedback) and very different from the prediction of simple phenomenological 'light bulb' scenarios. We show that the present observations of the Eddington ratio distribution, spanning nearly 5 orders of magnitude in Eddington ratio, 3 orders of magnitude in BH mass, and redshifts z = 0-1, agree well with the predictions of self-regulated models, and rule out phenomenological 'light bulb' or pure exponential models, as well as gas starvation models, at high significance (∼5σ). We also compare with observations of the distribution of Eddington ratios at a given AGN luminosity, and find similar good agreement (but show that these observations are much less constraining). We fit the functional form of the quasar lifetime distribution and provide these fits for use, and show how the Eddington ratio distributions place precise, tight limits on the AGN lifetimes at various luminosities, in agreement with model predictions. We compare with independent estimates of episodic lifetimes and use this to constrain the shape of the typical AGN light curve, and provide simple analytic fits to these for use in

  9. Experimental tests of the effect of rotor diameter ratio and blade number to the cross-flow wind turbine performance

    Science.gov (United States)

    Susanto, Sandi; Tjahjana, Dominicus Danardono Dwi Prija; Santoso, Budi

    2018-02-01

    The cross-flow wind turbine is an alternative energy harvester for low wind speed areas. Factors that influence the power coefficient of a cross-flow wind turbine include the blade diameter ratio and the number of blades. The aim of this study is to determine the influence of the number of blades and the diameter ratio on the performance of a cross-flow wind turbine and to find the best configuration of the two. Experimental tests were conducted for several variations of the ratio between the turbine's inner and outer diameters and of the number of blades. The diameter ratios tested were 0.58, 0.63, 0.68 and 0.73, while the numbers of blades used were 16, 20 and 24. The tests were conducted at wind speeds of 3 m/s to 4 m/s. The results showed that the configuration with a diameter ratio of 0.68 and 20 blades is the best, with a power coefficient of 0.049 and a moment coefficient of 0.185.
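    The reported coefficients follow the usual definitions Cp = P/(0.5·ρ·A·V³) and Cm = T/(0.5·ρ·A·R·V²), with Cp = Cm·TSR. The Python sketch below simply evaluates these definitions; the rotor dimensions, torque and rotational speed are illustrative assumptions and are not the measurements of the study.

    # Standard wind-turbine coefficient definitions with illustrative inputs.
    import math

    RHO_AIR = 1.225  # kg/m^3

    def power_coefficient(power_w, wind_speed, swept_area):
        return power_w / (0.5 * RHO_AIR * swept_area * wind_speed**3)

    def moment_coefficient(torque_nm, wind_speed, swept_area, radius):
        return torque_nm / (0.5 * RHO_AIR * swept_area * radius * wind_speed**2)

    if __name__ == "__main__":
        d_outer, height, v = 0.40, 0.40, 3.5          # m, m, m/s (assumed rig)
        area, radius = d_outer * height, d_outer / 2
        torque = 0.044                                # N*m, assumed measurement
        omega = 2 * math.pi * 44 / 60                 # 44 rpm -> rad/s, assumed
        cp = power_coefficient(torque * omega, v, area)
        cm = moment_coefficient(torque, v, area, radius)
        print(f"Cp={cp:.3f}  Cm={cm:.3f}  TSR={omega * radius / v:.2f}")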

  10. Methodology for predicting the life of waste-package materials, and components using multifactor accelerated life tests

    International Nuclear Information System (INIS)

    Thomas, R.E.; Cote, R.W.

    1983-09-01

    Accelerated life tests are essential for estimating the service life of waste-package materials and components. A recommended methodology for generating accelerated life tests is described in this report. The objective of the methodology is to define an accelerated life test program that is scientifically and statistically defensible. The methodology is carried out using a select team of scientists and usually requires 4 to 12 man-months of effort. Specific agendas for the successive meetings of the team are included in the report for use by the team manager. The agendas include assignments for the team scientists and a different set of assignments for the team statistician. The report also includes descriptions of factorial tables, hierarchical trees, and associated mathematical models that are proposed as technical tools to guide the efforts of the design team

  11. Accelerated Lifetime Testing Methodology for Lifetime Estimation of Lithium-ion Batteries used in Augmented Wind Power Plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2014-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium ion batteries. The results obtained at the end of the accelerated ageing process were used for the parametrization of a performance-degradation lifetime model, which is able to predict...... both the capacity fade and the power capability decrease of the selected Lithium-ion battery cells. In the proposed methodology both calendar and cycling lifetime tests were considered since both components are influencing the lifetime of Lithium-ion batteries. Furthermore, the proposed methodology...

  12. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second most common in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include the times from tumor removal until the first, second and third tumor recurrences, and the time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, the global score test methodology is used to combine the univariate score statistics, which compare treatments with respect to each survival endpoint, into a single statistic. The tumor recurrence and overall survival data of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
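    As a generic illustration of the combining step mentioned above, and not necessarily the exact estimator used in the study, the Python sketch below forms a single chi-squared statistic U'V⁻¹U from per-endpoint score statistics and an assumed covariance matrix; all numbers are hypothetical.

    # Textbook-style combination of several endpoint score statistics.
    import numpy as np
    from scipy import stats

    def global_score_test(scores: np.ndarray, cov: np.ndarray):
        """Global statistic U' V^{-1} U with df = number of endpoints."""
        scores = np.asarray(scores, dtype=float)
        stat = float(scores @ np.linalg.solve(cov, scores))
        df = scores.size
        return stat, stats.chi2.sf(stat, df)

    if __name__ == "__main__":
        # illustrative: endpoints = 1st/2nd/3rd recurrence and death (z-like scores)
        u = np.array([1.9, 1.4, 0.8, 2.1])
        v = np.array([[1.0, 0.5, 0.4, 0.3],
                      [0.5, 1.0, 0.5, 0.3],
                      [0.4, 0.5, 1.0, 0.3],
                      [0.3, 0.3, 0.3, 1.0]])
        chi2, p = global_score_test(u, v)
        print(f"global chi2={chi2:.2f}, p={p:.4f}")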

  13. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions: o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set? o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach? o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How does the benefits users would derive from using the Target Organ System Approach compare to the benefits available from the current HCN-based approach?

  14. Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.

    Science.gov (United States)

    Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W

    2006-10-01

    A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and likelihood-ratio goodness-of-fit tests.
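    The paper's specific measures are not reproduced here. As a related, widely used example, the Python sketch below computes Cohen's w = sqrt(chi2/N) for a goodness-of-fit test, together with an analogous quantity based on the likelihood-ratio statistic G; the counts and hypothesised proportions are illustrative.

    # Hedged illustration of goodness-of-fit effect sizes.
    import numpy as np

    def cohens_w(observed, expected_probs):
        observed = np.asarray(observed, dtype=float)
        n = observed.sum()
        expected = n * np.asarray(expected_probs, dtype=float)
        chi2 = ((observed - expected) ** 2 / expected).sum()
        return np.sqrt(chi2 / n), chi2

    def likelihood_ratio_g(observed, expected_probs):
        """G = 2 * sum(O * ln(O/E)); an analogous effect size is sqrt(G/N)."""
        observed = np.asarray(observed, dtype=float)
        expected = observed.sum() * np.asarray(expected_probs, dtype=float)
        mask = observed > 0
        g = 2.0 * np.sum(observed[mask] * np.log(observed[mask] / expected[mask]))
        return np.sqrt(g / observed.sum()), g

    if __name__ == "__main__":
        obs = [18, 30, 52]            # illustrative counts
        probs = [0.25, 0.25, 0.50]    # hypothesised proportions
        w, chi2 = cohens_w(obs, probs)
        wg, g = likelihood_ratio_g(obs, probs)
        print(f"chi2={chi2:.2f}, w={w:.3f};  G={g:.2f}, w_G={wg:.3f}")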

  15. Development and interval testing of a naturalistic driving methodology to evaluate driving behavior in clinical research.

    Science.gov (United States)

    Babulal, Ganesh M; Addison, Aaron; Ghoshal, Nupur; Stout, Sarah H; Vernon, Elizabeth K; Sellan, Mark; Roe, Catherine M

    2016-01-01

    Background: The number of older adults in the United States will double by 2056. Additionally, the number of licensed drivers will increase along with extended driving-life expectancy. Motor vehicle crashes are a leading cause of injury and death in older adults. Alzheimer's disease (AD) also negatively impacts driving ability and increases crash risk. Conventional methods to evaluate driving ability are limited in predicting decline among older adults. Innovations in GPS hardware and software can monitor driving behavior in the actual environments people drive in. Commercial off-the-shelf (COTS) devices are affordable, easy to install and capture large volumes of data in real time. However, adapting these methodologies for research can be challenging. This study sought to adapt a COTS device and determine an interval that produced accurate data on the actual route driven for use in future studies involving older adults with and without AD. Methods: Three subjects drove a single course in different vehicles at different intervals (30, 60 and 120 seconds), at different times of day: morning (9:00-11:59AM), afternoon (2:00-5:00PM) and night (7:00-10:00PM). The nine datasets were examined to determine the optimal collection interval. Results: Compared to the 120-second and 60-second intervals, the 30-second interval was optimal in capturing the actual route driven, with the lowest number of incorrect paths, while remaining affordable in terms of data storage and curation. Discussion: Use of COTS devices offers minimal installation effort, unobtrusive monitoring and discreet data extraction. However, these devices require strict protocols and controlled testing for adoption into research paradigms. After reliability and validity testing, these devices may provide valuable insight into daily driving behaviors and intraindividual change over time for populations of older adults with and without AD. Data can be aggregated over time to look at changes

  16. Development of a calibration methodology and tests of kerma area product meters

    International Nuclear Information System (INIS)

    Costa, Nathalia Almeida

    2013-01-01

    The quantity kerma area product (PKA) is important for establishing reference levels in diagnostic radiology exams. This quantity can be obtained using a PKA meter. The use of such meters is essential to evaluate the radiation dose in radiological procedures and is a good indicator for ensuring that the dose limit to the patient's skin is not exceeded. Sometimes these meters come fixed to the X-ray equipment, which makes their calibration difficult. In this work, a methodology for the calibration of PKA meters was developed. The instrument used for this purpose was the Patient Dose Calibrator (PDC). It was developed to be used as a reference to check the calibration of PKA and air kerma meters that are used for dosimetry in patients and to verify the consistency and behavior of automatic exposure control systems. Because it is new equipment, which in Brazil is not yet used as a calibration reference, quality control of this equipment was also performed, with characterization tests, calibration and an evaluation of the energy dependence. After the tests, it was shown that the PDC can be used as a reference instrument and that the calibration must be performed in situ, so that the characteristics of each X-ray unit where the PKA meters are used are taken into account. The calibration was then performed with portable PKA meters and on an interventional radiology unit that has a fixed PKA meter. The results were good and demonstrated the need for calibration of these meters and the importance of in situ calibration with a reference meter. (author)

  17. Combining rigour with relevance: a novel methodology for testing Chinese herbal medicine.

    Science.gov (United States)

    Flower, Andrew; Lewith, George; Little, Paul

    2011-03-24

    There is a need to develop an evidence base for Chinese herbal medicine (CHM) that is both rigorous and reflective of good practice. This paper proposes a novel methodology to test individualised herbal decoctions using a randomised, double blinded, placebo controlled clinical trial. A feasibility study was conducted to explore the role of CHM in the treatment of endometriosis. Herbal formulae were pre-cooked and dispensed as individual doses in sealed plastic sachets. This permitted the development and testing of a plausible placebo decoction. Participants were randomised at a distant pharmacy to receive either an individualised herbal prescription or a placebo. The trial met the predetermined criteria for good practice. Neither the participants nor the practitioner-researcher could reliably identify group allocation. Of the 28 women who completed the trial, in the placebo group (n=15) 3 women (20%) correctly guessed they were on placebo, 8 (53%) thought they were on herbs and 4 (27%) did not know which group they had been allocated to. In the active group (n=13) 2 (15%) though they were on placebo, 8 (62%) thought they were on herbs and 3 (23%) did not know. Randomisation, double blinding and allocation concealment were successful and the study model appeared to be feasible and effective. It is now possible to subject CHM to rigorous scientific scrutiny without compromising model validity. Improvement in the design of the placebo using food colourings and flavourings instead of dried food will help guarantee the therapeutic inertia of the placebo decoction. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    International Nuclear Information System (INIS)

    Andersson, Johan; Berglund, Johan; Follin, Sven; Hakami, Eva; Halvarson, Jan; Hermanson, Jan; Laaksoharju, Marcus; Rhen, Ingvar; Wahlgren, C.H.

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline and after this

  19. Can persistence hunting signal male quality? A test considering digit ratio in endurance athletes.

    Directory of Open Access Journals (Sweden)

    Daniel Longman

    Full Text Available Various theories have been posed to explain the fitness payoffs of hunting success among hunter-gatherers. 'Having' theories refer to the acquisition of resources, and include the direct provisioning hypothesis. In contrast, 'getting' theories concern the signalling of male resourcefulness and other desirable traits, such as athleticism and intelligence, via hunting prowess. We investigated the association between androgenisation and endurance running ability as a potential signalling mechanism, whereby running prowess, vital for persistence hunting, might be used as a reliable signal of male reproductive fitness by females. Digit ratio (2D:4D) was used as a proxy for prenatal androgenisation in 439 males and 103 females, while a half marathon race (21 km, representing a distance/duration comparable with that of persistence hunting) was used to assess running ability. Digit ratio was significantly and positively correlated with half-marathon time in males (right hand: r = 0.45, p<0.001; left hand: r = 0.42, p<0.001) and females (right hand: r = 0.26, p<0.01; left hand: r = 0.23, p = 0.02). Sex-interaction analysis showed that this correlation was significantly stronger in males than females, suggesting that androgenisation may have experienced stronger selective pressure from endurance running in males. As digit ratio has previously been shown to predict reproductive success, our results are consistent with the hypothesis that endurance running ability may signal reproductive potential in males, through its association with prenatal androgen exposure. However, further work is required to establish whether and how females respond to this signalling for fitness.

  20. Critical assessment of jet erosion test methodologies for cohesive soil and sediment

    Science.gov (United States)

    Karamigolbaghi, Maliheh; Ghaneeizad, Seyed Mohammad; Atkinson, Joseph F.; Bennett, Sean J.; Wells, Robert R.

    2017-10-01

    The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear stress and erodibility coefficient of soil. These include the Blaisdell, Iterative, and Scour Depth Methods, and all have been organized into easy-to-use spreadsheet routines. The analytical framework of the JET and its associated methods, however, are based on many assumptions that may not be satisfied in field and laboratory settings. The main objective of this study is to critically assess this analytical framework and these methodologies. Part of this assessment is to include the effect of flow confinement on the JET. The possible relationship between the derived erodibility coefficient and critical shear stress, a practical tool in soil erosion assessment, is examined, and a review of the deficiencies in the JET methodology also is presented. Using a large database of JET results from the United States and data from the literature, it is shown that each method can generate an acceptable curve fit through the scour depth measurements as a function of time. The analysis shows, however, that the Scour Depth and Iterative Methods may result in physically unrealistic values for the erosion parameters. The effect of flow confinement of the impinging jet increases the derived critical shear stress and decreases the erodibility coefficient by a factor of 2.4 relative to the unconfined-flow assumption. For a given critical shear stress, the length of time over which scour depth data are collected also affects the calculation of erosion parameters. In general, there is a lack of consensus relating the derived soil erodibility coefficient to the derived critical shear stress. Although empirical relationships are statistically significant, the calculated erodibility coefficient for a
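    The linear excess shear stress equation mentioned above has the form dJ/dt = kd·(τ − τc). The Python sketch below integrates it forward in time under the common assumption that the peak boundary shear stress decays as (Jp/J)² beyond the jet's potential core; all parameter values are illustrative and not drawn from the paper's database.

    # Hedged sketch of JET-style scour deepening under the linear excess
    # shear stress law; explicit Euler integration, illustrative parameters.
    import numpy as np

    def scour_depth_series(tau0, tau_c, kd, j0, jp, t_end_s, dt=1.0):
        """Integrate scour-hole deepening J(t) forward in time."""
        times = np.arange(0.0, t_end_s + dt, dt)
        j = np.empty_like(times)
        j[0] = j0
        for i in range(1, times.size):
            tau = tau0 * min(1.0, (jp / j[i - 1]) ** 2)   # Pa, decays beyond the core
            rate = max(0.0, kd * (tau - tau_c))           # m/s, no erosion below tau_c
            j[i] = j[i - 1] + rate * dt
        return times, j

    if __name__ == "__main__":
        # tau0=60 Pa at the nozzle, tau_c=5 Pa, kd=1e-6 m^3/(N*s), start at the core length
        t, j = scour_depth_series(tau0=60.0, tau_c=5.0, kd=1e-6, j0=0.05, jp=0.05, t_end_s=3600)
        print(f"scour after 1 h: {(j[-1] - j[0]) * 1000:.1f} mm")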

  1. Restrictions on the Ratio of Normal to Tangential Field Components in Magnetic Rubber Testing

    National Research Council Canada - National Science Library

    Burke, S. K; Ibrahim, M. E

    2007-01-01

    Magnetic Rubber Testing (MRT) is an extremely sensitive method for detecting surface-breaking cracks in ferromagnetic materials, and is used extensively in critical inspections for D6ac steel components of the F-111 aircraft...

  2. The ξ/ξ2nd ratio as a test for Effective Polyakov Loop Actions

    Science.gov (United States)

    Caselle, Michele; Nada, Alessandro

    2018-03-01

    Effective Polyakov line actions are a powerful tool to study the finite temperature behaviour of lattice gauge theories. They are much simpler to simulate than the original (3+1) dimensional LGTs and are affected by a milder sign problem. However it is not clear to which extent they really capture the rich spectrum of the original theories, a feature which is instead of great importance if one aims to address the sign problem. We propose here a simple way to address this issue based on the so called second moment correlation length ξ2nd. The ratio ξ/ξ2nd between the exponential correlation length and the second moment one is equal to 1 if only a single mass is present in the spectrum, and becomes larger and larger as the complexity of the spectrum increases. Since both ξexp and ξ2nd are easy to measure on the lattice, this is an economic and effective way to keep track of the spectrum of the theory. In this respect we show using both numerical simulation and effective string calculations that this ratio increases dramatically as the temperature decreases. This non-trivial behaviour should be reproduced by the Polyakov loop effective action.
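    As a toy numerical check of the diagnostic described above, and not the paper's lattice setup, the Python sketch below compares an exponential correlation length with a second-moment length for a one-dimensional correlator, using the moment-based definition ξ2nd² = Σ x²G(x) / (2 Σ G(x)) as an assumption; with a single exponential the ratio is 1, and an added heavier state pushes it above 1.

    # Toy 1D check: xi/xi_2nd grows when the correlator contains extra states.
    import numpy as np

    def xi_second_moment(x: np.ndarray, g: np.ndarray) -> float:
        return float(np.sqrt((x**2 * g).sum() / (2.0 * g.sum())))

    def xi_exponential(x: np.ndarray, g: np.ndarray, tail_from: float) -> float:
        """Effective decay length from a log-linear fit to the large-distance tail."""
        mask = x >= tail_from
        slope = np.polyfit(x[mask], np.log(g[mask]), 1)[0]
        return -1.0 / slope

    if __name__ == "__main__":
        x = np.linspace(0.0, 200.0, 4001)
        single = np.exp(-x / 10.0)                        # one state, xi = 10
        two = np.exp(-x / 10.0) + 0.8 * np.exp(-x / 3.0)  # extra heavier state
        for name, g in (("single state", single), ("two states", two)):
            ratio = xi_exponential(x, g, 60.0) / xi_second_moment(x, g)
            print(f"{name}: xi/xi_2nd = {ratio:.3f}")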

  3. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  4. Design, manufacture and spin test of high contact ratio helicopter transmission utilizing Self-Aligning Bearingless Planetary (SABP)

    Science.gov (United States)

    Folenta, Dezi; Lebo, William

    1988-01-01

    A 450 hp high ratio Self-Aligning Bearingless Planetary (SABP) for a helicopter application was designed, manufactured, and spin tested under NASA contract NAS3-24539. The objective of the program was to conduct research and development work on a high contact ratio helical gear SABP to reduce weight and noise and to improve efficiency. The results accomplished include the design, manufacturing, and no-load spin testing of two prototype helicopter transmissions, rated at 450 hp with an input speed of 35,000 rpm and an output speed of 350 rpm. The weight-to-power ratio of these gear units is 0.33 lb/hp. The measured airborne noise at 35,000 rpm input speed and light load is 94 dB at 5 ft. The high speed, high contact ratio SABP transmission appears to be significantly lighter and quieter than contemporary helicopter transmissions. The concept of the SABP is applicable not only to high ratio helicopter type transmissions but also to other rotorcraft and aircraft propulsion systems.

  5. Improvement in post test accident analysis results prediction for the test no. 2 in PSB test facility by applying UMAE methodology

    International Nuclear Information System (INIS)

    Dubey, S.K.; Petruzzi, A.; Giannotti, W.; D'Auria, F.

    2006-01-01

    This paper mainly deals with the improvement in the post test accident analysis results prediction for the test no. 2, 'Total loss of feed water with failure of HPIS pumps and operator actions on primary and secondary circuit depressurization', carried out on the PSB integral test facility in May 2005. This is one of the most complicated tests conducted in the PSB test facility. The prime objective of this test is to provide support for the verification of the accident management strategies for NPPs and also to verify the correct functioning of some safety systems operating only during accidents. The objective of this analysis is to assess the capability to reproduce the phenomena occurring during the selected tests and to quantify, qualitatively and quantitatively, the accuracy of the calculation of the best estimate code Relap5/mod3.3 by systematically applying all the procedures prescribed by the Uncertainty Methodology based on Accuracy Extrapolation (UMAE), developed at the University of Pisa. In order to achieve these objectives, the qualification of the test facility nodalisation at both the 'steady state level' and the 'on transient level' is demonstrated. For the 'steady state level' qualification, compliance with the acceptance criteria established in the UMAE has been checked for geometrical details and thermal hydraulic parameters. The following steps have been performed for the qualitative qualification at the 'on transient level': visual comparison of the time trends of relevant experimental and calculated parameters; comparison of the time sequences of significant events in the experiment and in the code calculation; identification/verification of the CSNI phenomena validation matrix; use of the Phenomenological Windows (PhW) and identification of Key Phenomena and Relevant Thermal-hydraulic Aspects (RTA). A successful application of the qualitative process constitutes a prerequisite to the application of the quantitative analysis. For the quantitative accuracy of the code prediction the Fast Fourier Transform Based
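
    The abstract is cut off at the mention of the Fast Fourier Transform based method; purely as a hedged illustration of the figure of merit usually quoted for that method, the sketch below computes the average amplitude AA = Σ|FFT(calc − exp)| / Σ|FFT(exp)| for a toy transient. Sampling, frequency cut-off and weighting details of the actual UMAE/FFTBM procedure are not reproduced, and the signals are invented.

```python
import numpy as np

def fftbm_average_amplitude(exp_data, calc_data):
    """Core average-amplitude (AA) figure of merit of the FFT-based method:
    AA = sum_f |FFT(calc - exp)| / sum_f |FFT(exp)|; smaller means the
    calculation tracks the experiment more closely over the whole transient."""
    err = np.asarray(calc_data) - np.asarray(exp_data)
    return np.sum(np.abs(np.fft.rfft(err))) / np.sum(np.abs(np.fft.rfft(exp_data)))

# Toy "experimental" trend and a calculation with a small bias and oscillation.
t = np.linspace(0.0, 1000.0, 2048)
exp_trend = 15.0 * np.exp(-t / 400.0) + 2.0        # e.g. a pressure-like signal
calc_trend = 1.03 * exp_trend + 0.1 * np.sin(t / 50.0)
print(f"AA = {fftbm_average_amplitude(exp_trend, calc_trend):.3f}")
```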

  6. Implementation of Prognostic Methodologies to Cryogenic Propellant Loading Test-bed

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics methodologies determine the health state of a system and predict the end of life and remaining useful life. This information enables operators to take...

  7. An improved single sensor parity space algorithm for sequential probability ratio test

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Atomic Energy Research Inst.

    1995-12-01

    In our paper we propose a modification of the single sensor parity algorithm in order to make the statistical properties of the generated residual determinable in advance. The algorithm is tested via a computer-simulated ramp failure in the temperature readings of the pressurizer. (author).
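
    The paper's parity-space residual generation is not detailed in this record, but the sequential probability ratio test it feeds can be sketched generically. Below is a textbook Wald SPRT for a mean shift in a Gaussian residual stream; the thresholds, fault magnitude and noise level are illustrative assumptions, and in monitoring practice the test is restarted after every 'no fault' decision.

```python
import numpy as np

def sprt_gaussian_mean(residuals, mu0=0.0, mu1=1.0, sigma=1.0,
                       alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a residual stream.
    H0: mean = mu0 (healthy) vs H1: mean = mu1 (faulty); alpha/beta are the
    target false-alarm and missed-detection probabilities."""
    upper = np.log((1.0 - beta) / alpha)    # decide H1 above this
    lower = np.log(beta / (1.0 - alpha))    # decide H0 below this
    llr = 0.0
    for i, r in enumerate(residuals):
        # Gaussian log-likelihood-ratio increment for one observation.
        llr += (mu1 - mu0) * (r - 0.5 * (mu0 + mu1)) / sigma**2
        if llr >= upper:
            return "fault detected (H1)", i
        if llr <= lower:
            return "no fault (H0)", i
    return "undecided", len(residuals) - 1

rng = np.random.default_rng(0)
print(sprt_gaussian_mean(rng.normal(0.0, 1.0, 400)))   # healthy residuals
print(sprt_gaussian_mean(rng.normal(1.0, 1.0, 400)))   # biased sensor
```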

  8. Testing temperature on interfacial shear strength measurements of epoxy resins at different mixing ratios

    DEFF Research Database (Denmark)

    Petersen, Helga Nørgaard; Thomason, James L.; Minty, Ross

    2015-01-01

    The interfacial properties, such as the interfacial shear strength (IFSS), of fibre reinforced polymers are essential for further understanding of the mechanical properties of the composite. In this work a single fibre testing method is used in combination with an epoxy matrix made from Araldite 506 epoxy res...

  9. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    Science.gov (United States)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiple-specimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clamped on the shaker armature with specially-designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses.

  10. The Wedge Splitting Test: Influence of Aggregate Size and Water-to-Cement Ratio

    DEFF Research Database (Denmark)

    Pease, Bradley Justin; Skocek, Jan; Geiker, Mette Rica

    2007-01-01

    Since the development of the wedge splitting test (WST), techniques have been used to extract material properties that can describe the fracture behavior of the tested materials. Inverse analysis approaches are commonly used to estimate the stress-crack width relationship, which is described...... by the elastic modulus, tensile strength, fracture energy, and the assumed softening behavior. The stress-crack width relation can be implemented in finite element models for computing the cracking behavior of cementitious systems. While inverse analysis provides information about the material properties...... of various concrete mixtures, there are limitations to the current analysis techniques. To date these techniques analyze the result of one WST specimen, thereby providing an estimate of material properties from a single result. This paper utilizes a recent improvement to the inverse analysis technique, which...

  11. Bioactivity tests of calcium phosphates with variant molar ratios of main components.

    Science.gov (United States)

    Pluta, Klaudia; Sobczak-Kupiec, Agnieszka; Półtorak, Olga; Malina, Dagmara; Tyliszczak, Bożena

    2018-03-09

    Calcium phosphates constitute attractive materials for biomedical applications. Among them, particular attention is devoted to bioactive hydroxyapatite (HAp) and bioresorbable tricalcium phosphate (TCP), which possess the ability to bind to living bone and can be used clinically as important bone substitutes. Notably, in vivo bone bioactivity can be predicted from apatite formation on materials immersed in simulated body fluid (SBF). Thus, analyses of the behavior of calcium phosphates immersed in various biofluids are of great importance. Recently, stoichiometric HAp and TCP structures have been widely studied, whereas only a limited number of publications has been devoted to analyses of nonstoichiometric calcium phosphates. Here, we report a physicochemical analysis of natural and synthetic phosphates with variable Ca/P molar ratios. The obtained structures were subsequently incubated in either artificial saliva or Ringer's fluid. Both the pH and conductivity of these fluids were determined before and after incubation, and the influence of the Ca/P values on these parameters was examined. The physicochemical analysis of the obtained materials was performed by XRD and FT-IR characterization techniques. Their potential antibacterial activity and behavior in the presence of infectious microorganisms such as Escherichia coli and Staphylococcus aureus were also evaluated. © 2018 Wiley Periodicals, Inc. J Biomed Mater Res Part A, 2018.

  12. The patients' perspective of international normalized ratio self-testing, remote communication of test results and confidence to move to self-management.

    Science.gov (United States)

    Grogan, Anne; Coughlan, Michael; Prizeman, Geraldine; O'Connell, Niamh; O'Mahony, Nora; Quinn, Katherine; McKee, Gabrielle

    2017-12-01

    To elicit the perceptions of patients who self-tested their international normalized ratio and communicated their results via a text or phone messaging system, to determine their satisfaction with the education and support that they received and to establish their confidence to move to self-management. Self-testing of international normalized ratio has been shown to be reliable and is fast becoming common practice. As innovations are introduced to point of care testing, more research is needed to elicit patients' perceptions of the self-testing process. This three-site study used a cross-sectional prospective descriptive survey. Three hundred and thirty patients who were prescribed warfarin and using international normalized ratio self-testing were invited to take part in the study. The anonymous survey examined patient profile, patients' usage, issues, perceptions, confidence and satisfaction with using the self-testing system and their preparedness for self-management of warfarin dosage. The response rate was 57% (n = 178). Patients' confidence in self-testing was high (90%). Patients expressed a high level of satisfaction with the support received, but expressed the need for more information on support groups, side effects of warfarin, dietary information and how to dispose of needles. When asked if they felt confident to adjust their own warfarin levels, 73% agreed. Chi-squared tests for independence revealed that none of the patient profile factors examined influenced this confidence. The patients cited reduced burden, more autonomy, convenience and ease of use as the greatest advantages of the service. The main disadvantages cited were cost and communication issues. Patients were satisfied with self-testing. The majority felt they were ready to move to self-management. The introduction of innovations to remote point of care testing, such as warfarin self-testing, needs to have support at least equal to that provided in a hospital setting. © 2017 John

  13. A multiple testing of the ABC method and the development of a second-generation model. Part II, test results and an analysis of recall ratio.

    Science.gov (United States)

    ALTMANN, BERTHOLD

    After a brief summary of the test program (described more fully in LI 000 318), the statistical results tabulated as overall "ABC (Approach by Concept)-relevance ratios" and "ABC-recall figures" are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet…

  14. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    Science.gov (United States)

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  15. Signal to noise ratio enhancement for Eddy Current testing of steam generator tubes in PWR's

    International Nuclear Information System (INIS)

    Georgel, B.

    1985-01-01

    Noise reduction is a compulsory task when we try to recognize and characterize flaws. The signals we deal with come from Eddy Current testing of steam generator steel tubes. We point out the need for a spectral invariant in the digital spectral analysis of two-component signals. We make clear the pros and cons of classical passband filtering and suggest the use of a new noise cancellation method first discussed by Moriwaki and Tlusty. We generalize this technique and prove it is a very special case of the well-known Wiener filter. In that sense the M-T method is shown to be optimal. 6 refs
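
    The Moriwaki-Tlusty formulation itself is not given in this record; as a minimal sketch of the Wiener-filter idea it is said to generalize, the snippet below estimates the noise power spectrum from a flaw-free reference trace and applies the classic spectral gain to an eddy current signal. Signal shapes and noise levels are invented for illustration.

```python
import numpy as np

def wiener_denoise(signal, noise_ref):
    """Frequency-domain Wiener-type filtering: the noise spectrum is taken
    from a flaw-free reference segment and each bin of the noisy trace is
    attenuated by H = max(P_total - P_noise, 0) / P_total."""
    n = len(signal)
    P_total = np.abs(np.fft.rfft(signal, n)) ** 2
    P_noise = np.abs(np.fft.rfft(noise_ref, n)) ** 2
    gain = np.clip((P_total - P_noise) / np.maximum(P_total, 1e-12), 0.0, 1.0)
    return np.fft.irfft(np.fft.rfft(signal, n) * gain, n)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
flaw = np.exp(-((t - 0.5) / 0.01) ** 2)            # localized flaw indication
noisy = flaw + 0.3 * rng.normal(size=t.size)
reference = 0.3 * rng.normal(size=t.size)          # flaw-free section of tube
cleaned = wiener_denoise(noisy, reference)
```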

  16. Financial Key Ratios

    OpenAIRE

    Tănase Alin-Eliodor

    2014-01-01

    This article focuses on techniques for computing financial key ratios from trial balance data. Activity, liquidity, solvency and profitability key ratios are presented, together with a three-step computing methodology based on a trial balance.
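
    The article's own three-step methodology is not reproduced in this record; the sketch below only illustrates the four families of key ratios it names (activity, liquidity, solvency, profitability) computed from hypothetical trial-balance aggregates. The account groupings and figures are assumptions, not the article's worked example.

```python
# Hypothetical trial-balance aggregates (all figures invented).
tb = {
    "current_assets": 180_000, "inventory": 60_000, "current_liabilities": 90_000,
    "total_assets": 550_000, "total_liabilities": 260_000, "equity": 290_000,
    "net_sales": 700_000, "net_income": 63_000,
}

ratios = {
    "current_ratio":    tb["current_assets"] / tb["current_liabilities"],   # liquidity
    "quick_ratio":     (tb["current_assets"] - tb["inventory"]) / tb["current_liabilities"],
    "debt_to_equity":   tb["total_liabilities"] / tb["equity"],             # solvency
    "asset_turnover":   tb["net_sales"] / tb["total_assets"],               # activity
    "net_margin":       tb["net_income"] / tb["net_sales"],                 # profitability
    "return_on_assets": tb["net_income"] / tb["total_assets"],
}

for name, value in ratios.items():
    print(f"{name:>16}: {value:.2f}")
```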

  17. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... practices using INR POCT in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality....

  18. A methodology for the design and testing of atmospheric boundary layer models for wind energy applications

    Directory of Open Access Journals (Sweden)

    J. Sanz Rodrigo

    2017-02-01

    Full Text Available The GEWEX Atmospheric Boundary Layer Studies (GABLS) 1, 2 and 3 are used to develop a methodology for the design and testing of Reynolds-averaged Navier–Stokes (RANS) atmospheric boundary layer (ABL) models for wind energy applications. The first two GABLS cases are based on idealized boundary conditions and are suitable for verification purposes by comparing with results from higher-fidelity models based on large-eddy simulation. Results from three single-column RANS models, of 1st, 1.5th and 2nd turbulence closure order, show high consistency in predicting the mean flow. The third GABLS case is suitable for the study of these ABL models under realistic forcing such that validation versus observations from the Cabauw meteorological tower is possible. The case consists of a diurnal cycle that leads to a nocturnal low-level jet and addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface and the evaluation of model results with observations. The simulations are evaluated in terms of surface-layer fluxes and wind energy quantities of interest: rotor equivalent wind speed, hub-height wind direction, wind speed shear and wind direction veer. The characterization of mesoscale forcing is based on spatially and temporally averaged momentum budget terms from Weather Research and Forecasting (WRF) simulations. These mesoscale tendencies are used to drive single-column models, which were verified previously in the first two GABLS cases, to first demonstrate that they can produce similar wind profile characteristics to the WRF simulations even though the physics are more simplified. The added value of incorporating different forcing mechanisms into microscale models is quantified by systematically removing forcing terms in the momentum and heat equations. This mesoscale-to-microscale modeling approach is affected, to a large extent, by the input uncertainties of the mesoscale

  19. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Initiation of depleted uranium oxide and spent fuel testing for the spent fuel sabotage aerosol ratio program

    Energy Technology Data Exchange (ETDEWEB)

    Molecke, M.A.; Gregson, M.W.; Sorenson, K.B. [Sandia National Labs. (United States); Billone, M.C.; Tsai, H. [Argonne National Lab. (United States); Koch, W.; Nolte, O. [Fraunhofer Inst. fuer Toxikologie und Experimentelle Medizin (Germany); Pretzsch, G.; Lange, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (Germany); Autrusson, B.; Loiseau, O. [Inst. de Radioprotection et de Surete Nucleaire (France); Thompson, N.S.; Hibbs, R.S. [U.S. Dept. of Energy (United States); Young, F.I.; Mo, T. [U.S. Nuclear Regulatory Commission (United States)

    2004-07-01

    We provide a detailed overview of an ongoing, multinational test program that is developing aerosol data for some spent fuel sabotage scenarios on spent fuel transport and storage casks. Experiments are being performed to quantify the aerosolized materials plus volatilized fission products generated from actual spent fuel and surrogate material test rods, due to impact by a high energy density device, HEDD. The program participants in the U.S. plus Germany, France, and the U.K., part of the international Working Group for Sabotage Concerns of Transport and Storage Casks, WGSTSC, have strongly supported and coordinated this research program. Sandia National Laboratories, SNL, has the lead role for conducting this research program; test program support is provided by both the U.S. Department of Energy and the Nuclear Regulatory Commission. WGSTSC partners need this research to better understand potential radiological impacts from sabotage of nuclear material shipments and storage casks, and to support subsequent risk assessments, modeling, and preventative measures. We provide a summary of the overall, multi-phase test design and a description of all explosive containment and aerosol collection test components used. We focus on the recently initiated tests on 'surrogate' spent fuel, unirradiated depleted uranium oxide, and forthcoming actual spent fuel tests. The depleted uranium oxide test rodlets were prepared by the Institut de Radioprotection et de Surete Nucleaire, in France. These surrogate test rodlets closely match the diameter of the test rodlets of actual spent fuel from the H.B. Robinson reactor (high burnup PWR fuel) and the Surry reactor (lower, medium burnup PWR fuel), generated from U.S. reactors. The characterization of the spent fuels and fabrication into short, pressurized rodlets has been performed by Argonne National Laboratory, for testing at SNL. The ratio of the aerosol and respirable particles released from HEDD-impacted spent

  1. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
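
    The record does not spell out the Bayesian model itself; as a minimal sketch of the general idea of weighting 'soft' technical-justification evidence against 'hard' practical-trial evidence, the snippet below encodes a prior belief about the probability of detection as a Beta distribution and performs a conjugate update with trial outcomes. The Beta-Binomial form and all numbers are assumptions, not the model from the paper.

```python
from scipy.stats import beta

# 'Soft' evidence from the technical justification, encoded as a Beta prior
# on the probability of detection (prior mean 0.9 with modest weight).
a_prior, b_prior = 9.0, 1.0

# 'Hard' evidence from practical trials: detections out of defective test pieces.
detections, trials = 28, 30

# Conjugate Beta-Binomial update: posterior = Beta(a + hits, b + misses).
posterior = beta(a_prior + detections, b_prior + (trials - detections))

print(f"posterior mean POD         = {posterior.mean():.3f}")
print(f"5th-percentile (lower) POD = {posterior.ppf(0.05):.3f}")
```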

  2. Review of titanium dioxide nanoparticle phototoxicity: Developing a phototoxicity ratio to correct the endpoint values of toxicity tests

    Science.gov (United States)

    2015-01-01

    Titanium dioxide nanoparticles are photoactive and produce reactive oxygen species under natural sunlight. Reactive oxygen species can be detrimental to many organisms, causing oxidative damage, cell injury, and death. Most studies investigating TiO2 nanoparticle toxicity did not consider photoactivation and performed tests either in dark conditions or under artificial lighting that did not simulate natural irradiation. The present study summarizes the literature and derives a phototoxicity ratio between the results of nano-titanium dioxide (nano-TiO2) experiments conducted in the absence of sunlight and those conducted under solar or simulated solar radiation (SSR) for aquatic species. Therefore, the phototoxicity ratio can be used to correct endpoints of the toxicity tests with nano-TiO2 that were performed in absence of sunlight. Such corrections also may be important for regulators and risk assessors when reviewing previously published data. A significant difference was observed between the phototoxicity ratios of 2 distinct groups: aquatic species belonging to order Cladocera, and all other aquatic species. Order Cladocera appeared very sensitive and prone to nano-TiO2 phototoxicity. On average nano-TiO2 was 20 times more toxic to non-Cladocera and 1867 times more toxic to Cladocera (median values 3.3 and 24.7, respectively) after illumination. Both the median value and the 75th percentile of the phototoxicity ratio are chosen as the most practical values for the correction of endpoints of nano-TiO2 toxicity tests that were performed in dark conditions, or in the absence of sunlight. Environ Toxicol Chem 2015;34:1070–1077. © 2015 The Author. Published by SETAC. PMID:25640001
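
    As a hedged reading of the proposed correction (the abstract does not give an explicit formula), the snippet below divides an effect concentration measured in dark conditions by the group-specific median phototoxicity ratio quoted above, so that the corrected endpoint reflects the higher toxicity expected under sunlight. Treating the correction as a simple division, and the example EC50 value, are assumptions.

```python
# Median phototoxicity ratios reported in the abstract.
PHOTOTOXICITY_RATIO = {"cladocera": 24.7, "non_cladocera": 3.3}

def sunlight_corrected_endpoint(dark_endpoint, group):
    """Adjust an endpoint (e.g. an EC50 in mg/L) obtained without sunlight;
    dividing by the ratio lowers the endpoint, i.e. assumes greater toxicity
    under illumination (interpretation, not a quoted formula)."""
    return dark_endpoint / PHOTOTOXICITY_RATIO[group]

print(sunlight_corrected_endpoint(10.0, "cladocera"))      # hypothetical EC50
print(sunlight_corrected_endpoint(10.0, "non_cladocera"))
```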

  3. On the hypothesis-free testing of metabolite ratios in genome-wide and metabolome-wide association studies

    Directory of Open Access Journals (Sweden)

    Petersen Ann-Kristin

    2012-06-01

    Full Text Available Abstract Background Genome-wide association studies (GWAS) with metabolic traits and metabolome-wide association studies (MWAS) with traits of biomedical relevance are powerful tools to identify the contribution of genetic, environmental and lifestyle factors to the etiology of complex diseases. Hypothesis-free testing of ratios between all possible metabolite pairs in GWAS and MWAS has proven to be an innovative approach in the discovery of new biologically meaningful associations. The p-gain statistic was introduced as an ad-hoc measure to determine whether a ratio between two metabolite concentrations carries more information than the two corresponding metabolite concentrations alone. So far, only a rule of thumb was applied to determine the significance of the p-gain. Results Here we explore the statistical properties of the p-gain through simulation of its density and by sampling of experimental data. We derive critical values of the p-gain for different levels of correlation between metabolite pairs and show that B/(2*α) is a conservative critical value for the p-gain, where α is the level of significance and B the number of tested metabolite pairs. Conclusions We show that the p-gain is a well-defined measure that can be used to identify statistically significant metabolite ratios in association studies and provide a conservative significance cut-off for the p-gain for use in future association studies with metabolic traits.
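
    A small synthetic example of the p-gain and its conservative critical value B/(2*α) may help; the regression setup, effect sizes and sample size below are invented, and a real analysis would use the study's association models rather than a simple linear regression.

```python
import numpy as np
from scipy.stats import linregress

def assoc_pvalue(y, x):
    """Slope p-value of a simple linear regression y ~ x (illustration only)."""
    return linregress(x, y).pvalue

rng = np.random.default_rng(2)
n = 1000
genotype = rng.integers(0, 3, n).astype(float)          # additive coding 0/1/2
m1 = np.exp(0.1 * genotype + rng.normal(0.0, 0.3, n))   # metabolite 1
m2 = np.exp(-0.1 * genotype + rng.normal(0.0, 0.3, n))  # metabolite 2

p1 = assoc_pvalue(np.log(m1), genotype)
p2 = assoc_pvalue(np.log(m2), genotype)
p_ratio = assoc_pvalue(np.log(m1 / m2), genotype)

p_gain = min(p1, p2) / p_ratio                  # ratio adds information if large
alpha, B = 0.05, 1000                           # B = number of tested pairs
critical = B / (2 * alpha)                      # conservative cut-off from the paper
print(f"p-gain = {p_gain:.3g}, critical value = {critical:.3g}")
```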

  4. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

    Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best matching with those of the PQRI WG as decisions of both methods agreed in 75% of the 55 CI profile scenarios.

  5. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  6. Nuclear Power Plant Thermocouple Sensor-Fault Detection and Classification Using Deep Learning and Generalized Likelihood Ratio Test

    Science.gov (United States)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-06-01

    In this paper, an online fault detection and classification method is proposed for thermocouples used in nuclear power plants. In the proposed method, faults are detected by a classification method that separates the fault data from the normal data. A deep belief network (DBN), a deep learning technique, is applied to classify the fault data. The DBN has a multilayer feature extraction scheme, which is highly sensitive to small variations in the data. Since the classification method alone is unable to identify the faulty sensor, a technique is proposed to identify the faulty sensor from the fault data. Finally, a composite statistical hypothesis test, namely the generalized likelihood ratio test, is applied to compute the fault pattern of the faulty sensor signal based on the magnitude of the fault. The performance of the proposed method is validated using field data obtained from thermocouple sensors of the fast breeder test reactor.
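
    Only the last step of the pipeline lends itself to a short sketch: a generalized likelihood ratio test for a sensor bias of unknown magnitude in Gaussian residuals, which gives both a detection decision and an estimate of the fault size. This is a textbook GLRT, not the paper's implementation; the window length, noise level and threshold are assumptions, and the deep-belief-network classification stage is not shown.

```python
import numpy as np
from scipy.stats import chi2

def glrt_mean_shift(window, mu0=0.0, sigma=1.0, alpha=0.01):
    """GLRT for an unknown bias theta on a residual window.
    Under H0 (no bias) the statistic n*(xbar - mu0)^2 / sigma^2 ~ chi2(1)."""
    n = len(window)
    theta_hat = np.mean(window) - mu0          # MLE of the bias under H1
    stat = n * theta_hat**2 / sigma**2
    return bool(stat > chi2.ppf(1.0 - alpha, df=1)), theta_hat

rng = np.random.default_rng(3)
print(glrt_mean_shift(rng.normal(0.0, 1.0, 50)))   # healthy: expect no flag
print(glrt_mean_shift(rng.normal(0.8, 1.0, 50)))   # biased thermocouple: flagged
```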

  7. A comparison between hyomental distance ratios, ratio of height to thyromental distance, modified Mallampati classification test and upper lip bite test in predicting difficult laryngoscopy in patients undergoing general anesthesia

    Directory of Open Access Journals (Sweden)

    Azim Honarmand

    2014-01-01

    Full Text Available Background: Failed intubation is an important source of anesthesia-related patient mortality. The aim of the present study was to compare the ability to predict difficult visualization of the larynx from the following pre-operative airway predictive indices, in isolation and combination: modified Mallampati test (MMT), the ratio of height to thyromental distance (RHTMD), hyomental distance ratios (HMDR), and the upper-lip-bite test (ULBT). Materials and Methods: We collected data on 525 consecutive patients scheduled for elective surgery under general anesthesia requiring endotracheal intubation and then evaluated all four factors before surgery. A skilled anesthesiologist, blinded to the pre-operative airway assessments, performed the laryngoscopy and grading (as per Cormack and Lehane's classification). Sensitivity, specificity, and positive predictive value for every airway predictor in isolation and in combination were established. Results: The most sensitive of the single tests was the ULBT, with a sensitivity of 90.2%. The hyomental distance at the extreme of head extension was the least sensitive of the single tests, with a sensitivity of 56.9%. The HMDR had a sensitivity of 86.3%. The ULBT had the highest negative predictive value and the largest area under the receiver-operating characteristic curve (AUC of the ROC curve) among the single predictors. The AUC of the ROC curve for the ULBT, HMDR and RHTMD was significantly larger than for the MMT (P < 0.05). Conclusion: The HMDR is comparable with the RHTMD and ULBT for prediction of difficult laryngoscopy in the general population, and all three performed significantly better than the MMT.
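
    For readers unfamiliar with how such indices are obtained, the sketch below computes sensitivity, specificity and positive predictive value from a 2x2 cross-tabulation of predicted versus observed difficult laryngoscopy, and an ROC AUC for a continuous score; the data are synthetic, not the study's, and the 10% difficulty rate and cut-off are assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def predictive_indices(predicted, observed):
    """Sensitivity, specificity and PPV of a dichotomous airway predictor."""
    p, o = np.asarray(predicted, bool), np.asarray(observed, bool)
    tp, fp = np.sum(p & o), np.sum(p & ~o)
    fn, tn = np.sum(~p & o), np.sum(~p & ~o)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp)}

rng = np.random.default_rng(4)
difficult = rng.random(525) < 0.10                    # observed difficult laryngoscopy
score = rng.normal(0.0, 1.0, 525) + 1.5 * difficult   # continuous predictor
predicted = score > 1.0                               # dichotomised at a cut-off

print(predictive_indices(predicted, difficult))
print("AUC =", round(roc_auc_score(difficult, score), 3))
```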

  8. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    OpenAIRE

    Matha, Denis; Sandner, Frank; Molins i Borrell, Climent; Campos Hortigüela, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provide...

  9. Proceedings of International monitoring conference 'Development of rehabilitation methodology of environment of the Semipalatinsk region polluted by nuclear tests'

    International Nuclear Information System (INIS)

    2002-01-01

    The aim of the monitoring conference is to draw the attention of governments, national and international agencies, scientific societies, and local administrations to the ecological problems of the Semipalatinsk nuclear test site, and to combine the efforts of scientists to solve the problems of soil decontamination and of purification of surface and ground water from radioactive substances and heavy metals. It is expected that the knowledge, experience and methodology accumulated at the monitoring conference might be successfully transferred to solve analogous environmental problems of Kazakhstan

  10. Performance of a high-work, low-aspect-ratio turbine stator tested with a realistic inlet radial temperature gradient

    Science.gov (United States)

    Stabe, Roy G.; Schwab, John R.

    1991-01-01

    A 0.767-scale model of a turbine stator designed for the core of a high-bypass-ratio aircraft engine was tested with uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The principal measurements were radial and circumferential surveys of stator-exit total temperature, total pressure, and flow angle. The stator-exit flow field was also computed by using a three-dimensional Navier-Stokes solver. Other than temperature, there were no apparent differences in performance due to the inlet conditions. The computed results compared quite well with the experimental results.

  11. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT...
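
    These records report time in therapeutic range (TTR) without naming the algorithm; the Rosendaal linear-interpolation method sketched below is the usual choice and is offered only as an assumed illustration, with invented INR measurements and a standard 2.0-3.0 target range.

```python
import numpy as np

def ttr_rosendaal(days, inr, low=2.0, high=3.0):
    """Percent time in therapeutic range by Rosendaal linear interpolation:
    the INR is assumed to change linearly between consecutive measurements,
    and the in-range fraction of each interval is accumulated (here by
    finely sampling each segment, which is adequate for a sketch)."""
    in_range_days = 0.0
    for d0, d1, y0, y1 in zip(days[:-1], days[1:], inr[:-1], inr[1:]):
        segment = np.linspace(y0, y1, 1000)
        in_range_days += (d1 - d0) * np.mean((segment >= low) & (segment <= high))
    return 100.0 * in_range_days / (days[-1] - days[0])

days = np.array([0, 14, 28, 49, 70])
inr = np.array([2.4, 3.4, 2.8, 1.8, 2.5])
print(f"TTR = {ttr_rosendaal(days, inr):.1f} %")
```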

  12. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality. FUNDING: The study received financial support from the Sarah Krabbe Foundation, the General Practitioners' Education and Development Foundation...

  13. Design Feature and Prototype Testing Methodology of DHIC's Nuclear I and C System

    International Nuclear Information System (INIS)

    Kim, K.H.; Baeg, S.Y.; Kim, S.A.; Lee, S.J.; Yoon, S.P.; Park, C.Y.

    2011-01-01

    The DHIC has developed an I and C system for a nuclear power plant through a Korean Government R and D project since 2001. This I and C system was designed and implemented to be applied to the new 1400 MW nuclear power plant of KHNP. The system design is based on the class-1E PLC platform and the non-class-1E DCS platform. The PPS, the ESF-CCS, the RCOPS, the QIAS-P/N, the PCS, the NPCS, the P-CCS and the NIMS were designed, implemented and tested. The R and D project has been conducted under a systematic and guided QA plan, but it is not easy to apply its results directly to a new NPP such as Shin-Ulchin 1 and 2. To resolve these first-application concerns, a new approach of integrated performance testing was adopted. A main control room for a verification test facility was constructed; it features a compact, video-based man-machine interface. The MCR includes five operation consoles and a Large Display Panel. A test system for the verification test facility is implemented to be as similar as possible to the control and protection systems of SUN 1 and 2. Integration-level tests such as a system test, an interface test, an MMI test, a system function/performance test, a failure mode test, a response time test, a network load test, an alarm test, a reactor power cutback system test, a unit load transient test and a scenario test were performed using the prototype test facilities. These kinds of testing can verify and pre-validate the integrated I and C system's performance and flexibility. They can also provide implementation training before construction and minimize the trial and error otherwise encountered at the site. (author)

  14. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution

    International Nuclear Information System (INIS)

    Tregidgo, Daniel J.; West, Sarah E.; Ashmore, Mike R.

    2013-01-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. -- Highlights: •We investigated the validity of a simplified citizen science methodology. •Lichen abundance data were used to indicate nitrogenous air pollution. •Significant changes were detected beside busy roads with low background pollution. •The methodology detected major, but not subtle, contrasts in pollution. •Sensitivity of citizen science methods to environmental change must be evaluated. -- A simplified lichen biomonitoring method used for citizen science can detect the impact of nitrogenous air pollution from local roads

  15. A Methodology for Evaluation of Inservice Test Intervals for Pumps and Motor Operated Valves

    International Nuclear Information System (INIS)

    McElhaney, K.L.

    1999-01-01

    The nuclear industry has begun efforts to reevaluate inservice tests (ISTs) for key components such as pumps and valves. At issue are two important questions--What kinds of tests provide the most meaningful information about component health, and what periodic test intervals are appropriate? In the past, requirements for component testing were prescribed by the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code. The tests and test intervals specified in the Code were generic in nature and test intervals were relatively short. Operating experience has shown, however, that performance and safety improvements and cost savings could be realized by tailoring IST programs to similar components with comparable safety importance and service conditions. In many cases, test intervals may be lengthened, resulting in cost savings for utilities and their customers

  16. Facilitating the Interpretation of English Language Proficiency Scores: Combining Scale Anchoring and Test Score Mapping Methodologies

    Science.gov (United States)

    Powers, Donald; Schedl, Mary; Papageorgiou, Spiros

    2017-01-01

    The aim of this study was to develop, for the benefit of both test takers and test score users, enhanced "TOEFL ITP"® test score reports that go beyond the simple numerical scores that are currently reported. To do so, we applied traditional scale anchoring (proficiency scaling) to item difficulty data in order to develop performance…

  17. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    Science.gov (United States)

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  18. Methodology comparison for gamma-heating calculations in material-testing reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A. [CEA, DEN, DER, Cadarache F-13108 Saint Paul les Durance (France); Reynard-Carette, C. [Aix Marseille Universite, CNRS, Universite de Toulon, IM2NP UMR 7334, 13397, Marseille (France)

    2015-07-01

    The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for the temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for physical models describing the behavior of materials, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform an advanced suitable analysis of future experimental results. From a broader point of view, the global attractiveness of the JHR as an MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating with well-controlled bias and associated uncertainty as low as possible is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand, and bias coming from physical approximations assumed by computer codes and by the general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. In this presentation, we focus on this latter kind of bias. Nuclear

  19. Leach test methodology for the Waste/Rock Interactions Technology Program

    International Nuclear Information System (INIS)

    Bradley, D.J.; McVay, G.L.; Coles, D.G.

    1980-05-01

    Experimental leach studies in the WRIT Program have two primary functions. The first is to determine radionuclide release from waste forms in laboratory environments which attempt to simulate repository conditions. The second is to elucidate leach mechanisms which can ultimately be incorporated into nearfield transport models. The tests have been utilized to generate rates of removal of elements from various waste forms and to provide specimens for surface analysis. Correlation between constituents released to the solution and corresponding solid state profiles is invaluable in the development of a leach mechanism. Several test methods are employed in our studies which simulate various proposed leach incident scenarios. Static tests include low temperature (below 100°C) and high temperature (above 100°C) hydrothermal tests. These tests reproduce nonflow or low-flow repository conditions and can be used to compare materials and leach solution effects. The dynamic tests include single-pass, continuous-flow (SPCF) and solution-change (IAEA)-type tests in which the leach solutions are changed at specific time intervals. These tests simulate repository conditions of higher flow rates and can also be used to compare materials and leach solution effects under dynamic conditions. The modified IAEA test is somewhat simpler to use than the one-pass flow test and gives adequate results for comparative purposes. The static leach test models the condition of near-zero flow in a repository and provides information on element readsorption and solubility limits. The SPCF test is used to study the effects of flowing solutions at velocities that may be anticipated for geologic groundwaters within breached repositories. These two testing methods, coupled with the use of autoclaves, constitute the current thrust of WRIT leach testing.

  20. Decision making about healthcare-related tests and diagnostic test strategies. Paper 2: a review of methodological and practical challenges.

    Science.gov (United States)

    Mustafa, Reem A; Wiercioch, Wojtek; Cheung, Adrienne; Prediger, Barbara; Brozek, Jan; Bossuyt, Patrick; Garg, Amit X; Lelgemann, Monika; Büehler, Diedrich; Schünemann, Holger J

    2017-12-01

    In this first of a series of five articles, we provide an overview of how and why healthcare-related tests and diagnostic strategies are currently applied. We also describe how our findings can be integrated with existing frameworks for making decisions that guide the use of healthcare-related tests and diagnostic strategies. We searched MEDLINE, references of identified articles, chapters in relevant textbooks, and identified articles citing classic literature on this topic. We provide updated frameworks for the potential roles and applications of tests with suggested definitions and practical examples. We also discuss study designs that are commonly used to assess tests' performance and the effects of tests on people's health. These designs include diagnostic randomized controlled trials and retrospective validation. We describe the utility of these and other currently suggested designs, which questions they can answer and which ones they cannot. In addition, we summarize the challenges unique to decision-making resulting from the use of tests. This overview highlights current challenges in the application of tests in decision-making in healthcare, provides clarifications, and informs the proposed solutions. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments

    Science.gov (United States)

    2016-03-24

    [Extraction residue from the report's list of symbols and abbreviations omitted.] 1. INTRODUCTION: Magneto-induction (MI) effects in soft... axial magnetic field is utilized to excite the fiber. Previous investigators have demonstrated this effect with small coils applied directly to the

  2. (Re)evaluating the Implications of the Autoregressive Latent Trajectory Model Through Likelihood Ratio Tests of Its Initial Conditions.

    Science.gov (United States)

    Ou, Lu; Chow, Sy-Miin; Ji, Linying; Molenaar, Peter C M

    2017-01-01

    The autoregressive latent trajectory (ALT) model synthesizes the autoregressive model and the latent growth curve model. The ALT model is flexible enough to produce a variety of discrepant model-implied change trajectories. While some researchers consider this a virtue, others have cautioned that this may confound interpretations of the model's parameters. In this article, we show that some, but not all, of these interpretational difficulties may be clarified mathematically and tested explicitly via likelihood ratio tests (LRTs) imposed on the initial conditions of the model. We show analytically the nested relations among three variants of the ALT model and the constraints needed to establish equivalences. A Monte Carlo simulation study indicated that LRTs, particularly when used in combination with information criterion measures, can allow researchers to test targeted hypotheses about the functional forms of the change process under study. We further demonstrate when and how such tests may justifiably be used to facilitate our understanding of the underlying process of change using a subsample (N = 3,995) of longitudinal family income data from the National Longitudinal Survey of Youth.
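
    As a generic illustration of the likelihood ratio tests used to compare nested model variants (not the authors' code), the snippet below turns two fitted log-likelihoods into an LR statistic and a chi-square p-value. The log-likelihood values and degrees-of-freedom difference are hypothetical, and constraints that sit on a boundary of the parameter space would require an adjusted reference distribution.

```python
from scipy.stats import chi2

def likelihood_ratio_test(loglik_restricted, loglik_full, df_diff):
    """Classic LRT for nested models: the restricted specification (e.g. an
    ALT variant with constrained initial conditions) is nested in the full one."""
    lr_stat = 2.0 * (loglik_full - loglik_restricted)
    p_value = chi2.sf(lr_stat, df_diff)
    return lr_stat, p_value

# Hypothetical fitted log-likelihoods from two nested specifications.
stat, p = likelihood_ratio_test(loglik_restricted=-4321.7,
                                loglik_full=-4315.2, df_diff=2)
print(f"LR statistic = {stat:.2f}, p = {p:.4f}")
```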

  3. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experience gained during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report mainly concern two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations.

  5. Methodological aspects of creating a radiological 'passport' of the former Semipalatinsk nuclear test site

    International Nuclear Information System (INIS)

    Dubasov, Yu.V.; Smagulov, S.G.; Tukhvatulin, Sh.T.

    2002-01-01

    During its existence, 456 nuclear tests were carried out at the Semipalatinsk Test Site - 30 at the ground surface, 86 in the atmosphere and 340 underground. Radioactive fallout from ground surface tests is responsible for the present radiation conditions within the 'Test Field'. The radiation situation in the Degelen Mountains is caused by 209 underground tests carried out in local tunnels. Within the former Test Site there are three large and several small zones to which general access is prohibited for public health reasons: the 'Test Field', the Degelen Mountains, lake Shagan, the rim of the lake, and the adjacent land to the north. The information and characteristics which have to be included in the radiological passport of the former Semipalatinsk Test Site are discussed, along with general information about the Semipalatinsk site, its administrative status, the population distribution throughout the territory, all the economic activities taking place within the territory, the zones and structures representing a radiation hazard, the radiohydrogeological conditions of the test site and the adjacent regions, biogenic conditions (topography, soil, vegetation), wildlife, fauna monitoring, etc. (author)

  6. Larval development ratio test with the calanoid copepod Acartia tonsa as a new bioassay to assess marine sediment quality.

    Science.gov (United States)

    Buttino, Isabella; Vitiello, Valentina; Macchia, Simona; Scuderi, Alice; Pellegrini, David

    2018-03-01

    The copepod Acartia tonsa was used as a model species to assess marine sediment quality. Acute and chronic bioassays, such as the larval development ratio (LDR) test, and different end-points were evaluated. As a pelagic species, A. tonsa is mainly exposed to water-soluble toxicants and bioassays are commonly performed in seawater. However, an interaction of A. tonsa eggs and the first larval stages with marine sediments might occur in shallow water environments. Here we tested two different LDR protocols by incubating A. tonsa eggs in elutriates and sediments coming from two areas located in the Tuscany Region (Central Italy): Livorno harbour and the Viareggio coast. The end-points analyzed were larval mortality (LM) and development inhibition (DI), expressed as the percentage of copepods that completed the metamorphosis from nauplius to copepodite. The aims of this study were: i) to verify the suitability of the A. tonsa copepod for the bioassay with sediment and ii) to compare the sensitivity of A. tonsa exposed to different matrices, such as water and sediment. A preliminary acute test was also performed. Acute tests showed the highest toxicity for Livorno's samples (two out of three) compared to the Viareggio samples, for which no effect was observed. On the contrary, LDR tests with sediments and elutriates revealed some toxic effects also for Viareggio's samples. The results were discussed with regard to the chemical characterization of the samples. Our results indicated that different end-points were affected in A. tonsa, depending on the matrices to which the copepods were exposed and on the test used. Bioassays with elutriates and sediments are suggested and the LDR test could help decision-makers to identify a more appropriate management of dredging materials. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    Penetration testing is a method for evaluating the cyber security of NPPs, and the approach has been applied in several studies. Because those studies focused on vulnerability finding or test-bed construction, a scenario-based approach was not pursued. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in a penetration test because they were derived from past cyber-attack incidents at NPPs. That is, they covered only scenarios that had already happened, so they could neither cover other possible scenarios nor be reflected in a penetration test. In this study, a method for developing a cyber-attack penetration test scenario for NPPs, focused in particular on plant safety, is suggested. In the proposed methodology, the goal of the attacker is taken to be deterioration of nuclear fuel integrity, and Level 1 PSA results are used to reflect plant safety in the security assessment: the basic events are post-processed, and possible cyber-attacks are reviewed against vulnerabilities of the digital control system.
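
    The record describes this scenario-development workflow only in prose. The sketch below is a purely illustrative rendering of the final step under stated assumptions: the basic-event identifiers, component names, and vulnerability entries are hypothetical, and a real analysis would start from actual Level 1 PSA cut sets and a plant-specific vulnerability survey.

    ```python
    # Illustrative sketch only: pair Level 1 PSA basic events realized in digital
    # I&C with known vulnerabilities to form candidate penetration-test scenarios
    # whose end state is fuel integrity deterioration. All entries are hypothetical.

    # Hypothetical basic events extracted from PSA minimal cut sets.
    basic_events = [
        {"id": "BE-AFW-PUMP-FAIL", "component": "aux_feedwater_controller", "digital": True},
        {"id": "BE-SG-LEVEL-SENSOR", "component": "sg_level_transmitter", "digital": True},
        {"id": "BE-MOV-MECH-STUCK", "component": "motor_operated_valve", "digital": False},
    ]

    # Hypothetical vulnerability survey of the digital control system.
    vulnerabilities = {
        "aux_feedwater_controller": ["unauthenticated setpoint write"],
        "sg_level_transmitter": ["spoofed fieldbus frames"],
    }

    def candidate_scenarios(events, vulns):
        """Pair digital basic events with matching vulnerabilities."""
        scenarios = []
        for ev in events:
            if not ev["digital"]:
                continue  # purely mechanical failures cannot be induced by cyber-attack
            for vuln in vulns.get(ev["component"], []):
                scenarios.append({"basic_event": ev["id"],
                                  "target": ev["component"],
                                  "attack_vector": vuln})
        return scenarios

    for s in candidate_scenarios(basic_events, vulnerabilities):
        print(s)
    ```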

  8. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    International Nuclear Information System (INIS)

    Lee, In Hyo; Son, Han Seong; Kim, Si Won; Kang, Hyun Gook

    2016-01-01

    Penetration testing is a method for evaluating the cyber security of NPPs, and the approach has been applied in several studies. Because those studies focused on vulnerability finding or test-bed construction, a scenario-based approach was not pursued. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in a penetration test because they were derived from past cyber-attack incidents at NPPs. That is, they covered only scenarios that had already happened, so they could neither cover other possible scenarios nor be reflected in a penetration test. In this study, a method for developing a cyber-attack penetration test scenario for NPPs, focused in particular on plant safety, is suggested. In the proposed methodology, the goal of the attacker is taken to be deterioration of nuclear fuel integrity, and Level 1 PSA results are used to reflect plant safety in the security assessment: the basic events are post-processed, and possible cyber-attacks are reviewed against vulnerabilities of the digital control system.

  9. Reactor building integrity testing: A novel approach at Gentilly 2 - principles and methodology

    International Nuclear Information System (INIS)

    Collins, N.; Lafreniere, P.

    1991-01-01

    In 1987, Hydro-Quebec embarked on an ambitious development program to provide the Gentilly 2 nuclear power station with an effective, yet practical reactor building Integrity Test. The Gentilly 2 Integrity Test employs an innovative approach based on the reference volume concept, identified as the Temperature Compensation Method (TCM) System. This configuration has been demonstrated at both high and low test pressure and has achieved extraordinary precision in the leak rate measurement. The Gentilly 2 design allows the Integrity Test to be performed at a nominal 3 kPa(g) test pressure during an 11-hour period with the reactor at full power. The reactor building Pressure Test, by comparison, is typically performed at high pressure (124 kPa(g)) in a 7-day window during an annual outage. The Integrity Test was developed with the goal of demonstrating containment availability; specifically, it was intended to detect a leak or hole in the 'bottled-up' reactor building greater in magnitude than an equivalent pipe of 25 mm diameter. However, it is considered feasible that the high precision of the Gentilly 2 TCM System Integrity Test and a stable reactor building leak characteristic will constitute sufficient grounds for a reduction of the Pressure Test frequency. It is noted that only the TCM System has, to date, allowed a relevant determination of the reactor building leak rate at a nominal test pressure of 3 kPa(g). Classical-method tests at low pressure have led to inconclusive results due to their lack of precision
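
    The abstract does not give the TCM equations. The following is a minimal, generic sketch of temperature-compensated (mass-point style) leak-rate estimation from pressure and temperature histories, assuming ideal-gas behaviour and an assumed free volume; the numbers are invented and this is not the Gentilly 2 algorithm.

    ```python
    import numpy as np

    R_AIR = 287.05        # J/(kg*K), specific gas constant for dry air (assumption)
    VOLUME_M3 = 48_000.0  # assumed containment free volume, illustrative only

    # Hypothetical hourly readings over an 11-hour hold at ~3 kPa(g) above atmospheric.
    t_hours = np.arange(0, 12)
    p_abs_kpa = 104.3 - 0.004 * t_hours          # absolute pressure, slow decay
    temp_k = 295.0 + 0.05 * np.sin(t_hours / 2)  # compensated bulk air temperature

    # Contained air mass from the ideal gas law: m = p*V/(R*T)
    mass_kg = (p_abs_kpa * 1e3) * VOLUME_M3 / (R_AIR * temp_k)

    # Leak rate = slope of mass vs time (least-squares fit), expressed per day.
    slope_kg_per_h = np.polyfit(t_hours, mass_kg, 1)[0]
    leak_rate_pct_per_day = -slope_kg_per_h * 24.0 / mass_kg[0] * 100.0
    print(f"estimated leak rate: {leak_rate_pct_per_day:.3f} % of contained mass per day")
    ```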

  10. Accelerated Testing Methodology for the Determination of Slow Crack Growth of Advanced Ceramics

    Science.gov (United States)

    Choi, Sung R.; Salem, Jonathan A.; Gyekenyesi, John P.

    1997-01-01

    Constant stress-rate (dynamic fatigue) testing has been used for several decades to characterize the slow crack growth behavior of glass and ceramics at both ambient and elevated temperatures. The advantage of constant stress-rate testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant crosshead speed or constant loading rate. The slow crack growth parameters (n and A) required for design can be estimated from the relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, an appreciable saving of test time can be achieved. If a preload corresponding to 50% of the strength is applied to the specimen prior to testing, 50% of the test time can be saved, as long as the strength remains unchanged regardless of the applied preload. In fact, it has been a common, empirical practice in the strength testing of ceramics or optical fibers to apply some preloading (less than 40%). The purpose of this work is to study the effect of preloading on strength and to lay a theoretical foundation for this empirical practice. For this purpose, analytical and numerical solutions of strength as a function of preloading were developed. To verify the solutions, constant stress-rate testing was conducted using glass and alumina at room temperature, and alumina, silicon nitride, and silicon carbide at elevated temperatures, over a range of preloads from 0 to 90%.
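
    The strength-stress-rate relationship referred to above is commonly written with log(strength) varying linearly with log(stress rate) with slope 1/(n+1). A minimal sketch of estimating n from such data is shown below; the strength values are invented for illustration, not taken from the report.

    ```python
    import numpy as np

    # Hypothetical strengths (MPa) measured at four stress rates (MPa/s); values are
    # illustrative only.
    stress_rate = np.array([0.1, 1.0, 10.0, 100.0])
    strength = np.array([310.0, 335.0, 362.0, 391.0])

    # Fit log(strength) = (1/(n+1)) * log(stress_rate) + log(D)
    slope, intercept = np.polyfit(np.log10(stress_rate), np.log10(strength), 1)
    n = 1.0 / slope - 1.0
    D = 10 ** intercept
    print(f"slow crack growth exponent n ≈ {n:.1f}, intercept D ≈ {D:.1f} MPa")
    ```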

  11. Qualification methodologies for mechanical component, I and C, piping using test lab

    International Nuclear Information System (INIS)

    Ichikawa, Toshio

    2001-01-01

    There are many methods of verifying the strength, function, and vibration characteristics of a structure: seismic tests that verify the function of a single component during an earthquake (tests that check durability according to component type); verification of an analysis technique with a scale model, using the scale-model results to check the strength under plant operating conditions; and construction of a full-size model of the actual plant so that strength and function can be verified directly. A seismic test is constrained by the frequency range of the evaluation target and by the capability of the actuator equipment; other restrictions include the size of the shaking table, actuation power, environment, etc. Here, further examples are introduced, such as evaluation by testing combined with analysis, combined use of experimental testing and analysis, and proof by test alone, so that the seismic verification method by testing can be learned in this lecture. Typical examples are explained. Based on seismic test results obtained with experimental research equipment, it is explained how to verify that the required function of components, such as reactor internals structures, is maintained during an earthquake. In this case, differences between the simulated environment of the model in a test, the earthquake conditions simulated by the shaking table under test conditions, and the actual plant conditions are important for determining the evaluation method. Nuclear equipment includes components required to perform static functions, such as holding the pressure boundary against the high-temperature, high-pressure flow inside apparatus and piping, and dynamic functions, such as operation of valves and pumps and insertion of control rods. Moreover, in order to maintain safe operation and achieve safe shutdown, there are I and C systems for controlling and supervising these components. In order for this functional maintenance

  12. Accelerated Testing Methodology Developed for Determining the Slow Crack Growth of Advanced Ceramics

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.

    1998-01-01

    Constant stress-rate ("dynamic fatigue") testing has been used for several decades to characterize the slow crack growth behavior of glass and structural ceramics at both ambient and elevated temperatures. The advantage of such testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant displacement or loading rate. The slow crack growth parameters required for component design can be estimated from a relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, test time can be reduced appreciably. If a preload corresponding to 50 percent of the strength is applied to the specimen prior to testing, 50 percent of the test time can be saved as long as the applied preload does not change the strength. In fact, it has been a common, empirical practice in the strength testing of ceramics or optical fibers to apply some preloading (<40 percent). The purpose of this work at the NASA Lewis Research Center is to study the effect of preloading on measured strength in order to add a theoretical foundation to the empirical practice.

  13. Analysis and implementation of software testing in an agile development methodology

    OpenAIRE

    Pinheiro, Sérgio Agostinho Machado

    2015-01-01

    Master's dissertation in Systems Engineering. This dissertation presents the study and implementation of software testing in agile development. Software testing is increasingly important for companies that develop software, owing to the natural evolution of customer requirements. Faced with the need to meet customer expectations, F3M Information Systems, SA felt it should improve its testing practices. Based on the methodology of...

  14. Zero crossing and ratio spectra derivative spectrophotometry for the dissolution tests of amlodipine and perindopril in their fixed dose formulations

    Directory of Open Access Journals (Sweden)

    Maczka Paulina

    2014-06-01

    Full Text Available Dissolution tests of amlodipine and perindopril from their fixed-dose formulations were performed in 900 mL of phosphate buffer of pH 5.5 at 37°C using the paddle apparatus. Then, two simple and rapid derivative spectrophotometric methods were used for the quantitative measurement of amlodipine and perindopril. The first method was zero-crossing first-derivative spectrophotometry, in which amplitudes were measured at 253 nm for amlodipine and at 229 nm for perindopril. The second method was ratio derivative spectrophotometry, in which spectra of amlodipine over the linearity range were divided by one selected standard spectrum of perindopril and amplitudes at 242 nm were then measured. Similarly, spectra of perindopril were divided by one selected standard spectrum of amlodipine and amplitudes at 298 nm were then measured. Both methods were validated to meet official requirements and were demonstrated to be selective, precise and accurate. Since there is no official monograph for these drugs in binary formulations, the dissolution tests and quantification procedure presented here can be used as a quality control test for amlodipine and perindopril in the respective dosage forms.
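
    The record describes the ratio-derivative step only in words. The following numpy sketch illustrates that step with synthetic spectra standing in for real absorbance data; the wavelength read-out at 242 nm follows the abstract, while the spectra themselves and their shapes are assumptions.

    ```python
    import numpy as np

    # Synthetic 1-nm wavelength grid and made-up absorbance spectra (illustrative only).
    wavelengths = np.arange(200, 351)  # nm
    def gaussian(center, width, height):
        return height * np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    amlodipine_mix = gaussian(238, 12, 0.60)    # amlodipine contribution in a mixture
    perindopril_mix = gaussian(215, 10, 0.40)   # perindopril contribution in a mixture
    mixture = amlodipine_mix + perindopril_mix  # measured mixture spectrum
    perindopril_std = gaussian(215, 10, 0.50)   # one selected perindopril standard

    # Ratio spectrum: divide the mixture by the standard spectrum of the other analyte,
    # then take the first derivative with respect to wavelength.
    ratio = mixture / perindopril_std
    ratio_derivative = np.gradient(ratio, wavelengths)

    # Read the amplitude at 242 nm, where (per the abstract) amlodipine is quantified.
    amplitude_242 = ratio_derivative[np.searchsorted(wavelengths, 242)]
    print(f"first-derivative ratio amplitude at 242 nm: {amplitude_242:.4f}")
    ```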

  15. Gust load alleviation wind tunnel tests of a large-aspect-ratio flexible wing with piezoelectric control

    Directory of Open Access Journals (Sweden)

    Ying Bi

    2017-02-01

    Full Text Available An active control technique utilizing piezoelectric actuators to alleviate gust-response loads of a large-aspect-ratio flexible wing is investigated. Piezoelectric materials have been extensively used for active vibration control of engineering structures; in this paper they are further employed to suppress the vibration of an aeroelastic wing caused by gusts. The equation of motion of the flexible wing with piezoelectric patches is obtained by Hamilton's principle with the modal approach, and numerical gust responses are then analyzed, based on which a gust load alleviation (GLA) control system is proposed. The GLA system employs classic proportional-integral-derivative (PID) controllers which treat the piezoelectric patches as control actuators and acceleration as the feedback signal. By a numerical method, the mechanism by which piezoelectric actuators can alleviate gust-response loads is also analyzed qualitatively. Furthermore, through low-speed wind tunnel tests, the effectiveness of the GLA active control technology is validated. The test results agree well with the numerical results and show that, over a certain frequency range, the control scheme can effectively alleviate the z- and x-direction wingtip accelerations and the root bending moment of the wing to a certain extent. The control system gives satisfactory gust load alleviation efficacy, with the reduction rate generally over 20%.
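
    The abstract names classic PID control of piezoelectric actuators with acceleration feedback but gives no gains or code. The sketch below is a generic discrete PID loop with made-up gains and sample time, showing only the control-law structure, not the paper's tuned controller.

    ```python
    class PID:
        """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error):
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Feedback signal: measured wingtip acceleration; setpoint is zero acceleration.
    # Gains and sample time are arbitrary placeholders, not the paper's values.
    controller = PID(kp=0.8, ki=0.1, kd=0.02, dt=0.001)
    measured_acceleration = 0.35          # m/s^2, hypothetical sensor sample
    actuator_command = controller.update(0.0 - measured_acceleration)
    print(f"piezoelectric actuator command: {actuator_command:.3f} (arbitrary units)")
    ```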

  16. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed-effect model fits is currently recommended for covariate identification, whereas individual empirical Bayes estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error of the LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches reported previously, and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate acting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false-positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause of the decrease in power or the inflated false-positive rate, although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis than the LRT. We propose a three-step covariate modeling approach for population PK analysis that utilizes the advantages of EBEs while overcoming their shortcomings, which not only markedly reduces the run time for population PK analysis but also provides more accurate covariate tests.
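
    The record compares the two covariate tests in words only. As a rough sketch (not the authors' NONMEM workflow), the LRT compares the objective function values of nested mixed-effect fits against a chi-square distribution, while the EBE approach regresses individual clearance estimates on the covariate; all numbers below are synthetic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # --- LRT: compare -2*log-likelihood of nested fits (objective function values
    # assumed here, e.g. as reported by a mixed-effect modeling tool) ---
    ofv_base = 2510.4       # base model without the covariate (hypothetical)
    ofv_covariate = 2503.1  # model with covariate on clearance (hypothetical)
    lrt_stat = ofv_base - ofv_covariate          # difference in -2LL
    p_lrt = stats.chi2.sf(lrt_stat, df=1)        # one extra parameter
    print(f"LRT: stat = {lrt_stat:.2f}, p = {p_lrt:.4f}")

    # --- EBE approach: regress individual empirical Bayes clearance estimates on weight ---
    weight = rng.normal(70, 12, size=100)
    ebe_clearance = 5.0 * (weight / 70) ** 0.4 * rng.lognormal(0.0, 0.15, size=100)
    slope, intercept, r, p_ebe, se = stats.linregress(np.log(weight / 70), np.log(ebe_clearance))
    print(f"EBE regression: exponent ≈ {slope:.2f}, p = {p_ebe:.4f}")
    ```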

  17. Test and evaluation about damping characteristics of hanger supports for nuclear power plant piping systems (Seismic Damping Ratio Evaluation Program)

    International Nuclear Information System (INIS)

    Shibata, H.; Ito, A.; Tanaka, K.; Niino, T.; Gotoh, N.

    1981-01-01

    Generally, damping of structures and equipment is caused by very complex energy-dissipation processes. In particular, since piping systems are composed of many components, it is very difficult to evaluate the damping characteristics of such systems theoretically. On the other hand, the damping value used in the aseismic design of nuclear power plants is a very important design factor, because it determines the seismic response loads of structures, equipment and piping systems. The very extensive studies entitled SDREP (Seismic Damping Ratio Evaluation Program) were performed to establish proper damping values for the seismic design of piping, as a joint effort among a university, electric companies and plant manufacturers. In SDREP, systematic vibration tests were conducted to investigate the factors that may contribute to the damping characteristics of piping systems and to supplement the data from pre-operational tests. This study concerns the component damping characteristics tests of that program. Its objective is to clarify the damping characteristics and mechanisms of hanger supports used in piping systems, and to establish a technique for evaluating the energy dissipated at hanger support points and its contribution to the total damping of the piping system. (orig./WL)
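
    The abstract does not state how damping ratios were extracted from the test records. One common reduction for free-decay vibration signals, shown here only as a generic sketch with synthetic data, is the logarithmic decrement; the actual SDREP data processing may differ.

    ```python
    import numpy as np

    # Synthetic free-decay response of a lightly damped single-DOF system (illustrative).
    zeta_true, f_n = 0.02, 5.0                     # damping ratio and natural frequency (Hz)
    t = np.linspace(0, 2, 4000)
    omega_d = 2 * np.pi * f_n * np.sqrt(1 - zeta_true ** 2)
    x = np.exp(-zeta_true * 2 * np.pi * f_n * t) * np.cos(omega_d * t)

    # Successive positive peaks, one per period of the damped oscillation.
    period = 2 * np.pi / omega_d
    peaks = [x[np.argmin(np.abs(t - k * period))] for k in range(8)]

    # Logarithmic decrement over m cycles: delta = (1/m) * ln(x0 / xm),
    # damping ratio: zeta = delta / sqrt(4*pi^2 + delta^2).
    m = len(peaks) - 1
    delta = np.log(peaks[0] / peaks[m]) / m
    zeta_est = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)
    print(f"estimated damping ratio: {zeta_est:.4f} (true value {zeta_true})")
    ```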

  18. Model test on the relationship feed energy and protein ratio to the production and quality of milk protein

    Science.gov (United States)

    Hartanto, R.; Jantra, M. A. C.; Santosa, S. A. B.; Purnomoadi, A.

    2018-01-01

    The purpose of this research was to find an appropriate model of the relationship between the feed energy-to-protein ratio and the amount and quality of milk protein produced. The research was conducted in Getasan Sub-district, Semarang Regency, Central Java Province, Indonesia, using 40 samples (Holstein Friesian cattle, lactation period II-III and lactation month 3-4). Data were analyzed using linear and quadratic regressions to predict milk protein production and quality from the feed energy and protein ratio describing the diet. The significance of each model was tested using analysis of variance. The coefficient of determination (R2), residual variance (RV) and root mean square prediction error (RMSPE) are reported for the developed equations as indicators of goodness of fit. The results showed no relationship for milk protein (kg), milk casein (%), milk casein (kg) or milk urea N (mg/dl) as a function of CP/TDN. A significant relationship was observed for milk production (L or kg) and milk protein (%) as functions of CP/TDN, in both the linear and quadratic models. In addition, a quadratic change in milk production (L) (P = 0.003), milk production (kg) (P = 0.003) and milk protein concentration (%) (P = 0.026) was observed with increasing CP/TDN. It can be concluded that the quadratic equation was the better-fitting model in this research, because it had a larger R2, smaller RV and smaller RMSPE than the linear equation.
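
    As a rough illustration of the model comparison described above, one can fit both forms and compare R2 and RMSPE; the CP/TDN values and milk yields below are synthetic, since the record gives no raw data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic data: CP/TDN ratio of the diet vs milk production (kg/day), illustrative only.
    cp_tdn = rng.uniform(0.18, 0.30, size=40)
    milk = 5 + 120 * cp_tdn - 200 * cp_tdn ** 2 + rng.normal(0, 0.4, size=40)

    def fit_and_score(degree):
        coeffs = np.polyfit(cp_tdn, milk, degree)
        pred = np.polyval(coeffs, cp_tdn)
        ss_res = np.sum((milk - pred) ** 2)
        ss_tot = np.sum((milk - milk.mean()) ** 2)
        r2 = 1 - ss_res / ss_tot
        rmspe = np.sqrt(np.mean((milk - pred) ** 2))
        return r2, rmspe

    for name, degree in [("linear", 1), ("quadratic", 2)]:
        r2, rmspe = fit_and_score(degree)
        print(f"{name:9s}: R2 = {r2:.3f}, RMSPE = {rmspe:.3f}")
    ```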

  19. Methodology to improve design of accelerated life tests in civil engineering projects.

    Directory of Open Access Journals (Sweden)

    Jing Lin

    Full Text Available For reliability testing an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to the pipe in subsea pipeline. However, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  20. A methodological aspect of the 14C-urea breath test used in Helicobacter pylori diagnosis

    International Nuclear Information System (INIS)

    Kopanski, Z.; Niziol, J.; Micherdzinski, J.; Wasilewska-Radwanska, M.; Cienciala, A.; Lasa, J.; Witkowska, B.

    1996-01-01

    The main purpose of these investigations was optimisation of the duration of the breath test with 14C-labelled urea, which reveals Helicobacter pylori infection. A total of 117 subjects, preselected according to endoscopy and histopathology results, were analysed; 56 of them suffered from chronic gastritis and 61 from gastric ulcer disease. Using microbiological diagnosis (culture + IFP test) it was found that 86 subjects were H. pylori infected. This group of patients was then investigated with the 14C-labelled urea breath test. Measurements of the radioactivity of exhaled air were carried out for 30 minutes. The results obtained indicate that the optimal duration of the test described above is 30 minutes. (author)

  1. Effectiveness Analysis of a Non-Destructive Single Event Burnout Test Methodology

    CERN Document Server

    Oser, P; Spiezia, G; Fadakis, E; Foucard, G; Peronnard, P; Masi, A; Gaillard, R

    2014-01-01

    It is essential to characterize power MOSFETs regarding their tolerance to destructive Single Event Burnouts (SEBs). Therefore, several non-destructive test methods have been developed to evaluate the SEB cross-section of power devices. A power MOSFET has been evaluated using a test circuit designed according to standard non-destructive test methods discussed in the literature. Guidelines suggest a prior adaptation of auxiliary components to the device sensitivity before the radiation test. With the first value chosen for the de-coupling capacitor, the external component initiated destructive events and affected the evaluation of the cross-section. As a result, the influence of auxiliary components on the device cross-section was studied. This paper presents the obtained experimental results, supported by SPICE simulations, to evaluate and discuss how the circuit effectiveness depends on the external components.

  2. Development and testing of incident detection algorithms. Vol. 2, research methodology and detailed results.

    Science.gov (United States)

    1976-04-01

    The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...

  3. Methodology to improve design of accelerated life tests in civil engineering projects.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to the pipe in subsea pipeline. However, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  4. Evaluation of Test Methodologies for Dissolution and Corrosion of Al-SNF

    International Nuclear Information System (INIS)

    Wiersma, B.J.; Mickalonis, J.I.; Louthan, M.R.

    1998-09-01

    The performance of aluminum-based spent nuclear fuel (Al-SNF) in the repository will differ from that of the commercial nuclear fuels and the high level waste glasses. The program consists of evaluating three test methods

  5. Performance of a high-work low aspect ratio turbine tested with a realistic inlet radial temperature profile

    Science.gov (United States)

    Stabe, R. G.; Whitney, W. J.; Moffitt, T. P.

    1984-01-01

    Experimental results are presented for a 0.767 scale model of the first stage of a two-stage turbine designed for a high by-pass ratio engine. The turbine was tested with both uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The inlet temperature profile was essentially mixed-out in the rotor. There was also substantial underturning of the exit flow at the mean diameter. Both of these effects were attributed to strong secondary flows in the rotor blading. There were no significant differences in the stage performance with either inlet condition when differences in tip clearance were considered. Performance was very close to design intent in both cases. Previously announced in STAR as N84-24589

  6. Decision making about healthcare-related tests and diagnostic test strategies. Paper 2: a review of methodological and practical challenges

    NARCIS (Netherlands)

    Mustafa, Reem A.; Wiercioch, Wojtek; Cheung, Adrienne; Prediger, Barbara; Brozek, Jan; Bossuyt, Patrick; Garg, Amit X.; Lelgemann, Monika; Büehler, Diedrich; Schünemann, Holger J.

    2017-01-01

    Objectives: In this first of a series of five articles, we provide an overview of how and why healthcare-related tests and diagnostic strategies are currently applied. We also describe how our findings can be integrated with existing frameworks for making decisions that guide the use of

  7. Effect of C/N Ratio and Media Optimization through Response Surface Methodology on Simultaneous Productions of Intra- and Extracellular Inulinase and Invertase from Aspergillus niger ATCC 20611

    Science.gov (United States)

    Dinarvand, Mojdeh; Rezaee, Malahat; Masomian, Malihe; Jazayeri, Seyed Davoud; Zareian, Mohsen; Abbasi, Sahar; Ariff, Arbakariya B.

    2013-01-01

    The aim of this study was to identify a method for the extraction of intracellular inulinase (exo- and endoinulinase) and invertase, as well as to optimize the medium composition for maximum production of intra- and extracellular enzymes from Aspergillus niger ATCC 20611. Of two different methods for the extraction of intracellular enzymes, the ultrasonic method was found more effective. Response surface methodology (RSM) with a five-variable, three-level central composite design (CCD) was employed to optimize the medium composition. The effect of five main reaction parameters, including sucrose, yeast extract, NaNO3, Zn+2, and Triton X-100, on the production of enzymes was analyzed. A modified quadratic model was fitted to the data with a coefficient of determination (R2) of more than 0.90 for all responses. The intra- and extracellular inulinase and invertase productions increased by 16- to 8.4-fold in the medium optimized by RSM (10% (w/v) sucrose, 2.5% (w/v) yeast extract, 2% (w/v) NaNO3, 1.5 mM (v/v) Zn+2, and 1% (v/v) Triton X-100) and were around 1.2 to 1.3 times greater than in the medium optimized one-factor-at-a-time, respectively. The results of bioprocess optimization can be useful in scale-up fermentation and the food industry. PMID:24151605
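
    The record describes a five-factor central composite design fitted with a modified quadratic model. The sketch below shows the same idea reduced to two factors with synthetic responses (the factor names follow the abstract, the data are invented), using ordinary least squares on a full second-order polynomial.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Two coded factors from a face-centred design (illustrative subset of a CCD):
    # x1 = sucrose level, x2 = yeast extract level, both coded to [-1, 1].
    x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
    x1, x2 = x1.ravel(), x2.ravel()

    # Synthetic response: inulinase activity (U/mL), invented for illustration.
    activity = 50 + 8 * x1 + 5 * x2 - 6 * x1 ** 2 - 4 * x2 ** 2 + 3 * x1 * x2
    activity = activity + rng.normal(0, 0.5, size=activity.size)

    # Second-order response-surface model:
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coeffs, residuals, *_ = np.linalg.lstsq(X, activity, rcond=None)

    pred = X @ coeffs
    r2 = 1 - np.sum((activity - pred) ** 2) / np.sum((activity - activity.mean()) ** 2)
    print("coefficients [b0, b1, b2, b11, b22, b12]:", np.round(coeffs, 2))
    print(f"R2 = {r2:.3f}")
    ```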

  8. Effect of C/N Ratio and Media Optimization through Response Surface Methodology on Simultaneous Productions of Intra- and Extracellular Inulinase and Invertase from Aspergillus niger ATCC 20611

    Directory of Open Access Journals (Sweden)

    Mojdeh Dinarvand

    2013-01-01

    Full Text Available The aim of this study was to identify a method for the extraction of intracellular inulinase (exo- and endoinulinase) and invertase, as well as to optimize the medium composition for maximum production of intra- and extracellular enzymes from Aspergillus niger ATCC 20611. Of two different methods for the extraction of intracellular enzymes, the ultrasonic method was found more effective. Response surface methodology (RSM) with a five-variable, three-level central composite design (CCD) was employed to optimize the medium composition. The effect of five main reaction parameters, including sucrose, yeast extract, NaNO3, Zn+2, and Triton X-100, on the production of enzymes was analyzed. A modified quadratic model was fitted to the data with a coefficient of determination (R2) of more than 0.90 for all responses. The intra- and extracellular inulinase and invertase productions increased by 16- to 8.4-fold in the medium optimized by RSM (10% (w/v) sucrose, 2.5% (w/v) yeast extract, 2% (w/v) NaNO3, 1.5 mM (v/v) Zn+2, and 1% (v/v) Triton X-100) and were around 1.2 to 1.3 times greater than in the medium optimized one-factor-at-a-time, respectively. The results of bioprocess optimization can be useful in scale-up fermentation and the food industry.

  9. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    Science.gov (United States)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

    The hot fire test strategy for liquid rocket engines has always been a concern of the space industry and agencies alike because no recognized standard exists. Previous hot fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework to optimize the hot fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the definition of prior information inside the Bayesian framework. The model is also applied to assess the impact of an increase in the demonstrated reliability level on schedule and affordability.
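
    The record mentions a Bayesian framework with prior information but gives no formulas. A minimal, generic sketch of Bayesian reliability demonstration from hot-fire test outcomes is shown below using a beta-binomial model; this is an illustrative assumption, not necessarily the authors' model, and all counts and prior parameters are invented.

    ```python
    from scipy import stats

    # Prior belief about per-test success probability, e.g. informed by a subscale
    # demonstrator (values are assumptions for illustration).
    prior_alpha, prior_beta = 8.0, 1.0

    # Hypothetical flight-engine hot-fire campaign: 30 tests, 1 failure.
    tests, failures = 30, 1
    post = stats.beta(prior_alpha + tests - failures, prior_beta + failures)

    # Demonstrated reliability at 90% confidence = 10th percentile of the posterior.
    demonstrated_r = post.ppf(0.10)
    print(f"posterior mean reliability: {post.mean():.3f}")
    print(f"reliability demonstrated with 90% confidence: {demonstrated_r:.3f}")
    ```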

  10. VAR, stress-testing and supplementary methodologies: uses and constraints in energy risk management

    International Nuclear Information System (INIS)

    Senior, Brian

    1999-01-01

    This chapter lists some of the special risks associated with a range of energy markets and asks what risk is. Market risk, the use of value-at-risk (VAR) for measuring and managing market risk, the use of VAR in the banking sector, back-testing of VAR, the corporate sector, making investment decisions, and the need for additional methods of risk analysis are discussed. Scenario analysis and stress-testing, liquidity, and the combination of VAR and stress-testing are described. Credit risk and the quantitative analysis of credit risk are addressed, and operational risk and organisational challenges are considered. Panels present an example of a simple VAR calculation and descriptions of VAR in corporate decisions, the measurement of liquidity, and the use of the Greeks in decisions on day-to-day trading and risk management
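
    The chapter's panels include a simple VAR calculation; the snippet below is a generic variance-covariance (parametric) VAR for a single energy position, with invented numbers, to make that calculation concrete. It is not reproduced from the chapter.

    ```python
    from scipy import stats

    position_value = 10_000_000        # USD exposure to an energy price (assumed)
    daily_volatility = 0.03            # daily standard deviation of returns (assumed)
    confidence = 0.95
    horizon_days = 5

    # Parametric (variance-covariance) VAR assuming normally distributed returns.
    z = stats.norm.ppf(confidence)
    var_1d = position_value * daily_volatility * z
    var_h = var_1d * horizon_days ** 0.5   # square-root-of-time scaling

    print(f"1-day 95% VAR:  ${var_1d:,.0f}")
    print(f"{horizon_days}-day 95% VAR: ${var_h:,.0f}")
    ```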

  11. A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence

    NARCIS (Netherlands)

    Jaspers, Monique W. M.

    2009-01-01

    OBJECTIVE: Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the

  12. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well defined RF large voltage swing signals for on wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  13. Methodology Investigation of AI (Artificial Intelligence) Test Officer Support Tool. Volume 1

    Science.gov (United States)

    1989-03-01

    Keywords: Artificial Intelligence, Expert Systems, Automated Aids to Testing. This report covers the application of Artificial Intelligence techniques to the problem of creating automated tools to

  14. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.

  15. Multi-axial Creep and the LICON Methodology for Accelerated Creep Testing

    International Nuclear Information System (INIS)

    Bowyer, William H.

    2006-05-01

    The copper-iron canister for disposal of nuclear waste in the Swedish programme has a design life exceeding 100,000 years. Whilst the operating temperature (100 deg C max.) and operating stress (50 MPa max.) are modest, the very long design life does require that the likely creep performance of the canister be investigated. Many studies have been carried out by SKB, but these have all involved very short duration tests at relatively high stresses. The process of predicting canister creep life by extrapolation of data from such tests has been challenged for two main reasons. The first is that the deformation and failure mechanisms in the tests employed are different from the mechanisms expected under service conditions, and the second is that the extrapolation is extreme. It has been recognised that there is usually scope for some increase in test temperatures and stresses which will accelerate the development of creep damage without compromising the use of extrapolation for life prediction. Cane demonstrated that in steels designed for high-temperature and high-pressure applications, conditions of multi-axial stressing could lead to increases or decreases in the rate of damage accumulation without changing the damage mechanism. This provided a third method for accelerating creep testing, which has been implemented as the LICON method. This report aims to explain the background to the LICON method and its application to the case of the copper canister. It seems likely that the method could be used to improve our knowledge of the creep resistance of the copper canister. Multiplication factors that may be achieved by the technique could be increased by attention to specimen design, but an extensive and targeted programme of data collection on creep of copper would still be needed to implement the method to best advantage

  16. Multi-axial Creep and the LICON Methodology for Accelerated Creep Testing

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, William H. [Meadow End Farm, Farnham (United Kingdom)

    2006-05-15

    The copper-iron canister for disposal of nuclear waste in the Swedish programme has a design life exceeding 100,000 years. Whilst the operating temperature (100 deg C max.) and operating stress (50 MPa max.) are modest, the very long design life does require that the likely creep performance of the canister be investigated. Many studies have been carried out by SKB, but these have all involved very short duration tests at relatively high stresses. The process of predicting canister creep life by extrapolation of data from such tests has been challenged for two main reasons. The first is that the deformation and failure mechanisms in the tests employed are different from the mechanisms expected under service conditions, and the second is that the extrapolation is extreme. It has been recognised that there is usually scope for some increase in test temperatures and stresses which will accelerate the development of creep damage without compromising the use of extrapolation for life prediction. Cane demonstrated that in steels designed for high-temperature and high-pressure applications, conditions of multi-axial stressing could lead to increases or decreases in the rate of damage accumulation without changing the damage mechanism. This provided a third method for accelerating creep testing, which has been implemented as the LICON method. This report aims to explain the background to the LICON method and its application to the case of the copper canister. It seems likely that the method could be used to improve our knowledge of the creep resistance of the copper canister. Multiplication factors that may be achieved by the technique could be increased by attention to specimen design, but an extensive and targeted programme of data collection on creep of copper would still be needed to implement the method to best advantage.

  17. Electric Propulsion Test and Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST)

    Science.gov (United States)

    2016-04-14

    temperature & density Hall Thruster Test Article Graphite Beam Dump High-Speed Camera Viewport (~7m) DISTRIBUTION A:  Approved for public release...412TW/PA Clearance No. 16177 20 Hall thruster Optical diagnostics viewing window Six instrumented segments Six instrumented beam dump  segments • Six...EP thruster propellant – Sinks: vacuum  pumps  w/defined surface temperatures and activation  energies to compute residence time).   Does NOT

  18. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology.

    Science.gov (United States)

    Saadat, Victoria M

    2015-01-01

    The dissolution of the USSR resulted in independence for the constituent republics but left them battling an unstable economic environment and healthcare system. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of the former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. In this systematic review of the literature, PubMed was searched for English-language studies using the key search terms "HIV", "AIDS", "human immunodeficiency virus", "acquired immune deficiency syndrome", "Central Asia", "Kazakhstan", "Kyrgyzstan", "Uzbekistan", "Tajikistan", "Turkmenistan", "Russia", "Ukraine", "Armenia", "Azerbaijan", and "Georgia". Studies were evaluated against eligibility criteria for inclusion. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as "risk" and "barriers". Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient and that results would not remain confidential. Frequent barriers to treatment were based on distrust of the treatment system. The findings of this review reveal methodological limitations that span the existing studies. Small sample size, cross-sectional design, and non-probabilistic sampling methods were frequently

  19. Development of a testing methodology for computerized procedure system based on JUnit framework and MFM

    International Nuclear Information System (INIS)

    Qin, Wei

    2004-02-01

    Paper Based Procedures (PBPs) and Computerized Procedure Systems (CPSs) are studied to demonstrate that it is necessary to develop a CPS for the Nuclear Power Plant (NPP) Instrumentation and Control (I and C) system. A computerized procedure system is essentially a software system, and all the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality. In order to achieve product quality, the process quality of a software system should also be considered and achieved. Characteristics of CPSs are described in order to analyse the product and process of an example CPS: ImPRO. At the same time, several main product and process issues are analysed from the Verification and Validation (V and V) point of view. It is concluded and suggested that V and V activities can also be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to the testing of ImPRO. To support and realize this approach, suitable testing technologies and testing strategies are suggested based on the JUnit framework and Multi-level Flow Modeling (MFM)
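
    The thesis bases its unit testing on JUnit (Java). To stay consistent with the other sketches in this section, the same xUnit-style structure is illustrated below with Python's unittest; the procedure-step checker is a purely hypothetical stand-in, not ImPRO code.

    ```python
    import unittest

    def next_step(current_step, plant_state):
        """Hypothetical stand-in for a computerized-procedure transition rule:
        advance only when the monitored parameter is back inside its limit."""
        if plant_state["sg_level_pct"] < 25:
            return current_step            # hold: condition not yet satisfied
        return current_step + 1

    class TestProcedureTransitions(unittest.TestCase):
        # xUnit-style unit tests, analogous in structure to JUnit tests.
        def test_holds_step_while_condition_unmet(self):
            self.assertEqual(next_step(3, {"sg_level_pct": 12}), 3)

        def test_advances_step_when_condition_met(self):
            self.assertEqual(next_step(3, {"sg_level_pct": 40}), 4)

    if __name__ == "__main__":
        unittest.main()
    ```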

  20. Testing for adaptive evolution of the female reproductive protein ZPC in mammals, birds and fishes reveals problems with the M7-M8 likelihood ratio test.

    Science.gov (United States)

    Berlin, Sofia; Smith, Nick G C

    2005-11-10

    Adaptive evolution appears to be a common feature of reproductive proteins across a very wide range of organisms. A promising way of addressing the evolutionary forces responsible for this general phenomenon is to test for adaptive evolution in the same gene but among groups of species, which differ in their reproductive biology. One can then test evolutionary hypotheses by asking whether the variation in adaptive evolution is consistent with the variation in reproductive biology. We have attempted to apply this approach to the study of a female reproductive protein, zona pellucida C (ZPC), which has been previously shown by the use of likelihood ratio tests (LRTs) to be under positive selection in mammals. We tested for evidence of adaptive evolution of ZPC in 15 mammalian species, in 11 avian species and in six fish species using three different LRTs (M1a-M2a, M7-M8, and M8a-M8). The only significant findings of adaptive evolution came from the M7-M8 test in mammals and fishes. Since LRTs of adaptive evolution may yield false positives in some situations, we examined the properties of the LRTs by several different simulation methods. When we simulated data to test the robustness of the LRTs, we found that the pattern of evolution in ZPC generates an excess of false positives for the M7-M8 LRT but not for the M1a-M2a or M8a-M8 LRTs. This bias is strong enough to have generated the significant M7-M8 results for mammals and fishes. We conclude that there is no strong evidence for adaptive evolution of ZPC in any of the vertebrate groups we studied, and that the M7-M8 LRT can be biased towards false inference of adaptive evolution by certain patterns of non-adaptive evolution.
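
    For readers unfamiliar with these codon-model LRTs: each test compares twice the log-likelihood difference of nested models (as fitted, for example, in codeml) against a chi-square distribution. A minimal sketch with invented log-likelihoods is shown below; the M7-M8 comparison conventionally uses two degrees of freedom for the two extra parameters of M8.

    ```python
    from scipy import stats

    # Hypothetical log-likelihoods from fits of the nested codon models (invented values).
    lnL_m7 = -4821.6   # null model: beta-distributed omega, no class with omega > 1
    lnL_m8 = -4817.9   # alternative: adds a positively selected class (2 extra parameters)

    lrt_stat = 2 * (lnL_m8 - lnL_m7)
    p_value = stats.chi2.sf(lrt_stat, df=2)
    print(f"M7-M8 LRT: 2*dlnL = {lrt_stat:.2f}, p = {p_value:.4f}")
    ```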

  1. Home urine C-peptide creatinine ratio (UCPCR) testing can identify type 2 and MODY in pediatric diabetes.

    Science.gov (United States)

    Besser, Rachel E J; Shields, Beverley M; Hammersley, Suzanne E; Colclough, Kevin; McDonald, Timothy J; Gray, Zoe; Heywood, James J N; Barrett, Timothy G; Hattersley, Andrew T

    2013-05-01

    Making the correct diabetes diagnosis in children is crucial for lifelong management. Type 2 diabetes and maturity onset diabetes of the young (MODY) are seen in the pediatric setting, and can be difficult to discriminate from type 1 diabetes. Postprandial urinary C-peptide creatinine ratio (UCPCR) is a non-invasive measure of endogenous insulin secretion that has not been tested as a diagnostic tool in children or in patients with diabetes duration MODY and type 2 in pediatric diabetes. Two-hour postprandial UCPCR was measured in 264 patients aged MODY, n = 63). Receiver operating characteristic curves were used to identify the optimal UCPCR cutoff for discriminating diabetes subtypes. UCPCR was lower in type 1 diabetes [0.05 (MODY [3.51 (2.37-5.32) nmol/mmol, p MODY (p = 0.25), so patients were combined for subsequent analyses. After 2-yr duration, UCPCR ≥ 0.7 nmol/mmol has 100% sensitivity [95% confidence interval (CI): 92-100] and 97% specificity (95% CI: 91-99) for identifying non-type 1 (MODY + type 2 diabetes) from type 1 diabetes [area under the curve (AUC) 0.997]. UCPCR was poor at discriminating MODY from type 2 diabetes (AUC 0.57). UCPCR testing can be used in diabetes duration greater than 2 yr to identify pediatric patients with non-type 1 diabetes. UCPCR testing is a practical non-invasive method for use in the pediatric outpatient setting. © 2013 John Wiley & Sons A/S.

  2. The Relationship Between 14C Urea Breath Test Results and Neutrophil/Lymphocyte and Platelet/Lymphocyte Ratios

    Directory of Open Access Journals (Sweden)

    Ertan Şahin

    2018-04-01

    Full Text Available Aim: Neutrophil/lymphocyte ratio (NLR) and platelet/lymphocyte ratio (PLR) are used as inflammatory markers in several diseases. However, there are few data regarding the diagnostic ability of NLR and PLR in Helicobacter pylori infection. We aimed to assess the association between the 14C urea breath test (14C-UBT) results and NLR and PLR in H. pylori diagnosis. Methods: Results of 89 patients were retrospectively analysed in this study. According to the 14C-UBT results, patients were divided into two groups: H. pylori (+) and H. pylori (-) (control group). Haematological parameters, including haemoglobin, white blood cell (WBC) count, neutrophil count, lymphocyte count, NLR, platelet count, and PLR, were compared between the two groups. Results: The mean total WBC count, neutrophil count, NLR and PLR in H. pylori (+) patients were significantly higher than in the control group (p<0.001 for all these parameters). In the receiver operating characteristic curve analysis, the cut-off values for NLR and PLR for the presence of H. pylori were calculated as ≥2.39 [sensitivity: 67.3%, specificity: 79.4%, area under the curve (AUC): 0.747 (0.637-0.856), p<0.0001] and ≥133.3 [sensitivity: 61.8%, specificity: 55.9%, AUC: 0.572 (0.447-0.697), p<0.05], respectively. Conclusion: The present study shows that NLR and PLR are associated with H. pylori positivity based on 14C-UBT, and they can be used as additional biomarkers for supporting the 14C-UBT results.
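
    The cut-offs above come from a receiver operating characteristic analysis. A generic sketch of that step, with synthetic NLR values and H. pylori status (not the study's data), using scikit-learn for the ROC curve and Youden's index for the cut-off:

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(3)

    # Synthetic data: NLR values for H. pylori negative (0) and positive (1) patients.
    status = np.concatenate([np.zeros(34), np.ones(55)]).astype(int)
    nlr = np.concatenate([rng.normal(1.9, 0.5, 34), rng.normal(2.7, 0.8, 55)])

    fpr, tpr, thresholds = roc_curve(status, nlr)
    auc = roc_auc_score(status, nlr)

    # Youden's J = sensitivity + specificity - 1; the threshold maximizing J is the cut-off.
    youden_j = tpr - fpr
    best = np.argmax(youden_j)
    print(f"AUC = {auc:.3f}, optimal NLR cut-off ≈ {thresholds[best]:.2f} "
          f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
    ```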

  3. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, R.; Harrell, J.

    1996-12-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

  4. Validation of the Engineering Plant Analyzer methodology with Peach Bottom 2 stability tests

    International Nuclear Information System (INIS)

    Rohatgi, U.S.; Mallen, A.N.; Cheng, H.S.; Wulff, W.

    1994-01-01

    The Engineering Plant Analyzer (EPA) had been developed in 1984 at Brookhaven National Laboratory to simulate plant transients in boiling water reactors (BWR). Recently, the EPA with its High-Speed Interactive Plant Analyzer code for BWRs (HIPA-BWR) simulated for the first time oscillatory transients with large, non-linear power and flow amplitudes; transients which are centered around the March 9, 1988 instability at the LaSalle-2 BWR power plant. The EPA's capability to simulate oscillatory transients has been demonstrated first by comparing simulation results with LaSalle-2 plant data (Wulff et al., NUREG/CR-5816, BNL-NUREG-52312, Brookhaven National Laboratory, 1992). This paper presents an EPA assessment on the basis of the Peach Bottom 2 instability tests (Carmichael and Niemi, EPRI NP-564, Electric Power Research Institute, Palo Alto, CA, 1978). This assessment of the EPA appears to constitute the first validation of a time-domain reactor systems code on the basis of frequency-domain criteria, namely power spectral density, gain and phase shift of the pressure-to-power transfer function. The reactor system pressure was disturbed in the Peach Bottom 2 power plant tests, and in their EPA simulation, by a pseudo-random, binary sequence signal. The data comparison revealed that the EPA predicted for Peach Bottom tests PT1, PT2, and PT4 the gain of the power-to-pressure transfer function with the biases and standard deviations of (-10±28)%, (-1±40)% and (+28±52)%, respectively. The respective frequencies at the peak gains were predicted with the errors of +6%, +3%, and -28%. The differences between the predicted and the measured phase shift increased with increasing frequency, but stayed within the margin of experimental uncertainty. (orig.)

  5. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    International Nuclear Information System (INIS)

    Fuller, R.; Harrell, J.

    1996-01-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves

  6. A durability test rig and methodology for erosion-resistant blade coatings in turbomachinery

    Science.gov (United States)

    Leithead, Sean Gregory

    A durability test rig for erosion-resistant gas turbine engine compressor blade coatings was designed, completed and commissioned. Bare and coated 17-4PH steel V103-profile blades were rotated at up to 11500 rpm and impacted with garnet sand for 5 hours at an average concentration of 2.51 g/m3 of air, at a blade leading edge Mach number of 0.50. The rig was determined to be an acceptable first-stage axial compressor representation. Two types of 16 μm-thick coatings were tested: Titanium Nitride (TiN) and Chromium-Aluminum-Titanium Nitride (CrAlTiN), both applied using an Arc Physical Vapour Deposition technique at the National Research Council in Ottawa, Canada. A Leithead-Allan-Zhao (LAZ) score was created to compare the durability performance of uncoated and coated blades based on mass loss and blade dimension changes. The bare blades' LAZ score was set as a benchmark of 1.00. The TiN-coated and CrAlTiN-coated blades obtained LAZ scores of 0.69 and 0.41, respectively. A lower score meant a more erosion-resistant coating. Major modes of blade wear included: trailing edge, leading edge and the rear suction surface. Trailing edge thickness was reduced, the leading edge became blunt, and the rear suction surface was scrubbed by overtip and recirculation zone vortices. It was found that the erosion effects of vortex flow were significant. Erosion damage due to reflected particles was not present due to the low blade solidity of 0.7. The rig is best suited for studying the performance of erosion-resistant coatings after they are proven effective in ASTM standardized testing. Keywords: erosion, compressor, coatings, turbomachinery, erosion rate, blade, experimental, gas turbine engine

  7. Development of Improved Accelerated Corrosion Qualification Test Methodology for Aerospace Materials

    Science.gov (United States)

    2014-11-01

    performance of magnesium-rich primer for aluminum alloys under salt spray test (ASTM B117) and natural exposure”, Corrosion Science 52 (2010) 1453... [Figure residue: plots of cumulative weight loss (µg/cm²) versus hours of exposure at a Florida test site, midnight 12-13-05 to midnight 12-14-05.]

  8. Point-of-Care Hemoglobin/Hematocrit Testing: Comparison of Methodology and Technology.

    Science.gov (United States)

    Maslow, Andrew; Bert, Arthur; Singh, Arun; Sweeney, Joseph

    2016-04-01

    Point-of-care (POC) testing allows rapid assessment of hemoglobin (Hgb) and hematocrit (Hct) values. This study compared 3 POC testing devices--the Radical-7 pulse oximeter (Radical-7, Neuchâtel, Switzerland), the i-STAT (Abbott Point of Care, Princeton, NJ), and the GEM 4000 (Instrumentation Laboratory, Bedford, MA)--to the hospital reference device, the UniCel DxH 800 (Beckman Coulter, Brea, CA) in cardiac surgery patients. Prospective study. Tertiary care cardiovascular center. Twenty-four consecutive elective adult cardiac surgery patients. Hgb and Hct values were measured using 3 POC devices (the Radical-7, i-STAT, and GEM 4000) and a reference laboratory device (UniCel DxH 800). Data were collected simultaneously before surgery, after heparin administration, after heparin reversal with protamine, and after sternal closure. Data were analyzed using bias analyses. POC testing data were compared with those of the reference laboratory device. Hgb levels ranged from 6.8 to 15.1 g/dL, and Hct levels ranged from 20.1% to 43.8%. The overall mean bias was lowest with the i-STAT (Hct, 0.22%; Hgb 0.05 g/dL) compared with the GEM 4000 (Hct, 2.15%; Hgb, 0.63 g/dL) and the Radical-7 (Hgb 1.16 g/dL). The range of data for the i-STAT and Radical-7 was larger than that with the GEM 4000, and the pattern or slopes changed significantly with the i-STAT and Radical-7, whereas that of the GEM 4000 remained relatively stable. The GEM 4000 demonstrated a consistent overestimation of laboratory data, which tended to improve after bypass and at lower Hct/Hgb levels. The i-STAT bias changed from overestimation to underestimation, the latter in the post-cardiopulmonary bypass period and at lower Hct/Hgb levels. By contrast, the Radical-7 biases increased during the surgical procedure and in the lower ranges of Hgb. Important clinical differences and limitations were found among the 3 POC testing devices that should caution clinicians from relying on these data as sole determinants of

  9. Toothpick test: a methodology for the detection of RR soybean plants1

    Directory of Open Access Journals (Sweden)

    Fabiana Mota da Silva

    Full Text Available Due to the large increase in the area cultivated with genetically modified soybean in Brazil, it has become necessary to determine methods that are fast and efficient for detecting these cultivars. The aim of this work was to test the efficiency of the toothpick method in the detection of RR soybean plants, as well as to distinguish between cultivars by their sensitivity to the herbicide. Ten transgenic soybean cultivars, resistant to the active ingredient glyphosate, and ten conventional soybean cultivars were used. Toothpicks soaked in glyphosate were applied to all the plants at stage V6 and evaluations were made at 2, 4, 6, 8 and 10 days after application (DAA). The effects of the glyphosate on the cultivars, and the symptoms of phytotoxicity caused in the transgenic plants were evaluated by means of grading scales. The toothpick test is effective in identifying RR soybean cultivars and also in separating them into groups by sensitivity to the symptoms caused by the glyphosate.

  10. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear applications where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.
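
    The record names the k_SP method without stating it. For orientation only, the small punch load-stress correlation commonly quoted in the European Code of Practice for small punch creep testing has the form sketched below; the 3.33 factor and the symbols are taken from that general code of practice, not from this record, and should be treated here as an assumption.

      \[ \frac{F}{\sigma} = 3.33\, k_{SP}\, R^{-0.2}\, r^{1.2}\, h_{0} \]

    where F is the punch load, σ the equivalent uniaxial stress, R the radius of the receiving aperture, r the punch radius and h0 the initial specimen thickness; k_SP is the empirical correlation factor discussed in the record.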

  11. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    International Nuclear Information System (INIS)

    Jeffs, S.P.; Lancaster, R.J.; Garcia, T.E.

    2015-01-01

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear applications where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.

  12. Design and pilot testing of a dietary assessment methodology for children at school

    DEFF Research Database (Denmark)

    Hansen, Mette; Laursen, Rikke; Mikkelsen, Bent Egberg

    in school food environments. Aim: The aim of this report was to investigate and develop appropriate methods for studying the link between healthy eating practices and organic food procurement policies using Danish public elementary schools as a setting. Methods: Based on relevant scientific literature, the Danish Dietary Recommendations, and inspired by other successful studies, a self-administered questionnaire investigating children’s eating habits was designed. After testing by an Expert Evaluation Panel and Think Aloud Interviews, adjustments were integrated. Conclusion: If special attention is given to literacy skills and cognitive development, children in Danish 6th grade classes can be used as respondents in studies of the relation between food procurement policies and eating practice. The study suggests that a Cross-Sectional design is a satisfactory method to investigate the association between...

  13. Overview of a benefit/risk ratio optimized for a radiation emitting device used in non-destructive testing

    Energy Technology Data Exchange (ETDEWEB)

    Maharaj, H.P., E-mail: H_P_Maharaj@hc-sc.gc.ca [Health Canada, Dept. of Health, Consumer and Clinical Radiation Protection Bureau, Ottawa, Ontario (Canada)

    2016-03-15

    This paper aims to provide an overview of an optimized benefit/risk ratio for a radiation emitting device. The device, which is portable, hand-held, and open-beam x-ray tube based, is utilized by a wide variety of industries for purposes of determining elemental or chemical analyses of materials in-situ based on fluorescent x-rays. These analyses do not cause damage or permanent alteration of the test materials and are considered a non-destructive test (NDT). Briefly, the key characteristics, principles of use and radiation hazards associated with the Hay device are presented and discussed. In view of the potential radiation risks, a long term strategy that incorporates risk factors and guiding principles intended to mitigate the radiation risks to the end user was considered and applied. Consequently, an operator certification program was developed on the basis of an International Standards Organization (ISO) standard (ISO 20807:2004) and in collaboration with various stakeholders and was implemented by a federal national NDT certification body several years ago. It comprises a written radiation safety examination and hands-on training with the x-ray device. The operator certification program was recently revised and the changes appear beneficial. There is a fivefold increase in operator certification (Levels 1 and 2) to date compared with earlier years. Results are favorable and promising. An operational guidance document is available to help mitigate radiation risks. Operator certification in conjunction with the use of the operational guidance document is prudent, and is recommended for end users of the x-ray device. Manufacturers and owners of the x-ray devices will also benefit from the operational guidance document. (author)

  14. Overview of a benefit/risk ratio optimized for a radiation emitting device used in non-destructive testing

    International Nuclear Information System (INIS)

    Maharaj, H.P.

    2016-01-01

    This paper aims to provide an overview of an optimized benefit/risk ratio for a radiation emitting device. The device, which is portable, hand-held, and open-beam x-ray tube based, is utilized by a wide variety of industries for purposes of determining elemental or chemical analyses of materials in-situ based on fluorescent x-rays. These analyses do not cause damage or permanent alteration of the test materials and are considered a non-destructive test (NDT). Briefly, the key characteristics, principles of use and radiation hazards associated with the Hay device are presented and discussed. In view of the potential radiation risks, a long term strategy that incorporates risk factors and guiding principles intended to mitigate the radiation risks to the end user was considered and applied. Consequently, an operator certification program was developed on the basis of an International Standards Organization (ISO) standard (ISO 20807:2004) and in collaboration with various stakeholders and was implemented by a federal national NDT certification body several years ago. It comprises a written radiation safety examination and hands-on training with the x-ray device. The operator certification program was recently revised and the changes appear beneficial. There is a fivefold increase in operator certification (Levels 1 and 2) to date compared with earlier years. Results are favorable and promising. An operational guidance document is available to help mitigate radiation risks. Operator certification in conjunction with the use of the operational guidance document is prudent, and is recommended for end users of the x-ray device. Manufacturers and owners of the x-ray devices will also benefit from the operational guidance document. (author)

  15. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    Science.gov (United States)

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
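
    The record does not give the ERR formula. Purely as a point of reference, the sketch below shows the generic error reduction ratio used in orthogonal-least-squares term selection (the quantity the ERR-causality test builds on); it is not the authors' full ERR-causality implementation, and the lagged-regressor setup and parameter values are illustrative assumptions.

      import numpy as np

      def error_reduction_ratios(y, candidates):
          """Generic error reduction ratio via orthogonal least squares.

          y          : (n,) target signal (e.g. one EEG channel)
          candidates : (n, m) matrix whose columns are candidate regressors
                       (here, lagged samples of a second channel)
          Returns the fraction of the energy of y explained by each
          orthogonalised candidate term.
          """
          n, m = candidates.shape
          yy = float(y @ y)
          w = np.empty((n, m), dtype=float)
          err = np.zeros(m)
          for i in range(m):
              v = candidates[:, i].astype(float)
              for j in range(i):  # orthogonalise against previously selected terms
                  v -= (w[:, j] @ v) / (w[:, j] @ w[:, j]) * w[:, j]
              w[:, i] = v
              g = (y @ v) / (v @ v)          # least-squares coefficient
              err[i] = g**2 * (v @ v) / yy   # energy of y captured by this term
          return err

      # toy example: y is driven by x delayed by two samples
      rng = np.random.default_rng(0)
      x = rng.standard_normal(1000)
      y = 0.8 * np.roll(x, 2) + 0.1 * rng.standard_normal(1000)
      lags = np.column_stack([np.roll(x, k) for k in range(5)])
      print(error_reduction_ratios(y, lags))  # the lag-2 column should dominate

    In this simplified form, a large ERR at a particular lag of one channel with respect to another indicates directed information flow with that time lag, which is the kind of quantity the record tracks over time.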

  16. The caffeine breath test and caffeine urinary metabolite ratios in the Michigan cohort exposed to polybrominated biphenyls: A preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, G.H.; Schoeller, D.A.; Kotake, A.N.; Lietz, H. (Univ. of Chicago, IL (USA)); Humphrey, H.E.B.; Budd, M. (Michigan Dept. of Public Health, Lansing (USA)); Campbell, M.; Kalow, W.; Spielberg, P. (Univ. of Toronto, Ontario (Canada))

    1990-11-01

    A field biochemical epidemiology study was conducted using the Michigan cohort consisting of 51 rural residents exposed to polybrominated biphenyls (PBB). The study had three major objectives: (a) to determine the serum half-life of the major PBB congener, hexabromobiphenyl (HBB), in the human, (b) to determine if the PBB-exposed subjects had elevated cytochrome P-450I function as determined by the caffeine breath test (CBT) and the caffeine urinary metabolite ratio (CMR), and (c) to determine the applicability of the CBT and CMR in field studies. PBB serum levels were detected in 36 of the 51 PBB-exposed subjects. The serum half-life of HBB was determined by comparing the current serum HBB values to the subject's previous serum values obtained 5 to 8 years earlier. The median HBB half-life was 12 years (range 4-97 years). The CBT and CMR were elevated in the subjects exposed to PBBs as compared to the values obtained from urban nonsmokers and were similar to those found in adults who smoke. A gender effect was seen in the PBB-exposed subjects. There was a correlation between the CBT and the HBB serum values but not between CMR and HBB serum values. The CBT and CMR were easily conducted in the field and appear to be useful metabolic probes of cytochrome P-450I activity in human environmental toxicology.

  17. A Neuro-Fuzzy Inference System Combining Wavelet Denoising, Principal Component Analysis, and Sequential Probability Ratio Test for Sensor Monitoring

    International Nuclear Information System (INIS)

    Na, Man Gyun; Oh, Seungrohk

    2002-01-01

    A neuro-fuzzy inference system combined with the wavelet denoising, principal component analysis (PCA), and sequential probability ratio test (SPRT) methods has been developed to monitor the relevant sensor using the information of other sensors. The parameters of the neuro-fuzzy inference system that estimates the relevant sensor signal are optimized by a genetic algorithm and a least-squares algorithm. The wavelet denoising technique was applied to remove noise components in input signals into the neuro-fuzzy system. By reducing the dimension of an input space into the neuro-fuzzy system without losing a significant amount of information, the PCA was used to reduce the time necessary to train the neuro-fuzzy system, simplify the structure of the neuro-fuzzy inference system, and also to simplify the selection of the input signals into the neuro-fuzzy system. By using the residual signals between the estimated signals and the measured signals, the SPRT is applied to detect whether the sensors are degraded or not. The proposed sensor-monitoring algorithm was verified through applications to the pressurizer water level, the pressurizer pressure, and the hot-leg temperature sensors in pressurized water reactors.
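
    As an illustration of the SPRT stage described above, here is a minimal sketch of Wald's sequential probability ratio test applied to a residual stream, assuming Gaussian residuals with known variance and a mean shift under degradation. The alpha, beta, shift and noise values are illustrative assumptions, not values from the record.

      import numpy as np

      def sprt_mean_shift(residuals, mu1, sigma, alpha=0.01, beta=0.01):
          """Wald SPRT: H0 residual mean = 0 (healthy) vs H1 mean = mu1 (degraded)."""
          upper = np.log((1 - beta) / alpha)   # cross upward -> decide "degraded"
          lower = np.log(beta / (1 - alpha))   # cross downward -> decide "healthy"
          llr = 0.0
          for k, r in enumerate(residuals):
              # log-likelihood ratio increment for one Gaussian sample
              llr += (mu1 / sigma**2) * (r - mu1 / 2.0)
              if llr >= upper:
                  return "degraded", k
              if llr <= lower:
                  return "healthy", k
          return "undecided", len(residuals) - 1

      rng = np.random.default_rng(1)
      print(sprt_mean_shift(rng.normal(0.0, 0.5, 200), mu1=0.4, sigma=0.5))  # healthy sensor
      print(sprt_mean_shift(rng.normal(0.4, 0.5, 200), mu1=0.4, sigma=0.5))  # drifted sensor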

  18. A Methodological Approach for Testing the Viability of Seeds Stored in Short-Term Seed Banks

    Directory of Open Access Journals (Sweden)

    Jose A. FORTE GIL

    2017-12-01

    Full Text Available Efficient management of ‘active’ seed banks – specifically aimed at the short-term storage at room temperature of seeds to be used locally in conservation/regeneration programmes of endemic or endangered plant species – requires establishing the optimal storage time to maintain high seed viability, for each stored species. In this work, germination of seeds of the halophytes Thalictrum maritimum, Centaurea dracunculifolia and Linum maritimum has been investigated. The seeds had been stored for different periods of time in the seed bank of ‘La Albufera’ Natural Park (Valencia, SE Spain) after collection in salt marshes of the Park, where small populations of the three species are present. Seeds of T. maritimum and C. dracunculifolia have a relatively short period of viability at room temperature, and should not be stored for more than three years. On the other hand, L. maritimum seeds maintain a high germination percentage and can be kept at room temperature for up to 10 years. T. maritimum seeds, in contrast to those of the other two species, did not germinate in in vitro tests nor when sown directly on a standard substrate, unless a pre-treatment of the seeds was applied, mechanical scarification being the most effective. These results will help to improve the management of the seed bank, to generate more efficiently new plants for reintroduction and reinforcement of populations of these species in their natural ecosystems within the Natural Park.

  19. Internal jugular vein: Peripheral vein adrenocorticotropic hormone ratio in patients with adrenocorticotropic hormone-dependent Cushing's syndrome: Ratio calculated from one adrenocorticotropic hormone sample each from right and left internal jugular vein during corticotrophin releasing hormone stimulation test

    Directory of Open Access Journals (Sweden)

    Sachin Chittawar

    2013-01-01

    Full Text Available Background: Demonstration of central: Peripheral adrenocorticotropic hormone (ACTH) gradient is important for diagnosis of Cushing's disease. Aim: The aim was to assess the utility of internal jugular vein (IJV): Peripheral vein ACTH ratio for diagnosis of Cushing's disease. Materials and Methods: Patients with ACTH-dependent Cushing's syndrome (CS) patients were the subjects for this study. One blood sample each was collected from right and left IJV following intravenous hCRH at 3 and 5 min, respectively. A simultaneous peripheral vein sample was also collected with each IJV sample for calculation of IJV: Peripheral vein ACTH ratio. IJV sample collection was done under ultrasound guidance. ACTH was assayed using electrochemiluminescence immunoassay (ECLIA). Results: Thirty-two patients participated in this study. The IJV: Peripheral vein ACTH ratio ranged from 1.07 to 6.99 (n = 32). It was more than 1.6 in 23 patients. Cushing's disease could be confirmed in 20 of the 23 cases with IJV: Peripheral vein ratio more than 1.6. Four patients with Cushing's disease and 2 patients with ectopic ACTH syndrome had IJV: Peripheral vein ACTH ratio less than 1.6. Six cases with unknown ACTH source were excluded for calculation of sensitivity and specificity of the test. Conclusion: IJV: Peripheral vein ACTH ratio calculated from a single sample from each IJV obtained after hCRH had 83% sensitivity and 100% specificity for diagnosis of CD.
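
    For readers unfamiliar with how the reported 83% sensitivity and 100% specificity follow from the counts in this record, a small illustration is given below; it assumes, as the record implies, that the three ratio-positive cases without a confirmed source are among the six excluded cases.

      # Counts as stated in the record
      tp, fn = 20, 4   # Cushing's disease: ratio > 1.6 vs ratio < 1.6
      fp, tn = 0, 2    # ectopic ACTH:      ratio > 1.6 vs ratio < 1.6
      sensitivity = tp / (tp + fn)   # 20/24 = 0.833...
      specificity = tn / (tn + fp)   # 2/2 = 1.0
      print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")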

  20. Reliability of case definitions for public health surveillance assessed by Round-Robin test methodology

    Directory of Open Access Journals (Sweden)

    Claus Hermann

    2006-05-01

    Full Text Available Background Case definitions have been recognized to be important elements of public health surveillance systems. They are to assure comparability and consistency of surveillance data and have crucial impact on the sensitivity and the positive predictive value of a surveillance system. The reliability of case definitions has rarely been investigated systematically. Methods We conducted a Round-Robin test by asking all 425 local health departments (LHD) and the 16 state health departments (SHD) in Germany to classify a selection of 68 case examples using case definitions. By multivariate analysis we investigated factors linked to classification agreement with a gold standard, which was defined by an expert panel. Results A total of 7870 classifications were done by 396 LHD (93%) and all SHD. Reporting sensitivity was 90.0%, positive predictive value 76.6%. Polio case examples had the lowest reporting precision, salmonellosis case examples the highest (OR = 0.008; CI: 0.005–0.013). Case definitions with a check-list format of clinical criteria resulted in higher reporting precision than case definitions with a narrative description (OR = 3.08; CI: 2.47–3.83). Reporting precision was higher among SHD compared to LHD (OR = 1.52; CI: 1.14–2.02). Conclusion Our findings led to a systematic revision of the German case definitions and built the basis for general recommendations for the creation of case definitions. These include, among others, that testable yes/no criteria in a check-list format is likely to improve reliability, and that software used for data transmission should be designed in strict accordance with the case definitions. The findings of this study are largely applicable to case definitions in many other countries or international networks as they share the same structural and editorial characteristics of the case definitions evaluated in this study before their revision.

  1. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test...
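
    As an illustration of the two quantities discussed in this record, the sketch below computes a cutoff-independent AUC via the rank-sum (Mann-Whitney) formulation and the cutoff-specific likelihood ratios LR+ = sensitivity/(1 - specificity) and LR- = (1 - sensitivity)/specificity. The data and cut-off are simulated for illustration only.

      import numpy as np

      def auc_rank(values_pos, values_neg):
          """Area under the ROC curve via the Mann-Whitney U formulation."""
          pairs = 0.0
          for p in values_pos:
              pairs += np.sum(p > values_neg) + 0.5 * np.sum(p == values_neg)
          return pairs / (len(values_pos) * len(values_neg))

      def likelihood_ratios(values_pos, values_neg, cutoff):
          """LR+ and LR- for a 'positive if value >= cutoff' decision rule."""
          sens = np.mean(values_pos >= cutoff)
          spec = np.mean(values_neg < cutoff)
          return sens / (1 - spec), (1 - sens) / spec

      rng = np.random.default_rng(2)
      diseased = rng.normal(3.0, 1.0, 200)  # simulated test values, diseased animals
      healthy = rng.normal(1.5, 1.0, 200)   # simulated test values, non-diseased animals
      print("AUC:", auc_rank(diseased, healthy))
      print("LR+, LR-:", likelihood_ratios(diseased, healthy, cutoff=2.5))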

  2. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology

    Directory of Open Access Journals (Sweden)

    Victoria M. Saadat

    2016-01-01

    Full Text Available Background. The dissolution of the USSR resulted in independence for constituent republics but left them battling an unstable economic environment and healthcare. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of the former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. Methods. In this systematic review of the literature, Pubmed was searched for English-language studies using the key search terms “HIV”, “AIDS”, “human immunodeficiency virus”, “acquired immune deficiency syndrome”, “Central Asia”, “Kazakhstan”, “Kyrgyzstan”, “Uzbekistan”, “Tajikistan”, “Turkmenistan”, “Russia”, “Ukraine”, “Armenia”, “Azerbaijan”, and “Georgia”. Studies were evaluated against eligibility criteria for inclusion. Results. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as “risk” and “barriers”. Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient, and that results would not remain confidential. Frequent barriers to treatment were based on a distrust in the treatment system. Conclusion. The findings of this review reveal methodological limitations

  3. An Accurate Method for Inferring Relatedness in Large Datasets of Unphased Genotypes via an Embedded Likelihood-Ratio Test

    KAUST Repository

    Rodriguez, Jesse M.

    2013-01-01

    Studies that map disease genes rely on accurate annotations that indicate whether individuals in the studied cohorts are related to each other or not. For example, in genome-wide association studies, the cohort members are assumed to be unrelated to one another. Investigators can correct for individuals in a cohort with previously-unknown shared familial descent by detecting genomic segments that are shared between them, which are considered to be identical by descent (IBD). Alternatively, elevated frequencies of IBD segments near a particular locus among affected individuals can be indicative of a disease-associated gene. As genotyping studies grow to use increasingly large sample sizes and meta-analyses begin to include many data sets, accurate and efficient detection of hidden relatedness becomes a challenge. To enable disease-mapping studies of increasingly large cohorts, a fast and accurate method to detect IBD segments is required. We present PARENTE, a novel method for detecting related pairs of individuals and shared haplotypic segments within these pairs. PARENTE is a computationally-efficient method based on an embedded likelihood ratio test. As demonstrated by the results of our simulations, our method exhibits better accuracy than the current state of the art, and can be used for the analysis of large genotyped cohorts. PARENTE's higher accuracy becomes even more significant in more challenging scenarios, such as detecting shorter IBD segments or when an extremely low false-positive rate is required. PARENTE is publicly and freely available at http://parente.stanford.edu/. © 2013 Springer-Verlag.
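
    PARENTE's own statistical model is not described in this record. Purely as an illustration of what a windowed likelihood-ratio test for relatedness on unphased genotypes can look like, the sketch below compares an "unrelated" model (opposite homozygotes occur at the Hardy-Weinberg rate) against a "one haplotype shared IBD" model (opposite homozygotes arise only through genotyping error). All parameter values, and the model itself, are simplifying assumptions, not the published method.

      import numpy as np

      def ibd_window_llr(g1, g2, freqs, err=0.005):
          """Window log-likelihood ratio: 'one haplotype shared IBD' vs 'unrelated'.

          g1, g2 : genotypes coded 0/1/2 (copies of the counted allele)
          freqs  : frequency of the counted allele at each site
          Under H0 (unrelated, HWE) opposite homozygotes occur with probability
          2*p^2*q^2; under H1 (a shared haplotype) they require a genotyping error.
          """
          p = np.asarray(freqs, dtype=float)
          q = 1.0 - p
          p_ibs0_h0 = 2.0 * p**2 * q**2
          opposite = ((g1 == 0) & (g2 == 2)) | ((g1 == 2) & (g2 == 0))
          llr = np.where(opposite,
                         np.log(err) - np.log(p_ibs0_h0),
                         np.log(1.0 - err) - np.log(1.0 - p_ibs0_h0))
          return llr.sum()

      rng = np.random.default_rng(4)
      m = 500
      freqs = rng.uniform(0.1, 0.9, m)
      hap = lambda: (rng.random(m) < freqs).astype(int)   # one simulated haplotype
      shared = hap()                                      # haplotype shared IBD
      print("unrelated pair:", round(ibd_window_llr(hap() + hap(), hap() + hap(), freqs), 1))
      print("related pair:  ", round(ibd_window_llr(shared + hap(), shared + hap(), freqs), 1))

    A strongly positive window score flags a putative shared segment, while unrelated pairs accumulate negative evidence; a real method additionally tunes window lengths and thresholds to control the false-positive rate the record mentions.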

  4. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimation of SPTRs using the Weibull distribution. The proposed novel methodology makes it possible to estimate, in less than one year, the reliability of SPTRs designed for more than 10 years of operation.
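
    To make the last two steps concrete, here is a minimal sketch of an Arrhenius acceleration factor and a two-parameter Weibull reliability evaluation of the kind the record describes. The activation energy, temperatures, mission time and Weibull parameters are illustrative assumptions, not values from the study.

      import math

      K_B = 8.617e-5  # Boltzmann constant, eV/K

      def arrhenius_af(ea_ev, t_use_c, t_stress_c):
          """Acceleration factor between stress and use temperatures (Arrhenius)."""
          t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
          return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

      def weibull_reliability(t, eta, beta):
          """Two-parameter Weibull reliability R(t) = exp[-(t/eta)^beta]."""
          return math.exp(-((t / eta) ** beta))

      af = arrhenius_af(ea_ev=0.7, t_use_c=20.0, t_stress_c=60.0)  # illustrative Ea and temperatures
      hours_use_equivalent = 8760.0 * af  # one year at the stress temperature, expressed in use hours
      print(f"acceleration factor {af:.1f}, equivalent use hours {hours_use_equivalent:.0f}")
      print("R(10 years):", weibull_reliability(t=87600.0, eta=3.0e5, beta=1.5))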

  5. Optimization of the general acceptability through affective tests and response surface methodology of a dry cacao powder mixture based beverage

    Directory of Open Access Journals (Sweden)

    Elena Chau Loo Kung

    2013-09-01

    Full Text Available The main objective of this research was to optimize the general acceptability, through affective tests and response surface methodology, of a beverage based on a dry cacao powder mixture. Formulations were obtained with cacao powder concentrations of 15%, 17.5% and 20% and lecithin concentrations of 0.1%, 0.3% and 0.5%, while maintaining a constant content of sugar (25%) and vanillin (1%), and including cacao powder with different pH values: natural (pH 5) and alkalinized (pH 6.5 and pH 8), with water by difference to 100%, generating a total of fifteen treatments to be evaluated according to the Box-Behnken design for three factors. The treatments underwent satisfaction level tests to establish the general acceptability. The treatment that included cacao powder at a concentration of 17.5%, pH 6.5 and a lecithin concentration of 0.3% obtained the best levels of acceptability. The software Statgraphics Plus 5.1 was used to obtain the treatment with maximum acceptability, which corresponded to cacao powder with pH 6.81 at a concentration of 18.24% and soy lecithin at 0.28%, consistent with the results of the satisfaction level tests. Finally, the optimum formulation was characterized physicochemically and microbiologically, and evaluated sensorially, obtaining an acceptability of 6.17.
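
    As an illustration of the response-surface step, the sketch below fits a second-order model to acceptability scores from a three-factor Box-Behnken design. The factor coding follows the record (cacao concentration, pH, lecithin concentration), but the runs, scores and fitted coefficients are purely illustrative and are not the study's data.

      import numpy as np
      from itertools import combinations

      def quadratic_design_matrix(x):
          """Columns: intercept, linear, two-factor interaction and squared terms."""
          n, k = x.shape
          cols = [np.ones(n)] + [x[:, i] for i in range(k)]
          cols += [x[:, i] * x[:, j] for i, j in combinations(range(k), 2)]
          cols += [x[:, i] ** 2 for i in range(k)]
          return np.column_stack(cols)

      # coded Box-Behnken settings for (cacao %, pH, lecithin %) plus centre points,
      # with illustrative mean acceptability scores from affective tests
      x = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                    [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
      y = np.array([5.1, 5.4, 5.6, 5.5, 5.0, 5.3, 5.2, 5.6,
                    5.2, 5.7, 5.3, 5.8, 6.1, 6.0, 6.2])

      beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x), y, rcond=None)
      print("fitted second-order coefficients:", np.round(beta, 3))

    The stationary point of the fitted surface, translated back from coded to natural units, is what yields an "optimum" formulation of the kind the record reports.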

  6. Clostridium difficile Testing Algorithm: Is There a Difference in Patients Who Test Positive by Enzyme Immunoassay vs. Those Who Only Test Positive by Nucleic Acid Amplification Methodology?

    OpenAIRE

    Polak, Jonathan; Odili, Ogheneruona; Craver, Mary Ashleigh; Mayen, Anthony; Purrman, Kyle; Rahman, Asem; Sang, Charlie Joseph; Cook, Paul P

    2017-01-01

    Background Testing for Clostridium difficile infection (CDI) commonly involves checking for the presence of toxins A and B by enzyme immunoassay (EIA) or nucleic acid amplification (NAA). The former is very specific, but not very sensitive. The latter is very sensitive. Beginning in 2011, our hospital incorporated an algorithm that involved testing liquid stool specimens for glutamate dehydrogenase (GDH) and toxin by EIA. For discrepant results, the stool specimen was tested for the ...

  7. Testing cost-effective methodologies for flood and seismic vulnerability assessment in communities of developing countries (Dajç, northern Albania)

    Directory of Open Access Journals (Sweden)

    Veronica Pazzi

    2016-05-01

    Full Text Available Nowadays many developing countries need effective measures to reduce disaster-related risks. Structural interventions are the most effective to achieve these aims. Nevertheless, in the absence of adequate financial resources, different low-cost strategies can be used to minimize losses. The purpose of this paper is to demonstrate that disaster risk reduction can be achieved by building a community's coping capacity. In the case study, flood and seismic analyses have been carried out using relatively simple and low-cost technologies, fundamental for governments and research institutions of poorly developed countries. In fact, through the acquisition and dissemination of this basic information, a reduction of vulnerability and risk can be achieved. In detail, two methodologies for the evaluation of hydraulic and seismic vulnerability were tested in the Dajç municipality (Northern Albania), a high-seismicity region that is also severely affected by floods. Updated bathymetric, topographic and hydraulic data were processed with HEC-RAS software to identify sites potentially affected by dykes overflowing. In addition, the soil-structure interaction effects for three strategic buildings were studied using microtremors and the Horizontal to Vertical Spectral Ratio method. This flood and seismic vulnerability analysis was then evaluated in terms of costs and ease of accessibility in order to suggest the best use both of the employed devices and the obtained information for designing good civil protection plans and to inform the population about the right behaviour in case of threat.
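
    As an illustration of the Horizontal to Vertical Spectral Ratio computation mentioned above, the sketch below derives a single-window H/V curve from a three-component microtremor record. Real processing adds segmentation into many windows, spectral smoothing and averaging, which are deliberately omitted here; the sampling rate and record are illustrative.

      import numpy as np

      def hvsr(north, east, vert, fs):
          """Single-window H/V spectral ratio from a three-component record."""
          n = len(vert)
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          amp = lambda x: np.abs(np.fft.rfft(x * np.hanning(n)))
          h = np.sqrt(amp(north) * amp(east))    # geometric mean of the horizontals
          return freqs[1:], (h / amp(vert))[1:]  # drop the DC bin

      rng = np.random.default_rng(3)
      fs, n = 100.0, 4096
      noise = lambda: rng.standard_normal(n)     # stand-in for a recorded window
      freqs, ratio = hvsr(noise(), noise(), noise(), fs)
      print("peak H/V frequency (Hz):", freqs[np.argmax(ratio)])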

  8. Visuospatial information processing load and the ratio between parietal cue and target P3 amplitudes in the Attentional Network Test.

    Science.gov (United States)

    Abramov, Dimitri M; Pontes, Monique; Pontes, Adailton T; Mourao-Junior, Carlos A; Vieira, Juliana; Quero Cunha, Carla; Tamborino, Tiago; Galhanone, Paulo R; deAzevedo, Leonardo C; Lazarev, Vladimir V

    2017-04-24

    In ERP studies of cognitive processes during attentional tasks, the cue signals containing information about the target can increase the amplitude of the parietal cue P3 in relation to the 'neutral' temporal cue, and reduce the subsequent target P3 when this information is valid, i.e. corresponds to the target's attributes. The present study compared the cue-to-target P3 ratios in neutral and visuospatial cueing, in order to estimate the contribution of valid visuospatial information from the cue to target stages of the task performance, in terms of cognitive load. The P3 characteristics were also correlated with the results of individuals' performance of the visuospatial tasks, in order to estimate the relationship of the observed ERP with spatial reasoning. In 20 typically developing boys, aged 10-13 years (11.3±0.86), the intelligence quotient (I.Q.) was estimated by the Block Design and Vocabulary subtests from the WISC-III. The subjects performed the Attentional Network Test (ANT) accompanied by EEG recording. The cued two-choice task had three equiprobable cue conditions: No cue, with no information about the target; Neutral (temporal) cue, with an asterisk in the center of the visual field, predicting the target onset; and Spatial cues, with an asterisk in the upper or lower hemifield, predicting the onset and corresponding location of the target. The ERPs were estimated for the mid-frontal (Fz) and mid-parietal (Pz) scalp derivations. In the Pz, the Neutral cue P3 had a lower amplitude than the Spatial cue P3; whereas for the target ERPs, the P3 of the Neutral cue condition was larger than that of the Spatial cue condition. However, the sums of the magnitudes of the cue and target P3 were equal in the spatial and neutral cueing, probably indicating that in both cases the equivalent information processing load is included in either the cue or the target reaction, respectively. Meantime, in the Fz, the analog ERP components for both the cue and target

  9. Effect of mixing geopolymer and peat on bearing capacity in Ogan Komering Ilir (OKI) by California bearing ratio (CBR) test

    Science.gov (United States)

    Raharja, Danang S.; Hadiwardoyo, Sigit P.; Rahayu, Wiwik; Zain, Nasuhi

    2017-06-01

    Geopolymer is a binder material that consists of solid material and an activator solution. Geopolymer material has successfully replaced cement in the manufacture of concrete with an aluminosilicate bonding system. Geopolymer concrete has properties similar to cement concrete, with high compressive strength, a low shrinkage value, a relatively low creep value, as well as acid resistance. Based on these, the addition of polymers in peat soils is expected to improve the bearing capacity of peat soils. A study on the influence of geopolymer addition in peat soils was done by comparing before and after the peat soil was mixed with geopolymer using the CBR (California Bearing Ratio) test in unsoaked and soaked conditions. A mixture content of 10% of the peat dry weight was used, with curing times of 4 hours, 5 days, and 10 days. There were two methods of mixing: first, peat was mixed with fly ash geopolymer activators and mixed solution (waterglass, NaOH, water), and second, peat was mixed with fly ash and mixed geopolymer (waterglass, NaOH, water, fly ash). Changes were observed in specific gravity, dry density, acidity (pH), and the microscopic structure with a Scanning Electron Microscope (SEM). Curing time did not significantly affect the CBR value; it even showed a tendency to decline with longer curing times. The first type mixture obtained CBR values of 5.4% for 4 hours curing, 4.6% for 5 days curing and 3.6% for 10 days curing. The second type mixture obtained CBR values of 6.1% for 4 hours curing, 5.2% for 5 days curing and 5.2% for 10 days curing. Furthermore, the specific gravity, dry density and swelling percentage increased, and the pH moved closer to neutral. Of the two variants, the second type mixture showed better results than the first type mixture. The results of SEM (Scanning Electron Microscopy) show the structure of the peat which became denser with the fly ash particles filling the peat micropores. Also, the reaction of fly ash with geopolymer is indicated by the solid

  10. Mirror-mark tests performed on jackdaws reveal potential methodological problems in the use of stickers in avian mark-test studies.

    Directory of Open Access Journals (Sweden)

    Manuel Soler

    Full Text Available Some animals are capable of recognizing themselves in a mirror, which is considered to be demonstrated by passing the mark test. Mirror self-recognition capacity has been found in just a few mammals having very large brains and only in one bird, the magpie (Pica pica). The results obtained in magpies have enormous biological and cognitive implications because the fact that magpies were able to pass the mark test meant that this species is at the same cognitive level as great apes, that mirror self-recognition has evolved independently in the magpie and great apes (which diverged 300 million years ago), and that the neocortex (which is not present in birds' brains) is not a prerequisite for mirror self-recognition as previously believed. Here, we have replicated the experimental design used on magpies to determine whether jackdaws (Corvus monedula) are also capable of mirror self-recognition by passing the mark test. We found that our nine jackdaws showed a very high interest towards the mirror and exhibited self-contingent behavior as soon as mirrors were introduced. However, jackdaws were not able to pass the mark test: both sticker-directed actions and sticker removal were performed with a similar frequency in both the cardboard (control) and the mirror conditions. We conclude that our jackdaws' behaviour raises non-trivial questions about the methodology used in the avian mark test. Our study suggests that the use of self-adhesive stickers on sensitive throat feathers may open the way to artefactual results because birds might perceive the stickers tactilely.

  11. Experimental study on the natural gas dual fuel engine test and the higher the mixture ratio of hydrogen to natural gas

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B.S.; Lee, Y.S.; Park, C.K. [Cheonnam University, Kwangju (Korea); Masahiro, S. [Kyoto University, Kyoto (Japan)

    1999-05-28

    One of the unsolved problems of the natural gas dual fuel engine is excessive exhaust of total hydrocarbons (THC) at a low equivalent mixture ratio. To address this, natural gas mixed with hydrogen was applied in an engine test. The results showed that the higher the mixture ratio of hydrogen to natural gas, the higher the combustion efficiency. When the amount of intake air reached 90% of WOT, the combustion efficiency also improved. However, as with advancing the injection timing, the equivalent mixture ratio at the knocking limit decreases and NOx production increases. 5 refs., 9 figs., 1 tab.

  12. The Lithium isotope ratio in Population II halo dwarfs: A proposed test of the late decaying massive particle nucleosynthesis scenario

    International Nuclear Information System (INIS)

    Brown, L.; Schramm, D.N.

    1988-02-01

    It is shown that observations of the Lithium isotope ratio in high surface temperature Population II stars may be critical to cosmological nucleosynthesis models. In particular, decaying particle scenarios as derived in some supersymmetric models may stand or fall with such observations. 15 refs., 3 figs., 2 tabs

  13. A Methodological Framework for Assessing Agents, Proximate Drivers and Underlying Causes of Deforestation: Field Test Results from Southern Cameroon

    Directory of Open Access Journals (Sweden)

    Sophia Carodenuto

    2015-01-01

    Full Text Available The international debates on REDD+ and the expectations to receive results-based payments through international climate finance have triggered considerable political efforts to address deforestation and forest degradation in many potential beneficiary countries. Whether a country will receive such REDD+ payments is largely contingent on its ability to effectively address the relevant drivers, and to govern the context-dependent agents and forces responsible for forest loss or degradation. Currently, many REDD+ countries are embarking on the necessary analytical steps for their national REDD+ strategies. In this context, a comprehensive understanding of drivers and their underlying causes is a fundamental prerequisite for developing effective policy responses. We developed a methodological framework for assessing the drivers and underlying causes of deforestation and use the Fako Division in Southern Cameroon as a case study to test this approach. The steps described in this paper can be adapted to other geographical contexts, and the results of such assessments can be used to inform policy makers and other stakeholders.

  14. Kaner biodiesel production through hybrid reactor and its performance testing on a CI engine at different compression ratios

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Yadav

    2017-06-01

    Full Text Available The present study deals with the development of a hybrid reactor for biodiesel production based on combined hydrodynamic cavitation and mechanical stirring processes. Biodiesel was produced using Kaner Seed Oil (KSO). The experimental results show that the hybrid reactor produces a 95% biodiesel yield within 45 min with 0.75% catalyst and a 6:1 molar ratio, which is significantly higher compared to mechanical stirring or hydrodynamic cavitation alone. Thus the biodiesel production process in the hybrid reactor is cheap (high yield), efficient (time saving) and environmentally friendly (lower percentage of catalyst). A performance study on an engine shows that an increase in compression ratio (from 16 to 18) improves the engine performance using biodiesel blends as compared to petroleum diesel.

  15. Screening Test for Shed Skin Cells by Measuring the Ratio of Human DNA to Staphylococcus epidermidis DNA.

    Science.gov (United States)

    Nakanishi, Hiroaki; Ohmori, Takeshi; Hara, Masaaki; Takahashi, Shirushi; Kurosu, Akira; Takada, Aya; Saito, Kazuyuki

    2016-05-01

    A novel screening method for shed skin cells by detecting Staphylococcus epidermidis (S. epidermidis), which is a resident bacterium on skin, was developed. Staphylococcus epidermidis was detected using real-time PCR. Staphylococcus epidermidis was detected in all 20 human skin surface samples. Although not present in blood and urine samples, S. epidermidis was detected in 6 of 20 saliva samples, and 5 of 18 semen samples. The ratio of human DNA to S. epidermidis DNA was significantly smaller in human skin surface samples than in saliva and semen samples in which S. epidermidis was detected. Therefore, although skin cells could not be identified by detecting only S. epidermidis, they could be distinguished by measuring the S. epidermidis to human DNA ratio. This method could be applied to casework touch samples, which suggests that it is useful for screening whether skin cells and human DNA are present on potential evidentiary touch samples. © 2016 American Academy of Forensic Sciences.

  16. Applications of Isotope Ratio Mass Spectrometry in Sports Drug Testing Accounting for Isotope Fractionation in Analysis of Biological Samples.

    Science.gov (United States)

    Piper, Thomas; Thevis, Mario

    2017-01-01

    The misuse of anabolic-androgenic steroids (AAS) in sports aiming at enhancing athletic performance has been a challenging matter for doping control laboratories for decades. While the presence of a xenobiotic AAS or its metabolite(s) in human urine immediately represents an antidoping rule violation, the detection of the misuse of endogenous steroids such as testosterone necessitates comparably complex procedures. Concentration thresholds and diagnostic analyte ratios computed from urinary steroid concentrations of, e.g., testosterone and epitestosterone have aided identifying suspicious doping control samples in the past. These ratios can however also be affected by confounding factors and are therefore not sufficient to prove illicit steroid administrations. Here, carbon and, in rare cases, hydrogen isotope ratio mass spectrometry (IRMS) has become an indispensable tool. Importantly, the isotopic signatures of pharmaceutical steroid preparations commonly differ slightly but significantly from those found with endogenously produced steroids. By comparing the isotope ratios of endogenous reference compounds like pregnanediol to that of testosterone and its metabolites, the unambiguous identification of the urinary steroids' origin is accomplished. Due to the complex urinary matrix, several steps in sample preparation are inevitable as pure analyte peaks are a prerequisite for valid IRMS determinations. The sample cleanup encompasses steps such as solid phase or liquid-liquid extraction that are presumably not accompanied by isotopic fractionation processes, as well as more critical steps like enzymatic hydrolysis, high-performance liquid chromatography fractionation, and derivatization of analytes. In order to exclude any bias of the analytical results, each step of the analytical procedure is optimized and validated to exclude, or at least result in constant, isotopic fractionation. These efforts are explained in detail. © 2017 Elsevier Inc. All rights reserved.

  17. Nonlinear relationship between the Product Consistency Test (PCT) response and the Al/B ratio in a soda-lime aluminoborosilicate glass

    Energy Technology Data Exchange (ETDEWEB)

    Farooqi, Rahmat Ullah, E-mail: rufarooqi@postech.ac.kr [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Hrma, Pavel [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Pacific Northwest National Laboratory, Richland, WA (United States)

    2016-06-15

    We have investigated the effect of Al/B ratio on the Product Consistency Test (PCT) response. In an aluminoborosilicate soda-lime glass based on a modified International Simple Glass, ISG-3, the Al/B ratio varied from 0 to 0.55 (in mole fractions). In agreement with various models of the PCT response as a function of glass composition, we observed a monotonic increase of B and Na releases with decreasing Al/B mole ratio, but only when the ratio was higher than 0.05. Below this value (Al/B < 0.05), we observed a sharp decrease that we attribute to B in tetrahedral coordination.

  18. Searching for degenerate Higgs bosons: a profile likelihood ratio method to test for mass-degenerate states in the presence of censored data and uncertainties

    CERN Document Server

    David, André; Petrucciani, Giovanni

    2015-01-01

    Using the likelihood ratio test statistic, we present a method which can be employed to test the hypothesis of a single Higgs boson using the matrix of measured signal strengths. This method can be applied in the presence of censored data and takes into account uncertainties on the measurements. The p-value against the hypothesis of a single Higgs boson is defined from the expected distribution of the test statistic, generated using pseudo-experiments. The applicability of the likelihood-based test is demonstrated using numerical examples with uncertainties and missing matrix elements.
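
    As an illustration of the test described above, reduced to its simplest form, the sketch below computes the likelihood-ratio statistic for "one common signal strength" versus "independent signal strengths" from two Gaussian measurements and derives a p-value from pseudo-experiments. The handling of censored matrix elements and systematic uncertainties in the actual method is not reproduced, and the measurements and errors are illustrative.

      import numpy as np

      def q_single_state(mu, sig):
          """-2 ln(L_single / L_independent) for Gaussian signal-strength measurements."""
          mu, sig = np.asarray(mu, float), np.asarray(sig, float)
          mu_hat = np.sum(mu / sig**2) / np.sum(1.0 / sig**2)  # best-fit common strength
          return np.sum(((mu - mu_hat) / sig) ** 2)

      def p_value(mu, sig, n_pseudo=20000, seed=5):
          """p-value against the single-state hypothesis from pseudo-experiments."""
          rng = np.random.default_rng(seed)
          mu, sig = np.asarray(mu, float), np.asarray(sig, float)
          q_obs = q_single_state(mu, sig)
          mu_hat = np.sum(mu / sig**2) / np.sum(1.0 / sig**2)
          q_toys = np.array([q_single_state(rng.normal(mu_hat, sig), sig)
                             for _ in range(n_pseudo)])
          return np.mean(q_toys >= q_obs)

      # two channels measuring the strength of what may be one state or two
      print(p_value(mu=[0.8, 1.6], sig=[0.25, 0.30]))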

  19. Initiation of depleted uranium oxide and spent fuel testing for the spent fuel sabotage aerosol ratio programme

    International Nuclear Information System (INIS)

    Molecke, M.A.; Gregson, M.W.; Sorenson, K.B.

    2004-01-01

    We provide a detailed overview of an on-going, multinational test programme that is developing aerosol data for some spent fuel sabotage scenarios on spent fuel transport and storage casks. Experiments are being performed to quantify the aerosolised materials plus volatilised fission products generated from actual spent fuel and surrogate material test rods, due to impact by a high-energy/density device. The programme participants in the United States plus Germany, France and the United Kingdom, part of the international Working Group for Sabotage Concerns of Transport and Storage Casks (WGSTSC) have strongly supported and coordinated this research programme. Sandia National Laboratories has the lead role for conducting this research programme; test programme support is provided by both the US Department of Energy and the US Nuclear Regulatory Commission. We provide a summary of the overall, multiphase test design and a description of all explosive containment and aerosol collection test components used. We focus on the recently initiated tests on 'surrogate' spent fuel, unirradiated depleted uranium oxide and forthcoming actual spent fuel tests. We briefly summarise similar results from completed surrogate tests that used non-radioactive, sintered cerium oxide ceramic pellets in test rods. (author)

  20. The fast ratio: A rapid measure for testing the dominance of the fast component in the initial OSL signal from quartz

    International Nuclear Information System (INIS)

    Durcan, Julie A.; Duller, Geoff A.T.

    2011-01-01

    The signal from the fast component is usually considered preferable for quartz optically stimulated luminescence (OSL) dating; however, its presence in a continuous wave (CW) OSL signal is often assumed, rather than verified. This paper presents an objective measure (termed the fast ratio) for testing the dominance of the fast component in the initial part of a quartz OSL signal. The ratio is based upon the photoionisation cross-sections of the fast and medium components and the power of the measurement equipment used to record the OSL signal, and it compares parts of the OSL signal selected to represent the fast and medium components. The ability of the fast ratio to distinguish between samples whose CW-OSL signal is dominated by the fast and non-fast components is demonstrated by comparing the fast ratio with the contribution of the fast component calculated from curve deconvolution of measured OSL signals and from simulated data. The ratio offers a rapid method for screening a large number of OSL signals obtained for individual equivalent dose estimates, it can be calculated and applied as easily as other routine screening methods, and is transferable between different aliquots, samples and measurement equipment. - Highlights: The fast ratio is a measure which tests the dominance of the fast component in quartz OSL signals. A fast ratio above 20 implies a CW-OSL signal is dominated by the fast component. The fast ratio can be easily and rapidly applied to a large number of OSL signals. Uses include signal comparison, data screening and identifying the need for further analysis.
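
    The exact signal channels used by the authors are derived from the photoionisation cross-sections and stimulation power; the sketch below only shows the general shape of such a ratio, comparing a background-corrected early window with a later window, on a synthetic decay curve. The window times, decay constants and background are illustrative assumptions, not the published prescription.

      import numpy as np

      def fast_ratio(counts, dt, t_fast=(0.0, 0.4), t_medium=(2.0, 2.4), t_bg=(38.0, 40.0)):
          """Ratio of (early - background) to (later - background) CW-OSL signal.

          counts : photon counts per channel of a CW-OSL decay curve
          dt     : channel width in seconds
          The three time windows are illustrative; in practice they are chosen
          from the fast/medium photoionisation cross-sections and stimulation power.
          """
          t = np.arange(len(counts)) * dt
          mean_in = lambda lo, hi: counts[(t >= lo) & (t < hi)].mean()
          l1 = mean_in(*t_fast)     # early signal, dominated by the fast component
          l2 = mean_in(*t_medium)   # later signal, after the fast component has decayed
          l3 = mean_in(*t_bg)       # late-time background
          return (l1 - l3) / (l2 - l3)

      # synthetic decay: strong fast + weaker medium component + flat background
      dt = 0.1
      t = np.arange(0, 40, dt)
      counts = 5000 * np.exp(-t / 0.4) + 300 * np.exp(-t / 6.0) + 20
      print("fast ratio:", round(fast_ratio(counts, dt), 1))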

  1. Design, Development and Tests in Real Time of Control Methodologies for a Morphing Wing in Wind Tunnel

    Science.gov (United States)

    Tchatchueng Kammegne, Michel Joel

    In order to leave a cleaner environmental space to future generations, the international community has been mobilized to find green solutions that are effective and feasible in all sectors. The CRIAQ MDO505 project was initiated to test the morphing wingtip (wing and aileron) technology as one of these possible solutions. The main objectives of this project are: the design and manufacturing of a morphing wing prototype; the extension and control of the laminar region over the extrados; and the comparison of the effects of the morphing and rigid ailerons in terms of lift, drag and pressure distributions. The advantage of the extension of the laminar region over a wing is the drag reduction that results from delaying the transition towards its trailing edge. The location of the transition region depends on the flight case and it is controlled, for a morphing wing, via the actuators' positions and displacements. Therefore, this thesis work focuses on the control of the actuators' positions and displacements. This thesis presents essentially the modeling, instrumentation and wind tunnel testing results. Three series of wind tunnel tests with different values of aileron deflection angle, angle of attack and Mach number have been performed in the subsonic wind tunnel of the IAR-NRC. The wing model consisted of stringers, ribs, spars and a flexible upper surface made of composite materials (glass and carbon fiber), a rigid aileron and a flexible aileron. The aileron was able to move between +/-6 degrees. The demonstrator's span measures 1.5 m and its chord measures 1.5 m. Structural analyses have been performed to determine the ply orientations and the number of fiberglass layers for the flexible skin. These analyses also allowed the determination of the actuator forces needed to push and pull the wing upper surface. The 2D solver XFoil and the 3D solver Fluent were used to find the optimized airfoil and the optimal location of the transition for each flight case. Based on the analyses done by the

  2. Development of a methodology for successful multigeneration life-cycle testing of the estuarine sheepshead minnow, Cyprinodon variegatus.

    Science.gov (United States)

    Cripe, G M; Hemmer, B L; Goodman, L R; Vennari, J C

    2009-04-01

    Evaluation of effects on fish reproduction and development during chemical exposures lasting for multiple generations is sometimes limited by variable reproductive responses and the time required for the exposure. Established testing methods and the short life cycle of the sheepshead minnow, Cyprinodon variegatus, make this species particularly suitable for use in identifying potential impacts of contaminants in estuarine and marine environments. This study describes the refinement of life-cycle exposure methods that increased the reliability of reproduction in sheepshead minnows and reduced the time to maturation for larvae and juvenile fish. A test of three spawning chamber designs, three sex ratios, and two photoperiods identified conditions that reduced the coefficient of variation in egg production from >100% to as little as 32%. The most reliable results were produced with groups of three female and two male fish (all of similar size) when they were placed in a rectangular chamber and acclimated for 12 days. A test water temperature of 26.5 ± 2 °C and a 14L:10D photoperiod resulted in fish producing a mean of 74 embryos per female per day, with a coefficient of variation of 31.8%. Egg fertility exceeded 90%, with a hatch rate of 95% for normal embryos (≥80% yolk). Fish size (≥2.7 cm standard length) was critical for spawning readiness. Adult fish were prepared for the spawning assessment by adding frozen brine shrimp to their diet. Results of these experiments provide methods that are of particular interest in the assessment of endocrine-disrupting chemicals known to affect reproduction.

  3. Development of a calibration methodology and tests of kerma area product meters; Desenvolvimento de uma metodologia de calibracao e testes de medidores de produto Kerma-Area

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Nathalia Almeida

    2013-07-01

    The quantity kerma area product (PKA) is important for establishing reference levels in diagnostic radiology examinations. This quantity can be obtained using a PKA meter. The use of such meters is essential to evaluate the radiation dose in radiological procedures and is a good indicator for ensuring that the dose limit to the patient's skin is not exceeded. Sometimes these meters come fixed to the X-ray equipment, which makes their calibration difficult. In this work, a methodology for the calibration of PKA meters was developed. The instrument used for this purpose was the Patient Dose Calibrator (PDC), which was developed to be used as a reference to check the calibration of PKA and air kerma meters used for patient dosimetry and to verify the consistency and behavior of automatic exposure control systems. Because the PDC is new equipment which, in Brazil, is not yet used as a reference for calibration, quality control of this equipment was also performed, comprising characterization tests, calibration and an evaluation of the energy dependence. After the tests, it was shown that the PDC can be used as a reference instrument and that the calibration must be performed in situ, so that the characteristics of each X-ray unit where the PKA meters are used are taken into account. The calibration was then performed with portable PKA meters and on an interventional radiology unit with a fixed PKA meter. The results were good and demonstrated the need for calibration of these meters and the importance of in situ calibration with a reference meter. (author)

  4. Testing hypotheses for excess flower production and low fruit-to-flower ratios in a pollinating seed-consuming mutualism

    Science.gov (United States)

    Holland, J. Nathaniel; Bronstein, Judith L.; DeAngelis, Donald L.

    2004-01-01

    Pollinator attraction, pollen limitation, resource limitation, pollen donation and selective fruit abortion have all been proposed as processes explaining why hermaphroditic plants commonly produce many more flowers than mature fruit. We conducted a series of experiments in Arizona to investigate low fruit-to-flower ratios in senita cacti, which rely exclusively on pollinating seed-consumers. Selective abortion of fruit based on seed predators is of particular interest in this case because plants relying on pollinating seed-consumers are predicted to have such a mechanism to minimize seed loss. Pollinator attraction and pollen dispersal increased with flower number, but fruit set did not, refuting the hypothesis that excess flowers increase fruit set by attracting more pollinators. Fruit set of natural- and hand-pollinated flowers did not differ, supporting the resource, rather than pollen, limitation hypothesis. Senita did abort fruit, but not selectively based on pollen quantity, pollen donors, or seed predators. Collectively, these results are consistent with sex allocation theory in that resource allocation to excess flower production can increase pollen dispersal and the male fitness function of flowers, but consequently results in reduced resources available for fruit set. Inconsistent with sex allocation theory, however, fruit production and the female fitness function of flowers may actually increase with flower production. This is because excess flower production lowers pollinator-to-flower ratios and results in fruit abortion, both of which limit the abundance, and hence oviposition rates, of pre-dispersal seed predators.

  5. Golden Ratio

    Indian Academy of Sciences (India)

    Our attraction to another body increases if the body is symmetrical and in proportion. If a face or a structure is in proportion, we are more likely to notice it and find it beautiful. The universal ratio of beauty is the 'Golden Ratio', found in many structures. This ratio comes from Fibonacci numbers. In this article, we explore this ...

  6. Golden Ratio

    Indian Academy of Sciences (India)

    Keywords. Fibonacci numbers, golden ratio, Sanskrit prosody, solar panel. Abstract. Our attraction to another body increases if the body is symmetrical and in proportion. If a face or a structure is in proportion, we are more likely to notice it and find it beautiful. The universal ratio of beauty is the 'Golden Ratio', found in many ...

  7. Golden Ratio

    Indian Academy of Sciences (India)

    Our attraction to another body increases if the body is symmetrical and in proportion. If a face or a structure is in proportion, we are more likely to notice it and find it beautiful. The universal ratio of beauty is the 'Golden Ratio', found in many structures. This ratio comes from Fibonacci numbers. In this article, we explore this ...

  8. 236U and 239,240Pu ratios from soils around an Australian nuclear weapons test site

    International Nuclear Information System (INIS)

    Tims, S.G.; Froehlich, M.B.; Fifield, L.K.; Wallner, A.; De Cesare, M.

    2016-01-01

    The isotopes 236U, 239Pu and 240Pu are present in surface soils as a result of global fallout from nuclear weapons tests carried out in the 1950s and 1960s. These isotopes potentially constitute artificial tracers of recent soil erosion and sediment movement. Only Accelerator Mass Spectrometry has the requisite sensitivity to measure all three isotopes at these environmental levels. Coupled with its relatively high throughput capabilities, this makes it feasible to conduct studies of erosion across the geographical extent of the Australian continent. In the Australian context, however, global fallout is not the only source of these isotopes. As part of its weapons development program the United Kingdom carried out a series of atmospheric and surface nuclear weapons tests at Maralinga, South Australia in 1956 and 1957. The tests have made a significant contribution to the Pu isotopic abundances present in the region around Maralinga and out to distances of ∼1000 km, and impact on the assessment techniques used in the soil and sediment tracer studies. Quantification of the relative fallout contribution derived from detonations at Maralinga is complicated owing to significant contamination around the test site from numerous nuclear weapons safety trials that were also carried out there. We show that 236U can provide new information on the component of the fallout that is derived from the local nuclear weapons tests, and highlight the potential of 236U as a new fallout tracer. - Highlights: • Measured 236U inventories around the Maralinga nuclear weapons test site. • Comparison of 236U and 239Pu soil depth profiles at Maralinga. • Differences in 236U and 239Pu inventories indicate most Pu fallout is from the safety trials, rather than the weapons tests.

  9. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specifically addressed are…

  10. (236)U and (239,240)Pu ratios from soils around an Australian nuclear weapons test site.

    Science.gov (United States)

    Tims, S G; Froehlich, M B; Fifield, L K; Wallner, A; De Cesare, M

    2016-01-01

    The isotopes (236)U, (239)Pu and (240)Pu are present in surface soils as a result of global fallout from nuclear weapons tests carried out in the 1950s and 1960s. These isotopes potentially constitute artificial tracers of recent soil erosion and sediment movement. Only Accelerator Mass Spectrometry has the requisite sensitivity to measure all three isotopes at these environmental levels. Coupled with its relatively high throughput capabilities, this makes it feasible to conduct studies of erosion across the geographical extent of the Australian continent. In the Australian context, however, global fallout is not the only source of these isotopes. As part of its weapons development program the United Kingdom carried out a series of atmospheric and surface nuclear weapons tests at Maralinga, South Australia in 1956 and 1957. The tests have made a significant contribution to the Pu isotopic abundances present in the region around Maralinga and out to distances of ∼1000 km, and impact on the assessment techniques used in the soil and sediment tracer studies. Quantification of the relative fallout contribution derived from detonations at Maralinga is complicated owing to significant contamination around the test site from numerous nuclear weapons safety trials that were also carried out around the site. We show that (236)U can provide new information on the component of the fallout that is derived from the local nuclear weapons tests, and highlight the potential of (236)U as a new fallout tracer. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  11. Discrepancy in Vancomycin AUC/MIC Ratio Targeted Attainment Based upon the Susceptibility Testing in Staphylococcus aureus.

    Science.gov (United States)

    Eum, Seenae; Bergsbaken, Robert L; Harvey, Craig L; Warren, J Bryan; Rotschafer, John C

    2016-09-27

    This study demonstrated a statistically significant difference in vancomycin minimum inhibitory concentration (MIC) for Staphylococcus aureus between a common automated system (Vitek 2) and the E-test method in patients with S. aureus bloodstream infections. At an area under the serum concentration-time curve (AUC) threshold of 400 mg∙h/L, we would have reached the current Infectious Diseases Society of America (IDSA)/American Society of Health System Pharmacists (ASHP)/Society of Infectious Diseases Pharmacists (SIDP) guideline-suggested AUC/MIC target in almost 100% of patients using the Vitek 2 MIC data; however, we could only generate 40% target attainment using the E-test MIC data. An AUC of 450 mg∙h/L or greater was required to achieve 100% target attainment using either Vitek 2 or E-test MIC results.
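
    As a minimal illustration of the target-attainment arithmetic (not the study's data), the sketch below computes the fraction of patients whose AUC/MIC ratio meets a 400 mg∙h/L threshold under two hypothetical sets of MIC results for the same patients:

```python
import numpy as np

def target_attainment(auc_mg_h_L, mic_mg_L, target=400.0):
    """Fraction of patients whose vancomycin AUC/MIC ratio meets the target."""
    auc = np.asarray(auc_mg_h_L, dtype=float)
    mic = np.asarray(mic_mg_L, dtype=float)
    return np.mean(auc / mic >= target)

# Hypothetical paired data: the same patients' AUC estimates with MICs reported
# by an automated system versus gradient-strip (E-test style) testing.
auc = np.array([420, 510, 460, 395, 600, 480, 430, 550, 405, 470])
mic_automated = np.full(10, 0.5)
mic_gradient  = np.array([1.0, 1.5, 1.0, 1.0, 1.5, 1.0, 1.5, 1.0, 1.0, 1.5])

print("attainment (automated MICs):     ", target_attainment(auc, mic_automated))
print("attainment (gradient-strip MICs):", target_attainment(auc, mic_gradient))
```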

  12. Discrimination of DPRK M5.1 February 12th, 2013 Earthquake as Nuclear Test Using Analysis of Magnitude, Rupture Duration and Ratio of Seismic Energy and Moment

    Science.gov (United States)

    Salomo Sianipar, Dimas; Subakti, Hendri; Pribadi, Sugeng

    2015-04-01

    On the morning of February 12th, 2013, at 02:57 UTC, an earthquake occurred with its epicenter in North Korea, near the Sungjibaegam Mountains. Monitoring stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) and several other seismic networks detected this shallow seismic event. Analysis of seismograms recorded after this event can discriminate between a natural earthquake and an explosion. Zhao et al. (2014) successfully discriminated this seismic event of the 2013 North Korea nuclear test from ordinary earthquakes based on network P/S spectral ratios using broadband regional seismic data recorded in China, South Korea and Japan. P/S-type spectral ratios are powerful discriminants for separating explosions from earthquakes (Zhao et al., 2014). Pribadi et al. (2014) characterized 27 earthquake-generated tsunamis (tsunamigenic earthquakes or tsunami earthquakes) from 1991 to 2012 in Indonesia using W-phase inversion analysis, the ratio between the seismic energy (E) and the seismic moment (Mo), the moment magnitude (Mw), the rupture duration (To) and the distance of the hypocenter to the trench. Some of these methods were also used here to characterize the nuclear test event. We discriminate the DPRK M5.1 February 12th, 2013 earthquake from a natural earthquake using analysis of the magnitudes mb, Ms and Mw, the ratio of seismic energy to seismic moment, and the rupture duration. We used waveform data for seismicity within a radius of 5 degrees of the DPRK M5.1 February 12th, 2013 epicenter at 41.29, 129.07 (Zhang and Wen, 2013), from 2006 to 2014, with magnitude M ≥ 4.0. We conclude that this event was a shallow seismic event with explosion characteristics and can be discriminated from a natural or tectonic earthquake. Keywords: North Korean nuclear test, magnitudes mb, Ms, Mw, ratio between seismic energy and moment, rupture duration
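
    For readers who want to see the shape of such a screen, the sketch below computes the commonly used energy-to-moment parameter Theta = log10(E/Mo) and a simple mb-Ms comparison. The threshold values and the event numbers are illustrative assumptions, not the authors' calibrated criteria:

```python
import math

def theta_parameter(energy_J, moment_Nm):
    """Energy-to-moment discriminant Theta = log10(E / M0)."""
    return math.log10(energy_J / moment_Nm)

def looks_like_explosion(mb, ms, theta, mb_ms_margin=1.0, theta_floor=-4.9):
    """Very rough screen: explosions tend to show mb much larger than Ms and a
    short, energetic source (relatively high Theta). Thresholds are illustrative."""
    return (mb - ms) > mb_ms_margin and theta > theta_floor

# Hypothetical numbers for a small shallow event
mb, ms = 5.1, 3.6
E, M0 = 1.0e12, 4.0e16          # seismic energy (J) and moment (N*m), assumed
theta = theta_parameter(E, M0)
print(f"Theta = {theta:.2f}, mb-Ms = {mb - ms:.1f}, "
      f"explosion-like: {looks_like_explosion(mb, ms, theta)}")
```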

  13. Development of acoustically lined ejector technology for multitube jet noise suppressor nozzles by model and engine tests over a wide range of jet pressure ratios and temperatures

    Science.gov (United States)

    Atvars, J.; Paynter, G. C.; Walker, D. Q.; Wintermeyer, C. F.

    1974-01-01

    An experimental program comprising model nozzle and full-scale engine tests was undertaken to acquire parametric data for acoustically lined ejectors applied to primary jet noise suppression. Ejector lining design technology and acoustical scaling of lined ejector configurations were the major objectives. Ground static tests were run with a J-75 turbojet engine fitted with a 37-tube, area ratio 3.3 suppressor nozzle and two lengths of ejector shroud (L/D = 1 and 2). Seven ejector lining configurations were tested over the engine pressure ratio range of 1.40 to 2.40 with corresponding jet velocities between 305 and 610 m/sec. One-fourth scale model nozzles were tested over a pressure ratio range of 1.40 to 4.0 with jet total temperatures between ambient and 1088 K. Scaling of multielement nozzle ejector configurations was also studied using a single element of the nozzle array with identical ejector lengths and lining materials. Acoustic far field and near field data together with nozzle thrust performance and jet aerodynamic flow profiles are presented.

  14. A comparison of discriminant logistic regression and Item Response Theory Likelihood-Ratio Tests for Differential Item Functioning (IRTLRDIF) in polytomous short tests.

    Science.gov (United States)

    Hidalgo, María D; López-Martínez, María D; Gómez-Benito, Juana; Guilera, Georgina

    2016-01-01

    Short scales are typically used in the social, behavioural and health sciences. This is relevant since test length can influence whether items showing DIF are correctly flagged. This paper compares the relative effectiveness of discriminant logistic regression (DLR) and IRTLRDIF for detecting DIF in polytomous short tests. A simulation study was designed in which test length, sample size, DIF amount and the number of item response categories were manipulated. Type I error and power were evaluated. IRTLRDIF and DLR yielded Type I error rates close to the nominal level in no-DIF conditions. Under DIF conditions, Type I error rates were affected by test length, DIF amount, degree of test contamination, sample size and the number of item response categories. DLR showed a higher Type I error rate than did IRTLRDIF. Power rates were affected by DIF amount and sample size, but not by test length. DLR achieved higher power rates than did IRTLRDIF in very short tests, although the high Type I error rate involved means that this result cannot be taken into account. Test length had an important impact on the Type I error rate. IRTLRDIF and DLR showed low power rates in short tests and with small sample sizes.
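
    The polytomous DLR and IRTLRDIF procedures evaluated in the study are not reproduced here, but the general idea of a regression-based DIF screen via a likelihood ratio between nested models can be sketched for a single dichotomous item. The simulated data and variable names below are hypothetical (a minimal sketch using statsmodels):

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)

# Simulated binary item responses for two groups (0 = reference, 1 = focal).
n = 1000
group = rng.integers(0, 2, n)
theta = rng.normal(0, 1, n)                  # latent ability
total = theta + rng.normal(0, 0.5, n)        # observed matching score
dif_effect = 0.6                              # uniform DIF injected for the focal group
p = 1 / (1 + np.exp(-(1.2 * theta - dif_effect * group)))
item = rng.binomial(1, p)

# Nested logistic models: matching score only vs score + group + interaction.
X0 = sm.add_constant(np.column_stack([total]))
X1 = sm.add_constant(np.column_stack([total, group, total * group]))
ll0 = sm.Logit(item, X0).fit(disp=0).llf
ll1 = sm.Logit(item, X1).fit(disp=0).llf

lrt = 2 * (ll1 - ll0)                         # ~ chi-square with 2 df under no DIF
p_value = stats.chi2.sf(lrt, df=2)
print(f"LRT = {lrt:.2f}, p = {p_value:.4f}")
```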

  15. Authenticity testing of environment-friendly Korean rice (Oryza sativa L.) using carbon and nitrogen stable isotope ratio analysis.

    Science.gov (United States)

    Chung, Ill-Min; Park, Sung-Kyu; Lee, Kyoung-Jin; An, Min-Jeong; Lee, Ji-Hee; Oh, Yong-Taek; Kim, Seung-Hyun

    2017-11-01

    The increasing demand for organic foods creates, in turn, a pressing need for the development of more accurate tools for the authentication of organic food in order to ensure both fair trade and food safety. This study examines the feasibility of δ13C and δ15N analyses as potential tools for the authentication of environment-friendly rice sold in Korea. δ13C and δ15N examination in different rice grains showed that environment-friendly rice can be successfully distinguished from conventional rice. No multi-residue pesticides were detected in the examined rice samples, including conventional rice. This study demonstrates the complementary feasibility of δ13C and δ15N analyses for the authentication of environment-friendly rice sold in Korea in cases where pesticide residue analysis alone is insufficient for discrimination of organic and conventional rice. In the future, complementary analyses including compound-specific isotope ratio analysis might be employed for improving the reliability of organic authentication. Copyright © 2017 Elsevier Ltd. All rights reserved.
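
    The delta values referred to above are conventionally expressed in per mil relative to international reference standards (VPDB for carbon, atmospheric N2 for nitrogen). A minimal sketch of that arithmetic, with made-up sample ratios:

```python
def delta_per_mil(r_sample, r_standard):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative isotope-amount ratios (13C/12C, 15N/14N); the standards are the
# conventional VPDB and atmospheric-N2 reference ratios.
R_VPDB_13C = 0.0111802
R_AIR_15N  = 0.0036765

print("d13C =", round(delta_per_mil(0.0108700, R_VPDB_13C), 2), "per mil")
print("d15N =", round(delta_per_mil(0.0036950, R_AIR_15N), 2), "per mil")
```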

  16. Sex ratios

    OpenAIRE

    West, Stuart A; Reece, S E; Sheldon, Ben C

    2002-01-01

    Sex ratio theory attempts to explain variation at all levels (species, population, individual, brood) in the proportion of offspring that are male (the sex ratio). In many cases this work has been extremely successful, providing qualitative and even quantitative explanations of sex ratio variation. However, this is not always the situation, and one of the greatest remaining problems is explaining broad taxonomic patterns. Specifically, why do different organisms show so ...

  17. The distinct element analysis for swelling pressure test of bentonite. Discussion on the effects of wall friction force and aspect ratio of specimen

    International Nuclear Information System (INIS)

    Shimizu, Hiroyuki; Kikuchi, Hirohito; Fujita, Tomoo; Tanai, Kenji

    2011-10-01

    For geological isolation systems for radioactive waste, bentonite-based material is assumed to be used as a buffer material. The bentonite-based material is expected to fill the void space around the radioactive waste by swelling. In general, the swelling characteristics and properties of bentonite are evaluated by laboratory tests. However, due to the lack of standardization of testing methods for bentonite, the accuracy and reproducibility of the test results have not been sufficiently demonstrated. In this study, bentonite swelling pressure tests were simulated with a newly developed Distinct Element Method (DEM) code, and the effects of the wall friction force and the aspect ratio of the bentonite specimen were discussed. As a result, the following was found. In the beginning of the swelling pressure test, since swelling occurs only around the fluid injection side of the specimen, the wall friction force acts only in the swelling area and the specimen moves away from the fluid injection side. However, when the entire specimen starts swelling, displacement of the specimen is prevented by the wall friction force and the specimen is pressed against the pressure measurement side. The swelling pressure measured on the pressure measurement side then increases. Such displacement in the specimen is significantly affected by the decrease in mechanical properties and the difference in saturation within the bentonite specimen during fluid infiltration. Moreover, when the aspect ratio of the specimen is large, the displacement of the particles in the specimen becomes large and the area on which the wall friction force acts is also large. Therefore, the measured swelling pressure increases more strongly as the aspect ratio of the specimen increases. To contribute to the standardization of laboratory test methods for bentonite, the effects of wall friction force revealed by the DEM simulations should be verified through laboratory experiments. (author)

  18. Standard test method for the determination of uranium by ignition and the oxygen to uranium (O/U) atomic ratio of nuclear grade uranium dioxide powders and pellets

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 This test method covers the determination of uranium and the oxygen to uranium atomic ratio in nuclear grade uranium dioxide powder and pellets. 1.2 This test method does not include provisions for preventing criticality accidents or requirements for health and safety. Observance of this test method does not relieve the user of the obligation to be aware of and conform to all international, national, or federal, state and local regulations pertaining to possessing, shipping, processing, or using source or special nuclear material. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. 1.4 This test method also is applicable to UO3 and U3O8 powder.
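
    The standard itself specifies the ignition conditions, corrections and acceptance criteria, which are not reproduced here; the sketch below only illustrates the basic gravimetric arithmetic, assuming an impurity-free oxide that ignites to stoichiometric U3O8 (the masses are hypothetical):

```python
M_U, M_O = 238.0289, 15.9994      # g/mol

def uranium_and_o_u(sample_mass_g, ignited_mass_g):
    """Basic gravimetric arithmetic: uranium content and O/U atomic ratio,
    assuming ignition of an impurity-free uranium oxide to stoichiometric U3O8."""
    u_fraction_u3o8 = 3 * M_U / (3 * M_U + 8 * M_O)   # mass fraction of U in U3O8
    u_mass = ignited_mass_g * u_fraction_u3o8
    mol_u = u_mass / M_U
    mol_o = (sample_mass_g - u_mass) / M_O            # oxygen in the original sample
    return u_mass / sample_mass_g, mol_o / mol_u

# Example: 10.0000 g of UO2+x powder ignited to 10.3900 g of U3O8
u_frac, o_u = uranium_and_o_u(sample_mass_g=10.0000, ignited_mass_g=10.3900)
print(f"uranium content = {100 * u_frac:.2f} wt%, O/U = {o_u:.3f}")
```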

  19. Using non-performing loan ratios as default rates in the estimation of credit losses and macroeconomic credit risk stress testing: A case from Turkey

    Directory of Open Access Journals (Sweden)

    Guray Kucukkocaoglu

    2016-02-01

    In this study, inspired by the Credit Portfolio View approach, we develop an econometric credit risk model to estimate credit loss distributions of the Turkish banking system under baseline and stress macro scenarios, substituting non-performing loan (NPL) ratios for default rates. Since customer-number-based historical default rates are not available for the whole Turkish banking system's credit portfolio, we used NPL ratios as the dependent variable instead of default rates, a common practice in countries where historical default rates are not available. Although there are many problems in using NPL ratios as default rates, such as underestimating portfolio losses as a result of non-homogeneous total credit portfolios and the transfer of non-performing loans from banks' balance sheets to asset management companies, our aim is to highlight and limit some of the problems that are often ignored when accounting-based NPL ratios are used as default rates in macroeconomic credit risk modeling. The developed models confirm the strong statistical relationship between the systematic component of credit risk and macroeconomic variables in Turkey. Stress test results are also compatible with past experience.
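
    A minimal sketch of the Credit Portfolio View-style link described above (a logit-transformed NPL ratio regressed on macro variables and then projected under a stress scenario) is given below; all figures are illustrative, not Turkish banking data:

```python
import numpy as np

# Quarterly NPL ratios (as default-rate proxies) and macro drivers: toy numbers.
npl  = np.array([0.031, 0.034, 0.042, 0.055, 0.048, 0.039, 0.036, 0.033])
gdp  = np.array([ 4.1,   2.8,  -1.0,  -4.8,  -0.5,   3.2,   4.5,   5.0])   # % growth
rate = np.array([ 9.0,  10.5,  13.0,  16.5,  14.0,  11.0,  10.0,   9.5])   # policy rate, %

# Credit Portfolio View-style link: logit of the NPL ratio as a linear function
# of the macroeconomic variables.
y = np.log(npl / (1 - npl))
X = np.column_stack([np.ones_like(gdp), gdp, rate])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def npl_under_scenario(gdp_growth, policy_rate):
    logit = beta @ np.array([1.0, gdp_growth, policy_rate])
    return 1 / (1 + np.exp(-logit))

print("baseline scenario NPL: %.3f" % npl_under_scenario(3.0, 10.0))
print("stress scenario NPL:   %.3f" % npl_under_scenario(-5.0, 18.0))

# Expected loss using the scenario NPL ratio as the portfolio default rate
EAD, LGD = 500e9, 0.45
print("stressed expected loss: %.1f bn" % (EAD * LGD * npl_under_scenario(-5.0, 18.0) / 1e9))
```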

  20. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    Science.gov (United States)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Meanwhile, the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method used to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its negative pressure test is discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the overall methodology. The first approach is the comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  1. Tests of variable-band multilayers designed for investigating optimal signal-to-noise vs artifact signal ratios in Dual-Energy Digital Subtraction Angiography (DDSA) imaging systems

    International Nuclear Information System (INIS)

    Boyers, D.; Ho, A.; Li, Q.; Piestrup, M.; Rice, M.; Tatchyn, R.

    1993-08-01

    In recent work, various design techniques were applied to investigate the feasibility of controlling the bandwidth and bandshape profiles of tungsten/boron carbide (W/B4C) and tungsten/silicon (W/Si) multilayers for optimizing their performance in synchrotron radiation based angiographical imaging systems at 33 keV. Varied parameters included alternative spacing geometries, material thickness ratios, and numbers of layer pairs. Planar optics with nominal design reflectivities of 30%-94% and bandwidths ranging from 0.6%-10% were designed at the Stanford Radiation Laboratory, fabricated by the Ovonic Synthetic Materials Company, and characterized on Beam Line 4-3 at the Stanford Synchrotron Radiation Laboratory. In this paper we report selected results of these tests and review the possible use of the multilayers for determining optimal signal-to-noise vs. artifact signal ratios in practical Dual-Energy Digital Subtraction Angiography systems.

  2. Safety and reliability of pressure components with special emphasis on the contribution of component and large specimen testing to structural integrity assessment methodology. Vol. 1 and 2

    International Nuclear Information System (INIS)

    1987-01-01

    The 51 papers of the 13th MPA Seminar contribute to structural integrity assessment methodology with special emphasis on component and large specimen testing. Eight of the papers deal with fracture mechanics, 6 papers with dynamic loading, 13 papers with nondestructive testing, 2 papers with radiation embrittlement, 5 papers with pipe failure, 4 papers with components, 2 papers with thermal shock loading, 5 papers with high temperature behaviour, 4 papers with the integrity of vessels and 3 papers with the integrity of welded joints. The fracture behaviour of steel materials is also verified. All papers are separately indexed and analysed for the database. (DG)

  3. A proposed hardness assurance test methodology for bipolar linear circuits and devices in a space ionizing radiation environment

    International Nuclear Information System (INIS)

    Pease, R.L.; Brown, D.B.; Cohn, L.

    1997-01-01

    A hardness assurance test approach has been developed for bipolar linear circuits and devices in space. It consists of a screen for dose rate sensitivity and a characterization test method to develop the conditions for a lot acceptance test at high dose rate

  4. Recent advances in ratio primary reference measurement procedures (definitive methods) and their use in certification of reference materials and controlling assigned values in proficiency testing

    International Nuclear Information System (INIS)

    Dybczynski, R.S.; Polkowska-Motrenko, H.; Chajduk, E.; Danko, B.; Pyszynska, M.

    2014-01-01

    Three very accurate (definitive) methods based on RNAA for the determination of Se, As and Fe, respectively, which were recently developed in our laboratory, are reviewed, and their use in the certification of reference materials and in checking the assigned values in proficiency tests is demonstrated with several examples. According to VIM 3 nomenclature these methods may be called ratio primary reference measurement procedures (RPRMPs). RPRMPs, with their expanded uncertainties of 2.7-3.6 %, are comparable to ID-MS methods and are the only methods of such high metrological quality that can be used for the determination of trace amounts of monoisotopic elements. (author)

  5. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    OpenAIRE

    Daniel BRÎNDESCU – OLARIU

    2016-01-01

    The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663...

  6. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, the bearing torque (which is induced by the hydrodynamic force), and other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shape and disc aspect ratio, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper.
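
    The EPRI coefficient sets themselves are not given in the abstract, but the way nondimensional torque coefficients would typically be applied can be sketched as follows; the coefficient table, friction factor, functional forms and dimensions are illustrative assumptions, not the EPRI correlation:

```python
import numpy as np

def dynamic_torque(disc_angle_deg, dP_Pa, D_m, d_shaft_m, mu_bearing,
                   Ct_table, packing_torque_Nm=50.0):
    """Total dynamic torque = hydrodynamic + bearing + packing.

    Hydrodynamic torque is taken as Ct(angle) * dP * D^3, and bearing torque as
    mu * F_hydrodynamic * r_shaft with F ~ dP * (pi/4) * D^2 (assumed forms)."""
    Ct = np.interp(disc_angle_deg, Ct_table[0], Ct_table[1])
    T_hydro = Ct * dP_Pa * D_m ** 3
    F_hydro = dP_Pa * np.pi / 4 * D_m ** 2
    T_bearing = mu_bearing * F_hydro * d_shaft_m / 2
    return T_hydro + T_bearing + packing_torque_Nm

# Hypothetical nondimensional torque coefficients vs disc angle (0 = closed)
Ct_table = (np.array([0, 20, 40, 60, 80, 90]),
            np.array([0.00, 0.05, 0.12, 0.20, 0.08, 0.00]))

for angle in (20, 40, 60, 80):
    T = dynamic_torque(angle, dP_Pa=3.0e5, D_m=0.6, d_shaft_m=0.08,
                       mu_bearing=0.15, Ct_table=Ct_table)
    print(f"{angle:2d} deg: required torque ~ {T / 1000:.1f} kN*m")
```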

  7. Use and Application of the SADRWMS Methodology and SAFRAN Tool on the Thailand Institute of Nuclear Technology (TINT) Radioactive Waste Management Facility. Test Case Results. 05 October 2011

    International Nuclear Information System (INIS)

    2015-01-01

    The purpose of this document is to describe the working procedure of the test case and to provide feedback on the application of the methodology described in DS284 and the SAFRAN tool. This report documents how the test case was performed, describes how the methodology and software tool were applied, and provides feedback on the use and application of the SAFRAN Tool. The aim of this document is to address the key elements of the safety assessment and to demonstrate their principal contents and roles within the overall context of the safety case. This is done with particular emphasis on investigating the role of the SAFRAN Tool in developing a safety case for facilities similar to the TINT Facility. It is intended that this report will be the first of a series of complementary safety reports illustrating the use and application of the methodology prescribed in DS284 and the application of the SAFRAN tool to a range of predisposal radioactive waste management activities

  8. Computing power and sample size for case-control association studies with copy number polymorphism: application of mixture-based likelihood ratio test.

    Directory of Open Access Journals (Sweden)

    Wonkuk Kim

    Recent studies suggest that copy number polymorphisms (CNPs) may play an important role in disease susceptibility and onset. Currently, the detection of CNPs mainly depends on microarray technology. For case-control studies, subjects are conventionally assigned to a specific CNP category based on the continuous quantitative measure produced by microarray experiments, and cases and controls are then compared using a chi-square test of independence. The purpose of this work is to specify the likelihood ratio test statistic (LRTS) for a case-control sampling design based on the underlying continuous quantitative measurement, and to assess its power and relative efficiency (as compared to the chi-square test of independence on CNP counts). The sample size and power formulas of both methods are given. For the latter, the CNPs are classified using the Bayesian classification rule. The LRTS is more powerful than this chi-square test for the alternatives considered, especially alternatives in which the at-risk CNP categories have low frequencies. An example of the application of the LRTS is given for a comparison of CNP distributions in individuals of Caucasian or Taiwanese ethnicity, where the LRTS appears to be more powerful than the chi-square test, possibly due to misclassification of the most common CNP category into a less common category.

  9. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
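
    The core of a sequential probability ratio test is independent of the filter bank that supplies the likelihoods: the cumulative log-likelihood ratio is compared against Wald thresholds derived from the chosen false alarm and missed detection risks. A minimal sketch, with assumed per-epoch likelihood-ratio increments standing in for the filters' outputs:

```python
import math

def sprt_thresholds(alpha, beta):
    """Wald thresholds on the cumulative log-likelihood ratio."""
    upper = math.log((1 - beta) / alpha)      # accept H1 (e.g. "maneuver needed")
    lower = math.log(beta / (1 - alpha))      # accept H0
    return lower, upper

def sprt(log_likelihood_ratios, alpha=0.001, beta=0.01):
    """Run a sequential probability ratio test over a stream of per-epoch
    log-likelihood ratios, log p(data | H1) - log p(data | H0)."""
    lower, upper = sprt_thresholds(alpha, beta)
    llr = 0.0
    for k, inc in enumerate(log_likelihood_ratios, start=1):
        llr += inc
        if llr >= upper:
            return "accept H1", k
        if llr <= lower:
            return "accept H0", k
    return "continue tracking", len(log_likelihood_ratios)

# Illustrative per-epoch increments, e.g. from two filters' measurement likelihoods
increments = [0.8, 1.1, 0.5, 1.4, 0.9, 1.6, 1.2]
print(sprt(increments, alpha=0.001, beta=0.01))
```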

  10. From neural oscillations to reasoning ability: Simulating the effect of the theta-to-gamma cycle length ratio on individual scores in a figural analogy test.

    Science.gov (United States)

    Chuderski, Adam; Andrelczyk, Krzysztof

    2015-02-01

    Several existing computational models of working memory (WM) have predicted a positive relationship (later confirmed empirically) between WM capacity and the individual ratio of theta to gamma oscillatory band lengths. These models assume that each gamma cycle represents one WM object (e.g., a binding of its features), whereas the theta cycle integrates such objects into the maintained list. As WM capacity strongly predicts reasoning, it might be expected that this ratio also predicts performance in reasoning tasks. However, no computational model has yet explained how the differences in the theta-to-gamma ratio found among adult individuals might contribute to their scores on a reasoning test. Here, we propose a novel model of how WM capacity constrains figural analogical reasoning, aimed at explaining inter-individual differences in reasoning scores in terms of the characteristics of oscillatory patterns in the brain. In the model, the gamma cycle encodes the bindings between objects/features and the roles they play in the relations processed. Asynchrony between consecutive gamma cycles results from lateral inhibition between oscillating bindings. Computer simulations showed that achieving the highest WM capacity required reaching the optimal level of inhibition. When too strong, this inhibition eliminated some bindings from WM, whereas, when inhibition was too weak, the bindings became unstable and fell apart or became improperly grouped. The model aptly replicated several empirical effects and the distribution of individual scores, as well as the patterns of correlations found in the sample of 100 people attempting the same reasoning task. Most importantly, the model's reasoning performance strongly depended on its theta-to-gamma ratio in the same way as the performance of human participants depended on their WM capacity. The data suggest that proper regulation of oscillations in the theta and gamma bands may be crucial for both high WM capacity and effective complex

  11. Development of a quality management system for borehole investigations. (1) Quality assurance and quality control methodology for hydraulic packer testing

    International Nuclear Information System (INIS)

    Takeuchi, Shinji; Kunimaru, Takanori; Ota, Kunio; Frieg, Bernd

    2011-01-01

    A quality assurance and quality control (QA/QC) system for hydraulic packer tests has been established based on the surface-based investigations at JAEA's underground research laboratories in Mizunami and Horonobe. The established QA/QC system covers field investigations (data acquisition) and data analysis. For the field investigations, the adopted procedure is selection of a test section based on detailed fluid logging and checking against a tally list, followed by inspection of test tools such as pressure transducers and shut-in valves, selection of the test method using a 'sequential hydraulic test' to decide the appropriate method, and finally data quality confirmation from pressure changes and derivatives on log-log plots during testing. Test event logs should also be recorded during testing for traceability. For the test data analysis, a quick analysis for rough estimation of hydraulic parameters and a detailed analysis using type-curve and/or numerical analyses are conducted stepwise. The established QA/QC system has been applied to recent borehole investigations and its efficiency has been confirmed. (author)
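
    The log-log diagnostic mentioned above relies on the pressure derivative taken with respect to log time. A minimal sketch of that computation on synthetic drawdown data is given below (a simple central-difference version, offered as an illustration rather than JAEA's procedure):

```python
import numpy as np

def log_derivative(t, dp):
    """Pressure derivative dP/d(ln t) by central differences in log time,
    as plotted on diagnostic log-log plots of well/packer tests."""
    lnt = np.log(t)
    return np.gradient(dp, lnt)

# Synthetic drawdown resembling infinite-acting radial flow: dp ~ a*ln(t) + b
t = np.logspace(-1, 3, 60)                # elapsed time, s
dp = 12.0 * np.log(t) + 35.0 + np.random.default_rng(1).normal(0, 0.3, t.size)

deriv = log_derivative(t, dp)
# For radial flow the derivative plateaus near the coefficient of ln(t) (about 12 here)
print("median late-time derivative:", round(float(np.median(deriv[-20:])), 1))
```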

  12. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

    Ore microscopy has traditionally been an important support for controlling ore processing, but the volume of present-day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis, DIA, is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research 3CCD video camera on a reflected-light microscope. These results were obtained by systematic measurement of selected ores accounting for most of the industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary criteria (e.g. metallogenetic). The sequential methodology is described and illustrated through practical examples. (Author)

  13. Assessment of the Speech Intelligibility Performance of Post Lingual Cochlear Implant Users at Different Signal-to-Noise Ratios Using the Turkish Matrix Test

    Directory of Open Access Journals (Sweden)

    Zahra Polat

    2016-10-01

    Background: Spoken word recognition and speech perception tests in quiet are routinely used to assess the benefit that child and adult cochlear implant users receive from their devices. Cochlear implant users generally demonstrate high-level performance on these test materials, as they are able to achieve high levels of speech perception in quiet situations. Although these test materials provide valuable information regarding Cochlear Implant (CI) users' performance in optimal listening conditions, they do not give realistic information regarding performance in adverse listening conditions, which is the case in the everyday environment. Aims: The aim of this study was to assess the speech intelligibility performance of postlingual CI users in the presence of noise at different signal-to-noise ratios with the Matrix Test developed for the Turkish language. Study Design: Cross-sectional study. Methods: Thirty postlingual adult implant users, who had been using their implants for a minimum of one year, were evaluated with the Turkish Matrix Test. Subjects' speech intelligibility was measured using the adaptive and non-adaptive Matrix Test in quiet and noisy environments. Results: The results of the study show a correlation between the Pure Tone Average (PTA) values of the subjects and the Matrix test Speech Reception Threshold (SRT) values in quiet. Hence, it is possible to assess the PTA values of CI users using the Matrix Test also. However, no correlations were found between Matrix SRT values in quiet and Matrix SRT values in noise. Similarly, the correlation between PTA values and intelligibility scores in noise was also not significant. Therefore, it may not be possible to assess the intelligibility performance of CI users using test batteries performed in quiet conditions. Conclusion: The Matrix Test can be used to assess the benefit CI users obtain from their systems in everyday life, since it is possible to perform

  14. Testing new methodologies and assessing their potential for reservoir characterisation: Geoelectrical studies in the Northwest Carboniferous Basin (Ireland).

    Science.gov (United States)

    Ogaya, Xènia; Campanyà, Joan; Rath, Volker; Jones, Alan G.; Reay, Derek; Raine, Rob; McConnell, Brian; Ledo, Juanjo

    2016-04-01

    The overarching objective of this study is to improve our methods of characterising saline aquifers by integrating newly acquired electromagnetic data with existing geophysical and geological data. The work presented here is part of an ongoing project to evaluate Ireland's potential for onshore carbon sequestration (IRECCSEM; funded by Science Foundation Ireland). The methodology presented in this characterisation work is not only relevant for studying the potential for onshore carbon sequestration, but is generally applicable for aquifer characterisation, particularly for the evaluation of geothermal resources in appropriate geological settings. We present the first results of the three-dimensional (3D) modelling and inversion of the magnetotelluric (MT) data acquired in the Northwest Carboniferous Basin (Ireland) in summer 2015. The electrical resistivity distribution beneath the survey area is constrained using a joint inversion of three different types of electromagnetic data: MT impedance tensor responses (Z), geomagnetic transfer functions (GTF) and inter-station horizontal magnetic transfer functions (HMT). The preliminary 3D resistivity model obtained reveals the geoelectrical structure of the subsurface, which is translated into parameters relevant to fluid flow. The electromagnetic data were acquired along profiles linking four wells drilled in the area, and the available well log data from those wells are used to evaluate some of the existing petrophysical relationships and calibrate them for the study area. This allows us to interpolate the rock physical properties from one well to another, using the computed geoelectrical model as a reference. The obtained results are compared to available independent geological and geophysical data in order to analyse the validity of this technique, to characterise the uncertainties inherent to our approach, and to assess the potential of this methodology for reservoir characterisation.

  15. Budget impact analysis of sFlt-1/PlGF ratio as prediction test in Italian women with suspected preeclampsia.

    Science.gov (United States)

    Frusca, Tiziana; Gervasi, Maria-Teresa; Paolini, Davide; Dionisi, Matteo; Ferre, Francesca; Cetin, Irene

    2017-09-01

    Preeclampsia (PE) is a pregnancy disease which represents a leading cause of maternal and perinatal mortality and morbidity. Accurate prediction of PE risk could provide an increase in health benefits and better patient management. The aim was to estimate the economic impact of introducing the Elecsys sFlt-1/PlGF ratio test, in addition to standard practice, for the prediction of PE in women with suspected PE in the Italian National Health Service (INHS). A decision tree model was developed to simulate the progression of a cohort of pregnant women from the first presentation of clinical suspicion of PE in the second and third trimesters until delivery. The model provides an estimate of the financial impact of introducing sFlt-1/PlGF versus standard practice. Clinical inputs were derived from the PROGNOSIS study and from a literature review, and validated by national clinical experts. Resources and unit costs were obtained from Italy-specific sources. Healthcare costs associated with the management of a pregnant woman with clinical suspicion of PE equal €2384 when following standard practice versus €1714 using the sFlt-1/PlGF ratio test. Introduction of sFlt-1/PlGF into hospital practice is cost-saving. Savings are generated primarily through improvement in diagnostic accuracy and a reduction in unnecessary hospitalization of women before the onset of PE.
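
    The decision-tree arithmetic behind such a budget impact estimate reduces to probability-weighted management costs per woman in each arm. The sketch below uses illustrative figures, not the PROGNOSIS-derived Italian inputs:

```python
def expected_cost(p_hospitalised, cost_hospitalisation, cost_outpatient, cost_test=0.0):
    """Expected management cost per woman with suspected preeclampsia."""
    return (cost_test
            + p_hospitalised * cost_hospitalisation
            + (1 - p_hospitalised) * cost_outpatient)

# Illustrative inputs in euros (hypothetical, not the study's parameters)
standard  = expected_cost(p_hospitalised=0.45, cost_hospitalisation=4200, cost_outpatient=600)
with_test = expected_cost(p_hospitalised=0.25, cost_hospitalisation=4200, cost_outpatient=600,
                          cost_test=70)

print(f"standard practice: {standard:7.0f} EUR per woman")
print(f"with sFlt-1/PlGF:  {with_test:7.0f} EUR per woman")
print(f"saving per woman:  {standard - with_test:7.0f} EUR")
```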

  16. F-15 inlet/engine test techniques and distortion methodologies studies. Volume 2: Time variant data quality analysis plots

    Science.gov (United States)

    Stevens, C. H.; Spong, E. D.; Hammock, M. S.

    1978-01-01

    Time variant data quality analysis plots were used to determine if peak distortion data taken from a subscale inlet model can be used to predict peak distortion levels for a full scale flight test vehicle.

  17. First measurements of (236)U concentrations and (236)U/(239)Pu isotopic ratios in a Southern Hemisphere soil far from nuclear test or reactor sites.

    Science.gov (United States)

    Srncik, M; Tims, S G; De Cesare, M; Fifield, L K

    2014-06-01

    The variation of the (236)U and (239)Pu concentrations as a function of depth has been studied in a soil profile at a site in the Southern Hemisphere well removed from nuclear weapon test sites. Total inventories of (236)U and (239)Pu as well as the (236)U/(239)Pu isotopic ratio were derived. For this investigation a soil core from an undisturbed forest area in the Herbert River catchment (17°30' - 19°S) which is located in north-eastern Queensland (Australia) was chosen. The chemical separation of U and Pu was carried out with a double column which has the advantage of the extraction of both elements from a relatively large soil sample (∼20 g) within a day. The samples were measured by Accelerator Mass Spectrometry using the 14UD pelletron accelerator at the Australian National University. The highest atom concentrations of both (236)U and (239)Pu were found at a depth of 2-3 cm. The (236)U/(239)Pu isotopic ratio in fallout at this site, as deduced from the ratio of the (236)U and (239)Pu inventories, is 0.085 ± 0.003 which is clearly lower than the Northern Hemisphere value of ∼0.2. The (236)U inventory of (8.4 ± 0.3) × 10(11) at/m(2) was more than an order of magnitude lower than values reported for the Northern Hemisphere. The (239)Pu activity concentrations are in excellent agreement with a previous study and the (239+240)Pu inventory was (13.85 ± 0.29) Bq/m(2). The weighted mean (240)Pu/(239)Pu isotopic ratio of 0.142 ± 0.005 is slightly lower than the value for global fallout, but our results are consistent with the average ratio of 0.173 ± 0.027 for the southern equatorial region (0-30°S). Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires.

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found % attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  19. Methodological approaches to conducting pilot and proof tests on reverse-osmosis systems: Results of comparative studies

    Science.gov (United States)

    Panteleev, A. A.; Bobinkin, V. V.; Larionov, S. Yu.; Ryabchikov, B. E.; Smirnov, V. B.; Shapovalov, D. A.

    2017-10-01

    When designing large-scale water-treatment plants based on reverse-osmosis systems, it is proposed to conduct experimental-industrial or pilot tests for validated simulation of the operation of the equipment. It is shown that such tests allow establishing efficient operating conditions and characteristics of the plant under design. It is proposed to conduct pilot tests of reverse-osmosis systems on pilot membrane plants (PMPs) and test membrane plants (TMPs). The results of a comparative experimental study of pilot and test membrane plants are exemplified by simulating the operating parameters of the membrane elements of an industrial plant. It is concluded that the reliability of the data obtained on the TMP may not be sufficient to design industrial water-treatment plants, while the PMPs are capable of providing reliable data that can be used for full-scale simulation of the operation of industrial reverse-osmosis systems. The test membrane plants allow simulation of the operating conditions of individual industrial plant systems; therefore, potential areas of their application are shown. A method for numerical calculation and experimental determination of the true selectivity and the salt passage is proposed. An expression has been derived that describes the functional dependence between the observed and true salt passage. The results of the experiments conducted on a test membrane plant to determine the true value of the salt passage of a reverse-osmosis membrane are exemplified for a magnesium sulfate solution at different initial operating parameters.
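
    The paper's derived expression is not reproduced in the abstract; as an assumption, the sketch below uses the standard concentration-polarization film model to relate observed and true salt passage (salt passage = 1 - rejection):

```python
import math

def true_salt_passage(sp_observed, flux_m_per_s, mass_transfer_m_per_s):
    """Convert observed salt passage to true (membrane-wall) salt passage using
    the textbook film model for concentration polarization:
        SP_obs / (1 - SP_obs) = SP_true / (1 - SP_true) * exp(Jv / k)
    This standard relation is offered as an assumption; it is not necessarily
    the expression derived in the paper."""
    ratio_obs = sp_observed / (1.0 - sp_observed)
    ratio_true = ratio_obs * math.exp(-flux_m_per_s / mass_transfer_m_per_s)
    return ratio_true / (1.0 + ratio_true)

# Example: 2% observed salt passage for MgSO4 at 20 L/(m^2 h) permeate flux
Jv = 20 / 1000 / 3600            # permeate flux, m/s
k = 2.0e-5                       # assumed mass-transfer coefficient, m/s
print("true salt passage: %.4f" % true_salt_passage(0.02, Jv, k))
```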

  1. Metodologia alternativa para condução do teste de envelhecimento acelerado em sementes de milho; Alternative methodology for the accelerated aging test for corn seeds

    Directory of Open Access Journals (Sweden)

    Sonia Regina Mudrovitsch de Bittencourt

    2012-08-01

    Vigor tests are routinely used by seed companies in internal quality control programs. This requires efficient methods that provide quick answers for decisions related to the handling, discarding and marketing of seed lots. This research aimed to verify whether the time needed to run the accelerated aging (AA) test on corn seeds could be reduced by using the tetrazolium test (TZ, viability and vigor) instead of the germination test (GT) to evaluate seed performance after aging, using 10 seed lots from seven corn genotypes, with and without fungicide treatment. The data obtained with the proposed methodology (AA+TZ) were compared with the values determined by the accelerated aging test performed with the traditional methodology (AA+GT). The use of the tetrazolium (vigor) test in association with the accelerated aging test provided information similar to that given by the germination test used for the same purpose, reducing from eight to three days the time required to obtain results for corn seeds.

  2. Functional Case Test Design to Optimize the Software Development in Italian Tax Processes (Part I: Methodology Definition

    Directory of Open Access Journals (Sweden)

    Rolli Fernando

    2017-01-01

    Full Text Available In Europe's general context of economic integration, the National States have preserved only a few competences. Among them, the most important is taxation management, which has become an important lever to stabilize the State's budget and to meet the economic parameters set by the European Agreements. From this perspective, it is crucial to identify a software development mode that reduces the time spent on implementing or adjusting tax payment software applications and, at the same time, minimizes the overall risk level. In software development, approximately 40% of the time is spent on a series of testing activities, and this stage of the development process is mostly placed at the end of the implementation activities. Consequently, since many testing activities have to be waived in order to meet the deadlines for software delivery, applications that are not entirely compliant with the user's needs and/or that entail non-conformities are more likely to be introduced. This paper focuses on improving the testing process in tax procedures. The proposed method aims to improve the process by introducing an integrated procedure based on Axiomatic Design. The approach developed will facilitate a reduction both in the test preparation time and in the time needed to perform the test cases. In this scenario, it will be possible to optimize the data compilation process, to verify compliance with the technical specifications provided by the Italian Revenue Agency, to identify possible critical scenarios with a proactive approach, and to avoid classes of non-conformities.

  3. FY17 Status Report on Testing Supporting the Inclusion of Grade 91 Steel as an Acceptable Material for Application of the EPP Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Messner, Mark C. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, Sam [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Yanli [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    This report summarizes the experiments performed in FY17 on Gr. 91 steels. The testing of Gr. 91 has technical significance because it is currently the only approved material for Class A construction that is strongly cyclic softening. Specific FY17 testing includes the following activities for Gr. 91 steel. First, two types of key feature testing have been initiated, including two-bar thermal ratcheting and Simplified Model Testing (SMT). The goal is to qualify the Elastic-Perfectly Plastic (EPP) design methodologies and to support incorporation of these rules for Gr. 91 into the ASME Division 5 Code. The preliminary SMT results show that, under the SMT creep-fatigue testing conditions, the compression hold mode is the most damaging for Gr. 91. Two-bar thermal ratcheting test results over a temperature range from 350 to 650 °C were compared with the EPP strain limits code case evaluation, and the results show that the EPP strain limits code case is conservative. The material information obtained from these key feature tests can also be used to validate the material model. Second, to provide experimental data in support of the viscoplastic material model development at Argonne National Laboratory, selected tests were performed to evaluate the effect of cyclic softening on strain rate sensitivity and creep rates. The results show that prior cyclic loading history decreases the strain rate sensitivity and increases the creep rates. In addition, isothermal cyclic stress-strain curves were generated at six different temperatures, and nonisothermal thermomechanical testing was also performed to provide data to calibrate the viscoplastic material model.

  4. Methodological and Theoretical Issues in the Adaptation of Sign Language Tests: An Example from the Adaptation of a Test to German Sign Language

    Science.gov (United States)

    Haug, Tobias

    2012-01-01

    Despite the current need for reliable and valid test instruments in different countries in order to monitor the sign language acquisition of deaf children, very few tests are commercially available that offer strong evidence for their psychometric properties. This mirrors the current state of affairs for many sign languages, where very little…

  5. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU – OLARIU

    2016-07-01

    Full Text Available The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663) and the general accuracy ensured by the ratio (62.6% out-of-sample accuracy). The results confirm the practical utility of the profitability ratio in the prediction of bankruptcy and thus validate the need for further research focused on developing a methodology of analysis.
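
    As an illustration of the two evaluation metrics named in the record (Area Under the ROC Curve and out-of-sample accuracy of a single-ratio classifier), the following sketch applies them to synthetic data; the generated sample and the cut-off rule are assumptions and do not reproduce the study's Romanian data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Synthetic paired sample: non-bankrupt vs bankrupt companies, one ratio each.
rng = np.random.default_rng(0)
n = 1176
profitability = np.r_[rng.normal(0.05, 0.10, n), rng.normal(-0.02, 0.10, n)]
bankrupt      = np.r_[np.zeros(n, dtype=int), np.ones(n, dtype=int)]

# Lower profitability should mean higher bankruptcy risk, so score = -ratio.
auc = roc_auc_score(bankrupt, -profitability)

# Pick a cut-off on one half of the sample, measure accuracy on the other half.
train, test = slice(0, None, 2), slice(1, None, 2)
fpr, tpr, thr = roc_curve(bankrupt[train], -profitability[train])
cutoff = thr[np.argmax(tpr - fpr)]                 # Youden's J as an example rule
pred = (-profitability[test] >= cutoff).astype(int)
accuracy = (pred == bankrupt[test]).mean()
print(f"AUC = {auc:.3f}, out-of-sample accuracy = {accuracy:.1%}")
```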

  6. Organization and methodology applied to the control of commissioning tests to guarantee safe operation of nuclear units

    International Nuclear Information System (INIS)

    Clausner, J.P.; Jorel, M.

    1990-12-01

    This paper describes the activities of the Safety Analysis Department (DAS), which provides technical support for the French safety authorities in the specific context of analysis and control of startup test programme quality at each of the different stages of the programme. These activities combine to ensure that the objective of the startup tests is reached, in particular that the functions of each safety-related system are guaranteed in all operating configurations, that the performance levels of all components in the system comply with design criteria and that defects revealed during previous tests have been dealt with correctly. The special case of French nuclear facilities, linked to unit standardization, has made it possible to acquire a large amount of experience with the startup of the 900 MWe units and has illustrated the importance of defining a startup test programme. In 1981, a working group, comprising operating organization and safety authority representatives, studied the lessons which had to be learned from 900 MWe unit startup and the improvements which could be made and taken into account in the 1300 MWe unit startup programme. To illustrate the approach adopted by the DAS, we go on to describe the lessons learned from startup of the first 1300 MWe (P4) units

  7. Evaluating the Effects of Restraint Systems on 4WD Testing Methodologies: A Collaborative Effort between the NVFEL and ANL

    Science.gov (United States)

    Testing vehicles for emissions and fuel economy has traditionally been conducted with a single-axle chassis dynamometer. The 2006 SAE All Wheel Drive Symposium cited four wheel drive (4WD) and all wheel drive (AWD) sales as climbing from 20% toward 30% of a motor vehicle mar...

  8. Employing think-aloud protocols and constructive interaction to test the usability of online library catalogues: A methodological comparison

    NARCIS (Netherlands)

    Van Den Haak, M. J.; De Jong, M. D T; Schellens, P. J.

    2004-01-01

    This paper describes a comparative study of three usability test approaches: concurrent think-aloud protocols, retrospective think-aloud protocols, and constructive interaction. These three methods were compared by means of an evaluation of an online library catalogue, which involved four points of

  9. Employing think-aloud protocols and constructive interaction to test the usability of online library catalogues: a methodological comparison.

    NARCIS (Netherlands)

    van den Haak, M.J.; de Jong, Menno D.T.; Schellens, P.J.

    2004-01-01

    This paper describes a comparative study of three usability test approaches: concurrent think-aloud protocols, retrospective think-aloud protocols, and constructive interaction. These three methods were compared by means of an evaluation of an online library catalogue, which involved four points of

  10. Recent advances in ratio primary reference measurement procedures (definitive methods) and their use in certification of reference materials and controlling assigned values in proficiency testing

    International Nuclear Information System (INIS)

    Dybczyński, R.S.; Polkowska-Motrenko, H.; Chajduk, E.; Danko, B.; Pyszynska, M.

    2014-01-01

    The idea of definitive methods based on radiochemical neutron activation analysis (RNAA) consists in combining neutron activation with highly selective and quantitative post-irradiation isolation of the desired radionuclide by column chromatography, followed by γ-ray spectrometric measurement. The principles of construction of such methods, which were devised at the Institute of Nuclear Chemistry and Technology, are recalled and the significance of these methods for analytical quality assurance is emphasized. According to the VIM 3 nomenclature these methods may be called ratio primary reference measurement procedures (RPRMPs). An RPRMP for the determination of Se is briefly presented and its use for checking the accuracy of 'assigned values' established by expert laboratories in some proficiency tests is demonstrated.

  11. Radiation resistance of elastomeric O-rings in mixed neutron and gamma fields: Testing methodology and experimental results

    Science.gov (United States)

    Zenoni, A.; Bignotti, F.; Donzella, A.; Donzella, G.; Ferrari, M.; Pandini, S.; Andrighetto, A.; Ballan, M.; Corradetti, S.; Manzolaro, M.; Monetti, A.; Rossignoli, M.; Scarpa, D.; Alloni, D.; Prata, M.; Salvini, A.; Zelaschi, F.

    2017-11-01

    Materials and components employed in the presence of intense neutron and gamma fields are expected to absorb high dose levels that may induce deep modifications of their physical and mechanical properties, possibly causing loss of their function. A protocol for irradiating elastomeric materials in reactor mixed neutron and gamma fields and for testing the evolution of their main mechanical and physical properties with absorbed dose has been developed. Four elastomeric compounds used for vacuum O-rings, one fluoroelastomer polymer (FPM) based and three ethylene propylene diene monomer rubber (EPDM) based, presently available on the market have been selected for the test. One EPDM is rated as radiation resistant in gamma fields, while the other elastomers are general purpose products. Particular care has been devoted to dosimetry calculations, since absorbed dose in neutron fields, unlike pure gamma fields, is strongly dependent on the material composition and, in particular, on the hydrogen content. The products have been tested up to about 2 MGy absorbed dose. The FPM based elastomer, in spite of its lower dose absorption in fast neutron fields, features the largest variations of properties, with a dramatic increase in stiffness and brittleness. Out of the three EPDM based compounds, one shows large and rapid changes in the main mechanical properties, whereas the other two feature more stable behaviors. The performance of the EPDM rated as radiation resistant in pure gamma fields does not appear significantly better than that of the standard product. The predictive capability of the accelerated irradiation tests performed as well as the applicable concepts of threshold of radiation damage is discussed in view of the use of the examined products in the selective production of exotic species facility, now under construction at the Legnaro National Laboratories of the Italian Istituto Nazionale di Fisica Nucleare. It results that a careful account of dose rate effects

  12. THE METHODOLOGY OF TESTING THE CAUSALITY BETWEEN THE ROMANIAN MUTUAL FUNDS MARKET AND THE ECONOMY’S DYNAMICS

    Directory of Open Access Journals (Sweden)

    Ioana RADU

    2013-06-01

    Full Text Available The paper tests and evaluates the causality between the dynamics of the Romanian mutual fund market and the economy. Using the Granger causality test, a regression analysis was developed on quarterly data from 2004Q3 to 2012Q2 for the Romanian economy. Based on this relationship, we can emphasize that the controversial debate on economic growth and the mutual fund market has become a complex research subject. Therefore, given its complexity, topicality and the continuous growth of the investment fund sector, this paper complements the existing literature by identifying the causal linkage between the mutual fund market and the economy. The paper is organized as follows. The first part presents the main premises that motivated our research. The second part presents a brief literature review and extracts the studies that best capture the relationship between the analyzed variables. The next section defines the potential correlation between the analyzed variables. Section 4 then tests the causality using R. The last part concludes.
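
    The record describes a Granger-causality analysis carried out in R; purely as an illustration of the test itself, the sketch below runs the same kind of test in Python on synthetic quarterly series. The series names and generated data are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic quarterly series covering 2004Q3-2012Q2 (32 quarters).
idx = pd.period_range("2004Q3", "2012Q2", freq="Q")
rng = np.random.default_rng(1)
gdp_growth  = rng.normal(1.0, 0.8, len(idx))
fund_assets = 0.5 * np.roll(gdp_growth, 1) + rng.normal(0.0, 0.5, len(idx))

df = pd.DataFrame({"fund_assets": fund_assets, "gdp_growth": gdp_growth}, index=idx)

# Null hypothesis: the series in the SECOND column does not Granger-cause the
# series in the FIRST column. Here: does GDP growth Granger-cause the fund market?
grangercausalitytests(df[["fund_assets", "gdp_growth"]].values, maxlag=2)
```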

  13. Development of a Methodology for Conducting Hall Thruster EMI Tests in Metal Vacuum Chambers of Arbitrary Shape and Size

    Science.gov (United States)

    Gallimore, Alec D.

    2000-01-01

    While the closed-drift Hall thruster (CDT) offers significant improvement in performance over conventional chemical rockets and other advanced propulsion systems such as the arcjet, its potential impact on spacecraft communication signals must be carefully assessed before widespread use of this device can take place. To this end, many of the potentially unique issues associated with these thrusters center on the plume plasma characteristics and their interaction with electromagnetic waves. Although a great deal of experimental work has been done to characterize the electromagnetic interference (EMI) potential of these thrusters, interpretation of the resulting data is difficult because most of these measurements have been made in vacuum chambers with metal walls that reflect radio waves emanating from the thruster. This project developed a means of assessing the impact of metal vacuum chambers of arbitrary size or shape on EMI experiments, thereby allowing test results to be interpreted properly. Chamber calibration techniques were developed and initially tested at RIAME using their vacuum chamber. Calibration experiments were to have been made at Tank 5 of NASA GRC and the 6 m by 9 m vacuum chamber at the University of Michigan to test the new procedure; however, the subcontract to RIAME was cancelled by NASA memorandum on Feb. 26, 1999.

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  15. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  16. Production controls (PC) and technical verification testing (TVT). A methodology for the control and tracking of LILW waste package conditioning

    International Nuclear Information System (INIS)

    Leon, A.M.; Nieto, J.L.L.; Garrido, J.G.

    2003-01-01

    As part of its low and intermediate level radioactive waste (LILW) characterisation and acceptance activities, ENRESA has set up a quality control programme that covers the different phases of radioactive waste package production and implies different levels of tracking in generation, assessment of activity and control of the documentation associated therewith. Furthermore, ENRESA has made available the mechanisms required for verification, depending on the results of periodic sampling, of the quality of the end product delivered by the waste producers. Both processes are included within the framework of two programmes of complementary activities: production controls (PC) and technical verification testing (TVT). (orig.)

  17. Design, Fabrication, and Performance Test of a 100-W Helical-Blade Vertical-Axis Wind Turbine at Low Tip-Speed Ratio

    Directory of Open Access Journals (Sweden)

    Dowon Han

    2018-06-01

    Full Text Available A 100-W helical-blade vertical-axis wind turbine was designed, manufactured, and tested in a wind tunnel. A relatively low tip-speed ratio of 1.1 was targeted for use in an urban environment at a rated wind speed of 9 m/s and a rotational speed of 170 rpm. The basic dimensions were determined through a momentum-based design method according to the IEC 61400-2 protocol. The power output was estimated by a mathematical model that takes into account the aerodynamic performance of the NACA0018 blade shape. The lift and drag of the blade with respect to the angle of attack during rotation were calculated using 2D computational fluid dynamics (CFD) simulation to account for the stall region. The average power output calculated by the model was 108.34 W, which satisfies the target output of 100 W. The manufactured wind turbine was tested in a large closed-circuit wind tunnel, and the power outputs were measured for given wind speeds. At the design condition, the measured power output was 114.7 W, which is 5.9% higher than that of the mathematical model. This result validates the proposed design method and the power estimation by the mathematical model.
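
    A quick arithmetic check of the quoted design point can be made from the tip-speed-ratio definition and the wind-power equation; the blade height used for the swept area below is an assumed value, since the rotor dimensions are not given in the abstract.

```python
import math

# Back-of-the-envelope check of the design point quoted in the record
# (tip-speed ratio 1.1, rated wind speed 9 m/s, 170 rpm, ~100 W model output).
tsr, v, rpm = 1.1, 9.0, 170.0
omega = rpm * 2.0 * math.pi / 60.0          # rotational speed, rad/s
radius = tsr * v / omega                    # m, from tsr = omega * R / v
height = 1.2                                # m (assumed blade height, not from the record)
area = 2.0 * radius * height                # swept area of a straight/helical VAWT, D * H
p_wind = 0.5 * 1.225 * area * v**3          # available wind power, W
cp_required = 108.34 / p_wind               # power coefficient implied by the model output
print(f"rotor radius = {radius:.2f} m")
print(f"implied Cp   = {cp_required:.2f} (with the assumed blade height)")
```

    With these assumptions the implied power coefficient comes out below 0.2, which is plausible for a vertical-axis rotor operating at such a low tip-speed ratio.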

  18. Fluid-Elastic Instability Tests on Parallel Triangular Tube Bundles with Different Mass Ratio Values under Increasing and Decreasing Flow Velocities

    Directory of Open Access Journals (Sweden)

    Xu Zhang

    2016-01-01

    Full Text Available To study the effects of increasing and decreasing flow velocities on the fluid-elastic instability of tube bundles, the responses of an elastically mounted tube in a rigid parallel triangular tube bundle with a pitch-to-diameter ratio of 1.67 were tested in a water tunnel subjected to crossflow. Aluminum and stainless steel tubes were each tested. In the in-line and transverse directions, the amplitudes, power spectral density functions, response frequencies, added mass coefficients, and other results were obtained and compared. The results show that a nonlinear hysteresis phenomenon occurred in the vibrations of both tube bundles. When the flow velocity is decreasing, tubes that have already reached fluid-elastic instability can remain in that state over a certain range of flow velocities. During this process, the response frequencies of the tubes decrease. Furthermore, the response frequencies of the aluminum tube can decrease much more than those of the stainless steel tube. The fluid-elastic instability constants fitted for these experiments were obtained from the experimental data. A deeper insight into the fluid-elastic instability of tube bundles was also obtained by synthesizing the results. This study is beneficial for designing and operating equipment with tube bundles inside, as well as for further research on the fluid-elastic instability of tube bundles.
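
    The fluid-elastic instability constants mentioned in the record are typically defined through a Connors-type stability criterion; the sketch below evaluates that generic criterion with illustrative numbers, not the constants fitted in the paper.

```python
import math

def connors_critical_velocity(K: float, f_n: float, d: float,
                              m: float, delta: float, rho: float) -> float:
    """Critical gap flow velocity from a Connors-type stability criterion:
        V_c / (f_n * d) = K * (m * delta / (rho * d**2)) ** 0.5
    K     : fluid-elastic instability constant (fitted from experiments)
    f_n   : tube natural frequency in still fluid, Hz
    d     : tube outer diameter, m
    m     : tube mass per unit length including added mass, kg/m
    delta : logarithmic decrement of damping
    rho   : fluid density, kg/m^3
    """
    return K * f_n * d * math.sqrt(m * delta / (rho * d * d))

# Illustrative numbers only (not the paper's fitted values).
v_c = connors_critical_velocity(K=3.0, f_n=20.0, d=0.02, m=0.8, delta=0.05, rho=998.0)
print(f"critical velocity ~ {v_c:.2f} m/s")
```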

  19. Technological considerations in emergency instrumentation preparedness. Phase II-D. Evaluation testing and calibration methodology for emergency radiological instrumentation

    International Nuclear Information System (INIS)

    Bramson, P.E.; Andersen, B.V.; Fleming, D.M.; Kathren, R.L.; Mulhern, O.R.; Newton, C.E.; Oscarson, E.E.; Selby, J.M.

    1976-09-01

    In response to recommendations from the Advisory Committee on Reactor Safeguards, the Division of Operational Safety, U.S. ERDA has contracted with Battelle, Pacific Northwest Laboratories to survey the adequacy of existing instrumentation at nuclear fuel cycle facilities to meet emergency requirements and to develop technical criteria for instrumentation systems to be used in assessment of environmental conditions following plant emergencies. This report, the fifth in a series, provides: (1) calibration methods to assure the quality of radiological measurements and (2) testing procedures for determining whether an emergency radiological instrument meets the performance specifications. Three previous reports in this series identified the emergency instrumentation needs for power reactors, mixed oxide fuel plants, and fuel reprocessing facilities. Each of these three reports contains a Section VI, which sets forth applicable radiological instrument performance criteria and calibration requirements. Testing and calibration procedures in this report have been formatted in two parts: IV and V, each divided into three subsections: (1) Power Reactors, (2) Mixed Oxide Fuel Plants, and (3) Fuel Reprocessing Facilities. The three performance criteria subsections directly coincide with the performance criteria sections of the previous reports. These performance criteria sections have been reproduced in this report as Part III with references of ''required action'' added

  20. Monitoring of Bridges by a Laser Pointer: Dynamic Measurement of Support Rotations and Elastic Line Displacements: Methodology and First Test.

    Science.gov (United States)

    Artese, Serena; Achilli, Vladimiro; Zinno, Raffaele

    2018-01-25

    Deck inclination and vertical displacements are among the most important technical parameters for evaluating the health status of a bridge and verifying its bearing capacity. Several methods, both conventional and innovative, are used for monitoring structural rotations and displacements; however, none of these allows, at the same time, precision, automation, and static and dynamic monitoring without the use of high-cost instrumentation. The proposed system uses a common laser pointer and image processing. The elastic line inclination is measured by analyzing the single frames of an HD video of the laser beam imprint projected on a flat target. For the image processing, a code was developed in Matlab® that provides the instantaneous rotation and displacement of a bridge under a moving load. An important feature is the synchronization of the load positioning, obtained by a GNSS receiver or by a video. After the calibration procedures, a test was carried out during the movements of a heavy truck maneuvering on a bridge. Data acquisition synchronization allowed us to relate the position of the truck on the deck to the inclination and displacements. The inclination of the elastic line at the support was obtained with a precision of 0.01 mrad. The results demonstrate the suitability of the method for dynamic load tests and for the control and monitoring of bridges.

  1. Monitoring of Bridges by a Laser Pointer: Dynamic Measurement of Support Rotations and Elastic Line Displacements: Methodology and First Test

    Directory of Open Access Journals (Sweden)

    Serena Artese

    2018-01-01

    Full Text Available Deck inclination and vertical displacements are among the most important technical parameters for evaluating the health status of a bridge and verifying its bearing capacity. Several methods, both conventional and innovative, are used for monitoring structural rotations and displacements; however, none of these allows, at the same time, precision, automation, and static and dynamic monitoring without the use of high-cost instrumentation. The proposed system uses a common laser pointer and image processing. The elastic line inclination is measured by analyzing the single frames of an HD video of the laser beam imprint projected on a flat target. For the image processing, a code was developed in Matlab® that provides the instantaneous rotation and displacement of a bridge under a moving load. An important feature is the synchronization of the load positioning, obtained by a GNSS receiver or by a video. After the calibration procedures, a test was carried out during the movements of a heavy truck maneuvering on a bridge. Data acquisition synchronization allowed us to relate the position of the truck on the deck to the inclination and displacements. The inclination of the elastic line at the support was obtained with a precision of 0.01 mrad. The results demonstrate the suitability of the method for dynamic load tests and for the control and monitoring of bridges.
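
    For small angles, the rotation measurement described here reduces to dividing the laser-spot displacement on the target by the laser-to-target distance; the sketch below shows that conversion with illustrative numbers (the distances are assumptions, not values from the paper).

```python
def support_rotation_mrad(spot_shift_mm: float, target_distance_m: float) -> float:
    """Rotation of the elastic line at the support, in mrad, from the shift of the
    laser-spot centroid on a flat target at a known distance.
    Small-angle approximation: theta ~ shift / distance, and mm/m equals mrad."""
    return spot_shift_mm / target_distance_m

# Example: a 0.30 mm centroid shift on a target 10 m from the laser pointer
# corresponds to a support rotation of 0.03 mrad.
print(f"{support_rotation_mrad(0.30, 10.0):.3f} mrad")
```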

  2. Evaluation of the aspartate aminotransferase/platelet ratio index and enhanced liver fibrosis tests to detect significant fibrosis due to chronic hepatitis C.

    Science.gov (United States)

    Petersen, John R; Stevenson, Heather L; Kasturi, Krishna S; Naniwadekar, Ashutosh; Parkes, Julie; Cross, Richard; Rosenberg, William M; Xiao, Shu-Yuan; Snyder, Ned

    2014-04-01

    The assessment of liver fibrosis in chronic hepatitis C patients is important for prognosis and making decisions regarding antiviral treatment. Although liver biopsy is considered the reference standard for assessing hepatic fibrosis in patients with chronic hepatitis C, it is invasive and associated with sampling and interobserver variability. Serum fibrosis markers have been utilized as surrogates for a liver biopsy. We completed a prospective study of 191 patients in which blood draws and liver biopsies were performed on the same visit. Using liver biopsies the sensitivity, specificity, and negative and positive predictive values for both aspartate aminotransferase/platelet ratio index (APRI) and enhanced liver fibrosis (ELF) were determined. The patients were divided into training and validation patient sets to develop and validate a clinically useful algorithm for differentiating mild and significant fibrosis. The area under the ROC curve for the APRI and ELF tests for the training set was 0.865 and 0.880, respectively. The clinical sensitivity in separating mild (F0-F1) from significant fibrosis (F2-F4) was 80% and 86.0% with a clinical specificity of 86.7% and 77.8%, respectively. For the validation sets the area under the ROC curve for the APRI and ELF tests was, 0.855 and 0.780, respectively. The clinical sensitivity of the APRI and ELF tests in separating mild (F0-F1) from significant (F2-F4) fibrosis for the validation set was 90.0% and 70.0% with a clinical specificity of 73.3% and 86.7%, respectively. There were no differences between the APRI and ELF tests in distinguishing mild from significant fibrosis for either the training or validation sets (P=0.61 and 0.20, respectively). Using APRI as the primary test followed by ELF for patients in the intermediate zone, would have decreased the number of liver biopsies needed by 40% for the validation set. Overall, use of our algorithm would have decreased the number of patients who needed a liver biopsy
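
    A minimal sketch of the two-step triage idea described in the record (APRI first, ELF only for patients in the intermediate APRI zone) is given below; the APRI formula is the standard one, while the cut-off values are placeholder assumptions rather than the thresholds derived in the study.

```python
from typing import Optional

def apri(ast_u_l: float, ast_upper_limit_u_l: float, platelets_1e9_l: float) -> float:
    """Aspartate aminotransferase-to-platelet ratio index."""
    return (ast_u_l / ast_upper_limit_u_l) * 100.0 / platelets_1e9_l

def triage(apri_value: float, elf_value: Optional[float]) -> str:
    """Two-step sketch: classify on APRI alone when it is clearly low or high,
    and fall back on ELF only in the indeterminate zone.
    The cut-offs (0.5, 1.5, 9.8) are assumed illustrative values."""
    if apri_value < 0.5:
        return "mild fibrosis (F0-F1), no biopsy"
    if apri_value > 1.5:
        return "significant fibrosis (F2-F4), no biopsy"
    if elf_value is None:
        return "indeterminate APRI: order ELF"
    return "significant fibrosis (F2-F4)" if elf_value >= 9.8 else "mild fibrosis (F0-F1)"

# Example: AST 80 U/L with an upper limit of normal of 40 U/L and platelets 210 x 10^9/L
print(triage(apri(80.0, 40.0, 210.0), None))   # APRI ~ 0.95 -> ELF needed
```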

  3. Research and systematization of 'hot' particles in the Semipalatinsk nuclear test site soils - methodology and first results

    International Nuclear Information System (INIS)

    Gorlachev, I.D.; Knyazev, B.B.; Kvochkina, T.N.; Lukashenko, S.N.

    2005-01-01

    Full text: Sources of soil activity at the Semipalatinsk Nuclear Test Site (SNTS) can be both 'hot' particles, with dimensions from tens of microns to a few millimeters, and sub-micron particles that determine the matrix activity of soil samples. The fractionation of radionuclides and the formation of the radionuclide composition of 'hot' particles arise from temperature changes and from the complicated nuclear-physical and thermodynamic processes occurring in the fireball and cloud of a nuclear explosion. Knowledge of the physical-chemical properties of 'hot' particles is needed to evaluate the migration of radioactive products in the environment and the level of danger from external and internal exposure of people. Moreover, detailed information about the structure and composition of 'radioactive' particles can be useful for specifying the processes occurring in the fireball during explosions of different types and also for specifying the mechanism of radioactive fallout formation. The main contaminated areas of the SNTS can be divided into four types depending on the type of nuclear explosion. The radionuclide species and their distributions can differ considerably among the different explosions. Therefore, several areas most typical of each test type were selected and twenty soil samples were collected to reveal the peculiarities of their radionuclide contamination. The collected soil samples were separated into five granulometric fractions: 1 mm - 2 mm, 0.5 mm - 1 mm, 0.28 mm - 0.5 mm, 0.112 mm - 0.28 mm and <0.112 mm. 'Hot' particles of the first fraction (f > 1 mm), 210 'hot' particles of the second fraction (1 > f > 0.5 mm) and 154 'hot' particles of the third fraction (0.5 > f > 0.28 mm) were selected from the twelve SNTS soil samples by the induced-fission and visual identification methods. The main sources of the activities of the soil samples and 'hot' particles are the 239+240 Pu, 241 Am, 137 Cs and 152 Eu isotopes. In addition to the described work, a special sampling of large 'hot' particles (dimensions greater than 2 mm) was carried out in the areas of the ground and air tests

  4. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  5. Valuation of environmental problems in landfill deposition and composting - test of methodology; Verdsetting av miljoekonsekvenser av avfallsdeponering og kompostering - metodeutproeving

    Energy Technology Data Exchange (ETDEWEB)

    Leknes, Einar; Movik, Espen; Wiik, Ragnhild; Meissnes, Rudolf

    1995-08-01

    This study is aimed at the tests and design of methods for valuation of environmental problems associated with the landfill deposition of household waste. An extensive review of literature has been conducted with respect to the environmental impacts and valuation methods. Environmental impact assessment and valuation with respect to emission of greenhouse gases (GHG's), leachate and disamenity, have been performed for 4 Norwegian landfills. These differ in their approach towards waste treatment in terms of GHG-collection, briquette production and composting and also in their location in terms of proximity to residential areas and the quality of natural recipients. The study shows that the collection of methane and production of briquettes causes major reductions in the generation of GHG's, whereas composting brings significant reductions for all types of environmental impacts. (author)

  6. Escherichia coli. A sanitary methodology for faecal water pollution tests; Escherichia coli nelle acque. Significato sanitario e metodologie di analisi

    Energy Technology Data Exchange (ETDEWEB)

    Bonadonna, L. [Istituto Superiore di Sanita' , Rome (Italy)

    2001-02-01

    Among the traditional indicators of faecal water pollution, Escherichia coli has been shown to fit best the definition of an indicator organism. Until now its recovery has been time-consuming and has required confirmation tests. In this report more rapid and direct methods, based on enzymatic reactions, are presented. Owing to certain peculiar characteristics, Escherichia coli seems to satisfy the requirements inherent in the definition of an indicator organism better than the traditional indicators of faecal contamination of water. Until now, the substrates available for its detection have all required at least one confirmation test. Hence the need to identify detection methods with a faster response, also in view of the inclusion of this microorganism, in the most recent national and European regulations, among the microbiological parameters to be determined.

  7. Valuation of environmental problems in landfill deposition and composting - test of methodology; Verdsetting av miljoekonsekvenser av avfallsdeponering og kompostering - metodeutproeving

    Energy Technology Data Exchange (ETDEWEB)

    Leknes, Einar; Movik, Espen; Wiik, Ragnhild; Meissnes, Rudolf

    1995-08-01

    This study is aimed at the tests and design of methods for valuation of environmental problems associated with the landfill deposition of household waste. An extensive review of literature has been conducted with respect to the environmental impacts and valuation methods. Environmental impact assessment and valuation with respect to emission of greenhouse gases (GHG's), leachate and disamenity, have been performed for 4 Norwegian landfills. These differ in their approach towards waste treatment in terms of GHG-collection, briquette production and composting and also in their location in terms of proximity to residential areas and the quality of natural recipients. The study shows that the collection of methane and production of briquettes causes major reductions in the generation of GHG's, whereas composting brings significant reductions for all types of environmental impacts. (author)

  8. Documentation of tests on particle size methodologies for laser diffraction compared to traditional sieving and sedimentation analysis

    DEFF Research Database (Denmark)

    Rasmussen, Charlotte; Dalsgaard, Kristian

    Sieving and sedimentation analyses by pipette or hydrometer are historically the traditional methods for determining particle size distributions (PSD). A more informative and faster alternative has for years been laser diffraction (LD). From 2003 to 2013 the authors of this paper have worked intensively with PSD and performed various tests and investigations using LD, sedimentation (by pipette) and sieving. The aim was to improve and understand the relationships between these techniques and the effects of pre-treatment, and preferably to find a unifying correlation factor. As a result, method comparisons of LD and sieving/sedimentation are difficult, as LD is a 3D optical volume measurement, sieving is a 2D width measurement, and sedimentation is density dependent. Platy particles like clay are generally measured as coarser by LD than by the traditional methods. For LD the clay…

  9. Test-day somatic cell score, fat-to-protein ratio and milk yield as indicator traits for sub-clinical mastitis in dairy cattle.

    Science.gov (United States)

    Jamrozik, J; Schaeffer, L R

    2012-02-01

    Test-day (TD) records of milk, fat-to-protein ratio (F:P) and somatic cell score (SCS) of first-lactation Canadian Holstein cows were analysed by a three-trait finite mixture random regression model, with the purpose of revealing hidden structures in the data owing to putative, sub-clinical mastitis. Different distributions of the data were allowed in 30 intervals of days in milk (DIM), covering the lactation from 5 to 305 days. Bayesian analysis with Gibbs sampling was used for model inferences. Estimated proportion of TD records originated from cows infected with mastitis was 0.66 in DIM from 5 to 15 and averaged 0.2 in the remaining part of lactation. Data from healthy and mastitic cows exhibited markedly different distributions, with respect to both average value and the variance, across all parts of lactation. Heterogeneity of distributions for infected cows was also apparent in different DIM intervals. Cows with mastitis were characterized by smaller milk yield (down to -5 kg) and larger F:P (up to 0.13) and SCS (up to 1.3) compared with healthy contemporaries. Differences in averages between healthy and infected cows for F:P were the most profound at the beginning of lactation, when a dairy cow suffers the strongest energy deficit and is therefore more prone to mammary infection. Residual variances for data from infected cows were substantially larger than for the other mixture components. Fat-to-protein ratio had a significant genetic component, with estimates of heritability that were larger or comparable with milk yield, and was not strongly correlated with milk and SCS on both genetic and environmental scales. Daily milk, F:P and SCS are easily available from milk-recording data for most breeding schemes in dairy cattle. Fat-to-protein ratio can potentially be a valuable addition to SCS and milk yield as an indicator trait for selection against mastitis. © 2011 Blackwell Verlag GmbH.
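
    The record's three-trait finite mixture model is Bayesian and fitted with Gibbs sampling; as a much simpler illustration of the underlying idea (separating putatively infected from healthy test-day records by mixture components), the sketch below fits a two-component Gaussian mixture to synthetic (milk, F:P, SCS) records. All numbers are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic test-day records: columns are milk yield (kg), fat-to-protein ratio, SCS.
rng = np.random.default_rng(2)
healthy  = np.column_stack([rng.normal(28, 4, 800),
                            rng.normal(1.15, 0.10, 800),
                            rng.normal(2.5, 1.0, 800)])
infected = np.column_stack([rng.normal(24, 4, 200),
                            rng.normal(1.25, 0.12, 200),
                            rng.normal(3.8, 1.2, 200)])
records = np.vstack([healthy, infected])

gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(records)
mastitis_component = int(np.argmax(gm.means_[:, 2]))   # component with the higher mean SCS
print("estimated proportion of putatively infected records:",
      round(float(gm.weights_[mastitis_component]), 2))
```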

  10. Poster - 44: Development and implementation of a comprehensive end-to-end testing methodology for linac-based frameless SRS QA using a modified commercial stereotactic anthropomorphic phantom

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Derek; Mutanga, Theodore [University of Toronto, Carlo Fidani Peel Regional Cancer Center (Canada)

    2016-08-15

    Purpose: An end-to-end testing methodology was designed to evaluate the overall SRS treatment fidelity, incorporating all steps in the linac-based frameless radiosurgery treatment delivery process. The study details our commissioning experience of the Steev (CIRS, Norfolk, VA) stereotactic anthropomorphic head phantom including modification, test design, and baseline measurements. Methods: Repeated MR and CT scans were performed with interchanging inserts. MR-CT fusion accuracy was evaluated and the insert spatial coincidence was verified on CT. Five non-coplanar arcs delivered a prescription dose to a 15 mm spherical CTV with 2 mm PTV margin. Following setup, CBCT-based shifts were applied as per protocol. Sequential measurements were performed by interchanging inserts without disturbing the setup. Spatial and dosimetric accuracy was assessed by a combination of CBCT hidden target, radiochromic film, and ion chamber measurements. To facilitate film registration, the film insert was modified in-house by etching marks. Results: MR fusion error and insert spatial coincidences were within 0.3 mm. Both CBCT and film measurements showed spatial displacements of 1.0 mm in similar directions. Both coronal and sagittal films reported 2.3% higher target dose relative to the treatment plan. The corrected ion chamber measurement was similarly greater by 1.0%. The 3%/2 mm gamma pass rate was 99% for both films. Conclusions: A comprehensive end-to-end testing methodology was implemented for our SRS QA program. The Steev phantom enabled realistic evaluation of the entire treatment process. Overall spatial and dosimetric accuracy of the delivery were 1 mm and 3%, respectively.

  11. Mental models or methodological artefacts? Adults' 'naïve' responses to a test of children's conceptions of the earth.

    Science.gov (United States)

    Nobes, Gavin; Panagiotaki, Georgia

    2009-05-01

    Vosniadou and Brewer (1992) claim that children's drawings and answers to questions show that they have naive, theory-like 'mental models' of the earth; for example, they believe it to be flat, or hollow with people inside. However, recent studies that have used different methods have found little or no evidence of these misconceptions. The contrasting accounts, and possible reasons for the inconsistent findings, were tested by giving adults (N = 484) either the original task (designed for 5-year olds) or a new version in which the same drawing instructions and questions were rephrased and clarified. Many adults' responses to the original version were identical to children's 'naïve' drawings and answers. The new version elicited substantially fewer non-scientific responses. These findings indicate that even adults find the original instructions and questions ambiguous and confusing, and that this is the principal reason for their non-scientific drawings and answers. Since children must find the task even more confusing than adults, this explanation very probably applies to many of their non-scientific responses, too, and therefore accounts for the discrepant findings of previous research. 'Naïve' responses result largely from misinterpretation of Vosniadou and Brewer's apparently simple task, rather than from mental models of the earth.

  12. New methodology to investigate potential contaminant mass fluxes at the stream-aquifer interface by combining integral pumping tests and streambed temperatures

    International Nuclear Information System (INIS)

    Kalbus, E.; Schmidt, C.; Bayer-Raich, M.; Leschik, S.; Reinstorf, F.; Balcke, G.U.; Schirmer, M.

    2007-01-01

    The spatial pattern and magnitude of mass fluxes at the stream-aquifer interface have important implications for the fate and transport of contaminants in river basins. Integral pumping tests were performed to quantify average concentrations of chlorinated benzenes in an unconfined aquifer partially penetrated by a stream. Four pumping wells were operated simultaneously for a time period of 5 days and sampled for contaminant concentrations. Streambed temperatures were mapped at multiple depths along a 60 m long stream reach to identify the spatial patterns of groundwater discharge and to quantify water fluxes at the stream-aquifer interface. The combined interpretation of the results showed average potential contaminant mass fluxes from the aquifer to the stream of 272 μg m⁻² d⁻¹ of MCB and 71 μg m⁻² d⁻¹ of DCB, respectively. This methodology combines a large-scale assessment of aquifer contamination with a high-resolution survey of groundwater discharge zones to estimate contaminant mass fluxes between aquifer and stream. - We provide a new methodology to quantify the potential contaminant mass flux from an aquifer to a stream
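
    The potential mass flux combines the two field measurements named in the record: an average concentration from the integral pumping test and a groundwater (Darcy) flux from the streambed temperatures. The sketch below shows the arithmetic; the concentration and flux values are assumptions chosen only to illustrate the unit handling (they happen to reproduce the order of magnitude of the reported MCB flux).

```python
# Sketch of assembling a potential contaminant mass flux at the stream-aquifer
# interface. Input values are assumed for illustration, not taken from the study.
c_mcb_ug_per_l = 3.2      # average MCB concentration from the integral pumping test, ug/L (assumed)
q_m_per_d = 0.085         # groundwater discharge from streambed temperature profiles, m/d (assumed)

# mass flux = water flux * concentration; 1 m^3 of water = 1000 L
j_mcb = q_m_per_d * c_mcb_ug_per_l * 1000.0   # ug m^-2 d^-1
print(f"potential MCB mass flux ~ {j_mcb:.0f} ug m^-2 d^-1")
```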

  13. From Theory-Inspired to Theory-Based Interventions: A Protocol for Developing and Testing a Methodology for Linking Behaviour Change Techniques to Theoretical Mechanisms of Action.

    Science.gov (United States)

    Michie, Susan; Carey, Rachel N; Johnston, Marie; Rothman, Alexander J; de Bruin, Marijn; Kelly, Michael P; Connell, Lauren E

    2018-05-18

    Understanding links between behaviour change techniques (BCTs) and mechanisms of action (the processes through which they affect behaviour) helps inform the systematic development of behaviour change interventions. This research aims to develop and test a methodology for linking BCTs to their mechanisms of action. Study 1 (published explicit links): Hypothesised links between 93 BCTs (from the 93-item BCT taxonomy, BCTTv1) and mechanisms of action will be identified from published interventions and their frequency, explicitness and precision documented. Study 2 (expert-agreed explicit links): Behaviour change experts will identify links between 61 BCTs and 26 mechanisms of action in a formal consensus study. Study 3 (integrated matrix of explicit links): Agreement between studies 1 and 2 will be evaluated and a new group of experts will discuss discrepancies. An integrated matrix of BCT-mechanism of action links, annotated to indicate strength of evidence, will be generated. Study 4 (published implicit links): To determine whether groups of co-occurring BCTs can be linked to theories, we will identify groups of BCTs that are used together from the study 1 literature. A consensus exercise will be used to rate strength of links between groups of BCT and theories. A formal methodology for linking BCTs to their hypothesised mechanisms of action can contribute to the development and evaluation of behaviour change interventions. This research is a step towards developing a behaviour change 'ontology', specifying relations between BCTs, mechanisms of action, modes of delivery, populations, settings and types of behaviour.

  14. Development of methodology for alternative testing strategies for the assessment of the toxicological profile of nanoparticles used in medical diagnostics. NanoTEST - EC FP7 project

    International Nuclear Information System (INIS)

    Dusinska, Maria; Fjellsbo, Lise Maria; Heimstad, Eldbjorg; Harju, Mikael; Bartonova, Alena; Tran, Lang; Juillerat-Jeanneret, Lucienne; Halamoda, Blanka; Marano, Francelyne; Boland, Sonja; Saunders, Margaret; Cartwright, Laura; Carreira, Sara; Thawley, Susan; Whelan, Maurice; Klein, Christoph; Housiadas, Christos; Volkovova, Katarina; Tulinska, Jana; Beno, Milan

    2009-01-01

    Nanoparticles (NPs) have unique, potentially beneficial properties, but their possible impact on human health is still not known. The area of nanomedicine brings humans into direct contact with NPs and it is essential for both public confidence and the nanotech companies that appropriate risk assessments are undertaken in relation to health and safety. There is a pressing need to understand how engineered NPs can interact with the human body following exposure. The FP7 project NanoTEST (www.nanotest-fp7.eu) addresses these requirements in relation to the toxicological profile of NPs used in medical diagnostics.

  15. Testing for HPV as an objective measure for quality assurance in gynecologic cytology: positive rates in equivocal and abnormal specimens and comparison with the ASCUS to SIL ratio.

    Science.gov (United States)

    Ko, Vincent; Nanji, Shabin; Tambouret, Rosemary H; Wilbur, David C

    2007-04-25

    Inappropriate use of the category of atypical squamous cells of undetermined significance (ASCUS) can result in overtreatment or undertreatment of patients, which may decrease the cost effectiveness of screening. Quality assurance tools, such as the ASCUS to squamous intraepithelial lesion ratio (ASCUS:SIL) and case review, are imperfect. High-risk HPV (hrHPV) testing is an objective test for a known viral carcinogen, and hrHPV may be more useful in monitoring the quality of ASCUS interpretations. hrHPV rates for cytologic diagnoses and patient age groups were calculated for a 2-year period. All hrHPV results for ASCUS and SIL over a 17-month period were analyzed by patient age group, over time, and by individual cytopathologist to compare hrHPV rates with the corresponding ASCUS:SIL. The hrHPV positive rate for SIL was >90%, and it was 32.6% for ASCUS. Stratification by patient age showed that approximately 50% of patients younger than 30 years and older than 70 years of age were hrHPV positive, whereas other patients had a lower rate ranging from 14% to 34%. The overall ASCUS:SIL was 1.42, and the overall hrHPV positive rate was 39.9%. Over time and by individual cytopathologist, the hrHPV rate performed similarly to the ASCUS:SIL. The analysis by patient age showed a high statistical correlation (R² = 0.9772) between the 2 methods. Despite differences between these techniques, the hrHPV rate closely recapitulates the ASCUS:SIL. When used together, the 2 methods can complement each other. The desirable hrHPV-positive range appears to be 40% to 50%; however, this may vary based on the patient population. The hrHPV rate is as quick and cost effective as determining the ASCUS:SIL. © 2007 American Cancer Society.
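
    Both quality metrics compared in the record can be computed per cytopathologist from routine sign-out data; the sketch below shows one way to do this on a toy table (the column names and the few example cases are assumptions).

```python
import pandas as pd

# Toy sign-out table: one row per Pap case with its diagnosis and hrHPV result.
cases = pd.DataFrame({
    "pathologist": ["A", "A", "A", "B", "B", "B", "B"],
    "diagnosis":   ["ASCUS", "LSIL", "ASCUS", "ASCUS", "HSIL", "NILM", "ASCUS"],
    "hrHPV_pos":   [True, True, False, True, True, False, False],
})

for name, g in cases.groupby("pathologist"):
    n_ascus = (g["diagnosis"] == "ASCUS").sum()
    n_sil = g["diagnosis"].isin(["LSIL", "HSIL"]).sum()
    hpv_rate = g.loc[g["diagnosis"] == "ASCUS", "hrHPV_pos"].mean()
    print(f"pathologist {name}: ASCUS:SIL = {n_ascus / max(n_sil, 1):.2f}, "
          f"hrHPV+ rate in ASCUS = {hpv_rate:.0%}")
```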

  16. Development of methodologies for optimization of surveillance testing and maintenance of safety related equipment at NPPs. Report of a research coordination meeting. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    This report summarizes the results of the first meeting of the Coordinated Research Programme (CRP) on Development of Methodologies for Optimization of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs, held at the Agency Headquarters in Vienna, from 16 to 20 December 1996. The purpose of this Research Coordination Meeting (RCM) was that all Chief Scientific Investigators of the groups participating in the CRP presented an outline of their proposed research projects. Additionally, the participants discussed the objective, scope, work plan and information channels of the CRP in detail. Based on these presentations and discussions, the entire project plan was updated, completed and included in this report. This report represents a common agreed project work plan for the CRP. Refs, figs, tabs.

  17. Methodological requirements to test a possible in-group advantage in judging emotions across cultures: comment on Elfenbein and Ambady (2002) and evidence.

    Science.gov (United States)

    Matsumoto, David

    2002-03-01

    H. A. Elfenbein and N. Ambady's (2002) conclusions concerning a possible in-group advantage in judging emotions across cultures are unwarranted. The author discusses 2 methodological requirements for studies to test adequately the in-group advantage hypothesis and an additional requirement in reviewing multiple judgment studies and examining variance in judgment effects across those studies. The few studies that Elfenbein and Ambady reported that support the in-group advantage hypothesis need to be examined for whether they meet the criteria discussed; if they do not, their data cannot be used to support any contention of cultural differences in judgments, let alone the in-group advantage hypothesis. Furthermore, the role of signal clarity needs to be explored in possibly moderating effects across studies; however, this was not done.

  18. Testing methodology of diamond composite inserts to be used in the drilling of petroleum wells; Metodologia de testes de insertos compositos diamantados a serem usados na perfuracao de pocos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Bobrovnitchii, G.S.; Filgueira, M.; Skury, A.L.D.; Tardim, R.C. [Universidade Estadual do Norte Fluminense (UENF), Campos dos Goytacazes, RJ (Brazil)], e-mail: rtardim@terra.com.br

    2006-07-01

    The useful life of the inserts used in the cutters of drills for the perforation of oil wells determines the quality of the perforation as well as the productivity. Therefore, research on insert wear is carried out with the objective of predicting the most important properties of the inserts. Since UENF is developing processes for sintering composites based on synthetic diamond, it is of interest to define a testing methodology for the inserts obtained. The proposed methodology is based on the evaluation of the wear suffered by the sample. For this purpose a microprocessor-controlled 'Abrasimeter', model AB800-E, manufactured by the Contenco Company, was used. The instrument capacity is 1.36 kVA; axial load applied to the cutter up to 50 kgf; table rotation speed 20 rpm; speed of tool travel in the radial direction 2 m/min; dimensions of the granite block D = 808 mm, d = 484 mm, h = 50 mm. The results obtained show that the proposed methodology can be used for the evaluation of the inserts of cutters applied in perforation drills. (author)

  19. The perfectionism model of binge eating: testing unique contributions, mediating mechanisms, and cross-cultural similarities using a daily diary methodology.

    Science.gov (United States)

    Sherry, Simon B; Sabourin, Brigitte C; Hall, Peter A; Hewitt, Paul L; Flett, Gordon L; Gralnick, Tara M

    2014-12-01

    The perfectionism model of binge eating (PMOBE) is an integrative model explaining the link between perfectionism and binge eating. This model proposes socially prescribed perfectionism confers risk for binge eating by generating exposure to 4 putative binge triggers: interpersonal discrepancies, low interpersonal esteem, depressive affect, and dietary restraint. The present study addresses important gaps in knowledge by testing if these 4 binge triggers uniquely predict changes in binge eating on a daily basis and if daily variations in each binge trigger mediate the link between socially prescribed perfectionism and daily binge eating. Analyses also tested if proposed mediational models generalized across Asian and European Canadians. The PMOBE was tested in 566 undergraduate women using a 7-day daily diary methodology. Depressive affect predicted binge eating, whereas anxious affect did not. Each binge trigger uniquely contributed to binge eating on a daily basis. All binge triggers except for dietary restraint mediated the relationship between socially prescribed perfectionism and change in daily binge eating. Results suggested cross-cultural similarities, with the PMOBE applying to both Asian and European Canadian women. The present study advances understanding of the personality traits and the contextual conditions accompanying binge eating and provides an important step toward improving treatments for people suffering from eating binges and associated negative consequences.

  20. Concurrent Sr/Ca Ratios and Bomb Test 14C Records from a Porites evermanni Colony on Kure Atoll: SST, Climate Change, Ocean Circulation and Management Applications

    Science.gov (United States)

    Covarrubias, S.; Potts, D.; Siciliano, D.; Andrews, A.; Franks, R.

    2013-12-01

    Coral reefs near their latitudinal and ecological limits may be affected disproportionately by global climate changes, especially by changing sea surface temperatures (SSTs). One such reef is Kure Atoll, the northernmost reef in the Hawaiian chain. Kure Atoll experiences dramatic temperature and seasonal differences throughout the year. Tracking these fluctuations is important for understanding recent physical forces affecting coral growth in such marginal reefs, and for predicting likely responses to future climate and oceanic changes. We used Sr/Ca ratios of a 50 cm Porites evermanni coral core collected at Kure (September 2002) as an SST proxy for reconstructing a temperature timescale spanning the length of the core (~62 years). After cutting a 5 mm thick slab through the central growth axis and X-raying it to identify annual density banding, we extracted 4 equally spaced samples from each annual increment to quantify seasonal, inter-annual, and decadal SST patterns. We measured Sr and Ca concentrations by Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES). We then converted Sr/Ca ratios (mmol/mol) to SST using published equations, and calibrated the more recent SST estimates against satellite-based SST imagery and instrumental records from Midway Atoll (ca. 90 km to the SE). We coupled the ICP-OES data with Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) scans along the core to provide higher temporal resolution for interpreting intra-seasonal and inter-seasonal trends. Higher-resolution temperature dating can help us interpret strong inter-seasonal changes not readily seen with low-resolution measurements, giving us the ability to track temperature anomalies at interannual and decadal timescales, such as El Niño/Southern Oscillation or La Niña/North Pacific Decadal Oscillation events. Further, the SST signature from the Sr/Ca analyses is being used in conjunction with bomb radiocarbon signals in order to establish a complete
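
    Coral Sr/Ca paleothermometry rests on a linear calibration of the form Sr/Ca = a + b * SST. The record states that the calibration is made against satellite SST and the Midway instrumental record, but the coefficients are not given in the abstract, so the values in the sketch below are generic placeholders for illustration only.

```python
def sst_from_sr_ca(sr_ca_mmol_mol: float, a: float = 10.5, b: float = -0.06) -> float:
    """Convert a coral Sr/Ca ratio (mmol/mol) to SST (deg C) by inverting a linear
    calibration Sr/Ca = a + b * SST. The default a and b are placeholder values,
    not the calibration derived in this study."""
    return (sr_ca_mmol_mol - a) / b

# Example: with the placeholder coefficients, Sr/Ca = 9.05 mmol/mol maps to ~24 deg C.
print(f"{sst_from_sr_ca(9.05):.1f} deg C")
```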

  1. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, that is, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of the component effect metrics provides the hierarchy of processes in a component, then in all components and in the system. FSM separates quantitatively dominant from minor processes and components and
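
    A minimal sketch of the ranking step described above: compute ω as transport rate over content and Ω = ωt for each process, then order the processes by Ω. The process names, rates, contents and characteristic time are invented placeholders, not values from the paper.

```python
# Each process is described by (transport rate, content of the preserved quantity
# in the component), in arbitrary but mutually consistent units.
processes = {
    "break flow":        (50.0, 2000.0),
    "ECC injection":     (30.0, 2000.0),
    "wall heat release": (5.0,  2000.0),
}
t_char = 20.0   # characteristic time of the scenario phase, s (assumed)

# omega = rate / content ; Omega = omega * t_char
effect = {name: (rate / content) * t_char for name, (rate, content) in processes.items()}
for name, omega_t in sorted(effect.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:<18} Omega = {omega_t:.3f}")
```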

  2. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Full Text Available Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes to create statistics on how the current technologies work. After analysing these results I will demonstrate tricks to detect sandboxes. These tricks can’t be easily flagged as malicious. Some sandboxes don’t interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  3. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  4. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  5. SOLVENCY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU–OLARIU

    2016-08-01

    Full Text Available The current study evaluates the potential of the solvency ratio in predicting corporate bankruptcy. The research is focused on Romania and, in particular, on Timis County. The interest in the solvency ratio was based on the recommendations of the scientific literature, as well as on the availability of information concerning its values to all stakeholders. The event of interest was the occurrence of bankruptcy 2 years after the date of the reference financial statements. All tests were performed on 2 paired samples totalling 1,176 companies. The methodology employed in evaluating the potential of the solvency ratio was based on the Area Under the ROC Curve (0.646) and the general accuracy ensured by the ratio (64.5% out-of-sample accuracy). The results confirm the practical utility of the solvency ratio in the prediction of bankruptcy.
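
    The evaluation described (area under the ROC curve plus out-of-sample accuracy at a cut-off) can be reproduced on any scored sample; a minimal Python sketch, assuming scikit-learn is available and using simulated data rather than the Timis County sample.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)

# Simulated paired sample: 1 = bankrupt two years later, 0 = non-bankrupt
y = np.concatenate([np.ones(100), np.zeros(100)])
solvency = np.concatenate([rng.normal(0.9, 0.4, 100),    # bankrupt firms: lower solvency ratio
                           rng.normal(1.4, 0.4, 100)])   # surviving firms: higher solvency ratio

# Lower solvency indicates higher bankruptcy risk, so the score is the negated ratio
auc = roc_auc_score(y, -solvency)

# Simple cut-off classifier: predict bankruptcy when the solvency ratio falls below a threshold
threshold = 1.15
accuracy = accuracy_score(y, (solvency < threshold).astype(int))

print(f"AUC = {auc:.3f}, accuracy at cut-off {threshold} = {accuracy:.1%}")
```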

  6. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  7. Primary cultured fibroblasts derived from patients with chronic wounds: a methodology to produce human cell lines and test putative growth factor therapy such as GMCSF

    Directory of Open Access Journals (Sweden)

    Coppock Donald L

    2008-12-01

    Full Text Available Abstract Background Multiple physiologic impairments are responsible for chronic wounds. A cell line grown from patient wounds that retains its phenotype would provide a means of testing new therapies. Clinical information on the patients from whom cells were grown can provide insights into mechanisms of specific diseases such as diabetes or biological processes such as aging. The objectives of this study were (1) to culture human cells derived from patients with chronic wounds and to test the effects of a putative therapy, Granulocyte-Macrophage Colony Stimulating Factor (GM-CSF), on these cells; and (2) to describe a methodology to create fibroblast cell lines from patients with chronic wounds. Methods Patient biopsies were obtained from 3 distinct locations on venous ulcers. Fibroblasts derived from the different wound locations were tested for their migration capacities without stimulators and in response to GM-CSF. Another portion of the patient biopsy was used to develop primary fibroblast cultures after rigorous passage and antimicrobial testing. Results Fibroblasts from the non-healing edge had almost no migration capacity, wound base fibroblasts were intermediate, and fibroblasts derived from the healing edge had a capacity to migrate similar to healthy, normal, primary dermal fibroblasts. Non-healing edge fibroblasts did not respond to GM-CSF. Six fibroblast cell lines are currently available at the National Institute on Aging (NIA) Cell Repository. Conclusion We conclude that primary cells from chronic ulcers can be established in culture and that they maintain their in vivo phenotype. These cells can be utilized for evaluating the effects of wound healing stimulators in vitro.

  8. Assessment of tree response to drought: validation of a methodology to identify and test proxies for monitoring past environmental changes in trees.

    Science.gov (United States)

    Tene, A; Tobin, B; Dyckmans, J; Ray, D; Black, K; Nieuwenhuis, M

    2011-03-01

    A thinning experiment stand at Avoca, Ballinvalley, on the east coast of the Republic of Ireland was used to test a newly developed methodology aimed at monitoring drought stress, based on the analysis of growth rings obtained by coring. The stand incorporated six plots representing three thinning regimes (light, moderate and heavy) and was planted in the spring of 1943 on a brown earth soil. Radial growth (early- and latewood) was measured for the purpose of this study. A multidisciplinary approach was used to assess historic tree response to climate: specifically, the application of statistical tools such as principal component and canonical correlation analysis to dendrochronology, stable isotopes, ring density proxies, blue reflectance and forest biometrics. Results showed that radial growth was a good proxy for monitoring changes in moisture deficit, while maximum density and blue reflectance were appropriate for assessing changes in accumulated temperature for the growing season. Rainfall also influenced radial growth changes, but not significantly, and was a major factor in stable carbon and oxygen discrimination, mostly in the latewood formation phase. Stable oxygen isotope analysis was more accurate than radial growth analysis in drought detection, as it helped detect drought signals in both early- and latewood while radial growth analysis only detected the drought signal in earlywood. Many studies have shown that tree rings provide vital information for marking past climatic events. This work provides a methodology to better identify and understand how commonly measured tree proxies relate to environmental parameters, and how they can best be used to characterize and pinpoint drought events (variously described using parameters such as moisture deficit, accumulated temperature, rainfall and potential evaporation).

  9. Validation of a methodology to develop a test facility in reduced scale related to boron dispersion in a pressurizer of an iPWR

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Samira R.V.; Lira, Carlos A.B.O.; Lapa, Celso M.F.; Lima, Fernando R.A.; Bezerra, Jair L.; Silva, Mário A.B., E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociências. Departamento de Energia Nuclear; Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Centro Regional de Ciências Nucleares do Nordeste (CRCN/CNEN-PE), Recife, PE (Brazil)

    2017-11-01

    The conception and design of a 1:200 reduced-scale test facility were developed in earlier research. Such a facility aims to investigate the boron homogenization process inside the pressurizer of an iPWR (integral PWR) by considering the mixing of water from this component with that coming from the reactor core. For this kind of reactor, the pressurizer is located at the top of the pressure vessel, making it necessary to identify the proper mechanisms to guarantee adequate homogenization of the water mixture. Once the installation of the experimental setup was concluded, its behavior was analyzed by considering the concentration of a tracer diluted in the circulation water, whose measurements were obtained at the pressurizer outlet orifices. Two experiments representing boration (boron concentration increase) and deboration (boron concentration decrease) scenarios were carried out. Samples were acquired every ten minutes over a total time of 180 minutes. Results showed that the combination of Fractional Scaling Analysis with the local Froude number constituted an appropriate methodology for providing the reduced-scale test facility parameters, inasmuch as the measured concentrations from the experiments reproduced the theoretical behavior with sufficient accuracy. (author)

  10. Validation of a methodology to develop a test facility in reduced scale related to boron dispersion in a pressurizer of an iPWR

    International Nuclear Information System (INIS)

    Nascimento, Samira R.V.; Lira, Carlos A.B.O.; Lapa, Celso M.F.; Lima, Fernando R.A.; Bezerra, Jair L.; Silva, Mário A.B.

    2017-01-01

    The conception and design of a 1:200 reduced-scale test facility were developed in earlier research. Such a facility aims to investigate the boron homogenization process inside the pressurizer of an iPWR (integral PWR) by considering the mixing of water from this component with that coming from the reactor core. For this kind of reactor, the pressurizer is located at the top of the pressure vessel, making it necessary to identify the proper mechanisms to guarantee adequate homogenization of the water mixture. Once the installation of the experimental setup was concluded, its behavior was analyzed by considering the concentration of a tracer diluted in the circulation water, whose measurements were obtained at the pressurizer outlet orifices. Two experiments representing boration (boron concentration increase) and deboration (boron concentration decrease) scenarios were carried out. Samples were acquired every ten minutes over a total time of 180 minutes. Results showed that the combination of Fractional Scaling Analysis with the local Froude number constituted an appropriate methodology for providing the reduced-scale test facility parameters, inasmuch as the measured concentrations from the experiments reproduced the theoretical behavior with sufficient accuracy. (author)

  11. Processes influencing migration of bioavailable organic compounds from polymers - investigated during biotic and abiotic testing under static and non-static conditions with varying S/V-ratios

    DEFF Research Database (Denmark)

    Corfitzen, Charlotte B.; Arvin, Erik; Albrechtsen, Hans-Jørgen

    . The bioavailable migration from the polymer surface was influenced by diffusion over the solid-liquid boundary layer under sterile conditions, which resulted in an inversely proportional relationship between the bioavailable migration expressed per unit surface area of material and the surface-to-volume ratio (S/V-ratio...... the effect of the boundary layer, since the bioavailable migration was continuously consumed by the bacteria. Thus the driving force for the diffusion process was maintained at a maximum, thereby enhancing the bioavailable migration from the material surfaces. Thus neither non-static conditions nor varying S/V-ratios...

  12. Contribution to the problem of liquidity ratios

    OpenAIRE

    Dvoøáèek Jaroslav

    1997-01-01

    The article is based on the importance of financial analysis in the mining industry. The author examines the liquidity ratios given in the literature from the standpoint of their number, content, units and the recommended values of the individual ratios. For practical application, two liquidity ratios are suggested and the methodology for determining their recommended values is given.

  13. Contribution to the problem of liquidity ratios

    Directory of Open Access Journals (Sweden)

    Dvoøáèek Jaroslav

    1997-03-01

    Full Text Available The article is based on the importance of financial analysis in the mining industry. The author examines the liquidity ratios given in the literature from the standpoint of their number, content, units and the recommended values of the individual ratios. For practical application, two liquidity ratios are suggested and the methodology for determining their recommended values is given.

  14. Measuring liquidity on stock market: impact on liquidity ratio

    OpenAIRE

    Siniša Bogdan; Suzana Bareša; Saša Ivanović

    2012-01-01

    The purpose – It is important to emphasize that liquidity on the Croatian stock market is low; the purpose of this paper is to test empirically and find out which variables play a crucial role in the decision-making process of investing in stocks. Design – This paper explores the impact of various liquidity variables on the liquidity ratio, since it is a still insufficiently researched topic. Methodology – This research uses secondary and primary data available from the Croatian stock market. Considering pri...

  15. The relationship of normal body temperature, end-expired breath temperature, and BAC/BrAC ratio in 98 physically fit human test subjects.

    Science.gov (United States)

    Cowan, J Mack; Burris, James M; Hughes, James R; Cunningham, Margaret P

    2010-06-01

    The relationship between normal body temperature, end-expired breath temperature, and the blood alcohol concentration (BAC)/breath alcohol concentration (BrAC) ratio was studied in 98 subjects (84 men, 14 women). Subjects consumed alcohol sufficient to produce a BrAC of at least 0.06 g/210 L 45-75 min after drinking. Breath samples were analyzed using an Intoxilyzer 8000 specially equipped to measure breath temperature. Venous blood samples and body temperatures were then taken. The mean body temperature of the men (36.6 degrees C) was lower than that of the women (37.0 degrees C); however, their mean breath temperatures were virtually identical (men: 34.5 degrees C; women: 34.6 degrees C). The BAC exceeded the BrAC for every subject. BAC/BrAC ratios were calculated from the BAC and BrAC analytical results. There was no difference in the BAC/BrAC ratios for men (1:2379) and women (1:2385). The correlation between BAC and BrAC was high (r = 0.938), whereas the correlations between body temperature and end-expired breath temperature, between body temperature and BAC/BrAC ratio, and between breath temperature and BAC/BrAC ratio were much lower. Neither normal body temperature nor end-expired breath temperature was strongly associated with the BAC/BrAC ratio.
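
    The BAC/BrAC partition ratio reported above arises from the unit conventions (BAC in g/100 mL of blood, BrAC in g/210 L of breath); a minimal Python sketch of that calculation and of the BAC-BrAC correlation, using a few illustrative subjects rather than the study data.

```python
import numpy as np

# Illustrative paired measurements (not the study data)
bac  = np.array([0.085, 0.095, 0.102, 0.078])   # g/100 mL whole blood
brac = np.array([0.075, 0.084, 0.089, 0.069])   # g/210 L end-expired breath

# Convert both to g/mL so the blood:breath partition ratio is dimensionless:
# BAC per mL blood = bac / 100; BrAC per mL breath = brac / 210000 (210 L = 210000 mL)
partition_ratio = (bac / 100.0) / (brac / 210_000.0)   # equivalent to bac / brac * 2100
print("mean BAC/BrAC partition ratio  1:%.0f" % partition_ratio.mean())

# Correlation between BAC and BrAC across subjects
r = np.corrcoef(bac, brac)[0, 1]
print(f"Pearson r(BAC, BrAC) = {r:.3f}")
```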

  16. Munitions and Explosives of Concern Survey Methodology and In-field Testing for Wind Energy Areas on the Atlantic Outer Continental Shelf

    Science.gov (United States)

    DuVal, C.; Carton, G.; Trembanis, A. C.; Edwards, M.; Miller, J. K.

    2017-12-01

    Munitions and explosives of concern (MEC) are present in U.S. waters as a result of past and ongoing live-fire testing and training, combat operations, and sea disposal. To identify MEC that may pose a risk to human safety during development of offshore wind facilities on the Atlantic Outer Continental Shelf (OCS), the Bureau of Ocean Energy Management (BOEM) is preparing to develop guidance on risk analysis and on selection processes for methods and technologies to identify MEC in Wind Energy Areas (WEA). This study developed a process for selecting appropriate technologies and methodologies for MEC detection using a synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. Personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology to find and identify the placed targets. This in-field trial, conducted in July 2016, emphasized the use of multiple sensors for MEC detection, and led to further guidance for future MEC detection efforts on the Atlantic OCS. An April 2017 follow-on study determined the fate of the munitions surrogates after the Atlantic storm season had passed. Using regional hydrodynamic models and incorporating the recommendations from the 2016 field trial, the follow-on study examined the fate of the MEC and compared the findings to existing research on munitions mobility, as well as to models developed as part of the Office of Naval Research Mine-Burial Program. Focus was given to characterizing the influence of sediment type on surrogate munitions behavior and the influence of morphodynamics and object burial on MEC detection. Supporting the Mine-Burial models, ripple bedforms were observed to impede surrogate scour and burial in coarse sediments, while surrogate burial was both predicted and observed in finer sediments. Further, incorporation of

  17. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  18. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  19. High-Energy Physics Fault Tolerance Metrics and Testing Methodologies for SRAM-based FPGAs A case of study based on the Xilinx Triple Modular Redundancy (TMR) Subsystem

    CERN Document Server

    Canessa, Emanuele; Agnello, Michelangelo

    Field-Programmable Gate Arrays have become more and more attractive to the developers of mission-critical and safety-critical systems. Thanks to their reconfigurability, as well as their I/O capabilities, these devices are often employed as core logic in many different applications. On top of that, the use of soft microcontrollers can ease the complexity related to some of the control logic of these devices, allowing new features to be developed easily without having to redesign most of the control logic involved. However, for safety-critical and mission-critical applications like aerospace and high-energy physics, these devices require a further analysis of radiation effects. The main subject of this thesis, which has been developed in collaboration with the Conseil Européen pour la Recherche Nucléaire (CERN) A Large Ion Collider Experiment (ALICE) for the planned Inner Tracking System (ITS) Upgrade, is the fault tolerance metrics and the testing methodologies that can be applicable to sof...

  20. Pb-Isotopic Study of Galena by LA-Q-ICP-MS: Testing a New Methodology with Applications to Base-Metal Sulphide Deposits

    Directory of Open Access Journals (Sweden)

    Christopher R. M. McFarlane

    2016-09-01

    Full Text Available In situ laser ablation quadrupole inductively coupled plasma mass spectrometry was used to measure Pb isotopes in galena. Data acquisition was optimized by adjusting spot size, energy density, and ablation time to obtain near steady-state, low relative standard deviation (%RSD) signals. Standard-sample bracketing using in-house Broken Hill galena as the external reference standard was used, and offline data reduction was carried out using VizualAge for Iolite3. Using this methodology, galena grains in polished thin sections from selected massive sulphide deposits of the Bathurst Mining Camp, Canada, were analysed and compared to previously published data. Absolute values and errors on the weighted mean of ~20 individual analyses from each sample compared favourably with whole-rock Pb-Pb isotope data. This approach provides a means of obtaining rapid, accurate, and moderately precise (0.1%, 2σ) Pb isotope measurements in galena and is particularly well suited for exploratory or reconnaissance studies. Further refinement of this approach may be useful in exploration for volcanogenic massive sulphide deposits and might be a useful vectoring tool when complemented with other conventional exploration techniques.

  1. Modeling Nearly Spherical Pure-bulge Galaxies with a Stellar Mass-to-light Ratio Gradient under the ΛCDM and MOND Paradigms. I. Methodology, Dynamical Stellar Mass, and Fundamental Mass Plane

    Science.gov (United States)

    Chae, Kyu-Hyun; Bernardi, Mariangela; Sheth, Ravi K.

    2018-06-01

    We carry out spherical Jeans modeling of nearly round pure-bulge galaxies selected from the ATLAS3D sample. Our modeling allows for gradients in the stellar mass-to-light ratio (M⋆/L) through analytic prescriptions parameterized with a “gradient strength” K introduced to accommodate any viable gradient. We use a generalized Osipkov–Merritt model for the velocity dispersion (VD) anisotropy. We produce Monte Carlo sets of models based on the stellar VD profiles under both the ΛCDM and MOND paradigms. Here, we describe the galaxy data, the empirical inputs, and the modeling procedures used to obtain the Monte Carlo sets. We then present the projected dynamical stellar mass, M⋆e, within the effective radius Re, and the fundamental mass plane (FMP) as a function of K. We find the scaling of the K-dependent mass with respect to the ATLAS3D reported mass to be log10[M⋆e(K)/M⋆e(A3D)] = a′ + b′K, with a′ = −0.019 ± 0.012 and b′ = −0.18 ± 0.02 (ΛCDM), or a′ = −0.023 ± 0.014 and b′ = −0.23 ± 0.03 (MOND), for 0 ≤ K. The FMP is consistent with expectation, and only its zero-point scales with K. The median value of K for the ATLAS3D galaxies is 0.53 (+0.05/−0.04). We perform a similar analysis of the much larger SDSS DR7 spectroscopic sample. In this case, only the VD within a single aperture is available, so we impose the additional requirement that the VD slope be similar to that in the ATLAS3D galaxies. Our analysis of the SDSS galaxies suggests a positive correlation of K with stellar mass.
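
    The quoted mass scaling is a simple linear relation in log space and can be evaluated directly; a minimal Python sketch using the ΛCDM coefficients given in the record (the choice of K values is illustrative).

```python
import numpy as np

def mass_ratio(K, a=-0.019, b=-0.18):
    """log10[M*e(K) / M*e(ATLAS3D)] = a + b*K, using the LambdaCDM coefficients above."""
    return 10.0 ** (a + b * np.asarray(K, dtype=float))

for K in (0.0, 0.53, 1.0):   # 0.53 is the median K quoted for the ATLAS3D galaxies
    print(f"K = {K:.2f}:  M*e(K)/M*e(A3D) = {mass_ratio(K):.3f}")
```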

  2. A Single Conversation with a Wise Man Is Better than Ten Years of Study: A Model for Testing Methodologies for Pedagogy or Andragogy

    Science.gov (United States)

    Taylor, Bryan; Kroth, Michael

    2009-01-01

    This article creates the Teaching Methodology Instrument (TMI) to help determine the level of adult learning principles being used by a particular teaching methodology in a classroom. The instrument incorporates the principles and assumptions set forth by Malcolm Knowles of what makes a good adult learning environment. The Socratic method as used…

  3. Ratio of ovarian stroma and total ovarian area by ultrasound in prediction of hyperandrogenemia in reproductive-aged Thai women with polycystic ovary syndrome: a diagnostic test.

    Science.gov (United States)

    Leerasiri, Pichai; Wongwananuruk, Thanyarat; Rattanachaiyanont, Manee; Indhavivadhana, Suchada; Techatraisak, Kitirat; Angsuwathana, Surasak

    2015-02-01

    To evaluate the performance of the ovarian stromal area to total ovarian area (S/A) ratio for the prediction of biochemical hyperandrogenism in Thai women with polycystic ovary syndrome (PCOS). A cross-sectional study was performed in 222 reproductive-aged Thai women with PCOS attending the Gynecologic Endocrinology Unit (GEU), Department of Obstetrics and Gynecology, Faculty of Medicine Siriraj Hospital, from May 2007 to January 2009. The patients were interviewed for medical history and examined for anthropometry and clinical hyperandrogenism. Venous blood samples were obtained for androgen profiles. An ovarian ultrasonogram was obtained via transvaginal or transrectal ultrasonography. The prevalences of clinical and biochemical hyperandrogenism were 48.6% and 81.1%, respectively. The S/A ratio at a cut-off point of 0.33 had modest predictability for hyperandrogenism, namely an area under the receiver-operating characteristic curve of 0.537, 36.6% sensitivity, 72.1% specificity, 83.8% positive predictive value (PPV) and 20.9% negative predictive value (NPV). The combination of clinical hyperandrogenism and the S/A ratio improved the predictability for biochemical hyperandrogenism, with sensitivity, specificity, PPV and NPV of 72.1%, 58.1%, 87.8% and 33.3%, respectively. The S/A ratio alone is not a good predictor of biochemical hyperandrogenism in Thai PCOS women attending the GEU for menstrual dysfunction. The combination of the S/A ratio and clinical hyperandrogenism performs better than the S/A ratio alone in predicting biochemical hyperandrogenism. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.
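
    The reported sensitivity, specificity, PPV and NPV all follow from a 2x2 table of test result (S/A ratio at the 0.33 cut-off) against biochemical hyperandrogenism; a minimal Python sketch with illustrative counts (the cell counts below are not taken from the paper).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 diagnostic table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Illustrative counts: "test positive" = S/A ratio >= 0.33, "disease" = biochemical hyperandrogenism
tp, fp, fn, tn = 66, 12, 114, 30
sens, spec, ppv, npv = diagnostic_metrics(tp, fp, fn, tn)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}, PPV={ppv:.1%}, NPV={npv:.1%}")
```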

  4. A computational methodology for a micro launcher engine test bench using a combined linear static and dynamic in frequency response analysis

    Directory of Open Access Journals (Sweden)

    Ion DIMA

    2017-03-01

    Full Text Available This article aims to provide a quick methodology to determine the critical values of the forces, displacements and stresses as a function of frequency, under a combined linear static (101 Solution – Linear Static) and dynamic load in frequency response (108 Solution – Frequency Response, Direct Method), applied to a micro launcher engine test bench, using NASTRAN 400 Solution – Implicit Nonlinear. NASTRAN/PATRAN software is used. In practice, in the PATRAN preprocessor a linear or nonlinear static load has to be defined at step 1 and a dynamic frequency-response (time-dependent) load at step 2. In Analysis the following options are chosen: for Solution Type, Implicit Nonlinear Solution (SOL 400) is selected; for Subcases, Static Load and Transient Dynamic are chosen; and in Subcase Select the two cases, static and dynamic, are selected. The NASTRAN solver will combine the results from the static analysis with those of the dynamic analysis. The running time is reduced by a factor of three when the Krylov solver is used. The NASTRAN SYSTEM (387) = -1 instruction is used in order to activate the Krylov option. Also, in Analysis the OP2 Output Format shall be selected, meaning that in the bdf NASTRAN input file the PARAM POST 1 instruction shall be written. The structural damping can be defined in two different ways: either on the material card or using the PARAM, G, 0.05 instruction (in this example a damping coefficient of 5% was used). The SDAMPING instruction, in pair with TABDMP1, works only for dynamic frequency response with the modal method, or with the direct method with viscoelastic material, not for dynamic frequency response with the direct method (DFREQ) and linear elastic material. The direct method (DFREQ) used in this example is more accurate. A set of translational boundary conditions was defined at the base of the test bench.

  5. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for calculating the absorbed doses in different tissues is explained
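
    In the MIRD scheme, the mean absorbed dose to a target organ is the cumulated activity in each source organ multiplied by the corresponding S value (absorbed dose per unit cumulated activity), summed over source organs. A minimal Python sketch of that calculation; the organ names, cumulated activities and S values below are placeholders, not values from the MIRD pamphlets.

```python
def mird_absorbed_dose(cumulated_activity, s_values):
    """D(target) = sum over source organs of A_tilde(source) * S(target <- source)."""
    return sum(cumulated_activity[source] * s for source, s in s_values.items())

# Hypothetical example: dose to the liver from activity residing in the liver and kidneys
a_tilde = {"liver": 3.0e5, "kidneys": 1.2e5}       # cumulated activity, MBq*s (placeholder)
s_to_liver = {"liver": 2.0e-5, "kidneys": 1.5e-6}  # S values, mGy per MBq*s (placeholder)
print(f"D(liver) = {mird_absorbed_dose(a_tilde, s_to_liver):.2f} mGy")
```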

  6. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  7. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  8. Combination of the ionic-to-atomic line intensity ratios from two test elements for the diagnostic of plasma temperature and electron number density in Inductively Coupled Plasma Atomic Emission Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Tognoni, E. [Istituto per i Processi Chimico-Fisici, Area della Ricerca del Consiglio Nazionale delle Ricerche Via Moruzzi 1, 56124 Pisa (Italy)], E-mail: tognoni@ipcf.cnr.it; Hidalgo, M.; Canals, A. [Departamento de Quimica Analitica, Nutricion y Bromatologia. Universidad de Alicante. Apdo. 99, 03080, Alicante (Spain); Cristoforetti, G.; Legnaioli, S.; Salvetti, A.; Palleschi, V. [Istituto per i Processi Chimico-Fisici, Area della Ricerca del Consiglio Nazionale delle Ricerche Via Moruzzi 1, 56124 Pisa (Italy)

    2007-05-15

    In Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) spectrochemical analysis, the MgII(280.270 nm)/MgI(285.213 nm) ionic-to-atomic line intensity ratio is commonly used as a monitor of the robustness of the operating conditions. This approach is based on the univocal relationship existing between the intensity ratio and the plasma temperature for a pure argon atmospheric ICP in thermodynamic equilibrium. In a multi-elemental plasma in the lower temperature range, the measurement of the intensity ratio may not be sufficient to characterize temperature and electron density. In such a range, the correct relationship between intensity ratio and plasma temperature can be calculated only when the complete plasma composition is known. We propose the combination of the line intensity ratios of two test elements (double ratio) as an effective diagnostic tool for a multi-elemental low-temperature LTE plasma of unknown composition. In particular, the variation of the double ratio allows us to discriminate changes in the plasma temperature from changes in the electron density. Thus, the effects on plasma excitation and ionization possibly caused by the introduction of different samples and matrices under non-robust conditions can be more accurately interpreted. The method is illustrated by the measurement of plasma temperature and electron density in a specific analytical case.
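
    The "double ratio" itself is simply the ionic-to-atomic intensity ratio of one test element divided by that of the second; a minimal Python sketch on illustrative intensities (the second element and all numbers are hypothetical).

```python
def ionic_to_atomic_ratio(i_ion, i_atom):
    """Ionic-to-atomic line intensity ratio for one element."""
    return i_ion / i_atom

# Illustrative intensities in arbitrary units, not measured ICP-OES data
mg_ratio = ionic_to_atomic_ratio(i_ion=5.2e4, i_atom=1.3e4)   # e.g. Mg II 280.270 nm / Mg I 285.213 nm
x_ratio  = ionic_to_atomic_ratio(i_ion=2.4e4, i_atom=3.0e4)   # second test element

double_ratio = mg_ratio / x_ratio
print(f"Mg ratio = {mg_ratio:.2f}, second-element ratio = {x_ratio:.2f}, double ratio = {double_ratio:.2f}")
```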

  9. Methodology for the analysis of external flooding in CN Asco-II and CN Vandellos during the performance of stress tests

    International Nuclear Information System (INIS)

    Aleman, A.; Cobas, I.; Sabater, J.; Canadell, F.; Garces, L.; Otero, M.

    2012-01-01

    The work carried out in relation to external flooding has been synthesized into a single methodology covering the entire process of establishing margins against external flooding, including the identification of the external events that could cause flooding.

  10. The three stages of building and testing mid-level theories in a realist RCT: a theoretical and methodological case-example.

    Science.gov (United States)

    Jamal, Farah; Fletcher, Adam; Shackleton, Nichola; Elbourne, Diana; Viner, Russell; Bonell, Chris

    2015-10-15

    Randomised controlled trials (RCTs) of social interventions are often criticised as failing to open the 'black box', whereby they only address questions about 'what works' without explaining the underlying processes of implementation and mechanisms of action, and how these vary by contextual characteristics of person and place. Realist RCTs are proposed as an approach to evaluation science that addresses these gaps while preserving the strengths of RCTs in providing evidence with strong internal validity in estimating effects. In the context of growing interest in designing and conducting realist trials, there is an urgent need to offer a worked example to provide guidance on how such an approach might be practically taken forward. The aim of this paper is to outline a three-staged theoretical and methodological process of undertaking a realist RCT using the example of the evaluation of a whole-school restorative intervention aiming to reduce aggression and bullying in English secondary schools. First, informed by the findings of our initial pilot trial and sociological theory, we elaborate our theory of change and specify a priori hypotheses about how intervention mechanisms interact with context to produce outcomes. Second, we describe how we will use emerging findings from the integral process evaluation within the RCT to refine, and add to, these a priori hypotheses before the collection of quantitative, follow-up data. Third, we will test our hypotheses using a combination of process and outcome data via quantitative analyses of effect mediation (examining mechanisms) and moderation (examining contextual contingencies). The results are then used to refine and further develop the theory of change. The aim of the realist RCT approach is thus not merely to assess whether the intervention is effective or not, but to develop empirically informed mid-range theory through a three-stage process. There are important implications for those involved with reporting and

  11. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

    Full Text Available The Sharpe ratio is the most prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n−3/2) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
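
    For context, a minimal Python sketch of the one-sample problem: the sample Sharpe ratio and the conventional normal-approximation test that the likelihood-based method in the record is designed to improve upon. The returns are simulated and the asymptotic standard error assumes i.i.d. normal returns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = rng.normal(0.01, 0.05, size=250)      # illustrative excess returns, one per period

n = returns.size
sr = returns.mean() / returns.std(ddof=1)       # sample Sharpe ratio (per period)

# First-order asymptotic standard error under i.i.d. normal returns
se = np.sqrt((1.0 + 0.5 * sr**2) / n)

z = sr / se                                     # test of H0: Sharpe ratio = 0
p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
print(f"SR = {sr:.3f}, SE = {se:.3f}, z = {z:.2f}, p = {p:.4f}")
```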

  12. Brain GABA Detection in vivo with the J-editing 1H MRS Technique: A Comprehensive Methodological Evaluation of Sensitivity Enhancement, Macromolecule Contamination and Test-Retest Reliability

    Science.gov (United States)

    Shungu, Dikoma C.; Mao, Xiangling; Gonzales, Robyn; Soones, Tacara N.; Dyke, Jonathan P.; van der Veen, Jan Willem; Kegeles, Lawrence S.

    2016-01-01

    Abnormalities in brain γ-aminobutyric acid (GABA) have been implicated in various neuropsychiatric and neurological disorders. However, in vivo GABA detection by proton magnetic resonance spectroscopy (1H MRS) presents significant challenges arising from low brain concentration, overlap by much stronger resonances, and contamination by mobile macromolecule (MM) signals. This study addresses these impediments to reliable brain GABA detection with the J-editing difference technique on a 3T MR system in healthy human subjects by (a) assessing the sensitivity gains attainable with an 8-channel phased-array head coil, (b) determining the magnitude and anatomic variation of the contamination of GABA by MM, and (c) estimating the test-retest reliability of measuring GABA with this method. Sensitivity gains and test-retest reliability were examined in the dorsolateral prefrontal cortex (DLPFC), while MM levels were compared across three cortical regions: the DLPFC, the medial prefrontal cortex (MPFC) and the occipital cortex (OCC). A 3-fold higher GABA detection sensitivity was attained with the 8-channel head coil compared to the standard single-channel head coil in the DLPFC. Despite significant anatomic variation in GABA+MM and MM across the three brain regions, the MM fraction of GABA+MM was relatively stable across the three voxels, ranging from 41% to 49%, a non-significant regional variation (p = 0.58). The test-retest reliability of the GABA measurement, expressed either as a ratio to voxel tissue water (W) or to total creatine, was found to be very high for both the single-channel coil and the 8-channel phased-array coil. For the 8-channel coil, for example, Pearson's correlation coefficient of test vs. retest for GABA/W was 0.98 (R2 = 0.96, p = 0.0007), the percent coefficient of variation (CV) was 1.25%, and the intraclass correlation coefficient (ICC) was 0.98. Similar reliability was also found for the co-edited resonance of combined glutamate and glutamine (Glx) for both coils.
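
    The reliability statistics quoted (Pearson's r, percent CV, ICC) can be computed from paired test/retest measurements; a minimal Python sketch on illustrative values, using a one-way random-effects ICC(1,1) since the record does not specify which ICC model was used.

```python
import numpy as np

def test_retest_stats(test, retest):
    """Pearson r, mean within-subject CV%, and one-way random-effects ICC(1,1)."""
    x = np.column_stack([test, retest]).astype(float)
    n, k = x.shape

    r = np.corrcoef(x[:, 0], x[:, 1])[0, 1]

    subj_mean = x.mean(axis=1)
    cv_percent = np.mean(x.std(axis=1, ddof=1) / subj_mean) * 100.0

    grand_mean = x.mean()
    ms_between = k * np.sum((subj_mean - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((x - subj_mean[:, None]) ** 2) / (n * (k - 1))
    icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    return r, cv_percent, icc

# Illustrative GABA/W values for six subjects (institutional units, not the study data)
test_vals   = [1.52, 1.60, 1.47, 1.55, 1.63, 1.50]
retest_vals = [1.54, 1.58, 1.49, 1.56, 1.61, 1.52]
r, cv, icc = test_retest_stats(test_vals, retest_vals)
print(f"r = {r:.3f}, CV = {cv:.2f}%, ICC = {icc:.3f}")
```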

  13. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    Science.gov (United States)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.

  14. Update of the event tree for total loss of power at the site, SBO, taking into account the results of stress tests and methodological updates, convolution, and hydraulic power recovery from the RCP model

    International Nuclear Information System (INIS)

    Lopez Lorenzo, M. A.; Perez Martin, F.

    2013-01-01

    This paper describes an event tree for a total loss of power at the site (SBO) accident, considering, first, the results of the stress tests arising from the Fukushima accident and, second, various methodological updates related to this initiating event.

  15. Transformer ratio enhancement experiment

    International Nuclear Information System (INIS)

    Gai, W.; Power, J. G.; Kanareykin, A.; Neasheva, E.; Altmark, A.

    2004-01-01

    Recently, a multibunch scheme for efficient acceleration based on dielectric wakefield accelerator technology was outlined in J.G. Power, W. Gai, A. Kanareykin, X. Sun, PAC 2001 Proceedings, pp. 114-116, 2002. In this paper we present an experimental program for the design, development and demonstration of an Enhanced Transformer Ratio Dielectric Wakefield Accelerator (ETR-DWA). The principal goal is to increase the transformer ratio R, the parameter that characterizes the energy transfer efficiency from the accelerating structure to the accelerated electron beam. We present here the experimental design of a 13.625 GHz dielectric-loaded accelerating structure, a laser multisplitter producing a ramped bunch train, and simulations of the bunch train parameters required. Experimental results of the accelerating structure bench testing and of ramped pulse train generation with the laser multisplitter are shown as well. Using beam dynamics simulations, we also obtain the focusing FODO lattice parameters

  16. Low concentration ratio solar array for low Earth orbit multi-100 kW application. Volume 1: Design, analysis and development tests

    Science.gov (United States)

    1983-01-01

    A preliminary design effort directed toward a low concentration ratio photovoltaic array system capable of delivering multihundred kilowatts (300 kW to 1000 kW range) in low Earth orbit is described. The array system consists of two or more array modules, each capable of delivering between 113 kW and 175 kW using silicon solar cells or gallium arsenide solar cells, respectively. The array module deployed area is 1320 square meters and consists of 4356 pyramidal concentrator elements. The module, when stowed in the Space Shuttle's payload bay, has a stowage volume of a cube 3.24 meters on a side. The concentrator elements are sized for a geometric concentration ratio (GCR) of six with an aperture area of 0.25 sq. m. The structural analysis and design trades leading to the baseline design are discussed. The configuration is described, as well as the optical, thermal and electrical performance analyses that support the design, and overall performance estimates for the array are given.

  17. Electromagnetic attenuation of eight earthquakes registered in Mexico using FFT-based spectrum and t-test statistical analysis for ULF Q-R ratios signals

    Directory of Open Access Journals (Sweden)

    Omar Chavez

    2016-07-01

    Full Text Available A method to improve the detection of seismo-magnetic signals is presented herein. Eight events registered for periods of 24 hours with seismic activity were analyzed and compared with non-seismic periods of the same duration. The distance between the earthquakes (EQs) and the ultra-low frequency detector is ρ = (1.8)10^(0.45M), where M is the magnitude of the EQ reported by the Seismological National Service of Mexico, over a period of three years. An improved fast Fourier transform analysis in the form of the ratio of the vertical magnetic field component to the horizontal one (Q = Bz/Bx) has been developed. There are important differences between the frequencies obtained during the days of seismic activity compared with those with no seismic activity.
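
    A minimal Python sketch of the Q = Bz/Bx spectral-ratio analysis described above, computed on synthetic signals with numpy's FFT; the sampling rate, frequency band and signals are illustrative, not the records from the ULF detector.

```python
import numpy as np

fs = 1.0                                   # sampling rate in Hz (illustrative)
t = np.arange(0, 24 * 3600, 1.0 / fs)      # one 24-hour registration period

rng = np.random.default_rng(0)
bz = 0.5 * np.sin(2 * np.pi * 0.01 * t) + rng.normal(0, 0.1, t.size)   # synthetic vertical component
bx = 1.0 * np.sin(2 * np.pi * 0.01 * t) + rng.normal(0, 0.1, t.size)   # synthetic horizontal component

freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
q_ratio = np.abs(np.fft.rfft(bz)) / np.abs(np.fft.rfft(bx))            # Q = Bz/Bx per frequency bin

band = (freqs > 0.005) & (freqs < 0.05)    # illustrative ULF band of interest
print(f"mean Q in band: {q_ratio[band].mean():.3f}")
```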

  18. New decision criteria for selecting delta check methods based on the ratio of the delta difference to the width of the reference range can be generally applicable for each clinical chemistry test item.

    Science.gov (United States)

    Park, Sang Hyuk; Kim, So-Young; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2012-09-01

    Many laboratories use 4 delta check methods: delta difference, delta percent change, rate difference, and rate percent change. However, guidelines regarding decision criteria for selecting delta check methods have not yet been provided. We present new decision criteria for selecting delta check methods for each clinical chemistry test item. We collected 811,920 and 669,750 paired (present and previous) test results for 27 clinical chemistry test items from inpatients and outpatients, respectively. We devised new decision criteria for the selection of delta check methods based on the ratio of the delta difference to the width of the reference range (DD/RR). Delta check methods based on these criteria were compared with those based on the CV% of the absolute delta difference (ADD) as well as those reported in 2 previous studies. The delta check methods suggested by new decision criteria based on the DD/RR ratio corresponded well with those based on the CV% of the ADD except for only 2 items each in inpatients and outpatients. Delta check methods based on the DD/RR ratio also corresponded with those suggested in the 2 previous studies, except for 1 and 7 items in inpatients and outpatients, respectively. The DD/RR method appears to yield more feasible and intuitive selection criteria and can easily explain changes in the results by reflecting both the biological variation of the test item and the clinical characteristics of patients in each laboratory. We suggest this as a measure to determine delta check methods.
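
    A minimal Python sketch of the four delta check quantities and the DD/RR criterion described above; the formulas follow their usual textbook definitions and the potassium example is illustrative, so details of the authors' implementation may differ.

```python
def delta_checks(previous, current, hours_between, ref_low, ref_high):
    """Four common delta check quantities plus the DD/RR ratio (delta difference
    over the width of the reference range)."""
    delta_difference = current - previous
    delta_percent_change = 100.0 * delta_difference / previous
    rate_difference = delta_difference / hours_between
    rate_percent_change = delta_percent_change / hours_between
    dd_rr = abs(delta_difference) / (ref_high - ref_low)
    return {
        "delta difference": delta_difference,
        "delta percent change": delta_percent_change,
        "rate difference": rate_difference,
        "rate percent change": rate_percent_change,
        "DD/RR": dd_rr,
    }

# Illustrative: serum potassium rising from 4.1 to 5.0 mmol/L over 24 h, reference range 3.5-5.1 mmol/L
print(delta_checks(previous=4.1, current=5.0, hours_between=24.0, ref_low=3.5, ref_high=5.1))
```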

  19. Behind the Battle Lines of History as Politics: An International and Intergenerational Methodology for Testing the Social Identity Thesis of History Education

    Science.gov (United States)

    Taylor, Tony; Collins, Sue

    2012-01-01

    This article critiques popular assumptions that underlie the ongoing politicisation of school history curriculum as an agent of social identity and behaviour. It raises some key research questions which need further investigation and suggests a potential methodology for establishing evidence-based understanding of the relationship between history…

  20. Multi-Population Invariance with Dichotomous Measures: Combining Multi-Group and MIMIC Methodologies in Evaluating the General Aptitude Test in the Arabic Language

    Science.gov (United States)

    Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.

    2015-01-01

    The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…

  1. Practicability of patient self-testing of oral anticoagulant therapy by the international normalized ratio (INR) using a portable whole blood monitor. A pilot investigation.

    Science.gov (United States)

    Hasenkam, J M; Knudsen, L; Kimose, H H; Grønnesby, H; Attermann, J; Andersen, N T; Pilegaard, H K

    1997-01-01

    The prophylactic efficacy of long-term oral anticoagulant treatment (OAT) has been demonstrated in a number of clinical conditions with an increased tendency to thromboembolism, and the number of individuals subjected to OAT in the industrialised world has increased substantially in recent years. Since this therapy requires considerable resources from both the health care system and the patients, the feasibility of patients' self-monitoring and self-management of OAT has been investigated (1,2,3). The anticipated advantages of this approach include improved convenience and compliance for the patient, who may gain a better grasp of managing the treatment. In addition, self-testing allows for more frequent control compared to the conventional out-patient approach. Importantly, a prerequisite for conceiving a safe and operational concept for patient self-management (PSM) is the availability of a portable INR monitoring system with an accuracy, precision, reproducibility, and long-term reliability comparable to standard coagulometric equipment. The purpose of the present study was to evaluate the feasibility of a commercially available INR monitor, CoaguChek, for patient self-testing, through a step-wise investigation of the performance characteristics of the equipment in the laboratory, in the hands of the patient, and during self-testing and self-adjustment of treatment at home. Laboratory INR values were used as reference.
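
    Portable monitors of this kind report the international normalized ratio, which is derived from the measured prothrombin time; a minimal Python sketch of the standard INR formula, with placeholder reagent parameters (these are not CoaguChek values).

```python
def inr(patient_pt_seconds, mean_normal_pt_seconds, isi):
    """INR = (patient PT / mean normal PT) ** ISI (standard WHO definition).

    mean_normal_pt_seconds and isi characterize the thromboplastin reagent;
    the values used below are placeholders."""
    return (patient_pt_seconds / mean_normal_pt_seconds) ** isi

print(f"INR = {inr(patient_pt_seconds=28.0, mean_normal_pt_seconds=12.5, isi=1.0):.2f}")
```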

  2. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  3. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  4. Loss-of-Use Damages From U.S. Nuclear Testing in the Marshall Islands: Technical Analysis of the Nuclear Claims Tribunal’s Methodology and Alternative Estimates

    Science.gov (United States)

    2005-08-12

    productivity of the islands in producing copra or fish, was not considered. The assumption is also inconsistent with the capitalization model that the value of...David Barker and Jay Wa-Aadu, “Is Real Estate Becoming Important Again? A Neo Ricardian Model of Land Rent.” Real Estate Economics, Spring, 2004, pp...the model explicit, it avoids shortcomings of the NCT methodology, by using available data from RMI’s national income and product accounts that is

  5. Quantitative testing of the methodology for genome size estimation in plants using flow cytometry: a case study of the Primulina genus

    Directory of Open Access Journals (Sweden)

    Jing eWang

    2015-05-01

    Full Text Available Flow cytometry (FCM is a commonly used method for estimating genome size in many organisms. The use of flow cytometry in plants is influenced by endogenous fluorescence inhibitors and may cause an inaccurate estimation of genome size; thus, falsifying the relationship between genome size and phenotypic traits/ecological performance. Quantitative optimization of FCM methodology minimizes such errors, yet there are few studies detailing this methodology. We selected the genus Primulina, one of the most representative and diverse genera of the Old World Gesneriaceae, to evaluate the methodology effect on determining genome size. Our results showed that buffer choice significantly affected genome size estimation in six out of the eight species examined and altered the 2C-value (DNA content by as much as 21.4%. The staining duration and propidium iodide (PI concentration slightly affected the 2C-value. Our experiments showed better histogram quality when the samples were stained for 40 minutes at a PI concentration of 100 µg ml-1. The quality of the estimates was not improved by one-day incubation in the dark at 4 °C or by centrifugation. Thus, our study determined an optimum protocol for genome size measurement in Primulina: LB01 buffer supplemented with 100 µg ml-1 PI and stained for 40 minutes. This protocol also demonstrated a high universality in other Gesneriaceae genera. We report the genome size of nine Gesneriaceae species for the first time. The results showed substantial genome size variation both within and among the species, with the 2C-value ranging between 1.62 and 2.71 pg. Our study highlights the necessity of optimizing the FCM methodology prior to obtaining reliable genome size estimates in a given taxon.

  6. Agreement between clinicians' and care givers' assessment of intelligence in Nigerian children with intellectual disability: 'ratio IQ' as a viable option in the absence of standardized 'deviance IQ' tests in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Aguocha Chinyere M

    2009-09-01

    showed a higher correlation between clinicians' assessed IQ scores and 'ratio IQ' scores (r = 0.75, df = 41, p = 0.000). Conclusion Agreement between clinicians' assessed IQ scores and 'ratio IQ' scores was good. The 'ratio IQ' test would provide a viable option for assessing IQ scores in sub-Saharan African children with intellectual disability in the absence of culture-appropriate standardized intelligence scales, which is often the case because of the great diversity in socio-cultural structures of sub-Saharan Africa.

  7. Agreement between clinicians' and care givers' assessment of intelligence in Nigerian children with intellectual disability: 'ratio IQ' as a viable option in the absence of standardized 'deviance IQ' tests in sub-Saharan Africa.

    Science.gov (United States)

    Bakare, Muideen O; Ubochi, Vincent N; Okoroikpa, Ifeoma N; Aguocha, Chinyere M; Ebigbo, Peter O

    2009-09-15

    ' assessed IQ scores and 'ratio IQ' scores (r = 0.75, df = 41, p = 0.000). Agreement between clinicians' assessed IQ scores and 'ratio IQ' scores was good. 'Ratio IQ' test would provide a viable option of assessing IQ scores in sub-Saharan African children with intellectual disability in the absence of culture-appropriate standardized intelligence scales, which is often the case because of great diversity in socio-cultural structures of sub-Saharan Africa.

  8. Safety assessment methodologies for near surface disposal facilities. Results of a co-ordinated research project (ISAM). Volume 1: Review and enhancement of safety assessment approaches and tools. Volume 2: Test cases

    International Nuclear Information System (INIS)

    2004-07-01

    For several decades, countries have made use of near surface facilities for the disposal of low and intermediate level radioactive waste. In line with the internationally agreed principles of radioactive waste management, the safety of these facilities needs to be ensured during all stages of their lifetimes, including the post-closure period. By the mid 1990s, formal methodologies for evaluating the long term safety of such facilities had been developed, but intercomparison of these methodologies had revealed a number of discrepancies between them. Consequently, in 1997, the International Atomic Energy Agency launched a Co-ordinated Research Project (CRP) on Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities (ISAM). The particular objectives of the CRP were to provide a critical evaluation of the approaches and tools used in post-closure safety assessment for proposed and existing near-surface radioactive waste disposal facilities, enhance the approaches and tools used and build confidence in the approaches and tools used. The CRP ran until 2000 and resulted in the development of a harmonized assessment methodology (the ISAM project methodology), which was applied to a number of test cases. Over seventy participants from twenty-two Member States played an active role in the project and it attracted interest from around seven hundred persons involved with safety assessment in seventy-two Member States. The results of the CRP have contributed to the Action Plan on the Safety of Radioactive Waste Management which was approved by the Board of Governors and endorsed by the General Conference in September 2001. Specifically, they contribute to Action 5, which requests the IAEA Secretariat to 'develop a structured and systematic programme to ensure adequate application of the Agency's waste safety standards', by elaborating on the Safety Requirements on 'Near Surface Disposal of Radioactive Waste' (Safety Standards Series No. WS-R-1) and

  9. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  10. Tau hadronic branching ratios

    CERN Document Server

    Buskulic, Damir; De Bonis, I; Décamp, D; Ghez, P; Goy, C; Lees, J P; Lucotte, A; Minard, M N; Odier, P; Pietrzyk, B; Ariztizabal, F; Chmeissani, M; Crespo, J M; Efthymiopoulos, I; Fernández, E; Fernández-Bosman, M; Gaitan, V; Martínez, M; Orteu, S; Pacheco, A; Padilla, C; Palla, Fabrizio; Pascual, A; Perlas, J A; Sánchez, F; Teubert, F; Colaleo, A; Creanza, D; De Palma, M; Farilla, A; Gelao, G; Girone, M; Iaselli, Giuseppe; Maggi, G; Maggi, M; Marinelli, N; Natali, S; Nuzzo, S; Ranieri, A; Raso, G; Romano, F; Ruggieri, F; Selvaggi, G; Silvestris, L; Tempesta, P; Zito, G; Huang, X; Lin, J; Ouyang, Q; Wang, T; Xie, Y; Xu, R; Xue, S; Zhang, J; Zhang, L; Zhao, W; Bonvicini, G; Cattaneo, M; Comas, P; Coyle, P; Drevermann, H; Engelhardt, A; Forty, Roger W; Frank, M; Hagelberg, R; Harvey, J; Jacobsen, R; Janot, P; Jost, B; Kneringer, E; Knobloch, J; Lehraus, Ivan; Markou, C; Martin, E B; Mato, P; Minten, Adolf G; Miquel, R; Oest, T; Palazzi, P; Pater, J R; Pusztaszeri, J F; Ranjard, F; Rensing, P E; Rolandi, Luigi; Schlatter, W D; Schmelling, M; Schneider, O; Tejessy, W; Tomalin, I R; Venturi, A; Wachsmuth, H W; Wiedenmann, W; Wildish, T; Witzeling, W; Wotschack, J; Ajaltouni, Ziad J; Bardadin-Otwinowska, Maria; Barrès, A; Boyer, C; Falvard, A; Gay, P; Guicheney, C; Henrard, P; Jousset, J; Michel, B; Monteil, S; Pallin, D; Perret, P; Podlyski, F; Proriol, J; Rossignol, J M; Saadi, F; Fearnley, Tom; Hansen, J B; Hansen, J D; Hansen, J R; Hansen, P H; Nilsson, B S; Kyriakis, A; Simopoulou, Errietta; Siotis, I; Vayaki, Anna; Zachariadou, K; Blondel, A; Bonneaud, G R; Brient, J C; Bourdon, P; Passalacqua, L; Rougé, A; Rumpf, M; Tanaka, R; Valassi, Andrea; Verderi, M; Videau, H L; Candlin, D J; Parsons, M I; Focardi, E; Parrini, G; Corden, M; Delfino, M C; Georgiopoulos, C H; Jaffe, D E; Antonelli, A; Bencivenni, G; Bologna, G; Bossi, F; Campana, P; Capon, G; Chiarella, V; Felici, G; Laurelli, P; Mannocchi, G; Murtas, F; Murtas, G P; Pepé-Altarelli, M; Dorris, S J; Halley, A W; ten Have, I; Knowles, I G; Lynch, J G; Morton, W T; O'Shea, V; Raine, C; Reeves, P; Scarr, J M; Smith, K; Smith, M G; Thompson, A S; Thomson, F; Thorn, S; Turnbull, R M; Becker, U; Braun, O; Geweniger, C; Graefe, G; Hanke, P; Hepp, V; Kluge, E E; Putzer, A; Rensch, B; Schmidt, M; Sommer, J; Stenzel, H; Tittel, K; Werner, S; Wunsch, M; Beuselinck, R; Binnie, David M; Cameron, W; Colling, D J; Dornan, Peter J; Konstantinidis, N P; Moneta, L; Moutoussi, A; Nash, J; San Martin, G; Sedgbeer, J K; Stacey, A M; Dissertori, G; Girtler, P; Kuhn, D; Rudolph, G; Bowdery, C K; Brodbeck, T J; Colrain, P; Crawford, G; Finch, A J; Foster, F; Hughes, G; Sloan, Terence; Whelan, E P; Williams, M I; Galla, A; Greene, A M; Kleinknecht, K; Quast, G; Raab, J; Renk, B; Sander, H G; Wanke, R; Van Gemmeren, P; Zeitnitz, C; Aubert, Jean-Jacques; Bencheikh, A M; Benchouk, C; Bonissent, A; Bujosa, G; Calvet, D; Carr, J; Diaconu, C A; Etienne, F; Thulasidas, M; Nicod, D; Payre, P; Rousseau, D; Talby, M; Abt, I; Assmann, R W; Bauer, C; Blum, Walter; Brown, D; Dietl, H; Dydak, Friedrich; Ganis, G; Gotzhein, C; Jakobs, K; Kroha, H; Lütjens, G; Lutz, Gerhard; Männer, W; Moser, H G; Richter, R H; Rosado-Schlosser, A; Schael, S; Settles, Ronald; Seywerd, H C J; Saint-Denis, R; Wolf, G; Alemany, R; Boucrot, J; Callot, O; Cordier, A; Courault, F; Davier, M; Duflot, L; Grivaz, J F; Heusse, P; Jacquet, M; Kim, D W; Le Diberder, F R; Lefrançois, J; Lutz, A M; Musolino, G; Nikolic, I A; Park, H J; Park, I C; Schune, M H; Simion, S; Veillet, J J; Videau, I; 
Abbaneo, D; Azzurri, P; Bagliesi, G; Batignani, G; Bettarini, S; Bozzi, C; Calderini, G; Carpinelli, M; Ciocci, M A; Ciulli, V; Dell'Orso, R; Fantechi, R; Ferrante, I; Foà, L; Forti, F; Giassi, A; Giorgi, M A; Gregorio, A; Ligabue, F; Lusiani, A; Marrocchesi, P S; Messineo, A; Rizzo, G; Sanguinetti, G; Sciabà, A; Spagnolo, P; Steinberger, Jack; Tenchini, Roberto; Tonelli, G; Triggiani, G; Vannini, C; Verdini, P G; Walsh, J; Betteridge, A P; Blair, G A; Bryant, L M; Cerutti, F; Gao, Y; Green, M G; Johnson, D L; Medcalf, T; Mir, L M; Perrodo, P; Strong, J A; Bertin, V; Botterill, David R; Clifft, R W; Edgecock, T R; Haywood, S; Edwards, M; Maley, P; Norton, P R; Thompson, J C; Bloch-Devaux, B; Colas, P; Emery, S; Kozanecki, Witold; Lançon, E; Lemaire, M C; Locci, E; Marx, B; Pérez, P; Rander, J; Renardy, J F; Roussarie, A; Schuller, J P; Schwindling, J; Trabelsi, A; Vallage, B; Johnson, R P; Kim, H Y; Litke, A M; McNeil, M A; Taylor, G; Beddall, A; Booth, C N; Boswell, R; Cartwright, S L; Combley, F; Dawson, I; Köksal, A; Letho, M; Newton, W M; Rankin, C; Thompson, L F; Böhrer, A; Brandt, S; Cowan, G D; Feigl, E; Grupen, Claus; Lutters, G; Minguet-Rodríguez, J A; Rivera, F; Saraiva, P; Smolik, L; Stephan, F; Apollonio, M; Bosisio, L; Della Marina, R; Giannini, G; Gobbo, B; Ragusa, F; Rothberg, J E; Wasserbaech, S R; Armstrong, S R; Bellantoni, L; Elmer, P; Feng, Z; Ferguson, D P S; Gao, Y S; González, S; Grahl, J; Harton, J L; Hayes, O J; Hu, H; McNamara, P A; Nachtman, J M; Orejudos, W; Pan, Y B; Saadi, Y; Schmitt, M; Scott, I J; Sharma, V; Turk, J; Walsh, A M; Wu Sau Lan; Wu, X; Yamartino, J M; Zheng, M; Zobernig, G

    1996-01-01

    From 64492 selected $\tau$-pair events produced at the $Z^0$ resonance, the measurement of tau decays into hadrons from a global analysis using 1991, 1992 and 1993 ALEPH data is presented. Special emphasis is given to the reconstruction of photons and $\pi^0$'s, and the removal of fake photons. A detailed study of the systematics entering the $\pi^0$ reconstruction is also given. A complete and consistent set of tau hadronic branching ratios is presented for 18 exclusive modes. Most measurements are more precise than the present world average. The new level of precision reached allows a stringent test of $\tau$-$\mu$ universality in hadronic decays, $g_\tau / g_\mu = 1.0013 \pm 0.0095$, and the first measurement of the vector and axial-vector contributions to the non-strange hadronic $\tau$ decay width: $R_{\tau,V} = 1.788 \pm 0.025$ and $R_{\tau,A} = 1.694 \pm 0.027$. The ratio $(R_{\tau,V} - R_{\tau,A}) / (R_{\tau,V} + R_{\tau,A})$, equal to $(2.7 \pm 1.3)\%$, is a measure of the importance of Q...

  11. Holes at High Blowing Ratios

    Directory of Open Access Journals (Sweden)

    Phillip M. Ligrani

    1996-01-01

    Full Text Available Experimental results are presented which describe the development and structure of flow downstream of a single row of holes with compound angle orientations producing film cooling at high blowing ratios. This film cooling configuration is important because similar arrangements are frequently employed on the first stage of rotating blades of operating gas turbine engines. With this configuration, holes are spaced 6d apart in the spanwise direction, with inclination angles of 24 degrees, and angles of orientation of 50.5 degrees. Blowing ratios range from 1.5 to 4.0 and the ratio of injectant to freestream density is near 1.0. Results show that spanwise averaged adiabatic effectiveness, spanwise-averaged iso-energetic Stanton number ratios, surveys of streamwise mean velocity, and surveys of injectant distributions change by important amounts as the blowing ratio increases. This is due to injectant lift-off from the test surface just downstream of the holes.
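
    For reference, the blowing ratio quoted in this record is conventionally defined from the injectant (coolant) and freestream mass fluxes; this is standard film-cooling usage rather than a definition stated explicitly in the abstract.

```latex
% Conventional film-cooling blowing ratio (standard definition, not quoted from the record):
M \;=\; \frac{\rho_j\, u_j}{\rho_\infty\, u_\infty}
% \rho_j, u_j          : injectant density and velocity at the hole exit
% \rho_\infty, u_\infty : freestream density and velocity
```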

  12. Methodology of Accelerated Life-Time Tests For Stirling-Type "Bae-Co"-Made Cryocoolers Against Displacer-Blockage by Cryo-Pollutant Deposits

    National Research Council Canada - National Science Library

    Getmanits, Vladimir

    2000-01-01

    ...: The contractor will investigate techniques for accelerated testing of cryocooler technology. During this phase of the effort the contractor will perform a detailed design of the equipment needed to conduct accelerated testing...

  13. Low-speed tests of a high-aspect-ratio, supercritical-wing transport model equipped with a high-lift flap system in the Langley 4- by 7-meter and Ames 12-foot pressure tunnels

    Science.gov (United States)

    Morgan, H. L., Jr.; Kjelgaard, S. O.

    1983-01-01

    The Ames 12-Foot Pressure Tunnel was used to determine the effects of Reynolds number on the static longitudinal aerodynamic characteristics of an advanced, high-aspect-ratio, supercritical wing transport model equipped with a full-span leading-edge slat and part-span, double-slotted trailing-edge flaps. The model had a wing span of 7.5 ft and was tested through a free stream Reynolds number range from 1.3 to 6.0 x 10^6 per foot at a Mach number of 0.20. Prior to the Ames tests, an investigation was also conducted in the Langley 4 by 7 Meter Tunnel at a Reynolds number of 1.3 x 10^6 per foot with the model mounted on an Ames strut support system and on the Langley sting support system to determine strut interference corrections. The data obtained from the Langley tests were also used to compare the aerodynamic characteristics of the rather stiff, 7.5-ft-span steel wing model tested during this investigation and the larger and rather flexible 12-ft-span aluminum wing model tested during a previous investigation. During the tests in both the Langley and Ames tunnels, the model was tested with six basic wing configurations: (1) cruise; (2) climb (slats only extended); (3) 15 deg take-off flaps; (4) 30 deg take-off flaps; (5) 45 deg landing flaps; and (6) 60 deg landing flaps.

  14. Arcjet nozzle area ratio effects

    Science.gov (United States)

    Curran, Francis M.; Sarmiento, Charles J.; Birkner, Bjorn W.; Kwasny, James

    1990-01-01

    An experimental investigation was conducted to determine the effect of nozzle area ratio on the operating characteristics and performance of a low power dc arcjet thruster. Conical thoriated tungsten nozzle inserts were tested in a modular laboratory arcjet thruster run on hydrogen/nitrogen mixtures simulating the decomposition products of hydrazine. The converging and diverging sides of the inserts had half angles of 30 and 20 degrees, respectively, similar to a flight type unit currently under development. The length of the diverging side was varied to change the area ratio. The nozzle inserts were run over a wide range of specific power. Current, voltage, mass flow rate, and thrust were monitored to provide accurate comparisons between tests. While small differences in performance were observed between the two nozzle inserts, it was determined that for each nozzle insert, arcjet performance improved with increasing nozzle area ratio to the highest area ratio tested and that the losses become very pronounced for area ratios below 50. These trends are somewhat different than those obtained in previous experimental and analytical studies of low Re number nozzles. It appears that arcjet performance can be enhanced via area ratio optimization.

  16. Evidence synthesis to inform model-based cost-effectiveness evaluations of diagnostic tests: a methodological review of health technology assessments

    Directory of Open Access Journals (Sweden)

    Bethany Shinkins

    2017-04-01

    Full Text Available Abstract Background: Evaluations of diagnostic tests are challenging because of the indirect nature of their impact on patient outcomes. Model-based health economic evaluations of tests allow different types of evidence from various sources to be incorporated and enable cost-effectiveness estimates to be made beyond the duration of available study data. To parameterize a health-economic model fully, all the ways a test impacts on patient health must be quantified, including but not limited to diagnostic test accuracy. Methods: We assessed all UK NIHR HTA reports published May 2009-July 2015. Reports were included if they evaluated a diagnostic test, included a model-based health economic evaluation and included a systematic review and meta-analysis of test accuracy. From each eligible report we extracted information on the following topics: (1) what evidence aside from test accuracy was searched for and synthesised, (2) which methods were used to synthesise test accuracy evidence and how the results informed the economic model, (3) how/whether threshold effects were explored, (4) how the potential dependency between multiple tests in a pathway was accounted for, and (5) for evaluations of tests targeted at the primary care setting, how evidence from differing healthcare settings was incorporated. Results: The bivariate or HSROC model was implemented in 20/22 reports that met all inclusion criteria. Test accuracy data for health economic modelling was obtained from meta-analyses completely in four reports, partially in fourteen reports and not at all in four reports. Only 2/7 reports that used a quantitative test gave clear threshold recommendations. All 22 reports explored the effect of uncertainty in accuracy parameters but most of those that used multiple tests did not allow for dependence between test results. 7/22 tests were potentially suitable for primary care but the majority found limited evidence on test accuracy in primary care settings

  17. Determination of 240Pu/239Pu isotopic ratios in human tissues collected from areas around the Semipalatinsk Nuclear Test Site by sector-field high resolution ICP-MS.

    Science.gov (United States)

    Yamamoto, M; Oikawa, S; Sakaguchi, A; Tomita, J; Hoshi, M; Apsalikov, K N

    2008-09-01

    Information on the 240Pu/239Pu isotope ratios in human tissues for people living around the Semipalatinsk Nuclear Test Site (SNTS) was deduced from 9 sets of soft tissues and bones, and 23 other bone samples obtained by autopsy. Plutonium was radiochemically separated and purified, and plutonium isotopes (239Pu and 240Pu) were determined by sector-field high resolution inductively coupled plasma-mass spectrometry. For most of the tissue samples from the former nine subjects, low 240Pu/239Pu isotope ratios were determined: bone, 0.125 +/- 0.018 (0.113-0.145, n = 4); lungs, 0.063 +/- 0.010 (0.051-0.078, n = 5); and liver, 0.148 +/- 0.026 (0.104-0.189, n = 9). Only 239Pu was detected in the kidney samples; the amount of 240Pu was too small to be measured, probably due to the small size of samples analyzed. The mean 240Pu/239Pu isotope ratio for bone samples from the latter 23 subjects was 0.152 +/- 0.034, ranging from 0.088 to 0.207. A significant difference (a two-tailed Student's t test; 95% significance level, alpha = 0.05) between mean 240Pu/239Pu isotope ratios for the tissue samples and for the global fallout value (0.178 +/- 0.014) indicated that weapons-grade plutonium from the atomic bombs has been incorporated into the human tissues, especially lungs, in the residents living around the SNTS. The present 239,240Pu concentrations in bone, lung, and liver samples were, however, not much different from ranges found for human tissues from other countries that were due solely to global fallout during the 1970s-1980s.
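
    As an illustration of the significance test mentioned above, the sketch below runs a one-sample two-tailed t-test of the reported bone-sample summary statistics against the global fallout ratio. It is a simplified stand-in for the comparison in the paper (whose exact test formulation is not reproduced here), using only the summary values quoted in the abstract.

```python
# Sketch: two-tailed t-test of the mean bone 240Pu/239Pu ratio against the global
# fallout value, reconstructed from the summary statistics quoted in the abstract.
# This is an illustrative approximation, not the authors' exact calculation.
import math
from scipy import stats

mean_ratio, sd_ratio, n = 0.152, 0.034, 23   # bone samples (values from the abstract)
fallout_ratio = 0.178                        # global fallout reference value

t_stat = (mean_ratio - fallout_ratio) / (sd_ratio / math.sqrt(n))
p_value = 2.0 * stats.t.sf(abs(t_stat), df=n - 1)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> ratios differ from fallout
```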

  18. The ratio of change in muscle thickness between superficial and deep cervical flexor muscles during the craniocervical flexion test and a suggestion regarding clinical treatment of patients with musculoskeletal neck pain.

    Science.gov (United States)

    Goo, Miran; Kim, Seong-Gil; Jun, Deokhoon

    2015-08-01

    [Purpose] The purpose of this study was to identify the imbalance of muscle recruitment in cervical flexor muscles during the craniocervical flexion test by using ultrasonography and to propose the optimal level of pressure in clinical craniocervical flexion exercise for people with neck pain. [Subjects and Methods] A total of 18 students (9 males and 9 females) with neck pain at D University in Gyeongsangbuk-do, South Korea, participated in this study. The change in muscle thickness in superficial and deep cervical flexor muscles during the craniocervical flexion test was measured using ultrasonography. The ratio of muscle thickness changes between superficial and deep muscles during the test was obtained to interpret the imbalance of muscle recruitment in cervical flexor muscles. [Results] The muscle thickness ratio of the sternocleidomastoid muscle/deep cervical flexor muscles according to the incremental pressure showed significant differences between 22 mmHg and 24 mmHg, between 24 mmHg and 28 mmHg, between 24 mmHg and 30 mmHg, and between 26 mmHg and 28 mmHg. [Conclusion] Ultrasonography can be applied for examination of the cervical flexor muscles in a clinical environment, and a pressure level between 24 mmHg and 26 mmHg, which produced the smallest activation of the sternocleidomastoid muscle, is suggested as a practical target for craniocervical flexion intervention exercise.

  19. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  20. Specificity of the Acute Tryptophan and Tyrosine Plus Phenylalanine Depletion and Loading Tests Part II: Normalisation of the Tryptophan and the Tyrosine Plus Phenylalanine to Competing Amino Acid Ratios in a New Control Formulation

    Directory of Open Access Journals (Sweden)

    Abdulla A.-B. Badawy

    2010-06-01

    Full Text Available Current formulations for acute tryptophan (Trp) or tyrosine (Tyr) plus phenylalanine (Phe) depletion and loading cause undesirable decreases in ratios of Trp or Tyr + Phe to competing amino acids (CAA), thus undermining the specificities of these tests. Branched-chain amino acids (BCAA) cause these unintended decreases, and lowering their content in a new balanced control formulation in the present study led to normalization of all ratios. Four groups (n = 12 each) of adults each received one of four 50 g control formulations, with 0% (traditional), 20%, 30%, or 40% less of the BCAA. The free and total [Trp]/[CAA] and [Phe + Tyr]/[BCAA + Trp] ratios all decreased significantly during the first 5 h following the traditional formulation, but were fully normalized by the formulation containing 40% less of the BCAA. We recommend the latter as a balanced control formulation and propose adjustments in the depletion and loading formulations to enhance their specificities for 5-HT and the catecholamines.

  1. A pilot study to test psychophonetics methodology for self-care and empathy in compassion fatigue, burnout and secondary traumatic stress

    Directory of Open Access Journals (Sweden)

    Katherine J. Train

    2013-09-01

    Conclusion: The results gave adequate indication for the implementation of a larger study in order to apply and test the intervention. The study highlights a dire need for further research in this field.

  2. STUDIES IN RESEARCH METHODOLOGY. IV. A SAMPLING STUDY OF THE CENTRAL LIMIT THEOREM AND THE ROBUSTNESS OF ONE-SAMPLE PARAMETRIC TESTS,

    Science.gov (United States)

    iconoclastic. Even at N=1024 these departures were quite appreciable at the testing tails, being greatest for chi-square and least for Z, and becoming worse in all cases at increasingly extreme tail areas. (Author)

  3. Methodologies, solutions, and lessons learned from heavy oil well testing with an ESP, offshore UK in the Bentley field, block 9/3b

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Barny; Lucas-Clements, Charles; Kew, Steve [Xcite Energy Resources (United Kingdom); Shumakov, Yakov; Camilleri, Lawrence; Akuanyionwu, Obinna; Tonoglu, Ahmet [Schlumberger (United Kingdom)

    2011-07-01

    Over the past decade, there has been an increase in hydrocarbon demand that led to the production of heavy oil fields in the United Kingdom continental shelf (UKCS). Most of the activity has been confined to exploration and appraisal drilling, the reason being the high uncertainty of the reservoir and fluid properties. Due to the operational complexity inherent to heavy oil, the use of conventional appraisal-well testing technology is limited. A novel technique developed to determine the most appropriate technology for testing wells with heavy oil using an electrical submersible pump (ESP) is presented in this paper. This technique was applied in the Bentley field. Some of the technical challenges include maintaining fluid mobility using surface-testing equipment, obtaining accurate flow measurements, a short weather window, and oil and gas separation for metering. Combining technologies such as a dual-energy gamma ray venturi multiphase flowmeter, real-time monitoring, and ESP completion made it possible to execute the well test.

  4. Methodological study of the diffusion of interacting cations through clays. Application: experimental tests and simulation of coupled chemistry-diffusion transport of alkaline ions through a synthetical bentonite

    International Nuclear Information System (INIS)

    Melkior, Th.

    2000-01-01

    The subject of this work deals with the project of underground disposal of radioactive wastes in deep geological formations. It concerns the study of the migration of radionuclides through clays. In these materials, the main transport mechanism is assumed to be diffusion under natural conditions. Therefore, some diffusion experiments are conducted. With interacting solutes which present a strong affinity for the material, the duration of these tests will be too long for the range of concentrations of interest. An alternative is to determine on one hand the geochemical retention properties using batch tests and crushed rock samples and, on the other hand, to deduce the transport parameters from diffusion tests realised with a non-interacting tracer, tritiated water. These data are then used to simulate the migration of the reactive elements with a numerical code which can deal with coupled chemistry-diffusion equations. The validity of this approach is tested by comparing the numerical simulations with the results of diffusion experiments of cations through a clay. The subject is investigated in the case of the diffusion of cesium, lithium and sodium through a compacted sodium bentonite. The diffusion tests are realised with the through-diffusion method. The comparison between the experimental results and the simulations shows that the latter tends to underestimate the propagation of the considered species. The differences could be attributed to surface diffusion and to a decrease of the accessibility to the sites of fixation of the bentonite, from the conditions of clay suspensions in batch tests to the situation of compacted samples. The influence of the experimental apparatus used during the diffusion tests on the results of the measurement has also been tested. It showed that these apparatus have to be taken into consideration when the experimental data are interpreted. A specific model has therefore been developed with the numerical code CASTEM 2000. (author)

  5. ANALYSIS OF THE EFFECT OF LDR, NPL AND OPERATIONAL EFFICIENCY RATIO ON RETURN ON ASSETS AT FOREIGN EXCHANGE BANKS IN INDONESIA, 2010-2012

    OpenAIRE

    Hamidah Hamidah; Goldan Merion Siallagan; Umi Mardiyati

    2014-01-01

    This research is performed in order to analyse the influence of the Loan to Deposit Ratio (LDR), Non Performing Loan (NPL) and Operational Efficiency Ratio (OER) on Return On Assets (ROA) of foreign exchange banks in Indonesia over the period 2010-2012. The research methodology used purposive sampling; the sample was drawn from foreign exchange banks in Indonesia. Data were analysed with multiple linear regression by ordinary least squares, and hypotheses were tested with the t-statistic and F-statistic, a classic as...
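
    A minimal sketch of the kind of analysis described, an OLS regression of ROA on LDR, NPL and OER with per-coefficient t-statistics and an overall F-statistic; the data frame and all values below are hypothetical placeholders, not the study's data.

```python
# Sketch: OLS regression of ROA on LDR, NPL and OER with t- and F-statistics,
# in the spirit of the study; the observations here are hypothetical.
import pandas as pd
import statsmodels.api as sm

# Hypothetical bank-year observations (percentages).
data = pd.DataFrame({
    "ROA": [2.1, 1.8, 2.5, 1.2, 3.0, 2.2],
    "LDR": [85.0, 78.0, 90.0, 70.0, 95.0, 88.0],
    "NPL": [2.0, 3.1, 1.5, 4.0, 1.0, 2.4],
    "OER": [80.0, 85.0, 75.0, 90.0, 70.0, 78.0],
})

X = sm.add_constant(data[["LDR", "NPL", "OER"]])
model = sm.OLS(data["ROA"], X).fit()

print(model.summary())                              # per-coefficient t-statistics and p-values
print("F =", model.fvalue, "p =", model.f_pvalue)   # overall significance
```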

  6. Current Evidence to Justify, and the Methodological Considerations for a Randomised Controlled Trial Testing the Hypothesis that Statins Prevent the Malignant Progression of Barrett's Oesophagus

    Directory of Open Access Journals (Sweden)

    David Thurtle

    2014-12-01

    Full Text Available Barrett’s oesophagus is the predominant risk factor for oesophageal adenocarcinoma, a cancer whose incidence is increasing and which has a poor prognosis. This article reviews the latest experimental and epidemiological evidence justifying the development of a randomised controlled trial investigating the hypothesis that statins prevent the malignant progression of Barrett’s oesophagus, and explores the methodological considerations for such a trial. The experimental evidence suggests anti-carcinogenic properties of statins on oesophageal cancer cell lines, based on the inhibition of the mevalonate pathway and the production of pro-apoptotic proteins. The epidemiological evidence reports inverse associations between statin use and the incidence of oesophageal carcinoma in both general population and Barrett’s oesophagus cohorts. Such a randomised controlled trial would be a large multi-centre trial, probably investigating simvastatin, given the wide clinical experience with this drug, relatively low side-effect profile and low financial cost. As with any clinical trial, high adherence is important, and it could be increased with therapy-, patient-, doctor- and system-focussed interventions. We would suggest there is now sufficient evidence to justify a full clinical trial that attempts to prevent this aggressive cancer in a high-risk population.

  7. Test

    DEFF Research Database (Denmark)

    Bendixen, Carsten

    2014-01-01

    A contribution giving a brief, introductory, perspective-setting and concept-clarifying account of the concept of testing in the educational field.

  8. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  9. Airfoil selection methodology for Small Wind Turbines

    DEFF Research Database (Denmark)

    Salgado Fuentes, Valentin; Troya, Cesar; Moreno, Gustavo

    2016-01-01

    In wind turbine technology, aerodynamic performance is fundamental to increasing efficiency. Nowadays there are several databases with airfoils designed and simulated for different applications, so it is necessary to select those suitable for a specific application. This work presents a new methodology for airfoil selection used in the feasibility study and optimization of small wind turbines with low cut-in speed. In the first stage, airfoil data are run through the XFOIL software to check compatibility with the simulator; then an arithmetic-mean criterion is applied recursively to discard underperforming airfoils, and the best airfoil data are exported to Matlab for deeper analysis. In the second stage, data points are interpolated using splines to calculate glide ratio and stability across multiple angles of attack, and those showing the greatest steadiness are retained. As a result, 3 airfoils...
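
    A minimal sketch of the interpolation step described above, assuming a standard polar-data spline fit and a glide ratio defined as Cl/Cd; the polar values are hypothetical placeholders, not taken from any airfoil database.

```python
# Sketch: spline interpolation of airfoil polar data to estimate the glide ratio
# (Cl/Cd) across angles of attack, as in the selection step described above.
# The polar values below are hypothetical, not from any database.
import numpy as np
from scipy.interpolate import CubicSpline

alpha = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])              # angle of attack, deg
cl    = np.array([0.20, 0.45, 0.70, 0.92, 1.08, 1.18, 1.20])         # lift coefficient
cd    = np.array([0.010, 0.011, 0.013, 0.016, 0.021, 0.030, 0.045])  # drag coefficient

cl_spline = CubicSpline(alpha, cl)
cd_spline = CubicSpline(alpha, cd)

alpha_fine = np.linspace(alpha.min(), alpha.max(), 500)
glide_ratio = cl_spline(alpha_fine) / cd_spline(alpha_fine)

best = np.argmax(glide_ratio)
print(f"max Cl/Cd = {glide_ratio[best]:.1f} at alpha = {alpha_fine[best]:.1f} deg")
```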

  10. Development and testing of VTT approach to risk-informed in-service inspection methodology. Final report of SAFIR INTELI INPUT Project RI-ISI

    International Nuclear Information System (INIS)

    Cronvall, O.; Maennistoe, I.; Simola, K.

    2007-04-01

    This report summarises the results of a research project on risk-informed in-service inspection (RI-ISI) methodology conducted in the Finnish national nuclear energy research programme SAFIR (2003-2006). The purpose of this work was to increase the accuracy of risk estimates used in RI-ISI analyses of nuclear power plant (NPP) piping systems, and to quantitatively evaluate the effects of different piping inspection strategies on risk. Piping failure occurrences were sampled by using probabilistic fracture mechanics (PFM) analyses. The PFM results for crack growth were used to construct transition matrices for a discrete-time Markov process model, which in turn was applied to examine the effects of various inspection strategies on the failure probabilities and risks. The applicability of the developed quantitative risk matrix approach was examined in a pilot study performed on the shut-down cooling piping system 321 in NPP unit OL1 of Teollisuuden Voima Oy (TVO). The analysed degradation mechanisms were stress corrosion cracking (SCC) and thermal fatigue induced cracking (in the mixing points). Here a new and rather straightforward approach was developed to model thermal fatigue induced cracking, a degradation mechanism that is much more difficult to model than SCC. This study further demonstrated the usefulness of the Markov analysis procedure developed by VTT in RI-ISI applications. The most important results are the quantified comparisons of different inspection strategies. It was shown in this study that Markov models are useful for this purpose, when combined with PFM analyses. While the numerical results could benefit from further considerations of inspection reliability, this does not affect the feasibility of the method itself. The approach can be used to identify an optimal inspection strategy for achieving a balanced risk profile of piping segments. (orig.)
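
    A minimal sketch of how a discrete-time Markov model can be used to compare inspection strategies, in the spirit of the approach described above; the degradation states, yearly transition probabilities and probability of detection below are illustrative placeholders, not values from the VTT study.

```python
# Sketch: discrete-time Markov model of piping degradation with periodic inspection,
# of the kind used to compare inspection strategies. All numbers are illustrative.
import numpy as np

# States: 0 = intact, 1 = detectable crack, 2 = leak, 3 = rupture (absorbing).
P = np.array([
    [0.990, 0.010, 0.000, 0.000],
    [0.000, 0.970, 0.025, 0.005],
    [0.000, 0.000, 0.980, 0.020],
    [0.000, 0.000, 0.000, 1.000],
])

def rupture_prob(years, inspect_every=None, pod=0.8):
    """Probability of reaching the rupture state after `years` yearly steps.
    An inspection detects a crack or leak with probability `pod` and returns
    the segment to the intact state (repair)."""
    state = np.array([1.0, 0.0, 0.0, 0.0])
    for year in range(1, years + 1):
        state = state @ P
        if inspect_every and year % inspect_every == 0:
            repaired = pod * (state[1] + state[2])
            state[1] *= (1 - pod)
            state[2] *= (1 - pod)
            state[0] += repaired
    return state[3]

print("no inspection, 40 y  :", round(rupture_prob(40), 4))
print("10-year interval     :", round(rupture_prob(40, inspect_every=10), 4))
```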

  11. VVER-1000 dominance ratio

    International Nuclear Information System (INIS)

    Gorodkov, S.

    2009-01-01

    The dominance ratio, or more precisely its closeness to unity, is an important characteristic of a large reactor. It allows one to evaluate beforehand the number of source iterations required in deterministic calculations of the spatial power distribution, or the minimal number of histories to be modeled to achieve the desired statistical error level in large-core Monte Carlo calculations. In this work a relatively simple approach for dominance ratio evaluation is proposed; it essentially uses core symmetry. The dependence of the dominance ratio on the neutron flux spatial distribution is demonstrated. (author)
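
    As context for why closeness to unity matters, the standard power-iteration argument (general reactor-physics background, not a result quoted from this record) is that the error component of the fission source decays geometrically with the dominance ratio, so the number of source iterations needed to reach a tolerance grows rapidly as the ratio approaches one:

```latex
% Standard source-iteration convergence estimate (background, not from the record):
% after n iterations the error scales as d^n, where d = k_1/k_0 is the dominance ratio,
% so reaching a relative tolerance \varepsilon requires roughly
n \;\gtrsim\; \frac{\ln \varepsilon}{\ln d},
\qquad \text{e.g. } d = 0.99,\ \varepsilon = 10^{-4} \;\Rightarrow\; n \approx 916 .
```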

  12. WWER-1000 dominance ratio

    International Nuclear Information System (INIS)

    Gorodkov, S.S.

    2009-01-01

    The dominance ratio, or more precisely its closeness to unity, is an important characteristic of a large reactor. It allows one to evaluate beforehand the number of source iterations required in deterministic calculations of the spatial power distribution, or the minimal number of histories to be modeled to achieve the desired statistical error level in large-core Monte Carlo calculations. In this work a relatively simple approach for dominance ratio evaluation is proposed; it essentially uses core symmetry. The dependence of the dominance ratio on the neutron flux spatial distribution is demonstrated. (Authors)

  13. Evidence, Methodology, Test-Based Accountability, and Educational Policy: A Scholarly Exchange between Dr. Eric A. Hanushek and Drs. John Robert Warren and Eric Grodsky

    Science.gov (United States)

    Hanushek, Eric A.; Warren, John Robert; Grodsky, Eric

    2012-01-01

    This exchange represents a follow-up to an article on the effects of state high school exit examinations that previously appeared in this journal (Warren, Grodsky, & Kalogrides 2009). That 2009 article was featured prominently in a report by the National Research Council (NRC) that evaluated the efficacy of test-based accountability systems.…

  14. Sharpening Sharpe Ratios

    OpenAIRE

    William N. Goetzmann; Jonathan E. Ingersoll Jr.; Matthew I. Spiegel; Ivo Welch

    2002-01-01

    It is now well known that the Sharpe ratio and other related reward-to-risk measures may be manipulated with option-like strategies. In this paper we derive the general conditions for achieving the maximum expected Sharpe ratio. We derive static rules for achieving the maximum Sharpe ratio with two or more options, as well as a continuum of derivative contracts. The optimal strategy has a truncated right tail and a fat left tail. We also derive dynamic rules for increasing the Sharpe ratio. O...
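
    For reference, the quantity being manipulated here is the conventional reward-to-risk measure. The sketch below computes a Sharpe ratio from a series of periodic returns; the return series, the risk-free rate and the square-root-of-periods annualization are illustrative conventions rather than anything specified in the record.

```python
# Sketch: Sharpe ratio of a return series (illustrative data, standard definition:
# mean excess return divided by the standard deviation of excess returns).
import numpy as np

monthly_returns = np.array([0.012, -0.004, 0.021, 0.008, -0.010, 0.015,
                            0.005, 0.018, -0.002, 0.009, 0.011, 0.007])
monthly_rf = 0.002  # hypothetical risk-free rate per month

excess = monthly_returns - monthly_rf
sharpe_monthly = excess.mean() / excess.std(ddof=1)
sharpe_annual = sharpe_monthly * np.sqrt(12)   # common annualization convention

print(f"monthly Sharpe = {sharpe_monthly:.2f}, annualized = {sharpe_annual:.2f}")
```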

  15. Methodological aspects of breath hydrogen (H2) analysis. Evaluation of a H2 monitor and interpretation of the breath H2 test

    DEFF Research Database (Denmark)

    Rumessen, J J; Kokholm, G; Gudmand-Høyer, E

    1987-01-01

    The reliability of end-expiratory hydrogen (H2) breath tests was assessed and the significance of some important pitfalls was studied, using a compact, rapid H2 monitor with electrochemical cells. The H2 response was shown to be linear and stable. The reproducibility of the breath collection ... were studied in 10 healthy adults during a 4-month period and they showed very marked inter- and intra-individual variability (16% above 40 p.p.m.). Initial peaks (early, short-lived H2 rises unrelated to carbohydrate malabsorption) were identified in 25% of the breath tests (in 4% above 20 p.p.m.). It is concluded that the technique used for interval sampling of end-expiratory breath samples for H2 concentration gives reliable results. The biological significance of H2 concentration increments can only be evaluated if the limitations of the technical procedures and the individual ability to produce H2...

  16. A methodological approach to improve the sexual health of vulnerable female populations: incentivized peer-recruitment and field-based STD testing.

    Science.gov (United States)

    Roth, Alexis M; Rosenberger, Joshua G; Reece, Michael; Van Der Pol, Barbara

    2012-02-01

    Transactional sex has been associated with increased risk of adverse health outcomes, including sexually transmitted infections (STIs). Participants included female sex workers and men they recruited utilizing incentivized snowball sampling. Participants provided specimens for STI diagnostic testing and completed a semi-structured interview. Forty-four participants aged 19-65 were interviewed. Participants found self-sampling to be acceptable and overwhelmingly endorsed sampling outside of a clinic (90%) for reasons such as convenience, privacy, and lack of stigma. A substantial minority (38%) tested positive for at least one STI. Novel strategies may encourage sexual health care and prevent STIs among sex workers. High infection and screening acceptance rates across the sample suggests that individuals engaged in transactional sex would benefit from, and would be responsive to, community-based self-sampling for STI screening.

  17. Evaluation of a new methodology to simulate damage and wear of polyethylene hip replacements subjected to edge loading in hip simulator testing.

    Science.gov (United States)

    Partridge, Susan; Tipper, Joanne L; Al-Hajjar, Mazen; Isaac, Graham H; Fisher, John; Williams, Sophie

    2018-05-01

    Wear and fatigue of polyethylene acetabular cups have been reported to play a role in the failure of total hip replacements. Hip simulator testing under a wide range of clinically relevant loading conditions is important. Edge loading of hip replacements can occur following impingement under extreme activities and can also occur during normal gait, where there is an offset deficiency and/or joint laxity. This study evaluated a hip simulator method that assessed wear and damage in polyethylene acetabular liners that were subjected to edge loading. The liners tested to evaluate the method were a currently manufactured crosslinked polyethylene acetabular liner and an aged conventional polyethylene acetabular liner. The acetabular liners were tested for 5 million standard walking cycles and, following this, 5 million walking cycles with edge loading. Edge loading conditions represented a separation of the centers of rotation of the femoral head and the acetabular liner during the swing phase, leading to loading of the liner rim on heel strike. Rim damage and cracking was observed in the aged conventional polyethylene liner. Steady-state wear rates assessed gravimetrically were lower under edge loading compared to standard loading. This study supports previous clinical findings that edge loading may cause rim cracking in liners, where component positioning is suboptimal or where material degradation is present. The simulation method developed has the potential to be used in the future to test the effect of aging and different levels of severity of edge loading on a range of cross-linked polyethylene materials. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1456-1462, 2018.

  18. Methodological basis for the optimization of a marine sea-urchin embryo test (SET) for the ecological assessment of coastal water quality.

    Science.gov (United States)

    Saco-Alvarez, Liliana; Durán, Iria; Ignacio Lorenzo, J; Beiras, Ricardo

    2010-05-01

    The sea-urchin embryo test (SET) has been frequently used as a rapid, sensitive, and cost-effective biological tool for marine monitoring worldwide, but the selection of a sensitive, objective, and automatically readable endpoint, a stricter quality control to guarantee optimum handling and biological material, and the identification of confounding factors that interfere with the response have hampered its widespread routine use. Size increase in a minimum of n=30 individuals per replicate, either normal larvae or earlier developmental stages, was preferred to observer-dependent, discontinuous responses as test endpoint. Control size increase after 48 h incubation at 20 degrees C must meet an acceptability criterion of 218 µm. In order to avoid false positives, minimums of 32 per thousand salinity, pH 7 and 2 mg/L oxygen, and a maximum of 40 µg/L NH3 (NOEC) are required in the incubation media. For in situ testing, size increase rates must be corrected on a degree-day basis using 12 degrees C as the developmental threshold. Copyright 2010 Elsevier Inc. All rights reserved.
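
    A minimal sketch of the degree-day correction implied by the last sentence, assuming a simple linear accumulation of degree-days above the 12 degrees C developmental threshold; the temperature readings and size values are hypothetical field data, not from the study.

```python
# Sketch: degree-day normalisation of sea-urchin larval size increase for in situ tests,
# assuming degree-days accumulate linearly above the 12 degC developmental threshold.
# Temperatures and sizes are hypothetical illustrations.

THRESHOLD_C = 12.0

def degree_days(temps_c, hours_per_reading=1.0):
    """Sum of (T - threshold) * time above the threshold, expressed in degree-days."""
    return sum(max(t - THRESHOLD_C, 0.0) * hours_per_reading for t in temps_c) / 24.0

temps = [16.5, 17.0, 16.0, 15.5] * 12   # hourly readings over 48 h
size_increase_um = 180.0                # measured larval size increase

dd = degree_days(temps)
rate = size_increase_um / dd            # size increase per degree-day
print(f"degree-days = {dd:.1f}, growth rate = {rate:.1f} um per degree-day")
```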

  19. Three-dimensional stereo by photometric ratios

    International Nuclear Information System (INIS)

    Wolff, L.B.; Angelopoulou, E.

    1994-01-01

    We present a methodology for corresponding a dense set of points on an object surface from photometric values for three-dimensional stereo computation of depth. The methodology utilizes multiple stereo pairs of images, with each stereo pair being taken of the identical scene but under different illumination. With just two stereo pairs of images taken under two different illumination conditions, a stereo pair of ratio images can be produced, one for the ratio of left-hand images and one for the ratio of right-hand images. We demonstrate how the photometric ratios composing these images can be used for accurate correspondence of object points. Object points having the same photometric ratio with respect to two different illumination conditions constitute a well-defined equivalence class of physical constraints defined by local surface orientation relative to illumination conditions. We formally show that for diffuse reflection the photometric ratio is invariant to varying camera characteristics, surface albedo, and viewpoint and that therefore the same photometric ratio in both images of a stereo pair implies the same equivalence class of physical constraints. The correspondence of photometric ratios along epipolar lines in a stereo pair of images under different illumination conditions is a correspondence of equivalent physical constraints, and the determination of depth from stereo can be performed. Whereas illumination planning is required, our photometric-based stereo methodology does not require knowledge of illumination conditions in the actual computation of three-dimensional depth and is applicable to perspective views. This technique extends the stereo determination of three-dimensional depth to smooth featureless surfaces without the use of precisely calibrated lighting. We demonstrate experimental depth maps from a dense set of points on smooth objects of known ground-truth shape, determined to within 1% depth accuracy
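
    A minimal sketch of the core idea: for diffuse surfaces, the ratio of two images taken under different illuminations cancels albedo and camera gain, so equal ratios along an epipolar line are candidate correspondences. The arrays below are synthetic stand-ins for the two stereo pairs, and the matching rule is a simplified nearest-ratio search rather than the paper's full procedure.

```python
# Sketch: correspondence by photometric ratios along an epipolar line.
# left_a/left_b and right_a/right_b are the same stereo views under two illuminations.
# Synthetic arrays stand in for real images; a toy horizontal shift plays the disparity.
import numpy as np

rng = np.random.default_rng(0)
h, w = 4, 64
left_a = rng.uniform(0.2, 1.0, (h, w))
left_b = rng.uniform(0.2, 1.0, (h, w))
right_a = np.roll(left_a, 5, axis=1)    # toy right view: a 5-pixel horizontal shift
right_b = np.roll(left_b, 5, axis=1)

eps = 1e-6
ratio_left = left_a / (left_b + eps)    # albedo and camera gain cancel for diffuse reflection
ratio_right = right_a / (right_b + eps)

row, col = 2, 20                        # pixel to match, searched along its epipolar row
target = ratio_left[row, col]
match_col = int(np.argmin(np.abs(ratio_right[row] - target)))
print(f"left ({row},{col}) -> right ({row},{match_col}), disparity = {match_col - col}")
```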

  20. Detecting isotopic ratio outliers

    International Nuclear Information System (INIS)

    Bayne, C.K.; Smith, D.H.

    1985-01-01

    An alternative method is proposed for improving isotopic ratio estimates. This method mathematically models pulse-count data and uses iterative reweighted Poisson regression to estimate model parameters to calculate the isotopic ratios. This computer-oriented approach provides theoretically better methods than conventional techniques to establish error limits and to identify outliers. 6 refs., 3 figs., 3 tabs
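
    As a rough illustration of the kind of model described (not the authors' actual formulation), the sketch below fits a Poisson regression to pulse-count data for two isotopes (statsmodels' GLM is fitted by iteratively reweighted least squares) and derives the isotopic ratio from the fitted means; the counts and scan structure are invented.

```python
# Sketch: Poisson-regression estimate of an isotopic ratio from pulse-count data.
# statsmodels fits GLMs by iteratively reweighted least squares (IRLS).
# The counts below are synthetic; the model is illustrative, not the paper's exact one.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_scans = 30
true_ratio = 0.178
base = 5000.0  # mean counts of the reference isotope per scan

df = pd.DataFrame({
    "counts": np.concatenate([
        rng.poisson(base, n_scans),                 # reference isotope (e.g. 239Pu)
        rng.poisson(base * true_ratio, n_scans),    # minor isotope (e.g. 240Pu)
    ]),
    "isotope": ["ref"] * n_scans + ["minor"] * n_scans,
})

# log E[counts] = b0 + b1 * [isotope == minor]; exp(b1) estimates the ratio.
fit = smf.glm("counts ~ C(isotope, Treatment('ref'))", data=df,
              family=sm.families.Poisson()).fit()
ratio_hat = np.exp(fit.params.iloc[1])
ci = np.exp(fit.conf_int().iloc[1])
print(f"estimated ratio = {ratio_hat:.4f}, 95% CI = ({ci.iloc[0]:.4f}, {ci.iloc[1]:.4f})")
```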