WorldWideScience

Sample records for ratio test methodology

  1. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes the methodology for developing new test methods and forming solutions for the development of new test methods. The basis of the methodology for developing new test methods is the individual elements of the system and process approaches. They contribute to the development of an effective research strategy for the object, the study of interrelations, the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and mutual influence. This allows you to solve the tasks assigned to achieve the goal. The methodology is based on the use of fuzzy cognitive maps. The question of the choice of the method on the basis of which the model for the formation of solutions is based is considered. The methodology provides for recording a model for a new test method in the form of a finite set of objects. These objects are significant for the test method characteristics. Then a causal relationship is established between the objects. Further, the values of fitness indicators and the observability of the method and metrological tolerance for the indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing the test method.

  2. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Full Text Available Current methodologies of information systems penetration testing focuses mainly on a high level and technical description of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. It often results in a situation when the tests are badly planned, managed and the vulnerabilities found are unsystematically remediated. The goal of this article is to present new methodology called PETA which is focused mainly on the management of penetration tests. Development of this methodology was based on the comparative analysis of current methodologies. New methodology incorporates current best practices of IT governance and project management represented by COBIT and PRINCE2 principles. Presented methodology has been quantitatively evaluated.

  3. Probabilistic fatigue life prediction methodology for notched components based on simple smooth fatigue tests

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z. R.; Li, Z. X. [Dept.of Engineering Mechanics, Jiangsu Key Laboratory of Engineering Mechanics, Southeast University, Nanjing (China); Hu, X. T.; Xin, P. P.; Song, Y. D. [State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing University of Aeronautics and Astronautics, Nanjing (China)

    2017-01-15

    The methodology of probabilistic fatigue life prediction for notched components based on smooth specimens is presented. Weakestlink theory incorporating Walker strain model has been utilized in this approach. The effects of stress ratio and stress gradient have been considered. Weibull distribution and median rank estimator are used to describe fatigue statistics. Fatigue tests under different stress ratios were conducted on smooth and notched specimens of titanium alloy TC-1-1. The proposed procedures were checked against the test data of TC-1-1 notched specimens. Prediction results of 50 % survival rate are all within a factor of two scatter band of the test results.

  4. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To asses the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can oppose the different missing mechanisms to each other; e.g. by the likelihood ratio tests. The finite sample behavior of the null distribution and the power of the likelihood ratio test is studied under a variety of missingness mechanisms. missing data; sensitivity analysis; likelihood ratio test; missing mechanisms

  5. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of risks to a loss of primary coolant flow in the Engineering Test Reactor

  6. Methodology for testing metal detectors using variables test data

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, D.D.; Murray, D.W.

    1993-08-01

    By extracting and analyzing measurement (variables) data from portal metal detectors whenever possible instead of the more typical ``alarm``/``no-alarm`` (attributes or binomial) data, we can be more informed about metal detector health with fewer tests. This testing methodology discussed in this report is an alternative to the typical binomial testing and in many ways is far superior.

  7. 34 CFR Appendix A to Subpart L of... - Ratio Methodology for Proprietary Institutions

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Ratio Methodology for Proprietary Institutions A Appendix A to Subpart L of Part 668 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION STUDENT ASSISTANCE GENERAL PROVISIONS...

  8. Proposed Objective Odor Control Test Methodology for Waste Containment

    Science.gov (United States)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer reviewed documentation of human odor thresholds for standardized contaminates, industry stardnard atmostpheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complimented with a qualitative subjective assessment. Isoamyl acetate (IAA - also known at isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for both measuring its quantitative airborne concentrations. IAA is a clear, colorless liquid with a banana-like odor, documented detectable smell threshold for humans of 0.025 PPM, and a 15 PPB level of quantation limit.

  9. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.

  10. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize RPS(Reactor Protection System). Since RPS causes fatal damage on accidental cases, it is classified as Safety 1E class. Therefore we propose the effective testing methodology to assure the reliability of embedded software in the DPPS(Digital Plant Protection System). To test the embedded software effectively in DPPS, our methodology consists of two steps. The first is the re-engineering step that extracts classes from structural source program, and the second is the level of testing step which is composed of unit testing, Integration Testing and System Testing. On each testing step we test the embedded software with selected test cases after the test item identification step. If we use this testing methodology, we can test the embedded software effectively by reducing the cost and the time

  11. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    System in Package (SiP) play an important role in portable, aerospace and military electronic with the microminiaturization, light weight, high density, and high reliability. At present, SiP system test has encountered the problem on system complexity and malfunction location with the system scale exponentially increase. For SiP system, this paper proposed a testing methodology and testing process based on the boundary scan technology. Combining the character of SiP system and referencing the boundary scan theory of PCB circuit and embedded core test, the specific testing methodology and process has been proposed. The hardware requirement of the under test SiP system has been provided, and the hardware platform of the testing has been constructed. The testing methodology has the character of high test efficiency and accurate malfunction location.

  12. Jet-Surface Interaction - High Aspect Ratio Nozzle Test: Test Summary

    Science.gov (United States)

    Brown, Clifford A.

    2016-01-01

    The Jet-Surface Interaction High Aspect Ratio Nozzle Test was conducted in the Aero-Acoustic Propulsion Laboratory at the NASA Glenn Research Center in the fall of 2015. There were four primary goals specified for this test: (1) extend the current noise database for rectangular nozzles to higher aspect ratios, (2) verify data previously acquired at small-scale with data from a larger model, (3) acquired jet-surface interaction noise data suitable for creating verifying empirical noise models and (4) investigate the effect of nozzle septa on the jet-mixing and jet-surface interaction noise. These slides give a summary of the test with representative results for each goal.

  13. Methodology of diagnostic tests in hepatology

    DEFF Research Database (Denmark)

    Christensen, Erik

    2009-01-01

    The performance of diagnostic tests can be assessed by a number of methods. These include sensitivity, specificity,positive and negative predictive values, likelihood ratios and receiver operating characteristic (ROC) curves. This paper describes the methods and explains which information...... they provide. Sensitivity and specificity provides measures of the diagnostic accuracy of a test in diagnosing the condition. The positive and negative predictive values estimate the probability of the condition from the test-outcome and the condition's prevalence. The likelihood ratios bring together......' and plotting sensitivity as a function of 1-specificity. The ROC-curve can be used to define optimal cut-off values for a test, to assess the diagnostic accuracy of the test, and to compare the usefulness of different tests in the same patients. Under certain conditions it may be possible to utilize a test...

  14. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    Science.gov (United States)

    2016-05-01

    ARL-TN-0756 ● MAY 2016 US Army Research Laboratory Stress Optical Coefficient, Test Methodology , and Glass Standard Evaluation...Stress Optical Coefficient, Test Methodology , and Glass Standard Evaluation by Clayton M Weiss Oak Ridge Institute for Science and Education...ORISE), Belcamp, MD Parimal J Patel Weapons and Materials Research Directorate, ARL Approved for public release; distribution is

  15. Testing Methodology in the Student Learning Process

    Science.gov (United States)

    Gorbunova, Tatiana N.

    2017-01-01

    The subject of the research is to build methodologies to evaluate the student knowledge by testing. The author points to the importance of feedback about the mastering level in the learning process. Testing is considered as a tool. The object of the study is to create the test system models for defence practice problems. Special attention is paid…

  16. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypothesis about goodness of fit of multinomial probabilities in one, two and multi – dimensional contingency table was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...

  17. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  18. 34 CFR Appendix B to Subpart L of... - Ratio Methodology for Private Non-Profit Institutions

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Ratio Methodology for Private Non-Profit Institutions B Appendix B to Subpart L of Part 668 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION STUDENT ASSISTANCE GENERAL PROVISIONS...

  19. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8 pages document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics is used in many security systems and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not a specific methodology for testing this influence at the moment...

  20. Equipment qualification testing methodology research at Sandia Laboratories

    International Nuclear Information System (INIS)

    Jeppesen, D.

    1983-01-01

    The Equipment Qualification Research Testing (EQRT) program is an evolutionary outgrowth of the Qualification Testing Evaluation (QTE) program at Sandia. The primary emphasis of the program has been qualification methodology research. The EQRT program offers to the industry a research-oriented perspective on qualification-related component performance, as well as refinements to component testing standards which are based upon actual component testing research

  1. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    Science.gov (United States)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et. al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Collection of data occurred during one week at two dairy farms in central California (June, 2016). Each farm varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site with a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.

  2. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autore- gressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show...... that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require...... the choice of a GLS detrending parameter, which our likelihood ratio tests do not....

  3. Evaluation of constraint methodologies applied to a shallow-flaw cruciform bend specimen tested under biaxial loading conditions

    International Nuclear Information System (INIS)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-01-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant depth), shallow surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field. out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies. namely, the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading-biaxiality on fracture toughness, the conventional maximum principal stress criterion indicated no effect

  4. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  5. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU – OLARIU

    2016-07-01

    Full Text Available The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663 and the general accuracy ensured by the ratio (62.6% out-of-sample accuracy. The results confirm the practical utility of the profitability ratio in the prediction of bankruptcy and thus validate the need for further research focused on developing a methodology of analysis.

  6. Progress on qualification testing methodology study of electric cables

    International Nuclear Information System (INIS)

    Yoshida, K.; Seguchi, T.; Okada, S.; Ito, M.; Kusama, Y.; Yagi, T.; Yoshikawa, M.

    1983-01-01

    Many instrumental, control and power cables are installed in nuclear power plants, and these cables contain a large amount of organic polymers as insulating and jacketing materials. They are exposed to radiation at high dose rate, steam at high temperature and chemical (or water) spray simultaneously when a LOCA occurs. Under such conditions, the polymers tend to lose their original properties. For reactor safety, the cables should be functional even if they are subjected to a loss-of-coolant accident (LOCA) at the end of their intended service life. In Japan, cable manufacturers qualify their cables according to the proposed test standard issued from IEEJ in 1982, but the standard still has many unsolved problems or uncertainties which have been dealt with tentatively through the manufacturer-user's agreement. The objectives of this research are to study the methodologies for qualification testing of electric wires and cables, and to provide the improved technical bases for modification of the standard. Research activities are divided into the Accident (LOCA) Testing Methodology and the Accelerated Aging Methodology

  7. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.

  8. Evaluation and testing methodology for evolving entertainment systems

    NARCIS (Netherlands)

    Jurgelionis, A.; Bellotti, F.; IJsselsteijn, W.A.; Kort, de Y.A.W.; Bernhaupt, R.; Tscheligi, M.

    2007-01-01

    This paper presents a testing and evaluation methodology for evolving pervasive gaming and multimedia systems. We introduce the Games@Large system, a complex gaming and multimedia architecture comprised of a multitude of elements: heterogeneous end user devices, wireless and wired network

  9. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  10. 21 CFR 862.1455 - Lecithin/sphingomyelin ratio in amniotic fluid test system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Lecithin/sphingomyelin ratio in amniotic fluid... Clinical Chemistry Test Systems § 862.1455 Lecithin/sphingomyelin ratio in amniotic fluid test system. (a) Identification. A lecithin/sphingomyelin ratio in amniotic fluid test system is a device intended to measure the...

  11. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We...... show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations....

  12. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has been increased along with complexity of hardware blocks. Register validation is a time-consuming and error-pron task. Therefore, we need an efficient way to perform verification with less effort in shorter time. In this work, we suggest register test automation flow based UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or integrate models in test-bench environment because it requires knowledge of SystemVerilog and UVM libraries. For the creation of register models, many commercial tools support a register model generation from register specification described in IP-XACT, but it is time-consuming to describe register specification in IP-XACT format. For easy creation of register model, we propose spreadsheet-based register template which is translated to IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved integrating test-bench and generating test-cases, so that designers may use register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing test sequence for each type of register test-case. With the proposed flow, designers can save considerable amount of time to verify functionality of registers.

  13. Alvar engine. An engine with variable compression ratio. Experiments and tests

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    1998-09-01

    This report is focused on tests with Variable Compression Ratio (VCR) engines, according to the Alvar engine principle. Variable compression ratio means an engine design where it is possible to change the nominal compression ratio. The purpose is to increase the fuel efficiency at part load by increasing the compression ratio. At maximum load, and maybe supercharging with for example turbocharger, it is not possible to keep a high compression ratio because of the knock phenomena. Knock is a shock wave caused by self-ignition of the fuel-air mix. If knock occurs, the engine will be exposed to a destructive load. Because of the reasons mentioned it would be an advantage if it would be possible to change the compression ratio continuously when the load changes. The Alvar engine provides a solution for variable compression ratio based on well-known engine components. This paper provides information about efficiency and emission characteristics from tests with two Alvar engines. Results from tests with a phase shift mechanism (for automatic compression ratio control) for the Alvar engine are also reviewed Examination paper. 5 refs, 23 figs, 2 tabs, 5 appendices

  14. HIV / AIDS prevalence testing - merits, methodology and outcomes ...

    African Journals Online (AJOL)

    HIV / AIDS prevalence testing - merits, methodology and outcomes of a survey conducted at a large mining organisation in South Africa. ... These baseline prevalence data also provide an opportunity for monitoring of proposed interventions using cross-sectional surveys at designated intervals in the future. South African ...

  15. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

    Full Text Available The Sharpe ratio is the prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n−3/2 distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.

  16. The efficiency of the crude oil markets: Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie, E-mail: acharles@audencia.co [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier, E-mail: olivier.darne@univ-nantes.f [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficiency while the WTI crude oil market seems to be inefficiency on the 1994-2008 sub-period, suggesting that the deregulation have not improved the efficiency on the WTI crude oil market in the sense of making returns less predictable.

  17. The efficiency of the crude oil markets. Evidence from variance ratio tests

    International Nuclear Information System (INIS)

    Charles, Amelie; Darne, Olivier

    2009-01-01

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficiency while the WTI crude oil market seems to be inefficiency on the 1994-2008 sub-period, suggesting that the deregulation have not improved the efficiency on the WTI crude oil market in the sense of making returns less predictable. (author)

  18. The efficiency of the crude oil markets. Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficiency while the WTI crude oil market seems to be inefficiency on the 1994-2008 sub-period, suggesting that the deregulation have not improved the efficiency on the WTI crude oil market in the sense of making returns less predictable. (author)

  19. Performance Testing Methodology for Safety-Critical Programmable Logic Controller

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Ji Hyeon; Kim, Sung Ho; Sohn, Se Do

    2009-01-01

    The Programmable Logic Controller (PLC) for use in Nuclear Power Plant safety-related applications is being developed and tested first time in Korea. This safety-related PLC is being developed with requirements of regulatory guideline and industry standards for safety system. To test that the quality of the developed PLC is sufficient to be used in safety critical system, document review and various product testings were performed over the development documents for S/W, H/W, and V/V. This paper provides the performance testing methodology and its effectiveness for PLC platform conducted by KOPEC

  20. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.

  1. Methodology for testing subcomponents; background and motivation for subcomponent testing of wind turbine rotor blades

    DEFF Research Database (Denmark)

    Antoniou, Alexandros; Branner, Kim; Lekou, D.J.

    2016-01-01

    This report aims to provide an overview of the design methodology followed by wind turbine blade structural designers, along with the testing procedure on full scale blades which are followed by testing laboratories for blade manufacturers as required by the relevant standards and certification...... bodies’ recommendations for design and manufacturing verification. The objective of the report is not to criticize the design methodology or testing procedure and the standards thereof followed in the wind energy community, but to identify those items offered by state of the art structural design tools...... investigations performed are based on the INNWIND.EU reference 10MW horizontal axis wind turbine [1]. The structural properties and material and layout definition used within IRPWIND are defined in the INNWIND.EU report [2]. The layout of the report includes a review of the structural analysis models used...

  2. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. integrating these individual decisions. The procedure is recommended to use when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to achieve his final conclusion with high confidence level. (Author).

  3. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. ( 2003 ) proposed a method for the statistical inference of fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect verse the control effect in the context of a time to event endpoint. One of the major concerns using this method in the design of an NI trial is that with a limited sample size, the power of the study is usually very low. This makes an NI trial not applicable particularly when using time to event endpoint. To improve power, Wang et al. ( 2006 ) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced if using the proposed ratio test for a fraction retention NI hypothesis.

  4. A micro focus with macro impact: Exploration of initial abstraction coefficient ratio (λ) in Soil Conservation Curve Number (CN) methodology

    International Nuclear Information System (INIS)

    Ling, L; Yusop, Z

    2014-01-01

    Researchers started to cross examine United States Department of Agriculture (USDA) Soil Conservation Services (SCS) Curve Number (CN) methodology after the technique produced inconsistent results throughout the world. More field data from recent decades were leaning against the assumption of the initial abstraction coefficient ratio value proposed by SCS in 1954. Physiographic conditions were identified as vital influencing factors to be considered under this methodology while practitioners of this method are encouraged to validate and derive regional specific relationship and employ the method with caution

  5. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, bearing torque (which is induced by the hydrodynamic force), as well as other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspects ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flowloop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shapes and disc aspect ratios, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper

  6. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    Science.gov (United States)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipments, is included as an appendix.

  7. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    OpenAIRE

    Daniel BRÎNDESCU – OLARIU

    2016-01-01

    The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663...

  8. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained high scattering austenitic stainless steel. The proposed methodology is comprised of the Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in a 50mm thick coarse grain austenitic stainless steel specimens. Signal to noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum of selected IMF approach. The application of minimisation algorithm with EEMD processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal to noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.

  9. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  10. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    Science.gov (United States)

    Catalano, Ralph A

    2003-09-01

    Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  11. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional

  12. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed by drugs such as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts and some of them are "true zeros" indicating that the drug-adverse event pairs cannot occur, and these zero counts are distinguished from the other zero counts that are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of zero-inflated Poisson model based likelihood ratio test are obtained using the expectation and maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle the stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similar to Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.

  13. Buffer Construction Methodology in Demonstration Test For Cavern Type Disposal Facility

    International Nuclear Information System (INIS)

    Yoshihiro, Akiyama; Takahiro, Nakajima; Katsuhide, Matsumura; Kenji, Terada; Takao, Tsuboya; Kazuhiro, Onuma; Tadafumi, Fujiwara

    2009-01-01

    A number of studies concerning a cavern type disposal facility have been carried out for disposal of low level radioactive waste mainly generated by power plant decommissioning in Japan. The disposal facility is composed of an engineered barrier system with concrete pit and bentonite buffer, and planed to be constructed in sub-surface 50 - 100 meters depth. Though the previous studies have mainly used laboratory and mock-up tests, we conducted a demonstration test in a full-size cavern. The main objectives of the test were to study the construction methodology and to confirm the quality of the engineered barrier system. The demonstration test was planned as the construction of full scale mock-up. It was focused on a buffer construction test to evaluate the construction methodology and quality control in this paper. Bentonite material was compacted to 1.6 Mg/m 3 in-site by large vibrating roller in this test. Through the construction of the buffer part, a 1.6 Mg/m 3 of the density was accomplished, and the data of workability and quality is collected. (authors)

  14. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.

  15. Financial Key Ratios

    OpenAIRE

    Tănase Alin-Eliodor

    2014-01-01

    This article focuses on computing techniques starting from trial balance data regarding financial key ratios. There are presented activity, liquidity, solvency and profitability financial key ratios. It is presented a computing methodology in three steps based on a trial balance.

  16. Reference Performance Test Methodology for Degradation Assessment of Lithium-Sulfur Batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel-Ioan; Purkayastha, Rajlakshmi

    2018-01-01

    Lithium-Sulfur (Li-S) is an emerging battery technology receiving a growing amount of attention due to its potentially high gravimetric energy density, safety, and low production cost. However, there are still some obstacles preventing its swift commercialization. Li-S batteries are driven...... by different electrochemical processes than commonly used Lithium-ion batteries, which often results in very different behavior. Therefore, the testing and modeling of these systems have to be adjusted to reflect their unique behavior and to prevent possible bias. A methodology for a Reference Performance Test...... (RPT) for the Li-S batteries is proposed in this study to point out Li-S battery features and provide guidance to users how to deal with them and possible results into standardization. The proposed test methodology is demonstrated for 3.4 Ah Li-S cells aged under different conditions....

  17. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast to this there is a 'primitive......' approach in which the marginal distribution of a test statistic considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  18. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objectives of this study is the development of methodology by which assessing the optimizes Surveillance Test Interval(STI) and Allowed Outage Time(AOT) using PSA method that can supplement the current deterministic methods and the improvement of Korea nuclear power plants safety. In the first year of this study, the survey about the assessment methodologies, modeling and results performed by domestic and international researches is performed as the basic step before developing the assessment methodology of this study. The assessment methodology that supplement the revealed problems in many other studies is presented and the application of new methodology into the example system assures the feasibility of this method

  19. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objectives of this study is the development of methodology by which assessing the optimizes Surveillance Test Interval(STI) and Allowed Outage Time(AOT) using PSA method that can supplement the current deterministic methods and the improvement of Korea nuclear power plants safety. In the first year of this study, the survey about the assessment methodologies, modeling and results performed by domestic and international researches is performed as the basic step before developing the assessment methodology of this study. The assessment methodology that supplement the revealed problems in many other studies is presented and the application of new methodology into the example system assures the feasibility of this method.

  20. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for test of the expected returns models are largely diffused on the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies proceeded with Brazilian stock market data are concentrated only in the first step of these methodologies. The purpose of this article was test and compare the models CAPM, 3-factors and 4-factors using a predictive methodology, considering two steps – temporal and cross-section regressions – with standard errors obtained by the techniques of Fama and Macbeth (1973. The results indicated the superiority of the 4-fators model as compared to the 3-fators model, and the superiority of the 3- factors model as compared to the CAPM, but no one of the tested models were enough on the explanation of the Brazilian stock returns. Contrary to some empirical evidences, that do not use predictive methodology, the size and momentum effect seem do not exist on the Brazilian capital markets, but there are evidences of the value effect and the relevance of the market for explanation of expected returns. These finds rise some questions, mainly caused by the originality of the methodology on the local market and by the fact that this subject is still incipient and polemic on the Brazilian academic environment.

  1. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: a quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause on that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in the order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is called therefore the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, that is the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of process transport rate over content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of a component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and

  2. Test Methodologies for Hydrogen Sensor Performance Assessment: Chamber vs. Flow Through Test Apparatus: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hartmann, Kevin S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Schmidt, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cebolla, Rafeal O [Joint Research Centre, Petten, the Netherlands; Weidner, Eveline [Joint Research Centre, Petten, the Netherlands; Bonato, Christian [Joint Research Centre, Petten, the Netherlands

    2017-11-06

    Certification of hydrogen sensors to standards often prescribes using large-volume test chambers [1, 2]. However, feedback from stakeholders such as sensor manufacturers and end-users indicate that chamber test methods are often viewed as too slow and expensive for routine assessment. Flow through test methods potentially are an efficient, cost-effective alternative for sensor performance assessment. A large number of sensors can be simultaneously tested, in series or in parallel, with an appropriate flow through test fixture. The recent development of sensors with response times of less than 1s mandates improvements in equipment and methodology to properly capture the performance of this new generation of fast sensors; flow methods are a viable approach for accurate response and recovery time determinations, but there are potential drawbacks. According to ISO 26142 [1], flow through test methods may not properly simulate ambient applications. In chamber test methods, gas transport to the sensor can be dominated by diffusion which is viewed by some users as mimicking deployment in rooms and other confined spaces. Alternatively, in flow through methods, forced flow transports the gas to the sensing element. The advective flow dynamics may induce changes in the sensor behaviour relative to the quasi-quiescent condition that may prevail in chamber test methods. One goal of the current activity in the JRC and NREL sensor laboratories [3, 4] is to develop a validated flow through apparatus and methods for hydrogen sensor performance testing. In addition to minimizing the impact on sensor behaviour induced by differences in flow dynamics, challenges associated with flow through methods include the ability to control environmental parameters (humidity, pressure and temperature) during the test and changes in the test gas composition induced by chemical reactions with upstream sensors. Guidelines on flow through test apparatus design and protocols for the evaluation of

  3. An Intersection–Union Test for the Sharpe Ratio

    Directory of Open Access Journals (Sweden)

    Gabriel Frahm

    2018-04-01

    Full Text Available An intersection–union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection–union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G–7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance.

  4. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  5. Evidence Based Medicine; Positive and Negative Likelihood Ratios of Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Alireza Baratloo

    2015-10-01

    Full Text Available In the previous two parts of educational manuscript series in Emergency, we explained some screening characteristics of diagnostic tests including accuracy, sensitivity, specificity, and positive and negative predictive values. In the 3rd  part we aimed to explain positive and negative likelihood ratio (LR as one of the most reliable performance measures of a diagnostic test. To better understand this characteristic of a test, it is first necessary to fully understand the concept of sensitivity and specificity. So we strongly advise you to review the 1st part of this series again. In short, the likelihood ratios are about the percentage of people with and without a disease but having the same test result. The prevalence of a disease can directly influence screening characteristics of a diagnostic test, especially its sensitivity and specificity. Trying to eliminate this effect, LR was developed. Pre-test probability of a disease multiplied by positive or negative LR can estimate post-test probability. Therefore, LR is the most important characteristic of a test to rule out or rule in a diagnosis. A positive likelihood ratio > 1 means higher probability of the disease to be present in a patient with a positive test. The further from 1, either higher or lower, the stronger the evidence to rule in or rule out the disease, respectively. It is obvious that tests with LR close to one are less practical. On the other hand, LR further from one will have more value for application in medicine. Usually tests with 0.1 < LR > 10 are considered suitable for implication in routine practice.

  6. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S [CISE SpA, Milan (Italy); Crudeli, R [ENEL SpA, Milan (Italy)

    1999-12-31

    CISE and ENEL are involved for years in a miniaturization creep methodology project to obtain similar non-destructive test with the same standard creep test reliability. The goal can be reached with `Small punch creep test` that collect all the requested characteristics; quasi nondestructive disk specimens extracted both on external or internal side of components, than accurately machined and tested on little and cheap apparatus. CISE has developed complete creep small punch procedure that involved peculiar test facility and correlation`s law comparable with the more diffused isostress methodology for residual life evaluation on ex-serviced high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance managing. More added work is need to validate the Small Punch methodology and for relationship calibration on most diffusion high temperature structural materials. First obtained results on a comparative work on ASTM A355 P12 ex-serviced pipe material are presented joint with a description of the Small Punch apparatus realized in CISE. (orig.) 6 refs.

  7. Small punch creep test: A promising methodology for high temperature plant components life evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Tettamanti, S. [CISE SpA, Milan (Italy); Crudeli, R. [ENEL SpA, Milan (Italy)

    1998-12-31

    CISE and ENEL are involved for years in a miniaturization creep methodology project to obtain similar non-destructive test with the same standard creep test reliability. The goal can be reached with `Small punch creep test` that collect all the requested characteristics; quasi nondestructive disk specimens extracted both on external or internal side of components, than accurately machined and tested on little and cheap apparatus. CISE has developed complete creep small punch procedure that involved peculiar test facility and correlation`s law comparable with the more diffused isostress methodology for residual life evaluation on ex-serviced high temperature plant components. The aim of this work is to obtain a simple and immediately applicable relationship useful for plant maintenance managing. More added work is need to validate the Small Punch methodology and for relationship calibration on most diffusion high temperature structural materials. First obtained results on a comparative work on ASTM A355 P12 ex-serviced pipe material are presented joint with a description of the Small Punch apparatus realized in CISE. (orig.) 6 refs.

  8. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

    The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positive reported by the virtual bumper, thereby saving valuable mission time. We use the concept of sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98 relative to traditional virtual bumper safeguarding without speed control.

  9. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    Objectives of this study is the development of methodology by which assesses the optimization of Surveillance Test Interval(STI) and Allowed Outage Time(AOT) using PSA method that can supplement the current deterministic methods and the improvement of Korean nuclear power plants safety. In the first year of this study, the survey about the assessment methodologies, modeling and results performed by domestic and international researches are performed as the basic step before developing the assessment methodology of this study. The assessment methodology that supplement the revealed problems in many other studies is presented and the application of new methodology into the example system assures the feasibility of this method. In the second year of this study, the sensitivity analyses about the failure factors of the components are performed in the bases of the assessment methodologies of the first study, the interaction modeling of the STI and AOT is quantified. And the reliability assessment methodology about the diesel generator is reviewed and applied to the PSA code.

  10. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    Objectives of this study is the development of methodology by which assesses the optimization of Surveillance Test Interval(STI) and Allowed Outage Time(AOT) using PSA method that can supplement the current deterministic methods and the improvement of Korean nuclear power plants safety. In the first year of this study, the survey about the assessment methodologies, modeling and results performed by domestic and international researches are performed as the basic step before developing the assessment methodology of this study. The assessment methodology that supplement the revealed problems in many other studies is presented and the application of new methodology into the example system assures the feasibility of this method. In the second year of this study, the sensitivity analyses about the failure factors of the components are performed in the bases of the assessment methodologies of the first study, the interaction modeling of the STI and AOT is quantified. And the reliability assessment methodology about the diesel generator is reviewed and applied to the PSA code

  11. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

  12. Dairy cattle sustainability using the emergy methodology: Environmental loading ratio

    Directory of Open Access Journals (Sweden)

    Edmar Eduardo Bassan Mendes

    2012-12-01

    Full Text Available The dairy cattle activity in São Paulo State has been depressed in recent years, evidenced by the reduction of 35.47% of dairy herd between 1996 and 2008 (LUPA and 29.73% in milk production between the census of the IBGE (1995 and 2006. Activity remains in the Agricultural Production Units (UPA that have adopted more intensive systems of milk production, using animals of high genetic potential, management-intensive rotational grazing or agricultural inputs, and with the objective of profit maximization. In face of environmental pressures, the problem is to know the degree of sustainability of milk production. The objective in this work was to analyze the production of milk from a farm in the municipality of Guzolândia, São Paulo State, during the period 2005/2011, using the emergy methodology to assess the sustainability of system, calculated by Environmental Loading Ratio (ELR. The UPA Alto da Araúna is dedicated to dairy cattle adopting the system of milk production semi-intensive type B; it produces on average 650 liters of milk per day with 45 lactating cows, using 30 ha of pasture with supplemental feed and silage. It has sandy soil, classified as latossol red, yellow, ortho phase, with gently rolling slopes. The UPA is administered with business structure, aiming to profit maximization and minimization of environmental impacts, seeking to maintain economically viable activity and preserving the environment. Currently, administrative decisions have the support of operational control that collects and records information necessary to generate animal and agricultural indexes that evaluate the performance of the UPA, in addition to managerial accounting records that generate cash flow information used to evaluate the economic efficiency of the UPA. The Environmental Loading Ratio (ELR=N+F/R is obtained by the ratio of natural non-renewable resources (N plus economic resources (F by total renewable emergy (R. It is an indicator of the

  13. A methodology to investigate size scale effects in crystalline plasticity using uniaxial compression testing

    International Nuclear Information System (INIS)

    Uchic, Michael D.; Dimiduk, Dennis M.

    2005-01-01

    A methodology for performing uniaxial compression tests on samples having micron-size dimensions is presented. Sample fabrication is accomplished using focused ion beam milling to create cylindrical samples of uniform cross-section that remain attached to the bulk substrate at one end. Once fabricated, samples are tested in uniaxial compression using a nanoindentation device outfitted with a flat tip, and a stress-strain curve is obtained. The methodology can be used to examine the plastic response of samples of different sizes that are from the same bulk material. In this manner, dimensional size effects at the micron scale can be explored for single crystals, using a readily interpretable test that minimizes imposed stretch and bending gradients. The methodology was applied to a single-crystal Ni superalloy and a transition from bulk-like to size-affected behavior was observed for samples 5 μm in diameter and smaller

  14. Methodology and application of 13C breath test in gastroenterology practice

    International Nuclear Information System (INIS)

    Yan Weili; Jiang Yibin

    2002-01-01

    13 C breath test has been widely used in research of nutrition, pharmacology and gastroenterology for its properties such as safety, non-invasion and so on. The author describes the principle, methodology of 13 C breath test and its application in detection to Helico-bacteria pylori infection in stomach and small bowl bacterial overgrowth, measurement of gastric emptying, pancreatic exocrine function and liver function with various substrates

  15. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Cho, Jae Seon; Huh, Chang Wook; Kim, Do Hyoung; Kim, Ju Youl; Kim, Yoon Ik; Yang, Hui Chang; Park, Kang Min [Seoul National Univ., Seoul (Korea, Republic of)

    1998-03-15

    The objectives of this study is the development of methodology by which assesses the optimization of Surveillance Test Internal(STI) and Allowed Outage Time(AOT) using PSA method that can supplement the current deterministic methods and the improvement of Korean nuclear power plant safety. In this study, the survey about the assessment methodologies, modelings and results performed by domestic and international researches are performed as the basic step before developing the assessment methodology of this study. The assessment methodology that supplement the revealed problems in many other studies is presented and the application of new methodology into the example system assures the feasibility of this method. The sensitivity analyses about the failure factors of the components are performed in the bases of the and AOT is quantified. And the reliability assessment methodology about the diesel generator is reviewed and applied to the PSA code. The qualitative assessment for the STI/AOR of RPS/ESFAS assured safety the most important system in the nuclear power plant are performed.

  16. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Jet-Surface Interaction: High Aspect Ratio Nozzle Test, Nozzle Design and Preliminary Data

    Science.gov (United States)

    Brown, Clifford; Dippold, Vance

    2015-01-01

    The Jet-Surface Interaction High Aspect Ratio (JSI-HAR) nozzle test is part of an ongoing effort to measure and predict the noise created when an aircraft engine exhausts close to an airframe surface. The JSI-HAR test is focused on parameters derived from the Turbo-electric Distributed Propulsion (TeDP) concept aircraft which include a high-aspect ratio mailslot exhaust nozzle, internal septa, and an aft deck. The size and mass flow rate limits of the test rig also limited the test nozzle to a 16:1 aspect ratio, half the approximately 32:1 on the TeDP concept. Also, unlike the aircraft, the test nozzle must transition from a single round duct on the High Flow Jet Exit Rig, located in the AeroAcoustic Propulsion Laboratory at the NASA Glenn Research Center, to the rectangular shape at the nozzle exit. A parametric nozzle design method was developed to design three low noise round-to-rectangular transitions, with 8:1, 12:1, and 16: aspect ratios, that minimizes flow separations and shocks while providing a flat flow profile at the nozzle exit. These designs validated using the WIND-US CFD code. A preliminary analysis of the test data shows that the actual flow profile is close to that predicted and that the noise results appear consistent with data from previous, smaller scale, tests. The JSI-HAR test is ongoing through October 2015. The results shown in the presentation are intended to provide an overview of the test and a first look at the preliminary results.

  18. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    Science.gov (United States)

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research of this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress strain curves of the perpendicular loading axes. Results for this methodology show images of a tissue specimen loaded and a finite verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.

  19. Comparison of two bond strength testing methodologies for bilayered all-ceramics

    NARCIS (Netherlands)

    Dundar, Mine; Ozcan, Mutlu; Gokce, Bulent; Comlekoglu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    Objectives. This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Methods. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite to

  20. SOLVENCY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU–OLARIU

    2016-08-01

    Full Text Available The current study evaluates the potential of the solvency ratio in predicting corporate bankruptcy. The research is focused on Romania and, in particular, on Timis County. The interest for the solvency ratio was based on the recommendations of the scientific literature, as well as on the availability of information concerning its values to all stakeholders. The event on which the research was focused was represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were performed over 2 paired samples of 1176 companies in total. The methodology employed in evaluating the potential of the solvency ratio was based on the Area Under the ROC Curve (0.646 and the general accuracy ensured by the ratio (64.5% out-of-sample accuracy. The results confirm the practical utility of the solvency ratio in the prediction of bankruptcy.

  1. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  2. Improvement in decay ratio calculation in LAPUR5 methodology for BWR instability

    International Nuclear Information System (INIS)

    Li Hsuannien; Yang Tzungshiue; Shih Chunkuan; Wang Jongrong; Lin Haotzu

    2009-01-01

    LAPUR5, based on frequency domain approach, is a computer code that analyzes the core stability and calculates decay ratios (DRs) of boiling water nuclear reactors. In current methodology, one set of parameters (three friction multipliers and one density reactivity coefficient multiplier) is chosen for LAPUR5 input files, LAPURX and LAPURW. The calculation stops and DR for this particular set of parameters is obtained when the convergence criteria (pressure, mass flow rate) are first met. However, there are other sets of parameters which could also meet the same convergence criteria without being identified. In order to cover these ranges of parameters, we developed an improved procedure to calculate DR in LAPUR5. First, we define the ranges and increments of those dominant input parameters in the input files for DR loop search. After LAPUR5 program execution, we can obtain all DRs for every set of parameters which satisfy the converge criteria in one single operation. The part for loop search procedure covers those steps in preparing LAPURX and LAPURW input files. As a demonstration, we looked into the reload design of Kuosheng Unit 2 Cycle 22. We found that the global DR has a maximum at exposure of 9070 MWd/t and the regional DR has a maximum at exposure of 5770 MWd/t. It should be noted that the regional DR turns out to be larger than the global ones for exposures less than 5770 MWd/t. Furthermore, we see that either global or regional DR by the loop search method is greater than the corresponding values from our previous approach. It is concluded that the loop search method can reduce human error and save human labor as compared with the previous version of LAPUR5 methodology. Now the maximum DR can be effectively obtained for a given plant operating conditions and a more precise stability boundary, with less uncertainty, can be plotted on plant power/flow map. (author)

  3. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    Science.gov (United States)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-05-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.

  4. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, filets, and material interfaces. This will be a continuation of the previous years work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.

  5. An evaluation of damping ratios for HVAC duct systems using vibration test data

    International Nuclear Information System (INIS)

    Gunyasu, K.; Horimizu, Y.; Kawakami, A.; Iokibe, H.; Yamazaki, T.

    1988-01-01

    The function of Heating Ventilating Air Conditioning (HVAC) systems must be maintained including HVAC duct systems to keep the operation of safety-related equipment in nuclear power plants during earthquake excitations. Therefore, it is important to carry out seismic design for HVAC duct systems. In the previous aseismic design for HVAC duct systems, the 0.5% damping ratio has been used in Japan. In recent years, vibration tests, held on actual duct systems in nuclear power plants and mockup duct systems were performed in order to investigate damping ratios for HVAC duct systems. Based on the results, it was confirmed that the damping ratio for HVAC duct systems, evaluated from these tests, were much greater than the 0.5% damping ratio used in the previous aseismic design of Japan. The new damping ratio in aseismic design was proposed to be 2.5%. The present paper describes the results of the above mentioned investigation

  6. Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.

    Science.gov (United States)

    Osterlind, Steven J.; Martois, John S.

    This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…

  7. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling\\'s tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.

  8. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.

  9. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  10. A bench-scale biotreatability methodology to evaluate field bioremediation

    International Nuclear Information System (INIS)

    Saberiyan, A.G.; MacPherson, J.R. Jr.; Moore, R.; Pruess, A.J.; Andrilenas, J.S.

    1995-01-01

    A bench-scale biotreatability methodology was designed to assess field bioremediation of petroleum contaminated soil samples. This methodology was performed successfully on soil samples from more than 40 sites. The methodology is composed of two phases, characterization and experimentation. The first phase is physical, chemical, and biological characterization of the contaminated soil sample. This phase determines soil parameters, contaminant type, presence of indigenous contaminant-degrading bacteria, and bacterial population size. The second phase, experimentation, consists of a respirometry test to measure the growth of microbes indirectly (via generation of CO 2 ) and the consumption of their food source directly (via contaminant loss). Based on a Monod kinetic analysis, the half-life of a contaminant can be calculated. Abiotic losses are accounted for based on a control test. The contaminant molecular structure is used to generate a stoichiometric equation. The stoichiometric equation yields a theoretical ratio for mg of contaminant degraded per mg of CO 2 produced. Data collected from the respirometry test are compared to theoretical values to evaluate bioremediation feasibility

  11. Methodology for predicting the life of waste-package materials, and components using multifactor accelerated life tests

    International Nuclear Information System (INIS)

    Thomas, R.E.; Cote, R.W.

    1983-09-01

    Accelerated life tests are essential for estimating the service life of waste-package materials and components. A recommended methodology for generating accelerated life tests is described in this report. The objective of the methodology is to define an accelerated life test program that is scientifically and statistically defensible. The methodology is carried out using a select team of scientists and usually requires 4 to 12 man-months of effort. Specific agendas for the successive meetings of the team are included in the report for use by the team manager. The agendas include assignments for the team scientists and a different set of assignments for the team statistician. The report also includes descriptions of factorial tables, hierarchical trees, and associated mathematical models that are proposed as technical tools to guide the efforts of the design team

  12. Methodology to identify risk-significant components for inservice inspection and testing

    International Nuclear Information System (INIS)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.; Kido, C.; Phillips, J.H.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues

  13. Accelerated lifetime testing methodology for lifetime estimation of Lithium-ion batteries used in augmented wind power plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2013-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process can be used for the parametrization of a performance-degradation lifetime model. In the proposed...... methodology both calendar and cycling lifetime tests are considered since both components are influencing the lifetime of Lithium-ion batteries. The methodology proposes also a lifetime model verification stage, where Lithium-ion battery cells are tested at normal operating conditions using an application...

  14. Testing Strategies and Methodologies for the Max Launch Abort System

    Science.gov (United States)

    Schaible, Dawn M.; Yuchnovicz, Daniel E.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) was tasked to develop an alternate, tower-less launch abort system (LAS) as risk mitigation for the Orion Project. The successful pad abort flight demonstration test in July 2009 of the "Max" launch abort system (MLAS) provided data critical to the design of future LASs, while demonstrating the Agency s ability to rapidly design, build and fly full-scale hardware at minimal cost in a "virtual" work environment. Limited funding and an aggressive schedule presented a challenge for testing of the complex MLAS system. The successful pad abort flight demonstration test was attributed to the project s systems engineering and integration process, which included: a concise definition of, and an adherence to, flight test objectives; a solid operational concept; well defined performance requirements, and a test program tailored to reducing the highest flight test risks. The testing ranged from wind tunnel validation of computational fluid dynamic simulations to component ground tests of the highest risk subsystems. This paper provides an overview of the testing/risk management approach and methodologies used to understand and reduce the areas of highest risk - resulting in a successful flight demonstration test.

  15. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less will developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  16. Testing Methodology of Breaking into Secured Storages of Mobile Operational System Google Android

    Directory of Open Access Journals (Sweden)

    Elena Vyacheslavovna Elistratova

    2013-02-01

    Full Text Available The methodology is developed for carrying out the test of breaking into internal storages of mobile operational system Google Android in order to detect security threats for personal data.

  17. Evaluation methodologies for security testing biometric systems beyond technological evaluation

    OpenAIRE

    Fernández Saavedra, María Belén

    2013-01-01

    The main objective of this PhD Thesis is the specification of formal evaluation methodologies for testing the security level achieved by biometric systems when these are working under specific contour conditions. This analysis is conducted through the calculation of the basic technical biometric system performance and its possible variations. To that end, the next two relevant contributions have been developed. The first contribution is the definition of two independent biometric performance ...

  18. Comparison of two bond strength testing methodologies for bilayered all-ceramics.

    Science.gov (United States)

    Dündar, Mine; Ozcan, Mutlu; Gökçe, Bülent; Cömlekoğlu, Erhan; Leite, Fabiola; Valandro, Luiz Felipe

    2007-05-01

    This study compared the shear bond strength (SBS) and microtensile (MTBS) testing methodologies for core and veneering ceramics in four types of all-ceramic systems. Four different ceramic veneer/core combinations, three of which were feldspathic and the other a fluor-apatite to their respectively corresponding cores, namely leucite-reinforced ceramic ((IPS)Empress, Ivoclar), low leucite-reinforced ceramic (Finesse, Ceramco), glass-infiltrated alumina (In-Ceram Alumina, Vita) and lithium disilicate ((IPS)Empress 2, Ivoclar) were used for SBS and MTBS tests. Ceramic cores (N=40, n=10/group for SBS test method, N=5 blocks/group for MTBS test method) were fabricated according to the manufacturers' instructions (for SBS: thickness, 3mm; diameter, 5mm and for MTBS: 10 mm x 10 mm x 2 mm) and ultrasonically cleaned. The veneering ceramics (thickness: 2mm) were vibrated and condensed in stainless steel moulds and fired onto the core ceramic materials. After trying the specimens in the mould for minor adjustments, they were again ultrasonically cleaned and embedded in PMMA. The specimens were stored in distilled water at 37 degrees C for 1 week and bond strength tests were performed in universal testing machines (cross-head speed: 1mm/min). The bond strengths (MPa+/-S.D.) and modes of failures were recorded. Significant difference between the two test methods and all-ceramic types were observed (P<0.05) (2-way ANOVA, Tukey's test and Bonferroni). The mean SBS values for veneering ceramic to lithium disilicate was significantly higher (41+/-8 MPa) than those to low leucite (28+/-4 MPa), glass-infiltrated (26+/-4 MPa) and leucite-reinforced (23+/-3 MPa) ceramics, while the mean MTBS for low leucite ceramic was significantly higher (15+/-2 MPa) than those of leucite (12+/-2 MPa), glass-infiltrated (9+/-1 MPa) and lithium disilicate ceramic (9+/-1 MPa) (ANOVA, P<0.05). Both the testing methodology and the differences in chemical compositions of the core and veneering ceramics

  19. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less will developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  20. Accelerated Lifetime Testing Methodology for Lifetime Estimation of Lithium-ion Batteries used in Augmented Wind Power Plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2014-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium ion batteries. The results obtained at the end of the accelerated ageing process were used for the parametrization of a performance-degradation lifetime model, which is able to predict...... both the capacity fade and the power capability decrease of the selected Lithium-ion battery cells. In the proposed methodology both calendar and cycling lifetime tests were considered since both components are influencing the lifetime of Lithium-ion batteries. Furthermore, the proposed methodology...

  1. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose 1 2 , ,..., n X X X is a random sample from Np ( ,V distribution. Consider 0 1 2 : ... 0 p H      and1 : 0 for 1, 2,..., i H   i  p , let 1 0 H  H denote the hypothesis that 1 H holds but 0 H does not, and let ~ 0 H denote thehypothesis that 0 H does not hold. Because the likelihood ratio test (LRT of 0 H versus 1 0 H  H is complicated, severalad hoc tests have been proposed. Tang, Gnecco and Geller (1989 proposed an approximate LRT, Follmann (1996 suggestedrejecting 0 H if the usual test of 0 H versus ~ 0 H rejects 0 H with significance level 2 and a weighted sum of the samplemeans is positive, and Chongcharoen, Singh and Wright (2002 modified Follmann’s test to include information about thecorrelation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006 give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. With LRT’s scale invariant desiredproperty, we investigate its powers by using Monte Carlo techniques and compare them with the tests which we recommendin Chongcharoen and Wright (2007, 2006.

  2. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    Preloading technique was used as a means of an accelerated testing methodology in constant stress-rate (dynamic fatigue) testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics including SiC whisker-reinforced composite silicon nitride and 96 wt% alumina were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism that is associated with such a phenomenon of considerable strength increase or decrease.

  3. Hydrologic testing methodology and results from deep basalt boreholes

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A.; Jackson, R.L.; Pidcoe, W.W.

    1982-05-01

    The objective of the hydrologic field-testing program is to provide data for characterization of the groundwater systems wihin the Pasco Basin that are significant to understanding waste isolation. The effort is directed toward characterizing the areal and vertical distributions of hydraulic head, hydraulic properties, and hydrochemistry. Data obtained from these studies provide input for numerical modeling of groundwater flow and solute transport. These models are then used for evaluating potential waste migration as a function of space and time. The groundwater system beneath the Hanford Site and surrounding area consists of a thick, accordantly layered sequence of basalt flows and associated sedimentary interbed that primarily occur in the upper part of the Columbia River basalt. Permeable horizons of the sequence are associated with the interbeds and the interflow zones within the basalt. The columnar interiors of a flow act as low-permeability aquitards, separating the more-permeable interflows or interbeds. This paper discusses the hydrologic field-gathering activities, specifically, field-testing methodology and test results from deep basalt boreholes

  4. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over conservatism.

  5. The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.

    Science.gov (United States)

    Greer, John T.

    Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…

  6. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    Science.gov (United States)

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests.of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  7. Three-dimensional stereo by photometric ratios

    International Nuclear Information System (INIS)

    Wolff, L.B.; Angelopoulou, E.

    1994-01-01

    We present a methodology for corresponding a dense set of points on an object surface from photometric values for three-dimensional stereo computation of depth. The methodology utilizes multiple stereo pairs of images, with each stereo pair being taken of the identical scene but under different illumination. With just two stereo pairs of images taken under two different illumination conditions, a stereo pair of ratio images can be produced, one for the ratio of left-hand images and one for the ratio of right-hand images. We demonstrate how the photometric ratios composing these images can be used for accurate correspondence of object points. Object points having the same photometric ratio with respect to two different illumination conditions constitute a well-defined equivalence class of physical constraints defined by local surface orientation relative to illumination conditions. We formally show that for diffuse reflection the photometric ratio is invariant to varying camera characteristics, surface albedo, and viewpoint and that therefore the same photometric ratio in both images of a stereo pair implies the same equivalence class of physical constraints. The correspondence of photometric ratios along epipolar lines in a stereo pair of images under different illumination conditions is a correspondence of equivalent physical constraints, and the determination of depth from stereo can be performed. Whereas illumination planning is required, our photometric-based stereo methodology does not require knowledge of illumination conditions in the actual computation of three-dimensional depth and is applicable to perspective views. This technique extends the stereo determination of three-dimensional depth to smooth featureless surfaces without the use of precisely calibrated lighting. We demonstrate experimental depth maps from a dense set of points on smooth objects of known ground-truth shape, determined to within 1% depth accuracy

  8. Measuring liquidity on stock market: impact on liquidity ratio

    OpenAIRE

    Siniša Bogdan; Suzana Bareša; Saša Ivanović

    2012-01-01

    The purpose – It is important to emphasize that liquidity on Croatian stock market is low, the purpose of this paper is to test empirically and find out which variables make crucial role in decision making process of investing in stocks. Design – This paper explores the impact of various liquidity variables on liquidity ratio since it is still insufficiently researched topic. Methodology –This research uses secondary and primary data available from Croatian stock market. Considering pri...

  9. Bankruptcy Prediction Based on the Autonomy Ratio

    Directory of Open Access Journals (Sweden)

    Daniel Brîndescu Olariu

    2016-11-01

    Full Text Available The theory and practice of the financial ratio analysis suggest the existence of a negative correlation between the autonomy ratio and the bankruptcy risk. Previous studies conducted on a sample of companies from Timis County (largest county in Romania confirm this hypothesis and recommend the autonomy ratio as a useful tool for measuring the bankruptcy risk two years in advance. The objective of the current research was to develop a methodology for measuring the bankruptcy risk that would be applicable for the companies from the Timis County (specific methodologies are considered necessary for each region. The target population consisted of all the companies from Timis County with annual sales of over 10,000 lei (aprox. 2,200 Euros. The research was performed over all the target population. The study has thus included 53,252 yearly financial statements from the period 2007 – 2010. The results of the study allow for the setting of benchmarks, as well as the configuration of a methodology of analysis. The proposed methodology cannot predict with perfect accuracy the state of the company, but it allows for a valuation of the risk level to which the company is subjected.

  10. Nondestructive Semistatic Testing Methodology for Assessing Fish Textural Characteristics via Closed-Form Mathematical Expressions

    Directory of Open Access Journals (Sweden)

    D. Dimogianopoulos

    2017-01-01

    Full Text Available This paper presents a novel methodology based on semistatic nondestructive testing of fish for the analytical computation of its textural characteristics via closed-form mathematical expressions. The novelty is that, unlike alternatives, explicit values for both stiffness and viscoelastic textural attributes may be computed, even if fish of different size/weight are tested. Furthermore, the testing procedure may be adapted to the specifications (sampling rate and accuracy of the available equipment. The experimental testing involves a fish placed on the pan of a digital weigh scale, which is subsequently tested with a ramp-like load profile in a custom-made installation. The ramp slope is (to some extent adjustable according to the specification (sampling rate and accuracy of the equipment. The scale’s reaction to fish loading, namely, the reactive force, is collected throughout time and is shown to depend on the fish textural attributes according to a closed-form mathematical formula. The latter is subsequently used along with collected data in order to compute these attributes rapidly and effectively. Four whole raw sea bass (Dicentrarchus labrax of various sizes and textures were tested. Changes in texture, related to different viscoelastic characteristics among the four fish, were correctly detected and quantified using the proposed methodology.

  11. Testing methodologies and systems for semiconductor optical amplifiers

    Science.gov (United States)

    Wieckowski, Michael

    Semiconductor optical amplifiers (SOA's) are gaining increased prominence in both optical communication systems and high-speed optical processing systems, due primarily to their unique nonlinear characteristics. This in turn, has raised questions regarding their lifetime performance reliability and has generated a demand for effective testing techniques. This is especially critical for industries utilizing SOA's as components for system-in-package products. It is important to note that very little research to date has been conducted in this area, even though production volume and market demand has continued to increase. In this thesis, the reliability of dilute-mode InP semiconductor optical amplifiers is studied experimentally and theoretically. The aging characteristics of the production level devices are demonstrated and the necessary techniques to accurately characterize them are presented. In addition, this work proposes a new methodology for characterizing the optical performance of these devices using measurements in the electrical domain. It is shown that optical performance degradation, specifically with respect to gain, can be directly qualified through measurements of electrical subthreshold differential resistance. This metric exhibits a linear proportionality to the defect concentration in the active region, and as such, can be used for prescreening devices before employing traditional optical testing methods. A complete theoretical analysis is developed in this work to explain this relationship based upon the device's current-voltage curve and its associated leakage and recombination currents. These results are then extended to realize new techniques for testing semiconductor optical amplifiers and other similarly structured devices. These techniques can be employed after fabrication and during packaged operation through the use of a proposed stand-alone testing system, or using a proposed integrated CMOS self-testing circuit. Both methods are capable

  12. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    Science.gov (United States)

    2018-02-01

    29 during Soldier Equipment Configuration Impact on Performance: Establishing a Test Methodology for the...during ACSM’S resource manual for exercise testing and prescription Human Movement Science, 31(2), Proceedings of the 2016 American Biomechanics...Performance of Medium Rucksack Prototypes An investigation: Comparison of live-fire and weapon simulator test methodologies and the of three extremity armor

  13. ANALISIS PENGARUH LDR, NPL DAN OPERATIONAL EFFICIENCY RATIO TERHADAP RETURN ON ASSETS PADA BANK DEVISA DI INDONESIA PERIODE 2010-2012

    OpenAIRE

    Hamidah Hamidah; Goldan Merion Siallagan; Umi Mardiyati

    2014-01-01

    This research is performed on order to test analysis the influence of the Loan to Deposit Ratio (LDR), Non Performing Loan (NPL) and Operational Efficiency Ratio (OER) toward Return On Asset (ROA) On Foreign Exchange Banks In Indonesia Period 2010-2012. Methodology research as the sample used purposive sampling, samplewas accured fromforeign banks in Indonesia. Data analysis with multi liniearregression of ordinary least square and hypotheses test used t-statistic and Fstatistic, a classic as...

  14. Effect of home testing of international normalized ratio on clinical events.

    Science.gov (United States)

    Matchar, David B; Jacobson, Alan; Dolor, Rowena; Edson, Robert; Uyeda, Lauren; Phibbs, Ciaran S; Vertrees, Julia E; Shih, Mei-Chiung; Holodniy, Mark; Lavori, Philip

    2010-10-21

    Warfarin anticoagulation reduces thromboembolic complications in patients with atrial fibrillation or mechanical heart valves, but effective management is complex, and the international normalized ratio (INR) is often outside the target range. As compared with venous plasma testing, point-of-care INR measuring devices allow greater testing frequency and patient involvement and may improve clinical outcomes. We randomly assigned 2922 patients who were taking warfarin because of mechanical heart valves or atrial fibrillation and who were competent in the use of point-of-care INR devices to either weekly self-testing at home or monthly high-quality testing in a clinic. The primary end point was the time to a first major event (stroke, major bleeding episode, or death). The patients were followed for 2.0 to 4.75 years, for a total of 8730 patient-years of follow-up. The time to the first primary event was not significantly longer in the self-testing group than in the clinic-testing group (hazard ratio, 0.88; 95% confidence interval, 0.75 to 1.04; P=0.14). The two groups had similar rates of clinical outcomes except that the self-testing group reported more minor bleeding episodes. Over the entire follow-up period, the self-testing group had a small but significant improvement in the percentage of time during which the INR was within the target range (absolute difference between groups, 3.8 percentage points; P<0.001). At 2 years of follow-up, the self-testing group also had a small but significant improvement in patient satisfaction with anticoagulation therapy (P=0.002) and quality of life (P<0.001). As compared with monthly high-quality clinic testing, weekly self-testing did not delay the time to a first stroke, major bleeding episode, or death to the extent suggested by prior studies. These results do not support the superiority of self-testing over clinic testing in reducing the risk of stroke, major bleeding episode, and death among patients taking warfarin

  15. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    Science.gov (United States)

    Juarez, Alfredo; Harper, Susana Tapia

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  16. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    Full Text Available The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today’s era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arise when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.

  17. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    Science.gov (United States)

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  18. Contribution to the problem of liquidity ratios

    OpenAIRE

    Dvoøáèek Jaroslav

    1997-01-01

    The article is based on the importance of the financial analysis in mining industry. The author pays attention to liquidity ratios given in literature from the standpoint of their number, content, units and recommended quantity value of single ratios. For the application in practice two liquidity ratios are suggested and the methodology of their recommended values determination is given.

  19. A likelihood ratio test for species membership based on DNA sequence data

    DEFF Research Database (Denmark)

    Matz, Mikhail V.; Nielsen, Rasmus

    2005-01-01

    DNA barcoding as an approach for species identification is rapidly increasing in popularity. However, it remains unclear which statistical procedures should accompany the technique to provide a measure of uncertainty. Here we describe a likelihood ratio test which can be used to test if a sampled...... sequence is a member of an a priori specified species. We investigate the performance of the test using coalescence simulations, as well as using the real data from butterflies and frogs representing two kinds of challenge for DNA barcoding: extremely low and extremely high levels of sequence variability....

  20. The effects of multiple features of alternatively spliced exons on the KA/KS ratio test

    Directory of Open Access Journals (Sweden)

    Chen Feng-Chi

    2006-05-01

    Full Text Available Abstract Background The evolution of alternatively spliced exons (ASEs is of primary interest because these exons are suggested to be a major source of functional diversity of proteins. Many exon features have been suggested to affect the evolution of ASEs. However, previous studies have relied on the KA/KS ratio test without taking into consideration information sufficiency (i.e., exon length > 75 bp, cross-species divergence > 5% of the studied exons, leading to potentially biased interpretations. Furthermore, which exon feature dominates the results of the KA/KS ratio test and whether multiple exon features have additive effects have remained unexplored. Results In this study, we collect two different datasets for analysis – the ASE dataset (which includes lineage-specific ASEs and conserved ASEs and the ACE dataset (which includes only conserved ASEs. We first show that information sufficiency can significantly affect the interpretation of relationship between exons features and the KA/KS ratio test results. After discarding exons with insufficient information, we use a Boolean method to analyze the relationship between test results and four exon features (namely length, protein domain overlapping, inclusion level, and exonic splicing enhancer (ESE frequency for the ASE dataset. We demonstrate that length and protein domain overlapping are dominant factors, and they have similar impacts on test results of ASEs. In addition, despite the weak impacts of inclusion level and ESE motif frequency when considered individually, combination of these two factors still have minor additive effects on test results. However, the ACE dataset shows a slightly different result in that inclusion level has a marginally significant effect on test results. Lineage-specific ASEs may have contributed to the difference. Overall, in both ASEs and ACEs, protein domain overlapping is the most dominant exon feature while ESE frequency is the weakest one in affecting

  1. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    Science.gov (United States)

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  2. Contribution to the problem of liquidity ratios

    Directory of Open Access Journals (Sweden)

    Dvoøáèek Jaroslav

    1997-03-01

    Full Text Available The article is based on the importance of the financial analysis in mining industry. The author pays attention to liquidity ratios given in literature from the standpoint of their number, content, units and recommended quantity value of single ratios. For the application in practice two liquidity ratios are suggested and the methodology of their recommended values determination is given.

  3. The patients' perspective of international normalized ratio self-testing, remote communication of test results and confidence to move to self-management.

    Science.gov (United States)

    Grogan, Anne; Coughlan, Michael; Prizeman, Geraldine; O'Connell, Niamh; O'Mahony, Nora; Quinn, Katherine; McKee, Gabrielle

    2017-12-01

    To elicit the perceptions of patients, who self-tested their international normalized ratio and communicated their results via a text or phone messaging system, to determine their satisfaction with the education and support that they received and to establish their confidence to move to self-management. Self-testing of international normalized ratio has been shown to be reliable and is fast becoming common practice. As innovations are introduced to point of care testing, more research is needed to elicit patients' perceptions of the self-testing process. This three site study used a cross-sectional prospective descriptive survey. Three hundred and thirty patients who were prescribed warfarin and using international normalized ratio self-testing were invited to take part in the study. The anonymous survey examined patient profile, patients' usage, issues, perceptions, confidence and satisfaction with using the self-testing system and their preparedness for self-management of warfarin dosage. The response rate was 57% (n = 178). Patients' confidence in self-testing was high (90%). Patients expressed a high level of satisfaction with the support received, but expressed the need for more information on support groups, side effects of warfarin, dietary information and how to dispose of needles. When asked if they felt confident to adjust their own warfarin levels 73% agreed. Chi-squared tests for independence revealed that none of the patient profile factors examined influenced this confidence. The patients cited the greatest advantages of the service were reduced burden, more autonomy, convenience and ease of use. The main disadvantages cited were cost and communication issues. Patients were satisfied with self-testing. The majority felt they were ready to move to self-management. The introduction of innovations to remote point of care testing, such as warfarin self-testing, needs to have support at least equal to that provided in a hospital setting. © 2017 John

  4. Test methodology and technology of fracture toughness for small size specimens

    Energy Technology Data Exchange (ETDEWEB)

    Wakai, E.; Takada, F.; Ishii, T.; Ando, M. [Japan Atomic Energy Agency, Naga-gun, Ibaraki-ken (Japan); Matsukawa, S. [JNE Techno-Research Co., Kanagawa-ken (Japan)

    2007-07-01

    Full text of publication follows: Small specimen test technology (SSTT) is required to investigate mechanical properties in the limited availability of effective irradiation volumes in test reactors and accelerator-based neutron and charged particle sources. The test methodology guideline and the manufacture processes for very small size specimens have not been established, and we would have to formulate it. The technology to control exactly the load and displacement is also required in the test technology under the environment of high dose radiation produced from the specimens. The objective of this study is to examine the test technology and methodology of fracture toughness for very small size specimens. A new bend test machine installed in hot cell has been manufactured to obtain fracture toughness and DBTT (ductile - brittle transition temperature) of reduced-activation ferritic/martensitic steels for small bend specimens of t/2-1/3PCCVN (pre-cracked 1/3 size Charpy V-notch) with 20 mm length and DFMB (deformation and fracture mini bend specimen) with 9 mm length. The new machine can be performed at temperatures from -196 deg. C to 400 deg. C under unloading compliance method. Neutron irradiation was also performed at about 250 deg. C to about 2 dpa in JMTR. After the irradiation, fracture toughness and DBTT were examined by using the machine. Checking of displacement measurement between linear gauge of cross head's displacement and DVRT of the specimen displacement was performed exactly. Conditions of pre-crack due to fatigue in the specimen preparation were also examined and it depended on the shape and size of the specimens. Fracture toughness and DBTT of F82H steel for t/2-1/3PCCVN, DFMB and 0.18DCT specimens before irradiation were examined as a function of temperature. DBTT of smaller size specimens of DFMB was lower than that of larger size specimen of t/2-1/3PCCVN and 0.18DCT. The changes of fracture toughness and DBTT due to irradiation were also

  5. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary.

    Energy Technology Data Exchange (ETDEWEB)

    Brucher, Wenzel (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Koch, Wolfgang (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Pretzsch, Gunter Guido (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Loiseau, Olivier (Institut de Radioprotection et de Surete Nucleaire, France); Mo, Tin (U.S. Nuclear Regulatory Commission, Washington, DC); Billone, Michael C. (Argonne National Laboratory, Argonne, IL); Autrusson, Bruno A. (Institut de Radioprotection et de Surete Nucleaire, France); Young, F. I. (U.S. Nuclear Regulatory Commission, Washington, DC); Coats, Richard Lee; Burtseva, Tatiana (Argonne National Laboratory, Argonne, IL); Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Thompson, Nancy Slater (U.S. Department of Energy, Washington, DC); Hibbs, Russell S. (U.S. Department of Energy, Washington, DC); Gregson, Michael Warren; Lange, Florentin (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Molecke, Martin Alan; Tsai, Han-Chung (Argonne National Laboratory, Argonne, IL)

    2005-07-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  6. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary

    International Nuclear Information System (INIS)

    Brucher, Wenzel; Koch, Wolfgang; Pretzsch, Gunter Guido; Loiseau, Olivier; Mo, Tin; Billone, Michael C.; Autrusson, Bruno A.; Young, F. I.; Coats, Richard Lee; Burtseva, Tatiana; Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver; Thompson, Nancy Slater; Hibbs, Russell S.; Gregson, Michael Warren; Lange, Florentin; Molecke, Martin Alan; Tsai, Han-Chung

    2005-01-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  7. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. Leakage localisation methodology is based on pressure sensitivity matrix. Sensitivity is normalised and binarised using a common threshold for all nodes, so a signatures matrix is obtained. A pressure sensor optimal distribution methodology is developed too, but it is not used in the real test. To validate this...

  8. International normalized ratio self-testing and self-management: improving patient outcomes

    Directory of Open Access Journals (Sweden)

    Pozzi M

    2016-10-01

    Full Text Available Matteo Pozzi,1 Julia Mitchell,2 Anna Maria Henaine,3 Najib Hanna,4 Ola Safi,4 Roland Henaine2 1Department of Adult Cardiac Surgery, “Louis Pradel” Cardiologic Hospital, Lyon, France; 2Department of Congenital Cardiac Surgery, “Louis Pradel” Cardiologic Hospital, Lyon, France; 3Clinical Pharmacology Unit, Lebanese University, Beirut, Lebanon; 4Pediatric Unit, “Hotel Dieu de France” Hospital, Saint Joseph University, Beirut, Lebanon Abstract: Long term oral anti-coagulation with vitamin K antagonists is a risk factor of hemorrhagic or thromebomlic complications. Periodic laboratory testing of international normalized ratio (INR and a subsequent dose adjustment are therefore mandatory. The use of home testing devices to measure INR has been suggested as a potential way to improve the comfort and compliance of the patients and their families, the frequency of monitoring and, finally, the management and safety of long-term oral anticoagulation. In pediatric patients, increased doses to obtain and maintain the therapeutic target INR, more frequent adjustments and INR testing, multiple medication, inconstant nutritional intake, difficult venepunctures, and the need to go to the laboratory for testing (interruption of school and parents’ work attendance highlight those difficulties. After reviewing the most relevant published studies of self-testing and self-management of INR for adult patients and children on oral anticoagulation, it seems that these are valuable and effective strategies of INR control. Despite an unclear relationship between INR control and clinical effects, these self-strategies provide a better control of the anticoagulant effect, improve patients and their family quality of life, and are an appealing solution in term of cost-effectiveness. Structured education and knowledge evaluation by trained health care professionals is required for children, to be able to adjust their dose treatment safely and accurately. However

  9. The TL,NO/TL,CO ratio in pulmonary function test interpretation.

    Science.gov (United States)

    Hughes, J Michael B; van der Lee, Ivo

    2013-02-01

    The transfer factor of the lung for nitric oxide (T(L,NO)) is a new test for pulmonary gas exchange. The procedure is similar to the already well-established transfer factor of the lung for carbon monoxide (T(L,CO)). Physiologically, T(L,NO) predominantly measures the diffusion pathway from the alveoli to capillary plasma. In the Roughton-Forster equation, T(L,NO) acts as a surrogate for the membrane diffusing capacity (D(M)). The red blood cell resistance to carbon monoxide uptake accounts for ~50% of the total resistance from gas to blood, but it is much less for nitric oxide. T(L,NO) and T(L,CO) can be measured simultaneously with the single breath technique, and D(M) and pulmonary capillary blood volume (V(c)) can be estimated. T(L,NO), unlike T(L,CO), is independent of oxygen tension and haematocrit. The T(L,NO)/T(L,CO) ratio is weighted towards the D(M)/V(c) ratio and to α; where α is the ratio of physical diffusivities of NO to CO (α=1.97). The T(L,NO)/T(L,CO) ratio is increased in heavy smokers, with and without computed tomography evidence of emphysema, and reduced in the voluntary restriction of lung expansion; it is expected to be reduced in chronic heart failure. The T(L,NO)/T(L,CO) ratio is a new index of gas exchange that may, more than derivations from them of D(M) and V(c) with their in-built assumptions, give additional insights into pulmonary pathology.

  10. Do exchange rates follow random walks? A variance ratio test of the ...

    African Journals Online (AJOL)

    The random-walk hypothesis in foreign-exchange rates market is one of the most researched areas, particularly in developed economies. However, emerging markets in sub-Saharan Africa have received little attention in this regard. This study applies Lo and MacKinlay's (1988) conventional variance ratio test and Wright's ...

  11. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    Science.gov (United States)

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  12. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    Science.gov (United States)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiplespecimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clasped on the shaker armature with specially-designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses

  13. The effect of instructional methodology on high school students natural sciences standardized tests scores

    Science.gov (United States)

    Powell, P. E.

    Educators have recently come to consider inquiry based instruction as a more effective method of instruction than didactic instruction. Experience based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness on preparing students to perform well on standardized tests. The purpose of the study to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi experimental quantitative study was comprised of two stages. Stage 1 used a survey to identify teaching methods of a convenience sample of 57 teacher participants and determined level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when taught using inquiry based instruction. Subpopulation analyses indicated all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.

  14. The Likelihood Ratio Test of Common Factors under Non-Ideal Conditions

    Directory of Open Access Journals (Sweden)

    Ana M. Angulo

    2011-01-01

    Full Text Available El modelo espacial de Durbin ocupa una posición interesante en econometría espacial. Es la forma reducida de un modelo de corte transversal con dependencia en los errores y puede ser utilizado como ecuación de anidación en un enfoque más general de selección de modelos. En concreto, a partir de esta ecuación puede obtenerse el Ratio de Verosimilitudes conocido como test de Factores Comunes (LRCOM. Como se muestra en Mur y Angulo (2006, este test tiene buenas propiedades si el modelo está correctamente especificado. Sin embargo, por lo que sabemos, no hay referencias en la literatura sobre el comportamiento de este test bajo condiciones no ideales. En concreto, estudiamos el comportamiento del test en los casos de heterocedasticidad, no normalidad, endogeneidad, matrices de contactos densas y no-linealidad. Nuestros resultados ofrecen una visión positiva del test de Factores Comunes que parece una técnica útil en el instrumental propio de la econometría espacial contemporánea.

  15. Estimating negative likelihood ratio confidence when test sensitivity is 100%: A bootstrapping approach.

    Science.gov (United States)

    Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B

    2017-08-01

    Objectives Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the Score CI (as implemented in the StatXact, SAS, and R PropCI package), and (4) the bootstrapping technique. Results The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease, and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0,0.073), StatXact (0,0.064), SAS Score method (0,0.057), R PropCI (0,0.062), and bootstrap (0,0.048). Similar trends were observed for other sample sizes. Conclusions When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and accompanying free open-source R package were developed to yield realistic estimates easily. This

  16. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypotheses that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  17. The Leeb Hardness Test for Rock: An Updated Methodology and UCS Correlation

    Science.gov (United States)

    Corkum, A. G.; Asiri, Y.; El Naggar, H.; Kinakin, D.

    2018-03-01

    The Leeb hardness test (LHT with test value of L D ) is a rebound hardness test, originally developed for metals, that has been correlated with the Unconfined Compressive Strength (test value of σ c ) of rock by several authors. The tests can be carried out rapidly, conveniently and nondestructively on core and block samples or on rock outcrops. This makes the relatively small LHT device convenient for field tests. The present study compiles test data from literature sources and presents new laboratory testing carried out by the authors to develop a substantially expanded database with wide-ranging rock types. In addition, the number of impacts that should be averaged to comprise a "test result" was revisited along with the issue of test specimen size. Correlation for L D and σ c for various rock types is provided along with recommended testing methodology. The accuracy of correlated σ c estimates was assessed and reasonable correlations were observed between L D and σ c . The study findings show that LHT can be useful particularly for field estimation of σ c and offers a significant improvement over the conventional field estimation methods outlined by the ISRM (e.g., hammer blows). This test is rapid and simple, with relatively low equipment costs, and provides a reasonably accurate estimate of σ c .

  18. Investigation of Pockels Cells Crystal Contrast Ratio Distribution

    Directory of Open Access Journals (Sweden)

    Giedrius Sinkevičius

    2017-07-01

    Full Text Available The BBO Pockel’s cell has been investigated. The investigation results of optimal operating area on the surface of the crystal dependent of intrinsic contrast ratio (ICR and voltage contrast ratio (VCR for Pockel’s cell are presented. The block diagram of Pockel’s cells contrast measurement stand and measurement methodology are introduced and discussed. The graphs of intrinsic contrast ratio distribution on crystal surface, contrast ratio with voltage dependency and voltage contrast ratio distribution on crystal surface with half-wave voltage are presented.

  19. Fast Lemons and Sour Boulders: Testing Crossmodal Correspondences Using an Internet-Based Testing Methodology

    Directory of Open Access Journals (Sweden)

    Andy T. Woods

    2013-09-01

    Full Text Available According to a popular family of hypotheses, crossmodal matches between distinct features hold because they correspond to the same polarity on several conceptual dimensions (such as active–passive, good–bad, etc. that can be identified using the semantic differential technique. The main problem here resides in turning this hypothesis into testable empirical predictions. In the present study, we outline a series of plausible consequences of the hypothesis and test a variety of well-established and previously untested crossmodal correspondences by means of a novel internet-based testing methodology. The results highlight that the semantic hypothesis cannot easily explain differences in the prevalence of crossmodal associations built on the same semantic pattern (fast lemons, slow prunes, sour boulders, heavy red; furthermore, the semantic hypothesis only minimally predicts what happens when the semantic dimensions and polarities that are supposed to drive such crossmodal associations are made more salient (e.g., by adding emotional cues that ought to make the good/bad dimension more salient; finally, the semantic hypothesis does not explain why reliable matches are no longer observed once intramodal dimensions with congruent connotations are presented (e.g., visually presented shapes and colour do not appear to correspond.

  20. Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.

    Science.gov (United States)

    South, Scott J.

    1988-01-01

    Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…

  1. A numerical test method of California bearing ratio on graded crushed rocks using particle flow modeling

    Directory of Open Access Journals (Sweden)

    Yingjun Jiang

    2015-04-01

    Full Text Available In order to better understand the mechanical properties of graded crushed rocks (GCRs and to optimize the relevant design, a numerical test method based on the particle flow modeling technique PFC2D is developed for the California bearing ratio (CBR test on GCRs. The effects of different testing conditions and micro-mechanical parameters used in the model on the CBR numerical results have been systematically studied. The reliability of the numerical technique is verified. The numerical results suggest that the influences of the loading rate and Poisson's ratio on the CBR numerical test results are not significant. As such, a loading rate of 1.0–3.0 mm/min, a piston diameter of 5 cm, a specimen height of 15 cm and a specimen diameter of 15 cm are adopted for the CBR numerical test. The numerical results reveal that the CBR values increase with the friction coefficient at the contact and shear modulus of the rocks, while the influence of Poisson's ratio on the CBR values is insignificant. The close agreement between the CBR numerical results and experimental results suggests that the numerical simulation of the CBR values is promising to help assess the mechanical properties of GCRs and to optimize the grading design. Besides, the numerical study can provide useful insights on the mesoscopic mechanism.

  2. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arise when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.

  3. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  4. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  5. Design, manufacture and spin test of high contact ratio helicopter transmission utilizing Self-Aligning Bearingless Planetary (SABP)

    Science.gov (United States)

    Folenta, Dezi; Lebo, William

    1988-01-01

    A 450 hp high ratio Self-Aligning Bearingless Planetary (SABP) for a helicopter application was designed, manufactured, and spin tested under NASA contract NAS3-24539. The objective of the program was to conduct research and development work on a high contact ratio helical gear SABP to reduce weight and noise and to improve efficiency. The results accomplished include the design, manufacturing, and no-load spin testing of two prototype helicopter transmissions, rated at 450 hp with an input speed of 35,000 rpm and an output speed of 350 rpm. The weight power density ratio of these gear units is 0.33 lb hp. The measured airborne noise at 35,000 rpm input speed and light load is 94 dB at 5 ft. The high speed, high contact ratio SABP transmission appears to be significantly lighter and quieter than comtemporary helicopter transmissions. The concept of the SABP is applicable not only to high ratio helicopter type transmissions but also to other rotorcraft and aircraft propulsion systems.

  6. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    Science.gov (United States)

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  7. New evaluation methodology of regenerative braking contribution to energy efficiency improvement of electric vehicles

    International Nuclear Information System (INIS)

    Qiu, Chengqun; Wang, Guolin

    2016-01-01

    Highlights: • Two different contribution ratio evaluation parameters according to the deceleration braking process are proposed. • Methodologies for calculating the contribution made by regenerative brake to improve vehicle energy efficiency are proposed. • Road test results imply that the proposed parameters are effective. - Abstract: Comprehensive research is conducted on the design and control of a regenerative braking system for electric vehicles. The mechanism and evaluation methods of contribution brought by regenerative braking to improve electric vehicle’s energy efficiency are discussed and analyzed by the energy flow. Methodologies for calculating the contribution made by regenerative brake are proposed. Additionally a new regenerative braking control strategy called “serial 2 control strategy” is introduced. Moreover, two control strategies called “parallel control strategy” and “serial 1 control strategy” are proposed as the comparative control strategy. Furthermore, two different contribution ratio evaluation parameters according to the deceleration braking process are proposed. Finally, road tests are carried out under China typical city regenerative driving cycle standard with three different control strategies. The serial 2 control strategy offers considerably higher regeneration efficiency than the parallel strategy and serial 1 strategy.

  8. Tests and Confidence Intervals for an Extended Variance Component Using the Modified Likelihood Ratio Statistic

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Frydenberg, Morten; Jensen, Jens Ledet

    2005-01-01

    The large deviation modified likelihood ratio statistic is studied for testing a variance component equal to a specified value. Formulas are presented in the general balanced case, whereas in the unbalanced case only the one-way random effects model is studied. Simulation studies are presented......, showing that the normal approximation to the large deviation modified likelihood ratio statistic gives confidence intervals for variance components with coverage probabilities very close to the nominal confidence coefficient....

  9. Efficient testing methodologies for microcameras in a gigapixel imaging system

    Science.gov (United States)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging--based on a monocentric optical design--promises revolutionary advances in diverse imaging applications by enabling high resolution, real-time image capture over a wide field-of-view (FOV), including sport broadcast, wide-field microscopy, astronomy, and security surveillance. Recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is a key to successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386--389 (2012).

  10. Methodological examination of UAR-based change detection

    International Nuclear Information System (INIS)

    Racz, A.; Kiss, S.

    1994-07-01

    A methodological examination was performed in order to investigate the applicability of the combination of the well-known Univariate Auto Regressive (UAR) model and the classical binary Sequential Probability Ratio Testing (SPRT) method. The signal was recorded by a vibration detector fixed at a white-noise excited fuel rod. During the experiments, the following abnormality (or minor changes) were simulated: loosening of the detector, changes in the underlying system (constraints and the environment), rod impact. The residual time series were generated by an UAR model while the hypothesis testing was performed by a binary SPRT applied for checking the variation of the variance of the residual. Although the results are very promising, few disturbing effects were recognized also, which seem to be unexplained yet, therefore they need more careful application of this familiar combination. (author) 14 refs.; 21 figs.; 3 tabs

  11. Non-animal methodologies within biomedical research and toxicity testing.

    Science.gov (United States)

    Knight, Andrew

    2008-01-01

    Laboratory animal models are limited by scientific constraints on human applicability, and increasing regulatory restrictions, driven by social concerns. Reliance on laboratory animals also incurs marked - and in some cases, prohibitive - logistical challenges, within high-throughput chemical testing programmes, such as those currently underway within Europe and the US. However, a range of non-animal methodologies is available within biomedical research and toxicity testing. These include: mechanisms to enhance the sharing and assessment of existing data prior to conducting further studies, and physicochemical evaluation and computerised modelling, including the use of structure-activity relationships and expert systems. Minimally-sentient animals from lower phylogenetic orders or early developmental vertebral stages may be used, as well as microorganisms and higher plants. A variety of tissue cultures, including immortalised cell lines, embryonic and adult stem cells, and organotypic cultures, are also available. In vitro assays utilising bacterial, yeast, protozoal, mammalian or human cell cultures exist for a wide range of toxic and other endpoints. These may be static or perfused, and may be used individually, or combined within test batteries. Human hepatocyte cultures and metabolic activation systems offer potential assessment of metabolite activity and organ-organ interaction. Microarray technology may allow genetic expression profiling, increasing the speed of toxin detection, well prior to more invasive endpoints. Enhanced human clinical trials utilising micro- dosing, staggered dosing, and more representative study populations and durations, as well as surrogate human tissues, advanced imaging modalities and human epidemiological, sociological and psycho- logical studies, may increase our understanding of illness aetiology and pathogenesis, and facilitate the development of safe and effective pharmacologic interventions. Particularly when human tissues

  12. A comparison of between hyomental distance ratios, ratio of height to thyromental, modified Mallamapati classification test and upper lip bite test in predicting difficult laryngoscopy of patients undergoing general anesthesia

    Directory of Open Access Journals (Sweden)

    Azim Honarmand

    2014-01-01

    Full Text Available Background: Failed intubation is imperative source of anesthetic interrelated patient′s mortality. The aim of this present study was to compare the ability to predict difficult visualization of the larynx from the following pre-operative airway predictive indices, in isolation and combination: Modified Mallampati test (MMT, the ratio of height to thyromental distance (RHTMD, hyomental distance ratios (HMDR, and the upper-lip-bite test (ULBT. Materials and Methods: We collected data on 525 consecutive patients scheduled for elective surgery under general anesthesia requiring endotracheal intubation and then evaluated all four factors before surgery. A skilled anesthesiologist, not imparted of the noted pre-operative airway assessment, did the laryngoscopy and rating (as per Cormack and Lehane′s classification. Sensitivity, specificity, and positive predictive value for every airway predictor in isolation and in combination were established. Results: The most sensitive of the single tests was ULBT with a sensitivity of 90.2%. The hyomental distance extreme of head extension was the least sensitive of the single tests with a sensitivity of 56.9. The HMDR had sensitivity 86.3%. The ULBT had the highest negative predictive value: And the area under a receiver-operating characteristic curve (AUC of ROC curve among single predictors. The AUC of ROC curve for ULBT, HMDR and RHTMD was significantly more than for MMT (P 0.05. Conclusion: The HMDR is comparable with RHTMD and ULBT for prediction of difficult laryngoscopy in the general population, but was significantly more than for MMT.

  13. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant starts the commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure and degradation, and timely correction of deteriorations. Because of the large number of such activities, emphasis on plant safety and allocation of resources becomes difficult. The probabilistic model and methodology can be effectively used to obtain the risk significance of these activities so that the resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety related areas under development. Since, the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) was performed and now the study is revised taking into account the as-built information, it is recommended to implement into the model the necessary modeling features to support further PSA application, especially related to Test and Maintenance optimization. Methods need to be developed in order to apply the PSA model including risk information together with other needed information for Test and Maintenance optimization. Also, in parallel with the CPSE study updating, the software interface for the PSA model is under development (Risk Monitor Software class), methods and models needing to be developed for the purpose of using it for qualified monitoring of Test and Maintenance Strategy efficiency. Similar, the Data Collection System need to be appropriate for the purpose of an ongoing implementation of a risk - based Test and Maintenance Strategy. (author). 4 refs, 1 fig

  14. Simplified Abrasion Test Methodology for Candidate EVA Glove Lay-Ups

    Science.gov (United States)

    Rabel, Emily; Aitchison, Lindsay

    2015-01-01

    During the Apollo Program, space suit outer-layer fabrics were badly abraded after performing just a few extravehicular activities (EVAs). For example, the Apollo 12 commander reported abrasive wear on the boots that penetrated the outer-layer fabric into the thermal protection layers after less than 8 hrs of surface operations. Current plans for the exploration planetary space suits require the space suits to support hundreds of hours of EVA on a lunar or Martian surface, creating a challenge for space suit designers to utilize materials advances made over the last 40 years and improve on the space suit fabrics used in the Apollo Program. Over the past 25 years the NASA Johnson Space Center Crew and Thermal Systems Division has focused on tumble testing as means of simulating wear on the outer layer of the space suit fabric. Most recently, in 2009, testing was performed on 4 different candidate outer layers to gather baseline data for future use in design of planetary space suit outer layers. In support of the High Performance EVA Glove Element of the Next Generation Life Support Project, testing a new configuration was recently attempted in which require 10% of the fabric per replicate of that need in 2009. The smaller fabric samples allowed for reduced per sample cost and flexibility to test small samples from manufacturers without the overhead to have a production run completed. Data collected from this iteration was compared to that taken in 2009 to validate the new test method. In addition the method also evaluated the fabrics and fabric layups used in a prototype thermal micrometeoroid garment (TMG) developed for EVA gloves under the NASA High Performance EVA Glove Project. This paper provides a review of previous abrasion studies on space suit fabrics, details methodologies used for abrasion testing in this particular study, results of the validation study, and results of the TMG testing.

  15. Orthogonal series generalized likelihood ratio test for failure detection and isolation. [for aircraft control

    Science.gov (United States)

    Hall, Steven R.; Walker, Bruce K.

    1990-01-01

    A new failure detection and isolation algorithm for linear dynamic systems is presented. This algorithm, the Orthogonal Series Generalized Likelihood Ratio (OSGLR) test, is based on the assumption that the failure modes of interest can be represented by truncated series expansions. This assumption leads to a failure detection algorithm with several desirable properties. Computer simulation results are presented for the detection of the failures of actuators and sensors of a C-130 aircraft. The results show that the OSGLR test generally performs as well as the GLR test in terms of time to detect a failure and is more robust to failure mode uncertainty. However, the OSGLR test is also somewhat more sensitive to modeling errors than the GLR test.

  16. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Science.gov (United States)

    Jaspers, Monique W M

    2009-05-01

    Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but requires high skills and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation with a stronger focus on the learnability of a computer application. Major drawbacks of the cognitive walkthrough are the required level of detail of task and user background descriptions for an adequate application of the latest version of the technique. The think aloud is a very direct method to gain deep insight in the problems end users encounter in interaction with a system but data analyses is extensive and requires a high level of expertise both in the cognitive ergonomics and in computer system application domain. Each of the three usability evaluation methods has shown its usefulness, has its own advantages and disadvantages; no single method has revealed any significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that compliment one another should preferably be used as their collective application will be more powerful than applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  17. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test or its single block version may find applications in many areas as in psychology, education, medicine, genetics and they are important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  18. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to the take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table

  19. Graphite Isotope Ratio Method Development Report: Irradiation Test Demonstration of Uranium as a Low Fluence Indicator

    International Nuclear Information System (INIS)

    Reid, B.D.; Gerlach, D.C.; Love, E.F.; McNeece, J.P.; Livingston, J.V.; Greenwood, L.R.; Petersen, S.L.; Morgan, W.C.

    1999-01-01

    This report describes an irradiation test designed to investigate the suitability of uranium as a graphite isotope ratio method (GIRM) low fluence indicator. GIRM is a demonstrated concept that gives a graphite-moderated reactor's lifetime production based on measuring changes in the isotopic ratio of elements known to exist in trace quantities within reactor-grade graphite. Appendix I of this report provides a tutorial on the GIRM concept

  20. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploit the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to a simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  1. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu; Harrou, Fouzi; Sun, Ying

    2017-01-01

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploit the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to a simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  2. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul S.; Morgan, Keith S.; Caffrey, Michael P.

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state is stored in radiation-tolerant memory, both could be alterd by the harsh space radiation environment. Both the circuit and the circuit's state can be prote cted by triple-moduler redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.

  3. Measurements of integrated components' parameters versus irradiation doses gamma radiation (60Co) dosimetry-methodology-tests

    International Nuclear Information System (INIS)

    Fuan, J.

    1991-01-01

    This paper describes the methodology used for the irradiation of the integrated components and the measurements of their parameters, using Quality Insurance of dosimetry: - Measurement of the integrated dose using the competences of the Laboratoire Central des Industries Electriques (LCIE): - Measurement of irradiation dose versus source/component distance, using a calibrated equipment. - Use of ALANINE dosimeters, placed on the support of the irradiated components. - Assembly and polarization of components during the irradiations. Selection of the irradiator. - Measurement of the irradiated components's parameters, using the competences of the societies: - GenRad: GR130 tests equipement placed in the DEIN/SIR-CEN SACLAY. - Laboratoire Central des Industries Electriques (LCIE): GR125 tests equipment and this associated programmes test [fr

  4. Mål for diagnostiske tests ydeevne

    DEFF Research Database (Denmark)

    Rud, Bo; Matzen, Peter; Hilden, Jørgen

    2005-01-01

    commonly used to describe the accuracy of diagnostic tests. The performance of tests is often given in terms of sensitivity and specificity. However, these measures have no relevance to clinicians unless they can be converted into predictive values. We describe how to calculate the predictive values...... and how they can be determined using likelihood ratios and Fagan's nomogram. The reader is introduced to the critical appraisal of results based on studies of the accuracy of tests. We describe how both the clinical spectrum and the methodological quality can influence estimates of diagnostic accuracy...

  5. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice the analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex disease. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Determination of Ca/P molar ratio in hydroxyapatite (HA) by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Scapin, Marcos A.; Guilhen, Sabine N.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.

    2015-01-01

    Hydroxyapatite (HA) is a mineral composed of calcium phosphate employed for endodontics, restorative dentistry and other applications in orthopedics and prosthesis. Additionally, this biomaterial is an inexpensive but efficient adsorbent for the removal of heavy metals and other unwanted species of contaminated liquid effluents. This is especially interesting when low-cost effective remediation is required. A Ca / P molar ratio of 1.667 is consistent with the theoretical Ca / P ratio for calcium hydroxyapatite with a compositional formula of Ca 10 (PO 4 ) 6 (OH) 2 , which properties are well discussed in the literature. The aim of this work was to implement and validate a methodology for simultaneous determination of major and minor constituents in the hydroxyapatite (HA) as well as providing the Ca / P molar ratio. To accomplish these achievements, wavelength dispersive X-ray fluorescence spectroscopy (WDXRF) was applied. This is a non-destructive technique that requires no chemical treatment, enabling fast chemical analysis in a wide variety of samples, with no hazardous waste being generated as a result of the process of determination. A standard reference material from NIST (SRM 1400 – Bone Ash) was used to validate the methodology for the determination of magnesium, phosphorus, potassium, calcium, iron, zinc, strontium and the Ca / P ratio in HA samples by WDXRF. The Z-score test was applied as a statistical tool and showed that the calculated values were of less than 1.8 for all the measured analytes. (author)

  7. Determination of Ca/P molar ratio in hydroxyapatite (HA) by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Scapin, Marcos A.; Guilhen, Sabine N.; Cotrim, Marycel E.B.; Pires, Maria Ap. F., E-mail: mascapin@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Hydroxyapatite (HA) is a mineral composed of calcium phosphate employed for endodontics, restorative dentistry and other applications in orthopedics and prosthesis. Additionally, this biomaterial is an inexpensive but efficient adsorbent for the removal of heavy metals and other unwanted species of contaminated liquid effluents. This is especially interesting when low-cost effective remediation is required. A Ca / P molar ratio of 1.667 is consistent with the theoretical Ca / P ratio for calcium hydroxyapatite with a compositional formula of Ca{sub 10}(PO{sub 4}){sub 6}(OH){sub 2}, which properties are well discussed in the literature. The aim of this work was to implement and validate a methodology for simultaneous determination of major and minor constituents in the hydroxyapatite (HA) as well as providing the Ca / P molar ratio. To accomplish these achievements, wavelength dispersive X-ray fluorescence spectroscopy (WDXRF) was applied. This is a non-destructive technique that requires no chemical treatment, enabling fast chemical analysis in a wide variety of samples, with no hazardous waste being generated as a result of the process of determination. A standard reference material from NIST (SRM 1400 – Bone Ash) was used to validate the methodology for the determination of magnesium, phosphorus, potassium, calcium, iron, zinc, strontium and the Ca / P ratio in HA samples by WDXRF. The Z-score test was applied as a statistical tool and showed that the calculated values were of less than 1.8 for all the measured analytes. (author)

  8. Statistical methodology for exploring elevational differences in precipitation chemistry

    International Nuclear Information System (INIS)

    Warren, W.G.; Boehm, M.; Link, D.

    1992-01-01

    A statistical methodology for exploring the relationships between elevation and precipitation chemistry is outlined and illustrated. The methodology utilizes maximum likelihood estimates and likelihood ratio tests with contour ellipses of assumed bivariate lognormal distributions used to assist in interpretation. The approach was illustrated using 12 NADP/NTN sites located in six study areas in the Wyoming and Colorado Rockies. These sites are part of the Rocky Mountain Deposition Monitoring Project (RMDMP), which was initiated in 1986 to investigate the relationships between elevation and the chemistry of precipitation. The results indicate differences in sulfate concentrations between airsheds, between snow and rain, and between higher and lower elevations. In general, sulfate concentrations in snow are greater at lower elevations and this difference is independent of concentration. A similar relationship for rain was not well established. In addition there is evidence that, overall, the sulfate concentrations differed between the six study areas, although pairwise differences were not always significant

  9. Quantile arithmetic methodology for uncertainty propagation in fault trees

    International Nuclear Information System (INIS)

    Abdelhai, M.; Ragheb, M.

    1986-01-01

    A methodology based on quantile arithmetic, the probabilistic analog to interval analysis, is proposed for the computation of uncertainties propagation in fault tree analysis. The basic events' continuous probability density functions (pdf's) are represented by equivalent discrete distributions by dividing them into a number of quantiles N. Quantile arithmetic is then used to performthe binary arithmetical operations corresponding to the logical gates in the Boolean expression of the top event expression of a given fault tree. The computational advantage of the present methodology as compared with the widely used Monte Carlo method was demonstrated for the cases of summation of M normal variables through the efficiency ratio defined as the product of the labor and error ratios. The efficiency ratio values obtained by the suggested methodology for M = 2 were 2279 for N = 5, 445 for N = 25, and 66 for N = 45 when compared with the results for 19,200 Monte Carlo samples at the 40th percentile point. Another advantage of the approach is that the exact analytical value of the median is always obtained for the top event

  10. Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.

    Science.gov (United States)

    Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W

    2006-10-01

    A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and the likelihood-ratio goodness-of-fit statistic tests.

  11. The Linear Logistic Test Model (LLTM as the methodological foundation of item generating rules for a new verbal reasoning test

    Directory of Open Access Journals (Sweden)

    HERBERT POINSTINGL

    2009-06-01

    Full Text Available Based on the demand for new verbal reasoning tests to enrich psychological test inventory, a pilot version of a new test was analysed: the 'Family Relation Reasoning Test' (FRRT; Poinstingl, Kubinger, Skoda & Schechtner, forthcoming, in which several basic cognitive operations (logical rules have been embedded/implemented. Given family relationships of varying complexity embedded in short stories, testees had to logically conclude the correct relationship between two individuals within a family. Using empirical data, the linear logistic test model (LLTM; Fischer, 1972, a special case of the Rasch model, was used to test the construct validity of the test: The hypothetically assumed basic cognitive operations had to explain the Rasch model's item difficulty parameters. After being shaped in LLTM's matrices of weights ((qij, none of these operations were corroborated by means of the Andersen's Likelihood Ratio Test.

  12. ANALISIS PENGARUH LDR, NPL DAN OPERATIONAL EFFICIENCY RATIO TERHADAP RETURN ON ASSETS PADA BANK DEVISA DI INDONESIA PERIODE 2010-2012

    Directory of Open Access Journals (Sweden)

    Hamidah Hamidah

    2014-04-01

    Full Text Available This research is performed on order to test analysis the influence of the Loan to Deposit Ratio (LDR, Non Performing Loan (NPL and Operational Efficiency Ratio (OER toward Return On Asset (ROA On Foreign Exchange Banks In Indonesia Period 2010-2012. Methodology research as the sample used purposive sampling, sample was accrued from foreign banks in Indonesia. Data analysis with multi linear regression of ordinary least square and hypotheses test used t-statistic and F statistic, a classic assumption examination to test the hypotheses.Based on normality test, multicolinearity test, heterosskedasticity test and auto correlation test were not found variables that deviate from the classical assumptions, this indicate that the available data has fulfill the condition to use multi linear regression model. This result of research show that variable LDR and NPL partially have positive influence but not significant toward ROA. Variable OERpartially have negative significant influence toward ROA. Variable LDR, NPL and OER simultaneously have significant influence toward ROA.

  13. Chloride accelerated test: influence of silica fume, water/binder ratio and concrete cover thickness

    Directory of Open Access Journals (Sweden)

    E. Pereira

    Full Text Available In developed countries like the UK, France, Italy and Germany, it is estimated that spending on maintenance and repair is practically the same as investment in new constructions. Therefore, this paper aims to study different ways of interfering in the corrosion kinetic using an accelerated corrosion test - CAIM, that simulates the chloride attack. The three variables are: concrete cover thickness, use of silica fume and the water/binder ratio. It was found, by analysis of variance of the weight loss of the steel bars and chloride content in the concrete cover thickness, there is significant influence of the three variables. Also, the results indicate that the addition of silica fume is the path to improve the corrosion protection of low water/binder ratio concretes (like 0.4 and elevation of the concrete cover thickness is the most effective solution to increase protection of high water/binder ratio concrete (above 0.5.

  14. Methodologies for rapid evaluation of seismic demand levels in nuclear power plant structures

    International Nuclear Information System (INIS)

    Manrique, M.; Asfura, A.; Mukhim, G.

    1990-01-01

    A methodology for rapid assessment of both acceleration spectral peak and 'zero period acceleration' (ZPA) values for virtually any major structure in a nuclear power plant is presented. The methodology is based on spectral peak and ZPA amplification factors, developed from regression analyses of an analytical database. The developed amplification factors are applied to the plant's design ground spectrum to obtain amplified response parameters. A practical application of the methodology is presented. This paper also presents a methodology for calculating acceleration response spectrum curves at any number of desired damping ratios directly from a single known damping ratio spectrum. The methodology presented is particularly useful and directly applicable to older vintage nuclear power plant facilities (i.e. such as those affected by USI A-46). The methodology is based on principles of random vibration theory. The methodology has been implemented in a computer program (SPECGEN). SPECGEN results are compared with results obtained from time history analyses. (orig.)

  15. Experimental tests of the effect of rotor diameter ratio and blade number to the cross-flow wind turbine performance

    Science.gov (United States)

    Susanto, Sandi; Tjahjana, Dominicus Danardono Dwi Prija; Santoso, Budi

    2018-02-01

    Cross-flow wind turbine is one of the alternative energy harvester for low wind speeds area. Several factors that influence the power coefficient of cross-flow wind turbine are the diameter ratio of blades and the number of blades. The aim of this study is to find out the influence of the number of blades and the diameter ratio on the performance of cross-flow wind turbine and to find out the best configuration between number of blades and diameter ratio of the turbine. The experimental test were conducted under several variation including diameter ratio between outer and inner diameter of the turbine and number of blades. The variation of turbine diameter ratio between inner and outer diameter consisted of 0.58, 0.63, 0.68 and 0.73 while the variations of the number of blades used was 16, 20 and 24. The experimental test were conducted under certain wind speed which are 3m/s until 4 m/s. The result showed that the configurations between 0.68 diameter ratio and 20 blade numbers is the best configurations that has power coefficient of 0.049 and moment coefficient of 0.185.

  16. Impacts of data covariances on the calculated breeding ratio for CRBRP

    International Nuclear Information System (INIS)

    Liaw, J.R.; Collins, P.J.; Henryson, H. II; Shenter, R.E.

    1983-01-01

    In order to establish confidence on the data adjustment methodology as applied to LMFBR design, and to estimate the importance of data correlations in that respect, an investigation was initiated on the impacts of data covariances on the calculated reactor performance parameters. This paper summarizes the results and findings of such an effort specifically related to the calculation of breeding ratio for CRBRP as an illustration. Thirty-nine integral parameters and their covariances, including k/sub eff/ and various capture and fission reaction rate ratios, from the ZEBRA-8 series and four ZPR physics benchmark assemblies were used in the least-squares fitting processes. Multigroup differential data and the sensitivity coefficients of those 39 integral parameters were generated by standard 2-D diffusion theory neutronic calculational modules at ANL. Three differential data covariance libraries, all based on ENDF/B-V evaluations, were tested in this study

  17. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  18. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    Science.gov (United States)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. Detection algorithms employed play a crucial role in fulfilling the detection component in the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel point of a spatial image based on the assumption that the data follows a Gaussian distribution. This paper explores the potential for detection performance advantages when operating in the Fourier domain of long exposure images of small and/or dim space objects from ground based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image only contains background noise. The detection algorithm tests each pixel point of the Fourier transformed images to make the determination if an object is present based on the criteria threshold found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.

  19. A test procedure for determining the influence of stress ratio on fatigue crack growth

    Science.gov (United States)

    Fitzgerald, J. H.; Wei, R. P.

    1974-01-01

    A test procedure is outlined by which the rate of fatigue crack growth over a range of stress ratios and stress intensities can be determined expeditiously using a small number of specimens. This procedure was developed to avoid or circumvent the effects of load interactions on fatigue crack growth, and was used to develop data on a mill annealed Ti-6Al-4V alloy plate. Experimental data suggest that the rates of fatigue crack growth among the various stress ratios may be correlated in terms of an effective stress intensity range at given values of K max. This procedure is not to be used, however, for determining the corrosion fatigue crack growth characteristics of alloys when nonsteady-state effects are significant.

  20. PENGARUH PERUBAHAN RETURN ON ASSETS, PERUBAHAN DEBT TO EQUITY RATIO DAN PERUBAHAN CASH RATIO TERHADAP PERUBAHAN DIVIDEND PAYOUT RATIO

    Directory of Open Access Journals (Sweden)

    Yuli Soesetio

    2008-02-01

    Full Text Available Dividend Payout Ratio used to calculate all of revenue that will be accepted by stockholders as cash dividend, usually explained as percentage. This research was conducted to know several factors that affected change of Dividend Payout Ratio and to know the significance level and the correlation between dependent and independent variable. Analysis instrument used was parametric statistic. Based on the result of statistic test,  The Change of Return on Asset (X1, The Change of Debt to Equity Ratio (X2,  were able to explain dependent variable of the change Dividend Payout Ratio, and The Change of CashRatio can’t explain dependent variable of the change Dividend Payout Ratio

  1. A Methodology for Evaluation of Inservice Test Intervals for Pumps and Motor-Operated Valves

    International Nuclear Information System (INIS)

    Cox, D.F.; Haynes, H.D.; McElhaney, K.L.; Otaduy, P.J.; Staunton, R.H.; Vesely, W.E.

    1999-01-01

    Recent nuclear industry reevaluation of component inservice testing (IST) requirements is resulting in requests for IST interval extensions and changes to traditional IST programs. To evaluate these requests, long-term component performance and the methods for mitigating degradation need to be understood. Determining the appropriate IST intervals, along with component testing, monitoring, trending, and maintenance effects, has become necessary. This study provides guidelines to support the evaluation of IST intervals for pumps and motor-operated valves (MOVs). It presents specific engineering information pertinent to the performance and monitoring/testing of pumps and MOVs, provides an analytical methodology for assessing the bounding effects of aging on component margin behavior, and identifies basic elements of an overall program to help ensure component operability. Guidance for assessing probabilistic methods and the risk importance and safety consequences of the performance of pumps and MOVs has not been specifically included within the scope of this report, but these elements may be included in licensee change requests

  2. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    International Nuclear Information System (INIS)

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods, provided in Department of Energy guidelines, and failure criteria, contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparison between predicted and measured inelastic responses are generally reasonalbly good; quantities are sometimes overpredicted somewhat, and, sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins

  3. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Improvement in post test accident analysis results prediction for the test no. 2 in PSB test facility by applying UMAE methodology

    International Nuclear Information System (INIS)

    Dubey, S.K.; Petruzzi, A.; Giannotti, W.; D'Auria, F.

    2006-01-01

    This paper mainly deals with the improvement in the post test accident analysis results prediction for the test no. 2, 'Total loss of feed water with failure of HPIS pumps and operator actions on primary and secondary circuit depressurization', carried-out on PSB integral test facility in May 2005. This is one the most complicated test conducted in PSB test facility. The prime objective of this test is to provide support for the verification of the accident management strategies for NPPs and also to verify the correctness of some safety systems operating only during accident. The objective of this analysis is to assess the capability to reproduce the phenomena occurring during the selected tests and to quantify the accuracy of the code calculation qualitatively and quantitatively for the best estimate code Relap5/mod3.3 by systematically applying all the procedures lead by Uncertainty Methodology based on Accuracy Extrapolation (UMAE), developed at University of Pisa. In order to achieve these objectives test facility nodalisation qualification for both 'steady state level' and 'on transient level' are demonstrated. For the 'steady state level' qualification compliance to acceptance criteria established in UMAE has been checked for geometrical details and thermal hydraulic parameters. The following steps have been performed for evaluation of qualitative qualification of 'on transient level': visual comparisons between experimental and calculated relevant parameters time trends; list of comparison between experimental and code calculation resulting time sequence of significant events; identification/verification of CSNI phenomena validation matrix; use of the Phenomenological Windows (PhW), identification of Key Phenomena and Relevant Thermal-hydraulic Aspects (RTA). A successful application of the qualitative process constitutes a prerequisite to the application of the quantitative analysis. For quantitative accuracy of code prediction Fast Fourier Transform Based

  5. The role of the epoxy resin: Curing agent ratio in composite interfacial strength by single fibre microbond test

    DEFF Research Database (Denmark)

    Minty, Ross; Thomason, James L.; Petersen, Helga Nørgaard

    2015-01-01

    This paper focuses on an investigation into the role of the epoxy resin: curing agent ratio in composite interfacial shear strength of glass fibre composites. The procedure involved changing the percentage of curing agent (Triethylenetetramine [TETA]) used in the mixture with several different...... percentages used, ranging from 4% up to 30%, including the stoichiometric ratio. It was found by using the microbond test, that there may exist a relationship between the epoxy resin to curing agent ratio and the level of adhesion between the reinforcing fibre and the polymer matrix of the composite....

  6. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.

  7. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    Science.gov (United States)

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ 2 distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    Science.gov (United States)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. Reviewing the literature of the offshore drilling industry indicates that most of the developed risk analysis methodologies do not fully and more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. This is while results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, have been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology in this research focuses on a specific procedure called Negative Pressure Test (NPT), as the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and their conducted NPT is discussed. The risk analysis methodology in this dissertation consists of three different approaches and their integration constitutes the big picture of my whole methodology. The first approach is the comparative analysis of a "standard" NPT, which is proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the involved discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  9. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to have comparable Type I and Type II error rates as the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as SPRT to current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)

  11. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  12. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or...... different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity...... and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test...

  13. Aerospace Payloads Leak Test Methodology

    Science.gov (United States)

    Lvovsky, Oleg; Grayson, Cynthia M.

    2010-01-01

    Pressurized and sealed aerospace payloads can leak on orbit. When dealing with toxic or hazardous materials, requirements for fluid and gas leakage rates have to be properly established, and most importantly, reliably verified using the best Nondestructive Test (NDT) method available. Such verification can be implemented through application of various leak test methods that will be the subject of this paper, with a purpose to show what approach to payload leakage rate requirement verification is taken by the National Aeronautics and Space Administration (NASA). The scope of this paper will be mostly a detailed description of 14 leak test methods recommended.

  14. Arcjet nozzle area ratio effects

    Science.gov (United States)

    Curran, Francis M.; Sarmiento, Charles J.; Birkner, Bjorn W.; Kwasny, James

    1990-01-01

    An experimental investigation was conducted to determine the effect of nozzle area ratio on the operating characteristics and performance of a low power dc arcjet thruster. Conical thoriated tungsten nozzle inserts were tested in a modular laboratory arcjet thruster run on hydrogen/nitrogen mixtures simulating the decomposition products of hydrazine. The converging and diverging sides of the inserts had half angles of 30 and 20 degrees, respectively, similar to a flight type unit currently under development. The length of the diverging side was varied to change the area ratio. The nozzle inserts were run over a wide range of specific power. Current, voltage, mass flow rate, and thrust were monitored to provide accurate comparisons between tests. While small differences in performance were observed between the two nozzle inserts, it was determined that for each nozzle insert, arcjet performance improved with increasing nozzle area ratio to the highest area ratio tested and that the losses become very pronounced for area ratios below 50. These trends are somewhat different than those obtained in previous experimental and analytical studies of low Re number nozzles. It appears that arcjet performance can be enhanced via area ratio optimization.

  15. Arcjet Nozzle Area Ratio Effects

    Science.gov (United States)

    Curran, Francis M.; Sarmiento, Charles J.; Birkner, Bjorn W.; Kwasny, James

    1990-01-01

    An experimental investigation was conducted to determine the effect of nozzle area ratio on the operating characteristics and performance of a low power dc arcjet thruster. Conical thoriated tungsten nozzle inserts were tested in a modular laboratory arcjet thruster run on hydrogen/nitrogen mixtures simulating the decomposition products of hydrazine. The converging and diverging sides of the inserts had half angles of 30 and 20 degrees, respectively, similar to a flight type unit currently under development. The length of the diverging side was varied to change the area ratio. The nozzle inserts were run over a wide range of specific power. Current, voltage, mass flow rate, and thrust were monitored to provide accurate comparisons between tests. While small differences in performance were observed between the two nozzle inserts, it was determined that for each nozzle insert, arcjet performance improved with increasing nozzle area ratio to the highest area ratio tested and that the losses become very pronounced for area ratios below 50. These trends are somewhat different than those obtained in previous experimental and analytical studies of low Re number nozzles. It appears that arcjet performance can be enhanced via area ratio optimization.

  16. The fairness of the PPS reimbursement methodology.

    Science.gov (United States)

    Gianfrancesco, F D

    1990-01-01

    In FY 1984 the Medicare program implemented a new method of reimbursing hospitals for inpatient services, the Prospective Payment System (PPS). Under this system, hospitals are paid a predetermined amount per Medicare discharge, which varies according to certain patient and hospital characteristics. This article investigates the presence of systematic biases and other potential imperfections in the PPS reimbursement methodology as revealed by its effects on Medicare operating ratios. The study covers the first three years of the PPS (approximately 1984-1986) and is based on hospital data from the Medicare cost reports and other related sources. Regression techniques were applied to these data to determine how Medicare operating ratios were affected by specific aspects of the reimbursement methodology. Several possible imbalances were detected. The potential undercompensation relating to these can be harmful to certain classes of hospitals and to the Medicare populations that they serve. PMID:2109738

  17. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    Groenendijk, Patrick A.; Lucas, André; Vries, de Casper G.

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunctionwith standard variance ratio statistics in order to disentangle lineardependence, non-linear dependence, and leptokurtosis in financial timeseries. Both statistics are computed for multiple return horizonssimultaneously, and the

  18. Yearbook sectoral financial ratios in mexico for business benchmarking

    Directory of Open Access Journals (Sweden)

    Deyanira Bernal Domínguez

    2012-05-01

    Full Text Available Financial analysis through ratios is a useful tool for improving organizational performance. Databases of financial information in Mexico are limited, therefore the importance of an annual publication of financial ratios per company and industry average. The objectives of this research are: describe the financial ratios with higher predictive potential and their formulas, as well as the design of a research instrument for measuring the relevance of the publication. A descriptive methodology was applied selecting through the analysis ofempirical studies, several ratios of liquidity, leverage, asset management, business cycle, performance and self-financing. The questionnaire contains 43 reagents to be applied to a statistically representative sample of 46 entrepreneurs in Culiacan, Sinaloa, Mexico.

  19. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    textabstractWe advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  20. Airfoil selection methodology for Small Wind Turbines

    DEFF Research Database (Denmark)

    Salgado Fuentes, Valentin; Troya, Cesar; Moreno, Gustavo

    2016-01-01

    On wind turbine technology, the aerodynamic performance is fundamental to increase efficiency. Nowadays there are several databases with airfoils designed and simulated for different applications; that is why it is necessary to select those suitable for a specific application. This work presents...... a new methodology for airfoil selection used in feasibility and optimization of small wind turbines with low cut-in speed. On the first stage, airfoils data is tested on XFOIL software to check its compatibility with the simulator; then, arithmetic mean criteria is recursively used to discard...... underperformed airfoils; the best airfoil data was exported to Matlab for a deeper analysis. In the second part, data points were interpolated using "splines" to calculate glide ratio and stability across multiple angles of attack, those who present a bigger steadiness were conserved. As a result, 3 airfoils...

  1. Tests of Full-Scale Helicopter Rotors at High Advancing Tip Mach Numbers and Advance Ratios

    Science.gov (United States)

    Biggers, James C.; McCloud, John L., III; Stroub, Robert H.

    2015-01-01

    As a continuation of the studies of reference 1, three full-scale helicopter rotors have been tested in the Ames Research Center 40- by SO-foot wind tunnel. All three of them were two-bladed, teetering rotors. One of the rotors incorporated the NACA 0012 airfoil section over the entire length of the blade. This rotor was tested at advance ratios up to 1.05. Both of the other rotors were tapered in thickness and incorporated leading-edge camber over the outer 20 percent of the blade radius. The larger of these rotors was tested at advancing tip Mach numbers up to 1.02. Data were obtained for a wide range of lift and propulsive force, and are presented without discussion.

  2. Comparison of heat-testing methodology.

    Science.gov (United States)

    Bierma, Mark M; McClanahan, Scott; Baisden, Michael K; Bowles, Walter R

    2012-08-01

    Patients with irreversible pulpitis occasionally present with a chief complaint of sensitivity to heat. To appropriately diagnose the offending tooth, a variety of techniques have been developed to reproduce this chief complaint. Such techniques cause temperature increases that are potentially damaging to the pulp. Newer electronic instruments control the temperature of a heat-testing tip that is placed directly against a tooth. The aim of this study was to determine which method produced the most consistent and safe temperature increase within the pulp. This consistency facilitates the clinician's ability to differentiate between a normal pulp and irreversible pulpitis. Four operators applied the following methods to each of 4 extracted maxillary premolars (for a total of 16 trials per method): heated gutta-percha, heated ball burnisher, hot water, and a System B unit or Elements unit with a heat-testing tip. Each test was performed for 60 seconds, and the temperatures were recorded via a thermocouple in the pulp chamber. Analysis of the data was performed by using the intraclass correlation coefficient. The least consistent warming was found with hot water. The heat-testing tip also demonstrated greater consistency between operators compared with the other methods. Hot water and the heated ball burnisher caused temperature increases high enough to damage pulp tissue. The Elements unit with a heat-testing tip provides the most consistent warming of the dental pulp. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. Three-dimensional RAMA fluence methodology benchmarking

    International Nuclear Information System (INIS)

    Baker, S. P.; Carter, R. G.; Watkins, K. E.; Jones, D. B.

    2004-01-01

    This paper describes the benchmarking of the RAMA Fluence Methodology software, that has been performed in accordance with U. S. Nuclear Regulatory Commission Regulatory Guide 1.190. The RAMA Fluence Methodology has been developed by TransWare Enterprises Inc. through funding provided by the Electric Power Research Inst., Inc. (EPRI) and the Boiling Water Reactor Vessel and Internals Project (BWRVIP). The purpose of the software is to provide an accurate method for calculating neutron fluence in BWR pressure vessels and internal components. The Methodology incorporates a three-dimensional deterministic transport solution with flexible arbitrary geometry representation of reactor system components, previously available only with Monte Carlo solution techniques. Benchmarking was performed on measurements obtained from three standard benchmark problems which include the Pool Criticality Assembly (PCA), VENUS-3, and H. B. Robinson Unit 2 benchmarks, and on flux wire measurements obtained from two BWR nuclear plants. The calculated to measured (C/M) ratios range from 0.93 to 1.04 demonstrating the accuracy of the RAMA Fluence Methodology in predicting neutron flux, fluence, and dosimetry activation. (authors)

  4. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatorne for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation

  5. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Hendriks, W H

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  6. Radiometric ratio characterization for low-to-mid CPV modules operating in variable irradiance conditions

    Science.gov (United States)

    Vorndran, Shelby; Russo, Juan; Zhang, Deming; Gordon, Michael; Kostuk, Raymond

    2012-10-01

    In this work, a concentrating photovoltaic (CPV) design methodology is proposed which aims to maximize system efficiency for a given irradiance condition. In this technique, the acceptance angle of the system is radiometrically matched to the angular spread of the site's average irradiance conditions using a simple geometric ratio. The optical efficiency of CPV systems from flat-plate to high-concentration is plotted at all irradiance conditions. Concentrator systems are measured outdoors in various irradiance conditions to test the methodology. This modeling technique is valuable at the design stage to determine the ideal level of concentration for a CPV module. It requires only two inputs: the acceptance angle profile of the system and the site's average direct and diffuse irradiance fractions. Acceptance angle can be determined by raytracing or testing a fabricated prototype in the lab with a solar simulator. The average irradiance conditions can be found in the Typical Metrological Year (TMY3) database. Additionally, the information gained from this technique can be used to determine tracking tolerance, quantify power loss during an isolated weather event, and do more sophisticated analysis such as I-V curve simulation.

  7. Inverse modeling of emissions for local photooxidant pollution: Testing a new methodology with kriging constraints

    Directory of Open Access Journals (Sweden)

    I. Pison

    2006-07-01

    Full Text Available A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air quality forecast in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inverse developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements.

  8. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  9. Review of titanium dioxide nanoparticle phototoxicity: Developing a phototoxicity ratio to correct the endpoint values of toxicity tests

    Science.gov (United States)

    2015-01-01

    Abstract Titanium dioxide nanoparticles are photoactive and produce reactive oxygen species under natural sunlight. Reactive oxygen species can be detrimental to many organisms, causing oxidative damage, cell injury, and death. Most studies investigating TiO2 nanoparticle toxicity did not consider photoactivation and performed tests either in dark conditions or under artificial lighting that did not simulate natural irradiation. The present study summarizes the literature and derives a phototoxicity ratio between the results of nano‐titanium dioxide (nano‐TiO2) experiments conducted in the absence of sunlight and those conducted under solar or simulated solar radiation (SSR) for aquatic species. Therefore, the phototoxicity ratio can be used to correct endpoints of the toxicity tests with nano‐TiO2 that were performed in absence of sunlight. Such corrections also may be important for regulators and risk assessors when reviewing previously published data. A significant difference was observed between the phototoxicity ratios of 2 distinct groups: aquatic species belonging to order Cladocera, and all other aquatic species. Order Cladocera appeared very sensitive and prone to nano‐TiO2 phototoxicity. On average nano‐TiO2 was 20 times more toxic to non‐Cladocera and 1867 times more toxic to Cladocera (median values 3.3 and 24.7, respectively) after illumination. Both median value and 75% quartile of the phototoxicity ratio are chosen as the most practical values for the correction of endpoints of nano‐TiO2 toxicity tests that were performed in dark conditions, or in the absence of sunlight. Environ Toxicol Chem 2015;34:1070–1077. © 2015 The Author. Published by SETAC. PMID:25640001

  10. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F.J.W.C.; Doorn, van D.A.; Schonewille, J.T.; Riet, van M.M.J.; Visser, P.; Blok, M.C.; Hendriks, W.H.

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  11. Theory and methodology of social, political and economic processes risks determining in different countries of the world

    Directory of Open Access Journals (Sweden)

    Yashina Nadezhda, I.

    2015-06-01

    Full Text Available The study deals with the problems of the theory and methodology of social, political and economic processes risks in different countries with relative indicators of the socio-economic development level, as well as the size and condition of the public debt. Developed by the authors the methodology of determining the risks of social, political and economic processes of public policy around the world revealed close relationship between socio-economic situation of the countries and their public debt. Within the framework of this methodology two groups of factors characterizing the socio-political and economic processes in the country are being developed. After that each exponent and indicator are being processed, using expert procedures. Maximum statutory values for tentatively referenced countries with effective and ineffective government policies are identified. Then standardization (specification and definition of integral (generalized indexes of socio-political and economic processes in the country are taking place. After that the ranking of countries by aggregated standardized ratio is arranged, taking into account the significance of the developed indicators. The final phase of implementation methodology is identifying risks of social, political and economic processes of public policy around the world. This is the ranking of countries by ratio of stability in public policy (stability of economic and socio-political processes in the country. As the result of implementation methodology the following output was received: what really makes a difference is not the amount of the country's debt, but how effectively it manages this debt, whether it has a goal to improve social and economic indicators. Practical testing methodology has proven that studied indicators fully characterize the development of the countries, their political, social and economic situation on the world stage.

  12. Testing and Performance Verification of a High Bypass Ratio Turbofan Rotor in an Internal Flow Component Test Facility

    Science.gov (United States)

    VanZante, Dale E.; Podboy, Gary G.; Miller, Christopher J.; Thorp, Scott A.

    2009-01-01

    A 1/5 scale model rotor representative of a current technology, high bypass ratio, turbofan engine was installed and tested in the W8 single-stage, high-speed, compressor test facility at NASA Glenn Research Center (GRC). The same fan rotor was tested previously in the GRC 9x15 Low Speed Wind Tunnel as a fan module consisting of the rotor and outlet guide vanes mounted in a flight-like nacelle. The W8 test verified that the aerodynamic performance and detailed flow field of the rotor as installed in W8 were representative of the wind tunnel fan module installation. Modifications to W8 were necessary to ensure that this internal flow facility would have a flow field at the test package that is representative of flow conditions in the wind tunnel installation. Inlet flow conditioning was designed and installed in W8 to lower the fan face turbulence intensity to less than 1.0 percent in order to better match the wind tunnel operating environment. Also, inlet bleed was added to thin the casing boundary layer to be more representative of a flight nacelle boundary layer. On the 100 percent speed operating line the fan pressure rise and mass flow rate agreed with the wind tunnel data to within 1 percent. Detailed hot film surveys of the inlet flow, inlet boundary layer and fan exit flow were compared to results from the wind tunnel. The effect of inlet casing boundary layer thickness on fan performance was quantified. Challenges and lessons learned from testing this high flow, low static pressure rise fan in an internal flow facility are discussed.

  13. Methodology for testing a system for remote monitoring and control on auxiliary machines in electric vehicles

    Directory of Open Access Journals (Sweden)

    Dimitrov Vasil

    2017-01-01

    Full Text Available A laboratory system for remote monitoring and control of an asynchronous motor controlled by a soft starter and contemporary measuring and control devices has been developed and built. This laboratory system is used for research and in teaching. A study of the principles of operation, setting up and examination of intelligent energy meters, soft starters and PLC has been made as knowledge of the relevant software products is necessary. This is of great importance because systems for remote monitoring and control of energy consumption, efficiency and proper operation of the controlled objects are very often used in different spheres of industry, in building automation, transport, electricity distribution network, etc. Their implementation in electric vehicles for remote monitoring and control on auxiliary machines is also possible and very useful. In this paper, a methodology of tests is developed and some experiments are presented. Thus, an experimental verification of the developed methodology is made.

  14. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    Penetration test is a method to evaluate the cyber security of NPPs; so, this approach was performed in some studies. Because they focused on vulnerability finding or test bed construction, scenario based approach was not performed. However, to test the cyber security of NPPs, a proper test scenario should be needed. Ahn et al. developed cyber-attack scenarios but those scenarios couldn't be applied in penetration test because they developed the scenarios based on past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered scenarios which were happened before; so, they couldn't cover other various scenarios and couldn't reflect them into a penetration test. In this study, a method to develop a cyber-attack penetration test scenario of NPPs especially focused on safety point of view is suggested. To evaluate the cyber security of NPPs, penetration test can be a possible way. In this study, a method to develop a penetration test scenario was explained. Especially, the goal of hacker was focused on nuclear fuel integrity deterioration. So, in the methodology, Level 1 PSA results were utilized to reflect plant safety into the security. From the PSA results, basic event was post processed and possible cyber-attacks were reviewed with vulnerabilities of digital control system.

  15. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    International Nuclear Information System (INIS)

    Lee, In Hyo; Son, Han Seong; Kim, Si Won; Kang, Hyun Gook

    2016-01-01

    Penetration test is a method to evaluate the cyber security of NPPs; so, this approach was performed in some studies. Because they focused on vulnerability finding or test bed construction, scenario based approach was not performed. However, to test the cyber security of NPPs, a proper test scenario should be needed. Ahn et al. developed cyber-attack scenarios but those scenarios couldn't be applied in penetration test because they developed the scenarios based on past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered scenarios which were happened before; so, they couldn't cover other various scenarios and couldn't reflect them into a penetration test. In this study, a method to develop a cyber-attack penetration test scenario of NPPs especially focused on safety point of view is suggested. To evaluate the cyber security of NPPs, penetration test can be a possible way. In this study, a method to develop a penetration test scenario was explained. Especially, the goal of hacker was focused on nuclear fuel integrity deterioration. So, in the methodology, Level 1 PSA results were utilized to reflect plant safety into the security. From the PSA results, basic event was post processed and possible cyber-attacks were reviewed with vulnerabilities of digital control system

  16. Monitoring HIV Testing in the United States: Consequences of Methodology Changes to National Surveys.

    Directory of Open Access Journals (Sweden)

    Michelle M Van Handel

    Full Text Available In 2011, the National Health Interview Survey (NHIS, an in-person household interview, revised the human immunodeficiency virus (HIV section of the survey and the Behavioral Risk Factor Surveillance System (BRFSS, a telephone-based survey, added cellphone numbers to its sampling frame. We sought to determine how these changes might affect assessment of HIV testing trends.We used linear regression with pairwise contrasts with 2003-2013 data from NHIS and BRFSS to compare percentages of persons aged 18-64 years who reported HIV testing in landline versus cellphone-only households before and after 2011, when NHIS revised its in-person questionnaire and BRFSS added cellphone numbers to its telephone-based sample.In NHIS, the percentage of persons in cellphone-only households increased 13-fold from 2003 to 2013. The percentage ever tested for HIV was 6%-10% higher among persons in cellphone-only than landline households. The percentage ever tested for HIV increased significantly from 40.2% in 2003 to 45.0% in 2010, but was significantly lower in 2011 (40.6% and 2012 (39.7%. In BRFSS, the percentage ever tested decreased significantly from 45.9% in 2003 to 40.2% in 2010, but increased to 42.9% in 2011 and 43.5% in 2013.HIV testing estimates were lower after NHIS questionnaire changes but higher after BRFSS methodology changes. Data before and after 2011 are not comparable, complicating assessment of trends.

  17. On the hypothesis-free testing of metabolite ratios in genome-wide and metabolome-wide association studies

    Directory of Open Access Journals (Sweden)

    Petersen Ann-Kristin

    2012-06-01

    Full Text Available Abstract Background Genome-wide association studies (GWAS with metabolic traits and metabolome-wide association studies (MWAS with traits of biomedical relevance are powerful tools to identify the contribution of genetic, environmental and lifestyle factors to the etiology of complex diseases. Hypothesis-free testing of ratios between all possible metabolite pairs in GWAS and MWAS has proven to be an innovative approach in the discovery of new biologically meaningful associations. The p-gain statistic was introduced as an ad-hoc measure to determine whether a ratio between two metabolite concentrations carries more information than the two corresponding metabolite concentrations alone. So far, only a rule of thumb was applied to determine the significance of the p-gain. Results Here we explore the statistical properties of the p-gain through simulation of its density and by sampling of experimental data. We derive critical values of the p-gain for different levels of correlation between metabolite pairs and show that B/(2*α is a conservative critical value for the p-gain, where α is the level of significance and B the number of tested metabolite pairs. Conclusions We show that the p-gain is a well defined measure that can be used to identify statistically significant metabolite ratios in association studies and provide a conservative significance cut-off for the p-gain for use in future association studies with metabolic traits.

  18. Implementation of Prognostic Methodologies to Cryogenic Propellant Loading Test-bed

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics methodologies determine the health state of a system and predict the end of life and remaining useful life. This information enables operators to take...

  19. Nonlinear relationship between the Product Consistency Test (PCT) response and the Al/B ratio in a soda-lime aluminoborosilicate glass

    Energy Technology Data Exchange (ETDEWEB)

    Farooqi, Rahmat Ullah, E-mail: rufarooqi@postech.ac.kr [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Hrma, Pavel [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Pacific Northwest National Laboratory, Richland, WA (United States)

    2016-06-15

    We have investigated the effect of Al/B ratio on the Product Consistency Test (PCT) response. In an aluminoborosilicate soda-lime glass based on a modified International Simple Glass, ISG-3, the Al/B ratio varied from 0 to 0.55 (in mole fractions). In agreement with various models of the PCT response as a function of glass composition, we observed a monotonic increase of B and Na releases with decreasing Al/B mole ratio, but only when the ratio was higher than 0.05. Below this value (Al/B < 0.05), we observed a sharp decrease that we attribute to B in tetrahedral coordination.

  20. Experimental study of the crack depth ratio threshold to analyze the slow crack growth by creep of high density polyethylene pipes

    International Nuclear Information System (INIS)

    Laiarinandrasana, Lucien; Devilliers, Clémence; Lucatelli, Jean Marc; Gaudichet-Maurin, Emmanuelle; Brossard, Jean Michel

    2014-01-01

    To assess the durability of drinking water connection pipes subjected to oxidation and slow crack growth, a comprehensive database was constructed on a novel specimen geometry: the pre-cracked NOL ring. 135 tests were carried out consisting of initial crack depth ratio ranging from 0.08 to 0.6; single or double longitudinal cracks: tensile with steady strain rate and creep loading. A threshold value of the crack depth ratio of 0.2, induced by the oxidation was determined by analyzing several mechanical parameters. This threshold value was shown to be independent on the strain rate effects, single or double crack configuration and the kind of loading: tensile or creep. Creep test results with crack depth ratio larger than 0.2 were then utilized to establish a failure assessment diagram. A methodology allowing the prediction of residual lifetime of in-service pipes was proposed, using this diagram. - Highlights: • Experimental data on pre-cracked rings featuring a longitudinally cracked HDPE pipe. • Crack depth ratio threshold for slow crack growth study consecutive to oxidation. • Investigation of the effects of the single/double notch(es) and of the strain rate. • Original results obtained from tests performed with tensile and creep loadings. • Correlation between creep initiation time and C* with DENT and ring specimens

  1. Laboratory test on maximum and minimum void ratio of tropical sand matrix soils

    Science.gov (United States)

    Othman, B. A.; Marto, A.

    2018-04-01

    Sand is generally known as loose granular material which has a grain size finer than gravel and coarser than silt and can be very angular to well-rounded in shape. The present of various amount of fines which also influence the loosest and densest state of sand in natural condition have been well known to contribute to the deformation and loss of shear strength of soil. This paper presents the effect of various range of fines content on minimum void ratio e min and maximum void ratio e max of sand matrix soils. Laboratory tests to determine e min and e max of sand matrix soil were conducted using non-standard method introduced by previous researcher. Clean sand was obtained from natural mining site at Johor, Malaysia. A set of 3 different sizes of sand (fine sand, medium sand, and coarse sand) were mixed with 0% to 40% by weight of low plasticity fine (kaolin). Results showed that generally e min and e max decreased with the increase of fines content up to a minimal value of 0% to 30%, and then increased back thereafter.

  2. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides reported that the accuracy was evaluated using software. 1 I found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, Soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication.Reliability (precision) and validity (accuracy) are two different methodological issues in researches. Sensitivity, specificity, PPV, NPV, likelihood ratio positive (true positive/false negative) and likelihood ratio negative (false positive/ true negative) as well as odds ratio (true results\\false results - preferably more than 50) are among the tests to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear that the reported twenty-two sites (46.81%) which were considered accurate related to which of the above mentioned estimates for validity analysis. Reliability (repeatability or reproducibility) is being assessed by different statistical tests such as Pearson r, least square and paired t.test which all of them are among common mistakes in reliability analysis 5. Briefly, for quantitative variable Intra Class Correlation Coefficient (ICC) and for qualitative variables weighted kappa should be used with caution because kappa has its own limitation too. Regarding reliability or agreement, it is good to know that for computing kappa value, just concordant cells are being considered, whereas discordant cells should also be taking into account in order to reach a correct estimation of agreement (Weighted kappa).2-4 As a take home message, for reliability and validity analysis, appropriate tests should be

  3. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    Science.gov (United States)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is primordial to ensure their optimum power production. Those properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and international consensus for the appropriate techniques are in preparation. The reference materials used as a standard for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of optical properties of solar mirrors and absorbers.

  4. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    OpenAIRE

    Matha, Denis; Sandner, Frank; Molins i Borrell, Climent; Campos Hortigüela, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provide...

  5. Initiation of depleted uranium oxide and spent fuel testing for the spent fuel sabotage aerosol ratio program

    Energy Technology Data Exchange (ETDEWEB)

    Molecke, M.A.; Gregson, M.W.; Sorenson, K.B. [Sandia National Labs. (United States); Billone, M.C.; Tsai, H. [Argonne National Lab. (United States); Koch, W.; Nolte, O. [Fraunhofer Inst. fuer Toxikologie und Experimentelle Medizin (Germany); Pretzsch, G.; Lange, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (Germany); Autrusson, B.; Loiseau, O. [Inst. de Radioprotection et de Surete Nucleaire (France); Thompson, N.S.; Hibbs, R.S. [U.S. Dept. of Energy (United States); Young, F.I.; Mo, T. [U.S. Nuclear Regulatory Commission (United States)

    2004-07-01

    We provide a detailed overview of an ongoing, multinational test program that is developing aerosol data for some spent fuel sabotage scenarios on spent fuel transport and storage casks. Experiments are being performed to quantify the aerosolized materials plus volatilized fission products generated from actual spent fuel and surrogate material test rods, due to impact by a high energy density device, HEDD. The program participants in the U.S. plus Germany, France, and the U.K., part of the international Working Group for Sabotage Concerns of Transport and Storage Casks, WGSTSC have strongly supported and coordinated this research program. Sandia National Laboratories, SNL, has the lead role for conducting this research program; test program support is provided by both the U.S. Department of Energy and Nuclear Regulatory Commission. WGSTSC partners need this research to better understand potential radiological impacts from sabotage of nuclear material shipments and storage casks, and to support subsequent risk assessments, modeling, and preventative measures. We provide a summary of the overall, multi-phase test design and a description of all explosive containment and aerosol collection test components used. We focus on the recently initiated tests on ''surrogate'' spent fuel, unirradiated depleted uranium oxide, and forthcoming actual spent fuel tests. The depleted uranium oxide test rodlets were prepared by the Institut de Radioprotection et de Surete Nucleaire, in France. These surrogate test rodlets closely match the diameter of the test rodlets of actual spent fuel from the H.B. Robinson reactor (high burnup PWR fuel) and the Surry reactor (lower, medium burnup PWR fuel), generated from U.S. reactors. The characterization of the spent fuels and fabrication into short, pressurized rodlets has been performed by Argonne National Laboratory, for testing at SNL. The ratio of the aerosol and respirable particles released from HEDD-impacted spent

  6. Testing a SEA methodology for the energy sector: a waste incineration tax proposal

    International Nuclear Information System (INIS)

    Nilsson, Maans; Bjoerklund, Anna; Finnveden, Goeran; Johansson, Jessica

    2005-01-01

    Most Strategic Environmental Assessment (SEA) research has been preoccupied with SEA as a procedure and there are relatively few developments and tests of analytical methodologies. This paper applies and tests an analytical framework for an energy sector SEA. In a case study on a policy proposal for waste-to-energy taxation in Sweden, it studies changes in the energy system as a result of implementing the suggested tax by testing three analytical pathways: an LCA pathway, a site-dependent pathway, and a qualitative pathway. In addition, several valuation methods are applied. The assessment indicates that there are some overall environmental benefits to introducing a tax, but that benefits are modest compared to the potential. The methods are discussed in relation to characteristics for effective policy learning and knowledge uptake. The application shows that in many ways they complement each other rather than substitute for each other. The qualitative pathway is useful for raising awareness and getting a comprehensive view of environmental issues, but has limited potential for decision support. The precision increased as we went to LCA and to site-dependent analysis, and a hierarchy emerged in which the qualitative pathway filled rudimentary functions whereas the site-dependent analysis gave more advanced decision support. All methods had limited potential in supporting a choice between alternatives unless data was aggregated through a valuation exercise

  7. Lead isotope ratios in artists' lead white: a progress report

    Energy Technology Data Exchange (ETDEWEB)

    Keisch, B; Callahan, R C [Carnegie-Mellon Univ., Pittsburgh, Pa. (USA)

    1976-07-01

    The lead isotope ratios in over four hundred samples of lead white have been determined. The samples represent various geographical sources and dates from the thirteenth century to the present. A new method for organizing this large volume of data is described which helps with the visualization of temporal and geographic patterns. A number of interesting relationships between lead isotope ratio and date or source are shown to exist. Some examples of successful applications of this methodology are described.

  8. The fast ratio: A rapid measure for testing the dominance of the fast component in the initial OSL signal from quartz

    International Nuclear Information System (INIS)

    Durcan, Julie A.; Duller, Geoff A.T.

    2011-01-01

    The signal from the fast component is usually considered preferable for quartz optically stimulated luminescence (OSL) dating, however its presence in a continuous wave (CW) OSL signal is often assumed, rather than verified. This paper presents an objective measure (termed the fast ratio) for testing the dominance of the fast component in the initial part of a quartz OSL signal. The ratio is based upon the photo ionisation cross-sections of the fast and medium components and the power of the measurement equipment used to record the OSL signal, and it compares parts of the OSL signal selected to represent the fast and medium components. The ability of the fast ratio to distinguish between samples whose CW-OSL signal is dominated by the fast and non-fast components is demonstrated by comparing the fast ratio with the contribution of the fast component calculated from curve deconvolution of measured OSL signals and from simulated data. The ratio offers a rapid method for screening a large number of OSL signals obtained for individual equivalent dose estimates, it can be calculated and applied as easily as other routine screening methods, and is transferrable between different aliquots, samples and measurement equipment. - Highlights: → Fast ratio is a measure which tests dominance of fast component in quartz OSL signals. → A fast ratio above 20 implies a CW-OSL signal is dominated by fast component. → Fast ratio can be easily and rapidly applied to a large number of OSL signals. → Uses include signal comparison, data screening, identify need for further analysis.

  9. Chemiluminescence-based multivariate sensing of local equivalence ratios in premixed atmospheric methane-air flames

    Energy Technology Data Exchange (ETDEWEB)

    Tripathi, Markandey M.; Krishnan, Sundar R.; Srinivasan, Kalyan K.; Yueh, Fang-Yu; Singh, Jagdish P.

    2011-09-07

    Chemiluminescence emissions from OH*, CH*, C2, and CO2 formed within the reaction zone of premixed flames depend upon the fuel-air equivalence ratio in the burning mixture. In the present paper, a new partial least square regression (PLS-R) based multivariate sensing methodology is investigated and compared with an OH*/CH* intensity ratio-based calibration model for sensing equivalence ratio in atmospheric methane-air premixed flames. Five replications of spectral data at nine different equivalence ratios ranging from 0.73 to 1.48 were used in the calibration of both models. During model development, the PLS-R model was initially validated with the calibration data set using the leave-one-out cross validation technique. Since the PLS-R model used the entire raw spectral intensities, it did not need the nonlinear background subtraction of CO2 emission that is required for typical OH*/CH* intensity ratio calibrations. An unbiased spectral data set (not used in the PLS-R model development), for 28 different equivalence ratio conditions ranging from 0.71 to 1.67, was used to predict equivalence ratios using the PLS-R and the intensity ratio calibration models. It was found that the equivalence ratios predicted with the PLS-R based multivariate calibration model matched the experimentally measured equivalence ratios within 7%; whereas, the OH*/CH* intensity ratio calibration grossly underpredicted equivalence ratios in comparison to measured equivalence ratios, especially under rich conditions ( > 1.2). The practical implications of the chemiluminescence-based multivariate equivalence ratio sensing methodology are also discussed.

  10. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  11. Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development

    Science.gov (United States)

    1986-10-01

    parameter, sample size and fa- tigue test duration. The required input are 1. Residual strength Weibull shape parameter ( ALPR ) 2. Fatigue life Weibull shape...INPUT STRENGTH ALPHA’) READ(*,*) ALPR ALPRI = 1.O/ ALPR WRITE(*, 2) 2 FORMAT( 2X, ’PLEASE INPUT LIFE ALPHA’) READ(*,*) ALPL ALPLI - 1.0/ALPL WRITE(*, 3...3 FORMAT(2X,’PLEASE INPUT SAMPLE SIZE’) READ(*,*) N AN - N WRITE(*,4) 4 FORMAT(2X,’PLEASE INPUT TEST DURATION’) READ(*,*) T RALP - ALPL/ ALPR ARGR - 1

  12. Failure modes induced by natural radiation environments on DRAM memories: study, test methodology and mitigation technique

    International Nuclear Information System (INIS)

    Bougerol, A.

    2011-05-01

    DRAMs are frequently used in space and aeronautic systems. Their sensitivity to cosmic radiations have to be known in order to satisfy reliability requirements for critical applications. These evaluations are traditionally done with particle accelerators. However, devices become more complex with technology integration. Therefore new effects appear, inducing longer and more expensive tests. There is a complementary solution: the pulsed laser, which triggers similar effects as particles. Thanks to these two test tools, main DRAM radiation failure modes were studied: SEUs (Single Event Upset) in memory blocks, and SEFIs (Single Event Functional Interrupt) in peripheral circuits. This work demonstrates the influence of test patterns on SEU and SEFI sensitivities depending on technology used. In addition, this study identifies the origin of the most frequent type of SEFIs. Moreover, laser techniques were developed to quantify sensitive surfaces of the different effects. This work led to a new test methodology for industry, in order to optimize test cost and efficiency using both pulsed laser beams and particle accelerators. Finally, a new fault tolerant technique is proposed: based on DRAM cell radiation immunity when discharged, this technique allows to correct all bits of a logic word. (author)

  13. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND-GENERATION MODEL. PART II, TEST RESULTS AND AN ANALYSIS OF RECALL RATIO.

    Science.gov (United States)

    ALTMANN, BERTHOLD

    AFTER A BRIEF SUMMARY OF THE TEST PROGRAM (DESCRIBED MORE FULLY IN LI 000 318), THE STATISTICAL RESULTS TABULATED AS OVERALL "ABC (APPROACH BY CONCEPT)-RELEVANCE RATIOS" AND "ABC-RECALL FIGURES" ARE PRESENTED AND REVIEWED. AN ABSTRACT MODEL DEVELOPED IN ACCORDANCE WITH MAX WEBER'S "IDEALTYPUS" ("DIE OBJEKTIVITAET…

  14. Mirror-mark tests performed on jackdaws reveal potential methodological problems in the use of stickers in avian mark-test studies.

    Directory of Open Access Journals (Sweden)

    Manuel Soler

    Full Text Available Some animals are capable of recognizing themselves in a mirror, which is considered to be demonstrated by passing the mark test. Mirror self-recognition capacity has been found in just a few mammals having very large brains and only in one bird, the magpie (Pica pica. The results obtained in magpies have enormous biological and cognitive implications because the fact that magpies were able to pass the mark test meant that this species is at the same cognitive level with great apes, that mirror self-recognition has evolved independently in the magpie and great apes (which diverged 300 million years ago, and that the neocortex (which is not present in the bird's brains is not a prerequisite for mirror self-recognition as previously believed. Here, we have replicated the experimental design used on magpies to determine whether jackdaws (Corvus monedula are also capable of mirror self-recognition by passing the mark test. We found that our nine jackdaws showed a very high interest towards the mirror and exhibited self-contingent behavior as soon as mirrors were introduced. However, jackdaws were not able to pass the mark test: both sticker-directed actions and sticker removal were performed with a similar frequency in both the cardboard (control and the mirror conditions. We conclude that our jackdaws' behaviour raises non-trivial questions about the methodology used in the avian mark test. Our study suggests that the use of self-adhesive stickers on sensitive throat feathers may open the way to artefactual results because birds might perceive the stickers tactilely.

  15. Establishing a Ballistic Test Methodology for Documenting the Containment Capability of Small Gas Turbine Engine Compressors

    Science.gov (United States)

    Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.

    2009-01-01

    A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction and speed that would occur in a blade-out event. High-peed, digital-video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full field displacement and strain information at the back side of the impact point.

  16. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology.

    Science.gov (United States)

    Saadat, Victoria M

    2015-01-01

    The dissolution of the USSR resulted in independence for constituent republics but left them battling an unstable economic environment and healthcare. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. In this systematic review of the literature, Pubmed was searched for English-language studies using the key search terms "HIV", "AIDS", "human immunodeficiency virus", "acquired immune deficiency syndrome", "Central Asia", "Kazakhstan", "Kyrgyzstan", "Uzbekistan", "Tajikistan", "Turkmenistan", "Russia", "Ukraine", "Armenia", "Azerbaijan", and "Georgia". Studies were evaluated against eligibility criteria for inclusion. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as "risk" and "barriers". Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient, and that results would not remain confidential. Frequent barriers to treatment were based on a distrust in the treatment system. The findings of this review reveal methodological limitations that span the existing studies. Small sample size, cross-sectional design, and non-probabilistic sampling methods were frequently

  17. Comparison of Urine Albumin-to-Creatinine Ratio (ACR) Between ACR Strip Test and Quantitative Test in Prediabetes and Diabetes

    Science.gov (United States)

    Cho, Seon; Kim, Suyoung; Cho, Han-Ik

    2017-01-01

    Background Albuminuria is generally known as a sensitive marker of renal and cardiovascular dysfunction. It can be used to help predict the occurrence of nephropathy and cardiovascular disorders in diabetes. Individuals with prediabetes have a tendency to develop macrovascular and microvascular pathology, resulting in an increased risk of retinopathy, cardiovascular diseases, and chronic renal diseases. We evaluated the clinical value of a strip test for measuring the urinary albumin-to-creatinine ratio (ACR) in prediabetes and diabetes. Methods Spot urine samples were obtained from 226 prediabetic and 275 diabetic subjects during regular health checkups. Urinary ACR was measured by using strip and laboratory quantitative tests. Results The positive rates of albuminuria measured by using the ACR strip test were 15.5% (microalbuminuria, 14.6%; macroalbuminuria, 0.9%) and 30.5% (microalbuminuria, 25.1%; macroalbuminuria, 5.5%) in prediabetes and diabetes, respectively. In the prediabetic population, the sensitivity, specificity, positive predictive value, negative predictive value, and overall accuracy of the ACR strip method were 92.0%, 94.0%, 65.7%, 99.0%, and 93.8%, respectively; the corresponding values in the diabetic population were 80.0%, 91.6%, 81.0%, 91.1%, and 88.0%, respectively. The median [interquartile range] ACR values in the strip tests for measurement ranges of 300 mg/g were 9.4 [6.3-15.4], 46.9 [26.5-87.7], and 368.8 [296.2-575.2] mg/g, respectively, using the laboratory method. Conclusions The ACR strip test showed high sensitivity, specificity, and negative predictive value, suggesting that the test can be used to screen for albuminuria in cases of prediabetes and diabetes. PMID:27834062

  18. National Certification Methodology for the Nuclear Weapons Stockpile

    International Nuclear Information System (INIS)

    Goodwin, B T; Juzaitis, R J

    2006-01-01

    Lawrence Livermore and Los Alamos National Laboratories have developed a common framework and key elements of a national certification methodology called Quantification of Margins and Uncertainties (QMU). A spectrum from senior managers to weapons designers has been engaged in this activity at the two laboratories for on the order of a year to codify this methodology in an overarching and integrated paper. Following is the certification paper that has evolved. In the process of writing this paper, an important outcome has been the realization that a joint Livermore/Los Alamos workshop on QMU, focusing on clearly identifying and quantifying differences between approaches between the two labs plus developing an even stronger technical foundation on methodology, will be valuable. Later in FY03, such a joint laboratory workshop will be held. One of the outcomes of this workshop will be a new version of this certification paper. A comprehensive approach to certification must include specification of problem scope, development of system baseline models, formulation of standards of performance assessment, and effective procedures for peer review and documentation. This document concentrates on the assessment and peer review aspects of the problem. In addressing these points, a central role is played by a 'watch list' for weapons derived from credible failure modes and performance gate analyses. The watch list must reflect our best assessment of factors that are critical to weapons performance. High fidelity experiments and calculations as well as full exploitation of archival test data are essential to this process. Peer review, advisory groups and red teams play an important role in confirming the validity of the watch list. The framework for certification developed by the Laboratories has many basic features in common, but some significant differences in the detailed technical implementation of the overall methodology remain. Joint certification workshops held in June

  19. A Methodological Report: Adapting the 505 Change-of-Direction Speed Test Specific to American Football.

    Science.gov (United States)

    Lockie, Robert G; Farzad, Jalilvand; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A

    2017-02-01

    Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.

  20. Lead isotope ratios in artists' lead white: a progress report

    International Nuclear Information System (INIS)

    Keisch, B.; Callahan, R.C.

    1976-01-01

    The lead isotope ratios in over four hundred samples of lead white have been determined. The samples represent various geographical sources and dates from the thirteenth century to the present. A new method for organizing this large volume of data is described which helps with the visualization of temporal and geographic patterns. A number of interesting relationships between lead isotope ratio and date or source are shown to exist. Some examples of successful applications of this methodology are described. (author)

  1. Comment on Hall et al. (2017), "How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial".

    Science.gov (United States)

    Sabour, Siamak

    2018-03-08

    The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical test and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. The qualitative variables of sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as odds ratio (i.e., ratio of true to false results), are the most appropriate estimates to evaluate validity of a test compared to a gold standard. In the case of quantitative variables, depending on distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.

  2. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in course of the project. The discussion and conclusions made in this particular report concern two issues mainly, (i) the use of numerical simulations as a means of gaining creditability, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  3. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during product development phase to verify whether its reliability satisfies the predetermined level within feasible test duration. The actual degradation from engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method by converting the required reliability level into allowable cumulative degradation in ADT and comparing the actual accumulative degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of decision variable in reliability demonstration under the constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with example on reliability demonstration of alloy product, and is applied to demonstrate the wear reliability within long service duration of spherical plain bearing in the end. - Highlights: • We present a reliability demonstration method by ADT for products with monotonic degradation process, which may be applied to verify reliability with long service life for products with monotonic degradation process within feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from the existed optimal ADT design for more accurate reliability estimation by different objective function and different constraints. • The methods are applied to demonstrate the wear reliability within long service duration of

  4. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution

    International Nuclear Information System (INIS)

    Tregidgo, Daniel J.; West, Sarah E.; Ashmore, Mike R.

    2013-01-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. -- Highlights: •We investigated the validity of a simplified citizen science methodology. •Lichen abundance data were used to indicate nitrogenous air pollution. •Significant changes were detected beside busy roads with low background pollution. •The methodology detected major, but not subtle, contrasts in pollution. •Sensitivity of citizen science methods to environmental change must be evaluated. -- A simplified lichen biomonitoring method used for citizen science can detect the impact of nitrogenous air pollution from local roads

  5. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1993-01-01

    The use of multidisciplinary teams to develop Type B shipping containers improves the quality and reliability of these reusable packagings. Including the people involved in all aspects of the design, certification and use of the package leads to more innovative, user-friendly containers. Concurrent use of testing and analysis allows engineers to more fully characterize a shipping container's responses to the environments given in the regulations, and provides a strong basis for certification. The combination of the input and output of these efforts should provide a general methodology that designers of Type B radioactive material shipping containers can utilize to optimize and certify their designs. (J.P.N.)

  6. A balanced hazard ratio for risk group evaluation from survival data.

    Science.gov (United States)

    Branders, Samuel; Dupont, Pierre

    2015-07-30

    Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high or low risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated to it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. Those issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve those issues. This new performance metric keeps an intuitive interpretation and is as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic markers. Copyright © 2015 John Wiley & Sons, Ltd.

  7. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in sequential probability ratio test to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis with the restriction that the sum of error probabilities is a pre-assigned constant to find the optimal sample size and finally a comparison is done with the optimal sample size found from fixed sample size procedure. The results are applied to the cases when the random variate follows a normal law as well as Bernoullian law.

  8. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1,27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the

  9. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  10. Total Protein and Albumin/Globulin Ratio Test

    Science.gov (United States)

    ... Plasma Free Metanephrines Platelet Count Platelet Function Tests Pleural Fluid Analysis PML-RARA Porphyrin Tests Potassium Prealbumin ... of the various types of proteins in the liquid ( serum or plasma ) portion of the blood. Two ...

  11. Electroencephalogram-based methodology for determining unconsciousness during depopulation.

    Science.gov (United States)

    Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L

    2012-12-01

    When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining

  12. ASTM and VAMAS activities in titanium matrix composites test methods development

    Science.gov (United States)

    Johnson, W. S.; Harmon, D. M.; Bartolotta, P. A.; Russ, S. M.

    1994-01-01

    Titanium matrix composites (TMC's) are being considered for a number of aerospace applications ranging from high performance engine components to airframe structures in areas that require high stiffness to weight ratios at temperatures up to 400 C. TMC's exhibit unique mechanical behavior due to fiber-matrix interface failures, matrix cracks bridged by fibers, thermo-viscoplastic behavior of the matrix at elevated temperatures, and the development of significant thermal residual stresses in the composite due to fabrication. Standard testing methodology must be developed to reflect the uniqueness of this type of material systems. The purpose of this paper is to review the current activities in ASTM and Versailles Project on Advanced Materials and Standards (VAMAS) that are directed toward the development of standard test methodology for titanium matrix composites.

  13. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specially addressed are…

  14. Calibrating recruitment estimates for mourning doves from harvest age ratios

    Science.gov (United States)

    Miller, David A.; Otis, David L.

    2010-01-01

    We examined results from the first national-scale effort to estimate mourning dove (Zenaida macroura) age ratios and developed a simple, efficient, and generalizable methodology for calibrating estimates. Our method predicted age classes of unknown-age wings based on backward projection of molt distributions from fall harvest collections to preseason banding. We estimated 1) the proportion of late-molt individuals in each age class, and 2) the molt rates of juvenile and adult birds. Monte Carlo simulations demonstrated our estimator was minimally biased. We estimated model parameters using 96,811 wings collected from hunters and 42,189 birds banded during preseason from 68 collection blocks in 22 states during the 2005–2007 hunting seasons. We also used estimates to derive a correction factor, based on latitude and longitude of samples, which can be applied to future surveys. We estimated differential vulnerability of age classes to harvest using data from banded birds and applied that to harvest age ratios to estimate population age ratios. Average, uncorrected age ratio of known-age wings for states that allow hunting was 2.25 (SD 0.85) juveniles:adult, and average, corrected ratio was 1.91 (SD 0.68), as determined from harvest age ratios from an independent sample of 41,084 wings collected from random hunters in 2007 and 2008. We used an independent estimate of differential vulnerability to adjust corrected harvest age ratios and estimated the average population age ratio as 1.45 (SD 0.52), a direct measure of recruitment rates. Average annual recruitment rates were highest east of the Mississippi River and in the northwestern United States, with lower rates between. Our results demonstrate a robust methodology for calibrating recruitment estimates for mourning doves and represent the first large-scale estimates of recruitment for the species. Our methods can be used by managers to correct future harvest survey data to generate recruitment estimates for use in

  15. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    International Nuclear Information System (INIS)

    Andersson, Johan; Berglund, Johan; Follin, Sven; Hakami, Eva; Halvarson, Jan; Hermanson, Jan; Laaksoharju, Marcus; Rhen, Ingvar; Wahlgren, C.H.

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline and after this

  16. Is case-chaos methodology an appropriate alternative to conventional case-control studies for investigating outbreaks?

    Science.gov (United States)

    Edelstein, Michael; Wallensten, Anders; Kühlmann-Berenzon, Sharon

    2014-08-15

    Case-chaos methodology is a proposed alternative to case-control studies that simulates controls by randomly reshuffling the exposures of cases. We evaluated the method using data on outbreaks in Sweden. We identified 5 case-control studies from foodborne illness outbreaks that occurred between 2005 and 2012. Using case-chaos methodology, we calculated odds ratios 1,000 times for each exposure. We used the median as the point estimate and the 2.5th and 97.5th percentiles as the confidence interval. We compared case-chaos matched odds ratios with their respective case-control odds ratios in terms of statistical significance. Using Spearman's correlation, we estimated the correlation between matched odds ratios and the proportion of cases exposed to each exposure and quantified the relationship between the 2 using a normal linear mixed model. Each case-control study identified an outbreak vehicle (odds ratios = 4.9-45). Case-chaos methodology identified the outbreak vehicle 3 out of 5 times. It identified significant associations in 22 of 113 exposures that were not associated with outcome and 5 of 18 exposures that were significantly associated with outcome. Log matched odds ratios correlated with their respective proportion of cases exposed (Spearman ρ = 0.91) and increased significantly with the proportion of cases exposed (b = 0.054). Case-chaos methodology missed the outbreak source 2 of 5 times and identified spurious associations between a number of exposures and outcome. Measures of association correlated with the proportion of cases exposed. We recommended against using case-chaos analysis during outbreak investigations. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Testing cost-effective methodologies for flood and seismic vulnerability assessment in communities of developing countries (Dajç, northern Albania

    Directory of Open Access Journals (Sweden)

    Veronica Pazzi

    2016-05-01

    Full Text Available Nowadays many developing countries need effective measures to reduce the disaster related risks. Structural interventions are the most effective to achieve these aims. Nevertheless, in the absence of adequate financial resources different low-cost strategies can be used to minimize losses. The purpose of this paper is to demonstrate that the disaster risk reduction can be gathered building a community coping capacity. In the case study, flood and seismic analyses have been carried out using relatively simple and low-cost technologies, fundamental for governments and research institutions of poorly developed countries. In fact, through the acquisition and dissemination of these basic information, a reduction of vulnerability and risk can be achieved. In detail, two methodologies for the evaluation of hydraulic and seismic vulnerability were tested in the Dajç municipality (Northern Albania, a high-seismicity region that is also severely affected by floods. Updated bathymetric, topographic and hydraulic data were processed with HEC-RAS software to identify sites potentially affected by dykes overflowing. Besides, the soil-structure interaction effects for three strategic buildings were studied using microtremors and the Horizontal to Vertical Spectral Ratio method. This flood and seismic vulnerability analysis was then evaluated in terms of costs and ease of accessibility in order to suggest the best use both of the employed devices and the obtained information for designing good civil protection plans and to inform the population about the right behaviour in case of threat.

  18. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR-score. Fourty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR-score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Methodologies for certification of transuranic waste packages

    International Nuclear Information System (INIS)

    Christensen, R.N.; Kok, K.D.

    1980-10-01

    The objective of this study was to postulate methodologies for certification that a waste package is acceptable for disposal in a licensed geologic repository. Within the context of this report, certification means the overall process which verifies that a waste package meets the criteria or specifications established for acceptance for disposal in a repository. The overall methodology for certification will include (1) certifying authorities, (2) tests and procedures, and (3) documentation and quality assurance programs. Each criterion will require a methodology that is specific to that criterion. In some cases, different waste forms will require a different methodology. The purpose of predicting certification methodologies is to provide additional information as to what changes, if any, are needed for the TRU waste in storage

  20. Optimization of biodiesel production from castor oil using response surface methodology.

    Science.gov (United States)

    Jeong, Gwi-Taek; Park, Don-Hee

    2009-05-01

    The short supply of edible vegetable oils is the limiting factor in the progression of biodiesel technology; thus, in this study, we applied response surface methodology in order to optimize the reaction factors for biodiesel synthesis from inedible castor oil. Specifically, we evaluated the effects of multiple parameters and their reciprocal interactions using a five-level three-factor design. In a total of 20 individual experiments, we optimized the reaction temperature, oil-to-methanol molar ratio, and quantity of catalyst. Our model equation predicted that the following conditions would generate the maximum quantity of castor biodiesel (92 wt.%): a 40-min reaction at 35.5 degrees C, with an oil-to-methanol molar ratio of 1:8.24, and a catalyst concentration of 1.45% of KOH by weight of castor oil. Subsequent empirical analyses of the biodiesel generated under the predicted conditions showed that the model equation accurately predicted castor biodiesel yields within the tested ranges. The biodiesel produced from castor oil satisfied the relevant quality standards without regard to viscosity and cold filter plugging point.

  1. Glass-surface area to solution-volume ratio and its implications to accelerated leach testing

    International Nuclear Information System (INIS)

    Pederson, L.R.; Buckwalter, C.Q.; McVay, G.L.; Riddle, B.L.

    1982-10-01

    The value of glass surface area to solution volume ratio (SA/V) can strongly influence the leaching rate of PNL 76-68 glass. The leaching rate is largely governed by silicon solubility constraints. Silicic acid in solution reduced the elemental release of all glass components. No components are leached to depths greater than that of silicon. The presence of the reaction layer had no measurable effect on the rate of leaching. Accelerated leach testing is possible since PNL 76-68 glass leaching is solubility-controlled (except at very low SA/V values). A series of glasses leached with SA/V x time = constant will yield identical elemental release

  2. Supplement to a Methodology for Succession Planning for Technical Experts

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cain, Ronald A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Agreda, Carla L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-01

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, and the methodology is tested through interviews with selected subject matter experts.

  3. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    Energy Technology Data Exchange (ETDEWEB)

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: Shake table tests of precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments have been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that he PSHA predictions are too high, possibly because the uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in design of the canisters distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  4. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main results presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types...... of experiments are performed, a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules...

  5. HIV Risks, Testing, and Treatment in the Former Soviet Union: Challenges and Future Directions in Research and Methodology

    Directory of Open Access Journals (Sweden)

    Victoria M. Saadat

    2016-01-01

    Full Text Available Background. The dissolution of the USSR resulted in independence for constituent republics but left them battling an unstable economic environment and healthcare. Increases in injection drug use, prostitution, and migration were all widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to determine common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies.Methods. In this systematic review of the literature, Pubmed was searched for English-language studies using the key search terms “HIV”, “AIDS”, “human immunodeficiency virus”, “acquired immune deficiency syndrome”, “Central Asia”, “Kazakhstan”, “Kyrgyzstan”, “Uzbekistan”, “Tajikistan”, “Turkmenistan”, “Russia”, “Ukraine”, “Armenia”, “Azerbaijan”, and “Georgia”. Studies were evaluated against eligibility criteria for inclusion.Results. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as “risk” and “barriers”. Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users.  Common barriers to testing included that testing was inconvenient, and that results would not remain confidential.  Frequent barriers to treatment were based on a distrust in the treatment system. Conclusion. The findings of this review reveal methodological limitations

  6. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly asses each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  7. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    Full Text Available The development of new and modernization of existing aviation equipment specimens of different classes are ac- companied and completed by the complex process of ground and flight tests. This phase of aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complex and systems, aircraft, ships, security and flight control offices, information processing laborato- ries and many other elements. The system analysis results of development challenges of the automated control systems of aviation equipment tests operations are presented. The automated control systems are in essence an automated data bank. The key role of development of flight tests automated control system in the process of creation of the automated control sys- tems of aviation equipment tests operations is substantiated. The way of the mobile modular measuring complexes integra- tion and the need for national methodologies and technological standards for database systems design concepts are grounded. Database system, as a central element in this scheme, provides collection, storing and updating of values of the elements described above in pace and the required frequency of the controlled object state monitoring. It is database system that pro- vides the supervisory unit with actual data corresponding to specific moments of time, which concern the state processes, assessments of the progress and results of flight experiments, creating the necessary environment for aviation equipment managing and testing as a whole. The basis for development of subsystems of automated control systems of aviation equip- ment tests operations are conceptual design processes of the respective database system, the implementation effectiveness of which largely determines the level of success and ability to develop the systems being created. Introduced conclusions and suggestions can be used in the

  8. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models are a key issue in Best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes have lead to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainties is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies CIRCÉ and IPREM (FFTBM) capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities is employed while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  9. Toward a simple, repeatable, non-destructive approach to measuring stable-isotope ratios of water within tree stems

    Science.gov (United States)

    Raulerson, S.; Volkmann, T.; Pangle, L. A.

    2017-12-01

    Traditional methodologies for measuring ratios of stable isotopes within the xylem water of trees involve destructive coring of the stem. A recent approach involves permanently installed probes within the stem, and an on-site assembly of pumps, switching valves, gas lines, and climate-controlled structure for field deployment of a laser spectrometer. The former method limits the possible temporal resolution of sampling, and sample size, while the latter may not be feasible for many research groups. We present results from initial laboratory efforts towards developing a non-destructive, temporally-resolved technique for measuring stable isotope ratios within the xylem flow of trees. Researchers have used direct liquid-vapor equilibration as a method to measure isotope ratios of the water in soil pores. Typically, this is done by placing soil samples in a fixed container, and allowing the liquid water within the soil to come into isotopic equilibrium with the headspace of the container. Water can also be removed via cryogenic distillation or azeotropic distillation, with the resulting liquid tested for isotope ratios. Alternatively, the isotope ratios of the water vapor can be directly measured using a laser-based water vapor isotope analyzer. Well-established fractionation factors and the isotope ratios in the vapor phase are then used to calculate the isotope ratios in the liquid phase. We propose a setup which would install a single, removable chamber onto a tree, where vapor samples could non-destructively and repeatedly be taken. These vapor samples will be injected into a laser-based isotope analyzer by a recirculating gas conveyance system. A major part of what is presented here is in the procedure of taking vapor samples at 100% relative humidity, appropriately diluting them with completely dry N2 calibration gas, and injecting them into the gas conveyance system without inducing fractionation in the process. This methodology will be helpful in making

  10. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs.26 references

  11. TESTING TESTS ON ACTIVE GALACTIC NUCLEI MICROVARIABILITY

    International Nuclear Information System (INIS)

    De Diego, Jose A.

    2010-01-01

    Literature on optical and infrared microvariability in active galactic nuclei (AGNs) reflects a diversity of statistical tests and strategies to detect tiny variations in the light curves of these sources. Comparison between the results obtained using different methodologies is difficult, and the pros and cons of each statistical method are often badly understood or even ignored. Even worse, improperly tested methodologies are becoming more and more common, and biased results may be misleading with regard to the origin of the AGN microvariability. This paper intends to point future research on AGN microvariability toward the use of powerful and well-tested statistical methodologies, providing a reference for choosing the best strategy to obtain unbiased results. Light curves monitoring has been simulated for quasars and for reference and comparison stars. Changes for the quasar light curves include both Gaussian fluctuations and linear variations. Simulated light curves have been analyzed using χ 2 tests, F tests for variances, one-way analyses of variance and C-statistics. Statistical Type I and Type II errors, which indicate the robustness and the power of the tests, have been obtained in each case. One-way analyses of variance and χ 2 prove to be powerful and robust estimators for microvariations, while the C-statistic is not a reliable methodology and its use should be avoided.

  12. Common methodological flaws in economic evaluations.

    Science.gov (United States)

    Drummond, Michael; Sculpher, Mark

    2005-07-01

    Economic evaluations are increasingly being used by those bodies such as government agencies and managed care groups that make decisions about the reimbursement of health technologies. However, several reviews of economic evaluations point to numerous deficiencies in the methodology of studies or the failure to follow published methodological guidelines. This article, written for healthcare decision-makers and other users of economic evaluations, outlines the common methodological flaws in studies, focussing on those issues that are likely to be most important when deciding on the reimbursement, or guidance for use, of health technologies. The main flaws discussed are: (i) omission of important costs or benefits; (ii) inappropriate selection of alternatives for comparison; (iii) problems in making indirect comparisons; (iv) inadequate representation of the effectiveness data; (v) inappropriate extrapolation beyond the period observed in clinical studies; (vi) excessive use of assumptions rather than data; (vii) inadequate characterization of uncertainty; (viii) problems in aggregation of results; (ix) reporting of average cost-effectiveness ratios; (x) lack of consideration of generalizability issues; and (xi) selective reporting of findings. In each case examples are given from the literature and guidance is offered on how to detect flaws in economic evaluations.

  13. Combining rigour with relevance: a novel methodology for testing Chinese herbal medicine.

    Science.gov (United States)

    Flower, Andrew; Lewith, George; Little, Paul

    2011-03-24

    There is a need to develop an evidence base for Chinese herbal medicine (CHM) that is both rigorous and reflective of good practice. This paper proposes a novel methodology to test individualised herbal decoctions using a randomised, double blinded, placebo controlled clinical trial. A feasibility study was conducted to explore the role of CHM in the treatment of endometriosis. Herbal formulae were pre-cooked and dispensed as individual doses in sealed plastic sachets. This permitted the development and testing of a plausible placebo decoction. Participants were randomised at a distant pharmacy to receive either an individualised herbal prescription or a placebo. The trial met the predetermined criteria for good practice. Neither the participants nor the practitioner-researcher could reliably identify group allocation. Of the 28 women who completed the trial, in the placebo group (n=15) 3 women (20%) correctly guessed they were on placebo, 8 (53%) thought they were on herbs and 4 (27%) did not know which group they had been allocated to. In the active group (n=13) 2 (15%) though they were on placebo, 8 (62%) thought they were on herbs and 3 (23%) did not know. Randomisation, double blinding and allocation concealment were successful and the study model appeared to be feasible and effective. It is now possible to subject CHM to rigorous scientific scrutiny without compromising model validity. Improvement in the design of the placebo using food colourings and flavourings instead of dried food will help guarantee the therapeutic inertia of the placebo decoction. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  14. Research Methodology in Recurrent Pregnancy Loss

    DEFF Research Database (Denmark)

    Christiansen, Ole B

    2014-01-01

    The aim of this article is to highlight pitfalls in research methodology that may explain why studies in recurrent pregnancy loss (RPL) often provide very divergent results. It is hoped that insight into this issue may help clinicians decide which published studies are the most valid. It may help...... researchers to eliminate methodological flaws in future studies, which may hopefully come to some kind of agreement about the usefulness of diagnostic tests and treatments in RPL....

  15. Mobile Usability Testing in Healthcare: Methodological Approaches.

    Science.gov (United States)

    Borycki, Elizabeth M; Monkman, Helen; Griffith, Janessa; Kushniruk, Andre W

    2015-01-01

    The use of mobile devices and healthcare applications is increasing exponentially worldwide. This has lead to the need for the healthcare industry to develop a better understanding of the impact of the usability of mobile software and hardware upon consumer and health professional adoption and use of these technologies. There are many methodological approaches that can be employed in conducting usability evaluation of mobile technologies. More obtrusive approaches to collecting study data may lead to changes in study participant behaviour, leading to study results that are less consistent with how the technologies will be used in the real-world. Alternatively, less obstrusive methods used in evaluating the usability of mobile software and hardware in-situ and laboratory settings can lead to less detailed information being collected about how an individual interacts with both the software and hardware. In this paper we review and discuss several innovative mobile usability evaluation methods on a contiuum from least to most obtrusive and their effects on the quality of the usability data collected. The strengths and limitations of methods are also discussed.

  16. Response surface methodology based optimization of diesel–n-butanol –cotton oil ternary blend ratios to improve engine performance and exhaust emission characteristics

    International Nuclear Information System (INIS)

    Atmanlı, Alpaslan; Yüksel, Bedri; İleri, Erol; Deniz Karaoglan, A.

    2015-01-01

    Highlights: • RSM based optimization for optimum blend ratio of diesel fuel, n-butanol and cotton oil was done. • 65.5 vol.% diesel fuel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC) was determined. • DnBC decreased brake torque, brake power, BTE and BMEP, while increased BSFC. • DnBC decreased NO x , CO and HC emissions. - Abstract: Many studies declare that 20% biodiesel is the optimum concentration for biodiesel–diesel fuel blends to improve performance. The present work focuses on finding diesel fuel, n-butanol, and cotton oil optimum blend ratios for diesel engine applications by using the response surface method (RSM). Experimental test fuels were prepared by choosing 7 different concentrations, where phase decomposition did not occur in the phase diagram of −10 °C. Experiments were carried out at full load conditions and the constant speed (2200 rpm) of maximum brake torque to determine engine performance and emission parameters. According to the test results of the engine, optimization was done by using RSM considering engine performance and exhaust emissions parameters, to identify the rates of concentrations of components in the optimum blend of three. Confirmation tests were employed to compare the output values of concentrations that were identified by optimization. The real experiment results and the R 2 actual values that show the relation between the outputs from the optimizations and real experiments were determined in high accordance. The optimum component concentration was determined as 65.5 vol.% diesel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC). According to engine performance tests brake torque, brake power, BTE and BMEP of DnBC decreased while BSFC increased compared to those of diesel fuel. NO x , CO and HC emissions of DnBC drastically decreased as 11.33%, 45.17% and 81.45%, respectively

  17. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study

  18. A generic semi-implicit coupling methodology for use in RELAP5-3Dcopyright

    International Nuclear Information System (INIS)

    Aumiller, D.L.; Tomlinson, E.T.; Weaver, W.L.

    2000-01-01

    A generic semi-implicit coupling methodology has been developed and implemented in the RELAP5-3Dcopyright computer program. This methodology allows RELAP5-3Dcopyright to be used with other computer programs to perform integrated analyses of nuclear power reactor systems and related experimental facilities. The coupling methodology potentially allows different programs to be used to model different portions of the system. The programs are chosen based on their capability to model the phenomena that are important in the simulation in the various portions of the system being considered. The methodology was demonstrated using a test case in which the test geometry was divided into two parts each of which was solved as a RELAP5-3Dcopyright simulation. This test problem exercised all of the semi-implicit coupling features which were installed in RELAP5-3D0. The results of this verification test case show that the semi-implicit coupling methodology produces the same answer as the simulation of the test system as a single process

  19. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurment. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology

  20. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ(2) (1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1,27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ(2) (1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and

  1. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... practices using INR POCT in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality....

  2. Pearce element ratios: A paradigm for testing hypotheses

    Science.gov (United States)

    Russell, J. K.; Nicholls, Jim; Stanley, Clifford R.; Pearce, T. H.

    Science moves forward with the development of new ideas that are encapsulated by hypotheses whose aim is to explain the structure of data sets or to expand existing theory. These hypotheses remain conjecture until they have been tested. In fact, Karl Popper advocated that a scientist's job does not finish with the creation of an idea but, rather, begins with the testing of the related hypotheses. In Popper's [1959] advocation it is implicit that there be tools with which we can test our hypotheses. Consequently, the development of rigorous tests for conceptual models plays a major role in maintaining the integrity of scientific endeavor [e.g., Greenwood, 1989].

  3. Aircraft control surface failure detection and isolation using the OSGLR test. [orthogonal series generalized likelihood ratio

    Science.gov (United States)

    Bonnice, W. F.; Motyka, P.; Wagner, E.; Hall, S. R.

    1986-01-01

    The performance of the orthogonal series generalized likelihood ratio (OSGLR) test in detecting and isolating commercial aircraft control surface and actuator failures is evaluated. A modification to incorporate age-weighting which significantly reduces the sensitivity of the algorithm to modeling errors is presented. The steady-state implementation of the algorithm based on a single linear model valid for a cruise flight condition is tested using a nonlinear aircraft simulation. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection and isolation performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling on dynamic pressure and flap deflection is examined. Based on this testing, the OSGLR algorithm should be capable of detecting control surface failures that would affect the safe operation of a commercial aircraft. Isolation may be difficult if there are several surfaces which produce similar effects on the aircraft. Extending the algorithm over the entire operating envelope of a commercial aircraft appears feasible.

  4. Proceedings of International monitoring conference 'Development of rehabilitation methodology of environment of the Semipalatinsk region polluted by nuclear tests'

    International Nuclear Information System (INIS)

    2002-01-01

    The aim of the monitoring conference is draw an attention of government, national and international agencies, scientific societies, and local administrations to the ecological problems of Semipalatinsk nuclear test site, to combine the efforts of scientists to solve problems of soil disinfection, purification of surface and ground water from radioactive and heavy metals. It is expected that the knowledge, experience and methodology accumulated on the monitoring conference might be successfully transferred to solve analogous environmental problems of Kazakhstan

  5. Experimental study on the natural gas dual fuel engine test and the higher the mixture ratio of hydrogen to natural gas

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B.S.; Lee, Y.S.; Park, C.K. [Cheonnam University, Kwangju (Korea); Masahiro, S. [Kyoto University, Kyoto (Japan)

    1999-05-28

    One of the unsolved problems of the natural gas dual fuel engine is that there is too much exhaust of Total Hydrogen Carbon(THC) at a low equivalent mixture ratio. To fix it, a natural gas mixed with hydrogen was applied to engine test. The results showed that the higher the mixture ratio of hydrogen to natural gas, the higher the combustion efficiency. And when the amount of the intake air is reached to 90% of WOT, the combustion efficiency was promoted. But, like a case making the injection timing earlier, the equivalent mixture ratio for the nocking limit decreases and the produce of NOx increases. 5 refs., 9 figs., 1 tab.

  6. Modeling companion diagnostics in economic evaluations of targeted oncology therapies: systematic review and methodological checklist.

    Science.gov (United States)

    Doble, Brett; Tan, Marcus; Harris, Anthony; Lorgelly, Paula

    2015-02-01

    The successful use of a targeted therapy is intrinsically linked to the ability of a companion diagnostic to correctly identify patients most likely to benefit from treatment. The aim of this study was to review the characteristics of companion diagnostics that are of importance for inclusion in an economic evaluation. Approaches for including these characteristics in model-based economic evaluations are compared with the intent to describe best practice methods. Five databases and government agency websites were searched to identify model-based economic evaluations comparing a companion diagnostic and subsequent treatment strategy to another alternative treatment strategy with model parameters for the sensitivity and specificity of the companion diagnostic (primary synthesis). Economic evaluations that limited model parameters for the companion diagnostic to only its cost were also identified (secondary synthesis). Quality was assessed using the Quality of Health Economic Studies instrument. 30 studies were included in the review (primary synthesis n = 12; secondary synthesis n = 18). Incremental cost-effectiveness ratios may be lower when the only parameter for the companion diagnostic included in a model is the cost of testing. Incorporating the test's accuracy in addition to its cost may be a more appropriate methodological approach. Altering the prevalence of the genetic biomarker, specific population tested, type of test, test accuracy and timing/sequence of multiple tests can all impact overall model results. The impact of altering a test's threshold for positivity is unknown as it was not addressed in any of the included studies. Additional quality criteria as outlined in our methodological checklist should be considered due to the shortcomings of standard quality assessment tools in differentiating studies that incorporate important test-related characteristics and those that do not. There is a need to refine methods for incorporating the characteristics

  7. AXIAL RATIO OF EDGE-ON SPIRAL GALAXIES AS A TEST FOR BRIGHT RADIO HALOS

    International Nuclear Information System (INIS)

    Singal, J.; Jones, E.; Dunlap, H.; Kogut, A.

    2015-01-01

    We use surface brightness contour maps of nearby edge-on spiral galaxies to determine whether extended bright radio halos are common. In particular, we test a recent model of the spatial structure of the diffuse radio continuum by Subrahmanyan and Cowsik which posits that a substantial fraction of the observed high-latitude surface brightness originates from an extended Galactic halo of uniform emissivity. Measurements of the axial ratio of emission contours within a sample of normal spiral galaxies at 1500 MHz and below show no evidence for such a bright, extended radio halo. Either the Galaxy is atypical compared to nearby quiescent spirals or the bulk of the observed high-latitude emission does not originate from this type of extended halo. (letters)

  8. An approach for measuring the {sup 129}I/{sup 127}I ratio in fish samples

    Energy Technology Data Exchange (ETDEWEB)

    Kusuno, Haruka, E-mail: kusuno@um.u-tokyo.ac.jp [The University Museum, The University of Tokyo, 3-7-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Matsuzaki, Hiroyuki [The University Museum, The University of Tokyo, 3-7-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Nagata, Toshi; Miyairi, Yosuke; Yokoyama, Yusuke [Atmosphere and Ocean Research Institute, The University of Tokyo, 5-1-5, Kashiwanoha, Kashiwa-shi, Chiba 277-8564 (Japan); Ohkouchi, Naohiko [Japan Agency for Marine-Earth Science and Technology, 2-15, Natsushima-cho, Yokosuka-city, Kanagawa 237-0061 (Japan)

    2015-10-15

    The {sup 129}I/{sup 127}I ratio in marine fish samples was measured employing accelerator mass spectrometry. The measurement was successful because of the low experimental background of {sup 129}I. Pyrohydrolysis was applied to extract iodine from fish samples. The experimental background of pyrohydrolysis was checked carefully and evaluated as 10{sup 4}–10{sup 5} atoms {sup 129}I/combustion. The methodology employed in the present study thus required only 0.05–0.2 g of dried fish samples. The methodology was then applied to obtain the {sup 129}I/{sup 127}I ratio of marine fish samples collected from the Western Pacific Ocean as (0.63–1.2) × 10{sup −10}. These values were similar to the ratio for the surface seawater collected at the same station, 0.4 × 10{sup −10}. The {sup 129}I/{sup 127}I ratio of IAEA-414, which was a mix of fish from the Irish Sea and the North Sea, was also measured and determined as 1.82 × 10{sup −7}. Consequently, fish from the Western Pacific Ocean and the North Sea were distinguished by their {sup 129}I/{sup 127}I ratios. The {sup 129}I/{sup 127}I ratio is thus a direct indicator of the area of habitat of fish.

  9. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Pengaruh Debt to Equty Ratio, Current Ratio , Net Profit Margin Terhadap Harga Saham dengan Price Earning Ratio Sebagai Variabel Pemoderasi pada Perusahaan Manufaktur yang Terdaftar di BEI Periode 2012-2014

    OpenAIRE

    Theresia, Paskah Lia

    2017-01-01

    This study conducted to analyze the effect of variable Debt to Equity Ratio (DER), Current Ratio (CR), Net Profit Margin (NPM) andPrice Earnings Ratio (PER) to the Stock Prices with Price Earnings Ratio (PER) as an moderating variable on companies listed on Indonesian Stock Exchange from 2012 - 2014.The samplingtechnique used is purposive sampling and number of samples used by 23 companies. The analysis technique used are Descriptive Statistic Analysis, Classical Assumption Test, Hypothesis T...

  11. Nuclear Power Plant Thermocouple Sensor-Fault Detection and Classification Using Deep Learning and Generalized Likelihood Ratio Test

    Science.gov (United States)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-06-01

    In this paper, an online fault detection and classification method is proposed for thermocouples used in nuclear power plants. In the proposed method, the fault data are detected by the classification method, which classifies the fault data from the normal data. Deep belief network (DBN), a technique for deep learning, is applied to classify the fault data. The DBN has a multilayer feature extraction scheme, which is highly sensitive to a small variation of data. Since the classification method is unable to detect the faulty sensor; therefore, a technique is proposed to identify the faulty sensor from the fault data. Finally, the composite statistical hypothesis test, namely generalized likelihood ratio test, is applied to compute the fault pattern of the faulty sensor signal based on the magnitude of the fault. The performance of the proposed method is validated by field data obtained from thermocouple sensors of the fast breeder test reactor.

  12. Development of a calibration methodology and tests of kerma area product meters

    International Nuclear Information System (INIS)

    Costa, Nathalia Almeida

    2013-01-01

    The quantity kerma area product (PKA) is important to establish reference levels in diagnostic radiology exams. This quantity can be obtained using a PKA meter. The use of such meters is essential to evaluate the radiation dose in radiological procedures and is a good indicator to make sure that the dose limit to the patient's skin doesn't exceed. Sometimes, these meters come fixed to X radiation equipment, which makes its calibration difficult. In this work, it was developed a methodology for calibration of PKA meters. The instrument used for this purpose was the Patient Dose Calibrator (PDC). It was developed to be used as a reference to check the calibration of PKA and air kerma meters that are used for dosimetry in patients and to verify the consistency and behavior of systems of automatic exposure control. Because it is a new equipment, which, in Brazil, is not yet used as reference equipment for calibration, it was also performed the quality control of this equipment with characterization tests, the calibration and an evaluation of the energy dependence. After the tests, it was proved that the PDC can be used as a reference instrument and that the calibration must be performed in situ, so that the characteristics of each X-ray equipment, where the PKA meters are used, are considered. The calibration was then performed with portable PKA meters and in an interventional radiology equipment that has a PKA meter fixed. The results were good and it was proved the need for calibration of these meters and the importance of in situ calibration with a reference meter. (author)

  13. Determination of the stoichiometric ratio uranium dioxide samples

    International Nuclear Information System (INIS)

    Moura, Sergio Carvalho

    1999-01-01

    The determination of the O/U stoichiometric ratio in uranium dioxide is an important parameter in order to qualify nuclear fuels. The excess oxygen in the crystallographic structure can cause changes in the physico-chemical properties of this compound such as variation of the thermal conductivity alterations, fuel plasticity and others, affecting the efficiency of this material when it is utilized as nuclear fuel in the reactor core. The purpose of this work is to evaluate methods for the determination of uranium oxide samples from two different production processes, using gravimetric, voltammetric and X-ray diffraction techniques. After the evaluation of these techniques, the main aspect of this work is to define a reliable methodology in order to characterize the behavior of uranium oxide. The methodology used in this work consisted of two different steps: utilization of gravimetric and volumetric methods in order to determine the ratio in uranium dioxide samples; utilization of X-ray diffraction technique in order to determine the lattice parameters using patterns and application of the Rietveld method during refining of the structural data. As a result of the experimental part of this work it was found that the X-ray diffraction analysis performs better and detects the presence of more phases than gravimetric and voltammetric techniques, not sensitive enough in this detection. (author)

  14. Testing Homogeneity in a Semiparametric Two-Sample Problem

    Directory of Open Access Journals (Sweden)

    Yukun Liu

    2012-01-01

    Full Text Available We study a two-sample homogeneity testing problem, in which one sample comes from a population with density f(x and the other is from a mixture population with mixture density (1−λf(x+λg(x. This problem arises naturally from many statistical applications such as test for partial differential gene expression in microarray study or genetic studies for gene mutation. Under the semiparametric assumption g(x=f(xeα+βx, a penalized empirical likelihood ratio test could be constructed, but its implementation is hindered by the fact that there is neither feasible algorithm for computing the test statistic nor available research results on its theoretical properties. To circumvent these difficulties, we propose an EM test based on the penalized empirical likelihood. We prove that the EM test has a simple chi-square limiting distribution, and we also demonstrate its competitive testing performances by simulations. A real-data example is used to illustrate the proposed methodology.

  15. Research methodology in recurrent pregnancy loss.

    Science.gov (United States)

    Christiansen, Ole B

    2014-03-01

    The aim of this article is to highlight pitfalls in research methodology that may explain why studies in recurrent pregnancy loss (RPL) often provide very divergent results. It is hoped that insight into this issue may help clinicians decide which published studies are the most valid. It may help researchers to eliminate methodological flaws in future studies, which may hopefully come to some kind of agreement about the usefulness of diagnostic tests and treatments in RPL. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Implementation and adaptation of a macro-scale methodology to calculate direct economic losses

    Science.gov (United States)

    Natho, Stephanie; Thieken, Annegret

    2017-04-01

    As one of the 195 member countries of the United Nations, Germany signed the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR). With this, though voluntary and non-binding, Germany agreed to report on achievements to reduce disaster impacts. Among other targets, the SFDRR aims at reducing direct economic losses in relation to the global gross domestic product by 2030 - but how to measure this without a standardized approach? The United Nations Office for Disaster Risk Reduction (UNISDR) has hence proposed a methodology to estimate direct economic losses per event and country on the basis of the number of damaged or destroyed items in different sectors. The method bases on experiences from developing countries. However, its applicability in industrial countries has not been investigated so far. Therefore, this study presents the first implementation of this approach in Germany to test its applicability for the costliest natural hazards and suggests adaptations. The approach proposed by UNISDR considers assets in the sectors agriculture, industry, commerce, housing, and infrastructure by considering roads, medical and educational facilities. The asset values are estimated on the basis of sector and event specific number of affected items, sector specific mean sizes per item, their standardized construction costs per square meter and a loss ratio of 25%. The methodology was tested for the three costliest natural hazard types in Germany, i.e. floods, storms and hail storms, considering 13 case studies on the federal or state scale between 1984 and 2016. Not any complete calculation of all sectors necessary to describe the total direct economic loss was possible due to incomplete documentation. Therefore, the method was tested sector-wise. Three new modules were developed to better adapt this methodology to German conditions covering private transport (cars), forestry and paved roads. Unpaved roads in contrast were integrated into the agricultural and

  17. Testing effective quantum gravity with gravitational waves from extreme mass ratio inspirals

    International Nuclear Information System (INIS)

    Yunes, N; Sopuerta, C F

    2010-01-01

    Testing deviation of GR is one of the main goals of the proposed Laser Interferometer Space Antenna. For the first time, we consistently compute the generation of gravitational waves from extreme-mass ratio inspirals (stellar compact objects into supermassive black holes) in a well-motivated alternative theory of gravity, that to date remains weakly constrained by double binary pulsar observations. The theory we concentrate on is Chern-Simons (CS) modified gravity, a 4-D, effective theory that is motivated both from string theory and loop-quantum gravity, and which enhances the Einstein-Hilbert action through the addition of a dynamical scalar field and the parity-violating Pontryagin density. We show that although point particles continue to follow geodesics in the modified theory, the background about which they inspiral is a modification to the Kerr metric, which imprints a CS correction on the gravitational waves emitted. CS modified gravitational waves are sufficiently different from the General Relativistic expectation that they lead to significant dephasing after 3 weeks of evolution, but such dephasing will probably not prevent detection of these signals, but instead lead to a systematic error in the determination of parameters. We end with a study of radiation-reaction in the modified theory and show that, to leading-order, energy-momentum emission is not CS modified, except possibly for the subdominant effect of scalar-field emission. The inclusion of radiation-reaction will allow for tests of CS modified gravity with space-borne detectors that might be two orders of magnitude larger than current binary pulsar bounds.

  18. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  19. Testing of January Anomaly at ISE-100 Index with Power Ratio Method

    Directory of Open Access Journals (Sweden)

    Şule Yüksel Yiğiter

    2015-12-01

    Full Text Available AbstractNone of investors that can access all informations in the same ratio is not possible to earn higher returns according to Efficient Market Hypothesis. However, it has been set forth effect of time on returns in several studies and reached conflicting conclusions with hypothesis. In this context, one of the most important existing anomalies is also January month anomaly. In this study, it has been researched that if there is  January effect in BIST-100 index covering 2008-2014 period by using power ratio method. The presence of January month anomaly in BIST-100 index within specified period determined by analysis results.Keywords: Efficient Markets Hypothesis, January Month Anomaly, Power Ratio MethodJEL Classification Codes: G1,C22

  20. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  1. Critical assessment of jet erosion test methodologies for cohesive soil and sediment

    Science.gov (United States)

    Karamigolbaghi, Maliheh; Ghaneeizad, Seyed Mohammad; Atkinson, Joseph F.; Bennett, Sean J.; Wells, Robert R.

    2017-10-01

    The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear stress and erodibility coefficient of soil. These include the Blaisdell, Iterative, and Scour Depth Methods, and all have been organized into easy to use spreadsheet routines. The analytical framework of the JET and its associated methods, however, are based on many assumptions that may not be satisfied in field and laboratory settings. The main objective of this study is to critically assess this analytical framework and these methodologies. Part of this assessment is to include the effect of flow confinement on the JET. The possible relationship between the derived erodibility coefficient and critical shear stress, a practical tool in soil erosion assessment, is examined, and a review of the deficiencies in the JET methodology also is presented. Using a large database of JET results from the United States and data from literature, it is shown that each method can generate an acceptable curve fit through the scour depth measurements as a function of time. The analysis shows, however, that the Scour Depth and Iterative Methods may result in physically unrealistic values for the erosion parameters. The effect of flow confinement of the impinging jet increases the derived critical shear stress and decreases the erodibility coefficient by a factor of 2.4 relative to unconfined flow assumption. For a given critical shear stress, the length of time over which scour depth data are collected also affects the calculation of erosion parameters. In general, there is a lack of consensus relating the derived soil erodibility coefficient to the derived critical shear stress. Although empirical relationships are statistically significant, the calculated erodibility coefficient for a

  2. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT)with warfarin is common in general practice. Increasingly,international normalised ratio (INR) point of care testing(POCT) is being used to manage patients. The aim of thisstudy was to describe and analyse the quality of OACT withwarfarin...... in the management of patients in warfarintreatment provided good quality of care. Sampling intervaland diagnostic coding were significantly correlated withtreatment quality. FUNDING: The study received financial support from theSarah Krabbe Foundation, the General Practitioners’ Educationand Development Foundation...

  3. GROUNDED THEORY METHODOLOGY and GROUNDED THEORY RESEARCH in TURKEY

    OpenAIRE

    ARIK, Ferhat; ARIK, Işıl Avşar

    2016-01-01

    This research discusses the historical development of the Grounded Theory Methodology, which is one of the qualitative research method, its transformation over time and how it is used as a methodology in Turkey. The Grounded Theory which was founded by Strauss and Glaser, is a qualitative methodology based on inductive logic to discover theories in contrast with the deductive understanding which is based on testing an existing theory in sociology. It is possible to examine the Grounded Theory...

  4. Helicopter-Ship Qualification Testing

    NARCIS (Netherlands)

    Hoencamp, A.

    2015-01-01

    The goal of this research project is to develop a novel test methodology which can be used for optimizing cost and time efficiency of helicopter-ship qualification testing without reducing safety. For this purpose, the so-called “SHOL-X” test methodology has been established, which includes the

  5. Safety assessment of a borehole type disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    Blerk, J.J. van; Yucel, V.; Kozak, M.W.; Moore, B.A.

    2002-01-01

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-term of Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to test the ISAM safety assessment methodology using as realistic as possible data. One of the Test Cases, the Borehole Test Case (BTC), related to a proposed future disposal option for disused sealed radioactive sources. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the BTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  6. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    Science.gov (United States)

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence based medicine research is an essential part of medical profession whether clinical or academic. A research methodology workshop intends to help participants, those who are newer to research field or those who are already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four day research methodology workshop was planned. The participants were subjected to a structured questionnaire (pre-test) containing 20 multiple choice questions (Q1-Q 20) related to the topics to be covered in research methodology workshop before the commencement of the workshop and then subjected to similar posttest questionnaire after the completion of workshop. The mean values of pre and post-test scores were calculated and the results were analyzed and compared. Out of the total 153 delegates, 45(29 %) were males and 108 were (71 %) females. 92 (60%) participants consented to fill the pre-test questionnaire and 68 (44%) filled the post-test questionnaire. The mean Pre-test and post-test scores at 95% Confidence Interval were 07.62 (SD ±3.220) and 09.66 (SD ±2.477) respectively. The differences were found to be significant using Paired Sample T test ( P research methodology workshops. Participatory research methodology workshops are good methods of imparting knowledge, also the long term effects needs to be evaluated.

  7. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  8. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions: o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set? o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach? o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How does the benefits users would derive from using the Target Organ System Approach compare to the benefits available from the current HCN-based approach?

  9. THE RELATIVE IMPORTANCE OF FINANCIAL RATIOS AND NONFINANCIAL VARIABLES IN PREDICTING OF INSOLVENCY

    Directory of Open Access Journals (Sweden)

    Ivica Pervan

    2013-02-01

    Full Text Available One of the most important decisions in every bank is approving loans to firms, which is based on evaluated credit risk and collateral. Namely, it is necessary to evaluate the risk that client will be unable to repay the obligations according to the contract. After Beaver's (1967 and Altman's (1968 seminal papers many authors extended the initial research by changing the methodology, samples, countries, etc. But majority of business failure papers as predictors use financial ratios, while in the real life banks combine financial and nonfinancial variables. In order to test predictive power of nonfinancial variables authors in the paper compare two insolvency prediction models. The first model that used financial rations resulted with classification accuracy of 82.8%, while the combined model with financial and nonfinancial variables resulted with classification accuracy of 88.1%.

  10. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    Science.gov (United States)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.

  11. Holes at High Blowing Ratios

    Directory of Open Access Journals (Sweden)

    Phillip M. Ligrani

    1996-01-01

    Full Text Available Experimental results are presented which describe the development and structure of flow downstream of a single row of holes with compound angle orientations producing film cooling at high blowing ratios. This film cooling configuration is important because similar arrangements are frequently employed on the first stage of rotating blades of operating gas turbine engines. With this configuration, holes are spaced 6d apart in the spanwise direction, with inclination angles of 24 degrees, and angles of orientation of 50.5 degrees. Blowing ratios range from 1.5 to 4.0 and the ratio of injectant to freestream density is near 1.0. Results show that spanwise averaged adiabatic effectiveness, spanwise-averaged iso-energetic Stanton number ratios, surveys of streamwise mean velocity, and surveys of injectant distributions change by important amounts as the blowing ratio increases. This is due to injectant lift-off from the test surface just downstream of the holes.

  12. Use and Application of the SADRWMS Methodology and SAFRAN Tool on the Thailand Institute of Nuclear Technology (TINT) Radioactive Waste Management Facility. Test Case Results. 05 October 2011

    International Nuclear Information System (INIS)

    2015-01-01

    The purpose of this document is to describe the working procedure of the test case and to provide feedback on the application of the methodology described in DS284 and the SAFRAN tool. This report documents how the test case was performed, describes how the methodology and software tool were applied, and provides feedback on the use and application of the SAFRAN Tool. The aim of this document is to address the key elements of the safety assessment and to demonstrate their principle contents and roles within the overall context of the safety case. This is done with particular emphasis on investigating the role of the SAFRAN Tool in developing a safety case for facilities similar to the TINT Facility. It is intended that this report will be the first of a series of complimentary safety reports illustrating the use and application of the methodology prescribed in DS284 and the application of the SAFRAN tool to a range of predisposal radioactive waste management activities

  13. Parametric optimization of rice bran oil extraction using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ahmad Syed W.

    2016-09-01

    Full Text Available Use of bran oil in various edible and nonedible industries is very common. In this research work, efficient and optimized methodology for the recovery of rice bran oil has been investigated. The present statistical study includes parametric optimization, based on experimental results of rice bran oil extraction. In this study, three solvents, acetone, ethanol and solvent mixture (SM [acetone: ethanol (1:1 v/v] were employed in extraction investigations. Response surface methodology (RSM, an optimization technique, was exploited for this purpose. A five level central composite design (CCD consisting four operating parameter, like temperature, stirring rate, solvent-bran ratio and contact time were examined to optimize rice bran oil extraction. Experimental results showed that oil recovery can be enhanced from 71% to 82% when temperature, solvent-bran ratio, stirring rate and contact time were kept at 55°C, 6:1, 180 rpm and 45 minutes, respectively while fixing the pH of the mixture at 7.1.

  14. Methodology for formulating predictions of stress corrosion cracking life

    International Nuclear Information System (INIS)

    Yamauchi, Kiyoshi; Hattori, Shigeo; Shindo, Takenori; Kuniya, Jiro

    1994-01-01

    This paper presents a methodology for formulating predictions to evaluate the stress corrosion cracking (SCC) potential of each light-water reactor component, where an index is introduced as a life index or F index. The index denotes the SCC time ratio of a given SCC system to be evaluated against a reference SCC system. The life index is expressed by the products of several subdivided life indexes, which correspond to each SCC influencing factor. Each subdivided life index is constructed as a function containing the influencing factor variable, obtained by analyzing experimental SCC life data. The methodology was termed the subdivided factor method. Application of the life index to SCC life data and field data showed that it was effective for evaluating the SCC potential, i.e. the SCC life. Accordingly, the proposed methodology can potentially describe a phenomenon expressed by a function which consists of the variables of several influencing factors whether there are formulae which unite as a physical model or not. ((orig.))

  15. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Full Text Available Introduction: the article imparts authors’ thoughtson a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive usage of mathematical electronic courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching students in university and the following methods of empirical research were used: pedagogical experiment, pedagogical measurementsand experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using the key examples and exercises. Results: the online course “Mathematics” was designed and developed on the platform of Learning Management System Moodle. The article presents the results of test assignments assessing students’ intellectual abilities and analysis of solutions of various types of mathematical problems by students. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determination of the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggested that the new methodology is able to have positive effect on the learning process. The learning programme determined the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  16. Shading Ratio Impact on Photovoltaic Modules and Correlation with Shading Patterns

    Directory of Open Access Journals (Sweden)

    Alonso Gutiérrez Galeano

    2018-04-01

    Full Text Available This paper presents the study of a simplified approach to model and analyze the performance of partially shaded photovoltaic modules using the shading ratio. This approach integrates the characteristics of shaded area and shadow opacity into the photovoltaic cell model. The studied methodology is intended to improve the description of shaded photovoltaic systems by specifying an experimental procedure to quantify the shadow impact. Furthermore, with the help of image processing, the analysis of the shading ratio provides a set of rules useful for predicting the current–voltage behavior and the maximum power points of shaded photovoltaic modules. This correlation of the shading ratio and shading patterns can contribute to the supervision of actual photovoltaic installations. The experimental results validate the proposed approach in monocrystalline and polycrystalline technologies of solar panels.

  17. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model

  18. The distinct element analysis for swelling pressure test of bentonite. Discussion on the effects of wall friction force and aspect ratio of specimen

    International Nuclear Information System (INIS)

    Shimizu, Hiroyuki; Kikuchi, Hirohito; Fujita, Tomoo; Tanai, Kenji

    2011-10-01

    For geological isolation systems for radioactive waste, bentonite based material is assumed to be used as a buffer material. The swelling characteristics of the bentonite based material are expected to fill up the void space around the radioactive wastes by swelling. In general, swelling characteristics and properties of bentonite are evaluated by the laboratory tests. However, due to the lack of standardization of testing method for bentonite, the accuracy and reproducibility of the testing results are not sufficiently proved. In this study, bentonite swelling pressure test were simulated by newly developed Distinct Element Method (DEM) code, and the effects of wall friction force and aspect ratio of bentonite specimen were discussed. As a result, the followings were found. In the beginning of the swelling pressure test, since swelling occurs only around the fluid injection side of the specimen, wall friction force acts only in the swelling area and the specimen moves to opposite side from fluid injection side. However, when the entire specimen started swelling, displacement of the specimen prevented by the wall friction force, and the specimen is pressed against the pressure measurement side. Then, the swelling pressure measured on the pressure measurement side increases. Such displacement in the specimen is significantly affected by the decreasing of mechanical properties and the difference of saturation in the bentonite specimen during the fluid infiltration. Moreover, when the aspect ratio of the specimen is large, the displacement of the particle in the specimen becomes large and the area on which the wall frictional force acts is also large. Therefore, measured swelling pressure increases more greatly as the aspect ratio of the specimen increases. To contributes to the standardization of laboratory test methods for bentonite, these effects of wall friction force revealed by the DEM simulation should be verified through laboratory experiments. (author)

  19. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  20. Safety assessment of a vault-based disposal facility using the ISAM methodology

    International Nuclear Information System (INIS)

    Kelly, E.; Kim, C.-L.; Lietava, P.; Little, R.; Simon, I.

    2002-01-01

    As part of the IAEA's Co-ordinated Research Project (CRP) on Improving Long-term of Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ISAM), three example cases were developed. The aim was to testing the ISAM safety assessment methodology using as realistic as possible data. One of the Test Cases, the Vault Test Case (VTC), related to the disposal of low level radioactive waste (LLW) to a hypothetical facility comprising a set of above surface vaults. This paper uses the various steps of the ISAM safety assessment methodology to describe the work undertaken by ISAM participants in developing the VTC and provides some general conclusions that can be drawn from the findings of their work. (author)

  1. Methodological individualism as opposed to methodological holism. History, relevancy and the implications of the (insoluble? debate on the explanatory capacity and scientific status of sociocultural anthropology

    Directory of Open Access Journals (Sweden)

    Nina Kulenović

    2016-02-01

    Full Text Available The paper is part of wider research into the status of explanation in the debate on the scientific status of anthropology – wherein one of the key assumptions is that there is a strong correlation between theoretical and methodological structures which would make them inseparable, and that explanation or explanatory potential, is the point of convergence which can be used to test for the possibility of separating theoretical and methodological structures in the first place. To test this idea, a line of debate between methodological holism and methodological individualism – one of the longest running and most complex debates in the social sciences and humanities – was considered. The historical background of the debate has been highlighted, and its relevancy and implications in the controversy about the explanatory capacity and scientific status of sociocultural anthropology.

  2. The reaction index and positivity ratio revisited

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner; Andersen, Flemming

    2008-01-01

    BACKGROUND AND OBJECTIVES: Assessing the quality of patch test preparations continues to be a challenge. 2 parameters, the reaction index (RI) and positivity ratio (PR), have been proposed as quality indicators by the Information Network of Departments of Dermatology (IVDK). The value of these st......BACKGROUND AND OBJECTIVES: Assessing the quality of patch test preparations continues to be a challenge. 2 parameters, the reaction index (RI) and positivity ratio (PR), have been proposed as quality indicators by the Information Network of Departments of Dermatology (IVDK). The value...

  3. Simultaneous stable carbon isotopic analysis of wine glycerol and ethanol by liquid chromatography coupled to isotope ratio mass spectrometry.

    Science.gov (United States)

    Cabañero, Ana I; Recio, Jose L; Rupérez, Mercedes

    2010-01-27

    A novel procedure was established for the simultaneous characterization of wine glycerol and ethanol (13)C/(12)C isotope ratio, using liquid chromatography/isotope ratio mass spectrometry (LC-IRMS). Several parameters influencing separation of glycerol and ethanol from wine matrix were optimized. Results obtained for 35 Spanish samples exposed no significant differences and very strong correlations (r = 0.99) between the glycerol (13)C/(12)C ratios obtained by an alternative method (gas chromatography/isotope ratio mass spectrometry) and the proposed new methodology, and between the ethanol (13)C/(12)C ratios obtained by the official method (elemental analyzer/isotope ratio mass spectrometry) and the proposed new methodology. The accuracy of the proposed method varied from 0.01 to 0.19 per thousand, and the analytical precision was better than 0.25 per thousand. The new developed LC-IRMS method it is the first isotopic method that allows (13)C/(12)C determination of both analytes in the same run directly from a liquid sample with no previous glycerol or ethanol isolation, overcoming technical difficulties associated with complex sample treatment and improving in terms of simplicity and speed.

  4. A generic semi-implicit coupling methodology for use in RELAP5-3D(c)

    International Nuclear Information System (INIS)

    Weaver, W.L.; Tomlinson, E.T.; Aumiller, D.L.

    2002-01-01

    A generic semi-implicit coupling methodology has been developed and implemented in the RELAP5-3D (c) computer program. This methodology allows RELAP5-3D (c) to be used with other computer programs to perform integrated analyses of nuclear power reactor systems and related experimental facilities. The coupling methodology potentially allows different programs to be used to model different portions of the system. The programs are chosen based on their capability to model the phenomena that are important in the simulation in the various portions of the system being considered and may use different numbers of conservation equations to model fluid flow in their respective solution domains. The methodology was demonstrated using a test case in which the test geometry was divided into two parts, each of which was solved as a RELAP5-3D (c) simulation. This test problem exercised all of the semi-implicit coupling features that were implemented in RELAP5-3D (c) The results of this verification test case show that the semi-implicit coupling methodology produces the same answer as the simulation of the test system as a single process

  5. Methodological Behaviorism from the Standpoint of a Radical Behaviorist.

    Science.gov (United States)

    Moore, J

    2013-01-01

    Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from an nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.

  6. Prescriptive Training Courseware: IS-Design Methodology

    Directory of Open Access Journals (Sweden)

    Elspeth McKay

    2018-03-01

    Full Text Available Information systems (IS research is found in many diverse communities. This paper explores the human-dimension of human-computer interaction (HCI to present IS-design practice in the light of courseware development. Assumptions are made that online courseware provides the perfect solution for maintaining a knowledgeable, well skilled workforce. However, empirical investigations into the effectiveness of information technology (IT-induced training solutions are scarce. Contemporary research concentrates on information communications technology (ICT training tools without considering their effectiveness. This paper offers a prescriptive IS-design methodology for managing the requirements for efficient and effective courseware development. To develop the methodology, we examined the main instructional design (ID factors that affect the design of IT-induced training programs. We also examined the tension between maintaining a well-skilled workforce and effective instructional systems design (ISD practice by probing the current ID models used by courseware developers since 1990. An empirical research project, which utilized this IS-design methodology investigated the effectiveness of using IT to train government employees in introductory ethics; this was a study that operationalized the interactive effect of cognitive preference and instructional format on training performance outcomes. The data was analysed using Rasch item response theory (IRT that models the discrimination of people’s performance relative to each other’s performance and the test-items’ difficulty relative to each test-item on the same logit scale. The findings revealed that IS training solutions developed using this IS-design methodology can be adapted to provide trainees with their preferred instructional mode and facilitate cost effective eTraining outcomes.

  7. Testing methodology of diamond composite inserts to be used in the drilling of petroleum wells; Metodologia de testes de insertos compositos diamantados a serem usados na perfuracao de pocos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Bobrovnitchii, G.S.; Filgueira, M.; Skury, A.L.D.; Tardim, R.C. [Universidade Estadual do Norte Fluminense (UENF), Campos dos Goytacazes, RJ (Brazil)], e-mail: rtardim@terra.com.br

    2006-07-01

    The useful life of the inserts used in the cutters of the drills for perforation of oil wells determines the quality of the perforation as well as the productivity. Therefore, the research of the wear of insert is carried through with the objective to foretell the most important properties of the inserts. Due to the fact of the UENF to be developing the processes of composites sintering to the synthetic diamond base, it is interesting to define the testing methodology of the gotten inserts. The proposed methodology is based on the evaluation of the wear suffered by de sample. For this end a micro processed 'Abrasimeter', model AB800-E, manufactured for the Contenco Company was used. The instrument capacity is 1,36 kVA; axial load applied in the cutter up to 50 kgf; rotation of table speed 20 rpm; course of the tool in radial direction speed before 2 m/min; dimensions of the granite block D = 808 mm, d = 484 mm, h = 50 mm. The gotten results show that the proposed methodology can be used for the evaluation of the inserts of the cutters applied in perforation drills. (author)

  8. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)

    2015-05-15

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled to transmit through propeller shaft is a new methodology for the self propulsion tests to track the fuel saving in a real time. Considering the circumstance, this paper presents the real time of marine diesel engine simulator system to track the real performance of a ship through a computer-simulated model. A mathematical model of marine diesel engine and the propeller are used in the simulation to estimate fuel rate, engine rotating speed, thrust and torque of the propeller thus achieve the target vessel’s speed. The input and output are a real time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. The self-propulsion tests in calm waters were conducted using a vessel model to validate the marine diesel engine simulator. The simulator then was used to evaluate the fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users as to analyze different condition of vessel’s speed to obtain better characteristics and hence optimize the fuel saving rate.

  9. Multilayered samples reconstructed by measuring Kα/Kβ or Lα/Lβ X-ray intensity ratios by EDXRF

    Science.gov (United States)

    Cesareo, Roberto; de Assis, Joaquim T.; Roldán, Clodoaldo; Bustamante, Angel D.; Brunetti, Antonio; Schiavon, Nick

    2013-10-01

    In this paper a general method based on energy-dispersive X-ray fluorescence (EDXRF) analysis has been tested to assess its possible use as a tool to reconstruct the structure and determine the thickness of two and/or multi-layered materials. The method utilizes the X-ray intensity ratios of Kα/Kβ or Lα/Lβ peaks (or the ratio of these peaks) for selected elements present in multi-layered objects of various materials (Au alloys, gilded Cu, gilded Ag, gilded Pb, Ag-Au Tumbaga, stone surfaces with protective treatments, Zn or Nickel plating on metals). Results show that, in the case of multi-layered samples, a correct calculation of the peak ratio (Kα /Kβ and/or Lα/Lβ) of relevant elements from energy-dispersive X-ray fluorescence spectra, can provide important information in assessing the exact location of each layer and for calculating its thickness. The methodological approach shown may have important applications not only in materials science but also when dealing with the conservation and restoration of multi-layered cultural heritage objects where the use of a Non-Destructive techniques to determine slight chemical and thickness variations in the layered structure is often of paramount importance to achieve the best results.

  10. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT......INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...

  11. Experimental characterization of the concrete behaviour under high confinement: influence of the saturation ratio and of the water/cement ratio

    International Nuclear Information System (INIS)

    Vu, X.H.

    2007-08-01

    The objective of this thesis is to experimentally characterize the influence of the saturation ratio and of the water/cement ratio of concrete on its behaviour under high confinement. This thesis lies within a more general scope of the understanding of concrete behaviour under severe loading situations (near field detonation or ballistic impacts). A near field detonation or an impact on a concrete structure generate very high levels of stress associated with complex loading paths in the concrete material. To validate concrete behaviour models, experimental results are required. The work presented in this thesis concerns tests conducted using a static triaxial press that allows to obtain stress levels of the order of the giga Pascal. The porous character of concrete and the high confinement required on the one hand, a development of a specimen protection device, and on the other hand, a development of an instrumentation with strain gauges, which is unprecedented for such high confinements. Hydrostatic and triaxial tests, conducted on the one hand on model materials and on the other hand on concrete, allowed to validate the developed experimental procedures as well as the technique of strain and stress measurements. The studies concerning the influence of the saturation ratio and of the water/cement ratio of concrete on its behaviour required the formulation of a plain baseline concrete and of two modified concretes with different water/cement ratios. The analysis of triaxial tests performed on the baseline concrete shows that the saturation ratio of concrete has a major influence on its static behaviour under high confinement. This influence is particularly marked for the concrete loading capacity and for the shape of limit state curves for saturation ratios greater than 50%. The concrete loading capacity increases with the confinement pressure for tests on dry concrete whereas beyond a given confinement pressure, it remains limited for wet or saturated concrete

  12. Aspect ratio has no effect on genotoxicity of multi-wall carbon nanotubes.

    Science.gov (United States)

    Kim, Jin Sik; Lee, Kyu; Lee, Young Hee; Cho, Hyun Sun; Kim, Ki Heon; Choi, Kyung Hee; Lee, Sang Hee; Song, Kyung Seuk; Kang, Chang Soo; Yu, Il Je

    2011-07-01

    Carbon nanotubes (CNTs) have specific physico-chemical and electrical properties that are useful for telecommunications, medicine, materials, manufacturing processes and the environmental and energy sectors. Yet, despite their many advantages, it is also important to determine whether CNTs may represent a hazard to the environment and human health. Like asbestos, the aspect ratio (length:diameter) and metal components of CNTs are known to have an effect on the toxicity of carbon nanotubes. Thus, to evaluate the toxic potential of CNTs in relation to their aspect ratio and metal contamination, in vivo and in vitro genotoxicity tests were conducted using high-aspect-ratio (diameter: 10-15 nm, length: ~10 μm) and low-aspect-ratio multi-wall carbon nanotubes (MWCNTs, diameter: 10-15 nm, length: ~150 nm) according to OECD test guidelines 471 (bacterial reverse mutation test), 473 (in vitro chromosome aberration test), and 474 (in vivo micronuclei test) with a good laboratory practice system. To determine the treatment concentration for all the tests, a solubility and dispersive test was performed, and a 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) solution found to be more suitable than distilled water. Neither the high- nor the low-aspect-ratio MWCNTs induced any genotoxicity in a bacterial reverse mutation test (~1,000 μg/plate), in vitro chromosome aberration test (without S9: ~6.25 μg/ml, with S9: ~50 μg/ml), or in vivo micronuclei test (~50 mg/kg). However, the high-aspect-ratio MWCNTs were found to be more toxic than the low-aspect-ratio MWCNTs. Thus, while high-aspect-ratio MWCNTs do not induce direct genotoxicity or metabolic activation-mediated genotoxicity, genotoxicity could still be induced indirectly through oxidative stress or inflammation.

  13. Tractor accelerated test on test rig

    Directory of Open Access Journals (Sweden)

    M. Mattetti

    2013-09-01

    Full Text Available The experimental tests performed to validate a tractor prototype before its production, need a substantial financial and time commitment. The tests could be reduced using accelerated tests able to reproduce on the structural part of the tractor, the same damage produced on the tractor during real life in a reduced time. These tests were usually performed reproducing a particular harsh condition a defined number of times, as for example using a bumpy road on track to carry out the test in any weather condition. Using these procedures the loads applied on the tractor structure are different with respect to those obtained during the real use, with the risk to apply loads hard to find in reality. Recently it has been demonstrated how, using the methodologies designed for cars, it is possible to also expedite the structural tests for tractors. In particular, automotive proving grounds were recently successfully used with tractors to perform accelerated structural tests able to reproduce the real use of the machine with an acceleration factor higher than that obtained with the traditional methods. However, the acceleration factor obtained with a tractor on proving grounds is in any case reduced due to the reduced speed of the tractors with respect to cars. In this context, the goal of the paper is to show the development of a methodology to perform an accelerated structural test on a medium power tractor using a 4 post test rig. In particular, several proving ground testing conditions have been performed to measure the loads on the tractor. The loads obtained were then edited to remove the not damaging portion of signals, and finally the loads obtained were reproduced in a 4 post test rig. The methodology proposed could be a valid alternative to the use of a proving ground to reproduce accelerated structural tests on tractors.

  14. MODELING AND FORECASTING THE GROSS ENROLLMENT RATIO IN ROMANIAN PRIMARY SCHOOL

    Directory of Open Access Journals (Sweden)

    MARINOIU CRISTIAN

    2014-06-01

    Full Text Available The gross enrollment ratio in primary school is one of the basic indicators used in order to evaluate the proposed objectives of the educational system. Knowing its evolution allows a more rigorous substantiation of the strategies and of the human resources politics not only from the educational field but also from the economic one. In this paper we propose an econometric model in order to describe the gross enrollment ratio in Romanian primary school and we achieve its prediction for the next years, having as a guide the Box-Jenkins’s methodology. The obtained results indicate the continuous decrease of this rate for the next years.

  15. Insights from implementation of a risk management methodology

    International Nuclear Information System (INIS)

    Mahn, J.A.; Germann, R.P.; Jacobs, R.R.

    1992-01-01

    In 1988, GPU Nuclear (GPUN) Corporation embarked on a research effort to identify or develop an appropriate methodology for proactively managing risks. The objective of this effort was to increase its ability to identify potential risks and to aid resource allocation decision making for risk control. Such a methodology was presented at a risk management symposium sponsored by GPUN in September of 1989. A pilot project based on this methodology has been conducted at GPUN to test and validate the elements of the methodology and to compare the results of its application with current corporate methods for guiding risk decision making. The pilot project also led to a follow-up policy-capturing study to elicit information about the various risk decision-making models of GPUN decision makers. The combination of these endeavors provided an opportunity to gain numerous insights with respect to understanding the real value of a risk management process, obtaining acceptance of and commitment to risk management and improving operational aspects of the methodology

  16. A systematic review of the diagnostic performance of orthopedic physical examination tests of the hip.

    Science.gov (United States)

    Rahman, Labib Ataur; Adie, Sam; Naylor, Justine Maree; Mittal, Rajat; So, Sarah; Harris, Ian Andrew

    2013-08-30

    Previous reviews of the diagnostic performances of physical tests of the hip in orthopedics have drawn limited conclusions because of the low to moderate quality of primary studies published in the literature. This systematic review aims to build on these reviews by assessing a broad range of hip pathologies, and employing a more selective approach to the inclusion of studies in order to accurately gauge diagnostic performance for the purposes of making recommendations for clinical practice and future research. It specifically identifies tests which demonstrate strong and moderate diagnostic performance. A systematic search of Medline, Embase, Embase Classic and CINAHL was conducted to identify studies of hip tests. Our selection criteria included an analysis of internal and external validity. We reported diagnostic performance in terms of sensitivity, specificity, predictive values and likelihood ratios. Likelihood ratios were used to identify tests with strong and moderate diagnostic utility. Only a small proportion of tests reported in the literature have been assessed in methodologically valid primary studies. 16 studies were included in our review, producing 56 independent test-pathology combinations. Two tests demonstrated strong clinical utility, the patellar-pubic percussion test for excluding radiologically occult hip fractures (negative LR 0.05, 95% Confidence Interval [CI] 0.03-0.08) and the hip abduction sign for diagnosing sarcoglycanopathies in patients with known muscular dystrophies (positive LR 34.29, 95% CI 10.97-122.30). Fifteen tests demonstrated moderate diagnostic utility for diagnosing and/or excluding hip fractures, symptomatic osteoarthritis and loosening of components post-total hip arthroplasty. We have identified a number of tests demonstrating strong and moderate diagnostic performance. These findings must be viewed with caution as there are concerns over the methodological quality of the primary studies from which we have extracted our

  17. (Re)evaluating the Implications of the Autoregressive Latent Trajectory Model Through Likelihood Ratio Tests of Its Initial Conditions.

    Science.gov (United States)

    Ou, Lu; Chow, Sy-Miin; Ji, Linying; Molenaar, Peter C M

    2017-01-01

    The autoregressive latent trajectory (ALT) model synthesizes the autoregressive model and the latent growth curve model. The ALT model is flexible enough to produce a variety of discrepant model-implied change trajectories. While some researchers consider this a virtue, others have cautioned that this may confound interpretations of the model's parameters. In this article, we show that some-but not all-of these interpretational difficulties may be clarified mathematically and tested explicitly via likelihood ratio tests (LRTs) imposed on the initial conditions of the model. We show analytically the nested relations among three variants of the ALT model and the constraints needed to establish equivalences. A Monte Carlo simulation study indicated that LRTs, particularly when used in combination with information criterion measures, can allow researchers to test targeted hypotheses about the functional forms of the change process under study. We further demonstrate when and how such tests may justifiably be used to facilitate our understanding of the underlying process of change using a subsample (N = 3,995) of longitudinal family income data from the National Longitudinal Survey of Youth.

  18. Medicine, methodology, and values: trade-offs in clinical science and practice.

    Science.gov (United States)

    Ho, Vincent K Y

    2011-01-01

    The current guidelines of evidence-based medicine (EBM) presuppose that clinical research and clinical practice should advance from rigorous scientific tests as they generate reliable, value-free knowledge. Under this presupposition, hypotheses postulated by doctors and patients in the process of their decision making are preferably tested in randomized clinical trials (RCTs), and in systematic reviews and meta-analyses summarizing outcomes from multiple RCTs. Since testing under this scheme is predominantly focused on the criteria of generality and precision achieved through methodological rigor, at the cost of the criterion of realism, translating test results to clinical practice is often problematic. Choices concerning which methodological criteria should have priority are inevitable, however, as clinical trials, and scientific research in general, cannot meet all relevant criteria at the same time. Since these choices may be informed by considerations external to science, we must acknowledge that science cannot be value-free in a strict sense, and this invites a more prominent role for value-laden considerations in evaluating clinical research. The urgency for this becomes even more apparent when we consider the important yet implicit role of scientific theories in EBM, which may also be subjected to methodological evaluation and for which selectiveness in methodological focus is likewise inevitable.

  19. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment have been implemented on a SUN workstation.

  20. Development of acoustically lined ejector technology for multitube jet noise suppressor nozzles by model and engine tests over a wide range of jet pressure ratios and temperatures

    Science.gov (United States)

    Atvars, J.; Paynter, G. C.; Walker, D. Q.; Wintermeyer, C. F.

    1974-01-01

    An experimental program comprising model nozzle and full-scale engine tests was undertaken to acquire parametric data for acoustically lined ejectors applied to primary jet noise suppression. Ejector lining design technology and acoustical scaling of lined ejector configurations were the major objectives. Ground static tests were run with a J-75 turbojet engine fitted with a 37-tube, area ratio 3.3 suppressor nozzle and two lengths of ejector shroud (L/D = 1 and 2). Seven ejector lining configurations were tested over the engine pressure ratio range of 1.40 to 2.40 with corresponding jet velocities between 305 and 610 M/sec. One-fourth scale model nozzles were tested over a pressure ratio range of 1.40 to 4.0 with jet total temperatures between ambient and 1088 K. Scaling of multielement nozzle ejector configurations was also studied using a single element of the nozzle array with identical ejector lengths and lining materials. Acoustic far field and near field data together with nozzle thrust performance and jet aerodynamic flow profiles are presented.

  1. A new lean change methodology for small & medium sized enterprises

    OpenAIRE

    April, Joris; Powell, Daryl; Bart, Schanssema

    2010-01-01

    SMEs find it difficult to implement productivity improvement tools, particularly those associated with Lean Manufacturing. Larger companies have more success due to greater access to resources. To provide the SMEs with a way to implement Lean sustainably, the European project ERIP develops a new lean change methodology for SMEs. In this paper the methodology is explained and three test cases show the strength of the methodology. The method is a sequence of achieving management and company sup...

  2. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to the company cash flows target-oriented forecasting based on its financial position analysis. The approach is featured to be universal and presumes application of the following techniques developed by the author: financial ratio values correction techniques and correcting cash flows techniques. The financial ratio values correction technique assumes to analyze and forecast company financial position while the correcting cash flows technique i...

  3. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

    Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best matching with those of the PQRI WG as decisions of both methods agreed in 75% of the 55 CI profile scenarios.

  4. Methodology for full comparative assessment of direct gross glycerin combustion in a flame tube furnace

    Energy Technology Data Exchange (ETDEWEB)

    Maturana, Aymer Yeferson; Pagliuso, Josmar D. [Dept. of Mechanical Engineering. Sao Carlos School of Engineering. University of Sao Paulo, Sao Carlos, SP (Brazil)], e-mails: aymermat@sc.usp.br, josmar@sc.usp.br

    2010-07-01

    This study is to develop a methodology to identify and evaluate the emissions and heat transfer associated to combustion of gross glycerin a by-product of the Brazilian biodiesel manufacture process as alternative energy source. It aims to increase the present knowledge on the matter and to contribute to the improvement of the economic and environmental perspective of biodiesel industry. This methodology was considered to be used for assessment of gross glycerin combustion from three different types of biodiesel (bovine tallow, palm and soy). The procedures for evaluation and quantification of emissions of sulphur and nitrogen oxides, total hydrocarbons, carbon monoxide, carbon dioxide, and acrolein were analyzed, described and standardized. Experimental techniques for mutagenic and toxic effects assessment of gases similarly were analyzed and standardized, as well as the calorific power, the associate heat transfer and fundamentals operational parameters. The methodology was developed, using a full-instrumented flame tube furnace, continuous gas analyzers, a chromatograph, automatic data acquisition systems and other auxiliary equipment. The mutagenic and toxic effects of the study was based on Tradescantia clone KU-20, using chambers of intoxication and biological analytical techniques previously developed and others were specially adapted. The benchmark for the initial set up was based on the performance evaluation of the previous equipment tested with diesel considering its behavior during direct combustion. Finally, the following factors were defined for the combustion of crude glycerin, configurations of equipment types, operational parameters such as air fuel ratio adiabatic temperature and other necessary aspect for successful application of the methodology. The developed and integrated methodology was made available to the concern industry, environmental authorities and researchers as procedures to access the viability of gross glycerin or similar fuels as

  5. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences of such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions to be flexibly estimated. Bootstrap methods were used to draw inferences from the derivatives curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit conditional on the observations than the other. Moreover, different methodological investigations of GLUE are conducted in order to test......In the present paper an uncertainty analysis on an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology the model is conditioned on observation time series from two flow gauges as well as the occurrence...... if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult of get good fits of the whole time series....

  7. Novel Apparatus for the Real-Time Quantification of Dissolved Gas Concentrations and Isotope Ratios

    Science.gov (United States)

    Gupta, M.; Leen, J.; Baer, D. S.; Owano, T. G.; Liem, J.

    2013-12-01

    Measurements of dissolved gases and their isotopic composition are critical in studying a variety of phenomena, including underwater greenhouse gas generation, air-surface exchange, and pollution migration. These studies typically involve obtaining water samples from streams, lakes, or ocean water and transporting them to a laboratory, where they are degased. The gases obtained are then generally measured using gas chromatography and isotope ratio mass spectrometry for concentrations and isotope ratios, respectively. This conventional, off-line methodology is time consuming, significantly limits the number of the samples that can be measured and thus severely inhibits detailed spatial and temporal mapping of gas concentrations and isotope ratios. In this work, we describe the development of a new membrane-based degassing device that interfaces directly to Los Gatos Research (cavity enhanced laser absorption or Off-Axis ICOS) gas analyzers (cavity enhanced laser absorption or Off-Axis ICOS analyzers) to create an autonomous system that can continuously and quickly measure concentrations and isotope ratios of dissolved gases in real time in the field. By accurately controlling the water flow rate through the membrane degasser, gas pressure on the outside of the membrane, and water pressure on the inside of the membrane, the system is able to generate precise and highly reproducible results. Moreover, by accurately measuring the gas flow rates in and out of the degasser, the gas-phase concentrations (ppm) could be converted into dissolved gas concentrations (nM). We will present detailed laboratory test data that quantifies the linearity, precision, and dynamic range of the system for the concentrations and isotope ratios of dissolved methane, carbon dioxide, and nitrous oxide. By interfacing the degassing device to a novel cavity-enhanced spectrometer (developed by LGR), preliminary data will also be presented for dissolved volatile organics (VOC) and other

  8. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  9. Novel bacterial ratio for predicting fecal age

    Energy Technology Data Exchange (ETDEWEB)

    Nieman, J.; Brion, G.M. [Univ. of Kentucky, Dept. of Civil Engineering, Lexington, Kentucky (United States)]. E-mail: gbrion@engr.uky.edu

    2002-06-15

    This study presents an extension of ongoing research into the utility of the ratio of bacterial colonies isolated on membrane filters during the total coliform test using m-Endo broth media for the prediction of fecal age. Analysis of the relative shifts in concentrations of indicator bacterial populations in Kentucky River water quality data collected from the inlet of a local water treatment plant showed a correlation between raw concentrations of atypical colonies (AC) and total coliform colonies (TC) formed on m-Endo membrane filter tests, and fecal age. Visual analysis of plant treatment records showed that low values of the AC/TC ratio were related to periods of high flow, when runoff added fresh fecal material to the river. A more detailed analysis of 2 years of Kentucky River water quality data showed the average AC/TC ratio during months with high river flow (rain) to be 3.4, rising to an average of 27.6 during months with low flow. The average AC/TC ratio during high flow months compared to that found in other studies for raw human sewage (3.9) and the ratio increased to values associated with animal impacted urban runoff (18.9) during low flow months. (author)

  10. Novel bacterial ratio for predicting fecal age

    International Nuclear Information System (INIS)

    Nieman, J.; Brion, G.M.

    2002-01-01

    This study presents an extension of ongoing research into the utility of the ratio of bacterial colonies isolated on membrane filters during the total coliform test using m-Endo broth media for the prediction of fecal age. Analysis of the relative shifts in concentrations of indicator bacterial populations in Kentucky River water quality data collected from the inlet of a local water treatment plant showed a correlation between raw concentrations of atypical colonies (AC) and total coliform colonies (TC) formed on m-Endo membrane filter tests, and fecal age. Visual analysis of plant treatment records showed that low values of the AC/TC ratio were related to periods of high flow, when runoff added fresh fecal material to the river. A more detailed analysis of 2 years of Kentucky River water quality data showed the average AC/TC ratio during months with high river flow (rain) to be 3.4, rising to an average of 27.6 during months with low flow. The average AC/TC ratio during high flow months compared to that found in other studies for raw human sewage (3.9) and the ratio increased to values associated with animal impacted urban runoff (18.9) during low flow months. (author)

  11. Hydrogeological testing in the Sellafield area

    International Nuclear Information System (INIS)

    Sutton, J.S.

    1996-01-01

    A summary of the hydrogeological test methodologies employed in the Sellafield geological investigations is provided in order that an objective appraisal of the quality of the data can be formed. A brief presentation of some of these data illustrates the corroborative nature of different test and measurement methodologies and provides a preliminary view of the results obtained. The programme of hydrogeological testing is an evolving one and methodologies are developing as work proceeds and targets become more clearly defined. As the testing is focused on relatively low permeability rocks at depth, the approach to testing differs slightly from conventional hydrogeological well testing and makes extensive use of oilfield technology. (author)

  12. Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method

    Science.gov (United States)

    De Waal, Sybrand A.

    1996-07-01

    A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.

  13. Impact limiters for radioactive materials transport packagings: a methodology for assessment

    International Nuclear Information System (INIS)

    Mourao, Rogerio Pimenta

    2002-01-01

    This work aims at establishing a methodology for design assessment of a cellular material-filled impact limiter to be used as part of a radioactive material transport packaging. This methodology comprises the selection of the cellular material, its structural characterization by mechanical tests, the development of a case study in the nuclear field, preliminary determination of the best cellular material density for the case study, performance of the case and its numerical simulation using the finite element method. Among the several materials used as shock absorbers in packagings, the polyurethane foam was chosen, particularly the foam obtained from the castor oil plant (Ricinus communis), a non-polluting and renewable source. The case study carried out was the 9 m drop test of a package prototype containing radioactive wastes incorporated in a cement matrix, considered one of the most severe tests prescribed by the Brazilian and international transport standards. Prototypes with foam density pre-determined as ideal as well as prototypes using lighter and heavier foams were tested for comparison. The results obtained validate the methodology in that expectations regarding the ideal foam density were confirmed by the drop tests and the numerical simulation. (author)

  14. A hypothesis-testing framework for studies investigating ontogenetic niche shifts using stable isotope ratios.

    Directory of Open Access Journals (Sweden)

    Caroline M Hammerschlag-Peyer

    Full Text Available Ontogenetic niche shifts occur across diverse taxonomic groups, and can have critical implications for population dynamics, community structure, and ecosystem function. In this study, we provide a hypothesis-testing framework combining univariate and multivariate analyses to examine ontogenetic niche shifts using stable isotope ratios. This framework is based on three distinct ontogenetic niche shift scenarios, i.e., (1 no niche shift, (2 niche expansion/reduction, and (3 discrete niche shift between size classes. We developed criteria for identifying each scenario, as based on three important resource use characteristics, i.e., niche width, niche position, and niche overlap. We provide an empirical example for each ontogenetic niche shift scenario, illustrating differences in resource use characteristics among different organisms. The present framework provides a foundation for future studies on ontogenetic niche shifts, and also can be applied to examine resource variability among other population sub-groupings (e.g., by sex or phenotype.

  15. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  16. Civil migration and risk assessment methodology

    International Nuclear Information System (INIS)

    Onishi, Y.; Brown, S.M.; Olsen, A.R.; Parkhurst, M.A.

    1981-01-01

    To provide a scientific basis for risk assessment and decision making, the Chemical Migration and Risk Assessment (CMRA) Methodology was developed to simulate overland and instream toxic containment migration and fate, and to predict the probability of acute and chronic impacts on aquatic biota. The simulation results indicated that the time between the pesticide application and the subsequent runoff producing event was the most important factor determining the amount of the alachlor. The study also revealed that sediment transport has important effects on contaminant migration when sediment concentrations in receiving streams are high or contaminants are highly susceptible to adsorption by sediment. Although the capabilities of the CMRA methodology were only partially tested in this study, the results demonstrate that methodology can be used as a scientific decision-making tool for toxic chemical regulations, a research tool to evaluate the relative significance of various transport and degradation phenomena, as well as a tool to examine the effectiveness of toxic chemical control practice

  17. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are p...... parameters influence the performance of the WEC can also be investigated using this methodology.......This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests...... leads to testing campaigns that are not as extensive as desired. Therefore, the performance analysis should be robust enough to allow for not fully complete sea trials and sub optimal performance data. In other words, this methodology is focused at retrieving the maximum amount of useful information out...

  18. Urine Albumin and Albumin/ Creatinine Ratio

    Science.gov (United States)

    ... it used? The urine albumin test or albumin/creatinine ratio (ACR) is used to screen people with chronic conditions, such as diabetes and high blood pressure ( hypertension ) that put them at an ...

  19. Performance of a high-work, low-aspect-ratio turbine stator tested with a realistic inlet radial temperature gradient

    Science.gov (United States)

    Stabe, Roy G.; Schwab, John R.

    1991-01-01

    A 0.767-scale model of a turbine stator designed for the core of a high-bypass-ratio aircraft engine was tested with uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The principal measurements were radial and circumferential surveys of stator-exit total temperature, total pressure, and flow angle. The stator-exit flow field was also computed by using a three-dimensional Navier-Stokes solver. Other than temperature, there were no apparent differences in performance due to the inlet conditions. The computed results compared quite well with the experimental results.

  20. A methodology for quantitatively managing the bug fixing process using Mahalanobis Taguchi system

    Directory of Open Access Journals (Sweden)

    Boby John

    2015-12-01

    Full Text Available The controlling of bug fixing process during the system testing phase of software development life cycle is very important for fixing all the detected bugs within the scheduled time. The presence of open bugs often delays the release of the software or result in releasing the software with compromised functionalities. These can lead to customer dissatisfaction, cost overrun and eventually the loss of market share. In this paper, the authors propose a methodology to quantitatively manage the bug fixing process during system testing. The proposed methodology identifies the critical milestones in the system testing phase which differentiates the successful projects from the unsuccessful ones using Mahalanobis Taguchi system. Then a model is developed to predict whether a project is successful or not with the bug fix progress at critical milestones as control factors. Finally the model is used to control the bug fixing process. It is found that the performance of the proposed methodology using Mahalanobis Taguchi system is superior to the models developed using other multi-dimensional pattern recognition techniques. The proposed methodology also reduces the number of control points providing the managers with more options and flexibility to utilize the bug fixing resources across system testing phase. Moreover the methodology allows the mangers to carry out mid- course corrections to bring the bug fixing process back on track so that all the detected bugs can be fixed on time. The methodology is validated with eight new projects and the results are very encouraging.

  1. Nonintrusive methodology for wellness baseline profiling

    Science.gov (United States)

    Chung, Danny Wen-Yaw; Tsai, Yuh-Show; Miaou, Shaou-Gang; Chang, Walter H.; Chang, Yaw-Jen; Chen, Shia-Chung; Hong, Y. Y.; Chyang, C. S.; Chang, Quan-Shong; Hsu, Hon-Yen; Hsu, James; Yao, Wei-Cheng; Hsu, Ming-Sin; Chen, Ming-Chung; Lee, Shi-Chen; Hsu, Charles; Miao, Lidan; Byrd, Kenny; Chouikha, Mohamed F.; Gu, Xin-Bin; Wang, Paul C.; Szu, Harold

    2007-04-01

    We develop an accumulatively effective and affordable set of smart pair devices to save the exuberant expenditure for the healthcare of aging population, which will not be sustainable when all the post-war baby boomers retire (78 millions will cost 1/5~1/4 GDP in US alone). To design an accessible test-bed for distributed points of homecare, we choose two exemplars of the set to demonstrate the possibility of translation of modern military and clinical know-how, because two exemplars share identically the noninvasive algorithm adapted to the Smart Sensor-pairs for the real world persistent surveillance. Currently, the standard diagnoses for malignant tumors and diabetes disorders are blood serum tests, X-ray CAT scan, and biopsy used sometime in the physical checkup by physicians as cohort-average wellness baselines. The loss of the quality of life in making second careers productive may be caused by the missing of timeliness for correct diagnoses and easier treatments, which contributes to the one quarter of human errors generating the lawsuits against physicians and hospitals, which further escalates the insurance cost and wasteful healthcare expenditure. Such a vicious cycle should be entirely eliminated by building an "individual diagnostic aids (IDA)," similar to the trend of personalized drug, developed from daily noninvasive intelligent databases of the "wellness baseline profiling (WBP)". Since our physiology state undulates diurnally, the Nyquist anti-aliasing theory dictates a minimum twice-a-day sampling of the WBP for the IDA, which must be made affordable by means of noninvasive, unsupervised and unbiased methodology at the convenience of homes. Thus, a pair of military infrared (IR) spectral cameras has been demonstrated for the noninvasive spectrogram ratio test of the spontaneously emitted thermal radiation from a normal human body at 37°C temperature. This invisible self-emission spreads from 3 microns to 12 microns of the radiation wavelengths

  2. Financial Ratios and Perceived Household Financial Satisfaction

    Directory of Open Access Journals (Sweden)

    Scott Garrett

    2013-08-01

    Full Text Available This paper tests the relative strength of three objective measures of financial health (using the solvency, liquidity, and investment asset ratio in predicting a household’s subjective feeling of current financial satisfaction. Using a sample of 6,923 respondents in the 2008 Health and Retirement Study this paper presents evidence of two main findings: 1 the solvency ratio is most strongly associated with financial satisfaction levels based on a cross-sectional design and 2 changes in the investment asset ratio are most strongly associated with changes in financial satisfaction over time.

  3. Financial Ratio and Its Influence to Profitability in Islamic Banks.

    Directory of Open Access Journals (Sweden)

    Erika Amelia

    2015-10-01

    Full Text Available This research aims to analyze the influence of the Capital Adequacy Ratio (CAR, Non Performing Financing (NPF, Financing to Deposit Ratio (FDR and Biaya Operasional Pendapatan Operasional (BOPO to Return on Asset (ROA in Bank Muamalat Indonesia and Bank Syariah Mega. The data analysis method used in this research is multiple regression analysis. From the test results show that the Capital Adequacy Ratio (CAR, Non Performing Financing (NPF, Financing to Deposit Ratio (FDR and Biaya Operasional Pendapatan Operasional (BOPO simultaneously effect to Return on Asset (ROA. Based on the test results of the t statistic was concluded that the Capital Adequacy Ratio (CAR, Non Performing Financing (NPF and the Financing to Deposit Ratio (FDR partially no significant effect to Return on Asset (ROA, while Biaya Operasional Pendapatan Operasional (BOPO partially significant effect to Return on Asset (ROADOI: 10.15408/aiq.v7i2.1700

  4. Searching for degenerate Higgs bosons a profile likelihood ratio method to test for mass-degenerate states in the presence of censored data and uncertainties

    CERN Document Server

    David, André; Petrucciani, Giovanni

    2015-01-01

    Using the likelihood ratio test statistic, we present a method which can be employed to test the hypothesis of a single Higgs boson using the matrix of measured signal strengths. This method can be applied in the presence of censored data and takes into account uncertainties on the measurements. The p-value against the hypothesis of a single Higgs boson is defined from the expected distribution of the test statistic, generated using pseudo-experiments. The applicability of the likelihood-based test is demonstrated using numerical examples with uncertainties and missing matrix elements.

  5. Diuresis renography in children: methodological aspects

    International Nuclear Information System (INIS)

    Bonnin, F.; Le Stanc, E.; Busquet, G.; Saidi, L.; Lyonnet, F.

    1995-01-01

    In paediatrics, diuresis renography is used as a method to guide clinical management of hydronephrosis or hydro-uretero-nephrosis. Various pitfalls in the technique and other errors exist and may lead to a misinterpretation of the test. The methodology for performing and interpreting the diuresis renography is discussed. (authors). 12 refs., 4 figs

  6. Gust load alleviation wind tunnel tests of a large-aspect-ratio flexible wing with piezoelectric control

    Directory of Open Access Journals (Sweden)

    Ying Bi

    2017-02-01

    Full Text Available An active control technique utilizing piezoelectric actuators to alleviate gust-response loads of a large-aspect-ratio flexible wing is investigated. Piezoelectric materials have been extensively used for active vibration control of engineering structures. In this paper, piezoelectric materials further attempt to suppress the vibration of the aeroelastic wing caused by gust. The motion equation of the flexible wing with piezoelectric patches is obtained by Hamilton’s principle with the modal approach, and then numerical gust responses are analyzed, based on which a gust load alleviation (GLA control system is proposed. The gust load alleviation system employs classic proportional-integral-derivative (PID controllers which treat piezoelectric patches as control actuators and acceleration as the feedback signal. By a numerical method, the control mechanism that piezoelectric actuators can be used to alleviate gust-response loads is also analyzed qualitatively. Furthermore, through low-speed wind tunnel tests, the effectiveness of the gust load alleviation active control technology is validated. The test results agree well with the numerical results. Test results show that at a certain frequency range, the control scheme can effectively alleviate the z and x wingtip accelerations and the root bending moment of the wing to a certain extent. The control system gives satisfying gust load alleviation efficacy with the reduction rate being generally over 20%.

  7. Employee Turnover: An Empirical and Methodological Assessment.

    Science.gov (United States)

    Muchinsky, Paul M.; Tuttle, Mark L.

    1979-01-01

    Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…

  8. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnic, gender, age and stage at diagnose are also reported.

  9. Budget impact analysis of sFlt-1/PlGF ratio as prediction test in Italian women with suspected preeclampsia.

    Science.gov (United States)

    Frusca, Tiziana; Gervasi, Maria-Teresa; Paolini, Davide; Dionisi, Matteo; Ferre, Francesca; Cetin, Irene

    2017-09-01

    Preeclampsia (PE) is a pregnancy disease which represents a leading cause of maternal and perinatal mortality and morbidity. Accurate prediction of PE risk could provide an increase in health benefits and better patient management. To estimate the economic impact of introducing Elecsys sFlt-1/PlGF ratio test, in addition to standard practice, for the prediction of PE in women with suspected PE in the Italian National Health Service (INHS). A decision tree model has been developed to simulate the progression of a cohort of pregnant women from the first presentation of clinical suspicion of PE in the second and third trimesters until delivery. The model provides an estimation of the financial impact of introducing sFlt-1/PlGF versus standard practice. Clinical inputs have been derived from PROGNOSIS study and from literature review, and validated by National Clinical Experts. Resources and unit costs have been obtained from Italian-specific sources. Healthcare costs associated with the management of a pregnant woman with clinical suspicion of PE equal €2384 when following standard practice versus €1714 using sFlt-1/PlGF ratio test. Introduction of sFlt-1/PlGF into hospital practice is cost-saving. Savings are generated primarily through improvement in diagnostic accuracy and reduction in unnecessary hospitalization for women before PE's onset.

  10. A note on imperfect hedging: a method for testing stability of the hedge ratio

    Directory of Open Access Journals (Sweden)

    Michal Černý

    2012-01-01

    Full Text Available Companies producing, processing and consuming commodities in the production process often hedge their commodity expositions using derivative strategies based on different, highly correlated underlying commodities. Once the open position in a commodity is hedged using a derivative position with another underlying commodity, the appropriate hedge ratio must be determined in order the hedge relationship be as effective as possible. However, it is questionable whether the hedge ratio determined at the inception of the risk management strategy remains stable over the whole period for which the hedging strategy exists. Usually it is assumed that in the short run, the relationship (say, correlation between the two commodities remains stable, while in the long run it may vary. We propose a method, based on statistical theory of stability, for on-line detection whether market movements of prices of the commodities involved in the hedge relationship indicate that the hedge ratio may have been subject to a recent change. The change in the hedge ratio decreases the effectiveness of the original hedge relationship and creates a new open position. The method proposed should inform the risk manager that it could be reasonable to adjust the derivative strategy in a way reflecting the market conditions after the change in the hedge ratio.

  11. Including test errors in evaluating surveillance test intervals

    International Nuclear Information System (INIS)

    Kim, I.S.; Samanta, P.K.; Martorell, S.; Vesely, W.E.

    1991-01-01

    Technical Specifications require surveillance testing to assure that the standby systems important to safety will start and perform their intended functions in the event of plant abnormality. However, as evidenced by operating experience, the surveillance tests may be adversely impact safety because of their undesirable side effects, such as initiation of plant transients during testing or wearing-out of safety systems due to testing. This paper first defines the concerns, i.e., the potential adverse effects of surveillance testing, from a risk perspective. Then, we present a methodology to evaluate the risk impact of those adverse effects, focusing on two important kinds of adverse impacts of surveillance testing: (1) risk impact of test-caused trips and (2) risk impact of test-caused equipment wear. The quantitative risk methodology is demonstrated with several surveillance tests conducted at boiling water reactors, such as the tests of the main steam isolation valves, the turbine overspeed protection system, and the emergency diesel generators. We present the results of the risk-effectiveness evaluation of surveillance test intervals, which compares the adverse risk impact with the beneficial risk impact of testing from potential failure detection, along with insights from sensitivity studies

  12. Optimization of the extraction of flavonoids from grape leaves by response surface methodology

    International Nuclear Information System (INIS)

    Brad, K.; Liu, W.

    2013-01-01

    The extraction of flavonoids from grape leaves was optimized to maximize flavonoids yield in this study. A central composite design of response surface methodology involving extracting time, power, liquid-solid ratio, and concentration was used, and second-order model for Y was employed to generate the response surfaces. The optimum condition for flavonoids yield was determined as follows: extracting time 24.95 min, power 72.05, ethanol concentration 63.35%, liquid-solid ratio 10.04. Under the optimum condition, the flavonoids yield was 76.84 %. (author)

  13. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found % attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  14. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires.

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found % attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  15. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  16. An extra-memetic empirical methodology to accompany theoretical memetics

    OpenAIRE

    Gill, Jameson

    2012-01-01

    Abstract\\ud \\ud Purpose: The paper describes the difficulties encountered by researchers who are looking to operationalise theoretical memetics and provides a methodological avenue for studies that can test meme theory.\\ud \\ud Design/Methodology/Approach: The application of evolutionary theory to organisations is reviewed by critically reflecting on the validity of its truth claims. To focus the discussion a number of applications of meme theory are reviewed to raise specific issues which oug...

  17. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermalhydraulics safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increase plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a limit design value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in COBRA IIC/MIT code, modified to Angra 1 conditions. The correlation used is the Westinghouse's W-3 and the minimum DNBR (MDBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to more realistic parameters values used in the methodology. (author)

  18. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately, it identifies trends reliably and does not misinterpret a steady-state signal as a transient one

  19. Pengaruh Likuiditas, Profitabilitas, Leverage, dan Market Ratio terhadap Dividend Payout Ratio pada Perusahaan Manufaktur

    Directory of Open Access Journals (Sweden)

    Erna Puspita

    2017-04-01

    Full Text Available Dividend policy is concerned with financial policies regarding what amount cash dividend paid to shareholders and re-invested as retained earnings. The recent research aimed to test empirically various factors is considered to affect dividend policy. The independent variables in his research included Current Ratio (CR, Return on Equity (ROE, Debt to Equity Ratio (DER, and Earning Per Share (EPS. Meanwhile, the dependent variable was Dividend Payout Ratio (DPR. Quantitative research was used as the research design and the data was secondary data. Furthermore, purposive sampling was selected to get the sample. The result was 14 companies that pay dividend continuously during this research conducted on 2012 - 2014 were selected as the sample of this research. Multiple linier regression was used to analyze the data. The results showed that ROE and EPS has a contribution to the DPR, and then CR and DER has no contribution to the DPR.

  20. [Optimization of Polysaccharide Extraction from Spirodela polyrrhiza by Plackett-Burman Design Combined with Box-Behnken Response Surface Methodology].

    Science.gov (United States)

    Jiang, Zheng; Wang, Hong; Wu, Qi-nan

    2015-06-01

    To optimize the processing of polysaccharide extraction from Spirodela polyrrhiza. Five factors related to extraction rate of polysaccharide were optimized by the Plackett-Burman design. Based on this study, three factors, including alcohol volume fraction, extraction temperature and ratio of material to liquid, were regarded as investigation factors by Box-Behnken response surface methodology. The effect order of three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza were as follows: extraction temperature, alcohol volume fraction,ratio of material to liquid. According to Box-Behnken response, the best extraction conditions were: alcohol volume fraction of 81%, ratio of material to liquid of 1:42, extraction temperature of 100 degrees C, extraction time of 60 min for four times. Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study is effective and stable.

  1. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    Science.gov (United States)

    Nath, Nayani Kishore

    2017-08-01

    The throat back up liners is used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back up liners is made with E-glass phenolic prepregs by tape winding process. The objective of this work is to demonstrate the optimization of process parameters of tape winding process to achieve better insulative resistance using Taguchi's robust design methodology. In this method four control factors machine speed, roller pressure, tape tension, tape temperature that were investigated for the tape winding process. The presented work was to study the cogency and acceptability of Taguchi's methodology in manufacturing of throat back up liners. The quality characteristic identified was Back wall temperature. Experiments carried out using L 9 ' (34) orthogonal array with three levels of four different control factors. The test results were analyzed using smaller the better criteria for Signal to Noise ratio in order to optimize the process. The experimental results were analyzed conformed and successfully used to achieve the minimum back wall temperature of the throat back up liners. The enhancement in performance of the throat back up liners was observed by carrying out the oxy-acetylene tests. The influence of back wall temperature on the performance of throat back up liners was verified by ground firing test.

  2. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles

  3. Spring back evaluation by bending under tension tests in conditions of multiaxial stresses corresponding to deep drawing processes. Application to AISI 304 DDQ stainless steel sheet; Evaluacion del springback mediante ensayos de doblado bajo tension en condiciones de multiaxialidad tipicas de los procesos de embuticion profunda. Aplicacion a chapa de acero inoxidable AISI 304 DDQ

    Energy Technology Data Exchange (ETDEWEB)

    Miguel, V.; Coello, J.; Martinez, A.; Calatayud, A.

    2013-09-01

    In this paper, a methodology has been developed for evaluating the spring back of AISI 304 DDQ stainless steel sheet based on a bending under tension test. The main difference of the methodology herein carried out is that tests are made under the multiaxial stresses state that take place in deep drawing processes. This affects to the level of stress value in the test and to the hardening state of the sheet. Springback evaluation has been done in two different areas. Bending area has been evaluated from elastic recovery ratio defined as the ratio between the bending radius after and before bending. Bending and unbending extreme has been studied from the measured curvature radius in this area and taking into account the geometric equivalence of the test with the drawing cups process. Results found allow to state that drawing ratio or deformation ratio have a negligible influence on the springback into the range of values experimented here. Bending radius has hardly influence as well while bending angle is the most significant variable. The results obtained are compared to those measured in deep-drawn cups, finding a great agreement. (Author)

  4. Methodology for Mechanical Property Testing on Fuel Cladding Using an Expanded Plug Wedge Test

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jy-An John [ORNL; Jiang, Hao [ORNL

    2013-08-01

    To determine the tensile properties of irradiated fuel cladding in a hot cell, a simple test was developed at ORNL and is described fully in US Patent Application 20060070455, Expanded plug method for developing circumferential mechanical properties of tubular materials. This method is designed for testing fuel rod cladding ductility in a hot cell utilizing an expandable plug to stretch a small ring of irradiated cladding material. The specimen strain is determined using the measured diametrical expansion of the ring. This method removes many complexities associated with specimen preparation and testing. The advantages are the simplicity of measuring the test component assembly in the hot cell and the direct measurement of specimen strain. It was also found that cladding strength could be determined from the test results. The basic approach of this test method is to apply an axial compressive load to a cylindrical plug of polyurethane (or other materials) fitted inside a short ring of the test material to achieve radial expansion of the specimen. The diameter increase of the specimen is used to calculate the circumferential strain accrued during the test. The other two basic measurements are total applied load and amount of plug compression (extension). A simple procedure is used to convert the load circumferential strain data from the ring tests into material pseudo-stress-strain curves. However, several deficiencies exist in this expanded-plug loading ring test, which will impact accuracy of test results and introduce potential shear failure of the specimen due to inherited large axial compressive stress from the expansion plug test. First of all, the highly non-uniform stress and strain distribution resulted in the gage section of the clad. To ensure reliable testing and test repeatability, the potential for highly non-uniform stress distribution or displacement/strain deformation has to be eliminated at the gage section of the specimen. Second, significant

  5. New methodology to investigate potential contaminant mass fluxes at the stream-aquifer interface by combining integral pumping tests and streambed temperatures

    International Nuclear Information System (INIS)

    Kalbus, E.; Schmidt, C.; Bayer-Raich, M.; Leschik, S.; Reinstorf, F.; Balcke, G.U.; Schirmer, M.

    2007-01-01

    The spatial pattern and magnitude of mass fluxes at the stream-aquifer interface have important implications for the fate and transport of contaminants in river basins. Integral pumping tests were performed to quantify average concentrations of chlorinated benzenes in an unconfined aquifer partially penetrated by a stream. Four pumping wells were operated simultaneously for a time period of 5 days and sampled for contaminant concentrations. Streambed temperatures were mapped at multiple depths along a 60 m long stream reach to identify the spatial patterns of groundwater discharge and to quantify water fluxes at the stream-aquifer interface. The combined interpretation of the results showed average potential contaminant mass fluxes from the aquifer to the stream of 272 μg m -2 d -1 MCB and 71 μg m -2 d -1 DCB, respectively. This methodology combines a large-scale assessment of aquifer contamination with a high-resolution survey of groundwater discharge zones to estimate contaminant mass fluxes between aquifer and stream. - We provide a new methodology to quantify the potential contaminant mass flux from an aquifer to a stream

  6. Two methodologies for physical penetration testing using social engineering

    NARCIS (Netherlands)

    Dimkov, T.; van Cleeff, A.; Pieters, Wolter; Hartel, Pieter H.

    2010-01-01

    Penetration tests on IT systems are sometimes coupled with physical penetration tests and social engineering. In physical penetration tests where social engineering is allowed, the penetration tester directly interacts with the employees. These interactions are usually based on deception and if not

  7. Improvement of test methodology for evaluating diesel fuel stability

    Energy Technology Data Exchange (ETDEWEB)

    Gutman, M.; Tartakovsky, L.; Kirzhner, Y.; Zvirin, Y. [Internal Combustion Engines Lab., Haifa (Israel); Luria, D. [Fuel Authority, Tel Aviv (Israel); Weiss, A.; Shuftan, M. [Israel Defence Forces, Tel Aviv (Israel)

    1995-05-01

    The storage stability of diesel fuel has been extensively investigated for many years under laboratory conditions. Although continuous efforts have been made to improve testing techniques, there does not yet exist a generally accepted correlation between laboratory methods (such as chemical analysis of the fuel) and actual diesel engine tests. A testing method was developed by the Technion Internal Combustion Engines Laboratory (TICEL), in order to address this problem. The test procedure was designed to simulate diesel engine operation under field conditions. It is based on running a laboratory-modified single cylinder diesel engine for 50 h under cycling operating conditions. The overall rating of each test is based on individual evaluation of the deposits and residue formation in the fuel filter, nozzle body and needle, piston head, piston rings, exhaust valve, and combustion chamber (six parameters). Two methods for analyzing the test results were used: objective, based on measured data, and subjective, based on visual evaluation results of these deposits by a group of experts. Only the residual level in the fuel filter was evaluated quantitatively by measured results. In order to achieve higher accuracy of the method, the test procedure was improved by introducing the measured results of nozzle fouling as an additional objective evaluating (seventh) parameter. This factor is evaluated on the basis of the change in the air flow rate through the nozzle before and after the complete engine test. Other improvements in the method include the use of the nozzle assembly photograph in the test evaluation, and representation of all seven parameters on a continuous scale instead of the discrete scale used anteriorly, in order to achieve higher accuracy. This paper also contains the results obtained by application of this improved fuel stability test for a diesel fuel stored for a five-year period.

  8. Assessment of Feasibility of Suction Pile/Anchor Installation and Pullout Testing through Field Tests

    Directory of Open Access Journals (Sweden)

    R. Vijaya

    2014-09-01

    Full Text Available Suction pile anchors are large cylindrical (inverted bucket type structure open at the bottom and closed at the top and largely used for mooring of offshore platforms, exploratory vessels etc. Prediction of the mooring capacity of suction piles is a critical issue faced by the design engineers and rational methods are required to produce reliable designs. Tests have been conducted in an existing natural pond within NIOT campus with the objective of developing methodology of deployment, design and logistics for suction pile installation and testing of mooring capacity under static pullout. Small size suction piles with varying diameters and lengths have been used in the tests. The tests have been carried out in the natural pond with constant water depth of 1.5 m with the top 1.5 m layer of bed comprising soft marine clay. It is found that pile geometry, aspect ratio and angle of pullout have a significant influence on the response to pullout. As angle of mooring load application changes from vertical to horizontal the reaction offered by the suction pile changes from skin friction to passive soil resistance. Resistance offered by the internal plug of soil is found to vary according to dimension of the anchor piles.

  9. Analog automatic test pattern generation for quasi-static structural test.

    NARCIS (Netherlands)

    Zjajo, A.; Pineda de Gyvez, J.

    2009-01-01

    A new approach for structural, fault-oriented analog test generation methodology to test for the presence of manufacturing-related defects is proposed. The output of the test generator consists of optimized test stimuli, fault coverage and sampling instants that are sufficient to detect the failure

  10. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology.

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid

    2017-03-16

    Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development

  11. Assessing soundscape: Comparison between in situ and laboratory methodologies

    Directory of Open Access Journals (Sweden)

    Hermida Cadena Luis Fernando

    2017-06-01

    Full Text Available The assessment of soundscape implies an interdisciplinary approach, where objective and subjective aspects are considered. For the subjective evaluation, in situ and laboratory methodologies are usually followed. Local observations allow the collection of information on the influence of different stimuli present in the environment, whereas laboratory tests present a determined quantity of controlled stimuli to the evaluator. The purpose of this work is to compare results from the different methodologies in order to understand their strengths and their weaknesses. Three urban parks in the city of Lisbon, Portugal, were evaluated. Fragments of binaural sound recordings collected in the parks were used in laboratory tests to compare with the responses in situ and of expert and nonexpert listeners. Statistically significant differences were found in several of the perceptual attributes under observation, which led to variation in the results of the main model’s components. The sound environments were found to be more pleasant and uneventful in situ than in the laboratory, a phenomenon possibly due to the influence of other stimuli such as visual in the process of assessment. The in situ tests allow a systemic and holistic evaluation of the environment under study,whereas the laboratory tests allow a specific and tightly targeted analysis of different component sound events. Therefore, the two methodologies can be useful in soundscape assessment depending on the specific application and needs. No differences were found in the assessment made by either experts or nonexperts.

  12. IMPROVING METHODOLOGY OF RISK IDENTIFICATION OF OCCUPATIONAL DANGEROUS

    Directory of Open Access Journals (Sweden)

    A. P. BOCHKOVSKYI

    2018-04-01

    Full Text Available In the paper, according to the analysis of statistical data, correlation between the amount of occupational injuries and occupationaldiseases in Ukraine within last 5 years is defined. Also, using methodology of the International Labor Organizationcorrelcation between the amount of accident fatalities and general number of accidents in Ukraine and EU countries (Austria, GreatBritain, Germany, Denmark, Norway, Poland, Hungry, Finland, France is defined. It is shown that in spite of the positive dynamicsof decreasing amount of occupational injuries, the number of occupational diseases in Ukraine always increases. The comparativeanalysis of the ratio of the number of accident fatalities to the total number of registered accidents showed that, on average, Ukraineexceeds the EU countries by this indicator by 100 times.It is noted, that such negative indicators (in particular, increasing amount of occupational diseases, may occure because ofimperfect methodology for identifying the risks of professional dangerous.Also, it is ascertained that basing on the existed methodology, the identefication process of occupational dangerous isquite subjective, which reduces objectivity of conducting quantitative assessment. In order to eliminate defined drawnbacks it is firsttime proposed to use corresponding integral criterion to conduct the process of quantitative risk assessmentTo solve this problem authors formulate and propose an algorithm of improving methodology of a process of analysing dangerousand harmful production effects (DHPE which are the mainest reasons of occupational dangerous.The proposed algorithm includes implementation of four following successive steps: DHPE identification, indication of theirmaximum allowed threshold of concentrations (levels, identification of the sources of identified DHPE, esimation of consequencesof manifestation.The improved proposed methodology allows indentify risks of occurrence occupational dangerous in systems

  13. Methodological issues of assessing the effects of social inequality in Russia’s regions

    Directory of Open Access Journals (Sweden)

    Ol’ga Anatol’evna Kozlova

    2014-11-01

    Full Text Available The article deals with the issue concerning the assessment of the impact of social inequality on the socio-demographic characteristics of society. The authors evaluate the impact of the social inequality growth in the Russian Federation subjects on the basis of the analysis of the decile ratio dynamics. They propose a methodological approach to determine the degree of dependence of crime rate on the growth of social inequality. The authors compare the influence of the decile ratio and purchasing power on mortality rate in the regions of Russia

  14. Meiotic sex ratio variation in natural populations of Ceratodon purpureus (Ditrichaceae).

    Science.gov (United States)

    Norrell, Tatum E; Jones, Kelly S; Payton, Adam C; McDaniel, Stuart F

    2014-09-01

    • Sex ratio variation is a common but often unexplained phenomenon in species across the tree of life. Here we evaluate the hypothesis that meiotic sex ratio variation can contribute to the biased sex ratios found in natural populations of the moss Ceratodon purpureus.• We obtained sporophytes from several populations of C. purpureus from eastern North America. From each sporophyte, we estimated the mean spore viability by germinating replicate samples on agar plates. We estimated the meiotic sex ratio of each sporophyte by inferring the sex of a random sample of germinated spores (mean = 77) using a PCR-RFLP test. We tested for among-sporophyte variation in viability using an ANOVA and for deviations from 1:1 sex ratio using a χ(2)-test and evaluated the relationship between these quantities using a linear regression.• We found among-sporophyte variation in spore viability and meiotic sex ratio, suggesting that genetic variants that contribute to variation in both of these traits segregate within populations of this species. However, we found no relationship between these quantities, suggesting that factors other than sex ratio distorters contribute to variation in spore viability within populations.• These results demonstrate that sex ratio distortion may partially explain the population sex ratio variation seen in C. purpureus, but more generally that genetic conflict over meiotic segregation may contribute to fitness variation in this species. Overall, this study lays the groundwork for future studies on the genetic basis of meiotic sex ratio variation. © 2014 Botanical Society of America, Inc.

  15. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  16. Fatigue crack closure behavior at high stress ratios

    Science.gov (United States)

    Turner, C. Christopher; Carman, C. Davis; Hillberry, Ben M.

    1988-01-01

    Fatigue crack delay behavior at high stress ratio caused by single peak overloads was investigated in two thicknesses of 7475-T731 aluminum alloy. Closure measurements indicated no closure occurred before or throughout the overload plastic zones following the overload. This was further substantiated by comparing the specimen compliance following the overload with the compliance of a low R ratio test when the crack was fully open. Scanning electron microscope studies revealed that crack tunneling and possibly reinitiation of the crack occurred, most likely a result of crack-tip blunting. The number of delay cycles was greater for the thinner mixed mode stress state specimen than for the thicker plane strain stress state specimen, which is similar to low R ratio test results and may be due to a larger plastic zone for the mixed mode cased.

  17. Impact of Inflation Accounting Application on Key Financial Ratios

    Directory of Open Access Journals (Sweden)

    Aydın KARAPINAR

    2012-03-01

    Full Text Available This paper investigates the impact of inflation accounting on key financal ratios. To this end, the financial statements of 132 companies listed in the Istanbul Stock Exchange (ISE are studied. An analyis of paired samples t test has been conducted on the financial ratios of the companies. The results show that a significant difference between adjusted cost based financial ratios and historical cost based financial ratios occurs only for current, ratios, equity ratios and noncurrent turnover ratios. The study does not cover companies operating in the financial sector. The companies reporting in accordance with IFRS for the studied periods that spans 2001-2004 are not included in the study either. The study offers valuable information as to analysing companies operating in hiper inflation economies.

  18. Assessing Personality and Mood With Adjective Check List Methodology: A Review

    Science.gov (United States)

    Craig, Robert J.

    2005-01-01

    This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…

  19. A Methodology for Optimization in Multistage Industrial Processes: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Piotr Jarosz

    2015-01-01

    Full Text Available The paper introduces a methodology for optimization in multistage industrial processes with multiple quality criteria. Two ways of formulation of optimization problem and four different approaches to solve the problem are considered. Proposed methodologies were tested first on a virtual process described by benchmark functions and next were applied in optimization of multistage lead refining process.

  20. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  1. Design of formulated products: a systematic methodology

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Ng, K.M.

    2011-01-01

    /or verifies a specified set through a sequence of predefined activities (work-flow). Stage-2 and stage-3 (not presented here) deal with the planning and execution of experiments, for product validation. Four case studies have been developed to test the methodology. The computer-aided design (stage-1...

  2. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    Science.gov (United States)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.

  3. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    Science.gov (United States)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-04-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity when compared to bulk counterpart and ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols is made with an extended treatment of details. Our attention to the methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence the existing testing methods in a partially unpredictable manner, an aspect which is frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench level. This work outlines the most important steps of our testing approach. In particular, each step will be briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need of preliminary studies in developing reliable tests, a crucial aspect to confirm the suitability of the chosen analytical and toxicological methods to be used for the specific tested nanoparticle, and to express the idea that in nanotoxicology,"devil is in the detail".

  4. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    International Nuclear Information System (INIS)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-01-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity when compared to bulk counterpart and ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols is made with an extended treatment of details. Our attention to the methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence the existing testing methods in a partially unpredictable manner, an aspect which is frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench level. This work outlines the most important steps of our testing approach. In particular, each step will be briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need of preliminary studies in developing reliable tests, a crucial aspect to confirm the suitability of the chosen analytical and toxicological methods to be used for the specific tested nanoparticle, and to express the idea that in nanotoxicology,'devil is in the detail'.

  5. Gyromagnetic ratio of charged Kerr-anti-de Sitter black holes

    International Nuclear Information System (INIS)

    Aliev, Alikram N

    2007-01-01

    We examine the gyromagnetic ratios of rotating and charged AdS black holes in four and higher spacetime dimensions. We compute the gyromagnetic ratio for Kerr-AdS black holes with an arbitrary electric charge in four dimensions and show that it corresponds to g = 2 irrespective of the AdS nature of the spacetime. We also compute the gyromagnetic ratio for Kerr-AdS black holes with a single angular momentum and with a test electric charge in all higher dimensions. The gyromagnetic ratio crucially depends on the dimensionless ratio of the rotation parameter to the curvature radius of the AdS background. At the critical limit, when the boundary Einstein universe is rotating at the speed of light, it exhibits a striking feature leading to g 2 regardless of the spacetime dimension. Next, we extend our consideration to include the exact metric for five-dimensional rotating charged black holes in minimal gauged supergravity. We show that the value of the gyromagnetic ratio found in the 'test-charge' approach remains unchanged for these black holes

  6. Discrimination of DPRK M5.1 February 12th, 2013 Earthquake as Nuclear Test Using Analysis of Magnitude, Rupture Duration and Ratio of Seismic Energy and Moment

    Science.gov (United States)

    Salomo Sianipar, Dimas; Subakti, Hendri; Pribadi, Sugeng

    2015-04-01

    On February 12th, 2013 morning at 02:57 UTC, there had been an earthquake with its epicenter in the region of North Korea precisely around Sungjibaegam Mountains. Monitoring stations of the Preparatory Commission for the Comprehensive Nuclear Test-Ban Treaty Organization (CTBTO) and some other seismic network detected this shallow seismic event. Analyzing seismograms recorded after this event can discriminate between a natural earthquake or an explosion. Zhao et. al. (2014) have been successfully discriminate this seismic event of North Korea nuclear test 2013 from ordinary earthquakes based on network P/S spectral ratios using broadband regional seismic data recorded in China, South Korea and Japan. The P/S-type spectral ratios were powerful discriminants to separate explosions from earthquake (Zhao et. al., 2014). Pribadi et. al. (2014) have characterized 27 earthquake-generated tsunamis (tsunamigenic earthquake or tsunami earthquake) from 1991 to 2012 in Indonesia using W-phase inversion analysis, the ratio between the seismic energy (E) and the seismic moment (Mo), the moment magnitude (Mw), the rupture duration (To) and the distance of the hypocenter to the trench. Some of this method was also used by us to characterize the nuclear test earthquake. We discriminate this DPRK M5.1 February 12th, 2013 earthquake from a natural earthquake using analysis magnitude mb, ms and mw, ratio of seismic energy and moment and rupture duration. We used the waveform data of the seismicity on the scope region in radius 5 degrees from the DPRK M5.1 February 12th, 2013 epicenter 41.29, 129.07 (Zhang and Wen, 2013) from 2006 to 2014 with magnitude M ≥ 4.0. We conclude that this earthquake was a shallow seismic event with explosion characteristics and can be discriminate from a natural or tectonic earthquake. Keywords: North Korean nuclear test, magnitude mb, ms, mw, ratio between seismic energy and moment, ruptures duration

  7. The ATP/DNA Ratio Is a Better Indicator of Islet Cell Viability Than the ADP/ATP Ratio

    Science.gov (United States)

    Suszynski, T.M.; Wildey, G.M.; Falde, E.J.; Cline, G.W.; Maynard, K. Stewart; Ko, N.; Sotiris, J.; Naji, A.; Hering, B.J.; Papas, K.K.

    2009-01-01

    Real-time, accurate assessment of islet viability is critical for avoiding transplantation of nontherapeutic preparations. Measurements of the intracellular ADP/ATP ratio have been recently proposed as useful prospective estimates of islet cell viability and potency. However, dead cells may be rapidly depleted of both ATP and ADP, which would render the ratio incapable of accounting for dead cells. Since the DNA of dead cells is expected to remain stable over prolonged periods of time (days), we hypothesized that use of the ATP/DNA ratio would take into account dead cells and may be a better indicator of islet cell viability than the ADP/ATP ratio. We tested this hypothesis using mixtures of healthy and lethally heat-treated (HT) rat insulinoma cells and human islets. Measurements of ATP/DNA and ADP/ATP from the known mixtures of healthy and HT cells and islets were used to evaluate how well these parameters correlated with viability. The results indicated that ATP and ADP were rapidly (within 1 hour) depleted in HT cells. The fraction of HT cells in a mixture correlated linearly with the ATP/DNA ratio, whereas the ADP/ADP ratio was highly scattered, remaining effectively unchanged. Despite similar limitations in both ADP/ADP and ATP/DNA ratios, in that ATP levels may fluctuate significantly and reversibly with metabolic stress, the results indicated that ATP/DNA was a better measure of islet viability than the ADP/ATP ratio. PMID:18374063

  8. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    Science.gov (United States)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.

  9. The specific aspects for the ASSET methodology implementation in Romania

    Energy Technology Data Exchange (ETDEWEB)

    Serbanescu, D [National Commission for Nuclear Activities Control of Romania (Romania)

    1997-10-01

    The main aspects of the implementation of a root cause analysis methodology are as follows: The Test Operating Licence requires that a systematical root cause analysis method for the event analysis to clarify the three questions from the ASSET methodology has to be implemented; A Training seminar on the ASSET methodology for the plant staff was performed at Cernavoda 1 NPP in April 1997, with the IAEA support; The self assessment process for the events which occurred during commissioning phases has to be performed by the plant up to the end of this year; An ASSET Peer Review of the Plant Self Assessment is planned in 1998; The Regulatory Authority has the task to evaluated independently the plant conclusions on various events. The tool used by CNCAN is the ASSET methodology.

  10. The specific aspects for the ASSET methodology implementation in Romania

    International Nuclear Information System (INIS)

    Serbanescu, D.

    1997-01-01

    The main aspects of the implementation of a root cause analysis methodology are as follows: The Test Operating Licence requires that a systematical root cause analysis method for the event analysis to clarify the three questions from the ASSET methodology has to be implemented; A Training seminar on the ASSET methodology for the plant staff was performed at Cernavoda 1 NPP in April 1997, with the IAEA support; The self assessment process for the events which occurred during commissioning phases has to be performed by the plant up to the end of this year; An ASSET Peer Review of the Plant Self Assessment is planned in 1998; The Regulatory Authority has the task to evaluated independently the plant conclusions on various events. The tool used by CNCAN is the ASSET methodology

  11. Strategic alternatives ranking methodology: Multiple RCRA incinerator evaluation test case

    International Nuclear Information System (INIS)

    Baker, G.; Thomson, R.D.; Reece, J.; Springer, L.; Main, D.

    1988-01-01

    This paper presents an important process approach to permit quantification and ranking of multiple alternatives being considered in remedial actions or hazardous waste strategies. This process is a methodology for evaluating programmatic options in support of site selection or environmental analyses. Political or other less tangible motivations for alternatives may be quantified by means of establishing the range of significant variables, weighting their importance, and by establishing specific criteria for scoring individual alternatives. An application of the process to a recent AFLC program permitted ranking incineration alternatives from a list of over 130 options. The process forced participation by the organizations to be effected, allowed a consensus of opinion to be achieved, allowed complete flexibility to evaluate factor sensitivity, and resulted in strong, quantifiable support for any subsequent site-selection action NEPA documents

  12. A biamperometric method for the determination of O/U ratio in uranium oxide

    International Nuclear Information System (INIS)

    Xavier, Mary; Nair, P.R.; Aggarwal, S.K.

    2007-01-01

    The methodology based on the dissolution of the uranium dioxide samples in H 2 SO 4 + HF mixture and the indirect determination of U(VI) by biamperometric redox titration is a simple method for determining ratio in hyperstoichiometric UO 2 powders.Analytical methods for % measurements in hyperstoichiometric. The present paper describes a simple method based on the determination of U(IV) and total U by biamperometric titration

  13. Technical report on LWR design decision methodology. Phase I

    International Nuclear Information System (INIS)

    1980-03-01

    Energy Incorporated (EI) was selected by Sandia Laboratories to develop and test on LWR design decision methodology. Contract Number 42-4229 provided funding for Phase I of this work. This technical report on LWR design decision methodology documents the activities performed under that contract. Phase I was a short-term effort to thoroughly review the curret LWR design decision process to assure complete understanding of current practices and to establish a well defined interface for development of initial quantitative design guidelines

  14. Stress-strain curve of concretes with recycled concrete aggregates: analysis of the NBR 8522 methodology

    Directory of Open Access Journals (Sweden)

    D. A. GUJEL

    Full Text Available ABSTRACT This work analyses the methodology "A" (item A.4 employed by the Brazilian Standard ABNT 8522 (ABNT, 2008 for determining the stress-strain behavior of cylindrical specimens of concrete, presenting considerations about possible enhancements aiming it use for concretes with recycled aggregates with automatic test equipment. The methodology specified by the Brazilian Standard presents methodological issues that brings distortions in obtaining the stress-strain curve, as the use of a very limited number of sampling points and by inducing micro cracks and fluency in the elastic behavior of the material due to the use of steady stress levels in the test. The use of a base stress of 0.5 MPa is too low for modern high load test machines designed do high strength concrete test. The work presents a discussion over these subjects, and a proposal of a modified test procedure to avoid such situations.

  15. Development and Validation of a Translation Test.

    Science.gov (United States)

    Ghonsooly, Behzad

    1993-01-01

    Translation testing methodology has been criticized for its subjective character. No real strides have so far been made in developing an objective translation test. In this paper, certain detailed procedures including various phases of pretesting have been performed to achieve objectivity and scorability in translation testing methodology. In…

  16. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on mechanism of contamination failure and material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively and the estimated lifetimes of the SPTRs under normal condition were obtained through acceleration model (Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimation of SPTRs through using the Weibull distribution. The proposed novel methodology enables us to take less than one year time to estimate the reliability of the SPTRs designed for more than 10 years.

  18. Person fit for test speededness: normal curvatures, likelihood ratio tests and empirical Bayes estimates

    NARCIS (Netherlands)

    Goegebeur, Y.; de Boeck, P.; Molenberghs, G.

    2010-01-01

    The local influence diagnostics, proposed by Cook (1986), provide a flexible way to assess the impact of minor model perturbations on key model parameters’ estimates. In this paper, we apply the local influence idea to the detection of test speededness in a model describing nonresponse in test data,

  19. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2014-01-01

    In this paper, we propose a new methodology of identifying important research problems to be solved to improve the performance of some specific scientific technologies by the phenomena identification and ranking table (PIRT) process which has been used as a methodology for demonstrating the validity of the best estimate simulation codes in US Nuclear Regulatory Commission (USNRC) licensing of nuclear power plants. The new methodology makes it possible to identify important factors affecting the performance of the technologies from the viewpoint of the figure of merit and problems associated with them while it keeps the fundamental concepts of the original PIRT process. Also in this paper, we demonstrate the effectiveness of the new methodology by applying it to a task of extracting research problems for improving an inspection accuracy of ultrasonic testing or eddy current testing in the inspection of objects having cracks due to fatigue or stress corrosion cracking. (author)

  20. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  1. Assessing digital control system dependability using the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Garrett, C.J.; Guarro, S.B.; Apostolakis, G.E.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a methodological approach to modeling and analyzing the behavior of software-driven embedded systems for the purpose of reliability/safety assessment and verification. The methodology has two fundamental goals: (a) to identify how certain postulated events may occur in a system and (b) to identify an appropriate testing strategy based on an analysis of system functional behavior. To achieve these goals, the methodology employs a modeling framework in which system models are developed in terms of causal relationships between physical variables and temporal characteristics of the execution of software modules. These models are then analyzed to determine how a certain state (desirable or undesirable) can be reached. This is done by developing timed fault trees, which take the form of logical combinations of static trees relating system parameters at different points in time. The prime implicants (multistate analog of minimal cut sets) of the fault trees can be used to identify and eliminate system faults resulting from unanticipated combinations of software logic errors, hardware failures, and adverse environmental conditions and to direct testing activity to more efficiently eliminate implementation errors by focusing on the neighborhood of potential failure modes arising from these combinations of system conditions

  2. [Triglycerides/HDL-cholesterol ratio: in adolescents without cardiovascular risk factors].

    Science.gov (United States)

    Soutelo, Jimena; Graffigna, Mabel; Honfi, Margarita; Migliano, Marta; Aranguren, Marcela; Proietti, Adrian; Musso, Carla; Berg, Gabriela

    2012-06-01

    Triglycerides/HDL-cholesterol ratio (TG/HDL) is an easy resource determination and it has good correlation with the HOMA index in adults. Due to physiological insulin resistance (IR) in adolescence it is necessary to find markers of IR independent of age, sex and pubertal stage. The objective was to identify reference values of TG/HDL ratio in a population of adolescents without cardiovascular risk factors. We evaluated 943 adolescents, 429 females and 514 males between 11 and 14. Anthropometric measures were determined and body mass index was calculated (BMI). Blood was extracted after 12 hours of fasting to determine glucose, triglycerides, HDL. The metabolic syndrome (MS) was diagnosed according to criteria of NCEP/ATP III modified by Cook. We excluded adolescents with MS or any component of it. We evaluated 562 adolescents (289 women and 273 men) with a weight of 48.91 +/- 6.51kg, BMI: 18.95 +/- 1.78, systolic blood pressure of 108.12 +/- 13.60 mmHg, diastolic blood pressure: 63.82 +/- 9.43 and waist circumference: 65.09 +/- 4.54 cm. TG/HDL ratio was 1.25 +/- 0.43, with a 95 percentile of 2.05. In adults, TG/HDL ratio greater than 3 is a marker of insulin resistance. We believe that a higher value to 2.05 might be a good index of insulin resistance in adolescence. TG/HDL ratio has the advantage of being methodologically simpler, more economical and independent of pubertal stage.

  3. MARKETING MIX BY BED OCCUPANCY RATIO (BOR

    Directory of Open Access Journals (Sweden)

    Abdul Muhith

    2017-04-01

    Full Text Available Introduction: Bed Occupancy Ratio (BOR in RSI Arafah Mojosari during the last three years are at under ideal rate and the lowest of the three existing hospitals in the area of Mojosari. The purpose of this study was to determine the relationship marketing mix with Bed Occupancy Ratio in RSI Arafah Mojosari. Methods: This research uses analytic methods with crossectional approach. Variables in the study is marketing mix and Bed Occupancy Ratio (BOR. The population in this study were all patients hospitalized in the RSI Arafah Mojosari. Samples amounted 44 respondents taken by the Stratified random sampling technique. Data were collected using the questionnaire and analyzed using Fisher's Exact test. Result: The results obtained more than 50% of respondents (59.1% rate well against the marketing mix is developed by the hospital management and the majority of respondents (79.5% are in the treatment room that has a number BOR is not ideal. Fisher Exact test test results obtained probabililty value=0.02<0.05 so that H0 is rejected, which means there is a relationship marketing mix with the Bed Occupancy Ratio in RSI Arafah Mojosari. Discussion: Hospitals which able to develop the marketing mix very well, can attract consumers to use inpatient services at the hospital, with that BOR value will increase as the increased use of inpatient services. Hospital management must be able to formulate a good marketing mix strategy that hospital marketing objectives can be achieved. Conformity between service quality and service rates must be addressed, otherwise it extent of media promotions can attract patients to inpatient services.

  4. Built-In Test Engine For Memory Test

    OpenAIRE

    McEvoy, Paul; Farrell, Ronan

    2004-01-01

    In this paper we will present an on-chip method for testing high performance memory devices, that occupies minimal area and retains full flexibility. This is achieved through microcode test instructions and the associated on-chip state machine. In addition, the proposed methodology will enable at-speed testing of memory devices. The relevancy of this work is placed in context with an introduction to memory testing and the techniques and algorithms generally used today.

  5. Fracture assessment of shallow-flaw cruciform beams tested under uniaxial and biaxial loading conditions

    International Nuclear Information System (INIS)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1999-01-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for an RPV material. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies, namely, the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading-biaxiality on fracture toughness; the conventional maximum principal stress criterion indicated no effect. A three-parameter Weibull model based on the hydrostatic stress criterion is shown to correlate with the experimentally observed biaxial effect on cleavage fracture toughness by providing a scaling mechanism between uniaxial and biaxial loading states. (orig.)

  6. Internal jugular vein: Peripheral vein adrenocorticotropic hormone ratio in patients with adrenocorticotropic hormone-dependent Cushing′s syndrome: Ratio calculated from one adrenocorticotropic hormone sample each from right and left internal jugular vein during corticotrophin releasing hormone stimulation test

    Directory of Open Access Journals (Sweden)

    Sachin Chittawar

    2013-01-01

    Full Text Available Background: Demonstration of central: Peripheral adrenocorticotropic hormone (ACTH gradient is important for diagnosis of Cushing′s disease. Aim: The aim was to assess the utility of internal jugular vein (IJV: Peripheral vein ACTH ratio for diagnosis of Cushing′s disease. Materials and Methods: Patients with ACTH-dependent Cushing′s syndrome (CS patients were the subjects for this study. One blood sample each was collected from right and left IJV following intravenous hCRH at 3 and 5 min, respectively. A simultaneous peripheral vein sample was also collected with each IJV sample for calculation of IJV: Peripheral vein ACTH ratio. IJV sample collection was done under ultrasound guidance. ACTH was assayed using electrochemiluminescence immunoassay (ECLIA. Results: Thirty-two patients participated in this study. The IJV: Peripheral vein ACTH ratio ranged from 1.07 to 6.99 ( n = 32. It was more than 1.6 in 23 patients. Cushing′s disease could be confirmed in 20 of the 23 cases with IJV: Peripheral vein ratio more than 1.6. Four patients with Cushing′s disease and 2 patients with ectopic ACTH syndrome had IJV: Peripheral vein ACTH ratio less than 1.6. Six cases with unknown ACTH source were excluded for calculation of sensitivity and specificity of the test. Conclusion: IJV: Peripheral vein ACTH ratio calculated from a single sample from each IJV obtained after hCRH had 83% sensitivity and 100% specificity for diagnosis of CD.

  7. Urine Test: Microalbumin-to-Creatinine Ratio (For Parents)

    Science.gov (United States)

    ... could interfere with test results. Be sure to review all your child's medications with your doctor. The Procedure Your child will be asked to urinate (pee) into a clean sample cup in the doctor's office or at home. Collecting the specimen should only take a few minutes. If your child isn' ...

  8. Performance of a high-work low aspect ratio turbine tested with a realistic inlet radial temperature profile

    Science.gov (United States)

    Stabe, R. G.; Whitney, W. J.; Moffitt, T. P.

    1984-01-01

    Experimental results are presented for a 0.767 scale model of the first stage of a two-stage turbine designed for a high by-pass ratio engine. The turbine was tested with both uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The inlet temperature profile was essentially mixed-out in the rotor. There was also substantial underturning of the exit flow at the mean diameter. Both of these effects were attributed to strong secondary flows in the rotor blading. There were no significant differences in the stage performance with either inlet condition when differences in tip clearance were considered. Performance was very close to design intent in both cases. Previously announced in STAR as N84-24589

  9. Evaluating score- and feature-based likelihood ratio models for multivariate continuous data: applied to forensic MDMA comparison

    NARCIS (Netherlands)

    Bolck, A.; Ni, H.; Lopatka, M.

    2015-01-01

    Likelihood ratio (LR) models are moving into the forefront of forensic evidence evaluation as these methods are adopted by a diverse range of application areas in forensic science. We examine the fundamentally different results that can be achieved when feature- and score-based methodologies are

  10. A methodology for replacement of conventional steel by microalloyed steel in bus tubular structures

    International Nuclear Information System (INIS)

    Cruz, Magnus G.H.; Viecelli, Alexandre

    2008-01-01

    The aim of this article is to show the use of a methodology that allows, in a trustful way and without the need to build up a complete physical model, the replacement of conventional steel by structural microalloyed steel (HSLA) in tubular structure, concerning passengers transport in vehicles with capacity of more than 20 people. The validation of the methodology is based on the ECE R66-00 regulation and on the Brazilian CONTRAN 811/96 resolution, which regulate minimal conditions of safety for this kind of vehicle. The methodology has four sequential and dependent stages, where the main focus is related to the experimental tests through the models that are simplified initially for later calibration using finite element method. Modular structures made of two different materials were tested and analyzed to confirm the present methodology, first the structure made of steel that is used by the bus industry in Brazil was tested and then it was compared with the new microalloyed steel. Experimental values are compared with calculated ones, foreseeing parametric optimisation and keeping the security levels according to legislation

  11. A methodology for replacement of conventional steel by microalloyed steel in bus tubular structures

    Energy Technology Data Exchange (ETDEWEB)

    Cruz, Magnus G.H. [Marcopolo S.A., Unidade Ana Rech, Av. Rio Branco, 4889, Ana Rach, 95060-650 Caxias do Sul (Brazil)], E-mail: magnus@verbonet.com.br; Viecelli, Alexandre [Mechanical Engineering Department, Universidade de Caxias do Sul, Rua Francisco Getulio Vargas, 1130, 95070-560 Caxias do Sul, RS (Brazil)], E-mail: avieceli@ucs.br

    2008-07-01

    The aim of this article is to show the use of a methodology that allows, in a trustful way and without the need to build up a complete physical model, the replacement of conventional steel by structural microalloyed steel (HSLA) in tubular structure, concerning passengers transport in vehicles with capacity of more than 20 people. The validation of the methodology is based on the ECE R66-00 regulation and on the Brazilian CONTRAN 811/96 resolution, which regulate minimal conditions of safety for this kind of vehicle. The methodology has four sequential and dependent stages, where the main focus is related to the experimental tests through the models that are simplified initially for later calibration using finite element method. Modular structures made of two different materials were tested and analyzed to confirm the present methodology, first the structure made of steel that is used by the bus industry in Brazil was tested and then it was compared with the new microalloyed steel. Experimental values are compared with calculated ones, foreseeing parametric optimisation and keeping the security levels according to legislation.

  12. The use of agile systems development methodologies in the telecommunication industry in South Africa / B.M. Mazengera

    OpenAIRE

    Mazengera, Bruce Mwai Analinafe

    2009-01-01

    Over the last decade, systems development professionals have recognised the need to use agile systems development methodologies (ASDMs) in the telecommunication industry. This is partly due to the barriers identified by Mansurov (2000) which suggest that the use of agile methodologies in the telecommunication industry would reduce the ratio of time-to-market. In the South African context, the industry has cemented its position as a major driving force of the economy as a whole. The industry's...

  13. Planned Enhanced Wakefield Transformer Ratio Experiment at Argonne Wakefield Accelerator

    CERN Document Server

    Kanareykin, Alex; Gai, Wei; Jing, Chunguang; Konecny, Richard; Power, John G

    2005-01-01

    In this paper, we present a preliminary experimental study of a wakefield accelerating scheme that uses a carefully spaced and current ramped electron pulse train to produce wakefields that increases the transformer ratio much higher than 2. A dielectric structure was designed and fabricated to operate at 13.625 GHz with dielectric constant of 15.7. The structure will be initially excited by two beams with first and second beam charge ratio of 1:3. The expected transformer ratio is 3 and the setup can be easily extend to 4 pulses which leads to a transformer ratio of more than 6. The dielectric structure cold test results show the tube is within the specification. A set of laser splitters was also tested to produce ramped bunch train of 2 - 4 pulses. Overall design of the experiment and initial results will be presented.

  14. Is law enforcement of drug-impaired driving cost-efficient? An explorative study of a methodology for cost-benefit analysis.

    Science.gov (United States)

    Veisten, Knut; Houwing, Sjoerd; Mathijssen, M P M René; Akhtar, Juned

    2013-03-01

    Road users driving under the influence of psychoactive substances may be at much higher relative risk (RR) in road traffic than the average driver. Legislation banning blood alcohol concentrations above certain threshold levels combined with roadside breath-testing of alcohol have been in lieu for decades in many countries, but new legislation and testing of drivers for drug use have recently been implemented in some countries. In this article we present a methodology for cost-benefit analysis (CBA) of increased law enforcement of roadside drug screening. This is an analysis of the profitability for society, where costs of control are weighed against the reduction in injuries expected from fewer drugged drivers on the roads. We specify assumptions regarding costs and the effect of the specificity of the drug screening device, and quantify a deterrence effect related to sensitivity of the device yielding the benefit estimates. Three European countries with different current enforcement levels were studied, yielding benefit-cost ratios in the approximate range of 0.5-5 for a tripling of current levels of enforcement, with costs of about 4000 EUR per convicted and in the range of 1.5 and 13 million EUR per prevented fatality. The applied methodology for CBA has involved a simplistic behavioural response to enforcement increase and control efficiency. Although this methodology should be developed further, it is clearly indicated that the cost-efficiency of increased law enforcement of drug driving offences is dependent on the baseline situation of drug-use in traffic and on the current level of enforcement, as well as the RR and prevalence of drugs in road traffic. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  16. Creating and evaluating a new clicker methodology

    Science.gov (United States)

    Li, Pengfei

    "Clickers", an in-class polling system, has been used by many instructors to add active learning and formative assessment to previously passive traditional lectures. While considerable research has been conducted on clicker increasing student interaction in class, less research has been reported on the effectiveness of using clicker to help students understand concepts. This thesis reported a systemic project by the OSU Physics Education group to develop and test a new clicker methodology. Clickers question sequences based on a constructivist model of learning were used to improve classroom dynamics and student learning. They also helped students and lecturers understand in real time whether a concept had been assimilated or more effort was required. Chapter 1 provided an introduction to the clicker project. Chapter 2 summarized widely-accepted teaching principles that have arisen from a long history of research and practice in psychology, cognitive science and physics education. The OSU clicker methodology described in this thesis originated partly from our years of teaching experience, but mostly was based on these teaching principles. Chapter 3 provided an overview of the history of clicker technology and different types of clickers. Also, OSU's use of clickers was summarized together with a list of common problems and corresponding solutions. These technical details may be useful for those who want to use clickers. Chapter 4 discussed examples of the type and use of question sequences based on the new clicker methodology. In several years of research, we developed a base of clicker materials for calculus-based introductory physics courses at OSU. As discussed in chapter 5, a year-long controlled quantitative study was conducted to determine whether using clickers helps students learn, how using clickers helps students learn and whether students perceive that clicker has a positive effect on their own learning process. The strategy for this test was based on

  17. High-level methodology for carrying out combined red and blue teams

    CSIR Research Space (South Africa)

    Veerasamy, N

    2009-12-01

    Full Text Available This paper proposes a combined Red and Blue Team Methodology to guide the process of carrying out such security auditing and penetration testing tasks. Red and Blue Teams consist of various security auditing and penetration testing tasks which serve...

  18. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of the vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers although they are aimed for real world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, then the simulation is deemed ''not invalid''. If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine if the results indicate a modeling error or a modeling inadequacy; and if a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model, and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights for the deficiencies of the model are reported in the analysis but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with

  19. Using non-performing loan ratios as default rates in the estimation of credit losses and macroeconomic credit risk stress testing: A case from Turkey

    Directory of Open Access Journals (Sweden)

    Guray Kucukkocaoglu

    2016-02-01

    Full Text Available In this study, inspired by the Credit Portfolio View approach, we intend to develop an econometric credit risk model to estimate credit loss distributions of Turkish Banking System under baseline and stress macro scenarios, by substituting default rates with non-performing loan (NPL ratios. Since customer number based historical default rates are not available for the whole Turkish banking system’s credit portfolio, we used NPL ratios as dependent variable instead of default rates, a common practice for many countries where historical default rates are not available. Although, there are many problems in using NPL ratios as default rates such as underestimating portfolio losses as a result of totally non-homogeneous total credit portfolios and transferring non-performing loans to asset management companies from banks’ balance sheets, our aim is to underline and limit some ignored problems using accounting based NPL ratios as default rates in macroeconomic credit risk modeling. Developed models confirm the strong statistical relationship between systematic component of credit risk and macroeconomic variables in Turkey. Stress test results also are compatible with the past experiences

  20. A test of the mean density approximation for Lennard-Jones mixtures with large size ratios

    International Nuclear Information System (INIS)

    Ely, J.F.

    1986-01-01

    The mean density approximation for mixture radial distribution functions plays a central role in modern corresponding-states theories. This approximation is reasonably accurate for systems that do not differ widely in size and energy ratios and which are nearly equimolar. As the size ratio increases, however, or if one approaches an infinite dilution of one of the components, the approximation becomes progressively worse, especially for the small molecule pair. In an attempt to better understand and improve this approximation, isothermal molecular dynamics simulations have been performed on a series of Lennard-Jones mixtures. Thermodynamic properties, including the mixture radial distribution functions, have been obtained at seven compositions ranging from 5 to 95 mol%. In all cases the size ratio was fixed at two and three energy ratios were investigated, 22 / 11 =0.5, 1.0, and 1.5. The results of the simulations are compared with the mean density approximation and a modification to integrals evaluated with the mean density approximation is proposed

  1. Dynamic moduli and damping ratios of soil evaluated from pressuremeter test

    International Nuclear Information System (INIS)

    Yoshida, Yasuo; Ezashi, Yasuyuki; Kokusho, Takaji; Nishi, Yoshikazu

    1984-01-01

    Dynamic and static properties of soils are investigated using the newly developed equipment of in-situ test, which imposes dynamic repeated pressure on borehole wall at any depth covering a wide range of strain amplitude. This paper describes mainly the shear modulus and damping characteristics of soils obtained by using the equipment in several sites covering wide variety of soils. The test results are compared and with those obtained by other test methods such as the dynamic triaxial test, the simple shear test and the shear wave velocity test, and discussions are made with regard to their relation ships to each other, which demonstrates the efficiency of this in-situ test. (author)

  2. Optimization of the production of biodiesel by a commercial immobilized lipase in a solvent-free system using a response surface methodology

    Directory of Open Access Journals (Sweden)

    ZORICA KNEZEVIC

    2008-02-01

    Full Text Available Response surface methodology was used for the evaluation of the effects of various factors on the synthesis of biodiesel catalyzed with immobilized lipase from Rhizomucor miehei in a solvent-free system. The production of biodiesel was optimized and model response equations were obtained, enabling the prediction of biodiesel production from the values of the four main factors. It would seem that the reaction temperature and the amount of water predominantly determined the conversion process while the methanol/oil molar ratio had no significant influence on the reaction rate. The temperature and amount of water showed negative interactive effects on the observed reaction rate per amount of enzyme. However, there were no significant interactions among the other variables according to the test of statistical significance. The highest yield of 10.15 mol kg-1 enzyme was observed at 45 °C with a 6:1 methanol to oil molar ratio and with no added water in the system.

  3. Leach test methodology for the Waste/Rock Interactions Technology Program

    International Nuclear Information System (INIS)

    Bradley, D.J.; McVay, G.L.; Coles, D.G.

    1980-05-01

    Experimental leach studies in the WRIT Program have two primary functions. The first is to determine radionuclide release from waste forms in laboratory environments which attempt to simulate repository conditions. The second is to elucidate leach mechanisms which can ultimately be incorporated into nearfield transport models. The tests have been utilized to generate rates of removal of elements from various waste forms and to provide specimens for surface analysis. Correlation between constituents released to the solution and corresponding solid state profiles is invaluable in the development of a leach mechanism. Several tests methods are employed in our studies which simulate various proposed leach incident scenarios. Static tests include low temperature (below 100 0 C) and high temperature (above 100 0 C) hydrothermal tests. These tests reproduce nonflow or low-flow repository conditions and can be used to compare materials and leach solution effects. The dynamic tests include single-pass, continuous-flow(SPCF) and solution-change (IAA)-type tests in which the leach solutions are changed at specific time intervals. These tests simulate repository conditions of higher flow rates and can also be used to compare materials and leach solution effects under dynamic conditions. The modified IAEA test is somewhat simpler to use than the one-pass flow and gives adequate results for comparative purposes. The static leach test models the condition of near-zero flow in a repository and provides information on element readsorption and solubility limits. The SPCF test is used to study the effects of flowing solutions at velocities that may be anticipated for geologic groundwaters within breached repositories. These two testing methods, coupled with the use of autoclaves, constitute the current thrust of WRIT leach testing

  4. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.

  5. Transformer ratio enhancement experiment

    International Nuclear Information System (INIS)

    Gai, W.; Power, J. G.; Kanareykin, A.; Neasheva, E.; Altmark, A.

    2004-01-01

    Recently, a multibunch scheme for efficient acceleration based on dielectric wakefield accelerator technology was outlined in J.G. Power, W. Gai, A. Kanareykin, X. Sun. PAC 2001 Proceedings, pp. 114-116, 2002. In this paper we present an experimental program for the design, development and demonstration of an Enhanced Transformer Ratio Dielectric Wakefield Accelerator (ETR-DWA). The principal goal is to increase the transformer ratio R, the parameter that characterizes the energy transfer efficiency from the accelerating structure to the accelerated electron beam. We present here an experimental design of a 13.625 GHz dielectric loaded accelerating structure, a laser multisplitter producing a ramped bunch train, and simulations of the bunch train parameters required. Experimental results of the accelerating structure bench testing and ramped pulsed train generation with the laser multisplitter are shown as well. Using beam dynamic simulations, we also obtain the focusing FODO lattice parameters

  6. The Relationship Between 14C Urea Breath Test Results and Neutrophil/Lymphocyte and Platelet/Lymphocyte Ratios

    Directory of Open Access Journals (Sweden)

    Ertan Şahin

    2018-04-01

    Full Text Available Aim: Neutrophil/lymphocyte ratio (NLR and platelet/lymphocyte ratio (PLR are used as inflammatory markers in several diseases. However, there are little data regarding the diagnostic ability of NLR and PLR in Helicobacter pylori. We aimed to assess the association between the 14C urea breath test (14C-UBT results and NLR and PLR in H. pylori diagnosis. Methods: Results of 89 patients were retrospectively analysed in this study. According to the 14C-UBT results, patients were divided into two groups: H. pylori (+ and H. pylori (- (control group. Haematological parameters, including hemoglobine, white blood cell (WBC count, neutrophil count, lymphocyte count, NLR, platelet count, and PLR were compared between the two groups. Results: The mean total WBC count, neutrophil count, NLR and PLR in H. pylori (+ patients were significantly higher than in the control group (p<0.001 for all these parameters. In the receiver operating characteristic curve analysis, the cut-off value for NLR and PLR for the presence of H. pylori was calculated as ≥2.39 [sensitivity: 67.3%, specificity: 79.4%, area under the curve (AUC: 0.747 (0.637-0.856, p<0.0001] and ≥133.3 [sensitivity: 61.8%, specificity: 55.9%, AUC: 0.572 (0.447-0.697, p<0.05], respectively. Conclusion: The present study shows that NLR and PLR are associated with H. pylori positivity based on 14C-UBT, and they can be used as an additional biomarker for supporting the 14C-UBT results.

  7. Recommendations for benefit-risk assessment methodologies and visual representations

    DEFF Research Database (Denmark)

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul

    2016-01-01

    PURPOSE: The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. METHODS: Eight case studies based on the benefit......-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. RESULTS: A general pathway through the case studies...

  8. Could changes in reported sex ratios at birth during China's 1958-1961 famine support the adaptive sex ratio adjustment hypothesis?

    Directory of Open Access Journals (Sweden)

    Anna Reimondos

    2013-10-01

    Full Text Available Background: The adaptive sex ratio adjustment hypothesis suggests that when mothers are in poor conditions the sex ratio of their offspring will be biased towards females. Major famines provide opportunities for testing this hypothesis because they lead to the widespread deterioration of living conditions in the affected population. Objective: This study examines changes in sex ratio at birth before, during, and after China's 1958-1961 famine, to see whether they provide any support for the adaptive sex ratio adjustment hypothesis. Methods: We use descriptive statistics to analyse data collected by both China's 1982 and 1988 fertility sample surveys and examine changes in sex ratio at birth in recent history. In addition, we examine the effectiveness of using different methods to model changes in sex ratio at birth and compare their differences. Results: During China's 1958-1961 famine, reported sex ratio at birth remained notably higher than that observed in most countries in the world. The timing of the decline in sex ratio at birth did not coincide with the timing of the famine. After the famine, although living conditions were considerably improved, the sex ratio at birth was not higher but lower than that recorded during the famine. Conclusions: The analysis of the data collected by the two fertility surveys has found no evidence that changes in sex ratio at birth during China's 1958-1961 famine and the post-famine period supported the adaptive sex ratio adjustment hypothesis.

  9. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 2. Performance Tests.

    Science.gov (United States)

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  10. Online Stable Isotope Analysis of Dissolved Organic Carbon Size Classes Using Size Exclusion Chromatography Coupled to an Isotope Ratio Mass Spectrometer

    Digital Repository Service at National Institute of Oceanography (India)

    Malik, A.; Scheibe, A.; LokaBharathi, P.A.; Gleixner, G.

    size classes by coupling high-performance liquid chromatography (HPLC) - size exclusion chromatography (SEC) to online isotope ratio mass spectrometry (IRMS). This represents a significant methodological contribution to DOC research. The interface...

  11. La Dinámica Cross-Section de los Ratios Financieros: ¿Tienden los Ratios a Converger hacia la Media Sectorial?

    Directory of Open Access Journals (Sweden)

    Manuel Illueca Muñoz

    2002-12-01

    Full Text Available El objetivo fundamental de este artículo consiste en contrastar si los ratios financieros describen un proceso de ajuste / convergencia hacia la media sectorial. Utilizando un enfoque no paramétrico, se ha modelizado explícitamente la dinámica de las distribuciones de seis ratios, calculados a partir de una muestra de empresas del sector español de pavimentos y revestimientos cerámicos. Los resultados no permiten afirmar que las empresas de la muestra converjan hacia la media del sector, al contrario, las distribuciones de probabilidad de los ratios analizados presentan a largo plazo una dispersión similar a la del periodo muestral.The main objective of this paper is to test whether financial ratios follow a convergent path towards the industry mean. A non parametric approach is used to model the cross-section dynamics of six financial ratios, computed on a sample of Spanish tile firms. Our findings do not support the hipothesis of convergence. The dispersion of the ratios does not shrink in the long term.

  12. Methodology for quantitative evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.

    1981-01-01

    Of various approaches that might be taken to the diagnostic performance evaluation problem, Receiver Operating Characteristic (ROC) analysis holds great promise. Further development of the methodology for a unified, objective, and meaningful approach to evaluating the usefulness of medical imaging procedures is done by consideration of statistical significance testing, optimal sequencing of correlated studies, and analysis of observer performance

  13. ETE-EVAL: a methodology for D and D cost estimation

    International Nuclear Information System (INIS)

    Decobert, G.; Robic, S.; Vanel, V.

    2008-01-01

    In compliance with Article 20 of the sustainable radioactive materials and waste management act dated 28 June 2006, the CEA and AREVA are required every three years to revise the cost of decommissioning their facilities and to provide the necessary assets by constituting a dedicated fund. For the 2007 revision the CEA used ETE-EVAL V5. Similarly, AREVA reevaluated the cost of decontaminating and dismantling its facilities at La Hague, as the previous estimate in 2004 did not take into account the complete cleanup of all the structural work. ETE-EVAL V5 is a computer application designed to estimate the cost of decontamination and dismantling of basic nuclear installations (INB). It has been qualified by Bureau Veritas and audited. ETE-EVAL V5 has become the official software for cost assessment of CEA civilian and AREVA decommissioning projects. It has been used by the DPAD (Decontamination and Dismantling Projects Department) cost assessment group to estimate the cost of decommissioning some thirty facilities (cost update on completion for the dedicated fund for dismantling civilian CEA facilities) and by AREVA to estimate the cost of decommissioning its fuel cycle back-end facilities. Some necessary modifications are now being implemented to allow for the specific aspects of fuel cycle front-end facilities. The computational method is based on physical, radiological and waste inventories following a particular methodology, and on interviews with operating personnel to compile ratios and financial data (operating cost, etc.) and enter them in a database called GREEN (from the French acronym for Management Ratios for Assessment of Nuclear Facilities). ETE-EVAL V5 comprises the cost assessment module and GREEN database. It has been enriched with the lessons learned from experience, and can be adapted as necessary to meet installation-specific requirements. The cost assessment module allows the user to estimate decommissioning costs once the inventory has been

  14. A methodology for identification and control of electro-mechanical actuators.

    Science.gov (United States)

    Tutunji, Tarek A; Saleem, Ashraf

    2015-01-01

    Mechatronic systems are fully-integrated engineering systems that are composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to cause the required motion. Therefore, the design of appropriate controllers for these actuators are an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants' response. Second, the identified model is used in a simulation environment for the purpose of designing a suitable controller. Finally, the designed controller is applied and tested on the real plant through Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: •Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators.•Combines off-line and on-line controller design for practical performance.•Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, induction motor and vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electromechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure.

  15. On the assessment of usability testing methods for children

    NARCIS (Netherlands)

    Markopoulos, P.; Bekker, M.M.

    2003-01-01

    The paper motivates the need to acquire methodological knowledge for involving children as test users in usability testing. It introduces a methodological framework for delineating comparative assessments of usability testing methods for children participants. This framework consists in three

  16. System-level design methodologies for telecommunication

    CERN Document Server

    Sklavos, Nicolas; Goehringer, Diana; Kitsos, Paris

    2013-01-01

    This book provides a comprehensive overview of modern networks design, from specifications and modeling to implementations and test procedures, including the design and implementation of modern networks on chip, in both wireless and mobile applications.  Topical coverage includes algorithms and methodologies, telecommunications, hardware (including networks on chip), security and privacy, wireless and mobile networks and a variety of modern applications, such as VoLTE and the internet of things.

  17. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  18. Home urine C-peptide creatinine ratio (UCPCR) testing can identify type 2 and MODY in pediatric diabetes.

    Science.gov (United States)

    Besser, Rachel E J; Shields, Beverley M; Hammersley, Suzanne E; Colclough, Kevin; McDonald, Timothy J; Gray, Zoe; Heywood, James J N; Barrett, Timothy G; Hattersley, Andrew T

    2013-05-01

    Making the correct diabetes diagnosis in children is crucial for lifelong management. Type 2 diabetes and maturity onset diabetes of the young (MODY) are seen in the pediatric setting, and can be difficult to discriminate from type 1 diabetes. Postprandial urinary C-peptide creatinine ratio (UCPCR) is a non-invasive measure of endogenous insulin secretion that has not been tested as a diagnostic tool in children or in patients with diabetes duration MODY and type 2 in pediatric diabetes. Two-hour postprandial UCPCR was measured in 264 patients aged MODY, n = 63). Receiver operating characteristic curves were used to identify the optimal UCPCR cutoff for discriminating diabetes subtypes. UCPCR was lower in type 1 diabetes [0.05 (MODY [3.51 (2.37-5.32) nmol/mmol, p MODY (p = 0.25), so patients were combined for subsequent analyses. After 2-yr duration, UCPCR ≥ 0.7 nmol/mmol has 100% sensitivity [95% confidence interval (CI): 92-100] and 97% specificity (95% CI: 91-99) for identifying non-type 1 (MODY + type 2 diabetes) from type 1 diabetes [area under the curve (AUC) 0.997]. UCPCR was poor at discriminating MODY from type 2 diabetes (AUC 0.57). UCPCR testing can be used in diabetes duration greater than 2 yr to identify pediatric patients with non-type 1 diabetes. UCPCR testing is a practical non-invasive method for use in the pediatric outpatient setting. © 2013 John Wiley & Sons A/S.

  19. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA- Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed by using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparison with classical instrumental neutron activation analysis (INAA) methods and international inter-comparison exercise have been performed to validate the new methodology. (authors)

  20. Methodology and applications for the benefit cost analysis of the seismic risk reduction in building portfolios at broadscale

    OpenAIRE

    Valcarcel, Jairo A.; Mora, Miguel G.; Cardona, Omar D.; Pujades, Lluis G.; Barbat, Alex H.; Bernal, Gabriel A.

    2013-01-01

    This article presents a methodology for an estimate of the benefit cost ratio of the seismic risk reduction in buildings portfolio at broadscale, for a world region, allowing comparing the results obtained for the countries belonging to that region. This methodology encompasses (1) the generation of a set of random seismic events and the evaluation of the spectral accelerations at the buildings location; (2) the estimation of the buildings built area, the economic value, as well as the cla...

  1. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    Science.gov (United States)

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. A Methodology for Off-line Evaluation of New Environmentally Friendly Tribo-systems for Sheet Metal Forming

    DEFF Research Database (Denmark)

    Ceron, Ermanno; Bay, Niels

    2013-01-01

    Increasing focus on environmental issues in industrial production has urged sheet stamping companies to look for new tribo-systems in order to substitute hazardous lubricants such as chlorinated paraffin oils. Production testing of new lubricants is, however, costly and makes industry reluctant...... towards testing alternative solutions. The present paper presents a methodology for off-line testing of new tribo-systems based on numerical modelling of production process as well as laboratory test to adjust the latter combined with testing of selected tribo-systems on a new automatic sheet......-tribo-tester emulating typical sheet forming production processes. Final testing of the tribo-systems in production verifies the methodology. © 2013 CIRP....

  3. Optimization of Extraction Process for Antidiabetic and Antioxidant Activities of Kursi Wufarikun Ziyabit Using Response Surface Methodology and Quantitative Analysis of Main Components.

    Science.gov (United States)

    Edirs, Salamet; Turak, Ablajan; Numonov, Sodik; Xin, Xuelei; Aisa, Haji Akber

    2017-01-01

    By using extraction yield, total polyphenolic content, antidiabetic activities (PTP-1B and α -glycosidase), and antioxidant activity (ABTS and DPPH) as indicated markers, the extraction conditions of the prescription Kursi Wufarikun Ziyabit (KWZ) were optimized by response surface methodology (RSM). Independent variables were ethanol concentration, extraction temperature, solid-to-solvent ratio, and extraction time. The result of RSM analysis showed that the four variables investigated have a significant effect ( p analysis of effective part of KWZ was characterized via UPLC method, 12 main components were identified by standard compounds, and all of them have shown good regression within the test ranges and the total content of them was 11.18%.

  4. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formation; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  5. Safety assessment methodologies for near surface disposal facilities. Results of a co-ordinated research project (ISAM). Volume 1: Review and enhancement of safety assessment approaches and tools. Volume 2: Test cases

    International Nuclear Information System (INIS)

    2004-07-01

    For several decades, countries have made use of near surface facilities for the disposal of low and intermediate level radioactive waste. In line with the internationally agreed principles of radioactive waste management, the safety of these facilities needs to be ensured during all stages of their lifetimes, including the post-closure period. By the mid 1990s, formal methodologies for evaluating the long term safety of such facilities had been developed, but intercomparison of these methodologies had revealed a number of discrepancies between them. Consequently, in 1997, the International Atomic Energy Agency launched a Co-ordinated Research Project (CRP) on Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities (ISAM). The particular objectives of the CRP were to provide a critical evaluation of the approaches and tools used in post-closure safety assessment for proposed and existing near-surface radioactive waste disposal facilities, enhance the approaches and tools used and build confidence in the approaches and tools used. The CRP ran until 2000 and resulted in the development of a harmonized assessment methodology (the ISAM project methodology), which was applied to a number of test cases. Over seventy participants from twenty-two Member States played an active role in the project and it attracted interest from around seven hundred persons involved with safety assessment in seventy-two Member States. The results of the CRP have contributed to the Action Plan on the Safety of Radioactive Waste Management which was approved by the Board of Governors and endorsed by the General Conference in September 2001. Specifically, they contribute to Action 5, which requests the IAEA Secretariat to 'develop a structured and systematic programme to ensure adequate application of the Agency's waste safety standards', by elaborating on the Safety Requirements on 'Near Surface Disposal of Radioactive Waste' (Safety Standards Series No. WS-R-1) and

  6. A rational methodology for the study of foundations for marine structures

    International Nuclear Information System (INIS)

    Mira Mc Willams, P.; Fernandez-Merodo, J. A.; Pastor Perez, M.; Monte Saez, J. L.; Martinez Santamaria, J. M.; Cuellar Mirasol, V.; Martin Baanante, M. E.; Rodriguez Sanchez-Arevalo, I; Lopez Maldonando, J. D.; Tomas Sampedro, A.

    2011-01-01

    A methodology for the study of marine foundations is presented. The response in displacements, stresses and pore water pressures in obtained from a finite element coupled formulation. Loads due to wave action of the foundation are obtained from a volume of fluid type fluid-structure interaction numerical model. Additionally, the methodology includes a Generalized Plasticity based constitutive model for granular materials capable of representing liquefaction fenomena of sands subjected to cyclic loading, such as those frequently appearing in the problems studied. Calibration of this model requires a series of laboratory tests detailed herein. This methodology is applied to the study of the response of a caisson breakwater foundation. (Author) 10 refs.

  7. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models 1he null hypothesis 1ha1 the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.

  8. SPACE code simulation of ATLAS DVI line break accident test (SB DVI 08 Test)

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Sang Gyu [KHNP, Daejeon (Korea, Republic of)

    2012-10-15

    APR1400 has adopted new safety design features which are 4 mechanically independent DVI (Direct Vessel Injection) systems and fluidic device in the safety injection tanks (SITs). Hence, DVI line break accident has to be evaluated as one of the small break LOCA (SBLOCA) to ensure the safety of APR1400. KAERI has been performed for DVI line break test (SB DVI 08) using ATLAS (Advanced Thermal Hydraulic Test Loop for Accident Simulation) facility which is an integral effect test facility for APR1400. The test result shows that the core collapsed water level decreased before a loop seal clearance, so that a core uncover occurred. At this time, the peak cladding temperature (PCT) is rapidly increased even though the emergency core cooling (ECC) water is injected from safety injection pump (SIP). This test result is useful for supporting safety analysis using thermal hydraulic safety analysis code and increases the understanding of SBLOCA phenomena in APR1400. The SBLOCA evaluation methodology for APR1400 is now being developed using SPACE code. The object of the development of this methodology is to set up a conservative evaluation methodology in accordance with appendix K of 10 CFR 50. ATLAS SB DVI 08 test is selected for the evaluation of SBLOCA methodology using SPACE code. Before applying the conservative models and correlations, benchmark calculation of the test is performed with the best estimate models and correlations to verify SPACE code capability. This paper deals with benchmark calculations results of ATLAS SB DVI 08 test. Calculation results of the major hydraulics variables are compared with measured data. Finally, this paper carries out the SPACE code performances for simulating the integral effect test of SBLOCA.

  9. Simultaneous monitoring of ice accretion and thermography of an airfoil: an IR imaging methodology

    International Nuclear Information System (INIS)

    Mohseni, M; Frioult, M; Amirfazli, A

    2012-01-01

    A novel image analysis methodology based on infrared (IR) imaging was developed for simultaneous monitoring of ice accretion and thermography of airfoils. In this study, an IR camera was calibrated and used to measure the surface temperature of the energized airfoils, and monitor the ice accretion and growth pattern on the airfoils’ surfaces. The methodology comprises the automatic processing of a series of IR video frames with the purpose of detecting ice pattern evolution during the icing test period. A specially developed MATLAB code was used to detect the iced areas in the IR images, and simultaneously monitor surface temperature evolution of the airfoil during an icing test. Knowing the correlation between the icing pattern and surface temperature changes during an icing test is essential for energy efficient design of thermal icing mitigation systems. Processed IR images were also used to determine the ice accumulation rate on the airfoil's surface in a given icing test. The proposed methodology has been demonstrated to work successfully, since the optical images taken at the end of icing tests from the airfoils’ surfaces compared well with the processed IR images detecting the ice grown outward from the airfoils’ leading edge area. (paper)

  10. RDANN a new methodology to solve the neutron spectra unfolding problem

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R. [UAZ, Av. Ramon Lopez Velarde No. 801, 98000 Zacatecas (Mexico)

    2006-07-01

    The optimization processes known as Taguchi method and DOE methodology are applied to the design, training and testing of Artificial Neural Networks in the neutron spectrometry field, which offer potential benefits in the evaluation of the behavior of the net as well as the ability to examine the interaction of the weights and neurons inside the same one. In this work, the Robust Design of Artificial Neural Networks methodology is used to solve the neutron spectra unfolding problem, designing, training and testing an ANN using a set of 187 neutron spectra compiled by the International Atomic Energy Agency, to obtain the better neutron spectra unfolded from the Bonner spheres spectrometer's count rates. (Author)

  11. RDANN a new methodology to solve the neutron spectra unfolding problem

    International Nuclear Information System (INIS)

    Ortiz R, J.M.; Martinez B, M.R.; Vega C, H.R.

    2006-01-01

    The optimization processes known as Taguchi method and DOE methodology are applied to the design, training and testing of Artificial Neural Networks in the neutron spectrometry field, which offer potential benefits in the evaluation of the behavior of the net as well as the ability to examine the interaction of the weights and neurons inside the same one. In this work, the Robust Design of Artificial Neural Networks methodology is used to solve the neutron spectra unfolding problem, designing, training and testing an ANN using a set of 187 neutron spectra compiled by the International Atomic Energy Agency, to obtain the better neutron spectra unfolded from the Bonner spheres spectrometer's count rates. (Author)

  12. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  13. BRAF mutation testing in solid tumors: a methodological comparison.

    Science.gov (United States)

    Weyant, Grace W; Wisotzkey, Jeffrey D; Benko, Floyd A; Donaldson, Keri J

    2014-09-01

    Solid tumor genotyping has become standard of care for the characterization of proto-oncogene mutational status, which has traditionally been accomplished with Sanger sequencing. However, companion diagnostic assays and comparable laboratory-developed tests are becoming increasingly popular, such as the cobas 4800 BRAF V600 Mutation Test and the INFINITI KRAS-BRAF assay, respectively. This study evaluates and validates the analytical performance of the INFINITI KRAS-BRAF assay and compares concordance of BRAF status with two reference assays, the cobas test and Sanger sequencing. DNA extraction from FFPE tissue specimens was performed followed by multiplex PCR amplification and fluorescent label incorporation using allele-specific primer extension. Hybridization to a microarray, signal detection, and analysis were then performed. The limits of detection were determined by testing dilutions of mutant BRAF alleles within wild-type background DNA, and accuracy was calculated based on these results. The INFINITI KRAS-BRAF assay produced 100% concordance with the cobas test and Sanger sequencing and had sensitivity equivalent to the cobas assay. The INFINITI assay is repeatable with at least 95% accuracy in the detection of mutant and wild-type BRAF alleles. These results confirm that the INFINITI KRAS-BRAF assay is comparable to traditional sequencing and the Food and Drug Administration-approved companion diagnostic assay for the detection of BRAF mutations. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  14. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were firstly analyzed with a power projective base method. Then we were applied a decision-making model, the sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with sequential classifier, the accuracies across subjects were significantly higher than that with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781

  15. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Summary Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  16. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  17. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  18. Methodology for assessing the impacts of distributed generation interconnection

    Directory of Open Access Journals (Sweden)

    Luis E. Luna

    2011-06-01

    Full Text Available This paper proposes a methodology for identifying and assessing the impact of distributed generation interconnection on distribution systems using Monte Carlo techniques. This methodology consists of two analysis schemes: a technical analysis, which evaluates the reliability conditions of the distribution system; on the other hand, an economic analysis that evaluates the financial impacts on the electric utility and its customers, according to the system reliability level. The proposed methodology was applied to an IEEE test distribution system, considering different operation schemes for the distributed generation interconnection. The application of each one of these schemes provided significant improvements regarding the reliability and important economic benefits for the electric utility. However, such schemes resulted in negative profitability levels for certain customers, therefore, regulatory measures and bilateral contracts were proposed which would provide a solution for this kind of problem.

  19. Fracture assessment of HSST Plate 14 shallow-flaw cruciform bend specimens tested under biaxial loading conditions

    Energy Technology Data Exchange (ETDEWEB)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-06-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for an RPV material. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies, namely, the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading-biaxiality on fracture toughness; the conventional maximum principal stress criterion indicated no effect. A three-parameter Weibull model based on the hydrostatic stress criterion is shown to correlate the experimentally observed biaxial effect on cleavage fracture toughness by providing a scaling mechanism between uniaxial and biaxial loading states.

  20. Methodology for quantitative evalution of diagnostic performance. Project III

    International Nuclear Information System (INIS)

    Metz, C.E.

    1985-01-01

    Receiver Operation Characteristic (ROC) methodology is now widely recognized as the most satisfactory approach to the problem of measuring and specifying the performance of a diagnostic procedure. The primary advantage of ROC analysis over alternative methodologies is that it seperates differences in diagnostic accuracy that are due to actual differences in discrimination capacity from those that are due to decision threshold effects. Our effort during the past year has been devoted to developing digital computer programs for fitting ROC curves to diagnostic data by maximum likelihood estimation and to developing meaningful and valid statistical tests for assessing the significance of apparent differences between measured ROC curves. FORTRAN programs previously written here for ROC curve fitting and statistical testing have been refined to make them easier to use and to allow them to be run in a large variety of computer systems. We have attempted also to develop two new curve-fitting programs: one for conventional ROC data that assumes a different functional form for the ROC curve, and one that can be used for ''free-response'' ROC data. Finally, we have cooperated with other investigators to apply our techniques to analyze ROC data generated in clinical studies, and we have sought to familiarize the medical community with the advantages of ROC methodology. 36 ref

  1. Preloading To Accelerate Slow-Crack-Growth Testing

    Science.gov (United States)

    Gyekenyesi, John P.; Choi, Sung R.; Pawlik, Ralph J.

    2004-01-01

    An accelerated-testing methodology has been developed for measuring the slow-crack-growth (SCG) behavior of brittle materials. Like the prior methodology, the accelerated-testing methodology involves dynamic fatigue (constant stress-rate) testing, in which a load or a displacement is applied to a specimen at a constant rate. SCG parameters or life-prediction parameters needed for designing components made of the same material as that of the specimen are calculated from the relationship between (1) the strength of the material as measured in the test and (2) the applied stress rate used in the test. Despite its simplicity and convenience, dynamic fatigue testing as practiced heretofore has one major drawback: it is extremely time-consuming, especially at low stress rates. The present accelerated methodology reduces the time needed to test a specimen at a given rate of applied load, stress, or displacement. Instead of starting the test from zero applied load or displacement as in the prior methodology, one preloads the specimen and then increases the applied load at the specified rate (see Figure 1). One might expect the preload to alter the results of the test, and indeed it does, but fortunately it is possible to account for the effect of the preload in interpreting the results. The accounting is done by calculating the normalized strength (defined as the strength in the presence of preload divided by the strength in the absence of preload) as a function of (1) the preloading factor (defined as the preload stress divided by the strength in the absence of preload) and (2) a SCG parameter, denoted n, that is used in a power-law crack-speed formulation. Figure 2 presents numerical results from this theoretical calculation.
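
    A closed form consistent with the power-law crack-speed formulation follows from noting that the damage integral of sigma(t)**n over the load ramp is what sets the failure strength, so skipping the ramp below the preload stress changes it in a predictable way. The sketch below assumes a rapidly applied preload and should be read as one plausible reconstruction, not necessarily the exact expression plotted in Figure 2.

    ```python
    def normalized_strength(alpha, n):
        # Assuming power-law crack growth v = A*K**n and a rapidly applied
        # preload, equating damage integrals with and without preload gives
        #   sigma_p / sigma_f = (1 + alpha**(n + 1)) ** (1 / (n + 1)),
        # i.e. even a large preload barely inflates the measured strength.
        return (1.0 + alpha ** (n + 1)) ** (1.0 / (n + 1))

    # e.g. an 80 % preload with n = 20 raises the strength by under 0.1 %,
    # while cutting 80 % of the test time at a given stress rate.
    print(normalized_strength(alpha=0.8, n=20))   # ~1.0004
    ```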

  2. Embedding filtering criteria into a wrapper marker selection method for brain tumor classification: an application on metabolic peak area ratios

    International Nuclear Information System (INIS)

    Kounelakis, M G; Zervakis, M E; Giakos, G C; Postma, G J; Buydens, L M C; Kotsiakis, X

    2011-01-01

    The purpose of this study is to identify reliable sets of metabolic markers that provide accurate classification of complex brain tumors and facilitate the process of clinical diagnosis. Several ratios of metabolites are tested alone or in combination with imaging markers. A wrapper feature selection and classification methodology is studied, employing Fisher's criterion for ranking the markers. The set of extracted markers that express statistical significance is further studied in terms of biological behavior with respect to the brain tumor type and grade. The outcome of this study indicates that the proposed method, by exploiting the intrinsic properties of the data, can reveal reliable and biologically relevant sets of metabolic markers, which form an important adjunct toward more accurate type and grade discrimination of complex brain tumors.
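
    A minimal sketch of the ranking step, assuming a two-class problem and simulated data; the study itself wraps such a ranking inside a classifier-driven (wrapper) subset search over metabolic peak-area ratios, which is omitted here.

    ```python
    import numpy as np

    def fisher_scores(X, y):
        # Fisher's criterion per feature (two-class case): squared
        # difference of class means over the sum of class variances.
        X0, X1 = X[y == 0], X[y == 1]
        return (X0.mean(0) - X1.mean(0)) ** 2 / (X0.var(0) + X1.var(0))

    # Rank hypothetical ratio features; a wrapper would then evaluate
    # nested subsets (top-1, top-2, ...) with the classifier of choice.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 8))
    y = rng.integers(0, 2, 60)
    X[y == 1, 0] += 1.5              # make feature 0 informative
    print(np.argsort(fisher_scores(X, y))[::-1])   # feature 0 ranks first
    ```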

  3. The perfectionism model of binge eating: testing unique contributions, mediating mechanisms, and cross-cultural similarities using a daily diary methodology.

    Science.gov (United States)

    Sherry, Simon B; Sabourin, Brigitte C; Hall, Peter A; Hewitt, Paul L; Flett, Gordon L; Gralnick, Tara M

    2014-12-01

    The perfectionism model of binge eating (PMOBE) is an integrative model explaining the link between perfectionism and binge eating. This model proposes socially prescribed perfectionism confers risk for binge eating by generating exposure to 4 putative binge triggers: interpersonal discrepancies, low interpersonal esteem, depressive affect, and dietary restraint. The present study addresses important gaps in knowledge by testing if these 4 binge triggers uniquely predict changes in binge eating on a daily basis and if daily variations in each binge trigger mediate the link between socially prescribed perfectionism and daily binge eating. Analyses also tested if proposed mediational models generalized across Asian and European Canadians. The PMOBE was tested in 566 undergraduate women using a 7-day daily diary methodology. Depressive affect predicted binge eating, whereas anxious affect did not. Each binge trigger uniquely contributed to binge eating on a daily basis. All binge triggers except for dietary restraint mediated the relationship between socially prescribed perfectionism and change in daily binge eating. Results suggested cross-cultural similarities, with the PMOBE applying to both Asian and European Canadian women. The present study advances understanding of the personality traits and the contextual conditions accompanying binge eating and provides an important step toward improving treatments for people suffering from eating binges and associated negative consequences.

  4. A note on Youden's J and its cost ratio

    Directory of Open Access Journals (Sweden)

    Smits Niels

    2010-09-01

    Full Text Available Abstract Background The Youden index, the sum of sensitivity and specificity minus one, is an index used for setting optimal thresholds on medical tests. Discussion When using this index, one implicitly uses decision theory with a ratio of misclassification costs which is equal to one minus the prevalence proportion of the disease. It is doubtful whether this cost ratio truly represents the decision maker's preferences. Moreover, in populations with a different prevalence, a selected threshold is optimal with reference to a different cost ratio. Summary The Youden index is not a truly optimal decision rule for setting thresholds because its cost ratio varies with prevalence. Researchers should look into their cost ratio and employ it in a decision theoretic framework to obtain genuinely optimal thresholds.
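
    The sketch below selects a cutoff by maximizing J on hypothetical scores; the comment notes the standard decision-theoretic alternative the authors advocate, in which the optimal ROC slope m combines prevalence with an explicit misclassification-cost ratio. Both the data and the slope form are assumptions, not taken from this abstract.

    ```python
    import numpy as np

    def youden_threshold(scores, labels):
        # Sweep observed scores as candidate cutoffs; J = sens + spec - 1.
        # A genuinely optimal rule would instead maximize
        #   sens - m * (1 - spec),  m = ((1 - p) / p) * (C_FP / C_FN),
        # making prevalence p and the cost ratio explicit choices.
        best_t, best_j = None, -1.0
        for t in np.unique(scores):
            pred = scores >= t
            sens = pred[labels == 1].mean()
            spec = (~pred)[labels == 0].mean()
            j = sens + spec - 1.0
            if j > best_j:
                best_t, best_j = t, j
        return best_t, best_j

    rng = np.random.default_rng(2)
    scores = np.r_[rng.normal(1, 1, 200), rng.normal(0, 1, 200)]
    labels = np.r_[np.ones(200, int), np.zeros(200, int)]
    print(youden_threshold(scores, labels))
    ```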

  5. Evaluating test-retest reliability in patient-reported outcome measures for older people: A systematic review.

    Science.gov (United States)

    Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju

    2018-03-01

    This study aimed to evaluate the components of test-retest reliability, including time interval, sample size, and statistical methods, used in patient-reported outcome measures in older people, and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. This systematic review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic reviews published by the National Evidence-based Healthcare Collaborating Agency in Korea. The methodological quality was assessed by the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 studies presented models for ICC calculations and 30 studies reported 95% confidence intervals of the ICCs. Additional analyses using 17 studies that reported a strong ICC (>0.90) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items. In particular, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the study reports.
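
    For reference, a self-contained sketch of the statistic most often reported in the reviewed studies: the two-way random-effects, absolute-agreement ICC(2,1) of Shrout and Fleiss, computed for an n-subjects by k-occasions score matrix. The toy data are hypothetical.

    ```python
    import numpy as np

    def icc_2_1(data):
        # Two-way random-effects, absolute-agreement ICC(2,1),
        # data: n subjects (rows) x k occasions (columns).
        n, k = data.shape
        grand = data.mean()
        row_means = data.mean(axis=1)
        col_means = data.mean(axis=0)
        msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects MS
        msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # occasions MS
        sse = np.sum((data - row_means[:, None]
                           - col_means[None, :] + grand) ** 2)
        mse = sse / ((n - 1) * (k - 1))                        # residual MS
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Five subjects measured on two occasions (test, retest).
    data = np.array([[9, 10], [6, 7], [8, 8], [7, 9], [10, 10]], float)
    print(icc_2_1(data))
    ```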

  6. Methodology for time-dependent reliability analysis of accident sequences and complex reactor systems

    International Nuclear Information System (INIS)

    Paula, H.M.

    1984-01-01

    The work presented here is of direct use in probabilistic risk assessment (PRA) and is of value to utilities as well as the Nuclear Regulatory Commission (NRC). Specifically, this report presents a methodology and a computer program to calculate the expected number of occurrences for each accident sequence in an event tree. The methodology evaluates the time-dependent (instantaneous) and the average behavior of the accident sequence. The methodology accounts for standby safety system and component failures that occur (a) before they are demanded, (b) upon demand, and (c) during the mission (system operation). With respect to failures that occur during the mission, this methodology is unique in the sense that it models components that can be repaired during the mission. The expected number of system failures during the mission provides an upper bound for the probability of a system failure to run - the mission unreliability. The basic event modeling includes components that are continuously monitored, periodically tested, and those that are not tested or are otherwise nonrepairable. The computer program ASA allows practical applications of the method developed. This work represents a required extension of the presently available methodology and allows a more realistic PRA of nuclear power plants
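
    A minimal sketch of one building block such a methodology rests on: the time-dependent unavailability of a periodically tested standby component. The formulas are the standard constant-failure-rate results, not the ASA program's own models.

    ```python
    import numpy as np

    def unavailability_periodic(t, lam, T):
        # Instantaneous unavailability of a periodically tested standby
        # component: failures (rate lam) accumulate undetected since the
        # last test, which occurs every T hours.
        return 1.0 - np.exp(-lam * (np.asarray(t) % T))

    def average_unavailability(lam, T):
        # Time-average over one test interval,
        #   1 - (1 - exp(-lam*T)) / (lam*T),
        # which reduces to the familiar lam*T/2 when lam*T << 1.
        return 1.0 - (1.0 - np.exp(-lam * T)) / (lam * T)

    print(average_unavailability(lam=1e-4, T=720.0))   # ~0.035, near lam*T/2
    ```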

  7. A methodology for selecting optimum organizations for space communities

    Science.gov (United States)

    Ragusa, J. M.

    1978-01-01

    This paper suggests that a methodology exists for selecting optimum organizations for future space communities of various sizes and purposes. Results of an exploratory study to identify an optimum hypothetical organizational structure for a large earth-orbiting multidisciplinary research and applications (R&A) Space Base manned by a mixed crew of technologists are presented. Since such a facility does not presently exist, in situ empirical testing was not possible. Study activity was, therefore, concerned with the identification of a desired organizational structural model rather than the empirical testing of it. The principal finding of this research was that a four-level project type 'total matrix' model will optimize the effectiveness of Space Base technologists. An overall conclusion which can be reached from the research is that application of this methodology, or portions of it, may provide planning insights for the formal organizations which will be needed during the Space Industrialization Age.

  8. The Effect of Current Ratio, Asset Size, and Earnings Variability on Market Beta

    Directory of Open Access Journals (Sweden)

    Ahim Abdurahim

    2016-02-01

    Full Text Available The research objective was to determine the effect of the accounting variables (current ratio, asset size, and earnings variability) on market beta. This study used 72 samples. Regression analysis was used to test the hypotheses. The method of Fowler and Rorke (1983) was first applied to adjust the market beta, and BLUE tests were used to check the classical assumptions for the independent variables: multicollinearity, heteroskedasticity with the Breusch-Pagan-Godfrey test, and autocorrelation with the BG (Breusch-Godfrey) test. The results indicate that hypotheses H1a, H1b, H1c, and H2a were not supported, meaning that the current ratio, asset size, and earnings variability have no influence on market beta, either individually or simultaneously.
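
    A sketch of the adjustment step, assuming the commonly cited one-lead/one-lag form of the Fowler and Rorke (1983) thin-trading correction; the weight formula, helper function, and simulated returns are reconstructions, not code or data from the study.

    ```python
    import numpy as np

    def fowler_rorke_beta(r_i, r_m):
        # One-lead/one-lag correction in the spirit of Fowler & Rorke
        # (1983): beta = b0 + w1*(b_lag + b_lead), with weight
        # w1 = (1 + rho1)/(1 + 2*rho1), rho1 the market's lag-1
        # autocorrelation (weight form assumed from the cited method).
        def slope(y, x):
            xc, yc = x - x.mean(), y - y.mean()
            return (xc @ yc) / (xc @ xc)
        b0 = slope(r_i[1:-1], r_m[1:-1])
        b_lag = slope(r_i[1:-1], r_m[:-2])    # market lagged one period
        b_lead = slope(r_i[1:-1], r_m[2:])    # market leading one period
        rho1 = np.corrcoef(r_m[:-1], r_m[1:])[0, 1]
        w1 = (1 + rho1) / (1 + 2 * rho1)
        return b0 + w1 * (b_lag + b_lead)

    rng = np.random.default_rng(3)
    r_m = rng.normal(0, 0.01, 500)
    r_i = 1.2 * r_m + rng.normal(0, 0.01, 500)
    print(fowler_rorke_beta(r_i, r_m))   # ~1.2 for a synchronously traded stock
    ```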

  9. Optimization of ultrasonic assisted extraction of antioxidants from black soybean (Glycine max var) sprouts using response surface methodology.

    Science.gov (United States)

    Lai, Jixiang; Xin, Can; Zhao, Ya; Feng, Bing; He, Congfen; Dong, Yinmao; Fang, Yun; Wei, Shaomin

    2013-01-16

    Response surface methodology (RSM) using a central composite design (CCD) was employed to optimize the conditions for extraction of antioxidants from black soybean (Glycine max var) sprouts. Three influencing factors (liquid-solid ratio, duration of ultrasonic-assisted extraction, and extraction temperature) were investigated in the ultrasonic aqueous extraction. RSM was then applied to optimize the extraction process with respect to the DPPH radical-scavenging capacity of the antioxidants as a function of these factors. The best combination of the significant factors was determined by the RSM design, and the optimum pretreatment conditions for maximum radical-scavenging capacity were established to be a liquid-solid ratio of 29.19:1, an extraction time of 32.13 min, and an extraction temperature of 30 °C. Under these conditions, a DPPH radical-scavenging capacity of 67.60% was observed experimentally, close to the theoretical prediction of 66.36%.
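
    A minimal sketch of the model-fitting core of such an optimization: a second-order response surface in three coded factors fitted by least squares, with the optimum recovered as the stationary point of the fitted surface. Design generation (CCD) and significance testing are omitted, and all names and data shapes are illustrative.

    ```python
    import numpy as np

    def fit_quadratic_surface(X, y):
        # Second-order RSM model: intercept, linear, squared and
        # interaction terms in three coded factors, solved by least squares.
        x1, x2, x3 = X.T
        A = np.column_stack([np.ones(len(y)), x1, x2, x3,
                             x1**2, x2**2, x3**2,
                             x1*x2, x1*x3, x2*x3])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    def stationary_point(coef):
        # Solve grad f = b + B x = 0, where b holds the linear
        # coefficients and B collects the quadratic/interaction terms.
        b = coef[1:4]
        B = np.array([[2*coef[4],   coef[7],   coef[8]],
                      [  coef[7], 2*coef[5],   coef[9]],
                      [  coef[8],   coef[9], 2*coef[6]]])
        return np.linalg.solve(B, -b)
    ```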

  10. The labor/land ratio and India's caste system

    OpenAIRE

    Duleep, Harriet

    2012-01-01

    This paper proposes that India’s caste system and involuntary labor were joint responses by a nonworking landowning class to a low labor/land ratio, in which the rules of the caste system supported the institution of involuntary labor. The hypothesis is tested in two ways: longitudinally, with data from ancient religious texts, and cross-sectionally, with twentieth-century statistics on regional population/land ratios linked to anthropological measures of caste-system rigidity. Both the longitudinal and cross-sectional analyses support the hypothesis.

  11. A study on the effect of free cash flow and profitability current ratio on dividend payout ratio: Evidence from Tehran Stock Exchange

    Directory of Open Access Journals (Sweden)

    Hosein Parsian

    2014-01-01

    Full Text Available Decision making about dividend payout is one of the most important decisions that companies face. Identifying the factors that influence dividends can help managers set an appropriate dividend policy. On the other hand, stable dividend payouts over time may influence stock price, future earnings growth, and ultimately investors' evaluation of owners' equity. Hence, investigating the factors influencing the dividend payout ratio is of high importance. In this research, we investigate the effects of various factors on the dividend payout ratio of Tehran Stock Exchange (TSE) listed companies. We use time-series regression (panel data) to test the hypotheses of this study. The study provides empirical evidence from a sample of 102 companies over the time span 2005-2010. The results show that the independent variables free cash flow and profitability current ratio have a negative and significant impact on the dividend payout ratio, whereas the leverage ratio has a positive and significant impact. Other independent variables, such as company size, growth opportunities, and systematic risk, have no significant influence on the dividend payout ratio.
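
    A toy sketch of the kind of regression reported, using pooled OLS on simulated data rather than the study's actual panel estimator; the variable names, coefficients, and sign pattern are hypothetical stand-ins for the study's findings.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 102 * 6                        # 102 firms x 6 years, pooled
    X = rng.normal(size=(n, 3))        # columns: fcf, current_ratio, leverage
    payout = (0.4 - 0.2*X[:, 0] - 0.1*X[:, 1] + 0.3*X[:, 2]
              + rng.normal(0, 0.1, n))
    model = sm.OLS(payout, sm.add_constant(X)).fit()
    print(model.params)                # expect signs (-, -, +) as reported
    ```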

  12. [Types of medical registries - definitions, methodological aspects and quality of the scientific work with registries].

    Science.gov (United States)

    Mathis-Edenhofer, Stefan; Piso, Brigitte

    2011-12-01

    This work presents a comprehensive list of registry definitions, including broader and narrower ones. When the definitions are compared, different methodological issues can be identified. Some of these issues are common to all registry types; some can be assigned more easily to a specific registry type. Instruments for evaluating the quality of registries reflect many of the mentioned aspects. Generally, and especially for registries with a descriptive or exploratory research dimension, it is important to consider their intended purpose and the extent to which it was achieved. This includes, for instance, whether the purpose and the methodology are coordinated. From the start of registration, an initiator should be aware, based on the purpose, of the methodological dimension of the registry. This helps in choosing the correct type of registry and the appropriate guidance and, ultimately, in justifying the effort (cost-benefit ratio).

  13. Sex ratio and Wolbachia infection in the ant Formica exsecta.

    Science.gov (United States)

    Keller, L; Liautard, C; Reuter, M; Brown, W D; Sundström, L; Chapuisat, M

    2001-08-01

    Sex allocation data in social Hymenoptera provide some of the best tests of kin selection, parent-offspring conflict and sex ratio theories. However, these studies critically depend on controlling for confounding ecological factors and on identifying all parties that potentially manipulate colony sex ratio. It has been suggested that maternally inherited parasites may influence sex allocation in social Hymenoptera. If the parasites can influence sex allocation, infected colonies are predicted to invest more resources in females than non-infected colonies, because the parasites are transmitted through females but not males. Prime candidates for such sex ratio manipulation are Wolbachia, because these cytoplasmically transmitted bacteria have been shown to affect the sex ratio of host arthropods by cytoplasmic incompatibility, parthenogenesis, male-killing and feminization. In this study, we tested whether Wolbachia infection is associated with colony sex ratio in two populations of the ant Formica exsecta that have been the subject of extensive sex ratio studies. In these populations colonies specialize in the production of one sex or the other. We found that almost all F. exsecta colonies in both populations are infected with Wolbachia. However, in neither population did we find a significant association in the predicted direction between the prevalence of Wolbachia and colony sex ratio. In particular, colonies with a higher proportion of infected workers did not produce more females. Hence, we conclude that Wolbachia does not seem to alter the sex ratio of its hosts as a means to increase transmission rate in these two populations of ants.

  14. Ownership of dwelling affects the sex ratio at birth in Uganda.

    Directory of Open Access Journals (Sweden)

    Bernard Wallner

    Full Text Available BACKGROUND: Socio-economic conditions can affect the secondary sex ratio in humans. Mothers under good environmental conditions are predicted to increase the birth rates of sons according to the Trivers-Willard hypothesis (TWH). This study analyzed the effects of ownership and non-ownership of dwellings on the sex ratio at birth (SRB) on a Ugandan sample. METHODOLOGY/PRINCIPAL FINDINGS: Our investigation included 438,640 mothers aged between 12 and 54 years. The overall average SRB was 0.5008. Mothers who live in owned dwellings gave increased births to sons (0.5019) compared to those who live in non-owned dwellings (0.458). Multivariate statistics revealed the strongest effects of dwelling ownership when controlling for demographic and social variables such as marital status, type of marriage, mothers' age, mothers' education, parity and others. CONCLUSIONS/SIGNIFICANCE: The results are discussed in the framework of recent plausible models dealing with the adjustment of the sex ratio. We conclude that the aspect of dwelling status could represent an important socio-economic parameter in relation to SRB variations in humans if further studies are able to analyze it between different countries in a comparative way.

  15. Ownership of Dwelling Affects the Sex Ratio at Birth in Uganda

    Science.gov (United States)

    Wallner, Bernard; Fieder, Martin; Seidler, Horst

    2012-01-01

    Background Socio-economic conditions can affect the secondary sex ratio in humans. Mothers under good environmental conditions are predicted to increase the birth rates of sons according to the Trivers-Willard hypothesis (TWH). This study analyzed the effects of ownership and non-ownership of dwellings on the sex ratio at birth (SRB) on a Ugandan sample. Methodology/Principal Findings Our investigation included 438,640 mothers aged between 12 and 54 years. The overall average SRB was 0.5008. Mothers who live in owned dwellings gave increased births to sons (0.5019) compared to those who live in non-owned dwellings (0.458). Multivariate statistics revealed the strongest effects of dwelling ownership when controlling for demographic and social variables such as marital status, type of marriage, mothers’ age, mothers’ education, parity and others. Conclusions/Significance The results are discussed in the framework of recent plausible models dealing with the adjustment of the sex ratio. We conclude that the aspect of dwelling status could represent an important socio-economic parameter in relation to SRB variations in humans if further studies are able to analyze it between different countries in a comparative way. PMID:23284697

  16. Experimental testing of an ABB Master application

    International Nuclear Information System (INIS)

    Haapanen, P.; Maskuniitty, M.; Korhonen, J.; Tuulari, E.

    1995-10-01

    A prototype dynamic testing harness for programmable automation systems has been specified and implemented at the Technical Research Centre of Finland (VTT). In order to gain experience with the methodology and equipment for the testing of systems important to the safety of nuclear power plants, where the safety and reliability requirements are often very high, two different pilot systems have been tested. One system was an ABB Master application, which was loaned for testing from ABB Atom by Teollisuuden Voima Oy (TVO). Another system, loaned from Siemens AG (SAG) by IVO International Oy (IVO), was an application realized with SAG's digital SILT technology. The report describes the experiences gained in testing an APRM pilot system realized with ABB Master technology. The testing of the pilot application took place in the VTT Automation laboratory in Otaniemi in September-October 1994. The purpose of the testing was not to assess the quality of the pilot system, but to gain experience with the testing methodology and to identify further development needs and potentials of the test methodology and equipment. (7 refs., 14 figs., 9 tabs.)

  17. Robust Confidence Interval for a Ratio of Standard Deviations

    Science.gov (United States)

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…
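
    Bonett's closed-form interval is not reproduced here; as a generic, distribution-robust baseline for the same problem, the sketch below computes a percentile-bootstrap confidence interval for a ratio of standard deviations on hypothetical data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ratio_sd_ci(x, y, n_boot=2000, alpha=0.05):
        # Percentile-bootstrap CI for sd(x)/sd(y): resample each group
        # with replacement and take empirical quantiles of the ratio.
        # A generic sketch, not Bonett's closed-form interval.
        ratios = np.empty(n_boot)
        for b in range(n_boot):
            xb = rng.choice(x, size=len(x), replace=True)
            yb = rng.choice(y, size=len(y), replace=True)
            ratios[b] = xb.std(ddof=1) / yb.std(ddof=1)
        return tuple(np.quantile(ratios, [alpha / 2, 1 - alpha / 2]))

    # Scores from two alternate test forms (simulated, nonnormal).
    form_a = rng.exponential(10.0, 80)
    form_b = rng.exponential(8.0, 80)
    print(ratio_sd_ci(form_a, form_b))
    ```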

  18. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  19. Design studies of low-aspect ratio quasi-omnigenous stellarators

    International Nuclear Information System (INIS)

    Spong, D.A.; Hirshman, S.; Whitson, J.C.

    2001-01-01

    Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are 3 and 4 field period low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low-aspect-ratio, and low bootstrap current. (author)

  20. Analysing Diagnostic Assessment on the Ratio of Sine in a Right Triangle

    Science.gov (United States)

    Andika, R.; Juandi, D.; Rosjanuardi, R.

    2017-09-01

    This study aims to develop a diagnostic assessment on the specific topic of the sine ratio in a right triangle and to analyse whether students are ready to continue to the next trigonometry lesson, specifically the sine rule. The methodology used in this study is design research following the Plomp model, which comprises three phases: (a) preliminary research; (b) prototyping phase; and (c) assessment phase. The findings show that almost half of the students made a mistake in determining the sine ratio in a right triangle, and consequently their procedure for solving the problem went wrong. In strategic competence and adaptive reasoning, most of the students did not solve the problem that was given. According to these results, the students need a remedial program before moving on to the next lesson, the sine rule.
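
    For readers outside mathematics education, a generic worked example of the two pieces of content involved (an illustration, not an item from the instrument):

    ```latex
    % The ratio the assessment diagnoses, and the sine rule the next
    % lesson builds on.
    \[
      \sin\theta = \frac{\text{opposite}}{\text{hypotenuse}},
      \qquad a = 3,\ c = 5 \;\Rightarrow\; \sin\theta = \frac{3}{5} = 0.6 .
    \]
    \[
      \text{Sine rule: } \frac{a}{\sin A} = \frac{b}{\sin B} = \frac{c}{\sin C}.
    \]
    ```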