WorldWideScience

Sample records for ratio test based

  1. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for mining large drug safety databases such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System (AERS), where the data form matrices with drugs as columns and adverse events as rows. Often, a large number of cells in these matrices have zero counts. Some of these are "true zeros," indicating that the drug-adverse event pair cannot occur; they are distinguished from the remaining, modeled zero counts, which simply indicate that the pair has not yet occurred or has not yet been reported. In this paper, a zero-inflated Poisson (ZIP) model based likelihood ratio test (LRT) is proposed to identify drug-adverse event pairs with disproportionately high reporting rates, also called signals. The maximum likelihood estimates of the ZIP model parameters are obtained using the expectation-maximization (EM) algorithm. The ZIP-based LRT is also modified to handle stratified analyses for binary and categorical covariates (e.g., gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulations show that the ZIP-based LRT performs similarly to the Poisson-based LRT when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs from the 2006 to 2011 AERS database, with varying percentages of observed zero-count cells.
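    The EM update for a zero-inflated Poisson model can be sketched as follows (a minimal illustration on simulated counts; this is not the authors' signal-detection code, and the data and starting values are invented):

```python
import numpy as np

def fit_zip_em(y, n_iter=200):
    """Fit a zero-inflated Poisson model, P(Y=0) = pi + (1-pi)*exp(-lam) and
    P(Y=k) = (1-pi)*Pois(k; lam) for k>0, via the EM algorithm."""
    y = np.asarray(y, dtype=float)
    pi, lam = 0.5, max(y.mean(), 0.1)   # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability that an observed zero is a "true zero"
        z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
        # M-step: update the mixing weight and the Poisson mean
        pi = z.mean()
        lam = ((1 - z) * y).sum() / (1 - z).sum()
    return pi, lam

# Simulated example: 40% structural zeros, Poisson mean 3
rng = np.random.default_rng(1)
true_zero = rng.random(20000) < 0.4
y = np.where(true_zero, 0, rng.poisson(3.0, 20000))
pi_hat, lam_hat = fit_zip_em(y)
```

    A likelihood ratio test would then compare the maximized ZIP log-likelihoods under the null and alternative reporting rates; that part is specific to the paper and is omitted here.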

  2. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for statistical inference on the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. A major concern with this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low, which can make an NI trial impractical, particularly with a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. In practice, however, the equal-variance assumption is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, with this method the fundamental assumption of Rothmann's test, that the observed control effect is always positive (that is, the observed hazard ratio of placebo over control is greater than 1), is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced by using the proposed ratio test for a fraction retention NI hypothesis.

  3. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  4. Evidence Based Medicine; Positive and Negative Likelihood Ratios of Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Alireza Baratloo

    2015-10-01

    In the previous two parts of this educational series in Emergency, we explained some screening characteristics of diagnostic tests, including accuracy, sensitivity, specificity, and positive and negative predictive values. In this third part we aim to explain the positive and negative likelihood ratio (LR) as one of the most reliable performance measures of a diagnostic test. To understand this characteristic of a test, it is first necessary to fully understand the concepts of sensitivity and specificity, so we strongly advise you to review the first part of this series again. In short, likelihood ratios concern the proportions of people with and without a disease who have the same test result. The prevalence of a disease can directly influence the screening characteristics of a diagnostic test, especially its sensitivity and specificity; the LR was developed to eliminate this effect. The pre-test odds of disease multiplied by the positive or negative LR yield the post-test odds, from which the post-test probability can be estimated. Therefore, the LR is the most important characteristic of a test for ruling a diagnosis in or out. A positive likelihood ratio > 1 means a higher probability that the disease is present in a patient with a positive test. The further from 1, either higher or lower, the stronger the evidence to rule the disease in or out, respectively. Tests with an LR close to 1 are clearly less practical, while an LR further from 1 has more value for application in medicine. Usually tests with LR < 0.1 or LR > 10 are considered suitable for use in routine practice.
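    The LR arithmetic described above can be illustrated directly (the sensitivity, specificity, and pre-test probability below are hypothetical):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Convert probability to odds, multiply by the LR, convert back."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Hypothetical test: sensitivity 90%, specificity 80%, pre-test probability 20%
lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)   # 4.5 and 0.125
p_pos = post_test_probability(0.20, lr_pos)      # ~0.53 after a positive result
p_neg = post_test_probability(0.20, lr_neg)      # ~0.03 after a negative result
```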

  5. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
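    The Wald SPRT machinery underlying such tests can be sketched generically (a minimal illustration for two simple Gaussian hypotheses, not the paper's collision-probability formulation; the hypotheses, error rates, and data are invented):

```python
import math, random

def wald_sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Classic Wald SPRT for H0: mean = mu0 vs H1: mean = mu1, known sigma.
    alpha/beta are the targeted false alarm and missed detection rates.
    Returns (decision or None, number of samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment for one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return 'H1', n
        if llr <= lower:
            return 'H0', n
    return None, len(samples)   # data exhausted before a decision

random.seed(7)
data = [random.gauss(1.0, 1.0) for _ in range(500)]  # truth: mean 1, i.e. H1
decision, n_used = wald_sprt(data, mu0=0.0, mu1=1.0, sigma=1.0)
```

    The test typically stops after only a handful of observations, which is the property the paper exploits for operational decision timelines.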

  6. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can contrast the different missingness mechanisms with each other, e.g. by likelihood ratio tests. The finite-sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missingness mechanisms.

  7. Alvar engine. An engine with variable compression ratio. Experiments and tests

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    1998-09-01

    This report focuses on tests with Variable Compression Ratio (VCR) engines based on the Alvar engine principle. Variable compression ratio means an engine design in which it is possible to change the nominal compression ratio. The purpose is to increase fuel efficiency at part load by raising the compression ratio. At maximum load, and especially with supercharging (for example, with a turbocharger), a high compression ratio cannot be maintained because of the knock phenomenon. Knock is a shock wave caused by self-ignition of the fuel-air mixture; if knock occurs, the engine is exposed to a destructive load. For these reasons it would be an advantage to be able to change the compression ratio continuously as the load changes. The Alvar engine provides a solution for variable compression ratio based on well-known engine components. This paper provides information about efficiency and emission characteristics from tests with two Alvar engines. Results from tests with a phase-shift mechanism (for automatic compression ratio control) for the Alvar engine are also reviewed. Examination paper. 5 refs, 23 figs, 2 tabs, 5 appendices

  8. A likelihood ratio test for species membership based on DNA sequence data

    DEFF Research Database (Denmark)

    Matz, Mikhail V.; Nielsen, Rasmus

    2005-01-01

    DNA barcoding as an approach for species identification is rapidly increasing in popularity. However, it remains unclear which statistical procedures should accompany the technique to provide a measure of uncertainty. Here we describe a likelihood ratio test which can be used to test if a sampled sequence is a member of an a priori specified species. We investigate the performance of the test using coalescence simulations, as well as using real data from butterflies and frogs representing two kinds of challenge for DNA barcoding: extremely low and extremely high levels of sequence variability.

  9. The efficiency of the crude oil markets: Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie, E-mail: acharles@audencia.co [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier, E-mail: olivier.darne@univ-nantes.f [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with the non-parametric variance ratio tests developed by Wright [Wright, J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and Belaire-Franch and Contreras [Belaire-Franch, J., Contreras, D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia], as well as the wild-bootstrap variance ratio tests suggested by Kim [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient, while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable.
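    A minimal sketch of the basic (Lo-MacKinlay style) variance ratio statistic underlying these tests, on simulated data (not the rank/sign or wild-bootstrap variants used in the study; data and parameters are invented):

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance of q-period returns divided by q times the variance of
    1-period returns. Under a random walk, VR(q) should be close to 1."""
    r = np.asarray(returns, dtype=float)
    mu = r.mean()
    var1 = np.mean((r - mu) ** 2)
    rq = np.convolve(r, np.ones(q), mode='valid')  # overlapping q-period sums
    varq = np.mean((rq - q * mu) ** 2)
    return varq / (q * var1)

rng = np.random.default_rng(42)
rw_returns = rng.normal(0.0, 1.0, 5000)   # i.i.d. increments: a random walk
vr2 = variance_ratio(rw_returns, 2)       # close to 1

# Positively autocorrelated (predictable) returns inflate the ratio
ar = np.empty(5000)
ar[0] = rng.normal()
for t in range(1, 5000):
    ar[t] = 0.5 * ar[t - 1] + rng.normal()
vr2_ar = variance_ratio(ar, 2)            # well above 1 (about 1 + lag-1 autocorrelation)
```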

  12. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family, the 'ideal' procedure is to calculate the exact similar test, or an approximation to it, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. In contrast to this there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  13. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test assuming normality is very sensitive to any deviation from normality, especially when the observations come from a distribution with fat tails. Such a likelihood ratio test can also be used as a robust test for constant variance in residuals or in a time series if the data are partitioned into groups.

  14. An Intersection–Union Test for the Sharpe Ratio

    Directory of Open Access Journals (Sweden)

    Gabriel Frahm

    2018-04-01

    An intersection-union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection-union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G-7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance.

  15. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    Science.gov (United States)

    Catalano, Ralph A

    2003-09-01

    The literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male fetuses more often than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than would be expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis: the sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  16. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

  17. Jet-Surface Interaction - High Aspect Ratio Nozzle Test: Test Summary

    Science.gov (United States)

    Brown, Clifford A.

    2016-01-01

    The Jet-Surface Interaction High Aspect Ratio Nozzle Test was conducted in the Aero-Acoustic Propulsion Laboratory at the NASA Glenn Research Center in the fall of 2015. There were four primary goals specified for this test: (1) extend the current noise database for rectangular nozzles to higher aspect ratios, (2) verify data previously acquired at small scale with data from a larger model, (3) acquire jet-surface interaction noise data suitable for creating and verifying empirical noise models, and (4) investigate the effect of nozzle septa on the jet-mixing and jet-surface interaction noise. These slides give a summary of the test, with representative results for each goal.

  18. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables is simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
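    The traditional statistic in question, G2 = 2 * sum of O_i * ln(O_i / E_i), can be computed directly (a minimal illustration with invented counts; the paper's simplification itself is not reproduced here):

```python
import math

def g2_statistic(observed, expected):
    """Likelihood ratio (G^2) goodness-of-fit statistic for multinomial counts:
    G^2 = 2 * sum O_i * ln(O_i / E_i), asymptotically chi-square distributed.
    Zero-count cells contribute 0 by the usual 0*log(0) = 0 convention."""
    return 2.0 * sum(o * math.log(o / e) for o, e in zip(observed, expected) if o > 0)

# 100 trials, null hypothesis: both categories equally likely
observed = [30, 70]
expected = [50, 50]
g2 = g2_statistic(observed, expected)  # ~16.46, far beyond the chi2(1) 5% critical value 3.84
```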

  19. An evaluation of damping ratios for HVAC duct systems using vibration test data

    International Nuclear Information System (INIS)

    Gunyasu, K.; Horimizu, Y.; Kawakami, A.; Iokibe, H.; Yamazaki, T.

    1988-01-01

    The function of Heating, Ventilating and Air Conditioning (HVAC) systems, including HVAC duct systems, must be maintained to keep safety-related equipment in nuclear power plants operating during earthquake excitations. It is therefore important to carry out seismic design for HVAC duct systems. In previous aseismic design for HVAC duct systems, a 0.5% damping ratio has been used in Japan. In recent years, vibration tests on actual duct systems in nuclear power plants and on mockup duct systems were performed in order to investigate damping ratios for HVAC duct systems. Based on the results, it was confirmed that the damping ratios for HVAC duct systems evaluated from these tests were much greater than the 0.5% damping ratio used in previous aseismic design in Japan. A new damping ratio of 2.5% was proposed for aseismic design. The present paper describes the results of the above-mentioned investigation.

  20. Systems Biology and Ratio-Based, Real-Time Disease Surveillance.

    Science.gov (United States)

    Fair, J M; Rivas, A L

    2015-08-01

    Most infectious disease surveillance methods are not well suited for early detection. To address this limitation, we evaluated a ratio-based, Systems Biology-based method that does not require prior knowledge of the identity of the infective agent. Using a reference group of birds experimentally infected with West Nile virus (WNV) and a problem group of unknown health status (except that they were WNV-negative and displayed inflammation), both groups were followed over 22 days and tested with a system that analyses blood leucocyte ratios. To test the ability of the method to discriminate small data sets, both the reference group (n = 5) and the problem group (n = 4) were small. The questions of interest were as follows: (i) whether individuals presenting inflammation (disease-positive or D+) can be distinguished from non-inflamed (disease-negative or D-) birds, (ii) whether two or more D+ stages can be detected and (iii) whether sample size influences detection. Within the problem group, the ratio-based method distinguished: (i) three (one D- and two D+) data classes; (ii) two (early and late) inflammatory stages; (iii) fast versus regular or slow responders; and (iv) individuals that recovered from those that remained inflamed. Because ratios differed by larger magnitudes (up to 48 times larger) than percentages, it is suggested that data patterns are more likely to be recognized when disease surveillance methods are designed to measure inflammation and utilize ratios. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  1. 21 CFR 862.1455 - Lecithin/sphingomyelin ratio in amniotic fluid test system.

    Science.gov (United States)

    2010-04-01

    Clinical Chemistry Test Systems, § 862.1455 Lecithin/sphingomyelin ratio in amniotic fluid test system. (a) Identification. A lecithin/sphingomyelin ratio in amniotic fluid test system is a device intended to measure the...

  2. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  3. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary-crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. for integrating these individual decisions. The procedure is recommended when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to reach a final conclusion with a high confidence level. (Author).

  4. A Space Object Detection Algorithm using Fourier Domain Likelihood Ratio Test

    Science.gov (United States)

    Becker, D.; Cain, S.

    Space object detection is of great importance in the highly dependent yet competitive and congested space domain. The detection algorithms employed play a crucial role in fulfilling the detection component of the situational awareness mission to detect, track, characterize and catalog unknown space objects. Many current space detection algorithms use a matched filter or a spatial correlator to make a detection decision at a single pixel of a spatial image, based on the assumption that the data follow a Gaussian distribution. This paper explores the potential detection performance advantages of operating in the Fourier domain on long-exposure images of small and/or dim space objects from ground-based telescopes. A binary hypothesis test is developed based on the joint probability distribution function of the image under the hypothesis that an object is present and under the hypothesis that the image contains only background noise. The detection algorithm tests each pixel of the Fourier-transformed images to determine whether an object is present, based on the threshold criterion found in the likelihood ratio test. Using simulated data, the performance of the Fourier domain detection algorithm is compared to the current algorithm used in space situational awareness applications to evaluate its value.

  5. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. We then applied a decision-making model, sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with non-sequential ones. The average maximum accuracy of the SPRT method was 84.1%, compared with 82.3% for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful for speeding up decision-making while trading off errors in BCI. PMID:29348781

  6. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    Science.gov (United States)

    Jain, Ram B

    2016-08-01

    The creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed analyte concentration to the observed urinary creatinine concentration (UCR). This ratio-based method is flawed, since it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, it has been shown in the literature that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal method to correct for UCR should correct for hydration as well as for other factors, like age, gender, and race/ethnicity, that affect UCR. Model-based creatinine correction, in which observed UCRs are used as an independent variable in regression models, has been proposed. This study was conducted to evaluate the performance of the ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. It was observed that the ratio-based method leads to statistically significant pairwise differences, for example between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method, although, depending upon the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group in the numerator of the ratio (for example, males), these ratios were higher for the model-based method; when estimated UCRs were lower for the group in the numerator (for example, NHW), they were higher for the ratio-based method. The model-based method is the method of choice if all factors that affect UCR are to be accounted for.
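    The contrast between the two corrections can be sketched on synthetic data (the covariates, coefficients, and sample are invented for illustration; this is not the study's actual model, which involved additional demographic factors):

```python
import numpy as np

# Hypothetical data: log analyte concentration depends on log UCR and on age.
rng = np.random.default_rng(0)
n = 1000
log_ucr = rng.normal(0.0, 0.5, n)
age = rng.uniform(20, 70, n)
log_analyte = 1.0 + 0.8 * log_ucr + 0.01 * age + rng.normal(0.0, 0.1, n)

# Ratio-based correction: analyte / creatinine, which on the log scale
# forces a slope of exactly 1 on log UCR regardless of the data.
ratio_corrected = log_analyte - log_ucr

# Model-based correction: estimate the UCR slope (and other effects)
# from the data by ordinary least squares instead of assuming it.
X = np.column_stack([np.ones(n), log_ucr, age])
beta, *_ = np.linalg.lstsq(X, log_analyte, rcond=None)
model_corrected = log_analyte - beta[1] * log_ucr
```

    Because the fitted UCR slope here is below 1, the ratio-based correction over-adjusts, which is the kind of distortion the abstract describes.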

  7. Computing power and sample size for case-control association studies with copy number polymorphism: application of mixture-based likelihood ratio test.

    Directory of Open Access Journals (Sweden)

    Wonkuk Kim

    Recent studies suggest that copy number polymorphisms (CNPs) may play an important role in disease susceptibility and onset. Currently, the detection of CNPs mainly depends on microarray technology. For case-control studies, subjects are conventionally assigned to a specific CNP category based on the continuous quantitative measure produced by microarray experiments, and cases and controls are then compared using a chi-square test of independence. The purpose of this work is to specify the likelihood ratio test statistic (LRTS) for the case-control sampling design based on the underlying continuous quantitative measurement, and to assess its power and relative efficiency as compared to the chi-square test of independence on CNP counts. The sample size and power formulas of both methods are given. For the latter, the CNPs are classified using the Bayesian classification rule. The LRTS is more powerful than the chi-square test for the alternatives considered, especially alternatives in which the at-risk CNP categories have low frequencies. An example of the application of the LRTS is given for a comparison of CNP distributions in individuals of Caucasian or Taiwanese ethnicity, where the LRTS appears to be more powerful than the chi-square test, possibly due to misclassification of the most common CNP category into a less common category.

  8. Safety Testing of Ammonium Nitrate Based Mixtures

    Science.gov (United States)

    Phillips, Jason; Lappo, Karmen; Phelan, James; Peterson, Nathan; Gilbert, Don

    2013-06-01

    Ammonium nitrate (AN) and AN-based explosives have a lengthy documented history of use by adversaries in acts of terror. While historical research has been conducted on AN-based explosive mixtures, it has primarily focused on detonation performance while varying the oxygen balance between the oxidizer and fuel components. Similarly, historical safety data on these materials often lack pertinent details such as the specific fuel type, particle size parameters, oxidizer form, etc. A variety of AN-based fuel-oxidizer mixtures were tested for small-scale sensitivity in preparation for large-scale testing. Current efforts focus on maintaining a zero oxygen balance (a stoichiometric ratio for active chemical participants) while varying factors such as charge geometry, oxidizer form, particle size, and inert diluent ratios. Small-scale safety testing was conducted on various mixtures and fuels. It was found that ESD sensitivity is significantly affected by particle size, while this is less so for impact and friction. Thermal testing is in progress to evaluate hazards that may be experienced during large-scale testing.

  9. White Matter Fiber-based Analysis of T1w/T2w Ratio Map.

    Science.gov (United States)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  10. White matter fiber-based analysis of T1w/T2w ratio map

    Science.gov (United States)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  11. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  12. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypotheses that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  13. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.

  14. Graphite Isotope Ratio Method Development Report: Irradiation Test Demonstration of Uranium as a Low Fluence Indicator

    International Nuclear Information System (INIS)

    Reid, B.D.; Gerlach, D.C.; Love, E.F.; McNeece, J.P.; Livingston, J.V.; Greenwood, L.R.; Petersen, S.L.; Morgan, W.C.

    1999-01-01

This report describes an irradiation test designed to investigate the suitability of uranium as a graphite isotope ratio method (GIRM) low fluence indicator. GIRM is a demonstrated concept that gives a graphite-moderated reactor's lifetime production based on measuring changes in the isotopic ratio of elements known to exist in trace quantities within reactor-grade graphite. Appendix I of this report provides a tutorial on the GIRM concept.

  15. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.
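The likelihood ratios discussed can be computed directly from sensitivity and specificity, and applied to a pre-test probability via odds. A minimal sketch; the test characteristics and pre-test probability below are hypothetical:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Update disease probability with a likelihood ratio via odds:
    post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical test: 90% sensitive, 80% specific, 10% pre-test probability
lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)   # LR+ = 4.5, LR- = 0.125
p_pos = post_test_probability(0.10, lr_pos)      # rises to 1/3 after a positive result
```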

  16. A Time-Measurement System Based on Isotopic Ratios

    International Nuclear Information System (INIS)

    Vo, Duc T.; Karpius, P.J.; MacArthur, D.W.; Thron, J.L.

    2007-01-01

A time-measurement system can be built based on the ratio of gamma-ray peak intensities from two radioactive isotopes. The ideal system would use a parent isotope with a short half-life decaying to a long half-life daughter. The activities of the parent-daughter isotopes would be measured using a gamma-ray detector system. The time can then be determined from the ratio of the activities. The best-known candidate for such a system is the 241Pu-241Am parent-daughter pair. However, this 241Pu-241Am system would require a high-purity germanium detector system and sophisticated software to separate and distinguish between the many gamma-ray peaks produced by the decays of the two isotopes. An alternate system would use two different isotopes, again one with a short half-life and one with a half-life that is long relative to the other. The pair of isotopes 210Pb and 241Am (with half-lives of 22 and 432 years, respectively) appears suitable for such a system. This time-measurement system operates by measuring the change in the ratio of the 47-keV peak of 210Pb to the 60-keV peak of 241Am. For the system to work reasonably well, the resolution of the detector would need to be such that the two gamma-ray peaks are well separated so that their peak areas can be accurately determined using a simple region-of-interest (ROI) method. A variety of detectors were tested to find a suitable system for this application. The results of these tests are presented here.
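The two-isotope clock described above can be sketched numerically: each activity decays exponentially, so the peak ratio decays with the difference of the two decay constants, and the elapsed time follows from inverting that ratio. A minimal sketch using the 22- and 432-year half-lives quoted in the abstract; the initial ratio and time are illustrative:

```python
import math

HALF_LIFE_PB210 = 22.0    # years (210Pb), value quoted in the abstract
HALF_LIFE_AM241 = 432.0   # years (241Am)

LAMBDA_PB = math.log(2) / HALF_LIFE_PB210
LAMBDA_AM = math.log(2) / HALF_LIFE_AM241

def peak_ratio(r0, t_years):
    """Ratio of the 210Pb 47-keV peak to the 241Am 60-keV peak after t years,
    starting from an initial ratio r0: r(t) = r0 * exp(-(lam_Pb - lam_Am) * t)."""
    return r0 * math.exp(-(LAMBDA_PB - LAMBDA_AM) * t_years)

def elapsed_time(r0, r_now):
    """Invert the ratio decay law to recover the elapsed time."""
    return math.log(r0 / r_now) / (LAMBDA_PB - LAMBDA_AM)

r_now = peak_ratio(1.0, 10.0)   # ratio after 10 years from an initial ratio of 1.0
```

Because 210Pb decays much faster than 241Am, the ratio falls monotonically, which is what makes it usable as a clock.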

  17. Prediction of Microcystis Blooms Based on TN:TP Ratio and Lake Origin

    Directory of Open Access Journals (Sweden)

    Yoshimasa Amano

    2008-01-01

Full Text Available We evaluated the relationship between TN:TP ratio and Microcystis growth via a database of worldwide lakes grouped by four types of lake origin (dammed, tectonic, coastal, and volcanic lakes). We used microcosms and mesocosms for nutrient elution tests with lake water and four kinds of sediment (nontreated, MgO sprinkling treated, dissolved air flotation [DAF] treated, and combined treated sediment) in order to control the TN:TP ratio and to suppress Microcystis growth. Microcystis growth was related to TN:TP ratio, with the maximum value at an optimum TN:TP ratio and minimum values as the TN:TP ratio approached 0 or ∞. The kurtosis of the distribution curve varied with the type of lake origin; the lowest kurtosis was found in dammed lakes, while the highest was found in volcanic lakes. The lake trophic state could affect the change in kurtosis, with much lower kurtosis in eutrophic lakes (dammed lakes) than in oligotrophic lakes (volcanic lakes). The relationship between TN:TP ratio and Microcystis growth could be explained by the nutrient elution tests under controlled TN:TP ratios through the various sediment treatments. A significant suppression of Microcystis growth, by 70%, could be achieved when the TN:TP ratios exceeded 21. Lake origin could be regarded as an index incorporating morphological and geographical factors and controlling the trophic state in lakes. Lake origin, rather than trophic state, could therefore be considered an important factor in the influence of TN:TP on Microcystis growth.

  18. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  19. The Golden Ratio in Time-based Media

    Directory of Open Access Journals (Sweden)

    Emily Verba

    2013-06-01

Full Text Available Measure and proportion manifest themselves in all areas of beauty and virtue. (Socrates) Mathematics and visual communication share a long historical, symbiotic relationship. In their pursuit of achieving order and beauty, they find common ground through geometry. The golden ratio is a mathematical and aesthetic phenomenon inherent in nature that has consistently evoked sensory enjoyment since antiquity. It may be assumed that the manifestation of the golden ratio in nature accounts for humans' innate enjoyment of it. Throughout the ages, the conscious application of the golden ratio to proportions found in art, architecture, poetry, literature and musical composition has consistently evoked subconscious sensory pleasure. However, the application of the golden ratio to visual temporal proportion, or time-based media, has seldom been investigated. This thesis investigates various applications of the golden ratio as a mathematical framework for choreographing visually harmonious temporal compositions through time-based media. The proliferation of moving images we face on a daily basis is cause for great concern, as we have increasingly less free time in our days. Informative and pleasing images are buried in an avalanche of visual rubbish, constantly streaming into our physical and virtual worlds. Time-based media has the ability to expand and contract movement, thus directing the way viewers experience and spend their time. This investigation presupposes that editing moving images via increments of time determined by the golden ratio may streamline messages, isolating what is most symbolic and effectively communicative within a mathematical framework. A physiological and psychological benefit is created for viewers; there is no wasted time or space. Image-makers and visual communicators have a responsibility to create only that which is useful and/or aesthetically pleasing. An investigation into the temporal structure of time-based media, using
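As a concrete illustration of the golden ratio as a temporal framework, a clip duration can be divided at the golden section, so the whole relates to the larger segment as the larger segment relates to the smaller. A minimal sketch; the 60-second duration is illustrative:

```python
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, approximately 1.618

def golden_cut(duration):
    """Split a duration into major/minor segments in golden proportion:
    duration / major == major / minor == PHI."""
    major = duration / PHI
    return major, duration - major

major, minor = golden_cut(60.0)  # a 60-second clip -> ~37.1 s and ~22.9 s
```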

  20. Orthogonal series generalized likelihood ratio test for failure detection and isolation. [for aircraft control

    Science.gov (United States)

    Hall, Steven R.; Walker, Bruce K.

    1990-01-01

    A new failure detection and isolation algorithm for linear dynamic systems is presented. This algorithm, the Orthogonal Series Generalized Likelihood Ratio (OSGLR) test, is based on the assumption that the failure modes of interest can be represented by truncated series expansions. This assumption leads to a failure detection algorithm with several desirable properties. Computer simulation results are presented for the detection of the failures of actuators and sensors of a C-130 aircraft. The results show that the OSGLR test generally performs as well as the GLR test in terms of time to detect a failure and is more robust to failure mode uncertainty. However, the OSGLR test is also somewhat more sensitive to modeling errors than the GLR test.

  1. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  2. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  3. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
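Wald's SPRT, as invoked above, can be sketched generically: log-likelihood ratios of successive observations are accumulated until they cross one of two thresholds set by the desired false-alarm rate (alpha) and missed-detection rate (beta). This is an illustrative sketch, not the paper's modified algorithm; the log-LR increments are hypothetical:

```python
import math

def wald_sprt(log_lr_increments, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test: accumulate per-observation
    log-likelihood ratios until a decision threshold is crossed."""
    upper = math.log((1 - beta) / alpha)   # cross upward: accept H1 (e.g. obstacle)
    lower = math.log(beta / (1 - alpha))   # cross downward: accept H0 (e.g. clear)
    llr = 0.0
    for k, increment in enumerate(log_lr_increments, start=1):
        llr += increment
        if llr >= upper:
            return "H1", k
        if llr <= lower:
            return "H0", k
    return "continue", len(log_lr_increments)

# Hypothetical sensor stream in which each reading favors H1 by log-LR 0.8
decision, n_obs = wald_sprt([0.8] * 10)
```

The appeal for safeguarding is that alpha and beta enter the thresholds explicitly, rather than being implicit in a single fixed trigger level.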

  4. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  5. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu; Harrou, Fouzi; Sun, Ying

    2017-01-01

Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.
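The GLR step applied to model residuals can be illustrated in a simplified special case: for zero-mean Gaussian residuals with known standard deviation, maximizing the likelihood over an unknown mean shift gives a closed-form statistic. This is a sketch of that textbook case, not the paper's full MSPLS-GLR algorithm; the residual values and sigma are hypothetical:

```python
def glr_mean_shift(residuals, sigma):
    """GLR statistic for a mean shift in nominally zero-mean Gaussian residuals
    with known standard deviation sigma. Maximizing the likelihood over the
    unknown shift gives T = n * mean(e)^2 / sigma^2, which is chi-square
    with 1 degree of freedom under the no-fault hypothesis."""
    n = len(residuals)
    mean = sum(residuals) / n
    return n * mean ** 2 / sigma ** 2

# Hypothetical residuals from a fitted model, before and after a fault
clean = [0.1, -0.2, 0.05, -0.1, 0.15]
shifted = [e + 1.0 for e in clean]       # a sustained upward shift of 1.0
t_clean = glr_mean_shift(clean, sigma=0.5)
t_shifted = glr_mean_shift(shifted, sigma=0.5)
```

A fault is declared when T exceeds a chi-square quantile chosen for the desired false-alarm rate.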

  6. A numerical test method of California bearing ratio on graded crushed rocks using particle flow modeling

    Directory of Open Access Journals (Sweden)

    Yingjun Jiang

    2015-04-01

    Full Text Available In order to better understand the mechanical properties of graded crushed rocks (GCRs and to optimize the relevant design, a numerical test method based on the particle flow modeling technique PFC2D is developed for the California bearing ratio (CBR test on GCRs. The effects of different testing conditions and micro-mechanical parameters used in the model on the CBR numerical results have been systematically studied. The reliability of the numerical technique is verified. The numerical results suggest that the influences of the loading rate and Poisson's ratio on the CBR numerical test results are not significant. As such, a loading rate of 1.0–3.0 mm/min, a piston diameter of 5 cm, a specimen height of 15 cm and a specimen diameter of 15 cm are adopted for the CBR numerical test. The numerical results reveal that the CBR values increase with the friction coefficient at the contact and shear modulus of the rocks, while the influence of Poisson's ratio on the CBR values is insignificant. The close agreement between the CBR numerical results and experimental results suggests that the numerical simulation of the CBR values is promising to help assess the mechanical properties of GCRs and to optimize the grading design. Besides, the numerical study can provide useful insights on the mesoscopic mechanism.

  7. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2012-01-01

Full Text Available The Sharpe ratio is the prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n−3/2) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
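For context, the conventional asymptotic test that such likelihood-based methods aim to improve on can be sketched as follows: under iid normal returns, the sample Sharpe ratio has approximate standard error sqrt((1 + SR^2/2)/n). The returns below are hypothetical:

```python
import math
import statistics

def sharpe_ratio(returns, risk_free=0.0):
    """Sample Sharpe ratio: mean excess return over its sample standard deviation."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

def sharpe_z_test(returns, sr_null=0.0, risk_free=0.0):
    """Asymptotic z-statistic for H0: SR = sr_null, using the iid-normal
    standard error sqrt((1 + SR^2 / 2) / n)."""
    n = len(returns)
    sr = sharpe_ratio(returns, risk_free)
    se = math.sqrt((1 + sr ** 2 / 2) / n)
    return (sr - sr_null) / se

# Hypothetical monthly excess returns
rets = [0.02, -0.01, 0.03, 0.01, 0.00, 0.02, -0.02, 0.04]
z = sharpe_z_test(rets)
```

With only eight observations the z-statistic is far from significant, which illustrates why small-sample accuracy of the reference distribution matters.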

  8. New decision criteria for selecting delta check methods based on the ratio of the delta difference to the width of the reference range can be generally applicable for each clinical chemistry test item.

    Science.gov (United States)

    Park, Sang Hyuk; Kim, So-Young; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2012-09-01

    Many laboratories use 4 delta check methods: delta difference, delta percent change, rate difference, and rate percent change. However, guidelines regarding decision criteria for selecting delta check methods have not yet been provided. We present new decision criteria for selecting delta check methods for each clinical chemistry test item. We collected 811,920 and 669,750 paired (present and previous) test results for 27 clinical chemistry test items from inpatients and outpatients, respectively. We devised new decision criteria for the selection of delta check methods based on the ratio of the delta difference to the width of the reference range (DD/RR). Delta check methods based on these criteria were compared with those based on the CV% of the absolute delta difference (ADD) as well as those reported in 2 previous studies. The delta check methods suggested by new decision criteria based on the DD/RR ratio corresponded well with those based on the CV% of the ADD except for only 2 items each in inpatients and outpatients. Delta check methods based on the DD/RR ratio also corresponded with those suggested in the 2 previous studies, except for 1 and 7 items in inpatients and outpatients, respectively. The DD/RR method appears to yield more feasible and intuitive selection criteria and can easily explain changes in the results by reflecting both the biological variation of the test item and the clinical characteristics of patients in each laboratory. We suggest this as a measure to determine delta check methods.
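The DD/RR quantity at the heart of the proposed criteria is straightforward to compute: the delta difference between paired results divided by the width of the reference range. A minimal sketch; the sodium results and reference range below are hypothetical:

```python
def delta_check(present, previous, ref_low, ref_high):
    """Delta difference (DD) between paired results and its ratio to the
    width of the reference range (DD/RR)."""
    dd = present - previous
    rr = ref_high - ref_low
    return dd, dd / rr

# Hypothetical serum sodium results, reference range 135-145 mmol/L
dd, dd_rr = delta_check(present=148.0, previous=140.0,
                        ref_low=135.0, ref_high=145.0)
```

Expressing the delta relative to the reference range, rather than in raw units, is what lets one decision rule carry over across analytes with very different scales.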

  9. Jet-Surface Interaction: High Aspect Ratio Nozzle Test, Nozzle Design and Preliminary Data

    Science.gov (United States)

    Brown, Clifford; Dippold, Vance

    2015-01-01

The Jet-Surface Interaction High Aspect Ratio (JSI-HAR) nozzle test is part of an ongoing effort to measure and predict the noise created when an aircraft engine exhausts close to an airframe surface. The JSI-HAR test is focused on parameters derived from the Turbo-electric Distributed Propulsion (TeDP) concept aircraft which include a high-aspect ratio mailslot exhaust nozzle, internal septa, and an aft deck. The size and mass flow rate limits of the test rig also limited the test nozzle to a 16:1 aspect ratio, half the approximately 32:1 on the TeDP concept. Also, unlike the aircraft, the test nozzle must transition from a single round duct on the High Flow Jet Exit Rig, located in the AeroAcoustic Propulsion Laboratory at the NASA Glenn Research Center, to the rectangular shape at the nozzle exit. A parametric nozzle design method was developed to design three low noise round-to-rectangular transitions, with 8:1, 12:1, and 16:1 aspect ratios, that minimizes flow separations and shocks while providing a flat flow profile at the nozzle exit. These designs were validated using the WIND-US CFD code. A preliminary analysis of the test data shows that the actual flow profile is close to that predicted and that the noise results appear consistent with data from previous, smaller scale, tests. The JSI-HAR test is ongoing through October 2015. The results shown in the presentation are intended to provide an overview of the test and a first look at the preliminary results.

  10. Attribute Weighting Based K-Nearest Neighbor Using Gain Ratio

    Science.gov (United States)

    Nababan, A. A.; Sitompul, O. S.; Tulus

    2018-04-01

K-Nearest Neighbor (KNN) is a good classifier, but several studies have found its accuracy to be lower than that of other methods. One cause of the low accuracy is that each attribute has the same effect on the classification process, so less relevant attributes lead to misclassification of new data. In this research, we propose Attribute Weighting Based K-Nearest Neighbor Using Gain Ratio, in which the Gain Ratio measures the correlation between each attribute and the class and serves as the basis for weighting each attribute of the dataset. The accuracy of the results is compared to that of the original KNN method using 10-fold cross-validation on several datasets from the UCI Machine Learning Repository and the KEEL-Dataset Repository: abalone, glass identification, haberman, hayes-roth, and water quality status. Based on the test results, the proposed method was able to increase the classification accuracy of KNN; the largest accuracy gain, 12.73%, was obtained on the hayes-roth dataset, and the smallest, 0.07%, on the abalone dataset. Averaged over all datasets, the method increased accuracy by 5.33%.
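The core idea, weighting each attribute's contribution to the distance by its gain ratio, can be sketched as follows. This is an illustrative implementation, not the authors' code; the gain ratio here is the classical information gain divided by split information for a categorical attribute:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """Information gain of a categorical feature divided by its split info."""
    n = len(labels)
    info_gain = entropy(labels)
    split_info = 0.0
    for v in set(feature_values):
        subset = [y for f, y in zip(feature_values, labels) if f == v]
        p = len(subset) / n
        info_gain -= p * entropy(subset)
        split_info -= p * math.log2(p)
    return info_gain / split_info if split_info > 0 else 0.0

def weighted_distance(a, b, weights):
    """Euclidean distance with one gain-ratio weight per attribute."""
    return math.sqrt(sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b)))
```

A perfectly class-predictive attribute gets gain ratio 1 and a class-independent one gets 0, so irrelevant attributes are effectively suppressed in the neighbor search.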

11. The Effect of Changes in Return on Assets, Debt to Equity Ratio, and Cash Ratio on Changes in Dividend Payout Ratio

    Directory of Open Access Journals (Sweden)

    Yuli Soesetio

    2008-02-01

Full Text Available The Dividend Payout Ratio is used to calculate the portion of revenue that stockholders will receive as cash dividends, usually expressed as a percentage. This research was conducted to identify factors that affect changes in the Dividend Payout Ratio and to determine the significance level and the correlation between the dependent and independent variables. The analysis instrument used was parametric statistics. Based on the results of the statistical tests, the change in Return on Assets (X1) and the change in Debt to Equity Ratio (X2) were able to explain the dependent variable, the change in Dividend Payout Ratio, while the change in Cash Ratio could not.

  12. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.

  13. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
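A statistic of the kind described, a summation of log-transformed squared t-statistics, can be sketched for the one-sample case. The exact form used in the paper may differ; this sketch uses the classical per-coordinate normal LRT, for which -2 log Lambda = n * log(1 + t^2 / (n - 1)):

```python
import math
import statistics

def diagonal_lrt_statistic(samples):
    """One-sample diagonal LRT-style statistic for H0: mu = 0, sketched as the
    sum over coordinates of per-coordinate -2 log likelihood ratios,
    n * log(1 + t_j^2 / (n - 1)), a log-transformed squared t-statistic.
    `samples` is a list of observations, each a list of p coordinates."""
    n = len(samples)
    p = len(samples[0])
    stat = 0.0
    for j in range(p):
        col = [row[j] for row in samples]
        t = math.sqrt(n) * statistics.mean(col) / statistics.stdev(col)
        stat += n * math.log(1.0 + t ** 2 / (n - 1))
    return stat

# Hypothetical 2-dimensional data, centered at zero vs shifted by 1
centered = [[0.1, -0.1], [-0.1, 0.1], [0.05, -0.05], [-0.05, 0.05]]
shifted = [[x + 1.0 for x in row] for row in centered]
stat0 = diagonal_lrt_statistic(centered)
stat1 = diagonal_lrt_statistic(shifted)
```

The log transform damps the influence of any single extreme coordinate relative to a direct sum of squared t-statistics, which is the characteristic the abstract highlights.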

  14. Nonlinear relationship between the Product Consistency Test (PCT) response and the Al/B ratio in a soda-lime aluminoborosilicate glass

    Energy Technology Data Exchange (ETDEWEB)

    Farooqi, Rahmat Ullah, E-mail: rufarooqi@postech.ac.kr [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Hrma, Pavel [Division of Advanced Nuclear Engineering, Pohang University of Science and Technology, 77 Cheongam-Ro, Nam-Gu, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Pacific Northwest National Laboratory, Richland, WA (United States)

    2016-06-15

    We have investigated the effect of Al/B ratio on the Product Consistency Test (PCT) response. In an aluminoborosilicate soda-lime glass based on a modified International Simple Glass, ISG-3, the Al/B ratio varied from 0 to 0.55 (in mole fractions). In agreement with various models of the PCT response as a function of glass composition, we observed a monotonic increase of B and Na releases with decreasing Al/B mole ratio, but only when the ratio was higher than 0.05. Below this value (Al/B < 0.05), we observed a sharp decrease that we attribute to B in tetrahedral coordination.

  15. Aircraft control surface failure detection and isolation using the OSGLR test. [orthogonal series generalized likelihood ratio

    Science.gov (United States)

    Bonnice, W. F.; Motyka, P.; Wagner, E.; Hall, S. R.

    1986-01-01

    The performance of the orthogonal series generalized likelihood ratio (OSGLR) test in detecting and isolating commercial aircraft control surface and actuator failures is evaluated. A modification to incorporate age-weighting which significantly reduces the sensitivity of the algorithm to modeling errors is presented. The steady-state implementation of the algorithm based on a single linear model valid for a cruise flight condition is tested using a nonlinear aircraft simulation. A number of off-nominal no-failure flight conditions including maneuvers, nonzero flap deflections, different turbulence levels and steady winds were tested. Based on the no-failure decision functions produced by off-nominal flight conditions, the failure detection and isolation performance at the nominal flight condition was determined. The extension of the algorithm to a wider flight envelope by scheduling on dynamic pressure and flap deflection is examined. Based on this testing, the OSGLR algorithm should be capable of detecting control surface failures that would affect the safe operation of a commercial aircraft. Isolation may be difficult if there are several surfaces which produce similar effects on the aircraft. Extending the algorithm over the entire operating envelope of a commercial aircraft appears feasible.

  16. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose X1, X2, ..., Xn is a random sample from an Np(μ, V) distribution. Consider H0: μ1 = μ2 = ... = μp = 0 and H1: μi ≥ 0 for i = 1, 2, ..., p; let H1 - H0 denote the hypothesis that H1 holds but H0 does not, and let ~H0 denote the hypothesis that H0 does not hold. Because the likelihood ratio test (LRT) of H0 versus H1 - H0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H0 if the usual test of H0 versus ~H0 rejects H0 with significance level 2α and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. Given the LRT's desirable scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests recommended in Chongcharoen and Wright (2007, 2006).
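The Follmann-type rule summarized above (reject H0 when the omnibus test rejects at level 2α and the sum of the sample means is positive) can be sketched as follows for known covariance. The chi-square critical value 6.251 (the upper 10% point of χ² with 3 degrees of freedom, i.e. 2α = 0.10 for α = 0.05) and the equal-weight sum are illustrative choices, not values taken from the cited papers.

```python
def follmann_test(xbar, v_inv, n, chi2_crit_2alpha):
    """Follmann (1996)-style one-sided test sketch with known covariance:
    reject H0: mu = 0 in favour of the one-sided alternative when the
    quadratic-form chi-square statistic exceeds the 2*alpha critical value
    AND the sum of the sample means is positive."""
    p = len(xbar)
    # quadratic form n * xbar' V^{-1} xbar, chi-square with p df under H0
    q = n * sum(xbar[i] * v_inv[i][j] * xbar[j]
                for i in range(p) for j in range(p))
    return q > chi2_crit_2alpha and sum(xbar) > 0
```

The sign condition is what makes the test one-sided: a large quadratic form with negative means is not evidence for H1.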

  17. A neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine

    Science.gov (United States)

    Guo, T. H.; Musgrave, J.

    1992-11-01

    In order to properly utilize the available fuel and oxidizer of a liquid propellant rocket engine, the mixture ratio is closed-loop controlled during main stage (65 percent - 109 percent power) operation. However, because of the lack of flight-capable instrumentation for measuring mixture ratio, the value of mixture ratio in the control loop is estimated using available sensor measurements such as the combustion chamber pressure and the volumetric flow, and the temperature and pressure at the exit duct on the low pressure fuel pump. This estimation scheme has two limitations. First, the estimation formula is based on an empirical curve fit which is accurate only within a narrow operating range. Second, the mixture ratio estimate relies on a few sensor measurements, and loss of any of these measurements will make the estimate invalid. In this paper, we propose a neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine. The estimator is an extension of a previously developed neural network-based sensor failure detection and recovery algorithm (sensor validation). This neural network uses an autoassociative structure which exploits the redundant information of dissimilar sensors to detect inconsistent measurements. Two approaches have been identified for synthesizing mixture ratio from measurement data using a neural network. The first approach uses an autoassociative neural network for sensor validation which is modified to include the mixture ratio as an additional output. The second uses a new network for the mixture ratio estimation in addition to the sensor validation network. Although mixture ratio is not directly measured in flight, it is generally available in simulation and in test bed firing data from facility measurements of fuel and oxidizer volumetric flows. The pros and cons of these two approaches will be discussed in terms of robustness to sensor failures and accuracy of the estimate during typical transients using

  18. The Effect of Alkaline Activator Ratio on the Compressive Strength of Fly Ash-Based Geopolymer Paste

    Science.gov (United States)

    Lăzărescu, A. V.; Szilagyi, H.; Baeră, C.; Ioani, A.

    2017-06-01

    Alkaline activation of fly ash is a particular procedure in which ash resulting from a power plant, combined with a specific alkaline activator, creates a solid material when dried at a certain temperature. In order to obtain desirable compressive strengths, the mix design of fly ash-based geopolymer pastes should be explored comprehensively. To determine the preliminary compressive strength of fly ash-based geopolymer paste using a Romanian material source, various ratios of Na2SiO3 solution/NaOH solution were produced, keeping the fly ash/alkaline activator ratio constant. All the mixes were then cured at 70 °C for 24 hours and tested at 2 and 7 days, respectively. The aim of this paper is to present the preliminary compressive strength results for producing fly ash-based geopolymer paste using Romanian material sources, the effect of the alkaline activator ratio on compressive strength, and directions for future research.
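The mix-design bookkeeping described above can be sketched as a small helper: with the fly ash/activator mass ratio held constant, the activator mass is split between Na2SiO3 and NaOH solutions according to the chosen ratio. The function name and all numeric values are illustrative; the paper's actual proportions are not reproduced here.

```python
def activator_masses(fly_ash_mass, ash_to_activator_ratio, na2sio3_to_naoh_ratio):
    """Split the alkaline activator into Na2SiO3 and NaOH solution masses
    for a given fly ash mass.  Both ratios are mass ratios; the fly
    ash/activator ratio is held constant while the Na2SiO3/NaOH ratio
    varies, as in the mix-design procedure described above."""
    activator = fly_ash_mass / ash_to_activator_ratio
    naoh = activator / (1.0 + na2sio3_to_naoh_ratio)
    na2sio3 = activator - naoh
    return na2sio3, naoh
```

For example, 1000 g of fly ash at an ash/activator ratio of 2.0 and an Na2SiO3/NaOH ratio of 1.5 yields 500 g of activator split 300 g / 200 g.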

  19. Likelihood ratio-based integrated personal risk assessment of type 2 diabetes.

    Science.gov (United States)

    Sato, Noriko; Htun, Nay Chi; Daimon, Makoto; Tamiya, Gen; Kato, Takeo; Kubota, Isao; Ueno, Yoshiyuki; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa; Muramatsu, Masaaki

    2014-01-01

    To facilitate personalized health care for multifactorial diseases, risks of genetic and clinical/environmental factors should be assessed together for each individual in an integrated fashion. This approach is possible with the likelihood ratio (LR)-based risk assessment system, as this system can incorporate manifold tests. We examined the usefulness of this system for assessing type 2 diabetes (T2D). Our system employed 29 genetic susceptibility variants, body mass index (BMI), and hypertension as risk factors whose LRs can be estimated from openly available T2D association data for the Japanese population. The pretest probability was set at a sex- and age-appropriate population average of diabetes prevalence. The classification performance of our LR-based risk assessment was compared to that of a non-invasive screening test for diabetes called TOPICS (with score based on age, sex, family history, smoking, BMI, and hypertension) using receiver operating characteristic analysis with a community cohort (n = 1263). The area under the receiver operating characteristic curve (AUC) for the LR-based assessment and TOPICS was 0.707 (95% CI 0.665-0.750) and 0.719 (0.675-0.762), respectively. These AUCs were much higher than that of a genetic risk score constructed using the same genetic susceptibility variants, 0.624 (0.574-0.674). The use of ethnically matched LRs is necessary for proper personal risk assessment. In conclusion, although LR-based integrated risk assessment for T2D still requires additional tests that evaluate other factors, such as risks involved in missing heritability, our results indicate the potential usability of LR-based assessment system and stress the importance of stratified epidemiological investigations in personalized medicine.
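The LR-based integration described above rests on Bayes' theorem in odds form: posttest odds equal pretest odds multiplied by the likelihood ratio of each (assumed independent) risk factor. A minimal sketch, with a hypothetical helper name:

```python
def posttest_probability(pretest_prob, likelihood_ratios):
    """Combine a pretest probability with a sequence of test likelihood
    ratios via Bayes' theorem in odds form:
    posttest odds = pretest odds * LR1 * LR2 * ...
    Assumes the factors contributing the LRs are independent."""
    odds = pretest_prob / (1.0 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)
```

For instance, a 10% population-average pretest probability combined with two factors of LR 2.0 and 1.5 gives a 25% posttest probability; an LR of 1.0 (an uninformative test) leaves the probability unchanged.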

  20. Ratio-based lengths of intervals to improve fuzzy time series forecasting.

    Science.gov (United States)

    Huarng, Kunhuang; Yu, Tiffany Hui-Kuang

    2006-04-01

    The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.
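Under the ratio-based scheme, each interval is a fixed ratio longer than the one before it, rather than all intervals sharing one equal length. A minimal sketch (the paper derives the ratio from percentile analysis of the observed data; here it is passed in directly, and the helper name is hypothetical):

```python
def ratio_based_boundaries(lower, upper, r):
    """Generate interval boundaries for fuzzy time series forecasting in
    which each interval is (1 + r) times as long as the previous one,
    i.e. b_{k+1} = b_k * (1 + r), until the data range is covered."""
    bounds = [lower]
    while bounds[-1] < upper:
        bounds.append(bounds[-1] * (1.0 + r))
    return bounds
```

With geometric boundaries the interval width grows with the magnitude of the observations, which is the property the abstract argues represents algebraic and exponential growth data better than equal-length intervals.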

  1. Effect of home testing of international normalized ratio on clinical events.

    Science.gov (United States)

    Matchar, David B; Jacobson, Alan; Dolor, Rowena; Edson, Robert; Uyeda, Lauren; Phibbs, Ciaran S; Vertrees, Julia E; Shih, Mei-Chiung; Holodniy, Mark; Lavori, Philip

    2010-10-21

    Warfarin anticoagulation reduces thromboembolic complications in patients with atrial fibrillation or mechanical heart valves, but effective management is complex, and the international normalized ratio (INR) is often outside the target range. As compared with venous plasma testing, point-of-care INR measuring devices allow greater testing frequency and patient involvement and may improve clinical outcomes. We randomly assigned 2922 patients who were taking warfarin because of mechanical heart valves or atrial fibrillation and who were competent in the use of point-of-care INR devices to either weekly self-testing at home or monthly high-quality testing in a clinic. The primary end point was the time to a first major event (stroke, major bleeding episode, or death). The patients were followed for 2.0 to 4.75 years, for a total of 8730 patient-years of follow-up. The time to the first primary event was not significantly longer in the self-testing group than in the clinic-testing group (hazard ratio, 0.88; 95% confidence interval, 0.75 to 1.04; P=0.14). The two groups had similar rates of clinical outcomes except that the self-testing group reported more minor bleeding episodes. Over the entire follow-up period, the self-testing group had a small but significant improvement in the percentage of time during which the INR was within the target range (absolute difference between groups, 3.8 percentage points; P<0.001). At 2 years of follow-up, the self-testing group also had a small but significant improvement in patient satisfaction with anticoagulation therapy (P=0.002) and quality of life (P<0.001). As compared with monthly high-quality clinic testing, weekly self-testing did not delay the time to a first stroke, major bleeding episode, or death to the extent suggested by prior studies. These results do not support the superiority of self-testing over clinic testing in reducing the risk of stroke, major bleeding episode, and death among patients taking warfarin

  2. Microscopic silicon-based lateral high-aspect-ratio structures for thin film conformality analysis

    International Nuclear Information System (INIS)

    Gao, Feng; Arpiainen, Sanna; Puurunen, Riikka L.

    2015-01-01

    Film conformality is one of the major drivers for the interest in atomic layer deposition (ALD) processes. This work presents new silicon-based microscopic lateral high-aspect-ratio (LHAR) test structures for the analysis of the conformality of thin films deposited by ALD and by other chemical vapor deposition means. The microscopic LHAR structures consist of a lateral cavity inside silicon with a roof supported by pillars. The cavity length (e.g., 20–5000 μm) and cavity height (e.g., 200–1000 nm) can be varied, giving aspect ratios of, e.g., 20:1 to 25 000:1. Film conformality can be analyzed with the microscopic LHAR by several means, as demonstrated for the ALD Al2O3 and TiO2 processes from Me3Al/H2O and TiCl4/H2O. The microscopic LHAR test structures introduced in this work expose a new parameter space for thin film conformality investigations, expected to prove useful in the development, tuning and modeling of ALD and other chemical vapor deposition processes.
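The aspect ratios quoted above follow directly from the cavity dimensions once length and height are expressed in the same unit; a small check, with an illustrative helper:

```python
def aspect_ratio(cavity_length_um, cavity_height_nm):
    """Aspect ratio of a lateral high-aspect-ratio (LHAR) cavity:
    length (micrometres) over height (nanometres), converted to a
    common unit first (1 um = 1000 nm)."""
    return (cavity_length_um * 1000.0) / cavity_height_nm
```

A 20 μm cavity of 1000 nm height gives 20:1, and a 5000 μm cavity of 200 nm height gives 25 000:1, matching the range stated in the abstract.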

  3. Improved ASTM G72 Test Method for Ensuring Adequate Fuel-to-Oxidizer Ratios

    Science.gov (United States)

    Juarez, Alfredo; Harper, Susana Tapia

    2016-01-01

    The ASTM G72/G72M-15 Standard Test Method for Autogenous Ignition Temperature of Liquids and Solids in a High-Pressure Oxygen-Enriched Environment is currently used to evaluate materials for the ignition susceptibility driven by exposure to external heat in an enriched oxygen environment. Testing performed on highly volatile liquids such as cleaning solvents has proven problematic due to inconsistent test results (non-ignitions). Non-ignition results can be misinterpreted as favorable oxygen compatibility, although they are more likely associated with inadequate fuel-to-oxidizer ratios. Forced evaporation during purging and inadequate sample size were identified as two potential causes for inadequate available sample material during testing. In an effort to maintain adequate fuel-to-oxidizer ratios within the reaction vessel during test, several parameters were considered, including sample size, pretest sample chilling, pretest purging, and test pressure. Tests on a variety of solvents exhibiting a range of volatilities are presented in this paper. A proposed improvement to the standard test protocol as a result of this evaluation is also presented. Execution of the final proposed improved test protocol outlines an incremental step method of determining optimal conditions using increased sample sizes while considering test system safety limits. The proposed improved test method increases confidence in results obtained by utilizing the ASTM G72 autogenous ignition temperature test method and can aid in the oxygen compatibility assessment of highly volatile liquids and other conditions that may lead to false non-ignition results.

  4. The effects of multiple features of alternatively spliced exons on the KA/KS ratio test

    Directory of Open Access Journals (Sweden)

    Chen Feng-Chi

    2006-05-01

    Full Text Available Abstract Background The evolution of alternatively spliced exons (ASEs) is of primary interest because these exons are suggested to be a major source of functional diversity of proteins. Many exon features have been suggested to affect the evolution of ASEs. However, previous studies have relied on the KA/KS ratio test without taking into consideration information sufficiency (i.e., exon length > 75 bp, cross-species divergence > 5%) of the studied exons, leading to potentially biased interpretations. Furthermore, which exon feature dominates the results of the KA/KS ratio test and whether multiple exon features have additive effects have remained unexplored. Results In this study, we collect two different datasets for analysis – the ASE dataset (which includes lineage-specific ASEs and conserved ASEs) and the ACE dataset (which includes only conserved ASEs). We first show that information sufficiency can significantly affect the interpretation of the relationship between exon features and the KA/KS ratio test results. After discarding exons with insufficient information, we use a Boolean method to analyze the relationship between test results and four exon features (namely length, protein domain overlapping, inclusion level, and exonic splicing enhancer (ESE) frequency) for the ASE dataset. We demonstrate that length and protein domain overlapping are dominant factors, and they have similar impacts on test results of ASEs. In addition, despite the weak impacts of inclusion level and ESE motif frequency when considered individually, the combination of these two factors still has minor additive effects on test results. However, the ACE dataset shows a slightly different result in that inclusion level has a marginally significant effect on test results. Lineage-specific ASEs may have contributed to the difference. Overall, in both ASEs and ACEs, protein domain overlapping is the most dominant exon feature while ESE frequency is the weakest one in affecting
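The information-sufficiency screen described above reduces to a simple predicate applied before running the KA/KS ratio test; a sketch with a hypothetical helper name (length in base pairs, divergence as a fraction):

```python
def information_sufficient(exon_length_bp, divergence):
    """Information-sufficiency screen used before the KA/KS ratio test:
    keep only exons longer than 75 bp whose cross-species divergence
    exceeds 5%, per the criteria quoted in the abstract."""
    return exon_length_bp > 75 and divergence > 0.05
```

Exons failing either criterion carry too little signal for the test, which is the source of the interpretation bias the study addresses.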

  5. The Golden Ratio in Time-based Media

    Directory of Open Access Journals (Sweden)

    Emily Verba

    2013-06-01

    The proliferation of moving images we face on a daily basis is cause for great concern, as we have increasingly less free time in our days. Informative and pleasing images are buried in an avalanche of visual rubbish, constantly streaming into our physical and virtual worlds. Time-based media has the ability to expand and contract movement, thus directing the way viewers experience and spend their time. This investigation presupposes that editing moving images via increments of time determined by the golden ratio may streamline messages, isolating what is most symbolic and effectively communicative within a mathematical framework. A physiological and psychological benefit is created for viewers; there is no wasted time or space. Image-makers and visual communicators have a responsibility to create only that which is useful and/or aesthetically pleasing. An investigation into the temporal structure of time-based media, using mathematical algorithms derived from the golden ratio, has led to the aim of creating through it a viable solution for the implementation of visual communication messages in today's society.
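Cutting a clip at the golden section, as proposed above, amounts to dividing its duration by φ ≈ 1.618 so that the whole relates to the longer segment as the longer segment does to the shorter. A minimal sketch (helper name illustrative):

```python
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.618

def golden_cut(duration):
    """Split a clip of the given duration (seconds) at the golden section:
    whole/longer == longer/shorter == PHI."""
    longer = duration / PHI
    shorter = duration - longer
    return longer, shorter
```

A 60-second clip, for example, divides into segments of roughly 37.1 and 22.9 seconds.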

  6. The patients' perspective of international normalized ratio self-testing, remote communication of test results and confidence to move to self-management.

    Science.gov (United States)

    Grogan, Anne; Coughlan, Michael; Prizeman, Geraldine; O'Connell, Niamh; O'Mahony, Nora; Quinn, Katherine; McKee, Gabrielle

    2017-12-01

    To elicit the perceptions of patients who self-tested their international normalized ratio and communicated their results via a text or phone messaging system, to determine their satisfaction with the education and support that they received, and to establish their confidence to move to self-management. Self-testing of international normalized ratio has been shown to be reliable and is fast becoming common practice. As innovations are introduced to point-of-care testing, more research is needed to elicit patients' perceptions of the self-testing process. This three-site study used a cross-sectional prospective descriptive survey. Three hundred and thirty patients who were prescribed warfarin and using international normalized ratio self-testing were invited to take part in the study. The anonymous survey examined patient profile, patients' usage, issues, perceptions, confidence and satisfaction with using the self-testing system, and their preparedness for self-management of warfarin dosage. The response rate was 57% (n = 178). Patients' confidence in self-testing was high (90%). Patients expressed a high level of satisfaction with the support received, but expressed the need for more information on support groups, side effects of warfarin, dietary information and how to dispose of needles. When asked if they felt confident to adjust their own warfarin levels, 73% agreed. Chi-squared tests for independence revealed that none of the patient profile factors examined influenced this confidence. The patients cited the greatest advantages of the service as reduced burden, more autonomy, convenience and ease of use. The main disadvantages cited were cost and communication issues. Patients were satisfied with self-testing. The majority felt they were ready to move to self-management. The introduction of innovations to remote point-of-care testing, such as warfarin self-testing, needs to have support at least equal to that provided in a hospital setting.

  7. Can T1w/T2w ratio be used as a myelin-specific measure in subcortical structures? Comparisons between FSE-based T1w/T2w ratios, GRASE-based T1w/T2w ratios and multi-echo GRASE-based myelin water fractions.

    Science.gov (United States)

    Uddin, Md Nasir; Figley, Teresa D; Marrie, Ruth Ann; Figley, Chase R

    2018-03-01

    Given the growing popularity of T1-weighted/T2-weighted (T1w/T2w) ratio measurements, the objective of the current study was to evaluate the concordance between T1w/T2w ratios obtained using conventional fast spin echo (FSE) versus combined gradient and spin echo (GRASE) sequences for T2w image acquisition, and to compare the resulting T1w/T2w ratios with histologically validated myelin water fraction (MWF) measurements in several subcortical brain structures. In order to compare these measurements across a relatively wide range of myelin concentrations, whole-brain T1w magnetization prepared rapid acquisition gradient echo (MPRAGE), T2w FSE and three-dimensional multi-echo GRASE data were acquired from 10 participants with multiple sclerosis at 3 T. Then, after high-dimensional, non-linear warping, region of interest (ROI) analyses were performed to compare T1w/T2w ratios and MWF estimates (across participants and brain regions) in 11 bilateral white matter (WM) and four bilateral subcortical grey matter (SGM) structures extracted from the JHU_MNI_SS 'Eve' atlas. Although the GRASE sequence systematically underestimated T1w/T2w values compared to the FSE sequence (revealed by Bland-Altman and mountain plots), linear regressions across participants and ROIs revealed consistently high correlations between the two methods (r^2 = 0.62 for all ROIs, r^2 = 0.62 for WM structures and r^2 = 0.73 for SGM structures). However, correlations between either FSE-based or GRASE-based T1w/T2w ratios and MWFs were extremely low in WM structures (FSE-based, r^2 = 0.000020; GRASE-based, r^2 = 0.0014), low across all ROIs (FSE-based, r^2 = 0.053; GRASE-based, r^2 = 0.029) and moderate in SGM structures (FSE-based, r^2 = 0.20; GRASE-based, r^2 = 0.17). Overall, our findings indicated a high degree of correlation (but not equivalence) between FSE-based and GRASE-based T1w/T2w ratios, and low correlations between T1w/T2w ratios and MWFs. This

  8. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary.

    Energy Technology Data Exchange (ETDEWEB)

    Brucher, Wenzel (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Koch, Wolfgang (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Pretzsch, Gunter Guido (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Loiseau, Olivier (Institut de Radioprotection et de Surete Nucleaire, France); Mo, Tin (U.S. Nuclear Regulatory Commission, Washington, DC); Billone, Michael C. (Argonne National Laboratory, Argonne, IL); Autrusson, Bruno A. (Institut de Radioprotection et de Surete Nucleaire, France); Young, F. I. (U.S. Nuclear Regulatory Commission, Washington, DC); Coats, Richard Lee; Burtseva, Tatiana (Argonne National Laboratory, Argonne, IL); Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Thompson, Nancy Slater (U.S. Department of Energy, Washington, DC); Hibbs, Russell S. (U.S. Department of Energy, Washington, DC); Gregson, Michael Warren; Lange, Florentin (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Molecke, Martin Alan; Tsai, Han-Chung (Argonne National Laboratory, Argonne, IL)

    2005-07-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. 
This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  9. Spent fuel sabotage aerosol ratio program : FY 2004 test and data summary

    International Nuclear Information System (INIS)

    Brucher, Wenzel; Koch, Wolfgang; Pretzsch, Gunter Guido; Loiseau, Olivier; Mo, Tin; Billone, Michael C.; Autrusson, Bruno A.; Young, F. I.; Coats, Richard Lee; Burtseva, Tatiana; Luna, Robert Earl; Dickey, Roy R.; Sorenson, Ken Bryce; Nolte, Oliver; Thompson, Nancy Slater; Hibbs, Russell S.; Gregson, Michael Warren; Lange, Florentin; Molecke, Martin Alan; Tsai, Han-Chung

    2005-01-01

    This multinational, multi-phase spent fuel sabotage test program is quantifying the aerosol particles produced when the products of a high energy density device (HEDD) interact with and explosively particulate test rodlets that contain pellets of either surrogate materials or actual spent fuel. This program has been underway for several years. This program provides data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. The program also provides significant technical and political benefits in international cooperation. We are quantifying the Spent Fuel Ratio (SFR), the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions, in a contained test chamber. In addition, we are measuring the amounts, nuclide content, size distribution of the released aerosol materials, and enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are the input for follow-on modeling studies to quantify respirable hazards, associated radiological risk assessments, vulnerability assessments, and potential cask physical protection design modifications. This document includes an updated description of the test program and test components for all work and plans made, or revised, during FY 2004. It also serves as a program status report as of the end of FY 2004. All available test results, observations, and aerosol analyses plus interpretations--primarily for surrogate material Phase 2 tests, series 2/5A through 2/9B, using cerium oxide sintered ceramic pellets are included. Advanced plans and progress are described for upcoming tests with unirradiated, depleted uranium oxide and actual spent fuel test rodlets. 
This spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of

  10. International normalized ratio self-testing and self-management: improving patient outcomes

    Directory of Open Access Journals (Sweden)

    Pozzi M

    2016-10-01

    Full Text Available Matteo Pozzi,1 Julia Mitchell,2 Anna Maria Henaine,3 Najib Hanna,4 Ola Safi,4 Roland Henaine2 1Department of Adult Cardiac Surgery, “Louis Pradel” Cardiologic Hospital, Lyon, France; 2Department of Congenital Cardiac Surgery, “Louis Pradel” Cardiologic Hospital, Lyon, France; 3Clinical Pharmacology Unit, Lebanese University, Beirut, Lebanon; 4Pediatric Unit, “Hotel Dieu de France” Hospital, Saint Joseph University, Beirut, Lebanon Abstract: Long-term oral anticoagulation with vitamin K antagonists is a risk factor for hemorrhagic or thromboembolic complications. Periodic laboratory testing of the international normalized ratio (INR) and subsequent dose adjustment are therefore mandatory. The use of home testing devices to measure INR has been suggested as a potential way to improve the comfort and compliance of patients and their families, the frequency of monitoring and, finally, the management and safety of long-term oral anticoagulation. In pediatric patients, increased doses to obtain and maintain the therapeutic target INR, more frequent adjustments and INR testing, multiple medications, inconsistent nutritional intake, difficult venepunctures, and the need to go to the laboratory for testing (interrupting school and parents’ work attendance) highlight those difficulties. After reviewing the most relevant published studies of INR self-testing and self-management for adult patients and children on oral anticoagulation, it seems that these are valuable and effective strategies of INR control. Despite an unclear relationship between INR control and clinical effects, these self-strategies provide better control of the anticoagulant effect, improve patients’ and their families’ quality of life, and are an appealing solution in terms of cost-effectiveness. Structured education and knowledge evaluation by trained health care professionals are required for children to be able to adjust their dose treatment safely and accurately. However

  11. Arbitrary-ratio power splitter based on nonlinear multimode interference coupler

    International Nuclear Information System (INIS)

    Tajaldini, Mehdi; Jafri, Mohd Zubir Mat

    2015-01-01

    We propose an ultra-compact multimode interference (MMI) power splitter based on nonlinear effects, simulated using nonlinear modal propagation analysis (NMPA) in combination with the finite difference method (FDM), to provide a free choice of splitting ratio. Conventional multimode interference power splitters can obtain only a few discrete ratios. The power splitting ratio may be adjusted continuously by varying the input power from a tunable laser. In effect, an ultra-compact MMI with a simple structure, launched by a tunable nonlinear input, solves the problem of arbitrary splitting ratios in integrated photonic circuits. Silicon-on-insulator (SOI) is used as the platform material due to its high refractive index contrast and centrosymmetric properties. The high-resolution images at the end of the multimode waveguide in the simulated power splitter have a high power balance, whereas a free choice of splitting ratio is not possible under the linear regime in the proposed length range without changing the dimensions for each ratio. The compact dimensions and ideal performance of the device are established according to optimized parameters. The proposed regime can be extended to the design of M×N arbitrary-ratio power splitters for programmable logic devices in all-optical digital signal processing. The results of this study indicate that nonlinear modal propagation analysis solves the miniaturization problem for all-optical devices based on MMI couplers to achieve multiple functions in a compact planar integrated circuit and also overcomes the limitations of previously proposed methods for nonlinear MMI.

  12. Arbitrary-ratio power splitter based on nonlinear multimode interference coupler

    Energy Technology Data Exchange (ETDEWEB)

    Tajaldini, Mehdi [School of Physics, Universiti Sains Malaysia, 11800 Pulau Pinang (Malaysia); Young Researchers and Elite Club, Baft Branch, Islamic Azad University, Baft (Iran, Islamic Republic of); Jafri, Mohd Zubir Mat [School of Physics, Universiti Sains Malaysia, 11800 Pulau Pinang (Malaysia)

    2015-04-24

    We propose an ultra-compact multimode interference (MMI) power splitter based on nonlinear effects, simulated using nonlinear modal propagation analysis (NMPA) in combination with the finite difference method (FDM), to provide a free choice of splitting ratio. Conventional multimode interference power splitters can obtain only a few discrete ratios. The power splitting ratio may be adjusted continuously by varying the input power from a tunable laser. In effect, an ultra-compact MMI with a simple structure, launched by a tunable nonlinear input, solves the problem of arbitrary splitting ratios in integrated photonic circuits. Silicon-on-insulator (SOI) is used as the platform material due to its high refractive index contrast and centrosymmetric properties. The high-resolution images at the end of the multimode waveguide in the simulated power splitter have a high power balance, whereas a free choice of splitting ratio is not possible under the linear regime in the proposed length range without changing the dimensions for each ratio. The compact dimensions and ideal performance of the device are established according to optimized parameters. The proposed regime can be extended to the design of M×N arbitrary-ratio power splitters for programmable logic devices in all-optical digital signal processing. The results of this study indicate that nonlinear modal propagation analysis solves the miniaturization problem for all-optical devices based on MMI couplers to achieve multiple functions in a compact planar integrated circuit and also overcomes the limitations of previously proposed methods for nonlinear MMI.

  13. The TL,NO/TL,CO ratio in pulmonary function test interpretation.

    Science.gov (United States)

    Hughes, J Michael B; van der Lee, Ivo

    2013-02-01

The transfer factor of the lung for nitric oxide (T(L,NO)) is a new test for pulmonary gas exchange. The procedure is similar to the already well-established transfer factor of the lung for carbon monoxide (T(L,CO)). Physiologically, T(L,NO) predominantly measures the diffusion pathway from the alveoli to capillary plasma. In the Roughton-Forster equation, T(L,NO) acts as a surrogate for the membrane diffusing capacity (D(M)). The red blood cell resistance to carbon monoxide uptake accounts for ~50% of the total resistance from gas to blood, but it is much less for nitric oxide. T(L,NO) and T(L,CO) can be measured simultaneously with the single breath technique, and D(M) and pulmonary capillary blood volume (V(c)) can be estimated. T(L,NO), unlike T(L,CO), is independent of oxygen tension and haematocrit. The T(L,NO)/T(L,CO) ratio is weighted towards the D(M)/V(c) ratio and towards α, where α is the ratio of the physical diffusivities of NO and CO (α=1.97). The T(L,NO)/T(L,CO) ratio is increased in heavy smokers, with and without computed tomography evidence of emphysema, and reduced under voluntary restriction of lung expansion; it is expected to be reduced in chronic heart failure. The T(L,NO)/T(L,CO) ratio is a new index of gas exchange that may give additional insights into pulmonary pathology, more directly than the D(M) and V(c) values derived from it with their in-built assumptions.
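The Roughton-Forster partition described in the abstract can be turned into a small calculation. A minimal sketch, assuming T(L,NO) approximates α·D(M) (i.e., negligible red-cell resistance to NO) and an illustrative θCO value; the constants and input values are placeholders, not figures from the paper:

```python
# Estimating membrane diffusing capacity (DM) and capillary blood volume (Vc)
# from simultaneously measured TL,NO and TL,CO via the Roughton-Forster
# relation 1/TL,CO = 1/DM + 1/(theta_CO * Vc). Illustrative sketch only:
# theta_CO and the approximation TL,NO ~= alpha * DM are stated assumptions.

ALPHA = 1.97       # ratio of physical diffusivities NO/CO (from the abstract)
THETA_CO = 0.8     # illustrative CO-blood reaction rate, per mL blood (assumed)

def dm_vc_from_transfer_factors(tl_no, tl_co):
    """Return (DM, Vc) given TL,NO and TL,CO in consistent units."""
    dm = tl_no / ALPHA                     # NO uptake taken as membrane-limited
    inv_theta_vc = 1.0 / tl_co - 1.0 / dm  # red-cell term of Roughton-Forster
    vc = 1.0 / (THETA_CO * inv_theta_vc)
    return dm, vc

# illustrative inputs in mL/min/mmHg (typical orders of magnitude, not data)
dm, vc = dm_vc_from_transfer_factors(tl_no=150.0, tl_co=30.0)
```

A higher T(L,NO)/T(L,CO) ratio at fixed T(L,CO) then maps to a larger D(M)/V(c), consistent with the weighting described above.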

  14. Do exchange rates follow random walks? A variance ratio test of the ...

    African Journals Online (AJOL)

    The random-walk hypothesis in foreign-exchange rates market is one of the most researched areas, particularly in developed economies. However, emerging markets in sub-Saharan Africa have received little attention in this regard. This study applies Lo and MacKinlay's (1988) conventional variance ratio test and Wright's ...
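The Lo and MacKinlay (1988) conventional variance ratio test mentioned in the abstract can be sketched as follows. This is a minimal version under the homoskedasticity assumption, run on synthetic i.i.d. returns rather than exchange-rate data:

```python
import numpy as np

# Lo-MacKinlay variance ratio test (homoskedastic version): VR(q) compares
# the variance of q-period returns with q times the variance of 1-period
# returns; VR(q) ~= 1 under a random walk, and z is asymptotically N(0,1).

def variance_ratio(returns, q):
    r = np.asarray(returns, dtype=float)
    n = r.size
    mu = r.mean()
    var1 = np.sum((r - mu) ** 2) / n
    rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period sums
    varq = np.sum((rq - q * mu) ** 2) / (n * q)     # simple (biased) estimator
    vr = varq / var1
    # asymptotic standard-normal statistic under the iid null
    z = (vr - 1.0) * np.sqrt(3.0 * q * n / (2.0 * (2.0 * q - 1.0) * (q - 1.0)))
    return vr, z

rng = np.random.default_rng(0)
vr, z = variance_ratio(rng.normal(size=2000), q=4)  # random-walk increments
```

For a true random walk, |z| should rarely exceed ~2; rejecting for large |z| is the basis of the test applied in the study.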

  15. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    Science.gov (United States)

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  16. The fast ratio: A rapid measure for testing the dominance of the fast component in the initial OSL signal from quartz

    International Nuclear Information System (INIS)

    Durcan, Julie A.; Duller, Geoff A.T.

    2011-01-01

The signal from the fast component is usually considered preferable for quartz optically stimulated luminescence (OSL) dating; however, its presence in a continuous wave (CW) OSL signal is often assumed rather than verified. This paper presents an objective measure (termed the fast ratio) for testing the dominance of the fast component in the initial part of a quartz OSL signal. The ratio is based upon the photoionisation cross-sections of the fast and medium components and the power of the measurement equipment used to record the OSL signal, and it compares parts of the OSL signal selected to represent the fast and medium components. The ability of the fast ratio to distinguish between samples whose CW-OSL signal is dominated by the fast and non-fast components is demonstrated by comparing the fast ratio with the contribution of the fast component calculated from curve deconvolution of measured OSL signals and from simulated data. The ratio offers a rapid method for screening the large number of OSL signals obtained for individual equivalent dose estimates; it can be calculated and applied as easily as other routine screening methods, and is transferable between different aliquots, samples and measurement equipment. - Highlights: → The fast ratio is a measure which tests the dominance of the fast component in quartz OSL signals. → A fast ratio above 20 implies a CW-OSL signal is dominated by the fast component. → The fast ratio can be easily and rapidly applied to a large number of OSL signals. → Uses include signal comparison, data screening, and identifying the need for further analysis.
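A fast-ratio style screen can be sketched on a synthetic CW-OSL decay curve. The window positions below are illustrative placeholders; in the paper they are derived from the fast and medium photoionisation cross-sections and the stimulation power of the equipment:

```python
import numpy as np

# Sketch of a fast-ratio style screen: compare an early window of the CW-OSL
# decay (fast-dominated) against a later window (medium-dominated) after
# subtracting a late-time background window of equal width. Window choices
# are invented for this example, not the published channel definitions.

def fast_ratio(counts, fast_ch, medium_ch, background_ch):
    L1 = counts[fast_ch].sum()        # fast-dominated window
    L2 = counts[medium_ch].sum()      # medium-dominated window
    L3 = counts[background_ch].sum()  # background window (same width)
    return (L1 - L3) / (L2 - L3)

# synthetic decay: strong fast + weak medium component + flat background
t = np.arange(250) * 0.16             # 40 s of 0.16 s channels
counts = 5000.0 * np.exp(-t / 0.6) + 300.0 * np.exp(-t / 4.0) + 20.0

fr = fast_ratio(counts, slice(0, 3), slice(25, 28), slice(240, 243))
```

For this fast-dominated synthetic signal the ratio comes out well above the threshold of 20 quoted in the highlights.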

  17. A hypothesis-testing framework for studies investigating ontogenetic niche shifts using stable isotope ratios.

    Directory of Open Access Journals (Sweden)

    Caroline M Hammerschlag-Peyer

Full Text Available Ontogenetic niche shifts occur across diverse taxonomic groups, and can have critical implications for population dynamics, community structure, and ecosystem function. In this study, we provide a hypothesis-testing framework combining univariate and multivariate analyses to examine ontogenetic niche shifts using stable isotope ratios. This framework is based on three distinct ontogenetic niche shift scenarios, i.e., (1) no niche shift, (2) niche expansion/reduction, and (3) discrete niche shift between size classes. We developed criteria for identifying each scenario, based on three important resource use characteristics, i.e., niche width, niche position, and niche overlap. We provide an empirical example for each ontogenetic niche shift scenario, illustrating differences in resource use characteristics among different organisms. The present framework provides a foundation for future studies on ontogenetic niche shifts, and can also be applied to examine resource variability among other population sub-groupings (e.g., by sex or phenotype).

  18. The Likelihood Ratio Test of Common Factors under Non-Ideal Conditions

    Directory of Open Access Journals (Sweden)

    Ana M. Angulo

    2011-01-01

Full Text Available The spatial Durbin model occupies an interesting position in spatial econometrics. It is the reduced form of a cross-sectional model with dependence in the errors, and it can be used as the nesting equation in a more general model-selection approach. In particular, the likelihood ratio known as the Common Factor test (LRCOM) can be obtained from this equation. As shown in Mur and Angulo (2006), this test has good properties if the model is correctly specified. However, to the best of our knowledge, there are no references in the literature on the behaviour of this test under non-ideal conditions. Specifically, we study the behaviour of the test under heteroscedasticity, non-normality, endogeneity, dense weight matrices and non-linearity. Our results offer a positive view of the Common Factor test, which appears to be a useful technique in the toolbox of contemporary spatial econometrics.

  19. Chemiluminescence-based multivariate sensing of local equivalence ratios in premixed atmospheric methane-air flames

    Energy Technology Data Exchange (ETDEWEB)

    Tripathi, Markandey M.; Krishnan, Sundar R.; Srinivasan, Kalyan K.; Yueh, Fang-Yu; Singh, Jagdish P.

    2011-09-07

Chemiluminescence emissions from OH*, CH*, C2*, and CO2* formed within the reaction zone of premixed flames depend upon the fuel-air equivalence ratio in the burning mixture. In the present paper, a new partial least squares regression (PLS-R) based multivariate sensing methodology is investigated and compared with an OH*/CH* intensity ratio-based calibration model for sensing equivalence ratio in atmospheric methane-air premixed flames. Five replications of spectral data at nine different equivalence ratios ranging from 0.73 to 1.48 were used in the calibration of both models. During model development, the PLS-R model was initially validated against the calibration data set using the leave-one-out cross validation technique. Since the PLS-R model used the entire raw spectral intensities, it did not need the nonlinear background subtraction of CO2* emission that is required for typical OH*/CH* intensity ratio calibrations. An unbiased spectral data set (not used in the PLS-R model development), covering 28 different equivalence ratio conditions ranging from 0.71 to 1.67, was used to predict equivalence ratios using the PLS-R and the intensity ratio calibration models. It was found that the equivalence ratios predicted with the PLS-R based multivariate calibration model matched the experimentally measured equivalence ratios within 7%, whereas the OH*/CH* intensity ratio calibration grossly underpredicted equivalence ratios in comparison to measured values, especially under rich conditions (equivalence ratios > 1.2). The practical implications of the chemiluminescence-based multivariate equivalence ratio sensing methodology are also discussed.
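The PLS-R calibration idea can be illustrated with a toy NIPALS implementation applied to synthetic "spectra". This is a generic single-response PLS sketch, not the authors' code, and the data are invented:

```python
import numpy as np

# Toy PLS1 regression (NIPALS) of the kind used for multivariate
# equivalence-ratio calibration from raw chemiluminescence spectra.
# The spectra below are synthetic stand-ins, not flame measurements.

def pls1_fit(X, y, n_components):
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    Xr, yr = X.copy(), y.copy()
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)          # weight vector
        t = Xr @ w                      # score
        tt = t @ t
        p = Xr.T @ t / tt               # X loading
        q = (yr @ t) / tt               # y loading
        Xr = Xr - np.outer(t, p)        # deflate
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)   # regression vector (centered space)

rng = np.random.default_rng(1)
phi = np.linspace(0.7, 1.5, 30)                     # "equivalence ratios"
basis = rng.normal(size=(3, 200))                   # 3 latent spectral shapes
C = np.column_stack([phi, phi**2, np.ones_like(phi)])
X = C @ basis + 0.01 * rng.normal(size=(30, 200))   # synthetic raw spectra
B = pls1_fit(X, phi, n_components=3)
pred = (X - X.mean(axis=0)) @ B + phi.mean()
rmse = float(np.sqrt(np.mean((pred - phi) ** 2)))
```

Because PLS uses the full spectrum, the nonlinear background contribution is absorbed into the latent components instead of requiring explicit subtraction, which is the advantage the abstract highlights.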

  20. Improved method for SNR prediction in machine-learning-based test

    NARCIS (Netherlands)

    Sheng, Xiaoqin; Kerkhoff, Hans G.

    2010-01-01

    This paper applies an improved method for testing the signal-to-noise ratio (SNR) of Analogue-to-Digital Converters (ADC). In previous work, a noisy and nonlinear pulse signal is exploited as the input stimulus to obtain the signature results of ADC. By applying a machine-learning-based approach,

  1. Ratio-Based Gradual Aggregation of Data

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem

    2012-01-01

The majority of databases contain large amounts of data, gathered over long intervals of time. In most cases, the data is aggregated so that it can be used for analysis and reporting purposes. The other reason for data aggregation is to reduce data volume in order to avoid over-sized databases that may cause data management and data storage issues. However, non-flexible and ineffective means of data aggregation not only reduce the performance of database queries but also lead to erroneous reporting. This paper presents flexible and effective ratio-based methods for gradual data aggregation in databases. Gradual data aggregation is a process that reduces data volume by converting detailed data into multiple levels of summarized data as the data gets older. The paper also describes implementation strategies for the proposed methods based on standard database technology.
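The idea of ratio-based gradual aggregation can be sketched in a few lines of plain Python. The age threshold, ratio, and grouping rule below are invented for illustration and are not the paper's specific schemes:

```python
# Illustrative sketch of ratio-based gradual aggregation: readings older
# than a threshold are collapsed by a fixed ratio (here 3 raw rows -> 1
# averaged row), reducing volume while keeping coarse history. Thresholds
# and ratios are made up for the example.

def gradually_aggregate(rows, age_threshold, ratio, now):
    """rows: ascending list of (timestamp, value). Collapse rows older than
    age_threshold into groups of `ratio`, keeping each group's mean."""
    old = [r for r in rows if now - r[0] > age_threshold]
    recent = [r for r in rows if now - r[0] <= age_threshold]
    aggregated = []
    for i in range(0, len(old), ratio):
        group = old[i:i + ratio]
        ts = group[0][0]  # aggregated row keyed by the group's first timestamp
        aggregated.append((ts, sum(v for _, v in group) / len(group)))
    return aggregated + recent

rows = [(t, float(t)) for t in range(12)]   # hourly readings at hours 0..11
out = gradually_aggregate(rows, age_threshold=5, ratio=3, now=12)
```

In a real database this grouping would be expressed with standard SQL aggregation over time buckets, as the paper's implementation strategies suggest.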

  2. Impact of Inflation Accounting Application on Key Financial Ratios

    Directory of Open Access Journals (Sweden)

    Aydın KARAPINAR

    2012-03-01

Full Text Available This paper investigates the impact of inflation accounting on key financial ratios. To this end, the financial statements of 132 companies listed on the Istanbul Stock Exchange (ISE) are studied. A paired-samples t test has been conducted on the financial ratios of the companies. The results show that a significant difference between adjusted-cost-based financial ratios and historical-cost-based financial ratios occurs only for current ratios, equity ratios and non-current turnover ratios. The study does not cover companies operating in the financial sector. Companies reporting in accordance with IFRS for the studied period, which spans 2001-2004, are not included in the study either. The study offers valuable information for analysing companies operating in hyperinflationary economies.

  3. Nuclear Power Plant Thermocouple Sensor-Fault Detection and Classification Using Deep Learning and Generalized Likelihood Ratio Test

    Science.gov (United States)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-06-01

In this paper, an online fault detection and classification method is proposed for thermocouples used in nuclear power plants. In the proposed method, fault data are detected by a classification method that separates fault data from normal data. A deep belief network (DBN), a deep learning technique, is applied to classify the fault data. The DBN has a multilayer feature extraction scheme, which is highly sensitive to small variations in the data. Since the classification method alone cannot identify which sensor is faulty, a technique is proposed to identify the faulty sensor from the fault data. Finally, a composite statistical hypothesis test, namely the generalized likelihood ratio test, is applied to compute the fault pattern of the faulty sensor signal based on the magnitude of the fault. The performance of the proposed method is validated with field data obtained from thermocouple sensors of the fast breeder test reactor.

  4. Quantitative structure activity relationships (QSAR) for binary mixtures at non-equitoxic ratios based on toxic ratios-effects curves.

    Science.gov (United States)

    Tian, Dayong; Lin, Zhifen; Yin, Daqiang

    2013-01-01

The present study proposes a QSAR model to predict joint effects at non-equitoxic ratios for binary mixtures containing reactive toxicants, cyanogenic compounds and aldehydes. Toxicity of single chemicals and binary mixtures was measured by quantifying the decrease in light emission from Photobacterium phosphoreum over 15 min. The joint effects of binary mixtures (TU sum) can thus be obtained. The results showed that the relationship between the toxic ratios of the individual chemicals and their joint effects can be described by a normal distribution function. Based on normal distribution equations, the joint effects of binary mixtures at non-equitoxic ratios ( [Formula: see text]) can be predicted quantitatively using the joint effects at equitoxic ratios ( [Formula: see text]). Combined with a QSAR model of [Formula: see text] from our previous work, a novel QSAR model can be proposed to predict the joint effects of mixtures at non-equitoxic ratios ( [Formula: see text]). The proposed model has been validated using additional mixtures other than those used for the development of the model. Predicted and observed results were similar (p>0.05). This study provides an approach to the prediction of joint effects for binary mixtures at non-equitoxic ratios.

  5. HIV testing uptake and prevalence among adolescents and adults in a large home-based HIV testing program in Western Kenya.

    Science.gov (United States)

    Wachira, Juddy; Ndege, Samson; Koech, Julius; Vreeman, Rachel C; Ayuo, Paul; Braitstein, Paula

    2014-02-01

    To describe HIV testing uptake and prevalence among adolescents and adults in a home-based HIV counseling and testing program in western Kenya. Since 2007, the Academic Model Providing Access to Healthcare program has implemented home-based HIV counseling and testing on a large scale. All individuals aged ≥13 years were eligible for testing. Data from 5 of 8 catchments were included in this analysis. We used descriptive statistics and multivariate logistic regression to examine testing uptake and HIV prevalence among adolescents (13-18 years), younger adults (19-24 years), and older adults (≥25 years). There were 154,463 individuals eligible for analyses as follows: 22% adolescents, 19% younger adults, and 59% older adults. Overall mean age was 32.8 years and 56% were female. HIV testing was high (96%) across the following 3 groups: 99% in adolescents, 98% in younger adults, and 94% in older adults (P < 0.001). HIV prevalence was higher (11.0%) among older adults compared with younger adults (4.8%) and adolescents (0.8%) (P < 0.001). Those who had ever previously tested for HIV were less likely to accept HIV testing (adjusted odds ratio: 0.06, 95% confidence interval: 0.05 to 0.07) but more likely to newly test HIV positive (adjusted odds ratio: 1.30, 95% confidence interval: 1.21 to 1.40). Age group differences were evident in the sociodemographic and socioeconomic factors associated with testing uptake and HIV prevalence, particularly, gender, relationship status, and HIV testing history. Sociodemographic and socioeconomic factors were independently associated with HIV testing and prevalence among the age groups. Community-based treatment and prevention strategies will need to consider these factors.
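For readers unfamiliar with how such odds ratios are derived, a crude (unadjusted) odds ratio with a Woolf 95% confidence interval can be computed from a 2x2 table as below. The abstract's estimates are adjusted ORs from multivariate logistic regression, and the counts here are made up:

```python
import math

# Crude odds ratio and Woolf (logit) 95% CI from a 2x2 table. This is the
# unadjusted analogue of the adjusted ORs reported in the abstract; the
# counts are invented for illustration.

def odds_ratio_ci(a, b, c, d):
    """a: exposed with outcome, b: exposed without,
    c: unexposed with outcome, d: unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical counts: previously tested vs not, newly HIV positive vs not
or_, lo, hi = odds_ratio_ci(130, 870, 100, 900)
```

An OR above 1 with a CI excluding 1, as in the abstract's 1.30 (1.21 to 1.40), indicates higher odds of the outcome in the exposed group.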

  6. The distinct element analysis for swelling pressure test of bentonite. Discussion on the effects of wall friction force and aspect ratio of specimen

    International Nuclear Information System (INIS)

    Shimizu, Hiroyuki; Kikuchi, Hirohito; Fujita, Tomoo; Tanai, Kenji

    2011-10-01

For geological isolation systems for radioactive waste, bentonite-based material is assumed to be used as a buffer material. The swelling characteristics of the bentonite-based material are expected to fill the void space around the radioactive waste by swelling. In general, the swelling characteristics and properties of bentonite are evaluated by laboratory tests. However, due to the lack of standardization of testing methods for bentonite, the accuracy and reproducibility of the test results are not sufficiently proved. In this study, bentonite swelling pressure tests were simulated by a newly developed Distinct Element Method (DEM) code, and the effects of wall friction force and the aspect ratio of the bentonite specimen were discussed. As a result, the following was found. In the beginning of the swelling pressure test, since swelling occurs only around the fluid injection side of the specimen, the wall friction force acts only in the swelling area and the specimen moves towards the side opposite the fluid injection side. However, when the entire specimen starts swelling, displacement of the specimen is prevented by the wall friction force, and the specimen is pressed against the pressure measurement side. Then, the swelling pressure measured on the pressure measurement side increases. Such displacement in the specimen is significantly affected by the decrease in mechanical properties and the difference in saturation within the bentonite specimen during fluid infiltration. Moreover, when the aspect ratio of the specimen is large, the displacement of the particles in the specimen becomes large and the area on which the wall frictional force acts is also large. Therefore, the measured swelling pressure increases further as the aspect ratio of the specimen increases. To contribute to the standardization of laboratory test methods for bentonite, these effects of wall friction force revealed by the DEM simulation should be verified through laboratory experiments. (author)

  7. Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.

    Science.gov (United States)

    South, Scott J.

    1988-01-01

    Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…

  8. Performances of the likelihood-ratio classifier based on different data modelings

    NARCIS (Netherlands)

    Chen, C.; Veldhuis, Raymond N.J.

    2008-01-01

    The classical likelihood ratio classifier easily collapses in many biometric applications especially with independent training-test subjects. The reason lies in the inaccurate estimation of the underlying user-specific feature density. Firstly, the feature density estimation suffers from

  9. Design, manufacture and spin test of high contact ratio helicopter transmission utilizing Self-Aligning Bearingless Planetary (SABP)

    Science.gov (United States)

    Folenta, Dezi; Lebo, William

    1988-01-01

A 450 hp high-ratio Self-Aligning Bearingless Planetary (SABP) for a helicopter application was designed, manufactured, and spin tested under NASA contract NAS3-24539. The objective of the program was to conduct research and development work on a high-contact-ratio helical gear SABP to reduce weight and noise and to improve efficiency. The results accomplished include the design, manufacturing, and no-load spin testing of two prototype helicopter transmissions, rated at 450 hp with an input speed of 35,000 rpm and an output speed of 350 rpm. The weight-to-power ratio of these gear units is 0.33 lb/hp. The measured airborne noise at 35,000 rpm input speed and light load is 94 dB at 5 ft. The high-speed, high-contact-ratio SABP transmission appears to be significantly lighter and quieter than contemporary helicopter transmissions. The concept of the SABP is applicable not only to high-ratio helicopter-type transmissions but also to other rotorcraft and aircraft propulsion systems.

  10. A note on imperfect hedging: a method for testing stability of the hedge ratio

    Directory of Open Access Journals (Sweden)

    Michal Černý

    2012-01-01

Full Text Available Companies producing, processing and consuming commodities in the production process often hedge their commodity exposures using derivative strategies based on different, highly correlated underlying commodities. Once the open position in a commodity is hedged using a derivative position with another underlying commodity, the appropriate hedge ratio must be determined in order for the hedge relationship to be as effective as possible. However, it is questionable whether the hedge ratio determined at the inception of the risk management strategy remains stable over the whole period for which the hedging strategy exists. Usually it is assumed that in the short run, the relationship (say, correlation) between the two commodities remains stable, while in the long run it may vary. We propose a method, based on the statistical theory of stability, for online detection of whether market movements of the prices of the commodities involved in the hedge relationship indicate that the hedge ratio may have been subject to a recent change. A change in the hedge ratio decreases the effectiveness of the original hedge relationship and creates a new open position. The proposed method should inform the risk manager that it may be reasonable to adjust the derivative strategy to reflect market conditions after the change in the hedge ratio.

  11. Tests and Confidence Intervals for an Extended Variance Component Using the Modified Likelihood Ratio Statistic

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Frydenberg, Morten; Jensen, Jens Ledet

    2005-01-01

The large deviation modified likelihood ratio statistic is studied for testing a variance component equal to a specified value. Formulas are presented in the general balanced case, whereas in the unbalanced case only the one-way random effects model is studied. Simulation studies are presented, showing that the normal approximation to the large deviation modified likelihood ratio statistic gives confidence intervals for variance components with coverage probabilities very close to the nominal confidence coefficient.

  12. A comparison between the hyomental distance ratio, the ratio of height to thyromental distance, the modified Mallampati classification and the upper lip bite test in predicting difficult laryngoscopy in patients undergoing general anesthesia

    Directory of Open Access Journals (Sweden)

    Azim Honarmand

    2014-01-01

Full Text Available Background: Failed intubation is an important source of anesthesia-related mortality. The aim of the present study was to compare the ability of the following pre-operative airway predictive indices, in isolation and in combination, to predict difficult visualization of the larynx: the modified Mallampati test (MMT), the ratio of height to thyromental distance (RHTMD), the hyomental distance ratio (HMDR), and the upper-lip-bite test (ULBT). Materials and Methods: We collected data on 525 consecutive patients scheduled for elective surgery under general anesthesia requiring endotracheal intubation and evaluated all four factors before surgery. A skilled anesthesiologist, blinded to the pre-operative airway assessments, performed the laryngoscopy and grading (as per Cormack and Lehane's classification). Sensitivity, specificity, and positive predictive value for each airway predictor, in isolation and in combination, were established. Results: The most sensitive of the single tests was the ULBT, with a sensitivity of 90.2%. The hyomental distance at the extreme of head extension was the least sensitive of the single tests, with a sensitivity of 56.9%. The HMDR had a sensitivity of 86.3%. The ULBT had the highest negative predictive value and the largest area under the receiver-operating characteristic curve (AUC) among the single predictors. The AUC for the ULBT, HMDR and RHTMD was significantly larger than for the MMT (P < 0.05). Conclusion: The HMDR is comparable with the RHTMD and ULBT for prediction of difficult laryngoscopy in the general population; all three performed significantly better than the MMT.

  13. Financial Ratios and Perceived Household Financial Satisfaction

    Directory of Open Access Journals (Sweden)

    Scott Garrett

    2013-08-01

Full Text Available This paper tests the relative strength of three objective measures of financial health (the solvency, liquidity, and investment asset ratios) in predicting a household's subjective feeling of current financial satisfaction. Using a sample of 6,923 respondents in the 2008 Health and Retirement Study, this paper presents evidence of two main findings: (1) the solvency ratio is most strongly associated with financial satisfaction levels based on a cross-sectional design, and (2) changes in the investment asset ratio are most strongly associated with changes in financial satisfaction over time.

  14. Probabilistic fatigue life prediction methodology for notched components based on simple smooth fatigue tests

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z. R.; Li, Z. X. [Dept.of Engineering Mechanics, Jiangsu Key Laboratory of Engineering Mechanics, Southeast University, Nanjing (China); Hu, X. T.; Xin, P. P.; Song, Y. D. [State Key Laboratory of Mechanics and Control of Mechanical Structures, Nanjing University of Aeronautics and Astronautics, Nanjing (China)

    2017-01-15

A methodology for probabilistic fatigue life prediction of notched components based on smooth specimens is presented. Weakest-link theory incorporating the Walker strain model is utilized in this approach. The effects of stress ratio and stress gradient are considered. The Weibull distribution and the median rank estimator are used to describe fatigue statistics. Fatigue tests under different stress ratios were conducted on smooth and notched specimens of titanium alloy TC-1-1. The proposed procedures were checked against the test data for TC-1-1 notched specimens. Predictions at a 50% survival rate all fall within a factor-of-two scatter band of the test results.

  15. Searching for degenerate Higgs bosons a profile likelihood ratio method to test for mass-degenerate states in the presence of censored data and uncertainties

    CERN Document Server

    David, André; Petrucciani, Giovanni

    2015-01-01

    Using the likelihood ratio test statistic, we present a method which can be employed to test the hypothesis of a single Higgs boson using the matrix of measured signal strengths. This method can be applied in the presence of censored data and takes into account uncertainties on the measurements. The p-value against the hypothesis of a single Higgs boson is defined from the expected distribution of the test statistic, generated using pseudo-experiments. The applicability of the likelihood-based test is demonstrated using numerical examples with uncertainties and missing matrix elements.
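The recipe in the abstract, a likelihood ratio test statistic whose null distribution is generated with pseudo-experiments, can be illustrated on a toy Gaussian model. The two-measurement setup below is a stand-in, not the Higgs signal-strength matrix:

```python
import numpy as np

# Toy version of the pseudo-experiment recipe: define -2 ln(likelihood ratio)
# for "one common mean" (null) vs "two independent means" (alternative) of
# two Gaussian measurements with known sigma, generate the statistic's null
# distribution by simulation, and read off a p-value.

rng = np.random.default_rng(2)

def lr_statistic(x1, x2, sigma=1.0):
    common = 0.5 * (x1 + x2)                 # MLE of the shared mean
    # alternative hypothesis fits each point exactly, so -2 ln LR reduces to:
    return ((x1 - common) ** 2 + (x2 - common) ** 2) / sigma ** 2

# "observed" pseudo-data, generated here under the alternative (two means)
q_obs = lr_statistic(1.0, 3.5)

# null distribution: both measurements share one mean
q_null = np.array(
    [lr_statistic(*rng.normal(0.0, 1.0, size=2)) for _ in range(20000)]
)
p_value = float(np.mean(q_null >= q_obs))
```

In this simple case q follows a chi-squared(1) distribution under the null, so the simulated p-value can be cross-checked analytically; with censored data and measurement uncertainties, as in the paper, the pseudo-experiment route is what remains tractable.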

  16. Ratio-based estimators for a change point in persistence.

    Science.gov (United States)

    Halunga, Andreea G; Osborn, Denise R

    2012-11-01

We study estimation of the date of change in persistence, from [Formula: see text] to [Formula: see text] or vice versa. Contrary to statements in the original papers, our analytical results establish that the ratio-based break point estimators of Kim [Kim, J.Y., 2000. Detection of change in persistence of a linear time series. Journal of Econometrics 95, 97-116], Kim et al. [Kim, J.Y., Belaire-Franch, J., Badillo Amador, R., 2002. Corrigendum to "Detection of change in persistence of a linear time series". Journal of Econometrics 109, 389-392] and Busetti and Taylor [Busetti, F., Taylor, A.M.R., 2004. Tests of stationarity against a change in persistence. Journal of Econometrics 123, 33-66] are inconsistent when a mean (or other deterministic component) is estimated for the process. In such cases, the estimators converge to random variables with upper bound given by the true break date when persistence changes from [Formula: see text] to [Formula: see text]. A Monte Carlo study confirms the large sample downward bias and also finds substantial biases in moderate sized samples, partly due to properties at the end points of the search interval.

  17. Cost/Performance Ratio Achieved by Using a Commodity-Based Cluster

    Science.gov (United States)

    Lopez, Isaac

    2001-01-01

Researchers at the NASA Glenn Research Center acquired a commodity cluster based on Intel Corporation processors to compare its performance with a traditional UNIX cluster in the execution of aeropropulsion applications. Since the cost differential of the clusters was significant, a cost/performance ratio was calculated. After executing a propulsion application on both clusters, the researchers demonstrated a 9.4 cost/performance ratio in favor of the Intel-based cluster. These researchers utilize the Aeroshark cluster as one of the primary testbeds for developing NPSS parallel application codes and system software. The Aeroshark cluster provides 64 Intel Pentium II 400-MHz processors, housed in 32 nodes. Recently, APNASA, a code developed by a Government/industry team for the design and analysis of turbomachinery systems, was used for a simulation on Glenn's Aeroshark cluster.

  18. Determination of Geometrical REVs Based on Volumetric Fracture Intensity and Statistical Tests

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2018-05-01

Full Text Available This paper presents a method to estimate a representative element volume (REV) of a fractured rock mass based on the volumetric fracture intensity P32 and statistical tests. A 150 m × 80 m × 50 m 3D fracture network model was generated based on field data collected at the Maji dam site using the rectangular window sampling method. The volumetric fracture intensity P32 of each cube was calculated by varying the cube location in the generated 3D fracture network model and varying the cube side length from 1 to 20 m, and the distribution of the P32 values was described. The size effect and spatial effect of the fractured rock mass were studied; the P32 values from the same cube size at different locations were significantly different, and the fluctuation in P32 values clearly decreases as the cube side length increases. In this paper, a new method that comprehensively considers the anisotropy of rock masses, simplicity of calculation and differences between different methods was proposed to estimate the geometrical REV size. The geometrical REV size of the fractured rock mass was determined based on the volumetric fracture intensity P32 and two statistical test methods, namely, the likelihood ratio test and the Wald–Wolfowitz runs test. The results of the two statistical tests were substantially different; critical cube sizes of 13 m and 12 m were estimated by the Wald–Wolfowitz runs test and the likelihood ratio test, respectively. Because the different test methods emphasize different considerations and impact factors, the larger cube size accepted by both tests, 13 m, was selected as the geometrical REV size of the fractured rock mass at the Maji dam site in China.
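The Wald-Wolfowitz runs test used in the REV decision can be sketched with its normal approximation. The P32-like values below are invented for illustration, and dichotomizing about the median is one common way to form the binary sequence:

```python
import math

# Wald-Wolfowitz runs test (normal approximation): counts runs of a binary
# sequence and compares the count with its expectation under randomness.
# The sample values are invented stand-ins for P32 measurements.

def runs_test_z(seq):
    """seq: sequence of booleans. Returns the normal-approximation z score."""
    n1 = sum(seq)
    n2 = len(seq) - n1
    n = n1 + n2
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    mu = 2.0 * n1 * n2 / n + 1.0
    var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mu) / math.sqrt(var)

values = [3.1, 2.9, 3.3, 3.0, 3.2, 2.8, 3.4, 2.7, 3.5, 3.0, 2.6, 3.6]
med = sorted(values)[len(values) // 2]            # dichotomize about median
z = runs_test_z([v > med for v in values])
```

|z| beyond roughly 1.96 rejects randomness at the 5% level; applied to P32 samples from successive cube sizes, failure to reject supports treating the size as a candidate REV.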

  19. SOLVENCY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU–OLARIU

    2016-08-01

Full Text Available The current study evaluates the potential of the solvency ratio for predicting corporate bankruptcy. The research focuses on Romania and, in particular, on Timis County. The interest in the solvency ratio was based on the recommendations of the scientific literature, as well as on the availability of its values to all stakeholders. The event of interest was the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were performed on 2 paired samples totalling 1176 companies. The methodology for evaluating the potential of the solvency ratio was based on the Area Under the ROC Curve (0.646) and the general accuracy ensured by the ratio (64.5% out-of-sample accuracy). The results confirm the practical utility of the solvency ratio in the prediction of bankruptcy.
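The Area Under the ROC Curve quoted above has a simple rank-based (Mann-Whitney) interpretation: the probability that a randomly chosen bankrupt firm receives a higher risk score than a randomly chosen surviving one. A minimal sketch with hypothetical scores, not the study's data:

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the fraction of (positive,
    negative) pairs ranked correctly, counting ties as half."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores, oriented so higher = higher bankruptcy risk:
bankrupt = [0.9, 0.8, 0.6, 0.55]
healthy  = [0.7, 0.4, 0.3, 0.2]
print(auc(bankrupt, healthy))   # → 0.875
```

An AUC of 0.5 corresponds to a useless classifier and 1.0 to perfect separation, which puts the reported 0.646 in context as modest but better than chance.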

  20. Using non-performing loan ratios as default rates in the estimation of credit losses and macroeconomic credit risk stress testing: A case from Turkey

    Directory of Open Access Journals (Sweden)

    Guray Kucukkocaoglu

    2016-02-01

Full Text Available In this study, inspired by the Credit Portfolio View approach, we develop an econometric credit risk model to estimate the credit loss distributions of the Turkish banking system under baseline and stress macro scenarios, substituting non-performing loan (NPL) ratios for default rates. Since customer-number-based historical default rates are not available for the whole Turkish banking system's credit portfolio, we used NPL ratios as the dependent variable instead of default rates, a common practice in countries where historical default rates are not available. Although using NPL ratios as default rates raises several problems, such as underestimating portfolio losses because total credit portfolios are not homogeneous and because non-performing loans are transferred from banks' balance sheets to asset management companies, our aim is to underline and limit some of the ignored problems of using accounting-based NPL ratios as default rates in macroeconomic credit risk modeling. The developed models confirm the strong statistical relationship between the systematic component of credit risk and macroeconomic variables in Turkey. Stress test results are also compatible with past experience.

  1. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    Science.gov (United States)

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

Most dimensionality reduction techniques are based on one metric or one kernel, hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a kernel from a set of base kernels that are seen as different descriptions of the data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions, and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on regularized trace ratio, termed MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels, among which some may not be suitable for the given data. Solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness on benchmark text, image and sound datasets, in supervised, unsupervised and semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Performance evaluation of tile-based Fisher Ratio analysis using a benchmark yeast metabolome dataset.

    Science.gov (United States)

    Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E

    2016-08-12

Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had been previously analyzed in great detail, albeit with a brute-force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed far more rapidly: about one week versus roughly one year for the prior studies of this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology, while all but one of the nineteen benchmarked false-positive metabolites previously identified were consistently excluded. Copyright © 2016 Elsevier B.V. All rights reserved.
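The core of an F-ratio screen with a permutation-based null threshold, as described above, can be sketched as follows. The two-class F-ratio and the class-label shuffling are standard techniques; the metabolite signal values are hypothetical, not the benchmark data:

```python
import random

def f_ratio(class_a, class_b):
    """Two-class F-ratio: between-class variance (df = 1) over
    pooled within-class variance."""
    na, nb = len(class_a), len(class_b)
    ma, mb = sum(class_a) / na, sum(class_b) / nb
    grand = (sum(class_a) + sum(class_b)) / (na + nb)
    between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    within = (sum((x - ma) ** 2 for x in class_a)
              + sum((x - mb) ** 2 for x in class_b)) / (na + nb - 2)
    return between / within

def null_threshold(class_a, class_b, n_perm=999, quantile=0.95):
    """Estimate an F-ratio significance threshold by permuting class labels:
    F-ratios from shuffled labels form the null distribution."""
    pooled = list(class_a) + list(class_b)
    na = len(class_a)
    nulls = []
    for _ in range(n_perm):
        random.shuffle(pooled)
        nulls.append(f_ratio(pooled[:na], pooled[na:]))
    nulls.sort()
    return nulls[int(quantile * n_perm)]

random.seed(0)
repressed   = [5.1, 4.9, 5.3, 5.0, 5.2]   # hypothetical metabolite signal
derepressed = [7.8, 8.1, 7.9, 8.3, 8.0]
f = f_ratio(repressed, derepressed)
print(f > null_threshold(repressed, derepressed))   # a "hit": above the null
```

Variables whose F-ratio falls below the permutation threshold are discarded as not class distinguishing, which is the filtering step the abstract credits for a trustworthy hit table.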

  3. Development and evaluation of a regression-based model to predict cesium concentration ratios for freshwater fish

    International Nuclear Information System (INIS)

    Pinder, John E.; Rowan, David J.; Rasmussen, Joseph B.; Smith, Jim T.; Hinton, Thomas G.; Whicker, F.W.

    2014-01-01

Data from published studies and World Wide Web sources were combined to produce and test a regression model to predict Cs concentration ratios for freshwater fish species. The accuracies of predicted concentration ratios, computed using 1) species trophic levels obtained from random resampling of known food items and 2) K concentrations in the water for 207 fish from 44 species and 43 locations, were tested against independent observations of ratios for 57 fish from 17 species and 25 locations. Accuracy was assessed as the percentage of observed-to-predicted ratios within a factor of 2 or 3. Conservatism, expressed as the lack of under-prediction, was assessed as the percentage of observed-to-predicted ratios that were less than 2 or less than 3. The model's median observed-to-predicted ratio was 1.26, which was not significantly different from 1, and 50% of the ratios were between 0.73 and 1.85. The percentages of ratios within factors of 2 and 3 were 67 and 82%, respectively. The percentages of ratios that were <2 and <3 were 79 and 88%, respectively. An example for Perca fluviatilis demonstrated that increased prediction accuracy could be obtained when more detailed knowledge of diet was available to estimate trophic level. - Highlights: • We developed a model to predict Cs concentration ratios for freshwater fish species. • The model uses only two variables to predict a species CR for any location. • One variable is the K concentration in the freshwater. • The other is a species mean trophic level measure easily obtained from fishbase.org. • The median observed-to-predicted ratio for 57 independent test cases was 1.26
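The accuracy and conservatism summaries described above (shares of observed-to-predicted ratios within, or below, factors of 2 and 3) are straightforward to reproduce. The concentration-ratio values below are hypothetical, not the paper's data:

```python
def ratio_accuracy(observed, predicted):
    """Summarize observed/predicted ratios: median, share within a factor
    of 2 or 3 (accuracy), and share below 2 or 3 (conservatism, i.e.
    lack of under-prediction)."""
    ratios = sorted(o / p for o, p in zip(observed, predicted))
    n = len(ratios)
    within = lambda f: sum(1 / f <= r <= f for r in ratios) / n
    below = lambda f: sum(r < f for r in ratios) / n
    return {"median": ratios[n // 2],
            "within2": within(2), "within3": within(3),
            "below2": below(2), "below3": below(3)}

# Hypothetical concentration ratios (L/kg), not the paper's observations:
obs  = [1200, 800, 2500, 400, 950]
pred = [1000, 900, 1100, 1300, 700]
print(ratio_accuracy(obs, pred))
```

A "within2" of 1.0 would mean every prediction is within a factor of 2 of the observation; a "below2" of 1.0 would mean the model never under-predicts by more than that factor.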

  4. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    Science.gov (United States)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas, such as psychology, education, medicine and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypothesis of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  5. Frequency of Testing for Dyslipidemia: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

    Background Dyslipidemias include high levels of total cholesterol, low-density lipoprotein (LDL) cholesterol, and triglycerides and low levels of high-density lipoprotein (HDL) cholesterol. Dyslipidemia is a risk factor for cardiovascular disease, which is a major contributor to mortality in Canada. Approximately 23% of the 2009/11 Canadian Health Measures Survey (CHMS) participants had a high level of LDL cholesterol, with prevalence increasing with age, and approximately 15% had a total cholesterol to HDL ratio above the threshold. Objectives To evaluate the frequency of lipid testing in adults not diagnosed with dyslipidemia and in adults on treatment for dyslipidemia. Research Methods A systematic review of the literature set out to identify randomized controlled trials (RCTs), systematic reviews, health technology assessments (HTAs), and observational studies published between January 1, 2000, and November 29, 2012, that evaluated the frequency of testing for dyslipidemia in the 2 populations. Results Two observational studies assessed the frequency of lipid testing, 1 in individuals not on lipid-lowering medications and 1 in treated individuals. Both studies were based on previously collected data intended for a different objective and, therefore, no conclusions could be reached about the frequency of testing at intervals other than the ones used in the original studies. Given this limitation and generalizability issues, the quality of evidence was considered very low. No evidence for the frequency of lipid testing was identified in the 2 HTAs included. Canadian and international guidelines recommend testing for dyslipidemia in individuals at an increased risk for cardiovascular disease. The frequency of testing recommended is based on expert consensus. Conclusions Conclusions on the frequency of lipid testing could not be made based on the 2 observational studies. 
Current guidelines recommend lipid testing in adults with increased cardiovascular risk, with the recommended frequency of testing based on expert consensus.

  6. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are given in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
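A minimal version of such a simulation, assuming Poisson counting data and the standard Wald boundaries log(β/(1−α)) and log((1−β)/α), might look like the sketch below. The background and source count rates are illustrative, not taken from the report:

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw one Poisson variate (Knuth's method, fine for small means)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def sprt_poisson(mean0, mean1, alpha=0.05, beta=0.05,
                 max_intervals=100, rng=random):
    """One SPRT trial on simulated counts drawn from the source-present
    distribution. mean0/mean1: expected counts per interval under
    background-only (H0) and background-plus-source (H1).
    Returns (decision, intervals used)."""
    lo = math.log(beta / (1 - alpha))      # accept-H0 (clear) boundary
    hi = math.log((1 - beta) / alpha)      # accept-H1 (alarm) boundary
    llr = 0.0
    for i in range(1, max_intervals + 1):
        k = poisson_sample(mean1, rng)
        # log-likelihood-ratio increment for one Poisson observation
        llr += k * math.log(mean1 / mean0) - (mean1 - mean0)
        if llr >= hi:
            return "alarm", i
        if llr <= lo:
            return "clear", i
    return "undecided", max_intervals

random.seed(1)
trials = [sprt_poisson(mean0=2.0, mean1=6.0) for _ in range(1000)]
detect = sum(d == "alarm" for d, _ in trials) / len(trials)
avg_intervals = sum(n for _, n in trials) / len(trials)
print(detect, avg_intervals)   # detection probability and mean trial length
```

Because the log-likelihood ratio drifts upward when a source is truly present, most trials alarm after only a few counting intervals, which is the counting-time advantage over a fixed-length single interval test that the abstract describes.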

  7. An Effective Strategy to Build Up a Balanced Test Suite for Spectrum-Based Fault Localization

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

Full Text Available During past decades, many automated software fault diagnosis techniques, including Spectrum-Based Fault Localization (SBFL), have been proposed to improve the efficiency of software debugging. In SBFL, suspiciousness calculation is closely related to the numbers of failed and passed test cases. Studies have shown that the ratio of failed to passed test cases has a more significant impact on the accuracy of SBFL than the total number of test cases, and that a balanced test suite is more beneficial to the accuracy of SBFL. Based on theoretical analysis, we propose a PNF (Passed test cases, Not execute Faulty statement) strategy to reduce a test suite and build a more balanced one for SBFL, which can be used in regression testing. We evaluated the strategy in experiments on the Siemens and Space programs. The experiments indicated that our PNF strategy can be used to construct a new test suite effectively. Compared with the original test suite, the new one is smaller (on average, 90% of test cases were removed in our experiments) and has a more balanced ratio of failed to passed test cases, while preserving the same statement coverage and fault localization accuracy.
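Suspiciousness in SBFL is computed per statement from the failed and passed test cases that cover it. The widely used Tarantula formula (one common choice for illustration, not necessarily the one the paper uses) makes the dependence on the failed/passed ratio explicit:

```python
def tarantula(failed_cov, passed_cov, total_failed, total_passed):
    """Tarantula suspiciousness for one statement: the normalized rate of
    failed tests covering it, relative to the passed-test rate."""
    if total_failed == 0:
        return 0.0
    fail_rate = failed_cov / total_failed
    pass_rate = passed_cov / total_passed if total_passed else 0.0
    if fail_rate + pass_rate == 0:
        return 0.0
    return fail_rate / (fail_rate + pass_rate)

# Hypothetical coverage: statement -> (failed tests covering it,
#                                      passed tests covering it)
coverage = {"s1": (3, 30), "s2 (faulty)": (3, 2), "s3": (1, 25)}
total_failed, total_passed = 3, 40
ranking = sorted(coverage,
                 key=lambda s: tarantula(*coverage[s],
                                         total_failed, total_passed),
                 reverse=True)
print(ranking)   # → ['s2 (faulty)', 's1', 's3']
```

Because both rates are normalized by the class totals, an unbalanced suite (very few failed tests against many passed ones) distorts the ranking, which is the imbalance the PNF strategy above targets.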

  8. Fully iterative scatter corrected digital breast tomosynthesis using GPU-based fast Monte Carlo simulation and composition ratio update

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyungsang; Ye, Jong Chul, E-mail: jong.ye@kaist.ac.kr [Bio Imaging and Signal Processing Laboratory, Department of Bio and Brain Engineering, KAIST 291, Daehak-ro, Yuseong-gu, Daejeon 34141 (Korea, Republic of); Lee, Taewon; Cho, Seungryong [Medical Imaging and Radiotherapeutics Laboratory, Department of Nuclear and Quantum Engineering, KAIST 291, Daehak-ro, Yuseong-gu, Daejeon 34141 (Korea, Republic of); Seong, Younghun; Lee, Jongha; Jang, Kwang Eun [Samsung Advanced Institute of Technology, Samsung Electronics, 130, Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 443-803 (Korea, Republic of); Choi, Jaegu; Choi, Young Wook [Korea Electrotechnology Research Institute (KERI), 111, Hanggaul-ro, Sangnok-gu, Ansan-si, Gyeonggi-do, 426-170 (Korea, Republic of); Kim, Hak Hee; Shin, Hee Jung; Cha, Joo Hee [Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 88 Olympic-ro, 43-gil, Songpa-gu, Seoul, 138-736 (Korea, Republic of)

    2015-09-15

    accurate under a variety of conditions. Our GPU-based fast MCS implementation took approximately 3 s to generate each angular projection for a 6 cm thick breast, which is believed to make this process acceptable for clinical applications. In addition, the clinical preferences of three radiologists were evaluated; the preference for the proposed method compared to the preference for the convolution-based method was statistically meaningful (p < 0.05, McNemar test). Conclusions: The proposed fully iterative scatter correction method and the GPU-based fast MCS using tissue-composition ratio estimation successfully improved the image quality within a reasonable computational time, which may potentially increase the clinical utility of DBT.

  9. Should the diagnosis of COPD be based on a single spirometry test?

    NARCIS (Netherlands)

    Schermer, T.R.; Robberts, B.; Crockett, A.J.; Thoonen, B.P.; Lucas, A.; Grootens, J.; Smeele, I.J.; Thamrin, C.; Reddel, H.K.

    2016-01-01

    Clinical guidelines indicate that a chronic obstructive pulmonary disease (COPD) diagnosis is made from a single spirometry test. However, long-term stability of diagnosis based on forced expiratory volume in 1 s over forced vital capacity (FEV1/FVC) ratio has not been reported. In primary care

  10. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

Individual sample heterogeneity is one of the biggest obstacles to biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analytical power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. A fuzzy logic-based damage identification method for simply-supported bridge using modal shape ratios

    Directory of Open Access Journals (Sweden)

    Hanbing Liu

    2012-08-01

Full Text Available A fuzzy logic system (FLS) is established for damage identification of a simply supported bridge. A novel damage indicator is developed based on the ratios of mode shape components before and after damage. A numerical simulation of a simply supported bridge is presented to demonstrate the memory, inference and anti-noise ability of the proposed method. The bridge is divided into eight elements and nine nodes, and the damage indicator vector at characteristic nodes is used as the input measurement of the FLS. Results reveal that the FLS can detect damage in training patterns with an accuracy of 100%. For other test patterns, the FLS also possesses favorable inference ability; the identification accuracy for a single damage location is up to 93.75%. Tests with noise-simulated data show that the FLS possesses favorable anti-noise ability.

  12. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  13. Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.

    Science.gov (United States)

    Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W

    2006-10-01

A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and likelihood-ratio goodness-of-fit tests.
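For a goodness-of-fit test, one standard effect-size measure is Cohen's w = sqrt(chi²/N), applicable to both the Pearson chi-squared and the likelihood-ratio (G) statistic. This is a common choice rather than necessarily the paper's specific proposal, and the counts below are illustrative:

```python
import math

def chi_squared_gof(observed, expected_probs):
    """Pearson chi-squared goodness-of-fit statistic."""
    n = sum(observed)
    return sum((o - n * p) ** 2 / (n * p)
               for o, p in zip(observed, expected_probs))

def g_statistic(observed, expected_probs):
    """Likelihood-ratio goodness-of-fit statistic G = 2 * sum o*ln(o/e),
    with empty cells contributing zero."""
    n = sum(observed)
    return 2 * sum(o * math.log(o / (n * p))
                   for o, p in zip(observed, expected_probs) if o)

def cohens_w(chi2, n):
    """Cohen's w effect size: sqrt(chi2 / N).
    Conventional benchmarks: 0.1 small, 0.3 medium, 0.5 large."""
    return math.sqrt(chi2 / n)

observed = [44, 56]                    # e.g. 100 coin flips vs a fair coin
chi2 = chi_squared_gof(observed, [0.5, 0.5])
print(round(chi2, 2), round(cohens_w(chi2, sum(observed)), 2))   # → 1.44 0.12
```

The point the effect-size literature makes is visible here: with N = 100 the deviation is not significant and w is small, yet the same w with a huge N would yield a significant chi-squared, so reporting w separates practical magnitude from sample size.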

  14. Neonatal Acid-Base Status in Fetuses with Abnormal Vertebro- and Cerebro-Placental Ratios.

    Science.gov (United States)

    Morales-Roselló, José; Khalil, Asma; Ferri-Folch, Blanca; Perales-Marín, Alfredo

    2015-01-01

    A low cerebro-placental ratio (CPR) at term suggests the existence of failure to reach growth potential (FRGP) with a higher risk of poor neonatal acid-base status. This study aimed to evaluate whether similar findings were also seen in the vertebral artery (vertebro-placental ratio, VPR), supplying 30% of the cerebral flow. We studied term fetuses classified into groups according to birth weight (BW), CPR and VPR. BW was expressed in centiles and ratios in multiples of the median (MoM). Subsequently, associations with neonatal pH values were evaluated by means of regression curves and Mann-Whitney tests. VPR MoM correlated with BW centiles (p < 0.0001, R2 = 0.042) and its distribution resembled that of CPR MoM (p < 0.001). When both arteries were compared, adequate-for-gestational-age (AGA) fetuses with either low CPR or low VPR had lower neonatal venous pH values (p < 0.05, p < 0.01, respectively). However, in case of small-for-gestational-age (SGA) fetuses, only those with low VPR had significantly lower neonatal arterial and venous pH values (p < 0.05). Blood flow in the vertebral artery mimics that in the middle cerebral artery supporting the FRGP model. Both CPR and VPR identify AGA fetuses with lower neonatal pH values, but only VPR identifies SGA with lower pH values. Hypoxemia might be reflected as a generalized cerebral vasodilation demonstrated as low CPR and VPR.

  15. Correlation Between Cometary Gas/Dust Ratios and Heliocentric Distance

    Science.gov (United States)

    Harrington, Olga; Womack, Maria; Lastra, Nathan

    2017-10-01

We compiled CO-based gas/dust ratios for several comets out to heliocentric distances, rh, of 8 au to probe whether there is a noticeable change in comet behavior over the range where water-ice sublimation starts. Previously, gas/dust ratios were calculated for an ensemble of comets using Q(CO2)/efp values derived from infrared measurements, which showed that the gas/dust ratio follows an rh^-2 dependence within 4 au but is flat at greater distances (Bauer et al. 2015). Our project focuses on gas/dust ratios for which CO is assumed to be the dominant gas, in order to test whether similar breaks in slope occur for CO. The gas/dust ratios were calculated from measurements of CO production rates (mostly from millimeter-wavelength spectroscopy) and reflected sunlight of comets (mostly via reported visual magnitudes of dusty comets). We present our new CO-based gas/dust ratios at different heliocentric distances, compare them to existing CO2-based gas/dust ratios, and discuss implications for CO-driven and CO2-driven activity. O.H. acknowledges support from the Hartmann Student Travel Grant program. M.W. acknowledges support from NSF grant AST-1615917.

  16. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    Science.gov (United States)

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

There are many factors that influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectrum follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for spectral line intensity changes caused by system parameters such as laser power and receiving spectral efficiency. Because the measurement data were limited and nonlinear, we used support vector machine (SVM) regression. The experimental results showed that the method could improve the stability and accuracy of quantitative analysis with LIBS; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectrum, etc., and provides a data processing reference for real-time online LIBS quantitative analysis technology.
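A signal-to-background ratio is simply the line intensity divided by the neighbouring continuum, and the calibration step then regresses concentration on S/B. The paper used SVM regression; the sketch below substitutes an ordinary least-squares line as a simpler stand-in, with hypothetical spectrum and calibration data:

```python
def signal_to_background(spectrum, peak, bg_windows):
    """Ratio of the line intensity at index `peak` to the mean continuum
    intensity in the background windows flanking the line."""
    bg = [spectrum[i] for lo, hi in bg_windows for i in range(lo, hi)]
    return spectrum[peak] / (sum(bg) / len(bg))

def fit_line(x, y):
    """Ordinary least squares y = a*x + b (a stand-in for the paper's
    SVM regression in the calibration step)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Hypothetical calibration: S/B measured for standards of known
# concentration (mg/L), then used to predict an unknown sample.
sb   = [1.8, 3.1, 4.4, 5.9, 7.2]
conc = [10, 20, 30, 40, 50]
a, b = fit_line(sb, conc)
print(round(a * 5.0 + b, 1))   # → 33.8 (predicted mg/L at S/B = 5.0)
```

Because both the line and its local background scale together with shot-to-shot fluctuations in laser power, the ratio cancels much of that variation before the regression is fitted, which is the compensation effect the abstract describes.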

  17. Improved protection system for phase faults on marine vessels based on ratio between negative sequence and positive sequence of the fault current

    DEFF Research Database (Denmark)

    Ciontea, Catalin-Iosif; Hong, Qiteng; Booth, Campbell

    2018-01-01

This study presents a new method to protect the radial feeders on marine vessels. The proposed protection method is effective against phase–phase (PP) faults and is based on evaluation of the ratio between the negative sequence and positive sequence of the fault currents. It is shown that the magnitude of the introduced ratio increases significantly during a PP fault, hence indicating the fault presence in an electric network. The theoretical background of the new method of protection is first discussed, based on which the new protection algorithm is then described. The proposed algorithm is implemented in a programmable digital relay embedded in a hardware-in-the-loop (HIL) test set-up that emulates a typical maritime feeder using a real-time digital simulator. The HIL set-up allows testing of the new protection method under a wide range of faults and network conditions.

  18. Financial Ratio and Its Influence to Profitability in Islamic Banks.

    Directory of Open Access Journals (Sweden)

    Erika Amelia

    2015-10-01

Full Text Available This research aims to analyze the influence of the Capital Adequacy Ratio (CAR), Non Performing Financing (NPF), Financing to Deposit Ratio (FDR) and Biaya Operasional Pendapatan Operasional (BOPO) on Return on Asset (ROA) in Bank Muamalat Indonesia and Bank Syariah Mega. The data analysis method used in this research is multiple regression analysis. The test results show that CAR, NPF, FDR and BOPO simultaneously affect ROA. Based on the t-statistic test results, it was concluded that CAR, NPF and FDR individually have no significant effect on ROA, while BOPO individually has a significant effect on ROA. DOI: 10.15408/aiq.v7i2.1700

  19. Chloride accelerated test: influence of silica fume, water/binder ratio and concrete cover thickness

    Directory of Open Access Journals (Sweden)

    E. Pereira

Full Text Available In developed countries such as the UK, France, Italy and Germany, it is estimated that spending on maintenance and repair practically equals investment in new construction. This paper therefore studies different ways of interfering in the corrosion kinetics using an accelerated corrosion test (CAIM) that simulates chloride attack. The three variables are: concrete cover thickness, use of silica fume and the water/binder ratio. Analysis of variance of the weight loss of the steel bars and of the chloride content in the concrete cover showed that all three variables have a significant influence. The results also indicate that the addition of silica fume is the path to improve corrosion protection for low water/binder ratio concretes (around 0.4), while increasing the concrete cover thickness is the most effective solution to increase protection for high water/binder ratio concretes (above 0.5).

  20. Aerosol characteristics inversion based on the improved lidar ratio profile with the ground-based rotational Raman-Mie lidar

    Science.gov (United States)

    Ji, Hongzhu; Zhang, Yinchao; Chen, Siying; Chen, He; Guo, Pan

    2018-06-01

An iterative method, based on a derived inverse relationship between the atmospheric backscatter coefficient and the aerosol lidar ratio, is proposed to invert the lidar ratio profile and the aerosol extinction coefficient. The feasibility of this method is investigated theoretically and experimentally. Simulation results show that the inversion accuracy of aerosol optical properties for the iterative method can be improved in the near-surface aerosol layer and in optically thick layers. Experimentally, as a result of the reduced insufficiency error and incoherence error, aerosol optical properties with higher accuracy can be obtained in the near-surface region and in the region of numerical derivative distortion. In addition, the particle component can be distinguished roughly based on this improved lidar ratio profile.

  1. A lead isotope ratio data base of ancient Chinese bronzes

    International Nuclear Information System (INIS)

    Jin Zhengyao

    2005-01-01

A data base of lead isotope ratios of ancient Chinese bronzes has been set up. It contains 2888 entries, including bronze objects, casting remains and related ores. The file contents are drawn from analyses of Chinese bronzes previously carried out in several laboratories in China, Japan and the USA, and consist of records, analysis data, reference documents and images. The data base is designed for sharing information in provenance studies of the raw metal materials used for bronze production in the Chinese Bronze Age. (author)

  2. Experimental tests of the effect of rotor diameter ratio and blade number to the cross-flow wind turbine performance

    Science.gov (United States)

    Susanto, Sandi; Tjahjana, Dominicus Danardono Dwi Prija; Santoso, Budi

    2018-02-01

Cross-flow wind turbines are an alternative energy harvester for low-wind-speed areas. Two factors that influence the power coefficient of a cross-flow wind turbine are the diameter ratio and the number of blades. The aim of this study is to find out how the number of blades and the diameter ratio influence the performance of a cross-flow wind turbine, and which combination of the two performs best. The experimental tests covered several variations, including the ratio between the outer and inner diameter of the turbine and the number of blades. The diameter ratios tested were 0.58, 0.63, 0.68 and 0.73, while the numbers of blades were 16, 20 and 24. The tests were conducted at wind speeds of 3 m/s to 4 m/s. The results showed that the configuration of a 0.68 diameter ratio with 20 blades is the best, with a power coefficient of 0.049 and a moment coefficient of 0.185.

  3. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  4. THE METHOD OF GEOMETRIC CALIBRATION OF OPTOELECTRONIC SYSTEMS BASED ON ELECTRONIC TEST OBJECT

    Directory of Open Access Journals (Sweden)

    D. A. Kozhevnikov

    2017-01-01

Full Text Available Designing remote sensing devices requires careful evaluation of lens distortion and, in general, of the accuracy of the geometric calibration of optoelectronic systems. Test objects are the most common tools for the geometric calibration of optical systems. The purpose of the research was to create an automatic method of calculating distortion correction coefficients with a 3 μm precision in the measurement process. A method for the geometric calibration of the internal orientation elements of the optical system based on an electronic test object is proposed. The calculation of the test string brightness image from its multispectral image and the determination of the positions of the filtered signal extrema are presented. The ratio of the magnitude of the distortion to the interval center is given. Three variants of electronic test objects with different step and element sizes are considered. The optimal size of the calibration element was determined to be 3×3 pixels, due to the shape of the subpixels, whose radiating areas have an aspect ratio of about 1:3. It is advisable to use an IPS panel as the electronic test object template. A functional diagram of an experimental test and measurement stand based on the collimator and optical bench «OSK-2CL» is shown. It was determined that test objects with a grid spacing of 4 or 8 pixels cannot provide a tolerable image because of the non-collimated emission of active sites and scattering on optical surfaces: the shape of the elements is substantially disrupted. A test object with a grid spacing of 12 pixels, as the most suitable, was used for the distortion analysis. Graphs of coordinate increment versus element number for two photographic lenses (Canon EF-S 17-85 f/4-5.6 IS USM and EF-S 18-55 f/3.5-5.6 IS II) are presented. The distortion values in the edge zones were calculated as 43 μm and 51.6 μm, respectively. The technique and the algorithm of its software implementation are described. Possible directions of the

  5. Design studies of low-aspect ratio quasi-omnigenous stellarators

    International Nuclear Information System (INIS)

    Spong, D.A.; Hirshman, S.; Whitson, J.C.

    2001-01-01

    Significant progress has been made in the development of new modest-size compact stellarator devices that could test optimization principles for the design of a more attractive reactor. These are 3 and 4 field period low-aspect-ratio quasi-omnigenous (QO) stellarators based on an optimization method that targets improved confinement, stability, ease of coil design, low-aspect-ratio, and low bootstrap current. (author)

  6. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative and population genetic data. We show that in the presence of repeated selective sweeps on a relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence… selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained…

  7. A test procedure for determining the influence of stress ratio on fatigue crack growth

    Science.gov (United States)

    Fitzgerald, J. H.; Wei, R. P.

    1974-01-01

    A test procedure is outlined by which the rate of fatigue crack growth over a range of stress ratios and stress intensities can be determined expeditiously using a small number of specimens. This procedure was developed to avoid or circumvent the effects of load interactions on fatigue crack growth, and was used to develop data on a mill-annealed Ti-6Al-4V alloy plate. Experimental data suggest that the rates of fatigue crack growth among the various stress ratios may be correlated in terms of an effective stress intensity range at given values of K_max. This procedure is not to be used, however, for determining the corrosion fatigue crack growth characteristics of alloys when nonsteady-state effects are significant.

  8. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    OpenAIRE

    Daniel BRÎNDESCU – OLARIU

    2016-01-01

    The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663...

  9. Risk Measure and Early-Warning System of China's Stock Market Based on Price-Earnings Ratio and Price-to-Book Ratio

    Directory of Open Access Journals (Sweden)

    Rongda Chen

    2014-01-01

    Full Text Available Based on the actual situation of China's stock market, this paper proposes a method for measuring the stock market's risk, together with early-warning methods, based on the price-to-earnings ratio and the price-to-book ratio. The study found that the VaR method can capture the larger daily drops in a period, and that if a drop occurs at a periodical top of the index, the probability of a sharp index decline is very high. It also confirmed that the method is feasible and practical to use. In the long run, this method can indeed send early-warning signals of sharp declines, with the warning levels increasing as the index rises. The study also found that the index will not fall after every warning but may continue rising because of inertia, particularly during a major trend.
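
    The abstract's VaR-based risk measure can be illustrated with a minimal historical-simulation VaR; the estimator, window, and data below are illustrative assumptions, not the paper's exact method.

```python
# Minimal one-day historical-simulation VaR: the empirical loss quantile
# of a window of daily returns. Illustrative only; the paper's exact
# estimator, confidence level, and window are not specified here.

def historical_var(returns, confidence=0.95):
    """Loss threshold exceeded on roughly (1 - confidence) of days."""
    losses = sorted(-r for r in returns)      # losses (positive = loss), ascending
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

daily_returns = [0.01, -0.02, 0.005, -0.035, 0.012, -0.01, 0.02, -0.004,
                 -0.025, 0.008, 0.015, -0.018, 0.003, -0.007, 0.011, -0.041,
                 0.006, -0.013, 0.009, -0.002]
print(historical_var(daily_returns))   # 95% one-day VaR of this sample
```

A sharp daily drop is then flagged whenever the realized loss exceeds this threshold.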

  10. Test of cold asphalt storability based on alternative approaches

    Science.gov (United States)

    Abaffyová, Zora; Komačka, Jozef

    2017-09-01

    Cold asphalt products for pothole repairs should remain workable (soft enough) for a long time to ensure their applicability. Storability is assessed indirectly using various workability tests. Therefore, simple test methods (a self-compaction test and a disintegration test) were developed and verified to investigate changes in the storability of this group of cold asphalts. In the first stage, self-compaction of the tested mixture was assessed in the upturned Abram's cone used for the cement-concrete slump test and in the mould for the California Bearing Ratio (CBR) test. After that, a video record of the disintegration test was taken. During this test, the mould was lifted up and the mixture fell out of the mould (Abram's cone) or disintegrated (CBR mould). The drop of the surface after 10 min of self-compaction and the net time related to the falling out or disintegration of the mixture were used to evaluate the mixture from the storability point of view. It was found that the self-compaction test does not have the potential to reveal and prove changes in mixture properties. Based on the disintegration test results, it can be stated that this test at 5 °C using the upturned Abram's cone could be a suitable approach to determine qualitative changes of a cold mixture from the storability point of view.

  11. The role of the epoxy resin: Curing agent ratio in composite interfacial strength by single fibre microbond test

    DEFF Research Database (Denmark)

    Minty, Ross; Thomason, James L.; Petersen, Helga Nørgaard

    2015-01-01

    This paper focuses on an investigation into the role of the epoxy resin:curing agent ratio in the composite interfacial shear strength of glass fibre composites. The procedure involved changing the percentage of curing agent (triethylenetetramine [TETA]) used in the mixture, with several different percentages used, ranging from 4% up to 30%, including the stoichiometric ratio. It was found by using the microbond test that there may exist a relationship between the epoxy resin to curing agent ratio and the level of adhesion between the reinforcing fibre and the polymer matrix of the composite.

  12. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    Science.gov (United States)

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ² distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
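
    For readers unfamiliar with the LRT machinery being compared here, a minimal generic likelihood ratio test (for a Poisson rate, not the paper's separability hypothesis) looks like this:

```python
import math

# Generic likelihood-ratio test sketch (illustrative, not the paper's test):
# H0: Poisson rate = lam0 vs. the unrestricted MLE lam_hat (sample mean).
# The statistic -2 log(Lambda) is referred to the chi-square(1) distribution.

def poisson_lrt_stat(counts, lam0):
    n = len(counts)
    lam_hat = sum(counts) / n
    # log-likelihood up to the same additive constant under both hypotheses
    def loglik(lam):
        return sum(k * math.log(lam) - lam for k in counts)
    return -2.0 * (loglik(lam0) - loglik(lam_hat))

counts = [3, 5, 4, 6, 2, 5, 7, 4, 3, 5]     # toy data, mean 4.4
stat = poisson_lrt_stat(counts, lam0=3.0)
print(stat, stat > 3.841)                   # 3.841 = chi2(1) 95% critical value
```

The RST discussed above follows the same accept/reject logic but uses the score function evaluated at the null estimate, which is why it remains computable when the unrestricted MLE does not exist.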

  13. PROFITABILITY RATIO AS A TOOL FOR BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU – OLARIU

    2016-07-01

    Full Text Available The current study evaluates the potential of the profitability ratio in predicting corporate bankruptcy. The research is focused on Romanian companies, with the targeted event being represented by the manifestation of bankruptcy 2 years after the date of the financial statements of reference. All tests were conducted over 2 paired samples of 1176 Romanian companies. The methodology employed in evaluating the potential of the profitability ratio was based on the Area Under the ROC Curve (0.663 and the general accuracy ensured by the ratio (62.6% out-of-sample accuracy. The results confirm the practical utility of the profitability ratio in the prediction of bankruptcy and thus validate the need for further research focused on developing a methodology of analysis.

  14. Gust load alleviation wind tunnel tests of a large-aspect-ratio flexible wing with piezoelectric control

    Directory of Open Access Journals (Sweden)

    Ying Bi

    2017-02-01

    Full Text Available An active control technique utilizing piezoelectric actuators to alleviate the gust-response loads of a large-aspect-ratio flexible wing is investigated. Piezoelectric materials have been extensively used for active vibration control of engineering structures; in this paper they are further applied to suppress the gust-induced vibration of an aeroelastic wing. The equation of motion of the flexible wing with piezoelectric patches is obtained by Hamilton's principle with the modal approach, and numerical gust responses are then analyzed, based on which a gust load alleviation (GLA) control system is proposed. The GLA system employs classic proportional-integral-derivative (PID) controllers that treat the piezoelectric patches as control actuators and acceleration as the feedback signal. The mechanism by which piezoelectric actuators can alleviate gust-response loads is also analyzed qualitatively by a numerical method. Furthermore, the effectiveness of the GLA active control technology is validated through low-speed wind tunnel tests. The test results agree well with the numerical results and show that, within a certain frequency range, the control scheme can effectively alleviate the z and x wingtip accelerations and the root bending moment of the wing to a certain extent, with a gust load alleviation efficacy whose reduction rate is generally over 20%.
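
    The PID loop at the heart of such a GLA controller can be sketched generically; the gains, time step, and toy plant below are illustrative assumptions, not the paper's wing model.

```python
# Minimal discrete PID controller of the kind the GLA system uses with
# wingtip acceleration as the feedback signal. Gains, time step, and the
# first-order plant are illustrative, not values from the paper.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a toy first-order plant's "acceleration" reading toward zero
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
x = 1.0
for _ in range(1500):
    u = pid.update(0.0, x)
    x += (u - x) * 0.01          # simple stable plant model
print(abs(x))                    # regulated close to zero
```

In the real system the control signal drives the piezoelectric patches rather than a scalar plant, but the feedback structure is the same.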

  15. Assessment of chloroethene degradation rates based on ratios of daughter/parent compounds in groundwater plumes

    Science.gov (United States)

    Höhener, Patrick

    2014-05-01

    Chlorinated solvent spills at industrial and urban sites create groundwater plumes in which tetrachloro- and trichloroethene may degrade to their daughter compounds: dichloroethenes, vinyl chloride and ethene. The assessment of degradation and natural attenuation at such sites may be based on the analysis and inverse modelling of concentration data, on the calculation of mass fluxes across transects, and/or on the analysis of stable isotope ratios in the ethenes. Relatively little work has investigated the possibility of using concentration ratios to gain information on degradation rates. The use of ratios has the advantage that dilution of a single sample with contaminant-free water does not matter. It will be shown that molar ratios of daughter to parent compounds measured along a plume streamline are a rapid and robust means of determining whether degradation rates increase or decrease along the degradation chain, and furthermore allow a quantification of the relative magnitude of degradation rates compared with the rate of the parent compound. Moreover, concentration ratios become constant in zones where degradation is absent, which allows sketching the extent of actively degrading zones. The assessment is possible for pure sources and also for mixed sources. A quantification method is proposed to estimate first-order degradation rates in zones of constant degradation activity. This method includes corrections required by longitudinal and transversal dispersivity. The method was tested on a number of real field sites from the literature. At the majority of these sites, the first-order degradation rates decreased along the degradation chain from tetrachloroethene to vinyl chloride, meaning that the latter often reached substantial concentrations. This is bad news for site owners because of the increased toxicity of vinyl chloride compared with its parent compounds.
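
    The qualitative argument about daughter/parent ratios can be checked with a toy plug-flow simulation; the rate constants and Euler scheme below are illustrative assumptions, not the paper's dispersion-corrected quantification method.

```python
# Toy illustration (not the paper's model): first-order chain
# parent -> daughter, integrated along travel time with explicit Euler.
# If the daughter degrades more slowly than the parent, the molar
# daughter/parent ratio grows along the streamline.

def simulate_ratio(k_parent, k_daughter, t_end=10.0, dt=0.001):
    A, B = 1.0, 0.0                  # molar concentrations, parent and daughter
    t = 0.0
    while t < t_end:
        dA = -k_parent * A
        dB = k_parent * A - k_daughter * B
        A += dA * dt
        B += dB * dt
        t += dt
    return B / A

r_slow_daughter = simulate_ratio(0.5, 0.1)   # daughter accumulates
r_fast_daughter = simulate_ratio(0.5, 1.5)   # daughter removed quickly
print(r_slow_daughter, r_fast_daughter)
```

The second case approaches the analytic plateau k_parent/(k_daughter - k_parent) = 0.5, illustrating how a flat ratio profile signals a fixed rate relationship rather than absent degradation of the parent.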

  16. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve Type I and Type II error rates comparable to the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
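
    Wald's classical SPRT for a binary outcome, which the study adapts to 303(d) listing decisions, can be sketched as follows; the thresholds use Wald's approximations, and all parameter values are illustrative assumptions rather than California's actual listing rule.

```python
import math

# Minimal Wald SPRT sketch for a binary impairment indicator, testing
# H0: p = p0 against H1: p = p1 (p1 > p0). Thresholds follow Wald's
# approximations: accept H1 when the log-likelihood ratio reaches
# log((1-beta)/alpha), accept H0 at log(beta/(1-alpha)).

def sprt(samples, p0=0.1, p1=0.3, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)     # accept H1 above this
    lower = math.log(beta / (1 - alpha))     # accept H0 below this
    llr = 0.0
    for n, exceed in enumerate(samples, 1):
        if exceed:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1 (impaired)", n
        if llr <= lower:
            return "accept H0 (not impaired)", n
    return "continue sampling", len(samples)

# A run of mostly-exceeding samples terminates quickly in favor of H1
decision, n_used = sprt([1, 1, 0, 1, 1, 1, 1, 1])
print(decision, n_used)
```

The early stopping seen here is exactly the sample-size saving the abstract reports relative to the fixed-sample binomial test.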

  17. On the equivalence of the Clauser–Horne and Eberhard inequality based tests

    International Nuclear Information System (INIS)

    Khrennikov, Andrei; Ramelow, Sven; Ursin, Rupert; Wittmann, Bernhard; Kofler, Johannes; Basieva, Irina

    2014-01-01

    Recently, the results of the first experimental test for entangled photons closing the detection loophole (also referred to as the fair sampling loophole) were published (Vienna, 2013). From the theoretical viewpoint, the main distinguishing feature of this long-aspired-to experiment was that the Eberhard inequality was used. Almost simultaneously, another experiment closing this loophole was performed (Urbana-Champaign, 2013), and it was based on the Clauser–Horne inequality (for probabilities). The aim of this note is to analyze the mathematical and experimental equivalence of tests based on the Eberhard inequality and various forms of the Clauser–Horne inequality. The structure of the mathematical equivalence is nontrivial. In particular, it is necessary to distinguish between algebraic and statistical equivalence. Although the tests based on these inequalities are algebraically equivalent, they need not be equivalent statistically, i.e., theoretically the level of statistical significance can drop under transition from one test to another (at least for finite samples). Nevertheless, the data collected in the Vienna test imply not only a statistically significant violation of the Eberhard inequality, but also of the Clauser–Horne inequality (in the ratio-rate form): for both, a violation >60σ. (paper)

  18. Cost effectiveness of population based BRCA1 founder mutation testing in Sephardi Jewish women.

    Science.gov (United States)

    Patel, Shreeya; Legood, Rosa; Evans, D Gareth; Turnbull, Clare; Antoniou, Antonis C; Menon, Usha; Jacobs, Ian; Manchanda, Ranjit

    2018-04-01

    Population-based BRCA1/BRCA2 founder-mutation testing has been demonstrated to be cost effective compared with family history based testing in Ashkenazi Jewish women. However, only 1 of the 3 Ashkenazi Jewish BRCA1/BRCA2 founder mutations (185delAG [c.68_69delAG], 5382insC [c.5266dupC], and 6174delT [c.5946delT]) is found in the Sephardi Jewish population (185delAG [c.68_69delAG]), and the overall prevalence of BRCA mutations in the Sephardi Jewish population is accordingly lower (0.7% compared with 2.5% in the Ashkenazi Jewish population). Cost-effectiveness analyses of BRCA testing have not previously been performed at the lower BRCA prevalence levels seen in the Sephardi Jewish population. Here we present a cost-effectiveness analysis for UK and US populations comparing population testing with clinical criteria/family history-based testing in Sephardi Jewish women. A Markov model was built comparing the lifetime costs and effects of population-based BRCA1 testing with testing using family history-based clinical criteria in Sephardi Jewish women aged ≥30 years. BRCA1 carriers identified were offered magnetic resonance imaging/mammograms and risk-reducing surgery. Costs are reported at 2015 prices. Outcomes include breast cancer, ovarian cancer, and excess deaths from heart disease. All costs and outcomes are discounted at 3.5%. The time horizon is lifetime, and the perspective is payer. The incremental cost-effectiveness ratio per quality-adjusted life-year was calculated. Parameter uncertainty was evaluated through 1-way and probabilistic sensitivity analysis. Population testing resulted in a gain in life expectancy of 12 months (quality-adjusted life-year = 1.00). The baseline discounted incremental cost-effectiveness ratio for UK population-based testing was £67.04/quality-adjusted life-year and for the US population was $308.42/quality-adjusted life-year. Results were robust in the 1-way sensitivity analysis. The probabilistic sensitivity analysis showed 100% of
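
    The incremental cost-effectiveness ratio reported above is simple arithmetic once lifetime costs and QALYs are in hand; the sketch below shows the formula with made-up numbers (the `discount` helper and all values are illustrative, not the paper's model inputs).

```python
# ICER arithmetic: incremental cost divided by incremental QALYs, with
# both streams discounted (the paper uses 3.5%). Numbers are invented
# purely for illustration.

def discount(values, rate=0.035):
    """Present value of a yearly stream at the stated discount rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical lifetime-aggregated strategies:
print(icer(cost_new=1250.0, qaly_new=21.0, cost_old=1100.0, qaly_old=20.0))
# cost units per QALY gained
```

A strategy is then judged cost effective when this ratio falls below the payer's willingness-to-pay threshold.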

  19. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    Full Text Available We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.

  20. Sex Ratio Elasticity Influences the Selection of Sex Ratio Strategy

    Science.gov (United States)

    Wang, Yaqiang; Wang, Ruiwu; Li, Yaotang; (Sam) Ma, Zhanshan

    2016-12-01

    There are three sex ratio strategies (SRS) in nature—male-biased sex ratio, female-biased sex ratio and, equal sex ratio. It was R. A. Fisher who first explained why most species in nature display a sex ratio of ½. Consequent SRS theories such as Hamilton’s local mate competition (LMC) and Clark’s local resource competition (LRC) separately explained the observed deviations from the seemingly universal 1:1 ratio. However, to the best of our knowledge, there is not yet a unified theory that accounts for the mechanisms of the three SRS. Here, we introduce the price elasticity theory in economics to define sex ratio elasticity (SRE), and present an analytical model that derives three SRSs based on the following assumption: simultaneously existing competitions for both resources A and resources B influence the level of SRE in both sexes differently. Consequently, it is the difference (between two sexes) in the level of their sex ratio elasticity that leads to three different SRS. Our analytical results demonstrate that the elasticity-based model not only reveals a highly plausible mechanism that explains the evolution of SRS in nature, but also offers a novel framework for unifying two major classical theories (i.e., LMC & LRC) in the field of SRS research.

  1. Signal-to-Noise ratio and design complexity based on Unified Loss ...

    African Journals Online (AJOL)

    Taguchi's quality loss function for larger-the-better performance characteristics uses a reciprocal transformation to compute quality loss. This paper suggests that reciprocal transformation unnecessarily complicates and may distort results. Examples of this distortion include the signal-to-noise ratio based on mean squared ...
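
    For reference, the larger-the-better signal-to-noise ratio under discussion applies the reciprocal transformation inside a log: S/N = -10·log10((1/n)·Σ 1/yᵢ²). This is the standard Taguchi formula, not anything specific to this paper's proposal; a minimal sketch:

```python
import math

# Taguchi larger-the-better signal-to-noise ratio, showing the reciprocal
# transformation the abstract argues may distort results:
#     S/N = -10 * log10( (1/n) * sum(1 / y_i**2) )

def sn_larger_the_better(y):
    n = len(y)
    return -10.0 * math.log10(sum(1.0 / yi**2 for yi in y) / n)

print(sn_larger_the_better([10.0, 10.0, 10.0]))   # 20.0 for a constant response of 10
```

Because each observation enters as 1/yᵢ², the smallest observations dominate the statistic, which is the distortion the abstract refers to.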

  2. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test…
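
    Both quantities discussed above are easy to compute directly; the sketch below uses the rank (Mann-Whitney) form of the AUC and the standard definitions LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity, on synthetic values.

```python
# AUC by the Mann-Whitney rank method plus likelihood ratios at a chosen
# cutoff for a continuous test. Data are synthetic, for illustration only.

def auc(diseased, healthy):
    """P(random diseased value > random healthy value); ties count half."""
    wins = sum((d > h) + 0.5 * (d == h) for d in diseased for h in healthy)
    return wins / (len(diseased) * len(healthy))

def likelihood_ratios(diseased, healthy, cutoff):
    sens = sum(d >= cutoff for d in diseased) / len(diseased)
    spec = sum(h < cutoff for h in healthy) / len(healthy)
    return sens / (1 - spec), (1 - sens) / spec   # LR+, LR-

diseased = [5.1, 6.3, 7.0, 4.8, 8.2, 6.6]
healthy  = [3.2, 4.1, 2.9, 5.0, 3.8, 4.4]
print(auc(diseased, healthy))
print(likelihood_ratios(diseased, healthy, cutoff=4.6))
```

Note that LR+ is undefined when specificity is exactly 1 at the chosen cutoff; real analyses handle that edge case (and compute interval-specific ratios for ordinal tests).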

  3. Transformer ratio enhancement experiment

    International Nuclear Information System (INIS)

    Gai, W.; Power, J. G.; Kanareykin, A.; Neasheva, E.; Altmark, A.

    2004-01-01

    Recently, a multibunch scheme for efficient acceleration based on dielectric wakefield accelerator technology was outlined in J.G. Power, W. Gai, A. Kanareykin, X. Sun. PAC 2001 Proceedings, pp. 114-116, 2002. In this paper we present an experimental program for the design, development and demonstration of an Enhanced Transformer Ratio Dielectric Wakefield Accelerator (ETR-DWA). The principal goal is to increase the transformer ratio R, the parameter that characterizes the energy transfer efficiency from the accelerating structure to the accelerated electron beam. We present here an experimental design of a 13.625 GHz dielectric loaded accelerating structure, a laser multisplitter producing a ramped bunch train, and simulations of the bunch train parameters required. Experimental results of the accelerating structure bench testing and ramped pulsed train generation with the laser multisplitter are shown as well. Using beam dynamic simulations, we also obtain the focusing FODO lattice parameters

  4. Arcjet nozzle area ratio effects

    Science.gov (United States)

    Curran, Francis M.; Sarmiento, Charles J.; Birkner, Bjorn W.; Kwasny, James

    1990-01-01

    An experimental investigation was conducted to determine the effect of nozzle area ratio on the operating characteristics and performance of a low power dc arcjet thruster. Conical thoriated tungsten nozzle inserts were tested in a modular laboratory arcjet thruster run on hydrogen/nitrogen mixtures simulating the decomposition products of hydrazine. The converging and diverging sides of the inserts had half angles of 30 and 20 degrees, respectively, similar to a flight type unit currently under development. The length of the diverging side was varied to change the area ratio. The nozzle inserts were run over a wide range of specific power. Current, voltage, mass flow rate, and thrust were monitored to provide accurate comparisons between tests. While small differences in performance were observed between the two nozzle inserts, it was determined that for each nozzle insert, arcjet performance improved with increasing nozzle area ratio to the highest area ratio tested and that the losses become very pronounced for area ratios below 50. These trends are somewhat different than those obtained in previous experimental and analytical studies of low Re number nozzles. It appears that arcjet performance can be enhanced via area ratio optimization.

  6. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    Groenendijk, Patrick A.; Lucas, André; Vries, de Casper G.

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons simultaneously, and the

  7. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  8. Tests of Full-Scale Helicopter Rotors at High Advancing Tip Mach Numbers and Advance Ratios

    Science.gov (United States)

    Biggers, James C.; McCloud, John L., III; Stroub, Robert H.

    2015-01-01

    As a continuation of the studies of reference 1, three full-scale helicopter rotors have been tested in the Ames Research Center 40- by 80-Foot Wind Tunnel. All three of them were two-bladed, teetering rotors. One of the rotors incorporated the NACA 0012 airfoil section over the entire length of the blade. This rotor was tested at advance ratios up to 1.05. Both of the other rotors were tapered in thickness and incorporated leading-edge camber over the outer 20 percent of the blade radius. The larger of these rotors was tested at advancing tip Mach numbers up to 1.02. Data were obtained for a wide range of lift and propulsive force, and are presented without discussion.

  9. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
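
    A full pedigree engine like Bonaparte is far beyond a snippet, but the core likelihood-ratio idea can be shown for the simplest case: a single-locus paternity duo with the mother untyped and no mutation or dropout model. The function names and allele frequencies below are illustrative assumptions, not Bonaparte's implementation.

```python
# Illustrative single-locus kinship LR (paternity duo, mother untyped):
#     LR = P(child genotype | alleged parent is a parent)
#        / P(child genotype | unrelated individual)
# assuming Hardy-Weinberg proportions and no mutation or dropout.

def transmit_prob(parent, allele):
    """Probability a parent with genotype `parent` transmits `allele`."""
    return sum(0.5 for a in parent if a == allele)

def duo_lr(child, alleged_parent, freqs):
    a, b = child
    # Random-mating probability of the child's genotype
    hw = freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]
    # One allele from the alleged parent, the other from the population
    if a == b:
        paternal = transmit_prob(alleged_parent, a) * freqs[a]
    else:
        paternal = (transmit_prob(alleged_parent, a) * freqs[b]
                    + transmit_prob(alleged_parent, b) * freqs[a])
    return paternal / hw

freqs = {"A": 0.1, "B": 0.3, "C": 0.6}
lr = duo_lr(child=("A", "B"), alleged_parent=("A", "A"), freqs=freqs)
print(lr)   # matches the textbook result 1 / (2 * freq_A) = 5.0 here
```

Multi-locus LRs multiply across independent loci; the test cases described in the article exercise exactly this kind of closed-form expectation against the software's numerical output, extended to large pedigrees, mutation models, and dropout.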

  10. A Power Transformers Fault Diagnosis Model Based on Three DGA Ratios and PSO Optimization SVM

    Science.gov (United States)

    Ma, Hongzhe; Zhang, Wei; Wu, Rongrong; Yang, Chunyan

    2018-03-01

    In order to make up for the shortcomings of existing transformer fault diagnosis methods in dissolved-gas-in-oil analysis (DGA) feature selection and parameter optimization, a transformer fault diagnosis model based on three DGA ratios and a support vector machine (SVM) optimized by particle swarm optimization (PSO) is proposed. The SVM is extended to a nonlinear, multi-class formulation, PSO is applied to optimize the multi-class SVM model, and transformer fault diagnosis is conducted in combination with the cross-validation principle. The fault diagnosis results show that the average accuracy of the proposed method is better than that of the standard support vector machine and the genetic-algorithm-optimized support vector machine, demonstrating that the proposed method can effectively improve the accuracy of transformer fault diagnosis.
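
    The PSO component can be sketched in a few lines; the objective below is a toy quadratic standing in for cross-validated SVM error over (C, gamma), and all coefficients are common textbook defaults rather than the paper's settings.

```python
import random

# Bare-bones particle swarm optimization of two "hyperparameters"
# (standing in for the SVM's C and gamma). The objective is a toy bowl
# function, not real cross-validation error - illustration only.

def pso(objective, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(42)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for CV error, with a hypothetical optimum at C=10, gamma=0.5
obj = lambda p: (p[0] - 10.0) ** 2 + (p[1] - 0.5) ** 2
best, best_val = pso(obj, bounds=[(0.1, 100.0), (1e-3, 1.0)])
print(best, best_val)
```

In the paper's setting the objective would instead train the multi-class SVM on the three DGA ratio features and return the cross-validation error for each candidate (C, gamma).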

  11. Tree-Based Global Model Tests for Polytomous Rasch Models

    Science.gov (United States)

    Komboz, Basil; Strobl, Carolin; Zeileis, Achim

    2018-01-01

    Psychometric measurement models are only valid if measurement invariance holds between test takers of different groups. Global model tests, such as the well-established likelihood ratio (LR) test, are sensitive to violations of measurement invariance, such as differential item functioning and differential step functioning. However, these…

  12. Review of titanium dioxide nanoparticle phototoxicity: Developing a phototoxicity ratio to correct the endpoint values of toxicity tests

    Science.gov (United States)

    2015-01-01

    Titanium dioxide nanoparticles are photoactive and produce reactive oxygen species under natural sunlight. Reactive oxygen species can be detrimental to many organisms, causing oxidative damage, cell injury, and death. Most studies investigating TiO2 nanoparticle toxicity did not consider photoactivation and performed tests either in dark conditions or under artificial lighting that did not simulate natural irradiation. The present study summarizes the literature and derives a phototoxicity ratio between the results of nano‐titanium dioxide (nano‐TiO2) experiments conducted in the absence of sunlight and those conducted under solar or simulated solar radiation (SSR) for aquatic species. The phototoxicity ratio can therefore be used to correct endpoints of toxicity tests with nano‐TiO2 that were performed in the absence of sunlight. Such corrections may also be important for regulators and risk assessors when reviewing previously published data. A significant difference was observed between the phototoxicity ratios of 2 distinct groups: aquatic species belonging to the order Cladocera, and all other aquatic species. The order Cladocera appeared very sensitive and prone to nano‐TiO2 phototoxicity. On average, nano‐TiO2 was 20 times more toxic to non‐Cladocera and 1867 times more toxic to Cladocera (median values 3.3 and 24.7, respectively) after illumination. Both the median value and the 75th percentile of the phototoxicity ratio are chosen as the most practical values for the correction of endpoints of nano‐TiO2 toxicity tests that were performed in dark conditions or in the absence of sunlight. Environ Toxicol Chem 2015;34:1070–1077. © 2015 The Author. Published by SETAC. PMID:25640001
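
    The proposed correction amounts to dividing a dark-condition endpoint by the group's phototoxicity ratio. A minimal sketch using the median ratios quoted in the abstract (the example EC50 value is hypothetical):

```python
# Median phototoxicity ratios reported in the abstract.
MEDIAN_RATIO = {"cladocera": 24.7, "non_cladocera": 3.3}

def corrected_ec50(dark_ec50_mg_per_l, group):
    # A phototoxicity ratio r means nano-TiO2 is r times more toxic under
    # (simulated) sunlight, so the sunlight-corrected EC50 is r times lower.
    return dark_ec50_mg_per_l / MEDIAN_RATIO[group]

print(corrected_ec50(100.0, "cladocera"))      # 100 / 24.7 ≈ 4.05 mg/L
print(corrected_ec50(100.0, "non_cladocera"))  # 100 / 3.3 ≈ 30.3 mg/L
```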

  13. Testing and Performance Verification of a High Bypass Ratio Turbofan Rotor in an Internal Flow Component Test Facility

    Science.gov (United States)

    VanZante, Dale E.; Podboy, Gary G.; Miller, Christopher J.; Thorp, Scott A.

    2009-01-01

    A 1/5 scale model rotor representative of a current technology, high bypass ratio, turbofan engine was installed and tested in the W8 single-stage, high-speed, compressor test facility at NASA Glenn Research Center (GRC). The same fan rotor was tested previously in the GRC 9x15 Low Speed Wind Tunnel as a fan module consisting of the rotor and outlet guide vanes mounted in a flight-like nacelle. The W8 test verified that the aerodynamic performance and detailed flow field of the rotor as installed in W8 were representative of the wind tunnel fan module installation. Modifications to W8 were necessary to ensure that this internal flow facility would have a flow field at the test package that is representative of flow conditions in the wind tunnel installation. Inlet flow conditioning was designed and installed in W8 to lower the fan face turbulence intensity to less than 1.0 percent in order to better match the wind tunnel operating environment. Also, inlet bleed was added to thin the casing boundary layer to be more representative of a flight nacelle boundary layer. On the 100 percent speed operating line the fan pressure rise and mass flow rate agreed with the wind tunnel data to within 1 percent. Detailed hot film surveys of the inlet flow, inlet boundary layer and fan exit flow were compared to results from the wind tunnel. The effect of inlet casing boundary layer thickness on fan performance was quantified. Challenges and lessons learned from testing this high flow, low static pressure rise fan in an internal flow facility are discussed.

  14. Psychological distress during early gestation and offspring sex ratio

    DEFF Research Database (Denmark)

    Obel, C; Henriksen, TB; Secher, Niels Jørgen

    2007-01-01

    BACKGROUND: Exposure to severe stress in early pregnancy is associated with a lower male to female ratio (sex ratio), but whether more moderate levels of psychological discomfort have the same kind of effect is unknown. In a population based follow-up study, we aimed to test whether psychological distress was associated with the sex ratio in the offspring. METHODS: From 1989 to 1992, a cohort of 8,719 Danish-speaking pregnant women were followed until delivery. Questionnaires were administered to the women in early pregnancy and 6,629 (76%) completed the 30-item version of the General Health... ...suggest that not only severe stress, but also more moderate and common levels of psychological distress, may decrease the sex ratio in the offspring. Stress during pregnancy is a likely candidate involved in the decreasing sex ratio observed in many countries....

  16. Eigenvalue ratio detection based on exact moments of smallest and largest eigenvalues

    KAUST Repository

    Shakir, Muhammad; Tang, Wuchen; Rao, Anlei; Imran, Muhammad Ali; Alouini, Mohamed-Slim

    2011-01-01

    Detection based on the eigenvalues of the received signal covariance matrix is currently one of the most effective solutions to the spectrum sensing problem in cognitive radios. However, the results of these schemes always depend on asymptotic assumptions, since the closed-form expression for the exact distribution of the eigenvalue ratio is exceptionally complex to compute in practice. In this paper, a non-asymptotic spectrum sensing approach that approximates the extreme eigenvalues is introduced. In this context, a Gaussian approximation approach based on the exact analytical moments of the extreme eigenvalues is presented, in which the extreme eigenvalues are treated as dependent Gaussian random variables whose joint probability density function (PDF) is approximated by a bivariate Gaussian distribution for any number of cooperating secondary users and received samples. A copula is used to analyze the extent of the dependency between the extreme eigenvalues. The decision threshold based on the ratio of the dependent Gaussian extreme eigenvalues is then derived. The performance of the newly proposed approach is compared with the previously published asymptotic Tracy-Widom approximation approach. © 2011 ICST.
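
    A minimal sketch of the underlying eigenvalue-ratio detector (not the paper's Gaussian-approximation threshold itself): form the sample covariance of the received matrix and compare the largest to the smallest eigenvalue. The dimensions, amplitudes, and seed below are illustrative assumptions.

```python
import numpy as np

def eigenvalue_ratio(y):
    """Test statistic lambda_max / lambda_min of the sample covariance
    of a K x N received matrix (K sensors/users, N samples)."""
    r = y @ y.conj().T / y.shape[1]   # sample covariance matrix
    eig = np.linalg.eigvalsh(r)       # eigenvalues in ascending order
    return eig[-1] / eig[0]

rng = np.random.default_rng(1)
K, N = 4, 1000
noise = rng.normal(size=(K, N))       # H0: noise only
signal = rng.normal(size=(1, N))      # common primary-user signal
h = rng.normal(size=(K, 1))           # channel gains
rx = 3.0 * h @ signal + noise         # H1: rank-1 signal plus noise

# A rank-1 signal inflates the largest eigenvalue, so the ratio separates
# the two hypotheses; a threshold on it gives the detector.
print(eigenvalue_ratio(noise) < eigenvalue_ratio(rx))  # True
```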

  17. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models, as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  18. On the hypothesis-free testing of metabolite ratios in genome-wide and metabolome-wide association studies

    Directory of Open Access Journals (Sweden)

    Petersen Ann-Kristin

    2012-06-01

    Background: Genome-wide association studies (GWAS) with metabolic traits and metabolome-wide association studies (MWAS) with traits of biomedical relevance are powerful tools to identify the contribution of genetic, environmental and lifestyle factors to the etiology of complex diseases. Hypothesis-free testing of ratios between all possible metabolite pairs in GWAS and MWAS has proven to be an innovative approach in the discovery of new biologically meaningful associations. The p-gain statistic was introduced as an ad-hoc measure to determine whether a ratio between two metabolite concentrations carries more information than the two corresponding metabolite concentrations alone. So far, only a rule of thumb was applied to determine the significance of the p-gain. Results: Here we explore the statistical properties of the p-gain through simulation of its density and by sampling of experimental data. We derive critical values of the p-gain for different levels of correlation between metabolite pairs and show that B/(2*α) is a conservative critical value for the p-gain, where α is the level of significance and B the number of tested metabolite pairs. Conclusions: We show that the p-gain is a well-defined measure that can be used to identify statistically significant metabolite ratios in association studies and provide a conservative significance cut-off for the p-gain for use in future association studies with metabolic traits.
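
    The p-gain decision rule described in the abstract can be sketched directly; the p-values in the example are hypothetical, while the critical value B/(2*α) is the one derived in the paper.

```python
# p-gain: how much the ratio's association p-value improves on the better
# of the two single-metabolite p-values.
def p_gain(p_met1, p_met2, p_ratio):
    return min(p_met1, p_met2) / p_ratio

def is_significant(gain, n_pairs, alpha=0.05):
    # Conservative critical value B / (2 * alpha) from the abstract,
    # where B is the number of tested metabolite pairs.
    return gain > n_pairs / (2 * alpha)

gain = p_gain(1e-4, 5e-3, 1e-10)           # hypothetical p-values; gain ≈ 1e6
print(is_significant(gain, n_pairs=1000))  # critical value 1e4 -> True
```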

  19. Laboratory test on maximum and minimum void ratio of tropical sand matrix soils

    Science.gov (United States)

    Othman, B. A.; Marto, A.

    2018-04-01

    Sand is generally known as a loose granular material which has a grain size finer than gravel and coarser than silt, and can be very angular to well-rounded in shape. The presence of various amounts of fines, which also influences the loosest and densest states of sand in natural conditions, is well known to contribute to the deformation and loss of shear strength of soil. This paper presents the effect of a range of fines contents on the minimum void ratio e_min and maximum void ratio e_max of sand matrix soils. Laboratory tests to determine e_min and e_max of sand matrix soil were conducted using a non-standard method introduced by previous researchers. Clean sand was obtained from a natural mining site at Johor, Malaysia. A set of three different sizes of sand (fine sand, medium sand, and coarse sand) was mixed with 0% to 40% by weight of low plasticity fines (kaolin). Results showed that, in general, e_min and e_max decreased with increasing fines content down to a minimum value in the range of 0% to 30% fines, and then increased thereafter.
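
    As a usage note not taken from this paper, e_min and e_max are commonly combined with an in-situ void ratio e to compute the relative density of a deposit; the values below are hypothetical.

```python
def relative_density(e, e_min, e_max):
    """Relative density D_r = (e_max - e) / (e_max - e_min); 0 at the
    loosest state (e = e_max), 1 at the densest state (e = e_min)."""
    return (e_max - e) / (e_max - e_min)

print(relative_density(e=0.75, e_min=0.55, e_max=0.95))  # ≈ 0.5 (medium dense)
```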

  20. Retrieval of water vapor mixing ratios from a laser-based sensor

    Science.gov (United States)

    Tucker, George F.

    1995-01-01

    Langley Research Center has developed a novel external path sensor which monitors water vapor along an optical path between an airplane window and reflective material on the plane's engine. An infrared tunable diode laser is wavelength modulated across a water vapor absorption line at a frequency f. The 2f and DC signals are measured by a detector mounted adjacent to the laser. The 2f/DC ratio depends on the amount of wavelength modulation, the water vapor absorption line being observed, and the temperature, pressure, and water vapor content of the atmosphere. The present work concerns efforts to quantify the contributions of these factors and to derive a method for extracting the water vapor mixing ratio from the measurements. A 3 m cell was fabricated in order to perform laboratory tests of the sensor. Measurements of 2f/DC were made for a series of pressures and modulation amplitudes. During my 1994 faculty fellowship, a computer program was created which allowed 2f/DC to be calculated for any combination of the variables which affect it. This code was used to generate 2f/DC values for the conditions measured in the laboratory. The experimental and theoretical values agreed to within a few percent. As a result, the laser modulation amplitude can now be set in the field by comparing the response of the instrument to the calculated response as a function of modulation amplitude. Once the validity of the computer code was established, it was used to investigate possible candidate absorption lines. 2f/DC values were calculated for pressures, temperatures, and water vapor mixing ratios expected to be encountered in future missions. The results have been incorporated into a database which will be used to select the best line for a particular mission. The database will also be used to select a retrieval technique. For example, under some circumstances there is little temperature dependence in 2f/DC, so temperature can be neglected. In other cases, there is a dependence

  1. Financial Key Ratios

    OpenAIRE

    Tănase Alin-Eliodor

    2014-01-01

    This article focuses on computing techniques for financial key ratios starting from trial balance data. Activity, liquidity, solvency and profitability key ratios are presented, together with a three-step computing methodology based on a trial balance.
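
    A minimal sketch of computing one ratio from each family named in the abstract; the account names and trial-balance figures are hypothetical, not from the article.

```python
# Hypothetical trial-balance aggregates (in thousands).
tb = {"current_assets": 500.0, "inventory": 150.0, "current_liabilities": 250.0,
      "total_debt": 400.0, "equity": 600.0, "net_income": 90.0, "sales": 1200.0}

ratios = {
    # liquidity: ability to cover short-term obligations
    "current_ratio": tb["current_assets"] / tb["current_liabilities"],
    "quick_ratio": (tb["current_assets"] - tb["inventory"]) / tb["current_liabilities"],
    # solvency: long-term financial structure
    "debt_to_equity": tb["total_debt"] / tb["equity"],
    # profitability: earnings relative to sales
    "net_margin": tb["net_income"] / tb["sales"],
    # activity: how intensively assets generate sales
    "asset_turnover": tb["sales"] / (tb["total_debt"] + tb["equity"]),
}
for name, value in ratios.items():
    print(f"{name}: {value:.2f}")
```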

  2. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error of the LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches reported previously, and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate acting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause of the decrease in power or the inflated false positive rate, although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to the LRT. We propose a three-step covariate modeling approach for population PK analysis that utilizes the advantages of EBEs while overcoming their shortcomings, which not only markedly reduces the run time for population PK analysis, but also provides more accurate covariate tests.
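
    The LRT underlying this comparison can be sketched for one added covariate parameter (df = 1), where the chi-square survival function reduces to erfc(sqrt(x/2)); the log-likelihood values below are hypothetical.

```python
import math

def lrt_pvalue_df1(loglik_reduced, loglik_full):
    """P-value of the likelihood ratio test for one added parameter:
    statistic 2*(llf - llr) referred to chi-square with df = 1."""
    stat = 2.0 * (loglik_full - loglik_reduced)
    # chi-square(1) survival function: P(X > x) = erfc(sqrt(x / 2))
    return math.erfc(math.sqrt(stat / 2.0))

# Hypothetical log-likelihoods from two nested mixed-effect model fits,
# with and without the candidate covariate on clearance (stat = 14.4):
p = lrt_pvalue_df1(-1502.3, -1495.1)
print(p < 0.001)  # True: covariate retained at the conventional 0.001 level
```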

  3. The relationship between size, book-to-market equity ratio, earnings–price ratio, and return for the Tehran stock Exchange

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Sadeghi Lafmejani

    2016-01-01

    This paper presents an empirical investigation to determine whether there is any difference between the returns of value and growth portfolios, sorted by price-to-earnings (P/E) and price-to-book value (P/BV), in terms of market sensitivity to the index (β), firm size and market liquidity for listed firms on the Tehran Stock Exchange (TSE) over the period 2001-2008. The selected firms were those with two consecutive positive P/E and P/BV ratios, excluding financial and holding firms. There were five independent variables in the proposed study: P/E, P/BV, market size, market sensitivity beta (β) and market liquidity. In each year, firms were first sorted in non-decreasing order and grouped into four portfolios with equal numbers of firms. The first portfolio, with the lowest P/E ratios, is called the value portfolio and the last one, with the highest P/E ratios, is called the growth portfolio. This process was repeated based on the P/BV ratio to determine value and growth portfolios, accordingly. The study investigated the characteristics of the two kinds of portfolios based on firm size, β and liquidity, and implemented Student's t-test and Levene's test to examine the different hypotheses. The results indicate mixed effects of market sensitivity, firm size and market liquidity on the returns of the firms in various periods.

  4. Initiation of depleted uranium oxide and spent fuel testing for the spent fuel sabotage aerosol ratio program

    Energy Technology Data Exchange (ETDEWEB)

    Molecke, M.A.; Gregson, M.W.; Sorenson, K.B. [Sandia National Labs. (United States); Billone, M.C.; Tsai, H. [Argonne National Lab. (United States); Koch, W.; Nolte, O. [Fraunhofer Inst. fuer Toxikologie und Experimentelle Medizin (Germany); Pretzsch, G.; Lange, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (Germany); Autrusson, B.; Loiseau, O. [Inst. de Radioprotection et de Surete Nucleaire (France); Thompson, N.S.; Hibbs, R.S. [U.S. Dept. of Energy (United States); Young, F.I.; Mo, T. [U.S. Nuclear Regulatory Commission (United States)

    2004-07-01

    We provide a detailed overview of an ongoing, multinational test program that is developing aerosol data for some spent fuel sabotage scenarios on spent fuel transport and storage casks. Experiments are being performed to quantify the aerosolized materials plus volatilized fission products generated from actual spent fuel and surrogate material test rods, due to impact by a high energy density device (HEDD). The program participants in the U.S. plus Germany, France, and the U.K., part of the international Working Group for Sabotage Concerns of Transport and Storage Casks (WGSTSC), have strongly supported and coordinated this research program. Sandia National Laboratories (SNL) has the lead role for conducting this research program; test program support is provided by both the U.S. Department of Energy and the Nuclear Regulatory Commission. WGSTSC partners need this research to better understand potential radiological impacts from sabotage of nuclear material shipments and storage casks, and to support subsequent risk assessments, modeling, and preventative measures. We provide a summary of the overall, multi-phase test design and a description of all explosive containment and aerosol collection test components used. We focus on the recently initiated tests on "surrogate" spent fuel, unirradiated depleted uranium oxide, and forthcoming actual spent fuel tests. The depleted uranium oxide test rodlets were prepared by the Institut de Radioprotection et de Surete Nucleaire in France. These surrogate test rodlets closely match the diameter of the test rodlets of actual spent fuel from the H.B. Robinson reactor (high burnup PWR fuel) and the Surry reactor (lower, medium burnup PWR fuel), generated from U.S. reactors. The characterization of the spent fuels and fabrication into short, pressurized rodlets has been performed by Argonne National Laboratory, for testing at SNL. The ratio of the aerosol and respirable particles released from HEDD-impacted spent

  5. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security...

  6. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  7. Discrimination of DPRK M5.1 February 12th, 2013 Earthquake as Nuclear Test Using Analysis of Magnitude, Rupture Duration and Ratio of Seismic Energy and Moment

    Science.gov (United States)

    Salomo Sianipar, Dimas; Subakti, Hendri; Pribadi, Sugeng

    2015-04-01

    On the morning of February 12th, 2013, at 02:57 UTC, an earthquake occurred with its epicenter in the region of North Korea, precisely around the Sungjibaegam Mountains. Monitoring stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) and some other seismic networks detected this shallow seismic event. Analyzing seismograms recorded after this event can discriminate between a natural earthquake and an explosion. Zhao et al. (2014) successfully discriminated this seismic event of the 2013 North Korea nuclear test from ordinary earthquakes based on network P/S spectral ratios using broadband regional seismic data recorded in China, South Korea and Japan. The P/S-type spectral ratios were powerful discriminants to separate explosions from earthquakes (Zhao et al., 2014). Pribadi et al. (2014) characterized 27 earthquake-generated tsunamis (tsunamigenic earthquakes or tsunami earthquakes) from 1991 to 2012 in Indonesia using W-phase inversion analysis, the ratio between the seismic energy (E) and the seismic moment (Mo), the moment magnitude (Mw), the rupture duration (To) and the distance of the hypocenter to the trench. Some of these methods were also used here to characterize the nuclear test event. We discriminate the DPRK M5.1 February 12th, 2013 earthquake from a natural earthquake using analysis of the magnitudes mb, Ms and Mw, the ratio of seismic energy and moment, and the rupture duration. We used the waveform data of the seismicity in the region within a radius of 5 degrees from the DPRK M5.1 February 12th, 2013 epicenter at 41.29, 129.07 (Zhang and Wen, 2013) from 2006 to 2014 with magnitude M ≥ 4.0. We conclude that this earthquake was a shallow seismic event with explosion characteristics that can be discriminated from a natural or tectonic earthquake. Keywords: North Korean nuclear test, magnitudes mb, Ms, Mw, ratio between seismic energy and moment, rupture duration
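
    The energy-to-moment ratio used here is commonly summarized as the slowness parameter Θ = log10(E/Mo). A minimal sketch with illustrative values (not the paper's measurements); the physical intuition is that a compact, impulsive source radiates relatively more high-frequency energy per unit moment than a slow tectonic rupture, pushing Θ upward.

```python
import math

def theta(energy_j, moment_nm):
    """Slowness parameter Theta = log10(E / M0), with radiated seismic
    energy E in joules and seismic moment M0 in newton-meters."""
    return math.log10(energy_j / moment_nm)

# Illustrative values only: E = 1e11 J, M0 = 1e16 N*m gives Theta = -5.0.
print(theta(1.0e11, 1.0e16))
```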

  8. The Relationship Between 14C Urea Breath Test Results and Neutrophil/Lymphocyte and Platelet/Lymphocyte Ratios

    Directory of Open Access Journals (Sweden)

    Ertan Şahin

    2018-04-01

    Aim: Neutrophil/lymphocyte ratio (NLR) and platelet/lymphocyte ratio (PLR) are used as inflammatory markers in several diseases. However, there are few data regarding the diagnostic ability of NLR and PLR in Helicobacter pylori infection. We aimed to assess the association between the 14C urea breath test (14C-UBT) results and NLR and PLR in H. pylori diagnosis. Methods: Results of 89 patients were retrospectively analysed in this study. According to the 14C-UBT results, patients were divided into two groups: H. pylori (+) and H. pylori (-) (control group). Haematological parameters, including hemoglobin, white blood cell (WBC) count, neutrophil count, lymphocyte count, NLR, platelet count, and PLR, were compared between the two groups. Results: The mean total WBC count, neutrophil count, NLR and PLR in H. pylori (+) patients were significantly higher than in the control group (p<0.001 for all these parameters). In the receiver operating characteristic curve analysis, the cut-off values of NLR and PLR for the presence of H. pylori were calculated as ≥2.39 [sensitivity: 67.3%, specificity: 79.4%, area under the curve (AUC): 0.747 (0.637-0.856), p<0.0001] and ≥133.3 [sensitivity: 61.8%, specificity: 55.9%, AUC: 0.572 (0.447-0.697), p<0.05], respectively. Conclusion: The present study shows that NLR and PLR are associated with H. pylori positivity based on 14C-UBT, and they can be used as additional biomarkers for supporting the 14C-UBT results.
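
    Applying the reported cut-offs is straightforward; the blood counts below are hypothetical, while the thresholds (≥2.39 for NLR, ≥133.3 for PLR) are those quoted in the abstract.

```python
# Counts in 10^3 cells/uL from a hypothetical differential blood count.
def nlr(neutrophils, lymphocytes):
    return neutrophils / lymphocytes

def plr(platelets, lymphocytes):
    return platelets / lymphocytes

counts = {"neutrophils": 5.2, "lymphocytes": 1.8, "platelets": 290.0}
n = nlr(counts["neutrophils"], counts["lymphocytes"])  # ≈ 2.89
p = plr(counts["platelets"], counts["lymphocytes"])    # ≈ 161.1
# Both above the study's ROC cut-offs, i.e. suggestive of H. pylori positivity:
print(n >= 2.39 and p >= 133.3)  # True
```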

  9. A flexible and coherent test/estimation procedure based on restricted mean survival times for censored time-to-event data in randomized clinical trials.

    Science.gov (United States)

    Horiguchi, Miki; Cronin, Angel M; Takeuchi, Masahiro; Uno, Hajime

    2018-04-22

    In randomized clinical trials where time-to-event is the primary outcome, almost routinely, the logrank test is prespecified as the primary test and the hazard ratio is used to quantify treatment effect. If the ratio of 2 hazard functions is not constant, the logrank test is not optimal and the interpretation of the hazard ratio is not obvious. When such a nonproportional hazards case is expected at the design stage, the conventional practice is to prespecify another member of the weighted logrank tests, e.g., the Peto-Prentice-Wilcoxon test. Alternatively, one may specify a robust test as the primary test, which can capture various patterns of difference between 2 event time distributions. However, most of those tests do not have companion procedures to quantify the treatment difference, and investigators have fallen back on reporting treatment effect estimates not associated with the primary test. Such incoherence in the "test/estimation" procedure may potentially mislead clinicians/patients who have to balance risk-benefit for treatment decisions. To address this, we propose a flexible and coherent test/estimation procedure based on restricted mean survival time, where the truncation time τ is selected data dependently. The proposed procedure is composed of a prespecified test and an estimation of the corresponding robust and interpretable quantitative treatment effect. The utility of the new procedure is demonstrated by numerical studies based on 2 randomized cancer clinical trials; the test is dramatically more powerful than the logrank, Wilcoxon tests, and the restricted mean survival time-based test with a fixed τ, for the patterns of difference seen in these cancer clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
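
    The restricted mean survival time itself is the area under the Kaplan-Meier curve up to τ. A minimal sketch with a fixed τ (the proposed procedure selects τ data-dependently); the sample data are hypothetical.

```python
def rmst(times, events, tau):
    """Restricted mean survival time: area under the Kaplan-Meier survival
    curve up to tau. times: event/censoring times; events: 1 = event,
    0 = censored."""
    data = sorted(zip(times, events))
    at_risk, surv = len(data), 1.0
    area, last_t = 0.0, 0.0
    for t, d in data:
        if t > tau:
            break
        area += surv * (t - last_t)             # area at the current survival level
        if d == 1:
            surv *= (at_risk - 1) / at_risk     # KM step-down at an event time
        at_risk -= 1
        last_t = t
    area += surv * (tau - last_t)               # remaining area up to tau
    return area

times = [2.0, 3.0, 5.0, 7.0, 11.0]   # hypothetical follow-up times
events = [1, 0, 1, 1, 0]             # event indicators
print(round(rmst(times, events, tau=10.0), 3))  # 6.267
```

    A between-arm treatment effect would then be reported as the difference (or ratio) of the two arms' RMST values at the same τ.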

  10. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND-GENERATION MODEL. PART II, TEST RESULTS AND AN ANALYSIS OF RECALL RATIO.

    Science.gov (United States)

    ALTMANN, BERTHOLD

    After a brief summary of the test program (described more fully in LI 000 318), the statistical results, tabulated as overall "ABC (Approach by Concept)-relevance ratios" and "ABC-recall figures," are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet…

  11. Acceptance Sampling Plans Based on Truncated Life Tests for Sushila Distribution

    Directory of Open Access Journals (Sweden)

    Amer Ibrahim Al-Omari

    2018-03-01

    An acceptance sampling plan problem based on truncated life tests, when the lifetime follows a Sushila distribution, is considered in this paper. For various acceptance numbers, confidence levels and values of the ratio between the fixed experiment time and the specified mean lifetime, the minimum sample sizes required to ascertain a specified mean life were found. The operating characteristic function values of the suggested sampling plans and the producer’s risk are presented. Some tables are provided and the results are illustrated with an example based on a real data set.
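
    The operating characteristic function of such a plan is the probability of observing at most c failures among n items by the truncation time t, i.e. a binomial tail with failure probability p = F(t) under the assumed lifetime distribution. A sketch with a hypothetical plan, where the Sushila CDF is replaced by a plug-in value of p:

```python
import math

def oc_probability(n, c, p):
    """Probability of lot acceptance: P(at most c of n items fail by the
    truncation time), with per-item failure probability p = F(t)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1))

# Hypothetical plan: n = 20 items on test, acceptance number c = 2,
# and p = F(t) = 0.05 under the assumed lifetime distribution.
print(round(oc_probability(20, 2, 0.05), 4))  # ≈ 0.9245
```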

  12. Comparison of Urine Albumin-to-Creatinine Ratio (ACR) Between ACR Strip Test and Quantitative Test in Prediabetes and Diabetes

    Science.gov (United States)

    Cho, Seon; Kim, Suyoung; Cho, Han-Ik

    2017-01-01

    Background: Albuminuria is generally known as a sensitive marker of renal and cardiovascular dysfunction. It can be used to help predict the occurrence of nephropathy and cardiovascular disorders in diabetes. Individuals with prediabetes have a tendency to develop macrovascular and microvascular pathology, resulting in an increased risk of retinopathy, cardiovascular diseases, and chronic renal diseases. We evaluated the clinical value of a strip test for measuring the urinary albumin-to-creatinine ratio (ACR) in prediabetes and diabetes. Methods: Spot urine samples were obtained from 226 prediabetic and 275 diabetic subjects during regular health checkups. Urinary ACR was measured by using strip and laboratory quantitative tests. Results: The positive rates of albuminuria measured by using the ACR strip test were 15.5% (microalbuminuria, 14.6%; macroalbuminuria, 0.9%) and 30.5% (microalbuminuria, 25.1%; macroalbuminuria, 5.5%) in prediabetes and diabetes, respectively. In the prediabetic population, the sensitivity, specificity, positive predictive value, negative predictive value, and overall accuracy of the ACR strip method were 92.0%, 94.0%, 65.7%, 99.0%, and 93.8%, respectively; the corresponding values in the diabetic population were 80.0%, 91.6%, 81.0%, 91.1%, and 88.0%, respectively. The median [interquartile range] ACR values in the strip tests for measurement ranges of <30, 30-300, and >300 mg/g were 9.4 [6.3-15.4], 46.9 [26.5-87.7], and 368.8 [296.2-575.2] mg/g, respectively, using the laboratory method. Conclusions: The ACR strip test showed high sensitivity, specificity, and negative predictive value, suggesting that the test can be used to screen for albuminuria in cases of prediabetes and diabetes. PMID:27834062
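
    A minimal sketch of the ACR computation and the conventional mg/g cut-offs used in the study (<30, 30-300, >300); the spot-urine values are hypothetical.

```python
def acr_mg_per_g(albumin_mg_per_l, creatinine_mg_per_dl):
    # ACR (mg/g) = albumin (mg/L) / creatinine (g/L); the factor 100
    # converts creatinine from mg/dL to g/L.
    return albumin_mg_per_l * 100.0 / creatinine_mg_per_dl

def classify(acr):
    if acr < 30.0:
        return "normoalbuminuria"
    if acr <= 300.0:
        return "microalbuminuria"
    return "macroalbuminuria"

acr = acr_mg_per_g(albumin_mg_per_l=60.0, creatinine_mg_per_dl=120.0)
print(acr, classify(acr))  # 50.0 microalbuminuria
```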

  13. IRT-based test construction

    OpenAIRE

    van der Linden, Willem J.; Theunissen, T.J.J.M.; Boekkooi-Timminga, Ellen; Kelderman, Henk

    1987-01-01

    Four discussions of test construction based on item response theory (IRT) are presented. The first discussion, "Test Design as Model Building in Mathematical Programming" (T.J.J.M. Theunissen), presents test design as a decision process under certainty. A natural way of modeling this process leads to mathematical programming. General models of test construction are discussed, with information about algorithms and heuristics; ideas about the analysis and refinement of test constraints are also...

  14. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, including the landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) methods, in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary watershed of the Kaoping river watershed, which is a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10 in 2009 and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfall in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km2, equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion in the flooding processes. The area of downslope landslides in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than that of upslope landslide areas. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been shown to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
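
    Of the four methods named above, the frequency ratio (FR) is the simplest: for each class of a causative factor (e.g. a slope band), FR is the share of landslide cells falling in that class divided by the share of all cells in that class, so FR > 1 marks landslide-prone classes. The cell counts below are invented for illustration, not taken from the Chishan watershed data:

```python
# A minimal sketch of the frequency ratio (FR) susceptibility method.

def frequency_ratio(landslide_counts, class_counts):
    """FR per class: (landslide cells in class / all landslide cells)
    divided by (cells in class / all cells)."""
    total_ls = sum(landslide_counts.values())
    total = sum(class_counts.values())
    return {c: (landslide_counts[c] / total_ls) / (class_counts[c] / total)
            for c in class_counts}

# Hypothetical cell counts per slope class (degrees):
slope_classes = {"0-15": 5000, "15-30": 3000, "30-45": 1500, ">45": 500}
landslides    = {"0-15":   50, "15-30":  150, "30-45":  150, ">45":  60}

fr = frequency_ratio(landslides, slope_classes)
for c, v in fr.items():
    print(c, round(v, 2))  # FR > 1 indicates a landslide-prone class
```

    A susceptibility index for each map cell is then typically formed by summing the FR values of the classes the cell belongs to across all causative factors.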

  15. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    International Nuclear Information System (INIS)

    Dordevic, N.; Wehrens, R.; Postma, G.J.; Buydens, L.M.C.; Camin, F.

    2012-01-01

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000–2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to food legislation, wines are classified as wines without geographical origin (table wines) and wines with origin. Wines with origin must have characteristics which are essential due to their region of production and must be produced, processed and prepared exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is therefore very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000–2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to obtain better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.
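
    The advantage of multivariate over univariate checks can be illustrated with a toy example. A sample can sit within the per-variable acceptance interval of every isotope ratio yet still violate the correlation structure of the reference region; a univariate test passes it, while a multivariate (here Mahalanobis-distance) check flags it. The reference data below are simulated, not the paper's 5220 samples:

```python
# Sketch: univariate interval check vs multivariate Mahalanobis check
# against simulated reference isotope data for one region (3 correlated
# standardized variables).
import numpy as np

rng = np.random.default_rng(0)
cov_true = np.array([[1.0, 0.8, 0.2],
                     [0.8, 1.0, 0.3],
                     [0.2, 0.3, 1.0]])
ref = rng.multivariate_normal(mean=[0, 0, 0], cov=cov_true, size=500)

mu, cov = ref.mean(axis=0), np.cov(ref, rowvar=False)
cov_inv = np.linalg.inv(cov)

def univariate_ok(x, k=3.0):
    """Each variable separately within mean +/- k standard deviations."""
    return bool(np.all(np.abs(x - mu) <= k * ref.std(axis=0)))

def mahalanobis(x):
    """Multivariate distance accounting for the correlation structure."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Plausible variable-by-variable, but the (+2, -2) pair contradicts the
# strong positive correlation between the first two variables:
suspect = np.array([2.0, -2.0, 0.0])
print(univariate_ok(suspect), round(mahalanobis(suspect), 1))
```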

  16. Corrosion resistance and electrochemical potentiokinetic reactivation testing of some iron-base hardfacing alloys

    International Nuclear Information System (INIS)

    Cockeram, B.V.

    1999-01-01

    Hardfacing alloys are weld deposited on a base material to provide a wear resistant surface. Commercially available iron-base hardfacing alloys are being evaluated for replacement of cobalt-base alloys to reduce nuclear plant activation levels. Corrosion testing was used to evaluate the corrosion resistance of several iron-base hardfacing alloys in highly oxygenated environments. The corrosion test results indicate that iron-base hardfacing alloys in the as-deposited condition have acceptable corrosion resistance when the chromium to carbon ratio is greater than 4. Tristelle 5183, with a high niobium (stabilizer) content, did not follow this trend due to precipitation of niobium-rich carbides instead of chromium-rich carbides. This result indicates that iron-base hardfacing alloys containing high stabilizer contents may possess good corrosion resistance with Cr:C < 4. NOREM 02, NOREM 01, and NoCo-M2 hardfacing alloys had acceptable corrosion resistance in the as-deposited and 885 C/4 hour heat treated condition, but rusting from sensitization was observed in the 621 C/6 hour heat treated condition. The feasibility of using an Electrochemical Potentiokinetic Reactivation (EPR) test method, such as used for stainless steel, to detect sensitization in iron-base hardfacing alloys was evaluated. A single loop-EPR method was found to provide a more consistent measurement of sensitization than a double loop-EPR method. The high carbon content that is needed for a wear resistant hardfacing alloy produces a high volume fraction of chromium-rich carbides that are attacked during EPR testing. This results in inherently lower sensitivity for detection of a sensitized iron-base hardfacing alloy than stainless steel using conventional EPR test methods

  17. The influence of the negative-positive ratio and screening database size on the performance of machine learning-based virtual screening.

    Science.gov (United States)

    Kurczab, Rafał; Bojarski, Andrzej J

    2017-01-01

    The machine learning-based virtual screening of molecular databases is a commonly used approach to identify hits. However, many aspects associated with training predictive models can influence the final performance and, consequently, the number of hits found. Thus, we performed a systematic study of the simultaneous influence of the proportion of negatives to positives in the testing set, the size of screening databases and the type of molecular representations on the effectiveness of classification. The results obtained for eight protein targets, five machine learning algorithms (SMO, Naïve Bayes, Ibk, J48 and Random Forest), two types of molecular fingerprints (MACCS and CDK FP) and eight screening databases with different numbers of molecules confirmed our previous findings that increases in the ratio of negative to positive training instances greatly influenced most of the investigated parameters of the ML methods in simulated virtual screening experiments. However, the performance of screening was shown to also be highly dependent on the molecular library dimension. Generally, with the increasing size of the screened database, the optimal training ratio also increased, and this ratio can be rationalized using the proposed cost-effectiveness threshold approach. To increase the performance of machine learning-based virtual screening, the training set should be constructed in a way that considers the size of the screening database.
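
    The interaction between class ratio and library size described above has a simple analytic core: for a classifier with fixed sensitivity and specificity, the precision among predicted hits falls as the screened library's negative:positive ratio grows, which is why the optimal training ratio shifts with database dimension. The numbers below are illustrative, not the paper's benchmarks:

```python
# Precision of a virtual-screening classifier as a function of the
# library's negative:positive ratio, at fixed sensitivity/specificity.

def screening_precision(sens, spec, neg_per_pos):
    """Expected fraction of true actives among predicted hits when the
    screened library has neg_per_pos inactives per active."""
    tp = sens * 1.0
    fp = (1.0 - spec) * neg_per_pos
    return tp / (tp + fp)

for ratio in (1, 10, 100, 1000):
    print(ratio, round(screening_precision(0.9, 0.99, ratio), 3))
```

    Even a 99%-specific model yields mostly false hits once inactives outnumber actives 1000:1, motivating the cost-effectiveness threshold reasoning in the abstract.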

  18. Induced Voltages Ratio-Based Algorithm for Fault Detection, and Faulted Phase and Winding Identification of a Three-Winding Power Transformer

    Directory of Open Access Journals (Sweden)

    Byung Eun Lee

    2014-09-01

    This paper proposes an algorithm for fault detection and faulted phase and winding identification of a three-winding power transformer based on the induced voltages in the electrical power system. The ratio of the induced voltages of the primary-secondary, primary-tertiary and secondary-tertiary windings is the same as the corresponding turns ratio during normal operating conditions, magnetic inrush, and over-excitation; it differs from the turns ratio during an internal fault. For a single-phase and a three-phase power transformer with wye-connected windings, the induced voltages of each pair of windings are estimated. For a three-phase power transformer with delta-connected windings, the induced voltage differences are estimated using the line currents, because the delta winding currents are practically unavailable. Six detectors are suggested for fault detection. An additional three detectors and a rule for faulted phase and winding identification are presented as well. The proposed algorithm can not only detect an internal fault, but also identify the faulted phase and winding of a three-winding power transformer. The various test results with Electromagnetic Transients Program (EMTP)-generated data show that the proposed algorithm successfully discriminates internal faults from normal operating conditions including magnetic inrush and over-excitation. This paper concludes by implementing the algorithm in a prototype relay based on a digital signal processor.
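
    The core ratio check above can be sketched in a few lines: during normal conditions (including inrush and over-excitation) the induced-voltage ratio of a winding pair tracks the turns ratio, and a sustained deviation flags an internal fault. The voltage samples, tolerance, and zero-crossing guard below are invented for illustration, not the paper's six detectors:

```python
# Minimal sketch of an induced-voltage-ratio fault detector for one
# winding pair of a transformer.

def ratio_detector(v_primary, v_secondary, turns_ratio, tol=0.05):
    """Flag samples where |v1/v2 - n| exceeds tol relative to n."""
    flags = []
    for v1, v2 in zip(v_primary, v_secondary):
        if abs(v2) < 1e-6:        # skip division near a voltage zero-crossing
            flags.append(False)
            continue
        flags.append(abs(v1 / v2 - turns_ratio) > tol * turns_ratio)
    return flags

n = 10.0                           # primary:secondary turns ratio
v1 = [100.0, 101.0, 100.5, 60.0]   # last sample: collapsed primary voltage
v2 = [10.0, 10.1, 10.0, 10.0]
print(ratio_detector(v1, v2, n))   # → [False, False, False, True]
```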

  19. Visualization of and Software for Omnibus Test Based Change Detected in a Time Series of Polarimetric SAR Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning

    2017-01-01

    Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution and a factorization of this test statistic with associated p-values, change analysis in a time series of multilook polarimetric SAR data...... in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change occurs. Using airborne EMISAR and spaceborne RADARSAT-2 data this paper focuses on change detection based on the p-values, on visualization of change at pixel as well as segment level......, and on computer software....

  20. Practical in-situ determination of ortho-para hydrogen ratios via fiber-optic based Raman spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Sutherland, Liese-Marie; Knudson, James N.; Mocko, Michal; Renneke, Richard M.

    2016-02-21

    An experiment was designed and developed to prototype a fiber-optic-based laser system, which measures the ratio of ortho-hydrogen to para-hydrogen in an operating neutron moderator system at the Los Alamos Neutron Science Center (LANSCE) spallation neutron source. Preliminary measurements resulted in an ortho to para ratio of 3.06:1, which is within acceptable agreement with the previously published ratio. The successful demonstration of Raman Spectroscopy for this measurement is expected to lead to a practical method that can be applied for similar in-situ measurements at operating neutron spallation sources.

  1. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    Science.gov (United States)

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than bin values' differences which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
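
    The paper defines the BRD precisely; the sketch below is only a simplified distance in the same spirit (not the authors' exact formula): it compares all pairwise bin ratios of two histograms, so a global rescaling of either histogram leaves the distance unchanged, which is the normalization-robustness property the abstract describes:

```python
# Illustrative ratio-based cross-bin histogram distance. The eps term
# guards against division by empty bins; the per-pair denominator bounds
# each contribution. Not the exact BRD of the paper.
import math

def pairwise_ratio_distance(h, g, eps=1e-12):
    d = 0.0
    n = len(h)
    for i in range(n):
        for j in range(n):
            ri = h[i] / (h[j] + eps)   # bin ratio in histogram h
            rj = g[i] / (g[j] + eps)   # same bin ratio in histogram g
            d += (ri - rj) ** 2 / (1.0 + ri ** 2 + rj ** 2)
    return math.sqrt(d / (n * n))

h = [0.1, 0.4, 0.3, 0.2]
scaled = [2 * x for x in h]        # same shape, different normalization
other = [0.4, 0.1, 0.2, 0.3]
print(round(pairwise_ratio_distance(h, scaled), 6))  # scale-invariant → ~0
print(pairwise_ratio_distance(h, other) > 0.1)
```

    Note this naive double loop is quadratic in the number of bins; part of the paper's contribution is achieving the ratio-based comparison with linear complexity.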

  2. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases in which the random variate follows a normal law as well as a Bernoulli law.
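
    For reference, the classical Wald SPRT that the above work modifies can be sketched for Bernoulli data: accumulate the log-likelihood ratio and stop at boundaries A = (1-β)/α and B = β/(1-α). The constraint discussed in the abstract (fixing α+β) would be layered on top of this basic procedure; the data and error targets below are invented:

```python
# Minimal Wald SPRT for Bernoulli observations: H0: p=p0 vs H1: p=p1.
import math

def sprt_bernoulli(xs, p0, p1, alpha=0.05, beta=0.05):
    """Return ('accept H1'|'accept H0'|'continue', samples used)."""
    log_a = math.log((1 - beta) / alpha)   # upper boundary
    log_b = math.log(beta / (1 - alpha))   # lower boundary
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= log_a:
            return "accept H1", n
        if llr <= log_b:
            return "accept H0", n
    return "continue", len(xs)

# A run of mostly successes stops early in favour of p1 = 0.8:
data = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1]
print(sprt_bernoulli(data, p0=0.5, p1=0.8))  # → ('accept H1', 10)
```

    The appeal of the sequential design is visible here: a decision is reached after 10 observations, typically well short of the comparable fixed sample size.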

  3. Cost-effectiveness of population based BRCA testing with varying Ashkenazi Jewish ancestry.

    Science.gov (United States)

    Manchanda, Ranjit; Patel, Shreeya; Antoniou, Antonis C; Levy-Lahad, Ephrat; Turnbull, Clare; Evans, D Gareth; Hopper, John L; Macinnis, Robert J; Menon, Usha; Jacobs, Ian; Legood, Rosa

    2017-11-01

    Population-based BRCA1/BRCA2 testing has been found to be cost-effective compared with family history-based testing in Ashkenazi-Jewish women who were >30 years old with 4 Ashkenazi-Jewish grandparents. However, individuals may have 1, 2, or 3 Ashkenazi-Jewish grandparents, and cost-effectiveness data are lacking at these lower BRCA prevalence estimates. We present an updated cost-effectiveness analysis of population BRCA1/BRCA2 testing for women with 1, 2, and 3 Ashkenazi-Jewish grandparents. Lifetime costs and effects of population and family history-based testing were compared with the use of a decision analysis model; 56% of BRCA carriers are missed by family history criteria alone. Analyses were conducted for United Kingdom and United States populations. Model parameters were obtained from the Genetic Cancer Prediction through Population Screening trial and published literature. Model parameters and BRCA population prevalence for individuals with 3, 2, or 1 Ashkenazi-Jewish grandparent were adjusted for the relative frequency of BRCA mutations in the Ashkenazi-Jewish and general populations. Incremental cost-effectiveness ratios were calculated for all Ashkenazi-Jewish grandparent scenarios. Costs, along with outcomes, were discounted at 3.5%. The time horizon of the analysis is "life-time," and the perspective is "payer." Probabilistic sensitivity analysis evaluated model uncertainty. Population testing for BRCA mutations is cost-saving in Ashkenazi-Jewish women with 2, 3, or 4 grandparents (22-33 days of life gained) in the United Kingdom and 1, 2, 3, or 4 grandparents (12-26 days of life gained) in the United States, respectively. It is also extremely cost-effective in women in the United Kingdom with just 1 Ashkenazi-Jewish grandparent, with an incremental cost-effectiveness ratio of £863 per quality-adjusted life-year and 15 days of life gained. Results show that population testing remains cost-effective at the £20,000-£30,000 per quality-adjusted life-year threshold.
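
    The incremental cost-effectiveness ratio used above is simply the cost difference divided by the effectiveness (QALY) difference between the two strategies, and a strategy that is both cheaper and more effective is "dominant" (cost-saving). The figures below are hypothetical, purely to illustrate the arithmetic:

```python
# Incremental cost-effectiveness ratio (ICER) between two strategies.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Return cost per QALY gained, or a 'dominant' label when the new
    strategy is cheaper and more effective (i.e. cost-saving)."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant (cost-saving)"
    return d_cost / d_qaly

# Population testing vs family-history testing, invented figures:
print(icer(51200.0, 20.5, 50000.0, 20.0))  # → 2400.0 per QALY gained
print(icer(49000.0, 20.5, 50000.0, 20.0))  # cheaper and better → dominant
```

    A willingness-to-pay threshold (e.g. £20,000-£30,000 per QALY in the UK) then determines whether a positive ICER counts as cost-effective.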

  4. Base Flow and Heat Transfer Characteristics of a Four-Nozzle Clustered Rocket Engine: Effect of Nozzle Pressure Ratio

    Science.gov (United States)

    Nallasamy, R.; Kandula, M.; Duncil, L.; Schallhorn, P.

    2010-01-01

    The base pressure and heating characteristics of a four-nozzle clustered rocket configuration is studied numerically with the aid of OVERFLOW Navier-Stokes code. A pressure ratio (chamber pressure to freestream static pressure) range of 990 to 5,920 and a freestream Mach number range of 2.5 to 3.5 are studied. The qualitative trends of decreasing base pressure with increasing pressure ratio and increasing base heat flux with increasing pressure ratio are correctly predicted. However, the predictions for base pressure and base heat flux show deviations from the wind tunnel data. The differences in absolute values between the computation and the data are attributed to factors such as perfect gas (thermally and calorically perfect) assumption, turbulence model inaccuracies in the simulation, and lack of grid adaptation.

  5. Evaluation of a Secure Laptop-Based Testing Program in an Undergraduate Nursing Program: Students' Perspective.

    Science.gov (United States)

    Tao, Jinyuan; Gunter, Glenda; Tsai, Ming-Hsiu; Lim, Dan

    2016-01-01

    Recently, robust learning management systems and the availability of affordable laptops have made secure laptop-based testing a reality on many campuses. The undergraduate nursing program at the authors' university began to implement a secure laptop-based testing program in 2009, which allowed students to use their newly purchased laptops to take quizzes and tests securely in classrooms. After nearly 5 years of program implementation, a formative evaluation, using a mixed method with both descriptive and correlational data elements, was conducted to seek constructive feedback from students to improve the program. Evaluation data show that, overall, students (n = 166) believed the secure laptop-based testing program gives them hands-on experience of taking examinations on the computer and prepares them for their computerized NCLEX-RN. Students, however, had many concerns about laptop glitches and campus wireless network glitches they experienced during testing. In addition, NCLEX-RN first-time passing rate data were analyzed using the χ² test, which revealed no significant association between the two testing methods (paper-and-pencil testing and secure laptop-based testing) and students' first-time NCLEX-RN passing rate. Based on the odds ratio, however, the odds of students passing NCLEX-RN the first time were 1.37 times higher if they were taught with the secure laptop-based testing method than with the traditional paper-and-pencil testing method in nursing school. It was recommended to the institution that better-quality laptops be provided to future students, that measures be taken to further stabilize the campus wireless network, and that the Laptop Initiative Program be reevaluated.
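
    An odds ratio like the 1.37 reported above comes from a standard 2x2 computation, OR = (a/b)/(c/d), on pass/fail counts under the two testing methods. The counts below are invented to show the arithmetic only; they are not the study's actual pass/fail table:

```python
# Odds ratio from a 2x2 pass/fail table.

def odds_ratio(a, b, c, d):
    """a, b = pass/fail counts with method 1; c, d = with method 2."""
    return (a / b) / (c / d)

# e.g. laptop-tested: 90 pass / 10 fail; paper-tested: 87 pass / 13 fail
print(round(odds_ratio(90, 10, 87, 13), 2))
```

    As in the abstract, an odds ratio above 1 can coexist with a non-significant χ² test; the point estimate favours one method while the sample is too small to rule out chance.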

  6. Total Protein and Albumin/Globulin Ratio Test

    Science.gov (United States)

    ... of the various types of proteins in the liquid (serum or plasma) portion of the blood. Two ...

  7. Neutron measurements of stresses in a test artifact produced by laser-based additive manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Gnäupel-Herold, Thomas [Center for Neutron Research, National Institute of Standards and Technology, Gaithersburg MD 20899-6102 (United States); Slotwinski, John; Moylan, Shawn [Intelligent Systems Division, National Institute of Standards and Technology, Gaithersburg MD 20899-8220 (United States)

    2014-02-18

    A stainless steel test artifact produced by Direct Metal Laser Sintering, similar to a proposed standardized test artifact, was examined using neutron diffraction. The artifact contained a number of structures with different aspect ratios pertaining to wall thickness, height above the base plate, and side length. With a spatial resolution of the order of one millimeter, the volumetric distribution of stresses in several of these structures was measured. It was found that the stresses peak in the tensile region around 500 MPa near the top surface, with balancing compressive stresses in the interior. The presence of a support structure (a one-millimeter-high, thin-walled, hence weaker, lattice structure deposited on the base plate, followed by a fully dense AM structure) has only minor effects on the stresses.

  8. A Feature Selection Method Based on Fisher's Discriminant Ratio for Text Sentiment Classification

    Science.gov (United States)

    Wang, Suge; Li, Deyu; Wei, Yingjie; Li, Hongxia

    With the rapid growth of e-commerce, product reviews on the Web have become an important information source for customers' decision making when they intend to buy a product. As the reviews are often too many for customers to go through, how to automatically classify them into different sentiment orientation categories (i.e., positive/negative) has become a research problem. In this paper, based on Fisher's discriminant ratio, an effective feature selection method is proposed for product review text sentiment classification. In order to validate the proposed method, we compared it with other methods based on information gain and mutual information, respectively, with a support vector machine adopted as the classifier. Six subexperiments are conducted by combining the different feature selection methods with 2 kinds of candidate feature sets. On 1,006 car review documents, the experimental results indicate that Fisher's discriminant ratio based on word frequency estimation has the best performance, with an F value of 83.3%, when the candidate features are the words that appear in both positive and negative texts.
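
    Per-feature scoring with Fisher's discriminant ratio can be sketched directly: a feature ranks highly when its class means are far apart relative to its within-class variances. The word-frequency data below are invented to illustrate the criterion, not drawn from the review corpus:

```python
# Fisher's discriminant ratio for a single feature over two classes:
# (mean1 - mean2)^2 / (var1 + var2).
import statistics

def fisher_ratio(pos_values, neg_values):
    m1, m2 = statistics.mean(pos_values), statistics.mean(neg_values)
    v1 = statistics.pvariance(pos_values)
    v2 = statistics.pvariance(neg_values)
    return (m1 - m2) ** 2 / (v1 + v2)

# Frequencies of two candidate words across positive / negative reviews:
good = ([5, 6, 5, 7], [1, 0, 1, 2])    # separates the classes well
the  = ([9, 8, 10, 9], [9, 10, 8, 9])  # uninformative
print(fisher_ratio(*good) > fisher_ratio(*the))  # → True
```

    Feature selection then keeps the top-k words by this score before training the classifier.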

  9. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  10. Bankruptcy Prediction Based on the Autonomy Ratio

    Directory of Open Access Journals (Sweden)

    Daniel Brîndescu Olariu

    2016-11-01

    The theory and practice of financial ratio analysis suggest the existence of a negative correlation between the autonomy ratio and bankruptcy risk. Previous studies conducted on a sample of companies from Timis County (the largest county in Romania) confirm this hypothesis and recommend the autonomy ratio as a useful tool for measuring bankruptcy risk two years in advance. The objective of the current research was to develop a methodology for measuring bankruptcy risk that would be applicable to the companies of Timis County (specific methodologies are considered necessary for each region). The target population consisted of all companies from Timis County with annual sales of over 10,000 lei (approx. 2,200 euros). The research was performed on the entire target population. The study thus included 53,252 yearly financial statements from the period 2007-2010. The results of the study allow for the setting of benchmarks, as well as the configuration of a methodology of analysis. The proposed methodology cannot predict the state of the company with perfect accuracy, but it allows for a valuation of the risk level to which the company is subjected.

  11. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    International Nuclear Information System (INIS)

    Chiara, P.; Morelli, A.

    2010-01-01

    The research of innovative non-contact techniques aimed at the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continuously directed to the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the more recent technique available for static and dynamic control of structures and ground movements.Dynamic testing of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out by using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overtaken.This paper presents and discusses the results of various tests carried out on full-scale bridges by using a Stepped Frequency-Continuous Wave radar system.

  13. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    The growing complexity of software and the demand for shorter time to market are two important challenges facing today's IT industry. These challenges demand an increase in both the productivity and the quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models help to navigate from one model to another, and to trace back to the respective requirements and the design model when a test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose a relation definition markup language (RDML) for defining the relationships between models.

  14. Pengaruh Profitabilitas Dan Likuiditas Terhadap Capital Adequacy Ratio (CAR) Pada Bank Rakyat Indonesia (PERSERO) Tbk

    OpenAIRE

    Situmorang, Patar Sardo

    2011-01-01

    Several factors, such as liquidity and profitability, influence banking performance. These can be represented by financial ratios, which can predict banking performance on capital matters (Capital Adequacy Ratio). The purpose of this research is to test the influence of the variables Interest Margin Loan (IML), Return on Equity (ROE), Loan to Deposit Ratio (LDR), and Reserve Requirement (GWM) on the Capital Adequacy Ratio (CAR). The data used in this research are based on public...

  15. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    In this paper, the authors present a novel personal verification system based on the likelihood ratio test for fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of match scores for personal verification. Our approach is also compared with well-known approaches such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves excellent identification performance, with higher accuracy than these approaches, and can thus be utilized in applications related to person verification.
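
    The fusion rule above can be sketched compactly: fit one Gaussian mixture to genuine match scores and one to impostor scores, then accept when the log-likelihood ratio of a new score vector exceeds a threshold. The scores below are simulated two-matcher data and the 2-component mixtures are an assumption for illustration, not the paper's configuration:

```python
# Hedged sketch of likelihood-ratio match-score fusion with GMMs.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Simulated match scores from two matchers (columns): genuine vs impostor.
genuine = np.column_stack([rng.normal(0.8, 0.1, 300),
                           rng.normal(0.7, 0.15, 300)])
impostor = np.column_stack([rng.normal(0.3, 0.1, 300),
                            rng.normal(0.35, 0.15, 300)])

gmm_gen = GaussianMixture(n_components=2, random_state=0).fit(genuine)
gmm_imp = GaussianMixture(n_components=2, random_state=0).fit(impostor)

def fused_llr(scores):
    """Log-likelihood ratio of a [matcher1, matcher2] score pair;
    accept the identity claim when it exceeds a chosen threshold."""
    s = np.asarray(scores, dtype=float).reshape(1, -1)
    return float((gmm_gen.score_samples(s) - gmm_imp.score_samples(s))[0])

print(fused_llr([0.85, 0.75]) > 0)   # genuine-looking pair → accept
print(fused_llr([0.25, 0.30]) < 0)   # impostor-looking pair → reject
```

    The threshold (here 0) trades false accepts against false rejects and would be tuned on validation data.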

  16. Estimating negative likelihood ratio confidence when test sensitivity is 100%: A bootstrapping approach.

    Science.gov (United States)

    Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B

    2017-08-01

    Objectives Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the Score CI (as implemented in the StatXact, SAS, and R PropCI package), and (4) the bootstrapping technique. Results The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease, and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0,0.073), StatXact (0,0.064), SAS Score method (0,0.057), R PropCI (0,0.062), and bootstrap (0,0.048). Similar trends were observed for other sample sizes. Conclusions When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and accompanying free open-source R package were developed to yield realistic estimates easily. This
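
    The bootstrap idea above can be sketched in miniature (the full method, automated in the R package "bootLR" mentioned in the abstract, first pins down the lowest population sensitivity likely to yield a 100%-sensitive sample): draw binomial resamples of the diseased and non-diseased groups from conservative rates and take percentiles of the resulting negative likelihood ratios, LR- = (1 - sensitivity)/specificity. The resample count and seed below are arbitrary choices:

```python
# Simplified bootstrap CI for the negative likelihood ratio when the
# observed sample sensitivity is 100%.
import random

def neg_lr_ci(n_disease, n_healthy, pop_sens, pop_spec,
              n_boot=5000, seed=7):
    random.seed(seed)
    lrs = []
    for _ in range(n_boot):
        tp = sum(random.random() < pop_sens for _ in range(n_disease))
        tn = sum(random.random() < pop_spec for _ in range(n_healthy))
        sens, spec = tp / n_disease, tn / n_healthy
        if spec > 0:
            lrs.append((1 - sens) / spec)
    lrs.sort()
    return lrs[int(0.025 * len(lrs))], lrs[int(0.975 * len(lrs))]

# 100 diseased / 100 healthy, population sensitivity just below 100%
# (the abstract's 99.31% figure), specificity 60%:
lo, hi = neg_lr_ci(100, 100, pop_sens=0.9931, pop_spec=0.60)
print(round(lo, 3), round(hi, 3))
```

    The lower bound collapses to 0 (the point estimate when no diseased patient tests negative), while the upper bound stays finite and narrow, which is the realistic interval the abstract contrasts with the wider analytic CIs.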

  17. Multi-objective Search-based Mobile Testing

    OpenAIRE

    Mao, K.

    2017-01-01

    Despite the tremendous popularity of mobile applications, mobile testing still relies heavily on manual testing. This thesis presents mobile test automation approaches based on multi-objective search. We introduce three approaches: Sapienz (for native Android app testing), Octopuz (for hybrid/web JavaScript app testing) and Polariz (for using crowdsourcing to support search-based mobile testing). These three approaches represent the primary scientific and technical contributions of the thesis...

  18. A Switched-Capacitor Based High Conversion Ratio Converter for Renewable Energy Applications

    DEFF Research Database (Denmark)

    Li, Kerui; Yin, Zhijian; Yang, Yongheng

    2017-01-01

    A high step-up switched-capacitor based converter is proposed in this paper. The proposed converter features high conversion ratio, low voltage stress and continuous input current, which makes it very suitable for renewable energy applications like photovoltaic systems. More importantly...... voltage gain, low voltage stress on the switches, continuous input current, and relatively high efficiency....

  19. Computerized Classification Testing with the Rasch Model

    Science.gov (United States)

    Eggen, Theo J. H. M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
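
    A minimal sketch of the SPRT termination rule under the Rasch model follows (Wald boundaries; the cutoff, indifference-region bounds, and error rates below are illustrative values, not ones from the cited studies):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def sprt_classify(responses, difficulties, theta0, theta1, alpha=0.05, beta=0.05):
    """Wald's SPRT for pass/fail classification (sketch).
    theta0 and theta1 bound the indifference region around the cutoff."""
    upper = math.log((1 - beta) / alpha)   # cross it: accept H1 ("pass")
    lower = math.log(beta / (1 - alpha))   # cross it: accept H0 ("fail")
    llr = 0.0
    for x, b in zip(responses, difficulties):
        p0, p1 = rasch_p(theta0, b), rasch_p(theta1, b)
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "pass"
        if llr <= lower:
            return "fail"
    return "continue"   # administer another item

print(sprt_classify([1] * 8, [0.0] * 8, theta0=-0.5, theta1=0.5))  # → pass
```

    With items of difficulty 0 and θ points at ±0.5, each correct answer adds exactly 0.5 to the log-likelihood ratio, so six correct answers cross the upper boundary ln(19) ≈ 2.94 and the test terminates early.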

  20. Overview of a benefit/risk ratio optimized for a radiation emitting device used in non-destructive testing

    Energy Technology Data Exchange (ETDEWEB)

    Maharaj, H.P., E-mail: H_P_Maharaj@hc-sc.gc.ca [Health Canada, Dept. of Health, Consumer and Clinical Radiation Protection Bureau, Ottawa, Ontario (Canada)

    2016-03-15

    This paper aims to provide an overview of an optimized benefit/risk ratio for a radiation emitting device. The device, which is portable, hand-held, and open-beam x-ray tube based, is utilized by a wide variety of industries for purposes of determining elemental or chemical analyses of materials in-situ based on fluorescent x-rays. These analyses do not cause damage or permanent alteration of the test materials and are considered a non-destructive test (NDT). Briefly, the key characteristics, principles of use and radiation hazards associated with the Hay device are presented and discussed. In view of the potential radiation risks, a long term strategy that incorporates risk factors and guiding principles intended to mitigate the radiation risks to the end user was considered and applied. Consequently, an operator certification program was developed on the basis of an International Standards Organization (ISO) standard (ISO 20807:2004) and in collaboration with various stakeholders and was implemented by a federal national NDT certification body several years ago. It comprises a written radiation safety examination and hands-on training with the x-ray device. The operator certification program was recently revised and the changes appear beneficial. There is a fivefold increase in operator certification (Levels 1 and 2) to date compared with earlier years. Results are favorable and promising. An operational guidance document is available to help mitigate radiation risks. Operator certification in conjunction with the use of the operational guidance document is prudent, and is recommended for end users of the x-ray device. Manufacturers and owners of the x-ray devices will also benefit from the operational guidance document. (author)

  1. Overview of a benefit/risk ratio optimized for a radiation emitting device used in non-destructive testing

    International Nuclear Information System (INIS)

    Maharaj, H.P.

    2016-01-01

    This paper aims to provide an overview of an optimized benefit/risk ratio for a radiation emitting device. The device, which is portable, hand-held, and open-beam x-ray tube based, is utilized by a wide variety of industries for purposes of determining elemental or chemical analyses of materials in-situ based on fluorescent x-rays. These analyses do not cause damage or permanent alteration of the test materials and are considered a non-destructive test (NDT). Briefly, the key characteristics, principles of use and radiation hazards associated with the Hay device are presented and discussed. In view of the potential radiation risks, a long term strategy that incorporates risk factors and guiding principles intended to mitigate the radiation risks to the end user was considered and applied. Consequently, an operator certification program was developed on the basis of an International Standards Organization (ISO) standard (ISO 20807:2004) and in collaboration with various stakeholders and was implemented by a federal national NDT certification body several years ago. It comprises a written radiation safety examination and hands-on training with the x-ray device. The operator certification program was recently revised and the changes appear beneficial. There is a fivefold increase in operator certification (Levels 1 and 2) to date compared with earlier years. Results are favorable and promising. An operational guidance document is available to help mitigate radiation risks. Operator certification in conjunction with the use of the operational guidance document is prudent, and is recommended for end users of the x-ray device. Manufacturers and owners of the x-ray devices will also benefit from the operational guidance document. (author)

  2. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    Directory of Open Access Journals (Sweden)

    Li Benkai

    2016-08-01

    Vegetable oil can be used as a base oil in minimum quantity lubrication (MQL). This study compared the performances of MQL grinding by using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a workpiece material with a high-temperature nickel base alloy. A YDM–III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type thermocouple was used to determine grinding temperature. The grinding force, grinding temperature, and energy ratio coefficient of MQL grinding were compared among the seven vegetable oil types. Results revealed that (1) castor oil-based MQL grinding yields the lowest grinding force but exhibits the highest grinding temperature and energy ratio coefficient; (2) palm oil-based MQL grinding generates the second lowest grinding force but shows the lowest grinding temperature and energy ratio coefficient; (3) MQL grinding based on the five other vegetable oils produces similar grinding forces, grinding temperatures, and energy ratio coefficients, with values ranging between those of castor oil and palm oil; (4) viscosity significantly influences grinding force and grinding temperature to a greater extent than fatty acid varieties and contents in vegetable oils; (5) although more viscous vegetable oil exhibits greater lubrication and significantly lower grinding force than less viscous vegetable oil, high viscosity reduces the heat exchange capability of vegetable oil and thus yields a high grinding temperature; (6) saturated fatty acid is a more efficient lubricant than unsaturated fatty acid; and (7) a short carbon chain transfers heat more effectively than a long carbon chain. Palm oil is the optimum base oil of MQL grinding, and this base oil yields 26.98 N tangential grinding force, 87.10 N normal grinding force, 119.6 °C grinding temperature, and 42.7% energy

  3. Comparison of ergometer- and track-based testing in junior track-sprint cyclists. Implications for talent identification and development.

    Science.gov (United States)

    Tofari, Paul J; Cormack, Stuart J; Ebert, Tammie R; Gardner, A Scott; Kemp, Justin G

    2017-10-01

    Talent identification (TID) and talent development (TDE) programmes in track sprint cycling use ergometer- and track-based tests to select junior athletes and assess their development. The purpose of this study was to assess which tests are best at monitoring TID and TDE. Ten male participants (16.2 ± 1.1 years; 178.5 ± 6.0 cm; 73.6 ± 7.6 kg) were selected into the national TID squad based on initial testing. These tests consisted of two 6-s maximal sprints on a custom-built ergometer and 4 maximal track-based tests (2 rolling and 2 standing starts) using 2 gear ratios. Magnitude-based inferences and correlation coefficients assessed changes following a 3-month TDE programme. Training elicited meaningful improvements (80-100% likely) in all ergometer parameters. The standing- and rolling-start, small-gear, track-based effort times were likely and very likely improved by training (3.2 ± 2.4% and 3.3 ± 1.9%, respectively). Stronger correlations between ergometer- and track-based measures were very likely following training. Ergometer-based testing provides a more sensitive tool than track-based testing to monitor changes in neuromuscular function during the early stages of TDE. However, track-based testing can indicate skill-based improvements in performance when interpreted with ergometer testing. In combination, these tests provide information on overall talent development.

  4. A family-based joint test for mean and variance heterogeneity for quantitative traits.

    Science.gov (United States)

    Cao, Ying; Maxwell, Taylor J; Wei, Peng

    2015-01-01

    Traditional quantitative trait locus (QTL) analysis focuses on identifying loci associated with mean heterogeneity. Recent research has discovered loci associated with phenotype variance heterogeneity (vQTL), which is important in studying genetic association with complex traits, especially for identifying gene-gene and gene-environment interactions. While several tests have been proposed to detect vQTL for unrelated individuals, there are no tests for related individuals, commonly seen in family-based genetic studies. Here we introduce a likelihood ratio test (LRT) for identifying mean and variance heterogeneity simultaneously or for either effect alone, adjusting for covariates and family relatedness using a linear mixed effect model approach. The LRT test statistic for normally distributed quantitative traits approximately follows χ²-distributions. To correct for inflated Type I error for non-normally distributed quantitative traits, we propose a parametric bootstrap-based LRT that removes the best linear unbiased prediction (BLUP) of family random effect. Simulation studies show that our family-based test controls Type I error and has good power, while Type I error inflation is observed when family relatedness is ignored. We demonstrate the utility and efficiency gains of the proposed method using data from the Framingham Heart Study to detect loci associated with body mass index (BMI) variability. © 2014 John Wiley & Sons Ltd/University College London.
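
    For intuition, a toy version of such a joint mean-variance LRT for unrelated individuals can be written directly from normal likelihoods. This sketch ignores covariates and the family random effect that the paper's linear mixed model handles; the function and variable names are ours.

```python
import numpy as np
from scipy import stats

def joint_mean_var_lrt(y, g):
    """Joint LRT for mean and variance heterogeneity across genotype
    groups (unrelated-individuals sketch; the family-based version in
    the paper additionally models relatedness with a random effect)."""
    y, g = np.asarray(y, float), np.asarray(g)
    groups = np.unique(g)
    # Null model: a single normal distribution for everyone (MLE fits)
    ll0 = stats.norm.logpdf(y, y.mean(), y.std()).sum()
    # Alternative: group-specific mean and variance
    ll1 = sum(stats.norm.logpdf(y[g == k], y[g == k].mean(),
                                y[g == k].std()).sum() for k in groups)
    lrt = 2.0 * (ll1 - ll0)
    df = 2 * (len(groups) - 1)   # one extra mean + one extra variance per group
    return lrt, stats.chi2.sf(lrt, df)

# Simulated data in which both the mean and the variance depend on genotype:
rng = np.random.default_rng(0)
g = rng.integers(0, 3, 600)
y = rng.normal(0.2 * g, 1.0 + 0.5 * g)
lrt, p = joint_mean_var_lrt(y, g)
print(lrt > 0, p < 0.05)
```

    Because the null model is nested in the alternative, the statistic is non-negative, and under normality it is compared to a χ² distribution with 2(K−1) degrees of freedom for K genotype groups, mirroring the test structure described in the abstract.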

  5. Test and improvement of readout system based on APV25 chip for GEM detector

    International Nuclear Information System (INIS)

    Hu Shouyang; Jian Siyu; Zhou Jing; Shan Chao; Li Xinglong; Li Xia; Li Xiaomei; Zhou Yi

    2014-01-01

    The gas electron multiplier (GEM) is one of the most promising position-sensitive gas detectors. The new generation of readout electronics includes the APV25 front-end card, a multi-purpose digitizer (MPD), a VME controller, and Linux-based data acquisition software (DAQ). The construction and preliminary test of this readout system were completed, and the expected data were obtained at system working frequencies of 40 MHz and 20 MHz. A long-running test shows that the system has very good long-term stability. By optimizing the software configuration and improving hardware quality, the noise level was reduced and the signal-to-noise ratio was improved. (authors)

  6. Ground-based remote sensing of HDO/H2O ratio profiles: introduction and validation of an innovative retrieval approach

    Science.gov (United States)

    Schneider, M.; Hase, F.; Blumenstock, T.

    2006-10-01

    We propose an innovative approach for analysing ground-based FTIR spectra which allows us to detect variabilities of lower and middle/upper tropospheric HDO/H2O ratios. We show that the proposed method is superior to common approaches. We estimate that lower tropospheric HDO/H2O ratios can be detected with a noise-to-signal ratio of 15% and middle/upper tropospheric ratios with a noise-to-signal ratio of 50%. The method requires the inversion to be performed on a logarithmic scale and the introduction of an inter-species constraint. While common methods calculate the isotope ratio posterior to an independent, optimal estimation of the HDO and H2O profiles, the proposed approach is an optimal estimator for the ratio itself. We apply the innovative approach to spectra measured continuously during 15 months and present, for the first time, an annual cycle of tropospheric HDO/H2O ratio profiles as detected by ground-based measurements. Outliers in the detected middle/upper tropospheric ratios are interpreted by backward trajectories.

  7. Modeling speech intelligibility based on the signal-to-noise envelope power ratio

    DEFF Research Database (Denmark)

    Jørgensen, Søren

    of modulation frequency selectivity in the auditory processing of sound with a decision metric for intelligibility that is based on the signal-to-noise envelope power ratio (SNRenv). The proposed speech-based envelope power spectrum model (sEPSM) is demonstrated to account for the effects of stationary...... through three commercially available mobile phones. The model successfully accounts for the performance across the phones in conditions with a stationary speech-shaped background noise, whereas deviations were observed in conditions with “Traffic” and “Pub” noise. Overall, the results of this thesis...

  8. A statistical strategy to assess cleaning level of surfaces using fluorescence spectroscopy and Wilks’ ratio

    DEFF Research Database (Denmark)

    Stoica, Iuliana-Madalina; Babamoradi, Hamid; van den Berg, Frans

    2017-01-01

    • A statistical strategy combining fluorescence spectroscopy, multivariate analysis and Wilks’ ratio is proposed. • The method was tested both off-line and on-line, with riboflavin as a (controlled) contaminant. • Wilks’ ratio signals unusual recordings based on shifts in the variance and covariance...... structure described in in-control data....

  9. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    Energy Technology Data Exchange (ETDEWEB)

    Dordevic, N.; Wehrens, R. [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy); Postma, G.J.; Buydens, L.M.C. [Radboud University Nijmegen, Institute for Molecules and Materials, Analytical Chemistry, P.O. Box 9010, 6500 GL Nijmegen (Netherlands); Camin, F., E-mail: federica.camin@fmach.it [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy)

    2012-12-13

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000-2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to the food legislation, wines can be without geographical origin (table wine) and wines with origin. Wines with origin must have characteristics which are essential due to their region of production and must be produced, processed and prepared, exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000-2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to get better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.

  10. The Effect of Return on Assets and Debt to Equity Ratio on Stock Prices of Financial Institutions on the Indonesia Stock Exchange

    Directory of Open Access Journals (Sweden)

    Rani Ramdhani

    2013-03-01

    This study aims to determine the effect of Return on Assets and Debt to Equity Ratio on the stock prices of financial institutions on the Indonesia Stock Exchange. The study used secondary data, with a sample of two financial companies listed on the Indonesia Stock Exchange during the study period 2004-2010. The independent variables are Return on Assets and Debt to Equity Ratio, and the sample was selected by purposive sampling. Data analysis employed classical assumption tests, hypothesis tests, multiple regression analysis, the F test, and the t test. Based on the results, Return on Assets and Debt to Equity Ratio individually have no significant effect on stock price. Likewise, the F test shows that Return on Assets and Debt to Equity Ratio jointly have no effect on stock price.

  11. Glass-surface area to solution-volume ratio and its implications to accelerated leach testing

    International Nuclear Information System (INIS)

    Pederson, L.R.; Buckwalter, C.Q.; McVay, G.L.; Riddle, B.L.

    1982-10-01

    The value of the glass surface area to solution volume ratio (SA/V) can strongly influence the leaching rate of PNL 76-68 glass. The leaching rate is largely governed by silicon solubility constraints. Silicic acid in solution reduced the elemental release of all glass components. No components are leached to depths greater than that of silicon. The presence of the reaction layer had no measurable effect on the rate of leaching. Accelerated leach testing is possible since PNL 76-68 glass leaching is solubility-controlled (except at very low SA/V values). A series of glasses leached with SA/V × time = constant will yield identical elemental release.

  12. Development of remote data acquisition system based on OPC for brake test bench

    Science.gov (United States)

    Wang, Yiwei; Wu, Mengling; Tian, Chun; Ma, Tianhe

    2017-08-01

    The 1:1 train brake system test bench can be used to carry out brake-related adhesion-slip control, stability, noise, and dynamic tests. To collect data from the test bench, a data acquisition method is needed. In this paper, a remote data acquisition system for the test bench is built in LabVIEW based on OPC technology. Unlike the traditional hardwired connection between the PLC acquisition module and the sensors, the new method collects data and shares them over an internal LAN built with Ethernet switches, which avoids complex wiring and interference in an easy, efficient, and flexible way. The system has been successfully applied to data acquisition on the comprehensive brake system test bench of CRRC Nanjing Puzhen Haitai Brake Equipment Co., Ltd., where the relationship between the adhesion coefficient and the slip ratio was tested. The speed signal, torque signal, and brake disc temperature can be collected and displayed. The results show that the system is reliable, convenient, and efficient, and can meet the requirements of data acquisition.

  13. Tests of variable-band multilayers designed for investigating optimal signal-to-noise vs artifact signal ratios in Dual-Energy Digital Subtraction Angiography (DDSA) imaging systems

    International Nuclear Information System (INIS)

    Boyers, D.; Ho, A.; Li, Q.; Piestrup, M.; Rice, M.; Tatchyn, R.

    1993-08-01

    In recent work, various design techniques were applied to investigate the feasibility of controlling the bandwidth and bandshape profiles of tungsten/boron-carbide (W/B₄C) and tungsten/silicon (W/Si) multilayers for optimizing their performance in synchrotron radiation based angiographical imaging systems at 33 keV. Varied parameters included alternative spacing geometries, material thickness ratios, and numbers of layer pairs. Planar optics with nominal design reflectivities of 30% to 94% and bandwidths ranging from 0.6% to 10% were designed at the Stanford Radiation Laboratory, fabricated by the Ovonic Synthetic Materials Company, and characterized on Beam Line 4-3 at the Stanford Synchrotron Radiation Laboratory. In this paper we report selected results of these tests and review the possible use of the multilayers for determining optimal signal-to-noise vs. artifact signal ratios in practical Dual-Energy Digital Subtraction Angiography systems

  14. Sex ratios, mating frequencies and relative abundance of sympatric millipedes in the genus Chersastus (Diplopoda: Pachybolidae

    Directory of Open Access Journals (Sweden)

    Mark Ian Cooper

    2014-12-01

    Three hypotheses exist to explain climbing behavior in millipedes: (1) waterlogging, (2) detritus limiting, and (3) mate avoidance. Data on sex ratios, mating frequency, and relative abundance are provided to suggest an alternative explanation for the pattern in sympatric forest millipedes. Departures of sex ratios from equality were tested using a G-test comparing millipedes on and above ground. Mating frequencies were calculated based on the percentage of paired individuals. Relative abundance may correlate with male biases in the sex ratios. All three factors suggest Chersastus inscriptus has a higher reproductive potential than C. anulatus. This is evidence for mating hotspots.
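
    The G-test used here is the likelihood-ratio analogue of the chi-square goodness-of-fit test; against a 1:1 expectation with one degree of freedom it reduces to a few lines (the counts below are illustrative, not the paper's data):

```python
import math

def g_test_equal_sex_ratio(males, females):
    """G-test of a 1:1 sex ratio (likelihood-ratio goodness of fit,
    1 degree of freedom): G = 2 * sum(O * ln(O / E))."""
    n = males + females
    expected = n / 2.0
    g = 2.0 * sum(o * math.log(o / expected) for o in (males, females) if o)
    # Chi-square survival function with df = 1, via the complementary
    # error function: P(X > g) = erfc(sqrt(g / 2))
    p = math.erfc(math.sqrt(g / 2.0))
    return g, p

g, p = g_test_equal_sex_ratio(70, 30)   # a strongly male-biased sample
print(round(g, 2), p < 0.001)
```

    A balanced count (e.g. 50:50) gives G = 0 and p = 1, while the 70:30 example above rejects equality decisively; with df = 1 the χ² tail probability follows directly from the error function, so no statistics library is needed.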

  15. Effect of electrode mass ratio on aging of activated carbon based supercapacitors utilizing organic electrolytes

    Science.gov (United States)

    Cericola, D.; Kötz, R.; Wokaun, A.

    2011-03-01

    The accelerated degradation of carbon based supercapacitors utilizing 1 M Et4NBF4 in acetonitrile and in propylene carbonate as electrolyte is investigated at a constant cell voltage of 3.5 V as a function of the positive-to-total electrode mass ratio. The degradation rate of the supercapacitor using acetonitrile as a solvent can be decreased by increasing the mass of the positive electrode. With a mass ratio (positive electrode mass/total electrode mass) of 0.65, the degradation rate is at its minimum. For the capacitor utilizing propylene carbonate as a solvent, a similar effect was observed. The degradation rate was smallest for a mass ratio above 0.5.

  16. Numerical simulation of CICC design based on optimization of ratio of copper to superconductor

    International Nuclear Information System (INIS)

    Jiang Huawei; Li Yuan; Yan Shuailing

    2007-01-01

    For cable-in-conduit conductor (CICC) structure design, a numerical simulation of the conductor configuration based on optimization of the copper-to-superconductor ratio is proposed. The simulation outcome agrees with the engineering design. (authors)

  17. Optimization of the wavelength shifter ratio in a polystyrene based plastic scintillator through energy spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ye Won; Kim, Myung Soo; Yoo, Hyun Jun; Lee, Dae Hee; Cho, Gyu Seong [Dept. of Nuclear and Quantum Engineering, KAIST, Daejeon (Korea, Republic of); Moon, Myung Kook [Neutron Instrumentation Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-02-15

    The scintillation efficiency of a polystyrene based plastic scintillator depends on the ratio of the wavelength shifters, the organic fluors PPO and POPOP. Thus, 24 samples of the plastic scintillator were fabricated in order to find the optimum ratio of the wavelength shifters in the plastic scintillator. The fabricated plastic scintillators were trimmed through a cutting and polishing process. They were used in gamma energy spectrum measurements with a ¹³⁷Cs source, which emits monoenergetic 662 keV photons, to compare the scintillation efficiency. As a result, it was found that the scintillator sample with 1.00 g of PPO (2,5-diphenyloxazole) and 0.50 g of POPOP (1,4-bis(5-phenyl-2-oxazolyl)benzene) dissolved in 100 g of styrene solution has the optimum ratio in terms of the light yield of the polystyrene based plastic scintillator.

  18. Spectral-ratio radon background correction method in airborne γ-ray spectrometry based on compton scattering deduction

    International Nuclear Information System (INIS)

    Gu Yi; Xiong Shengqing; Zhou Jianxin; Fan Zhengguo; Ge Liangquan

    2014-01-01

    γ-rays released by radon daughters have a severe impact on airborne γ-ray spectrometry. The spectral-ratio method is one of the best mathematical methods for radon background deduction in airborne γ-ray spectrometry. In this paper, an advanced spectral-ratio method is proposed which deducts the Compton-scattered component by the fast Fourier transform rather than by stripping ratios. The relationship between survey height and the correction coefficient of the advanced spectral-ratio radon background correction method was studied, the advanced spectral-ratio radon background correction mathematical model was established, and a ground saturation model calibration technique for the correction coefficient was proposed. The advanced spectral-ratio method improves applicability and correction efficiency and reduces the application cost. Furthermore, it avoids the loss of physical meaning and the possible errors caused by matrix computation and mathematical fitting based on spectrum shape, which are involved in the traditional correction coefficient. (authors)

  19. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China); Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Jin, Zhengyu, E-mail: jinlab2008@yahoo.com [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China); Tian, Yaoqi, E-mail: yqtian@jiangnan.edu.cn [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)

    2012-08-10

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.
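
    The arithmetic behind such an SR estimate can be illustrated as follows, assuming the guest's mass-loss step in the thermogram can be separated from the host's decomposition. The paper derives its formulas from the measured thermograms; the 10% mass-loss figure below is purely hypothetical.

```python
def stoichiometric_ratio(guest_mass_loss_pct, m_guest, m_host=972.84):
    """Guest-to-host molar ratio from the TGA mass-loss step attributed
    to guest release (illustrative arithmetic only). m_host defaults to
    the molar mass of alpha-cyclodextrin, C36H60O30, in g/mol."""
    f = guest_mass_loss_pct / 100.0          # guest mass fraction
    host_fraction = 1.0 - f                  # remaining mass is the host
    return (f / m_guest) / (host_fraction / m_host)

# Hypothetical complex losing 10% of its mass as 4-cresol (108.14 g/mol):
sr = stoichiometric_ratio(10.0, 108.14)
print(round(sr, 2))  # → 1.0, i.e. a 1:1 complex
```

    Because α-CD is roughly nine times heavier than a small guest like 4-cresol, a 1:1 complex shows only about a 10% guest mass-loss step, which is why the weighing precision of TGA matters for this kind of estimate.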

  20. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state α-cyclodextrin-based inclusion complexes

    International Nuclear Information System (INIS)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting; Feng, Tao; Xu, Xueming; Jin, Zhengyu; Tian, Yaoqi

    2012-01-01

    Highlights: ► We develop a TGA method for the measurement of the stoichiometric ratio. ► A series of formulas are deduced to calculate the stoichiometric ratio. ► Four α-CD-based inclusion complexes were successfully prepared. ► The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of the guest–α-cyclodextrin (Guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). The present data obtained from Fourier transform-infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in a solid-state form. The stoichiometric ratios of α-CD to the relative guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well demonstrated by the previously reported X-ray diffraction (XRD) method and the NMR confirmatory experiments, except the SR of decanoic acid with a larger size and longer chain was not consistent. It is, therefore, suggested that the TGA-based method is applicable to follow the stoichiometric ratio of the polycrystalline α-CD-based inclusion complexes with smaller and shorter chain guests.

  1. Termination Criteria for Computerized Classification Testing

    Directory of Open Access Journals (Sweden)

    Nathan A. Thompson

    2011-02-01

    Computerized classification testing (CCT) is an approach to designing tests with intelligent algorithms, similar to adaptive testing, but specifically designed for the purpose of classifying examinees into categories such as "pass" and "fail." Like adaptive testing for point estimation of ability, the key component is the termination criterion, namely the algorithm that decides whether to classify the examinee and end the test or to continue and administer another item. This paper applies a newly suggested termination criterion, the generalized likelihood ratio (GLR), to CCT. It also explores the role of the indifference region in the specification of likelihood-ratio based termination criteria, comparing the GLR to the sequential probability ratio test. Results from simulation studies suggest that the GLR is always at least as efficient as existing methods.
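
    The difference from the SPRT can be sketched in a few lines: instead of evaluating the likelihood at two fixed points on either side of the cutoff, the GLR maximizes it over each side of the indifference region (a grid search here; the cutoff, delta, and response pattern are illustrative assumptions, not values from the paper):

```python
import math

def irt_loglik(theta, responses, difficulties):
    """Rasch-model log-likelihood of a response pattern at ability theta."""
    ll = 0.0
    for x, b in zip(responses, difficulties):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        ll += math.log(p if x else 1.0 - p)
    return ll

def glr_statistic(responses, difficulties, cutoff=0.0, delta=0.5):
    """Generalized likelihood ratio: maximize the likelihood on each side
    of the indifference region (cutoff ± delta) over a theta grid, rather
    than evaluating at the two fixed SPRT points."""
    grid = [i / 50.0 for i in range(-200, 201)]        # theta in [-4, 4]
    ll_pass = max(irt_loglik(t, responses, difficulties)
                  for t in grid if t >= cutoff + delta)
    ll_fail = max(irt_loglik(t, responses, difficulties)
                  for t in grid if t <= cutoff - delta)
    return ll_pass - ll_fail   # compared to the usual log SPRT boundaries

val = glr_statistic([1, 1, 1, 1, 1, 0], [0.0] * 6)
print(round(val, 2))
```

    When the provisional ability estimate lies outside the indifference region, the GLR statistic is at least as large in magnitude as the SPRT statistic, which is the intuition behind the reported efficiency result.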

  2. Discrepancy in Vancomycin AUC/MIC Ratio Targeted Attainment Based upon the Susceptibility Testing in Staphylococcus aureus.

    Science.gov (United States)

    Eum, Seenae; Bergsbaken, Robert L; Harvey, Craig L; Warren, J Bryan; Rotschafer, John C

    2016-09-27

    This study demonstrated a statistically significant difference in vancomycin minimum inhibitory concentration (MIC) for Staphylococcus aureus between a common automated system (Vitek 2) and the E-test method in patients with S. aureus bloodstream infections. At an area under the serum concentration time curve (AUC) threshold of 400 mg∙h/L, we would have reached the current Infectious Diseases Society of America (IDSA)/American Society of Health System Pharmacists (ASHP)/Society of Infectious Diseases Pharmacists (SIDP) guideline-suggested AUC/MIC target in almost 100% of patients while using the Vitek 2 MIC data; however, we could only generate 40% target attainment while using E-test MIC data. An AUC of 450 mg∙h/L or greater was required to achieve 100% target attainment using either Vitek 2 or E-test MIC results.
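
    The AUC/MIC target-attainment calculation underlying these percentages is simple to reproduce (a sketch with hypothetical MIC values, not the study's data):

```python
def target_attainment(auc, mics, target=400.0):
    """Fraction of isolates whose AUC/MIC ratio meets the guideline target.

    auc in mg*h/L; mics is a list of MIC values (mg/L) from one
    susceptibility method. The MIC lists below are hypothetical.
    """
    met = [auc / mic >= target for mic in mics]
    return sum(met) / len(met)

# A method that reads the same isolates one dilution higher lowers
# attainment at a fixed AUC:
vitek_like = [0.5, 1.0, 1.0, 1.0]   # hypothetical automated-system MICs
etest_like = [1.0, 1.0, 1.5, 2.0]   # hypothetical gradient-strip MICs
```
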

  3. Identification of Rice Accessions Associated with K+/Na+ Ratio and Salt Tolerance Based on Physiological and Molecular Responses

    Directory of Open Access Journals (Sweden)

    Inja Naga Bheema Lingeswara Reddy

    2017-11-01

    Full Text Available The key to rice plant survival under NaCl salt stress is maintaining a high K+/Na+ ratio in its cells. Selection of salt-tolerant rice genotypes based on phenotypic performance alone slows progress in breeding. Using molecular markers in tandem with physiological studies helps to better identify salt-tolerant rice accessions. Eight rice accessions along with the check Dongjin were screened using 1/2 Yoshida solution with 50 mmol/L NaCl at the seedling stage. The accessions IT001158, IT246674, IT260533 and IT291341 were classified as salt tolerant based on their K+/Na+ ratios. Seventeen SSR markers reported to be associated with the K+/Na+ ratio were used to screen the accessions. Five SSR markers (RM8053, RM345, RM318, RM253 and RM7075) could differentiate accessions classified based on their K+/Na+ ratios. The banding pattern of each accession was scored against that of Dongjin. The study thus differentiated accessions by associating the K+/Na+ ratio with molecular markers, which are highly reliable. These markers can play a significant role in screening large sets of rice germplasm for salt tolerance and also help in identifying high-yielding varieties with better salt tolerance. The salt-tolerant accessions can be taken forward into developing better varieties by conventional breeding and exploring genes for salt tolerance.

  4. Control of size and aspect ratio in hydroquinone-based synthesis of gold nanorods

    International Nuclear Information System (INIS)

    Morasso, Carlo; Picciolini, Silvia; Schiumarini, Domitilla; Mehn, Dora; Ojea-Jiménez, Isaac; Zanchetta, Giuliano; Vanna, Renzo; Bedoni, Marzia; Prosperi, Davide; Gramatica, Furio

    2015-01-01

    In this article, we describe how it is possible to tune the size and the aspect ratio of gold nanorods obtained using a highly efficient protocol based on the use of hydroquinone as a reducing agent by varying the amounts of CTAB and silver ions present in the “seed-growth” solution. Our approach not only allows us to prepare nanorods with a four-fold increase in Au³⁺ reduction yield compared with the commonly used protocol based on ascorbic acid, but also allows a remarkable 50–60 % reduction of the amount of CTAB needed. In fact, according to our findings, the concentration of CTAB present in the seed-growth solution does not linearly influence the final aspect ratio of the obtained nanorods, and an optimal concentration range between 30 and 50 mM has been identified as the one able to generate particles with more elongated shapes. For the optimized protocol, the effect of the concentration of Ag⁺ ions in the seed-growth solution and the stability of the obtained particles have also been investigated.

  5. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is a benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver-operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This approximation enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by the approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking into account parsimony favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood-based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of the choices made but also in terms of the times it takes to make them. Copyright © 2016 Elsevier Inc. All rights reserved.
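
    The Gaussian likelihood ratio transformation at the heart of such models can be written down directly (a sketch; the unequal-variance parameterization and the numeric defaults are illustrative assumptions, not the authors' fitted values):

```python
import math

def log_likelihood_ratio(x, d_prime, sigma_target=1.25):
    """Log likelihood ratio that memory strength x came from the target
    distribution N(d_prime, sigma_target^2) rather than the lure N(0, 1).

    sigma_target > 1 reflects the greater target variance seen in ROCs;
    the default value here is illustrative.
    """
    log_lure = -0.5 * x ** 2 - math.log(math.sqrt(2 * math.pi))
    log_target = (-0.5 * ((x - d_prime) / sigma_target) ** 2
                  - math.log(sigma_target * math.sqrt(2 * math.pi)))
    return log_target - log_lure
```

    With an "old" response given whenever the log likelihood ratio exceeds 0, increasing d_prime simultaneously raises the hit rate and lowers the false alarm rate - the mirror effect.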

  6. Testlet-Based Multidimensional Adaptive Testing.

    Science.gov (United States)

    Frey, Andreas; Seitz, Nicki-Nils; Brandt, Steffen

    2016-01-01

    Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, and 1.5) and testlet sizes (3, 6, and 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  7. Testlet-based Multidimensional Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Andreas Frey

    2016-11-01

    Full Text Available Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, 1.5) and testlet sizes (3 items, 6 items, 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  8. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... practices using INR POCT in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality....

  9. Sequential boundaries approach in clinical trials with unequal allocation ratios

    Directory of Open Access Journals (Sweden)

    Ayatollahi Seyyed

    2006-01-01

    Full Text Available Abstract Background In clinical trials, both unequal randomization design and sequential analyses have ethical and economic advantages. In the single-stage design (SSD), however, if the sample size is not adjusted based on unequal randomization, the power of the trial will decrease, whereas with sequential analysis the power will always remain constant. Our aim was to compare the sequential boundaries approach with the SSD when the allocation ratio (R) was not equal. Methods We evaluated the influence of R, the ratio of the patients in the experimental group to the standard group, on the statistical properties of two-sided tests, including the two-sided single triangular test (TT), double triangular test (DTT) and SSD, by multiple simulations. The average sample numbers (ASNs) and power (1−β) were evaluated for all tests. Results Our simulation study showed that choosing R = 2 instead of R = 1 increases the sample size of the SSD by 12% and the ASN of the TT and DTT by the same proportion. Moreover, when R = 2, compared to the adjusted SSD, using the TT or DTT allows one to retrieve the well-known reductions of ASN observed when R = 1, compared to the SSD. In addition, when R = 2, compared to the SSD, using the TT and DTT allows one to obtain smaller reductions of ASN than when R = 1, but maintains the power of the test at its planned value. Conclusion This study indicates that when the allocation ratio is not equal among the treatment groups, sequential analysis could indeed serve as a compromise between ethicists, economists and statisticians.

  10. Evaluation of a clinic-based cholinesterase test kit for the Washington State Cholinesterase Monitoring Program.

    Science.gov (United States)

    Hofmann, Jonathan N; Carden, Angela; Fenske, Richard A; Ruark, Harold E; Keifer, Matthew C

    2008-07-01

    The Washington State Cholinesterase Monitoring Program for pesticide handlers requires blood draws at local clinics, with samples tested at a central laboratory. At present, workers with inhibited cholinesterase activity may be re-exposed before they can be removed from work. In this study we explored the option of on-site testing at local clinics using the EQM Test-mate Kit™, a portable cholinesterase test kit. Test kit cholinesterase activity measurements were performed on 50 blood samples by our research staff, and compared to measurements on the same samples by the Washington State Public Health Laboratory (PHL). Another set of samples was also analyzed with the test kit by medical staff at an eastern Washington clinic. Triplicate measurements with the test kit had a 3.3% average coefficient of variation (CV) for plasma cholinesterase (PChE), and a 3.5% average CV for erythrocyte cholinesterase (AChE) measurements. The kit's PChE measurements were similar to PHL measurements (average ratio of 0.98) when performed in the laboratory, but had a tendency to underestimate activity when used in the clinic setting (average ratio of 0.87). The kit systematically overestimated AChE activity by 42-48% relative to the PHL measurements, regardless of where the samples were analyzed. This easy-to-use test kit appeared to be a viable method for clinic-based PChE measurements, but was less consistent for AChE measurements performed in the clinic. Absolute measurements with the kit need to be evaluated carefully relative to standardized methods. (c) 2008 Wiley-Liss, Inc.

  11. Combination of the ionic-to-atomic line intensity ratios from two test elements for the diagnostic of plasma temperature and electron number density in Inductively Coupled Plasma Atomic Emission Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Tognoni, E. [Istituto per i Processi Chimico-Fisici, Area della Ricerca del Consiglio Nazionale delle Ricerche Via Moruzzi 1, 56124 Pisa (Italy)], E-mail: tognoni@ipcf.cnr.it; Hidalgo, M.; Canals, A. [Departamento de Quimica Analitica, Nutricion y Bromatologia. Universidad de Alicante. Apdo. 99, 03080, Alicante (Spain); Cristoforetti, G.; Legnaioli, S.; Salvetti, A.; Palleschi, V. [Istituto per i Processi Chimico-Fisici, Area della Ricerca del Consiglio Nazionale delle Ricerche Via Moruzzi 1, 56124 Pisa (Italy)

    2007-05-15

    In Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) spectrochemical analysis, the MgII(280.270 nm)/MgI(285.213 nm) ionic to atomic line intensity ratio is commonly used as a monitor of the robustness of operating conditions. This approach is based on the univocal relationship existing between intensity ratio and plasma temperature, for a pure argon atmospheric ICP in thermodynamic equilibrium. In a multi-elemental plasma in the lower temperature range, the measurement of the intensity ratio may not be sufficient to characterize temperature and electron density. In such a range, the correct relationship between intensity ratio and plasma temperature can be calculated only when the complete plasma composition is known. We propose the combination of the line intensity ratios of two test elements (double ratio) as an effective diagnostic tool for a multi-elemental low temperature LTE plasma of unknown composition. In particular, the variation of the double ratio allows us to discriminate changes in the plasma temperature from changes in the electron density. Thus, the effects on plasma excitation and ionization possibly caused by the introduction of different samples and matrices in non-robust conditions can be more accurately interpreted. The method is illustrated by the measurement of plasma temperature and electron density in a specific analytical case.
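
    The cancellation that makes the double ratio useful can be seen in a simplified LTE sketch (the Saha-Boltzmann form is standard, but the lumped prefactors, energies, and numeric values below are illustrative assumptions, not ICP calibration data):

```python
import numpy as np
from scipy.constants import k, m_e, h, e

def ionic_atomic_ratio(T, n_e, delta_E_eV, prefactor=1.0):
    """Ionic-to-atomic line intensity ratio for an LTE plasma via the
    Saha-Boltzmann relation. delta_E_eV lumps the ionization energy and
    the upper-level energy difference; prefactor lumps transition
    probabilities, degeneracies and wavelengths (illustrative values).
    T in K, n_e in m^-3.
    """
    thermal = (2.0 / n_e) * (2.0 * np.pi * m_e * k * T / h ** 2) ** 1.5
    return prefactor * thermal * np.exp(-delta_E_eV * e / (k * T))

def double_ratio(T, n_e, dE1_eV, dE2_eV):
    """Ratio of the ionic/atomic ratios of two test elements: the 1/n_e
    factors cancel, so it depends on temperature only."""
    return ionic_atomic_ratio(T, n_e, dE1_eV) / ionic_atomic_ratio(T, n_e, dE2_eV)
```

    Because both single ratios scale as 1/n_e, their quotient depends only on temperature; once T is fixed by the double ratio, either single ratio then yields n_e.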

  12. Effect of Mo/B atomic ratio on the properties of Mo2NiB2-based cermets

    International Nuclear Information System (INIS)

    Xie, Lang; Li, XiaoBo; Zhang, Dan; Yi, Li; Gao, XiaoQing; Xiangtan Univ.

    2015-01-01

    Using three elementary substances, Mo, Ni, and amorphous B as raw materials, four series of Mo2NiB2-based cermets with the Mo/B atomic ratio ranging from 0.9 to 1.2 were successfully prepared via reaction sintering. The effect of the Mo/B atomic ratio on the microstructure and properties of the cermets was studied. The results indicate a strong correlation between the Mo/B atomic ratio and the properties. The transverse rupture strength of the cermets increases with increasing Mo/B ratio, reaches a maximum of 1 872 MPa at an Mo/B atomic ratio of 1.1, and then decreases with a further increase in the Mo/B atomic ratio. The hardness and the corrosion resistance of the cermets increase monotonically with an increase in the Mo/B atomic ratio. In Mo-rich cermets with an Mo/B atomic ratio above 1.1, a small amount of Ni-Mo intermetallic compound precipitates at the interface of Mo2NiB2 grains.

  13. Pearce element ratios: A paradigm for testing hypotheses

    Science.gov (United States)

    Russell, J. K.; Nicholls, Jim; Stanley, Clifford R.; Pearce, T. H.

    Science moves forward with the development of new ideas that are encapsulated by hypotheses whose aim is to explain the structure of data sets or to expand existing theory. These hypotheses remain conjecture until they have been tested. In fact, Karl Popper advocated that a scientist's job does not finish with the creation of an idea but, rather, begins with the testing of the related hypotheses. In Popper's [1959] advocation it is implicit that there be tools with which we can test our hypotheses. Consequently, the development of rigorous tests for conceptual models plays a major role in maintaining the integrity of scientific endeavor [e.g., Greenwood, 1989].

  14. Collaborative spectrum sensing based on the ratio between largest eigenvalue and Geometric mean of eigenvalues

    KAUST Repository

    Shakir, Muhammad

    2011-12-01

    In this paper, we introduce a new detector referred to as the Geometric mean detector (GEMD), which is based on the ratio of the largest eigenvalue to the geometric mean of the eigenvalues, for collaborative spectrum sensing. The decision threshold has been derived by employing a Gaussian approximation approach. In this approach, the two random variables, i.e., the largest eigenvalue and the geometric mean of the eigenvalues, are considered as independent Gaussian random variables such that their cumulative distribution functions (CDFs) are approximated by a univariate Gaussian distribution function for any number of cooperating secondary users and received samples. The approximation approach is based on the calculation of exact analytical moments of the largest eigenvalue and the geometric mean of the eigenvalues of the received covariance matrix. The decision threshold has been calculated by exploiting the CDF of the ratio of two Gaussian distributed random variables. In this context, we exchange the analytical moments of the two random variables with the moments of the Gaussian distribution function. The performance of the detector is compared with the performance of the energy detector and the eigenvalue ratio detector. Analytical and simulation results show that our newly proposed detector yields a considerable performance advantage in realistic spectrum sensing scenarios. Moreover, our results based on the proposed approximation approach are in perfect agreement with the empirical results. © 2011 IEEE.
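
    The test statistic itself is straightforward to compute from the received sample covariance matrix (a NumPy sketch; array shapes and the decision-rule comment are illustrative, and the paper's Gaussian-approximation threshold derivation is not reproduced here):

```python
import numpy as np

def gemd_statistic(samples):
    """GEMD test statistic: largest eigenvalue of the received covariance
    matrix divided by the geometric mean of all its eigenvalues.

    samples: (K, N) array of N received samples from each of K
    cooperating secondary users.
    """
    K, N = samples.shape
    cov = samples @ samples.conj().T / N    # sample covariance matrix
    eigs = np.linalg.eigvalsh(cov)          # real, in ascending order
    geometric_mean = np.exp(np.mean(np.log(eigs)))
    return eigs[-1] / geometric_mean

# Decision rule sketch: declare the primary user present when the
# statistic exceeds a threshold obtained from the distribution of the
# statistic under noise only (near 1 for white noise).
```
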

  15. Experimental study of a natural gas dual-fuel engine fueled with hydrogen-natural gas mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B.S.; Lee, Y.S.; Park, C.K. [Cheonnam University, Kwangju (Korea); Masahiro, S. [Kyoto University, Kyoto (Japan)

    1999-05-28

    One of the unsolved problems of the natural gas dual-fuel engine is excessive exhaust of total hydrocarbons (THC) at low equivalence ratios. To address this, natural gas mixed with hydrogen was used in the engine test. The results showed that the higher the mixture ratio of hydrogen to natural gas, the higher the combustion efficiency. When the intake air amount reached 90% of WOT, the combustion efficiency also improved. However, as with advancing the injection timing, the equivalence ratio at the knocking limit decreases and NOx production increases. 5 refs., 9 figs., 1 tab.

  16. Prognostic value of the C-reactive protein to albumin ratio: a novel inflammation-based prognostic indicator in osteosarcoma

    Directory of Open Access Journals (Sweden)

    Li YJ

    2017-11-01

    Full Text Available Yong-Jiang Li,1,* Kai Yao,2,* Min-Xun Lu,2 Wen-Biao Zhang,1 Cong Xiao,2 Chong-Qi Tu2 1Department of Oncology, 2Department of Orthopedics, West China Hospital, Sichuan University, Chengdu, People’s Republic of China *These authors contributed equally to this work Abstract: The prognostic role of the C-reactive protein to albumin ratio (CRP/Alb ratio) in patients with osteosarcoma has not been investigated. A total of 216 osteosarcoma patients were enrolled in the study. Univariate and multivariate survival analyses between the groups were performed and Kaplan–Meier analysis was conducted to plot the survival curves. Receiver operating characteristic curves were generated and areas under the curve (AUCs) were compared to assess the discriminatory ability of the inflammation-based indicators, including the CRP/Alb ratio, Glasgow prognostic score (GPS), neutrophil–lymphocyte ratio (NLR), and platelet–lymphocyte ratio (PLR). The optimal cutoff value was 0.210 for the CRP/Alb ratio, with a Youden index of 0.319. Higher values of the CRP/Alb ratio were significantly associated with poorer overall survival in univariate (HR =2.62, 95% CI =1.70–4.03; P<0.001) and multivariate (HR =2.21, 95% CI =1.40–3.49; P=0.001) analyses. In addition, the CRP/Alb ratio had significantly higher AUC values compared with GPS (P=0.003), NLR (P<0.001), and PLR (P<0.001). The study demonstrated that the CRP/Alb ratio is an effective inflammation-based prognostic indicator in osteosarcoma, which potentially has a discriminatory ability superior to that of other inflammatory indicators including GPS, NLR, and PLR. Keywords: osteosarcoma, CRP to albumin ratio, prognosis
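
    The optimal cutoff reported here (0.210, Youden index 0.319) is the value maximizing J = sensitivity + specificity − 1 over the ROC curve, which can be computed as follows (a brute-force sketch on hypothetical data, not the study's patient data):

```python
import numpy as np

def youden_optimal_cutoff(values, outcomes):
    """Cutoff maximizing the Youden index J = sensitivity + specificity - 1.

    values: biomarker values (e.g. CRP/Alb ratio); outcomes: 1 = event,
    0 = no event. Subjects at or above the cutoff form the "high" group.
    """
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut
        tp = np.sum(pred & (outcomes == 1))
        fn = np.sum(~pred & (outcomes == 1))
        tn = np.sum(~pred & (outcomes == 0))
        fp = np.sum(pred & (outcomes == 0))
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        j = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```
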

  17. A Spline-Based Lack-Of-Fit Test for Independent Variable Effect in Poisson Regression.

    Science.gov (United States)

    Li, Chin-Shang; Tu, Wanzhu

    2007-05-01

    In regression analysis of count data, independent variables are often modeled by their linear effects under the assumption of log-linearity. In reality, the validity of such an assumption is rarely tested, and its use is at times unjustifiable. A lack-of-fit test is proposed for the adequacy of a postulated functional form of an independent variable within the framework of semiparametric Poisson regression models based on penalized splines. It offers added flexibility in accommodating the potentially non-loglinear effect of the independent variable. A likelihood ratio test is constructed for the adequacy of the postulated parametric form, for example log-linearity, of the independent variable effect. Simulations indicate that the proposed model performs well and that the misspecified parametric model has much reduced power. An example is given.
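
    The structure of such a likelihood ratio test can be illustrated with a simplified version in which a quadratic term, rather than a penalized spline, serves as the flexible alternative (an illustrative sketch, not the authors' procedure):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def poisson_loglik(beta, X, y):
    """Poisson log-likelihood (up to a constant in y) with a log link."""
    eta = X @ beta
    return np.sum(y * eta - np.exp(eta))

def max_loglik(X, y):
    res = minimize(lambda b: -poisson_loglik(b, X, y),
                   x0=np.zeros(X.shape[1]), method="BFGS")
    return -res.fun

def lack_of_fit_lrt(x, y):
    """Likelihood ratio test of log-linearity in x. A quadratic term
    stands in for the penalized-spline alternative; the chi-squared
    reference distribution with df = 1 reflects the one extra parameter."""
    X0 = np.column_stack([np.ones_like(x), x])          # null: log-linear
    X1 = np.column_stack([np.ones_like(x), x, x ** 2])  # flexible alternative
    deviance = 2.0 * (max_loglik(X1, y) - max_loglik(X0, y))
    return deviance, chi2.sf(deviance, df=1)
```

    A large deviance (small p-value) indicates that the log-linear form is inadequate for the covariate.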

  18. AXIAL RATIO OF EDGE-ON SPIRAL GALAXIES AS A TEST FOR BRIGHT RADIO HALOS

    International Nuclear Information System (INIS)

    Singal, J.; Jones, E.; Dunlap, H.; Kogut, A.

    2015-01-01

    We use surface brightness contour maps of nearby edge-on spiral galaxies to determine whether extended bright radio halos are common. In particular, we test a recent model of the spatial structure of the diffuse radio continuum by Subrahmanyan and Cowsik, which posits that a substantial fraction of the observed high-latitude surface brightness originates from an extended Galactic halo of uniform emissivity. Measurements of the axial ratio of emission contours within a sample of normal spiral galaxies at 1500 MHz and below show no evidence for such a bright, extended radio halo. Either the Galaxy is atypical compared to nearby quiescent spirals or the bulk of the observed high-latitude emission does not originate from this type of extended halo. (letters)

  19. Equity Theory Ratios as Causal Schemas

    Directory of Open Access Journals (Sweden)

    Alexios Arvanitis

    2016-08-01

    Full Text Available Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes.

  20. Equity Theory Ratios as Causal Schemas.

    Science.gov (United States)

    Arvanitis, Alexios; Hantzi, Alexandra

    2016-01-01

    Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes.

  1. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack - including operating system and application software - can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the MODELICA programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.

  2. Classification of user performance in the Ruff Figural Fluency Test based on eye-tracking features

    Directory of Open Access Journals (Sweden)

    Borys Magdalena

    2017-01-01

    Full Text Available Cognitive assessment in neurological diseases represents a relevant topic due to its diagnostic significance in detecting disease, but also in assessing progress of the treatment. Computer-based tests provide objective and accurate measures of cognitive skills and capacity. The Ruff Figural Fluency Test (RFFT) provides information about non-verbal capacity for initiation, planning, and divergent reasoning. The traditional paper form of the test was transformed into a computer application and examined. The RFFT was applied in an experiment performed among 70 male students to assess their cognitive performance in the laboratory environment. Each student was examined in three sequential series. Besides the students’ performance measured by in-app keylogging, eye-tracking data obtained by non-invasive video-based oculography were gathered, from which several features were extracted. Eye-tracking features combined with performance measures (total number of designs and/or error ratio) were used in machine learning classification. Various classification algorithms were applied, and their accuracy, specificity, sensitivity and performance were compared.
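
    The comparison measures named here - accuracy, sensitivity, and specificity - all derive from the confusion matrix of each classifier (a minimal sketch with made-up labels, not the study's data):

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity from binary predictions,
    computed from the entries of the 2x2 confusion matrix."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return {
        "accuracy": (tp + tn) / y_true.size,
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }
```
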

  3. Blind deblurring of spiral CT images - comparative studies on edge-to-noise ratios

    International Nuclear Information System (INIS)

    Jiang Ming; Wan Ge; Skinner, Margaret W.; Rubinstein, Jay T.; Vannier, Michael W.

    2002-01-01

    A recently developed blind deblurring algorithm based on the edge-to-noise ratio has been applied to improve the quality of spiral CT images. Since the discrepancy measure used to quantify the edge and noise effects is not symmetric, there are several ways to formulate the edge-to-noise ratio. This article investigates the performance of those ratios with phantom and patient data. In the phantom study, it is shown that all the ratios share similar properties, validating the blind deblurring algorithm. The image fidelity improvement varies from 29% to 33% for the different ratios, according to the root mean square error (RMSE) criterion; the optimal iteration number determined for each ratio varies from 25 to 35. The ratios associated with the most satisfactory performance are singled out, giving an image fidelity improvement of about 33% in the numerical simulation. After automatic blind deblurring with the selected ratios, the spatial resolution of CT is substantially refined in all the cases tested.

  4. Goodness of Fit Test and Test of Independence by Entropy

    OpenAIRE

    M. Sharifdoost; N. Nematollahi; E. Pasha

    2009-01-01

    To test whether a set of data has a specific distribution or not, we can use the goodness of fit test. This test can be done with the Pearson X²-statistic or the likelihood ratio statistic G², which are asymptotically equal, and also by using the Kolmogorov-Smirnov statistic in continuous distributions. In this paper, we introduce a new test statistic for the goodness of fit test which is based on entropy distance, and which can be applied for large sample sizes...
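
    The two classical statistics that the entropy-based test is compared against can be computed directly from the observed and expected cell counts (a sketch; the counts in the usage below are illustrative):

```python
import numpy as np
from scipy.stats import chi2

def goodness_of_fit(observed, expected):
    """Pearson X^2 and likelihood-ratio G^2 goodness-of-fit statistics
    with their chi-squared p-values (df = number of cells - 1)."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    x2 = np.sum((observed - expected) ** 2 / expected)
    nonzero = observed > 0            # a cell with observed count 0 contributes 0 to G^2
    g2 = 2.0 * np.sum(observed[nonzero] * np.log(observed[nonzero] / expected[nonzero]))
    df = observed.size - 1
    return x2, g2, chi2.sf(x2, df), chi2.sf(g2, df)
```

    For observed counts [30, 10, 20] against a uniform expectation of [20, 20, 20], both statistics are close to each other, as the asymptotic equivalence suggests.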

  5. The Effect of Debt to Equity Ratio, Current Ratio, and Net Profit Margin on Stock Prices with Price Earnings Ratio as a Moderating Variable in Manufacturing Companies Listed on the Indonesia Stock Exchange (BEI), 2012-2014

    OpenAIRE

    Theresia, Paskah Lia

    2017-01-01

    This study was conducted to analyze the effect of the variables Debt to Equity Ratio (DER), Current Ratio (CR), and Net Profit Margin (NPM) on stock prices, with Price Earnings Ratio (PER) as a moderating variable, in companies listed on the Indonesia Stock Exchange from 2012 to 2014. The sampling technique used is purposive sampling, and the number of samples used is 23 companies. The analysis techniques used are Descriptive Statistic Analysis, Classical Assumption Test, Hypothesis T...

  6. Data report of a tight-lattice rod bundle thermal-hydraulic tests (1). Base case test using 37-rod bundle simulated water-cooled breeder reactor (Contract research)

    International Nuclear Information System (INIS)

    Kureta, Masatoshi; Tamai, Hidesada; Liu, Wei; Akimoto, Hajime; Sato, Takashi; Watanabe, Hironori; Ohnuki, Akira

    2006-03-01

    The Japan Atomic Energy Agency has been performing tight-lattice rod bundle thermal-hydraulic tests to establish the essential technologies for the technological and engineering feasibility of a super-high-burn-up water-cooled breeder reactor, featuring a high breeding ratio and super high burn-up achieved by reducing the core water volume of the water-cooled reactor. The tests are being performed to clarify the fundamental issues related to boiling transition (BT), namely the BT criteria under a highly tight-lattice rod bundle and the effects of gap width between rods and of rod bowing, using 37-rod bundles: a base case test section (1.3 mm gap width) and two parameter-effect test sections (a gap-width-effect section (1.0 mm) and a rod-bowing section). In the present report, we summarize the results from the base case test section. The thermal-hydraulic characteristics of the large-scale test section were obtained for the critical power, the pressure drop and the wall heat transfer under a wide range of pressure, flow rate, etc., including the normal operational conditions of the designed reactor. Effects of the local peaking factor on the critical power were also obtained. (author)

  7. Effects of free-stream turbulence intensity and blowing ratio on film cooling of turbine blade leading edge

    International Nuclear Information System (INIS)

    Kim, S. M.; Kim, Youn J.; Cho, H. H.

    2001-01-01

    We used a cylindrical model simulating a turbine blade leading edge to investigate the effects of free-stream turbulence intensity and blowing ratio on film cooling of the leading edge. Tests were carried out in a low-speed wind tunnel on a cylindrical model with three rows of injection holes. The mainstream Reynolds number based on the cylinder diameter was 7.1×10⁴. Two types of turbulence grid were used to increase the free-stream turbulence intensity. The effect of coolant blowing ratio was studied for various blowing ratios. For each blowing ratio, wall temperatures around the surface of the test model were measured by thermocouples installed inside the model. Results show that the blowing ratio has a small effect on spanwise-averaged film effectiveness at high free-stream turbulence intensity. However, an increase in free-stream turbulence intensity significantly enhances spanwise-averaged film effectiveness at low blowing ratio.

  8. Isotopic ratio based source apportionment of children's blood lead around coking plant area.

    Science.gov (United States)

    Cao, Suzhen; Duan, Xiaoli; Zhao, Xiuge; Wang, Beibei; Ma, Jin; Fan, Delong; Sun, Chengye; He, Bin; Wei, Fusheng; Jiang, Guibin

    2014-12-01

    Lead exposure in the environment is a major hazard affecting human health, particularly for children. The blood lead levels of local children living around the largest coking area in China were measured, and the source of blood lead and the main pathways of lead exposure were investigated based on lead isotopic ratios ((207)Pb/(206)Pb and (208)Pb/(206)Pb) in blood and in a variety of media, including food, airborne particulate matter, soil, dust and drinking water. The children's blood lead level was 5.25 (range 1.59 to 34.36) μg dL(-1), lower than the threshold in the current criteria of China, as defined by the US Centers for Disease Control (10 μg dL(-1)). The isotopic ratios in the blood were 2.111±0.018 for (208)Pb/(206)Pb and 0.864±0.005 for (207)Pb/(206)Pb, similar to those of vegetables, wheat, drinking water and airborne particulate matter, but different from those of vehicle emissions and soil/dust, suggesting that the former were the main pathways of lead exposure among the children. The exposure pathway analysis based on the isotopic ratios and the human health risk assessment showed that dietary intake of food and drinking water contributed 93.67% of total exposed lead. The study further indicated that the coal used in the coking plant is the dominant source of lead pollution in the children's blood. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Development and evaluation of a regression-based model to predict cesium-137 concentration ratios for saltwater fish

    International Nuclear Information System (INIS)

    Pinder, John E.; Rowan, David J.; Smith, Jim T.

    2016-01-01

    Data from published studies and World Wide Web sources were combined to develop a regression model to predict "1"3"7Cs concentration ratios for saltwater fish. Predictions were developed from (1) numeric trophic levels, computed primarily from random resampling of known food items, and (2) K concentrations in the saltwater, for 65 samplings of 41 different species from both the Atlantic and Pacific Oceans. A number of different models were initially developed and evaluated for accuracy, which was assessed as the ratio of independently measured concentration ratios to those predicted by the model. In contrast to freshwater systems, where K concentrations are highly variable and are an important factor affecting fish concentration ratios, the less variable K concentrations in saltwater were relatively unimportant in affecting concentration ratios. As a result, the simplest model, which used only trophic level as a predictor, had accuracy comparable to more complex models that also included K concentrations. A test of model accuracy comparing 56 published concentration ratios from 51 species of marine fish to those predicted by the model indicated that 52 of the predicted concentration ratios were within a factor of 2 of the observed concentration ratios. - Highlights: • We developed a model to predict concentration ratios (C_r) for saltwater fish. • The model requires only a single input variable to predict C_r. • That variable is a mean numeric trophic level available at (fishbase.org). • The K concentrations in seawater were not an important predictor variable. • The median predicted-to-observed ratio for 56 independently measured C_r was 0.83.
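
    The factor-of-2 accuracy criterion used to validate the model can be sketched as follows (the helper name and all concentration-ratio values are hypothetical, for illustration only; they are not the paper's data):

```python
def within_factor(predicted, observed, factor=2.0):
    """Count predictions whose predicted/observed ratio lies within
    [1/factor, factor] -- the accuracy criterion used in the abstract."""
    hits = 0
    for p, o in zip(predicted, observed):
        r = p / o
        if 1.0 / factor <= r <= factor:
            hits += 1
    return hits

# Hypothetical concentration ratios (illustrative numbers only).
pred = [55.0, 80.0, 120.0, 40.0, 300.0]
obs_ = [60.0, 35.0, 100.0, 90.0, 250.0]
n_ok = within_factor(pred, obs_)
```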

  10. DC-to-AC inverter ratio failure detector

    Science.gov (United States)

    Ebersole, T. J.; Andrews, R. E.

    1975-01-01

    The failure detection technique is based upon input-output ratios and is independent of inverter loading. Since the inverter has a fixed relationship between V-in/V-out and I-in/I-out, the failure detection criteria are based on this ratio, which is simply the inverter transformer turns ratio, K, equal to primary turns divided by secondary turns.
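
    A minimal sketch of the ratio check described above (the function name, tolerance, and example voltages are assumptions for illustration, not taken from the report):

```python
def inverter_fault(v_in, v_out, n_primary, n_secondary, tol=0.05):
    """Flag a failure when the measured V_in/V_out ratio deviates from the
    transformer turns ratio K = N_primary / N_secondary by more than a
    fractional tolerance; the check is independent of inverter loading."""
    k = n_primary / n_secondary
    return abs(v_in / v_out - k) / k > tol

# 1:4 step-up transformer -> K = 0.25; 28 V in should give ~112 V out.
healthy = inverter_fault(28.0, 112.0, 1, 4)  # ratio 0.25, within tolerance
failed = inverter_fault(28.0, 140.0, 1, 4)   # ratio 0.20, 20% deviation
```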

  11. BEAT: A Web-Based Boolean Expression Fault-Based Test Case Generation Tool

    Science.gov (United States)

    Chen, T. Y.; Grant, D. D.; Lau, M. F.; Ng, S. P.; Vasa, V. R.

    2006-01-01

    BEAT is a Web-based system that generates fault-based test cases from Boolean expressions. It is based on the integration of several of our fault-based test case selection strategies. The generated test cases are considered fault-based because they aim at the detection of particular faults. For example, when the Boolean expression is in…

  12. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    Science.gov (United States)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it has the greatest uncertainty if the disease is rare or the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model have been introduced, which may solve the SMR problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to the data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method and can overcome the SMR problem when there is no observed bladder cancer in an area.
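
    The SMR itself is simple to compute. The sketch below (illustrative counts, not the Libyan data) also shows the instability the log-normal model addresses: an area with no observed cases gets an SMR of exactly zero, however small its expected count.

```python
def smr(observed, expected):
    """Standardized Morbidity Ratio: observed / expected case count.
    SMR > 1 suggests elevated relative risk; the estimate is unstable
    when the expected count is small."""
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return observed / expected

# Hypothetical area data: (observed cases, expected cases).
areas = {"A": (12, 8.0), "B": (3, 6.0), "C": (0, 1.5)}
risks = {name: smr(o, e) for name, (o, e) in areas.items()}
# Area C has SMR = 0 purely because no case happened to be observed there.
```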

  13. High-Resolution Time-Frequency Spectrum-Based Lung Function Test from a Smartphone Microphone

    Directory of Open Access Journals (Sweden)

    Tharoeun Thap

    2016-08-01

    In this paper, a smartphone-based lung function test, developed to estimate lung function parameters using a high-resolution time-frequency spectrum from a smartphone's built-in microphone, is presented. A method for estimating the forced expiratory volume in 1 s divided by forced vital capacity (FEV1/FVC) based on the variable frequency complex demodulation method (VFCDM) is first proposed. We evaluated the proposed method on 26 subjects, including 13 healthy subjects and 13 chronic obstructive pulmonary disease (COPD) patients, by comparing with the parameters clinically obtained from pulmonary function tests (PFTs). For the healthy subjects, the absolute error (AE) and root mean squared error (RMSE) of the FEV1/FVC ratio were 4.49% ± 3.38% and 5.54%, respectively. For the COPD patients, the AE and RMSE were 10.30% ± 10.59% and 14.48%, respectively. For both groups, we compared the results using the continuous wavelet transform (CWT) and short-time Fourier transform (STFT), and found that VFCDM was superior to both. Further, to estimate other parameters, including forced vital capacity (FVC), forced expiratory volume in 1 s (FEV1), and peak expiratory flow (PEF), regression analysis was conducted to establish a linear transformation. However, the parameters FVC, FEV1, and PEF had correlation coefficients r of 0.323, 0.275, and −0.257, respectively, while FEV1/FVC had an r value of 0.814. These results suggest that only the FEV1/FVC ratio can be accurately estimated from a smartphone built-in microphone. The other parameters, including FVC, FEV1, and PEF, were subjective and dependent on the subject's familiarization with the test and performance of forced exhalation toward the microphone.

  14. A Three End-Member Mixing Model Based on Isotopic Composition and Elemental Ratio

    Directory of Open Access Journals (Sweden)

    Kon-Kee Liu; Shuh-Ji Kao

    2007-01-01

    A three end-member mixing model based on the nitrogen isotopic composition and organic carbon to nitrogen ratio of suspended particulate matter in an aquatic environment has been developed. Mathematical expressions have been derived for calculating the fractions of nitrogen or organic carbon originating from three different sources of distinct isotopic and elemental compositions. The model was successfully applied to determine the contributions of anthropogenic wastes, soils and bedrock-derived sediments to particulate nitrogen and particulate organic carbon in the Danshuei River during the flood caused by Typhoon Bilis in August 2000. The model solutions have been expressed in a general form that allows application to mixtures with other types of isotopic compositions and elemental ratios, or in forms other than suspended particulate matter.
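
    A linearized sketch of such a three end-member system, assuming simple conservative mixing of both the isotopic composition (δ15N) and the elemental ratio (C/N); the published model weights the elemental-ratio equation differently, and all end-member values below are hypothetical:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def mix_fractions(d15n, cn, d15n_mix, cn_mix):
    """Solve the simplified mixing system by Cramer's rule:
        f1 + f2 + f3                         = 1
        f1*d15n[0] + f2*d15n[1] + f3*d15n[2] = d15n_mix
        f1*cn[0]   + f2*cn[1]   + f3*cn[2]   = cn_mix"""
    a = [[1.0, 1.0, 1.0], list(d15n), list(cn)]
    b = [1.0, d15n_mix, cn_mix]
    d = det3(a)
    fracs = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = b[r]  # replace column `col` with the RHS vector
        fracs.append(det3(m) / d)
    return fracs

# Hypothetical end members: waste, soil, bedrock-derived sediment.
f = mix_fractions(d15n=(10.0, 4.0, 1.0), cn=(6.0, 12.0, 20.0),
                  d15n_mix=5.0, cn_mix=12.0)
```

    The three fractions sum to one by construction, mirroring the mass-balance constraint in the paper's general solution.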

  15. Testing effective quantum gravity with gravitational waves from extreme mass ratio inspirals

    International Nuclear Information System (INIS)

    Yunes, N; Sopuerta, C F

    2010-01-01

    Testing for deviations from general relativity (GR) is one of the main goals of the proposed Laser Interferometer Space Antenna. For the first time, we consistently compute the generation of gravitational waves from extreme mass-ratio inspirals (stellar-mass compact objects into supermassive black holes) in a well-motivated alternative theory of gravity that to date remains weakly constrained by binary pulsar observations. The theory we concentrate on is Chern-Simons (CS) modified gravity, a 4-D effective theory that is motivated both by string theory and loop quantum gravity, and which enhances the Einstein-Hilbert action through the addition of a dynamical scalar field and the parity-violating Pontryagin density. We show that although point particles continue to follow geodesics in the modified theory, the background about which they inspiral is a modification of the Kerr metric, which imprints a CS correction on the gravitational waves emitted. CS-modified gravitational waves are sufficiently different from the general relativistic expectation that they lead to significant dephasing after 3 weeks of evolution; such dephasing will probably not prevent detection of these signals, but will instead lead to a systematic error in the determination of parameters. We end with a study of radiation reaction in the modified theory and show that, to leading order, energy-momentum emission is not CS modified, except possibly for the subdominant effect of scalar-field emission. The inclusion of radiation reaction will allow for tests of CS modified gravity with space-borne detectors that might be two orders of magnitude stronger than current binary pulsar bounds.

  16. MIR hollow waveguide (HWG) isotope ratio analyzer for environmental applications

    Science.gov (United States)

    Wang, Zhenyou; Zhuang, Yan; Deev, Andrei; Wu, Sheng

    2017-05-01

    An advanced commercial mid-infrared isotope ratio (IR2) analyzer based on a hollow waveguide (HWG) as the sample tube was developed at Arrow Grand Technologies. The stable carbon isotope ratio, i.e. δ13C, is obtained by measuring selected CO2 absorption peaks in the MIR. Combined with a GC and a combustor, the analyzer has been successfully employed to measure compound-specific δ13C isotope ratios in the field. By using both the 1-pass HWG and the 5-path HWG, we are able to measure the δ13C isotope ratio over a broad CO2 concentration range of 300 ppm-37,500 ppm. Here, we demonstrate its applications in environmental studies. The δ13C isotope ratio and concentration of CO2 exhaled by soil samples were measured in real time with the isotope analyzer; the concentration was found to change with time. We also convert dissolved inorganic carbon (DIC) into CO2 and then measure the δ13C isotope ratio with an accuracy of better than 0.3‰ (1σ), with a 6 min test time and 1 ml sample usage. Tap water, NaHCO3 solvent, coca, and even beer were tested. Lastly, the 13C isotope ratio of CO2 exhaled by human beings was obtained <10 seconds after simply blowing the exhaled CO2 into a tube, with an accuracy of 0.5‰ (1σ) without sample preconditioning. In summary, a commercial HWG isotope analyzer was demonstrated to be able to perform environmental and health studies with high accuracy (0.3‰/Hz^(1/2), 1σ), fast sampling rate (up to 10 Hz), low sample consumption (1 ml), and a broad CO2 concentration range (300 ppm-37,500 ppm).
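
    For reference, the δ13C value such an analyzer reports is defined relative to the VPDB standard; a minimal sketch follows (the numeric standard ratio is a commonly cited figure included here as an assumption, and the function name is mine):

```python
R_VPDB = 0.011180  # commonly cited 13C/12C ratio of the VPDB reference standard

def delta13c(r_sample):
    """delta-13C in per mil relative to VPDB:
    d13C = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A sample whose 13C/12C ratio is 2.5% below VPDB reads -25 per mil.
d = delta13c(R_VPDB * 0.975)
```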

  17. Control of size and aspect ratio in hydroquinone-based synthesis of gold nanorods

    Energy Technology Data Exchange (ETDEWEB)

    Morasso, Carlo, E-mail: cmorasso@dongnocchi.it; Picciolini, Silvia; Schiumarini, Domitilla [Fondazione Don Carlo Gnocchi ONLUS, Laboratory of Nanomedicine and Clinical Biophotonics (LABION) (Italy); Mehn, Dora; Ojea-Jiménez, Isaac [European Commission Joint Research Centre, Institute for Health and Consumer Protection (IHCP) (Italy); Zanchetta, Giuliano [Universitá degli Studi di Milano, Dipartimento di Biotecnologie Mediche e Medicina Traslazionale (Italy); Vanna, Renzo; Bedoni, Marzia [Fondazione Don Carlo Gnocchi ONLUS, Laboratory of Nanomedicine and Clinical Biophotonics (LABION) (Italy); Prosperi, Davide [Università degli Studi di Milano Bicocca, NanoBioLab, Dipartimento di Biotecnologie e Bioscienze (Italy); Gramatica, Furio [Fondazione Don Carlo Gnocchi ONLUS, Laboratory of Nanomedicine and Clinical Biophotonics (LABION) (Italy)

    2015-08-15

    In this article, we describe how the size and the aspect ratio of gold nanorods obtained with a highly efficient protocol based on hydroquinone as a reducing agent can be tuned by varying the amounts of CTAB and silver ions present in the seed-growth solution. Our approach not only provides a four-fold increase in the Au{sup 3+} reduction yield compared with the commonly used protocol based on ascorbic acid, but also allows a remarkable 50-60 % reduction in the amount of CTAB needed. In fact, according to our findings, the concentration of CTAB present in the seed-growth solution does not linearly influence the final aspect ratio of the obtained nanorods, and an optimal concentration range between 30 and 50 mM has been identified as the one able to generate particles with more elongated shapes. For the optimized protocol, the effect of the concentration of Ag{sup +} ions in the seed-growth solution and the stability of the obtained particles have also been investigated.

  18. Testing of January Anomaly at ISE-100 Index with Power Ratio Method

    Directory of Open Access Journals (Sweden)

    Şule Yüksel Yiğiter

    2015-12-01

    Abstract: According to the Efficient Market Hypothesis, no investor who accesses the same information at the same rate as all others can earn higher returns. However, several studies have demonstrated an effect of time on returns, reaching conclusions that conflict with the hypothesis. In this context, one of the most important documented anomalies is the January anomaly. In this study, we investigate whether there is a January effect in the BIST-100 index over the 2008-2014 period using the power ratio method. The analysis results confirm the presence of a January anomaly in the BIST-100 index within the specified period. Keywords: Efficient Markets Hypothesis, January Anomaly, Power Ratio Method. JEL Classification Codes: G1, C22

  19. Space Launch System Base Heating Test: Environments and Base Flow Physics

    Science.gov (United States)

    Mehta, Manish; Knox, Kyle S.; Seaford, C. Mark; Dufrene, Aaron T.

    2016-01-01

    The NASA Space Launch System (SLS) vehicle is composed of four RS-25 liquid oxygen-hydrogen rocket engines in the core stage and two 5-segment solid rocket boosters, and as a result six hot supersonic plumes interact within the aft section of the vehicle during flight. Due to the complex nature of rocket plume-induced flows within the launch vehicle base during ascent and a new vehicle configuration, sub-scale wind tunnel testing is required to reduce SLS base convective environment uncertainty and design risk levels. This hot-fire test program was conducted at the CUBRC Large Energy National Shock (LENS) II short-duration test facility to simulate flight from altitudes of 50 kft to 210 kft. The test program is a challenging and innovative effort that has not been attempted in 40+ years for a NASA vehicle. This presentation discusses the various trends of base convective heat flux and pressure as a function of altitude at various locations within the core-stage and booster base regions of the two-percent SLS wind tunnel model. In-depth understanding of the base flow physics is presented using the test data, infrared high-speed imaging and theory. The normalized test design environments are compared to various NASA semi-empirical numerical models to determine exceedance and conservatism of the flight-scaled test-derived base design environments. A brief discussion of the thermal impact to the launch vehicle base components is also presented.

  20. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  1. Fault Modeling and Testing for Analog Circuits in Complex Space Based on Supply Current and Output Voltage

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    This paper deals with fault modeling for analog circuits. A two-dimensional (2D) fault model is first proposed, based on a collaborative analysis of supply current and output voltage. This model is a family of circle loci on the complex plane, and it greatly simplifies the algorithms for test point selection and potential fault simulation, which are primary difficulties in the fault diagnosis of analog circuits. Furthermore, in order to reduce the difficulty of fault location, an improved fault model in three-dimensional (3D) complex space is proposed, which achieves a far better fault detection ratio (FDR) against measurement error and parametric tolerance. To address the problem of fault masking in both the 2D and 3D fault models, this paper proposes an effective design-for-testability (DFT) method. By adding redundant bypassing components in the circuit under test (CUT), this method achieves an excellent fault isolation ratio (FIR) in ambiguity group isolation. The efficacy of the proposed model and testing method is validated through the experimental results provided in this paper.

  2. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin... in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality. FUNDING: The study received financial support from the Sarah Krabbe Foundation, the General Practitioners' Education and Development Foundation...

  3. Oscillation-based test in mixed-signal circuits

    CERN Document Server

    Sánchez, Gloria Huertas; Rueda, Adoración Rueda

    2007-01-01

    This book presents the development and experimental validation of the structural test strategy called Oscillation-Based Test (OBT). The results presented here assert, not only from a theoretical point of view but also on the basis of wide experimental support, that OBT is an efficient defect-oriented test solution, complementing the existing functional test techniques for mixed-signal circuits.

  4. Composition-ratio influence on resistive switching behavior of solution-processed InGaZnO-based thin-film.

    Science.gov (United States)

    Hwang, Yeong-Hyeon; Hwang, Inchan; Cho, Won-Ju

    2014-11-01

    The influence of composition ratio on the bipolar resistive switching behavior of resistive switching memory devices based on amorphous indium-gallium-zinc-oxide (a-IGZO) prepared by a spin-coating process was investigated. To study the stoichiometric effects of the a-IGZO films on device characteristics, four devices with In/Ga/Zn stoichiometries of 1:1:1, 3:1:1, 1:3:1, and 1:1:3 were fabricated and characterized. The 3:1:1 film showed ohmic behavior and the 1:1:3 film showed rectifying switching behavior. The current-voltage characteristics of the a-IGZO films with stoichiometries of 1:1:1 and 1:3:1, however, showed bipolar resistive memory switching behavior. We found that a three-fold increase in the gallium content reduces the reset voltage from -0.9 to -0.4 V and enhances the current ratio of the high to low resistive states from 0.7x10^1 to 3x10^1. Our results show that increasing the Ga composition ratio in the a-IGZO-based ReRAM cells effectively improves device performance and reliability by increasing the initial defect density in the a-IGZO films.

  5. [The effect of core veneer thickness ratio on the flexural strength of diatomite-based dental ceramic].

    Science.gov (United States)

    Jiang, Jie; Zhang, Xin; Gao, Mei-qin; Zhang, Fei-min; Lu, Xiao-li

    2015-06-01

    To evaluate the effect of different core-veneer thickness ratios on the flexural strength and failure mode of bilayered diatomite-based dental ceramics, diatomite-based dental ceramic blocks (16 mm×5.4 mm×1 mm) were sintered with different thicknesses of veneer porcelain: 0 mm (group A), 0.6 mm (group B), 0.8 mm (group C) and 1.0 mm (group D). Flexural strength was measured, and scanning electron microscopy was used to observe the interface microstructure. Statistical analysis was performed using the SPSS 17.0 software package. As the thickness of the veneer porcelain increased, group C showed the highest flexural strength, up to (277.24±5.47) MPa. Different core-veneer thickness ratios can significantly influence the flexural strength of bilayered diatomite-based dental ceramics. Supported by Science and Technology Projects of Nantong City (HS2013010).

  6. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injury to people, equipment, or the environment. An important challenge in testing is the derivation of test cases that can identify potential faults. Model-based testing adopts models of a

  7. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio.

    Science.gov (United States)

    Rottman, Benjamin Margolin

    2017-02-01

    Whether humans can accurately make decisions in line with Bayes' rule has been one of the most important yet contentious topics in cognitive psychology. Though a number of paradigms have been used for studying Bayesian updating, rarely have subjects been allowed to use their own preexisting beliefs about the prior and the likelihood. A study is reported in which physicians judged the posttest probability of a diagnosis for a patient vignette after receiving a test result, and the physicians' posttest judgments were compared to the normative posttest calculated from their own beliefs in the sensitivity and false positive rate of the test (likelihood ratio) and prior probability of the diagnosis. On the one hand, the posttest judgments were strongly related to the physicians' beliefs about both the prior probability as well as the likelihood ratio, and the priors were used considerably more strongly than in previous research. On the other hand, both the prior and the likelihoods were still not used quite as much as they should have been, and there was evidence of other nonnormative aspects to the updating, such as updating independent of the likelihood beliefs. By focusing on how physicians use their own prior beliefs for Bayesian updating, this study provides insight into how well experts perform probabilistic inference in settings in which they rely upon their own prior beliefs rather than experimenter-provided cues. It suggests that there is reason to be optimistic about experts' abilities, but that there is still considerable need for improvement.
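
    The normative benchmark the study compares judgments against is the odds form of Bayes' rule: post-test odds = pre-test odds × likelihood ratio, with LR+ = sensitivity / false-positive rate for a positive result. A minimal sketch (function name and example numbers are mine, for illustration):

```python
def posttest_probability(prior, sensitivity, false_positive_rate):
    """Normative post-test probability after a positive test result,
    via the odds form of Bayes' rule:
    posterior odds = prior odds * LR+, LR+ = sensitivity / FPR."""
    lr_pos = sensitivity / false_positive_rate
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * lr_pos
    return post_odds / (1.0 + post_odds)

# Example: 10% prior, 80% sensitivity, 10% false-positive rate.
# LR+ = 8, prior odds 1/9 -> posterior odds 8/9 -> probability 8/17.
p = posttest_probability(0.10, 0.80, 0.10)
```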

  8. Holes at High Blowing Ratios

    Directory of Open Access Journals (Sweden)

    Phillip M. Ligrani

    1996-01-01

    Experimental results are presented which describe the development and structure of flow downstream of a single row of holes with compound angle orientations producing film cooling at high blowing ratios. This film cooling configuration is important because similar arrangements are frequently employed on the first stage of rotating blades of operating gas turbine engines. With this configuration, holes are spaced 6d apart in the spanwise direction, with inclination angles of 24 degrees and orientation angles of 50.5 degrees. Blowing ratios range from 1.5 to 4.0 and the ratio of injectant to freestream density is near 1.0. Results show that spanwise-averaged adiabatic effectiveness, spanwise-averaged iso-energetic Stanton number ratios, surveys of streamwise mean velocity, and surveys of injectant distributions change by important amounts as the blowing ratio increases. This is due to injectant lift-off from the test surface just downstream of the holes.

  9. DFT based spatial multiplexing and maximum ratio transmission for mm-wave large MIMO

    DEFF Research Database (Denmark)

    Phan-Huy, D.-T.; Tölli, A.; Rajatheva, N.

    2014-01-01

    By using large point-to-point multiple input multiple output (MIMO), spatial multiplexing of a large number of data streams in wireless communications using millimeter-waves (mm-waves) can be achieved. However, according to the antenna spacing and transmitter-receiver distance, the MIMO channel is likely to be ill-conditioned. In such conditions, highly complex schemes such as the singular value decomposition (SVD) are necessary. In this paper, we propose a new low-complexity system called discrete Fourier transform based spatial multiplexing with maximum ratio transmission (DFT-SM-MRT). When the DFT-SM scheme alone is used, the data streams are either mapped onto different angles of departure in the case of aligned linear arrays, or mapped onto different orbital angular momenta in the case of aligned circular arrays. Maximum ratio transmission pre-equalizes the channel...

  10. Extinction ratio enhancement of SOA-based delayed-interference signal converter using detuned filtering

    Science.gov (United States)

    Zhang, B.; Kumar, S.; Yan, L.-S.; Willner, A. E.

    2007-12-01

    We experimentally demonstrate a >3 dB extinction ratio improvement at the output of an SOA-based delayed-interference signal converter (DISC) using optical off-centered filtering. Through careful modeling of the carrier and phase dynamics, we explain in detail the origin of sub-pulses in the wavelength-converted output, with an emphasis on the time-resolved frequency chirping of the output signal. Our simulations show that the sub-pulses and the main pulses are oppositely chirped, which is also verified experimentally by analyzing the output with a chirp form analyzer. We propose and demonstrate an optical off-center filtering technique which effectively suppresses these sub-pulses. The effects of filter detuning and phase bias adjustment in the delayed interferometer are experimentally characterized and optimized, leading to a >3 dB extinction ratio enhancement of the output signal.

  11. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    Science.gov (United States)

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites reflect the water quality status of rivers, surface water quality management based on monitoring sections or sites can be effective. To improve river water quality, quantifying the contribution ratios of pollutant sources to a specific section is necessary. Because the physical and chemical processes of nutrient pollutants in water bodies are complex, it is difficult to compute these contribution ratios quantitatively. However, water quality models have proved to be effective tools for estimating surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Contribution ratios were then analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with the pollutant load ratios of different sources in the watershed, an analysis of the contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, is more suitable for local and regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.
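
    Once a calibrated model attributes the load reaching a section to individual sources, the contribution ratios reduce to shares of the summed loads; a sketch with hypothetical numbers (not the Xin'anjiang results):

```python
def contribution_ratios(loads):
    """Percent contribution of each pollutant source to the load reaching
    a monitoring section, given source-attributed loads in the same units."""
    total = sum(loads.values())
    return {src: 100.0 * v / total for src, v in loads.items()}

# Hypothetical total-nitrogen loads (t/yr) attributed to sources upstream
# of a section; values are illustrative only.
tn = {"Lianjiang tributary": 50.4, "point sources": 21.6, "diffuse runoff": 28.0}
shares = contribution_ratios(tn)
```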

  12. Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2014-01-01

    Full Text Available The computational cost of blind source separation based on bio-inspired intelligence optimization is high. To address this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is used as the objective function, and the artificial bee colony algorithm is used to optimize it. Each source signal component that is separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves a significant reduction in computation and an improvement in the quality of signal separation compared to previous algorithms.

  13. Grinding temperature and energy ratio coefficient in MQL grinding of high-temperature nickel-base alloy by using different vegetable oils as base oil

    Institute of Scientific and Technical Information of China (English)

    Li Benkai; Li Changhe; Zhang Yanbin; Wang Yaogang; Jia Dongzhou; Yang Min

    2016-01-01

    Vegetable oil can be used as a base oil in minimum quantity lubrication (MQL). This study compared the performances of MQL grinding by using castor oil, soybean oil, rapeseed oil, corn oil, sunflower oil, peanut oil, and palm oil as base oils. A K-P36 numerical-control precision surface grinder was used to perform plain grinding on a workpiece made of a high-temperature nickel-base alloy. A YDM-III 99 three-dimensional dynamometer was used to measure grinding force, and a clip-type thermocouple was used to determine grinding temperature. The grinding force, grinding temperature, and energy ratio coefficient of MQL grinding were compared among the seven vegetable oil types. Results revealed that (1) castor oil-based MQL grinding yields the lowest grinding force but exhibits the highest grinding temperature and energy ratio coefficient; (2) palm oil-based MQL grinding generates the second lowest grinding force but shows the lowest grinding temperature and energy ratio coefficient; (3) MQL grinding based on the five other vegetable oils produces similar grinding forces, grinding temperatures, and energy ratio coefficients, with values ranging between those of castor oil and palm oil; (4) viscosity influences grinding force and grinding temperature to a greater extent than fatty acid varieties and contents in vegetable oils; (5) although more viscous vegetable oil exhibits greater lubrication and significantly lower grinding force than less viscous vegetable oil, high viscosity reduces the heat exchange capability of vegetable oil and thus yields a high grinding temperature; (6) saturated fatty acid is a more efficient lubricant than unsaturated fatty acid; and (7) a short carbon chain transfers heat more effectively than a long carbon chain. Palm oil is the optimum base oil for MQL grinding, and this base oil yields 26.98 N tangential grinding force, 87.10 N normal grinding force, 119.6 °C grinding temperature, and 42.7% energy ratio coefficient.

  14. Design, Fabrication, and Performance Test of a 100-W Helical-Blade Vertical-Axis Wind Turbine at Low Tip-Speed Ratio

    Directory of Open Access Journals (Sweden)

    Dowon Han

    2018-06-01

    Full Text Available A 100-W helical-blade vertical-axis wind turbine was designed, manufactured, and tested in a wind tunnel. A relatively low tip-speed ratio of 1.1 was targeted for use in an urban environment at a rated wind speed of 9 m/s and a rotational speed of 170 rpm. The basic dimensions were determined through a momentum-based design method according to the IEC 61400-2 protocol. The power output was estimated by a mathematical model that takes into account the aerodynamic performance of the NACA0018 blade shape. The lift and drag of the blade with respect to the angle of attack during rotation were calculated using 2D computational fluid dynamics (CFD) simulations to take the stall region into account. The average power output calculated by the model was 108.34 W, which satisfies the target output of 100 W. The manufactured wind turbine was tested in a large closed-circuit wind tunnel, and the power output was measured for given wind speeds. At the design condition, the measured power output was 114.7 W, which is 5.9% higher than that of the mathematical model. This result validates the proposed design method and the power estimation by the mathematical model.
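    The quoted design point (TSR 1.1, 9 m/s, 170 rpm) fixes the rotor radius through the tip-speed-ratio definition λ = ωR/v. A quick back-of-the-envelope check; the implied radius is an inference from those numbers, not a figure stated in the abstract.

    ```python
    import math

    # Tip-speed ratio lambda = omega * R / v, with omega converted from rpm.
    # Design point from the abstract: TSR 1.1, 170 rpm, 9 m/s rated wind speed.

    def tip_speed_ratio(radius_m, rpm, wind_speed):
        omega = rpm * 2.0 * math.pi / 60.0      # rotational speed in rad/s
        return omega * radius_m / wind_speed

    def radius_for_tsr(tsr, rpm, wind_speed):
        omega = rpm * 2.0 * math.pi / 60.0
        return tsr * wind_speed / omega

    r = radius_for_tsr(1.1, 170.0, 9.0)
    print(round(r, 3))                          # → 0.556 (implied rotor radius, m)
    ```

    The two functions are inverses of each other at a fixed rpm and wind speed, which the assertion below exploits.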

  15. On school choice and test-based accountability.

    Directory of Open Access Journals (Sweden)

    Damian W. Betebenner

    2005-10-01

    Full Text Available Two of the most prominent school reform measures currently being implemented in the United States are school choice and test-based accountability. Until recently, the two policy initiatives remained relatively distinct from one another. With the passage of the No Child Left Behind Act of 2001 (NCLB), a mutualism between choice and accountability emerged whereby school choice complements test-based accountability. In the first portion of this study we present a conceptual overview of school choice and test-based accountability and explicate connections between the two that are explicit in reform implementations like NCLB or implicit within the market-based reform literature in which school choice and test-based accountability reside. In the second portion we scrutinize these connections using data from a large western school district with a popular choice system in place. Data from three sources are combined to explore the ways in which school choice and test-based accountability draw on each other: state assessment data of children in the district, school choice data for every participating student in the district choice program, and a parental survey of both participants and non-participants of choice asking their attitudes concerning the use of school report cards in the district. Results suggest that choice is of academic benefit only to the lowest-achieving students, that choice participation is not uniform across different ethnic groups in the district, and that parents' primary motivations for participating in choice, as reported on a survey, are not test scores, though this is not consistent with choice preferences among parents in the district. As such, our results generally confirm the hypotheses of choice critics more so than advocates. Keywords: school choice; accountability; student testing.

  16. Cross-Mode Comparability of Computer-Based Testing (CBT) versus Paper-Pencil Based Testing (PPT): An Investigation of Testing Administration Mode among Iranian Intermediate EFL Learners

    Science.gov (United States)

    Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi

    2017-01-01

    The advent of technology has caused growing interest in using computers to convert conventional paper-and-pencil-based testing (henceforth PPT) into computer-based testing (henceforth CBT) in the field of education during the last decades. This constant promulgation of computers to reshape conventional tests into a computerized format permeated the…

  17. Time-dependent fracture probability of bilayer, lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation

    Science.gov (United States)

    Anusavice, Kenneth J.; Jadaan, Osama M.; Esquivel-Upshaw, Josephine

    2013-01-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. Objective The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Materials and methods Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Results Predicted fracture probabilities (Pf) for centrally-loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). Conclusion CARES/Life results support the proposed crown design and load orientation hypotheses. Significance The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represent an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. PMID:24060349

  18. Time-dependent fracture probability of bilayer, lithium-disilicate-based, glass-ceramic, molar crowns as a function of core/veneer thickness ratio and load orientation.

    Science.gov (United States)

    Anusavice, Kenneth J; Jadaan, Osama M; Esquivel-Upshaw, Josephine F

    2013-11-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. The aims of this study were to test the hypotheses that: (1) an increase in core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2), and ceramic veneer (Empress 2 Veneer Ceramic). Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8mm/0.8mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4mm/1.2mm). CARES/Life results support the proposed crown design and load orientation hypotheses. The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represent an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. Copyright © 2013 Academy of Dental Materials. All rights reserved.
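    The time-dependent fracture probabilities in these two records come from combining Weibull failure statistics with power-law slow crack growth (the approach behind CARES/Life). The sketch below shows only that general structure: an applied stress is rescaled to an equivalent static-fatigue stress that grows with service time, then fed into a two-parameter Weibull model. All parameter values (m, sigma_0, fatigue exponent n, t_ref) are illustrative assumptions, not the paper's fitted data.

    ```python
    import math

    # Hedged sketch of time-dependent fracture probability:
    # two-parameter Weibull statistics plus a power-law equivalent stress.
    # Parameters below are illustrative, not the study's fitted values.

    def weibull_pf(sigma, m, sigma_0):
        """Instantaneous probability of failure at stress sigma."""
        return 1.0 - math.exp(-((sigma / sigma_0) ** m))

    def time_dependent_pf(sigma, t_years, m, sigma_0, n, t_ref=1.0):
        """Pf after t_years of service, via a power-law equivalent stress."""
        sigma_eq = sigma * (t_years / t_ref) ** (1.0 / n)
        return weibull_pf(sigma_eq, m, sigma_0)

    # Pf must grow monotonically with service time, as in the 1/5/10-year table.
    pf = [time_dependent_pf(60.0, t, m=8.0, sigma_0=160.0, n=20.0) for t in (1, 5, 10)]
    assert pf[0] < pf[1] < pf[2]
    ```

    The monotone growth of Pf with time mirrors the 1-, 5-, and 10-year progression reported in the abstract; the actual analysis additionally integrates stresses over the finite element model of the crown.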

  19. Association between mild cognitive impairment and trajectory-based spatial parameters during timed up and go test using a laser range sensor.

    Science.gov (United States)

    Nishiguchi, Shu; Yorozu, Ayanori; Adachi, Daiki; Takahashi, Masaki; Aoyama, Tomoki

    2017-08-08

    The Timed Up and Go (TUG) test may be a useful tool to detect not only mobility impairment but also possible cognitive impairment. In this cross-sectional study, we used the TUG test to investigate the associations between trajectory-based spatial parameters measured by laser range sensor (LRS) and cognitive impairment in community-dwelling older adults. The participants were 63 community-dwelling older adults (mean age, 73.0 ± 6.3 years). The trajectory-based spatial parameters during the TUG test were measured using an LRS. In each forward and backward phase, we calculated the minimum distance from the marker, the maximum distance from the x-axis (center line), the length of the trajectories, and the area of region surrounded by the trajectory of the center of gravity and the x-axis (center line). We measured mild cognitive impairment using the Mini-Mental State Examination score (26/27 was the cut-off score for defining mild cognitive impairment). Compared with participants with normal cognitive function, those with mild cognitive impairment exhibited the following trajectory-based spatial parameters: short minimum distance from the marker (p = 0.044), narrow area of center of gravity in the forward phase (p = 0.012), and a large forward/whole phase ratio of the area of the center of gravity (p = 0.026) during the TUG test. In multivariate logistic regression analyses, a short minimum distance from the marker (odds ratio [OR]: 0.82, 95% confidence interval [CI]: 0.69-0.98), narrow area of the center of gravity in the forward phase (OR: 0.01, 95% CI: 0.00-0.36), and large forward/whole phase ratio of the area of the center of gravity (OR: 0.94, 95% CI: 0.88-0.99) were independently associated with mild cognitive impairment. In conclusion, our results indicate that some of the trajectory-based spatial parameters measured by LRS during the TUG test were independently associated with cognitive impairment in older adults. In particular, older adults with
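    Two of the trajectory-based spatial parameters named above (path length and the area between the center-of-gravity trajectory and the center line) are simple geometric quantities. A minimal sketch, assuming the trajectory is available as sampled (x, y) points with the x-axis as the center line; the sample trajectory is invented, not LRS data.

    ```python
    # Sketch of two trajectory parameters: polyline length and the area
    # between the trajectory and the x-axis (trapezoidal rule). Assumes the
    # trajectory stays on one side of the center line between samples.

    def path_length(points):
        """Total length of the polyline."""
        return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    def area_to_center_line(points):
        """Unsigned area between the trajectory and the x-axis."""
        return sum(abs(y1 + y2) / 2.0 * abs(x2 - x1)
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    traj = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.1), (3.0, 0.0)]
    print(round(area_to_center_line(traj), 3))   # → 0.3
    ```

    In the study, such areas are computed separately for the forward and backward phases, and their ratio is one of the parameters associated with cognitive impairment.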

  20. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  1. Team-Based Testing Improves Individual Learning

    Science.gov (United States)

    Vogler, Jane S.; Robinson, Daniel H.

    2016-01-01

    In two experiments, 90 undergraduates took six tests as part of an educational psychology course. Using a crossover design, students took three tests individually without feedback and then took the same test again, following the process of team-based testing (TBT), in teams in which the members reached consensus for each question and answered…

  2. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability

  3. A Comparison Study of Return Ratio-Based Academic Enrollment Forecasting Models. Professional File. Article 129, Spring 2013

    Science.gov (United States)

    Zan, Xinxing Anna; Yoon, Sang Won; Khasawneh, Mohammad; Srihari, Krishnaswami

    2013-01-01

    In an effort to develop a low-cost and user-friendly forecasting model to minimize forecasting error, we have applied average and exponentially weighted return ratios to project undergraduate student enrollment. We tested the proposed forecasting models with different sets of historical enrollment data, such as university-, school-, and…
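    A return-ratio forecast of the kind described here projects the next term's enrollment as the current enrollment times a smoothed historical term-over-term ratio. The sketch below uses exponential weighting; the enrollment history and smoothing weight are illustrative, not the study's data.

    ```python
    # Hedged sketch of return-ratio enrollment forecasting. History and the
    # smoothing weight alpha are made up for illustration.

    def return_ratios(enrollments):
        """Term-over-term return ratios e[t+1] / e[t]."""
        return [later / earlier for earlier, later in zip(enrollments, enrollments[1:])]

    def ew_ratio(ratios, alpha=0.5):
        """Exponentially weighted return ratio (recent terms weigh more)."""
        smoothed = ratios[0]
        for r in ratios[1:]:
            smoothed = alpha * r + (1.0 - alpha) * smoothed
        return smoothed

    def forecast_next(enrollments, alpha=0.5):
        """Project next-term enrollment from the smoothed return ratio."""
        return enrollments[-1] * ew_ratio(return_ratios(enrollments), alpha)

    history = [1000.0, 1040.0, 1060.0, 1100.0]
    print(round(forecast_next(history)))   # → 1137
    ```

    Replacing `ew_ratio` with a plain mean of the ratios gives the "average return ratio" variant the abstract also mentions.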

  4. Kernel-based tests for joint independence

    DEFF Research Database (Denmark)

    Pfister, Niklas; Bühlmann, Peter; Schölkopf, Bernhard

    2018-01-01

    We investigate the problem of testing whether $d$ random variables, which may or may not be continuous, are jointly (or mutually) independent. Our method builds on ideas of the two variable Hilbert-Schmidt independence criterion (HSIC) but allows for an arbitrary number of variables. We embed the $d$-dimensional joint distribution and the product of the marginals into a reproducing kernel Hilbert space and define the $d$-variable Hilbert-Schmidt independence criterion (dHSIC) as the squared distance between the embeddings. In the population case, the value of dHSIC is zero if and only if the $d$ variables are jointly independent, as long as the kernel is characteristic. Based on an empirical estimate of dHSIC, we define three different non-parametric hypothesis tests: a permutation test, a bootstrap test and a test based on a Gamma approximation. We prove that the permutation test ...
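    The $d$-variable construction is the paper's contribution; the two-variable special case with a permutation test can be sketched compactly. Gaussian kernels with a fixed unit bandwidth, the sample size, and the number of permutations below are assumptions made for illustration only.

    ```python
    import math
    import random

    # Minimal two-variable HSIC permutation test (the d-variable dHSIC of the
    # abstract generalizes this). Fixed-bandwidth Gaussian kernels assumed.

    def gram(xs, bw=1.0):
        """Gaussian kernel Gram matrix."""
        return [[math.exp(-((a - b) ** 2) / (2.0 * bw * bw)) for b in xs] for a in xs]

    def hsic(xs, ys, bw=1.0):
        """Biased HSIC estimate: trace(K H L H) / n^2, H the centering matrix."""
        n = len(xs)
        K, L = gram(xs, bw), gram(ys, bw)
        H = [[(1.0 if i == j else 0.0) - 1.0 / n for j in range(n)] for i in range(n)]
        def matmul(A, B):
            return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                    for i in range(n)]
        M = matmul(matmul(matmul(K, H), L), H)
        return sum(M[i][i] for i in range(n)) / (n * n)

    def perm_test(xs, ys, n_perm=50, seed=0):
        """p-value: fraction of permuted statistics at least as large as observed."""
        rng = random.Random(seed)
        obs = hsic(xs, ys)
        hits = sum(hsic(xs, rng.sample(ys, len(ys))) >= obs for _ in range(n_perm))
        return (1 + hits) / (1 + n_perm)

    rng = random.Random(1)
    x = [rng.gauss(0.0, 1.0) for _ in range(25)]
    y_dep = [xi + 0.1 * rng.gauss(0.0, 1.0) for xi in x]   # strongly dependent
    print(perm_test(x, y_dep))                              # small p-value
    ```

    Permuting one sample breaks any dependence while preserving the marginals, which is exactly the null the test simulates; the paper proves validity of this scheme for the $d$-variable statistic.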

  5. Validity evidence based on test content.

    Science.gov (United States)

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  6. The reaction index and positivity ratio revisited

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner; Andersen, Flemming

    2008-01-01

    BACKGROUND AND OBJECTIVES: Assessing the quality of patch test preparations continues to be a challenge. Two parameters, the reaction index (RI) and positivity ratio (PR), have been proposed as quality indicators by the Information Network of Departments of Dermatology (IVDK). The value of these ...

  7. Test Review: Test of English as a Foreign Language[TM]--Internet-Based Test (TOEFL iBT[R])

    Science.gov (United States)

    Alderson, J. Charles

    2009-01-01

    In this article, the author reviews the TOEFL iBT which is the latest version of the TOEFL, whose history stretches back to 1961. The TOEFL iBT was introduced in the USA, Canada, France, Germany and Italy in late 2005. Currently the TOEFL test is offered in two testing formats: (1) Internet-based testing (iBT); and (2) paper-based testing (PBT).…

  8. Goodness of Fit Test and Test of Independence by Entropy

    Directory of Open Access Journals (Sweden)

    M. Sharifdoost

    2009-06-01

    Full Text Available To test whether a set of data has a specific distribution or not, we can use the goodness of fit test. This test can be done using either the Pearson X² statistic or the likelihood ratio statistic G², which are asymptotically equal, and also by using the Kolmogorov-Smirnov statistic in continuous distributions. In this paper, we introduce a new test statistic for the goodness of fit test which is based on entropy distance, and which can be applied for large sample sizes. We compare this new statistic with the classical test statistics X², G², and Tn by some simulation studies. We conclude that the new statistic is more sensitive than the usual statistics to the rejection of distributions which are very close to the desired distribution. Also, for testing independence, a new test statistic based on mutual information is introduced.
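    The two classical statistics the new entropy-based statistic is compared against are short formulas over observed and expected cell counts. A minimal sketch; the counts are illustrative, not from the paper's simulations.

    ```python
    import math

    # Pearson's X^2 and the likelihood-ratio statistic G^2 for observed
    # counts versus expected counts under the null distribution.

    def pearson_x2(observed, expected):
        return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

    def likelihood_ratio_g2(observed, expected):
        # Terms with o = 0 contribute 0 (the o*log(o/e) -> 0 limit).
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)

    observed_counts = [18, 22, 29, 31]           # illustrative counts, total 100
    expected_counts = [25.0, 25.0, 25.0, 25.0]   # uniform null
    print(round(pearson_x2(observed_counts, expected_counts), 3),
          round(likelihood_ratio_g2(observed_counts, expected_counts), 3))
    # X^2 = 4.4, G^2 ≈ 4.494 — numerically close, as the asymptotic
    # equivalence mentioned in the abstract suggests.
    ```

    Both statistics are compared against a chi-squared distribution with (number of cells − 1) degrees of freedom in the plain goodness-of-fit setting.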

  9. A Novel Inflammation-Based Prognostic Score: The Fibrinogen/Albumin Ratio Predicts Prognoses of Patients after Curative Resection for Hepatocellular Carcinoma

    Directory of Open Access Journals (Sweden)

    Qiaodong Xu

    2018-01-01

    Full Text Available Background. Inflammation is an important hallmark of cancer. Fibrinogen and albumin are both vital factors in systemic inflammation. This study investigated the prognostic value of the fibrinogen/albumin ratio in HCC patients who underwent curative resection. Methods. HCC patients (n = 151) who underwent curative resection were evaluated retrospectively. The optimal cutoff value for the fibrinogen/albumin ratio was selected by receiver operating characteristic (ROC) curve analysis. Correlations between preoperative fibrinogen/albumin ratios and clinicopathologic characteristics were analyzed by the χ² test. The area under the receiver operating characteristic curve (AUC) was calculated to compare the prognostic value of the fibrinogen/albumin ratio with other prognostic scores (neutrophil to lymphocyte ratio (NLR), platelet to lymphocyte ratio (PLR), and albumin-bilirubin (ALBI) score). The overall survival (OS) and time to recurrence (TTR) were assessed by the log-rank test and the Cox proportional hazard regression model. Results. An optimal cutoff value of the preoperative fibrinogen/albumin ratio (0.062) was determined for 151 patients who underwent curative resection for HCC via a ROC curve analysis. A fibrinogen/albumin ratio > 0.062 was significantly associated with microvascular invasion, an advanced BCLC stage, and ALBI grade. Multivariate analyses revealed that the fibrinogen/albumin ratio was an independent predictor for OS (P=0.003) and TTR (P=0.035). The prognostic ability of the fibrinogen/albumin ratio was comparable to the other prognostic scores (NLR, PLR, and ALBI score) by AUC analysis. Patients with a fibrinogen/albumin ratio > 0.062 had lower 1-, 3-, and 5-year OS rates (66.0%, 41.8%, and 28.2% versus 81.9%, 69.3%, and 56.1%, resp., P<0.001) and higher 1-, 3-, and 5-year recurrence rates (60.9%, 79.2%, and 90.5% versus 49.5%, 69.1%, and 77.1%, resp., P=0.008) compared with patients with a fibrinogen/albumin ratio ≤ 0.062. Conclusion. The
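    An "optimal cutoff" from a ROC analysis, like the 0.062 above, is commonly the threshold maximizing Youden's J = sensitivity + specificity − 1. A minimal sketch of that scan; the marker values below are invented for illustration (chosen so the winning cutoff echoes the paper's 0.062), not the study's data.

    ```python
    # Hedged sketch of cutoff selection by Youden's J over candidate
    # thresholds. The two small samples are illustrative, not study data.

    def youden_cutoff(values_pos, values_neg):
        """values_pos: marker values of cases; values_neg: of non-cases."""
        best_j, best_cut = -1.0, None
        for cut in sorted(set(values_pos) | set(values_neg)):
            sens = sum(v > cut for v in values_pos) / len(values_pos)
            spec = sum(v <= cut for v in values_neg) / len(values_neg)
            j = sens + spec - 1.0
            if j > best_j:
                best_j, best_cut = j, cut
        return best_cut, best_j

    recurred = [0.070, 0.081, 0.065, 0.090, 0.058]    # patients with the event
    event_free = [0.050, 0.061, 0.055, 0.048, 0.062]  # patients without it
    cut, j = youden_cutoff(recurred, event_free)
    print(cut, round(j, 2))   # → 0.062 0.8
    ```

    In practice the scan is run over the full cohort and the chosen cutoff is then used to dichotomize patients for the survival analyses.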

  10. The ξ/ξ2nd ratio as a test for Effective Polyakov Loop Actions

    Science.gov (United States)

    Caselle, Michele; Nada, Alessandro

    2018-03-01

    Effective Polyakov line actions are a powerful tool to study the finite temperature behaviour of lattice gauge theories. They are much simpler to simulate than the original (3+1)-dimensional LGTs and are affected by a milder sign problem. However, it is not clear to what extent they really capture the rich spectrum of the original theories, a feature which is of great importance if one aims to address the sign problem. We propose here a simple way to address this issue based on the so-called second moment correlation length ξ2nd. The ratio ξ/ξ2nd between the exponential correlation length and the second moment one is equal to 1 if only a single mass is present in the spectrum, and becomes larger and larger as the complexity of the spectrum increases. Since both ξexp and ξ2nd are easy to measure on the lattice, this is an economic and effective way to keep track of the spectrum of the theory. In this respect we show, using both numerical simulations and effective string calculations, that this ratio increases dramatically as the temperature decreases. This non-trivial behaviour should be reproduced by the Polyakov loop effective action.
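    The diagnostic can be demonstrated numerically on a toy correlator built as a sum of exponentials (one "mass" per term): a single mass gives ξ/ξ2nd ≈ 1, while adding a heavier state pushes the ratio above 1. The amplitudes and correlation lengths below are illustrative, not from the paper.

    ```python
    import math

    # Toy demonstration of the xi/xi_2nd diagnostic. xi_exp comes from the
    # tail log-slope; xi_2nd from the second moment of the correlator.

    def correlator(t, modes):
        """modes: list of (amplitude, correlation_length) pairs."""
        return sum(a * math.exp(-abs(t) / xi) for a, xi in modes)

    def xi_exp(modes, t=400):
        """Tail estimate: 1 / log(G(t) / G(t+1)) for large t."""
        return 1.0 / math.log(correlator(t, modes) / correlator(t + 1, modes))

    def xi_2nd(modes, t_max=2000):
        """Second-moment length: sqrt(sum t^2 G / (2 sum G)), t over all integers."""
        ts = range(-t_max, t_max + 1)
        mu0 = sum(correlator(t, modes) for t in ts)
        mu2 = sum(t * t * correlator(t, modes) for t in ts)
        return math.sqrt(mu2 / (2.0 * mu0))

    single = [(1.0, 10.0)]
    double = [(1.0, 10.0), (5.0, 2.0)]   # add a heavier state
    r1 = xi_exp(single) / xi_2nd(single)
    r2 = xi_exp(double) / xi_2nd(double)
    print(round(r1, 3), round(r2, 3))    # ratio ~1 for one mass, >1 for two
    ```

    The heavier state barely affects the tail (so ξexp is unchanged) but fattens the correlator at short distances, pulling ξ2nd down and the ratio up, which is exactly the sensitivity to the spectrum that the abstract exploits.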

  11. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin ... collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT ...

  12. Experimental characterization of the concrete behaviour under high confinement: influence of the saturation ratio and of the water/cement ratio

    International Nuclear Information System (INIS)

    Vu, X.H.

    2007-08-01

    The objective of this thesis is to experimentally characterize the influence of the saturation ratio and of the water/cement ratio of concrete on its behaviour under high confinement. This thesis lies within the more general scope of understanding concrete behaviour under severe loading (near-field detonations or ballistic impacts). A near-field detonation or an impact on a concrete structure generates very high levels of stress associated with complex loading paths in the material. To validate concrete behaviour models, experimental results are required. The work presented in this thesis concerns tests conducted using a static triaxial press that can reach stress levels on the order of a gigapascal. The porous character of concrete and the high confinement required, on the one hand, the development of a specimen protection device and, on the other, the development of strain-gauge instrumentation, which is unprecedented at such high confinements. Hydrostatic and triaxial tests, conducted on model materials as well as on concrete, made it possible to validate the developed experimental procedures and the technique of strain and stress measurement. The studies of the influence of the saturation ratio and of the water/cement ratio required the formulation of a plain baseline concrete and of two modified concretes with different water/cement ratios. The analysis of triaxial tests performed on the baseline concrete shows that the saturation ratio of concrete has a major influence on its static behaviour under high confinement. This influence is particularly marked for the concrete loading capacity and for the shape of the limit state curves at saturation ratios greater than 50%. The concrete loading capacity increases with the confinement pressure in tests on dry concrete, whereas beyond a given confinement pressure it remains limited for wet or saturated concrete.

  13. Aspect ratio has no effect on genotoxicity of multi-wall carbon nanotubes.

    Science.gov (United States)

    Kim, Jin Sik; Lee, Kyu; Lee, Young Hee; Cho, Hyun Sun; Kim, Ki Heon; Choi, Kyung Hee; Lee, Sang Hee; Song, Kyung Seuk; Kang, Chang Soo; Yu, Il Je

    2011-07-01

    Carbon nanotubes (CNTs) have specific physico-chemical and electrical properties that are useful for telecommunications, medicine, materials, manufacturing processes, and the environmental and energy sectors. Yet, despite their many advantages, it is also important to determine whether CNTs may represent a hazard to the environment and human health. As with asbestos, the aspect ratio (length:diameter) and metal components of CNTs are known to affect their toxicity. Thus, to evaluate the toxic potential of CNTs in relation to their aspect ratio and metal contamination, in vivo and in vitro genotoxicity tests were conducted using high-aspect-ratio (diameter: 10-15 nm, length: ~10 μm) and low-aspect-ratio multi-wall carbon nanotubes (MWCNTs, diameter: 10-15 nm, length: ~150 nm) according to OECD test guidelines 471 (bacterial reverse mutation test), 473 (in vitro chromosome aberration test), and 474 (in vivo micronuclei test) under a good laboratory practice system. To determine the treatment concentration for all the tests, a solubility and dispersion test was performed, and a 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) solution was found to be more suitable than distilled water. Neither the high- nor the low-aspect-ratio MWCNTs induced any genotoxicity in the bacterial reverse mutation test (~1,000 μg/plate), the in vitro chromosome aberration test (without S9: ~6.25 μg/ml; with S9: ~50 μg/ml), or the in vivo micronuclei test (~50 mg/kg). However, the high-aspect-ratio MWCNTs were found to be more toxic than the low-aspect-ratio MWCNTs. Thus, while high-aspect-ratio MWCNTs do not induce direct genotoxicity or metabolic activation-mediated genotoxicity, genotoxicity could still be induced indirectly through oxidative stress or inflammation.

  14. A robust computational solution for automated quantification of a specific binding ratio based on [123I]FP-CIT SPECT images

    International Nuclear Information System (INIS)

    Oliveira, F. P. M.; Tavares, J. M. R. S.; Borges, Faria D.; Campos, Costa D.

    2014-01-01

    The purpose of the current paper is to present a computational solution to accurately quantify the specific to non-specific uptake ratio in [123I]FP-CIT single photon emission computed tomography (SPECT) images and simultaneously measure the spatial dimensions of the basal ganglia, also known as basal nuclei. A statistical analysis based on a reference dataset selected by the user is also automatically performed. The quantification of the specific to non-specific uptake ratio here is based on regions of interest defined after the registration of the image under study with a template image. The computational solution was tested on a dataset of 38 [123I]FP-CIT SPECT images: 28 images were from patients with Parkinson's disease and the remainder from normal patients, and the results of the automated quantification were compared to the ones obtained by three well-known semi-automated quantification methods. The results revealed a high correlation coefficient between the developed automated method and the three semi-automated methods used for comparison (r ≥ 0.975). The solution also showed good robustness against different positions of the patient, as an almost perfect agreement between the specific to non-specific uptake ratios was found (ICC = 1.000). The mean processing time was around 6 seconds per study using a common notebook PC. The solution developed can be useful for clinicians to evaluate [123I]FP-CIT SPECT images due to its accuracy, robustness and speed. Also, the comparison between case studies and the follow-up of patients can be done more accurately and proficiently since the intra- and inter-observer variability of the semi-automated calculation does not exist in automated solutions. The dimensions of the basal ganglia and their automatic comparison with the values of the population selected as reference are also important for professionals in this area.
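    The ratio being quantified has a simple definition once the regions of interest are fixed: specific (striatal) uptake relative to non-specific uptake in a reference region. A minimal sketch; the region names and count values are illustrative, not the paper's data.

    ```python
    # Sketch of the specific-to-non-specific binding ratio used in
    # [123I]FP-CIT SPECT quantification. Numbers are illustrative.

    def specific_binding_ratio(mean_striatal, mean_reference):
        """(specific - non-specific) / non-specific uptake."""
        return (mean_striatal - mean_reference) / mean_reference

    sbr_normal = specific_binding_ratio(300.0, 100.0)    # → 2.0
    sbr_reduced = specific_binding_ratio(160.0, 100.0)   # → 0.6
    assert sbr_normal > sbr_reduced
    ```

    The hard part the paper automates is not this arithmetic but placing the regions of interest reliably via registration to a template, so the ratio is reproducible across patient positions and observers.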

  15. Study of an optimal configuration of a transmutation reactor based on a low-aspect-ratio tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Bong Guen, E-mail: bghong@jbnu.ac.kr [Department of Quantum System Engineering, Chonbuk National University, 567 Baekje-daero, Jeonju, Jeonbuk 54896 (Korea, Republic of); Kim, Hoseok [Department of Applied Plasma Engineering, Chonbuk National University, 567 Baekje-daero, Jeonju, Jeonbuk 54896 (Korea, Republic of)

    2016-11-15

Highlights: • The optimum configuration of a transmutation reactor based on a low-aspect-ratio tokamak was found. • The inboard and outboard radial builds are determined by plasma physics, engineering and neutronics constraints. • The radial build and the equilibrium fuel cycle play a major role in determining the transmutation characteristics. - Abstract: We determine the optimal configuration of a transmutation reactor based on a low-aspect-ratio tokamak. For self-consistent determination of the radial build of the reactor components, we couple a tokamak systems analysis with a radiation transport calculation. The inboard radial build of the reactor components is obtained from plasma physics and engineering constraints, while the outboard radial build is mainly determined by constraints on neutron multiplication, the tritium-breeding ratio, and the power density. We show that the breeding blanket model has an effect on the radial build of a transmutation blanket. A burn cycle has to be determined to keep the fast neutron fluence on the plasma-facing material below its radiation damage limit. We show that the radial build of the transmutation reactor components and the equilibrium fuel cycle play a major role in determining the transmutation characteristics.

  16. Multifunctional Polymer-Based Graphene Foams with Buckled Structure and Negative Poisson’s Ratio

    Science.gov (United States)

    Dai, Zhaohe; Weng, Chuanxin; Liu, Luqi; Hou, Yuan; Zhao, Xuanliang; Kuang, Jun; Shi, Jidong; Wei, Yueguang; Lou, Jun; Zhang, Zhong

    2016-01-01

In this study, we report polymer-based graphene foams produced through a combination of bottom-up assembly and a simple triaxially buckled structure design. The resulting polymer-based graphene foams not only effectively transfer the functional properties of graphene, but also exhibit novel negative Poisson’s ratio (NPR) behavior due to the presence of the buckled structure. Our results show that, after the introduction of the buckled structure, improvements in stretchability, toughness, flexibility, energy-absorbing ability, hydrophobicity, conductivity, piezoresistive sensitivity and crack resistance could be achieved simultaneously. The combination of mechanical properties, multifunctional performance and unusual deformation behavior could lead to the use of our polymer-based graphene foams in a variety of novel applications, such as stretchable capacitors or conductors, sensors, and oil/water separators. PMID:27608928

  17. Improving Stiffness-to-weight Ratio of Spot-welded Structures based upon Nonlinear Finite Element Modelling

    Science.gov (United States)

    Zhang, Shengyong

    2017-07-01

Spot welding has been widely used for vehicle body construction due to its advantages of high speed and adaptability for automation. An effort to increase the stiffness-to-weight ratio of spot-welded structures is investigated based upon nonlinear finite element analysis. Topology optimization is conducted to reduce weight in the overlapping regions by choosing an appropriate topology. Three spot-welded models (lap, double-hat and T-shape) that approximate “typical” vehicle body components are studied to validate and illustrate the proposed method. It is concluded that removing underutilized material from overlapping regions can result in a significant increase in the structural stiffness-to-weight ratio.

  18. (Re)evaluating the Implications of the Autoregressive Latent Trajectory Model Through Likelihood Ratio Tests of Its Initial Conditions.

    Science.gov (United States)

    Ou, Lu; Chow, Sy-Miin; Ji, Linying; Molenaar, Peter C M

    2017-01-01

    The autoregressive latent trajectory (ALT) model synthesizes the autoregressive model and the latent growth curve model. The ALT model is flexible enough to produce a variety of discrepant model-implied change trajectories. While some researchers consider this a virtue, others have cautioned that this may confound interpretations of the model's parameters. In this article, we show that some-but not all-of these interpretational difficulties may be clarified mathematically and tested explicitly via likelihood ratio tests (LRTs) imposed on the initial conditions of the model. We show analytically the nested relations among three variants of the ALT model and the constraints needed to establish equivalences. A Monte Carlo simulation study indicated that LRTs, particularly when used in combination with information criterion measures, can allow researchers to test targeted hypotheses about the functional forms of the change process under study. We further demonstrate when and how such tests may justifiably be used to facilitate our understanding of the underlying process of change using a subsample (N = 3,995) of longitudinal family income data from the National Longitudinal Survey of Youth.
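The LRTs discussed above take the standard form for nested models: twice the difference in maximized log-likelihoods, referred to a chi-square distribution with degrees of freedom equal to the number of constraints imposed on the initial conditions. A generic sketch; the log-likelihood values and the df = 2 critical value are illustrative, not results from the article:

```python
def likelihood_ratio_stat(ll_restricted, ll_full):
    """LR statistic for nested models: 2 * (loglik_full - loglik_restricted)."""
    return 2.0 * (ll_full - ll_restricted)

# Hypothetical maximized log-likelihoods for two nested ALT variants,
# differing by two constraints on the initial conditions.
lr = likelihood_ratio_stat(ll_restricted=-1250.3, ll_full=-1247.1)
CHI2_CRIT_DF2 = 5.991  # chi-square critical value, df = 2, alpha = 0.05
print(lr > CHI2_CRIT_DF2)  # → True: reject the restricted variant
```

A significant statistic favors the fuller model; in the article's setting this is combined with information criteria before settling on a functional form for the change process.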

  19. Development of acoustically lined ejector technology for multitube jet noise suppressor nozzles by model and engine tests over a wide range of jet pressure ratios and temperatures

    Science.gov (United States)

    Atvars, J.; Paynter, G. C.; Walker, D. Q.; Wintermeyer, C. F.

    1974-01-01

An experimental program comprising model nozzle and full-scale engine tests was undertaken to acquire parametric data for acoustically lined ejectors applied to primary jet noise suppression. Ejector lining design technology and acoustical scaling of lined ejector configurations were the major objectives. Ground static tests were run with a J-75 turbojet engine fitted with a 37-tube, area ratio 3.3 suppressor nozzle and two lengths of ejector shroud (L/D = 1 and 2). Seven ejector lining configurations were tested over the engine pressure ratio range of 1.40 to 2.40, with corresponding jet velocities between 305 and 610 m/sec. One-fourth scale model nozzles were tested over a pressure ratio range of 1.40 to 4.0 with jet total temperatures between ambient and 1088 K. Scaling of multielement nozzle ejector configurations was also studied using a single element of the nozzle array with identical ejector lengths and lining materials. Acoustic far-field and near-field data, together with nozzle thrust performance and jet aerodynamic flow profiles, are presented.

  20. Tile-based Fisher-ratio software for improved feature selection analysis of comprehensive two-dimensional gas chromatography-time-of-flight mass spectrometry data.

    Science.gov (United States)

    Marney, Luke C; Siegler, W Christopher; Parsons, Brendon A; Hoggard, Jamin C; Wright, Bob W; Synovec, Robert E

    2013-10-15

Comprehensive two-dimensional (2D) gas chromatography coupled with time-of-flight mass spectrometry (GC × GC-TOFMS) is a highly capable instrumental platform that produces complex and information-rich multi-dimensional chemical data. The data can be initially overwhelming, especially when many samples (of various sample classes) are analyzed with multiple injections for each sample. Thus, the data must be analyzed in such a way as to extract the most meaningful information. The pixel-based and peak table-based Fisher ratio algorithmic approaches have been used successfully in the past to reduce the multi-dimensional data down to those chemical compounds that are changing between the sample classes relative to those that are not changing (i.e., chemical feature selection). We report on the initial development of novel, computationally fast tile-based Fisher-ratio software that addresses the challenges due to 2D retention time misalignment without explicitly aligning the data, which is often a shortcoming of both pixel-based and peak table-based algorithmic approaches. Concurrently, the tile-based Fisher-ratio algorithm significantly improves the sensitivity contrast of true positives against a background of potential false positives and noise. In this study, eight compounds, plus one internal standard, were spiked into diesel at various concentrations. The tile-based F-ratio algorithmic approach was able to "discover" all spiked analytes within the complex diesel sample matrix, with thousands of potential false positives, in each possible concentration comparison, even at the lowest absolute spiked analyte concentration ratio of 1.06, i.e., the ratio between the concentration in the spiked diesel sample and the native concentration in diesel. Copyright © 2013 Elsevier B.V. All rights reserved.
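At its core, the Fisher ratio for a single feature (pixel, peak, or tile) is the variance of the class means divided by the pooled within-class variance; large values flag chemical features that change between sample classes. A toy sketch using made-up tile signals, not the authors' tile-based software:

```python
from statistics import mean, pvariance

def fisher_ratio(*classes):
    """Between-class variance of the class means over the mean within-class variance."""
    class_means = [mean(c) for c in classes]
    between = pvariance(class_means)
    within = mean(pvariance(c) for c in classes)
    return between / within

# Hypothetical summed tile signals for replicate injections of two classes:
spiked = [10.1, 10.4, 9.9]   # diesel spiked with an analyte
native = [4.8, 5.2, 5.0]     # unspiked diesel
print(fisher_ratio(spiked, native) > 100)  # → True: a class-discriminating tile
```

Ranking all tiles by this ratio concentrates attention on the few features that differ between classes, which is the feature-selection step the abstract describes.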

  1. Exploring the Effect of Au/Pt Ratio on Glycerol Oxidation in Presence and Absence of a Base

    Directory of Open Access Journals (Sweden)

    Alberto Villa

    2018-01-01

Bimetallic AuPt nanoparticles with different Au:Pt ratios (molar ratios 9:1, 8:2, 6:4, 2:8 and 1:9) and the corresponding monometallic Au and Pt nanoparticles were prepared by sol immobilization and immobilized on commercial TiO2 (P25). The catalytic activity was evaluated in liquid-phase glycerol oxidation in the presence and absence of a base (NaOH). It was found that the Au:Pt molar ratio and the reaction conditions strongly influence the catalytic performance. In the presence of NaOH, Au-rich catalysts were more active than Pt-rich ones, with the highest activity observed for Au9Pt1/TiO2 (6575 h−1). In the absence of a base, a higher content of Pt is needed to produce the most active catalyst (Au6Pt4/TiO2, 301 h−1). In terms of selectivity, in the presence of NaOH, Au-rich catalysts showed a high selectivity to C3 products (63–72%), whereas Pt-rich catalysts promote the formation of formic and glycolic acids. The opposite trend was observed in the absence of a base, with Pt-rich catalysts showing higher selectivity to C3 products (83–88%).

  2. Correlations between power and test reactor data bases

    International Nuclear Information System (INIS)

    Guthrie, G.L.; Simonen, E.P.

    1989-02-01

Differences between power reactor and test reactor data bases have been evaluated. Charpy shift data have been assembled from specimens irradiated in both high-flux test reactors and low-flux power reactors. Preliminary tests for the existence of a bias between the test and power reactor data bases indicate a possible bias between the weld data bases. The bias is nonconservative when test reactor data are used for power reactor predictive purposes. The lesser shift for test reactor data compared to power reactor data is interpreted primarily in terms of greater point defect recombination at test reactor fluxes than at power reactor fluxes. The possibility of greater thermal aging effects at lower damage rates is also discussed. 15 refs., 5 figs., 2 tabs

  3. High aspect ratio problem in simulation of a fault current limiter based on superconducting tapes

    Energy Technology Data Exchange (ETDEWEB)

    Velichko, A V; Coombs, T A [Electrical Engineering Division, University of Cambridge (United Kingdom)

    2006-06-15

We offer a solution for the high-aspect-ratio problem relevant to the numerical simulation of AC loss in superconductors and metals with a high aspect (width-to-thickness) ratio. This is particularly relevant to the simulation of fault current limiters (FCLs) based on second-generation YBCO tapes on RABiTS. By assuming a linear scaling of the electric and thermal properties with the size of the structure, we can replace the real sample with an effective sample of a reduced aspect ratio by introducing size multipliers into the equations that govern the physics of the system. The simulation is performed using both proprietary equivalent-circuit software and commercial FEM software. The correctness of the procedure is verified by simulating temperature and current distributions for samples with all three dimensions varying within 10^-3 to 10^3 of the original size. Qualitatively, the distributions for the original and scaled samples are indistinguishable, whereas quantitative differences in the worst case do not exceed 10%.

  4. High aspect ratio problem in simulation of a fault current limiter based on superconducting tapes

    International Nuclear Information System (INIS)

    Velichko, A V; Coombs, T A

    2006-01-01

We offer a solution for the high-aspect-ratio problem relevant to the numerical simulation of AC loss in superconductors and metals with a high aspect (width-to-thickness) ratio. This is particularly relevant to the simulation of fault current limiters (FCLs) based on second-generation YBCO tapes on RABiTS. By assuming a linear scaling of the electric and thermal properties with the size of the structure, we can replace the real sample with an effective sample of a reduced aspect ratio by introducing size multipliers into the equations that govern the physics of the system. The simulation is performed using both proprietary equivalent-circuit software and commercial FEM software. The correctness of the procedure is verified by simulating temperature and current distributions for samples with all three dimensions varying within 10^-3 to 10^3 of the original size. Qualitatively, the distributions for the original and scaled samples are indistinguishable, whereas quantitative differences in the worst case do not exceed 10%

  5. Testing the performance of technical trading rules in the Chinese markets based on superior predictive test

    Science.gov (United States)

    Wang, Shan; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing

    2015-12-01

Technical trading rules have a long history of being used by practitioners in financial markets. The profitability and efficiency of technical trading rules remain controversial. In this paper, we test the performance of more than seven thousand traditional technical trading rules on the Shanghai Securities Composite Index (SSCI) from May 21, 1992 through June 30, 2013 and the China Securities Index 300 (CSI 300) from April 8, 2005 through June 30, 2013 to check whether an effective trading strategy can be found using performance measurements based on return and the Sharpe ratio. To correct for the influence of data snooping, we adopt the Superior Predictive Ability test to evaluate whether there exists a trading rule that can significantly outperform the benchmark. The results show that for the SSCI, technical trading rules offer significant profitability, while for the CSI 300, this ability is lost. We further partition the SSCI into two sub-series and find that the efficiency of technical trading in the sub-series, which has exactly the same spanning period as the CSI 300, is severely weakened. By testing the trading rules on both indexes with a five-year moving window, we find that during the financial bubble from 2005 to 2007 the effectiveness of technical trading rules was greatly improved. This is consistent with the predictive ability of technical trading rules, which appears when the market is less efficient.

  6. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products, as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed at low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor-sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion best matched those of the PQRI WG, as the decisions of both methods agreed in 75% of the 55 CI profile scenarios.

  7. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

This paper addresses efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  8. Sand characterization by combined centrifuge and laboratory tests

    OpenAIRE

    GAUDIN, C; SCHNAID, F; GARNIER, J

    2005-01-01

The purpose of this paper is to evaluate new methods of interpretation of in situ tests in sand from correlations established from centrifuge and laboratory data. Emphasis is given to methods that are based on the combination of measurements from independent tests, such as the ratio of elastic stiffness to ultimate strength and the ratio of cone resistance to limit pressure. For that purpose, a series of centrifuge tests using a cone penetrometer and a cone pressuremeter was carried out ...

  9. Ethernet-based test stand for a CAN network

    Science.gov (United States)

    Ziebinski, Adam; Cupek, Rafal; Drewniak, Marek

    2017-11-01

This paper presents a test stand for the CAN-based systems that are used in automotive applications. The authors propose an Ethernet-based test system that supports the virtualisation of a CAN network. The proposed solution has many advantages compared to classical test beds based on dedicated CAN-PC interfaces: it avoids the physical constraints on the number of interfaces that can be simultaneously connected to a tested system, which shortens the test time for parallel tests; the high speed of Ethernet transmission allows more frequent sampling of the messages transmitted on a CAN network (as the authors show in the experimental results section); and the cost of the proposed solution is much lower than that of traditional lab-based dedicated CAN interfaces for PCs.

  10. Difference and ratio plots

    DEFF Research Database (Denmark)

    Svendsen, Anders Jørgen; Holmskov, U; Bro, Peter

    1995-01-01

We demonstrate hitherto unnoted differences between controls and patients with either rheumatoid arthritis or systemic lupus erythematosus. For this we use simple, but unconventional, graphic representations of the data, based on difference plots and ratio plots. Differences between patients with Burkitt's lymphoma and systemic lupus erythematosus from another previously published study (Macanovic, M. and Lachmann, P.J. (1979) Clin. Exp. Immunol. 38, 274) are also represented using ratio plots. Our observations indicate that analysis by regression may often be misleading.

  11. Recent advances in ratio primary reference measurement procedures (definitive methods) and their use in certification of reference materials and controlling assigned values in proficiency testing

    International Nuclear Information System (INIS)

Dybczyński, R.S.; Polkowska-Motrenko, H.; Chajduk, E.; Danko, B.; Pyszynska, M.

    2014-01-01

The idea of definitive methods based on radiochemical neutron activation analysis (RNAA) consists in the combination of neutron activation with highly selective and quantitative post-irradiation isolation of the desired radionuclide by column chromatography, followed by γ-ray spectrometric measurement. The principles of construction of such methods, which were devised in the Institute of Nuclear Chemistry and Technology, are recalled, and the significance of these methods for analytical quality assurance is emphasized. According to VIM 3 nomenclature, these methods may be called ratio primary reference measurement procedures (RPRMPs). An RPRMP for the determination of Se is briefly presented and its use for checking the accuracy of 'assigned values' established by expert laboratories in some proficiency tests is demonstrated

  12. Agreement between clinicians' and care givers' assessment of intelligence in Nigerian children with intellectual disability: 'ratio IQ' as a viable option in the absence of standardized 'deviance IQ' tests in sub-Saharan Africa.

    Science.gov (United States)

    Bakare, Muideen O; Ubochi, Vincent N; Okoroikpa, Ifeoma N; Aguocha, Chinyere M; Ebigbo, Peter O

    2009-09-15

There may be a need to assess intelligence quotient (IQ) scores in sub-Saharan African children with intellectual disability, either for the purpose of educational needs assessment or for research. However, modern intelligence scales developed in the western parts of the world are limited in widespread use because of socio-cultural variations across the world. This study examined the agreement between IQ score estimates among Nigerian children with intellectual disability using clinicians' judgment based on the International Classification of Diseases, Tenth Edition (ICD-10) criteria for mental retardation and caregivers' judgment based on 'ratio IQ' scores calculated from estimated mental age in the context of the socio-cultural milieu of the children. It proposes a viable option for IQ score assessment among sub-Saharan African children with intellectual disability, using the ratio of culture-specific estimated mental age to the chronological age of the child in the absence of standardized alternatives, borne out of the great diversity in the socio-cultural context of sub-Saharan Africa. Clinicians and caregivers independently assessed the children in relation to their socio-cultural background. Clinicians assessed the IQ scores of the children based on the ICD-10 diagnostic criteria for mental retardation. 'Ratio IQ' scores were calculated from the ratio of the estimated mental age to the chronological age of each child. The IQ scores as assessed by the clinicians were then compared with the 'ratio IQ' scores using correlation statistics. A total of forty-four (44) children with intellectual disability were assessed. There was a significant correlation between clinicians' assessed IQ scores and the 'ratio IQ' scores employing zero-order correlation without controlling for the chronological age of the children (r = 0.47, df = 42, p = 0.001). First-order correlation controlling for the chronological age of the children showed a higher correlation score between clinicians
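The 'ratio IQ' used in the study is the classic Stern/Terman ratio: estimated mental age divided by chronological age, times 100. A one-line sketch with hypothetical ages:

```python
def ratio_iq(mental_age_years, chronological_age_years):
    """Ratio IQ: 100 * estimated mental age / chronological age."""
    return 100.0 * mental_age_years / chronological_age_years

# A hypothetical child aged 10 whose mental age caregivers estimate
# at 6 from performance on culturally familiar everyday tasks.
print(ratio_iq(mental_age_years=6, chronological_age_years=10))  # → 60.0
```

The appeal in this setting is that the mental-age estimate can be anchored to culture-specific milestones, sidestepping the norming problem of imported deviation-IQ tests.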

  13. Automated Search-Based Robustness Testing for Autonomous Vehicle Software

    Directory of Open Access Journals (Sweden)

    Kevin M. Betts

    2016-01-01

Autonomous systems must successfully operate in complex, time-varying spatial environments even when dealing with system faults that may occur during a mission. Consequently, evaluating the robustness, or ability to operate correctly under unexpected conditions, of autonomous vehicle control software is an increasingly important issue in software testing. New methods to automatically generate test cases for robustness testing of autonomous vehicle control software in closed-loop simulation are needed. Search-based testing techniques were used to automatically generate test cases, consisting of initial conditions and fault sequences, intended to challenge the control software more than test cases generated using current methods. Two different search-based testing methods, genetic algorithms and surrogate-based optimization, were used to generate test cases for a simulated unmanned aerial vehicle attempting to fly through an entryway. The effectiveness of the search-based methods in generating challenging test cases was compared to both a truth reference (full combinatorial testing) and the method most commonly used today (Monte Carlo testing). The search-based testing techniques demonstrated better performance than Monte Carlo testing for both test case generation performance metrics: (1) finding the single most challenging test case and (2) finding the set of fifty test cases with the highest mean degree of challenge.
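Search-based generation of the kind described treats a test case's parameters (initial conditions, fault injection times) as a genome and evolves it against a "degree of challenge" score. A toy genetic-algorithm sketch; the fitness function here is a stand-in surface with a known peak, whereas real use would score a closed-loop simulation run:

```python
import random

def fitness(candidate):
    """Stand-in 'degree of challenge'; real use would run the simulation."""
    x, y = candidate
    return -((x - 3.0) ** 2) - ((y - 1.0) ** 2)  # most challenging near (3, 1)

def evolve(pop_size=30, generations=40, seed=7):
    rng = random.Random(seed)  # fixed seed for repeatability
    pop = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)        # crossover: average two parents
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.3),
                             (a[1] + b[1]) / 2 + rng.gauss(0, 0.3)))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # converges near the most challenging point (3, 1)
```

Monte Carlo testing would instead draw candidates uniformly at random, which is why the evolved search tends to find more challenging cases for the same budget.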

  14. Kaner biodiesel production through hybrid reactor and its performance testing on a CI engine at different compression ratios

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Yadav

    2017-06-01

The present study deals with the development of a hybrid reactor for biodiesel production based on combined hydrodynamic cavitation and mechanical stirring. Biodiesel was produced using Kaner Seed Oil (KSO). The experimental results show that the hybrid reactor produces a 95% biodiesel yield within 45 min for 0.75% catalyst and a 6:1 molar ratio, which is significantly higher than mechanical stirring or hydrodynamic cavitation alone. Thus the biodiesel production process in the hybrid reactor is cheap (high yield), efficient (time saving) and environmentally friendly (lower percentage of catalyst). A performance study on an engine shows that an increase in compression ratio (from 16 to 18) improves the engine performance using biodiesel blends as compared to petroleum diesel.

  15. Improved characterization of EV preparations based on protein to lipid ratio and lipid properties.

    Directory of Open Access Journals (Sweden)

    Xabier Osteikoetxea

In recent years the study of extracellular vesicles has gathered much scientific and clinical interest. As the field is expanding, it is becoming clear that better methods for characterization and quantification of extracellular vesicles, as well as better standards to compare studies, are warranted. The goal of the present work was to find improved parameters to characterize extracellular vesicle preparations. Here we introduce a simple 96-well plate-based total lipid assay for determination of the lipid content and protein to lipid ratios of extracellular vesicle preparations from various myeloid and lymphoid cell lines as well as blood plasma. These preparations included apoptotic bodies, microvesicles/microparticles, and exosomes isolated by size-based fractionation. We also investigated the lipid bilayer order of extracellular vesicle subpopulations using the Di-4-ANEPPDHQ lipid probe, and lipid composition using affinity reagents to clustered cholesterol (monoclonal anti-cholesterol antibody) and ganglioside GM1 (cholera toxin subunit B). We consistently found different protein to lipid ratios characteristic of the investigated extracellular vesicle subpopulations, which were substantially altered in the case of vesicular damage or protein contamination. Spectral ratiometric imaging and flow cytometric analysis also revealed marked differences between the various vesicle populations in their lipid order and their clustered membrane cholesterol and GM1 content. Our study introduces for the first time a simple and readily available lipid assay to complement the widely used protein assays in order to better characterize extracellular vesicle preparations. Besides differentiating extracellular vesicle subpopulations, the novel parameters introduced in this work (protein to lipid ratio, lipid bilayer order, and lipid composition) may prove useful for quality control of extracellular vesicle related basic and clinical studies.

  16. Comparison of the Clock Test and a questionnaire-based test for ...

    African Journals Online (AJOL)

    Comparison of the Clock Test and a questionnaire-based test for screening for cognitive impairment in Nigerians. D J VanderJagt, S Ganga, M O Obadofin, P Stanley, M Zimmerman, B J Skipper, R H Glew ...

  17. Novel bacterial ratio for predicting fecal age

    Energy Technology Data Exchange (ETDEWEB)

    Nieman, J.; Brion, G.M. [Univ. of Kentucky, Dept. of Civil Engineering, Lexington, Kentucky (United States)]. E-mail: gbrion@engr.uky.edu

    2002-06-15

    This study presents an extension of ongoing research into the utility of the ratio of bacterial colonies isolated on membrane filters during the total coliform test using m-Endo broth media for the prediction of fecal age. Analysis of the relative shifts in concentrations of indicator bacterial populations in Kentucky River water quality data collected from the inlet of a local water treatment plant showed a correlation between raw concentrations of atypical colonies (AC) and total coliform colonies (TC) formed on m-Endo membrane filter tests, and fecal age. Visual analysis of plant treatment records showed that low values of the AC/TC ratio were related to periods of high flow, when runoff added fresh fecal material to the river. A more detailed analysis of 2 years of Kentucky River water quality data showed the average AC/TC ratio during months with high river flow (rain) to be 3.4, rising to an average of 27.6 during months with low flow. The average AC/TC ratio during high flow months compared to that found in other studies for raw human sewage (3.9) and the ratio increased to values associated with animal impacted urban runoff (18.9) during low flow months. (author)
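The reported averages (AC/TC ≈ 3.4 in high-flow months with fresh fecal input versus ≈ 27.6 in low-flow months with aged material) suggest a simple screening calculation. A sketch with hypothetical colony counts; the cutoff of 10 is an illustrative value between the two reported averages, not one proposed by the authors:

```python
def ac_tc_ratio(atypical_count, total_coliform_count):
    """Ratio of atypical (AC) to total coliform (TC) colonies on an m-Endo filter."""
    return atypical_count / total_coliform_count

# Hypothetical colony counts from one membrane filter.
ratio = ac_tc_ratio(atypical_count=85, total_coliform_count=25)
label = "fresh fecal input likely" if ratio < 10 else "aged fecal material likely"
print(ratio, label)  # → 3.4 fresh fecal input likely
```

Both counts already come from the routine total coliform test, so the ratio adds fecal-age information at no extra analytical cost.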

  18. Novel bacterial ratio for predicting fecal age

    International Nuclear Information System (INIS)

    Nieman, J.; Brion, G.M.

    2002-01-01

    This study presents an extension of ongoing research into the utility of the ratio of bacterial colonies isolated on membrane filters during the total coliform test using m-Endo broth media for the prediction of fecal age. Analysis of the relative shifts in concentrations of indicator bacterial populations in Kentucky River water quality data collected from the inlet of a local water treatment plant showed a correlation between raw concentrations of atypical colonies (AC) and total coliform colonies (TC) formed on m-Endo membrane filter tests, and fecal age. Visual analysis of plant treatment records showed that low values of the AC/TC ratio were related to periods of high flow, when runoff added fresh fecal material to the river. A more detailed analysis of 2 years of Kentucky River water quality data showed the average AC/TC ratio during months with high river flow (rain) to be 3.4, rising to an average of 27.6 during months with low flow. The average AC/TC ratio during high flow months compared to that found in other studies for raw human sewage (3.9) and the ratio increased to values associated with animal impacted urban runoff (18.9) during low flow months. (author)

  19. Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method

    Science.gov (United States)

    De Waal, Sybrand A.

    1996-07-01

A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959 summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.

  20. Test-Access Planning and Test Scheduling for Embedded Core-Based System Chips

    OpenAIRE

    Goel, Sandeep Kumar

    2005-01-01

    Advances in semiconductor process technology enable the creation of a complete system on one single die, the so-called system chip or SOC. To reduce time-to-market for large SOCs, reuse of pre-designed and pre-verified blocks called cores is employed. Like the design style, testing of SOCs can best be approached in a core-based fashion. In order to enable core-based test development, an embedded core should be isolated from its surrounding circuitry and electrical test access from chip pins...

  1. Effect of shoulder to pin ratio on magnesium alloy Friction Stir Welding

    Science.gov (United States)

    Othman, N. H.; Ishak, M.; Shah, L. H.

    2017-09-01

    This study focuses on the effect of the shoulder to pin diameter ratio on friction stir welding of magnesium alloy AZ31. Two pieces of AZ31 alloy with a thickness of 2 mm were friction stir welded using a conventional milling machine. The shoulder to pin diameter ratios used in this experiment are 2.25, 2.5, 2.75, 3, 3.33, 3.66, 4.5, 5 and 5.5. The rotational speed and welding speed used in this study are 1000 rpm and 100 mm/min, respectively. The microstructure of the welded area was studied using an optical microscope. Equiaxed grains were observed at the TMAZ and stir zone, indicating fully plastic deformation. The grain size of the stir zone increased with decreasing shoulder to pin ratio over the range 3.33 to 5.5 due to higher heat input. It is observed that surface galling and faying surface defects are produced when excessive heat input is applied. A tensile test was used to evaluate the mechanical properties of the specimens. A shoulder to pin ratio of 5.5 shows the lowest tensile strength, while a shoulder to pin diameter ratio of 3.33 shows the highest tensile strength, with a weld efficiency of 91% relative to the base metal.

  2. 13CO2/12CO2 ratio analysis in exhaled air by lead-salt tunable diode lasers for noninvasive diagnostics in gastroenterology

    Science.gov (United States)

    Stepanov, Eugene V.; Zyrianov, Pavel V.; Miliaev, Valerii A.; Selivanov, Yurii G.; Chizhevskii, Eugene G.; Os'kina, Svetlana; Ivashkin, Vladimir T.; Nikitina, Elena I.

    1999-07-01

    An analyzer of the 13CO2/12CO2 ratio in exhaled air based on lead-salt tunable diode lasers is presented. High accuracy of carbon isotope ratio detection in exhaled carbon dioxide was achieved with the help of a very simple optical scheme, based on the use of MBE laser diodes operating in pulsed mode and on recording the resonance CO2 absorption at 4.2 micrometers. Special fast acquisition electronics and software were applied for spectral data collection and processing. The developed laser system was tested in a clinical trial aimed at assessing eradication efficiency in the therapy of gastritis associated with Helicobacter pylori infection. Data from the 13C-urea breath test used for H. pylori detection and obtained with tunable diode lasers in the course of the trial were compared with the results of mass spectrometry analysis and histology observations. The analyzer can also be used for 13CO2/12CO2 ratio detection in exhaled air to perform gastroenterological breath tests based on other compounds labeled with stable isotopes.

  3. Optimized Matching Lift Unit Transmission Ratio of Engine Driven Ducted Fan

    Directory of Open Access Journals (Sweden)

    Xiao Senlin

    2018-01-01

    As a kind of VTOL technology, the ducted fan is not only used in many kinds of aircraft but is also one of the trends for future aircraft lift systems, and it attracts more and more attention. For an engine-driven ducted fan lift unit, which involves an engine and ducted fan matching problem, the form of transmission and the transmission ratio are the key design parameters. In order to design and develop a ducted fan aircraft reasonably, a thrust test platform was set up to connect the engine with the ducted fan through belt driving. The matching relationship between the engine and the transmission system was experimentally studied and the optimal transmission ratio was determined. The results showed that the optimal transmission ratio for engine 1 is 2.2:1, while for engine 2 the optimal transmission ratio should be 2.95:1 based on the current duct and movable blade aerofoil design. At this ratio, the lift exceeds 130 kgf, meeting the aircraft's original design requirements.

  4. Wind turbine blade testing system using base excitation

    Science.gov (United States)

    Cotrell, Jason; Thresher, Robert; Lambert, Scott; Hughes, Scott; Johnson, Jay

    2014-03-25

    An apparatus (500) for fatigue testing elongate test articles (404) including wind turbine blades through forced or resonant excitation of the base (406) of the test articles (404). The apparatus (500) includes a testing platform or foundation (402). A blade support (410) is provided for retaining or supporting a base (406) of an elongate test article (404), and the blade support (410) is pivotally mounted on the testing platform (402) with at least two degrees of freedom of motion relative to the testing platform (402). An excitation input assembly (540) is interconnected with the blade support (410) and includes first and second actuators (444, 446, 541) that act to concurrently apply forces or loads to the blade support (410). The actuator forces are cyclically applied in first and second transverse directions. The test article (404) responds to shaking of its base (406) by oscillating in two transverse directions (505, 507).

  5. Evaluation of liquefaction potential of soil based on standard penetration test using multi-gene genetic programming model

    Science.gov (United States)

    Muduli, Pradyut; Das, Sarat

    2014-06-01

    This paper discusses the evaluation of the liquefaction potential of soil based on a standard penetration test (SPT) dataset using an evolutionary artificial intelligence technique, multi-gene genetic programming (MGGP). The liquefaction classification accuracy (94.19%) of the developed liquefaction index (LI) model is found to be better than that of the available artificial neural network (ANN) model (88.37%) and on par with the available support vector machine (SVM) model (94.19%) on the basis of the testing data. Further, an empirical equation is presented using MGGP to approximate the unknown limit state function representing the cyclic resistance ratio (CRR) of soil based on the developed LI model. Using an independent database of 227 cases, the overall rates of successful prediction of the occurrence of liquefaction and non-liquefaction are found to be 87, 86, and 84% for the developed MGGP-based model, the available ANN model, and the statistical model, respectively, on the basis of the calculated factor of safety (Fs) against liquefaction occurrence.
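    The factor-of-safety classification mentioned above conventionally takes Fs = CRR/CSR (cyclic resistance ratio over cyclic stress ratio), with liquefaction predicted when Fs < 1. A minimal sketch under that conventional definition, with hypothetical CRR/CSR pairs (not values from the paper):

```python
def factor_of_safety(crr, csr):
    """Factor of safety against liquefaction triggering: Fs = CRR / CSR."""
    return crr / csr

def classify(crr, csr):
    """Liquefaction is predicted when the factor of safety falls below 1."""
    return "liquefaction" if factor_of_safety(crr, csr) < 1.0 else "no liquefaction"

# Hypothetical (CRR, CSR) pairs for two soil layers; illustrative only.
cases = [(0.12, 0.20), (0.30, 0.18)]
print([classify(crr, csr) for crr, csr in cases])
```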

  6. Rule-based Test Generation with Mind Maps

    Directory of Open Access Journals (Sweden)

    Dimitry Polivaev

    2012-02-01

    This paper introduces the basic concepts of rule-based test generation with mind maps and reports experience gained from the industrial application of this technique in the domain of smart card testing by Giesecke & Devrient GmbH over the last years. It describes the formalization of the test selection criteria used by our test generator, our test generation architecture, and our test generation framework.

  7. The comparison of landslide ratio-based and general logistic regression landslide susceptibility models in the Chishan watershed after 2009 Typhoon Morakot

    Science.gov (United States)

    WU, Chunhung

    2015-04-01

    The research built an original logistic regression landslide susceptibility model (abbreviated as or-LRLSM) and a landslide ratio-based logistic regression landslide susceptibility model (abbreviated as lr-LRLSM), compared the performance of the two models, and explained their error sources. The research assumes that the performance of the logistic regression model can be better if the distribution of the landslide ratio and the weighted value of each variable are similar. The landslide ratio is the ratio of landslide area to total area in a specific area and a useful index for evaluating the seriousness of landslide disasters in Taiwan. The research adopted the landslide inventory induced by 2009 Typhoon Morakot in the Chishan watershed, which was the most serious disaster event of the last decade in Taiwan. The 20 m grid was adopted as the basic unit in building the LRLSM, and six variables, including elevation, slope, aspect, geological formation, accumulated rainfall, and bank erosion, were included in the two models. In building the or-LRLSM, the six variables were divided into continuous variables (elevation, slope, and accumulated rainfall) and categorical variables (aspect, geological formation, and bank erosion), while in building the lr-LRLSM all variables, classified based on landslide ratio, were categorical. Because the number of basic units in the whole Chishan watershed was too large to calculate with commercial software, the research used random sampling instead of the whole set of basic units, adopting equal proportions of landslide and non-landslide units in the logistic regression analysis. Ten random samplings were taken, and the group with the best Cox & Snell R2 and Nagelkerke R2 values was selected as the database for the following analysis. 
Based on the best result from 10 random sampling groups, the or-LRLSM (lr-LRLSM) is significant at the 1% level with Cox & Snell R2 = 0.190 (0.196) and Nagelkerke R2

  8. Effects of applied stress ratio on the fatigue behavior of additively manufactured porous biomaterials under compressive loading.

    Science.gov (United States)

    de Krijger, Joep; Rans, Calvin; Van Hooreweder, Brecht; Lietaert, Karel; Pouran, Behdad; Zadpoor, Amir A

    2017-06-01

    Additively manufactured (AM) porous metallic biomaterials are considered promising candidates for bone substitution. In particular, AM porous titanium can be designed to exhibit mechanical properties similar to bone. Some experimental data are available in the literature regarding the fatigue behavior of AM porous titanium, but the effect of stress ratio on the fatigue behavior of these materials has not been studied before. In this paper, we study the effect of the applied stress ratio on the compression-compression fatigue behavior of selective laser melted porous titanium (Ti-6Al-4V) based on the diamond unit cell. The porous titanium biomaterial is treated as a meta-material in the context of this work, meaning that R-ratios are calculated based on the applied stresses acting on a homogenized volume. After morphological characterization using micro-computed tomography and quasi-static mechanical testing, the porous structures were tested under cyclic loading using five different stress ratios, i.e. R = 0.1, 0.3, 0.5, 0.7 and 0.8, to determine their S-N curves. Feature tracking algorithms were used for full-field deformation measurements during the fatigue tests. It was observed that the S-N curves of the porous structures shift upwards as the stress ratio increases. The stress amplitude was the most important factor determining the fatigue life. Constant fatigue life diagrams were constructed and compared with similar diagrams for bulk Ti-6Al-4V. Contrary to the bulk material, there was limited dependence of the constant life diagrams on mean stress. The notches present in the AM biomaterials were the sites of crack initiation. This observation and other evidence suggest that the notches created by the AM process cause the insensitivity of the fatigue life diagrams to mean stress. Feature tracking algorithms visualized the deformation during fatigue tests and demonstrated the root cause of inclined (45°) planes of specimen failure. 
In conclusion, the R-ratio

  9. Influence of ceramic dental crown coating substrate thickness ratio on strain energy release rate

    Science.gov (United States)

    Khasnulhadi, K.; Daud, R.; Mat, F.; Noor, S. N. F. M.; Basaruddin, K. S.; Sulaiman, M. H.

    2017-10-01

    This paper presents an analysis of the effect of the coating-to-substrate thickness ratio on crown coating fracture behaviour. The bi-layer material is examined under four-point bending, with a pre-crack at the bottom of the core material, using the finite element method. Three different core-to-substrate thickness ratios were tested: 1:1, 1:2 and 2:1. The fracture parameters are analysed based on bilayer and homogeneous elastic interaction. The results show that the core-to-veneer thickness ratio has a significant effect on the energy release rate.

  10. TESTING TECHNICAL AND SCALE EFFICIENCY OF KAZAKH BANKS: EVIDENCE BASED ON DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Razzaque H Bhatti

    2013-01-01

    This paper tests the technical and scale efficiency of 20 Kazakh banks using annual data on three inputs (interest expenses, non-interest expenses, and deposits) and three outputs (interest income, non-interest income, and loans) over the period 2007-2011. Two input-oriented data envelopment analysis models, those of Charnes et al. (1978) and Banker et al. (1984), which are based on constant returns to scale and variable returns to scale respectively, are used to evaluate technical efficiency, whereas scale efficiency is computed by dividing the former efficiency ratio by the latter. The results obtained show that the average efficiency ratios of individual banks under constant and variable returns to scale range from 0.88 and 1.00 to 0.93 and 1.00 respectively, whereas those of all banks lie between 0.95 and 0.98 respectively. Only five banks (ATFB, Citibank, HSBC bank, KazInvest bank and Exim bank) are the most efficient banks in Kazakhstan, since their efficiency ratios have been consistently equal to unity, implying that these banks operate at their optimal levels. The efficiency scores of the remaining 15 banks range from 0.88 to 0.99, and as such the majority of these banks do not seem to operate far below their optimal level. The results indicate that the performance of the Kazakh banks deteriorated substantially during the global financial crisis of 2008, because the CRS ratio dropped from 0.65 in 2007 to 0.50 in 2008 and to 0.40 in 2009. The results also confirm that most of the foreign banks perform relatively better than domestic banks.
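    The scale-efficiency computation described above, dividing the CRS (CCR model) efficiency score by the VRS (BCC model) one, can be sketched as follows, with hypothetical bank scores rather than the paper's data:

```python
def scale_efficiency(crs_eff, vrs_eff):
    """Scale efficiency = CRS (CCR) efficiency / VRS (BCC) efficiency.
    A value of 1.0 means the unit operates at its most productive scale."""
    if not (0 < crs_eff <= vrs_eff <= 1):
        raise ValueError("expect 0 < CRS <= VRS <= 1")
    return crs_eff / vrs_eff

# Hypothetical (CRS, VRS) efficiency scores for two banks; illustrative only.
banks = {"A": (0.88, 0.93), "B": (1.00, 1.00)}
for name, (crs, vrs) in banks.items():
    print(name, round(scale_efficiency(crs, vrs), 3))
```

    Bank "B" is scale efficient (ratio 1.0); bank "A" is technically close to its frontier under VRS but loses some efficiency to its scale of operation.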

  11. Dynamics of compressible gas-liquid flows with a stiff density ratio

    International Nuclear Information System (INIS)

    Cortes, Julien

    1999-01-01

    This work is devoted to the study of transient two-phase flows when the ratio of the two densities is stiff. First, we briefly review some of the basic principles of two-phase flow, hyperbolicity, and the finite volume method. Then we develop a perturbation method, based on the stiffness of the density ratio, to examine the eigenstructure of two-fluid models. Indeed, in such models, complex phasic interactions yield a complex eigenstructure which may raise numerous problems in simulations. We show that our approach provides a convenient frame for studying the hyperbolicity of such models. At this stage, advanced numerical tests are computed, showing the efficiency of our approach in the context of unstructured multidimensional meshes. Our tests are validated for non-equilibrium flows using experimental data or through mesh refinements. Finally, we use the scaling of the densities to analyse how momentum is transferred between phases in the context of bubbly flows. We study the relevance of a stiff relaxation term related to the ratio of the densities using linear stability properties and Chapman-Enskog expansions. Our results and some numerical computations tend to show that such a system is apparently well-posed despite being 'weakly' hyperbolic. (author) [fr]

  12. Urine Albumin and Albumin/ Creatinine Ratio

    Science.gov (United States)

    ... it used? The urine albumin test or albumin/creatinine ratio (ACR) is used to screen people with chronic conditions, such as diabetes and high blood pressure ( hypertension ) that put them at an ...

  13. Performance of a high-work, low-aspect-ratio turbine stator tested with a realistic inlet radial temperature gradient

    Science.gov (United States)

    Stabe, Roy G.; Schwab, John R.

    1991-01-01

    A 0.767-scale model of a turbine stator designed for the core of a high-bypass-ratio aircraft engine was tested with uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The principal measurements were radial and circumferential surveys of stator-exit total temperature, total pressure, and flow angle. The stator-exit flow field was also computed by using a three-dimensional Navier-Stokes solver. Other than temperature, there were no apparent differences in performance due to the inlet conditions. The computed results compared quite well with the experimental results.

  14. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of

  15. An authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Hervás, C.; De Bra, P.M.E.; Wade, V.; Ashman, H.; Smyth, B.

    2006-01-01

    This paper describes Test Editor, an authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests. This tool facilitates the development and maintenance of different types of XML-based multiple-choice tests for use in web-based education systems and wireless

  16. Development of Two-Tier Diagnostic Test Pictorial-Based for Identifying High School Students Misconceptions on the Mole Concept

    Science.gov (United States)

    Siswaningsih, W.; Firman, H.; Zackiyah; Khoirunnisa, A.

    2017-02-01

    The aim of this study was to develop a two-tier pictorial-based diagnostic test for identifying student misconceptions on the mole concept, using a development and validation method. The test was developed through four phases: item development, validation, key determination, and test application. The test was developed in pictorial form consisting of two tiers: the first tier consists of four possible answers and the second tier consists of four possible reasons. Based on the content validity results for the 20 items using the CVR (Content Validity Ratio), 18 items were declared valid. Based on the reliability test using SPSS, 17 items obtained a Cronbach's alpha value of 0.703, which means the items are acceptable. A total of 10 items was administered to 35 senior high school students who had studied the mole concept at one of the high schools in Cimahi. Based on the results of the application test, student misconceptions were identified for each concept label in the mole concept, with misconception percentages for the concept labels of the mole (60.15%), Avogadro's number (34.28%), relative atomic mass (62.84%), relative molecular mass (77.08%), molar mass (68.53%), molar volume of gas (57.11%), molarity (71.32%), chemical equations (82.77%), limiting reactants (91.40%), and molecular formulas (77.13%).
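    The CVR screening mentioned above is conventionally computed with Lawshe's formula, CVR = (ne - N/2) / (N/2), where ne is the number of panelists rating an item "essential" and N is the panel size. A minimal sketch (the panel size and ratings below are hypothetical, not the study's):

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's content validity ratio: CVR = (ne - N/2) / (N/2).
    Ranges from -1 (no panelist says 'essential') to +1 (all do)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Hypothetical panel of 10 experts; 9 rate the item essential.
print(content_validity_ratio(9, 10))  # -> 0.8
```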

  17. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project

  18. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.

    2011-01-05

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculation of the test statistic on a large number of simulated data sets to assess the significance level, and thus it can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can easily be used to estimate the p-value for any resampling-based test. We show through numerical simulations that the proposed procedure can be 100-500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^-6). With its computational burden reduced by the proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
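    The baseline that the proposed procedure accelerates is the plain resampling p-value: the fraction of resampled data sets whose statistic is at least as extreme as the observed one. A minimal permutation-test sketch of that baseline (it does not implement the paper's stochastic approximation MCMC algorithm; the data are hypothetical):

```python
import random

def permutation_pvalue(x, y, n_perm=10000, seed=1):
    """Standard resampling p-value for a two-sample mean-difference test:
    the fraction of label permutations whose absolute mean difference is
    at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= observed:
            hits += 1
    # add-one correction keeps the estimate away from an exact zero
    return (hits + 1) / (n_perm + 1)

p = permutation_pvalue([2.1, 2.5, 2.3, 2.2], [1.1, 1.0, 1.3, 0.9])
print(p)
```

    The cost of this baseline is exactly the issue the abstract raises: resolving a p-value near 10^-6 requires millions of permutations, each recomputing the statistic.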

  19. Initial economic and operations data base for DSS 13 automation test

    Science.gov (United States)

    Remer, D. S.; Lorden, G.

    1979-01-01

    A summary is given of the data base collected over nine weeks of Deep Space Station 13 (DSS 13) operations. Life-cycle cost parameters covering efficiency and productivity ratios, costs, and telemetry were calculated from this data base.

  20. Budget impact analysis of sFlt-1/PlGF ratio as prediction test in Italian women with suspected preeclampsia.

    Science.gov (United States)

    Frusca, Tiziana; Gervasi, Maria-Teresa; Paolini, Davide; Dionisi, Matteo; Ferre, Francesca; Cetin, Irene

    2017-09-01

    Preeclampsia (PE) is a pregnancy disease that represents a leading cause of maternal and perinatal mortality and morbidity. Accurate prediction of PE risk could increase health benefits and improve patient management. The aim was to estimate the economic impact of introducing the Elecsys sFlt-1/PlGF ratio test, in addition to standard practice, for the prediction of PE in women with suspected PE in the Italian National Health Service (INHS). A decision tree model was developed to simulate the progression of a cohort of pregnant women from the first presentation of clinical suspicion of PE in the second and third trimesters until delivery. The model provides an estimate of the financial impact of introducing sFlt-1/PlGF versus standard practice. Clinical inputs were derived from the PROGNOSIS study and from a literature review, and validated by national clinical experts. Resources and unit costs were obtained from Italy-specific sources. Healthcare costs associated with the management of a pregnant woman with clinical suspicion of PE equal €2384 when following standard practice versus €1714 using the sFlt-1/PlGF ratio test. Introduction of sFlt-1/PlGF into hospital practice is cost-saving. Savings are generated primarily through improvement in diagnostic accuracy and reduction in unnecessary hospitalization of women before the onset of PE.

  1. Mean value-based power allocation and ratio selection for MIMO cognitive radio systems

    KAUST Repository

    Tourki, Kamel; Qaraqe, Khalid A.; Alouini, Mohamed-Slim

    2013-01-01

    In this paper, we consider a spectrum sharing cognitive radio system with ratio selection using a mean value-based power allocation strategy. We first provide the exact statistics in terms of probability density function and cumulative density function of the secondary channel gain as well as of the interference channel gain. These statistics are then used to derive exact closed form expression of the secondary outage probability. Furthermore, asymptotical analysis is derived and generalized diversity gain is deduced. We validate our analysis with simulation results in a Rayleigh fading environment. © 2013 IEEE.

  2. Mean value-based power allocation and ratio selection for MIMO cognitive radio systems

    KAUST Repository

    Tourki, Kamel

    2013-06-01

    In this paper, we consider a spectrum sharing cognitive radio system with ratio selection using a mean value-based power allocation strategy. We first provide the exact statistics in terms of probability density function and cumulative density function of the secondary channel gain as well as of the interference channel gain. These statistics are then used to derive exact closed form expression of the secondary outage probability. Furthermore, asymptotical analysis is derived and generalized diversity gain is deduced. We validate our analysis with simulation results in a Rayleigh fading environment. © 2013 IEEE.

  3. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of the examinee, the risks associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  4. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  5. Contribution to the problem of liquidity ratios

    OpenAIRE

    Dvořáček Jaroslav

    1997-01-01

    The article is based on the importance of financial analysis in the mining industry. The author examines the liquidity ratios given in the literature from the standpoint of their number, content, units, and the recommended values of the individual ratios. For application in practice, two liquidity ratios are suggested and a methodology for determining their recommended values is given.

  6. Study on the contact ratio of base mat of reactor buildings considering nonlinear soil-structure interaction effects

    International Nuclear Information System (INIS)

    Aihara, S.; Atsumi, K.; Ujiie, K.; Emori, K.; Odajima, M.; Masuda, K.

    1983-01-01

    The objective of this paper is to evaluate the nonlinear soil-structure interaction effects resulting from base mat uplift under static lateral loads. Nonlinear soil-structure interaction effects are modeled through the use of equivalent frictional and axial soil-structure interaction springs, whose properties are determined from experimental data. It is assumed that normal stresses in compression, the corresponding shear stresses, and friction can occur in the area of contact between the embedded structure and the soil. The remaining parts of the structure and soil are treated by elastic analysis. A two-dimensional finite element method with incremental loading is applied, and the substructuring technique is used to reduce computation time. The results of this method with respect to the contact ratio of the base mat are compared with the values obtained by a static elastic calculation derived simply from the overturning moment and the vertical load of the structure. This analytical concept will be extended to dynamic problems, and it will then be possible to state whether or not it can represent a true alternative for determining the contact ratio of the base mat of a structure. (orig./HP)
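    The simple static estimate referred to above, deriving the contact ratio from the overturning moment M and vertical load N, is conventionally obtained from the eccentricity e = M/N for a rigid rectangular base on no-tension soil. This is a textbook simplification used as the comparison baseline, not the paper's finite element procedure; the numbers below are hypothetical:

```python
def contact_ratio(overturning_moment, vertical_load, base_width):
    """Static contact ratio of a rigid rectangular base mat on no-tension
    soil, from the eccentricity e = M / N:
      e <= B/6 -> full contact (ratio 1.0), trapezoidal pressure block
      e >  B/6 -> triangular pressure block of length 3*(B/2 - e)."""
    e = overturning_moment / vertical_load
    if e >= base_width / 2:
        raise ValueError("structure overturns: e >= B/2")
    if e <= base_width / 6:
        return 1.0
    return 3.0 * (base_width / 2 - e) / base_width

# Hypothetical example: B = 60 m, N = 1.2e6 kN, M = 1.8e7 kN*m,
# so e = 15 m > B/6 = 10 m and part of the base mat lifts off.
print(contact_ratio(1.8e7, 1.2e6, 60.0))  # -> 0.75
```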

  7. Response surface methodology based optimization of diesel–n-butanol–cotton oil ternary blend ratios to improve engine performance and exhaust emission characteristics

    International Nuclear Information System (INIS)

    Atmanlı, Alpaslan; Yüksel, Bedri; İleri, Erol; Deniz Karaoglan, A.

    2015-01-01

    Highlights: • RSM-based optimization of the optimum blend ratio of diesel fuel, n-butanol and cotton oil was performed. • 65.5 vol.% diesel fuel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC) was determined. • DnBC decreased brake torque, brake power, BTE and BMEP, while increasing BSFC. • DnBC decreased NOx, CO and HC emissions. - Abstract: Many studies report that 20% biodiesel is the optimum concentration in biodiesel–diesel fuel blends for improving performance. The present work focuses on finding the optimum blend ratios of diesel fuel, n-butanol, and cotton oil for diesel engine applications by using the response surface method (RSM). Experimental test fuels were prepared by choosing 7 different concentrations at which phase decomposition did not occur in the phase diagram at −10 °C. Experiments were carried out at full load and at the constant speed of maximum brake torque (2200 rpm) to determine engine performance and emission parameters. Based on the engine test results, optimization was performed using RSM, considering engine performance and exhaust emission parameters, to identify the component concentrations of the optimum ternary blend. Confirmation tests were employed to compare the output values of the concentrations identified by the optimization. The real experiment results and the actual R2 values, which show the relation between the optimization outputs and the real experiments, were in high accordance. The optimum component concentration was determined as 65.5 vol.% diesel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC). According to the engine performance tests, the brake torque, brake power, BTE and BMEP of DnBC decreased while BSFC increased compared to those of diesel fuel. The NOx, CO and HC emissions of DnBC decreased drastically, by 11.33%, 45.17% and 81.45%, respectively

  8. A Simple Plasma Retinol Isotope Ratio Method for Estimating β-Carotene Relative Bioefficacy in Humans: Validation with the Use of Model-Based Compartmental Analysis.

    Science.gov (United States)

    Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H

    2017-09-01

    Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [(RIR), labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. The method also provides
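
    The plasma RIR itself is a single-sample ratio; a minimal sketch follows. Normalizing each tracer by its ingested dose is an added assumption here, and the example numbers are invented; with equimolar doses the expression reduces to the abstract's definition.

```python
def plasma_rir(bc_derived_retinol, ref_derived_retinol,
               bc_dose=1.0, ref_dose=1.0):
    """Retinol isotope ratio from one plasma sample taken >= 14 d after
    co-ingestion: labeled beta-carotene-derived retinol over labeled
    reference-dose-derived retinol, each optionally dose-normalized."""
    return (bc_derived_retinol / bc_dose) / (ref_derived_retinol / ref_dose)

# Equimolar doses: 0.15 units of beta-carotene-derived labeled retinol vs.
# 0.60 units of reference-derived labeled retinol in the same sample.
print(plasma_rir(0.15, 0.60))   # → 0.25
```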

  9. Testing Game-Based Performance in Team-Handball.

    Science.gov (United States)

    Wagner, Herbert; Orwat, Matthias; Hinz, Matthias; Pfusterschmied, Jürgen; Bacharach, David W; von Duvillard, Serge P; Müller, Erich

    2016-10-01

    Wagner, H, Orwat, M, Hinz, M, Pfusterschmied, J, Bacharach, DW, von Duvillard, SP, and Müller, E. Testing game-based performance in team-handball. J Strength Cond Res 30(10): 2794-2801, 2016-Team-handball is a fast-paced game of defensive and offensive action that includes specific movements of jumping, passing, throwing, checking, and screening. To date and to the best of our knowledge, a game-based performance test (GBPT) for team-handball does not exist. Therefore, the aim of this study was to develop and validate such a test. Seventeen experienced team-handball players performed 2 GBPTs separated by 7 days, an incremental treadmill running test, and a team-handball test game (TG) (2 × 20 minutes). Peak oxygen uptake (V̇O2peak), blood lactate concentration (BLC), heart rate (HR), sprinting time, time of offensive and defensive actions, as well as running intensities, ball velocity, and jump height were measured in the game-based test. Reliability of the tests was calculated using an intraclass correlation coefficient (ICC). Additionally, we measured V̇O2peak in the incremental treadmill running test and BLC, HR, and running intensities in the team-handball TG to determine the validity of the GBPT. For test-retest reliability, we found an ICC >0.70 for peak BLC and HR, mean offense and defense time, and ball velocity, and an ICC >0.90 for V̇O2peak in the GBPT. Walking and standing constituted 73% of total time. Moderate (18%) and high (9%) intensity running in the GBPT was similar to that in the team-handball TG. Our results indicated that the GBPT is a valid and reliable test for analyzing team-handball performance (physiological and biomechanical variables) under conditions similar to competition.
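
    The reliability statistic in this abstract is an ICC computed from the two GBPT administrations. A minimal test-retest sketch, assuming the one-way random-effects form ICC(1,1) with k = 2 trials per player (the paper does not state which ICC form was used) and invented lactate values:

```python
def icc_oneway(trial1, trial2):
    """One-way random-effects ICC(1,1) for two trials per subject,
    computed from the one-way ANOVA mean squares."""
    n, k = len(trial1), 2
    grand = (sum(trial1) + sum(trial2)) / (n * k)
    means = [(a + b) / 2 for a, b in zip(trial1, trial2)]
    ss_between = k * sum((m - grand) ** 2 for m in means)
    ss_within = sum((a - m) ** 2 + (b - m) ** 2
                    for a, b, m in zip(trial1, trial2, means))
    ms_between = ss_between / (n - 1)        # between-subject mean square
    ms_within = ss_within / (n * (k - 1))    # within-subject mean square
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical peak blood lactate values (mmol/L) from the two GBPTs
test1 = [8.1, 9.4, 10.2, 7.6, 11.0]
test2 = [8.3, 9.1, 10.5, 7.9, 10.6]
print(round(icc_oneway(test1, test2), 3))
```

    With repeat values tracking the first trial closely relative to the between-player spread, the ICC comes out well above the 0.70 reliability criterion used in the study.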

  10. The Effect of Liquidity, Profitability, Leverage, and Market Ratios on the Dividend Payout Ratio in Manufacturing Companies (Pengaruh Likuiditas, Profitabilitas, Leverage, dan Market Ratio terhadap Dividend Payout Ratio pada Perusahaan Manufaktur)

    Directory of Open Access Journals (Sweden)

    Erna Puspita

    2017-04-01

    Full Text Available Dividend policy concerns financial decisions about how much cash dividend is paid to shareholders and how much is re-invested as retained earnings. This research aimed to test empirically various factors considered to affect dividend policy. The independent variables were the Current Ratio (CR), Return on Equity (ROE), Debt to Equity Ratio (DER), and Earnings Per Share (EPS); the dependent variable was the Dividend Payout Ratio (DPR). The research design was quantitative, using secondary data, and purposive sampling was applied to select the sample: 14 companies that paid dividends continuously during the study period of 2012-2014. Multiple linear regression was used to analyze the data. The results showed that ROE and EPS contribute to the DPR, whereas CR and DER do not.

  11. A LabVIEW™-based detector testing system

    International Nuclear Information System (INIS)

    Yang Haori; Li Yuanjing; Wang Yi; Li Yulan; Li Jin

    2003-01-01

    The construction of a LabVIEW-based detector testing system is described in this paper. In this system, the detector signal is amplified and digitized so that an amplitude or time spectrum can be obtained. The analog-to-digital converter is a peak-sensing ADC based on the VME bus. The virtual instrument built in LabVIEW can be used to acquire data, display spectra, and save test results.

  12. Intensity ratio to improve black hole assessment in multiple sclerosis.

    Science.gov (United States)

    Adusumilli, Gautam; Trinkaus, Kathryn; Sun, Peng; Lancia, Samantha; Viox, Jeffrey D; Wen, Jie; Naismith, Robert T; Cross, Anne H

    2018-01-01

    Improved imaging methods are critical to assessing neurodegeneration and remyelination in multiple sclerosis. Chronic hypointensities observed on T1-weighted brain MRI, "persistent black holes," reflect severe focal tissue damage. Present measures consist of determining persistent black hole numbers and volumes but do not quantify the severity of individual lesions. The objective was to develop a method to differentiate black and gray holes and to estimate the severity of individual multiple sclerosis lesions using standard magnetic resonance imaging. Thirty-eight multiple sclerosis patients contributed images. Intensities of lesions on T1-weighted scans were assessed relative to cerebrospinal fluid intensity using commercial software. Magnetization transfer imaging, diffusion tensor imaging, and clinical testing were performed to assess associations with the T1w intensity-based measures. Intensity-based assessments of T1w hypointensities were reproducible and achieved > 90% concordance with expert rater determinations of "black" and "gray" holes. Intensity ratio values correlated with magnetization transfer ratios (R = 0.473) and diffusion tensor imaging metrics (R values ranging from 0.283 to -0.531) that have been associated with demyelination and axon loss. Intensity ratio values incorporated into T1w hypointensity volumes correlated with clinical measures of cognition. This method of determining the degree of hypointensity within multiple sclerosis lesions can add information to conventional imaging. Copyright © 2017 Elsevier B.V. All rights reserved.
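
    A minimal sketch of the intensity-ratio idea, assuming mean lesion and CSF intensities have already been extracted from the T1w scan; the 1.2 black/gray cutoff below is purely illustrative and not the study's threshold:

```python
def t1w_intensity_ratio(lesion_mean, csf_mean):
    """Mean T1w lesion intensity relative to mean CSF intensity; lower
    ratios indicate hypointensities closer to CSF, i.e. worse damage."""
    return lesion_mean / csf_mean

def classify_hole(lesion_mean, csf_mean, black_cutoff=1.2):
    """Label a chronic T1w hypointensity. The 1.2 cutoff is an invented
    placeholder -- the study's actual criterion is not in the abstract."""
    ratio = t1w_intensity_ratio(lesion_mean, csf_mean)
    label = "black hole" if ratio <= black_cutoff else "gray hole"
    return label, ratio

label, r = classify_hole(lesion_mean=310.0, csf_mean=290.0)
print(label, round(r, 3))   # → black hole 1.069
```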

  13. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles
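
    A minimal sketch of the Poisson probability-ratio controller described here (Wald's SPRT applied to per-interval counts; the rates and error probabilities below are illustrative, not the monitors' actual settings):

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.001, beta=0.001):
    """Wald sequential probability ratio test on Poisson counts.
    H0: background rate lam0; H1: elevated rate lam1 (source present).
    alpha/beta are the target false-alarm and missed-detection rates."""
    upper = math.log((1 - beta) / alpha)   # cross above -> accept H1 (alarm)
    lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
    llr = 0.0
    for n, c in enumerate(counts, start=1):
        # per-interval Poisson log-likelihood ratio increment
        llr += c * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "alarm", n
        if llr <= lower:
            return "background", n
    return "undecided", len(counts)

# Background of 2 counts/interval vs. a source giving 8 counts/interval:
# the test terminates as soon as the evidence crosses a boundary, which is
# how the fixed 50-s interval shrinks to an 18-s average in the paper.
print(sprt_poisson([7, 9, 10, 8], lam0=2.0, lam1=8.0))   # → ('alarm', 2)
```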

  14. Agreement between clinicians' and care givers' assessment of intelligence in Nigerian children with intellectual disability: 'ratio IQ' as a viable option in the absence of standardized 'deviance IQ' tests in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Aguocha Chinyere M

    2009-09-01

    Full Text Available Abstract Background There may be a need to assess intelligence quotient (IQ) scores in sub-Saharan African children with intellectual disability, either for educational needs assessment or for research. However, modern intelligence scales developed in the western parts of the world are limited in widespread use because of socio-cultural variations across the world. This study examined the agreement between IQ score estimates among Nigerian children with intellectual disability using clinicians' judgment based on the International Classification of Diseases, tenth edition (ICD-10) criteria for mental retardation and caregivers' judgment based on 'ratio IQ' scores calculated from estimated mental age in the context of the socio-cultural milieu of the children. It proposes a viable option for IQ score assessment among sub-Saharan African children with intellectual disability, using a ratio of culture-specific estimated mental age to the chronological age of the child, in the absence of standardized alternatives, given the great diversity in the socio-cultural context of sub-Saharan Africa. Methods Clinicians and caregivers independently assessed the children in relation to their socio-cultural background. Clinicians assessed the IQ scores of the children based on the ICD-10 diagnostic criteria for mental retardation. 'Ratio IQ' scores were calculated from the ratio of estimated mental age to chronological age of each child. The IQ scores as assessed by the clinicians were then compared with the 'ratio IQ' scores using correlation statistics. Results A total of forty-four (44) children with intellectual disability were assessed. There was a significant correlation between clinicians' assessed IQ scores and the 'ratio IQ' scores employing zero-order correlation without controlling for the chronological age of the children (r = 0.47, df = 42, p = 0.001).
First order correlation controlling for the chronological age of the children
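
    The 'ratio IQ' used above is the classical quotient of mental age to chronological age; a one-line sketch (ages in years, example values invented):

```python
def ratio_iq(mental_age, chronological_age):
    """Classical 'ratio IQ': 100 * mental age / chronological age.
    Here the mental age would come from culture-specific caregiver
    estimates rather than a standardized test."""
    return 100.0 * mental_age / chronological_age

# A 10-year-old functioning at the level of a typical 6-year-old:
print(ratio_iq(mental_age=6.0, chronological_age=10.0))   # → 60.0
```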

  15. Model Predictive Engine Air-Ratio Control Using Online Sequential Relevance Vector Machine

    Directory of Open Access Journals (Sweden)

    Hang-cheong Wong

    2012-01-01

    Full Text Available Engine power, brake-specific fuel consumption, and emissions relate closely to air ratio (i.e., lambda) among all the engine variables. An accurate and adaptive model for lambda prediction is essential to effective long-term lambda control. This paper utilizes an emerging technique, the relevance vector machine (RVM), to build a reliable time-dependent lambda model that can be continually updated whenever a sample is added to, or removed from, the estimated lambda model. The paper also presents a new model predictive control (MPC) algorithm for air-ratio regulation based on RVM. This study shows that the accuracy, training time, and updating time of the RVM model are superior to those of the latest modelling methods, such as the diagonal recurrent neural network (DRNN) and the decremental least-squares support vector machine (DLSSVM). Moreover, the control algorithm has been implemented on a real car for testing. Experimental results reveal that the control performance of the proposed relevance vector machine model predictive controller (RVMMPC) is also superior to DRNNMPC, support vector machine-based MPC, and the conventional proportional-integral (PI) controller in production cars. Therefore, the proposed RVMMPC is a promising scheme to replace conventional PI controllers for engine air-ratio control.

  16. Contribution to the problem of liquidity ratios

    Directory of Open Access Journals (Sweden)

    Dvořáček Jaroslav

    1997-03-01

    Full Text Available The article is motivated by the importance of financial analysis in the mining industry. The author reviews the liquidity ratios given in the literature from the standpoint of their number, content, units, and the recommended values of the individual ratios. For practical application, two liquidity ratios are suggested, and a methodology for determining their recommended values is given.
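
    The abstract does not name the two suggested ratios; purely for illustration, the two textbook liquidity ratios below (current and quick) are a common pairing:

```python
def current_ratio(current_assets, current_liabilities):
    """Current ratio: ability to cover short-term liabilities with
    short-term assets."""
    return current_assets / current_liabilities

def quick_ratio(current_assets, inventory, current_liabilities):
    """Quick (acid-test) ratio: as above but excluding inventory, which
    in mining may be slow to liquidate."""
    return (current_assets - inventory) / current_liabilities

# Hypothetical balance-sheet figures for a mining firm:
print(current_ratio(1_500_000, 1_000_000))          # → 1.5
print(quick_ratio(1_500_000, 400_000, 1_000_000))   # → 1.1
```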

  17. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor's thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to help investigate and create a test automation solution for the stability testing of the LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. The work included creating the automation for the test cases and putting them into daily test automation jobs. The key factor...

  18. Preconception stress and the secondary sex ratio in a population-based preconception cohort.

    Science.gov (United States)

    Bae, Jisuk; Lynch, Courtney D; Kim, Sungduk; Sundaram, Rajeshwari; Sapra, Katherine J; Buck Louis, Germaine M

    2017-03-01

    To examine the association between preconception parental stress and the secondary sex ratio, defined as the ratio of males to females at birth. A population-based preconception cohort. Not applicable. A total of 235 couples who were enrolled before conception in Michigan and Texas between 2005 and 2009 and who had a singleton birth during the follow-up period. Couples were interviewed separately at baseline to obtain information on perceived stress (Cohen's Perceived Stress Scale) and lifetime history of physician-diagnosed anxiety and/or mood disorders. Female partners were also trained to collect basal saliva samples for the measurement of salivary stress markers, alpha-amylase and cortisol. None. Birth outcome data including infant sex were collected upon delivery. Modified Poisson regression models were used to estimate the relative risks (RRs) of a male birth for each stress marker. After adjusting for potential confounders, we observed a 76% increase in the risk of fathering a male infant (RR 1.76; 95% confidence interval 1.17-2.65) in men diagnosed with anxiety disorders compared with those who were not diagnosed. When lifetime history of physician-diagnosed anxiety disorders was modeled jointly for the couple, the association was slightly strengthened (RR 2.03; 95% confidence interval 1.46-2.84). This prospective cohort study suggests that paternal lifetime history of physician-diagnosed anxiety disorders may be associated with an increase in the secondary sex ratio, resulting in an excess of male births. Copyright © 2016 American Society for Reproductive Medicine. All rights reserved.

  19. Broadband non-polarizing terahertz beam splitters with variable split ratio

    KAUST Repository

    Wei, Minggui

    2017-08-15

    Seeking effective terahertz functional devices has always aroused extensive attention. Of particular interest is the terahertz beam splitter. Here, we have proposed, designed, manufactured, and tested a broadband non-polarizing terahertz beam splitter with a variable split ratio based on an all-dielectric metasurface. The metasurface was created by patterning a dielectric surface of the N-step phase gradient and etching to a few hundred micrometers. The conversion efficiency as high as 81% under the normal incidence at 0.7 THz was achieved. Meanwhile, such a splitter works well over a broad frequency range. The split ratio of the proposed design can be continuously tuned by simply shifting the metasurface, and the angle of emergences can also be easily adjusted by choosing the step of phase gradients. The proposed design is non-polarizing, and its performance is kept under different polarizations.

  20. Broadband non-polarizing terahertz beam splitters with variable split ratio

    Science.gov (United States)

    Wei, Minggui; Xu, Quan; Wang, Qiu; Zhang, Xueqian; Li, Yanfeng; Gu, Jianqiang; Tian, Zhen; Zhang, Xixiang; Han, Jiaguang; Zhang, Weili

    2017-08-01

    Seeking effective terahertz functional devices has always aroused extensive attention. Of particular interest is the terahertz beam splitter. Here, we have proposed, designed, manufactured, and tested a broadband non-polarizing terahertz beam splitter with a variable split ratio based on an all-dielectric metasurface. The metasurface was created by patterning a dielectric surface of the N-step phase gradient and etching to a few hundred micrometers. The conversion efficiency as high as 81% under the normal incidence at 0.7 THz was achieved. Meanwhile, such a splitter works well over a broad frequency range. The split ratio of the proposed design can be continuously tuned by simply shifting the metasurface, and the angle of emergences can also be easily adjusted by choosing the step of phase gradients. The proposed design is non-polarizing, and its performance is kept under different polarizations.

  1. Broadband non-polarizing terahertz beam splitters with variable split ratio

    KAUST Repository

    Wei, Minggui; Xu, Quan; Wang, Qiu; Zhang, Xueqian; Li, Yanfeng; Gu, Jianqiang; Tian, Zhen; Zhang, Xixiang; Han, Jiaguang; Zhang, Weili

    2017-01-01

    Seeking effective terahertz functional devices has always aroused extensive attention. Of particular interest is the terahertz beam splitter. Here, we have proposed, designed, manufactured, and tested a broadband non-polarizing terahertz beam splitter with a variable split ratio based on an all-dielectric metasurface. The metasurface was created by patterning a dielectric surface of the N-step phase gradient and etching to a few hundred micrometers. The conversion efficiency as high as 81% under the normal incidence at 0.7 THz was achieved. Meanwhile, such a splitter works well over a broad frequency range. The split ratio of the proposed design can be continuously tuned by simply shifting the metasurface, and the angle of emergences can also be easily adjusted by choosing the step of phase gradients. The proposed design is non-polarizing, and its performance is kept under different polarizations.

  2. Comparison of Potentiometric and Gravimetric Methods for Determination of O/U Ratio

    International Nuclear Information System (INIS)

    Farida; Windaryati, L; Putro Kasino, P

    1998-01-01

    A comparison of O/U ratio determination by potentiometric and gravimetric methods has been carried out. Both methods are simple and economical and offer high precision and accuracy. In the potentiometric determination of the O/U ratio of UO2 powder, the Davies-Gray method is adopted; this technique is based on the redox reactions of uranium species such as U(IV) and U(VI). In the gravimetric method, the UO2 powder sample is calcined at a temperature of 900 °C, and the weight of the sample is measured after the calcination process. Student's t-test shows no significant difference between the results of the two methods. However, for low concentrations in the sample, the potentiometric method has higher precision and accuracy than the gravimetric method. The O/U ratio obtained is 2.00768 ± 0.00170 for the potentiometric method and 2.01089 ± 0.02395 for the gravimetric method.
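
    The method comparison rests on a two-sample t-test; a minimal sketch using Welch's statistic (the paper's exact t-test variant is not specified, and the replicate O/U values below are invented):

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's two-sample t statistic; unlike Student's version it does
    not assume equal variances, which suits methods of differing precision."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical replicate O/U determinations by each method
potentiometric = [2.0066, 2.0075, 2.0082, 2.0080, 2.0072]
gravimetric = [2.0121, 2.0089, 2.0156, 2.0098, 2.0080]
print(round(welch_t(potentiometric, gravimetric), 3))
```

    The statistic would then be compared against a t distribution (with Welch-Satterthwaite degrees of freedom) to decide whether the two methods differ significantly.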

  3. Spot protein-creatinine ratio and spot albumin-creatinine ratio in the assessment of pre-eclampsia: a diagnostic accuracy study with decision-analytic model-based economic evaluation and acceptability analysis.

    Science.gov (United States)

    Waugh, Jason; Hooper, Richard; Lamb, Edmund; Robson, Stephen; Shennan, Andrew; Milne, Fiona; Price, Christopher; Thangaratinam, Shakila; Berdunov, Vladislav; Bingham, Jenn

    2017-10-01

    The National Institute for Health and Care Excellence (NICE) guidelines highlighted the need for 'large, high-quality prospective studies comparing the various methods of measuring proteinuria in women with new-onset hypertensive disorders during pregnancy'. The primary objective was to evaluate quantitative assessments of spot protein-creatinine ratio (SPCR) and spot albumin-creatinine ratio (SACR) in predicting severe pre-eclampsia (PE) compared with 24-hour urine protein measurement. The secondary objectives were to investigate interlaboratory assay variation, to evaluate SPCR and SACR thresholds in predicting adverse maternal and fetal outcomes and to assess the cost-effectiveness of these models. This was a prospective diagnostic accuracy cohort study, with decision-analytic modelling and a cost-effectiveness analysis. The setting was 36 obstetric units in England, UK. Pregnant women (aged ≥ 16 years), who were at > 20 weeks' gestation with confirmed gestational hypertension and trace or more proteinuria on an automated dipstick urinalysis. Women provided a spot urine sample for protein analysis (the recruitment sample) and were asked to collect a 24-hour urine sample, which was stored for secondary analysis. A further spot sample of urine was taken immediately before delivery. Outcome data were collected from hospital records. There were four index tests on a spot sample of urine: (1) SPCR test (conducted at the local laboratory); (2) SPCR test [conducted at the central laboratory using the benzethonium chloride (BZC) assay]; (3) SPCR test [conducted at the central laboratory using the pyrogallol red (PGR) assay]; and (4) SACR test (conducted at the central laboratory using an automated chemistry analyser). The comparator tests on 24-hour urine collection were a central test using the BZC assay and a central test using the PGR assay. The primary reference standard was the NICE definition of severe PE. Secondary reference standards were a clinician

  4. Meiotic sex ratio variation in natural populations of Ceratodon purpureus (Ditrichaceae).

    Science.gov (United States)

    Norrell, Tatum E; Jones, Kelly S; Payton, Adam C; McDaniel, Stuart F

    2014-09-01

    • Sex ratio variation is a common but often unexplained phenomenon in species across the tree of life. Here we evaluate the hypothesis that meiotic sex ratio variation can contribute to the biased sex ratios found in natural populations of the moss Ceratodon purpureus. • We obtained sporophytes from several populations of C. purpureus from eastern North America. From each sporophyte, we estimated the mean spore viability by germinating replicate samples on agar plates. We estimated the meiotic sex ratio of each sporophyte by inferring the sex of a random sample of germinated spores (mean = 77) using a PCR-RFLP test. We tested for among-sporophyte variation in viability using an ANOVA and for deviations from a 1:1 sex ratio using a χ² test, and evaluated the relationship between these quantities using a linear regression. • We found among-sporophyte variation in spore viability and meiotic sex ratio, suggesting that genetic variants contributing to variation in both of these traits segregate within populations of this species. However, we found no relationship between these quantities, suggesting that factors other than sex ratio distorters contribute to variation in spore viability within populations. • These results demonstrate that sex ratio distortion may partially explain the population sex ratio variation seen in C. purpureus, and more generally that genetic conflict over meiotic segregation may contribute to fitness variation in this species. Overall, this study lays the groundwork for future studies on the genetic basis of meiotic sex ratio variation. © 2014 Botanical Society of America, Inc.
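
    The 1:1 sex-ratio check is a one-degree-of-freedom chi-square test; a minimal sketch (the example spore counts are invented, not the study's data):

```python
def chi_square_1to1(n_male, n_female):
    """Chi-square statistic (1 df) for deviation from a 1:1 sex ratio;
    compare against the 5% critical value of 3.84."""
    expected = (n_male + n_female) / 2
    return ((n_male - expected) ** 2 + (n_female - expected) ** 2) / expected

# A sporophyte yielding 50 male and 27 female germinated spores:
print(round(chi_square_1to1(50, 27), 2))   # → 6.87, above 3.84, so biased
```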

  5. Cost Implications of Value-Based Pricing for Companion Diagnostic Tests in Precision Medicine.

    Science.gov (United States)

    Zaric, Gregory S

    2016-07-01

    Many interpretations of personalized medicine, also referred to as precision medicine, include discussions of companion diagnostic tests that allow drugs to be targeted to those individuals who are most likely to benefit or that allow treatment to be designed in a way such that individuals who are unlikely to benefit do not receive treatment. Many authors have commented on the clinical and competitive implications of companion diagnostics, but there has been relatively little formal analysis of the cost implications of companion diagnostics, although cost reduction is often cited as a significant benefit of precision medicine. We investigate the potential impact on costs of precision medicine implemented through the use of companion diagnostics. We develop a framework in which the costs of companion diagnostic tests are determined by considerations of profit maximization and cost effectiveness. We analyze four scenarios that are defined by the incremental cost-effectiveness ratio of the new drug in the absence of a companion diagnostic test. We find that, in most scenarios, precision medicine strategies based on companion diagnostics should be expected to lead to increases in costs in the short term and that costs would fall only in a limited number of situations.
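
    The scenarios here are defined by the new drug's incremental cost-effectiveness ratio (ICER); a minimal sketch of the quantity itself (all numbers invented):

```python
def icer(cost_new, cost_std, effect_new, effect_std):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g., per QALY) of the new strategy over standard care."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Hypothetical: companion-diagnostic-guided therapy vs. treat-everyone.
# Testing adds cost but concentrates treatment on likely responders.
print(icer(48_000, 30_000, 2.0, 1.5))   # → 36000.0 (cost per QALY gained)
```

    Whether 36,000 per QALY is acceptable depends on the payer's willingness-to-pay threshold, which is exactly the lever the paper's value-based test-pricing framework turns on.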

  6. Social inequality and HIV-testing: Comparing home- and clinic-based testing in rural Malawi

    Directory of Open Access Journals (Sweden)

    Alexander A. Weinreb

    2009-10-01

    Full Text Available The plan to increase HIV testing is a cornerstone of the international health strategy against the HIV/AIDS epidemic, particularly in sub-Saharan Africa. This paper highlights a problematic aspect of that plan: the reliance on clinic- rather than home-based testing. First, drawing on DHS data from across Africa, we demonstrate the substantial differences in socio-demographic and economic profiles between those who report having ever had an HIV test, and those who report never having had one. Then, using data from a random household survey in rural Malawi, we show that substituting home-based for clinic-based testing may eliminate this source of inequality between those tested and those not tested. This result, which is stable across modeling frameworks, has important implications for accurately and equitably addressing the counseling and treatment programs that comprise the international health strategy against AIDS, and that promise to shape the future trajectory of the epidemic in Africa and beyond.

  7. The influence of Saccharomyces cerevisiae enzyme ratio on preparation virgin coconut oil for candidate in-house reference materials

    Science.gov (United States)

    Rohyami, Yuli; Anjani, Rafika Debby; Purwanti, Napthalina Putri

    2017-03-01

    Virgin coconut oil is an excellent product that has created oil-processing business opportunities in the international market. Standardization of virgin coconut oil is necessary to satisfy industry requirements. This research is intended to provide a procedure for the preparation of reference materials. Virgin coconut oil was prepared using Saccharomyces cerevisiae enzyme. Based on the results of this study, it is concluded that the ratio of Saccharomyces cerevisiae can affect the yield of the virgin coconut oil produced. Enzymatic preparation of virgin coconut oil using mass ratios of 0.001 to 0.006% gave an average yield of 12.40%. The optimum separation of virgin coconut oil occurred at an enzyme mass ratio of 0.002%. The average water content at a ratio of 0.002% is 0.04%, with an uncertainty of 0.005%. The average iodine number of the virgin coconut oil produced is 2.4403 ± 0.1974 grams of iodine per 100 grams of oil, and the optimum iodine number is obtained from the process using a 0.006% ratio of Saccharomyces cerevisiae. Saccharomyces cerevisiae at a ratio of 0.002% yields virgin coconut oil with an acid number of 0.3068 ± 0.1098%. The peroxide value of the virgin coconut oil is between 0.0108 ± 0.009 and 0.0114 ± 0.0015 milliequivalents per kilogram. The organoleptic and chemical test results can be used as data for developing a prototype candidate in-house reference material for testing virgin coconut oil quality standards.

  8. Fatigue crack closure behavior at high stress ratios

    Science.gov (United States)

    Turner, C. Christopher; Carman, C. Davis; Hillberry, Ben M.

    1988-01-01

    Fatigue crack delay behavior at high stress ratio caused by single peak overloads was investigated in two thicknesses of 7475-T731 aluminum alloy. Closure measurements indicated that no closure occurred before or throughout the overload plastic zones following the overload. This was further substantiated by comparing the specimen compliance following the overload with the compliance of a low R ratio test when the crack was fully open. Scanning electron microscope studies revealed that crack tunneling and possibly reinitiation of the crack occurred, most likely as a result of crack-tip blunting. The number of delay cycles was greater for the thinner mixed-mode stress state specimen than for the thicker plane strain stress state specimen, which is similar to low R ratio test results and may be due to a larger plastic zone for the mixed-mode case.

  9. A heuristic application of critical power ratio to pressurized water reactor core design

    International Nuclear Information System (INIS)

    Ahn, Seung Hoon; Jeun, Gyoo Dong

    2002-01-01

    The approach of evaluating the critical heat flux (CHF) margin using the departure from nucleate boiling ratio (DNBR) concept has been widely applied to PWR core design, although the DNBR in this approach does not appropriately indicate the CHF margin in terms of the attainable power margin to CHF for a given reactor core condition. The CHF power margin must instead be calculated by increasing power until the minimum DNBR reaches a DNBR limit. The critical power ratio (CPR), defined as the ratio of the predicted CHF power to the operating power, indicates the CHF margin more directly and can be calculated by a CPR correlation based on the heat balance of a test bundle. This approach yields the CHF power margin directly, but the calculated CPR must be corrected to compensate for many local effects of the actual core that are not considered in the CHF test and analysis. In this paper, the calculated CPR is corrected so that it becomes equal to the DNB overpower margin. Exemplary calculations showed that the correction tends to increase as the power distribution becomes more distorted, but it is not unduly large.
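
    The CPR definition in the abstract is a direct power ratio; a minimal sketch of the quantity and the overpower margin it implies (example powers invented):

```python
def critical_power_ratio(predicted_chf_power, operating_power):
    """CPR as defined in the paper: predicted CHF power over operating
    power. CPR = 1.5 means CHF is predicted at 150% of current power."""
    return predicted_chf_power / operating_power

def chf_power_margin_percent(cpr):
    """Attainable overpower margin implied by a (corrected) CPR."""
    return 100.0 * (cpr - 1.0)

# Hypothetical core: CHF predicted at 4500 MW while operating at 3000 MW
cpr = critical_power_ratio(predicted_chf_power=4500.0, operating_power=3000.0)
print(cpr, chf_power_margin_percent(cpr))   # → 1.5 50.0
```

    The paper's contribution is the correction applied to this raw CPR so that the implied margin matches the DNB overpower margin of the actual core.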

  10. Space Launch System Base Heating Test: Experimental Operations & Results

    Science.gov (United States)

    Dufrene, Aaron; Mehta, Manish; MacLean, Matthew; Seaford, Mark; Holden, Michael

    2016-01-01

    NASA's Space Launch System (SLS) uses four clustered liquid rocket engines along with two solid rocket boosters. The interaction between all six rocket exhaust plumes will produce a complex and severe thermal environment in the base of the vehicle. This work focuses on a recent 2% scale, hot-fire SLS base heating test. These base heating tests are short-duration tests executed with chamber pressures near the full-scale values with gaseous hydrogen/oxygen engines and RSRMV analogous solid propellant motors. The LENS II shock tunnel/Ludwieg tube tunnel was used at or near flight duplicated conditions up to Mach 5. Model development was based on the Space Shuttle base heating tests with several improvements including doubling of the maximum chamber pressures and duplication of freestream conditions. Test methodology and conditions are presented, and base heating results from 76 runs are reported in non-dimensional form. Regions of high heating are identified and comparisons of various configuration and conditions are highlighted. Base pressure and radiometer results are also reported.

  11. Diagnostic tests based on human basophils

    DEFF Research Database (Denmark)

    Kleine-Tebbe, Jörg; Erdmann, Stephan; Knol, Edward F

    2006-01-01

    -maximal responses, termed 'intrinsic sensitivity'. These variables give rise to shifts in the dose-response curves which, in a diagnostic setting where only a single antigen concentration is employed, may produce false-negative data. Thus, in order to meaningfully utilize the current basophil activation tests....... Diagnostic studies using CD63 or CD203c in hymenoptera, food and drug allergy are critically discussed. Basophil-based tests are indicated for allergy testing in selected cases but should only be performed by experienced laboratories....

  12. The use of interval ratios in consonance perception by rats (Rattus norvegicus) and humans (Homo sapiens).

    Science.gov (United States)

    Crespo-Bojorque, Paola; Toro, Juan M

    2015-02-01

    Traditionally, physical features of musical chords have been proposed to be at the root of consonance perception. Alternatively, recent studies suggest that different types of experience modulate some perceptual foundations for musical sounds. The present study tested whether the mechanisms involved in the perception of consonance are present in an animal with no extensive experience with harmonic stimuli and a relatively limited vocal repertoire. In Experiment 1, rats were trained to discriminate consonant from dissonant chords and tested to explore whether they could generalize this discrimination to novel chords. In Experiment 2, we tested whether rats could discriminate between chords differing only in their interval ratios and generalize across octaves. To contrast the observed pattern of results, human adults were tested with the same stimuli in Experiment 3. Rats successfully discriminated across chords in both experiments, but they did not generalize to novel items in either Experiment 1 or Experiment 2. In contrast, humans not only discriminated between the consonance-dissonance categories and among sets of interval ratios, but also generalized their responses to novel items. These results suggest that experience with harmonic sounds may be required for the construction of categories among stimuli varying in frequency ratios. However, the discriminative capacity observed in rats suggests that at least some components of the auditory processing needed to distinguish chords based on their interval ratios are shared across species. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  13. Analysis of the moderating ratio in BWR fuels

    International Nuclear Information System (INIS)

    Gomez, A.; Xolocostli, V.; Alonso, G.

    2001-01-01

    Fuel assembly design is very important in all types of light water reactors: the assembly has to be designed to achieve safe and efficient performance in an economical way. The moderating ratio plays a very important role because an adequate choice can provide optimal energy production, making the fuel assembly more efficient. This work analyzes the moderating ratio as a function of the fuel assembly enrichment and its burnup; based on this study, the optimal moderating ratio is obtained. Furthermore, based on numerical relations, some simulation schemes are proposed to describe the behavior of the infinite multiplication factor as a function of the moderating ratio for a given fuel assembly enrichment at zero burnup. (Author)

  14. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    Science.gov (United States)

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the
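    The scaling idea behind the summary-statistic test can be illustrated with a quadratic-form sketch: marker Z-statistics are combined through the inverse of their correlation matrix, giving a statistic that is chi-square distributed under the null. The two-marker setup, the Z-values, and the correlation r = 0.5 below are hypothetical, and the paper's actual construction (including its protection against a misspecified correlation structure) is more involved.

```python
import math

def summary_stat_test(z1, z2, r):
    """Quadratic-form test T = z' R^{-1} z for two markers with correlation r;
    T is chi-square with 2 d.o.f. under the null (illustrative sketch only)."""
    det = 1.0 - r * r
    # Closed-form inverse of the 2x2 correlation matrix [[1, r], [r, 1]]
    return (z1 * z1 - 2.0 * r * z1 * z2 + z2 * z2) / det

# Hypothetical Z-statistics for two markers in LD (r = 0.5)
T = summary_stat_test(2.0, 1.5, 0.5)
# For 2 d.o.f. the chi-square upper-tail probability is exactly exp(-T/2)
p = math.exp(-T / 2.0)
```

    Note how the cross term down-weights correlated evidence: two markers in strong LD carry less independent information than two uncorrelated ones with the same Z-statistics.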

  15. Isotopic ratios of actinides used in British nuclear trials at Maralinga and Emu

    International Nuclear Information System (INIS)

    Johnston, P.N.; Burns, P.A.; Cooper, M.B.; Williams, G.A.

    1988-10-01

    Studies are underway to investigate the rehabilitation of the former nuclear weapons test sites at Maralinga and Emu in South Australia. Many of these studies are based on measurements of Am-241 which is a contaminant in the plutonium dispersed at these sites. Measurements of the ratio of activities of Pu-239 and Am-241 are presented for sites where suitable samples could be collected. Where possible, measurements were also made of Pu-240 and U-235 activities. Recommended values, current for mid 1988, for the Pu-239/Am-241 activity ratio for the major trial sites range between 20 and 40. At Taranaki values of 6 to 22 were reported, while for the minor sites, current values of the Pu-240/Am-241 activity ratio vary between 1 and 2. 6 refs., 6 tabs., 3 figs

  16. EVALUATION OF QUANTITATIVE THYROID SCINTIGRAPHY FOR DIAGNOSIS AND STAGING OF DISEASE SEVERITY IN CATS WITH HYPERTHYROIDISM: COMPARISON OF THE PERCENT THYROIDAL UPTAKE OF PERTECHNETATE TO THYROID-TO-SALIVARY RATIO AND THYROID-TO-BACKGROUND RATIOS.

    Science.gov (United States)

    Peterson, Mark E; Guterl, Jade N; Rishniw, Mark; Broome, Michael R

    2016-07-01

    Thyroid scintigraphy is commonly used for evaluation of cats with hyperthyroidism, with the thyroid-to-salivary ratio (T/S) being the most common method to quantify the degree of thyroid activity and disease. Calculation of thyroid-to-background ratios (T/B) or the percent thyroidal uptake of (99m)TcO4(-) (TcTU) has only been reported in a few studies. The purpose of this prospective, cross-sectional study was to evaluate a number of quantitative scintigraphic indices as diagnostic tests for hyperthyroidism, including the T/S, three different T/B, the TcTU, and estimated thyroid volume. Of 524 cats referred to our clinic for evaluation of suspected hyperthyroidism, the diagnosis was confirmed (n = 504) or excluded (n = 20) based on results of a serum thyroid panel consisting of thyroxine (T4), triiodothyronine (T3), free T4 (fT4), and thyroid-stimulating hormone (TSH) concentrations. In the hyperthyroid cats, median values for TcTU, T/S, and the three T/B ratios were all significantly higher (P hyperthyroidism, but the T/S ratio had the highest test accuracy. The T/S ratio correlated strongly with the TcTU (r = 0.85). However, the TcTU had a higher and more significant correlation (P metabolic activity of the feline adenomatous thyroid gland. © 2016 American College of Veterinary Radiology.

  17. Establishing a sample-to-cut-off ratio for lab-diagnosis of hepatitis C virus in the Indian context.

    Science.gov (United States)

    Tiwari, Aseem K; Pandey, Prashant K; Negi, Avinash; Bagga, Ruchika; Shanker, Ajay; Baveja, Usha; Vimarsh, Raina; Bhargava, Richa; Dara, Ravi C; Rawat, Ganesh

    2015-01-01

    Lab-diagnosis of hepatitis C virus (HCV) is based on detecting specific antibodies by enzyme immunoassay (EIA) or chemiluminescence immunoassay (CIA). The Centers for Disease Control and Prevention reported that signal-to-cut-off (s/co) ratios in anti-HCV antibody tests like EIA/CIA can be used to predict the probable result of a supplemental test: above a certain s/co value the result is most likely a true HCV positive, and below that value it is most likely a false positive. A prospective study was undertaken in patients in a tertiary care setting to establish this "certain" s/co value. The study was carried out in consecutive patients requiring HCV testing for screening/diagnosis and medical management. The samples were tested for anti-HCV on CIA (VITROS(®) Anti-HCV assay, Ortho-Clinical Diagnostics, New Jersey) to calculate the s/co value. The supplemental nucleic acid test used was the polymerase chain reaction (PCR) (Abbott). PCR test results were used to define true negatives, false negatives, true positives, and false positives. The performance of different putative s/co ratios versus PCR was measured using sensitivity, specificity, positive predictive value, and negative predictive value, and the most appropriate s/co was chosen on the basis of the highest specificity at a sensitivity of at least 95%. An s/co ratio of ≥6 worked out to be over 95% sensitive and almost 92% specific in 438 consecutive patient samples tested. The s/co ratio of six can be used for lab-diagnosis of HCV infection; those with an s/co higher than six can be diagnosed with HCV infection without any need for supplemental assays.
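    The threshold evaluation described above amounts to computing a confusion matrix at each candidate s/co cutoff against the PCR result. A minimal sketch, using entirely hypothetical (s/co, PCR) pairs rather than the study's 438 samples:

```python
def scco_performance(samples, cutoff):
    """Sensitivity/specificity/PPV/NPV of an s/co cutoff against PCR truth.
    `samples` is a list of (s_co_ratio, pcr_positive) pairs."""
    tp = sum(1 for s, pos in samples if s >= cutoff and pos)
    fp = sum(1 for s, pos in samples if s >= cutoff and not pos)
    fn = sum(1 for s, pos in samples if s < cutoff and pos)
    tn = sum(1 for s, pos in samples if s < cutoff and not pos)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical illustration, not the study's data
data = [(12.0, True), (8.5, True), (7.0, True), (9.0, True), (6.5, True),
        (6.8, False), (5.5, False), (4.0, False), (2.0, False), (3.0, True)]
perf = scco_performance(data, cutoff=6.0)
```

    In the study's workflow this calculation would be repeated over candidate cutoffs, keeping the one with the highest specificity among cutoffs whose sensitivity is at least 95%.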

  18. Evaluating score- and feature-based likelihood ratio models for multivariate continuous data: applied to forensic MDMA comparison

    NARCIS (Netherlands)

    Bolck, A.; Ni, H.; Lopatka, M.

    2015-01-01

    Likelihood ratio (LR) models are moving into the forefront of forensic evidence evaluation as these methods are adopted by a diverse range of application areas in forensic science. We examine the fundamentally different results that can be achieved when feature- and score-based methodologies are

  19. Gyromagnetic ratio of charged Kerr-anti-de Sitter black holes

    International Nuclear Information System (INIS)

    Aliev, Alikram N

    2007-01-01

    We examine the gyromagnetic ratios of rotating and charged AdS black holes in four and higher spacetime dimensions. We compute the gyromagnetic ratio for Kerr-AdS black holes with an arbitrary electric charge in four dimensions and show that it corresponds to g = 2 irrespective of the AdS nature of the spacetime. We also compute the gyromagnetic ratio for Kerr-AdS black holes with a single angular momentum and with a test electric charge in all higher dimensions. The gyromagnetic ratio crucially depends on the dimensionless ratio of the rotation parameter to the curvature radius of the AdS background. At the critical limit, when the boundary Einstein universe is rotating at the speed of light, it exhibits a striking feature, leading to g → 2 regardless of the spacetime dimension. Next, we extend our consideration to include the exact metric for five-dimensional rotating charged black holes in minimal gauged supergravity. We show that the value of the gyromagnetic ratio found in the 'test-charge' approach remains unchanged for these black holes.

  20. The ATP/DNA Ratio Is a Better Indicator of Islet Cell Viability Than the ADP/ATP Ratio

    Science.gov (United States)

    Suszynski, T.M.; Wildey, G.M.; Falde, E.J.; Cline, G.W.; Maynard, K. Stewart; Ko, N.; Sotiris, J.; Naji, A.; Hering, B.J.; Papas, K.K.

    2009-01-01

    Real-time, accurate assessment of islet viability is critical for avoiding transplantation of nontherapeutic preparations. Measurements of the intracellular ADP/ATP ratio have recently been proposed as useful prospective estimates of islet cell viability and potency. However, dead cells may be rapidly depleted of both ATP and ADP, which would render the ratio incapable of accounting for dead cells. Since the DNA of dead cells is expected to remain stable over prolonged periods of time (days), we hypothesized that the ATP/DNA ratio would take dead cells into account and may be a better indicator of islet cell viability than the ADP/ATP ratio. We tested this hypothesis using mixtures of healthy and lethally heat-treated (HT) rat insulinoma cells and human islets. Measurements of ATP/DNA and ADP/ATP from the known mixtures of healthy and HT cells and islets were used to evaluate how well these parameters correlated with viability. The results indicated that ATP and ADP were rapidly (within 1 hour) depleted in HT cells. The fraction of HT cells in a mixture correlated linearly with the ATP/DNA ratio, whereas the ADP/ATP ratio was highly scattered, remaining effectively unchanged. Despite similar limitations of both the ADP/ATP and ATP/DNA ratios, in that ATP levels may fluctuate significantly and reversibly with metabolic stress, the results indicated that ATP/DNA was a better measure of islet viability than the ADP/ATP ratio. PMID:18374063
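    The paper's hypothesis implies a simple estimator: because dead cells lose ATP within about an hour but retain DNA, the viable fraction of a mixture should scale linearly with its ATP/DNA ratio. A minimal sketch, with a hypothetical reference ratio for a fully viable preparation:

```python
def viability_from_atp_dna(atp, dna, ref_ratio):
    """Estimate the fraction of viable cells from the ATP/DNA ratio,
    assuming dead cells lose ATP but retain DNA. `ref_ratio` is the
    ATP/DNA ratio of a fully viable reference preparation (hypothetical)."""
    return (atp / dna) / ref_ratio

# Hypothetical mixture: viable cells contribute all the ATP, all cells the DNA
v = viability_from_atp_dna(atp=6.0, dna=10.0, ref_ratio=1.0)
```

    The caveat noted in the abstract applies here too: ATP levels can fluctuate reversibly with metabolic stress, so the reference ratio must come from cells in a comparable metabolic state.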

  1. [Comparison study on subjective and objective measurements of the accommodative convergence to accommodation ratio].

    Science.gov (United States)

    Xu, Jing-jing; Xu, Dan; Huang, Tao; Jiang, Jian; Lü, Fan

    2012-05-01

    To determine the accommodative convergence to accommodation (AC/A) ratios measured by objective and subjective methods, and to explore the differences between them and the related factors. Forty young volunteers were measured with an eye tracker to obtain the amount of convergence when fixating targets at 100 cm, 50 cm, 33 cm and 25 cm, and with an infrared auto-refractor to obtain the corresponding accommodative responses. AC/A ratios based on these two measurements were compared with the calculated and gradient AC/A ratios from Von Graefe tests. The mean stimulated AC/A ratio measured by eye tracker was higher than the calculated and gradient AC/A ratios obtained by the Von Graefe method (P = 0.003, 0.001). There were a statistically significant correlation (r = 0.871, P = 0.000) and a significant difference (P = 0.000) between the stimulated and response AC/A ratios, both measured by eye tracker, and the difference tends to be greater at higher AC/A ratios. The objective AC/A ratio is usually higher than the clinical subjective measurement because of a greater proximal effect. The response AC/A ratio measured objectively may more realistically reveal the mutual interaction between accommodation and convergence, and it seems more credible as a parameter for monitoring the progression of myopia in clinics.
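    The two clinical AC/A formulas involved here are standard textbook ones; the numbers below are hypothetical and only illustrate the arithmetic, not the study's data:

```python
def gradient_aca(dev_with_lens, dev_without, accom_change):
    """Gradient AC/A: change in deviation (prism diopters, esophoria positive)
    per diopter of accommodative change induced by a lens
    (e.g. a -1.00 D lens induces +1 D of accommodation)."""
    return (dev_with_lens - dev_without) / accom_change

def calculated_aca(ipd_cm, near_dev, dist_dev, accom_demand):
    """Calculated (heterophoria-method) AC/A: interpupillary distance in cm
    plus the near-minus-distance deviation divided by the accommodative
    demand of the near target."""
    return ipd_cm + (near_dev - dist_dev) / accom_demand

g = gradient_aca(dev_with_lens=2.0, dev_without=-2.0, accom_change=1.0)
c = calculated_aca(ipd_cm=6.0, near_dev=-3.0, dist_dev=0.0, accom_demand=2.5)
```

    The objective measurements in the study replace the subjective phoria values in these formulas with eye-tracker convergence and auto-refractor accommodative responses.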

  2. An investigation of the effect of load ratio on near-threshold fatigue crack propagation in a Ni-Base superalloy

    International Nuclear Information System (INIS)

    Schooling, J.M.; Reed, P.A.S.

    1995-01-01

    The near-threshold fatigue crack growth behavior of Waspaloy has been investigated to elucidate parameters relevant to the development of a modelling program for fatigue behavior in Ni-base superalloys. At low values of the load ratio, R, threshold stress intensity values are found to be highly sensitive to R. This behavior is rationalized in terms of roughness-induced crack closure. At high load ratios there is less sensitivity to R, and stage II behavior appears to persist to threshold. The threshold stress intensity at high R ratios is lower than that for closure-corrected stage I (low load ratio) threshold behavior, indicating the existence of two intrinsic threshold values. This difference appears to be due not only to crack branching and deflection in stage I, but also to an intrinsic difference in resistance to threshold behavior between the two growth modes. (author)

  3. A Test for the Presence of a Signal

    OpenAIRE

    Rolke, Wolfgang A.; Lopez, Angel M.

    2006-01-01

    We describe a statistical hypothesis test for the presence of a signal based on the likelihood ratio statistic. We derive the test for a case of interest and also show that for that case the test works very well, even far out in the tails of the distribution. We also study extensions of the test to cases where there are multiple channels.
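    For the canonical counting-experiment case (an observed Poisson count n over a known background b, with the signal constrained to be non-negative), the likelihood ratio statistic has a simple closed form, and by Wilks-type asymptotics its square root approximates the one-sided significance in standard deviations. A sketch with hypothetical numbers, not the specific case derived in the paper:

```python
import math

def lr_statistic(n, b):
    """-2 log likelihood ratio for a Poisson count n over known background b,
    testing H0: signal = 0 against H1: signal >= 0 (textbook case)."""
    if n <= b:
        return 0.0  # boundary case: the MLE of the signal is zero
    return 2.0 * ((b - n) + n * math.log(n / b))

q0 = lr_statistic(n=25, b=10)
# One-sided significance approximated via the half-chi-square (1 d.o.f.) limit
z = math.sqrt(q0)
```

    Far out in the tails this asymptotic approximation degrades, which is exactly the regime the paper examines for its case of interest.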

  4. Reducing test-data volume and test-power simultaneously in LFSR reseeding-based compression environment

    Energy Technology Data Exchange (ETDEWEB)

    Wang Weizheng; Kuang Jishun; You Zhiqiang; Liu Peng, E-mail: jshkuang@163.com [College of Information Science and Engineering, Hunan University, Changsha 410082 (China)

    2011-07-15

    This paper presents a new test scheme based on scan-block encoding in a linear feedback shift register (LFSR) reseeding-based compression environment, together with a novel scan-block clustering algorithm. The main contribution is a flexible test-application framework that achieves significant reductions both in switching activity during scan shift and in the number of specified bits that must be generated via LFSR reseeding. Thus, it can significantly reduce test power and test data volume. Experimental results using the Mintest test set on the larger ISCAS'89 benchmarks show that the proposed method reduces switching activity significantly, by 72%-94%, and provides a test compression of 74%-94% with little hardware overhead. (semiconductor integrated circuits)
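    The reseeding idea can be illustrated with a toy LFSR: a seed is expanded into a full scan pattern, and a seed is valid for a test cube if the expanded pattern matches every specified bit. The 4-bit register, tap positions, and test cube below are hypothetical; a real flow solves GF(2) linear equations to find the seed rather than checking one by hand.

```python
def lfsr_expand(seed_bits, taps, length):
    """Expand an LFSR seed into a test pattern (Fibonacci-style sketch).
    `taps` are the state indices XORed to form the feedback bit."""
    state = list(seed_bits)
    out = []
    for _ in range(length):
        out.append(state[-1])          # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift in the feedback bit
    return out

def matches_cube(pattern, cube):
    """A test cube specifies some bits ('0'/'1'); 'x' marks don't-care bits,
    which is what makes few-specified-bit cubes cheap to encode as seeds."""
    return all(c == 'x' or int(c) == p for p, c in zip(pattern, cube))

pat = lfsr_expand([1, 0, 0, 1], taps=[0, 3], length=8)
ok = matches_cube(pat, "1xx1xxx1")  # hypothetical cube with 3 specified bits
```

    The fewer specified bits a cube has, the more freedom there is in choosing a seed, which is why reducing specified bits (as the paper's encoding does) directly improves compression.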

  5. Risk Based Optimal Fatigue Testing

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M.H.; Kroon, I.B.

    1992-01-01

    Optimal fatigue life testing of materials is considered. Based on minimization of the total expected costs of a mechanical component a strategy is suggested to determine the optimal stress range levels for which additional experiments are to be performed together with an optimal value...

  6. Adult age differences in perceptually based, but not conceptually based implicit tests of memory.

    Science.gov (United States)

    Small, B J; Hultsch, D F; Masson, M E

    1995-05-01

    Implicit tests of memory assess the influence of recent experience without requiring awareness of remembering. Evidence concerning age differences on implicit tests of memory suggests small age differences in favor of younger adults. However, the majority of research examining this issue has relied upon perceptually based implicit tests. Recently, a second type of implicit test, one that relies upon conceptually based processes, has been identified. The pattern of age differences on this second type of implicit test is less clear. In the present study, we examined the pattern of age differences on one conceptually based (fact completion) and one perceptually based (stem completion) implicit test of memory, as well as two explicit tests of memory (fact and word recall). Tasks were administered to 403 adults from three age groups (19-34 years, 58-73 years, 74-89 years). Significant age differences in favor of the young were found on stem completion but not fact completion. Age differences were present for both word and fact recall. Correlational analyses examining the relationship of memory performance to other cognitive variables indicated that the implicit tests were supported by different components than the explicit tests, as well as being different from each other.

  7. Person fit for test speededness: normal curvatures, likelihood ratio tests and empirical Bayes estimates

    NARCIS (Netherlands)

    Goegebeur, Y.; de Boeck, P.; Molenberghs, G.

    2010-01-01

    The local influence diagnostics, proposed by Cook (1986), provide a flexible way to assess the impact of minor model perturbations on key model parameters’ estimates. In this paper, we apply the local influence idea to the detection of test speededness in a model describing nonresponse in test data,

  8. Application of Performance Ratios in Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    Aleš Kresta

    2015-01-01

    Full Text Available The cornerstone of modern portfolio theory was established by the pioneering work of Harry Markowitz. Based on his mean-variance framework, Sharpe formulated his well-known Sharpe ratio, aiming to measure the performance of mutual funds. Growth in computational power has since allowed more complex performance ratios to be applied, ratios that also take into account higher moments of the return probability distribution. Although these ratios were proposed to help investors improve the results of portfolio optimization, we demonstrate empirically in this paper that this may not necessarily be true. On a historical dataset of DJIA components we show empirically that both the Sharpe ratio and the MAD ratio outperformed the Rachev ratio. However, for the Rachev ratio we assumed only one level of parameter values. Different parameter set-ups may provide different results, and thus further analysis is certainly required.

  9. Geometrical error calibration in reflective surface testing based on reverse Hartmann test

    Science.gov (United States)

    Gong, Zhidong; Wang, Daodang; Xu, Ping; Wang, Chao; Liang, Rongguang; Kong, Ming; Zhao, Jun; Mo, Linhai; Mo, Shuhui

    2017-08-01

    In fringe-illumination deflectometry based on the reverse-Hartmann-test configuration, ray tracing of the modeled testing system is performed to reconstruct the test surface error. Careful calibration of the system geometry is required to achieve high testing accuracy. To realize high-precision surface testing with the reverse Hartmann test, a computer-aided geometrical error calibration method is proposed. The aberrations corresponding to various geometrical errors are studied. Using the aberration weights for the various geometrical errors, computer-aided optimization of the system geometry with iterative ray tracing is carried out to calibrate the geometrical error, and accuracy on the order of subnanometers is achieved.

  10. MARKETING MIX BY BED OCCUPANCY RATIO (BOR

    Directory of Open Access Journals (Sweden)

    Abdul Muhith

    2017-04-01

    Introduction: The Bed Occupancy Ratio (BOR) in RSI Arafah Mojosari during the last three years was below the ideal rate and the lowest of the three hospitals in the Mojosari area. The purpose of this study was to determine the relationship between the marketing mix and the Bed Occupancy Ratio in RSI Arafah Mojosari. Methods: This research used an analytic method with a cross-sectional approach. The variables in the study were the marketing mix and the Bed Occupancy Ratio (BOR). The population was all patients hospitalized in RSI Arafah Mojosari. A sample of 44 respondents was taken by the stratified random sampling technique. Data were collected using a questionnaire and analyzed using Fisher's exact test. Result: More than 50% of respondents (59.1%) rated the marketing mix developed by the hospital management well, and the majority of respondents (79.5%) were in treatment rooms with a BOR that was not ideal. Fisher's exact test gave a probability value of 0.02 < 0.05, so H0 is rejected, which means there is a relationship between the marketing mix and the Bed Occupancy Ratio in RSI Arafah Mojosari. Discussion: Hospitals that develop the marketing mix well can attract consumers to use inpatient services, and the BOR value will increase with the increased use of inpatient services. Hospital management must be able to formulate a good marketing mix strategy so that the hospital's marketing objectives can be achieved. Conformity between service quality and service rates must be addressed; likewise, extensive media promotion can attract patients to inpatient services.

  11. Simultaneous estimation of Poisson's ratio and Young's modulus using a single indentation: a finite element study

    International Nuclear Information System (INIS)

    Zheng, Y P; Choi, A P C; Ling, H Y; Huang, Y P

    2009-01-01

    Indentation is commonly used to determine the mechanical properties of different kinds of biological tissues and engineering materials. With the force–deformation data obtained from an indentation test, Young's modulus of the tissue can be calculated using a linear elastic indentation model with a known Poisson's ratio. A novel method for simultaneous estimation of Young's modulus and Poisson's ratio of the tissue using a single indentation is proposed in this study. Finite element (FE) analysis using 3D models was first used to establish the relationship between Poisson's ratio and the deformation-dependent indentation stiffness for different aspect ratios (indentor radius/tissue original thickness) in the indentation test. From the FE results, it was found that the deformation-dependent indentation stiffness increased linearly with the deformation. Poisson's ratio could be extracted from the deformation-dependent indentation stiffness obtained from the force–deformation data, and Young's modulus was then calculated with the estimated Poisson's ratio. The feasibility of this method was demonstrated by using indentation models with different material properties in the FE analysis. The numerical results showed that the percentage errors of the estimated Poisson's ratios and the corresponding Young's moduli ranged from −1.7% to −3.2% and 3.0% to 7.2%, respectively, for aspect ratios (indentor radius/tissue thickness) larger than 1. It is expected that this novel method can potentially be used for quantitative assessment of various kinds of engineering materials and biological tissues, such as articular cartilage.
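    The final step, recovering Young's modulus once Poisson's ratio is known, is commonly done with the Hayes linear elastic indentation model, which analyses of this kind typically build on. The scaling factor kappa (a tabulated function of the aspect ratio and Poisson's ratio in Hayes et al.) and all numbers below are illustrative assumptions, not values from the paper:

```python
def youngs_modulus_hayes(force, deformation, radius, nu, kappa):
    """Young's modulus from a flat-ended indentation via the Hayes model,
    E = P (1 - nu^2) / (2 a w kappa), with force P, deformation w,
    indentor radius a, and tabulated scaling factor kappa (assumed here)."""
    return force * (1.0 - nu**2) / (2.0 * radius * deformation * kappa)

# Hypothetical numbers: 0.5 N at 0.3 mm deformation, 4.5 mm indentor radius
E = youngs_modulus_hayes(force=0.5, deformation=0.3e-3,
                         radius=4.5e-3, nu=0.45, kappa=1.8)
```

    The paper's contribution is precisely that nu need not be assumed: it is estimated first from the slope of the deformation-dependent stiffness, and only then substituted into a relation of this form.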

  12. Internal jugular vein: Peripheral vein adrenocorticotropic hormone ratio in patients with adrenocorticotropic hormone-dependent Cushing′s syndrome: Ratio calculated from one adrenocorticotropic hormone sample each from right and left internal jugular vein during corticotrophin releasing hormone stimulation test

    Directory of Open Access Journals (Sweden)

    Sachin Chittawar

    2013-01-01

    Background: Demonstration of a central:peripheral adrenocorticotropic hormone (ACTH) gradient is important for the diagnosis of Cushing's disease. Aim: The aim was to assess the utility of the internal jugular vein (IJV):peripheral vein ACTH ratio for the diagnosis of Cushing's disease. Materials and Methods: Patients with ACTH-dependent Cushing's syndrome (CS) were the subjects of this study. One blood sample each was collected from the right and left IJV following intravenous hCRH at 3 and 5 min, respectively. A simultaneous peripheral vein sample was also collected with each IJV sample for calculation of the IJV:peripheral vein ACTH ratio. IJV sample collection was done under ultrasound guidance. ACTH was assayed using electrochemiluminescence immunoassay (ECLIA). Results: Thirty-two patients participated in this study. The IJV:peripheral vein ACTH ratio ranged from 1.07 to 6.99 (n = 32). It was more than 1.6 in 23 patients. Cushing's disease could be confirmed in 20 of the 23 cases with an IJV:peripheral vein ratio of more than 1.6. Four patients with Cushing's disease and 2 patients with ectopic ACTH syndrome had an IJV:peripheral vein ACTH ratio of less than 1.6. Six cases with an unknown ACTH source were excluded from the calculation of sensitivity and specificity of the test. Conclusion: The IJV:peripheral vein ACTH ratio calculated from a single sample from each IJV obtained after hCRH had 83% sensitivity and 100% specificity for the diagnosis of CD.

  13. Urine Test: Microalbumin-to-Creatinine Ratio (For Parents)

    Science.gov (United States)

    ... could interfere with test results. Be sure to review all your child's medications with your doctor. The Procedure Your child will be asked to urinate (pee) into a clean sample cup in the doctor's office or at home. Collecting the specimen should only take a few minutes. If your child isn' ...

  14. Enhancing SAT-Based Test Pattern Generation

    Institute of Scientific and Technical Information of China (English)

    LIU Xin; XIONG You-lun

    2005-01-01

    This paper presents modeling tools based on Boolean satisfiability (SAT) to solve problems of test generation for combinational circuits. It adds a layer that maintains circuit-related information and value-justification relations on top of a generic SAT algorithm, and it dovetails binary decision diagrams (BDDs) and SAT techniques to improve the efficiency of automatic test pattern generation (ATPG). More specifically, it first exploits inexpensive reconvergent fanout analysis of the circuit to gather information on local signal correlation by using BDD learning, then uses the learned information to restrict and focus the overall search space of SAT-based ATPG. The learning technique is effective and lightweight. The experimental results demonstrate the effectiveness of the approach.
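    The standard reduction behind SAT-based ATPG is the miter construction: a vector tests a fault exactly when the XOR of the fault-free and faulty circuit outputs is satisfiable. The toy circuit and fault label below are hypothetical, and brute-force enumeration stands in for a real SAT solver:

```python
from itertools import product

def circuit(a, b, c, fault=None):
    """Tiny combinational circuit: y = (a AND b) OR c.
    fault="g1/0" forces the AND-gate output stuck-at-0 (hypothetical netlist)."""
    g1 = a & b
    if fault == "g1/0":
        g1 = 0
    return g1 | c

def find_test_vector(fault):
    """A vector tests the fault iff the miter (good XOR faulty) evaluates
    to 1; here checked by exhaustive enumeration instead of a SAT solver."""
    for a, b, c in product([0, 1], repeat=3):
        if circuit(a, b, c) ^ circuit(a, b, c, fault):
            return (a, b, c)
    return None

vec = find_test_vector("g1/0")
```

    The returned vector both activates the fault (a = b = 1) and propagates it (c = 0, so the OR gate does not mask the difference), which is exactly the justification/propagation structure the paper's added layer tracks for the SAT engine.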

  15. Effects of Stress Ratio and Microstructure on Fatigue Failure Behavior of Polycrystalline Nickel Superalloy

    Science.gov (United States)

    Zhang, H.; Guan, Z. W.; Wang, Q. Y.; Liu, Y. J.; Li, J. K.

    2018-05-01

    The effects of microstructure and stress ratio on the high cycle fatigue behavior of the nickel superalloy Nimonic 80A were investigated. Stress ratios of 0.1, 0.5 and 0.8 were chosen for fatigue tests at a frequency of 110 Hz. Cleavage failure was observed, and three competing crack initiation modes were identified by scanning electron microscopy, classified as surface without facets, surface with facets, and subsurface with facets. As the stress ratio increased from 0.1 to 0.8, the occurrence probability of surface and subsurface facet initiation increased, reaching its maximum at R = 0.5, while the probability of surface initiation without facets decreased. The effect of microstructure on the fatigue fracture behavior at different stress ratios was also observed and discussed. Based on the Goodman diagram, it was concluded that the fatigue strength at 50% probability of failure at R = 0.1, 0.5 and 0.8 is lower than that given by the modified Goodman line.
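    The modified Goodman assessment mentioned above combines a mean-stress correction with the R-ratio decomposition of a load cycle. A minimal sketch with hypothetical material properties (not Nimonic 80A data):

```python
def goodman_allowable_amplitude(sigma_m, sigma_e, sigma_u):
    """Allowable alternating stress from the modified Goodman line:
    sigma_a / sigma_e + sigma_m / sigma_u = 1."""
    return sigma_e * (1.0 - sigma_m / sigma_u)

def mean_and_amplitude(sigma_max, R):
    """Convert maximum stress and stress ratio R = sigma_min / sigma_max
    into mean stress and stress amplitude."""
    sigma_min = R * sigma_max
    return 0.5 * (sigma_max + sigma_min), 0.5 * (sigma_max - sigma_min)

# Hypothetical properties (MPa): endurance limit 400, ultimate strength 1200
sm, sa = mean_and_amplitude(sigma_max=800.0, R=0.5)
allow = goodman_allowable_amplitude(sm, sigma_e=400.0, sigma_u=1200.0)
safe = sa <= allow
```

    Raising R at fixed maximum stress increases the mean stress and shrinks the amplitude, which is why the Goodman line predicts different fatigue strengths at R = 0.1, 0.5 and 0.8; the paper finds the measured strengths fall below this prediction.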

  16. Performance of a high-work low aspect ratio turbine tested with a realistic inlet radial temperature profile

    Science.gov (United States)

    Stabe, R. G.; Whitney, W. J.; Moffitt, T. P.

    1984-01-01

    Experimental results are presented for a 0.767 scale model of the first stage of a two-stage turbine designed for a high by-pass ratio engine. The turbine was tested with both uniform inlet conditions and with an inlet radial temperature profile simulating engine conditions. The inlet temperature profile was essentially mixed-out in the rotor. There was also substantial underturning of the exit flow at the mean diameter. Both of these effects were attributed to strong secondary flows in the rotor blading. There were no significant differences in the stage performance with either inlet condition when differences in tip clearance were considered. Performance was very close to design intent in both cases. Previously announced in STAR as N84-24589

  17. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.; Liang, F.; Ciampa, J.; Chatterjee, N.

    2011-01-01

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated
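The abstract is truncated, but the repeated-resampling idea it describes can be illustrated with a minimal two-sample permutation test on hypothetical data (a sketch of the general technique, not the authors' efficiency improvement):

```python
import random

def permutation_pvalue(x, y, n_perm=999, seed=0):
    """Two-sample permutation test on the absolute difference in means."""
    rng = random.Random(seed)
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        xs, ys = pooled[:len(x)], pooled[len(x):]
        if abs(sum(xs) / len(xs) - sum(ys) / len(ys)) >= obs:
            count += 1
    # add-one correction keeps the p-value strictly positive
    return (count + 1) / (n_perm + 1)
```

The cost of the loop is exactly the problem the paper addresses: each additional digit of p-value precision multiplies the number of resamples required.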

  18. Effects of cathodic protection potential and stress ratio on fatigue thresholds of structural steels in sea water

    Energy Technology Data Exchange (ETDEWEB)

    Dolphin, A.S.; Tice, D.R.

    1987-09-01

    The results reported here suggest that the very high thresholds found under decreasing ΔK conditions may be inapplicable under the increasing ΔK conditions likely to be more relevant to real structures. This conclusion is based on just two tests at -0.85V SCE, and so requires confirmation over a range of R ratios and at free-corrosion and overprotection potentials. Crack growth thresholds appear to be higher under cathodic overprotection conditions (-1.05V SCE) than at more positive potentials, due to calcareous scale formation. Tests at negative R ratios are required to confirm that this calcareous scale would remain intact under compressive loading. Given the large observed influence of calcareous scale on crack growth, and particularly on the arrest of growing cracks, more detailed microstructural examination is recommended on the specimens tested in this programme. (author).

  19. Worldwide Research, Worldwide Participation: Web-Based Test Logger

    Science.gov (United States)

    Clark, David A.

    1998-01-01

    Thanks to the World Wide Web, a new paradigm has been born. ESCORT (steady state data system) facilities can now be configured to use a Web-based test logger, enabling worldwide participation in tests. NASA Lewis Research Center's new Web-based test logger for ESCORT automatically writes selected test and facility parameters to a browser and allows researchers to insert comments. All data can be viewed in real time via Internet connections, so anyone with a Web browser and the correct URL (uniform resource locator, or Web address) can interactively participate. As the test proceeds and ESCORT data are taken, Web browsers connected to the logger are updated automatically. The use of this logger has demonstrated several benefits. First, researchers are free from manual data entry and are able to focus more on the tests. Second, research logs can be printed in report format immediately after (or during) a test. And finally, all test information is readily available to an international public.

  20. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.
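One example of the graph-based structural checks described, sketched here as a generic cycle detector. The choice of cycle detection as the integrity condition is an assumption about the kind of property such a toolset might verify:

```python
def has_cycle(graph):
    """DFS cycle detection on a directed graph {node: [successors]} --
    one structural integrity check a knowledge-base validator might run."""
    state = {}  # node -> "visiting" (on the DFS stack) or "done"
    def visit(n):
        if state.get(n) == "visiting":
            return True          # back edge: a cycle exists
        if state.get(n) == "done":
            return False
        state[n] = "visiting"
        if any(visit(m) for m in graph.get(n, [])):
            return True
        state[n] = "done"
        return False
    return any(visit(n) for n in graph)
```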

  1. Does the Test Work? Evaluating a Web-Based Language Placement Test

    Science.gov (United States)

    Long, Avizia Y.; Shin, Sun-Young; Geeslin, Kimberly; Willis, Erik W.

    2018-01-01

    In response to the need for examples of test validation from which everyday language programs can benefit, this paper reports on a study that used Bachman's (2005) assessment use argument (AUA) framework to examine evidence to support claims made about the intended interpretations and uses of scores based on a new web-based Spanish language…

  2. Evaporative Air Coolers Optimization for Energy Consumption Reduction and Energy Efficiency Ratio Increment

    OpenAIRE

    Leila Torkaman; Nasser Ghassembaglou

    2015-01-01

    A significant share of municipal electrical energy consumption is related to decentralized air conditioning, which is mostly provided by evaporative coolers. The aim is therefore to optimize the design of air conditioners to increase their efficiency. To achieve this goal, results of standardized practical tests for 40 evaporative coolers of different types were collected and, simultaneously, results for the same coolers based on one of the EER (Energy Efficiency Ratio) modeling styles are figured ...

  3. Planned Enhanced Wakefield Transformer Ratio Experiment at Argonne Wakefield Accelerator

    CERN Document Server

    Kanareykin, Alex; Gai, Wei; Jing, Chunguang; Konecny, Richard; Power, John G

    2005-01-01

    In this paper, we present a preliminary experimental study of a wakefield accelerating scheme that uses a carefully spaced, current-ramped electron pulse train to produce wakefields that increase the transformer ratio to well above 2. A dielectric structure was designed and fabricated to operate at 13.625 GHz with a dielectric constant of 15.7. The structure will initially be excited by two beams with a first-to-second beam charge ratio of 1:3. The expected transformer ratio is 3, and the setup can easily be extended to 4 pulses, which leads to a transformer ratio of more than 6. Cold tests of the dielectric structure show the tube is within specification. A set of laser splitters was also tested to produce a ramped bunch train of 2-4 pulses. The overall design of the experiment and initial results will be presented.

  4. Effect of an online video-based intervention to increase HIV testing in men who have sex with men in Peru.

    Directory of Open Access Journals (Sweden)

    Magaly M Blas

    2010-05-01

    Full Text Available Although many men who have sex with men (MSM) in Peru are unaware of their HIV status, they are frequent users of the Internet and can be approached through that medium to promote HIV testing. We conducted an online randomized controlled trial to compare the effect of HIV-testing motivational videos versus standard public health text, both offered through a gay website. The videos were customized for two audiences based on self-identification: either gay or non-gay men. The outcomes evaluated were 'intention to get tested' and 'HIV testing at the clinic.' In the non-gay-identified group, 97 men were randomly assigned to the video-based intervention and 90 to the text-based intervention. Non-gay-identified participants randomized to the video-based intervention were more likely to report their intention of getting tested for HIV within the next 30 days (62.5% vs. 15.4%; Relative Risk (RR): 2.77; 95% Confidence Interval (CI): 1.42-5.39). After a mean of 125.5 days of observation (range 42-209 days), 11 participants randomized to the video and none of the participants randomized to text attended our clinic requesting HIV testing (p = 0.001). In the gay-identified group, 142 men were randomized to the video-based intervention and 130 to the text-based intervention. Gay-identified participants randomized to the video were more likely to report intentions of getting an HIV test within 30 days, although not significantly (50% vs. 21.6%; RR: 1.54; 95% CI: 0.74-3.20). At the end of follow-up, 8 participants who watched the video and 10 who read the text visited our clinic for HIV testing (Hazard Ratio: 1.07, 95% CI: 0.40-2.85). This study provides some evidence of the efficacy of a video-based online intervention in improving HIV testing among non-gay-identified MSM in Peru. This intervention may be adopted by institutions with websites oriented to motivating HIV testing among similar MSM populations. Clinicaltrials.gov NCT00751192.
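For readers unfamiliar with the relative risk figures quoted above, a generic computation of an RR with a Wald-type log confidence interval follows. The counts used are hypothetical, not the trial's data:

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """RR of the exposed group (a events of n1) vs control (b events of n2),
    with a Wald confidence interval on the log scale."""
    p1, p2 = a / n1, b / n2
    rr = p1 / p2
    se = math.sqrt((1 - p1) / a + (1 - p2) / b)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```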

  5. GPS Device Testing Based on User Performance Metrics

    Science.gov (United States)

    2015-10-02

    1. Rationale for a Test Program Based on User Performance Metrics ; 2. Roberson and Associates Test Program ; 3. Status of, and Revisions to, the Roberson and Associates Test Program ; 4. Comparison of Roberson and DOT/Volpe Programs

  6. Testing ESL sociopragmatics development and validation of a web-based test battery

    CERN Document Server

    Roever, Carsten; Elder, Catherine

    2014-01-01

    Testing of second language pragmatics has grown as a research area but still suffers from a tension between construct coverage and practicality. In this book, the authors describe the development and validation of a web-based test of second language pragmatics for learners of English. The test has a sociopragmatic orientation and strives for broad coverage of the construct by assessing learners' metapragmatic judgments as well as their ability to co-construct discourse. To ensure practicality, the test is delivered online and is scored partially automatically and partially by human raters.

  7. Investigation of SiO{sub 2}:Na{sub 2}O ratio as a corrosion inhibitor for metal alloys

    Energy Technology Data Exchange (ETDEWEB)

    Mohamad, N.; Othman, N. K. [School of Applied Physics, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, 43600 UKM Bangi, Selangor Darul Ehsan (Malaysia); Jalar, A. [Institute of Micro Engineering and Nanoelectronics (IMEN), Universiti Kebangsaan Malaysia, 43600 UKM Bangi, Selangor Darul Ehsan (Malaysia)

    2013-11-27

    Silicate is one of the potential compounds used as a corrosion inhibitor for metal alloys. Mixing silica with sodium hydroxide (NaOH) yields the silicate product, and the formulation of a silicate product normally varies depending on the SiO2:Na2O ratio. This research utilized an agricultural waste product of paddy, rice husk. The amorphous silica content of rice husk ash was used after the rice husk was burnt in a muffle furnace at a certain temperature. X-ray diffraction (XRD) analysis was performed to confirm the amorphous phase of silica in the rice husk ash; several studies have recognized rice husk as an alternative source with high silica content. X-ray fluorescence (XRF) analysis was carried out to quantify the Si and O elements, which represent the silica compound in the rice husk ash. Sodium silicate formulations were prepared with different SiO2:Na2O ratios (1.00, 2.00 and 3.00). These silicate-based corrosion inhibitors were tested on several samples: copper (99.9%), aluminum alloy (AA 6061) and carbon steel (SAE 1045). The purpose of this study is to determine the appropriate SiO2:Na2O ratio and to understand how this ratio affects the corrosion rate of each metal alloy immersed in acidic medium. To this end, a weight loss test was conducted in 0.5 M hydrochloric acid (HCl) for 24 hours at room temperature.

  8. Empirical likelihood-based confidence intervals for the sensitivity of a continuous-scale diagnostic test at a fixed level of specificity.

    Science.gov (United States)

    Gengsheng Qin; Davis, Angela E; Jing, Bing-Yi

    2011-06-01

    For a continuous-scale diagnostic test, it is often of interest to find the range of the sensitivity of the test at the cut-off that yields a desired specificity. In this article, we first define a profile empirical likelihood ratio for the sensitivity of a continuous-scale diagnostic test and show that its limiting distribution is a scaled chi-square distribution. We then propose two new empirical likelihood-based confidence intervals for the sensitivity of the test at a fixed level of specificity by using the scaled chi-square distribution. Simulation studies are conducted to compare the finite sample performance of the newly proposed intervals with the existing intervals for the sensitivity in terms of coverage probability. A real example is used to illustrate the application of the recommended methods.
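The paper's empirical likelihood machinery is too involved for a short example, but the quantity it targets, sensitivity at a fixed specificity, can be estimated simply. This is a naive empirical sketch with hypothetical scores, not the authors' interval method:

```python
import math

def sensitivity_at_specificity(diseased_scores, healthy_scores, specificity):
    """Naive point estimate: take the cutoff as the empirical
    `specificity`-quantile of healthy scores, then score the diseased group."""
    h = sorted(healthy_scores)
    k = max(math.ceil(specificity * len(h)) - 1, 0)
    cutoff = h[k]  # at least `specificity` of healthy scores are <= cutoff
    return sum(s > cutoff for s in diseased_scores) / len(diseased_scores)
```

The sampling variability of this estimate is exactly what the proposed confidence intervals quantify.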

  9. Establishing a sample-to-cut-off ratio for lab-diagnosis of hepatitis C virus in Indian context

    Directory of Open Access Journals (Sweden)

    Aseem K Tiwari

    2015-01-01

    Full Text Available Introduction: Lab-diagnosis of hepatitis C virus (HCV) is based on detecting specific antibodies by enzyme immunoassay (EIA) or chemiluminescence immunoassay (CIA). The Centers for Disease Control reported that signal-to-cut-off (s/co) ratios in anti-HCV antibody tests like EIA/CIA can be used to predict the probable result of a supplemental test; above a certain s/co value the result is most likely a true HCV positive, and below that value it is most likely a false positive. A prospective study was undertaken in patients in a tertiary care setting to establish this s/co value. Materials and Methods: The study was carried out in consecutive patients requiring HCV testing for screening/diagnosis and medical management. These samples were tested for anti-HCV on CIA (VITROS® Anti-HCV assay, Ortho-Clinical Diagnostics, New Jersey) to calculate the s/co value. The supplemental nucleic acid test used was polymerase chain reaction (PCR) (Abbott). PCR test results were used to define true negatives, false negatives, true positives, and false positives. The performance of different putative s/co ratios versus PCR was measured using sensitivity, specificity, positive predictive value and negative predictive value, and the most appropriate s/co was chosen on the basis of the highest specificity at a sensitivity of at least 95%. Results: An s/co ratio of ≥6 worked out to be over 95% sensitive and almost 92% specific in 438 consecutive patient samples tested. Conclusion: An s/co ratio of six can be used for lab-diagnosis of HCV infection; those with s/co higher than six can be diagnosed as having HCV infection without any need for supplemental assays.
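The cutoff evaluation described in Materials and Methods can be sketched as follows. The s/co values and PCR labels here are hypothetical, chosen only to exercise the function:

```python
def cutoff_performance(sco_values, pcr_positive, cutoff):
    """Sensitivity/specificity/PPV/NPV of the screen (s/co >= cutoff)
    against the PCR reference standard."""
    pairs = list(zip(sco_values, pcr_positive))
    tp = sum(s >= cutoff and p for s, p in pairs)
    fp = sum(s >= cutoff and not p for s, p in pairs)
    fn = sum(s < cutoff and p for s, p in pairs)
    tn = sum(s < cutoff and not p for s, p in pairs)
    return {"sens": tp / (tp + fn), "spec": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}
```

Sweeping `cutoff` over candidate values and keeping the highest specificity subject to sensitivity ≥ 95% reproduces the selection rule in the abstract.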

  10. A test of the mean density approximation for Lennard-Jones mixtures with large size ratios

    International Nuclear Information System (INIS)

    Ely, J.F.

    1986-01-01

    The mean density approximation for mixture radial distribution functions plays a central role in modern corresponding-states theories. This approximation is reasonably accurate for systems that do not differ widely in size and energy ratios and which are nearly equimolar. As the size ratio increases, however, or if one approaches an infinite dilution of one of the components, the approximation becomes progressively worse, especially for the small molecule pair. In an attempt to better understand and improve this approximation, isothermal molecular dynamics simulations have been performed on a series of Lennard-Jones mixtures. Thermodynamic properties, including the mixture radial distribution functions, have been obtained at seven compositions ranging from 5 to 95 mol%. In all cases the size ratio was fixed at two, and three energy ratios were investigated: ε22/ε11 = 0.5, 1.0, and 1.5. The results of the simulations are compared with the mean density approximation and a modification to integrals evaluated with the mean density approximation is proposed
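For context, the Lennard-Jones pair potential and the Lorentz-Berthelot combining rules commonly used to set unlike-pair size and energy parameters in such mixtures. This is a textbook aside, not a detail taken from the paper:

```python
def lj_potential(r, sigma, eps):
    """12-6 Lennard-Jones pair potential u(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

def lorentz_berthelot(sig1, sig2, eps1, eps2):
    """Cross parameters: arithmetic mean of sigmas, geometric mean of epsilons."""
    return 0.5 * (sig1 + sig2), (eps1 * eps2) ** 0.5
```

With a size ratio of two, σ22/σ11 = 2 gives a cross diameter σ12 = 1.5σ11 under these rules.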

  11. [Antibiotics prescription and complementary tests based on frequency of use and loyalty in Primary Care].

    Science.gov (United States)

    Balaguer Martínez, Josep Vicent; Del Castillo Aguas, Guadalupe; Gallego Iborra, Ana

    2017-12-30

    To assess whether there is a relationship between the prescription of antibiotics and the performance of complementary tests and the frequency of use and loyalty in Primary Care. Analytical descriptive study performed through a network of Primary Care sentinel paediatricians (PAPenRed). Each paediatrician reviewed the spontaneous visits (in Primary Care and in Emergency Departments) of 15 patients over 12 months, randomly chosen from their quota. The prescription of antibiotics and the complementary tests performed on these patients were also recorded. A total of 212 paediatricians took part and reviewed 2,726 patients. It was found that 8.3% were moderate over-users (mean + 1-2 standard deviations) and 5.2% extreme over-users (mean + 2 standard deviations). Almost half (49.6%) were high-loyalty patients (more than 75% of visits with their doctor). The incidence ratio of antibiotic prescriptions was 2.13 (1.74-2.62) for moderate over-users and 3.25 (2.55-4.13) for extreme over-users, compared to non-over-user children. The incidence ratios for diagnostic tests were 2.25 (1.86-2.73) and 3.48 (2.78-4.35), respectively. The incidence ratios for antibiotic prescription were 1.34 (1.16-1.55) in patients with medium-high loyalty, 1.45 (1.15-1.83) for medium-low loyalty, and 1.08 (0.81-1.44) for those with low loyalty, compared to patients with high loyalty. The incidence ratios for diagnostic tests were 1.46 (1.27-1.67), 1.60 (1.28-2.00), and 0.84 (0.63-1.12), respectively. Antibiotic prescription and complementary tests were significantly related to medical overuse. They were also related to loyalty, but less strongly. Copyright © 2017. Publicado por Elsevier España, S.L.U.

  12. Dynamic moduli and damping ratios of soil evaluated from pressuremeter test

    International Nuclear Information System (INIS)

    Yoshida, Yasuo; Ezashi, Yasuyuki; Kokusho, Takaji; Nishi, Yoshikazu

    1984-01-01

    Dynamic and static properties of soils are investigated using newly developed in-situ test equipment, which imposes dynamic repeated pressure on the borehole wall at any depth over a wide range of strain amplitude. This paper mainly describes the shear modulus and damping characteristics of soils obtained by using the equipment at several sites covering a wide variety of soils. The test results are compared with those obtained by other test methods such as the dynamic triaxial test, the simple shear test and the shear wave velocity test, and their relationships to each other are discussed, demonstrating the efficiency of this in-situ test. (author)

  13. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
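The ratio-then-threshold workflow described above can be sketched with NumPy. This is a schematic reimplementation of the idea, not RATIO_TOOL's actual C code:

```python
import numpy as np

def band_ratio_classes(band_a, band_b, thresholds):
    """Ratio two spectral bands, then segment the ratio image into
    threshold classes (RATIO_TOOL supports two to four classes)."""
    ratio = band_a / np.maximum(band_b, 1e-6)  # guard against divide-by-zero
    classes = np.digitize(ratio, thresholds)   # class indices 0..len(thresholds)
    return ratio, classes
```

`np.histogram(ratio.ravel())` would play the role of the histogram option used to pick the thresholds.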

  14. Corporate prediction models, ratios or regression analysis?

    NARCIS (Netherlands)

    Bijnen, E.J.; Wijn, M.F.C.M.

    1994-01-01

    The models developed in the literature with respect to the prediction of a company's failure are based on ratios. It has been shown before that these models should be rejected on theoretical grounds. Our study of industrial companies in the Netherlands shows that the ratios which are used in

  15. Similarity regularized sparse group lasso for cup to disc ratio computation.

    Science.gov (United States)

    Cheng, Jun; Zhang, Zhuo; Tao, Dacheng; Wong, Damon Wing Kee; Liu, Jiang; Baskaran, Mani; Aung, Tin; Wong, Tien Yin

    2017-08-01

    Automatic cup to disc ratio (CDR) computation from color fundus images has shown promise for glaucoma detection. Over the past decade, many algorithms have been proposed. In this paper, we first review recent work in the area and then present a novel similarity-regularized sparse group lasso method for automated CDR estimation. The proposed method reconstructs the testing disc image from a set of reference disc images by integrating the similarity between the testing and reference disc images with sparse group lasso constraints. The reconstruction coefficients are then used to estimate the CDR of the testing image. The proposed method has been validated using 650 images with manually annotated CDRs. Experimental results show an average CDR error of 0.0616 and a correlation coefficient of 0.7, outperforming other methods. The areas under the curve in the diagnostic test reach 0.843 and 0.837 when manual and automatically segmented discs are used, respectively, again better than other methods.
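The reconstruct-then-estimate idea can be caricatured as follows. Plain least squares stands in for the similarity-regularized sparse group lasso, and all features and CDR values are hypothetical:

```python
import numpy as np

def estimate_cdr(test_feat, ref_feats, ref_cdrs):
    """Reconstruct the test disc's feature vector from reference discs,
    then read off a coefficient-weighted average of the reference CDRs.
    (The paper's method constrains the coefficients with a similarity-
    regularized sparse group lasso instead of plain least squares.)"""
    w, *_ = np.linalg.lstsq(ref_feats.T, test_feat, rcond=None)
    w = np.clip(w, 0.0, None)  # keep only nonnegative contributions
    return float(w @ ref_cdrs / w.sum())
```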

  16. The Optimal Price Ratio of Typical Energy Sources in Beijing Based on the Computable General Equilibrium Model

    Directory of Open Access Journals (Sweden)

    Yongxiu He

    2014-04-01

    Full Text Available In Beijing, China, the rational consumption of energy is hindered by an insufficient linkage mechanism in the energy pricing system, unreasonable price ratios and other issues. Combining the characteristics of Beijing's energy market, this paper puts forward maximization of the society-economy equilibrium indicator R, taking the mitigation cost into consideration, to determine a reasonable price-ratio range. Based on a computable general equilibrium (CGE) model, and dividing four kinds of energy sources into three groups, the impact of price fluctuations of electricity and natural gas on Gross Domestic Product (GDP), the Consumer Price Index (CPI), energy consumption, and CO2 and SO2 emissions is simulated for various scenarios. On this basis, the integrated effects of electricity and natural gas price shocks on the Beijing economy and environment are calculated. The results show that, relative to coal prices, the electricity and natural gas prices in Beijing are currently below reasonable levels; the solution to these unreasonable energy price ratios should begin with improving the energy pricing mechanism, for example by establishing a sound dynamic adjustment mechanism between regulated prices and market prices. This provides a new approach for exploring the rationality of energy price ratios in imperfectly competitive energy markets.

  17. [Analysis of the effect of LDR, NPL and Operational Efficiency Ratio on Return On Assets of foreign exchange banks in Indonesia, 2010-2012]

    Directory of Open Access Journals (Sweden)

    Hamidah Hamidah

    2014-04-01

    Full Text Available This research tests the influence of the Loan to Deposit Ratio (LDR), Non-Performing Loan (NPL) ratio and Operational Efficiency Ratio (OER) on the Return On Assets (ROA) of foreign exchange banks in Indonesia over the period 2010-2012. The sample was selected by purposive sampling from foreign exchange banks in Indonesia. Data were analyzed with multiple linear regression by ordinary least squares, and hypotheses were tested with t-statistics and the F-statistic after a classical assumption examination. The normality, multicollinearity, heteroskedasticity and autocorrelation tests found no variables that deviate from the classical assumptions, indicating that the available data fulfill the conditions for a multiple linear regression model. The results show that LDR and NPL individually have a positive but not significant influence on ROA, while OER has a significant negative influence on ROA. Jointly, LDR, NPL and OER have a significant influence on ROA.
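The regression in the abstract can be reproduced in miniature on synthetic data. The coefficients and ranges below are invented for illustration; only the estimator (ordinary least squares) matches the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
# Hypothetical bank-level ratios, not the study's panel.
ldr = rng.uniform(0.6, 1.0, n)
npl = rng.uniform(0.0, 0.05, n)
oer = rng.uniform(0.7, 0.95, n)
# Simulated ROA: small positive LDR/NPL effects, a strong negative OER effect.
roa = 0.05 + 0.01 * ldr + 0.02 * npl - 0.04 * oer + rng.normal(0, 0.002, n)

X = np.column_stack([np.ones(n), ldr, npl, oer])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, roa, rcond=None)    # OLS estimates
print(beta)  # intercept, then coefficients on LDR, NPL, OER
```

With realistic noise, OLS recovers the negative OER coefficient, mirroring the sign pattern the study reports.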

  18. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  19. Defect-based testing of LTS digital circuits

    NARCIS (Netherlands)

    Arun, A.J.

    2006-01-01

    A Defect-Based Test (DBT) methodology for Superconductor Electronics (SCE) is presented in this thesis, so that commercial production and efficient testing of systems can be implemented in this technology in the future. In the first chapter, the features and prospects for SCE have been presented.

  20. Could changes in reported sex ratios at birth during China's 1958-1961 famine support the adaptive sex ratio adjustment hypothesis?

    Directory of Open Access Journals (Sweden)

    Anna Reimondos

    2013-10-01

    Full Text Available Background: The adaptive sex ratio adjustment hypothesis suggests that when mothers are in poor conditions the sex ratio of their offspring will be biased towards females. Major famines provide opportunities for testing this hypothesis because they lead to the widespread deterioration of living conditions in the affected population. Objective: This study examines changes in sex ratio at birth before, during, and after China's 1958-1961 famine, to see whether they provide any support for the adaptive sex ratio adjustment hypothesis. Methods: We use descriptive statistics to analyse data collected by both China's 1982 and 1988 fertility sample surveys and examine changes in sex ratio at birth in recent history. In addition, we examine the effectiveness of using different methods to model changes in sex ratio at birth and compare their differences. Results: During China's 1958-1961 famine, reported sex ratio at birth remained notably higher than that observed in most countries in the world. The timing of the decline in sex ratio at birth did not coincide with the timing of the famine. After the famine, although living conditions were considerably improved, the sex ratio at birth was not higher but lower than that recorded during the famine. Conclusions: The analysis of the data collected by the two fertility surveys has found no evidence that changes in sex ratio at birth during China's 1958-1961 famine and the post-famine period supported the adaptive sex ratio adjustment hypothesis.
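The descriptive quantity at issue is straightforward to compute. The z-score helper is an illustrative addition; the baseline of about 105 boys per 100 girls is a widely cited demographic norm, not a figure from this study:

```python
import math

def sex_ratio_at_birth(males, females):
    """Conventional SRB: male births per 100 female births."""
    return 100.0 * males / females

def srb_z_score(males, females, expected_male_share=105 / 205):
    """Normal-approximation z-score of the observed male share against an
    expected share (default corresponds to ~105 males per 100 females)."""
    n = males + females
    p_hat = males / n
    se = math.sqrt(expected_male_share * (1 - expected_male_share) / n)
    return (p_hat - expected_male_share) / se
```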

  1. Towards model-based testing of electronic funds transfer systems

    OpenAIRE

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.

    2010-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the transaction flows specified in the ISO 8583 standard in terms of a Labeled Transition System (LTS). This formalization paves the way for model-based testing based on the formal notion of Input-Outpu...

  2. Home-based versus mobile clinic HIV testing and counseling in rural Lesotho: a cluster-randomized trial.

    Science.gov (United States)

    Labhardt, Niklaus Daniel; Motlomelo, Masetsibi; Cerutti, Bernard; Pfeiffer, Karolin; Kamele, Mashaete; Hobbins, Michael A; Ehmer, Jochen

    2014-12-01

    The success of HIV programs relies on widely accessible HIV testing and counseling (HTC) services at health facilities as well as in the community. Home-based HTC (HB-HTC) is a popular community-based approach to reach persons who do not test at health facilities. Data comparing HB-HTC to other community-based HTC approaches are very limited. This trial compares HB-HTC to mobile clinic HTC (MC-HTC). The trial was powered to test the hypothesis of higher HTC uptake in HB-HTC campaigns than in MC-HTC campaigns. Twelve clusters were randomly allocated to HB-HTC or MC-HTC. The six clusters in the HB-HTC group received 30 1-d multi-disease campaigns (five villages per cluster) that delivered services by going door-to-door, whereas the six clusters in the MC-HTC group received campaigns involving community gatherings in the 30 villages with subsequent service provision in mobile clinics. Time allocation and human resources were standardized and equal in both groups. All individuals accessing the campaigns with unknown HIV status, or whose last HIV test was more than 12 wk ago and was negative, were eligible. All outcomes were assessed at the individual level. Statistical analysis used multivariable logistic regression. Odds ratios and p-values were adjusted for gender, age, and cluster effect. Out of 3,197 participants from the 12 clusters, 2,563 (80.2%) were eligible (HB-HTC: 1,171; MC-HTC: 1,392). The results for the primary outcomes were as follows. Overall HTC uptake was higher in the HB-HTC group than in the MC-HTC group (92.5% versus 86.7%; adjusted odds ratio [aOR]: 2.06; 95% CI: 1.18-3.60; p = 0.011). Among adolescents and adults ≥12 y, HTC uptake did not differ significantly between the two groups; in children under 12 y, however, uptake was higher in the HB-HTC group (versus 58.7% in the MC-HTC group; aOR: 4.91; 95% CI: 2.41-10.0; p < 0.001). Individuals in both the HB-HTC and MC-HTC arms linked to HIV care within 1 mo after testing positive. Findings for secondary outcomes were as follows: HB-HTC reached more first-time testers
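For orientation, here is a crude (unadjusted) odds ratio from a 2×2 table with a Woolf-type log interval, on hypothetical counts. The trial itself used multivariable logistic regression adjusting for gender, age, and cluster effect:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted OR for the 2x2 table [[a, b], [c, d]]
    (rows: exposed/unexposed; columns: event/no event),
    with a Woolf confidence interval on the log scale."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return (orr,
            math.exp(math.log(orr) - z * se),
            math.exp(math.log(orr) + z * se))
```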

  3. Gel/Space Ratio Evolution in Ternary Composite System Consisting of Portland Cement, Silica Fume, and Fly Ash.

    Science.gov (United States)

    Wu, Mengxue; Li, Chen; Yao, Wu

    2017-01-11

    In cement-based pastes, the relationship between the complex phase assemblage and mechanical properties is usually described by the "gel/space ratio" descriptor. The gel/space ratio is defined as the volume ratio of the gel to the available space in the composite system, and it has been widely studied in the cement unary system. This work determines the gel/space ratio in the cement-silica fume-fly ash ternary system (C-SF-FA system) by measuring the reaction degrees of the cement, SF, and FA. The effects that the supplementary cementitious material (SCM) replacements exert on the evolution of the gel/space ratio are discussed both theoretically and practically. The relationship between the gel/space ratio and compressive strength is then explored, and the relationship disparities for different mix proportions are analyzed in detail. The results demonstrate that the SCM replacements promote the gel/space ratio evolution only when the SCM reaction degree is higher than a certain value, which is calculated and defined as the critical reaction degree (CRD). The effects of the SCM replacements can be predicted based on the CRD, and the theoretical predictions agree with the test results quite well. At low gel/space ratios, disparities in the relationship between the gel/space ratio and the compressive strength are caused by porosity, which has also been studied in cement unary systems. The ratio of cement-produced gel to SCM-produced gel (the GC-to-GSCM ratio) is introduced for analyzing high gel/space ratios, where it plays a major role in creating the relationship disparities.
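For the cement unary system the abstract mentions, the gel/space ratio has the classic Powers closed form, shown below. This is the standard textbook relation for plain Portland cement paste; the paper's ternary extension replaces it with measured reaction degrees of cement, SF, and FA:

```python
def gel_space_ratio(alpha, w_c):
    """Powers' gel/space ratio for a plain Portland cement paste:
    x = 0.68*alpha / (0.32*alpha + w/c),
    where alpha is the degree of hydration and w_c the water/cement ratio."""
    return 0.68 * alpha / (0.32 * alpha + w_c)
```

At full hydration (alpha = 1) with w/c = 0.36, the paste is exactly space-filling (x = 1); higher w/c leaves unfilled capillary space and lowers the ratio.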

  4. Systematic design of 3D auxetic lattice materials with programmable Poisson's ratio for finite strains

    Science.gov (United States)

    Wang, Fengwen

    2018-05-01

    This paper presents a systematic approach for designing 3D auxetic lattice materials, which exhibit constant negative Poisson's ratios over large strain intervals. A unit cell model mimicking tensile tests is established and based on the proposed model, the secant Poisson's ratio is defined as the negative ratio between the lateral and the longitudinal engineering strains. The optimization problem for designing a material unit cell with a target Poisson's ratio is formulated to minimize the average lateral engineering stresses under the prescribed deformations. Numerical results demonstrate that 3D auxetic lattice materials with constant Poisson's ratios can be achieved by the proposed optimization formulation and that two sets of material architectures are obtained by imposing different symmetry on the unit cell. Moreover, inspired by the topology-optimized material architecture, a subsequent shape optimization is proposed by parametrizing material architectures using super-ellipsoids. By designing two geometrical parameters, simple optimized material microstructures with different target Poisson's ratios are obtained. By interpolating these two parameters as polynomial functions of Poisson's ratios, material architectures for any Poisson's ratio in the interval of ν ∈ [-0.78, 0.00] are explicitly presented. Numerical evaluations show that interpolated auxetic lattice materials exhibit constant Poisson's ratios in the target strain interval of [0.00, 0.20] and that 3D auxetic lattice material architectures with programmable Poisson's ratio are achievable.
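    The secant Poisson's ratio used above is simply the negative ratio of the two engineering strains; a minimal sketch (the function name is illustrative, not from the paper):

    ```python
    def secant_poissons_ratio(lateral_strain, longitudinal_strain):
        """Secant Poisson's ratio over a finite strain interval: the negative
        ratio of lateral to longitudinal engineering strain."""
        return -lateral_strain / longitudinal_strain

    # An auxetic lattice expands laterally under tension, so both strains
    # share the same sign and the ratio is negative, e.g. at the lower end
    # of the designed interval:
    nu = secant_poissons_ratio(lateral_strain=0.078, longitudinal_strain=0.10)
    # nu is approximately -0.78
    ```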

  5. The Cross-Section Dynamics of Financial Ratios: Do Ratios Converge towards the Industry Mean?

    Directory of Open Access Journals (Sweden)

    Manuel Illueca Muñoz

    2002-12-01

    Full Text Available The main objective of this paper is to test whether financial ratios follow a convergent path towards the industry mean. Using a non-parametric approach, we explicitly model the cross-section dynamics of the distributions of six financial ratios, computed on a sample of firms from the Spanish ceramic tile and flooring industry. Our findings do not support the hypothesis of convergence: in the long run, the probability distributions of the ratios analyzed show a dispersion similar to that of the sample period.

  6. A comparison of test statistics for the recovery of rapid growth-based enumeration tests

    NARCIS (Netherlands)

    van den Heuvel, Edwin R.; IJzerman-Boon, Pieta C.

    This paper considers five test statistics for comparing the recovery of a rapid growth-based enumeration test with respect to the compendial microbiological method using a specific nonserial dilution experiment. The finite sample distributions of these test statistics are unknown, because they are

  7. Stochastic shock response spectrum decomposition method based on probabilistic definitions of temporal peak acceleration, spectral energy, and phase lag distributions of mechanical impact pyrotechnic shock test data

    Science.gov (United States)

    Hwang, James Ho-Jin; Duran, Adam

    2016-08-01

    Most of the time, pyrotechnic shock design and test requirements for space systems are provided as a Shock Response Spectrum (SRS) without the input time history. Since the SRS does not describe the input or the environment, a decomposition method is used to obtain the source time history. The main objective of this paper is to develop a decomposition method producing input time histories that can satisfy the SRS requirement based on the pyrotechnic shock test data measured from a mechanical impact test apparatus. At the heart of this decomposition method is the statistical representation of the pyrotechnic shock test data measured from the MIT Lincoln Laboratory (LL) designed Universal Pyrotechnic Shock Simulator (UPSS). Each pyrotechnic shock test record measured at the interface of a test unit has been analyzed to produce the temporal peak acceleration, Root Mean Square (RMS) acceleration, and the phase lag at each band center frequency. The maximum SRS of each filtered time history has been calculated to produce a relationship between the input and the response. Two new definitions are proposed as a result. The Peak Ratio (PR) is defined as the ratio between the maximum SRS and the temporal peak acceleration at each band center frequency. The ratio between the maximum SRS and the RMS acceleration is defined as the Energy Ratio (ER) at each band center frequency. Phase lag is estimated based on the time delay between the temporal peak acceleration at each band center frequency and the peak acceleration at the lowest band center frequency. This stochastic process has been applied to more than one hundred pyrotechnic shock test records to produce probabilistic definitions of the PR, ER, and the phase lag. The SRS is decomposed at each band center frequency using damped sinusoids with the PR and the decays obtained by matching the ER of the damped sinusoids to the ER of the test data. The final step in this stochastic SRS decomposition process is the Monte Carlo (MC) simulation.
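    The Peak Ratio and Energy Ratio defined above can be written down directly from a band-filtered acceleration time history and its maximum SRS value; a minimal sketch (array and function names are illustrative):

    ```python
    import numpy as np

    def peak_ratio(max_srs, accel):
        """PR: maximum SRS of the band divided by the temporal peak
        acceleration of the band-filtered time history."""
        return max_srs / np.max(np.abs(accel))

    def energy_ratio(max_srs, accel):
        """ER: maximum SRS of the band divided by the RMS acceleration
        of the band-filtered time history."""
        return max_srs / np.sqrt(np.mean(np.square(accel)))
    ```

    Both ratios would be computed per band center frequency and accumulated over many test records to build the probabilistic definitions the abstract describes.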

  8. Technique for Selecting Optimum Fan Compression Ratio based on the Effective Power Plant Parameters

    Directory of Open Access Journals (Sweden)

    I. I. Kondrashov

    2016-01-01

    Full Text Available Nowadays, civilian aircraft occupy the major share of the global aviation market. In medium- and long-haul aircraft, turbofans with separate exhaust streams are widely used, and fuel efficiency is the main criterion for such engines. The paper presents research results on the mutual influence of fan pressure ratio and bypass ratio on the effective specific fuel consumption. It shows that increasing the bypass ratio is a rational step for reducing fuel consumption, and considers the basic features of engines with a high bypass ratio. Among the working process parameters, fan pressure ratio and bypass ratio are the most relevant for consideration, as they are the main design variables at a given level of technical excellence. The paper presents the dependence of the nacelle drag coefficient on the engine bypass ratio. The computation adopted the projected parameters of prospective turbofans to be used in the power plant of a 180-seat medium-haul aircraft. The engine cycle was computed in Mathcad using these data, with fan pressure ratio and bypass ratio being varied. The combustion chamber gas temperature, the overall pressure ratio, and the engine thrust remained constant, as did the pressure loss coefficients, the component efficiencies, and the amount of air taken for cooling. The optimal parameters corresponding to the minimum effective specific fuel consumption were found as the result of the computation. The paper gives recommendations for adjusting the optimal parameters depending on external factors such as engine weight and required fuel reserve. The obtained data can be used to estimate parameters of future turbofan engines with a high bypass ratio.

  9. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint selection admission test for state polytechnics (SB-UMPN), and the independent track (UM-Polbeng) was conducted using a paper-based test (PBT). The paper-based test model has several weaknesses: it wastes paper, questions can leak to the public, and test results can be manipulated. This research aimed to design a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams, and sequence diagrams. During the design of the application, attention was paid to password-protecting the test questions before display through encryption and decryption, using the RSA cryptography algorithm. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture of the CBT application was a client-server model over a Local Area Network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
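    The Fisher-Yates shuffle mentioned above produces a uniformly random permutation of the question bank in linear time; a minimal sketch (the question list is illustrative, not from the paper):

    ```python
    import random

    def fisher_yates_shuffle(items, rng=random):
        """Shuffle `items` in place: iterate from the last index down,
        swapping each element with one chosen uniformly at random from
        the unshuffled prefix (including itself). Every permutation of
        the list is equally likely."""
        for i in range(len(items) - 1, 0, -1):
            j = rng.randrange(i + 1)
            items[i], items[j] = items[j], items[i]
        return items

    questions = [f"Q{n}" for n in range(1, 6)]
    shuffled = fisher_yates_shuffle(questions, random.Random(42))
    ```

    Seeding the generator, as in the usage line, makes a per-candidate ordering reproducible for auditing.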

  10. Gamma-glutamyl-transpeptidase to platelet ratio is not superior to APRI, FIB-4 and RPR for diagnosing liver fibrosis in CHB patients in China

    OpenAIRE

    Huang, Rui; Wang, Guiyang; Tian, Chen; Liu, Yong; Jia, Bei; Wang, Jian; Yang, Yue; Li, Yang; Sun, Zhenhua; Yan, Xiaomin; Xia, Juan; Xiong, Yali; Song, Peixin; Zhang, Zhaoping; Ding, Weimao

    2017-01-01

    The gamma-glutamyl transpeptidase to platelet ratio (GPR) is a novel index for estimating liver fibrosis in chronic hepatitis B (CHB). Few studies have compared the diagnostic accuracy of GPR with that of other non-invasive fibrosis tests based on blood parameters. We analyzed the diagnostic value of GPR for detecting liver fibrosis and compared the diagnostic performance of GPR with APRI (aspartate aminotransferase-to-platelet ratio index), FIB-4 (fibrosis index based on the four factors), NLR (neutrophil-to-lymphocy...

  11. Using the noninformative families in family-based association tests : A powerful new testing strategy

    NARCIS (Netherlands)

    Lange, C; DeMeo, D; Silverman, EK; Weiss, ST; Laird, NM

    2003-01-01

    For genetic association studies with multiple phenotypes, we propose a new strategy for multiple testing with family-based association tests (FBATs). The strategy increases the power by both using all available family data and reducing the number of hypotheses tested while being robust against

  12. Low knowledge and anecdotal use of unauthorized online HIV self-test kits among attendees at a street-based HIV rapid testing programme in Spain.

    Science.gov (United States)

    Belza, M José; Figueroa, Carmen; Rosales-Statkus, M Elena; Ruiz, Mónica; Vallejo, Fernando; de la Fuente, Luis

    2014-08-01

    The objectives of this study were to estimate the percentage of potential users who know that unauthorized HIV self-tests can be purchased online and the percentage of those who have already used them, and to determine socio-demographic and behavioural correlates. A self-administered questionnaire was employed to collect data from attendees at a street-based HIV testing programme. Logistic regression for rare events was performed. Of the 3340 participants, 5.3% (95% confidence interval (CI) 4.5-6.0%) had knowledge of self-tests being sold online and 7.5% (95% CI 6.6-8.5%) thought they existed but had never seen them; only 0.6% (95% CI 0.3-0.9%) had ever used one. Knowing that self-tests are sold online (odds ratio (OR) 3.6, 95% CI 2.4-5.4) and using them (OR 7.3, 95% CI 2.2-23.8) were associated with having undergone more than two previous HIV tests. Use was also associated with being neither Spanish nor Latin American (OR 3.8, 95% CI 1.2-12.0) and with having a university degree (OR 0.2, 95% CI 0.1-0.7). At the time of the study, the impact on the population of issues related to the use of unauthorized tests was very low. However, media coverage following the approval of self-testing in the USA might have changed the situation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Home urine C-peptide creatinine ratio (UCPCR) testing can identify type 2 and MODY in pediatric diabetes.

    Science.gov (United States)

    Besser, Rachel E J; Shields, Beverley M; Hammersley, Suzanne E; Colclough, Kevin; McDonald, Timothy J; Gray, Zoe; Heywood, James J N; Barrett, Timothy G; Hattersley, Andrew T

    2013-05-01

    Making the correct diabetes diagnosis in children is crucial for lifelong management. Type 2 diabetes and maturity onset diabetes of the young (MODY) are seen in the pediatric setting, and can be difficult to discriminate from type 1 diabetes. Postprandial urinary C-peptide creatinine ratio (UCPCR) is a non-invasive measure of endogenous insulin secretion that had not previously been tested as a tool for discriminating MODY and type 2 from type 1 in pediatric diabetes. Two-hour postprandial UCPCR was measured in 264 pediatric patients (MODY, n = 63). Receiver operating characteristic curves were used to identify the optimal UCPCR cutoff for discriminating diabetes subtypes. UCPCR was lower in type 1 diabetes [0.05 nmol/mmol] than in MODY [3.51 (2.37-5.32) nmol/mmol], and UCPCR did not differ between type 2 diabetes and MODY (p = 0.25), so these patients were combined for subsequent analyses. After 2-yr duration, UCPCR ≥ 0.7 nmol/mmol had 100% sensitivity [95% confidence interval (CI): 92-100] and 97% specificity (95% CI: 91-99) for distinguishing non-type 1 (MODY + type 2 diabetes) from type 1 diabetes [area under the curve (AUC) 0.997]. UCPCR was poor at discriminating MODY from type 2 diabetes (AUC 0.57). UCPCR testing can therefore be used, in patients with diabetes duration greater than 2 yr, to identify pediatric patients with non-type 1 diabetes. UCPCR testing is a practical non-invasive method for use in the pediatric outpatient setting. © 2013 John Wiley & Sons A/S.

  14. Tracing the Base: A Topographic Test for Collusive Basing-Point Pricing

    NARCIS (Netherlands)

    Bos, Iwan; Schinkel, Maarten Pieter

    2009-01-01

    Basing-point pricing is known to have been abused by geographically dispersed firms in order to eliminate competition on transportation costs. This paper develops a topographic test for collusive basing-point pricing. The method uses transaction data (prices, quantities) and customer project site

  15. Tracing the base: A topographic test for collusive basing-point pricing

    NARCIS (Netherlands)

    Bos, I.; Schinkel, M.P.

    2008-01-01

    Basing-point pricing is known to have been abused by geographically dispersed firms in order to eliminate competition on transportation costs. This paper develops a topographic test for collusive basing-point pricing. The method uses transaction data (prices, quantities) and customer project site

  16. Gel/Space Ratio Evolution in Ternary Composite System Consisting of Portland Cement, Silica Fume, and Fly Ash

    Directory of Open Access Journals (Sweden)

    Mengxue Wu

    2017-01-01

    Full Text Available In cement-based pastes, the relationship between the complex phase assemblage and mechanical properties is usually described by the “gel/space ratio” descriptor. The gel/space ratio is defined as the volume ratio of the gel to the available space in the composite system, and it has been widely studied in the cement unary system. This work determines the gel/space ratio in the cement-silica fume-fly ash ternary system (C-SF-FA system by measuring the reaction degrees of the cement, SF, and FA. The effects that the supplementary cementitious material (SCM replacements exert on the evolution of the gel/space ratio are discussed both theoretically and practically. The relationship between the gel/space ratio and compressive strength is then explored, and the relationship disparities for different mix proportions are analyzed in detail. The results demonstrate that the SCM replacements promote the gel/space ratio evolution only when the SCM reaction degree is higher than a certain value, which is calculated and defined as the critical reaction degree (CRD. The effects of the SCM replacements can be predicted based on the CRD, and the theological predictions agree with the test results quite well. At low gel/space ratios, disparities in the relationship between the gel/space ratio and the compressive strength are caused by porosity, which has also been studied in cement unary systems. The ratio of cement-produced gel to SCM-produced gel ( G C to G S C M ratio is introduced for use in analyzing high gel/space ratios, in which it plays a major role in creating relationship disparities.

  17. Empirical approach based on centrifuge testing for cyclic deformations of laterally loaded piles in sand

    DEFF Research Database (Denmark)

    Truong, P.; Lehane, B. M.; Zania, Varvara

    2018-01-01

    A systematic study into the response of monopiles to lateral cyclic loading in medium dense and dense sand was performed in beam and drum centrifuge tests. The centrifuge tests were carried out at different cyclic load and magnitude ratios, while the cyclic load sequence was also varied...

  18. A novel approach for small sample size family-based association studies: sequential tests.

    Science.gov (United States)

    Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan

    2011-08-01

    In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
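    Wald's sequential probability ratio test accumulates a log-likelihood ratio observation by observation and stops once it crosses thresholds set by the target error rates. The sketch below uses generic Bernoulli likelihoods, not the TDT-specific statistic of the paper; the third "keep sampling" outcome corresponds to the undecided SNP group described above:

    ```python
    import math

    def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
        """Wald SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli data.
        alpha/beta are the target type I/II error rates. Returns
        'accept_h1', 'accept_h0', or 'continue' (keep sampling)."""
        upper = math.log((1 - beta) / alpha)   # crossing above accepts H1
        lower = math.log(beta / (1 - alpha))   # crossing below accepts H0
        llr = 0.0
        for x in observations:
            llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
            if llr >= upper:
                return "accept_h1"
            if llr <= lower:
                return "accept_h0"
        return "continue"  # the 'third group': not enough evidence yet
    ```

    A run of successes drives the statistic up toward H1, a run of failures drives it down toward H0, and mixed evidence leaves the SNP in the undecided group.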

  19. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    Science.gov (United States)

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test (a score test) with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more associations), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13 500. Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. heckerma@microsoft.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  20. Comprehensive quantification of signal-to-noise ratio and g-factor for image-based and k-space-based parallel imaging reconstructions.

    Science.gov (United States)

    Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A

    2008-10-01

    Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
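    The "pseudo multiple replica" idea can be sketched generically: add many synthetic noise realizations, shaped by the prescan-measured channel covariance, to the acquired data, push each replica through the same linear reconstruction, and take the pixelwise standard deviation as the noise map. The toy below uses a trivial single-channel "reconstruction"; a real SENSE or GRAPPA operator would replace it, and all names are illustrative:

    ```python
    import numpy as np

    def pseudo_multiple_replica(data, noise_cov, reconstruct, n_replicas=200, rng=None):
        """Monte Carlo noise map: inject correlated channel noise into the
        data, reconstruct each replica with the same linear operator, and
        return the pixelwise standard deviation of the reconstructed stack."""
        rng = np.random.default_rng(rng)
        chol = np.linalg.cholesky(noise_cov)  # shapes correlated channel noise
        n_ch, n_pix = data.shape
        recons = []
        for _ in range(n_replicas):
            noise = chol @ rng.standard_normal((n_ch, n_pix))
            recons.append(reconstruct(data + noise))
        return np.std(recons, axis=0)

    # Toy check: an identity 'reconstruction' of one unit-variance channel
    # should give a noise map near 1 everywhere; dividing a signal image by
    # such a map yields the pixelwise SNR described in the abstract.
    ```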

  1. The co-feature ratio, a novel method for the measurement of chromatographic and signal selectivity in LC-MS-based metabolomics

    International Nuclear Information System (INIS)

    Elmsjö, Albert; Haglöf, Jakob; Engskog, Mikael K.R.; Nestor, Marika; Arvidsson, Torbjörn; Pettersson, Curt

    2017-01-01

    Evaluation of analytical procedures, especially in regards to measuring chromatographic and signal selectivity, is highly challenging in untargeted metabolomics. The aim of this study was to suggest a new straightforward approach for a systematic examination of chromatographic and signal selectivity in LC-MS-based metabolomics. By calculating the ratio between each feature and its co-eluting features (the co-features), a measurement of the chromatographic selectivity (i.e. extent of co-elution) as well as the signal selectivity (e.g. amount of adduct formation) of each feature could be acquired, the co-feature ratio. This approach was used to examine possible differences in chromatographic and signal selectivity present in samples exposed to three different sample preparation procedures. The capability of the co-feature ratio was evaluated both in a classical targeted setting using isotope labelled standards as well as without standards in an untargeted setting. For the targeted analysis, several metabolites showed a skewed quantitative signal due to poor chromatographic selectivity and/or poor signal selectivity. Moreover, evaluation of the untargeted approach through multivariate analysis of the co-feature ratios demonstrated the possibility to screen for metabolites displaying poor chromatographic and/or signal selectivity characteristics. We conclude that the co-feature ratio can be a useful tool in the development and evaluation of analytical procedures in LC-MS-based metabolomics investigations. Increased selectivity through proper choice of analytical procedures may decrease the false positive and false negative discovery rate and thereby increase the validity of any metabolomic investigation. - Highlights: • The co-feature ratio (CFR) is introduced. • CFR measures chromatographic and signal selectivity of a feature. • CFR can be used for evaluating experimental procedures in metabolomics. • CFR can aid in locating features with poor selectivity.
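    The abstract does not give the exact formula, but one plausible reading of "the ratio between each feature and its co-eluting features" can be sketched as follows; the retention-time tolerance and the (retention time, intensity) feature representation are assumptions for illustration only:

    ```python
    def co_feature_ratio(features, idx, rt_tol=0.1):
        """Illustrative co-feature ratio: intensity of feature `idx` divided
        by the total intensity of features co-eluting within `rt_tol` minutes
        (the exact formula in the paper may differ). `features` is a list of
        (retention_time, intensity) pairs."""
        rt, inten = features[idx]
        co = sum(i for j, (r, i) in enumerate(features)
                 if j != idx and abs(r - rt) <= rt_tol)
        return inten / co if co else float("inf")
    ```

    Under this reading, a low ratio flags a feature swamped by co-eluting signal (poor chromatographic selectivity), which is the screening use the abstract describes.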

  2. The co-feature ratio, a novel method for the measurement of chromatographic and signal selectivity in LC-MS-based metabolomics

    Energy Technology Data Exchange (ETDEWEB)

    Elmsjö, Albert, E-mail: Albert.Elmsjo@farmkemi.uu.se [Department of Medicinal Chemistry, Division of Analytical Pharmaceutical Chemistry, Uppsala University (Sweden); Haglöf, Jakob; Engskog, Mikael K.R. [Department of Medicinal Chemistry, Division of Analytical Pharmaceutical Chemistry, Uppsala University (Sweden); Nestor, Marika [Department of Immunology, Genetics and Pathology, Uppsala University (Sweden); Arvidsson, Torbjörn [Department of Medicinal Chemistry, Division of Analytical Pharmaceutical Chemistry, Uppsala University (Sweden); Medical Product Agency, Uppsala (Sweden); Pettersson, Curt [Department of Medicinal Chemistry, Division of Analytical Pharmaceutical Chemistry, Uppsala University (Sweden)

    2017-03-01

    Evaluation of analytical procedures, especially in regards to measuring chromatographic and signal selectivity, is highly challenging in untargeted metabolomics. The aim of this study was to suggest a new straightforward approach for a systematic examination of chromatographic and signal selectivity in LC-MS-based metabolomics. By calculating the ratio between each feature and its co-eluting features (the co-features), a measurement of the chromatographic selectivity (i.e. extent of co-elution) as well as the signal selectivity (e.g. amount of adduct formation) of each feature could be acquired, the co-feature ratio. This approach was used to examine possible differences in chromatographic and signal selectivity present in samples exposed to three different sample preparation procedures. The capability of the co-feature ratio was evaluated both in a classical targeted setting using isotope labelled standards as well as without standards in an untargeted setting. For the targeted analysis, several metabolites showed a skewed quantitative signal due to poor chromatographic selectivity and/or poor signal selectivity. Moreover, evaluation of the untargeted approach through multivariate analysis of the co-feature ratios demonstrated the possibility to screen for metabolites displaying poor chromatographic and/or signal selectivity characteristics. We conclude that the co-feature ratio can be a useful tool in the development and evaluation of analytical procedures in LC-MS-based metabolomics investigations. Increased selectivity through proper choice of analytical procedures may decrease the false positive and false negative discovery rate and thereby increase the validity of any metabolomic investigation. - Highlights: • The co-feature ratio (CFR) is introduced. • CFR measures chromatographic and signal selectivity of a feature. • CFR can be used for evaluating experimental procedures in metabolomics. • CFR can aid in locating features with poor selectivity.

  3. Testing hypotheses for excess flower production and low fruit-to-flower ratios in a pollinating seed-consuming mutualism

    Science.gov (United States)

    Holland, J. Nathaniel; Bronstein, Judith L.; DeAngelis, Donald L.

    2004-01-01

    Pollinator attraction, pollen limitation, resource limitation, pollen donation and selective fruit abortion have all been proposed as processes explaining why hermaphroditic plants commonly produce many more flowers than mature fruit. We conducted a series of experiments in Arizona to investigate low fruit-to-flower ratios in senita cacti, which rely exclusively on pollinating seed-consumers. Selective abortion of fruit based on seed predators is of particular interest in this case because plants relying on pollinating seed-consumers are predicted to have such a mechanism to minimize seed loss. Pollinator attraction and pollen dispersal increased with flower number, but fruit set did not, refuting the hypothesis that excess flowers increase fruit set by attracting more pollinators. Fruit set of natural- and hand-pollinated flowers were not different, supporting the resource, rather than pollen, limitation hypothesis. Senita did abort fruit, but not selectively based on pollen quantity, pollen donors, or seed predators. Collectively, these results are consistent with sex allocation theory in that resource allocation to excess flower production can increase pollen dispersal and the male fitness function of flowers, but consequently results in reduced resources available for fruit set. Inconsistent with sex allocation theory, however, fruit production and the female fitness function of flowers may actually increase with flower production. This is because excess flower production lowers pollinator-to-flower ratios and results in fruit abortion, both of which limit the abundance and hence oviposition rates, of pre-dispersal seed predators.

  4. Seasonal variation in Eurasian Wigeon Anas penelope sex and age ratios from hunter-based surveys

    DEFF Research Database (Denmark)

    Clausen, Kevin Kuhlmann; Dalby, Lars; Sunde, Peter

    2013-01-01

    This study found consistent seasonal variation in Eurasian Wigeon Anas penelope sex and age ratios among Danish hunter-based wing surveys, and describes how accounting for this variation might explain reported discrepancies between this and other monitoring methods. Early season flocks were dominated by adult males, and juvenile proportions were highest in November and significantly lower before and after this peak. Nationwide field assessments undertaken in January 2012 showed no significant differences from sex and age ratios in the wing survey data from that particular hunting season (2011/2012), indicating that this survey is a good predictor of Wigeon demography. These results highlight the need to account for consistent temporal variation in such demographic time series when using the results to model population parameters.

  5. Acute effects of static stretching on peak and end-range hamstring-to-quadriceps functional ratios

    Science.gov (United States)

    Sekir, Ufuk; Arabaci, Ramiz; Akova, Bedrettin

    2015-01-01

    AIM: To evaluate whether static stretching influences peak and end-range functional hamstring-to-quadriceps (H/Q) strength ratios in elite women athletes. METHODS: Eleven healthy female athletes at an elite competitive level participated in the study. All the participants completed the static stretching or non-stretching (control) intervention protocol in a randomized design on different days. Two static unassisted stretching exercises, one in standing and one in sitting position, were used to stretch both the hamstring and quadriceps muscles during these protocols. The total time for the static stretching was 6 ± 1 min. The isokinetic peak torque measurements for the hamstring and quadriceps muscles in eccentric and concentric modes and the calculations for the functional H/Q strength ratios at angular velocities of 60°/s and 180°/s were made before (pre) and after (post) the control or stretching intervention. The strength measurements and functional strength ratio calculations were based on the entire and end range of knee extension. RESULTS: The pre-test scores for quadriceps and hamstring peak torque and end range values were not significantly different between the groups (P > 0.05). Subsequently, although the control group did not exhibit significant changes in quadriceps and hamstring muscle strength (P > 0.05), static stretching decreased eccentric and concentric quadriceps muscle strength at both the 60°/s and 180°/s test speeds (P < 0.05), as well as eccentric and concentric hamstring muscle strength at both the 60°/s and 180°/s test speeds (P < 0.05). Furthermore, the functional H/Q strength ratios exhibited no significant alterations during the entire and end ranges of knee extension in either the static stretching or the control intervention (P > 0.05). CONCLUSION: According to our results, a static stretching routine does not influence the functional H/Q ratio. Athletes can confidently perform static stretching during their warm-up routines. PMID:26495249
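    The functional H/Q ratio named in the title is conventionally the eccentric hamstring peak torque over the concentric quadriceps peak torque during knee extension; a minimal sketch under that conventional definition (the torque values are illustrative, not the study's data):

    ```python
    def functional_hq_ratio(hamstring_ecc_torque, quadriceps_con_torque):
        """Functional hamstring-to-quadriceps ratio for knee extension:
        eccentric hamstring peak torque / concentric quadriceps peak torque."""
        return hamstring_ecc_torque / quadriceps_con_torque

    # If stretching reduces both torques proportionally, the ratio is
    # unchanged, which is consistent with the study's conclusion:
    before = functional_hq_ratio(120.0, 150.0)  # 0.8
    after = functional_hq_ratio(108.0, 135.0)   # both reduced 10%, still 0.8
    ```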

  6. Impacts of data covariances on the calculated breeding ratio for CRBRP

    International Nuclear Information System (INIS)

    Liaw, J.R.; Collins, P.J.; Henryson, H. II; Shenter, R.E.

    1983-01-01

In order to establish confidence in the data adjustment methodology as applied to LMFBR design, and to estimate the importance of data correlations in that respect, an investigation was initiated into the impacts of data covariances on calculated reactor performance parameters. This paper summarizes the results and findings of that effort, specifically as related to the calculation of the breeding ratio for CRBRP as an illustration. Thirty-nine integral parameters and their covariances, including k-eff and various capture and fission reaction-rate ratios, from the ZEBRA-8 series and four ZPR physics benchmark assemblies were used in the least-squares fitting processes. Multigroup differential data and the sensitivity coefficients of those 39 integral parameters were generated by standard 2-D diffusion-theory neutronic calculational modules at ANL. Three differential-data covariance libraries, all based on ENDF/B-V evaluations, were tested in this study

  7. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
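For finite-alphabet processes, the plug-in transfer entropy described above can be computed directly from joint counts, and the corresponding log-likelihood ratio statistic is 2N times the estimate in nats. A minimal sketch for history length 1 on hypothetical binary series (a generic estimator, not the authors' code):

```python
from collections import Counter
from math import log

def transfer_entropy(x, y):
    """Plug-in transfer entropy T_{Y->X} in nats for discrete series,
    history length 1: sum over p(x1,x0,y0) log p(x1|x0,y0)/p(x1|x0)."""
    n = len(x) - 1
    c_xyz = Counter((x[t + 1], x[t], y[t]) for t in range(n))
    c_xz = Counter((x[t + 1], x[t]) for t in range(n))
    c_yz = Counter((x[t], y[t]) for t in range(n))
    c_z = Counter(x[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), cnt in c_xyz.items():
        # cnt*c_z / (c_xz*c_yz) is the empirical conditional-dependence ratio
        te += (cnt / n) * log(cnt * c_z[x0] / (c_xz[(x1, x0)] * c_yz[(x0, y0)]))
    return te

# x copies y with a one-step lag, so Y transfers information to X.
y = [0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
x = [0] + y[:-1]
te = transfer_entropy(x, y)
lr_statistic = 2 * (len(x) - 1) * te  # asymptotically chi-squared under H0
print(te > 0)  # True
```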

  8. A note on Youden's J and its cost ratio

    Directory of Open Access Journals (Sweden)

    Smits Niels

    2010-09-01

    Full Text Available Abstract Background The Youden index, the sum of sensitivity and specificity minus one, is an index used for setting optimal thresholds on medical tests. Discussion When using this index, one implicitly uses decision theory with a ratio of misclassification costs which is equal to one minus the prevalence proportion of the disease. It is doubtful whether this cost ratio truly represents the decision maker's preferences. Moreover, in populations with a different prevalence, a selected threshold is optimal with reference to a different cost ratio. Summary The Youden index is not a truly optimal decision rule for setting thresholds because its cost ratio varies with prevalence. Researchers should look into their cost ratio and employ it in a decision theoretic framework to obtain genuinely optimal thresholds.
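The Youden index and the threshold it selects can be sketched directly from its definition, J = sensitivity + specificity − 1; the ROC points below are hypothetical:

```python
def youden_j(sensitivity, specificity):
    """Youden's J statistic: Se + Sp - 1."""
    return sensitivity + specificity - 1.0

def best_threshold(roc_points):
    """roc_points: (threshold, sensitivity, specificity) tuples.
    Returns the threshold maximizing J - the rule the abstract critiques,
    since it fixes an implicit misclassification cost ratio."""
    return max(roc_points, key=lambda p: youden_j(p[1], p[2]))[0]

# Hypothetical operating points of a diagnostic marker.
roc = [(0.2, 0.95, 0.40), (0.4, 0.85, 0.70), (0.6, 0.70, 0.90)]
print(best_threshold(roc))  # 0.6
```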

  9. Validity of selected cardiovascular field-based test among Malaysian ...

    African Journals Online (AJOL)

Motivated by the emerging obesity problem among Malaysians, this research was formulated to validate published tests among healthy female adults. The selected tests, namely the 20 m multi-stage shuttle run, 2.4 km run test, 1 mile walk test and Harvard step test, were correlated with a laboratory test (Bruce protocol) to establish criterion validity ...

  10. Moving beyond the Failure of Test-Based Accountability

    Science.gov (United States)

    Koretz, Daniel

    2018-01-01

    In "The Testing Charade: Pretending to Make Schools Better", the author's new book from which this article is drawn, the failures of test-based accountability are documented and some of the most egregious misuses and outright abuses of testing are described, along with some of the most serious negative effects. Neither good intentions…

  11. Development of theoretical oxygen saturation calibration curve based on optical density ratio and optical simulation approach

    Science.gov (United States)

    Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia

    2017-09-01

The implementation of a surface-based Monte Carlo simulation technique for oxygen saturation (SaO2) calibration curve estimation is demonstrated in this paper. Generally, the calibration curve is estimated either from empirical studies using animals as experimental subjects or derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to exhibit real tissue behavior. The mathematical relationship between optical density (OD) and optical density ratios (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. The optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of real tissue. Based on the absorbed ray flux at the detectors, the OD and ODR were successfully calculated. The simulation results for the optical density ratio at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method to be used for extended calibration curve studies using other wavelength pairs.
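The OD/ODR relationship described above can be sketched as follows, assuming a Beer-Lambert-style optical density computed from diastolic and systolic intensities at a red/infrared wavelength pair; the intensity values are hypothetical:

```python
from math import log10

def optical_density(i_diastole, i_systole):
    """OD from detected intensities at diastole and systole: the pulsatile
    attenuation, log10 of the intensity ratio (Beer-Lambert style)."""
    return log10(i_diastole / i_systole)

def od_ratio(od_red, od_infrared):
    """ODR: red-over-infrared optical-density ratio, the quantity mapped
    to SaO2 by a calibration curve."""
    return od_red / od_infrared

# Hypothetical normalized intensities at the two wavelengths.
odr = od_ratio(optical_density(1.00, 0.92), optical_density(1.00, 0.85))
print(round(odr, 3))  # 0.513
```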

  12. Precision Gamma-Ray Branching Ratios for Long-Lived Radioactive Nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Tonchev, Anton [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-10-19

Many properties of the high-energy-density environments in nuclear weapons tests, advanced laser-fusion experiments, the interiors of stars, and other astrophysical bodies must be inferred from the long-lived radioactive nuclei that are produced. These radioactive nuclei are most easily and sensitively identified by studying the characteristic gamma rays emitted during decay. Measuring the number of decays via detection of the characteristic gamma rays emitted during gamma decay (the gamma-ray branching ratio) of the long-lived fission products is one of the most straightforward and reliable ways to determine the number of fissions that occurred in a nuclear weapon test. The fission products 147Nd, 144Ce, 156Eu, and certain other long-lived isotopes play a crucial role in science-based stockpile stewardship; however, the large uncertainties (about 8%) on the branching ratios measured for these isotopes currently limit the usefulness of the existing data [1,2]. We performed highly accurate gamma-ray branching-ratio measurements for a group of high-atomic-number rare-earth isotopes to greatly improve the precision and reliability with which the fission yield and reaction products in high-energy-density environments can be determined. We have developed techniques that take advantage of new radioactive-beam facilities, such as DOE's CARIBU located at Argonne National Laboratory, to produce radioactive samples and perform decay spectroscopy measurements. The absolute gamma-ray branching ratios for 147Nd and 144Ce are now determined to <2% precision. In addition, high-energy monoenergetic neutron beams from the FN Tandem accelerator at TUNL at Duke University were used to produce 167Tm via the 169Tm(n,3n) reaction. The four-fold improved branching ratio of 167Tm is now used to measure reaction-in-flight (RIF) neutrons from a burning DT capsule at NIF [10]. This represents the

  13. Protein-Based Urine Test Predicts Kidney Transplant Outcomes

    Science.gov (United States)

News Release, Thursday, August 22, 2013: Protein-based urine test predicts kidney transplant outcomes. NIH-... supporting development of noninvasive tests. Levels of a protein in the urine of kidney transplant recipients can ...

  14. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

Full Text Available Abstract: Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, a design model, implementation and testing. This study produced a CBT application in which questions drawn from the question bank are randomized with the Fisher-Yates Shuffle method, so that no exam presents the same question. To secure question data while connected to the network, a message-encoding technique is needed: before being displayed, questions pass through encryption and decryption using the RSA cryptography algorithm. The software design method uses the waterfall model, the database design uses an entity relationship diagram, and the interface design uses Hypertext Markup Language (HTML), Cascading Style Sheets (CSS) and jQuery; the application is implemented as a web system using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
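The Fisher-Yates shuffle named in the abstract can be sketched as follows (a generic implementation, not the authors' code):

```python
import random

def fisher_yates(items, rng=random):
    """Fisher-Yates shuffle: walk from the last position down, swapping each
    element with a uniformly chosen earlier (or same) position, so every
    ordering of the question list is equally likely."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = rng.randrange(i + 1)
        a[i], a[j] = a[j], a[i]
    return a

questions = [f"Q{k}" for k in range(1, 6)]
shuffled = fisher_yates(questions)
print(sorted(shuffled) == questions)  # True: a permutation of the bank
```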

  15. Hepatic MR imaging for in vivo differentiation of steatosis, iron deposition and combined storage disorder: Single-ratio in/opposed phase analysis vs. dual-ratio Dixon discrimination

    International Nuclear Information System (INIS)

    Bashir, Mustafa R.; Merkle, Elmar M.; Smith, Alastair D.; Boll, Daniel T.

    2012-01-01

Objective: To assess whether in vivo dual-ratio Dixon discrimination can improve detection of diffuse liver disease, specifically steatosis, iron deposition and combined disease, over traditional single-ratio in/opposed-phase analysis. Methods: Seventy-one patients with biopsy-proven (17.7 ± 17.0 days) hepatic steatosis (n = 16), iron deposition (n = 11), combined deposition (n = 3) or neither disease (n = 41) underwent MR examinations. Dual-echo in/opposed-phase MR with Dixon water/fat reconstructions was acquired. Analysis consisted of: (a) single-ratio hepatic region-of-interest (ROI)-based assessment of in/opposed ratios; (b) dual-ratio hepatic ROI assessment of in/opposed and fat/water ratios; (c) computer-aided dual-ratio assessment evaluating all hepatic voxels. Disease-specific thresholds were determined; statistical analyses assessed disease-dependent voxel ratios based on the single-ratio (a) and dual-ratio (b and c) techniques. Results: Single-ratio discrimination succeeded in identifying iron deposition (I/O below an iron threshold) and steatosis (I/O > 1.15) relative to normal parenchyma, sensitivity 70.0%; it failed to detect combined disease. Dual-ratio discrimination succeeded in identifying abnormal hepatic parenchyma (F/W > 0.05), sensitivity 96.7%; logarithmic discrimination functions for iron deposition (I/O < e^((0.01 − F/W)/0.48)) and for steatosis (I/O > e^((F/W − 0.01)/0.48)) differentiated combined from isolated disease, sensitivity 100.0%; computer-aided dual-ratio analysis was comparably sensitive but less specific, 90.2% vs. 97.6%. Conclusion: MR two-point Dixon imaging using dual-ratio post-processing based on in/opposed and fat/water ratios improved in vivo detection of hepatic steatosis, iron deposition, and combined storage disease beyond traditional in/opposed analysis.
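The dual-ratio idea can be sketched as a two-step rule; the signal values are hypothetical and the in/opposed cut-off of 1.0 is illustrative - only the F/W > 0.05 normal threshold comes from the abstract:

```python
def dixon_ratios(s_in, s_opposed, s_fat, s_water):
    """In/opposed and fat/water signal ratios from two-point Dixon data."""
    return s_in / s_opposed, s_fat / s_water

def classify(io, fw, fw_normal=0.05, io_cut=1.0):
    """Toy two-step rule: flag abnormal parenchyma when fat/water exceeds
    the normal threshold, then separate steatosis (high in/opposed) from
    iron deposition (low in/opposed). io_cut is illustrative only."""
    if fw <= fw_normal:
        return "normal"
    return "steatosis" if io > io_cut else "iron deposition"

# Hypothetical ROI signal intensities.
io, fw = dixon_ratios(s_in=180.0, s_opposed=120.0, s_fat=45.0, s_water=300.0)
print(classify(io, fw))  # steatosis
```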

  16. Infrared detectors and test technology of cryogenic camera

    Science.gov (United States)

    Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long

    2016-10-01

A cryogenic camera, widely used in deep-space detection, cools its optical system and support structure by cryogenic refrigeration technology, thereby improving sensitivity. This paper discusses the characteristics and design points of the infrared detector in combination with the camera's characteristics. At the same time, cryogenic-background test systems for the chip and the detector assembly are established: the chip test system is based on a variable-temperature multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal-vacuum environment. The core of the test is to establish a cryogenic background. The non-uniformity, dead-pixel ratio and noise of the test results are given finally. The establishment of the test system supports the design and calculation of infrared systems.

  17. Pengaruh Current Ratio, Asset Size, dan Earnings Variability terhadap Beta Pasar

    Directory of Open Access Journals (Sweden)

    Ahim Abdurahim

    2016-02-01

Full Text Available The research objective was to determine the effect of the accounting variables current ratio, asset size and earnings variability on market beta. This study used 72 samples. Regression was used to test the hypotheses. The method of Fowler and Rorke (1983) was first applied to adjust the market beta, and BLUE tests were used to check the classical assumptions for the independent variables: multicollinearity, heteroskedasticity (Breusch-Pagan-Godfrey test), and autocorrelation (Breusch-Godfrey test). The results support hypotheses H1a, H1b, H1c, and H2a, meaning that current ratio, asset size and earnings variability have no influence on market beta, either individually or simultaneously.

  18. The Reference Return Ratio

    DEFF Research Database (Denmark)

    Nicolaisen, Jeppe; Faber Frandsen, Tove

    2008-01-01

The paper introduces a new journal impact measure called The Reference Return Ratio (3R). Unlike the traditional Journal Impact Factor (JIF), which is based on calculations of publications and citations, the new measure is based on calculations of bibliographic investments (references) and returns (citations). A comparative study of the two measures shows a strong relationship between the 3R and the JIF. Yet, the 3R appears to correct for citation habits, citation dynamics, and composition of document types - problems that typically are raised against the JIF. In addition, contrary to traditional...
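The abstract does not spell out the exact 3R formula, so the following is only a sketch of the investment/return idea it describes - citations received per reference given - with hypothetical journal totals and none of the windowing details of the real measure:

```python
def reference_return_ratio(citations_received, references_given):
    """Returns-over-investments sketch in the spirit of the 3R: how many
    citations a journal gets back per reference it gives out. The actual
    3R definition involves windowing not covered by this abstract."""
    if references_given == 0:
        raise ValueError("journal must cite at least one reference")
    return citations_received / references_given

# Hypothetical yearly totals for one journal.
print(reference_return_ratio(1200, 1500))  # 0.8
```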

  19. The labor/land ratio and India's caste system

    OpenAIRE

    Duleep, Harriet

    2012-01-01

    This paper proposes that India’s caste system and involuntary labor were joint responses by a nonworking landowning class to a low labor/land ratio in which the rules of the caste system supported the institution of involuntary labor. The hypothesis is tested in two ways: longitudinally, with data from ancient religious texts, and cross-sectionally, with twentieth-century statistics on regional population/land ratios linked to anthropological measures of caste-system rigidity. Both the longit...

  20. A study on the effect of free cash flow and profitability current ratio on dividend payout ratio: Evidence from Tehran Stock Exchange

    Directory of Open Access Journals (Sweden)

    Hosein Parsian

    2014-01-01

Full Text Available Decision making about dividend payout is one of the most important decisions that companies encounter. Identifying the factors that influence dividends can help managers make an appropriate dividend policy. On the other side, dividend payouts that are stable over time may influence stock price, future earnings growth and, finally, investors' evaluation of owners' equity. Hence, investigating the factors influencing the dividend payout ratio is of high importance. In this research, we investigate the effects of various factors on the dividend payout ratio of Tehran Stock Exchange (TSE) listed companies. We use time-series regression (panel data) to test the hypotheses of this study, providing empirical evidence from a sample of 102 companies over the 2005-2010 time span. The results show that the independent variables free cash flow and profitability current ratio have negative and significant impacts on the dividend payout ratio, whereas the leverage ratio has a positive and significant impact. The other independent variables, namely company size, growth opportunities and systematic risk, have no significant influence on the dividend payout ratio.

  1. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format since all questions require an enabling-objective numbering scheme. Each question is entered under the enabling objective upon which it is based and is then selected via that enabling objective. This eliminates instructor bias because a random number generator chooses the test questions; however, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time normally required for an instructor to formulate an examination. The need for clerical support is reduced by eliminating the typing of examinations and by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security
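The selection scheme described above - questions filed under enabling objectives and drawn by a random number generator - can be sketched as follows, with a hypothetical miniature bank and objective numbers:

```python
import random

# Hypothetical miniature test bank keyed by enabling-objective number.
bank = {
    "EO-1.1": ["Q1", "Q2", "Q3"],
    "EO-1.2": ["Q4", "Q5"],
    "EO-2.1": ["Q6", "Q7", "Q8"],
}

def build_exam(objectives, rng=random):
    """Pick one question per enabling objective using a random number
    generator, so the instructor does not hand-pick the questions."""
    return [rng.choice(bank[eo]) for eo in objectives]

exam = build_exam(["EO-1.1", "EO-2.1"], rng=random.Random(7))
print(len(exam))  # 2
```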

  2. Sex ratio and Wolbachia infection in the ant Formica exsecta.

    Science.gov (United States)

    Keller, L; Liautard, C; Reuter, M; Brown, W D; Sundström, L; Chapuisat, M

    2001-08-01

    Sex allocation data in social Hymenoptera provide some of the best tests of kin selection, parent-offspring conflict and sex ratio theories. However, these studies critically depend on controlling for confounding ecological factors and on identifying all parties that potentially manipulate colony sex ratio. It has been suggested that maternally inherited parasites may influence sex allocation in social Hymenoptera. If the parasites can influence sex allocation, infected colonies are predicted to invest more resources in females than non-infected colonies, because the parasites are transmitted through females but not males. Prime candidates for such sex ratio manipulation are Wolbachia, because these cytoplasmically transmitted bacteria have been shown to affect the sex ratio of host arthropods by cytoplasmic incompatibility, parthenogenesis, male-killing and feminization. In this study, we tested whether Wolbachia infection is associated with colony sex ratio in two populations of the ant Formica exsecta that have been the subject of extensive sex ratio studies. In these populations colonies specialize in the production of one sex or the other. We found that almost all F. exsecta colonies in both populations are infected with Wolbachia. However, in neither population did we find a significant association in the predicted direction between the prevalence of Wolbachia and colony sex ratio. In particular, colonies with a higher proportion of infected workers did not produce more females. Hence, we conclude that Wolbachia does not seem to alter the sex ratio of its hosts as a means to increase transmission rate in these two populations of ants.

  3. Tests of gravity with future space-based experiments

    Science.gov (United States)

    Sakstein, Jeremy

    2018-03-01

    Future space-based tests of relativistic gravitation—laser ranging to Phobos, accelerometers in orbit, and optical networks surrounding Earth—will constrain the theory of gravity with unprecedented precision by testing the inverse-square law, the strong and weak equivalence principles, and the deflection and time delay of light by massive bodies. In this paper, we estimate the bounds that could be obtained on alternative gravity theories that use screening mechanisms to suppress deviations from general relativity in the Solar System: chameleon, symmetron, and Galileon models. We find that space-based tests of the parametrized post-Newtonian parameter γ will constrain chameleon and symmetron theories to new levels, and that tests of the inverse-square law using laser ranging to Phobos will provide the most stringent constraints on Galileon theories to date. We end by discussing the potential for constraining these theories using upcoming tests of the weak equivalence principle, and conclude that further theoretical modeling is required in order to fully utilize the data.

  4. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    Science.gov (United States)

    2017-05-15

AFRL-SA-WP-SR-2017-0012. Operational Based Vision Assessment Automated Vision Test Collection User Guide. Elizabeth Shoda, Alex... Reporting period: June 2015 – May 2017. ... automated vision tests, or AVT. Development of the AVT was required to support the threshold-level vision testing capability needed to investigate the

  5. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  6. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  7. Robust Confidence Interval for a Ratio of Standard Deviations

    Science.gov (United States)

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…

  8. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

Full Text Available Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) a user describes the requirements of target embedded network systems by logical property-based constraints using SENS; (2) given SENS specifications, test cases are automatically generated using a SAT-based solver, with filtering mechanisms available in our tool to select efficient test cases; (3) in addition, given a testing goal by the user, test sequences are automatically extracted from the exhaustive test cases. We implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.
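The SAT-based generation step can be illustrated with a toy enumerator over boolean constraints; a real tool hands the constraints to a SAT solver instead of enumerating, and the air-conditioning-style requirement below is hypothetical:

```python
from itertools import product

def generate_tests(variables, constraints):
    """Enumerate variable assignments satisfying every property constraint -
    a brute-force stand-in for SAT-based test-case generation."""
    tests = []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(c(env) for c in constraints):
            tests.append(env)
    return tests

# Hypothetical requirements: heater and cooler never on together,
# and the fan must run whenever the heater runs.
constraints = [
    lambda e: not (e["heater"] and e["cooler"]),
    lambda e: e["fan"] or not e["heater"],
]
cases = generate_tests(["heater", "cooler", "fan"], constraints)
print(len(cases))  # 5
```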

  9. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  10. Assessing connectivity of estuarine fishes based on stable isotope ratio analysis

    Science.gov (United States)

    Herzka, Sharon Z.

    2005-07-01

    Assessing connectivity is fundamental to understanding the population dynamics of fishes. I propose that isotopic analyses can greatly contribute to studies of connectivity in estuarine fishes due to the high diversity of isotopic signatures found among estuarine habitats and the fact that variations in isotopic composition at the base of a food web are reflected in the tissues of consumers. Isotopic analysis can be used for identifying nursery habitats and estimating their contribution to adult populations. If movement to a new habitat is accompanied by a shift to foods of distinct isotopic composition, recent immigrants and residents can be distinguished based on their isotopic ratios. Movement patterns thus can be reconstructed based on information obtained from individuals. A key consideration is the rate of isotopic turnover, which determines the length of time that an immigrant to a given habitat will be distinguishable from a longtime resident. A literature survey indicated that few studies have measured turnover rates in fishes and that these have focused on larvae and juveniles. These studies reveal that biomass gain is the primary process driving turnover rates, while metabolic turnover is either minimal or undetectable. Using a simple dilution model and biomass-specific growth rates, I estimated that young fishes with fast growth rates will reflect the isotopic composition of a new diet within days or weeks. Older or slower-growing individuals may take years or never fully equilibrate. Future studies should evaluate the factors that influence turnover rates in fishes during various stages of the life cycle and in different tissues, as well as explore the potential for combining stable isotope and otolith microstructure analyses to examine the relationship between demographic parameters, movement and connectivity.
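The simple dilution model mentioned above can be sketched as follows; the exponential growth-dilution form and the illustrative δ-values are assumptions for demonstration, not taken from the paper:

```python
from math import exp

def isotope_signature(delta_initial, delta_diet, growth_rate, days, m=0.0):
    """Growth-dilution model of isotopic turnover:
    delta(t) = delta_diet + (delta_initial - delta_diet) * exp(-(k + m) * t),
    where k is the biomass-specific growth rate per day and m a metabolic
    turnover term (negligible per the studies surveyed above)."""
    return delta_diet + (delta_initial - delta_diet) * exp(-(growth_rate + m) * days)

# A fast-growing juvenile (10%/day) nearly equilibrates to the new diet
# within a month; hypothetical delta values.
print(round(isotope_signature(-18.0, -12.0, 0.10, 30), 2))  # -12.3
```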

  11. High - Resolution SST Record Based on Mg/Ca Ratios of Late Holocene Planktonic Foraminifers From the Great Bahama Bank

    Science.gov (United States)

    Mueller, A.; Reijmer, J. J.; Roth, S.

    2001-12-01

We analyzed five different planktic foraminifera species in the high-resolution core MD 992201 off the Great Bahama Bank (79° 16.34 W; 25° 53.49 N) at 290 m water depth. This 38.05 m long core comprises a 7,000-year Holocene record. The selected species were Orbulina universa, Globigerinoides ruber, Globigerinoides sacculifer, Globorotalia menardii and Globigerinella aequilateralis, which live in the upper 200 m of the water column. The Mg/Ca ratios of these foraminifers show species-specific values representing distinct habitat depths, so a temperature profile through the water column can be reconstructed from them. The lowest Mg/Ca ratios are shown by G. menardii (2.5 - 4 mmol/mol), followed by G. sacculifer (4.2 - 5.6 mmol/mol), G. ruber (5.1 - 7.2 mmol/mol) and G. aequilateralis (5.5 - 8.7 mmol/mol); the highest are shown by O. universa (6 - 14 mmol/mol). During the Little Ice Age, the Mg/Ca ratios of all species except the deeper-dwelling G. menardii became more variable and lower. Shallow-dwelling species such as G. ruber and G. sacculifer display an increase in Mg/Ca ratios during the Medieval Warm Period. Our data show that transferring Mg/Ca ratios into SST with calibration curves known from the literature needs re-evaluation; species-specific calibration seems necessary to achieve reliable results.

  12. Impact of H2/CO ratios on phase and performance of Mn-modified Fe-based Fischer Tropsch synthesis catalyst

    International Nuclear Information System (INIS)

    Ding, Mingyue; Yang, Yong; Li, Yongwang; Wang, Tiejun; Ma, Longlong; Wu, Chuangzhi

    2013-01-01

Highlights: ► Decreasing the H2/CO ratio facilitated the conversion of Fe3O4 to iron carbides in the surface layers. ► The formation of surface carbonaceous species was promoted at higher CO partial pressure. ► The formation of iron carbides on the surface of Fe3O4 provided the FTS active sites. ► Decreasing the H2/CO ratio shifted the products towards heavy hydrocarbons. - Abstract: The impacts of H2/CO ratios on both the bulk and surface compositions of an iron-manganese based catalyst were investigated by XRD, MES, N2-physisorption, XPS and LRS. Fischer-Tropsch (F-T) synthesis performance was studied in a slurry-phase continuously stirred tank reactor. The characterization results showed that the fresh catalyst comprised hematite, which was converted first to Fe3O4 and then carburized to iron carbides in both the bulk and surface regions under atmospheres of different H2/CO ratios. Pretreatment at a lower H2/CO ratio facilitated the formation of iron carbides on the surface of magnetite and of surface carbonaceous species. During the F-T synthesis reaction, the catalyst reduced at a lower H2/CO ratio presented higher catalytic activity, which is probably attributable to the formation of more iron carbides (especially χ-Fe5C2) on the magnetite surface. The increase of CO partial pressure shifted the product distribution towards heavy hydrocarbons

  13. Single specimen fracture toughness determination procedure using instrumented impact test

    International Nuclear Information System (INIS)

    Rintamaa, R.

    1993-04-01

In this study a new single-specimen test method and testing facility for evaluating dynamic fracture toughness has been developed. The method is based on a new pendulum-type instrumented impact tester equipped with an optical crack mouth opening displacement (COD) extensometer. The fracture toughness measurement technique uses the Double Displacement Ratio (DDR) method, which is based on the assumption that the specimen deforms as two rigid arms rotating around an apparent centre of rotation. This apparent centre moves as the crack grows, and the ratio of COD to specimen displacement changes; as a consequence, the onset of ductile crack initiation can be detected on the load-displacement curve, and an energy-based fracture toughness can be calculated. In addition, the testing apparatus can use specimens with double the ligament size of the standard Charpy specimen, which makes impact testing more appropriate from the fracture mechanics point of view. The novel features of the testing facility and the feasibility of the new DDR method have been verified by an extensive experimental and analytical study. (99 refs., 91 figs., 27 tabs.)

  14. Power calculations for likelihood ratio tests for offspring genotype risks, maternal effects, and parent-of-origin (POO) effects in the presence of missing parental genotypes when unaffected siblings are available.

    Science.gov (United States)

    Rampersaud, E; Morris, R W; Weinberg, C R; Speer, M C; Martin, E R

    2007-01-01

    Genotype-based likelihood-ratio tests (LRT) of association that examine maternal and parent-of-origin effects have been previously developed in the framework of log-linear and conditional logistic regression models. In the situation where parental genotypes are missing, the expectation-maximization (EM) algorithm has been incorporated in the log-linear approach to allow incomplete triads to contribute to the LRT. We present an extension to this model which we call the Combined_LRT that incorporates additional information from the genotypes of unaffected siblings to improve assignment of incompletely typed families to mating type categories, thereby improving inference of missing parental data. Using simulations involving a realistic array of family structures, we demonstrate the validity of the Combined_LRT under the null hypothesis of no association and provide power comparisons under varying levels of missing data and using sibling genotype data. We demonstrate the improved power of the Combined_LRT compared with the family-based association test (FBAT), another widely used association test. Lastly, we apply the Combined_LRT to a candidate gene analysis in Autism families, some of which have missing parental genotypes. We conclude that the proposed log-linear model will be an important tool for future candidate gene studies, for many complex diseases where unaffected siblings can often be ascertained and where epigenetic factors such as imprinting may play a role in disease etiology.

  15. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and the presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth trying in discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.

  16. Testing Measurement Invariance Using MIMIC: Likelihood Ratio Test with a Critical Value Adjustment

    Science.gov (United States)

    Kim, Eun Sook; Yoon, Myeongsun; Lee, Taehun

    2012-01-01

    Multiple-indicators multiple-causes (MIMIC) modeling is often used to test a latent group mean difference while assuming the equivalence of factor loadings and intercepts over groups. However, this study demonstrated that MIMIC was insensitive to the presence of factor loading noninvariance, which implies that factor loading invariance should be…

  17. Gene-based testing of interactions in association studies of quantitative traits.

    Directory of Open Access Journals (Sweden)

    Li Ma

    Various methods have been developed for identifying gene-gene interactions in genome-wide association studies (GWAS). However, most methods focus on individual markers as the testing unit, and the large number of such tests drastically erodes statistical power. In this study, we propose novel interaction tests of quantitative traits that are gene-based and that confer advantages in both statistical power and biological interpretation. The framework of gene-based gene-gene interaction (GGG) tests combines marker-based interaction tests between all pairs of markers in two genes to produce a gene-level test for interaction between the two. The tests are based on an analytical formula we derive for the correlation between marker-based interaction tests due to linkage disequilibrium. We propose four GGG tests that extend the following P value combining methods: minimum P value, extended Simes procedure, truncated tail strength, and truncated P value product. Extensive simulations point to correct type I error rates for all tests and show that the two truncated tests are more powerful than the other tests when markers involved in the underlying interaction are not directly genotyped and when there are multiple underlying interactions. We applied our tests to pairs of genes that exhibit a protein-protein interaction to test for gene-level interactions underlying lipid levels using genotype data from the Atherosclerosis Risk in Communities study. We identified five novel interactions that are not evident from marker-based interaction testing and successfully replicated one of these interactions, between SMAD3 and NEDD9, in an independent sample from the Multi-Ethnic Study of Atherosclerosis. We conclude that our GGG tests show improved power to identify gene-level interactions in existing, as well as emerging, association studies.
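
    As an illustration of the p-value combining step named in the abstract, here is a minimal sketch (not the authors' implementation) of three of the combining rules, applied to the marker-pair interaction p-values of a gene pair. The truncation threshold tau = 0.05 and the example p-values are assumed for illustration.

    ```python
    def combine_pvalues(pvals, tau=0.05):
        """Gene-level summaries of marker-pair interaction p-values.

        Sketch of three combining rules named in the abstract: minimum P,
        the Simes statistic, and the truncated P value product. The
        threshold tau is an illustrative choice.
        """
        p = sorted(pvals)
        m = len(p)
        min_p = p[0]
        # Simes: minimum over i of p_(i) * m / i for the ordered p-values
        simes = min(pi * m / (i + 1) for i, pi in enumerate(p))
        # Truncated P value product: multiply only the p-values <= tau
        prod = 1.0
        for pi in p:
            if pi <= tau:
                prod *= pi
        return min_p, simes, prod

    # Four marker-pair tests between two hypothetical genes
    mp, sm, tp = combine_pvalues([0.001, 0.04, 0.2, 0.5])
    ```

    Turning the truncated statistics into a gene-level p-value requires their null distribution under linkage disequilibrium between markers, which is the substantive contribution of the paper and is omitted here.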

  18. Ratios of charmed and bottom meson decay constants

    International Nuclear Information System (INIS)

    Oakes, R.J.

    1994-01-01

    It is shown that general features of the standard theory require the double ratio (f_Bs/f_Bd)/(f_Ds/f_Dd) to be very nearly unity, with only very small corrections. The ratios f_Bs/f_Bd and f_Ds/f_Dd are both also near unity, within small corrections which partially cancel in the double ratio. A precise measurement of f_Ds/f_Dd therefore provides a sensitive test of some generally accepted features of the standard Lagrangian, and also determines the value of f_Bs/f_Bd, which is important for calculating the relative strengths of B_s-B̄_s and B_d-B̄_d mixing.

  19. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
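
    The two-group modeling of test statistics described above can be sketched with a toy Gaussian mixture. The null N(0,1), the alternative N(2.5,1), and the prior null probability 0.9 are all assumed values for illustration, not the paper's fitted model.

    ```python
    import math

    def posterior_null_prob(t, pi0=0.9, mu1=2.5, sigma1=1.0):
        """P(null | t) under a toy two-group model for a test statistic:
        t ~ N(0, 1) under the null, t ~ N(mu1, sigma1) under the
        alternative, with prior null probability pi0 (all illustrative)."""
        def norm_pdf(x, mu, sd):
            return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
        f0 = pi0 * norm_pdf(t, 0.0, 1.0)
        f1 = (1.0 - pi0) * norm_pdf(t, mu1, sigma1)
        return f0 / (f0 + f1)
    ```

    A Bayesian FDR rule then rejects the hypotheses with the smallest posterior null probabilities, subject to their average remaining below the target FDR level.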

  20. A Quantitative Analysis of Evidence-Based Testing Practices in Nursing Education

    Science.gov (United States)

    Moore, Wendy

    2017-01-01

    The focus of this dissertation is evidence-based testing practices in nursing education. Specifically, this research study explored the implementation of evidence-based testing practices between nursing faculty of various experience levels. While the significance of evidence-based testing in nursing education is well documented, little is known…

  1. Thermal property testing technique on micro specimen

    International Nuclear Information System (INIS)

    Baba, Tetsuya; Kishimoto, Isao; Taketoshi, Naoyuki

    2000-01-01

    This study aims to further develop the testing techniques from the nuclear advanced basic research accumulated by the National Research Laboratory of Metrology over ten years. For this purpose, techniques were developed to measure the thermal diffusivity and specific heat capacity of micro specimens less than 3 mm in diameter and 1 mm in thickness, and to measure the thermal diffusivity of micro areas smaller than 1 mm on the cross section of column specimens less than 10 mm in diameter, so as to contribute to the common basic technology supporting the nuclear power field. As an element technology for measuring the thermal diffusivity and specific heat capacity of micro specimens, a holding technique that stably supports a specimen of 3 mm in diameter was developed. For measuring specific heat capacity by laser flash differential calorimetry, a technique to hold two specimens of 5 mm in diameter in close proximity was also developed. In addition, development of a thermal property database capable of storing the thermal property data obtained in this study was promoted, and in fiscal year 1998 a data input/output program with a graphical user interface was prepared. (G.K.)
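
    For the laser-flash measurements mentioned above, thermal diffusivity is conventionally evaluated from the half-rise time of the rear-face temperature via Parker's relation. This is the standard textbook formula, shown as an illustration; the record itself does not state the evaluation procedure.

    ```python
    def thermal_diffusivity(thickness_m, t_half_s):
        """Parker's laser-flash relation: alpha = 0.1388 * L**2 / t_half,
        with L the specimen thickness and t_half the time for the rear
        face to reach half of its maximum temperature rise."""
        return 0.1388 * thickness_m ** 2 / t_half_s

    # A 1 mm thick micro specimen with an assumed 50 ms half-rise time
    alpha = thermal_diffusivity(1.0e-3, 0.05)  # m^2/s
    ```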

  2. The Liquidity Coverage Ratio: the need for further complementary ratios?

    OpenAIRE

    Ojo, Marianne

    2013-01-01

    This paper considers components of the Liquidity Coverage Ratio – as well as certain prevailing gaps which may necessitate the introduction of a complementary liquidity ratio. The definitions and objectives accorded to the Liquidity Coverage Ratio (LCR) and Net Stable Funding Ratio (NSFR) highlight the focus which is accorded to time horizons for funding bank operations. A ratio which would focus on the rate of liquidity transformations and which could also serve as a complementary metric gi...

  3. Impact on colorectal cancer mortality of screening programmes based on the faecal immunochemical test.

    Science.gov (United States)

    Zorzi, Manuel; Fedeli, Ugo; Schievano, Elena; Bovo, Emanuela; Guzzinati, Stefano; Baracco, Susanna; Fedato, Chiara; Saugo, Mario; Dei Tos, Angelo Paolo

    2015-05-01

    Colorectal cancer (CRC) screening programmes based on the guaiac faecal occult blood test (gFOBT) reduce CRC-specific mortality. Several studies have shown higher sensitivity with the faecal immunochemical test (FIT) compared with gFOBT. We carried out an ecological study to evaluate the impact of FIT-based screening programmes on CRC mortality. In the Veneto Region (Italy), biennial FIT-based screening programmes that invited 50-69-year-old residents were introduced in different areas between 2002 and 2009. We compared CRC mortality rates from 1995 to 2011 between the areas where screening started in 2002-2004 (early screening areas (ESA)) and areas that introduced the screening in 2008-2009 (late screening areas (LSA)) using Poisson regression models. We also compared available data on CRC incidence rates (1995-2007) and surgical resection rates (2001-2012). Before the introduction of screening, CRC mortality and incidence rates in the two areas were similar. Compared with 1995-2000, 2006-2011 mortality rates were 22% lower in the ESA than in the LSA (rate ratio (RR)=0.78; 95% CI 0.68 to 0.89). The reduction was larger in women (RR=0.64; CI 0.51 to 0.80) than in men (RR=0.87; CI 0.73 to 1.04). In the ESA, incidence and surgery rates peaked during the introduction of the screening programme and then returned to the baseline (2006-2007 incidence) or dropped below initial values (surgery after 2007). FIT-based screening programmes were associated with a significant reduction in CRC mortality. This effect took place much earlier than reported by gFOBT-based trials and observational studies. Published by the BMJ Publishing Group Limited.

  4. Model-Assisted Control of Flow Front in Resin Transfer Molding Based on Real-Time Estimation of Permeability/Porosity Ratio

    Directory of Open Access Journals (Sweden)

    Bai-Jian Wei

    2016-09-01

    Resin transfer molding (RTM) is a popular manufacturing technique that produces fiber reinforced polymer (FRP) composites. In this paper, a model-assisted flow front control system is developed based on real-time estimation of the permeability/porosity ratio using the information acquired by a visualization system. In the proposed control system, a radial basis function (RBF) network meta-model is utilized to predict the position of the future flow front from the injection pressure, the current position of the flow front, and the estimated ratio. By conducting optimization based on the meta-model, the value of injection pressure to be implemented at each step is obtained. Moreover, a cascade control structure is established to further improve the control performance. Experiments show that the developed system successfully enhances the performance of flow front control in RTM. In particular, the cascade structure makes the control system robust to model mismatch.

  5. An Accurate Method for Inferring Relatedness in Large Datasets of Unphased Genotypes via an Embedded Likelihood-Ratio Test

    KAUST Repository

    Rodriguez, Jesse M.

    2013-01-01

    Studies that map disease genes rely on accurate annotations that indicate whether individuals in the studied cohorts are related to each other or not. For example, in genome-wide association studies, the cohort members are assumed to be unrelated to one another. Investigators can correct for individuals in a cohort with previously-unknown shared familial descent by detecting genomic segments that are shared between them, which are considered to be identical by descent (IBD). Alternatively, elevated frequencies of IBD segments near a particular locus among affected individuals can be indicative of a disease-associated gene. As genotyping studies grow to use increasingly large sample sizes and meta-analyses begin to include many data sets, accurate and efficient detection of hidden relatedness becomes a challenge. To enable disease-mapping studies of increasingly large cohorts, a fast and accurate method to detect IBD segments is required. We present PARENTE, a novel method for detecting related pairs of individuals and shared haplotypic segments within these pairs. PARENTE is a computationally-efficient method based on an embedded likelihood ratio test. As demonstrated by the results of our simulations, our method exhibits better accuracy than the current state of the art, and can be used for the analysis of large genotyped cohorts. PARENTE's higher accuracy becomes even more significant in more challenging scenarios, such as detecting shorter IBD segments or when an extremely low false-positive rate is required. PARENTE is publicly and freely available at http://parente.stanford.edu/. © 2013 Springer-Verlag.

  6. Ratios of involved nodes in early breast cancer

    International Nuclear Information System (INIS)

    Vinh-Hung, Vincent; Royce, Melanie; Verschraegen, Claire; Promish, Donald I; Cserni, Gábor; Van de Steene, Jan; Tai, Patricia; Vlastos, Georges; Voordeckers, Mia; Storme, Guy

    2004-01-01

    The number of lymph nodes found to be involved in an axillary dissection is among the most powerful prognostic factors in breast cancer, but it is confounded by the number of lymph nodes that have been examined. We investigate an idea that has surfaced recently in the literature (since 1999), namely that the proportion of positive lymph nodes (or a function thereof) is a much better predictor of survival than the number of excised and positive lymph nodes, alone or together. The data were abstracted from 83,686 cases registered in the Surveillance, Epidemiology, and End Results (SEER) program of women diagnosed with nonmetastatic T1–T2 primary breast carcinoma between 1988 and 1997, in whom axillary node dissection was performed. The end-point was death from breast cancer. Cox models based on different expressions of nodal involvement were compared using the Nagelkerke R^2 index (R^2_N). Ratios were modeled as percentage and as log odds of involved nodes. Log odds were estimated in a way that avoids singularities (zero values) by using the empirical logistic transform. In node-negative cases both the number of nodes excised and the log odds were significant, with hazard ratios of 0.991 (95% confidence interval 0.986–0.997) and 1.150 (1.058–1.249), respectively, but without improving R^2_N. In node-positive cases the hazard ratios were 1.003–1.088 for the number of involved nodes, 0.966–1.005 for the number of excised nodes, 1.015–1.017 for the percentage, and 1.344–1.381 for the log odds. R^2_N improved from 0.067 (no nodal covariate) to 0.102 (models based on counts only) and to 0.108 (models based on ratios). Ratios are simple optimal predictors, in that they provide at least the same prognostic value as the more traditional staging based on counting of involved nodes, without replacing them with a needlessly complicated alternative. They can be viewed as a per-patient standardization in which the number of involved nodes is standardized.
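
    The empirical logistic transform mentioned above can be sketched as follows; the 0.5 continuity correction is the usual form of that transform, assumed here rather than quoted from the paper.

    ```python
    import math

    def empirical_log_odds(involved, examined):
        """Empirical logistic transform of the nodal ratio: adding 0.5 to
        the numerator and denominator counts keeps the log odds finite
        even when no node (or every node) is involved."""
        return math.log((involved + 0.5) / (examined - involved + 0.5))
    ```

    For a node-negative patient with 10 nodes examined this gives log(0.5/10.5) ≈ -3.04, so even node-negative cases carry information about how thoroughly the axilla was sampled.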

  7. Golden Ratio

    Indian Academy of Sciences (India)

    Keywords: Fibonacci numbers, golden ratio, Sanskrit prosody, solar panel. Abstract: Our attraction to another body increases if the body is symmetrical and in proportion. If a face or a structure is in proportion, we are more likely to notice it and find it beautiful. The universal ratio of beauty is the 'Golden Ratio', found in many ...

  8. Determination of Optimum Compression Ratio: A Tribological Aspect

    Directory of Open Access Journals (Sweden)

    L. Yüksek

    2013-12-01

    Internal combustion engines are the primary energy conversion machines both in industry and in transportation. Modern technologies are being implemented in engines to fulfill today's demand for low fuel consumption. The friction energy consumed by the rubbing parts of an engine is becoming an important parameter for higher fuel efficiency. The rate of friction loss is primarily affected by the sliding speed and the load acting upon the rubbing surfaces. Compression ratio is the main parameter that increases the peak cylinder pressure, and hence the normal load on components. The aim of this study is to investigate the effect of compression ratio on the total friction loss of a diesel engine. A variable compression ratio diesel engine was operated at four different compression ratios: 12.96, 15.59, 18.03, and 20.17. Brake power and speed were kept constant at predefined values while measuring the in-cylinder pressure. Friction mean effective pressure (FMEP) data were obtained from the in-cylinder pressure curves for each compression ratio. The ratio of friction power to indicated power of the engine increased from 22.83% to 37.06% as the compression ratio varied from 12.96 to 20.17. Considering the thermal efficiency, FMEP, and maximum in-cylinder pressure, the optimum compression ratio interval of the test engine was determined as 18.8 ÷ 19.6.
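
    The friction share reported above follows from mean effective pressures; below is a minimal sketch of that bookkeeping, with made-up example values rather than the paper's measurements.

    ```python
    def friction_share(imep_bar, bmep_bar):
        """FMEP = IMEP - BMEP; at fixed speed and displacement the ratio
        FMEP/IMEP equals friction power over indicated power."""
        fmep = imep_bar - bmep_bar
        return fmep, fmep / imep_bar

    # Illustrative pressures in bar (not from the study)
    fmep, share = friction_share(10.0, 7.5)
    ```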

  9. A performance-oriented and risk-based regulation for containment testing

    International Nuclear Information System (INIS)

    Dey, M.

    1994-01-01

    In August 1992, the NRC initiated a major effort to develop requirements for containment testing that are less prescriptive and more performance-oriented and risk-based. This action was a result of public comments and several studies that concluded that the economic burden of certain present containment testing requirements is not commensurate with their safety benefits. The rulemaking will include consideration of relaxing the allowable containment leakage rate, increasing the interval for the integrated containment test, and establishing intervals for the local containment leak rate tests based on their performance. A study has been conducted to provide technical information for establishing a performance criterion for containment tests, the allowable leakage rate, commensurate with its significance to total public risk. The study used the results of a recent comprehensive study conducted by the NRC, NUREG-1150, 'Severe Accident Risks: An Assessment for Five U.S. Nuclear Power Plants,' to examine the sensitivity of public risk to containment leakage. Risk was found to be insensitive to containment leakage rates up to levels of about 100 percent-volume per day for certain types of containments. PRA methods have also been developed to establish risk-based intervals for containment tests based on their past performance. Preliminary evaluations show that increasing the interval for the integrated containment leakage test from three times every ten years to once every ten years would have an insignificant impact on public risk. Preliminary analyses of operational experience data for local leak rate tests show that performance-based testing, in which valves and penetrations that perform well are tested less frequently, is feasible with marginal impact on safety. The above technical studies are being used to develop efficient (cost-effective) requirements for containment tests. (author). 4 refs., 2 figs

  10. Home-based versus mobile clinic HIV testing and counseling in rural Lesotho: a cluster-randomized trial.

    Directory of Open Access Journals (Sweden)

    Niklaus Daniel Labhardt

    2014-12-01

    The success of HIV programs relies on widely accessible HIV testing and counseling (HTC) services at health facilities as well as in the community. Home-based HTC (HB-HTC) is a popular community-based approach to reach persons who do not test at health facilities. Data comparing HB-HTC to other community-based HTC approaches are very limited. This trial compares HB-HTC to mobile clinic HTC (MC-HTC). The trial was powered to test the hypothesis of higher HTC uptake in HB-HTC campaigns than in MC-HTC campaigns. Twelve clusters were randomly allocated to HB-HTC or MC-HTC. The six clusters in the HB-HTC group received 30 1-d multi-disease campaigns (five villages per cluster) that delivered services by going door-to-door, whereas the six clusters in the MC-HTC group received campaigns involving community gatherings in the 30 villages with subsequent service provision in mobile clinics. Time allocation and human resources were standardized and equal in both groups. All individuals accessing the campaigns with unknown HIV status, or whose last HIV test was >12 wk ago and was negative, were eligible. All outcomes were assessed at the individual level. Statistical analysis used multivariable logistic regression. Odds ratios and p-values were adjusted for gender, age, and cluster effect. Out of 3,197 participants from the 12 clusters, 2,563 (80.2%) were eligible (HB-HTC: 1,171; MC-HTC: 1,392). The results for the primary outcomes were as follows. Overall HTC uptake was higher in the HB-HTC group than in the MC-HTC group (92.5% versus 86.7%; adjusted odds ratio [aOR]: 2.06; 95% CI: 1.18-3.60; p = 0.011). Among adolescents and adults ≥12 y, HTC uptake did not differ significantly between the two groups; however, in children <12 y, HTC uptake was higher in the HB-HTC arm (87.5% versus 58.7%; aOR: 4.91; 95% CI: 2.41-10.0; p<0.001). Out of those who took up HTC, 114 (4.9%) tested HIV-positive, 39 (3.6%) in the HB-HTC arm and 75 (6.2%) in the MC-HTC arm (aOR: 0.64; 95% CI
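
    For reference, the unadjusted (crude) odds ratio behind uptake figures like "92.5% versus 86.7%" is computed as below; the trial's reported aORs additionally adjust for gender, age, and cluster effect, so this sketch will not reproduce them exactly.

    ```python
    def crude_odds_ratio(events_a, total_a, events_b, total_b):
        """Crude odds ratio of an outcome between arm A and arm B,
        from event counts and arm sizes (no covariate adjustment)."""
        odds_a = events_a / (total_a - events_a)
        odds_b = events_b / (total_b - events_b)
        return odds_a / odds_b
    ```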

  11. Systematic study of the π-/π+ ratio in heavy-ion collisions with the same neutron/proton ratio but different masses

    International Nuclear Information System (INIS)

    Zhang Ming; Xiao Zhigang; Zhu Shengjiang; Li Baoan; Chen Liewen; Yong Gaochan

    2009-01-01

    A systematic study of the π⁻/π⁺ ratio in heavy-ion collisions with the same neutron/proton ratio but different masses can help single out effects of the nuclear mean field on pion production. Based on simulations using the IBUU04 transport model, it is found that the π⁻/π⁺ ratio in head-on collisions of 48Ca+48Ca, 124Sn+124Sn, and 197Au+197Au at beam energies from 0.25 to 0.6 GeV/nucleon increases with increasing system size or decreasing beam energy. A comprehensive analysis of the dynamical isospin fractionation and the π⁻/π⁺ ratio, as well as their time evolution and spatial distributions, demonstrates clearly that the π⁻/π⁺ ratio is an effective probe of the high-density behavior of the nuclear symmetry energy.

  12. Uncertainty management in knowledge based systems for nondestructive testing-an example from ultrasonic testing

    International Nuclear Information System (INIS)

    Rajagopalan, C.; Kalyanasundaram, P.; Baldev Raj

    1996-01-01

    The use of fuzzy logic, as a framework for uncertainty management, in a knowledge-based system (KBS) for ultrasonic testing of austenitic stainless steels is described. Parameters that may contain uncertain values are identified. Methodologies to handle uncertainty in these parameters using fuzzy logic are detailed. The overall improvement in the performance of the knowledge-based system after incorporating fuzzy logic is discussed. The methodology developed being universal, its extension to other KBS for nondestructive testing and evaluation is highlighted. (author)

  13. Golden Ratio

    Indian Academy of Sciences (India)

    Our attraction to another body increases if the body is symmetrical and in proportion. If a face or a structure is in proportion, we are more likely to notice it and find it beautiful. The universal ratio of beauty is the 'Golden Ratio', found in many structures. This ratio comes from Fibonacci numbers. In this article, we explore this ...

  14. Alpha-in-air monitor for continuous monitoring based on alpha to beta ratio

    International Nuclear Information System (INIS)

    Somayaji, K.S.; Venkataramani, R.; Swaminathan, N.; Pushparaja

    1997-01-01

    Measurement of long-lived alpha activity collected on a filter paper in continuous air monitoring of the ambient working environment is difficult due to interference from much larger concentrations of short-lived alpha-emitting daughter products of 222Rn and 220Rn. However, the ratio between the natural alpha and beta activity is approximately constant, and this constancy of the ratio is used to discriminate against short-lived natural radioactivity in continuous air monitoring. A detection system was specially designed for the purpose of simultaneously counting the alpha and beta activity deposited on the filter paper during continuous monitoring. The activity ratios were calculated and plotted against the monitoring duration up to about six hours. Monitoring was carried out in three facilities with different ventilation conditions. The presence of any long-lived alpha contamination on the filter paper results in an increase in the alpha to beta ratio. Long-lived 239Pu contamination of about 16 DAC.h could be detected after about 45 minutes from the commencement of sampling. The experimental results using prototype units have shown that the approach of using the alpha to beta activity ratio to detect long-lived alpha activity in the presence of short-lived natural activity is satisfactory. (author)
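
    The discrimination logic described above reduces to comparing the measured alpha/beta count ratio with the approximately constant natural-background ratio. A minimal sketch follows; the alarm factor of 1.5 and all example counts are assumed values, not settings from the paper.

    ```python
    def long_lived_alpha_present(alpha_counts, beta_counts,
                                 natural_ratio, alarm_factor=1.5):
        """Flag long-lived alpha contamination when the alpha/beta count
        ratio exceeds the approximately constant natural-background ratio
        by a chosen factor (alarm_factor is illustrative)."""
        return (alpha_counts / beta_counts) > alarm_factor * natural_ratio
    ```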

  15. Autonomic Neuropathy—a Prospective Cohort Study of Symptoms and E/I Ratio in Normal Glucose Tolerance, Impaired Glucose Tolerance, and Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Malin Zimmerman

    2018-03-01

    Background: Autonomic neuropathy in diabetes, in addition to causing a range of symptoms originating from the autonomic nervous system, may increase cardiovascular morbidity. Our aim was to study the progression of autonomic neuropathy, based on a symptom score and evaluation of an autonomic test, in persons with normal and impaired glucose tolerance and in patients with type 2 diabetes (T2D). Methods: Participants were recruited in 2003/2004 with a follow-up in 2014. The participants' glucose tolerance was categorized using oral glucose tolerance tests. Symptoms were evaluated using an autonomic symptom score (ASS), ECG was used to test cardiac autonomic function based on the expiration/inspiration ratio (E/I ratio), and blood samples were taken on both occasions. Results: ASSs were higher at follow-up in the T2D patients than in the normal glucose tolerance group (mean 1.21 ± 1.30 vs. 0.79 ± 0.7; p < 0.05). The E/I ratio did not deteriorate more than could be expected as an aging effect in well-controlled T2D. No relationship was found between the E/I ratio and HbA1c or ASS. Conclusion: The presence of autonomic symptoms increased over time in T2D patients, but the symptoms did not correlate with the E/I ratio in this metabolically well-controlled cohort. ASSs can be a useful clinical tool when assessing the progression of autonomic dysfunction in patients with abnormal glucose metabolism.
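
    The E/I ratio used above is typically computed from R-R intervals on a deep-breathing ECG. Below is a minimal sketch assuming the conventional definition (mean of the longest R-R intervals during expiration over the mean of the shortest during inspiration), which the abstract does not spell out; the example intervals are invented.

    ```python
    def ei_ratio(expiration_rr_ms, inspiration_rr_ms):
        """E/I ratio from a deep-breathing test: mean of the longest R-R
        intervals in expiration divided by the mean of the shortest R-R
        intervals in inspiration (one value per breathing cycle)."""
        mean_e = sum(expiration_rr_ms) / len(expiration_rr_ms)
        mean_i = sum(inspiration_rr_ms) / len(inspiration_rr_ms)
        return mean_e / mean_i

    # Three breathing cycles of a hypothetical recording (ms)
    ratio = ei_ratio([1000, 1100, 1050], [800, 850, 750])
    ```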

  16. Exploring pharmacy and home-based sexually transmissible infection testing.

    Science.gov (United States)

    Habel, Melissa A; Scheinmann, Roberta; Verdesoto, Elizabeth; Gaydos, Charlotte; Bertisch, Maggie; Chiasson, Mary Ann

    2015-11-01

    Background This study assessed the feasibility and acceptability of pharmacy and home-based sexually transmissible infection (STI) screening as alternate testing venues among emergency contraception (EC) users. The study included two phases in February 2011-July 2012. In Phase I, customers purchasing EC from eight pharmacies in Manhattan received vouchers for free STI testing at onsite medical clinics. In Phase II, three Facebook ads targeted EC users to connect them with free home-based STI test kits ordered online. Participants completed a self-administered survey. Only 38 participants enrolled in Phase I: 90% female, ≤29 years (74%), 45% White non-Hispanic and 75% college graduates; 71% were not tested for STIs in the past year and 68% reported a new partner in the past 3 months. None tested positive for STIs. In Phase II, ads led to >45000 click-throughs, 382 completed the survey and 290 requested kits; 28% were returned. Phase II participants were younger and less educated than Phase I participants; six tested positive for STIs. Challenges included recruitment, pharmacy staff participation, advertising with discretion and cost. This study found low uptake of pharmacy and home-based testing among EC users; however, STI testing in these settings is feasible and the acceptability findings indicate an appeal among younger women for testing in non-traditional settings. Collaborating with and training pharmacy and medical staff are key elements of service provision. Future research should explore how different permutations of expanding screening in non-traditional settings could improve testing uptake and detect additional STI cases.

  17. Tau hadronic branching ratios

    CERN Document Server

    Buskulic, Damir; De Bonis, I; Décamp, D; Ghez, P; Goy, C; Lees, J P; Lucotte, A; Minard, M N; Odier, P; Pietrzyk, B; Ariztizabal, F; Chmeissani, M; Crespo, J M; Efthymiopoulos, I; Fernández, E; Fernández-Bosman, M; Gaitan, V; Martínez, M; Orteu, S; Pacheco, A; Padilla, C; Palla, Fabrizio; Pascual, A; Perlas, J A; Sánchez, F; Teubert, F; Colaleo, A; Creanza, D; De Palma, M; Farilla, A; Gelao, G; Girone, M; Iaselli, Giuseppe; Maggi, G; Maggi, M; Marinelli, N; Natali, S; Nuzzo, S; Ranieri, A; Raso, G; Romano, F; Ruggieri, F; Selvaggi, G; Silvestris, L; Tempesta, P; Zito, G; Huang, X; Lin, J; Ouyang, Q; Wang, T; Xie, Y; Xu, R; Xue, S; Zhang, J; Zhang, L; Zhao, W; Bonvicini, G; Cattaneo, M; Comas, P; Coyle, P; Drevermann, H; Engelhardt, A; Forty, Roger W; Frank, M; Hagelberg, R; Harvey, J; Jacobsen, R; Janot, P; Jost, B; Kneringer, E; Knobloch, J; Lehraus, Ivan; Markou, C; Martin, E B; Mato, P; Minten, Adolf G; Miquel, R; Oest, T; Palazzi, P; Pater, J R; Pusztaszeri, J F; Ranjard, F; Rensing, P E; Rolandi, Luigi; Schlatter, W D; Schmelling, M; Schneider, O; Tejessy, W; Tomalin, I R; Venturi, A; Wachsmuth, H W; Wiedenmann, W; Wildish, T; Witzeling, W; Wotschack, J; Ajaltouni, Ziad J; Bardadin-Otwinowska, Maria; Barrès, A; Boyer, C; Falvard, A; Gay, P; Guicheney, C; Henrard, P; Jousset, J; Michel, B; Monteil, S; Pallin, D; Perret, P; Podlyski, F; Proriol, J; Rossignol, J M; Saadi, F; Fearnley, Tom; Hansen, J B; Hansen, J D; Hansen, J R; Hansen, P H; Nilsson, B S; Kyriakis, A; Simopoulou, Errietta; Siotis, I; Vayaki, Anna; Zachariadou, K; Blondel, A; Bonneaud, G R; Brient, J C; Bourdon, P; Passalacqua, L; Rougé, A; Rumpf, M; Tanaka, R; Valassi, Andrea; Verderi, M; Videau, H L; Candlin, D J; Parsons, M I; Focardi, E; Parrini, G; Corden, M; Delfino, M C; Georgiopoulos, C H; Jaffe, D E; Antonelli, A; Bencivenni, G; Bologna, G; Bossi, F; Campana, P; Capon, G; Chiarella, V; Felici, G; Laurelli, P; Mannocchi, G; Murtas, F; Murtas, G P; Pepé-Altarelli, M; Dorris, S J; Halley, 
A W; ten Have, I; Knowles, I G; Lynch, J G; Morton, W T; O'Shea, V; Raine, C; Reeves, P; Scarr, J M; Smith, K; Smith, M G; Thompson, A S; Thomson, F; Thorn, S; Turnbull, R M; Becker, U; Braun, O; Geweniger, C; Graefe, G; Hanke, P; Hepp, V; Kluge, E E; Putzer, A; Rensch, B; Schmidt, M; Sommer, J; Stenzel, H; Tittel, K; Werner, S; Wunsch, M; Beuselinck, R; Binnie, David M; Cameron, W; Colling, D J; Dornan, Peter J; Konstantinidis, N P; Moneta, L; Moutoussi, A; Nash, J; San Martin, G; Sedgbeer, J K; Stacey, A M; Dissertori, G; Girtler, P; Kuhn, D; Rudolph, G; Bowdery, C K; Brodbeck, T J; Colrain, P; Crawford, G; Finch, A J; Foster, F; Hughes, G; Sloan, Terence; Whelan, E P; Williams, M I; Galla, A; Greene, A M; Kleinknecht, K; Quast, G; Raab, J; Renk, B; Sander, H G; Wanke, R; Van Gemmeren, P; Zeitnitz, C; Aubert, Jean-Jacques; Bencheikh, A M; Benchouk, C; Bonissent, A; Bujosa, G; Calvet, D; Carr, J; Diaconu, C A; Etienne, F; Thulasidas, M; Nicod, D; Payre, P; Rousseau, D; Talby, M; Abt, I; Assmann, R W; Bauer, C; Blum, Walter; Brown, D; Dietl, H; Dydak, Friedrich; Ganis, G; Gotzhein, C; Jakobs, K; Kroha, H; Lütjens, G; Lutz, Gerhard; Männer, W; Moser, H G; Richter, R H; Rosado-Schlosser, A; Schael, S; Settles, Ronald; Seywerd, H C J; Saint-Denis, R; Wolf, G; Alemany, R; Boucrot, J; Callot, O; Cordier, A; Courault, F; Davier, M; Duflot, L; Grivaz, J F; Heusse, P; Jacquet, M; Kim, D W; Le Diberder, F R; Lefrançois, J; Lutz, A M; Musolino, G; Nikolic, I A; Park, H J; Park, I C; Schune, M H; Simion, S; Veillet, J J; Videau, I; Abbaneo, D; Azzurri, P; Bagliesi, G; Batignani, G; Bettarini, S; Bozzi, C; Calderini, G; Carpinelli, M; Ciocci, M A; Ciulli, V; Dell'Orso, R; Fantechi, R; Ferrante, I; Foà, L; Forti, F; Giassi, A; Giorgi, M A; Gregorio, A; Ligabue, F; Lusiani, A; Marrocchesi, P S; Messineo, A; Rizzo, G; Sanguinetti, G; Sciabà, A; Spagnolo, P; Steinberger, Jack; Tenchini, Roberto; Tonelli, G; Triggiani, G; Vannini, C; Verdini, P G; Walsh, J; Betteridge, A P; Blair, 
G A; Bryant, L M; Cerutti, F; Gao, Y; Green, M G; Johnson, D L; Medcalf, T; Mir, L M; Perrodo, P; Strong, J A; Bertin, V; Botterill, David R; Clifft, R W; Edgecock, T R; Haywood, S; Edwards, M; Maley, P; Norton, P R; Thompson, J C; Bloch-Devaux, B; Colas, P; Emery, S; Kozanecki, Witold; Lançon, E; Lemaire, M C; Locci, E; Marx, B; Pérez, P; Rander, J; Renardy, J F; Roussarie, A; Schuller, J P; Schwindling, J; Trabelsi, A; Vallage, B; Johnson, R P; Kim, H Y; Litke, A M; McNeil, M A; Taylor, G; Beddall, A; Booth, C N; Boswell, R; Cartwright, S L; Combley, F; Dawson, I; Köksal, A; Letho, M; Newton, W M; Rankin, C; Thompson, L F; Böhrer, A; Brandt, S; Cowan, G D; Feigl, E; Grupen, Claus; Lutters, G; Minguet-Rodríguez, J A; Rivera, F; Saraiva, P; Smolik, L; Stephan, F; Apollonio, M; Bosisio, L; Della Marina, R; Giannini, G; Gobbo, B; Ragusa, F; Rothberg, J E; Wasserbaech, S R; Armstrong, S R; Bellantoni, L; Elmer, P; Feng, Z; Ferguson, D P S; Gao, Y S; González, S; Grahl, J; Harton, J L; Hayes, O J; Hu, H; McNamara, P A; Nachtman, J M; Orejudos, W; Pan, Y B; Saadi, Y; Schmitt, M; Scott, I J; Sharma, V; Turk, J; Walsh, A M; Wu Sau Lan; Wu, X; Yamartino, J M; Zheng, M; Zobernig, G

    1996-01-01

    From 64492 selected τ-pair events produced at the Z⁰ resonance, a measurement of tau decays into hadrons is presented, based on a global analysis of the 1991, 1992 and 1993 ALEPH data. Special emphasis is given to the reconstruction of photons and π⁰'s, and to the removal of fake photons. A detailed study of the systematics entering the π⁰ reconstruction is also given. A complete and consistent set of tau hadronic branching ratios is presented for 18 exclusive modes. Most measurements are more precise than the present world average. The new level of precision reached allows a stringent test of τ-μ universality in hadronic decays, g_τ/g_μ = 1.0013 ± 0.0095, and the first measurement of the vector and axial-vector contributions to the non-strange hadronic τ decay width: R_{τ,V} = 1.788 ± 0.025 and R_{τ,A} = 1.694 ± 0.027. The ratio (R_{τ,V} - R_{τ,A}) / (R_{τ,V} + R_{τ,A}), equal to (2.7 ± 1.3) %, is a measure of the importance of Q...
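The quoted central value of the vector/axial-vector asymmetry can be checked directly from the two measured contributions. A minimal sketch (the quoted ±1.3 % uncertainty requires the measurement covariance, which is not reproduced here):

```python
# Sketch only: reproduce the central value of (R_V - R_A)/(R_V + R_A)
# from the measured non-strange contributions quoted in the abstract.
R_V = 1.788  # vector contribution, +/- 0.025
R_A = 1.694  # axial-vector contribution, +/- 0.027

asymmetry = (R_V - R_A) / (R_V + R_A)
print(f"(R_V - R_A)/(R_V + R_A) = {asymmetry:.3f}")  # 0.027, i.e. 2.7 %
```

Note that naive uncorrelated error propagation on the two quoted uncertainties underestimates the ±1.3 % error, since R_{τ,V} and R_{τ,A} are correlated measurements.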

  18. Inquiry-Based Instruction and High Stakes Testing

    Science.gov (United States)

    Cothern, Rebecca L.

    Science education is key to a country's economic success, promoting advances in national industry and technology and maximizing competitive advantage in a global marketplace. The December 2010 Program for International Student Assessment (PISA) ranked the United States 23rd of 65 countries in science. That dismal standing in science proficiency impedes the ability of American school graduates to compete in the global marketplace. Furthermore, the implementation of high stakes testing in science, mandated under the No Child Left Behind (NCLB) Act beginning in 2007, has created an additional need for educators to find effective science pedagogy. Research has shown that inquiry-based science instruction is one of the predominant science instructional methods. Inquiry-based instruction is a multifaceted teaching method with its theoretical foundation in constructivism. A correlational survey research design was used to determine the relationship between the level of inquiry-based science instruction and student performance on a standardized state science test. A self-report survey using a Likert-type scale was completed by 26 fifth-grade teachers. Participants' responses were analyzed and grouped as high-, medium-, or low-level inquiry instruction. The unit of analysis for the achievement variable was the average student scale score on the state science test. Spearman's rho showed a positive correlation between the level of inquiry-based instruction and student achievement on the state assessment. The findings can assist teachers and administrators by providing additional research on the benefits of the inquiry-based instructional method. Implications for positive social change include increases in student proficiency and in decision-making skills related to science policy issues, which can make students more competitive in the global marketplace.

  19. Zero crossing and ratio spectra derivative spectrophotometry for the dissolution tests of amlodipine and perindopril in their fixed dose formulations

    Directory of Open Access Journals (Sweden)

    Maczka Paulina

    2014-06-01

    Dissolution tests of amlodipine and perindopril from their fixed-dose formulations were performed in 900 mL of phosphate buffer of pH 5.5 at 37°C using the paddle apparatus. Two simple and rapid derivative spectrophotometric methods were then used for the quantitative measurement of amlodipine and perindopril. The first method was zero-crossing first-derivative spectrophotometry, in which amplitudes were measured at 253 nm for amlodipine and at 229 nm for perindopril. The second method was ratio-derivative spectrophotometry, in which the spectra of amlodipine over the linearity range were divided by one selected standard spectrum of perindopril and the amplitudes at 242 nm were measured; similarly, the spectra of perindopril were divided by one selected standard spectrum of amlodipine and the amplitudes at 298 nm were measured. Both methods were validated to meet official requirements and were demonstrated to be selective, precise and accurate. Since there is no official monograph for these drugs in binary formulations, the dissolution tests and quantification procedure presented here can be used as a quality control test for amlodipine and perindopril in the respective dosage forms.
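The principle behind the ratio-derivative step can be shown with synthetic spectra. For a mixture M(λ) = c_A·A(λ) + c_P·P(λ), dividing by a standard spectrum of P gives c_A·A(λ)/P(λ) + c_P; differentiating removes the constant c_P, so the derivative amplitude tracks c_A alone. A sketch using made-up Gaussian bands (not real amlodipine/perindopril spectra):

```python
import math

# Synthetic illustration of ratio-derivative spectrophotometry.
# The two bands below are invented Gaussians, not measured drug spectra.
wavelengths = [220 + i for i in range(101)]  # 220-320 nm grid, 1 nm steps

def band(l, center, width):
    return math.exp(-((l - center) / width) ** 2)

A = [band(l, 240, 12) for l in wavelengths]  # "analyte" spectrum
P = [band(l, 280, 15) for l in wavelengths]  # "interferent" spectrum

def ratio_derivative(cA, cP, idx):
    """First derivative of the ratio spectrum M/P at grid point idx."""
    mix = [cA * a + cP * p for a, p in zip(A, P)]
    ratio = [m / p for m, p in zip(mix, P)]     # = cA*A/P + cP
    return (ratio[idx + 1] - ratio[idx - 1]) / 2.0  # central difference

d1 = ratio_derivative(cA=1.0, cP=0.5, idx=30)
d2 = ratio_derivative(cA=1.0, cP=2.0, idx=30)  # more interferent: d2 ~ d1
d3 = ratio_derivative(cA=2.0, cP=0.5, idx=30)  # doubled analyte: d3 ~ 2*d1
print(d1, d2, d3)
```

The derivative amplitude is insensitive to the interferent concentration and linear in the analyte concentration, which is what makes the measured amplitudes usable for calibration.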

  20. Test and evaluation about damping characteristics of hanger supports for nuclear power plant piping systems (Seismic Damping Ratio Evaluation Program)

    International Nuclear Information System (INIS)

    Shibata, H.; Ito, A.; Tanaka, K.; Niino, T.; Gotoh, N.

    1981-01-01

    In general, damping in structures and equipment arises from very complex energy-dissipation processes. Because piping systems are composed of many components, it is especially difficult to evaluate their damping characteristics theoretically. At the same time, the damping value used in the aseismic design of nuclear power plants is an important design factor, as it determines the seismic response loads of structures, equipment and piping systems. The extensive study SDREP (Seismic Damping Ratio Evaluation Program) was performed to establish proper damping values for the seismic design of piping, as a joint effort among a university, electric companies and plant makers. In SDREP, various systematic vibration tests were conducted to investigate factors that may contribute to the damping characteristics of piping systems and to supplement the data from the pre-operating tests. This study is related to the component damping characteristics tests of that program. Its objectives are to clarify the damping characteristics and mechanism of hanger supports used in piping systems, and to establish a technique for evaluating the energy dissipated at hanger support points and its effect on the total damping capacity of the piping system. (orig./WL)
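A common way to reduce vibration-test records of this kind to a damping ratio is the logarithmic decrement of successive free-vibration peak amplitudes. A hedged sketch (this is the standard textbook reduction, not necessarily the procedure used in SDREP):

```python
import math

# Estimate the damping ratio zeta from the decay of successive same-sign
# peak amplitudes in a free-vibration (snap-back) record, using the
# logarithmic decrement delta = ln(x_k / x_{k+1}).

def damping_ratio_from_peaks(peaks):
    """Damping ratio zeta from a list of successive peak amplitudes."""
    n = len(peaks) - 1
    delta = math.log(peaks[0] / peaks[-1]) / n      # average log decrement
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Synthetic peak train generated from a known zeta = 0.02 (2 % of critical):
zeta_true = 0.02
delta_true = 2 * math.pi * zeta_true / math.sqrt(1 - zeta_true ** 2)
peaks = [math.exp(-delta_true * k) for k in range(6)]
print(round(damping_ratio_from_peaks(peaks), 4))  # recovers 0.02
```

Averaging the decrement over several cycles, as above, smooths measurement noise on individual peaks; amplitude-dependent damping (as in friction-type hanger supports) would show up as a decrement that varies along the record.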