WorldWideScience

Sample records for regression test problems

  1. Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Zywicz, Edward [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2016-08-18

    The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn’s SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, number of processors to use, test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level different processor types, compilers, number of partitions, etc. impact the results to various degrees. Thus, for consistency purposes the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead. When environments or platforms change, executables using the current source code and the new
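
    A rough Python sketch of the pass/fail flagging logic described above, comparing a stored answer file against a freshly extracted one to a fixed number of digits; the one-number-per-line file layout and the function name are assumptions, not the suite's actual format:

        import math

        def compare_answers(stored_path, temp_path, digits=16):
            """Compare a stored answer file against a freshly extracted one,
            line by line, to the given number of significant digits."""
            rel_tol = 10.0 ** (1 - digits)
            with open(stored_path) as stored, open(temp_path) as temp:
                for lineno, (a, b) in enumerate(zip(stored, temp), start=1):
                    if not math.isclose(float(a), float(b), rel_tol=rel_tol):
                        return False, lineno   # flag the test problem as failed
            return True, None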

  2. Credit Scoring Problem Based on Regression Analysis

    OpenAIRE

    Khassawneh, Bashar Suhil Jad Allah

    2014-01-01

    ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the fitted model functions by statistical tools. Keywords: Data mining, linear regression, logistic regression....

  3. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and derives their parameters with the ordinary least squares estimate. A significance test method for the polynomial regression function is then derived by exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
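
    A worked illustration (synthetic data standing in for the decay-heat measurements) of fitting a polynomial by ordinary least squares and testing overall regression significance with the F-statistic, treating the fit as a multivariable linear regression as the abstract suggests:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 10.0, 50)            # e.g. decay time
        y = 5.0 - 0.8 * t + 0.05 * t**2 + rng.normal(0.0, 0.1, t.size)

        degree = 2
        X = np.vander(t, degree + 1)              # columns t^2, t, 1
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        tss = float(np.sum((y - y.mean()) ** 2))
        n, p = t.size, degree                     # sample size, regression df
        F = ((tss - rss) / p) / (rss / (n - p - 1))
        print("F =", F, "p-value =", stats.f.sf(F, p, n - p - 1))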

  4. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100] ...

  5. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100] ...

  6. Bayesian nonlinear regression for large p small n problems

    KAUST Repository

    Chakraborty, Sounak; Ghosh, Malay; Mallick, Bani K.

    2012-01-01

    Statistical modeling and inference problems with sample sizes substantially smaller than the number of available covariates are challenging. This is known as the large p, small n problem. Furthermore, the problem is more complicated when we have multiple correlated responses. We develop multivariate nonlinear regression models in this setup for accurate prediction. In this paper, we introduce a full Bayesian support vector regression model with Vapnik's ε-insensitive loss function, based on reproducing kernel Hilbert spaces (RKHS), under the multivariate correlated response setup. This provides a full probabilistic description of the support vector machine (SVM) rather than an algorithm for fitting purposes. We also introduce a multivariate version of the relevance vector machine (RVM). Instead of the original treatment of the RVM, which relies on type II maximum likelihood estimates of the hyper-parameters, we put a prior on the hyper-parameters and use a Markov chain Monte Carlo technique for computation. We also propose an empirical Bayes method for our RVM and SVM. Our methods are illustrated with a prediction problem in near-infrared (NIR) spectroscopy. A simulation study is also undertaken to check the prediction accuracy of our models. © 2012 Elsevier Inc.

  7. Bayesian nonlinear regression for large p small n problems

    KAUST Repository

    Chakraborty, Sounak

    2012-07-01

    Statistical modeling and inference problems with sample sizes substantially smaller than the number of available covariates are challenging. This is known as the large p, small n problem. Furthermore, the problem is more complicated when we have multiple correlated responses. We develop multivariate nonlinear regression models in this setup for accurate prediction. In this paper, we introduce a full Bayesian support vector regression model with Vapnik's ε-insensitive loss function, based on reproducing kernel Hilbert spaces (RKHS), under the multivariate correlated response setup. This provides a full probabilistic description of the support vector machine (SVM) rather than an algorithm for fitting purposes. We also introduce a multivariate version of the relevance vector machine (RVM). Instead of the original treatment of the RVM, which relies on type II maximum likelihood estimates of the hyper-parameters, we put a prior on the hyper-parameters and use a Markov chain Monte Carlo technique for computation. We also propose an empirical Bayes method for our RVM and SVM. Our methods are illustrated with a prediction problem in near-infrared (NIR) spectroscopy. A simulation study is also undertaken to check the prediction accuracy of our models. © 2012 Elsevier Inc.

  8. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Vol. 1, No. 4 (2011), pp. 25-28. ISSN 2045-3345. Grant (other): GA ČR (CZ) GA402/09/0557. Institutional research plan: CEZ:AV0Z10300504. Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics. Subject RIV: BB - Applied Statistics, Operational Research. http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  9. Regression testing Ajax applications : Coping with dynamism

    NARCIS (Netherlands)

    Roest, D.; Mesbah, A.; Van Deursen, A.

    2009-01-01

    Note: This paper is a pre-print of: Danny Roest, Ali Mesbah and Arie van Deursen. Regression Testing AJAX Applications: Coping with Dynamism. In Proceedings of the 3rd International Conference on Software Testing, Verification and Validation (ICST’10), Paris, France. IEEE Computer Society, 2010.

  10. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units, it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.

  11. DYNA3D/ParaDyn Regression Test Suite Inventory

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jerry I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2016-09-01

    The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of “feature” has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except for problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds, compilers change, and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcome to submit documented problems for inclusion in the test suite, especially if they are heavily exercising, and dependent upon, features that are currently underrepresented.

  12. Structural Break Tests Robust to Regression Misspecification

    Directory of Open Access Journals (Sweden)

    Alaa Abi Morshed

    2018-05-01

    Structural break tests for regression models are sensitive to model misspecification. We show, analytically and through simulations, that the sup Wald test for breaks in the conditional mean and variance of a time series process exhibits severe size distortions when the conditional mean dynamics are misspecified. We also show that the sup Wald test for breaks in the unconditional mean and variance does not have the same size distortions, yet enjoys power similar to its conditional counterpart in correctly specified models. Hence, we propose using it as an alternative and complementary test for breaks. We apply the unconditional and conditional mean and variance tests to three US series: unemployment, industrial production growth and interest rates. Both the unconditional and the conditional mean tests detect a break in the mean of interest rates. However, for the other two series, the unconditional mean test does not detect a break, while the conditional mean tests based on dynamic regression models occasionally detect a break, with the implied break-point estimator varying across different dynamic specifications. For all series, the unconditional variance test does not detect a break, while most tests for the conditional variance do detect a break that also varies across specifications.
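
    Not the authors' implementation: a bare-bones sketch of the unconditional-mean version of the test, scanning trimmed candidate break dates for the largest Wald statistic (which must be compared against sup-Wald critical values, e.g. Andrews (1993), not chi-square tables):

        import numpy as np

        def sup_wald_mean_break(y, trim=0.15):
            """Scan candidate break dates for a single break in the
            unconditional mean of y and return the sup-Wald statistic."""
            y = np.asarray(y, dtype=float)
            n = y.size
            best_w, best_k = -np.inf, None
            for k in range(int(trim * n), int((1 - trim) * n)):
                y1, y2 = y[:k], y[k:]
                s2 = (((y1 - y1.mean()) ** 2).sum()
                      + ((y2 - y2.mean()) ** 2).sum()) / (n - 2)
                w = (y1.mean() - y2.mean()) ** 2 / (s2 * (1.0 / k + 1.0 / (n - k)))
                if w > best_w:
                    best_w, best_k = w, k
            return best_w, best_k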

  13. Automation of Flight Software Regression Testing

    Science.gov (United States)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy-lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, the functionality of the code needs to be tested continually to ensure that the integrity of the software is maintained. Manually testing the functionality of an ever-growing set of requirements and features is not an efficient solution, so testing needs to be automated to be comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This provides the SLS FSW team a time-saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested and that desired functionality is maintained as changes occur. It also provides a mechanism for developers to check the functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add
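
    The SLS harness itself is not public; the following generic sketch only illustrates the modular pattern the abstract describes, with independent registered test cases run by a scheduler or commit hook. The binary name and the test case are hypothetical:

        import subprocess

        TESTS = {}   # registry: test name -> callable returning True on pass

        def regression_test(func):
            """Register an independent test case; tests can be added or
            removed without disturbing the others."""
            TESTS[func.__name__] = func
            return func

        @regression_test
        def guidance_smoke_test():
            try:   # hypothetical FSW simulation binary and case name
                result = subprocess.run(["./fsw_sim", "--case", "guidance"],
                                        capture_output=True)
                return result.returncode == 0
            except FileNotFoundError:
                return False

        def run_all():
            failures = [name for name, test in TESTS.items() if not test()]
            print(f"{len(TESTS) - len(failures)}/{len(TESTS)} passed")
            return failures   # a commit hook can reject the change if non-empty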

  14. Constitutive Theories of Self-Knowledge and the Regress Problem ...

    African Journals Online (AJOL)

    ... on the other hand, hold that self-knowledge is constitutive of intentional states. That is, self-ascription is a necessary condition for being in a particular mental state. Akeel Bilgrami is a defender of the constitutive model. I argue that the constitutive model gives rise to a regress problem. This paper will focus on that problem ...

  15. Regression testing in the TOTEM DCS

    International Nuclear Information System (INIS)

    Rodríguez, F Lucas; Atanassov, I; Burkimsher, P; Frost, O; Taskinen, J; Tulimaki, V

    2012-01-01

    The Detector Control System of the TOTEM experiment at the LHC is built with the industrial product WinCC OA (PVSS). The TOTEM system is generated automatically through scripts using as input the detector Product Breakdown Structure (PBS) and its pinout connectivity, archiving and alarm metainformation, and some other heuristics based on the naming conventions. When those initial parameters and automation code are modified to include new features, the resulting PVSS system can also acquire unintended side-effects. On a daily basis, a custom-developed regression testing tool takes the most recent code from a Subversion (SVN) repository and builds a new control system from scratch. This system is exported in plain text format using the PVSS export tool and compared with a system previously validated by a human. A report is sent to the developers with any differences highlighted, in readiness for validation and acceptance as a new stable version. This regression approach is not dependent on any development framework or methodology. The process has run satisfactorily for several months, proving to be a very valuable tool before deploying new versions in the production systems.

  16. Multiple regression for physiological data analysis: the problem of multicollinearity.

    Science.gov (United States)

    Slinker, B K; Glantz, S A

    1985-07-01

    Multiple linear regression, in which several predictor variables are related to a response variable, is a powerful statistical tool for gaining quantitative insight into complex in vivo physiological systems. For these insights to be correct, all predictor variables must be uncorrelated. However, in many physiological experiments the predictor variables cannot be precisely controlled and thus change in parallel (i.e., they are highly correlated). There is a redundancy of information about the response, a situation called multicollinearity, that leads to numerical problems in estimating the parameters in regression equations; the parameters are often of incorrect magnitude or sign or have large standard errors. Although multicollinearity can be avoided with good experimental design, not all interesting physiological questions can be studied without encountering multicollinearity. In these cases various ad hoc procedures have been proposed to mitigate multicollinearity. Although many of these procedures are controversial, they can be helpful in applying multiple linear regression to some physiological problems.
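
    The abstract stops short of a diagnostic recipe; a standard one is the variance inflation factor (VIF), sketched here with deliberately correlated synthetic predictors:

        import numpy as np

        def vif(X):
            """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
            column j on the remaining columns. Values far above ~10 are a
            conventional warning sign of multicollinearity."""
            X = np.asarray(X, dtype=float)
            out = []
            for j in range(X.shape[1]):
                y = X[:, j]
                Z = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
                beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                resid = y - Z @ beta
                r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
                out.append(1.0 / (1.0 - r2))
            return np.array(out)

        rng = np.random.default_rng(1)
        x1 = rng.normal(size=200)
        x2 = x1 + rng.normal(scale=0.05, size=200)   # nearly collinear with x1
        print(vif(np.column_stack([x1, x2])))        # both VIFs are very large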

  17. Solving Dynamic Traveling Salesman Problem Using Dynamic Gaussian Process Regression

    Directory of Open Access Journals (Sweden)

    Stephen M. Akandwanaho

    2014-01-01

    This paper solves the dynamic traveling salesman problem (DTSP) using the dynamic Gaussian Process Regression (DGPR) method. The problem of a varying correlation tour is alleviated by a nonstationary covariance function interleaved with DGPR to generate a predictive distribution for the DTSP tour. This approach is combined with the Nearest Neighbor (NN) method and iterated local search to track dynamic optima. Experimental results were obtained on DTSP instances. Comparisons were performed with a Genetic Algorithm and Simulated Annealing. The proposed approach demonstrates superiority in finding a good traveling salesman problem (TSP) tour with less computational time in nonstationary conditions.
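
    A sketch of the Nearest Neighbor construction step only (the DGPR prediction and iterated local search layers are beyond a few lines); the distance-matrix input is an assumption:

        import numpy as np

        def nn_tour(dist, start=0):
            """Greedy Nearest Neighbor tour over a symmetric distance matrix."""
            n = dist.shape[0]
            tour, visited = [start], {start}
            while len(tour) < n:
                last = tour[-1]
                nxt = min((j for j in range(n) if j not in visited),
                          key=lambda j: dist[last, j])
                tour.append(nxt)
                visited.add(nxt)
            return tour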

  18. A Powerful Test for Comparing Multiple Regression Functions.

    Science.gov (United States)

    Maity, Arnab

    2012-09-01

    In this article, we address the important problem of comparison of two or more population regression functions. Recently, Pardo-Fernández, Van Keilegom and González-Manteiga (2007) developed test statistics for simple nonparametric regression models: Y(ij) = θ(j)(Z(ij)) + σ(j)(Z(ij))∊(ij), based on empirical distributions of the errors in each population j = 1, … , J. In this paper, we propose a test for equality of the θ(j)(·) based on the concept of generalized likelihood ratio type statistics. We also generalize our test to other nonparametric regression setups, e.g., nonparametric logistic regression, where the loglikelihood for population j is any general smooth function [Formula: see text]. We describe a resampling procedure to obtain the critical values of the test. In addition, we present a simulation study to evaluate the performance of the proposed test and compare our results to those in Pardo-Fernández et al. (2007).

  19. Computing group cardinality constraint solutions for logistic regression problems.

    Science.gov (United States)

    Zhang, Yong; Kwon, Dongjin; Pohl, Kilian M

    2017-01-01

    We derive an algorithm to directly solve logistic regression based on cardinality constraint and group sparsity, and use it to classify intra-subject MRI sequences (e.g., cine MRIs) of healthy from diseased subjects. Group cardinality constraint models are often applied to medical images in order to avoid overfitting of the classifier to the training data. Solutions within these models are generally determined by relaxing the cardinality constraint to a weighted feature selection scheme. However, these solutions relate to the original sparse problem only under specific assumptions, which generally do not hold for medical image applications. In addition, inferring clinical meaning from features weighted by a classifier is an ongoing topic of discussion. Avoiding weighing features, we propose to directly solve the group cardinality constraint logistic regression problem by generalizing the Penalty Decomposition method. To do so, we assume that an intra-subject series of images represents repeated samples of the same disease patterns. We model this assumption by combining series of measurements created by a feature across time into a single group. Our algorithm then derives a solution within that model by decoupling the minimization of the logistic regression function from enforcing the group sparsity constraint. The minimum to the smooth and convex logistic regression problem is determined via gradient descent, while we derive a closed-form solution for finding a sparse approximation of that minimum. We apply our method to cine MRI of 38 healthy controls and 44 adult patients who received reconstructive surgery for Tetralogy of Fallot (TOF) during infancy. Our method correctly identifies regions impacted by TOF and generally obtains statistically significantly higher classification accuracy than alternative solutions to this model, i.e., ones relaxing group cardinality constraints. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Convergence diagnostics for Eigenvalue problems with linear regression model

    International Nuclear Information System (INIS)

    Shi, Bo; Petrovic, Bojan

    2011-01-01

    Although the Monte Carlo method has been extensively used for criticality/eigenvalue problems, a reliable, robust, and efficient convergence diagnostics method is still desired. Most methods are based on integral parameters (multiplication factor, entropy) and either condense the local distribution information into a single value (e.g., entropy) or even disregard it. We propose to employ the detailed cycle-by-cycle local flux evolution, obtained by using a mesh tally mechanism, to assess source and flux convergence. By applying a linear regression model to each individual mesh in a mesh tally for convergence diagnostics, a global convergence criterion can be obtained. We exemplify this method on two problems and obtain promising diagnostics results. (author)
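
    A schematic version of the idea, under the assumption that convergence shows up as a vanishing cycle-to-cycle trend: fit a line to each mesh cell's tally history and require every slope to be statistically indistinguishable from zero. The array layout and threshold are placeholders:

        import numpy as np
        from scipy import stats

        def mesh_converged(tallies, alpha=0.01):
            """tallies: shape (n_cycles, n_cells), cycle-by-cycle local flux.
            Returns a global verdict plus the per-cell slope p-values."""
            n_cycles, n_cells = tallies.shape
            cycles = np.arange(n_cycles)
            pvals = np.array([stats.linregress(cycles, tallies[:, j]).pvalue
                              for j in range(n_cells)])
            return bool(np.all(pvals > alpha)), pvals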

  21. A test for the parameters of multiple linear regression models ...

    African Journals Online (AJOL)

    A test for the parameters of multiple linear regression models is developed for conducting tests simultaneously on all the parameters of multiple linear regression models. The test is robust relative to the assumptions of homogeneity of variances and absence of serial correlation of the classical F-test. Under certain null and ...

  22. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

    Regarding the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test

  23. Testing hypotheses for differences between linear regression lines

    Science.gov (United States)

    Stanley J. Zarnoch

    2009-01-01

    Five hypotheses are identified for testing differences between simple linear regression lines. The distinctions between these hypotheses are based on a priori assumptions and illustrated with full and reduced models. The contrast approach is presented as an easy and complete method for testing for overall differences between the regressions and for making pairwise...

  24. Dimension Reduction and Discretization in Stochastic Problems by Regression Method

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1996-01-01

    The chapter mainly deals with dimension reduction and field discretizations based directly on the concept of linear regression. Several examples of interesting applications in stochastic mechanics are also given. Keywords: Random fields discretization, Linear regression, Stochastic interpolation, ...

  25. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Background: Technological developments have increased the feasibility of large-scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high-dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results: We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions: The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
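
    The analytic test itself is in the paper; this sketch reproduces the permutation benchmark it is compared against, at one shrinkage value. Repeating it over a grid of shrinkage values gives the raw material for a p-value trace:

        import numpy as np
        from sklearn.linear_model import Ridge

        def ridge_permutation_pvalues(X, y, lam, n_perm=999, seed=0):
            """Permutation p-values for ridge coefficients at shrinkage lam."""
            rng = np.random.default_rng(seed)
            obs = Ridge(alpha=lam).fit(X, y).coef_
            exceed = np.zeros_like(obs)
            for _ in range(n_perm):
                perm = Ridge(alpha=lam).fit(X, rng.permutation(y)).coef_
                exceed += np.abs(perm) >= np.abs(obs)
            return (1.0 + exceed) / (n_perm + 1.0)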

  26. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei; Carroll, Raymond J.; Maity, Arnab

    2011-01-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work

  27. Testing overall and moderator effects in meta-regression

    NARCIS (Netherlands)

    Huizenga, H.M.; Visser, I.; Dolan, C.V.

    2011-01-01

    Random effects meta-regression is a technique to synthesize results of multiple studies. It allows for a test of an overall effect, as well as for tests of effects of study characteristics, that is, (discrete or continuous) moderator effects. We describe various procedures to test moderator effects:

  28. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  29. PELE-IC test problems

    International Nuclear Information System (INIS)

    Gong, E.Y.; Alexander, E.E.; McMaster, W.H.; Quinones, D.F.

    1979-01-01

    This report provides prospective users of the Lawrence Livermore Laboratory (LLL) fluid-structure interaction computer code, PELE-IC, with a variety of test problems for verifying the code on CDC 7600 computer systems at facilities external to the LLL environment. The test problems have been successfully run on CDC 7600 computers at the LLL and Lawrence Berkeley Laboratory (LBL) computer centers

  30. HYBRID DATA APPROACH FOR SELECTING EFFECTIVE TEST CASES DURING THE REGRESSION TESTING

    OpenAIRE

    Mohan, M.; Shrimali, Tarun

    2017-01-01

    In the software industry, software testing has become more important in the entire software development life cycle. Software testing is one of the fundamental components of software quality assurance. The Software Testing Life Cycle (STLC) is a process involved in testing the complete software, which includes Regression Testing, Unit Testing, Smoke Testing, Integration Testing, Interface Testing, System Testing, etc. In the regression-testing phase of the STLC, test case selection is one of the most importan...

  31. Testing Under Fire: Chicago's Problem.

    Science.gov (United States)

    Byrd, Manford, Jr.

    The history and development of city-wide testing programs in Chicago since 1936 are reviewed and placed in context with the impact on testing of Sputnik and the passage of the National Defense Education Act of 1958. Current testing problems include the time lag between events and curricular changes and new test construction, the time lag between…

  32. Testing for marginal linear effects in quantile regression

    KAUST Repository

    Wang, Huixia Judy

    2017-10-23

    The paper develops a new marginal testing procedure to detect significant predictors that are associated with the conditional quantiles of a scalar response. The idea is to fit the marginal quantile regression on each predictor one at a time, and then to base the test on the t-statistics that are associated with the most predictive predictors. A resampling method is devised to calibrate this test statistic, which has non-regular limiting behaviour due to the selection of the most predictive variables. Asymptotic validity of the procedure is established in a general quantile regression setting in which the marginal quantile regression models can be misspecified. Even though a fixed dimension is assumed to derive the asymptotic results, the test proposed is applicable and computationally feasible for large dimensional predictors. The method is more flexible than existing marginal screening test methods based on mean regression and has the added advantage of being robust against outliers in the response. The approach is illustrated by using an application to a human immunodeficiency virus drug resistance data set.
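
    A sketch of the screening statistic only, using statsmodels; the calibration step (the paper's resampling scheme for the non-regular limit) is not reproduced, so the normal-theory t-statistics here are just the raw ingredients:

        import numpy as np
        import statsmodels.api as sm

        def marginal_quantile_tstats(X, y, tau=0.5):
            """Fit y ~ x_j one predictor at a time by quantile regression
            and return the largest absolute slope t-statistic."""
            tstats = np.array([
                sm.QuantReg(y, sm.add_constant(X[:, j])).fit(q=tau).tvalues[1]
                for j in range(X.shape[1])])
            return np.abs(tstats).max(), tstats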

  33. Testing for marginal linear effects in quantile regression

    KAUST Repository

    Wang, Huixia Judy; McKeague, Ian W.; Qian, Min

    2017-01-01

    The paper develops a new marginal testing procedure to detect significant predictors that are associated with the conditional quantiles of a scalar response. The idea is to fit the marginal quantile regression on each predictor one at a time, and then to base the test on the t-statistics that are associated with the most predictive predictors. A resampling method is devised to calibrate this test statistic, which has non-regular limiting behaviour due to the selection of the most predictive variables. Asymptotic validity of the procedure is established in a general quantile regression setting in which the marginal quantile regression models can be misspecified. Even though a fixed dimension is assumed to derive the asymptotic results, the test proposed is applicable and computationally feasible for large dimensional predictors. The method is more flexible than existing marginal screening test methods based on mean regression and has the added advantage of being robust against outliers in the response. The approach is illustrated by using an application to a human immunodeficiency virus drug resistance data set.

  34. A test of inflated zeros for Poisson regression models.

    Science.gov (United States)

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
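
    Not the authors' statistic: an informal check in the same spirit, comparing the observed zero count with the count a fitted Poisson model predicts, using P(Y_i = 0) = exp(-mu_i):

        import numpy as np
        import statsmodels.api as sm

        def zero_inflation_check(X, y):
            """Fit a Poisson GLM and contrast observed vs. expected zeros."""
            fit = sm.GLM(y, sm.add_constant(X),
                         family=sm.families.Poisson()).fit()
            expected_zeros = float(np.exp(-fit.fittedvalues).sum())
            observed_zeros = float((y == 0).sum())
            return observed_zeros, expected_zeros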

  35. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities through calculations of appropriate experiments by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which leads to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results remain valid in succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test case under consideration. In the case of post-calculations of experiments, a comparison to experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality of the thermal-hydraulics modelling in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results also show very good agreement with experimental data. Finally, iodine behaviour is checked against the THAI IOD-11 experiment. Within this test case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing.

  36. Testing the Perturbation Sensitivity of Abortion-Crime Regressions

    Directory of Open Access Journals (Sweden)

    Michał Brzeziński

    2012-06-01

    The hypothesis that the legalisation of abortion contributed significantly to the reduction of crime in the United States in the 1990s is one of the most prominent ideas from the recent “economics-made-fun” movement sparked by the book Freakonomics. This paper expands on the existing literature about the computational stability of abortion-crime regressions by testing the sensitivity of coefficient estimates to small amounts of data perturbation. In contrast to previous studies, we use a new data set on crime correlates for each of the US states, the original model specification and estimation methodology, and an improved data perturbation algorithm. We find that the coefficient estimates in abortion-crime regressions are not computationally stable and, therefore, are unreliable.

  37. Testing of a Fiber Optic Wear, Erosion and Regression Sensor

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt A.

    2011-01-01

    The nature of the physical processes and harsh environments associated with erosion and wear in propulsion environments makes their measurement and real-time rate quantification difficult. A fiber optic sensor capable of determining the wear (regression, erosion, ablation) associated with these environments has been developed and tested in a number of different applications to validate the technique. The sensor consists of two fiber optics that have differing attenuation coefficients and transmit light to detectors. The ratio of the two measured intensities can be correlated to the lengths of the fiber optic lines, and if the fibers and the host parent material in which they are embedded wear at the same rate, the remaining length of fiber provides a real-time measure of the wear process. Testing in several disparate situations has been performed; the data exhibit excellent qualitative agreement with the theoretical description of the process, and when a separate calibrated regression measurement is available, good quantitative agreement is obtained as well. The light collected by the fibers can also be used to optically obtain the spectra and measure the internal temperature of the wear layer.

  38. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
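
    A minimal design-based sketch under simplifying assumptions (two arms, and a difference in mean OLS residuals as the statistic); the essential ingredient, as in the paper, is that draw_assignment regenerates treatment sequences with the trial's own randomization procedure:

        import numpy as np
        import statsmodels.api as sm

        def randomization_pvalue(trt, y, X, draw_assignment, n_mc=2000, seed=0):
            """Monte Carlo randomization test for a treatment effect based on
            residuals from a covariate-only regression model."""
            resid = sm.OLS(y, sm.add_constant(X)).fit().resid

            def stat(a):
                return resid[a == 1].mean() - resid[a == 0].mean()

            rng = np.random.default_rng(seed)
            hits = sum(abs(stat(draw_assignment(rng))) >= abs(stat(trt))
                       for _ in range(n_mc))
            return (hits + 1) / (n_mc + 1)

        # e.g. re-randomization that preserves the allocation ratio:
        # randomization_pvalue(trt, y, X, lambda rng: rng.permutation(trt))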

  39. Bias in logistic regression due to imperfect diagnostic test results and practical correction approaches.

    Science.gov (United States)

    Valle, Denis; Lima, Joanna M Tucker; Millar, Justin; Amratia, Punam; Haque, Ubydul

    2015-11-04

    Logistic regression is a statistical model widely used in cross-sectional and cohort studies to identify and quantify the effects of potential disease risk factors. However, the impact of imperfect tests on adjusted odds ratios (and thus on the identification of risk factors) is under-appreciated. The purpose of this article is to draw attention to the problem associated with modelling imperfect diagnostic tests, and propose simple Bayesian models to adequately address this issue. A systematic literature review was conducted to determine the proportion of malaria studies that appropriately accounted for false-negatives/false-positives in a logistic regression setting. Inference from the standard logistic regression was also compared with that from three proposed Bayesian models using simulations and malaria data from the western Brazilian Amazon. A systematic literature review suggests that malaria epidemiologists are largely unaware of the problem of using logistic regression to model imperfect diagnostic test results. Simulation results reveal that statistical inference can be substantially improved when using the proposed Bayesian models versus the standard logistic regression. Finally, analysis of original malaria data with one of the proposed Bayesian models reveals that microscopy sensitivity is strongly influenced by how long people have lived in the study region, and an important risk factor (i.e., participation in forest extractivism) is identified that would have been missed by standard logistic regression. Given the numerous diagnostic methods employed by malaria researchers and the ubiquitous use of logistic regression to model the results of these diagnostic tests, this paper provides critical guidelines to improve data analysis practice in the presence of misclassification error. Easy-to-use code that can be readily adapted to WinBUGS is provided, enabling straightforward implementation of the proposed Bayesian models.

  40. Testing the equality of nonparametric regression curves based on ...

    African Journals Online (AJOL)

    Abstract. In this work we propose a new methodology for the comparison of two regression functions f1 and f2 in the case of homoscedastic error structure and a fixed design. Our approach is based on the empirical Fourier coefficients of the regression functions f1 and f2 respectively. As our main results we obtain the ...

  41. Considering a non-polynomial basis for local kernel regression problem

    Science.gov (United States)

    Silalahi, Divo Dharma; Midi, Habshah

    2017-01-01

    A commonly used solution to the local kernel nonparametric regression problem is polynomial regression. In this study, we demonstrate the estimator and its properties, using the maximum likelihood estimator, for a non-polynomial basis such as B-splines replacing the polynomial basis. This estimator allows flexibility in the selection of a bandwidth and a knot. The best estimator is selected by finding an optimal bandwidth and knot through minimizing the well-known generalized cross-validation (GCV) function.
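
    A rough sketch of the selection loop, using scipy's least-squares B-spline and the number of basis functions as a crude effective-degrees-of-freedom count in the GCV score; the candidate knot grid is arbitrary:

        import numpy as np
        from scipy.interpolate import LSQUnivariateSpline

        def select_spline_by_gcv(x, y, knot_counts=(2, 4, 8, 16), k=3):
            """Pick the number of interior knots minimizing a GCV score."""
            order = np.argsort(x)
            x, y = x[order], y[order]
            best = None
            for m in knot_counts:
                knots = np.quantile(x, np.linspace(0, 1, m + 2)[1:-1])
                spl = LSQUnivariateSpline(x, y, knots, k=k)
                rss = float(np.sum((y - spl(x)) ** 2))
                df = len(spl.get_coeffs())             # crude df proxy
                gcv = len(x) * rss / (len(x) - df) ** 2
                if best is None or gcv < best[0]:
                    best = (gcv, m, spl)
            return best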

  42. A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity

    Science.gov (United States)

    Martin, David

    2008-01-01

    This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…

  43. SOCP relaxation bounds for the optimal subset selection problem applied to robust linear regression

    OpenAIRE

    Flores, Salvador

    2015-01-01

    This paper deals with the problem of finding the globally optimal subset of h elements from a larger set of n elements in d space dimensions so as to minimize a quadratic criterion, with a special emphasis on applications to computing the Least Trimmed Squares Estimator (LTSE) for robust regression. The computation of the LTSE is a challenging subset selection problem involving a nonlinear program with continuous and binary variables, linked in a highly nonlinear fashion. The selection of a ...

  44. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    Science.gov (United States)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
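
    A side-by-side numerical sketch of the two approaches on synthetic calibration data (true slope and noise level are arbitrary):

        import numpy as np

        rng = np.random.default_rng(2)
        standard = np.linspace(1.0, 10.0, 30)      # known reference values
        reading = 0.2 + 1.05 * standard + rng.normal(0.0, 0.1, standard.size)

        # Classical: forward regression reading ~ standard, then invert.
        b1, b0 = np.polyfit(standard, reading, 1)
        new_reading = 5.4
        x_classical = (new_reading - b0) / b1

        # Reverse: regress standard ~ reading, predict directly.
        c1, c0 = np.polyfit(reading, standard, 1)
        x_reverse = c0 + c1 * new_reading

        print(x_classical, x_reverse)   # close here, but not equivalent in general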

  45. Comparing Linear Discriminant Function with Logistic Regression for the Two-Group Classification Problem.

    Science.gov (United States)

    Fan, Xitao; Wang, Lin

    The Monte Carlo study compared the performance of predictive discriminant analysis (PDA) and that of logistic regression (LR) for the two-group classification problem. Prior probabilities were used for classification, but the cost of misclassification was assumed to be equal. The study used a fully crossed three-factor experimental design (with…

  46. On the estimation and testing of predictive panel regressions

    NARCIS (Netherlands)

    Karabiyik, H.; Westerlund, Joakim; Narayan, Paresh

    2016-01-01

    Hjalmarsson (2010) considers an OLS-based estimator of predictive panel regressions that is argued to be mixed normal under very general conditions. In a recent paper, Westerlund et al. (2016) show that while consistent, the estimator is generally not mixed normal, which invalidates standard normal

  47. Testing and Estimating Shape-Constrained Nonparametric Density and Regression in the Presence of Measurement Error

    KAUST Repository

    Carroll, Raymond J.

    2011-03-01

    In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y , is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However in many problems, data can only be observed with measurement errors, and the methods employed in the error-free case typically do not carry over to this error context. In this paper we develop a novel approach to hypothesis testing and function estimation under shape constraints, which is valid in the context of measurement errors. Our method is based on tilting an estimator of the density or the regression mean until it satisfies the shape constraint, and we take as our test statistic the distance through which it is tilted. Bootstrap methods are used to calibrate the test. The constrained curve estimators that we develop are also based on tilting, and in that context our work has points of contact with methodology in the error-free case.

  48. An application of robust ridge regression model in the presence of outliers to real data problem

    Science.gov (United States)

    Shariff, N. S. Md.; Ferdaos, N. A.

    2017-09-01

    Multicollinearity and outliers often lead to inconsistent and unreliable parameter estimates in regression analysis. The well-known procedure that is robust to the multicollinearity problem is the ridge regression method. This method, however, is believed to be affected by the presence of outliers. The combination of GM-estimation and a ridge parameter that is robust towards both problems is of interest in this study. As such, both techniques are employed to investigate the relationship between stock market prices and macroeconomic variables in Malaysia, as the data set involves both multicollinearity and outlier problems. There are four macroeconomic factors selected for this study, which are the Consumer Price Index (CPI), Gross Domestic Product (GDP), Base Lending Rate (BLR) and Money Supply (M1). The results demonstrate that the proposed procedure is able to produce reliable results in the presence of multicollinearity and outliers in the real data.

  49. Medical Tests for Prostate Problems

    Science.gov (United States)

    ... walnut-shaped gland that is part of the male reproductive system. It has two or more lobes, or sections, ... treating problems of the urinary tract and the male reproductive system. Abdominal Ultrasound Ultrasound uses a device, called a ...

  50. Posterior consistency for Bayesian inverse problems through stability and regression results

    International Nuclear Information System (INIS)

    Vollmer, Sebastian J

    2013-01-01

    In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)

  51. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems

    Directory of Open Access Journals (Sweden)

    Faridah Hani Mohamed Salleh

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods had been inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, the discussion on specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted by the past studies were not specifically geared towards proving the ability of GRN prediction methods in avoiding the occurrences of cascade errors. Hence, this research aims to propose Multiple Linear Regression (MLR) to infer GRN from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations of the real experiment datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure that had been proposed in this work. The experiment revealed that the number of cascade errors had been very minimal. Apart from that, the Belsley collinearity test proved that multicollinearity did affect the datasets used in this experiment greatly. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5.

  52. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems.

    Science.gov (United States)

    Salleh, Faridah Hani Mohamed; Zainudin, Suhaila; Arif, Shereena M

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods had been inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, the discussion on specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted by the past studies were not specifically geared towards proving the ability of GRN prediction methods in avoiding the occurrences of cascade errors. Hence, this research aims to propose Multiple Linear Regression (MLR) to infer GRN from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations of the real experiment datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure that had been proposed in this work. The experiment revealed that the number of cascade errors had been very minimal. Apart from that, the Belsley collinearity test proved that multicollinearity did affect the datasets used in this experiment greatly. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5.
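
    A bare-bones sketch of the joint-regression idea (each gene regressed on all the others at once, which is what lets a direct A → C coefficient shrink when the path actually runs A → B → C); it assumes more samples than genes, as after the subnetwork extraction step:

        import numpy as np

        def mlr_grn(expr, genes):
            """expr: (n_samples, n_genes) expression matrix.
            Returns, per target gene, the fitted coefficients of all others."""
            n_samples, n_genes = expr.shape
            edges = {}
            for j in range(n_genes):
                y = expr[:, j]
                Z = np.column_stack([np.ones(n_samples),
                                     np.delete(expr, j, axis=1)])
                beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                others = [g for k, g in enumerate(genes) if k != j]
                edges[genes[j]] = dict(zip(others, beta[1:]))
            return edges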

  53. Solving the Omitted Variables Problem of Regression Analysis Using the Relative Vertical Position of Observations

    Directory of Open Access Journals (Sweden)

    Jonathan E. Leightner

    2012-01-01

    The omitted variables problem is one of regression analysis’ most serious problems. The standard approach to the omitted variables problem is to find instruments, or proxies, for the omitted variables, but this approach makes strong assumptions that are rarely met in practice. This paper introduces best projection reiterative truncated projected least squares (BP-RTPLS), the third generation of a technique that solves the omitted variables problem without using proxies or instruments. This paper presents a theoretical argument that BP-RTPLS produces unbiased reduced form estimates when there are omitted variables. This paper also provides simulation evidence that shows OLS produces between 250% and 2450% more errors than BP-RTPLS when there are omitted variables and when measurement and round-off error is 1 percent or less. In an example, the government spending multiplier is estimated using annual data for the USA between 1929 and 2010.

  14. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  15. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.
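
    The ExactPack interface itself is not assumed here; the verification workflow such definitions support is simply comparing a code's output against the exact solution and checking that the error shrinks at the expected rate under grid refinement. A self-contained sketch, with a trivial advection problem standing in for a packaged exact solution:

```python
import numpy as np

def exact(x, t, c=1.0):
    """Exact solution of the linear advection equation u_t + c u_x = 0."""
    return np.sin(2 * np.pi * (x - c * t))

def numerical(nx, t_end=0.5, c=1.0):
    """First-order upwind scheme on a periodic grid; solution at t_end."""
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    dx = 1.0 / nx
    dt = 0.5 * dx / c                  # CFL number 0.5
    u = exact(x, 0.0, c)
    t = 0.0
    while t < t_end - 1e-12:
        u = u - c * dt / dx * (u - np.roll(u, 1))
        t += dt
    return x, u

# Grid-convergence check: the L2 error should shrink at the scheme's order.
for nx in (50, 100, 200):
    x, u = numerical(nx)
    err = np.sqrt(np.mean((u - exact(x, 0.5)) ** 2))
    print(nx, err)
```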

  16. Regression Tests and the Efficiency of Fixed Odds Betting Markets

    NARCIS (Netherlands)

    Koning, Ruud H.

    The informational content of odds posted in sports betting markets has been an ongoing topic of research. In this paper, I test whether fixed odds betting markets in soccer are informationally efficient. The contributions of the paper are threefold: first, I propose a simple yet flexible statistical…

  17. The alarming problems of confounding equivalence using logistic regression models in the perspective of causal diagrams

    Directory of Open Access Journals (Sweden)

    Yuanyuan Yu

    2017-12-01

    Full Text Available Abstract Background Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders using a logistic regression model is the habitual method, though it has some problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and to search for an alternative method. Methods Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential and then to select the optimum adjusting strategy, in which the logistic regression model and the inverse probability weighting based marginal structural model (IPW-based-MSM) were compared. The “do-calculus” was used to calculate the true causal effect of exposure on outcome, then the bias and standard error were used to evaluate the performances of different strategies. Results Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the equivalent bias to the one containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of exposure was not equivalent to them. In addition, all causal effect estimations through logistic regression were biased, although the estimation after adjusting for the parent nodes of exposure was nearest to the true causal effect. However, conditioning on different confounding equivalence sets had the same bias-reducing potential under IPW-based-MSM. Compared with logistic regression, the IPW-based-MSM could obtain unbiased causal effect estimation when the adjusted confounders satisfied G-admissibility and the optimal…

  18. The alarming problems of confounding equivalence using logistic regression models in the perspective of causal diagrams.

    Science.gov (United States)

    Yu, Yuanyuan; Li, Hongkai; Sun, Xiaoru; Su, Ping; Wang, Tingting; Liu, Yi; Yuan, Zhongshang; Liu, Yanxun; Xue, Fuzhong

    2017-12-28

    Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders using a logistic regression model is the habitual method, though it has some problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and to search for an alternative method. Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential and then to select the optimum adjusting strategy, in which the logistic regression model and the inverse probability weighting based marginal structural model (IPW-based-MSM) were compared. The "do-calculus" was used to calculate the true causal effect of exposure on outcome, then the bias and standard error were used to evaluate the performances of different strategies. Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the equivalent bias to the one containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of exposure was not equivalent to them. In addition, all causal effect estimations through logistic regression were biased, although the estimation after adjusting for the parent nodes of exposure was nearest to the true causal effect. However, conditioning on different confounding equivalence sets had the same bias-reducing potential under IPW-based-MSM. Compared with logistic regression, the IPW-based-MSM could obtain unbiased causal effect estimation when the adjusted confounders satisfied G-admissibility and the optimal strategy was to adjust for the parent nodes of outcome, which…

  19. Multi-platform SCADA GUI Regression Testing at CERN

    CERN Document Server

    Burkimsher, P C; Klikovits, S

    2011-01-01

    The JCOP Framework is a toolkit used widely at CERN for the development of industrial control systems in several domains (i.e. experiments, accelerators and technical infrastructure). The software development started 10 years ago and there is now a large base of production systems running it. For the success of the project, it was essential to formalize and automate the quality assurance process. This paper will present the overall testing strategy and will describe in detail mechanisms used for GUI testing. The choice of a commercial tool (Squish) and the architectural features making it appropriate for our multi-platform environment will be described. Practical difficulties encountered when using the tool in the CERN context are discussed as well as how these were addressed. In the light of initial experience, the test code itself has been recently reworked in object-oriented style to facilitate future maintenance and extension. The current reporting process is described, as well as future plans for easy re...

  20. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
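
    For one of the listed procedures, the power of a test on a correlation coefficient can be approximated by hand via the Fisher z transformation. The sketch below is a generic textbook approximation, not G*Power's internal routine:

```python
import numpy as np
from scipy.stats import norm

def power_correlation(r_alt, n, r_null=0.0, alpha=0.05):
    """Approximate power of the two-sided test H0: rho = r_null against
    rho = r_alt, using Fisher's z transform (SE = 1/sqrt(n - 3))."""
    delta = (np.arctanh(r_alt) - np.arctanh(r_null)) * np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(delta - z_crit) + norm.cdf(-delta - z_crit)

# e.g., chance of detecting rho = 0.3 with n = 84 at alpha = .05
print(round(power_correlation(0.3, 84), 3))   # roughly 0.8
```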

  1. Test set for initial value problem solvers

    NARCIS (Netherlands)

    W.M. Lioen (Walter); J.J.B. de Swart (Jacques)

    1998-01-01

    The CWI test set for IVP solvers presents a collection of Initial Value Problems to test solvers for implicit differential equations. This test set can both decrease the effort for the code developer to test his software in a reliable way, and cross the bridge between the application…

  2. Power and Sample Size Calculations for Logistic Regression Tests for Differential Item Functioning

    Science.gov (United States)

    Li, Zhushan

    2014-01-01

    Logistic regression is a popular method for detecting uniform and nonuniform differential item functioning (DIF) effects. Theoretical formulas for the power and sample size calculations are derived for likelihood ratio tests and Wald tests based on the asymptotic distribution of the maximum likelihood estimators for the logistic regression model.…

  3. Beta/gamma test problems for ITS

    International Nuclear Information System (INIS)

    Mei, G.T.

    1993-01-01

    The Integrated Tiger Series of Coupled Electron/Photon Monte Carlo Transport Codes (ITS 3.0, PC Version) was used at Oak Ridge National Laboratory (ORNL) to compare with and extend the experimental findings of the beta/gamma response of selected health physics instruments. In order to assure that ITS gives correct results, several beta/gamma problems have been tested. ITS was used to simulate these problems numerically, and results for each were compared to the problem's experimental or analytical results. ITS successfully predicted the experimental or analytical results of all tested problems within the statistical uncertainty inherent in the Monte Carlo method.

  4. Testing contingency hypotheses in budgetary research: An evaluation of the use of moderated regression analysis

    NARCIS (Netherlands)

    Hartmann, Frank G.H.; Moers, Frank

    1999-01-01

    In the contingency literature on the behavioral and organizational effects of budgeting, use of the Moderated Regression Analysis (MRA) technique is prevalent. This technique is used to test contingency hypotheses that predict interaction effects between budgetary and contextual variables. This…

  5. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    Science.gov (United States)

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured result of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20 with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
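
    A minimal sketch of the CUSUM stage described above: per-result error scores (in the study, logistic-regression probabilities built from measured and predicted results) are tallied, and an alarm is raised when the cumulative excess crosses a threshold. The probability stream, reference value k, and threshold h below are hypothetical:

```python
import numpy as np

def cusum(error_probs, k=0.5, h=5.0):
    """One-sided CUSUM over per-result error scores (e.g., logistic-model
    probabilities); flags when the cumulative excess over k crosses h."""
    s, alarms = 0.0, []
    for p in error_probs:
        s = max(0.0, s + (p - k))
        alarms.append(s > h)
        if s > h:
            s = 0.0            # reset after an alarm
    return alarms

# Hypothetical stream of error probabilities with a fault after result 50.
rng = np.random.default_rng(2)
p = np.concatenate([rng.uniform(0.0, 0.5, 50), rng.uniform(0.4, 1.0, 50)])
print(np.argmax(cusum(p)))     # index of the first flagged result
```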

  6. Testing the water-energy theory on American palms (Arecaceae) using geographically weighted regression.

    Directory of Open Access Journals (Sweden)

    Wolf L Eiserhardt

    Full Text Available Water and energy have emerged as the best contemporary environmental correlates of broad-scale species richness patterns. A corollary hypothesis of water-energy dynamics theory is that the influence of water decreases and the influence of energy increases with absolute latitude. We report the first use of geographically weighted regression for testing this hypothesis on a continuous species richness gradient that is entirely located within the tropics and subtropics. The dataset was divided into northern and southern hemispheric portions to test whether predictor shifts are more pronounced in the less oceanic northern hemisphere. American palms (Arecaceae, n = 547 spp.), whose species richness and distributions are known to respond strongly to water and energy, were used as a model group. The ability of water and energy to explain palm species richness was quantified locally at different spatial scales and regressed on latitude. Clear latitudinal trends in agreement with water-energy dynamics theory were found, but the results did not differ qualitatively between hemispheres. Strong inherent spatial autocorrelation in local modeling results and collinearity of water and energy variables were identified as important methodological challenges. We overcame these problems by using simultaneous autoregressive models and variation partitioning. Our results show that the ability of water and energy to explain species richness changes not only across large climatic gradients spanning tropical to temperate or arctic zones but also within megathermal climates, at least for strictly tropical taxa such as palms. This finding suggests that the predictor shifts are related to gradual latitudinal changes in ambient energy (related to solar flux input) rather than to abrupt transitions at specific latitudes, such as the occurrence of frost.
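
    Geographically weighted regression, as used in the record, amounts to fitting a separate weighted least-squares model at each location, with kernel weights decaying with distance. A minimal sketch on synthetic data (coordinates, bandwidth, and the latitude-dependent effect are hypothetical):

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Minimal geographically weighted regression: at each location, fit a
    weighted least-squares model with Gaussian-kernel distance weights."""
    betas = []
    for c in coords:
        d2 = np.sum((coords - c) ** 2, axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        W = np.sqrt(w)[:, None]
        beta, *_ = np.linalg.lstsq(W * X, np.sqrt(w) * y, rcond=None)
        betas.append(beta)
    return np.array(betas)

# Hypothetical sites where the "water" effect weakens with latitude.
rng = np.random.default_rng(4)
coords = rng.uniform(-30, 30, size=(300, 2))          # (lon, lat)
water = rng.normal(size=300)
richness = (2.0 - 0.05 * np.abs(coords[:, 1])) * water + rng.normal(size=300)
X = np.column_stack([np.ones(300), water])
local = gwr_coefficients(coords, X, y=richness, bandwidth=10.0)
print(local[:, 1].min(), local[:, 1].max())           # local slopes vary
```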

  7. Power properties of invariant tests for spatial autocorrelation in linear regression

    NARCIS (Netherlands)

    Martellosio, F.

    2006-01-01

    Many popular tests for residual spatial autocorrelation in the context of the linear regression model belong to the class of invariant tests. This paper derives a number of exact properties of the power function of such tests. In particular, we extend the work of Krämer (2005, Journal of Statistical…

  8. Accounting for measurement error in log regression models with applications to accelerated testing.

    Directory of Open Access Journals (Sweden)

    Robert Richardson

    Full Text Available In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.

  9. Accounting for measurement error in log regression models with applications to accelerated testing.

    Science.gov (United States)

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
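
    The two records above approximate a log regression model with measurement and additive error by a weighted regression estimated with Iteratively Re-weighted Least Squares. A generic IRLS loop, with a hypothetical variance model standing in for the paper's weights:

```python
import numpy as np

def irls(X, y, weight_fn, n_iter=20):
    """Generic iteratively re-weighted least squares: alternate a weighted
    LS fit with recomputing weights from the current fitted values."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        w = weight_fn(X @ beta)
        W = np.sqrt(w)[:, None]
        beta = np.linalg.lstsq(W * X, np.sqrt(w) * y, rcond=None)[0]
    return beta

# Hypothetical accelerated-test data: log lifetime vs inverse temperature,
# with noise that grows with the mean (one plausible weighting choice,
# not the paper's reduced-Eyring model).
rng = np.random.default_rng(3)
inv_T = rng.uniform(1.8, 3.2, 60)
log_life = 1.0 + 2.5 * inv_T + rng.normal(scale=0.1 * (1 + inv_T), size=60)
X = np.column_stack([np.ones(60), inv_T])
beta = irls(X, log_life,
            weight_fn=lambda mu: 1.0 / (0.1 * (1 + np.abs(mu))) ** 2)
print(beta)   # close to the true (1.0, 2.5)
```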

  10. Problem-Solving Test: Tryptophan Operon Mutants

    Science.gov (United States)

    Szeberenyi, Jozsef

    2010-01-01

    This paper presents a problem-solving test that deals with the regulation of the "trp" operon of "Escherichia coli." Two mutants of this operon are described: in mutant A, the operator region of the operon carries a point mutation so that it is unable to carry out its function; mutant B expresses a "trp" repressor protein unable to bind…

  11. Testing quantum contextuality. The problem of compatibility

    International Nuclear Information System (INIS)

    Szangolies, Jochen

    2015-01-01

    Jochen Szangolies contributes a novel way of dealing with the problem of the experimental testability of the Kochen-Specker theorem posed by realistic, that is, noisy, measurements. Such noise spoils perfect compatibility between successive measurements, which however is a necessary requirement to test the notion of contextuality in usual approaches. To overcome this difficulty, a new, extended notion of contextuality that reduces to Kochen-Specker contextuality in the limit of perfect measurement implementations is proposed by the author, together with a scheme to test this notion experimentally. Furthermore, the behaviour of these tests under realistic noise conditions is investigated.

  12. A Note on Three Statistical Tests in the Logistic Regression DIF Procedure

    Science.gov (United States)

    Paek, Insu

    2012-01-01

    Although logistic regression became one of the well-known methods in detecting differential item functioning (DIF), its three statistical tests, the Wald, likelihood ratio (LR), and score tests, which are readily available under the maximum likelihood, do not seem to be consistently distinguished in DIF literature. This paper provides a clarifying…

  13. Motor operated valves problems tests and simulations

    Energy Technology Data Exchange (ETDEWEB)

    Pinier, D.; Haas, J.L.

    1996-12-01

    An analysis of the two refusals of operation of the EAS recirculation shutoff valves enabled two distinct problems to be identified on the motorized valves. First, the calculation methods for the operating torques of valves in use in the power plants are not conservative enough, which results in the misadjustment of the torque limiters installed on their motorizations. The second problem concerns the pressure locking phenomenon: a number of valves may entrap a pressure exceeding the in-line pressure between the disks, which may cause a jamming of the valve. EDF has taken the following approach to settle the first problem: determination of the friction coefficients and the efficiency of the valve and its actuator through general and specific tests and models, and definition of a new calculation method. In order to solve the second problem, EDF has carried out the following operations: identification of the valves whose technology enables the pressure to be entrapped (the tests and numerical simulations carried out in the Research and Development Division confirm the possibility of a "boiler" effect), and determination of the necessary modifications: development and testing of anti-boiler-effect systems.

  14. Motor operated valves problems tests and simulations

    International Nuclear Information System (INIS)

    Pinier, D.; Haas, J.L.

    1996-01-01

    An analysis of the two refusals of operation of the EAS recirculation shutoff valves enabled two distinct problems to be identified on the motorized valves. First, the calculation methods for the operating torques of valves in use in the power plants are not conservative enough, which results in the misadjustment of the torque limiters installed on their motorizations. The second problem concerns the pressure locking phenomenon: a number of valves may entrap a pressure exceeding the in-line pressure between the disks, which may cause a jamming of the valve. EDF has taken the following approach to settle the first problem: determination of the friction coefficients and the efficiency of the valve and its actuator through general and specific tests and models, and definition of a new calculation method. In order to solve the second problem, EDF has carried out the following operations: identification of the valves whose technology enables the pressure to be entrapped (the tests and numerical simulations carried out in the Research and Development Division confirm the possibility of a "boiler" effect), and determination of the necessary modifications: development and testing of anti-boiler-effect systems.

  15. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with one parameter that defines both the mean and the variance. Poisson regression therefore assumes that the mean and variance are equal (equidispersion). Nonetheless, some count data do not satisfy this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion causes underestimation of the standard errors and, consequently, incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution. If there is over-dispersion, simple bivariate Poisson regression is not sufficient for modeling paired count data. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression model for paired count data with over-dispersion. The BPIGR model produces a single global model for all locations. On the other hand, each location has different geographic, social, cultural, and economic conditions, so Geographically Weighted Regression (GWR) is needed. The weighting function of each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimation of the GWBPIGR model is obtained by the Maximum Likelihood Estimation (MLE) method, while hypothesis testing of the GWBPIGR model is carried out by the Maximum Likelihood Ratio Test (MLRT) method.

  16. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    Science.gov (United States)

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric Society.

  17. Significance tests to determine the direction of effects in linear regression models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Hagmann, Michael; von Eye, Alexander

    2015-02-01

    Previous studies have discussed asymmetric interpretations of the Pearson correlation coefficient and have shown that higher moments can be used to decide on the direction of dependence in the bivariate linear regression setting. The current study extends this approach by illustrating that the third moment of regression residuals may also be used to derive conclusions concerning the direction of effects. Assuming non-normally distributed variables, it is shown that the distribution of residuals of the correctly specified regression model (e.g., Y is regressed on X) is more symmetric than the distribution of residuals of the competing model (i.e., X is regressed on Y). Based on this result, 4 one-sample tests are discussed which can be used to decide which variable is more likely to be the response and which one is more likely to be the explanatory variable. A fifth significance test is proposed based on the differences of skewness estimates, which leads to a more direct test of a hypothesis that is compatible with direction of dependence. A Monte Carlo simulation study was performed to examine the behaviour of the procedures under various degrees of associations, sample sizes, and distributional properties of the underlying population. An empirical example is given which illustrates the application of the tests in practice. © 2014 The British Psychological Society.
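
    The residual-skewness idea in this record can be tried directly: with a non-normal predictor, the correctly specified direction leaves more symmetric residuals. A sketch (sample and effect sizes hypothetical; the paper's formal significance tests are not reproduced):

```python
import numpy as np
from scipy.stats import skew, linregress

rng = np.random.default_rng(5)
# Non-normal cause X, normal noise: Y regressed on X is the true direction.
x = rng.exponential(size=2000)
y = 0.7 * x + rng.normal(scale=0.5, size=2000)

fit_yx = linregress(x, y)          # correctly specified: Y on X
fit_xy = linregress(y, x)          # mis-specified:       X on Y
res_yx = y - (fit_yx.intercept + fit_yx.slope * x)
res_xy = x - (fit_xy.intercept + fit_xy.slope * y)

# The true direction should show residuals closer to symmetric (skew ~ 0).
print(abs(skew(res_yx)), abs(skew(res_xy)))
```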

  18. Chandra X-ray Center Science Data Systems Regression Testing of CIAO

    Science.gov (United States)

    Lee, N. P.; Karovska, M.; Galle, E. C.; Bonaventura, N. R.

    2011-07-01

    The Chandra Interactive Analysis of Observations (CIAO) is a software system developed for the analysis of Chandra X-ray Observatory observations. An important component of a successful CIAO release is the repeated testing of the tools across various platforms to ensure consistent and scientifically valid results. We describe the procedures of the scientific regression testing of CIAO and the enhancements made to the testing system to increase the efficiency of run time and result validation.

  19. Social problems on Semipalatinsk test site

    International Nuclear Information System (INIS)

    Cherepnin, Yu.S.; Zhdanov, N.A.; Tumenova, B.N.

    2000-01-01

    In the report main stages of National Nuclear Center of Republic of Kazakhstan activity in the field of scientific information obtain about consequences of conducted nuclear tests, radioecological and medical and biological researches, restoration of natural environment and people's health in Republic of Kazakhstan are reflected. Chronicle and results of joint works within frameworks of international programs in these field are given as well. Analysis of up-to-date social problems of population of the region is carried out

  20. The Finite Deformation Dynamic Sphere Test Problem

    Energy Technology Data Exchange (ETDEWEB)

    Versino, Daniele [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brock, Jerry Steven [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-02

    In this manuscript we describe test cases for the dynamic sphere problem in presence of finite deformations. The spherical shell in exam is made of a homogeneous, isotropic or transverse isotropic material and elastic and elastic-plastic material behaviors are considered. Twenty cases, (a) to (t), are thus defined combining material types and boundary conditions. The inner surface radius, the outer surface radius and the material's density are kept constant for all the considered test cases and their values are ri = 10mm, ro = 20mm and p = 1000Kg/m3 respectively.

  1. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    Science.gov (United States)

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  2. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    Science.gov (United States)

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  3. Application of range-test in multiple linear regression analysis in ...

    African Journals Online (AJOL)

    Application of range-test in multiple linear regression analysis in the presence of outliers is studied in this paper. First, the plot of the explanatory variables (i.e. Administration, Social/Commercial, Economic services and Transfer) on the dependent variable (i.e. GDP) was done to identify the statistical trend over the years.

  4. Reduction of the number of parameters needed for a polynomial random regression test-day model

    NARCIS (Netherlands)

    Pool, M.H.; Meuwissen, T.H.E.

    2000-01-01

    Legendre polynomials were used to describe the (co)variance matrix within a random regression test day model. The goodness of fit depended on the polynomial order of fit, i.e., the number of parameters to be estimated per animal, but is limited by computing capacity. Two aspects: incomplete lactation…

  5. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    Science.gov (United States)

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…

  6. A unified framework for testing in the linear regression model under unknown order of fractional integration

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Kruse, Robinson; Sibbertsen, Philipp

    We consider hypothesis testing in a general linear time series regression framework when the possibly fractional order of integration of the error term is unknown. We show that the approach suggested by Vogelsang (1998a) for the case of integer integration does not apply to the case of fractional…

  7. Pivotal statistics for testing subsets of structural parameters in the IV Regression Model

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2000-01-01

    We construct a novel statistic to test hypotheses on subsets of the structural parameters in an Instrumental Variables (IV) regression model. We derive the chi-squared limiting distribution of the statistic and show that it has a degrees-of-freedom parameter that is equal to the number of structural…

  8. Regressive Imagery in Creative Problem-Solving: Comparing Verbal Protocols of Expert and Novice Visual Artists and Computer Programmers

    Science.gov (United States)

    Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin

    2015-01-01

    We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…

  9. Double Length Regressions for Testing the Box-Cox Difference Transformation.

    OpenAIRE

    Park, Timothy

    1991-01-01

    The Box-Cox difference transformation is used to determine the appropriate specification for estimation of hedge ratios and a new double length regression form of the Lagrange multiplier test is presented for the difference transformation. The Box-Cox difference transformation allows the testing of the first difference model and the returns model as special cases of the Box-Cox difference transformation. Copyright 1991 by MIT Press.

  10. Two-Sample Tests for High-Dimensional Linear Regression with an Application to Detecting Interactions.

    Science.gov (United States)

    Xia, Yin; Cai, Tianxi; Cai, T Tony

    2018-01-01

    Motivated by applications in genomics, we consider in this paper global and multiple testing for the comparisons of two high-dimensional linear regression models. A procedure for testing the equality of the two regression vectors globally is proposed and shown to be particularly powerful against sparse alternatives. We then introduce a multiple testing procedure for identifying unequal coordinates while controlling the false discovery rate and false discovery proportion. Theoretical justifications are provided to guarantee the validity of the proposed tests and optimality results are established under sparsity assumptions on the regression coefficients. The proposed testing procedures are easy to implement. Numerical properties of the procedures are investigated through simulation and data analysis. The results show that the proposed tests maintain the desired error rates under the null and have good power under the alternative at moderate sample sizes. The procedures are applied to the Framingham Offspring study to investigate the interactions between smoking and cardiovascular related genetic mutations important for an inflammation marker.

  11. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e., the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  12. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e., the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
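
    In practice, testing normality of error terms means applying the tests to the estimated residuals. A basic sketch with two classical tests (the RT class of robust tests from the contribution is not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = rng.uniform(0, 10, 200)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=200)   # heavy-tailed errors

# Fit the regression, then test the residuals rather than y itself.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)

print("Shapiro-Wilk p:", stats.shapiro(resid).pvalue)
print("Jarque-Bera p:", stats.jarque_bera(resid).pvalue)
```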

  13. A Spline-Based Lack-Of-Fit Test for Independent Variable Effect in Poisson Regression.

    Science.gov (United States)

    Li, Chin-Shang; Tu, Wanzhu

    2007-05-01

    In regression analysis of count data, independent variables are often modeled by their linear effects under the assumption of log-linearity. In reality, the validity of such an assumption is rarely tested, and its use is at times unjustifiable. A lack-of-fit test is proposed for the adequacy of a postulated functional form of an independent variable within the framework of semiparametric Poisson regression models based on penalized splines. It offers added flexibility in accommodating the potentially non-loglinear effect of the independent variable. A likelihood ratio test is constructed for the adequacy of the postulated parametric form, for example log-linearity, of the independent variable effect. Simulations indicate that the proposed model performs well, and that a misspecified parametric model has much reduced power. An example is given.

  14. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    Science.gov (United States)

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
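
    A regression-based norm reduces to an algebraic equation: predict the expected score from demographics plus estimated IQ, then standardize the observed score by the model's residual standard deviation. A sketch on a simulated normative sample (all coefficients and the examinee's values are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical normative sample: a memory score predicted from age,
# education, and an estimated-IQ proxy (e.g., oral word reading).
rng = np.random.default_rng(7)
n = 500
age = rng.uniform(20, 80, n)
edu = rng.integers(8, 21, n)
iq = rng.normal(100, 15, n)
score = 60 - 0.3 * age + 0.8 * edu + 0.2 * iq + rng.normal(scale=5, size=n)

X = sm.add_constant(np.column_stack([age, edu, iq]))
norms = sm.OLS(score, X).fit()

# Benchmark one new examinee (age 70, 12 yrs education, estimated IQ 110)
# who obtained a raw score of 41 against the demographically expected score.
new = np.array([1.0, 70.0, 12.0, 110.0])
expected = norms.predict(new.reshape(1, -1))[0]
z = (41.0 - expected) / np.sqrt(norms.scale)   # residual-SD-based z-score
print(expected, z)
```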

  15. Semiparametric Allelic Tests for Mapping Multiple Phenotypes: Binomial Regression and Mahalanobis Distance.

    Science.gov (United States)

    Majumdar, Arunabha; Witte, John S; Ghosh, Saurabh

    2015-12-01

    Binary phenotypes commonly arise due to multiple underlying quantitative precursors and genetic variants may impact multiple traits in a pleiotropic manner. Hence, simultaneously analyzing such correlated traits may be more powerful than analyzing individual traits. Various genotype-level methods, e.g., MultiPhen (O'Reilly et al. []), have been developed to identify genetic factors underlying a multivariate phenotype. For univariate phenotypes, the usefulness and applicability of allele-level tests have been investigated. The test of allele frequency difference among cases and controls is commonly used for mapping case-control association. However, allelic methods for multivariate association mapping have not been studied much. In this article, we explore two allelic tests of multivariate association: one using a Binomial regression model based on inverted regression of genotype on phenotype (Binomial regression-based Association of Multivariate Phenotypes [BAMP]), and the other employing the Mahalanobis distance between two sample means of the multivariate phenotype vector for two alleles at a single-nucleotide polymorphism (Distance-based Association of Multivariate Phenotypes [DAMP]). These methods can incorporate both discrete and continuous phenotypes. Some theoretical properties for BAMP are studied. Using simulations, the power of the methods for detecting multivariate association is compared with the genotype-level test MultiPhen's. The allelic tests yield marginally higher power than MultiPhen for multivariate phenotypes. For one/two binary traits under recessive mode of inheritance, allelic tests are found to be substantially more powerful. All three tests are applied to two different real data and the results offer some support for the simulation study. We propose a hybrid approach for testing multivariate association that implements MultiPhen when Hardy-Weinberg Equilibrium (HWE) is violated and BAMP otherwise, because the allelic approaches assume HWE.

  16. Support Vector Regression-Based Adaptive Divided Difference Filter for Nonlinear State Estimation Problems

    Directory of Open Access Journals (Sweden)

    Hongjian Wang

    2014-01-01

    Full Text Available We present a support vector regression-based adaptive divided difference filter (SVRADDF) algorithm for improving the low state estimation accuracy of nonlinear systems, which are typically affected by large initial estimation errors and imprecise prior knowledge of process and measurement noises. The derivative-free SVRADDF algorithm is significantly simpler to compute than other methods and is implemented using only functional evaluations. The SVRADDF algorithm involves the use of the theoretical and actual covariance of the innovation sequence. Support vector regression (SVR) is employed to generate the adaptive factor to tune the noise covariance at each sampling instant when the measurement update step executes, which improves the algorithm’s robustness. The performance of the proposed algorithm is evaluated by estimating states for (i) an underwater nonmaneuvering target bearing-only tracking system and (ii) maneuvering target bearing-only tracking in an air-traffic control system. The simulation results show that the proposed SVRADDF algorithm exhibits better performance when compared with a traditional DDF algorithm.

  17. Predicting Student Success on the Texas Chemistry STAAR Test: A Logistic Regression Analysis

    Science.gov (United States)

    Johnson, William L.; Johnson, Annabel M.; Johnson, Jared

    2012-01-01

    Background: The context is the new Texas STAAR end-of-course testing program. Purpose: The authors developed a logistic regression model to predict who would pass-or-fail the new Texas chemistry STAAR end-of-course exam. Setting: Robert E. Lee High School (5A) with an enrollment of 2700 students, Tyler, Texas. Date of the study was the 2011-2012…

  18. Problems With Section Two ITP TOEFL Test

    Directory of Open Access Journals (Sweden)

    Rizki Ananda

    2016-03-01

    Full Text Available This study was designed to investigate (1) the difficulties faced by EFL university students with section two of the ITP, and (2) whether part A or part B was more difficult for them and why. A total of 26 students from two different universities, Syiah Kuala University and the State Islamic University Ar-Raniry, were the sample for the test. The data were obtained from a multiple choice questionnaire test consisting of 46 questions, each with 4 answers to choose from. The results showed that inversions (12%), subject-verb agreements (10%), adverb clause connectors (7%), passives (6%), reduced adjective clauses (5%), parallel structures (5%) and use of verbs (5%) were the most difficult question types for the students. Furthermore, they felt that part B was more difficult than part A, as finding an error in a sentence was harder than completing a sentence from a multiple choice. The length of questions in part A did not affect the amount of time the students spent to complete part A and did not cause them to panic. Also, unfamiliar words in part A were not regarded as a problem by the students. Hence, TOEFL teachers and trainers are highly encouraged to pay more attention to study exercises for the seven topics with the highest percentages above in part A, and also to more practice for part B.

  19. Inverse Tasks In The Tsunami Problem: Nonlinear Regression With Inaccurate Input Data

    Science.gov (United States)

    Lavrentiev, M.; Shchemel, A.; Simonov, K.

    A variant of a modified training functional that allows considering inaccurate input data is suggested. A limiting case, when a part of the input data is completely undefined and therefore a problem of reconstruction of hidden parameters should be solved, is also considered. Some numerical experiments are presented. The classic problem definition, widely used in the majority of neural-net algorithms, assumes that a dependence of known output variables on known input ones should be found. The quality of approximation is evaluated as a performance function. Often the error of the task is evaluated as the squared distance between known data and predicted data, multiplied by weight coefficients; these coefficients may be named "precision coefficients". When the inputs are not known exactly, a natural generalization of the performance function is to add a member responsible for the distance between the known inputs and shifted inputs that lessen the model's error. It is desirable that the set of variable parameters be compact for the training to converge. In the above problem it is possible to choose variants of a priori compactness demands that allow meaningful interpretation in terms of the smoothness of the model dependence. Two kinds of regularization were used: the first limited the squares of the coefficients responsible for nonlinearity, and the second limited the product of the above coefficients and the linear coefficients. The asymptotic universality of a neural net's ability to approximate various smooth functions with any accuracy by increasing the number of tunable parameters is often the basis for selecting a type of neural-net approximation. It is possible to show that the neural net used here approaches a Fourier integral transform, whose approximation abilities are known, as the number of tunable parameters increases. In the limiting case, when the input data are set with zero precision, the problem of reconstruction of hidden parameters from observed output data appears…

  20. Breath tests: principles, problems, and promise

    International Nuclear Information System (INIS)

    Lo, C.W.; Carter, E.A.; Walker, W.A.

    1982-01-01

    Breath tests rely on the measurement of gases produced in the intestine, absorbed, and expired in the breath. Carbohydrates, such as lactose and sucrose, can be administered in physiologic doses; if malabsorbed, they will be metabolized to hydrogen by colonic bacteria. Since hydrogen is not produced by human metabolic reactions, a rise in breath hydrogen, as measured by gas chromatography, is evidence of carbohydrate malabsorption. Likewise, a rise in breath hydrogen marks the transit time of nonabsorbable carbohydrates such as lactulose through the small intestine into the colon. Simple end-expiratory interval collection into nonsiliconized vacutainer tubes has made these noninvasive tests quite convenient to perform, but various problems, including changes in stool pH, intestinal motility, or metabolic rate, may influence results. Another group of breath tests uses substrates labeled with radioactive or stable isotopes of carbon. Labeled fat substrates such as trioctanoin, tripalmitin, and triolein do not produce the expected rise in labeled breath CO 2 if there is fat malabsorption. Bile acid malabsorption and small intestinal bacterial overgrowth can be measured with labeled cholylglycine or cholyltaurine. Labeled drugs such as aminopyrine, methacetin, and phenacetin can be used as an indication of drug metabolism and liver function. Radioactive substrates have been used to trace metabolic pathways and can be measured by scintillation counters. The availability of nonradioactive stable isotopes has made these ideal for use in children and pregnant women, but the cost of substrates and the mass spectrometers to measure them has so far limited their use to research centers. It is hoped that new techniques of processing and measurement will allow further realization of the exciting potential breath analysis has in a growing list of clinical applications

  1. SOFC regulation at constant temperature: Experimental test and data regression study

    International Nuclear Information System (INIS)

    Barelli, L.; Bidini, G.; Cinti, G.; Ottaviano, A.

    2016-01-01

    Highlights: • The SOFC operating temperature strongly impacts its performance and lifetime. • Experimental tests were carried out varying the electric load and the feed gas mixture. • Three different anodic inlet gases were tested while maintaining constant temperature. • The cathodic air flow rate was used to keep the operating temperature constant. • A regression law was defined from experimental data to regulate the air flow rate. - Abstract: The operating temperature of a solid oxide fuel cell (SOFC) stack is an important parameter to be controlled, which impacts the SOFC performance and its lifetime. A rapid temperature change implies a significant temperature difference between the surface and the mean body, leading to a state of thermal shock. Thermal shock and thermal cycling introduce stress in a material due to temperature differences between the surface and the interior, or between different regions of the cell. In this context, in order to determine a control law that keeps the fuel cell temperature constant while the electrical load and the infeed fuel mixture vary, an experimental activity was carried out on a planar SOFC short stack to analyse the stack temperature. Specifically, three different anodic inlet gas compositions were tested: pure hydrogen, and reformed natural gas with steam-to-carbon ratios equal to 2 and 2.5. By processing the obtained results, a regression law was defined to regulate the air flow rate to be provided to the fuel cell so as to keep its operating temperature constant as its operating conditions vary.
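
    The regulation step described above comes down to fitting an empirical law for the cathodic air flow as a function of operating conditions. A hedged sketch with invented calibration data (the paper's actual functional form and variables are not stated in the record):

```python
import numpy as np

# Hypothetical calibration data: stack current I (A) and fuel H2 fraction
# x_f, with the air flow (NL/min) that held stack temperature at set point.
rng = np.random.default_rng(10)
I = rng.uniform(5, 30, 60)
x_f = rng.uniform(0.4, 1.0, 60)
air = 2.0 + 0.9 * I + 4.0 * x_f + 0.01 * I**2 + rng.normal(scale=0.3, size=60)

# Least-squares fit of a simple polynomial regulation law air = f(I, x_f).
A = np.column_stack([np.ones_like(I), I, x_f, I**2])
coef, *_ = np.linalg.lstsq(A, air, rcond=None)
print("regulation law coefficients:", np.round(coef, 3))

# Using the law: required air flow at I = 20 A on pure hydrogen (x_f = 1).
print(coef @ np.array([1.0, 20.0, 1.0, 400.0]))
```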

  2. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    Science.gov (United States)

    Mi, Gu; Di, Yanming; Schafer, Daniel W

    2015-01-01

    This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.
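
    A simulation-based goodness-of-fit check in the spirit of this record: fit an NB regression, compute a discrepancy statistic, and compare it with the same statistic on datasets simulated from the fitted model. For brevity the sketch keeps the fitted parameters fixed across simulations (a full test would refit each time); the data and sample sizes are hypothetical:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 200
x = rng.normal(size=n)
X = sm.add_constant(x)
mu_true = np.exp(0.5 + 0.8 * x)
# NB2 counts via the gamma-Poisson mixture (dispersion alpha = 0.5).
y = rng.poisson(rng.gamma(shape=2.0, scale=mu_true / 2.0))

fit = sm.NegativeBinomial(y, X).fit(disp=0)
beta, alpha = fit.params[:-1], fit.params[-1]
mu = np.exp(X @ beta)

def pearson(y, mu, alpha):
    # Pearson statistic with the NB2 variance function mu + alpha * mu^2.
    return np.sum((y - mu) ** 2 / (mu + alpha * mu ** 2))

obs = pearson(y, mu, alpha)
sims = []
for _ in range(500):   # parametric bootstrap under the fitted NB model
    y_sim = rng.poisson(rng.gamma(shape=1 / alpha, scale=alpha * mu))
    sims.append(pearson(y_sim, mu, alpha))
print("Monte Carlo p-value:", np.mean(np.array(sims) >= obs))
```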

  3. Bidirectional extreme learning machine for regression problem and its learning effectiveness.

    Science.gov (United States)

    Yang, Yimin; Wang, Yaonan; Yuan, Xiaofang

    2012-09-01

    It is clear that the learning effectiveness and learning speed of neural networks are in general far slower than required, which has been a major bottleneck for many applications. Recently, a simple and efficient learning method, referred to as extreme learning machine (ELM), was proposed by Huang, which has shown that, compared to some conventional methods, the training time of neural networks can be reduced by a thousand times. However, one of the open problems in ELM research is whether the number of hidden nodes can be further reduced without affecting learning effectiveness. This brief proposes a new learning algorithm, called bidirectional extreme learning machine (B-ELM), in which some hidden nodes are not randomly selected. In theory, this algorithm tends to reduce network output error to 0 at an extremely early learning stage. Furthermore, we find a relationship between the network output error and the network output weights in the proposed B-ELM. Simulation results demonstrate that the proposed method can be tens to hundreds of times faster than other incremental ELM algorithms.
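
    For contrast with B-ELM's non-random hidden nodes, the baseline ELM it extends is only a few lines: a random hidden layer followed by an analytic least-squares solve for the output weights. A minimal sketch (network size and data hypothetical):

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=8):
    """Basic ELM for regression: random hidden layer, analytic output
    weights. (B-ELM additionally constructs some nodes non-randomly.)"""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                  # random feature map
    beta = np.linalg.pinv(H) @ y            # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sinc(X[:, 0])
W, b, beta = elm_fit(X, y)
print(np.mean((elm_predict(X, W, b, beta) - y) ** 2))   # small training MSE
```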

  4. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...

  5. A class of ejecta transport test problems

    International Nuclear Information System (INIS)

    Hammerberg, James E.; Buttler, William T.; Oro, David M.; Rousculp, Christopher L.; Morris, Christopher; Mariam, Fesseha G.

    2011-01-01

    Hydro code implementations of ejecta dynamics at shocked interfaces presume a source distribution function of particulate masses and velocities, f_0(m, v; t). Some of the properties of this source distribution function have been determined from extensive Taylor and supported wave experiments on shock loaded Sn interfaces of varying surface and subsurface morphology. Such experiments measure the mass moment of f_0 under vacuum conditions assuming weak particle-particle interaction and, usually, fully inelastic capture by piezo-electric diagnostic probes. Recently, planar Sn experiments in He, Ar, and Kr gas atmospheres have been carried out to provide transport data both for machined surfaces and for coated surfaces. A hydro code model of ejecta transport usually specifies a criterion for the instantaneous temporal appearance of ejecta with source distribution f_0(m, v; t_0). Under the further assumption of separability, f_0(m, v; t_0) = f_1(m)f_2(v), the motion of particles under the influence of gas dynamic forces is calculated. For the situation of non-interacting particulates, interacting with a gas via drag forces, with the assumption of separability and simplified approximations to the Reynolds number dependence of the drag coefficient, the dynamical equation for the time evolution of the distribution function, f(r, v, m; t), can be resolved as a one-dimensional integral which can be compared to a direct hydro simulation as a test problem. Such solutions can also be used for preliminary analysis of experimental data. We report solutions for several shape dependent drag coefficients and analyze the results of recent planar dsh experiments in Ar and Xe.

  6. Multiple linear combination (MLC) regression tests for common variants adapted to linkage disequilibrium structure.

    Science.gov (United States)

    Yoo, Yun Joo; Sun, Lei; Poirier, Julia G; Paterson, Andrew D; Bull, Shelley B

    2017-02-01

    By jointly analyzing multiple variants within a gene, instead of one at a time, gene-based multiple regression can improve power, robustness, and interpretation in genetic association analysis. We investigate multiple linear combination (MLC) test statistics for analysis of common variants under realistic trait models with linkage disequilibrium (LD) based on HapMap Asian haplotypes. MLC is a directional test that exploits LD structure in a gene to construct clusters of closely correlated variants recoded such that the majority of pairwise correlations are positive. It combines variant effects within the same cluster linearly, and aggregates cluster-specific effects in a quadratic sum of squares and cross-products, producing a test statistic with reduced degrees of freedom (df) equal to the number of clusters. By simulation studies of 1000 genes from across the genome, we demonstrate that MLC is a well-powered and robust choice among existing methods across a broad range of gene structures. Compared to minimum P-value, variance-component, and principal-component methods, the mean power of MLC is never much lower than that of other methods, and can be higher, particularly with multiple causal variants. Moreover, the variation in gene-specific MLC test size and power across 1000 genes is less than that of other methods, suggesting it is a complementary approach for discovery in genome-wide analysis. The cluster construction of the MLC test statistics helps reveal within-gene LD structure, allowing interpretation of clustered variants as haplotypic effects, while multiple regression helps to distinguish direct and indirect associations. © 2016 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.

  7. Testing developmental pathways to antisocial personality problems

    NARCIS (Netherlands)

    S. Diamantopoulou; F.C. Verhulst (Frank); J. van der Ende (Jan)

    2010-01-01

    This study examined the development of antisocial personality problems (APP) in young adulthood from disruptive behaviors and internalizing problems in childhood and adolescence. Parent ratings of 507 children's (aged 6-8 years) symptoms of attention deficit hyperactivity disorder,

  8. Testing Developmental Pathways to Antisocial Personality Problems

    Science.gov (United States)

    Diamantopoulou, Sofia; Verhulst, Frank C.; van der Ende, Jan

    2010-01-01

    This study examined the development of antisocial personality problems (APP) in young adulthood from disruptive behaviors and internalizing problems in childhood and adolescence. Parent ratings of 507 children's (aged 6-8 years) symptoms of attention deficit hyperactivity disorder, oppositional defiant disorder, and anxiety, were linked to…

  9. Neoclassical versus Frontier Production Models? Testing for the Skewness of Regression Residuals

    DEFF Research Database (Denmark)

    Kuosmanen, T; Fosgerau, Mogens

    2009-01-01

    The empirical literature on production and cost functions is divided into two strands. The neoclassical approach concentrates on model parameters, while the frontier approach decomposes the disturbance term to a symmetric noise term and a positively skewed inefficiency term. We propose a theoretical justification for the skewness of the inefficiency term, arguing that this skewness is the key testable hypothesis of the frontier approach. We propose to test the regression residuals for skewness in order to distinguish the two competing approaches. Our test builds directly upon the asymmetry...

  10. Projected regression method for solving Fredholm integral equations arising in the analytic continuation problem of quantum physics

    International Nuclear Information System (INIS)

    Arsenault, Louis-François; Millis, Andrew J; Neuberg, Richard; Hannah, Lauren A

    2017-01-01

    We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful inputs. Applying machine learning to this database generates a regression function of controlled complexity, which returns approximate solutions for previously unseen inputs; the approximate solutions are then projected onto the subspace of functions satisfying relevant constraints. Under standard error metrics the method performs as well or better than the Maximum Entropy method for low input noise and is substantially more robust to increased input noise. We suggest that the methodology will be similarly effective for other problems involving a formally ill-conditioned inversion of an integral operator, provided that the forward problem can be efficiently solved. (paper)
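    The core idea — solve the stable forward problem many times, then learn a regularized inverse by regression — can be sketched in a few lines of Python. This toy version uses ridge regression on a synthetic Gaussian kernel and omits the constraint-projection step; all quantities are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)

        # Discretized smooth (hence ill-conditioned) Fredholm kernel on a 60-point grid.
        t = np.linspace(0, 1, 60)
        K = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.02)

        # Database of (output, input) pairs generated by solving the stable forward problem.
        inputs = np.array([np.exp(-(t - c) ** 2 / w)
                           for c in rng.uniform(0.2, 0.8, 500)
                           for w in (0.01, 0.05)])
        outputs = inputs @ K.T + 0.01 * rng.normal(size=inputs.shape)

        # Ridge regression from outputs back to inputs: a learned, regularized inverse.
        lam = 1e-2
        G = outputs
        B = np.linalg.solve(G.T @ G + lam * np.eye(G.shape[1]), G.T @ inputs)

        # Apply the learned inverse to a previously unseen noisy output.
        f_true = np.exp(-(t - 0.5) ** 2 / 0.03)
        g_obs = K @ f_true + 0.01 * rng.normal(size=t.size)
        f_hat = g_obs @ B
        print("max abs error:", np.max(np.abs(f_hat - f_true)))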

  11. Testing a model of research intention among U.K. clinical psychologists: a logistic regression analysis.

    Science.gov (United States)

    Eke, Gemma; Holttum, Sue; Hayward, Mark

    2012-03-01

    Previous research highlights barriers to clinical psychologists conducting research, but has rarely examined U.K. clinical psychologists. The study investigated U.K. clinical psychologists' self-reported research output and tested part of a theoretical model of factors influencing their intention to conduct research. Questionnaires were mailed to 1,300 U.K. clinical psychologists. Three hundred and seventy-four questionnaires were returned (29% response-rate). This study replicated in a U.K. sample the finding that the modal number of publications was zero, highlighted in a number of U.K. and U.S. studies. Research intention was bimodally distributed, and logistic regression classified 78% of cases successfully. Outcome expectations, perceived behavioral control and normative beliefs mediated between research training environment and intention. Further research should explore how research is negotiated in clinical roles, and this issue should be incorporated into prequalification training. © 2012 Wiley Periodicals, Inc.

  12. A meta-regression analysis of 41 Australian problem gambling prevalence estimates and their relationship to total spending on electronic gaming machines.

    Science.gov (United States)

    Markham, Francis; Young, Martin; Doran, Bruce; Sugden, Mark

    2017-05-23

    Many jurisdictions regularly conduct surveys to estimate the prevalence of problem gambling in their adult populations. However, the comparison of such estimates is problematic due to methodological variations between studies. Total consumption theory suggests that an association between mean electronic gaming machine (EGM) and casino gambling losses and problem gambling prevalence estimates may exist. If this is the case, then changes in EGM losses may be used as a proxy indicator for changes in problem gambling prevalence. To test for this association this study examines the relationship between aggregated losses on electronic gaming machines (EGMs) and problem gambling prevalence estimates for Australian states and territories between 1994 and 2016. A Bayesian meta-regression analysis of 41 cross-sectional problem gambling prevalence estimates was undertaken using EGM gambling losses, year of survey and methodological variations as predictor variables. General population studies of adults in Australian states and territory published before 1 July 2016 were considered in scope. 41 studies were identified, with a total of 267,367 participants. Problem gambling prevalence, moderate-risk problem gambling prevalence, problem gambling screen, administration mode and frequency threshold were extracted from surveys. Administrative data on EGM and casino gambling loss data were extracted from government reports and expressed as the proportion of household disposable income lost. Money lost on EGMs is correlated with problem gambling prevalence. An increase of 1% of household disposable income lost on EGMs and in casinos was associated with problem gambling prevalence estimates that were 1.33 times higher [95% credible interval 1.04, 1.71]. There was no clear association between EGM losses and moderate-risk problem gambling prevalence estimates. Moderate-risk problem gambling prevalence estimates were not explained by the models (I² ≥ 0.97; R² ≤ 0.01). The

  13. A meta-regression analysis of 41 Australian problem gambling prevalence estimates and their relationship to total spending on electronic gaming machines

    Directory of Open Access Journals (Sweden)

    Francis Markham

    2017-05-01

    Background: Many jurisdictions regularly conduct surveys to estimate the prevalence of problem gambling in their adult populations. However, the comparison of such estimates is problematic due to methodological variations between studies. Total consumption theory suggests that an association between mean electronic gaming machine (EGM) and casino gambling losses and problem gambling prevalence estimates may exist. If this is the case, then changes in EGM losses may be used as a proxy indicator for changes in problem gambling prevalence. To test for this association this study examines the relationship between aggregated losses on electronic gaming machines (EGMs) and problem gambling prevalence estimates for Australian states and territories between 1994 and 2016. Methods: A Bayesian meta-regression analysis of 41 cross-sectional problem gambling prevalence estimates was undertaken using EGM gambling losses, year of survey and methodological variations as predictor variables. General population studies of adults in Australian states and territory published before 1 July 2016 were considered in scope. 41 studies were identified, with a total of 267,367 participants. Problem gambling prevalence, moderate-risk problem gambling prevalence, problem gambling screen, administration mode and frequency threshold were extracted from surveys. Administrative data on EGM and casino gambling loss data were extracted from government reports and expressed as the proportion of household disposable income lost. Results: Money lost on EGMs is correlated with problem gambling prevalence. An increase of 1% of household disposable income lost on EGMs and in casinos was associated with problem gambling prevalence estimates that were 1.33 times higher [95% credible interval 1.04, 1.71]. There was no clear association between EGM losses and moderate-risk problem gambling prevalence estimates. Moderate-risk problem gambling prevalence estimates were not explained by

  14. Testing Environmental Kuznets Curve in the Selected Transition Economies with Panel Smooth Transition Regression Analysis

    Directory of Open Access Journals (Sweden)

    Mahmut Zortuk

    2016-08-01

    The Environmental Kuznets Curve (EKC) introduces an inverted U-shaped relationship between environmental pollution and economic development. The inverted U-shaped curve is seen as a complete pattern for developed economies. However, our study tests the EKC for developing transition economies of the European Union; therefore, our results could make a significant contribution to the literature. In this paper, the relationship between carbon dioxide (CO2) emissions, gross domestic product (GDP), energy use and urban population is investigated in the Transition Economies (Bulgaria, Croatia, Czech Republic, Estonia, Hungary, Latvia, Lithuania, Poland, Romania, Slovakia and Slovenia). The Environmental Kuznets Curve is tested by panel smooth transition regression for these economies for the 1993-2010 period. As a result of the study, the null hypothesis of linearity was rejected, and the no-remaining-nonlinearity test showed that a smooth transition exists between two regimes (the first below $5176 GDP per capita, the second above) in the related period for these economies.

  15. Using the Coefficient of Determination R² to Test the Significance of Multiple Linear Regression

    Science.gov (United States)

    Quinino, Roberto C.; Reis, Edna A.; Bessegato, Lupercio F.

    2013-01-01

    This article proposes the use of the coefficient of determination as a statistic for hypothesis testing in multiple linear regression based on distributions acquired by beta sampling. (Contains 3 figures.)
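    The classical identity behind such a test expresses the global F statistic directly in terms of R². A short Python sketch of that identity (the article's beta-sampling distributions are not reproduced; the inputs are a hypothetical fitted model):

        from scipy.stats import f as f_dist

        def r2_global_f_test(r2, n, k):
            # Global F-test of H0: all k slopes are zero, written in terms of R^2.
            F = (r2 / k) / ((1 - r2) / (n - k - 1))
            return F, f_dist.sf(F, k, n - k - 1)

        print(r2_global_f_test(r2=0.35, n=50, k=3))  # hypothetical fitted model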

  16. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    Science.gov (United States)

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
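    For a single complete dataset, the partial F statistic is itself a simple function of the coefficients of determination of the nested models, which is what makes a scalar pooling strategy attractive. A Python sketch of that identity (the multiple-imputation pooling itself is not shown; all numbers hypothetical):

        from scipy.stats import f as f_dist

        def partial_f_test(r2_full, r2_reduced, n, p_full, q):
            # Partial F-test for the q coefficients dropped from the full model,
            # expressed entirely through the two coefficients of determination.
            F = ((r2_full - r2_reduced) / q) / ((1 - r2_full) / (n - p_full - 1))
            return F, f_dist.sf(F, q, n - p_full - 1)

        print(partial_f_test(r2_full=0.40, r2_reduced=0.32, n=120, p_full=6, q=2))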

  17. Improvement of the test quality for specific test problems. Proceedings

    International Nuclear Information System (INIS)

    2011-01-01

    This proceedings CD discusses the many factors that are relevant in nearly all tests, as well as their effects on the validity of the test result. Interfaces with technical rules, staff qualification, POD, and validation of test results by supplementary techniques are presented as well. Three of the 17 papers are available as separate records in the ENERGY database.

  18. Predictive genetic tests: problems and pitfalls.

    Science.gov (United States)

    Davis, J G

    1997-12-29

    The role that genetic factors play in medicine has expanded, owing to such recent advances as those made by the Human Genome Project and the work that has spun off from it. The project is focusing particularly on localization and characterization of recognized human genetic disorders, which in turn increases awareness of the potential for improved treatment of these disorders. Technical advances in genetic testing in the absence of effective treatment has presented the health profession with major ethical challenges. The example of the identification of the BRCA1 and BRCA2 genes in families at high risk for breast and ovarian cancer is presented to illustrate the issues of the sensitivity of the method, the degree of susceptibility a positive result implies, the need for and availability of counseling and patient education, and confidentiality of the test results. A compelling need exists for adequate education about medical genetics to raise the "literacy" rate among health professionals.

  19. Cointegrating MiDaS Regressions and a MiDaS Test

    OpenAIRE

    J. Isaac Miller

    2011-01-01

    This paper introduces cointegrating mixed data sampling (CoMiDaS) regressions, generalizing nonlinear MiDaS regressions in the extant literature. Under a linear mixed-frequency data-generating process, MiDaS regressions provide a parsimoniously parameterized nonlinear alternative when the linear forecasting model is over-parameterized and may be infeasible. In spite of potential correlation of the error term both serially and with the regressors, I find that nonlinear least squares consistent...

  20. Accounting for regression-to-the-mean in tests for recent changes in institutional performance: analysis and power.

    Science.gov (United States)

    Jones, Hayley E; Spiegelhalter, David J

    2009-05-30

    Recent changes in individual units are often of interest when monitoring and assessing the performance of healthcare providers. We consider three high profile examples: (a) annual teenage pregnancy rates in English local authorities, (b) quarterly rates of the hospital-acquired infection Clostridium difficile in National Health Service (NHS) Trusts and (c) annual mortality rates following heart surgery in New York State hospitals. Increasingly, government targets call for continual improvements, in each individual provider as well as overall. Owing to the well-known statistical phenomenon of regression-to-the-mean, observed changes between just two measurements are potentially misleading. This problem has received much attention in other areas, but there is a need for guidelines within performance monitoring. In this paper we show theoretically and with worked examples that a simple random effects predictive distribution can be used to 'correct' for the potentially undesirable consequences of regression-to-the-mean on a test for individual change. We discuss connections to the literature in other fields, and build upon this, in particular by examining the effect of the correction on the power to detect genuine changes. It is demonstrated that a gain in average power can be expected, but that this gain is only very slight if the providers are very different from one another, for example due to poor risk adjustment. Further, the power of the corrected test depends on the provider's baseline rate and, although large gains can be expected for some providers, this is at the cost of some power to detect real changes in others. (c) 2009 John Wiley & Sons, Ltd.

  1. Near infrared spectrometric technique for testing fruit quality: optimisation of regression models using genetic algorithms

    Science.gov (United States)

    Isingizwe Nturambirwe, J. Frédéric; Perold, Willem J.; Opara, Umezuruike L.

    2016-02-01

    Near infrared (NIR) spectroscopy has gained extensive use in quality evaluation. It is arguably one of the most advanced spectroscopic tools in non-destructive quality testing of food stuff, from measurement to data analysis and interpretation. NIR spectral data are interpreted through means often involving multivariate statistical analysis, sometimes associated with optimisation techniques for model improvement. The objective of this research was to explore the extent to which genetic algorithms (GA) can be used to enhance model development, for predicting fruit quality. Apple fruits were used, and NIR spectra in the range from 12000 to 4000 cm⁻¹ were acquired on both bruised and healthy tissues, with different degrees of mechanical damage. GAs were used in combination with partial least squares regression methods to develop bruise severity prediction models, and compared to PLS models developed using the full NIR spectrum. A classification model was developed, which clearly separated bruised from unbruised apple tissue. GAs helped improve prediction models by over 10%, in comparison with full spectrum-based models, as evaluated in terms of error of prediction (root mean square error of cross-validation). PLS models to predict internal quality, such as sugar content and acidity, were developed and compared to the versions optimized by genetic algorithm. Overall, the results highlighted the potential use of the GA method to improve speed and accuracy of fruit quality prediction.

  2. EPS-LASSO: Test for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.

    Science.gov (United States)

    Xu, Chao; Fang, Jian; Shen, Hui; Wang, Yu-Ping; Deng, Hong-Wen

    2018-01-25

    Extreme phenotype sampling (EPS) is a broadly-used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in extreme phenotypic samples, EPS can boost the association power compared to random sampling. Most existing statistical methods for EPS examine the genetic factors individually, although many quantitative traits have multiple genetic factors underlying their variation. It is desirable to model the joint effects of genetic factors, which may increase the power and identify novel quantitative trait loci under EPS. The joint analysis of genetic data in high-dimensional situations requires specialized techniques, e.g., the least absolute shrinkage and selection operator (LASSO). Although there are extensive research and application related to LASSO, the statistical inference and testing for the sparse model under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with hypothesis test for high-dimensional regression under EPS based on a decorrelated score function. The comprehensive simulation shows EPS-LASSO outperforms existing methods with stable type I error and FDR control. EPS-LASSO can provide a consistent power for both low- and high-dimensional situations compared with the other methods dealing with high-dimensional situations. The power of EPS-LASSO is close to other low-dimensional methods when the causal effect sizes are small and is superior when the effects are large. Applying EPS-LASSO to a transcriptome-wide gene expression study for obesity reveals 10 significant body mass index associated genes. Our results indicate that EPS-LASSO is an effective method for EPS data analysis, which can account for correlated predictors. The source code is available at https://github.com/xu1912/EPSLASSO. hdeng2@tulane.edu. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved. For Permissions, please

  3. [Comparison of application of Cochran-Armitage trend test and linear regression analysis for rate trend analysis in epidemiology study].

    Science.gov (United States)

    Wang, D Z; Wang, C; Shen, C F; Zhang, Y; Zhang, H; Song, G D; Xue, X D; Xu, Z L; Zhang, S; Jiang, G H

    2017-05-10

    We described the time trend of the acute myocardial infarction (AMI) incidence rate in Tianjin from 1999 to 2013 with the Cochran-Armitage trend (CAT) test and linear regression analysis, and the results were compared. Based on the actual population, the CAT test had much stronger statistical power than linear regression analysis for both the overall incidence trend and the age-specific incidence trends (Cochran-Armitage trend P value < linear regression P value). The statistical power of the CAT test decreased, while the result of linear regression analysis remained the same, when the population size was reduced by 100 times and the AMI incidence rate remained unchanged. The two statistical methods have their advantages and disadvantages. It is necessary to choose the statistical method according to the fitting degree of the data, or to comprehensively analyze the results of both methods.
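    For reference, the Cochran-Armitage statistic for a linear trend in proportions is short enough to implement directly. A Python sketch with hypothetical counts (not the study's data):

        import numpy as np
        from scipy.stats import norm

        def cochran_armitage(cases, totals, scores=None):
            # Two-sided Cochran-Armitage test for a linear trend in proportions.
            cases = np.asarray(cases, float)
            totals = np.asarray(totals, float)
            s = np.arange(cases.size) if scores is None else np.asarray(scores, float)
            p = cases.sum() / totals.sum()
            t = np.sum(s * (cases - totals * p))
            var = p * (1 - p) * (np.sum(totals * s ** 2)
                                 - np.sum(totals * s) ** 2 / totals.sum())
            z = t / np.sqrt(var)
            return z, 2 * norm.sf(abs(z))

        # Hypothetical yearly case counts over five years in a fixed population.
        print(cochran_armitage([120, 135, 150, 160, 171], [100000] * 5))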

  4. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject. Expanded coverage of diagnostics and methods of model fitting. Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. More than 200 problems throughout the book plus outline solutions for the exercises. This revision has been extensively class-tested.

  5. Penalized linear regression for discrete ill-posed problems: A hybrid least-squares and mean-squared error approach

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag

    2016-12-19

    This paper proposes a new approach to find the regularization parameter for linear least-squares discrete ill-posed problems. In the proposed approach, an artificial perturbation matrix with a bounded norm is forced into the discrete ill-posed model matrix. This perturbation is introduced to enhance the singular-value (SV) structure of the matrix and hence to provide a better solution. The proposed approach is derived to select the regularization parameter in a way that minimizes the mean-squared error (MSE) of the estimator. Numerical results demonstrate that the proposed approach outperforms a set of benchmark methods in most cases when applied to different scenarios of discrete ill-posed problems. In addition, the proposed approach enjoys the lowest run-time and offers the highest level of robustness amongst all the tested methods.
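    Although the perturbation-based selection rule of the paper is not reproduced here, the setting is easy to sketch: a Tikhonov (ridge) regularized solution of a discrete ill-posed system, where some value of the regularization parameter minimizes the MSE of the estimate. A Python toy with synthetic data (the ground truth is used only to show that an MSE-optimal parameter exists; practical rules must estimate it blindly):

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic discrete ill-posed problem: a smooth kernel gives a huge condition number.
        n = 40
        t = np.linspace(0, 1, n)
        A = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.01)
        x_true = np.sin(2 * np.pi * t)
        b = A @ x_true + 1e-3 * rng.normal(size=n)

        def ridge(A, b, lam):
            # Tikhonov-regularized least-squares solution.
            return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

        # With the ground truth in hand we can locate the MSE-optimal lambda.
        lams = np.logspace(-10, 0, 50)
        mses = [np.mean((ridge(A, b, lam) - x_true) ** 2) for lam in lams]
        print("MSE-optimal lambda ~", lams[int(np.argmin(mses))])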

  6. Autistic Regression

    Science.gov (United States)

    Matson, Johnny L.; Kozlowski, Alison M.

    2010-01-01

    Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…

  7. Testing and Modeling Fuel Regression Rate in a Miniature Hybrid Burner

    Directory of Open Access Journals (Sweden)

    Luciano Fanton

    2012-01-01

    Ballistic characterization of an extended group of innovative HTPB-based solid fuel formulations for hybrid rocket propulsion was performed in a lab-scale burner. An optical time-resolved technique was used to assess the quasi-steady regression history of single-perforation, cylindrical samples. The effects of metalized additives and radiant heat transfer on the regression rate of such formulations were assessed. Under the investigated operating conditions and based on phenomenological models from the literature, analyses of the collected experimental data show an appreciable influence of the radiant heat flux from burnt gases and soot for both unloaded and loaded fuel formulations. Pure HTPB regression rate data are satisfactorily reproduced, while the impressive initial regression rates of metalized formulations require further assessment.

  8. Testing and Estimating Shape-Constrained Nonparametric Density and Regression in the Presence of Measurement Error

    KAUST Repository

    Carroll, Raymond J.; Delaigle, Aurore; Hall, Peter

    2011-01-01

    In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment

  9. Gaussian Processes and Polynomial Chaos Expansion for Regression Problem: Linkage via the RKHS and Comparison via the KL Divergence

    Directory of Open Access Journals (Sweden)

    Liang Yan

    2018-03-01

    In this paper, we examine two widely-used approaches, the polynomial chaos expansion (PCE) and Gaussian process (GP) regression, for the development of surrogate models. The theoretical differences between the PCE and GP approximations are discussed. A state-of-the-art PCE approach is constructed based on high precision quadrature points; however, the need for truncation may result in potential precision loss; the GP approach performs well on small datasets and allows a fine and precise trade-off between fitting the data and smoothing, but its overall performance depends largely on the training dataset. The reproducing kernel Hilbert space (RKHS) and Mercer's theorem are introduced to form a linkage between the two methods. The theorem has proven that the two surrogates can be embedded in two isomorphic RKHS, by which we propose a novel method named Gaussian process on polynomial chaos basis (GPCB) that incorporates the PCE and GP. A theoretical comparison is made between the PCE and GPCB with the help of the Kullback-Leibler divergence. We show that the GPCB is as stable and accurate as the PCE method. Furthermore, the GPCB is a one-step Bayesian method that chooses the best subset of RKHS in which the true function should lie, while the PCE method requires an adaptive procedure. Simulations of 1D and 2D benchmark functions show that GPCB outperforms both the PCE and classical GP methods. In order to solve high dimensional problems, a random sample scheme with a constructive design (i.e., a tensor product of quadrature points) is proposed to generate a valid training dataset for the GPCB method. This approach utilizes the nature of the high numerical accuracy underlying the quadrature points while ensuring the computational feasibility. Finally, the experimental results show that our sample strategy has a higher accuracy than classical experimental designs; meanwhile, it is suitable for solving high dimensional problems.

  10. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10⁶ km². In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.

  11. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    Science.gov (United States)

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…

  12. Accuracy of Bayes and Logistic Regression Subscale Probabilities for Educational and Certification Tests

    Science.gov (United States)

    Rudner, Lawrence

    2016-01-01

    In the machine learning literature, it is commonly accepted as fact that as calibration sample sizes increase, Naïve Bayes classifiers initially outperform Logistic Regression classifiers in terms of classification accuracy. Applied to subtests from an on-line final examination and from a highly regarded certification examination, this study shows…

  13. The Effect of Multicollinearity and the Violation of the Assumption of Normality on the Testing of Hypotheses in Regression Analysis.

    Science.gov (United States)

    Vasu, Ellen S.; Elmore, Patricia B.

    The effects of the violation of the assumption of normality coupled with the condition of multicollinearity upon the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…

  14. Methods for significance testing of categorical covariates in logistic regression models after multiple imputation: power and applicability analysis

    NARCIS (Netherlands)

    Eekhout, I.; Wiel, M.A. van de; Heymans, M.W.

    2017-01-01

    Background. Multiple imputation is a recommended method to handle missing data. For significance testing after multiple imputation, Rubin’s Rules (RR) are easily applied to pool parameter estimates. In a logistic regression model, to consider whether a categorical covariate with more than two levels

  15. Simulation and Analysis of Converging Shock Wave Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey, Scott D. [Los Alamos National Laboratory; Shashkov, Mikhail J. [Los Alamos National Laboratory

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  16. Methodical approaches to solving special problems of testing. Seminar papers

    International Nuclear Information System (INIS)

    1996-01-01

    This Seminar volume introduces concepts and applications from different areas of application of ultrasonic testing and other non-destructive test methods in 18 lectures, in order to give an idea of new trends in development and stimuli for special solutions to problems. 3 articles were recorded separately for the ENERGY data bank. (orig./MM)

  17. Extending multivariate distance matrix regression with an effect size measure and the asymptotic null distribution of the test statistic.

    Science.gov (United States)

    McArtor, Daniel B; Lubke, Gitta H; Bergeman, C S

    2017-12-01

    Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains.
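    A bare-bones Python sketch of the MDMR statistic with a permutation p-value follows (the paper's asymptotic null distribution and per-outcome effect size are not reproduced; the constant degrees-of-freedom scaling is omitted since it does not change the permutation p-value; all data are synthetic):

        import numpy as np

        rng = np.random.default_rng(4)

        def mdmr_pseudo_f(D, X, n_perm=999):
            # MDMR test statistic with a permutation p-value.
            n = D.shape[0]
            C = np.eye(n) - np.ones((n, n)) / n
            G = -0.5 * C @ (D ** 2) @ C                  # Gower-centered distance matrix
            X1 = np.column_stack([np.ones(n), X])
            H = X1 @ np.linalg.solve(X1.T @ X1, X1.T)    # hat matrix of the predictors
            stat = lambda G: np.trace(H @ G) / np.trace((np.eye(n) - H) @ G)
            obs = stat(G)
            perms = np.array([stat(G[np.ix_(idx, idx)])
                              for idx in (rng.permutation(n) for _ in range(n_perm))])
            return obs, (1 + np.sum(perms >= obs)) / (n_perm + 1)

        # Toy data: Euclidean distances on three outcomes, two predictors.
        Y = rng.normal(size=(60, 3))
        X = rng.normal(size=(60, 2))
        D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=2)
        print(mdmr_pseudo_f(D, X))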

  18. Development of a computer program to support an efficient non-regression test of a thermal-hydraulic system code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jun Yeob; Jeong, Jae Jun [School of Mechanical Engineering, Pusan National University, Busan (Korea, Republic of); Suh, Jae Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    During the development process of a thermal-hydraulic system code, a non-regression test (NRT) must be performed repeatedly in order to prevent software regression. The NRT process, however, is time-consuming and labor-intensive. Thus, automation of this process is an ideal solution. In this study, we have developed a program to support an efficient NRT for the SPACE code and demonstrated its usability. This results in a high degree of efficiency for code development. The program was developed using the Visual Basic for Applications and designed so that it can be easily customized for the NRT of other computer codes.
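    The heart of any such NRT helper is a tolerance-aware comparison of stored and freshly generated answer files. A generic Python sketch of that comparison (not the SPACE tool itself, which the record says was written in Visual Basic for Applications; paths and tolerance are hypothetical):

        import sys

        def compare_answer_files(ref_path, new_path, rel_tol=1e-12):
            # Flag a regression when any numeric entry differs beyond rel_tol,
            # or when any non-numeric token differs at all.
            with open(ref_path) as ref, open(new_path) as new:
                for lineno, (r, n) in enumerate(zip(ref, new), start=1):
                    for a, b in zip(r.split(), n.split()):
                        try:
                            x, y = float(a), float(b)
                        except ValueError:
                            if a != b:
                                return f"text mismatch at line {lineno}: {a} vs {b}"
                            continue
                        if abs(x - y) > rel_tol * max(abs(x), abs(y), 1.0):
                            return f"numeric mismatch at line {lineno}: {a} vs {b}"
            return None

        if __name__ == "__main__":
            result = compare_answer_files(sys.argv[1], sys.argv[2])
            print(result or "PASS")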

  19. Experimentally testing the dependence of momentum transport on second derivatives using Gaussian process regression

    Science.gov (United States)

    Chilenski, M. A.; Greenwald, M. J.; Hubbard, A. E.; Hughes, J. W.; Lee, J. P.; Marzouk, Y. M.; Rice, J. E.; White, A. E.

    2017-12-01

    It remains an open question to explain the dramatic change in intrinsic rotation induced by slight changes in electron density (White et al 2013 Phys. Plasmas 20 056106). One proposed explanation is that momentum transport is sensitive to the second derivatives of the temperature and density profiles (Lee et al 2015 Plasma Phys. Control. Fusion 57 125006), but it is widely considered to be impossible to measure these higher derivatives. In this paper, we show that it is possible to estimate second derivatives of electron density and temperature using a nonparametric regression technique known as Gaussian process regression. This technique avoids over-constraining the fit by not assuming an explicit functional form for the fitted curve. The uncertainties, obtained rigorously using Markov chain Monte Carlo sampling, are small enough that it is reasonable to explore hypotheses which depend on second derivatives. It is found that the differences in the second derivatives of n_e and T_e between the peaked and hollow rotation cases are rather small, suggesting that changes in the second derivatives are not likely to explain the experimental results.
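    Because the squared-exponential kernel is smooth, the posterior mean of the second derivative is available analytically by differentiating the kernel. A stripped-down Python sketch on synthetic profile data (fixed hyperparameters; the paper's MCMC uncertainty quantification is not reproduced):

        import numpy as np

        rng = np.random.default_rng(5)

        # Noisy synthetic profile standing in for a measured T_e profile.
        x = np.linspace(0, 1, 25)
        y = np.exp(-x ** 2 / 0.3) + 0.01 * rng.normal(size=x.size)

        l, sig_n = 0.15, 0.01    # fixed kernel length scale and noise level
        K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * l ** 2))
        alpha = np.linalg.solve(K + sig_n ** 2 * np.eye(x.size), y)

        def second_derivative(xs):
            # Posterior mean of f'' from the analytic second derivative of the RBF kernel.
            d = xs[:, None] - x[None, :]
            k2 = (d ** 2 / l ** 4 - 1 / l ** 2) * np.exp(-d ** 2 / (2 * l ** 2))
            return k2 @ alpha

        print(second_derivative(np.array([0.0, 0.5])))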

  20. Multi-stratified multiple regression tests of the linear/no-threshold theory of radon-induced lung cancer

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1992-01-01

    A plot of lung-cancer rates versus radon exposures in 965 US counties, or in all US states, has a strong negative slope, b, in sharp contrast to the strong positive slope predicted by linear/no-threshold theory. The discrepancy between these slopes exceeds 20 standard deviations (SD). Including smoking frequency in the analysis substantially improves fits to a linear relationship but has little effect on the discrepancy in b, because correlations between smoking frequency and radon levels are quite weak. Including 17 socioeconomic variables (SEV) in multiple regression analysis reduces the discrepancy to 15 SD. Data were divided into segments by stratifying on each SEV in turn, and on geography, and on both simultaneously, giving over 300 data sets to be analyzed individually, but negative slopes predominated. The slope is negative whether one considers only the most urban counties or only the most rural; only the richest or only the poorest; only the richest in the South Atlantic region or only the poorest in that region; and so on, for all the strata in between. Since this is an ecological study, the well-known problems with ecological studies were investigated and found not to be applicable here. The "ecological fallacy" was shown not to apply in testing a linear/no-threshold theory, and the vulnerability to confounding is greatly reduced when confounding factors are only weakly correlated with radon levels, as is generally the case here. All confounding factors known to correlate with radon and with lung cancer were investigated quantitatively and found to have little effect on the discrepancy

  1. Hypothesis Designs for Three-Hypothesis Test Problems

    OpenAIRE

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  2. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  3. Bradley’s Regress, Russell’s States of Affairs, and Some General Remarks on the Problem

    Directory of Open Access Journals (Sweden)

    Holger Leerhoff

    2008-12-01

    In this paper, I will give a presentation of Bradley's two main arguments against the reality of relations. Whereas one of his arguments is highly specific to Bradley's metaphysical background, his famous regress argument seems to pose a serious threat not only for ontological pluralism, but especially for states of affairs as an ontological category. Amongst the proponents of states-of-affairs ontologies two groups can be distinguished: One group holds states of affairs to be complexes consisting of their particular and universal constituents alone, the other holds that there has to be a "unifying relation" of some sort to establish the unity of a given state of affairs. Bradley's regress is often conceived to be a compelling argument against the first and for the latter. I will argue that the latter approaches have no real advantage over the simpler theories—neither in the light of Bradley's regress nor in other respects.

  4. Achievement Gap Projection for Standardized Testing through Logistic Regression within a Large Arizona School District

    Science.gov (United States)

    Kellermeyer, Steven Bruce

    2011-01-01

    In the last few decades high-stakes testing has become more political than educational. The Districts within Arizona are bound by the mandates of both AZ LEARNS and the No Child Left Behind Act of 2001. At the time of this writing, both legislative mandates relied on the Arizona Instrument for Measuring Standards (AIMS) as State Tests for gauging…

  5. Testing the transferability of regression equations derived from small sub-catchments to a large area in central Sweden

    Directory of Open Access Journals (Sweden)

    C. Xu

    2003-01-01

    There is an ever increasing need to apply hydrological models to catchments where streamflow data are unavailable or to large geographical regions where calibration is not feasible. Estimation of model parameters from spatial physical data is the key issue in the development and application of hydrological models at various scales. To investigate the suitability of transferring the regression equations relating model parameters to physical characteristics developed from small sub-catchments to a large region for estimating model parameters, a conceptual snow and water balance model was optimised on all the sub-catchments in the region. A multiple regression analysis related model parameters to physical data for the catchments, and the regression equations derived from the small sub-catchments were used to calculate regional parameter values for the large basin using spatially aggregated physical data. For the model tested, the results support the suitability of transferring the regression equations to the larger region. Keywords: water balance modelling, large scale, multiple regression, regionalisation

  6. The inverse problem of the magnetostatic nondestructive testing

    International Nuclear Information System (INIS)

    Pechenkov, A.N.; Shcherbinin, V.E.

    2006-01-01

    The inverse problem of magnetostatic nondestructive testing consists in the calculation of the shape and magnetic characteristics of a flaw in a uniform magnetized body with measurement of static magnetic field beyond the body. If the flaw does not contain any magnetic material, the inverse problem is reduced to identification of the shape and magnetic susceptibility of the substance. This case has been considered in the study.

  7. A Fast Solution of the Lindley Equations for the M-Group Regression Problem. Technical Report 78-3, October 1977 through May 1978.

    Science.gov (United States)

    Molenaar, Ivo W.

    The technical problems involved in obtaining Bayesian model estimates for the regression parameters in m similar groups are studied. The available computer programs, BPREP (BASIC), and BAYREG, both written in FORTRAN, require an amount of computer processing that does not encourage regular use. These programs are analyzed so that the performance…

  8. Some problems in use of the moral judgment test.

    Science.gov (United States)

    Villegas de Posada, Cristina

    2005-06-01

    The Moral Judgment Test has been widely used in evaluation of moral development; however, it presents some problems related to the trait measured, reliability, and validity of its summary score (C-index). This index reflects consistency in moral judgment, but this construct is different from moral development as stated by Kohlberg. Therefore, users interested in the latter evaluation should refer to other indexes derived from the test. Some of the analyzed problems could be partially corrected with more theory and research on moral consistency as a component of moral competence.

  9. Group Work Tests for Context-Rich Problems

    Science.gov (United States)

    Meyer, Chris

    2016-05-01

    The group work test is an assessment strategy that promotes higher-order thinking skills for solving context-rich problems. With this format, teachers are able to pose challenging, nuanced questions on a test, while providing the support weaker students need to get started and show their understanding. The test begins with a group discussion phase, when students are given a "number-free" version of the problem. This phase allows students to digest the story-like problem, explore solution ideas, and alleviate some test anxiety. After 10-15 minutes of discussion, students inform the instructor of their readiness for the individual part of the test. What follows next is a pedagogical phase change from lively group discussion to quiet individual work. The group work test is a natural continuation of the group work in our daily physics classes and helps reinforce the importance of collaboration. This method has met with success at York Mills Collegiate Institute, in Toronto, Ontario, where it has been used consistently for unit tests and the final exam of the grade 12 university preparation physics course.

  10. Spatial regression test for ensuring temperature data quality in southern Spain

    Science.gov (United States)

    Estévez, J.; Gavilán, P.; García-Marín, A. P.

    2018-01-01

    Quality assurance of meteorological data is crucial for ensuring the reliability of applications and models that use such data as input variables, especially in the field of environmental sciences. Spatial validation of meteorological data is based on the application of quality control procedures using data from neighbouring stations to assess the validity of data from a candidate station (the station of interest). These kinds of tests, which are referred to in the literature as spatial consistency tests, take data from neighbouring stations in order to estimate the corresponding measurement at the candidate station. These estimations can be made by weighting values according to the distance between the stations or to the coefficient of correlation, among other methods. The test applied in this study relies on statistical decision-making and uses a weighting based on the standard error of the estimate. This paper summarizes the results of the application of this test to maximum, minimum and mean temperature data from the Agroclimatic Information Network of Andalusia (southern Spain). This quality control procedure includes a decision based on a factor f, the fraction of potential outliers for each station across the region. Using GIS techniques, the geographic distribution of the errors detected has been also analysed. Finally, the performance of the test was assessed by evaluating its effectiveness in detecting known errors.
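    A minimal Python sketch of a spatial consistency check in this spirit: a candidate observation is compared against a weighted estimate from its neighbours, with weights based on the standard errors of the neighbour-based estimates. All numbers are hypothetical, and the exact weighting and decision factor of the operational test are not reproduced.

        import numpy as np

        def spatial_consistency_check(candidate, estimates, est_se, f_crit=3.0):
            # Pool the neighbour-based estimates with inverse-variance weights and
            # flag the candidate when it falls outside f_crit pooled standard errors.
            estimates = np.asarray(estimates, float)
            se = np.asarray(est_se, float)
            w = 1.0 / se ** 2
            pooled = np.sum(w * estimates) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            return pooled, abs(candidate - pooled) > f_crit * pooled_se

        # Hypothetical daily maximum temperatures (deg C) estimated from four neighbours.
        print(spatial_consistency_check(31.8, [28.9, 29.3, 28.5, 29.1], [0.8, 0.6, 0.9, 0.7]))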

  11. High cycle fatigue test and regression methods of S-N curve

    International Nuclear Information System (INIS)

    Kim, D. W.; Park, J. Y.; Kim, W. G.; Yoon, J. H.

    2011-11-01

    The fatigue design curves in the ASME Boiler and Pressure Vessel Code Section III are based on the assumption that fatigue life is infinite after 10⁶ cycles. This is because standard fatigue testing equipment prior to the past decades was limited in speed to less than 200 cycles per second. Traditional servo-hydraulic machines work at a frequency of 50 Hz. Servo-hydraulic machines working at 1000 Hz have been developed after 1997; these machines guarantee displacements of up to ±0.1 mm and dynamic loads of ±20 kN. The frequency of resonant fatigue test machines is 50-250 Hz. Various forced-vibration-based systems work at 500 Hz or 1.8 kHz. Rotating bending machines allow testing frequencies of 0.1-200 Hz. The main advantage of ultrasonic fatigue testing at 20 kHz is the greatly reduced testing time. Although the S-N curve is determined by experiment, the fatigue strength corresponding to a given fatigue life should be determined by a statistical method considering the scatter of fatigue properties. In this report, the statistical methods for evaluation of fatigue test data are investigated.
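    As a small illustration of S-N regression in the high-cycle regime, here is a Basquin-type power-law fit in log-log coordinates in Python (hypothetical data; the statistical design-curve derivations discussed in the report go further than this):

        import numpy as np

        # Hypothetical high-cycle fatigue data: stress amplitude (MPa) vs cycles to failure.
        S = np.array([420, 380, 350, 320, 300, 285], float)
        N = np.array([2.1e4, 8.5e4, 3.0e5, 1.2e6, 5.5e6, 2.0e7])

        # Basquin form S = A * N**b is linear in log-log space: log S = log A + b log N.
        b, logA = np.polyfit(np.log10(N), np.log10(S), 1)
        print(f"Basquin exponent b = {b:.3f}, coefficient A = {10 ** logA:.0f} MPa")

        # Design practice offsets the mean curve by a multiple of the residual scatter.
        resid = np.log10(S) - (logA + b * np.log10(N))
        print("residual std of log10(S):", resid.std(ddof=2))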

  12. A non-parametric test for partial monotonicity in multiple regression

    NARCIS (Netherlands)

    van Beek, M.; Daniëls, H.A.M.

    Partial positive (negative) monotonicity in a dataset is the property that an increase in an independent variable, ceteris paribus, generates an increase (decrease) in the dependent variable. A test for partial monotonicity in datasets could (1) increase model performance if monotonicity may be

  13. Relationship between Academic Stress and Suicidal Ideation: Testing for Depression as a Mediator Using Multiple Regression

    Science.gov (United States)

    Ang, Rebecca P.; Huan, Vivien S.

    2006-01-01

    Relations among academic stress, depression, and suicidal ideation were examined in 1,108 Asian adolescents 12-18 years old from a secondary school in Singapore. Using Baron and Kenny's [J Pers Soc Psychol 51:1173-1192, 1986] framework, this study tested the prediction that adolescent depression mediated the relationship between academic stress…

  14. Estimation of Genetic Parameters for First Lactation Monthly Test-day Milk Yields using Random Regression Test Day Model in Karan Fries Cattle

    Directory of Open Access Journals (Sweden)

    Ajay Singh

    2016-06-01

    A single trait linear mixed random regression test-day model was applied for the first time for analyzing the first lactation monthly test-day milk yield records in Karan Fries cattle. The test-day milk yield data was modeled using a random regression model (RRM) considering different orders of Legendre polynomials for the additive genetic effect (4th order) and the permanent environmental effect (5th order). Data pertaining to 1,583 lactation records spread over a period of 30 years were recorded and analyzed in the study. The variance component, heritability and genetic correlations among test-day milk yields were estimated using RRM. RRM heritability estimates of test-day milk yield varied from 0.11 to 0.22 in different test-day records. The estimates of genetic correlations between different test-day milk yields ranged from 0.01 (test-day 1 [TD-1] and TD-11) to 0.99 (TD-4 and TD-5). The magnitudes of genetic correlations between test-day milk yields decreased as the interval between test-days increased, and adjacent test-days had higher correlations. Additive genetic and permanent environment variances were higher for test-day milk yields at both ends of lactation. The residual variance was observed to be lower than the permanent environment variance for all the test-day milk yields.

  15. Feasibility testing for dial-a-ride problems

    DEFF Research Database (Denmark)

    Haugland, Dag; Ho, Sin C.

    Hunsaker and Savelsbergh have proposed an algorithm for testing feasibility of a route in the solution to the dial-a-ride problem. The constraints that are checked are load capacity constraints, time windows, ride time bounds and wait time bounds. The algorithm has linear running time. By virtue...

  16. Feasibility Testing for Dial-a-Ride Problems

    DEFF Research Database (Denmark)

    Haugland, Dag; Ho, Sin C.

    2010-01-01

    Hunsaker and Savelsbergh have proposed an algorithm for testing feasibility of a route in the solution to the dial-a-ride problem. The constraints that are checked are load capacity constraints, time windows, ride time bounds and wait time bounds. The algorithm has linear running time. By virtue...

  17. Genetic analysis of somatic cell score in Danish dairy cattle using random regression test-day model

    DEFF Research Database (Denmark)

    Elsaid, Reda; Sabry, Ayman; Lund, Mogens Sandø

    2011-01-01

    ,233 Danish Holstein cows, were extracted from the national milk recording database. Each data set was analyzed with random regression models using AI-REML. Fixed effects in all models were age at first calving, herd test day, days carrying calf, effects of germ plasm importation (e.g. additive breed effects) … and low between the beginning and the end of lactation. The estimated environmental correlations were lower than the genetic correlations, but the trends were similar. Based on test-day records, the accuracy of genetic evaluations for SCC should be improved when the variation in heritabilities...

  18. Testing Homogeneity in a Semiparametric Two-Sample Problem

    Directory of Open Access Journals (Sweden)

    Yukun Liu

    2012-01-01

    We study a two-sample homogeneity testing problem, in which one sample comes from a population with density f(x) and the other is from a mixture population with mixture density (1−λ)f(x)+λg(x). This problem arises naturally in many statistical applications, such as tests for partial differential gene expression in microarray studies or genetic studies of gene mutation. Under the semiparametric assumption g(x) = f(x)exp(α+βx), a penalized empirical likelihood ratio test can be constructed, but its implementation is hindered by the fact that there is neither a feasible algorithm for computing the test statistic nor any available results on its theoretical properties. To circumvent these difficulties, we propose an EM test based on the penalized empirical likelihood. We prove that the EM test has a simple chi-square limiting distribution, and we also demonstrate its competitive testing performance by simulations. A real-data example is used to illustrate the proposed methodology.
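
    As a quick numerical illustration of the exponential-tilt assumption above (an editorial sketch, not part of the paper's EM test): if f is the standard normal density, then g(x) = f(x)exp(α + βx) integrates to one exactly when α = −β²/2, making g the N(β, 1) density. The check below uses SciPy quadrature.

      import numpy as np
      from scipy.integrate import quad
      from scipy.stats import norm

      beta = 1.3
      alpha = -beta ** 2 / 2                      # normalizing constant for a N(0,1) base density
      g = lambda x: norm.pdf(x) * np.exp(alpha + beta * x)

      mass, _ = quad(g, -12.0, 12.0)              # total probability mass of the tilted density
      print(round(mass, 6))                       # ~1.0, so g is a valid density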

  19. Regression filter for signal resolution

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-01-01

    The problem considered is that of resolving a measured pulse-height spectrum of a material mixture, e.g. a gamma-ray spectrum or Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to one of multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by implementing the procedure as a computer programme, which was used to unfold test spectra obtained by mixing spectra from a library of arbitrarily chosen spectra and adding a noise component. (U.K.)
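
    The unfolding step described above is, in essence, a least-squares decomposition. A minimal sketch with synthetic spectra follows (not the original programme; the non-negativity constraint is an added, physically motivated assumption):

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      library = rng.random((5, 256))               # 5 constituent spectra, 256 channels each
      true_w = np.array([0.4, 0.0, 1.2, 0.0, 0.7]) # mixing weights used to build the test spectrum
      measured = true_w @ library + rng.normal(0.0, 0.01, 256)  # mixture plus a noise component

      weights, _ = nnls(library.T, measured)       # non-negative least squares: min ||A w - y||
      print(np.round(weights, 2))                  # recovered weights, close to true_w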

  20. Differential diagnosis of degenerative dementias using basic neuropsychological tests: multivariable logistic regression analysis of 301 patients.

    Science.gov (United States)

    Jiménez-Huete, Adolfo; Riva, Elena; Toledano, Rafael; Campo, Pablo; Esteban, Jesús; Barrio, Antonio Del; Franch, Oriol

    2014-12-01

    The validity of neuropsychological tests for the differential diagnosis of degenerative dementias may depend on the clinical context. We constructed a series of logistic models taking this factor into account. We retrospectively analyzed the demographic and neuropsychological data of 301 patients with probable Alzheimer's disease (AD), frontotemporal lobar degeneration (FTLD), or dementia with Lewy bodies (DLB). Nine models were constructed taking into account the diagnostic question (e.g., AD vs. DLB) and subpopulation (incident vs. prevalent). The AD versus DLB model for all patients, including memory recovery and phonological fluency, was highly accurate (area under the curve = 0.919, sensitivity = 90%, and specificity = 80%). The results were comparable in incident and prevalent cases. The FTLD versus AD and DLB versus FTLD models were both inaccurate. The models constructed from basic neuropsychological variables allowed an accurate differential diagnosis of AD versus DLB, but not of FTLD versus AD or DLB.

  1. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
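
    For orientation, the static baseline model mentioned above can be fitted with statsmodels; the pinball (check) loss it minimizes is exactly the objective of the linear-programming formulation. Illustrative data only; the time-adaptive simplex updating itself is not reproduced here.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      x = rng.uniform(0, 10, 500)
      y = 2.0 + 0.5 * x + rng.normal(0.0, 1 + 0.2 * x)  # heteroscedastic noise widens upper quantiles
      X = sm.add_constant(x)

      q75 = sm.QuantReg(y, X).fit(q=0.75)          # static 75% quantile regression fit
      print(q75.params)                            # intercept and slope of the 0.75 quantile line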

  2. Penalized linear regression for discrete ill-posed problems: A hybrid least-squares and mean-squared error approach

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag; Ballal, Tarig; Kammoun, Abla; Al-Naffouri, Tareq Y.

    2016-01-01

    This paper proposes a new approach to find the regularization parameter for linear least-squares discrete ill-posed problems. In the proposed approach, an artificial perturbation matrix with a bounded norm is forced into the discrete ill-posed model…

  3. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...

  4. Usability Testing: Too Early? Too Much Talking? Too Many Problems?

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2016-01-01

    Usability testing has evolved in response to a search for tests that are cheap, early, easy, and fast. In addition, it accords with a situational definition of usability, such as the one propounded by ISO. By approaching usability from an organizational perspective, this author argues that usability should (also) be evaluated late, that usability professionals should be wary of thinking aloud, and that they should focus more on effects achievement than problem detection.

  5. Testing problem-solving capacities: differences between individual testing and social group setting.

    Science.gov (United States)

    Krasheninnikova, Anastasia; Schneider, Jutta M

    2014-09-01

    Testing animals individually in problem-solving tasks limits distractions of the subjects during the test, so that they can fully concentrate on the problem. However, such individual performance may not indicate the problem-solving capacity that is commonly employed in the wild, where individuals faced with a novel problem are in their social groups and the presence of a conspecific influences an individual's behaviour. To assess the validity of data gathered from parrots tested individually, we compared performance on patterned-string tasks between parrots tested singly and parrots tested in a social context. We tested two captive groups of orange-winged amazons (Amazona amazonica) with several patterned-string tasks. Despite the differences in testing environment (singly vs. social context), parrots from both groups performed similarly. However, we found that the willingness to participate in the tasks was significantly higher for individuals tested in a social context. The study provides further evidence for the crucial influence of social context on an individual's response to a challenging situation such as a problem-solving test.

  6. A general equation to obtain multiple cut-off scores on a test from multinomial logistic regression.

    Science.gov (United States)

    Bersabé, Rosa; Rivas, Teresa

    2010-05-01

    The authors derive a general equation to compute multiple cut-offs on a total test score in order to classify individuals into more than two ordinal categories. The equation is derived from the multinomial logistic regression (MLR) model, which is an extension of the binary logistic regression (BLR) model to accommodate polytomous outcome variables. From this analytical procedure, cut-off scores are established at the test score (the predictor variable) at which an individual is as likely to be in category j as in category j+1 of an ordinal outcome variable. The application of the complete procedure is illustrated by an example with data from an actual study on eating disorders. In this example, two cut-off scores on the Eating Attitudes Test (EAT-26) scores are obtained in order to classify individuals into three ordinal categories: asymptomatic, symptomatic and eating disorder. Diagnoses were made from the responses to a self-report (Q-EDD) that operationalises DSM-IV criteria for eating disorders. Alternatives to the MLR model to set multiple cut-off scores are discussed.
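
    The crossing-point idea translates directly into code. In the hedged sketch below (synthetic scores, invented category thresholds, scikit-learn's multinomial fit rather than the authors' derivation), a cut-off between adjacent categories j and j+1 is the score at which their linear predictors, and hence their probabilities, are equal.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      score = rng.integers(0, 60, 600).reshape(-1, 1)       # total test scores
      latent = score.ravel() + rng.normal(0, 8, 600)        # noisy severity underlying the outcome
      group = np.digitize(latent, [20, 40])                 # 0=asymptomatic, 1=symptomatic, 2=disorder

      mlr = LogisticRegression(max_iter=1000).fit(score, group)  # multinomial logistic regression
      b0, b1 = mlr.intercept_, mlr.coef_.ravel()
      # P(j) = P(j+1)  <=>  b0[j] + b1[j]*x = b0[j+1] + b1[j+1]*x
      cutoffs = [(b0[j + 1] - b0[j]) / (b1[j] - b1[j + 1]) for j in range(2)]
      print(np.round(cutoffs, 1))                           # two cut-off scores, near 20 and 40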

  7. Genetic Analysis of Milk Yield Using Random Regression Test Day Model in Tehran Province Holstein Dairy Cow

    Directory of Open Access Journals (Sweden)

    A. Seyeddokht

    2012-09-01

    In this research, a random regression test-day model was used to estimate heritabilities and genetic correlations between test-day milk records. A total of 140,357 monthly test-day milk records belonging to 28,292 first-lactation Holstein cattle (milked three times a day), distributed over 165 herds of Tehran province and calved from 2001 to 2010, were used. The fixed effects of herd-year-month of calving as contemporary group, and of age at calving and Holstein gene percentage as covariates, were fitted. A 4th-order orthogonal Legendre polynomial was implemented to account for the genetic and environmental aspects of milk production over the course of lactation. RRM using Legendre polynomials as basis functions appears to be the most adequate way to describe the covariance structure of the data. The results showed that the average heritability for the second half of the lactation period was higher than that of the first half. The heritability value was lowest for the first month (0.117) and highest for the eighth month of lactation (0.230) compared with the other months of lactation. Because genetic variation increased gradually and residual variance was high in the first months of lactation, heritabilities differed over the course of lactation. RRMs with a higher number of parameters were more useful for describing the genetic variation of test-day milk yield throughout the lactation. In this research, genetic parameters and genetic correlations were estimated with a random regression test-day model, which accounts for these parameters more precisely than the alternatives.
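
    As a small illustration of the machinery (not the study's code): the random regression curves are linear combinations of Legendre polynomials evaluated on days in milk rescaled to [-1, 1], and NumPy can generate that basis directly.

      import numpy as np
      from numpy.polynomial import legendre

      dim = np.arange(5, 306)                               # assumed test-day range, days in milk
      t = 2 * (dim - dim.min()) / (dim.max() - dim.min()) - 1   # standardize to [-1, 1]
      Phi = legendre.legvander(t, 4)                        # columns P0..P4 of the 4th-order basis
      print(Phi.shape)                                      # (301, 5); a curve is Phi @ coefficients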

  8. [The effect of prison crowding on prisoners' violence in Japan: testing with cointegration regressions and error correction models].

    Science.gov (United States)

    Yuma, Yoshikazu

    2010-08-01

    This research examined the effect of prison population densities (PPD) on inmate-inmate prison violence rates (PVR) in Japan using one-year-interval time-series data (1972-2006). Cointegration regressions revealed a long-run equilibrium relationship between PPD and PVR. PPD had a significant and increasing effect on PVR in the long-term. Error correction models showed that in the short-term, the effect of PPD was significant and positive on PVR, even after controlling for the effects of the proportions of males, age younger than 30 years, less than one-year incarceration, and prisoner/staff ratio. The results were discussed in regard to (a) differences between Japanese prisons and prisons in the United States, and (b) methodological problems found in previous research.

  9. Test-state approach to the quantum search problem

    International Nuclear Information System (INIS)

    Sehrawat, Arun; Nguyen, Le Huy; Englert, Berthold-Georg

    2011-01-01

    The search for 'a quantum needle in a quantum haystack' is a metaphor for the problem of finding out which one of a permissible set of unitary mappings - the oracles - is implemented by a given black box. Grover's algorithm solves this problem with quadratic speedup as compared with the analogous search for 'a classical needle in a classical haystack'. Since the outcome of Grover's algorithm is probabilistic - it gives the correct answer with high probability, not with certainty - the answer requires verification. For this purpose we introduce specific test states, one for each oracle. These test states can also be used to realize 'a classical search for the quantum needle' which is deterministic - it always gives a definite answer after a finite number of steps - and 3.41 times as fast as the purely classical search. Since the test-state search and Grover's algorithm look for the same quantum needle, the average number of oracle queries of the test-state search is the classical benchmark for Grover's algorithm.

  10. School Attendance Problems and Youth Psychopathology: Structural Cross-Lagged Regression Models in Three Longitudinal Data Sets

    Science.gov (United States)

    Wood, Jeffrey J.; Lynne-Landsman, Sarah D.; Langer, David A.; Wood, Patricia A.; Clark, Shaunna L.; Eddy, J. Mark; Ialongo, Nick

    2012-01-01

    This study tests a model of reciprocal influences between absenteeism and youth psychopathology using 3 longitudinal datasets (Ns = 20,745, 2,311, and 671). Participants in 1st through 12th grades were interviewed annually or biannually. Measures of psychopathology include self-, parent-, and teacher-report questionnaires. Structural cross-lagged…

  11. Inferring genetic parameters of lactation in Tropical Milking Criollo cattle with random regression test-day models.

    Science.gov (United States)

    Santellano-Estrada, E; Becerril-Pérez, C M; de Alba, J; Chang, Y M; Gianola, D; Torres-Hernández, G; Ramírez-Valverde, R

    2008-11-01

    This study inferred genetic and permanent environmental variation of milk yield in Tropical Milking Criollo cattle and compared 5 random regression test-day models using Wilmink's function and Legendre polynomials. Data consisted of 15,377 test-day records from 467 Tropical Milking Criollo cows that calved between 1974 and 2006 in the tropical lowlands of the Gulf Coast of Mexico and in southern Nicaragua. Estimated heritabilities of test-day milk yields ranged from 0.18 to 0.45, and repeatabilities ranged from 0.35 to 0.68 for the period spanning from 6 to 400 d in milk. Genetic correlation between days in milk 10 and 400 was around 0.50 but greater than 0.90 for most pairs of test days. The model that used first-order Legendre polynomials for additive genetic effects and second-order Legendre polynomials for permanent environmental effects gave the smallest residual variance and was also favored by the Akaike information criterion and likelihood ratio tests.

  12. Online monitoring and conditional regression tree test: Useful tools for a better understanding of combined sewer network behavior.

    Science.gov (United States)

    Bersinger, T; Bareille, G; Pigot, T; Bru, N; Le Hécho, I

    2018-06-01

    A good knowledge of the dynamics of pollutant concentration and flux in a combined sewer network is necessary when considering solutions to limit the pollutants discharged by combined sewer overflow (CSO) into receiving water during wet weather. Identification of the parameters that influence pollutant concentration and flux is important. Nevertheless, few studies have obtained satisfactory results for the identification of these parameters using statistical tools. Thus, this work uses a large database of rain events (116 over one year) obtained via continuous measurement of rainfall, discharge flow and chemical oxygen demand (COD) estimated using online turbidity for the identification of these parameters. We carried out a statistical study of the parameters influencing the maximum COD concentration, the discharge flow and the discharge COD flux. This study used a test that had not previously been applied in this field: the conditional regression tree test. We demonstrated that the antecedent dry weather period, the average rain event intensity and the flow before the event are the three main factors influencing the maximum COD concentration during a rainfall event. The discharge flow is mainly influenced by the overall rainfall height but not by the maximum rainfall intensity. Finally, the COD discharge flux is influenced by the discharge volume and the maximum COD concentration. Regression trees seem much more appropriate than common tests like PCA and PLS for this type of study, as they take into account thresholds and cumulative effects of various parameters as a function of the target variable. These results could help to improve sewer and CSO management in order to decrease the discharge of pollutants into receiving waters.
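
    A regression tree makes such thresholds explicit in its splits. The stand-in below uses a plain CART tree from scikit-learn (not the conditional inference trees applied in the paper) on invented event descriptors, purely to show the shape of the output:

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor, export_text

      rng = np.random.default_rng(3)
      X = np.column_stack([
          rng.uniform(0, 20, 300),     # antecedent dry weather period (days)
          rng.uniform(0, 15, 300),     # average rain event intensity (mm/h)
          rng.uniform(5, 80, 300),     # flow before the event (L/s)
      ])
      cod_max = 200 + 15 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 40, 300)  # synthetic target

      tree = DecisionTreeRegressor(max_depth=3).fit(X, cod_max)
      print(export_text(tree, feature_names=["dry_days", "intensity", "pre_flow"]))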

  13. A multiple objective test assembly approach for exposure control problems in Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Theo J.H.M. Eggen

    2010-01-01

    Overexposure and underexposure of items in the bank are serious problems in operational computerized adaptive testing (CAT) systems. These exposure problems may result in item compromise or point to a waste of investment. The exposure control problem can be viewed as a test assembly problem with multiple objectives: information in the test has to be maximized, item compromise has to be minimized, and pool usage has to be optimized. In this paper, a multiple-objectives method is developed to deal with both types of exposure problems. In this method, exposure control parameters based on observed exposure rates are implemented as weights for the information in the item selection procedure. The method does not need time-consuming simulation studies, and it can be implemented conditional on ability level. The method is compared with the Sympson-Hetter method for exposure control, with the Progressive method, and with alpha-stratified testing. The results show that the method is successful in dealing with both kinds of exposure problems.
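
    One plausible reading of that weighting scheme, sketched with invented numbers (the paper's exact exposure-control parameters are not reproduced here): items whose observed exposure rate exceeds a target rate have their information down-weighted before the usual maximum-information selection.

      import numpy as np

      info = np.array([0.9, 1.2, 1.1, 0.7])          # Fisher information at the current ability
      exposure = np.array([0.45, 0.10, 0.30, 0.05])  # observed exposure rates per item
      target = 0.25                                  # assumed maximum acceptable exposure rate

      weights = np.clip(target / np.maximum(exposure, 1e-9), 0.0, 1.0)  # penalize overexposed items
      selected = int(np.argmax(weights * info))      # weighted maximum-information selection
      print(selected)                                # item 1: informative and rarely exposed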

  14. Regression-based approach for testing the association between multi-region haplotype configuration and complex trait

    Directory of Open Access Journals (Sweden)

    Zhao Hongbo

    2009-09-01

    Background: It is quite common that the genetic architecture of complex traits involves many genes and their interactions. Therefore, dealing with multiple unlinked genomic regions simultaneously is desirable. Results: In this paper we develop a regression-based approach to assess the interactions of haplotypes that belong to different unlinked regions, and we use score statistics to test the null hypothesis of no genetic association. Additionally, multiple marker combinations at each unlinked region are considered. The multiple tests are settled via the minP approach. The P value of the "best" multi-region multi-marker configuration is corrected via Monte-Carlo simulations. Through simulation studies, we assess the performance of the proposed approach and demonstrate its validity and power in testing for haplotype interaction association. Conclusion: Our simulations showed that, for a binary trait without covariates, our proposed methods prove to be equally and even more powerful than htr and hapcc, which are part of the FAMHAP program. Additionally, our model can be applied to a wider variety of traits and allows adjustment for other covariates. To test the validity, our methods are applied to analyze the association between four unlinked candidate genes and pig meat quality.

  15. Testing and Modeling of Contact Problems in Resistance Welding

    DEFF Research Database (Denmark)

    Song, Quanfeng

    As a part of the efforts towards a professional and reliable numerical tool for resistance welding engineers, this Ph.D. project is dedicated to refining the numerical models related to the interface behavior. An FE algorithm for the contact problems in resistance welding has been developed … for the formulation, and the interfaces are treated in a symmetric pattern. The frictional sliding contact is also solved employing the constant friction model. The algorithm is incorporated into the finite element code. Verification is carried out in some numerical tests as well as experiments such as upsetting … together two or three cylindrical parts as well as disc-ring pairs of dissimilar metals. The tests have demonstrated the effectiveness of the model. A theoretical and experimental study is performed on the contact resistance aiming at a more reliable model for numerical simulation of resistance welding…

  17. Association of Stressful Life Events with Psychological Problems: A Large-Scale Community-Based Study Using Grouped Outcomes Latent Factor Regression with Latent Predictors

    Science.gov (United States)

    Hassanzadeh, Akbar; Heidari, Zahra; Hassanzadeh Keshteli, Ammar; Afshar, Hamid

    2017-01-01

    Objective The current study is aimed at investigating the association between stressful life events and psychological problems in a large sample of Iranian adults. Method In a cross-sectional large-scale community-based study, 4763 Iranian adults, living in Isfahan, Iran, were investigated. Grouped outcomes latent factor regression on latent predictors was used for modeling the association of psychological problems (depression, anxiety, and psychological distress), measured by Hospital Anxiety and Depression Scale (HADS) and General Health Questionnaire (GHQ-12), as the grouped outcomes, and stressful life events, measured by a self-administered stressful life events (SLEs) questionnaire, as the latent predictors. Results The results showed that the personal stressors domain has significant positive association with psychological distress (β = 0.19), anxiety (β = 0.25), depression (β = 0.15), and their collective profile score (β = 0.20), with greater associations in females (β = 0.28) than in males (β = 0.13) (all P < 0.001). In addition, in the adjusted models, the regression coefficients for the association of social stressors domain and psychological problems profile score were 0.37, 0.35, and 0.46 in total sample, males, and females, respectively (P < 0.001). Conclusion Results of our study indicated that different stressors, particularly those socioeconomic related, have an effective impact on psychological problems. It is important to consider the social and cultural background of a population for managing the stressors as an effective approach for preventing and reducing the destructive burden of psychological problems. PMID:29312459

  18. A comparison of discriminant logistic regression and Item Response Theory Likelihood-Ratio Tests for Differential Item Functioning (IRTLRDIF) in polytomous short tests.

    Science.gov (United States)

    Hidalgo, María D; López-Martínez, María D; Gómez-Benito, Juana; Guilera, Georgina

    2016-01-01

    Short scales are typically used in the social, behavioural and health sciences. This is relevant since test length can influence whether items showing DIF are correctly flagged. This paper compares the relative effectiveness of discriminant logistic regression (DLR) and IRTLRDIF for detecting DIF in polytomous short tests. A simulation study was designed in which test length, sample size, amount of DIF and number of item response categories were manipulated, and Type I error and power were evaluated. IRTLRDIF and DLR yielded Type I error rates close to the nominal level in no-DIF conditions. Under DIF conditions, Type I error rates were affected by test length, amount of DIF, degree of test contamination, sample size and number of item response categories. DLR showed a higher Type I error rate than did IRTLRDIF. Power rates were affected by amount of DIF and sample size, but not by test length. DLR achieved higher power rates than did IRTLRDIF in very short tests, although the high Type I error rate involved means that this result cannot be taken into account. Test length had an important impact on the Type I error rate. IRTLRDIF and DLR showed low power rates in short tests and with small sample sizes.

  19. Leak testing of cryogenic components — problems and solutions

    Science.gov (United States)

    Srivastava, S. P.; Pandarkar, S. P.; Unni, T. G.; Sinha, A. K.; Mahajan, K.; Suthar, R. L.

    2008-05-01

    …moderator pot was driving the MSLD out of range. Since it was very difficult to locate the leak by the tracer-probe method, another technique was sought to solve the problem of leak location. Finally, it was possible to locate the leak by observing the change in the helium background reading of the MSLD during masking/unmasking of the welded joints. This paper describes, in general, the design and leak-testing aspects of the cryogenic components of the Cold Neutron Source and, in particular, the problems and solutions in leak testing of the transfer lines and moderator pot.

  20. Effects of dependence in high-dimensional multiple testing problems

    Directory of Open Access Journals (Sweden)

    van de Wiel Mark A

    2008-02-01

    Background: We consider effects of dependence among variables of high-dimensional data in multiple hypothesis testing problems, in particular on False Discovery Rate (FDR) control procedures. Recent simulation studies consider only simple correlation structures among variables, which is hardly inspired by real data features. Our aim is to systematically study the effects of several network features, like sparsity and correlation strength, by imposing dependence structures among variables using random correlation matrices. Results: We study the robustness against dependence of several FDR procedures that are popular in microarray studies, such as Benjamini-Hochberg FDR, Storey's q-value, SAM and resampling-based FDR procedures. False non-discovery rates and estimates of the number of null hypotheses are computed from those methods and compared. Our simulation study shows that methods such as SAM and the q-value do not adequately control the FDR to the level claimed under dependence conditions. On the other hand, the adaptive Benjamini-Hochberg procedure seems to be most robust while remaining conservative. Finally, the estimates of the number of true null hypotheses under various dependence conditions are variable. Conclusion: We discuss a new method for efficient guided simulation of dependent data, which satisfies imposed network constraints as conditional independence structures. Our simulation set-up allows for a structural study of the effect of dependencies on multiple testing criteria and is useful for testing a potentially new method of π0 or FDR estimation in a dependency context.
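
    For reference, the (non-adaptive) Benjamini-Hochberg step-up procedure whose robustness is examined above fits in a few lines:

      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          """Return a boolean mask of rejected null hypotheses (BH step-up)."""
          p = np.asarray(pvals, dtype=float)
          order = np.argsort(p)
          m = p.size
          passed = p[order] <= alpha * np.arange(1, m + 1) / m
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          reject = np.zeros(m, dtype=bool)
          reject[order[:k]] = True                  # reject the k smallest p-values
          return reject

      print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6]).astype(int))  # [1 1 0 0 0]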

  1. Validity of the reduced-sample insulin modified frequently-sampled intravenous glucose tolerance test using the nonlinear regression approach.

    Science.gov (United States)

    Sumner, Anne E; Luercio, Marcella F; Frempong, Barbara A; Ricks, Madia; Sen, Sabyasachi; Kushner, Harvey; Tulloch-Reid, Marshall K

    2009-02-01

    The disposition index, the product of the insulin sensitivity index (S(I)) and the acute insulin response to glucose, is linked in African Americans to chromosome 11q. This link was determined with S(I) calculated with the nonlinear regression approach to the minimal model and data from the reduced-sample insulin-modified frequently-sampled intravenous glucose tolerance test (Reduced-Sample-IM-FSIGT). However, the application of the nonlinear regression approach to calculate S(I) using data from the Reduced-Sample-IM-FSIGT has been challenged as being not only inaccurate but also having a high failure rate in insulin-resistant subjects. Our goal was to determine the accuracy and failure rate of the Reduced-Sample-IM-FSIGT using the nonlinear regression approach to the minimal model. With S(I) from the Full-Sample-IM-FSIGT considered the standard and using the nonlinear regression approach to the minimal model, we compared the agreement between S(I) from the Full- and Reduced-Sample-IM-FSIGT protocols. One hundred African Americans (body mass index, 31.3 ± 7.6 kg/m² [mean ± SD]; range, 19.0-56.9 kg/m²) had FSIGTs. Glucose (0.3 g/kg) was given at baseline. Insulin was infused from 20 to 25 minutes (total insulin dose, 0.02 U/kg). For the Full-Sample-IM-FSIGT, S(I) was calculated based on the glucose and insulin samples taken at -1, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 14, 16, 19, 22, 23, 24, 25, 27, 30, 40, 50, 60, 70, 80, 90, 100, 120, 150, and 180 minutes. For the Reduced-Sample-FSIGT, S(I) was calculated based on a subset of these time points (indicated in bold in the original). Agreement was determined by Spearman correlation, concordance, and the Bland-Altman method. In addition, for both protocols, the population was divided into tertiles of S(I). Insulin resistance was defined by the lowest tertile of S(I) from the Full-Sample-IM-FSIGT. The distribution of subjects across tertiles was compared by rank order and kappa statistic. We found that the rate of failure of resolution of S(I) by…

  2. Geologic investigations of drill hole sloughing problems, Nevada Test Site

    International Nuclear Information System (INIS)

    Drellack, S.L. Jr.; Davies, W.J.; Gonzales, J.L.; Hawkins, W.L.

    1983-01-01

    Severe sloughing zones encountered while drilling large diameter emplacement holes in Yucca Flat, Nevada Test Site, have been identified, correlated and predicted through detailed geologic investigations. In central and southeastern Area 7 and in northern Area 3, the unstable zones are a very fine-grained, well-sorted, unconsolidated sand deposit, probably eolian in origin, which will readily flow into large diameter drill holes. Other areas exhibit hole erosion related to poor induration or extensive zeolitization of the Tertiary tuff units which are very friable and porous. By examining drill hole samples, geophysical logs, caliper logs and drilling histories, these problem zones can be characterized, correlated and then projected into nearby sites. Maps have been generated to show the depth, thickness and areal extent of these strata. In some cases, they are local and have a lenticular geometry, while in others they are quite extensive. The ability to predict such features can enhance the quality of the hole construction and completion operations to avoid costly delays and the loss of valuable testing real estate. The control of hole enlargements will also eliminate related containment concerns, such as stemming uncertainties

  3. Low temperature storage test phase 2: identification of problem species

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-12-15

    The use of renewable fuels such as biodiesel, in motor vehicle fuels is expected to grow rapidly in North America as a result of governmental mandates. Biodiesel is a fuel component made from plant and animal feedstocks via a transesterification process. The fatty acid methyl esters (FAME) of biodiesel have cloud points that range from 5 degrees C to -15 degrees C. The poor low temperature performance of blends containing FAME must be understood in order to avoid operability issues. This paper presented the results of several testing programs conducted by researchers to investigate filter plugging in biodiesel fuels caused by high levels of saturated monoglycerides. The low temperature storage stability of 57 biodiesel fuels comprised of B5 and B20 made with canola methyl ester (CME), soybean methyl ester (SME), tallow methyl ester (TME) and palm methyl ester (PME) was investigated. Filter blocking tests were conducted to assess storage stability. Deposits from the blends were analyzed using gas chromatography and mass spectrometry (GC-MS) in order to identify the problem species. Results of the study confirmed the deleterious impact of saturated mono-glycerides in FAME on the low temperature operability of filters in fuel handling systems. 11 refs., 7 tabs., 5 figs. 9 appendices.

  4. Why some children with externalising problems develop internalising symptoms: testing two pathways in a genetically sensitive cohort study.

    Science.gov (United States)

    Wertz, Jasmin; Zavos, Helena; Matthews, Timothy; Harvey, Kirsten; Hunt, Alice; Pariante, Carmine M; Arseneault, Louise

    2015-07-01

    Children with externalising problems are at risk of developing internalising problems as they grow older. The pathways underlying this developmental association remain to be elucidated. We tested two processes that could explain why some children with externalising problems develop internalising symptoms in preadolescence: a mediation model, whereby the association between early externalising and later new internalising symptoms is explained by negative experiences; and a genetic model, whereby genes influence both problems. We used data from the Environmental Risk (E-Risk) Study, a 1994-1995 birth cohort of 2,232 twins born in England and Wales. We assessed externalising and internalising problems using combined mothers' and teachers' ratings at ages 5 and 12. We measured bullying victimisation, maternal dissatisfaction and academic difficulties between ages 7 and 10 and used linear regression analyses to test the effects of these negative experiences on the association between early externalising and later internalising problems. We employed a Cholesky decomposition to examine the genetic influences on the association. Children with externalising problems at age 5 showed increased rates of new internalising problems at age 12 (r = .24, p …) … why some children with externalising problems develop internalising symptoms in preadolescence. Negative experiences also contribute to the association, possibly through gene-environment interplay. Mental health professionals should monitor the development of internalising symptoms in young children with externalising problems.

  5. 49 CFR 40.205 - How are drug test problems corrected?

    Science.gov (United States)

    2010-10-01

    … 49 Transportation 1 2010-10-01 How are drug test problems corrected? 40.205 Section 40.205 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests § 40.205 How are drug test problems…

  6. Finite Algorithms for Robust Linear Regression

    DEFF Research Database (Denmark)

    Madsen, Kaj; Nielsen, Hans Bruun

    1990-01-01

    The Huber M-estimator for robust linear regression is analyzed. Newton type methods for solution of the problem are defined and analyzed, and finite convergence is proved. Numerical experiments with a large number of test problems demonstrate efficiency and indicate that this kind of approach may...
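
    The finite Newton-type algorithms analyzed in the paper are not reproduced here, but the estimator itself is easy to demonstrate with scikit-learn's iterative Huber solver, which shrugs off gross outliers that distort ordinary least squares:

      import numpy as np
      from sklearn.linear_model import HuberRegressor, LinearRegression

      rng = np.random.default_rng(4)
      x = rng.uniform(0, 10, 200).reshape(-1, 1)
      y = 1.0 + 3.0 * x.ravel() + rng.normal(0, 0.5, 200)
      y[:10] += 30.0                                 # contaminate with gross outliers

      print(HuberRegressor().fit(x, y).coef_)        # robust slope, close to the true value 3
      print(LinearRegression().fit(x, y).coef_)      # least-squares slope, dragged by the outliers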

  7. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.

  8. Current problems regarding abortion, prenatal genetic testing and managing pregnancy

    Directory of Open Access Journals (Sweden)

    Klajn-Tatić Vesna

    2011-01-01

    Current ethical and legal issues with regard to abortion, prenatal genetic testing and managing pregnancy are discussed in this paper. These problems are considered from the legal theory point of view as well as from the standpoint of Serbian law, the European Convention for the Protection of Human Rights and Fundamental Freedoms, the European Court of Human Rights, the legal regulations of several EU countries, the USA and Japan, and their judicial practice. First, the pregnancy termination standards that exist in Serbia are introduced. Then the following issues are explained separately: the pro-life and pro-choice approaches to abortion; abortion according to the legal approach as a way of survival; the moral and legal status of the fetus; prenatal genetic testing; and finally matters regarding managing pregnancy today. The moral and legal principles of autonomy, namely freedom of choice of the individual, privacy and self-determination, give women the right to terminate unwanted pregnancies. In addition, the basic question is whether the right of the woman to abortion clashes with the rights of others: first, with the right of the "fetus to life"; second, with the right of the state to intervene in the interest of protecting "the life of the fetus"; third, with the rights of the woman's partner. The fetus has the moral right to life, but less in relation to the same right of the woman as well as in relation to her right to control her life and her physical and moral integrity. On the other hand, the value of the life of the fetus increases morally and legally with the maturity of gestation; from the third trimester, the interest of the state prevails in the protection of the "life of the fetus", except when the life or health of the pregnant woman is at risk. As regards the rights of the woman's partner, namely the husband's opinion, there is no legal significance. The law does not request his participation in the decision on abortion because…

  9. Validity of the Clock Drawing Test in predicting reports of driving problems in the elderly

    Directory of Open Access Journals (Sweden)

    Banou Evangelia

    2004-10-01

    Background: This study examined the use of the Folstein Mini Mental Status Exam (MMSE) and the Clock Drawing Test (CDT) in predicting retrospective reports of driving problems among the elderly. The utility of existing scoring systems for the CDT was also examined. Methods: Archival chart records of 325 patients of a geriatric outpatient clinic were reviewed, of which 162 had CDT results (including original clock drawings). T-test, correlation, and regression procedures were used to analyze the data. Results: Both CDT and MMSE scores were significantly worse among non-drivers than among individuals who were currently or recently driving. Among current or recent drivers, scores on both instruments correlated significantly with the total number of reported accidents or near misses, although the magnitude of the respective correlations was small. Only MMSE scores, however, significantly predicted whether or not any accidents or near misses were reported at all. Neither MMSE nor CDT scores predicted unique variance in the regressions. Conclusions: The overall results suggest that both the MMSE and CDT have limited utility as potential indicators of driving problems in the elderly. The demonstrated predictive power of these instruments appears to be redundant, such that both appear to assess general cognitive function rather than more specific abilities. Furthermore, the lack of robust prediction suggests that neither is sufficient to serve as a stand-alone instrument on which to solely base decisions about driving capacity. Rather, individuals who show impairment should be given a more thorough and comprehensive assessment than can be obtained through screening tools.

  10. Problems of overcoming medical consequences of nuclear tests at the former Semipalatinsk test site (STS)

    International Nuclear Information System (INIS)

    Devyatko, V.N.

    1997-01-01

    Tests conducted over many years resulted in extensive radioactive contamination of the Semipalatinsk, East Kazakhstan, Pavlodar and Karaganda regions. About 1.5 million people underwent repeated acute and chronic exposure, mostly to small doses of ionizing radiation. In this connection, the Ministry of Health Protection and social protection organizations are concerned with the problem of recovery and rehabilitation of the population of the above regions. To solve these problems, the Ministry of Health Protection of the Republic of Kazakhstan established Scientific Research Institutes of Medicine and Ecology in Semipalatinsk and a regional Medical and Diagnostic Center in Kurchatov. With the help of regional administrations, medical centers were created: diagnostic, children's, rehabilitation, ophthalmological, and for the protection of motherhood and childhood. Work on creating a State National Medical Register of people exposed to ionizing radiation is in progress.

  11. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code in the Xlisp-Stat language; called R-code, it is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available.

  12. A regression-based method for mapping traffic-related air pollution. Application and testing in four contrasting urban environments

    International Nuclear Information System (INIS)

    Briggs, D.J.; De Hoogh, C.; Elliot, P.; Gulliver, J.; Wills, J.; Kingham, S.; Smallbone, K.

    2000-01-01

    Accurate, high-resolution maps of traffic-related air pollution are needed both as a basis for assessing exposures as part of epidemiological studies, and to inform urban air-quality policy and traffic management. This paper assesses the use of a GIS-based regression mapping technique to model spatial patterns of traffic-related air pollution. The model, developed using data from 80 passive sampler sites in Huddersfield as part of the SAVIAH (Small Area Variations in Air Quality and Health) project, uses data on traffic flows and land cover in the 300-m buffer zone around each site, and the altitude of the site, as predictors of NO2 concentrations. It was tested here by application in four urban areas in the UK: Huddersfield (for the year following that used for initial model development), Sheffield, Northampton, and part of London. In each case, a GIS was built in ArcInfo, integrating relevant data on road traffic, urban land use and topography. Monitoring of NO2 was undertaken using replicate passive samplers (in London, data were obtained from surveys carried out as part of the London network). In Huddersfield, Sheffield and Northampton, the model was first calibrated by comparing modelled results with monitored NO2 concentrations at 10 randomly selected sites; the calibrated model was then validated against data from a further 10-28 sites. In London, where data for only 11 sites were available, validation was not undertaken. Results showed that the model performed well in all cases. After local calibration, the model gave estimates of mean annual NO2 concentrations within a factor of 1.5 of the actual mean approximately 70-90% of the time, and within a factor of 2 between 70 and 100% of the time. r² values between modelled and observed concentrations are in the range 0.58-0.76. These results are comparable to those achieved by more sophisticated dispersion models. The model also has several advantages over dispersion modelling. It is able, for example, to…
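
    A schematic of the regression-mapping approach on synthetic site data (the SAVIAH predictors and coefficients are not reproduced; the variable choices below are assumptions): NO2 is regressed on buffer-zone traffic, land cover and altitude, and held-out sites are checked against the factor-of-2 criterion reported above.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)
      X = np.column_stack([
          rng.uniform(0, 5e4, 80),      # traffic flow within the 300-m buffer
          rng.uniform(0, 1, 80),        # built-up land-cover fraction
          rng.uniform(50, 300, 80),     # site altitude (m)
      ])
      no2 = 10 + 4e-4 * X[:, 0] + 15 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 3, 80)

      model = LinearRegression().fit(X[:60], no2[:60])       # "calibration" sites
      ratio = model.predict(X[60:]) / no2[60:]               # held-out "validation" sites
      print(f"within a factor of 2: {np.mean((ratio > 0.5) & (ratio < 2)):.0%}")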

  13. COVAR: Computer Program for Multifactor Relative Risks and Tests of Hypotheses Using a Variance-Covariance Matrix from Linear and Log-Linear Regression

    Directory of Open Access Journals (Sweden)

    Leif E. Peterson

    1997-11-01

    A computer program for multifactor relative risks, confidence limits, and tests of hypotheses using regression coefficients and a variance-covariance matrix obtained from a previous additive or multiplicative regression analysis is described in detail. Data used by the program can be stored and input from an external disk-file or entered via the keyboard. The output contains a list of the input data, point estimates of single or joint effects, confidence intervals and tests of hypotheses based on a minimum modified chi-square statistic. Availability of the program is also discussed.

  14. 49 CFR 40.271 - How are alcohol testing problems corrected?

    Science.gov (United States)

    2010-10-01

    … 49 Transportation 1 2010-10-01 How are alcohol testing problems corrected? 40.271 Section 40.271 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Alcohol Testing § 40.271 How are alcohol testing…

  15. Modal data for the BARC challenge problem Test Report

    Energy Technology Data Exchange (ETDEWEB)

    Rohe, Daniel Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-01-22

    Modal testing was performed on the uncut BARC structure as a whole and broken into its two sub-assemblies. The structure was placed on soft foam during the test. Excitation was provided with a small modal hammer attached to an actuator. Responses were measured using a 3D Scanning Laser Doppler Vibrometer. Data, shapes, and geometry from this test can be downloaded in Universal File Format from the Sandia Connect SharePoint site.

  16. Pharmacogenetic testing, informed consent and the problem of secondary information.

    Science.gov (United States)

    Netzer, Christian; Biller-Andorno, Nikola

    2004-08-01

    Numerous benefits for patients have been predicted if prescribing decisions were routinely accompanied by pharmacogenetic testing. So far, little attention has been paid to the possibility that the routine application of this new technology could result in considerable harm to patients. This article emphasises that pharmacogenetic testing shares both the opportunities and the pitfalls with 'conventional' disease-genetic testing. It demonstrates that performing pharmacogenetic tests as well as interpreting the results are extraordinarily complex issues requiring a high level of expertise. It further argues that pharmacogenetic testing can have a huge impact on clinical decisions and may influence the therapeutic strategy as well as the clinical monitoring of a patient. This view challenges the predominant paradigm that pharmacogenetic testing will predict patients' responses to medicines, but that it will not provide any other significant disease-specific predictive information about the patient or family members. The article also questions published proposals to reduce the consent procedure for pharmacogenetic testing to a simple statement that the physician wishes to test a sample of the patient's DNA to see if a drug will be safe or whether it will work, and presents an alternative model that is better suited to protect patient's interests and to obtain meaningful informed consent. The paper concludes by outlining conditions for the application of pharmacogenetic testing in clinical practice in a way that can make full use of its potential benefits while minimising possible harm to patients and their families.

  17. Progression and regression of cervical pap test lesions in an urban AIDS clinic in the combined antiretroviral therapy era: a longitudinal, retrospective study.

    Science.gov (United States)

    Lofgren, Sarah M; Tadros, Talaat; Herring-Bailey, Gina; Birdsong, George; Mosunjac, Marina; Flowers, Lisa; Nguyen, Minh Ly

    2015-05-01

    Our objective was to evaluate the progression and regression of cervical dysplasia in human immunodeficiency virus (HIV)-positive women during the late antiretroviral era. Risk factors as well as outcomes after treatment of cancerous or precancerous lesions were examined. This is a longitudinal retrospective review of cervical Pap tests performed on HIV-infected women with an intact cervix between 2004 and 2011. Subjects needed two or more Pap tests over at least 2 years of follow-up. Progression was defined as developing a squamous intraepithelial lesion (SIL) or atypical glandular cells (AGC), having low-grade SIL (LSIL) followed by atypical squamous cells-cannot exclude high-grade SIL (ASC-H) or high-grade SIL (HSIL), or cancer. Regression was defined as an initial SIL with two or more subsequent normal Pap tests. Persistence was defined as having an SIL without progression or regression. High-risk human papillomavirus (HPV) testing started in 2006 on atypical squamous cells of undetermined significance (ASCUS) Pap tests. Women with AGC at enrollment were excluded from the progression analysis. Of 1,445 women screened, 383 had two or more Pap tests over a 2-year period. Of those, 309 had an intact cervix. The median age was 40 years and the median CD4+ cell count was 277 cells/mL. Four had AGC at enrollment. A quarter had persistently normal Pap tests, 64 (31%) regressed, and 50 (24%) progressed. Four developed cancer. The only risk factor associated with progression was CD4 count. Among those with treated lesions, 24 (59%) had negative Pap tests at the end of follow-up. More studies are needed to evaluate follow-up strategies for LSIL patients, potentially combined with HPV testing. Guidelines for HIV-seropositive women who are in care, have improved CD4 counts, and have persistently negative Pap tests could likely lengthen the follow-up interval.

  18. Light phase testing of social behaviors: not a problem

    Directory of Open Access Journals (Sweden)

    Mu Yang

    2008-12-01

    The rich repertoire of mouse social behaviors makes it possible to use mouse models to study neurodevelopmental disorders characterized by social deficits. The fact that mice are naturally nocturnal animals raises a critical question of whether behavioral experiments should be strictly conducted in the dark phase and whether light-phase testing is a major methodological mistake. Although mouse social tasks have been performed in both phases in different laboratories, there seems to be no general consensus on whether testing phase is a critical factor. A recent study from our group showed remarkably similar social scores obtained from inbred mice tested in the light and the dark phase, providing evidence that light-phase testing can yield results as reliable and robust as dark-phase testing for the sociability test. Here we offer a comprehensive review of mouse social behaviors measured in light and dark phases and explain why it is reasonable to test laboratory mice in experimental social tasks in the light phase.

  19. Regression-Based Norms for a Bi-factor Model for Scoring the Brief Test of Adult Cognition by Telephone (BTACT).

    Science.gov (United States)

    Gurnani, Ashita S; John, Samantha E; Gavett, Brandon E

    2015-05-01

    The current study developed regression-based normative adjustments for a bi-factor model of the Brief Test of Adult Cognition by Telephone (BTACT). Archival data from the Midlife Development in the United States-II Cognitive Project were used to develop eight separate linear regression models that predicted bi-factor BTACT scores, accounting for age, education, gender, and occupation, alone and in various combinations. All regression models provided statistically significant fit to the data. A three-predictor regression model fit best and accounted for 32.8% of the variance in the global bi-factor BTACT score. The fit of the regression models was not improved by gender. Eight different regression models are presented to allow the user flexibility in applying demographic corrections to the bi-factor BTACT scores. Occupation corrections, while not widely used, may provide useful demographic adjustments for adult populations or for those individuals who have attained an occupational status not commensurate with their expected educational attainment.
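
    Applying such a norm is mechanical once a regression is in hand: the observed score is compared with its demographically predicted value and scaled by the residual standard deviation. The coefficients below are invented for illustration, not the published MIDUS-II weights.

      def adjusted_z(observed, age, education,
                     b0=55.0, b_age=-0.25, b_edu=1.5, resid_sd=8.0):
          """Demographically adjusted Z-score from a (hypothetical) normative regression."""
          predicted = b0 + b_age * age + b_edu * education
          return (observed - predicted) / resid_sd

      print(round(adjusted_z(observed=42, age=68, education=12), 2))  # -1.75, about the 4th percentile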

  20. Regression-Based Norms for the Symbol Digit Modalities Test in the Dutch Population: Improving Detection of Cognitive Impairment in Multiple Sclerosis?

    Science.gov (United States)

    Burggraaff, Jessica; Knol, Dirk L; Uitdehaag, Bernard M J

    2017-01-01

    Appropriate and timely screening instruments that sensitively capture the cognitive functioning of multiple sclerosis (MS) patients are the need of the hour. We evaluated newly derived regression-based norms for the Symbol Digit Modalities Test (SDMT) in a Dutch-speaking sample, as an indicator of the cognitive state of MS patients. Regression-based norms for the SDMT were created from a healthy control sample (n = 96) and used to convert MS patients' (n = 157) raw scores to demographically adjusted Z-scores, correcting for the effects of age, age², gender, and education. Conventional and regression-based norms were compared on their impairment-classification rates and related to other neuropsychological measures. The regression analyses revealed that age was the only significant demographic influence in our healthy sample. Regression-based norms for the SDMT more readily detected impairment in MS patients than conventional normalization methods (32 patients instead of 15). Patients changing from an SDMT-preserved to an SDMT-impaired status (n = 17) were also impaired on other cognitive domains (p < 0.05), except for visuospatial memory (p = 0.34). Regression-based norms for the SDMT more readily detect abnormal performance in MS patients than conventional norms, identifying those patients at highest risk for cognitive impairment, which was supported by worse performance on other neuropsychological measures.

  1. The use of regression analysis in determining reference intervals for low hematocrit and thrombocyte count in multiple electrode aggregometry and platelet function analyzer 100 testing of platelet function.

    Science.gov (United States)

    Kuiper, Gerhardus J A J M; Houben, Rik; Wetzels, Rick J H; Verhezen, Paul W M; Oerle, Rene van; Ten Cate, Hugo; Henskens, Yvonne M C; Lancé, Marcus D

    2017-11-01

    Low platelet counts and hematocrit levels hinder whole blood point-of-care testing of platelet function. Thus far, no reference ranges for MEA (multiple electrode aggregometry) and PFA-100 (platelet function analyzer 100) devices exist for low ranges. Through dilution methods of volunteer whole blood, platelet function at low ranges of platelet count and hematocrit levels was assessed on MEA for four agonists and for PFA-100 in two cartridges. Using (multiple) regression analysis, 95% reference intervals were computed for these low ranges. Low platelet counts affected MEA in a positive correlation (all agonists showed r² ≥ 0.75) and PFA-100 in an inverse correlation (closure times were prolonged with lower platelet counts). Lowered hematocrit did not affect MEA testing, except for arachidonic acid activation (ASPI), which showed a weak positive correlation (r² = 0.14). Closure time on PFA-100 testing was inversely correlated with hematocrit for both cartridges. Regression analysis revealed different 95% reference intervals in comparison with originally established intervals for both MEA and PFA-100 in low platelet or hematocrit conditions. Multiple regression analysis of ASPI and both tests on the PFA-100 for combined low platelet and hematocrit conditions revealed that only PFA-100 testing should be adjusted for both thrombocytopenia and anemia. 95% reference intervals were calculated using multiple regression analysis. However, coefficients of determination of PFA-100 were poor, and some variance remained unexplained. Thus, in this pilot study using (multiple) regression analysis, we could establish reference intervals of platelet function in anemia and thrombocytopenia conditions on PFA-100 and in thrombocytopenia conditions on MEA.

  2. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms has grown in recent years for solving real engineering problems that require intense computational calculation, especially when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts into evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  3. Testing After Worked Example Study Does Not Enhance Delayed Problem-Solving Performance Compared to Restudy

    NARCIS (Netherlands)

    T.A.J.M. van Gog (Tamara); L. Kester (Liesbeth); K. Dirkx (Kim); V. Hoogerheide (Vincent); J. Boerboom (Joris); P.P.J.L. Verkoeijen (Peter)

    2015-01-01

    textabstractFour experiments investigated whether the testing effect also applies to the acquisition of problem-solving skills from worked examples. Experiment 1 (n = 120) showed no beneficial effects of testing consisting of isomorphic problem solving or example recall on final test performance,

  4. Testing After Worked Example Study Does Not Enhance Delayed Problem-Solving Performance Compared to Restudy

    NARCIS (Netherlands)

    Van Gog, Tamara; Kester, Liesbeth; Dirkx, Kim; Hoogerheide, Vincent; Boerboom, Joris; Verkoeijen, Peter P. J. L.

    2016-01-01

    Four experiments investigated whether the testing effect also applies to the acquisition of problem-solving skills from worked examples. Experiment 1 (n=120) showed no beneficial effects of testing consisting of isomorphic problem solving or example recall on final test performance, which

  5. Testing After Worked Example Study Does Not Enhance Delayed Problem-Solving Performance Compared to Restudy

    NARCIS (Netherlands)

    van Gog, Tamara; Kester, Liesbeth; Dirkx, Kim; Hoogerheide, Vincent; Boerboom, Joris; Verkoeijen, Peter P J L

    Four experiments investigated whether the testing effect also applies to the acquisition of problem-solving skills from worked examples. Experiment 1 (n = 120) showed no beneficial effects of testing consisting of isomorphic problem solving or example recall on final test performance, which

  6. Testing after Worked Example Study Does Not Enhance Delayed Problem-Solving Performance Compared to Restudy

    Science.gov (United States)

    van Gog, Tamara; Kester, Liesbeth; Dirkx, Kim; Hoogerheide, Vincent; Boerboom, Joris; Verkoeijen, Peter P. J. L.

    2015-01-01

    Four experiments investigated whether the testing effect also applies to the acquisition of problem-solving skills from worked examples. Experiment 1 (n = 120) showed no beneficial effects of testing consisting of "isomorphic" problem solving or "example recall" on final test performance, which consisted of isomorphic problem…

  7. Problem-Solving Test: The Mechanism of Protein Synthesis

    Science.gov (United States)

    Szeberenyi, Jozsef

    2009-01-01

    Terms to be familiar with before you start to solve the test: protein synthesis, ribosomes, amino acids, peptides, peptide bond, polypeptide chain, N- and C-terminus, hemoglobin, α- and β-globin chains, radioactive labeling, [³H]- and [¹⁴C]leucine, cytosol, differential centrifugation, density…

  8. A revised simplex method for test construction problems

    NARCIS (Netherlands)

    Adema, Jos J.; Adema, J.J.

    1990-01-01

    Linear programming models with 0-1 variables are useful for the construction of tests from an item bank. Most solution strategies for these models start with solving the relaxed 0-1 linear programming model, allowing the 0-1 variables to take on values between 0 and 1. Then, a 0-1 solution is found

  9. Classroom Tests and Achievement in Problem Solving in Physical Geography

    Science.gov (United States)

    Monk, Janice J.; Stallings, William M.

    1975-01-01

    Two hundred students in an undergraduate physical geography course were assigned to a group which received either factually oriented quizzes or quizzes which stressed higher level behaviors such as application and analysis. Evaluation of the results indicated that the variation in testing procedures had no discernible effect on student scores in…

  10. Problem-Solving Test: Submitochondrial Localization of Proteins

    Science.gov (United States)

    Szeberenyi, Jozsef

    2011-01-01

    Mitochondria are surrounded by two membranes (outer and inner mitochondrial membrane) that separate two mitochondrial compartments (intermembrane space and matrix). Hundreds of proteins are distributed among these submitochondrial components. A simple biochemical/immunological procedure is described in this test to determine the localization of…

  11. Language Testing in the Military: Problems, Politics and Progress

    Science.gov (United States)

    Green, Rita; Wall, Dianne

    2005-01-01

    There appears to be little literature available -- either descriptive or research-related -- on language testing in the military. This form of specific purposes assessment affects both military personnel and civilians working within the military structure in terms of posting, promotion and remuneration, and it could be argued that it has serious…

  12. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
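
    For orientation, the classical baseline that boosting improves on can be written down compactly: a logit link for the mean and maximum likelihood for the beta parameters. The sketch below implements only this classical fit on simulated data; it is not the boosting algorithm proposed in the paper.

```python
# Sketch of classical (non-boosted) beta regression fitted by maximum
# likelihood with a logit link for the mean; data are simulated.
import numpy as np
from scipy import optimize, special, stats

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
mu = special.expit(0.5 + 1.0 * x)   # true mean in (0, 1)
phi = 20.0                          # true precision
y = rng.beta(mu * phi, (1 - mu) * phi)

X = np.column_stack([np.ones(n), x])

def negloglik(params):
    """Negative beta log-likelihood with mean mu = expit(X beta)."""
    beta, log_phi = params[:2], params[2]
    m = special.expit(X @ beta)
    p = np.exp(log_phi)             # precision kept positive via log scale
    return -stats.beta.logpdf(y, m * p, (1 - m) * p).sum()

res = optimize.minimize(negloglik, x0=np.array([0.0, 0.0, 1.0]))
print("coefficients:", res.x[:2], "precision:", np.exp(res.x[2]))
```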

  13. The efficient market hypothesis: problems with interpretations of empirical tests

    Directory of Open Access Journals (Sweden)

    Denis Alajbeg

    2012-03-01

    Full Text Available Despite many “refutations” in empirical tests, the efficient market hypothesis (EMH remains the central concept of financial economics. The EMH’s resistance to the results of empirical testing emerges from the fact that the EMH is not a falsifiable theory. Its axiomatic definition shows how asset prices would behave under assumed conditions. Testing for this price behavior does not make much sense as the conditions in the financial markets are much more complex than the simplified conditions of perfect competition, zero transaction costs and free information used in the formulation of the EMH. Some recent developments within the tradition of the adaptive market hypothesis are promising regarding development of a falsifiable theory of price formation in financial markets, but are far from giving assurance that we are approaching a new formulation. The most that can be done in the meantime is to be very cautious while interpreting the empirical evidence that is presented as “testing” the EMH.

  14. Testing the macroeconomic impact of the budget deficit in EU Member States using linear regression with fixed effects

    Directory of Open Access Journals (Sweden)

    Dalian Marius DORAN

    2017-11-01

    Full Text Available The article assesses the impact of the budget balance, whether surplus or deficit, on two main indicators of a country's economic performance, namely the growth rate of real GDP and the inflation rate, in the 27 European Union Member States and the United Kingdom. Panel data covering the period from 2001 to 2015 were used for this analysis. The method used is linear regression with fixed effects and Driscoll-Kraay standard errors. The dependent variables are the growth rate of real GDP and the inflation rate, and the independent variable is the budget balance (surplus or deficit). The results, obtained with the econometric software Stata, show a positive impact of the budget balance on growth in the European Union for the analyzed period.
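
    For readers who want to reproduce this kind of specification outside Stata, the sketch below uses the Python linearmodels package, where cov_type="kernel" requests (to the best of my knowledge) the Driscoll-Kraay covariance estimator; the panel data here are simulated, so every number and label is illustrative.

```python
# Sketch of a fixed-effects panel regression with Driscoll-Kraay standard
# errors using the Python `linearmodels` package (the study used Stata).
# Data, country labels, and coefficients are simulated for illustration.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(2)
countries = [f"C{i:02d}" for i in range(28)]
years = list(range(2001, 2016))
idx = pd.MultiIndex.from_product([countries, years], names=["country", "year"])

df = pd.DataFrame(index=idx)
df["budget_balance"] = rng.normal(-3.0, 2.0, len(df))  # % of GDP (made up)
df["gdp_growth"] = 1.5 + 0.2 * df["budget_balance"] + rng.normal(0, 1, len(df))

# EntityEffects adds country fixed effects; cov_type="kernel" requests the
# Driscoll-Kraay covariance estimator in linearmodels.
model = PanelOLS.from_formula(
    "gdp_growth ~ budget_balance + EntityEffects", data=df)
print(model.fit(cov_type="kernel"))
```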

  15. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of the results of ultrasonic testing is the basic precondition for its successful application to in-service inspection of changes in the quality of components of nuclear power installations. The results of periodic ultrasonic inspections are not satisfactory from the point of view of reproducibility. Nevertheless, the ultrasonic pulse method is suitable for evaluating the quality of most components of nuclear installations and is often the sole method which may be recommended for inspection with regard to its technical and economic aspects. (J.B.)

  16. Construct-level predictive validity of educational attainment and intellectual aptitude tests in medical student selection: meta-regression of six UK longitudinal studies

    Science.gov (United States)

    2013-01-01

    Background Measures used for medical student selection should predict future performance during training. A problem for any selection study is that predictor-outcome correlations are known only in those who have been selected, whereas selectors need to know how measures would predict in the entire pool of applicants. That problem of interpretation can be solved by calculating construct-level predictive validity, an estimate of the true predictor-outcome correlation across the range of applicant abilities. Methods Construct-level predictive validities were calculated in six cohort studies of medical student selection and training (student entry, 1972 to 2009) for a range of predictors, including A-levels, General Certificates of Secondary Education (GCSEs)/O-levels, and aptitude tests (AH5 and UK Clinical Aptitude Test (UKCAT)). Outcomes included undergraduate basic medical science and finals assessments, as well as postgraduate measures of Membership of the Royal Colleges of Physicians of the United Kingdom (MRCP(UK)) performance and entry in the Specialist Register. Construct-level predictive validity was calculated with the method of Hunter, Schmidt and Le (2006), adapted to correct for right-censorship of examination results due to grade inflation. Results Meta-regression analyzed 57 separate predictor-outcome correlations (POCs) and construct-level predictive validities (CLPVs). Mean CLPVs are substantially higher (.450) than mean POCs (.171). Mean CLPVs for first-year examinations were high for A-levels (.809; CI: .501 to .935), and lower for GCSEs/O-levels (.332; CI: .024 to .583) and UKCAT (mean = .245; CI: .207 to .276). A-levels had higher CLPVs for all undergraduate and postgraduate assessments than did GCSEs/O-levels and intellectual aptitude tests. CLPVs of educational attainment measures decline somewhat during training, but continue to predict postgraduate performance. Intellectual aptitude tests have lower CLPVs than A-levels or GCSEs

  17. Another baryon miracle? Testing solutions to the 'missing dwarfs' problem

    Science.gov (United States)

    Trujillo-Gomez, Sebastian; Schneider, Aurel; Papastergis, Emmanouil; Reed, Darren S.; Lake, George

    2018-04-01

    The dearth of dwarf galaxies in the local Universe is hard to reconcile with the large number of low-mass haloes expected within the concordance Λ cold dark matter (ΛCDM) paradigm. In this paper, we perform a systematic evaluation of the uncertainties affecting the measurement of dark matter halo abundance using galaxy kinematics. Using a large sample of dwarf galaxies with spatially resolved kinematics, we derive a correction to obtain the abundance of galaxies as a function of maximum circular velocity - a direct probe of halo mass - from the line-of-sight velocity function in the Local Volume. This method provides a direct means of comparing the predictions of theoretical models and simulations (including non-standard cosmologies and novel galaxy formation physics) to the observational constraints. The new 'galactic Vmax' function is steeper than the line-of-sight velocity function but still shallower than the theoretical CDM expectation, implying that unaccounted baryonic physics may be necessary to reduce the predicted abundance of galaxies. Using the galactic Vmax function, we investigate the theoretical effects of feedback-powered outflows and photoevaporation of gas due to reionization. At the 3σ confidence level, we find that feedback and reionization are not effective enough to reconcile the disagreement. In the case of maximum baryonic effects, the theoretical prediction still deviates significantly from the observations for Vmax < 60 km s⁻¹. CDM predicts at least 1.8 times more galaxies with Vmax = 50 km s⁻¹ and 2.5 times more than observed at 30 km s⁻¹. Recent hydrodynamic simulations seem to resolve the discrepancy but disagree with the properties of observed galaxies with spatially resolved kinematics. This abundance problem might point to the need to modify cosmological predictions at small scales.

  18. Research Problems Associated with Limiting the Applied Force in Vibration Tests and Conducting Base-Drive Modal Vibration Tests

    Science.gov (United States)

    Scharton, Terry D.

    1995-01-01

    The intent of this paper is to make a case for developing and conducting vibration tests which are both realistic and practical (a question of tailoring versus standards). Tests are essential for finding things overlooked in the analyses. The best test is often the most realistic test which can be conducted within the cost and budget constraints. Some standards are essential, but the author believes more in the individual's ingenuity to solve a specific problem than in the application of standards which reduce problems (and technology) to their lowest common denominator. Force limited vibration tests and base-drive modal tests are two examples of realistic, but practical testing approaches. Since both of these approaches are relatively new, a number of interesting research problems exist, and these are emphasized herein.

  19. Sex Differences and Self-Reported Attention Problems During Baseline Concussion Testing.

    Science.gov (United States)

    Brooks, Brian L; Iverson, Grant L; Atkins, Joseph E; Zafonte, Ross; Berkner, Paul D

    2016-01-01

    Amateur athletic programs often use computerized cognitive testing as part of their concussion management programs. There is evidence that athletes with preexisting attention problems will have worse cognitive performance and more symptoms at baseline testing. The purpose of this study was to examine whether attention problems affect assessments differently for male and female athletes. Participants were drawn from a database that included 6,840 adolescents from Maine who completed Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) at baseline (primary outcome measure). The final sample included 249 boys and 100 girls with self-reported attention problems. Each participant was individually matched for sex, age, number of past concussions, and sport to a control participant (249 boys, 100 girls). Boys with attention problems had worse reaction time than boys without attention problems. Girls with attention problems had worse visual-motor speed than girls without attention problems. Boys with attention problems reported more total symptoms, including more cognitive-sensory and sleep-arousal symptoms, compared with boys without attention problems. Girls with attention problems reported more cognitive-sensory, sleep-arousal, and affective symptoms than girls without attention problems. When considering the assessment, management, and outcome from concussions in adolescent athletes, it is important to consider both sex and preinjury attention problems regarding cognitive test results and symptom reporting.

  20. Inverse problems in eddy current testing using neural network

    Science.gov (United States)

    Yusa, N.; Cheng, W.; Miya, K.

    2000-05-01

    Reconstruction of cracks in conductive material is one of the most important issues in the field of eddy current testing. Although many attempts to reconstruct cracks have been made, most of them deal only with artificial cracks machined by electro-discharge. In the case of natural cracks such as stress corrosion cracking or inter-granular attack, however, contact regions exist, and the crack conductivity is therefore not necessarily zero. In this study, an attempt to reconstruct natural cracks using a neural network is presented. The neural network was trained on numerically simulated data obtained with a fast forward solver that calculates unflawed potential data a priori to save computational time. The solver is based on the A-φ method discretized using FEM-BEM. A natural crack was modeled as an area whose conductivity was less than that of the specimen, and the distribution of conductivity in that area was reconstructed as well. Training the network took considerable time, but once trained, reconstruction was extremely fast. The well-trained network gave good reconstruction results.

  1. Medical Physics: Forming and testing solutions to clinical problems.

    Science.gov (United States)

    Tsapaki, Virginia; Bayford, Richard

    2015-11-01

    According to the European Federation of Organizations for Medical Physics (EFOMP) policy statement No. 13, "The rapid advance in the use of highly sophisticated equipment and procedures in the medical field increasingly depends on information and communication technology. In spite of the fact that the safety and quality of such technology is vigorously tested before it is placed on the market, it often turns out that the safety and quality is not sufficient when used under hospital working conditions. To improve safety and quality for patient and users, additional safeguards and related monitoring, as well as measures to enhance quality, are required. Furthermore a large number of accidents and incidents happen every year in hospitals and as a consequence a number of patients die or are injured. Medical Physicists are well positioned to contribute towards preventing these kinds of events". The newest developments related to this increasingly important medical speciality were presented during the 8th European Conference of Medical Physics 2014 which was held in Athens, 11-13 September 2014 and hosted by the Hellenic Association of Medical Physicists (HAMP) in collaboration with the EFOMP and are summarized in this issue. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  2. Inverse problems in the design, modeling and testing of engineering systems

    Science.gov (United States)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  3. Collaborative regression.

    Science.gov (United States)

    Gross, Samuel M; Tibshirani, Robert

    2015-04-01

    We consider the scenario where one observes an outcome variable and sets of features from multiple assays, all measured on the same set of samples. One approach that has been proposed for dealing with these types of data is "sparse multiple canonical correlation analysis" (sparse mCCA). All of the current sparse mCCA techniques are biconvex and thus have no guarantees about reaching a global optimum. We propose a method for performing sparse supervised canonical correlation analysis (sparse sCCA), a specific case of sparse mCCA when one of the datasets is a vector. Our proposal for sparse sCCA is convex and thus does not face the same difficulties as the other methods. We derive efficient algorithms for this problem that can be implemented with off-the-shelf solvers, and illustrate their use on simulated and real data. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Application of single-step genomic best linear unbiased prediction with a multiple-lactation random regression test-day model for Japanese Holsteins.

    Science.gov (United States)

    Baba, Toshimi; Gotoh, Yusaku; Yamaguchi, Satoshi; Nakagawa, Satoshi; Abe, Hayato; Masuda, Yutaka; Kawahara, Takayoshi

    2017-08-01

    This study aimed to evaluate the validation reliability of single-step genomic best linear unbiased prediction (ssGBLUP) with a multiple-lactation random regression test-day model and to investigate the effect of adding genotyped cows on the reliability. Two data sets for test-day records from the first three lactations were used: full data from February 1975 to December 2015 (60 850 534 records from 2 853 810 cows) and reduced data cut off in 2011 (53 091 066 records from 2 502 307 cows). We used marker genotypes of 4480 bulls and 608 cows. Genomic enhanced breeding values (GEBV) of 305-day milk yield in all the lactations were estimated for at least 535 young bulls using two marker data sets: bull genotypes only, and both bull and cow genotypes. The realized reliability (R²) from linear regression analysis was used as an indicator of validation reliability. Using only genotyped bulls, R² ranged from 0.41 to 0.46 and was always higher than that of parent averages. Very similar R² values were observed when genotyped cows were added. An application of ssGBLUP to a multiple-lactation random regression model is feasible, and adding a limited number of genotyped cows has no significant effect on the reliability of GEBV for genotyped bulls. © 2016 Japanese Society of Animal Science.

  5. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet that includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition it includes 6 new codes to solve initial value

  6. Nonlinear reaction-diffusion equations with delay: some theorems, test problems, exact and numerical solutions

    Science.gov (United States)

    Polyanin, A. D.; Sorokin, V. G.

    2017-12-01

    The paper deals with nonlinear reaction-diffusion equations with one or several delays. We formulate theorems that allow constructing exact solutions for some classes of these equations, which depend on several arbitrary functions. Examples of application of these theorems for obtaining new exact solutions in elementary functions are provided. We state basic principles of construction, selection, and use of test problems for nonlinear partial differential equations with delay. Some test problems which can be suitable for estimating accuracy of approximate analytical and numerical methods of solving reaction-diffusion equations with delay are presented. Some examples of numerical solutions of nonlinear test problems with delay are considered.

  7. Predicting hyperketonemia by logistic and linear regression using test-day milk and performance variables in early-lactation Holstein and Jersey cows.

    Science.gov (United States)

    Chandler, T L; Pralle, R S; Dórea, J R R; Poock, S E; Oetzel, G R; Fourdraine, R H; White, H M

    2018-03-01

    Although cowside testing strategies for diagnosing hyperketonemia (HYK) are available, many are labor intensive and costly, and some lack sufficient accuracy. Predicting milk ketone bodies by Fourier transform infrared spectrometry during routine milk sampling may offer a more practical monitoring strategy. The objectives of this study were to (1) develop linear and logistic regression models using all available test-day milk and performance variables for predicting HYK and (2) compare prediction methods (Fourier transform infrared milk ketone bodies, linear regression models, and logistic regression models) to determine which is the most predictive of HYK. Given the data available, a secondary objective was to evaluate differences in test-day milk and performance variables (continuous measurements) between Holsteins and Jerseys and between cows with or without HYK within breed. Blood samples were collected on the same day as milk sampling from 658 Holstein and 468 Jersey cows between 5 and 20 d in milk (DIM). Diagnosis of HYK was at a serum β-hydroxybutyrate (BHB) concentration ≥1.2 mmol/L. Concentrations of milk BHB and acetone were predicted by Fourier transform infrared spectrometry (Foss Analytical, Hillerød, Denmark). Thresholds of milk BHB and acetone were tested for diagnostic accuracy, and logistic models were built from continuous variables to predict HYK in primiparous and multiparous cows within breed. Linear models were constructed from continuous variables for primiparous and multiparous cows within breed that were 5 to 11 DIM or 12 to 20 DIM. Milk ketone body thresholds diagnosed HYK with 64.0 to 92.9% accuracy in Holsteins and 59.1 to 86.6% accuracy in Jerseys. Logistic models predicted HYK with 82.6 to 97.3% accuracy. Internally cross-validated multiple linear regression models diagnosed HYK of Holstein cows with 97.8% accuracy for primiparous and 83.3% accuracy for multiparous cows. Accuracy of Jersey models was 81.3% in primiparous and 83
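
    As a rough illustration of this kind of model (not the study's actual coefficients or data), a logistic prediction of HYK status from test-day milk variables might look like the following sketch; all names and effect sizes are hypothetical.

```python
# Hypothetical sketch of a logistic model for hyperketonemia (HYK) from
# test-day milk variables; predictor names echo the study but all data and
# effect sizes here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 600
milk_bhb = rng.gamma(2.0, 0.05, n)        # FTIR-predicted milk BHB, mmol/L
fat_protein = rng.normal(1.2, 0.2, n)     # milk fat:protein ratio
dim = rng.integers(5, 21, n)              # days in milk

# Simulate HYK status from a made-up logistic relationship.
logit = -6.0 + 25.0 * milk_bhb + 2.0 * fat_protein + 0.02 * dim
hyk = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([milk_bhb, fat_protein, dim])
clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, hyk, cv=5).mean())
```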

  8. 49 CFR 40.267 - What problems always cause an alcohol test to be cancelled?

    Science.gov (United States)

    2010-10-01

    ... cancelled? 40.267 Section 40.267 Transportation Office of the Secretary of Transportation PROCEDURES FOR... always cause an alcohol test to be cancelled? As an employer, a BAT, or an STT, you must cancel an... the test was cancelled and must be treated as if the test never occurred. These problems are: (a) In...

  9. Influence of regression model and initial intensity of an incremental test on the relationship between the lactate threshold estimated by the maximal-deviation method and running performance.

    Science.gov (United States)

    Santos-Concejero, Jordan; Tucker, Ross; Granados, Cristina; Irazusta, Jon; Bidaurrazaga-Letona, Iraia; Zabala-Lili, Jon; Gil, Susana María

    2014-01-01

    This study investigated the influence of the regression model and initial intensity during an incremental test on the relationship between the lactate threshold estimated by the maximal-deviation method and performance in elite-standard runners. Twenty-three well-trained runners completed a discontinuous incremental running test on a treadmill. Speed started at 9 km·h⁻¹ and increased by 1.5 km·h⁻¹ every 4 min until exhaustion, with a minute of recovery for blood collection. Lactate-speed data were fitted by exponential and polynomial models. The lactate threshold was determined for both models, using all the co-ordinates, excluding the first and excluding the first and second points. The exponential lactate threshold was greater than the polynomial equivalent in any co-ordinate condition (P < 0.05), was more strongly related to performance, and is independent of the initial intensity of the test.
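
    The maximal-deviation (Dmax) method itself is simple to implement: fit a curve to the lactate-speed points, draw the line joining the first and last points, and take the speed at which the fitted curve deviates most from that line. The sketch below uses a third-order polynomial fit on illustrative data; the study's exponential variant would substitute a different fitted function.

```python
# Minimal sketch of the maximal-deviation (Dmax) lactate threshold with a
# third-order polynomial fit; the lactate-speed data are illustrative.
import numpy as np

speed = np.array([9.0, 10.5, 12.0, 13.5, 15.0, 16.5, 18.0])  # km/h
lactate = np.array([1.1, 1.3, 1.6, 2.2, 3.4, 5.6, 9.0])      # mmol/L

# Fit lactate as a cubic polynomial of speed and evaluate on a fine grid.
coef = np.polyfit(speed, lactate, 3)
grid = np.linspace(speed[0], speed[-1], 500)
fit = np.polyval(coef, grid)

# Perpendicular distance of each fitted point from the line joining the
# first and last data points; the threshold is at the maximal deviation.
x1, y1, x2, y2 = speed[0], lactate[0], speed[-1], lactate[-1]
d = np.abs((y2 - y1) * grid - (x2 - x1) * fit + x2 * y1 - y2 * x1)
d /= np.hypot(x2 - x1, y2 - y1)

print(f"Dmax lactate threshold ~ {grid[np.argmax(d)]:.2f} km/h")
```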

  10. A Test of the Testing Effect: Acquiring Problem-Solving Skills from Worked Examples

    Science.gov (United States)

    van Gog, Tamara; Kester, Liesbeth

    2012-01-01

    The "testing effect" refers to the finding that after an initial study opportunity, testing is more effective for long-term retention than restudying. The testing effect seems robust and is a finding from the field of cognitive science that has important implications for education. However, it is unclear whether this effect also applies…

  11. The students' ability in the mathematical literacy for uncertainty problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Anggoro, Ant. Yudhi

    2017-08-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on the PISA adaptation test. The procedures conducted by the researchers to achieve this objective were (1) adapting the PISA test, (2) validating the adapted PISA test, (3) asking junior high school students to take the adapted PISA test, and (4) constructing the students' solution profiles. PISA problems for mathematics can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper are those for the uncertainty problems. The adapted PISA test contained fifteen questions. Subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. This was a qualitative study. For the first uncertainty problem in the adapted test, 66.67% of students reached level 3. For the second uncertainty problem, 44.44% of students achieved level 4 and 33.33% reached level 3. For the third uncertainty problem, 38.89% of students achieved level 5, 11.11% reached level 4, and 5.56% achieved level 3. For part a of the fourth uncertainty problem, 72.22% of students reached level 4, and for part b, 83.33% achieved level 4.

  12. Influence of regression model and incremental test protocol on the relationship between lactate threshold using the maximal-deviation method and performance in female runners.

    Science.gov (United States)

    Machado, Fabiana Andrade; Nakamura, Fábio Yuzo; Moraes, Solange Marta Franzói De

    2012-01-01

    This study examined the influence of the regression model and initial intensity of an incremental test on the relationship between the lactate threshold estimated by the maximal-deviation method and endurance performance. Sixteen non-competitive, recreational female runners performed a discontinuous incremental treadmill test. The initial speed was set at 7 km · h⁻¹, and increased every 3 min by 1 km · h⁻¹ with a 30-s rest between the stages used for earlobe capillary blood sample collection. Lactate-speed data were fitted by an exponential-plus-constant and a third-order polynomial equation. The lactate threshold was determined for both regression equations, using all the coordinates, excluding the first and excluding the first and second initial points. Mean speed of a 10-km road race was the performance index (3.04 ± 0.22 m · s⁻¹). The exponentially-derived lactate threshold had a higher correlation (0.98 ≤ r ≤ 0.99) and smaller standard error of estimate (SEE) (0.04 ≤ SEE ≤ 0.05 m · s⁻¹) with performance than the polynomially-derived equivalent (0.83 ≤ r ≤ 0.89; 0.10 ≤ SEE ≤ 0.13 m · s⁻¹). The exponential lactate threshold was greater than the polynomial equivalent (P < 0.05), providing a performance index that is independent of the initial intensity of the incremental test and better than the polynomial equivalent.

  13. THE USEFULNESS OF USER TESTING METHODS IN IDENTIFYING PROBLEMS ON UNIVERSITY WEBSITES

    Directory of Open Access Journals (Sweden)

    Layla Hasan

    2014-10-01

    Full Text Available This paper aims to investigate the usefulness of three user testing methods (observation, and the use of quantitative and qualitative data from a post-test questionnaire) in terms of their ability or inability to find specific usability problems on university websites. The results showed that observation was the best method, compared to the other two, in identifying large numbers of major and minor usability problems on university websites. The results also showed that employing qualitative data from a post-test questionnaire was a useful complementary method, since this identified additional usability problems that were not identified by the observation method. However, the results showed that the quantitative data from the post-test questionnaire were inaccurate and ineffective in terms of identifying usability problems on such websites.

  14. Testing foreign language impact on engineering students' scientific problem-solving performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-12-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For that purpose, a total of 96 students in four year groups from freshman to senior level participated in a testing experiment in the Degree Programme of Aviation at the FH JOANNEUM University of Applied Sciences, Graz, Austria. Half of each test group were given a set of 12 physics problems described in German; the other half received the same set of problems described in English. The goal was to test the linguistic reading comprehension necessary for scientific problem solving, rather than physics knowledge as such. The results imply that written undergraduate English-medium engineering tests and examinations may not require additional examination time or language-specific aids for students who have reached university-entrance proficiency in English as a foreign language.

  15. Robustness to non-normality of various tests for the one-sample location problem

    Directory of Open Access Journals (Sweden)

    Michelle K. McDougall

    2004-01-01

    Full Text Available This paper studies the effect of the normal distribution assumption on the power and size of the sign test, Wilcoxon's signed rank test and the t-test when used in one-sample location problems. Power functions for these tests under various skewness and kurtosis conditions are produced for several sample sizes from simulated data using the g-and-k distribution of MacGillivray and Cannon [5].
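
    A quick way to see the kind of size distortion studied here is to simulate a skewed trait and tally rejection rates. In the sketch below a gamma distribution stands in for the paper's g-and-k distribution, purely for illustration; note that under skewness the mean and median differ, so evaluating the t-test at the median conflates its size with that mean-median gap.

```python
# Rough simulation of how skewness distorts these tests' nominal 5% level.
# A gamma distribution stands in for the paper's g-and-k distribution, and
# the null location is taken to be the true median, so the t-test's null
# (about the mean) is strictly false under skewness: exactly the kind of
# distortion the paper examines.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, reps, alpha, shape = 30, 2000, 0.05, 2.0
median = stats.gamma.ppf(0.5, shape)   # true median of the gamma trait

rej = {"sign": 0, "wilcoxon": 0, "t": 0}
for _ in range(reps):
    x = rng.gamma(shape, size=n)
    # Sign test: binomial test on the count of observations above the median.
    k = int(np.sum(x > median))
    rej["sign"] += stats.binomtest(k, n, 0.5).pvalue < alpha
    rej["wilcoxon"] += stats.wilcoxon(x - median).pvalue < alpha
    rej["t"] += stats.ttest_1samp(x, median).pvalue < alpha

for name, count in rej.items():
    print(f"{name:8s} empirical rejection rate ~ {count / reps:.3f}")
```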

  16. Seed germination test for toxicity evaluation of compost: Its roles, problems and prospects.

    Science.gov (United States)

    Luo, Yuan; Liang, Jie; Zeng, Guangming; Chen, Ming; Mo, Dan; Li, Guoxue; Zhang, Difang

    2018-01-01

    Compost is commonly used for the growth of plants and the remediation of environmental pollution. It is important to evaluate the quality of compost, and the seed germination test is a powerful tool for examining the toxicity of compost, which is the most important aspect of quality. The test is now widely adopted, but the main problem is that the test results vary with different methods and seed species, which limits its development and application. The standardization of methods and the modelization of seeds can contribute to solving the problem. Additionally, according to the probabilistic theory of seed germination, the error caused by the analysis and judgment methods of the test results can be reduced. Here, we reviewed the roles, problems and prospects of the seed germination test in studies of compost. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. An attempt at solving the problem of autocorrelation associated with use of mean approach for pooling cross-section and time series in regression modelling

    International Nuclear Information System (INIS)

    Nuamah, N.N.N.N.

    1990-12-01

    The paradoxical nature of results of the mean approach in pooling cross-section and time series data has been identified to be caused by the presence in the normal equations of phenomena such as autocovariances, multicollinear covariances, drift covariances and drift multicollinear covariances. This paper considers the problem of autocorrelation and suggests ways of solving it. (author). 4 refs

  18. Problem-Solving Test: RNA and Protein Synthesis in Bacteriophage-Infected "E. coli" Cells

    Science.gov (United States)

    Szeberenyi, Jozsef

    2008-01-01

    The classic experiment presented in this problem-solving test was designed to identify the template molecules of translation by analyzing the synthesis of phage proteins in "Escherichia coli" cells infected with bacteriophage T4. The work described in this test led to one of the most seminal discoveries of early molecular biology: it dealt a…

  19. Patch testing with markers of fragrance contact allergy. Do clinical tests correspond to patients' self-reported problems?

    DEFF Research Database (Denmark)

    Johansen, J D; Andersen, T F; Veien, Niels

    1997-01-01

    The aim of the present study was to investigate the relationship between patients' own recognition of skin problems using consumer products and the results of patch testing with markers of fragrance sensitization. Eight hundred and eighty-four consecutive eczema patients, 18-69 years of age, filled...

  20. Patch testing with markers of fragrance contact allergy. Do clinical tests correspond to patients' self-reported problems?

    DEFF Research Database (Denmark)

    Johansen, J D; Andersen, T F; Veien, Niels

    1997-01-01

    The aim of the present study was to investigate the relationship between patients' own recognition of skin problems using consumer products and the results of patch testing with markers of fragrance sensitization. Eight hundred and eighty-four consecutive eczema patients, 18-69 years of age, fill...

  1. Computer-based tests: The impact of test design and problem of equivalency

    Czech Academy of Sciences Publication Activity Database

    Květon, Petr; Jelínek, Martin; Vobořil, Dalibor; Klimusová, H.

    -, č. 23 (2007), s. 32-51 ISSN 0747-5632 R&D Projects: GA ČR(CZ) GA406/99/1052; GA AV ČR(CZ) KSK9058117 Institutional research plan: CEZ:AV0Z7025918 Keywords : Computer-based assessment * speeded test * equivalency Subject RIV: AN - Psychology Impact factor: 1.344, year: 2007

  2. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
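
    The five estimators have closed-form slopes that make the contrast easy to see. The sketch below computes all five on simulated data; the slope formulas follow the standard expressions given by Isobe et al. (1990), but the data are made up.

```python
# The five slope estimators discussed above, computed on simulated data.
# The formulas follow the standard expressions in Isobe et al. (1990).
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)   # true slope 2 plus scatter

xm, ym = x - x.mean(), y - y.mean()
Sxx, Syy, Sxy = (xm**2).sum(), (ym**2).sum(), (xm * ym).sum()

b1 = Sxy / Sxx                  # OLS of Y on X
b2 = Syy / Sxy                  # OLS of X on Y, expressed as dY/dX
b3 = (b1 * b2 - 1 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)  # bisector
b4 = 0.5 * ((b2 - 1 / b1)
            + np.sign(Sxy) * np.sqrt(4 + (b2 - 1 / b1) ** 2))  # orthogonal
b5 = np.sign(Sxy) * np.sqrt(b1 * b2)    # reduced major axis

for name, b in [("OLS(Y|X)", b1), ("OLS(X|Y)", b2), ("bisector", b3),
                ("orthogonal", b4), ("reduced major axis", b5)]:
    print(f"{name:20s} slope = {b:.3f}")
```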

  3. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
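
    The core idea, encoding 2-D vectors as complex numbers and solving for complex (vector-valued) coefficients by least squares, can be sketched directly; everything below (data, coefficient, noise level) is hypothetical, not the paper's full model with its isomorphism to a real regression.

```python
# Minimal sketch of the complex-number encoding of 2-D vector regression:
# each vector is a complex value and the coefficient is solved as a vector
# (a rotation and scaling) by complex least squares. Illustrative only.
import numpy as np

rng = np.random.default_rng(6)
n = 100

# Independent vector variable (e.g., a forcing vector) as complex numbers.
u = rng.normal(size=n) + 1j * rng.normal(size=n)

# True coefficient: scale by 0.8 and rotate by 30 degrees.
beta_true = 0.8 * np.exp(1j * np.pi / 6)
noise = 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
v = beta_true * u + noise   # dependent vector variable

# Complex least squares with an intercept column.
A = np.column_stack([np.ones(n, dtype=complex), u])
coef, *_ = np.linalg.lstsq(A, v, rcond=None)

print("estimated coefficient:", coef[1])
print("magnitude:", abs(coef[1]),
      "angle (deg):", np.degrees(np.angle(coef[1])))
```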

  4. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online-trained BAR with several neural models on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Testing of the PELSHIE shielding code using Benchmark problems and other special shielding models

    International Nuclear Information System (INIS)

    Language, A.E.; Sartori, D.E.; De Beer, G.P.

    1981-08-01

    The PELSHIE shielding code for gamma rays from point and extended sources was written in 1971 and a revised version was published in October 1979. At Pelindaba the program is used extensively due to its flexibility and ease of use for a wide range of problems. The testing of PELSHIE results with the results of a range of models and so-called Benchmark problems is desirable to determine possible weaknesses in PELSHIE. Benchmark problems, experimental data, and shielding models, some of which were resolved by the discrete-ordinates method with the ANISN and DOT 3.5 codes, were used for the efficiency test. The description of the models followed the pattern of a classical shielding problem. After the intercomparison with six different models, the usefulness of the PELSHIE code was quantitatively determined [af

  6. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  7. Mindfulness Facets, Social Anxiety, and Drinking to Cope with Social Anxiety: Testing Mediators of Drinking Problems

    OpenAIRE

    Clerkin, Elise M.; Sarfan, Laurel D.; Parsons, E. Marie; Magee, Joshua C.

    2016-01-01

    This cross-sectional study tested social anxiety symptoms, trait mindfulness, and drinking to cope with social anxiety as potential predictors and/or serial mediators of drinking problems. A community-based sample of individuals with co-occurring social anxiety symptoms and alcohol dependence was recruited. Participants (N = 105) completed measures of social anxiety, drinking to cope with social anxiety, and alcohol use and problems. As well, participants completed the Five Facet Mindfulness...

  8. Analysis of standard problem six (Semiscale test S-02-6) data

    International Nuclear Information System (INIS)

    Cartmill, C.E.

    1977-08-01

    Test S-02-6 of the Semiscale Mod-1 blowdown heat transfer test series was conducted to supply data for the U.S. Nuclear Regulatory Commission Standard Problem Six. To determine the credibility of the data and thus establish the validity of Standard Problem Six, an analysis of the results of Test S-02-6 was performed and is presented. This analysis consisted of investigations of system hydraulic and core thermal data. The credibility of the system hydraulic data was investigated through comparisons of the data with data and calculations from related sources (Test S-02-4) and, when necessary, through assessment of physical events. The credibility of the core thermal data was based on a thorough analysis of physical events. The results of these investigations substantiate the validity of Test S-02-6 data

  9. Modification of Harvard step-test for assessment of students’ with health problems functional potentials

    Directory of Open Access Journals (Sweden)

    E.N. Kopeikina

    2016-08-01

    Full Text Available Purpose: to substantiate, develop, and experimentally validate a modified test for assessing the functional potential of students with health problems. Material: students of both sexes aged 18-20 years (n = 522) participated in the research. In the proposed modification of the test, the student ascended onto a bench (h = 43 cm) and descended from it for 30 seconds. The pulse was then measured three times. In total, the test took 4 minutes. Results: to develop a scale for interpreting the results, the new 30-second modification of the Harvard step-test was assessed for validity. First, all students performed the modified step-test; then, after full recovery (after 20 minutes), they performed its three-minute variant. Correlation analysis showed a moderate correlation between the two samples (r = 0.64). Conclusions: this modified variant allows teachers to fully assess the functional potential of students with health problems.

  10. Parenting, attention and externalizing problems: testing mediation longitudinally, repeatedly and reciprocally.

    Science.gov (United States)

    Belsky, Jay; Pasco Fearon, R M; Bell, Brian

    2007-12-01

    Building on prior work, this paper tests, longitudinally and repeatedly, the proposition that attentional control processes mediate the effect of earlier parenting on later externalizing problems. Repeated independent measurements of all three constructs - observed parenting, computer-tested attentional control, and adult-reported externalizing problems - were subjected to structural equation modeling using data from the large-scale American study of child care and youth development. Structural equation modeling indicated (1) that greater maternal sensitivity at two different ages (54 months, ~6 years) predicted better attentional control on the Continuous Performance Test (CPT) of attention regulation at two later ages (~6/9 years); (2) that better attentional control at three different ages (54 months, ~6/9 years) predicted fewer teacher-reported externalizing problems at three later ages (~6/8/10 years); and (3) that attentional control partially mediated the effect of parenting on externalizing problems at two different lags (i.e., 54 months → ~6 years → ~8 years; ~6 years → ~9 years → ~10 years), though somewhat more strongly for the first. Additionally, (4) some evidence of reciprocal effects of attentional processes on parenting emerged (54 months → ~6 years; ~6 years → ~8 years), but not of problem behavior on attention. Because attentional control partially mediates the effects of parenting on externalizing problems, intervention efforts could target both parenting and attentional processes.

  11. [Testing a Model to Predict Problem Gambling in Speculative Game Users].

    Science.gov (United States)

    Park, Hyangjin; Kim, Suk Sun

    2018-04-01

    The purpose of the study was to develop and test a model for predicting problem gambling in speculative game users based on Blaszczynski and Nower's pathways model of problem and pathological gambling. The participants were 262 speculative game users recruited from seven speculative gambling venues located in Seoul, Gangwon, and Gyeonggi, Korea. They completed a structured self-report questionnaire comprising measures of problem gambling, negative emotions, attentional impulsivity, motor impulsivity, non-planning impulsivity, gambler's fallacy, and gambling self-efficacy. Structural equation modeling was used to test the hypothesized model and to examine the direct and indirect effects on problem gambling in speculative game users, using the SPSS 22.0 and AMOS 20.0 programs. The hypothetical research model provided a reasonable fit to the data. Negative emotions, motor impulsivity, gambler's fallacy, and gambling self-efficacy had direct effects on problem gambling in speculative game users, while indirect effects were reported for negative emotions, motor impulsivity, and gambler's fallacy. These predictors explained 75.2% of the variance in problem gambling among speculative game users. The findings suggest that intervention programs to reduce negative emotions, motor impulsivity, and gambler's fallacy, and to increase gambling self-efficacy in speculative game users, are needed to prevent problem gambling. © 2018 Korean Society of Nursing Science.

  12. Ridge Regression Signal Processing

    Science.gov (United States)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
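
    For reference, the ridge estimator underlying this approach replaces the ordinary least-squares normal equations with beta = (X'X + kI)^(-1) X'y, which stabilizes the solution when poor geometry makes X'X nearly singular. The sketch below is a generic illustration of that estimator, not the author's EKF formulation.

```python
# Reference sketch of the ridge estimator beta = (X'X + kI)^{-1} X'y, the
# building block behind the ridge-based filtering discussed above (generic
# illustration, not the author's EKF formulation).
import numpy as np

def ridge(X, y, k):
    """Solve the ridge normal equations for regularization parameter k."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(7)
n, p = 50, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 2] + 0.01 * rng.normal(size=n)   # near-collinear columns
y = X @ np.array([1.0, -2.0, 0.5, 0.5]) + rng.normal(0, 0.1, n)

# Ridge shrinks the wild OLS estimates caused by the near-singular X'X,
# at the cost of some bias.
for k in (0.0, 0.1, 1.0):
    print(f"k = {k:4.1f}  beta = {np.round(ridge(X, y, k), 3)}")
```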

  13. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    Science.gov (United States)

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log₁₀ and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
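
    The reported inflation is easy to reproduce in miniature: simulate a rare variant independent of a skewed trait, run a simple linear regression, and tally how often the null is rejected. The sketch below is a scaled-down illustration with arbitrary parameters, not a reanalysis of the GAW 19 data.

```python
# Rough simulation of the reported inflation: regress a skewed (gamma)
# trait on a rare variant under the null and tally rejections. Parameters
# are arbitrary and much smaller than the GAW 19 analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n, reps, alpha = 500, 2000, 0.01

def type1_rate(maf, draw_trait):
    hits, tests = 0, 0
    for _ in range(reps):
        g = rng.binomial(2, maf, n)      # genotype, independent of trait
        if g.std() == 0:                 # skip monomorphic draws
            continue
        tests += 1
        y = draw_trait(n)
        hits += stats.linregress(g, y).pvalue < alpha
    return hits / tests

for maf in (0.10, 0.01, 0.002):
    normal = type1_rate(maf, rng.standard_normal)
    gamma = type1_rate(maf, lambda m: rng.gamma(1.0, size=m))
    print(f"MAF = {maf:5.3f}  normal: {normal:.4f}  gamma: {gamma:.4f}")
```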

  14. Logistic regression analysis of multiple noninvasive tests for the prediction of the presence and extent of coronary artery disease in men

    International Nuclear Information System (INIS)

    Hung, J.; Chaitman, B.R.; Lam, J.; Lesperance, J.; Dupras, G.; Fines, P.; Cherkaoui, O.; Robert, P.; Bourassa, M.G.

    1985-01-01

    The incremental diagnostic yield of clinical data, exercise ECG, stress thallium scintigraphy, and cardiac fluoroscopy to predict coronary and multivessel disease was assessed in 171 symptomatic men by means of multiple logistic regression analyses. When clinical variables alone were analyzed, chest pain type and age were predictive of coronary disease, whereas chest pain type, age, a family history of premature coronary disease before age 55 years, and abnormal ST-T wave changes on the rest ECG were predictive of multivessel disease. The percentage of patients correctly classified by cardiac fluoroscopy (presence or absence of coronary artery calcification), exercise ECG, and thallium scintigraphy was 9%, 25%, and 50%, respectively, greater than for clinical variables, when the presence or absence of coronary disease was the outcome, and 13%, 25%, and 29%, respectively, when multivessel disease was studied; 5% of patients were misclassified. When the 37 clinical and noninvasive test variables were analyzed jointly, the most significant variable predictive of coronary disease was an abnormal thallium scan and for multivessel disease, the amount of exercise performed. The data from this study provide a quantitative model and confirm previous reports that optimal diagnostic efficacy is obtained when noninvasive tests are ordered sequentially. In symptomatic men, cardiac fluoroscopy is a relatively ineffective test when compared to exercise ECG and thallium scintigraphy

  15. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. [Development of a proverb test for assessment of concrete thinking problems in schizophrenic patients].

    Science.gov (United States)

    Barth, A; Küfferle, B

    2001-11-01

    Concretism is considered an important aspect of schizophrenic thought disorder. Traditionally it is measured using the method of proverb interpretation, in which metaphoric proverbs are presented with the request that the subject explain their meaning. Interpretations are recorded and scored on concretistic tendencies. However, this method has two problems: its reliability is doubtful and it is rather complicated to perform. In this paper, a new version of a multiple choice proverb test is presented which can solve these problems in a reliable and economic manner. Using the new test, it has been shown that schizophrenic patients have greater deficits in proverb interpretation than depressive patients.

  17. Steganalysis using logistic regression

    Science.gov (United States)

    Lubenko, Ivans; Ker, Andrew D.

    2011-02-01

    We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.
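    The practical difference the authors emphasize, probability outputs in addition to hard labels, can be seen with any off-the-shelf logistic regression. The sketch below uses random stand-ins for the 686-dimensional SPAM features and toy labels; it is not the paper's trained detector.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    X = rng.standard_normal((200, 686))      # stand-in for SPAM feature vectors
    y = rng.integers(0, 2, size=200)         # 0 = cover, 1 = stego (toy labels)

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print(clf.predict(X[:3]))        # hard classifications, as an SVM would give
    print(clf.predict_proba(X[:3]))  # per-class probabilities, the extra output
    ```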

  18. Explaining behavior change after genetic testing: the problem of collinearity between test results and risk estimates.

    Science.gov (United States)

    Fanshawe, Thomas R; Prevost, A Toby; Roberts, J Scott; Green, Robert C; Armstrong, David; Marteau, Theresa M

    2008-09-01

    This paper explores whether and how the behavioral impact of genotype disclosure can be disentangled from the impact of numerical risk estimates generated by genetic tests. Secondary data analyses are presented from a randomized controlled trial of 162 first-degree relatives of Alzheimer's disease (AD) patients. Each participant received a lifetime risk estimate of AD. Control group estimates were based on age, gender, family history, and assumed epsilon4-negative apolipoprotein E (APOE) genotype; intervention group estimates were based upon the first three variables plus true APOE genotype, which was also disclosed. AD-specific self-reported behavior change (diet, exercise, and medication use) was assessed at 12 months. Behavior change was significantly more likely with increasing risk estimates, and also more likely, but not significantly so, in epsilon4-positive intervention group participants (53% changed behavior) than in control group participants (31%). Intervention group participants receiving epsilon4-negative genotype feedback (24% changed behavior) and control group participants had similar rates of behavior change and risk estimates, the latter allowing assessment of the independent effects of genotype disclosure. However, collinearity between risk estimates and epsilon4-positive genotypes, which engender high-risk estimates, prevented assessment of the independent effect of the disclosure of an epsilon4 genotype. Novel study designs are proposed to determine whether genotype disclosure has an impact upon behavior beyond that of numerical risk estimates.

  19. New evidence on the convergence of per capita carbon dioxide emissions from panel seemingly unrelated regressions augmented Dickey-Fuller tests

    International Nuclear Information System (INIS)

    Lee, Chien-Chiang; Chang, Chun-Ping

    2008-01-01

    Using the data for per capita carbon dioxide (CO2) emissions relative to the average per capita emissions for 21 countries in the Organisation for Economic Co-operation and Development (OECD) covering the period 1960-2000, this paper seeks to determine whether the stochastic convergence and β-convergence of CO2 emissions are supported in countries with the same level of development. In other words, are shocks to relative per capita CO2 emissions temporary in industrialized countries? We respond to this question by utilizing Breuer et al.'s [Breuer JB, McNown R, Wallace MS. Misleading inferences from panel unit-root tests with an illustration from purchasing power parity. Review of International Economics 2001;9(3):482-93; Breuer JB, McNown R, Wallace MS. Series-specific unit-root tests with panel data. Oxford Bulletin of Economics and Statistics 2002;64(5):527-46] panel seemingly unrelated regressions augmented Dickey-Fuller (SURADF) unit-root tests, which allow us to account for possible cross-sectional effects and to identify how many and which members of the panel contain a unit root. Our empirical findings provide evidence that relative per capita CO2 emissions in OECD countries are a mixture of I(0) and I(1) processes, in which 14 out of 21 OECD countries exhibit divergence. The results reveal that conventional panel unit-root tests can lead to misleading inferences biased towards stationarity even if only one series in the panel is strongly stationary. (author)
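    The univariate building block of the SURADF approach, the augmented Dickey-Fuller test, can be run per series with standard tools. The sketch below uses synthetic series of the same length as the 1960-2000 sample, not the OECD panel, and omits the SUR step that links the country equations.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(3)
    t = 41  # annual observations, 1960-2000
    series = {
        "stationary": rng.standard_normal(t),       # I(0): shocks are temporary
        "unit_root": rng.standard_normal(t).cumsum(),  # I(1): shocks persist
    }
    for name, y in series.items():
        stat, pvalue, *_ = adfuller(y, autolag="AIC")
        print(f"{name}: ADF stat = {stat:.2f}, p = {pvalue:.3f}")
    ```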

  20. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
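    Case (1), an unweighted regression line with bootstrap resampling, can be sketched as follows; the data are synthetic and the 2000-replicate choice is an arbitrary illustration, not the paper's procedure for any particular distance-scale relation.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.uniform(0, 5, 50)
    y = 1.5 * x + 0.5 + rng.standard_normal(50)

    def ols_slope(x, y):
        # Slope of the unweighted least-squares line.
        return np.polyfit(x, y, 1)[0]

    # Bootstrap: refit the line on resampled (x, y) pairs.
    boot = np.array([
        ols_slope(x[idx], y[idx])
        for idx in (rng.integers(0, len(x), len(x)) for _ in range(2000))
    ])
    print(f"slope = {ols_slope(x, y):.3f} +/- {boot.std(ddof=1):.3f} (bootstrap)")
    ```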

  1. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  2. Estimating the Proportion of True Null Hypotheses in Multiple Testing Problems

    Directory of Open Access Journals (Sweden)

    Oluyemi Oyeniran

    2016-01-01

    The problem of estimating the proportion, π0, of the true null hypotheses in a multiple testing problem is important in cases where large scale parallel hypotheses tests are performed independently. While the problem is a quantity of interest in its own right in applications, the estimate of π0 can be used for assessing or controlling an overall false discovery rate. In this article, we develop an innovative nonparametric maximum likelihood approach to estimate π0. The nonparametric likelihood is proposed to be restricted to multinomial models and an EM algorithm is also developed to approximate the estimate of π0. Simulation studies show that the proposed method outperforms other existing methods. Using experimental microarray datasets, we demonstrate that the new method provides a satisfactory estimate in practice.
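    For contrast with the authors' multinomial/EM estimator (not reproduced here), a common baseline is a Storey-type estimator of π0, which exploits the near-uniformity of null p-values above a cutoff. The cutoff lam = 0.5 and the simulated p-value mixture below are illustrative assumptions.

    ```python
    import numpy as np

    def storey_pi0(pvalues, lam=0.5):
        """Storey-type estimate of the proportion of true nulls.

        Under the null, p-values are uniform, so the mass above `lam`
        comes (mostly) from true nulls: pi0 ~ #{p > lam} / ((1 - lam) * m).
        """
        p = np.asarray(pvalues)
        return min(1.0, (p > lam).mean() / (1.0 - lam))

    # 80% true nulls (uniform p-values), 20% alternatives (small p-values).
    rng = np.random.default_rng(5)
    p = np.concatenate([rng.uniform(size=8000), rng.beta(0.5, 20.0, size=2000)])
    print(f"estimated pi0 = {storey_pi0(p):.3f}  (truth: 0.8)")
    ```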

  3. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
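    The "nightmare of the first kind" is easy to demonstrate: including one strongly correlated descriptor can change the coefficient of a descriptor already in the regression. The sketch below uses synthetic descriptors, not the nonane boiling-point data.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 30
    d1 = rng.standard_normal(n)
    d2 = d1 + 0.1 * rng.standard_normal(n)     # strongly correlated with d1
    y = 3.0 * d1 + 0.5 * rng.standard_normal(n)

    X1 = np.column_stack([np.ones(n), d1])       # regression on d1 only
    X2 = np.column_stack([np.ones(n), d1, d2])   # d2 added to the regression
    b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
    b2, *_ = np.linalg.lstsq(X2, y, rcond=None)
    print("coef of d1 alone:     ", b1[1])
    print("coef of d1 with d2 in:", b2[1])       # can shift unpredictably
    ```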

  4. Testing Foreign Language Impact on Engineering Students' Scientific Problem-Solving Performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-01-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For that purpose, a statistically significant total number of 96 students in four year groups from freshman to senior level participated in a testing experiment in…

  5. Analysis and discussion on several problems when testing the thickness of reinforcement cover of concrete component

    Science.gov (United States)

    Zhanhua, Zhang; Guiling, Ji; Lijie; Zhaobo, Zhang; Na, Han; Jing, Zhao; Tan, Li; Zhaorui, Liu

    2018-03-01

    The reinforcement cover of a concrete component plays a very important role in ensuring the durability of various types of structures and the effective anchorage between steel reinforcement and concrete. This paper discusses and analyzes the problems that occur when testing the thickness of the reinforcement cover of concrete components, so as to provide reference and help for related work.

  6. Patch testing with markers of fragrance contact allergy. Do clinical tests correspond to patients' self-reported problems?

    DEFF Research Database (Denmark)

    Johansen, J D; Andersen, T F; Veien, N

    1997-01-01

    The aim of the present study was to investigate the relationship between patients' own recognition of skin problems using consumer products and the results of patch testing with markers of fragrance sensitization. Eight hundred and eighty-four consecutive eczema patients, 18-69 years of age, filled in a questionnaire prior to patch testing with the European standard series. The questionnaire contained questions about skin symptoms from the use of scented and unscented products as well as skin reactions from contact with spices, flowers and citrus fruits that could indicate fragrance sensitivity. A highly significant association was found between reporting a history of visible skin symptoms from using scented products and a positive patch test to the fragrance mix, whereas no such relationship could be established to the Peru balsam in univariate or multivariate analysis. Our results suggest that the role...

  7. Testing the Emotional Vulnerability Pathway to Problem Gambling in Culturally Diverse University Students.

    Science.gov (United States)

    Hum, Sandra; Carr, Sherilene M

    2018-02-12

    Loneliness and adapting to an unfamiliar environment can increase emotional vulnerability in culturally and linguistically diverse (CALD) university students. According to Blaszczynski and Nower's pathways model of problem and pathological gambling, this emotional vulnerability could increase the risk of problem gambling. The current study examined whether loneliness was associated with problem gambling risk in CALD students relative to their Australian peers. Additionally, differences in coping strategies were examined to determine their buffering effect on the relationship. A total of 463 female and 165 male university students (aged 18-38) from Australian (38%), mixed Australian and CALD (23%) and CALD (28%) backgrounds responded to an online survey of problem gambling behaviour, loneliness, and coping strategies. The results supported the hypothesis that loneliness would be related to problem gambling in CALD students. There was no evidence of a moderating effect of coping strategies. Future research could test whether the introduction of programs designed to alleviate loneliness in culturally diverse university students reduces their risk of developing problem gambling.

  8. Role reversal and problem solving in international negotiations: the Partial Nuclear Test Ban case

    International Nuclear Information System (INIS)

    King, T.D.

    1978-01-01

    To facilitate finding bargaining space and to reinforce cooperative potential, a number of analysts have promoted the use of role reversal and problem solving. Role reversal involves restating the positions of one's adversary to demonstrate understanding and to develop empathy, while problem solving involves searching for alternatives that promote joint interests. The case of the negotiations in the Eighteen Nation Disarmament Conference from 1962--1963 leading to the Partial Nuclear Test Ban Treaty provided the context for examining bargaining relationships involving role reversal and problem solving. Interactions among the United States, the United Kingdom, and the Soviet Union, as recorded in transcripts of 112 sessions, were coded using Bargaining Process Analysis II, a content analysis instrument used to classify negotiation behaviors. Role reversal was measured by the frequency of paraphrases of the adversary's positions. Problem solving was measured by the frequency of themes promoting the exploration of alternatives and the search for mutually beneficial outcomes. The findings on the use of paraphrasing suggest that it can be used to restrict exploration as well as to promote it. The exploratory focus of problem solving was somewhat limited by its use in association with demands, suggesting that problem solving was interpreted as a sign of weakness

  9. Mindfulness Facets, Social Anxiety, and Drinking to Cope with Social Anxiety: Testing Mediators of Drinking Problems.

    Science.gov (United States)

    Clerkin, Elise M; Sarfan, Laurel D; Parsons, E Marie; Magee, Joshua C

    2017-02-01

    This cross-sectional study tested social anxiety symptoms, trait mindfulness, and drinking to cope with social anxiety as potential predictors and/or serial mediators of drinking problems. A community-based sample of individuals with co-occurring social anxiety symptoms and alcohol dependence were recruited. Participants (N = 105) completed measures of social anxiety, drinking to cope with social anxiety, and alcohol use and problems. As well, participants completed the Five Facet Mindfulness Questionnaire, which assesses mindfulness facets of accepting without judgment, acting with awareness, not reacting to one's internal experiences, observing and attending to experiences, and labeling and describing. As predicted, the relationship between social anxiety symptoms and drinking problems was mediated by social anxiety coping motives across each of the models. Further, the relationship between specific mindfulness facets (acting with awareness, accepting without judgment, and describe) and drinking problems was serially mediated by social anxiety symptoms and drinking to cope with social anxiety. This research builds upon existing studies that have largely been conducted with college students to evaluate potential mediators driving drinking problems. Specifically, individuals who are less able to act with awareness, accept without judgment, and describe their internal experiences may experience heightened social anxiety and drinking to cope with that anxiety, which could ultimately result in greater alcohol-related problems.

  10. DYNAMIC PROGRAMMING APPROACH TO TESTING RESOURCE ALLOCATION PROBLEM FOR MODULAR SOFTWARE

    Directory of Open Access Journals (Sweden)

    P.K. Kapur

    2003-02-01

    The testing phase of a software product begins with module testing. During this period modules are tested independently to remove the maximum possible number of faults within a specified time limit or testing resource budget. This gives rise to some interesting optimization problems, which are discussed in this paper. Two optimization models are proposed for the optimal allocation of testing resources among the modules of a software product. In the first model, we maximize the total fault removal, subject to a budgetary constraint. In the second model, an additional constraint representing the aspiration level for fault removals for each module of the software is added. These models are solved using the dynamic programming technique. The methods have been illustrated through numerical examples.

  11. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
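    The method of least squares described in the article reduces to two closed-form expressions for the slope and intercept; the following is a minimal sketch with made-up clinical-style data.

    ```python
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # predictor (illustrative values)
    y = np.array([2.1, 3.9, 6.2, 7.8, 9.9])   # outcome

    # Least-squares slope and intercept for the line y = intercept + slope * x.
    slope = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
    intercept = y.mean() - slope * x.mean()
    print(f"y = {intercept:.2f} + {slope:.2f} x")
    ```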

  12. Unbalanced Regressions and the Predictive Equation

    DEFF Research Database (Denmark)

    Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo

    Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness…

  13. Patch testing with markers of fragrance contact allergy. Do clinical tests correspond to patients' self-reported problems?

    Science.gov (United States)

    Johansen, J D; Andersen, T F; Veien, N; Avnstorp, C; Andersen, K E; Menné, T

    1997-03-01

    The aim of the present study was to investigate the relationship between patients' own recognition of skin problems using consumer products and the results of patch testing with markers of fragrance sensitization. Eight hundred and eighty-four consecutive eczema patients, 18-69 years of age, filled in a questionnaire prior to patch testing with the European standard series. The questionnaire contained questions about skin symptoms from the use of scented and unscented products as well as skin reactions from contact with spices, flowers and citrus fruits that could indicate fragrance sensitivity. A highly significant association was found between reporting a history of visible skin symptoms from using scented products and a positive patch test to the fragrance mix, whereas no such relationship could be established to the Peru balsam in univariate or multivariate analysis. Our results suggest that the role of Peru balsam in detecting relevant fragrance contact allergy is limited, while most fragrance mix-positive patients are aware that the use of scented products may cause skin problems.

  14. Deep Support Vector Machines for Regression Problems

    NARCIS (Netherlands)

    Wiering, Marco; Schutten, Marten; Millea, Adrian; Meijster, Arnold; Schomaker, Lambertus

    2013-01-01

    In this paper we describe a novel extension of the support vector machine, called the deep support vector machine (DSVM). The original SVM has a single layer with kernel functions and is therefore a shallow model. The DSVM can use an arbitrary number of layers, in which lower-level layers contain

  15. An accurate and efficient identification of children with psychosocial problems by means of computerized adaptive testing

    Directory of Open Access Journals (Sweden)

    Reijneveld Symen A

    2011-08-01

    Background: Questionnaires used by health services to identify children with psychosocial problems are often rather short. The psychometric properties of such short questionnaires are mostly less than needed for an accurate distinction between children with and without problems. We aimed to assess whether a short Computerized Adaptive Test (CAT) can overcome the weaknesses of short written questionnaires when identifying children with psychosocial problems. Method: We used a Dutch national data set obtained from parents of children invited for a routine health examination by Preventive Child Healthcare, with 205 items on behavioral and emotional problems (n = 2,041, response 84%). In a random subsample we determined which items met the requirements of an Item Response Theory (IRT) model to a sufficient degree. Using those items, item parameters necessary for a CAT were calculated and a cut-off point was defined. In the remaining subsample we determined the validity and efficiency of a Computerized Adaptive Test using simulation techniques, with current treatment status and a clinical score on the Total Problem Scale (TPS) of the Child Behavior Checklist as criteria. Results: Out of 205 items available, 190 sufficiently met the criteria of the underlying IRT model. For 90% of the children a score above or below the cut-off point could be determined with 95% accuracy. The mean number of items needed to achieve this was 12. Sensitivity and specificity with the TPS as a criterion were 0.89 and 0.91, respectively. Conclusion: An IRT-based CAT is a very promising option for the identification of psychosocial problems in children, as it can lead to an efficient, yet high-quality identification. The results of our simulation study need to be replicated in a real-life administration of this CAT.
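    The core CAT loop, administering the not-yet-asked item with maximum Fisher information at the current ability estimate, can be sketched under a 2-parameter logistic IRT model. The item parameters, simulated responses, and crude ability update below are illustrative stand-ins for the study's calibrated 190-item bank and its proper scoring step.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    a = rng.uniform(0.8, 2.0, size=190)    # discriminations (hypothetical)
    b = rng.normal(0.0, 1.0, size=190)     # difficulties (hypothetical)

    def info(theta, a, b):
        # Fisher information of a 2PL item at ability theta.
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        return a**2 * p * (1.0 - p)

    theta, asked = 0.0, []
    for _ in range(12):                    # ~12 items sufficed in the study
        scores = info(theta, a, b)
        scores[asked] = -np.inf            # never repeat an item
        item = int(np.argmax(scores))
        asked.append(item)
        response = rng.random() < 0.5      # stand-in for the parent's answer
        theta += 0.3 if response else -0.3 # crude update, not the MLE/EAP step
    print("items administered:", asked)
    ```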

  16. 3D Modeling and Simulation for Electromagnetic Non-Destructive Testing- Problems and Limitations

    International Nuclear Information System (INIS)

    Ilham Mukriz Zainal Abidin; Nurul Ain Ahmad Latif

    2011-01-01

    Non-Destructive Testing (NDT) plays a critical role in nuclear power plants (NPPs) for life cycle management; such testing requires specialists with various NDT-related expertise and specific equipment. This paper discusses the importance of 3D modeling and simulation for electromagnetic NDT of critical and complex components in terms of engineering reasoning and physical trials. Results from simulation are presented which show the link established between the measurements and information relating to defects, such as 3D shape, size and location, which facilitates not only the forward problem but also inverse modeling involving experimental system specification and configuration, and pattern recognition for 3D defect information. Subsequently, the problems and limitations pertinent to 3D modeling and simulation are highlighted and areas of improvement are discussed. (author)

  17. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect - great clarity. The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  18. SEPARATION PHENOMENA LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Ikaro Daniel de Carvalho Barreto

    2014-03-01

    This paper proposes an application of concepts from maximum likelihood estimation of the binomial logistic regression model to the separation phenomenon. Separation generates bias in the estimation and leads to different interpretations of the estimates under the different statistical tests (Wald, Likelihood Ratio and Score) and to different estimates under the different iterative methods (Newton-Raphson and Fisher Scoring). The paper also presents an example that demonstrates the direct implications for the validation of the model and of its variables, and the implications for estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we present, briefly, the Firth correction to circumvent the phenomenon of separation.
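    The divergence under separation is easy to observe: with complete separation the unpenalized maximum likelihood slope is infinite, so weakening the regularization lets the fitted slope grow without bound. The sketch below uses synthetic data; Firth's correction itself is provided by specialized packages rather than shown here.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    x = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
    y = np.array([0, 0, 0, 1, 1, 1])          # completely separated at x = 2.5

    # As C grows (weaker L2 penalty), the slope estimate keeps increasing,
    # the symptom of separation the paper discusses.
    for C in (1.0, 100.0, 10000.0):
        clf = LogisticRegression(C=C, max_iter=10000).fit(x, y)
        print(f"C={C:>8}: slope = {clf.coef_[0, 0]:.2f}")
    ```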

  19. 49 CFR 40.203 - What problems cause a drug test to be cancelled unless they are corrected?

    Science.gov (United States)

    2010-10-01

    49 CFR Part 40 (Procedures for Transportation Workplace Drug and Alcohol Testing Programs), Subpart: Problems in Drug Tests, § 40.203: What problems cause a drug test to be cancelled unless they are corrected?

  20. Iterative and range test methods for an inverse source problem for acoustic waves

    International Nuclear Information System (INIS)

    Alves, Carlos; Kress, Rainer; Serranho, Pedro

    2009-01-01

    We propose two methods for solving an inverse source problem for time-harmonic acoustic waves. Based on the reciprocity gap principle a nonlinear equation is presented for the locations and intensities of the point sources that can be solved via Newton iterations. To provide an initial guess for this iteration we suggest a range test algorithm for approximating the source locations. We give a mathematical foundation for the range test and exhibit its feasibility in connection with the iteration method by some numerical examples

  1. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis on the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map landcover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, landcover, rainfall precipitation, and normalized difference vegetation index (ndvi), were extracted from the spatial database and the logistic regression coefficient of each factor was computed. Then the landslide hazard was analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were then compared with the field-verified landslide locations. Among the three cases of the application of logistic regression coefficients in the same study area, the case of Selangor based on the Selangor logistic regression coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases from the cross application of logistic regression coefficients in the other two areas, the case of Selangor based on the logistic regression coefficients of Cameron showed the highest (90%) prediction accuracy, whereas the case of Penang based on the Selangor logistic regression coefficients showed the lowest accuracy (79%). Qualitatively, the cross

  2. Alcohol Use-Related Problems Among a Rural Indian Population of West Bengal: An Application of the Alcohol Use Disorders Identification Test (AUDIT).

    Science.gov (United States)

    Barik, Anamitra; Rai, Rajesh Kumar; Chowdhury, Abhijit

    2016-03-01

    To examine alcohol use and related problems among a rural subset of the Indian population. The Alcohol Use Disorders Identification Test (AUDIT) was used as part of Health and Demographic Surveillance of 36,611 individuals aged ≥18 years. From this survey, data on 3671 current alcohol users were analysed using bivariate and multivariate ordered logit regression. Over 19% of males and 2.4% of females were current alcohol users. Mean ethanol consumption on a typical drinking day was estimated to be higher among males (96.3 gm) than among females (56.5 gm). The mean AUDIT score was 11 among current alcohol users. In the ordered logit regression, alcohol use-related problems estimated with the AUDIT were low among women, Scheduled Tribes and unmarried people, whereas they were high among Muslims. This rural population appears to be in need of an effective intervention program, perhaps targeting men and the household, aimed at reducing the level of alcohol use and related problems. © The Author 2015. Medical Council on Alcohol and Oxford University Press. All rights reserved.

  3. Exact and conceptual repetition dissociate conceptual memory tests: problems for transfer appropriate processing theory.

    Science.gov (United States)

    McDermott, K B; Roediger, H L

    1996-03-01

    Three experiments examined whether a conceptual implicit memory test (specifically, category instance generation) would exhibit repetition effects similar to those found in free recall. The transfer appropriate processing account of dissociations among memory tests led us to predict that the tests would show parallel effects; this prediction was based upon the theory's assumption that conceptual tests will behave similarly as a function of various independent variables. In Experiment 1, conceptual repetition (i.e., following a target word [e.g., puzzles] with an associate [e.g., jigsaw]) did not enhance priming on the instance generation test relative to the condition of simply presenting the target word once, although this manipulation did affect free recall. In Experiment 2, conceptual repetition was achieved by following a picture with its corresponding word (or vice versa). In this case, there was an effect of conceptual repetition on free recall but no reliable effect on category instance generation or category cued recall. In addition, we obtained a picture superiority effect in free recall but not in category instance generation. In the third experiment, when the same study sequence was used as in Experiment 1, but with instructions that encouraged relational processing, priming on the category instance generation task was enhanced by conceptual repetition. Results demonstrate that conceptual memory tests can be dissociated and present problems for Roediger's (1990) transfer appropriate processing account of dissociations between explicit and implicit tests.

  4. A boundary-optimized rejection region test for the two-sample binomial problem.

    Science.gov (United States)

    Gabriel, Erin E; Nason, Martha; Fay, Michael P; Follmann, Dean A

    2018-03-30

    Testing the equality of 2 proportions for a control group versus a treatment group is a well-researched statistical problem. In some settings, there may be strong historical data that allow one to reliably expect that the control proportion is one, or nearly so. While one-sample tests or comparisons to historical controls could be used, neither can rigorously control the type I error rate in the event the true control rate changes. In this work, we propose an unconditional exact test that exploits the historical information while controlling the type I error rate. We sequentially construct a rejection region by first maximizing the rejection region in the space where all controls have an event, subject to the constraint that our type I error rate does not exceed α for any true event rate; then with any remaining α we maximize the additional rejection region in the space where one control avoids the event, and so on. When the true control event rate is one, our test is the most powerful nonrandomized test for all points in the alternative space. When the true control event rate is nearly one, we demonstrate that our test has equal or higher mean power, averaging over the alternative space, than a variety of well-known tests. For the comparison of 4 controls and 4 treated subjects, our proposed test has higher power than all comparator tests. We demonstrate the properties of our proposed test by simulation and use our method to design a malaria vaccine trial. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  5. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...

  6. Operational aspects, results and problems associated with R/B testing at Gentilly 2

    International Nuclear Information System (INIS)

    Garceau, N.; Beaudoin, R.

    1991-01-01

    There are many methods, some more complex or difficult to deal with than others, to verify the containment building integrity. At G-2, we chose the temperature compensation method. Our selection criteria were: 1) the greater precision of this method; 2) the possibility of executing the test with the plant running at full power; 3) short period required for the test; 4) after the technique is understood, its simplicity of execution; 5) can be easily inserted in the normal operating test program with a minimum of personnel; 6) this technique can be used at both low and high pressure. In this presentation we will succinctly discuss the different phases of the technique such as: the background, the prerequisite, the problems, the results and, finally, we will give some recommendations to facilitate the use of this method

  7. Application of multilinear regression analysis in modeling of soil ...

    African Journals Online (AJOL)

    The application of a Multi-Linear Regression Analysis (MLRA) model for predicting soil properties in Calabar South offers a technical guide and solution to foundation design problems in the area. Forty-five soil samples were collected from fifteen different boreholes at different depths, and 270 tests were carried out for CBR, ...

  8. Solution of the Chandler-Gibson equations for a three-body test problem

    International Nuclear Information System (INIS)

    Gibson, A.G.; Waters, A.J.; Berthold, G.H.; Chandler, C.

    1991-01-01

    The Chandler-Gibson (CG) N-body equations are tested by considering the problem of three nonrelativistic particles moving on a line and interacting through attractive delta-function potentials. In particular, the input Born and overlap matrix-valued functions are evaluated analytically, and the CG equations are solved using a B-spline collocation method. The computed scattering matrix elements are within 0.5% of the known exact solutions, and the corresponding scattering probabilities are within 0.001% of the exact probabilities, both below and above the 3-body breakup threshold. These results establish that the CG method is practical, as well as theoretically correct, and may be a valuable approach for solving certain more complicated N-body scattering problems

  9. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Some problems on materials tests in high temperature hydrogen base gas mixture

    International Nuclear Information System (INIS)

    Shikama, Tatsuo; Tanabe, Tatsuhiko; Fujitsuka, Masakazu; Yoshida, Heitaro; Watanabe, Ryoji

    1980-01-01

    Some problems have been examined on materials tests (creep rupture tests and corrosion tests) in a high temperature hydrogen-base gas mixture (80% H2 + 15% CO + 5% CO2) simulating the reducing gas for direct steel making. H2, CO, CO2 and CH4 in the reducing gas interact with each other at elevated temperature and produce water vapor (H2O) and carbon (soot). Carbon deposited on the walls of retorts and water condensed at the pipings of the lower temperature gas outlet cause blocking of gas flow. The gas reactions have been found to be catalyzed by the retort walls, and appropriate selection of the materials for retorts has been found to mitigate the problems caused by water condensation and carbon deposition. Quartz has been recognized to be one of the most promising materials for minimizing the gas reactions. A ceramic coating, namely BN (boron nitride) on the heat resistant superalloy MO-RE II, has reduced the amounts of water vapor and deposited carbon (sooting) produced by gas reactions and has kept dew points of the outlet gas below room temperature. The well known emf (thermo-electromotive force) deterioration of Alumel-Chromel thermocouples in the reducing gases at elevated temperatures has also been found to be prevented by the ceramic (BN) coating. (author)

  11. Normalization Ridge Regression in Practice I: Comparisons Between Ordinary Least Squares, Ridge Regression and Normalization Ridge Regression.

    Science.gov (United States)

    Bulcock, J. W.

    The problem of model estimation when the data are collinear was examined. Though ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem-free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…
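    The variance trade-off behind the RR-versus-OLS comparison can be checked with a short Monte Carlo experiment. The sketch below uses synthetic collinear predictors and an arbitrary ridge parameter; it illustrates the general phenomenon, not the paper's normalization variant.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    beta, lam, n = np.array([1.0, 1.0]), 1.0, 50
    ols_est, ridge_est = [], []
    for _ in range(2000):
        x1 = rng.standard_normal(n)
        X = np.column_stack([x1, x1 + 0.05 * rng.standard_normal(n)])  # collinear
        y = X @ beta + rng.standard_normal(n)
        ols_est.append(np.linalg.solve(X.T @ X, X.T @ y))
        ridge_est.append(np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y))
    # Ridge trades a little bias for a large variance reduction.
    print("OLS   var of b1:", np.var([b[0] for b in ols_est]))
    print("ridge var of b1:", np.var([b[0] for b in ridge_est]))
    ```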

  12. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems: simple_ace.pl and simple_ace_mg.pl.

  13. 49 CFR 40.208 - What problem requires corrective action but does not result in the cancellation of a test?

    Science.gov (United States)

    2010-10-01

    49 CFR Part 40 (Procedures for Transportation Workplace Drug and Alcohol Testing Programs), Subpart: Problems in Drug Tests, § 40.208: What problem requires corrective action but does not result in the cancellation of a test?

  14. Prediction of spatial soil property information from ancillary sensor data using ordinary linear regression: Model derivations, residual assumptions and model validation tests

    Science.gov (United States)

    Geospatial measurements of ancillary sensor data, such as bulk soil electrical conductivity or remotely sensed imagery data, are commonly used to characterize spatial variation in soil or crop properties. Geostatistical techniques like kriging with external drift or regression kriging are often use...

  15. Abstract Expression Grammar Symbolic Regression

    Science.gov (United States)

    Korns, Michael F.

    This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, which allows total user control of the search space and output formulas, which is faster, and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, aged layered populations, plus discrete and continuous differential evolution are used to produce an improved symbolic regression system. Nine base test cases, from the literature, are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial strength symbolic regression systems.

  16. A Simulation Investigation of Principal Component Regression.

    Science.gov (United States)

    Allen, David E.

    Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
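    Principal component regression itself is a two-step pipeline: project the collinear predictors onto their leading principal components, then regress the outcome on the scores. A minimal sketch with synthetic data follows; the single-component choice is an illustrative assumption.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    z = rng.standard_normal((100, 1))
    X = z + 0.05 * rng.standard_normal((100, 3))   # three collinear predictors
    y = X @ np.array([1.0, 1.0, 1.0]) + 0.1 * rng.standard_normal(100)

    # Regress on the first principal component instead of the raw columns.
    pcr = make_pipeline(PCA(n_components=1), LinearRegression()).fit(X, y)
    print("R^2 using one component:", pcr.score(X, y))
    ```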

  17. Simulation of international standard problem no. 44 open tests using Melcor computer code

    International Nuclear Information System (INIS)

    Song, Y.M.; Cho, S.W.

    2001-01-01

    The MELCOR 1.8.4 code has been employed to simulate the KAEVER test series K123/K148/K186/K188, which were proposed as open experiments of International Standard Problem No. 44 by OECD-CSNI. The main purpose of this study is to evaluate the accuracy of the MELCOR aerosol model, which calculates the aerosol distribution and settlement in a containment. For this, thermal hydraulic conditions are simulated first for the whole test period, and then the behavior of hygroscopic CsOH/CsI and insoluble Ag aerosols, which are the predominant activity carriers in a release into the containment, is compared between the experimental results and the code predictions. The calculated vessel atmospheric concentrations show a good simulation for dry aerosol but show large differences for wet aerosol, due to a data mismatch in vessel humidity and the hygroscopicity. (authors)

  18. Pilot study risk assessment for selected problems at the Nevada Test Site (NTS)

    International Nuclear Information System (INIS)

    Daniels, J.I.; Andricevic, R.; Jacobson, R.L.

    1993-06-01

    The Nevada Test Site (NTS) is located in southwestern Nevada, about 105 km (65 mi) northwest of the city of Las Vegas. A series of tests was conducted in the late 1950s and early 1960s at or near the NTS to study issues involving plutonium-bearing devices. These tests resulted in the dispersal of about 5 TBq of 239,240Pu on the surficial soils at the test locations. Additionally, underground tests of nuclear weapons devices have been conducted at the NTS since late 1962; ground water beneath the NTS has been contaminated with radionuclides produced by these tests. These two important problems have been selected for assessment. Regarding the plutonium contamination, because the residual 239Pu decays slowly (half-life of 24,110 y), these sites could represent a long-term hazard if they are not remediated and if institutional controls are lost. To investigate the magnitude of the potential health risks for this no-remediation case, three basic exposure scenarios were defined that could bring individuals in contact with 239,240Pu at the sites: (1) a resident living in a subdivision, (2) a resident farmer, and (3) a worker at a commercial facility -- all located at a test site. The predicted cancer risks for the resident farmer were more than a factor of three times higher than the suburban resident at the median risk level, and about a factor of ten greater than the reference worker at a commercial facility. At 100 y from the present, the 5, 50, and 95th percentile risks for the resident farmer at the most contaminated site were 4 x 10^-6, 6 x 10^-5, and 5 x 10^-4, respectively. For the assessment of Pu in surface soil, the principal sources of uncertainty in the estimated risks were population mobility, the relationship between indoor and outdoor contaminant levels, and the dose and risk factors for bone, liver, and lung

  19. Pollen tube growth test (PTGT) in environmental biomonitoring and predictive radiation biology studies: problem and prospect

    International Nuclear Information System (INIS)

    Pandey, Devi D.

    2012-01-01

    In environmental and human biomonitoring studies of hazardous xenobiotics acting on living systems, particularly at the cell level, it is desirable to have easy and sensitive test systems such as the cell viability assay, the micronucleus test (MNT), the cell culture phototoxicity test, and the pollen tube growth test (PTGT). Of these, the PTGT compares favourably because the in vitro culture of pollen grains provides a sensitive indication of toxicity at the cellular level: germination and growth of the pollen tube are inhibited in the presence of toxic substances such as DDT, heavy metals, and even radionuclides. The test is easy, economical and widely accepted throughout the world. Because the pollen tube contains no chloroplasts or other plastids, it resembles animal cells more than a typical plant organ and is therefore also suitable as a model for genotoxicity assessment of compounds harmful to animals and humans. Owing to this lack of plastids, however, the PTGT will not identify the toxic effects of compounds that target cyclic and non-cyclic photophosphorylation in photosynthesis. The test is valid under international toxicity testing protocols. The method is, however, time consuming, and measurement is complicated because pollen tubes growing in a culture medium usually bend, making length measurement difficult. Another disadvantage is the requirement for DMSO to dissolve test substances of low water solubility in the culture medium; DMSO has been shown to have no effect on pollen tube growth at concentrations of not more than 1%, but may to some extent interfere with results. Values of pollen tube growth are quantified as ED50/IC50, that is, the concentration of the test compound that reduces pollen tube growth to 50% of the control. The PTGT could thus be a very sensitive assay that is easy to perform in an ordinary laboratory in an internationally comparable way. (author)

  20. Some problems with non-inferiority tests in psychotherapy research: psychodynamic therapies as an example.

    Science.gov (United States)

    Rief, Winfried; Hofmann, Stefan G

    2018-02-14

    In virtually every field of medicine, non-inferiority trials and meta-analyses with non-inferiority conclusions are increasingly common. This non-inferiority approach has been frequently used by a group of authors favoring psychodynamic therapies (PDTs), concluding that PDTs are just as effective as cognitive-behavioral therapies (CBT). We focus on these examples to exemplify some problems associated with non-inferiority tests of psychological treatments, although the problems also apply to psychopharmacotherapy research, CBT research, and others. We conclude that non-inferiority trials have specific risks of different types of validity problems, usually favoring an (erroneous) non-inferiority conclusion. Non-inferiority trials require the definition of non-inferiority margins, and currently used thresholds have a tendency to be inflationary, not protecting sufficiently against degradation. The use of non-inferiority approaches can lead to the astonishing result that one single analysis can suggest both superiority of the comparator (here: CBT) and non-inferiority of the other treatment (here: PDT) at the same time. We provide recommendations on how to improve the quality of non-inferiority trials, and we recommend considering them among other criteria when evaluating manuscripts examining non-inferiority trials. If psychotherapeutic families (such as PDT and CBT) differ in the number of trials investigating them, in their fields of clinical application, and in the other validity aspects mentioned above, conclusions about their general non-inferiority are no more than a best guess, typically expressing the favored approach of the lead author.

  1. Chaotic Multiobjective Evolutionary Algorithm Based on Decomposition for Test Task Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Hui Lu

    2014-01-01

    Test task scheduling problem (TTSP) is a complex optimization problem with many local optima. In this paper, a hybrid chaotic multiobjective evolutionary algorithm based on decomposition (CMOEA/D) is presented to avoid becoming trapped in local optima and to obtain high quality solutions. First, we propose an improved integrated encoding scheme (IES) to increase efficiency. Then ten chaotic maps are applied to the multiobjective evolutionary algorithm based on decomposition (MOEA/D) in three phases, that is, in the initial population and in the crossover and mutation operators. To identify a good approach for hybridizing MOEA/D with chaos, and to indicate the effectiveness of the improved IES, several experiments are performed. The Pareto front and the statistical results demonstrate that different chaotic maps in different phases have different effects on solving the TTSP, especially the circle map and the ICMIC map. The similarity degree of distribution between chaotic maps and the problem is an essential factor for the application of chaotic maps. In addition, experiments comparing CMOEA/D and variable neighborhood MOEA/D (VNM) indicate that our algorithm has the best performance in solving the TTSP.
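    The first of the three phases, chaotic initialization of the population, can be sketched with the logistic map standing in for the maps studied in the paper (circle, ICMIC, and others); the seed and map parameter below are illustrative choices.

    ```python
    import numpy as np

    def logistic_map_population(pop_size, dim, x0=0.37, r=4.0):
        """Seed an initial population from a chaotic map instead of a
        uniform RNG; successive iterates fill the (0, 1) decision space."""
        pop = np.empty((pop_size, dim))
        x = x0
        for i in range(pop_size):
            for j in range(dim):
                x = r * x * (1.0 - x)   # chaotic logistic-map iteration
                pop[i, j] = x
        return pop

    print(logistic_map_population(4, 3))
    ```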

  2. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
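    The basic Gaussian process regression step underlying the book's functional-regression methods, a posterior mean with pointwise uncertainty, looks like this with standard tooling; the kernel and noise level are illustrative choices, not the book's models.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(10)
    X = rng.uniform(0, 10, size=(25, 1))
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(25)

    # GP with an RBF prior; alpha models the observation noise variance.
    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.01)
    gpr.fit(X, y)
    mean, std = gpr.predict(np.array([[5.0]]), return_std=True)
    print(f"f(5.0) = {mean[0]:.2f} +/- {std[0]:.2f}")
    ```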

  3. Harmonic voltage excess problem test and analysis in UHV and EHV grid particular operation mode

    Science.gov (United States)

    Lv, Zhenhua; Shi, Mingming; Fei, Juntao

    2018-02-01

    Power-quality tests and analyses of several 1000 kV UHV and 500 kV EHV transmission lines were carried out. Harmonic voltage limits were found to be exceeded when the UHV or EHV line was supplied single-ended or as a single loop, and the problem largely disappeared after the operation mode was changed. Across the different operating conditions the harmonic current was not greatly affected, indicating that the harmonic voltage changes were caused mainly by changes in the system harmonic impedance. Analysis with a MATLAB Simulink system model shows that specific harmonic voltages exceed their limits under the specific operating modes, resulting in serious distortion of those harmonics. Since such phenomena were found in both 500 kV and 1000 kV systems, it is suggested that test evaluations be performed under typical operation modes during the planning and construction of 500 kV and 1000 kV systems to prevent serious distortion, and that regional harmonic current monitoring and suppression work be carried out.

  4. Prediction of the semiscale blowdown heat transfer test S-02-8 (NRC Standard Problem Five)

    International Nuclear Information System (INIS)

    Fujita, N.; Irani, A.A.; Mecham, D.C.; Sawtelle, G.R.; Moore, K.V.

    1976-10-01

    Standard Problem Five was the prediction of test S-02-8 in the Semiscale Mod-1 experimental program. The Semiscale System is an electrically heated experiment designed to produce data on system performance typical of PWR thermal-hydraulic behavior. The RELAP4 program used for these analyses is a digital computer program developed to predict the thermal-hydraulic behavior of experimental systems and water-cooled nuclear reactors subjected to postulated transients. The RELAP4 predictions of Standard Problem Five were in good overall agreement with the measured hydraulic data. Fortunately, sufficient experience has been gained with the Semiscale break configuration and the critical flow models in RELAP4 to accurately predict the break flow and, hence, the overall system depressurization. Generally, the hydraulic predictions are quite good in regions where homogeneity existed. Where separation effects occurred, predictions are not as good, and the data oscillations and error bands are larger. A large discrepancy existed among the measured heater rod temperature data, as well as between these data and predicted values. Several potential causes for these differences were considered, and several post-test analyses were performed in order to evaluate the discrepancies.

  5. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is ... an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition, has been expanded…

  6. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    …percentile scale. Relating back to the original scale of the exposure solves the problem. The conclusion holds for all regression models.

  7. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    Our experiment shows that subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. This implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means.

  8. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter focuses on only one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles, and it can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction to the principle of quantile regression, including an illustrative application from empirical labor market research, followed by a brief sketch of the underlying statistical model for linear quantile regression.
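
    The sketch below shows the idea on synthetic data: under heteroscedastic noise the estimated slope differs across quantiles, which a mean regression would miss. It is a minimal sketch using statsmodels; the variable names and the data-generating process are illustrative assumptions.

        # Minimal sketch: conditional median and 0.9-quantile slopes diverge
        # under heteroscedastic noise; "x" and "y" are hypothetical variables.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        df = pd.DataFrame({"x": rng.uniform(0, 10, 500)})
        df["y"] = 2.0 + 0.5 * df["x"] + rng.normal(scale=1 + 0.3 * df["x"])

        for q in (0.5, 0.9):
            fit = smf.quantreg("y ~ x", df).fit(q=q)
            print(q, fit.params["x"])  # the slope grows with the quantile here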

  9. Pilot study risk assessment for selected problems at the Nevada Test Site (NTS)

    Energy Technology Data Exchange (ETDEWEB)

    Daniels, J.I. [ed.; Anspaugh, L.R.; Bogen, K.T.; Daniels, J.I.; Layton, D.W.; Straume, T. [Lawrence Livermore National Lab., CA (United States); Andricevic, R.; Jacobson, R.L. [Nevada Univ., Las Vegas, NV (United States). Water Resources Center; Meinhold, A.F.; Holtzman, S.; Morris, S.C.; Hamilton, L.D. [Brookhaven National Lab., Upton, NY (United States)

    1993-06-01

    The Nevada Test Site (NTS) is located in southwestern Nevada, about 105 km (65 mi) northwest of the city of Las Vegas. A series of tests was conducted in the late 1950s and early 1960s at or near the NTS to study issues involving plutonium-bearing devices. These tests resulted in the dispersal of about 5 TBq of 239,240Pu on the surficial soils at the test locations. Additionally, underground tests of nuclear weapons devices have been conducted at the NTS since late 1962; ground water beneath the NTS has been contaminated with radionuclides produced by these tests. These two important problems have been selected for assessment. Regarding the plutonium contamination, because the residual 239Pu decays slowly (half-life of 24,110 y), these sites could represent a long-term hazard if they are not remediated and if institutional controls are lost. To investigate the magnitude of the potential health risks for this no-remediation case, three basic exposure scenarios were defined that could bring individuals in contact with 239,240Pu at the sites: (1) a resident living in a subdivision, (2) a resident farmer, and (3) a worker at a commercial facility -- all located at a test site. The predicted cancer risks for the resident farmer were more than a factor of three higher than those for the suburban resident at the median risk level, and about a factor of ten greater than those for the reference worker at a commercial facility. At 100 y from the present, the 5th, 50th, and 95th percentile risks for the resident farmer at the most contaminated site were 4 x 10^-6, 6 x 10^-5, and 5 x 10^-4, respectively. For the assessment of Pu in surface soil, the principal sources of uncertainty in the estimated risks were population mobility, the relationship between indoor and outdoor contaminant levels, and the dose and risk factors for bone, liver, and lung.

  10. Process-oriented tests for validation of baroclinic shallow water models: The lock-exchange problem

    Science.gov (United States)

    Kolar, R. L.; Kibbey, T. C. G.; Szpilka, C. M.; Dresback, K. M.; Tromble, E. M.; Toohey, I. P.; Hoggan, J. L.; Atkinson, J. H.

    A first step often taken to validate prognostic baroclinic codes is a series of process-oriented tests, such as those suggested by Haidvogel and Beckmann [Haidvogel, D., Beckmann, A., 1999. Numerical Ocean Circulation Modeling. Imperial College Press, London], among others. One of these tests is the so-called "lock-exchange" test or "dam break" problem, wherein water of different densities is separated by a vertical barrier, which is removed at time zero. Validation against these tests has primarily consisted of comparing the propagation speed of the wave front, as predicted by various theoretical and experimental results, to model output. In addition, inter-model comparisons of the lock-exchange test have been used to validate codes. Herein, we present a high-resolution data set, taken from a laboratory-scale model, for direct and quantitative comparison of experimental and numerical results throughout the domain, not just at the wave front. Data are captured every 0.2 s using high-resolution digital photography, with salt concentration extracted by comparing the pixel intensity of the dyed fluid against calibration standards. Two scenarios are discussed in this paper, symmetric and asymmetric mixing, depending on the proportion of dense/light water (17.5 ppt/0.0 ppt) in the experiment; the Boussinesq approximation applies to both. Front speeds, cast in terms of the dimensionless Froude number, show excellent agreement with literature-reported values. Data are also used to quantify the degree of mixing, as measured by the front thickness, which also provides an error band on the front speed. Finally, experimental results are used to validate baroclinic enhancements to the barotropic shallow water ADvanced CIRCulation (ADCIRC) model, including the effect of the vertical mixing scheme on simulation results. Based on salinity data, the model provides an average root-mean-square (rms) error of 3.43 ppt for the symmetric case and 3.74 ppt for the asymmetric case, most of which can…

  11. Siblings are special: initial test of a new approach for preventing youth behavior problems.

    Science.gov (United States)

    Feinberg, Mark E; Solmeyer, Anna R; Hostetler, Michelle L; Sakuma, Kari-Lyn; Jones, Damon; McHale, Susan M

    2013-08-01

    A growing body of research documents the significance of siblings and sibling relationships for development, mental health, and behavioral risk across childhood and adolescence. Nonetheless, few well-designed efforts have been undertaken to promote positive and reduce negative youth outcomes by enhancing sibling relationships. Based on a theoretical model of sibling influences, we conducted a randomized trial of Siblings Are Special (SIBS), a group-format afterschool program for fifth graders with a younger sibling in second through fourth grades, which entailed 12 weekly afterschool sessions and three Family Nights. We tested program efficacy with a pre- and post-test design with 174 families randomly assigned to condition. In home visits at both time points, we collected data via parent questionnaires, child interviews, and observer-rated videotaped interactions, and teachers rated children's behavior at school. The program enhanced positive sibling relationships, appropriate strategies for parenting siblings, and child self-control, social competence, and academic performance; program exposure was also associated with reduced maternal depression and child internalizing problems. Results were robust across the sample, not qualified by sibling gender, age, family demographics, or baseline risk. No effects were found for sibling conflict, collusion, or child externalizing problems; we will examine follow-up data to determine if short-term impacts lead to reduced negative behaviors over time. The breadth of the SIBS program's impact is consistent with research suggesting that siblings are an important influence on development and adjustment and supports our argument that a sibling focus should be incorporated into youth and family-oriented prevention programs.

  12. Sierra/SolidMechanics 4.46 Example Problems Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose; Le, San; Littlewood, David John; Merewether, Mark Thomas; Mosby, Matthew David; Pierson, Kendall H.; Porter, Vicki L.; Shelton, Timothy; Thomas, Jesse David; Tupek, Michael R.; Veilleux, Michael

    2018-03-01

    Presented in this document are tests that exist in the Sierra/SolidMechanics example problem suite, which is a subset of the Sierra/SM regression and performance test suite. These examples showcase common and advanced code capabilities. A wide variety of other regression and verification tests exist in the Sierra/SM test suite that are not included in this manual.

  13. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
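
    As a small companion to the concepts above, the sketch below fits a multiple linear regression with an interaction term and inspects coefficients and confidence intervals. It is a minimal sketch with statsmodels; the variable names (age, dose, outcome) and the data are hypothetical.

        # Minimal sketch: multiple linear regression with an interaction term.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 200
        df = pd.DataFrame({"age": rng.uniform(20, 80, n), "dose": rng.uniform(0, 10, n)})
        df["outcome"] = (1 + 0.05 * df["age"] + 0.4 * df["dose"]
                         + 0.02 * df["age"] * df["dose"] + rng.normal(size=n))

        model = smf.ols("outcome ~ age + dose + age:dose", data=df).fit()
        print(model.summary())   # coefficients with t-tests
        print(model.conf_int())  # exact confidence intervals for the coefficients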

  14. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Science.gov (United States)

    Mohammad, Fahim; Theisen-Toupal, Jesse C; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.
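
    The evaluation criteria above are easy to compute; the sketch below scores a hypothetical rule by PPV, NPV, and ROC AUC against the abstract's PPV and/or NPV >= 0.95 cutoff. It is a minimal sketch on synthetic labels, not the study's rule-mining pipeline.

        # Minimal sketch: scoring one candidate rule by PPV, NPV, and ROC AUC.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        y_true = rng.integers(0, 2, 1000)                     # 1 = abnormal result
        scores = 0.6 * y_true + 0.7 * rng.uniform(size=1000)  # a noisy rule score
        y_pred = scores > 0.65                                # the rule's prediction

        tp = np.sum(y_pred & (y_true == 1))
        fp = np.sum(y_pred & (y_true == 0))
        tn = np.sum(~y_pred & (y_true == 0))
        fn = np.sum(~y_pred & (y_true == 1))

        ppv = tp / (tp + fp)  # how trustworthy a positive prediction is
        npv = tn / (tn + fn)  # how trustworthy a negative prediction is
        print(ppv, npv, roc_auc_score(y_true, scores))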

  15. The Screening Test for Emotional Problems-Parent Report (STEP-P): Studies of Reliability and Validity

    Science.gov (United States)

    Erford, Bradley T.; Alsamadi, Silvana C.

    2012-01-01

    Score reliability and validity of parent responses concerning their 10- to 17-year-old students were analyzed using the Screening Test for Emotional Problems-Parent Report (STEP-P), which assesses a variety of emotional problems classified under the Individuals with Disabilities Education Improvement Act. Score reliability, convergent, and…

  1. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest: Nonlinear Regression Analysis and its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data sets…

  2. The M Word: Multicollinearity in Multiple Regression.

    Science.gov (United States)

    Morrow-Howell, Nancy

    1994-01-01

    Notes that existence of substantial correlation between two or more independent variables creates problems of multicollinearity in multiple regression. Discusses multicollinearity problem in social work research in which independent variables are usually intercorrelated. Clarifies problems created by multicollinearity, explains detection of…

  3. Western Regional Conference on Testing Problems (7th, Los Angeles, California, March 14, 1958). Testing for the Discovery and Development of Human Talent.

    Science.gov (United States)

    Educational Testing Service, Los Angeles, CA.

    At the seventh Western Regional Conference on Testing Problems, the following speeches were given: (1) "A Guidance Person's Approach to Testing for the Discovery and Development of Human Talent" by Frances D. McGill; (2) "The Instructional Uses of Measurement in the Discovery and Development of Human Talent" by Roy P. Wahle; (3) "New Frontiers of…

  4. Explaining Discrepancies Between the Digit Triplet Speech-in-Noise Test Score and Self-Reported Hearing Problems in Older Adults.

    Science.gov (United States)

    Pronk, Marieke; Deeg, Dorly J H; Kramer, Sophia E

    2018-04-17

    The purpose of this study is to determine which demographic, health-related, mood, personality, or social factors predict discrepancies between older adults' functional speech-in-noise test results and their self-reported hearing problems. Data of 1,061 respondents from the Longitudinal Aging Study Amsterdam were used (ages ranged from 57 to 95 years). Functional hearing problems were measured using a digit triplet speech-in-noise test. Five questions were used to assess self-reported hearing problems. Scores on both hearing measures were dichotomized. Two discrepancy outcomes were created: (a) being unaware: those with functional but without self-reported problems (reference is aware: those with functional and self-reported problems); (b) reporting false complaints: those without functional but with self-reported problems (reference is well: those without functional and self-reported hearing problems). Two multivariable prediction models (logistic regression) were built with 19 candidate predictors. The speech reception threshold in noise was kept (forced) as a predictor in both models. Persons with higher self-efficacy (to initiate behavior) and higher self-esteem had higher odds of being unaware than persons with lower scores (odds ratio [OR] = 1.13 and 1.11, respectively). Women had higher odds than men (OR = 1.47). Persons with more chronic diseases and persons with worse (i.e., higher) speech reception thresholds in noise had lower odds of being unaware (OR = 0.85 and 0.91, respectively) than persons with fewer diseases and better thresholds, respectively. Higher odds of reporting false complaints were predicted by more depressive symptoms (OR = 1.06), more chronic diseases (OR = 1.21), and a larger social network (OR = 1.02). Persons with higher self-efficacy (to complete behavior) had lower odds (OR = 0.86), whereas persons with higher self-esteem had higher odds of reporting false complaints (OR = 1.21). The explained variance…

  5. A Solution to Separation and Multicollinearity in Multiple Logistic Regression.

    Science.gov (United States)

    Shen, Jianzhao; Gao, Sujuan

    2008-10-01

    In dementia screening tests, item selection for shortening an existing screening test can be achieved using multiple logistic regression. However, maximum likelihood estimates for such logistic regression models often suffer serious bias or even non-existence because of separation and multicollinearity problems resulting from a large number of highly correlated items. Firth (1993, Biometrika, 80(1), 27-38) proposed a penalized likelihood estimator for generalized linear models, which was shown to reduce bias and the non-existence problems. Ridge regression has been used in logistic regression to stabilize the estimates in cases of multicollinearity. However, neither approach solves both problems. In this paper, we propose a double penalized maximum likelihood estimator combining Firth's penalized likelihood equation with a ridge parameter. We present a simulation study evaluating the empirical performance of the double penalized likelihood estimator in small to moderate sample sizes. We demonstrate the proposed approach using current screening data from a community-based dementia study.
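
    To illustrate the ridge half of the idea, the sketch below fits an L2-penalized logistic regression on two nearly collinear predictors. This is a minimal sketch with scikit-learn; the paper's double penalized (Firth plus ridge) estimator is not part of standard scikit-learn, and the data here are synthetic.

        # Minimal sketch: L2 (ridge) penalized logistic regression stabilizes
        # coefficients when predictors are nearly collinear; data are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n = 150
        x1 = rng.normal(size=n)
        x2 = x1 + rng.normal(scale=0.05, size=n)  # nearly collinear with x1
        X = np.column_stack([x1, x2])
        y = (x1 + rng.normal(scale=0.5, size=n) > 0).astype(int)

        # Smaller C means a stronger ridge penalty and more stable estimates.
        fit = LogisticRegression(penalty="l2", C=0.5).fit(X, y)
        print(fit.coef_)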

  6. Strategies for Solving Potential Problems Associated with Laboratory Diffusion and Batch Experiments - Part 1: An Overview of Conventional Test Methods

    International Nuclear Information System (INIS)

    Zhang, M.; Takeda, M.; Nakajima, H.

    2006-01-01

    Laboratory diffusion testing as well as batch experiments are well established and widely adopted techniques for characterizing the diffusive and adsorptive properties of geological, geotechnical, and synthetic materials in both scientific and applied fields, including geological disposal of radioactive waste. Although several types of diffusion test, such as the through-diffusion test, in-diffusion test, out-diffusion test, and column test, are currently available, different methods may have different advantages and disadvantages. In addition, traditional methods may have limitations, such as the need for relatively long test times, cumbersome test procedures, and the possibility of errors due to differences between analytical assumptions and actual test conditions. Furthermore, traditional batch experiments using mineral powders are known to overestimate the sorption coefficient. In part 1 of this report, we present a brief overview of laboratory diffusion and batch experiments. The advantages, disadvantages, limitations, and/or potential problems associated with individual tests were compared and summarized. This comprehensive report will provide practical references for reviewing the results obtained from relevant experiments, especially from the viewpoint of regulation. To solve and/or eliminate the potential problems associated with conventional methods, and to obtain the diffusion coefficient and rock capacity factor from a laboratory test both rapidly and accurately, part 2 of this study discusses possible strategies involving the development of rigorous solutions to some relevant test methods, and sensitivity analyses for the related tests that may be helpful to judge the accuracy of the two parameters to be determined from individual tests. (authors)

  7. The 1-min Screening Test for Reading Problems in College Students: Psychometric Properties of the 1-min TIL.

    Science.gov (United States)

    Fernandes, Tânia; Araújo, Susana; Sucena, Ana; Reis, Alexandra; Castro, São Luís

    2017-02-01

    Reading is a central cognitive domain, but little research has been devoted to standardized tests for adults. We, thus, examined the psychometric properties of the 1-min version of Teste de Idade de Leitura (Reading Age Test; 1-min TIL), the Portuguese version of Lobrot L3 test, in three experiments with college students: typical readers in Experiment 1A and B, dyslexic readers and chronological age controls in Experiment 2. In Experiment 1A, test-retest reliability and convergent validity were evaluated in 185 students. Reliability was >.70, and phonological decoding underpinned 1-min TIL. In Experiment 1B, internal consistency was assessed by presenting two 45-s versions of the test to 19 students, and performance in these versions was significantly associated (r = .78). In Experiment 2, construct validity, criterion validity and clinical utility of 1-min TIL were investigated. A multiple regression analysis corroborated construct validity; both phonological decoding and listening comprehension were reliable predictors of 1-min TIL scores. Logistic regression and receiver operating characteristics analyses revealed the high accuracy of this test in distinguishing dyslexic from typical readers. Therefore, the 1-min TIL, which assesses reading comprehension and potential reading difficulties in college students, has the necessary psychometric properties to become a useful screening instrument in neuropsychological assessment and research.

  8. The Screening Test for Emotional Problems--Teacher-Report Version (Step-T): Studies of Reliability and Validity

    Science.gov (United States)

    Erford, Bradley T.; Butler, Caitlin; Peacock, Elizabeth

    2015-01-01

    The Screening Test for Emotional Problems-Teacher Version (STEP-T) was designed to identify students aged 7-17 years with wide-ranging emotional disturbances. Coefficients alpha and test-retest reliability were adequate for all subscales except Anxiety. The hypothesized five-factor model fit the data very well and external aspects of validity were…

  9. Some problems in the technique of high-voltage testing of the accelerating tube gaps in electrostatic accelerators

    International Nuclear Information System (INIS)

    Romanov, V.A.; Ivanov, V.V.; Mukhametshin, V.I.; Dmitriev, E.P.; Kidalov, A.I.

    1983-01-01

    Problems arising during high-voltage testing and training of accelerating tubes of electrostatic accelerators are discussed. A test rig and technique for accelerating-tube testing are described, together with a program for processing the data obtained and sorting the samples investigated.

  10. Thermohydraulic analysis of the IAEA standard problem test on the PMK-NHV facility

    Energy Technology Data Exchange (ETDEWEB)

    Stritar, A [Institut Jozef Stefan, Ljubljana (Yugoslavia)

    1987-07-01

    The International Atomic Energy Agency (IAEA) has sponsored a standard test problem simulating a small-break loss-of-coolant accident on the PMK-NHV test facility in Budapest. The present pretest analysis of that transient was done using the computer code RELAP4/MOD6. The results were compared to the measurement data and to the data of 19 other laboratories around the world that performed the same analysis. The agreement of the results with the measured data is reasonable. Larger discrepancies exist for some variables, which in turn influence other variables. (author)

  11. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    Science.gov (United States)

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  12. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
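
    The sketch below makes the procedure concrete: fit a logistic regression and exponentiate the coefficients to read off adjusted odds ratios. It is a minimal sketch with statsmodels; the variables (age, smoker, disease) and the data are hypothetical, not from the article.

        # Minimal sketch: exponentiated logistic regression coefficients are
        # adjusted odds ratios; "age", "smoker", and "disease" are hypothetical.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 500
        df = pd.DataFrame({"age": rng.uniform(20, 80, n), "smoker": rng.integers(0, 2, n)})
        logit_p = -4 + 0.04 * df["age"] + 0.9 * df["smoker"]
        df["disease"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        fit = smf.logit("disease ~ age + smoker", data=df).fit()
        print(np.exp(fit.params))      # odds ratios, each adjusted for the other variable
        print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale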

  13. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illustrates…

  14. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  15. Using Logistic Regression for Validating or Invalidating Initial Statewide Cut-Off Scores on Basic Skills Placement Tests at the Community College Level

    Science.gov (United States)

    Secolsky, Charles; Krishnan, Sathasivam; Judd, Thomas P.

    2013-01-01

    The community colleges in the state of New Jersey went through a process of establishing statewide cut-off scores for English and mathematics placement tests. The colleges wanted to communicate to secondary schools a consistent preparation that would be necessary for enrolling in Freshman Composition and College Algebra at the community college…

  16. Associations of breed and feeding management with milk production curves at herd level using a random regression test-day model

    NARCIS (Netherlands)

    Caccamo, M.; Veerkamp, R.F.; Ferguson, J.D.; Petriglieri, R.; Terra, La F.; Licitra, G.

    2010-01-01

    Earlier studies identified large between-herd variation in estimated lactation curve parameters from test-day milk yield and milk composition records collected in Ragusa province, Italy. The objective of this study was to identify sources of variation able to explain these between-herd differences

  17. Understanding poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes.
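
    The sketch below fits a Poisson regression, checks for overdispersion via the Pearson chi-square per degree of freedom, and refits a negative binomial model, mirroring the alternatives named above. It is a minimal sketch with statsmodels on synthetic counts; the variable names are hypothetical and this is not the ENSPIRE analysis.

        # Minimal sketch: Poisson regression, a Pearson-based overdispersion
        # check, and a negative binomial refit; the count data are synthetic.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        n = 300
        df = pd.DataFrame({"exposure_years": rng.uniform(1, 10, n)})
        mu = np.exp(0.2 + 0.15 * df["exposure_years"])
        df["events"] = rng.negative_binomial(2, (2 / (2 + mu)).to_numpy())  # overdispersed

        poisson_fit = smf.glm("events ~ exposure_years", data=df,
                              family=sm.families.Poisson()).fit()
        print(poisson_fit.pearson_chi2 / poisson_fit.df_resid)  # >> 1 signals overdispersion
        negbin_fit = smf.negativebinomial("events ~ exposure_years", data=df).fit()
        print(negbin_fit.params)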

  18. Polylinear regression analysis in radiochemistry

    International Nuclear Information System (INIS)

    Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.

    1995-01-01

    A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from an orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented, which is realized using standard software for regression analysis.

  19. The students’ ability in mathematical literacy for the quantity, and the change and relationship problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Yudhi Anggoro, Ant.

    2017-09-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on a PISA adaptation test. The procedure conducted by the researchers to achieve this objective was to (1) adapt the PISA test, (2) validate the adapted PISA test, (3) ask junior high school students to take the adapted PISA test, and (4) construct the students' solution profiles. The PISA problems for mathematics can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper are those for the quantity and the change and relationship problems. The adapted PISA test contained fifteen questions: two for the quantity group, six for the space and shape group, three for the change and relationship group, and four for uncertainty. Subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. The study used a qualitative research design. For the first quantity problem, 38.89% of students achieved level 3. For the second quantity problem, 88.89% of students achieved level 2. For part a of the first change and relationship problem, 55.56% of students achieved level 5. For part b of the first change and relationship problem, 77.78% of students achieved level 2. For the second change and relationship problem, 38.89% of students achieved level 2.

  20. Is Trait Rumination Associated with the Ability to Generate Effective Problem Solving Strategies? Utilizing Two Versions of the Means-Ends Problem-Solving Test.

    Science.gov (United States)

    Hasegawa, Akira; Nishimura, Haruki; Mastuda, Yuko; Kunisato, Yoshihiko; Morimoto, Hiroshi; Adachi, Masaki

    This study examined the relationship between trait rumination and the effectiveness of problem-solving strategies as assessed by the Means-Ends Problem-Solving Test (MEPS) in a nonclinical population. The present study extended previous studies by using two instructions in the MEPS: the second-person, actual-strategy instructions, which have been utilized in previous studies on rumination, and the third-person, ideal-strategy instructions, which are considered more suitable for assessing the effectiveness of problem-solving strategies. We also replicated the association between rumination and each dimension of the Social Problem-Solving Inventory-Revised Short Version (SPSI-R:S). Japanese undergraduate students (N = 223) completed the Beck Depression Inventory-Second Edition, Ruminative Responses Scale (RRS), MEPS, and SPSI-R:S. Half of the sample completed the MEPS with the second-person, actual-strategy instructions; the other participants completed the MEPS with the third-person, ideal-strategy instructions. The results showed that neither the total RRS score nor its subscale scores were significantly correlated with MEPS scores under either of the two instructions. These findings, taken together with previous findings, indicate that in nonclinical populations trait rumination is not related to the effectiveness of problem-solving strategies, but that state rumination while responding to the MEPS deteriorates the quality of strategies. The correlations between RRS and SPSI-R:S scores indicated that trait rumination in general, and its brooding subcomponent in particular, are part of cognitive and behavioral responses that attempt to avoid negative environmental and negative private events. Results also showed that reflection is a part of active problem solving.

  1. Generating feasible transition paths for testing from an extended finite state machine (EFSM) with the counter problem

    OpenAIRE

    Kalaji, AS; Hierons, RM; Swift, S

    2009-01-01

    The extended finite state machine (EFSM) is a powerful approach for modeling state-based systems. However, testing from EFSMs is complicated by the existence of infeasible paths. One important problem is the existence of a transition with a guard that references a counter variable whose value depends on previous transitions. The presence of such transitions in paths often leads to infeasible paths. This paper proposes a novel approach to bypass the counter problem. The proposed approach is ev...

  2. An accurate and efficient identification of children with psychosocial problems by means of computerized adaptive testing

    NARCIS (Netherlands)

    Vogels, Antonius G. C.; Jacobusse, Gert W.; Reijneveld, Symen A.

    2011-01-01

    Background: Questionnaires used by health services to identify children with psychosocial problems are often rather short. The psychometric properties of such short questionnaires are mostly less than needed for an accurate distinction between children with and without problems. We aimed to assess

  3. Discontinuous Petrov-Galerkin method based on the optimal test space norm for steady transport problems in one space dimension

    KAUST Repository

    Niemi, Antti; Collier, Nathan; Calo, Victor M.

    2013-01-01

    We revisit the finite element analysis of convection-dominated flow problems within the recently developed Discontinuous Petrov-Galerkin (DPG) variational framework. We demonstrate how test function spaces that guarantee numerical stability can be computed automatically with respect to the optimal test space norm. This makes the DPG method not only stable but also robust, that is, uniformly stable with respect to the Péclet number in the current application. We employ discontinuous piecewise Bernstein polynomials as trial functions and construct a subgrid discretization that accounts for the singular perturbation character of the problem to resolve the corresponding optimal test functions. We also show that a smooth B-spline basis has certain computational advantages in the subgrid discretization. The overall effectiveness of the algorithm is demonstrated on two problems for the linear advection-diffusion equation.

  4. Predicting Antitumor Activity of Peptides by Consensus of Regression Models Trained on a Small Data Sample

    Directory of Open Access Journals (Sweden)

    Ivanka Jerić

    2011-11-01

    Predicting the antitumor activity of compounds using regression models trained on a small number of compounds with measured biological activity is an ill-posed inverse problem, yet it occurs very often within the academic community. To counteract, to some extent, the overfitting problems caused by small training data, we propose using a consensus of six regression models for predicting the biological activity of a virtual library of compounds. The QSAR descriptors of 22 compounds related to the opioid growth factor (OGF, Tyr-Gly-Gly-Phe-Met) with known antitumor activity were used to train the regression models: a feed-forward artificial neural network, k-nearest neighbors, sparseness-constrained linear regression, and linear and nonlinear (with polynomial and Gaussian kernels) support vector machines. The regression models were applied to a virtual library of 429 compounds, yielding six lists of candidate compounds ranked by predicted antitumor activity. The highly ranked candidate compounds were synthesized, characterized, and tested for antiproliferative activity. Some of the prepared peptides showed more pronounced activity than the native OGF; however, they were less active than the highly ranked compounds selected previously by the radial basis function support vector machine (RBF SVM) regression model. The ill-posedness of the related inverse problem causes unstable behavior of the trained regression models on test data. These results point to the high complexity of prediction based on regression models trained on a small data sample.
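
    The sketch below reproduces the consensus idea in miniature: several regressors are trained on a small sample and their predictions on a larger library are averaged to produce a ranking. It is a minimal sketch with scikit-learn on synthetic descriptors; the model list only loosely mirrors the six models named above, and no OGF data are used.

        # Minimal sketch: train several regressors on a small sample, average
        # their predictions over a larger library, and rank by the consensus.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.svm import SVR

        rng = np.random.default_rng(7)
        X_train = rng.normal(size=(22, 8))     # 22 compounds, 8 hypothetical descriptors
        y_train = X_train[:, 0] - 0.5 * X_train[:, 1] + rng.normal(scale=0.2, size=22)
        X_library = rng.normal(size=(429, 8))  # virtual library to rank

        models = [Ridge(alpha=1.0), KNeighborsRegressor(n_neighbors=3),
                  SVR(kernel="linear"), SVR(kernel="rbf"),
                  MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)]
        preds = np.mean([m.fit(X_train, y_train).predict(X_library) for m in models], axis=0)
        print(np.argsort(preds)[::-1][:10])    # top-ranked compounds by consensus activity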

  5. The neural correlates of problem states: testing FMRI predictions of a computational model of multitasking.

    Directory of Open Access Journals (Sweden)

    Jelmer P Borst

    BACKGROUND: It has been shown that people can only maintain one problem state, or intermediate mental representation, at a time. When more than one problem state is required, for example in multitasking, performance decreases considerably. This effect has been explained in terms of a problem state bottleneck. METHODOLOGY: In the current study we use the complementary methodologies of computational cognitive modeling and neuroimaging to investigate the neural correlates of this problem state bottleneck. In particular, an existing computational cognitive model was used to generate a priori fMRI predictions for a multitasking experiment in which the problem state bottleneck plays a major role. Hemodynamic responses were predicted for five brain regions, corresponding to five cognitive resources in the model. Most importantly, we predicted the intraparietal sulcus to show a strong effect of the problem state manipulations. CONCLUSIONS: Some of the predictions were confirmed by a subsequent fMRI experiment, while others were not matched by the data. The experiment supported the hypothesis that the problem state bottleneck is a plausible cause of the interference in the experiment and that it could be located in the intraparietal sulcus.

  6. Structure of the generalized momentum of a test charged particle and the inverse problem in general relativity theory

    International Nuclear Information System (INIS)

    Zakharov, A.V.; Singatullin, R.S.

    1981-01-01

    The inverse problem in general relativity theory (GRT) is solved; it consists of determining the metric and the potentials of an electromagnetic field from their values at a nonsingular point of the V4 space and from prescribed functions representing the generalized momenta of a test charged particle. The Hamilton-Jacobi equation for a test charged particle in GRT is used. The general form of the dependence of the generalized momentum on the initial values is determined. It is noted that the solution of the inverse problem of dynamics in GRT contains an arbitrariness that depends on the choice of the values of the metric and the electromagnetic field potentials at the nonsingular point.

  7. Discontinuous Petrov-Galerkin method based on the optimal test space norm for one-dimensional transport problems

    KAUST Repository

    Niemi, Antti

    2011-05-14

    We revisit the finite element analysis of convection dominated flow problems within the recently developed Discontinuous Petrov-Galerkin (DPG) variational framework. We demonstrate how test function spaces that guarantee numerical stability can be computed automatically with respect to the so-called optimal test space norm by using an element subgrid discretization. This should make the DPG method not only stable but also robust, that is, uniformly stable with respect to the Péclet number in the current application. The effectiveness of the algorithm is demonstrated on two problems for the linear advection-diffusion equation.

  8. Multicollinearity and Regression Analysis

    Science.gov (United States)

    Daoud, Jamal I.

    2017-12-01

    In regression analysis, a correlation between the response and the predictor(s) is expected, but correlation among the predictors themselves is undesirable. The number of predictors included in a regression model depends on many factors, among them historical data and experience. In the end, the selection of the most important predictors is a judgment made by the researcher. Multicollinearity is the phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may not be found to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
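
    A standard way to detect the inflation described above is the variance inflation factor (VIF). The sketch below computes VIFs for a design with two nearly collinear predictors; it is a minimal sketch with statsmodels on synthetic data, and the VIF > 10 rule of thumb is one common convention rather than a universal threshold.

        # Minimal sketch: variance inflation factors for each predictor; a VIF
        # much larger than 1 (often > 10) flags problematic collinearity.
        import numpy as np
        import pandas as pd
        from statsmodels.stats.outliers_influence import variance_inflation_factor
        from statsmodels.tools.tools import add_constant

        rng = np.random.default_rng(8)
        n = 200
        x1 = rng.normal(size=n)
        df = pd.DataFrame({"x1": x1,
                           "x2": x1 + rng.normal(scale=0.1, size=n),  # nearly collinear
                           "x3": rng.normal(size=n)})

        X = add_constant(df)
        for i, name in enumerate(X.columns):
            if name != "const":
                print(name, variance_inflation_factor(X.values, i))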

  9. Minimax Regression Quantiles

    DEFF Research Database (Denmark)

    Bache, Stefan Holst

    A new and alternative quantile regression estimator is developed, and it is shown that the estimator is root-n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.

  10. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks, a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory-optimized C++ functions with an R interface for predicting the covariate-specific absolute risks, their confidence intervals, and their confidence bands based on right-censored time-to-event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by-product we obtain … functionals. The software presented here is implemented in the riskRegression package.

  11. FLAG Simulations of the Elasticity Test Problem of Gavrilyuk et al.

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Runnels, Scott R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Canfield, Thomas R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carney, Theodore C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-04-23

    This report contains a description of the impact problem used to compare hypoelastic and hyperelastic material models, as described by Gavrilyuk, Favrie & Saurel. That description is used to set up hypoelastic simulations in the FLAG hydrocode.

  13. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
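
    The recipe above is concrete enough to sketch: for a logistic model logit(p) = a + b*x with covariate standard deviation sd, the equivalent two groups sit at linear predictors a - b*sd and a + b*sd (a difference of b * 2 * sd), after which standard two-proportion power machinery applies. The parameter values here are assumptions for illustration:

        from scipy.special import expit
        from statsmodels.stats.proportion import proportion_effectsize
        from statsmodels.stats.power import NormalIndPower

        a, b, sd = -1.0, 0.3, 1.5               # intercept, slope, covariate SD
        p1, p2 = expit(a - b * sd), expit(a + b * sd)

        effect = proportion_effectsize(p1, p2)  # Cohen's h for two proportions
        n1 = NormalIndPower().solve_power(effect_size=effect,
                                          power=0.8, alpha=0.05)
        print(p1, p2, round(2 * n1))            # total sample size for 80% power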

  14. ISP 22 OECD/NEA/CSNI International standard problem n. 22. Evaluation of post-test analyses

    International Nuclear Information System (INIS)

    1992-07-01

    The present report deals with the open re-evaluation of the originally double-blind CSNI International Standard Problem 22 based on the test SP-FW-02 performed in the SPES facility. The SPES apparatus is an experimental simulator of the Westinghouse PWR-PUN plant. The test SP-FW-02 (ISP22) simulates a complete loss of feedwater with delayed injection of auxiliary feedwater. The main parts of the report are: outline of the test facility and of the SP-FW-02 experiment; overview of pre-test activities; overview of input models used by post-test participants; evaluation of participant predictions; evaluation of qualitative and quantitative code accuracy of pre-test and post-test calculations

  15. The problem of false-positive human papillomavirus DNA tests in cervical screening

    DEFF Research Database (Denmark)

    Rebolj, Matejka; Pribac, Igor; Frederiksen, Maria Eiholm

    2013-01-01

    Human Papillomavirus (HPV) testing has been extensively studied in randomized controlled trials of primary cervical screening. Based on encouraging results concerning its high detection rates and a high negative predictive value for high-grade cervical intraepithelial neoplasia (CIN), HPV testing...... will probably replace cytology in future primary cervical screening. However, HPV testing is associated with more frequent false-positive tests compared to cytology. False-positive tests are defined as positive screening tests which are not subsequently confirmed with high-grade CIN. Several authors have...

  16. Multiple linear regression analysis

    Science.gov (United States)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
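
    The FORTRAN IV program itself is not reproduced in the record; the following is a minimal modern sketch of the same stepwise idea, adding predictors one at a time and keeping only those significant at the chosen level (the threshold and data are assumed):

        import numpy as np
        import statsmodels.api as sm

        def forward_stepwise(X, y, alpha=0.05):
            selected, remaining = [], list(range(X.shape[1]))
            while remaining:
                # p-value of each candidate when added to the current model
                pvals = {j: sm.OLS(y, sm.add_constant(X[:, selected + [j]]))
                              .fit().pvalues[-1]
                         for j in remaining}
                best = min(pvals, key=pvals.get)
                if pvals[best] >= alpha:
                    break
                selected.append(best)
                remaining.remove(best)
            return selected

        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 6))
        y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=100)
        print(forward_stepwise(X, y))   # expected: columns 0 and 3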

  17. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an

  18. Nonlinear Regression with R

    CERN Document Server

    Ritz, Christian; Parmigiani, Giovanni

    2009-01-01

    R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.

  19. Bounded Gaussian process regression

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan

    2013-01-01

    We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We...... with the proposed explicit noise-model extension....

  20. and Multinomial Logistic Regression

    African Journals Online (AJOL)

    This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).

  1. Mechanisms of neuroblastoma regression

    Science.gov (United States)

    Brodeur, Garrett M.; Bagatell, Rochelle

    2014-01-01

    Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179

  2. Analysis of some methods for reduced rank Gaussian process regression

    DEFF Research Database (Denmark)

    Quinonero-Candela, J.; Rasmussen, Carl Edward

    2005-01-01

    While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent...... proliferation of a number of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While generally GPs are equivalent to infinite linear models, we show that Reduced Rank...... Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists both in learning...
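
    A back-of-the-envelope version of the finite-sparse-linear-model view, in a "subset of regressors" flavour (the paper's exact construction and its degeneracy fix are not reproduced; kernel, lengthscale and sizes are assumptions): with m << n basis functions centred on a subset of inputs, training costs O(n m^2) instead of O(n^3).

        import numpy as np

        def rbf(a, b, ell=1.0):
            return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

        rng = np.random.default_rng(4)
        n, m, noise = 500, 20, 0.1
        x = np.sort(rng.uniform(-5, 5, n))
        y = np.sin(x) + noise * rng.normal(size=n)

        centers = x[rng.choice(n, m, replace=False)]  # the 'support' inputs
        Phi = rbf(x, centers)                         # n x m design matrix

        # posterior weights of the equivalent finite linear model
        A = Phi.T @ Phi + noise**2 * rbf(centers, centers)
        w = np.linalg.solve(A, Phi.T @ y)

        x_test = np.linspace(-5, 5, 9)
        print(rbf(x_test, centers) @ w)               # approximate GP mean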

  3. Patch testing with markers of fragrance contact allergy. Do clinical tests correspond to patients' self-reported problems?

    DEFF Research Database (Denmark)

    Johansen, J D; Andersen, T F; Veien, N

    1997-01-01

    in a questionnaire prior to patch testing with the European standard series. The questionnaire contained questions about skin symptoms from the use of scented and unscented products as well as skin reactions from contact with spices, flowers and citrus fruits that could indicate fragrance sensitivity. A highly...

  4. On Weighted Support Vector Regression

    DEFF Research Database (Denmark)

    Han, Xixuan; Clemmensen, Line Katrine Harder

    2014-01-01

    We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF‐weights). This procedure directly...... shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF‐weights) and call the combination of weights the doubly weighted SVR. We illustrate...... the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
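
    Of the two weight types, only the OF-weights have a direct off-the-shelf analogue: scikit-learn's SVR lets per-observation weights scale the slack penalties via sample_weight. The CF-weights and the doubly weighted SVR of the paper would need a custom solver, so this sketch (with assumed data and a time-decay weighting) covers the OF part only:

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(5)
        X = rng.uniform(0, 10, size=(200, 1))
        y = np.sin(X).ravel() + 0.2 * rng.normal(size=200)

        w = np.linspace(0.1, 1.0, 200)   # e.g. downweight older observations

        svr = SVR(kernel="rbf", C=10.0, epsilon=0.1)
        svr.fit(X, y, sample_weight=w)
        print(svr.predict([[2.5], [7.5]]))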

  5. Adaptation of eddy current methods to the multiple problems of reactor testing

    International Nuclear Information System (INIS)

    Stumm, W.

    1975-01-01

    In reactor testing, the eddy current method is mainly used for the testing of surface regions inside the pressure vessel, on welds and joints, and for the testing of thin-walled pipes, e.g. the heat exchanger pipes. (RW/AK)

  6. Automatically stable discontinuous Petrov-Galerkin methods for stationary transport problems: Quasi-optimal test space norm

    KAUST Repository

    Niemi, Antti H.; Collier, Nathan; Calo, Victor M.

    2013-01-01

    We investigate the application of the discontinuous Petrov-Galerkin (DPG) finite element framework to stationary convection-diffusion problems. In particular, we demonstrate how the quasi-optimal test space norm improves the robustness of the DPG method with respect to vanishing diffusion. We numerically compare coarse-mesh accuracy of the approximation when using the quasi-optimal norm, the standard norm, and the weighted norm. Our results show that the quasi-optimal norm leads to more accurate results on three benchmark problems in two spatial dimensions. We address the problems associated to the resolution of the optimal test functions with respect to the quasi-optimal norm by studying their convergence numerically. In order to facilitate understanding of the method, we also include a detailed explanation of the methodology from the algorithmic point of view. © 2013 Elsevier Ltd. All rights reserved.

  8. Test of the role of nicotine dependence in the relation between posttraumatic stress disorder and panic spectrum problems.

    Science.gov (United States)

    Feldner, Matthew T; Smith, Rose C; Babson, Kimberly A; Sachs-Ericsson, Natalie; Schmidt, Norman B; Zvolensky, Michael J

    2009-02-01

    Posttraumatic stress disorder (PTSD) frequently co-occurs with panic spectrum problems. Relatively little empirical work has tested possible mechanisms accounting for this association. Nicotine dependence often ensues subsequent to PTSD onset and research suggests smoking high numbers of cigarettes daily may lead to panic problems. The current study tested the hypotheses that nicotine dependence partially mediates the relations between PTSD and both panic attacks and panic disorder within a nationally representative sample of 5,692 (3,020 women; M(Age) = 45, SD = 18) adults from the National Comorbidity Survey-Replication. Results were consistent with hypotheses. These findings support the theory suggesting smoking among people with PTSD may be involved in the development of panic problems.

  9. A Test of the Circumvention-of-Limits Hypothesis in Scientific Problem Solving: The Case of Geological Bedrock Mapping

    Science.gov (United States)

    Hambrick, David Z.; Libarkin, Julie C.; Petcovic, Heather L.; Baker, Kathleen M.; Elkins, Joe; Callahan, Caitlin N.; Turner, Sheldon P.; Rench, Tara A.; LaDue, Nicole D.

    2012-01-01

    Sources of individual differences in scientific problem solving were investigated. Participants representing a wide range of experience in geology completed tests of visuospatial ability and geological knowledge, and performed a geological bedrock mapping task, in which they attempted to infer the geological structure of an area in the Tobacco…

  10. Accelerated Educational Change; The Annual Western Regional Conference on Testing Problems (15th, San Francisco, California, May 6, 1966).

    Science.gov (United States)

    Educational Testing Service, Princeton, NJ.

    The 1966 meeting of the Western Regional Conference on Testing Problems dealt with accelerated educational change. The following speeches were presented: (1) "Access to Higher Education: Implications for Future Planning" by Richard Pearson; (2) "The Differentiated Youth: A Challenge to Traditional Institutions" by Joseph D. Lohman; (3) "Teaching…

  11. The Influence of Maternal Acculturation, Neighborhood Disadvantage, and Parenting on Chinese American Adolescents' Conduct Problems: Testing the Segmented Assimilation Hypothesis

    Science.gov (United States)

    Liu, Lisa L.; Lau, Anna S.; Chen, Angela Chia-Chen; Dinh, Khanh T.; Kim, Su Yeong

    2009-01-01

    Associations among neighborhood disadvantage, maternal acculturation, parenting and conduct problems were investigated in a sample of 444 Chinese American adolescents. Adolescents (54% female, 46% male) ranged from 12 to 15 years of age (mean age = 13.0 years). Multilevel modeling was employed to test the hypothesis that the association between…

  12. Associations between maternal and paternal depressive symptoms and early child behavior problems: Testing a mutually adjusted prospective longitudinal model.

    Science.gov (United States)

    Narayanan, Martina K; Nærde, Ane

    2016-05-15

    While there is substantial empirical work on maternal depression, less is known about how mothers' and fathers' depressive symptoms compare in their association with child behavior problems in early childhood. In particular, few studies have examined unique relationships in the postpartum period by controlling for the other parent, or looked at longitudinal change in either parent's depressive symptoms across the first living years as a predictor of child problems. We examined depressive symptoms in parents at 6, 12, 24, 36 and 48 months following childbirth, and child behavior problems at 48 months. Linear growth curve analysis was used to model parents' initial levels and changes in symptoms across time and their associations with child outcomes. Mothers' depressive symptoms at 6 months predicted behavior problems at 48 months for all syndrome scales, while fathers' did not. Estimates for mothers' symptoms were significantly stronger on all subscales. Change in fathers' depressive symptoms over time was a significantly larger predictor of child aggressive behavior than corresponding change in mothers'. No interaction effects between parents' symptoms on behavior problems appeared, and few child gender differences. Child behavior was assessed once precluding tests for bidirectional effects. We only looked at linear change in parental symptoms. Mothers' postpartum depressive symptoms are a stronger predictor for early child behavior problems than fathers'. Change in fathers' depressive symptoms across this developmental period was uniquely and strongly associated with child aggressive problems, and should therefore be addressed in future research and clinical practice. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Friction and wear in liquid-metal systems: comparability problems of test results obtained from different test facilities

    International Nuclear Information System (INIS)

    Wild, E.; Mack, K.J.

    1976-01-01

    Operationally induced relative movements take place between contacting components in the core region of sodium-cooled reactors. To ensure the reliable long-term functioning of such friction-loaded components, materials are needed with good sliding properties and high wear resistance. Therefore, the tribological properties of material combinations in liquid metal have been investigated experimentally for many years at various research establishments. However, despite identical boundary conditions, comparison of the published results does not yield satisfactory agreement. The cause must be sought in the individual design and concept of the test sections used. This discrepancy was investigated. The results show that the elasticity, mass movement, and relative motion characteristic of the system prove to be the most important criteria influencing the test results

  14. Perform qualify reliability-power tests by shooting common mistakes: practical problems and standard answers per Telcordia/Bellcore requests

    Science.gov (United States)

    Yu, Zheng

    2002-08-01

    Facing the new demands of the optical fiber communications market, the performance and reliability of an optical network system depend almost entirely on the qualification of its fiber-optic components. Meeting system requirements through Telcordia/Bellcore reliability and high-power testing has therefore become the key issue for fiber-optic component manufacturers: passing these qualifications determines who stands out in an intensely competitive market. The tests themselves also need maintenance and optimization, and work on reliability and high-power testing has become a new market demand; a way is needed to reach the 'Triple-Win' goal expected by component makers, reliability testers, and system users. For those who encounter practical problems in this testing, the following seven topics address how to avoid the common mistakes in performing qualified reliability and high-power testing: qualification maintenance requirements for reliability testing; lot control in preparation for reliability testing; sample selection for reliability testing; interim measurements during reliability testing; basic reference factors relating to high-power testing; the necessity of re-qualification testing when production changes; and understanding product-family similarity according to the definitions.

  15. Neutrino oscillations in the Earth suggest a terrestrial test of solution to solar neutrino problem

    International Nuclear Information System (INIS)

    Dar, A.; Mann, A.; Technicon-Israel Inst. of Tech., Haifa. Space Research Inst.)

    1987-01-01

    The verification of the Mikheyev-Smirnov-Wolfenstein (MSW) solution of the solar neutrino problem is discussed. One verification experiment concerns the detection of sizeable oscillations of atmospheric neutrinos in the earth, which can be detected with the massive underground proton decay detectors. Diurnal and seasonal modulations of the solar neutrino flux can perhaps be detected by the radiochemical Cl and Ga detectors. Moreover, neutrino oscillations in the Earth may modify the values of the oscillation parameters which can solve the solar neutrino problem and help determine their values. (UK)

  16. Testing the Cosmic Coincidence Problem and the Nature of Dark Energy

    International Nuclear Information System (INIS)

    Dalal, Neal; Abazajian, Kevork; Jenkins, Elizabeth; Manohar, Aneesh V.

    2001-01-01

    Dark energy models which alter the relative scaling behavior of dark energy and matter could provide a natural solution to the cosmic coincidence problem -- why the densities of dark energy and dark matter are comparable today. A generalized class of dark energy models is introduced which allows noncanonical scaling of the ratio of dark matter and dark energy with the Robertson-Walker scale factor a(t). We show that determining whether there is a coincidence problem, and the extent of cosmic coincidence, can be addressed by several forthcoming experiments

  17. The Planar Sandwich and Other 1D Planar Heat Flow Test Problems in ExactPack

    Energy Technology Data Exchange (ETDEWEB)

    Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-24

    This report documents the implementation of several related 1D heat flow problems in the verification package ExactPack [1]. In particular, the planar sandwich class defined in Ref. [2], as well as the classes PlanarSandwichHot, PlanarSandwichHalf, and other generalizations of the planar sandwich problem, are defined and documented here. A rather general treatment of 1D heat flow is presented, whose main results have been implemented in the class Rod1D. All planar sandwich classes are derived from the parent class Rod1D.

  18. The art of regression modeling in road safety

    CERN Document Server

    Hauer, Ezra

    2015-01-01

    This unique book explains how to fashion useful regression models from commonly available data to erect models essential for evidence-based road safety management and research. Composed from techniques and best practices presented over many years of lectures and workshops, The Art of Regression Modeling in Road Safety illustrates that fruitful modeling cannot be done without substantive knowledge about the modeled phenomenon. Class-tested in courses and workshops across North America, the book is ideal for professionals, researchers, university professors, and graduate students with an interest in, or responsibilities related to, road safety. This book also: · Presents for the first time a powerful analytical tool for road safety researchers and practitioners · Includes problems and solutions in each chapter as well as data and spreadsheets for running models and PowerPoint presentation slides · Features pedagogy well-suited for graduate courses and workshops including problems, solutions, and PowerPoint p...

  19. Intertester reliability of clinical shoulder instability and laxity tests in subjects with and without self-reported shoulder problems.

    Science.gov (United States)

    Eshoj, Henrik; Ingwersen, Kim Gordon; Larsen, Camilla Marie; Kjaer, Birgitte Hougs; Juul-Kristensen, Birgit

    2018-03-03

    First, to investigate the intertester reliability of clinical shoulder instability and laxity tests, and second, to describe the mutual dependency of each test evaluated by each tester for identifying self-reported shoulder instability and laxity. A standardised protocol for conducting reliability studies was used to test the intertester reliability of the six clinical shoulder instability and laxity tests: apprehension, relocation, surprise, load-and-shift, sulcus sign and Gagey. Cohen's kappa (κ) with 95% CIs besides prevalence-adjusted and bias-adjusted kappa (PABAK), accounting for insufficient prevalence and bias, were computed to establish the intertester reliability and mutual dependency. Forty individuals (13 with self-reported shoulder instability and laxity-related shoulder problems and 27 normal shoulder individuals) aged 18-60 were included. Fair (relocation), moderate (load-and-shift, sulcus sign) and substantial (apprehension, surprise, Gagey) intertester reliability were observed across tests (κ 0.39-0.73; 95% CI 0.00 to 1.00). PABAK improved reliability across tests, resulting in substantial to almost perfect intertester reliability for the apprehension, surprise, load-and-shift and Gagey tests (κ 0.65-0.90). Mutual dependencies between each test and self-reported shoulder problem showed apprehension, relocation and surprise to be the most often used tests to characterise self-reported shoulder instability and laxity conditions. Four tests (apprehension, surprise, load-and-shift and Gagey) out of six were considered intertester reliable for clinical use, while relocation and sulcus sign tests need further standardisation before acceptable evidence. Furthermore, the validity of the tests for shoulder instability and laxity needs to be studied. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. An Instructors Guide to Water Pollution. Test Edition. AAAS Study Guides on Contemporary Problems, No. 5.

    Science.gov (United States)

    Kidd, David E.

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. This study guide on water pollution includes the following units: (1) Overview of World Pollution; (2) History, Definition, Criteria; (3) Ecosystem Theory; (4) Biological…

  1. Problems with Contingency Theory: Testing Assumptions Hidden within the Language of Contingency "Theory".

    Science.gov (United States)

    Schoonhoven, Claudia Bird

    1981-01-01

    Discusses problems in contingency theory, which relates organizational structure to the tasks performed and the information needed. Analysis of data from 17 hospitals suggests that traditional contingency theory underrepresents the complexity of relations among technological uncertainty, structure, and organizational effectiveness. (Author/RW)

  2. Longitudinal Relations among Parenting, Best Friends, and Early Adolescent Problem Behavior: Testing Bidirectional Effects

    Science.gov (United States)

    Reitz, Ellen; Dekovic, Maja; Meijer, Anne Marie; Engels, Rutger C. M. E.

    2006-01-01

    In this longitudinal study, the bidirectional relations between parenting and friends' deviance, on one hand, and early adolescent externalizing and internalizing problem behavior, on the other hand, are examined. Of the 650 adolescents (13- to 14-year-olds) who filled out the Youth Self-Report and questionnaires about their parents at two times…

  3. Rhetorical Dissent as an Adaptive Response to Classroom Problems: A Test of Protection Motivation Theory

    Science.gov (United States)

    Bolkan, San; Goodboy, Alan K.

    2016-01-01

    Protection motivation theory (PMT) explains people's adaptive behavior in response to personal threats. In this study, PMT was used to predict rhetorical dissent episodes related to 210 student reports of perceived classroom problems. In line with theoretical predictions, a moderated moderation analysis revealed that students were likely to voice…

  4. Dispositional and Environmental Predictors of the Development of Internalizing Problems in Childhood: Testing a Multilevel Model.

    Science.gov (United States)

    Hastings, Paul D; Helm, Jonathan; Mills, Rosemary S L; Serbin, Lisa A; Stack, Dale M; Schwartzman, Alex E

    2015-07-01

    This investigation evaluated a multilevel model of dispositional and environmental factors contributing to the development of internalizing problems from preschool-age to school-age. In a sample of 375 families (185 daughters, 190 sons) drawn from three independent samples, preschoolers' behavioral inhibition, cortisol and gender were examined as moderators of the links between mothers' negative parenting behavior, negative emotional characteristics, and socioeconomic status when children were 3.95 years, and their internalizing problems when they were 8.34 years. Children's dispositional characteristics moderated all associations between these environmental factors and mother-reported internalizing problems in patterns that were consistent with either diathesis-stress or differential-susceptibility models of individual-environment interaction, and with gender models of developmental psychopathology. Greater inhibition and lower socioeconomic status were directly predictive of more teacher reported internalizing problems. These findings highlight the importance of using multilevel models within a bioecological framework to understand the complex pathways through which internalizing difficulties develop.

  5. The English translation and testing of the Problems After Discharge Questionnaire.

    NARCIS (Netherlands)

    Holland, D.E.; Mistiaen, P.; Knafl, G.J.; Bowles, K.H.

    2011-01-01

    The quality of hospital discharge planning assessments determines whether patients receive the health and social services they need or are sent home with unmet needs and without services. There is a valid and reliable Dutch instrument that measures problems and unmet needs patients encounter after

  6. A Study Guide on Holography (Draft). Test Edition. AAAS Study Guides on Contemporary Problems.

    Science.gov (United States)

    Jeong, Tung H.

    This is one of several study guides on contemporary problems produced by the American Association for the Advancement of Science with support of the National Science Foundation. The primary purpose of this guide is to provide a student with sufficient practical and technical information to begin independently practicing holography, with occasional…

  7. The Use of the Ames Test as a Tool for Addressing Problem-Based Learning in the Microbiology Lab

    Directory of Open Access Journals (Sweden)

    Eliana Rodríguez

    2012-08-01

    Our environment is full of potential carcinogens such as UV light, industrial pollutants, pesticides, and food additives, among others. It is estimated that 90% of all carcinogens are also mutagens. The Ames test is one of the most common tests for mutagens. In this problem-based learning activity, undergraduate biology students used the Ames test to screen a substance they provided, to see if it could be considered a mutagen. The idea of surveying substances used in everyday life appealed to our students, and helped engage them in this activity.

  8. Subset selection in regression

    CERN Document Server

    Miller, Alan

    2002-01-01

    Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the second edition: a separate chapter on Bayesian methods; a complete revision of the chapter on estimation; a major example from the field of near-infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; and more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...

  9. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

    Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding—the two numbers used to represent the two possible states of the variables—might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
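
    The coding issue is easy to see in the conditional ("pseudo-likelihood") probabilities of the standard, uncentered model on a toy one-dimensional chain: with the same parameter values, (0, 1) and (-1, +1) coding give genuinely different conditional probabilities, hence different models once covariates are present. All numbers below are assumed for illustration:

        import numpy as np
        from scipy.special import expit

        def conditional_probs(z, xb, lam, plus_minus=False):
            # P(z_i = high state | neighbours) on a chain with nearest
            # neighbours, standard (uncentered) autologistic form
            nb = np.zeros(len(z))
            nb[1:] += z[:-1]
            nb[:-1] += z[1:]
            eta = xb + lam * nb
            # (0,1) coding: expit(eta); (-1,+1) coding: expit(2 * eta)
            return expit(2 * eta) if plus_minus else expit(eta)

        xb, lam = 0.5, 0.8                # covariate effect, association
        z01 = np.array([1, 0, 1, 1, 0])   # a configuration, (0,1)-coded
        zpm = 2 * z01 - 1                 # same configuration, (-1,+1)-coded

        print(conditional_probs(z01, xb, lam))
        print(conditional_probs(zpm, xb, lam, plus_minus=True))  # differs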

  10. Regression in organizational leadership.

    Science.gov (United States)

    Kernberg, O F

    1979-02-01

    The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.

  11. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
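
    Trees as a regression method, in the spirit of the monograph, are now a few lines in any statistics environment; a scikit-learn sketch on assumed step-function data (the library, of course, postdates the book):

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(6)
        X = rng.uniform(0, 6, size=(300, 1))
        y = np.where(X.ravel() < 3, 1.0, 4.0) + 0.3 * rng.normal(size=300)

        tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
        print(export_text(tree, feature_names=["x"]))  # recovers the split near 3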

  12. Exploring the Domain Specificity of Creativity in Children: The Relationship between a Non-Verbal Creative Production Test and Creative Problem-Solving Activities

    Directory of Open Access Journals (Sweden)

    Ahmed Mohamed

    2012-12-01

    In this study, we explored whether creativity was domain specific or domain general. The relationships between students’ scores on three creative problem-solving activities (math, spatial artistic, and oral linguistic) in the DISCOVER assessment (Discovering Intellectual Strengths and Capabilities While Observing Varied Ethnic Responses) and the TCT-DP (Test of Creative Thinking-Drawing Production), a non-verbal general measure of creativity, were examined. The participants were 135 first and second graders from two schools in the Southwestern United States, from linguistically and culturally diverse backgrounds. Pearson correlations, canonical correlations, and multiple regression analyses were calculated to describe the relationship between the TCT-DP and the three DISCOVER creative problem-solving activities. We found that creativity has both domain-specific and domain-general aspects, but that the domain-specific component seemed more prominent. One implication of these results is that educators should consider assessing creativity in specific domains to place students in special programs for gifted students rather than relying only on domain-general measures of divergent thinking or creativity.

  13. 49 CFR 40.199 - What problems always cause a drug test to be cancelled?

    Science.gov (United States)

    2010-10-01

    ... cancelled? 40.199 Section 40.199 Transportation Office of the Secretary of Transportation PROCEDURES FOR... cause a drug test to be cancelled? (a) As the MRO, when the laboratory discovers a “fatal flaw” during... specimen has been “Rejected for Testing” (with the reason stated). You must always cancel such a test. (b...

  14. Problem-Solving Test: Analysis of DNA Damage Recognizing Proteins in Yeast and Human Cells

    Science.gov (United States)

    Szeberenyi, Jozsef

    2013-01-01

    The experiment described in this test was aimed at identifying DNA repair proteins in human and yeast cells. Terms to be familiar with before you start to solve the test: DNA repair, germline mutation, somatic mutation, inherited disease, cancer, restriction endonuclease, radioactive labeling, [α-32P]ATP, [γ-32P]…

  15. A Note on Some Problems in the Testing of Personality Characteristics in Children with Visual Impairment

    Science.gov (United States)

    Tobin, Michael; Hill, Eileen

    2010-01-01

    An examination is made of the value of using published personality tests with young blind and partially sighted children. Based on data gathered during a longitudinal investigation into the educational and psychological development of a group of 120 visually impaired learners, the authors conclude that their own selection of a test instrument…

  16. Design of a Maglev Vibration Test Platform for the Research of Maglev Vehicle-girder Coupled Vibration Problem

    Directory of Open Access Journals (Sweden)

    Zhou Danfeng

    2017-01-01

    The maglev vehicle-girder coupled vibration problem has been encountered on many maglev test and commercial lines, where it significantly degrades the performance of the maglev train. Previous research on the principle of the coupled vibration problem found that the fundamental model of the maglev girder can be simplified as a series of mass-spring resonators of different but related resonance frequencies, and that the stability of the vehicle-girder coupled system can be investigated by separately examining the stability of each mass-spring resonator – electromagnet coupled system. Based on this conclusion, a maglev test platform, which includes a single electromagnetic suspension control system, is built for experimental study of the coupled vibration problem. The guideway of the test platform is supported by a number of springs so that its flexibility can be changed. The mass of the guideway can also be changed by adjusting extra weights attached to it. By changing the flexibility and mass of the guideway, the rules of the maglev vehicle-girder coupled vibration problem are examined through experiments, and the theory of vehicle-girder self-excited vibration proposed in previous research is also tested.

  17. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  18. USING A PHENOMENOLOGICAL MODEL TO TEST THE COINCIDENCE PROBLEM OF DARK ENERGY

    International Nuclear Information System (INIS)

    Chen Yun; Zhu Zonghong; Alcaniz, J. S.; Gong Yungui

    2010-01-01

    By assuming a phenomenological form for the ratio of the dark energy and matter densities, ρ_X ∝ ρ_m a^ξ, we discuss the cosmic coincidence problem in light of current observational data. Here, ξ is a key parameter that denotes the severity of the coincidence problem. In this scenario, ξ = 3 and ξ = 0 correspond to ΛCDM and the self-similar solution without the coincidence problem, respectively. Hence, any solution with a scaling parameter 0 < ξ < 3 makes the coincidence problem less severe. Standard cosmology corresponds to ξ + 3ω_X = 0, where ω_X is the equation of state of the dark energy component, whereas the inequality ξ + 3ω_X ≠ 0 represents non-standard cosmology. We place observational constraints on the parameters (Ω_X,0, ω_X, ξ) of this model, where Ω_X,0 is the present value of the density parameter of dark energy Ω_X, by using the Constitution Set (397 supernovae of type Ia data, hereafter SNeIa), the cosmic microwave background shift parameter from the five-year Wilkinson Microwave Anisotropy Probe and the Sloan Digital Sky Survey baryon acoustic peak. Combining the three samples, we get Ω_X,0 = 0.72 ± 0.02, ω_X = -0.98 ± 0.07, and ξ = 3.06 ± 0.35 at the 68.3% confidence level. The result shows that the ΛCDM model still remains a good fit to the recent observational data, and that the coincidence problem indeed exists and is quite severe, in the framework of this simple phenomenological model. We further constrain the model with the transition redshift (deceleration/acceleration). It shows that if the transition from deceleration to acceleration happens at redshift z > 0.73, within the framework of this model, we can conclude that the interaction between dark energy and dark matter is necessary.
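
    A short check of these statements, using only the definitions in the abstract (matter dilutes as ρ_m ∝ a^{-3}; dark energy with constant equation of state scales as ρ_X ∝ a^{-3(1+ω_X)}):

        \[
          \rho_X \,\propto\, \rho_m\, a^{\xi} \,\propto\, a^{\xi - 3},
        \]
        so $\xi = 3$ gives a constant $\rho_X$ (the cosmological constant of $\Lambda$CDM),
        while $\xi = 0$ makes $\rho_X$ track the matter density. Matching exponents with
        $\rho_X \propto a^{-3(1+\omega_X)}$ gives $\xi - 3 = -3(1+\omega_X)$, i.e.
        $\xi + 3\omega_X = 0$ for standard, non-interacting cosmology.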

  19. Direct Tests on Individual Behaviour in Small Decision-Making Problems

    Directory of Open Access Journals (Sweden)

    Takemi Fujikawa

    2007-10-01

    This paper provides an empirical and experimental analysis of individual decision making in small decision-making problems, using a series of laboratory experiments. Two experimental treatments with binary small decision-making problems are implemented: (1) the search treatment, in which the payoff distribution is unknown to the decision makers, and (2) the choice treatment, in which the payoff distribution is known. A first observation is that in the search treatment the tendency to select the best reply to past performance, together with misestimation of the payoff distribution, can lead to robust deviations from expected value maximisation. A second observation concerns choice problems with two options of the same expected value: one option is riskier, with larger payoff variability; the other is moderate, with less payoff variability. Experimental results show that decision makers who choose the risky option more often are likely to score higher, ex post. Finally, I investigate the exploration tendency. Comparison of results between the search treatment and the choice treatment reveals that the additional information given to the decision makers enhances expected value maximisation.

  20. Problems studied within the state research project New Methods of Nondestructive Materials Testing Using Ionizing Radiation

    International Nuclear Information System (INIS)

    Mysak, F.; Strba, J.

    1979-01-01

    A state research project is described divided into ten subprojects, viz.: New trends of ionizing radiation detection using television technology in nondestructive testing; the application of accelerators for thick-walled product testing; the atlas of butt welds of medium thicknesses; the application of radioanalytical methods in testing the wear of gearboxes and other components of instrument parts; multielemental analyses of combustion engine wear using radionuclides; the application of radioisotope methods in research into wear of antifriction bearings of trucks and railway cars; the application of radionuclides in assessing corrosion resistance of steels and corrosion protection systems; the application of radionuclide methods in improving the quality of high-grade steel production; the selection and testing of radionuclide instruments for building production control, intermediate and acceptance checks; and radioisotope methods for building machine and equipment control. (M.S.)

  1. Problems in Standardization of Orthodontic Shear Bond Strength Tests; A Brief Review

    Directory of Open Access Journals (Sweden)

    M.S. A. Akhoundi

    2005-03-01

    Bonding brackets to the enamel surface has gained much popularity today. New adhesive systems have been introduced and marketed, and a considerable amount of research regarding bond strength has been published. A considerable portion of these studies deals with the shear bond strength of adhesives designed for orthodontic purposes. Previous studies have used a variety of test designs. This diversity in test design is due to the fact that there is no standard method for evaluating shear bond strength in orthodontics; therefore, comparison of data obtained from different studies is almost impossible. This article briefly discusses the developments in the measurement of the shear bond strength of orthodontic adhesives, with an emphasis on the type of test set-up and load application. Although the test designs for measuring shear bond strength in orthodontics are still far from ideal, attempts must be made to standardize these tests, especially in order to make comparison of different data easier. It is recommended that test designs be set up in such a manner that they better match the purpose of the study.

  3. Marginal longitudinal semiparametric regression via penalized splines

    KAUST Repository

    Al Kadiri, M.; Carroll, R.J.; Wand, M.P.

    2010-01-01

    We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relative simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
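
    The penalized-spline smoother underlying the approach fits a truncated-line basis with a ridge penalty on the knot coefficients; the Bayesian/BUGS machinery of the paper is not reproduced here, and knot count and penalty are assumed values:

        import numpy as np

        rng = np.random.default_rng(7)
        n, K, lam = 300, 20, 1.0
        x = np.sort(rng.uniform(0, 1, n))
        y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=n)

        knots = np.quantile(x, np.linspace(0, 1, K + 2)[1:-1])
        X = np.column_stack([np.ones(n), x] +
                            [np.maximum(x - k, 0) for k in knots])

        # penalize only the knot coefficients, not intercept and slope
        D = np.diag([0.0, 0.0] + [1.0] * K)
        beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
        print(X @ beta)   # fitted smooth at the observed x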

  4. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface......-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...

  5. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  6. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
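
    The idea shared by the two records above (tune a per-dimension input metric by minimising a cross-validation estimate of the generalisation error) can be sketched with Nadaraya-Watson smoothing and a leave-one-out error; the papers use gradient-based minimisation, while this sketch grid-searches assumed bandwidths:

        import numpy as np
        from itertools import product

        def nw_loo_error(X, y, h):
            d2 = ((X[:, None, :] - X[None, :, :]) / h)**2
            W = np.exp(-0.5 * d2.sum(axis=2))
            np.fill_diagonal(W, 0.0)          # leave one out
            pred = W @ y / W.sum(axis=1)
            return np.mean((y - pred)**2)

        rng = np.random.default_rng(8)
        X = rng.normal(size=(200, 2))
        y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # dim 2 irrelevant

        grid = [0.1, 0.3, 1.0, 3.0, 10.0]
        best = min(product(grid, grid),
                   key=lambda h: nw_loo_error(X, y, np.array(h)))
        print(best)   # expect small bandwidth for dim 1, large for dim 2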

  7. Regression of environmental noise in LIGO data

    International Nuclear Information System (INIS)

    Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G

    2015-01-01

    We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
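
    A single-channel, time-domain toy version of the WK idea: estimate FIR filter taps predicting the target channel from one witness (PEM) channel by solving the Wiener-Hopf normal equations, then subtract the prediction. The coupling model and sizes are assumptions; the LIGO pipeline's time-frequency, multi-channel machinery is far more elaborate:

        import numpy as np
        from scipy.linalg import solve_toeplitz

        rng = np.random.default_rng(9)
        n, taps = 20000, 32
        witness = rng.normal(size=n)                  # PEM channel
        coupling = np.exp(-np.arange(8) / 2.0)        # unknown transfer function
        target = np.convolve(witness, coupling)[:n] + 0.5 * rng.normal(size=n)

        # Wiener-Hopf equations R w = p: R is the (Toeplitz) witness
        # autocorrelation, p the witness-target cross-correlation
        r = np.correlate(witness, witness, "full")[n - 1:n - 1 + taps] / n
        p = np.correlate(target, witness, "full")[n - 1:n - 1 + taps] / n
        w = solve_toeplitz(r, p)

        cleaned = target - np.convolve(witness, w)[:n]
        print(np.var(target), np.var(cleaned))        # variance reduced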

  8. Positive hepatitis B surface antigen tests due to recent vaccination: a persistent problem

    Directory of Open Access Journals (Sweden)

    Rysgaard Carolyn D

    2012-09-01

    Background: Hepatitis B virus (HBV) is a common cause of viral hepatitis with significant health complications including cirrhosis and hepatocellular carcinoma. Assays for hepatitis B surface antigen (HBsAg) are the most frequently used tests to detect HBV infection. Vaccination for HBV can produce transiently detectable levels of HBsAg in patients. However, the time course and duration of this effect are unclear. The objective of this retrospective study was to clarify the frequency and duration of transient HBsAg positivity following vaccination against HBV. Methods: The electronic medical record at an academic tertiary care medical center was searched to identify all orders for HBsAg within a 17-month time period. Detailed chart review was performed to identify all patients who were administered HBV vaccine within 180 days prior to HBsAg testing and also to ascertain the likely cause of weakly positive (grayzone) results. Results: During the 17-month study period, 11,719 HBsAg tests were ordered on 9,930 patients. There were 34 tests performed on 34 patients who received HBV vaccine 14 days or less prior to HBsAg testing. Of these 34 patients, 11 had grayzone results for HBsAg that could be attributed to recent vaccination. Ten of the 11 patients were renal dialysis patients who were receiving HBsAg testing as part of routine and ongoing monitoring. Beyond 14 days, there were no reactive or grayzone HBsAg tests that could be attributed to recent HBV vaccination. HBsAg results reached a peak COI two to three days following vaccination before decaying. Further analysis of all the grayzone results within the 17-month study period (43 results out of 11,719 tests) revealed that only 4 of 43 were the result of true HBV infection as verified by confirmatory testing. Conclusions: Our study confirms that transient HBsAg positivity can occur in patients following HBV vaccination. The results suggest this positivity is unlikely to persist beyond 14 days

  9. Frequency formats, probability formats, or problem structure? A test of the nested-sets hypothesis in an extensional reasoning task

    Directory of Open Access Journals (Sweden)

    William P. Neace

    2008-02-01

    Five experiments addressed a controversy in the probability judgment literature that centers on the efficacy of framing probabilities as frequencies. The natural frequency view predicts that frequency formats attenuate errors, while the nested-sets view predicts that highlighting the set-subset structure of the problem reduces error, regardless of problem format. This study tested these predictions using a conjunction task. Previous studies reporting that frequency formats reduced conjunction errors confounded reference class with problem format. After controlling this confound, the present study's findings show that conjunction errors can be reduced using either a probability or a frequency format, that frequency effects depend upon the presence of a reference class, and that frequency formats do not promote better statistical reasoning than probability formats.

  10. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    The development of new aviation equipment and the modernization of existing specimens of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the development challenges of automated control systems for aviation equipment test operations are presented. Such automated control systems are in essence automated data banks. The key role of the development of a flight test automated control system in the process of creating automated control systems for aviation equipment test operations is substantiated. The approach of integrating mobile modular measuring complexes and the need for national methodologies and technological standards for database system design concepts are grounded. The database system, as a central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and required frequency of the controlled-object state monitoring. It is the database system that provides the supervisory unit with actual data corresponding to specific moments of time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the ability to develop the systems being created. The conclusions and suggestions introduced here can be used in the

  11. On Solving Lq-Penalized Regressions

    Directory of Open Access Journals (Sweden)

    Tracy Zhou Wu

    2007-01-01

    Full Text Available Lq-penalized regression arises in multidimensional statistical modelling where all or part of the regression coefficients are penalized to achieve both accuracy and parsimony of statistical models. There is often substantial computational difficulty except for the quadratic penalty case. The difficulty is partly due to the nonsmoothness of the objective function inherited from the use of the absolute value. We propose a new solution method for the general Lq-penalized regression problem based on space transformation and thus efficient optimization algorithms. The new method has immediate applications in statistics, notably in penalized spline smoothing problems. In particular, the LASSO problem is shown to be polynomial time solvable. Numerical studies show promise of our approach.
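
    As a concrete illustration of the q = 1 special case discussed above (the LASSO), the following minimal Python sketch fits an L1-penalized regression with scikit-learn's coordinate-descent solver. The toy data, the penalty weight alpha, and the sparse coefficient vector are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    beta = np.zeros(10)
    beta[:3] = [2.0, -1.5, 1.0]           # sparse true coefficients
    y = X @ beta + 0.1 * rng.normal(size=100)

    model = Lasso(alpha=0.05).fit(X, y)   # alpha controls the L1 penalty strength
    print(model.coef_)                    # most coefficients shrink to exactly zero
    ```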

  12. Specification of the Advanced Burner Test Reactor Multi-Physics Coupling Demonstration Problem

    Energy Technology Data Exchange (ETDEWEB)

    Shemon, E. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, J. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Thomas, J. W. [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Y. Q. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-12-21

    This document specifies the multi-physics nuclear reactor demonstration problem using the SHARP software package developed by NEAMS. The SHARP toolset simulates the key coupled physics phenomena inside a nuclear reactor. The PROTEUS neutronics code models the neutron transport within the system, the Nek5000 computational fluid dynamics code models the fluid flow and heat transfer, and the DIABLO structural mechanics code models structural and mechanical deformation. The three codes are coupled to the MOAB mesh framework which allows feedback from neutronics, fluid mechanics, and mechanical deformation in a compatible format.

  13. Stochastic search, optimization and regression with energy applications

    Science.gov (United States)

    Hannah, Lauren A.

    models. We evaluate DP-GLM on several data sets, comparing it to modern methods of nonparametric regression like CART, Bayesian trees and Gaussian processes. Compared to existing techniques, the DP-GLM provides a single model (and corresponding inference algorithms) that performs well in many regression settings. Finally, we study convex stochastic search problems where a noisy objective function value is observed after a decision is made. There are many stochastic search problems whose behavior depends on an exogenous state variable which affects the shape of the objective function. Currently, there is no general purpose algorithm to solve this class of problems. We use nonparametric density estimation to take observations from the joint state-outcome distribution and use them to infer the optimal decision for a given query state. We propose two solution methods that depend on the problem characteristics: function-based and gradient-based optimization. We examine two weighting schemes, kernel-based weights and Dirichlet process-based weights, for use with the solution methods. The weights and solution methods are tested on a synthetic multi-product newsvendor problem and the hour-ahead wind commitment problem. Our results show that in some cases Dirichlet process weights offer substantial benefits over kernel based weights and more generally that nonparametric estimation methods provide good solutions to otherwise intractable problems.
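
    To make the kernel-based weighting idea concrete, here is a minimal Nadaraya-Watson-style sketch (not the author's implementation) that estimates the expected outcome at a query state from joint state-outcome observations. The Gaussian kernel, the bandwidth, and the toy data are assumptions for illustration.

    ```python
    import numpy as np

    def kernel_weights(states, query, bandwidth=0.5):
        # Gaussian kernel weights centred on the query state, normalized to sum to 1
        d = (states - query) / bandwidth
        w = np.exp(-0.5 * d ** 2)
        return w / w.sum()

    rng = np.random.default_rng(1)
    states = rng.uniform(0, 1, 200)                 # exogenous state observations
    outcomes = np.sin(2 * np.pi * states) + 0.1 * rng.normal(size=200)

    w = kernel_weights(states, query=0.3)
    estimate = np.dot(w, outcomes)   # kernel-weighted estimate of E[outcome | state = 0.3]
    print(estimate)
    ```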

  14. Regularized Label Relaxation Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu

    2018-04-01

    Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs the class compactness graph based on manifold learning and uses it as the regularization item to avoid the problem of overfitting. The class compactness graph is used to ensure that the samples sharing the same labels can be kept close after they are transformed. Two different algorithms, which are, respectively, based on l2-norm and l2,1-norm loss functions, are devised. These two algorithms have compact closed-form solutions in each iteration so that they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of the classification accuracy and running time.
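
    For orientation, the strict binary-label baseline that the paper relaxes can be sketched in a few lines: ridge-regularized least squares regression onto a one-hot label matrix, classifying by the largest regression output. This is only the conventional starting point, not the proposed label-relaxation method; all names and sizes below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(60, 5))
    labels = rng.integers(0, 3, size=60)
    Y = np.eye(3)[labels]                       # strict binary (one-hot) label targets

    lam = 0.1                                   # ridge regularization weight
    W = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ Y)
    pred = np.argmax(X @ W, axis=1)             # classify by the largest regression output
    print((pred == labels).mean())              # training accuracy of the baseline
    ```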

  15. Tests of microprocessor-based relay protection devices: Problems and solutions

    Directory of Open Access Journals (Sweden)

    Gurevich Vladimir

    2009-01-01

    Full Text Available Usually, the operational condition of relay protection devices is checked with specific settings used for the relay operation in a certain network point. In the author's opinion, in order to verify the proper operation of complex multifunctional microprocessor-based protection devices (MPD) at their inspection, start-up after repairs or during periodic tests there is no need to use the actual settings at which the relay is to be operated in a certain network's point. It should be tested for proper operation at several of its most critical preset characteristic points as well as in several preset characteristics constituting its most complicated (combined) operation modes, including the dynamic operation modes with preset transition processes specific for standard power networks (not necessarily for a specific point). The proposed set of actions for the unification of software platforms of the modern, microprocessor-based relay protection test systems will enable examination of modern MPD in an absolutely new way.

  16. Progression paths in children's problem solving: The influence of dynamic testing, initial variability, and working memory.

    Science.gov (United States)

    Resing, Wilma C M; Bakker, Merel; Pronk, Christine M E; Elliott, Julian G

    2017-01-01

    The current study investigated developmental trajectories of analogical reasoning performance of 104 7- and 8-year-old children. We employed a microgenetic research method and multilevel analysis to examine the influence of several background variables and experimental treatment on the children's developmental trajectories. Our participants were divided into two treatment groups: repeated practice alone and repeated practice with training. Each child received an initial working memory assessment and was subsequently asked to solve figural analogies on each of several sessions. We examined children's analogical problem-solving behavior and their subsequent verbal accounts of their employed solving processes. We also investigated the influence of verbal and visual-spatial working memory capacity and initial variability in strategy use on analogical reasoning development. Results indicated that children in both treatment groups improved but that gains were greater for those who had received training. Training also reduced the influence of children's initial variability in the use of analogical strategies with the degree of improvement in reasoning largely unrelated to working memory capacity. Findings from this study demonstrate the value of a microgenetic research method and the use of multilevel analysis to examine inter- and intra-individual change in problem-solving processes. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Chinese Beliefs in Luck are Linked to Gambling Problems via Strengthened Cognitive Biases: A Mediation Test.

    Science.gov (United States)

    Lim, Matthew S M; Rogers, Robert D

    2017-12-01

    Problematic patterns of gambling and their harms are known to have culturally specific expressions. For ethnic Chinese people, patterns of superstitious belief in this community appear to be linked to the elevated rates of gambling-related harms; however, little is known about the mediating psychological mechanisms. To address this issue, we surveyed 333 Chinese gamblers residing internationally and used a mediation analysis to explore how gambling-related cognitive biases, gambling frequency and variety of gambling forms ('scope') mediate the association between beliefs in luck and gambling problems. We found that cognitive biases and scope were significant mediators of this link but that the former is a stronger mediator than the latter. The mediating erroneous beliefs were not specific to any particular type of cognitive bias. These results suggest that Chinese beliefs in luck are expressed as gambling cognitive biases that increase the likelihood of gambling problems, and that biases that promote gambling (and its harms) are best understood within their socio-cultural context.
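
    A minimal sketch of the product-of-coefficients logic behind such a mediation test (indirect effect = a × b) on simulated data with statsmodels; the variable names and effect sizes are assumptions, not the study's data.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 333
    belief = rng.normal(size=n)                         # X: belief in luck
    bias = 0.6 * belief + rng.normal(size=n)            # M: gambling-related cognitive biases
    problems = 0.5 * bias + 0.1 * belief + rng.normal(size=n)  # Y: gambling problems

    # path a: X -> M
    a = sm.OLS(bias, sm.add_constant(belief)).fit().params[1]
    # path b: M -> Y, controlling for X
    fit_y = sm.OLS(problems, sm.add_constant(np.column_stack([belief, bias]))).fit()
    b = fit_y.params[2]
    print("indirect effect a*b =", a * b)               # the mediated path X -> M -> Y
    ```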

  18. Direct-to-consumer genetic testing: perceptions, problems, and policy responses.

    Science.gov (United States)

    Caulfield, Timothy; McGuire, Amy L

    2012-01-01

    Direct-to-consumer (DTC) genetic testing has attracted a great amount of attention from policy makers, the scientific community, professional groups, and the media. Although it is unclear what the public demand is for these services, there does appear to be public interest in personal genetic risk information. As a result, many commentators have raised a variety of social, ethical, and regulatory issues associated with this emerging industry, including privacy issues, ensuring that DTC companies provide accurate information about the risks and limitations of their services, the possible adverse impact of DTC genetic testing on healthcare systems, and concern about how individuals may interpret and react to genetic risk information.

  19. Robustness to non-normality of common tests for the many-sample location problem

    Directory of Open Access Journals (Sweden)

    Azmeri Khan

    2003-01-01

    Full Text Available This paper studies the effect of deviating from the normal distribution assumption when considering the power of two many-sample location test procedures: ANOVA (parametric and Kruskal-Wallis (non-parametric. Power functions for these tests under various conditions are produced using simulation, where the simulated data are produced using MacGillivray and Cannon's [10] recently suggested g-and-k distribution. This distribution can provide data with selected amounts of skewness and kurtosis by varying two nearly independent parameters.
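
    The simulation design can be sketched as follows: draw group samples from a g-and-k distribution (generated by pushing standard normals through its quantile function, with the conventional c = 0.8), shift one group, and estimate the power of ANOVA versus Kruskal-Wallis. The parameter values here are illustrative, not those of the paper.

    ```python
    import numpy as np
    from scipy.stats import f_oneway, kruskal

    def gk_sample(n, g=0.5, k=0.2, A=0.0, B=1.0, c=0.8, rng=None):
        # g-and-k draw: transform standard normals through the quantile function;
        # (1 - exp(-g z)) / (1 + exp(-g z)) is written as tanh(g z / 2)
        z = rng.normal(size=n)
        return A + B * (1 + c * np.tanh(g * z / 2)) * (1 + z ** 2) ** k * z

    rng = np.random.default_rng(4)
    shift, reps = 0.5, 1000
    hits_anova = hits_kw = 0
    for _ in range(reps):
        groups = [gk_sample(30, rng=rng) + d for d in (0.0, 0.0, shift)]
        hits_anova += f_oneway(*groups).pvalue < 0.05
        hits_kw += kruskal(*groups).pvalue < 0.05
    print("ANOVA power:", hits_anova / reps, " Kruskal-Wallis power:", hits_kw / reps)
    ```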

  20. Start up test and technical problems encountered on N.S. Mutsu

    International Nuclear Information System (INIS)

    Osanai, M.; Tomiyama, E.

    1978-01-01

    Based on experience with other reactors in general, the commissioning trials on N.S. Mutsu leading to full power were planned to be undertaken in six phases covering various ranges of reactor output from zero to full power. The first phase of zero power tests was carried out successfully, but the ensuing phases of the trials had to be suspended on account of radiation leakage being detected. In order to determine the cause of this leakage, measurements were made of the radiation dose rate around the reactor. The present paper outlines the evolution of the zero-phase test, and the survey work undertaken immediately after the radiation leakage.

  1. Some New Verification Test Problems for Multimaterial Diffusion on Meshes that are Non-Aligned with Material Boundaries

    Energy Technology Data Exchange (ETDEWEB)

    Dawes, Alan Sidney [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Malone, Christopher M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shashkov, Mikhail Jurievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-07

    In this report a number of new verification test problems for multimaterial diffusion will be shown. Using them we will show that homogenization of multimaterial cells in either Arbitrary Lagrangian Eulerian (ALE) or Eulerian simulations can lead to errors in the energy flow at the interfaces. Results will be presented that show that significant improvements and predictive capability can be gained by using either a surrogate supermesh, such as Thin Mesh in FLAG, or the emerging method based on Static Condensation.

  2. Time Limits in Testing: An Analysis of Eye Movements and Visual Attention in Spatial Problem Solving

    Science.gov (United States)

    Roach, Victoria A.; Fraser, Graham M.; Kryklywy, James H.; Mitchell, Derek G. V.; Wilson, Timothy D.

    2017-01-01

    Individuals with an aptitude for interpreting spatial information (high mental rotation ability: HMRA) typically master anatomy with more ease, and more quickly, than those with low mental rotation ability (LMRA). This article explores how visual attention differs with time limits on spatial reasoning tests. Participants were assorted to two…

  3. Problem-Based Test: Replication of Mitochondrial DNA during the Cell Cycle

    Science.gov (United States)

    Setalo, Gyorgy, Jr.

    2013-01-01

    Terms to be familiar with before you start to solve the test: cell cycle, generation time, S-phase, cell culture synchronization, isotopic pulse-chase labeling, density labeling, equilibrium density-gradient centrifugation, buoyant density, rate-zonal centrifugation, nucleoside, nucleotide, kinase enzymes, polymerization of nucleic acids,…

  4. Economic Crisis and Marital Problems in Turkey: Testing the Family Stress Model

    Science.gov (United States)

    Aytac, Isik A.; Rankin, Bruce H.

    2009-01-01

    This paper applied the family stress model to the case of Turkey in the wake of the 2001 economic crisis. Using structural equation modeling and a nationally representative urban sample of 711 married women and 490 married men, we tested whether economic hardship and the associated family economic strain on families resulted in greater marital…

  5. Problem-Solving Test: Nucleocytoplasmic Shuttling of Pre-mRNA Binding Proteins

    Science.gov (United States)

    Szeberenyi, Jozsef

    2012-01-01

    Terms to be familiar with before you start to solve the test: transcription, pre-mRNA, RNA processing, RNA transport, RNA polymerase II, direct and indirect immunofluorescence staining, cell fractionation by centrifugation, oligo(dT)-cellulose chromatography, washing and elution of the column, ribonuclease, SDS-polyacrylamide gel electrophoresis,…

  6. Problem-Solving Test: Analysis of the Role of Cyclin B

    Science.gov (United States)

    Szeberenyi, Jozsef

    2011-01-01

    An experiment is described in this test that was designed to study the role of the cyclin B protein in a cell-free system. The work was performed in the lab of Tim Hunt who, together with Hartwell and Nurse, received the Nobel Prize in Physiology or Medicine in 2001 "for their discoveries of key chemicals that regulate the cell division cycle." It…

  7. Problem-Solving Test: Conditional Gene Targeting Using the Cre/loxP Recombination System

    Science.gov (United States)

    Szeberényi, József

    2013-01-01

    Terms to be familiar with before you start to solve the test: gene targeting, knock-out mutation, bacteriophage, complementary base-pairing, homologous recombination, deletion, transgenic organisms, promoter, polyadenylation element, transgene, DNA replication, RNA polymerase, Shine-Dalgarno sequence, restriction endonuclease, polymerase chain…

  8. Problem-Based Test: Functional Analysis of Mutant 16S rRNAs

    Science.gov (United States)

    Szeberenyi, Jozsef

    2010-01-01

    Terms to be familiar with before you start to solve the test: ribosome, ribosomal subunits, antibiotics, point mutation, 16S, 5S, and 23S rRNA, Shine-Dalgarno sequence, mRNA, tRNA, palindrome, hairpin, restriction endonuclease, fMet-tRNA, peptidyl transferase, initiation, elongation, termination of translation, expression plasmid, transformation,…

  9. Problem-Solving Test: The Mechanism of Action of a Human Papilloma Virus Oncoprotein

    Science.gov (United States)

    Szeberenyi, Jozsef

    2009-01-01

    Terms to be familiar with before you start to solve the test: human papilloma virus; cervical cancer; oncoproteins; malignant transformation; retinoblastoma protein; cell cycle; quiescent and cycling cells; cyclin/cyclin-dependent kinase (Cdk) complexes; E2F; S-phase genes; enhancer element; proto-oncogenes; tumor suppressor genes; radioactive…

  10. Test fields on compact spacetimes: Problems, some partial results and speculations

    International Nuclear Information System (INIS)

    Yurtsever, U.

    1989-09-01

    In this paper we study some basic aspects of (Lorentzian) field theory on compact Lorentz manifolds. All compact spacetimes are acausal, i.e. possess closed timelike curves; this makes them a useful testbed in analyzing some new notions of causality that we will introduce for more general acausal spacetimes. In addition, studying compact spacetimes in their own right raises a wide range of fascinating mathematical problems some of which we will explore. We will see that it is reasonable to expect Lorentzian field theory on a compact spacetime to provide information on the topology of the underlying manifold; if this is true, then this information is likely to be ''orthogonal'' (or complementary) to the information obtained through the study of Euclidean field theory. (author). 45 refs, 2 figs

  11. Aid and growth regressions

    DEFF Research Database (Denmark)

    Hansen, Henrik; Tarp, Finn

    2001-01-01

    This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good’ policy. ... investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regression is used for policy purposes.

  12. Review of Bose-Fermi and ''Supersymmetry'' models; problems in particle transfer tests

    International Nuclear Information System (INIS)

    Vergnes, M.

    1986-01-01

    The first case suggested for a supersymmetry in nuclei was that of a j = 3/2 particle coupled to an O(6) core. A more recent and elaborate scheme is the ''multi-j'' supersymmetry, describing the coupling of a particle in more than just one orbital, with the three possible cores of the interacting boson model. A general survey of the particle transfer tests of these different models is presented and the results summarized. A comparison of IBFM-2 calculations with experimental data is discussed, as well as results of sum rules analysis. Present and future tests concerning extensions of the above mentioned models, particularly to odd-odd nuclei, are briefly indicated. It appears necessary to clearly determine if the origin of the difficulties outlined for transfer reactions indeed lies, as often suggested, in the simplified form of the transfer operator used in deriving the selection rules, and not in the models themselves.

  13. Problem of presently available diagnostic tests for Zika virus infection: View from Thailand

    Institute of Scientific and Technical Information of China (English)

    Beuy Joob; Viroj Wiwanitkit

    2016-01-01

    Dear Editor, Zika virus infection is the present global issue due to the finding of the occurrence of congenital defects relating to this infection [1,2]. The disease is a dengue-like infection; hence, it is well known that missed and under-diagnosis is possible [1,2]. However, the big concern is the reliability of the presently available diagnostic tests for diagnosing Zika virus infection. Here, the authors appraise previously published

  14. National Security Cutter: Enhanced Oversight Needed to Ensure Problems Discovered during Testing and Operations Are Addressed

    Science.gov (United States)

    2016-01-01

    [Abstract not recoverable: the indexed text consists of fragments of the report's table of contents and acronym list, including IOT&E (Initial Operational Test and Evaluation), KPP (Key Performance Parameter), LRI-II (Long-Range Interceptor II), and NSC (National Security Cutter).]

  15. Speed test results and hardware/software study of computational speed problem, appendix D

    Science.gov (United States)

    1984-01-01

    The HP9845C is a desktop computer which is tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirements. Additional requirements were that the control algorithm could be programmed in a high language and that the machine have sufficient storage to store the data from a complete experiment.

  16. On the problem of nonsense correlations in allergological tests after routine extraction.

    Science.gov (United States)

    Rijckaert, G

    1981-01-01

    The influence of extraction procedures and culturing methods of material used for the preparation of allergenic extracts on correlation patterns found in allergological testing (skin test and RAST) was investigated. In our laboratory a short extraction procedure performed at 0 degrees C was used for Aspergillus repens, A. penicilloides, Wallemia sebi, their rearing media and non-inoculated medium. For the commercially available extracts from house dust, house-dust mite, pollen of Dactylis glomerata and A. penicilloides a longer procedure (several days) performed at room temperature was used. Statistical analysis showed a separation of all test results into two clusters, each cluster being composed of correlations between extracts from only one manufacturer; extracts from different manufacturers did not show any correlation. The correlations found between the short time incubated extracts of the xerophilic fungi and their rearing media could be explained by genetic and biochemical relationships between these fungi depending on ecological conditions. However, while the correlation found between house dust and house-dust mite is understandable, correlations found between long time incubated extracts from house-dust mite and D. glomerata or A. penicilloides may be nonsense correlations that do not adequately describe the in vivo situation. The similarity of these extracts is presumably artificially created during extraction.

  17. Evaluation of the entropy consistent euler flux on 1D and 2D test problems

    Science.gov (United States)

    Roslan, Nur Khairunnisa Hanisah; Ismail, Farzad

    2012-06-01

    Perhaps most CFD simulations may yield good predictions of pressure and velocity when compared to experimental data. Unfortunately, these results will most likely not adhere to the second law of thermodynamics, hence compromising the authenticity of predicted data. Currently, the test of a good CFD code is to check how much entropy is generated in a smooth flow and hope that the numerical entropy produced is of the correct sign when a shock is encountered. Herein, a shock capturing code written in C++ based on a recent entropy consistent Euler flux is developed to simulate 1D and 2D flows. Unlike other finite volume schemes in commercial CFD code, this entropy consistent flux (EC) function precisely satisfies the discrete second law of thermodynamics. This EC flux has an entropy-conserved part, preserving entropy for smooth flows and a numerical diffusion part that will accurately produce the proper amount of entropy, consistent with the second law. Several numerical simulations of the entropy consistent flux have been tested on two dimensional test cases. The first case is a Mach 3 flow over a forward facing step. The second case is a flow over a NACA 0012 airfoil while the third case is a hypersonic flow passing over a 2D cylinder. Local flow quantities such as velocity and pressure are analyzed and then compared with mainly the Roe flux. The results herein show that the EC flux does not capture the unphysical rarefaction shock, unlike the Roe flux, and does not easily succumb to the carbuncle phenomenon. In addition, the EC flux maintains good performance in cases where the Roe flux is known to be superior.

  18. Captive chimpanzee foraging in a social setting: a test of problem solving, flexibility, and spatial discounting

    Science.gov (United States)

    Kurtycz, Laura M.; Ross, Stephen R.; Bonnie, Kristin E.

    2015-01-01

    In the wild, primates are selective over the routes that they take when foraging and seek out preferred or ephemeral food. Given this, we tested how a group of captive chimpanzees weighed the relative benefits and costs of foraging for food in their environment when a less-preferred food could be obtained with less effort than a more-preferred food. In this study, a social group of six zoo-housed chimpanzees (Pan troglodytes) could collect PVC tokens and exchange them with researchers for food rewards at one of two locations. Food preference tests had revealed that, for these chimpanzees, grapes were a highly-preferred food while carrot pieces were a less-preferred food. The chimpanzees were tested in three phases, each comprised of 30 thirty-minute sessions. In phases 1 and 3, if the chimpanzees exchanged a token at the location they collected them they received a carrot piece (no travel) or they could travel ≥10 m to exchange tokens for grapes at a second location. In phase 2, the chimpanzees had to travel for both rewards (≥10 m for carrot pieces, ≥15 m for grapes). The chimpanzees learned how to exchange tokens for food rewards, but there was individual variation in the time it took for them to make their first exchange and to discover the different exchange locations. Once all the chimpanzees were proficient at exchanging tokens, they exchanged more tokens for grapes (phase 3). However, when travel was required for both rewards (phase 2), the chimpanzees were less likely to work for either reward. Aside from the alpha male, all chimpanzees exchanged tokens for both reward types, demonstrating their ability to explore the available options. Contrary to our predictions, low-ranked individuals made more exchanges than high-ranked individuals, most likely because, in this protocol, chimpanzees could not monopolize the tokens or access to exchange locations. Although the chimpanzees showed a preference for exchanging tokens for their more-preferred food, they

  19. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study gives attention to indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power, defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model. The dependent variable is distributed as Poisson. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables, and in the presence of multicollinearity among the independent variables. The result shows that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient based on Bias and the Root Mean Square Error (RMSE).
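
    A minimal sketch of the underlying quantity: fit a Poisson GLM and correlate the observed counts with the fitted means E(Y|X). The simulated design and coefficients are assumptions, and the paper's proposed modification is not reproduced here.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    X = rng.normal(size=(200, 2))
    mu = np.exp(0.3 + 0.5 * X[:, 0] - 0.4 * X[:, 1])    # true Poisson means
    y = rng.poisson(mu)

    fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
    r = np.corrcoef(y, fit.fittedvalues)[0, 1]          # corr(Y, E[Y|X]) as predictive power
    print("regression correlation coefficient ~", r)
    ```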

  20. The Application of Classification and Regression Trees for the Triage of Women for Referral to Colposcopy and the Estimation of Risk for Cervical Intraepithelial Neoplasia: A Study Based on 1625 Cases with Incomplete Data from Molecular Tests

    Directory of Open Access Journals (Sweden)

    Abraham Pouliakis

    2015-01-01

    Full Text Available Objective. Nowadays numerous ancillary techniques detecting HPV DNA and mRNA compete with cytology; however no perfect test exists; in this study we evaluated classification and regression trees (CARTs) for the production of triage rules and estimate the risk for cervical intraepithelial neoplasia (CIN) in cases with ASCUS+ in cytology. Study Design. We used 1625 cases. In contrast to other approaches we used missing data to increase the data volume, obtain more accurate results, and simulate real conditions in the everyday practice of gynecologic clinics and laboratories. The proposed CART was based on the cytological result, HPV DNA typing, HPV mRNA detection based on NASBA and flow cytometry, p16 immunocytochemical expression, and finally age and parous status. Results. Algorithms useful for the triage of women were produced; gynecologists could apply these in conjunction with available examination results and conclude to an estimation of the risk for a woman to harbor CIN expressed as a probability. Conclusions. The most important test was the cytological examination; however the CART handled cases with inadequate cytological outcome and increased the diagnostic accuracy by exploiting the results of ancillary techniques even if there were inadequate missing data. The CART performance was better than any other single test involved in this study.

  1. The Application of Classification and Regression Trees for the Triage of Women for Referral to Colposcopy and the Estimation of Risk for Cervical Intraepithelial Neoplasia: A Study Based on 1625 Cases with Incomplete Data from Molecular Tests.

    Science.gov (United States)

    Pouliakis, Abraham; Karakitsou, Efrossyni; Chrelias, Charalampos; Pappas, Asimakis; Panayiotides, Ioannis; Valasoulis, George; Kyrgiou, Maria; Paraskevaidis, Evangelos; Karakitsos, Petros

    2015-01-01

    Nowadays numerous ancillary techniques detecting HPV DNA and mRNA compete with cytology; however no perfect test exists; in this study we evaluated classification and regression trees (CARTs) for the production of triage rules and estimate the risk for cervical intraepithelial neoplasia (CIN) in cases with ASCUS+ in cytology. We used 1625 cases. In contrast to other approaches we used missing data to increase the data volume, obtain more accurate results, and simulate real conditions in the everyday practice of gynecologic clinics and laboratories. The proposed CART was based on the cytological result, HPV DNA typing, HPV mRNA detection based on NASBA and flow cytometry, p16 immunocytochemical expression, and finally age and parous status. Algorithms useful for the triage of women were produced; gynecologists could apply these in conjunction with available examination results and conclude to an estimation of the risk for a woman to harbor CIN expressed as a probability. The most important test was the cytological examination; however the CART handled cases with inadequate cytological outcome and increased the diagnostic accuracy by exploiting the results of ancillary techniques even if there were inadequate missing data. The CART performance was better than any other single test involved in this study.
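
    As a hedged sketch of the general CART triage idea (not the authors' tree, which was built from cytology, HPV DNA/mRNA, p16, age, and parity with its own missing-data handling), scikit-learn can grow a small tree on incomplete data after simple imputation and report a per-case risk probability. All data and column meanings below are invented for illustration.

    ```python
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(6)
    X = rng.normal(size=(300, 5))                       # stand-ins for test results
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # 1 = harbors CIN (toy rule)
    X[rng.random(X.shape) < 0.2] = np.nan               # 20% missing test results

    cart = make_pipeline(SimpleImputer(strategy="mean"),
                         DecisionTreeClassifier(max_depth=4, random_state=0))
    cart.fit(X, y)
    risk = cart.predict_proba(X)[:, 1]                  # estimated risk per woman
    print(risk[:5])
    ```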

  2. Analysis of reinjection problems at the Stony Brook ATES field test site

    Science.gov (United States)

    Supkow, D. J.; Shultz, J. A.

    1982-12-01

    Aquifer Thermal Energy Storage (ATES) is one of several energy storage technologies being investigated by the DOE to determine the feasibility of reducing energy consumption by means of energy management systems. The State University of New York (SUNY) Stony Brook, Long Island, New York site was selected by Battelle PNL for a Phase 1 investigation to determine the feasibility of an ATES demonstration to seasonally store chill energy by injecting chilled water in the winter and recovering it at a maximum rate of 100 MBTU/hr (30 MW) in the summer. The Phase 1 study was performed during 1981 by Dames & Moore under subcontract to Battelle PNL. The pumping and injection tests were performed using two wells in a doublet configuration. Well PI-1 is a previously existing well and PI-2 was installed specifically for this investigation. Both wells are screened in the Upper Magothy aquifer from approximately 300 to 350 feet below ground surface. Nine observation wells were also installed as a portion of the investigation to monitor water level and aquifer temperature changes during the test.

  3. Discontinuous Petrov–Galerkin method with optimal test functions for thin-body problems in solid mechanics

    KAUST Repository

    Niemi, Antti H.

    2011-02-01

    We study the applicability of the discontinuous Petrov-Galerkin (DPG) variational framework for thin-body problems in structural mechanics. Our numerical approach is based on discontinuous piecewise polynomial finite element spaces for the trial functions and approximate, local computation of the corresponding 'optimal' test functions. In the Timoshenko beam problem, the proposed method is shown to provide the best approximation in an energy-type norm which is equivalent to the L2-norm for all the unknowns, uniformly with respect to the thickness parameter. The same formulation remains valid also for the asymptotic Euler-Bernoulli solution. As another one-dimensional model problem we consider the modelling of the so-called basic edge effect in shell deformations. In particular, we derive a special norm for the test space which leads to a robust method in terms of the shell thickness. Finally, we demonstrate how an a posteriori error estimator arising directly from the discontinuous variational framework can be utilized to generate an optimal hp-mesh for resolving the boundary layer. © 2010 Elsevier B.V.

  4. Testing with Values: the Refugee Problem and Political Prospects of the “Alternative for Germany”

    Directory of Open Access Journals (Sweden)

    Алена Васильевна Федина

    2018-12-01

    Full Text Available The article analyses the reasons for the electoral success of the “Alternative for Germany” party at the federal and state levels in 2016-2017. Looking at the ideological spectrum of German political parties through the research lens of a two-dimensional coordinate system that, alongside the traditional ideological dimension, includes the value axis, the author points out the fact that the problem of the refugees, which appeals to the values of the nation, perplexed German parties, as the choice of a particular strategy in regard to the refugees involved high risks of losing a significant number of votes. As a result, those parties which managed to clearly articulate their system of values, above all the “Alternative”, benefited from this situation. The author states that the rise of the “Alternative” was caused by the “value vacuum” in the party life in Germany stemming from the progressive “open-door” refugee policy introduced by Chancellor Merkel in order to resolve the European refugee crisis. Much consideration is given to the reasons that led the CDU to adopt such an unusual strategy for a predominantly centrist party. In conclusion, the author suggests that, in spite of its electoral success, the political influence of the “Alternative” will be seriously limited by the unwillingness on the part of other parties to establish contacts and build inter-party ties with an inexperienced political player, which is prone to neglect the achievements of German democracy and does not fit into the framework of the modern political mainstream. On the other hand, the tightening of the immigration policy weakens the protest potential of German voters.

  5. Unbalanced Regressions and the Predictive Equation

    DEFF Research Database (Denmark)

    Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo

    Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoretical predictive equation by suggesting a data generating process, where returns are generated as linear functions of a lagged latent I(0) risk process. The observed predictor is a function of this latent I(0) process, but it is corrupted by a fractionally integrated noise. Such a process may arise due to aggregation or unexpected level shifts. In this setup, the practitioner estimates a misspecified, unbalanced, and endogenous predictive regression. We show that the OLS estimate of this regression is inconsistent, but standard inference is possible. To obtain a consistent slope estimate, we then suggest...

  6. Is past life regression therapy ethical?

    Science.gov (United States)

    Andrade, Gabriel

    2017-01-01

    Past life regression therapy is used by some physicians in cases with some mental diseases. Anxiety disorders, mood disorders, and gender dysphoria have all been treated using life regression therapy by some doctors on the assumption that they reflect problems in past lives. Although it is not supported by psychiatric associations, few medical associations have actually condemned it as unethical. In this article, I argue that past life regression therapy is unethical for two basic reasons. First, it is not evidence-based. Past life regression is based on the reincarnation hypothesis, but this hypothesis is not supported by evidence, and in fact, it faces some insurmountable conceptual problems. If patients are not fully informed about these problems, they cannot provide an informed consent, and hence, the principle of autonomy is violated. Second, past life regression therapy has the great risk of implanting false memories in patients, and thus, causing significant harm. This is a violation of the principle of non-malfeasance, which is surely the most important principle in medical ethics.

  7. Role of diagnostic testing in identifying and resolving dimensional-stability problems in electroplated laser mirrors

    International Nuclear Information System (INIS)

    Cutler, R.L.; Hogan, B.

    1982-01-01

    The metal mirrors which are the subject of this discussion are to be used in the Antares inertial fusion laser system. Antares is a high-power (40 TW), high-energy (35 to 40 kJ), pulsed CO2 laser system for the investigation of inertial confinement fusion. The system contains more than four hundred small and large diamond-turned and conventionally polished mirrors. The largest mirrors are trapezoidal in shape with the longest dimension being 16 to 18 inches. The substrates are type 2124 aluminum for most large mirrors, and aluminum bronze, oxygen-free copper or a copper-zirconium alloy for most of the smaller mirrors. The optical surface is electro-deposited copper 20 to 40 mils thick. After nondestructive testing and rough machining, the electroplated surface is single-point diamond machined or conventionally polished

  8. Estimation of genetic parameters for test day milk records of first lactation Gyr cows using repeatability and random regression animal models

    Directory of Open Access Journals (Sweden)

    Claudio Napolis Costa

    2005-10-01

    ... a larger number of negative estimates between test-day yields at the beginning and end of lactation than the FAS. Except for the FAS, genetic correlation estimates decreased from values close to unity between adjacent test-day yields to negative values between test-day yields at the beginning and end of lactation. Among the Legendre polynomials, the fifth-order polynomial gave the best fit to the test-day yields. The results indicate the potential of random regression, with the LP5 model and the FAS being the most adequate for modelling the genetic and permanent environment variances of test-day yields in the Gyr breed. Data comprising 8,183 test day records of 1,273 first lactations of Gyr cows from herds supervised by ABCZ were used to estimate variance components and genetic parameters for milk yield using repeatability and random regression animal models by REML. Genetic modelling of logarithmic (FAS) and exponential (FW) curves was compared to orthogonal Legendre polynomials (LP) of order 3 to 5. Residual variance was assumed to be constant in all (ME=1) or some periods of lactation (ME=4). Lactation milk yield in 305-d was also adjusted by an animal model. Genetic variance, heritability and repeatability for test day milk yields estimated by a repeatability animal model were 1.74 kg², 0.27, and 0.76, respectively. Genetic variance and heritability estimates for lactation milk yield were respectively 121,094.6 and 0.22. Heritability estimates from FAS and FW, respectively, decreased from 0.59 and 0.74 at the beginning of lactation to 0.20 at the end of the period. Except for a fifth-order LP with ME=1, heritability estimates decreased from around 0.70 at early lactation to 0.30 at the end of lactation. Residual variance estimates were slightly smaller for logarithmic than for exponential curves both for homogeneous and heterogeneous variance assumptions. Estimates of residual variance in all stages of lactation decreased as the order of LP increased and depended on the assumption about ME.

  9. Image superresolution using support vector regression.

    Science.gov (United States)

    Ni, Karl S; Nguyen, Truong Q

    2007-06-01

    A thorough investigation of the application of support vector regression (SVR) to the superresolution problem is conducted through various frameworks. Prior to the study, the SVR problem is enhanced by finding the optimal kernel. This is done by formulating the kernel learning problem in SVR form as a convex optimization problem, specifically a semi-definite programming (SDP) problem. An additional constraint is added to reduce the SDP to a quadratically constrained quadratic programming (QCQP) problem. After this optimization, investigation of the relevancy of SVR to superresolution proceeds with the possibility of using a single and general support vector regression for all image content, and the results are impressive for small training sets. This idea is improved upon by observing structural properties in the discrete cosine transform (DCT) domain to aid in learning the regression. Further improvement involves a combination of classification and SVR-based techniques, extending works in resolution synthesis. This method, termed kernel resolution synthesis, uses specific regressors for isolated image content to describe the domain through a partitioned look of the vector space, thereby yielding good results.
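
    The core regression step can be illustrated with a small sketch: learn a mapping from low-resolution patch vectors to a high-resolution pixel value with an RBF-kernel SVR. The patch construction, the fixed kernel choice (rather than the learned kernel via the SDP/QCQP step described above), and the toy data are all assumptions.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(7)
    lowres_patches = rng.uniform(size=(500, 9))           # stand-ins for 3x3 low-res neighbourhoods
    target_pixels = lowres_patches.mean(axis=1) + 0.05 * rng.normal(size=500)

    svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(lowres_patches, target_pixels)
    estimate = svr.predict(lowres_patches[:1])            # predicted high-res pixel value
    print(estimate)
    ```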

  10. Exploration of analysis methods for diagnostic imaging tests: problems with ROC AUC and confidence scores in CT colonography.

    Science.gov (United States)

    Mallett, Susan; Halligan, Steve; Collins, Gary S; Altman, Doug G

    2014-01-01

    Different methods of evaluating diagnostic performance when comparing diagnostic tests may lead to different results. We compared two such approaches, sensitivity and specificity with area under the Receiver Operating Characteristic Curve (ROC AUC) for the evaluation of CT colonography for the detection of polyps, either with or without computer assisted detection. In a multireader multicase study of 10 readers and 107 cases we compared sensitivity and specificity, using radiological reporting of the presence or absence of polyps, to ROC AUC calculated from confidence scores concerning the presence of polyps. Both methods were assessed against a reference standard. Here we focus on five readers, selected to illustrate issues in design and analysis. We compared diagnostic measures within readers, showing that differences in results are due to statistical methods. Reader performance varied widely depending on whether sensitivity and specificity or ROC AUC was used. There were problems using confidence scores; in assigning scores to all cases; in use of zero scores when no polyps were identified; the bimodal non-normal distribution of scores; fitting ROC curves due to extrapolation beyond the study data; and the undue influence of a few false positive results. Variation due to use of different ROC methods exceeded differences between test results for ROC AUC. The confidence scores recorded in our study violated many assumptions of ROC AUC methods, rendering these methods inappropriate. The problems we identified will apply to other detection studies using confidence scores. We found sensitivity and specificity were a more reliable and clinically appropriate method to compare diagnostic tests.
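
    The two evaluation approaches compared in the study can be sketched side by side on toy data: sensitivity/specificity computed from binary polyp calls versus AUC computed from confidence scores. The numbers below are illustrative, not the study's readings.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, confusion_matrix

    truth = np.array([1, 1, 1, 0, 0, 0, 0, 0])           # polyp present / absent (toy)
    scores = np.array([90, 80, 10, 40, 30, 20, 10, 0])   # reader confidence scores (toy)
    calls = (scores >= 50).astype(int)                   # binary reporting decision

    tn, fp, fn, tp = confusion_matrix(truth, calls).ravel()
    sensitivity, specificity = tp / (tp + fn), tn / (tn + fp)
    auc = roc_auc_score(truth, scores)                   # score-based ranking measure
    print(sensitivity, specificity, auc)                 # the two views can disagree
    ```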

  11. Fast metabolite identification with Input Output Kernel Regression

    Science.gov (United States)

    Brouard, Céline; Shen, Huibin; Dührkop, Kai; d'Alché-Buc, Florence; Böcker, Sebastian; Rousu, Juho

    2016-01-01

    Motivation: An important problem in metabolomics is to identify metabolites using tandem mass spectrometry data. Machine learning methods have been proposed recently to solve this problem by predicting molecular fingerprint vectors and matching these fingerprints against existing molecular structure databases. In this work we propose to address the metabolite identification problem using a structured output prediction approach. This type of approach is not limited to vector output space and can handle structured output space such as the molecule space. Results: We use the Input Output Kernel Regression method to learn the mapping between tandem mass spectra and molecular structures. The principle of this method is to encode the similarities in the input (spectra) space and the similarities in the output (molecule) space using two kernel functions. This method approximates the spectra-molecule mapping in two phases. The first phase corresponds to a regression problem from the input space to the feature space associated to the output kernel. The second phase is a preimage problem, consisting in mapping back the predicted output feature vectors to the molecule space. We show that our approach achieves state-of-the-art accuracy in metabolite identification. Moreover, our method has the advantage of decreasing the running times for the training step and the test step by several orders of magnitude over the preceding methods. Contact: celine.brouard@aalto.fi Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27307628
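
    The first phase (regression from the input kernel to the output feature space) can be approximated in spirit with kernel ridge regression onto fingerprint vectors. This sketch uses random stand-ins for spectrum descriptors and fingerprints and is not the authors' IOKR code.

    ```python
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(8)
    spectra_features = rng.normal(size=(100, 20))         # stand-ins for spectrum descriptors
    fingerprints = (rng.random(size=(100, 50)) < 0.3).astype(float)  # toy molecular fingerprints

    krr = KernelRidge(kernel="rbf", alpha=1.0).fit(spectra_features, fingerprints)
    predicted_fp = krr.predict(spectra_features[:1])      # would then be matched against a structure DB
    print(predicted_fp.shape)
    ```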

  12. Method for nonlinear exponential regression analysis

    Science.gov (United States)

    Junkin, B. G.

    1972-01-01

    Two computer programs developed according to two general types of exponential models for conducting nonlinear exponential regression analysis are described. Least squares procedure is used in which the nonlinear problem is linearized by expanding in a Taylor series. Program is written in FORTRAN 5 for the Univac 1108 computer.
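
    The linearize-and-iterate idea described here is essentially a Gauss-Newton scheme. Below is a minimal Python sketch for the exponential model y = a·exp(b·x); the model form, data, and starting values are illustrative assumptions, not a reproduction of the original FORTRAN programs.

    ```python
    import numpy as np

    # Gauss-Newton: linearize y = a*exp(b*x) in (a, b) via a first-order Taylor expansion
    x = np.linspace(0, 2, 50)
    rng = np.random.default_rng(9)
    y = 3.0 * np.exp(1.2 * x) + 0.1 * rng.normal(size=50)

    a, b = 1.0, 1.0                        # starting guesses
    for _ in range(20):
        f = a * np.exp(b * x)              # current model values
        J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])  # Jacobian wrt (a, b)
        step, *_ = np.linalg.lstsq(J, y - f, rcond=None)             # linear least squares
        a, b = a + step[0], b + step[1]
    print(a, b)                            # converges near (3.0, 1.2)
    ```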

  13. Change-based test selection : An empirical evaluation

    NARCIS (Netherlands)

    Soetens, Quinten; Demeyer, Serge; Zaidman, A.E.; Perez, Javier

    2015-01-01

    Regression test selection (i.e., selecting a subset of a given regression test suite) is a problem that has been studied intensely over the last decade. However, with the increasing popularity of developer tests as the driver of the test process, more fine-grained solutions that work well within the
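
    The basic mechanics of change-based selection can be sketched with a hypothetical coverage map from a prior run: keep only the tests whose covered methods intersect the change set. All names are invented for illustration; the paper's tool-level approach is more fine-grained.

    ```python
    # Hypothetical coverage map: test name -> methods it exercised in the last run
    coverage = {
        "test_login": {"auth.validate", "auth.hash"},
        "test_report": {"report.render", "db.query"},
        "test_signup": {"auth.validate", "db.insert"},
    }
    changed_methods = {"auth.validate"}   # methods touched by the current change

    selected = [t for t, methods in coverage.items() if methods & changed_methods]
    print(selected)   # ['test_login', 'test_signup'] -- the reduced regression suite
    ```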

  14. The Necessity of a New Type Test Rig for the Development of an Evaluation Method in Grid Fretting Problems

    International Nuclear Information System (INIS)

    Lee, Young-Ho; Kim, Hyung-Kyu

    2007-01-01

    A grid fretting problem is recognized as one of the most important degradation mechanisms even though the examination results of fretting experiments could be applied to the development and design of spacer grid structures. This is because it is difficult to develop a fretting wear model for a grid fretting problem due to the various wear mechanisms involved according to the mechanical and environmental variables, the contact condition with a spring/dimple and the material properties. A number of spring shapes have been developed in KAERI and their performance tests such as fretting wear, flow-induced vibration (FIV) tests, etc. have been carried out from a part unit to a full assembly scale. From the unit part fretting test results, one of the noticeable results is that the contacting force (normal load) was gradually decreased with increasing number of fretting cycles due to a depth increase and this behavior was closely related to the contacting spring shape. When considering the actual contact condition between a fuel rod and a spring/dimple, if fretting wear progresses due to FIV under a specific normal load exerted on the fuel rod by an elastic deformation of the spring, the contacting force between the fuel rod and the dimple that is located on the opposite side should be decreased. Consequently, an evaluation of developed spacer grids against fretting wear damage should be performed with the results of 1x1 cell unit experiments because the contacting force is one of the most important variables that influences the fretting wear mechanism. The discussion was focused on the development procedure of a new test rig and its performance by using a 1x1 cell unit test rig. (authors)

  15. Regression modeling of ground-water flow

    Science.gov (United States)

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  16. Discriminative Elastic-Net Regularized Linear Regression.

    Science.gov (United States)

    Zhang, Zheng; Lai, Zhihui; Xu, Yong; Shao, Ling; Wu, Jian; Xie, Guo-Sen

    2017-03-01

    In this paper, we aim at learning compact and discriminative linear regression models. Linear regression has been widely used in different problems. However, most of the existing linear regression methods exploit the conventional zero-one matrix as the regression targets, which greatly narrows the flexibility of the regression model. Another major limitation of these methods is that the learned projection matrix fails to precisely project the image features to the target space due to their weak discriminative capability. To this end, we present an elastic-net regularized linear regression (ENLR) framework, and develop two robust linear regression models which possess the following special characteristics. First, our methods exploit two particular strategies to enlarge the margins of different classes by relaxing the strict binary targets into a more feasible variable matrix. Second, a robust elastic-net regularization of singular values is introduced to enhance the compactness and effectiveness of the learned projection matrix. Third, the resulting optimization problem of ENLR has a closed-form solution in each iteration, which can be solved efficiently. Finally, rather than directly exploiting the projection matrix for recognition, our methods employ the transformed features as the new discriminate representations to make final image classification. Compared with the traditional linear regression model and some of its variants, our method is much more accurate in image classification. Extensive experiments conducted on publicly available data sets well demonstrate that the proposed framework can outperform the state-of-the-art methods. The MATLAB codes of our methods are available at http://www.yongxu.org/lunwen.html.
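
    For reference, a plain elastic-net regression (the regularizer that the ENLR framework builds on, though applied there to singular values rather than directly to coefficients) can be sketched with scikit-learn; the data and hyperparameters are illustrative.

    ```python
    import numpy as np
    from sklearn.linear_model import ElasticNet

    rng = np.random.default_rng(10)
    X = rng.normal(size=(120, 8))
    y = X @ np.array([1.5, -2.0, 0, 0, 0.5, 0, 0, 0]) + 0.1 * rng.normal(size=120)

    # l1_ratio mixes the L1 (sparsity) and L2 (grouping/stability) penalties
    enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
    print(enet.coef_)
    ```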

  17. Spontaneous regression of pulmonary bullae

    International Nuclear Information System (INIS)

    Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.

    2002-01-01

    The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bullae in both lungs and/or have a history of pulmonary emphysema. Others have a giant bulla without emphysematous change in the lungs. Our present case had treated lung cancer with no evidence of local recurrence. He had no emphysematous change in lung function test and had no complaints, although the high resolution CT scan shows evidence of underlying minimal changes of emphysema. Ortin and Gurney presented three cases of spontaneous reduction in size of bulla. Interestingly, one of them had a marked decrease in the size of a bulla in association with thickening of the wall of the bulla, which was observed in our patient. The case we describe is of interest not only because of the rarity with which regression of pulmonary bullae has been reported in the literature, but also because of the spontaneous improvement in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd

  18. Implicit collinearity effect in linear regression: Application to basal ...

    African Journals Online (AJOL)

    Collinearity of predictor variables is a severe problem in least squares regression analysis. It contributes to the instability of regression coefficients and leads to poor prediction accuracy. Despite these problems, studies are conducted with a large number of observed and derived variables linked with a response ...

  19. Variable and subset selection in PLS regression

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    2001-01-01

    The purpose of this paper is to present some useful methods for introductory analysis of variables and subsets in relation to PLS regression. We present here methods that are efficient in finding the appropriate variables or subset to use in the PLS regression. The general conclusion is that variable selection is important for successful analysis of chemometric data. An important aspect of the results presented is that lack of variable selection can spoil the PLS regression, and that cross-validation measures using a test set can show larger variation, when we use different subsets of X, than...

  20. Behavioral and Emotional Regulation and Adolescent Substance Use Problems: A Test of Moderation Effects in a Dual-Process Model

    Science.gov (United States)

    Wills, Thomas A.; Pokhrel, Pallav; Morehouse, Ellen; Fenster, Bonnie

    2011-01-01

    In a structural model, we tested how relations of predictors to level of adolescent substance use (tobacco, alcohol, marijuana), and to substance-related impaired-control and behavior problems, are moderated by good self-control and poor regulation in behavioral and emotional domains. The participants were a sample of 1,116 public high-school students. In a multiple-group analysis for good self-control, the paths from negative life events to substance use level and from level to behavior problems were lower among persons scoring higher on good behavioral self-control. In a multiple-group analysis for poor regulation, the paths from negative life events to level and from peer substance use to level were greater among persons scoring higher on poor behavioral (but not emotional) regulation; an inverse path from academic competence to level was greater among persons scoring higher on both aspects of poor regulation. Paths from level to impaired-control and behavior problems were greater among persons scoring higher on both poor behavioral and poor emotional regulation. Theoretical implications for the basis of moderation effects are discussed. PMID:21443302

  1. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

    Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations, facilitates search for minimum order of linear-regression model fitting set of data satisfactorily.
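
    The brief does not give the exact recursion, but a textbook recursive least-squares update conveys the idea of refining coefficients observation by observation without duplicating work; the variable names and initialization below are assumptions.

```python
import numpy as np

def rls(X, y, lam=1e3):
    """Recursive least squares: update coefficients one observation at a time."""
    n, p = X.shape
    beta = np.zeros(p)
    P = lam * np.eye(p)                    # large initial uncertainty
    for xi, yi in zip(X, y):
        k = P @ xi / (1.0 + xi @ P @ xi)   # gain vector
        beta = beta + k * (yi - xi @ beta) # correct by the prediction error
        P = P - np.outer(k, xi @ P)        # rank-one covariance update
    return beta

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=500)
print(rls(X, y))   # approaches the true coefficients [1, -2, 0.5]
```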

  2. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can be high response time, low throughput, or even being out of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach that leverages server-side logs to identify the root causes of performance problems. First, server-side logs are used to recover the call tree of each business transaction. We define a novel distance-based metric computed from call trees for root cause analysis, and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.
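
    A toy sketch of the inverted-index idea for change impact analysis: map each method to the business transactions whose call trees contain it. The transaction and method names are hypothetical; in the paper, call trees are recovered from server-side logs.

```python
from collections import defaultdict

# Hypothetical call trees, flattened to the methods each transaction invokes.
call_trees = {
    "checkout": ["validateCart", "computeTax", "chargeCard"],
    "browse":   ["loadCatalog", "renderPage"],
    "refund":   ["chargeCard", "writeLedger"],
}

# Inverted index: method -> set of transactions that call it.
index = defaultdict(set)
for txn, methods in call_trees.items():
    for m in methods:
        index[m].add(txn)

# Impact of changing chargeCard: every transaction that calls it.
print(sorted(index["chargeCard"]))   # ['checkout', 'refund']
```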

  3. The use of cognitive ability measures as explanatory variables in regression analysis.

    Science.gov (United States)

    Junker, Brian; Schofield, Lynne Steuerle; Taylor, Lowell J

    2012-12-01

    Cognitive ability measures are often taken as explanatory variables in regression analysis, e.g., as a factor affecting a market outcome such as an individual's wage, or a decision such as an individual's education acquisition. Cognitive ability is a latent construct; its true value is unobserved. Nonetheless, researchers often assume that a test score, constructed via standard psychometric practice from individuals' responses to test items, can be safely used in regression analysis. We examine problems that can arise, and suggest that an alternative approach, a "mixed effects structural equations" (MESE) model, may be more appropriate in many circumstances.

  4. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or a skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
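
    One way to impose such bounds is box-constrained least squares; the sketch below uses scipy.optimize.lsq_linear on synthetic data. The bounds and data are illustrative assumptions, not the authors' algorithm or source code.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(4)
A = rng.normal(size=(250, 10))                 # alpha-stream returns (columns)
b = A @ rng.uniform(0.0, 0.2, size=10) + 0.05 * rng.normal(size=250)

# Least squares with each weight constrained to [0, 0.15] for diversification.
res = lsq_linear(A, b, bounds=(0.0, 0.15))
print(res.x)                                   # bounded, non-negative weights
```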

  5. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously-acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  6. Bias and Uncertainty in Regression-Calibrated Models of Groundwater Flow in Heterogeneous Media

    DEFF Research Database (Denmark)

    Cooley, R.L.; Christensen, Steen

    2006-01-01

    small. Model error is accounted for in the weighted nonlinear regression methodology developed to estimate θ* and assess model uncertainties by incorporating the second-moment matrix of the model errors into the weight matrix. Techniques developed by statisticians to analyze classical nonlinear...... are reduced in magnitude. Biases, correction factors, and confidence and prediction intervals were obtained for a test problem for which model error is large to test robustness of the methodology. Numerical results conform with the theoretical analysis....

  7. Syndemics of psychosocial problems and HIV risk: A systematic review of empirical tests of the disease interaction concept.

    Science.gov (United States)

    Tsai, Alexander C; Burns, Bridget F O

    2015-08-01

    In the theory of syndemics, diseases co-occur in particular temporal or geographical contexts due to harmful social conditions (disease concentration) and interact at the level of populations and individuals, with mutually enhancing deleterious consequences for health (disease interaction). This theory has widespread adherents in the field, but the extent to which there is empirical support for the concept of disease interaction remains unclear. In January 2015 we systematically searched 7 bibliographic databases and tracked citations to highly cited publications associated with the theory of syndemics. Of the 783 records, we ultimately included 34 published journal articles, 5 dissertations, and 1 conference abstract. Most studies were based on a cross-sectional design (32 [80%]), were conducted in the U.S. (32 [80%]), and focused on men who have sex with men (21 [53%]). The most frequently studied psychosocial problems were related to mental health (33 [83%]), substance abuse (36 [90%]), and violence (27 [68%]); while the most frequently studied outcome variables were HIV transmission risk behaviors (29 [73%]) or HIV infection (9 [23%]). To test the disease interaction concept, 11 (28%) studies used some variation of a product term, with less than half of these (5/11 [45%]) providing sufficient information to interpret interaction both on an additive and on a multiplicative scale. The most frequently used specification (31 [78%]) to test the disease interaction concept was the sum score corresponding to the total count of psychosocial problems. Although the count variable approach does not test hypotheses about interactions between psychosocial problems, these studies were much more likely than others (14/31 [45%] vs. 0/9 [0%]; χ2 = 6.25, P = 0.01) to incorporate language about "synergy" or "interaction" that was inconsistent with the statistical models used. Therefore, more evidence is needed to assess the extent to which diseases interact, either at the
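
    For contrast with the sum-score approach, a product-term test of interaction on the multiplicative scale might look like the following sketch, with simulated data and hypothetical variable names.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(14)
n = 1000
df = pd.DataFrame({"depress": rng.integers(0, 2, n),   # psychosocial problem 1
                   "subst": rng.integers(0, 2, n)})    # psychosocial problem 2

# Simulate an outcome with a genuine multiplicative-scale interaction.
logit_p = -2 + 0.6 * df.depress + 0.5 * df.subst + 0.7 * df.depress * df.subst
df["risk"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# "depress * subst" expands to both main effects plus their product term.
fit = smf.logit("risk ~ depress * subst", data=df).fit(disp=0)
print(fit.params["depress:subst"])   # the interaction estimate, with its own SE
```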

  8. Valuation of environmental problems in landfill deposition and composting - test of methodology; Verdsetting av miljoekonsekvenser av avfallsdeponering og kompostering - metodeutproeving

    Energy Technology Data Exchange (ETDEWEB)

    Leknes, Einar; Movik, Espen; Wiik, Ragnhild; Meissnes, Rudolf

    1995-08-01

    This study is aimed at testing and designing methods for the valuation of environmental problems associated with landfill deposition of household waste. An extensive literature review has been conducted with respect to environmental impacts and valuation methods. Environmental impact assessment and valuation with respect to emission of greenhouse gases (GHGs), leachate and disamenity have been performed for 4 Norwegian landfills. These differ in their approach to waste treatment in terms of GHG collection, briquette production and composting, and also in their location in terms of proximity to residential areas and the quality of natural recipients. The study shows that the collection of methane and production of briquettes cause major reductions in the generation of GHGs, whereas composting brings significant reductions for all types of environmental impacts. (author)

  9. Beta-Test Data On An Assessment Of Textbook Problem Solving Ability: An Argument For Right/Wrong Grading?

    Science.gov (United States)

    Cummings, Karen; Marx, Jeffrey D.

    2010-10-01

    We have developed an assessment of students' ability to solve standard textbook style problems and are currently engaged in the validation and revision process. The assessment covers the topics of force and motion, conservation of momentum and conservation of energy at a level consistent with most calculus-based, introductory physics courses. This tool is discussed in more detail in an accompanying paper by Marx and Cummings. [1] Here we present preliminary beta-test data collected at four schools during the 2009/2010 academic year. Data include both pre- and post-instruction results for introductory physics courses as well as results for physics majors in later years. In addition, we present evidence that right/wrong grading may well be a perfectly acceptable grading procedure for a course-level assessment of this type.

  11. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues on the validity of the model and diagnostic tools. Each methodological aspect is explored and
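
    A minimal sketch of quantile regression in practice, fitting the median and the 0.9 quantile with statsmodels' QuantReg on simulated heteroscedastic data; the data and chosen quantiles are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, size=300)
y = 1.0 + 0.5 * x + rng.normal(scale=0.5 + 0.2 * x)   # noise grows with x
X = sm.add_constant(x)

for q in (0.5, 0.9):
    fit = sm.QuantReg(y, X).fit(q=q)
    print(q, fit.params)   # the 0.9 slope exceeds the median slope here
```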

  12. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such exercise. The ISP No. 41 exercise was born of a recommendation at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes'. [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementarity to other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  13. An Experimental Copyright Moratorium: Study of a Proposed Solution to the Copyright Photocopying Problem. Final Report to the American Society for Testing and Materials (ASTM).

    Science.gov (United States)

    Heilprin, Laurence B.

    The Committee to Investigate Copyright Problems (CICP), a non-profit organization dedicated to resolving the conflict known as the "copyright photocopying problem" was joined by the American Society for Testing and Materials (ASTM), a large national publisher of technical and scientific standards, in a plan to simulate a long-proposed…

  14. [Application of detecting and taking overdispersion into account in Poisson regression model].

    Science.gov (United States)

    Bouche, G; Lepage, B; Migeot, V; Ingrand, P

    2009-08-01

    Researchers often use the Poisson regression model to analyze count data. Overdispersion can occur when a Poisson regression model is used, resulting in an underestimation of the variance of the regression model parameters. Our objective was to take overdispersion into account and assess its impact, with an illustration based on data from a study investigating the relationship between use of the Internet to seek health information and the number of primary care consultations. Three methods (overdispersed Poisson, a robust estimator, and negative binomial regression) were used to take overdispersion into account in explaining variation in the number (Y) of primary care consultations. We tested for overdispersion in the Poisson regression model using the ratio of the sum of squared Pearson residuals over the number of degrees of freedom (chi(2)/df). We then fitted the three models and compared parameter estimates to those given by the Poisson regression model. The variance of the number of primary care consultations (Var[Y]=21.03) was greater than the mean (E[Y]=5.93) and the chi(2)/df ratio was 3.26, which confirmed overdispersion. Standard errors of the parameters varied greatly between the Poisson regression model and the three other regression models. The interpretation of estimates for two variables (using the Internet to seek health information and single-parent family) would have changed according to the model retained, with significance levels of 0.06 and 0.002 (Poisson), 0.29 and 0.09 (overdispersed Poisson), 0.29 and 0.13 (robust estimator) and 0.45 and 0.13 (negative binomial), respectively. Different methods exist to solve the problem of underestimated variance in the Poisson regression model when overdispersion is present. The negative binomial regression model seems particularly appropriate because of its theoretical distribution; in addition, this regression is easy to perform with ordinary statistical software packages.
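
    The workflow described above can be sketched as follows: fit a Poisson model, inspect the Pearson chi-square to degrees-of-freedom ratio, and refit with a negative binomial family. The data are simulated and overdispersed by construction; this is not the consultations data set.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.normal(size=500)
mu = np.exp(0.5 + 0.7 * x)
y = rng.negative_binomial(n=2, p=2 / (2 + mu))    # overdispersed counts, mean mu
X = sm.add_constant(x)

pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
ratio = pois.pearson_chi2 / pois.df_resid
print("chi2/df:", ratio)                          # well above 1 => overdispersion

nb = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
print(nb.params, nb.bse)                          # larger SEs than the Poisson fit
```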

  15. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were applied to five years of data (1998-2002) on average humidity, rainfall, and maximum and minimum temperatures. Relationships for regression analysis time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). The correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous, while those for humidity were. Our results on regression and regression analysis time series show the best fit for prediction modeling of the climatic data of Quetta, Pakistan. (author)
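
    As a small illustration of polynomial trend fitting with a coefficient of determination, the sketch below uses numpy only; the series is simulated, not the Quetta data.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(60)                               # e.g., a monthly index
y = 20 + 0.05 * t - 0.0008 * t**2 + rng.normal(scale=0.5, size=60)

coeffs = np.polyfit(t, y, deg=2)                # quadratic trend
y_hat = np.polyval(coeffs, t)

# Coefficient of determination: 1 - SS_residual / SS_total.
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("R^2:", r2)
```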

  16. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...

  17. Logistic Regression: Concept and Application

    Science.gov (United States)

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
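
    A minimal sketch of binary logistic regression for classifying group membership, using scikit-learn on simulated data; the names and coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
X = rng.normal(size=(200, 2))                      # two independent variables
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=200) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
print(clf.coef_, clf.intercept_)                   # fitted log-odds coefficients
print("membership probabilities:", clf.predict_proba(X[:3]))
```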

  18. Fungible weights in logistic regression.

    Science.gov (United States)

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age
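
    The sketch below contrasts ordinary least squares with orthogonal regression when both variables carry measurement error, using scipy.odr on simulated data; the error scales are assumptions chosen to make the OLS attenuation visible.

```python
import numpy as np
from scipy import odr

rng = np.random.default_rng(9)
true_x = rng.uniform(0, 10, 200)
x = true_x + rng.normal(scale=1.0, size=200)   # error in the "independent" variable
y = 2.0 * true_x + 1.0 + rng.normal(scale=1.0, size=200)

# Orthogonal distance regression accounts for error in both x and y.
linear = odr.Model(lambda beta, x: beta[0] * x + beta[1])
fit = odr.ODR(odr.RealData(x, y, sx=1.0, sy=1.0), linear, beta0=[1.0, 0.0]).run()
print("orthogonal slope:", fit.beta[0])        # close to the true slope of 2
print("OLS slope:", np.polyfit(x, y, 1)[0])    # attenuated toward zero
```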

  20. A Behavioral Test of Accepting Benefits that Cost Others: Associations with Conduct Problems and Callous-Unemotionality

    Science.gov (United States)

    Sakai, Joseph T.; Dalwani, Manish S.; Gelhorn, Heather L.; Mikulich-Gilbertson, Susan K.; Crowley, Thomas J.

    2012-01-01

    Background Youth with conduct problems (CP) often make decisions which value self-interest over the interests of others. Self-benefiting behavior despite loss to others is especially common among youth with CP and callous-unemotional traits (CU). Such behavioral tendencies are generally measured using self- or observer-report. We are unaware of attempts to measure this tendency with a behavioral paradigm. Methods/Principal Findings In our AlAn's (altruism-antisocial) game a computer program presents subjects with a series of offers in which they will receive money but a planned actual charity donation will be reduced; subjects decide to accept or reject each offer. We tested (1) whether adolescent patients with CP (n = 20) compared with adolescent controls (n = 19) differed on AlAn's game outcomes, (2) whether youths with CP and CU differed significantly from controls without CP or CU, and (3) whether AlAn's game outcomes correlated significantly with CP and separately, CU severity. Patients with CP and CU compared with controls without these problems took significantly more money for themselves and left significantly less money in the charity donation; AlAn's game outcomes were significantly correlated with CU, but not CP. Conclusions/Significance In the AlAn's game adolescents with conduct problems and CU traits, compared with controls without CP/CU, are disposed to benefit themselves while costing others even in a novel situation, devoid of peer influences, where anonymity is assured, reciprocity or retribution are impossible, intoxication is absent and when the “other” to be harmed is considered beneficent. AlAn's game outcomes are associated with measures of CU. Results suggest that the AlAn's game provides an objective means of capturing information about CU traits. The AlAn's game, which was designed for future use in the MRI environment, may be used in studies attempting to identify the neural correlates of self-benefiting decision-making. PMID

  1. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  2. Tumor regression patterns in retinoblastoma

    International Nuclear Information System (INIS)

    Zafar, S.N.; Siddique, S.N.; Zaheer, N.

    2016-01-01

    To observe the types of tumor regression after treatment, and identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to the Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Of the 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)

  3. The neural correlates of problem states : Testing fMRI predictions of a computational model of multitasking

    NARCIS (Netherlands)

    Borst, J.P.; Taatgen, N.A.; Stocco, A.; Van Rijn, D.H.

    2010-01-01

    Background: It has been shown that people can only maintain one problem state, or intermediate mental representation, at a time. When more than one problem state is required, for example in multitasking, performance decreases considerably. This effect has been explained in terms of a problem state

  4. Comparison of multinomial logistic regression and logistic regression: which is more efficient in allocating land use?

    Science.gov (United States)

    Lin, Yingzhi; Deng, Xiangzheng; Li, Xing; Ma, Enjun

    2014-12-01

    Spatially explicit simulation of land use change is the basis for estimating the effects of land use and cover change on energy fluxes, ecology and the environment. At the pixel level, logistic regression is one of the most common approaches used in spatially explicit land use allocation models to determine the relationship between land use and its causal factors in driving land use change, and thereby to evaluate land use suitability. However, these models have a drawback in that they do not determine/allocate land use based on the direct relationship between land use change and its driving factors. Consequently, a multinomial logistic regression method was introduced to address this flaw, and thereby, judge the suitability of a type of land use in any given pixel in a case study area of the Jiangxi Province, China. A comparison of the two regression methods indicated that the proportion of correctly allocated pixels using multinomial logistic regression was 92.98%, which was 8.47% higher than that obtained using logistic regression. Paired t-test results also showed that pixels were more clearly distinguished by multinomial logistic regression than by logistic regression. In conclusion, multinomial logistic regression is a more efficient and accurate method for the spatial allocation of land use changes. The application of this method in future land use change studies may improve the accuracy of predicting the effects of land use and cover change on energy fluxes, ecology, and environment.
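
    A minimal sketch of the comparison, fitting a one-vs-rest ensemble of binary logistic models against a single multinomial logistic model with scikit-learn; the pixel features and land-use labels are simulated stand-ins for the Jiangxi data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(10)
X = rng.normal(size=(300, 4))                      # driving factors per pixel
y = np.argmax(X[:, :3] + 0.5 * rng.normal(size=(300, 3)), axis=1)  # 3 land-use types

ovr = OneVsRestClassifier(LogisticRegression()).fit(X, y)  # separate binary logits
mnl = LogisticRegression().fit(X, y)                       # multinomial by default
print("one-vs-rest accuracy:", ovr.score(X, y))
print("multinomial accuracy:", mnl.score(X, y))
```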

  5. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.

  6. Distributed Monitoring of the R2 Statistic for Linear Regression

    Data.gov (United States)

    National Aeronautics and Space Administration — The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and...

  7. Block-GP: Scalable Gaussian Process Regression for Multimodal Data

    Data.gov (United States)

    National Aeronautics and Space Administration — Regression problems on massive data sets are ubiquitous in many application domains including the Internet, earth and space sciences, and finances. In many cases,...
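
    For orientation, the sketch below shows standard Gaussian process regression with scikit-learn; it does not reproduce Block-GP's partitioning of multimodal data, only the base regressor such an approach would apply within each block.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(12)
X = rng.uniform(0, 10, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)

# RBF kernel for the smooth signal plus a white-noise term for the residuals.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)
mean, std = gpr.predict(np.array([[2.5], [7.5]]), return_std=True)
print(mean, std)   # predictions with pointwise uncertainty
```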

  8. General Nature of Multicollinearity in Multiple Regression Analysis.

    Science.gov (United States)

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)

  9. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    Directory of Open Access Journals (Sweden)

    Zhiming Song

    2015-01-01

    As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m-1)-dimensional manifold in the decision space under some mild conditions. How to utilize this regularity to design multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, the centroid of which is an (m-1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.

  10. Sparse reduced-rank regression with covariance estimation

    KAUST Repository

    Chen, Lisha

    2014-12-08

    Improving the prediction performance of multiple response regression compared with separate linear regressions is a challenging question. On the one hand, it is desirable to seek model parsimony when facing a large number of parameters. On the other hand, for certain applications it is necessary to take into account the general covariance structure for the errors of the regression model. We assume a reduced-rank regression model and work with the likelihood function with general error covariance to achieve both objectives. In addition we propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty, and to estimate the error covariance matrix simultaneously by using a similar penalty on the precision matrix. We develop a numerical algorithm to solve the penalized regression problem. In a simulation study and real data analysis, the new method is compared with two recent methods for multivariate regression and exhibits competitive performance in prediction and variable selection.

  12. Detecting and Analyzing I/O Performance Regressions

    NARCIS (Netherlands)

    Bezemer, C.P.; Milon, E.; Zaidman, A.; Pouwelse, J.

    2014-01-01

    Regression testing can be done by re-executing a test suite on different software versions and comparing the outcome. For functional testing, the outcome of such tests is either pass (correct behaviour) or fail (incorrect behaviour). For non-functional testing, such as performance testing, this is
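
    One way to turn performance-test outcomes into a pass/fail signal is to compare latency samples from two versions with a rank-sum test, as in the sketch below; the threshold and data are illustrative assumptions, not the paper's method.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(13)
old_ms = rng.lognormal(mean=3.0, sigma=0.3, size=500)   # baseline latencies
new_ms = rng.lognormal(mean=3.1, sigma=0.3, size=500)   # candidate latencies

# H1: the old latencies are stochastically smaller, i.e., the new version is slower.
stat, p = mannwhitneyu(old_ms, new_ms, alternative="less")
print("regression detected" if p < 0.01 else "pass", f"(p={p:.4g})")
```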

  13. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying; Carroll, Raymond J.

    2009-01-01

    The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a

  14. From Rasch scores to regression

    DEFF Research Database (Denmark)

    Christensen, Karl Bang

    2006-01-01

    Rasch models provide a framework for measurement and modelling latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties... This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables, and latent regression models based on the distribution of the score.

  15. Regression methods for medical research

    CERN Document Server

    Tai, Bee Choo

    2013-01-01

    Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the

  16. Forecasting with Dynamic Regression Models

    CERN Document Server

    Pankratz, Alan

    2012-01-01

    One of the most widely used tools in statistical forecasting, the single-equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.

  17. Piecewise linear regression splines with hyperbolic covariates

    International Nuclear Information System (INIS)

    Cologne, John B.; Sposto, Richard

    1992-09-01

    Consider the problem of fitting a curve to data that exhibit a multiphase linear response with smooth transitions between phases. We propose substituting hyperbolas as covariates in piecewise linear regression splines to obtain curves that are smoothly joined. The method provides an intuitive and easy way to extend the two-phase linear hyperbolic response models of Griffiths and Miller and of Watts and Bacon to accommodate more than two linear segments. The resulting regression spline with hyperbolic covariates may be fit by nonlinear regression methods to estimate the degree of curvature between adjoining linear segments. The added complexity of fitting nonlinear, as opposed to linear, regression models is not great. The extra effort is particularly worthwhile when investigators are unwilling to assume that the slope of the response changes abruptly at the join points. We can also estimate the join points (the values of the abscissas where the linear segments would intersect if extrapolated) if their number and approximate locations may be presumed known. An example using data on changing age at menarche in a cohort of Japanese women illustrates the use of the method for exploratory data analysis. (author)
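
    A minimal sketch of the idea under an assumed parameterization: a bent-hyperbola model in which a single hyperbolic term smooths the transition between two linear phases, fit by nonlinear least squares. The exact form used in the paper may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def bent_hyperbola(x, b0, b1, b2, c, gamma):
    # b1 sets the mean slope, b2 the half-change in slope at join point c,
    # and gamma > 0 controls how gradually the two linear phases blend.
    return b0 + b1 * (x - c) + b2 * np.sqrt((x - c) ** 2 + gamma ** 2)

rng = np.random.default_rng(11)
x = np.linspace(0, 10, 200)
y = bent_hyperbola(x, 5.0, 0.2, 0.8, 6.0, 0.5) + 0.1 * rng.normal(size=200)

popt, _ = curve_fit(bent_hyperbola, x, y, p0=[4.0, 0.0, 1.0, 5.0, 1.0])
print("estimated join point:", popt[3])   # near the true value of 6.0
```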

  18. Low adolescent self-esteem leads to multiple interpersonal problems: a test of social-adaptation theory.

    Science.gov (United States)

    Kahle, L R; Kulka, R A; Klingel, D M

    1980-09-01

    This article reports the results of a study that annually monitored the self-esteem and interpersonal problems of over 100 boys during their sophomore, junior, and senior years of high school. Cross-lagged panel correlation differences show that low self-esteem leads to interpersonal problems in all three time lags when multiple interpersonal problems constitute the dependent variable but not when single interpersonal problem criteria constitute the dependent variable. These results are interpreted as supporting social-adaptation theory rather than self-perception theory. Implications for the conceptual status of personality variables as causal antecedents and for the assessment of individual differences are discussed.

  19. Demonstration of a Fiber Optic Regression Probe

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt A.

    2010-01-01

    The capability to provide localized, real-time monitoring of material regression rates in various applications has the potential to provide a new stream of data for development testing of various components and systems, as well as serving as a monitoring tool in flight applications. These applications include, but are not limited to, the regression of a combusting solid fuel surface, the ablation of the throat in a chemical rocket or the heat shield of an aeroshell, and the monitoring of erosion in long-life plasma thrusters. The rate of regression in the first application is very fast, while the second and third are increasingly slower. A recent fundamental sensor development effort has led to a novel regression, erosion, and ablation sensor technology (REAST). The REAST sensor allows for measurement of real-time surface erosion rates at a discrete surface location. The sensor is optical, using two different, co-located fiber-optics to perform the regression measurement. The disparate optical transmission properties of the two fiber-optics make it possible to measure the regression rate by monitoring the relative light attenuation through the fibers. As the fibers regress along with the parent material in which they are embedded, the relative light intensities through the two fibers change, providing a measure of the regression rate. The optical nature of the system makes it relatively easy to use in a variety of harsh, high temperature environments, and it is also unaffected by the presence of electric and magnetic fields. In addition, the sensor could be used to perform optical spectroscopy on the light emitted by a process and collected by fibers, giving localized measurements of various properties. The capability to perform an in-situ measurement of material regression rates is useful in addressing a variety of physical issues in various applications. An in-situ measurement allows for real-time data regarding the erosion rates, providing a quick method for

  20. Stochastic development regression using method of moments

    DEFF Research Database (Denmark)

    Kühnel, Line; Sommer, Stefan Horst

    2017-01-01

    This paper considers the estimation problem arising when inferring parameters in the stochastic development regression model for manifold valued non-linear data. Stochastic development regression captures the relation between manifold-valued response and Euclidean covariate variables using...... the stochastic development construction. It is thereby able to incorporate several covariate variables and random effects. The model is intrinsically defined using the connection of the manifold, and the use of stochastic development avoids linearizing the geometry. We propose to infer parameters using...... the Method of Moments procedure that matches known constraints on moments of the observations conditional on the latent variables. The performance of the model is investigated in a simulation example using data on finite dimensional landmark manifolds....