WorldWideScience

Sample records for optimal hypothesis testing

  1. Chaotic annealing with hypothesis test for function optimization in noisy environments

    International Nuclear Information System (INIS)

    Pan Hui; Wang Ling; Liu Bo

    2008-01-01

As a special mechanism to avoid entrapment in local minima, the ergodicity of chaos has been used as a novel search technique for optimization problems, but there has been no work on chaos for optimization in noisy environments. In this paper, the performance of chaotic annealing (CA) for uncertain function optimization is investigated, and a new hybrid approach (namely CAHT) that combines CA and hypothesis testing (HT) is proposed. In CAHT, the merits of CA are applied for effective exploration and exploitation of the search space, and solution quality is identified reliably by hypothesis testing, which reduces repeated search to some extent and provides reasonable estimates of solution performance. Simulation results and comparisons show that chaos helps improve the performance of SA for uncertain function optimization, and that CAHT further improves search efficiency, quality and robustness.
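The CA + HT idea in this abstract can be sketched in miniature: a descent over a noisy objective that accepts a candidate move only when a two-sample test statistic says the candidate is significantly better, so a lucky noisy evaluation cannot drive the search. This is an illustrative sketch, not the authors' algorithm; the objective `noisy_sphere`, the logistic-map perturbation, the sample size and the acceptance threshold are all assumptions.

```python
import random
import statistics

def noisy_sphere(x, noise=0.1):
    """Noisy objective: f(x) = x^2 plus Gaussian observation noise."""
    return x * x + random.gauss(0.0, noise)

def significantly_better(x_new, x_old, n=30, t_crit=2.0):
    """Hypothesis-test acceptance: re-evaluate both points n times and
    accept only if the mean improvement exceeds t_crit standard errors."""
    a = [noisy_sphere(x_new) for _ in range(n)]
    b = [noisy_sphere(x_old) for _ in range(n)]
    se = ((statistics.variance(a) + statistics.variance(b)) / n) ** 0.5
    return (statistics.mean(b) - statistics.mean(a)) > t_crit * se

random.seed(1)
x, z = 2.0, 0.7                 # z drives the chaotic perturbation
for _ in range(200):
    z = 4.0 * z * (1.0 - z)     # ergodic logistic map on (0, 1)
    candidate = x + 0.5 * (2.0 * z - 1.0)
    if significantly_better(candidate, x):
        x = candidate
print(abs(x) < 1.0)
```

Because only statistically significant improvements are accepted, the search never drifts far uphill on noise alone, which is the point of the HT component.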

  2. A test of the herbivore optimization hypothesis using muskoxen and a graminoid meadow plant community

    Directory of Open Access Journals (Sweden)

    David L. Smith

    1996-01-01

Full Text Available A prediction from the herbivore optimization hypothesis is that grazing by herbivores at moderate intensities will increase net above-ground primary productivity more than at lower or higher intensities. I tested this hypothesis in an area of high muskox (Ovibos moschatus) density on north-central Banks Island, Northwest Territories, Canada (73°50'N, 119°53'W). Plots (1 m²) in graminoid meadows dominated by cottongrass (Eriophorum triste) were either clipped, exposed to muskoxen, protected for part of one growing season, or permanently protected. This resulted in the removal of 22-44%, 10-39%, 0-39% or 0%, respectively, of shoot tissue during each growing season. Contrary to the predictions of the herbivore optimization hypothesis, productivity did not increase across this range of tissue removal. Productivity of plants clipped at 1.5 cm above ground once or twice per growing season declined by 60 ± 5% in 64% of the tests. The productivity of plants grazed by muskoxen declined by 56 ± 7% in 25% of the tests. No significant change in productivity was observed in 36% and 75% of the tests in clipped and grazed treatments, respectively. Clipping and grazing reduced below-ground standing crop except where removals were small. Grazing and clipping did not stimulate productivity of north-central Banks Island graminoid meadows.

  3. A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Gustavo Miranda da Silva

    2015-09-01

Full Text Available This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that at the same time are optimal from a statistical perspective and that also yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail to exhibit such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under a Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships under a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate through simple examples optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove that the performance of logically-consistent hypothesis testing by means of a Bayes point estimator is optimal only under very restrictive conditions.
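The coherence property discussed here (if A implies B, rejecting B should force rejecting A) falls out naturally for tests that reject a hypothesis when its posterior probability is small, since A ⊆ B gives P(A | data) ≤ P(B | data). A minimal discrete sketch, with invented prior and likelihood values:

```python
from fractions import Fraction

# Discrete parameter space with a uniform prior (illustrative numbers).
thetas = [0, 1, 2]
prior = {t: Fraction(1, 3) for t in thetas}
like = {0: Fraction(9, 10), 1: Fraction(1, 20), 2: Fraction(1, 20)}

# Bayes' rule: posterior proportional to prior times likelihood.
evidence = sum(prior[t] * like[t] for t in thetas)
post = {t: prior[t] * like[t] / evidence for t in thetas}

def reject(hypothesis, threshold=Fraction(3, 20)):
    """Reject H iff its posterior probability falls below the threshold."""
    return sum(post[t] for t in hypothesis) < threshold

A = {1}       # A: theta == 1
B = {1, 2}    # B: theta in {1, 2}, so A implies B
# Coherence: P(A|data) <= P(B|data), so rejecting B entails rejecting A.
print(reject(B), reject(A))
```

This only demonstrates coherence; the paper's point is that satisfying all three requisites at once while staying statistically optimal is impossible in general.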

  4. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    Science.gov (United States)

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.

  5. [Dilemma of the null hypothesis in experimental tests of ecological hypotheses].

    Science.gov (United States)

    Li, Ji

    2016-06-01

Experimental testing is one of the major methods for testing ecological hypotheses, though there are many arguments concerning the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis-deduction model of Platt (1964) and concluded that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and alternative hypothesis H1′ (α′=1, β′=0) in ecological processes differ from those in classical physics, the ecological null hypothesis likewise cannot be strictly tested experimentally. These dilemmas of the null hypothesis can be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality in an ecological hypothesis. Hence, findings and conclusions from methodological studies and experimental tests based on NHST are not always logically reliable.

  6. Hypothesis Designs for Three-Hypothesis Test Problems

    OpenAIRE

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  7. Local hypothesis testing between a pure bipartite state and the white noise state

    OpenAIRE

    Owari, Masaki; Hayashi, Masahito

    2010-01-01

    In this paper, we treat a local discrimination problem in the framework of asymmetric hypothesis testing. We choose a known bipartite pure state $\\ket{\\Psi}$ as an alternative hypothesis, and the completely mixed state as a null hypothesis. As a result, we analytically derive an optimal type 2 error and an optimal POVM for one-way LOCC POVM and Separable POVM. For two-way LOCC POVM, we study a family of simple three-step LOCC protocols, and show that the best protocol in this family has stric...

  8. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

Full Text Available Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two, or more than two at a time). The same research question may be explored by more than one type of hypothesis test
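The P value defined in this abstract can be computed exactly for a simple case. A hedged sketch of a two-sided exact binomial test, using only the standard library and invented data:

```python
from math import comb

def binomial_p_value(k, n, p0=0.5):
    """Two-sided exact binomial test of H0: success probability == p0.
    Returns the probability, under H0, of any outcome no more likely
    than the observed count k."""
    pmf = [comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(n + 1)]
    observed = pmf[k]
    # Sum the null probabilities of all outcomes at least as "extreme"
    # (i.e., no more probable) than the one observed.
    return min(1.0, sum(p for p in pmf if p <= observed + 1e-12))

# 41 successes in 50 trials: strong evidence against H0: p = 0.5.
print(binomial_p_value(41, 50) < 0.05)
```

The same "at least as extreme as observed, assuming H0 is true" logic underlies every P value, whatever the test statistic.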

  9. Privacy on Hypothesis Testing in Smart Grids

    OpenAIRE

    Li, Zuxing; Oechtering, Tobias

    2015-01-01

In this paper, we study the problem of privacy information leakage in a smart grid. The privacy risk is assumed to be caused by an unauthorized binary hypothesis test of the consumer's behaviour based on the smart meter readings of energy supplied by the energy provider. Additional energy is supplied by an alternative energy source. A controller equipped with an energy storage device manages the energy inflows to satisfy the energy demand of the consumer. We study the optimal ener...

  10. The hubris hypothesis: The downside of comparative optimism displays.

    Science.gov (United States)

    Hoorens, Vera; Van Damme, Carolien; Helweg-Larsen, Marie; Sedikides, Constantine

    2017-04-01

    According to the hubris hypothesis, observers respond more unfavorably to individuals who express their positive self-views comparatively than to those who express their positive self-views non-comparatively, because observers infer that the former hold a more disparaging view of others and particularly of observers. Two experiments extended the hubris hypothesis in the domain of optimism. Observers attributed less warmth (but not less competence) to, and showed less interest in affiliating with, an individual displaying comparative optimism (the belief that one's future will be better than others' future) than with an individual displaying absolute optimism (the belief that one's future will be good). Observers responded differently to individuals displaying comparative versus absolute optimism, because they inferred that the former held a gloomier view of the observers' future. Consistent with previous research, observers still attributed more positive traits to a comparative or absolute optimist than to a comparative or absolute pessimist. Copyright © 2016. Published by Elsevier Inc.

  11. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single- and multichannel detection algorithms, which are inefficient at too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that in the four relevant configurations of a pedestrian source carrier under respectively high and low count rate radioactive background, and a vehicle source carrier under the same respectively high and low count rate radioactive background, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
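The advantage of exploiting the Poisson nature of counting signals can be illustrated with a toy detector: under the background-only (null) hypothesis the registered count is Poisson, so the tail probability is known exactly and no empirical variance estimate is needed. A sketch with invented count rates, not the paper's algorithm:

```python
from math import exp, factorial

def poisson_tail(k, lam):
    """P(N >= k) for N ~ Poisson(lam): survival function by direct sum."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

def detect(count, background, alpha=1e-3):
    """Flag an alarm if the registered count is improbably high under
    the background-only null hypothesis at significance level alpha."""
    return poisson_tail(count, background) < alpha

# 30 counts over a background mean of 10 triggers; 12 counts does not.
print(detect(30, 10), detect(12, 10))
```

Because the null distribution is fully specified by the background rate, the false alarm probability is controlled analytically rather than estimated from data.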

  12. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  13. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test uses the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test allows one to approximate the errors involved in the asymptotic hypothesis test. The paper also develops Mathematica code for the test algorithm.
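The bootstrap idea used here, approximating the null distribution of a statistic by resampling, can be sketched for a simple statistic. The kernel-regression distance measure of the paper is replaced by a plain mean difference, and the data are invented:

```python
import random

def bootstrap_mean_diff_test(x, y, n_boot=2000, seed=0):
    """Bootstrap test of H0: equal means. Resamples from the pooled
    sample (which imposes H0), recomputes the mean difference, and
    returns the fraction of bootstrap differences at least as large
    as the observed one (an approximate p-value)."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = x + y
    count = 0
    for _ in range(n_boot):
        bx = rng.choices(pooled, k=len(x))
        by = rng.choices(pooled, k=len(y))
        if abs(sum(bx) / len(bx) - sum(by) / len(by)) >= observed:
            count += 1
    return count / n_boot

x = [5.1, 4.9, 5.3, 5.0, 5.2, 5.1, 4.8, 5.2]
y = [6.0, 6.2, 5.9, 6.1, 6.3, 6.0, 5.8, 6.1]
print(bootstrap_mean_diff_test(x, y) < 0.05)
```

The same recipe works for any statistic, which is why it pairs naturally with nonparametric distance measures whose asymptotic distributions are awkward.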

  14. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: the value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor; the value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
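For intuition on how a Bayes factor depends on τ², here is a sketch for the textbook normal case, H0: μ = 0 versus H1: μ ~ N(0, τ²), an assumption standing in for the model in the slides:

```python
from math import exp, pi, sqrt

def normal_pdf(x, var):
    return exp(-x * x / (2 * var)) / sqrt(2 * pi * var)

def bayes_factor_01(xbar, n, tau2, sigma2=1.0):
    """BF_01 for H0: mu = 0 vs H1: mu ~ N(0, tau2), given the mean
    xbar of n observations with known variance sigma2.  Under H1 the
    marginal distribution of xbar is N(0, tau2 + sigma2 / n)."""
    return normal_pdf(xbar, sigma2 / n) / normal_pdf(xbar, tau2 + sigma2 / n)

# The tau2 minimizing BF_01 is the choice least favorable to H0;
# a grid search recovers it near xbar**2 - sigma2 / n.
best_bf, best_tau2 = min(
    (bayes_factor_01(0.3, 100, t / 100.0), t / 100.0) for t in range(1, 201)
)
print(bayes_factor_01(0.05, 100, 1.0) > 1, best_tau2)
```

Minimizing over τ² yields the Bayes factor least favorable to H0, which is the construction the slides describe.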

  15. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    Science.gov (United States)

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  16. An algorithm for testing the efficient market hypothesis.

    Directory of Open Access Journals (Sweden)

    Ioana-Andreea Boboc

    Full Text Available The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA, Moving Average Convergence Divergence (MACD, Relative Strength Index (RSI and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH.

  17. An algorithm for testing the efficient market hypothesis.

    Science.gov (United States)

    Boboc, Ioana-Andreea; Dinică, Mihai-Cristian

    2013-01-01

    The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH).
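The indicators named in this abstract are simple recursions. A hedged stdlib sketch of EMA and RSI, with a MACD-style fast/slow EMA comparison, on an invented price series (not the paper's trading system):

```python
def ema(prices, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def rsi(prices, period=14):
    """Relative Strength Index over the trailing `period` price changes."""
    deltas = [b - a for a, b in zip(prices, prices[1:])]
    gains = [max(d, 0.0) for d in deltas[-period:]]
    losses = [max(-d, 0.0) for d in deltas[-period:]]
    avg_gain, avg_loss = sum(gains) / period, sum(losses) / period
    if avg_loss == 0:
        return 100.0
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

prices = [1.10 + 0.001 * i for i in range(30)]      # steadily rising series
signal = ema(prices, 12)[-1] > ema(prices, 26)[-1]  # fast EMA above slow EMA
print(signal, rsi(prices))
```

A genetic algorithm like the one in the paper would search over parameters such as the EMA spans and RSI thresholds, scoring each candidate rule set by its profit on the training period.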

  18. Habitat fragmentation, vole population fluctuations, and the ROMPA hypothesis: An experimental test using model landscapes.

    Science.gov (United States)

    Batzli, George O

    2016-11-01

    Increased habitat fragmentation leads to smaller size of habitat patches and to greater distance between patches. The ROMPA hypothesis (ratio of optimal to marginal patch area) uniquely links vole population fluctuations to the composition of the landscape. It states that as ROMPA decreases (fragmentation increases), vole population fluctuations will increase (including the tendency to display multi-annual cycles in abundance) because decreased proportions of optimal habitat result in greater population declines and longer recovery time after a harsh season. To date, only comparative observations in the field have supported the hypothesis. This paper reports the results of the first experimental test. I used prairie voles, Microtus ochrogaster, and mowed grassland to create model landscapes with 3 levels of ROMPA (high with 25% mowed, medium with 50% mowed and low with 75% mowed). As ROMPA decreased, distances between patches of favorable habitat (high cover) increased owing to a greater proportion of unfavorable (mowed) habitat. Results from the first year with intensive live trapping indicated that the preconditions for operation of the hypothesis existed (inversely density dependent emigration and, as ROMPA decreased, increased per capita mortality and decreased per capita movement between optimal patches). Nevertheless, contrary to the prediction of the hypothesis that populations in landscapes with high ROMPA should have the lowest variability, 5 years of trapping indicated that variability was lowest with medium ROMPA. The design of field experiments may never be perfect, but these results indicate that the ROMPA hypothesis needs further rigorous testing. © 2016 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  19. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  20. The potential for increased power from combining P-values testing the same hypothesis.

    Science.gov (United States)

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
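Fisher's combination test mentioned here has a closed form in the standard library, because the combined statistic −2 Σ ln pᵢ is χ² with an even number of degrees of freedom under H0. A sketch, with invented p-values, comparing it against a Bonferroni-adjusted minimum p-value (one simple way to use the minimum; the paper's randomization-based approach is not reproduced here):

```python
from math import exp, factorial, log

def fisher_combined_p(pvalues):
    """Fisher's combination: X = -2 * sum(ln p_i) ~ chi-square with
    2k degrees of freedom under H0.  For even df the survival function
    has the closed form exp(-x/2) * sum_{i<k} (x/2)**i / i!."""
    k = len(pvalues)
    x = -2.0 * sum(log(p) for p in pvalues)
    return exp(-x / 2) * sum((x / 2) ** i / factorial(i) for i in range(k))

# Three moderately small p-values, none individually below 0.05:
ps = [0.09, 0.07, 0.11]
print(fisher_combined_p(ps) < 0.05, min(ps) * len(ps) < 0.05)
```

The example shows the gain the abstract describes: no single p-value is significant, yet the combined evidence is.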

  1. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  2. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
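The single-agent building block behind this framework is Wald's sequential probability ratio test: accumulate log-likelihood ratios and stop at the first threshold crossing. A minimal sketch for a normal mean, on deterministic demo data; the multi-agent coordination and measurement costs of the paper are not modeled:

```python
from math import log

def sprt(samples, mu0=0.0, mu1=1.0, alpha=0.05, beta=0.05, sigma=1.0):
    """Wald's SPRT for H0: mean mu0 vs H1: mean mu1 of a normal stream.
    Stops and decides at the first crossing of the Wald thresholds."""
    upper = log((1 - beta) / alpha)   # accept H1 above this
    lower = log(beta / (1 - alpha))   # accept H0 below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # increment: log N(x; mu1, sigma) - log N(x; mu0, sigma)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

print(sprt([1.0] * 20), sprt([0.0] * 20))   # → ('H1', 6) ('H0', 6)
```

Stopping as soon as the evidence suffices is what creates the time/accuracy tradeoff that the paper's Bayes risk functions price explicitly.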

  3. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  4. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
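The paper's idea, treating a stochastic program's output as a sample and testing it against the known answer, can be sketched with a Monte Carlo estimate of π/4 and a z-test on the hit proportion. The sample size, seed, and the simulated "buggy" proportion are arbitrary choices for illustration:

```python
import random
from math import erf, pi, sqrt

def proportion_p_value(hits, n, p0):
    """Two-sided z-test p-value for H0: the hit probability equals p0."""
    z = (hits / n - p0) / sqrt(p0 * (1 - p0) / n)
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

random.seed(7)
n = 100_000
# Monte Carlo: fraction of random points in the unit quarter circle.
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))

p_good = proportion_p_value(hits, n, pi / 4)          # correct sampler
p_bad = proportion_p_value(int(0.80 * n), n, pi / 4)  # simulated buggy sampler
print(p_bad < 1e-6)   # the test flags the biased estimator
```

A test suite would rerun this check with fresh seeds and a significance level chosen so that false alarms stay rare, which is exactly the calibration question the paper addresses.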

  5. Personal Hypothesis Testing: The Role of Consistency and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; And Others

    1988-01-01

    Studied how individuals test hypotheses about themselves. Examined extent to which Snyder's bias toward confirmation persists when negative or nonconsistent personal hypothesis is tested. Found negativity or positivity did not affect hypothesis testing directly, though hypothesis consistency did. Found cognitive schematic variable (vulnerability…

  6. Null but not void: considerations for hypothesis testing.

    Science.gov (United States)

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  7. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  8. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

This research paper discusses a method of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator, as explained by Takeshi Amemiya [1]. In the present paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using an iterative NLLS estimator based on nonlinear studentized residuals is also proposed. In addition, an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Grene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.

  9. Tests of the Giant Impact Hypothesis

    Science.gov (United States)

    Jones, J. H.

    1998-01-01

The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, which means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  10. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical … ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship … that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: Internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land …

  11. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Science.gov (United States)

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate the new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.

  12. A test of the orthographic recoding hypothesis

    Science.gov (United States)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  13. The venom optimization hypothesis revisited.

    Science.gov (United States)

    Morgenstern, David; King, Glenn F

    2013-03-01

    Animal venoms are complex chemical mixtures that typically contain hundreds of proteins and non-proteinaceous compounds, resulting in a potent weapon for prey immobilization and predator deterrence. However, because venoms are protein-rich, they come with a high metabolic price tag. The metabolic cost of venom is sufficiently high to result in secondary loss of venom whenever its use becomes non-essential to survival of the animal. The high metabolic cost of venom leads to the prediction that venomous animals may have evolved strategies for minimizing venom expenditure. Indeed, various behaviors have been identified that appear consistent with frugality of venom use. This has led to formulation of the "venom optimization hypothesis" (Wigger et al. (2002) Toxicon 40, 749-752), also known as "venom metering", which postulates that venom is metabolically expensive and therefore used frugally through behavioral control. Here, we review the available data concerning economy of venom use by animals with either ancient or more recently evolved venom systems. We conclude that the convergent nature of the evidence in multiple taxa strongly suggests the existence of evolutionary pressures favoring frugal use of venom. However, there remains an unresolved dichotomy between this economy of venom use and the lavish biochemical complexity of venom, which includes a high degree of functional redundancy. We discuss the evidence for biochemical optimization of venom as a means of resolving this conundrum.

  14. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    Science.gov (United States)

    Samia, Diogo S M; Blumstein, Daniel T

    2014-01-01

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD), and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and it controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis.

  16. A shift from significance test to hypothesis test through power analysis in medical research.

    Science.gov (United States)

    Singh, G

    2006-01-01

    Until recently, the medical research literature was dominated by Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis-test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value; the Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Built on the same theory, the two approaches address the same objective and conclude in their own ways. Advances in computing techniques and the availability of statistical software have resulted in increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in light of the power of the test. A significance-test approach that incorporates power analysis contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
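    The power calculation the abstract advocates can be made concrete. A hedged sketch under a normal approximation; the effect size, significance level, and group size below are assumed example values.

```python
# Power of a two-sample, two-sided z-test under a normal approximation.
from scipy.stats import norm

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power for a standardized effect size d."""
    z_crit = norm.ppf(1.0 - alpha / 2.0)
    ncp = d * (n_per_group / 2.0) ** 0.5   # noncentrality of the z statistic
    # Two-sided power; the second term is usually negligible.
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

power = two_sample_power(d=0.5, n_per_group=64)
print(round(power, 3))   # close to the conventional 80% target
```

With a "medium" effect (d = 0.5) and 64 participants per group, the approximation lands near the conventional 80% power target.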

  17. A Hypothesis-Driven Approach to Site Investigation

    Science.gov (United States)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be pretended. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. 

  18. Robust real-time pattern matching using bayesian sequential hypothesis testing.

    Science.gov (United States)

    Pele, Ofir; Werman, Michael

    2008-08-01

    This paper describes a method for robust real-time pattern matching. We first introduce a family of image distance measures, the "Image Hamming Distance Family". Members of this family are robust to occlusion, small geometrical transforms, light changes and non-rigid deformations. We then present a novel Bayesian framework for sequential hypothesis testing on finite populations. Based on this framework, we design an optimal rejection/acceptance sampling algorithm, which quickly determines whether two images are similar with respect to a member of the Image Hamming Distance Family. We also present a fast framework that designs a near-optimal sampling algorithm. Extensive experimental results show that the sequential sampling algorithm's performance is excellent. Implemented on a Pentium 4 3 GHz processor, detection of a 2197-pixel pattern in 640 x 480 pixel frames, where in each frame the pattern rotated and was highly occluded, proceeds at only 0.022 seconds per frame.
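    The flavor of sequential sampling for image similarity can be sketched with a Beta posterior on the pixel-mismatch rate. This is an illustrative stand-in, not the paper's algorithm or its optimal stopping bounds; the images, threshold, and decision boundaries are assumed values.

```python
# Sequential test: are two binary images "similar" (mismatch rate below t)?
# Sample pixels one at a time and stop once the posterior is decisive.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1)
img_a = rng.integers(0, 2, size=(64, 64))
img_b = img_a.copy()
flip = rng.random(img_a.shape) < 0.02          # corrupt ~2% of the pixels
img_b[flip] ^= 1

t = 0.10                                       # similarity threshold on mismatch rate
flat_a, flat_b = img_a.ravel(), img_b.ravel()
order = rng.permutation(flat_a.size)

mismatches, n, decision = 0, 0, None
for idx in order:
    n += 1
    mismatches += int(flat_a[idx] != flat_b[idx])
    if n < 20:                                 # short burn-in before deciding
        continue
    # Posterior P(mismatch rate < t) under a flat Beta(1, 1) prior.
    p_below = beta.cdf(t, 1 + mismatches, 1 + n - mismatches)
    if p_below > 0.99:
        decision = "similar"
        break
    if p_below < 0.01:
        decision = "different"
        break

print(decision, "after sampling", n, "of", flat_a.size, "pixels")
```

Because only a small fraction of pixels differ, the test typically stops after sampling a small fraction of the image, which is the point of sequential testing.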

  19. A large scale test of the gaming-enhancement hypothesis

    Directory of Open Access Journals (Sweden)

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  2. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  3. An omnibus test for the global null hypothesis.

    Science.gov (United States)

    Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja

    2018-01-01

    Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses but in testing whether none of the hypotheses is false. There are several possibilities for testing the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. the Bonferroni or Simes test). However, usually there is no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
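    The contrast the abstract draws can be sketched with the three classical global tests it mentions (the proposed cumulative-sum omnibus test itself lives in the R package omnibus and is not reproduced here). The p-values are assumed example values.

```python
# Three classical tests of the global null for m independent p-values.
import numpy as np
from scipy.stats import chi2

def fisher_combination(pvals):
    """Fisher's combination test: powerful when many nulls are false."""
    stat = -2.0 * np.sum(np.log(pvals))
    return chi2.sf(stat, df=2 * len(pvals))

def bonferroni_global(pvals):
    """Bonferroni: powerful when only one null is false."""
    return min(1.0, len(pvals) * np.min(pvals))

def simes_global(pvals):
    """Simes: valid under independence, never worse than Bonferroni."""
    p_sorted = np.sort(pvals)
    m = len(pvals)
    return np.min(m * p_sorted / np.arange(1, m + 1))

pvals = np.array([0.001, 0.2, 0.4, 0.6, 0.8])   # one small p among five
print(bonferroni_global(pvals), simes_global(pvals), fisher_combination(pvals))
```

With a single small p-value, Bonferroni and Simes give a much smaller global p than Fisher's combination, matching the abstract's point about which regime favors which test.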

  4. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
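    The mediation model itself can be sketched with a simple Monte Carlo on the indirect effect, in the spirit of the estimation approach of Yuan and MacKinnon. This is not the JZS default Bayes factor the paper develops; the simulated data, path coefficients, and diffuse-normal approximation are all assumptions of this sketch.

```python
# Mediation sketch: x -> m -> y, with Monte Carlo interval for the
# indirect effect a*b. Data and coefficients are simulated assumptions.
import numpy as np

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)                       # e.g., classroom instruction
m = 0.5 * x + rng.normal(size=n)             # mediator: knowledge of healthy diet
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome: fruit/vegetable intake

def ols(X, target):
    """OLS with intercept: return coefficients and their covariance."""
    X = np.column_stack([np.ones(target.size), X])
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    s2 = resid @ resid / (target.size - X.shape[1])
    return coef, s2 * np.linalg.inv(X.T @ X)

coef_m, cov_m = ols(x, m)                            # path a: x -> m
coef_y, cov_y = ols(np.column_stack([m, x]), y)      # path b: m -> y given x
a_hat, a_se = coef_m[1], np.sqrt(cov_m[1, 1])
b_hat, b_se = coef_y[1], np.sqrt(cov_y[1, 1])

# Monte Carlo distribution of the indirect effect a*b.
draws = 100_000
ab = rng.normal(a_hat, a_se, draws) * rng.normal(b_hat, b_se, draws)
ci = np.percentile(ab, [2.5, 97.5])
print(f"indirect effect ~ {a_hat * b_hat:.3f}, 95% interval [{ci[0]:.3f}, {ci[1]:.3f}]")
```

An interval excluding zero indicates mediation; the default Bayesian test in BayesMed replaces this informal interval with a Bayes factor.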

  5. Counselor Hypothesis Testing Strategies: The Role of Initial Impressions and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; Chiodo, Anthony L.

    1984-01-01

    Presents two experiments concerning confirmatory bias in the way counselors collect data to test their hypotheses. Counselors were asked either to develop their own clinical hypothesis or were given a hypothesis to test. Confirmatory bias in hypothesis testing was not supported in either experiment. (JAC)

  6. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent from the assumptions that a sufficiently large number of samples is available, and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation and the theory of robust distributed detection is extended to classes of distributions, which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  7. A critique of statistical hypothesis testing in clinical research

    Directory of Open Access Journals (Sweden)

    Somik Raha

    2011-01-01

    Full Text Available Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are that of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability on an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. As a big reason for the prevalence of RCTs in academia is legislation requiring it, the ethics of legislating the use of statistical methods for clinical research is also examined.
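    A Bayesian two-proportion comparison in the spirit of the decision-making alternative the paper describes can be sketched directly. The event counts below are the widely cited Physicians' Health Study aspirin figures, assumed here for illustration, with flat Beta(1, 1) priors; this is a generic sketch, not the paper's worked example.

```python
# Beta-binomial comparison of heart-attack rates in two trial arms.
import numpy as np

rng = np.random.default_rng(42)
# heart attacks / participants (assumed figures from the aspirin trial)
events_aspirin, n_aspirin = 104, 11037
events_placebo, n_placebo = 189, 11034

draws = 200_000
p_aspirin = rng.beta(1 + events_aspirin, 1 + n_aspirin - events_aspirin, draws)
p_placebo = rng.beta(1 + events_placebo, 1 + n_placebo - events_placebo, draws)

prob_benefit = np.mean(p_aspirin < p_placebo)   # P(aspirin rate lower | data)
risk_ratio = np.median(p_aspirin / p_placebo)
print(f"P(aspirin lowers risk) ~ {prob_benefit:.4f}, median RR ~ {risk_ratio:.2f}")
```

Rather than a binary reject/fail-to-reject verdict, the output is a posterior probability and an effect estimate that can feed a decision analysis directly.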

  8. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the
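    The book's "program the test yourself" premise can be shown in miniature. A hedged sketch (in Python rather than SAS or R, with made-up data): a one-sample two-sided t-test written from scratch and checked against SciPy's implementation.

```python
# One-sample two-sided t-test from first principles, verified against scipy.
import numpy as np
from scipy import stats

def t_test_1samp(x, mu0):
    x = np.asarray(x, dtype=float)
    n = x.size
    t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))
    p = 2.0 * stats.t.sf(abs(t), df=n - 1)   # two-sided p-value
    return t, p

x = [5.1, 4.9, 5.6, 5.2, 4.8, 5.4, 5.0, 5.3]   # assumed example data
t_hand, p_hand = t_test_1samp(x, mu0=5.0)
t_ref, p_ref = stats.ttest_1samp(x, popmean=5.0)
assert np.isclose(t_hand, t_ref) and np.isclose(p_hand, p_ref)
```

The hand-rolled statistic and p-value agree with the library routine, which is the sanity check the book recommends before trusting self-programmed tests.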

  9. A hypothesis on improving foreign accents by optimizing variability in vocal learning brain circuits.

    Science.gov (United States)

    Simmonds, Anna J

    2015-01-01

    Rapid vocal motor learning is observed when acquiring a language in early childhood, or learning to speak another language later in life. Accurate pronunciation is one of the hardest things for late learners to master and they are almost always left with a non-native accent. Here, I propose a novel hypothesis that this accent could be improved by optimizing variability in vocal learning brain circuits during learning. Much of the neurobiology of human vocal motor learning has been inferred from studies on songbirds. Jarvis (2004) proposed the hypothesis that as in songbirds there are two pathways in humans: one for learning speech (the striatal vocal learning pathway), and one for production of previously learnt speech (the motor pathway). Learning new motor sequences necessary for accurate non-native pronunciation is challenging and I argue that in late learners of a foreign language the vocal learning pathway becomes inactive prematurely. The motor pathway is engaged once again and learners maintain their original native motor patterns for producing speech, resulting in speaking with a foreign accent. Further, I argue that variability in neural activity within vocal motor circuitry generates vocal variability that supports accurate non-native pronunciation. Recent theoretical and experimental work on motor learning suggests that variability in the motor movement is necessary for the development of expertise. I propose that there is little trial-by-trial variability when using the motor pathway. When using the vocal learning pathway variability gradually increases, reflecting an exploratory phase in which learners try out different ways of pronouncing words, before decreasing and stabilizing once the "best" performance has been identified. The hypothesis proposed here could be tested using behavioral interventions that optimize variability and engage the vocal learning pathway for longer, with the prediction that this would allow learners to develop new motor

  10. Hypothesis test for synchronization: twin surrogates revisited.

    Science.gov (United States)

    Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf

    2009-03-01

    The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.

  11. Explorations in Statistics: Hypothesis Tests and P Values

    Science.gov (United States)

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…
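    The two concepts the installment explores can be made concrete with a permutation test, where the null distribution of the test statistic is built explicitly. The group data below are assumed example values.

```python
# A test statistic and its P value, made explicit via permutation.
import numpy as np

rng = np.random.default_rng(7)
group_a = np.array([12.1, 11.4, 13.0, 12.7, 11.9, 12.5])
group_b = np.array([11.2, 10.9, 11.5, 11.8, 11.0, 11.3])
observed = group_a.mean() - group_b.mean()     # the test statistic

# Null distribution: shuffle group labels many times.
pooled = np.concatenate([group_a, group_b])
n_a = group_a.size
null_stats = np.empty(10_000)
for i in range(null_stats.size):
    perm = rng.permutation(pooled)
    null_stats[i] = perm[:n_a].mean() - perm[n_a:].mean()

# P value: fraction of shufflings at least as extreme as what we observed.
p_value = np.mean(np.abs(null_stats) >= abs(observed))
print(f"observed diff = {observed:.2f}, permutation p = {p_value:.4f}")
```

The P value here is literally the proportion of label shufflings whose statistic is at least as extreme as the observed one, which is the idea the analytic formulas compress.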

  12. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
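    The reduced form of the test, flagging a potentially large prediction error whenever the empirical variance of the predictive distribution is high, can be sketched with a toy predictor. The trace, ensemble predictor, and threshold below are invented illustrations, not the paper's kernel-density method.

```python
# Flag high-uncertainty time points via the variance of a predictive ensemble.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 30, 0.2)
trace = np.sin(2 * np.pi * t / 4.0)               # idealized respiratory motion
trace[100:110] += rng.normal(scale=0.8, size=10)  # an irregular breathing episode

def predictive_samples(history, n_samples=50):
    """Toy ensemble predictor: recent increments extrapolated with noise."""
    increments = np.diff(history[-10:])
    return history[-1] + rng.choice(increments, n_samples) + rng.normal(
        scale=np.std(increments), size=n_samples)

threshold = 0.05   # variance cut-off standing in for the reduced LRT
flags = []
for i in range(20, trace.size):
    samples = predictive_samples(trace[:i])
    flags.append(np.var(samples) > threshold)      # flag: prediction unreliable

print(f"{sum(flags)} of {len(flags)} time points flagged as high-uncertainty")
```

Smooth stretches of the trace yield tight ensembles and no flag; the irregular episode inflates the predictive variance and triggers the flag, mirroring the intuition the abstract describes.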

  13. Some consequences of using the Horsfall-Barratt scale for hypothesis testing

    Science.gov (United States)

    Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...

  14. Different meaning of the p-value in exploratory and confirmatory hypothesis testing

    DEFF Research Database (Denmark)

    Gerke, Oke; Høilund-Carlsen, Poul Flemming; Vach, Werner

    2011-01-01

    The outcome of clinical studies is often reduced to the statistical significance of results by indicating a p-value below the 5% significance level. Hypothesis testing and, through that, the p-value is commonly used, but their meaning is frequently misinterpreted in clinical research. The concept...... of hypothesis testing is explained and some pitfalls including those of multiple testing are given. The conceptual difference between exploratory and confirmatory hypothesis testing is discussed, and a better use of p-values, which includes presenting p-values with two or three decimals, is suggested....

  15. The frequentist implications of optional stopping on Bayesian hypothesis tests.

    Science.gov (United States)

    Sanborn, Adam N; Hills, Thomas T

    2014-04-01

    Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite (taking multiple parameter values), such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
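
    The NHST side of this argument, that data-dependent stopping inflates the Type I error rate, is easy to demonstrate by simulation. The sketch below is illustrative only: the z-test, the peek-after-every-observation schedule, and all parameters are choices made for this example, not taken from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def sequential_z_test(max_n, alpha=0.05):
    """Simulate the practice the abstract describes: the null is true
    (data are N(0,1)), but the experimenter peeks after every new
    observation and stops as soon as p < alpha. Returns True if the
    null was (falsely) rejected at any look."""
    total = 0.0
    for n in range(1, max_n + 1):
        total += rng.normal()
        if n < 2:
            continue
        z = (total / n) * sqrt(n)                        # sigma known to be 1
        p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
        if p < alpha:
            return True
    return False

reps = 2000
false_positive_rate = sum(sequential_z_test(50) for _ in range(reps)) / reps
print(false_positive_rate)   # substantially above the nominal 0.05
```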

  16. The old age security hypothesis and optimal population growth.

    Science.gov (United States)

    Bental, B

    1989-03-01

    The application of the Samuelson-Diamond overlapping generations framework to the old age security hypothesis indicates that government intervention schemes can influence the relationship between population growth and capital accumulation. The most direct means of optimizing population growth is through taxes or subsidies that relate to the intergenerational transfer of wealth. A pay-as-you-go social security scheme, in which payment is predicated on the number of children the receiver has and is financed by taxes levied on the working population, emerges as the most likely intervention to produce the optimal steady state equilibrium. This system is able to correct any distortions the private sector may build into it. In contrast, a child support system, in which the government subsidizes or taxes workers according to their family size, can guarantee the optimal capital:labor ratio but not the optimal population growth rate. Thus, if the government seeks to decrease the population growth rate, the appropriate intervention is to levy a lump-sum social-security tax on workers and transfer the revenues to the old; the direction should be reversed if the goal is to increase population growth. Another alternative, a lump sum social security system, can guarantee optimal population growth but not a desirable capital:labor ratio. Finally, the introduction of money as a valued commodity into an economy with a high capital:labor ratio will also serve to decrease the population growth rate and solve the intergenerational transfer problem through the private sector without any need for government intervention.

  17. Hypothesis Testing as an Act of Rationality

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic, namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  18. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  19. P value and the theory of hypothesis testing: an explanation for new researchers.

    Science.gov (United States)

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
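
    The distinction the article draws can be made concrete with a small example: compute Fisher's p-value as a measure of evidence, then apply the Neyman-Pearson decision rule at a pre-chosen alpha. This is a hedged sketch; the data values are invented, and `scipy.stats.ttest_1samp` is simply one convenient implementation.

```python
# Invented data: change in blood pressure (mm Hg) for eight patients.
# H0: mean change = 0.
from scipy import stats

bp_change = [-4.2, -1.1, -3.5, -0.8, -2.9, -3.1, -1.7, -2.4]
t_stat, p_value = stats.ttest_1samp(bp_change, popmean=0.0)

# Fisher: the p-value measures the strength of evidence against H0.
print(f"p = {p_value:.4f}")

# Neyman-Pearson: fix the Type I error rate in advance, then decide.
alpha = 0.05
print("reject H0" if p_value < alpha else "fail to reject H0")
```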

  20. A test of the reward-value hypothesis.

    Science.gov (United States)

    Smith, Alexandra E; Dalecki, Stefan J; Crystal, Jonathon D

    2017-03-01

    Rats retain source memory (memory for the origin of information) over a retention interval of at least 1 week, whereas their spatial working memory (radial maze locations) decays within approximately 1 day. We have argued that different forgetting functions dissociate memory systems. However, the two tasks, in our previous work, used different reward values. The source memory task used multiple pellets of a preferred food flavor (chocolate), whereas the spatial working memory task provided access to a single pellet of standard chow-flavored food at each location. Thus, according to the reward-value hypothesis, enhanced performance in the source memory task stems from enhanced encoding/memory of a preferred reward. We tested the reward-value hypothesis by using a standard 8-arm radial maze task to compare spatial working memory accuracy of rats rewarded with either multiple chocolate or chow pellets at each location using a between-subjects design. The reward-value hypothesis predicts superior accuracy for high-valued rewards. We documented equivalent spatial memory accuracy for high- and low-value rewards. Importantly, a 24-h retention interval produced equivalent spatial working memory accuracy for both flavors. These data are inconsistent with the reward-value hypothesis and suggest that reward value does not explain our earlier findings that source memory survives unusually long retention intervals.

  1. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offers...... an interpretation and explanation of biases which entails that the stated preference methods need not be completely written off. In this paper we conduct a test for the validity and relevance of the DPH interpretation of biases. In a choice experiment concerning preferences for protection of Danish nature areas...... as respondents evaluate more and more choice sets. This finding supports the Discovered Preference Hypothesis interpretation and explanation of starting point bias....

  2. Plant Disease Severity Assessment-How Rater Bias, Assessment Method, and Experimental Design Affect Hypothesis Testing and Resource Use Efficiency.

    Science.gov (United States)

    Chiang, Kuo-Szu; Bock, Clive H; Lee, I-Hsuan; El Jarroudi, Moussa; Delfosse, Philippe

    2016-12-01

    The effect of rater bias and assessment method on hypothesis testing was studied for representative experimental designs for plant disease assessment using balanced and unbalanced data sets. Data sets with the same number of replicate estimates for each of two treatments are termed "balanced" and those with unequal numbers of replicate estimates are termed "unbalanced". The three assessment methods considered were nearest percent estimates (NPEs), an amended 10% incremental scale, and the Horsfall-Barratt (H-B) scale. Estimates of severity of Septoria leaf blotch on leaves of winter wheat were used to develop distributions for a simulation model. The experimental designs are presented here in the context of simulation experiments which consider the optimal design for the number of specimens (individual units sampled) and the number of replicate estimates per specimen for a fixed total number of observations (total sample size for the treatments being compared). The criterion used to gauge each method was the power of the hypothesis test. As expected, at a given fixed number of observations, the balanced experimental designs invariably resulted in a higher power compared with the unbalanced designs at different disease severity means, mean differences, and variances. Based on these results, with unbiased estimates using NPE, the recommended number of replicate estimates taken per specimen is 2 (from a sample of specimens of at least 30), because this conserves resources. Furthermore, for biased estimates, an apparent difference in the power of the hypothesis test was observed between assessment methods and between experimental designs. Results indicated that, regardless of experimental design or rater bias, an amended 10% incremental scale has slightly less power compared with NPEs, and that the H-B scale is more likely than the others to cause a type II error. These results suggest that choice of assessment method, optimizing sample number and number of replicate
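
    The kind of power comparison described above can be sketched with a short Monte Carlo simulation. The assumptions here are ours, not the paper's: normal errors, a two-sample t-test, and an arbitrary effect size; the point is only that a balanced split of a fixed total sample size yields higher power.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulated_power(n1, n2, delta=1.0, sigma=1.0, reps=2000, alpha=0.05):
    """Monte Carlo power of a two-sample t-test for one split of a fixed
    total sample size between two treatments."""
    hits = 0
    for _ in range(reps):
        a = rng.normal(0.0, sigma, n1)
        b = rng.normal(delta, sigma, n2)
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / reps

# Same fixed total of 30 observations, split two ways:
p_balanced = simulated_power(15, 15)
p_unbalanced = simulated_power(5, 25)
print("balanced (15+15):  ", p_balanced)
print("unbalanced (5+25): ", p_unbalanced)   # lower power
```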

  3. Gaussian Hypothesis Testing and Quantum Illumination.

    Science.gov (United States)

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.

  4. Mechanisms of eyewitness suggestibility: tests of the explanatory role hypothesis.

    Science.gov (United States)

    Rindal, Eric J; Chrobak, Quin M; Zaragoza, Maria S; Weihing, Caitlin A

    2017-10-01

    In a recent paper, Chrobak and Zaragoza (Journal of Experimental Psychology: General, 142(3), 827-844, 2013) proposed the explanatory role hypothesis, which posits that the likelihood of developing false memories for post-event suggestions is a function of the explanatory function the suggestion serves. In support of this hypothesis, they provided evidence that participant-witnesses were especially likely to develop false memories for their forced fabrications when their fabrications helped to explain outcomes they had witnessed. In three experiments, we test the generality of the explanatory role hypothesis as a mechanism of eyewitness suggestibility by assessing whether this hypothesis can predict suggestibility errors in (a) situations where the post-event suggestions are provided by the experimenter (as opposed to fabricated by the participant), and (b) across a variety of memory measures and measures of recollective experience. In support of the explanatory role hypothesis, participants were more likely to subsequently freely report (E1) and recollect the suggestions as part of the witnessed event (E2, source test) when the post-event suggestion helped to provide a causal explanation for a witnessed outcome than when it did not serve this explanatory role. Participants were also less likely to recollect the suggestions as part of the witnessed event (on measures of subjective experience) when their explanatory strength had been reduced by the presence of an alternative explanation that could explain the same outcome (E3, source test + warning). Collectively, the results provide strong evidence that the search for explanatory coherence influences people's tendency to misremember witnessing events that were only suggested to them.

  5. A default Bayesian hypothesis test for correlations and partial correlations

    NARCIS (Netherlands)

    Wetzels, R.; Wagenmakers, E.J.

    2012-01-01

    We propose a default Bayesian hypothesis test for the presence of a correlation or a partial correlation. The test is a direct application of Bayesian techniques for variable selection in regression models. The test is easy to apply and yields practical advantages that the standard frequentist tests

  6. A test of the reward-contrast hypothesis.

    Science.gov (United States)

    Dalecki, Stefan J; Panoz-Brown, Danielle E; Crystal, Jonathon D

    2017-12-01

    Source memory, a facet of episodic memory, is the memory of the origin of information. Whereas source memory in rats is sustained for at least a week, spatial memory degrades after approximately a day. Different forgetting functions may suggest that two memory systems (source memory and spatial memory) are dissociated. However, in previous work, the two tasks used baiting conditions consisting of chocolate and chow flavors; notably, the source memory task used the relatively better flavor. Thus, according to the reward-contrast hypothesis, when chocolate and chow were presented within the same context (i.e., within a single radial maze trial), the chocolate location was more memorable than the chow location because of contrast. We tested the reward-contrast hypothesis using baiting configurations designed to produce reward-contrast. The reward-contrast hypothesis predicts that under these conditions, spatial memory will survive a 24-h retention interval. We documented elimination of spatial memory performance after a 24-h retention interval using a reward-contrast baiting pattern. These data suggest that reward contrast does not explain our earlier findings that source memory survives unusually long retention intervals. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

    Well established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with the better scaling slope ({beta} = 1, Selby et al.), while the improved standard error 'fails to reject' H0.
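
    The two classical combination rules the abstract names can be sketched directly. Under H0, Fisher's statistic -2*sum(ln p_i) follows a chi-square distribution with 2k degrees of freedom, and Tippett's test is based on the smallest of the k p-values, whose null distribution is Beta(1, k). The per-phenomenology p-values below are hypothetical, not values from the study.

```python
from math import log
from scipy.stats import chi2

def fisher_combined(pvals):
    """Fisher's method: -2 * sum(ln p_i) ~ chi-square(2k) under H0."""
    stat = -2 * sum(log(p) for p in pvals)
    return chi2.sf(stat, df=2 * len(pvals))

def tippett_combined(pvals):
    """Tippett's method: combined p from the minimum of k p-values."""
    return 1 - (1 - min(pvals)) ** len(pvals)

# Hypothetical single-phenomenology screening p-values
# (e.g. one from an Ms:mb test, one from a depth test):
pvals = [0.08, 0.2]
p_fisher = fisher_combined(pvals)
p_tippett = tippett_combined(pvals)
print(f"Fisher:  {p_fisher:.4f}")
print(f"Tippett: {p_tippett:.4f}")
```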

  8. Chi-square test and its application in hypothesis testing

    Directory of Open Access Journals (Sweden)

    Rakesh Rana

    2015-01-01

    In medical research, there are studies which often collect data on categorical variables that can be summarized as a series of counts. These counts are commonly arranged in a tabular format known as a contingency table. The chi-square test statistic can be used to evaluate whether there is an association between the rows and columns in a contingency table. More specifically, this statistic can be used to determine whether there is any difference between the study groups in the proportions of the risk factor of interest. The chi-square test and the logic of hypothesis testing were developed by Karl Pearson. This article describes in detail what a chi-square test is, the type of data it is used on, the assumptions associated with its application, how to calculate it manually, and how to use an online calculator to obtain the chi-square statistic and its associated P-value.
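
    The kind of contingency-table analysis the article describes can be illustrated in a few lines. The counts below are invented for the example; `scipy.stats.chi2_contingency` is one convenient implementation (it applies Yates' continuity correction by default for 2x2 tables).

```python
# Invented 2x2 contingency table: rows are exposure groups,
# columns are disease status counts.
from scipy.stats import chi2_contingency

table = [[30, 10],   # exposed:   30 diseased, 10 healthy
         [20, 40]]   # unexposed: 20 diseased, 40 healthy

chi2_stat, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2_stat:.2f}, dof = {dof}, p = {p:.4f}")
# A small p suggests an association between exposure and disease.
```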

  9. Trends in hypothesis testing and related variables in nursing research: a retrospective exploratory study.

    Science.gov (United States)

    Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar

    2011-01-01

    To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from Nursing Research, the research journal with the highest circulation during the period under study. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample. Five years from each of the 1980s and 1990s were randomly selected from the journal Nursing Research. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.

  10. Hypothesis testing in students: Sequences, stages, and instructional strategies

    Science.gov (United States)

    Moshman, David; Thompson, Pat A.

    Six sequences in the development of hypothesis-testing conceptions are proposed, involving (a) interpretation of the hypothesis; (b) the distinction between using theories and testing theories; (c) the consideration of multiple possibilities; (d) the relation of theory and data; (e) the nature of verification and falsification; and (f) the relation of truth and falsity. An alternative account is then provided involving three global stages: concrete operations, formal operations, and a postformal metaconstructive stage. Relative advantages and difficulties of the stage and sequence conceptualizations are discussed. Finally, three families of teaching strategy are distinguished, which emphasize, respectively: (a) social transmission of knowledge; (b) carefully sequenced empirical experience by the student; and (c) self-regulated cognitive activity of the student. It is argued on the basis of Piaget's theory that the last of these plays a crucial role in the construction of such logical reasoning strategies as those involved in testing hypotheses.

  11. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus; Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Hernández-Leo, Davinia; Stefanov, Krassen; Lemmers, Ruud; Koper, Rob

    2008-01-01

    Glahn, C., Specht, M., Schoonenboom, J., Sligte, H., Moghnieh, A., Hernández-Leo, D. Stefanov, K., Lemmers, R., & Koper, R. (2008). Cross-system log file analysis for hypothesis testing. In H. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for

  12. A "Projective" Test of the Golden Section Hypothesis.

    Science.gov (United States)

    Lee, Chris; Adams-Webber, Jack

    1987-01-01

    In a projective test of the golden section hypothesis, 24 high school students rated themselves and 10 comic strip characters on basis of 12 bipolar constructs. Overall proportion of cartoon figures which subjects assigned to positive poles of constructs was very close to golden section. (Author/NB)

  13. Animal Models for Testing the DOHaD Hypothesis

    Science.gov (United States)

    Since the seminal work in human populations by David Barker and colleagues, several species of animals have been used in the laboratory to test the Developmental Origins of Health and Disease (DOHaD) hypothesis. Rats, mice, guinea pigs, sheep, pigs and non-human primates have bee...

  14. Bayesian Approaches to Imputation, Hypothesis Testing, and Parameter Estimation

    Science.gov (United States)

    Ross, Steven J.; Mackey, Beth

    2015-01-01

    This chapter introduces three applications of Bayesian inference to common and novel issues in second language research. After a review of the critiques of conventional hypothesis testing, our focus centers on ways Bayesian inference can be used for dealing with missing data, for testing theory-driven substantive hypotheses without a default null…

  15. Tests of the lunar hypothesis

    Science.gov (United States)

    Taylor, S. R.

    1984-01-01

    The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.

  16. A Review of Multiple Hypothesis Testing in Otolaryngology Literature

    Science.gov (United States)

    Kirkham, Erin M.; Weaver, Edward M.

    2018-01-01

    Objective: Multiple hypothesis testing (or multiple testing) refers to testing more than one hypothesis within a single analysis, and can inflate the Type I error rate (false positives) within a study. The aim of this review was to quantify multiple testing in recent large clinical studies in the otolaryngology literature and to discuss strategies to address this potential problem. Data sources: Original clinical research articles with >100 subjects published in 2012 in the four general otolaryngology journals with the highest Journal Citation Reports 5-year impact factors. Review methods: Articles were reviewed to determine whether the authors tested greater than five hypotheses in at least one family of inferences. For the articles meeting this criterion for multiple testing, Type I error rates were calculated and statistical correction was applied to the reported results. Results: Of the 195 original clinical research articles reviewed, 72% met the criterion for multiple testing. Within these studies, there was a mean 41% chance of a Type I error and, on average, 18% of significant results were likely to be false positives. After the Bonferroni correction was applied, only 57% of significant results reported within the articles remained significant. Conclusion: Multiple testing is common in recent large clinical studies in otolaryngology and deserves closer attention from researchers, reviewers and editors. Strategies for adjusting for multiple testing are discussed. PMID:25111574
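
    The arithmetic behind the review's Type I error figures, and the Bonferroni correction it applies, is simple to show. The p-values below are invented for illustration.

```python
# With m independent tests each at alpha = 0.05, the family-wise chance
# of at least one false positive is 1 - 0.95**m; the Bonferroni
# correction compensates by testing each hypothesis at alpha / m.
m = 10
alpha = 0.05

family_wise_error = 1 - (1 - alpha) ** m
print(f"chance of >= 1 false positive across {m} tests: {family_wise_error:.2f}")

pvals = [0.001, 0.004, 0.012, 0.030, 0.200]   # invented results
bonferroni_alpha = alpha / m                  # 0.005 per test
significant = [p for p in pvals if p < bonferroni_alpha]
print("survive Bonferroni correction:", significant)
```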

  17. Testing the Cross-Racial Generality of Spearman's Hypothesis in Two Samples

    Science.gov (United States)

    Hartmann, Peter; Kruuse, Nanna Hye Sun; Nyborg, Helmuth

    2007-01-01

    Spearman's hypothesis states that racial differences in IQ between Blacks (B) and Whites (W) are due primarily to differences in the "g" factor. This hypothesis is often confirmed, but it is less certain whether it generalizes to other races. We therefore tested its cross-racial generality by comparing American subjects of European…

  18. Hypothesis Testing Using the Films of the Three Stooges

    Science.gov (United States)

    Gardner, Robert; Davidson, Robert

    2010-01-01

    The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.

  19. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    Science.gov (United States)

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…

  20. A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.

    Science.gov (United States)

    Liu, Tung; Stone, Courtenay C.

    1999-01-01

    Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…

  1. SAR-based change detection using hypothesis testing and Markov random field modelling

    Science.gov (United States)

    Cao, W.; Martinis, S.

    2015-04-01

    The objective of this study is to automatically detect changed areas caused by natural disasters from bi-temporal co-registered and calibrated TerraSAR-X data. The technique in this paper consists of two steps: Firstly, an automatic coarse detection step is applied based on a statistical hypothesis test for initializing the classification. The original analytical formula as proposed in the constant false alarm rate (CFAR) edge detector is reviewed and rewritten in a compact form of the incomplete beta function, which is a built-in routine in commercial scientific software such as MATLAB and IDL. Secondly, a post-classification step is introduced to optimize the noisy classification result in the previous step. Generally, an optimization problem can be formulated as a Markov random field (MRF) on which the quality of a classification is measured by an energy function. The optimal classification based on the MRF is related to the lowest energy value. Previous studies provide methods for the optimization problem using MRFs, such as the iterated conditional modes (ICM) algorithm. Recently, a novel algorithm was presented based on graph-cut theory. This method transforms an MRF to an equivalent graph and solves the optimization problem by a max-flow/min-cut algorithm on the graph. In this study this graph-cut algorithm is applied iteratively to improve the coarse classification. At each iteration the parameters of the energy function for the current classification are set by the logarithmic probability density function (PDF). The relevant parameters are estimated by the method of logarithmic cumulants (MoLC). Experiments are performed using two flood events in Germany and Australia in 2011 and a forest fire on La Palma in 2009 using pre- and post-event TerraSAR-X data. The results show convincing coarse classifications and considerable improvement by the graph-cut post-classification step.
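
    A toy version of MRF-based post-classification can be sketched with the ICM baseline the abstract mentions (not the study's graph-cut method). Each pixel is flipped to the label minimizing a simple Potts-style energy: a unary disagreement with the coarse map plus a beta-weighted disagreement with its neighbours. The unary term, beta, grid size, and noise level are all illustrative choices, not the paper's.

```python
import numpy as np

def icm_smooth(coarse, beta=1.5, iters=5):
    """Smooth a noisy binary change map with iterated conditional modes:
    each pixel takes the label minimizing unary cost (disagreement with
    the coarse map) + beta * number of disagreeing 4-neighbours."""
    labels = coarse.copy()
    h, w = labels.shape
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                best, best_e = labels[i, j], np.inf
                for cand in (0, 1):
                    e = float(cand != coarse[i, j])          # unary term
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            e += beta * (cand != labels[ni, nj])
                    if e < best_e:
                        best, best_e = cand, e
                labels[i, j] = best
    return labels

rng = np.random.default_rng(2)
clean = np.zeros((20, 20), dtype=int)
clean[5:15, 5:15] = 1                                # true changed region
noisy = np.where(rng.random((20, 20)) < 0.1, 1 - clean, clean)  # 10% flips
smoothed = icm_smooth(noisy)
print("noisy errors:   ", int((noisy != clean).sum()))
print("smoothed errors:", int((smoothed != clean).sum()))  # fewer errors
```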

  2. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    Science.gov (United States)

    Catalano, Ralph A

    2003-09-01

    Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  3. A checklist to facilitate objective hypothesis testing in social psychology research.

    Science.gov (United States)

    Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J

    2015-01-01

    Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.

  4. Feasibility study using hypothesis testing to demonstrate containment of radionuclides within waste packages

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1986-04-01

    The purpose of this report is to apply methods of statistical hypothesis testing to demonstrate the performance of containers of radioactive waste. The approach involves modeling the failure times of waste containers using Weibull distributions, making strong assumptions about the parameters. A specific objective is to apply methods of statistical hypothesis testing to determine the number of container tests that must be performed in order to control the probability of arriving at the wrong conclusions. An algorithm to determine the required number of containers to be tested with the acceptable number of failures is derived as a function of the distribution parameters, stated probabilities, and the desired waste containment life. Using a set of reference values for the input parameters, sample sizes of containers to be tested are calculated for demonstration purposes. These sample sizes are found to be excessively large, indicating that this hypothesis-testing framework does not provide a feasible approach for demonstrating satisfactory performance of waste packages for exceptionally long time periods
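    To convey the flavor of such a calculation, here is a sketch combining a Weibull survival model with the classic zero-failure (success-run) sample-size formula. The parameter values are hypothetical, not the report's reference values, but they reproduce the report's qualitative finding that the required sample sizes become very large for long containment lives.

```python
import math

def weibull_reliability(t, eta, beta):
    """Weibull survival probability R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def zero_failure_sample_size(reliability, confidence):
    """Number of containers that must survive a life test with zero
    failures to demonstrate `reliability` at the given confidence
    (classic success-run / binomial hypothesis-testing formula)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Hypothetical demonstration: 300-year containment at 95% confidence
# under a Weibull(eta=10000, beta=1.5) failure-time model.
R = weibull_reliability(300.0, eta=10_000.0, beta=1.5)
n = zero_failure_sample_size(R, confidence=0.95)
```

    Even with an optimistic failure model, `n` runs to hundreds of containers, which mirrors the report's conclusion that the sample sizes are excessively large.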

  5. Consumer health information seeking as hypothesis testing.

    Science.gov (United States)

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched the MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  6. Concerns regarding a call for pluralism of information theory and hypothesis testing

    Science.gov (United States)

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
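    A minimal sketch of the multimodel inference advocated here: Akaike weights turn AIC differences into a direct measure of relative evidence across simultaneously compared models. The AIC values below are hypothetical.

```python
import math

def akaike_weights(aics):
    """Akaike weights: the relative likelihood of each candidate model,
    normalized so the weights sum to 1. A direct measure of evidence
    for simultaneously compared hypotheses."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three competing ecological models:
weights = akaike_weights([100.0, 102.0, 110.0])
```

    Unlike a p-value, the weights quantify evidence for every model at once; a model 10 AIC units behind the best receives essentially no support.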

  7. Tax Evasion, Information Reporting, and the Regressive Bias Hypothesis

    DEFF Research Database (Denmark)

    Boserup, Simon Halphen; Pinje, Jori Veng

    A robust prediction from the tax evasion literature is that optimal auditing induces a regressive bias in effective tax rates compared to statutory rates. If correct, this will have important distributional consequences. Nevertheless, the regressive bias hypothesis has never been tested empirically...

  8. The efficient market hypothesis: problems with interpretations of empirical tests

    Directory of Open Access Journals (Sweden)

    Denis Alajbeg

    2012-03-01

    Full Text Available Despite many “refutations” in empirical tests, the efficient market hypothesis (EMH remains the central concept of financial economics. The EMH’s resistance to the results of empirical testing emerges from the fact that the EMH is not a falsifiable theory. Its axiomatic definition shows how asset prices would behave under assumed conditions. Testing for this price behavior does not make much sense as the conditions in the financial markets are much more complex than the simplified conditions of perfect competition, zero transaction costs and free information used in the formulation of the EMH. Some recent developments within the tradition of the adaptive market hypothesis are promising regarding development of a falsifiable theory of price formation in financial markets, but are far from giving assurance that we are approaching a new formulation. The most that can be done in the meantime is to be very cautious while interpreting the empirical evidence that is presented as “testing” the EMH.

  9. The Bio-Inspired Optimization of Trading Strategies and Its Impact on the Efficient Market Hypothesis and Sustainable Development Strategies

    Directory of Open Access Journals (Sweden)

    Rafał Dreżewski

    2018-05-01

    Full Text Available In this paper, an evolutionary algorithm for the optimization of Forex market trading strategies is proposed. An introduction to issues related to financial markets and evolutionary algorithms precedes the main part of the paper, in which the proposed trading system is presented. The system uses the evolutionary algorithm to optimize a parameterized greedy strategy, which is then used as an investment strategy on the Forex market. The proposed system includes a model of the Forex market with all elements necessary for simulating realistic trading processes. The evolutionary algorithm contains several novel mechanisms introduced to optimize the greedy strategy: mechanisms for maintaining population diversity, a mechanism for protecting the best individuals in the population, mechanisms preventing excessive population growth, mechanisms for re-initializing the population after the time window moves, and a mechanism for choosing the best strategies used for trading. The experiments, conducted on real-world Forex market data, tested the quality of the results obtained by the proposed algorithm against the buy-and-hold strategy; through this comparison, we attempted to verify the validity of the efficient market hypothesis. The credibility of the hypothesis would have more general implications for many different areas of our lives, including future sustainable development policies.
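    A toy sketch of the evolutionary loop described above, reduced to two of the listed mechanisms: survivors are carried over unchanged (protecting the best individuals) while Gaussian mutation of their copies maintains diversity, and the population size stays fixed. The objective function stands in for the parameterized greedy strategy and is purely illustrative.

```python
import random

random.seed(1)

def evolve(fitness, n_params, pop_size=30, generations=60, sigma=0.2):
    """Minimal (mu + lambda)-style loop: the better half of the
    population survives unchanged (elitism), each survivor produces one
    Gaussian-mutated child (diversity), and the population never grows."""
    pop = [[random.uniform(-1.0, 1.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = [[g + random.gauss(0.0, sigma) for g in p]
                    for p in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy "strategy" whose optimum is known to sit at (0.5, -0.3):
fit = lambda p: -((p[0] - 0.5) ** 2 + (p[1] + 0.3) ** 2)
best = evolve(fit, n_params=2)
```

    The real system optimizes trading-strategy parameters against a market simulator rather than a closed-form objective, but the selection/mutation skeleton is the same.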

  10. Tests of the planetary hypothesis for PTFO 8-8695b

    DEFF Research Database (Denmark)

    Yu, Liang; Winn, Joshua N.; Gillon, Michaël

    2015-01-01

    The T Tauri star PTFO 8-8695 exhibits periodic fading events that have been interpreted as the transits of a giant planet on a precessing orbit. Here we present three tests of the planet hypothesis. First, we sought evidence for the secular changes in light-curve morphology that are predicted...... planetary orbit. Our spectroscopy also revealed strong, time-variable, high-velocity H{\\alpha} and Ca H & K emission features. All these observations cast doubt on the planetary hypothesis, and suggest instead that the fading events represent starspots, eclipses by circumstellar dust, or occultations...

  11. Risk-Based, Hypothesis-Driven Framework for Hydrological Field Campaigns with Case Studies

    Science.gov (United States)

    Harken, B.; Rubin, Y.

    2014-12-01

    There are several stages in any hydrological modeling campaign, including: formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration or plume travel time. These predictions often have significant bearing on a decision that must be made. Examples include: how to allocate limited remediation resources between contaminated groundwater sites or where to place a waste repository site. Answering such questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in EPM predictions stems from uncertainty in model parameters, which can be reduced by measurements taken in field campaigns. The costly nature of field measurements motivates a rational basis for determining a measurement strategy that is optimal with respect to the uncertainty in the EPM prediction. The tool of hypothesis testing allows this uncertainty to be quantified by computing the significance of the test resulting from a proposed field campaign. The significance of the test gives a rational basis for determining the optimality of a proposed field campaign. This hypothesis testing framework is demonstrated and discussed using various synthetic case studies. This study involves contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a specified location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical amount of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. The optimality of different field campaigns is assessed by computing the significance of the test resulting from each one
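    The logic of ranking field campaigns can be sketched with a toy Monte Carlo proxy: simulate the uncertain parameter (here a hypothetical log-normal hydraulic conductivity), propagate it through a Darcy-type travel-time model, and compare the resulting arrival-time probabilities before and after a campaign narrows the uncertainty. All names and numbers are illustrative, not from the study.

```python
import random

random.seed(0)

def late_arrival_probability(k_samples, distance, gradient, porosity, t_crit):
    """Fraction of conductivity samples under which the plume arrives
    after t_crit (toy Darcy travel-time model, hypothetical setup)."""
    late = 0
    for k in k_samples:
        v = k * gradient / porosity   # seepage velocity [m/s]
        t = distance / v              # advective travel time [s]
        late += t > t_crit
    return late / len(k_samples)

# Prior vs. post-campaign uncertainty in log10-conductivity:
prior = [10 ** random.gauss(-4.0, 0.8) for _ in range(5000)]
posterior = [10 ** random.gauss(-4.0, 0.3) for _ in range(5000)]
p_prior = late_arrival_probability(prior, 100.0, 0.01, 0.3, t_crit=3.15e8)
p_post = late_arrival_probability(posterior, 100.0, 0.01, 0.3, t_crit=3.15e8)
```

    The tighter posterior sharpens the probability statement about the EPM, which is what makes one proposed campaign preferable to another under this framework.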

  12. The limits to pride: A test of the pro-anorexia hypothesis.

    Science.gov (United States)

    Cornelius, Talea; Blanton, Hart

    2016-01-01

    Many social psychological models propose that positive self-conceptions promote self-esteem. An extreme version of this hypothesis is advanced in "pro-anorexia" communities: identifying with anorexia, in conjunction with disordered eating, can lead to higher self-esteem. The current study empirically tested this hypothesis. Results challenge the pro-anorexia hypothesis. Although those with higher levels of pro-anorexia identification trended towards higher self-esteem with increased disordered eating, this did not overcome the strong negative main effect of pro-anorexia identification. These data suggest a more effective strategy for promoting self-esteem is to encourage rejection of disordered eating and an anorexic identity.

  13. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    Science.gov (United States)

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our hypothesis testing method to Mini-Mental State Examination (MMSE) scores from the Neuropsychological Database Initiative, using a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur in MMSE scores among AD patients. Our finding supports the clinical belief that a change point exists during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
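    A sketch of the bootstrap machinery described above (straight-line null versus a bilinear alternative, with the change point profiled over a grid of candidates). This is a simplified fixed-effects version, not the paper's mixed model, and the data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def rss_line(t, y):
    """Residual sum of squares of a straight-line fit (H0: constant decline)."""
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r), beta

def rss_bilinear(t, y, taus):
    """Best RSS of a bilinear model, profiling over candidate change points."""
    best = np.inf
    for tau in taus:
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        best = min(best, float(r @ r))
    return best

def changepoint_pvalue(t, y, taus, n_boot=199):
    """Parametric-bootstrap p-value: the null distribution of the RSS-drop
    statistic must be simulated because the change point is undefined
    (not identifiable) under H0."""
    rss0, beta0 = rss_line(t, y)
    stat = rss0 - rss_bilinear(t, y, taus)
    sigma = np.sqrt(rss0 / (len(t) - 2))
    exceed = 0
    for _ in range(n_boot):
        yb = beta0[0] + beta0[1] * t + rng.normal(0.0, sigma, size=len(t))
        rb0, _ = rss_line(t, yb)
        exceed += (rb0 - rss_bilinear(t, yb, taus)) >= stat
    return (exceed + 1) / (n_boot + 1)

# Hypothetical scores whose decline accelerates after t = 10:
t = np.arange(20.0)
y = 28.0 - 0.2 * t - 1.0 * np.maximum(t - 10.0, 0.0) + rng.normal(0.0, 0.5, 20)
p = changepoint_pvalue(t, y, taus=t[5:15])
```

    With such a pronounced kink the bootstrap p-value is small and the null of constant decline is rejected.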

  14. Is conscious stimulus identification dependent on knowledge of the perceptual modality? Testing the "source misidentification hypothesis"

    DEFF Research Database (Denmark)

    Overgaard, Morten; Lindeløv, Jonas Kristoffer; Svejstrup, Stinna

    2013-01-01

    This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access...... to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness...... experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null-hypothesis admits the conclusion of equivalence....

  15. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
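    One common way to operationalize such a Bayesian FDR control rule, not necessarily the authors' exact procedure: reject the hypotheses with the smallest posterior null probabilities for as long as the running mean of those probabilities (the estimated Bayesian FDR of the rejection set) stays below the target rate.

```python
def bayesian_fdr_rejections(post_null_probs, alpha=0.05):
    """Greedy Bayesian FDR rule: sort hypotheses by posterior probability
    of the null, and grow the rejection set while the average posterior
    null probability among rejected hypotheses stays below alpha."""
    order = sorted(range(len(post_null_probs)),
                   key=lambda i: post_null_probs[i])
    rejected, running = [], 0.0
    for k, i in enumerate(order, start=1):
        running += post_null_probs[i]
        if running / k <= alpha:
            rejected.append(i)
        else:
            break
    return rejected

# Hypothetical posterior null probabilities for four genes:
hits = bayesian_fdr_rejections([0.001, 0.01, 0.2, 0.9], alpha=0.05)
```

    Here the first two hypotheses are rejected: adding the third would push the estimated FDR of the set above 5%.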

  16. Shaping Up the Practice of Null Hypothesis Significance Testing.

    Science.gov (United States)

    Wainer, Howard; Robinson, Daniel H.

    2003-01-01

    Discusses criticisms of null hypothesis significance testing (NHST), suggesting that historical use of NHST was reasonable, and current users should read Sir Ronald Fisher's applied work. Notes that modifications to NHST and interpretations of its outcomes might better suit the needs of modern science. Concludes that NHST is most often useful as…

  17. Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis

    Science.gov (United States)

    Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David

    2017-01-01

    The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M[subscript age] = 12.6, including 541 males and 465 females) across a 4-year…

  18. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    Science.gov (United States)

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved.

  19. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We...... show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations....
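    The test statistic at the heart of this construction can be sketched for the baseline case, a zero-mean Gaussian AR(1) with standard normal innovations; this is a sketch of the statistic only, not the paper's full analysis of its local asymptotic power.

```python
import numpy as np

rng = np.random.default_rng(7)

def unit_root_lr_stat(y):
    """Likelihood ratio statistic for H0: rho = 1 in a zero-mean Gaussian
    AR(1) y_t = rho*y_{t-1} + eps_t, namely T * log(RSS_restricted /
    RSS_unrestricted). It is nonnegative by construction."""
    ylag, ycur = y[:-1], y[1:]
    T = len(ycur)
    rho_hat = float(ylag @ ycur) / float(ylag @ ylag)   # unrestricted MLE
    rss1 = float(np.sum((ycur - rho_hat * ylag) ** 2))
    rss0 = float(np.sum((ycur - ylag) ** 2))            # imposes rho = 1
    return T * np.log(rss0 / rss1)

# Stationary AR(1) (rho = 0.5) versus a pure random walk, T = 500:
eps = rng.normal(size=501)
y_stat = np.zeros(501)
y_walk = np.zeros(501)
for i in range(1, 501):
    y_stat[i] = 0.5 * y_stat[i - 1] + eps[i]
    y_walk[i] = y_walk[i - 1] + eps[i]
lr_stat = unit_root_lr_stat(y_stat)
lr_walk = unit_root_lr_stat(y_walk)
```

    The statistic is large when the unit root is clearly violated and small for a true random walk; the paper's contribution concerns the nonstandard null distribution and near-efficiency of tests built on it.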

  20. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. One of the major concerns using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This can make an NI trial infeasible, particularly with a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption of Rothmann's test, that the observed control effect is always positive (i.e., the observed hazard ratio for placebo over the control is greater than 1), is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced when using the proposed ratio test for a fraction retention NI hypothesis.

  1. A sequential hypothesis test based on a generalized Azuma inequality

    NARCIS (Netherlands)

    Reijsbergen, D.P.; Scheinhardt, Willem R.W.; de Boer, Pieter-Tjerk

    We present a new power-one sequential hypothesis test based on a bound for the probability that a bounded zero-mean martingale ever crosses a curve of the form $a(n+k)^b$. The proof of the bound is of independent interest.
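    A toy version of such a power-one sequential test: keep a centered random walk and reject the null the first time it crosses a curve of the form a*(n+k)**b. The curve parameters below are illustrative placeholders, not the paper's constants.

```python
import random

random.seed(3)

def sequential_test(draw, p0=0.5, a=3.0, k=10.0, b=0.6, max_n=100_000):
    """Power-one sequential test of H0: E[X] = p0 for draws in [0, 1].
    H0 is rejected the first time the centered sum |S_n| crosses the
    curve a*(n+k)**b; the paper's generalized Azuma inequality bounds
    the probability that a bounded martingale ever crosses such a curve,
    which controls the type-I error of this stopping rule."""
    s = 0.0
    for n in range(1, max_n + 1):
        s += draw() - p0
        if abs(s) >= a * (n + k) ** b:
            return "reject", n
    return "undecided", max_n

# A coin biased to 0.8 is eventually rejected (power one):
verdict, stopped_at = sequential_test(
    lambda: 1.0 if random.random() < 0.8 else 0.0)
```

    Under the alternative the drift of S_n is linear in n while the boundary grows only like n**0.6, so the walk crosses it with probability one; under the null the crossing probability is kept small by the martingale bound.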

  2. Test of the Brink-Axel Hypothesis for the Pygmy Dipole Resonance

    Science.gov (United States)

    Martin, D.; von Neumann-Cosel, P.; Tamii, A.; Aoi, N.; Bassauer, S.; Bertulani, C. A.; Carter, J.; Donaldson, L.; Fujita, H.; Fujita, Y.; Hashimoto, T.; Hatanaka, K.; Ito, T.; Krugmann, A.; Liu, B.; Maeda, Y.; Miki, K.; Neveling, R.; Pietralla, N.; Poltoratska, I.; Ponomarev, V. Yu.; Richter, A.; Shima, T.; Yamamoto, T.; Zweidinger, M.

    2017-11-01

    The gamma strength function and level density of 1⁻ states in 96Mo have been extracted from a high-resolution study of the (p⃗,p⃗′) reaction at 295 MeV and extreme forward angles. By comparison with compound nucleus γ decay experiments, this allows a test of the generalized Brink-Axel hypothesis in the energy region of the pygmy dipole resonance. The Brink-Axel hypothesis is commonly assumed in astrophysical reaction network calculations and states that the gamma strength function in nuclei is independent of the structure of the initial and final state. The present results validate the Brink-Axel hypothesis for 96Mo and provide independent confirmation of the methods used to separate gamma strength function and level density in γ decay experiments.

  3. Correlates of androgens in wild male Barbary macaques: Testing the challenge hypothesis.

    Science.gov (United States)

    Rincon, Alan V; Maréchal, Laëtitia; Semple, Stuart; Majolo, Bonaventura; MacLarnon, Ann

    2017-10-01

    Investigating causes and consequences of variation in hormonal expression is a key focus in behavioral ecology. Many studies have explored patterns of secretion of the androgen testosterone in male vertebrates, using the challenge hypothesis (Wingfield, Hegner, Dufty, & Ball, 1990; The American Naturalist, 136(6), 829-846) as a theoretical framework. Rather than the classic association of testosterone with male sexual behavior, this hypothesis predicts that high levels of testosterone are associated with male-male reproductive competition but also inhibit paternal care. The hypothesis was originally developed for birds, and subsequently tested in other vertebrate taxa, including primates. Such studies have explored the link between testosterone and reproductive aggression as well as other measures of mating competition, or between testosterone and aspects of male behavior related to the presence of infants. Very few studies have simultaneously investigated the links between testosterone and male aggression, other aspects of mating competition and infant-related behavior. We tested predictions derived from the challenge hypothesis in wild male Barbary macaques (Macaca sylvanus), a species with marked breeding seasonality and high levels of male-infant affiliation, providing a powerful test of this theoretical framework. Over 11 months, 251 hr of behavioral observations and 296 fecal samples were collected from seven adult males in the Middle Atlas Mountains, Morocco. Fecal androgen levels rose before the onset of the mating season, during a period of rank instability, and were positively related to group mating activity across the mating season. Androgen levels were unrelated to rates of male-male aggression in any period, but higher ranked males had higher levels in both the mating season and in the period of rank instability. Lower androgen levels were associated with increased rates of male-infant grooming during the mating and unstable periods. Our results

  4. Semiparametric Power Envelopes for Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael

    This paper derives asymptotic power envelopes for tests of the unit root hypothesis in a zero-mean AR(1) model. The power envelopes are derived using the limits of experiments approach and are semiparametric in the sense that the underlying error distribution is treated as an unknown...

  5. Using Employer Hiring Behavior to Test the Educational Signaling Hypothesis

    NARCIS (Netherlands)

    Albrecht, J.W.; van Ours, J.C.

    2001-01-01

    This paper presents a test of the educational signaling hypothesis. If employers use education as a signal in the hiring process, they will rely more on education when less is otherwise known about applicants. We find that employers are more likely to lower educational standards when an informal, more

  6. A test of the domain-specific acculturation strategy hypothesis.

    Science.gov (United States)

    Miller, Matthew J; Yang, Minji; Lim, Robert H; Hui, Kayi; Choi, Na-Yeun; Fan, Xiaoyan; Lin, Li-Ling; Grome, Rebekah E; Farrell, Jerome A; Blackmon, Sha'kema

    2013-01-01

    Acculturation literature has evolved over the past several decades and has highlighted the dynamic ways in which individuals negotiate experiences in multiple cultural contexts. The present study extends this literature by testing M. J. Miller and R. H. Lim's (2010) domain-specific acculturation strategy hypothesis, namely that individuals might use different acculturation strategies (i.e., assimilated, bicultural, separated, and marginalized; J. W. Berry, 2003) across behavioral and values domains, in 3 independent cluster analyses with Asian American participants. Present findings supported the domain-specific acculturation strategy hypothesis, as 67% to 72% of participants from 3 independent samples used different strategies across behavioral and values domains. Consistent with theory, a number of acculturation strategy cluster group differences emerged across generational status, acculturative stress, mental health symptoms, and attitudes toward seeking professional psychological help. Study limitations and future directions for research are discussed.

  7. Testing for Marshall-Lerner hypothesis: A panel approach

    Science.gov (United States)

    Azizan, Nur Najwa; Sek, Siok Kun

    2014-12-01

    The relationship between real exchange rates and trade balances is documented in many theories. One of these is the so-called Marshall-Lerner condition. In this study, we seek to test the validity of the Marshall-Lerner hypothesis, i.e. to reveal whether the depreciation of the real exchange rate leads to an improvement in trade balances. We focus our study on the ASEAN-5 countries and their main trade partners of the U.S., Japan and China. The dynamic panel pooled mean group (PMG) approach is used to test for the Marshall-Lerner hypothesis among ASEAN-5, between ASEAN-5 and the U.S., between ASEAN-5 and Japan and between ASEAN-5 and China respectively. The estimation is based on an autoregressive distributed lag (ARDL) model for the period 1970-2012. The paper concludes that the Marshall-Lerner condition does not hold in bilateral trades in the four groups of countries. The trade balances of ASEAN-5 are mainly determined by the domestic income level and foreign production cost.

  8. A hypothesis on improving foreign accents by optimizing variability in vocal learning brain circuits

    OpenAIRE

    Simmonds, Anna J.

    2015-01-01

    Rapid vocal motor learning is observed when acquiring a language in early childhood, or learning to speak another language later in life. Accurate pronunciation is one of the hardest things for late learners to master and they are almost always left with a non-native accent. Here, I propose a novel hypothesis that this accent could be improved by optimizing variability in vocal learning brain circuits during learning. Much of the neurobiology of human vocal motor learning has been inferred fr...

  9. A test of the substitution-habitat hypothesis in amphibians.

    Science.gov (United States)

    Martínez-Abraín, Alejandro; Galán, Pedro

    2017-12-08

    Most examples that support the substitution-habitat hypothesis (human-made habitats act as substitutes of original habitat) deal with birds and mammals. We tested this hypothesis in 14 amphibians by using percentage occupancy as a proxy of habitat quality (i.e., higher occupancy percentages indicate higher quality). We classified water body types as original habitat (no or little human influence) depending on anatomical, behavioral, or physiological adaptations of each amphibian species. Ten species had relatively high probabilities (0.16-0.28) of occurrence in original habitat, moderate probability of occurrence in substitution habitats (0.11-0.14), and low probability of occurrence in refuge habitats (0.05-0.08). Thus, the substitution-habitat hypothesis only partially applies to amphibians because the low occupancy of refuges could be due to the negligible human persecution of this group (indicating good conservation status). However, low occupancy of refuges could also be due to low tolerance of refuge conditions, which could have led to selective extinction or colonization problems due to poor dispersal capabilities. That original habitats had the highest probabilities of occupancy suggests amphibians have a good conservation status in the region. They also appeared highly adaptable to anthropogenic substitution habitats. © 2017 Society for Conservation Biology.

  10. Advertising investment as a tool for boosting consumption: testing Galbraith's hypothesis for Spain

    Directory of Open Access Journals (Sweden)

    Valentín-Alejandro Martínez-Fernández

    2014-12-01

    Full Text Available The recession that most of the world's economies have faced in recent years has generated great interest in the study of its macroeconomic effects. In this context, a debate has resurfaced regarding advertising investment and its potential capacity to boost consumer spending and contribute to economic recovery. This idea, grounded in the so-called Galbraith hypothesis, constitutes the core of this paper, whose main objective is to test that hypothesis empirically. We focus on the Spanish case, with data covering the period 1976-2010. A cointegration analysis is carried out using two different approaches (the Engle-Granger test and the Gregory-Hansen test, respectively) to determine whether there is any relationship between advertising investment and six macromagnitudes (GDP, national income, consumption, savings, and fixed capital formation, as well as the registered unemployment rate). Based on the results obtained, we conclude that Galbraith's hypothesis does not hold for the Spanish case.

  11. Is it better to select or to receive? Learning via active and passive hypothesis testing.

    Science.gov (United States)

    Markant, Douglas B; Gureckis, Todd M

    2014-02-01

    People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning, and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.

  12. Testing the Efficient Markets Hypothesis on the Romanian Capital Market

    Directory of Open Access Journals (Sweden)

    Dragoș Mînjină

    2013-11-01

    Full Text Available The informational efficiency of capital markets has been the subject of numerous empirical studies. Intensive research in the field is justified by the important implications that knowledge of the level of informational efficiency has for financial practice. Empirical studies that have tested the efficient markets hypothesis on the Romanian capital market have mostly revealed that this market is not characterised even by the weak form of the efficient markets hypothesis. However, recent empirical studies have obtained results consistent with the weak form of the efficient markets hypothesis. The present decline of the Romanian capital market, recorded against a background of adverse economic developments both internally and externally, will be an important test of the continuation of these recent positive developments, including those manifested at the level of informational efficiency.

  13. The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.

    Science.gov (United States)

    Luster, Tom; And Others

    1989-01-01

    Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize supportive function of parenting and parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)

  14. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis

    Science.gov (United States)

    Snyder, Rebecca J.; Perdue, Bonnie M.; Zhang, Zhihe; Maple, Terry L.; Charlton, Benjamin D.

    2016-01-01

    The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less-vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life. PMID:27272352

  15. Testing the activitystat hypothesis: a randomised controlled trial protocol.

    Science.gov (United States)

    Gomersall, Sjaan; Maher, Carol; Norton, Kevin; Dollman, Jim; Tomkinson, Grant; Esterman, Adrian; English, Coralie; Lewis, Nicole; Olds, Tim

    2012-10-08

    The activitystat hypothesis proposes that when physical activity or energy expenditure is increased or decreased in one domain, there will be a compensatory change in another domain to maintain an overall, stable level of physical activity or energy expenditure. To date, there has been no experimental study primarily designed to test the activitystat hypothesis in adults. The aim of this trial is to determine the effect of two different imposed exercise loads on total daily energy expenditure and physical activity levels. This study will be a randomised, multi-arm, parallel controlled trial. Insufficiently active adults (as determined by the Active Australia survey) aged 18-60 years old will be recruited for this study (n=146). Participants must also satisfy the Sports Medicine Australia Pre-Exercise Screening System and must weigh less than 150 kg. Participants will be randomly assigned to one of three groups using a computer-generated allocation sequence. Participants in the Moderate exercise group will receive an additional 150 minutes of moderate to vigorous physical activity per week for six weeks, and those in the Extensive exercise group will receive an additional 300 minutes of moderate to vigorous physical activity per week for six weeks. Exercise targets will be accumulated through both group and individual exercise sessions monitored by heart rate telemetry. Control participants will not be given any instructions regarding lifestyle. The primary outcome measures are activity energy expenditure (doubly labeled water) and physical activity (accelerometry). Secondary measures will include resting metabolic rate via indirect calorimetry, use of time, maximal oxygen consumption and several anthropometric and physiological measures. Outcome measures will be conducted at baseline (zero weeks), mid- and end-intervention (three and six weeks) with three (12 weeks) and six month (24 week) follow-up. All assessors will be blinded to group allocation. 
This protocol

  16. Testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form

    DEFF Research Database (Denmark)

    Péguin-Feissolle, Anne; Strikholm, Birgit; Teräsvirta, Timo

    In this paper we propose a general method for testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form. These tests are based on a Taylor expansion of the nonlinear model around a given point in the sample space. We study the performance of our tests b...

  17. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
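    The shift the article advocates, from a binary reject/retain decision to estimation with quantified uncertainty, can be made concrete with a small example. For a binomial proportion under a uniform prior, the posterior is Beta(s+1, f+1), and a central 95% credible interval can be read off the posterior CDF. The grid approximation below is an illustrative stdlib-only sketch (the function name, grid size, and example counts are assumptions, not taken from the article):

```python
import math

def beta_posterior_ci(successes, failures, level=0.95, grid=20001):
    """Central credible interval for a binomial proportion under a
    uniform prior; the posterior is Beta(successes+1, failures+1).
    Computed by normalising the density on a grid (stdlib-only sketch)."""
    a, b = successes + 1, failures + 1
    xs = [i / (grid - 1) for i in range(grid)]

    def logpdf(x):
        # Unnormalised log-density; the endpoints contribute zero weight.
        if x <= 0.0 or x >= 1.0:
            return float("-inf")
        return (a - 1) * math.log(x) + (b - 1) * math.log(1 - x)

    ws = [math.exp(logpdf(x)) for x in xs]
    total = sum(ws)
    tail = (1 - level) / 2
    cdf, lo, hi = 0.0, None, None
    for x, w in zip(xs, ws):
        cdf += w / total
        if lo is None and cdf >= tail:
            lo = x
        if hi is None and cdf >= 1 - tail:
            hi = x
    return lo, hi

# 17 successes in 20 trials: report an interval, not just "p < .05".
lo, hi = beta_posterior_ci(17, 3)
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```

The interval communicates both the estimate and its uncertainty in one object, which is the article's central contrast with a bare significance test.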

  18. Improving the space surveillance telescope's performance using multi-hypothesis testing

    Energy Technology Data Exchange (ETDEWEB)

    Chris Zingarelli, J.; Cain, Stephen [Air Force Institute of Technology, 2950 Hobson Way, Bldg 641, Wright Patterson AFB, OH 45433 (United States); Pearce, Eric; Lambour, Richard [Lincoln Laboratory, Massachusetts Institute of Technology, 244 Wood Street, Lexington, MA 02421 (United States); Blake, Travis [Defense Advanced Research Projects Agency, 675 North Randolph Street, Arlington, VA 22203 (United States); Peterson, Curtis J. R., E-mail: John.Zingarelli@afit.edu [United States Air Force, 1690 Air Force Pentagon, Washington, DC 20330 (United States)

    2014-05-01

    The Space Surveillance Telescope (SST) is a Defense Advanced Research Projects Agency program designed to detect objects in space, such as near-Earth asteroids and space debris in the geosynchronous Earth orbit (GEO) belt. Binary hypothesis test (BHT) methods have historically been used to facilitate the detection of new objects in space. In this paper a multi-hypothesis detection strategy is introduced to improve the detection performance of SST. In this context, the multi-hypothesis testing (MHT) determines whether an unresolvable point source is in the center, a corner, or a side of a pixel, in contrast to BHT, which only tests whether an object is in the pixel or not. The images recorded by SST are undersampled enough to cause aliasing, which degrades the performance of traditional detection schemes. The equations for the MHT are derived in terms of signal-to-noise ratio (S/N), which is computed by subtracting the background light level around the pixel being tested and dividing by the standard deviation of the noise. A new method for determining the local noise statistics that rejects outliers is introduced in combination with the MHT. An experiment using observations of a known GEO satellite is used to demonstrate the improved detection performance of the new algorithm over algorithms previously reported in the literature. The results show a significant improvement in the probability of detection, by as much as 50%, over existing algorithms. In addition to detection, the S/N results prove to be linearly related to the least-squares estimates of point source irradiance, thus improving photometric accuracy.
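    The S/N definition in the abstract (pixel value minus the local background, divided by the noise standard deviation, with outliers rejected from the local statistics) can be sketched as follows. This is a toy illustration, not the SST pipeline: the window size, the median-based clip rule, and the function name are all assumptions.

```python
import statistics

def local_snr(image, row, col, window=2, clip=3.0):
    """Illustrative S/N for one pixel: subtract the local background
    level and divide by the noise standard deviation, estimating both
    from a neighbourhood after rejecting outliers (hypothetical window
    and clip values; not the SST implementation)."""
    neigh = [image[r][c]
             for r in range(row - window, row + window + 1)
             for c in range(col - window, col + window + 1)
             if (r, c) != (row, col)
             and 0 <= r < len(image) and 0 <= c < len(image[0])]
    # Crude outlier rejection: drop samples far from the median so that
    # nearby stars do not bias the background and noise estimates.
    med = statistics.median(neigh)
    sigma = statistics.pstdev(neigh)
    kept = [v for v in neigh if abs(v - med) <= clip * sigma] or neigh
    background = statistics.mean(kept)
    noise = statistics.pstdev(kept)
    return (image[row][col] - background) / noise if noise else float("inf")

# A flat 10/11 checkerboard background with one bright pixel at (2, 2).
img = [[10 + ((r + c) % 2) for c in range(5)] for r in range(5)]
img[2][2] = 50
print(local_snr(img, 2, 2))
```

The MHT described in the abstract would then evaluate such an S/N statistic under several sub-pixel position hypotheses rather than a single in/out decision.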

  19. Congruence analysis of geodetic networks - hypothesis tests versus model selection by information criteria

    Science.gov (United States)

    Lehmann, Rüdiger; Lösler, Michael

    2017-12-01

    Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
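    The information-criterion alternative described in the abstract amounts to scoring each candidate deformation model by AIC and keeping the minimum. Below is a minimal sketch for least-squares fits with Gaussian errors, where AIC = n·ln(RSS/n) + 2k up to an additive constant; the tiny "levelling" data set and both models are invented for illustration, not taken from the paper.

```python
import math

def aic_ls(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors (additive
    constant dropped): AIC = n * ln(RSS / n) + 2k."""
    return n * math.log(rss / n) + 2 * k

# Toy data: heights at 6 epochs with a clear linear trend (a "deformation").
y = [0.0, 1.1, 1.9, 3.2, 3.9, 5.1]
n = len(y)
x = list(range(n))

# Null model: constant mean (no deformation), k = 1 parameter.
mean_y = sum(y) / n
rss0 = sum((v - mean_y) ** 2 for v in y)

# Alternative model: straight line (linear deformation), k = 2 parameters.
mx = sum(x) / n
b = (sum((xi - mx) * (yi - mean_y) for xi, yi in zip(x, y))
     / sum((xi - mx) ** 2 for xi in x))
a = mean_y - b * mx
rss1 = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

aic0, aic1 = aic_ls(rss0, n, 1), aic_ls(rss1, n, 2)
print(aic1 < aic0)  # the trend model wins despite its extra parameter
```

Unlike a multiple hypothesis test, this comparison needs no decision error rates: the 2k term penalises model complexity directly.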

  20. Testing the fire-sale FDI hypothesis for the European financial crisis

    NARCIS (Netherlands)

    Weitzel, G.U.; Kling, G.; Gerritsen, D.

    2014-01-01

    Using a panel of corporate transactions in 27 EU countries from 1999 to 2012, we investigate the impact of the financial crisis on the market for corporate assets. In particular, we test the ‘fire-sale FDI’ hypothesis by analyzing the number of cross-border transactions, the price of corporate

  2. [Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].

    Science.gov (United States)

    Simmer, H H

    1980-07-01

    Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary with an increase in intraovarian pressure produced by the growing follicles. In 1884, Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women, by bimanual compression from the outside and through the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prediction derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.

  3. Testing the implicit processing hypothesis of precognitive dream experience.

    Science.gov (United States)

    Valášek, Milan; Watt, Caroline; Hutton, Jenny; Neill, Rebecca; Nuttall, Rachel; Renwick, Grace

    2014-08-01

    Seemingly precognitive (prophetic) dreams may be a result of one's unconscious processing of environmental cues and having an implicit inference based on these cues manifest itself in one's dreams. We present two studies exploring this implicit processing hypothesis of precognitive dream experience. Study 1 investigated the relationship between implicit learning, transliminality, and precognitive dream belief and experience. Participants completed the Serial Reaction Time task and several questionnaires. We predicted a positive relationship between the variables. With the exception of relationships between transliminality and precognitive dream belief and experience, this prediction was not supported. Study 2 tested the hypothesis that differences in the ability to notice subtle cues explicitly might account for precognitive dream beliefs and experiences. Participants completed a modified version of the flicker paradigm. We predicted a negative relationship between the ability to explicitly detect changes and precognitive dream variables. This relationship was not found. There was also no relationship between precognitive dream belief and experience and implicit change detection. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Mental Abilities and School Achievement: A Test of a Mediation Hypothesis

    Science.gov (United States)

    Vock, Miriam; Preckel, Franzis; Holling, Heinz

    2011-01-01

    This study analyzes the interplay of four cognitive abilities--reasoning, divergent thinking, mental speed, and short-term memory--and their impact on academic achievement in school in a sample of adolescents in grades seven to 10 (N = 1135). Based on information processing approaches to intelligence, we tested a mediation hypothesis, which states…

  5. The Need for Nuance in the Null Hypothesis Significance Testing Debate

    Science.gov (United States)

    Häggström, Olle

    2017-01-01

    Null hypothesis significance testing (NHST) provides an important statistical toolbox, but there are a number of ways in which it is often abused and misinterpreted, with bad consequences for the reliability and progress of science. Parts of contemporary NHST debate, especially in the psychological sciences, is reviewed, and a suggestion is made…

  6. Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik [Korea Institute of Machinery and Materials, Daejeon (Korea, Republic of)

    2015-12-15

    A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles.
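    The B10 life the abstract reports follows directly from the fitted Weibull parameters: it is the time (or cycle count) at which the cumulative failure probability reaches 10%. A minimal sketch, with hypothetical shape and scale values standing in for the paper's actual estimates:

```python
import math

def b10_life(shape, scale):
    """B10 life of a Weibull(shape, scale) distribution: the time by
    which 10% of units are expected to have failed. Solving
    F(t) = 1 - exp(-(t / scale) ** shape) = 0.10 for t gives:"""
    return scale * (-math.log(0.9)) ** (1.0 / shape)

# Hypothetical parameters for illustration only; the paper's actual
# estimates for the existing and improved valves are not reproduced here.
existing = b10_life(shape=2.0, scale=1.0e6)   # cycles
improved = b10_life(shape=2.0, scale=1.5e6)   # same shape, larger scale

# With the shape parameter unchanged, B10 life grows in proportion to
# the scale parameter, matching the paper's qualitative finding.
print(improved / existing)
```

This is why a similar shape parameter plus an increased scale parameter, as the hypothesis test indicated, translates directly into a longer service life estimate.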

  8. Testing hypotheses and the advancement of science: recent attempts to falsify the equilibrium point hypothesis.

    Science.gov (United States)

    Feldman, Anatol G; Latash, Mark L

    2005-02-01

    Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.

  9. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, it suggests the use of the hypergeometric distribution to calculate the parameters of sampling plans, avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk; likewise, the same question arises with consumer risk, which is necessarily associated with type II error. The resolution of these questions is new to the literature.
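    The paper's first point, that the exact hypergeometric calculation should replace binomial or Poisson approximations, is easy to illustrate. A lot of N items containing D defectives is accepted when a random sample of n items contains at most c defectives; the sketch below compares the exact acceptance probability with the binomial approximation (the plan numbers are arbitrary examples, not taken from the paper).

```python
from math import comb

def accept_prob_hypergeom(N, D, n, c):
    """Exact probability of accepting a lot of N items containing D
    defectives, when a sample of n items is accepted iff it contains
    at most c defectives (no binomial/Poisson approximation)."""
    return sum(comb(D, k) * comb(N - D, n - k)
               for k in range(min(c, D, n) + 1)) / comb(N, n)

def accept_prob_binomial(N, D, n, c):
    """The common binomial approximation, shown for comparison."""
    p = D / N
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(c + 1))

# Small lot, sizeable sample: the discrepancy is visible at this scale.
exact = accept_prob_hypergeom(N=50, D=5, n=13, c=1)
approx = accept_prob_binomial(N=50, D=5, n=13, c=1)
print(round(exact, 4), round(approx, 4))
```

Because the sample is a substantial fraction of the lot, sampling without replacement matters and the binomial approximation overstates the acceptance probability here.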

  10. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Directory of Open Access Journals (Sweden)

    Prokop Pavol

    2016-06-01

    Full Text Available Rape is a recurrent adaptive problem for human females and for females of a number of non-human animal species. Rape has various physiological and reproductive costs to the victim. The costs of rape are furthermore compounded by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially on long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than did men in a committed relationship (Hypothesis 3), suggesting that mating opportunities mediate men's perception of victims of rape. Overall, our results suggest that the risk of cuckoldry, rather than the fear of disease transmission, underlies men's negative perception of victims of rape.

  11. Test-potentiated learning: three independent replications, a disconfirmed hypothesis, and an unexpected boundary condition.

    Science.gov (United States)

    Wissman, Kathryn T; Rawson, Katherine A

    2018-04-01

    Arnold and McDermott [(2013). Test-potentiated learning: Distinguishing between direct and indirect effects of testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39, 940-945] isolated the indirect effects of testing and concluded that encoding is enhanced to a greater extent following more versus fewer practice tests, referred to as test-potentiated learning. The current research provided further evidence for test-potentiated learning and evaluated the covert retrieval hypothesis as an alternative explanation for the observed effect. Learners initially studied foreign language word pairs and then completed either one or five practice tests before restudy occurred. Results of greatest interest concern performance on test trials following restudy for items that were not correctly recalled on the test trials that preceded restudy. Results replicate Arnold and McDermott (2013) by demonstrating that more versus fewer tests potentiate learning when trial time is limited. Results also provide strong evidence against the covert retrieval hypothesis concerning why the effect occurs (i.e., it does not reflect differential covert retrieval during pre-restudy trials). In addition, outcomes indicate that the magnitude of the test-potentiated learning effect decreases as trial length increases, revealing an unexpected boundary condition to test-potentiated learning.

  12. Picture-Perfect Is Not Perfect for Metamemory: Testing the Perceptual Fluency Hypothesis with Degraded Images

    Science.gov (United States)

    Besken, Miri

    2016-01-01

    The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…

  13. Risk Based Optimal Fatigue Testing

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M.H.; Kroon, I.B.

    1992-01-01

    Optimal fatigue life testing of materials is considered. Based on minimization of the total expected costs of a mechanical component, a strategy is suggested to determine the optimal stress range levels for which additional experiments are to be performed, together with an optimal value...

  14. Why is muscularity sexy? Tests of the fitness indicator hypothesis.

    Science.gov (United States)

    Frederick, David A; Haselton, Martie G

    2007-08-01

    Evolutionary scientists propose that exaggerated secondary sexual characteristics are cues of genes that increase offspring viability or reproductive success. In six studies the hypothesis that muscularity is one such cue is tested. As predicted, women rate muscular men as sexier, more physically dominant and volatile, and less committed to their mates than nonmuscular men. Consistent with the inverted-U hypothesis of masculine traits, men with moderate muscularity are rated most attractive. Consistent with past research on fitness cues, across two measures, women indicate that their most recent short-term sex partners were more muscular than their other sex partners (ds = .36, .47). Across three studies, when controlling for other characteristics (e.g., body fat), muscular men rate their bodies as sexier to women (partial rs = .49-.62) and report more lifetime sex partners (partial rs = .20-.27), short-term partners (partial rs = .25-.28), and more affairs with mated women (partial r = .28).

  15. Statistical hypothesis tests of some micrometeorological observations

    International Nuclear Information System (INIS)

    SethuRaman, S.; Tichler, J.

    1977-01-01

    The chi-square goodness-of-fit test is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and new chi-square values are computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.43 were approximately normal. Intermittency associated with the formation and breaking of internal gravity waves in surface-based inversions over water is thought to be the reason for the non-normality.
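    The moment coefficients the abstract relies on are straightforward to compute from central moments: g1 = m3/m2^1.5 (skewness) and g2 = m4/m2^2 - 3 (excess). A stdlib-only sketch, applying the abstract's |g1| < 0.43 cutoff to invented samples:

```python
def moments(data):
    """Mean and central moments m2, m3, m4 of a sample."""
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    return mean, m2, m3, m4

def skewness_g1(data):
    """Coefficient of skewness g1 = m3 / m2**1.5."""
    _, m2, m3, _ = moments(data)
    return m3 / m2 ** 1.5

def excess_g2(data):
    """Coefficient of excess g2 = m4 / m2**2 - 3 (zero for a normal
    population)."""
    _, m2, _, m4 = moments(data)
    return m4 / m2 ** 2 - 3.0

# Invented samples; 0.43 is the cutoff quoted in the abstract.
symmetric = [-2, -1, -1, 0, 0, 0, 1, 1, 2]
skewed = [0, 0, 0, 0, 1, 1, 2, 5, 9]
print(abs(skewness_g1(symmetric)) < 0.43)  # approximately normal
print(abs(skewness_g1(skewed)) < 0.43)     # not
```

In the study, samples failing such a cutoff would then be refit with the Gram-Charlier-corrected density before rerunning the chi-square comparison.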

  16. Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks

    Science.gov (United States)

    Tarighati, Alla; Gross, James; Jalden, Joakim

    2017-09-01

    We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.

  17. A Highest Order Hypothesis Compatibility Test for Monocular SLAM

    Directory of Open Access Journals (Sweden)

    Edmundo Guerra

    2013-08-01

    Full Text Available Simultaneous Localization and Mapping (SLAM) is a key problem to solve in order to build truly autonomous mobile robots. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants, based entirely on a bearing-only sensor working over six DOF. The monocular SLAM method developed in this work is based on the Delayed Inverse-Depth (DI-D) Feature Initialization, with the contribution of a new data association batch validation technique, the Highest Order Hypothesis Compatibility Test, HOHCT. The Delayed Inverse-Depth technique is used to initialize new features in the system and defines a single hypothesis for the initial depth of features with the use of a stochastic technique of triangulation. The introduced HOHCT method is based on the evaluation of statistically compatible hypotheses and a search algorithm designed to exploit the strengths of the Delayed Inverse-Depth technique to achieve good performance results. This work presents the HOHCT with a detailed formulation of the monocular DI-D SLAM problem. The performance of the proposed HOHCT is validated with experimental results, in both indoor and outdoor environments, while its costs are compared with other popular approaches.

  18. Planned Hypothesis Tests Are Not Necessarily Exempt From Multiplicity Adjustment

    Directory of Open Access Journals (Sweden)

    Andrew V. Frane

    2015-10-01

    Full Text Available Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are inherently unnecessary if the tests were “planned” (i.e., if the hypotheses were specified before the study began). This longstanding misconception continues to be perpetuated in textbooks and continues to be cited in journal articles to justify disregard for Type I error inflation. I critically evaluate this myth and examine its rationales and variations. To emphasize the myth’s prevalence and relevance in current research practice, I provide examples from popular textbooks and from recent literature. I also make recommendations for improving research practice and pedagogy regarding this problem and regarding multiple testing in general.
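    The Type I error inflation described above is easy to demonstrate by simulation. The sketch below runs m planned t-tests per study with every null true and compares the family-wise error rate with and without a Bonferroni adjustment; sample sizes and trial counts are arbitrary choices for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
m, alpha, n, trials = 10, 0.05, 30, 1000

raw_errors = bonf_errors = 0
for _ in range(trials):
    # One "study": m planned two-sample t-tests with every null hypothesis true
    pvals = np.array([stats.ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue
                      for _ in range(m)])
    raw_errors += (pvals < alpha).any()        # any false discovery, unadjusted
    bonf_errors += (pvals < alpha / m).any()   # any false discovery, Bonferroni

raw_fwer = raw_errors / trials    # approx 1 - (1 - alpha)**m, about 0.40
bonf_fwer = bonf_errors / trials  # held near alpha = 0.05
```

    Whether the ten tests were planned in advance makes no difference to the unadjusted family-wise error rate, which is the paper's central point.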

  19. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  20. Graphic tests of Easterlin's hypothesis: science or art?

    Science.gov (United States)

    Rutten, A; Higgs, R

    1984-01-01

    Richard Easterlin believes that the postwar fertility cycle is uniquely consistent with the hypothesis of his relative income model of fertility, yet a closer examination of his evidence shows that the case for the relative income explanation is much weaker than initially appears. Easterlin finds the postwar baby boom a transparent event. Couples who entered the labor market in the postwar period had very low material aspirations. Having grown up during the Great Depression and World War II, they were content with a modest level of living. Their labor market experience was very good. Tight restrictions on immigration kept aliens from coming in to fill the gap. Thus the members of this generation occupied an unprecedented position. They could easily meet and even exceed their expectations. This high level of relative income meant that they could have more of everything they wanted, including children. For the children born during the baby boom, all this was reversed, and hence the seeds of the baby bust were sown. To test this hypothesis, Easterlin compared the movements of relative income and fertility over the postwar years using a graph. 4 published versions of the graph are presented. The graph shows that relative income and fertility did move together over the cycle, apparently very closely. Easterlin's measure of fertility is the total fertility rate (TFR). There is no such direct measure of relative income. Easterlin develops 2 proxies based on changing economic conditions believed to shape the level of material aspirations. His preferred measure, labeled R or income in his graph, relates the income experience of young couples in the years previous to marriage to that of their parents in the years before the young people left home. Because the available data limit construction of this index to the years after 1956, another measure, labeled Re or employment in Easterlin's graphs, is constructed for the pre-1956 period. This measure relates the average of

  1. Praise the Bridge that Carries You Over: Testing the Flattery Citation Hypothesis

    DEFF Research Database (Denmark)

    Frandsen, Tove Faber; Nicolaisen, Jeppe

    2011-01-01

    analysis of the editorial board members entering American Economic Review from 1984 to 2004 using a citation window of 11 years. In order to test the flattery citation hypothesis further we have conducted a study applying the difference-in-difference estimator. We analyse the number of times the editors...

  2. Speech production in people who stutter: Testing the motor plan assembly hypothesis

    NARCIS (Netherlands)

    Lieshout, P.H.H.M. van; Hulstijn, W.; Peters, H.F.M.

    1996-01-01

    The main purpose of the present study was to test the hypothesis that persons who stutter, when compared to persons who do not stutter, are less able to assemble abstract motor plans for short verbal responses. Subjects were adult males who stutter and age- and sex-matched control speakers, who were

  3. Using modern human cortical bone distribution to test the systemic robusticity hypothesis.

    Science.gov (United States)

    Baab, Karen L; Copes, Lynn E; Ward, Devin L; Wells, Nora; Grine, Frederick E

    2018-06-01

    The systemic robusticity hypothesis links the thickness of cortical bone in both the cranium and limb bones. This hypothesis posits that thick cortical bone is in part a systemic response to circulating hormones, such as growth hormone and thyroid hormone, possibly related to physical activity or cold climates. Although this hypothesis has gained popular traction, only rarely has robusticity of the cranium and postcranial skeleton been considered jointly. We acquired computed tomographic scans from associated crania, femora and humeri from single individuals representing 11 populations in Africa and North America (n = 228). Cortical thickness in the parietal, frontal and occipital bones and cortical bone area in limb bone diaphyses were analyzed using correlation, multiple regression and general linear models to test the hypothesis. Absolute thickness values from the crania were not correlated with cortical bone area of the femur or humerus, which is at odds with the systemic robusticity hypothesis. However, measures of cortical bone scaled by total vault thickness and limb cross-sectional area were positively correlated between the cranium and postcranium. When accounting for a range of potential confounding variables, including sex, age and body mass, variation in relative postcranial cortical bone area explained ∼20% of variation in the proportion of cortical cranial bone thickness. While these findings provide limited support for the systemic robusticity hypothesis, cranial cortical thickness did not track climate or physical activity across populations. Thus, some of the variation in cranial cortical bone thickness in modern humans is attributable to systemic effects, but the driving force behind this effect remains obscure. Moreover, neither absolute nor proportional measures of cranial cortical bone thickness are positively correlated with total cranial bone thickness, complicating the extrapolation of these findings to extinct species where only cranial

  4. Aging and motor variability: a test of the neural noise hypothesis.

    Science.gov (United States)

    Sosnoff, Jacob J; Newell, Karl M

    2011-07-01

    Experimental tests of the neural noise hypothesis of aging, which holds that aging-related increments in motor variability are due to increases in white noise in the perceptual-motor system, were conducted. Young (20-29 years old) and old (60-69 and 70-79 years old) adults performed several perceptual-motor tasks. Older adults were progressively more variable in their performance outcome, but there was no age-related difference in white noise in the motor output. Older adults had a greater frequency-dependent structure in their motor variability that was associated with performance decrements. The findings challenge the main tenet of the neural noise hypothesis of aging in that the increased variability of older adults was due to a decreased ability to adapt to the constraints of the task rather than an increment of neural noise per se.
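    The distinction drawn above, between white (serially uncorrelated) noise and frequency-dependent structure in motor output, can be illustrated with a lag-1 autocorrelation check on synthetic series. The AR(1) process and its coefficient are illustrative assumptions, not the authors' data or method:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 2000

# "Neural noise" account: output variability is white, i.e. serially uncorrelated
white = rng.normal(size=n)

# Frequency-dependent structure: an AR(1) process has correlated variability
ar = np.empty(n)
ar[0] = 0.0
for t in range(1, n):
    ar[t] = 0.6 * ar[t - 1] + rng.normal()

def lag1_autocorr(x):
    """Lag-1 autocorrelation: ~0 for white noise, clearly positive for AR(1)."""
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x, x))

r_white = lag1_autocorr(white)   # near zero
r_ar = lag1_autocorr(ar)         # near the AR coefficient, 0.6
```

    Both series are "variable", but only the second has the kind of temporal structure that the study found to distinguish older adults, rather than a simple increase in white-noise magnitude.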

  5. Balassa-Samuelson Hypothesis: A Test Of Turkish Economy By ARDL Bound Testing Approach

    Directory of Open Access Journals (Sweden)

    Utku ALTUNÖZ

    2014-06-01

    Full Text Available The Balassa-Samuelson effect, introduced by Béla Balassa (1964) and Paul Samuelson (1964), has been a popular research theme in recent years. The concept suggests that a differentiation at the international level between the relative productivity rates of the tradable and non-tradable sectors may cause structural and permanent deviations from purchasing power parity. In this essay, the related variables for the Turkish and European economies are tested in terms of the Balassa-Samuelson effect. The choice of econometric technique used to estimate the model was important because the regressors in the model appeared to be a mixture of I(0) and I(1) processes. Thus, the ARDL bounds testing approach to cointegration analysis was used in estimating the long-run determinants of the real exchange rate. Given the dataset and econometric techniques used, the results do not support the B-S hypothesis.

  6. Hypothesis-driven methods to augment human cognition by optimizing cortical oscillations

    Directory of Open Access Journals (Sweden)

    Jörn M. Horschig

    2014-06-01

    Full Text Available Cortical oscillations have been shown to represent fundamental functions of a working brain, e.g. communication, stimulus binding, error monitoring, and inhibition, and are directly linked to behavior. Recent studies intervening with these oscillations have demonstrated effective modulation of both the oscillations and behavior. In this review, we collect evidence in favor of how hypothesis-driven methods can be used to augment cognition by optimizing cortical oscillations. We elaborate their potential usefulness for three target groups: healthy elderly, patients with attention deficit/hyperactivity disorder, and healthy young adults. We discuss the relevance of neuronal oscillations in each group and show how each of them can benefit from the manipulation of functionally-related oscillations. Further, we describe methods for manipulation of neuronal oscillations including direct brain stimulation as well as indirect task alterations. We also discuss practical considerations about the proposed techniques. In conclusion, we propose that insights from neuroscience should guide techniques to augment human cognition, which in turn can provide a better understanding of how the human brain works.

  7. A General Relativistic Null Hypothesis Test with Event Horizon Telescope Observations of the Black Hole Shadow in Sgr A*

    Science.gov (United States)

    Psaltis, Dimitrios; Özel, Feryal; Chan, Chi-Kwan; Marrone, Daniel P.

    2015-12-01

    The half opening angle of a Kerr black hole shadow is always equal to (5 ± 0.2)GM/Dc², where M is the mass of the black hole and D is its distance from the Earth. Therefore, measuring the size of a shadow and verifying whether it is within this 4% range constitutes a null hypothesis test of general relativity. We show that the black hole in the center of the Milky Way, Sgr A*, is the optimal target for performing this test with upcoming observations using the Event Horizon Telescope (EHT). We use the results of optical/IR monitoring of stellar orbits to show that the mass-to-distance ratio for Sgr A* is already known to an accuracy of ∼4%. We investigate our prior knowledge of the properties of the scattering screen between Sgr A* and the Earth, the effects of which will need to be corrected for in order for the black hole shadow to appear sharp against the background emission. Finally, we explore an edge detection scheme for interferometric data and a pattern matching algorithm based on the Hough/Radon transform and demonstrate that the shadow of the black hole at 1.3 mm can be localized, in principle, to within ∼9%. All these results suggest that our prior knowledge of the properties of the black hole, of scattering broadening, and of the accretion flow can only limit this general relativistic null hypothesis test with EHT observations of Sgr A* to ≲10%.

  8. [A test of the focusing hypothesis for category judgment: an explanation using the mental-box model].

    Science.gov (United States)

    Hatori, Tsuyoshi; Takemura, Kazuhisa; Fujii, Satoshi; Ideno, Takashi

    2011-06-01

    This paper presents a new model of category judgment. The model hypothesizes that, when more attention is focused on a category, the psychological range of the category gets narrower (category-focusing hypothesis). We explain this hypothesis by using the metaphor of a "mental-box" model: the more attention that is focused on a mental box (i.e., a category set), the smaller the size of the box becomes (i.e., a cardinal number of the category set). The hypothesis was tested in an experiment (N = 40), where the focus of attention on prescribed verbal categories was manipulated. The obtained data gave support to the hypothesis: category-focusing effects were found in three experimental tasks (regarding the category of "food", "height", and "income"). The validity of the hypothesis was discussed based on the results.

  9. Bayesian Hypothesis Testing for Psychologists: A Tutorial on the Savage-Dickey Method

    Science.gov (United States)

    Wagenmakers, Eric-Jan; Lodewyckx, Tom; Kuriyal, Himanshu; Grasman, Raoul

    2010-01-01

    In the field of cognitive psychology, the "p"-value hypothesis test has established a stranglehold on statistical reporting. This is unfortunate, as the "p"-value provides at best a rough estimate of the evidence that the data provide for the presence of an experimental effect. An alternative and arguably more appropriate measure of evidence is…
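    As a hedged sketch of the Savage-Dickey method the tutorial covers: for a normal-mean model with a conjugate normal prior, the Bayes factor for H0: δ = 0 is simply the ratio of posterior to prior density at zero. The data and prior below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# Normal-mean model with known unit variance: x_i ~ N(delta, 1).
# H0: delta = 0;  H1: delta ~ N(0, 1) (the prior).
x = np.linspace(-1.0, 1.0, 50)   # illustrative data, symmetric about zero

n, prior_sd = len(x), 1.0
post_var = 1.0 / (n + 1.0 / prior_sd**2)   # conjugate posterior variance of delta
post_mean = post_var * x.sum()             # conjugate posterior mean of delta

# Savage-Dickey density ratio: BF01 = posterior density at 0 / prior density at 0
bf01 = (stats.norm.pdf(0.0, post_mean, np.sqrt(post_var))
        / stats.norm.pdf(0.0, 0.0, prior_sd))
```

    With these data the posterior mean is zero, so BF01 reduces to the ratio of prior to posterior standard deviation, about 7.1: moderate evidence *for* the null, something a p-value cannot express.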

  10. Paranormal psychic believers and skeptics: a large-scale test of the cognitive differences hypothesis.

    Science.gov (United States)

    Gray, Stephen J; Gallo, David A

    2016-02-01

    Belief in paranormal psychic phenomena is widespread in the United States, with over a third of the population believing in extrasensory perception (ESP). Why do some people believe, while others are skeptical? According to the cognitive differences hypothesis, individual differences in the way people process information about the world can contribute to the creation of psychic beliefs, such as differences in memory accuracy (e.g., selectively remembering a fortune teller's correct predictions) or analytical thinking (e.g., relying on intuition rather than scrutinizing evidence). While this hypothesis is prevalent in the literature, few have attempted to empirically test it. Here, we provided the most comprehensive test of the cognitive differences hypothesis to date. In 3 studies, we used online screening to recruit groups of strong believers and strong skeptics, matched on key demographics (age, sex, and years of education). These groups were then tested in laboratory and online settings using multiple cognitive tasks and other measures. Our cognitive testing showed that there were no consistent group differences on tasks of episodic memory distortion, autobiographical memory distortion, or working memory capacity, but skeptics consistently outperformed believers on several tasks tapping analytical or logical thinking as well as vocabulary. These findings demonstrate cognitive similarities and differences between these groups and suggest that differences in analytical thinking and conceptual knowledge might contribute to the development of psychic beliefs. We also found that psychic belief was associated with greater life satisfaction, demonstrating benefits associated with psychic beliefs and highlighting the role of both cognitive and noncognitive factors in understanding these individual differences.

  11. Testing the hypothesis that treatment can eliminate HIV

    DEFF Research Database (Denmark)

    Okano, Justin T; Robbins, Danielle; Palk, Laurence

    2016-01-01

    BACKGROUND: Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability an individual transmits HIV. The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. METHODS: We use a CD4-staged Bayesian back-calculation approach to estimate incidence, and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study

  12. A test of the predator satiation hypothesis, acorn predator size, and acorn preference

    Science.gov (United States)

    C.H. Greenberg; S.J. Zarnoch

    2018-01-01

    Mast seeding is hypothesized to satiate seed predators with heavy production and reduce populations with crop failure, thereby increasing seed survival. Preference for red or white oak acorns could influence recruitment among oak species. We tested the predator satiation hypothesis, acorn preference, and predator size by concurrently...

  13. Persistent Confusions about Hypothesis Testing in the Social Sciences

    Directory of Open Access Journals (Sweden)

    Christopher Thron

    2015-05-01

    Full Text Available This paper analyzes common confusions involving basic concepts in statistical hypothesis testing. One-third of the social science statistics textbooks examined in the study contained false statements about significance level and/or p-value. We infer that a large proportion of social scientists are being miseducated about these concepts. We analyze the causes of these persistent misunderstandings, and conclude that the conventional terminology is prone to abuse because it does not clearly represent the conditional nature of probabilities and events involved. We argue that modifications in terminology, as well as the explicit introduction of conditional probability concepts and notation into the statistics curriculum in the social sciences, are necessary to prevent the persistence of these errors.
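    One of the confusions discussed above, reading the significance level P(reject | H0) as P(H0 | rejection), can be made concrete by simulation. In the sketch below (all parameters are illustrative assumptions), the per-test Type I error rate stays near α while the share of rejections that are false discoveries is far larger:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
trials, n, alpha = 4000, 25, 0.05

null_true = rng.random(trials) < 0.8      # 80% of tested hypotheses are truly null
false_alarms = rejections = 0
for h0 in null_true:
    mu = 0.0 if h0 else 0.5               # assumed effect size when H1 holds
    p = stats.ttest_1samp(rng.normal(mu, 1.0, n), 0.0).pvalue
    if p < alpha:
        rejections += 1
        false_alarms += int(h0)

type1_rate = false_alarms / null_true.sum()    # P(reject | H0): near alpha
p_h0_given_reject = false_alarms / rejections  # P(H0 | reject): much larger
```

    The two conditional probabilities differ by a factor of several here, which is exactly the conditioning error the paper argues conventional terminology invites.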

  14. Why Is Test-Restudy Practice Beneficial for Memory? An Evaluation of the Mediator Shift Hypothesis

    Science.gov (United States)

    Pyc, Mary A.; Rawson, Katherine A.

    2012-01-01

    Although the memorial benefits of testing are well established empirically, the mechanisms underlying this benefit are not well understood. The authors evaluated the mediator shift hypothesis, which states that test-restudy practice is beneficial for memory because retrieval failures during practice allow individuals to evaluate the effectiveness…

  15. Convergence Hypothesis: Evidence from Panel Unit Root Test with Spatial Dependence

    Directory of Open Access Journals (Sweden)

    Lezheng Liu

    2006-10-01

    Full Text Available In this paper we test the convergence hypothesis by using a revised 4-step procedure of the panel unit root test suggested by Evans and Karras (1996). We use data on output for 24 OECD countries over 40 years. Whether the convergence, if any, is conditional or absolute is also examined. Following a proposition by Baltagi, Bresson, and Pirotte (2005), we incorporate a spatial autoregressive error into a fixed-effect panel model to account not only for the heterogeneous panel structure, but also for spatial dependence, which might induce lower statistical power in the conventional panel unit root test. Our empirical results indicate that output is converging among OECD countries. However, convergence is characterized as conditional. The results also report a relatively lower speed of convergence compared to conventional panel studies.

  16. TESTS OF THE PLANETARY HYPOTHESIS FOR PTFO 8-8695b

    International Nuclear Information System (INIS)

    Yu, Liang; Winn, Joshua N.; Rappaport, Saul; Dai, Fei; Triaud, Amaury H. M. J.; Gillon, Michaël; Delrez, Laetitia; Jehin, Emmanuel; Lendl, Monika; Albrecht, Simon; Bieryla, Allyson; Holman, Matthew J.; Montet, Benjamin T.; Hillenbrand, Lynne; Howard, Andrew W.; Huang, Chelsea X.; Isaacson, Howard; Sanchis-Ojeda, Roberto; Muirhead, Philip

    2015-01-01

    The T Tauri star PTFO 8-8695 exhibits periodic fading events that have been interpreted as the transits of a giant planet on a precessing orbit. Here we present three tests of the planet hypothesis. First, we sought evidence for the secular changes in light-curve morphology that are predicted to be a consequence of orbital precession. We observed 28 fading events spread over several years and did not see the expected changes. Instead, we found that the fading events are not strictly periodic. Second, we attempted to detect the planet's radiation, based on infrared observations spanning the predicted times of occultations. We ruled out a signal of the expected amplitude. Third, we attempted to detect the Rossiter–McLaughlin effect by performing high-resolution spectroscopy throughout a fading event. No effect was seen at the expected level, ruling out most (but not all) possible orientations for the hypothetical planetary orbit. Our spectroscopy also revealed strong, time-variable, high-velocity Hα and Ca H and K emission features. All these observations cast doubt on the planetary hypothesis, and suggest instead that the fading events represent starspots, eclipses by circumstellar dust, or occultations of an accretion hotspot

  17. Water Pollution Detection Based on Hypothesis Testing in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xu Luo

    2017-01-01

    Full Text Available Water pollution detection is of great importance in water conservation. In this paper, the water pollution detection problems of the network and of the node in sensor networks are discussed. Detection is considered both when the monitoring noise is normally distributed and when it is not. The pollution detection problems are first analyzed based on hypothesis testing theory; specific detection algorithms are then given. Finally, two implementation examples are given to illustrate how the proposed detection methods are used for water pollution detection in sensor networks and to demonstrate their effectiveness.
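    For the node-level case with normal noise, detection reduces to a one-sided test for a mean shift. The sketch below sets the alarm threshold from a target false-alarm rate and checks detection by Monte Carlo; all numeric values (means, shift, noise level, sample size) are illustrative assumptions, not the paper's algorithms:

```python
import numpy as np
from scipy import stats

# Node-level detection of a pollution-induced mean shift under Gaussian noise.
# H0: clean water, mean mu0;  H1: polluted, mean mu0 + shift.
mu0, shift, sigma, n, alpha = 2.0, 0.6, 0.5, 16, 0.01

# Threshold on the sample mean giving false-alarm probability alpha under H0
threshold = mu0 + stats.norm.ppf(1 - alpha) * sigma / np.sqrt(n)

rng = np.random.default_rng(5)
trials = 5000
clean_means = rng.normal(mu0, sigma, (trials, n)).mean(axis=1)
polluted_means = rng.normal(mu0 + shift, sigma, (trials, n)).mean(axis=1)
p_fa = np.mean(clean_means > threshold)     # empirical false-alarm rate, near alpha
p_d = np.mean(polluted_means > threshold)   # empirical detection probability
```

    Averaging n readings before thresholding is what makes a modest shift detectable at a low false-alarm rate.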

  18. The self: Your own worst enemy? A test of the self-invoking trigger hypothesis.

    Science.gov (United States)

    McKay, Brad; Wulf, Gabriele; Lewthwaite, Rebecca; Nordin, Andrew

    2015-01-01

    The self-invoking trigger hypothesis was proposed by Wulf and Lewthwaite [Wulf, G., & Lewthwaite, R. (2010). Effortless motor learning? An external focus of attention enhances movement effectiveness and efficiency. In B. Bruya (Ed.), Effortless attention: A new perspective in attention and action (pp. 75-101). Cambridge, MA: MIT Press] as a mechanism underlying the robust effect of attentional focus on motor learning and performance. One component of this hypothesis, relevant beyond the attentional focus effect, suggests that causing individuals to access their self-schema will negatively impact their learning and performance of a motor skill. The purpose of the present two studies was to provide an initial test of the performance and learning aspects of the self-invoking trigger hypothesis by asking participants in one group to think about themselves between trial blocks-presumably activating their self-schema-to compare their performance and learning to that of a control group. In Experiment 1, participants performed 2 blocks of 10 trials on a throwing task. In one condition, participants were asked between blocks to think about their past throwing experience. While a control group maintained their performance across blocks, the self group's performance was degraded on the second block. In Experiment 2, participants were asked to practice a wiffleball hitting task on two separate days. Participants returned on a third day to perform retention and transfer tests without the self-activating manipulation. Results indicated that the self group learned the hitting task less effectively than the control group. The findings reported here provide initial support for the self-invoking trigger hypothesis.

  19. SETI in vivo: testing the we-are-them hypothesis

    Science.gov (United States)

    Makukov, Maxim A.; Shcherbak, Vladimir I.

    2018-04-01

    After it was proposed that life on Earth might descend from seeding by an earlier extraterrestrial civilization motivated to secure and spread life, some authors noted that this alternative offers a testable implication: microbial seeds could be intentionally supplied with a durable signature that might be found in extant organisms. In particular, it was suggested that the optimal location for such an artefact is the genetic code, as the least evolving part of cells. However, as the mainstream view goes, this scenario is too speculative and cannot be meaningfully tested because encoding/decoding a signature within the genetic code is something ill-defined, so any retrieval attempt is doomed to guesswork. Here we refresh the seeded-Earth hypothesis in light of recent observations, and discuss the motivation for inserting a signature. We then show that `biological SETI' involves even weaker assumptions than traditional SETI and admits a well-defined methodological framework. After assessing the possibility in terms of molecular and evolutionary biology, we formalize the approach and, adopting the standard guideline of SETI that encoding/decoding should follow from first principles and be convention-free, develop a universal retrieval strategy. Applied to the canonical genetic code, it reveals a non-trivial precision structure of interlocked logical and numerical attributes of systematic character (previously we found these heuristically). To assess this result in view of the initial assumption, we perform statistical, comparison, interdependence and semiotic analyses. Statistical analysis reveals no causal connection of the result to evolutionary models of the genetic code, interdependence analysis precludes overinterpretation, and comparison analysis shows that known variations of the code lack any precision-logic structures, in agreement with these variations being post-LUCA (i.e. post-seeding) evolutionary deviations from the canonical code. Finally, semiotic

  20. Using Optimization to Improve Test Planning

    Science.gov (United States)

    2017-09-01

    Master's thesis by Arlene M. Payne, September 2017. Thesis Advisor: Jeffrey E. Kline. Second Reader: Oleg A. Yakimenko.

  1. Predictability of Exchange Rates in Sri Lanka: A Test of the Efficient Market Hypothesis

    OpenAIRE

    Guneratne B Wickremasinghe

    2007-01-01

    This study examined the validity of the weak and semi-strong forms of the efficient market hypothesis (EMH) for the foreign exchange market of Sri Lanka. Monthly exchange rates for four currencies during the floating exchange rate regime were used in the empirical tests. Using a battery of tests, empirical results indicate that the current values of the four exchange rates can be predicted from their past values. Further, the tests of semi-strong form efficiency indicate that exchange rate pa...
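    A weak-form efficiency check of the kind described above can be sketched with a Ljung-Box test for serial correlation in returns: under the random-walk null, returns are unpredictable from their own past. The hand-rolled statistic and the simulated series below are illustrative assumptions, not the study's data or test battery:

```python
import numpy as np
from scipy import stats

def ljung_box(x, lags=10):
    """Ljung-Box Q statistic and p-value for 'no autocorrelation up to `lags`'."""
    n = len(x)
    x = x - x.mean()
    denom = np.sum(x**2)
    acf = np.array([np.sum(x[k:] * x[:-k]) / denom for k in range(1, lags + 1)])
    q = n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, lags + 1)))
    return q, stats.chi2.sf(q, df=lags)

rng = np.random.default_rng(6)
T = 500

# Weak-form-efficient market: log-returns are iid noise (a random walk in levels)
efficient_returns = rng.normal(0.0, 0.01, T)

# Inefficient market: AR(1) returns are predictable from their own past
predictable = np.empty(T)
predictable[0] = 0.0
for t in range(1, T):
    predictable[t] = 0.5 * predictable[t - 1] + rng.normal(0.0, 0.01)

q_eff, p_eff = ljung_box(efficient_returns)
q_pred, p_pred = ljung_box(predictable)   # no-autocorrelation null firmly rejected
```

    A finding like the study's, that current exchange rates are predictable from past values, corresponds to the second case.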

  2. Parameter estimation and hypothesis testing in linear models

    CERN Document Server

    Koch, Karl-Rudolf

    1999-01-01

    The necessity to publish the second edition of this book arose when its third German edition had just been published. This second English edition is therefore a translation of the third German edition of Parameter Estimation and Hypothesis Testing in Linear Models, published in 1997. It differs from the first English edition by the addition of a new chapter on robust estimation of parameters and the deletion of the section on discriminant analysis, which has been more completely dealt with by the author in the book Bayesian Inference with Geodetic Applications, Springer-Verlag, Berlin Heidelberg New York, 1990. Smaller additions and deletions have been incorporated, to improve the text, to point out new developments or to eliminate errors which became apparent. A few examples have been also added. I thank Springer-Verlag for publishing this second edition and for the assistance in checking the translation, although the responsibility of errors remains with the author. I also want to express my thanks...

  3. Visual Working Memory and Number Sense: Testing the Double Deficit Hypothesis in Mathematics

    Science.gov (United States)

    Toll, Sylke W. M.; Kroesbergen, Evelyn H.; Van Luit, Johannes E. H.

    2016-01-01

    Background: Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. Aims: The aim of this study was to test the DD…

  4. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis

  5. Hypothesis Testing of Inclusion of the Tolerance Interval for the Assessment of Food Safety.

    Directory of Open Access Journals (Sweden)

    Hungyen Chen

    Full Text Available In the testing of food quality and safety, we contrast the contents of the newly proposed food (genetically modified food) against those of conventional foods. Because the contents vary largely between crop varieties and production environments, we propose a two-sample test of substantial equivalence that examines the inclusion of the tolerance intervals of two populations: the population of the contents of the proposed food, which we call the target population, and the population of the contents of the conventional food, which we call the reference population. Rejection of the test hypothesis guarantees that the contents of the proposed foods essentially do not include outliers in the population of the contents of the conventional food. The existing tolerance interval (TI0) is constructed to have at least a pre-specified level of the coverage probability. Here, we newly introduce the complementary tolerance interval (TI1), which is guaranteed to have at most a pre-specified level of the coverage probability. By applying TI0 and TI1 to the samples from the target population and the reference population, respectively, we construct a test statistic for testing inclusion of the two tolerance intervals. To examine the performance of the testing procedure, we conducted a simulation that reflects genetic, environmental, and residual effects from a crop experiment. As a case study, we applied the hypothesis testing to test whether the distribution of the protein content of rice in the Kyushu area is included in the distribution of the protein content in the other areas of Japan.
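The paper's TI0/TI1 construction is specific to its setting, but the notion of "a tolerance interval with at least a pre-specified coverage probability" can be illustrated with Wilks' classical distribution-free interval. This is a sketch, not the paper's method: the (min, max) interval of the sample stands in for TI0, and the function names are mine.

```python
def two_sided_ti_confidence(n: int, p: float) -> float:
    """Confidence that (min, max) of an i.i.d. sample of size n covers at
    least a proportion p of the population (Wilks' distribution-free
    tolerance interval): the coverage F(x_(n)) - F(x_(1)) is Beta(n-1, 2),
    so P(coverage >= p) = 1 - n*p^(n-1) + (n-1)*p^n."""
    return 1.0 - n * p ** (n - 1) + (n - 1) * p ** n

def sample_size_for_ti(p: float, conf: float) -> int:
    """Smallest n such that (min, max) is a p-content tolerance interval
    at confidence level conf."""
    n = 2
    while two_sided_ti_confidence(n, p) < conf:
        n += 1
    return n

# Sample size needed to cover 95% of the population with 95% confidence:
print(sample_size_for_ti(0.95, 0.95))
```

The classical answer, n = 93 for 95% content at 95% confidence, drops out directly.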

  6. Testing the null hypothesis of the nonexistence of a preseizure state

    International Nuclear Information System (INIS)

    Andrzejak, Ralph G.; Kraskov, Alexander; Mormann, Florian; Rieke, Christoph; Kreuz, Thomas; Elger, Christian E.; Lehnertz, Klaus

    2003-01-01

    A rapidly growing number of studies deals with the prediction of epileptic seizures. For this purpose, various techniques derived from linear and nonlinear time series analysis have been applied to the electroencephalogram of epilepsy patients. In none of these works, however, the performance of the seizure prediction statistics is tested against a null hypothesis, an otherwise ubiquitous concept in science. In consequence, the evaluation of the reported performance values is problematic. Here, we propose the technique of seizure time surrogates based on a Monte Carlo simulation to remedy this deficit.

  7. Testing the null hypothesis of the nonexistence of a preseizure state

    Energy Technology Data Exchange (ETDEWEB)

    Andrzejak, Ralph G; Kraskov, Alexander [John-von-Neumann Institute for Computing, Forschungszentrum Juelich, 52425 Juelich (Germany); Mormann, Florian; Rieke, Christoph [Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany); Helmholtz Institut fuer Strahlen- und Kernphysik, University of Bonn, Nussallee 14-16, 53115 Bonn (Germany); Kreuz, Thomas [John-von-Neumann Institute for Computing, Forschungszentrum Juelich, 52425 Juelich (Germany); Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany); Elger, Christian E; Lehnertz, Klaus [Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany)

    2003-01-01

    A rapidly growing number of studies deals with the prediction of epileptic seizures. For this purpose, various techniques derived from linear and nonlinear time series analysis have been applied to the electroencephalogram of epilepsy patients. In none of these works, however, the performance of the seizure prediction statistics is tested against a null hypothesis, an otherwise ubiquitous concept in science. In consequence, the evaluation of the reported performance values is problematic. Here, we propose the technique of seizure time surrogates based on a Monte Carlo simulation to remedy this deficit.
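The seizure-time-surrogate idea above can be sketched generically: compare the prediction statistic evaluated at the true seizure times with its distribution over randomly placed surrogate times, and reject the null of "no preseizure state" when the true statistic beats nearly all surrogates. Everything below (the toy predictor, the window length, the injected signature) is invented for illustration.

```python
import random

HORIZON = 5  # pre-seizure window length, in samples (illustrative)

def performance(pred, seizure_times):
    """Toy prediction statistic: mean predictor output in the HORIZON
    samples preceding each seizure time."""
    vals = []
    for t in seizure_times:
        vals.extend(pred[max(0, t - HORIZON):t])
    return sum(vals) / len(vals)

def surrogate_test(pred, seizure_times, n_surrogates=999, seed=1):
    """Monte Carlo test of the null hypothesis that no preseizure state
    exists: under the null, the statistic at the true seizure times is no
    better than at randomly placed surrogate times."""
    rng = random.Random(seed)
    observed = performance(pred, seizure_times)
    exceed = 0
    for _ in range(n_surrogates):
        surrogate = rng.sample(range(HORIZON, len(pred)), len(seizure_times))
        if performance(pred, surrogate) >= observed:
            exceed += 1
    return observed, (1 + exceed) / (1 + n_surrogates)

# Toy data: noisy predictor output that rises before seizures at t = 50, 120.
rng = random.Random(0)
pred = [rng.random() for _ in range(200)]
for t in (50, 120):
    for k in range(1, HORIZON + 1):
        pred[t - k] += 2.0  # inject a "preseizure" signature
observed, p_value = surrogate_test(pred, [50, 120])
print(p_value)  # a small p-value rejects the null of no preseizure state
```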

  8. TEST OF THE CATCH-UP HYPOTHESIS IN AFRICAN AGRICULTURAL GROWTH RATES

    Directory of Open Access Journals (Sweden)

    Kalu Ukpai IFEGWU

    2015-11-01

    Full Text Available The paper tested the catch-up hypothesis in the agricultural growth rates of twenty-six African countries. Panel data were drawn from the Food and Agriculture Organization Statistics (FAOSTAT) of the United Nations. The Data Envelopment Analysis method for measuring productivity was used to estimate productivity growth rates. A cross-section framework consisting of sigma-convergence and beta-convergence was employed to test the catching-up process. Catching up is said to exist if the value of beta is negative and significant. Since catching up does not necessarily imply a narrowing of national productivity inequalities, sigma-convergence, which measures inequality, was estimated for the same variables. The results showed evidence of the catch-up process, but failed to find a narrowing of productivity inequalities among countries.
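The sigma/beta-convergence framework can be sketched with ordinary least squares. The productivity figures below are invented for illustration: a negative beta corresponds to the catch-up finding, and a falling cross-sectional dispersion would correspond to narrowing inequalities (which the paper did not find).

```python
import math
import statistics

def beta_convergence(initial, final, periods):
    """OLS slope of average growth on the log initial level.
    A negative (and significant) slope indicates catching up."""
    x = [math.log(v) for v in initial]
    y = [(math.log(f) - math.log(i)) / periods for i, f in zip(initial, final)]
    xbar, ybar = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx

def sigma_convergence(initial, final):
    """Cross-sectional dispersion of log productivity at start and end;
    a fall indicates narrowing inequalities."""
    return (statistics.stdev(math.log(v) for v in initial),
            statistics.stdev(math.log(v) for v in final))

# Hypothetical productivity indices for five countries over ten periods,
# with the laggards growing fastest:
initial = [50, 80, 100, 120, 200]
final = [90, 120, 130, 145, 210]
beta = beta_convergence(initial, final, 10)
sd_start, sd_end = sigma_convergence(initial, final)
print(beta < 0, sd_end < sd_start)  # True True: catch-up and narrowing
```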

  9. Does mediator use contribute to the spacing effect for cued recall? Critical tests of the mediator hypothesis.

    Science.gov (United States)

    Morehead, Kayla; Dunlosky, John; Rawson, Katherine A; Bishop, Melissa; Pyc, Mary A

    2018-04-01

    When study is spaced across sessions (versus massed within a single session), final performance is greater after spacing. This spacing effect may have multiple causes, and according to the mediator hypothesis, part of the effect can be explained by the use of mediator-based strategies. This hypothesis proposes that when study is spaced across sessions, rather than massed within a session, more mediators will be generated that are longer lasting, and hence more mediators will be available to support criterion recall. In two experiments, participants were randomly assigned to study paired associates using either a spaced or massed schedule. They reported strategy use for each item during study trials and during the final test. Consistent with the mediator hypothesis, participants who had spaced (as compared to massed) practice reported using more mediators on the final test. This use of effective mediators also statistically accounted for some, but not all, of the spacing effect on final performance.

  10. Central Plant Optimization for Waste Energy Reduction (CPOWER). ESTCP Cost and Performance Report

    Science.gov (United States)

    2016-12-01

    meet all demands, and not necessarily for fuel economy or energy efficiency. Plant operators run the equipment according to a pre-set, fixed strategy ...exchanger, based on the site protocol. Thermal Energy Storage Tank: Site-specific optimal operating strategies were developed for the chilled water...being served by the central plant. Hypothesis: The hypothesis tested was that the optimized operation reduces wasted energy and energy costs by smart

  11. Contrast class cues and performance facilitation in a hypothesis-testing task: evidence for an iterative counterfactual model.

    Science.gov (United States)

    Gale, Maggie; Ball, Linden J

    2012-04-01

    Hypothesis-testing performance on Wason's (Quarterly Journal of Experimental Psychology 12:129-140, 1960) 2-4-6 task is typically poor, with only around 20% of participants announcing the to-be-discovered "ascending numbers" rule on their first attempt. Enhanced solution rates can, however, readily be observed with dual-goal (DG) task variants requiring the discovery of two complementary rules, one labeled "DAX" (the standard "ascending numbers" rule) and the other labeled "MED" ("any other number triples"). Two DG experiments are reported in which we manipulated the usefulness of a presented MED exemplar, where usefulness denotes cues that can establish a helpful "contrast class" that can stand in opposition to the presented 2-4-6 DAX exemplar. The usefulness of MED exemplars had a striking facilitatory effect on DAX rule discovery, which supports the importance of contrast-class information in hypothesis testing. A third experiment ruled out the possibility that the useful MED triple seeded the correct rule from the outset and obviated any need for hypothesis testing. We propose that an extension of Oaksford and Chater's (European Journal of Cognitive Psychology 6:149-169, 1994) iterative counterfactual model can neatly capture the mechanisms by which DG facilitation arises.

  12. Life shocks and crime: a test of the "turning point" hypothesis.

    Science.gov (United States)

    Corman, Hope; Noonan, Kelly; Reichman, Nancy E; Schwartz-Soicher, Ofira

    2011-08-01

    Other researchers have posited that important events in men's lives-such as employment, marriage, and parenthood-strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant's father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the "turning point" hypothesis.

  13. Gratitude facilitates private conformity: A test of the social alignment hypothesis.

    Science.gov (United States)

    Ng, Jomel W X; Tong, Eddie M W; Sim, Dael L Y; Teo, Samantha W Y; Loy, Xingqi; Giesbrecht, Timo

    2017-03-01

    Past research has established clear support for the prosocial function of gratitude in improving the well-being of others. The present research provides evidence for another hypothesized function of gratitude: the social alignment function, which enhances the tendency of grateful individuals to follow social norms. We tested the social alignment hypothesis of gratitude in 2 studies with large samples. Using 2 different conformity paradigms, participants were subjected to a color judgment task (Experiment 1) and a material consumption task (Experiment 2). They were provided with information showing choices allegedly made by others, but were allowed to state their responses in private. Supporting the social alignment hypothesis, the results showed that induced gratitude increased private conformity. Specifically, participants induced to feel gratitude were more likely to conform to the purportedly popular choice, even if the option was factually incorrect (Experiment 1). This effect appears to be specific to gratitude; induction of joy produced significantly less conformity than gratitude (Experiment 2). We discuss whether the social alignment function provides a behavioral pathway in the role of gratitude in building social relationships.

  14. On the Keyhole Hypothesis

    DEFF Research Database (Denmark)

    Mikkelsen, Kaare B.; Kidmose, Preben; Hansen, Lars Kai

    2017-01-01

    We propose and test the keyhole hypothesis that measurements from low dimensional EEG, such as ear-EEG, reflect a broadly distributed set of neural processes. We formulate the keyhole hypothesis in information theoretical terms. The experimental investigation is based on legacy data consisting of 10...... simultaneously recorded scalp EEG. A cross-validation procedure was employed to ensure unbiased estimates. We present several pieces of evidence in support of the keyhole hypothesis: There is a high mutual information between data acquired at scalp electrodes and through the ear-EEG "keyhole," furthermore we......

  15. Mothers Who Kill Their Offspring: Testing Evolutionary Hypothesis in a 110-Case Italian Sample

    Science.gov (United States)

    Camperio Ciani, Andrea S.; Fontanesi, Lilybeth

    2012-01-01

    Objectives: This research aimed to identify incidents of mothers in Italy killing their own children and to test an adaptive evolutionary hypothesis to explain their occurrence. Methods: 110 cases of mothers killing 123 of their own offspring from 1976 to 2010 were analyzed. Each case was classified using 13 dichotomous variables. Descriptive…

  16. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based...

  17. Use of supernovae light curves for testing the expansion hypothesis and other cosmological relations

    International Nuclear Information System (INIS)

    Rust, B.W.

    1974-01-01

    This thesis is primarily concerned with a test of the expansion hypothesis based on the relation Δt_obs = (1 + V_r/c) Δt_int, where Δt_int is the time lapse characterizing some phenomenon in a distant galaxy, Δt_obs is the observed time lapse, and V_r is the symbolic velocity of recession. If the red shift is a Doppler effect, the observed time lapse should be lengthened by the same factor as the wavelength of the light. Many authors have suggested type I supernovae for such a test because of their great luminosity and the uniformity of their light curves, but apparently the test has heretofore never actually been performed. Thirty-six light curves were gathered from the literature and one (SN1971i) was measured. All of the light curves were reduced to a common (m_pg) photometric system. The comparison time lapse, Δt_c, was taken to be the time required for the brightness to fall from 0.5 mag below peak to 2.5 mag below peak. The straight line regression of Δt_c on V_r gives a correlation coefficient significant at the 93 percent level, and the simple static Euclidean hypothesis is rejected at that level. The regression line also deviates from the prediction of the classical expansion hypothesis. Better agreement was obtained using the chronogeometric theory of I. E. Segal (1972 Astron. and Astrophys. 18, 143), but the scatter in the present data makes it impossible to distinguish between these alternate hypotheses at the 95 percent confidence level. The question of how many additional light curves would be needed to give definite tests is addressed. It is shown that at the present rate of supernova discoveries, only a few more years would be required to obtain the necessary data if light curves are systematically measured for the more distant supernovae. (Diss. Abstr. Int., B)
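The regression at the heart of this test is easy to state concretely: fitting Δt_c against V_r/c by least squares should, under the expansion hypothesis, give both intercept and slope equal to the intrinsic decline time, whereas a static model predicts zero slope. The decline times and velocities below are invented, noiseless illustration data, not Rust's measurements.

```python
def ols_line(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    return ybar - b * xbar, b

# Under expansion, dt_obs = (1 + v_r/c) * dt_int, so the decline time plotted
# against v_r/c is a line with intercept dt_int AND slope dt_int; under a
# static model the slope would be zero.
dt_int = 20.0                                  # rest-frame decline time, days (illustrative)
v_over_c = [0.01, 0.03, 0.05, 0.08, 0.12]      # symbolic recession velocities (illustrative)
dt_obs = [dt_int * (1 + v) for v in v_over_c]  # perfect expansion-model data
a, b = ols_line(v_over_c, dt_obs)
print(a, b)  # both close to dt_int = 20 for this noiseless data
```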

  18. Optimal design of accelerated life tests for an extension of the exponential distribution

    International Nuclear Information System (INIS)

    Haghighi, Firoozeh

    2014-01-01

    Accelerated life tests provide information quickly on the lifetime distribution of products by testing them at higher than usual levels of stress. In this paper, the lifetime of a product at any level of stress is assumed to have an extension of the exponential distribution. This new family has been recently introduced by Nadarajah and Haghighi (2011 [1]); it can be used as an alternative to the gamma, Weibull and exponentiated exponential distributions. The scale parameter of the lifetime distribution at constant stress levels is assumed to be a log-linear function of the stress levels, and a cumulative exposure model holds. For this model, the maximum likelihood estimates (MLEs) of the parameters, as well as the Fisher information matrix, are derived. The asymptotic variance of the scale parameter at a design stress is adopted as the optimization objective and its expression is provided using the maximum likelihood method. A Monte Carlo simulation study is carried out to examine the performance of these methods. The asymptotic confidence intervals for the parameters and a hypothesis test for the parameter of interest are constructed.

  19. Design of clinical trials involving multiple hypothesis tests with a common control.

    Science.gov (United States)

    Schou, I Manjula; Marschner, Ian C

    2017-07-01

    Randomized clinical trials comparing several treatments to a common control are often reported in the medical literature. For example, multiple experimental treatments may be compared with placebo, or in combination therapy trials, a combination therapy may be compared with each of its constituent monotherapies. Such trials are typically designed using a balanced approach in which equal numbers of individuals are randomized to each arm; however, this can result in an inefficient use of resources. We provide a unified framework and new theoretical results for the optimal design of such single-control multiple-comparator studies. We consider variance optimal designs based on D-, A-, and E-optimality criteria, using a general model that allows for heteroscedasticity and a range of effect measures that include both continuous and binary outcomes. We demonstrate the sensitivity of these designs to the type of optimality criterion by showing that the optimal allocation ratios are systematically ordered according to the optimality criterion. Given this sensitivity to the optimality criterion, we argue that power optimality is a more suitable approach when designing clinical trials where testing is the objective. Weighted variance optimal designs are also discussed, which, like power optimal designs, allow the treatment difference to play a major role in determining allocation ratios. We illustrate our methods using two real clinical trial examples taken from the medical literature. Some recommendations on the use of optimal designs in single-control multiple-comparator trials are also provided.
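The inefficiency of balanced allocation can be made concrete in the simplest homoscedastic case (a sketch under my own simplifying assumptions, not the paper's general heteroscedastic model): with k experimental arms sharing one control, minimizing the average variance of the treatment-vs-control contrasts gives the control √k times the per-arm sample size.

```python
import math

def avg_contrast_variance(n_control: float, n_treat: float, k: int, sigma2: float = 1.0) -> float:
    """Average variance of the k treatment-vs-control contrasts,
    Var(Ybar_t - Ybar_c) = sigma^2 * (1/n_t + 1/n_c), with equal n per
    treatment arm and common variance sigma^2."""
    return sigma2 * (1.0 / n_treat + 1.0 / n_control)

def optimal_split(total_n: float, k: int):
    """Allocation minimizing the average contrast variance for k arms vs one
    control (homoscedastic case): minimizing k/n_t + k/n_c subject to
    n_c + k*n_t = N gives n_c = sqrt(k) * n_t."""
    n_treat = total_n / (k + math.sqrt(k))
    return math.sqrt(k) * n_treat, n_treat

k, total = 4, 300
nc, nt = optimal_split(total, k)
balanced = avg_contrast_variance(total / (k + 1), total / (k + 1), k)
optimal = avg_contrast_variance(nc, nt, k)
print(nc, nt)             # 100.0 50.0: control gets sqrt(4) = 2x each arm
print(optimal < balanced)
```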

  20. Statistical hypothesis testing and common misinterpretations: Should we abandon p-value in forensic science applications?

    Science.gov (United States)

    Taroni, F; Biedermann, A; Bozza, S

    2016-02-01

    Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for taking a decision about the plausibility of competing hypotheses. Comprehensive philosophical comparisons about their advantages and drawbacks are widely available, and debates continue to span the literature. More recently, controversial discussion was initiated by an editorial decision of a scientific journal [1] to refuse any paper submitted for publication containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to expose the discussion of this journal's decision within the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
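As a concrete reminder of what the debated quantity is, here is an exact binomial p-value computed from first principles (a stdlib-only sketch; the coin-tossing example and the minimum-likelihood two-sided convention are my own choices, not the article's):

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Binomial probability mass function."""
    return comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def binomial_p_value(successes: int, n: int, p0: float = 0.5) -> float:
    """Two-sided exact binomial p-value: the total H0-probability of all
    outcomes no more likely than the observed one."""
    obs = binom_pmf(successes, n, p0)
    return sum(pr for k in range(n + 1)
               if (pr := binom_pmf(k, n, p0)) <= obs + 1e-12)

# 16 heads in 20 tosses of a supposedly fair coin:
print(round(binomial_p_value(16, 20), 4))  # 0.0118
```

The p-value is the probability, computed under H0, of data at least as extreme as those observed. It is not the probability that H0 is true, which is precisely the misinterpretation the article discusses.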

  1. Surveillance test interval optimization

    International Nuclear Information System (INIS)

    Cepin, M.; Mavko, B.

    1995-01-01

    Technical specifications have been developed on the basis of deterministic analyses, engineering judgment, and expert opinion. This paper introduces our risk-based approach to surveillance test interval (STI) optimization. This approach consists of three main levels. The first level is the component level, which serves as a rough estimation of the optimal STI and can be calculated analytically by differentiating an expression for the mean unavailability. The second and third levels give more representative results. They take into account the results of probabilistic risk assessment (PRA) calculated by a personal computer (PC) based code and are based on system unavailability at the system level and on core damage frequency at the plant level.
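The component-level calculation can be sketched with the usual textbook unavailability model (an assumption of this sketch, not taken from the paper, whose PRA-based second and third levels are not reproduced): undetected failures accumulate between tests at rate λ, and each test itself takes the component out of service for a time τ.

```python
import math

def mean_unavailability(T: float, lam: float, tau: float) -> float:
    """Component-level mean unavailability for a standby component tested
    every T hours: failures accumulate between tests (lam*T/2 on average)
    and the component is out of service for tau hours during each test."""
    return lam * T / 2.0 + tau / T

def optimal_sti(lam: float, tau: float) -> float:
    """Setting dU/dT = lam/2 - tau/T^2 = 0 gives T* = sqrt(2*tau/lam)."""
    return math.sqrt(2.0 * tau / lam)

lam = 1.0e-5   # standby failure rate per hour (illustrative)
tau = 2.0      # outage per test, hours (illustrative)
T_star = optimal_sti(lam, tau)
print(T_star)  # about 632 hours, i.e. roughly a monthly test
```

Testing more often than T* wastes availability on test outages; testing less often leaves too many undetected failures, which is why the mean unavailability has an interior minimum.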

  2. Test of the hypothesis; a lymphoma stem cells exist which is capable of self-renewal

    DEFF Research Database (Denmark)

    Kjeldsen, Malene Krag

      Test of the hypothesis; a lymphoma stem cell exists which is capable of self-renewal   Malene Krag Pedersen, Karen Dybkaer, Hans E. Johnsen   The Research Laboratory, Department of Haematology, Aalborg Hospital, Århus University   Failure of current therapeutics in the treatment of diffuse large B...... and sustaining cells(1-3). My project is based on studies of stem and early progenitor cells in lymphoid cell lines from patients with advanced DLBCL. The cell lines are widely recognised and generously provided by Dr. Hans Messner and colleagues.   Hypothesis and aims: A lymphoma stem and progenitor cell...

  3. A test of the thermal melanism hypothesis in the wingless grasshopper Phaulacridium vittatum.

    Science.gov (United States)

    Harris, Rebecca M; McQuillan, Peter; Hughes, Lesley

    2013-01-01

    Altitudinal clines in melanism are generally assumed to reflect the fitness benefits resulting from thermal differences between colour morphs, yet differences in thermal quality are not always discernible. The intra-specific application of the thermal melanism hypothesis was tested in the wingless grasshopper Phaulacridium vittatum (Sjöstedt) (Orthoptera: Acrididae) first by measuring the thermal properties of the different colour morphs in the laboratory, and second by testing for differences in average reflectance and spectral characteristics of populations along 14 altitudinal gradients. Correlations between reflectance, body size, and climatic variables were also tested to investigate the underlying causes of clines in melanism. Melanism in P. vittatum represents a gradation in colour rather than distinct colour morphs, with reflectance ranging from 2.49 to 5.65%. In unstriped grasshoppers, darker morphs warmed more rapidly than lighter morphs and reached a higher maximum temperature (lower temperature excess). In contrast, significant differences in thermal quality were not found between the colour morphs of striped grasshoppers. In support of the thermal melanism hypothesis, grasshoppers were, on average, darker at higher altitudes, there were differences in the spectral properties of brightness and chroma between high and low altitudes, and temperature variables were significant influences on the average reflectance of female grasshoppers. However, altitudinal gradients do not represent predictable variation in temperature, and the relationship between melanism and altitude was not consistent across all gradients. Grasshoppers generally became darker at altitudes above 800 m a.s.l., but on several gradients reflectance declined with altitude and then increased at the highest altitude.

  4. A Bayesian Optimal Design for Sequential Accelerated Degradation Testing

    Directory of Open Access Journals (Sweden)

    Xiaoyang Li

    2017-07-01

    Full Text Available When optimizing an accelerated degradation testing (ADT) plan, the initial values of unknown model parameters must be pre-specified. However, it is usually difficult to obtain the exact values, since many uncertainties are embedded in these parameters. Bayesian ADT optimal design was presented to address this problem by using prior distributions to capture these uncertainties. Nevertheless, when the difference between a prior distribution and the actual situation is large, the existing Bayesian optimal design might cause over-testing or under-testing issues. For example, the implemented ADT following the optimal ADT plan consumes too many testing resources, or too few accelerated degradation data are obtained during the ADT. To overcome these obstacles, a Bayesian sequential step-down-stress ADT design is proposed in this article. During the sequential ADT, the test under the highest stress level is first conducted based on the initial prior information to quickly generate degradation data. Then, the data collected under higher stress levels are employed to construct the prior distributions for the test design under lower stress levels by using Bayesian inference. In the process of optimization, the inverse Gaussian (IG) process is assumed to describe the degradation paths, and Bayesian D-optimality is selected as the optimization objective. A case study on an electrical connector's ADT plan is provided to illustrate the application of the proposed Bayesian sequential ADT design method. Compared with the results from a typical static Bayesian ADT plan, the proposed design could guarantee more stable and precise estimations of different reliability measures.

  5. Life Shocks and Crime: A Test of the “Turning Point” Hypothesis

    Science.gov (United States)

    Noonan, Kelly; Reichman, Nancy E.; Schwartz-Soicher, Ofira

    2012-01-01

    Other researchers have posited that important events in men’s lives—such as employment, marriage, and parenthood—strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant’s father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the “turning point” hypothesis. PMID:21660628

  6. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted...... locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated... a microscope, and we show how the method can handle transparent particles with significant glare points. The method generalizes to other problems. This is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation......
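The outlier view of segmentation can be sketched as a per-pixel hypothesis test against an estimated background distribution. This is a stdlib-only toy, not the paper's method: the 1-D "image", the Gaussian background model, and the fixed alpha are my simplifications.

```python
import math
import random

def segment_outliers(pixels, alpha=0.001):
    """Flag pixels that are improbable under an estimated Gaussian background:
    each pixel is a small hypothesis test, and a pixel is foreground when its
    background z-score exceeds the (1 - alpha) normal quantile."""
    mu = sum(pixels) / len(pixels)
    sd = math.sqrt(sum((p - mu) ** 2 for p in pixels) / len(pixels))
    # (1 - alpha) standard normal quantile by bisection on the CDF.
    lo, hi = 0.0, 10.0
    while hi - lo > 1e-9:
        mid = (lo + hi) / 2.0
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    z_crit = (lo + hi) / 2.0
    return [(p - mu) / sd > z_crit for p in pixels]

# Toy "image": dark Gaussian background plus a few bright particle pixels.
rng = random.Random(2)
img = [rng.gauss(10.0, 1.0) for _ in range(10_000)] + [30.0, 32.0, 35.0]
mask = segment_outliers(img)
print(sum(mask))  # roughly alpha * N false alarms plus the 3 bright pixels
```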

  7. Quantum chi-squared and goodness of fit testing

    Energy Technology Data Exchange (ETDEWEB)

    Temme, Kristan [IQIM, California Institute of Technology, Pasadena, California 91125 (United States); Verstraete, Frank [Fakultät für Physik, Universität Wien, Boltzmanngasse 5, 1090 Wien, Austria and Faculty of Science, Ghent University, B-9000 Ghent (Belgium)

    2015-01-15

    A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads both to the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.

  8. Testing the junk-food hypothesis on marine birds: Effects of prey type on growth and development

    Science.gov (United States)

    Romano, Marc D.; Piatt, John F.; Roby, D.D.

    2006-01-01

    The junk-food hypothesis attributes declines in productivity of marine birds and mammals to changes in the species of prey they consume and corresponding differences in nutritional quality of those prey. To test this hypothesis nestling Black-legged Kittiwakes (Rissa tridactyla) and Tufted Puffins (Fratercula cirrhata) were raised in captivity under controlled conditions to determine whether the type and quality of fish consumed by young seabirds constrains their growth and development. Some nestlings were fed rations of Capelin (Mallotus villosus), Herring (Clupea pallasi) or Sand Lance (Ammodytes hexapterus) and their growth was compared with nestlings raised on equal biomass rations of Walleye Pollock (Theragra chalcograma). Nestlings fed rations of herring, sand lance, or capelin experienced higher growth increments than nestlings fed pollock. The energy density of forage fish fed to nestlings had a marked effect on growth increments and could be expected to have an effect on pre- and post-fledging survival of nestlings in the wild. These results provide empirical support for the junk-food hypothesis.

  9. Testing the EKC hypothesis by considering trade openness, urbanization, and financial development: the case of Turkey.

    Science.gov (United States)

    Ozatac, Nesrin; Gokmenoglu, Korhan K; Taspinar, Nigar

    2017-07-01

    This study investigates the environmental Kuznets curve (EKC) hypothesis for the case of Turkey from 1960 to 2013 by considering energy consumption, trade, urbanization, and financial development variables. Although previous literature examines various aspects of the EKC hypothesis for the case of Turkey, our model augments the basic model with several covariates to develop a better understanding of the relationship among the variables and to refrain from omitted variable bias. The results of the bounds test and the error correction model under autoregressive distributed lag mechanism suggest long-run relationships among the variables as well as proof of the EKC and the scale effect in Turkey. A conditional Granger causality test reveals that there are causal relationships among the variables. Our findings can have policy implications including the imposition of a "polluter pays" mechanism, such as the implementation of a carbon tax for pollution trading, to raise the urban population's awareness about the importance of adopting renewable energy and to support clean, environmentally friendly technology.

  10. Testing optimization sequence for the beam port facility of PSBR

    International Nuclear Information System (INIS)

    Bekar, K.B.; Azmy, Y.Y.; Unlu, K.

    2005-01-01

    We present preliminary testing results of the modular code package prepared for the size and shape optimization of the beam tube device of the beam port facility at the Penn State Breazeale Reactor (PSBR). In the test cases, using the Min-max algorithm as an optimizer and the multidimensional neutral particle transport code TORT as the transport solver in the physics calculation, we optimize the shape of the D2O moderator of the beam tube device. We illustrate the modular nature of the optimization package, validation tests of the physics model, and a preliminary optimization calculation via the whole code package. Results obtained so far indicate the drum-shaped D2O moderator tank is over-designed in size and does not possess the almost hemi-spherical optimal shape computed by our new package. (authors)

  11. Time-Optimal Real-Time Test Case Generation using UPPAAL

    DEFF Research Database (Denmark)

    Hessel, Anders; Larsen, Kim Guldstrand; Nielsen, Brian

    2004-01-01

Testing is the primary software validation technique used by industry today, but remains ad hoc, error prone, and very expensive. A promising improvement is to automatically generate test cases from formal models of the system under test. We demonstrate how to automatically generate real-time conformance test cases from timed automata specifications. Specifically, we demonstrate how to efficiently generate real-time test cases with optimal execution time, i.e., test cases that are the fastest possible to execute. Our technique allows time-optimal test cases to be generated using manually formulated test purposes or generated automatically from various coverage criteria of the model.

  12. Personality and Behavior in Social Dilemmas: Testing the Situational Strength Hypothesis and the Role of Hypothetical Versus Real Incentives.

    Science.gov (United States)

    Lozano, José H

    2016-02-01

    Previous research aimed at testing the situational strength hypothesis suffers from serious limitations regarding the conceptualization of strength. In order to overcome these limitations, the present study attempts to test the situational strength hypothesis based on the operationalization of strength as reinforcement contingencies. One dispositional factor of proven effect on cooperative behavior, social value orientation (SVO), was used as a predictor of behavior in four social dilemmas with varying degree of situational strength. The moderating role of incentive condition (hypothetical vs. real) on the relationship between SVO and behavior was also tested. One hundred undergraduates were presented with the four social dilemmas and the Social Value Orientation Scale. One-half of the sample played the social dilemmas using real incentives, whereas the other half used hypothetical incentives. Results supported the situational strength hypothesis in that no behavioral variability and no effect of SVO on behavior were found in the strongest situation. However, situational strength did not moderate the effect of SVO on behavior in situations where behavior showed variability. No moderating effect was found for incentive condition either. The implications of these results for personality theory and assessment are discussed. © 2014 Wiley Periodicals, Inc.

  13. A test of the symbol interdependency hypothesis with both concrete and abstract stimuli

    Science.gov (United States)

    Buchanan, Lori

    2018-01-01

    In Experiment 1, the symbol interdependency hypothesis was tested with both concrete and abstract stimuli. Symbolic (i.e., semantic neighbourhood distance) and embodied (i.e., iconicity) factors were manipulated in two tasks—one that tapped symbolic relations (i.e., semantic relatedness judgment) and another that tapped embodied relations (i.e., iconicity judgment). Results supported the symbol interdependency hypothesis in that the symbolic factor was recruited for the semantic relatedness task and the embodied factor was recruited for the iconicity task. Across tasks, and especially in the iconicity task, abstract stimuli resulted in shorter RTs. This finding was in contrast to the concreteness effect where concrete words result in shorter RTs. Experiment 2 followed up on this finding by replicating the iconicity task from Experiment 1 in an ERP paradigm. Behavioural results continued to show a reverse concreteness effect with shorter RTs for abstract stimuli. However, ERP results paralleled the N400 and anterior N700 concreteness effects found in the literature, with more negative amplitudes for concrete stimuli. PMID:29590121

  15. Analysis and optimization of blood-testing procedures.

    NARCIS (Netherlands)

    Bar-Lev, S.K.; Boxma, O.J.; Perry, D.; Vastazos, L.P.

    2017-01-01

    This paper is devoted to the performance analysis and optimization of blood testing procedures. We present a queueing model of two queues in series, representing the two stages of a blood-testing procedure. Service (testing) in stage 1 is performed in batches, whereas it is done individually in
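The record above is truncated, but it describes batch testing in stage 1 of a blood-testing procedure. The paper's own two-queue model is not given here; the classic Dorfman pooled-testing calculation below merely illustrates why batching helps and how a batch size can be optimized. The prevalence values and the search bound `k_max` are illustrative choices, not values from the paper.

```python
def expected_tests_per_sample(p: float, k: int) -> float:
    """Dorfman pooling: one pooled test per batch of k samples, plus k
    individual retests if the pool is positive, which happens with
    probability 1 - (1 - p)**k at per-sample prevalence p."""
    if k == 1:
        return 1.0
    return 1.0 / k + (1.0 - (1.0 - p) ** k)

def best_batch_size(p: float, k_max: int = 50) -> int:
    """Batch size minimizing the expected number of tests per sample."""
    return min(range(1, k_max + 1), key=lambda k: expected_tests_per_sample(p, k))

if __name__ == "__main__":
    for p in (0.01, 0.05, 0.10):
        k = best_batch_size(p)
        print(f"prevalence {p}: batch size {k}, "
              f"{expected_tests_per_sample(p, k):.3f} tests per sample")
```

At low prevalence the optimal batch grows and the per-sample workload drops well below one test, which is the intuition behind batching the first stage.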

  16. Testing the ‘Residential Rootedness’-Hypothesis of Self-Employment for Germany and the UK (discussion paper)

    NARCIS (Netherlands)

    Reuschke, D.; Van Ham, M.

    2011-01-01

Based on the notion that entrepreneurship is a 'local event', the literature argues that self-employed workers and entrepreneurs are 'rooted' in place. This paper tests the 'residential rootedness'-hypothesis of self-employment by examining for Germany and the UK whether the self-employed are less

  17. Robust Means Modeling: An Alternative for Hypothesis Testing of Independent Means under Variance Heterogeneity and Nonnormality

    Science.gov (United States)

    Fan, Weihua; Hancock, Gregory R.

    2012-01-01

    This study proposes robust means modeling (RMM) approaches for hypothesis testing of mean differences for between-subjects designs in order to control the biasing effects of nonnormality and variance inequality. Drawing from structural equation modeling (SEM), the RMM approaches make no assumption of variance homogeneity and employ robust…
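The RMM record above is truncated, but its motivation, testing mean differences without assuming variance homogeneity, is shared by the more familiar Welch unequal-variance t-test. The NumPy-only sketch below implements that baseline; it is not the SEM-based RMM approach itself, and the sample arrays in the usage are invented.

```python
import numpy as np

def welch_t(x, y):
    """Welch's t statistic and Satterthwaite-Welch degrees of freedom.
    No pooled variance is used, so unequal group variances do not bias
    the test the way the classical equal-variance t-test can."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    vx, vy = x.var(ddof=1) / x.size, y.var(ddof=1) / y.size
    t = (x.mean() - y.mean()) / np.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx ** 2 / (x.size - 1) + vy ** 2 / (y.size - 1))
    return t, df

t, df = welch_t([1, 2, 3, 4], [2, 4, 6, 8, 10])
print(f"t = {t:.3f} on {df:.2f} degrees of freedom")
```

The statistic is then referred to a t distribution with the (generally non-integer) `df` degrees of freedom rather than `n1 + n2 - 2`.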

  18. Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?

    Science.gov (United States)

    Weinberg, Clarice R.

    2017-01-01

In the accompanying article (Am J Epidemiol. 2017;186(6):646–647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our "culture" of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls "innovative" research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. PMID:28938713

  19. Optimal design of constant-stress accelerated degradation tests using the M-optimality criterion

    International Nuclear Information System (INIS)

    Wang, Han; Zhao, Yu; Ma, Xiaobing; Wang, Hongyu

    2017-01-01

    In this paper, we propose the M-optimality criterion for designing constant-stress accelerated degradation tests (ADTs). The newly proposed criterion concentrates on the degradation mechanism equivalence rather than evaluation precision or prediction accuracy which is usually considered in traditional optimization criteria. Subject to the constraints of total sample number, test termination time as well as the stress region, an optimum constant-stress ADT plan is derived by determining the combination of stress levels and the number of samples allocated to each stress level, when the degradation path comes from inverse Gaussian (IG) process model with covariates and random effects. A numerical example is presented to verify the robustness of our proposed optimum plan and compare its efficiency with other test plans. Results show that, with a slightly relaxed requirement of evaluation precision and prediction accuracy, our proposed optimum plan reduces the dispersion of the estimated acceleration factor between the usage stress level and a higher accelerated stress level, which makes an important contribution to reliability demonstration and assessment tests. - Highlights: • We establish the necessary conditions for degradation mechanism equivalence of ADTs. • We propose the M-optimality criterion for designing constant-stress ADT plans. • The M-optimality plan reduces the dispersion of the estimated accelerated factors. • An electrical connector with its stress relaxation data is used for illustration.

  20. Neuroticism, intelligence, and intra-individual variability in elementary cognitive tasks: testing the mental noise hypothesis.

    Science.gov (United States)

    Colom, Roberto; Quiroga, Ma Angeles

    2009-08-01

Some studies show positive correlations between intraindividual variability in elementary speed measures (reflecting processing efficiency) and individual differences in neuroticism (reflecting instability in behaviour). The so-called neural noise hypothesis assumes that higher levels of noise are related both to smaller indices of processing efficiency and to greater levels of neuroticism. Here, we test this hypothesis by measuring mental speed with three elementary cognitive tasks that tap similar basic processes while systematically varying their content (verbal, numerical, and spatial). Neuroticism and intelligence are also measured. The sample comprised 196 undergraduate psychology students. The results show that (1) processing efficiency is generally unrelated to individual differences in neuroticism, (2) processing speed and efficiency correlate with intelligence, and (3) only the efficiency index is genuinely related to intelligence when the collinearity between speed and efficiency is controlled.

  1. Testing the stress-gradient hypothesis during the restoration of tropical degraded land using the shrub Rhodomyrtus tomentosa as a nurse plant

    Science.gov (United States)

    Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang

    2013-01-01

    The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...

  2. Is intuition really cooperative? Improved tests support the social heuristics hypothesis.

    Science.gov (United States)

    Isler, Ozan; Maule, John; Starmer, Chris

    2018-01-01

    Understanding human cooperation is a major scientific challenge. While cooperation is typically explained with reference to individual preferences, a recent cognitive process view hypothesized that cooperation is regulated by socially acquired heuristics. Evidence for the social heuristics hypothesis rests on experiments showing that time-pressure promotes cooperation, a result that can be interpreted as demonstrating that intuition promotes cooperation. This interpretation, however, is highly contested because of two potential confounds. First, in pivotal studies compliance with time-limits is low and, crucially, evidence shows intuitive cooperation only when noncompliant participants are excluded. The inconsistency of test results has led to the currently unresolved controversy regarding whether or not noncompliant subjects should be included in the analysis. Second, many studies show high levels of social dilemma misunderstanding, leading to speculation that asymmetries in understanding might explain patterns that are otherwise interpreted as intuitive cooperation. We present evidence from an experiment that employs an improved time-pressure protocol with new features designed to induce high levels of compliance and clear tests of understanding. Our study resolves the noncompliance issue, shows that misunderstanding does not confound tests of intuitive cooperation, and provides the first independent experimental evidence for intuitive cooperation in a social dilemma using time-pressure.

  3. Ion microprobe analyses of aluminous lunar glasses - A test of the 'rock type' hypothesis

    Science.gov (United States)

    Meyer, C., Jr.

    1978-01-01

    Previous soil survey investigations found that there are natural groupings of glass compositions in lunar soils and that the average major element composition of some of these groupings is the same at widely separated lunar landing sites. This led soil survey enthusiasts to promote the hypothesis that the average composition of glass groupings represents the composition of primary lunar 'rock types'. In this investigation the trace element composition of numerous aluminous glass particles was determined by the ion microprobe method as a test of the above mentioned 'rock type' hypothesis. It was found that within any grouping of aluminous lunar glasses by major element content, there is considerable scatter in the refractory trace element content. In addition, aluminous glasses grouped by major elements were found to have different average trace element contents at different sites (Apollo 15, 16 and Luna 20). This evidence argues that natural groupings in glass compositions are determined by regolith processes and may not represent the composition of primary lunar 'rock types'.

  4. Visual working memory and number sense: Testing the double deficit hypothesis in mathematics.

    Science.gov (United States)

    Toll, Sylke W M; Kroesbergen, Evelyn H; Van Luit, Johannes E H

    2016-09-01

    Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. The aim of this study was to test the DD hypothesis within a longitudinal time span of 2 years. A total of 670 children participated. The mean age was 4.96 years at the start of the study and 7.02 years at the end of the study. At the end of the first year of kindergarten, both visual-spatial working memory and number sense were measured by two different tasks. At the end of first grade, mathematical performance was measured with two tasks, one for math facts and one for math problems. Multiple regressions revealed that both visual working memory and symbolic number sense are predictors of mathematical performance in first grade. Symbolic number sense appears to be the strongest predictor for both math areas (math facts and math problems). Non-symbolic number sense only predicts performance in math problems. Multivariate analyses of variance showed that a combination of visual working memory and number sense deficits (NSDs) leads to the lowest performance on mathematics. Our DD hypothesis was confirmed. Both visual working memory and symbolic number sense in kindergarten are related to mathematical performance 2 years later, and a combination of visual working memory and NSDs leads to low performance in mathematical performance. © 2016 The British Psychological Society.

  5. Recurrence network measures for hypothesis testing using surrogate data: Application to black hole light curves

    Science.gov (United States)

    Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2018-01-01

    Recurrence networks and the associated statistical measures have become important tools in the analysis of time series data. In this work, we test how effective the recurrence network measures are in analyzing real world data involving two main types of noise, white noise and colored noise. We use two prominent network measures as discriminating statistic for hypothesis testing using surrogate data for a specific null hypothesis that the data is derived from a linear stochastic process. We show that the characteristic path length is especially efficient as a discriminating measure with the conclusions reasonably accurate even with limited number of data points in the time series. We also highlight an additional advantage of the network approach in identifying the dimensionality of the system underlying the time series through a convergence measure derived from the probability distribution of the local clustering coefficients. As examples of real world data, we use the light curves from a prominent black hole system and show that a combined analysis using three primary network measures can provide vital information regarding the nature of temporal variability of light curves from different spectroscopic classes.
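The record above uses the characteristic path length of a recurrence network as a discriminating statistic for surrogate-data testing. A minimal sketch of those two steps might look like the following; for simplicity it links raw scalar values within a fixed threshold `eps` (no delay embedding, no adaptive recurrence rate, both simplifications relative to standard practice) and averages BFS shortest-path lengths over connected pairs.

```python
import numpy as np
from collections import deque

def recurrence_network(x, eps):
    """Adjacency matrix of the recurrence network: nodes are time points,
    linked when their states lie within eps of each other; self-links removed."""
    x = np.asarray(x, float)
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps) & ~np.eye(x.size, dtype=bool)

def characteristic_path_length(a):
    """Average shortest-path length over all connected ordered node pairs,
    computed by breadth-first search from every node."""
    n = a.shape[0]
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.flatnonzero(a[u]):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for node, d in dist.items() if node != s)
        pairs += len(dist) - 1
    return total / pairs if pairs else float("inf")

a = recurrence_network([0.0, 1.0, 2.0, 3.0], eps=1.0)
print(characteristic_path_length(a))
```

In a surrogate test, this statistic would be computed for the data and for an ensemble of surrogates, and the null hypothesis rejected when the data's value falls outside the surrogate distribution.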

  6. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

Background: Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation: We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting a premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing: Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications: It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  7. Efficient Market Hypothesis in South Africa: Evidence from Linear and Nonlinear Unit Root Tests

    Directory of Open Access Journals (Sweden)

    Andrew Phiri

    2015-12-01

This study investigates the weak-form efficient market hypothesis (EMH) for five generalized stock indices in the Johannesburg Stock Exchange (JSE) using weekly data collected from 31st January 2000 to 16th December 2014. In particular, we test for weak-form market efficiency using a battery of linear and nonlinear unit root testing procedures comprising the classical augmented Dickey-Fuller (ADF) tests, the two-regime threshold autoregressive (TAR) unit root tests described in Enders and Granger (1998), as well as the three-regime unit root tests described in Bec, Salem, and Carrasco (2004). Based on our empirical analysis, we are able to demonstrate that whilst the linear unit root tests advocate for unit roots within the time series, the nonlinear unit root tests suggest that most stock indices are threshold stationary processes. These results bridge two opposing contentions obtained from previous studies by concluding that under a linear framework the JSE stock indices offer support in favour of weak-form market efficiency, whereas when nonlinearity is accounted for, a majority of the indices violate the weak-form EMH.
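As a point of reference for the linear tests mentioned above, the basic (non-augmented) Dickey-Fuller regression can be sketched in a few lines of NumPy. This is the textbook version with a constant, not the TAR tests of the study; the quoted critical value is the usual large-sample 5% value for that model, and the AR(1) coefficient, series length, and seed are arbitrary illustration choices.

```python
import numpy as np

def dickey_fuller(y):
    """t statistic for rho in the regression dy_t = c + rho * y_{t-1} + e_t.
    Values well below roughly -2.86 (5% level, model with constant) reject
    the unit-root null in favour of stationarity."""
    y = np.asarray(y, float)
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (dy.size - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS coefficient covariance
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(500))      # unit root: should not reject
ar1 = np.zeros(500)                             # stationary AR(1): should reject
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.standard_normal()

print("random walk t:", dickey_fuller(walk))
print("AR(1) t:", dickey_fuller(ar1))
```

Note that the statistic does not follow a standard t distribution under the null, which is why the Dickey-Fuller critical values differ from the usual ones.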

  8. Monte Carlo, hypothesis-tests for rare events superimposed on a background

    International Nuclear Information System (INIS)

    Avignone, F.T. III; Miley, H.S.; Padgett, W.J.; Weier, D.W.

    1985-01-01

We describe two techniques to search for small numbers of counts under a peak of known shape and superimposed on a background with statistical fluctuations. Many comparisons of a single experimental spectrum with computer simulations of the peak and background are made. From these we calculate the probability that y hypothesized counts in the peaks of the simulations will result in a number larger than that observed in a given energy interval (bin) in the experimental spectrum. This is done for many values of the hypothesized number y. One procedure is very similar to testing a statistical hypothesis and can be applied analytically. Another is presented which is related to pattern recognition techniques and is less sensitive to the uncertainty in the mean. Sample applications to double beta decay data are presented. (orig.)
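The procedure described above, simulating many spectra with y hypothesized signal counts and asking how often the signal-region count exceeds the observed one, can be sketched as follows. This toy version collapses the known peak shape into a single signal-region count with Poisson background, and the rates and observed count are invented for illustration, not taken from the double beta decay application.

```python
import numpy as np

def exclusion_prob(y, b, n_obs, n_sim=100_000, rng=None):
    """Fraction of simulated spectra, each with Poisson(b) background counts
    plus Poisson(y) hypothesized signal counts in the signal region, whose
    total exceeds the observed count n_obs. A value near 1 means the
    hypothesized signal strength y is strongly disfavored by the data."""
    rng = rng or np.random.default_rng(1)
    sim = rng.poisson(b, n_sim) + rng.poisson(y, n_sim)
    return float(np.mean(sim > n_obs))

# scan hypothesized signal strengths against an observed count of 12
for y in (0, 5, 10, 20):
    print(y, exclusion_prob(y, b=10.0, n_obs=12))
```

Scanning y until the exclusion probability crosses a chosen confidence level gives an upper limit on the signal, mirroring how such Monte Carlo limits are typically quoted.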

  9. Stochastic optimization of laboratory test workflow at metallurgical testing centers

    Directory of Open Access Journals (Sweden)

    F. Tošenovský

    2016-10-01

The objective of the paper is to present a way to shorten the time required to perform laboratory tests of materials in metallurgy. The paper finds a relation between the time to perform a test of materials and the number of technicians carrying out the test. The relation can be used to optimize the number of technicians. The approach is based on probability theory, as the amount of material to be tested is unknown in advance, and uses powerful modelling techniques involving generalized estimating equations.
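The record above relates testing time to the number of technicians via a stochastic model. The paper's generalized-estimating-equations approach is not reproduced here; instead, the sketch below illustrates the same staffing trade-off with a textbook M/M/c queue, where lambda is the sample arrival rate, mu is one technician's testing rate, and the Erlang C formula gives the probability an arriving sample must wait. All numbers are hypothetical.

```python
from math import factorial

def erlang_c(c, a):
    """Erlang C: probability an arrival waits in an M/M/c queue with
    offered load a = lambda / mu, valid for a < c."""
    num = a ** c / factorial(c) * c / (c - a)
    den = sum(a ** k / factorial(k) for k in range(c)) + num
    return num / den

def mean_turnaround(lam, mu, c):
    """Mean time a sample spends in the lab (queueing delay + testing)."""
    a = lam / mu
    if a >= c:
        return float("inf")   # too few technicians: the queue grows without bound
    return erlang_c(c, a) / (c * mu - lam) + 1 / mu

# lambda = 8 samples/h, each technician tests mu = 3 samples/h
for c in (3, 4, 5, 6):
    print(c, "technicians ->", mean_turnaround(8.0, 3.0, c), "h per sample")
```

The marginal gain from each extra technician shrinks quickly once the load per server is comfortably below one, which is the shape of trade-off such staffing optimizations exploit.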

  10. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    Science.gov (United States)

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.

  11. Testing the Developmental Origins of Health and Disease Hypothesis for Psychopathology Using Family-Based Quasi-Experimental Designs

    Science.gov (United States)

    D’Onofrio, Brian M.; Class, Quetzal A.; Lahey, Benjamin B.; Larsson, Henrik

    2014-01-01

    The Developmental Origin of Health and Disease (DOHaD) hypothesis is a broad theoretical framework that emphasizes how early risk factors have a causal influence on psychopathology. Researchers have raised concerns about the causal interpretation of statistical associations between early risk factors and later psychopathology because most existing studies have been unable to rule out the possibility of environmental and genetic confounding. In this paper we illustrate how family-based quasi-experimental designs can test the DOHaD hypothesis by ruling out alternative hypotheses. We review the logic underlying sibling-comparison, co-twin control, offspring of siblings/twins, adoption, and in vitro fertilization designs. We then present results from studies using these designs focused on broad indices of fetal development (low birth weight and gestational age) and a particular teratogen, smoking during pregnancy. The results provide mixed support for the DOHaD hypothesis for psychopathology, illustrating the critical need to use design features that rule out unmeasured confounding. PMID:25364377

  12. Test of the decaying dark matter hypothesis using the Hopkins Ultraviolet Telescope

    Science.gov (United States)

    Davidsen, A. F.; Kriss, G. A.; Ferguson, H. C.; Blair, W. P.; Bowers, C. W.; Kimble, R. A.

    1991-01-01

Sciama's hypothesis that the dark matter associated with galaxies, galaxy clusters, and the intergalactic medium consists of tau neutrinos of rest mass 28-30 eV, whose decay generates ultraviolet photons of energy roughly 14-15 eV, has been tested using the Hopkins Ultraviolet Telescope flown aboard the Space Shuttle Columbia. A straightforward application of Sciama's model predicts that a spectral line from neutrino decay photons should be observed from the rich galaxy cluster Abell 665 with an SNR of about 30. No such emission was detected. For neutrinos in the mass range 27.2-32.1 eV, the observations set a lower lifetime limit significantly greater than Sciama's model requires.

  13. Effects of arousal on cognitive control: empirical tests of the conflict-modulated Hebbian-learning hypothesis.

    Science.gov (United States)

    Brown, Stephen B R E; van Steenbergen, Henk; Kedar, Tomer; Nieuwenhuis, Sander

    2014-01-01

    An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control, turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  15. Molecular phylogeny of selected species of the order Dinophysiales (Dinophyceae) - testing the hypothesis of a Dinophysioid radiation

    DEFF Research Database (Denmark)

    Jensen, Maria Hastrup; Daugbjerg, Niels

    2009-01-01

    additional information on morphology and ecology to these evolutionary lineages. We have for the first time combined morphological information with molecular phylogenies to test the dinophysioid radiation hypothesis in a modern context. Nuclear-encoded LSU rDNA sequences including domains D1-D6 from 27...

  16. Empirical tests of the Chicago model and the Easterlin hypothesis: a case study of Japan.

    Science.gov (United States)

    Ohbuchi, H

    1982-05-01

The objective of this discussion is to test the applicability of economic theory of fertility with special reference to postwar Japan and to find a clue for forecasting the future trend of fertility. The theories examined are the "Chicago model" and the "Easterlin hypothesis." The major conclusion common among the leading economic theories of fertility, which have their origin with Gary S. Becker (1960, 1965) and Richard A. Easterlin (1966), is the positive income effect, i.e., that the relationship between income and fertility is positive despite the evidence that higher income families have fewer children and that fertility has declined with economic development. To bridge the gap between theory and fact is the primary purpose of the economic theory of fertility, and each offers a different interpretation for it. The point of the Chicago model, particularly of the household decision making model of the "new home economics," is the mechanism that a positive effect of husband's income growth on fertility is offset by a negative price effect caused by the opportunity cost of wife's time. While the opportunity cost of wife's time is independent of the female wage rate for an unemployed wife, it is directly associated with the wage rate for a gainfully employed wife. Thus, the fertility response to female wages occurs only among families with an employed wife. The primary concern of empirical efforts to test the Chicago model has been with the determination of income and price elasticities. An attempt is made to test the relevance of the Chicago model and the Easterlin hypothesis in explaining the fertility movement in postwar Japan. In case of the Chicago model, the statistical results appeared fairly successful but did not match with the theory.
The effect on fertility of a rise in women's real wage (and, therefore in the opportunity cost of mother's time) and of a rise in labor force participation rate of married women of childbearing age in recent years could not

  17. Feasibility of Combining Common Data Elements Across Studies to Test a Hypothesis.

    Science.gov (United States)

    Corwin, Elizabeth J; Moore, Shirley M; Plotsky, Andrea; Heitkemper, Margaret M; Dorsey, Susan G; Waldrop-Valverde, Drenna; Bailey, Donald E; Docherty, Sharron L; Whitney, Joanne D; Musil, Carol M; Dougherty, Cynthia M; McCloskey, Donna J; Austin, Joan K; Grady, Patricia A

    2017-05-01

    The purpose of this article is to describe the outcomes of a collaborative initiative to share data across five schools of nursing in order to evaluate the feasibility of collecting common data elements (CDEs) and developing a common data repository to test hypotheses of interest to nursing scientists. This initiative extended work already completed by the National Institute of Nursing Research CDE Working Group that successfully identified CDEs related to symptoms and self-management, with the goal of supporting more complex, reproducible, and patient-focused research. Two exemplars describing the group's efforts are presented. The first highlights a pilot study wherein data sets from various studies by the represented schools were collected retrospectively, and merging of the CDEs was attempted. The second exemplar describes the methods and results of an initiative at one school that utilized a prospective design for the collection and merging of CDEs. Methods for identifying a common symptom to be studied across schools and for collecting the data dictionaries for the related data elements are presented for the first exemplar. The processes for defining and comparing the concepts and acceptable values, and for evaluating the potential to combine and compare the data elements are also described. Presented next are the steps undertaken in the second exemplar to prospectively identify CDEs and establish the data dictionaries. Methods for common measurement and analysis strategies are included. Findings from the first exemplar indicated that without plans in place a priori to ensure the ability to combine and compare data from disparate sources, doing so retrospectively may not be possible, and as a result hypothesis testing across studies may be prohibited. Findings from the second exemplar, however, indicated that a plan developed prospectively to combine and compare data sets is feasible and conducive to merged hypothesis testing. Although challenges exist in

  18. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
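
    The record describes associating weighted evidence with a hypothesis and using the weighting to report on the hypothesis's accuracy. As a minimal illustration of that general idea (not the patented method), here is a hypothetical log-odds accumulation sketch, where each piece of evidence carries a likelihood ratio (>1 supports, <1 refutes) and an analyst-assigned weight:

```python
import math

def assess_hypothesis(prior: float, evidence: list[tuple[float, float]]) -> float:
    """Combine weighted supporting/refuting evidence into a posterior probability.

    `evidence` holds (likelihood_ratio, weight) pairs: a ratio > 1 supports the
    hypothesis, < 1 refutes it; the weight scales its contribution in log-odds space.
    """
    log_odds = math.log(prior / (1.0 - prior))
    for likelihood_ratio, weight in evidence:
        log_odds += weight * math.log(likelihood_ratio)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Even prior; two supporting indicators, one weak refuting one (values invented).
posterior = assess_hypothesis(0.5, [(3.0, 1.0), (2.0, 0.5), (0.8, 1.0)])
print(round(posterior, 3))
```

    The weighting scheme and numbers are ours; the patent itself does not specify a log-odds formulation.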

  19. Testing the status-legitimacy hypothesis: A multilevel modeling approach to the perception of legitimacy in income distribution in 36 nations.

    Science.gov (United States)

    Caricati, Luca

    2017-01-01

    The status-legitimacy hypothesis was tested by analyzing cross-national data about social inequality. Several indicators were used as indexes of social advantage: social class, personal income, and self-position in the social hierarchy. Moreover, inequality and freedom in nations, as indexed by the Gini coefficient and the human freedom index, were considered. Results from 36 nations worldwide showed no support for the status-legitimacy hypothesis. The perception that income distribution was fair tended to increase as social advantage increased. Moreover, national context increased the difference between advantaged and disadvantaged people in the perception of social fairness: Contrary to the status-legitimacy hypothesis, disadvantaged people were more likely than advantaged people to perceive income differences as too large, and this difference increased in nations with greater freedom and equality. The implications for the status-legitimacy hypothesis are discussed.

  20. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  1. Testing the Rational Expectations Hypothesis on the Retail Trade Sector Using Survey Data from Malaysia

    OpenAIRE

    Puah, Chin-Hong; Chong, Lucy Lee-Yun; Jais, Mohamad

    2011-01-01

    The rational expectations hypothesis states that when people form expectations using the available information, the predicted outcomes usually occur. This study utilized survey data provided by the Business Expectations Survey of Limited Companies to test whether forecasts of the Malaysian retail sector, based on gross revenue and capital expenditures, are rational. The empirical evidence illustrates that decision-makers' expectations in the retail sector are biased and too o...

  2. Testing the Münch hypothesis of long distance phloem transport in plants

    DEFF Research Database (Denmark)

    Knoblauch, Michael; Knoblauch, Jan; Mullendore, Daniel L.

    2016-01-01

    Long distance transport in plants occurs in sieve tubes of the phloem. The pressure flow hypothesis introduced by Ernst Münch in 1930 describes a mechanism of osmotically generated pressure differentials that are supposed to drive the movement of sugars and other solutes in the phloem, but this hypothesis has long faced major challenges. The key issue is whether the conductance of sieve tubes, including sieve plate pores, is sufficient to allow pressure flow. We show that with increasing distance between source and sink, sieve tube conductivity and turgor increase dramatically in Ipomoea nil. Our results provide strong support for the Münch hypothesis, while providing new tools for the investigation of one of the least understood plant tissues.

  3. A test of the cerebellar hypothesis of dyslexia in adequate and inadequate responders to reading intervention.

    Science.gov (United States)

    Barth, Amy E; Denton, Carolyn A; Stuebing, Karla K; Fletcher, Jack M; Cirino, Paul T; Francis, David J; Vaughn, Sharon

    2010-05-01

    The cerebellar hypothesis of dyslexia posits that cerebellar deficits are associated with reading disabilities and may explain why some individuals with reading disabilities fail to respond to reading interventions. We tested these hypotheses in a sample of children who participated in a grade 1 reading intervention study (n = 174) and a group of typically achieving children (n = 62). At posttest, children were classified as adequately responding to the intervention (n = 82), inadequately responding with decoding and fluency deficits (n = 36), or inadequately responding with only fluency deficits (n = 56). Based on the Bead Threading and Postural Stability subtests from the Dyslexia Screening Test-Junior, we found little evidence that assessments of cerebellar functions were associated with academic performance or responder status. In addition, we did not find evidence supporting the hypothesis that cerebellar deficits are more prominent for poor readers with "specific" reading disabilities (i.e., with discrepancies relative to IQ) than for poor readers with reading scores consistent with IQ. In contrast, measures of phonological awareness, rapid naming, and vocabulary were strongly associated with responder status and academic outcomes. These results add to accumulating evidence that fails to associate cerebellar functions with reading difficulties.

  4. Test of the prey-base hypothesis to explain use of red squirrel midden sites by American martens

    Science.gov (United States)

    Dean E. Pearson; Leonard F. Ruggiero

    2001-01-01

    We tested the prey-base hypothesis to determine whether selection of red squirrel (Tamiasciurus hudsonicus) midden sites (cone caches) by American martens (Martes americana) for resting and denning could be attributed to greater abundance of small-mammal prey. Five years of livetrapping at 180 sampling stations in 2 drainages showed that small mammals,...

  5. The Effect of Retention Interval Task Difficulty on Young Children's Prospective Memory: Testing the Intention Monitoring Hypothesis

    Science.gov (United States)

    Mahy, Caitlin E. V.; Moses, Louis J.

    2015-01-01

    The current study examined the impact of retention interval task difficulty on 4- and 5-year-olds' prospective memory (PM) to test the hypothesis that children periodically monitor their intentions during the retention interval and that disrupting this monitoring may result in poorer PM performance. In addition, relations among PM, working memory,…

  6. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g
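
    The power computations the book covers can be illustrated with a short sketch. The following is a generic one-sided one-sample z-test power calculation (a toy example of ours, not taken from the text), where `d` is the standardized effect size:

```python
from statistics import NormalDist

def z_test_power(d: float, n: int, alpha: float = 0.05) -> float:
    """Power of a one-sided one-sample z-test for standardized effect size d."""
    z = NormalDist()
    z_crit = z.inv_cdf(1.0 - alpha)   # critical value under the null
    shift = d * n ** 0.5              # mean of the test statistic under H1
    return 1.0 - z.cdf(z_crit - shift)

def min_n_for_power(d: float, target: float = 0.8, alpha: float = 0.05) -> int:
    """Smallest sample size whose power reaches the target."""
    n = 1
    while z_test_power(d, n, alpha) < target:
        n += 1
    return n

print(round(z_test_power(0.5, 30), 3))   # power for d = 0.5, n = 30
print(min_n_for_power(0.5))              # n needed for 80% power at d = 0.5
```

    The same loop structure extends to the minimum-effect tests the book discusses by shifting the null value away from zero.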

  7. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses

    NARCIS (Netherlands)

    Kuiper, Rebecca M.; Nederhoff, Tim; Klugkist, Irene

    2015-01-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is

  8. Generalist predator, cyclic voles and cavity nests: testing the alternative prey hypothesis.

    Science.gov (United States)

    Pöysä, Hannu; Jalava, Kaisa; Paasivaara, Antti

    2016-12-01

    The alternative prey hypothesis (APH) states that when the density of the main prey declines, generalist predators switch to alternative prey and vice versa, meaning that predation pressure on the alternative prey should be negatively correlated with the density of the main prey. We tested the APH in a system comprising one generalist predator (pine marten, Martes martes), cyclic main prey (microtine voles, Microtus agrestis and Myodes glareolus) and alternative prey (cavity nests of common goldeneye, Bucephala clangula); pine marten is an important predator of both voles and common goldeneye nests. Specifically, we studied whether annual predation rate of real common goldeneye nests and experimental nests is negatively associated with fluctuation in the density of voles in four study areas in southern Finland in 2000-2011. Both vole density and nest predation rate varied considerably between years in all study areas. However, we did not find support for the hypothesis that vole dynamics indirectly affects predation rate of cavity nests in the way predicted by the APH. On the contrary, the probability of predation increased with vole spring abundance for both real and experimental nests. Furthermore, a crash in vole abundance from previous autumn to spring did not increase the probability of predation of real nests, although it increased that of experimental nests. We suggest that learned predation by pine marten individuals, coupled with efficient search image for cavities, overrides possible indirect positive effects of high vole density on the alternative prey in our study system.

  9. Integrated testing strategies can be optimal for chemical risk classification.

    Science.gov (United States)

    Raseta, Marko; Pitchford, Jon; Cussens, James; Doe, John

    2017-08-01

    There is an urgent need to refine strategies for testing the safety of chemical compounds. This need arises both from the financial and ethical costs of animal tests and from the opportunities presented by new in-vitro and in-silico alternatives. Here we explore the mathematical theory underpinning the formulation of optimal testing strategies in toxicology. We show how the costs and imprecisions of the various tests, and the variability in exposures and responses of individuals, can be assembled rationally to form a Markov Decision Problem. We compute the corresponding optimal policies using well-developed theory based on Dynamic Programming, thereby identifying and overcoming some methodological and logical inconsistencies that may exist in current toxicological testing. By illustrating our methods for two simple but readily generalisable examples, we show how so-called integrated testing strategies, where information of different precisions from different sources is combined and where different initial test outcomes lead to different sets of future tests, can arise naturally as optimal policies. Copyright © 2017 Elsevier Inc. All rights reserved.
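
    As a hedged illustration of how an integrated testing strategy can emerge as the optimal policy of a Markov Decision Problem, the following sketch solves a toy two-test, finite-horizon version by dynamic programming over the belief that a chemical is toxic. All costs, sensitivities, and specificities are invented for illustration and are not from the paper:

```python
# Hypothetical costs and test characteristics (illustrative only).
COST = {"cheap": 1.0, "precise": 5.0}     # e.g. in-vitro screen vs animal test
SENS = {"cheap": 0.80, "precise": 0.95}   # P(positive | toxic)
SPEC = {"cheap": 0.70, "precise": 0.95}   # P(negative | safe)
COST_FALSE_SAFE, COST_FALSE_TOXIC = 100.0, 10.0

def update(p, test, positive):
    """Bayesian update of P(toxic) after observing a test outcome."""
    tp = SENS[test] if positive else 1 - SENS[test]
    fp = (1 - SPEC[test]) if positive else SPEC[test]
    return tp * p / (tp * p + fp * (1 - p))

def classify_cost(p):
    """Expected misclassification cost of stopping now with belief p."""
    return min(p * COST_FALSE_SAFE, (1 - p) * COST_FALSE_TOXIC)

def value(p, budget):
    """Minimal expected cost with `budget` tests left (finite-horizon DP)."""
    best = classify_cost(p)               # option: classify immediately
    if budget == 0:
        return best
    for test, cost in COST.items():       # option: run a test, then act optimally
        p_pos = SENS[test] * p + (1 - SPEC[test]) * (1 - p)
        expected = (cost
                    + p_pos * value(update(p, test, True), budget - 1)
                    + (1 - p_pos) * value(update(p, test, False), budget - 1))
        best = min(best, expected)
    return best

# With prior P(toxic) = 0.3, sequential testing beats classifying outright.
print(round(classify_cost(0.3), 2))   # 7.0
print(round(value(0.3, 2), 2))        # 6.32
```

    The "integrated" character shows up in the recursion: which test (if any) is run next depends on the belief produced by earlier outcomes.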

  10. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  11. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia.

    Science.gov (United States)

    Davis, Hayley; Ritchie, Euan G; Avitabile, Sarah; Doherty, Tim; Nimmo, Dale G

    2018-04-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire-age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist, and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis: (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence, (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire, (iii) that species' probability of occurrence or abundance peaks at varying times since fire, and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity.
Our work suggests that encouraging a diversity of fire-ages for enhancing termite species richness in this study region is not necessary.

  12. Prediction of pilot opinion ratings using an optimal pilot model. [of aircraft handling qualities in multiaxis tasks

    Science.gov (United States)

    Hess, R. A.

    1977-01-01

    A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.

  13. Optimal number of tests to achieve and validate product reliability

    International Nuclear Information System (INIS)

    Ahmed, Hussam; Chateauneuf, Alaa

    2014-01-01

    The reliability validation of engineering products and systems is mandatory for choosing the best cost-effective design among a series of alternatives. Decisions at early design stages have a large effect on the overall life cycle performance and cost of products. In this paper, an optimization-based formulation is proposed by coupling the costs of product design and validation testing, in order to ensure the product reliability with the minimum number of tests. This formulation addresses the question of how many tests should be specified through reliability demonstration to validate the product at an appropriate confidence level. The proposed formulation takes into account the product cost, the failure cost and the testing cost. The optimization problem can be considered as a decision-making system according to the hierarchy of structural reliability measures. The numerical examples show the benefit of coupling design and testing parameters. - Highlights: • Coupled formulation for design and testing costs, with lifetime degradation. • Cost-effective testing optimization to achieve reliability target. • Solution procedure for nested aleatoric and epistemic variable spaces

  14. Does Portuguese economy support crude oil conservation hypothesis?

    International Nuclear Information System (INIS)

    Bashiri Behmiri, Niaz; Pires Manso, José R.

    2012-01-01

    This paper examines cointegration relationships and the Granger causality nexus in a trivariate framework among oil consumption, economic growth and the international oil price in Portugal. For this purpose, we employ two Granger causality approaches: the Johansen cointegration test with a vector error correction model (VECM), and the Toda–Yamamoto approach. The cointegration test proves the existence of a long-run equilibrium relationship among these variables, and the VECM and Toda–Yamamoto Granger causality tests indicate that there is bidirectional causality between crude oil consumption and economic growth (the feedback hypothesis). Therefore, the Portuguese economy does not support the crude oil conservation hypothesis. Consequently, policymakers should consider that implementing oil conservation and environmental policies may negatively impact Portuguese economic growth. - Highlights: ► We examine Granger causality among oil consumption, GDP and oil price in Portugal. ► VECM and Toda–Yamamoto tests found bidirectional causality among oil and GDP. ► Portuguese economy does not support the crude oil conservation hypothesis.
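
    The record rests on Granger causality testing. As an illustration of the underlying idea only (a deliberately minimal one-lag version, not the Johansen/VECM or Toda–Yamamoto procedures the paper uses), this pure-Python sketch compares restricted and unrestricted autoregressions on simulated data where x leads y:

```python
import random

def ols_rss(X, y):
    """Residual sum of squares of an OLS fit (normal equations + elimination)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                   # Gaussian elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):         # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(bc * xc for bc, xc in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def granger_f(x, y, lags=1):
    """F statistic for 'x Granger-causes y' with the given lag order."""
    rows = range(lags, len(y))
    restricted = [[1.0] + [y[t - l] for l in range(1, lags + 1)] for t in rows]
    unrestricted = [r + [x[t - l] for l in range(1, lags + 1)]
                    for r, t in zip(restricted, rows)]
    target = y[lags:]
    rss_r = ols_rss(restricted, target)
    rss_u = ols_rss(unrestricted, target)
    df = len(target) - 2 * lags - 1
    return ((rss_r - rss_u) / lags) / (rss_u / df)

# Simulate x leading y by one period: the F statistic should be very large.
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.8 * x[t - 1] + rng.gauss(0, 0.3))
print(granger_f(x, y) > 10.0)   # True
```

    The Toda–Yamamoto refinement the paper applies augments the lag order so the test remains valid for integrated series; that step is omitted here.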

  15. The Random-Walk Hypothesis on the Indian Stock Market

    OpenAIRE

    Ankita Mishra; Vinod Mishra; Russell Smyth

    2014-01-01

    This study tests the random walk hypothesis for the Indian stock market. Using 19 years of monthly data on six indices from the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), this study applies three different unit root tests with two structural breaks to analyse the random walk hypothesis. We find that unit root tests that allow for two structural breaks alone are not able to reject the unit root null; however, a recently developed unit root test that simultaneously accou...

  16. Testing the snake-detection hypothesis: larger early posterior negativity in humans to pictures of snakes than to pictures of other reptiles, spiders and slugs

    OpenAIRE

    Van Strien, Jan W.; Franken, Ingmar H. A.; Huijding, Jorg

    2014-01-01

    According to the snake detection hypothesis (Isbell, 2006), fear specifically of snakes may have pushed evolutionary changes in the primate visual system allowing pre-attentional visual detection of fearful stimuli. A previous study demonstrated that snake pictures, when compared to spider or bird pictures, draw more early attention, as reflected by a larger early posterior negativity (EPN). Here we report two studies that further tested the snake detection hypothesis. In Study 1, we tested whe...

  17. Do the disadvantaged legitimize the social system? A large-scale test of the status-legitimacy hypothesis.

    Science.gov (United States)

    Brandt, Mark J

    2013-05-01

    System justification theory (SJT) posits that members of low-status groups are more likely to see their social systems as legitimate than members of high-status groups because members of low-status groups experience a sense of dissonance between system motivations and self/group motivations (Jost, Pelham, Sheldon, & Sullivan, 2003). The author examined the status-legitimacy hypothesis using data from 3 representative sets of data from the United States (American National Election Studies and General Social Surveys) and throughout the world (World Values Survey; total N across studies = 151,794). Multilevel models revealed that the average effect across years in the United States and countries throughout the world was most often directly contrary to the status-legitimacy hypothesis or was practically zero. In short, the status-legitimacy effect is not a robust phenomenon. Two theoretically relevant moderator variables (inequality and civil liberties) were also tested, revealing weak evidence, null evidence, or contrary evidence to the dissonance-inspired status-legitimacy hypothesis. In sum, the status-legitimacy effect is not robust and is unlikely to be the result of dissonance. These results are used to discuss future directions for research, the current state of SJT, and the interpretation of theoretically relevant but contrary and null results. PsycINFO Database Record (c) 2013 APA, all rights reserved

  18. Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?

    Science.gov (United States)

    Weinberg, Clarice R

    2017-09-15

    In the accompanying article (Am J Epidemiol. 2017;186(6):646-647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our "culture" of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls "innovative" research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  19. Testing a hypothesis of unidirectional hybridization in plants: Observations on Sonneratia, Bruguiera and Ligularia

    Directory of Open Access Journals (Sweden)

    Wu Chung-I

    2008-05-01

    Background: When natural hybridization occurs at sites where the hybridizing species differ in abundance, the pollen load delivered to the rare species should be predominantly from the common species. Previous authors have therefore proposed a hypothesis on the direction of hybridization: interspecific hybrids are more likely to have the female parent from the rare species and the male parent from the common species. We wish to test this hypothesis using data on plant hybridizations both from our own experimentation and from the literature. Results: By examining the maternally inherited chloroplast DNA of 6 cases of F1 hybridization from four genera of plants, we infer unidirectional hybridization in most cases. In all 5 cases where the relative abundance of the parental species deviates from parity, however, the direction is predominantly opposite to the prediction based strictly on numerical abundance. Conclusion: Our results show that the observed direction of hybridization is almost always opposite to the predicted direction based on the relative abundance of the hybridizing species. Several alternative hypotheses, including unidirectional postmating isolation and reinforcement of premating isolation, are discussed.

  20. Testing Fiscal Dominance Hypothesis in a Structural VAR Specification for Pakistan

    Directory of Open Access Journals (Sweden)

    Shaheen Rozina

    2018-03-01

    This research aims to test the fiscal dominance hypothesis for Pakistan through a bivariate structural vector autoregression (SVAR) specification, covering the time period 1977 – 2016. This study employs the real primary deficit (non-interest government expenditures minus total revenues) and real primary liabilities (sum of monetary base and domestic public debt) as indicators of fiscal measures and monetary policy respectively. A structural VAR is retrieved both for the entire sample period and for four sub-periods (1977 – 1986, 1987 – 1997, 1998 – 2008, and 2009 – 2016). This study identifies the presence of fiscal dominance for the entire sample period and for the sub-periods from 1987 – 2008. The estimates reveal an interesting phenomenon: fiscal dominance is significant in the elected regimes and weaker in the military regimes in Pakistan. From a policy perspective, this research suggests increased autonomy of the central bank to achieve long-term price stability and reduced administration costs to ensure an efficient democratic regime in Pakistan.

  1. Test of a hypothesis of realism in quantum theory using a Bayesian approach

    Science.gov (United States)

    Nikitin, N.; Toms, K.

    2017-05-01

    In this paper we propose a time-independent equality and a time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ⁻⟩, while a test of the GHZ equality requires at least three quantum systems in a special state |Ψ_GHZ⟩. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.

  2. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  3. Data mining-based coefficient of influence factors optimization of test paper reliability

    Science.gov (United States)

    Xu, Peiyao; Jiang, Huiping; Wei, Jieyao

    2018-05-01

Testing is a significant part of the teaching process. It demonstrates the final outcome of school teaching through teachers' teaching level and students' scores. The analysis of a test paper is a complex operation characterized by non-linear relations among the paper's length, time duration, and degree of difficulty. It is therefore difficult, with general methods, to optimize the coefficients of the influence factors under different conditions so as to obtain test papers with clearly higher reliability [1]. With data mining techniques such as Support Vector Regression (SVR) and Genetic Algorithms (GA), we can model test paper analysis and optimize the coefficients of the influence factors for higher reliability. The test results show that the combination of SVR and GA yields an effective improvement in reliability. The optimized coefficients of the influence factors are practical in actual application, and the whole optimization procedure can offer a model basis for test paper analysis.
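The GA half of such a pipeline can be sketched as follows. The abstract does not give the fitted SVR model, so a hypothetical closed-form "predicted reliability" surrogate stands in for it; the three coefficients and the peak location are assumptions for illustration.

```python
import random

# Hypothetical surrogate for a fitted SVR model: predicted test-paper
# reliability as a function of three influence-factor coefficients
# (length, duration, difficulty), assumed to peak at (0.4, 0.3, 0.6).
TARGET = (0.4, 0.3, 0.6)

def predicted_reliability(coeffs):
    return 1.0 - sum((x - t) ** 2 for x, t in zip(coeffs, TARGET))

def ga_optimize(fitness, dim=3, pop_size=40, gens=120, seed=7):
    """Elitist GA: truncation selection, one-point crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # keep the fitter half (elitism)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, dim)         # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:              # Gaussian mutation, clamped to [0, 1]
                i = rng.randrange(dim)
                child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0.0, 0.05)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga_optimize(predicted_reliability)
```

The GA recovers coefficients close to the surrogate's assumed optimum; in the real pipeline the fitness call would be a prediction from the trained SVR.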

  4. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    Science.gov (United States)

    2015-12-01

    Publications. Field, A. (2005). Discovering statistics using SPSS (2nd ed.). Thousand Oaks, CA: Sage Publications. Fisher, S. D., Gettys, C. F...therefore, subsequent F statistics are reported using the Huynh-Feldt correction (Greenhouse-Geisser Epsilon > .775). Experienced and inexperienced...change in hypothesis using experience and initial confidence as predictors. In the Dog Day scenario, the regression was not statistically

  5. Test of an Hypothesis of Magnetization, Tilt and Flow in an Hypabyssal Intrusion, Colombian Andes

    Science.gov (United States)

    Muggleton, S.; MacDonald, W. D.; Estrada, J. J.; Sierra, G. M.

    2002-05-01

Magnetic remanence in the Miocene Clavijo intrusion in the Cauca Valley, adjacent to the Cordillera Central, plunges steeply northward (MacDonald et al., 1996). Assuming magnetization in a normal magnetic field, the expected remanence direction is approximately I = 10°, D = 000°; the observed remanence is I = 84°, D = 003°. The discrepancy could be explained by a 74° rotation about a horizontal E-W axis, i.e., about an axis normal to the nearby N-S trending Romeral fault zone. If the intrusion is the shallow feeder of a now-eroded andesitic volcano, then perhaps the paleovertical direction is preserved in flow lineations and provides a test of the tilt/rotation of the remanence. In combination, the steep remanence direction, vertical flow, and the inferred rotation of the volcanic neck lead to the hypothesis of a shallow-plunging southward lineation for this body. Using anisotropy of magnetic susceptibility (AMS) as a proxy for the flow lineation, it is predicted that the K1 (maximum susceptibility) axis in this body plunges gently south. This hypothesis was tested using approximately 50 oriented cores from 5 sites near the west margin of the Clavijo intrusion. The results suggest a NW-plunging lineation, inconsistent with the initial hypothesis. However, a relatively consistent flow lineation is suggested by the K1 axes. If this flow axis represents paleovertical, it suggests moderate tilting of the Clavijo body towards the southeast. The results are encouraging enough to suggest that AMS may be useful for determining paleovertical in shallow volcanic necks and hypabyssal intrusions, and might ultimately be useful in a tilt correction for such bodies. Other implications of the results will be discussed. MacDonald, WD, Estrada, JJ, Sierra, GM, Gonzalez, H, 1996, Late Cenozoic tectonics and paleomagnetism of North Cauca Basin intrusions, Colombian Andes: Dual rotation modes: Tectonophysics, v. 261, p. 277-289.

  6. Optimizing selection of in vitro tests for diagnosing thyroid disorders

    International Nuclear Information System (INIS)

    Zwas, S.T.; Rosenblum, Yossef; Boruchowsky, Sabina

    1987-01-01

The optimal utilization of the thyroid-related radioimmunoassays T3, T4, and TSH-RIA is derived from analysing the clinical and laboratory data of 974 patients with functional thyroid disorders. A statistical computer analysis was designed of the contribution of each of the three tests, alone and in combination, to the final diagnoses of hypothyroid, euthyroid, and hyperthyroid states. The best contributing test for hypothyroidism and euthyroidism was TSH-RIA (98.5% and 93%, respectively). T4/T3+TSH-RIAs were the optimal dual combination for diagnosing euthyroidism (98.0%). For diagnosing hyperthyroidism, T4-RIA was the best single test (82.5%), followed by T3+T4 as an optimal dual combination (95%). Using all three tests was of no significant additional value over dual combinations. It is concluded that the work and cost of routinely performing all three tests is not justified without a clinical basis. An algorithm is proposed to guide thyroid studies based on computer analyses of the above-mentioned single- or dual-test combinations to establish accurate diagnosis at the lowest laboratory cost. (author)

  7. Optimizing Tuberculosis Testing for Basic Laboratories

    Science.gov (United States)

    Ramos, Eric; Schumacher, Samuel G.; Siedner, Mark; Herrera, Beatriz; Quino, Willi; Alvarado, Jessica; Montoya, Rosario; Grandjean, Louis; Martin, Laura; Sherman, Jonathan M.; Gilman, Robert H.; Evans, Carlton A.

    2010-01-01

    Optimal tuberculosis testing usually involves sputum centrifugation followed by broth culture. However, centrifuges are biohazardous and scarce in the resource-limited settings where most tuberculosis occurs. To optimize tuberculosis testing for these settings, centrifugation of 111 decontaminated sputum samples was compared with syringe-aspiration through polycarbonate membrane-filters that were then cultured in broth. To reduce the workload of repeated microscopic screening of broth cultures for tuberculosis growth, the colorimetric redox indicator 2,3-diphenyl-5-(2-thienyl) tetrazolium chloride was added to the broth, which enabled naked-eye detection of culture positivity. This combination of filtration and colorimetric growth-detection gave similar results to sputum centrifugation followed by culture microscopy regarding mean colony counts (43 versus 48; P = 0.6), contamination rates (0.9% versus 1.8%; P = 0.3), and sensitivity (94% versus 95%; P = 0.7), suggesting equivalency of the two methods. By obviating centrifugation and repeated microscopic screening of cultures, this approach may constitute a more appropriate technology for rapid and sensitive tuberculosis diagnosis in basic laboratories. PMID:20889887

  8. Optimizing urine drug testing for monitoring medication compliance in pain management.

    Science.gov (United States)

    Melanson, Stacy E F; Ptolemy, Adam S; Wasan, Ajay D

    2013-12-01

It can be challenging to successfully monitor medication compliance in pain management. Clinicians and laboratorians need to collaborate to optimize patient care and maximize operational efficiency. The test menu, assay cutoffs, and testing algorithms utilized in the urine drug testing panels should be periodically reviewed and tailored to the patient population to effectively assess compliance and avoid unnecessary testing and cost to the patient. Pain management and pathology collaborated on an important quality improvement initiative to optimize urine drug testing for monitoring medication compliance in pain management. We retrospectively reviewed 18 months of data from our pain management center. We gathered data on test volumes, positivity rates, and the frequency of false positive results. We also reviewed the clinical utility of our testing algorithms, assay cutoffs, and adulterant panel. In addition, the cost of each component was calculated. These data, including the positivity rates for ethanol and 3,4-methylenedioxymethamphetamine, allowed us to optimize our testing panel for monitoring medication compliance in pain management and reduce cost. Wiley Periodicals, Inc.

  9. Sex and Class Differences in Parent-Child Interaction: A Test of Kohn's Hypothesis

    Science.gov (United States)

    Gecas, Viktor; Nye, F. Ivan

    1974-01-01

    This paper focuses on Melvin Kohn's suggestive hypothesis that white-collar parents stress the development of internal standards of conduct in their children while blue-collar parents are more likely to react on the basis of the consequences of the child's behavior. This hypothesis was supported. (Author)

  10. A Global comparison of surface soil characteristics across five cities: A test of the urban ecosystem convergence hypothesis.

    Science.gov (United States)

    Richard V. Pouyat; Ian D. Yesilonis; Miklos Dombos; Katalin Szlavecz; Heikki Setala; Sarel Cilliers; Erzsebet Hornung; D. Johan Kotze; Stephanie Yarwood

    2015-01-01

    As part of the Global Urban Soil Ecology and Education Network and to test the urban ecosystem convergence hypothesis, we report on soil pH, organic carbon (OC), total nitrogen (TN), phosphorus (P), and potassium (K) measured in four soil habitat types (turfgrass, ruderal, remnant, and reference) in five metropolitan areas (Baltimore, Budapest,...

  11. Colour vision in ADHD: part 1--testing the retinal dopaminergic hypothesis.

    Science.gov (United States)

    Kim, Soyeon; Al-Haj, Mohamed; Chen, Samantha; Fuller, Stuart; Jain, Umesh; Carrasco, Marisa; Tannock, Rosemary

    2014-10-24

The aim was to test the retinal dopaminergic hypothesis, which posits deficient blue color perception in ADHD resulting from hypofunctioning CNS and retinal dopamine, to which blue cones are exquisitely sensitive. Also, purported sex differences in red color perception were explored. Thirty young adults diagnosed with ADHD and 30 healthy young adults, matched on age and gender, performed a psychophysical task to measure blue and red color saturation and contrast discrimination ability. Visual function measures, such as the Visual Activities Questionnaire (VAQ) and Farnsworth-Munsell 100 hue test (FMT), were also administered. Females with ADHD were less accurate in discriminating blue and red color saturation relative to controls but did not differ in contrast sensitivity. Female control participants were better at discriminating red saturation than males, but no sex difference was present within the ADHD group. Poorer discrimination of red as well as blue color saturation in the female ADHD group may be partly attributable to a hypo-dopaminergic state in the retina, given that color perception (blue-yellow and red-green) is based on input from S-cones (short wavelength cone system) early in the visual pathway. The origin of female superiority in red perception may be rooted in sex-specific functional specialization in hunter-gatherer societies. The absence of this sexual dimorphism for red colour perception in ADHD females warrants further investigation.

  12. Proform-Antecedent Linking in Individuals with Agrammatic Aphasia: A Test of the Intervener Hypothesis.

    Science.gov (United States)

    Engel, Samantha; Shapiro, Lewis P; Love, Tracy

    2018-02-01

To evaluate processing and comprehension of pronouns and reflexives in individuals with agrammatic (Broca's) aphasia and age-matched control participants. Specifically, we evaluate processing and comprehension patterns in terms of a specific hypothesis, the Intervener Hypothesis, which posits that the difficulty of individuals with agrammatic (Broca's) aphasia results from similarity-based interference caused by the presence of an intervening NP between two elements of a dependency chain. We used an eye tracking-while-listening paradigm to investigate real-time processing (Experiment 1) and a sentence-picture matching task to investigate final interpretive comprehension (Experiment 2) of sentences containing proforms in complement phrase and subject relative constructions. Individuals with agrammatic aphasia demonstrated a greater proportion of gazes to the correct referent of reflexives relative to pronouns and significantly greater comprehension accuracy of reflexives relative to pronouns. These results provide support for the Intervener Hypothesis, previous support for which comes from studies of Wh- questions and unaccusative verbs, and we argue that this account provides an explanation for the deficits of individuals with agrammatic aphasia across a growing set of sentence constructions. The current study extends this hypothesis beyond filler-gap dependencies to referential dependencies and allows us to refine the hypothesis in terms of the structural constraints that meet the description of the Intervener Hypothesis.

  13. Optimization models for flight test scheduling

    Science.gov (United States)

    Holian, Derreck

As threats around the world increase with nations developing new generations of warfare technology, the United States is keen on maintaining its position on top of the defense technology curve. This in turn means that the U.S. military/government must research, develop, procure, and sustain new systems in the defense sector to safeguard this position. Currently, the Lockheed Martin F-35 Joint Strike Fighter (JSF) Lightning II is being developed, tested, and deployed to the U.S. military at Low Rate Initial Production (LRIP). The simultaneous act of testing and deployment is due to the contracted procurement process intended to provide a rapid Initial Operating Capability (IOC) release of the 5th Generation fighter. For this reason, many factors go into the determination of what is to be tested, in what order, and at which time due to the military requirements. A certain system or envelope of the aircraft must be assessed prior to releasing that capability into service. The objective of this praxis is to aid in the determination of what testing can be achieved on an aircraft at a point in time. Furthermore, it will define the optimum allocation of test points to aircraft and determine a prioritization of restrictions to be mitigated so that the test program can be best supported. The system described in this praxis has been deployed across the F-35 test program and testing sites. It has discovered hundreds of available test points for an aircraft to fly when it was thought none existed, thus preventing an aircraft from being grounded. Additionally, it has saved hundreds of labor hours and greatly reduced the occurrence of test point reflight. Due to the proprietary nature of the JSF program, details regarding the actual test points, test plans, and all other program specific information have not been presented. Generic, representative data is used for example and proof-of-concept purposes.
Apart from the data correlation algorithms, the optimization associated

  14. Fluorescence Microspectroscopy for Testing the Dimerization Hypothesis of BACE1 Protein in Cultured HEK293 Cells

    Science.gov (United States)

    Gardeen, Spencer; Johnson, Joseph L.; Heikal, Ahmed A.

    2016-06-01

Alzheimer's Disease (AD) is a neurodegenerative disorder that results from the formation of beta-amyloid plaques in the brain that trigger the known symptoms of memory loss in AD patients. The beta-amyloid plaques are formed by the proteolytic cleavage of the amyloid precursor protein (APP) by the proteases BACE1 and gamma-secretase. These enzyme-facilitated cleavages lead to the production of beta-amyloid fragments that aggregate to form plaques, which ultimately lead to neuronal cell death. Recent detergent protein extraction studies suggest that BACE1 protein forms a dimer that has significantly higher catalytic activity than its monomeric counterpart. In this contribution, we examine the dimerization hypothesis of BACE1 in cultured HEK293 cells using complementary fluorescence spectroscopy and microscopy methods. Cells were transfected with a BACE1-EGFP fusion protein construct and imaged using confocal and differential interference contrast microscopy to monitor the localization and distribution of intracellular BACE1. Complementary fluorescence lifetime and anisotropy measurements enabled us to examine the conformational and environmental changes of BACE1 as a function of substrate binding. Using fluorescence correlation spectroscopy, we also quantified the diffusion coefficient of BACE1-EGFP on the plasma membrane as a means to test the dimerization hypothesis as a function of substrate-analog inhibition. Our results represent an important first step toward examining the substrate-mediated dimerization hypothesis of BACE1 in live cells.

  15. Testing the hypothesis of the natural suicide rates: Further evidence from OECD data

    DEFF Research Database (Denmark)

    Andres, Antonio Rodriguez; Halicioglu, Ferda

    2011-01-01

This paper provides further evidence on the hypothesis of the natural rate of suicide using the time series data for 15 OECD countries over the period 1970–2004. This hypothesis suggests that the suicide rate of a society could never be zero even if both the economic and the social conditions were...

  16. PSO Based Optimization of Testing and Maintenance Cost in NPPs

    Directory of Open Access Journals (Sweden)

    Qiang Chou

    2014-01-01

Testing and maintenance activities of safety equipment have drawn much attention in Nuclear Power Plants (NPPs) for risk and cost control. The testing and maintenance activities are often implemented in compliance with the technical specifications and maintenance requirements. Technical specification and maintenance-related parameters, that is, allowed outage time (AOT), maintenance period and duration, and so forth, are associated with the risk level and operating cost of an NPP, both of which need to be minimized. The above problems can be formulated as a constrained multiobjective optimization model, which is widely used in many other engineering problems. Particle swarm optimization (PSO) has proven its capability to solve these kinds of problems. In this paper, we adopt PSO as an optimizer for the multiobjective optimization problem, iteratively trying to improve a candidate solution with regard to a given measure of quality. Numerical results demonstrate the efficiency of the proposed algorithm.
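A minimal PSO loop of the kind adopted above can be sketched as follows. The plant-specific risk/cost objective is not given in the abstract, so a smooth toy surrogate with an assumed optimum (test interval 30 days, maintenance duration 4 h) stands in for it.

```python
import random

# Toy stand-in for the NPP cost/risk objective: a smooth bowl whose minimum
# location is assumed for illustration; the real objective would come from
# plant risk and cost models.
def cost(x):
    return (x[0] - 30.0) ** 2 + 10.0 * (x[1] - 4.0) ** 2

def pso(f, lo, hi, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Standard global-best PSO with inertia w and coefficients c1, c2."""
    rng = random.Random(seed)
    dim = len(lo)
    pos = [[rng.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                 # personal best positions
    gbest = min(pbest, key=f)[:]                # global best position
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi[d], max(lo[d], pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

best = pso(cost, lo=(1.0, 0.5), hi=(90.0, 24.0))
```

The swarm converges to the surrogate's assumed optimum; a multiobjective version would track a set of non-dominated solutions instead of a single global best.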

  17. Eddy current testing probe optimization using a parallel genetic algorithm

    Directory of Open Access Journals (Sweden)

    Dolapchiev Ivaylo

    2008-01-01

This paper uses a parallel version of Michalewicz's Genocop III genetic algorithm (GA) search technique to optimize the coil geometry of an eddy-current non-destructive testing probe (ECTP). The electromagnetic field is computed using the FEMM 2D finite element code. The aim of this optimization was to determine coil dimensions and positions that improve ECTP sensitivity to the physical properties of the tested devices.

  18. Long-term resource variation and group size: A large-sample field test of the Resource Dispersion Hypothesis

    Directory of Open Access Journals (Sweden)

    Morecroft Michael D

    2001-07-01

Background: The Resource Dispersion Hypothesis (RDH) proposes a mechanism for the passive formation of social groups where resources are dispersed, even in the absence of any benefits of group living per se. Despite supportive modelling, it lacks empirical testing. The RDH predicts that, rather than Territory Size (TS) increasing monotonically with Group Size (GS) to account for increasing metabolic needs, TS is constrained by the dispersion of resource patches, whereas GS is independently limited by their richness. We conducted multiple-year tests of these predictions using data from the long-term study of badgers Meles meles in Wytham Woods, England. The study has long failed to identify direct benefits from group living and, consequently, alternative explanations for their large group sizes have been sought. Results: TS was not consistently related to resource dispersion, nor was GS consistently related to resource richness. Results differed according to data groupings and whether territories were mapped using minimum convex polygons or traditional methods. Habitats differed significantly in resource availability, but there was also evidence that food resources may be spatially aggregated within habitat types as well as between them. Conclusions: This is, we believe, the largest ever test of the RDH and builds on the long-term project that initiated part of the thinking behind the hypothesis. Support for the predictions was mixed and depended on year and the method used to map territory borders. We suggest that within-habitat patchiness, as well as model assumptions, should be further investigated for improved tests of the RDH in the future.

  19. Testing the environmental Kuznets curve hypothesis with bird populations as habitat-specific environmental indicators: evidence from Canada.

    Science.gov (United States)

    Lantz, Van; Martínez-Espiñeira, Roberto

    2008-04-01

    The traditional environmental Kuznets curve (EKC) hypothesis postulates that environmental degradation follows an inverted U-shaped relationship with gross domestic product (GDP) per capita. We tested the EKC hypothesis with bird populations in 5 different habitats as environmental quality indicators. Because birds are considered environmental goods, for them the EKC hypothesis would instead be associated with a U-shaped relationship between bird populations and GDP per capita. In keeping with the literature, we included other variables in the analysis-namely, human population density and time index variables (the latter variable captured the impact of persistent and exogenous climate and/or policy changes on bird populations over time). Using data from 9 Canadian provinces gathered over 37 years, we used a generalized least-squares regression for each bird habitat type, which accounted for the panel structure of the data, the cross-sectional dependence across provinces in the residuals, heteroskedasticity, and fixed- or random-effect specifications of the models. We found evidence that supports the EKC hypothesis for 3 of the 5 bird population habitat types. In addition, the relationship between human population density and the different bird populations varied, which emphasizes the complex nature of the impact that human populations have on the environment. The relationship between the time-index variable and the different bird populations also varied, which indicates there are other persistent and significant influences on bird populations over time. Overall our EKC results were consistent with those found for threatened bird species, indicating that economic prosperity does indeed act to benefit some bird populations.
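The U-shape test at the core of such an EKC analysis can be sketched as a quadratic regression: for an environmental good such as a bird population, the hypothesis predicts a positive coefficient on squared income and an interior turning point. The data below are simulated with an assumed turning point at GDP = 25 (arbitrary units); the paper's generalized least-squares panel machinery is omitted.

```python
import random

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the 3x3 normal equations."""
    X = [[1.0, x, x * x] for x in xs]
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(3)]
    for i in range(3):                      # Gauss-Jordan elimination
        p = A[i][i]
        for j in range(3):
            if j != i:
                f = A[j][i] / p
                for k in range(3):
                    A[j][k] -= f * A[i][k]
                b[j] -= f * b[i]
    return [b[i] / A[i][i] for i in range(3)]

# Simulated U-shaped relation: indicator falls then rises with GDP per capita.
rng = random.Random(0)
xs = [rng.uniform(5.0, 50.0) for _ in range(200)]
ys = [(x - 25.0) ** 2 / 10.0 + rng.gauss(0.0, 1.0) for x in xs]

b0, b1, b2 = fit_quadratic(xs, ys)
turning_point = -b1 / (2.0 * b2)            # interior minimum if b2 > 0
```

A positive and significant `b2` together with a turning point inside the observed income range is the pattern the authors interpret as EKC-consistent for an environmental good.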

  20. An experimental test of the habitat-amount hypothesis for saproxylic beetles in a forested region.

    Science.gov (United States)

    Seibold, Sebastian; Bässler, Claus; Brandl, Roland; Fahrig, Lenore; Förster, Bernhard; Heurich, Marco; Hothorn, Torsten; Scheipl, Fabian; Thorn, Simon; Müller, Jörg

    2017-06-01

The habitat-amount hypothesis challenges traditional concepts that explain species richness within habitats, such as the habitat-patch hypothesis, where species number is a function of patch size and patch isolation. It posits that effects of patch size and patch isolation are driven by effects of sample area, and thus that the number of species at a site is basically a function of the total habitat amount surrounding this site. We tested the habitat-amount hypothesis for saproxylic beetles and their habitat of dead wood by using an experiment comprising 190 plots with manipulated patch sizes situated in a forested region with a high variation in habitat amount (i.e., density of dead trees in the surrounding landscape). Although dead wood is a spatio-temporally dynamic habitat, saproxylic insects have life cycles shorter than the time needed for habitat turnover and they closely track their resource. Patch size was manipulated by adding various amounts of downed dead wood to the plots (~800 m³ in total); dead trees in the surrounding landscape (~240 km²) were identified using airborne laser scanning (light detection and ranging). Over 3 yr, 477 saproxylic species (101,416 individuals) were recorded. Considering 20-1,000 m radii around the patches, local landscapes were identified as having a radius of 40-120 m. Both patch size and habitat amount in the local landscapes independently affected species numbers without a significant interaction effect, hence refuting the island effect. Species accumulation curves relative to cumulative patch size were not consistent with either the habitat-patch hypothesis or the habitat-amount hypothesis: several small dead-wood patches held more species than a single large patch with an amount of dead wood equal to the sum of that of the small patches. Our results indicate that conservation of saproxylic beetles in forested regions should primarily focus on increasing the overall amount of dead wood without considering its

  1. A Blind Test of the Younger Dryas Impact Hypothesis.

    Directory of Open Access Journals (Sweden)

    Vance Holliday

The Younger Dryas Impact Hypothesis (YDIH) states that North America was devastated by some sort of extraterrestrial event ~12,800 calendar years before present. Two fundamental questions persist in the debate over the YDIH: Can the results of analyses for purported impact indicators be reproduced? And are the indicators unique to the lower YD boundary (YDB), i.e., ~12.8k cal yrs BP? A test reported here presents the results of analyses that address these questions. Two different labs analyzed identical splits of samples collected at, above, and below the ~12.8 ka zone at the Lubbock Lake archaeological site (LL) in northwest Texas. Both labs reported similar variation in levels of magnetic micrograins (>300 mg/kg at >12.8 ka and <11.5 ka, but <150 mg/kg from 12.8 ka to 11.5 ka). Analysis for magnetic microspheres in one split, reported elsewhere, produced very low to nonexistent levels throughout the section. In the other split, reported here, the levels of magnetic microspherules and nanodiamonds are low or nonexistent at, below, and above the YDB, with the notable exception of a sample <11,500 cal years old. In that sample the claimed impact proxies were recovered at abundances two to four orders of magnitude above those from the other samples. Reproducibility of at least some analyses is problematic. In particular, no standard criteria exist for identification of magnetic spheres. Moreover, the purported impact proxies are not unique to the YDB.

  2. A test of the massive binary black hole hypothesis - Arp 102B

    Science.gov (United States)

    Helpern, J. P.; Filippenko, Alexei V.

    1988-01-01

    The emission-line spectra of several AGN have broad peaks which are significantly displaced in velocity with respect to the host galaxy. An interpretation of this effect in terms of orbital motion of a binary black hole predicts periods of a few centuries. It is pointed out here that recent measurements of the masses and sizes of many low-luminosity AGN imply orbital periods much shorter than this. In particular, it is found that the elliptical galaxy Arp 102B is the most likely candidate for observation of radial velocity variations; its period is expected to be about 3 yr. The H-alpha line profile of Arp 102B has been measured for 5 yr without detecting any change in velocity, and it is thus found that a rather restrictive observational test of the massive binary black hole hypothesis already exists, albeit for this one object.

  3. A single test for rejecting the null hypothesis in subgroups and in the overall sample.

    Science.gov (United States)

    Lin, Yunzhi; Zhou, Kefei; Ganju, Jitendra

    2017-01-01

    In clinical trials, some patient subgroups are likely to demonstrate larger effect sizes than other subgroups. For example, the effect size, or informally the benefit with treatment, is often greater in patients with a moderate condition of a disease than in those with a mild condition. A limitation of the usual method of analysis is that it does not incorporate this ordering of effect size by patient subgroup. We propose a test statistic which supplements the conventional test by including this information and simultaneously tests the null hypothesis in pre-specified subgroups and in the overall sample. It results in more power than the conventional test when the differences in effect sizes across subgroups are at least moderately large; otherwise it loses power. The method involves combining p-values from models fit to pre-specified subgroups and the overall sample in a manner that assigns greater weight to subgroups in which a larger effect size is expected. Results are presented for randomized trials with two and three subgroups.
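A common way to realize such a weighted combination is Stouffer's z-score method with subgroup weights (a sketch; the paper's exact statistic is not reproduced here, and the p-values and weights below are illustrative).

```python
import math
from statistics import NormalDist

def weighted_stouffer(pvals, weights):
    """Combine one-sided p-values with Stouffer's weighted z-score method."""
    nd = NormalDist()
    zs = [nd.inv_cdf(1.0 - p) for p in pvals]      # each p-value -> z-score
    z = sum(w * zi for w, zi in zip(weights, zs))
    z /= math.sqrt(sum(w * w for w in weights))    # normalize the weighted sum
    return 1.0 - nd.cdf(z)                         # back to a combined p-value

# Hypothetical one-sided p-values from a mild-condition subgroup, a
# moderate-condition subgroup (larger expected effect, hence larger weight),
# and the overall sample.
p_combined = weighted_stouffer([0.20, 0.01, 0.04], weights=[1.0, 2.0, 1.5])
```

Up-weighting the subgroup with the larger expected effect is exactly the mechanism by which such a test gains power when the assumed ordering holds, and loses it when it does not.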

  4. How organisms do the right thing: The attractor hypothesis

    Science.gov (United States)

    Emlen, J.M.; Freeman, D.C.; Mills, A.; Graham, J.H.

    1998-01-01

Neo-Darwinian theory is highly successful at explaining the emergence of adaptive traits over successive generations. However, there are reasons to doubt its efficacy in explaining the observed, impressively detailed adaptive responses of organisms to day-to-day changes in their surroundings. Also, the theory lacks a clear mechanism to account for both plasticity and canalization. In effect, there is a growing sentiment that the neo-Darwinian paradigm is incomplete, that something more than genetic structure, mutation, genetic drift, and the action of natural selection is required to explain organismal behavior. In this paper we extend the view of organisms as complex self-organizing entities by arguing that basic physical laws, coupled with the acquisitive nature of organisms, make adaptation all but tautological. That is, much adaptation is an unavoidable emergent property of organisms' complexity and, to a significant degree, occurs quite independently of genomic changes wrought by natural selection. For reasons that will become obvious, we refer to this assertion as the attractor hypothesis. The arguments also clarify the concept of "adaptation." Adaptation across generations, by natural selection, equates to the (game-theoretic) maximization of fitness (the success with which one individual produces more individuals), while self-organization-based adaptation, within generations, equates to energetic efficiency and the matching of intake and biosynthesis to need. Finally, we discuss implications of the attractor hypothesis for a wide variety of genetic and physiological phenomena, including genetic architecture, directed mutation, genetic imprinting, paramutation, hormesis, plasticity, optimality theory, genotype-phenotype linkage and punctuated equilibrium, and present suggestions for tests of the hypothesis. © 1998 American Institute of Physics.

  5. Whiplash and the compensation hypothesis.

    Science.gov (United States)

    Spearing, Natalie M; Connelly, Luke B

    2011-12-01

    Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.

  6. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    Science.gov (United States)

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results or more extreme results if the null hypothesis of no effect was true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
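    The predictive-value arithmetic the authors advocate is easy to sketch. Below, R is treated as a prior probability that the tested effect is real (the paper leaves its exact form open, so this is an assumption), and all numbers are illustrative:

```python
def ppv_npv(alpha, power, prior):
    """Predictive values of a significance test.

    alpha: Type I error rate; power: 1 - Type II error rate;
    prior: assumed a priori probability that the effect is real
    (the paper's R, taken here as a probability).
    """
    tp = power * prior              # true positives
    fp = alpha * (1.0 - prior)      # false positives
    tn = (1.0 - alpha) * (1.0 - prior)  # true negatives
    fn = (1.0 - power) * prior      # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# alpha = 0.05, power = 0.8, but only 10% of tested effects real:
ppv, npv = ppv_npv(0.05, 0.8, 0.10)   # ppv = 0.64, npv ~ 0.977
```

    With these inputs a "significant" result is actually true only 64% of the time, which is the point of reporting PPV alongside p.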

  7. Robust estimation and hypothesis testing

    CERN Document Server

    Tiku, Moti L

    2004-01-01

    In statistical theory and practice, a certain distribution is usually assumed and then optimal solutions sought. Since deviations from an assumed distribution are very common, one cannot feel comfortable with assuming a particular distribution and believing it to be exactly correct. That brings the robustness issue in focus. In this book, we have given statistical procedures which are robust to plausible deviations from an assumed model. The method of modified maximum likelihood estimation is used in formulating these procedures. The modified maximum likelihood estimators are explicit functions of sample observations and are easy to compute. They are asymptotically fully efficient and are as efficient as the maximum likelihood estimators for small sample sizes. The maximum likelihood estimators have computational problems and are, therefore, elusive. A broad range of topics is covered in this book. Solutions are given which are easy to implement and are efficient. The solutions are also robust to data anomali...

  8. Efficient testing of the homogeneity, scale parameters and number of components in the Rayleigh mixture

    International Nuclear Information System (INIS)

    Stehlik, M.; Ososkov, G.A.

    2003-01-01

    The statistical problem of decomposing an experimental distribution of transverse momenta into Rayleigh distributions is considered. A highly efficient procedure for testing the hypothesis of homogeneity of the observed measurements, optimal in the sense of Bahadur, is constructed. An exact likelihood ratio (LR) test of the scale parameter of the Rayleigh distribution is proposed for the case when the hypothesis of homogeneity holds. Otherwise, an efficient procedure for testing the number of components in the mixture is also proposed.
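    An exact scale test for Rayleigh data can be illustrated generically: if x follows a Rayleigh distribution with scale sigma, then x^2/(2*sigma^2) is standard exponential, so sum(x_i^2)/sigma0^2 is chi-square with 2n degrees of freedom under H0: sigma = sigma0. A minimal sketch on simulated data (this is the textbook construction, not necessarily the paper's Bahadur-optimal one):

```python
import numpy as np
from scipy import stats

def rayleigh_scale_test(x, sigma0, alpha=0.05):
    """Exact two-sided test of H0: sigma = sigma0 for Rayleigh data.

    Under H0, t = sum(x_i^2) / sigma0^2 ~ chi-square with 2n df,
    since each x_i^2 / (2 sigma^2) is standard exponential.
    """
    n = len(x)
    t = np.sum(np.asarray(x) ** 2) / sigma0 ** 2
    lo = stats.chi2.ppf(alpha / 2, 2 * n)
    hi = stats.chi2.ppf(1 - alpha / 2, 2 * n)
    p = min(1.0, 2 * min(stats.chi2.cdf(t, 2 * n), stats.chi2.sf(t, 2 * n)))
    return t, p, not (lo <= t <= hi)

rng = np.random.default_rng(0)
x = rng.rayleigh(scale=1.0, size=200)   # transverse-momentum-like sample
t, p, reject = rayleigh_scale_test(x, sigma0=1.0)
```

    With a clearly wrong scale (e.g. sigma0 = 2) the statistic falls far below the chi-square acceptance band and the test rejects.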

  9. Testing the Hypothesis of Biofilm as a Source for Soft Tissue and Cell-Like Structures Preserved in Dinosaur Bone

    Science.gov (United States)

    2016-01-01

    Recovery of still-soft tissue structures, including blood vessels and osteocytes, from dinosaur bone after demineralization was reported in 2005 and in subsequent publications. Despite multiple lines of evidence supporting an endogenous source, it was proposed that these structures arose from contamination from biofilm-forming organisms. To test the hypothesis that soft tissue structures result from microbial invasion of the fossil bone, we used two different biofilm-forming microorganisms to inoculate modern bone fragments from which organic components had been removed. We show fundamental morphological, chemical and textural differences between the resultant biofilm structures and those derived from dinosaur bone. The data do not support the hypothesis that biofilm-forming microorganisms are the source of these structures. PMID:26926069

  10. Event rate and reaction time performance in ADHD: Testing predictions from the state regulation deficit hypothesis using an ex-Gaussian model.

    Science.gov (United States)

    Metin, Baris; Wiersema, Jan R; Verguts, Tom; Gasthuys, Roos; van Der Meere, Jacob J; Roeyers, Herbert; Sonuga-Barke, Edmund

    2014-12-06

    According to the state regulation deficit (SRD) account, ADHD is associated with a problem using effort to maintain an optimal activation state under demanding task settings such as very fast or very slow event rates. This leads to a prediction of disrupted performance at event rate extremes reflected in higher Gaussian response variability that is a putative marker of activation during motor preparation. In the current study, we tested this hypothesis using ex-Gaussian modeling, which distinguishes Gaussian from non-Gaussian variability. Twenty-five children with ADHD and 29 typically developing controls performed a simple Go/No-Go task under four different event-rate conditions. There was an accentuated quadratic relationship between event rate and Gaussian variability in the ADHD group compared to the controls. The children with ADHD had greater Gaussian variability at very fast and very slow event rates but not at moderate event rates. The results provide evidence for the SRD account of ADHD. However, given that this effect did not explain all group differences (some of which were independent of event rate) other cognitive and/or motivational processes are also likely implicated in ADHD performance deficits.
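    Ex-Gaussian decomposition of reaction-time distributions is available in SciPy as exponnorm, whose shape parameter K equals tau/sigma. A minimal fitting sketch on simulated reaction times (all parameter values are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

# Simulated RTs in ms: Gaussian component (mu=400, sigma=30) plus an
# exponential slow tail (tau=100). Values are illustrative only.
rng = np.random.default_rng(42)
rt = rng.normal(400.0, 30.0, 5000) + rng.exponential(100.0, 5000)

# scipy's exponnorm is the ex-Gaussian; shape K = tau / sigma.
K, loc, scale = stats.exponnorm.fit(rt)
mu, sigma, tau = loc, scale, K * scale   # ex-Gaussian parameters
```

    The Gaussian variability the SRD account focuses on is read off sigma, while the non-Gaussian slow tail is captured by tau.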

  11. Are only infants held more often on the left? If so, why? Testing the attention-emotion hypothesis with an infant, a vase, and two chimeric tests, one "emotional," one not.

    Science.gov (United States)

    Harris, Lauren Julius; Cárdenas, Rodrigo A; Stewart, Nathaniel D; Almerigi, Jason B

    2018-05-16

    Most adults, especially women, hold infants and dolls but not books or packages on the left side. One reason may be that attention is more often leftward in response to infants, unlike emotionally neutral objects like books and packages. Women's stronger bias may reflect greater responsiveness to infants. Previously, we tested the attention hypothesis by comparing women's side-of-hold of a doll, book, and package with direction-of-attention on the Chimeric Faces Test (CFT) [Harris, L. J., Cárdenas, R. A., Spradlin, Jr., M. P., & Almerigi, J. B. (2010). Why are infants held on the left? A test of the attention hypothesis with a doll, a book, and a bag. Laterality: Asymmetries of Body, Brain and Cognition, 15(5), 548-571. doi: 10.1080/13576500903064018 ]. Only the doll was held more often to the left, and only for the doll were side-of-hold and CFT scores related, with left-holders showing a stronger left-attention bias than right-holders. In the current study, we tested men and women with a doll and the CFT along with a vase as a neutral object and a "non-emotional" chimeric test. Again, only the doll was held more often to the left, but now, although both chimeric tests showed left-attention biases, scores were unrelated to side-of-hold. Nor were there sex differences. The results support left-hold selectivity but not the attention hypothesis, with or without the element of emotion. They also raise questions about the contribution of sex-of-holder. We conclude with suggestions for addressing these issues.

  12. Individual diet variation in a marine fish assemblage: Optimal Foraging Theory, Niche Variation Hypothesis and functional identity

    Science.gov (United States)

    Cachera, M.; Ernande, B.; Villanueva, M. C.; Lefebvre, S.

    2017-02-01

    Individual diet variation (i.e. diet variation among individuals) impacts intra- and inter-specific interactions. Investigating its sources and relationship with species trophic niche organization is important for understanding community structure and dynamics. Individual diet variation may increase with intra-specific phenotypic (or "individual state") variation and habitat variability, according to Optimal Foraging Theory (OFT), and with species trophic niche width, according to the Niche Variation Hypothesis (NVH). OFT proposes "proximate sources" of individual diet variation such as variations in habitat or size whereas NVH relies on "ultimate sources" related to the competitive balance between intra- and inter-specific competitions. The latter implies as a corollary that species trophic niche overlap, taken as inter-specific competition measure, decreases as species niche width and individual niche variation increase. We tested the complementary predictions of OFT and NVH in a marine fish assemblage using stomach content data and associated trophic niche metrics. The NVH predictions were tested between species of the assemblage and decomposed into a between- and a within-functional group component to assess the potential influence of species' ecological function. For most species, individual diet variation and niche overlap were consistently larger than expected. Individual diet variation increased with intra-specific variability in individual state and habitat, as expected from OFT. It also increased with species niche width but in compliance with the null expectation, thus not supporting the NVH. In contrast, species niche overlap increased significantly less than null expectation with both species niche width and individual diet variation, supporting NVH corollary. The between- and within-functional group components of the NVH relationships were consistent with those between species at the assemblage level. Changing the number of prey categories used to

  13. Gender differences in emotion perception and self-reported emotional intelligence: A test of the emotion sensitivity hypothesis

    OpenAIRE

    Fischer, Agneta H.; Kret, Mariska E.; Broekens, Joost

    2018-01-01

    Previous meta-analyses and reviews on gender differences in emotion recognition have shown a small to moderate female advantage. However, inconsistent evidence from recent studies has raised questions regarding the implications of different methodologies, stimuli, and samples. In the present research based on a community sample of more than 5000 participants, we tested the emotional sensitivity hypothesis, stating that women are more sensitive to perceive subtle, i.e. low intense or ambiguous...

  14. Optimization and Improvement of Test Processes on a Production Line

    Science.gov (United States)

    Sujová, Erika; Čierna, Helena

    2018-06-01

    The paper deals with increasing process efficiency on a production line for engine cylinder heads in a company operating in the automotive industry. The goal is to improve and optimize the test processes on the production line. It analyzes options for improving the capacity, availability and productivity of the output-test processes by using modern technology available on the market. We focused on the analysis of operation times before and after the optimization of test processes at specific production sections. By analyzing the measured results, we determined the differences in times before and after the process improvement. We determined the overall equipment effectiveness (OEE) coefficient and, by comparing outputs, confirmed a real improvement of the output test of cylinder heads.
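    The OEE coefficient mentioned here is conventionally the product of three rates: availability, performance and quality. A minimal sketch with illustrative shift figures (not the paper's data):

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness: product of the three rates."""
    return availability * performance * quality

# Illustrative shift figures (assumptions, not measured values):
a = 440 / 480          # run time / planned production time
p = 900 / 1000         # actual output / theoretical output
q = 882 / 900          # good parts / total parts
score = oee(a, p, q)
```

    An OEE around 0.85 is commonly cited as world-class, so a score near 0.81 leaves visible room for the kind of test-process optimization the paper describes.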

  15. Testing the niche variation hypothesis with a measure of body condition

    Science.gov (United States)

    Individual variation and fitness are cornerstones of evolution by natural selection. The niche variation hypothesis (NVH) posits that when interspecific competition is relaxed, intraspecific competition should drive niche expansion by selection favoring use of novel resources. Po...

  16. Reverse hypothesis machine learning a practitioner's perspective

    CERN Document Server

    Kulkarni, Parag

    2017-01-01

    This book introduces a paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming. Hence knowledge-innovation-based learning is needed. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  17. Cross-validation and hypothesis testing in neuroimaging: An irenic comment on the exchange between Friston and Lindquist et al.

    Science.gov (United States)

    Reiss, Philip T

    2015-08-01

    The "ten ironic rules for statistical reviewers" presented by Friston (2012) prompted a rebuttal by Lindquist et al. (2013), which was followed by a rejoinder by Friston (2013). A key issue left unresolved in this discussion is the use of cross-validation to test the significance of predictive analyses. This note discusses the role that cross-validation-based and related hypothesis tests have come to play in modern data analyses, in neuroimaging and other fields. It is shown that such tests need not be suboptimal and can fill otherwise-unmet inferential needs. Copyright © 2015 Elsevier Inc. All rights reserved.
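    A common way to attach a hypothesis test to a predictive analysis of this kind is to permute the labels and recompute cross-validated accuracy, building a null distribution for "no association". A self-contained sketch with a nearest-centroid classifier on synthetic data (illustrative, not any specific method from the exchange):

```python
import numpy as np

def cv_accuracy(X, y, k=5):
    """K-fold cross-validated accuracy of a nearest-centroid classifier."""
    n = len(y)
    idx = np.arange(n)
    correct = 0
    for f in np.array_split(idx, k):
        train = np.setdiff1d(idx, f)
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[f] - c0, axis=1)
        d1 = np.linalg.norm(X[f] - c1, axis=1)
        correct += np.sum((d1 < d0).astype(int) == y[f])
    return correct / n

def permutation_p(X, y, n_perm=200, seed=0):
    """P-value for H0 'no label-feature association' via permutation."""
    rng = np.random.default_rng(seed)
    observed = cv_accuracy(X, y)
    null = [cv_accuracy(X, rng.permutation(y)) for _ in range(n_perm)]
    return observed, (1 + sum(a >= observed for a in null)) / (n_perm + 1)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(1.0, 1, (50, 4))])
y = np.repeat([0, 1], 50)
acc, p = permutation_p(X, y)
```

    The "+1" in numerator and denominator keeps the p-value valid (it can never be exactly zero), which matters when the number of permutations is small.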

  18. Wind Generators Test Bench. Optimal Design of PI Controller

    Directory of Open Access Journals (Sweden)

    TUDORACHE, T.

    2011-08-01

    Full Text Available This paper proposes a novel and robust strategy for the optimal design of the drive system integrated into a wind generators test bench. The PI regulator coefficients used in control systems are usually computed based on simplified hypotheses and then tuned manually so that the system response meets certain specifications in terms of stability, accuracy and speed. The proposed methodology permits the automatic identification of the PI regulator coefficients using intelligent optimization algorithms, with the initial guess for the search procedure determined from particular simplified hypotheses. The proposed procedure can help design engineers drastically reduce the effort of finding the best PI regulator coefficients, offering a range of feasible solutions depending on the imposed optimality criteria. The characteristics and performance of the optimization strategy are highlighted by using it for the design of a DC motor drive system used to simulate the wind prime mover integrated into a wind generators test bench.
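    The idea of seeding an automatic search with a rough initial guess can be sketched generically: simulate the closed loop and let an optimizer minimize a time-weighted error cost. A minimal stand-in using Nelder-Mead on a first-order plant (all plant numbers are assumptions, not the bench's):

```python
import numpy as np
from scipy.optimize import minimize

def itae(gains, plant_gain=2.0, tau=0.5, dt=0.01, t_end=5.0):
    """ITAE cost for a unit step on the first-order plant
    tau*dx/dt = -x + plant_gain*u, closed with a PI controller."""
    kp, ki = gains
    x, integ, cost = 0.0, 0.0, 0.0
    for step in range(int(t_end / dt)):
        t = step * dt
        e = 1.0 - x                       # setpoint = 1
        integ += e * dt
        u = kp * e + ki * integ           # PI control law
        x += dt * (-x + plant_gain * u) / tau
        if abs(x) > 1e6:                  # penalize unstable gains
            return 1e6 + kp * kp + ki * ki
        cost += t * abs(e) * dt
    return cost

# Automatic search from a rough initial guess, standing in for the
# paper's intelligent-optimization step.
res = minimize(itae, x0=[1.0, 1.0], method="Nelder-Mead")
kp_opt, ki_opt = res.x
```

    The optimizer can only improve on the simplified-hypothesis starting point, which mirrors the paper's workflow of refining a hand-computed PI design.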

  19. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  20. Is Variability in Mate Choice Similar for Intelligence and Personality Traits? Testing a Hypothesis about the Evolutionary Genetics of Personality

    Science.gov (United States)

    Stone, Emily A.; Shackelford, Todd K.; Buss, David M.

    2012-01-01

    This study tests the hypothesis presented by Penke, Denissen, and Miller (2007a) that condition-dependent traits, including intelligence, attractiveness, and health, are universally and uniformly preferred as characteristics in a mate relative to traits that are less indicative of condition, including personality traits. We analyzed…

  1. Testing the Hypothesis of Biofilm as a Source for Soft Tissue and Cell-Like Structures Preserved in Dinosaur Bone.

    Directory of Open Access Journals (Sweden)

    Mary Higby Schweitzer

    Full Text Available Recovery of still-soft tissue structures, including blood vessels and osteocytes, from dinosaur bone after demineralization was reported in 2005 and in subsequent publications. Despite multiple lines of evidence supporting an endogenous source, it was proposed that these structures arose from contamination from biofilm-forming organisms. To test the hypothesis that soft tissue structures result from microbial invasion of the fossil bone, we used two different biofilm-forming microorganisms to inoculate modern bone fragments from which organic components had been removed. We show fundamental morphological, chemical and textural differences between the resultant biofilm structures and those derived from dinosaur bone. The data do not support the hypothesis that biofilm-forming microorganisms are the source of these structures.

  2. Biomic specialization and speciation rates in ruminants (Cetartiodactyla, Mammalia): a test of the resource-use hypothesis at the global scale.

    Science.gov (United States)

    Cantalapiedra, Juan L; Hernández Fernández, Manuel; Morales, Jorge

    2011-01-01

    The resource-use hypothesis proposed by E.S. Vrba predicts that specialist species have higher speciation and extinction rates than generalists because they are more susceptible to environmental changes and vicariance. In this work, we test some of the predictions derived from this hypothesis on the 197 extant and recently extinct species of Ruminantia (Cetartiodactyla, Mammalia) using the biomic specialization index (BSI) of each species, which is based on its distribution within different biomes. We ran 10000 Monte Carlo simulations of our data in order to get a null distribution of BSI values against which to contrast the observed data. Additionally, we drew on a supertree of the ruminants and a phylogenetic likelihood-based method (QuaSSE) for testing whether the degree of biomic specialization affects speciation rates in ruminant lineages. Our results are consistent with the predictions of the resource-use hypothesis, which foretells a higher speciation rate of lineages restricted to a single biome (BSI = 1) and higher frequency of specialist species in biomes that underwent high degree of contraction and fragmentation during climatic cycles. Bovids and deer present differential specialization across biomes; cervids show higher specialization in biomes with a marked hydric seasonality (tropical deciduous woodlands and sclerophyllous woodlands), while bovids present higher specialization in a greater variety of biomes. This might be the result of divergent physiological constraints as well as a different biogeographic and evolutionary history.

  3. On the Flexibility of Social Source Memory: A Test of the Emotional Incongruity Hypothesis

    Science.gov (United States)

    Bell, Raoul; Buchner, Axel; Kroneisen, Meike; Giang, Trang

    2012-01-01

    A popular hypothesis in evolutionary psychology posits that reciprocal altruism is supported by a cognitive module that helps cooperative individuals to detect and remember cheaters. Consistent with this hypothesis, a source memory advantage for faces of cheaters (better memory for the cheating context in which these faces were encountered) was…

  4. Received social support and exercising: An intervention study to test the enabling hypothesis.

    Science.gov (United States)

    Rackow, Pamela; Scholz, Urte; Hornung, Rainer

    2015-11-01

    Received social support is considered important for health-enhancing exercise participation. The enabling hypothesis of social support suggests an indirect association of social support and exercising via constructs of self-regulation, such as self-efficacy. This study aimed at examining an expanded enabling hypothesis by examining effects of different kinds of social support (i.e., emotional and instrumental) on exercising not only via self-efficacy but also via self-monitoring and action planning. An 8-week online study was conducted. Participants were randomly assigned to an intervention or a control group. The intervention comprised finding and then exercising regularly with a new exercise companion. Intervention and control group effects were compared by a manifest multigroup model. Received emotional social support predicted self-efficacy, self-monitoring, and action planning in the intervention group. Moreover, received emotional social support was indirectly connected with exercise via the examined mediators. The indirect effect from received emotional social support via self-efficacy mainly contributed to the total effect. No direct or indirect effect of received instrumental social support on exercise emerged. In the control group, neither emotional nor instrumental social support was associated with any of the self-regulation constructs nor with exercise. Actively looking for a new exercise companion and exercising together seems to be beneficial for the promotion of received emotional and instrumental social support. Emotional support in turn promotes exercise by enabling better self-regulation, in particular self-efficacy. Statement of contribution What is already known on this subject? With the 'enabling hypothesis', Benight and Bandura (2004, Behav. Res. Ther., 42, 1129) claimed that social support indirectly affects behaviour via self-efficacy. Research in the domain of physical exercise has provided evidence for this enabling hypothesis on a

  5. Assess the Critical Period Hypothesis in Second Language Acquisition

    Science.gov (United States)

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  6. Optimal test intervals of standby components based on actual plant-specific data

    International Nuclear Information System (INIS)

    Jones, R.B.; Bickel, J.H.

    1987-01-01

    Based on standard reliability analysis techniques, both under-testing and over-testing affect the availability of standby components. If tests are performed too often, unavailability is increased since the equipment is being used excessively. Conversely, if testing is performed too infrequently, the likelihood of component unavailability is also increased due to the formation of rust, heat or radiation damage, dirt infiltration, etc. Thus, from a physical perspective, an optimal test interval should exist which minimizes unavailability. This paper illustrates the application of an unavailability model that calculates optimal testing intervals for components with a failure database. (orig./HSCH)
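    The trade-off described here has a classic closed form: with standby failure rate lam, undetected-failure unavailability grows as lam*T/2 while test-downtime unavailability falls as t_test/T, giving an optimum at T* = sqrt(2*t_test/lam). A sketch with illustrative numbers (not plant-specific data):

```python
import math

def mean_unavailability(T, lam, t_test):
    """Standby unavailability vs. test interval T (hours):
    lam*T/2 from undetected failures, t_test/T from test downtime."""
    return lam * T / 2.0 + t_test / T

def optimal_interval(lam, t_test):
    """Closed-form minimizer T* = sqrt(2*t_test/lam)."""
    return math.sqrt(2.0 * t_test / lam)

# Illustrative numbers: failure rate 1e-4 per hour, 2-hour test outage.
lam, t_test = 1.0e-4, 2.0
T_star = optimal_interval(lam, t_test)   # 200 h for these numbers
```

    Both halving and doubling T* raises the mean unavailability, which is exactly the under-testing/over-testing tension the abstract describes.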

  7. Testing the Cuckoldry Risk Hypothesis of Partner Sexual Coercion in Community and Forensic Samples

    Directory of Open Access Journals (Sweden)

    Joseph A. Camilleri

    2009-04-01

    Full Text Available Evolutionary theory has informed the investigation of male sexual coercion but has seldom been applied to the analysis of sexual coercion within established couples. The cuckoldry risk hypothesis, that sexual coercion is a male tactic used to reduce the risk of extrapair paternity, was tested in two studies. In a community sample, indirect cues of infidelity predicted male propensity for sexual coaxing in the relationship, and direct cues predicted propensity for sexual coercion. In the forensic sample, we found that most partner rapists experienced cuckoldry risk prior to committing their offence and experienced more types of cuckoldry risk events than non-sexual partner assaulters. These findings suggest that cuckoldry risk influences male sexual coercion in established sexual relationships.

  8. Optimal test intervals for shutdown systems for the Cernavoda nuclear power station

    International Nuclear Information System (INIS)

    Negut, Gh.; Laslau, F.

    1993-01-01

    The Cernavoda nuclear power station required a complete PSA study. As a part of this study, an important goal for enhancing the effectiveness of plant operation is to establish optimal test intervals for the important engineered safety systems. The paper briefly presents the current methods for optimizing test intervals. Vesely's method was used to establish optimal test intervals, and the FRANTIC code was used to survey the influence of the test intervals on system availability. The applications were done on Shutdown System No. 1, a shutdown system provided with solid rods, and on Shutdown System No. 2, provided with poison injection. The shutdown systems receive nine fully independent scram signals that dictate the test interval. Fault trees for both safety systems were developed. For the fault tree solutions, an original code developed in our Institute was used. The results, intended to be implemented in the technical specifications for testing and operation of the Cernavoda NPS, are presented.

  9. Short-Term Wind Speed Forecasting Using Support Vector Regression Optimized by Cuckoo Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Jianzhou Wang

    2015-01-01

    Full Text Available This paper develops an effectively intelligent model to forecast short-term wind speed series. A hybrid forecasting technique is proposed based on recurrence plot (RP and optimized support vector regression (SVR. Wind caused by the interaction of meteorological systems makes itself extremely unsteady and difficult to forecast. To understand the wind system, the wind speed series is analyzed using RP. Then, the SVR model is employed to forecast wind speed, in which the input variables are selected by RP, and two crucial parameters, including the penalties factor and gamma of the kernel function RBF, are optimized by various optimization algorithms. Those optimized algorithms are genetic algorithm (GA, particle swarm optimization algorithm (PSO, and cuckoo optimization algorithm (COA. Finally, the optimized SVR models, including COA-SVR, PSO-SVR, and GA-SVR, are evaluated based on some criteria and a hypothesis test. The experimental results show that (1 analysis of RP reveals that wind speed has short-term predictability on a short-term time scale, (2 the performance of the COA-SVR model is superior to that of the PSO-SVR and GA-SVR methods, especially for the jumping samplings, and (3 the COA-SVR method is statistically robust in multi-step-ahead prediction and can be applied to practical wind farm applications.
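    The two SVR hyperparameters the cuckoo algorithm tunes (the penalty factor and the RBF gamma) can be illustrated with a compact stand-in: closed-form kernel ridge regression with an RBF kernel, with an exhaustive grid search in place of COA. The series, lag structure and grid values below are all illustrative assumptions:

```python
import numpy as np

def rbf(A, B, gamma):
    """RBF (Gaussian) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_rmse(Xtr, ytr, Xte, yte, gamma, lam):
    """Kernel ridge regression (closed form) and its test RMSE."""
    alpha = np.linalg.solve(rbf(Xtr, Xtr, gamma) + lam * np.eye(len(ytr)), ytr)
    pred = rbf(Xte, Xtr, gamma) @ alpha
    return np.sqrt(np.mean((pred - yte) ** 2))

# Synthetic wind-speed-like series: daily cycle plus noise.
rng = np.random.default_rng(7)
v = 6 + 2 * np.sin(2 * np.pi * np.arange(400) / 48) + rng.normal(0, 0.3, 400)
X = np.column_stack([v[:-3], v[1:-2], v[2:-1]])   # three lagged inputs
y = v[3:]
Xtr, ytr, Xte, yte = X[:300], y[:300], X[300:], y[300:]

# Exhaustive grid search stands in for the cuckoo optimization step:
grid = [(g, l) for g in (0.01, 0.1, 1.0) for l in (1e-3, 1e-1, 1.0)]
best_g, best_l = min(grid, key=lambda gl: krr_rmse(Xtr, ytr, Xte, yte, *gl))
best_rmse = krr_rmse(Xtr, ytr, Xte, yte, best_g, best_l)
```

    A metaheuristic like COA searches the same (penalty, gamma) space continuously instead of on a fixed grid; the regularization weight lam here plays the role of 1/C in SVR.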

  10. Optimal design of tests for heat exchanger fouling identification

    International Nuclear Information System (INIS)

    Palmer, Kyle A.; Hale, William T.; Such, Kyle D.; Shea, Brian R.; Bollas, George M.

    2016-01-01

    Highlights: • Built-in test design that optimizes the information extractable from the test. • Method minimizes the covariance of a fault with system uncertainty. • Method applied to the identification and quantification of heat exchanger fouling. • Heat exchanger fouling is identifiable despite the uncertainty in inputs and states. Abstract: Particulate fouling in plate fin heat exchangers of aircraft environmental control systems is a recurring issue in environments rich in foreign object debris. Heat exchanger fouling detection, in terms of quantification of its severity, is critical for aircraft maintenance scheduling and safe operation. In this work, we focus on methods for offline fouling detection during aircraft ground handling, where the allowable variability range of admissible inputs is wider. We explore methods of optimal experimental design to estimate heat exchanger inputs and input trajectories that maximize the identifiability of fouling. In particular, we present a methodology in which D-optimality is used as a criterion for statistically significant inference of heat exchanger fouling in uncertain environments. The optimal tests are designed on the basis of a heat exchanger model of the inherent mass, energy and momentum balances, validated against literature data. The model is then used to infer sensitivities of the heat exchanger outputs with respect to fouling metrics and maximize them by manipulating input trajectories; thus enhancing the accuracy in quantifying the fouling extent. The proposed methodology is evaluated with statistical indices of the confidence in estimating thermal fouling resistance at uncertain operating conditions, explored in a series of case studies.
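    With a single fouling parameter, the D-optimality criterion reduces to maximizing the squared sensitivity of the measured output to that parameter (the determinant of a 1x1 information matrix). A sketch on a toy effectiveness-NTU exchanger model, where all constants are assumptions rather than the paper's validated model:

```python
import numpy as np

def outlet_temp(m_dot, R_f, UA=500.0, T_hot=90.0, T_cold=20.0, cp=1005.0):
    """Cold-side outlet temperature of a toy exchanger model in which
    fouling adds a thermal resistance R_f in series with 1/UA."""
    ua_eff = 1.0 / (1.0 / UA + R_f)
    ntu = ua_eff / (m_dot * cp)
    eff = 1.0 - np.exp(-ntu)          # single-stream effectiveness
    return T_cold + eff * (T_hot - T_cold)

def d_criterion(m_dot, R_f=1e-3, h=1e-6):
    """Scalar D-optimality: squared central-difference sensitivity of
    the measured output to the fouling parameter."""
    s = (outlet_temp(m_dot, R_f + h) - outlet_temp(m_dot, R_f - h)) / (2 * h)
    return s * s

flows = np.linspace(0.05, 1.0, 20)    # candidate mass flows, kg/s
best = flows[np.argmax([d_criterion(m) for m in flows])]
```

    On this toy model the criterion peaks where m_dot*cp is comparable to the effective UA: too little flow saturates the effectiveness, too much flow dilutes the temperature change, and either way the fouling signal weakens.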

  11. Resemblance profiles as clustering decision criteria: Estimating statistical power, error, and correspondence for a hypothesis test for multivariate structure.

    Science.gov (United States)

    Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F

    2017-04-01

    Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
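    UPGMA is SciPy's method='average' hierarchical linkage. DISPROF itself tests dissimilarity profiles, but for a sketch a simple label-permutation check of between-group dissimilarity can stand in as the hypothesis-test decision criterion (synthetic data, not the paper's simulation design):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

def structure_p(X, labels, n_perm=199, seed=0):
    """Permutation stand-in for a profile test: is the mean between-group
    dissimilarity larger than expected under random labelling?"""
    D = squareform(pdist(X))
    rng = np.random.default_rng(seed)

    def between(lbl):
        mask = lbl[:, None] != lbl[None, :]
        return D[mask].mean()

    obs = between(labels)
    null = [between(rng.permutation(labels)) for _ in range(n_perm)]
    return (1 + sum(b >= obs for b in null)) / (n_perm + 1)

# Two well-separated multivariate groups (illustrative).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])

Z = linkage(X, method="average")          # UPGMA
labels = fcluster(Z, t=2, criterion="maxclust")
p = structure_p(X, labels)
```

    In the DISPROF workflow this kind of test is what decides whether a node of the dendrogram represents real multivariate structure or should be collapsed.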

  12. [Working memory, phonological awareness and spelling hypothesis].

    Science.gov (United States)

    Gindri, Gigiane; Keske-Soares, Márcia; Mota, Helena Bolli

    2007-01-01

    To verify the relationship between working memory, phonological awareness and spelling hypothesis in pre-school children and first graders. Participants of this study were 90 students, belonging to state schools, who presented typical linguistic development. Forty students were preschoolers, with the average age of six, and 50 students were first graders, with the average age of seven. Participants were submitted to an evaluation of the working memory abilities based on the Working Memory Model (Baddeley, 2000), involving the phonological loop. The phonological loop was evaluated using the Auditory Sequential Test, subtest 5 of the Illinois Test of Psycholinguistic Abilities (ITPA), Brazilian version (Bogossian & Santos, 1977), and the Meaningless Words Memory Test (Kessler, 1997). Phonological awareness abilities were investigated using the Phonological Awareness: Instrument of Sequential Assessment (CONFIAS - Moojen et al., 2003), involving syllabic and phonemic awareness tasks. Writing was characterized according to Ferreiro & Teberosky (1999). Preschoolers presented the ability of repeating sequences of 4.80 digits and 4.30 syllables. Regarding phonological awareness, the performance in the syllabic level was of 19.68 and in the phonemic level was of 8.58. Most of the preschoolers demonstrated to have a pre-syllabic writing hypothesis. First graders repeated, in average, sequences of 5.06 digits and 4.56 syllables. These children presented a phonological awareness of 31.12 in the syllabic level and of 16.18 in the phonemic level, and demonstrated to have an alphabetic writing hypothesis. The performance in working memory, phonological awareness and spelling level are inter-related, as well as being related to chronological age, development and schooling.

  13. Optimal Robust Self-Testing by Binary Nonlocal XOR Games

    OpenAIRE

    Miller, Carl A.; Shi, Yaoyun

    2013-01-01

    Self-testing a quantum apparatus means verifying the existence of a certain quantum state as well as the effect of the associated measuring devices based only on the statistics of the measurement outcomes. Robust (i.e., error-tolerant) self-testing quantum apparatuses are critical building blocks for quantum cryptographic protocols that rely on imperfect or untrusted devices. We devise a general scheme for proving optimal robust self-testing properties for tests based on nonlocal binary XOR g...

  14. Motivation in vigilance - A test of the goal-setting hypothesis of the effectiveness of knowledge of results.

    Science.gov (United States)

    Warm, J. S.; Riechmann, S. W.; Grasha, A. F.; Seibel, B.

    1973-01-01

This study tested the prediction, derived from the goal-setting hypothesis, that the facilitating effects of knowledge of results (KR) in a simple vigilance task should be related directly to the level of the performance standard used to regulate KR. Two groups of Ss received dichotomous KR in terms of whether Ss' response times (RTs) to signal detections exceeded a high or low standard of performance. The aperiodic offset of a visual signal was the critical event for detection. The vigil was divided into a training phase followed by testing, during which KR was withdrawn. Knowledge of results enhanced performance in both phases. However, the two standards used to regulate feedback contributed little to these effects.

  15. The Influence of Maternal Acculturation, Neighborhood Disadvantage, and Parenting on Chinese American Adolescents' Conduct Problems: Testing the Segmented Assimilation Hypothesis

    Science.gov (United States)

    Liu, Lisa L.; Lau, Anna S.; Chen, Angela Chia-Chen; Dinh, Khanh T.; Kim, Su Yeong

    2009-01-01

    Associations among neighborhood disadvantage, maternal acculturation, parenting and conduct problems were investigated in a sample of 444 Chinese American adolescents. Adolescents (54% female, 46% male) ranged from 12 to 15 years of age (mean age = 13.0 years). Multilevel modeling was employed to test the hypothesis that the association between…

  16. Using Saccharomyces cerevisiae to Test the Mutagenicity of Household Compounds: An Open Ended Hypothesis-Driven Teaching Lab

    OpenAIRE

    Marshall, Pamela A.

    2007-01-01

    In our Fundamentals of Genetics lab, students perform a wide variety of labs to reinforce and extend the topics covered in lecture. I developed an active-learning lab to augment the lecture topic of mutagenesis. In this lab exercise, students determine if a compound they bring from home is a mutagen. Students are required to read extensive background material, perform research to find a potential mutagen to test, develop a hypothesis, and bring to the lab their own suspected mutagen. This lab...

  17. Testing the carotenoid trade-off hypothesis in the polychromatic Midas cichlid, Amphilophus citrinellus.

    Science.gov (United States)

    Lin, Susan M; Nieves-Puigdoller, Katherine; Brown, Alexandria C; McGraw, Kevin J; Clotfelter, Ethan D

    2010-01-01

    Many animals use carotenoid pigments derived from their diet for coloration and immunity. The carotenoid trade-off hypothesis predicts that, under conditions of carotenoid scarcity, individuals may be forced to allocate limited carotenoids to either coloration or immunity. In polychromatic species, the pattern of allocation may differ among individuals. We tested the carotenoid trade-off hypothesis in the Midas cichlid, Amphilophus citrinellus, a species with two ontogenetic color morphs, barred and gold, the latter of which is the result of carotenoid expression. We performed a diet-supplementation experiment in which cichlids of both color morphs were assigned to one of two diet treatments that differed only in carotenoid content (beta-carotene, lutein, and zeaxanthin). We measured integument color using spectrometry, quantified carotenoid concentrations in tissue and plasma, and assessed innate immunity using lysozyme activity and alternative complement pathway assays. In both color morphs, dietary carotenoid supplementation elevated plasma carotenoid circulation but failed to affect skin coloration. Consistent with observable differences in integument coloration, we found that gold fish sequestered more carotenoids in skin tissue than barred fish, but barred fish had higher concentrations of carotenoids in plasma than gold fish. Neither measure of innate immunity differed between gold and barred fish, or as a function of dietary carotenoid supplementation. Lysozyme activity, but not complement activity, was strongly affected by body condition. Our data show that a diet low in carotenoids is sufficient to maintain both coloration and innate immunity in Midas cichlids. Our data also suggest that the developmental transition from the barred to gold morph is not accompanied by a decrease in innate immunity in this species.

  18. Testing of Frank's hypothesis on a containerless packing of macroscopic soft spheres and comparison with mono-atomic metallic liquids

    International Nuclear Information System (INIS)

    Sahu, K.K.; Wessels, V.; Kelton, K.F.; Loeffler, J.F.

    2011-01-01

Highlights: → Testing of Frank's hypothesis for Centripetal Packing (CP) has been proposed. → It is shown that CP is an idealized model for Monatomic Supercooled Liquid (MSL). → The CP is fit for comparing with studies on MSL in a containerless environment. → We measure local orders in CP by HA and BOO methods for the first time. → It is shown that icosahedral order is greater in CP than MSL and reasons explored. - Abstract: It is well-known that metallic liquids can exist below their equilibrium melting temperature for a considerable time. To explain this, Frank proposed that icosahedral ordering, incompatible with crystalline long-range order, is prevalent in the atomic structure of these liquids, stabilizing them and enabling them to be supercooled. Some studies of the atomic structures of metallic liquids using Beam-line Electrostatic Levitation (BESL; containerless melting), and other techniques, support this hypothesis. Here we examine Frank's hypothesis in a system of macroscopic, monodisperse deformable spheres obtained by containerless packing under the influence of centripetal force. The local structure of this packing is analyzed and compared with atomic ensembles of liquid transition metals obtained by containerless melting using the BESL method.

  19. Establishment of an Optimal Portfolio of Stocks during a Bullish Period on the Indonesia Stock Exchange (Pembentukan Portofolio Optimal Saham-Saham pada Periode Bullish di Bursa Efek Indonesia)

    Directory of Open Access Journals (Sweden)

    Suramaya Suci Kewal

    2013-04-01

Full Text Available This study was conducted in order to establish an optimal portfolio of stocks listed on the Indonesia Stock Exchange, using a single-index model under bullish conditions. The research period used in this study is 2009-2011. The result is the formation of a stock portfolio consisting of four stocks, namely ASRI (48.72%), INDF (28.24%), BBNI (16.32%), and BKSL (6.71%). The test results of the first hypothesis suggest that the stock returns of the candidate portfolio differ from those of the noncandidate portfolio. The results of testing the second hypothesis suggest that there is no difference in risk between the stocks included in the candidate portfolio and those in the noncandidate portfolio. Keywords: optimal portfolio, single-index model
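
    The single-index construction used in such studies (the Elton-Gruber cut-off procedure) can be sketched as follows. All input numbers below are illustrative assumptions, not the study's data:

```python
# Elton-Gruber optimal portfolio under the single-index model (illustrative data).
# For each stock i: expected return R[i], beta[i], residual variance resvar[i].
rf = 0.05          # risk-free rate (assumed)
var_m = 0.04       # market variance (assumed)
R      = [0.15, 0.12, 0.10, 0.09]
beta   = [1.2, 1.0, 0.8, 0.9]
resvar = [0.05, 0.04, 0.03, 0.06]

n = len(R)
# 1. rank stocks by excess return to beta (ERB)
erb = [(R[i] - rf) / beta[i] for i in range(n)]
order = sorted(range(n), key=lambda i: -erb[i])

# 2. find the cut-off rate C*: include stocks while ERB_i exceeds the running C_i
num = den = 0.0
cutoff, included = 0.0, []
for i in order:
    num += (R[i] - rf) * beta[i] / resvar[i]
    den += beta[i] ** 2 / resvar[i]
    c_i = var_m * num / (1.0 + var_m * den)
    if erb[i] > c_i:
        cutoff, included = c_i, included + [i]
    else:
        break

# 3. weights proportional to z_i = (beta_i / resvar_i) * (ERB_i - C*)
z = {i: beta[i] / resvar[i] * (erb[i] - cutoff) for i in included}
total = sum(z.values())
weights = {i: z[i] / total for i in included}
```

    Stocks are ranked by excess return to beta; those whose ERB exceeds the cut-off C* enter the candidate portfolio, and the rest are noncandidates.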

  20. What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing

    Science.gov (United States)

    Chang, Mark

    2017-01-01

    We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…

  1. Life Origination Hydrate Hypothesis (LOH-Hypothesis)

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs), which are N-bases, riboses, nucleosides, nucleotides, DNA- and RNA-like molecules, amino-acids, and proto-cells, repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix, filled with LMSEs almost completely in its final state, accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their “thermodynamic front” guide the gross process of living-matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  2. A novel hypothesis on the sensitivity of the fecal occult blood test: Results of a joint analysis of 3 randomized controlled trials.

    Science.gov (United States)

    Lansdorp-Vogelaar, Iris; van Ballegooijen, Marjolein; Boer, Rob; Zauber, Ann; Habbema, J Dik F

    2009-06-01

    Estimates of the fecal occult blood test (FOBT) (Hemoccult II) sensitivity differed widely between screening trials and led to divergent conclusions on the effects of FOBT screening. We used microsimulation modeling to estimate a preclinical colorectal cancer (CRC) duration and sensitivity for unrehydrated FOBT from the data of 3 randomized controlled trials of Minnesota, Nottingham, and Funen. In addition to 2 usual hypotheses on the sensitivity of FOBT, we tested a novel hypothesis where sensitivity is linked to the stage of clinical diagnosis in the situation without screening. We used the MISCAN-Colon microsimulation model to estimate sensitivity and duration, accounting for differences between the trials in demography, background incidence, and trial design. We tested 3 hypotheses for FOBT sensitivity: sensitivity is the same for all preclinical CRC stages, sensitivity increases with each stage, and sensitivity is higher for the stage in which the cancer would have been diagnosed in the absence of screening than for earlier stages. Goodness-of-fit was evaluated by comparing expected and observed rates of screen-detected and interval CRC. The hypothesis with a higher sensitivity in the stage of clinical diagnosis gave the best fit. Under this hypothesis, sensitivity of FOBT was 51% in the stage of clinical diagnosis and 19% in earlier stages. The average duration of preclinical CRC was estimated at 6.7 years. Our analysis corroborated a long duration of preclinical CRC, with FOBT most sensitive in the stage of clinical diagnosis. (c) 2009 American Cancer Society.

  3. Biorhythms, deciduous enamel thickness, and primary bone growth: a test of the Havers-Halberg Oscillation hypothesis.

    Science.gov (United States)

    Mahoney, Patrick; Miszkiewicz, Justyna J; Pitfield, Rosie; Schlecht, Stephen H; Deter, Chris; Guatelli-Steinberg, Debbie

    2016-06-01

    Across mammalian species, the periodicity with which enamel layers form (Retzius periodicity) in permanent teeth corresponds with average body mass and the pace of life history. According to the Havers-Halberg Oscillation hypothesis (HHO), Retzius periodicity (RP) is a manifestation of a biorhythm that is also expressed in lamellar bone. Potentially, these links provide a basis for investigating aspects of a species' biology from fossilized teeth. Here, we tested intra-specific predictions of this hypothesis on skeletal samples of human juveniles. We measured daily enamel growth increments to calculate RP in deciduous molars (n = 25). Correlations were sought between RP, molar average and relative enamel thickness (AET, RET), and the average amount of primary bone growth (n = 7) in humeri of age-matched juveniles. Results show a previously undescribed relationship between RP and enamel thickness. Reduced major axis regression reveals RP is significantly and positively correlated with AET and RET, and scales isometrically. The direction of the correlation was opposite to HHO predictions as currently understood for human adults. Juveniles with higher RPs and thicker enamel had increased primary bone formation, which suggests a coordinating biorhythm. However, the direction of the correspondence was, again, opposite to predictions. Next, we compared RP from deciduous molars with new data for permanent molars, and with previously published values. The lowermost RP of 4 and 5 days in deciduous enamel extends below the lowermost RP of 6 days in permanent enamel. A lowered range of RP values in deciduous enamel implies that the underlying biorhythm might change with age. Our results develop the intra-specific HHO hypothesis. © 2016 Anatomical Society.

  4. Optimal integration and test plans for software releases of lithographic systems

    NARCIS (Netherlands)

    Boumen, R.; Jong, de I.S.M.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2007-01-01

This paper describes a method to determine the optimal integration and test plan for embedded systems software releases. The method consists of four steps: 1) describe the integration and test problem in an integration and test model which is introduced in this paper, 2) determine possible test

  5. Perceived message sensation value and psychological reactance: a test of the dominant thought disruption hypothesis.

    Science.gov (United States)

    Quick, Brian L

    2013-01-01

    The present study tests to see whether perceived message sensation value reduces psychological reactance within the context of anti-marijuana ads for television. After controlling for sensation seeking, biological sex, and marijuana use, the results indicate that message novelty is negatively associated with a freedom threat, whereas dramatic impact and emotional arousal were not associated with the antecedent to reactance. Results support the use of novel messages in future ads while at the same time offer an explanation to the challenges involved in creating effective anti-marijuana ads. Overall, the results provide partial support for the dominant thought disruption hypothesis and are discussed with an emphasis on the theoretical and practical implications for health communication researchers and practitioners.

  6. Intelligent Network Flow Optimization (INFLO) prototype acceptance test summary.

    Science.gov (United States)

    2015-05-01

This report summarizes the results of System Acceptance Testing for the implementation of the Intelligent Network Flow Optimization (INFLO) Prototype bundle within the Dynamic Mobility Applications (DMA) portion of the Connected Vehicle Program. ...

  7. Testing the relativistic Doppler boost hypothesis for supermassive black hole binary candidates

    Science.gov (United States)

    Charisi, Maria; Haiman, Zoltán; Schiminovich, David; D'Orazio, Daniel J.

    2018-06-01

Supermassive black hole binaries (SMBHBs) should be common in galactic nuclei as a result of frequent galaxy mergers. Recently, a large sample of sub-parsec SMBHB candidates was identified as bright periodically variable quasars in optical surveys. If the observed periodicity corresponds to the redshifted binary orbital period, the inferred orbital velocities are relativistic (v/c ≈ 0.1). The optical and ultraviolet (UV) luminosities are expected to arise from gas bound to the individual BHs, and would be modulated by the relativistic Doppler effect. The optical and UV light curves should vary in tandem with relative amplitudes which depend on the respective spectral slopes. We constructed a control sample of 42 quasars with aperiodic variability, to test whether this Doppler colour signature can be distinguished from intrinsic chromatic variability. We found that the Doppler signature can arise by chance in ~20 per cent (~37 per cent) of quasars in the nUV (fUV) band. These probabilities reflect the limited quality of the control sample and represent upper limits on how frequently quasars mimic the Doppler brightness+colour variations. We performed separate tests on the periodic quasar candidates, and found that for the majority, the Doppler boost hypothesis requires an unusually steep UV spectrum or an unexpectedly large BH mass and orbital velocity. We conclude that at most approximately one-third of these periodic candidates can harbor Doppler-modulated SMBHBs.
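
    The Doppler-boost modulation being tested can be illustrated with a short numerical sketch. The parameter values are our own toy assumptions, and the transverse-velocity contribution to the Lorentz factor is ignored for simplicity:

```python
import math

def doppler_factor(beta_los):
    """Relativistic Doppler factor for line-of-sight velocity beta_los = v_los/c
    (transverse motion neglected in this sketch)."""
    gamma = 1.0 / math.sqrt(1.0 - beta_los ** 2)
    return 1.0 / (gamma * (1.0 - beta_los))

def boosted_flux(f_emit, beta_los, alpha):
    """Apparent flux of a source with spectral slope alpha (F_nu ~ nu^alpha):
    F_obs = D**(3 - alpha) * F_emit."""
    return doppler_factor(beta_los) ** (3.0 - alpha) * f_emit

# fractional variability over one circular orbit, for v/c = 0.1 and alpha = -1
v_over_c, alpha = 0.1, -1.0
fluxes = [boosted_flux(1.0, v_over_c * math.cos(2 * math.pi * t / 100), alpha)
          for t in range(100)]
amplitude = (max(fluxes) - min(fluxes)) / (max(fluxes) + min(fluxes))
# first-order prediction: amplitude ≈ (3 - alpha) * v/c = 0.4
```

    A steeper (more negative) UV slope alpha raises the exponent 3 - alpha, which is why matching a given UV amplitude can force the unusually steep spectra the authors note.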

  8. An Improved Test Selection Optimization Model Based on Fault Ambiguity Group Isolation and Chaotic Discrete PSO

    Directory of Open Access Journals (Sweden)

    Xiaofeng Lv

    2018-01-01

Full Text Available Sensor data-based test selection optimization is the basis for designing a test work, which ensures that the system is tested under the constraint of the conventional indexes such as fault detection rate (FDR) and fault isolation rate (FIR). From the perspective of equipment maintenance support, the ambiguity isolation has a significant effect on the result of test selection. In this paper, an improved test selection optimization model is proposed by considering the ambiguity degree of fault isolation. In the new model, the fault test dependency matrix is adopted to model the correlation between the system fault and the test group. The objective function of the proposed model is minimizing the test cost with the constraint of FDR and FIR. The improved chaotic discrete particle swarm optimization (PSO) algorithm is adopted to solve the improved test selection optimization model. The new test selection optimization model is more consistent with real complicated engineering systems. The experimental result verifies the effectiveness of the proposed method.
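
    The core of such a model (a minimum-cost test subset constrained by FDR and FIR, encoded through a fault-test dependency matrix) can be sketched with a tiny brute-force search; a metaheuristic like the paper's chaotic discrete PSO is only needed when the test set is too large to enumerate. The matrix and costs below are invented for illustration:

```python
from itertools import combinations

# D[i][j] = 1 if test j can detect fault i (assumed toy dependency matrix)
D = [
    [1, 0, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 1, 0, 0],
]
cost = [3.0, 2.0, 5.0, 1.0]  # cost of running each test

def detection_rate(tests):
    """FDR: fraction of faults detected by at least one selected test."""
    detected = sum(1 for row in D if any(row[j] for j in tests))
    return detected / len(D)

def isolation_rate(tests):
    """FIR: a fault counts as isolated if its pass/fail signature over the
    selected tests is non-empty and unique (no ambiguity group)."""
    sigs = [tuple(row[j] for j in tests) for row in D]
    return sum(1 for s in sigs if any(s) and sigs.count(s) == 1) / len(D)

def select_tests(min_fdr=1.0, min_fir=1.0):
    """Exhaustively find the cheapest test subset meeting both constraints."""
    best = None
    for k in range(1, len(cost) + 1):
        for subset in combinations(range(len(cost)), k):
            if detection_rate(subset) >= min_fdr and isolation_rate(subset) >= min_fir:
                c = sum(cost[j] for j in subset)
                if best is None or c < best[0]:
                    best = (c, subset)
    return best

best_cost, best_set = select_tests(min_fdr=1.0, min_fir=0.75)
```

    With these numbers the cheapest subset meeting full detection and 75% isolation is tests {0, 1, 3}; the expensive test 2 is dropped because tests 0, 1 and 3 together already separate the fault signatures.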

  9. TESTING THE HYPOTHESIS THAT METHANOL MASER RINGS TRACE CIRCUMSTELLAR DISKS: HIGH-RESOLUTION NEAR-INFRARED AND MID-INFRARED IMAGING

    International Nuclear Information System (INIS)

    De Buizer, James M.; Bartkiewicz, Anna; Szymczak, Marian

    2012-01-01

Milliarcsecond very long baseline interferometry maps of regions containing 6.7 GHz methanol maser emission have led to the recent discovery of ring-like distributions of maser spots and the plausible hypothesis that they may be tracing circumstellar disks around forming high-mass stars. We aimed to test this hypothesis by imaging these regions in the near- and mid-infrared at high spatial resolution and comparing the observed emission to the expected infrared morphologies as inferred from the geometries of the maser rings. In the near-infrared we used the Gemini North adaptive optics system of ALTAIR/NIRI, while in the mid-infrared we used the combination of the Gemini South instrument T-ReCS and super-resolution techniques. Resultant images had a resolution of ∼150 mas in both the near-infrared and mid-infrared. We discuss the expected distribution of circumstellar material around young and massive accreting (proto)stars and what infrared emission geometries would be expected for the different maser ring orientations under the assumption that the masers are coming from within circumstellar disks. Based upon the observed infrared emission geometries for the four targets in our sample and the results of spectral energy distribution modeling of the massive young stellar objects associated with the maser rings, we do not find compelling evidence in support of the hypothesis that methanol maser rings reside in circumstellar disks.

  10. Testing the lexical hypothesis: are socially important traits more densely reflected in the English lexicon?

    Science.gov (United States)

    Wood, Dustin

    2015-02-01

    Using a set of 498 English words identified by Saucier (1997) as common person-descriptor adjectives or trait terms, I tested 3 instantiations of the lexical hypothesis, which posit that more socially important person descriptors show greater density in the lexicon. Specifically, I explored whether trait terms that have greater relational impact (i.e., more greatly influence how others respond to a person) have more synonyms, are more frequently used, and are more strongly correlated with other trait terms. I found little evidence to suggest that trait terms rated as having greater relational impact were more frequently used or had more synonyms. However, these terms correlated more strongly with other trait terms in the set. Conversely, a trait term's loadings on structural factors (e.g., the Big Five, HEXACO) were extremely good predictors of the term's relational impact. The findings suggest that the lexical hypothesis may not be strongly supported in some ways it is commonly understood but is supported in the manner most important to investigations of trait structure. Specifically, trait terms with greater relational impact tend to more strongly correlate with other terms in lexical sets and thus have a greater role in driving the location of factors in analyses of trait structure. Implications for understanding the meaning of lexical factors such as the Big Five are discussed. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  11. Optimization of inverse model identification for multi-axial test rig control

    Directory of Open Access Journals (Sweden)

    Müller Tino

    2016-01-01

Full Text Available Compared to field testing, laboratory testing of multi-axial fatigue situations improves repeatability and condenses test time, since tests can be carried out until component failure. To achieve realistic and convincing durability results, precise load data reconstruction is necessary. Cross-talk and a high number of degrees of freedom negatively affect the control accuracy. Therefore a multiple input/multiple output (MIMO) model of the system, capturing all inherent cross-couplings, is identified. In a first step the model order is estimated based on the physical fundamentals of a one-channel hydraulic-servo system. Subsequently, the structure of the MIMO model is optimized using correlation of the outputs, to increase control stability and reduce the complexity of the parameter optimization. The identification process is successfully applied to the iterative control of a multi-axial suspension rig. The results show accurate control, with increased stability compared to control without structure optimization.

  12. A test of the permanent income hypothesis on Czech voucher privatization

    Czech Academy of Sciences Publication Activity Database

    Hanousek, Jan; Tůma, Z.

    2002-01-01

Vol. 10, No. 2 (2002), pp. 235-254. ISSN 0967-0750. Institutional research plan: CEZ:AV0Z7085904. Keywords: Barro-Ricardian equivalence * permanent income hypothesis. Subject RIV: AH - Economics. Impact factor: 0.897, year: 2002. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=6844845&site=ehost-live

  13. Optimizing multiple-choice tests as tools for learning.

    Science.gov (United States)

    Little, Jeri L; Bjork, Elizabeth Ligon

    2015-01-01

Answering multiple-choice questions with competitive alternatives can enhance performance on a later test, not only on questions about the information previously tested, but also on questions about related information not previously tested; in particular, on questions about information pertaining to the previously incorrect alternatives. In the present research, we assessed a possible explanation for this pattern: when multiple-choice questions contain competitive incorrect alternatives, test-takers are led to retrieve previously studied information pertaining to all of the alternatives in order to discriminate among them and select an answer, with such processing strengthening later access to information associated with both the correct and incorrect alternatives. Supporting this hypothesis, we found enhanced performance on a later cued-recall test for previously nontested questions when their answers had previously appeared as competitive incorrect alternatives in the initial multiple-choice test, but not when they had previously appeared as noncompetitive alternatives. Importantly, however, competitive alternatives were not more likely than noncompetitive alternatives to be intruded as incorrect responses, indicating that a general increased accessibility for previously presented incorrect alternatives could not be the explanation for these results. The present findings, replicated across two experiments (one in which corrective feedback was provided during the initial multiple-choice testing, and one in which it was not), thus strongly suggest that competitive multiple-choice questions can trigger beneficial retrieval processes for both tested and related information, and the results have implications for the effective use of multiple-choice tests as tools for learning.

  14. Dispositional optimism and sleep quality: a test of mediating pathways.

    Science.gov (United States)

    Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W

    2017-04-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.

  15. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found by the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.
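
    For reference, the classical (unmodified) sequential probability ratio test that the paper builds on can be sketched as follows, using Wald's threshold approximations. The function name and example data are ours, and the error probabilities are treated separately rather than through their sum as in the paper:

```python
import math

def sprt_bernoulli(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for H0: p = p0 vs H1: p = p1
    on a stream of 0/1 observations.

    Returns ("accept H0" | "accept H1" | "inconclusive", samples used).
    Thresholds use Wald's approximations A = (1-beta)/alpha, B = beta/(1-alpha).
    """
    upper = math.log((1 - beta) / alpha)   # crossing above -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing below -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio contribution of one Bernoulli observation
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "inconclusive", len(samples)

decision, n = sprt_bernoulli([1] * 10, p0=0.4, p1=0.7)  # → ("accept H1", 6)
```

    The appeal of the sequential procedure, which the paper's weighted-average criterion optimizes, is that it typically stops well before a fixed-sample-size test of the same error probabilities would.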

  16. Minimum scale controlled topology optimization and experimental test of a micro thermal actuator

    DEFF Research Database (Denmark)

    Heo, S.; Yoon, Gil Ho; Kim, Y.Y.

    2008-01-01

This paper is concerned with the optimal topology design, fabrication and test of a micro thermal actuator. Because the minimum scale was controlled during the design optimization process, the production yield rate of the actuator was improved considerably; alternatively, the optimization design without scale control resulted in a very low yield rate. Using the minimum scale controlling topology design method developed earlier by the authors, micro thermal actuators were designed and fabricated through a MEMS process. Moreover, both their performance and production yield were experimentally tested. The test showed that control over the minimum length scale in the design process greatly improves the yield rate and reduces the performance deviation.

  17. Independent component analysis in non-hypothesis driven metabolomics

    DEFF Research Database (Denmark)

    Li, Xiang; Hansen, Jakob; Zhao, Xinjie

    2012-01-01

In a non-hypothesis-driven metabolomics approach, plasma samples collected at six different time points (before, during and after an exercise bout) were analyzed by gas chromatography-time of flight mass spectrometry (GC-TOF MS). Since independent component analysis (ICA) does not need a priori information on the investigated process and moreover can separate statistically independent source signals with non-Gaussian distribution, we aimed to elucidate the analytical power of ICA for the metabolic pattern analysis and the identification of key metabolites in this exercise study. A novel approach based on descriptive statistics was established to optimize the ICA model. In the GC-TOF MS data set the number of principal components after whitening and the number of independent components of ICA were optimized and systematically selected by descriptive statistics. The elucidated dominating independent

  18. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function or as designated solely by the end-user against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
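
    The kind of group-wise test GSMA automates can be illustrated with a minimal permutation test of a gene set's mean expression change against random sets of the same size. This is our sketch of the general idea, not the GSMA implementation, and all data below are synthetic:

```python
import random

def gene_set_enrichment_p(dataset, gene_set, n_perm=10_000, seed=1):
    """Two-sided permutation p-value: is the mean change of `gene_set` unusually
    large (up or down) relative to random gene sets of the same size?

    dataset: dict gene -> log2 fold change; gene_set: list of gene names.
    """
    values = list(dataset.values())
    k = len(gene_set)
    observed = sum(dataset[g] for g in gene_set) / k
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_perm)
        if abs(sum(rng.sample(values, k)) / k) >= abs(observed)
    )
    return (hits + 1) / (n_perm + 1)  # add-one correction so p is never 0

# toy "dataset": 50 genes with changes near zero, one coordinately upregulated set
rng = random.Random(0)
data = {f"g{i}": rng.gauss(0.0, 0.3) for i in range(50)}
up_set = [f"g{i}" for i in range(5)]
for g in up_set:
    data[g] += 2.0  # simulate group-wise upregulation
p = gene_set_enrichment_p(data, up_set)  # small p: the set is enriched
```

    Polling many gene sets against many datasets, as GSMA does, amounts to repeating this test over a sets-by-datasets matrix (with appropriate multiple-testing control).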

  19. TESTING THE EFFICIENT MARKET HYPOTHESIS ON THE ROMANIAN CAPITAL MARKET

    OpenAIRE

    Daniel Stefan ARMEANU; Sorin-Iulian CIOACA

    2014-01-01

    The Efficient Market Hypothesis (EMH) is one of the leading financial concepts that dominated the economic research over the last 50 years, being one of the pillars of the modern economic science. This theory, developed by Eugene Fama in the `70s, was a landmark in the development of theoretical concepts and models trying to explain the price evolution of financial assets (considering the common assumptions of the main developed theories) and also for the development of some branches in the f...

  20. Sexual selection on land snail shell ornamentation: a hypothesis that may explain shell diversity

    Directory of Open Access Journals (Sweden)

    Schilthuizen Menno

    2003-06-01

    Full Text Available Abstract Background Many groups of land snails show great interspecific diversity in shell ornamentation, which may include spines on the shell and flanges on the aperture. Such structures have been explained as camouflage or defence, but the possibility that they might be under sexual selection has not previously been explored. Presentation of the hypothesis The hypothesis that is presented consists of two parts. First, that shell ornamentation is the result of sexual selection. Second, that such sexual selection has caused the divergence in shell shape in different species. Testing the hypothesis The first part of the hypothesis may be tested by searching for sexual dimorphism in shell ornamentation in gonochoristic snails, by searching for increased variance in shell ornamentation relative to other shell traits, and by mate choice experiments using individuals with experimentally enhanced ornamentation. The second part of the hypothesis may be tested by comparing sister groups and correlating shell diversity with degree of polygamy. Implications of the hypothesis If the hypothesis were true, it would provide an explanation for the many cases of allopatric evolutionary radiation in snails, where shell diversity cannot be related to any niche differentiation or environmental differences.

  1. The Effects of Social Anxiety and State Anxiety on Visual Attention: Testing the Vigilance-Avoidance Hypothesis.

    Science.gov (United States)

    Singh, J Suzanne; Capozzoli, Michelle C; Dodd, Michael D; Hope, Debra A

    2015-01-01

    A growing theoretical and research literature suggests that trait and state social anxiety can predict attentional patterns in the presence of emotional stimuli. The current study adds to this literature by examining the effects of state anxiety on visual attention and testing the vigilance-avoidance hypothesis, using a method of continuous visual attentional assessment. Participants were 91 undergraduate college students with high or low trait fear of negative evaluation (FNE), a core aspect of social anxiety, who were randomly assigned to either a high or low state anxiety condition. Participants engaged in a free-viewing task in which pairs of emotional facial stimuli were presented and eye movements were continuously monitored. Overall, participants with high FNE avoided angry stimuli and participants with high state anxiety attended to positive stimuli. Participants with high state anxiety and high FNE were avoidant of angry faces, whereas participants with low state anxiety and low FNE exhibited a bias toward angry faces. The study provided partial support for the vigilance-avoidance hypothesis. The findings add to the mixed results in the literature that suggest that both positive and negative emotional stimuli may be important in understanding the complex attention patterns associated with social anxiety. Clinical implications and suggestions for future research are discussed.

  2. Experimental test of the PCAC-hypothesis in charged current neutrino and antineutrino interactions on protons

    Science.gov (United States)

    Jones, G. T.; Jones, R. W. L.; Kennedy, B. W.; O'Neale, S. W.; Klein, H.; Morrison, D. R. O.; Schmid, P.; Wachsmuth, H.; Miller, D. B.; Mobayyen, M. M.; Wainstein, S.; Aderholz, M.; Hoffmann, E.; Katz, U. F.; Kern, J.; Schmitz, N.; Wittek, W.; Allport, P.; Myatt, G.; Radojicic, D.; Bullock, F. W.; Burke, S.

    1987-03-01

    Data obtained with the bubble chamber BEBC at CERN are used for the first significant test of Adler's prediction for the neutrino- and antineutrino-proton scattering cross sections at vanishing four-momentum transfer squared Q². An Extended Vector Meson Dominance Model (EVDM) is applied to extrapolate Adler's prediction to experimentally accessible values of Q². The data show good agreement with Adler's prediction for Q² → 0, thus confirming the PCAC hypothesis in the kinematical region of high leptonic energy transfer ν > 2 GeV. The good agreement of the data with the theoretical predictions also at higher Q², where the EVDM terms are dominant, also supports this model. However, an EVDM calculation without PCAC is clearly ruled out by the data.

  3. Experimental test of the PCAC-hypothesis in charged current neutrino and antineutrino interactions on protons

    International Nuclear Information System (INIS)

    Jones, G.T.; Jones, R.W.L.; Kennedy, B.W.; O'Neale, S.W.; Klein, H.; Morrison, D.R.O.; Schmid, P.; Wachsmuth, H.; Allport, P.; Myatt, G.; Radojicic, D.; Bullock, F.W.; Burke, S.

    1987-01-01

    Data obtained with the bubble chamber BEBC at CERN are used for the first significant test of Adler's prediction for the neutrino- and antineutrino-proton scattering cross sections at vanishing four-momentum transfer squared Q². An Extended Vector Meson Dominance Model (EVDM) is applied to extrapolate Adler's prediction to experimentally accessible values of Q². The data show good agreement with Adler's prediction for Q² → 0, thus confirming the PCAC hypothesis in the kinematical region of high leptonic energy transfer ν > 2 GeV. The good agreement of the data with the theoretical predictions also at higher Q², where the EVDM terms are dominant, also supports this model. However, an EVDM calculation without PCAC is clearly ruled out by the data. (orig.)

  4. Helminth community structure and diet of three Afrotropical anuran species: a test of the interactive-versus-isolationist parasite communities hypothesis

    Directory of Open Access Journals (Sweden)

    G. C. Akani

    2011-09-01

    Full Text Available The interactive-versus-isolationist hypothesis predicts that parasite communities should be depauperate and weakly structured by interspecific competition in amphibians. A parasitological survey was carried out to test this hypothesis using three anuran species from Nigeria, tropical Africa (one Bufonidae, two Ranidae). High values of parasite infection parameters were found in all three species, which were infected by nematodes, cestodes and trematodes. Nonetheless, the parasite communities of the three anurans were very depauperate in terms of number of species (4 to 6). Interspecific competition was irrelevant in all species, as revealed by null models and Monte Carlo permutations. Cluster analyses revealed that, in terms of parasite community composition, the two Ranidae were similar, whereas the Bufonidae was more distinct. However, when prevalence, intensity, and abundance of parasites are combined into a multivariate analysis, each anuran species is clearly spaced apart from the others, thus revealing considerable species-specific differences in terms of their parasite communities. All anurans were generalists and probably opportunistic in terms of dietary habits, and showed no evidence of interspecific competition for food. Overall, our data are widely consistent with expectations derived from the interactive-versus-isolationist parasite communities hypothesis.

  5. Testing in mice the hypothesis that melanin is protective in malaria infections.

    Directory of Open Access Journals (Sweden)

    Michael Waisberg

    Full Text Available Malaria has had the largest impact of any infectious disease on shaping the human genome, exerting enormous selective pressure on genes that improve survival in severe malaria infections. Modern humans originated in Africa and lost skin melanization as they migrated to temperate regions of the globe. Although it is well documented that loss of melanization improved cutaneous Vitamin D synthesis, melanin plays an evolutionary ancient role in insect immunity to malaria and in some instances melanin has been implicated to play an immunoregulatory role in vertebrates. Thus, we tested the hypothesis that melanization may be protective in malaria infections using mouse models. Congenic C57BL/6 mice that differed only in the gene encoding tyrosinase, a key enzyme in the synthesis of melanin, showed no difference in the clinical course of infection by Plasmodium yoelii 17XL, that causes severe anemia, Plasmodium berghei ANKA, that causes severe cerebral malaria or Plasmodium chabaudi AS that causes uncomplicated chronic disease. Moreover, neither genetic deficiencies in vitamin D synthesis nor vitamin D supplementation had an effect on survival in cerebral malaria. Taken together, these results indicate that neither melanin nor vitamin D production improve survival in severe malaria.

  6. Brain morphology of the threespine stickleback (Gasterosteus aculeatus) varies inconsistently with respect to habitat complexity: A test of the Clever Foraging Hypothesis.

    Science.gov (United States)

    Ahmed, Newaz I; Thompson, Cole; Bolnick, Daniel I; Stuart, Yoel E

    2017-05-01

    The Clever Foraging Hypothesis asserts that organisms living in a more spatially complex environment will have a greater neurological capacity for cognitive processes related to spatial memory, navigation, and foraging. Because the telencephalon is often associated with spatial memory and navigation tasks, this hypothesis predicts a positive association between telencephalon size and environmental complexity. The association between habitat complexity and brain size has been supported by comparative studies across multiple species but has not been widely studied at the within-species level. We tested for covariation between environmental complexity and neuroanatomy of threespine stickleback (Gasterosteus aculeatus) collected from 15 pairs of lakes and their parapatric streams on Vancouver Island. In most pairs, neuroanatomy differed between the adjoining lake and stream populations. However, the magnitude and direction of this difference were inconsistent between watersheds and did not covary strongly with measures of within-site environmental heterogeneity. Overall, we find weak support for the Clever Foraging Hypothesis in our study.

  7. The Cognitive Mediation Hypothesis Revisited: An Empirical Response to Methodological and Theoretical Criticism.

    Science.gov (United States)

    Romero, Anna A.; And Others

    1996-01-01

    In order to address criticisms raised against the cognitive mediation hypothesis, three experiments were conducted to develop a more direct test of the hypothesis. Taken together, the three experiments provide converging support for the cognitive mediation hypothesis, reconfirming the central role of cognition in the persuasion process.…

  8. Tracing the footsteps of Sherlock Holmes: cognitive representations of hypothesis testing.

    Science.gov (United States)

    Van Wallendael, L R; Hastie, R

    1990-05-01

    A well-documented phenomenon in opinion-revision literature is subjects' failure to revise probability estimates for an exhaustive set of mutually exclusive hypotheses in a complementary manner. However, prior research has not addressed the question of whether such behavior simply represents a misunderstanding of mathematical rules, or whether it is a consequence of a cognitive representation of hypotheses that is at odds with the Bayesian notion of a set relationship. Two alternatives to the Bayesian representation, a belief system (Shafer, 1976) and a system of independent hypotheses, were proposed, and three experiments were conducted to examine cognitive representations of hypothesis sets in the testing of multiple competing hypotheses. Subjects were given brief murder mysteries to solve and allowed to request various types of information about the suspects; after having received each new piece of information, subjects rated each suspect's probability of being the murderer. Presence and timing of suspect eliminations were varied in the first two experiments; the final experiment involved the varying of percentages of clues that referred to more than one suspect (for example, all of the female suspects). The noncomplementarity of opinion revisions remained a strong phenomenon in all conditions. Information-search data refuted the idea that subjects represented hypotheses as a Bayesian set; further study of the independent hypotheses theory and Shaferian belief functions as descriptive models is encouraged.
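
    The Bayesian "set relationship" that subjects' revisions violated can be made concrete: under Bayes' rule over an exhaustive set of mutually exclusive hypotheses, raising one suspect's probability necessarily lowers the others so the total stays at 1. A minimal sketch, with hypothetical suspects, priors, and likelihoods:

```python
# Normative Bayesian benchmark for the murder-mystery task. Complementarity
# holds by construction: posteriors over an exhaustive, mutually exclusive
# suspect set always sum to 1. Suspects and likelihoods are illustrative.

def update(priors, likelihoods):
    """Bayes' rule: posterior ∝ prior × likelihood, renormalized over the set."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

priors = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}
# A clue twice as likely if A is guilty as under any other hypothesis
posterior = update(priors, {"A": 0.8, "B": 0.4, "C": 0.4, "D": 0.4})
print(round(posterior["A"], 6))            # 0.4
print(round(sum(posterior.values()), 6))   # 1.0
```

    Raising A from 0.25 to 0.4 forces the other three down to 0.2 each; subjects who instead left the others unchanged produced the noncomplementary revisions described above.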

  9. Prevalence of hardcore smoking in the Netherlands between 2001 and 2012: a test of the hardening hypothesis

    Directory of Open Access Journals (Sweden)

    Jeroen Bommelé

    2016-08-01

    Full Text Available Abstract Background Hardcore smokers are smokers who have smoked for many years and who do not intend to quit smoking. The “hardening hypothesis” states that light smokers are more likely to quit smoking than heavy smokers (such as hardcore smokers). Therefore, the prevalence of hardcore smoking among smokers would increase over time. If this is true, the smoking population would become harder to reach with tobacco control measures. In this study we tested the hardening hypothesis. Methods We calculated the prevalence of hardcore smoking in the Netherlands from 2001 to 2012. Smokers were ‘hardcore’ if they (a) smoked every day, (b) smoked on average 15 cigarettes per day or more, (c) had not attempted to quit in the past 12 months, and (d) had no intention to quit within 6 months. We used logistic regression models to test whether the prevalence changed over time. We also investigated whether trends differed between educational levels. Results Among smokers, the prevalence of hardcore smoking decreased from 40.8% in 2001 to 32.2% in 2012. In the general population, it decreased from 12.2% to 8.2%. Hardcore smokers were significantly less educated than non-hardcore smokers. Among the general population, the prevalence of hardcore smoking decreased more among higher educated people than among lower educated people. Conclusions We found no support for the hardening hypothesis in the Netherlands between 2001 and 2012. Instead, the decrease of hardcore smoking among smokers suggests a ‘softening’ of the smoking population.
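
    The study tested the trend with logistic regression; as a simpler stand-in, a two-proportion z-test on the endpoint years shows why the reported drop (40.8% in 2001 to 32.2% in 2012 among smokers) is statistically convincing. Only the two prevalences come from the abstract; the yearly sample sizes (n = 1000 each) are hypothetical:

```python
import math

# Two-proportion z-test (pooled variance) as an illustrative substitute for
# the paper's logistic-regression trend test. Sample sizes are hypothetical.

def two_proportion_z(p1, n1, p2, n2):
    """z-statistic for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(0.408, 1000, 0.322, 1000)
print(round(z, 2))  # 3.99 — well above the 1.96 threshold at alpha = 0.05
```

    With samples of this size the decrease is many standard errors from zero, consistent with the 'softening' conclusion.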

  10. A Test of the Optimality Approach to Modelling Canopy gas Exchange by Natural Vegetation

    Science.gov (United States)

    Schymanski, S. J.; Sivapalan, M.; Roderick, M. L.; Beringer, J.; Hutley, L. B.

    2005-12-01

    Natural vegetation has co-evolved with its environment over a long period of time and natural selection has led to a species composition that is most suited for the given conditions. Part of this adaptation is the vegetation's water use strategy, which determines the amount and timing of water extraction from the soil. Knowing that water extraction by vegetation often accounts for over 90% of the annual water balance in some places, we need to understand its controls if we want to properly model the hydrologic cycle. Water extraction by roots is driven by transpiration from the canopy, which in turn is an inevitable consequence of CO2 uptake for photosynthesis. Photosynthesis provides plants with their main building material, carbohydrates, and with the energy necessary to thrive and prosper in their environment. Therefore we expect that natural vegetation would have evolved an optimal water use strategy to maximise its 'net carbon profit' (the difference between carbon acquired by photosynthesis and carbon spent on maintenance of the organs involved in its uptake). Based on this hypothesis and on an ecophysiological gas exchange and photosynthesis model (Cowan and Farquhar 1977; von Caemmerer 2000), we model the optimal vegetation for a site in Howard Springs (N.T., Australia) and compare the modelled fluxes with measurements by Beringer, Hutley et al. (2003). The comparison gives insights into theoretical and real controls on transpiration and photosynthesis and tests the optimality approach to modelling gas exchange of natural vegetation with unknown properties. The main advantage of the optimality approach is that no assumptions about the particular vegetation on a site are needed, which makes it very powerful for predicting vegetation response to long-term climate- or land use change. Literature: Beringer, J., L. B. Hutley, et al. (2003). "Fire impacts on surface heat, moisture and carbon fluxes from a tropical savanna in northern Australia." International

  11. A Multi-Verse Optimizer with Levy Flights for Numerical Optimization and Its Application in Test Scheduling for Network-on-Chip.

    Directory of Open Access Journals (Sweden)

    Cong Hu

    Full Text Available We propose a new meta-heuristic algorithm named Levy flights multi-verse optimizer (LFMVO), which incorporates Levy flights into the multi-verse optimizer (MVO) algorithm to solve numerical and engineering optimization problems. The original MVO easily falls into stagnation when wormholes stochastically re-span a number of universes (solutions) around the best universe achieved over the course of iterations. Since Levy flights are superior in exploring unknown, large-scale search spaces, they are integrated into the previous best universe to force MVO out of stagnation. We test this method on three sets of 23 well-known benchmark test functions and an NP-complete problem of test scheduling for Network-on-Chip (NoC). Experimental results prove that the proposed LFMVO is more competitive than its peers in both the quality of the resulting solutions and convergence speed.
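
    The abstract does not detail how the Levy steps are generated; a standard choice (and an assumption here) is Mantegna's algorithm for symmetric Levy-stable steps. The sketch below shows only the step generator and the property that makes it useful for escaping stagnation, mostly small moves with rare long jumps; the coupling to MVO's wormhole mechanism is omitted:

```python
import math
import random

# Mantegna's algorithm for Levy-flight steps with stability index beta.
# step = u / |v|^(1/beta), u ~ N(0, sigma^2), v ~ N(0, 1), with sigma chosen
# so the step distribution approximates a symmetric Levy-stable law.

def levy_step(beta=1.5):
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = random.gauss(0.0, sigma)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

random.seed(42)
steps = [levy_step() for _ in range(10000)]
median_abs = sorted(abs(s) for s in steps)[len(steps) // 2]
print(median_abs < 2.0)                      # True: typical step is small
print(max(abs(s) for s in steps) > 10.0)     # True: heavy tail gives rare jumps
```

    In an optimizer, such a step would perturb the best universe's coordinates, so most iterations refine locally while occasional long jumps relocate the search.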

  12. A Bayesian optimal design for degradation tests based on the inverse Gaussian process

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Weiwen; Liu, Yu; Li, Yan Feng; Zhu, Shun Peng; Huang, Hong Zhong [University of Electronic Science and Technology of China, Chengdu (China)

    2014-10-15

    The inverse Gaussian process is recently introduced as an attractive and flexible stochastic process for degradation modeling. This process has been demonstrated as a valuable complement for models that are developed on the basis of the Wiener and gamma processes. We investigate the optimal design of the degradation tests on the basis of the inverse Gaussian process. In addition to an optimal design with pre-estimated planning values of model parameters, we also address the issue of uncertainty in the planning values by using the Bayesian method. An average pre-posterior variance of reliability is used as the optimization criterion. A trade-off between sample size and number of degradation observations is investigated in the degradation test planning. The effects of priors on the optimal designs and on the value of prior information are also investigated and quantified. The degradation test planning of a GaAs Laser device is performed to demonstrate the proposed method.
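
    A degradation test plan of the kind described above is evaluated by simulating inverse Gaussian increments. The sketch below draws IG(mu, lam) variates with the Michael-Schucany-Haas transform, the usual exact sampler; the parameter values are illustrative, not from the paper:

```python
import math
import random

# Exact inverse Gaussian sampling (Michael, Schucany & Haas, 1976).
# IG(mu, lam) has mean mu and variance mu^3 / lam; sums of such increments
# form the monotone degradation paths used in IG-process test planning.

def sample_inverse_gaussian(mu, lam):
    y = random.gauss(0.0, 1.0) ** 2
    x = (mu + mu * mu * y / (2.0 * lam)
         - mu / (2.0 * lam) * math.sqrt(4.0 * mu * lam * y + (mu * y) ** 2))
    # Accept the smaller root with probability mu / (mu + x), else take mu^2 / x
    return x if random.random() <= mu / (mu + x) else mu * mu / x

random.seed(7)
mu, lam = 1.0, 10.0
draws = [sample_inverse_gaussian(mu, lam) for _ in range(20000)]
mean = sum(draws) / len(draws)
print(abs(mean - mu) < 0.02)  # True: sample mean is close to mu
```

    Sweeping sample size and number of inspections over such simulated paths is the computational core of the trade-off study the abstract describes.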

  13. Tests of the salt-nuclei hypothesis of rain formation

    Energy Technology Data Exchange (ETDEWEB)

    Woodcock, A H; Blanchard, D C

    1955-01-01

    Atmospheric chlorides in sea-salt nuclei and the chlorides dissolved in shower rainwaters were recently measured in Hawaii. A comparison of these measurements reveals the remarkable fact that the weight of chloride present in a certain number of nuclei in a cubic meter of clear air tends to be equal to the weight of chloride dissolved in an equal number of raindrops in a cubic meter of rainy air. This result is explained as an indication that the raindrops grow on the salt nuclei in some manner which prevents a marked change in the distribution of these nuclei during the drop-growth process. The data presented add new evidence in further support of the salt-nuclei raindrop hypothesis previously proposed by the first author.

  14. Recent tests of the equilibrium-point hypothesis (lambda model).

    Science.gov (United States)

    Feldman, A G; Ostry, D J; Levin, M F; Gribble, P L; Mitnitski, A B

    1998-07-01

    The lambda model of the equilibrium-point hypothesis (Feldman & Levin, 1995) is an approach to motor control which, like physics, is based on a logical system coordinating empirical data. The model has gone through an interesting period. On one hand, several nontrivial predictions of the model have been successfully verified in recent studies. In addition, the explanatory and predictive capacity of the model has been enhanced by its extension to multimuscle and multijoint systems. On the other hand, claims have recently appeared suggesting that the model should be abandoned. The present paper focuses on these claims and concludes that they are unfounded. Much of the experimental data that have been used to reject the model are actually consistent with it.

  15. Experimental tests of a superposition hypothesis to explain the relationship between the vestibuloocular reflex and smooth pursuit during horizontal combined eye-head tracking in humans

    Science.gov (United States)

    Huebner, W. P.; Leigh, R. J.; Seidman, S. H.; Thomas, C. W.; Billian, C.; DiScenna, A. O.; Dell'Osso, L. F.

    1992-01-01

    1. We used a modeling approach to test the hypothesis that, in humans, the smooth pursuit (SP) system provides the primary signal for cancelling the vestibuloocular reflex (VOR) during combined eye-head tracking (CEHT) of a target moving smoothly in the horizontal plane. Separate models for SP and the VOR were developed. The optimal values of parameters of the two models were calculated using measured responses of four subjects to trials of SP and the visually enhanced VOR. After optimal parameter values were specified, each model generated waveforms that accurately reflected the subjects' responses to SP and vestibular stimuli. The models were then combined into a CEHT model wherein the final eye movement command signal was generated as the linear summation of the signals from the SP and VOR pathways. 2. The SP-VOR superposition hypothesis was tested using two types of CEHT stimuli, both of which involved passive rotation of subjects in a vestibular chair. The first stimulus consisted of a "chair brake" or sudden stop of the subject's head during CEHT; the visual target continued to move. The second stimulus consisted of a sudden change from the visually enhanced VOR to CEHT ("delayed target onset" paradigm); as the vestibular chair rotated past the angular position of the stationary visual stimulus, the latter started to move in synchrony with the chair. Data collected during experiments that employed these stimuli were compared quantitatively with predictions made by the CEHT model. 3. During CEHT, when the chair was suddenly and unexpectedly stopped, the eye promptly began to move in the orbit to track the moving target. Initially, gaze velocity did not completely match target velocity, however; this finally occurred approximately 100 ms after the brake onset. The model did predict the prompt onset of eye-in-orbit motion after the brake, but it did not predict that gaze velocity would initially be only approximately 70% of target velocity. One possible

  16. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available This paper examines the relationship between current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some studies support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy, while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policy measures for reducing budget deficits, such as reduction in non-development expenditure, enhancement of domestic revenue collection, and active efforts against corruption and tax evasion, should be adopted. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.

  17. THE FRACTAL MARKET HYPOTHESIS

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRAU

    2012-05-01

    Full Text Available In this article, the concept of capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and it explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and, of course, the manner in which they interpret that information may differ. Also, the Fractal Market Hypothesis refers to the way that liquidity and investment horizons influence the behaviour of financial investors.

  18. Optimal Testing Intervals in the Squatting Test to Determine Baroreflex Sensitivity

    OpenAIRE

    Ishitsuka, S.; Kusuyama, N.; Tanaka, M.

    2014-01-01

    The recently introduced “squatting test” (ST) utilizes a simple postural change to perturb the blood pressure and to assess baroreflex sensitivity (BRS). In our study, we estimated the reproducibility of and the optimal testing interval between the STs in healthy volunteers. Thirty-four subjects free of cardiovascular disorders and taking no medication were instructed to perform the repeated ST at 30-sec, 1-min, and 3-min intervals in duplicate in a random sequence, while the systolic blood p...

  19. THE FRACTAL MARKET HYPOTHESIS

    OpenAIRE

    FELICIA RAMONA BIRAU

    2012-01-01

    In this article, the concept of capital market is analysed using Fractal Market Hypothesis which is a modern, complex and unconventional alternative to classical finance methods. Fractal Market Hypothesis is in sharp opposition to Efficient Market Hypothesis and it explores the application of chaos theory and fractal geometry to finance. Fractal Market Hypothesis is based on certain assumption. Thus, it is emphasized that investors did not react immediately to the information they receive and...

  20. Sex differences in DNA methylation and expression in zebrafish brain: a test of an extended 'male sex drive' hypothesis.

    Science.gov (United States)

    Chatterjee, Aniruddha; Lagisz, Malgorzata; Rodger, Euan J; Zhen, Li; Stockwell, Peter A; Duncan, Elizabeth J; Horsfield, Julia A; Jeyakani, Justin; Mathavan, Sinnakaruppan; Ozaki, Yuichi; Nakagawa, Shinichi

    2016-09-30

    The sex drive hypothesis predicts that stronger selection on male traits has resulted in masculinization of the genome. Here we test whether such masculinizing effects can be detected at the level of the transcriptome and methylome in the adult zebrafish brain. Although methylation is globally similar, we identified 914 specific differentially methylated CpGs (DMCs) between males and females (435 were hypermethylated and 479 were hypomethylated in males compared to females). These DMCs were prevalent in gene bodies, intergenic regions and CpG island shores. We also discovered 15 distinct CpG clusters with striking sex-specific DNA methylation differences. In contrast, at the transcriptome level, more female-biased genes than male-biased genes were expressed, giving little support for the male sex drive hypothesis. Our study provides a genome-wide methylome and transcriptome assessment and sheds light on sex-specific epigenetic patterns in zebrafish for the first time. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  2. Humans have evolved specialized skills of social cognition: the cultural intelligence hypothesis.

    Science.gov (United States)

    Herrmann, Esther; Call, Josep; Hernàndez-Lloreda, Maráa Victoria; Hare, Brian; Tomasello, Michael

    2007-09-07

    Humans have many cognitive skills not possessed by their nearest primate relatives. The cultural intelligence hypothesis argues that this is mainly due to a species-specific set of social-cognitive skills, emerging early in ontogeny, for participating and exchanging knowledge in cultural groups. We tested this hypothesis by giving a comprehensive battery of cognitive tests to large numbers of two of humans' closest primate relatives, chimpanzees and orangutans, as well as to 2.5-year-old human children before literacy and schooling. Supporting the cultural intelligence hypothesis and contradicting the hypothesis that humans simply have more "general intelligence," we found that the children and chimpanzees had very similar cognitive skills for dealing with the physical world but that the children had more sophisticated cognitive skills than either of the ape species for dealing with the social world.

  3. A novel hypothesis splitting method implementation for multi-hypothesis filters

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free. The multi-hypothesis filters commonly approximate the distribution tran...

  4. Optimal testing input sets for reduced diagnosis time of nuclear power plant digital electronic circuits

    International Nuclear Information System (INIS)

    Kim, D.S.; Seong, P.H.

    1994-01-01

    This paper describes the optimal testing input sets required for the fault diagnosis of nuclear power plant digital electronic circuits. For complicated systems such as very large scale integration (VLSI) circuits, nuclear power plants (NPP), and aircraft, testing is the major factor in system maintenance, and diagnosis time grows quickly with the complexity of the component. In this research, to reduce diagnosis time, the authors derived the optimal testing sets, that is, the minimal testing sets required for detecting a failure and for locating the failed component. Among many conventional methods, the technique presented by Hayes fits this approach to testing-set generation best. However, that method has two disadvantages: (a) it considers only simple networks, and (b) it determines only whether the system is in a failed state and does not provide a way to locate the failed component. The authors therefore derived optimal testing input sets that resolve these problems while preserving the advantages of Hayes' technique. When they applied the optimal testing sets to the automatic fault diagnosis system (AFDS), which incorporates an advanced artificial intelligence fault diagnosis technique, they found that fault diagnosis using the optimal testing sets is much faster than diagnosis using exhaustive testing input sets; when they applied them to test the Universal (UV) Card, a nuclear power plant digital input/output solid state protection system card, the testing time was reduced by a factor of about 100.
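
    The abstract does not spell out the derivation, but the core idea of a minimal testing set, that is, the fewest inputs that still detect every modeled fault, can be sketched as a greedy set cover over fault signatures. This is a generic illustration, not Hayes's procedure or the authors' exact algorithm, and the test/fault names are hypothetical:

```python
# Greedy selection of a small testing input set: repeatedly pick the input
# that exposes the most still-undetected faults. Illustrative sketch only;
# the detection table below is invented.

detects = {            # test input -> faults whose effect it exposes
    "t1": {"f1", "f2"},
    "t2": {"f2", "f3"},
    "t3": {"f3", "f4"},
    "t4": {"f1", "f4"},
}

def greedy_test_set(detects):
    uncovered = set().union(*detects.values())
    chosen = []
    while uncovered:
        best = max(detects, key=lambda t: len(detects[t] & uncovered))
        chosen.append(best)
        uncovered -= detects[best]
    return chosen

tests = greedy_test_set(detects)
print(len(tests))  # 2 — two inputs detect all four faults; exhaustive = 4
```

    Locating (not just detecting) a failed component needs the stronger condition that every pair of faults produce distinct response patterns, which further constrains the chosen set.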

  5. Optimal Test Characteristics for Maximal Anaerobic Work on the Bicycle Ergometer

    Science.gov (United States)

    Katch, Victor; And Others

    1977-01-01

    Data from two separate experiments conducted to ascertain the optimum protocol for a maximum anaerobic work output test on the bicycle ergometer indicated that the test duration needs to be approximately forty seconds and the optimal frictional resistance five to six kilograms. (MB)

  6. Optimizing conditions for an accelerated leach test

    International Nuclear Information System (INIS)

    Pietrzak, R.F.; Fuhrmann, M.; Heiser, J.; Franz, E.M.; Colombo, P.

    1988-01-01

    An accelerated leach test for low-level radioactive waste forms is being developed to provide, in a short time, data that can be extrapolated to long time periods. The approach is to provide experimental conditions that accelerate leaching without changing the dominant release mechanism. Experimental efforts have focused on combining individual factors that have been observed to accelerate leaching: elevated temperature, increased leachant volume, and reduced specimen size. The response of diffusion coefficients to various acceleration factors has been evaluated, providing information on the experimental parameters that need to be optimized to increase leach rates. Preliminary modeling using a diffusion mechanism (allowing for depletion) in a finite cylinder geometry indicates that during the early portions of experiments (daily sampling intervals), leaching is diffusion controlled and more rapid than later in the same experiments (weekly or longer sampling intervals). For cement waste forms, this reduction in rate may be partially controlled by changes in physical structure and chemistry (sometimes related to environmental influences such as CO2), but it is more likely associated with the duration of the sampling interval. By combining mathematical modeling with experimental investigation of the factors controlling leach rates, a more complete understanding of leaching processes is being developed. This, in turn, is leading to optimized accelerating conditions for a leach test.
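    The diffusion-controlled release described above can be illustrated with the textbook semi-infinite-medium approximation (which, unlike the paper's finite-cylinder model, ignores depletion); the specimen dimensions and diffusion coefficient below are hypothetical:

```python
import math

def cumulative_fraction_leached(D, t, surface, volume):
    """Semi-infinite-medium diffusion estimate (no depletion):
    CFR = 2 * (S/V) * sqrt(D * t / pi)."""
    return 2.0 * (surface / volume) * math.sqrt(D * t / math.pi)

# Hypothetical cement cylinder: radius 2.5 cm, height 5 cm.
r, h = 2.5, 5.0
surface = 2 * math.pi * r * h + 2 * math.pi * r**2
volume = math.pi * r**2 * h
D = 1e-9  # effective diffusion coefficient, cm^2/s (hypothetical)

day = 86400.0
f1 = cumulative_fraction_leached(D, 1 * day, surface, volume)
f4 = cumulative_fraction_leached(D, 4 * day, surface, volume)
# Diffusion control: quadrupling time doubles the cumulative release,
# and reducing specimen size (raising S/V) accelerates it proportionally.
print(f1, f4 / f1)  # ratio -> 2.0
```

    The sqrt(t) signature is why daily sampling intervals show faster apparent rates than weekly ones, and why smaller specimens act as an acceleration factor.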

  7. Optimization of ultrasonic tube testing with concentric transducers

    International Nuclear Information System (INIS)

    Dufayet, J.-P.; Gambin, Raymond.

    1978-01-01

    In order to test tubes by ultrasonics without rotation, concentric transducers can be used with conical mirrors to detect transverse defects and with helically shaped mirrors to detect longitudinal defects. Further optimization studies have been carried out to make the system fully operational. The respective advantages of the rotating screen and of our specially designed sectorial transducers are discussed.

  8. Brood desertion by female shorebirds : a test of the differential parental capacity hypothesis on Kentish plovers

    NARCIS (Netherlands)

    Amat, JA; Visser, GH; Perez-Hurtado, A; Arroyo, GM

    2000-01-01

    The aim of this study was to examine whether the energetic costs of reproduction explain offspring desertion by female shorebirds, as is suggested by the differential parental capacity hypothesis. A prediction of the hypothesis is that, in species with biparental incubation in which females desert

  9. Generating optimal states for a homodyne Bell test

    International Nuclear Information System (INIS)

    Daffer, S.; Knight, P.L.

    2005-01-01

    Full text: We present a protocol that produces a conditionally prepared state that can be used for a Bell test based on homodyne detection. Based on the results of Munro, the state is near-optimal for Bell inequality violations based on quadrature-phase homodyne measurements that use correlated photon-number states. The scheme utilizes the Gaussian entanglement distillation protocol of Eisert et al. and uses only beam splitters and photodetection to conditionally prepare a non-Gaussian state from a source of two-mode squeezed states with low squeezing parameter, permitting a loophole-free test of Bell inequalities. (author)

  10. Multiple Choice Testing and the Retrieval Hypothesis of the Testing Effect

    Science.gov (United States)

    Sensenig, Amanda E.

    2010-01-01

    Taking a test often leads to enhanced later memory for the tested information, a phenomenon known as the "testing effect". This memory advantage has been reliably demonstrated with recall tests but not multiple choice tests. One potential explanation for this finding is that multiple choice tests do not rely on retrieval processes to the same…

  11. Social learning and evolution: the cultural intelligence hypothesis

    Science.gov (United States)

    van Schaik, Carel P.; Burkart, Judith M.

    2011-01-01

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer. PMID:21357223

  13. Method to determine the optimal constitutive model from spherical indentation tests

    Directory of Open Access Journals (Sweden)

    Tairui Zhang

    2018-03-01

    The limitation of current indentation theories was investigated and a method to determine the optimal constitutive model through spherical indentation tests was proposed. Two constitutive models, the Power-law and the Linear-law, were used in Finite Element (FE) calculations, and a set of indentation governing equations was established for each model. The load-depth data from the normal indentation depth were used to fit the best parameters in each constitutive model, while the data from the further-loading part were compared with those from FE calculations; the model that better predicted the further deformation was considered the optimal one. Moreover, a Young's modulus calculation model, which takes the previous plastic deformation and the phenomenon of pile-up (or sink-in) into consideration, was proposed to revise the original Sneddon-Pharr-Oliver model. The indentation results on six materials (304, 321, SA508, SA533, 15CrMoR, and Fv520B) were compared with tensile ones, which validated the reliability of the revised E calculation model and the optimal constitutive model determination method in this study. Keywords: Optimal constitutive model, Spherical indentation test, Finite Element calculations, Young's modulus
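    The model-selection idea, fitting each constitutive law on the normal loading range and judging it by its prediction on the further-loading range, can be sketched on synthetic stress-strain data (all numbers hypothetical, standing in for the paper's FE-derived governing equations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stress-strain data that actually follows a power law.
strain = np.linspace(0.01, 0.20, 40)
stress = 900.0 * strain**0.2 + rng.normal(0.0, 2.0, strain.size)

fit, hold = strain <= 0.12, strain > 0.12   # "normal" vs "further loading"

# Power law sigma = K * eps^n, fitted as a line in log-log space.
n, logK = np.polyfit(np.log(strain[fit]), np.log(stress[fit]), 1)
pred_pow = np.exp(logK) * strain[hold]**n

# Linear law sigma = a + b * eps.
b, a = np.polyfit(strain[fit], stress[fit], 1)
pred_lin = a + b * strain[hold]

err_pow = np.mean((pred_pow - stress[hold])**2)
err_lin = np.mean((pred_lin - stress[hold])**2)
best = "power" if err_pow < err_lin else "linear"
print(best)  # the law that better predicts the further deformation
```

    Because the held-out range is an extrapolation, the comparison rewards the constitutive form that captures the material's true hardening behavior rather than the one that merely fits the calibration range.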

  14. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions is asked concerning this condition, regarding its name, the consensus about the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite, and the technique employed are fundamental to success. PMID:19756187

  15. A hypothesis-testing framework for studies investigating ontogenetic niche shifts using stable isotope ratios.

    Directory of Open Access Journals (Sweden)

    Caroline M Hammerschlag-Peyer

    Ontogenetic niche shifts occur across diverse taxonomic groups, and can have critical implications for population dynamics, community structure, and ecosystem function. In this study, we provide a hypothesis-testing framework combining univariate and multivariate analyses to examine ontogenetic niche shifts using stable isotope ratios. This framework is based on three distinct ontogenetic niche shift scenarios, i.e., (1) no niche shift, (2) niche expansion/reduction, and (3) a discrete niche shift between size classes. We developed criteria for identifying each scenario, based on three important resource use characteristics, i.e., niche width, niche position, and niche overlap. We provide an empirical example for each ontogenetic niche shift scenario, illustrating differences in resource use characteristics among different organisms. The present framework provides a foundation for future studies on ontogenetic niche shifts, and can also be applied to examine resource variability among other population sub-groupings (e.g., by sex or phenotype).
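    The three resource-use characteristics can be given simple numeric stand-ins, assuming hypothetical delta13C/delta15N values for two size classes; the axis-range overlap below is a crude substitute for the ellipse- or hull-based metrics typically used in isotope ecology:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical (delta13C, delta15N) samples for two size classes.
small = rng.normal([-18.0, 8.0], [1.0, 0.5], size=(60, 2))
large = rng.normal([-15.0, 11.0], [1.0, 0.5], size=(60, 2))

def niche_position(x):
    return x.mean(axis=0)            # centroid in isotope space

def niche_width(x):
    return x.std(axis=0, ddof=1)     # spread along each isotope axis

def range_overlap(a, b):
    """Fraction of axis-aligned range overlap per axis (0..1)."""
    lo = np.maximum(a.min(axis=0), b.min(axis=0))
    hi = np.minimum(a.max(axis=0), b.max(axis=0))
    shared = np.clip(hi - lo, 0.0, None)
    total = np.maximum(a.max(axis=0), b.max(axis=0)) \
          - np.minimum(a.min(axis=0), b.min(axis=0))
    return shared / total

shift = np.linalg.norm(niche_position(large) - niche_position(small))
ov = range_overlap(small, large)
print(shift, ov)  # large centroid shift -> discrete niche shift scenario
```

    Comparing width, position, and overlap between size classes is what separates the "no shift", "expansion/reduction", and "discrete shift" scenarios.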

  16. UHPC for Blast and Ballistic Protection, Explosion Testing and Composition Optimization

    Science.gov (United States)

    Bibora, P.; Drdlová, M.; Prachař, V.; Sviták, O.

    2017-10-01

    The realization of high performance concrete resistant to detonation is the aim and expected outcome of the presented project, which is oriented to the development of construction materials for larger structures such as protective walls and bunkers. The use of high-strength / high-performance concrete (HSC/HPC) and ultra-high-performance fiber-reinforced concrete (UHPC/UHPFRC) seems optimal for this purpose. The paper describes the research phase of the project, in which we focused on the selection of specific raw materials and chemical additives, including determining the most suitable type and amount of distributed fiber reinforcement. The composition of the UHPC was optimized during laboratory manufacture of test specimens to obtain the desired physical-mechanical properties of the developed high performance concretes. In connection with the laboratory testing, explosion field tests of UHPC specimens were performed and the explosion resistance of the laboratory-produced UHPC testing boards was investigated.

  17. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, comprehensive testing and analysis software was developed, and a number of conclusions and hypotheses were drawn for further research.

  18. Testing the Optimality of Consumption Decisions of the Representative Household: Evidence from Brazil

    Directory of Open Access Journals (Sweden)

    Marcos Gesteira Costa

    2015-09-01

    This paper investigates whether there is a fraction of consumers that do not behave as fully forward-looking optimal consumers in the Brazilian economy. The generalized method of moments technique was applied to nonlinear Euler equations of the consumption-based capital asset pricing model, contemplating utility functions with time separability and non-separability. The results show that when the household utility function was modeled as constant relative risk aversion, external habits, and Kreps–Porteus, the estimated fraction of rule-of-thumb households was, respectively, 89%, 78% and 22%. Accordingly, a portion of disposable income goes to households that consume their current income, in violation of the permanent income hypothesis.
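    The rule-of-thumb fraction can be illustrated in a much-simplified linear setting, a Campbell–Mankiw style regression on synthetic data rather than the paper's nonlinear GMM on Euler equations: if a share lambda of households consumes current income, the slope of aggregate consumption growth on income growth recovers that share.

```python
import numpy as np

rng = np.random.default_rng(2)
T, lam_true = 5000, 0.4   # 40% rule-of-thumb households (hypothetical)

dy = rng.normal(0.02, 0.05, T)            # disposable-income growth
eps = rng.normal(0.0, 0.01, T)            # optimizers' consumption innovation
dc = lam_true * dy + (1 - lam_true) * eps # aggregate consumption growth

# Regress dc on dy: the slope estimates the rule-of-thumb share lambda.
# (Under the permanent income hypothesis the slope would be ~0, since
# forward-looking consumption does not track predictable income growth.)
lam_hat, intercept = np.polyfit(dy, dc, 1)
print(round(lam_hat, 2))  # close to 0.4
```

    In real data the regressor is endogenous, which is why the paper instruments with lagged variables inside a GMM framework; this sketch sidesteps that by construction.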

  19. Testing the gravitational instability hypothesis?

    Science.gov (United States)

    Babul, Arif; Weinberg, David H.; Dekel, Avishai; Ostriker, Jeremiah P.

    1994-01-01

    We challenge a widely accepted assumption of observational cosmology: that successful reconstruction of observed galaxy density fields from measured galaxy velocity fields (or vice versa), using the methods of gravitational instability theory, implies that the observed large-scale structures and large-scale flows were produced by the action of gravity. This assumption is false, in that there exist nongravitational theories that pass the reconstruction tests and gravitational theories with certain forms of biased galaxy formation that fail them. Gravitational instability theory predicts specific correlations between large-scale velocity and mass density fields, but the same correlations arise in any model where (a) structures in the galaxy distribution grow from homogeneous initial conditions in a way that satisfies the continuity equation, and (b) the present-day velocity field is irrotational and proportional to the time-averaged velocity field. We demonstrate these assertions using analytical arguments and N-body simulations. If large-scale structure is formed by gravitational instability, then the ratio of the galaxy density contrast to the divergence of the velocity field yields an estimate of the density parameter Omega (or, more generally, an estimate of beta ≡ Omega^0.6/b, where b is an assumed constant of proportionality between galaxy and mass density fluctuations). In nongravitational scenarios, the values of Omega or beta estimated in this way may fail to represent the true cosmological values. However, even if nongravitational forces initiate and shape the growth of structure, gravitationally induced accelerations can dominate the velocity field at late times, long after the action of any nongravitational impulses. The estimated beta approaches the true value in such cases, and in our numerical simulations the estimated beta values are reasonably accurate for both gravitational and nongravitational models. Reconstruction tests
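    The beta estimate mentioned in the abstract can be sketched on synthetic linear-theory fields, assuming units in which the relation reads div v = -beta * delta_g (velocities scaled by the Hubble constant, f(Omega) approximated as Omega^0.6); all field values below are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
beta_true = 0.5   # Omega^0.6 / b, hypothetical

delta_g = rng.normal(0.0, 1.0, 10000)                 # galaxy density contrast
div_v = -beta_true * delta_g + rng.normal(0.0, 0.1, 10000)  # noisy divergence

# Linear-theory relation div v = -beta * delta_g: the regression slope
# of -div v on delta_g estimates beta.
beta_hat = -np.polyfit(delta_g, div_v, 1)[0]
print(round(beta_hat, 2))  # recovers ~0.5
```

    The abstract's point is precisely that recovering such a slope does not by itself prove gravity produced the flows, since nongravitational models obeying continuity and irrotationality yield the same correlation.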

  20. Secretive Food Concocting in Binge Eating: Test of a Famine Hypothesis

    Science.gov (United States)

    Boggiano, Mary M.; Turan, Bulent; Maldonado, Christine R.; Oswald, Kimberly D.; Shuman, Ellen S.

    2016-01-01

    Objective Food concocting, or making strange food mixtures, is well documented in the famine and experimental semistarvation literature and appears anecdotally in rare descriptions of eating disorder (ED) patients, but has never been scientifically investigated. Here we do so in the context of binge eating, using a "famine hypothesis of concocting." Method A sample of 552 adults varying in binge eating and dieting traits completed a Concocting Survey created for this study. Exploratory ED groups were created to obtain predictions as to the nature of concocting in clinical populations. Results Binge eating predicted the 24.6% of participants who reported having ever concocted, but dietary restraint independently, even after controlling for binge eating, predicted its frequency and salience. Craving was the main motive. Emotions while concocting mirrored the classic high-arousal symptoms associated with drug use, while eating the concoctions was associated with intensely negative, self-deprecating emotions. Concocting prevalence and salience were greater in the anorexia > bulimia > BED > no ED groups, consistent with their correspondingly increasing dieting scores. Discussion Concocting distinguishes binge eating from other overeating and, consistent with the famine hypothesis, is accounted for by dietary restraint. Unlike its adaptive function in famine, concocting could worsen binge-eating disorders by increasing negative affect, shame, and secrecy. Its assessment in these disorders may prove therapeutically valuable. PMID:23255044

  1. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
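    The pointwise (0D) versus simultaneous (1D) distinction can be sketched with a bootstrap on synthetic trajectories; calibrating on the maximum standardized deviation over the whole curve is the usual non-parametric route to a simultaneous band, the analogue of the RFT correction discussed above (the data below are fabricated):

```python
import numpy as np

rng = np.random.default_rng(4)
n_subj, n_time = 20, 101
t = np.linspace(0.0, 1.0, n_time)
# Hypothetical smooth trajectories (e.g. force curves) with subject noise.
data = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.3, (n_subj, n_time))

mean = data.mean(axis=0)
B = 2000
boot = np.empty((B, n_time))
for b in range(B):
    idx = rng.integers(0, n_subj, n_subj)   # resample subjects
    boot[b] = data[idx].mean(axis=0)

# 0D (pointwise) 95% band: per-instant percentiles. Biased for
# whole-trajectory inference: some instant will exit the band too often.
lo0, hi0 = np.percentile(boot, [2.5, 97.5], axis=0)

# 1D (simultaneous) band: calibrate on the maximum standardized
# deviation over the whole trajectory so the entire curve is covered
# with ~95% probability.
sd = boot.std(axis=0)
crit = np.percentile((np.abs(boot - mean) / sd).max(axis=1), 95)
lo1, hi1 = mean - crit * sd, mean + crit * sd

ratio = (hi1 - lo1).mean() / (hi0 - lo0).mean()
print(ratio)  # > 1: simultaneous coverage requires a wider band
```

    The widening factor is the practical cost of controlling error over the whole trajectory rather than one instant, which is the paper's central 0D-vs-1D point.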

  2. Combining multiple hypothesis testing and affinity propagation clustering leads to accurate, robust and sample size independent classification on gene expression data

    Directory of Open Access Journals (Sweden)

    Sakellariou Argiris

    2012-10-01

    Abstract Background A feature selection method in microarray gene expression data should be independent of platform, disease, and dataset size. Our hypothesis is that, among the statistically significant ranked genes in a gene list, there should be clusters of genes that share similar biological functions related to the investigated disease. Thus, instead of keeping the N top-ranked genes, it would be more appropriate to define and keep a number of gene cluster exemplars. Results We propose a hybrid FS method (mAP-KL), which combines multiple hypothesis testing and the affinity propagation (AP) clustering algorithm along with the Krzanowski & Lai cluster quality index, to select a small yet informative subset of genes. We applied mAP-KL on real microarray data, as well as on simulated data, and compared its performance against 13 other feature selection approaches. Across a variety of diseases and numbers of samples, mAP-KL presents competitive classification results, particularly in neuromuscular diseases, where its overall AUC score was 0.91. Furthermore, mAP-KL generates concise yet biologically relevant and informative N-gene expression signatures, which can serve as a valuable tool for diagnostic and prognostic purposes, as well as a source of potential disease biomarkers in a broad range of diseases. Conclusions mAP-KL is a data-driven and classifier-independent hybrid feature selection method, which applies to any disease classification problem based on microarray data, regardless of the available samples. Combining multiple hypothesis testing and AP leads to subsets of genes which classify unknown samples from both small and large patient cohorts with high accuracy.
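    The two-stage idea, rank genes by a test statistic and then keep one exemplar per cluster, can be sketched on synthetic expression data. A greedy correlation-based exemplar pick is used below as a simple stand-in for affinity propagation with the Krzanowski & Lai index; the data and thresholds are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
n_genes, n_a, n_b = 200, 15, 15
X_a = rng.normal(0.0, 1.0, (n_genes, n_a))   # class A expression
X_b = rng.normal(0.0, 1.0, (n_genes, n_b))   # class B expression
X_b[:20] += 3.0   # 20 genes truly differential (hypothetical)

# Step 1: multiple hypothesis testing -- Welch t statistic per gene,
# keep the top-ranked candidates.
se = np.sqrt(X_a.var(axis=1, ddof=1) / n_a + X_b.var(axis=1, ddof=1) / n_b)
t = (X_b.mean(axis=1) - X_a.mean(axis=1)) / se
top = np.argsort(-np.abs(t))[:40]

# Step 2: cluster the candidates and keep one exemplar per cluster,
# so co-regulated genes do not enter the signature many times over.
profiles = np.hstack([X_a[top], X_b[top]])
corr = np.corrcoef(profiles)
remaining = list(range(len(top)))
exemplars = []
while remaining:
    i = remaining[0]                      # highest-ranked gene left
    exemplars.append(int(top[i]))
    remaining = [j for j in remaining if abs(corr[i, j]) < 0.5]

print(len(exemplars))  # a concise signature, far fewer than 200 genes
```

    Collapsing correlated genes to exemplars is what lets the signature stay small without discarding the distinct biological functions it covers.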

  3. Testing the shape-similarity hypothesis between particle-size distribution and water retention for Sicilian soils

    Directory of Open Access Journals (Sweden)

    Chiara Antinoro

    2012-12-01

    Application of the Arya and Paris (AP) model to estimate the soil water retention curve requires a detailed description of the particle-size distribution (PSD), but limited experimental PSD data are generally determined by the conventional sieve-hydrometer (SH) method. Detailed PSDs can be obtained by fitting a continuous model to SH data or by performing measurements with the laser diffraction (LD) method. The AP model was applied to 40 Sicilian soils for which the PSD was measured by both the SH and LD methods. The scale factor α was set equal to 1.38 (procedure AP1) or estimated by a logistic model with parameters gathered from the literature (procedure AP2). For both SH and LD data, procedure AP2 allowed a more accurate prediction of the water retention than procedure AP1, confirming that it is not convenient to use a unique value of α for soils that are very different in texture. Despite the differences in PSDs obtained by the SH and LD methods, the water retention predicted by a given procedure (AP1 or AP2) using SH or LD data was characterized by the same level of accuracy. Discrepancies in the estimated water retention from the two PSD measurement methods were attributed to underestimation of the finest diameter frequency by the LD method. Analysis also showed that the soil water retention estimated using the SH method was affected by an estimation bias that could be corrected by an optimization procedure (OPT). Comparison of α-distributions and water retention shape indices obtained by the two methods (SH or LD) indicated that the shape-similarity hypothesis is better verified if the traditional sieve-hydrometer data are used to apply the AP model. The optimization procedure allowed more accurate predictions of the water retention curves than the traditional AP1 and AP2 procedures. Therefore, OPT can be considered a valid alternative to the more complex logistic model for estimating the water retention curve of Sicilian soils.

  4. Design Optimization for a Truncated Catenary Mooring System for Scale Model Test

    Directory of Open Access Journals (Sweden)

    Climent Molins

    2015-11-01

    One of the main aspects when testing floating offshore platforms is the scaled mooring system, particularly given the increased depths at which such platforms are intended to operate. The paper proposes the use of truncated mooring systems that emulate the real mooring system by solving an optimization problem. This approach can be an interesting option when the existing testing facilities do not have enough available space. As part of the development of a new spar platform made of concrete for Floating Offshore Wind Turbines (FOWTs), called Windcrete, a station-keeping system with catenary-shaped lines was selected. The test facility available for the planned experiments had an important width constraint, so an algorithm was developed to optimize the design of the scaled truncated mooring system using lines of different weights. The optimization process adjusts the quasi-static behavior of the scaled truncated mooring system as closely as possible to that of the real mooring system within its expected maximum displacement range, where the catenary line provides the restoring forces through its suspended length.
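    The quasi-static matching can be sketched for an inelastic catenary line partly resting on the seabed. All numbers below are hypothetical, and the target curve is generated from a known line weight so the fit is verifiable; in practice the target would come from the full-depth design:

```python
import numpy as np
from scipy.optimize import least_squares

def fairlead_distance(H, w, L, h):
    """Horizontal anchor-to-fairlead distance of a catenary mooring line
    partly resting on the seabed: water depth h, line length L, submerged
    weight per length w, horizontal tension H (line elasticity neglected)."""
    a = H / w                               # catenary parameter
    s = np.sqrt(h * (h + 2.0 * a))          # suspended line length
    return (L - s) + a * np.arcsinh(s / a)  # grounded part + suspended span

# Hypothetical truncated-basin setup: depth 80 m, line length 250 m.
L_tr, h_tr = 250.0, 80.0
H_grid = np.linspace(2e5, 3e5, 8)           # horizontal tensions, N

# Target quasi-static tension-offset curve, generated from a known
# submerged weight (2000 N/m) so the optimum is recoverable.
w_true = 2000.0
x_target = fairlead_distance(H_grid, w_true, L_tr, h_tr)

def residuals(p):
    return fairlead_distance(H_grid, p[0], L_tr, h_tr) - x_target

fit = least_squares(residuals, x0=[1000.0], bounds=([900.0], [6000.0]))
print(fit.x[0])  # recovers ~2000 N/m
```

    The same least-squares structure applies when the target comes from the deeper prototype system: the line weight (and, if free, length) of the truncated system is tuned so its quasi-static restoring curve tracks the real one over the expected displacement range.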

  5. A Dopamine Hypothesis of Autism Spectrum Disorder.

    Science.gov (United States)

    Pavăl, Denis

    2017-01-01

    Autism spectrum disorder (ASD) comprises a group of neurodevelopmental disorders characterized by social deficits and stereotyped behaviors. While several theories have emerged, the pathogenesis of ASD remains unknown. Although studies report dopamine signaling abnormalities in autistic patients, a coherent dopamine hypothesis which could link neurobiology to behavior in ASD is currently lacking. In this paper, we present such a hypothesis by proposing that autistic behavior arises from dysfunctions in the midbrain dopaminergic system. We hypothesize that a dysfunction of the mesocorticolimbic circuit leads to social deficits, while a dysfunction of the nigrostriatal circuit leads to stereotyped behaviors. Furthermore, we discuss 2 key predictions of our hypothesis, with emphasis on clinical and therapeutic aspects. First, we argue that dopaminergic dysfunctions in the same circuits should associate with autistic-like behavior in nonautistic subjects. Concerning this, we discuss the case of PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections) which displays behaviors similar to those of ASD, presumed to arise from dopaminergic dysfunctions. Second, we argue that providing dopamine modulators to autistic subjects should lead to a behavioral improvement. Regarding this, we present clinical studies of dopamine antagonists which seem to have improving effects on autistic behavior. Furthermore, we explore the means of testing our hypothesis by using neuroreceptor imaging, which could provide comprehensive evidence for dopamine signaling dysfunctions in autistic subjects. Lastly, we discuss the limitations of our hypothesis. Along these lines, we aim to provide a dopaminergic model of ASD which might lead to a better understanding of the ASD pathogenesis. © 2017 S. Karger AG, Basel.

  6. Possible Solution to Publication Bias Through Bayesian Statistics, Including Proper Null Hypothesis Testing

    NARCIS (Netherlands)

    Konijn, Elly A.; van de Schoot, Rens; Winter, Sonja D.; Ferguson, Christopher J.

    2015-01-01

    The present paper argues that an important cause of publication bias resides in traditional frequentist statistics forcing binary decisions. An alternative approach through Bayesian statistics provides various degrees of support for any hypothesis, allowing balanced decisions and proper null hypothesis testing.

  7. Testing Bergmann's rule and the Rosenzweig hypothesis with craniometric studies of the South American sea lion.

    Science.gov (United States)

    Sepúlveda, Maritza; Oliva, Doris; Duran, L René; Urra, Alejandra; Pedraza, Susana N; Majluf, Patrícia; Goodall, Natalie; Crespo, Enrique A

    2013-04-01

    We tested the validity of Bergmann's rule and Rosenzweig's hypothesis through an analysis of the geographical variation of the skull size of Otaria flavescens along the entire distribution range of the species (except Brazil). We quantified the sizes of 606 adult South American sea lion skulls measured in seven localities of Peru, Chile, Uruguay, Argentina, and the Falkland/Malvinas Islands. Geographical and environmental variables included latitude, longitude, and monthly minimum, maximum, and mean air and ocean temperatures. We also included information on fish landings as a proxy for productivity. Males showed a positive relationship between condylobasal length (CBL) and latitude, and between CBL and the six temperature variables. By contrast, females showed a negative relationship between CBL and the same variables. Finally, female skull size showed a significant and positive correlation with fish landings, while males did not show any relationship with this variable. The body size of males conformed to Bergmann's rule, with larger individuals found in southern localities of South America. Females followed the converse of Bergmann's rule at the intraspecific level, but showed a positive relationship with the proxy for productivity, thus supporting Rosenzweig's hypothesis. Differences in the factors that drive body size in females and males may be explained by their different life-history strategies. Our analyses demonstrate that latitude and temperature are not the only factors that explain spatial variation in body size: others such as food availability are also important for explaining the ecogeographical patterns found in O. flavescens.

  8. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies

    Science.gov (United States)

    Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert

    2016-01-01

    The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
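    The two-step screen-then-test idea can be sketched on synthetic genotype data. A plain correlation filter stands in for COMBI's SVM-weight screening, and Bonferroni over the screened subset stands in for its adequate threshold correction; all data below are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, p = 400, 1000
X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotype dosages 0/1/2
beta = np.zeros(p)
beta[:5] = 0.24                                     # 5 weak causal SNPs (hypothetical)
y = X @ beta + rng.normal(0.0, 1.0, n)              # quantitative phenotype

# Per-SNP association: correlation with phenotype and its t-test p-value.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
r = Xc.T @ yc / (np.sqrt((Xc**2).sum(axis=0)) * np.sqrt((yc**2).sum()))
t = r * np.sqrt((n - 2) / (1.0 - r**2))
pvals = 2.0 * stats.t.sf(np.abs(t), df=n - 2)

# Step 1 (screening): keep the k strongest associations.
k = 30
cand = np.argsort(-np.abs(r))[:k]

# Step 2 (testing): Bonferroni over the k candidates only, versus the
# classical Bonferroni over all p SNPs.
alpha = 0.05
hits_two_step = set(cand[pvals[cand] < alpha / k].tolist())
hits_classic = set(np.where(pvals < alpha / p)[0].tolist())
print(len(hits_two_step), len(hits_classic))  # two-step typically finds more
```

    Because the multiplicity burden shrinks from p to k, the two-step procedure can flag weak causal SNPs that genome-wide Bonferroni misses, which is the power gain the paper reports; its SVM screening additionally exploits SNP correlation structure, which this sketch does not.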

  9. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before attempting to predict a data series, one should test it for randomness: if the data are random, then no matter how powerful and complex the models used, the results cannot be trustworthy. Several methods exist for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
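    As a minimal illustration of such a randomness check (the article works in R; this sketch uses Python), the Wald-Wolfowitz runs test counts runs of positive and negative returns and compares the count with its expectation under independence:

```python
import numpy as np

def runs_test(returns):
    """Wald-Wolfowitz runs test on the signs of a return series.

    Returns the z-statistic; |z| much larger than 2 suggests the
    sign sequence is not random (too few or too many runs).
    """
    signs = np.sign(returns)
    signs = signs[signs != 0]                  # drop exact zeros
    n_pos = int(np.sum(signs > 0))
    n_neg = int(np.sum(signs < 0))
    runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
    n = n_pos + n_neg
    mean = 2.0 * n_pos * n_neg / n + 1.0       # expected runs under IID
    var = (mean - 1.0) * (mean - 2.0) / (n - 1.0)
    return (runs - mean) / np.sqrt(var)

rng = np.random.default_rng(1)
z_random = runs_test(rng.normal(size=1000))       # IID returns
z_trend = runs_test(np.repeat([1.0, -1.0], 500))  # two long runs
print(z_random, z_trend)
```

    An IID series gives a z-statistic near zero, while a strongly trending series produces far too few runs and a large negative z.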

  10. Optimization Of Nakazima Test At Elevated Temperatures

    International Nuclear Information System (INIS)

    Turetta, A.; Ghiotti, A.; Bruschi, S.

    2007-01-01

    Nowadays, hot forming of high-strength steels is meeting the strict requirements of automotive producers: deformation performed simultaneously with quenching assures a fully martensitic microstructure at room temperature, and thus the high-strength properties that allow thickness reductions in body-in-white components. Basic aspects of hot stamping are still under investigation, and further advances are needed for a successful application of sheet metal forming technologies at elevated temperatures. Among the data needed to set up a numerical model of the process, information about material formability helps in better designing and optimizing hot stamping operations. In the first part of the work, a new experimental apparatus based on the Nakazima concept is presented; process parameters are optimized in order to accurately replicate the thermo-mechanical conditions typical of the industrial process, paying particular attention to the thermal and microstructural evolution. In addition, as commercial FE codes require the implementation of Forming Limit Diagrams (FLDs) at constant temperature, numerical investigations were performed to determine the proper testing conditions for obtaining FLDs at nearly constant temperature.

  11. Optimal Constant-Stress Accelerated Degradation Test Plans Using Nonlinear Generalized Wiener Process

    Directory of Open Access Journals (Sweden)

    Zhen Chen

    2016-01-01

    Full Text Available Accelerated degradation test (ADT has been widely used to assess highly reliable products’ lifetime. To conduct an ADT, an appropriate degradation model and test plan should be determined in advance. Although many historical studies have proposed quite a few models, there is still room for improvement. Hence we propose a Nonlinear Generalized Wiener Process (NGWP model with consideration of the effects of stress level, product-to-product variability, and measurement errors for a higher estimation accuracy and a wider range of use. Then under the constraints of sample size, test duration, and test cost, the plans of constant-stress ADT (CSADT with multiple stress levels based on the NGWP are designed by minimizing the asymptotic variance of the reliability estimation of the products under normal operation conditions. An optimization algorithm is developed to determine the optimal stress levels, the number of units allocated to each level, inspection frequency, and measurement times simultaneously. In addition, a comparison based on degradation data of LEDs is made to show better goodness-of-fit of the NGWP than that of other models. Finally, optimal two-level and three-level CSADT plans under various constraints and a detailed sensitivity analysis are demonstrated through examples in this paper.
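    The basic building block of such models, a linear-drift Wiener degradation process whose drift grows with the stress level, can be illustrated with a Monte Carlo sketch of why higher stress accelerates failure. The NGWP of the paper additionally models nonlinearity, unit-to-unit variability, and measurement error; all numbers below are assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_failure_time(drift, sigma=0.5, threshold=10.0,
                      dt=0.2, n_paths=200, t_max=400.0):
    """Monte Carlo mean first-passage time of a linear-drift Wiener
    degradation process X(t) = drift*t + sigma*B(t) over a failure
    threshold (Euler discretization)."""
    steps = int(t_max / dt)
    times = np.full(n_paths, t_max)
    for i in range(n_paths):
        x = 0.0
        for s in range(steps):
            x += drift * dt + sigma * np.sqrt(dt) * rng.normal()
            if x >= threshold:                 # unit has failed
                times[i] = (s + 1) * dt
                break
    return times.mean()

low = mean_failure_time(drift=0.2)    # low stress level
high = mean_failure_time(drift=0.8)   # high stress: faster degradation
print(low, high)
```

    For a linear-drift Wiener process the mean first-passage time is threshold/drift, so quadrupling the stress-induced drift cuts the mean lifetime by roughly a factor of four, which is what ADT exploits to shorten test duration.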

  12. Raison d’être of insulin resistance: the adjustable threshold hypothesis

    OpenAIRE

    Wang, Guanyu

    2014-01-01

    The epidemics of obesity and diabetes demand a deeper understanding of insulin resistance, for which the adjustable threshold hypothesis is formed in this paper. To test the hypothesis, mathematical modelling was used to analyse clinical data and to simulate biological processes at both molecular and organismal levels. I found that insulin resistance roots in the thresholds of the cell's bistable response. By assuming heterogeneity of the thresholds, single cells' all-or-none response can col...

  13. Risk-sensitive optimal feedback control accounts for sensorimotor behavior under uncertainty.

    Directory of Open Access Journals (Sweden)

    Arne J Nagengast

    2010-07-01

    Full Text Available Many aspects of human motor behavior can be understood using optimality principles such as optimal feedback control. However, these proposed optimal control models are risk-neutral; that is, they are indifferent to the variability of the movement cost. Here, we propose the use of a risk-sensitive optimal controller that incorporates movement cost variance either as an added cost (risk-averse controller or as an added value (risk-seeking controller to model human motor behavior in the face of uncertainty. We use a sensorimotor task to test the hypothesis that subjects are risk-sensitive. Subjects controlled a virtual ball undergoing Brownian motion towards a target. Subjects were required to minimize an explicit cost, in points, that was a combination of the final positional error of the ball and the integrated control cost. By testing subjects on different levels of Brownian motion noise and relative weighting of the position and control cost, we could distinguish between risk-sensitive and risk-neutral control. We show that subjects change their movement strategy pessimistically in the face of increased uncertainty in accord with the predictions of a risk-averse optimal controller. Our results suggest that risk-sensitivity is a fundamental attribute that needs to be incorporated into optimal feedback control models.
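    The core idea of risk-sensitivity can be illustrated with a mean-variance toy example (a common form of risk-sensitive cost; the strategies and parameter values below are illustrative, not the paper's full optimal-control model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Costs of two movement strategies: A has a lower mean cost but is
# more variable; B is slightly costlier on average but consistent.
cost_a = rng.normal(10.0, 5.0, size=20000)
cost_b = rng.normal(11.0, 1.0, size=20000)

def risk_sensitive_cost(c, theta):
    """J = E[c] + theta * Var[c]; theta > 0 is risk-averse,
    theta < 0 risk-seeking, theta = 0 risk-neutral."""
    return c.mean() + theta * c.var()

neutral_prefers_a = (risk_sensitive_cost(cost_a, 0.0)
                     < risk_sensitive_cost(cost_b, 0.0))
averse_prefers_b = (risk_sensitive_cost(cost_a, 0.2)
                    > risk_sensitive_cost(cost_b, 0.2))
print(neutral_prefers_a, averse_prefers_b)
```

    A risk-neutral controller picks the lower-mean strategy, while a risk-averse one penalizes cost variance and switches to the consistent strategy, mirroring the pessimistic shift the subjects showed under increased uncertainty.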

  14. The picture superiority effect in conceptual implicit memory: a conceptual distinctiveness hypothesis.

    Science.gov (United States)

    Hamilton, Maryellen; Geraci, Lisa

    2006-01-01

    According to leading theories, the picture superiority effect is driven by conceptual processing, yet this effect has been difficult to obtain using conceptual implicit memory tests. We hypothesized that the picture superiority effect results from conceptual processing of a picture's distinctive features rather than a picture's semantic features. To test this hypothesis, we used 2 conceptual implicit general knowledge tests; one cued conceptually distinctive features (e.g., "What animal has large eyes?") and the other cued semantic features (e.g., "What animal is the figurehead of Tootsie Roll?"). Results showed a picture superiority effect only on the conceptual test using distinctive cues, supporting our hypothesis that this effect is mediated by conceptual processing of a picture's distinctive features.

  15. Optimization of Allowed Outage Time and Surveillance Test Intervals

    Energy Technology Data Exchange (ETDEWEB)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun [KEPCO international nuclear graduate school, Ulsan (Korea, Republic of)

    2015-10-15

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U) as a risk measure is just the complementary probability to A(t); an increase in U means the risk increases as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component and system unavailability. Extending D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component. The test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. We apply the methodology in Section 2 to find the optimal values of T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general
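    The trade-off described above can be made concrete with the standard textbook unavailability model (a sketch, not necessarily the methodology of Section 2 of the paper; the failure rate and test downtime below are assumed values): u(T) = lam*T/2 + tau/T, whose minimum is at T* = sqrt(2*tau/lam).

```python
import numpy as np

lam = 1e-4      # standby failure rate (per hour), assumed value
tau = 2.0       # downtime per surveillance test (hours), assumed value

def unavailability(T):
    """Time-averaged unavailability for test interval T (hours):
    lam*T/2 captures undetected random failures between tests,
    tau/T the outage caused by the test itself."""
    return lam * T / 2.0 + tau / T

T_grid = np.linspace(10.0, 2000.0, 100000)
T_star_numeric = T_grid[np.argmin(unavailability(T_grid))]
T_star_analytic = np.sqrt(2.0 * tau / lam)   # from d u / dT = 0
print(T_star_numeric, T_star_analytic)
```

    Testing too often (small T) is dominated by test-caused outage, testing too rarely by undetected random failures; the optimum balances the two.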

  16. Optimization of Allowed Outage Time and Surveillance Test Intervals

    International Nuclear Information System (INIS)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun

    2015-01-01

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U) as a risk measure is just the complementary probability to A(t); an increase in U means the risk increases as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component and system unavailability. Extending D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component. The test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. We apply the methodology in Section 2 to find the optimal values of T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general

  17. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate all past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]; in other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests for financial data, which typically exhibit remote data points and additional types of deviations from
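    The zero breakdown value is easy to demonstrate: a single outlier appended to an otherwise normal sample drives the moment-based Jarque-Bera statistic (and the Shapiro-Wilk statistic) to reject normality. A minimal sketch using SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
returns = rng.normal(0.0, 1.0, size=200)      # clean normal sample
contaminated = np.append(returns, 25.0)       # one extreme outlier

# Each test returns (statistic, p-value); a tiny p-value rejects
# normality for the contaminated sample.
jb_clean = stats.jarque_bera(returns)
jb_dirty = stats.jarque_bera(contaminated)
sw_dirty = stats.shapiro(contaminated)

print(jb_clean[1], jb_dirty[1], sw_dirty[1])
```

    One contaminating observation among two hundred is enough to inflate the sample skewness and kurtosis and hence the Jarque-Bera statistic, which is exactly the nonrobustness the paper's procedures aim to avoid.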

  18. Hypothesis testing for differentially correlated features.

    Science.gov (United States)

    Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua

    2016-10-01

    In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches.
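    The elementary building block for comparing a correlation across two conditions is Fisher's z-transform; the paper's serial column-wise procedure aggregates many such comparisons, so the sketch below only illustrates the single-correlation case with illustrative numbers:

```python
import numpy as np
from scipy import stats

def compare_correlation(r1, n1, r2, n2):
    """Two-sample z-test for equality of two correlations via
    Fisher's variance-stabilizing transform arctanh(r)."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2.0 * stats.norm.sf(abs(z))   # two-sided p-value
    return z, p

# A feature pair correlated in condition 1 but not in condition 2.
z, p = compare_correlation(r1=0.6, n1=100, r2=0.1, n2=100)
print(z, p)
```

    On the z-scale the two sample correlations are approximately normal with known variance 1/(n-3), so their difference yields a standard z-test for a correlation shift.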

  19. Cognitive differences between orang-utan species: a test of the cultural intelligence hypothesis.

    Science.gov (United States)

    Forss, Sofia I F; Willems, Erik; Call, Josep; van Schaik, Carel P

    2016-07-28

    Cultural species can - or even prefer to - learn their skills from conspecifics. According to the cultural intelligence hypothesis, selection on the underlying mechanisms improves not only this social learning ability but also the asocial (individual) learning ability. Thus, species with systematically richer opportunities to socially acquire knowledge and skills should over time evolve to become more intelligent. We experimentally compared the problem-solving ability of Sumatran orang-utans (Pongo abelii), which are sociable in the wild, with that of the closely related but more solitary Bornean orang-utans (P. pygmaeus), under the homogeneous environmental conditions provided by zoos. Our results revealed that Sumatrans showed superior innate problem-solving skills to Borneans, as well as greater inhibition and a more cautious and less rough exploration style. This pattern is consistent with the cultural intelligence hypothesis, which predicts that the more sociable of two sister species experienced stronger selection on the cognitive mechanisms underlying learning.

  20. FADTTSter: accelerating hypothesis testing with functional analysis of diffusion tensor tract statistics

    Science.gov (United States)

    Noel, Jean; Prieto, Juan C.; Styner, Martin

    2017-03-01

    Functional Analysis of Diffusion Tensor Tract Statistics (FADTTS) is a toolbox for analysis of white matter (WM) fiber tracts. It allows associating diffusion properties along major WM bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these WM tract properties. However, to use this toolbox, a user must have an intermediate knowledge of scripting languages (MATLAB). FADTTSter was created to overcome this issue and make the statistical analysis accessible to any non-technical researcher. FADTTSter is actively being used by researchers at the University of North Carolina. FADTTSter guides non-technical users through a series of steps, including quality control of subjects and fibers, in order to set up the necessary parameters to run FADTTS. Additionally, FADTTSter implements interactive charts for FADTTS' outputs. These interactive charts enhance the researcher experience and facilitate the analysis of the results. FADTTSter's motivation is to improve usability and provide a new analysis tool to the community that complements FADTTS. Ultimately, by opening FADTTS to a broader audience, FADTTSter seeks to accelerate hypothesis testing in neuroimaging studies involving heterogeneous clinical data and diffusion tensor imaging. This work is submitted to the Biomedical Applications in Molecular, Structural, and Functional Imaging conference. The source code of this application is available in NITRC.

  1. Testing the Binary Hypothesis: Pulsar Timing Constraints on Supermassive Black Hole Binary Candidates

    Science.gov (United States)

    Sesana, Alberto; Haiman, Zoltán; Kocsis, Bence; Kelley, Luke Zoltan

    2018-03-01

    The advent of time domain astronomy is revolutionizing our understanding of the universe. Programs such as the Catalina Real-time Transient Survey (CRTS) or the Palomar Transient Factory (PTF) surveyed millions of objects for several years, allowing variability studies on large statistical samples. The inspection of ≈250 k quasars in CRTS resulted in a catalog of 111 potentially periodic sources, put forward as supermassive black hole binary (SMBHB) candidates. A similar investigation on PTF data yielded 33 candidates from a sample of ≈35 k quasars. Working under the SMBHB hypothesis, we compute the implied SMBHB merger rate and we use it to construct the expected gravitational wave background (GWB) at nano-Hz frequencies, probed by pulsar timing arrays (PTAs). After correcting for incompleteness and assuming virial mass estimates, we find that the GWB implied by the CRTS sample exceeds the current most stringent PTA upper limits by almost an order of magnitude. After further correcting for the implicit bias in virial mass measurements, the implied GWB drops significantly but is still in tension with the most stringent PTA upper limits. Similar results hold for the PTF sample. Bayesian model selection shows that the null hypothesis (whereby the candidates are false positives) is preferred over the binary hypothesis at about 2.3σ and 3.6σ for the CRTS and PTF samples respectively. Although not decisive, our analysis highlights the potential of PTAs as astrophysical probes of individual SMBHB candidates and indicates that the CRTS and PTF samples are likely contaminated by several false positives.

  2. Detecting changes in real-time data: a user's guide to optimal detection.

    Science.gov (United States)

    Johnson, P; Moriarty, J; Peskir, G

    2017-08-13

    The real-time detection of changes in a noisily observed signal is an important problem in applied science and engineering. The study of parametric optimal detection theory began in the 1930s, motivated by applications in production and defence. Today this theory, which aims to minimize a given measure of detection delay under accuracy constraints, finds applications in domains including radar, sonar, seismic activity, global positioning, psychological testing, quality control, communications and power systems engineering. This paper reviews developments in optimal detection theory and sequential analysis, including sequential hypothesis testing and change-point detection, in both Bayesian and classical (non-Bayesian) settings. For clarity of exposition, we work in discrete time and provide a brief discussion of the continuous time setting, including recent developments using stochastic calculus. Different measures of detection delay are presented, together with the corresponding optimal solutions. We emphasize the important role of the signal-to-noise ratio and discuss both the underlying assumptions and some typical applications for each formulation. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'.
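    As a minimal discrete-time example of change-point detection, Page's CUSUM statistic accumulates evidence of an upward mean shift and raises an alarm when it crosses a threshold (the reference value k and threshold h below are illustrative choices, not values from the review):

```python
import numpy as np

def cusum_alarm(x, k=0.5, h=5.0):
    """One-sided CUSUM for an upward mean shift:
    S_t = max(0, S_{t-1} + x_t - k); alarm at the first t with
    S_t > h. Returns the alarm index, or None if no alarm."""
    s = 0.0
    for t, xt in enumerate(x):
        s = max(0.0, s + xt - k)
        if s > h:
            return t
    return None

rng = np.random.default_rng(5)
# Mean 0 for 200 samples, then the mean shifts to 1 at t = 200.
x = np.concatenate([rng.normal(0.0, 1.0, 200),
                    rng.normal(1.0, 1.0, 100)])
alarm = cusum_alarm(x)
print(alarm)
```

    Raising h lengthens the average run length to false alarm but also the detection delay; tuning that trade-off is exactly the "detection delay under accuracy constraints" problem the review formalizes.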

  3. The evolution of polyandry: patterns of genotypic variation in female mating frequency, male fertilization success and a test of the sexy-sperm hypothesis.

    Science.gov (United States)

    Simmons, L W

    2003-07-01

    The sexy-sperm hypothesis predicts that females obtain indirect benefits for their offspring via polyandry, in the form of increased fertilization success for their sons. I use a quantitative genetic approach to test the sexy-sperm hypothesis using the field cricket Teleogryllus oceanicus. Previous studies of this species have shown considerable phenotypic variation in fertilization success when two or more males compete. There were high broad-sense heritabilities for both paternity and polyandry. Patterns of genotypic variance were consistent with X-linked inheritance and/or maternal effects on these traits. The genetic architecture therefore precludes the evolution of polyandry via a sexy-sperm process: the positive genetic correlation between paternity in sons and polyandry in daughters predicted by the sexy-sperm hypothesis was absent. There was significant heritable variation in the investment by females in ovaries and by males in the accessory gland. Surprisingly, there was a very strong genetic correlation between these two traits. The significance of this genetic correlation for the coevolution of male seminal products and polyandry is discussed.

  4. Hypothesis driven development of new adjuvants: short peptides as immunomodulators.

    Science.gov (United States)

    Dong, Jessica C; Kobinger, Gary P

    2013-04-01

    To date, vaccinations have been one of the key strategies in the prevention of and protection against infectious pathogens. Traditional vaccines have well-known limitations, such as safety and efficacy issues, which can make them inappropriate for particular populations, and they may not be an effective strategy against all pathogens. This evidence highlights the need to develop more efficacious vaccination regimens. Higher levels of protection can be achieved by the addition of immunostimulating adjuvants. Many adjuvants elicit strong, undefined inflammation, which produces increased immunogenicity but may also lead to undesirable effects. Hypothesis-driven development of adjuvants is needed to achieve the more specific and directed immune response required for optimal and safe vaccine-induced immune protection. An example of such hypothesis-driven development is the use of short immunomodulating peptides as adjuvants. These peptides have the ability to influence the immune response and can be exploited for adjuvant use, but they require further investigation.

  5. A comparator-hypothesis account of biased contingency detection.

    Science.gov (United States)

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection.

  6. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression,

  7. Optimized Environmental Test Sequences to Ensure the Sustainability and Reliability of Marine Weapons

    Directory of Open Access Journals (Sweden)

    Jung Ho Yang

    2014-11-01

    Full Text Available In recent years, there has been an increase in the types of marine weapons used in response to diverse hostile threats. However, because marine weapons are only tested under a single set of environmental conditions, failures due to different environmental stresses have been difficult to detect. Hence, this study proposes an environmental test sequence for multi-environment testing. The environmental test sequences for electrical units described in the international standard IEC 60068-1, and for military supply described in the United States national standard MIL-STD-810G were investigated to propose guidelines for the appropriate test sequences. This study demonstrated the need for tests in multiple environments by investigating marine weapon accidents, and evaluated which environmental stresses and test items have the largest impacts on marine weapons using a two-phase quality function deployment (QFD analysis of operational scenarios, environmental stresses, and environmental test items. Integer programming was used to determine the most influential test items and the shortest environmental test time, allowing us to propose optimal test procedures. Based on our analysis, we developed optimal environmental test sequences that could be selected by a test designer.

  8. Carbon and nutrient use efficiencies optimally balance stoichiometric imbalances

    Science.gov (United States)

    Manzoni, Stefano; Čapek, Petr; Lindahl, Björn; Mooshammer, Maria; Richter, Andreas; Šantrůčková, Hana

    2016-04-01

    Decomposer organisms face large stoichiometric imbalances because their food is generally poor in nutrients compared to the decomposer cellular composition. The presence of excess carbon (C) requires adaptations to utilize nutrients effectively while disposing of or investing excess C. As food composition changes, these adaptations lead to variable C- and nutrient-use efficiencies (defined as the ratios of C and nutrients used for growth over the amounts consumed). For organisms to be ecologically competitive, these changes in efficiencies with resource stoichiometry have to balance advantages and disadvantages in an optimal way. We hypothesize that efficiencies are varied so that community growth rate is optimized along stoichiometric gradients of their resources. Building from previous theories, we predict that maximum growth is achieved when C and nutrients are co-limiting, so that the maximum C-use efficiency is reached, and nutrient release is minimized. This optimality principle is expected to be applicable across terrestrial-aquatic borders, to various elements, and at different trophic levels. While the growth rate maximization hypothesis has been evaluated for consumers and predators, in this contribution we test it for terrestrial and aquatic decomposers degrading resources across wide stoichiometry gradients. The optimality hypothesis predicts constant efficiencies at low substrate C:N and C:P, whereas above a stoichiometric threshold, C-use efficiency declines and nitrogen- and phosphorus-use efficiencies increase up to one. Thus, high resource C:N and C:P lead to low C-use efficiency, but effective retention of nitrogen and phosphorus. Predictions are broadly consistent with efficiency trends in decomposer communities across terrestrial and aquatic ecosystems.
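    The predicted efficiency pattern can be written as a simple piecewise function of substrate stoichiometry. The sketch below is illustrative only: the maximum C-use efficiency, decomposer biomass C:N, and the resulting threshold are assumed values, not estimates from the study.

```python
import numpy as np

CUE_MAX = 0.6          # maximum carbon-use efficiency (assumed)
CN_MICROBE = 8.0       # decomposer biomass C:N (assumed)

# Stoichiometric threshold: above it, substrate C is in excess of
# what the N supply can support at CUE_MAX.
CN_THRESHOLD = CN_MICROBE / CUE_MAX

def efficiencies(cn_substrate):
    """Illustrative C- and N-use efficiencies vs. substrate C:N.
    Below the threshold, CUE is maximal and excess N is released
    (NUE < 1); above it, CUE declines and all N is retained."""
    cue = np.minimum(CUE_MAX, CN_MICROBE / cn_substrate)
    nue = np.minimum(1.0, cue * cn_substrate / CN_MICROBE)
    return cue, nue

cn = np.array([5.0, CN_THRESHOLD, 30.0, 60.0])
cue, nue = efficiencies(cn)
print(cue, nue)
```

    Below the threshold both efficiencies are set by the C supply (constant CUE, NUE below one); above it, CUE falls in proportion to the excess C while N-use efficiency saturates at one, reproducing the qualitative trends described in the abstract.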

  9. Water developments and canids in two North American deserts: a test of the indirect effect of water hypothesis.

    Directory of Open Access Journals (Sweden)

    Lucas K Hall

    Full Text Available Anthropogenic modifications to landscapes intended to benefit wildlife may negatively influence wildlife communities. Anthropogenic provisioning of free water (water developments) to enhance the abundance and distribution of wildlife is a common management practice in arid regions where water is limiting. Despite the long-term and widespread use of water developments, little is known about how they influence native species. Water developments may negatively influence arid-adapted species (e.g., kit fox, Vulpes macrotis) by enabling water-dependent competitors (e.g., coyote, Canis latrans) to expand their distribution in arid landscapes (i.e., the indirect effect of water hypothesis). We tested the two predictions of the indirect effect of water hypothesis (i.e., that coyotes will visit areas with free water more frequently and that kit foxes will spatially and temporally avoid coyotes) and evaluated relative use of free water by canids in the Great Basin and Mojave Deserts from 2010 to 2012. We established scent stations in areas with (wet) and without (dry) free water and monitored visitation by canids to these sites and visitation to water sources using infrared-triggered cameras. There was no difference in the proportions of visits to scent stations in wet or dry areas by coyotes or kit foxes at either study area. We did not detect spatial (no negative correlation between visits to scent stations) or temporal (no difference between times when stations were visited) segregation between coyotes and kit foxes. Visitation to water sources was not different for coyotes between study areas, but kit foxes visited water sources more in the Mojave than in the Great Basin. Our results did not support the indirect effect of water hypothesis in the Great Basin or Mojave Deserts for these two canids.

  10. Conflict of interest between a nematode and a trematode in an amphipod host: Test of the "sabotage" hypothesis

    Science.gov (United States)

    Thomas, Frédéric; Fauchier, Jerome; Lafferty, Kevin D.

    2002-01-01

    Microphallus papillorobustus is a manipulative trematode that induces strong behavioural alterations in the gammaridean amphipod Gammarus insensibilis, making the amphipod more vulnerable to predation by aquatic birds (definitive hosts). Conversely, the sympatric nematode Gammarinema gammari uses Gammarus insensibilis as a habitat and a source of nutrition. We investigated the conflict of interest between these two parasite species by studying the consequences of mixed infection on the amphipod behaviour associated with the trematode. In the field, some amphipods infected by the trematode did not display the altered behaviour. These normal amphipods also had more nematodes, suggesting that the nematode overpowered the manipulation of the trematode, a strategy that would prolong the nematode's life. We hypothesize that sabotage of the trematode by the nematode would be an adaptive strategy for the nematode, consistent with recent speculation about co-operation and conflict in manipulative parasites. A behavioural test conducted in the laboratory on naturally infected amphipods yielded the same result. However, exposing amphipods to nematodes did not negate or decrease the manipulation exerted by the trematode. Similarly, experimental elimination of nematodes from amphipods did not permit trematodes to manipulate behaviour. These experimental data do not support "sabotage" as the explanation for the negative association between nematodes and manipulation by the trematode.

  11. Antiaging therapy: a prospective hypothesis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Full Text Available Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: This hypothesis proposes a new prospective approach to slowing the aging process in older humans. The hypothesis could lead to new treatments for age-related illnesses and help humans live longer. This hypothesis has no previous documentation in the scientific media and no established protocol. Scientists have presented evidence that systemic aging is influenced by particular molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered an elevated titer of aging-related molecules (ARMs) in blood, which triggers a cascade of aging processes in mice; they also indicated that the process can be reduced or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translating these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective "antiaging blood filtration column" (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on the contact surfaces of the reaction platforms inside the AABFC until near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  12. Q-Matrix Optimization Based on the Linear Logistic Test Model.

    Science.gov (United States)

    Ma, Lin; Green, Kelly E

    This study explored the optimization of item-attribute matrices under the linear logistic test model (Fischer, 1973), where a better model explains more of the variance in item difficulty through the identified item attributes. Data were 8th-grade mathematics item responses from two TIMSS 2007 booklets. The study investigated three categories of attributes (content, cognitive process, and comprehensive cognitive process) at two grain levels (larger, smaller) and also compared the results with random attribute matrices. The proposed attributes accounted for most of the variance in item difficulty for the two assessment booklets (81% and 65%). The content attributes alone explained little variance (13% to 31%), far less than the comprehensive cognitive process attributes, which explained much more variance than either the content or the cognitive process attributes. The two grain levels explained similar amounts of variance. However, the attributes did not predict the item difficulties of the two assessment booklets equally well.

  13. A test of the submentalizing hypothesis: Apes' performance in a false belief task inanimate control

    Science.gov (United States)

    Hirata, Satoshi; Call, Josep; Tomasello, Michael

    2017-01-01

    ABSTRACT Much debate concerns whether any nonhuman animals share with humans the ability to infer others' mental states, such as desires and beliefs. In a recent eye-tracking false-belief task, we showed that great apes correctly anticipated that a human actor would search for a goal object where he had last seen it, even though the apes themselves knew that it was no longer there. In response, Heyes proposed that the apes' looking behavior was guided not by social cognitive mechanisms but rather by domain-general cueing effects, and suggested the use of inanimate controls to test this alternative submentalizing hypothesis. In the present study, we implemented the suggested inanimate control of our previous false-belief task. Apes attended well to key events but showed markedly fewer anticipatory looks and no significant tendency to look to the correct location. We thus found no evidence that submentalizing was responsible for apes' anticipatory looks in our false-belief task. PMID:28919941

  14. Optimization of Asphalt Mixture Design for the Louisiana ALF Test Sections

    Science.gov (United States)

    2018-05-01

    This research presents an extensive study on the design and characterization of asphalt mixtures used in road pavements. Both mixture volumetrics and physical properties obtained from several laboratory tests were considered in optimizing the mixture...

  15. Optimal Detection under the Restricted Bayesian Criterion

    Directory of Open Access Journals (Sweden)

    Shujun Liu

    2017-07-01

    Full Text Available This paper aims to find a suitable decision rule for a binary composite hypothesis-testing problem with a partial or coarse prior distribution. To alleviate the negative impact of the information uncertainty, a constraint is considered that the maximum conditional risk cannot be greater than a predefined value. Therefore, the objective of this paper becomes to find the optimal decision rule to minimize the Bayes risk under the constraint. By applying the Lagrange duality, the constrained optimization problem is transformed to an unconstrained optimization problem. In doing so, the restricted Bayesian decision rule is obtained as a classical Bayesian decision rule corresponding to a modified prior distribution. Based on this transformation, the optimal restricted Bayesian decision rule is analyzed and the corresponding algorithm is developed. Furthermore, the relation between the Bayes risk and the predefined value of the constraint is also discussed. The Bayes risk obtained via the restricted Bayesian decision rule is a strictly decreasing and convex function of the constraint on the maximum conditional risk. Finally, the numerical results including a detection example are presented and agree with the theoretical results.
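
    The transformation described above can be illustrated with a small numerical sketch. All of the likelihoods, the nominal prior, and the risk bound below are invented for illustration: the restricted Bayes rule is obtained as a classical Bayes rule under a modified prior, found here by scanning candidate modified priors, keeping only those rules whose maximum conditional risk satisfies the constraint, and choosing the one that minimizes the Bayes risk under the nominal prior.

```python
import numpy as np

# Discrete observation space with likelihoods under H0 and H1 (toy values).
p0 = np.array([0.5, 0.3, 0.15, 0.05])   # P(y | H0)
p1 = np.array([0.05, 0.15, 0.3, 0.5])   # P(y | H1)
prior1 = 0.3                            # coarse nominal prior P(H1)
alpha = 0.25                            # bound on the maximum conditional risk

def risks(pi1):
    """Bayes rule for modified prior pi1 under 0-1 loss:
    decide H1 iff pi1*P(y|H1) > (1-pi1)*P(y|H0).
    Returns the two conditional risks (false alarm, miss)."""
    decide1 = pi1 * p1 > (1 - pi1) * p0
    r0 = p0[decide1].sum()       # false-alarm probability
    r1 = p1[~decide1].sum()      # miss probability
    return r0, r1

# Scan modified priors; among the rules that meet the constraint, pick the
# one with the smallest Bayes risk under the nominal prior.
best = None
for pi1 in np.linspace(0.01, 0.99, 197):
    r0, r1 = risks(pi1)
    if max(r0, r1) <= alpha:
        bayes_risk = (1 - prior1) * r0 + prior1 * r1
        if best is None or bayes_risk < best[0]:
            best = (bayes_risk, pi1, r0, r1)

bayes_risk, pi1_star, r0, r1 = best
print(f"modified prior {pi1_star:.3f}: Bayes risk {bayes_risk:.3f}, "
      f"conditional risks ({r0:.2f}, {r1:.2f})")
```

    Without the constraint, the nominal prior of 0.3 would favor deciding H0 more often at the cost of a large miss risk; the constraint forces the modified prior toward the middle, balancing the two conditional risks.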

  16. Automatically stable discontinuous Petrov-Galerkin methods for stationary transport problems: Quasi-optimal test space norm

    KAUST Repository

    Niemi, Antti H.

    2013-12-01

    We investigate the application of the discontinuous Petrov-Galerkin (DPG) finite element framework to stationary convection-diffusion problems. In particular, we demonstrate how the quasi-optimal test space norm improves the robustness of the DPG method with respect to vanishing diffusion. We numerically compare coarse-mesh accuracy of the approximation when using the quasi-optimal norm, the standard norm, and the weighted norm. Our results show that the quasi-optimal norm leads to more accurate results on three benchmark problems in two spatial dimensions. We address the problems associated with the resolution of the optimal test functions with respect to the quasi-optimal norm by studying their convergence numerically. In order to facilitate understanding of the method, we also include a detailed explanation of the methodology from the algorithmic point of view. © 2013 Elsevier Ltd. All rights reserved.

  17. Automatically stable discontinuous Petrov-Galerkin methods for stationary transport problems: Quasi-optimal test space norm

    KAUST Repository

    Niemi, Antti H.; Collier, Nathan; Calo, Victor M.

    2013-01-01

    We investigate the application of the discontinuous Petrov-Galerkin (DPG) finite element framework to stationary convection-diffusion problems. In particular, we demonstrate how the quasi-optimal test space norm improves the robustness of the DPG method with respect to vanishing diffusion. We numerically compare coarse-mesh accuracy of the approximation when using the quasi-optimal norm, the standard norm, and the weighted norm. Our results show that the quasi-optimal norm leads to more accurate results on three benchmark problems in two spatial dimensions. We address the problems associated with the resolution of the optimal test functions with respect to the quasi-optimal norm by studying their convergence numerically. In order to facilitate understanding of the method, we also include a detailed explanation of the methodology from the algorithmic point of view. © 2013 Elsevier Ltd. All rights reserved.

  18. Footprints of Optimal Protein Assembly Strategies in the Operonic Structure of Prokaryotes

    Directory of Open Access Journals (Sweden)

    Jan Ewald

    2015-04-01

    Full Text Available In this work, we investigate optimality principles behind synthesis strategies for protein complexes using a dynamic optimization approach. We show that the cellular capacity of protein synthesis has a strong influence on optimal synthesis strategies reaching from a simultaneous to a sequential synthesis of the subunits of a protein complex. Sequential synthesis is preferred if protein synthesis is strongly limited, whereas a simultaneous synthesis is optimal in situations with a high protein synthesis capacity. We confirm the predictions of our optimization approach through the analysis of the operonic organization of protein complexes in several hundred prokaryotes. Thereby, we are able to show that cellular protein synthesis capacity is a driving force in the dissolution of operons comprising the subunits of a protein complex. Thus, we also provide a tested hypothesis explaining why the subunits of many prokaryotic protein complexes are distributed across several operons despite the presumably less precise co-regulation.

  19. Optimal closed-loop identification test design for internal model control

    NARCIS (Netherlands)

    Zhu, Y.; Bosch, van den P.P.J.

    2000-01-01

    In this work, optimal closed-loop test design for control is studied. Simple design formulas are derived based on the asymptotic theory of Ljung. The control scheme used is internal model control (IMC) and the design constraint is the power of the process output or that of the reference signal. The

  20. The evolution of bacterial cell size: the internal diffusion-constraint hypothesis.

    Science.gov (United States)

    Gallet, Romain; Violle, Cyrille; Fromin, Nathalie; Jabbour-Zahab, Roula; Enquist, Brian J; Lenormand, Thomas

    2017-07-01

    Size is one of the most important biological traits influencing organismal ecology and evolution. However, we know little about the drivers of body size evolution in unicellular organisms. A long-term evolution experiment (Lenski's LTEE), in which Escherichia coli adapts to a simple glucose medium, has shown that not only the growth rate and fitness of the bacterium increase over time but also its cell size. This increase in size contradicts the prominent 'external diffusion constraint' (EDC) theory, which predicts that cells should have evolved toward smaller sizes. Among several scenarios, we propose and test an alternative 'internal diffusion-constraint' (IDC) hypothesis for cell size evolution. A change in cell volume affects metabolite concentrations in the cytoplasm. The IDC states that a higher metabolism can be achieved by reducing the molecular traffic time inside the cell, which is accomplished by increasing its volume. To test this hypothesis, we studied a population from the LTEE. We show that bigger cells with greater growth and CO2 production rates and a lower mass-to-volume ratio were selected over time in the LTEE. These results are consistent with the IDC hypothesis. This novel hypothesis offers a promising approach for understanding the evolutionary constraints on cell size.

  1. An Intersection–Union Test for the Sharpe Ratio

    Directory of Open Access Journals (Sweden)

    Gabriel Frahm

    2018-04-01

    Full Text Available An intersection–union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection–union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G–7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance.
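
    The decision logic of an intersection-union test can be sketched numerically: the claim "the benchmark is optimal" is supported only if every individual Sharpe-ratio comparison rejects, so the overall p-value is the maximum of the individual p-values. The simulated returns, the i.i.d. pairs-bootstrap standard error, and the normal approximation below are illustrative simplifications; the paper itself accounts for serial dependence and non-normal returns.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def sharpe(x):
    return x.mean() / x.std(ddof=1)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Toy monthly excess returns for a benchmark and three alternative strategies
# (i.i.d. normal simulation for illustration only).
n = 240
bench = rng.normal(0.008, 0.040, n)
alts = [rng.normal(0.004, 0.040, n),
        rng.normal(0.003, 0.050, n),
        rng.normal(0.005, 0.045, n)]

# One one-sided test per alternative: H0_i says alternative i has a Sharpe
# ratio at least as high as the benchmark. Supporting optimality of the
# benchmark requires rejecting every H0_i, so the IUT p-value is the
# maximum of the individual p-values.
pvals = []
for alt in alts:
    diff = sharpe(bench) - sharpe(alt)
    # pairs-bootstrap standard error of the Sharpe-ratio difference
    boots = []
    for _ in range(500):
        idx = rng.integers(0, n, n)
        boots.append(sharpe(bench[idx]) - sharpe(alt[idx]))
    se = float(np.std(boots, ddof=1))
    pvals.append(1.0 - norm_cdf(diff / se))

p_iut = max(pvals)
print(f"individual p-values: {[round(p, 3) for p in pvals]}")
print(f"IUT p-value: {p_iut:.3f}")
```

    Because the overall p-value is a maximum, the IUT is conservative: with modest sample sizes it rarely rejects, consistent with the empirical finding above that significant results are hard to obtain.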

  2. Testing the Martian Methane from Cometary Debris Hypothesis: The Unusually Close 24 Jan 2018 Interaction Between Comet C/2007 H2 (Skiff) and Mars

    Science.gov (United States)

    Fries, M.; Archer, D.; Christou, T.; Conrad, P.; Eigenbrode, J.; Kate, I. L. ten; Steele, A.

    2018-01-01

    In previous work we proposed a hypothesis wherein debris moving along cometary orbits that interact with Mars (e.g., meteor showers) may be responsible for transient local increases of methane observed in the martian atmosphere (henceforth 'the hypothesis'). An examination of the literature on methane detections dating back to 1997 showed that each detection was made, at most, 16 days after an interaction between Mars and one of seven small bodies (six comets and the unusual object 5335 Damocles) [ibid]. Two observations of high-altitude, transient visible plumes on Mars also correlate with cometary interactions involving two of the same seven small bodies, one plume occurring on the same day as the interaction and the other three days afterwards. The proposed mechanism for methane production is dissemination of carbon-rich cometary material on infall into Mars' atmosphere, followed by methane production via UV photolysis, a process that has been observed in laboratory experiments. Given this set of observations, it is necessary, and indeed conducive to the scientific process, to explore and robustly test the hypothesis.

  3. Optimal Combinations of Diagnostic Tests Based on AUC.

    Science.gov (United States)

    Huang, Xin; Qin, Gengsheng; Fang, Yixin

    2011-06-01

    When several diagnostic tests are available, one can combine them to achieve better diagnostic accuracy. This article considers the optimal linear combination that maximizes the area under the receiver operating characteristic curve (AUC); the estimates of the combination's coefficients can be obtained via a nonparametric procedure. However, for estimating the AUC associated with the estimated coefficients, the apparent estimation by re-substitution is too optimistic. To adjust for the upward bias, several methods are proposed. Among them the cross-validation approach is especially advocated, and an approximated cross-validation is developed to reduce the computational cost. Furthermore, these proposed methods can be applied for variable selection to select important diagnostic tests. The proposed methods are examined through simulation studies and applications to three real examples. © 2010, The International Biometric Society.
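
    The core idea can be sketched with two hypothetical markers. The AUC of a linear combination a'X is invariant to scaling of a, so with two markers the coefficient vector can be parameterized by a single angle and maximized by a direct scan of the empirical (Mann-Whitney) AUC; this scan is a stand-in for the paper's nonparametric estimation procedure, and all data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

def auc(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney statistic (ties get weight 1/2)."""
    diff = pos_scores[:, None] - neg_scores[None, :]
    return float((diff > 0).mean() + 0.5 * (diff == 0).mean())

# Two hypothetical diagnostic markers measured in diseased and healthy groups.
n_pos, n_neg = 100, 120
pos = rng.normal([1.0, 0.6], 1.0, (n_pos, 2))
neg = rng.normal([0.0, 0.0], 1.0, (n_neg, 2))

# Scan the angle parameterizing the (unit-norm) coefficient vector and keep
# the combination with the largest empirical AUC.
best_auc, best_a = -1.0, None
for theta in np.linspace(0.0, np.pi, 361):
    a = np.array([np.cos(theta), np.sin(theta)])
    val = auc(pos @ a, neg @ a)
    if val > best_auc:
        best_auc, best_a = val, a

print(f"coefficients {np.round(best_a, 3)}, apparent AUC {best_auc:.3f}")
# The apparent (re-substitution) AUC above is optimistically biased; the
# article advocates cross-validation to correct this upward bias.
```

    By construction the optimized combination does at least as well as either marker alone on the training data, which is exactly why an unbiased assessment requires cross-validation.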

  4. Do implicit motives and basic psychological needs interact to predict well-being and flow? : Testing a universal hypothesis and a matching hypothesis

    OpenAIRE

    Schüler, Julia; Brandstätter, Veronika; Sheldon, Kennon M.

    2013-01-01

    Self-Determination Theory (Deci and Ryan in Intrinsic motivation and self-determination in human behavior. Plenum Press, New York, 1985) suggests that certain experiences, such as competence, are equally beneficial to everyone’s well-being (universal hypothesis), whereas Motive Disposition Theory (McClelland in Human motivation. Scott, Foresman, Glenview, IL, 1985) predicts that some people, such as those with a high achievement motive, should benefit particularly from such experiences (match...

  5. The Neuropsychology of Adolescent Sexual Offending: Testing an Executive Dysfunction Hypothesis.

    Science.gov (United States)

    Morais, Hugo B; Joyal, Christian C; Alexander, Apryl A; Fix, Rebecca L; Burkhart, Barry R

    2016-12-01

    Although executive dysfunctions are commonly hypothesized to contribute to sexual deviance or aggression, evidence of this relationship is scarce and its specificity is unproven, especially among adolescents. The objective of this study was to compare the executive functioning (EF) of adolescents with sexual offense convictions (ASOC) to that of non-sex-delinquents (NSD). A secondary goal was to assess the relationship among specific sexual offense characteristics (i.e., victim age), history of childhood sexual abuse (CSA), and EF. It was hypothesized that, as a group, ASOC would present EF profiles similar to those of NSD. It was further hypothesized that ASOC with child victims would present significantly higher rates of CSA and more severe impairment of EF than ASOC with peer-aged or older victims and NSD. A total of 183 male adolescents (127 ASOC and 56 NSD) were interviewed to collect demographic information, sexual development history, history of CSA, an assessment of living conditions, and history of delinquency and sexual offending. Participants were administered the Delis-Kaplan Executive Functioning System and the Hare Psychopathy Checklist-Youth Version. In accord with the first hypothesis, ASOC and NSD presented similar EF scores, well below normative values. Thus, EF deficits may not characterize the profiles of adolescents with sexual behavior problems. Contrary to our second hypothesis, however, offending against children and/or experiencing CSA were not associated with poorer EF performance. On the contrary, ASOC with child victims obtained significantly higher scores on measures of higher-order EF than both ASOC with peer-aged or older victims and NSD. Implications of these results and future directions are discussed. © The Author(s) 2015.

  6. Psychosocial and biological aspects of dispositional optimism at old age

    NARCIS (Netherlands)

    Rius Ottenheim, Nathaly

    2012-01-01

    The findings presented in this thesis fit the hypothesis that optimism is associated with enhanced mental well-being and longevity, but also that levels of optimism within a certain person are rather stable over time and resistant to change. However, this stability of optimism does not necessarily

  7. Prolonged minor allograft survival in intravenously primed mice--a test of the veto hypothesis

    International Nuclear Information System (INIS)

    Johnson, L.L.

    1987-01-01

    Experiments were performed to test the hypothesis that veto cells are responsible for the prolonged survival of minor allografts of skin that is observed in recipients primed intravenously with spleen cells from mice syngeneic with the skin donors. This prolonged survival was observed for each of several minor histocompatibility (H) antigens and is antigen-specific. Gamma radiation (3300 rads) abolished the ability of male spleen cells infused i.v. to delay the rejection of male skin grafts (H-Y antigen) on female recipients. However, depletion of Thy-1+ cells from the i.v. infusion failed to abolish the ability to prolong male skin graft survival. Furthermore, the prolonged survival accorded to B6 (H-2b) male skin grafts on CB6F1 (H-2b/H-2d) female recipients given i.v. infusions of B6 male spleen cells extended to BALB/c (H-2d) male skin grafts as well, indicating a lack of MHC restriction. Thus, prolongation of minor allograft survival by i.v. infusion of minor H antigen-bearing spleen cells appears not to depend on veto T cells that others have found to be responsible for the suppression of CTL generation

  8. Temporal variability and cooperative breeding: testing the bet-hedging hypothesis in the acorn woodpecker.

    Science.gov (United States)

    Koenig, Walter D; Walters, Eric L

    2015-10-07

    Cooperative breeding is generally considered an adaptation to ecological constraints on dispersal and independent breeding, usually due to limited breeding opportunities. Although benefits of cooperative breeding are typically thought of in terms of increased mean reproductive success, it has recently been proposed that this phenomenon may be a bet-hedging strategy that reduces variance in reproductive success (fecundity variance) in populations living in highly variable environments. We tested this hypothesis using long-term data on the polygynandrous acorn woodpecker (Melanerpes formicivorus). In general, fecundity variance decreased with increasing sociality, at least when controlling for annual variation in ecological conditions. Nonetheless, decreased fecundity variance was insufficient to compensate for reduced per capita reproductive success of larger, more social groups, which typically suffered lower estimated mean fitness. We did, however, find evidence that sociality in the form of larger group size resulted in increased fitness in years following a small acorn crop due to reduced fecundity variance. Bet-hedging, although not the factor driving sociality in general, may play a role in driving acorn woodpecker group living when acorns are scarce and ecological conditions are poor. © 2015 The Author(s).
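
    The bet-hedging logic above rests on the fact that long-run growth in a fluctuating environment tracks the geometric, not the arithmetic, mean of fitness, so a variance-reducing strategy can win despite a lower mean. A worked example with invented numbers (not data from the acorn woodpecker study) makes this concrete:

```python
import math

# Hypothetical per-capita annual reproductive success in good vs. poor acorn
# years (illustrative numbers only).
p_good = 0.5
strategies = {
    "small group": {"good": 3.0, "poor": 0.4},   # higher mean, higher variance
    "large group": {"good": 2.0, "poor": 1.0},   # lower mean, lower variance
}

for name, w in strategies.items():
    arith = p_good * w["good"] + (1 - p_good) * w["poor"]
    # long-run growth rate is governed by the geometric mean across year types
    geom = math.exp(p_good * math.log(w["good"])
                    + (1 - p_good) * math.log(w["poor"]))
    print(f"{name}: arithmetic mean {arith:.2f}, geometric mean {geom:.2f}")
```

    Here the small-group strategy has the higher arithmetic mean (1.70 vs. 1.50) but the lower geometric mean (about 1.10 vs. 1.41), so the less variable large-group strategy grows faster over a long run of variable years, mirroring the advantage the study detects after poor acorn crops.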

  9. There and back again: putting the vectorial movement planning hypothesis to a critical test.

    Science.gov (United States)

    Kobak, Eva-Maria; Cardoso de Oliveira, Simone

    2014-01-01

    Based on psychophysical evidence about how learning of visuomotor transformations generalizes, it has been suggested that movements are planned on the basis of movement direction and magnitude, i.e., the vector connecting the movement origin and target. This notion is also known as the "vectorial planning hypothesis". Previous psychophysical studies, however, have used separate areas of the workspace for training movements and testing the learning. This study eliminates this confounding factor by investigating the transfer of learning from forward to backward movements in a center-out-and-back task, in which the workspace for both movements is completely identical. Visual feedback allowed learning only during movements towards the target (forward movements) and not while moving back to the origin (backward movements). When subjects learned the visuomotor rotation in forward movements, initial directional errors in backward movements also decreased to some degree. This learning effect in backward movements occurred predominantly when backward movements featured the same movement directions as those trained in forward movements (i.e., when opposite targets were presented). This suggests that learning was transferred in a direction-specific way, supporting the notion that movement direction is the most prominent parameter used for motor planning.

  10. Constrained optimization of test intervals using a steady-state genetic algorithm

    International Nuclear Information System (INIS)

    Martorell, S.; Carlos, S.; Sanchez, A.; Serradell, V.

    2000-01-01

    There is growing interest from both the regulatory authorities and the nuclear industry in stimulating the use of Probabilistic Risk Analysis (PRA) for risk-informed applications at Nuclear Power Plants (NPPs). Nowadays, special attention is being paid to analyzing plant-specific changes to Test Intervals (TIs) within the Technical Specifications (TSs) of NPPs, and there seems to be a consensus on the need to make these requirements more risk-effective and less costly. Resource-versus-risk-control effectiveness principles formally enter into such optimization problems. This paper presents an approach for using PRA models in the constrained optimization of TIs based on a steady-state genetic algorithm (SSGA), where the cost or burden is minimized while the risk or performance is constrained to a given level, or vice versa. The paper begins with the problem formulation, where the objective function and constraints that apply in the constrained optimization of TIs, based on risk and cost models at the system level, are derived. Next, the foundation of the optimizer is given, which is derived by customizing a SSGA to allow optimizing TIs under constraints. A case study performed using this approach shows the benefits of adopting both PRA models and genetic algorithms, in particular for the constrained optimization of TIs; a similar benefit is expected when using this approach to solve other engineering optimization problems. However, as concluded in this paper, care must be taken when using genetic algorithms in constrained optimization problems.
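
    A minimal sketch of the kind of optimizer described, with a steady-state replacement scheme (one offspring per generation replaces the worst individual) and the risk constraint handled by a penalty. The three-component unavailability model, failure rates, costs, and risk limit are all invented for illustration, not taken from an actual PRA model.

```python
import random

random.seed(42)

# Toy system of three standby components, each with a surveillance test
# interval T_i in hours.
LAMBDA = [2e-5, 5e-5, 1e-5]            # standby failure rates (per hour)
COST_PER_TEST = [500.0, 300.0, 800.0]  # cost of running one test
HOURS_PER_YEAR = 8760.0
RISK_LIMIT = 5e-3                      # allowed mean system unavailability

def risk(T):
    # rare-event approximation: mean unavailability of each component is
    # lambda*T/2; component contributions are summed
    return sum(l * t / 2.0 for l, t in zip(LAMBDA, T))

def cost(T):
    # yearly testing burden: cost per test times tests per year
    return sum(c * HOURS_PER_YEAR / t for c, t in zip(COST_PER_TEST, T))

def fitness(T):
    # minimize cost, with a heavy penalty for violating the risk constraint
    return cost(T) + max(0.0, risk(T) - RISK_LIMIT) * 1e9

# Steady-state GA: uniform crossover plus multiplicative mutation; the child
# replaces the worst individual only if it is better.
pop = [[random.uniform(24.0, 1000.0) for _ in range(3)] for _ in range(30)]
for _ in range(4000):
    a, b = random.sample(pop, 2)
    child = [random.choice(genes) for genes in zip(a, b)]
    i = random.randrange(3)
    child[i] = min(8760.0, max(24.0, child[i] * random.uniform(0.8, 1.25)))
    worst = max(range(len(pop)), key=lambda j: fitness(pop[j]))
    if fitness(child) < fitness(pop[worst]):
        pop[worst] = child

best = min(pop, key=fitness)
print("test intervals (h):", [round(t, 1) for t in best])
print(f"risk = {risk(best):.2e}, yearly cost = {cost(best):.0f}")
```

    The penalty weight is chosen so that any constraint violation dominates the cost term, which steers the steady-state replacement toward feasible intervals before trading cost against the risk margin.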

  11. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Directory of Open Access Journals (Sweden)

    Shigang Zhang

    2015-10-01

    Full Text Available Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics.
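
    The flavor of such strategy generation can be sketched under simplifying assumptions: a single-fault hypothesis, binary tests defined by an invented fault-signature matrix, and a placement cost charged at the first use of a test along each diagnostic path (a simplification of the paper's formulation). Tests are chosen greedily by entropy reduction per unit cost, and the expected cost of the resulting tree is computed.

```python
import math

# Toy single-fault diagnosis problem: rows = fault states, columns = tests.
# D[f][t] = 1 if test t fails when fault f is present (illustrative values).
D = [
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
]
priors = [0.4, 0.3, 0.2, 0.1]
exec_cost = [1.0, 2.0, 1.5]    # cost of running each test once
place_cost = [3.0, 1.0, 2.0]   # one-time cost of instrumenting a test point

def build(candidates, used):
    """Greedily pick the test with the best entropy reduction per unit cost,
    then recurse on the fail/pass outcome sets.
    Returns (strategy tree, expected cost)."""
    if len(candidates) <= 1:
        return candidates, 0.0
    total_p = sum(priors[f] for f in candidates)
    best = None
    for t in range(len(exec_cost)):
        fail = [f for f in candidates if D[f][t] == 1]
        ok = [f for f in candidates if D[f][t] == 0]
        if not fail or not ok:
            continue  # test does not discriminate within this candidate set
        p = sum(priors[f] for f in fail) / total_p
        gain = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
        c = exec_cost[t] + (0.0 if t in used else place_cost[t])
        if best is None or gain / c > best[0]:
            best = (gain / c, t, fail, ok)
    if best is None:
        return candidates, 0.0  # remaining faults are indistinguishable
    _, t, fail, ok = best
    c = exec_cost[t] + (0.0 if t in used else place_cost[t])
    sub_f, cost_f = build(fail, used | {t})
    sub_o, cost_o = build(ok, used | {t})
    p = sum(priors[f] for f in fail) / total_p
    return {"test": t, "fail": sub_f, "pass": sub_o}, \
           c + p * cost_f + (1 - p) * cost_o

tree, ec = build(list(range(4)), set())
print("strategy:", tree)
print(f"expected cost: {ec:.2f}")
```

    On this toy instance the cheap-to-place test 1 is chosen at the root even though test 0 is cheaper to execute, showing how placement cost shifts the strategy; the paper's algorithms address the harder multimode case.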

  12. On the design of innovative heterogeneous tests using a shape optimization approach

    Science.gov (United States)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    The development of full-field measurement methods has enabled a new trend in mechanical testing. By providing the inhomogeneous strain field of a test, these techniques are widely used in sheet metal identification strategies through heterogeneous mechanical tests. In this work, a heterogeneous mechanical test with an innovative tool/specimen shape, capable of producing rich heterogeneous strain paths that provide extensive information on material behavior, is sought. The specimen is found through a shape optimization process using a dedicated indicator that evaluates the richness of the strain information. The methodology and results presented here are extended to avoid dependence on a particular specimen geometry and on its parametrization through the use of the Ritz method for boundary value problems. Different curve models, such as splines, B-splines and NURBS, are used, and C1 continuity throughout the specimen is guaranteed. Moreover, various deterministic and stochastic optimization methods are used in order to find the method, or combination of methods, able to effectively minimize the cost function.

  13. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Science.gov (United States)

    Zhang, Shigang; Song, Lijun; Zhang, Wei; Hu, Zheng; Yang, Yongmin

    2015-01-01

    Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics. PMID:26457709

  14. Optimizing human semen cryopreservation by reducing test vial volume and repetitive test vial sampling

    DEFF Research Database (Denmark)

    Jensen, Christian F S; Ohl, Dana A; Parker, Walter R

    2015-01-01

    OBJECTIVE: To investigate optimal test vial (TV) volume, utility and reliability of TVs, intermediate temperature exposure (-88°C to -93°C) before cryostorage, cryostorage in nitrogen vapor (VN2) and liquid nitrogen (LN2), and long-term stability of VN2 cryostorage of human semen. DESIGN: Prospective clinical laboratory study. SETTING: University assisted reproductive technology (ART) laboratory. PATIENT(S): A total of 594 patients undergoing semen analysis and cryopreservation. INTERVENTION(S): Semen analysis, cryopreservation with different intermediate steps and in different volumes (50-1,000 μL), and long-term storage in LN2 or VN2. MAIN OUTCOME MEASURE(S): Optimal TV volume, prediction of cryosurvival (CS) in ART procedure vials (ARTVs) with pre-freeze semen parameters and TV CS, post-thaw motility after two- or three-step semen cryopreservation and cryostorage in VN2 and LN2. RESULT

  15. Testing of Strategies for the Acceleration of the Cost Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Ponciroli, Roberto [Argonne National Lab. (ANL), Argonne, IL (United States); Vilim, Richard B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-31

    The general problem addressed in the Nuclear-Renewable Hybrid Energy System (N-R HES) project is finding the optimal economic dispatch (ED) and capacity planning solutions for hybrid energy systems. In the present test-problem configuration, the N-R HES unit is composed of three electrical power-generating components, i.e., the Balance of Plant (BOP), the Secondary Energy Source (SES), and the Energy Storage (ES). In addition, there is an Industrial Process (IP), which is devoted to hydrogen generation. At this preliminary stage, the goal is to find the power outputs of each of the N-R HES unit components (BOP, SES, ES) and the IP hydrogen production level that maximize the unit profit while simultaneously satisfying the individual component operational constraints. The optimization problem is meant to be solved in the Risk Analysis Virtual Environment (RAVEN) framework. The dynamic response of the N-R HES unit components is simulated using dedicated object-oriented models written in the Modelica modeling language. Though this code coupling provides very accurate predictions, the ensuing optimization problem is characterized by a very large number of solution variables. To ease the computational burden and to improve the path to a converged solution, a method to better estimate the initial guess for the optimization problem solution was developed. The proposed approach led to the definition of a suitable Monte Carlo-based optimization algorithm (called the preconditioner), which provides an initial guess for the optimal N-R HES power dispatch and the optimal installed capacity for each of the unit components. The preconditioner samples a set of stochastic power scenarios for each of the N-R HES unit components, and then for each of them the corresponding value of a suitably defined cost function is evaluated.
After having simulated a sufficient number of power histories, the configuration which ensures the highest profit is selected as the optimal
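The sampling step of such a preconditioner can be sketched as follows. The quadratic cost model, the component power limits, and the electricity price below are illustrative stand-ins for the project's RAVEN/Modelica models, not the actual N-R HES cost function.

```python
import random

def profit(dispatch, price):
    """Hypothetical cost function: revenue from sold power minus a
    quadratic operating cost for each component."""
    revenue = price * sum(dispatch.values())
    cost = sum(0.01 * p ** 2 for p in dispatch.values())
    return revenue - cost

def precondition(n_scenarios, limits, price, seed=0):
    """Monte Carlo preconditioner: sample random power scenarios within
    each component's limits and keep the most profitable one as the
    initial guess for the subsequent optimization."""
    rng = random.Random(seed)
    best, best_profit = None, float("-inf")
    for _ in range(n_scenarios):
        dispatch = {name: rng.uniform(lo, hi)
                    for name, (lo, hi) in limits.items()}
        p = profit(dispatch, price)
        if p > best_profit:
            best, best_profit = dispatch, p
    return best, best_profit

# illustrative component limits in MW (ES may charge, hence negative)
limits = {"BOP": (0.0, 100.0), "SES": (0.0, 50.0), "ES": (-20.0, 20.0)}
guess, value = precondition(1000, limits, price=30.0)
```

The best sampled dispatch then seeds the full gradient-free optimization, which is cheaper than starting it from an arbitrary point.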

  16. Fecundity of trees and the colonization-competition hypothesis

    Science.gov (United States)

    James S. Clark; Shannon LaDeau; Ines Ibanez

    2004-01-01

    Colonization-competition trade-offs represent a stabilizing mechanism that is thought to maintain diversity of forest trees. If so, then early-successional species should benefit from high capacity to colonize new sites, and late-successional species should be good competitors. Tests of this hypothesis in forests have been precluded by an inability to estimate...

  17. Efficient compliance with prescribed bounds on operational parameters by means of hypothesis testing using reactor data

    International Nuclear Information System (INIS)

    Sermer, P.; Olive, C.; Hoppe, F.M.

    2000-01-01

    - A common problem in reactor operations is complying with the requirement that certain operational parameters lie within prescribed bounds. The fundamental issue to be addressed in any compliance description can be stated as follows: the compliance definition, compliance procedures, and allowances for uncertainties in data and accompanying methodologies should be well defined and justifiable. To this end, this paper describes a mathematical framework for compliance in which the computed or measured estimates of process parameters are considered random variables. This allows a statistical formulation of the definition of compliance with licence or otherwise imposed limits. An important aspect of the proposed methodology is that the derived statistical tests are obtained by a Monte Carlo procedure using actual reactor operational data. Implementing the methodology requires routine surveillance of the reactor core in order to perform the underlying statistical tests. The additional work required for surveillance is balanced by the fact that the resulting actions on reactor operations, implemented in station procedures, make the reactor 'safer' by increasing the operating margins. Furthermore, increased margins are also achieved by efficient solution techniques which may allow an increase in reactor power. A rigorous analysis of a compliance problem using statistical hypothesis testing based on extreme value probability distributions and actual reactor operational data leads to effective solutions in the areas of licensing, nuclear safety, reliability and competitiveness of operating nuclear reactors. (author)
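The statistical view of compliance described above, with parameter estimates treated as random variables and tested by Monte Carlo, can be illustrated with a minimal bootstrap check. The 95th-percentile criterion, the simulated channel data, and the limit value are hypothetical placeholders; the paper's actual tests are built on extreme value distributions and real operational data.

```python
import random
import statistics

def compliance_test(samples, limit, alpha=0.05, n_boot=2000, seed=1):
    """Monte Carlo (bootstrap) compliance check: declare compliance only
    if the upper (1 - alpha) confidence bound on the parameter's 95th
    percentile does not exceed the prescribed limit."""
    rng = random.Random(seed)
    boot = []
    for _ in range(n_boot):
        resample = [rng.choice(samples) for _ in samples]
        boot.append(statistics.quantiles(resample, n=100)[94])  # 95th pct
    boot.sort()
    upper = boot[int((1 - alpha) * n_boot) - 1]
    return upper <= limit

# simulated channel-power measurements (hypothetical units and limit)
rng = random.Random(0)
data = [rng.gauss(6.5, 0.1) for _ in range(500)]
compliant = compliance_test(data, limit=7.0)
```

Because the bound is taken on a high percentile rather than the mean, random measurement scatter is explicitly budgeted for, which is the sense in which the statistical formulation recovers operating margin.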

  18. Cross-cultural differences in cognitive performance and Spearman's hypothesis : g or c?

    NARCIS (Netherlands)

    Helms-Lorenz, M; Van de Vijver, FJR; Poortinga, YH

    2003-01-01

    Common tests of Spearman's hypothesis, according to which performance differences between cultural groups on cognitive tests increase with their g loadings, confound cognitive complexity and verbal-cultural aspects. The present study attempts to disentangle these components. Two intelligence

  19. Do the Emotional Benefits of Optimism Vary Across Older Adulthood? A Life Span Perspective.

    Science.gov (United States)

    Wrosch, Carsten; Jobin, Joelle; Scheier, Michael F

    2017-06-01

    This study examined whether the emotional benefits of dispositional optimism for managing stressful encounters decrease across older adulthood. Such an effect might emerge because age-related declines in opportunities for overcoming stressors could reduce the effectiveness of optimism. This hypothesis was tested in a 6-year longitudinal study of 171 community-dwelling older adults (age range = 64-90 years). Hierarchical linear models showed that dispositional optimism protected relatively young participants from exhibiting elevations in depressive symptoms over time, but that these benefits became increasingly reduced among their older counterparts. Moreover, the findings showed that an age-related association between optimism and depressive symptoms was observed particularly during periods of enhanced, as compared to reduced, stress. These results suggest that dispositional optimism protects emotional well-being during the early phases of older adulthood, but that its effects are reduced in advanced old age. © 2016 Wiley Periodicals, Inc.

  20. Review of domestic and international experience on optimization of tests planning for safety related systems at NPP

    International Nuclear Information System (INIS)

    Skalozubov, V.I.; Komarov, Yu.A.; Kolykanov, V.N.; Kochneva, V.Yu.; Gablaya, T.V.

    2009-01-01

    This review presents the basic requirements of normative and operating documents on the test periodicity of safety-related systems at NPPs, sets out theoretical methods for test optimization of technical systems, and analyses foreign engineering methods for changing the test periodicity of NPP systems. Based on this review, further tasks are formulated for improving the methodological basis for optimizing test planning for safety-related systems

  1. Optimal design of degradation tests in presence of cost constraint

    International Nuclear Information System (INIS)

    Wu, S.-J.; Chang, C.-T.

    2002-01-01

    Degradation testing is a useful technique for providing information about the lifetime of highly reliable products. Degradation measurements are obtained over time in such a test. In general, the degradation data are modeled by a nonlinear regression model with random coefficients. If the model parameters can be estimated, then the failure time distribution can be estimated as well. However, to obtain a precise estimate of a percentile of the failure time distribution, one needs to design an optimal degradation test. This study therefore proposes an approach to determine the number of units to test, the inspection frequency, and the termination time of a degradation test under a fixed experimental cost, such that the variance of the estimator of the percentile of the failure time distribution is minimized. The method is applied to a numerical example and a sensitivity analysis is discussed
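The structure of such a cost-constrained design problem can be sketched with a small grid search. The cost coefficients and the variance expression below are hypothetical stand-ins for the paper's nonlinear random-coefficients model; only the shape of the problem (minimize estimator variance subject to a cost budget) is taken from the abstract.

```python
from itertools import product

def optimal_plan(budget, c_unit=50.0, c_insp=2.0, c_op=0.05, sigma2=4.0):
    """Grid search for a degradation test plan: number of units n,
    inspection interval f (hours), and termination time t (hours),
    minimizing an illustrative variance proxy for the estimated
    failure-time percentile, subject to a total-cost constraint."""
    best = None
    for n, f, t in product(range(5, 31), (24, 48, 96, 168),
                           (500, 1000, 2000)):
        m = t // f                                  # inspections per unit
        cost = n * c_unit + n * m * c_insp + n * t * c_op
        if cost > budget:
            continue
        var = sigma2 * (1.0 / n + 1.0 / (n * m))    # shrinks with n and m
        if best is None or var < best[0]:
            best = (var, n, f, t, cost)
    return best

var, n, f, t, cost = optimal_plan(budget=5000.0)
```

The search makes the trade-off visible: adding units or inspections lowers the variance of the percentile estimator but consumes budget, so the optimum sits on the cost constraint.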

  2. Investigation and optimization of composting processes--test systems and practical examples

    International Nuclear Information System (INIS)

    Koerner, I.; Braukmeier, J.; Herrenklage, J.; Leikam, K.; Ritzkowski, M.; Schlegelmilch, M.; Stegmann, R.

    2003-01-01

    To determine the optimal course of composting it is useful to carry out experiments. The selection of the right experimental set-up depends on the question of concern; each set-up is useful for a particular application and has its limits. Two test systems of different scales (up to 1500 ml; up to 100 l) are introduced. The purpose and importance of each system design are highlighted by application examples: (1) suitability of a liquid industrial residue as a composting accelerator; (2) determination of compost maturity; (3) behaviour of odor-reducing additives during waste collection and composting; (4) production of tailor-made compost with respect to nitrogen; (5) suitability of O2-enriched air for acceleration of composting. Small-scale respiration experiments are useful to optimize parameters which have to be adjusted during substrate pre-treatment and composting, with the exception of particle size and temperature, and to reduce the number of variants which have to be investigated in greater detail in larger-scale experiments. As all regulation possibilities, such as aeration, moistening and turning, can be simulated with the technical-scale set-up, their complex interaction can be taken into consideration. Promising composting variants can be tested, compared and optimized

  3. Numerical thermal mathematical model correlation to thermal balance test using adaptive particle swarm optimization (APSO)

    International Nuclear Information System (INIS)

    Beck, T.; Bieler, A.; Thomas, N.

    2012-01-01

    We present structural and thermal model (STM) tests of the BepiColombo laser altimeter (BELA) receiver baffle with emphasis on the correlation of the data with a thermal mathematical model. The test unit is a part of the thermal and optical protection of the BELA instrument being tested under infrared and solar irradiation at the University of Bern. An iterative optimization method known as particle swarm optimization has been adapted to adjust the model parameters, mainly the linear conductivity, in such a way that model and test results match. The thermal model reproduces the thermal tests to an accuracy of 4.2 °C ± 3.2 °C in a temperature range of 200 °C after using only 600 iteration steps of the correlation algorithm. The use of this method brings major benefits to the accuracy of the results as well as to the computational time required for the correlation. Highlights: we present model correlations of the BELA receiver baffle to thermal balance tests; adaptive particle swarm optimization has been adapted for the correlation; the method improves the accuracy of the correlation and the computational time.
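Fitting a model parameter to test data with particle swarm optimization can be sketched as below. The one-node thermal model, the conductance bounds, and all coefficients are illustrative assumptions, not the BELA thermal model or the paper's adaptive PSO variant.

```python
import random

def model(k, q=10.0, t_env=20.0):
    """Toy steady-state thermal model: node temperature for a heat load q
    flowing through a linear conductance k (illustrative only)."""
    return t_env + q / k

def pso_fit(measured, n_particles=20, n_iter=100, bounds=(0.05, 5.0), seed=3):
    """Minimal particle swarm optimization adjusting the conductance so
    that the model temperature matches the 'test' temperature in the
    least-squares sense."""
    rng = random.Random(seed)
    lo, hi = bounds
    err = lambda k: (model(k) - measured) ** 2
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]
    gbest = min(pos, key=err)
    for _ in range(n_iter):
        for i in range(n_particles):
            # inertia + cognitive (personal best) + social (global best)
            vel[i] = (0.7 * vel[i]
                      + 1.5 * rng.random() * (pbest[i] - pos[i])
                      + 1.5 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if err(pos[i]) < err(pbest[i]):
                pbest[i] = pos[i]
            if err(pos[i]) < err(gbest):
                gbest = pos[i]
    return gbest

k_true = 0.8
k_fit = pso_fit(measured=model(k_true))
```

In the real correlation the single scalar is replaced by a vector of conductivities and the error by the mismatch over all sensors and test phases, but the swarm update is the same.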

  4. Hypothesis in research

    Directory of Open Access Journals (Sweden)

    Eudaldo Enrique Espinoza Freire

    2018-01-01

    This work is intended to provide material covering the fundamental content that enables the university professor to formulate hypotheses for the development of an investigation, taking into account the problem to be solved. For its elaboration, a search of information in primary documents was carried out, such as degree theses and reports of research results, selected on the basis of their relevance to the analyzed subject, currency and reliability, as well as secondary documents, such as scientific articles published in journals of recognized prestige; the selection was made with the same criteria as for the previous documents. It presents an updated conceptualization of the hypothesis, its characterization and an analysis of the structure of the hypothesis, in which the determination of the variables is examined in depth. The involvement of the university professor in the teaching-research process currently faces some difficulties, which are manifested, among other aspects, in an unstable balance between teaching and research, which leads to a separation between them.

  5. Testing the stress shadow hypothesis

    Science.gov (United States)

    Felzer, Karen R.; Brodsky, Emily E.

    2005-05-01

    A fundamental question in earthquake physics is whether aftershocks are predominantly triggered by static stress changes (permanent stress changes associated with fault displacement) or dynamic stresses (temporary stress changes associated with earthquake shaking). Both classes of models provide plausible explanations for earthquake triggering of aftershocks, but only the static stress model predicts stress shadows, or regions in which activity is decreased by a nearby earthquake. To test whether a main shock has produced a stress shadow, we calculate time ratios, defined as the ratio of the time between the main shock and the first earthquake to follow it to the time between the last earthquake to precede the main shock and the first earthquake to follow it. A single value of the time ratio is calculated for each 10 × 10 km bin within 1.5 fault lengths of the main shock epicenter. Large values of the time ratio indicate a long wait for the first earthquake to follow the main shock and thus a potential stress shadow, whereas small values indicate the presence of aftershocks. Simulations indicate that the time ratio test should have sufficient sensitivity to detect stress shadows if they are produced in accordance with the rate and state friction model. We evaluate the 1989 MW 7.0 Loma Prieta, 1992 MW 7.3 Landers, 1994 MW 6.7 Northridge, and 1999 MW 7.1 Hector Mine main shocks. For each main shock, there is a pronounced concentration of small time ratios, indicating the presence of aftershocks, but the number of large time ratios is less than at other times in the catalog. This suggests that stress shadows are not present. By comparing our results to simulations we estimate that we can be at least 98% confident that the Loma Prieta and Landers main shocks did not produce stress shadows and 91% and 84% confident that stress shadows were not generated by the Hector Mine and Northridge main shocks, respectively. 
We also investigate the long hypothesized existence
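The time ratio defined in the abstract above is a simple per-bin statistic; a direct transcription (function name and example times are illustrative):

```python
def time_ratio(t_before, t_main, t_after):
    """Time ratio for one spatial bin: the wait from the main shock to the
    first subsequent earthquake, divided by the interval between the last
    pre-main-shock event and that first subsequent event."""
    if not (t_before < t_main < t_after):
        raise ValueError("expected t_before < t_main < t_after")
    return (t_after - t_main) / (t_after - t_before)

# aftershock right after the main shock -> small ratio
r_small = time_ratio(t_before=0.0, t_main=10.0, t_after=10.1)
# long quiescence after the main shock -> ratio near 1 (candidate shadow)
r_large = time_ratio(t_before=9.9, t_main=10.0, t_after=30.0)
```

The ratio is always between 0 and 1, so a bin map of values near 1 flags candidate shadow regions without any assumption about the background rate.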

  6. Community-Driven Hypothesis Testing: A Solution for the Tragedy of the Anticommons.

    Science.gov (United States)

    Palma-Oliveira, José Manuel; Trump, Benjamin D; Wood, Matthew D; Linkov, Igor

    2018-03-01

    Shared ownership of property and resources is a longstanding challenge throughout history that has been amplified by the increasing development of industrial and postindustrial societies. Where governments, project planners, and commercial developers seek to develop new infrastructure, industrial projects, and various other land- and resource-intensive undertakings, veto power shared by various local stakeholders can complicate or halt progress. Risk communication has been used as an attempt to address stakeholder concerns in these contexts, but has demonstrated shortcomings. These coordination failures between project planners and stakeholders can be described as a specific kind of social dilemma that we describe as the "tragedy of the anticommons." To overcome such dilemmas, we demonstrate how a two-step process can directly address public mistrust of project planners and public perceptions of limited decision-making authority. This approach is examined via two separate empirical field experiments in Portugal and Tunisia, where public resistance and anticommons problems threatened to derail emerging industrial projects. In both applications, an intervention is undertaken to address initial public resistance to such projects, where specific public stakeholders and project sponsors collectively engaged in a hypothesis-testing process to identify and assess human and environmental health risks associated with proposed industrial facilities. These field experiments indicate that a rigorous attempt to address public mistrust and perceptions of power imbalances and change the pay-off structure of the given dilemma may help overcome such anticommons problems in specific cases, and may potentially generate enthusiasm and support for such projects by local publics moving forward. © 2017 Society for Risk Analysis.

  7. The origin of Phobos grooves from ejecta launched from impact craters on Mars: Tests of the hypothesis

    Science.gov (United States)

    Ramsley, Kenneth R.; Head, James W.

    2013-01-01

    The surface of the martian moon Phobos is characterized by parallel and intersecting grooves that bear resemblance to secondary crater chains observed on planetary surfaces. Murray (2011) has hypothesized that the main groove-forming process on Phobos is the intersection of Phobos with ejecta from primary impact events on Mars to produce chains of secondary craters. The hypothesis infers a pattern of parallel jets of ejecta, either fluidized or solidified, that break into equally-spaced fragments and disperse uniformly along-trajectory during the flight from Mars to Phobos. At the moment of impact with Phobos the dispersed fragments emplace secondary craters that are aligned along strike corresponding to the flight pattern of ejecta along trajectory. The aspects of the characteristics of grooves on Phobos cited by this hypothesis that might be explained by secondary ejecta include: their observed linearity, parallelism, planar alignment, pitted nature, change in character along strike, and a "zone of avoidance" where ejecta from Mars is predicted not to impact (Murray, 2011). To test the hypothesis we plot precise Keplerian orbits for ejecta from Mars (elliptical and hyperbolic with periapsis located below the surface of Mars). From these trajectories we: (1) set the fragment dispersion limits of ejecta patterns required to emplace the more typically well-organized parallel grooves observed in returned images from Phobos; (2) plot ranges of the ejecta flight durations from Mars to Phobos and map regions of exposure; (3) utilize the same exposure map to observe trajectory-defined ejecta exposure shadows; (4) observe hemispheric exposure in response to shorter and longer durations of ejecta flight; (5) assess the viability of ejecta emplacing the large family of grooves covering most of the northern hemisphere of Phobos; and (6) plot the arrival of parallel lines of ejecta emplacing chains of craters at oblique incident angles. We also assess the bulk volume of
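The Keplerian flight-time computations underlying such trajectory tests follow from Kepler's equation. The sketch below handles the elliptical case; the semi-major axis and eccentricity are illustrative values chosen so the apoapsis lies near Phobos' orbital radius, not figures from the paper.

```python
import math

MU_MARS = 4.2828e13  # Mars gravitational parameter, m^3/s^2

def time_from_periapsis(a, e, nu):
    """Time since periapsis passage on an elliptical Keplerian orbit
    (semi-major axis a in m, eccentricity e < 1, true anomaly nu in rad),
    via the eccentric anomaly and Kepler's equation."""
    # true anomaly -> eccentric anomaly
    E = 2.0 * math.atan2(math.sqrt(1.0 - e) * math.sin(nu / 2.0),
                         math.sqrt(1.0 + e) * math.cos(nu / 2.0))
    M = E - e * math.sin(E)                  # Kepler's equation
    n = math.sqrt(MU_MARS / a ** 3)          # mean motion
    return M / n

# flight time over a quarter revolution on an illustrative ejecta orbit
# whose apoapsis is near Phobos' orbital radius (~9,376 km)
a, e = 7.0e6, 0.35
dt = time_from_periapsis(a, e, math.pi / 2) - time_from_periapsis(a, e, 0.0)
```

Evaluating such times for families of orbits gives the flight-duration maps and exposure "shadows" described above; hyperbolic ejecta require the hyperbolic analogue of Kepler's equation.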

  8. The Human Release Hypothesis for biological invasions: human activity as a determinant of the abundance of invasive plant species [v1; ref status: indexed, http://f1000r.es/33c]

    Directory of Open Access Journals (Sweden)

    Heike Zimmermann

    2014-05-01

    Research on biological invasions has increased rapidly over the past 30 years, generating numerous explanations of how species become invasive. While the mechanisms of invasive species establishment are well studied, the mechanisms driving abundance patterns (i.e. patterns of population density remain poorly understood. Invasive species typically have higher abundances in their new environments than in their native ranges, and patterns of invasive species abundance differ between invaded regions. To explain differences in invasive species abundance, we propose the Human Release Hypothesis. In parallel to the established Enemy Release Hypothesis, this hypothesis states that the abundance of invasive species may be partly explained by the level of human activity or landscape maintenance, with intermediate levels of human activity providing optimal conditions for high abundance. The Human Release Hypothesis does not negate other important drivers of species invasions, but rather should be considered as a potentially important additional or complementary mechanism. We illustrate the hypothesis via a case study on an invasive rose species, and hypothesize which locations globally may be most likely to support high abundances of invasive species. We propose that more extensive empirical work on the Human Release Hypothesis could be useful to test its general applicability.

  9. Further Evidence on the Weak and Strong Versions of the Screening Hypothesis in Greece.

    Science.gov (United States)

    Lambropoulos, Haris S.

    1992-01-01

    Uses Greek data for 1981 and 1985 to test screening hypothesis by replicating method proposed by Psacharopoulos. Credentialism, or sheepskin effect of education, directly challenges human capital theory, which views education as a productivity augmenting process. Results do not support the strong version of the screening hypothesis and suggest…

  10. Unicorns do exist: a tutorial on "proving" the null hypothesis.

    Science.gov (United States)

    Streiner, David L

    2003-12-01

    Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nil" hypothesis (there is no difference), reversing the role of the null and alternate hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
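A noninferiority sample-size calculation for comparing two means can be sketched with the standard normal-approximation formula; the one-sided alpha, power, standard deviation, and margin below are conventional illustrative choices, not necessarily the ones used in the article's worked example.

```python
from math import ceil
from statistics import NormalDist

def noninferiority_n(sigma, margin, alpha=0.025, power=0.80, true_diff=0.0):
    """Per-group sample size for a noninferiority comparison of two means
    (one-sided alpha), using the normal-approximation formula
    n = 2 * sigma^2 * (z_alpha + z_beta)^2 / (margin - true_diff)^2."""
    z_a = NormalDist().inv_cdf(1.0 - alpha)   # e.g. 1.96 for alpha = 0.025
    z_b = NormalDist().inv_cdf(power)         # e.g. 0.84 for 80% power
    n = 2.0 * sigma ** 2 * (z_a + z_b) ** 2 / (margin - true_diff) ** 2
    return ceil(n)

# e.g. outcome sd = 10, noninferiority margin = 5 points on the scale
n_per_group = noninferiority_n(sigma=10.0, margin=5.0)
```

Note that the margin, not a hoped-for difference of zero, sits in the denominator: the narrower the interval within which the groups are declared equivalent, the larger the required sample.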

  11. Is the Aluminum Hypothesis Dead?

    Science.gov (United States)

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  12. Optimal HIV testing and earlier care: the way forward in Europe

    DEFF Research Database (Denmark)

    Coenen, T; Lundgren, J; Lazarus, Jeff

    2008-01-01

    The articles in this supplement were developed from a recent pan-European conference entitled 'HIV in Europe 2007: Working together for optimal testing and earlier care', which took place on 26-27 November in Brussels, Belgium. The conference, organized by a multidisciplinary group of experts rep...

  13. Isotopic Resonance Hypothesis: Experimental Verification by Escherichia coli Growth Measurements

    Science.gov (United States)

    Xie, Xueshu; Zubarev, Roman A.

    2015-03-01

    Isotopic composition of reactants affects the rates of chemical and biochemical reactions. As a rule, enrichment of heavy stable isotopes leads to progressively slower reactions. But the recent isotopic resonance hypothesis suggests that the dependence of the reaction rate upon the enrichment degree is not monotonic. Instead, at some "resonance" isotopic compositions the kinetics speed up, while at "off-resonance" compositions the same reactions progress more slowly. To test the predictions of this hypothesis for the elements C, H, N and O, we designed a precise (standard error ±0.05%) experiment that measures the parameters of bacterial growth in minimal media with varying isotopic composition. A number of predicted resonance conditions were tested, with significant enhancements in kinetics discovered at these conditions. The combined statistics very strongly support the validity of the isotopic resonance phenomenon, with potential implications for biotechnology, medicine, chemistry and other areas.

  14. A common optimization principle for motor execution in healthy subjects and parkinsonian patients.

    Science.gov (United States)

    Baraduc, Pierre; Thobois, Stéphane; Gan, Jing; Broussolle, Emmanuel; Desmurget, Michel

    2013-01-09

    Recent research on Parkinson's disease (PD) has emphasized that parkinsonian movement, although bradykinetic, shares many attributes with healthy behavior. This observation led to the suggestion that bradykinesia in PD could be due to a reduction in motor motivation. This hypothesis can be tested in the framework of optimal control theory, which accounts for many characteristics of healthy human movement while providing a link between the motor behavior and a cost/benefit trade-off. This approach offers the opportunity to interpret movement deficits of PD patients in the light of a computational theory of normal motor control. We studied 14 PD patients with bilateral subthalamic nucleus (STN) stimulation and 16 age-matched healthy controls, and tested whether reaching movements were governed by similar rules in these two groups. A single optimal control model accounted for the reaching movements of healthy subjects and PD patients, whatever the condition of STN stimulation (on or off). The choice of movement speed was explained in all subjects by the existence of a preset dynamic range for the motor signals. This range was idiosyncratic and applied to all movements regardless of their amplitude. In PD patients this dynamic range was abnormally narrow and correlated with bradykinesia. STN stimulation reduced bradykinesia and widened this range in all patients, but did not restore it to a normal value. These results, consistent with the motor motivation hypothesis, suggest that constrained optimization of motor effort is the main determinant of movement planning (choice of speed) and movement production, in both healthy and PD subjects.

  15. Ontogeny of Foraging Competence in Capuchin Monkeys (Cebus capucinus) for Easy versus Difficult to Acquire Fruits: A Test of the Needing to Learn Hypothesis.

    Directory of Open Access Journals (Sweden)

    Elizabeth Christine Eadie

    Which factors select for long juvenile periods in some species is not well understood. One potential reason to delay the onset of reproduction is slow food acquisition rates, either due to competition (part of the ecological risk avoidance hypothesis), or due to a decreased foraging efficiency (a version of the needing to learn hypothesis). Capuchins provide a useful genus to test the needing to learn hypothesis because they are known for having long juvenile periods and a difficult-to-acquire diet. Generalized, linear, mixed models with data from 609 fruit forage focal follows on 49 habituated, wild Cebus capucinus were used to test two predictions from the needing-to-learn hypothesis as it applies to fruit foraging skills: 1) capuchin monkeys do not achieve adult foraging return rates for difficult-to-acquire fruits before late in the juvenile period; and 2) variance in return rates for these fruits is at least partially associated with differences in foraging skill. In support of the first prediction, adults, compared with all younger age classes, had significantly higher foraging return rates when foraging for fruits that were ranked as difficult to acquire (return rates relative to adults: 0.30-0.41, p-value range 0.008-0.016), indicating that the individuals in the group who have the most foraging experience also achieve the highest return rates. In contrast, and in support of the second prediction, there were no significant differences between age classes for fruits that were ranked as easy to acquire (return rates relative to adults: 0.97-1.42, p-value range 0.086-0.896), indicating that strength and/or skill are likely to affect return rates. In addition, fruits that were difficult to acquire were foraged at nearly identical rates by adult males and significantly smaller (and presumably weaker) adult females (males relative to females: 1.01, p = 0.978), while subadult females had much lower foraging efficiency than the similarly-sized but more experienced


  17. The estrogen hypothesis of schizophrenia implicates glucose metabolism

    DEFF Research Database (Denmark)

    Olsen, Line; Hansen, Thomas; Jakobsen, Klaus D

    2008-01-01

    …expression studies have indicated an equally large set of candidate genes that only partially overlap linkage genes. A thorough assessment, beyond the resolution of current GWA studies, of the disease risk conferred by the numerous schizophrenia candidate genes is a daunting and presently not feasible task. We undertook these challenges by using an established clinical paradigm, the estrogen hypothesis of schizophrenia, as the criterion to select candidates among the numerous genes experimentally implicated in schizophrenia. Bioinformatic tools were used to build and prioritize the signaling networks implicated by the candidate genes resulting from the estrogen selection. We identified ten candidate genes using this approach that are all active in glucose metabolism, and particularly in glycolysis. Thus, we tested the hypothesis that variants of the glycolytic genes are associated with schizophrenia…

  18. Optimizing infrastructure for software testing using virtualization

    International Nuclear Information System (INIS)

    Khalid, O.; Shaikh, A.; Copy, B.

    2012-01-01

    Virtualization technology and cloud computing have brought a paradigm shift in the way we utilize, deploy and manage computer resources. They allow fast deployment of multiple operating systems as containers on physical machines, which can be either discarded after use or check-pointed for later re-deployment. At the European Organization for Nuclear Research (CERN), we have been using virtualization technology to quickly set up virtual machines for our developers with pre-configured software, enabling them to quickly test and deploy a new version of a software patch for a given application. This paper reports on the techniques that have been used to set up a private cloud on commodity hardware and also presents the optimization techniques we used to remove deployment-specific performance bottlenecks. (authors)

  19. The Lehman Sisters Hypothesis: an exploration of literature and bankers

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2012-01-01

    This article tests the Lehman Sisters Hypothesis in two complementary, although incomplete, ways. It reviews the diverse empirical literature in behavioral, experimental, and neuroeconomics as well as related fields of behavioral research. And it presents the findings from an


  1. Testing the phenotype-linked fertility hypothesis in the presence and absence of inbreeding

    Czech Academy of Sciences Publication Activity Database

    Forstmeier, W.; Ihle, M.; Opatová, Pavlína; Martin, K.; Knief, U.; Albrechtová, Jana; Albrecht, Tomáš; Kempenaers, B.

    2017-01-01

    Roč. 30, č. 5 (2017), s. 968-976 ISSN 1010-061X R&D Projects: GA ČR(CZ) GAP506/12/2472 Institutional support: RVO:68081766 Keywords : display behaviour * mate choice * phenotype-linked fertility hypothesis * precopulatory traits * sexual selection * sperm abnormalities * sperm quality * sperm velocity Subject RIV: EB - Genetics ; Molecular Biology OBOR OECD: Genetics and heredity (medical genetics to be 3) Impact factor: 2.792, year: 2016

  2. Development and flight testing of UV optimized Photon Counting CCDs

    Science.gov (United States)

    Hamden, Erika T.

    2018-06-01

    I will discuss the latest results from the Hamden UV/Vis Detector Lab and our ongoing work using a UV-optimized EMCCD in flight. Our lab is currently testing the efficiency and performance of delta-doped, anti-reflection-coated EMCCDs, in collaboration with JPL. The lab has been set up to test quantum efficiency, dark current, clock-induced charge, and read noise. I will describe our improvements to our circuit boards for lower noise, updates from a new, more flexible NUVU controller, and the integration of an EMCCD in the FIREBall-2 UV spectrograph. I will also briefly describe plans to conduct radiation testing on delta-doped EMCCDs (in both warm, unbiased and cold, biased configurations) this summer, and longer-term plans for testing newer photon-counting CCDs as I move the HUVD Lab to the University of Arizona in the fall of 2018.

  3. Testing the "Wildfire Hypothesis": Terrestrial Organic Carbon Burning as the Cause of the Paleocene-Eocene Boundary Carbon Isotope Excursion

    Science.gov (United States)

    Moore, E. A.; Kurtz, A. C.

    2005-12-01

    The 3‰ negative carbon isotope excursion (CIE) at the Paleocene-Eocene boundary has generally been attributed to dissociation of seafloor methane hydrates. We are testing the alternative hypothesis that the carbon cycle perturbation resulted from wildfires affecting the extensive peatlands and coal swamps formed in the Paleocene. Accounting for the CIE with terrestrial organic carbon rather than methane requires a significantly larger net release of fossil carbon to the ocean-atmosphere, which may be more consistent with the extreme global warming and ocean acidification characteristic of the Paleocene-Eocene Thermal Maximum (PETM). While other researchers have noted evidence of fires at the Paleocene-Eocene boundary in individual locations, the research presented here is designed to test the "wildfire hypothesis" for the Paleocene-Eocene boundary by examining marine sediments for evidence of a global increase in wildfire activity. Such fires would produce massive amounts of soot, widely distributed by wind and well preserved in marine sediments as refractory black carbon. We expect that global wildfires occurring at the Paleocene-Eocene boundary would produce a peak in black carbon abundance at the PETM horizon. We are using the method of Gelinas et al. (2001) to produce high-resolution concentration profiles of black carbon across the Paleocene-Eocene boundary using seafloor sediments from ODP cores, beginning with the Bass River core from ODP leg 174AX and site 1209 from ODP leg 198. This method involves the chemical and thermal extraction of non-refractory carbon followed by combustion of the residual black carbon and measurement as CO2. Measurement of the δ 13C of the black carbon will put additional constraints on the source of the organic material combusted, and will allow us to determine if this organic material was formed prior to or during the CIE.

  4. Analysis of a PCB In-Circuit Test and Its Optimized Cycle

    International Nuclear Information System (INIS)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung

    2011-01-01

    KHNP performs subcomponent performance tests of the PCBs (printed circuit boards) installed in safety-related systems or plant trip-related systems during every outage. The characteristics of each subcomponent are measured by test equipment; this test is known as an ICT (in-circuit test). If a degraded condition is detected by this test, the affected subcomponents are replaced. This test has been conducted for 17 years, since 1994, and its results have been compiled into a test system database. As part of a reliability improvement plan for critical PCBs, KHNP developed a program that analyzes the performance of various key PCBs based on this test data. It thus became possible to evaluate PCB performance trends by tracing the test history of the PCB subcomponents through the ICT over many years. The present study also estimates an optimized ICT cycle that can be implemented to prevent the degradation of PCBs before they fail due to aging

  5. Simultaneity modeling analysis of the environmental Kuznets curve hypothesis

    International Nuclear Information System (INIS)

    Ben Youssef, Adel; Hammoudeh, Shawkat; Omri, Anis

    2016-01-01

    The environmental Kuznets curve (EKC) hypothesis has been recognized in the environmental economics literature since the 1990s. Various statistical tests have been applied to time series, cross-section and panel data for single countries and groups of countries to validate this hypothesis. In the literature, the validation has always been conducted using a single equation. However, since both the environment and income variables are endogenous, estimating a single-equation model when simultaneity exists produces inconsistent and biased estimates. Therefore, we formulate simultaneous two-equation models to investigate the EKC hypothesis for fifty-six countries, using annual panel data from 1990 to 2012, with the end year determined by data availability. To make the panel data analysis more homogeneous, we investigate this issue for three income-based panels (namely, high-, middle-, and low-income panels) given several explanatory variables. Our results indicate that there exists bidirectional causality between economic growth and pollution emissions in the overall panels. We also find that the relationship is nonlinear and has an inverted U-shape for all the considered panels. Policy implications are provided. - Highlights: • We take a new look at the validity of the EKC hypothesis. • We formulate two simultaneous-equation models to validate this hypothesis for fifty-six countries. • We find bidirectional causality between economic growth and pollution emissions. • We also discover an inverted U-shaped relationship between environmental degradation and economic growth. • This relationship varies at different stages of economic development.

  6. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
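    Johnson's recommended alternatives can be made concrete. As an illustrative sketch (not code from the paper; the function name and the normal-approximation critical value z = 1.96 are assumptions of this sketch), the following computes an approximate 95% confidence interval for a mean, reporting effect size and uncertainty where a bare P-value would not:

```python
import math
import statistics

def mean_ci(data, z=1.96):
    """Approximate 95% confidence interval for the mean of `data`,
    using the normal approximation (z = 1.96)."""
    m = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(len(data))  # standard error
    return m - z * se, m + z * se

# A reader can judge importance from the interval's width and location,
# rather than from a binary reject/accept decision.
lo, hi = mean_ci(list(range(1, 101)))
```

    The interval conveys how precisely the quantity is estimated, which is the information a biological-significance judgment actually needs.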

  7. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    Science.gov (United States)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies, such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment utilizes a Bayesian belief propagation procedure and includes automated updating of baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite features are reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking is designed to maximize new information with the fewest photometry data points collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses about the fine features of the satellite; the optimal observation conditions are then ranked to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor.
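    The sequential ingredient of such a scheme can be illustrated with Wald's classical sequential probability ratio test, on which adaptive sequential testing builds. This is a minimal sketch assuming Bernoulli observations, not the authors' algorithm; the hypothesized rates p0 and p1 and the error targets are illustrative:

```python
import math

def sprt(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Bernoulli rate:
    H0: p = p0 versus H1: p = p1, with error targets alpha and beta.
    Returns the decision and the number of samples consumed."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio increment for one observation.
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)
```

    The appeal for sparse synoptic data is that the test stops as soon as the evidence suffices; with these parameters, `sprt([1] * 20)` decides for H1 after only four observations.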

  8. Visibility-Based Hypothesis Testing Using Higher-Order Optical Interference

    Science.gov (United States)

    Jachura, Michał; Jarzyna, Marcin; Lipka, Michał; Wasilewski, Wojciech; Banaszek, Konrad

    2018-03-01

    Many quantum information protocols rely on optical interference to compare data sets with efficiency or security unattainable by classical means. Standard implementations exploit first-order coherence between signals whose preparation requires a shared phase reference. Here, we analyze and experimentally demonstrate the binary discrimination of visibility hypotheses based on higher-order interference for optical signals with a random relative phase. This provides a robust protocol implementation primitive when a phase lock is unavailable or impractical. With the primitive cost quantified by the total detected optical energy, optimal operation is typically reached in the few-photon regime.

  9. Design optimization and fatigue testing of an electronically-driven mechanically-resonant cantilever spring mechanism

    International Nuclear Information System (INIS)

    Kheng, Lim Boon; Kean, Koay Loke; Gitano-Briggs, Horizon

    2010-01-01

    A light scanning device consisting of an electronically-driven mechanically-resonant cantilever spring-mirror system has been developed for innovative lighting applications. The repeated flexing of the cantilever spring during operation can lead to premature fatigue failure. A model was created to optimize the spring design. The optimized spring design can reduce stress by approximately one-third from the initial design. Fatigue testing showed that the optimized spring design can operate continuously for over 1 month without failure. Analysis of failures indicates surface cracks near the root of the spring are responsible for the failures.

  10. The particle swarm optimization algorithm applied to nuclear systems surveillance test planning

    International Nuclear Information System (INIS)

    Siqueira, Newton Norat

    2006-12-01

    This work shows a new approach to solving availability maximization problems in electromechanical systems under periodic preventive scheduled tests. The approach uses Particle Swarm Optimization (PSO), an optimization tool developed by Kennedy and Eberhart (2001), integrated with a probabilistic safety analysis model. Two maintenance optimization problems are solved with the proposed technique: the first is a hypothetical electromechanical configuration and the second is a real case from a nuclear power plant (emergency diesel generators). For both problems, PSO is compared to a genetic algorithm (GA). In the experiments, PSO obtained results comparable to, or even slightly better than, those obtained by GA. Moreover, the PSO algorithm is simpler and converges faster, indicating that PSO is a good alternative for solving such problems. (author)
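    As a minimal illustration of the PSO mechanics referred to above (not the author's implementation, which couples PSO to a probabilistic safety model), the following sketch minimizes a caller-supplied objective; the inertia weight and acceleration coefficients are common textbook defaults, assumed here for illustration:

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer: each particle is pulled toward
    its personal best and the swarm's global best position."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:        # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:       # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

    On a smooth test objective such as the 2-D sphere function, `pso(lambda x: sum(v * v for v in x))` converges to near the origin.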

  11. The environmental convergence hypothesis: Carbon dioxide emissions according to the source of energy

    International Nuclear Information System (INIS)

    Herrerias, M.J.

    2013-01-01

    The aim of this paper is to investigate the environmental convergence hypothesis for carbon dioxide emissions in a large group of developed and developing countries from 1980 to 2009. The novel aspect of this work is that we distinguish among carbon dioxide emissions according to the source of energy (coal, natural gas and petroleum) instead of considering the aggregate measure of per capita carbon dioxide emissions, with notable attention given to the regional dimension through the application of new club convergence tests. This allows us to determine the convergence behaviour of emissions more precisely and to detect it according to the source of energy used, thereby helping to address environmental targets. More specifically, the convergence hypothesis is examined with a pair-wise test, and a second test is used to check for the existence of club convergence. Our results from the pair-wise test indicate that carbon dioxide emissions for each type of energy diverge. However, club convergence is found for a large group of countries, although some still display divergence. These findings point to the need to apply specific environmental policies to each club detected, since specific countries converge to different clubs. - Highlights: • The environmental convergence hypothesis is investigated across countries. • We perform a pair-wise test and a club convergence test. • Results from the first of these two tests suggest that carbon dioxide emissions are diverging. • However, we find that carbon dioxide emissions are converging within groups of countries. • Active environmental policies are required.

  12. Testing the Limits of Optimizing Dual-Task Performance in Younger and Older Adults

    Science.gov (United States)

    Strobach, Tilo; Frensch, Peter; Müller, Herrmann Josef; Schubert, Torsten

    2012-01-01

    Impaired dual-task performance in younger and older adults can be improved with practice. Optimal conditions even allow for a (near) elimination of this impairment in younger adults. However, it is unknown whether such (near) elimination is the limit of performance improvements in older adults. The present study tests this limit in older adults under conditions of (a) a high amount of dual-task training and (b) training with simplified component tasks in dual-task situations. The data showed that a high amount of dual-task training in older adults provided no evidence for an improvement of dual-task performance to the optimal dual-task performance level achieved by younger adults. However, training with simplified component tasks in dual-task situations exclusively in older adults provided a similar level of optimal dual-task performance in both age groups. Therefore, by applying a testing-the-limits approach, we demonstrated that older adults improved dual-task performance to the same level as younger adults at the end of training under very specific conditions. PMID:22408613

  13. A test of the nest sanitation hypothesis for the evolution of foreign egg rejection in an avian brood parasite rejecter host species.

    Science.gov (United States)

    Luro, Alec B; Hauber, Mark E

    2017-04-01

    Hosts of avian brood parasites have evolved diverse defenses to avoid the costs associated with raising brood parasite nestlings. In egg ejection, the host recognizes and removes foreign eggs laid in its nest. Nest sanitation, a behavior similar in motor pattern to egg ejection, has been proposed repeatedly as a potential pre-adaptation to egg ejection. Here, we separately placed blue 3D-printed, brown-headed cowbird (Molothrus ater) eggs known to elicit interindividual variation in ejection responses and semi-natural leaves into American robins' (Turdus migratorius) nests to test proximate predictions that (1) rejecter hosts should sanitize debris from nests more frequently and consistently than accepter hosts and (2) hosts that sanitize their nests of debris prior to the presentation of a foreign egg will be more likely to eject the foreign egg. Egg ejection responses were highly repeatable within individuals yet variable between them, but were not influenced by prior exposure to debris, nor related to sanitation tendencies as a whole, because nearly all individuals sanitized their nests. Additionally, we collected published data for eight different host species to test for a potential positive correlation between sanitation and egg ejection. We found no significant correlation between nest sanitation and egg ejection rates; however, our comparative analysis was limited to a sample size of 8, and we advise that more data from additional species are necessary to properly address interspecific tests of the pre-adaptation hypothesis. In the absence of support for the nest sanitation hypothesis, our study suggests that, within individuals, foreign egg ejection is distinct from nest sanitation tendencies, and sanitation and foreign egg ejection may not correlate across species.

  14. A model of optimal voluntary muscular control.

    Science.gov (United States)

    FitzHugh, R

    1977-07-19

    In the absence of detailed knowledge of how the CNS controls a muscle through its motor fibers, a reasonable hypothesis is that of optimal control. This hypothesis is studied using a simplified mathematical model of a single muscle, based on A.V. Hill's equations, with the series elastic element omitted, and with the motor signal represented by a single input variable. Two cost functions were used. The first was total energy expended by the muscle (work plus heat). If the load is a constant force, with no inertia, Hill's optimal velocity of shortening results. If the load includes a mass, analysis by optimal control theory shows that the motor signal to the muscle consists of three phases: (1) maximal stimulation to accelerate the mass to the optimal velocity as quickly as possible, (2) an intermediate level of stimulation to hold the velocity at its optimal value, once reached, and (3) zero stimulation, to permit the mass to slow down, as quickly as possible, to zero velocity at the specified distance shortened. If the latter distance is too small, or the mass too large, the optimal velocity is not reached, and phase (2) is absent. For lengthening, there is no optimal velocity; there are only two phases, zero stimulation followed by maximal stimulation. The second cost function was total time. The optimal control for shortening consists of only phases (1) and (3) above, and is identical to the minimal energy control whenever phase (2) is absent from the latter. Generalizations of this model to include viscous loads and a series elastic element are discussed.
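    The three-phase structure of the minimal-energy control can be sketched kinematically. The toy planner below is an illustrative abstraction, not FitzHugh's model: it ignores Hill's force-velocity dynamics and treats the two stimulation extremes as constant accelerations, but it reproduces the phase structure, including the degenerate case in which phase (2) is absent:

```python
def three_phase_plan(a_max, v_opt, a_brake, distance):
    """Kinematic sketch of the three-phase minimal-energy control:
    (1) accelerate at a_max up to v_opt, (2) cruise at v_opt,
    (3) decelerate at a_brake to rest at the target distance.
    Returns the three phase durations; phase (2) is absent (zero)
    when the distance is too short for v_opt to be reached."""
    d1 = v_opt ** 2 / (2 * a_max)    # distance covered while accelerating
    d3 = v_opt ** 2 / (2 * a_brake)  # distance covered while braking
    if d1 + d3 <= distance:
        t1 = v_opt / a_max
        t3 = v_opt / a_brake
        t2 = (distance - d1 - d3) / v_opt
    else:
        # v_opt is never reached: pure accelerate-then-brake profile,
        # peak velocity from d = v^2 (a_max + a_brake) / (2 a_max a_brake)
        v_peak = (2 * distance * a_max * a_brake / (a_max + a_brake)) ** 0.5
        t1, t2, t3 = v_peak / a_max, 0.0, v_peak / a_brake
    return t1, t2, t3
```

    For example, with unit optimal velocity and symmetric accelerations, shortening over a long distance yields all three phases, while a short distance collapses the plan to accelerate-then-brake, mirroring the paper's observation that phase (2) disappears when the distance is too small.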

  15. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
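    The truncated abstract does not reproduce the simplification itself, but the statistic in question can be stated concretely. Below is a sketch of the standard likelihood ratio (G) statistic for multinomial goodness of fit, alongside the Pearson chi-squared statistic to which it is asymptotically equivalent; the function names are illustrative, not from the paper:

```python
import math

def g_statistic(observed, expected):
    """Likelihood ratio statistic G = 2 * sum O_i * ln(O_i / E_i);
    zero-count cells contribute nothing, by the usual convention."""
    return 2.0 * sum(o * math.log(o / e)
                     for o, e in zip(observed, expected) if o > 0)

def pearson_x2(observed, expected):
    """Pearson chi-squared statistic X^2 = sum (O_i - E_i)^2 / E_i."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

    For mild departures from the null the two statistics nearly coincide; for observed counts [55, 45] against expected [50, 50], X² = 1.0 and G ≈ 1.002, both referred to a chi-squared distribution with one degree of freedom.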

  16. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    Science.gov (United States)

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Allowed outage time for test and maintenance - Optimization of safety

    International Nuclear Information System (INIS)

    Cepin, M.; Mavko, B.

    1997-01-01

    The main objective of the project is the development and application of methodologies for the improvement and optimization of test and maintenance activities for safety-related equipment in NPPs on the basis of enhanced safety. The probabilistic safety assessment serves as the base; this does not mean replacing the deterministic analyses, but rather considering probabilistic safety assessment results as a complement to deterministic results. 15 refs, 2 figs

  18. Optimizing the design of a reproduction toxicity test with the pond snail Lymnaea stagnalis.

    Science.gov (United States)

    Charles, Sandrine; Ducrot, Virginie; Azam, Didier; Benstead, Rachel; Brettschneider, Denise; De Schamphelaere, Karel; Filipe Goncalves, Sandra; Green, John W; Holbech, Henrik; Hutchinson, Thomas H; Faber, Daniel; Laranjeiro, Filipe; Matthiessen, Peter; Norrgren, Leif; Oehlmann, Jörg; Reategui-Zirena, Evelyn; Seeland-Fremer, Anne; Teigeler, Matthias; Thome, Jean-Pierre; Tobor Kaplon, Marysia; Weltje, Lennart; Lagadic, Laurent

    2016-11-01

    This paper presents the results from two ring-tests addressing the feasibility, robustness and reproducibility of a reproduction toxicity test with the freshwater gastropod Lymnaea stagnalis (RENILYS strain). Sixteen laboratories (from inexperienced to expert laboratories in mollusc testing) from nine countries participated in these ring-tests. Survival and reproduction were evaluated in L. stagnalis exposed to cadmium, tributyltin, prochloraz and trenbolone according to an OECD draft Test Guideline. In total, 49 datasets were analysed to assess the practicability of the proposed experimental protocol, and to estimate the between-laboratory reproducibility of toxicity endpoint values. The statistical analysis of count data (number of clutches or eggs per individual-day) leading to ECx estimation was specifically developed and automated through a free web-interface. Based on a complementary statistical analysis, the optimal test duration was established and the most sensitive and cost-effective reproduction toxicity endpoint was identified, to be used as the core endpoint. This validation process and the resulting optimized protocol were used to consolidate the OECD Test Guideline for the evaluation of reproductive effects of chemicals in L. stagnalis. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm-based approach for determining the optimal input distributions for generating random test vectors is proposed in this paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented, and the performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overhead of the GA in computing the input distributions is larger.
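    The COP-based cost function of the paper is not reproduced here, but the GA machinery it relies on can be sketched. The toy real-coded GA below (tournament selection, uniform crossover, Gaussian mutation; every parameter value is an illustrative assumption, not the authors' setting) maximizes a caller-supplied fitness over a vector of signal probabilities in [0, 1]:

```python
import random

def ga_optimize(fitness, dim, pop_size=30, gens=100, mut_rate=0.1, seed=2):
    """Toy real-coded genetic algorithm: elitism, size-3 tournament
    selection, uniform crossover, and clamped Gaussian mutation.
    `fitness` maps a probability vector to a score to maximize."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        new_pop = scored[:2]                              # keep the elite
        while len(new_pop) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)     # tournament
            p2 = max(rng.sample(pop, 3), key=fitness)
            child = [a if rng.random() < 0.5 else b       # uniform crossover
                     for a, b in zip(p1, p2)]
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.1)))
                     if rng.random() < mut_rate else g    # mutate some genes
                     for g in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

    With a toy quadratic fitness peaked at an input probability of 0.8 per signal, the evolved vector clusters near that optimum; in the paper's setting the fitness would instead be the COP-based testability cost.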

  20. On the optimal number of classes in the Pearson goodness-of-fit tests

    Czech Academy of Sciences Publication Activity Database

    Morales, D.; Pardo, L.; Vajda, Igor

    2005-01-01

    Roč. 41, č. 6 (2005), s. 677-698 ISSN 0023-5954 R&D Projects: GA AV ČR(CZ) IAA1075403 Grant - others:BFM(ES) 2003-00892; GV(ES) 04B-670 Institutional research plan: CEZ:AV0Z10750506 Keywords : Pearson-type goodness-of-fit tests * asymptotic local test power * asymptotic equivalence of tests * optimal number of classes Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005

  1. An Evaluation of the Sniffer Global Optimization Algorithm Using Standard Test Functions

    Science.gov (United States)

    Butler, Roger A. R.; Slaminka, Edward E.

    1992-03-01

    The performance of Sniffer—a new global optimization algorithm—is compared with that of Simulated Annealing. Using the number of function evaluations as a measure of efficiency, the new algorithm is shown to be significantly better at finding the global minimum of seven standard test functions. Several of the test functions used have many local minima and very steep walls surrounding the global minimum. Such functions are intended to thwart global minimization algorithms.

  2. Test of determination of nucleon structure functions in the hypothesis of scalar di-quark existence

    International Nuclear Information System (INIS)

    Tavernier, P.; Dugne, J.J.

    1992-01-01

    The authors present the nucleon structure functions obtained under the hypothesis of the existence of a scalar di-quark that is progressively broken as the energy of the electromagnetic probe increases (Stockholm model). Comparisons with other models and with experimental results are presented. 20 figs

  3. Anatomical Thin Titanium Mesh Plate Structural Optimization for Zygomatic-Maxillary Complex Fracture under Fatigue Testing

    Directory of Open Access Journals (Sweden)

    Yu-Tzu Wang

    2018-01-01

    This study performs a structural optimization of an anatomical thin titanium mesh (ATTM) plate; the optimally designed ATTM plate was fabricated using additive manufacturing (AM) to verify its stabilization under fatigue testing. Finite element (FE) analysis was used to simulate the structural bending resistance of a regular ATTM plate. The Taguchi method was employed to identify the significance of each design factor in controlling the deflection and to determine an optimal combination of design factors. The optimally designed ATTM plate with a patient-matched facial contour was fabricated using AM and applied to a ZMC comminuted fracture to evaluate the resting maxillary micromotion/strain under fatigue testing. The Taguchi analysis found that the ATTM plate required an internal hole distance of 0.9 mm, an internal hole diameter of 1 mm, a plate thickness of 0.8 mm, and a plate height of 10 mm. The plate thickness factor primarily dominated the bending resistance, with an importance of up to 78%. The averaged micromotion (displacement) and strain of the maxillary bone showed that ZMC fracture fixation using the miniplate was significantly higher than that using the AM optimally designed ATTM plate. This study concluded that an optimally designed ATTM plate with enough strength to resist bending can be obtained by combining FE and Taguchi analyses. The optimally designed ATTM plate with a patient-matched facial contour fabricated using AM provides superior stabilization for ZMC comminuted fractured bone segments.

  4. Order information and free recall: evaluating the item-order hypothesis.

    Science.gov (United States)

    Mulligan, Neil W; Lozito, Jeffrey P

    2007-05-01

    The item-order hypothesis proposes that order information plays an important role in recall from long-term memory, and it is commonly used to account for the moderating effects of experimental design in memory research. Recent research (Engelkamp, Jahn, & Seiler, 2003; McDaniel, DeLosh, & Merritt, 2000) raises questions about the assumptions underlying the item-order hypothesis. Four experiments tested these assumptions by examining the relationship between free recall and order memory for lists of varying length (8, 16, or 24 unrelated words or pictures). Some groups were given standard free-recall instructions, other groups were explicitly instructed to use order information in free recall, and other groups were given free-recall tests intermixed with tests of order memory (order reconstruction). The results for short lists were consistent with the assumptions of the item-order account. For intermediate-length lists, explicit order instructions and intermixed order tests made recall more reliant on order information, but under standard conditions, order information played little role in recall. For long lists, there was little evidence that order information contributed to recall. In sum, the assumptions of the item-order account held for short lists, received mixed support with intermediate lists, and received no support for longer lists.

  5. The community conditioning hypothesis and its application to environmental toxicology

    International Nuclear Information System (INIS)

    Matthews, R.A.; Landis, W.G.; Matthews, G.B.

    1996-01-01

    In this paper the authors present the community conditioning hypothesis: ecological communities retain information about events in their history. This hypothesis, which was derived from the concept of nonequilibrium community ecology, was developed as a framework for understanding the persistence of dose-related responses in multispecies toxicity tests. The authors present data from three standardized aquatic microcosm (SAM) toxicity tests using the water-soluble fractions from turbine fuels (Jet-A, JP-4, and JP-8). In all three tests, the toxicants depressed the Daphnia populations for several weeks, which resulted in algal blooms in the dosed microcosms due to lower predation rates. These effects were short-lived, and by the second and third months of the experiments, the Daphnia populations appeared to have recovered. However, multivariate analysis of the data revealed dose/response differences that reappeared during the later part of the tests, often due to differences in other consumers (rotifers, ostracods, ciliates) or in algae that are not normally consumed (filamentous green algae and blue-green algae). The findings are consistent with ecological theories that describe communities as the unique products of their etiologies. The implication of this for environmental toxicology is that almost all environmental events leave lasting effects, whether or not those effects have been observed

  6. Optimal Scoring Methods of Hand-Strength Tests in Patients with Stroke

    Science.gov (United States)

    Huang, Sheau-Ling; Hsieh, Ching-Lin; Lin, Jau-Hong; Chen, Hui-Mei

    2011-01-01

    The purpose of this study was to determine the optimal scoring methods for measuring strength of the more-affected hand in patients with stroke by examining the effect of reducing measurement errors. Three hand-strength tests of grip, palmar pinch, and lateral pinch were administered at two sessions in 56 patients with stroke. Five scoring methods…

  7. [Optimization of stir-baking with vinegar technology for Curcumae Radix by orthogonal test].

    Science.gov (United States)

    Shi, Dianhua; Su, Benzheng; Sun, Lili; Zhang, Jun; Qu, Yongsheng

    2011-05-01

    To optimize the stir-baking with vinegar technology for Curcumae Radix, the intrinsic quality (curcumin content) and traditional outward appearance were chosen as indexes. The best technology was determined by an L9(3^4) orthogonal test, investigating the factors of moistening time, stir-baking temperature and stir-baking time. The optimal technology was as follows: the quantity of vinegar was 10%, the moistening time was 10 min, the stir-baking temperature was 130 degrees C and the stir-baking time was 10 min. The optimized stir-baking with vinegar technology for Curcumae Radix is reasonable and can be used to guide the standardized production of Curcumae Radix stir-baked with vinegar.
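    The L9(3^4) analysis above can be sketched numerically: each of the nine runs assigns one of three levels to each of four factors, and the best level per factor is the one with the highest mean response. A minimal sketch with made-up response scores standing in for the curcumin/appearance indexes (not the paper's data):

```python
# Hypothetical sketch of an L9(3^4) orthogonal-test analysis: each row assigns
# one of three levels to each of four factors; the best level per factor is
# the one with the highest mean response. Response values are illustrative.

L9 = [  # columns: factors A, B, C, D; entries are level indices 0..2
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]
response = [62, 70, 65, 74, 68, 71, 66, 75, 69]  # made-up scores per run

def best_levels(array, y, n_factors=4, n_levels=3):
    """Return, for each factor, the level with the highest mean response."""
    best = []
    for f in range(n_factors):
        means = []
        for lvl in range(n_levels):
            vals = [y[i] for i, row in enumerate(array) if row[f] == lvl]
            means.append(sum(vals) / len(vals))
        best.append(max(range(n_levels), key=lambda l: means[l]))
    return best

print(best_levels(L9, response))  # best level index for each of the 4 factors
```

Each column of the L9 array is balanced (every level appears three times), which is what lets nine runs screen four three-level factors.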

  8. Testing the snake-detection hypothesis: larger early posterior negativity in humans to pictures of snakes than to pictures of other reptiles, spiders and slugs.

    Science.gov (United States)

    Van Strien, Jan W; Franken, Ingmar H A; Huijding, Jorg

    2014-01-01

    According to the snake detection hypothesis (Isbell, 2006), fear specifically of snakes may have pushed evolutionary changes in the primate visual system allowing pre-attentional visual detection of fearful stimuli. A previous study demonstrated that snake pictures, when compared to spider or bird pictures, draw more early attention as reflected by larger early posterior negativity (EPN). Here we report two studies that further tested the snake detection hypothesis. In Study 1, we tested whether the enlarged EPN is specific for snakes or also generalizes to other reptiles. Twenty-four healthy, non-phobic women watched the random rapid serial presentation of snake, crocodile, and turtle pictures. The EPN was scored as the mean activity at occipital electrodes (PO3, O1, Oz, PO4, O2) in the 225-300 ms time window after picture onset. The EPN was significantly larger for snake pictures than for pictures of the other reptiles. In Study 2, we tested whether disgust plays a role in the modulation of the EPN and whether preferential processing of snakes can also be found in men. Twelve men and twelve women watched snake, spider, and slug pictures. Both men and women exhibited the largest EPN amplitudes to snake pictures, intermediate amplitudes to spider pictures and the smallest amplitudes to slug pictures. Disgust ratings were not associated with EPN amplitudes. The results replicate previous findings and suggest that ancestral priorities modulate the early capture of visual attention.

  9. Suitable or optimal noise benefits in signal detection

    International Nuclear Information System (INIS)

    Liu, Shujun; Yang, Ting; Tang, Mingchun; Wang, Pin; Zhang, Xinzheng

    2016-01-01

    Highlights: • Six intervals of additive noise divided according to the two constraints. • Derivation of the suitable additive noise to meet the two constraints. • Formulation of the suitable noise for improvability or nonimprovability. • Optimal noises to minimize P_FA, maximize P_D and maximize the overall improvement. - Abstract: We present an effective way to generate the suitable or the optimal additive noises which can achieve the three goals of noise-enhanced detectability, i.e., the maximum detection probability (P_D), the minimum false alarm probability (P_FA) and the maximum overall improvement of P_D and P_FA, without increasing P_FA or decreasing P_D, in a binary hypothesis testing problem. The mechanism of our method is that we divide the discrete vectors into six intervals and choose the useful or partially useful vectors from these intervals to form the additive noise according to different requirements. The form of the optimal noise is derived and proven to be a randomization of no more than two discrete vectors. Moreover, how to choose suitable and optimal noises from the six intervals is given. Finally, numerous examples are presented to illustrate the theoretical analysis, where the background noises are Gaussian, symmetric Gaussian mixture, and asymmetric Gaussian mixture noise, respectively.
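    The core result, that the optimal noise is a randomization of at most two discrete vectors, reduces to linear mixing of the detector's operating points. A minimal sketch with invented (P_FA, P_D) operating points, not values from the paper:

```python
# Illustrative sketch of the "randomization of two discrete vectors" idea:
# each candidate noise vector gives the detector an operating point
# (P_FA, P_D); mixing two candidates with probability lam yields the convex
# combination of their points. We pick lam so the mixture meets the
# false-alarm constraint exactly. All operating points are made up.

def mix_for_constraint(pfa1, pd1, pfa2, pd2, alpha):
    """Mixing weight lam on vector 1 so lam*pfa1 + (1-lam)*pfa2 == alpha,
    plus the detection probability of the resulting mixture."""
    lam = (alpha - pfa2) / (pfa1 - pfa2)
    assert 0.0 <= lam <= 1.0, "alpha must lie between the two P_FA values"
    pd = lam * pd1 + (1 - lam) * pd2
    return lam, pd

# One conservative candidate (low P_FA, low P_D) and one aggressive candidate.
lam, pd = mix_for_constraint(pfa1=0.02, pd1=0.55, pfa2=0.08, pd2=0.90, alpha=0.05)
print(lam, pd)  # lam = 0.5, mixed P_D = 0.725
```

The mixture attains a P_D that neither pure candidate achieves at the constraint P_FA = 0.05, which is why randomization between two vectors can be optimal.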

  10. Test Beam Results of Geometry Optimized Hybrid Pixel Detectors

    CERN Document Server

    Becks, K H; Grah, C; Mättig, P; Rohe, T

    2006-01-01

    The Multi-Chip-Module-Deposited (MCM-D) technique has been used to build hybrid pixel detector assemblies. This paper summarises the results of an analysis of data obtained in a test beam campaign at CERN. Here, single chip hybrids made of ATLAS pixel prototype read-out electronics and special sensor tiles were used. They were prepared by the Fraunhofer Institut fuer Zuverlaessigkeit und Mikrointegration, IZM, Berlin, Germany. The sensors feature an optimized sensor geometry called equal sized bricked. This design enhances the spatial resolution for double hits in the long direction of the sensor cells.

  11. Biomechanical spinal growth modulation and progressive adolescent scoliosis – a test of the 'vicious cycle' pathogenetic hypothesis: Summary of an electronic focus group debate of the IBSE

    Directory of Open Access Journals (Sweden)

    Burwell R Geoffrey

    2006-10-01

    Full Text Available Abstract There is no generally accepted scientific theory for the causes of adolescent idiopathic scoliosis (AIS). As part of its mission to widen understanding of scoliosis etiology, the International Federated Body on Scoliosis Etiology (IBSE) introduced the electronic focus group (EFG) as a means of increasing debate on knowledge of important topics. This has been designated as an on-line Delphi discussion. The text for this debate was written by Dr Ian A Stokes. It evaluates the hypothesis that in progressive scoliosis vertebral body wedging during adolescent growth results from asymmetric muscular loading in a "vicious cycle" (the vicious cycle hypothesis of pathogenesis) affecting vertebral body growth plates (endplate physes). A frontal plane mathematical simulation tested whether the calculated loading asymmetry created by muscles in a scoliotic spine could explain the observed rate of scoliosis increase by measuring the vertebral growth modulation by altered compression. The model deals only with vertebral (not disc) wedging. It assumes that a pre-existing scoliosis curve initiates the mechanically-modulated alteration of vertebral body growth that in turn causes worsening of the scoliosis, while everything else is anatomically and physiologically 'normal'. The results provide quantitative data consistent with the vicious cycle hypothesis. Dr Stokes' biomechanical research engenders controversy. A new speculative concept is proposed of vertebral symphyseal dysplasia, with implications for Dr Stokes' research and the etiology of AIS. What is not controversial is the need to test this hypothesis using additional factors in his current model and in three-dimensional quantitative models that incorporate intervertebral discs and simulate thoracic as well as lumbar scoliosis. The growth modulation process in the vertebral body can be viewed as one type of the biologic phenomenon of mechanotransduction.
In certain connective tissues this involves the

  12. SINGLE VERSUS MULTIPLE TRIAL VECTORS IN CLASSICAL DIFFERENTIAL EVOLUTION FOR OPTIMIZING THE QUANTIZATION TABLE IN JPEG BASELINE ALGORITHM

    Directory of Open Access Journals (Sweden)

    B Vinoth Kumar

    2017-07-01

    Full Text Available The quantization table is responsible for the compression/quality trade-off in the baseline Joint Photographic Experts Group (JPEG) algorithm, and therefore it is viewed as an optimization problem. In the literature, it has been found that Classical Differential Evolution (CDE) is a promising algorithm to generate the optimal quantization table. However, the searching capability of CDE could be limited due to the generation of a single trial vector in an iteration, which in turn reduces the convergence speed. This paper studies the performance of CDE by employing multiple trial vectors in a single iteration. An extensive performance analysis has been made between CDE and CDE with multiple trial vectors in terms of optimization process, accuracy, convergence speed and reliability. The analysis report reveals that CDE with multiple trial vectors improves the convergence speed of CDE, and the same is confirmed using a statistical hypothesis test (t-test).
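    The modification studied here, generating several trial vectors per target and keeping the best, can be sketched on a stand-in objective (a sphere function rather than the actual quantization-table objective). Population size, F, CR and the other settings below are illustrative assumptions, not the paper's configuration:

```python
import random

# Minimal sketch (not the paper's implementation) of classical differential
# evolution extended to generate several trial vectors per target in each
# generation; the target is replaced by the best competing trial. Minimizes
# a sphere function as a stand-in for the quantization-table objective.

def sphere(x):
    return sum(v * v for v in x)

def de_multi_trial(f, dim=5, pop_size=20, n_trials=3, F=0.7, CR=0.9,
                   gens=200, bounds=(-5.0, 5.0), seed=1):
    random.seed(seed)
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            best_trial, best_val = None, fit[i]
            for _ in range(n_trials):  # several trials per target (vs. one in CDE)
                a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                jrand = random.randrange(dim)
                trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                         if (random.random() < CR or k == jrand) else pop[i][k]
                         for k in range(dim)]
                trial = [min(hi, max(lo, v)) for v in trial]
                val = f(trial)
                if val < best_val:
                    best_trial, best_val = trial, val
            if best_trial is not None:
                pop[i], fit[i] = best_trial, best_val
    return min(fit)

print(de_multi_trial(sphere))  # best objective value found
```

Setting `n_trials=1` recovers classical DE, so the same function can be used to compare convergence of the two variants.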

  13. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    Science.gov (United States)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
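    The effect of a passed proof test can be illustrated with a small Monte Carlo sketch: the proof load truncates the strength distribution, so in-service reliability is conditional on surviving the test. All distributions and loads below are hypothetical, not the paper's examples:

```python
import random

# Monte Carlo sketch (hypothetical numbers) of how a proof test truncates the
# strength distribution: only components whose strength exceeds the proof
# load enter service, so in-service reliability is conditional on passing.

def service_reliability(mean_s=100.0, sd_s=10.0, proof_load=95.0,
                        mean_l=80.0, sd_l=8.0, n=200_000, seed=7):
    random.seed(seed)
    passed = failed = 0
    for _ in range(n):
        s = random.gauss(mean_s, sd_s)           # component strength
        if s <= proof_load:
            continue                             # rejected by the proof test
        passed += 1
        if random.gauss(mean_l, sd_l) >= s:      # service demand exceeds strength
            failed += 1
    return passed / n, 1.0 - failed / passed

pass_rate, reliability = service_reliability()
print(pass_rate, reliability)
```

Because weak components are screened out, the conditional in-service reliability is higher than the unconditional reliability of the as-built population, which is the margin the design procedure exploits to reduce weight.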

  14. Testing the accelerating moment release (AMR) hypothesis in areas of high stress

    Science.gov (United States)

    Guilhem, Aurélie; Bürgmann, Roland; Freed, Andrew M.; Ali, Syed Tabrez

    2013-11-01

    Several retrospective analyses have proposed that significant increases in moment release occurred prior to many large earthquakes of recent times. However, the finding of accelerating moment release (AMR) strongly depends on the choice of three parameters: (1) magnitude range, (2) area being considered surrounding the events and (3) the time period prior to the large earthquakes. Consequently, the AMR analysis has been criticized as being an a posteriori data-fitting exercise with no new predictive power. As AMR has been hypothesized to relate to changes in the state of stress around the eventual epicentre, we compare here AMR results to models of stress accumulation in California. Instead of assuming a complete stress drop on all surrounding fault segments implied by a back-slip stress lobe method, we consider that stress evolves dynamically, punctuated by the occurrence of earthquakes, and governed by the elastic and viscous properties of the lithosphere. We study the seismicity of southern California and extract events for AMR calculations following the systematic approach employed in previous studies. We present several sensitivity tests of the method, as well as grid-search analyses over the region between 1955 and 2005 using fixed magnitude range, radius of the search area and period of time. The results are compared to the occurrence of large events and to maps of Coulomb stress changes. The Coulomb stress maps are compiled using the coseismic stress from all M > 7.0 earthquakes since 1812, their subsequent post-seismic relaxation, and the interseismic strain accumulation. We find no convincing correlation of seismicity rate changes in recent decades with areas of high stress that would support the AMR hypothesis. Furthermore, this indicates limited utility for practical earthquake hazard analysis in southern California, and possibly other regions.

  15. Hypothesis testing in the Maimai Catchments, Westland

    International Nuclear Information System (INIS)

    Stewart, M.K.

    1993-01-01

    Seven experiments were carried out on the Maimai Catchments, Westland, to test assumptions about the nature of unsaturated-zone water flows in this humid environment. Hypotheses tested were: 1) that the deuterium (D) contents of base flow water sources in small streams are constant at any given time, 2) that different soil moisture sampling methods give the same D contents, 3) that throughfall has the same D content as rainfall, 4) that saturation overland flow is mainly composed of current event rainfall, 5) that macropores are not connected into pipe networks, 6) that the underlying substrate (Old Man Gravel conglomerate) does not deliver water to the stream during rainfall events, and 7) that different near-stream water sources have the same D contents at a given time. Over 570 samples were collected, of which 300 were analysed for deuterium in 1992-1993. This report gives the background, rationale, methods and brief results of the experiments. The results will be integrated with other measurements and written up in one or more papers for journal publication. (author). 18 refs.; 4 figs.; 1 tab
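    Hypothesis 4 above is the kind of question addressed with standard two-component isotopic hydrograph separation, where a deuterium mass balance gives the fraction of current-event rain in a sample. A minimal sketch with illustrative delta-D values (not Maimai data):

```python
# Standard two-component isotopic hydrograph separation (illustrative values):
# the fraction of "event" (new rain) water in a stream or overland-flow sample
# follows from a deuterium mass balance between the event-water and pre-event
# (soil/ground) water end members.

def event_water_fraction(d_sample, d_pre_event, d_event):
    """Fraction of current-event rainfall in the sample, from delta-D values."""
    return (d_sample - d_pre_event) / (d_event - d_pre_event)

# Hypothetical delta-D (permil): pre-event water -60, rainfall -30, sample -39.
f = event_water_fraction(d_sample=-39.0, d_pre_event=-60.0, d_event=-30.0)
print(f)  # 0.7 -> the sample is ~70% current-event rain
```

The method only works when the two end members have distinct D contents, which is exactly why hypotheses 1-3 and 7 (constancy and equality of end-member D contents) had to be tested first.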

  16. Comparison between Genetic Algorithms and Particle Swarm Optimization Methods on Standard Test Functions and Machine Design

    DEFF Research Database (Denmark)

    Nica, Florin Valentin Traian; Ritchie, Ewen; Leban, Krisztina Monika

    2013-01-01

    Nowadays the requirements imposed by industry and the economy ask for better quality and performance while the price must be maintained in the same range. To achieve this goal, optimization must be introduced in the design process. Two of the best known optimization algorithms for machine design, the genetic algorithm and particle swarm optimization, are shortly presented in this paper. These two algorithms are tested to determine their performance on five different benchmark test functions. The algorithms are tested based on three requirements: precision of the result, number of iterations and calculation time. Both algorithms are also tested on an analytical design process of a Transverse Flux Permanent Magnet Generator to observe their performances in an electrical machine design application.
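    As a sketch of one side of such a comparison, a minimal particle swarm optimizer can be run on a benchmark function and judged on the precision it reaches. The swarm settings and the sphere benchmark below are illustrative assumptions, not the paper's setup:

```python
import random

# Minimal particle swarm sketch (illustrative, not the paper's configuration):
# run PSO on one benchmark test function and report the precision reached,
# one of the three comparison criteria mentioned above.

def sphere(x):
    return sum(v * v for v in x)

def pso(f, dim=5, n_particles=20, iters=300, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=3):
    random.seed(seed)
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for k in range(dim):
                vel[i][k] = (w * vel[i][k]
                             + c1 * random.random() * (pbest[i][k] - pos[i][k])
                             + c2 * random.random() * (gbest[k] - pos[i][k]))
                pos[i][k] = min(hi, max(lo, pos[i][k] + vel[i][k]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest_val

print(pso(sphere))  # precision reached on the sphere benchmark
```

A genetic algorithm would be benchmarked the same way on the same functions, recording precision, iteration count and wall-clock time for each run.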

  17. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method by converting the required reliability level into an allowable cumulative degradation in ADT and comparing the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of the decision variable in reliability demonstration under constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is applied to demonstrate the wear reliability within long service duration of a spherical plain bearing in the end. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs (aimed at more accurate reliability estimation) in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
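    The Gamma-process degradation model can be sketched directly: increments over successive inspection intervals are independent Gamma draws, so every sample path is strictly increasing, and the demonstration compares the accumulated degradation with an allowable cumulative level. All parameters below are hypothetical:

```python
import random

# Sketch of a Gamma-process degradation path (hypothetical parameters):
# increments over equal inspection intervals are independent
# Gamma(shape_rate * dt, scale) draws, so paths are strictly increasing.
# A unit would fail the demonstration if its accumulated degradation
# exceeded the allowable cumulative level derived from required reliability.

def gamma_path(shape_rate=0.5, scale=0.2, dt=1.0, steps=20, seed=11):
    random.seed(seed)
    level, path = 0.0, []
    for _ in range(steps):
        level += random.gammavariate(shape_rate * dt, scale)  # monotone increment
        path.append(level)
    return path

path = gamma_path()
allowable = 3.0  # assumed allowable cumulative degradation at end of test
print(path[-1], path[-1] <= allowable)
```

The strictly increasing paths are what distinguish the Gamma process from, say, a Wiener-process degradation model, and they match monotone mechanisms such as wear or crack growth.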

  18. Evidence for Enhanced Mutualism Hypothesis: Solidago canadensis Plants from Regular Soils Perform Better

    OpenAIRE

    Sun, Zhen-Kai; He, Wei-Ming

    2010-01-01

    The important roles of plant-soil microbe interactions have been documented in exotic plant invasion, but we know very little about how soil mutualists enhance this process (i.e. enhanced mutualism hypothesis). To test this hypothesis we conducted two greenhouse experiments with Solidago canadensis (hereafter Solidago), an invasive forb from North America, and Stipa bungeana (hereafter Stipa), a native Chinese grass. In a germination experiment, we found soil microbes from the rhizospheres of...

  19. A test of the theory of nonrenewable resources. Controlling for exploration and market power

    Energy Technology Data Exchange (ETDEWEB)

    Malischek, Raimund [Koeln Univ. (Germany). Inst. of Energy Economics; Tode, Christian [Koeln Univ. (Germany). Inst. of Energy Economics; Koeln Univ. (Germany). Dept. of Economics

    2015-05-15

    Despite the central role of the Hotelling model within the theory of nonrenewable resources, tests of the model are rarely found. Where such tests exist, they tend to ignore two key features, namely market power and exploration. We therefore suggest an extension of the basic Hotelling framework to incorporate exploration activity and market power and propose an implicit price behavior test of the model to indicate whether firms undergo inter-temporal optimization. When applied to a newly constructed data set for the uranium mining industry, the null hypothesis of the firm optimizing inter-temporally is rejected in all settings. However, parameter estimates of the model still yield valuable information on cost structure, resource scarcity and market power. Our results suggest that the shadow price of the resource in situ is comparably small and may be overshadowed by market power, which may serve as an explanation for the firm failing to optimize inter-temporally.
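    The basic Hotelling benchmark that an implicit price-behavior test builds on says the scarcity rent grows at the rate of interest, so the observed price decomposes into marginal cost plus a compounding rent term. A minimal sketch with illustrative numbers (not estimates from the uranium data):

```python
# The basic Hotelling rule: along an optimal extraction path, the shadow
# price (scarcity rent) of the resource in situ grows at the rate of
# interest, so price_t = marginal_cost + rent_0 * (1 + r)**t.
# All numbers below are illustrative.

def hotelling_price(marginal_cost, rent0, r, t):
    """Competitive price in period t implied by the basic Hotelling rule."""
    return marginal_cost + rent0 * (1.0 + r) ** t

prices = [hotelling_price(marginal_cost=30.0, rent0=2.0, r=0.05, t=t)
          for t in range(4)]
print(prices)  # rent component compounds at 5% per period
```

A small initial rent (`rent0`) relative to marginal cost makes the predicted price path nearly flat, which is one reason a weak shadow price can be "overshadowed" by market-power effects in the data.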

  20. A test of the theory of nonrenewable resources. Controlling for exploration and market power

    International Nuclear Information System (INIS)

    Malischek, Raimund; Tode, Christian; Koeln Univ.

    2015-01-01

    Despite the central role of the Hotelling model within the theory of nonrenewable resources, tests of the model are rarely found. Where such tests exist, they tend to ignore two key features, namely market power and exploration. We therefore suggest an extension of the basic Hotelling framework to incorporate exploration activity and market power and propose an implicit price behavior test of the model to indicate whether firms undergo inter-temporal optimization. When applied to a newly constructed data set for the uranium mining industry, the null hypothesis of the firm optimizing inter-temporally is rejected in all settings. However, parameter estimates of the model still yield valuable information on cost structure, resource scarcity and market power. Our results suggest that the shadow price of the resource in situ is comparably small and may be overshadowed by market power, which may serve as an explanation for the firm failing to optimize inter-temporally.

  1. Testing the Sensory Drive Hypothesis: Geographic variation in echolocation frequencies of Geoffroy's horseshoe bat (Rhinolophidae: Rhinolophus clivosus).

    Science.gov (United States)

    Jacobs, David S; Catto, Sarah; Mutumi, Gregory L; Finger, Nikita; Webala, Paul W

    2017-01-01

    Geographic variation in sensory traits is usually influenced by adaptive processes because these traits are involved in crucial life-history aspects including orientation, communication, lineage recognition and mate choice. Studying this variation can therefore provide insights into lineage diversification. According to the Sensory Drive Hypothesis, lineage diversification may be driven by adaptation of sensory systems to local environments. It predicts that acoustic signals vary in association with local climatic conditions so that atmospheric attenuation is minimized and transmission of the signals maximized. To test this prediction, we investigated the influence of climatic factors (specifically relative humidity and temperature) on geographic variation in the resting frequencies of the echolocation pulses of Geoffroy's horseshoe bat, Rhinolophus clivosus. If the evolution of phenotypic variation in this lineage tracks climate variation, human-induced climate change may lead to decreases in detection volumes and a reduction in foraging efficiency. A complex non-linear interaction between relative humidity and temperature affects atmospheric attenuation of sound, and principal components composed of these correlated variables were therefore used in a linear mixed effects model to assess their contribution to observed variation in resting frequencies. A principal component composed predominantly of mean annual temperature (factor loading of -0.8455) significantly explained a proportion of the variation in resting frequency across sites (P < 0.05). Specifically, at higher relative humidity (around 60%) prevalent across the distribution of R. clivosus, increasing temperature had a strong negative effect on resting frequency. Climatic factors thus strongly influence acoustic signal divergence in this lineage, supporting the prediction of the Sensory Drive Hypothesis. The predicted future increase in temperature due to climate change is likely to decrease the

  2. Testing the Sensory Drive Hypothesis: Geographic variation in echolocation frequencies of Geoffroy's horseshoe bat (Rhinolophidae: Rhinolophus clivosus).

    Directory of Open Access Journals (Sweden)

    David S Jacobs

    Full Text Available Geographic variation in sensory traits is usually influenced by adaptive processes because these traits are involved in crucial life-history aspects including orientation, communication, lineage recognition and mate choice. Studying this variation can therefore provide insights into lineage diversification. According to the Sensory Drive Hypothesis, lineage diversification may be driven by adaptation of sensory systems to local environments. It predicts that acoustic signals vary in association with local climatic conditions so that atmospheric attenuation is minimized and transmission of the signals maximized. To test this prediction, we investigated the influence of climatic factors (specifically relative humidity and temperature) on geographic variation in the resting frequencies of the echolocation pulses of Geoffroy's horseshoe bat, Rhinolophus clivosus. If the evolution of phenotypic variation in this lineage tracks climate variation, human-induced climate change may lead to decreases in detection volumes and a reduction in foraging efficiency. A complex non-linear interaction between relative humidity and temperature affects atmospheric attenuation of sound, and principal components composed of these correlated variables were therefore used in a linear mixed effects model to assess their contribution to observed variation in resting frequencies. A principal component composed predominantly of mean annual temperature (factor loading of -0.8455) significantly explained a proportion of the variation in resting frequency across sites (P < 0.05). Specifically, at higher relative humidity (around 60%) prevalent across the distribution of R. clivosus, increasing temperature had a strong negative effect on resting frequency. Climatic factors thus strongly influence acoustic signal divergence in this lineage, supporting the prediction of the Sensory Drive Hypothesis. The predicted future increase in temperature due to climate change is likely to

  3. Odegaard's selection hypothesis revisited : Schizophrenia in Surinamese immigrants to the Netherlands

    NARCIS (Netherlands)

    Selten, JP; Cantor-Graae, E; Slaets, J; Kahn, RS

    Objective: The incidence of schizophrenia among Surinamese immigrants to the Netherlands is high. The authors tested Odegaard's hypothesis that this phenomenon is explained by selective migration. Method: The authors imagined that migration from Surinam to the Netherlands subsumed the entire

  4. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently attribute newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or incur a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
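    For a two-arm comparison of means, the minimum-variance allocation is the classical Neyman rule: assign patients in proportion to the arms' standard deviations. A minimal sketch of that rule only (the paper's algorithms also compute critical values and power, which are omitted here):

```python
# Sketch of the minimum-variance (Neyman) allocation idea behind an adaptive
# randomization rate: for a two-arm comparison of means, the variance of the
# test statistic, sigma1**2/n1 + sigma2**2/n2, is minimized when patients are
# allocated in proportion to the arms' standard deviations. In practice the
# sigmas would be posterior estimates updated as patients accrue.

def optimal_rate_arm1(sigma1, sigma2):
    """Probability of assigning the next patient to arm 1."""
    return sigma1 / (sigma1 + sigma2)

rate = optimal_rate_arm1(sigma1=3.0, sigma2=1.0)
print(rate)  # 0.75: the noisier arm receives more patients
```

Equal variances recover a 50:50 randomization, so the adaptive rate only departs from balance when the arms differ in outcome variability.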

  5. On the hypothesis-free testing of metabolite ratios in genome-wide and metabolome-wide association studies

    Directory of Open Access Journals (Sweden)

    Petersen Ann-Kristin

    2012-06-01

    Full Text Available Abstract Background Genome-wide association studies (GWAS) with metabolic traits and metabolome-wide association studies (MWAS) with traits of biomedical relevance are powerful tools to identify the contribution of genetic, environmental and lifestyle factors to the etiology of complex diseases. Hypothesis-free testing of ratios between all possible metabolite pairs in GWAS and MWAS has proven to be an innovative approach in the discovery of new biologically meaningful associations. The p-gain statistic was introduced as an ad-hoc measure to determine whether a ratio between two metabolite concentrations carries more information than the two corresponding metabolite concentrations alone. So far, only a rule of thumb was applied to determine the significance of the p-gain. Results Here we explore the statistical properties of the p-gain through simulation of its density and by sampling of experimental data. We derive critical values of the p-gain for different levels of correlation between metabolite pairs and show that B/(2*α) is a conservative critical value for the p-gain, where α is the level of significance and B the number of tested metabolite pairs. Conclusions We show that the p-gain is a well-defined measure that can be used to identify statistically significant metabolite ratios in association studies and provide a conservative significance cut-off for the p-gain for use in future association studies with metabolic traits.
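    The p-gain decision rule described above is simple to state in code: compare the smaller single-metabolite p-value against the ratio's p-value, then test the gain against the conservative cut-off B/(2*α). A minimal sketch with invented p-values:

```python
# Sketch of the p-gain decision rule: the p-gain is the ratio of the smaller
# of the two single-metabolite association p-values to the p-value of the
# metabolite ratio, compared against the conservative critical value
# B/(2*alpha) derived above. All p-values here are invented for illustration.

def p_gain(p_m1, p_m2, p_ratio):
    return min(p_m1, p_m2) / p_ratio

def is_significant(gain, n_pairs, alpha=0.05):
    """Conservative cut-off: the gain must exceed B/(2*alpha)."""
    return gain > n_pairs / (2 * alpha)

g = p_gain(p_m1=1e-4, p_m2=5e-3, p_ratio=1e-10)  # ratio far stronger than either metabolite
print(g, is_significant(g, n_pairs=1000))
```

With 1000 tested pairs and α = 0.05, the cut-off is 1000/0.1 = 10000, so a gain of 10^6 comfortably clears it: the ratio genuinely adds information beyond the two concentrations.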

  6. Testing Happiness Hypothesis among the Elderly

    Directory of Open Access Journals (Sweden)

    Rossi Máximo

    2008-08-01

    Full Text Available We use a rich data set that allows us to test different happiness hypotheses employing four methodological approaches. We find that older people in Uruguay have a tendency to report themselves happy when they are married, when they have higher standards of health and when they earn higher levels of income or they consider that their income is suitable for their standard of living. On the contrary, they report lower levels of happiness when they live alone and when their nutrition is insufficient. We also find that education has no clear impact on happiness. We think that our study is a contribution to the study of those factors that can explain happiness among the elderly in Latin American countries. Future work will focus on enhanced empirical analysis and in extending our study to other countries.

  7. Optimized Method for Knee Displacement Measurement in Vehicle Sled Crash Test

    Directory of Open Access Journals (Sweden)

    Sun Hang

    2017-01-01

    Full Text Available This paper provides an optimized method for measuring a dummy's knee displacement in vehicle sled crash tests. The proposed method utilizes completely new elements for measurement, namely the acceleration and angular velocity of the dummy's pelvis, as well as the rotational angle of its femur. Compared with the traditional measurement using only camera-based high-speed motion image analysis, the optimized one can not only maintain the measuring accuracy, but also avoid the disturbance caused by dummy movement, dashboard blocking and knee deformation during the crash. An experiment was conducted to verify the accuracy of the proposed method, which eliminates the strong dependence on single-target tracing in the traditional method. Moreover, it is well suited to calculating the penetration depth into the dashboard.

  8. The need to optimize inservice testing and inspection to enhance safety

    International Nuclear Information System (INIS)

    Perry, J.A.

    1996-01-01

    Welcome to the Fourth U.S. Nuclear Regulatory Commission and American Society of Mechanical Engineers (USNRC/ASME) Symposium on Valve and Pump Testing in Nuclear Power Plants. This symposium provides a forum to exchange information on technical and regulatory issues associated with the testing of valves and pumps used in nuclear power plants. Progress made since the last symposium will be discussed along with various methods for in service testing of valves and pumps. Active participation by industry representatives, regulators and consultants will entail discussion of a broad array of ideas and points of view regarding how to improve the in service testing of valves and pumps at nuclear power plants. One of the challenges faced is the need to optimize the in service testing and inspection to enhance safety, operability and reliability. The author addresses this challenge from an ASME Nuclear Codes and Standards point of view

  9. The need to optimize inservice testing and inspection to enhance safety

    Energy Technology Data Exchange (ETDEWEB)

    Perry, J.A.

    1996-12-01

    Welcome to the Fourth U.S. Nuclear Regulatory Commission and American Society of Mechanical Engineers (USNRC/ASME) Symposium on Valve and Pump Testing in Nuclear Power Plants. This symposium provides a forum to exchange information on technical and regulatory issues associated with the testing of valves and pumps used in nuclear power plants. Progress made since the last symposium will be discussed along with various methods for in service testing of valves and pumps. Active participation by industry representatives, regulators and consultants will entail discussion of a broad array of ideas and points of view regarding how to improve the in service testing of valves and pumps at nuclear power plants. One of the challenges faced is the need to optimize the in service testing and inspection to enhance safety, operability and reliability. The author addresses this challenge from an ASME Nuclear Codes and Standards point of view.

  10. Effects of musicality and motivational orientation on auditory category learning: a test of a regulatory-fit hypothesis.

    Science.gov (United States)

    McAuley, J Devin; Henry, Molly J; Wedd, Alan; Pleskac, Timothy J; Cesario, Joseph

    2012-02-01

    Two experiments investigated the effects of musicality and motivational orientation on auditory category learning. In both experiments, participants learned to classify tone stimuli that varied in frequency and duration according to an initially unknown disjunctive rule; feedback involved gaining points for correct responses (a gains reward structure) or losing points for incorrect responses (a losses reward structure). For Experiment 1, participants were told at the start that musicians typically outperform nonmusicians on the task, and then they were asked to identify themselves as either a "musician" or a "nonmusician." For Experiment 2, participants were given either a promotion focus prime (a performance-based opportunity to gain entry into a raffle) or a prevention focus prime (a performance-based criterion that needed to be maintained to avoid losing an entry into a raffle) at the start of the experiment. Consistent with a regulatory-fit hypothesis, self-identified musicians and promotion-primed participants given a gains reward structure made more correct tone classifications and were more likely to discover the optimal disjunctive rule than were musicians and promotion-primed participants experiencing losses. Reward structure (gains vs. losses) had inconsistent effects on the performance of nonmusicians, and a weaker regulatory-fit effect was found for the prevention focus prime. Overall, the findings from this study demonstrate a regulatory-fit effect in the domain of auditory category learning and show that motivational orientation may contribute to musician performance advantages in auditory perception.

  11. Optimal design and dynamic impact tests of removable bollards

    Science.gov (United States)

    Chen, Suwen; Liu, Tianyi; Li, Guoqiang; Liu, Qing; Sun, Jianyun

    2017-10-01

    Anti-ram bollard systems, which are installed around buildings and infrastructure, can prevent unauthorized vehicles from entering, maintain distance from vehicle-borne improvised explosive devices (VBIED) and reduce the corresponding damage. Compared with a fixed bollard system, a removable bollard system provides more flexibility as it can be removed when needed. This paper first proposes a new type of K4-rated removable anti-ram bollard system. To simulate the collision of a vehicle hitting the bollard system, a finite element model was then built and verified through comparison of numerical simulation results and existing experimental results. Based on the orthogonal design method, the factors influencing the safety and economy of this proposed system were examined and sorted according to their importance. An optimal design scheme was then produced. Finally, to validate the effectiveness of the proposed design scheme, four dynamic impact tests, including two front impact tests and two side impact tests, have been conducted according to BSI Specifications. The residual rotation angles of the specimen are smaller than 30° and satisfy the requirements of the BSI Specification.

  12. Method to determine the optimal constitutive model from spherical indentation tests

    Science.gov (United States)

    Zhang, Tairui; Wang, Shang; Wang, Weiqiang

    2018-03-01

    The limitation of current indentation theories was investigated and a method to determine the optimal constitutive model through spherical indentation tests was proposed. Two constitutive models, the Power-law and the Linear-law, were used in Finite Element (FE) calculations, and then a set of indentation governing equations was established for each model. The load-depth data from the normal indentation depth was used to fit the best parameters in each constitutive model, while the data from the further loading part was compared with those from FE calculations, and the model that better predicted the further deformation was considered the optimal one. Moreover, a Young's modulus calculation model which took the previous plastic deformation and the phenomenon of pile-up (or sink-in) into consideration was also proposed to revise the original Sneddon-Pharr-Oliver model. The indentation results on six materials, 304, 321, SA508, SA533, 15CrMoR, and Fv520B, were compared with tensile ones, which validated the reliability of the revised E calculation model and the optimal constitutive model determination method in this study.
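
    The model-selection step described here can be sketched as follows: fit each candidate constitutive law to the load-depth data from the normal indentation depth, then score each fit on the further-loading data and keep the law with the smaller prediction error. The load-depth relations below are assumed surrogates (the paper's actual governing equations come from FE calculations), and the data are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical stand-ins for the paper's indentation governing equations:
# each "model" is just a load-depth relation P(h) fitted to the data.
def power_law(h, C, m):      # Power-law surrogate
    return C * h**m

def linear_law(h, a, b):     # Linear-law surrogate
    return a * h + b

# Synthetic load-depth data (depth in um, load in mN)
h = np.linspace(0.1, 2.0, 40)
P = 5.0 * h**1.5 + np.random.default_rng(0).normal(0, 0.05, h.size)

# Fit each model on the "normal indentation depth" (first 30 points),
# then score it on the "further loading" part (last 10 points).
h_fit, P_fit = h[:30], P[:30]
h_val, P_val = h[30:], P[30:]

scores = {}
for name, model in [("power", power_law), ("linear", linear_law)]:
    popt, _ = curve_fit(model, h_fit, P_fit, maxfev=10000)
    rmse = np.sqrt(np.mean((model(h_val, *popt) - P_val)**2))
    scores[name] = rmse

best = min(scores, key=scores.get)   # the "optimal constitutive model"
print(best, scores)
```

Here the power-law surrogate wins because the synthetic data were generated from a convex load-depth curve; with real indentation data, governing equations calibrated against FE results would replace these surrogates.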

  13. Parallel island genetic algorithm applied to a nuclear power plant auxiliary feedwater system surveillance tests policy optimization

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.

    2003-01-01

    In this work, we focus on the application of an Island Genetic Algorithm (IGA), a coarse-grained parallel genetic algorithm (PGA) model, to a Nuclear Power Plant (NPP) Auxiliary Feedwater System (AFWS) surveillance test policy optimization. Here, the main objective is to outline, by means of comparisons, the advantages of the IGA over the simple (non-parallel) genetic algorithm (GA), which has been successfully applied to this kind of problem. The goal of the optimization is to maximize the system's average availability for a given period of time, considering realistic features such as: i) aging effects on standby components during the tests; ii) failures revealed by the tests imply corrective maintenance, increasing outage times; iii) components have distinct test parameters (outage time, aging factors, etc.); and iv) tests are not necessarily periodic. In our experiments, which were run on a cluster of eight 1-GHz personal computers, we could clearly observe gains not only in the computational time, which decreased linearly with the number of computers, but also in the optimization outcome.
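
    The island model this record refers to can be illustrated with a toy coarse-grained parallel GA: several subpopulations evolve independently and periodically exchange their best individuals over a ring topology. This is a minimal single-process sketch with an assumed surrogate fitness (bit count), not the authors' availability model or cluster implementation:

```python
import random

# Toy island-model GA: each island evolves its own population and
# periodically sends its champion to the next island (ring topology).
random.seed(1)

N_ISLANDS, POP, GENS, MIGRATE_EVERY = 4, 20, 60, 10
BITS = 16

def fitness(ind):                 # surrogate objective: count of 1-bits
    return sum(ind)

def tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, BITS)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.02):
    return [1 - g if random.random() < rate else g for g in ind]

islands = [[[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
           for _ in range(N_ISLANDS)]

for gen in range(GENS):
    for i, pop in enumerate(islands):
        islands[i] = [mutate(crossover(tournament(pop), tournament(pop)))
                      for _ in range(POP)]
    if gen % MIGRATE_EVERY == 0:          # ring migration of champions
        champs = [max(pop, key=fitness) for pop in islands]
        for i in range(N_ISLANDS):
            islands[i][0] = champs[(i - 1) % N_ISLANDS]

best = max((ind for pop in islands for ind in pop), key=fitness)
print(fitness(best))
```

In the actual application each island would run on its own machine and the chromosome would encode a test schedule, with fitness given by the system's average availability.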

  14. Contextual effects on the perceived health benefits of exercise: the exercise rank hypothesis.

    Science.gov (United States)

    Maltby, John; Wood, Alex M; Vlaev, Ivo; Taylor, Michael J; Brown, Gordon D A

    2012-12-01

    Many accounts of social influences on exercise participation describe how people compare their behaviors to those of others. We develop and test a novel hypothesis, the exercise rank hypothesis, of how this comparison can occur. The exercise rank hypothesis, derived from evolutionary theory and the decision by sampling model of judgment, suggests that individuals' perceptions of the health benefits of exercise are influenced by how they believe their amount of exercise ranks in comparison with other people's amounts. Study 1 demonstrated that individuals' perceptions of the health benefits of their own current exercise amounts were as predicted by the exercise rank hypothesis. Study 2 demonstrated that the perceived health benefits of an amount of exercise can be manipulated by experimentally changing the ranked position of that amount within a comparison context. The discussion focuses on how social norm-based interventions could benefit from using rank information.
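
    The rank comparison at the heart of the hypothesis is simple to state: under decision by sampling, the judged benefit of an exercise amount tracks its relative ranked position within a comparison sample. A minimal sketch, with an assumed comparison sample of weekly exercise minutes:

```python
# Relative ranked position of an amount within a comparison sample,
# as used in decision-by-sampling accounts: the fraction of sampled
# values that the target amount exceeds.
comparison = [0, 30, 60, 90, 120, 150, 180, 240, 300]  # assumed minutes/week

def relative_rank(x, sample):
    below = sum(1 for s in sample if s < x)
    return below / (len(sample) - 1)

print(relative_rank(120, comparison))  # mid-ranked amount -> 0.5
```

Experimentally changing the comparison sample (as in Study 2) changes this relative rank, and with it the predicted perceived benefit, without changing the amount itself.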

  15. Emergency Diesel Generation System Surveillance Test Policy Optimization Through Genetic Algorithms Using Non-Periodic Intervention Frequencies and Seasonal Constraints

    International Nuclear Information System (INIS)

    Lapa, Celso M.F.; Pereira, Claudio M.N.A.; Frutuoso e Melo, P.F.

    2002-01-01

    Nuclear standby safety systems must frequently be submitted to periodic surveillance tests, mainly to detect, as soon as possible, the occurrence of unrevealed failure states. Such interventions may, however, affect the overall system availability due to component outages. Besides, as the components are demanded, deterioration by aging may occur, again penalizing system performance. For these reasons, planning a good surveillance test policy implies a trade-off between the gains and overheads of the test interventions. In order to maximize the system's average availability during a given period of time, a non-periodic surveillance test optimization methodology based on genetic algorithms (GA) has recently been developed. Allowing non-periodic tests makes the solution space much more flexible, and schedules can be better adjusted, providing gains in the overall system average availability when compared to those obtained by an optimized periodic test scheme. The optimization problem becomes, however, more complex; hence, the use of a powerful optimization technique, such as GAs, is required. Particular features of certain systems can make it advisable to introduce other specific constraints in the optimization problem. The Emergency Diesel Generation System (EDGS) of a Nuclear Power Plant (NPP) is a good example for demonstrating the introduction of seasonal constraints: this system is responsible for power supply during an external blackout, so during periods of high blackout probability it is desirable to keep the system availability as high as possible. Previous applications have demonstrated the robustness and effectiveness of the methodology, but no seasonal constraints have ever been imposed. This work aims at investigating the application of such methodology in the Angra-II Brazilian NPP EDGS surveillance test policy optimization, considering the blackout probability
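
    The objective being maximized in this kind of study can be sketched numerically: between tests, unavailability from unrevealed failures grows with the time since the last test, while each test itself contributes outage time. A minimal sketch with an assumed exponential failure model and illustrative parameter values (the GA would then search over the test times themselves):

```python
import math

# Average availability of a standby component over a horizon under a
# non-periodic test schedule. Assumed simple model: survival probability
# exp(-lam * t_since_last_test) between tests, full unavailability for
# 'outage' hours during each test. All parameter values are illustrative.
lam = 1e-4        # assumed failure rate (per hour)
outage = 12.0     # assumed outage per test (hours)
horizon = 8760.0  # one year

def average_availability(test_times):
    """Numerically average A(t) over the horizon for a given schedule."""
    dt, total = 1.0, 0.0
    tests = sorted(test_times)
    n = int(horizon / dt)
    for k in range(n):
        t = k * dt
        if any(ts <= t < ts + outage for ts in tests):
            continue                      # inside a test window: unavailable
        last = max([ts + outage for ts in tests if ts + outage <= t],
                   default=0.0)
        total += math.exp(-lam * (t - last))  # P(no unrevealed failure)
    return total / n

periodic = [i * 2190.0 for i in range(1, 4)]   # evenly spaced tests
front_loaded = [1000.0, 2500.0, 5000.0]        # a non-periodic variant
print(round(average_availability(periodic), 4),
      round(average_availability(front_loaded), 4))
```

A seasonal constraint of the kind discussed here could be added by penalizing schedules whose availability dips below a threshold during high-blackout-probability months.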

  16. MEMS resonant load cells for micro-mechanical test frames: feasibility study and optimal design

    Science.gov (United States)

    Torrents, A.; Azgin, K.; Godfrey, S. W.; Topalli, E. S.; Akin, T.; Valdevit, L.

    2010-12-01

    This paper presents the design, optimization and manufacturing of a novel micro-fabricated load cell based on a double-ended tuning fork. The device geometry and operating voltages are optimized for maximum force resolution and range, subject to a number of manufacturing and electromechanical constraints. All optimizations are enabled by analytical modeling (verified by selected finite element analyses) coupled with an efficient C++ code based on the particle swarm optimization algorithm. This assessment indicates that force resolutions of ~0.5-10 nN are feasible in vacuum (~1-50 mTorr), with force ranges as large as 1 N. Importantly, the optimal design for vacuum operation is independent of the desired range, ensuring versatility. Experimental verifications on a sub-optimal device fabricated using silicon-on-glass technology demonstrate a resolution of ~23 nN at a vacuum level of ~50 mTorr. The device demonstrated in this article will be integrated into a hybrid micro-mechanical test frame for unprecedented combinations of force resolution and range, displacement resolution and range, optical (or SEM) access to the sample, versatility and cost.
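
    The optimizer described here is based on particle swarm optimization (PSO). A minimal PSO sketch on a toy sphere objective is shown below; the authors' actual objective (maximizing force resolution subject to electromechanical constraints) and their C++ implementation are not reproduced:

```python
import random

# Minimal particle swarm optimization: particles track their personal
# best and are attracted to it and to the global best. The objective
# here is a toy sphere function, not the load-cell resolution model.
random.seed(42)

DIM, N, ITERS = 2, 30, 200
W, C1, C2 = 0.7, 1.5, 1.5        # inertia and acceleration constants

def objective(x):                 # toy stand-in for the design objective
    return sum(xi * xi for xi in x)

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=objective)[:]

for _ in range(ITERS):
    for i in range(N):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
            if objective(pbest[i]) < objective(gbest):
                gbest = pbest[i][:]

print(objective(gbest))   # near-zero after convergence
```

In a design setting like this one, each particle coordinate would be a geometric or voltage parameter, with manufacturing constraints enforced by bounding or penalizing the positions.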

  17. Summary on Bayes estimation and hypothesis testing

    Directory of Open Access Journals (Sweden)

    D. J. de Waal

    1988-03-01

    Full Text Available Although Bayes' theorem was published in 1764, it is only recently that Bayesian procedures have been used in practice in statistical analyses. Many developments have taken place, and are still taking place, in the areas of decision theory and group decision making. Two aspects, namely estimation and tests of hypotheses, will be looked into. These form the area of statistical inference with which Mathematical Statistics is mainly concerned.
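
    A standard device in Bayesian hypothesis testing of the kind such a summary covers is the Bayes factor: the ratio of the marginal likelihoods of the data under two hypotheses. A small worked example, testing a point null for a binomial proportion against a uniform-prior alternative (illustrative numbers only):

```python
from math import comb
from fractions import Fraction

# Bayes factor for H0: theta = 1/2 versus H1: theta ~ Uniform(0, 1),
# given k successes in n binomial trials.
n, k = 20, 15

# Marginal likelihood under the point null H0
m0 = Fraction(comb(n, k), 2**n)

# Marginal likelihood under H1: integrating C(n,k) theta^k (1-theta)^(n-k)
# over theta in [0,1] gives C(n,k) * B(k+1, n-k+1) = 1 / (n + 1)
m1 = Fraction(1, n + 1)

bf01 = m0 / m1          # Bayes factor in favour of H0
print(float(bf01))
```

A Bayes factor of about 0.31 here means the data favour the uniform-prior alternative over the point null by roughly 3 to 1.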

  18. Microbial decomposition of keratin in nature—a new hypothesis of industrial relevance

    DEFF Research Database (Denmark)

    Lange, Lene; Huang, Yuhong; Kamp Busk, Peter

    2016-01-01

    with the keratinases to loosen the molecular structure, thus giving the enzymes access to their substrate, the protein structure. With such complexity, it is relevant to compare microbial keratin decomposition with the microbial decomposition of well-studied polymers such as cellulose and chitin. Interestingly...... enzymatic and boosting factors needed for keratin breakdown have been used to formulate a hypothesis for mode of action of the LPMOs in keratin decomposition and for a model for degradation of keratin in nature. Testing such hypotheses and models still needs to be done. Even now, the hypothesis can serve...

  19. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  20. Does cooperation increase helpers' later success as breeders? A test of the skills hypothesis in the cooperatively displaying lance-tailed manakin.

    Science.gov (United States)

    DuVal, Emily H

    2013-07-01

    Experience improves individual performance in many tasks. Pre-breeding cooperation may provide important experience that improves later success as a breeder, offering one compelling explanation for why some individuals delay reproduction to help others breed (the 'skills hypothesis'). However, confounding effects of age, quality and alternative selective benefits have complicated rigorous tests of this hypothesis. Male lance-tailed manakins perform cooperative courtship displays involving partnerships between unrelated alpha and beta males, and alphas monopolize resulting copulations. Beta males therefore do not receive immediate direct or indirect fitness benefits, but may gain skills during cooperation that increase their later success as an alpha. To date, however, the effect of cooperative experience on later success as a breeder has never been tested in any cooperatively displaying taxon. The effects of prior cooperative experience on reproductive success of alpha lance-tailed manakins were analysed in a mixed model framework using 12 years of information on cooperative experience and annual and lifetime genetic reproductive success for 57 alpha males. Models included previously identified effects of age and alpha tenure. Individual-level random effects controlled for quality differences to test for an independent influence of beta experience on success. Males accumulated up to 5 years of beta experience before becoming alphas, but 42.1% of alphas had no prior beta experience. Betas became alphas later in life, and experienced significantly lower reproductive success in their final year as alpha than males that were never beta, but did not have higher lifetime success or longer alpha tenures. Differences in patterns of annual siring success were best explained by age-dependent patterns of reproductive improvement and senescence among alphas, not beta experience. 
Cooperative experience does not increase relative breeding success for male lance-tailed manakins