WorldWideScience

Sample records for sequential hypothesis testing

  1. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditionally independent private noisy measurements that are also assumed to be independent and identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.

  2. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditionally independent private noisy measurements that are also assumed to be independent and identically distributed over time, and (b) the information states for the case of correlated private noisy measurements.

  3. A sequential hypothesis test based on a generalized Azuma inequality

    NARCIS (Netherlands)

    Reijsbergen, D.P.; Scheinhardt, Willem R.W.; de Boer, Pieter-Tjerk

    We present a new power-one sequential hypothesis test based on a bound for the probability that a bounded zero-mean martingale ever crosses a curve of the form $a(n+k)^b$. The proof of the bound is of independent interest.
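
    A minimal sketch of a curve-crossing power-one test in this spirit, assuming bounded observations and illustrative constants a, k, and b that are not taken from the paper: the test rejects the zero-mean null as soon as the running sum crosses the curve a(n+k)^b, and under the null the crossing probability is controlled by the martingale bound.

    ```python
    import random

    def power_one_test(draw, a=3.0, k=10.0, b=0.6, max_n=200_000):
        """Reject H0: E[X] = 0 once the partial sum S_n crosses a*(n+k)**b.

        Power-one scheme: under a positive drift, the linear growth of S_n
        eventually beats the sublinear boundary (b < 1); under H0, the
        crossing probability is bounded by the martingale inequality.
        """
        s = 0.0
        for n in range(1, max_n + 1):
            s += draw()                    # one new bounded observation
            if s >= a * (n + k) ** b:
                return "reject H0", n
        return "no decision yet", max_n

    random.seed(1)
    # Observations with a small positive drift: the boundary is eventually crossed.
    print(power_one_test(lambda: random.uniform(-1, 1) + 0.05))
    ```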

  4. Robust real-time pattern matching using Bayesian sequential hypothesis testing.

    Science.gov (United States)

    Pele, Ofir; Werman, Michael

    2008-08-01

    This paper describes a method for robust real-time pattern matching. We first introduce a family of image distance measures, the "Image Hamming Distance Family". Members of this family are robust to occlusion, small geometrical transforms, light changes and non-rigid deformations. We then present a novel Bayesian framework for sequential hypothesis testing on finite populations. Based on this framework, we design an optimal rejection/acceptance sampling algorithm. This algorithm quickly determines whether two images are similar with respect to a member of the Image Hamming Distance Family. We also present a fast framework that designs a near-optimal sampling algorithm. Extensive experimental results show that the sequential sampling algorithm performance is excellent. Implemented on a 3 GHz Pentium 4 processor, detection of a pattern with 2197 pixels in 640 x 480 pixel frames, where in each frame the pattern rotated and was highly occluded, takes only 0.022 seconds per frame.
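
    A simplified sketch of sequential acceptance/rejection sampling in this spirit, not the authors' optimal algorithm: pixels are sampled in random order, a Beta-Binomial posterior (approximating the finite-population model) tracks the mismatch rate, and sampling stops once the posterior is decisive. The threshold tau and confidence conf are illustrative placeholders.

    ```python
    import numpy as np
    from scipy.stats import beta

    def similar(img_a, img_b, tau=0.1, conf=0.99):
        """Sequentially sample pixels and decide whether the per-pixel
        mismatch rate is below tau, stopping as soon as the Beta posterior
        puts mass >= conf on either side of the threshold."""
        idx = np.random.permutation(img_a.size)
        a, b = 1.0, 1.0                    # uniform Beta(1, 1) prior
        for i in idx:
            mismatch = img_a.flat[i] != img_b.flat[i]
            a, b = a + mismatch, b + (not mismatch)
            p = beta.cdf(tau, a, b)        # posterior mass below the threshold
            if p >= conf:
                return True                # accept "similar"
            if p <= 1 - conf:
                return False               # reject early
        return beta.cdf(tau, a, b) >= 0.5  # fallback after exhausting pixels

    x = np.random.randint(0, 2, (48, 48))
    y = x.copy(); y[:2] ^= 1               # flip two rows: small mismatch rate
    print(similar(x, y))
    ```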

  5. Hypothesis Designs for Three-Hypothesis Test Problems

    OpenAIRE

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  6. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics]

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary-crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. for integrating these individual decisions. The procedure is recommended when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to reach a final conclusion with a high confidence level. (Author).
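
    A minimal sketch of the belief-updating idea, assuming each SPRT round delivers a decision with known error probabilities alpha (false alarm) and beta (missed detection); the prior and the decision sequence are illustrative, not from the paper.

    ```python
    def update_belief(prior_h1, decisions, alpha=0.05, beta=0.05):
        """Treat each SPRT decision as evidence and update P(H1) by Bayes' rule.

        A decision 'H1' has likelihood (1 - beta) under H1 and alpha under H0;
        a decision 'H0' has likelihood beta under H1 and (1 - alpha) under H0.
        """
        p = prior_h1
        for d in decisions:
            like_h1 = (1 - beta) if d == "H1" else beta
            like_h0 = alpha if d == "H1" else (1 - alpha)
            p = p * like_h1 / (p * like_h1 + (1 - p) * like_h0)
        return p

    # Three successive SPRT rounds, two of which flagged H1:
    print(update_belief(0.5, ["H1", "H0", "H1"]))   # 0.95
    ```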

  7. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    maintains the flexibility of deciding sooner than the fixed-sample-size procedure at the price of somewhat lower power [13, 514]. The sequential probability... markets, detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the... skimming cruise missile can yield a significant increase in the probability of raid annihilation. Furthermore, usually detection systems are

  8. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are used to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  10. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
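
    A generic Wald SPRT sketch illustrating how the targeted false-alarm rate alpha and missed-detection rate beta set the decision thresholds A = (1 - beta)/alpha and B = beta/(1 - alpha); the Gaussian likelihoods here are illustrative stand-ins, not the paper's collision-probability densities.

    ```python
    from math import log
    from scipy.stats import norm

    def wald_sprt(stream, pdf0, pdf1, alpha=0.01, beta=0.01):
        """Wald's SPRT: accumulate the log-likelihood ratio until it leaves
        the interval (log B, log A), with A = (1-beta)/alpha, B = beta/(1-alpha)."""
        upper, lower = log((1 - beta) / alpha), log(beta / (1 - alpha))
        llr = 0.0
        for n, x in enumerate(stream, 1):
            llr += log(pdf1(x)) - log(pdf0(x))
            if llr >= upper:
                return "accept H1", n
            if llr <= lower:
                return "accept H0", n
        return "undecided", n

    # Illustration: H0: N(0,1) vs H1: N(1,1), with data actually drawn from H1.
    data = norm.rvs(loc=1.0, scale=1.0, size=1000, random_state=0)
    print(wald_sprt(data, norm(0, 1).pdf, norm(1, 1).pdf, alpha=0.001, beta=0.001))
    ```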

  11. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    Science.gov (United States)

    2015-10-18

    model-based evidence. This work resolves cross-tag using three methods (Z-test for dependent data, classical sequential analysis and Brownian motion... Slider Movement: the two-facet model is used as the Inversion Model. It represents a three-axis stabilized satellite as two facets, namely a body... the sequential analysis. If ... is independent and has an approximately normal distribution, then Brownian motion drift analysis is used. If ... is

  12. [Dilemma of the null hypothesis in experimental tests of ecological hypotheses].

    Science.gov (United States)

    Li, Ji

    2016-06-01

    Experimental testing is one of the major methods for testing ecological hypotheses, though there are many arguments about the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis deduction model of Platt (1964) and concluded that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1' (α'=1, β'=0) in ecological processes are different from those in classical physics, the ecological null hypothesis cannot be strictly tested experimentally either. These dilemmas of the null hypothesis could be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality for an ecological hypothesis. Hence, findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.

  13. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    Science.gov (United States)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions about the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.

  14. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  15. A one-sided sequential test

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A.; Lux, I. [Hungarian Academy of Sciences, Budapest (Hungary). Atomic Energy Research Inst.

    1996-04-16

    The applicability of the classical sequential probability ratio testing (SPRT) for early failure detection problems is limited by the fact that there is an extra time delay between the occurrence of the failure and its first recognition. Chien and Adams developed a method to minimize this time for the case when the problem can be formulated as testing the mean value of a Gaussian signal. In our paper we propose a procedure that can be applied for both mean and variance testing and that minimizes the time delay. The method is based on a special parametrization of the classical SPRT. The one-sided sequential tests (OSST) can reproduce the results of the Chien-Adams test when applied for mean values. (author).

  16. Applying the minimax principle to sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    2002-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master, a nonmaster, or to continue sampling and administering another random item. The framework of minimax sequential decision theory (minimum

  17. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. A comparison is then made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases in which the random variate follows a normal law as well as a Bernoulli law.

  18. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles.
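
    An illustrative sketch of a Poisson probability-ratio monitor of the kind described here, with made-up rates and error tolerances: for each counting interval of length dt with k counts, the log-likelihood ratio of rate lam1 versus lam0 increments by k*log(lam1/lam0) - (lam1 - lam0)*dt.

    ```python
    from math import log

    def poisson_sprt(counts, dt, lam0, lam1, alpha=1e-4, beta=0.01):
        """Sequential probability-ratio monitor for a count rate: alarm when
        the cumulative Poisson log-likelihood ratio crosses the upper bound."""
        upper = log((1 - beta) / alpha)    # alarm threshold
        lower = log(beta / (1 - alpha))    # "background only" threshold
        llr = 0.0
        for i, k in enumerate(counts, 1):
            llr += k * log(lam1 / lam0) - (lam1 - lam0) * dt
            if llr >= upper:
                return "alarm", i
            if llr <= lower:
                return "background", i
        return "undecided", len(counts)

    # 0.1-s intervals: background of 50 counts/s vs a 100 counts/s source.
    print(poisson_sprt([6, 5, 6, 9, 12, 11, 13, 10, 14, 12], 0.1, 50, 100))
    ```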

  19. Is it better to select or to receive? Learning via active and passive hypothesis testing.

    Science.gov (United States)

    Markant, Douglas B; Gureckis, Todd M

    2014-02-01

    People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.

  20. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    Science.gov (United States)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated update of baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground or space-based sensor as part of its metrics mission. The change in the satellite features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterization of fine features of the satellite. The tasking will be designed to maximize new information with the least number of photometry data points to be collected during the synoptic search by a ground or space-based sensor. Its calculation is based on the utilization of information entropy techniques. The tasking is defined by considering a sequence of hypotheses in regard to the fine features of the satellite. The optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground or space-based sensor.

  1. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2, and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
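
    A sketch of the core computation under the stated binomial model, assuming the uniform prior suggested by the minimal-assumptions approach: with s successes ("new code predicts better") in n paired comparisons, the posterior for θ is Beta(s+1, n-s+1), and P(θ > 1/2 | data) quantifies confidence that an improvement has been observed. The numbers are illustrative.

    ```python
    from scipy.stats import beta

    def confidence_of_improvement(successes, n):
        """Posterior P(theta > 1/2) for a Beta(1,1) prior and binomial data,
        where theta is the chance the new code beats the old on a random test."""
        a, b = successes + 1, n - successes + 1
        return beta.sf(0.5, a, b)          # survival function = 1 - CDF

    # 14 of 20 comparisons favoured the new code:
    print(confidence_of_improvement(14, 20))   # ~0.96
    ```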

  2. A minimax procedure in the context of sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1999-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master or a nonmaster, or to continue sampling and administering another random test item. The framework of minimax sequential decision theory

  3. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  4. Sequential accelerated tests: Improving the correlation of accelerated tests to module performance in the field

    Science.gov (United States)

    Felder, Thomas; Gambogi, William; Stika, Katherine; Yu, Bao-Ling; Bradley, Alex; Hu, Hongjie; Garreau-Iles, Lucie; Trout, T. John

    2016-09-01

    DuPont has been working steadily to develop accelerated backsheet tests that correlate with observations of solar panels in the field. This report updates efforts in sequential testing. Single exposure tests are more commonly used and can be completed more quickly, and certain tests provide helpful predictions of certain backsheet failure modes. DuPont recommendations for single exposure tests are based on 25-year exposure levels for UV and humidity/temperature, and form a good basis for sequential test development. We recommend a sequential exposure of damp heat followed by UV, then repetitions of thermal cycling and UVA. This sequence preserves 25-year exposure levels for humidity/temperature and UV, and correlates well with a large body of field observations. Measurements can be taken at intervals in the test, although the full test runs 10 months. A second, shorter sequential test based on damp heat and thermal cycling tests mechanical durability and correlates with loss of mechanical properties seen in the field. Ongoing work is directed toward shorter sequential tests that preserve good correlation to field data.

  5. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    Science.gov (United States)

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  6. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  7. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an already existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test allows one to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.

  8. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.

  9. Fault detection in multiply-redundant measurement systems via sequential testing

    International Nuclear Information System (INIS)

    Ray, A.

    1988-01-01

    This paper presents the theory and application of a sequential test procedure for fault detection and isolation. The test procedure is suited for the development of intelligent instrumentation in strategic processes like aircraft and nuclear plants, where redundant measurements are usually available for individual critical variables. The test procedure consists of: (1) a generic redundancy management procedure which is essentially independent of the fault detection strategy and measurement noise statistics, and (2) a modified version of the sequential probability ratio test algorithm for fault detection and isolation, which functions within the framework of this redundancy management procedure. The sequential test procedure is suitable for real-time applications using commercially available microcomputers, and its efficacy has been verified by online fault detection in an operating nuclear reactor. 15 references

  10. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    Science.gov (United States)

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  11. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    BACKGROUND: Group-sequential testing is widely used in pivotal therapeutic, but rarely in diagnostic research, although it may save studies, time, and costs. The purpose of this paper was to demonstrate a group-sequential analysis strategy in an intra-observer study on quantitative FDG-PET/CT measurements. ... The differences were assumed to be normally distributed, and sequential one-sided hypothesis tests on the population standard deviation of the differences against a hypothesised value of 1.5 were performed, employing an alpha spending function. The fixed-sample analysis (N = 45) was compared with the group-sequential analysis strategies comprising one (at N = 23), two (at N = 15, 30), or three interim analyses (at N = 11, 23, 34), respectively, which were defined post hoc. RESULTS: When performing interim analyses with one third and two thirds of patients, sufficient agreement could be concluded after the first interim analysis...
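
    A sketch of one common alpha spending function that such a design could employ, the Lan-DeMets O'Brien-Fleming-type rule (the paper's exact spending function and one-sided setup may differ), evaluated at information fractions matching interim looks at 11, 23, and 34 of 45 patients:

    ```python
    from scipy.stats import norm

    def obf_spending(t, alpha=0.05):
        """O'Brien-Fleming-type alpha spending (Lan-DeMets):
        alpha*(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t))) at information fraction t."""
        z = norm.ppf(1 - alpha / 2)
        return 2 * (1 - norm.cdf(z / t ** 0.5))

    for n in (11, 23, 34, 45):             # interim looks, then the final look
        t = n / 45
        print(f"n={n:2d}  t={t:.2f}  cumulative alpha spent = {obf_spending(t):.4f}")
    ```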

  12. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  13. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

    The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever-increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. The objective was to test the assumption that a breakpoint exists, which we term a morbidity tipping point, separating a period of relative health from a subsequent deterioration in health status. An analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point: an ever-increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
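
    A minimal sketch of hockey-stick (breakpoint) regression on synthetic data, not the authors' dataset: the model is flat at level b0 before the breakpoint x0 and rises with slope b1 after it, so the fitted x0 estimates the tipping point.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hockey_stick(x, x0, b0, b1):
        """Flat at level b0 before the breakpoint x0, linear rise after it."""
        return b0 + b1 * np.maximum(x - x0, 0.0)

    # Synthetic morbidity-like data with a true tipping point at age 45:
    rng = np.random.default_rng(0)
    age = np.linspace(20, 80, 200)
    y = hockey_stick(age, 45, 1.0, 0.08) + rng.normal(0, 0.1, age.size)

    popt, _ = curve_fit(hockey_stick, age, y, p0=(40, 1.0, 0.05))
    print(f"estimated tipping point: {popt[0]:.1f} years")
    ```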

  14. [Working memory, phonological awareness and spelling hypothesis].

    Science.gov (United States)

    Gindri, Gigiane; Keske-Soares, Márcia; Mota, Helena Bolli

    2007-01-01

    To verify the relationship between working memory, phonological awareness and spelling hypothesis in pre-school children and first graders. Participants were 90 students from state schools who presented typical linguistic development: 40 were preschoolers, with an average age of six, and 50 were first graders, with an average age of seven. Participants were submitted to an evaluation of working memory abilities based on the Working Memory Model (Baddeley, 2000), involving the phonological loop. The phonological loop was evaluated using the Auditory Sequential Test, subtest 5 of the Illinois Test of Psycholinguistic Abilities (ITPA), Brazilian version (Bogossian & Santos, 1977), and the Meaningless Words Memory Test (Kessler, 1997). Phonological awareness abilities were investigated using the Phonological Awareness: Instrument of Sequential Assessment (CONFIAS - Moojen et al., 2003), involving syllabic and phonemic awareness tasks. Writing was characterized according to Ferreiro & Teberosky (1999). Preschoolers were able to repeat sequences of 4.80 digits and 4.30 syllables on average. Regarding phonological awareness, their performance was 19.68 at the syllabic level and 8.58 at the phonemic level. Most of the preschoolers demonstrated a pre-syllabic writing hypothesis. First graders repeated, on average, sequences of 5.06 digits and 4.56 syllables. These children showed phonological awareness scores of 31.12 at the syllabic level and 16.18 at the phonemic level, and demonstrated an alphabetic writing hypothesis. Working memory performance, phonological awareness and spelling level are inter-related, as well as being related to chronological age, development and schooling.

  15. Human visual system automatically encodes sequential regularities of discrete events.

    Science.gov (United States)

    Kimura, Motohiro; Schröger, Erich; Czigler, István; Ohira, Hideki

    2010-06-01

    For our adaptive behavior in a dynamically changing environment, an essential task of the brain is to automatically encode sequential regularities inherent in the environment into a memory representation. Recent studies in neuroscience have suggested that sequential regularities embedded in discrete sensory events are automatically encoded into a memory representation at the level of the sensory system. This notion is largely supported by evidence from investigations using auditory mismatch negativity (auditory MMN), an event-related brain potential (ERP) correlate of an automatic memory-mismatch process in the auditory sensory system. However, it is still largely unclear whether or not this notion can be generalized to other sensory modalities. The purpose of the present study was to investigate the contribution of the visual sensory system to the automatic encoding of sequential regularities using visual mismatch negativity (visual MMN), an ERP correlate of an automatic memory-mismatch process in the visual sensory system. To this end, we conducted a sequential analysis of visual MMN in an oddball sequence consisting of infrequent deviant and frequent standard stimuli, and tested whether the underlying memory representation of visual MMN generation contains only a sensory memory trace of standard stimuli (trace-mismatch hypothesis) or whether it also contains sequential regularities extracted from the repetitive standard sequence (regularity-violation hypothesis). The results showed that visual MMN was elicited by first deviant (deviant stimuli following at least one standard stimulus), second deviant (deviant stimuli immediately following first deviant), and first standard (standard stimuli immediately following first deviant), but not by second standard (standard stimuli immediately following first standard). These results are consistent with the regularity-violation hypothesis, suggesting that the visual sensory system automatically encodes sequential

  16. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  17. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two at a time). The same research question may be explored by more than one type of hypothesis test

  18. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
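
    A minimal sketch of the kind of automated check advocated here (my example, not the paper's test suite): compare a Monte Carlo estimate against a known reference value with a z-test on the estimated standard error, and flag a suspiciously small p-value as a likely numerical problem or bug.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mc_consistency_test(samples, exact, alpha=0.01):
        """Two-sided z-test that the Monte Carlo mean is consistent with the
        exact value; a small p-value flags a potential problem in the code."""
        m = samples.mean()
        se = samples.std(ddof=1) / np.sqrt(samples.size)
        p = 2 * norm.sf(abs((m - exact) / se))
        return p, p >= alpha                # True = test passed

    # Estimate E[X^2] = 1 for X ~ N(0,1) and check it against the exact value:
    rng = np.random.default_rng(42)
    p, ok = mc_consistency_test(rng.standard_normal(100_000) ** 2, exact=1.0)
    print(f"p = {p:.3f}, consistent: {ok}")
    ```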

  19. Personal Hypothesis Testing: The Role of Consistency and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; And Others

    1988-01-01

    Studied how individuals test hypotheses about themselves. Examined extent to which Snyder's bias toward confirmation persists when negative or nonconsistent personal hypothesis is tested. Found negativity or positivity did not affect hypothesis testing directly, though hypothesis consistency did. Found cognitive schematic variable (vulnerability…

  20. Null but not void: considerations for hypothesis testing.

    Science.gov (United States)

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  1. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  2. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.

  3. Sequential Computerized Mastery Tests--Three Simulation Studies

    Science.gov (United States)

    Wiberg, Marie

    2006-01-01

    A simulation study of a sequential computerized mastery test is carried out with items modeled with the 3 parameter logistic item response theory model. The examinees' responses are either identically distributed, not identically distributed, or not identically distributed together with estimation errors in the item characteristics. The…

  4. Sequential test procedures for inventory differences

    International Nuclear Information System (INIS)

    Goldman, A.S.; Kern, E.A.; Emeigh, C.W.

    1985-01-01

    By means of a simulation study, we investigated the appropriateness of Page's and power-one sequential tests on sequences of inventory differences obtained from an example materials control unit, a sub-area of a hypothetical UF6-to-U3O8 conversion process. The study examined detection probability and run length curves obtained from different loss scenarios. 12 refs., 10 figs., 2 tabs
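
    A minimal sketch of Page's test (a one-sided CUSUM) on standardized inventory differences, with illustrative reference value k and decision limit h rather than the study's tuned parameters: the statistic accumulates excess over k and alarms when it exceeds h.

    ```python
    def page_cusum(diffs, k=0.5, h=4.0):
        """Page's one-sided CUSUM: alarm when the cumulative excess over the
        reference value k exceeds the decision limit h (both in sigma units)."""
        s = 0.0
        for i, x in enumerate(diffs, 1):
            s = max(0.0, s + x - k)        # reset at zero, accumulate excess
            if s > h:
                return "alarm", i
        return "in control", len(diffs)

    # Standardized inventory differences with a loss starting at period 6:
    print(page_cusum([0.1, -0.4, 0.3, -0.2, 0.0, 1.4, 1.2, 1.6, 1.1, 1.5]))
    ```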

  5. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing a nonlinear hypothesis using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis using the iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing a nonlinear hypothesis using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing a nonlinear hypothesis using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models with a suitable illustration. William Grene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.

  6. Tests of the Giant Impact Hypothesis

    Science.gov (United States)

    Jones, J. H.

    1998-01-01

    The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, and this means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  7. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Directory of Open Access Journals (Sweden)

    Shigang Zhang

    2015-10-01

    Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics.

  8. Optimal Sequential Diagnostic Strategy Generation Considering Test Placement Cost for Multimode Systems

    Science.gov (United States)

    Zhang, Shigang; Song, Lijun; Zhang, Wei; Hu, Zheng; Yang, Yongmin

    2015-01-01

    Sequential fault diagnosis is an approach that realizes fault isolation by executing the optimal test step by step. The strategy used, i.e., the sequential diagnostic strategy, has great influence on diagnostic accuracy and cost. Optimal sequential diagnostic strategy generation is an important step in the process of diagnosis system construction, which has been studied extensively in the literature. However, previous algorithms either are designed for single mode systems or do not consider test placement cost. They are not suitable to solve the sequential diagnostic strategy generation problem considering test placement cost for multimode systems. Therefore, this problem is studied in this paper. A formulation is presented. Two algorithms are proposed, one of which is realized by system transformation and the other is newly designed. Extensive simulations are carried out to test the effectiveness of the algorithms. A real-world system is also presented. All the results show that both of them have the ability to solve the diagnostic strategy generation problem, and they have different characteristics. PMID:26457709

  9. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical...... ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship...... that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: Internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land...

  10. A Bayesian Optimal Design for Sequential Accelerated Degradation Testing

    Directory of Open Access Journals (Sweden)

    Xiaoyang Li

    2017-07-01

    When optimizing an accelerated degradation testing (ADT) plan, the initial values of unknown model parameters must be pre-specified. However, it is usually difficult to obtain the exact values, since many uncertainties are embedded in these parameters. Bayesian ADT optimal design was presented to address this problem by using prior distributions to capture these uncertainties. Nevertheless, when the difference between a prior distribution and the actual situation is large, the existing Bayesian optimal design might cause over-testing or under-testing issues. For example, the implemented ADT following the optimal ADT plan consumes too many testing resources, or too few accelerated degradation data are obtained during the ADT. To overcome these obstacles, a Bayesian sequential step-down-stress ADT design is proposed in this article. During the sequential ADT, the test under the highest stress level is first conducted based on the initial prior information to quickly generate degradation data. Then, the data collected under higher stress levels are employed to construct the prior distributions for the test design under lower stress levels by using Bayesian inference. In the process of optimization, the inverse Gaussian (IG) process is assumed to describe the degradation paths, and Bayesian D-optimality is selected as the optimization objective. A case study on an electrical connector's ADT plan is provided to illustrate the application of the proposed Bayesian sequential ADT design method. Compared with the results from a typical static Bayesian ADT plan, the proposed design could guarantee more stable and precise estimations of different reliability measures.

  11. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Science.gov (United States)

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate new facts by testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well-documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification, such that it is always the null hypothesis that is tested. The write-up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.

  12. A test of the orthographic recoding hypothesis

    Science.gov (United States)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  13. A shift from significance test to hypothesis test through power analysis in medical research.

    Science.gov (United States)

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over Neyman-Pearson's hypothesis test, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or not significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, these two approaches address the same objective and conclude in their own ways. The advancement in computing techniques and the availability of statistical software have resulted in increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to Neyman-Pearson's hypothesis test procedure.

  14. Programme for test generation for combinatorial and sequential systems

    International Nuclear Information System (INIS)

    Tran Huy Hoan

    1973-01-01

    This research thesis reports the computer-assisted search for tests aimed at failure detection in combinatorial and sequential logic circuits. Because the aim was to deal with complex circuits containing many modules, such as those found in large-scale integrated (LSI) circuits, the author used propagation paths. He reports the development of a method that is valid for combinatorial systems and for several sequential circuits comprising elementary logic modules and JK and RS flip-flops. The method was developed on an IBM 360/91 computer in the PL/1 language. The memory space used is limited and adjustable with respect to circuit size. Computing time is short compared to that needed by other programmes. The solution is practical and efficient for failure testing and localisation.

  15. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985807 Keywords: nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016

  16. A large scale test of the gaming-enhancement hypothesis

    Directory of Open Access Journals (Sweden)

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what the gaming-enhancement hypothesis predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  17. A large scale test of the gaming-enhancement hypothesis.

    Science.gov (United States)

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what the gaming-enhancement hypothesis predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  18. A shift from significance test to hypothesis test through power analysis in medical research

    Directory of Open Access Journals (Sweden)

    Singh Girish

    2006-01-01

    Full Text Available Until recently, the medical research literature exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over Neyman-Pearson's hypothesis test, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or not significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, these two approaches address the same objective and conclude in their own ways. The advancement in computing techniques and the availability of statistical software have resulted in increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to Neyman-Pearson's hypothesis test procedure.

  19. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    Science.gov (United States)

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data, we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.

  20. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  1. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

    The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test, we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
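
    To make the underlying test concrete, here is a minimal Python sketch of Wald's SPRT for Bernoulli "obstacle evidence" observations; the hypothesized detection rates p0 and p1, the error targets, and the decision labels are illustrative assumptions, not the parameters used by the authors.

        import math

        def sprt(observations, p0=0.1, p1=0.4, alpha=0.01, beta=0.01):
            # Wald's thresholds on the accumulated log-likelihood ratio
            upper = math.log((1 - beta) / alpha)   # decide H1 at or above this
            lower = math.log(beta / (1 - alpha))   # decide H0 at or below this
            llr = 0.0
            for x in observations:
                llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
                if llr >= upper:
                    return "H1"        # obstacle present
                if llr <= lower:
                    return "H0"        # no obstacle
            return "continue"          # keep sensing (e.g., slow the vehicle down)

        print(sprt([1, 0, 1, 1, 0, 1, 1]))

    The "continue" outcome is what distinguishes the sequential test from a fixed-sample test: the vehicle can keep gathering evidence (at reduced speed) instead of committing early to a false positive.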

  2. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  3. An omnibus test for the global null hypothesis.

    Science.gov (United States)

    Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja

    2018-01-01

    Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses but in testing whether none of the hypotheses is false. There are several possibilities for testing the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g., the Bonferroni or Simes test). However, there is usually no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
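
    The two regimes contrasted above can be sketched in a few lines of Python; Fisher's combination test and the Simes global test below are the standard baselines named in the abstract (the proposed cumulative-sum omnibus statistic itself lives in the authors' R package), and the example p-values are made up.

        import numpy as np
        from scipy import stats

        def fisher_global(pvals):
            # Combination test: powerful when many individual nulls are false
            stat = -2.0 * np.sum(np.log(pvals))
            return stats.chi2.sf(stat, df=2 * len(pvals))

        def simes_global(pvals):
            # Simes test: powerful when only one or a few nulls are false
            p = np.sort(np.asarray(pvals))
            m = len(p)
            return min(1.0, float((m * p / np.arange(1, m + 1)).min()))

        pvals = [0.001, 0.40, 0.52, 0.61, 0.75]   # one strong signal among five
        print(fisher_global(pvals), simes_global(pvals))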

  4. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).

  5. Counselor Hypothesis Testing Strategies: The Role of Initial Impressions and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; Chiodo, Anthony L.

    1984-01-01

    Presents two experiments concerning confirmatory bias in the way counselors collect data to test their hypotheses. Counselors were asked either to develop their own clinical hypothesis or were given a hypothesis to test. Confirmatory bias in hypothesis testing was not supported in either experiment. (JAC)

  6. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985556 Keywords: nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/kalina-0474065.pdf

  7. Improving the identification accuracy of senior witnesses: do prelineup questions and sequential testing help?

    Science.gov (United States)

    Memon, Amina; Gabbert, Fiona

    2003-04-01

    Eyewitness research has identified sequential lineup testing as a way of reducing false lineup choices while maintaining accurate identifications. The authors examined the usefulness of this procedure for reducing false choices in older adults. Young and senior witnesses viewed a crime video and were later presented with target-present or target-absent lineups in a simultaneous or sequential format. In addition, some participants received prelineup questions about their memory for a perpetrator's face and about their confidence in their ability to identify the culprit or to correctly reject the lineup. The sequential lineup reduced false choosing rates among young and older adults in target-absent conditions. In target-present conditions, sequential testing significantly reduced the correct identification rate in both age groups.

  8. Decision-making in research tasks with sequential testing.

    Directory of Open Access Journals (Sweden)

    Thomas Pfeiffer

    Full Text Available BACKGROUND: In a recent controversial essay, published by JPA Ioannidis in PLoS Medicine, it has been argued that in some research fields, most of the published findings are false. Based on theoretical reasoning it can be shown that small effect sizes, error-prone tests, low priors of the tested hypotheses and biases in the evaluation and publication of research findings increase the fraction of false positives. These findings raise concerns about the reliability of research. However, they are based on a very simple scenario of scientific research, where single tests are used to evaluate independent hypotheses. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we present computer simulations and experimental approaches for analyzing more realistic scenarios. In these scenarios, research tasks are solved sequentially, i.e. subsequent tests can be chosen depending on previous results. We investigate simple sequential testing and scenarios where only a selected subset of results can be published and used for future rounds of test choice. Results from computer simulations indicate that for the tasks analyzed in this study, the fraction of false among the positive findings declines over several rounds of testing if the most informative tests are performed. Our experiments show that human subjects frequently perform the most informative tests, leading to a decline of false positives as expected from the simulations. CONCLUSIONS/SIGNIFICANCE: For the research tasks studied here, findings tend to become more reliable over time. We also find that performance in those experimental settings where not all performed tests could be published was surprisingly inefficient. Our results may help optimize existing procedures used in the practice of scientific research and provide guidance for the development of novel forms of scholarly communication.
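
    A drastically simplified Python simulation of the mechanism described above, assuming a fixed prior fraction of true hypotheses, a fixed significance level and power, and retesting of positive findings in each subsequent round; all numbers are illustrative stand-ins, not the study's parameters.

        import numpy as np

        rng = np.random.default_rng(11)

        def false_positive_fraction(prior_true=0.1, alpha=0.05, power=0.8,
                                    rounds=5, n_hyp=100_000):
            # Only positive findings are carried forward and retested each round
            truth = rng.random(n_hyp) < prior_true
            for r in range(1, rounds + 1):
                p_positive = np.where(truth, power, alpha)
                positive = rng.random(truth.size) < p_positive
                truth = truth[positive]
                print(f"round {r}: positives={positive.sum():6d}  "
                      f"false fraction={1.0 - truth.mean():.3f}")

        false_positive_fraction()

    Each round filters out false positives at rate alpha while retaining true positives at rate power, so the false fraction among positives shrinks round by round, mirroring the decline the simulations report.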

  9. A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Gustavo Miranda da Silva

    2015-09-01

    Full Text Available This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that at the same time are optimal from a statistical perspective and that also yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail to have such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under a Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships under a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate through simple examples optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove that logically-consistent hypothesis testing by means of a Bayes point estimator is optimal only under very restrictive conditions.

  10. A critique of statistical hypothesis testing in clinical research

    Directory of Open Access Journals (Sweden)

    Somik Raha

    2011-01-01

    Full Text Available Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are those of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview, requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability to an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not of any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. As a big reason for the prevalence of RCTs in academia is legislation requiring their use, the ethics of legislating the use of statistical methods for clinical research is also examined.

  11. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the
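
    In the spirit of the blurb's third question, here is a minimal Python sketch of programming a test oneself: a two-sided one-sample t-test written out from first principles and cross-checked against the library shorthand. The data values are invented.

        import numpy as np
        from scipy import stats

        def one_sample_t(x, mu0=0.0):
            # t statistic and two-sided p-value, computed by hand
            x = np.asarray(x, dtype=float)
            n = x.size
            t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))
            p = 2.0 * stats.t.sf(abs(t), df=n - 1)
            return t, p

        sample = [5.1, 4.9, 5.6, 5.2, 4.8, 5.4]
        print(one_sample_t(sample, mu0=5.0))
        print(stats.ttest_1samp(sample, 5.0))   # the shorthand procedure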

  12. Multi-arm group sequential designs with a simultaneous stopping rule.

    Science.gov (United States)

    Urach, S; Posch, M

    2016-12-30

    Multi-arm group sequential clinical trials are efficient designs to compare multiple treatments to a control. They allow one to test for treatment effects already in interim analyses and can have a lower average sample number than fixed sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as for any of the arms the null hypothesis of no treatment effect can be rejected, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, but the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number and investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
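
    The trade-off described above can be reproduced with a small Monte Carlo sketch in Python. It simulates a two-stage trial with two experimental arms, each compared with its own control on unit-variance outcomes, using an approximate two-look Pocock-type boundary; the boundary constant, sample sizes, and effect sizes are illustrative assumptions, not the paper's optimized designs.

        import numpy as np

        rng = np.random.default_rng(0)
        C = 2.178             # approximate two-look Pocock constant (assumption)
        N_STAGE = 50          # observations per group, per stage
        EFFECTS = (0.3, 0.3)  # true standardized effects of the two arms

        def run_trial(rule):
            active, rejected, n_used = [True, True], [False, False], 0
            data = [[np.empty(0), np.empty(0)] for _ in range(2)]  # [arm][trt, ctl]
            for stage in (1, 2):
                for a in (0, 1):
                    if not active[a]:
                        continue
                    data[a][0] = np.append(data[a][0], rng.normal(EFFECTS[a], 1, N_STAGE))
                    data[a][1] = np.append(data[a][1], rng.normal(0.0, 1, N_STAGE))
                    n_used += 2 * N_STAGE
                    n = data[a][0].size
                    z = (data[a][0].mean() - data[a][1].mean()) / np.sqrt(2.0 / n)
                    if z > C:
                        rejected[a], active[a] = True, False
                if rule == "simultaneous" and any(rejected):
                    break  # the whole trial stops at the first rejection
            return n_used, rejected

        for rule in ("simultaneous", "separate"):
            runs = [run_trial(rule) for _ in range(5000)]
            print(rule, "ASN:", round(np.mean([n for n, _ in runs]), 1),
                  "P(reject any):", np.mean([any(r) for _, r in runs]),
                  "P(reject all):", np.mean([all(r) for _, r in runs]))

    In this toy setup the simultaneous rule shows the pattern the abstract describes: a lower average sample number, paid for by a lower probability of rejecting all null hypotheses.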

  13. Explorations in Statistics: Hypothesis Tests and P Values

    Science.gov (United States)

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…

  14. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
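
    A toy Python sketch of the reduced test described above: under the concentrated-Gaussian null and non-informative-uniform alternative, the likelihood ratio test collapses to thresholding the empirical variance of the predictive samples. The kernel-density predictor itself is not reproduced here; the samples, the threshold tau, and the function name are hypothetical.

        import numpy as np

        def flag_large_error(pred_samples, tau):
            # Flag a potentially large prediction error when the empirical
            # variance of the predictive distribution exceeds the threshold
            return np.var(pred_samples, ddof=1) > tau

        rng = np.random.default_rng(5)
        calm = rng.normal(10.0, 0.2, 200)      # low predictive uncertainty
        erratic = rng.normal(10.0, 2.0, 200)   # high predictive uncertainty
        print(flag_large_error(calm, tau=1.0), flag_large_error(erratic, tau=1.0))

    In practice tau would be tuned retrospectively, e.g., from an ROC curve on traces with known prediction errors, as the abstract describes.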

  15. Some consequences of using the Horsfall-Barratt scale for hypothesis testing

    Science.gov (United States)

    Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...

  16. Safety test No. S-6, launch pad abort sequential test Phase II: solid propellant fire

    International Nuclear Information System (INIS)

    Snow, E.C.

    1975-08-01

    In preparation for the Lincoln Laboratory's LES 8/9 space mission, a series of tests was performed to evaluate the nuclear safety capability of the Multi-Hundred Watt (MHW) Radioisotope Thermoelectric Generator (RTG) to be used to supply power for the satellite. One such safety test is Test No. S-6, Launch Pad Abort Sequential Test. The objective of this test was to subject the RTG and its components to the sequential environments characteristic of a catastrophic launch pad accident to evaluate their capability to contain the 238 PuO 2 fuel. This sequence of environments was to have consisted of the blast overpressure and fragments, followed by the fireball, low velocity impact on the launch pad, and solid propellant fire. The blast overpressure and fragments were subsequently eliminated from this sequence. The procedures and results of Phase II of Test S-6, Solid Propellant Fire are presented. In this phase of the test, a simulant Fuel Sphere Assembly (FSA) and a mockup of a damaged Heat Source Assembly (HSA) were subjected to single proximity solid propellant fires of approximately 10-min duration. Steel was introduced into both tests to simulate the effects of launch pad debris and the solid rocket motor (SRM) casing that might be present in the fire zone. (TFD)

  17. Different meaning of the p-value in exploratory and confirmatory hypothesis testing

    DEFF Research Database (Denmark)

    Gerke, Oke; Høilund-Carlsen, Poul Flemming; Vach, Werner

    2011-01-01

    The outcome of clinical studies is often reduced to the statistical significance of results by indicating a p-value below the 5% significance level. Hypothesis testing and, through that, the p-value is commonly used, but their meaning is frequently misinterpreted in clinical research. The concept … of hypothesis testing is explained and some pitfalls, including those of multiple testing, are given. The conceptual difference between exploratory and confirmatory hypothesis testing is discussed, and a better use of p-values, which includes presenting p-values with two or three decimals, is suggested …

  18. Sequential and simultaneous choices: testing the diet selection and sequential choice models.

    Science.gov (United States)

    Freidin, Esteban; Aw, Justine; Kacelnik, Alex

    2009-03-01

    We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later, choice, phase, birds encountered the pairs A-B, A-R and B-R, the first implementing a simultaneous choice and the second and third sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and SCM uses latencies of the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and experimental results were strongly correlated to this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.

  19. Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test

    Directory of Open Access Journals (Sweden)

    Closas Pau

    2012-10-01

    Full Text Available Abstract Background Influenza is a well known and common human respiratory infection, causing significant morbidity and mortality every year. Despite Influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data becomes available. Binary epidemic detection of weekly incidence rates is assessed by a Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative density function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. The Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: weeks 50-10 (2008-2009 season), weeks 38-50 (2009-2010 season), weeks 50-9 (2010-2011 season) and weeks 3 to 12 for the current 2011-2012 season. Conclusions Real medical data was used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. In general, the proposed test could
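
    A rough Python sketch of this detection loop, assuming each weekly decision is made from a small window of recent incidence rates and that only non-epidemic data refit the exponential model; the synthetic data stream and the window treatment are illustrative assumptions, with λ0 taken from the abstract.

        import numpy as np
        from scipy import stats

        def ks_epidemic_test(rates, lam, alpha=0.05):
            # One-sample Kolmogorov-Smirnov test of the rates against the
            # fitted exponential model (scale = 1 / lambda)
            d, p = stats.kstest(rates, "expon", args=(0.0, 1.0 / lam))
            return p < alpha, d, p

        rng = np.random.default_rng(2)
        stream = [rng.exponential(1 / 3.9, 20) for _ in range(10)]  # synthetic weeks

        lam = 3.8617        # initial MLE from non-epidemic training data
        nonepidemic = []    # running pool of non-epidemic rates
        for week, rates in enumerate(stream, 1):
            epidemic, d, p = ks_epidemic_test(rates, lam)
            if not epidemic:                      # sequential parameter update
                nonepidemic.extend(rates)
                lam = 1.0 / np.mean(nonepidemic)  # exponential MLE: 1 / sample mean
            print(f"week {week}: D={d:.3f} p={p:.3f} epidemic={epidemic}")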

  20. The frequentist implications of optional stopping on Bayesian hypothesis tests.

    Science.gov (United States)

    Sanborn, Adam N; Hills, Thomas T

    2014-04-01

    Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite (taking multiple parameter values), such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
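
    The first situation is easy to reproduce in Python with a conjugate Beta-Binomial Bayes factor and a stop-when-BF10-exceeds-3 rule; the point null at 0.5, the uniform alternative, the stopping criterion, and the "heterogeneous truth" values are illustrative assumptions, not the paper's exact setup.

        import numpy as np
        from scipy.special import betaln

        rng = np.random.default_rng(7)

        def log_bf10(k, n):
            # H1: theta ~ Beta(1, 1) versus H0: theta = 0.5, for binomial data
            return betaln(k + 1, n - k + 1) - betaln(1, 1) - n * np.log(0.5)

        def stops_for_h1(theta, n_max=1000, crit=np.log(3)):
            k = 0
            for n in range(1, n_max + 1):
                k += rng.random() < theta
                if log_bf10(k, n) > crit:
                    return True          # optional stopping: quit on "evidence"
            return False

        for label, thetas in [("truth on H0", [0.5]),
                              ("heterogeneous truth", [0.48, 0.52])]:
            rate = np.mean([stops_for_h1(rng.choice(thetas)) for _ in range(2000)])
            print(f"{label}: stopped for H1 in {rate:.1%} of runs")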

  1. Hypothesis Testing as an Act of Rationality

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely, the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculitic logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  2. Organization principles in visual working memory: Evidence from sequential stimulus display.

    Science.gov (United States)

    Gao, Zaifeng; Gao, Qiyang; Tang, Ning; Shui, Rende; Shen, Mowei

    2016-01-01

    Although the mechanisms of visual working memory (VWM) have been studied extensively in recent years, the active property of VWM has received less attention. In the current study, we examined how VWM integrates sequentially presented stimuli by focusing on the role of Gestalt principles, which are important organizing principles in perceptual integration. We manipulated the level of Gestalt cues among three or four sequentially presented objects that were memorized. The Gestalt principle could not emerge unless all the objects appeared together. We distinguished two hypotheses: a perception-alike hypothesis and an encoding-specificity hypothesis. The former predicts that the Gestalt cue will play a role in information integration within VWM; the latter predicts that the Gestalt cue will not operate within VWM. In four experiments, we demonstrated that collinearity (Experiment 1) and closure (Experiment 2) cues significantly improved VWM performance, and this facilitation was not affected by the testing manner (Experiment 3) or by adding extra colors to the memorized objects (Experiment 4). Finally, we re-established the Gestalt cue benefit with similarity cues (Experiment 5). These findings together suggest that VWM realizes and uses potential Gestalt principles within the stored representations, supporting a perception-alike hypothesis. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  4. Validation of new prognostic and predictive scores by sequential testing approach

    International Nuclear Information System (INIS)

    Nieder, Carsten; Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid

    2010-01-01

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)

  5. Validation of new prognostic and predictive scores by sequential testing approach

    Energy Technology Data Exchange (ETDEWEB)

    Nieder, Carsten [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway); Inst. of Clinical Medicine, Univ. of Tromso (Norway); Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway)

    2010-03-15

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)

  6. P value and the theory of hypothesis testing: an explanation for new researchers.

    Science.gov (United States)

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
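
    The two error types discussed above can be made concrete with a small Python simulation that estimates the rejection rate of a two-sample t-test under the null (Type I error) and under a true effect (power, the complement of Type II error); the effect size, sample size, and level are arbitrary illustrative choices.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        def rejection_rate(effect, n=30, alpha=0.05, n_sim=10_000):
            # Fraction of two-sample t-tests rejecting H0 at level alpha
            rejections = 0
            for _ in range(n_sim):
                x = rng.normal(0.0, 1.0, n)
                y = rng.normal(effect, 1.0, n)
                rejections += stats.ttest_ind(x, y).pvalue < alpha
            return rejections / n_sim

        print("Type I error rate (no effect):", rejection_rate(0.0))  # about alpha
        print("Power (effect = 0.5 SD):      ", rejection_rate(0.5))  # well below 1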

  7. A test of the reward-value hypothesis.

    Science.gov (United States)

    Smith, Alexandra E; Dalecki, Stefan J; Crystal, Jonathon D

    2017-03-01

    Rats retain source memory (memory for the origin of information) over a retention interval of at least 1 week, whereas their spatial working memory (radial maze locations) decays within approximately 1 day. We have argued that different forgetting functions dissociate memory systems. However, the two tasks, in our previous work, used different reward values. The source memory task used multiple pellets of a preferred food flavor (chocolate), whereas the spatial working memory task provided access to a single pellet of standard chow-flavored food at each location. Thus, according to the reward-value hypothesis, enhanced performance in the source memory task stems from enhanced encoding/memory of a preferred reward. We tested the reward-value hypothesis by using a standard 8-arm radial maze task to compare spatial working memory accuracy of rats rewarded with either multiple chocolate or chow pellets at each location using a between-subjects design. The reward-value hypothesis predicts superior accuracy for high-valued rewards. We documented equivalent spatial memory accuracy for high- and low-value rewards. Importantly, a 24-h retention interval produced equivalent spatial working memory accuracy for both flavors. These data are inconsistent with the reward-value hypothesis and suggest that reward value does not explain our earlier findings that source memory survives unusually long retention intervals.

  8. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offers … an interpretation and explanation of biases which entails that the stated preference methods need not be completely written off. In this paper we conduct a test for the validity and relevance of the DPH interpretation of biases. In a choice experiment concerning preferences for protection of Danish nature areas … as respondents evaluate more and more choice sets. This finding supports the Discovered Preference Hypothesis interpretation and explanation of starting point bias.

  9. Gaussian Hypothesis Testing and Quantum Illumination.

    Science.gov (United States)

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.

  10. Mechanisms of eyewitness suggestibility: tests of the explanatory role hypothesis.

    Science.gov (United States)

    Rindal, Eric J; Chrobak, Quin M; Zaragoza, Maria S; Weihing, Caitlin A

    2017-10-01

    In a recent paper, Chrobak and Zaragoza (Journal of Experimental Psychology: General, 142(3), 827-844, 2013) proposed the explanatory role hypothesis, which posits that the likelihood of developing false memories for post-event suggestions is a function of the explanatory function the suggestion serves. In support of this hypothesis, they provided evidence that participant-witnesses were especially likely to develop false memories for their forced fabrications when their fabrications helped to explain outcomes they had witnessed. In three experiments, we test the generality of the explanatory role hypothesis as a mechanism of eyewitness suggestibility by assessing whether this hypothesis can predict suggestibility errors in (a) situations where the post-event suggestions are provided by the experimenter (as opposed to fabricated by the participant), and (b) across a variety of memory measures and measures of recollective experience. In support of the explanatory role hypothesis, participants were more likely to subsequently freely report (E1) and recollect the suggestions as part of the witnessed event (E2, source test) when the post-event suggestion helped to provide a causal explanation for a witnessed outcome than when it did not serve this explanatory role. Participants were also less likely to recollect the suggestions as part of the witnessed event (on measures of subjective experience) when their explanatory strength had been reduced by the presence of an alternative explanation that could explain the same outcome (E3, source test + warning). Collectively, the results provide strong evidence that the search for explanatory coherence influences people's tendency to misremember witnessing events that were only suggested to them.

  11. A default Bayesian hypothesis test for correlations and partial correlations

    NARCIS (Netherlands)

    Wetzels, R.; Wagenmakers, E.J.

    2012-01-01

    We propose a default Bayesian hypothesis test for the presence of a correlation or a partial correlation. The test is a direct application of Bayesian techniques for variable selection in regression models. The test is easy to apply and yields practical advantages that the standard frequentist tests

  12. A test of the reward-contrast hypothesis.

    Science.gov (United States)

    Dalecki, Stefan J; Panoz-Brown, Danielle E; Crystal, Jonathon D

    2017-12-01

    Source memory, a facet of episodic memory, is the memory of the origin of information. Whereas source memory in rats is sustained for at least a week, spatial memory degrades after approximately a day. Different forgetting functions may suggest that two memory systems (source memory and spatial memory) are dissociated. However, in previous work, the two tasks used baiting conditions consisting of chocolate and chow flavors; notably, the source memory task used the relatively better flavor. Thus, according to the reward-contrast hypothesis, when chocolate and chow were presented within the same context (i.e., within a single radial maze trial), the chocolate location was more memorable than the chow location because of contrast. We tested the reward-contrast hypothesis using baiting configurations designed to produce reward contrast. The reward-contrast hypothesis predicts that under these conditions, spatial memory will survive a 24-h retention interval. We documented elimination of spatial memory performance after a 24-h retention interval using a reward-contrast baiting pattern. These data suggest that reward contrast does not explain our earlier findings that source memory survives unusually long retention intervals. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Simultaneous Versus Sequential Presentation in Testing Recognition Memory for Faces.

    Science.gov (United States)

    Finley, Jason R; Roediger, Henry L; Hughes, Andrea D; Wahlheim, Christopher N; Jacoby, Larry L

    2015-01-01

    Three experiments examined the issue of whether faces could be better recognized in a simultaneous test format (2-alternative forced choice [2AFC]) or a sequential test format (yes-no). All experiments showed that when target faces were present in the test, the simultaneous procedure led to superior performance (area under the ROC curve), whether lures were high or low in similarity to the targets. However, when a target-absent condition was used in which no lures resembled the targets but the lures were similar to each other, the simultaneous procedure yielded higher false alarm rates (Experiments 2 and 3) and worse overall performance (Experiment 3). This pattern persisted even when we excluded responses that participants opted to withhold rather than volunteer. We conclude that for the basic recognition procedures used in these experiments, simultaneous presentation of alternatives (2AFC) generally leads to better discriminability than does sequential presentation (yes-no) when a target is among the alternatives. However, our results also show that the opposite can occur when there is no target among the alternatives. An important future step is to see whether these patterns extend to more realistic eyewitness lineup procedures. The pictures used in the experiment are available online at http://www.press.uillinois.edu/journals/ajp/media/testing_recognition/.

  14. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The commonly used standard error in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), while the improved standard error 'fails to reject' H0.
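
    For reference, the two combination rules named in this record are short enough to sketch in Python; the example treats two phenomenology p-values (say, an Ms:mb screen and a depth screen) as independent, and the numbers are invented.

        import numpy as np
        from scipy import stats

        def fisher_combined(pvals):
            # Fisher: -2 * sum(log p) ~ chi-square with 2k df under the joint null
            return stats.chi2.sf(-2.0 * np.sum(np.log(pvals)), df=2 * len(pvals))

        def tippett_combined(pvals):
            # Tippett: based on the minimum p-value (Sidak-style adjustment)
            return 1.0 - (1.0 - np.min(pvals)) ** len(pvals)

        screens = [0.04, 0.20]   # e.g., Ms:mb p-value and depth p-value
        print(fisher_combined(screens), tippett_combined(screens))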

  15. Chi-square test and its application in hypothesis testing

    Directory of Open Access Journals (Sweden)

    Rakesh Rana

    2015-01-01

    Full Text Available In medical research, there are studies which often collect data on categorical variables that can be summarized as a series of counts. These counts are commonly arranged in a tabular format known as a contingency table. The chi-square test statistic can be used to evaluate whether there is an association between the rows and columns in a contingency table. More specifically, this statistic can be used to determine whether there is any difference between the study groups in the proportions of the risk factor of interest. The chi-square test and the logic of hypothesis testing were developed by Karl Pearson. This article describes in detail what a chi-square test is, the type of data to which it applies, the assumptions associated with its application, how to calculate it manually, and how to make use of an online calculator for obtaining the chi-square statistic and its associated P-value.
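
    As a rough illustration of the manual calculation the article describes, the sketch below computes the chi-square statistic for a 2x2 contingency table; the counts are invented, and the cross-check against scipy's built-in routine is a convenience, not part of the article.

```python
# Manual chi-square calculation for a 2x2 contingency table.
from scipy import stats

table = [[20, 30],   # group A: outcome present / absent
         [10, 40]]   # group B: outcome present / absent

row_totals = [sum(r) for r in table]
col_totals = [sum(c) for c in zip(*table)]
n = sum(row_totals)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (table[i][j] - expected) ** 2 / expected

p_value = stats.chi2.sf(chi2, df=(2 - 1) * (2 - 1))
print(chi2, p_value)  # agrees with stats.chi2_contingency(table, correction=False)
```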

  16. Trends in hypothesis testing and related variables in nursing research: a retrospective exploratory study.

    Science.gov (United States)

    Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar

    2011-01-01

    To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from the journal Nursing Research, the research journal with the highest circulation during the study period. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample: five years each from the 1980s and 1990s were randomly selected from the journal. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.

  17. Hypothesis testing in students: Sequences, stages, and instructional strategies

    Science.gov (United States)

    Moshman, David; Thompson, Pat A.

    Six sequences in the development of hypothesis-testing conceptions are proposed, involving (a) interpretation of the hypothesis; (b) the distinction between using theories and testing theories; (c) the consideration of multiple possibilities; (d) the relation of theory and data; (e) the nature of verification and falsification; and (f) the relation of truth and falsity. An alternative account is then provided involving three global stages: concrete operations, formal operations, and a postformal metaconstructive stage. Relative advantages and difficulties of the stage and sequence conceptualizations are discussed. Finally, three families of teaching strategy are distinguished, which emphasize, respectively: (a) social transmission of knowledge; (b) carefully sequenced empirical experience by the student; and (c) self-regulated cognitive activity of the student. It is argued on the basis of Piaget's theory that the last of these plays a crucial role in the construction of such logical reasoning strategies as those involved in testing hypotheses.

  18. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus; Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Hernández-Leo, Davinia; Stefanov, Krassen; Lemmers, Ruud; Koper, Rob

    2008-01-01

    Glahn, C., Specht, M., Schoonenboom, J., Sligte, H., Moghnieh, A., Hernández-Leo, D., Stefanov, K., Lemmers, R., & Koper, R. (2008). Cross-system log file analysis for hypothesis testing. In H. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for

  19. A "Projective" Test of the Golden Section Hypothesis.

    Science.gov (United States)

    Lee, Chris; Adams-Webber, Jack

    1987-01-01

    In a projective test of the golden section hypothesis, 24 high school students rated themselves and 10 comic strip characters on the basis of 12 bipolar constructs. The overall proportion of cartoon figures that subjects assigned to the positive poles of the constructs was very close to the golden section. (Author/NB)

  20. Animal Models for Testing the DOHaD Hypothesis

    Science.gov (United States)

    Since the seminal work in human populations by David Barker and colleagues, several species of animals have been used in the laboratory to test the Developmental Origins of Health and Disease (DOHaD) hypothesis. Rats, mice, guinea pigs, sheep, pigs and non-human primates have bee...

  1. Susceptibility testing of sequential isolates of Aspergillus fumigatus recovered from treated patients.

    NARCIS (Netherlands)

    Danaoui, E.; Meletiadis, J.; Tortorano, A.M.; Symoens, F.; Nolard, N.; Viviani, M.A.; Piens, M.A.; Lebeau, B.; Verweij, P.E.; Grillot, R.

    2004-01-01

    Two-hundred sequential Aspergillus fumigatus isolates recovered from 26 immunocompromised patients with invasive aspergillosis or bronchial colonization were tested for their in vitro susceptibility to posaconazole, itraconazole, voriconazole, terbinafine and amphotericin B. Twenty-one patients were

  2. Bayesian Approaches to Imputation, Hypothesis Testing, and Parameter Estimation

    Science.gov (United States)

    Ross, Steven J.; Mackey, Beth

    2015-01-01

    This chapter introduces three applications of Bayesian inference to common and novel issues in second language research. After a review of the critiques of conventional hypothesis testing, our focus centers on ways Bayesian inference can be used for dealing with missing data, for testing theory-driven substantive hypotheses without a default null…

  3. Tests of the lunar hypothesis

    Science.gov (United States)

    Taylor, S. R.

    1984-01-01

    The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.

  4. A Review of Multiple Hypothesis Testing in Otolaryngology Literature

    Science.gov (United States)

    Kirkham, Erin M.; Weaver, Edward M.

    2018-01-01

    Objective Multiple hypothesis testing (or multiple testing) refers to testing more than one hypothesis within a single analysis, and can inflate the Type I error rate (false positives) within a study. The aim of this review was to quantify multiple testing in recent large clinical studies in the otolaryngology literature and to discuss strategies to address this potential problem. Data sources Original clinical research articles with >100 subjects published in 2012 in the four general otolaryngology journals with the highest Journal Citation Reports 5-year impact factors. Review methods Articles were reviewed to determine whether the authors tested greater than five hypotheses in at least one family of inferences. For the articles meeting this criterion for multiple testing, Type I error rates were calculated and statistical correction was applied to the reported results. Results Of the 195 original clinical research articles reviewed, 72% met the criterion for multiple testing. Within these studies, there was a mean 41% chance of a Type I error and, on average, 18% of significant results were likely to be false positives. After the Bonferroni correction was applied, only 57% of significant results reported within the articles remained significant. Conclusion Multiple testing is common in recent large clinical studies in otolaryngology and deserves closer attention from researchers, reviewers and editors. Strategies for adjusting for multiple testing are discussed. PMID:25111574

  5. Sequential-Simultaneous Analysis of Japanese Children's Performance on the Japanese McCarthy.

    Science.gov (United States)

    Ishikuma, Toshinori; And Others

    This study explored the hypothesis that Japanese children perform significantly better on simultaneous processing than on sequential processing. The Kaufman Assessment Battery for Children (K-ABC) served as the criterion of the two types of mental processing. Regression equations to predict Sequential and Simultaneous processing from McCarthy…

  6. Testing the Cross-Racial Generality of Spearman's Hypothesis in Two Samples

    Science.gov (United States)

    Hartmann, Peter; Kruuse, Nanna Hye Sun; Nyborg, Helmuth

    2007-01-01

    Spearman's hypothesis states that racial differences in IQ between Blacks (B) and Whites (W) are due primarily to differences in the "g" factor. This hypothesis is often confirmed, but it is less certain whether it generalizes to other races. We therefore tested its cross-racial generality by comparing American subjects of European…

  7. Hypothesis Testing Using the Films of the Three Stooges

    Science.gov (United States)

    Gardner, Robert; Davidson, Robert

    2010-01-01

    The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.

  8. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    Science.gov (United States)

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…

  9. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which are inefficient under too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on the empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations (a pedestrian source carrier under respectively high and low count rate radioactive background, and a vehicle source carrier under the same respectively high and low count rate backgrounds), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
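
    The abstract does not give the test's exact form; as a hedged illustration of exploiting the Poisson nature of counting signals, the sketch below uses the classical conditional binomial test for comparing two Poisson rates. All counts and durations are invented.

```python
# Compare a channel's count over t1 seconds against a background count
# over t2 seconds. Under H0 (equal rates), x1 given x1 + x2 is binomial
# with success probability t1 / (t1 + t2).
from scipy import stats

x1, t1 = 42, 10.0     # counts in the monitored channel
x2, t2 = 300, 100.0   # background counts on a longer exposure

res = stats.binomtest(x1, n=x1 + x2, p=t1 / (t1 + t2), alternative='greater')
print(res.pvalue)     # small p-value suggests a rate excess in the channel
```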

  10. A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.

    Science.gov (United States)

    Liu, Tung; Stone, Courtenay C.

    1999-01-01

    Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…

  11. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    Science.gov (United States)

    Catalano, Ralph A

    2003-09-01

    Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  12. Sequential Test Selection by Quantifying of the Reduction in Diagnostic Uncertainty for the Diagnosis of Proximal Caries

    Directory of Open Access Journals (Sweden)

    Umut Arslan

    2013-06-01

    Full Text Available Background: In order to determine the presence or absence of a certain disease, multiple diagnostic tests may be necessary, and the performance of these tests can be evaluated sequentially. Aims: To determine the contribution of the test at each step in reducing diagnostic uncertainty when multiple tests are used sequentially for diagnosis. Study Design: Diagnostic accuracy study. Methods: Radiographs of seventy-three patients of the Department of Dento-Maxillofacial Radiology of Hacettepe University Faculty of Dentistry were assessed. Panoramic (PAN), full mouth intraoral (FM), and bitewing (BW) radiographs were used for the diagnosis of proximal caries in the maxillary and mandibular molar regions. The diagnostic performance of radiography was evaluated sequentially using the reduction in diagnostic uncertainty. Results: FM provided maximum diagnostic information for ruling-in potential in the maxillary and mandibular molar regions in the first step. FM provided more diagnostic information than BW radiographs for ruling in in the mandibular region in the second step. In the mandibular region, BW radiographs provided more diagnostic information than FM for ruling out in the first step. Conclusion: The method presented in this study provides clinicians with a solution for deciding on the sequential selection of diagnostic tests for the correct diagnosis of the presence or absence of a certain disease.
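
    One standard way to quantify the stepwise reduction in diagnostic uncertainty, sketched here under assumed sensitivities and specificities (not the study's estimates), is to chain Bayes updates through each test's likelihood ratio.

```python
# Sequential Bayes updating of the disease probability; the pre-test
# probability and the per-test accuracies below are hypothetical.

def post_test_probability(prior, sensitivity, specificity, positive=True):
    """Bayes update of disease probability after one test result."""
    lr = sensitivity / (1 - specificity) if positive else \
         (1 - sensitivity) / specificity
    odds = prior / (1 - prior) * lr
    return odds / (1 + odds)

p = 0.30                                  # assumed pre-test probability
for name, se, sp in [("FM", 0.75, 0.90), ("BW", 0.70, 0.95)]:
    p_new = post_test_probability(p, se, sp, positive=True)
    print(f"{name}: {p:.2f} -> {p_new:.2f}")  # stepwise gain in certainty
    p = p_new
```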

  13. Sequentially optimized reconstruction strategy: A meta-strategy for perimetry testing.

    Directory of Open Access Journals (Sweden)

    Şerife Seda Kucur

    Full Text Available Perimetry testing is an automated method to measure visual function and is heavily used for diagnosing ophthalmic and neurological conditions. Its working principle is to sequentially query a subject about perceived light using different brightness levels at different visual field locations. At a given location, this query-patient-feedback process is expected to converge at a perceived sensitivity, such that a shown stimulus intensity is observed and reported 50% of the time. Given this inherently time-intensive and noisy process, fast testing strategies are necessary in order to measure existing regions more effectively and reliably. In this work, we present a novel meta-strategy which relies on the correlative nature of visual field locations in order to strongly reduce the necessary number of locations that need to be examined. To do this, we sequentially determine locations that most effectively reduce visual field estimation errors in an initial training phase. We then exploit these locations at examination time and show that our approach can easily be combined with existing perceived sensitivity estimation schemes to speed up the examinations. Compared to state-of-the-art strategies, our approach shows marked performance gains with a better accuracy-speed trade-off regime for both mixed and sub-populations.

  14. A checklist to facilitate objective hypothesis testing in social psychology research.

    Science.gov (United States)

    Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J

    2015-01-01

    Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.

  15. Feasibility study using hypothesis testing to demonstrate containment of radionuclides within waste packages

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1986-04-01

    The purpose of this report is to apply methods of statistical hypothesis testing to demonstrate the performance of containers of radioactive waste. The approach involves modeling the failure times of waste containers using Weibull distributions, making strong assumptions about the parameters. A specific objective is to determine the number of container tests that must be performed in order to control the probability of arriving at the wrong conclusions. An algorithm to determine the required number of containers to be tested, with the acceptable number of failures, is derived as a function of the distribution parameters, the stated probabilities, and the desired waste containment life. Using a set of reference values for the input parameters, sample sizes of containers to be tested are calculated for demonstration purposes. These sample sizes are found to be excessively large, indicating that this hypothesis-testing framework does not provide a feasible approach for demonstrating satisfactory performance of waste packages over exceptionally long time periods.
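
    A minimal sketch of the sample-size logic, assuming a zero-failure acceptance criterion and a failure probability p1 taken from a hypothetical Weibull model at the target containment life; the report's actual algorithm and reference values are not reproduced here.

```python
# If each container independently fails before the target life with
# probability p1 under the alternative hypothesis, a zero-failure test
# of n containers wrongly "demonstrates" containment with probability
# (1 - p1)**n; choose the smallest n bounding that risk.
import math

def zero_failure_sample_size(p1, consumer_risk):
    """Smallest n with (1 - p1)^n <= consumer_risk."""
    return math.ceil(math.log(consumer_risk) / math.log(1.0 - p1))

# e.g. the bad-lot Weibull model implies 1% failures by the target life
print(zero_failure_sample_size(p1=0.01, consumer_risk=0.05))  # ~299 containers
```

    The rapid growth of this count as p1 shrinks illustrates why the report finds the required sample sizes excessively large for very long containment lives.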

  16. Privacy on Hypothesis Testing in Smart Grids

    OpenAIRE

    Li, Zuxing; Oechtering, Tobias

    2015-01-01

    In this paper, we study the problem of privacy information leakage in a smart grid. The privacy risk is assumed to be caused by an unauthorized binary hypothesis test of the consumer's behaviour based on the smart meter readings of energy supplied by the energy provider. Additional energy is produced by an alternative energy source. A controller equipped with an energy storage device manages the energy inflows to satisfy the energy demand of the consumer. We study the optimal ener...

  17. Double-blind photo lineups using actual eyewitnesses: an experimental test of a sequential versus simultaneous lineup procedure.

    Science.gov (United States)

    Wells, Gary L; Steblay, Nancy K; Dysart, Jennifer E

    2015-02-01

    Eyewitnesses (494) to actual crimes in 4 police jurisdictions were randomly assigned to view simultaneous or sequential photo lineups using laptop computers and double-blind administration. The sequential procedure used in the field experiment mimicked how it is conducted in actual practice (e.g., using a continuation rule, witness does not know how many photos are to be viewed, witnesses resolve any multiple identifications), which is not how most lab experiments have tested the sequential lineup. No significant differences emerged in rates of identifying lineup suspects (25% overall) but the sequential procedure produced a significantly lower rate (11%) of identifying known-innocent lineup fillers than did the simultaneous procedure (18%). The simultaneous/sequential pattern did not significantly interact with estimator variables and no lineup-position effects were observed for either the simultaneous or sequential procedures. Rates of nonidentification were not significantly different for simultaneous and sequential but nonidentifiers from the sequential procedure were more likely to use the "not sure" response option than were nonidentifiers from the simultaneous procedure. Among witnesses who made an identification, 36% (41% of simultaneous and 32% of sequential) identified a known-innocent filler rather than a suspect, indicating that eyewitness performance overall was very poor. The results suggest that the sequential procedure that is used in the field reduces the identification of known-innocent fillers, but the differences are relatively small.

  18. Consumer health information seeking as hypothesis testing.

    Science.gov (United States)

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  19. Concerns regarding a call for pluralism of information theory and hypothesis testing

    Science.gov (United States)

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
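
    As an illustration of the multimodel inference the authors advocate, the sketch below converts per-model AIC values into Akaike weights, interpretable as relative evidence for each hypothesis-linked model; the models and AIC values are invented.

```python
# Akaike weights: rescale AIC differences into normalized evidence.
import math

aic = {"H1: habitat model": 210.4,
       "H2: habitat + climate": 208.1,
       "H3: climate only": 215.9}

best = min(aic.values())
rel_lik = {m: math.exp(-0.5 * (a - best)) for m, a in aic.items()}
total = sum(rel_lik.values())
for m, rl in rel_lik.items():
    print(f"{m}: weight = {rl / total:.3f}")  # evidence per model, not just H0
```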

  20. The efficient market hypothesis: problems with interpretations of empirical tests

    Directory of Open Access Journals (Sweden)

    Denis Alajbeg

    2012-03-01

    Full Text Available Despite many “refutations” in empirical tests, the efficient market hypothesis (EMH remains the central concept of financial economics. The EMH’s resistance to the results of empirical testing emerges from the fact that the EMH is not a falsifiable theory. Its axiomatic definition shows how asset prices would behave under assumed conditions. Testing for this price behavior does not make much sense as the conditions in the financial markets are much more complex than the simplified conditions of perfect competition, zero transaction costs and free information used in the formulation of the EMH. Some recent developments within the tradition of the adaptive market hypothesis are promising regarding development of a falsifiable theory of price formation in financial markets, but are far from giving assurance that we are approaching a new formulation. The most that can be done in the meantime is to be very cautious while interpreting the empirical evidence that is presented as “testing” the EMH.

  1. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
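
    A generic Wald SPRT loop is sketched below, with a Gaussian mean shift standing in for the conjunction-specific likelihood ratio derived in the paper; the thresholds follow Wald's classical approximations, and all data are simulated.

```python
# Wald SPRT: accumulate the log-likelihood ratio over sequential
# observations and stop at Wald's two thresholds.
import math, random

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    lower = math.log(beta / (1 - alpha))
    upper = math.log((1 - beta) / alpha)
    llr = 0.0
    for k, x in enumerate(samples, 1):
        # log f1(x)/f0(x) for N(mu1, sigma) vs N(mu0, sigma)
        llr += (x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2
        if llr <= lower:
            return "accept H0", k
        if llr >= upper:
            return "accept H1", k
    return "undecided", k

random.seed(1)
data = (random.gauss(1.0, 1.0) for _ in range(10_000))
print(sprt(data, mu0=0.0, mu1=1.0, sigma=1.0))  # stops after a few samples
```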

  2. Tests of the planetary hypothesis for PTFO 8-8695b

    DEFF Research Database (Denmark)

    Yu, Liang; Winn, Joshua N.; Gillon, Michaël

    2015-01-01

    The T Tauri star PTFO 8-8695 exhibits periodic fading events that have been interpreted as the transits of a giant planet on a precessing orbit. Here we present three tests of the planet hypothesis. First, we sought evidence for the secular changes in light-curve morphology that are predicted... planetary orbit. Our spectroscopy also revealed strong, time-variable, high-velocity Hα and Ca H & K emission features. All these observations cast doubt on the planetary hypothesis, and suggest instead that the fading events represent starspots, eclipses by circumstellar dust, or occultations...

  3. The limits to pride: A test of the pro-anorexia hypothesis.

    Science.gov (United States)

    Cornelius, Talea; Blanton, Hart

    2016-01-01

    Many social psychological models propose that positive self-conceptions promote self-esteem. An extreme version of this hypothesis is advanced in "pro-anorexia" communities: identifying with anorexia, in conjunction with disordered eating, can lead to higher self-esteem. The current study empirically tested this hypothesis. Results challenge the pro-anorexia hypothesis. Although those with higher levels of pro-anorexia identification trended towards higher self-esteem with increased disordered eating, this did not overcome the strong negative main effect of pro-anorexia identification. These data suggest a more effective strategy for promoting self-esteem is to encourage rejection of disordered eating and an anorexic identity.

  4. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    Science.gov (United States)

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, analyzing Mini-Mental Status Exam (MMSE) scores with a random-slope and random-intercept model with a bilinear fixed effect. Our results show that, despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief that a change point exists during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
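
    A hedged sketch of the testing idea on synthetic data: compare the residual sum of squares of a constant-slope fit against the best bilinear fit over candidate change points, and calibrate the improvement by parametric bootstrap, since the change point is not identified under the null. This is a simplification that ignores the paper's random effects.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 6, 40)
y = 28 - 0.5 * t - 2.0 * np.clip(t - 3, 0, None) + rng.normal(0, 1, t.size)

def rss_linear(t, y):
    X = np.column_stack([np.ones_like(t), t])
    r = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return float(r @ r)

def rss_bilinear(t, y):
    best = np.inf
    for tau in t[5:-5]:  # candidate change points away from the edges
        X = np.column_stack([np.ones_like(t), t, np.clip(t - tau, 0, None)])
        r = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        best = min(best, float(r @ r))
    return best

stat = rss_linear(t, y) - rss_bilinear(t, y)

# Parametric bootstrap of the null: refit under H0, simulate new data.
X0 = np.column_stack([np.ones_like(t), t])
b0 = np.linalg.lstsq(X0, y, rcond=None)[0]
sigma0 = np.sqrt(rss_linear(t, y) / (t.size - 2))
null = [rss_linear(t, ysim) - rss_bilinear(t, ysim)
        for ysim in (X0 @ b0 + rng.normal(0, sigma0, t.size)
                     for _ in range(200))]
print("p =", np.mean([s >= stat for s in null]))
```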

  5. Is conscious stimulus identification dependent on knowledge of the perceptual modality? Testing the "source misidentification hypothesis"

    DEFF Research Database (Denmark)

    Overgaard, Morten; Lindeløv, Jonas Kristoffer; Svejstrup, Stinna

    2013-01-01

    This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness Scale... experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy which operationally tests the hypothesis of non-equality, such that the usual rejection of the null hypothesis admits the conclusion of equivalence.

  6. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
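
    A minimal sketch of the two-group idea, with the mixture parameters fixed rather than estimated as in the paper: model z-statistics as a mixture of null and alternative densities, compute each test's posterior null probability, and reject while the running average (a Bayesian FDR) stays below the target.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
z = np.concatenate([rng.normal(0, 1, 900),   # null tests
                    rng.normal(3, 1, 100)])  # true signals

pi0, mu, sd = 0.9, 3.0, 1.0                  # assumed mixture parameters
f0 = stats.norm.pdf(z, 0, 1)
f1 = stats.norm.pdf(z, mu, sd)
post_null = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)

# Reject the tests with the smallest posterior null probability while
# the running mean of post_null (the Bayesian FDR) stays below 0.05.
order = np.argsort(post_null)
bfdr = np.cumsum(post_null[order]) / np.arange(1, z.size + 1)
print("rejections:", int(np.sum(bfdr <= 0.05)))
```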

  7. Shaping Up the Practice of Null Hypothesis Significance Testing.

    Science.gov (United States)

    Wainer, Howard; Robinson, Daniel H.

    2003-01-01

    Discusses criticisms of null hypothesis significance testing (NHST), suggesting that historical use of NHST was reasonable, and current users should read Sir Ronald Fisher's applied work. Notes that modifications to NHST and interpretations of its outcomes might better suit the needs of modern science. Concludes that NHST is most often useful as…

  8. Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis

    Science.gov (United States)

    Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David

    2017-01-01

    The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M_age = 12.6, including 541 males and 465 females) across a 4-year…

  9. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    Science.gov (United States)

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  11. Eyewitness accuracy rates in sequential and simultaneous lineup presentations: a meta-analytic comparison.

    Science.gov (United States)

    Steblay, N; Dysart, J; Fulero, S; Lindsay, R C

    2001-10-01

    Most police lineups use a simultaneous presentation technique in which eyewitnesses view all lineup members at the same time. Lindsay and Wells (R. C. L. Lindsay & G. L. Wells, 1985) devised an alternative procedure, the sequential lineup, in which witnesses view one lineup member at a time and decide whether or not that person is the perpetrator prior to viewing the next lineup member. The present work uses the technique of meta-analysis to compare the accuracy rates of these presentation styles. Twenty-three papers were located (9 published and 14 unpublished), providing 30 tests of the hypothesis and including 4,145 participants. Results showed that identification of perpetrators from target-present lineups occurs at a higher rate from simultaneous than from sequential lineups. However, this difference largely disappears when moderator variables approximating real world conditions are considered. Also, correct rejection rates were significantly higher for sequential than simultaneous lineups and this difference is maintained or increased by greater approximation to real world conditions. Implications of these findings are discussed.

  12. A more powerful test based on ratio distribution for retention noninferiority hypothesis.

    Science.gov (United States)

    Deng, Ling; Chen, Gang

    2013-03-11

    Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time-to-event endpoint. One of the major concerns in using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This makes an NI trial impractical, particularly when using a time-to-event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced if the proposed ratio test for a fraction retention NI hypothesis is used.

  13. Test of the Brink-Axel Hypothesis for the Pygmy Dipole Resonance

    Science.gov (United States)

    Martin, D.; von Neumann-Cosel, P.; Tamii, A.; Aoi, N.; Bassauer, S.; Bertulani, C. A.; Carter, J.; Donaldson, L.; Fujita, H.; Fujita, Y.; Hashimoto, T.; Hatanaka, K.; Ito, T.; Krugmann, A.; Liu, B.; Maeda, Y.; Miki, K.; Neveling, R.; Pietralla, N.; Poltoratska, I.; Ponomarev, V. Yu.; Richter, A.; Shima, T.; Yamamoto, T.; Zweidinger, M.

    2017-11-01

    The gamma strength function and level density of 1⁻ states in 96Mo have been extracted from a high-resolution study of the (p⃗,p⃗′) reaction at 295 MeV and extreme forward angles. By comparison with compound nucleus γ-decay experiments, this allows a test of the generalized Brink-Axel hypothesis in the energy region of the pygmy dipole resonance. The Brink-Axel hypothesis is commonly assumed in astrophysical reaction network calculations and states that the gamma strength function in nuclei is independent of the structure of the initial and final state. The present results validate the Brink-Axel hypothesis for 96Mo and provide independent confirmation of the methods used to separate gamma strength function and level density in γ-decay experiments.

  14. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve comparable Type I and Type II error rates as the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
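
    A hedged sketch of how an SPRT could be run on binary exceedance outcomes; the exceedance rates, error levels, and data below are illustrative only and do not reproduce California's listing parameters.

```python
# SPRT for Bernoulli data: H0: p = p0 (acceptable exceedance rate)
# versus H1: p = p1 > p0; stop as soon as the evidence is strong enough.
import math

def binomial_sprt(outcomes, p0=0.10, p1=0.25, alpha=0.05, beta=0.20):
    lower = math.log(beta / (1 - alpha))
    upper = math.log((1 - beta) / alpha)
    llr = 0.0
    for k, x in enumerate(outcomes, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return "not impaired (accept H0)", k
        if llr >= upper:
            return "impaired (accept H1)", k
    return "keep sampling", len(outcomes)

print(binomial_sprt([0, 1, 0, 1, 1, 0, 1, 1]))  # exceedances coded as 0/1
```

    Because the test can stop early in either direction, its average sample number is smaller than that of a fixed-sample binomial test with matched error rates, which is the efficiency gain the paper reports.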

  15. Correlates of androgens in wild male Barbary macaques: Testing the challenge hypothesis.

    Science.gov (United States)

    Rincon, Alan V; Maréchal, Laëtitia; Semple, Stuart; Majolo, Bonaventura; MacLarnon, Ann

    2017-10-01

    Investigating causes and consequences of variation in hormonal expression is a key focus in behavioral ecology. Many studies have explored patterns of secretion of the androgen testosterone in male vertebrates, using the challenge hypothesis (Wingfield, Hegner, Dufty, & Ball, 1990; The American Naturalist, 136(6), 829-846) as a theoretical framework. Rather than the classic association of testosterone with male sexual behavior, this hypothesis predicts that high levels of testosterone are associated with male-male reproductive competition but also inhibit paternal care. The hypothesis was originally developed for birds, and subsequently tested in other vertebrate taxa, including primates. Such studies have explored the link between testosterone and reproductive aggression as well as other measures of mating competition, or between testosterone and aspects of male behavior related to the presence of infants. Very few studies have simultaneously investigated the links between testosterone and male aggression, other aspects of mating competition and infant-related behavior. We tested predictions derived from the challenge hypothesis in wild male Barbary macaques (Macaca sylvanus), a species with marked breeding seasonality and high levels of male-infant affiliation, providing a powerful test of this theoretical framework. Over 11 months, 251 hr of behavioral observations and 296 fecal samples were collected from seven adult males in the Middle Atlas Mountains, Morocco. Fecal androgen levels rose before the onset of the mating season, during a period of rank instability, and were positively related to group mating activity across the mating season. Androgen levels were unrelated to rates of male-male aggression in any period, but higher ranked males had higher levels in both the mating season and in the period of rank instability. Lower androgen levels were associated with increased rates of male-infant grooming during the mating and unstable periods. Our results

  16. Semiparametric Power Envelopes for Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael

    This paper derives asymptotic power envelopes for tests of the unit root hypothesis in a zero-mean AR(1) model. The power envelopes are derived using the limits of experiments approach and are semiparametric in the sense that the underlying error distribution is treated as an unknown...

  17. Using Employer Hiring Behavior to Test the Educational Signaling Hypothesis

    NARCIS (Netherlands)

    Albrecht, J.W.; van Ours, J.C.

    2001-01-01

    This paper presents a test of the educational signaling hypothesis. If employers use education as a signal in the hiring process, they will rely more on education when less is otherwise known about applicants. We find that employers are more likely to lower educational standards when an informal, more

  18. Local hypothesis testing between a pure bipartite state and the white noise state

    OpenAIRE

    Owari, Masaki; Hayashi, Masahito

    2010-01-01

    In this paper, we treat a local discrimination problem in the framework of asymmetric hypothesis testing. We choose a known bipartite pure state |Ψ⟩ as the alternative hypothesis, and the completely mixed state as the null hypothesis. As a result, we analytically derive an optimal type 2 error and an optimal POVM for one-way LOCC POVM and Separable POVM. For two-way LOCC POVM, we study a family of simple three-step LOCC protocols, and show that the best protocol in this family has stric...

  19. A test of the domain-specific acculturation strategy hypothesis.

    Science.gov (United States)

    Miller, Matthew J; Yang, Minji; Lim, Robert H; Hui, Kayi; Choi, Na-Yeun; Fan, Xiaoyan; Lin, Li-Ling; Grome, Rebekah E; Farrell, Jerome A; Blackmon, Sha'kema

    2013-01-01

    Acculturation literature has evolved over the past several decades and has highlighted the dynamic ways in which individuals negotiate experiences in multiple cultural contexts. The present study extends this literature by testing M. J. Miller and R. H. Lim's (2010) domain-specific acculturation strategy hypothesis-that individuals might use different acculturation strategies (i.e., assimilated, bicultural, separated, and marginalized strategies; J. W. Berry, 2003) across behavioral and values domains-in 3 independent cluster analyses with Asian American participants. Present findings supported the domain-specific acculturation strategy hypothesis as 67% to 72% of participants from 3 independent samples using different strategies across behavioral and values domains. Consistent with theory, a number of acculturation strategy cluster group differences emerged across generational status, acculturative stress, mental health symptoms, and attitudes toward seeking professional psychological help. Study limitations and future directions for research are discussed.

  20. Testing for Marshall-Lerner hypothesis: A panel approach

    Science.gov (United States)

    Azizan, Nur Najwa; Sek, Siok Kun

    2014-12-01

    The relationship between real exchange rates and trade balances is documented in many theories. One of these is the so-called Marshall-Lerner condition. In this study, we test the validity of the Marshall-Lerner hypothesis, i.e., we examine whether the depreciation of the real exchange rate leads to an improvement in trade balances. We focus on the ASEAN-5 countries and their main trade partners, the U.S., Japan and China. The dynamic panel pooled mean group (PMG) approach is used to test the Marshall-Lerner hypothesis among the ASEAN-5, between ASEAN-5 and the U.S., between ASEAN-5 and Japan, and between ASEAN-5 and China, respectively. The estimation is based on an autoregressive distributed lag (ARDL) model for the period 1970-2012. The paper concludes that the Marshall-Lerner theory does not hold for bilateral trade in the four groups of countries. The trade balances of the ASEAN-5 are mainly determined by the domestic income level and foreign production costs.

  1. A test of the substitution-habitat hypothesis in amphibians.

    Science.gov (United States)

    Martínez-Abraín, Alejandro; Galán, Pedro

    2017-12-08

    Most examples that support the substitution-habitat hypothesis (human-made habitats act as substitutes for original habitat) deal with birds and mammals. We tested this hypothesis in 14 amphibians by using percentage occupancy as a proxy of habitat quality (i.e., higher occupancy percentages indicate higher quality). We classified water body types as original habitat (no or little human influence), substitution habitat, or refuge habitat, depending on the anatomical, behavioral, or physiological adaptations of each amphibian species. Ten species had relatively high probabilities (0.16-0.28) of occurrence in original habitat, moderate probabilities of occurrence in substitution habitats (0.11-0.14), and low probabilities of occurrence in refuge habitats (0.05-0.08). Thus, the substitution-habitat hypothesis only partially applies to amphibians because the low occupancy of refuges could be due to the negligible human persecution of this group (indicating good conservation status). However, low occupancy of refuges could also be due to low tolerance of refuge conditions, which could have led to selective extinction or colonization problems due to poor dispersal capabilities. That original habitats had the highest probabilities of occupancy suggests amphibians have a good conservation status in the region. They also appeared highly adaptable to anthropogenic substitution habitats. © 2017 Society for Conservation Biology.

  2. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  3. Chaotic annealing with hypothesis test for function optimization in noisy environments

    International Nuclear Information System (INIS)

    Pan Hui; Wang Ling; Liu Bo

    2008-01-01

    As a special mechanism to avoid being trapped in local minima, the ergodicity property of chaos has been used as a novel searching technique for optimization problems, but there has been no research on chaos for optimization in noisy environments. In this paper, the performance of chaotic annealing (CA) for uncertain function optimization is investigated, and a new hybrid approach (namely CAHT) that combines CA and hypothesis testing (HT) is proposed. In CAHT, the merits of CA are applied for thorough exploration and exploitation of the search space, and solution quality can be identified reliably by hypothesis testing, which reduces repeated searches to some extent and reasonably estimates solution performance. Simulation results and comparisons show that chaos is helpful in improving the performance of SA for uncertain function optimization, and that CAHT can further improve searching efficiency, quality and robustness.
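
    The hypothesis-test ingredient can be illustrated as follows: under evaluation noise, a candidate solution replaces the incumbent only when repeated evaluations differ significantly. The sketch below uses a Welch t-test on a made-up noisy objective; it illustrates the idea, not the CAHT algorithm itself.

```python
import random
from scipy import stats

def noisy_f(x):
    return (x - 2.0) ** 2 + random.gauss(0, 0.5)   # true optimum at x = 2

def better_with_ht(x_new, x_old, n=15, alpha=0.05):
    """Accept x_new only if its mean objective is significantly lower."""
    a = [noisy_f(x_new) for _ in range(n)]
    b = [noisy_f(x_old) for _ in range(n)]
    t = stats.ttest_ind(a, b, equal_var=False)
    return t.pvalue / 2 < alpha and sum(a) < sum(b)  # one-sided decision

random.seed(3)
print(better_with_ht(2.1, 0.0))  # True: 2.1 is credibly better than 0.0
```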

  4. Advertising investment as a tool for boosting consumption: testing Galbraith's hypothesis for Spain

    Directory of Open Access Journals (Sweden)

    Valentín-Alejandro Martínez-Fernández

    2014-12-01

    Full Text Available The recession that most of the world economies have been facing in the last years has caused a great interest in the study of its macroeconomic effects. In this context, a debate has resurged regarding the advertising investment, as for its potential capacity to impel the consumer spending and to impact positively on the economic recovery. This idea, sustained in the so-called Galbraith's hypothesis, constitutes the core of this paper, where the main objective is to test that hypothesis by means of an empirical analysis. In this study, we focus on the Spanish case and the data correspond to the period 1976 -2010. A cointegration analysis is carried out, using two different approaches (Engle-Granger test and Gregory-Hansen test, respectively, to determine if there is any relationship between the advertising investment and six macromagnitudes (GDP, National Income, Consumption, Savings and Fixed Capital Formation, as well as the registered unemployment rate. Based on the results obtained, we conclude that Galbraith's hypothesis is not fulfilled for the Spanish case.

  5. An anomaly detection and isolation scheme with instance-based learning and sequential analysis

    International Nuclear Information System (INIS)

    Yoo, T. S.; Garcia, H. E.

    2006-01-01

    This paper presents an online anomaly detection and isolation (FDI) technique using an instance-based learning method combined with a sequential change detection and isolation algorithm. The proposed method uses kernel density estimation techniques to build statistical models of the given empirical data (the null hypothesis). The null hypothesis is associated with a set of alternative hypotheses modeling the abnormalities of the system. The decision procedure involves a sequential change detection and isolation algorithm. Notably, the proposed method enjoys asymptotic optimality, as the applied change detection and isolation algorithm is optimal in minimizing the worst mean detection/isolation delay for a given mean time before a false alarm or a false isolation. The applicability of this methodology and its performance are illustrated with a redundant sensor data set. (authors)
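
    A rough sketch of the two ingredients described, on synthetic data: a kernel density estimate of healthy behaviour serves as the null model, and a CUSUM recursion accumulates the log-likelihood ratio against one assumed fault mode. The fault distribution and threshold are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
train = rng.normal(0.0, 1.0, 500)         # empirical "healthy" data
kde_null = stats.gaussian_kde(train)      # statistical model of H0

alt = stats.norm(1.5, 1.0)                # one assumed fault mode (H1)

def cusum_alarm(stream, threshold=8.0):
    s = 0.0
    for k, x in enumerate(stream, 1):
        llr = np.log(alt.pdf(x)) - np.log(kde_null.pdf(x)[0])
        s = max(0.0, s + llr)             # CUSUM recursion
        if s > threshold:
            return k                      # alarm time
    return None

stream = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 50)])
print(cusum_alarm(stream))                # alarms shortly after t = 200
```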

  6. An algorithm for testing the efficient market hypothesis.

    Directory of Open Access Journals (Sweden)

    Ioana-Andreea Boboc

    Full Text Available The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA, Moving Average Convergence Divergence (MACD, Relative Strength Index (RSI and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH.

  7. An algorithm for testing the efficient market hypothesis.

    Science.gov (United States)

    Boboc, Ioana-Andreea; Dinică, Mihai-Cristian

    2013-01-01

    The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH).

  8. Testing the Efficient Markets Hypothesis on the Romanian Capital Market

    Directory of Open Access Journals (Sweden)

    Dragoș Mînjină

    2013-11-01

    Full Text Available Informational efficiency of capital markets has been the subject of numerous empirical studies. Intensive research in the field is justified by the important implications that knowledge of the level of informational efficiency has for financial practice. Empirical studies that have tested the efficient markets hypothesis on the Romanian capital market mostly revealed that this market is not characterised by the weak form of the efficient markets hypothesis. However, recent empirical studies have obtained results supporting the weak form of the efficient markets hypothesis. The present decline period of the Romanian capital market, recorded against the background of adverse economic developments internally and externally, will be an important test of whether the recent positive developments at the level of informational efficiency continue.

  9. The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.

    Science.gov (United States)

    Luster, Tom; And Others

    1989-01-01

    Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize the supportive function of parenting and that parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)

  10. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis

    Science.gov (United States)

    Snyder, Rebecca J.; Perdue, Bonnie M.; Zhang, Zhihe; Maple, Terry L.; Charlton, Benjamin D.

    2016-01-01

    The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species’ highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life. PMID:27272352

  11. The potential for increased power from combining P-values testing the same hypothesis.

    Science.gov (United States)

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
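
    Fisher's combination test mentioned above is available in SciPy, and the minimum-p rule can be turned into a combined p-value under an independence assumption. A small illustrative sketch (the paper itself uses randomization-based tests, which this does not reproduce):

```python
# Sketch: combining several p-values that test the same null hypothesis.
# Fisher's method via SciPy; the min-p rule below assumes independent tests
# (Sidak-style), purely for illustration.
from scipy.stats import combine_pvalues

pvals = [0.08, 0.12, 0.03]                       # hypothetical per-statistic p-values

stat, p_fisher = combine_pvalues(pvals, method="fisher")
p_minp = 1.0 - (1.0 - min(pvals)) ** len(pvals)  # min-p under independence

print(f"Fisher combined p = {p_fisher:.4f}, min-p combined p = {p_minp:.4f}")
```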

  12. Bursts and heavy tails in temporal and sequential dynamics of foraging decisions.

    Directory of Open Access Journals (Sweden)

    Kanghoon Jung

    2014-08-01

    Full Text Available A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices.
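
    A toy simulation of the dual-state idea, with hypothetical parameters, shows how alternating fast active bouts and long inactive periods produce a bursty, heavy-tailed inter-event distribution; the Goh-Barabási burstiness index is used here only as a convenient summary and is not part of the authors' model:

```python
# Toy simulation of a dual-state (active/inactive) process: short bursts of
# rapid events separated by long pauses yield heavy-tailed inter-event times.
# All parameters are hypothetical, chosen only to illustrate burstiness.
import numpy as np

rng = np.random.default_rng(1)
intervals = []
for _ in range(500):                       # alternate bursts and pauses
    n_burst = rng.geometric(0.2)           # events per active bout
    intervals.extend(rng.exponential(1.0, n_burst))  # fast within-burst gaps
    intervals.append(rng.exponential(200.0))         # long inactive period

intervals = np.array(intervals)
# Burstiness index of Goh & Barabasi: 0 for a Poisson train, -> 1 when bursty
b = (intervals.std() - intervals.mean()) / (intervals.std() + intervals.mean())
print(f"burstiness B = {b:.2f}")
```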

  13. Bursts and Heavy Tails in Temporal and Sequential Dynamics of Foraging Decisions

    Science.gov (United States)

    Jung, Kanghoon; Jang, Hyeran; Kralik, Jerald D.; Jeong, Jaeseung

    2014-01-01

    A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices. PMID:25122498

  14. Testing the activitystat hypothesis: a randomised controlled trial protocol.

    Science.gov (United States)

    Gomersall, Sjaan; Maher, Carol; Norton, Kevin; Dollman, Jim; Tomkinson, Grant; Esterman, Adrian; English, Coralie; Lewis, Nicole; Olds, Tim

    2012-10-08

    The activitystat hypothesis proposes that when physical activity or energy expenditure is increased or decreased in one domain, there will be a compensatory change in another domain to maintain an overall, stable level of physical activity or energy expenditure. To date, there has been no experimental study primarily designed to test the activitystat hypothesis in adults. The aim of this trial is to determine the effect of two different imposed exercise loads on total daily energy expenditure and physical activity levels. This study will be a randomised, multi-arm, parallel controlled trial. Insufficiently active adults (as determined by the Active Australia survey) aged 18-60 years will be recruited for this study (n=146). Participants must also satisfy the Sports Medicine Australia Pre-Exercise Screening System and must weigh less than 150 kg. Participants will be randomly assigned to one of three groups using a computer-generated allocation sequence. Participants in the Moderate exercise group will receive an additional 150 minutes of moderate to vigorous physical activity per week for six weeks, and those in the Extensive exercise group will receive an additional 300 minutes of moderate to vigorous physical activity per week for six weeks. Exercise targets will be accumulated through both group and individual exercise sessions monitored by heart rate telemetry. Control participants will not be given any instructions regarding lifestyle. The primary outcome measures are activity energy expenditure (doubly labeled water) and physical activity (accelerometry). Secondary measures will include resting metabolic rate via indirect calorimetry, use of time, maximal oxygen consumption and several anthropometric and physiological measures. Outcome measures will be conducted at baseline (zero weeks) and mid- and end-intervention (three and six weeks), with three-month (12 weeks) and six-month (24 weeks) follow-ups. All assessors will be blinded to group allocation. This protocol

  15. Testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form

    DEFF Research Database (Denmark)

    Péguin-Feissolle, Anne; Strikholm, Birgit; Teräsvirta, Timo

    In this paper we propose a general method for testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form. These tests are based on a Taylor expansion of the nonlinear model around a given point in the sample space. We study the performance of our tests b...

  16. The subtyping of primary aldosteronism by adrenal vein sampling: sequential blood sampling causes factitious lateralization.

    Science.gov (United States)

    Rossitto, Giacomo; Battistel, Michele; Barbiero, Giulio; Bisogni, Valeria; Maiolino, Giuseppe; Miotto, Diego; Seccia, Teresa M; Rossi, Gian Paolo

    2018-02-01

    The pulsatile secretion of adrenocortical hormones and the stress reaction occurring when starting adrenal vein sampling (AVS) can affect the selectivity and also the assessment of lateralization when sequential blood sampling is used. We therefore tested the hypothesis that a simulated sequential blood sampling could decrease the diagnostic accuracy of the lateralization index for identification of aldosterone-producing adenoma (APA), as compared with bilaterally simultaneous AVS. In 138 consecutive patients who underwent subtyping of primary aldosteronism, we compared the results obtained simultaneously bilaterally when starting AVS (t-15) and 15 min after (t0) with those gained with a simulated sequential right-to-left AVS technique (R ⇒ L), created by combining hormonal values obtained at t-15 and at t0. The concordance between simultaneously obtained values at t-15 and t0, and between simultaneously obtained values and values gained with the sequential R ⇒ L technique, was also assessed. We found a marked interindividual variability of lateralization index values in the patients with bilaterally selective AVS at both time points. However, overall the lateralization index simultaneously determined at t0 provided a more accurate identification of APA than the simulated sequential lateralization index R ⇒ L (P = 0.001). Moreover, regardless of which side was sampled first, the sequential AVS technique induced a sequence-dependent overestimation of the lateralization index. While in APA patients the concordance between simultaneous AVS at t0 and t-15, and between the simultaneous t0 and sequential techniques, was moderate-to-good (K = 0.55 and 0.66, respectively), in non-APA patients it was poor (K = 0.12 and 0.13, respectively). Sequential AVS generates factitious between-sides gradients, which lower its diagnostic accuracy, likely because of the stress reaction arising upon starting AVS.
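
    For readers unfamiliar with AVS terminology, the lateralization index is conventionally the cortisol-corrected aldosterone ratio on the dominant side divided by that on the contralateral side; a sketch with hypothetical values (the exact convention used in the study may differ):

```python
# Sketch of the lateralization index (LI) commonly used in adrenal vein
# sampling: ratio of cortisol-corrected aldosterone, dominant vs. contralateral.
# All input values below are hypothetical.
def lateralization_index(aldo_r, cort_r, aldo_l, cort_l):
    ratio_r, ratio_l = aldo_r / cort_r, aldo_l / cort_l
    dominant, contralateral = max(ratio_r, ratio_l), min(ratio_r, ratio_l)
    return dominant / contralateral

print(f"LI = {lateralization_index(5200, 310, 480, 290):.1f}")
```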

  17. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
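
    The contrast between frequentist confidence intervals and Bayesian credible intervals can be made concrete for a simple proportion; a sketch assuming SciPy and an illustrative uniform prior (not an analysis from the article):

```python
# Sketch: frequentist confidence interval vs. Bayesian credible interval for a
# proportion, in the estimation-focused spirit of the "New Statistics".
# The uniform Beta(1, 1) prior is an illustrative assumption.
from scipy import stats

successes, n = 14, 20

# Frequentist: normal-approximation 95% confidence interval
p_hat = successes / n
se = (p_hat * (1 - p_hat) / n) ** 0.5
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: 95% equal-tailed credible interval from the Beta posterior
posterior = stats.beta(1 + successes, 1 + n - successes)
cri = posterior.ppf([0.025, 0.975])

print(f"95% CI  = ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"95% CrI = ({cri[0]:.3f}, {cri[1]:.3f})")
```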

  18. Improving the space surveillance telescope's performance using multi-hypothesis testing

    Energy Technology Data Exchange (ETDEWEB)

    Chris Zingarelli, J.; Cain, Stephen [Air Force Institute of Technology, 2950 Hobson Way, Bldg 641, Wright Patterson AFB, OH 45433 (United States); Pearce, Eric; Lambour, Richard [Lincoln Laboratory, Massachusetts Institute of Technology, 244 Wood Street, Lexington, MA 02421 (United States); Blake, Travis [Defense Advanced Research Projects Agency, 675 North Randolph Street, Arlington, VA 22203 (United States); Peterson, Curtis J. R., E-mail: John.Zingarelli@afit.edu [United States Air Force, 1690 Air Force Pentagon, Washington, DC 20330 (United States)

    2014-05-01

    The Space Surveillance Telescope (SST) is a Defense Advanced Research Projects Agency program designed to detect objects in space, like near-Earth asteroids and space debris in the geosynchronous Earth orbit (GEO) belt. Binary hypothesis test (BHT) methods have historically been used to facilitate the detection of new objects in space. In this paper a multi-hypothesis detection strategy is introduced to improve the detection performance of SST. In this context, the multi-hypothesis testing (MHT) determines whether an unresolvable point source is in the center, a corner, or a side of a pixel, in contrast to BHT, which only tests whether an object is in the pixel or not. The images recorded by SST are undersampled enough to cause aliasing, which degrades the performance of traditional detection schemes. The equations for the MHT are derived in terms of signal-to-noise ratio (S/N), which is computed by subtracting the background light level around the pixel being tested and dividing by the standard deviation of the noise. A new method for determining the local noise statistics that rejects outliers is introduced in combination with the MHT. An experiment using observations of a known GEO satellite is used to demonstrate the improved detection performance of the new algorithm over algorithms previously reported in the literature. The results show a significant improvement in the probability of detection, by as much as 50%, over existing algorithms. In addition to detection, the S/N results prove to be linearly related to the least-squares estimates of point source irradiance, thus improving photometric accuracy.
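
    The S/N statistic described above, background-subtracted and normalized by an outlier-rejecting local noise estimate, can be sketched as follows; the window size and clipping threshold are illustrative assumptions, not the authors' settings:

```python
# Sketch of a background-subtracted S/N with a robust local noise estimate
# (iterative sigma clipping rejects bright outliers such as nearby stars).
import numpy as np

def local_snr(image, row, col, half=8, clip=3.0):
    window = image[row - half:row + half + 1, col - half:col + half + 1].ravel()
    bg = window.copy()
    for _ in range(3):                          # iterative sigma clipping
        mu, sigma = bg.mean(), bg.std()
        bg = bg[np.abs(bg - mu) < clip * sigma] # reject outliers
    return (image[row, col] - bg.mean()) / bg.std()

frame = np.random.default_rng(2).poisson(100, (64, 64)).astype(float)
frame[32, 32] += 120.0                          # implanted point source
print(f"S/N at target pixel: {local_snr(frame, 32, 32):.1f}")
```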

  19. Congruence analysis of geodetic networks - hypothesis tests versus model selection by information criteria

    Science.gov (United States)

    Lehmann, Rüdiger; Lösler, Michael

    2017-12-01

    Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria like AIC.
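
    A minimal sketch of AIC-based model selection between a "no deformation" null model and a single-offset alternative, on synthetic data (illustrative only, not the paper's networks):

```python
# Sketch: model selection by AIC as an alternative to hypothesis testing.
# Two hypothetical models for levelling-style data: constant mean ("no
# deformation") vs. a step change ("single offset"). Lower AIC wins.
import numpy as np

def aic(residuals, k):
    n = len(residuals)
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + 2 * k          # Gaussian log-likelihood form

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(0.8, 1.0, 30)])

res_null = y - y.mean()                                   # 1 parameter
res_alt = np.concatenate([y[:30] - y[:30].mean(),
                          y[30:] - y[30:].mean()])        # 2 parameters

print(f"AIC(null) = {aic(res_null, 1):.1f}, AIC(alt) = {aic(res_alt, 2):.1f}")
```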

  20. The effect of sequential dual-gas testing on laser-induced breakdown spectroscopy-based discrimination: Application to brass samples and bacterial strains

    International Nuclear Information System (INIS)

    Rehse, Steven J.; Mohaidat, Qassem I.

    2009-01-01

    Four Cu-Zn brass alloys with different stoichiometries and compositions have been analyzed by laser-induced breakdown spectroscopy (LIBS) using nanosecond laser pulses. The intensities of 15 emission lines of copper, zinc, lead, carbon, and aluminum (as well as the environmental contaminants sodium and calcium) were normalized and analyzed with a discriminant function analysis (DFA) to rapidly categorize the samples by alloy. The alloys were tested sequentially in two different noble gases (argon and helium) to enhance discrimination between them. When emission intensities from samples tested sequentially in both gases were combined to form a single 30-spectral line 'fingerprint' of the alloy, an overall 100% correct identification was achieved. This was a modest improvement over using emission intensities acquired in argon gas alone. A similar study was performed to demonstrate an enhanced discrimination between two strains of Escherichia coli (a Gram-negative bacterium) and a Gram-positive bacterium. When emission intensities from bacteria sequentially ablated in two different gas environments were combined, the DFA achieved a 100% categorization accuracy. This result showed the benefit of sequentially testing highly similar samples in two different ambient gases to enhance discrimination between the samples.

  1. Testing the fire-sale FDI hypothesis for the European financial crisis

    NARCIS (Netherlands)

    Weitzel, G.U.; Kling, G.; Gerritsen, D.

    2014-01-01

    Using a panel of corporate transactions in 27 EU countries from 1999 to 2012, we investigate the impact of the financial crisis on the market for corporate assets. In particular, we test the ‘fire-sale FDI’ hypothesis by analyzing the number of cross-border transactions, the price of corporate

  2. Testing the Fire-Sale FDI Hypothesis for the European Financial Crisis

    NARCIS (Netherlands)

    Kling, G.; Gerritsen, Dirk; Weitzel, Gustav Utz

    2014-01-01

    Using a panel of corporate transactions in 27 EU countries from 1999 to 2012, we investigate the impact of the financial crisis on the market for corporate assets. In particular, we test the ‘fire-sale FDI’ hypothesis by analyzing the number of cross-border transactions, the price of corporate

  3. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    BACKGROUND: Group-sequential testing is widely used in pivotal therapeutic, but rarely in diagnostic research, although it may save studies, time, and costs. The purpose of this paper was to demonstrate a group-sequential analysis strategy in an intra-observer study on quantitative FDG-PET/CT measurements.

  4. [Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].

    Science.gov (United States)

    Simmer, H H

    1980-07-01

    Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis the nerve reflex starts in the ovary with an increase of the intraovarian pressure caused by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women, by bimanual compression from the outside and through the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prognosis derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.

  5. Testing the implicit processing hypothesis of precognitive dream experience.

    Science.gov (United States)

    Valášek, Milan; Watt, Caroline; Hutton, Jenny; Neill, Rebecca; Nuttall, Rachel; Renwick, Grace

    2014-08-01

    Seemingly precognitive (prophetic) dreams may be a result of one's unconscious processing of environmental cues and having an implicit inference based on these cues manifest itself in one's dreams. We present two studies exploring this implicit processing hypothesis of precognitive dream experience. Study 1 investigated the relationship between implicit learning, transliminality, and precognitive dream belief and experience. Participants completed the Serial Reaction Time task and several questionnaires. We predicted a positive relationship between the variables. With the exception of relationships between transliminality and precognitive dream belief and experience, this prediction was not supported. Study 2 tested the hypothesis that differences in the ability to notice subtle cues explicitly might account for precognitive dream beliefs and experiences. Participants completed a modified version of the flicker paradigm. We predicted a negative relationship between the ability to explicitly detect changes and precognitive dream variables. This relationship was not found. There was also no relationship between precognitive dream belief and experience and implicit change detection. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Mental Abilities and School Achievement: A Test of a Mediation Hypothesis

    Science.gov (United States)

    Vock, Miriam; Preckel, Franzis; Holling, Heinz

    2011-01-01

    This study analyzes the interplay of four cognitive abilities--reasoning, divergent thinking, mental speed, and short-term memory--and their impact on academic achievement in school in a sample of adolescents in grades seven to 10 (N = 1135). Based on information processing approaches to intelligence, we tested a mediation hypothesis, which states…

  7. The Need for Nuance in the Null Hypothesis Significance Testing Debate

    Science.gov (United States)

    Häggström, Olle

    2017-01-01

    Null hypothesis significance testing (NHST) provides an important statistical toolbox, but there are a number of ways in which it is often abused and misinterpreted, with bad consequences for the reliability and progress of science. Parts of the contemporary NHST debate, especially in the psychological sciences, are reviewed, and a suggestion is made…

  8. Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik [Korea Institute of Machinery and Materials, Daejeon (Korea, Republic of)

    2015-12-15

    A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles.

  9. Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test

    International Nuclear Information System (INIS)

    Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik

    2015-01-01

    A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles.

  10. Testing hypotheses and the advancement of science: recent attempts to falsify the equilibrium point hypothesis.

    Science.gov (United States)

    Feldman, Anatol G; Latash, Mark L

    2005-02-01

    Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.

  11. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

    This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standards of ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standards ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, by suggesting the use of the hypergeometric distribution to calculate the parameters of sampling plans, avoiding the unnecessary use of approximations such as the binomial or Poisson distributions. We show that, under usual conditions, discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing from NP can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise, the same question arises with consumer risk, which is necessarily associated with type II error. The resolution of these questions is new to the literature. The
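
    The hypergeometric computation advocated above is straightforward with SciPy; a sketch comparing the exact acceptance probability with the binomial approximation for a hypothetical single-sampling plan (the plan parameters are assumptions, not the paper's):

```python
# Sketch: probability of accepting a lot under a single-sampling attribute
# plan, computed exactly (hypergeometric) vs. the binomial approximation.
from scipy.stats import hypergeom, binom

N, n, c = 500, 50, 2        # lot size, sample size, acceptance number (hypothetical)
defectives = 20             # defectives actually in the lot

p_hyper = hypergeom.cdf(c, N, defectives, n)   # exact: P(X <= c)
p_binom = binom.cdf(c, n, defectives / N)      # approximation

print(f"P(accept): hypergeometric = {p_hyper:.4f}, binomial = {p_binom:.4f}")
```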

  12. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Directory of Open Access Journals (Sweden)

    Prokop Pavol

    2016-06-01

    Full Text Available Rape is a recurrent adaptive problem of female humans and females of a number of non-human animals. Rape has various physiological and reproductive costs to the victim. The costs of rape are further exaggerated by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially in long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than did men in a committed relationship (Hypothesis 3), suggesting that mating opportunities mediate men’s perception of victims of rape. Overall, our results suggest that the risk of cuckoldry, rather than the fear of disease transmission, underlies the negative perception of victims of rape by men.

  13. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. Then we applied a decision-making model, the sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
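
    A generic sketch of Wald's SPRT with the standard boundary approximations log((1-β)/α) and log(β/(1-α)); the Gaussian class-conditional models below are illustrative assumptions, not the power projective features used in the study:

```python
# Sketch of Wald's sequential probability ratio test (SPRT) for a binary
# decision. Error targets and class models are illustrative assumptions.
import numpy as np
from scipy.stats import norm

alpha, beta = 0.05, 0.05                    # target error rates
upper = np.log((1 - beta) / alpha)          # accept-H1 boundary
lower = np.log(beta / (1 - alpha))          # accept-H0 boundary

h0, h1 = norm(0.0, 1.0), norm(1.0, 1.0)     # class-conditional feature models

def sprt(samples):
    llr = 0.0
    for t, x in enumerate(samples, start=1):
        llr += h1.logpdf(x) - h0.logpdf(x)  # accumulate evidence over time
        if llr >= upper:
            return "H1", t
        if llr <= lower:
            return "H0", t
    return "undecided", len(samples)

rng = np.random.default_rng(4)
print(sprt(rng.normal(1.0, 1.0, 100)))      # data generated under H1
```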

  14. Test-potentiated learning: three independent replications, a disconfirmed hypothesis, and an unexpected boundary condition.

    Science.gov (United States)

    Wissman, Kathryn T; Rawson, Katherine A

    2018-04-01

    Arnold and McDermott [(2013). Test-potentiated learning: Distinguishing between direct and indirect effects of testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39, 940-945] isolated the indirect effects of testing and concluded that encoding is enhanced to a greater extent following more versus fewer practice tests, referred to as test-potentiated learning. The current research provided further evidence for test-potentiated learning and evaluated the covert retrieval hypothesis as an alternative explanation for the observed effect. Learners initially studied foreign language word pairs and then completed either one or five practice tests before restudy occurred. Results of greatest interest concern performance on test trials following restudy for items that were not correctly recalled on the test trials that preceded restudy. Results replicate Arnold and McDermott (2013) by demonstrating that more versus fewer tests potentiate learning when trial time is limited. Results also provide strong evidence against the covert retrieval hypothesis concerning why the effect occurs (i.e., it does not reflect differential covert retrieval during pre-restudy trials). In addition, outcomes indicate that the magnitude of the test-potentiated learning effect decreases as trial length increases, revealing an unexpected boundary condition to test-potentiated learning.

  15. Picture-Perfect Is Not Perfect for Metamemory: Testing the Perceptual Fluency Hypothesis with Degraded Images

    Science.gov (United States)

    Besken, Miri

    2016-01-01

    The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…

  16. Why is muscularity sexy? Tests of the fitness indicator hypothesis.

    Science.gov (United States)

    Frederick, David A; Haselton, Martie G

    2007-08-01

    Evolutionary scientists propose that exaggerated secondary sexual characteristics are cues of genes that increase offspring viability or reproductive success. In six studies the hypothesis that muscularity is one such cue is tested. As predicted, women rate muscular men as sexier, more physically dominant and volatile, and less committed to their mates than nonmuscular men. Consistent with the inverted-U hypothesis of masculine traits, men with moderate muscularity are rated most attractive. Consistent with past research on fitness cues, across two measures, women indicate that their most recent short-term sex partners were more muscular than their other sex partners (ds = .36, .47). Across three studies, when controlling for other characteristics (e.g., body fat), muscular men rate their bodies as sexier to women (partial rs = .49-.62) and report more lifetime sex partners (partial rs = .20-.27), short-term partners (partial rs = .25-.28), and more affairs with mated women (partial r = .28).

  17. Statistical hypothesis tests of some micrometeorological observations

    International Nuclear Information System (INIS)

    SethuRaman, S.; Tichler, J.

    1977-01-01

    Chi-square goodness-of-fit is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values: events with |g1| < 0.43 were approximately normal. Intermittency associated with the formation and breaking of internal gravity waves in surface-based inversions over water is thought to be the reason for the non-normality.
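
    A sketch of the ingredients described above, a chi-square goodness-of-fit comparison against a fitted normal plus the sample skewness and excess coefficients, on synthetic data (the bin choice is an illustrative assumption):

```python
# Sketch: chi-square goodness-of-fit against a fitted normal, plus sample
# skewness (g1) and excess kurtosis (g2). Synthetic data, illustrative binning.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = rng.normal(size=500)

# Compare observed bin counts with those expected under the fitted normal
edges = np.linspace(x.min(), x.max(), 11)
observed, _ = np.histogram(x, bins=edges)
cdf = stats.norm(x.mean(), x.std()).cdf(edges)
expected = len(x) * np.diff(cdf)
chi2 = float(np.sum((observed - expected) ** 2 / expected))

print(f"chi-square = {chi2:.1f}")
print(f"skewness g1 = {stats.skew(x):.3f}, excess g2 = {stats.kurtosis(x):.3f}")
```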

  18. Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks

    Science.gov (United States)

    Tarighati, Alla; Gross, James; Jalden, Joakim

    2017-09-01

    We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.

  19. A Highest Order Hypothesis Compatibility Test for Monocular SLAM

    Directory of Open Access Journals (Sweden)

    Edmundo Guerra

    2013-08-01

    Full Text Available Simultaneous Localization and Mapping (SLAM) is a key problem to solve in order to build truly autonomous mobile robots. SLAM with a unique camera, or monocular SLAM, is probably one of the most complex SLAM variants, based entirely on a bearing-only sensor working over six DOF. The monocular SLAM method developed in this work is based on the Delayed Inverse-Depth (DI-D) Feature Initialization, with the contribution of a new data association batch validation technique, the Highest Order Hypothesis Compatibility Test (HOHCT). The Delayed Inverse-Depth technique is used to initialize new features in the system and defines a single hypothesis for the initial depth of features with the use of a stochastic technique of triangulation. The introduced HOHCT method is based on the evaluation of statistically compatible hypotheses and a search algorithm designed to exploit the strengths of the Delayed Inverse-Depth technique to achieve good performance results. This work presents the HOHCT with a detailed formulation of the monocular DI-D SLAM problem. The performance of the proposed HOHCT is validated with experimental results, in both indoor and outdoor environments, while its costs are compared with other popular approaches.

  20. Planned Hypothesis Tests Are Not Necessarily Exempt From Multiplicity Adjustment

    Directory of Open Access Journals (Sweden)

    Andrew V. Frane

    2015-10-01

    Full Text Available Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are inherently unnecessary if the tests were “planned” (i.e., if the hypotheses were specified before the study began. This longstanding misconception continues to be perpetuated in textbooks and continues to be cited in journal articles to justify disregard for Type I error inflation. I critically evaluate this myth and examine its rationales and variations. To emphasize the myth’s prevalence and relevance in current research practice, I provide examples from popular textbooks and from recent literature. I also make recommendations for improving research practice and pedagogy regarding this problem and regarding multiple testing in general.

  1. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  2. Adaptive x-ray threat detection using sequential hypotheses testing with fan-beam experimental data (Conference Presentation)

    Science.gov (United States)

    Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.

    2017-05-01

    We employ an adaptive measurement system, based on a sequential hypotheses testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray experimental testbed system. This testbed employs a 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy-sensitive X-ray projection data. Using this testbed system, we acquire multiple-view projection data for 200 bags. We consider an adaptive measurement design where the X-ray projection measurements are acquired in a sequential manner and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline that corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared with a system that relies on static measurements.

  3. Graphic tests of Easterlin's hypothesis: science or art?

    Science.gov (United States)

    Rutten, A; Higgs, R

    1984-01-01

    Richard Easterlin believes that the postwar fertility cycle is uniquely consistent with the hypothesis of his relative income model of fertility, yet a closer examination of his evidence shows that the case for the relative income explanation is much weaker than initially appears. Easterlin finds the postwar baby boom a transparent event. Couples who entered the labor market in the postwar period had very low material aspirations. Having grown up during the Great Depression and World War II, they were content with a modest level of living. Their labor market experience was very good. Tight restrictions on immigration kept aliens from coming in to fill the gap. Thus the members of this generation occupied an unprecedented position. They could easily meet and even exceed their expectations. This high level of relative income meant that they could have more of everything they wanted, including children. For the children born during the baby boom, all this was reversed, and hence the seeds of the baby bust were sown. To test this hypothesis, Easterlin compared the movements of relative income and fertility over the postwar years using a graph. 4 published versions of the graph are presented. The graph shows that relative income and fertility did move together over the cycle, apparently very closely. Easterlin's measure of fertility is the total fertility rate (TFR). There is no such direct measure of relative income. Easterlin develops 2 proxies based on changing economic conditions believed to shape the level of material aspirations. His preferred measure, labeled R or income in his graph, relates the income experience of young couples in the years previous to marriage to that of their parents in the years before the young people left home. Because the available data limit construction of this index to the years after 1956, another measure, labeled Re or employment in Easterlin's graphs, is constructed for the pre-1956 period. This measure relates the average of

  4. Praise the Bridge that Carries You Over: Testing the Flattery Citation Hypothesis

    DEFF Research Database (Denmark)

    Frandsen, Tove Faber; Nicolaisen, Jeppe

    2011-01-01

    analysis of the editorial board members entering American Economic Review from 1984 to 2004 using a citation window of 11 years. In order to test the flattery citation hypothesis further we have conducted a study applying the difference-in-difference estimator. We analyse the number of times the editors...

  5. Sequential testing scheme for the assessment of the side-effects of plant protection products on the predatory bug Orius laevigatus

    NARCIS (Netherlands)

    Veire, Van de M.; Sterk, G.; Staaij, van der M.; Ramakers, P.M.J.; Tirry, L.

    2002-01-01

    This paper describes a number of test methods, to be used in a sequential scheme, for testing the side-effects of plant protection products on anthocorid bugs. Orius laevigatus was used as test species. A 'worst case' laboratory method was developed for evaluating the effect on mortality of the

  6. Speech production in people who stutter: Testing the motor plan assembly hypothesis

    NARCIS (Netherlands)

    Lieshout, P.H.H.M. van; Hulstijn, W.; Peters, H.F.M.

    1996-01-01

    The main purpose of the present study was to test the hypothesis that persons who stutter, when compared to persons who do not stutter, are less able to assemble abstract motor plans for short verbal responses. Subjects were adult males who stutter and age- and sex-matched control speakers, who were

  7. Using modern human cortical bone distribution to test the systemic robusticity hypothesis.

    Science.gov (United States)

    Baab, Karen L; Copes, Lynn E; Ward, Devin L; Wells, Nora; Grine, Frederick E

    2018-06-01

    The systemic robusticity hypothesis links the thickness of cortical bone in both the cranium and limb bones. This hypothesis posits that thick cortical bone is in part a systemic response to circulating hormones, such as growth hormone and thyroid hormone, possibly related to physical activity or cold climates. Although this hypothesis has gained popular traction, only rarely has robusticity of the cranium and postcranial skeleton been considered jointly. We acquired computed tomographic scans from associated crania, femora and humeri from single individuals representing 11 populations in Africa and North America (n = 228). Cortical thickness in the parietal, frontal and occipital bones and cortical bone area in limb bone diaphyses were analyzed using correlation, multiple regression and general linear models to test the hypothesis. Absolute thickness values from the crania were not correlated with cortical bone area of the femur or humerus, which is at odds with the systemic robusticity hypothesis. However, measures of cortical bone scaled by total vault thickness and limb cross-sectional area were positively correlated between the cranium and postcranium. When accounting for a range of potential confounding variables, including sex, age and body mass, variation in relative postcranial cortical bone area explained ∼20% of variation in the proportion of cortical cranial bone thickness. While these findings provide limited support for the systemic robusticity hypothesis, cranial cortical thickness did not track climate or physical activity across populations. Thus, some of the variation in cranial cortical bone thickness in modern humans is attributable to systemic effects, but the driving force behind this effect remains obscure. Moreover, neither absolute nor proportional measures of cranial cortical bone thickness are positively correlated with total cranial bone thickness, complicating the extrapolation of these findings to extinct species where only cranial

  8. Aging and motor variability: a test of the neural noise hypothesis.

    Science.gov (United States)

    Sosnoff, Jacob J; Newell, Karl M

    2011-07-01

    Experimental tests of the neural noise hypothesis of aging, which holds that aging-related increments in motor variability are due to increases in white noise in the perceptual-motor system, were conducted. Young (20-29 years old) and old (60-69 and 70-79 years old) adults performed several perceptual-motor tasks. Older adults were progressively more variable in their performance outcome, but there was no age-related difference in white noise in the motor output. Older adults had a greater frequency-dependent structure in their motor variability that was associated with performance decrements. The findings challenge the main tenet of the neural noise hypothesis of aging in that the increased variability of older adults was due to a decreased ability to adapt to the constraints of the task rather than an increment of neural noise per se.

  9. Balassa-Samuelson Hypothesis: A Test Of Turkish Economy By ARDL Bound Testing Approach

    Directory of Open Access Journals (Sweden)

    Utku ALTUNÖZ

    2014-06-01

    Full Text Available The Balassa-Samuelson effect, introduced by Béla Balassa (1964) and Paul Samuelson (1964), has been a popular theme in recent years. This concept suggests that a differentiation at the international level between the relative rates of productivity of the tradable and non-tradable sectors may cause structural and permanent deviations from purchasing power parity. In this essay, the related variables are tested for the Balassa-Samuelson effect in the context of the Turkish and European economies. The choice of econometric technique used to estimate the model was important because the regressors in the model appeared to be a mixture of I(0) and I(1) processes; thus the ARDL bounds testing approach to cointegration analysis was used in estimating the long-run determinants of the real exchange rates. Given the dataset and econometric techniques used, the results do not support the B-S hypothesis.

  10. [A test of the focusing hypothesis for category judgment: an explanation using the mental-box model].

    Science.gov (United States)

    Hatori, Tsuyoshi; Takemura, Kazuhisa; Fujii, Satoshi; Ideno, Takashi

    2011-06-01

    This paper presents a new model of category judgment. The model hypothesizes that, when more attention is focused on a category, the psychological range of the category gets narrower (category-focusing hypothesis). We explain this hypothesis by using the metaphor of a "mental-box" model: the more attention that is focused on a mental box (i.e., a category set), the smaller the size of the box becomes (i.e., a cardinal number of the category set). The hypothesis was tested in an experiment (N = 40), where the focus of attention on prescribed verbal categories was manipulated. The obtained data gave support to the hypothesis: category-focusing effects were found in three experimental tasks (regarding the category of "food", "height", and "income"). The validity of the hypothesis was discussed based on the results.

  11. Bayesian Hypothesis Testing for Psychologists: A Tutorial on the Savage-Dickey Method

    Science.gov (United States)

    Wagenmakers, Eric-Jan; Lodewyckx, Tom; Kuriyal, Himanshu; Grasman, Raoul

    2010-01-01

    In the field of cognitive psychology, the "p"-value hypothesis test has established a stranglehold on statistical reporting. This is unfortunate, as the "p"-value provides at best a rough estimate of the evidence that the data provide for the presence of an experimental effect. An alternative and arguably more appropriate measure of evidence is…

  12. A Bayesian sequential design using alpha spending function to control type I error.

    Science.gov (United States)

    Zhu, Han; Yu, Qingzhao

    2017-10-01

    We propose in this article a Bayesian sequential design using alpha spending functions to control the overall type I error in phase III clinical trials. We provide algorithms to calculate critical values, power, and sample sizes for the proposed design. Sensitivity analysis is implemented to check the effects from different prior distributions, and conservative priors are recommended. We compare the power and actual sample sizes of the proposed Bayesian sequential design with different alpha spending functions through simulations. We also compare the power of the proposed method with a frequentist sequential design using the same alpha spending function. Simulations show that, at the same sample size, the proposed method provides larger power than the corresponding frequentist sequential design. It also has larger power than the traditional Bayesian sequential design, which sets equal critical values for all interim analyses. When compared with other alpha spending functions, the O'Brien-Fleming alpha spending function has the largest power and is the most conservative, in the sense that at the same sample size the null hypothesis is the least likely to be rejected at an early stage of the trial. Finally, we show that adding a futility-stopping step to the Bayesian sequential design can reduce the overall type I error and the actual sample sizes.
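
    The O'Brien-Fleming-type spending function referred to above can be written in its Lan-DeMets form; a sketch showing how little alpha it spends early (the look schedule below is an illustrative assumption):

```python
# Sketch: O'Brien-Fleming-type alpha spending function (Lan-DeMets form),
# which spends very little type I error early and most of it near the end.
from scipy.stats import norm

def obf_spending(t, alpha=0.05):
    """Cumulative type I error spent by information fraction t (0 < t <= 1)."""
    return 2.0 * (1.0 - norm.cdf(norm.ppf(1.0 - alpha / 2.0) / t ** 0.5))

for t in (0.25, 0.5, 0.75, 1.0):                # hypothetical look schedule
    print(f"t = {t:.2f}: cumulative alpha spent = {obf_spending(t):.4f}")
```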

  13. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
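
    In the spirit of SEQTEST, a minimal Monte Carlo sketch estimating the SPRT's detection probability and average trial length for Poisson counts; the rates and error targets are hypothetical, not the program's settings:

```python
# Monte Carlo sketch: SPRT detection probability and mean trial length for
# Poisson count data. Background/source rates and error targets are assumptions.
import numpy as np

rng = np.random.default_rng(5)
bkg, src = 2.0, 5.0                         # counts per interval under H0 / H1
alpha, beta = 0.01, 0.05
upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

def one_trial(rate, max_intervals=200):
    llr = 0.0
    for t in range(1, max_intervals + 1):
        x = rng.poisson(rate)
        llr += x * np.log(src / bkg) - (src - bkg)   # Poisson log-LR increment
        if llr >= upper:
            return True, t                           # "source present"
        if llr <= lower:
            return False, t                          # "background only"
    return False, max_intervals

results = [one_trial(src) for _ in range(2000)]      # simulate under H1
detections = np.mean([hit for hit, _ in results])
mean_time = np.mean([t for _, t in results])
print(f"P(detect) = {detections:.3f}, mean intervals per trial = {mean_time:.1f}")
```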

  14. A test of the herbivore optimization hypothesis using muskoxen and a graminoid meadow plant community

    Directory of Open Access Journals (Sweden)

    David L. Smith

    1996-01-01

    A prediction from the herbivore optimization hypothesis is that grazing by herbivores at moderate intensities will increase net above-ground primary productivity more than at lower or higher intensities. I tested this hypothesis in an area of high muskox (Ovibos moschatus) density on north-central Banks Island, Northwest Territories, Canada (73°50'N, 119°53'W). Plots (1 m²) in graminoid meadows dominated by cottongrass (Eriophorum triste) were either clipped, exposed to muskoxen, protected for part of one growing season, or permanently protected. This resulted in the removal of 22-44%, 10-39%, 0-39% or 0%, respectively, of shoot tissue during each growing season. Contrary to the predictions of the herbivore optimization hypothesis, productivity did not increase across this range of tissue removal. Productivity of plants clipped at 1.5 cm above ground once or twice per growing season declined by 60 ± 5% in 64% of the tests. The productivity of plants grazed by muskoxen declined by 56 ± 7% in 25% of the tests. No significant change in productivity was observed in 36% and 75% of the tests in the clipped and grazed treatments, respectively. Clipping and grazing reduced below-ground standing crop except where removals were small. Grazing and clipping did not stimulate productivity of north-central Banks Island graminoid meadows.

  15. Paranormal psychic believers and skeptics: a large-scale test of the cognitive differences hypothesis.

    Science.gov (United States)

    Gray, Stephen J; Gallo, David A

    2016-02-01

    Belief in paranormal psychic phenomena is widespread in the United States, with over a third of the population believing in extrasensory perception (ESP). Why do some people believe, while others are skeptical? According to the cognitive differences hypothesis, individual differences in the way people process information about the world can contribute to the creation of psychic beliefs, such as differences in memory accuracy (e.g., selectively remembering a fortune teller's correct predictions) or analytical thinking (e.g., relying on intuition rather than scrutinizing evidence). While this hypothesis is prevalent in the literature, few have attempted to empirically test it. Here, we provided the most comprehensive test of the cognitive differences hypothesis to date. In 3 studies, we used online screening to recruit groups of strong believers and strong skeptics, matched on key demographics (age, sex, and years of education). These groups were then tested in laboratory and online settings using multiple cognitive tasks and other measures. Our cognitive testing showed that there were no consistent group differences on tasks of episodic memory distortion, autobiographical memory distortion, or working memory capacity, but skeptics consistently outperformed believers on several tasks tapping analytical or logical thinking as well as vocabulary. These findings demonstrate cognitive similarities and differences between these groups and suggest that differences in analytical thinking and conceptual knowledge might contribute to the development of psychic beliefs. We also found that psychic belief was associated with greater life satisfaction, demonstrating benefits associated with psychic beliefs and highlighting the role of both cognitive and noncognitive factors in understanding these individual differences.

  16. Testing the hypothesis that treatment can eliminate HIV

    DEFF Research Database (Denmark)

    Okano, Justin T; Robbins, Danielle; Palk, Laurence

    2016-01-01

    BACKGROUND: Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability that an individual transmits HIV. The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. METHODS: We use a CD4-staged Bayesian back-calculation approach to estimate incidence and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study…

  17. A test of the predator satiation hypothesis, acorn predator size, and acorn preference

    Science.gov (United States)

    C.H. Greenberg; S.J. Zarnoch

    2018-01-01

    Mast seeding is hypothesized to satiate seed predators with heavy production and reduce populations with crop failure, thereby increasing seed survival. Preference for red or white oak acorns could influence recruitment among oak species. We tested the predator satiation hypothesis, acorn preference, and predator size by concurrently...

  18. Persistent Confusions about Hypothesis Testing in the Social Sciences

    Directory of Open Access Journals (Sweden)

    Christopher Thron

    2015-05-01

    This paper analyzes common confusions involving basic concepts in statistical hypothesis testing. One-third of the social science statistics textbooks examined in the study contained false statements about significance level and/or p-value. We infer that a large proportion of social scientists are being miseducated about these concepts. We analyze the causes of these persistent misunderstandings, and conclude that the conventional terminology is prone to abuse because it does not clearly represent the conditional nature of probabilities and events involved. We argue that modifications in terminology, as well as the explicit introduction of conditional probability concepts and notation into the statistics curriculum in the social sciences, are necessary to prevent the persistence of these errors.
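
    One of the confusions the paper targets is reading the significance level, $P(\text{reject} \mid H_0)$, as $P(H_0 \mid \text{reject})$. A small simulation, with a hypothetical mix of true and false nulls, shows how far apart the two conditional probabilities can be:

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    alpha, n_tests, n_per_group = 0.05, 5_000, 20

    h0_true = rng.random(n_tests) < 0.8          # 80% of hypotheses are truly null
    rejections, null_rejections = 0, 0
    for is_null in h0_true:
        effect = 0.0 if is_null else 0.5         # mean shift under H1 (hypothetical)
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(effect, 1.0, n_per_group)
        if ttest_ind(a, b).pvalue < alpha:
            rejections += 1
            null_rejections += is_null

    print(f"P(reject | H0) ~ {null_rejections / h0_true.sum():.3f}  (close to alpha)")
    print(f"P(H0 | reject) ~ {null_rejections / rejections:.3f}  (not alpha at all)")
    ```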

  19. Why Is Test-Restudy Practice Beneficial for Memory? An Evaluation of the Mediator Shift Hypothesis

    Science.gov (United States)

    Pyc, Mary A.; Rawson, Katherine A.

    2012-01-01

    Although the memorial benefits of testing are well established empirically, the mechanisms underlying this benefit are not well understood. The authors evaluated the mediator shift hypothesis, which states that test-restudy practice is beneficial for memory because retrieval failures during practice allow individuals to evaluate the effectiveness…

  20. Convergence Hypothesis: Evidence from Panel Unit Root Test with Spatial Dependence

    Directory of Open Access Journals (Sweden)

    Lezheng Liu

    2006-10-01

    In this paper we test the convergence hypothesis by using a revised 4-step panel unit root test procedure suggested by Evans and Karras (1996). We use output data for 24 OECD countries spanning 40 years. Whether the convergence, if any, is conditional or absolute is also examined. Following a proposition by Baltagi, Bresson, and Pirotte (2005), we incorporate a spatial autoregressive error into a fixed-effect panel model to account not only for the heterogeneous panel structure but also for spatial dependence, which might reduce the statistical power of conventional panel unit root tests. Our empirical results indicate that output is converging among OECD countries; however, the convergence is conditional. The results also show a slower convergence speed than reported in conventional panel studies.
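
    The full 4-step procedure is beyond a short sketch, but the two convergence notions are easy to state: beta-convergence requires a negative, significant slope when average growth is regressed on the initial (log) level, and sigma-convergence requires shrinking cross-sectional dispersion over time. A toy illustration under those textbook definitions (synthetic data, not the OECD panel, and without the spatial error correction):

    ```python
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(2)
    n_countries, n_years = 24, 40

    # Synthetic log-output panel in which initially poorer countries grow faster.
    y0 = rng.normal(9.0, 0.6, n_countries)
    growth = 0.03 - 0.02 * (y0 - y0.mean()) + rng.normal(0, 0.005, n_countries)
    panel = y0[:, None] + growth[:, None] * np.arange(n_years)[None, :]

    # Beta-convergence: average growth rate vs initial level.
    avg_growth = (panel[:, -1] - panel[:, 0]) / (n_years - 1)
    fit = linregress(y0, avg_growth)
    print(f"beta = {fit.slope:.4f} (p = {fit.pvalue:.3g}); negative => beta-convergence")

    # Sigma-convergence: cross-sectional dispersion over time.
    sigma = panel.std(axis=0)
    print(f"dispersion: {sigma[0]:.3f} (start) -> {sigma[-1]:.3f} (end)")
    ```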

  1. Endogenous sequential cortical activity evoked by visual stimuli.

    Science.gov (United States)

    Carrillo-Reid, Luis; Miller, Jae-Eun Kang; Hamm, Jordan P; Jackson, Jesse; Yuste, Rafael

    2015-06-10

    Although the functional properties of individual neurons in primary visual cortex have been studied intensely, little is known about how neuronal groups could encode changing visual stimuli using temporal activity patterns. To explore this, we used in vivo two-photon calcium imaging to record the activity of neuronal populations in primary visual cortex of awake mice in the presence and absence of visual stimulation. Multidimensional analysis of the network activity allowed us to identify neuronal ensembles defined as groups of cells firing in synchrony. These synchronous groups of neurons were themselves activated in sequential temporal patterns, which repeated at much higher proportions than chance and were triggered by specific visual stimuli such as natural visual scenes. Interestingly, sequential patterns were also present in recordings of spontaneous activity without any sensory stimulation and were accompanied by precise firing sequences at the single-cell level. Moreover, intrinsic dynamics could be used to predict the occurrence of future neuronal ensembles. Our data demonstrate that visual stimuli recruit similar sequential patterns to the ones observed spontaneously, consistent with the hypothesis that already existing Hebbian cell assemblies firing in predefined temporal sequences could be the microcircuit substrate that encodes visual percepts changing in time. Copyright © 2015 Carrillo-Reid et al.

  2. TESTS OF THE PLANETARY HYPOTHESIS FOR PTFO 8-8695b

    International Nuclear Information System (INIS)

    Yu, Liang; Winn, Joshua N.; Rappaport, Saul; Dai, Fei; Triaud, Amaury H. M. J.; Gillon, Michaël; Delrez, Laetitia; Jehin, Emmanuel; Lendl, Monika; Albrecht, Simon; Bieryla, Allyson; Holman, Matthew J.; Montet, Benjamin T.; Hillenbrand, Lynne; Howard, Andrew W.; Huang, Chelsea X.; Isaacson, Howard; Sanchis-Ojeda, Roberto; Muirhead, Philip

    2015-01-01

    The T Tauri star PTFO 8-8695 exhibits periodic fading events that have been interpreted as the transits of a giant planet on a precessing orbit. Here we present three tests of the planet hypothesis. First, we sought evidence for the secular changes in light-curve morphology that are predicted to be a consequence of orbital precession. We observed 28 fading events spread over several years and did not see the expected changes. Instead, we found that the fading events are not strictly periodic. Second, we attempted to detect the planet's radiation, based on infrared observations spanning the predicted times of occultations. We ruled out a signal of the expected amplitude. Third, we attempted to detect the Rossiter–McLaughlin effect by performing high-resolution spectroscopy throughout a fading event. No effect was seen at the expected level, ruling out most (but not all) possible orientations for the hypothetical planetary orbit. Our spectroscopy also revealed strong, time-variable, high-velocity Hα and Ca H and K emission features. All these observations cast doubt on the planetary hypothesis, and suggest instead that the fading events represent starspots, eclipses by circumstellar dust, or occultations of an accretion hotspot

  3. Hypothesis test for synchronization: twin surrogates revisited.

    Science.gov (United States)

    Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf

    2009-03-01

    The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.

  4. Water Pollution Detection Based on Hypothesis Testing in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xu Luo

    2017-01-01

    Water pollution detection is of great importance in water conservation. In this paper, the water pollution detection problems of the network and of the node in sensor networks are discussed. Detection is considered for both normal and nonnormal distributions of the monitoring noise. The pollution detection problems are first analyzed based on hypothesis testing theory; specific detection algorithms are then given. Finally, two implementation examples illustrate how the proposed detection methods are used for water pollution detection in sensor networks and demonstrate their effectiveness.

  5. The self: Your own worst enemy? A test of the self-invoking trigger hypothesis.

    Science.gov (United States)

    McKay, Brad; Wulf, Gabriele; Lewthwaite, Rebecca; Nordin, Andrew

    2015-01-01

    The self-invoking trigger hypothesis was proposed by Wulf and Lewthwaite [Wulf, G., & Lewthwaite, R. (2010). Effortless motor learning? An external focus of attention enhances movement effectiveness and efficiency. In B. Bruya (Ed.), Effortless attention: A new perspective in attention and action (pp. 75-101). Cambridge, MA: MIT Press] as a mechanism underlying the robust effect of attentional focus on motor learning and performance. One component of this hypothesis, relevant beyond the attentional focus effect, suggests that causing individuals to access their self-schema will negatively impact their learning and performance of a motor skill. The purpose of the present two studies was to provide an initial test of the performance and learning aspects of the self-invoking trigger hypothesis by asking participants in one group to think about themselves between trial blocks (presumably activating their self-schema) and comparing their performance and learning to that of a control group. In Experiment 1, participants performed 2 blocks of 10 trials on a throwing task. In one condition, participants were asked between blocks to think about their past throwing experience. While a control group maintained their performance across blocks, the self group's performance was degraded on the second block. In Experiment 2, participants were asked to practice a wiffleball hitting task on two separate days. Participants returned on a third day to perform retention and transfer tests without the self-activating manipulation. Results indicated that the self group learned the hitting task less effectively than the control group. The findings reported here provide initial support for the self-invoking trigger hypothesis.

  6. The sequential hypothesis of sleep function. IV. A correlative analysis of sleep variables in learning and nonlearning rats.

    Science.gov (United States)

    Langella, M; Colarieti, L; Ambrosini, M V; Giuditta, A

    1992-02-01

    Female adult rats were trained for a two-way active avoidance task (4 h), and allowed free sleep (3 h). Control rats (C) were left in their home cages during the acquisition period. Dural electrodes and an intraventricular cannula, implanted one week in advance, were used for EEG recording during the period of sleep and for the injection of [3H]thymidine at the beginning of the training session, respectively. Rats were killed at the end of the sleep period, and the DNA-specific activity was determined in the main brain regions and in liver. Correlations among sleep, behavioral and biochemical variables were assessed using Spearman's nonparametric method. In learning rats (L), the number of avoidances was negatively correlated with SS-W variables, and positively correlated with SS-PS variables (episodes of synchronized sleep followed by wakefulness or paradoxical sleep, respectively) and with PS variables. An inverse pattern of correlations was shown by the number of escapes or freezings. No correlations occurred in rats unable to achieve the learning criterion (NL). In L rats, the specific activity of brain DNA was negatively correlated with SS-W variables and positively correlated with SS-PS variables, while essentially no correlation concerned PS variables. On the other hand, in NL rats, comparable correlations were positive with SS-W variables and negative with SS-PS and PS variables. Few and weak correlations occurred in C rats. The data support a role of SS in brain information processing, as postulated by the sequential hypothesis on the function of sleep. In addition, they suggest that the elimination of nonadaptive memory traces may require several SS-W episodes and a terminal SS-PS episode. During PS episodes, adaptive memory traces cleared of nonadaptive components may be copied in more suitable brain sites.

  7. MUF residuals tested by a sequential test with power one

    International Nuclear Information System (INIS)

    Sellinschegg, D.; Bicking, U.

    1983-01-01

    Near-real-time material accountancy is an ongoing safeguards development to extend the current capability of IAEA safeguards. The evaluation of the observed "Material Unaccounted For" (MUF) time series is an important part of a near-real-time material accountancy regime. The maximum capability of a sequential data evaluation procedure is demonstrated by applying it to the material balance area of the chemical separation process of a reference reprocessing facility with a throughput of 1000 tonnes of heavy metal per year, as an example. It is shown that, compared to a conventional material accountancy approach, both the detection time and the detection probability are significantly improved. (author)

  8. Modeling sequential context effects in diagnostic interpretation of screening mammograms.

    Science.gov (United States)

    Alamudun, Folami; Paulus, Paige; Yoon, Hong-Jun; Tourassi, Georgia

    2018-07-01

    Prior research has shown that physicians' medical decisions can be influenced by sequential context, particularly in cases where successive stimuli exhibit similar characteristics when analyzing medical images. This type of systematic error is known to psychophysicists as sequential context effect as it indicates that judgments are influenced by features of and decisions about the preceding case in the sequence of examined cases, rather than being based solely on the peculiarities unique to the present case. We determine if radiologists experience some form of context bias, using screening mammography as the use case. To this end, we explore correlations between previous perceptual behavior and diagnostic decisions and current decisions. We hypothesize that a radiologist's visual search pattern and diagnostic decisions in previous cases are predictive of the radiologist's current diagnostic decisions. To test our hypothesis, we tasked 10 radiologists of varied experience to conduct blind reviews of 100 four-view screening mammograms. Eye-tracking data and diagnostic decisions were collected from each radiologist under conditions mimicking clinical practice. Perceptual behavior was quantified using the fractal dimension of gaze scanpath, which was computed using the Minkowski-Bouligand box-counting method. To test the effect of previous behavior and decisions, we conducted a multifactor fixed-effects ANOVA. Further, to examine the predictive value of previous perceptual behavior and decisions, we trained and evaluated a predictive model for radiologists' current diagnostic decisions. ANOVA tests showed that previous visual behavior, characterized by fractal analysis, previous diagnostic decisions, and image characteristics of previous cases are significant predictors of current diagnostic decisions. Additionally, predictive modeling of diagnostic decisions showed an overall improvement in prediction error when the model is trained on additional information about…
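
    The Minkowski-Bouligand dimension used above can be estimated by covering the scanpath with grids of shrinking cell size and fitting the slope of $\log N(\varepsilon)$ against $\log(1/\varepsilon)$. A generic box-counting sketch for a 2-D point path (the random-walk "gaze" data are synthetic stand-ins for eye-tracking coordinates):

    ```python
    import numpy as np

    def box_counting_dimension(points, sizes):
        """Estimate the Minkowski-Bouligand dimension of a 2-D point set."""
        points = (points - points.min(axis=0)) / np.ptp(points, axis=0)  # to unit square
        counts = []
        for eps in sizes:
            # Count occupied boxes on a grid with cell side eps.
            cells = np.unique(np.floor(points / eps).astype(int), axis=0)
            counts.append(len(cells))
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(3)
    scanpath = np.cumsum(rng.normal(size=(5_000, 2)), axis=0)  # random-walk "gaze" path
    sizes = np.array([1/4, 1/8, 1/16, 1/32, 1/64])
    print(f"estimated fractal dimension ~ {box_counting_dimension(scanpath, sizes):.2f}")
    ```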

  9. Habitat fragmentation, vole population fluctuations, and the ROMPA hypothesis: An experimental test using model landscapes.

    Science.gov (United States)

    Batzli, George O

    2016-11-01

    Increased habitat fragmentation leads to smaller size of habitat patches and to greater distance between patches. The ROMPA hypothesis (ratio of optimal to marginal patch area) uniquely links vole population fluctuations to the composition of the landscape. It states that as ROMPA decreases (fragmentation increases), vole population fluctuations will increase (including the tendency to display multi-annual cycles in abundance) because decreased proportions of optimal habitat result in greater population declines and longer recovery time after a harsh season. To date, only comparative observations in the field have supported the hypothesis. This paper reports the results of the first experimental test. I used prairie voles, Microtus ochrogaster, and mowed grassland to create model landscapes with 3 levels of ROMPA (high with 25% mowed, medium with 50% mowed and low with 75% mowed). As ROMPA decreased, distances between patches of favorable habitat (high cover) increased owing to a greater proportion of unfavorable (mowed) habitat. Results from the first year with intensive live trapping indicated that the preconditions for operation of the hypothesis existed (inversely density dependent emigration and, as ROMPA decreased, increased per capita mortality and decreased per capita movement between optimal patches). Nevertheless, contrary to the prediction of the hypothesis that populations in landscapes with high ROMPA should have the lowest variability, 5 years of trapping indicated that variability was lowest with medium ROMPA. The design of field experiments may never be perfect, but these results indicate that the ROMPA hypothesis needs further rigorous testing. © 2016 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  10. Predictability of Exchange Rates in Sri Lanka: A Test of the Efficient Market Hypothesis

    OpenAIRE

    Guneratne B Wickremasinghe

    2007-01-01

    This study examined the validity of the weak and semi-strong forms of the efficient market hypothesis (EMH) for the foreign exchange market of Sri Lanka. Monthly exchange rates for four currencies during the floating exchange rate regime were used in the empirical tests. Using a battery of tests, empirical results indicate that the current values of the four exchange rates can be predicted from their past values. Further, the tests of semi-strong form efficiency indicate that exchange rate pa...

  11. Efficacy and safety of sequential versus quadruple therapy as second-line treatment for helicobacter pylori infection-A randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Daniela Munteanu

    Quadruple therapy is recommended as second-line treatment for Helicobacter pylori eradication failure. However, high cost, multiple side effects, and low adherence rates are major drawbacks to its routine use. Our aim was to compare the efficacy and safety of sequential versus quadruple regimens as second-line treatment for persistent Helicobacter pylori infection. A prospective, randomized, open-label trial was conducted at a large academic, tertiary care center in Israel. Patients who previously failed a standard triple-therapy eradication course were randomly assigned (1:1) to receive a 10-day sequential therapy course or a 14-day quadruple regimen. Compliance and adverse events were evaluated by telephone questionnaires. The primary endpoint for analysis was the rate of Helicobacter pylori eradication, defined as either a negative 13C-urea breath test or a negative stool antigen test 4-16 weeks after treatment, assessed under the non-inferiority hypothesis. The trial was terminated prematurely due to low recruitment rates. See S1 Checklist for the CONSORT checklist. One hundred and one patients were randomized. Per modified intention-to-treat analysis, the eradication rate was 49% in the sequential versus 42.5% in the quadruple regimen group (p-value for non-inferiority 0.02). Forty-two (84.0%) versus 33 (64.7%) patients completed treatment in the sequential and quadruple groups, respectively (p = 0.027). Gastrointestinal side effects were more common in the quadruple regimen group. Sequential treatment, when used as a second-line regimen, was non-inferior to the standard-of-care quadruple regimen in achieving Helicobacter pylori eradication and was associated with better compliance and fewer adverse effects. Both treatment protocols failed to show an adequate eradication rate in the population of Southern Israel. ClinicalTrials.gov NCT01481844.

  12. Parameter estimation and hypothesis testing in linear models

    CERN Document Server

    Koch, Karl-Rudolf

    1999-01-01

    The necessity to publish the second edition of this book arose when its third German edition had just been published. This second English edition is therefore a translation of the third German edition of Parameter Estimation and Hypothesis Testing in Linear Models, published in 1997. It differs from the first English edition by the addition of a new chapter on robust estimation of parameters and the deletion of the section on discriminant analysis, which has been more completely dealt with by the author in the book Bayesian Inference with Geodetic Applications, Springer-Verlag, Berlin Heidelberg New York, 1990. Smaller additions and deletions have been incorporated, to improve the text, to point out new developments or to eliminate errors which became apparent. A few examples have also been added. I thank Springer-Verlag for publishing this second edition and for the assistance in checking the translation, although the responsibility of errors remains with the author. I also want to express my thanks…

  13. Visual Working Memory and Number Sense: Testing the Double Deficit Hypothesis in Mathematics

    Science.gov (United States)

    Toll, Sylke W. M.; Kroesbergen, Evelyn H.; Van Luit, Johannes E. H.

    2016-01-01

    Background: Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. Aims: The aim of this study was to test the DD…

  14. A novel approach for small sample size family-based association studies: sequential tests.

    Science.gov (United States)

    Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan

    2011-08-01

    In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
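
    The practical appeal of the SPRT here is the third outcome: keep sampling. A schematic three-way classifier for a stream of transmission indicators, where under no association a heterozygous parent transmits the risk allele with probability 0.5 (the Bernoulli formulation and the alternative p1 = 0.65 are illustrative assumptions, not the authors' exact statistic):

    ```python
    import numpy as np

    def sprt_snp(transmissions, p0=0.5, p1=0.65, alpha=0.05, beta=0.05):
        """Classify a SNP from a stream of 0/1 transmission indicators.

        Returns 'associated', 'not associated', or 'keep sampling' if the
        data run out before either Wald boundary is crossed.
        """
        upper = np.log((1 - beta) / alpha)
        lower = np.log(beta / (1 - alpha))
        llr = 0.0
        for x in transmissions:
            # Bernoulli log-likelihood ratio for p1 vs p0.
            llr += x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0))
            if llr >= upper:
                return "associated"
            if llr <= lower:
                return "not associated"
        return "keep sampling"

    rng = np.random.default_rng(4)
    print(sprt_snp(rng.random(200) < 0.65))  # SNP with a true transmission distortion
    print(sprt_snp(rng.random(200) < 0.50))  # null SNP
    ```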

  15. Comparison of three-stage sequential extraction and toxicity characteristic leaching tests to evaluate metal mobility in mining wastes

    International Nuclear Information System (INIS)

    Margui, E.; Salvado, V.; Queralt, I.; Hidalgo, M.

    2004-01-01

    Abandoned mining sites contain residues from ore processing operations that are characterised by high concentrations of heavy metals. The form in which a metal exists strongly influences its mobility and, thus, the effects on the environment. Operational methods of speciation analysis, such as the use of sequential extraction procedures, are commonly applied. In this work, the modified three-stage sequential extraction procedure proposed by the BCR (now the Standards, Measurements and Testing Programme) was applied for the fractionation of Ni, Zn, Pb and Cd in mining wastes from old Pb-Zn mining areas located in the Val d'Aran (NE Spain) and Cartagena (SE Spain). Analyses of the extracts were performed by inductively coupled plasma atomic emission spectrometry and electrothermal atomic absorption spectrometry. The procedure was evaluated by using a certified reference material, BCR-701. The results of the partitioning study indicate that more easily mobilised forms (acid exchangeable) were predominant for Cd and Zn, particularly in the sample from Cartagena. In contrast, the largest amount of lead was associated with the iron and manganese oxide fractions. On the other hand, the applicability of lixiviation tests commonly used to evaluate the leaching of toxic species from landfill disposal (US-EPA Toxicity Characteristic Leaching Procedure and DIN 38414-S4) to mining wastes was also investigated and the obtained results compared with the information on metal mobility derivable from the application of the three-stage sequential extraction procedure

  16. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…

  17. Hypothesis Testing of Inclusion of the Tolerance Interval for the Assessment of Food Safety.

    Directory of Open Access Journals (Sweden)

    Hungyen Chen

    In the testing of food quality and safety, we contrast the contents of the newly proposed food (genetically modified food) against those of conventional foods. Because the contents vary largely between crop varieties and production environments, we propose a two-sample test of substantial equivalence that examines the inclusion of the tolerance intervals of two populations: the population of the contents of the proposed food, which we call the target population, and the population of the contents of the conventional food, which we call the reference population. Rejection of the test hypothesis guarantees that the contents of the proposed foods essentially do not include outliers in the population of the contents of the conventional food. The existing tolerance interval (TI0) is constructed to have at least a pre-specified level of coverage probability. Here, we newly introduce the complementary tolerance interval (TI1), which is guaranteed to have at most a pre-specified level of coverage probability. By applying TI0 and TI1 to the samples from the target population and the reference population, respectively, we construct a test statistic for testing inclusion of the two tolerance intervals. To examine the performance of the testing procedure, we conducted a simulation that reflects the effects of gene, environment, and residual from a crop experiment. As a case study, we applied the hypothesis testing to test whether the distribution of the protein content of rice in the Kyushu area is included in the distribution of the protein content in the other areas of Japan.
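
    The paper's complementary interval TI1 has its own construction, which is not reproduced here; purely to illustrate the inclusion idea, the sketch below computes a standard two-sided normal tolerance interval for each sample (Howe's approximation for the k-factor) and checks whether the target interval falls inside the reference interval (the protein-content samples are synthetic):

    ```python
    import numpy as np
    from scipy.stats import norm, chi2

    def tolerance_interval(x, coverage=0.95, confidence=0.95):
        """Two-sided normal tolerance interval via Howe's approximation."""
        n = len(x)
        nu = n - 1
        z = norm.ppf((1 + coverage) / 2)
        k = z * np.sqrt(nu * (1 + 1 / n) / chi2.ppf(1 - confidence, nu))
        m, s = np.mean(x), np.std(x, ddof=1)
        return m - k * s, m + k * s

    rng = np.random.default_rng(5)
    reference = rng.normal(7.0, 1.0, 200)   # protein content, conventional food
    target = rng.normal(7.1, 0.6, 60)       # protein content, proposed food

    lo_ref, hi_ref = tolerance_interval(reference)
    lo_tgt, hi_tgt = tolerance_interval(target)
    print(f"reference TI = ({lo_ref:.2f}, {hi_ref:.2f}), target TI = ({lo_tgt:.2f}, {hi_tgt:.2f})")
    print("target interval included in reference interval:", lo_ref <= lo_tgt and hi_tgt <= hi_ref)
    ```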

  18. Testing the null hypothesis of the nonexistence of a preseizure state

    International Nuclear Information System (INIS)

    Andrzejak, Ralph G.; Kraskov, Alexander; Mormann, Florian; Rieke, Christoph; Kreuz, Thomas; Elger, Christian E.; Lehnertz, Klaus

    2003-01-01

    A rapidly growing number of studies deals with the prediction of epileptic seizures. For this purpose, various techniques derived from linear and nonlinear time series analysis have been applied to the electroencephalogram of epilepsy patients. In none of these works, however, the performance of the seizure prediction statistics is tested against a null hypothesis, an otherwise ubiquitous concept in science. In consequence, the evaluation of the reported performance values is problematic. Here, we propose the technique of seizure time surrogates based on a Monte Carlo simulation to remedy this deficit.

  19. Testing the null hypothesis of the nonexistence of a preseizure state

    Energy Technology Data Exchange (ETDEWEB)

    Andrzejak, Ralph G; Kraskov, Alexander [John-von-Neumann Institute for Computing, Forschungszentrum Juelich, 52425 Juelich (Germany); Mormann, Florian; Rieke, Christoph [Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany); Helmholtz Institut fuer Strahlen- und Kernphysik, University of Bonn, Nussallee 14-16, 53115 Bonn (Germany); Kreuz, Thomas [John-von-Neumann Institute for Computing, Forschungszentrum Juelich, 52425 Juelich (Germany); Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany); Elger, Christian E; Lehnertz, Klaus [Department of Epileptology, University of Bonn, Sigmund-Freud-Strasse 25, 53105 Bonn (Germany)

    2003-01-01

    A rapidly growing number of studies deals with the prediction of epileptic seizures. For this purpose, various techniques derived from linear and nonlinear time series analysis have been applied to the electroencephalogram of epilepsy patients. In none of these works, however, the performance of the seizure prediction statistics is tested against a null hypothesis, an otherwise ubiquitous concept in science. In consequence, the evaluation of the reported performance values is problematic. Here, we propose the technique of seizure time surrogates based on a Monte Carlo simulation to remedy this deficit.
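
    The surrogate logic can be sketched generically: recompute the prediction statistic after destroying the alarm-seizure relationship, and take the p-value as the fraction of surrogates that perform at least as well as the original. In the toy version below the surrogates shuffle inter-seizure intervals; both the performance measure and the data are simplified stand-ins for the paper's construction:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def sensitivity(seizure_times, alarm_times, horizon=5.0):
        """Fraction of seizures preceded by an alarm within `horizon` minutes."""
        hits = sum(any(0 < s - a <= horizon for a in alarm_times) for s in seizure_times)
        return hits / len(seizure_times)

    def surrogate_seizure_times(seizure_times):
        """Shuffle inter-seizure intervals to destroy alarm/seizure coupling."""
        intervals = np.diff(seizure_times)
        return seizure_times[0] + np.concatenate(([0.0], np.cumsum(rng.permutation(intervals))))

    seizures = np.sort(rng.uniform(0, 1000, 12))   # synthetic seizure times (minutes)
    alarms = np.sort(rng.uniform(0, 1000, 40))     # synthetic alarm times

    observed = sensitivity(seizures, alarms)
    scores = [sensitivity(surrogate_seizure_times(seizures), alarms) for _ in range(999)]
    p_value = (1 + sum(s >= observed for s in scores)) / (1 + len(scores))
    print(f"observed sensitivity = {observed:.2f}, surrogate p-value = {p_value:.3f}")
    ```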

  20. Analysis of membrane fusion as a two-state sequential process: evaluation of the stalk model.

    Science.gov (United States)

    Weinreb, Gabriel; Lentz, Barry R

    2007-06-01

    We propose a model that accounts for the time courses of PEG-induced fusion of membrane vesicles of varying lipid compositions and sizes. The model assumes that fusion proceeds from an initial, aggregated vesicle state (A; membrane contact) through two sequential intermediate states (I1 and I2) and then on to a fusion pore state (FP). Using this model, we interpreted data on the fusion of seven different vesicle systems. We found that the initial aggregated state involved no lipid or content mixing but did produce leakage. The final state (FP) was not leaky. Lipid mixing normally dominated the first intermediate state (I1), but a content mixing signal was also observed in this state for most systems. The second intermediate state (I2) exhibited both lipid and content mixing signals and leakage, and was sometimes the only leaky state. In some systems, the first and second intermediates were indistinguishable and converted directly to the FP state. Having also tested a parallel, two-intermediate model subject to different assumptions about the nature of the intermediates, we conclude that a sequential, two-intermediate model is the simplest model sufficient to describe PEG-mediated fusion in all vesicle systems studied. We conclude as well that a fusion intermediate "state" should not be thought of as a fixed structure (e.g., "stalk" or "transmembrane contact") of uniform properties. Rather, a fusion "state" describes an ensemble of similar structures that can have different mechanical properties. Thus, a "state" can have varying probabilities of having a given functional property such as content mixing, lipid mixing, or leakage. Our data show that the content mixing signal may occur through two processes, one correlated and one not correlated with leakage. Finally, we consider the implications of our results in terms of the "modified stalk" hypothesis for the mechanism of lipid pore formation. We conclude that our results not only support this hypothesis but…
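
    The sequential scheme A → I1 → I2 → FP is a chain of first-order steps, so the state occupancies follow linear ODEs and each observable (lipid mixing, content mixing, leakage) is a weighted sum over states. A kinetic sketch of that core with hypothetical rate constants and amplitudes:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Sequential first-order scheme: A -> I1 -> I2 -> FP, rates k1, k2, k3 (1/s).
    k1, k2, k3 = 0.05, 0.02, 0.01

    def rhs(t, y):
        a, i1, i2, fp = y
        return [-k1 * a,
                k1 * a - k2 * i1,
                k2 * i1 - k3 * i2,
                k3 * i2]

    sol = solve_ivp(rhs, (0, 600), [1.0, 0.0, 0.0, 0.0], t_eval=np.linspace(0, 600, 7))

    # Example observable: lipid mixing with hypothetical per-state amplitudes.
    amplitudes = np.array([0.0, 0.7, 0.9, 1.0])  # A, I1, I2, FP
    lipid_mixing = amplitudes @ sol.y
    for t, lm in zip(sol.t, lipid_mixing):
        print(f"t = {t:5.0f} s: lipid-mixing signal = {lm:.3f}")
    ```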

  1. Understanding Postdisaster Substance Use and Psychological Distress Using Concepts from the Self-Medication Hypothesis and Social Cognitive Theory.

    Science.gov (United States)

    Alexander, Adam C; Ward, Kenneth D

    2017-11-10

    This article applies constructs from the Self-Medication Hypothesis and Social Cognitive Theory to explain the development of substance use and psychological distress after a disaster. A conceptual model is proposed, which employs a sequential mediation model, identifying perceived coping self-efficacy, psychological distress, and self-medication as pathways to substance use after a disaster. Disaster exposure decreases perceived coping self-efficacy, which, in turn, increases psychological distress and subsequently increases perceptions of self-medication in vulnerable individuals. These mechanisms lead to an increase in postdisaster substance use. Last, recommendations are offered to encourage disaster researchers to test more complex models in studies on postdisaster psychological distress and substance use.

  2. TEST OF THE CATCH-UP HYPOTHESIS IN AFRICAN AGRICULTURAL GROWTH RATES

    Directory of Open Access Journals (Sweden)

    Kalu Ukpai IFEGWU

    2015-11-01

    The paper tested the catch-up hypothesis in agricultural growth rates of twenty-six African countries. The panel data used were drawn from the Food and Agricultural Organization Statistics (FAOSTAT) of the United Nations. The Data Envelopment Analysis method for measuring productivity was used to estimate productivity growth rates. A cross-section framework consisting of sigma-convergence and beta-convergence was employed to test the catching-up process. Catching up is said to exist if the value of beta is negative and significant. Since catching up does not necessarily imply a narrowing of national productivity inequalities, sigma-convergence, which measures inequality, was estimated for the same variables. The results showed evidence of the catch-up process, but failed to find a narrowing of productivity inequalities among countries.

  3. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    Science.gov (United States)

    Samia, Diogo S M; Blumstein, Daniel T

    2014-01-01

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD), and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and it controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis.

  4. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    Directory of Open Access Journals (Sweden)

    Diogo S M Samia

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD) and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis.

  5. Does mediator use contribute to the spacing effect for cued recall? Critical tests of the mediator hypothesis.

    Science.gov (United States)

    Morehead, Kayla; Dunlosky, John; Rawson, Katherine A; Bishop, Melissa; Pyc, Mary A

    2018-04-01

    When study is spaced across sessions (versus massed within a single session), final performance is greater after spacing. This spacing effect may have multiple causes, and according to the mediator hypothesis, part of the effect can be explained by the use of mediator-based strategies. This hypothesis proposes that when study is spaced across sessions, rather than massed within a session, more mediators will be generated that are longer lasting and hence more mediators will be available to support criterion recall. In two experiments, participants were randomly assigned to study paired associates using either a spaced or massed schedule. They reported strategy use for each item during study trials and during the final test. Consistent with the mediator hypothesis, participants who had spaced (as compared to massed) practice reported using more mediators on the final test. This use of effective mediators also statistically accounted for some - but not all of - the spacing effect on final performance.

  6. Contrast class cues and performance facilitation in a hypothesis-testing task: evidence for an iterative counterfactual model.

    Science.gov (United States)

    Gale, Maggie; Ball, Linden J

    2012-04-01

    Hypothesis-testing performance on Wason's (Quarterly Journal of Experimental Psychology 12:129-140, 1960) 2-4-6 task is typically poor, with only around 20% of participants announcing the to-be-discovered "ascending numbers" rule on their first attempt. Enhanced solution rates can, however, readily be observed with dual-goal (DG) task variants requiring the discovery of two complementary rules, one labeled "DAX" (the standard "ascending numbers" rule) and the other labeled "MED" ("any other number triples"). Two DG experiments are reported in which we manipulated the usefulness of a presented MED exemplar, where usefulness denotes cues that can establish a helpful "contrast class" that can stand in opposition to the presented 2-4-6 DAX exemplar. The usefulness of MED exemplars had a striking facilitatory effect on DAX rule discovery, which supports the importance of contrast-class information in hypothesis testing. A third experiment ruled out the possibility that the useful MED triple seeded the correct rule from the outset and obviated any need for hypothesis testing. We propose that an extension of Oaksford and Chater's (European Journal of Cognitive Psychology 6:149-169, 1994) iterative counterfactual model can neatly capture the mechanisms by which DG facilitation arises.

  7. Alternatives to the sequential lineup: the importance of controlling the pictures.

    Science.gov (United States)

    Lindsay, R C; Bellinger, K

    1999-06-01

    Because sequential lineups reduce false-positive choices, their use has been recommended (R. C. L. Lindsay, 1999; R. C. L. Lindsay & G. L. Wells, 1985). Blind testing is included in the recommended procedures. Police, concerned about blind testing, devised alternative procedures, including self-administered sequential lineups, to reduce use of relative judgments (G. L. Wells, 1984) while permitting the investigating officer to conduct the procedure. Identification data from undergraduates exposed to a staged crime (N = 165) demonstrated that 4 alternative identification procedures tested were less effective than the original sequential lineup. Allowing witnesses to control the photographs resulted in higher rates of false-positive identification. Self-reports of using relative judgments were shown to be postdictive of decision accuracy.

  8. Life shocks and crime: a test of the "turning point" hypothesis.

    Science.gov (United States)

    Corman, Hope; Noonan, Kelly; Reichman, Nancy E; Schwartz-Soicher, Ofira

    2011-08-01

    Other researchers have posited that important events in men's lives, such as employment, marriage, and parenthood, strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant's father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the "turning point" hypothesis.

  9. Gratitude facilitates private conformity: A test of the social alignment hypothesis.

    Science.gov (United States)

    Ng, Jomel W X; Tong, Eddie M W; Sim, Dael L Y; Teo, Samantha W Y; Loy, Xingqi; Giesbrecht, Timo

    2017-03-01

    Past research has established clear support for the prosocial function of gratitude in improving the well-being of others. The present research provides evidence for another hypothesized function of gratitude: the social alignment function, which enhances the tendency of grateful individuals to follow social norms. We tested the social alignment hypothesis of gratitude in 2 studies with large samples. Using 2 different conformity paradigms, participants were subjected to a color judgment task (Experiment 1) and a material consumption task (Experiment 2). They were provided with information showing choices allegedly made by others, but were allowed to state their responses in private. Supporting the social alignment hypothesis, the results showed that induced gratitude increased private conformity. Specifically, participants induced to feel gratitude were more likely to conform to the purportedly popular choice, even if the option was factually incorrect (Experiment 1). This effect appears to be specific to gratitude; induction of joy produced significantly less conformity than gratitude (Experiment 2). We discuss whether the social alignment function provides a behavioral pathway in the role of gratitude in building social relationships. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. On the Keyhole Hypothesis

    DEFF Research Database (Denmark)

    Mikkelsen, Kaare B.; Kidmose, Preben; Hansen, Lars Kai

    2017-01-01

    We propose and test the keyhole hypothesis that measurements from low-dimensional EEG, such as ear-EEG, reflect a broadly distributed set of neural processes. We formulate the keyhole hypothesis in information-theoretical terms. The experimental investigation is based on legacy data consisting of 10 … simultaneously recorded scalp EEG. A cross-validation procedure was employed to ensure unbiased estimates. We present several pieces of evidence in support of the keyhole hypothesis: there is a high mutual information between data acquired at scalp electrodes and through the ear-EEG "keyhole"; furthermore we…

  11. Mothers Who Kill Their Offspring: Testing Evolutionary Hypothesis in a 110-Case Italian Sample

    Science.gov (United States)

    Camperio Ciani, Andrea S.; Fontanesi, Lilybeth

    2012-01-01

    Objectives: This research aimed to identify incidents of mothers in Italy killing their own children and to test an adaptive evolutionary hypothesis to explain their occurrence. Methods: 110 cases of mothers killing 123 of their own offspring from 1976 to 2010 were analyzed. Each case was classified using 13 dichotomic variables. Descriptive…

  12. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based…

  13. Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing

    Energy Technology Data Exchange (ETDEWEB)

    Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.; Zamarripa-Perez, Miguel A.; Matuszewski, Michael S.; Miller, David C.

    2018-02-06

    Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE on a pilot plant test campaign for CO2 capture suggests that relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
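
    FOQUS's SDOE machinery is not reproduced here; as a generic illustration of the sequential idea, the sketch below refits a Bayesian quadratic surrogate after each run and picks the next candidate input where the surrogate's predictive variance is largest (the simulator, candidate grid, noise level, and prior are all hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def features(x):
        """Quadratic basis for a 1-D input (e.g., a scaled operating variable)."""
        return np.stack([np.ones_like(x), x, x**2], axis=-1)

    def plant_response(x):
        """Hypothetical stand-in for a pilot-plant run (e.g., CO2 capture fraction)."""
        return 0.9 - (x - 0.6) ** 2 + rng.normal(0, 0.02, np.shape(x))

    candidates = np.linspace(0, 1, 101)
    X = list(rng.uniform(0, 1, 3))                 # three seed runs
    y = [plant_response(x) for x in X]             # responses recorded for model refits

    for run in range(5):                           # five sequential runs
        Phi = features(np.array(X))
        # Bayesian linear regression: unit prior precision, noise variance 0.02**2.
        noise_var, prior_prec = 0.02**2, 1.0
        S = np.linalg.inv(prior_prec * np.eye(3) + Phi.T @ Phi / noise_var)
        # Predictive variance at each candidate; variance does not depend on y.
        pred_var = np.einsum("ij,jk,ik->i", features(candidates), S, features(candidates))
        x_next = candidates[np.argmax(pred_var)]   # most uncertain candidate
        X.append(x_next)
        y.append(plant_response(x_next))
        print(f"run {run + 1}: next test point x = {x_next:.2f}")
    ```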

  14. Use of supernovae light curves for testing the expansion hypothesis and other cosmological relations

    International Nuclear Information System (INIS)

    Rust, B.W.

    1974-01-01

    This thesis is primarily concerned with a test of the expansion hypothesis based on the relation $\Delta t_{obs} = (1 + V_r/c)\,\Delta t_{int}$, where $\Delta t_{int}$ is the time lapse characterizing some phenomenon in a distant galaxy, $\Delta t_{obs}$ is the observed time lapse, and $V_r$ is the symbolic velocity of recession. If the red shift is a Doppler effect, the observed time lapse should be lengthened by the same factor as the wavelength of the light. Many authors have suggested type I supernovae for such a test because of their great luminosity and the uniformity of their light curves, but apparently the test has heretofore never actually been performed. Thirty-six light curves were gathered from the literature and one (SN1971i) was measured. All of the light curves were reduced to a common photographic-magnitude ($m_{pg}$) photometric system. The comparison time lapse, $\Delta t_c$, was taken to be the time required for the brightness to fall from 0.5 mag below peak to 2.5 mag below peak. The straight-line regression of $\Delta t_c$ on $V_r$ gives a correlation coefficient significant at the 93 percent level, and the simple static Euclidean hypothesis is rejected at that level. The regression line also deviates from the prediction of the classical expansion hypothesis. Better agreement was obtained using the chronogeometric theory of I. E. Segal (1972, Astron. and Astrophys. 18, 143), but the scatter in the present data makes it impossible to distinguish between these alternate hypotheses at the 95 percent confidence level. The question of how many additional light curves would be needed to give definite tests is addressed. It is shown that at the present rate of supernova discoveries, only a few more years would be required to obtain the necessary data if light curves are systematically measured for the more distant supernovae. (Diss. Abstr. Int., B)

  15. About a sequential method for non destructive testing of structures by mechanical vibrations

    International Nuclear Information System (INIS)

    Suarez Antola, R.

    2001-01-01

    The presence and growth of cracks, voids, or fields of pores under applied forces or environmental actions can produce a meaningful lowering in the proper frequencies of the normal modes of mechanical vibration of a structure. A quite general expression for the square of a mode's proper frequency, as a functional of the displacement field, density field, and elastic moduli fields, is used as a starting point. The effects of defects on frequency are modeled as equivalent changes in the density and elastic moduli fields, introducing the concept of the region of influence of each defect. An approximate expression is obtained which relates the relative lowering in the square of a mode's proper frequency to the position, size, shape, and orientation of defects in the mode displacement field. Some simple examples of structural elements with cracks or fields of pores are considered. The connection with linear elastic fracture mechanics is briefly exemplified. A sequential method is proposed for non-destructive testing of structures using mechanical vibrations combined with properly chosen local non-destructive testing methods.

  16. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  17. Sequential x-ray diffraction topography at 1-BM x-ray optics testing beamline at the advanced photon source

    Energy Technology Data Exchange (ETDEWEB)

    Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Shvyd’ko, Yuri; Trakhtenberg, Emil; Liu, Zunping; Lang, Keenan; Huang, Xianrong; Wieczorek, Michael; Kasman, Elina; Hammonds, John; Macrander, Albert; Assoufid, Lahsen [Advanced Photon Source, Argonne National Laboratory, Argonne, IL 60439 (United States)

    2016-07-27

    We report progress on implementation and commissioning of sequential X-ray diffraction topography at 1-BM Optics Testing Beamline of the Advanced Photon Source to accommodate growing needs of strain characterization in diffractive crystal optics and other semiconductor single crystals. The setup enables evaluation of strain in single crystals in the nearly-nondispersive double-crystal geometry. Si asymmetric collimator crystals of different crystallographic orientations were designed, fabricated and characterized using in-house capabilities. Imaging the exit beam using digital area detectors permits rapid sequential acquisition of X-ray topographs at different angular positions on the rocking curve of a crystal under investigation. Results on sensitivity and spatial resolution are reported based on experiments with high-quality Si and diamond crystals. The new setup complements laboratory-based X-ray topography capabilities of the Optics group at the Advanced Photon Source.

  18. Statistical hypothesis testing and common misinterpretations: Should we abandon p-value in forensic science applications?

    Science.gov (United States)

    Taroni, F; Biedermann, A; Bozza, S

    2016-02-01

    Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for taking a decision about the plausibility of competing hypotheses. Comprehensive philosophical comparisons about their advantages and drawbacks are widely available and continue to fuel extensive debates in the literature. More recently, controversial discussion was initiated by an editorial decision of a scientific journal [1] to refuse any paper submitted for publication containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on the so called p-values, it is of interest to expose the discussion of this journal's decision within the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Test of the hypothesis: a lymphoma stem cell exists which is capable of self-renewal

    DEFF Research Database (Denmark)

    Kjeldsen, Malene Krag

      Test of the hypothesis: a lymphoma stem cell exists which is capable of self-renewal   Malene Krag Pedersen, Karen Dybkaer, Hans E. Johnsen   The Research Laboratory, Department of Haematology, Aalborg Hospital, Århus University   Failure of current therapeutics in the treatment of diffuse large B...... and sustaining cells(1-3). My project is based on studies of stem and early progenitor cells in lymphoid cell lines from patients with advanced DLBCL. The cell lines are recognised worldwide and generously provided by Dr. Hans Messner and colleagues.   Hypothesis and aims: A lymphoma stem and progenitor cell...

  20. Sequential Power-Dependence Theory

    NARCIS (Netherlands)

    Buskens, Vincent; Rijt, Arnout van de

    2008-01-01

    Existing methods for predicting resource divisions in laboratory exchange networks do not take into account the sequential nature of the experimental setting. We extend network exchange theory by considering sequential exchange. We prove that Sequential Power-Dependence Theory—unlike

  1. A test of the thermal melanism hypothesis in the wingless grasshopper Phaulacridium vittatum.

    Science.gov (United States)

    Harris, Rebecca M; McQuillan, Peter; Hughes, Lesley

    2013-01-01

    Altitudinal clines in melanism are generally assumed to reflect the fitness benefits resulting from thermal differences between colour morphs, yet differences in thermal quality are not always discernible. The intra-specific application of the thermal melanism hypothesis was tested in the wingless grasshopper Phaulacridium vittatum (Sjöstedt) (Orthoptera: Acrididae) first by measuring the thermal properties of the different colour morphs in the laboratory, and second by testing for differences in average reflectance and spectral characteristics of populations along 14 altitudinal gradients. Correlations between reflectance, body size, and climatic variables were also tested to investigate the underlying causes of clines in melanism. Melanism in P. vittatum represents a gradation in colour rather than distinct colour morphs, with reflectance ranging from 2.49 to 5.65%. In unstriped grasshoppers, darker morphs warmed more rapidly than lighter morphs and reached a higher maximum temperature (lower temperature excess). In contrast, significant differences in thermal quality were not found between the colour morphs of striped grasshoppers. In support of the thermal melanism hypothesis, grasshoppers were, on average, darker at higher altitudes, there were differences in the spectral properties of brightness and chroma between high and low altitudes, and temperature variables were significant influences on the average reflectance of female grasshoppers. However, altitudinal gradients do not represent predictable variation in temperature, and the relationship between melanism and altitude was not consistent across all gradients. Grasshoppers generally became darker at altitudes above 800 m a.s.l., but on several gradients reflectance declined with altitude and then increased at the highest altitude.

  2. Configural and component processing in simultaneous and sequential lineup procedures.

    Science.gov (United States)

    Flowe, Heather D; Smith, Harriet M J; Karoğlu, Nilda; Onwuegbusi, Tochukwu O; Rai, Lovedeep

    2016-01-01

    Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences. We had participants view a crime video, and then they attempted to identify the perpetrator from a simultaneous or sequential lineup. The test faces were presented either upright or inverted, as previous research has shown that inverting test faces disrupts configural processing. The size of the inversion effect for faces was the same across lineup procedures, indicating that configural processing underlies face recognition in both procedures. Discrimination accuracy was comparable across lineup procedures in both the upright and inversion condition. Theoretical implications of the results are discussed.

  3. Construction of computational program of aging in insulating materials for searching reversed sequential test conditions to give damage equivalent to simultaneous exposure of heat and radiation

    International Nuclear Information System (INIS)

    Fuse, Norikazu; Homma, Hiroya; Okamoto, Tatsuki

    2013-01-01

    Two consecutive numerical calculations on the degradation of polymeric insulations under a combined thermal and radiation environment are carried out to simulate the so-called reversed sequential acceleration test. The aim of the calculation is to search for testing conditions which provide material damage equivalent to the case of simultaneous exposure to heat and radiation. At least the following four parameters need to be considered in the sequential method: dose rate and exposure time for irradiation, as well as temperature and aging time for heating. The present paper discusses the handling of these parameters and shows some trial calculation results. (author)
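
    As a rough illustration of the parameter search described, the sketch below assumes a purely additive damage model with an Arrhenius temperature term and solves for the heating time that reproduces a simultaneous-exposure damage level. All rate constants and the activation energy are hypothetical, and a real calculation must also capture the heat/radiation synergy that an additive model ignores.

```python
import numpy as np

K_B = 8.617e-5            # Boltzmann constant (eV/K)
E_A = 0.8                 # assumed activation energy (eV)
K_RAD, K_TH = 1e-3, 1e9   # assumed rate constants (hypothetical)

def damage(dose_rate, t_rad, temp_k, t_heat):
    # Additive toy model: radiation damage plus Arrhenius thermal damage.
    radiation = K_RAD * dose_rate * t_rad
    thermal = K_TH * np.exp(-E_A / (K_B * temp_k)) * t_heat
    return radiation + thermal

# Reference: damage after simultaneous exposure (heat and radiation together).
target = damage(dose_rate=100.0, t_rad=500.0, temp_k=400.0, t_heat=500.0)

# Sequential schedule: same radiation step, then solve for the heating time
# that reproduces the simultaneous-exposure damage at each temperature.
for temp in (380.0, 400.0, 420.0):
    residual = target - damage(100.0, 500.0, temp, 0.0)
    t_heat = residual / (K_TH * np.exp(-E_A / (K_B * temp)))
    print(f"T = {temp:.0f} K -> equivalent heating time {t_heat:,.0f} h")
```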

  4. Life Shocks and Crime: A Test of the “Turning Point” Hypothesis

    Science.gov (United States)

    Noonan, Kelly; Reichman, Nancy E.; Schwartz-Soicher, Ofira

    2012-01-01

    Other researchers have posited that important events in men’s lives—such as employment, marriage, and parenthood—strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant’s father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the “turning point” hypothesis. PMID:21660628

  5. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated...... a microscope, and we show how the method can handle transparent particles with significant glare points. The method generalizes to other problems. This is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation......
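
    A minimal sketch of the segmentation-as-outlier-detection idea: estimate the background intensity distribution and flag pixels that are improbable under it. The Gaussian background model and the robust median/MAD fit are assumptions of this sketch, not necessarily the authors' estimator.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
image = rng.normal(100.0, 5.0, (64, 64))    # synthetic background intensities
image[20:30, 20:30] += 40.0                 # bright segment of interest

# Robust background fit: median and MAD are barely affected by the segment.
mu = np.median(image)
sigma = stats.median_abs_deviation(image, axis=None, scale="normal")

# Per-pixel upper-tail p-value under the background model; pixels that are
# sufficiently improbable are declared outliers, i.e. the segment.
p = stats.norm.sf(image, loc=mu, scale=sigma)
mask = p < 1e-4
print(f"segmented {mask.sum()} pixels (true segment has 100)")
```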

  6. Sequential boundaries approach in clinical trials with unequal allocation ratios

    Directory of Open Access Journals (Sweden)

    Ayatollahi Seyyed

    2006-01-01

    Background: In clinical trials, both unequal randomization designs and sequential analyses have ethical and economic advantages. In the single-stage design (SSD), however, if the sample size is not adjusted based on unequal randomization, the power of the trial will decrease, whereas with sequential analysis the power will always remain constant. Our aim was to compare the sequential boundaries approach with the SSD when the allocation ratio (R) was not equal. Methods: We evaluated the influence of R, the ratio of patients in the experimental group to the standard group, on the statistical properties of two-sided tests, including the two-sided single triangular test (TT), the double triangular test (DTT), and the SSD, by multiple simulations. The average sample numbers (ASNs) and power (1-β) were evaluated for all tests. Results: Our simulation study showed that choosing R = 2 instead of R = 1 increases the sample size of the SSD by 12% and the ASN of the TT and DTT by the same proportion. Moreover, when R = 2, using the TT or DTT recovers, relative to the adjusted SSD, the well-known reductions in ASN observed for R = 1 relative to the SSD. In addition, when R = 2, the TT and DTT yield smaller reductions in ASN relative to the SSD than when R = 1, but maintain the power of the test at its planned value. Conclusion: This study indicates that when the allocation ratio is not equal among the treatment groups, sequential analysis can indeed serve as a compromise between ethicists, economists, and statisticians.
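
    The triangular tests above require purpose-built boundaries; as a simpler relative from the same sequential-testing family, the sketch below simulates Wald's SPRT for a normal mean to show how a sequential stopping rule reduces the average sample number (ASN) while holding the error rates near their planned values. The hypotheses and error rates are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, beta = 0.05, 0.20            # planned error rates
mu0, mu1, sd = 0.0, 0.5, 1.0        # null and alternative normal means
a = np.log((1.0 - beta) / alpha)    # upper boundary: accept H1
b = np.log(beta / (1.0 - alpha))    # lower boundary: accept H0

def sprt(true_mu):
    # Accumulate the log-likelihood ratio one observation at a time.
    llr, n = 0.0, 0
    while b < llr < a:
        x = rng.normal(true_mu, sd)
        n += 1
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sd**2
    return n, llr >= a

runs = [sprt(mu1) for _ in range(5000)]
print(f"ASN under H1: {np.mean([n for n, _ in runs]):.1f}")
print(f"empirical power: {np.mean([hit for _, hit in runs]):.3f}")
# For comparison, a fixed-sample z-test with the same error rates needs
# about ((1.645 + 0.842) * sd / (mu1 - mu0))**2, roughly 25 observations.
```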

  7. Neural mechanisms of peristalsis in the isolated rabbit distal colon: a neuromechanical loop hypothesis.

    Science.gov (United States)

    Dinning, Phil G; Wiklendt, Lukasz; Omari, Taher; Arkwright, John W; Spencer, Nick J; Brookes, Simon J H; Costa, Marcello

    2014-01-01

    Propulsive contractions of circular muscle are largely responsible for the movements of content along the digestive tract. Mechanical and electrophysiological recordings of isolated colonic circular muscle have demonstrated that localized distension activates ascending and descending interneuronal pathways, evoking contraction orally and relaxation anally. These polarized enteric reflex pathways can theoretically be sequentially activated by the mechanical stimulation of the advancing contents. Here, we test the hypothesis that initiation and propagation of peristaltic contractions involve a neuromechanical loop: an initial gut distension activates local and oral reflex contraction and anal reflex relaxation; the subsequent movement of content then acts as a new mechanical stimulus, triggering sequential reflex contractions/relaxations at each point of the gut and resulting in a propulsive peristaltic contraction. In the fluid-filled isolated rabbit distal colon, we combined spatiotemporal mapping of gut diameter and intraluminal pressure with a new analytical method, allowing us to identify when and where active (neurally driven) contraction or relaxation occurs. Our data indicate that gut dilation is associated with propagating peristaltic contractions, and that the associated level of dilation is greater than that preceding non-propagating contractions (2.7 ± 1.4 mm vs. 1.6 ± 1.2 mm). Dilation activates polarized enteric circuits; these produce propulsion of the bolus, which in turn activates further polarized enteric circuits anally by distension, thus closing the neuromechanical loop.

  8. Testing the junk-food hypothesis on marine birds: Effects of prey type on growth and development

    Science.gov (United States)

    Romano, Marc D.; Piatt, John F.; Roby, D.D.

    2006-01-01

    The junk-food hypothesis attributes declines in productivity of marine birds and mammals to changes in the species of prey they consume and corresponding differences in nutritional quality of those prey. To test this hypothesis, nestling Black-legged Kittiwakes (Rissa tridactyla) and Tufted Puffins (Fratercula cirrhata) were raised in captivity under controlled conditions to determine whether the type and quality of fish consumed by young seabirds constrain their growth and development. Some nestlings were fed rations of Capelin (Mallotus villosus), Herring (Clupea pallasi), or Sand Lance (Ammodytes hexapterus) and their growth was compared with that of nestlings raised on equal biomass rations of Walleye Pollock (Theragra chalcograma). Nestlings fed rations of herring, sand lance, or capelin experienced higher growth increments than nestlings fed pollock. The energy density of forage fish fed to nestlings had a marked effect on growth increments and could be expected to have an effect on pre- and post-fledging survival of nestlings in the wild. These results provide empirical support for the junk-food hypothesis.

  9. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  10. Visual short-term memory for sequential arrays.

    Science.gov (United States)

    Kumar, Arjun; Jiang, Yuhong

    2005-04-01

    The capacity of visual short-term memory (VSTM) for a single visual display has been investigated in past research, but VSTM for multiple sequential arrays has been explored only recently. In this study, we investigate the capacity of VSTM across two sequential arrays separated by a variable stimulus onset asynchrony (SOA). VSTM for spatial locations (Experiment 1), colors (Experiments 2-4), orientations (Experiments 3 and 4), and conjunction of color and orientation (Experiment 4) were tested, with the SOA across the two sequential arrays varying from 100 to 1,500 msec. We find that VSTM for the trailing array is much better than VSTM for the leading array, but when averaged across the two arrays VSTM has a constant capacity independent of the SOA. We suggest that multiple displays compete for retention in VSTM and that separating information into two temporally discrete groups does not enhance the overall capacity of VSTM.

  11. Testing the EKC hypothesis by considering trade openness, urbanization, and financial development: the case of Turkey.

    Science.gov (United States)

    Ozatac, Nesrin; Gokmenoglu, Korhan K; Taspinar, Nigar

    2017-07-01

    This study investigates the environmental Kuznets curve (EKC) hypothesis for the case of Turkey from 1960 to 2013 by considering energy consumption, trade, urbanization, and financial development variables. Although previous literature examines various aspects of the EKC hypothesis for the case of Turkey, our model augments the basic model with several covariates to develop a better understanding of the relationship among the variables and to refrain from omitted variable bias. The results of the bounds test and the error correction model under autoregressive distributed lag mechanism suggest long-run relationships among the variables as well as proof of the EKC and the scale effect in Turkey. A conditional Granger causality test reveals that there are causal relationships among the variables. Our findings can have policy implications including the imposition of a "polluter pays" mechanism, such as the implementation of a carbon tax for pollution trading, to raise the urban population's awareness about the importance of adopting renewable energy and to support clean, environmentally friendly technology.

  12. Personality and Behavior in Social Dilemmas: Testing the Situational Strength Hypothesis and the Role of Hypothetical Versus Real Incentives.

    Science.gov (United States)

    Lozano, José H

    2016-02-01

    Previous research aimed at testing the situational strength hypothesis suffers from serious limitations regarding the conceptualization of strength. In order to overcome these limitations, the present study attempts to test the situational strength hypothesis based on the operationalization of strength as reinforcement contingencies. One dispositional factor of proven effect on cooperative behavior, social value orientation (SVO), was used as a predictor of behavior in four social dilemmas with varying degree of situational strength. The moderating role of incentive condition (hypothetical vs. real) on the relationship between SVO and behavior was also tested. One hundred undergraduates were presented with the four social dilemmas and the Social Value Orientation Scale. One-half of the sample played the social dilemmas using real incentives, whereas the other half used hypothetical incentives. Results supported the situational strength hypothesis in that no behavioral variability and no effect of SVO on behavior were found in the strongest situation. However, situational strength did not moderate the effect of SVO on behavior in situations where behavior showed variability. No moderating effect was found for incentive condition either. The implications of these results for personality theory and assessment are discussed. © 2014 Wiley Periodicals, Inc.

  13. A test of the symbol interdependency hypothesis with both concrete and abstract stimuli

    Science.gov (United States)

    Buchanan, Lori

    2018-01-01

    In Experiment 1, the symbol interdependency hypothesis was tested with both concrete and abstract stimuli. Symbolic (i.e., semantic neighbourhood distance) and embodied (i.e., iconicity) factors were manipulated in two tasks—one that tapped symbolic relations (i.e., semantic relatedness judgment) and another that tapped embodied relations (i.e., iconicity judgment). Results supported the symbol interdependency hypothesis in that the symbolic factor was recruited for the semantic relatedness task and the embodied factor was recruited for the iconicity task. Across tasks, and especially in the iconicity task, abstract stimuli resulted in shorter RTs. This finding was in contrast to the concreteness effect where concrete words result in shorter RTs. Experiment 2 followed up on this finding by replicating the iconicity task from Experiment 1 in an ERP paradigm. Behavioural results continued to show a reverse concreteness effect with shorter RTs for abstract stimuli. However, ERP results paralleled the N400 and anterior N700 concreteness effects found in the literature, with more negative amplitudes for concrete stimuli. PMID:29590121

  14. A test of the symbol interdependency hypothesis with both concrete and abstract stimuli.

    Science.gov (United States)

    Malhi, Simritpal Kaur; Buchanan, Lori

    2018-01-01

    In Experiment 1, the symbol interdependency hypothesis was tested with both concrete and abstract stimuli. Symbolic (i.e., semantic neighbourhood distance) and embodied (i.e., iconicity) factors were manipulated in two tasks-one that tapped symbolic relations (i.e., semantic relatedness judgment) and another that tapped embodied relations (i.e., iconicity judgment). Results supported the symbol interdependency hypothesis in that the symbolic factor was recruited for the semantic relatedness task and the embodied factor was recruited for the iconicity task. Across tasks, and especially in the iconicity task, abstract stimuli resulted in shorter RTs. This finding was in contrast to the concreteness effect where concrete words result in shorter RTs. Experiment 2 followed up on this finding by replicating the iconicity task from Experiment 1 in an ERP paradigm. Behavioural results continued to show a reverse concreteness effect with shorter RTs for abstract stimuli. However, ERP results paralleled the N400 and anterior N700 concreteness effects found in the literature, with more negative amplitudes for concrete stimuli.

  15. Testing the ‘Residential Rootedness’-Hypothesis of Self-Employment for Germany and the UK (discussion paper)

    NARCIS (Netherlands)

    Reuschke, D.; Van Ham, M.

    2011-01-01

    Based on the notion that entrepreneurship is a ‘local event’, the literature argues that selfemployed workers and entrepreneurs are ‘rooted’ in place. This paper tests the ‘residential rootedness’-hypothesis of self-employment by examining for Germany and the UK whether the self-employed are less

  16. Robust Means Modeling: An Alternative for Hypothesis Testing of Independent Means under Variance Heterogeneity and Nonnormality

    Science.gov (United States)

    Fan, Weihua; Hancock, Gregory R.

    2012-01-01

    This study proposes robust means modeling (RMM) approaches for hypothesis testing of mean differences for between-subjects designs in order to control the biasing effects of nonnormality and variance inequality. Drawing from structural equation modeling (SEM), the RMM approaches make no assumption of variance homogeneity and employ robust…

  17. Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?

    Science.gov (United States)

    Weinberg, Clarice R.

    2017-01-01

    In the accompanying article (Am J Epidemiol. 2017;186(6):646–647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our “culture” of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls “innovative” research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. PMID:28938713

  18. Neuroticism, intelligence, and intra-individual variability in elementary cognitive tasks: testing the mental noise hypothesis.

    Science.gov (United States)

    Colom, Roberto; Quiroga, Ma Angeles

    2009-08-01

    Some studies show positive correlations between intraindividual variability in elementary speed measures (reflecting processing efficiency) and individual differences in neuroticism (reflecting instability in behaviour). The so-called neural noise hypothesis assumes that higher levels of noise are related both to smaller indices of processing efficiency and to greater levels of neuroticism. Here, we test this hypothesis by measuring mental speed with three elementary cognitive tasks that tap similar basic processes but vary systematically in their content (verbal, numerical, and spatial). Neuroticism and intelligence are also measured. The sample comprised 196 undergraduate psychology students. The results show that (1) processing efficiency is generally unrelated to individual differences in neuroticism, (2) processing speed and efficiency correlate with intelligence, and (3) only the efficiency index is genuinely related to intelligence when the collinearity between speed and efficiency is controlled.

  19. Mirror neurons, birdsong, and human language: a hypothesis.

    Science.gov (United States)

    Levy, Florence

    2011-01-01

    The mirror system hypothesis and investigations of birdsong are reviewed in relation to their significance for the development of human symbolic and language capacity, in terms of three fundamental forms of cognitive reference: iconic, indexical, and symbolic. Mirror systems are initially iconic but can progress to indexical reference when produced without the need for concurrent stimuli. Developmental stages in birdsong are also explored with reference to juvenile subsong vs complex stereotyped adult syllables, as an analogy with human language development. While birdsong remains at an indexical reference stage, human language benefits from the capacity for symbolic reference. During a pre-linguistic "babbling" stage, recognition of native phonemic categories is established, allowing further development of subsequent prefrontal and linguistic circuits for sequential language capacity.

  20. Bootstrap Sequential Determination of the Co-integration Rank in VAR Models

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A.M. Robert

    with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....

  1. Testing the stress-gradient hypothesis during the restoration of tropical degraded land using the shrub Rhodomyrtus tomentosa as a nurse plant

    Science.gov (United States)

    Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang

    2013-01-01

    The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...

  2. Is intuition really cooperative? Improved tests support the social heuristics hypothesis.

    Science.gov (United States)

    Isler, Ozan; Maule, John; Starmer, Chris

    2018-01-01

    Understanding human cooperation is a major scientific challenge. While cooperation is typically explained with reference to individual preferences, a recent cognitive process view hypothesized that cooperation is regulated by socially acquired heuristics. Evidence for the social heuristics hypothesis rests on experiments showing that time-pressure promotes cooperation, a result that can be interpreted as demonstrating that intuition promotes cooperation. This interpretation, however, is highly contested because of two potential confounds. First, in pivotal studies compliance with time-limits is low and, crucially, evidence shows intuitive cooperation only when noncompliant participants are excluded. The inconsistency of test results has led to the currently unresolved controversy regarding whether or not noncompliant subjects should be included in the analysis. Second, many studies show high levels of social dilemma misunderstanding, leading to speculation that asymmetries in understanding might explain patterns that are otherwise interpreted as intuitive cooperation. We present evidence from an experiment that employs an improved time-pressure protocol with new features designed to induce high levels of compliance and clear tests of understanding. Our study resolves the noncompliance issue, shows that misunderstanding does not confound tests of intuitive cooperation, and provides the first independent experimental evidence for intuitive cooperation in a social dilemma using time-pressure.

  3. Ion microprobe analyses of aluminous lunar glasses - A test of the 'rock type' hypothesis

    Science.gov (United States)

    Meyer, C., Jr.

    1978-01-01

    Previous soil survey investigations found that there are natural groupings of glass compositions in lunar soils and that the average major element composition of some of these groupings is the same at widely separated lunar landing sites. This led soil survey enthusiasts to promote the hypothesis that the average composition of glass groupings represents the composition of primary lunar 'rock types'. In this investigation the trace element composition of numerous aluminous glass particles was determined by the ion microprobe method as a test of the above mentioned 'rock type' hypothesis. It was found that within any grouping of aluminous lunar glasses by major element content, there is considerable scatter in the refractory trace element content. In addition, aluminous glasses grouped by major elements were found to have different average trace element contents at different sites (Apollo 15, 16 and Luna 20). This evidence argues that natural groupings in glass compositions are determined by regolith processes and may not represent the composition of primary lunar 'rock types'.

  4. Visual working memory and number sense: Testing the double deficit hypothesis in mathematics.

    Science.gov (United States)

    Toll, Sylke W M; Kroesbergen, Evelyn H; Van Luit, Johannes E H

    2016-09-01

    Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. The aim of this study was to test the DD hypothesis within a longitudinal time span of 2 years. A total of 670 children participated. The mean age was 4.96 years at the start of the study and 7.02 years at the end of the study. At the end of the first year of kindergarten, both visual-spatial working memory and number sense were measured by two different tasks. At the end of first grade, mathematical performance was measured with two tasks, one for math facts and one for math problems. Multiple regressions revealed that both visual working memory and symbolic number sense are predictors of mathematical performance in first grade. Symbolic number sense appears to be the strongest predictor for both math areas (math facts and math problems). Non-symbolic number sense only predicts performance in math problems. Multivariate analyses of variance showed that a combination of visual working memory and number sense deficits (NSDs) leads to the lowest performance on mathematics. Our DD hypothesis was confirmed. Both visual working memory and symbolic number sense in kindergarten are related to mathematical performance 2 years later, and a combination of visual working memory and NSDs leads to low performance in mathematical performance. © 2016 The British Psychological Society.

  5. Recurrence network measures for hypothesis testing using surrogate data: Application to black hole light curves

    Science.gov (United States)

    Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2018-01-01

    Recurrence networks and the associated statistical measures have become important tools in the analysis of time series data. In this work, we test how effective the recurrence network measures are in analyzing real world data involving two main types of noise, white noise and colored noise. We use two prominent network measures as discriminating statistics for hypothesis testing using surrogate data, for the specific null hypothesis that the data are derived from a linear stochastic process. We show that the characteristic path length is especially efficient as a discriminating measure, with the conclusions reasonably accurate even with a limited number of data points in the time series. We also highlight an additional advantage of the network approach in identifying the dimensionality of the system underlying the time series through a convergence measure derived from the probability distribution of the local clustering coefficients. As examples of real world data, we use the light curves from a prominent black hole system and show that a combined analysis using three primary network measures can provide vital information regarding the nature of temporal variability of light curves from different spectroscopic classes.
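
    A hedged sketch of the discriminating statistic used in this work: build an ε-recurrence network from a time-delay embedding of a series and compute its characteristic path length. The embedding parameters and the 5% recurrence-rate heuristic for ε are illustrative choices, not the authors' settings.

```python
import numpy as np
import networkx as nx

def recurrence_network(x, dim=3, tau=5, eps=None):
    # Time-delay embedding of the scalar series x.
    n = len(x) - (dim - 1) * tau
    vectors = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    if eps is None:
        eps = np.percentile(d[d > 0], 5)      # ~5% recurrence-rate heuristic
    adj = (d < eps) & ~np.eye(n, dtype=bool)  # link epsilon-close states
    return nx.from_numpy_array(adj.astype(int))

rng = np.random.default_rng(4)
x = np.sin(0.2 * np.arange(800)) + 0.1 * rng.normal(size=800)
g = recurrence_network(x)
# Characteristic path length is defined on the largest connected component.
giant = g.subgraph(max(nx.connected_components(g), key=len))
print(f"characteristic path length: {nx.average_shortest_path_length(giant):.2f}")
```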

  6. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

    Background: Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation of the hypothesis: We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting the premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing the hypothesis: Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications of the hypothesis: It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate the development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  7. Efficient Market Hypothesis in South Africa: Evidence from Linear and Nonlinear Unit Root Tests

    Directory of Open Access Journals (Sweden)

    Andrew Phiri

    2015-12-01

    This study investigates the weak form efficient market hypothesis (EMH) for five generalized stock indices in the Johannesburg Stock Exchange (JSE) using weekly data collected from 31st January 2000 to 16th December 2014. In particular, we test for weak form market efficiency using a battery of linear and nonlinear unit root testing procedures comprising the classical augmented Dickey-Fuller (ADF) tests, the two-regime threshold autoregressive (TAR) unit root tests described in Enders and Granger (1998), as well as the three-regime unit root tests described in Bec, Salem, and Carrasco (2004). Based on our empirical analysis, we are able to demonstrate that whilst the linear unit root tests advocate for unit roots within the time series, the nonlinear unit root tests suggest that most stock indices are threshold stationary processes. These results bridge two opposing contentions obtained from previous studies by concluding that under a linear framework the JSE stock indices offer support in favour of weak form market efficiency, whereas when nonlinearity is accounted for, a majority of the indices violate the weak form EMH.
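
    For the linear part of such an analysis, a minimal sketch: an augmented Dickey-Fuller test applied to a simulated weekly log price index. Failing to reject the unit root (a high p-value) is the pattern consistent with weak-form efficiency under the linear framework; the series here is synthetic, and the threshold (TAR) tests used in the paper go beyond this sketch.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
# Simulated weekly log price index: a pure random walk, i.e. a unit root.
log_price = np.cumsum(rng.normal(0.0, 0.02, 780))

stat, pvalue, usedlag, nobs, crit, _ = adfuller(log_price, regression="c")
print(f"ADF statistic: {stat:.2f}  (5% critical value: {crit['5%']:.2f})")
print(f"p-value: {pvalue:.3f}")
# A large p-value means the unit root cannot be rejected, which is the
# pattern the linear tests in the study report for the JSE indices.
```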

  8. The Bacterial Sequential Markov Coalescent.

    Science.gov (United States)

    De Maio, Nicola; Wilson, Daniel J

    2017-05-01

    Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e., using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding as it needs to account for the correlations of evolutionary histories of even distant loci. So, with the increasing popularity of whole genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on a similar idea to the sequential Markov coalescent (SMC), an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is

  9. Monte Carlo hypothesis tests for rare events superimposed on a background

    International Nuclear Information System (INIS)

    Avignone, F.T. III; Miley, H.S.; Padgett, W.J.; Weier, D.W.

    1985-01-01

    We describe two techniques to search for small numbers of counts under a peak of known shape superimposed on a background with statistical fluctuations. Many comparisons of a single experimental spectrum with computer simulations of the peak and background are made. From these we calculate the probability that y hypothesized counts in the peaks of the simulations will result in a number larger than that observed in a given energy interval (bin) in the experimental spectrum. This is done for many values of the hypothesized number y. One procedure is very similar to testing a statistical hypothesis and can be applied analytically. Another is presented which is related to pattern recognition techniques and is less sensitive to the uncertainty in the mean. Sample applications to double beta decay data are presented. (orig.)
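
    A minimal sketch of the Monte Carlo procedure described: simulate many spectra containing y hypothesized signal counts with a known peak shape on a Poisson background, and estimate the probability that the count in the peak window exceeds the experimentally observed number. The peak shape, background rate, and observed count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
bins = np.arange(100)
background = np.full(100, 4.0)                    # mean background per bin
peak_shape = np.exp(-0.5 * ((bins - 50) / 2.0) ** 2)
peak_shape /= peak_shape.sum()                    # normalized peak profile
window = slice(46, 55)                            # bins spanning the peak
observed = 45                                     # counts seen in the window

for y in (0, 10, 20, 30):
    # 20,000 simulated spectra with y signal counts on the Poisson background.
    sims = rng.poisson(background + y * peak_shape, size=(20000, 100))
    exceed = (sims[:, window].sum(axis=1) > observed).mean()
    print(f"y = {y:2d} signal counts -> P(window count > observed) = {exceed:.3f}")
```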

  10. Early lymphocyte recovery after intensive timed sequential chemotherapy for acute myelogenous leukemia: peripheral oligoclonal expansion of regulatory T cells.

    Science.gov (United States)

    Kanakry, Christopher G; Hess, Allan D; Gocke, Christopher D; Thoburn, Christopher; Kos, Ferdynand; Meyer, Christian; Briel, Janet; Luznik, Leo; Smith, B Douglas; Levitsky, Hyam; Karp, Judith E

    2011-01-13

    Few published studies characterize early lymphocyte recovery after intensive chemotherapy for acute myelogenous leukemia (AML). To test the hypothesis that lymphocyte recovery mirrors ontogeny, we characterized early lymphocyte recovery in 20 consecutive patients undergoing induction timed sequential chemotherapy for newly diagnosed AML. Recovering T lymphocytes were predominantly CD4(+) and included a greatly expanded population of CD3(+)CD4(+)CD25(+)Foxp3(+) T cells. Recovering CD3(+)CD4(+)CD25(+)Foxp3(+) T cells were phenotypically activated regulatory T cells and showed suppressive activity on cytokine production in a mixed lymphocyte reaction. Despite an initial burst of thymopoiesis, most recovering regulatory T cells were peripherally derived. Furthermore, regulatory T cells showed marked oligoclonal skewing, suggesting that their peripheral expansion was antigen-driven. Overall, lymphocyte recovery after chemotherapy differs from ontogeny, specifically identifying a peripherally expanded oligoclonal population of activated regulatory T lymphocytes. These differences suggest a stereotyped immunologic recovery shared by patients with newly diagnosed AML after induction timed sequential chemotherapy. Further insight into this oligoclonal regulatory T-cell population will be fundamental toward developing effective immunomodulatory techniques to improve survival for patients with AML.

  11. Configural and component processing in simultaneous and sequential lineup procedures

    OpenAIRE

    Flowe, HD; Smith, HMJ; Karoğlu, N; Onwuegbusi, TO; Rai, L

    2015-01-01

    Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences...

  12. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    Science.gov (United States)

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman, and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I summarize conventional methods for model-based inference and suggest a contemporary approach, integrating ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics, to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims. © 2014 American Institute of Chemical Engineers.

  13. Testing the Developmental Origins of Health and Disease Hypothesis for Psychopathology Using Family-Based Quasi-Experimental Designs

    Science.gov (United States)

    D’Onofrio, Brian M.; Class, Quetzal A.; Lahey, Benjamin B.; Larsson, Henrik

    2014-01-01

    The Developmental Origin of Health and Disease (DOHaD) hypothesis is a broad theoretical framework that emphasizes how early risk factors have a causal influence on psychopathology. Researchers have raised concerns about the causal interpretation of statistical associations between early risk factors and later psychopathology because most existing studies have been unable to rule out the possibility of environmental and genetic confounding. In this paper we illustrate how family-based quasi-experimental designs can test the DOHaD hypothesis by ruling out alternative hypotheses. We review the logic underlying sibling-comparison, co-twin control, offspring of siblings/twins, adoption, and in vitro fertilization designs. We then present results from studies using these designs focused on broad indices of fetal development (low birth weight and gestational age) and a particular teratogen, smoking during pregnancy. The results provide mixed support for the DOHaD hypothesis for psychopathology, illustrating the critical need to use design features that rule out unmeasured confounding. PMID:25364377

  14. Test of the decaying dark matter hypothesis using the Hopkins Ultraviolet Telescope

    Science.gov (United States)

    Davidsen, A. F.; Kriss, G. A.; Ferguson, H. C.; Blair, W. P.; Bowers, C. W.; Kimble, R. A.

    1991-01-01

    Sciama's hypothesis, that the dark matter associated with galaxies, galaxy clusters, and the intergalactic medium consists of tau neutrinos of rest mass 28-30 eV whose decay generates ultraviolet photons of energy roughly 14-15 eV, has been tested using the Hopkins Ultraviolet Telescope flown aboard the Space Shuttle Columbia. A straightforward application of Sciama's model predicts that a spectral line from neutrino decay photons should be observed from the rich galaxy cluster Abell 665 with an SNR of about 30. No such emission was detected. For neutrinos in the mass range 27.2-32.1 eV, the observations set a lower lifetime limit significantly greater than Sciama's model requires.

  15. Effects of arousal on cognitive control: empirical tests of the conflict-modulated Hebbian-learning hypothesis.

    Science.gov (United States)

    Brown, Stephen B R E; van Steenbergen, Henk; Kedar, Tomer; Nieuwenhuis, Sander

    2014-01-01

    An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  16. Effects of arousal on cognitive control: Empirical tests of the conflict-modulated Hebbian-learning hypothesis

    Directory of Open Access Journals (Sweden)

    Stephen B.R.E. Brown

    2014-01-01

    An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  17. Molecular phylogeny of selected species of the order Dinophysiales (Dinophyceae) - testing the hypothesis of a Dinophysioid radiation

    DEFF Research Database (Denmark)

    Jensen, Maria Hastrup; Daugbjerg, Niels

    2009-01-01

    additional information on morphology and ecology to these evolutionary lineages. We have for the first time combined morphological information with molecular phylogenies to test the dinophysioid radiation hypothesis in a modern context. Nuclear-encoded LSU rDNA sequences including domains D1-D6 from 27...

  18. Mirror neurons, birdsong and human language: a hypothesis

    Directory of Open Access Journals (Sweden)

    Florence eLevy

    2012-01-01

    The Mirror System Hypothesis (MSH) and investigations of birdsong are reviewed in relation to their significance for the development of human symbolic and language capacity, in terms of three fundamental forms of cognitive reference: iconic, indexical, and symbolic. Mirror systems are initially iconic but can progress to indexical reference when produced without the need for concurrent stimuli. Developmental stages in birdsong are also explored with reference to juvenile subsong vs complex stereotyped adult syllables, as an analogy with human language development. While birdsong remains at an indexical reference stage, human language benefits from the capacity for symbolic reference. During a pre-linguistic 'babbling' stage, recognition of native phonemic categories is established, allowing further development of subsequent prefrontal and linguistic circuits for sequential language capacity.

  19. Modelling sequentially scored item responses

    NARCIS (Netherlands)

    Akkermans, W.

    2000-01-01

    The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is

  20. Empirical tests of the Chicago model and the Easterlin hypothesis: a case study of Japan.

    Science.gov (United States)

    Ohbuchi, H

    1982-05-01

    The objective of this discussion is to test the applicability of the economic theory of fertility, with special reference to postwar Japan, and to find a clue for forecasting the future trend of fertility. The theories examined are the "Chicago model" and the "Easterlin hypothesis." The major conclusion common among the leading economic theories of fertility, which have their origin with Gary S. Becker (1960, 1965) and Richard A. Easterlin (1966), is the positive income effect, i.e., that the relationship between income and fertility is positive despite the evidence that higher income families have fewer children and that fertility has declined with economic development. To bridge the gap between theory and fact is the primary purpose of the economic theory of fertility, and each offers a different interpretation for it. The point of the Chicago model, particularly of the household decision making model of the "new home economics," is the mechanism by which a positive effect of husband's income growth on fertility is offset by a negative price effect caused by the opportunity cost of wife's time. While the opportunity cost of wife's time is independent of the female wage rate for an unemployed wife, it is directly associated with the wage rate for a gainfully employed wife. Thus, the fertility response to female wages occurs only among families with an employed wife. The primary concern of empirical efforts to test the Chicago model has been with the determination of income and price elasticities. An attempt is made to test the relevance of the Chicago model and the Easterlin hypothesis in explaining the fertility movement in postwar Japan. In the case of the Chicago model, the statistical results appeared fairly successful but did not match the theory. The effect on fertility of a rise in women's real wage (and, therefore, in the opportunity cost of mother's time) and of a rise in the labor force participation rate of married women of childbearing age in recent years could not

  1. Feasibility of Combining Common Data Elements Across Studies to Test a Hypothesis.

    Science.gov (United States)

    Corwin, Elizabeth J; Moore, Shirley M; Plotsky, Andrea; Heitkemper, Margaret M; Dorsey, Susan G; Waldrop-Valverde, Drenna; Bailey, Donald E; Docherty, Sharron L; Whitney, Joanne D; Musil, Carol M; Dougherty, Cynthia M; McCloskey, Donna J; Austin, Joan K; Grady, Patricia A

    2017-05-01

    The purpose of this article is to describe the outcomes of a collaborative initiative to share data across five schools of nursing in order to evaluate the feasibility of collecting common data elements (CDEs) and developing a common data repository to test hypotheses of interest to nursing scientists. This initiative extended work already completed by the National Institute of Nursing Research CDE Working Group that successfully identified CDEs related to symptoms and self-management, with the goal of supporting more complex, reproducible, and patient-focused research. Two exemplars describing the group's efforts are presented. The first highlights a pilot study wherein data sets from various studies by the represented schools were collected retrospectively, and merging of the CDEs was attempted. The second exemplar describes the methods and results of an initiative at one school that utilized a prospective design for the collection and merging of CDEs. Methods for identifying a common symptom to be studied across schools and for collecting the data dictionaries for the related data elements are presented for the first exemplar. The processes for defining and comparing the concepts and acceptable values, and for evaluating the potential to combine and compare the data elements are also described. Presented next are the steps undertaken in the second exemplar to prospectively identify CDEs and establish the data dictionaries. Methods for common measurement and analysis strategies are included. Findings from the first exemplar indicated that without plans in place a priori to ensure the ability to combine and compare data from disparate sources, doing so retrospectively may not be possible, and as a result hypothesis testing across studies may be prohibited. Findings from the second exemplar, however, indicated that a plan developed prospectively to combine and compare data sets is feasible and conducive to merged hypothesis testing. Although challenges exist in

  2. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
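
    As a rough illustration of the weighting step described in the abstract, evidence items can carry a support/refute polarity and a weight that are folded into a single score. The names and the combination rule below are assumptions for exposition only, not the patented method:

```python
# Illustrative sketch only: a minimal evidence-weighting scheme in the spirit
# of the abstract. The class, field names, and scoring rule are assumptions,
# not the patented method.
from dataclasses import dataclass

@dataclass
class Evidence:
    description: str
    supports: bool   # True if the indicator supports the hypothesis
    weight: float    # strength of the association, in [0, 1]

def hypothesis_score(evidence: list[Evidence]) -> float:
    """Combine weighted support/refutation into a score in [-1, 1]."""
    if not evidence:
        return 0.0
    signed = sum(e.weight if e.supports else -e.weight for e in evidence)
    return signed / sum(e.weight for e in evidence)

score = hypothesis_score([
    Evidence("sensor reading consistent with hypothesis", True, 0.8),
    Evidence("witness report contradicts hypothesis", False, 0.3),
])
print(f"net weighted support: {score:+.2f}")
```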

  3. Sequential and Simultaneous Logit: A Nested Model.

    NARCIS (Netherlands)

    van Ophem, J.C.M.; Schram, A.J.H.C.

    1997-01-01

    A nested model is presented which has both the sequential and the multinomial logit model as special cases. This model provides a simple test to investigate the validity of these specifications. Some theoretical properties of the model are discussed. In the analysis a distribution function is
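
    The abstract does not reproduce the nested specification itself, but the two special cases it must contain are standard. Assuming alternatives j = 1, ..., J with covariates x and logistic cdf Λ (conventional notation, not necessarily the authors'):

```latex
% The two special cases the nested model must reduce to (standard forms;
% the paper's own bridging parameterization is not shown in the abstract).
\text{Multinomial logit:}\quad
P(y = j) = \frac{\exp(x'\beta_j)}{\sum_{k=1}^{J}\exp(x'\beta_k)},
\qquad
\text{Sequential logit:}\quad
P(y = j) = \Lambda(x'\gamma_j)\prod_{k<j}\bigl(1-\Lambda(x'\gamma_k)\bigr)
\quad (j < J),
% with the last alternative J receiving the remaining probability mass.
```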

  4. Testing the status-legitimacy hypothesis: A multilevel modeling approach to the perception of legitimacy in income distribution in 36 nations.

    Science.gov (United States)

    Caricati, Luca

    2017-01-01

    The status-legitimacy hypothesis was tested by analyzing cross-national data on social inequality. Several indicators were used as indexes of social advantage: social class, personal income, and self-position in the social hierarchy. Moreover, inequality and freedom in nations, as indexed by the Gini coefficient and the human freedom index, were considered. Results from 36 nations worldwide showed no support for the status-legitimacy hypothesis. The perception that income distribution was fair tended to increase as social advantage increased. Moreover, national context increased the difference between advantaged and disadvantaged people in the perception of social fairness: contrary to the status-legitimacy hypothesis, disadvantaged people were more likely than advantaged people to perceive income differences as too large, and this difference increased in nations with greater freedom and equality. The implications for the status-legitimacy hypothesis are discussed.

  5. Sequential versus Organized Rehearsal

    Science.gov (United States)

    Weist, Richard M.; Crawford, Charlotte

    1973-01-01

    The purpose of this research was to test the hypothesis that organization in rehearsal is a necessary condition for organization in recall; that is, if recall is organized, then rehearsal must have been organized. (Author)

  6. Sequential fragmentation of Pleistocene forests in an East Africa biodiversity hotspot: chameleons as a model to track forest history.

    Directory of Open Access Journals (Sweden)

    G John Measey

    Full Text Available The Eastern Arc Mountains (EAM is an example of naturally fragmented tropical forests, which contain one of the highest known concentrations of endemic plants and vertebrates. Numerous paleo-climatic studies have not provided direct evidence for ancient presence of Pleistocene forests, particularly in the regions in which savannah presently occurs. Knowledge of the last period when forests connected EAM would provide a sound basis for hypothesis testing of vicariance and dispersal models of speciation. Dated phylogenies have revealed complex patterns throughout EAM, so we investigated divergence times of forest fauna on four montane isolates in close proximity to determine whether forest break-up was most likely to have been simultaneous or sequential, using population genetics of a forest restricted arboreal chameleon, Kinyongia boehmei.We used mitochondrial and nuclear genetic sequence data and mutation rates from a fossil-calibrated phylogeny to estimate divergence times between montane isolates using a coalescent approach. We found that chameleons on all mountains are most likely to have diverged sequentially within the Pleistocene from 0.93-0.59 Ma (95% HPD 0.22-1.84 Ma. In addition, post-hoc tests on chameleons on the largest montane isolate suggest a population expansion ∼182 Ka.Sequential divergence is most likely to have occurred after the last of three wet periods within the arid Plio-Pleistocene era, but was not correlated with inter-montane distance. We speculate that forest connection persisted due to riparian corridors regardless of proximity, highlighting their importance in the region's historic dispersal events. The population expansion coincides with nearby volcanic activity, which may also explain the relative paucity of the Taita's endemic fauna. Our study shows that forest chameleons are an apposite group to track forest fragmentation, with the inference that forest extended between some EAM during the Pleistocene 1

  7. Anatomy of a defective barrier: sequential glove leak detection in a surgical and dental environment.

    Science.gov (United States)

    Albin, M S; Bunegin, L; Duke, E S; Ritter, R R; Page, C P

    1992-02-01

    a) To determine the frequency of perforations in latex surgical gloves before, during, and after surgical and dental procedures; b) to evaluate the topographical distribution of perforations in latex surgical gloves after surgical and dental procedures; and c) to validate methods of testing for latex surgical glove patency. Multitrial tests under in vitro conditions and a prospective sequential patient study using consecutive testing. An outpatient dental clinic at a university dental school, the operating suite in a medical school affiliated with the Veteran's Hospital, and a biomechanics laboratory. Surgeons, scrub nurses, and dental technicians participating in 50 surgical and 50 dental procedures. We collected 679 latex surgical gloves after surgical procedures and tested them for patency by using a water pressure test. We also employed an electronic glove leak detector before donning, after sequential time intervals, and upon termination of 47 surgical (sequential surgical), 50 dental (sequential dental), and in three orthopedic cases where double gloving was used. The electronic glove leak detector was validated by using electronic point-by-point surface probing, fluorescein dye diffusion, as well as detecting glove punctures made with a 27-gauge needle. The random study indicated a leak rate of 33.0% (224 out of 679) in latex surgical gloves; the sequential surgical study demonstrated patency in 203 out of 347 gloves (58.5%); the sequential dental study showed 34 leaks in the 106 gloves used (32.1%); and with double gloving, the leak rate decreased to 25.0% (13 of 52 gloves tested). While the allowable FDA defect rate for unused latex surgical gloves is 1.5%, we noted defect rates in unused gloves of 5.5% in the sequential surgical, 1.9% in the sequential dental, and 4.0% in our electronic glove leak detector validating study. In the sequential surgical study, 52% of the leaks had occurred by 75 mins, and in the sequential dental study, 75% of the leaks

  8. Spatial distribution of sequential ventilation during mechanical ventilation of the uninjured lung: an argument for cyclical airway collapse and expansion

    Directory of Open Access Journals (Sweden)

    Altemeier William A

    2010-05-01

    Full Text Available Abstract Background Ventilator-induced lung injury (VILI) is a recognized complication of mechanical ventilation. Although the specific mechanism by which mechanical ventilation causes lung injury remains an active area of study, the application of positive end expiratory pressure (PEEP) reduces its severity. We have previously reported that VILI is spatially heterogeneous, with the most severe injury in the dorsal-caudal lung. This regional injury heterogeneity was abolished by the application of PEEP = 8 cm H2O. We hypothesized that the spatial distribution of lung injury correlates with areas in which cyclical airway collapse and recruitment occurs. Methods To test this hypothesis, rabbits were mechanically ventilated in the supine posture, and regional ventilation distribution was measured under four conditions: tidal volumes (VT) of 6 and 12 ml/kg with PEEP levels of 0 and 8 cm H2O. Results We found that relative ventilation was sequentially redistributed towards dorsal-caudal lung with increasing tidal volume. This sequential ventilation redistribution was abolished with the addition of PEEP. Conclusions These results suggest that cyclical airway collapse and recruitment is regionally heterogeneous and spatially correlated with areas most susceptible to VILI.

  9. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  10. Testing the Rational Expectations Hypothesis on the Retail Trade Sector Using Survey Data from Malaysia

    OpenAIRE

    Puah, Chin-Hong; Chong, Lucy Lee-Yun; Jais, Mohamad

    2011-01-01

    The rational expectations hypothesis states that when people are expecting things to happen, using the available information, the predicted outcomes usually occur. This study utilized survey data provided by the Business Expectations Survey of Limited Companies to test whether forecasts of the Malaysian retail sector, based on gross revenue and capital expenditures, are rational. The empirical evidence illustrates that decision-makers' expectations in the retail sector are biased and too o...

  11. Sequential decisions: a computational comparison of observational and reinforcement accounts.

    Directory of Open Access Journals (Sweden)

    Nazanin Mohammadi Sepahvand

    Full Text Available Right brain damaged patients show impairments in sequential decision making tasks for which healthy people do not show any difficulty. We hypothesized that this difficulty could be due to the failure of right brain damaged patients to develop well-matched models of the world. Our motivation is the idea that to navigate uncertainty, humans use models of the world to direct the decisions they make when interacting with their environment. The better the model is, the better their decisions are. To explore the model building and updating process in humans and the basis for impairment after brain injury, we used a computational model of non-stationary sequence learning. RELPH (Reinforcement and Entropy Learned Pruned Hypothesis space) was able to qualitatively and quantitatively reproduce the results of left and right brain damaged patient groups and healthy controls playing a sequential version of Rock, Paper, Scissors. Our results suggest that, in general, humans employ a sub-optimal reinforcement-based learning method rather than an objectively better statistical learning approach, and that differences between right brain damaged and healthy control groups can be explained by different exploration policies, rather than qualitatively different learning mechanisms.
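
    To make the contrast concrete, here is a toy sketch, not the authors' RELPH model: a frequency-based "statistical" predictor versus a reward-following "reinforcement" chooser, both playing against a biased Rock, Paper, Scissors opponent.

```python
# Toy contrast (not the authors' RELPH model): a frequency-based "statistical"
# predictor versus a reward-driven "reinforcement" chooser at Rock-Paper-Scissors.
import random
from collections import Counter, defaultdict

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
COUNTER = {v: k for k, v in BEATS.items()}  # move that beats the key

def statistical_choice(history):
    """Predict the opponent's next move from observed frequencies, then counter it."""
    if not history:
        return random.choice(MOVES)
    predicted = Counter(history).most_common(1)[0][0]
    return COUNTER[predicted]

class ReinforcementChooser:
    """Choose moves in proportion to accumulated reward, ignoring opponent structure."""
    def __init__(self):
        self.value = defaultdict(float)
    def choose(self):
        weights = [1.0 + max(self.value[m], 0.0) for m in MOVES]
        return random.choices(MOVES, weights=weights)[0]
    def update(self, move, reward):
        self.value[move] += reward

def opponent():
    # Opponent with a fixed bias that the statistical learner can exploit.
    return random.choices(MOVES, weights=[0.6, 0.2, 0.2])[0]

random.seed(0)
history, rl = [], ReinforcementChooser()
stat_wins = rl_wins = 0
for _ in range(2000):
    opp = opponent()
    s_move, r_move = statistical_choice(history), rl.choose()
    history.append(opp)
    stat_wins += BEATS[s_move] == opp
    reward = 1 if BEATS[r_move] == opp else (-1 if BEATS[opp] == r_move else 0)
    rl.update(r_move, reward)
    rl_wins += reward == 1
print(f"statistical wins: {stat_wins}, reinforcement wins: {rl_wins}")
```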

  12. Apraxia of tool use: more evidence for the technical reasoning hypothesis.

    Science.gov (United States)

    Jarry, Christophe; Osiurak, François; Delafuys, David; Chauviré, Valérie; Etcharry-Bouyx, Frédérique; Le Gall, Didier

    2013-10-01

    Various distinct cognitive processes such as semantic memory, executive planning, and technical reasoning have been shown to support tool use. The aim of this study is to investigate the relationship between these processes. To do so, a large apraxia battery was administered to 16 patients with left brain damage (LBD) and aphasia and to 19 healthy controls. The battery included: classical apraxia tests (Pantomime of Tool Use and Single Tool Use), familiar and novel tool use tests (Tool-Object Pairs and Sequential Mechanical Problem-Solving), semantic memory tests (Recognition of tool utilization gestures and Functional and Categorical Associations), as well as the Tower of London. The Sequential Mechanical Problem-Solving task is a new task that permits the evaluation of pre-planning in unusual tool use situations. In this task, as well as in the Tool-Object Pairs task, participants solved a tool use problem in a Choice and a No-Choice condition to examine the effect of tool selection. Globally, left brain damaged patients were impaired as compared to controls. We found high correlations in left brain damaged patients between performances on classical apraxia tests, familiar and novel tool use tests, and Functional and Categorical Associations, but no significant association between these performances and the Tower of London or Recognition of tool utilization gestures. Furthermore, the two conditions (Choice and No-Choice) of Tool-Object Pairs and Sequential Mechanical Problem-Solving were associated. In sum, all tasks involving tool use are strongly associated in LBD patients. Moreover, the ability to solve sequential mechanical problems does not depend on executive planning. Also, tool use appears to be associated with knowledge about object function but not with knowledge about tool manipulation. Taken together, these findings indicate that technical reasoning and, to a lesser extent, semantic memory may both play an important role in tool use. Copyright © 2013 Elsevier Ltd

  13. Plant Disease Severity Assessment-How Rater Bias, Assessment Method, and Experimental Design Affect Hypothesis Testing and Resource Use Efficiency.

    Science.gov (United States)

    Chiang, Kuo-Szu; Bock, Clive H; Lee, I-Hsuan; El Jarroudi, Moussa; Delfosse, Philippe

    2016-12-01

    The effect of rater bias and assessment method on hypothesis testing was studied for representative experimental designs for plant disease assessment using balanced and unbalanced data sets. Data sets with the same number of replicate estimates for each of two treatments are termed "balanced" and those with unequal numbers of replicate estimates are termed "unbalanced". The three assessment methods considered were nearest percent estimates (NPEs), an amended 10% incremental scale, and the Horsfall-Barratt (H-B) scale. Estimates of severity of Septoria leaf blotch on leaves of winter wheat were used to develop distributions for a simulation model. The experimental designs are presented here in the context of simulation experiments which consider the optimal design for the number of specimens (individual units sampled) and the number of replicate estimates per specimen for a fixed total number of observations (total sample size for the treatments being compared). The criterion used to gauge each method was the power of the hypothesis test. As expected, at a given fixed number of observations, the balanced experimental designs invariably resulted in a higher power compared with the unbalanced designs at different disease severity means, mean differences, and variances. Based on these results, with unbiased estimates using NPE, the recommended number of replicate estimates taken per specimen is 2 (from a sample of specimens of at least 30), because this conserves resources. Furthermore, for biased estimates, an apparent difference in the power of the hypothesis test was observed between assessment methods and between experimental designs. Results indicated that, regardless of experimental design or rater bias, an amended 10% incremental scale has slightly less power compared with NPEs, and that the H-B scale is more likely than the others to cause a type II error. These results suggest that choice of assessment method, optimizing sample number and number of replicate
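
    The balanced-versus-unbalanced comparison can be reproduced in miniature. The sketch below substitutes simple normal severity distributions for the paper's empirically derived ones, and estimates the power of a two-sample t-test at a fixed total sample size:

```python
# Hedged sketch of the design comparison described above: with the total sample
# size fixed, estimate the power of a two-sample t-test under balanced and
# unbalanced allocation. Severity distributions are simplified to normals,
# not the paper's empirically derived ones.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def power(n1, n2, mean1=10.0, mean2=15.0, sd=6.0, alpha=0.05, reps=5000):
    hits = 0
    for _ in range(reps):
        a = rng.normal(mean1, sd, n1)
        b = rng.normal(mean2, sd, n2)
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / reps

total = 60
print("balanced   (30/30):", power(30, 30))
print("unbalanced (45/15):", power(45, 15))
```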

  14. Imitation of the sequential structure of actions by chimpanzees (Pan troglodytes).

    Science.gov (United States)

    Whiten, A

    1998-09-01

    Imitation was studied experimentally by allowing chimpanzees (Pan troglodytes) to observe alternative patterns of actions for opening a specially designed "artificial fruit." As with the problematic foods primates deal with naturally, several defenses had to be removed from the test fruit to gain access to an edible core, but the sequential order and method of defense removal could be systematically varied. Each subject repeatedly observed 1 of 2 alternative techniques for removing each defense and 1 of 2 alternative sequential patterns of defense removal. Imitation of sequential organization emerged after repeated cycles of demonstration and attempts at opening the fruit. Imitation in chimpanzees may thus have some power to produce cultural convergence, counter to the supposition that individual learning processes corrupt copied actions. Imitation of sequential organization was accompanied by imitation of some aspects of the techniques that made up the sequence.

  15. Testing the Münch hypothesis of long distance phloem transport in plants

    DEFF Research Database (Denmark)

    Knoblauch, Michael; Knoblauch, Jan; Mullendore, Daniel L.

    2016-01-01

    Long distance transport in plants occurs in sieve tubes of the phloem. The pressure flow hypothesis introduced by Ernst Münch in 1930 describes a mechanism of osmotically generated pressure differentials that are supposed to drive the movement of sugars and other solutes in the phloem, but this hypothesis has long faced major challenges. The key issue is whether the conductance of sieve tubes, including sieve plate pores, is sufficient to allow pressure flow. We show that with increasing distance between source and sink, sieve tube conductivity and turgor increase dramatically in Ipomoea nil. Our results provide strong support for the Münch hypothesis, while providing new tools for the investigation of one of the least understood plant tissues.

  16. A test of the cerebellar hypothesis of dyslexia in adequate and inadequate responders to reading intervention.

    Science.gov (United States)

    Barth, Amy E; Denton, Carolyn A; Stuebing, Karla K; Fletcher, Jack M; Cirino, Paul T; Francis, David J; Vaughn, Sharon

    2010-05-01

    The cerebellar hypothesis of dyslexia posits that cerebellar deficits are associated with reading disabilities and may explain why some individuals with reading disabilities fail to respond to reading interventions. We tested these hypotheses in a sample of children who participated in a grade 1 reading intervention study (n = 174) and a group of typically achieving children (n = 62). At posttest, children were classified as adequately responding to the intervention (n = 82), inadequately responding with decoding and fluency deficits (n = 36), or inadequately responding with only fluency deficits (n = 56). Based on the Bead Threading and Postural Stability subtests from the Dyslexia Screening Test-Junior, we found little evidence that assessments of cerebellar functions were associated with academic performance or responder status. In addition, we did not find evidence supporting the hypothesis that cerebellar deficits are more prominent for poor readers with "specific" reading disabilities (i.e., with discrepancies relative to IQ) than for poor readers with reading scores consistent with IQ. In contrast, measures of phonological awareness, rapid naming, and vocabulary were strongly associated with responder status and academic outcomes. These results add to accumulating evidence that fails to associate cerebellar functions with reading difficulties.

  17. Test of the prey-base hypothesis to explain use of red squirrel midden sites by American martens

    Science.gov (United States)

    Dean E. Pearson; Leonard F. Ruggiero

    2001-01-01

    We tested the prey-base hypothesis to determine whether selection of red squirrel (Tamiasciurus hudsonicus) midden sites (cone caches) by American martens (Martes americana) for resting and denning could be attributed to greater abundance of small-mammal prey. Five years of livetrapping at 180 sampling stations in 2 drainages showed that small mammals,...

  18. The Effect of Retention Interval Task Difficulty on Young Children's Prospective Memory: Testing the Intention Monitoring Hypothesis

    Science.gov (United States)

    Mahy, Caitlin E. V.; Moses, Louis J.

    2015-01-01

    The current study examined the impact of retention interval task difficulty on 4- and 5-year-olds' prospective memory (PM) to test the hypothesis that children periodically monitor their intentions during the retention interval and that disrupting this monitoring may result in poorer PM performance. In addition, relations among PM, working memory,…

  19. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  20. Memory and other properties of multiple test procedures generated by entangled graphs.

    Science.gov (United States)

    Maurer, Willi; Bretz, Frank

    2013-05-10

    Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.
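
    The update algorithm behind such graphical procedures (local levels on vertices, transition weights on edges, propagation on rejection) is short enough to sketch. The example below follows the commonly published weighted-Bonferroni update rule for the basic graphical procedure, not the entangled extension developed in the paper, and the graph and numbers are illustrative:

```python
# Minimal sketch of a sequentially rejective graphical procedure: hypotheses are
# vertices with local alpha levels, edges carry transition weights, and the
# level of each rejected hypothesis is propagated along its outgoing edges.
import numpy as np

def graphical_procedure(p, alpha_local, G):
    """p: p-values; alpha_local: initial local levels (summing to alpha);
    G[i][j]: fraction of H_i's level passed to H_j when H_i is rejected."""
    p = np.asarray(p, dtype=float)
    a = np.asarray(alpha_local, dtype=float)
    G = np.asarray(G, dtype=float)
    active = list(range(len(p)))
    rejected = []
    while True:
        candidates = [i for i in active if p[i] <= a[i]]
        if not candidates:
            return rejected
        j = candidates[0]
        rejected.append(j)
        active.remove(j)
        # Propagate H_j's level, then rewire edges around the removed vertex.
        a = a + a[j] * G[j]
        Gn = np.zeros_like(G)
        for k in active:
            for l in active:
                if k == l:
                    continue
                denom = 1.0 - G[k, j] * G[j, k]
                if denom > 0:
                    Gn[k, l] = (G[k, l] + G[k, j] * G[j, l]) / denom
        G, a[j] = Gn, 0.0

# Example: a fixed-sequence-like graph on two hypotheses at overall alpha = 0.05.
print(graphical_procedure(p=[0.01, 0.04],
                          alpha_local=[0.05, 0.0],
                          G=[[0.0, 1.0], [1.0, 0.0]]))  # rejects both: [0, 1]
```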

  1. Sequential charged particle reaction

    International Nuclear Information System (INIS)

    Hori, Jun-ichi; Ochiai, Kentaro; Sato, Satoshi; Yamauchi, Michinori; Nishitani, Takeo

    2004-01-01

    The effective cross sections for producing sequential reaction products in F82H, pure vanadium, and LiF with respect to 14.9-MeV neutrons were obtained and compared with estimated ones. Since the sequential reactions depend on the behavior of the secondary charged particles, the effective cross sections depend on the target nuclei and the material composition. The effective cross sections were also estimated by using the EAF libraries and compared with the experimental ones. There were large discrepancies between estimated and experimental values. Additionally, we showed the contribution of the sequential reactions to the induced activity and dose rate in the boundary region with water. From the present study, it has been clarified that the sequential reactions are of great importance in evaluating the dose rates around the surface of cooling pipes and the activated corrosion products. (author)

  2. Random sequential adsorption of cubes

    Science.gov (United States)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
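
    The core of a random sequential adsorption algorithm is a simple accept/reject loop. A minimal 2D analogue with axis-aligned unit squares is sketched below; the paper treats oriented cubes in 3D, which mainly changes the intersection test:

```python
# Minimal 2D analogue of random sequential adsorption (the paper studies
# oriented 3D cubes; axis-aligned unit squares keep the overlap test trivial).
import random

L, SIDE, ATTEMPTS = 20.0, 1.0, 50_000
placed = []

def overlaps(x, y):
    return any(abs(x - px) < SIDE and abs(y - py) < SIDE for px, py in placed)

random.seed(0)
for _ in range(ATTEMPTS):
    x = random.uniform(0, L - SIDE)
    y = random.uniform(0, L - SIDE)
    if not overlaps(x, y):      # sequential rule: accepted squares are never moved
        placed.append((x, y))

fraction = len(placed) * SIDE**2 / L**2
print(f"{len(placed)} squares placed, packing fraction ≈ {fraction:.3f}")
```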

  3. Effects of neostriatal 6-OHDA lesion on performance in a rat sequential reaction time task.

    Science.gov (United States)

    Domenger, D; Schwarting, R K W

    2008-10-31

    Work in humans and monkeys has provided evidence that the basal ganglia, and the neurotransmitter dopamine therein, play an important role for sequential learning and performance. Compared to primates, experimental work in rodents is rather sparse, largely due to the fact that tasks comparable to the human ones, especially serial reaction time tasks (SRTT), had been lacking until recently. We have developed a rat model of the SRTT, which allows to study neural correlates of sequential performance and motor sequence execution. Here, we report the effects of dopaminergic neostriatal lesions, performed using bilateral 6-hydroxydopamine injections, on performance of well-trained rats tested in our SRTT. Sequential behavior was measured in two ways: for one, the effects of small violations of otherwise well trained sequences were examined as a measure of attention and automation. Secondly, sequential versus random performance was compared as a measure of sequential learning. Neurochemically, the lesions led to sub-total dopamine depletions in the neostriatum, which ranged around 60% in the lateral, and around 40% in the medial neostriatum. These lesions led to a general instrumental impairment in terms of reduced speed (response latencies) and response rate, and these deficits were correlated with the degree of striatal dopamine loss. Furthermore, the violation test indicated that the lesion group conducted less automated responses. The comparison of random versus sequential responding showed that the lesion group did not retain its superior sequential performance in terms of speed, whereas they did in terms of accuracy. Also, rats with lesions did not improve further in overall performance as compared to pre-lesion values, whereas controls did. These results support previous results that neostriatal dopamine is involved in instrumental behaviour in general. Also, these lesions are not sufficient to completely abolish sequential performance, at least when acquired

  4. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses

    NARCIS (Netherlands)

    Kuiper, Rebecca M.; Nederhoff, Tim; Klugkist, Irene

    2015-01-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is

  5. Generalist predator, cyclic voles and cavity nests: testing the alternative prey hypothesis.

    Science.gov (United States)

    Pöysä, Hannu; Jalava, Kaisa; Paasivaara, Antti

    2016-12-01

    The alternative prey hypothesis (APH) states that when the density of the main prey declines, generalist predators switch to alternative prey and vice versa, meaning that predation pressure on the alternative prey should be negatively correlated with the density of the main prey. We tested the APH in a system comprising one generalist predator (pine marten, Martes martes), cyclic main prey (microtine voles, Microtus agrestis and Myodes glareolus) and alternative prey (cavity nests of common goldeneye, Bucephala clangula); pine marten is an important predator of both voles and common goldeneye nests. Specifically, we studied whether annual predation rate of real common goldeneye nests and experimental nests is negatively associated with fluctuation in the density of voles in four study areas in southern Finland in 2000-2011. Both vole density and nest predation rate varied considerably between years in all study areas. However, we did not find support for the hypothesis that vole dynamics indirectly affects predation rate of cavity nests in the way predicted by the APH. On the contrary, the probability of predation increased with vole spring abundance for both real and experimental nests. Furthermore, a crash in vole abundance from previous autumn to spring did not increase the probability of predation of real nests, although it increased that of experimental nests. We suggest that learned predation by pine marten individuals, coupled with efficient search image for cavities, overrides possible indirect positive effects of high vole density on the alternative prey in our study system.

  6. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    Science.gov (United States)

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PsycINFO Database Record (c) 2013 APA, all rights reserved.
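
    The criterion-shift account rests on standard equal-variance signal-detection indices. A small sketch with made-up hit and false-alarm rates shows how a conservative shift can leave discriminability d′ roughly unchanged while moving the criterion c:

```python
# Equal-variance signal detection indices; the hit/false-alarm rates are
# made-up numbers chosen so that d' stays similar while c shifts conservatively.
from scipy.stats import norm

def sdt_indices(hit_rate, fa_rate):
    zh, zf = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return zh - zf, -(zh + zf) / 2  # d' (discriminability), c (criterion)

for label, h, f in [("simultaneous", 0.70, 0.30), ("sequential", 0.55, 0.18)]:
    d, c = sdt_indices(h, f)
    print(f"{label:12s} d' = {d:.2f}, c = {c:+.2f}")
```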

  7. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    Science.gov (United States)

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia.

    Science.gov (United States)

    Davis, Hayley; Ritchie, Euan G; Avitabile, Sarah; Doherty, Tim; Nimmo, Dale G

    2018-04-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire-age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist, and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis: (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence; (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire; (iii) that species' probability of occurrence or abundance peaks at varying times since fire; and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity. Our work suggests that encouraging a diversity of fire-ages for enhancing termite species richness in this study region is not necessary.

  9. Simple and flexible SAS and SPSS programs for analyzing lag-sequential categorical data.

    Science.gov (United States)

    O'Connor, B P

    1999-11-01

    This paper describes simple and flexible programs for analyzing lag-sequential categorical data, using SAS and SPSS. The programs read a stream of codes and produce a variety of lag-sequential statistics, including transitional frequencies, expected transitional frequencies, transitional probabilities, adjusted residuals, z values, Yule's Q values, likelihood ratio tests of stationarity across time and homogeneity across groups or segments, transformed kappas for unidirectional dependence, bidirectional dependence, parallel and nonparallel dominance, and significance levels based on both parametric and randomization tests.
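
    The statistics these programs produce follow directly from the lag-1 contingency table. A compact Python equivalent for transitional frequencies, expected frequencies, transitional probabilities, and adjusted residuals (using the standard Allison-Liker formula) might look like this:

```python
# The core lag-1 computations named above, in Python: transitional frequencies,
# expected frequencies under independence, transitional probabilities, and
# Allison-Liker adjusted residuals.
import numpy as np
from itertools import product

def lag_sequential(codes, lag=1):
    states = sorted(set(codes))
    idx = {s: i for i, s in enumerate(states)}
    obs = np.zeros((len(states), len(states)))
    for a, b in zip(codes[:-lag], codes[lag:]):
        obs[idx[a], idx[b]] += 1
    n = obs.sum()
    row, col = obs.sum(1, keepdims=True), obs.sum(0, keepdims=True)
    expected = row @ col / n
    prow, pcol = row / n, col / n
    adj_resid = (obs - expected) / np.sqrt(expected * (1 - prow) * (1 - pcol))
    trans_prob = obs / row
    return states, obs, expected, trans_prob, adj_resid

codes = list("ABABBACABBABACAABBAB")
states, obs, exp, tp, z = lag_sequential(codes)
for (i, a), (j, b) in product(enumerate(states), repeat=2):
    print(f"{a}->{b}: n={obs[i,j]:.0f} p={tp[i,j]:.2f} z={z[i,j]:+.2f}")
```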

  10. Does Portuguese economy support crude oil conservation hypothesis?

    International Nuclear Information System (INIS)

    Bashiri Behmiri, Niaz; Pires Manso, José R.

    2012-01-01

    This paper examines cointegration relationships and the Granger causality nexus in a trivariate framework among oil consumption, economic growth, and the international oil price in Portugal. For this purpose, we employ two Granger causality approaches: the Johansen cointegration test with a vector error correction model (VECM), and the Toda-Yamamoto approach. The cointegration test proves the existence of a long-run equilibrium relationship among these variables, and the VECM and Toda-Yamamoto Granger causality tests indicate that there is bidirectional causality between crude oil consumption and economic growth (feedback hypothesis). Therefore, the Portuguese economy does not support the crude oil conservation hypothesis. Consequently, policymakers should consider that implementing oil conservation and environmental policies may negatively impact Portuguese economic growth. - Highlights: ► We examine Granger causality among oil consumption, GDP and oil price in Portugal. ► VECM and Toda-Yamamoto tests found bidirectional causality between oil and GDP. ► The Portuguese economy does not support the crude oil conservation hypothesis.
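
    The testing workflow described (a cointegration check, then Granger causality) can be sketched with statsmodels on synthetic data; the series below are simulated stand-ins, not the Portuguese data:

```python
# Illustrative workflow on simulated series (not the paper's data): an
# Engle-Granger cointegration check followed by a Granger causality test.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(7)
n = 200
gdp = np.cumsum(rng.normal(0.5, 1.0, n))   # I(1) series: random walk with drift
oil = 0.8 * gdp + rng.normal(0.0, 1.0, n)  # cointegrated with gdp by construction

tstat, pvalue, _ = coint(oil, gdp)
print(f"Engle-Granger cointegration p-value: {pvalue:.3f}")

# Test whether oil (column 2) Granger-causes gdp (column 1) on differenced data.
data = np.column_stack([np.diff(gdp), np.diff(oil)])
res = grangercausalitytests(data, maxlag=2)
print(f"lag-1 F-test p-value: {res[1][0]['ssr_ftest'][1]:.3f}")
```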

  11. The Random-Walk Hypothesis on the Indian Stock Market

    OpenAIRE

    Ankita Mishra; Vinod Mishra; Russell Smyth

    2014-01-01

    This study tests the random walk hypothesis for the Indian stock market. Using 19 years of monthly data on six indices from the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), this study applies three different unit root tests with two structural breaks to analyse the random walk hypothesis. We find that unit root tests that allow for two structural breaks alone are not able to reject the unit root null; however, a recently developed unit root test that simultaneously accou...

  12. Testing the snake-detection hypothesis: larger early posterior negativity in humans to pictures of snakes than to pictures of other reptiles, spiders and slugs

    OpenAIRE

    Van Strien, Jan W.; Franken, Ingmar H. A.; Huijding, Jorg

    2014-01-01

    According to the snake detection hypothesis (Isbell, 2006), fear specifically of snakes may have pushed evolutionary changes in the primate visual system allowing pre-attentional visual detection of fearful stimuli. A previous study demonstrated that snake pictures, when compared to spider or bird pictures, draw more early attention, as reflected by a larger early posterior negativity (EPN). Here we report two studies that further tested the snake detection hypothesis. In Study 1, we tested whe...

  13. Do the disadvantaged legitimize the social system? A large-scale test of the status-legitimacy hypothesis.

    Science.gov (United States)

    Brandt, Mark J

    2013-05-01

    System justification theory (SJT) posits that members of low-status groups are more likely to see their social systems as legitimate than members of high-status groups because members of low-status groups experience a sense of dissonance between system motivations and self/group motivations (Jost, Pelham, Sheldon, & Sullivan, 2003). The author examined the status-legitimacy hypothesis using data from 3 representative sets of data from the United States (American National Election Studies and General Social Surveys) and throughout the world (World Values Survey; total N across studies = 151,794). Multilevel models revealed that the average effect across years in the United States and countries throughout the world was most often directly contrary to the status-legitimacy hypothesis or was practically zero. In short, the status-legitimacy effect is not a robust phenomenon. Two theoretically relevant moderator variables (inequality and civil liberties) were also tested, revealing weak evidence, null evidence, or contrary evidence to the dissonance-inspired status-legitimacy hypothesis. In sum, the status-legitimacy effect is not robust and is unlikely to be the result of dissonance. These results are used to discuss future directions for research, the current state of SJT, and the interpretation of theoretically relevant but contrary and null results. PsycINFO Database Record (c) 2013 APA, all rights reserved

  14. Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?

    Science.gov (United States)

    Weinberg, Clarice R

    2017-09-15

    In the accompanying article (Am J Epidemiol. 2017;186(6):646-647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our "culture" of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls "innovative" research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  15. Testing a hypothesis of unidirectional hybridization in plants: Observations on Sonneratia, Bruguiera and Ligularia

    Directory of Open Access Journals (Sweden)

    Wu Chung-I

    2008-05-01

    Full Text Available Abstract Background When natural hybridization occurs at sites where the hybridizing species differ in abundance, the pollen load delivered to the rare species should be predominantly from the common species. Previous authors have therefore proposed a hypothesis on the direction of hybridization: interspecific hybrids are more likely to have the female parent from the rare species and the male parent from the common species. We wish to test this hypothesis using data of plant hybridizations both from our own experimentation and from the literature. Results By examining the maternally inherited chloroplast DNA of 6 cases of F1 hybridization from four genera of plants, we infer unidirectional hybridization in most cases. In all 5 cases where the relative abundance of the parental species deviates from parity, however, hybridization predominantly proceeds in the direction opposite to that predicted strictly from numerical abundance. Conclusion Our results show that the observed direction of hybridization is almost always opposite of the predicted direction based on the relative abundance of the hybridizing species. Several alternative hypotheses, including unidirectional postmating isolation and reinforcement of premating isolation, were discussed.

  16. Sequential analysis of materials balances. Application to a prospective reprocessing facility

    International Nuclear Information System (INIS)

    Picard, R.

    1986-01-01

    This paper discusses near-real-time accounting in the context of the prospective DWK reprocessing plant. Sensitivity of a standard sequential testing procedure, applied to unfalsified operator data only, is examined with respect to a variety of loss scenarios. It is seen that large inventories preclude high-probability detection of certain protracted losses of material. In Sec. 2, a rough error propagation for the MBA of interest is outlined. Mathematical development for the analysis is given in Sec. 3, and generic aspects of sequential testing are reviewed in Sec. 4. In Sec. 5, results from a simulation to quantify performance of the accounting system are presented
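
    The abstract does not spell out the test statistic, so as a stand-in, here is the textbook Wald SPRT applied to a stream of standardized materials-balance residuals, testing no loss (mean 0) against a protracted loss (mean μ1). The large-inventory problem noted above shows up as a large σ, which stretches the time to a decision:

```python
# Hedged sketch of a standard sequential test of the kind discussed: a Wald
# SPRT on standardized materials-balance residuals, H0: mean 0 (no loss)
# versus H1: mean mu1 (protracted loss). The paper's specific statistic is
# not reproduced in the abstract.
import math
import numpy as np

def sprt_gaussian(xs, mu1, sigma, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for t, x in enumerate(xs, 1):
        llr += (mu1 / sigma**2) * (x - mu1 / 2)   # Gaussian log-likelihood ratio
        if llr >= upper:
            return t, "loss signalled"
        if llr <= lower:
            return t, "no loss"
    return len(xs), "undecided"

rng = np.random.default_rng(3)
balances = rng.normal(0.4, 1.0, 30)   # true protracted loss of 0.4 sigma per period
print(sprt_gaussian(balances, mu1=0.5, sigma=1.0))
```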

  17. Testing Fiscal Dominance Hypothesis in a Structural VAR Specification for Pakistan

    Directory of Open Access Journals (Sweden)

    Shaheen Rozina

    2018-03-01

    Full Text Available This research aims to test the fiscal dominance hypothesis for Pakistan through a bivariate structural vector autoregression (SVAR) specification covering the period 1977-2016. The study employs the real primary deficit (non-interest government expenditures minus total revenues) and real primary liabilities (the sum of the monetary base and domestic public debt) as indicators of fiscal measures and monetary policy, respectively. A structural VAR is retrieved both for the entire sample period and for four sub-periods (1977-1986, 1987-1997, 1998-2008, and 2009-2016). The study identifies the presence of fiscal dominance for the entire sample period and for the sub-periods from 1987 to 2008. The estimates reveal an interesting phenomenon: fiscal dominance is significant under elected regimes and weaker under military regimes in Pakistan. From a policy perspective, this research suggests increased autonomy of the central bank to achieve long-term price stability and reduced administration costs to ensure an efficient democratic regime in Pakistan.

  18. Test of a hypothesis of realism in quantum theory using a Bayesian approach

    Science.gov (United States)

    Nikitin, N.; Toms, K.

    2017-05-01

    In this paper we propose a time-independent equality and a time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ−⟩, while a test of the GHZ equality requires at least three quantum systems in a special state |Ψ_GHZ⟩. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.

  19. Sequential learning in individuals with agrammatic aphasia: evidence from artificial grammar learning.

    Science.gov (United States)

    Schuchard, Julia; Thompson, Cynthia K

    2017-01-01

    We examined sequential learning in individuals with agrammatic aphasia (n = 12) and healthy age-matched participants (n = 12) using an artificial grammar. Artificial grammar acquisition, 24-hour retention, and the potential benefits of additional training were examined by administering an artificial grammar judgment test (1) immediately following auditory exposure-based training, (2) one day after training, and (3) after a second training session on the second day. An untrained control group (n = 12 healthy age-matched participants) completed the tests on the same time schedule. The trained healthy and aphasic groups showed greater sensitivity to the detection of grammatical items than the control group. No significant correlations between sequential learning and language abilities were observed among the aphasic participants. The results suggest that individuals with agrammatic aphasia show sequential learning, but the underlying processes involved in this learning may be different than for healthy adults.

  20. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  1. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    Science.gov (United States)

    2015-12-01

    ...subsequent F statistics are reported using the Huynh-Feldt correction (Greenhouse-Geisser epsilon > .775). Experienced and inexperienced ... change in hypothesis using experience and initial confidence as predictors. In the Dog Day scenario, the regression was not statistically

  2. Test of an Hypothesis of Magnetization, Tilt and Flow in an Hypabyssal Intrusion, Colombian Andes

    Science.gov (United States)

    Muggleton, S.; MacDonald, W. D.; Estrada, J. J.; Sierra, G. M.

    2002-05-01

    Magnetic remanence in the Miocene Clavijo intrusion in the Cauca Valley, adjacent to the Cordillera Central, plunges steeply northward (MacDonald et al., 1996). Assuming magnetization in a normal magnetic field, the expected remanence direction is approximately I = 10°, D = 000°; the observed remanence is I = 84°, D = 003°. The discrepancy could be explained by a 74° rotation about a horizontal E-W axis, i.e., about an axis normal to the nearby N-S trending Romeral fault zone. If the intrusion is the shallow feeder of a now-eroded andesitic volcano, then perhaps the paleovertical direction is preserved in flow lineations and provides a test of the tilt/rotation of the remanence. In combination, the steep remanence direction, vertical flow, and the inferred rotation of the volcanic neck lead to the hypothesis of a shallow-plunging southward lineation for this body. Using anisotropy of magnetic susceptibility (AMS) as a proxy for the flow lineation, it is predicted that the K1 (maximum susceptibility) axis in this body plunges gently south. This hypothesis was tested using approximately 50 oriented cores from 5 sites near the west margin of the Clavijo intrusion. The results suggest a NW-plunging lineation, inconsistent with the initial hypothesis. However, a relatively consistent flow lineation is suggested by the K1 axes. If this flow axis represents paleovertical, it suggests moderate tilting of the Clavijo body towards the southeast. The results are encouraging enough to suggest that AMS may be useful for determining paleovertical in shallow volcanic necks and hypabyssal intrusions, and might ultimately be useful in a tilt correction for such bodies. Other implications of the results will be discussed. MacDonald, W.D., Estrada, J.J., Sierra, G.M., Gonzalez, H., 1996. Late Cenozoic tectonics and paleomagnetism of North Cauca Basin intrusions, Colombian Andes: dual rotation modes. Tectonophysics, v. 261, p. 277-289.

  3. Sequential lineup presentation promotes less-biased criterion setting but does not improve discriminability.

    Science.gov (United States)

    Palmer, Matthew A; Brewer, Neil

    2012-06-01

    When compared with simultaneous lineup presentation, sequential presentation has been shown to reduce false identifications to a greater extent than it reduces correct identifications. However, there has been much debate about whether this difference in identification performance represents improved discriminability or more conservative responding. In this research, data from 22 experiments that compared sequential and simultaneous lineups were analyzed using a compound signal-detection model, which is specifically designed to describe decision-making performance on tasks such as eyewitness identification tests. Sequential (cf. simultaneous) presentation did not influence discriminability, but produced a conservative shift in response bias that resulted in less-biased choosing for sequential than simultaneous lineups. These results inform understanding of the effects of lineup presentation mode on eyewitness identification decisions.

  4. Remarks on sequential designs in risk assessment

    International Nuclear Information System (INIS)

    Seidenfeld, T.

    1982-01-01

    The special merits of sequential designs are reviewed in light of particular challenges that attend risk assessment for human populations. The kinds of 'statistical inference' are distinguished, and the design problem pursued is the clash between the Neyman-Pearson and Bayesian programs of sequential design. The value of sequential designs is discussed, and Neyman-Pearson versus Bayesian sequential designs are probed in particular. Finally, cautions concerning sequential designs are considered, especially in relation to utilitarianism.

  5. Sex and Class Differences in Parent-Child Interaction: A Test of Kohn's Hypothesis

    Science.gov (United States)

    Gecas, Viktor; Nye, F. Ivan

    1974-01-01

    This paper focuses on Melvin Kohn's suggestive hypothesis that white-collar parents stress the development of internal standards of conduct in their children while blue-collar parents are more likely to react on the basis of the consequences of the child's behavior. This hypothesis was supported. (Author)

  6. Comparing two Poisson populations sequentially: an application

    International Nuclear Information System (INIS)

    Halteman, E.J.

    1986-01-01

    Rocky Flats Plant in Golden, Colorado monitors each of its employees for radiation exposure. Excess exposure is detected by comparing the means of two Poisson populations. A sequential probability ratio test (SPRT) is proposed as a replacement for the fixed-sample normal approximation test. A uniformly most efficient SPRT exists; however, logistics suggest using a truncated SPRT. The truncated SPRT is evaluated in detail and shown to offer large potential savings in the average time employees spend in the monitoring process.
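
    As a rough illustration of the underlying machinery (a Wald SPRT for a simple-vs-simple comparison of Poisson rates, not the truncated test the paper evaluates), the decision rule can be sketched as follows; the rates, error levels, and counts are illustrative:

        import math

        def poisson_sprt(counts, lam0, lam1, alpha=0.05, beta=0.05):
            """Wald SPRT for H0: rate lam0 vs H1: rate lam1 (lam1 > lam0).

            Returns 'accept H1', 'accept H0', or 'continue' after processing
            the observed counts (one Poisson count per unit exposure time).
            """
            upper = math.log((1 - beta) / alpha)   # cross above -> accept H1
            lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
            llr = 0.0
            for k in counts:
                # log-likelihood ratio contribution of one Poisson observation
                llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
                if llr >= upper:
                    return "accept H1"
                if llr <= lower:
                    return "accept H0"
            return "continue"

        print(poisson_sprt([3, 5, 4, 6], lam0=2.0, lam1=4.0))

    A truncated SPRT, as used in the paper, simply adds a maximum sample size at which a forced decision is made.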

  7. Sequential lineup laps and eyewitness accuracy.

    Science.gov (United States)

    Steblay, Nancy K; Dietrich, Hannah L; Ryan, Shannon L; Raczynski, Jeanette L; James, Kali A

    2011-08-01

    Police practice of double-blind sequential lineups prompts a question about the efficacy of repeated viewings (laps) of the sequential lineup. Two laboratory experiments confirmed the presence of a sequential lap effect: an increase in witness lineup picks from the first to the second lap when the culprit was a stranger. The second lap produced more errors than correct identifications. In Experiment 2, lineup diagnosticity was significantly higher for sequential lineup procedures that employed a single lap rather than two. Witnesses who elected to view a second lap made significantly more errors than witnesses who chose to stop after one lap or those who were required to view two laps. Witnesses with prior exposure to the culprit did not exhibit a sequential lap effect.

  8. Sequential motor task (Luria's Fist-Edge-Palm Test) in children with benign focal epilepsy of childhood with centrotemporal spikes

    Directory of Open Access Journals (Sweden)

    Carmen Silvia Molleis Galego Miziara

    2013-06-01

    This study evaluated sequential manual motor actions in children with benign focal epilepsy of childhood with centrotemporal spikes (BECTS) and compared the results with a matched control group through the application of Luria's fist-edge-palm test. The children with BECTS underwent interictal single photon emission computed tomography (SPECT) and the School Performance Test (SPT). A significant difference occurred between the study and control groups for manual motor actions involving three identical and three different movements. Children with lower school performance had a higher error rate in the imitation of hand gestures. Another factor significantly associated with failure was abnormality on SPECT. Children with BECTS showed abnormalities in the test that evaluates manual motor programming/planning. This study may suggest that the functional changes related to epileptiform activity in the rolandic region interfere with executive function in children with BECTS.

  9. A Global comparison of surface soil characteristics across five cities: A test of the urban ecosystem convergence hypothesis.

    Science.gov (United States)

    Richard V. Pouyat; Ian D. Yesilonis; Miklos Dombos; Katalin Szlavecz; Heikki Setala; Sarel Cilliers; Erzsebet Hornung; D. Johan Kotze; Stephanie Yarwood

    2015-01-01

    As part of the Global Urban Soil Ecology and Education Network and to test the urban ecosystem convergence hypothesis, we report on soil pH, organic carbon (OC), total nitrogen (TN), phosphorus (P), and potassium (K) measured in four soil habitat types (turfgrass, ruderal, remnant, and reference) in five metropolitan areas (Baltimore, Budapest,...

  10. Robustness of the Sequential Lineup Advantage

    Science.gov (United States)

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  11. Colour vision in ADHD: part 1--testing the retinal dopaminergic hypothesis.

    Science.gov (United States)

    Kim, Soyeon; Al-Haj, Mohamed; Chen, Samantha; Fuller, Stuart; Jain, Umesh; Carrasco, Marisa; Tannock, Rosemary

    2014-10-24

    To test the retinal dopaminergic hypothesis, which posits deficient blue color perception in ADHD, resulting from hypofunctioning CNS and retinal dopamine, to which blue cones are exquisitely sensitive. Also, purported sex differences in red color perception were explored. 30 young adults diagnosed with ADHD and 30 healthy young adults, matched on age and gender, performed a psychophysical task to measure blue and red color saturation and contrast discrimination ability. Visual function measures, such as the Visual Activities Questionnaire (VAQ) and Farnsworth-Munsell 100 hue test (FMT), were also administered. Females with ADHD were less accurate in discriminating blue and red color saturation relative to controls but did not differ in contrast sensitivity. Female control participants were better at discriminating red saturation than males, but no sex difference was present within the ADHD group. Poorer discrimination of red as well as blue color saturation in the female ADHD group may be partly attributable to a hypo-dopaminergic state in the retina, given that color perception (blue-yellow and red-green) is based on input from S-cones (short wavelength cone system) early in the visual pathway. The origin of female superiority in red perception may be rooted in sex-specific functional specialization in hunter-gatherer societies. The absence of this sexual dimorphism for red color perception in ADHD females warrants further investigation.

  12. Proform-Antecedent Linking in Individuals with Agrammatic Aphasia: A Test of the Intervener Hypothesis.

    Science.gov (United States)

    Engel, Samantha; Shapiro, Lewis P; Love, Tracy

    2018-02-01

    To evaluate processing and comprehension of pronouns and reflexives in individuals with agrammatic (Broca's) aphasia and age-matched control participants. Specifically, we evaluate processing and comprehension patterns in terms of a specific hypothesis, the Intervener Hypothesis, which posits that the difficulty of individuals with agrammatic (Broca's) aphasia results from similarity-based interference caused by the presence of an intervening NP between two elements of a dependency chain. We used an eye tracking-while-listening paradigm to investigate real-time processing (Experiment 1) and a sentence-picture matching task to investigate final interpretive comprehension (Experiment 2) of sentences containing proforms in complement phrase and subject relative constructions. Individuals with agrammatic aphasia demonstrated a greater proportion of gazes to the correct referent of reflexives relative to pronouns and significantly greater comprehension accuracy of reflexives relative to pronouns. These results provide support for the Intervener Hypothesis, previous support for which comes from studies of Wh- questions and unaccusative verbs, and we argue that this account provides an explanation for the deficits of individuals with agrammatic aphasia across a growing set of sentence constructions. The current study extends this hypothesis beyond filler-gap dependencies to referential dependencies and allows us to refine the hypothesis in terms of the structural constraints that meet the description of the Intervener Hypothesis.

  13. Fluorescence Microspectroscopy for Testing the Dimerization Hypothesis of BACE1 Protein in Cultured HEK293 Cells

    Science.gov (United States)

    Gardeen, Spencer; Johnson, Joseph L.; Heikal, Ahmed A.

    2016-06-01

    Alzheimer's Disease (AD) is a neurodegenerative disorder that results from the formation of beta-amyloid plaques in the brain that trigger the known symptoms of memory loss in AD patients. The beta-amyloid plaques are formed by the proteolytic cleavage of the amyloid precursor protein (APP) by the proteases BACE1 and gamma-secretase. These enzyme-facilitated cleavages lead to the production of beta-amyloid fragments that aggregate to form plaques, which ultimately lead to neuronal cell death. Recent detergent protein extraction studies suggest that BACE1 protein forms a dimer that has significantly higher catalytic activity than its monomeric counterpart. In this contribution, we examine the dimerization hypothesis of BACE1 in cultured HEK293 cells using complementary fluorescence spectroscopy and microscopy methods. Cells were transfected with a BACE1-EGFP fusion protein construct and imaged using confocal and differential interference contrast microscopy to monitor the localization and distribution of intracellular BACE1. Complementary fluorescence lifetime and anisotropy measurements enabled us to examine the conformational and environmental changes of BACE1 as a function of substrate binding. Using fluorescence correlation spectroscopy, we also quantified the diffusion coefficient of BACE1-EGFP on the plasma membrane as a means to test the dimerization hypothesis as a function of substrate-analog inhibition. Our results represent an important first step towards examining the substrate-mediated dimerization hypothesis of BACE1 in live cells.

  14. Testing the hypothesis of the natural suicide rates: Further evidence from OECD data

    DEFF Research Database (Denmark)

    Andres, Antonio Rodriguez; Halicioglu, Ferda

    2011-01-01

    This paper provides further evidence on the hypothesis of the natural rate of suicide using the time series data for 15 OECD countries over the period 1970–2004. This hypothesis suggests that the suicide rate of a society could never be zero even if both the economic and the social conditions wer...

  15. Ruling out cardiac failure: Cost-benefit analysis of a sequential testing strategy with NT-proBNP before echocardiography

    Science.gov (United States)

    Ferrandis, Maria-José; Ryden, Ingvar; Lindahl, Tomas L.

    2013-01-01

    Objectives: To estimate the possible economic benefit of a sequential testing strategy with NT-proBNP to reduce the number of echocardiographies. Methods: Retrospective study from a third-party payer perspective. The costs were calculated from three Swedish counties: Blekinge, Östergötland, and Uppland. Two cut-off levels of NT-proBNP were used: 400 and 300 pg/mL. The cost-effectiveness of the testing strategy was estimated through the short-term cost avoidance and reduction in demand for echocardiographies. Results: The estimated costs for NT-proBNP tests and echocardiographies per county were reduced by 33%–36% with the 400 pg/mL cut-off and by 28%–29% with the 300 pg/mL cut-off. This corresponded to a yearly cost reduction of approximately €2–5 million per million inhabitants in these counties. Conclusion: The use of NT-proBNP as a screening test could substantially reduce the number of echocardiographies in the diagnostic work-up of patients with suspected cardiac failure, as well as the associated costs. PMID:23230860
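
    The short-term cost-avoidance arithmetic behind such estimates can be sketched as follows; all prices and volumes here are invented placeholders, not the study's figures:

        # A minimal sketch of the cost-avoidance arithmetic, with entirely
        # illustrative prices and volumes (not taken from the study).
        def net_savings(n_referrals, p_below_cutoff, cost_echo, cost_nt_probnp):
            """Screen every referral with NT-proBNP; only those above the
            cut-off proceed to echocardiography."""
            echos_avoided = n_referrals * p_below_cutoff
            return echos_avoided * cost_echo - n_referrals * cost_nt_probnp

        # e.g. 10,000 referrals/year, 35% below cut-off, echo 200 EUR, test 20 EUR
        print(net_savings(10_000, 0.35, 200.0, 20.0))  # -> 500000.0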

  16. Long-term resource variation and group size: A large-sample field test of the Resource Dispersion Hypothesis

    Directory of Open Access Journals (Sweden)

    Morecroft Michael D

    2001-07-01

    Background: The Resource Dispersion Hypothesis (RDH) proposes a mechanism for the passive formation of social groups where resources are dispersed, even in the absence of any benefits of group living per se. Despite supportive modelling, it lacks empirical testing. The RDH predicts that, rather than Territory Size (TS) increasing monotonically with Group Size (GS) to account for increasing metabolic needs, TS is constrained by the dispersion of resource patches, whereas GS is independently limited by their richness. We conducted multiple-year tests of these predictions using data from the long-term study of badgers Meles meles in Wytham Woods, England. The study has long failed to identify direct benefits from group living and, consequently, alternative explanations for their large group sizes have been sought. Results: TS was not consistently related to resource dispersion, nor was GS consistently related to resource richness. Results differed according to data groupings and whether territories were mapped using minimum convex polygons or traditional methods. Habitats differed significantly in resource availability, but there was also evidence that food resources may be spatially aggregated within habitat types as well as between them. Conclusions: This is, we believe, the largest ever test of the RDH and builds on the long-term project that initiated part of the thinking behind the hypothesis. Support for predictions was mixed and depended on year and the method used to map territory borders. We suggest that within-habitat patchiness, as well as model assumptions, should be further investigated for improved tests of the RDH in the future.

  17. Influence of synchronous and sequential stimulation on muscle fatigue

    NARCIS (Netherlands)

    Thomsen, M.; Thomsen, M.; Veltink, Petrus H.

    1997-01-01

    In acute experiments the sciatic nerve of the rat is electrically stimulated to induce fatigue in the medial Gastrocnemius muscle. Fatigue tests are carried out using intermittent stimulation of different compartments (sequential) or a single compartment (synchronous) of the sciatic nerve. The

  18. Sequential stochastic optimization

    CERN Document Server

    Cairoli, Renzo

    1996-01-01

    Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-paramet

  19. Testing the environmental Kuznets curve hypothesis with bird populations as habitat-specific environmental indicators: evidence from Canada.

    Science.gov (United States)

    Lantz, Van; Martínez-Espiñeira, Roberto

    2008-04-01

    The traditional environmental Kuznets curve (EKC) hypothesis postulates that environmental degradation follows an inverted U-shaped relationship with gross domestic product (GDP) per capita. We tested the EKC hypothesis with bird populations in 5 different habitats as environmental quality indicators. Because birds are considered environmental goods, for them the EKC hypothesis would instead be associated with a U-shaped relationship between bird populations and GDP per capita. In keeping with the literature, we included other variables in the analysis, namely, human population density and time index variables (the latter variable captured the impact of persistent and exogenous climate and/or policy changes on bird populations over time). Using data from 9 Canadian provinces gathered over 37 years, we used a generalized least-squares regression for each bird habitat type, which accounted for the panel structure of the data, the cross-sectional dependence across provinces in the residuals, heteroskedasticity, and fixed- or random-effect specifications of the models. We found evidence that supports the EKC hypothesis for 3 of the 5 bird population habitat types. In addition, the relationship between human population density and the different bird populations varied, which emphasizes the complex nature of the impact that human populations have on the environment. The relationship between the time-index variable and the different bird populations also varied, which indicates there are other persistent and significant influences on bird populations over time. Overall our EKC results were consistent with those found for threatened bird species, indicating that economic prosperity does indeed act to benefit some bird populations.
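
    The shape test at the heart of EKC analyses reduces to the sign of a quadratic term. A minimal sketch follows (ordinary least squares on synthetic data, deliberately ignoring the panel/GLS machinery the study actually used; all numbers are invented):

        import numpy as np

        # Illustrative data: a U-shaped relation between a bird-population
        # index and GDP per capita, plus noise.
        rng = np.random.default_rng(0)
        gdp = rng.uniform(10, 60, 200)  # GDP per capita, arbitrary units
        birds = 100 - 4.0 * gdp + 0.05 * gdp**2 + rng.normal(0, 2.0, 200)

        # Fit birds = b0 + b1*gdp + b2*gdp^2 by ordinary least squares.
        X = np.column_stack([np.ones_like(gdp), gdp, gdp**2])
        b, *_ = np.linalg.lstsq(X, birds, rcond=None)

        # For an environmental *good*, the EKC predicts b2 > 0 (U-shape),
        # with the trough at gdp* = -b1 / (2 * b2).
        print(f"b1={b[1]:.3f}, b2={b[2]:.3f}, turning point={-b[1] / (2 * b[2]):.1f}")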

  20. Support for distinct subcomponents of spatial working memory: a double dissociation between spatial-simultaneous and spatial-sequential performance in unilateral neglect.

    Science.gov (United States)

    Wansard, Murielle; Bartolomeo, Paolo; Bastin, Christine; Segovia, Fermín; Gillet, Sophie; Duret, Christophe; Meulemans, Thierry

    2015-01-01

    Over the last decade, many studies have demonstrated that visuospatial working memory (VSWM) can be divided into separate subsystems dedicated to the retention of visual patterns and their serial order. Impaired VSWM has been suggested to exacerbate left visual neglect in right-brain-damaged individuals. The aim of this study was to investigate the segregation between spatial-sequential and spatial-simultaneous working memory in individuals with neglect. We demonstrated that patterns of results on these VSWM tasks can be dissociated. Spatial-simultaneous and sequential aspects of VSWM can be selectively impaired in unilateral neglect. Our results support the hypothesis of multiple VSWM subsystems, which should be taken into account to better understand neglect-related deficits.

  1. An experimental test of the habitat-amount hypothesis for saproxylic beetles in a forested region.

    Science.gov (United States)

    Seibold, Sebastian; Bässler, Claus; Brandl, Roland; Fahrig, Lenore; Förster, Bernhard; Heurich, Marco; Hothorn, Torsten; Scheipl, Fabian; Thorn, Simon; Müller, Jörg

    2017-06-01

    The habitat-amount hypothesis challenges traditional concepts that explain species richness within habitats, such as the habitat-patch hypothesis, where species number is a function of patch size and patch isolation. It posits that effects of patch size and patch isolation are driven by effects of sample area, and thus that the number of species at a site is basically a function of the total habitat amount surrounding this site. We tested the habitat-amount hypothesis for saproxylic beetles and their habitat of dead wood by using an experiment comprising 190 plots with manipulated patch sizes situated in a forested region with a high variation in habitat amount (i.e., density of dead trees in the surrounding landscape). Although dead wood is a spatio-temporally dynamic habitat, saproxylic insects have life cycles shorter than the time needed for habitat turnover and they closely track their resource. Patch size was manipulated by adding various amounts of downed dead wood to the plots (~800 m³ in total); dead trees in the surrounding landscape (~240 km²) were identified using airborne laser scanning (light detection and ranging). Over 3 yr, 477 saproxylic species (101,416 individuals) were recorded. Considering 20-1,000 m radii around the patches, local landscapes were identified as having a radius of 40-120 m. Both patch size and habitat amount in the local landscapes independently affected species numbers without a significant interaction effect, hence refuting the island effect. Species accumulation curves relative to cumulative patch size were not consistent with either the habitat-patch hypothesis or the habitat-amount hypothesis: several small dead-wood patches held more species than a single large patch with an amount of dead wood equal to the sum of that of the small patches. Our results indicate that conservation of saproxylic beetles in forested regions should primarily focus on increasing the overall amount of dead wood without considering its

  2. A Blind Test of the Younger Dryas Impact Hypothesis.

    Directory of Open Access Journals (Sweden)

    Vance Holliday

    The Younger Dryas Impact Hypothesis (YDIH) states that North America was devastated by some sort of extraterrestrial event ~12,800 calendar years before present. Two fundamental questions persist in the debate over the YDIH: Can the results of analyses for purported impact indicators be reproduced? And are the indicators unique to the lower YD boundary (YDB), i.e., ~12.8k cal yrs BP? A test reported here presents the results of analyses that address these questions. Two different labs analyzed identical splits of samples collected at, above, and below the ~12.8ka zone at the Lubbock Lake archaeological site (LL) in northwest Texas. Both labs reported similar variation in levels of magnetic micrograins (>300 mg/kg for samples >12.8ka and <11.5ka, but <150 mg/kg for samples 12.8ka to 11.5ka). Analysis for magnetic microspheres in one split, reported elsewhere, produced very low to nonexistent levels throughout the section. In the other split, reported here, the levels of magnetic microspherules and nanodiamonds are low or nonexistent at, below, and above the YDB, with the notable exception of a sample <11,500 cal years old. In that sample the claimed impact proxies were recovered at abundances two to four orders of magnitude above those from the other samples. Reproducibility of at least some analyses is problematic. In particular, no standard criteria exist for identification of magnetic spheres. Moreover, the purported impact proxies are not unique to the YDB.

  3. Exploring the sequential lineup advantage using WITNESS.

    Science.gov (United States)

    Goodsell, Charles A; Gronlund, Scott D; Carlson, Curt A

    2010-12-01

    Advocates claim that the sequential lineup is an improvement over simultaneous lineup procedures, but no formal (quantitatively specified) explanation exists for why it is better. The computational model WITNESS (Clark, Appl Cogn Psychol 17:629-654, 2003) was used to develop theoretical explanations for the sequential lineup advantage. In its current form, WITNESS produced a sequential advantage only by pairing conservative sequential choosing with liberal simultaneous choosing. However, this combination failed to approximate four extant experiments that exhibited large sequential advantages. Two of these experiments became the focus of our efforts because the data were uncontaminated by likely suspect position effects. Decision-based and memory-based modifications to WITNESS approximated the data and produced a sequential advantage. The next step is to evaluate the proposed explanations and modify public policy recommendations accordingly.

  4. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    Science.gov (United States)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    Stable isotope compositions of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carry important climatological and ecological information and are therefore widely used in paleontological and archaeological research. For the analysis of stable isotope compositions, both phases, hydroxyapatite and collagen, have their more or less well-established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less starting material) for PO₄³⁻ oxygen isotope composition. However, the uniqueness and (pre-)historical value of each archaeological and paleontological finding leave precious little material available for stable isotope analyses, encouraging further development of microanalytical methods. Here we present the first results in developing extraction methods that combine collagen C- and N-isotope analyses with PO₄³⁻ O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and PO₄³⁻ fractions, followed by a further purification step with H₂O₂ (PO₄³⁻ fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ¹⁵N, δ¹³C and δ¹⁸O(PO₄) values. The method may be incorporated in detailed investigation of sequentially developing skeletal material such as teeth, potentially allowing for the investigation of interannual variability in climatological/environmental signals or investigation of the early life history of an individual.

  5. A test of the massive binary black hole hypothesis - Arp 102B

    Science.gov (United States)

    Helpern, J. P.; Filippenko, Alexei V.

    1988-01-01

    The emission-line spectra of several AGN have broad peaks which are significantly displaced in velocity with respect to the host galaxy. An interpretation of this effect in terms of orbital motion of a binary black hole predicts periods of a few centuries. It is pointed out here that recent measurements of the masses and sizes of many low-luminosity AGN imply orbital periods much shorter than this. In particular, it is found that the elliptical galaxy Arp 102B is the most likely candidate for observation of radial velocity variations; its period is expected to be about 3 yr. The H-alpha line profile of Arp 102B has been measured for 5 yr without detecting any change in velocity, and it is thus found that a rather restrictive observational test of the massive binary black hole hypothesis already exists, albeit for this one object.

  6. A single test for rejecting the null hypothesis in subgroups and in the overall sample.

    Science.gov (United States)

    Lin, Yunzhi; Zhou, Kefei; Ganju, Jitendra

    2017-01-01

    In clinical trials, some patient subgroups are likely to demonstrate larger effect sizes than other subgroups. For example, the effect size, or informally the benefit with treatment, is often greater in patients with a moderate condition of a disease than in those with a mild condition. A limitation of the usual method of analysis is that it does not incorporate this ordering of effect size by patient subgroup. We propose a test statistic which supplements the conventional test by including this information and simultaneously tests the null hypothesis in pre-specified subgroups and in the overall sample. It results in more power than the conventional test when the differences in effect sizes across subgroups are at least moderately large; otherwise it loses power. The method involves combining p-values from models fit to pre-specified subgroups and the overall sample in a manner that assigns greater weight to subgroups in which a larger effect size is expected. Results are presented for randomized trials with two and three subgroups.
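
    One standard way to combine subgroup and overall p-values with pre-specified weights is the weighted Stouffer method; the sketch below illustrates the idea and is not necessarily the authors' exact statistic (the weights and p-values are illustrative):

        import math
        from statistics import NormalDist

        def weighted_stouffer(p_values, weights):
            """Combine one-sided p-values with pre-specified weights:
            z = sum(w_i * z_i) / sqrt(sum(w_i^2)), z_i = Phi^{-1}(1 - p_i)."""
            nd = NormalDist()
            zs = [nd.inv_cdf(1 - p) for p in p_values]
            num = sum(w * z for w, z in zip(weights, zs))
            den = math.sqrt(sum(w * w for w in weights))
            return 1 - nd.cdf(num / den)

        # p-values from (moderate subgroup, mild subgroup, overall sample),
        # up-weighting the subgroup expected to show the larger effect.
        print(weighted_stouffer([0.03, 0.20, 0.04], weights=[2.0, 1.0, 1.5]))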

  7. Whiplash and the compensation hypothesis.

    Science.gov (United States)

    Spearing, Natalie M; Connelly, Luke B

    2011-12-01

    Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.

  8. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    Science.gov (United States)

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: they inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
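
    Given the test's significance level alpha, its power, and the prior probability R that the tested effect is real, the PPV/NPV calculation follows directly from Bayes' rule; a minimal sketch:

        def ppv_npv(alpha, power, r):
            """Predictive values of a significance test, given the prior
            probability r that the tested effect is real.
            PPV = P(effect real | significant result);
            NPV = P(no effect | non-significant result)."""
            ppv = power * r / (power * r + alpha * (1 - r))
            npv = (1 - alpha) * (1 - r) / ((1 - alpha) * (1 - r) + (1 - power) * r)
            return ppv, npv

        # alpha = 0.05, power = 0.80, and a one-in-four prior that H1 is true
        print(ppv_npv(0.05, 0.80, 0.25))  # -> (~0.842, ~0.934)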

  9. Lexical decoder for continuous speech recognition: sequential neural network approach

    International Nuclear Information System (INIS)

    Iooss, Christine

    1991-01-01

    The work presented in this dissertation concerns the study of a connectionist architecture for treating sequential inputs. In this context, the model proposed by J.L. Elman, a recurrent multilayer network, is used. Its abilities and its limits are evaluated. Modifications are made in order to treat erroneous or noisy sequential inputs and to classify patterns. The application context of this study is the realisation of a lexical decoder for analytical multi-speaker continuous speech recognition. Lexical decoding is performed from lattices of phonemes, which are obtained after an acoustic-phonetic decoding stage relying on a K-nearest-neighbours search technique. Tests are done on sentences formed from a lexicon of 20 words. The results obtained show the ability of the proposed connectionist model to take sequentiality into account at the input level, to memorize the context, and to treat noisy or erroneous inputs. (author) [fr]

  10. Sequential memory: Binding dynamics

    Science.gov (United States)

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

    Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection, staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
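
    For reference, a generic generalized Lotka-Volterra form for such binding models can be written as follows (the coupling structure of the authors' specific model may differ, so this is only a sketch):

        \dot{x}_i = x_i \Big( \sigma_i - \sum_{j=1}^{N} \rho_{ij} \, x_j \Big), \qquad i = 1, \dots, N,

    where x_i >= 0 is the activity of the i-th mode, sigma_i its intrinsic growth rate, and the asymmetric competition matrix rho_ij is what produces the saddle equilibrium points and heteroclinic trajectories described above.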

  11. Testing the Hypothesis of Biofilm as a Source for Soft Tissue and Cell-Like Structures Preserved in Dinosaur Bone

    Science.gov (United States)

    2016-01-01

    Recovery of still-soft tissue structures, including blood vessels and osteocytes, from dinosaur bone after demineralization was reported in 2005 and in subsequent publications. Despite multiple lines of evidence supporting an endogenous source, it was proposed that these structures arose from contamination from biofilm-forming organisms. To test the hypothesis that soft tissue structures result from microbial invasion of the fossil bone, we used two different biofilm-forming microorganisms to inoculate modern bone fragments from which organic components had been removed. We show fundamental morphological, chemical and textural differences between the resultant biofilm structures and those derived from dinosaur bone. The data do not support the hypothesis that biofilm-forming microorganisms are the source of these structures. PMID:26926069

  12. Are only infants held more often on the left? If so, why? Testing the attention-emotion hypothesis with an infant, a vase, and two chimeric tests, one "emotional," one not.

    Science.gov (United States)

    Harris, Lauren Julius; Cárdenas, Rodrigo A; Stewart, Nathaniel D; Almerigi, Jason B

    2018-05-16

    Most adults, especially women, hold infants and dolls but not books or packages on the left side. One reason may be that attention is more often leftward in response to infants, unlike emotionally neutral objects like books and packages. Women's stronger bias may reflect greater responsiveness to infants. Previously, we tested the attention hypothesis by comparing women's side-of-hold of a doll, book, and package with direction-of-attention on the Chimeric Faces Test (CFT) [Harris, L. J., Cárdenas, R. A., Spradlin, Jr., M. P., & Almerigi, J. B. (2010). Why are infants held on the left? A test of the attention hypothesis with a doll, a book, and a bag. Laterality: Asymmetries of Body, Brain and Cognition, 15(5), 548-571. doi: 10.1080/13576500903064018 ]. Only the doll was held more often to the left, and only for the doll were side-of-hold and CFT scores related, with left-holders showing a stronger left-attention bias than right-holders. In the current study, we tested men and women with a doll and the CFT along with a vase as a neutral object and a "non-emotional" chimeric test. Again, only the doll was held more often to the left, but now, although both chimeric tests showed left-attention biases, scores were unrelated to side-of-hold. Nor were there sex differences. The results support left-hold selectivity but not the attention hypothesis, with or without the element of emotion. They also raise questions about the contribution of sex-of-holder. We conclude with suggestions for addressing these issues.

  13. Sequential lineup presentation: Patterns and policy

    OpenAIRE

    Lindsay, R C L; Mansour, Jamal K; Beaudry, J L; Leach, A-M; Bertrand, M I

    2009-01-01

    Sequential lineups were offered as an alternative to the traditional simultaneous lineup. Sequential lineups reduce incorrect lineup selections; however, the accompanying loss of correct identifications has resulted in controversy regarding adoption of the technique. We discuss the procedure and research relevant to (1) the pattern of results found using sequential versus simultaneous lineups; (2) reasons (theory) for differences in witness responses; (3) two methodological issues; and (4) im...

  14. Sequential Product of Quantum Effects: An Overview

    Science.gov (United States)

    Gudder, Stan

    2010-12-01

    This article presents an overview for the theory of sequential products of quantum effects. We first summarize some of the highlights of this relatively recent field of investigation and then provide some new results. We begin by discussing sequential effect algebras which are effect algebras endowed with a sequential product satisfying certain basic conditions. We then consider sequential products of (discrete) quantum measurements. We next treat transition effect matrices (TEMs) and their associated sequential product. A TEM is a matrix whose entries are effects and whose rows form quantum measurements. We show that TEMs can be employed for the study of quantum Markov chains. Finally, we prove some new results concerning TEMs and vector densities.
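
    For reference, on a Hilbert space the prototypical sequential product of two effects A and B (positive operators with 0 <= A, B <= I) is

        A \circ B = A^{1/2} \, B \, A^{1/2},

    where A^{1/2} is the unique positive square root of A; sequential effect algebras axiomatize the basic properties of this operation.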

  15. Gender differences in emotion perception and self-reported emotional intelligence: A test of the emotion sensitivity hypothesis

    OpenAIRE

    Fischer, Agneta H.; Kret, Mariska E.; Broekens, Joost

    2018-01-01

    Previous meta-analyses and reviews on gender differences in emotion recognition have shown a small to moderate female advantage. However, inconsistent evidence from recent studies has raised questions regarding the implications of different methodologies, stimuli, and samples. In the present research based on a community sample of more than 5000 participants, we tested the emotional sensitivity hypothesis, stating that women are more sensitive to perceive subtle, i.e. low intense or ambiguous...

  16. Testing the niche variation hypothesis with a measure of body condition

    Science.gov (United States)

    Individual variation and fitness are cornerstones of evolution by natural selection. The niche variation hypothesis (NVH) posits that when interspecific competition is relaxed, intraspecific competition should drive niche expansion by selection favoring use of novel resources. Po...

  17. Cross-validation and hypothesis testing in neuroimaging: An irenic comment on the exchange between Friston and Lindquist et al.

    Science.gov (United States)

    Reiss, Philip T

    2015-08-01

    The "ten ironic rules for statistical reviewers" presented by Friston (2012) prompted a rebuttal by Lindquist et al. (2013), which was followed by a rejoinder by Friston (2013). A key issue left unresolved in this discussion is the use of cross-validation to test the significance of predictive analyses. This note discusses the role that cross-validation-based and related hypothesis tests have come to play in modern data analyses, in neuroimaging and other fields. It is shown that such tests need not be suboptimal and can fill otherwise-unmet inferential needs. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. A Hypothesis-Driven Approach to Site Investigation

    Science.gov (United States)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be pretended. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. The basic principle

  19. Is Variability in Mate Choice Similar for Intelligence and Personality Traits? Testing a Hypothesis about the Evolutionary Genetics of Personality

    Science.gov (United States)

    Stone, Emily A.; Shackelford, Todd K.; Buss, David M.

    2012-01-01

    This study tests the hypothesis presented by Penke, Denissen, and Miller (2007a) that condition-dependent traits, including intelligence, attractiveness, and health, are universally and uniformly preferred as characteristics in a mate relative to traits that are less indicative of condition, including personality traits. We analyzed…

  20. Efficient sequential and parallel algorithms for record linkage.

    Science.gov (United States)

    Mamun, Abdullah-Al; Mi, Tian; Aseltine, Robert; Rajasekaran, Sanguthevar

    2014-01-01

    Integrating data from multiple sources is a crucial and challenging problem. Even though there exist numerous algorithms for record linkage or deduplication, they suffer from either large time needs or restrictions on the number of datasets that they can integrate. In this paper we report efficient sequential and parallel algorithms for record linkage which handle any number of datasets and outperform previous algorithms. Our algorithms employ hierarchical clustering algorithms as the basis. A key idea that we use is radix sorting on certain attributes to eliminate identical records before any further processing. Another novel idea is to form a graph that links similar records and find the connected components. Our sequential and parallel algorithms have been tested on a real dataset of 1,083,878 records and synthetic datasets ranging in size from 50,000 to 9,000,000 records. Our sequential algorithm runs at least two times faster, for any dataset, than the previous best-known algorithm, the two-phase algorithm using faster computation of the edit distance (TPA (FCED)). The speedups obtained by our parallel algorithm are almost linear. For example, we get a speedup of 7.5 with 8 cores (residing in a single node), 14.1 with 16 cores (residing in two nodes), and 26.4 with 32 cores (residing in four nodes). We have compared the performance of our sequential algorithm with TPA (FCED) and found that our algorithm outperforms the previous one. The accuracy is the same as that of this previous best-known algorithm.
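
    One of the ideas named above, forming a graph that links similar records and finding its connected components, can be sketched with a union-find structure; the similarity edges below are illustrative stand-ins for the edit-distance comparisons the algorithms actually perform:

        from collections import defaultdict

        def connected_components(edges, n):
            """Union-find: group record indices linked by any similarity edge."""
            parent = list(range(n))

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]  # path halving
                    i = parent[i]
                return i

            for a, b in edges:
                ra, rb = find(a), find(b)
                if ra != rb:
                    parent[ra] = rb

            groups = defaultdict(list)
            for i in range(n):
                groups[find(i)].append(i)
            return list(groups.values())

        # Pairs deemed "similar" (e.g. edit distance below a threshold on
        # name + date of birth); the pairs here are invented for illustration.
        edges = [(0, 2), (2, 4), (1, 3)]
        print(connected_components(edges, n=5))  # -> [[0, 2, 4], [1, 3]]

    Each component is then treated as one real-world entity when deduplicating.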

  1. Testing the Hypothesis of Biofilm as a Source for Soft Tissue and Cell-Like Structures Preserved in Dinosaur Bone.

    Directory of Open Access Journals (Sweden)

    Mary Higby Schweitzer

    Recovery of still-soft tissue structures, including blood vessels and osteocytes, from dinosaur bone after demineralization was reported in 2005 and in subsequent publications. Despite multiple lines of evidence supporting an endogenous source, it was proposed that these structures arose from contamination from biofilm-forming organisms. To test the hypothesis that soft tissue structures result from microbial invasion of the fossil bone, we used two different biofilm-forming microorganisms to inoculate modern bone fragments from which organic components had been removed. We show fundamental morphological, chemical and textural differences between the resultant biofilm structures and those derived from dinosaur bone. The data do not support the hypothesis that biofilm-forming microorganisms are the source of these structures.

  2. Biomic specialization and speciation rates in ruminants (Cetartiodactyla, Mammalia): a test of the resource-use hypothesis at the global scale.

    Science.gov (United States)

    Cantalapiedra, Juan L; Hernández Fernández, Manuel; Morales, Jorge

    2011-01-01

    The resource-use hypothesis proposed by E.S. Vrba predicts that specialist species have higher speciation and extinction rates than generalists because they are more susceptible to environmental changes and vicariance. In this work, we test some of the predictions derived from this hypothesis on the 197 extant and recently extinct species of Ruminantia (Cetartiodactyla, Mammalia) using the biomic specialization index (BSI) of each species, which is based on its distribution within different biomes. We ran 10,000 Monte Carlo simulations of our data in order to get a null distribution of BSI values against which to contrast the observed data. Additionally, we drew on a supertree of the ruminants and a phylogenetic likelihood-based method (QuaSSE) for testing whether the degree of biomic specialization affects speciation rates in ruminant lineages. Our results are consistent with the predictions of the resource-use hypothesis, which foretells a higher speciation rate of lineages restricted to a single biome (BSI = 1) and a higher frequency of specialist species in biomes that underwent a high degree of contraction and fragmentation during climatic cycles. Bovids and deer present differential specialization across biomes; cervids show higher specialization in biomes with a marked hydric seasonality (tropical deciduous woodlands and sclerophyllous woodlands), while bovids present higher specialization in a greater variety of biomes. This might be the result of divergent physiological constraints as well as a different biogeographic and evolutionary history.

  3. Quantum Inequalities and Sequential Measurements

    International Nuclear Information System (INIS)

    Candelpergher, B.; Grandouz, T.; Rubinx, J.L.

    2011-01-01

    In this article, the peculiar context of sequential measurements is chosen in order to analyze the quantum specificity in the two most famous examples of the Heisenberg and Bell inequalities. Results are found to be at some interesting variance with customary textbook materials, where the context of initial-state re-initialization is described. A key point of the analysis is the possibility of defining joint probability distributions for sequential random variables associated to quantum operators. Within the sequential context, it is shown that joint probability distributions can be defined in situations where not all of the quantum operators (corresponding to random variables) commute two by two. (authors)

  4. Sequential Objective Structured Clinical Examination based on item response theory in Iran

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2017-09-01

    Purpose: In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT). Methods: We carried out a retrospective observational study. At each station of a 10-station OSCE, the students' performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students' ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost. Results: A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%. Conclusion: If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.

  5. Sequential Objective Structured Clinical Examination based on item response theory in Iran.

    Science.gov (United States)

    Hejri, Sara Mortaz; Jalili, Mohammad

    2017-01-01

    In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT). We carried out a retrospective observational study. At each station of a 10-station OSCE, the students' performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students' ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost. A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%. If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.
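
    For reference, the graded response model used in these studies assigns each ordered score category a probability derived from cumulative logistic curves; a minimal sketch with illustrative parameters (not the study's estimates):

        import math

        def grm_category_probs(theta, a, b):
            """Samejima's graded response model: probability of each ordered
            score category, given ability theta, discrimination a, and
            increasing category thresholds b[0] < b[1] < ... < b[K-2]."""
            def p_at_least(k):  # P(score >= k | theta)
                if k == 0:
                    return 1.0
                return 1.0 / (1.0 + math.exp(-a * (theta - b[k - 1])))

            K = len(b) + 1
            return [p_at_least(k) - (p_at_least(k + 1) if k + 1 < K else 0.0)
                    for k in range(K)]

        # Illustrative station: a 4-point Likert scale, hence 3 thresholds.
        print(grm_category_probs(theta=0.5, a=1.2, b=[-1.0, 0.0, 1.5]))

    Summing the information from such curves over stations is what lets the most discriminative stations be identified for the screening test.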

  6. On the Flexibility of Social Source Memory: A Test of the Emotional Incongruity Hypothesis

    Science.gov (United States)

    Bell, Raoul; Buchner, Axel; Kroneisen, Meike; Giang, Trang

    2012-01-01

    A popular hypothesis in evolutionary psychology posits that reciprocal altruism is supported by a cognitive module that helps cooperative individuals to detect and remember cheaters. Consistent with this hypothesis, a source memory advantage for faces of cheaters (better memory for the cheating context in which these faces were encountered) was…

  7. Received social support and exercising: An intervention study to test the enabling hypothesis.

    Science.gov (United States)

    Rackow, Pamela; Scholz, Urte; Hornung, Rainer

    2015-11-01

    Received social support is considered important for health-enhancing exercise participation. The enabling hypothesis of social support suggests an indirect association of social support and exercising via constructs of self-regulation, such as self-efficacy. This study aimed to test an expanded enabling hypothesis by examining the effects of different kinds of social support (i.e., emotional and instrumental) on exercising not only via self-efficacy but also via self-monitoring and action planning. An 8-week online study was conducted. Participants were randomly assigned to an intervention or a control group. The intervention comprised finding and then exercising regularly with a new exercise companion. Intervention and control group effects were compared by a manifest multigroup model. Received emotional social support predicted self-efficacy, self-monitoring, and action planning in the intervention group. Moreover, received emotional social support was indirectly connected with exercise via the examined mediators. The indirect effect from received emotional social support via self-efficacy mainly contributed to the total effect. No direct or indirect effect of received instrumental social support on exercise emerged. In the control group, neither emotional nor instrumental social support was associated with any of the self-regulation constructs or with exercise. Actively looking for a new exercise companion and exercising together seems to be beneficial for the promotion of received emotional and instrumental social support. Emotional support in turn promotes exercise by enabling better self-regulation, in particular self-efficacy. Statement of contribution What is already known on this subject? With the 'enabling hypothesis', Benight and Bandura (2004, Behav. Res. Ther., 42, 1129) claimed that social support indirectly affects behaviour via self-efficacy. Research in the domain of physical exercise has provided evidence for this enabling hypothesis on a

  8. Assess the Critical Period Hypothesis in Second Language Acquisition

    Science.gov (United States)

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  9. A field test of three LQAS designs to assess the prevalence of acute malnutrition.

    Science.gov (United States)

    Deitchler, Megan; Valadez, Joseph J; Egge, Kari; Fernandez, Soledad; Hennigan, Mary

    2007-08-01

    The conventional method for assessing the prevalence of Global Acute Malnutrition (GAM) in emergency settings is the 30 x 30 cluster-survey. This study describes alternative approaches: three Lot Quality Assurance Sampling (LQAS) designs to assess GAM. The LQAS designs were field-tested and their results compared with those from a 30 x 30 cluster-survey. Computer simulations confirmed that small clusters, instead of a simple random sample, could be used for LQAS assessments of GAM. Three LQAS designs were developed (33 x 6, 67 x 3, and Sequential) to assess GAM thresholds of 10, 15, and 20%. The designs were field-tested simultaneously with a 30 x 30 cluster-survey in Siraro, Ethiopia, during June 2003. Using a nested study design, anthropometric, morbidity, and vaccination data were collected on all children 6-59 months in sampled households. Hypothesis tests about GAM thresholds were conducted for each LQAS design; for the 67 x 3 and Sequential designs, these tests classified GAM relative to the 10% threshold. Point estimates were obtained for the 30 x 30 cluster-survey and the 33 x 6 and 67 x 3 LQAS designs. Point estimates for the 33 x 6 and 67 x 3 designs were similar to those of the 30 x 30 cluster-survey for GAM (6.7%, CI = 3.2-10.2%; 8.2%, CI = 4.3-12.1%; 7.4%, CI = 4.8-9.9%) and all other indicators. The CIs for the LQAS designs were only slightly wider than the CIs for the 30 x 30 cluster-survey, yet the LQAS designs required substantially less time to administer. The LQAS designs provide statistically appropriate alternatives to the more time-consuming 30 x 30 cluster-survey. However, additional field-testing is needed using independent samples rather than a nested study design.
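
    An LQAS classification rule is, at bottom, a binomial decision rule whose error rates can be computed exactly; the sample size, decision value, and prevalences below are invented for illustration and are not the study's designs:

        from math import comb

        def binom_cdf(d, n, p):
            """P(X <= d) for X ~ Binomial(n, p)."""
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

        # Illustrative LQAS-style rule: sample n = 198 children (33 clusters
        # of 6) and classify GAM as high unless at most d = 13 sampled
        # children are acutely malnourished.
        n, d = 198, 13
        print("P(classify low  | true GAM = 15%):", binom_cdf(d, n, 0.15))
        print("P(classify high | true GAM =  5%):", 1 - binom_cdf(d, n, 0.05))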

  10. Testing the Cuckoldry Risk Hypothesis of Partner Sexual Coercion in Community and Forensic Samples

    Directory of Open Access Journals (Sweden)

    Joseph A. Camilleri

    2009-04-01

    Full Text Available Evolutionary theory has informed the investigation of male sexual coercion but has seldom been applied to the analysis of sexual coercion within established couples. The cuckoldry risk hypothesis, that sexual coercion is a male tactic used to reduce the risk of extrapair paternity, was tested in two studies. In a community sample, indirect cues of infidelity predicted male propensity for sexual coaxing in the relationship, and direct cues predicted propensity for sexual coercion. In the forensic sample, we found that most partner rapists experienced cuckoldry risk prior to committing their offence and experienced more types of cuckoldry risk events than non-sexual partner assaulters. These findings suggest that cuckoldry risk influences male sexual coercion in established sexual relationships.

  11. MUSCLE OR MOTIVATION? A STOP SIGNAL STUDY ON THE EFFECTS OF SEQUENTIAL COGNITIVE CONTROL

    Directory of Open Access Journals (Sweden)

    Hilde M. Huizenga

    2012-05-01

    Full Text Available Performance in cognitive control tasks deteriorates when these tasks are performed together with other tasks that also require cognitive control, that is, if simultaneous cognitive control is required. Surprisingly, this decrease in performance is also observed if tasks are preceded by other cognitive control tasks, that is, if sequential cognitive control is required. The common explanation for the latter finding is that previous acts of cognitive control deplete a common resource, just as a muscle becomes fatigued after repeated use. An alternative explanation, however, has also been put forward, namely that repeated acts of cognitive control reduce the motivation to match allocated resources to required resources. In this paper we formalize these two accounts, the muscle account and the motivation account, and show that they yield differential predictions on the interaction between simultaneous and sequential cognitive control. Such an interaction is not predicted by the muscle account, whereas it is predicted by the motivation account. These predictions were tested in a paradigm in which participants performed a series of stop-signal tasks that varied both in their demands on simultaneous control and in their demands on sequential control. This paradigm, combined with a multilevel analysis, offered the possibility to test the differential predictions directly. Results of two studies indicate that an interaction between simultaneous and sequential cognitive control is present. Therefore it is concluded that effects of sequential cognitive control are best explained by the motivation account.

  12. Resemblance profiles as clustering decision criteria: Estimating statistical power, error, and correspondence for a hypothesis test for multivariate structure.

    Science.gov (United States)

    Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F

    2017-04-01

    Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
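
    The flavour of a resemblance-profile test can be conveyed with a simplified permutation sketch in Python (numpy and scipy assumed). This is not the published DISPROF algorithm: the deviation statistic, the column-wise permutation scheme, and the toy data are all illustrative assumptions:

      import numpy as np
      from scipy.spatial.distance import pdist

      def profile(x):
          # Ordered profile of all pairwise Euclidean dissimilarities.
          return np.sort(pdist(x))

      def disprof_like_test(x, n_perm=999, seed=0):
          rng = np.random.default_rng(seed)
          # Permute each descriptor independently to destroy multivariate structure.
          perms = np.array([profile(np.column_stack([rng.permutation(c) for c in x.T]))
                            for _ in range(n_perm)])
          expected = perms.mean(axis=0)
          pi_obs = np.abs(profile(x) - expected).sum()    # observed deviation
          pi_null = np.abs(perms - expected).sum(axis=1)  # null deviations
          p = (1 + (pi_null >= pi_obs).sum()) / (1 + n_perm)
          return pi_obs, p

      rng = np.random.default_rng(1)
      x = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
      print(disprof_like_test(x))  # a small p suggests real group structure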

  13. Motivation in vigilance - A test of the goal-setting hypothesis of the effectiveness of knowledge of results.

    Science.gov (United States)

    Warm, J. S.; Riechmann, S. W.; Grasha, A. F.; Seibel, B.

    1973-01-01

    This study tested the prediction, derived from the goal-setting hypothesis, that the facilitating effects of knowledge of results (KR) in a simple vigilance task should be related directly to the level of the performance standard used to regulate KR. Two groups of Ss received dichotomous KR in terms of whether Ss' response times (RTs) to signal detections exceeded a high or low standard of performance. The aperiodic offset of a visual signal was the critical event for detection. The vigil was divided into a training phase followed by testing, during which KR was withdrawn. Knowledge of results enhanced performance in both phases. However, the two standards used to regulate feedback contributed little to these effects.

  14. The Influence of Maternal Acculturation, Neighborhood Disadvantage, and Parenting on Chinese American Adolescents' Conduct Problems: Testing the Segmented Assimilation Hypothesis

    Science.gov (United States)

    Liu, Lisa L.; Lau, Anna S.; Chen, Angela Chia-Chen; Dinh, Khanh T.; Kim, Su Yeong

    2009-01-01

    Associations among neighborhood disadvantage, maternal acculturation, parenting and conduct problems were investigated in a sample of 444 Chinese American adolescents. Adolescents (54% female, 46% male) ranged from 12 to 15 years of age (mean age = 13.0 years). Multilevel modeling was employed to test the hypothesis that the association between…

  15. Sequential Generalized Transforms on Function Space

    Directory of Open Access Journals (Sweden)

    Jae Gil Choi

    2013-01-01

    Full Text Available We define two sequential transforms on a function space C_{a,b}[0,T] induced by a generalized Brownian motion process. We then establish the existence of the sequential transforms for functionals in a Banach algebra of functionals on C_{a,b}[0,T]. We also establish that each of these transforms acts like an inverse transform of the other. Finally, we give some remarks about certain relations between our sequential transforms and other well-known transforms on C_{a,b}[0,T].

  16. Using Saccharomyces cerevisiae to Test the Mutagenicity of Household Compounds: An Open Ended Hypothesis-Driven Teaching Lab

    OpenAIRE

    Marshall, Pamela A.

    2007-01-01

    In our Fundamentals of Genetics lab, students perform a wide variety of labs to reinforce and extend the topics covered in lecture. I developed an active-learning lab to augment the lecture topic of mutagenesis. In this lab exercise, students determine if a compound they bring from home is a mutagen. Students are required to read extensive background material, perform research to find a potential mutagen to test, develop a hypothesis, and bring to the lab their own suspected mutagen. This lab...

  17. Testing the carotenoid trade-off hypothesis in the polychromatic Midas cichlid, Amphilophus citrinellus.

    Science.gov (United States)

    Lin, Susan M; Nieves-Puigdoller, Katherine; Brown, Alexandria C; McGraw, Kevin J; Clotfelter, Ethan D

    2010-01-01

    Many animals use carotenoid pigments derived from their diet for coloration and immunity. The carotenoid trade-off hypothesis predicts that, under conditions of carotenoid scarcity, individuals may be forced to allocate limited carotenoids to either coloration or immunity. In polychromatic species, the pattern of allocation may differ among individuals. We tested the carotenoid trade-off hypothesis in the Midas cichlid, Amphilophus citrinellus, a species with two ontogenetic color morphs, barred and gold, the latter of which is the result of carotenoid expression. We performed a diet-supplementation experiment in which cichlids of both color morphs were assigned to one of two diet treatments that differed only in carotenoid content (beta-carotene, lutein, and zeaxanthin). We measured integument color using spectrometry, quantified carotenoid concentrations in tissue and plasma, and assessed innate immunity using lysozyme activity and alternative complement pathway assays. In both color morphs, dietary carotenoid supplementation elevated plasma carotenoid circulation but failed to affect skin coloration. Consistent with observable differences in integument coloration, we found that gold fish sequestered more carotenoids in skin tissue than barred fish, but barred fish had higher concentrations of carotenoids in plasma than gold fish. Neither measure of innate immunity differed between gold and barred fish, or as a function of dietary carotenoid supplementation. Lysozyme activity, but not complement activity, was strongly affected by body condition. Our data show that a diet low in carotenoids is sufficient to maintain both coloration and innate immunity in Midas cichlids. Our data also suggest that the developmental transition from the barred to gold morph is not accompanied by a decrease in innate immunity in this species.

  18. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis; Paaske, Erik

    1998-01-01

    We describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon (RS) codes with nonuniform profile. With this scheme decoding with good performance is possible as low as Eb/N0=0.6 dB, which is about 1.25 dB below the signal-to-noise ratio (SNR) that marks the cutoff rate for the full system. Accounting for about 0.45 dB due to the outer codes, sequential decoding takes place at about 1.7 dB below the SNR cutoff rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first RS word is decoded after C computations are presented. These results are supported…

  19. Testing of Frank's hypothesis on a containerless packing of macroscopic soft spheres and comparison with mono-atomic metallic liquids

    International Nuclear Information System (INIS)

    Sahu, K.K.; Wessels, V.; Kelton, K.F.; Loeffler, J.F.

    2011-01-01

    Highlights: → Testing of Frank's hypothesis for Centripetal Packing (CP) has been proposed. → It is shown that CP is an idealized model for Monatomic Supercooled Liquid (MSL). → The CP is fit for comparing with studies on MSL in a containerless environment. → We measure local orders in CP by HA and BOO methods for the first time. → It is shown that icosahedral order is greater in CP than MSL and reasons explored. - Abstract: It is well-known that metallic liquids can exist below their equilibrium melting temperature for a considerable time. To explain this, Frank proposed that icosahedral ordering, incompatible with crystalline long-range order, is prevalent in the atomic structure of these liquids, stabilizing them and enabling them to be supercooled. Some studies of the atomic structures of metallic liquids using Beam-line Electrostatic Levitation (BESL; containerless melting), and other techniques, support this hypothesis. Here we examine Frank's hypothesis in a system of macroscopic, monodisperse deformable spheres obtained by containerless packing under the influence of centripetal force. The local structure of this packing is analyzed and compared with atomic ensembles of liquid transition metals obtained by containerless melting using the BESL method.

  20. What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing

    Science.gov (United States)

    Chang, Mark

    2017-01-01

    We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…

  1. Life Origination Hydrate Hypothesis (LOH-Hypothesis

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

    Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs), which are N-bases, riboses, nucleosides, nucleotides, DNA- and RNA-like molecules, amino-acids, and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  2. An automatic system for acidity determination based on sequential injection titration and the monosegmented flow approach.

    Science.gov (United States)

    Kozak, Joanna; Wójtowicz, Marzena; Gawenda, Nadzieja; Kościelniak, Paweł

    2011-06-15

    An automatic sequential injection system, combining monosegmented flow analysis, sequential injection analysis and sequential injection titration is proposed for acidity determination. The system enables controllable sample dilution and generation of standards of required concentration in a monosegmented sequential injection manner, sequential injection titration of the prepared solutions, data collecting, and handling. It has been tested on spectrophotometric determination of acetic, citric and phosphoric acids with sodium hydroxide used as a titrant and phenolphthalein or thymolphthalein (in the case of phosphoric acid determination) as indicators. Accuracy better than |4.4|% (RE) and repeatability better than 2.9% (RSD) have been obtained. It has been applied to the determination of total acidity in vinegars and various soft drinks. The system provides low sample (less than 0.3 mL) consumption. On average, analysis of a sample takes several minutes. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Biased lineups: sequential presentation reduces the problem.

    Science.gov (United States)

    Lindsay, R C; Lea, J A; Nosworthy, G J; Fulford, J A; Hector, J; LeVan, V; Seabrook, C

    1991-12-01

    Biased lineups have been shown to increase significantly false, but not correct, identification rates (Lindsay, Wallbridge, & Drennan, 1987; Lindsay & Wells, 1980; Malpass & Devine, 1981). Lindsay and Wells (1985) found that sequential lineup presentation reduced false identification rates, presumably by reducing reliance on relative judgment processes. Five staged-crime experiments were conducted to examine the effect of lineup biases and sequential presentation on eyewitness recognition accuracy. Sequential lineup presentation significantly reduced false identification rates from fair lineups as well as from lineups biased with regard to foil similarity, instructions, or witness attire, and from lineups biased in all of these ways. The results support recommendations that police present lineups sequentially.

  4. A novel hypothesis on the sensitivity of the fecal occult blood test: Results of a joint analysis of 3 randomized controlled trials.

    Science.gov (United States)

    Lansdorp-Vogelaar, Iris; van Ballegooijen, Marjolein; Boer, Rob; Zauber, Ann; Habbema, J Dik F

    2009-06-01

    Estimates of the fecal occult blood test (FOBT) (Hemoccult II) sensitivity differed widely between screening trials and led to divergent conclusions on the effects of FOBT screening. We used microsimulation modeling to estimate a preclinical colorectal cancer (CRC) duration and sensitivity for unrehydrated FOBT from the data of 3 randomized controlled trials of Minnesota, Nottingham, and Funen. In addition to 2 usual hypotheses on the sensitivity of FOBT, we tested a novel hypothesis where sensitivity is linked to the stage of clinical diagnosis in the situation without screening. We used the MISCAN-Colon microsimulation model to estimate sensitivity and duration, accounting for differences between the trials in demography, background incidence, and trial design. We tested 3 hypotheses for FOBT sensitivity: sensitivity is the same for all preclinical CRC stages, sensitivity increases with each stage, and sensitivity is higher for the stage in which the cancer would have been diagnosed in the absence of screening than for earlier stages. Goodness-of-fit was evaluated by comparing expected and observed rates of screen-detected and interval CRC. The hypothesis with a higher sensitivity in the stage of clinical diagnosis gave the best fit. Under this hypothesis, sensitivity of FOBT was 51% in the stage of clinical diagnosis and 19% in earlier stages. The average duration of preclinical CRC was estimated at 6.7 years. Our analysis corroborated a long duration of preclinical CRC, with FOBT most sensitive in the stage of clinical diagnosis. (c) 2009 American Cancer Society.
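
    The stage-linked sensitivity hypothesis lends itself to a toy simulation. In the Python sketch below, only the 51%/19% sensitivities and the roughly 6.7-year mean preclinical duration come from the abstract; the stage count, sojourn-time distribution, and screening schedule are invented for illustration and do not reproduce the MISCAN-Colon model:

      import random

      def simulate_case(screen_times, sens_last=0.51, sens_early=0.19):
          """One cancer case under the best-fitting hypothesis: FOBT sensitivity
          is higher in the stage of clinical diagnosis than in earlier stages."""
          onset = random.uniform(0.0, 10.0)                         # preclinical onset
          dwell = [random.expovariate(1 / 1.7) for _ in range(4)]   # ~6.8 y in total
          clin_stage = random.randrange(4)                          # stage of clinical dx
          t = onset
          for stage in range(clin_stage + 1):
              sens = sens_last if stage == clin_stage else sens_early
              for ts in screen_times:                               # screens in this stage
                  if t <= ts < t + dwell[stage] and random.random() < sens:
                      return True                                   # screen-detected
              t += dwell[stage]
          return False                                              # interval/clinical cancer

      random.seed(1)
      detected = sum(simulate_case([2, 4, 6, 8]) for _ in range(10000))
      print(detected / 10000)  # fraction of cases caught by biennial screening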

  5. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    Science.gov (United States)

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach that combines Bayesian inference with physics-model-based signal processing in which a radionuclide is represented as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not; if not, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
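
    The sequential likelihood ratio test at the heart of such a system follows Wald's classical recipe: accumulate log-likelihood ratios event by event and stop at one of two thresholds. A minimal Python sketch, with simple Poisson-rate hypotheses standing in for the patent's physics-based radionuclide model:

      import math
      import random

      def sprt(counts, lam0, lam1, alpha=0.01, beta=0.01):
          a = math.log(beta / (1 - alpha))    # lower (accept-H0) threshold
          b = math.log((1 - beta) / alpha)    # upper (accept-H1) threshold
          llr = 0.0
          for n, x in enumerate(counts, 1):
              # Log-likelihood ratio of a Poisson count x under rates lam1 vs lam0.
              llr += x * math.log(lam1 / lam0) - (lam1 - lam0)
              if llr <= a:
                  return "background", n
              if llr >= b:
                  return "target radionuclide", n
          return "undecided", len(counts)

      random.seed(0)
      counts = [random.randint(0, 6) for _ in range(200)]  # stand-in event data
      print(sprt(counts, lam0=1.0, lam1=3.0))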

  6. Biorhythms, deciduous enamel thickness, and primary bone growth: a test of the Havers-Halberg Oscillation hypothesis.

    Science.gov (United States)

    Mahoney, Patrick; Miszkiewicz, Justyna J; Pitfield, Rosie; Schlecht, Stephen H; Deter, Chris; Guatelli-Steinberg, Debbie

    2016-06-01

    Across mammalian species, the periodicity with which enamel layers form (Retzius periodicity) in permanent teeth corresponds with average body mass and the pace of life history. According to the Havers-Halberg Oscillation hypothesis (HHO), Retzius periodicity (RP) is a manifestation of a biorhythm that is also expressed in lamellar bone. Potentially, these links provide a basis for investigating aspects of a species' biology from fossilized teeth. Here, we tested intra-specific predictions of this hypothesis on skeletal samples of human juveniles. We measured daily enamel growth increments to calculate RP in deciduous molars (n = 25). Correlations were sought between RP, molar average and relative enamel thickness (AET, RET), and the average amount of primary bone growth (n = 7) in humeri of age-matched juveniles. Results show a previously undescribed relationship between RP and enamel thickness. Reduced major axis regression reveals RP is significantly and positively correlated with AET and RET, and scales isometrically. The direction of the correlation was opposite to HHO predictions as currently understood for human adults. Juveniles with higher RPs and thicker enamel had increased primary bone formation, which suggests a coordinating biorhythm. However, the direction of the correspondence was, again, opposite to predictions. Next, we compared RP from deciduous molars with new data for permanent molars, and with previously published values. The lowermost RP of 4 and 5 days in deciduous enamel extends below the lowermost RP of 6 days in permanent enamel. A lowered range of RP values in deciduous enamel implies that the underlying biorhythm might change with age. Our results develop the intra-specific HHO hypothesis. © 2016 Anatomical Society.
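
    Reduced major axis regression, the estimator used above, has a simple closed form: the slope is the ratio of the sample standard deviations, signed by the correlation (on log-log axes a slope near 1 indicates isometry). A short Python sketch with synthetic stand-in data:

      import numpy as np

      def rma(x, y):
          """Reduced major axis regression of y on x."""
          r = np.corrcoef(x, y)[0, 1]
          slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
          intercept = np.mean(y) - slope * np.mean(x)
          return slope, intercept

      rng = np.random.default_rng(3)
      rp = rng.uniform(4, 8, 25)                 # stand-in Retzius periodicities (days)
      aet = 0.1 * rp + rng.normal(0, 0.05, 25)   # stand-in average enamel thickness
      print(rma(np.log(rp), np.log(aet)))        # log-log slope near 1 under isometry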

  7. Lineup composition, suspect position, and the sequential lineup advantage.

    Science.gov (United States)

    Carlson, Curt A; Gronlund, Scott D; Clark, Steven E

    2008-06-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate in the simultaneous lineup, and no sequential lineup advantage was found. This led the authors to hypothesize that protection from a sequential lineup might emerge only when an innocent suspect stands out from the other lineup members. In Experiment 2, participants viewed a simultaneous or sequential lineup with either the guilty suspect or 1 of 3 innocent suspects. Lineup fairness was varied to influence the degree to which a suspect stood out. A sequential lineup advantage was found only for the unfair lineups. Additional analyses of suspect position in the sequential lineups showed an increase in the diagnosticity of suspect identifications as the suspect was placed later in the sequential lineup. These results suggest that the sequential lineup advantage is dependent on lineup composition and suspect position. (c) 2008 APA, all rights reserved

  8. Associations among Measures of Sequential Processing in Motor and Linguistics Tasks in Adults with and without a Family History of Childhood Apraxia of Speech: A Replication Study

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H.

    2013-01-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically…

  9. Perceived message sensation value and psychological reactance: a test of the dominant thought disruption hypothesis.

    Science.gov (United States)

    Quick, Brian L

    2013-01-01

    The present study tests to see whether perceived message sensation value reduces psychological reactance within the context of anti-marijuana ads for television. After controlling for sensation seeking, biological sex, and marijuana use, the results indicate that message novelty is negatively associated with a freedom threat, whereas dramatic impact and emotional arousal were not associated with the antecedent to reactance. Results support the use of novel messages in future ads while at the same time offer an explanation to the challenges involved in creating effective anti-marijuana ads. Overall, the results provide partial support for the dominant thought disruption hypothesis and are discussed with an emphasis on the theoretical and practical implications for health communication researchers and practitioners.

  10. Testing the relativistic Doppler boost hypothesis for supermassive black hole binary candidates

    Science.gov (United States)

    Charisi, Maria; Haiman, Zoltán; Schiminovich, David; D'Orazio, Daniel J.

    2018-06-01

    Supermassive black hole binaries (SMBHBs) should be common in galactic nuclei as a result of frequent galaxy mergers. Recently, a large sample of sub-parsec SMBHB candidates was identified as bright periodically variable quasars in optical surveys. If the observed periodicity corresponds to the redshifted binary orbital period, the inferred orbital velocities are relativistic (v/c ≈ 0.1). The optical and ultraviolet (UV) luminosities are expected to arise from gas bound to the individual BHs, and would be modulated by the relativistic Doppler effect. The optical and UV light curves should vary in tandem with relative amplitudes which depend on the respective spectral slopes. We constructed a control sample of 42 quasars with aperiodic variability, to test whether this Doppler colour signature can be distinguished from intrinsic chromatic variability. We found that the Doppler signature can arise by chance in ~20 per cent (~37 per cent) of quasars in the nUV (fUV) band. These probabilities reflect the limited quality of the control sample and represent upper limits on how frequently quasars mimic the Doppler brightness+colour variations. We performed separate tests on the periodic quasar candidates, and found that for the majority, the Doppler boost hypothesis requires an unusually steep UV spectrum or an unexpectedly large BH mass and orbital velocity. We conclude that at most approximately one-third of these periodic candidates can harbor Doppler-modulated SMBHBs.
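
    The quantitative backbone of such tests (stated here as an assumption, not quoted from the paper) is the first-order Doppler relation: for a spectrum F_nu proportional to nu^alpha, the apparent flux of gas bound to the orbiting BH is modulated by roughly (3 - alpha)v/c, so the UV-to-optical amplitude ratio is fixed by the two spectral slopes. A tiny Python illustration with made-up slopes:

      C_KMS = 299792.458  # speed of light in km/s

      def boost_amplitude(v_los_kms, alpha):
          """First-order fractional flux modulation for spectral index alpha."""
          return (3.0 - alpha) * v_los_kms / C_KMS

      v = 0.1 * C_KMS                       # v/c ~ 0.1, as inferred for the candidates
      for band, alpha in [("optical", -0.5), ("far-UV", -2.0)]:
          print(band, round(boost_amplitude(v, alpha), 3))
      # The ratio (3 - alpha_UV) / (3 - alpha_opt) is the colour signature that the
      # control sample is used to test against chance chromatic variability.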

  11. Changes of peritoneal transport parameters with time on dialysis: assessment with sequential peritoneal equilibration test.

    Science.gov (United States)

    Waniewski, Jacek; Antosiewicz, Stefan; Baczynski, Daniel; Poleszczuk, Jan; Pietribiasi, Mauro; Lindholm, Bengt; Wankowicz, Zofia

    2017-10-27

    Sequential peritoneal equilibration test (sPET) is based on the consecutive performance of the peritoneal equilibration test (PET, 4-hour, glucose 2.27%) and the mini-PET (1-hour, glucose 3.86%), and the estimation of peritoneal transport parameters with the 2-pore model. It enables the assessment of the functional transport barrier for fluid and small solutes. The objective of this study was to check whether the estimated model parameters can serve as better and earlier indicators of changes in the peritoneal transport characteristics than directly measured transport indices that depend on several transport processes. Seventeen patients were examined using sPET twice, with an interval of about 8 months (230 ± 60 days). There was no difference between the observational parameters measured in the 2 examinations. The indices for solute transport, but not net UF, were well correlated between the examinations. Among the estimated parameters, a significant decrease between the 2 examinations was found only for hydraulic permeability LpS and osmotic conductance for glucose, whereas the other parameters remained unchanged. These fluid transport parameters did not correlate with D/P for creatinine, although the decrease in LpS values between the examinations was observed mostly in patients with low D/P for creatinine. We conclude that changes in fluid transport parameters, hydraulic permeability and osmotic conductance for glucose, as assessed by the pore model, may precede changes in small solute transport. The systematic assessment of fluid transport status needs specific clinical and mathematical tools besides the standard PET tests.

  12. Multi-Stage Recognition of Speech Emotion Using Sequential Forward Feature Selection

    Directory of Open Access Journals (Sweden)

    Liogienė Tatjana

    2016-07-01

    Full Text Available The intensive research of speech emotion recognition introduced a huge collection of speech emotion features. Large feature sets complicate the speech emotion recognition task. Among various feature selection and transformation techniques for one-stage classification, multiple classifier systems were proposed. The main idea of multiple classifiers is to arrange the emotion classification process in stages. Besides parallel and serial cases, the hierarchical arrangement of multi-stage classification is most widely used for speech emotion recognition. In this paper, we present a sequential-forward-feature-selection-based multi-stage classification scheme. The Sequential Forward Selection (SFS) and Sequential Floating Forward Selection (SFFS) techniques were employed for every stage of the multi-stage classification scheme. Experimental testing of the proposed scheme was performed using the German and Lithuanian emotional speech datasets. Sequential-feature-selection-based multi-stage classification outperformed the single-stage scheme by 12–42% for different emotion sets. The multi-stage scheme has shown higher robustness to the growth of the emotion set. The decrease in recognition rate with the increase in emotion set for the multi-stage scheme was lower by 10–20% in comparison with the single-stage case. Differences between SFS and SFFS employment for feature selection were negligible.
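
    Sequential forward selection itself is available in standard libraries. The sketch below runs scikit-learn's SequentialFeatureSelector on a toy dataset; the paper's emotion features, classifiers, and datasets are not reproduced, and scikit-learn implements plain forward selection only (SFFS, which also allows removals, would need another library):

      from sklearn.datasets import load_iris
      from sklearn.feature_selection import SequentialFeatureSelector
      from sklearn.neighbors import KNeighborsClassifier

      X, y = load_iris(return_X_y=True)   # toy stand-in for emotion features
      sfs = SequentialFeatureSelector(
          KNeighborsClassifier(n_neighbors=3),
          n_features_to_select=2,    # greedily grow the feature set to this size
          direction="forward",       # plain SFS
          cv=5,                      # cross-validated scoring at each step
      )
      sfs.fit(X, y)
      print(sfs.get_support())       # boolean mask of the selected features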

  13. TESTING THE HYPOTHESIS THAT METHANOL MASER RINGS TRACE CIRCUMSTELLAR DISKS: HIGH-RESOLUTION NEAR-INFRARED AND MID-INFRARED IMAGING

    International Nuclear Information System (INIS)

    De Buizer, James M.; Bartkiewicz, Anna; Szymczak, Marian

    2012-01-01

    Milliarcsecond very long baseline interferometry maps of regions containing 6.7 GHz methanol maser emission have led to the recent discovery of ring-like distributions of maser spots and the plausible hypothesis that they may be tracing circumstellar disks around forming high-mass stars. We aimed to test this hypothesis by imaging these regions in the near- and mid-infrared at high spatial resolution and comparing the observed emission to the expected infrared morphologies as inferred from the geometries of the maser rings. In the near-infrared we used the Gemini North adaptive optics system of ALTAIR/NIRI, while in the mid-infrared we used the combination of the Gemini South instrument T-ReCS and super-resolution techniques. Resultant images had a resolution of ∼150 mas in both the near-infrared and mid-infrared. We discuss the expected distribution of circumstellar material around young and massive accreting (proto)stars and what infrared emission geometries would be expected for the different maser ring orientations under the assumption that the masers are coming from within circumstellar disks. Based upon the observed infrared emission geometries for the four targets in our sample and the results of spectral energy distribution modeling of the massive young stellar objects associated with the maser rings, we do not find compelling evidence in support of the hypothesis that methanol maser rings reside in circumstellar disks.

  14. Testing the lexical hypothesis: are socially important traits more densely reflected in the English lexicon?

    Science.gov (United States)

    Wood, Dustin

    2015-02-01

    Using a set of 498 English words identified by Saucier (1997) as common person-descriptor adjectives or trait terms, I tested 3 instantiations of the lexical hypothesis, which posit that more socially important person descriptors show greater density in the lexicon. Specifically, I explored whether trait terms that have greater relational impact (i.e., more greatly influence how others respond to a person) have more synonyms, are more frequently used, and are more strongly correlated with other trait terms. I found little evidence to suggest that trait terms rated as having greater relational impact were more frequently used or had more synonyms. However, these terms correlated more strongly with other trait terms in the set. Conversely, a trait term's loadings on structural factors (e.g., the Big Five, HEXACO) were extremely good predictors of the term's relational impact. The findings suggest that the lexical hypothesis may not be strongly supported in some ways it is commonly understood but is supported in the manner most important to investigations of trait structure. Specifically, trait terms with greater relational impact tend to more strongly correlate with other terms in lexical sets and thus have a greater role in driving the location of factors in analyses of trait structure. Implications for understanding the meaning of lexical factors such as the Big Five are discussed. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  15. Metabolically based liver damage pathophysiology in patients with urea cycle disorders - A new hypothesis.

    Science.gov (United States)

    Ivanovski, Ivan; Ješić, Miloš; Ivanovski, Ana; Garavelli, Livia; Ivanovski, Petar

    2017-11-28

    The underlying pathophysiology of liver dysfunction in urea cycle disorders (UCDs) is still largely elusive. There is some evidence that accumulating urea cycle (UC) intermediates are toxic to hepatocyte mitochondria. It is possible that liver injury is directly caused by the toxicity of ammonia. The rarity of UCDs, the lack of checking of iron levels in these patients, superficial knowledge of the UC, and an underestimation of the metabolic role of fumaric acid are the main reasons for the incomprehension of the mechanism of liver injury in patients suffering from UCDs. Owing to our routine clinical practice of screening for iron overload in severely ill neonates, with a focus on newborns suffering from acute liver failure, we report a case of citrullinemia with neonatal liver failure and high blood parameters of iron overload. We hypothesize that the key is the decreased or deficient fumaric acid production in the course of the UC in UCDs, which causes several sequentially intertwined metabolic disturbances with the final result of liver iron overload. The presented hypothesis could be easily tested by examining patients suffering from UCDs for liver iron overload. This could be easily performed in countries with a large population and a comprehensive national register for inborn errors of metabolism. Provided the hypothesis is correct, neonatal liver damage in patients having a UCD can be prevented by the supplementation of pregnant women with fumaric or succinic acid, prepared in the form of iron supplementation pills. After birth, liver damage in patients having UCDs can be prevented by supplementation of these patients with zinc fumarate or zinc succinylate, as well.

  16. Tradable permit allocations and sequential choice

    Energy Technology Data Exchange (ETDEWEB)

    MacKenzie, Ian A. [Centre for Economic Research, ETH Zuerich, Zurichbergstrasse 18, 8092 Zuerich (Switzerland)

    2011-01-15

    This paper investigates initial allocation choices in an international tradable pollution permit market. For two sovereign governments, we compare allocation choices that are either simultaneously or sequentially announced. We show sequential allocation announcements result in higher (lower) aggregate emissions when announcements are strategic substitutes (complements). Whether allocation announcements are strategic substitutes or complements depends on the relationship between the follower's damage function and governments' abatement costs. When the marginal damage function is relatively steep (flat), allocation announcements are strategic substitutes (complements). For quadratic abatement costs and damages, sequential announcements provide a higher level of aggregate emissions. (author)
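
    The simultaneous-versus-sequential comparison can be mimicked numerically. The Python sketch below solves a stylized two-government game with quadratic abatement costs and damages, by best-response iteration for simultaneous announcements and leader anticipation for sequential ones; the payoff specification is a hypothetical stand-in for the paper's model, chosen only so that announcements are strategic substitutes:

      import numpy as np

      GRID = np.linspace(0, 10, 401)   # feasible permit allocations

      def payoff(ai, aj, gamma=1.0, delta=0.3):
          # Quadratic abatement cost (cheaper with more permits) plus quadratic damage.
          return -gamma * (10 - ai) ** 2 - delta * (ai + aj) ** 2

      def best_response(aj):
          return GRID[np.argmax([payoff(a, aj) for a in GRID])]

      def simultaneous():
          a1 = a2 = 5.0
          for _ in range(50):          # iterate best responses to a Nash equilibrium
              a1, a2 = best_response(a2), best_response(a1)
          return a1 + a2

      def sequential():
          # The leader anticipates the follower's best response to each announcement.
          a1 = GRID[np.argmax([payoff(a, best_response(a)) for a in GRID])]
          return a1 + best_response(a1)

      print("simultaneous:", simultaneous(), "sequential:", sequential())
      # With this steep quadratic damage, announcements are strategic substitutes
      # and the sequential total exceeds the simultaneous one, as the abstract reports.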

  17. A test of the permanent income hypothesis on Czech voucher privatization

    Czech Academy of Sciences Publication Activity Database

    Hanousek, Jan; Tůma, Z.

    2002-01-01

    Roč. 10, č. 2 (2002), s. 235-254 ISSN 0967-0750 Institutional research plan: CEZ:AV0Z7085904 Keywords: Barro-Ricardian equivalence * permanent income hypothesis Subject RIV: AH - Economics Impact factor: 0.897, year: 2002 http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=6844845&site=ehost-live

  18. A sequential adaptation technique and its application to the Mark 12 IFF system

    Science.gov (United States)

    Bailey, John S.; Mallett, John D.; Sheppard, Duane J.; Warner, F. Neal; Adams, Robert

    1986-07-01

    Sequential adaptation uses only two sets of receivers, correlators, and A/D converters which are time multiplexed to effect spatial adaptation in a system with (N) adaptive degrees of freedom. This technique can substantially reduce the hardware cost over what is realizable in a parallel architecture. A three channel L-band version of the sequential adapter was built and tested for use with the MARK XII IFF (identify friend or foe) system. In this system the sequentially determined adaptive weights were obtained digitally but implemented at RF. As a result, many of the post RF hardware induced sources of error that normally limit cancellation, such as receiver mismatch, are removed by the feedback property. The result is a system that can yield high levels of cancellation and be readily retrofitted to currently fielded equipment.

  19. Selective condensation drives partitioning and sequential secretion of cyst wall proteins in differentiating Giardia lamblia.

    Directory of Open Access Journals (Sweden)

    Christian Konrad

    2010-04-01

    Full Text Available Controlled secretion of a protective extracellular matrix is required for transmission of the infective stage of a large number of protozoan and metazoan parasites. Differentiating trophozoites of the highly minimized protozoan parasite Giardia lamblia secrete the proteinaceous portion of the cyst wall material (CWM), consisting of three paralogous cyst wall proteins (CWP1-3), via organelles termed encystation-specific vesicles (ESVs). Phylogenetic and molecular data indicate that Diplomonads have lost a classical Golgi during reductive evolution. However, neogenesis of ESVs in encysting Giardia trophozoites transiently provides basic Golgi functions by accumulating presorted CWM exported from the ER for maturation. Based on this "minimal Golgi" hypothesis we predicted maturation of ESVs to a trans Golgi-like stage, which would manifest as a sorting event before regulated secretion of the CWM. Here we show that proteolytic processing of pro-CWP2 in maturing ESVs coincides with partitioning of CWM into two fractions, which are sorted and secreted sequentially with different kinetics. This novel sorting function leads to rapid assembly of a structurally defined outer cyst wall, followed by slow secretion of the remaining components. Using live cell microscopy we find direct evidence for condensed core formation in maturing ESVs. Core formation suggests that a mechanism controlled by phase transitions of the CWM from fluid to condensed and back likely drives CWM partitioning and makes sorting and sequential secretion possible. Blocking of CWP2 processing by a protease inhibitor leads to mis-sorting of a CWP2 reporter. Nevertheless, partitioning and sequential secretion of two portions of the CWM are unaffected in these cells. Although these cysts have a normal appearance they are not water resistant and therefore not infective. Our findings suggest that sequential assembly is a basic architectural principle of protective wall formation and requires…

  20. Classical and sequential limit analysis revisited

    Science.gov (United States)

    Leblond, Jean-Baptiste; Kondo, Djimédo; Morin, Léo; Remmal, Almahdi

    2018-04-01

    Classical limit analysis applies to ideal plastic materials, and within a linearized geometrical framework implying small displacements and strains. Sequential limit analysis was proposed as a heuristic extension to materials exhibiting strain hardening, and within a fully general geometrical framework involving large displacements and strains. The purpose of this paper is to study and clearly state the precise conditions permitting such an extension. This is done by comparing the evolution equations of the full elastic-plastic problem, the equations of classical limit analysis, and those of sequential limit analysis. The main conclusion is that, whereas classical limit analysis applies to materials exhibiting elasticity - in the absence of hardening and within a linearized geometrical framework -, sequential limit analysis, to be applicable, strictly prohibits the presence of elasticity - although it tolerates strain hardening and large displacements and strains. For a given mechanical situation, the relevance of sequential limit analysis therefore essentially depends upon the importance of the elastic-plastic coupling in the specific case considered.

  1. Simultaneous versus sequential penetrating keratoplasty and cataract surgery.

    Science.gov (United States)

    Hayashi, Ken; Hayashi, Hideyuki

    2006-10-01

    To compare the surgical outcomes of simultaneous penetrating keratoplasty and cataract surgery with those of sequential surgery. Thirty-nine eyes of 39 patients scheduled for simultaneous keratoplasty and cataract surgery and 23 eyes of 23 patients scheduled for sequential keratoplasty and secondary phacoemulsification surgery were recruited. Refractive error, regular and irregular corneal astigmatism determined by Fourier analysis, and endothelial cell loss were studied at 1 week and 3, 6, and 12 months after combined surgery in the simultaneous surgery group or after subsequent phacoemulsification surgery in the sequential surgery group. At 3 and more months after surgery, mean refractive error was significantly greater in the simultaneous surgery group than in the sequential surgery group, although no difference was seen at 1 week. The refractive error at 12 months was within 2 D of that targeted in 15 eyes (39%) in the simultaneous surgery group and within 2 D in 16 eyes (70%) in the sequential surgery group; the incidence was significantly greater in the sequential group (P = 0.0344). The regular and irregular astigmatism was not significantly different between the groups at 3 and more months after surgery. No significant difference was also found in the percentage of endothelial cell loss between the groups. Although corneal astigmatism and endothelial cell loss were not different, refractive error from target refraction was greater after simultaneous keratoplasty and cataract surgery than after sequential surgery, indicating a better outcome after sequential surgery than after simultaneous surgery.

  2. Herbage intake of dairy cows in mixed sequential grazing with breeding ewes as followers.

    Science.gov (United States)

    Jiménez-Rosales, Juan Daniel; Améndola-Massiotti, Ricardo Daniel; Burgueño-Ferreira, Juan Andrés; Ramírez-Valverde, Rodolfo; Topete-Pelayo, Pedro; Huerta-Bravo, Maximino

    2018-03-01

    This study aimed to evaluate the hypothesis that mixed sequential grazing of dairy cows and breeding ewes is beneficial. During the seasons of spring-summer 2013 and autumn-winter 2013-2014, 12 (spring-summer) and 16 (autumn-winter) Holstein Friesian cows and 24 gestating (spring-summer) and lactating (autumn-winter) Pelibuey ewes grazed on six (spring-summer) and nine (autumn-winter) paddocks of alfalfa and orchard grass mixed pastures. The treatments "single species cow grazing" (CowG) and "mixed sequential grazing with ewes as followers of cows" (MixG) were evaluated, under a completely randomized design with two replicates per paddock. Herbage mass on offer (HO) and residual herbage mass (RH) were estimated by cutting samples. The estimate of herbage intake (HI) of cows was based on the use of internal and external markers; the apparent HI of ewes was calculated as the difference between HO (RH of cows) and RH. Even though HO was higher in CowG, the HI of cows was higher in MixG during spring-summer and similar in both treatments during autumn-winter, implying that in MixG the effects on the cows' HI of a higher alfalfa proportion and herbage accumulation rate, evolving from lower residual herbage mass in the previous cycle, counteracted that of a higher HO in CowG. The HI of ewes was sufficient to enable satisfactory performance as breeding ewes. Thus, the benefits of mixed sequential grazing arose from higher herbage accumulation, positive changes in botanical composition, and the achievement of sheep production without negative effects on the herbage intake of cows.

  3. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

    Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets, related by known biological function or as designated solely by the end-user, against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
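
    The matrix idea, in which every gene set is scored against the full distribution of changes in every dataset, can be sketched with standard tools. A Mann-Whitney U test stands in here for GSMA's actual scoring, and the expression changes and set memberships are random placeholders:

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(7)
      n_genes = 5000
      datasets = {f"expt{k}": rng.normal(0, 1, n_genes) for k in range(3)}
      gene_sets = {"setA": rng.choice(n_genes, 50, replace=False),
                   "setB": rng.choice(n_genes, 80, replace=False)}

      for set_name, idx in gene_sets.items():
          for ds_name, changes in datasets.items():
              inside = changes[idx]                  # changes for genes in the set
              outside = np.delete(changes, idx)      # the rest of the distribution
              p = mannwhitneyu(inside, outside, alternative="greater").pvalue
              print(f"{set_name} vs {ds_name}: up-regulation p = {p:.3f}")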

  4. TESTING THE EFFICIENT MARKET HYPOTHESIS ON THE ROMANIAN CAPITAL MARKET

    OpenAIRE

    Daniel Stefan ARMEANU; Sorin-Iulian CIOACA

    2014-01-01

    The Efficient Market Hypothesis (EMH) is one of the leading financial concepts that dominated the economic research over the last 50 years, being one of the pillars of the modern economic science. This theory, developed by Eugene Fama in the `70s, was a landmark in the development of theoretical concepts and models trying to explain the price evolution of financial assets (considering the common assumptions of the main developed theories) and also for the development of some branches in the f...
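
    Weak-form tests of the EMH in such studies often rest on variance ratios: under a random walk, the variance of q-period returns is q times the one-period variance, so VR(q) should be near 1. A Lo-MacKinlay-style Python sketch on simulated returns (the estimator, not the data, is the point, and small-sample bias corrections are omitted):

      import numpy as np

      def variance_ratio(returns, q):
          r = np.asarray(returns) - np.mean(returns)
          var1 = np.var(r, ddof=1)
          rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-period sums
          return np.var(rq, ddof=1) / (q * var1)

      rng = np.random.default_rng(11)
      rw = rng.normal(0, 0.01, 2000)   # i.i.d. returns, i.e. a random-walk price
      print([round(variance_ratio(rw, q), 3) for q in (2, 4, 8)])  # all near 1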

  5. Sexual selection on land snail shell ornamentation: a hypothesis that may explain shell diversity

    Directory of Open Access Journals (Sweden)

    Schilthuizen Menno

    2003-06-01

    Full Text Available Background: Many groups of land snails show great interspecific diversity in shell ornamentation, which may include spines on the shell and flanges on the aperture. Such structures have been explained as camouflage or defence, but the possibility that they might be under sexual selection has not previously been explored. Presentation of the hypothesis: The hypothesis that is presented consists of two parts. First, that shell ornamentation is the result of sexual selection. Second, that such sexual selection has caused the divergence in shell shape in different species. Testing the hypothesis: The first part of the hypothesis may be tested by searching for sexual dimorphism in shell ornamentation in gonochoristic snails, by searching for increased variance in shell ornamentation relative to other shell traits, and by mate choice experiments using individuals with experimentally enhanced ornamentation. The second part of the hypothesis may be tested by comparing sister groups and correlating shell diversity with degree of polygamy. Implications of the hypothesis: If the hypothesis were true, it would provide an explanation for the many cases of allopatric evolutionary radiation in snails, where shell diversity cannot be related to any niche differentiation or environmental differences.

  6. Effects of sequential and discrete rapid naming on reading in Japanese children with reading difficulty.

    Science.gov (United States)

    Wakamiya, Eiji; Okumura, Tomohito; Nakanishi, Makoto; Takeshita, Takashi; Mizuta, Mekumi; Kurimoto, Naoko; Tamai, Hiroshi

    2011-06-01

    To clarify whether rapid naming ability itself is a main underpinning factor of rapid automatized naming tests (RAN) and how deep an influence the discrete decoding process has on reading, we performed discrete naming tasks and discrete hiragana reading tasks as well as sequential naming tasks and sequential hiragana reading tasks with 38 Japanese schoolchildren with reading difficulty. There were high correlations between both discrete and sequential hiragana reading and sentence reading, suggesting that some mechanism which automatizes hiragana reading makes sentence reading fluent. In object and color tasks, there were moderate correlations between sentence reading and sequential naming, and between sequential naming and discrete naming. But no correlation was found between reading tasks and discrete naming tasks. The influence of rapid naming ability of objects and colors upon reading seemed relatively small, and multi-item processing may work in relation to these. In contrast, in the digit naming task there was moderate correlation between sentence reading and discrete naming, while no correlation was seen between sequential naming and discrete naming. There was moderate correlation between reading tasks and sequential digit naming tasks. Digit rapid naming ability has more direct effect on reading while its effect on RAN is relatively limited. The ratio of how rapid naming ability influences RAN and reading seems to vary according to kind of the stimuli used. An assumption about components in RAN which influence reading is discussed in the context of both sequential processing and discrete naming speed. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  7. Fibrin-specific and effective clot lysis requires both plasminogen activators and for them to be in a sequential rather than simultaneous combination.

    Science.gov (United States)

    Pannell, R; Li, S; Gurewich, V

    2017-08-01

    Thrombolysis with tissue plasminogen activator (tPA) has been a disappointment and has now been replaced by an endovascular procedure whenever possible. Nevertheless, thrombolysis remains the only means by which circulation in a thrombosed artery can be restored rapidly. In contrast to tPA monotherapy, endogenous fibrinolysis uses both tPA and urokinase plasminogen activator (uPA), whose native form is a proenzyme, prouPA. This combination is remarkably effective as evidenced by the fibrin degradation product, D-dimer, which is invariably present in plasma. The two activators have complementary mechanisms of plasminogen activation and are synergistic in combination. Since tPA initiates fibrinolysis when released from the vessel wall and prouPA is in the blood, they induce fibrinolysis sequentially. It was postulated that this may be more effective and fibrin-specific. The hypothesis was tested in a model of clot lysis in plasma in which a clot was first exposed to tPA for 5 min, washed and incubated with prouPA. Lysis was compared with that of clots incubated with both activators simultaneously. The sequential combination was almost twice as effective and caused less fibrinogenolysis than the simultaneous combination (p < 0.0001) despite having significantly less tPA, as a result of the wash. A mechanism is described by which this phenomenon can be explained. The findings are believed to have significant therapeutic implications.

  8. The Effects of Social Anxiety and State Anxiety on Visual Attention: Testing the Vigilance-Avoidance Hypothesis.

    Science.gov (United States)

    Singh, J Suzanne; Capozzoli, Michelle C; Dodd, Michael D; Hope, Debra A

    2015-01-01

    A growing theoretical and research literature suggests that trait and state social anxiety can predict attentional patterns in the presence of emotional stimuli. The current study adds to this literature by examining the effects of state anxiety on visual attention and testing the vigilance-avoidance hypothesis, using a method of continuous visual attentional assessment. Participants were 91 undergraduate college students with high or low trait fear of negative evaluation (FNE), a core aspect of social anxiety, who were randomly assigned to either a high or low state anxiety condition. Participants engaged in a free view task in which pairs of emotional facial stimuli were presented and eye movements were continuously monitored. Overall, participants with high FNE avoided angry stimuli and participants with high state anxiety attended to positive stimuli. Participants with high state anxiety and high FNE were avoidant of angry faces, whereas participants with low state and low FNE exhibited a bias toward angry faces. The study provided partial support for the vigilance-avoidance hypothesis. The findings add to the mixed results in the literature that suggest that both positive and negative emotional stimuli may be important in understanding the complex attention patterns associated with social anxiety. Clinical implications and suggestions for future research are discussed.

  9. Sequential bilateral cochlear implantation improves working performance, quality of life, and quality of hearing.

    Science.gov (United States)

    Härkönen, Kati; Kivekäs, Ilkka; Rautiainen, Markus; Kotti, Voitto; Sivonen, Ville; Vasama, Juha-Pekka

    2015-05-01

    This prospective study shows that working performance, quality of life (QoL), and quality of hearing (QoH) are better with two cochlear implants (CIs) than with a single implant. The impact of the second CI on the patient's QoL is as significant as the impact of the first CI. To evaluate the benefits of sequential bilateral cochlear implantation in working, QoL, and QoH, we studied working performance, work-related stress, QoL, and QoH with specific questionnaires in 15 patients with unilateral CI scheduled for sequential CI of the other ear. Sound localization performance and speech perception in noise were measured with specific tests. All questionnaires and tests were performed before the second CI surgery and 6 and 12 months after its activation. Bilateral CIs increased patients' working performance, and their work-related stress and fatigue decreased. Communication with co-workers was easier and patients were more active in their working environment. Sequential bilateral cochlear implantation improved QoL, QoH, sound localization, and speech perception in noise statistically significantly.

  10. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…
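
    The required information size mentioned above is computed much like a sample size for a single two-arm trial and then inflated for between-trial heterogeneity. The Python sketch below uses one common heuristic (dividing by 1 - I^2; dedicated trial sequential analysis software uses the diversity statistic D^2 instead) with illustrative inputs:

      from scipy.stats import norm

      def required_information_size(delta, sd, alpha=0.05, power=0.80, i_squared=0.25):
          """Total patients needed to detect a mean difference delta (SD = sd)."""
          z_a = norm.ppf(1 - alpha / 2)
          z_b = norm.ppf(power)
          n = 4 * (z_a + z_b) ** 2 * sd ** 2 / delta ** 2  # fixed-effect total, 1:1 arms
          return n / (1 - i_squared)                       # heterogeneity inflation

      print(round(required_information_size(delta=0.5, sd=1.0)))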

  11. Accurately controlled sequential self-folding structures by polystyrene film

    Science.gov (United States)

    Deng, Dongping; Yang, Yang; Chen, Yong; Lan, Xing; Tice, Jesse

    2017-08-01

    Four-dimensional (4D) printing overcomes traditional fabrication limitations by designing heterogeneous materials that enable printed structures to evolve over time (the fourth dimension) under external stimuli. Here, we present a simple 4D printing approach for self-folding structures that can be folded sequentially and accurately. When heated above their glass transition temperature, pre-strained polystyrene films shrink in the XY plane. In our process, silver ink traces printed on the film provide the heat stimulus by conducting current, which triggers the self-folding behavior. The parameters affecting the folding process are studied and discussed. Sequential folding and accurately controlled folding angles are achieved using printed ink traces and an angle-lock design. Theoretical analyses guide the design of the folding processes. Programmable structures such as a lock and a three-dimensional antenna were fabricated to test the feasibility and potential applications of this method. These self-folding structures change their shapes after fabrication under a controlled stimulus (electric current) and have potential applications in electronics, consumer devices, and robotics. Our design and fabrication method provides an easy way to 4D print self-folding structures, using silver ink printed on polystyrene films, for electrically induced sequential folding with angular control.

  12. Sequentially pulsed traveling wave accelerator

    Science.gov (United States)

    Caporaso, George J [Livermore, CA; Nelson, Scott D [Patterson, CA; Poole, Brian R [Tracy, CA

    2009-08-18

    A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.

  13. Experimental test of the PCAC-hypothesis in charged current neutrino and antineutrino interactions on protons

    Science.gov (United States)

    Jones, G. T.; Jones, R. W. L.; Kennedy, B. W.; O'Neale, S. W.; Klein, H.; Morrison, D. R. O.; Schmid, P.; Wachsmuth, H.; Miller, D. B.; Mobayyen, M. M.; Wainstein, S.; Aderholz, M.; Hoffmann, E.; Katz, U. F.; Kern, J.; Schmitz, N.; Wittek, W.; Allport, P.; Myatt, G.; Radojicic, D.; Bullock, F. W.; Burke, S.

    1987-03-01

    Data obtained with the bubble chamber BEBC at CERN are used for the first significant test of Adler's prediction for the neutrino- and antineutrino-proton scattering cross sections at vanishing four-momentum transfer squared Q². An Extended Vector Meson Dominance Model (EVDM) is applied to extrapolate Adler's prediction to experimentally accessible values of Q². The data show good agreement with Adler's prediction for Q² → 0, thus confirming the PCAC hypothesis in the kinematical region of high leptonic energy transfer ν > 2 GeV. The good agreement of the data with the theoretical predictions also at higher Q², where the EVDM terms are dominant, also supports this model. However, an EVDM calculation without PCAC is clearly ruled out by the data.

  14. Experimental test of the PCAC-hypothesis in charged current neutrino and antineutrino interactions on protons

    International Nuclear Information System (INIS)

    Jones, G.T.; Jones, R.W.L.; Kennedy, B.W.; O'Neale, S.W.; Klein, H.; Morrison, D.R.O.; Schmid, P.; Wachsmuth, H.; Allport, P.; Myatt, G.; Radojicic, D.; Bullock, F.W.; Burke, S.

    1987-01-01

    Data obtained with the bubble chamber BEBC at CERN are used for the first significant test of Adler's prediction for the neutrino- and antineutrino-proton scattering cross sections at vanishing four-momentum transfer squared Q². An Extended Vector Meson Dominance Model (EVDM) is applied to extrapolate Adler's prediction to experimentally accessible values of Q². The data show good agreement with Adler's prediction for Q² → 0, thus confirming the PCAC hypothesis in the kinematical region of high leptonic energy transfer ν > 2 GeV. The good agreement of the data with the theoretical predictions also at higher Q², where the EVDM terms are dominant, also supports this model. However, an EVDM calculation without PCAC is clearly ruled out by the data. (orig.)

  15. Helminth community structure and diet of three Afrotropical anuran species: a test of the interactive-versus-isolationist parasite communities hypothesis

    Directory of Open Access Journals (Sweden)

    G. C. Akani

    2011-09-01

    Full Text Available The interactive-versus-isolationist hypothesis predicts that, in amphibians, parasite communities should be depauperate and weakly structured by interspecific competition. A parasitological survey was carried out to test this hypothesis using three anuran species from Nigeria, tropical Africa (one Bufonidae; two Ranidae). High values of parasite infection parameters were found in all three species, which were infected by nematodes, cestodes and trematodes. Nonetheless, the parasite communities of the three anurans were very depauperate in terms of number of species (4 to 6). Interspecific competition was irrelevant in all species, as revealed by null models and Monte Carlo permutations. Cluster analyses revealed that, in terms of parasite community composition, the two Ranidae were similar, whereas the Bufonidae was more distinct. However, when prevalence, intensity, and abundance of parasites were combined into a multivariate analysis, each anuran species was clearly spaced apart from the others, revealing considerable species-specific differences in their parasite communities. All anurans were generalists and probably opportunistic in their dietary habits, and showed no evidence of interspecific competition for food. Overall, our data are largely consistent with expectations derived from the interactive-versus-isolationist parasite communities hypothesis.

  16. Testing in mice the hypothesis that melanin is protective in malaria infections.

    Directory of Open Access Journals (Sweden)

    Michael Waisberg

    Full Text Available Malaria has had the largest impact of any infectious disease on shaping the human genome, exerting enormous selective pressure on genes that improve survival in severe malaria infections. Modern humans originated in Africa and lost skin melanization as they migrated to temperate regions of the globe. Although it is well documented that loss of melanization improved cutaneous vitamin D synthesis, melanin plays an evolutionarily ancient role in insect immunity to malaria, and in some instances melanin has been implicated in an immunoregulatory role in vertebrates. Thus, we tested the hypothesis that melanization may be protective in malaria infections using mouse models. Congenic C57BL/6 mice that differed only in the gene encoding tyrosinase, a key enzyme in the synthesis of melanin, showed no difference in the clinical course of infection by Plasmodium yoelii 17XL, which causes severe anemia; Plasmodium berghei ANKA, which causes severe cerebral malaria; or Plasmodium chabaudi AS, which causes uncomplicated chronic disease. Moreover, neither genetic deficiencies in vitamin D synthesis nor vitamin D supplementation had an effect on survival in cerebral malaria. Taken together, these results indicate that neither melanin nor vitamin D production improves survival in severe malaria.

  17. Computerized Classification Testing with the Rasch Model

    Science.gov (United States)

    Eggen, Theo J. H. M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
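
    The SPRT referenced in this record admits a very compact implementation. A minimal sketch for Bernoulli-scored items, assuming hypothesized success probabilities p0 (below the classification cutoff) and p1 (above it); Wald's boundaries log(beta/(1-alpha)) and log((1-beta)/alpha) are standard, but the function and parameter names here are illustrative:

    ```python
    import math

    def sprt(responses, p0, p1, alpha=0.05, beta=0.05):
        """Wald's Sequential Probability Ratio Test for Bernoulli data:
        H0: success probability p0 vs H1: success probability p1 (p1 > p0).
        Returns 'accept H0', 'accept H1', or 'continue testing'."""
        lower = math.log(beta / (1 - alpha))   # accept-H0 log-boundary
        upper = math.log((1 - beta) / alpha)   # accept-H1 log-boundary
        llr = 0.0
        for x in responses:                    # x is 1 (correct) or 0 (incorrect)
            llr += x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
            if llr <= lower:
                return "accept H0"
            if llr >= upper:
                return "accept H1"
        return "continue testing"

    print(sprt([1, 1, 0, 1, 1, 1, 1, 1], p0=0.5, p1=0.7))  # 'continue testing'
    ```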

  18. Brain morphology of the threespine stickleback (Gasterosteus aculeatus) varies inconsistently with respect to habitat complexity: A test of the Clever Foraging Hypothesis.

    Science.gov (United States)

    Ahmed, Newaz I; Thompson, Cole; Bolnick, Daniel I; Stuart, Yoel E

    2017-05-01

    The Clever Foraging Hypothesis asserts that organisms living in a more spatially complex environment will have a greater neurological capacity for cognitive processes related to spatial memory, navigation, and foraging. Because the telencephalon is often associated with spatial memory and navigation tasks, this hypothesis predicts a positive association between telencephalon size and environmental complexity. The association between habitat complexity and brain size has been supported by comparative studies across multiple species but has not been widely studied at the within-species level. We tested for covariation between environmental complexity and neuroanatomy of threespine stickleback (Gasterosteus aculeatus) collected from 15 pairs of lakes and their parapatric streams on Vancouver Island. In most pairs, neuroanatomy differed between the adjoining lake and stream populations. However, the magnitude and direction of this difference were inconsistent between watersheds and did not covary strongly with measures of within-site environmental heterogeneity. Overall, we find weak support for the Clever Foraging Hypothesis in our study.

  19. An Efficient System Based On Closed Sequential Patterns for Web Recommendations

    OpenAIRE

    Utpala Niranjan; R.B.V. Subramanyam; V-Khana

    2010-01-01

    Sequential pattern mining has received considerable attention among researchers since its introduction, with broad applications. Sequential pattern algorithms generally face problems when mining long sequential patterns or when using a very low support threshold. One possible solution to such problems is to mine closed sequential patterns, a condensed representation of sequential patterns. Recently, several researchers have utilized sequential pattern discovery for d...

  20. The Cognitive Mediation Hypothesis Revisited: An Empirical Response to Methodological and Theoretical Criticism.

    Science.gov (United States)

    Romero, Anna A.; And Others

    1996-01-01

    In order to address criticisms raised against the cognitive mediation hypothesis, three experiments were conducted to develop a more direct test of the hypothesis. Taken together, the three experiments provide converging support for the cognitive mediation hypothesis, reconfirming the central role of cognition in the persuasion process.…

  1. Tracing the footsteps of Sherlock Holmes: cognitive representations of hypothesis testing.

    Science.gov (United States)

    Van Wallendael, L R; Hastie, R

    1990-05-01

    A well-documented phenomenon in opinion-revision literature is subjects' failure to revise probability estimates for an exhaustive set of mutually exclusive hypotheses in a complementary manner. However, prior research has not addressed the question of whether such behavior simply represents a misunderstanding of mathematical rules, or whether it is a consequence of a cognitive representation of hypotheses that is at odds with the Bayesian notion of a set relationship. Two alternatives to the Bayesian representation, a belief system (Shafer, 1976) and a system of independent hypotheses, were proposed, and three experiments were conducted to examine cognitive representations of hypothesis sets in the testing of multiple competing hypotheses. Subjects were given brief murder mysteries to solve and allowed to request various types of information about the suspects; after having received each new piece of information, subjects rated each suspect's probability of being the murderer. Presence and timing of suspect eliminations were varied in the first two experiments; the final experiment involved the varying of percentages of clues that referred to more than one suspect (for example, all of the female suspects). The noncomplementarity of opinion revisions remained a strong phenomenon in all conditions. Information-search data refuted the idea that subjects represented hypotheses as a Bayesian set; further study of the independent hypotheses theory and Shaferian belief functions as descriptive models is encouraged.
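
    For contrast with the noncomplementary revisions described above, normative Bayesian updating over an exhaustive set of mutually exclusive suspects always leaves the probabilities summing to one. A small illustrative sketch (the priors and clue likelihoods are invented):

    ```python
    def bayes_update(priors, likelihoods):
        """Bayesian revision over mutually exclusive, exhaustive hypotheses:
        posteriors stay complementary (sum to 1), the property the
        subjects' opinion revisions violated."""
        joint = [p * l for p, l in zip(priors, likelihoods)]
        total = sum(joint)
        return [j / total for j in joint]

    posterior = bayes_update([0.25, 0.25, 0.25, 0.25], [0.9, 0.2, 0.2, 0.2])
    assert abs(sum(posterior) - 1.0) < 1e-12
    print(posterior)  # first suspect's probability rises; the rest fall
    ```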

  2. A meta-analysis of response-time tests of the sequential two-systems model of moral judgment.

    Science.gov (United States)

    Baron, Jonathan; Gürçay, Burcu

    2017-05-01

    The (generalized) sequential two-system ("default interventionist") model of utilitarian moral judgment predicts that utilitarian responses often arise from a system-two correction of system-one deontological intuitions. Response-time (RT) results that seem to support this model are usually explained by the fact that low-probability responses have longer RTs. Following earlier results, we predicted response probability from each subject's tendency to make utilitarian responses (A, "Ability") and each dilemma's tendency to elicit deontological responses (D, "Difficulty"), estimated from a Rasch model. At the point where A = D, the two responses are equally likely, so probability effects cannot account for any RT differences between them. The sequential two-system model still predicts that many of the utilitarian responses made at this point will result from system-two corrections of system-one intuitions, hence should take longer. However, when A = D, RT for the two responses was the same, contradicting the sequential model. Here we report a meta-analysis of 26 data sets, which replicated the earlier results of no RT difference overall at the point where A = D. The data sets used three different kinds of moral judgment items, and the RT equality at the point where A = D held for all three. In addition, we found that RT increased with A-D. This result holds for subjects (characterized by Ability) but not for items (characterized by Difficulty). We explain the main features of this unanticipated effect, and of the main results, with a drift-diffusion model.
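
    In the Rasch parameterization used above, the probability of a utilitarian response depends only on the difference A - D, so A = D is precisely the point at which the two responses are equally likely. A one-function sketch, assuming the standard logistic link (the authors' estimation details are not reproduced here):

    ```python
    import math

    def p_utilitarian(ability, difficulty):
        """Rasch-style response probability: logistic of (A - D).
        At A == D both responses are equally likely (p = 0.5), the
        point at which the RT comparison in the abstract is made."""
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    assert abs(p_utilitarian(1.2, 1.2) - 0.5) < 1e-12
    ```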

  3. Sequential decoding of intramuscular EMG signals via estimation of a Markov model.

    Science.gov (United States)

    Monsifrot, Jonathan; Le Carpentier, Eric; Aoustin, Yannick; Farina, Dario

    2014-09-01

    This paper addresses the sequential decoding of intramuscular single-channel electromyographic (EMG) signals to extract the activity of individual motor neurons. A hidden Markov model is derived from the physiological generation of the EMG signal. The EMG signal is described as a sum of several action potential (wavelet) trains embedded in noise. For each train, the time interval between wavelets is modeled by a process whose parameters are linked to the muscular activity. The parameters of this process are estimated sequentially by a Bayes filter, along with the firing instants. The method was tested on simulated signals and an experimental one, for which the rates of detection and classification of action potentials were above 95% with respect to the reference decomposition. The method works sequentially in time and is the first to address the problem of intramuscular EMG decomposition online. It has potential applications for man-machine interfacing based on motor neuron activities.
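
    The sequential Bayesian estimation described here can be illustrated with a much simplified grid-based Bayes filter that updates a posterior over a single firing-rate parameter from observed inter-spike intervals. The exponential interval likelihood is an assumption for illustration only, not the paper's Markov model:

    ```python
    import numpy as np

    def bayes_filter_intervals(intervals, rate_grid):
        """Sequentially update a posterior over a motor-neuron firing rate
        (spikes/s) from inter-spike intervals, assuming exponential
        interval likelihoods on a discrete rate grid."""
        posterior = np.full(len(rate_grid), 1.0 / len(rate_grid))  # flat prior
        for t in intervals:
            likelihood = rate_grid * np.exp(-rate_grid * t)  # Exp(rate) pdf
            posterior *= likelihood
            posterior /= posterior.sum()                     # renormalize
        return posterior

    grid = np.linspace(0.5, 30.0, 200)
    post = bayes_filter_intervals([0.12, 0.09, 0.11], grid)
    print(grid[post.argmax()])  # posterior mode, roughly 1/mean interval
    ```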

  4. Prevalence of hardcore smoking in the Netherlands between 2001 and 2012: a test of the hardening hypothesis

    Directory of Open Access Journals (Sweden)

    Jeroen Bommelé

    2016-08-01

    Full Text Available Background: Hardcore smokers are smokers who have smoked for many years and who do not intend to quit smoking. The "hardening hypothesis" states that light smokers are more likely to quit smoking than heavy smokers (such as hardcore smokers). Therefore, the prevalence of hardcore smoking among smokers would increase over time. If this is true, the smoking population would become harder to reach with tobacco control measures. In this study we tested the hardening hypothesis. Methods: We calculated the prevalence of hardcore smoking in the Netherlands from 2001 to 2012. Smokers were 'hardcore' if they (a) smoked every day, (b) smoked on average 15 cigarettes per day or more, (c) had not attempted to quit in the past 12 months, and (d) had no intention to quit within 6 months. We used logistic regression models to test whether the prevalence changed over time. We also investigated whether trends differed between educational levels. Results: Among smokers, the prevalence of hardcore smoking decreased from 40.8% in 2001 to 32.2% in 2012. In the general population, it decreased from 12.2% to 8.2%. Hardcore smokers had significantly lower education levels than non-hardcore smokers. In the general population, the prevalence of hardcore smoking decreased more among higher-educated people than among lower-educated people. Conclusions: We found no support for the hardening hypothesis in the Netherlands between 2001 and 2012. Instead, the decrease of hardcore smoking among smokers suggests a 'softening' of the smoking population.

  5. The influence of simultaneous or sequential test conditions in the properties of industrial polymers, submitted to PWR accident simulations

    International Nuclear Information System (INIS)

    Carlin, F.; Alba, C.; Chenion, J.; Gaussens, G.; Henry, J.Y.

    1986-10-01

    The effect of PWR plant normal and accident operating conditions on polymers forms the basis of nuclear qualification of safety-related containment equipment. This study was carried out at the request of safety organizations. Its purpose was to check whether accident simulations carried out sequentially during equipment qualification tests would lead to the same deterioration as that caused by an accident involving simultaneous irradiation and thermodynamic effects. The IPSN, DAS and the United States NRC collaborated in preparing this study. The work carried out by the ORIS Company, as well as the results obtained from measurement of the mechanical properties of 8 industrial polymers, are described in this report. The results are given in the conclusion. They tend to show that, overall, the most suitable test cycle for simulating accident operating conditions would be one which included irradiation and a consecutive thermodynamic shock. The results of this study have been compared with those of a previous study, which included the same test cycles except for more severe thermo-ageing. This comparison, which was made on three elastomers, shows that ageing after the accident has a different effect on each material. [fr]

  6. Tests of the salt-nuclei hypothesis of rain formation

    Energy Technology Data Exchange (ETDEWEB)

    Woodcock, A H; Blanchard, D C

    1955-01-01

    Atmospheric chlorides in sea-salt nuclei and the chlorides dissolved in shower rainwaters were recently measured in Hawaii. A comparison of these measurements reveals the remarkable fact that the weight of chloride present in a certain number of nuclei in a cubic meter of clear air tends to be equal to the weight of chloride dissolved in an equal number of raindrops in a cubic meter of rainy air. This result is explained as an indication that the raindrops grow on the salt nuclei in some manner which prevents a marked change in the distribution of these nuclei during the drop-growth process. The data presented add new evidence in further support of the salt-nuclei raindrop hypothesis previously proposed by the first author.

  7. Recent tests of the equilibrium-point hypothesis (lambda model).

    Science.gov (United States)

    Feldman, A G; Ostry, D J; Levin, M F; Gribble, P L; Mitnitski, A B

    1998-07-01

    The lambda model of the equilibrium-point hypothesis (Feldman & Levin, 1995) is an approach to motor control which, like physics, is based on a logical system coordinating empirical data. The model has gone through an interesting period. On one hand, several nontrivial predictions of the model have been successfully verified in recent studies. In addition, the explanatory and predictive capacity of the model has been enhanced by its extension to multimuscle and multijoint systems. On the other hand, claims have recently appeared suggesting that the model should be abandoned. The present paper focuses on these claims and concludes that they are unfounded. Much of the experimental data that have been used to reject the model are actually consistent with it.

  8. Simultaneous sequential monitoring of efficacy and safety led to masking of effects.

    Science.gov (United States)

    van Eekelen, Rik; de Hoop, Esther; van der Tweel, Ingeborg

    2016-08-01

    Usually, sequential designs for clinical trials are applied on the primary (=efficacy) outcome. In practice, other outcomes (e.g., safety) will also be monitored and influence the decision whether to stop a trial early. Implications of simultaneous monitoring on trial decision making are yet unclear. This study examines what happens to the type I error, power, and required sample sizes when one efficacy outcome and one correlated safety outcome are monitored simultaneously using sequential designs. We conducted a simulation study in the framework of a two-arm parallel clinical trial. Interim analyses on two outcomes were performed independently and simultaneously on the same data sets using four sequential monitoring designs, including O'Brien-Fleming and Triangular Test boundaries. Simulations differed in values for correlations and true effect sizes. When an effect was present in both outcomes, competition was introduced, which decreased power (e.g., from 80% to 60%). Futility boundaries for the efficacy outcome reduced overall type I errors as well as power for the safety outcome. Monitoring two correlated outcomes, given that both are essential for early trial termination, leads to masking of true effects. Careful consideration of scenarios must be taken into account when designing sequential trials. Simulation results can help guide trial design. Copyright © 2016 Elsevier Inc. All rights reserved.
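
    A toy version of such a simulation can be sketched as follows: correlated normal efficacy and safety outcomes are monitored at each interim look against simplified O'Brien-Fleming-type boundaries. The design constants (number of looks, boundary shape, correlation, effect sizes) are illustrative, not the study's settings:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_dual_monitoring(n_looks=5, n_per_look=40, rho=0.5,
                                 eff_delta=0.0, saf_delta=0.0, n_sims=2000):
        """Monte Carlo sketch: one efficacy and one correlated safety
        outcome, both tested at each interim look against a simplified
        O'Brien-Fleming-type boundary."""
        z_final = 1.96
        cov = [[1.0, rho], [rho, 1.0]]
        stops = {"efficacy": 0, "safety": 0, "none": 0}
        for _ in range(n_sims):
            treat = rng.multivariate_normal([eff_delta, saf_delta], cov,
                                            size=n_looks * n_per_look)
            ctrl = rng.multivariate_normal([0.0, 0.0], cov,
                                           size=n_looks * n_per_look)
            stopped = "none"
            for k in range(1, n_looks + 1):
                n = k * n_per_look
                c_k = z_final * np.sqrt(n_looks / k)  # O'Brien-Fleming-like
                z = (treat[:n].mean(axis=0) - ctrl[:n].mean(axis=0)) / np.sqrt(2.0 / n)
                if z[0] >= c_k:            # efficacy boundary crossed
                    stopped = "efficacy"
                    break
                if z[1] >= c_k:            # safety boundary crossed
                    stopped = "safety"
                    break
            stops[stopped] += 1
        return {key: count / n_sims for key, count in stops.items()}

    print(simulate_dual_monitoring(eff_delta=0.3, saf_delta=0.3))
    ```

    Running it with a true effect on both outcomes illustrates the competition described above, since each simulated trial can stop for only one reason.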

  9. Adjuvant chemotherapy with sequential or concurrent anthracycline and docetaxel: Breast International Group 02-98 randomized trial

    DEFF Research Database (Denmark)

    Francis, P.; Crown, J.; Di, Leo A.

    2008-01-01

    BACKGROUND: Docetaxel is more effective than doxorubicin for patients with advanced breast cancer. The Breast International Group 02-98 randomized trial tested the effect of incorporating docetaxel into anthracycline-based adjuvant chemotherapy and compared sequential vs concurrent administration. ... However, important differences may be related to doxorubicin and docetaxel scheduling, with sequential, but not concurrent, administration appearing to produce better DFS than anthracycline-based chemotherapy.

  10. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available This paper examines the relationship between current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some studies support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy, while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policies for reducing budget deficits, such as reducing non-development expenditure, enhancing domestic revenue collection, and actively fighting corruption and tax evasion, should be adopted. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.

  11. Immunogenicity of simultaneous versus sequential administration of a 23-valent pneumococcal polysaccharide vaccine and a quadrivalent influenza vaccine in older individuals: A randomized, open-label, non-inferiority trial.

    Science.gov (United States)

    Nakashima, Kei; Aoshima, Masahiro; Ohfuji, Satoko; Yamawaki, Satoshi; Nemoto, Masahiro; Hasegawa, Shinya; Noma, Satoshi; Misawa, Masafumi; Hosokawa, Naoto; Yaegashi, Makito; Otsuka, Yoshihito

    2018-03-21

    It is unclear whether simultaneous administration of a 23-valent pneumococcal polysaccharide vaccine (PPSV23) and a quadrivalent influenza vaccine (QIV) produces adequate immunogenicity in older individuals. This study tested the hypothesis that the pneumococcal antibody response elicited by simultaneous administration of PPSV23 and QIV in older individuals is not inferior to that elicited by sequential administration of PPSV23 and QIV. We performed a single-center, randomized, open-label, non-inferiority trial comprising 162 adults aged ≥65 years randomly assigned to either the simultaneous (simultaneous injections of PPSV23 and QIV) or sequential (control; PPSV23 injected 2 weeks after QIV vaccination) groups. Pneumococcal immunoglobulin G (IgG) titers of serotypes 23F, 3, 4, 6B, 14, and 19A were assessed. The primary endpoint was the serotype 23F response rate (a ≥2-fold increase in IgG concentrations 4-6 weeks after PPSV23 vaccination). With the non-inferiority margin set at a 20-percentage-point lower response rate, the response rate for serotype 23F in the simultaneous group (77.8%) was not inferior to that of the sequential group (77.6%; difference, 0.1%; 90% confidence interval, -10.8% to 11.1%). None of the pneumococcal IgG serotype titers were significantly different between the groups 4-6 weeks after vaccination. Simultaneous administration did not significantly decrease seroprotection odds ratios for the H1N1, H3N2, or B/Phuket influenza strains, the exception being B/Texas. Additionally, simultaneous administration did not increase adverse reactions. Hence, simultaneous administration of PPSV23 and QIV shows an acceptable immunogenicity that is comparable to sequential administration, without an increase in adverse reactions. (This study was registered with ClinicalTrials.gov [NCT02592486]).
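
    The non-inferiority comparison above rests on a confidence interval for a difference in response rates. A minimal sketch of a Wald-style 90% interval, with placeholder counts chosen only to mimic the reported rates (the trial's raw counts are not given here):

    ```python
    import math

    def noninferiority_ci(x1, n1, x0, n0, z=1.645):
        """Wald 90% CI for p1 - p0; non-inferiority holds when the lower
        bound stays above the margin (here -0.20)."""
        p1, p0 = x1 / n1, x0 / n0
        diff = p1 - p0
        se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
        return diff, (diff - z * se, diff + z * se)

    diff, (lo, hi) = noninferiority_ci(63, 81, 62, 80)  # placeholder counts
    print(round(diff, 3), round(lo, 3), round(hi, 3), lo > -0.20)
    ```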

  12. THE FRACTAL MARKET HYPOTHESIS

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRAU

    2012-05-01

    Full Text Available In this article, the concept of the capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and, of course, the manner in which they interpret that information may differ. The Fractal Market Hypothesis also refers to the way in which liquidity and investment horizons influence the behaviour of financial investors.

  13. THE FRACTAL MARKET HYPOTHESIS

    OpenAIRE

    FELICIA RAMONA BIRAU

    2012-01-01

    In this article, the concept of the capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and...

  14. Sex differences in DNA methylation and expression in zebrafish brain: a test of an extended 'male sex drive' hypothesis.

    Science.gov (United States)

    Chatterjee, Aniruddha; Lagisz, Malgorzata; Rodger, Euan J; Zhen, Li; Stockwell, Peter A; Duncan, Elizabeth J; Horsfield, Julia A; Jeyakani, Justin; Mathavan, Sinnakaruppan; Ozaki, Yuichi; Nakagawa, Shinichi

    2016-09-30

    The sex drive hypothesis predicts that stronger selection on male traits has resulted in masculinization of the genome. Here we test whether such masculinizing effects can be detected at the level of the transcriptome and methylome in the adult zebrafish brain. Although methylation is globally similar, we identified 914 differentially methylated CpGs (DMCs) between males and females (435 hypermethylated and 479 hypomethylated in males compared to females). These DMCs were prevalent in gene bodies, intergenic regions and CpG island shores. We also discovered 15 distinct CpG clusters with striking sex-specific DNA methylation differences. In contrast, at the transcriptome level, more female-biased than male-biased genes were expressed, giving little support to the male sex drive hypothesis. Our study provides a genome-wide methylome and transcriptome assessment and sheds light on sex-specific epigenetic patterns in zebrafish for the first time. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  16. Study of sequential disinfection for the inactivation of protozoa and indicator microorganisms in wastewater

    Directory of Open Access Journals (Sweden)

    Raphael Corrêa Medeiros

    2015-05-01

    Full Text Available Sewage disinfection has the primary objective of inactivating pathogenic organisms to prevent the dissemination of waterborne diseases. This study analyzed individual disinfection, with chlorine and ultraviolet radiation, and sequential disinfection (chlorine-UV radiation). The tests were conducted with anaerobic effluent in batch, at laboratory scale, with two dosages of chlorine (10 and 20 mg L⁻¹) and of UV (2.5 and 6.1 Wh m⁻³). In addition, to guarantee the presence of cysts in the tests, 10⁴ cysts of Giardia spp. per liter were inoculated. The resistance order was as follows: E. coli = total coliforms < Clostridium perfringens < Giardia spp. Furthermore, synergistic effects reached 0.06 to 1.42 log of inactivation in sequential disinfection for the two most resistant microorganisms.
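
    Synergy in sequential disinfection is commonly quantified as the extra log10 inactivation beyond the sum of the individual treatments. A one-function sketch with placeholder values, not the study's measurements:

    ```python
    def synergy_log_inactivation(log_sequential, log_chlorine, log_uv):
        """Extra log10 inactivation achieved by the sequential treatment
        beyond the sum of the individual chlorine and UV treatments."""
        return log_sequential - (log_chlorine + log_uv)

    print(synergy_log_inactivation(4.5, 2.0, 1.8))  # 0.7 extra log units
    ```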

  17. Humans have evolved specialized skills of social cognition: the cultural intelligence hypothesis.

    Science.gov (United States)

    Herrmann, Esther; Call, Josep; Hernández-Lloreda, María Victoria; Hare, Brian; Tomasello, Michael

    2007-09-07

    Humans have many cognitive skills not possessed by their nearest primate relatives. The cultural intelligence hypothesis argues that this is mainly due to a species-specific set of social-cognitive skills, emerging early in ontogeny, for participating and exchanging knowledge in cultural groups. We tested this hypothesis by giving a comprehensive battery of cognitive tests to large numbers of two of humans' closest primate relatives, chimpanzees and orangutans, as well as to 2.5-year-old human children before literacy and schooling. Supporting the cultural intelligence hypothesis and contradicting the hypothesis that humans simply have more "general intelligence," we found that the children and chimpanzees had very similar cognitive skills for dealing with the physical world but that the children had more sophisticated cognitive skills than either of the ape species for dealing with the social world.

  18. A novel hypothesis splitting method implementation for multi-hypothesis filters

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free. The multi-hypothesis filters commonly approximate the distribution tran...

  19. Cognitive processes associated with sequential tool use in New Caledonian crows.

    Directory of Open Access Journals (Sweden)

    Joanna H Wimpenny

    Full Text Available BACKGROUND: Using tools to act on non-food objects--for example, to make other tools--is considered to be a hallmark of human intelligence, and may have been a crucial step in our evolution. One form of this behaviour, 'sequential tool use', has been observed in a number of non-human primates and even in one bird, the New Caledonian crow (Corvus moneduloides). While sequential tool use has often been interpreted as evidence for advanced cognitive abilities, such as planning and analogical reasoning, the behaviour itself can be underpinned by a range of different cognitive mechanisms, which have never been explicitly examined. Here, we present experiments that not only demonstrate new tool-using capabilities in New Caledonian crows, but allow examination of the extent to which crows understand the physical interactions involved. METHODOLOGY/PRINCIPAL FINDINGS: In two experiments, we tested seven captive New Caledonian crows in six tasks requiring the use of up to three different tools in a sequence to retrieve food. Our study incorporated several novel features: (i) we tested crows on a three-tool problem (subjects were required to use a tool to retrieve a second tool, then use the second tool to retrieve a third one, and finally use the third one to reach for food); (ii) we presented tasks of different complexity in random rather than progressive order; (iii) we included a number of control conditions to test whether tool retrieval was goal-directed; and (iv) we manipulated the subjects' pre-testing experience. Five subjects successfully used tools in a sequence (four from their first trial), and four subjects repeatedly solved the three-tool condition. Sequential tool use did not require, but was enhanced by, pre-training on each element in the sequence ('chaining'), an explanation that could not be ruled out in earlier studies. By analyzing tool choice, tool swapping and improvement over time, we show that successful subjects did not use a random...

  20. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2014-01-01

    Full Text Available This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of PHD filtering. Compared to other methods, the concept of multiple vehicle cooperative localization with spatial registration is first proposed under random finite set theory. In addition, the proposed solution also addresses the challenges of multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation result demonstrates its reliability and feasibility in large-scale environments.
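
    In the single-target case, the sequential Monte Carlo machinery underlying an SMC-PHD implementation reduces to the familiar propagate/weight/resample cycle of a bootstrap particle filter. A minimal 1-D sketch (the motion model and noise levels are illustrative, and the PHD filter's target birth/death and clutter terms are omitted):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def bootstrap_particle_filter(observations, n_particles=500,
                                  process_std=0.5, obs_std=1.0):
        """Bootstrap SMC filter for a 1-D random-walk state observed in
        Gaussian noise: propagate, weight by the likelihood, resample."""
        particles = rng.normal(0.0, 1.0, n_particles)
        estimates = []
        for z in observations:
            particles = particles + rng.normal(0.0, process_std, n_particles)
            weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
            weights /= weights.sum()
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx]
            estimates.append(particles.mean())
        return estimates

    print(bootstrap_particle_filter([0.2, 0.4, 0.1, 0.5])[-1])
    ```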

  1. Associations among measures of sequential processing in motor and linguistics tasks in adults with and without a family history of childhood apraxia of speech: a replication study.

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H

    2013-03-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically related adults from a family with familial CAS showed motor sequencing deficits in an alternating motor speech task. Compared with the other adults, these three participants showed deficits in tasks requiring high loads of sequential processing, including nonword imitation, nonword reading and spelling. Qualitative error analyses in real word and nonword imitations revealed group differences in phoneme sequencing errors. Motor sequencing ability was correlated with phoneme sequencing errors during real word and nonword imitation, reading and spelling. Correlations were characterized by extremely high scores in one family and extremely low scores in another. Results are consistent with a central deficit in sequential processing in CAS of familial origin.

  2. A General Relativistic Null Hypothesis Test with Event Horizon Telescope Observations of the Black Hole Shadow in Sgr A*

    Science.gov (United States)

    Psaltis, Dimitrios; Özel, Feryal; Chan, Chi-Kwan; Marrone, Daniel P.

    2015-12-01

    The half opening angle of a Kerr black hole shadow is always equal to (5 ± 0.2) GM/(Dc²), where M is the mass of the black hole and D is its distance from the Earth. Therefore, measuring the size of a shadow and verifying whether it is within this 4% range constitutes a null hypothesis test of general relativity. We show that the black hole in the center of the Milky Way, Sgr A*, is the optimal target for performing this test with upcoming observations using the Event Horizon Telescope (EHT). We use the results of optical/IR monitoring of stellar orbits to show that the mass-to-distance ratio for Sgr A* is already known to an accuracy of ∼4%. We investigate our prior knowledge of the properties of the scattering screen between Sgr A* and the Earth, the effects of which will need to be corrected for in order for the black hole shadow to appear sharp against the background emission. Finally, we explore an edge detection scheme for interferometric data and a pattern matching algorithm based on the Hough/Radon transform and demonstrate that the shadow of the black hole at 1.3 mm can be localized, in principle, to within ∼9%. All these results suggest that our prior knowledge of the properties of the black hole, of scattering broadening, and of the accretion flow can only limit this general relativistic null hypothesis test with EHT observations of Sgr A* to ≲10%.
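
    Plugging rough literature values for Sgr A* into the relation above gives the expected angular scale directly; the mass and distance used below (about 4.3 million solar masses at about 8.3 kpc) are approximate values assumed for illustration:

    ```python
    import math

    G = 6.674e-11      # m^3 kg^-1 s^-2
    C = 2.998e8        # m/s
    M_SUN = 1.989e30   # kg
    PARSEC = 3.086e16  # m

    def shadow_half_angle_uas(mass_msun, distance_kpc):
        """Half opening angle ~ 5 GM / (D c^2) of a Kerr shadow,
        converted from radians to microarcseconds."""
        m = mass_msun * M_SUN
        d = distance_kpc * 1e3 * PARSEC
        theta_rad = 5.0 * G * m / (d * C ** 2)
        return theta_rad * (180.0 / math.pi) * 3600.0 * 1e6

    print(round(shadow_half_angle_uas(4.3e6, 8.3), 1))  # ~25.6 microarcseconds
    ```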

  3. Brood desertion by female shorebirds: a test of the differential parental capacity hypothesis on Kentish plovers

    NARCIS (Netherlands)

    Amat, JA; Visser, GH; Perez-Hurtado, A; Arroyo, GM

    2000-01-01

    The aim of this study was to examine whether the energetic costs of reproduction explain offspring desertion by female shorebirds, as is suggested by the differential parental capacity hypothesis. A prediction of the hypothesis is that, in species with biparental incubation in which females desert

  4. Discrimination between sequential and simultaneous virtual channels with electrical hearing.

    Science.gov (United States)

    Landsberger, David; Galvin, John J

    2011-09-01

    In cochlear implants (CIs), simultaneous or sequential stimulation of adjacent electrodes can produce intermediate pitch percepts between those of the component electrodes. However, it is unclear whether simultaneous and sequential virtual channels (VCs) can be discriminated. In this study, CI users were asked to discriminate simultaneous and sequential VCs; discrimination was measured for monopolar (MP) and bipolar + 1 (BP + 1) stimulation, i.e., relatively broad and focused stimulation modes, respectively. For sequential VCs, the interpulse interval (IPI) varied between 0.0 and 1.8 ms. All stimuli were presented at comfortably loud, loudness-balanced levels at a 250 pulses per second per electrode (ppse) stimulation rate. On average, CI subjects were able to reliably discriminate between sequential and simultaneous VCs. While there was no significant effect of IPI or stimulation mode on VC discrimination, some subjects exhibited better VC discrimination with BP + 1 stimulation. Subjects' discrimination between sequential and simultaneous VCs was correlated with electrode discrimination, suggesting that spatial selectivity may influence perception of sequential VCs. To maintain equal loudness, sequential VC amplitudes were nearly double those of simultaneous VCs, presumably resulting in a broader spread of excitation. These results suggest that perceptual differences between simultaneous and sequential VCs might be explained by differences in the spread of excitation. © 2011 Acoustical Society of America

  5. SETI in vivo: testing the we-are-them hypothesis

    Science.gov (United States)

    Makukov, Maxim A.; Shcherbak, Vladimir I.

    2018-04-01

    After it was proposed that life on Earth might descend from seeding by an earlier extraterrestrial civilization motivated to secure and spread life, some authors noted that this alternative offers a testable implication: microbial seeds could be intentionally supplied with a durable signature that might be found in extant organisms. In particular, it was suggested that the optimal location for such an artefact is the genetic code, as the least evolving part of cells. However, as the mainstream view goes, this scenario is too speculative and cannot be meaningfully tested because encoding/decoding a signature within the genetic code is something ill-defined, so any retrieval attempt is doomed to guesswork. Here we refresh the seeded-Earth hypothesis in light of recent observations, and discuss the motivation for inserting a signature. We then show that `biological SETI' involves even weaker assumptions than traditional SETI and admits a well-defined methodological framework. After assessing the possibility in terms of molecular and evolutionary biology, we formalize the approach and, adopting the standard guideline of SETI that encoding/decoding should follow from first principles and be convention-free, develop a universal retrieval strategy. Applied to the canonical genetic code, it reveals a non-trivial precision structure of interlocked logical and numerical attributes of systematic character (previously we found these heuristically). To assess this result in view of the initial assumption, we perform statistical, comparison, interdependence and semiotic analyses. Statistical analysis reveals no causal connection of the result to evolutionary models of the genetic code, interdependence analysis precludes overinterpretation, and comparison analysis shows that known variations of the code lack any precision-logic structures, in agreement with these variations being post-LUCA (i.e. post-seeding) evolutionary deviations from the canonical code. Finally, semiotic

  6. Sequential versus simultaneous market delineation

    DEFF Research Database (Denmark)

    Haldrup, Niels; Møllgaard, Peter; Kastberg Nielsen, Claus

    2005-01-01

    Delineation of the relevant market forms a pivotal part of most antitrust cases. The standard approach is sequential: first the product market is delineated, then the geographical market is defined. Demand and supply substitution in both the product dimension and the geographical dimension ... and geographical markets. Using a unique data set for prices of Norwegian and Scottish salmon, we propose a methodology for simultaneous market delineation and we demonstrate that, compared to a sequential approach, conclusions will be reversed. JEL: C3, K21, L41, Q22. Keywords: relevant market, econometric delineation.

  7. Multiple Choice Testing and the Retrieval Hypothesis of the Testing Effect

    Science.gov (United States)

    Sensenig, Amanda E.

    2010-01-01

    Taking a test often leads to enhanced later memory for the tested information, a phenomenon known as the "testing effect". This memory advantage has been reliably demonstrated with recall tests but not multiple choice tests. One potential explanation for this finding is that multiple choice tests do not rely on retrieval processes to the same…

  8. Monitoring sequential electron transfer with EPR

    International Nuclear Information System (INIS)

    Thurnauer, M.C.; Feezel, L.L.; Snyder, S.W.; Tang, J.; Norris, J.R.; Morris, A.L.; Rustandi, R.R.

    1989-01-01

    A completely general model which treats electron spin polarization (ESP) found in a system in which radical pairs with different magnetic interactions are formed sequentially has been described. This treatment has been applied specifically to the ESP found in the bacterial reaction center. Test cases show clearly how parameters such as structure, lifetime, and magnetic interactions within the successive radical pairs affect the ESP, and demonstrate that previous treatments of this problem have been incomplete. The photosynthetic bacterial reaction center protein is an ideal system for testing the general model of ESP. The radical pair which exhibits ESP, P870⁺Q⁻ (P870⁺ is the oxidized primary electron donor, a bacteriochlorophyll special pair, and Q⁻ is the reduced primary quinone acceptor), is formed via sequential electron transport through the intermediary radical pair P870⁺I⁻ (I⁻ is the reduced intermediary electron acceptor, a bacteriopheophytin). In addition, it is possible to experimentally vary most of the important parameters, such as the lifetime of the intermediary radical pair and the magnetic interactions in each pair. It has been shown how selective isotopic substitution (¹H or ²H) on P870, I and Q affects the ESP of the EPR spectrum of P870⁺Q⁻, observed at two different microwave frequencies, in Fe²⁺-depleted bacterial reaction centers of Rhodobacter sphaeroides R26. Thus, the relative magnitudes of the magnetic properties (nuclear hyperfine and g-factor differences) which influence ESP development were varied. The results support the general model of ESP in that they suggest that the P870⁺Q⁻ radical pair interactions are the dominant source of ESP production in ²H bacterial reaction centers.

  9. Social learning and evolution: the cultural intelligence hypothesis

    Science.gov (United States)

    van Schaik, Carel P.; Burkart, Judith M.

    2011-01-01

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer. PMID:21357223

  10. Social learning and evolution: the cultural intelligence hypothesis.

    Science.gov (United States)

    van Schaik, Carel P; Burkart, Judith M

    2011-04-12

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer.

  11. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions is asked concerning this condition, including its name, the consensus about the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence, this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite and the technique employed are fundamental to success. PMID:19756187

  12. A hypothesis-testing framework for studies investigating ontogenetic niche shifts using stable isotope ratios.

    Directory of Open Access Journals (Sweden)

    Caroline M Hammerschlag-Peyer

    Full Text Available Ontogenetic niche shifts occur across diverse taxonomic groups, and can have critical implications for population dynamics, community structure, and ecosystem function. In this study, we provide a hypothesis-testing framework combining univariate and multivariate analyses to examine ontogenetic niche shifts using stable isotope ratios. This framework is based on three distinct ontogenetic niche shift scenarios, i.e., (1) no niche shift, (2) niche expansion/reduction, and (3) discrete niche shift between size classes. We developed criteria for identifying each scenario, as based on three important resource use characteristics, i.e., niche width, niche position, and niche overlap. We provide an empirical example for each ontogenetic niche shift scenario, illustrating differences in resource use characteristics among different organisms. The present framework provides a foundation for future studies on ontogenetic niche shifts, and also can be applied to examine resource variability among other population sub-groupings (e.g., by sex or phenotype).

  13. Sequential logic analysis and synthesis

    CERN Document Server

    Cavanagh, Joseph

    2007-01-01

    Until now, there was no single resource for actual digital system design. Using both basic and advanced concepts, Sequential Logic: Analysis and Synthesis offers a thorough exposition of the analysis and synthesis of both synchronous and asynchronous sequential machines. With 25 years of experience in designing computing equipment, the author stresses the practical design of state machines. He clearly delineates each step of the structured and rigorous design principles that can be applied to practical applications. The book begins by reviewing the analysis of combinatorial logic and Boolean a

  14. Testing the gravitational instability hypothesis?

    Science.gov (United States)

    Babul, Arif; Weinberg, David H.; Dekel, Avishai; Ostriker, Jeremiah P.

    1994-01-01

    We challenge a widely accepted assumption of observational cosmology: that successful reconstruction of observed galaxy density fields from measured galaxy velocity fields (or vice versa), using the methods of gravitational instability theory, implies that the observed large-scale structures and large-scale flows were produced by the action of gravity. This assumption is false, in that there exist nongravitational theories that pass the reconstruction tests and gravitational theories with certain forms of biased galaxy formation that fail them. Gravitational instability theory predicts specific correlations between large-scale velocity and mass density fields, but the same correlations arise in any model where (a) structures in the galaxy distribution grow from homogeneous initial conditions in a way that satisfies the continuity equation, and (b) the present-day velocity field is irrotational and proportional to the time-averaged velocity field. We demonstrate these assertions using analytical arguments and N-body simulations. If large-scale structure is formed by gravitational instability, then the ratio of the galaxy density contrast to the divergence of the velocity field yields an estimate of the density parameter Ω (or, more generally, an estimate of β ≡ Ω^0.6/b, where b is an assumed constant of proportionality between galaxy and mass density fluctuations). In nongravitational scenarios, the values of Ω or β estimated in this way may fail to represent the true cosmological values. However, even if nongravitational forces initiate and shape the growth of structure, gravitationally induced accelerations can dominate the velocity field at late times, long after the action of any nongravitational impulses. The estimated β approaches the true value in such cases, and in our numerical simulations the estimated β values are reasonably accurate for both gravitational and nongravitational models. Reconstruction tests

  15. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  16. Examining Age-Related Movement Representations for Sequential (Fine-Motor) Finger Movements

    Science.gov (United States)

    Gabbard, Carl; Cacola, Priscila; Bobbio, Tatiana

    2011-01-01

    Theory suggests that imagined and executed movement planning relies on internal models for action. Using a chronometry paradigm to compare the movement duration of imagined and executed movements, we tested children aged 7-11 years and adults on their ability to perform sequential finger movements. Underscoring this tactic was our desire to gain a…

  17. Sequential vs simultaneous encoding of spatial information: a comparison between the blind and the sighted.

    Science.gov (United States)

    Ruotolo, Francesco; Ruggiero, Gennaro; Vinciguerra, Michela; Iachini, Tina

    2012-02-01

    The aim of this research is to assess whether the crucial factor in determining the characteristics of blind people's spatial mental images is concerned with the visual impairment per se or the processing style that the dominant perceptual modalities used to acquire spatial information impose, i.e. simultaneous (vision) vs sequential (kinaesthesis). Participants were asked to learn six positions in a large parking area via movement alone (congenitally blind, adventitiously blind, blindfolded sighted) or with vision plus movement (simultaneous sighted, sequential sighted), and then to mentally scan between positions in the path. The crucial manipulation concerned the sequential sighted group. Their visual exploration was made sequential by putting visual obstacles within the pathway in such a way that they could not see simultaneously the positions along the pathway. The results revealed a significant time/distance linear relation in all tested groups. However, the linear component was lower in sequential sighted and blind participants, especially congenital. Sequential sighted and congenitally blind participants showed an almost overlapping performance. Differences between groups became evident when mentally scanning farther distances (more than 5m). This threshold effect could be revealing of processing limitations due to the need of integrating and updating spatial information. Overall, the results suggest that the characteristics of the processing style rather than the visual impairment per se affect blind people's spatial mental images. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)

  19. Secretive Food Concocting in Binge Eating: Test of a Famine Hypothesis

    Science.gov (United States)

    Boggiano, Mary M.; Turan, Bulent; Maldonado, Christine R.; Oswald, Kimberly D.; Shuman, Ellen S.

    2016-01-01

    Objective Food concocting, or making strange food mixtures, is well documented in the famine and experimental semistarvation literature and appears anecdotally in rare descriptions of eating disorder (ED) patients but has never been scientifically investigated. Here we do so in the context of binge-eating using a “famine hypothesis of concocting.” Method A sample of 552 adults varying in binge eating and dieting traits completed a Concocting Survey created for this study. Exploratory ED groups were created to obtain predictions as to the nature of concocting in clinical populations. Results Binge eating predicted the 24.6% of participants who reported having ever concocted, but dietary restraint independently, even after controlling for binge eating, predicted its frequency and salience. Craving was the main motive. Emotions while concocting mirrored classic high-arousal symptoms associated with drug use; emotions while eating the concoctions were intensely negative and self-deprecating. Concocting prevalence and salience were greater in the anorexia > bulimia > BED > no ED groups, consistent with their respectively incrementing dieting scores. Discussion Concocting distinguishes binge eating from other overeating and, consistent with the famine hypothesis, is accounted for by dietary restraint. Unlike its adaptive function in famine, concocting could worsen binge-eating disorders by increasing negative affect, shame, and secrecy. Its assessment in these disorders may prove therapeutically valuable. PMID:23255044

  20. Generalized infimum and sequential product of quantum effects

    International Nuclear Information System (INIS)

    Li Yuan; Sun Xiuhong; Chen Zhengli

    2007-01-01

    The quantum effects for a physical system can be described by the set E(H) of positive operators on a complex Hilbert space H that are bounded above by the identity operator I. For A, B ∈ E(H), the operation of sequential product A ∘ B = A^(1/2)BA^(1/2) was proposed as a model for sequential quantum measurements. A nice investigation of the properties of the sequential product has been carried out [Gudder, S. and Nagy, G., 'Sequential quantum measurements', J. Math. Phys. 42, 5212 (2001)]. In this note, we extend some results of this reference. In particular, a gap in the proof of Theorem 3.2 in this reference is overcome. In addition, some properties of the generalized infimum A ⊓ B are studied

  1. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
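
    To illustrate the 0D-versus-1D distinction, the sketch below (our toy example, not the authors' datasets or their random field theory code) contrasts a 0D confidence interval on a scalar summary with a pointwise 1D bootstrap band; note that a pointwise band does not control the field-wide error rate, which is precisely the gap the RFT correction addresses:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic dataset: 20 subjects x 101 time nodes (e.g., stance phase).
        n_subj, n_nodes = 20, 101
        t = np.linspace(0, 1, n_nodes)
        y = np.sin(2 * np.pi * t) + 0.5 * rng.standard_normal((n_subj, n_nodes))

        # 0D approach: collapse each trajectory to a scalar (here its maximum),
        # then build a confidence interval on that scalar alone.
        ci_0d = np.percentile(y.max(axis=1), [2.5, 97.5])

        # 1D non-parametric approach: bootstrap the mean trajectory at every node.
        n_boot = 2000
        boot_means = np.empty((n_boot, n_nodes))
        for b in range(n_boot):
            idx = rng.integers(0, n_subj, n_subj)   # resample subjects
            boot_means[b] = y[idx].mean(axis=0)
        ci_1d = np.percentile(boot_means, [2.5, 97.5], axis=0)

        print("0D CI on the scalar maximum:", ci_0d)
        print("1D pointwise CI band, shape (2 x nodes):", ci_1d.shape)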

  2. Combining multiple hypothesis testing and affinity propagation clustering leads to accurate, robust and sample size independent classification on gene expression data

    Directory of Open Access Journals (Sweden)

    Sakellariou Argiris

    2012-10-01

    Full Text Available Abstract Background A feature selection method in microarray gene expression data should be independent of platform, disease and dataset size. Our hypothesis is that among the statistically significant ranked genes in a gene list, there should be clusters of genes that share similar biological functions related to the investigated disease. Thus, instead of keeping N top ranked genes, it would be more appropriate to define and keep a number of gene cluster exemplars. Results We propose a hybrid FS method (mAP-KL), which combines multiple hypothesis testing and the affinity propagation (AP) clustering algorithm along with the Krzanowski & Lai cluster quality index, to select a small yet informative subset of genes. We applied mAP-KL on real microarray data, as well as on simulated data, and compared its performance against 13 other feature selection approaches. Across a variety of diseases and number of samples, mAP-KL presents competitive classification results, particularly in neuromuscular diseases, where its overall AUC score was 0.91. Furthermore, mAP-KL generates concise yet biologically relevant and informative N-gene expression signatures, which can serve as a valuable tool for diagnostic and prognostic purposes, as well as a source of potential disease biomarkers in a broad range of diseases. Conclusions mAP-KL is a data-driven and classifier-independent hybrid feature selection method, which applies to any disease classification problem based on microarray data, regardless of the available samples. Combining multiple hypothesis testing and AP leads to subsets of genes, which classify unknown samples from both small and large patient cohorts with high accuracy.
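
    The two-step idea is easy to sketch. The following Python fragment (our illustration, not the authors' code: it uses a plain t-test for the ranking step and lets affinity propagation pick the number of clusters itself, whereas mAP-KL chooses that number with the Krzanowski & Lai index) ranks genes by hypothesis tests, clusters the top-ranked genes, and keeps one exemplar per cluster:

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(1)

        # Toy expression matrix: 40 samples (20 per class) x 500 genes.
        X = rng.standard_normal((40, 500))
        y = np.array([0] * 20 + [1] * 20)
        X[y == 1, :25] += 1.0                  # 25 genuinely differential genes

        # Step 1: multiple hypothesis testing -- rank genes by p-value
        # and keep the top N as the candidate list.
        _, pvals = ttest_ind(X[y == 0], X[y == 1], axis=0)
        top = np.argsort(pvals)[:60]

        # Step 2: cluster the top-ranked genes (genes as objects, samples
        # as features) and keep one exemplar gene per cluster.
        ap = AffinityPropagation(random_state=0).fit(X[:, top].T)
        exemplars = top[ap.cluster_centers_indices_]
        print("exemplar genes:", exemplars)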

  3. The "hierarchical" Scratch Collapse Test for identifying multilevel ulnar nerve compression.

    Science.gov (United States)

    Davidge, Kristen M; Gontre, Gil; Tang, David; Boyd, Kirsty U; Yee, Andrew; Damiano, Marci S; Mackinnon, Susan E

    2015-09-01

    The Scratch Collapse Test (SCT) is used to assist in the clinical evaluation of patients with ulnar nerve compression. The purpose of this study is to introduce the hierarchical SCT as a physical examination tool for identifying multilevel nerve compression in patients with cubital tunnel syndrome. A prospective cohort study (2010-2011) was conducted of patients referred with primary cubital tunnel syndrome. Five ulnar nerve compression sites were evaluated with the SCT. Each site generating a positive SCT was sequentially "frozen out" with a topical anesthetic to allow determination of both primary and secondary ulnar nerve entrapment points. The order or "hierarchy" of compression sites was recorded. Twenty-five patients (mean age 49.6 ± 12.3 years; 64% female) were eligible for inclusion. The primary entrapment point was identified as Osborne's band in 80% and the cubital tunnel retinaculum in 20% of patients. Secondary entrapment points were also identified in the following order in all patients: (1) volar antebrachial fascia, (2) Guyon's canal, and (3) arcade of Struthers. The SCT is useful in localizing the site of primary compression of the ulnar nerve in patients with cubital tunnel syndrome. It is also sensitive enough to detect secondary compression points when primary sites are sequentially frozen out with a topical anesthetic, termed the hierarchical SCT. The findings of the hierarchical SCT are in keeping with the double crush hypothesis described by Upton and McComas in 1973 and the hypothesis of multilevel nerve compression proposed by Mackinnon and Novak in 1994.

  4. A Dopamine Hypothesis of Autism Spectrum Disorder.

    Science.gov (United States)

    Pavăl, Denis

    2017-01-01

    Autism spectrum disorder (ASD) comprises a group of neurodevelopmental disorders characterized by social deficits and stereotyped behaviors. While several theories have emerged, the pathogenesis of ASD remains unknown. Although studies report dopamine signaling abnormalities in autistic patients, a coherent dopamine hypothesis which could link neurobiology to behavior in ASD is currently lacking. In this paper, we present such a hypothesis by proposing that autistic behavior arises from dysfunctions in the midbrain dopaminergic system. We hypothesize that a dysfunction of the mesocorticolimbic circuit leads to social deficits, while a dysfunction of the nigrostriatal circuit leads to stereotyped behaviors. Furthermore, we discuss 2 key predictions of our hypothesis, with emphasis on clinical and therapeutic aspects. First, we argue that dopaminergic dysfunctions in the same circuits should associate with autistic-like behavior in nonautistic subjects. Concerning this, we discuss the case of PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections) which displays behaviors similar to those of ASD, presumed to arise from dopaminergic dysfunctions. Second, we argue that providing dopamine modulators to autistic subjects should lead to a behavioral improvement. Regarding this, we present clinical studies of dopamine antagonists which seem to have improving effects on autistic behavior. Furthermore, we explore the means of testing our hypothesis by using neuroreceptor imaging, which could provide comprehensive evidence for dopamine signaling dysfunctions in autistic subjects. Lastly, we discuss the limitations of our hypothesis. Along these lines, we aim to provide a dopaminergic model of ASD which might lead to a better understanding of the ASD pathogenesis. © 2017 S. Karger AG, Basel.

  5. Possible Solution to Publication Bias Through Bayesian Statistics, Including Proper Null Hypothesis Testing

    NARCIS (Netherlands)

    Konijn, Elly A.; van de Schoot, Rens; Winter, Sonja D.; Ferguson, Christopher J.

    2015-01-01

    The present paper argues that an important cause of publication bias resides in traditional frequentist statistics forcing binary decisions. An alternative approach through Bayesian statistics provides various degrees of support for any hypothesis, allowing balanced decisions and proper null hypothesis testing.

  6. Testing Bergmann's rule and the Rosenzweig hypothesis with craniometric studies of the South American sea lion.

    Science.gov (United States)

    Sepúlveda, Maritza; Oliva, Doris; Duran, L René; Urra, Alejandra; Pedraza, Susana N; Majluf, Patrícia; Goodall, Natalie; Crespo, Enrique A

    2013-04-01

    We tested the validity of Bergmann's rule and Rosenzweig's hypothesis through an analysis of the geographical variation of the skull size of Otaria flavescens along the entire distribution range of the species (except Brazil). We quantified the sizes of 606 adult South American sea lion skulls measured in seven localities of Peru, Chile, Uruguay, Argentina, and the Falkland/Malvinas Islands. Geographical and environmental variables included latitude, longitude, and monthly minimum, maximum, and mean air and ocean temperatures. We also included information on fish landings as a proxy for productivity. Males showed a positive relationship between condylobasal length (CBL) and latitude, and between CBL and the six temperature variables. By contrast, females showed a negative relationship between CBL and the same variables. Finally, female skull size showed a significant and positive correlation with fish landings, while males did not show any relationship with this variable. The body size of males conformed to Bergmann's rule, with larger individuals found in southern localities of South America. Females followed the converse of Bergmann's rule at the intraspecific level, but showed a positive relationship with the proxy for productivity, thus supporting Rosenzweig's hypothesis. Differences in the factors that drive body size in females and males may be explained by their different life-history strategies. Our analyses demonstrate that latitude and temperature are not the only factors that explain spatial variation in body size: others such as food availability are also important for explaining the ecogeographical patterns found in O. flavescens.

  7. Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies

    Science.gov (United States)

    Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert

    2016-01-01

    The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
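
    A minimal sketch of such a screen-then-test pipeline is shown below (our simplification, not the published COMBI implementation: genotypes are simulated, the per-SNP test is a plain two-sample t-test on allele counts, and the threshold correction is Bonferroni over the screened subset rather than the permutation-based correction used by COMBI):

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(2)

        # Toy GWAS data: 600 subjects x 2000 SNPs coded as 0/1/2 allele counts.
        n, p = 600, 2000
        X = rng.integers(0, 3, size=(n, p)).astype(float)
        y = rng.integers(0, 2, size=n)
        # Plant 5 weakly associated SNPs among the cases.
        X[y == 1, :5] = np.clip(
            X[y == 1, :5] + rng.integers(0, 2, size=(int(y.sum()), 5)), 0, 2)

        # Step 1 (screening): train a linear SVM on all SNPs and keep the
        # k SNPs with the largest absolute weights as candidates.
        k = 100
        svm = LinearSVC(C=0.1, dual=False).fit(X, y)
        candidates = np.argsort(-np.abs(svm.coef_[0]))[:k]

        # Step 2 (testing): per-candidate association test, with the
        # significance threshold corrected for the k tests performed.
        _, pvals = ttest_ind(X[y == 0][:, candidates],
                             X[y == 1][:, candidates], axis=0)
        print("SNPs declared significant:", candidates[pvals < 0.05 / k])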

  8. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. If one is trying to predict a random set of data, one should first test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trusted. There are several methods for testing this hypothesis, and the computational power provided by the R environment makes the work of the researcher easier and cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
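
    As a concrete example of such a randomness check (the paper works in R; the self-contained Python implementation below of the classical Wald-Wolfowitz runs test, with its normal approximation, is our own illustration, not the paper's code):

        import numpy as np
        from scipy.stats import norm

        def runs_test(returns):
            # Under the random walk hypothesis the signs of returns are
            # exchangeable, so the number of sign runs is asymptotically normal.
            signs = np.sign(returns)
            signs = signs[signs != 0]                 # drop zero returns
            n_pos = int(np.sum(signs > 0))
            n_neg = int(np.sum(signs < 0))
            n = n_pos + n_neg
            runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
            mu = 2.0 * n_pos * n_neg / n + 1.0
            var = (mu - 1.0) * (mu - 2.0) / (n - 1.0)
            z = (runs - mu) / np.sqrt(var)
            return z, 2 * norm.sf(abs(z))             # two-sided p-value

        rng = np.random.default_rng(3)
        z, p = runs_test(rng.standard_normal(1000))   # IID noise: should not reject
        print(f"z = {z:.2f}, p = {p:.3f}")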

  9. Sequential analysis in neonatal research-systematic review.

    Science.gov (United States)

    Lava, Sebastiano A G; Elie, Valéry; Ha, Phuong Thi Viet; Jacqz-Aigrain, Evelyne

    2018-05-01

    As more new drugs are discovered, traditional designs come at their limits. Ten years after the adoption of the European Paediatric Regulation, we performed a systematic review on the US National Library of Medicine and Excerpta Medica database of sequential trials involving newborns. Out of 326 identified scientific reports, 21 trials were included. They enrolled 2832 patients, of whom 2099 were analyzed: the median number of neonates included per trial was 48 (IQR 22-87), median gestational age was 28.7 (IQR 27.9-30.9) weeks. Eighteen trials used sequential techniques to determine sample size, while 3 used continual reassessment methods for dose-finding. In 16 studies reporting sufficient data, the sequential design allowed to non-significantly reduce the number of enrolled neonates by a median of 24 (31%) patients (IQR - 4.75 to 136.5, p = 0.0674) with respect to a traditional trial. When the number of neonates finally included in the analysis was considered, the difference became significant: 35 (57%) patients (IQR 10 to 136.5, p = 0.0033). Sequential trial designs have not been frequently used in Neonatology. They might potentially be able to reduce the number of patients in drug trials, although this is not always the case. What is known: • In evaluating rare diseases in fragile populations, traditional designs come at their limits. About 20% of pediatric trials are discontinued, mainly because of recruitment problems. What is new: • Sequential trials involving newborns were infrequently used and only a few (n = 21) are available for analysis. • The sequential design allowed to non-significantly reduce the number of enrolled neonates by a median of 24 (31%) patients (IQR - 4.75 to 136.5, p = 0.0674).

  10. Comparison of ablation centration after bilateral sequential versus simultaneous LASIK.

    Science.gov (United States)

    Lin, Jane-Ming; Tsai, Yi-Yu

    2005-01-01

    To compare ablation centration after bilateral sequential and simultaneous myopic LASIK. A retrospective randomized case series was performed of 670 eyes of 335 consecutive patients who had undergone either bilateral sequential (group 1) or simultaneous (group 2) myopic LASIK between July 2000 and July 2001 at the China Medical University Hospital, Taichung, Taiwan. The ablation centrations of the first and second eyes in the two groups were compared 3 months postoperatively. Of 670 eyes, 274 eyes (137 patients) comprised the sequential group and 396 eyes (198 patients) comprised the simultaneous group. Three months postoperatively, 220 eyes of 110 patients (80%) in the sequential group and 236 eyes of 118 patients (60%) in the simultaneous group provided topographic data for centration analysis. For the first eyes, mean decentration was 0.39 +/- 0.26 mm in the sequential group and 0.41 +/- 0.19 mm in the simultaneous group (P = .30). For the second eyes, mean decentration was 0.28 +/- 0.23 mm in the sequential group and 0.30 +/- 0.21 mm in the simultaneous group (P = .36). Decentration in the second eyes significantly improved in both groups (group 1, P = .02; group 2, P = …). Mean decentration was … in the sequential group and 0.32 +/- 0.18 mm in the simultaneous group (P = .33). The difference of ablation center angles between the first and second eyes was 43.2 +/- … degrees in the sequential group and 45.1 +/- 50.8 degrees in the simultaneous group (P = .42). Simultaneous bilateral LASIK is comparable to sequential surgery in ablation centration.

  11. A Survey of Multi-Objective Sequential Decision-Making

    NARCIS (Netherlands)

    Roijers, D.M.; Vamplew, P.; Whiteson, S.; Dazeley, R.

    2013-01-01

    Sequential decision-making problems with multiple objectives arise naturally in practice and pose unique challenges for research in decision-theoretic planning and learning, which has largely focused on single-objective settings. This article surveys algorithms designed for sequential decision-making problems with multiple objectives.

  12. Raison d’être of insulin resistance: the adjustable threshold hypothesis

    OpenAIRE

    Wang, Guanyu

    2014-01-01

    The epidemics of obesity and diabetes demand a deeper understanding of insulin resistance, for which the adjustable threshold hypothesis is formed in this paper. To test the hypothesis, mathematical modelling was used to analyse clinical data and to simulate biological processes at both molecular and organismal levels. I found that insulin resistance roots in the thresholds of the cell's bistable response. By assuming heterogeneity of the thresholds, single cells' all-or-none response can col...

  13. The picture superiority effect in conceptual implicit memory: a conceptual distinctiveness hypothesis.

    Science.gov (United States)

    Hamilton, Maryellen; Geraci, Lisa

    2006-01-01

    According to leading theories, the picture superiority effect is driven by conceptual processing, yet this effect has been difficult to obtain using conceptual implicit memory tests. We hypothesized that the picture superiority effect results from conceptual processing of a picture's distinctive features rather than a picture's semantic features. To test this hypothesis, we used 2 conceptual implicit general knowledge tests; one cued conceptually distinctive features (e.g., "What animal has large eyes?") and the other cued semantic features (e.g., "What animal is the figurehead of Tootsie Roll?"). Results showed a picture superiority effect only on the conceptual test using distinctive cues, supporting our hypothesis that this effect is mediated by conceptual processing of a picture's distinctive features.

  14. Sequential lineups: shift in criterion or decision strategy?

    Science.gov (United States)

    Gronlund, Scott D

    2004-04-01

    R. C. L. Lindsay and G. L. Wells (1985) argued that a sequential lineup enhanced discriminability because it elicited use of an absolute decision strategy. E. B. Ebbesen and H. D. Flowe (2002) argued that a sequential lineup led witnesses to adopt a more conservative response criterion, thereby affecting bias, not discriminability. Height was encoded as absolute (e.g., 6 ft [1.83 m] tall) or relative (e.g., taller than). If a sequential lineup elicited an absolute decision strategy, the principle of transfer-appropriate processing predicted that performance should be best when height was encoded absolutely. Conversely, if a simultaneous lineup elicited a relative decision strategy, performance should be best when height was encoded relatively. The predicted interaction was observed, providing direct evidence for the decision strategies explanation of what happens when witnesses view a sequential lineup.

  15. Risk-Based, Hypothesis-Driven Framework for Hydrological Field Campaigns with Case Studies

    Science.gov (United States)

    Harken, B.; Rubin, Y.

    2014-12-01

    There are several stages in any hydrological modeling campaign, including: formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration or plume travel time. These predictions often have significant bearing on a decision that must be made. Examples include: how to allocate limited remediation resources between contaminated groundwater sites or where to place a waste repository site. Answering such questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in EPM predictions stems from uncertainty in model parameters, which can be reduced by measurements taken in field campaigns. The costly nature of field measurements motivates a rational basis for determining a measurement strategy that is optimal with respect to the uncertainty in the EPM prediction. The tool of hypothesis testing allows this uncertainty to be quantified by computing the significance of the test resulting from a proposed field campaign. The significance of the test gives a rational basis for determining the optimality of a proposed field campaign. This hypothesis testing framework is demonstrated and discussed using various synthetic case studies. This study involves contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a specified location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical amount of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. The optimality of different field campaigns is assessed by computing the significance of the test resulting from each one.

  16. Spatial working memory for locations specified by vision and audition: testing the amodality hypothesis.

    Science.gov (United States)

    Loomis, Jack M; Klatzky, Roberta L; McHugh, Brendan; Giudice, Nicholas A

    2012-08-01

    Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.

  17. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests on financial data, which typically exhibit remote data points and additional types of deviations from normality.
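
    The zero-breakdown point is easy to demonstrate with the classical tests themselves (a sketch using SciPy's implementations of the Jarque-Bera and Shapiro-Wilk tests; the robust modifications proposed in the paper are not reproduced here):

        import numpy as np
        from scipy.stats import jarque_bera, shapiro

        rng = np.random.default_rng(4)
        returns = rng.standard_normal(250)               # ~one year of daily "returns"

        jb_clean = jarque_bera(returns)
        jb_dirty = jarque_bera(np.append(returns, 8.0))  # add a single remote point

        # The JB statistic is built from sample skewness and kurtosis, so a
        # single outlier can change the test decision entirely.
        print(f"JB clean:       stat={jb_clean.statistic:.2f}, p={jb_clean.pvalue:.3f}")
        print(f"JB one outlier: stat={jb_dirty.statistic:.2f}, p={jb_dirty.pvalue:.3f}")
        print(f"Shapiro-Wilk clean p: {shapiro(returns).pvalue:.3f}")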

  18. Hypothesis testing for differentially correlated features.

    Science.gov (United States)

    Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua

    2016-10-01

    In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
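
    A minimal sketch of the column-comparison idea is given below (our illustration under simplifying assumptions: Fisher z-transformed correlation columns compared across conditions with a label-permutation null; the iterative remove-one-feature-at-a-time step of the published method is omitted):

        import numpy as np

        rng = np.random.default_rng(5)

        def column_shift_stats(X1, X2):
            # Per-feature statistic: squared difference of the feature's
            # Fisher-z correlations with all other features, summed over partners.
            c1 = np.corrcoef(X1, rowvar=False)
            c2 = np.corrcoef(X2, rowvar=False)
            np.fill_diagonal(c1, 0.0)                 # ignore self-correlations
            np.fill_diagonal(c2, 0.0)
            d = np.arctanh(c1) - np.arctanh(c2)
            return (d ** 2).sum(axis=0)

        # Toy data: feature 0 changes its correlation structure in condition 2.
        p, n = 10, 200
        X1 = rng.standard_normal((n, p))
        X2 = rng.standard_normal((n, p))
        X2[:, 0] = 0.8 * X2[:, 1] + 0.6 * rng.standard_normal(n)

        obs = column_shift_stats(X1, X2)

        # Permutation null: shuffle condition labels and recompute.
        pooled = np.vstack([X1, X2])
        null = np.empty((500, p))
        for b in range(500):
            idx = rng.permutation(2 * n)
            null[b] = column_shift_stats(pooled[idx[:n]], pooled[idx[n:]])
        pvals = (1 + (null >= obs).sum(axis=0)) / 501.0
        print("feature-wise p-values:", np.round(pvals, 3))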

  19. Cognitive differences between orang-utan species: a test of the cultural intelligence hypothesis.

    Science.gov (United States)

    Forss, Sofia I F; Willems, Erik; Call, Josep; van Schaik, Carel P

    2016-07-28

    Cultural species can - or even prefer to - learn their skills from conspecifics. According to the cultural intelligence hypothesis, selection on underlying mechanisms not only improves this social learning ability but also the asocial (individual) learning ability. Thus, species with systematically richer opportunities to socially acquire knowledge and skills should over time evolve to become more intelligent. We experimentally compared the problem-solving ability of Sumatran orang-utans (Pongo abelii), which are sociable in the wild, with that of the closely related, but more solitary Bornean orang-utans (P. pygmaeus), under the homogeneous environmental conditions provided by zoos. Our results revealed that Sumatrans showed superior innate problem-solving skills to Borneans, and also showed greater inhibition and a more cautious and less rough exploration style. This pattern is consistent with the cultural intelligence hypothesis, which predicts that the more sociable of two sister species experienced stronger selection on cognitive mechanisms underlying learning.

  20. FADTTSter: accelerating hypothesis testing with functional analysis of diffusion tensor tract statistics

    Science.gov (United States)

    Noel, Jean; Prieto, Juan C.; Styner, Martin

    2017-03-01

    Functional Analysis of Diffusion Tensor Tract Statistics (FADTTS) is a toolbox for analysis of white matter (WM) fiber tracts. It allows associating diffusion properties along major WM bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these WM tract properties. However, to use this toolbox, a user must have an intermediate knowledge in scripting languages (MATLAB). FADTTSter was created to overcome this issue and make the statistical analysis accessible to any non-technical researcher. FADTTSter is actively being used by researchers at the University of North Carolina. FADTTSter guides non-technical users through a series of steps including quality control of subjects and fibers in order to setup the necessary parameters to run FADTTS. Additionally, FADTTSter implements interactive charts for FADTTS' outputs. This interactive chart enhances the researcher experience and facilitates the analysis of the results. FADTTSter's motivation is to improve usability and provide a new analysis tool to the community that complements FADTTS. Ultimately, by making FADTTS accessible to a broader audience, FADTTSter seeks to accelerate hypothesis testing in neuroimaging studies involving heterogeneous clinical data and diffusion tensor imaging. This work is submitted to the Biomedical Applications in Molecular, Structural, and Functional Imaging conference. The source code of this application is available in NITRC.

  1. Testing the Binary Hypothesis: Pulsar Timing Constraints on Supermassive Black Hole Binary Candidates

    Science.gov (United States)

    Sesana, Alberto; Haiman, Zoltán; Kocsis, Bence; Kelley, Luke Zoltan

    2018-03-01

    The advent of time domain astronomy is revolutionizing our understanding of the universe. Programs such as the Catalina Real-time Transient Survey (CRTS) or the Palomar Transient Factory (PTF) surveyed millions of objects for several years, allowing variability studies on large statistical samples. The inspection of ≈250 k quasars in CRTS resulted in a catalog of 111 potentially periodic sources, put forward as supermassive black hole binary (SMBHB) candidates. A similar investigation on PTF data yielded 33 candidates from a sample of ≈35 k quasars. Working under the SMBHB hypothesis, we compute the implied SMBHB merger rate and we use it to construct the expected gravitational wave background (GWB) at nano-Hz frequencies, probed by pulsar timing arrays (PTAs). After correcting for incompleteness and assuming virial mass estimates, we find that the GWB implied by the CRTS sample exceeds the current most stringent PTA upper limits by almost an order of magnitude. After further correcting for the implicit bias in virial mass measurements, the implied GWB drops significantly but is still in tension with the most stringent PTA upper limits. Similar results hold for the PTF sample. Bayesian model selection shows that the null hypothesis (whereby the candidates are false positives) is preferred over the binary hypothesis at about 2.3σ and 3.6σ for the CRTS and PTF samples respectively. Although not decisive, our analysis highlights the potential of PTAs as astrophysical probes of individual SMBHB candidates and indicates that the CRTS and PTF samples are likely contaminated by several false positives.

  2. How to Read the Tractatus Sequentially

    Directory of Open Access Journals (Sweden)

    Tim Kraft

    2016-11-01

    Full Text Available One of the unconventional features of Wittgenstein's Tractatus Logico-Philosophicus is its use of an elaborate and detailed numbering system. Recently, Bazzocchi, Hacker and Kuusela have argued that the numbering system means that the Tractatus must be read and interpreted not as a sequentially ordered book, but as a text with a two-dimensional, tree-like structure. Apart from being able to explain how the Tractatus was composed, the tree reading allegedly solves exegetical issues both on the local level (e.g. how 4.02 fits into the series of remarks surrounding it) and the global level (e.g. the relation between ontology and picture theory, solipsism and the eye analogy, resolute and irresolute readings). This paper defends the sequential reading against the tree reading. After presenting the challenges generated by the numbering system and the two accounts as attempts to solve them, it is argued that Wittgenstein's own explanation of the numbering system, anaphoric references within the Tractatus and the exegetical issues mentioned above do not favour the tree reading, but a version of the sequential reading. This reading maintains that the remarks of the Tractatus form a sequential chain: the role of the numbers is to indicate how remarks on different levels are interconnected to form a concise, surveyable and unified whole.

  3. The evolution of polyandry: patterns of genotypic variation in female mating frequency, male fertilization success and a test of the sexy-sperm hypothesis.

    Science.gov (United States)

    Simmons, L W

    2003-07-01

    The sexy-sperm hypothesis predicts that females obtain indirect benefits for their offspring via polyandry, in the form of increased fertilization success for their sons. I use a quantitative genetic approach to test the sexy-sperm hypothesis using the field cricket Teleogryllus oceanicus. Previous studies of this species have shown considerable phenotypic variation in fertilization success when two or more males compete. There were high broad-sense heritabilities for both paternity and polyandry. Patterns of genotypic variance were consistent with X-linked inheritance and/or maternal effects on these traits. The genetic architecture therefore precludes the evolution of polyandry via a sexy-sperm process. Thus the positive genetic correlation between paternity in sons and polyandry in daughters predicted by the sexy-sperm hypothesis was absent. There was significant heritable variation in the investment by females in ovaries and by males in the accessory gland. Surprisingly there was a very strong genetic correlation between these two traits. The significance of this genetic correlation for the coevolution of male seminal products and polyandry is discussed.

  4. Multichannel, sequential or combined X-ray spectrometry

    International Nuclear Information System (INIS)

    Florestan, J.

    1979-01-01

    X-ray spectrometer qualities and defects are evaluated for the sequential and multichannel categories. The multichannel X-ray spectrometer has a time-coherency advantage and its results can be more reproducible; on the other hand, some spatial incoherency limits low-percentage and trace applications, especially when backgrounds are very variable. In this last case, the sequential X-ray spectrometer would again prove very useful. [fr]

  5. A comparator-hypothesis account of biased contingency detection.

    Science.gov (United States)

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
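
    To make the modelling comparison concrete, here is a minimal Rescorla-Wagner-style simulation of the null-contingency, high-outcome-density design discussed above (our sketch with hypothetical learning-rate parameters; the Comparator Hypothesis would derive responding from the comparison between the cue's direct association and the context-mediated one, rather than from V(cue) alone):

        import numpy as np

        rng = np.random.default_rng(6)
        alpha_cue, alpha_ctx, lam = 0.3, 0.1, 1.0
        v_cue, v_ctx = 0.0, 0.0
        hist = []

        # Null contingency: P(outcome | cue) = P(outcome | no cue) = 0.8,
        # i.e. Delta-P = 0 but a high outcome density.
        for _ in range(200):
            cue = rng.random() < 0.5
            outcome = rng.random() < 0.8
            pred = v_ctx + (v_cue if cue else 0.0)
            error = (lam if outcome else 0.0) - pred
            v_ctx += alpha_ctx * error
            if cue:
                v_cue += alpha_cue * error
            hist.append(v_cue)

        # Despite zero contingency, the cue transiently acquires positive
        # strength because coincidences with the frequent outcome dominate early.
        print(f"V(cue) after 40 trials:  {hist[39]:.3f}")
        print(f"V(cue) after 200 trials: {hist[199]:.3f}")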

  6. Induction of simultaneous and sequential malolactic fermentation in durian wine.

    Science.gov (United States)

    Taniasuri, Fransisca; Lee, Pin-Rou; Liu, Shao-Quan

    2016-08-02

    This study reports, for the first time, the impact of malolactic fermentation (MLF) induced by Oenococcus oeni and its inoculation strategies (simultaneous vs. sequential) on the fermentation performance as well as the aroma compound profile of durian wine. There was no negative impact of simultaneous inoculation of O. oeni and Saccharomyces cerevisiae on the growth and fermentation kinetics of S. cerevisiae as compared to sequential fermentation. Simultaneous MLF did not lead to an excessive increase in volatile acidity as compared to sequential MLF. The kinetic changes of organic acids (i.e. malic, lactic, succinic, acetic and α-ketoglutaric acids) varied with simultaneous and sequential MLF relative to yeast alone. MLF, regardless of inoculation mode, resulted in higher production of fermentation-derived volatiles as compared to control (alcoholic fermentation only), including esters, volatile fatty acids, and terpenes, except for higher alcohols. Most indigenous volatile sulphur compounds in durian were decreased to trace levels with little differences among the control, simultaneous and sequential MLF. Among the different wines, the wine with simultaneous MLF had higher concentrations of terpenes and acetate esters while sequential MLF had increased concentrations of medium- and long-chain ethyl esters. Relative to alcoholic fermentation only, both simultaneous and sequential MLF reduced acetaldehyde substantially with sequential MLF being more effective. These findings illustrate that MLF is an effective and novel way of modulating the volatile and aroma compound profile of durian wine. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression, …

  8. Sequential Banking.

    OpenAIRE

    Bizer, David S; DeMarzo, Peter M

    1992-01-01

    The authors study environments in which agents may borrow sequentially from more than one lender. Although debt is prioritized, additional lending imposes an externality on prior debt because, with moral hazard, the probability of repayment of prior loans decreases. Equilibrium interest rates are higher than they would be if borrowers could commit to borrow from at most one bank. Even though the loan terms are less favorable than they would be under commitment, the indebtedness of borrowers i...

  9. Equivalence between quantum simultaneous games and quantum sequential games

    OpenAIRE

    Kobayashi, Naoki

    2007-01-01

    A framework for discussing relationships between different types of games is proposed. Within the framework, quantum simultaneous games, finite quantum simultaneous games, quantum sequential games, and finite quantum sequential games are defined. In addition, a notion of equivalence between two games is defined. Finally, the following three theorems are shown: (1) For any quantum simultaneous game G, there exists a quantum sequential game equivalent to G. (2) For any finite quantum simultaneo...

  10. Accounting for Heterogeneous Returns in Sequential Schooling Decisions

    NARCIS (Netherlands)

    Zamarro, G.

    2006-01-01

    This paper presents a method for estimating returns to schooling that takes into account that returns may be heterogeneous among agents and that educational decisions are made sequentially. A sequential decision model is interesting because it explicitly considers that the level of education of each…

  11. Simultaneous Versus Sequential Ptosis and Strabismus Surgery in Children.

    Science.gov (United States)

    Revere, Karen E; Binenbaum, Gil; Li, Jonathan; Mills, Monte D; Katowitz, William R; Katowitz, James A

    The authors sought to compare the clinical outcomes of simultaneous versus sequential ptosis and strabismus surgery in children. Retrospective, single-center cohort study of children requiring both ptosis and strabismus surgery on the same eye. Simultaneous surgeries were performed during a single anesthetic event; sequential surgeries were performed at least 7 weeks apart. Outcomes were ptosis surgery success (margin reflex distance 1 ≥ 2 mm, good eyelid contour, and good eyelid crease); strabismus surgery success (ocular alignment within 10 prism diopters of orthophoria and/or improved head position); surgical complications; and reoperations. Fifty-six children were studied, 38 had simultaneous surgery and 18 sequential. Strabismus surgery was performed first in 38/38 simultaneous and 6/18 sequential cases. Mean age at first surgery was 64 months, with mean follow up 27 months. A total of 75% of children had congenital ptosis; 64% had comitant strabismus. A majority of ptosis surgeries were frontalis sling (59%) or Fasanella-Servat (30%) procedures. There were no significant differences between simultaneous and sequential groups with regards to surgical success rates, complications, or reoperations (all p > 0.28). In the first comparative study of simultaneous versus sequential ptosis and strabismus surgery, no advantage for sequential surgery was seen. Despite a theoretical risk of postoperative eyelid malposition or complications when surgeries were performed in a combined manner, the rate of such outcomes was not increased with simultaneous surgeries. Performing ptosis and strabismus surgery together appears to be clinically effective and safe, and reduces anesthesia exposure during childhood.

  12. Water developments and canids in two North American deserts: a test of the indirect effect of water hypothesis.

    Directory of Open Access Journals (Sweden)

    Lucas K Hall

    Full Text Available Anthropogenic modifications to landscapes intended to benefit wildlife may negatively influence wildlife communities. Anthropogenic provisioning of free water (water developments) to enhance abundance and distribution of wildlife is a common management practice in arid regions where water is limiting. Despite the long-term and widespread use of water developments, little is known about how they influence native species. Water developments may negatively influence arid-adapted species (e.g., kit fox, Vulpes macrotis) by enabling water-dependent competitors (e.g., coyote, Canis latrans) to expand distribution in arid landscapes (i.e., the indirect effect of water hypothesis). We tested the two predictions of the indirect effect of water hypothesis (i.e., coyotes will visit areas with free water more frequently, and kit foxes will spatially and temporally avoid coyotes) and evaluated relative use of free water by canids in the Great Basin and Mojave Deserts from 2010 to 2012. We established scent stations in areas with (wet) and without (dry) free water and monitored visitation by canids to these sites and visitation to water sources using infrared-triggered cameras. There was no difference in the proportions of visits to scent stations in wet or dry areas by coyotes or kit foxes at either study area. We did not detect spatial (no negative correlation between visits to scent stations) or temporal (no difference between times when stations were visited) segregation between coyotes and kit foxes. Visitation to water sources was not different for coyotes between study areas, but kit foxes visited water sources more in the Mojave than the Great Basin. Our results did not support the indirect effect of water hypothesis in the Great Basin or Mojave Deserts for these two canids.

  13. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis

    In this thesis we describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon codes with non-uniform profile. With this scheme, decoding with good performance is possible as low as Eb/No=0.6 dB, which is about 1.7 dB below the signal-to-noise ratio that marks the cut-off rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first Reed-Solomon word is decoded after C computations are presented. This is supported by simulation results that are also extended to other parameters.

  14. Conflict of interest between a nematode and a trematode in an amphipod host: Test of the "sabotage" hypothesis

    Science.gov (United States)

    Thomas, Frédéric; Fauchier, Jerome; Lafferty, Kevin D.

    2002-01-01

    Microphallus papillorobustus is a manipulative trematode that induces strong behavioural alterations in the gammaridean amphipod Gammarus insensibilis, making the amphipod more vulnerable to predation by aquatic birds (definitive hosts). Conversely, the sympatric nematode Gammarinema gammari uses Gammarus insensibilis as a habitat and a source of nutrition. We investigated the conflict of interest between these two parasite species by studying the consequences of mixed infection on amphipod behaviour associated with the trematode. In the field, some amphipods infected by the trematode did not display the altered behaviour. These normal amphipods also had more nematodes, suggesting that the nematode overpowered the manipulation of the trematode, a strategy that would prolong the nematode's life. We hypothesize that sabotage of the trematode by the nematode would be an adaptive strategy for the nematode consistent with recent speculation about co-operation and conflict in manipulative parasites. A behavioural test conducted in the laboratory from naturally infected amphipods yielded the same result. However, exposing amphipods to nematodes did not negate or decrease the manipulation exerted by the trematode. Similarly, experimental elimination of nematodes from amphipods did not permit trematodes to manipulate behaviour. These experimental data do not support the hypothesis that the negative association between nematodes and manipulation by the trematode is a result of the "sabotage" hypothesis.

  15. A test of the submentalizing hypothesis: Apes' performance in a false belief task inanimate control

    Science.gov (United States)

    Hirata, Satoshi; Call, Josep; Tomasello, Michael

    2017-01-01

    Much debate concerns whether any nonhuman animals share with humans the ability to infer others' mental states, such as desires and beliefs. In a recent eye-tracking false-belief task, we showed that great apes correctly anticipated that a human actor would search for a goal object where he had last seen it, even though the apes themselves knew that it was no longer there. In response, Heyes proposed that apes' looking behavior was guided not by social cognitive mechanisms but rather by domain-general cueing effects, and suggested the use of inanimate controls to test this alternative submentalizing hypothesis. In the present study, we implemented the suggested inanimate control of our previous false-belief task. Apes attended well to key events but showed markedly fewer anticipatory looks and no significant tendency to look to the correct location. We thus found no evidence that submentalizing was responsible for apes' anticipatory looks in our false-belief task. PMID:28919941

  16. Reading Remediation Based on Sequential and Simultaneous Processing.

    Science.gov (United States)

    Gunnison, Judy; And Others

    1982-01-01

    The theory postulating a dichotomy between sequential and simultaneous processing is reviewed and its implications for remediating reading problems are discussed. Research is cited on sequential-simultaneous processing for early and advanced reading. A list of remedial strategies based on the processing dichotomy addresses decoding and lexical…

  17. [Optimization and Prognosis of Cell Radiosensitivity Enhancement in vitro and in vivo after Sequential Thermoradiactive Action].

    Science.gov (United States)

    Belkina, S V; Petin, V G

    2016-01-01

    A previously developed mathematical model of the simultaneous action of two inactivating agents has been adapted and tested to describe the results of sequential action. The possibility of applying the mathematical model to the interpretation and prognosis of the increase in radiosensitivity of tumor cells, as well as mammalian cells, after the sequential action of two high temperatures or of hyperthermia and ionizing radiation is analyzed. The model predicts the value of the thermal enhancement ratio depending on the duration of thermal exposure, its greatest value, and the condition under which it is achieved.

  18. Fast-responding liquid crystal light-valve technology for color-sequential display applications

    Science.gov (United States)

    Janssen, Peter J.; Konovalov, Victor A.; Muravski, Anatoli A.; Yakovenko, Sergei Y.

    1996-04-01

    A color sequential projection system has some distinct advantages over conventional systems which make it uniquely suitable for consumer TV as well as high-performance professional applications such as computer monitors and electronic cinema. A fast-responding light-valve is clearly essential for a well-performing system. The response speed of transmissive LC light-valves has thus far been marginal for good color rendition. Recently, Sevchenko Institute has made some very fast reflective LC cells, which were evaluated at Philips Labs. These devices showed sub-millisecond large-signal response times, even at room temperature, and produced good color in a projector emulation testbed. In our presentation we describe our highly efficient color sequential projector and demonstrate its operation on video tape. Next we discuss light-valve requirements and reflective light-valve test results.

  19. Detection of small leakage combining dedicated Kalman filters and an extended version of the binary SPRT

    International Nuclear Information System (INIS)

    Racz, A.

    1991-12-01

    A new method is outlined for detection of soft reactor failures. The procedure is applicable when the failure can be described by an additive term (failure vector) in the measurement process of an observable dynamic system. A dedicated Kalman filter generates the innovation process for further testing. The innovation is investigated by a sequential hypothesis testing method. In order to avoid the computational difficulties related to sophisticated multiple hypothesis testing methods, an extended version of Wald's classical binary Sequential Probability Ratio Test (SPRT) has been developed. The procedure is applied to the problem of (small) leakage detection in the feedwater system of nuclear power plants. Computer simulation results show that the method can reliably recognize less than 1% relative water loss. (author) 10 refs.; 5 figs.; 5 tabs
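
    The classical binary SPRT that the extension builds on is compact enough to sketch (a generic illustration on Gaussian innovations with hypothetical shift and error-rate parameters, not the paper's extended multi-outcome version):

        import numpy as np

        def sprt_mean_shift(innovations, mu1, sigma, alpha=0.01, beta=0.01):
            # Wald's binary SPRT on a (Kalman-filter) innovation sequence.
            # H0: innovations ~ N(0, sigma^2)    (no failure)
            # H1: innovations ~ N(mu1, sigma^2)  (additive failure term, e.g. leakage)
            lower = np.log(beta / (1 - alpha))    # accept-H0 boundary
            upper = np.log((1 - beta) / alpha)    # accept-H1 boundary
            llr = 0.0
            for k, e in enumerate(innovations, start=1):
                # Log-likelihood ratio increment for one Gaussian observation.
                llr += (mu1 / sigma**2) * (e - mu1 / 2.0)
                if llr <= lower:
                    return "H0", k
                if llr >= upper:
                    return "H1", k
            return "undecided", len(innovations)

        rng = np.random.default_rng(7)
        print(sprt_mean_shift(rng.normal(0.0, 1.0, 500), mu1=0.4, sigma=1.0))
        print(sprt_mean_shift(rng.normal(0.4, 1.0, 500), mu1=0.4, sigma=1.0))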

  20. DYNAMIC ANALYSIS OF THE BULK TRITIUM SHIPPING PACKAGE SUBJECTED TO CLOSURE TORQUES AND SEQUENTIAL IMPACTS

    International Nuclear Information System (INIS)

    Wu, T; Paul Blanton, P; Kurt Eberl, K

    2007-01-01

    This paper presents a finite-element technique to simulate the structural responses and to evaluate the cumulative damage of a radioactive material packaging requiring bolt closure-tightening torque and subjected to the scenarios of the Hypothetical Accident Conditions (HAC) defined in the Code of Federal Regulations Title 10 Part 71 (10CFR71). Existing finite-element methods for modeling closure stresses from bolt pre-load are not readily adaptable to dynamic analyses. The HAC events are required to occur sequentially per 10CFR71, and thus the evaluation of cumulative damage is desirable. Generally, each HAC event is analyzed separately and the cumulative damage is only partially addressed by superposition. This results in relying on additional physical testing to comply with 10CFR71 requirements for assessment of cumulative damage. The proposed technique utilizes the combination of kinematic constraints, rigid-body motions and structural deformations to overcome some of the difficulties encountered in modeling the effect of cumulative damage. This methodology provides improved numerical solutions in compliance with the 10CFR71 requirements for sequential HAC tests. Analyses were performed for the Bulk Tritium Shipping Package (BTSP) designed by Savannah River National Laboratory to demonstrate the applications of the technique. The methodology proposed simulates the closure bolt torque preload followed by the sequential HAC events, the 30-foot drop and the 30-foot dynamic crush. The analytical results will be compared to the package test data.