WorldWideScience

Sample records for plausible dynamical assumptions

  1. Plausible cloth animation using dynamic bending model

    Institute of Scientific and Technical Information of China (English)

    Chuan Zhou; Xiaogang Jin; Charlie C.L. Wang; Jieqing Feng

    2008-01-01

    Simulating the mechanical behavior of cloth is a challenging and important problem in computer animation. The bending models in most existing cloth simulation approaches assume that the cloth deviates little from a plate shape. Being based on thin-plate theory, these bending models therefore neglect the fact that, under large deformations, the current shape of the cloth can no longer be regarded as an approximation to its undeformed shape, which leads to unrealistic static bending. This paper introduces a dynamic bending model appropriate for describing large out-of-plane deformations such as cloth buckling and bending, and develops a compact implementation of the new model on spring-mass systems. Experimental results show that wrinkles and folds generated with this technique in cloth simulation appear and vanish in a more natural way than with other approaches.

  2. Assessing the Sensitivity of a Reservoir Management System Under Plausible Assumptions About Future Climate Over Seasons to Decades

    Science.gov (United States)

    Ward, M. N.; Brown, C. M.; Baroang, K. M.; Kaheil, Y. H.

    2011-12-01

    We illustrate an analysis procedure that explores the robustness and overall productivity of a reservoir management system under plausible assumptions about climate fluctuation and change. Results are presented based on a stylized version of a multi-use reservoir management model adapted from Angat Dam, Philippines. It represents a modest-sized seasonal storage reservoir in a climate with a pronounced dry season. The reservoir management model focuses on October-March, during which climatological inflow declines due to the arrival of the dry season, and reservoir management becomes critical and challenging. Inflow is assumed to be impacted by climate fluctuations representing interannual variation (white noise), decadal to multidecadal variation (MDV, here represented by a stochastic autoregressive process) and global change (GC), here represented by a systematic linear trend in seasonal inflow total over the simulation period of 2008-2047. Reservoir reliability, and risk of extreme persistent water shortfall, is assessed under different combinations and magnitudes of GC and MDV. We include an illustration of adaptive management, using seasonal forecasts and updated climate normals. A set of seasonal forecast and observed inflow values are generated for 2008-2047 by randomly rearranging the forecast-observed pairs for 1968-2007. Then, trends are imposed on the observed series, with differing assumptions about the extent to which the seasonal forecasts can be expected to track the trend. We consider the framework presented here well-suited to providing insights about managing the climate risks in reservoir operations, providing guidance on expected benefits and risks of different strategies and climate scenarios.
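
    A minimal sketch of the kind of inflow generator the abstract describes, combining interannual white noise, an AR(1) stand-in for MDV, and a linear GC trend; all function names and parameter values are illustrative assumptions, not the study's calibrated model:

      import numpy as np

      def synthetic_inflow(years=40, mean=100.0, sd_noise=10.0,
                           ar_coef=0.9, sd_mdv=4.0, trend=-0.5, seed=0):
          """Toy seasonal inflow series: white noise + AR(1) decadal
          variation (MDV) + linear global-change (GC) trend."""
          rng = np.random.default_rng(seed)
          mdv = np.zeros(years)
          for t in range(1, years):          # slow stochastic AR(1) component
              mdv[t] = ar_coef * mdv[t - 1] + rng.normal(0.0, sd_mdv)
          noise = rng.normal(0.0, sd_noise, years)   # interannual variation
          gc = trend * np.arange(years)              # systematic GC trend
          return mean + mdv + noise + gc

      inflow = synthetic_inflow()   # e.g. Oct-Mar totals for 2008-2047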

  3. Developmental dynamics: toward a biologically plausible evolutionary psychology.

    Science.gov (United States)

    Lickliter, Robert; Honeycutt, Hunter

    2003-11-01

    There has been a conceptual revolution in the biological sciences over the past several decades. Evidence from genetics, embryology, and developmental biology has converged to offer a more epigenetic, contingent, and dynamic view of how organisms develop. Despite these advances, arguments for the heuristic value of a gene-centered, predeterministic approach to the study of human behavior and development have become increasingly evident in the psychological sciences during this time. In this article, the authors review recent advances in genetics, embryology, and developmental biology that have transformed contemporary developmental and evolutionary theory and explore how these advances challenge gene-centered explanations of human behavior that ignore the complex, highly coordinated system of regulatory dynamics involved in development and evolution.

  4. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.

    Science.gov (United States)

    Miconi, Thomas

    2017-02-23

    Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial. Networks endowed with this learning rule can successfully learn nontrivial tasks requiring flexible (context-dependent) associations, memory maintenance, nonlinear mixed selectivities, and coordination among multiple outputs. The resulting networks replicate complex dynamics previously observed in animal cortex, such as dynamic encoding of task features and selective integration of sensory inputs. We conclude that recurrent neural networks offer a plausible model of cortical dynamics during both learning and performance of flexible behavior.
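
    The learning rule itself is not given in the record; below is a minimal sketch of a reward-modulated, perturbation-based update of the same family (an eligibility trace accumulated during the trial, weights changed only by the delayed end-of-trial reward against a running baseline). All dynamics and parameters are illustrative assumptions, not Miconi's exact rule:

      import numpy as np

      rng = np.random.default_rng(1)
      N = 100
      W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights
      R_mean = 0.0                                    # running reward baseline

      for trial in range(200):
          x = rng.normal(0.0, 0.1, N)                 # initial network state
          elig = np.zeros_like(W)                     # eligibility trace
          for t in range(100):
              pert = rng.normal(0.0, 0.5, N)          # exploratory perturbation
              r = np.tanh(W @ x)
              x = r + pert
              elig += np.outer(pert, r)               # perturbation x presynaptic rate
          R = -np.sum((x - 0.5) ** 2)                 # delayed, end-of-trial reward
          W += 1e-4 * elig * (R - R_mean)             # reward-modulated update
          R_mean += 0.05 * (R - R_mean)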

  5. Of paradox and plausibility: the dynamic of change in medical law.

    Science.gov (United States)

    Harrington, John

    2014-01-01

    This article develops a model of change in medical law. Drawing on systems theory, it argues that medical law participates in a dynamic of 'deparadoxification' and 'reparadoxification' whereby the underlying contingency of the law is variously concealed through plausible argumentation, or revealed by critical challenge. Medical law is, thus, thoroughly rhetorical. An examination of the development of the law on abortion and on the sterilization of incompetent adults shows that plausibility is achieved through the deployment of substantive common sense and formal stylistic devices. It is undermined where these elements are shown to be arbitrary and constructed. In conclusion, it is argued that the politics of medical law are constituted by this antagonistic process of establishing and challenging provisionally stable normative regimes.

  6. Exploring gravitational statistics not based on quantum dynamical assumptions

    CERN Document Server

    Mandrin, P A

    2016-01-01

    Despite considerable progress in several approaches to quantum gravity, there remain uncertainties on the conceptual level. One issue concerns the different roles played by space and time in the canonical quantum formalism. This issue occurs because the Hamilton-Jacobi dynamics is being quantised. The question then arises whether additional physically relevant states could exist which cannot be represented in the canonical form or as a partition function. For this reason, the author has explored a statistical approach (NDA) which is not based on quantum dynamical assumptions and does not require space-time splitting boundary conditions either. For dimension 3+1 and under thermal equilibrium, NDA simplifies to a path integral model. However, the general case of NDA cannot be written as a partition function. As a test of NDA, one recovers general relativity at low curvature and quantum field theory in the flat space-time approximation. Related paper: arxiv:1505.03719.

  7. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-02-14

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model.
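
    As a toy illustration of the underlying (unauthenticated) group Diffie-Hellman primitive, the sketch below shows three principals arriving at the same shared secret g^(x0*x1*x2) mod p, which is order-independent because the private exponents multiply. The parameters are toy values for illustration only; real deployments need a proper safe prime and the authentication layer the paper actually studies:

      import secrets

      p = 2**127 - 1     # a Mersenne prime; NOT a cryptographic choice
      g = 3
      x = [secrets.randbelow(p - 2) + 1 for _ in range(3)]   # private exponents

      def group_secret(order):
          v = g
          for i in order:
              v = pow(v, x[i], p)    # each member exponentiates in turn
          return v

      # All members compute the same secret regardless of processing order.
      assert group_secret([0, 1, 2]) == group_secret([2, 0, 1])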

  8. Modelling the dynamics of reasoning processes: reasoning by assumption

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    2008-01-01

    To model the dynamics of cognitive processes, often the Dynamical Systems Theory (DST) is advocated. However, for higher cognitive processes such as reasoning and certain forms of natural language processing the techniques adopted within DST are not very adequate. This paper shows how an analysis of

  9. Charting plausible futures for diabetes prevalence in the United States: a role for system dynamics simulation modeling.

    Science.gov (United States)

    Milstein, Bobby; Jones, Andrew; Homer, Jack B; Murphy, Dara; Essien, Joyce; Seville, Don

    2007-07-01

    Healthy People 2010 (HP 2010) objectives call for a 38% reduction in the prevalence of diagnosed diabetes mellitus, type 1 and type 2, by the year 2010. The process for setting this objective, however, did not focus on the achievability or the compatibility of this objective with other national public health objectives. We used a dynamic simulation model to explore plausible trajectories for diabetes prevalence in the wake of rising levels of obesity in the U.S. population. The model helps to interpret historic trends in diabetes prevalence in the United States and to anticipate plausible future trends through 2010. We conducted simulation experiments using a computer model of diabetes population dynamics to 1) track the rates at which people develop diabetes, are diagnosed with the disease, and die, and 2) assess the effects of various preventive-care interventions. System dynamics modeling methodology based on data from multiple sources guided the analyses. With the number of new cases of diabetes being much greater than the number of deaths among those with the disease, the prevalence of diagnosed diabetes in the United States is likely to continue to increase. Even a 29% reduction in the number of new cases (the HP 2010 objective) would only slow the growth, not reverse it. Increased diabetes detection rates or decreased mortality rates--also HP 2010 objectives--would further increase diagnosed prevalence. The HP 2010 objective for reducing diabetes prevalence is unattainable given the historical processes that are affecting incidence, diagnosis, and mortality, and even a zero-growth future is unlikely. System dynamics modeling shows why interventions to protect against chronic diseases have only gradual effects on their diagnosed prevalence.
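
    The core stock-and-flow logic can be sketched in a few lines; the numbers below are illustrative placeholders, not the calibrated values of the published model:

      # Minimal stock-and-flow sketch of diagnosed-prevalence dynamics:
      # the prevalence stock grows while inflow (new diagnosed cases)
      # exceeds outflow (deaths among the diagnosed).
      prevalence = 12.0      # million people with diagnosed diabetes
      new_cases = 1.4        # million newly diagnosed per year
      death_rate = 0.035     # fraction of the diagnosed dying per year

      for year in range(2007, 2011):
          deaths = death_rate * prevalence
          prevalence += new_cases - deaths
          print(year, round(prevalence, 2))

      # Even cutting new_cases by 29% leaves new_cases > deaths here,
      # so prevalence keeps rising: growth slows but does not reverse.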

  10. Continuous-discrete model of parasite-host system dynamics: Trigger regime at simplest assumptions

    Directory of Open Access Journals (Sweden)

    L. V. Nedorezov

    2014-09-01

    In this paper a continuous-discrete model of parasite-host system dynamics is analyzed. Within the framework of the model it is assumed that the appearance of individuals of new generations of both populations occurs at fixed time moments tk = hk, t0 = 0, k = 1, 2, ..., h = const > 0; this means that several processes are compressed together: the production of eggs by hosts, the attack of eggs by parasites (with the respective transformation of host eggs into parasite eggs), the stay of hosts and parasites in the "egg" phase, and the appearance of new individuals. It is also assumed that the death process of individuals is continuous in nature, and that both populations develop independently between the fixed time moments. The dynamic regimes of the model are analyzed. In particular, it is shown that, under the simplest assumptions about the birth process in the host population and the number of attacked hosts, a regime with two non-trivial stable attractors in the phase space of the system can be realized.
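
    A minimal sketch of such a continuous-discrete iteration, assuming exponential death within each interval of length h and a Nicholson-Bailey-style escape term at the pulse times; the functional forms and parameters are illustrative assumptions, since the record does not give the model equations:

      import math

      h, mx, my = 1.0, 0.3, 0.8   # step length, host and parasite death rates
      b, a = 3.0, 0.5             # host births per adult, parasite search rate

      def step(x, y):
          x_s = x * math.exp(-mx * h)     # hosts surviving the interval
          y_s = y * math.exp(-my * h)     # parasites surviving the interval
          esc = math.exp(-a * y_s)        # fraction of eggs escaping attack
          eggs = b * x_s
          return eggs * esc, eggs * (1.0 - esc)   # new hosts, new parasites

      x, y = 1.0, 0.1
      for k in range(50):                 # iterate over pulse times tk = hk
          x, y = step(x, y)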

  11. Looking for plausibility

    CERN Document Server

    Abdullah, Wan Ahmad Tajuddin Wan

    2010-01-01

    In the interpretation of experimental data, one is actually looking for plausible explanations. We look for a measure of plausibility, with which we can compare different possible explanations, and which can be combined when there are different sets of data. This is contrasted to the conventional measure for probabilities as well as to the proposed measure of possibilities. We define what characteristics this measure of plausibility should have. In getting to the conception of this measure, we explore the relation of plausibility to abductive reasoning, and to Bayesian probabilities. We also compare with the Dempster-Shafer theory of evidence, which also has its own definition for plausibility. Abduction can be associated with biconditionality in inference rules, and this provides a platform to relate to the Collins-Michalski theory of plausibility. Finally, using a formalism for wiring logic onto Hopfield neural networks, we ask if this is relevant in obtaining this measure.

  12. The effects of physiologically plausible connectivity structure on local and global dynamics in large scale brain models.

    NARCIS (Netherlands)

    Knock, S.A.; McIntosh, A.R.; Sporns, O.; Kotter, R.; Hagmann, P.; Jirsa, V.K.

    2009-01-01

    Functionally relevant large scale brain dynamics operates within the framework imposed by anatomical connectivity and time delays due to finite transmission speeds. To gain insight on the reliability and comparability of large scale brain network simulations, we investigate the effects of variations

  13. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. c 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]...

  14. Molecular docking and dynamic simulation studies evidenced plausible immunotherapeutic anticancer property by Withaferin A targeting indoleamine 2,3-dioxygenase.

    Science.gov (United States)

    Reddy, S V G; Reddy, K Thammi; Kumari, V Valli; Basha, Syed Hussain

    2015-01-01

    Indoleamine 2,3-dioxygenase (IDO) is emerging as an important new therapeutic drug target for the treatment of cancer characterized by pathological immune suppression. IDO catalyzes the rate-limiting step of tryptophan degradation along the kynurenine pathway. Reduction in local tryptophan concentration and the production of immunomodulatory tryptophan metabolites contribute to the immunosuppressive effects of IDO. The presence of IDO on dendritic cells in tumor-draining lymph nodes, leading to the activation of T cells toward forming an immunosuppressive microenvironment for the survival of tumor cells, has confirmed the importance of IDO as a promising novel anticancer immunotherapy drug target. On the other hand, Withaferin A (WA), an active constituent of the ayurvedic herb Withania somnifera, has been shown to have a wide range of targeted anticancer properties. The present study is an attempt to explore the potential of WA in attenuating IDO for immunotherapeutic tumor-arresting activity, and to elucidate the underlying mode of action using a computational approach. Our docking and molecular dynamics simulation results predict high binding affinity of the ligand to the receptor, with up to -11.51 kcal/mol of energy and a 3.63 nM IC50 value. Further, de novo molecular dynamics simulations predicted stable ligand interactions with the critically important residues SER167, ARG231, and LYS377, and with the heme moiety involved in IDO's activity. Conclusively, our results strongly suggest WA as a valuable small ligand molecule with strong binding affinity toward IDO.

  15. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Due to its advantages in handling nonlinearities and couplings, the AUV model investigated here is for the first time constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take environment and sensor noises into consideration, the identification problem is treated as an errors-in-variables (EIV) problem, which means that the identification procedure operates under a general noise assumption. In order to make the algorithm recursive, a propagator-method (PM) based subspace approach is extended to the EIV framework, forming the recursive identification method called the PM-EIV algorithm. In several identification experiments carried out on an AUV simulation platform, the proposed algorithm demonstrates its effectiveness and feasibility.

  16. Fostering assumption-based stress-test thinking in managing groundwater systems: learning to avoid failures due to basic dynamics

    Science.gov (United States)

    Guillaume, Joseph H. A.; El Sawah, Sondoss

    2014-06-01

    Sustainable groundwater resource management can only be achieved if planning processes address the basic dynamics of the groundwater system. Conceptual and distributed groundwater models do not necessarily translate into an understanding of how a plan might operate in reality. Prompted by Australian experiences, `iterative closed-question modelling' has been used to develop a process of iterative dialogue about management options, objectives and knowledge. Simple hypothetical models of basic system dynamics that satisfy agreed assumptions are used to stress-test the ability of a proposed management plan to achieve desired future conditions. Participants learn from models in which a plan succeeds and fails, updating their assumptions, expectations or plan. Their new understanding is tested against further hypothetical models. The models act as intellectual devices that confront users with new scenarios to discuss. This theoretical approach is illustrated using simple one and two-cell groundwater models that convey basic notions of capture and spatial impacts of pumping. Simple extensions can address uncertain climate, managed-aquifer recharge and alternate water sources. Having learnt to address the dynamics captured by these models, participants may be better placed to address local conditions and develop more effective arrangements to achieve management outcomes.

  17. RESEARCH ON NONLINEAR PROBLEMS IN STRUCTURAL DYNAMICS.

    Science.gov (United States)

    Research on nonlinear problems in structural dynamics is briefly summarized. Panel flutter was investigated to make a critical comparison between theory...panel flutter in aerospace vehicles; plausible simplifying assumptions are examined in the light of experimental results. Structural dynamics research

  18. Adhesion Detection Analysis by Modeling Rail Wheel Set Dynamics under the Assumption of Constant Creep Coefficient

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali Soomro

    2014-12-01

    Adhesion level control is necessary to avoid slippage between rail wheelset and track, and hence derailment, for the smooth running of a rail vehicle. In this paper the dynamics of the wheelset are discussed for velocities acting in three dimensions of the wheelset and rail track, and the creep forces on each wheel in the longitudinal, lateral and spin directions are enumerated and computed for suitable modeling. The results have been simulated with Matlab code to observe the correlation of these phenomena, comparing creepage and creep forces in order to detect the adhesion level. The adhesion level is identified by applying Coulomb's law for sliding friction, comparing tangential and normal forces through the coefficient of friction.
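
    The adhesion check described here reduces to comparing the resultant tangential creep force with the Coulomb limit; a minimal sketch, with assumed linear (Kalker-type) creep-force laws and illustrative coefficients:

      import math

      f11 = f22 = 7.0e6      # longitudinal/lateral creep coefficients [N]
      mu, N = 0.3, 1.0e5     # friction coefficient, normal wheel load [N]

      def adhesion_state(creep_x, creep_y):
          # Linear creep-force law, valid for small creepages.
          Fx, Fy = -f11 * creep_x, -f22 * creep_y
          Ft = math.hypot(Fx, Fy)          # resultant tangential force
          return "sliding" if Ft > mu * N else "adhesion"

      print(adhesion_state(0.002, 0.001))  # small creepage -> adhesion
      print(adhesion_state(0.02, 0.0))     # large creepage -> sliding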

  19. Disastrous assumptions about community disasters

    Energy Technology Data Exchange (ETDEWEB)

    Dynes, R.R. [Univ. of Delaware, Newark, DE (United States). Disaster Research Center

    1995-12-31

    Planning for local community disasters is compounded with erroneous assumptions. Six problematic models are identified: agent facts, big accident, end of the world, media, command and control, administrative. Problematic assumptions in each of them are identified. A more adequate model centered on problem solving is identified. That there is a discrepancy between disaster planning efforts and the actual response experience seems rather universal. That discrepancy is symbolized by the graffiti which predictably surfaces on many walls in post disaster locations -- "First the earthquake, then the disaster." That contradiction is seldom reduced as a result of post disaster critiques, since the most usual conclusion is that the plan was adequate but the "people" did not follow it. Another explanation will be provided here. A more plausible explanation for failure is that most planning efforts adopt a number of erroneous assumptions which affect the outcome. Those assumptions are infrequently changed or modified by experience.

  20. Linking assumptions in amblyopia

    Science.gov (United States)

    LEVI, DENNIS M.

    2017-01-01

    Over the last 35 years or so, there has been substantial progress in revealing and characterizing the many interesting and sometimes mysterious sensory abnormalities that accompany amblyopia. A goal of many of the studies has been to try to make the link between the sensory losses and the underlying neural losses, resulting in several hypotheses about the site, nature, and cause of amblyopia. This article reviews some of these hypotheses, and the assumptions that link the sensory losses to specific physiological alterations in the brain. Despite intensive study, it turns out to be quite difficult to make a simple linking hypothesis, at least at the level of single neurons, and the locus of the sensory loss remains elusive. It is now clear that the simplest notion—that reduced contrast sensitivity of neurons in cortical area V1 explains the reduction in contrast sensitivity—is too simplistic. Considerations of noise, noise correlations, pooling, and the weighting of information also play a critically important role in making perceptual decisions, and our current models of amblyopia do not adequately take these into account. Indeed, although the reduction of contrast sensitivity is generally considered to reflect “early” neural changes, it seems plausible that it reflects changes at many stages of visual processing. PMID:23879956

  1. Some Remarks on the Model Theory of Epistemic Plausibility Models

    CERN Document Server

    Demey, Lorenz

    2010-01-01

    Classical logics of knowledge and belief are usually interpreted on Kripke models, for which a mathematically well-developed model theory is available. However, such models are inadequate to capture dynamic phenomena. Therefore, epistemic plausibility models have been introduced. Because these are much richer structures than Kripke models, they do not straightforwardly inherit the model-theoretical results of modal logic. Therefore, while epistemic plausibility structures are well-suited for modeling purposes, an extensive investigation of their model theory has been lacking so far. The aim of the present paper is to fill exactly this gap, by initiating a systematic exploration of the model theory of epistemic plausibility models. Like in 'ordinary' modal logic, the focus will be on the notion of bisimulation. We define various notions of bisimulations (parametrized by a language L) and show that L-bisimilarity implies L-equivalence. We prove a Hennessy-Milner type result, and also two undefinability results. ...

  2. Complex Learning in Bio-plausible Memristive Networks

    OpenAIRE

    Deng, Lei; Li, Guoqi; Deng, Ning; Wang, Dong; Zhang, Ziyang; He, Wei; Li, Huanglong; Pei, Jing; Shi, Luping

    2015-01-01

    The emerging memristor-based neuromorphic engineering promises an efficient computing paradigm. However, the lack of both internal dynamics in the previous feedforward memristive networks and efficient learning algorithms in recurrent networks, fundamentally limits the learning ability of existing systems. In this work, we propose a framework to support complex learning functions by introducing dedicated learning algorithms to a bio-plausible recurrent memristive network with internal dynamic...

  3. Environment Assumptions for Synthesis

    CERN Document Server

    Chatterjee, Krishnendu; Jobstmann, Barbara

    2008-01-01

    The synthesis problem asks to construct a reactive finite-state system from an $\omega$-regular specification. Initial specifications are often unrealizable, which means that there is no system that implements the specification. A common reason for unrealizability is that assumptions on the environment of the system are incomplete. We study the problem of correcting an unrealizable specification $\phi$ by computing an environment assumption $\psi$ such that the new specification $\psi \to \phi$ is realizable. Our aim is to construct an assumption $\psi$ that constrains only the environment and is as weak as possible. We present a two-step algorithm for computing assumptions. The algorithm operates on the game graph that is used to answer the realizability question. First, we compute a safety assumption that removes a minimal set of environment edges from the graph. Second, we compute a liveness assumption that puts fairness conditions on some of the remaining environment edges. We show that the problem of findi...

  4. Plausibility Arguments and Universal Gravitation

    Science.gov (United States)

    Cunha, Ricardo F. F.; Tort, A. C.

    2017-01-01

    Newton's law of universal gravitation underpins our understanding of the dynamics of the Solar System and of a good portion of the observable universe. Generally, in the classroom or in textbooks, the law is presented initially in a qualitative way and at some point during the exposition its mathematical formulation is written on the blackboard…

  5. Hamiltonian formulation of time-dependent plausible inference

    CERN Document Server

    Davis, Sergio

    2014-01-01

    Maximization of the path information entropy is a clear prescription for performing time-dependent plausible inference. Here it is shown that, following this prescription under the assumption of arbitrary instantaneous constraints on position and velocity, a Lagrangian emerges which determines the most probable trajectory. Deviations from the probability maximum can be consistently described as slices in time by a Hamiltonian, according to a nonlinear Langevin equation and its associated Fokker-Planck equation. The connections unveiled between the maximization of path entropy and the Langevin/Fokker-Planck equations imply that missing information about the phase space coordinate never decreases in time, a purely information-theoretical version of the Second Law of Thermodynamics. All of these results are independent of any physical assumptions, and thus valid for any generalized coordinate as a function of time, or any other parameter. This reinforces the view that the Second Law is a fundamental property of ...
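
    In standard notation (assumed here, since the record gives no formulas), the Langevin/Fokker-Planck pair referred to above takes the familiar form:

      \dot{q}(t) = v(q,t) + \eta(t), \qquad
      \langle \eta(t)\,\eta(t') \rangle = 2D\,\delta(t-t'),

      \frac{\partial P(q,t)}{\partial t}
        = -\frac{\partial}{\partial q}\bigl[\, v(q,t)\, P(q,t) \,\bigr]
        + D\,\frac{\partial^{2} P(q,t)}{\partial q^{2}},

    where P(q,t) is the probability density of the generalized coordinate q, v a drift fixed by the instantaneous constraints, and D a diffusion coefficient.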

  6. What can we learn from Plausible Values?

    Science.gov (United States)

    Marsman, Maarten; Maris, Gunter; Bechger, Timo; Glas, Cees

    2016-06-01

    In this paper, we show that the marginal distribution of plausible values is a consistent estimator of the true latent variable distribution, and, furthermore, that convergence is monotone in an embedding in which the number of items tends to infinity. We use this result to clarify some of the misconceptions that exist about plausible values, and also show how they can be used in the analyses of educational surveys.
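
    The consistency claim is easy to illustrate by simulation: draw one plausible value per examinee from a grid posterior under a Rasch model, then compare its spread with the true latent spread and with that of posterior-mean point estimates (which are shrunken). Everything below (sample sizes, difficulties, grid) is an illustrative assumption:

      import numpy as np

      rng = np.random.default_rng(2)
      n, items = 5000, 20
      theta = rng.normal(0, 1, n)                       # true latent abilities
      beta = np.linspace(-2, 2, items)                  # item difficulties
      p = 1 / (1 + np.exp(-(theta[:, None] - beta)))    # Rasch probabilities
      x = (rng.random((n, items)) < p).astype(float)    # response matrix

      grid = np.linspace(-4, 4, 161)                    # posterior on a grid
      prior = np.exp(-grid**2 / 2)                      # N(0,1) population prior
      pg = 1 / (1 + np.exp(-(grid[:, None] - beta)))    # grid x items
      loglik = x @ np.log(pg).T + (1 - x) @ np.log(1 - pg).T
      post = np.exp(loglik - loglik.max(1, keepdims=True)) * prior
      post /= post.sum(1, keepdims=True)

      pv = np.array([rng.choice(grid, p=w) for w in post])  # one PV each
      # PV sd tracks the true sd; posterior means are too narrow.
      print(theta.std(), pv.std(), (post @ grid).std())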

  7. Plausibility functions and exact frequentist inference

    CERN Document Server

    Martin, Ryan

    2012-01-01

    In the frequentist program, inferential methods with exact control on error rates are a primary focus. Methods based on asymptotic distribution theory may not be suitable in a particular problem, in which case, a numerical method is needed. This paper presents a general, Monte Carlo-driven framework for the construction of frequentist procedures based on plausibility functions. It is proved that the suitably defined plausibility function-based tests and confidence regions have desired frequentist properties. Moreover, in an important special case involving likelihood ratios, conditions are given such that the plausibility function behaves asymptotically like a consistent Bayesian posterior distribution. An extension of the proposed method is also given for the case where nuisance parameters are present. A number of examples are given which illustrate the method and demonstrate its strong performance compared to other popular existing methods.
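
    For a concrete instance of the construction (our toy reduction to a normal mean with unit variance, not the paper's general setting), the plausibility function can be estimated by Monte Carlo as the probability, under a candidate theta, that the relative likelihood falls at or below its observed value:

      import numpy as np

      rng = np.random.default_rng(3)
      y = rng.normal(1.0, 1.0, 25)      # observed sample

      def rel_lik(xbar, n, theta):
          # L(theta) / max_theta L(theta) for a N(theta, 1) sample mean
          return np.exp(-0.5 * n * (xbar - theta) ** 2)

      def plausibility(theta, n=25, M=5000):
          obs = rel_lik(y.mean(), n, theta)
          xbar_sim = rng.normal(theta, 1 / np.sqrt(n), M)  # sampling dist.
          return np.mean(rel_lik(xbar_sim, n, theta) <= obs)

      # {theta : plausibility(theta) > 0.05} is a 95% confidence region.
      print(plausibility(1.0), plausibility(2.0))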

  8. Bisimulation for Single-Agent Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; van Ditmarsch, H.;

    2013-01-01

    Epistemic plausibility models are Kripke models agents use to reason about the knowledge and beliefs of themselves and each other. Restricting ourselves to the single-agent case, we determine when such models are indistinguishable in the logical language containing conditional belief, i.e., we define a proper notion of bisimulation, and prove that bisimulation corresponds to logical equivalence on image-finite models. We relate our results to other epistemic notions, such as safe belief and degrees of belief. Our results imply that there are only finitely many non-bisimilar single-agent epistemic plausibility models on a finite set of propositions. This gives decidability for single-agent epistemic plausibility planning.

  9. The bright lights of city regions: Assumptions, realities and implications of changing population dynamics: Zooming in on the Gauteng city region

    CSIR Research Space (South Africa)

    Pieterse, A

    2014-10-01

    ...to assumptions of migration and urbanisation. Firstly, even though poverty has been perceived as largely a rural issue, the urbanisation of poverty is in fact occurring at a large scale, and city regions, particularly the Gauteng city region, are dealing...

  10. Biologically Plausible, Human-scale Knowledge Representation

    Science.gov (United States)

    Crawford, Eric; Gingerich, Matthew; Eliasmith, Chris

    2016-01-01

    Several approaches to implementing symbol-like representations in neurally plausible models have been proposed. These approaches include binding through synchrony (Shastri & Ajjanagadde, 1993), "mesh" binding (van der Velde & de Kamps, 2006), and conjunctive binding (Smolensky, 1990). Recent theoretical work has suggested that…

  11. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition.

  12. Complex Learning in Bio-plausible Memristive Networks.

    Science.gov (United States)

    Deng, Lei; Li, Guoqi; Deng, Ning; Wang, Dong; Zhang, Ziyang; He, Wei; Li, Huanglong; Pei, Jing; Shi, Luping

    2015-06-19

    The emerging memristor-based neuromorphic engineering promises an efficient computing paradigm. However, the lack of both internal dynamics in the previous feedforward memristive networks and efficient learning algorithms in recurrent networks, fundamentally limits the learning ability of existing systems. In this work, we propose a framework to support complex learning functions by introducing dedicated learning algorithms to a bio-plausible recurrent memristive network with internal dynamics. We fabricate iron oxide memristor-based synapses, with well controllable plasticity and a wide dynamic range of excitatory/inhibitory connection weights, to build the network. To adaptively modify the synaptic weights, the comprehensive recursive least-squares (RLS) learning algorithm is introduced. Based on the proposed framework, the learning of various timing patterns and a complex spatiotemporal pattern of human motor is demonstrated. This work paves a new way to explore the brain-inspired complex learning in neuromorphic systems.
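
    The record names recursive least-squares (RLS) as the learning algorithm; below is a minimal sketch of an RLS-trained readout, with a random placeholder standing in for the memristive network state (all sizes and signals are illustrative assumptions):

      import numpy as np

      rng = np.random.default_rng(4)
      N, lam = 50, 1.0
      w = np.zeros(N)             # readout weights (synaptic conductances)
      P = np.eye(N) / lam         # running inverse correlation estimate

      for t in range(500):
          r = np.tanh(rng.normal(0, 1, N))   # placeholder network state
          f = np.sin(0.1 * t)                # target timing pattern
          e = w @ r - f                      # readout error
          Pr = P @ r
          k = Pr / (1.0 + r @ Pr)            # RLS gain vector
          P -= np.outer(k, Pr)               # rank-1 update of P
          w -= e * k                         # error-proportional update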

  13. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics: (1) intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source; (2) delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect; (3) special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong, which would also cause photon velocities to be energy-dependent; (4) delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies; this would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  14. Neural networks, nativism, and the plausibility of constructivism.

    Science.gov (United States)

    Quartz, S R

    1993-09-01

    Recent interest in PDP (parallel distributed processing) models is due in part to the widely held belief that they challenge many of the assumptions of classical cognitive science. In the domain of language acquisition, for example, there has been much interest in the claim that PDP models might undermine nativism. Related arguments based on PDP learning have also been given against Fodor's anti-constructivist position--a position that has contributed to the widespread dismissal of constructivism. A limitation of many of the claims regarding PDP learning, however, is that the principles underlying this learning have not been rigorously characterized. In this paper, I examine PDP models from within the framework of Valiant's PAC (probably approximately correct) model of learning, now the dominant model in machine learning, and which applies naturally to neural network learning. From this perspective, I evaluate the implications of PDP models for nativism and Fodor's influential anti-constructivist position. In particular, I demonstrate that, contrary to a number of claims, PDP models are nativist in a robust sense. I also demonstrate that PDP models actually serve as a good illustration of Fodor's anti-constructivist position. While these results may at first suggest that neural network models in general are incapable of the sort of concept acquisition that is required to refute Fodor's anti-constructivist position, I suggest that there is an alternative form of neural network learning that demonstrates the plausibility of constructivism. This alternative form of learning is a natural interpretation of the constructivist position in terms of neural network learning, as it employs learning algorithms that incorporate the addition of structure in addition to weight modification schemes. By demonstrating that there is a natural and plausible interpretation of constructivism in terms of neural network learning, the position that nativism is the only plausible model of

  15. Anatomically Plausible Surface Alignment and Reconstruction

    DEFF Research Database (Denmark)

    Paulsen, Rasmus R.; Larsen, Rasmus

    2010-01-01

    With the increasing clinical use of 3D surface scanners, there is a need for accurate and reliable algorithms that can produce anatomically plausible surfaces. In this paper, a combined method for surface alignment and reconstruction is proposed. It is based on an implicit surface representation combined with a Markov Random Field regularisation method. Conceptually, the method maintains an implicit ideal description of the sought surface. This implicit surface is iteratively updated by realigning the input point sets and Markov Random Field regularisation. The regularisation is based on a prior energy that has earlier proved to be particularly well suited for human surface scans. The method has been tested on full cranial scans of ten test subjects and on several scans of the outer human ear.

  16. The Role of Plausible Values in Large-Scale Surveys

    Science.gov (United States)

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1) address…

  17. Comprehending Conflicting Science-Related Texts: Graphs as Plausibility Cues

    Science.gov (United States)

    Isberner, Maj-Britt; Richter, Tobias; Maier, Johanna; Knuth-Herzig, Katja; Horz, Holger; Schnotz, Wolfgang

    2013-01-01

    When reading conflicting science-related texts, readers may attend to cues which allow them to assess plausibility. One such plausibility cue is the use of graphs in the texts, which are regarded as typical of "hard science." The goal of our study was to investigate the effects of the presence of graphs on the perceived plausibility and…

  18. Invariant visual object recognition: biologically plausible approaches.

    Science.gov (United States)

    Robinson, Leigh; Rolls, Edmund T

    2015-10-01

    Key properties of inferior temporal cortex neurons are described, and then the biological plausibility of two leading approaches to invariant visual object recognition in the ventral visual system is assessed to investigate whether they account for these properties. Experiment 1 shows that VisNet performs object classification with random exemplars comparably to HMAX, except that the final layer C neurons of HMAX have a very non-sparse representation (unlike that in the brain) that provides little information in the single-neuron responses about the object class. Experiment 2 shows that VisNet forms invariant representations when trained with different views of each object, whereas HMAX performs poorly when assessed with a biologically plausible pattern association network, as HMAX has no mechanism to learn view invariance. Experiment 3 shows that VisNet neurons do not respond to scrambled images of faces, and thus encode shape information. HMAX neurons responded with similarly high rates to the unscrambled and scrambled faces, indicating that low-level features including texture may be relevant to HMAX performance. Experiment 4 shows that VisNet can learn to recognize objects even when the view provided by the object changes catastrophically as it transforms, whereas HMAX has no learning mechanism in its S-C hierarchy that provides for view-invariant learning. This highlights some requirements for the neurobiological mechanisms of high-level vision, and how some different approaches perform, in order to help understand the fundamental underlying principles of invariant visual object recognition in the ventral visual stream.

  19. Exposing Trust Assumptions in Distributed Policy Enforcement (Briefing Charts)

    Science.gov (United States)

    2016-06-21

    Coordinated defenses appear to be feasible. Writing policies from scratch is hard: exposing assumptions requires people to think about what assumptions... critical capabilities as: adaptation to dynamic service availability; complex situational dynamics (e.g., differentiating between bot-net and

  20. Cultural group selection is plausible, but the predictions of its hypotheses should be tested with real-world data.

    Science.gov (United States)

    Turchin, Peter; Currie, Thomas E

    2016-01-01

    The evidence compiled in the target article demonstrates that the assumptions of cultural group selection (CGS) theory are often met, and it is therefore a useful framework for generating plausible hypotheses. However, more can be said about how we can test the predictions of CGS hypotheses against competing explanations using historical, archaeological, and anthropological data.

  1. Plausibility and evidence: the case of homeopathy.

    Science.gov (United States)

    Rutten, Lex; Mathie, Robert T; Fisher, Peter; Goossens, Maria; van Wassenhoven, Michel

    2013-08-01

    Homeopathy is controversial and hotly debated. The conclusions of systematic reviews of randomised controlled trials of homeopathy vary from 'comparable to conventional medicine' to 'no evidence of effects beyond placebo'. It is claimed that homeopathy conflicts with scientific laws and that homoeopaths reject the naturalistic outlook, but no evidence has been cited. We are homeopathic physicians and researchers who do not reject the scientific outlook; we believe that examination of the prior beliefs underlying this enduring stand-off can advance the debate. We show that interpretations of the same set of evidence--for homeopathy and for conventional medicine--can diverge. Prior disbelief in homeopathy is rooted in the perceived implausibility of any conceivable mechanism of action. Using the 'crossword analogy', we demonstrate that plausibility bias impedes assessment of the clinical evidence. Sweeping statements about the scientific impossibility of homeopathy are themselves unscientific: scientific statements must be precise and testable. There is growing evidence that homeopathic preparations can exert biological effects; due consideration of such research would reduce the influence of prior beliefs on the assessment of systematic review evidence.

  2. Analytic Models of Plausible Gravitational Lens Potentials

    Energy Technology Data Exchange (ETDEWEB)

    Baltz, Edward A.; Marshall, Phil; Oguri, Masamune

    2007-05-04

    Gravitational lenses on galaxy scales are plausibly modeled as having ellipsoidal symmetry and a universal dark matter density profile, with a Sersic profile to describe the distribution of baryonic matter. Predicting all lensing effects requires knowledge of the total lens potential: in this work we give analytic forms for that of the above hybrid model. Emphasizing that complex lens potentials can be constructed from simpler components in linear combination, we provide a recipe for attaining elliptical symmetry in either projected mass or lens potential. We also provide analytic formulae for the lens potentials of Sersic profiles for integer and half-integer index. We then present formulae describing the gravitational lensing effects due to smoothly-truncated universal density profiles in the cold dark matter model. For our isolated haloes the density profile falls off as radius to the minus fifth or seventh power beyond the tidal radius, functional forms that allow all orders of lens potential derivatives to be calculated analytically, while ensuring a non-divergent total mass. We show how the observables predicted by this profile differ from those of the original infinite-mass NFW profile. Expressions for the gravitational flexion are highlighted. We show how decreasing the tidal radius allows stripped haloes to be modeled, providing a framework for a fuller investigation of dark matter substructure in galaxies and clusters. Finally we remark on the need for finite mass halo profiles when doing cosmological ray-tracing simulations, and the need for readily-calculable higher order derivatives of the lens potential when studying catastrophes in strong lenses.
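
    A sketch of such a smoothly truncated profile, assuming the common form in which the NFW density is multiplied by (rt^2/(r^2+rt^2))^n so that the outer slope steepens from r^-3 to r^-5 (n=1) or r^-7 (n=2); the normalisation and radii below are placeholders:

      import numpy as np

      def rho_truncated(r, rho_s=1.0, rs=1.0, rt=10.0, n=1):
          """Truncated NFW-like density with finite total mass."""
          x = r / rs
          nfw = rho_s / (x * (1 + x) ** 2)          # standard NFW profile
          return nfw * (rt**2 / (r**2 + rt**2)) ** n

      r = np.logspace(-1, 3, 5)
      print(rho_truncated(r))   # falls as r^-5 well beyond rt for n=1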

  3. Encoding the target or the plausible preview word? The nature of the plausibility preview benefit in reading Chinese.

    Science.gov (United States)

    Yang, Jinmian; Li, Nan; Wang, Suiping; Slattery, Timothy J; Rayner, Keith

    2014-01-01

    Previous studies have shown that a plausible preview word can facilitate the processing of a target word as compared to an implausible preview word (a plausibility preview benefit effect) when reading Chinese (Yang, Wang, Tong, & Rayner, 2012; Yang, 2013). Regarding the nature of this effect, it is possible that readers processed the meaning of the plausible preview word and did not actually encode the target word (given that the parafoveal preview word lies close to the fovea). The current experiment examined this possibility with three conditions wherein readers received a preview of a target word that was either (1) identical to the target word (identical preview), (2) a plausible continuation of the pre-target text, but the post-target text in the sentence was incompatible with it (initially plausible preview), or (3) not a plausible continuation of the pre-target text, nor compatible with the post-target text (implausible preview). Gaze durations on target words were longer in the initially plausible condition than the identical condition. Overall, the results showed a typical preview benefit, but also implied that readers did not encode the initially plausible preview. Also, a plausibility preview benefit was replicated: gaze durations were longer with implausible previews than the initially plausible ones. Furthermore, late eye movement measures did not reveal differences between the initially plausible and the implausible preview conditions, which argues against the possibility of misreading the plausible preview word as the target word. In sum, these results suggest that a plausible preview word provides benefit in processing the target word as compared to an implausible preview word, and this benefit is only present in early but not late eye movement measures.

  4. Identifying plausible genetic models based on association and linkage results: application to type 2 diabetes.

    Science.gov (United States)

    Guan, Weihua; Boehnke, Michael; Pluzhnikov, Anna; Cox, Nancy J; Scott, Laura J

    2012-12-01

    When planning resequencing studies for complex diseases, previous association and linkage studies can constrain the range of plausible genetic models for a given locus. Here, we explore the combinations of causal risk allele frequency (RAF_C) and genotype relative risk (GRR_C) consistent with no or limited evidence for affected sibling pair (ASP) linkage and strong evidence for case-control association. We find that significant evidence for case-control association combined with no or moderate evidence for ASP linkage can define a lower bound for the plausible RAF_C. Using data from large type 2 diabetes (T2D) linkage and genome-wide association study meta-analyses, we find that under reasonable model assumptions, 23 of 36 autosomal T2D risk loci are unlikely to be due to causal variants with combined RAF_C < 0.005, and four of the 23 are unlikely to be due to causal variants with combined RAF_C < 0.05.
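
    One way to see how RAF and GRR jointly constrain ASP linkage evidence is through the locus-specific sibling relative risk lambda_S; the sketch below uses the classical variance-components formula lambda_S = 1 + V_A/(2K^2) + V_D/(4K^2) under a multiplicative GRR model (our illustrative reading, not the paper's exact procedure; the baseline penetrance cancels in the ratio):

      def lambda_sib(p, g, f0=0.01):
          f = [f0, g * f0, g * g * f0]                 # penetrances, 0/1/2 copies
          q = [(1 - p) ** 2, 2 * p * (1 - p), p ** 2]  # HWE genotype frequencies
          K = sum(qi * fi for qi, fi in zip(q, f))     # population risk
          alpha = p * (f[2] - f[1]) + (1 - p) * (f[1] - f[0])
          VA = 2 * p * (1 - p) * alpha ** 2            # additive variance
          VD = (p * (1 - p) * (f[2] - 2 * f[1] + f[0])) ** 2   # dominance variance
          return 1 + VA / (2 * K ** 2) + VD / (4 * K ** 2)

      # Common, weak variants give lambda_S near 1 (little ASP linkage
      # signal); rare, strong variants push lambda_S up.
      print(lambda_sib(0.3, 1.2), lambda_sib(0.005, 3.0))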

  5. Don't Plan for the Unexpected: Planning Based on Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; Jensen, Martin Holm

    2015-01-01

    We present a framework for automated planning based on plausibility models, as well as algorithms for computing plans in this framework. Our plausibility models include postconditions, as ontic effects are essential for most planning purposes. The framework presented extends a previously developed framework based on dynamic epistemic logic (DEL), without plausibilities/beliefs. In the pure epistemic framework, one can distinguish between strong and weak epistemic plans for achieving some, possibly epistemic, goal. By taking all possible outcomes of actions into account, a strong plan guarantees that the agent achieves this goal. Conversely, a weak plan promises only the possibility of leading to the goal. In real-life planning scenarios where the planning agent is faced with a high degree of uncertainty and an almost endless number of possible exogenous events, strong epistemic planning

  6. Don't Plan for the Unexpected: Planning Based on Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; Jensen, Martin Holm

    2015-01-01

    We present a framework for automated planning based on plausibility models, as well as algorithms for computing plans in this framework. Our plausibility models include postconditions, as ontic effects are essential for most planning purposes. The framework presented extends a previously developed framework based on dynamic epistemic logic (DEL), without plausibilities/beliefs. In the pure epistemic framework, one can distinguish between strong and weak epistemic plans for achieving some, possibly epistemic, goal. By taking all possible outcomes of actions into account, a strong plan guarantees that the agent achieves this goal. Conversely, a weak plan promises only the possibility of leading to the goal. In real-life planning scenarios where the planning agent is faced with a high degree of uncertainty and an almost endless number of possible exogenous events, strong epistemic planning...

  7. From bone to plausible bipedal locomotion. Part II: Complete motion synthesis for bipedal primates.

    Science.gov (United States)

    Nicolas, Guillaume; Multon, Franck; Berillon, Gilles

    2009-05-29

    This paper addresses the problem of synthesizing plausible bipedal locomotion according to a 3D anatomical reconstruction and general hypotheses on human motion control strategies. In a previous paper [Nicolas, G., Multon, F., Berillon, G., Marchal, F., 2007. From bone to plausible bipedal locomotion using inverse kinematics. Journal of Biomechanics 40 (5) 1048-1057], we validated a method based on using inverse kinematics to obtain plausible lower-limb motions knowing the trajectory of the ankle. In this paper, we propose a more general approach that also involves computing a plausible trajectory of the ankles for a given skeleton. The inputs are the anatomical description of the bipedal species, imposed footprints, and a rest posture. The process is based on optimizing a reference ankle trajectory until a set of criteria is minimized. This optimization loop rests on the assumption that a plausible motion should involve little internal mechanical work and should be as smooth (minimally jerky) as possible. For each tested ankle trajectory, inverse kinematics is used to compute a lower-body motion, from which the resulting mechanical work and jerk are computed. The method was tested on a set of modern humans (male and female, with various anthropometric properties). We show that the results obtained with this method are close to experimental data for most of the subjects. We also demonstrate that the method is not sensitive to the choice of the reference ankle trajectory; any ankle trajectory leads to a very similar result. We finally apply the method to a skeleton of Pan paniscus (bonobo), and compare the resulting motion to those described by zoologists.

  8. Plausibility Judgments in Conceptual Change and Epistemic Cognition

    Science.gov (United States)

    Lombardi, Doug; Nussbaum, E. Michael; Sinatra, Gale M.

    2016-01-01

    Plausibility judgments rarely have been addressed empirically in conceptual change research. Recent research, however, suggests that these judgments may be pivotal to conceptual change about certain topics where a gap exists between what scientists and laypersons find plausible. Based on a philosophical and empirical foundation, this article…

  9. Source Effects and Plausibility Judgments When Reading about Climate Change

    Science.gov (United States)

    Lombardi, Doug; Seyranian, Viviane; Sinatra, Gale M.

    2014-01-01

    Gaps between what scientists and laypeople find plausible may act as a barrier to learning complex and/or controversial socioscientific concepts. For example, individuals may consider scientific explanations that human activities are causing current climate change as implausible. This plausibility judgment may be due, in part, to individuals'…

  10. Test of Poisson Failure Assumption.

    Science.gov (United States)

    1982-09-01

    TEST OF POISSON FAILURE ASSUMPTION. Chapter 1. INTRODUCTION. 1.1 Background. In stockage models... precipitates a regular failure pattern; it is also possible that the coding of scheduled vs unscheduled does not reflect what we would expect. Data

  11. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  12. Plausible values: how to deal with their limitations.

    Science.gov (United States)

    Monseur, Christian; Adams, Raymond

    2009-01-01

    Rasch modeling and plausible values methodology were used to scale and report the results of the Organization for Economic Cooperation and Development's Programme for International Student Achievement (PISA). This article will describe the scaling approach adopted in PISA. In particular it will focus on the use of plausible values, a multiple imputation approach that is now commonly used in large-scale assessment. As with all imputation models the plausible values must be generated using models that are consistent with those used in subsequent data analysis. In the case of PISA the plausible value generation assumes a flat linear regression with all students' background variables collected through the international student questionnaire included as regressors. Further, like most linear models, homoscedasticity and normality of the conditional variance are assumed. This article will explore some of the implications of this approach. First, we will discuss the conditions under which the secondary analyses on variables not included in the model for generating the plausible values might be biased. Secondly, as plausible values were not drawn from a multi-level model, the article will explore the adequacy of the PISA procedures for estimating variance components when the data have a hierarchical structure.
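
    A minimal sketch of such a conditioning (latent-regression) step, assuming a homoscedastic normal residual and a single noisy measure of proficiency; dimensions and values are illustrative, and PISA's actual conditioning model is far richer:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 1000
      Z = np.column_stack([np.ones(n), rng.normal(0, 1, (n, 2))])  # background vars
      gamma_true = np.array([0.2, 0.5, -0.3])
      theta = Z @ gamma_true + rng.normal(0, 0.8, n)    # latent proficiency
      theta_hat = theta + rng.normal(0, 0.5, n)         # noisy test information

      gamma, *_ = np.linalg.lstsq(Z, theta_hat, rcond=None)  # conditioning model
      resid_var = np.var(theta_hat - Z @ gamma)              # assumed homoscedastic

      # Combine the regression prior with the measurement likelihood
      # (both normal), then draw five plausible values per student.
      prior_mean, prior_var, meas_var = Z @ gamma, resid_var, 0.5 ** 2
      post_var = 1 / (1 / prior_var + 1 / meas_var)
      post_mean = post_var * (prior_mean / prior_var + theta_hat / meas_var)
      pvs = rng.normal(post_mean, np.sqrt(post_var), (5, n))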

  13. A biologically plausible embodied model of action discovery

    Directory of Open Access Journals (Sweden)

    Rufino Bolado-Gomez

    2013-03-01

    Full Text Available During development, animals can spontaneously discover action-outcome pairings enabling subsequent achievement of their goals. We present a biologically plausible embodied model addressing key aspects of this process. The biomimetic model core comprises the basal ganglia and its loops through cortex and thalamus. We incorporate reinforcement learning with phasic dopamine supplying a sensory prediction error, signalling 'surprising' outcomes. Phasic dopamine is used in a corticostriatal learning rule which is consistent with recent data. We also hypothesised that objects associated with surprising outcomes acquire 'novelty salience' contingent on the predictability of the outcome. To test this idea we used a simple model of prediction governing the dynamics of novelty salience and phasic dopamine. The task of the virtual robotic agent mimicked an in vivo counterpart (Gancarz et al., 2011) and involved interaction with a target object which caused a light flash, or a control object which did not. Learning took place according to two schedules. In one, the phasic outcome was delivered after interaction with the target in an unpredictable way which emulated the in vivo protocol. Without novelty salience, the model was unable to account for the experimental data. In the other schedule, the phasic outcome was reliably delivered and the agent showed a rapid increase in the number of interactions with the target which then decreased over subsequent sessions. We argue this is precisely the kind of change in behaviour required to repeatedly present representations of context, action and outcome, to neural networks responsible for learning action-outcome contingency. The model also showed corticostriatal plasticity consistent with learning a new action in basal ganglia. We conclude that action learning is underpinned by a complex interplay of plasticity and stimulus salience, and that our model contains many of the elements for biological action discovery to take place.

  14. Modern Cosmology: Assumptions and Limits

    Science.gov (United States)

    Hwang, Jai-Chan

    2012-06-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, "philosophy, in one of its functions, is the critic of cosmologies." (Whitehead 1925).

  15. Modern Cosmology: Assumptions and Limits

    CERN Document Server

    Hwang, Jai-chan

    2012-01-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, "philosophy, in one of its functions, is the critic of cosmologies". (Whitehead 1925)

  16. The Plausibility of a String Quartet Performance in Virtual Reality.

    Science.gov (United States)

    Bergstrom, Ilias; Azevedo, Sergio; Papiotis, Panos; Saldanha, Nuno; Slater, Mel

    2017-04-01

    We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a virtual environment that depicts the performance of a string quartet. 'Plausibility' refers to the component of presence that is the illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians ignored the participant, or sometimes looked towards and followed the participant's movements), Sound Spatialization (Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived, or reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, or birdsong and wind corresponding to the outside scene). We adopted a methodology based on color matching theory, where 20 participants were first able to assess their feeling of plausibility in the environment with each of the four features at its highest setting. Then, five times, participants started from a low setting on all features and were able to make transitions from one system configuration to another until they matched their original feeling of plausibility. From these transitions a Markov transition matrix was constructed, along with probabilities of a match conditional on feature configuration. The results show that Environment and Gaze were individually the most important factors influencing the level of plausibility. The highest probability transitions were to improve Environment and Gaze, and then Auralization and Spatialization. We present this work both as a contribution to the methodology of assessing presence without questionnaires, and as a demonstration of how various aspects of a musical performance can influence plausibility.
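
    The transition-matrix construction described above can be sketched as follows; the state coding and the toy list of observed transitions are assumptions for illustration, not the study's data.

```python
import numpy as np

# Each feature configuration is a state, e.g. (gaze, spatialization,
# auralization, environment) coded at discrete levels (coding assumed).
transitions = [((0, 0, 0, 0), (0, 0, 0, 1)),
               ((0, 0, 0, 1), (1, 0, 0, 1)),
               ((1, 0, 0, 1), (1, 0, 1, 1))]   # toy observed moves

states = sorted({s for pair in transitions for s in pair})
index = {s: i for i, s in enumerate(states)}

counts = np.zeros((len(states), len(states)))
for src, dst in transitions:
    counts[index[src], index[dst]] += 1

# Row-normalize the counts into a Markov transition matrix
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(P)
```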

  17. Challenged assumptions and invisible effects

    DEFF Research Database (Denmark)

    Wimmelmann, Camilla Lawaetz; Vitus, Kathrine; Jervelund, Signe Smith

    2017-01-01

    …of two complete intervention courses and an analysis of the official intervention documents. Findings – This case study exemplifies how the basic normative assumptions behind an immigrant-oriented intervention and the intrinsic power relations therein may be challenged and negotiated by the participants. In particular, the assumed (power) relations inherent in immigrant-oriented educational health interventions, in which immigrants are in a novice position, are challenged, as the immigrants are experienced adults (and parents) in regard to healthcare. The paper proposes that such unexpected conditions for the implementation—different from the assumed conditions—not only challenge the implementation of the intervention but also potentially produce unanticipated yet valuable effects. Research implications – Newly arrived immigrants represent a hugely diverse and heterogeneous group of people with differing values…

  18. Faulty assumptions for repository requirements

    Energy Technology Data Exchange (ETDEWEB)

    Sutcliffe, W G

    1999-06-03

    Long term performance requirements for a geologic repository for spent nuclear fuel and high-level waste are based on assumptions concerning water use and subsequent deaths from cancer due to ingesting water contaminated with radioisotopes ten thousand years in the future. This paper argues that the assumptions underlying these requirements are faulty for a number of reasons. First, in light of the inevitable technological progress, including efficient desalination of water, over the next ten thousand years, it is inconceivable that a future society would drill for water near a repository. Second, even today we would not use water without testing its purity. Third, today many types of cancer are curable, and with the rapid progress in medical technology in general, and the prevention and treatment of cancer in particular, it is improbable that cancer caused by ingesting contaminated water will be a significant killer in the far future. This paper reviews the performance requirements for geological repositories and comments on the difficulties in proving compliance in the face of inherent uncertainties. The already tiny long-term risk posed by a geologic repository is presented and contrasted with contemporary everyday risks. A number of examples of technological progress, including cancer treatments, are advanced. The real and significant costs resulting from the overly conservative requirements are then assessed. Examples are given of how money (and political capital) could be put to much better use to save lives today and in the future. It is concluded that although a repository represents essentially no long-term risk, monitored retrievable dry storage (above or below ground) is the current best alternative for spent fuel and high-level nuclear waste.

  19. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this paper we investigate a method for testing the mar assumption in the presence of other distributional constraints. We present methods to (approximately) compute a test statistic consisting of the ratio of two profile likelihood functions. This requires the optimization of the likelihood under…

  20. Classification using sparse representations: a biologically plausible approach.

    Science.gov (United States)

    Spratling, M W

    2014-02-01

    Representing signals as linear combinations of basis vectors sparsely selected from an overcomplete dictionary has proven to be advantageous for many applications in pattern recognition, machine learning, signal processing, and computer vision. While this approach was originally inspired by insights into cortical information processing, biologically plausible approaches have been limited to exploring the functionality of early sensory processing in the brain, while more practical applications have employed non-biologically plausible sparse coding algorithms. Here, a biologically plausible algorithm is proposed that can be applied to practical problems. This algorithm is evaluated using standard benchmark tasks in the domain of pattern classification, and its performance is compared to a wide range of alternative algorithms that are widely used in signal and image processing. The results show that for the classification tasks performed here, the proposed method is competitive with the best of the alternative algorithms that have been evaluated. This demonstrates that classification using sparse representations can be performed in a neurally plausible manner, and hence, that this mechanism of classification might be exploited by the brain.
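
    As a generic illustration of classification by sparse representation (not Spratling's biologically plausible network), the sketch below selects dictionary atoms with orthogonal matching pursuit and assigns the class whose atoms reconstruct the sample best. All data and parameters are invented for the example.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal matching pursuit: greedily pick k atoms of dictionary D
    (unit-norm columns) that best reconstruct x."""
    residual, idx, coef = x.astype(float), [], None
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        residual = x - D[:, idx] @ coef
    return idx, coef

def classify(D, labels, x, k=5):
    """Assign x to the class whose selected atoms reconstruct it best."""
    idx, _ = omp(D, x, k)
    best_cls, best_err = None, np.inf
    for cls in set(labels[idx]):
        sel = [i for i in idx if labels[i] == cls]
        c, *_ = np.linalg.lstsq(D[:, sel], x, rcond=None)
        err = np.linalg.norm(x - D[:, sel] @ c)
        if err < best_err:
            best_cls, best_err = cls, err
    return best_cls

rng = np.random.default_rng(0)
D = rng.normal(size=(20, 40)); D /= np.linalg.norm(D, axis=0)
labels = np.repeat([0, 1], 20)            # atoms 0-19 class 0, 20-39 class 1
x = D[:, 3] + 0.05 * rng.normal(size=20)  # noisy sample near a class-0 atom
print(classify(D, labels, x))             # expected: 0
```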

  1. On the coupling of fluid dynamics and electromagnetism at the top of the earth's core

    Science.gov (United States)

    Benton, E. R.

    1985-01-01

    A kinematic approach to short-term geomagnetism has recently been based upon pre-Maxwell frozen-flux electromagnetism. A complete dynamic theory requires coupling fluid dynamics to electromagnetism. A geophysically plausible simplifying assumption for the vertical vorticity balance, namely that the vertical Lorentz torque is negligible, is introduced and its consequences are developed. The simplified coupled magnetohydrodynamic system is shown to conserve a variety of magnetic and vorticity flux integrals. These provide constraints on eligible models for the geomagnetic main field, its secular variation, and the horizontal fluid motions at the top of the core, and so permit a number of tests of the underlying assumptions.
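
    For reference, the frozen-flux constraint underlying this kinematic approach is commonly written as follows; this is the standard textbook form, not an equation quoted from the paper.

```latex
% Radial induction equation at the core-mantle boundary under the
% frozen-flux approximation (magnetic diffusion neglected):
\[
  \frac{\partial B_r}{\partial t} + \nabla_H \cdot \left( \mathbf{u}_H B_r \right) = 0 ,
\]
% which conserves the magnetic flux through any surface patch S bounded
% by a null-flux curve (B_r = 0 on the boundary):
\[
  \frac{d}{dt} \int_S B_r \, dS = 0 .
\]
```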

  3. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Niklas

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other appl...

  4. Families of Plausible Solutions to the Puzzle of Boyajian's Star

    CERN Document Server

    Wright, Jason T

    2016-01-01

    Good explanations for the unusual light curve of Boyajian's Star have been hard to find. Recent results by Montet & Simon lend strength and plausibility to the conclusion of Schaefer that in addition to short-term dimmings, the star also experiences large, secular decreases in brightness on decadal timescales. This, combined with a lack of long-wavelength excess in the star's spectral energy distribution, strongly constrains scenarios involving circumstellar material, including hypotheses invoking a spherical cloud of artifacts. We show that the timings of the deepest dimmings appear consistent with being randomly distributed, and that the star's reddening and narrow sodium absorption is consistent with the total, long-term dimming observed. Following Montet & Simon's encouragement to generate alternative hypotheses, we attempt to circumscribe the space of possible explanations with a range of plausibilities, including: a cloud in the outer solar system, structure in the ISM, natural and artificial ma...

  5. Representations of physical plausibility revealed by event-related potentials.

    Science.gov (United States)

    Roser, Matthew E; Fugelsang, Jonathan A; Handy, Todd C; Dunbar, Kevin N; Gazzaniga, Michael S

    2009-08-05

    Maintaining an accurate mental representation of the current environment is crucial to detecting change in that environment and ensuring behavioral coherence. Past experience with interactions between objects, such as collisions, has been shown to influence the perception of object interactions. To assess whether mental representations of object interactions derived from experience influence the maintenance of a mental model of the current stimulus environment, we presented physically plausible and implausible collision events while recording brain electrical activity. The parietal P300 response to 'oddball' events was found to be modulated by the physical plausibility of the stimuli, suggesting that past experience of object interactions can influence working memory processes involved in monitoring ongoing changes to the environment.

  6. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provid

  7. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this paper we investigate a method for testing the mar assumption in the presence of other distributional constraints. We present methods to (approximately) compute a test statistic consisting of the ratio of two profile likelihood functions. This requires the optimization of the likelihood under no assumptions on the missingness mechanism, for which we use our recently proposed AI & M algorithm. We present experimental results on synthetic data that show that our approximate test statistic is a good indicator for whether data is mar relative to the given distributional assumptions.
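
    The ratio of two profile likelihoods mentioned above can be sketched as follows, with notation assumed rather than taken from the paper:

```latex
% Likelihood-ratio statistic for testing MAR: the numerator profiles the
% likelihood under the distributional constraints plus MAR; the denominator
% optimizes the likelihood under no assumption on the missingness mechanism.
\[
  \Lambda(D) \;=\;
  \frac{\sup_{\theta \,\in\, \Theta_{\mathrm{MAR}}} L(\theta \mid D)}
       {\sup_{\theta \,\in\, \Theta} L(\theta \mid D)} \;\le\; 1 ,
\]
% with values of Lambda near 1 indicating that the observed data are
% consistent with MAR relative to the given constraints.
```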

  8. Plausible scenarios for the radiography profession in Sweden in 2025.

    Science.gov (United States)

    Björkman, B; Fridell, K; Tavakol Olofsson, P

    2017-11-01

    Radiography is a healthcare speciality with many technical challenges. Advances in engineering and information technology applications may continue to drive and be driven by radiographers. The world of diagnostic imaging is changing rapidly and radiographers must be proactive in order to survive. To ensure sustainable development, organisations have to identify future opportunities and threats in a timely manner and incorporate them into their strategic planning. Hence, the aim of this study was to analyse and describe plausible scenarios for the radiography profession in 2025. The study has a qualitative design with an inductive approach based on focus group interviews. The interviews were inspired by the Scenario-Planning method. Of the seven trends identified in a previous study, the radiographers considered two as the most uncertain scenarios that would have the greatest impact on the profession should they occur. These trends, labelled "Access to career advancement" and "A sufficient number of radiographers", were inserted into the scenario cross. The resulting four plausible future scenarios were: The happy radiographer, the specialist radiographer, the dying profession and the assembly line. It is suggested that "The dying profession" scenario could probably be turned in the opposite direction by facilitating career development opportunities for radiographers within the profession. Changing the direction would probably lead to a profession composed of "happy radiographers" who are specialists, proud of their profession and competent to carry out advanced tasks, in contrast to being solely occupied by "the assembly line".

  9. Prebiotically plausible mechanisms increase compositional diversity of nucleic acid sequences.

    Science.gov (United States)

    Derr, Julien; Manapat, Michael L; Rajamani, Sudha; Leu, Kevin; Xulvi-Brunet, Ramon; Joseph, Isaac; Nowak, Martin A; Chen, Irene A

    2012-05-01

    During the origin of life, the biological information of nucleic acid polymers must have increased to encode functional molecules (the RNA world). Ribozymes tend to be compositionally unbiased, as is the vast majority of possible sequence space. However, ribonucleotides vary greatly in synthetic yield, reactivity and degradation rate, and their non-enzymatic polymerization results in compositionally biased sequences. While natural selection could lead to complex sequences, molecules with some activity are required to begin this process. Was the emergence of compositionally diverse sequences a matter of chance, or could prebiotically plausible reactions counter chemical biases to increase the probability of finding a ribozyme? Our in silico simulations using a two-letter alphabet show that template-directed ligation and high concatenation rates counter compositional bias and shift the pool toward longer sequences, permitting greater exploration of sequence space and stable folding. We verified experimentally that unbiased DNA sequences are more efficient templates for ligation, thus increasing the compositional diversity of the pool. Our work suggests that prebiotically plausible chemical mechanisms of nucleic acid polymerization and ligation could predispose toward a diverse pool of longer, potentially structured molecules. Such mechanisms could have set the stage for the appearance of functional activity very early in the emergence of life.
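
    A toy two-letter simulation in the spirit of the study is sketched below. It illustrates only one ingredient, namely that concatenating short, biased sequences pulls individual long-sequence compositions toward the pool mean; the selective advantage of unbiased templates in ligation is not modelled, and all rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
P_A = 0.8            # chemical bias: monomer 'A' adds 4x more readily than 'B'

def grow_by_monomers(n, length):
    """Sequences built one biased monomer at a time (non-enzymatic growth)."""
    return [''.join(rng.choice(['A', 'B'], size=length, p=[P_A, 1 - P_A]))
            for _ in range(n)]

def concatenate(pool, rounds):
    """Ligation-like concatenation: join random pairs. This preserves the
    pool-level composition but averages out per-sequence bias."""
    pool = list(pool)
    for _ in range(rounds):
        i, j = rng.choice(len(pool), size=2, replace=False)
        pool.append(pool[i] + pool[j])
    return pool

short = grow_by_monomers(200, 10)
long_pool = concatenate(short, 200)

frac_A = [s.count('A') / len(s) for s in long_pool]
# Longer concatenated sequences cluster near the pool mean composition,
# narrowing the per-sequence compositional distribution.
print(np.mean(frac_A), np.std(frac_A))
```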

  10. The Self in Guidance: Assumptions and Challenges.

    Science.gov (United States)

    Edwards, Richard; Payne, John

    1997-01-01

    Examines the assumptions of "self" made in the professional and managerial discourses of guidance. Suggests that these assumptions obstruct the capacity of guidance workers to explain their own practices. Drawing on contemporary debates over identity, modernity, and postmodernity, argues for a more explicit debate about the self in guidance. (RJM)

  11. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Directory of Open Access Journals (Sweden)

    Matt N. Williams

    2013-09-01

    Full Text Available In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in PARE. This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression assumptions". While Osborne and Waters' efforts in raising awareness of the need to check assumptions when using regression are laudable, we note that the original article contained at least two fairly important misconceptions about the assumptions of multiple regression: firstly, that multiple regression requires the assumption of normally distributed variables; and secondly, that measurement errors necessarily cause underestimation of simple regression coefficients. In this article, we clarify that multiple regression models estimated using ordinary least squares require the assumption of normally distributed errors in order for trustworthy inferences, at least in small samples, but not the assumption of normally distributed response or predictor variables. Secondly, we point out that regression coefficients in simple regression models will be biased (toward zero) estimates of the relationships between variables of interest when measurement error is uncorrelated across those variables, but that when correlated measurement error is present, regression coefficients may be either upwardly or downwardly biased. We conclude with a brief corrected summary of the assumptions of multiple regression when using ordinary least squares.
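
    A small simulation makes the corrected claim concrete: OLS point estimates and residual normality are unaffected by a deliberately non-normal predictor. The data-generating values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Predictor deliberately non-normal (exponential); errors normal.
x = rng.exponential(scale=1.0, size=n)
eps = rng.normal(scale=0.5, size=n)
y = 2.0 + 3.0 * x + eps

# OLS fit via least squares on the design matrix
X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat

print("coefficients:", beta_hat)   # close to (2, 3) despite the skewed x
print("residual skewness:",
      ((resid - resid.mean())**3).mean() / resid.std()**3)
# It is the residuals, not x or y, whose normality matters for small-sample
# inference; here they are approximately normal by construction.
```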

  12. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to leve

  13. New Cryptosystem Using Multiple Cryptographic Assumptions

    Directory of Open Access Journals (Sweden)

    E. S. Ismail

    2011-01-01

    Full Text Available Problem statement: A cryptosystem is a way for a sender and a receiver to communicate digitally, by which the sender can send the receiver any confidential or private message by first encrypting it using the receiver's public key. Upon receiving the encrypted message, the receiver can confirm the originality of the message's contents using his own secret key. Up to now, most existing cryptosystems have been developed based on a single cryptographic assumption, such as factoring, discrete logarithms, quadratic residues or the elliptic curve discrete logarithm. Although these schemes remain secure today, one day in the near future they may be broken if one finds a polynomial algorithm that can efficiently solve the underlying cryptographic assumption. Approach: With this motivation, we designed a new cryptosystem based on two cryptographic assumptions: quadratic residues and discrete logarithms. We integrated these two assumptions in our encrypting and decrypting equations so that the former depends on one public key whereas the latter depends on one corresponding secret key and two secret numbers. Each of the public and secret keys in our scheme determines the assumptions we use. Results: The newly developed cryptosystem is shown to be secure against the three commonly considered algebraic attacks using a heuristic security technique. The efficiency performance of our scheme requires 2Texp + 2Tmul + Thash time complexity for encryption and Texp + 2Tmul + Tsrt time complexity for decryption, and this magnitude of complexity is considered minimal for cryptosystems based on multiple cryptographic assumptions. Conclusion: The new cryptosystem based on multiple cryptographic assumptions offers a greater security level than schemes based on a single cryptographic assumption. The adversary has to solve the two assumptions simultaneously to recover the original message from the received corresponding encrypted message, but this is very unlikely to happen.

  14. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
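
    The continuous half of the search can be sketched with a minimal particle swarm optimizer, shown below. The fitness function is a stub (a one-step tanh self-prediction error standing in for the paper's RNN model), the hyperparameters are generic defaults, and the ACO structure search is omitted.

```python
import numpy as np

def fitness(weights, data):
    """Stub fitness: one-step self-prediction error of a tanh map standing
    in for the RNN model of regulatory dynamics."""
    return np.sum((data - np.tanh(data @ weights))**2)

def pso(data, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO over dim x dim interaction-weight matrices."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(-1, 1, size=(n_particles, dim, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p, data) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([fitness(p, data) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

# Toy usage: a 5-gene expression series, searching 5x5 interaction weights
data = np.random.default_rng(1).normal(size=(50, 5))
best_w, err = pso(data, dim=5)
print(err)
```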

  15. On the biological plausibility of Wind Turbine Syndrome.

    Science.gov (United States)

    Harrison, Robert V

    2015-01-01

    An emerging environmental health issue relates to potential ill-effects of wind turbine noise. There have been numerous suggestions that the low-frequency acoustic components in wind turbine signals can cause symptoms associated with vestibular system disorders, namely vertigo, nausea, and nystagmus. This constellation of symptoms has been labeled as Wind Turbine Syndrome, and has been identified in case studies of individuals living close to wind farms. This review discusses whether it is biologically plausible for the turbine noise to stimulate the vestibular parts of the inner ear and, by extension, cause Wind Turbine Syndrome. We consider the sound levels that can activate the semicircular canals or otolith end organs in normal subjects, as well as in those with preexisting conditions known to lower vestibular threshold to sound stimulation.

  16. Alkaloids from Pandanus amaryllifolius: Isolation and Their Plausible Biosynthetic Formation.

    Science.gov (United States)

    Tsai, Yu-Chi; Yu, Meng-Lun; El-Shazly, Mohamed; Beerhues, Ludger; Cheng, Yuan-Bin; Chen, Lei-Chin; Hwang, Tsong-Long; Chen, Hui-Fen; Chung, Yu-Ming; Hou, Ming-Feng; Wu, Yang-Chang; Chang, Fang-Rong

    2015-10-23

    Pandanus amaryllifolius Roxb. (Pandanaceae) is used as a flavor and in folk medicine in Southeast Asia. The ethanolic crude extract of the aerial parts of P. amaryllifolius exhibited antioxidant, antibiofilm, and anti-inflammatory activities in previous studies. In the current investigation, the purification of the ethanolic extract yielded nine new compounds, including N-acetylnorpandamarilactonines A (1) and B (2); pandalizines A (3) and B (4); pandanmenyamine (5); pandamarilactones 2 (6) and 3 (7), and 5(E)-pandamarilactonine-32 (8); and pandalactonine (9). The isolated alkaloids, with either a γ-alkylidene-α,β-unsaturated-γ-lactone or γ-alkylidene-α,β-unsaturated-γ-lactam system, can be classified into five skeletons including norpandamarilactonine, indolizinone, pandanamine, pandamarilactone, and pandamarilactonine. A plausible biosynthetic route toward 1-5, 7, and 9 is proposed.

  17. A Comparison of Closed World Assumptions

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    In this paper, we introduce a notion of the family of closed world assumptions and compare several well-known closed world approaches in the family with respect to the extent to which an incomplete database is completed.

  18. Some Considerations on the Basic Assumptions in Rotordynamics

    Science.gov (United States)

    GENTA, G.; DELPRETE, C.; BRUSA, E.

    1999-10-01

    The dynamic study of rotors is usually performed under a number of assumptions, namely small displacements and rotations, small unbalance and constant angular velocity. The latter assumption can be substituted by a known time history of the spin speed. The present paper develops a general non-linear model which can be used to study the rotordynamic behaviour of both fixed and free rotors without resorting to the mentioned assumptions, and compares the results obtained from a number of non-linear numerical simulations with those computed through the usual linearized approach. It is thus possible to verify that the validity of the rotordynamic models extends to situations in which fairly large unbalances and whirling motions are present and, above all, it is shown that the doubts raised about applying a model based on constant spin speed to the case of free rotors, in which the angular momentum is constant, have no ground. Rotordynamic models can thus be used to study the stability in the small of spinning spacecraft, and the insight obtained from the study of rotors is useful for understanding their attitude dynamics and its interactions with the vibration dynamics.

  19. Examining Computational Assumptions For Godiva IV

    Energy Technology Data Exchange (ETDEWEB)

    Kirkland, Alexander Matthew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jaegers, Peter James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-11

    Over the course of summer 2016, the effects of several computational modeling assumptions with respect to the Godiva IV reactor were examined. The majority of these assumptions pertained to modeling errors existing in the control rods and burst rod. The Monte Carlo neutron transport code, MCNP, was used to investigate these modeling changes, primarily by comparing them to that of the original input deck specifications.

  20. Slow Computing Simulation of Bio-plausible Control

    Science.gov (United States)

    2012-03-01

    …following discussion have been derived using a few key assumptions: occlusions are ignored, the set of discrete luminance functions is sampled… simulation framework that was provided by the Murray group, and the Grand Unified Fly (GUF) simulation framework developed by Dr. Andrew Straw… sensor inputs, parallelizing sensor processing, and rapidly responding. Thus, the parallel nature of the processors is key to eliciting a low power…

  1. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies.

  2. Liderazgo preventivo para la universidad. Una experiencia plausible [Preventive leadership for the university: a plausible experience]

    Directory of Open Access Journals (Sweden)

    Alejandro Rodríguez Rodríguez

    2015-06-01

    Full Text Available The development of leadership in higher education seeks immediately applicable solutions for the contexts in which every leader operates, but the theoretical and practical grounding of leader formation, which would make it possible to understand the intellective processes at work during decision-making, is diluted. The paradigm of convergence between the Lonerganian anthropological method, the Vygotskian learning community and a re-reading of the Salesian preventive system is presented as a plausible proposal for forming preventive leadership among the various actors of a university community. A case study of the Salesian University in Mexico, employing a mixed-methods research design, facilitates a re-reading of leadership from a preventive perspective as a possibility of convergence in an interdisciplinary dialogue. The theoretical and practical results proposed and examined prove to be a useful tool for evaluating, enriching and renewing theory about the leader and leadership development in universities facing a globalized society.

  3. A perspective on SIDS pathogenesis. The hypotheses: plausibility and evidence

    Directory of Open Access Journals (Sweden)

    Goldwater Paul N

    2011-05-01

    Full Text Available Abstract Several theories of the underlying mechanisms of Sudden Infant Death Syndrome (SIDS) have been proposed. These theories have borne relatively narrow beachhead research programs, attracting generous research funding sustained for many years at the expense of the public purse. This perspective endeavors to critically examine the evidence and bases of these theories and determine their plausibility, and questions whether or not a safe and reasoned hypothesis lies at their foundation. The Opinion sets specific criteria by asking the following questions: 1. Does the hypothesis take into account the key pathological findings in SIDS? 2. Is the hypothesis congruent with the key epidemiological risk factors? 3. Does it link 1 and 2? Falling short on any one of these answers would, by inference, imply insufficient grounds for a sustainable hypothesis. Some of the hypotheses overlap; for instance, notional respiratory failure may encompass apnea, prone sleep position, and asphyxia, which may be seen to be linked to co-sleeping. For the purposes of this paper, each element will be assessed on the above criteria.

  4. A plausible explanation for male dominance in typhoid ileal perforation

    Directory of Open Access Journals (Sweden)

    Khan M

    2012-11-01

    Full Text Available Mohammad Khan, Department of Microbiology, College of Medicine, Chichiri, Blantyre, Malawi. Abstract: The phenomenon of consistent male dominance in typhoid ileal perforation (TIP) is not well understood. It cannot be explained on the basis of microbial virulence, Peyer's patch anatomy, ileal wall thickness, gastric acidity, host genetic factors, or sex-linked bias in hospital attendance. The cytokine response to an intestinal infection in males is predominantly proinflammatory as compared with that in females, presumably due to differences in the sex hormonal milieu. Sex hormone receptors have been detected on lymphocytes and macrophages, including on Peyer's patches, inflammation of which (probably similar to the Shwartzman reaction/Koch phenomenon) is the forerunner of TIP, and is not excluded from the regulatory effects of sex hormones. Hormonal control of host-pathogen interaction may override genetic control. Environmental exposure to Salmonella typhi may be more frequent in males, presumably due to sex-linked differences in hygiene practices and dining-out behavior. A plausible explanation of male dominance in TIP could include sex-linked differences in the degree of natural exposure of Peyer's patches to S. typhi. An alternative explanation may include sexual dimorphism in host inflammatory response patterns in Peyer's patches that have been induced by S. typhi. Both hypotheses are testable. Keywords: explanation, dominance, male, perforation, ileum, typhoid

  5. A plausible explanation for male dominance in typhoid ileal perforation.

    Science.gov (United States)

    Khan, Mohammad

    2012-01-01

    The phenomenon of consistent male dominance in typhoid ileal perforation (TIP) is not well understood. It cannot be explained on the basis of microbial virulence, Peyer's patch anatomy, ileal wall thickness, gastric acidity, host genetic factors, or sex-linked bias in hospital attendance. The cytokine response to an intestinal infection in males is predominantly proinflammatory as compared with that in females, presumably due to differences in the sex hormonal milieu. Sex hormone receptors have been detected on lymphocytes and macrophages, including on Peyer's patches, inflammation of which (probably similar to the Shwartzman reaction/Koch phenomenon) is the forerunner of TIP, and is not excluded from the regulatory effects of sex hormones. Hormonal control of host-pathogen interaction may override genetic control. Environmental exposure to Salmonella typhi may be more frequent in males, presumably due to sex-linked differences in hygiene practices and dining-out behavior. A plausible explanation of male dominance in TIP could include sex-linked differences in the degree of natural exposure of Peyer's patches to S. typhi. An alternative explanation may include sexual dimorphism in host inflammatory response patterns in Peyer's patches that have been induced by S. typhi. Both hypotheses are testable.

  6. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  7. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    2008-01-01

    …thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps in different countries operating different economic and social models. Characterizing CMMI in this way opens the door to another question: are there other sets of organisational and management assumptions which would be better suited to other types of organisations operating in other cultural contexts?

  8. Plausible rice yield losses under future climate warming.

    Science.gov (United States)

    Zhao, Chuang; Piao, Shilong; Wang, Xuhui; Huang, Yao; Ciais, Philippe; Elliott, Joshua; Huang, Mengtian; Janssens, Ivan A; Li, Tao; Lian, Xu; Liu, Yongwen; Müller, Christoph; Peng, Shushi; Wang, Tao; Zeng, Zhenzhong; Peñuelas, Josep

    2016-12-19

    Rice is the staple food for more than 50% of the world's population [1-3]. Reliable prediction of changes in rice yield is thus central for maintaining global food security. This is an extraordinary challenge. Here, we compare the sensitivity of rice yield to temperature increase derived from field warming experiments and three modelling approaches: statistical models, local crop models and global gridded crop models. Field warming experiments produce a substantial rice yield loss under warming, with an average temperature sensitivity of -5.2 ± 1.4% K⁻¹. Local crop models give a similar sensitivity (-6.3 ± 0.4% K⁻¹), but statistical and global gridded crop models both suggest less negative impacts of warming on yields (-0.8 ± 0.3% and -2.4 ± 3.7% K⁻¹, respectively). Using data from field warming experiments, we further propose a conditional probability approach to constrain the large range of global gridded crop model results for the future yield changes in response to warming by the end of the century (from -1.3% to -9.3% K⁻¹). The constraint implies a more negative response to warming (-8.3 ± 1.4% K⁻¹) and reduces the spread of the model ensemble by 33%. This yield reduction exceeds that estimated by the International Food Policy Research Institute assessment (-4.2 to -6.4% K⁻¹) (ref. 4). Our study suggests that without CO2 fertilization, effective adaptation and genetic improvement, severe rice yield losses are plausible under intensive climate warming scenarios.

  9. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  11. Mexican-American Cultural Assumptions and Implications.

    Science.gov (United States)

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  12. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and facilit

  14. Culturally Biased Assumptions in Counseling Psychology

    Science.gov (United States)

    Pedersen, Paul B.

    2003-01-01

    Eight clusters of culturally biased assumptions are identified for further discussion from Leong and Ponterotto's (2003) article. The presence of cultural bias demonstrates that cultural bias is so robust and pervasive that it permeates the profession of counseling psychology, even including those articles that effectively attack cultural bias…

  15. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  16. Changing beliefs about implausible autobiographical events: a little plausibility goes a long way.

    Science.gov (United States)

    Mazzoni, G A; Loftus, E F; Kirsch, I

    2001-03-01

    Three experiments investigated the malleability of perceived plausibility and the subjective likelihood of occurrence of plausible and implausible events among participants who had no recollection of experiencing them. In Experiment 1, a plausibility-enhancing manipulation (reading accounts of the occurrence of events) combined with a personalized suggestion increased the perceived plausibility of the implausible event, as well as participants' ratings of the likelihood that they had experienced it. Plausibility and likelihood ratings were uncorrelated. Subsequent studies showed that the plausibility manipulation alone was sufficient to increase likelihood ratings but only if the accounts that participants read were set in a contemporary context. These data suggest that false autobiographical beliefs can be induced in clinical and forensic contexts even for initially implausible events.

  17. Biologically-Plausible Reactive Control of Mobile Robots

    OpenAIRE

    Rene, Zapata; Pascal, Lepinay

    2006-01-01

    This chapter addresses the problem of controlling the reactive behaviours of a mobile robot evolving in unstructured and dynamic environments. We have carried out successful experiments for determining the distance field of a mobile robot using two

  18. Modelo Century de dinâmica da matéria orgânica do solo: equações e pressupostos [Century model of soil organic matter dynamics: equations and assumptions]

    Directory of Open Access Journals (Sweden)

    Luiz Fernando Carvalho Leite

    2003-08-01

    Full Text Available The modeling of biological processes aims at planning land use, setting environmental standards and estimating the actual and potential risks of agricultural and environmental activities. Several models have been created in the last 25 years. Century is a mechanistic model that analyzes the long-term dynamics of soil organic matter and nutrients in the soil-plant system in several agroecosystems. The soil organic matter submodel comprises an active pool (microbial biomass and products), a slow pool (plant and microbial products, physically protected or biologically resistant to decomposition) and a passive pool (chemically recalcitrant or also physically protected), each with a different decomposition rate. First-order equations are used to model all soil organic matter pools, and soil temperature and moisture modify the decomposition rates. Recycling of the active pool and formation of the passive pool are controlled by the sand and clay contents of the soil, respectively. Plant residues are divided into pools depending on their lignin and nitrogen contents. Through the model, organic matter can be related to fertility levels and to current and future management, improving the understanding of nutrient transformations in soils of several agroecosystems.
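
    The first-order pool dynamics summarized above take the following generic form (standard Century notation assumed):

```latex
% First-order decomposition of soil organic matter pool i, with rate
% constant k_i modified by soil temperature (T) and moisture (W) factors:
\[
  \frac{dC_i}{dt} = -\, k_i \, A(T) \, M(W) \, C_i ,
  \qquad i \in \{\text{active}, \text{slow}, \text{passive}\},
\]
% where A(T) and M(W) scale decomposition between 0 and 1, and the
% active-pool turnover and passive-pool formation depend on soil sand
% and clay contents, respectively.
```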

  19. The OPERA hypothesis: assumptions and clarifications.

    Science.gov (United States)

    Patel, Aniruddh D

    2012-04-01

    Recent research suggests that musical training enhances the neural encoding of speech. Why would musical training have this effect? The OPERA hypothesis proposes an answer on the basis of the idea that musical training demands greater precision in certain aspects of auditory processing than does ordinary speech perception. This paper presents two assumptions underlying this idea, as well as two clarifications, and suggests directions for future research.

  20. On distributional assumptions and whitened cosine similarities

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Recently, an interpretation of the whitened cosine similarity measure as a Bayes decision rule was proposed (C. Liu, "The Bayes Decision Rule Induced Similarity Measures," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 1086-1090, June 2007). This communication makes the observation that some of the distributional assumptions made to derive this measure are very restrictive and, considered simultaneously, even inconsistent.
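
    For reference, the whitened cosine similarity in question can be sketched as follows; the covariance estimate and data below are illustrative, and the covariance is assumed full-rank.

```python
import numpy as np

def whitened_cosine(x, y, cov):
    """Cosine similarity after whitening both vectors with the inverse
    square root of a (shared, full-rank) covariance matrix."""
    evals, evecs = np.linalg.eigh(cov)
    W = evecs @ np.diag(evals**-0.5) @ evecs.T   # cov^{-1/2}
    xw, yw = W @ x, W @ y
    return xw @ yw / (np.linalg.norm(xw) * np.linalg.norm(yw))

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))
cov = np.cov(data, rowvar=False)
print(whitened_cosine(data[0], data[1], cov))
```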

  1. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  2. Closed World Assumption for Disjunctive Reasoning

    Institute of Scientific and Technical Information of China (English)

    WANG Kewen; ZHOU Lizhu

    2001-01-01

    In this paper, the relationship between argumentation and closed world reasoning for disjunctive information is studied. In particular, the authors propose a simple and intuitive generalization of the closed world assumption (CWA) for general disjunctive deductive databases (with default negation). This semantics, called DCWA, allows a natural argumentation-based interpretation and can be used to represent reasoning for disjunctive information. We compare DCWA with GCWA and prove that DCWA extends Minker's GCWA to the class of disjunctive databases with default negation. Also we compare our semantics with some related approaches. In addition, the computational complexity of DCWA is investigated.
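
    The contrast between the naive CWA and a GCWA-style rule on disjunctive information can be illustrated with a toy minimal-model computation, sketched below. This is not the DCWA semantics itself (which additionally handles default negation), just the classical motivating example.

```python
from itertools import combinations

atoms = {'p', 'q'}
clauses = [{'p', 'q'}]          # disjunctive database: p OR q

def satisfies(model, clauses):
    # A model satisfies a clause if it contains at least one of its atoms.
    return all(model & c for c in clauses)

models = [set(m) for r in range(len(atoms) + 1)
          for m in combinations(sorted(atoms), r)
          if satisfies(set(m), clauses)]
minimal = [m for m in models if not any(n < m for n in models)]

# Naive CWA: negate every atom not provable (i.e., not true in all models).
# Neither p nor q is provable, so CWA adds both negations, which is
# inconsistent with the clause p OR q.
cwa_negs = {a for a in atoms if not all(a in m for m in models)}

# GCWA-style rule: negate only atoms false in ALL minimal models.
gcwa_negs = {a for a in atoms if all(a not in m for m in minimal)}

print(minimal)      # [{'p'}, {'q'}]
print(cwa_negs)     # {'p', 'q'}  -> inconsistency under naive CWA
print(gcwa_negs)    # set()       -> GCWA remains consistent
```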

  3. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires use of multipole electrostatics and polarizability in molecular modeling.
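
    The point can be summarized with the standard multipole expansion of the electrostatic potential, the first term of which is the point-charge (monopole) approximation criticized above:

```latex
% Electrostatic potential of a charge distribution, expanded about the
% atom center: monopole (q) + dipole (p) + quadrupole (Q) + ...
\[
  V(\mathbf{r}) = \frac{1}{4\pi\varepsilon_0}
  \left[
    \frac{q}{r}
    + \frac{\mathbf{p}\cdot\hat{\mathbf{r}}}{r^{2}}
    + \frac{1}{2}\,\frac{\hat{\mathbf{r}}^{\mathsf T}\mathbf{Q}\,\hat{\mathbf{r}}}{r^{3}}
    + \cdots
  \right]
\]
% Truncating at the monopole term (one point charge per atom) cannot
% reproduce anisotropic features such as lone pairs or sigma-holes.
```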

  4. 39 Questionable Assumptions in Modern Physics

    Science.gov (United States)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  5. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships between plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  6. A biologically plausible learning rule for the Infomax on recurrent neural networks.

    Science.gov (United States)

    Hayakawa, Takashi; Kaneko, Takeshi; Aoyagi, Toshio

    2014-01-01

    A fundamental issue in neuroscience is to understand how neuronal circuits in the cerebral cortex play their functional roles through their characteristic firing activity. Several characteristics of spontaneous and sensory-evoked cortical activity have been reproduced by Infomax learning of neural networks in computational studies. There are, however, still few models of the underlying learning mechanisms that allow cortical circuits to maximize information and produce the characteristics of spontaneous and sensory-evoked cortical activity. In the present article, we derive a biologically plausible learning rule for the maximization of information retained through time in dynamics of simple recurrent neural networks. Applying the derived learning rule in a numerical simulation, we reproduce the characteristics of spontaneous and sensory-evoked cortical activity: cell-assembly-like repeats of precise firing sequences, neuronal avalanches, spontaneous replays of learned firing sequences and orientation selectivity observed in the primary visual cortex. We further discuss the similarity between the derived learning rule and the spike timing-dependent plasticity of cortical neurons.
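
    The learning rule derived in the paper is specific to recurrent dynamics and is not reproduced here; as a rough illustration of the Infomax family it belongs to, the sketch below applies the classic natural-gradient Infomax/ICA update (Bell-Sejnowski/Amari form) to a toy source-separation problem:

```python
import numpy as np

rng = np.random.default_rng(0)

def infomax_step(W, x, lr=0.01):
    """One natural-gradient Infomax update for super-Gaussian sources:
    dW = lr * (I - 2*tanh(u) u^T) W, with u = W x."""
    u = W @ x
    I = np.eye(W.shape[0])
    return W + lr * (I - 2.0 * np.outer(np.tanh(u), u)) @ W

# Toy demo: unmix two super-Gaussian (Laplacian) sources.
n, T = 2, 5000
S = rng.laplace(size=(n, T))     # hidden sources
A = rng.normal(size=(n, n))      # unknown mixing matrix
X = A @ S                        # observed mixtures
W = np.eye(n)
for t in range(T):
    W = infomax_step(W, X[:, t])
# If unmixing succeeded, W @ A should approach a scaled permutation matrix.
print(W @ A)
```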

  7. Catalyst Deactivation: Control Relevance of Model Assumptions

    Directory of Open Access Journals (Sweden)

    Bernt Lie

    2000-10-01

    Full Text Available Two principles for describing catalyst deactivation are discussed, one based on the deactivation mechanism, the other based on the activity and catalyst age distribution. When the model is based upon activity decay, it is common to use a mean activity developed from the steady-state residence time distribution. We compare control-relevant properties of such an approach with those of a model based upon the deactivation mechanism. Using a continuous stirred tank reactor as an example, we show that the mechanistic approach and the population balance approach lead to identical models. However, common additional assumptions used for activity-based models lead to model properties that may deviate considerably from the correct one.
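
    For the activity-based approach mentioned above, a minimal sketch (rate constant and residence time assumed for illustration) of the mean activity computed from the steady-state residence time distribution of a CSTR, checked against its closed form:

```python
import numpy as np

# Illustrative first-order deactivation in a CSTR (values assumed):
k_d = 0.2    # deactivation rate constant [1/h]
tau = 5.0    # mean residence time of catalyst in the reactor [h]

# Age-distribution (population balance) view: at steady state the catalyst
# age distribution in a CSTR is E(t) = exp(-t/tau)/tau, and each particle's
# activity decays as a(t) = exp(-k_d * t).
t = np.linspace(0.0, 200.0, 200001)
E = np.exp(-t / tau) / tau
a = np.exp(-k_d * t)
a_mean_numeric = np.trapz(a * E, t)

# Closed form of the same integral: a_mean = 1 / (1 + k_d * tau).
print(a_mean_numeric, 1.0 / (1.0 + k_d * tau))   # the two agree closely
```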

  8. Inference and Assumption in Historical Seismology

    Science.gov (United States)

    Musson, R. M. W.

    The principal aim in studies of historical earthquakes is usually to be able to derive parameters for past earthquakes from macroseismic or other data and thus extend back in time parametric earthquake catalogues, often with improved seismic hazard studies as the ultimate goal. In cases of relatively recent historical earthquakes, for example, those of the 18th and 19th centuries, it is often the case that there is such an abundance of available macroseismic data that estimating earthquake parameters is relatively straightforward. For earlier historical periods, especially medieval and earlier, and also for areas where settlement or documentation are sparse, the situation is much harder. The seismologist often finds that he has only a few data points (or even one) for an earthquake that nevertheless appears to be regionally significant. In such cases, it is natural that the investigator will attempt to make the most of the available data, expanding it by making working assumptions, and from these deriving conclusions by inference (i.e. the process of proceeding logically from some premise). This can be seen in a number of existing studies; in some cases extremely slight data are so magnified by the use of inference that one must regard the results as tentative in the extreme. Two main types of inference can be distinguished. The first type is inference from documentation. This is where assumptions are made such as: the absence of a report of the earthquake from this monastic chronicle indicates that at this locality the earthquake was not felt. The second type is inference from seismicity. Here one deals with arguments such as all recent earthquakes felt at town X are events occurring in seismic zone Y, therefore this ancient earthquake which is only reported at town X probably also occurred in this zone.

  9. Economic Growth Assumptions in Climate and Energy Policy

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2014-03-01

    Full Text Available The assumption that the economic growth seen in recent decades will continue has dominated the discussion of future greenhouse gas emissions and the mitigation of and adaptation to climate change. Given that long-term economic growth is uncertain, the impacts of a wide range of growth trajectories should be considered. In particular, slower economic growth would imply that future generations will be relatively less able to invest in emissions controls or adapt to the detrimental impacts of climate change. Taking into consideration the possibility of economic slowdown therefore heightens the urgency of reducing greenhouse gas emissions now by moving to renewable energy sources, even if this incurs short-term economic cost. I quantify this counterintuitive impact of economic growth assumptions on present-day policy decisions in a simple global economy-climate model (the Dynamic Integrated model of Climate and the Economy, DICE). In DICE, slow future growth increases the economically optimal present-day carbon tax rate and the utility of taxing carbon emissions, although the magnitude of the increase is sensitive to model parameters, including the rate of social time preference and the elasticity of the marginal utility of consumption. Future scenario development should specifically include low-growth scenarios, and the possibility of low-growth economic trajectories should be taken into account in climate policy analyses.
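
    The mechanism behind this counterintuitive result can be seen in the Ramsey discounting rule used by DICE-type models; a worked sketch with illustrative parameter values (close to, but not necessarily, the paper's settings):

```latex
% Ramsey rule: consumption discount rate r from the rate of social time
% preference rho, the elasticity of marginal utility eta, and growth g.
\[
  r \;=\; \rho + \eta\, g
\]
% With illustrative values rho = 1.5% and eta = 1.45:
%   g = 2%    gives  r = 1.5 + 1.45(2)   = 4.4%
%   g = 0.5%  gives  r = 1.5 + 1.45(0.5) = 2.2%
% The present value of damages a century ahead scales as e^{-100r}:
\[
  e^{-4.4} \approx 0.012
  \qquad\text{vs.}\qquad
  e^{-2.2} \approx 0.111,
\]
% so slower growth raises the weight on future damages almost tenfold,
% pushing the economically optimal present-day carbon tax upward.
```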

  10. Polycyclic aromatic hydrocarbons as plausible prebiotic membrane components.

    Science.gov (United States)

    Groen, Joost; Deamer, David W; Kros, Alexander; Ehrenfreund, Pascale

    2012-08-01

    Aromatic molecules delivered to the young Earth during the heavy bombardment phase in the early history of our solar system were likely to be among the most abundant and stable organic compounds available. The Aromatic World hypothesis suggests that aromatic molecules might function as container elements, energy transduction elements and templating genetic components for early life forms. To investigate the possible role of aromatic molecules as container elements, we incorporated different polycyclic aromatic hydrocarbons (PAH) in the membranes of fatty acid vesicles. The goal was to determine whether PAH could function as a stabilizing agent, similar to the role that cholesterol plays in membranes today. We studied vesicle size distribution, critical vesicle concentration and permeability of the bilayers using C6-C10 fatty acids mixed with amphiphilic PAH derivatives such as 1-hydroxypyrene, 9-anthracene carboxylic acid and 1,4-chrysene quinone. Dynamic Light Scattering (DLS) spectroscopy was used to measure the size distribution of vesicles and incorporation of PAH species was established by phase-contrast and epifluorescence microscopy. We employed conductimetric titration to determine the minimal concentration at which fatty acids could form stable vesicles in the presence of PAHs. We found that oxidized PAH derivatives can be incorporated into decanoic acid (DA) vesicle bilayers in mole ratios up to 1:10 (PAH:DA). Vesicle size distribution and critical vesicle concentration were largely unaffected by PAH incorporation, but 1-hydroxypyrene and 9-anthracene carboxylic acid lowered the permeability of fatty acid bilayers to small solutes up to 4-fold. These data represent the first indication of a cholesterol-like stabilizing effect of oxidized PAH derivatives in a simulated prebiotic membrane.

  11. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation

    National Research Council Canada - National Science Library

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility...

  12. Choosing diverse sets of plausible scenarios in multidimensional exploratory futures techniques

    NARCIS (Netherlands)

    Lord, Steven; Helfgott, Ariella; Vervoort, Joost M.

    2016-01-01

    Morphological analysis allows any number of dimensions to be retained when framing future conditions, and techniques within morphological analysis determine which combinations of those dimensions represent plausible futures. However, even a relatively low number of dimensions in future cond…

  13. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. In discovering the physical universe, additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition, and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, many important variables of economic systems cannot be observable and additive in principle. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed since George Herbert Mead and Herbert Blumer. In the statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values). To accommodate the variety of such phenomena we should avoid additivity of basic labels and construct a new probabilistic method in business intelligence based on non-Archimedean probabilities.

  14. Using critical evaluation to reappraise plausibility judgments: A critical cognitive component of conceptual change

    Science.gov (United States)

    Lombardi, D.

    2011-12-01

    Plausibility judgments, although well represented in conceptual change theories (see, for example, Chi, 2005; diSessa, 1993; Dole & Sinatra, 1998; Posner et al., 1982), have received little empirical attention until our recent work investigating teachers' and students' understanding of and perceptions about human-induced climate change (Lombardi & Sinatra, 2010, 2011). In our first study with undergraduate students, we found that greater plausibility perceptions of human-induced climate change accounted for significantly greater understanding of weather and climate distinctions after instruction, even after accounting for students' prior knowledge (Lombardi & Sinatra, 2010). In a follow-up study with inservice science and preservice elementary teachers, we showed that anger about the topic of climate change and teaching about climate change was significantly related to implausible perceptions about human-induced climate change (Lombardi & Sinatra, 2011). Results from our recent studies helped to inform our development of a model of the role of plausibility judgments in conceptual change situations. The model applies to situations involving cognitive dissonance, where background knowledge conflicts with an incoming message. In such situations, we define plausibility as a judgment on the relative potential truthfulness of incoming information compared to one's existing mental representations (Rescher, 1976). Students may not consciously think when making plausibility judgments, expending only minimal mental effort in what is referred to as an automatic cognitive process (Stanovich, 2009). However, well-designed instruction could facilitate students' reappraisal of plausibility judgments in more effortful and conscious cognitive processing. Critical evaluation specifically may be one effective method to promote plausibility reappraisal in a classroom setting (Lombardi & Sinatra, in progress). In science education, critical evaluation involves the analysis of how evidentiary…

  15. Interactions between visual and motor areas during the recognition of plausible actions as revealed by magnetoencephalography.

    Science.gov (United States)

    Pavlidou, Anastasia; Schnitzler, Alfons; Lange, Joachim

    2014-02-01

    Several studies have shown activation of the mirror neuron system (MNS), comprising the temporal, posterior parietal, and sensorimotor areas, when observing plausible actions, but far less is known about how these cortical areas interact during the recognition of a plausible action. Here, we recorded neural activity with magnetoencephalography while subjects viewed point-light displays of biologically plausible and scrambled versions of actions. We were interested in modulations of oscillatory activity and, specifically, in coupling of oscillatory activity between visual and motor areas. Both plausible and scrambled actions elicited modulations of θ (5-7 Hz), α (7-13 Hz), β (13-35 Hz), and γ (55-100 Hz) power within visual and motor areas. When comparing between the two actions, we observed sequential and spatially distinct increases of γ (∼65 Hz), β (∼25 Hz), and α (∼11 Hz) power between 0.5 and 1.3 s in parieto-occipital, sensorimotor, and left temporal areas. In addition, significant clusters of γ (∼65 Hz) and α/β (∼15 Hz) power decrease were observed in right temporal and parieto-occipital areas between 1.3 and 2.0 s. We found β-power in sensorimotor areas to be positively correlated on a trial-by-trial basis with parieto-occipital γ and left temporal α-power for the plausible but not for the scrambled condition. These results provide new insights into the oscillatory activity of the areas involved in the recognition of plausible action movements and their interaction. The power correlations between specific areas underscore the importance of interactions between visual and motor areas of the MNS during the recognition of a plausible action.
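
    As a hedged illustration of the trial-by-trial power-correlation analysis described above (synthetic signals and made-up band parameters, not the authors' MEG pipeline):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(8)
fs, n_trials, n_samp = 500, 100, 1000   # assumed sampling rate and trial setup

def band_power(x, low, high):
    """Trial-wise mean power in a band: band-pass filter + Hilbert envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, x, axis=-1), axis=-1))
    return (env ** 2).mean(axis=-1)

# Synthetic "sensorimotor" and "parieto-occipital" signals sharing a
# trial-by-trial amplitude modulation, standing in for the reported coupling.
t = np.arange(n_samp) / fs
drive = rng.normal(size=(n_trials, 1))
sensorimotor = (1 + 0.5 * drive) * np.sin(2 * np.pi * 25 * t) \
    + 0.5 * rng.normal(size=(n_trials, n_samp))
parieto = (1 + 0.5 * drive) * np.sin(2 * np.pi * 65 * t) \
    + 0.5 * rng.normal(size=(n_trials, n_samp))

beta = band_power(sensorimotor, 13, 35)
gamma = band_power(parieto, 55, 100)
print(np.corrcoef(beta, gamma)[0, 1])   # positive trial-by-trial correlation
```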

  16. Roy's specific life values and the philosophical assumption of humanism.

    Science.gov (United States)

    Hanna, Debra R

    2013-01-01

    Roy's philosophical assumption of humanism, which is shaped by the veritivity assumption, is considered in terms of her specific life values and in contrast to the contemporary view of humanism. Like veritivity, Roy's philosophical assumption of humanism unites a theocentric focus with anthropological values. Roy's perspective enriches the mainly secular, anthropocentric assumption. In this manuscript, the basis for Roy's perspective of humanism will be discussed so that readers will be able to use the Roy adaptation model in an authentic manner.

  17. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, Rampal S.; Alonso, David; McKane, Alan J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the…

  19. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  1. Stereotyping to infer group membership creates plausible deniability for prejudice-based aggression.

    Science.gov (United States)

    Cox, William T L; Devine, Patricia G

    2014-02-01

    In the present study, participants administered painful electric shocks to an unseen male opponent who was either explicitly labeled as gay or stereotypically implied to be gay. Identifying the opponent with a gay-stereotypic attribute produced a situation in which the target's group status was privately inferred but plausibly deniable to others. To test the plausible deniability hypothesis, we examined aggression levels as a function of internal (personal) and external (social) motivation to respond without prejudice. Whether plausible deniability was present or absent, participants high in internal motivation aggressed at low levels, and participants low in both internal and external motivation aggressed at high levels. The behavior of participants low in internal and high in external motivation, however, depended on experimental condition. They aggressed at low levels when observers could plausibly attribute their behavior to prejudice and aggressed at high levels when the situation granted plausible deniability. This work has implications for both obstacles to and potential avenues for prejudice-reduction efforts.

  2. Inference and Plausible Reasoning in a Natural Language Understanding System Based on Object-Oriented Semantics

    CERN Document Server

    Ostapov, Yuriy

    2012-01-01

    Algorithms of inference in a computer system oriented to the input and semantic processing of text information are presented. Such inference is necessary for logical questions when the direct comparison of objects from a question and the database cannot give a result. The following classes of problems are considered: checking hypotheses about persons and non-typical actions, determining persons and circumstances for non-typical actions, planning actions, and determining the causes of events and the states of persons. To form an answer, both deduction and plausible reasoning are used. As the knowledge domain under consideration is the social behavior of persons, plausible reasoning is based on laws of social psychology. The proposed algorithms of inference and plausible reasoning can be realized in computer systems closely connected with text processing (criminology, business operations, medicine, document systems).

  3. Biologically plausible and evidence-based risk intervals in immunization safety research.

    Science.gov (United States)

    Rowhani-Rahbar, Ali; Klein, Nicola P; Dekker, Cornelia L; Edwards, Kathryn M; Marchant, Colin D; Vellozzi, Claudia; Fireman, Bruce; Sejvar, James J; Halsey, Neal A; Baxter, Roger

    2012-12-17

    In immunization safety research, individuals are considered at risk for the development of certain adverse events following immunization (AEFI) within a specific period of time referred to as the risk interval. These intervals should ideally be determined based on biologic plausibility considering features of the AEFI, presumed or known pathologic mechanism, and the vaccine. Misspecification of the length and timing of these intervals may result in introducing bias in epidemiologic and clinical studies of immunization safety. To date, little work has been done to formally assess and determine biologically plausible and evidence-based risk intervals in immunization safety research. In this report, we present a systematic process to define biologically plausible and evidence-based risk interval estimates for two specific AEFIs, febrile seizures and acute disseminated encephalomyelitis. In addition, we review methodologic issues related to the determination of risk intervals for consideration in future studies of immunization safety.

  4. The semiosis of prayer and the creation of plausible fictional worlds

    Directory of Open Access Journals (Sweden)

    J. Peter Södergård

    1999-01-01

    Full Text Available Prayer and incantation can perhaps be said to be 'mechanisms' that promise that lack will be liquidated and that there is an unlimited signator, a father, or some other metaphysical creature, standing behind and legitimizing the discourse: a way of communicating with the Unlimited that is privileged by an interpretive community that reads the prayers aloud and enacts the magical stage-scripts. These highly overlapping categories function as one of the most common subforms of religious discourse for the creation, actualization and maintenance of plausible fictional worlds. They are liminal and transitional mechanisms that manipulate an empirical reader to phase-shift from an actual world to a plausible one, by being inscribed in a possible and fictional world, thus creating a model reader that perceives and acts according to the plausible world outlined by a given interpretive community, and that hears god talking in voces magicae and in god-speaking silence.

  5. Stochastic Vortex Dynamics in Two-Dimensional Easy Plane Ferromagnets: Multiplicative Versus Additive Noise

    Energy Technology Data Exchange (ETDEWEB)

    Kamppeter, T.; Mertens, F.G.; Moro, E.; Sanchez, A.; Bishop, A.R.

    1998-09-01

    We study how thermal fluctuations affect the dynamics of vortices in the two-dimensional anisotropic Heisenberg model depending on their additive or multiplicative character. Using a collective coordinate theory, we analytically show that multiplicative noise, arising from fluctuations in the local field term of the Landau-Lifshitz equations, and Langevin-like additive noise have the same effect on vortex dynamics (within a very plausible assumption consistent with the collective coordinate approach). This is a highly non-trivial result as multiplicative and additive noises usually modify the dynamics in very different ways. We also carry out numerical simulations of both versions of the model finding that they indeed give rise to very similar vortex dynamics.
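
    A generic toy comparison of the two noise types (Euler-Maruyama on a linear test equation; the paper's actual model is the stochastic Landau-Lifshitz equation treated with collective coordinates):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(x0=1.0, mult=False, dt=1e-3, T=10.0, sigma=0.3):
    """Euler-Maruyama for dx = -x dt + sigma * g(x) dW,
    with g(x) = 1 (additive) or g(x) = x (multiplicative)."""
    n = int(T / dt)
    x = np.empty(n); x[0] = x0
    for i in range(1, n):
        dW = rng.normal(0.0, np.sqrt(dt))
        g = x[i - 1] if mult else 1.0
        x[i] = x[i - 1] - x[i - 1] * dt + sigma * g * dW
    return x

add = simulate(mult=False)
mul = simulate(mult=True)
# In general the two noise types produce different statistics (the
# multiplicative trajectory here stays positive, the additive one does
# not); the cited result is that, for vortex dynamics within the
# collective-coordinate approximation, the two turn out to have the same
# effect, which is what makes the finding non-trivial.
print(add.min(), mul.min())
```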

  6. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  8. Plausible Explanation of Quantization of Intrinsic Redshift from Hall Effect and Weyl Quantization

    Directory of Open Access Journals (Sweden)

    Smarandache F.

    2006-10-01

    Full Text Available Using the phion condensate model as described by Moffat [1], we consider a plausible explanation of (Tifft) intrinsic redshift quantization as described by Bell [6] as a result of the Hall effect in a rotating frame. We also discuss another alternative to explain redshift quantization from the viewpoint of Weyl quantization, which could yield Bohr-Sommerfeld quantization.

  9. Analysis of Plausible Reasoning

    Institute of Scientific and Technical Information of China (English)

    连四清; 方运加

    2012-01-01

    After Pólya's model of "plausible reasoning" was introduced into China's mathematics curriculum standards, it became a keyword of mathematics education research in China. However, its scientific soundness still requires examination: (1) the Chinese rendering of "plausible reasoning" is ambiguous in meaning; (2) the model does not satisfy the objectivity requirement of a reasoning schema and has obvious defects; (3) overemphasizing the model of plausible reasoning amounts to overemphasizing the difference between inductive and deductive inference, and tends to sever the relationship between them.

  10. Legal assumptions for private company claim for additional (supplementary payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    Full Text Available The subject matter of analysis in this article is the legal assumptions which must be met in order to enable a private company to call for additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payments in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting which creates individual obligations for additional payments. The third assumption is defined as distinctness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for the realization of the company's right to claim additional payments from a member of the private company.

  11. A computational model to investigate assumptions in the headturn preference procedure

    Directory of Open Access Journals (Sweden)

    Christina Bergmann

    2013-10-01

    Full Text Available In this paper we use a computational model to investigate four assumptions that are tacitly present in interpreting the results of studies on infants' speech processing abilities using the Headturn Preference Procedure (HPP): (1) behavioural differences originate in different processing; (2) processing involves some form of recognition; (3) words are segmented from connected speech; and (4) differences between infants should not affect overall results. In addition, we investigate the impact of two potentially important aspects in the design and execution of the experiments: (a) the specific voices used in the two parts of HPP experiments (familiarisation and test) and (b) the experimenter's criterion for what is a sufficient headturn angle. The model is designed to maximise cognitive plausibility. It takes real speech as input, and it contains a module that converts the output of internal speech processing and recognition into headturns that can yield real-time listening preference measurements. Internal processing is based on distributed episodic representations in combination with a matching procedure based on the assumption that complex episodes can be decomposed as positive weighted sums of simpler constituents. Model simulations show that the first two assumptions hold under two different definitions of recognition. However, explicit segmentation is not necessary to simulate the behaviours observed in infant studies. Differences in attention span between infants can affect the outcomes of an experiment. The same holds for the experimenter's decision criterion. The speakers used in experiments affect outcomes in complex ways that require further investigation. The paper ends with recommendations for future studies using the HPP.
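
    As a hedged illustration of the matching idea, not the authors' implementation (data and dimensions invented), decomposing a test episode as a positive weighted sum of stored constituents can be phrased as non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# Hypothetical episodic memory: columns are stored constituent episodes
# (e.g., acoustic feature vectors); all values illustrative.
stored = rng.random((50, 8))            # 8 stored constituents, 50 features

# A familiar test episode is a positive mixture of stored constituents;
# a novel episode is unrelated to the memory contents.
familiar = 0.7 * stored[:, 1] + 0.3 * stored[:, 4]
novel = rng.random(50)

for episode in (familiar, novel):
    weights, residual = nnls(stored, episode)
    # A low residual means the episode is well explained as a positive
    # weighted sum of stored constituents, i.e. it is "recognized".
    print(round(residual, 3))
```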

  12. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  13. Co-Dependency: An Examination of Underlying Assumptions.

    Science.gov (United States)

    Myer, Rick A.; And Others

    1991-01-01

    Discusses need for careful examination of codependency as diagnostic category. Critically examines assumptions that codependency is disease, addiction, or predetermined by the environment. Discusses implications of assumptions. Offers recommendations for mental health counselors focusing on need for systematic research, redirection of efforts to…

  14. Co-Dependency: An Examination of Underlying Assumptions.

    Science.gov (United States)

    Myer, Rick A.; And Others

    1991-01-01

    Discusses need for careful examination of codependency as diagnostic category. Critically examines assumptions that codependency is disease, addiction, or predetermined by the environment. Discusses implications of assumptions. Offers recommendations for mental health counselors focusing on need for systematic research, redirection of efforts to…

  15. 29 CFR 4231.10 - Actuarial calculations and assumptions.

    Science.gov (United States)

    2010-07-01

    ... MULTIEMPLOYER PLANS § 4231.10 Actuarial calculations and assumptions. (a) Most recent valuation. All calculations required by this part must be based on the most recent actuarial valuation as of the date of... this part must be based on methods and assumptions that are reasonable in the aggregate, based on...

  16. Special Theory of Relativity without special assumptions and tachyonic motion

    Directory of Open Access Journals (Sweden)

    E. Kapuścik

    2010-01-01

    Full Text Available The most general form of the transformations of space-time coordinates in the Special Theory of Relativity, based solely on physical assumptions, is described. Only the linearity of the space-time transformations and the constancy of the speed of light are used as assumptions. The application to tachyonic motion is indicated.
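
    A sketch of how far those two assumptions alone go, in the standard textbook style (not necessarily the paper's route):

```latex
% Assume only linearity:  x' = Ax + Bt,   t' = Cx + Dt,
% and that light rays x = +-ct map to light rays x' = +-ct'.
% Substituting both signs gives
\[
  cA + B = c^{2}C + cD, \qquad -cA + B = c^{2}C - cD,
\]
% so B = c^{2}C and A = D. Writing A = \gamma and B = -\gamma v
% (so that the primed origin x' = 0 moves as x = vt) yields
\[
  x' = \gamma\,(x - vt), \qquad t' = \gamma\!\left(t - \frac{v}{c^{2}}\,x\right),
\]
% and requiring the inverse transformation to have the same form with
% v -> -v fixes \gamma = (1 - v^{2}/c^{2})^{-1/2}: the Lorentz transformation.
```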

  17. 40 CFR 761.2 - PCB concentration assumptions for use.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false PCB concentration assumptions for use..., AND USE PROHIBITIONS General § 761.2 PCB concentration assumptions for use. (a)(1) Any person may..., oil-filled cable, and rectifiers whose PCB concentration is not established contain PCBs at < 50 ppm...

  18. Plausibility of the implausible: is it possible that ultra-high dilutions ‘without biological activity’ cause adverse effects?

    Directory of Open Access Journals (Sweden)

    Marcus Zulian Teixeira

    2013-06-01

    Full Text Available Dear Editor, The homeopathic scientific model suffers constant criticism because it employs assumptions that differ from, and are antagonistic to, the conventional scientific model, despite a growing body of studies confirming its premises [1-3]. The preferred target of the critics and skeptics is the curative principle of similitude ('like cures like') and the use of ultra-high dilutions (dynamized medicines). While the principle of similitude is scientifically grounded in the rebound effect (paradoxical reaction) of conventional drugs [4,5], with its therapeutic application recently proposed by modern pharmacology ('paradoxical pharmacology') [6-8], several studies show clinical, biological and physico-chemical activities of ultra-high dilutions in experimental models [9]. Despite this evidence, many skeptics question the 'plausibility' of the homeopathic model. Disregarding the biological effect of homeopathic medicines, they gathered in public squares in different countries with the purpose of ingesting large doses of these 'implausible' ultra-high diluted drugs to show that nothing would happen, because the drugs would not have the power to cause adverse events as conventional drugs do. Although they did not notify any disorder after this massive ingestion of dynamized homeopathic medicines, a recent systematic review suggests that they must have suffered serious consequences, as we have suggested in the past [10]. In order to counteract the widespread idea that homeopathy 'is safe to use', Posadzki et al. [11] conducted a systematic review to critically evaluate the evidence regarding the adverse effects (AEs) of homeopathy described in published case reports and case series. Of a total of 38 reports analyzed, 30 pertained to direct AEs of homeopathic medicines, encompassing 1142 patients submitted to various medicines and forms of treatment (mostly complex homeopathic medicines in low potencies), reporting that "in 94…

  19. Acquiring Plausible Predications from MEDLINE by Clustering MeSH Annotations.

    Science.gov (United States)

    Miñarro-Giménez, Jose Antonio; Kreuzthaler, Markus; Bernhardt-Melischnig, Johannes; Martínez-Costa, Catalina; Schulz, Stefan

    2015-01-01

    The massive accumulation of biomedical knowledge is reflected by the growth of the literature database MEDLINE with over 23 million bibliographic records. All records are manually indexed by MeSH descriptors, many of them refined by MeSH subheadings. We use subheading information to cluster types of MeSH descriptor co-occurrences in MEDLINE by processing co-occurrence information provided by the UMLS. The goal is to infer plausible predicates to each resulting cluster. In an initial experiment this was done by grouping disease-pharmacologic substance co-occurrences into six clusters. Then, a domain expert manually performed the assignment of meaningful predicates to the clusters. The mean accuracy of the best ten generated biomedical facts of each cluster was 85%. This result supports the evidence of the potential of MeSH subheadings for extracting plausible medical predications from MEDLINE.
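
    A toy sketch of the clustering step (synthetic counts and subheading names chosen for illustration; the study clustered real UMLS co-occurrence data into six clusters):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical toy data: each row is a disease-substance co-occurrence,
# each column counts how often a MeSH subheading qualifies it.
subheadings = ["drug therapy", "chemically induced", "adverse effects", "prevention"]
rng = np.random.default_rng(3)
X = rng.poisson(lam=[8, 1, 1, 2], size=(100, 4)).astype(float)   # "treats"-like
Y = rng.poisson(lam=[1, 9, 6, 1], size=(100, 4)).astype(float)   # "causes"-like
data = np.vstack([X, Y])
data /= data.sum(axis=1, keepdims=True)      # normalize subheading profiles

# Cluster the co-occurrence profiles; a domain expert would then assign a
# plausible predicate ("treats", "causes", ...) to each resulting cluster.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
print(labels[:5], labels[-5:])   # the two synthetic profile types separate
```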

  20. Spelling in oral deaf and hearing dyslexic children: A comparison of phonologically plausible errors.

    Science.gov (United States)

    Roy, P; Shergold, Z; Kyle, F E; Herman, R

    2014-11-01

    A written single-word spelling-to-dictation test and a single-word reading test were given to 68 severely to profoundly oral deaf 10-11-year-old children and 20 hearing children with a diagnosis of dyslexia. The literacy scores of the deaf children and the hearing children with dyslexia were lower than expected for children of their age and did not differ from each other. Three quarters of the spelling errors of the hearing children with dyslexia, compared with just over half the errors of the oral deaf group, were phonologically plausible. Expressive vocabulary and speech intelligibility predicted the percentage of phonologically plausible errors in the deaf group only. Implications of the findings for the phonological decoding self-teaching model and for supporting literacy development are discussed.

  1. On the plausible association between environmental conditions and human eye damage.

    Science.gov (United States)

    Feretis, Elias; Theodorakopoulos, Panagiotis; Varotsos, Costas; Efstathiou, Maria; Tzanis, Christos; Xirou, Tzina; Alexandridou, Nancy; Aggelou, Michael

    2002-01-01

    The increase in solar ultraviolet radiation can have various direct and indirect effects on human health, such as the incidence of ocular damage. Data on eye damage in residents of three suburban regions in Greece and in two groups of monks/nuns and fishermen are examined here. The statistical analysis performed on these data provides new information about the plausible association between increased levels of solar ultraviolet radiation, air pollution at ground level, and the development of ocular defects.

  2. Families of Plausible Solutions to the Puzzle of Boyajian’s Star

    Science.gov (United States)

    Wright, Jason T.; Sigurðsson, Steinn

    2016-09-01

    Good explanations for the unusual light curve of Boyajian's Star have been hard to find. Recent results by Montet & Simon lend strength and plausibility to the conclusion of Schaefer that in addition to short-term dimmings, the star also experiences large, secular decreases in brightness on decadal timescales. This, combined with a lack of long-wavelength excess in the star's spectral energy distribution, strongly constrains scenarios involving circumstellar material, including hypotheses invoking a spherical cloud of artifacts. We show that the timings of the deepest dimmings appear consistent with being randomly distributed, and that the star's reddening and narrow sodium absorption is consistent with the total, long-term dimming observed. Following Montet & Simon's encouragement to generate alternative hypotheses, we attempt to circumscribe the space of possible explanations with a range of plausibilities, including: a cloud in the outer solar system, structure in the interstellar medium (ISM), natural and artificial material orbiting Boyajian's Star, an intervening object with a large disk, and variations in Boyajian's Star itself. We find the ISM and intervening disk models more plausible than the other natural models.
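
    The randomness claim about the dip timings can be checked with a one-sample Kolmogorov-Smirnov test; a sketch with invented epochs (the paper's actual dip times differ):

```python
import numpy as np
from scipy import stats

# Hypothetical dip times (days since start of monitoring); illustrative only.
dip_times = np.array([140.0, 260.0, 359.0, 426.0, 793.0, 1206.0, 1496.0, 1519.0])
span = 1600.0   # assumed observation window, days

# Kolmogorov-Smirnov test against a uniform distribution over the window:
# a large p-value means the timings are consistent with being randomly
# distributed, the kind of check the paper applies to the deepest dimmings.
stat, p = stats.kstest(dip_times / span, "uniform")
print(stat, p)
```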

  3. What happened (and what didn't): Discourse constraints on encoding of plausible alternatives.

    Science.gov (United States)

    Fraundorf, Scott H; Benjamin, Aaron S; Watson, Duane G

    2013-10-01

    Three experiments investigated how font emphasis influences reading and remembering discourse. Although past work suggests that contrastive pitch contours benefit memory by promoting encoding of salient alternatives, it is unclear both whether this effect generalizes to other forms of linguistic prominence and how the set of alternatives is constrained. Participants read discourses in which some true propositions had salient alternatives (e.g., British scientists found the endangered monkey when the discourse also mentioned French scientists) and completed a recognition memory test. In Experiments 1 and 2, font emphasis in the initial presentation increased participants' ability to later reject false statements about salient alternatives but not about unmentioned items (e.g., Portuguese scientists). In Experiment 3, font emphasis helped reject false statements about plausible alternatives, but not about less plausible alternatives that were nevertheless established in the discourse. These results suggest readers encode a narrow set of only those alternatives plausible in the particular discourse. They also indicate that multiple manipulations of linguistic prominence, not just prosody, can lead to consideration of alternatives.

  4. A biologically plausible model of time-scale invariant interval timing.

    Science.gov (United States)

    Almeida, Rita; Ledberg, Anders

    2010-02-01

    The temporal durations between events often exert a strong influence over behavior. The details of this influence have been extensively characterized in behavioral experiments in different animal species. A remarkable feature of the data collected in these experiments is that they are often time-scale invariant. This means that response measurements obtained under intervals of different durations coincide when plotted as functions of relative time. Here we describe a biologically plausible model of an interval timing device and show that it is consistent with time-scale invariant behavior over a substantial range of interval durations. The model consists of a set of bistable units that switch from one state to the other at random times. We first use an abstract formulation of the model to derive exact expressions for some key quantities and to demonstrate time-scale invariance for any range of interval durations. We then show how the model could be implemented in the nervous system through a generic and biologically plausible mechanism. In particular, we show that any system that can display noise-driven transitions from one stable state to another can be used to implement the timing device. Our work demonstrates that a biologically plausible model can qualitatively account for a large body of data and thus provides a link between the biology and behavior of interval timing.
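
    A minimal sketch of the mechanism (unit count, switching rate, and read-out rule are illustrative assumptions, not the paper's exact formulation), showing the time-scale invariance numerically:

```python
import numpy as np

rng = np.random.default_rng(4)

def timer_readout(n_units=500, rate=1.0, t_target=2.0, trials=2000):
    """Fraction of bistable units that have switched by t_target when each
    unit flips once after an exponentially distributed waiting time."""
    switch_times = rng.exponential(1.0 / rate, size=(trials, n_units))
    return (switch_times < t_target).mean(axis=1)

# Scaling the switching rate by k while scaling the interval by 1/k leaves
# the read-out distribution unchanged: responses coincide in *relative*
# time, which is the time-scale invariance highlighted in the paper.
a = timer_readout(rate=1.0, t_target=2.0)
b = timer_readout(rate=4.0, t_target=0.5)
print(a.mean(), a.std(), b.mean(), b.std())   # closely matching statistics
```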

  5. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    R.E. Sweeney

    2001-02-08

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update, incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), the 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance.

  6. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  7. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    In recent years, increasing interest has been devoted to IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based review of the relevant literature. Most contributions fall within one discourse, and few contributions represent the three remaining discourses, which unjustifiably leaves out issues that research could and most probably should investigate. In order to highlight research potentials, limitations, and underlying assumptions of each discourse, we develop four IT PPM metaphors, which can be used to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations' IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases

  8. Different Random Distributions Research on Logistic-Based Sample Assumption

    Directory of Open Access Journals (Sweden)

    Jing Pan

    2014-01-01

    Full Text Available A logistic-based sample assumption is proposed in this paper, together with a study of different random distributions within this system. The paper provides an assumption system for logistic-based samples, including its sample space structure. Moreover, the influence of different input random distributions has been studied through this logistic-based sample assumption system. Three different random distributions (normal, uniform, and beta) are used for testing. The experimental simulations illustrate the relationship between inputs and outputs under the different random distributions. Numerical analysis further indicates that the distribution of the outputs depends to some extent on that of the inputs, and that this assumption system is not an independent-increment process but is quasistationary.

  9. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    Swarup Mohalik; R Ramanujam

    2002-04-01

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the commitments offered by the other at that state. We model examples like reliable bit transmission and sequence transmission protocols in this framework and discuss how assumption-commitment structure facilitates compositional design of such protocols. We prove a decomposition theorem which states that every protocol specified globally as a finite state system can be decomposed into such an assumption compatible system. We also present a syntactic characterization of this class using top level parallel composition.
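
    A toy sketch of the compatibility constraint (process names, states, and labels invented; real protocols would add transitions and synchronization):

```python
from itertools import product

# Hypothetical two-process system. Each local state carries a commitment
# (what the process guarantees) and an assumption about the other process'
# commitment (None = no assumption). All names illustrative.
P = {"p0": {"commit": "idle", "assume": None},
     "p1": {"commit": "sending", "assume": "ready"}}
Q = {"q0": {"commit": "ready", "assume": None},
     "q1": {"commit": "busy", "assume": "idle"}}

def compatible(s, t):
    """A global state is admitted only if each side's assumption matches
    the other's commitment (or it assumes nothing)."""
    ok_s = s["assume"] is None or s["assume"] == t["commit"]
    ok_t = t["assume"] is None or t["assume"] == s["commit"]
    return ok_s and ok_t

globals_ok = [(a, b) for (a, s), (b, t) in
              product(P.items(), Q.items()) if compatible(s, t)]
print(globals_ok)   # ('p1', 'q1') is excluded: "sending" assumes "ready"
```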

  10. Tails assumptions and posterior concentration rates for mixtures of Gaussians

    OpenAIRE

    Naulet, Zacharie; Rousseau, Judith

    2016-01-01

    Nowadays in density estimation, posterior rates of convergence for location and location-scale mixtures of Gaussians are only known under light-tail assumptions, with better rates achieved by location mixtures. It is conjectured, but not proved, that the situation should be reversed under heavy-tail assumptions. The conjecture is based on the feeling that there is no need to achieve a good order of approximation in regions with few data (say, in the tails), favoring location-scale mixtures…

  11. US Intervention in Failed States: Bad Assumptions=Poor Outcomes

    Science.gov (United States)

    2002-01-01

    National Defense University, National War College strategic logic essay. …the country remains in the grip of poverty, natural disasters, and stagnation. Rwanda: Rwanda, another small African country, is populated principally…

  12. Understanding Karma Police: The Perceived Plausibility of Noun Compounds as Predicted by Distributional Models of Semantic Representation

    Science.gov (United States)

    Günther, Fritz; Marelli, Marco

    2016-01-01

    Noun compounds, consisting of two nouns (the head and the modifier) that are combined into a single concept, differ in terms of their plausibility: school bus is a more plausible compound than saddle olive. The present study investigates which factors influence the plausibility of attested and novel noun compounds. Distributional Semantic Models (DSMs) are used to obtain formal (vector) representations of word meanings, and compositional methods in DSMs are employed to obtain such representations for noun compounds. From these representations, different plausibility measures are computed. Three of those measures contribute in predicting the plausibility of noun compounds: The relatedness between the meaning of the head noun and the compound (Head Proximity), the relatedness between the meaning of modifier noun and the compound (Modifier Proximity), and the similarity between the head noun and the modifier noun (Constituent Similarity). We find non-linear interactions between Head Proximity and Modifier Proximity, as well as between Modifier Proximity and Constituent Similarity. Furthermore, Constituent Similarity interacts non-linearly with the familiarity with the compound. These results suggest that a compound is perceived as more plausible if it can be categorized as an instance of the category denoted by the head noun, if the contribution of the modifier to the compound meaning is clear but not redundant, and if the constituents are sufficiently similar in cases where this contribution is not clear. Furthermore, compounds are perceived to be more plausible if they are more familiar, but mostly for cases where the relation between the constituents is less clear. PMID:27732599
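
    A compact sketch of the three measures (random stand-in vectors; real DSM vectors are corpus-derived, and the study's compositional model is more sophisticated than simple addition):

```python
import numpy as np

def cos(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical distributional vectors (in practice learned from corpora);
# the compound vector would come from a compositional DSM.
rng = np.random.default_rng(5)
head, modifier = rng.normal(size=50), rng.normal(size=50)
compound = 0.6 * head + 0.4 * modifier        # simple additive composition

# The three measures the study finds predictive of plausibility:
head_proximity = cos(head, compound)          # compound ~ instance of head
modifier_proximity = cos(modifier, compound)  # modifier's contribution clear
constituent_similarity = cos(head, modifier)  # head-modifier relatedness
print(head_proximity, modifier_proximity, constituent_similarity)
```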

  13. Naledi: An example of how natural phenomena can inspire metaphysical assumptions

    Directory of Open Access Journals (Sweden)

    Francois Durand

    2017-02-01

    Full Text Available A new fossil site was discovered in the Rising Star Cave in 2013 in the Cradle of Humankind in South Africa. This site, which has yielded 1550 hominin bones so far, is considered to be one of the richest palaeoanthropological sites in the world. The deposition of the fossils in a remote part of the cave system, approximately 100 m from the entrance, has resulted in a great deal of speculation. The relative inaccessibility of the site, the number of fossil bones it contained, and the fact that virtually all these bones were those of a single species of hominin led to the conclusion that the bones were not deposited by natural sedimentary processes, but that these phenomena were evidence of purposeful disposal or even burial of the dead by hominins. If this assumption were true, it would be the earliest evidence of a metaphysical awareness in humankind. The tenuous evidence on which this hypothesis rests is discussed, and a more plausible alternative explanation, in which water and gravity were responsible for the deposition of the remains, is put forward.

  14. A Novel Discovery of Growth Process for Ag Nanowires and Plausible Mechanism

    Directory of Open Access Journals (Sweden)

    Jiejun Zhu

    2016-01-01

    Full Text Available A novel growth process of silver nanowires was revealed by tracing the morphology evolution of Ag nanostructures fabricated by an improved polyol process. A mixture of Ag nanowires and nanoparticles was obtained with the use of PVP-K25 (MW = 38,000). The products sampled at different reaction times were studied in detail using UV-visible absorption spectra and transmission electron microscopy (TEM). An interesting, previously unreported phenomenon was observed, in which Ag nanoparticles undergo an important dissolution-recrystallization process and Ag nanowires form at the expense of the preformed Ag nanoparticles. A plausible novel growth mechanism for the silver nanowires is proposed.

  15. ‘One of the Challenges that Can Plausibly Be Raised Against Them’?

    DEFF Research Database (Denmark)

    Holtermann, Jakob v. H.

    2017-01-01

    International criminal tribunals (ICTs) are epistemic engines in the sense that they find (or claim to find) factual truths about such past events that qualify as genocide, crimes against humanity and war crimes. The value of this kind of knowledge would seem to be beyond dispute. Yet, in general...... in law is intimately connected to ordinary truth. Truth-finding capacity therefore does belong in legitimacy debates as a challenge that can plausibly be raised against them. This, in turn makes it relevant, in future research, to map, analyse and interrelate the various critiques that have been launched...

  16. A biological plausible Generalized Leaky Integrate-and-Fire neuron model.

    Science.gov (United States)

    Wang, Zhenzhong; Guo, Lilin; Adjouadi, Malek

    2014-01-01

    This study introduces a new Generalized Leaky Integrate-and-Fire (GLIF) neuron model. Unlike Normal Leaky Integrate-and-Fire (NLIF) models, the leaking resistor in the GLIF model equation is assumed to be variable, and an additional bias-current term is added to the model equation in order to improve accuracy. By adjusting the parameters defined for the leaking resistor and the bias current, a GLIF model can be accurately matched to any Hodgkin-Huxley (HH) model and can reproduce plausible biological neuron behaviors.
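
    A minimal sketch of such a generalized LIF neuron (the functional forms of the variable leak and bias current are assumed here; the paper's exact parameterization differs):

```python
def glif_trace(T=200.0, dt=0.1, C=1.0, v_rest=0.0, v_th=1.0,
               g_leak=lambda v: 0.1 + 0.02 * v,   # variable leak, assumed form
               i_bias=0.05, i_in=lambda t: 0.08):
    """Generalized LIF sketch: C dv/dt = -g(v)(v - v_rest) + i_bias + i_in(t).
    The voltage-dependent leak g(v) and the bias current are the two
    extensions the abstract describes; these forms are illustrative."""
    n = int(T / dt)
    v, spikes = v_rest, []
    for k in range(n):
        t = k * dt
        dv = (-g_leak(v) * (v - v_rest) + i_bias + i_in(t)) / C
        v += dv * dt
        if v >= v_th:            # threshold crossing: spike and reset
            spikes.append(t)
            v = v_rest
    return spikes

print(len(glif_trace()), "spikes in 200 ms")
```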

  17. Higher Data Quality by Online Data-Entry and Automated Plausibility Checks

    Science.gov (United States)

    Pietragalla, Barbara; Sigg, Christian; Güsewell, Sabine; Clot, Bernard

    2014-05-01

    Long-term phenological observations are now recognized as important indicators for climate change impact studies. With the increased need for phenological data, there is also an increased need for higher data quality. Since 1951 MeteoSwiss has been operating a national phenological observation network. Currently the network consists of about 150 active stations observing up to 69 different phenophases. An important aim of an ongoing three-year project at MeteoSwiss is to further increase the quality of the collected data. The higher data quality will be achieved by an automated procedure performing plausibility checks on the data and by online data-entry. Further measures such as intensified observer instruction and the collection of more detailed metadata also contribute to a high data quality standard. The plausibility checks include the natural order of the phenophases within a species and also between different species (with regard to possible natural deviation). Additionally, it will be checked whether the observed date differs by less than two standard deviations from the average for this phenophase at the altitude of the station. A value outside of these limits is not necessarily a false value, since occurrences of extreme values will be beyond these limits. Therefore, within this check of the limits, the timing of the season in the respective year will also be taken into account. In the case of an implausible value, a comparison with other stations in the same region and at a similar altitude is proposed. A further possibility for data quality control could be to model the different phenophases statistically and to use this model for estimating the likelihood of observed values. An overall exploratory data analysis is currently being performed, providing a solid basis to implement the best possible methods for the plausibility checks. Important advantages of online data-entry are the near real-time availability of the data as well as the avoidance of various kinds of typical mistakes.
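
    A minimal sketch of the two checks described above (phase names, climatology table, and thresholds are illustrative assumptions):

```python
# Plausibility checks in the spirit described above; all values invented.
PHASE_ORDER = ["bud burst", "leaf unfolding", "flowering", "fruit ripening"]

# Hypothetical station climatology: mean day-of-year and standard
# deviation for each phenophase at the station's altitude.
CLIMATOLOGY = {"bud burst": (95, 8), "leaf unfolding": (110, 7),
               "flowering": (130, 9), "fruit ripening": (215, 12)}

def check_observation(phase, doy, earlier_obs):
    """Flag values violating phenophase order or the +/- 2 SD window.
    A flagged value is only *implausible*, not necessarily wrong:
    extremes and unusual seasons still need manual review."""
    issues = []
    mean, sd = CLIMATOLOGY[phase]
    if abs(doy - mean) > 2 * sd:
        issues.append("outside 2-sigma climatological window")
    for earlier_phase, earlier_doy in earlier_obs.items():
        if PHASE_ORDER.index(earlier_phase) < PHASE_ORDER.index(phase) \
                and earlier_doy >= doy:
            issues.append(f"precedes earlier phase '{earlier_phase}'")
    return issues

print(check_observation("flowering", 160, {"bud burst": 100}))  # one flag
print(check_observation("flowering", 95, {"bud burst": 100}))   # two flags
```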

  18. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    Science.gov (United States)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors appeared different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for…
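
    A compact sketch of the spatial cross-validation step (synthetic data and spatial blocks; the study used real terrain predictors and several classifiers):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)

# Hypothetical data: terrain predictors, landslide presence/absence, and a
# spatial block id per sample (e.g. a grid cell); all illustrative.
n = 2000
X = rng.normal(size=(n, 5))                     # slope, curvature, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
blocks = rng.integers(0, 10, size=n)            # spatial blocks

# Spatial cross-validation: entire blocks are held out, so test points are
# not spatial neighbours of training points; this is the check that exposed
# inconsistencies (and slight overfitting) hidden by random holdout.
aucs = []
for train, test in GroupKFold(n_splits=5).split(X, y, groups=blocks):
    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
print(np.round(aucs, 3))
```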

  19. A biologically plausible transform for visual recognition that is invariant to translation, scale and rotation

    Directory of Open Access Journals (Sweden)

    Pavel eSountsov

    2011-11-01

    Full Text Available Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled or rotated.
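
    A classical relative of such a transform is the Fourier-Mellin construction, where an FFT magnitude discards translation, a log-polar resampling converts scale and rotation into shifts, and a second FFT magnitude discards those shifts. The sketch below implements that textbook variant, not the authors' neural model; the grid sizes and the test image are arbitrary assumptions.

```python
import numpy as np

def logpolar(img, n_r=64, n_theta=64):
    """Nearest-neighbour resampling of a square image onto a log-polar grid."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_r))   # logarithmic radii
    t = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = np.clip(np.rint(cy + r[:, None] * np.sin(t)).astype(int), 0, h - 1)
    xs = np.clip(np.rint(cx + r[:, None] * np.cos(t)).astype(int), 0, w - 1)
    return img[ys, xs]

def invariant_signature(img):
    """Two-stage transform: FFT magnitude (drops translation), log-polar map
    (turns scale/rotation into shifts), second FFT magnitude (drops shifts)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    return np.abs(np.fft.fft2(logpolar(np.log1p(spectrum))))

img = np.zeros((128, 128))
img[40:80, 50:90] = 1.0                      # a simple rectangular "object"
shifted = np.roll(img, (7, -5), axis=(0, 1)) # translated copy
a, b = invariant_signature(img), invariant_signature(shifted)
print(np.abs(a - b).max() / a.max())         # ~0: the signature ignores the shift
```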

  20. In Silico Structure Prediction of Human Fatty Acid Synthase-Dehydratase: A Plausible Model for Understanding Active Site Interactions.

    Science.gov (United States)

    John, Arun; Umashankar, Vetrivel; Samdani, A; Sangeetha, Manoharan; Krishnakumar, Subramanian; Deepa, Perinkulam Ravi

    2016-01-01

    Fatty acid synthase (FASN, UniProt ID: P49327) is a multienzyme dimer complex that plays a critical role in lipogenesis. Consequently, this lipogenic enzyme has gained tremendous biomedical importance. The role of FASN and its inhibition is being extensively researched in several clinical conditions, such as cancers, obesity, and diabetes. X-ray crystallographic structures of some of its domains, such as β-ketoacyl synthase, acetyl transacylase, malonyl transacylase, enoyl reductase, β-ketoacyl reductase, and thioesterase (TE), are already reported. Here, we have attempted an in silico elucidation of the uncrystallized dehydratase (DH) catalytic domain of human FASN. This theoretical model of the DH domain was predicted using comparative modeling methods. Different stand-alone tools and servers were used to validate and check the reliability of the predicted models, which suggested it to be a highly plausible model. The stereochemical analysis showed 92.0% of residues in the favorable region of the Ramachandran plot. The initial physiological substrate, the β-hydroxybutyryl group, was docked into the active site of the DH domain using Glide. The molecular dynamics simulations carried out for 20 ns in apo and holo states indicated the stability and accuracy of the predicted structure under solvated conditions. The predicted model provided useful biochemical insights into the substrate-active site binding mechanisms. This model was then used for identifying potential FASN inhibitors by high-throughput virtual screening of the National Cancer Institute database of chemical ligands. The inhibitory efficacy of the top hit ligands was validated by performing molecular dynamics simulations for 20 ns, wherein the ligand NSC71039 exhibited good enzyme inhibition characteristics and dose-dependent anticancer cytotoxicity in retinoblastoma cancer cells in vitro.

  1. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2016-01-01

    are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute...... over what constitutes proper assumptions—even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.......A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates...

  2. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
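
    The kind of simulation used to probe this assumption can be sketched on a toy population. Everything below is invented for illustration: the population, the degree-biased referral chain standing in for a full network model, and the simplified Volz-Heckathorn-style estimator.

```python
import random

random.seed(7)

# Invented population of 500 people; the trait has 30% prevalence, and
# trait carriers are better connected (higher degree), as is typical in
# hidden populations studied with RDS.
N = 500
trait = [i < 150 for i in range(N)]
degree = [6 if trait[i] else 4 for i in range(N)]

def referral_chain(n_samples, with_replacement):
    """Degree-biased referral chain, mimicking the Markov model of RDS."""
    seen, chain = set(), []
    current = random.randrange(N)
    while len(chain) < n_samples:
        chain.append(current)
        seen.add(current)
        pool = list(range(N)) if with_replacement else [j for j in range(N) if j not in seen]
        current = random.choices(pool, weights=[degree[j] for j in pool], k=1)[0]
    return chain

def vh_estimate(chain):
    """Volz-Heckathorn-style estimate: weight each respondent by 1/degree."""
    num = sum(trait[i] / degree[i] for i in chain)
    den = sum(1.0 / degree[i] for i in chain)
    return num / den

for frac in (0.05, 0.20, 0.40):
    k = int(frac * N)
    wr = sum(vh_estimate(referral_chain(k, True)) for _ in range(50)) / 50
    wor = sum(vh_estimate(referral_chain(k, False)) for _ in range(50)) / 50
    print(f"fraction {frac:.0%}: with repl. {wr:.3f}, without {wor:.3f}, truth 0.300")
```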

  3. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    We study the necessary and sufficient assumptions for universally composable (UC) computation, both in terms of setup and computational assumptions. We look at the common reference string model, the uniform random string model and the key-registration authority model (KRA), and provide new results...... for all of them. Perhaps most interestingly we show that: •  For even the minimal meaningful KRA, where we only assume that the secret key is a value which is hard to compute from the public key, one can UC securely compute any poly-time functionality if there exists a passive secure oblivious......-transfer protocol for the stand-alone model. Since a KRA where the secret keys can be computed from the public keys is useless, and some setup assumption is needed for UC secure computation, this establishes the best we could hope for the KRA model: any non-trivial KRA is sufficient for UC computation. •  We show...

  4. More Efficient VLR Group Signature Based on DTDH Assumption

    Directory of Open Access Journals (Sweden)

    Lizhen Ma

    2012-10-01

    Full Text Available In VLR (verifier-local revocation) group signatures, only verifiers are involved in the revocation of a member, while signers are not. Thus VLR group signature schemes are suitable for mobile environments. Reducing computation costs and shortening signature length are two requirements of current research on VLR group signatures. A new VLR group signature is proposed based on the q-SDH assumption and the DTDH assumption. Compared with existing VLR group signatures based on the DTDH assumption, the proposed scheme not only has the shortest signature size but also the lowest computation costs, and is applicable to mobile environments such as IEEE 802.1x.

  5. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways, and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station, or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explanations of the driving scenarios, constraints, or other issues that drive them.

  6. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    In recent years, increasing interest has been devoted to IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant...... to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases......: (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels, (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners...

  7. A Biomass-based Model to Estimate the Plausibility of Exoplanet Biosignature Gases

    CERN Document Server

    Seager, S; Hu, R

    2013-01-01

    Biosignature gas detection is one of the ultimate future goals for exoplanet atmosphere studies. We have created a framework for linking biosignature gas detectability to biomass estimates, including atmospheric photochemistry and biological thermodynamics. The new framework is intended to liberate predictive atmosphere models from requiring fixed, Earth-like biosignature gas source fluxes. New biosignature gases can be considered with a check that the biomass estimate is physically plausible. We have validated the models on terrestrial production of NO, H2S, CH4, CH3Cl, and DMS. We have applied the models to propose NH3 as a biosignature gas on a "cold Haber World," a planet with a N2-H2 atmosphere, and to demonstrate why gases such as CH3Cl must have too large of a biomass to be a plausible biosignature gas on planets with Earth or early-Earth-like atmospheres orbiting a Sun-like star. To construct the biomass models, we developed a functional classification of biosignature gases, and found that gases (such...

  8. Self-assembly of phosphate amphiphiles in mixtures of prebiotically plausible surfactants.

    Science.gov (United States)

    Albertsen, A N; Duffy, C D; Sutherland, J D; Monnard, P-A

    2014-06-01

    The spontaneous formation of closed bilayer structures from prebiotically plausible amphiphiles is an essential requirement for the emergence of early cells on the prebiotic Earth. The sources of amphiphiles could have been both endo- and exogenous (accretion of meteorite carbonaceous material or interstellar dust particles). Among all possible prebiotic amphiphile candidates, those containing phosphate are the least investigated because their self-assembly occurs in a seemingly too narrow range of conditions. The self-assembly of simple phosphate amphiphiles should, however, be of great interest, as contemporary membranes predominantly contain phospholipids. In contrast to common expectations, we show that these amphiphiles can be easily synthesized under prebiotically plausible environmental conditions and can efficiently form bilayer structures in the presence of various co-surfactants across a large range of pH values. Vesiculation was even observed in crude reaction mixtures that contained 1-decanol as the amphiphile precursor. The two best co-surfactants promoted vesicle formation over the entire pH range in aqueous solutions. Expanding the pH range where bilayer membranes self-assemble and remain intact is a prerequisite for the emergence of early cell-like compartments and their preservation under fluctuating environmental conditions. These mixed bilayers also retained small charged solutes, such as dyes. These results demonstrate that alkyl phosphate amphiphiles might have played a significant role as early compartment building blocks.

  9. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible.

    Science.gov (United States)

    Raab, Marius Hans; Auer, Nikolas; Ortlieb, Stefan A; Carbon, Claus-Christian

    2013-01-01

    Reptile prime ministers and flying Nazi saucers: extreme and sometimes off-the-wall conclusions are typical ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not so far been examined in an experimental design. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deem most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story's plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany's most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  10. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.;

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed...... by Froese et al. are realistic and consistent. We further show that the assumption about density-dependence being described by a stock recruitment relationship is responsible for determining whether a peak in the cohort biomass of a population occurs late or early in life. Finally, we argue...

  11. Error in the description of foot kinematics due to violation of rigid body assumptions.

    Science.gov (United States)

    Nester, C J; Liu, A M; Ward, E; Howard, D; Cocheba, J; Derrick, T

    2010-03-03

    Kinematic data from rigid segment foot models inevitably includes errors because the bones within each segment move relative to each other. This study sought to define error in foot kinematic data due to violation of the rigid segment assumption. The research compared kinematic data from 17 different mid and forefoot rigid segment models to kinematic data of the individual bones comprising these segments. Kinematic data from a previous dynamic cadaver model study was used to derive individual bone as well as foot segment kinematics. Mean and maximum errors due to violation of the rigid body assumption varied greatly between models. The model with least error was the combination of navicular and cuboid (mean errors kinematics research study being undertaken.

  12. Unpacking assumptions about inclusion in community-based health promotion: perspectives of women living in poverty.

    Science.gov (United States)

    Ponic, Pamela; Frisby, Wendy

    2010-11-01

    Community-based health promoters often aim to facilitate "inclusion" when working with marginalized women to address their exclusion and related health issues. Yet the notion of inclusion has not been critically interrogated within this field, resulting in the perpetuation of assumptions that oversimplify it. We provide qualitative evidence on inclusion as a health-promotion strategy from the perspectives of women living in poverty. We collected data with women engaged in a 6-year community-based health promotion and feminist participatory action research project. Participants' experiences illustrated that inclusion was a multidimensional process that involved a dynamic interplay between structural determinants and individual agency. The women named multiple elements of inclusion across psychosocial, relational, organizational, and participatory dimensions. This knowledge interrupts assumptions that inclusion is achievable and desirable for so-called recipients of such initiatives. We thus call for critical consideration of the complexities, limitations, and possibilities of facilitating inclusion as a health-promotion strategy.

  13. 41 CFR 60-3.9 - No assumption of validity.

    Science.gov (United States)

    2010-07-01

    41 CFR 60-3.9: No assumption of validity. Title 41, Public Contracts and Property Management; Other Provisions Relating to Public Contracts; Part 60-3, Uniform Guidelines on Employee Selection Procedures (1978), General Principles, § 60-3.9 (edition of 2010-07-01).

  14. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  15. "Touch Me, Like Me": Testing an Encounter Group Assumption

    Science.gov (United States)

    Boderman, Alvin; And Others

    1972-01-01

    An experiment to test an encounter group assumption that touching increases interpersonal attraction was conducted. College women were randomly assigned to a touch or no-touch condition. A comparison of total evaluation scores verified the hypothesis: subjects who touched the accomplice perceived her as a more attractive person than those who did…

  16. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  17. Woman's Moral Development in Search of Philosophical Assumptions.

    Science.gov (United States)

    Sichel, Betty A.

    1985-01-01

    Examined is Carol Gilligan's thesis that men and women use different moral languages to resolve moral dilemmas, i.e., women speak a language of caring and responsibility, and men speak a language of rights and justice. Her thesis is not grounded with adequate philosophical assumptions. (Author/RM)

  18. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. Once one starts with the notion of moral strangers, there is, by definition, no possibility of a content-full moral discourse among them. The inquiry is thus circular: it begins with a definition of moral strangers which implies that they do not share enough moral background, or commitment to an authority, to reach a moral agreement, and it concludes that content-full morality is impossible among moral strangers. I also argue that treating traditions as solid, immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. Turning to the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how the assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  19. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion.

  20. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the mainstream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  1. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    ’s optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all...
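
    The core idea, factoring a joint distribution along dependencies instead of assuming independence, can be shown on a toy conjunctive predicate. The table, column names and counts below are invented for illustration and do not reflect the paper's specific algorithm.

```python
from collections import Counter

# Toy table: 'make' and 'model' are strongly correlated columns.
rows = (
    [("honda", "civic")] * 400
    + [("honda", "accord")] * 100
    + [("toyota", "corolla")] * 400
    + [("toyota", "camry")] * 100
)

n = len(rows)
make_counts = Counter(r[0] for r in rows)
model_counts = Counter(r[1] for r in rows)
joint_counts = Counter(rows)

def sel_independence(make, model):
    # Classic optimizer assumption: P(make, model) = P(make) * P(model).
    return (make_counts[make] / n) * (model_counts[model] / n)

def sel_factored(make, model):
    # Factoring along the dependency make -> model: P(make, model) =
    # P(make) * P(model | make); here read off the (tiny) joint directly.
    return joint_counts[(make, model)] / n

print(sel_independence("honda", "civic"))  # 0.5 * 0.4 = 0.2 (underestimate)
print(sel_factored("honda", "civic"))      # 0.4 (true selectivity)
```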

  2. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  3. Unpacking Assumptions in Research Synthesis: A Critical Construct Synthesis Approach

    Science.gov (United States)

    Wolgemuth, Jennifer R.; Hicks, Tyler; Agosto, Vonzell

    2017-01-01

    Research syntheses in education, particularly meta-analyses and best-evidence syntheses, identify evidence-based practices by combining findings across studies whose constructs are similar enough to warrant comparison. Yet constructs come preloaded with social, historical, political, and cultural assumptions that anticipate how research problems…

  4. Challenging Teachers' Pedagogic Practice and Assumptions about Social Media

    Science.gov (United States)

    Cartner, Helen C.; Hallas, Julia L.

    2017-01-01

    This article describes an innovative approach to professional development designed to challenge teachers' pedagogic practice and assumptions about educational technologies such as social media. Developing effective technology-related professional development for teachers can be a challenge for institutions and facilitators who provide this…

  5. Assumptions regarding right censoring in the presence of left truncation.

    Science.gov (United States)

    Qian, Jing; Betensky, Rebecca A

    2014-04-01

    Clinical studies using complex sampling often involve both truncation and censoring, where there are options for the assumptions of independence of censoring and event and for the relationship between censoring and truncation. In this paper, we clarify these choices, show certain equivalences, and provide examples.
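
    For orientation, the standard setup these choices attach to can be written down explicitly (my gloss in textbook notation, not the paper's): with left-truncation time L, event time T and right-censoring time C, one observes X = min(T, C) and the event indicator δ = 1{T ≤ C}, conditional on having survived past L. Under independent censoring, a subject's likelihood contribution is commonly written as

```latex
\mathcal{L}_i \;=\; \frac{f(x_i)^{\delta_i}\, S(x_i)^{1-\delta_i}}{S(l_i)},
```

    where f and S are the density and survival function of T. The assumptions the paper examines govern when this factorization, and the independence it encodes, are justified.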

  6. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2011-01-01

    DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X)= X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups...

  7. Quantum cryptography in real-life applications: Assumptions and security

    Science.gov (United States)

    Zhao, Yi

    Quantum cryptography, or quantum key distribution (QKD), provides a means of unconditionally secure communication. The security is in principle based on the fundamental laws of physics. Security proofs show that if quantum cryptography is appropriately implemented, even the most powerful eavesdropper cannot decrypt the message from a cipher. The implementations of quantum cryptosystems in real life may not fully comply with the assumptions made in the security proofs. Such discrepancy between the experiment and the theory can be fatal to the security of a QKD system. In this thesis we address a number of these discrepancies. A perfect single-photon source is often assumed in many security proofs. However, a weak coherent source is widely used in real-life QKD implementations. Decoy state protocols have been proposed as a novel approach to dramatically improve the performance of a weak coherent source based QKD implementation without jeopardizing its security. Here, we present the first experimental demonstrations of decoy state protocols. Our experimental scheme was later adopted by most decoy state QKD implementations. In the security proofs of decoy state protocols as well as many other QKD protocols, it is widely assumed that a sender generates a phase-randomized coherent state. This assumption has been enforced in few implementations. We close this gap in two steps: First, we implement and verify the phase randomization experimentally; second, we prove the security of a QKD implementation without the coherent state assumption. In many security proofs of QKD, it is assumed that all the detectors on the receiver's side have identical detection efficiencies. We show experimentally that this assumption may be violated in a commercial QKD implementation due to an eavesdropper's malicious manipulation. Moreover, we show that the eavesdropper can learn part of the final key shared by the legitimate users as a consequence of this violation of the assumptions.

  8. A neurophysiologically plausible population code model for feature integration explains visual crowding.

    Directory of Open Access Journals (Sweden)

    Ronald van den Berg

    2010-01-01

    Full Text Available An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
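
    The population-coding ingredient such models build on can be sketched in a few lines: a bank of orientation-tuned units and a population-vector readout, which exhibits the "compulsory averaging" mentioned above when two signals are pooled. This is a generic textbook mechanism with assumed von Mises tuning, not the authors' full crowding model.

```python
import numpy as np

def population_response(theta, preferred, kappa=4.0):
    """Von Mises tuning curves of orientation-selective units (period pi)."""
    return np.exp(kappa * np.cos(2.0 * (theta - preferred)))

def population_vector_decode(responses, preferred):
    """Population-vector readout on the doubled-angle circle."""
    z = np.sum(responses * np.exp(2j * preferred))
    return (np.angle(z) / 2.0) % np.pi

preferred = np.linspace(0.0, np.pi, 64, endpoint=False)
target, flanker = np.deg2rad(20), np.deg2rad(60)

# Uncrowded: the target alone drives the pool; decoding recovers ~20 deg.
alone = population_response(target, preferred)
print(np.rad2deg(population_vector_decode(alone, preferred)))   # ~20

# Crowded: spatial integration pools target and flanker responses, and the
# decoded orientation is pulled to their mean ("compulsory averaging").
pooled = alone + population_response(flanker, preferred)
print(np.rad2deg(population_vector_decode(pooled, preferred)))  # ~40
```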

  9. A neurophysiologically plausible population code model for feature integration explains visual crowding.

    Science.gov (United States)

    van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W

    2010-01-22

    An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.

  10. Quantum theory as plausible reasoning applied to data obtained by robust experiments.

    Science.gov (United States)

    De Raedt, H; Katsnelson, M I; Michielsen, K

    2016-05-28

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent, and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out, yields, without introducing any concept of quantum theory, the quantum theoretical description, in terms of the Schrödinger or the Pauli equation, of the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is, common sense, applied to reproducible and robust experimental data.

  11. Plausible families of compact objects with a Non Local Equation of State

    CERN Document Server

    Hernández, H

    2012-01-01

    We investigate the plausibility of some models emerging from an algorithm devised to generate a one-parameter family of interior solutions for the Einstein equations, and explore how their physical variables change as the family parameter varies. The models studied correspond to anisotropic spherical matter configurations having a non-local equation of state. This particular type of equation of state, which presents no causality problems, provides the radial pressure at a given point not only as a function of the density but as a functional of the enclosed matter distribution. We have found several model-independent tendencies as the parameter increases: the equation of state tends to be stiffer and the total mass tends to half of the external radius. Drawing on the concept of cracking of materials in General Relativity, we find that these models become more stable as the family parameter increases.

  12. Signature of Plausible Accreting Supermassive Black Holes in Mrk 261/262 and Mrk 266

    Directory of Open Access Journals (Sweden)

    Gagik Ter-Kazarian

    2013-01-01

    Full Text Available We address the neutrino radiation of plausible accreting supermassive black holes closely linked to the 5 nuclear components of the galaxy samples Mrk 261/262 and Mrk 266. We predict a time delay before neutrino emission of the same scale as the age of the Universe. The ultrahigh energy neutrinos are produced in a superdense protomatter medium via simple (quark or pionic) reactions or modified URCA processes (G. Gamow was inspired to name the process URCA after the name of a casino in Rio de Janeiro). The resulting neutrino fluxes, which depend on the opening parameter, are highly beamed along the plane of the accretion disk, peaked at ultrahigh energies, and collimated in a small opening angle.

  13. Plausible role of nanoparticle contamination in the synthesis and properties of organic electronic materials

    Science.gov (United States)

    Ananikov, Valentine P.

    2016-12-01

    Traceless transition metal catalysis (Pd, Ni, Cu, etc.) is very difficult to achieve. Metal contamination in the synthesized products is unavoidable and the most important questions are: How to control metal impurities? What amount of metal impurities can be tolerated? What is the influence of metal impurities? In this brief review, the plausible origins of nanoparticle contamination are discussed in the framework of catalytic synthesis of organic electronic materials. Key factors responsible for increasing the probability of contamination are considered from the point of view of catalytic reaction mechanisms. The purity of the catalyst may greatly affect the molecular weight of a polymer, reaction yield, selectivity and several other parameters. Metal contamination in the final polymeric products may induce some changes in the electric conductivity, charge transport properties, photovoltaic performance and other important parameters.

  14. Spontaneous formation and base pairing of plausible prebiotic nucleotides in water.

    Science.gov (United States)

    Cafferty, Brian J; Fialho, David M; Khanam, Jaheda; Krishnamurthy, Ramanarayanan; Hud, Nicholas V

    2016-04-25

    The RNA World hypothesis presupposes that abiotic reactions originally produced nucleotides, the monomers of RNA and universal constituents of metabolism. However, compatible prebiotic reactions for the synthesis of complementary (that is, base pairing) nucleotides and mechanisms for their mutual selection within a complex chemical environment have not been reported. Here we show that two plausible prebiotic heterocycles, melamine and barbituric acid, form glycosidic linkages with ribose and ribose-5-phosphate in water to produce nucleosides and nucleotides in good yields. Even without purification, these nucleotides base pair in aqueous solution to create linear supramolecular assemblies containing thousands of ordered nucleotides. Nucleotide anomerization and supramolecular assemblies favour the biologically relevant β-anomer form of these ribonucleotides, revealing abiotic mechanisms by which nucleotide structure and configuration could have been originally favoured. These findings indicate that nucleotide formation and selection may have been robust processes on the prebiotic Earth, if other nucleobases preceded those of extant life.

  15. Complex adaptive HIV/AIDS risk reduction: Plausible implications from findings in Limpopo Province, South Africa.

    Science.gov (United States)

    Burman, Chris J; Aphane, Marota A

    2016-05-16

    This article emphasises that when working with complex adaptive systems it is possible to stimulate new social practices and/or cognitive perspectives that contribute to risk reduction, associated with reducing aggregate community viral loads. The process of achieving this is highly participatory and is methodologically possible because evidence of 'attractors' that influence the social practices can be identified using qualitative research techniques. Using findings from Limpopo Province, South Africa, we argue that working with 'wellness attractors' and increasing their presence within the HIV/AIDS landscape could influence aggregate community viral loads. While the analysis that is presented is unconventional, it is plausible that this perspective may hold potential to develop a biosocial response - which the Joint United Nations Programme on HIV and AIDS (UNAIDS) has called for - that reinforces the biomedical opportunities that are now available to achieve the ambition of ending AIDS by 2030.

  16. Reciprocity-based reasons for benefiting research participants: most fail, the most plausible is problematic.

    Science.gov (United States)

    Sofaer, Neema

    2014-11-01

    A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it.

  17. Oxidation of cefazolin by potassium permanganate: Transformation products and plausible pathways.

    Science.gov (United States)

    Li, Liping; Wei, Dongbin; Wei, Guohua; Du, Yuguo

    2016-04-01

    Cefazolin was demonstrated to exert high reactivity toward permanganate (Mn(VII)), a common oxidant in water pre-oxidation treatment. In this study, five transformation products were found and classified into three categories according to their characteristic functional groups: three (di-)sulfoxide products, one sulfone product and one di-ketone product. Product analyses showed that two kinds of reactions, oxidation of the thioether and cleavage of the unsaturated C=C double bond, occurred during the transformation of cefazolin by Mn(VII). Subsequently, plausible transformation pathways under different pH conditions were proposed based on the identified products and chemical reaction principles. More importantly, a simulation with a real surface water matrix indicated that the proposed transformation pathways of cefazolin could be replayed in real water treatment practice.

  18. Plausible authentication of manuka honey and related products by measuring leptosperin with methyl syringate.

    Science.gov (United States)

    Kato, Yoji; Fujinaka, Rie; Ishisaka, Akari; Nitta, Yoko; Kitamoto, Noritoshi; Takimoto, Yosuke

    2014-07-01

    Manuka honey, obtained from Leptospermum scoparium flowers in New Zealand, has strong antibacterial properties. In this study, plausible authentication of manuka honey was investigated by measuring leptosperin (methyl syringate 4-O-β-D-gentiobioside) along with methyl syringate. While methyl syringate content gradually decreased over 30 days at 50 °C, and even at a moderate 37 °C, leptosperin remained stable. A considerable correlation between nonperoxide antibacterial activity and leptosperin content was observed in 20 certified manuka honey samples. Leptosperin and methyl syringate in manuka honey and related products were analyzed using HPLC connected with mass spectrometry. One noncertified brand displayed significant variations in the leptosperin and methyl syringate contents between two samples obtained from different regions. Therefore, certification is clearly required to protect consumers from disguised and/or low-quality honey. Because leptosperin is stable during storage and specific to manuka honey, its measurement may be applicable for manuka honey authentication.

  19. A plausible simultaneous synthesis of amino acids and simple peptides on the primordial Earth.

    Science.gov (United States)

    Parker, Eric T; Zhou, Manshui; Burton, Aaron S; Glavin, Daniel P; Dworkin, Jason P; Krishnamurthy, Ramanarayanan; Fernández, Facundo M; Bada, Jeffrey L

    2014-07-28

    Following his seminal work in 1953, Stanley Miller conducted an experiment in 1958 to study the polymerization of amino acids under simulated early Earth conditions. In the experiment, Miller sparked a gas mixture of CH4, NH3, and H2O, while intermittently adding the plausible prebiotic condensing reagent cyanamide. For unknown reasons, an analysis of the samples was not reported. We analyzed the archived samples for amino acids, dipeptides, and diketopiperazines by liquid chromatography, ion mobility spectrometry, and mass spectrometry. A dozen amino acids, 10 glycine-containing dipeptides, and 3 glycine-containing diketopiperazines were detected. Miller's experiment was repeated and similar polymerization products were observed. Aqueous heating experiments indicate that Strecker synthesis intermediates play a key role in facilitating polymerization. These results highlight the potential importance of condensing reagents in generating diversity within the prebiotic chemical inventory.

  20. Evaluation and integration of cancer gene classifiers: identification and ranking of plausible drivers.

    Science.gov (United States)

    Liu, Yang; Tian, Feng; Hu, Zhenjun; DeLisi, Charles

    2015-05-11

    The number of mutated genes in cancer cells is far larger than the number of mutations that drive cancer. The difficulty this creates for identifying relevant alterations has stimulated the development of various computational approaches to distinguishing drivers from bystanders. We develop an ensemble classifier (EC) machine learning method, which integrates 10 publicly available classifiers, and apply it to breast and ovarian cancer. In particular we find the following: (1) Using both standard and non-standard metrics, EC almost always outperforms single-method classifiers, often by wide margins. (2) Of the 50 highest ranked genes for breast (ovarian) cancer, 34 (30) are associated with other cancers in either the OMIM, CGC or NCG database (P < …); the remaining genes are plausible candidates. Biological implications are briefly discussed. Source code and detailed results are available at http://www.visantnet.org/misi/driver_integration.zip.
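
    The integration step of such an ensemble can be as simple as averaging ranks across member classifiers. The sketch below shows that scheme on invented scores; the method names, genes and values are hypothetical, and the actual EC method may weight or combine members differently.

```python
from statistics import mean

# Per-gene scores from three stand-in single-method classifiers
# (higher = more driver-like); all values are made up.
scores = {
    "method_A": {"TP53": 0.95, "BRCA1": 0.90, "GENE_X": 0.40, "GENE_Y": 0.10},
    "method_B": {"TP53": 0.80, "BRCA1": 0.60, "GENE_X": 0.70, "GENE_Y": 0.20},
    "method_C": {"TP53": 0.99, "BRCA1": 0.85, "GENE_X": 0.30, "GENE_Y": 0.50},
}

def ranks(score_dict):
    """Map each gene to its rank under one classifier (1 = best)."""
    ordered = sorted(score_dict, key=score_dict.get, reverse=True)
    return {gene: i + 1 for i, gene in enumerate(ordered)}

per_method_ranks = [ranks(s) for s in scores.values()]
genes = scores["method_A"].keys()

# Consensus: average rank across classifiers; lower is better.
consensus = sorted(genes, key=lambda g: mean(r[g] for r in per_method_ranks))
for g in consensus:
    print(g, round(mean(r[g] for r in per_method_ranks), 2))
```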

  1. Probability, plausibility, and adequacy evaluations of the Oriente Study demonstrate that supplementation improved child growth.

    Science.gov (United States)

    Habicht, Jean-Pierre; Martorell, Reynaldo

    2010-02-01

    This article presents evidence that the high-nutrient supplement in the Oriente study (Atole) improved child growth. The evidence is presented at 4 levels. There was a causal effect of the intervention on child length, as assessed by probability analyses of the randomized, controlled trial (P < 0.05). The plausibility analyses, which included an examination of wasting, showed that the nutritional impact was due to the Atole, especially in those who were <3 y old and who suffered from diarrhea. The adequacy analyses revealed excellent biological efficacy of the Atole at the individual level. At the level of the whole population, the efficacy of impact was much less, because many children did not participate fully in the supplementation program. The external validity of the biological impact is likely to be good for populations with similar diets and medical care.

  2. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible

    Directory of Open Access Journals (Sweden)

    Marius Hans Raab

    2013-07-01

    Full Text Available Reptile prime ministers and flying Nazi saucers: extreme and sometimes off-the-wall conclusions are common ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not so far been examined in an experimental design. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deem most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story’s plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany’s most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  3. Testing the physiological plausibility of conflicting psychological models of response inhibition: A forward inference fMRI study.

    Science.gov (United States)

    Criaud, Marion; Longcamp, Marieke; Anton, Jean-Luc; Nazarian, Bruno; Roth, Muriel; Sescousse, Guillaume; Strafella, Antonio P; Ballanger, Bénédicte; Boulinguez, Philippe

    2017-08-30

    The neural mechanisms underlying response inhibition and related disorders are unclear and controversial for several reasons. First, it is a major challenge to assess the psychological bases of behaviour, and ultimately brain-behaviour relationships, of a function which is precisely intended to suppress overt measurable behaviours. Second, response inhibition is difficult to disentangle from other parallel processes involved in more general aspects of cognitive control. Consequently, different psychological and anatomo-functional models coexist, which often appear in conflict with each other even though they are not necessarily mutually exclusive. The standard model of response inhibition in go/no-go tasks assumes that inhibitory processes are reactively and selectively triggered by the stimulus that participants must refrain from reacting to. Recent alternative models suggest that action restraint could instead rely on reactive but non-selective mechanisms (all automatic responses are automatically inhibited in uncertain contexts) or on proactive and non-selective mechanisms (a gating function by which reaction to any stimulus is prevented in anticipation of stimulation when the situation is unpredictable). Here, we assessed the physiological plausibility of these different models by testing their respective predictions regarding event-related BOLD modulations (forward inference using fMRI). We set up a single fMRI design which allowed us to record simultaneously the different possible forms of inhibition while limiting confounds between response inhibition and parallel cognitive processes. We found BOLD dynamics consistent with non-selective models. These results provide new theoretical and methodological lines of inquiry for the study of basic functions involved in behavioural control and related disorders.

  4. Analysis of one assumption of the Navier-Stokes equations

    CERN Document Server

    Budarin, V A

    2013-01-01

    This article analyses the assumptions regarding the influence of pressure forces made when calculating the motion of a Newtonian fluid. The purpose of the analysis is to determine the reasonableness of these assumptions and their impact on the results of analytical calculation. The connections between the equations, the causes of discrepancies in exact solutions of the Navier-Stokes equations at low Reynolds numbers, and the emergence of unstable solutions in computer programs are also addressed. The need to supplement the well-known equations of motion with substantive equations for the mechanical stresses is shown; three methods for solving such a problem are identified, and the requirements the unknown equations must satisfy are described. Keywords: Navier-Stokes, approximate equation, closing equations, holonomic system.
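
    For reference, the standard incompressible Navier-Stokes momentum balance whose pressure-force term the article scrutinizes reads (textbook form, not quoted from the article):

```latex
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0,
```

    where -∇p is the pressure-force term, μ the dynamic viscosity and f a body force.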

  5. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-01-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance. PMID:27721505
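
    The second test, that the initial growth rate declines with population density, amounts to fitting exponential growth per administrative division and regressing the fitted rates on density. The sketch below reproduces that shape on synthetic data; the numbers and the simple log-linear fit are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic administrative divisions: population density (people/km^2)
# and weekly case counts grown at a density-dependent rate. All numbers
# are invented; the paper fits real division-level case data.
densities = rng.uniform(20, 500, size=30)
true_rates = 0.9 - 0.001 * densities + rng.normal(0.0, 0.05, size=30)

weeks = np.arange(8)
growth_rates = []
for r in true_rates:
    cases = np.maximum(rng.poisson(5.0 * np.exp(r * weeks)), 1)
    # Initial growth rate: slope of log(cases) over the early weeks.
    growth_rates.append(stats.linregress(weeks, np.log(cases)).slope)

res = stats.linregress(densities, growth_rates)
print(f"slope = {res.slope:.5f} per (person/km^2), p-value = {res.pvalue:.3g}")
# A significantly negative slope mirrors the paper's finding that the
# initial growth rate decreases with population density.
```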

  6. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG...... knowledge there has been no systematic study of the validity of the Markov assumption wrt. web usage mining and the resulting quality of the mined browsing patterns. In this paper we systematically investigate the quality of browsing patterns mined from structures based on the Markov assumption. Formal...... measures of quality, based on the closeness of the mined patterns to the true traversal patterns, are defined and an extensive experimental evaluation is performed, based on two substantial real-world data sets. The results indicate that a large number of rules must be considered to achieve high quality...
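
    The first-order Markov structure that such aggregated models assume can be estimated directly from session logs. The toy sketch below builds transition probabilities from invented click sequences; it illustrates the assumption under test, not the HPG implementation.

```python
from collections import defaultdict

# Invented click sessions (pages requested in sequence).
sessions = [
    ["home", "products", "cart", "checkout"],
    ["home", "products", "products", "cart"],
    ["home", "about", "home", "products"],
]

# First-order Markov assumption: the next page depends only on the current one.
counts = defaultdict(lambda: defaultdict(int))
for s in sessions:
    for cur, nxt in zip(s, s[1:]):
        counts[cur][nxt] += 1

probs = {
    cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
    for cur, nxts in counts.items()
}
print(probs["products"])  # {'cart': 0.667, 'products': 0.333} (approximately)
```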

  7. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for), to see whether they are important, in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what is really at issue. Based on the variance law, I question this assumption.

  8. A "unity assumption" does not promote intersensory integration.

    Science.gov (United States)

    Misceo, Giovanni F; Taylor, Nathanael J

    2011-01-01

    An account of intersensory integration is premised on knowing that different sensory inputs arise from the same object. Could, however, the combination of the inputs be impaired although the "unity assumption" holds? Forty observers viewed a square through a minifying (50%) lens while they simultaneously touched the square. Half could see and half could not see their haptic explorations of the square. Both groups, however, had reason to believe that they were touching and viewing the same square. Subsequent matches of the inspected square were mutually biased by touch and vision when the exploratory movements were visible. However, the matches were biased in the direction of the square's haptic size when observers could not see their exploratory movements. This impaired integration without the visible haptic explorations suggests that the unity assumption alone is not enough to promote intersensory integration.

  9. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results......, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing...... waste LCA models. This review infers that some of the differences in waste LCA models are inherent to the time they were developed. It is expected that models developed later, benefit from past modelling assumptions and knowledge and issues. Models developed in different countries furthermore rely...

  10. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-10-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance.

  11. Fluid-Structure Interaction Modeling of Intracranial Aneurysm Hemodynamics: Effects of Different Assumptions

    Science.gov (United States)

    Rajabzadeh Oghaz, Hamidreza; Damiano, Robert; Meng, Hui

    2015-11-01

    Intracranial aneurysms (IAs) are pathological outpouchings of cerebral vessels, the progression of which is mediated by complex interactions between the blood flow and vasculature. Image-based computational fluid dynamics (CFD) has been used for decades to investigate IA hemodynamics. However, the commonly adopted simplifying assumptions in CFD (e.g. rigid wall) compromise the simulation accuracy and mask the complex physics involved in IA progression and eventual rupture. Several groups have considered the wall compliance by using fluid-structure interaction (FSI) modeling. However, FSI simulation is highly sensitive to numerical assumptions (e.g. linear-elastic wall material, Newtonian fluid, initial vessel configuration, and constant pressure outlet), the effects of which are poorly understood. In this study, the sensitivity of FSI simulations of patient-specific IAs is investigated comprehensively using a multi-stage approach with varying levels of complexity. We start with simulations incorporating several common simplifications: rigid wall, Newtonian fluid, and constant pressure at the outlets, and then stepwise remove these simplifications up to the most comprehensive FSI simulations. Hemodynamic parameters such as wall shear stress and oscillatory shear index are assessed and compared at each stage to better understand the sensitivity of FSI simulations of IAs to model assumptions. Supported by the National Institutes of Health (1R01 NS 091075-01).
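
    The oscillatory shear index (OSI) mentioned above has a standard definition, 0.5(1 - |∫τ dt| / ∫|τ| dt); a minimal sketch computing it from a synthetic time series of wall-shear-stress vectors (names and data invented):

```python
import numpy as np

def osi(wss, dt):
    """wss: (T, 3) instantaneous wall shear stress vectors over one cardiac cycle."""
    mean_vec = np.trapz(wss, dx=dt, axis=0)                   # vector integral of tau
    mean_mag = np.trapz(np.linalg.norm(wss, axis=1), dx=dt)   # integral of |tau|
    return 0.5 * (1.0 - np.linalg.norm(mean_vec) / mean_mag)

# Synthetic WSS that reverses direction during the cycle
t = np.linspace(0.0, 1.0, 200)
wss = np.stack([np.sin(2 * np.pi * t), 0.2 * np.ones_like(t), np.zeros_like(t)], axis=1)
print(f"OSI = {osi(wss, t[1] - t[0]):.3f}")  # 0 = unidirectional flow, 0.5 = purely oscillatory
```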

  12. Assumptions and realities of the NCLEX-RN.

    Science.gov (United States)

    Aucoin, Julia W; Treas, Leslie

    2005-01-01

    Every three years the National Council of State Boards of Nursing conducts a practice analysis to verify the activities that are tested on the licensure exam (NCLEX-RN). Faculty can benefit from information in the practice analysis to ensure that courses and experiences adequately prepare graduates for the NCLEX-RN. This summary of the practice analysis challenges common assumptions and provides recommendations for faculty.

  13. The sufficiency assumption of the reasoned approach to action

    OpenAIRE

    David Trafimow

    2015-01-01

    The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing if new variables accou...

  14. Different but equal: the implausible assumption at the heart of neutral theory.

    Science.gov (United States)

    Purves, Drew W; Turnbull, Lindsay A

    2010-11-01

    1. The core assumption of neutral theory is that all individuals in a community have equal fitness regardless of species, and regardless of the species composition of the community. But real communities consist of species exhibiting large trait differences; hence these differences must be subject to perfect fitness-equalizing trade-offs for neutrality to hold. 2. Here we explain that perfect equalizing trade-offs are extremely unlikely to occur in reality, because equality of fitness among species is destroyed by: (i) any deviation in the functional form of the trade-off away from the one special form that gives equal fitness; (ii) spatial or temporal variation in performance; (iii) random species differences in performance. 3. In the absence of the density-dependent processes stressed by traditional niche-based community ecology, communities featuring small amounts of (i) or (ii) rapidly lose trait variation, becoming dominated by species with similar traits, and exhibit substantially lower species richness compared to the neutral case. Communities featuring random interspecific variation in traits (iii) lose all but a few fortuitous species. 4. Thus neutrality should be viewed, a priori, as a highly improbable explanation for the long-term co-occurrence of measurably different species within ecological communities. In contrast, coexistence via niche structure and density dependence is robust to species differences in baseline fitness, and so remains plausible. 5. We conclude that: (i) co-occurring species will typically exhibit substantial differences in baseline fitness even when (imperfect) equalizing trade-offs have been taken into account; (ii) therefore, communities must be strongly niche structured, otherwise they would lose both trait variation and species richness; (iii) nonetheless, even in strongly niche-structured communities, it is possible that the abundances of species with similar traits are at least partially free to drift.
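
    Point 3 above can be illustrated with a minimal zero-sum drift simulation (all parameters invented): even a small random fitness difference between species tends to erode richness much faster than in the strictly neutral case.

```python
import numpy as np

rng = np.random.default_rng(0)

def moran_richness(fitness, J=200, steps=60_000):
    """Zero-sum Moran process: one death and one fitness-weighted birth per step."""
    community = rng.integers(0, len(fitness), size=J)  # individuals labelled by species
    for _ in range(steps):
        dead = rng.integers(J)
        w = fitness[community]
        community[dead] = community[rng.choice(J, p=w / w.sum())]
    return len(np.unique(community))

S = 20
print("strictly neutral richness:    ", moran_richness(np.ones(S)))
print("5% random fitness differences:", moran_richness(1 + 0.05 * rng.standard_normal(S)))
```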

  15. Quasispecies dynamics with network constraints.

    Science.gov (United States)

    Barbosa, Valmir C; Donangelo, Raul; Souza, Sergio R

    2012-11-07

    A quasispecies is a set of interrelated genotypes that have reached a stationary state while evolving according to the usual Darwinian principles of selection and mutation. Quasispecies studies invariably assume that it is possible for any genotype to mutate into any other, but recent findings indicate that this assumption is not necessarily true. Here we revisit the traditional quasispecies theory by adopting a network structure to constrain the occurrence of mutations. Such structure is governed by a random-graph model, whose single parameter (a probability p) controls both the graph's density and the dynamics of mutation. We contribute two further modifications to the theory, one to account for the fact that different loci in a genotype may be differently susceptible to the occurrence of mutations, the other to allow for a more plausible description of the transition from adaptation to degeneracy of the quasispecies as p is increased. We give analytical and simulation results for the usual case of binary genotypes, assuming a fitness landscape in which a genotype's fitness decays exponentially with its Hamming distance to the wild type. These results support the theory's assertions regarding the adaptation of the quasispecies to the fitness landscape and also its possible demise as a function of p.
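
    A minimal sketch of the setup described above (invented parameters): binary genotypes, mutation allowed only along the edges of an Erdős-Rényi random graph with edge probability p, and fitness decaying exponentially with Hamming distance to the wild type.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
L = 8                                         # genotype length
genotypes = list(itertools.product([0, 1], repeat=L))
N = len(genotypes)
ham = np.array([sum(g) for g in genotypes], dtype=float)  # distance to wild type 00...0
fitness = np.exp(-ham)                        # fitness decays exponentially with distance

p = 0.05                                      # the model's single parameter
adj = np.triu(rng.random((N, N)) < p, 1)
A = (adj | adj.T).astype(float)               # symmetric random mutation network
deg = np.maximum(A.sum(axis=1), 1.0)

mu, x = 0.01, np.full(N, 1.0 / N)             # mutation probability, genotype frequencies
for _ in range(2000):                         # iterate a discrete quasispecies map
    w = fitness * x                           # reproductive output per genotype
    x = (1 - mu) * w + mu * (A @ (w / deg))   # faithful copies plus constrained mutants
    x /= x.sum()                              # renormalize (selection)
print(f"stationary wild-type share: {x[0]:.3f}")
```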

  16. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and…
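
    One of the plausible-reasoning patterns named above, analogical reasoning, reduces in its simplest form to similarity-guided inference. A toy sketch (invented records and attribute names; the actual system operates over Semantic Web ontologies and rules):

```python
def similarity(a, b, keys):
    """Fraction of jointly known attributes on which two records agree."""
    shared = [k for k in keys if a.get(k) is not None and b.get(k) is not None]
    return sum(a[k] == b[k] for k in shared) / max(len(shared), 1)

patients = [
    {"fatigue": True,  "jaundice": True,  "bilirubin_high": True,  "diagnosis": "hepatitis"},
    {"fatigue": False, "jaundice": False, "bilirubin_high": False, "diagnosis": "healthy"},
]
query = {"fatigue": True, "jaundice": True, "bilirubin_high": None, "diagnosis": None}

keys = ["fatigue", "jaundice", "bilirubin_high"]
analogue = max(patients, key=lambda p: similarity(query, p, keys))
# Plausibly (not deductively) fill the gaps from the most similar complete record
print({k: analogue[k] for k, v in query.items() if v is None})
```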

  17. Biologic plausibility, cellular effects, and molecular mechanisms of eicosapentaenoic acid (EPA) in atherosclerosis.

    Science.gov (United States)

    Borow, Kenneth M; Nelson, John R; Mason, R Preston

    2015-09-01

    Residual cardiovascular (CV) risk remains in dyslipidemic patients despite intensive statin therapy, underscoring the need for additional intervention. Eicosapentaenoic acid (EPA), an omega-3 polyunsaturated fatty acid, is incorporated into membrane phospholipids and atherosclerotic plaques and exerts beneficial effects on the pathophysiologic cascade from onset of plaque formation through rupture. Specific salutary actions have been reported relating to endothelial function, oxidative stress, foam cell formation, inflammation, plaque formation/progression, platelet aggregation, thrombus formation, and plaque rupture. EPA also improves atherogenic dyslipidemia characterized by reduction of triglycerides without raising low-density lipoprotein cholesterol. Other beneficial effects of EPA include vasodilation, resulting in blood pressure reductions, as well as improved membrane fluidity. EPA's effects are at least additive to those of statins when given as adjunctive therapy. In this review, we present data supporting the biologic plausibility of EPA as an anti-atherosclerotic agent with potential clinical benefit for prevention of CV events, as well as its cellular effects and molecular mechanisms of action. REDUCE-IT is an ongoing, randomized, controlled study evaluating whether the high-purity ethyl ester of EPA (icosapent ethyl) at 4 g/day combined with statin therapy is superior to statin therapy alone for reducing CV events in high-risk patients with mixed dyslipidemia. The results from this study are expected to clarify the role of EPA as adjunctive therapy to a statin for reduction of residual CV risk.

  18. Mindfulness and Cardiovascular Disease Risk: State of the Evidence, Plausible Mechanisms, and Theoretical Framework.

    Science.gov (United States)

    Loucks, Eric B; Schuman-Olivier, Zev; Britton, Willoughby B; Fresco, David M; Desbordes, Gaelle; Brewer, Judson A; Fulwiler, Carl

    2015-12-01

    The purpose of this review is to provide (1) a synopsis on relations of mindfulness with cardiovascular disease (CVD) and major CVD risk factors, and (2) an initial consensus-based overview of mechanisms and theoretical framework by which mindfulness might influence CVD. Initial evidence, often of limited methodological quality, suggests possible impacts of mindfulness on CVD risk factors including physical activity, smoking, diet, obesity, blood pressure, and diabetes regulation. Plausible mechanisms include (1) improved attention control (e.g., ability to hold attention on experiences related to CVD risk, such as smoking, diet, physical activity, and medication adherence), (2) emotion regulation (e.g., improved stress response, self-efficacy, and skills to manage craving for cigarettes, palatable foods, and sedentary activities), and (3) self-awareness (e.g., self-referential processing and awareness of physical sensations due to CVD risk factors). Understanding mechanisms and theoretical framework should improve etiologic knowledge, providing customized mindfulness intervention targets that could enable greater mindfulness intervention efficacy.

  19. A plausible (overlooked) super-luminous supernova in the SDSS Stripe 82 data

    CERN Document Server

    Kostrzewa-Rutkowska, Zuzanna; Wyrzykowski, Lukasz; Djorgovski, S George; Glikman, Eilat; Mahabal, Ashish A

    2013-01-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova peaked at M_g<-21.3 mag in the second half of September 2005, but was missed by the real-time supernova hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline time lasted at least 70 days (observed frame), closely resembling other SLSNe of SN2007bi type. Spectrum of the host galaxy reveals a redshift of z=0.281 and the distance modulus of \\mu=40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with the absolute magnitude of M_B=-18.2+/-0.2 mag and the oxygen abundance of 12+log[O/H]=8.3+/-0.2. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence for SLSNe to explode in low-luminosity ...

  20. From ether to acid: A plausible degradation pathway of glycerol dialkyl glycerol tetraethers

    Science.gov (United States)

    Liu, Xiao-Lei; Birgel, Daniel; Elling, Felix J.; Sutton, Paul A.; Lipp, Julius S.; Zhu, Rong; Zhang, Chuanlun; Könneke, Martin; Peckmann, Jörn; Rowland, Steven J.; Summons, Roger E.; Hinrichs, Kai-Uwe

    2016-06-01

    Glycerol dialkyl glycerol tetraethers (GDGTs) are ubiquitous microbial lipids with extensive demonstrated and potential roles as paleoenvironmental proxies. Despite the great attention they receive, comparatively little is known regarding their diagenetic fate. Putative degradation products of GDGTs, identified as hydroxyl and carboxyl derivatives, were detected in lipid extracts of marine sediment, seep carbonate, hot spring sediment and cells of the marine thaumarchaeon Nitrosopumilus maritimus. The distribution of GDGT degradation products in environmental samples suggests that both biotic and abiotic processes act as sinks for GDGTs. More than a hundred newly recognized degradation products afford a view of the stepwise degradation of GDGT via (1) ether bond hydrolysis yielding hydroxyl isoprenoids, namely, GDGTol (glycerol dialkyl glycerol triether alcohol), GMGD (glycerol monobiphytanyl glycerol diether), GDD (glycerol dibiphytanol diether), GMM (glycerol monobiphytanol monoether) and bpdiol (biphytanic diol); (2) oxidation of isoprenoidal alcohols into corresponding carboxyl derivatives and (3) chain shortening to yield C39 and smaller isoprenoids. This plausible GDGT degradation pathway from glycerol ethers to isoprenoidal fatty acids provides the link to commonly detected head-to-head linked long chain isoprenoidal hydrocarbons in petroleum and sediment samples. The problematic C80 to C82 tetraacids that cause naphthenate deposits in some oil production facilities can be generated from H-shaped glycerol monoalkyl glycerol tetraethers (GMGTs) following the same process, as indicated by the distribution of related derivatives in hydrothermally influenced sediments.

  1. Plausible molecular and crystal structures of chitosan/HI type II salt.

    Science.gov (United States)

    Lertworasirikul, Amornrat; Noguchi, Keiichi; Ogawa, Kozo; Okuyama, Kenji

    2004-03-15

    Chitosan/HI type II salt prepared from crab tendon was investigated by X-ray fiber diffraction. Two polymer chains and 16 iodide ions (I(-)) crystallized in a tetragonal unit cell with lattice parameters of a = b = 10.68(3), c (fiber axis) = 40.77(13) A, and a space group P4(1). Chitosan forms a fourfold helix with a 40.77 A fiber period having a disaccharide as the helical asymmetric unit. One of the O-3...O-5 intramolecular hydrogen bonds at the glycosidic linkage is weakened by interacting with iodide ions, which seems to cause the polymer to take the 4/1-helical symmetry rather than the extended 2/1-helix. The plausible orientations of the two O-6 atoms in the helical asymmetric unit were found to be gt and gg. Two chains run through the corner and the center of the unit cell along the c-axis. They are linked by hydrogen bonds between N-21 and O-61 atoms. Two out of four independent iodide ions are packed between the corner chains, while the other two are packed between the corner and center chains when viewed along the ab-plane. The crystal structure of the salt is stabilized by hydrogen bonds between these iodide ions and the N-21, N-22, O-32, O-61, and O-62 atoms of the polymer chains.

  2. Solvent effects on the photochemistry of 4-aminoimidazole-5-carbonitrile, a prebiotically plausible precursor of purines.

    Science.gov (United States)

    Szabla, Rafał; Sponer, Judit E; Sponer, Jiří; Sobolewski, Andrzej L; Góra, Robert W

    2014-09-01

    4-Aminoimidazole-5-carbonitrile (AICN) was suggested as a prebiotically plausible precursor of purine nucleobases and nucleotides. Although it can be formed in a sequence of photoreactions, AICN is immune to further irradiation with UV-light. We present state-of-the-art multi-reference quantum-chemical calculations of potential energy surface cuts and conical intersection optimizations to explain the molecular mechanisms underlying the photostability of this compound. We have identified the N-H bond stretching and ring-puckering mechanisms that should be responsible for the photochemistry of AICN in the gas phase. We have further considered the photochemistry of AICN-water clusters, while including up to six explicit water molecules. The calculations reveal charge transfer to solvent followed by formation of an H3O(+) cation, both of which occur on the (1)πσ* hypersurface. Interestingly, a second proton transfer to an adjacent water molecule leads to a (1)πσ*/S0 conical intersection. We suggest that this electron-driven proton relay might be characteristic of low-lying (1)πσ* states in chromophore-water clusters. Owing to its nature, this mechanism might also be responsible for the photostability of analogous organic molecules in bulk water.

  3. Plausible ergogenic effects of vitamin D on athletic performance and recovery.

    Science.gov (United States)

    Dahlquist, Dylan T; Dieter, Brad P; Koehle, Michael S

    2015-01-01

    The purpose of this review is to examine vitamin D in the context of sport nutrition and its potential role in optimizing athletic performance. Vitamin D receptors (VDR) and vitamin D response elements (VDREs) are located in almost every tissue within the human body including skeletal muscle. The hormonally-active form of vitamin D, 1,25-dihydroxyvitamin D, has been shown to play critical roles in the human body and regulates over 900 gene variants. Based on the literature presented, it is plausible that vitamin D levels above the normal reference range (up to 100 nmol/L) might increase skeletal muscle function, decrease recovery time from training, increase both force and power production, and increase testosterone production, each of which could potentiate athletic performance. Therefore, maintaining higher levels of vitamin D could prove beneficial for athletic performance. Despite this situation, large portions of athletic populations are vitamin D deficient. Currently, the research is inconclusive with regards to the optimal intake of vitamin D, the specific forms of vitamin D one should ingest, and the distinct nutrient-nutrient interactions of vitamin D with vitamin K that affect arterial calcification and hypervitaminosis. Furthermore, it is possible that dosages exceeding the recommendations for vitamin D (i.e. dosages up to 4000-5000 IU/day), in combination with 50 to 1000 mcg/day of vitamin K1 and K2 could aid athletic performance. This review will investigate these topics, and specifically their relevance to athletic performance.

  4. A simple biophysically plausible model for long time constants in single neurons.

    Science.gov (United States)

    Tiganj, Zoran; Hasselmo, Michael E; Howard, Marc W

    2015-01-01

    Recent work in computational neuroscience and cognitive psychology suggests that a set of cells that decay exponentially could be used to support memory for the time at which events took place. Analytically and through simulations on a biophysical model of an individual neuron, we demonstrate that exponentially decaying firing with a range of time constants up to minutes could be implemented using a simple combination of well-known neural mechanisms. In particular, we consider firing supported by calcium-controlled cation current. When the amount of calcium leaving the cell during an interspike interval is larger than the calcium influx during a spike, the overall decay in calcium concentration can be exponential, resulting in exponential decay of the firing rate. The time constant of the decay can be several orders of magnitude larger than the time constant of calcium clearance, and it could be controlled externally via a variety of biologically plausible ways. The ability to flexibly and rapidly control time constants could enable working memory of temporal history to be generalized to other variables in computing spatial and ordinal representations.
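
    A minimal sketch of the mechanism described above (all constants invented): when the calcium gained per spike almost balances the calcium cleared between spikes, the firing rate decays exponentially with an effective time constant orders of magnitude longer than the clearance time constant.

```python
import numpy as np

dt = 1e-3        # s
tau_ca = 0.2     # s: fast calcium clearance time constant
gain = 50.0      # firing rate (Hz) per unit calcium, via the cation current
influx = 0.0998  # calcium influx per spike, slightly below the balance point

# Mean-field dynamics: d[Ca]/dt = rate*influx - [Ca]/tau_ca, with rate = gain*[Ca]
ca, ts, rates = 1.0, [], []
for step in range(int(300 / dt)):
    rate = gain * ca
    ca += (rate * influx - ca / tau_ca) * dt
    if step % 1000 == 0:
        ts.append(step * dt); rates.append(rate)

tau_eff = -1.0 / np.polyfit(ts, np.log(rates), 1)[0]
print(f"clearance tau = {tau_ca} s -> effective firing-rate tau ~ {tau_eff:.0f} s")
```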

  5. A plausible mechanism of biosorption in dual symbioses by vesicular-arbuscular mycorrhizal in plants.

    Science.gov (United States)

    Azmat, Rafia; Hamid, Neelofer

    2015-03-01

    Dual symbioses of vesicular-arbuscular mycorrhizal (VAM) fungi with the growth of Momordica charantia were elucidated in terms of a plausible mechanism of biosorption in this article. The experiment was conducted in a greenhouse, and a mixed inoculum of the VAM fungi was used in the three replicates. Results demonstrated that the starch contents were the main source of C for the VAM to build their hyphae. The increased plant height and leaf surface area were explained in relation to an increase in the photosynthetic rates to produce rapid sugar contents for the survival of plants. A decrease in protein and amino acid contents and increased proline and protease activity in VAM plants suggested that these contents were the main bio-indicators of the plants under biotic stress. The decline in protein may be due to the degradation of these contents, which are later converted into dextrose, a form that can easily be absorbed during the period of symbiosis. A mechanism of C chemisorption in relation to the physiology and morphology of the plant was discussed.

  6. Vitamin D in primary biliary cirrhosis, a plausible marker of advanced disease.

    Science.gov (United States)

    Agmon-Levin, Nancy; Kopilov, Ron; Selmi, Carlo; Nussinovitch, Udi; Sánchez-Castañón, María; López-Hoyos, Marcos; Amital, Howie; Kivity, Shaye; Gershwin, Eric M; Shoenfeld, Yehuda

    2015-02-01

    Vitamin D immune-modulating effects have been extensively studied, and low levels have been linked with autoimmune diseases. The associations of vitamin D with autoimmune diseases of the liver, and particularly primary biliary cirrhosis (PBC), are yet to be defined. Hence, in this study, serum levels of vitamin D were determined in 79 patients with PBC and 70 age- and sex-matched controls by the LIAISON chemiluminescent immunoassays (DiaSorin-Italy). Clinical and serological parameters of patients were analyzed with respect to vitamin D status. Mean levels of vitamin D were significantly lower among patients with PBC compared with controls (16.8 ± 9 vs. 22.1 ± 9 ng/ml; p = 0.029), and vitamin D deficiency (≤10 ng/ml) was documented in 33% of patients with PBC versus 7% of controls. These findings suggest plausible roles of vitamin D as a prognostic marker of PBC severity, and as a potential player in this disease pathogenesis. While further studies are awaited, monitoring vitamin D in patients with PBC and use of supplements may be advisable.

  7. Is the de Broglie-Bohm interpretation of quantum mechanics really plausible?

    Science.gov (United States)

    Jung, Kurt

    2013-06-01

    Bohmian mechanics, also known as de Broglie-Bohm theory, is the most popular alternative approach to quantum mechanics. Whereas the standard interpretation of quantum mechanics is based on the complementarity principle, Bohmian mechanics assumes that both particle and wave are concrete physical objects. In 1993 Peter Holland wrote an ardent account of the plausibility of the de Broglie-Bohm theory. He proved that it fully reproduces quantum mechanics if the initial particle distribution is consistent with a solution of the Schrödinger equation. What might be the reasons that Bohmian mechanics has not yet found global acceptance? In this article it will be shown that predicted properties of atoms and molecules are in conflict with experimental findings. Moreover it will be demonstrated that repeatedly published ensembles of trajectories illustrating double-slit diffraction processes do not agree with quantum mechanics. The credibility of a theory is undermined when recognizably wrong data, presented frequently over years, are finally not declared obsolete.

  8. Plausible futures of a social-ecological system: Yahara watershed, Wisconsin, USA

    Directory of Open Access Journals (Sweden)

    Stephen R. Carpenter

    2015-06-01

    Agricultural watersheds are affected by changes in climate, land use, agricultural practices, and human demand for energy, food, and water resources. In this context, we analyzed the agricultural, urbanizing Yahara watershed (size: 1345 km², population: 372,000) to assess its responses to multiple changing drivers. We measured recent trends in land use/cover and water quality of the watershed, spatial patterns of 10 ecosystem services, and spatial patterns and nestedness of governance. We developed scenarios for the future of the Yahara watershed by integrating trends and events from the global scenarios literature, perspectives of stakeholders, and models of biophysical drivers and ecosystem services. Four qualitative scenarios were created to explore plausible trajectories to the year 2070 in the watershed's social-ecological system under different regimes: no action on environmental trends, accelerated technological development, strong intervention by government, and shifting values toward sustainability. Quantitative time-series for 2010-2070 were developed for weather and land use/cover during each scenario as inputs to model changes in ecosystem services. Ultimately, our goal is to understand how changes in the social-ecological system of the Yahara watershed, including management of land and water resources, can build or impair resilience to shifting drivers, including climate.

  9. Plausible impact of global climate change on water resources in the Tarim River Basin

    Institute of Scientific and Technical Information of China (English)

    CHEN; Yaning; XU; Zongxue

    2005-01-01

    Combining the temperature and precipitation data from 77 climatological stations and the climatic and hydrological change data from three headstreams of the Tarim River: Hotan, Yarkant, and Aksu in the study area, the plausible association between climate change and the variability of water resources in the Tarim River Basin in recent years was investigated, the long-term trend of the hydrological time series including temperature, precipitation, and streamflow was detected, and the possible association between the El Niño/Southern Oscillation (ENSO) and these three kinds of time series was tested. The results obtained in this study show that during the past years, the temperature experienced a significant monotonic increase (at the 5% significance level), amounting to nearly a 1°C rise; the precipitation showed a significant decrease in the 1970s, and a significant increase in the 1980s and 1990s, with the average annual precipitation increasing at a rate of 6.8 mm per decade. A step change occurred in both temperature and precipitation time series around 1986, which may be influenced by the global climate change. Climate change resulted in the increase of the streamflow at the headwater of the Tarim River, but anthropogenic activities such as over-depletion of the surface water resulted in the decrease of the streamflow at the lower reaches of the Tarim River. The study result also showed that there is no significant association between ENSO and the temperature, precipitation, and streamflow.
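
    Monotonic-trend detection of the kind reported above is commonly done with the Mann-Kendall test; a minimal sketch on synthetic data (no correction for ties or autocorrelation):

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Return (z, two-sided p) for the Mann-Kendall monotonic trend test."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.sign(x[None, :] - x[:, None])[np.triu_indices(n, 1)].sum()
    var = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var) if s != 0 else 0.0
    return z, 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

rng = np.random.default_rng(0)
years = np.arange(1960, 2001)
temp = 0.025 * (years - 1960) + rng.normal(0, 0.3, len(years))  # ~1 degC rise plus noise
z, p = mann_kendall(temp)
print(f"z = {z:.2f}, p = {p:.4f}, significant at the 5% level: {p < 0.05}")
```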

  10. Evaluating risk factor assumptions: a simulation-based approach

    Directory of Open Access Journals (Sweden)

    Miglioretti Diana L

    2011-09-01

    Background Microsimulation models are an important tool for estimating the comparative effectiveness of interventions through prediction of individual-level disease outcomes for a hypothetical population. To estimate the effectiveness of interventions targeted toward high risk groups, the mechanism by which risk factors influence the natural history of disease must be specified. We propose a method for evaluating these risk factor assumptions as part of model-building. Methods We used simulation studies to examine the impact of risk factor assumptions on the relative rate (RR) of colorectal cancer (CRC) incidence and mortality for a cohort with a risk factor compared to a cohort without the risk factor, using an extension of the CRC-SPIN model for colorectal cancer. We also compared the impact of changing age at initiation of screening colonoscopy for different risk mechanisms. Results Across CRC-specific risk factor mechanisms, the RR of CRC incidence and mortality decreased (towards one) with increasing age. The rate of change in RRs across age groups depended on both the risk factor mechanism and the strength of the risk factor effect. Increased non-CRC mortality attenuated the effect of CRC-specific risk factors on the RR of CRC when both were present. For each risk factor mechanism, earlier initiation of screening resulted in more life years gained, though the magnitude of life years gained varied across risk mechanisms. Conclusions Simulation studies can provide insight into both the effect of risk factor assumptions on model predictions and the type of data needed to calibrate risk factor models.

  11. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications, and on-site investigations of the monuments, the article traces the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its composite construction, and offers its interpretation in the all-Russian architectural context. Typological features of the individual constructions come to light. The Prechistinsky bell tower has an untypical architectural solution: a “hexagonal structure on octagonal and quadrangular structures”. The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic constructions and was exceedingly rare in kremlins, farmsteads, and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto (“the Place of Execution”), located on an axis from the west and connected with the main building by a quarter-turn stair with a landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article also considers the version that the Place of Execution emerged on the basis of an earlier construction, a tower called “the Peal”, which is repeatedly mentioned in written sources in connection with S. Razin’s revolt. The metropolitan Sampson, trying to preserve the standing of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, to emphasize continuity and close connection with Moscow.

  12. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG) model (Borges and Levene, 1999). These techniques typically rely on the Markov assumption with history depth n, i.e., it is assumed that the next requested page is only dependent on the last n pages visited. This is not always valid, i.e., false browsing patterns may be discovered. However, to our…
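
    The Markov assumption with history depth n can be sketched directly as an n-gram model over page requests (toy sessions invented):

```python
from collections import Counter, defaultdict

def fit_markov(sessions, n=2):
    """Count next-page frequencies conditioned on the last n requested pages."""
    model = defaultdict(Counter)
    for s in sessions:
        for i in range(len(s) - n):
            model[tuple(s[i:i + n])][s[i + n]] += 1
    return model

def predict(model, history, n=2):
    counts = model.get(tuple(history[-n:]))
    return counts.most_common(1)[0][0] if counts else None

sessions = [["home", "search", "item", "cart"],
            ["home", "search", "item", "item"],
            ["home", "help", "search", "item"]]
model = fit_markov(sessions, n=2)
print(predict(model, ["home", "search"]))  # -> 'item'
```

    The pitfall the abstract points to is visible here: with too small an n, subsequences from different sessions blend together and can suggest "browsing patterns" that no user actually followed.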

  13. AN EFFICIENT BIT COMMITMENT SCHEME BASED ON FACTORING ASSUMPTION

    Institute of Scientific and Technical Information of China (English)

    Zhong Ming; Yang Yixian

    2001-01-01

    Recently, many bit commitment schemes have been presented. This paper presents a new practical bit commitment scheme based on Schnorr's one-time knowledge proof scheme, where the use of the cut-and-choose method and many random exam candidates in the protocols is replaced by a single challenge number. Therefore the proposed bit commitment scheme is more efficient and practical than the previous schemes. In addition, the security of the proposed scheme under the factoring assumption is proved, thus clarifying the cryptographic basis of the proposed scheme.
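
    For orientation only, a classic factoring-flavoured construction (Goldwasser-Micali style, based on quadratic residuosity) shows what a bit commitment looks like; this is not the paper's scheme, and the toy parameters below are far too small for real use:

```python
import random

p, q = 1000003, 1000033       # toy primes; a real scheme uses ~1024-bit factors
N = p * q

def is_nonresidue(y, prime):  # Euler's criterion
    return pow(y, (prime - 1) // 2, prime) == prime - 1

# Public y: a quadratic non-residue modulo both factors
y = next(v for v in range(2, N) if is_nonresidue(v, p) and is_nonresidue(v, q))

def commit(b):
    r = random.SystemRandom().randrange(2, N)
    return pow(y, b, N) * pow(r, 2, N) % N, r   # (commitment, opening)

def verify(c, b, r):
    return c == pow(y, b, N) * pow(r, 2, N) % N

c, r = commit(1)
print(verify(c, 1, r), verify(c, 0, r))  # True False
```

    Binding is unconditional (y times a square is never a square mod N), while hiding rests on the quadratic residuosity assumption; anyone who can factor N can break it.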

  14. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality…

  15. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    …articles across various research disciplines. We find and classify a stock of 107 relevant articles into four scientific discourses: the normative, the interpretive, the critical, and the dialogical discourses, as formulated by Deetz (1996). We find that the normative discourse dominates the IT PPM literature, and few contributions represent the three remaining discourses, which unjustifiably leaves out issues that research could and most probably should investigate. In order to highlight research potentials, limitations, and underlying assumptions of each discourse, we develop four IT PPM metaphors…

  16. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\\chi-\\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\\tilde{h}(p_R)$. The entire family of conventional halo-independent $\\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\\tilde{h}(p_R)$ plot through a simple re...

  17. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Kastengren, Alan L [ANL

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and is evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented. The important parameters are quantified. Experimental procedures for minimizing these errors are presented. An iterative finite element procedure to correct for the errors is also presented. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to put in a known profile of residual stresses.

  18. On the role of assumptions in cladistic biogeographical analyses

    Directory of Open Access Journals (Sweden)

    Charles Morphy Dias dos Santos

    2011-01-01

    The biogeographical Assumptions 0, 1, and 2 (respectively A0, A1 and A2) are theoretical terms used to interpret and resolve incongruence in order to find general areagrams. The aim of this paper is to suggest the use of A2 instead of A0 and A1 in solving uncertainties during cladistic biogeographical analyses. In a theoretical example, using Component Analysis and Primary Brooks Parsimony Analysis (primary BPA), A2 allows for the reconstruction of the true sequence of disjunction events within a hypothetical scenario, while A0 adds spurious area relationships. A0, A1 and A2 are interpretations of the relationships between areas, not between taxa. Since area relationships are not equivalent to cladistic relationships, it is inappropriate to use the distributional information of taxa to resolve ambiguous patterns in areagrams, as A0 does. Although ambiguity in areagrams is virtually impossible to explain, A2 is better and more neutral than any other biogeographical assumption.

  19. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  20. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2011-01-01

    …generalized to use d-DDH instead, and we show in the generic group model that d-DDH is harder than DDH. This means that virtually any application of DDH can now be realized with the same (amortized) efficiency, but under a potentially weaker assumption. On the negative side, we also show that d-DDH, just like DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X) = X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups and that in fact the problems become harder with increasing d and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. This can be seen…
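
    For context, the plain DDH assumption that these variants generalize states that (g^a, g^b, g^(ab)) is computationally indistinguishable from (g^a, g^b, g^c). A toy instance over a small Schnorr group (parameters far too small for security):

```python
import random

p, q = 2039, 1019              # q divides p - 1; both prime (toy sizes)
g = pow(2, (p - 1) // q, p)    # element of the order-q subgroup of Z_p^*
assert g != 1 and pow(g, q, p) == 1

rng = random.SystemRandom()
a, b, c = (rng.randrange(1, q) for _ in range(3))
real = (pow(g, a, p), pow(g, b, p), pow(g, a * b % q, p))
fake = (pow(g, a, p), pow(g, b, p), pow(g, c, p))
# DDH: no efficient algorithm should tell these two distributions apart
print(real, fake)
```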

  1. Time derivatives of the spectrum: Relaxing the stationarity assumption

    Science.gov (United States)

    Prieto, G. A.; Thomson, D. J.; Vernon, F. L.

    2005-12-01

    Spectrum analysis of seismic waveforms has played a significant role towards the understanding of multiple aspects of Earth structure and earthquake source physics. In recent years the multitaper spectrum estimation approach (Thomson, 1982) has been applied to geophysical problems, providing not only reliable estimates of the spectrum, but also estimates of spectral uncertainties (Thomson and Chave, 1991). However, these improved spectral estimates were developed under the assumption of local stationarity and provide an incomplete description of the observed process. It is obvious that due to the intrinsic attenuation of the Earth, the amplitudes, and thus the frequency contents, are changing with time as waves pass through a seismic station. There have been considerable improvements in techniques to analyze non-stationary signals, including wavelet decomposition, the Wigner-Ville spectrum, and the dual-frequency spectrum. We apply one of the recently developed techniques, the Quadratic Inverse (QI) theory (Thomson, 1990, 1994), combined with the multitaper technique, to look at the time derivatives of the spectrum. If the spectrum is reasonably white in a certain bandwidth, using QI theory we can estimate the derivatives of the spectrum at each frequency. We test synthetic signals to corroborate the approach and apply it to the records of small earthquakes at local distances. This is a first approach to combining classical spectrum analysis with a relaxation of the stationarity assumption that is generally made.
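
    The multitaper estimate referred to above averages eigenspectra obtained with orthogonal Slepian (DPSS) tapers; a minimal sketch on synthetic data (standard SciPy windows, invented signal):

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, NW=4.0):
    """Average the eigenspectra of K = 2*NW - 1 Slepian tapers."""
    K = int(2 * NW - 1)
    tapers = dpss(len(x), NW, Kmax=K)                    # shape (K, n)
    eig = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2   # per-taper spectra
    return np.fft.rfftfreq(len(x), d=1.0 / fs), eig.mean(axis=0) / fs

fs = 100.0
t = np.arange(0, 20, 1 / fs)
x = np.exp(-t / 5) * np.sin(2 * np.pi * 5 * t)           # decaying 5 Hz arrival
x += 0.1 * np.random.default_rng(0).standard_normal(len(t))
freqs, psd = multitaper_psd(x, fs)
print(f"spectral peak at {freqs[np.argmax(psd)]:.2f} Hz")
```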

  2. Relaxing the zero-sum assumption in neutral biodiversity theory.

    Science.gov (United States)

    Haegeman, Bart; Etienne, Rampal S

    2008-05-21

    The zero-sum assumption is one of the ingredients of the standard neutral model of biodiversity by Hubbell. It states that the community is saturated all the time, which in this model means that the total number of individuals in the community is constant over time, and therefore introduces a coupling between species abundances. It was shown recently that a neutral model with independent species, and thus without any coupling between species abundances, has the same sampling formula (given a fixed number of individuals in the sample) as the standard model [Etienne, R.S., Alonso, D., McKane, A.J., 2007. The zero-sum assumption in neutral biodiversity theory. J. Theor. Biol. 248, 522-536]. The equilibria of both models are therefore equivalent from a practical point of view. Here we show that this equivalence can be extended to a class of neutral models with density-dependence on the community-level. This result can be interpreted as robustness of the model, i.e. insensitivity of the model to the precise interaction of the species in a neutral community. It can also be interpreted as a lack of resolution, as different mechanisms of interactions between neutral species cannot be distinguished using only a single snapshot of species abundance data.

  3. Assumptions about footprint layer heights influence the quantification of emission sources: a case study for Cyprus

    Science.gov (United States)

    Hüser, Imke; Harder, Hartwig; Heil, Angelika; Kaiser, Johannes W.

    2017-09-01

    Lagrangian particle dispersion models (LPDMs) in backward mode are widely used to quantify the impact of transboundary pollution on downwind sites. Most LPDM applications count particles with a technique that introduces a so-called footprint layer (FL) with constant height, in which passing air tracer particles are assumed to be affected by surface emissions. The mixing layer dynamics are represented by the underlying meteorological model. This particle counting technique implicitly assumes that the atmosphere is well mixed in the FL. We have performed backward trajectory simulations with the FLEXPART model starting at Cyprus to calculate the sensitivity to emissions of upwind pollution sources. The emission sensitivity is used to quantify source contributions at the receptor and support the interpretation of ground measurements carried out during the CYPHEX campaign in July 2014. Here we analyse the effects of different constant and dynamic FL height assumptions. The results show that calculations with FL heights of 100 and 300 m yield similar but still discernible results. Comparison of calculations with FL heights constant at 300 m and dynamically following the planetary boundary layer (PBL) height exhibits systematic differences, with daytime and night-time sensitivity differences compensating for each other. The differences at daytime, when a well-mixed PBL can be assumed, indicate that residual inaccuracies in the representation of the mixing layer dynamics in the trajectories may introduce errors in the impact assessment on downwind sites. Emissions from vegetation fires are mixed up by pyrogenic convection, which is not represented in FLEXPART. Neglecting this convection may lead to severe over- or underestimations of the downwind smoke concentrations. Introducing an extreme fire source from a different year in our study period and using fire-observation-based plume heights as reference, we find an overestimation of more than 60 % by the constant FL height…
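
    A minimal sketch of the particle-counting technique described above (invented arrays): emission sensitivity accumulates the time back-trajectory particles spend below the footprint-layer top, whether that top is a constant height or the local PBL height.

```python
import numpy as np

def emission_sensitivity(z, dt, fl_top):
    """Total particle residence time below the footprint-layer top.

    z: particle heights, shape (n_particles, n_steps) [m]
    fl_top: scalar (constant FL) or per-step array (e.g. PBL height) [m]
    """
    below = z < np.broadcast_to(fl_top, z.shape)
    return below.sum() * dt

rng = np.random.default_rng(2)
z = np.abs(rng.normal(400, 300, size=(1000, 48)))                   # hypothetical heights
pbl = 300 + 700 * np.maximum(0, np.sin(np.linspace(0, np.pi, 48)))  # diurnal PBL [m]

for label, top in [("FL = 100 m", 100.0), ("FL = 300 m", 300.0), ("FL = PBL", pbl)]:
    print(label, emission_sensitivity(z, dt=3600.0, fl_top=top), "s")
```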

  4. Flux-based transport enhancement as a plausible unifying mechanism for auxin transport in meristem development.

    Directory of Open Access Journals (Sweden)

    Szymon Stoma

    2008-10-01

    Plants continuously generate new organs through the activity of populations of stem cells called meristems. The shoot apical meristem initiates leaves, flowers, and lateral meristems in highly ordered, spiralled, or whorled patterns via a process called phyllotaxis. It is commonly accepted that the active transport of the plant hormone auxin plays a major role in this process. Current hypotheses propose that cellular hormone transporters of the PIN family would create local auxin maxima at precise positions, which in turn would lead to organ initiation. To explain how auxin transporters could create hormone fluxes to distinct regions within the plant, different concepts have been proposed. A major hypothesis, canalization, proposes that the auxin transporters act by amplifying and stabilizing existing fluxes, which could be initiated, for example, by local diffusion. This convincingly explains the organised auxin fluxes during vein formation, but for the shoot apical meristem a second hypothesis was proposed, where the hormone would be systematically transported towards the areas with the highest concentrations. This implies the coexistence of two radically different mechanisms for PIN allocation in the membrane, one based on flux sensing and the other on local concentration sensing. Because these patterning processes require the interaction of hundreds of cells, it is impossible to estimate on a purely intuitive basis if a particular scenario is plausible or not. Therefore, computational modelling provides a powerful means to test this type of complex hypothesis. Here, using a dedicated computer simulation tool, we show that a flux-based polarization hypothesis is able to explain auxin transport at the shoot meristem as well, thus providing a unifying concept for the control of auxin distribution in the plant. Further experiments are now required to distinguish between flux-based polarization and other hypotheses.
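
    A minimal one-dimensional sketch of flux-based transport enhancement (all rates invented, loosely in the spirit of Mitchison-style canalization models, not the paper's implementation): carriers on each wall are up-regulated by the flux they carry, which amplifies and stabilizes an initial flux from a source toward a sink.

```python
import numpy as np

n = 20                      # file of cells
a = np.ones(n)              # auxin concentration per cell
pin = np.full(n - 1, 0.1)   # carrier strength on each rightward wall
D, dt = 0.05, 0.05          # passive diffusion coefficient, time step

for _ in range(20000):
    a[0], a[-1] = 2.0, 0.0                  # auxin source and sink
    flux = pin * a[:-1]                     # carrier-mediated transport
    net = flux - D * np.diff(a)             # minus back-diffusion
    a[:-1] -= dt * net
    a[1:] += dt * net
    # Flux-based enhancement: carriers grow with the flux they carry, and turn over
    pin += dt * (0.1 * flux / (1 + flux) - 0.1 * pin)

print("carrier profile along the file:", pin.round(2))
```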

  5. Identifying and reducing potentially wrong immunoassay results even when plausible and "not-unreasonable".

    Science.gov (United States)

    Ismail, Adel A A

    2014-01-01

    The primary role of the clinical laboratory is to report accurate results for diagnosis of disease and management of illnesses. This goal has, to a large extent, been achieved for routine biochemical tests, but not for immunoassays, which have remained susceptible to interference from endogenous immunoglobulin antibodies, causing false and clinically misleading results. Clinicians regard all abnormal results, including false ones, as "pathological", necessitating further investigations, or concluding an iniquitous diagnosis. Even more seriously, "false-negative" results may wrongly exclude pathology, thus denying patients necessary treatment. The analytical error rate in immunoassays is relatively high, ranging from 0.4% to 4.0%. Because analytical interference from endogenous antibodies is confined to individuals' sera, it can be inconspicuous, pernicious, sporadic, and insidious, because it cannot be detected by internal or external quality assessment procedures. An approach based on Bayesian reasoning can enhance the robustness of clinical validation in highlighting potentially erroneous immunoassay results. When this rational clinical/statistical approach is followed by analytical affirmative follow-up tests, it can help identify inaccurate and clinically misleading immunoassay data even when they appear plausible and "not-unreasonable." This chapter is largely based on peer-reviewed articles associated with and related to this approach. The first section underlines (without mathematical equations) the dominance and misuse of conventional statistics and the underuse of the Bayesian paradigm, and shows that laboratorians are intuitively (albeit unwittingly) practicing Bayesians. Secondly, because interference from endogenous antibodies is method-dependent (with numerous formats and different reagents), it is almost impossible to accurately assess its incidence in all differently formulated immunoassays and for each analyte/biomarker. However, reiterating the basic concepts…
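
    The Bayesian reasoning advocated above amounts, in its simplest form, to a predictive-value calculation: given a pre-test probability and the assay's error characteristics, how likely is an abnormal result to be true? (Illustrative numbers only.)

```python
def post_test_probability(pre_test, sensitivity, false_positive_rate):
    """Bayes' rule: P(disease | abnormal result)."""
    p_abnormal = sensitivity * pre_test + false_positive_rate * (1 - pre_test)
    return sensitivity * pre_test / p_abnormal

# A sensitive assay with a 2% rate of analytically false abnormal results
for pre in (0.001, 0.05, 0.3):
    post = post_test_probability(pre, sensitivity=0.98, false_positive_rate=0.02)
    print(f"pre-test {pre:>5.1%} -> post-test {post:.1%}")
```

    With a clinically implausible presentation (pre-test probability near zero), even a good assay's abnormal result is most likely interference, which is the rationale for affirmative follow-up testing.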

  6. A plausible (overlooked) super-luminous supernova in the Sloan digital sky survey stripe 82 data

    Energy Technology Data Exchange (ETDEWEB)

    Kostrzewa-Rutkowska, Zuzanna; Kozłowski, Szymon; Wyrzykowski, Łukasz [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Djorgovski, S. George; Mahabal, Ashish A. [California Institute of Technology, 1200 E California Blvd., Pasadena, CA 91125 (United States); Glikman, Eilat [Department of Physics and Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520-8121 (United States); Koposov, Sergey, E-mail: zkostrzewa@astrouw.edu.pl, E-mail: simkoz@astrouw.edu.pl, E-mail: wyrzykow@astrouw.edu.pl [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom)

    2013-12-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova (SN) peaked at m_g < 19.4 mag in the second half of 2005 September, but was missed by the real-time SN hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline time lasted at least 70 days (observed frame), closely resembling other SLSNe of SN 2007bi type. The spectrum of the host galaxy reveals a redshift of z = 0.281 and a distance modulus of μ = 40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with an absolute magnitude of M_B = -18.2 ± 0.2 mag and an oxygen abundance of 12 + log[O/H] = 8.3 ± 0.2; hence, the SN peaked at M_g < -21.3 mag. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence for SLSNe to explode in low-luminosity (dwarf) galaxies only. The available information on the PSN 000123+000504 light curve suggests the magnetar-powered model as a likely scenario of this event. This SLSN is a new addition to a quickly growing family of super-luminous SNe.
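
    The quoted peak absolute magnitude follows directly from the apparent peak magnitude and the host's distance modulus (K-corrections aside):

```latex
M_g = m_g - \mu < 19.4\,\mathrm{mag} - 40.77\,\mathrm{mag} = -21.37\,\mathrm{mag},
```

    consistent with the reported bound M_g < -21.3 mag.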

  7. Bio-physically plausible visualization of highly scattering fluorescent neocortical models for in silico experimentation

    KAUST Repository

    Abdellah, Marwan

    2017-02-15

    Background We present a visualization pipeline capable of accurate rendering of highly scattering fluorescent neocortical neuronal models. The pipeline is mainly developed to serve the computational neurobiology community. It allows scientists to visualize the results of their virtual experiments that are performed in computer simulations, or in silico. The presented pipeline opens novel avenues for assisting neuroscientists in building biologically accurate models of the brain. These models result from computer simulations of physical experiments that use fluorescence imaging to understand the structural and functional aspects of the brain. Due to the limited capabilities of current visualization workflows to handle fluorescent volumetric datasets, we propose a physically-based optical model that can accurately simulate light interaction with fluorescent-tagged scattering media based on the basic principles of geometric optics and Monte Carlo path tracing. We also develop an automated and efficient framework for generating dense fluorescent tissue blocks from a neocortical column model that is composed of approximately 31000 neurons. Results Our pipeline is used to visualize a virtual fluorescent tissue block of 50 μm³ that is reconstructed from the somatosensory cortex of a juvenile rat. The fluorescence optical model is qualitatively analyzed and validated against experimental emission spectra of different fluorescent dyes from the Alexa Fluor family. Conclusion We discussed a scientific visualization pipeline for creating images of synthetic neocortical neuronal models that are tagged virtually with fluorescent labels on a physically-plausible basis. The pipeline is applied to analyze and validate simulation data generated from neuroscientific in silico experiments.
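
    At the heart of the Monte Carlo path tracing named above is sampling exponential free-flight distances between interactions in the scattering medium; a deliberately reduced sketch (1-D, scattering does not redirect the photon here, coefficients invented):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_s, sigma_a = 8.0, 2.0        # scattering / absorption coefficients [1/mm]
sigma_t = sigma_s + sigma_a        # extinction coefficient
albedo = sigma_s / sigma_t         # probability an interaction is a scatter

def absorbed_fraction(n_photons=20_000, slab=1.0):
    absorbed = 0
    for _ in range(n_photons):
        depth = 0.0
        while True:
            depth += -np.log(rng.random()) / sigma_t  # exponential free path
            if depth > slab:
                break                                  # photon escaped the slab
            if rng.random() > albedo:
                absorbed += 1                          # interaction absorbed it
                break
    return absorbed / n_photons

print(f"absorbed fraction: {absorbed_fraction():.3f}")
```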

  8. Dynamics

    CERN Document Server

    Goodman, Lawrence E

    2001-01-01

    Beginning text presents complete theoretical treatment of mechanical model systems and deals with technological applications. Topics include introduction to calculus of vectors, particle motion, dynamics of particle systems and plane rigid bodies, technical applications in plane motions, theory of mechanical vibrations, and more. Exercises and answers appear in each chapter.

  9. What lies beneath: underlying assumptions in bioimage analysis.

    Science.gov (United States)

    Pridmore, Tony P; French, Andrew P; Pound, Michael P

    2012-12-01

    The need for plant image analysis tools is established and has led to a steadily expanding literature and set of software tools. This is encouraging, but raises a question: how does a plant scientist with no detailed knowledge or experience of image analysis methods choose the right tool(s) for the task at hand, or satisfy themselves that a suggested approach is appropriate? We believe that too great an emphasis is currently being placed on low-level mechanisms and software environments. In this opinion article we propose that a renewed focus on the core theories and algorithms used, and in particular the assumptions upon which they rely, will better equip plant scientists to evaluate the available resources. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    In the scientific literature, open innovation is presented as one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customers' knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customers' knowledge management, should also be examined. This article presents a theoretical model which reveals the assumptions of the open innovation process and their impact on the firm's performance.

  11. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, Rick [ICF International, Fairfax, VA (United States); Bluestein, Joel [ICF International, Fairfax, VA (United States); Rodriguez, Nick [ICF International, Fairfax, VA (United States); Knoke, Stu [ICF International, Fairfax, VA (United States)

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
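
    For orientation, a minimal sketch of the standard LCOE calculation referred to above (parameter names and the example values are illustrative assumptions, not entries from the six data sets):

        def lcoe(capex, fixed_om, var_om_per_mwh, fuel_per_mwh, capacity_mw,
                 capacity_factor, lifetime_yr, discount_rate):
            """Levelized cost of energy in $/MWh: the ratio of discounted
            lifetime costs to discounted lifetime generation."""
            annual_mwh = capacity_mw * capacity_factor * 8760.0
            annual_cost = fixed_om + (var_om_per_mwh + fuel_per_mwh) * annual_mwh
            disc = [(1.0 + discount_rate) ** -t for t in range(1, lifetime_yr + 1)]
            total_cost = capex + sum(annual_cost * d for d in disc)
            total_mwh = sum(annual_mwh * d for d in disc)
            return total_cost / total_mwh

        # e.g., a hypothetical 500 MW plant at 85% capacity factor:
        # lcoe(1.5e9, 3.0e7, 4.0, 25.0, 500.0, 0.85, 30, 0.07) -> roughly $70/MWh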

  12. Decision-Theoretic Planning: Structural Assumptions and Computational Leverage

    CERN Document Server

    Boutilier, C; Hanks, S; 10.1613/jair.575

    2011-01-01

    Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives adopted in these areas often differ in substantial ways, many planning problems of interest to researchers in these fields can be modeled as Markov decision processes (MDPs) and analyzed using the techniques of decision theory. This paper presents an overview and synthesis of MDP-related methods, showing how they provide a unifying framework for modeling many classes of planning problems studied in AI. It also describes structural properties of MDPs that, when exhibited by particular classes of problems, can be exploited in the construction of optimal or approximately optimal policies or plans. Planning problems commonly possess structure in the reward and value functions used to describe performance criteria, in the...
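
    As a concrete illustration of the MDP machinery this survey covers, here is a generic value-iteration sketch (a textbook algorithm, not code from the paper):

        import numpy as np

        def value_iteration(P, R, gamma=0.95, tol=1e-8):
            """Optimal value function and greedy policy for a finite MDP.

            P: transitions, shape (A, S, S), P[a, s, s'] = Pr(s' | s, a)
            R: rewards, shape (A, S)
            """
            A, S, _ = P.shape
            V = np.zeros(S)
            while True:
                Q = R + gamma * np.einsum("ast,t->as", P, V)  # action values
                V_new = Q.max(axis=0)
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new, Q.argmax(axis=0)
                V = V_new

    The structure in the reward and value functions that the abstract mentions is exactly what lets such sweeps be compacted or restricted in large state spaces.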

  13. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.

  14. Validating modelling assumptions of alpha particles in electrostatic turbulence

    CERN Document Server

    Wilkie, George; Highcock, Edmund; Dorland, William

    2014-01-01

    To rigorously model fast ions in fusion plasmas, a non-Maxwellian equilibrium distribution must be used. In this work, the response of high-energy alpha particles to electrostatic turbulence has been analyzed for several different tokamak parameters. Our results are consistent with known scalings and with experimental evidence that alpha particles are generally well confined, with confinement times on the order of several seconds. It is also confirmed that the effect of alphas on the turbulence is negligible at realistically low concentrations, consistent with linear theory. It is demonstrated that the usual practice of using a high-temperature Maxwellian gives incorrect estimates of the radial alpha particle flux, and a method of correcting it is provided. Furthermore, we see that the timescales associated with collisions and transport compete at moderate energies, calling into question the assumption, used in the derivation of the slowing-down distribution, that alpha particles remain confined to a flux surface.
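
    For context, the slowing-down distribution questioned in the last sentence is the standard steady state for alphas born at speed v_α and decelerated by collisional drag (textbook background, not a result of the paper):

        f(v) ∝ 1 / (v³ + v_c³),  for v ≤ v_α,

    where v_c is the critical speed at which drag from ions and electrons is comparable. The paper's point is that when transport competes with collisions at moderate energies, the flux-surface confinement assumed in this derivation breaks down.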

  15. Uncovering Metaethical Assumptions in Bioethical Discourse across Cultures.

    Science.gov (United States)

    Sullivan, Laura Specker

    2016-03-01

    Much of bioethical discourse now takes place across cultures. This does not mean that cross-cultural understanding has increased. Many cross-cultural bioethical discussions are marked by entrenched disagreement about whether and why local practices are justified. In this paper, I argue that a major reason for these entrenched disagreements is that problematic metaethical commitments are hidden in these cross-cultural discourses. Using the issue of informed consent in East Asia as an example of one such discourse, I analyze two representative positions in the discussion and identify their metaethical commitments. I suggest that the metaethical assumptions of these positions result from their shared method of ethical justification: moral principlism. I then show why moral principlism is problematic in cross-cultural analyses and propose a more useful method for pursuing ethical justification across cultures.

  16. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    Objective: It has been claimed that trauma shatters world assumptions (WAs) and that the co-occurrence of high posttraumatic growth (PTG) and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD) endorsed negative WAs and a higher magnitude of PTG and dissociation, compared to both ex-POWs without PTSD and controls. WAs were negatively correlated with dissociation and positively correlated with PTG. PTG was positively correlated with dissociation. Moreover, dissociation fully mediated...

  17. Linear irreversible heat engines based on local equilibrium assumptions

    Science.gov (United States)

    Izumida, Yuki; Okuda, Koji

    2015-08-01

    We formulate an endoreversible finite-time Carnot cycle model based on the assumptions of local equilibrium and constant energy flux, where the efficiency and the power are expressed in terms of the thermodynamic variables of the working substance. By analyzing the entropy production rate caused by the heat transfer in each isothermal process during the cycle, and using the endoreversible condition applied to the linear response regime, we identify the thermodynamic flux and force of the present system and obtain a linear relation that connects them. We calculate the efficiency at maximum power in the linear response regime by using the linear relation, which agrees with the Curzon-Ahlborn (CA) efficiency known as the upper bound in this regime. The reason for this agreement is also elucidated by rewriting our model in the form of the Onsager relations, where our model turns out to satisfy the tight-coupling condition leading to the CA efficiency.
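
    For reference, the Curzon-Ahlborn efficiency recovered above is the standard result

        η_CA = 1 − √(T_c/T_h) = η_C/2 + η_C²/8 + O(η_C³),

    where η_C = 1 − T_c/T_h is the Carnot efficiency; the leading η_C/2 term is the universal linear-response value that the Onsager-form rewriting of the model reproduces under the tight-coupling condition.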

  18. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    CERN Document Server

    Côté, Benoit; Ritter, Christian; Herwig, Falk; Venn, Kim A

    2016-01-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of Type Ia supernovae and the strength of gal...

  19. New media in strategy – mapping assumptions in the field

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Plesner, Ursula; Raviola, Elena

    2017-01-01

    There is plenty of empirical evidence for claiming that new media make a difference for how strategy is conceived and executed. Furthermore, there is a rapidly growing body of literature that engages with this theme and offers recommendations regarding the appropriate strategic actions in relation to new media. By contrast, there is relatively little attention to the assumptions behind strategic thinking in relation to new media. This article reviews the most influential strategy journals, asking how new media are conceptualized. It is shown that strategy scholars have a tendency to place new media in relation to the outside or the inside of the organization. After discussing the literature according to these dimensions (deterministic/voluntaristic and internal/external), the article argues for a sociomaterial approach to strategy and strategy making and for using the concept of affordances...

  20. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  1. Statistical Tests of the PTHA Poisson Assumption for Submarine Landslides

    Science.gov (United States)

    Geist, E. L.; Chaytor, J. D.; Parsons, T.; Ten Brink, U. S.

    2012-12-01

    We demonstrate that a sequence of dated mass transport deposits (MTDs) can provide information to statistically test whether or not submarine landslides associated with these deposits conform to a Poisson model of occurrence. Probabilistic tsunami hazard analysis (PTHA) most often assumes Poissonian occurrence for all sources, with an exponential distribution of return times. Using dates that define the bounds of individual MTDs, we first describe likelihood and Monte Carlo methods of parameter estimation for a suite of candidate occurrence models (Poisson, lognormal, gamma, Brownian Passage Time). In addition to age-dating uncertainty, both methods incorporate uncertainty caused by the open time intervals: i.e., before the first and after the last event to the present. Accounting for these open intervals is critical when there are a small number of observed events. The optimal occurrence model is selected according to both the Akaike Information Criterion (AIC) and Akaike's Bayesian Information Criterion (ABIC). In addition, the likelihood ratio test can be performed on occurrence models from the same family: e.g., the gamma model relative to the exponential model of return time distribution. Parameter estimation, model selection, and hypothesis testing are performed on data from two IODP holes in the northern Gulf of Mexico that penetrated a total of 14 MTDs, some of which are correlated between the two holes. Each of these events has been assigned an age based on microfossil zonations and magnetostratigraphic datums. Results from these sites indicate that the Poisson assumption is likely valid. However, parameter estimation results using the likelihood method for one of the sites suggest that the events may have occurred quasi-periodically. Methods developed in this study provide tools with which one can determine both the rate of occurrence and the statistical validity of the Poisson assumption when submarine landslides are included in PTHA.
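
    A minimal sketch of the kind of model fitting and AIC comparison described (a generic scipy illustration under simplifying assumptions; the authors' likelihood additionally handles age-dating uncertainty and the open intervals, which this omits):

        import numpy as np
        from scipy import stats

        def compare_return_time_models(intervals):
            """Fit exponential (Poisson occurrence) and gamma models to
            inter-event times and compare them by AIC (lower is better)."""
            x = np.asarray(intervals, dtype=float)
            # Exponential: one free parameter (the rate), location fixed at 0.
            _, scale_e = stats.expon.fit(x, floc=0.0)
            aic_exp = 2 * 1 - 2 * stats.expon.logpdf(x, 0.0, scale_e).sum()
            # Gamma: shape != 1 signals departure from Poisson occurrence
            # (shape > 1 suggests quasi-periodic recurrence).
            a, _, scale_g = stats.gamma.fit(x, floc=0.0)
            aic_gam = 2 * 2 - 2 * stats.gamma.logpdf(x, a, 0.0, scale_g).sum()
            return {"exponential": aic_exp, "gamma": aic_gam, "gamma_shape": a}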

  2. Evaluating the reliability of equilibrium dissolution assumption from residual gasoline in contact with water saturated sands

    Science.gov (United States)

    Lekmine, Greg; Sookhak Lari, Kaveh; Johnston, Colin D.; Bastow, Trevor P.; Rayner, John L.; Davis, Greg B.

    2017-01-01

    Understanding the dissolution dynamics of hazardous compounds from complex gasoline mixtures is key to long-term predictions of groundwater risks. The aim of this study was to investigate whether the local equilibrium assumption for the dissolution of BTEX and TMBs (trimethylbenzenes) is valid under variable saturation in two-dimensional flow conditions, and to evaluate the impact of local heterogeneities when equilibrium is verified at the scale of investigation. An initial residual gasoline saturation was established over the upper two-thirds of a water-saturated sand pack. A constant horizontal pore velocity was maintained and water samples were recovered across 38 sampling ports over 141 days. Inside the residual NAPL zone, the BTEX and TMB dissolution curves were in agreement with the TMVOC model based on the local equilibrium assumption. Results compared to previous numerical studies suggest the presence of small-scale dissolution fingering created perpendicular to the horizontal dissolution front, mainly triggered by heterogeneities in the medium structure and the local NAPL residual saturation. In the transition zone, TMVOC was able to represent a range of behaviours exhibited by the data, confirming equilibrium or near-equilibrium dissolution at the scale of investigation. The model locally showed discrepancies for the most soluble compounds, i.e. benzene and toluene, due to local heterogeneities, indicating that at smaller scales flow bypassing and channelling may have occurred. Even so, mass transfer rates were still high enough to fall under the equilibrium assumption in TMVOC at the scale of investigation. Comparisons with other models involving upscaled mass transfer rates demonstrated that such approximations could lead to overestimating BTEX dissolution rates and underestimating the total remediation time relative to TMVOC.
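
    For background, the equilibrium concentrations against which such experiments are judged follow the standard Raoult's-law analogy for NAPL mixtures (general partitioning theory, not a result of this study):

        C_i^eq = x_i · γ_i · S_i,

    where x_i is the compound's mole fraction in the gasoline, S_i its single-compound aqueous solubility, and γ_i an activity coefficient (near 1 for nearly ideal mixtures); the local equilibrium assumption amounts to pore water leaving the NAPL zone at C_i^eq.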

  3. Investigating Darcy-scale assumptions by means of a multiphysics algorithm

    Science.gov (United States)

    Tomin, Pavel; Lunati, Ivan

    2016-09-01

    Multiphysics (or hybrid) algorithms, which couple Darcy and pore-scale descriptions of flow through porous media in a single numerical framework, are usually employed to decrease the computational cost of full pore-scale simulations or to increase the accuracy of pure Darcy-scale simulations when a simple macroscopic description breaks down. Despite the massive increase in available computational power, the application of these techniques remains limited to core-size problems and upscaling remains crucial for practical large-scale applications. In this context, the Hybrid Multiscale Finite Volume (HMsFV) method, which constructs the macroscopic (Darcy-scale) problem directly by numerical averaging of pore-scale flow, offers not only a flexible framework to efficiently deal with multiphysics problems, but also a tool to investigate the assumptions used to derive macroscopic models and to better understand the relationship between pore-scale quantities and the corresponding macroscale variables. Indeed, by direct comparison of the multiphysics solution with a reference pore-scale simulation, we can assess the validity of the closure assumptions inherent to the multiphysics algorithm and infer the consequences for macroscopic models at the Darcy scale. We show that the definition of the scale ratio based on the geometric properties of the porous medium is well justified only for single-phase flow, whereas in case of unstable multiphase flow the nonlinear interplay between different forces creates complex fluid patterns characterized by new spatial scales, which emerge dynamically and weaken the scale-separation assumption. In general, the multiphysics solution proves very robust even when the characteristic size of the fluid-distribution patterns is comparable with the observation length, provided that all relevant physical processes affecting the fluid distribution are considered. This suggests that macroscopic constitutive relationships (e.g., the relative

  4. Plausible antioxidant biomechanics and anticonvulsant pharmacological activity of brain-targeted β-carotene nanoparticles.

    Science.gov (United States)

    Yusuf, Mohammad; Khan, Riaz A; Khan, Maria; Ahmed, Bahar

    2012-01-01

    increased in P-80-BCNP to 231.0 ± 16.30 seconds, as compared to PTZ (120.10 ± 4.50 seconds) and placebo control (120.30 ± 7.4 seconds). The results of this study demonstrate a plausible novel anticonvulsant activity of β-carotene at a low dose of 2 mg/kg, with brain-targeted nanodelivery, thus increasing its bioavailability and stability.

  5. Plausibility check of a redesigned rain-on-snow simulator (RASA)

    Science.gov (United States)

    Rössler, Ole; Probst, Sabine; Weingartner, Rolf

    2016-04-01

    Rain-on-snow events are fascinating but still not completely understood processes. Although several studies and equations describing past events and theoretical frameworks have been published over the decades, empirical data on what happens inside the snow cover are far less available. Rain-on-snow simulators can help to fill this gap. In 2013, Juras et al. published their inspiring idea of a portable rain-on-snow simulator. The huge advantage of this device - in contrast to other purely field-based experiments - is its fixed, mostly standardized conditions and the possibility to measure all the data required to monitor water fluxes and melting processes at once. Mounted in a convenient location, a large number of experiments is relatively easy to conduct. We applied and further developed the original device and checked the plausibility of the results of this redesigned version, called RASA. The principal design was borrowed from the original version: a frame with a sprinkler on top and a snow sample in a box at the bottom, from which the outflow is measured with a tipping gauge. We added a moving sprinkling plate to ensure a uniform distribution of raindrops on the snow, and - most importantly - we suspended the watered snow sample on weighing cells. The latter enables continuous weighing of the snow sample throughout the experiment and thus the indirect quantification of liquid water saturation, water holding capacity, and snowmelt amount via balance equations. As it remains unclear whether this device is capable of reproducing known processes, a hypothesis-based plausibility check was carried out. Eight hypotheses were derived from the literature and tested in 28 experiments with the RASA mounted at 2000 m elevation. In general, we were able to reproduce most of the hypotheses. The RASA proved to be a very valuable device that generates suitable results and has the potential to extend the empirical-experimental data

  6. Exploring apposite therapeutic target for apoptosis in filarial parasite: a plausible hypothesis.

    Science.gov (United States)

    Hande, Sneha; Goswami, Kalyan; Jena, Lingaraj; Reddy, Maryada Venkata Rami

    2014-03-01

    Human lymphatic filariasis is a parasitic disease with a profound socioeconomic encumbrance owing to its associated disability, affecting predominantly, but not limited to, the developing nations of the tropics and subtropics. Several technical issues, like the poor therapeutic and preventive repertoire, as well as administrative and infrastructural limitations, jeopardize salvage measures and further complicate the plight. Therefore, considering the gravity of the problem, WHO has mandated (under its tropical disease research scheme) placing emphasis on the validation of novel therapeutic targets against this disease, which carries the unfortunate tag of 'neglected tropical disease'. However, a dearth of knowledge of parasite biology, viciously coupled with the difficulty of access to parasitic material from suitable animal models and the growing cost burden of high-end research, poses a formidable challenge. Based on recent research evidence, we propose here a premise of targeted apoptotic impact as a novel rationale to be exploited in anti-parasitic drug development. The new era of bioinformatics ushers in new optimism, with a wide range of genomic and proteomic databases in the public domain. Such platforms might offer wonders for drug research, but require highly selective criterion specificity. In order to test our hypothesis presumptively, we deployed a scheme for identifying target proteins of filarial parasitic origin through a wide database search, with the precise criteria of non-homology to the host and functional essentiality for the parasite. Further screening of this list of essential non-homologous proteins for proteins with growth potential was undertaken to mine out a suitable representative target for an ensuing apoptotic impact through effective inhibitors. A unique protein enzyme, RNA-dependent RNA polymerase, which besides its vital role in RNA viruses is believed to have a regulatory role in gene expression, emerged as a plausible target. This protein

  7. Lead-induced SCC of alloy 600 in plausible steam generator crevice environments

    Energy Technology Data Exchange (ETDEWEB)

    Wright, M.D. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Manolescu, A. [Ontario Hydro Technologies, Toronto, Ontario (Canada); Mirzai, M. [Ontario Hydro, Toronto, Ontario (Canada)

    1998-07-01

    Laboratory stress corrosion cracking (SCC) test environments developed to simulate representative BNGS-A steam generator (SG) crevice chemistries have been used to determine the susceptibility of Alloy 600 to lead-induced SCC under plausible SG conditions. Test environments were based on plant SG hideout return data and analysis of removed tubes and deposits. Deviations from the normal near neutral crevice pH environment were considered to simulate possible faulted excursion crevice chemistry and to bound the postulated crevice pH range of 3-9 (at temperature). The effect of lead contamination up to 1000 ppm, but with an emphasis on the 100 to 500 ppm range, was determined. SCC susceptibility was investigated using constant extension rate tensile (CERT) tests and encapsulated C-ring tests. CERT tests were performed at 305 degrees C on tubing representative of BNGS-A SG U-bends. The C-ring test method allowed a wider test matrix covering three temperatures (280, 304 and 315 degrees C), three strain levels (0.2%, 2% and 4%) and tubing representative of U-bends plus tubing given a simulated stress relief to represent material at the tubesheet. The results of this test program confirmed that in the absence of lead contamination, cracking does not occur in these concentrated, 3.3 to 8.9 pH range, crevice environments. Also, it appears that the concentrated crevice environments suppress lead-induced cracking relative to that seen in all-volatile-treatment (AVT) water. For the (static) C-ring tests, lead-induced SCC was only produced in the near-neutral crevice environment and was more severe at 500 ppm than 100 ppm PbO. This trend was also observed in CERT tests but some cracking/grain boundary attack occurred in acidic (pH 3.3) and alkaline (pH 8.9) environments. The C-ring tests indicated that a certain amount of resistance to cracking was imparted by simulated stress relief of the tubing. This heat treatment, confirmed to have resulted in sensitization, promoted

  8. Lead-induced stress-corrosion cracking of alloy 600 in plausible steam generator crevice environments

    Energy Technology Data Exchange (ETDEWEB)

    Wright, M.D. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Manolescu, A. [Ontario Hydro Technologies, Toronto, Ontario (Canada); Mirzai, M. [Ontario Hydro, Toronto, Ontario (Canada)

    1999-03-01

    Laboratory stress-corrosion cracking (SCC) test environments were developed to simulate crevice chemistries representative of Bruce Nuclear Generating Station A (BNPD A) steam generators (SGs); these test environments were used to determine the susceptibility of Alloy 600 to lead-induced SCC under plausible SG conditions. Test environments were based on plant SG hideout return data and analysis of removed tubes and deposits. Deviations from the normal near-neutral crevice pH environment were considered to simulate possible faulted excursion crevice chemistry and to bound the postulated crevice pH range of 3 to 9 (at temperature). The effect of lead contamination up to 1000 ppm, but with an emphasis on the 100- to 500-ppm range, was determined. SCC susceptibility was investigated using constant extension rate tensile (CERT) tests and encapsulated C-ring tests. CERT tests were performed at 305 degrees C on tubing representative of BNPD A SG U-bends. The C-ring test method allowed a wider test matrix, covering 3 temperatures (280 degrees C, 304 degrees C and 315 degrees C), 3 strain levels (0.2%, 2% and 4%), and tubing representative of U-bends plus tubing given a simulated stress relief to represent material at the tube sheet. The results of this test program confirmed that in the absence of lead contamination, cracking does not occur in these concentrated, 3.3 to 8.9 pH range, crevice environments. Also, it appears that the concentrated crevice environments suppress lead-induced cracking relative to that seen in all-volatile-treatment (AVT) water. For the (static) C-ring tests, lead-induced SCC was only produced in the near-neutral crevice environment and was more severe at 500 ppm than at 100 ppm PbO. This trend was also observed in CERT tests, but some cracking-grain boundary attack occurred in acidic (pH 3.3) and alkaline (pH 8.9) environments. The C-ring tests indicated that a certain amount of resistance to cracking was imparted by simulated stress relief of

  9. Future coal production outlooks in the IPCC Emission Scenarios: Are they plausible?

    Energy Technology Data Exchange (ETDEWEB)

    Hoeoek, Mikael

    2010-10-15

    Anthropogenic climate change caused by CO2 emissions is strongly and fundamentally linked to future energy production. The Special Report on Emission Scenarios (SRES) from 2000 contains 40 scenarios for future fossil fuel production and is used by the IPCC to assess future climate change. Coal, with its 26% share of world energy, is a major source of greenhouse gas emissions and commonly seen as a key contributor to anthropogenic climate change. SRES contains a wide array of different coal production outlooks, ranging from a complete coal phase-out by 2100 to a roughly tenfold increase from present world production levels. Scenarios with high levels of global warming also have high expectations of future fossil fuel production. The assumptions on resource availability in SRES are based on Rogner's assessment of world hydrocarbon resources from 1997, where it is stated that 'the sheer size of the fossil resource base makes fossil sources an energy supply option for many centuries to come'. Future coal production is simply assumed to depend on economics, accessibility, and environmental acceptance. It is also generally assumed that coal is abundant and will thus take a dominating part in the future energy system. Depletion, geographical location and geological parameters are not given much influence in the scenario storylines. This study quantifies what the coal production projections in SRES would imply in reality. SRES is riddled with production projections that would put unreasonable expectations on just a few countries or regions. Is it reasonable to expect that China, one of the world's largest coal producers and holders of reserves and resources, would increase its production by a factor of 8 over the next 90 years, as implied by certain scenarios? Can massive increases in global coal output really be justified from historical trends, or will reality rule out some production outlooks as implausible? The

  10. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields. OMEGA

  11. Impact of velocity distribution assumption on simplified laser speckle imaging equation

    Science.gov (United States)

    Ramirez-San-Juan, Julio C; Ramos-Garcia, Ruben; Guizar-Iturbide, Ileana; Martinez-Niconoff, Gabriel; Choi, Bernard

    2012-01-01

    Since blood flow is tightly coupled to the health status of biological tissue, several instruments have been developed to monitor blood flow and perfusion dynamics. One such instrument is laser speckle imaging. The goal of this study was to evaluate the use of two velocity distribution assumptions (Lorentzian- and Gaussian-based) to calculate speckle flow index (SFI) values. When the normalized autocorrelation functions for the Lorentzian and Gaussian velocity distributions satisfy the same definition of correlation time, the two assumptions predict the same velocity range at low speckle contrast (0 < C < 0.6) but different flow velocity ranges at high contrast. Our derived equations form the basis for simplified calculations of SFI values. PMID:18542407
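
    For orientation, one commonly used Lorentzian-based relation in this literature (quoted as standard background; the paper's own simplified equations are not reproduced here) links the speckle contrast K measured over exposure time T to the correlation time τ_c:

        K² = (τ_c / 2T) · [1 − exp(−2T/τ_c)],

    with the speckle flow index then taken proportional to 1/τ_c, which scales with velocity.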

  12. Meso-scale modeling: beyond local equilibrium assumption for multiphase flow

    CERN Document Server

    Wang, Wei

    2015-01-01

    This is a summary of the article with the same title, accepted for publication in Advances in Chemical Engineering, 47: 193-277 (2015). Gas-solid fluidization is a typical nonlinear nonequilibrium system with multiscale structure. In particular, the mesoscale structure in terms of bubbles or clusters, which can be characterized by nonequilibrium features such as bimodal velocity distributions, energy non-equipartition, and correlated density fluctuations, is the critical factor. Traditional two-fluid models (TFM) and the relevant closures depend on local equilibrium and homogeneous distribution assumptions, and fail to predict the dynamic, nonequilibrium phenomena in circulating fluidized beds even with fine-grid resolution. In contrast, mesoscale modeling, as exemplified by the energy-minimization multiscale (EMMS) model, is consistent with the nonequilibrium features of multiphase flows. Thus, the structure-dependent multi-fluid model conservation equations with EMMS-based mesoscale modeling greatly i...

  13. Validity of the assumption of Gaussian turbulence; Gyldighed af antagelsen om Gaussisk turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Hansen, K.S.; Juul Pedersen, B.

    2000-07-01

    Wind turbines are designed to withstand the impact of turbulent winds, whose fluctuations are usually assumed to follow a Gaussian probability distribution. Based on a large number of measurements from many sites, this seems a reasonable assumption in flat homogeneous terrain, whereas it may fail in complex terrain. At such sites the wind speed often has a skewed distribution with more frequent lulls than gusts. In order to simulate aerodynamic loads, a numerical turbulence simulation method was developed and implemented. This method can simulate multiple time series with a variable, not necessarily Gaussian, distribution without distorting the spectral distribution or spatial coherence. The simulated time series were used as input to the dynamic-response simulation program Vestas Turbine Simulator (VTS). In this way we simulated the dynamic response of systems exposed to turbulence of either Gaussian or extreme, yet realistic, non-Gaussian probability distribution. Certain loads on turbines with active pitch regulation were enhanced by up to 15% compared to pure Gaussian turbulence. It should, however, be noted that the undesired effect depends on the dynamic system, and it might be mitigated by tuning the wind turbine regulation system to local turbulence characteristics. (au)
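
    The summary does not spell out the simulation algorithm; a common way to obtain such series (a sketch of the generic "translation process" approach, with an assumed skew-normal target marginal, not necessarily the report's exact method) is to generate a Gaussian series and map its marginal through the inverse CDF of a skewed distribution:

        import numpy as np
        from scipy import stats

        def skewed_series(n, mean, std, skew_a=-4.0, seed=0):
            """Gaussian-based series mapped to a skewed marginal.

            skew_a < 0 gives more frequent lulls than gusts, mimicking the
            complex-terrain behaviour described above. Spectral shaping of g
            would be applied before the mapping; the nonlinear map slightly
            distorts the spectrum, which production methods iterate to fix.
            """
            rng = np.random.default_rng(seed)
            g = rng.standard_normal(n)
            u = stats.norm.cdf(g)                  # uniform marginal
            x = stats.skewnorm.ppf(u, a=skew_a)    # skewed marginal
            x = (x - x.mean()) / x.std()           # re-standardize
            return mean + std * x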

  14. Decision-theoretic saliency: computational principles, biological plausibility, and implications for neurophysiology and psychophysics.

    Science.gov (United States)

    Gao, Dashan; Vasconcelos, Nuno

    2009-01-01

    A decision-theoretic formulation of visual saliency, first proposed for top-down processing (object recognition) (Gao & Vasconcelos, 2005a), is extended to the problem of bottom-up saliency. Under this formulation, optimality is defined in the minimum probability of error sense, under a constraint of computational parsimony. The saliency of the visual features at a given location of the visual field is defined as the power of those features to discriminate between the stimulus at the location and a null hypothesis. For bottom-up saliency, this is the set of visual features that surround the location under consideration. Discrimination is defined in an information-theoretic sense, and the optimal saliency detector is derived for a class of stimuli that complies with known statistical properties of natural images. It is shown that under the assumption that saliency is driven by linear filtering, the optimal detector consists of what is usually referred to as the standard architecture of V1: a cascade of linear filtering, divisive normalization, rectification, and spatial pooling. The optimal detector is also shown to replicate the fundamental properties of the psychophysics of saliency: stimulus pop-out, saliency asymmetries for stimulus presence versus absence, disregard of feature conjunctions, and Weber's law. Finally, it is shown that the optimal saliency architecture can be applied to the solution of generic inference problems. In particular, for the class of stimuli studied, it performs the three fundamental operations of statistical inference: assessment of probabilities, implementation of Bayes decision rule, and feature selection.
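
    In this formulation (standard discriminant-saliency notation, added for orientation), the saliency of a location is the mutual information between the feature responses X and a class label C that distinguishes center from surround:

        S(l) = I(X; C) = Σ_c ∫ p(x, c) log [ p(x, c) / (p(x) p(c)) ] dx,

    so the most salient locations are those whose features best discriminate the stimulus from the null hypothesis.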

  15. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Let $\Omega$ be a sufficiently smooth bounded open set in $\mathbb{R}^2$ and let $f_k=(u_k,v_k)$ and $f=(u,v)$ be mappings in the Sobolev space $W^{1,2}(\Omega,\mathbb{R}^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\mu$ in the sense of measures, and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $u_k \rightharpoonup u$ weakly in $W^{1,2}(\Omega)$ and $v_k \rightharpoonup v$ weakly in $W^{1,q}(\Omega)$ for some $q\in(1,2)$, then $d\mu = J_f\,dz$. Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that the identity remains valid also in the case $q=1$, but it is then necessary to require that $u_k$ converges weakly to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\Omega)$, precisely $u_k \rightharpoonup u$ weakly in $W^{1,L^2 \log^\alpha L}(\Omega)$ for some $\alpha > 1$.

  16. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  17. In vitro versus in vivo culture sensitivities: an unchecked assumption?

    Directory of Open Access Journals (Sweden)

    Prasad V

    2013-03-01

    No abstract available. Article truncated at 150 words. Case Presentation: A patient presents to urgent care with the symptoms of a urinary tract infection (UTI). The urinalysis is consistent with infection, and the urine culture is sent to the lab. In the interim, a physician prescribes empiric treatment and sends the patient home. Two days later, the culture is positive for E. coli, resistant to the drug prescribed (ciprofloxacin; minimum inhibitory concentration (MIC) 64 μg/ml), but attempts to contact the patient (by telephone) are not successful. The patient returns the call two weeks later to say that the infection resolved without sequelae. Discussion: Many clinicians have had the experience of treatment success in the setting of known antibiotic resistance and, conversely, treatment failure in the setting of known sensitivity. Such anomalies, and the empirical research described here, force us to revisit assumptions about the relationship between in vivo and in vitro drug responses. When it comes to the utility of microbiology…

  18. Finite Element Simulations to Explore Assumptions in Kolsky Bar Experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Crum, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-05

    The chief purpose of this project has been to develop a set of finite element models that explore some of the assumptions in the experimental set-up and data reduction of the Kolsky bar experiment. In brief, the Kolsky bar, sometimes referred to as the split Hopkinson pressure bar, is an experimental apparatus used to study the mechanical properties of materials at high strain rates. Kolsky bars can be constructed to conduct experiments in tension or compression, both of which are studied in this paper. The basic operation of the tension Kolsky bar is as follows: compressed air is injected into the barrel that contains the striker; the striker accelerates towards the left and strikes the left end of the barrel, producing a tensile stress wave that propagates first through the barrel and then down the incident bar, into the specimen, and finally into the transmission bar. In the compression case, the striker instead travels to the right and impacts the incident bar directly. As the stress wave travels through an interface (e.g., the incident bar to specimen connection), a portion of the pulse is transmitted and the rest reflected. The incident pulse, as well as the transmitted and reflected pulses, are picked up by two strain gauges installed on the incident and transmitted bars. By interpreting the data acquired by these strain gauges, the stress/strain behavior of the specimen can be determined.
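
    For orientation, the classical one-dimensional data reduction that those strain-gauge signals feed into (standard Kolsky-bar relations; the symbols and the Python rendering are illustrative, not this report's code):

        import numpy as np

        def kolsky_reduce(eps_r, eps_t, dt, E_bar, c0, A_bar, A_spec, L_spec):
            """Classical (1-wave) Kolsky bar analysis.

            eps_r, eps_t: reflected and transmitted strain signals (arrays)
            E_bar, c0:    bar elastic modulus and elastic wave speed
            A_bar/A_spec: bar and specimen cross-sectional areas
            L_spec:       specimen gauge length
            """
            stress = E_bar * (A_bar / A_spec) * np.asarray(eps_t)
            strain_rate = -2.0 * c0 / L_spec * np.asarray(eps_r)
            strain = np.cumsum(strain_rate) * dt  # integrate rate over time
            return stress, strain_rate, strain

    Assumptions baked into these formulas, such as stress equilibrium across the specimen and dispersion-free bar waves, are precisely the kind such finite element models can probe.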

  19. Cleanup of contaminated soil -- Unreal risk assumptions: Contaminant degradation

    Energy Technology Data Exchange (ETDEWEB)

    Schiffman, A. [New Jersey Department of Environmental Protection, Ewing, NJ (United States)

    1995-12-31

    Exposure assessments used to develop risk-based soil cleanup standards or criteria assume that the contaminant mass in soil is infinite and conservative (constant concentration). This assumption is not realistic for most organic chemicals: contaminant mass is lost from soil and ground water as organic chemicals degrade. Factors to correct for chemical mass lost by degradation are derived from first-order kinetics for 85 organic chemicals commonly listed by the USEPA and state agencies. Soil cleanup criteria based on constant concentration are then corrected for the contaminant mass lost. For many chemicals, accounting for mass lost yields large correction factors to risk-based soil concentrations. For degradation in ground water and soil, correction factors range from slightly greater than one to several orders of magnitude. The long exposure durations normally used in exposure assessments (25 to 70 years) result in large correction factors to standards even for carcinogenic chemicals with long half-lives. For the ground water pathway, a typical soil criterion for TCE of 1 mg/kg would be corrected to 11 mg/kg. For noncarcinogens, correcting for mass lost means that the risk algorithms used to set soil cleanup requirements are inapplicable for many chemicals, especially for long periods of exposure.
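
    A minimal sketch of the first-order correction being described (generic kinetics; the paper's chemical-specific factors are not reproduced):

        import math

        def degradation_correction(half_life_yr, exposure_yr):
            """Factor by which a constant-concentration criterion can be
            relaxed when the contaminant decays with first-order kinetics.

            With C(t) = C0 * exp(-k t) and k = ln(2) / half-life, the mean
            concentration over an exposure of T years is
            C0 * (1 - exp(-k T)) / (k T), so the correction is its inverse.
            """
            k = math.log(2.0) / half_life_yr
            kT = k * exposure_yr
            return kT / (1.0 - math.exp(-kT))

        # e.g., a 1-year half-life over a 30-year exposure gives a factor of ~21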

  20. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. PKreport: report generation for checking population pharmacokinetic model assumptions

    Directory of Open Access Journals (Sweden)

    Li Jun

    2011-05-01

    Background: Graphics play an important and unique role in population pharmacokinetic (PopPK) model building: exploring hidden structure in the data before modeling, evaluating model fit, and validating results after modeling. Results: The work described in this paper is a new R package called PKreport, which generates a collection of plots and statistics for testing model assumptions, visualizing data, and diagnosing models. A metric system is used as the currency for communication between data sets and the package when generating special-purpose plots, providing ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy and offers an efficient way to access the output from NONMEM 7. The final reports use the web browser as the user interface to manage and visualize plots. Conclusions: PKreport provides (1) a flexible and efficient R class to store and retrieve NONMEM 7 output; (2) automated plots for users to visualize data and models; (3) automatically generated R scripts that are used to create the plots; (4) an archive-oriented management tool for users to store, retrieve and modify figures; (5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment and statistical methods can readily be extended through the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  2. Testing the habituation assumption underlying models of parasitoid foraging behavior

    Science.gov (United States)

    Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of parasitoid behavioral strategies. However, parasitoid behavioral responses to host cues have not previously been tested for the known, specific characteristics of habituation. Methods In the laboratory, we tested whether the foraging behavior of the egg parasitoid Trissolcus basalis shows specific characteristics of habituation in response to consecutive encounters with patches of host (Nezara viridula) chemical contact cues (footprints), in particular: (i) a training interval-dependent decline in response intensity, and (ii) a training interval-dependent recovery of the response. Results As would be expected of a habituated response, wasps trained at higher frequencies decreased their behavioral response to host footprints more quickly and to a greater degree than those trained at low frequencies, and subsequently showed a more rapid, although partial, recovery of their behavioral response to host footprints. This putative habituation learning could not be blocked by cold anesthesia, ingestion of an ATPase inhibitor, or ingestion of a protein synthesis inhibitor. Discussion Our study provides support for the assumption that diminishing responses of parasitoids to chemical indicators of host presence constitutes habituation as opposed to sensory fatigue, and provides a preliminary basis for exploring the underlying mechanisms. PMID:28321365

  3. Observing gravitational-wave transient GW150914 with minimal assumptions

    Science.gov (United States)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; DeRosa, R. T.; De Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. 
C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Haas, R.; Hacker, J. J.

    2016-06-01

The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be sensitive to gravitational waves emitted by a wide range of sources including binary black hole mergers. Over the observational period from September 12 to October 20, 2015, these transient searches were sensitive to binary black hole mergers similar to GW150914 to an average distance of ˜600 Mpc. In this paper, we describe the analyses that first detected GW150914 as well as the parameter estimation and waveform reconstruction techniques that initially identified GW150914 as the merger of two black holes. We find that the reconstructed waveform is consistent with the signal from a binary black hole merger with a chirp mass of ˜30 M⊙ and a total mass before merger of ˜70 M⊙ in the detector frame.
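
    The detector-frame chirp mass quoted above is the mass combination that such searches measure best. As a minimal illustration (not the collaboration's analysis code), the sketch below computes it from two component masses; the example values are assumptions chosen only to land near the quoted ˜30 M⊙.

```python
def chirp_mass(m1: float, m2: float) -> float:
    """Chirp mass (m1*m2)**(3/5) / (m1+m2)**(1/5), in the same units as the inputs."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

# Illustrative component masses in solar masses (assumed, not the published estimates):
m1, m2 = 36.0, 29.0
print(f"chirp mass ~ {chirp_mass(m1, m2):.1f} Msun, total ~ {m1 + m2:.0f} Msun")
```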

  4. Is air pollution a plausible candidate for prenatal exposure in autism spectrum disorder (ASD)? : a systematic review / by Dhanashree Vernekar

    OpenAIRE

    Vernekar, Dhanashree

    2013-01-01

    Objective: To present a systematic review of existing literature that investigates biological plausibility of prenatal hazardous air pollutants’ (HAPs) exposure, in the etiology of autism spectrum disorder (ASD) and related outcomes. Method: Electronic databases Pubmed, Biomed Central and National Database for Autism Research, and grey literature pertaining to air pollution association with ASD and related outcomes were searched using specific keywords. The search included 190 HAPs as defi...

  5. Boltzmann's "H"-Theorem and the Assumption of Molecular Chaos

    Science.gov (United States)

    Boozer, A. D.

    2011-01-01

    We describe a simple dynamical model of a one-dimensional ideal gas and use computer simulations of the model to illustrate two fundamental results of kinetic theory: the Boltzmann transport equation and the Boltzmann "H"-theorem. Although the model is time-reversal invariant, both results predict that the behaviour of the gas is time-asymmetric.…
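
    As a hedged illustration of the quantity at stake (not the authors' model), the sketch below estimates Boltzmann's H functional from sampled velocities via a histogram; the H-theorem asserts that this quantity should not increase as the gas relaxes toward the Maxwellian distribution. All sample sizes and distributions here are assumptions for demonstration.

```python
import numpy as np

def boltzmann_H(v: np.ndarray, bins: int = 50) -> float:
    """Estimate H = integral f ln f dv from velocity samples via a histogram."""
    f, edges = np.histogram(v, bins=bins, density=True)
    dv = np.diff(edges)
    mask = f > 0                      # avoid log(0) in empty bins
    return float(np.sum(f[mask] * np.log(f[mask]) * dv[mask]))

rng = np.random.default_rng(0)
v_uniform = rng.uniform(-np.sqrt(3), np.sqrt(3), 100_000)  # non-equilibrium, unit variance
v_maxwell = rng.normal(0.0, 1.0, 100_000)                  # 1D Maxwellian, unit variance
print(boltzmann_H(v_uniform), boltzmann_H(v_maxwell))      # H is lower for the Maxwellian
```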

  6. Effective Teacher Practice on the Plausibility of Human-Induced Climate Change

    Science.gov (United States)

    Niepold, F.; Sinatra, G. M.; Lombardi, D.

    2013-12-01

Climate change education programs in the United States seek to promote a deeper understanding of the science of climate change, behavior change and stewardship, and support informed decision making by individuals, organizations, and institutions--all of which are summarized under the term 'climate literacy.' The ultimate goal of climate literacy is to enable actors to address climate change, both in terms of stabilizing and reducing emissions of greenhouse gases, but also an increased capacity to prepare for the consequences and opportunities of climate change. However, the long-term nature of climate change and the required societal response involve changing students' ideas about controversial scientific issues, which presents unique challenges for educators (Lombardi & Sinatra, 2010; Sinatra & Mason, 2008). This session will explore how United States educational efforts focus on three distinct, but related, areas: the science of climate change, the human-climate interaction, and using climate education to promote informed decision making. Each of these approaches is represented in the Atlas of Science Literacy (American Association for the Advancement of Science, 2007) and in the conceptual framework for science education developed at the National Research Council (NRC) in 2012. Instruction to develop these fundamental thinking skills (e.g., critical evaluation and plausibility reappraisal) has been called for by the Next Generation Science Standards (NGSS) (Achieve, 2013), an innovative and research-based way to address climate change education within the decentralized U.S. education system. However, the promise of the NGSS is that students will have more time to build mastery of the subjects, but the form of that instructional practice has been shown to be critical. Research has shown that effective instructional activities that promote evaluation of evidence improve students' understanding and acceptance toward the scientifically accepted model of human

  7. Plausible antioxidant biomechanics and anticonvulsant pharmacological activity of brain-targeted β-carotene nanoparticles

    Directory of Open Access Journals (Sweden)

    Yusuf M

    2012-08-01

general tonic–clonic seizures were reduced significantly to 2.90 ± 0.98 seconds by the use of BCNP and were further reduced on P-80-BCNP to 1.20 ± 0.20 seconds, as compared to PTZ control and PTZ-placebo control (8.09 ± 0.26 seconds). General tonic–clonic seizure latency was increased significantly to 191.0 ± 9.80 seconds with BCNP and was further increased with P-80-BCNP to 231.0 ± 16.30 seconds, as compared to PTZ (120.10 ± 4.50 seconds) and placebo control (120.30 ± 7.4 seconds). The results of this study demonstrate a plausible novel anticonvulsant activity of β-carotene at a low dose of 2 mg/kg, with brain-targeted nanodelivery, thus increasing its bioavailability and stability. Keywords: anticonvulsant, blood–brain barrier (BBB), targeted brain delivery, polysorbate-80-coated β-carotene nanoparticles (P-80-BCNP), maximal electroshock seizure (MES), pentylenetetrazole (PTZ)

  8. Vulnerabilities to agricultural production shocks: An extreme, plausible scenario for assessment of risk for the insurance sector

    Directory of Open Access Journals (Sweden)

    Tobias Lunt

    2016-01-01

Climate risks pose a threat to the function of the global food system and therefore also a hazard to the global financial sector, the stability of governments, and the food security and health of the world’s population. This paper presents a method to assess plausible impacts of an agricultural production shock and potential materiality for global insurers. A hypothetical, near-term, plausible, extreme scenario was developed based upon modules of historical agricultural production shocks, linked under a warm-phase El Niño-Southern Oscillation (ENSO) meteorological framework. The scenario included teleconnected floods and droughts in disparate agricultural production regions around the world, as well as plausible, extreme biotic shocks. In this scenario, global crop yield declines of 10% for maize, 11% for soy, 7% for wheat and 7% for rice result in quadrupled commodity prices and commodity stock fluctuations, civil unrest, significant negative humanitarian consequences and major financial losses worldwide. This work illustrates a need for the scientific community to partner across sectors and industries towards better-integrated global data, modeling and analytical capacities, to better respond to and prepare for concurrent agricultural failure. Governments, humanitarian organizations and the private sector collectively may recognize significant benefits from more systematic assessment of exposure to agricultural climate risk.

  9. Preliminary Study on Plausible Reasoning in Chemistry Teaching of Senior Middle School

    Institute of Scientific and Technical Information of China (English)

    杨健; 吴俊明; 骆红山

    2009-01-01

Plausible reasoning is significant for science education. Discussions in the philosophy of science and logic, together with historical examples, show that scientific discovery is inseparable from plausible reasoning, so science education must pay attention to developing students' plausible reasoning ability. The paper also discusses the possibility, objects and content of teaching plausible reasoning in senior middle school chemistry.

  10. Projecting the future of Canada's population: assumptions, implications, and policy

    Directory of Open Access Journals (Sweden)

    Beaujot, Roderic

    2003-01-01

After considering the assumptions for fertility, mortality and international migration, this paper looks at implications of the evolving demographics for population growth, labour force, retirement, and population distribution. With the help of policies favouring gender equity and supporting families of various types, fertility in Canada could avoid the particularly low levels seen in some countries, and remain at levels closer to 1.6 births per woman. The prognosis in terms of both risk factors and treatment suggests further reductions in mortality toward a life expectancy of 85. On immigration, there are political interests for levels as high as 270,000 per year, while levels of 150,000 correspond to the long-term post-war average. The future will see slower population growth, driven more by migration than by natural increase. International migration of some 225,000 per year can enable Canada to avoid population decline and sustain the size of the labour force, but all scenarios show much change in the relative size of the retired population compared to the labour force population. According to the ratio of persons aged 20-64 to those aged 65 and over, there were seven persons at labour force ages per person at retirement age in 1951, compared to five in 2001 and probably fewer than 2.5 in 2051. Growth due more to migration than to natural increase will accentuate the urbanization trend and the unevenness of the population distribution over space. Past projections have under-projected mortality improvements and their impact on the relative size of the population at older age groups. Policies regarding fertility, mortality and migration could be aimed at avoiding population decline and reducing the effect of aging, but an institutional basis is lacking for policy that would seek to endogenize population.

  11. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

As worldwide energy requirements increase simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as a "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threats, cyber criminals, and state and non-state groups (terrorists), pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents, e.g. the Stuxnet worm (Advanced Persistent Threat) and the theft of sensitive information at a South Korean nuclear power plant (Insider Threat), have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  12. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  13. Coefficient Alpha as an Estimate of Test Reliability under Violation of Two Assumptions.

    Science.gov (United States)

    Zimmerman, Donald W.; And Others

    1993-01-01

    Coefficient alpha was examined through computer simulation as an estimate of test reliability under violation of two assumptions. Coefficient alpha underestimated reliability under violation of the assumption of essential tau-equivalence of subtest scores and overestimated it under violation of the assumption of uncorrelated subtest error scores.…

  14. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  15. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  16. Exploring the Influence of Ethnicity, Age, and Trauma on Prisoners' World Assumptions

    Science.gov (United States)

    Gibson, Sandy

    2011-01-01

    In this study, the author explores world assumptions of prisoners, how these assumptions vary by ethnicity and age, and whether trauma history affects world assumptions. A random sample of young and old prisoners, matched for prison location, was drawn from the New Jersey Department of Corrections prison population. Age and ethnicity had…

  17. Simplified subsurface modelling: data assimilation and violated model assumptions

    Science.gov (United States)

    Erdal, Daniel; Lange, Natascha; Neuweiler, Insa

    2017-04-01

Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come along with larger numbers of unknowns and requirements on computational resources compared to stand-alone models. If large model domains are to be represented, e.g. on catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches lead to a reduced ability to reproduce the present processes. This lack of model accuracy may be compensated by using data assimilation methods. In these methods observations are used to update the model states, and optionally model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models, or if they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D models solving the Richards equation. For use in simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D model and the unsaturated zone as a few sparse 1D columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedback between the two model compartments is large (e.g. shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hill-slope or pumping wells, creating lateral fluxes in the unsaturated zone, or strong heterogeneous structures creating unaccounted flows in both the saturated and unsaturated
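
    For readers unfamiliar with the method, a minimal stochastic Ensemble Kalman filter analysis step might look like the sketch below. This is a generic textbook formulation, not the authors' implementation; all names, dimensions and values are illustrative assumptions.

```python
import numpy as np

def enkf_update(X: np.ndarray, y: np.ndarray, H: np.ndarray,
                R: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Stochastic EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error covariance."""
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    P = A @ A.T / (n - 1)                            # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n).T  # perturbed obs
    return X + K @ (Y - H @ X)                       # analysis ensemble

# Tiny demo: 3 states, 2 observations, 50 members (all values invented)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 50))
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
Xa = enkf_update(X, np.array([0.5, -0.2]), H, 0.1 * np.eye(2), rng)
```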

  18. On non-uniform hyperbolicity assumptions in one-dimensional dynamics

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

We give an essentially equivalent formulation of the backward contracting property, defined by Juan Rivera-Letelier, in terms of expansion along the orbits of critical values, for complex polynomials of degree at least 2 which are at most finitely renormalizable and have only hyperbolic periodic points, as well as for all C3 interval maps with non-flat critical points.

  19. Fluid dynamics of air in a packed bed: velocity profiles and the continuum model assumption

    Directory of Open Access Journals (Sweden)

    A. L. NEGRINI

    1999-12-01

Air flow through packed beds was analyzed experimentally under conditions ranging from those that reinforce the effect of the wall on the void fraction to those that minimize it. The packing was spherical particles, with a tube-to-particle diameter ratio (D/dp) between 3 and 60. Air flow rates were maintained between 1.3 and 4.44 m3/min, and gas velocity was measured with a Pitot tube positioned above the bed exit. Measurements were made at various radial and angular coordinate values, allowing the distribution of air flow across the bed to be described in detail. Comparison of the experimentally observed radial profiles with those derived from published equations revealed that at high D/dp ratios the measured and calculated velocity profiles behaved similarly. At low ratios, oscillations in the velocity profiles agreed with those in the voidage profiles, signifying that treating the porous medium as a continuum is questionable in these cases.

  20. Classic models of population dynamics: assumptions about selfregulative mechanisms and numbers of interactions between individuals

    Directory of Open Access Journals (Sweden)

    L.V. Nedorezov

    2014-09-01

A stochastic model of migrations of individuals within the limits of a finite domain on a plane is considered. It is assumed that the population size scale is homogeneous and that there does not exist an interval of optimal population sizes (the Allee effect is not realized for the population). For every fixed value of population size the number of interactions between individuals is calculated (as an average in space and time). The correspondence between several classic models and the numbers of interactions between individuals is analyzed.

  1. Classic models of population dynamics: assumptions about selfregulative mechanisms and numbers of interactions between individuals

    OpenAIRE

    L. V. Nedorezov

    2014-01-01

Stochastic model of migrations of individuals within the limits of finite domain on a plane is considered. It is assumed that population size scale is homogeneous, and there doesn't exist an interval of optimal values of population size (the Allee effect is not realized for the population). For every fixed value of population size number of interactions between individuals is calculated (as average in space and time). Correspondence between several classic models and numbers of interactions between i...

  2. 'Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community'

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie

    2016-01-01

In this full-day workshop we want to discuss how the IDC community can make explicit the underlying assumptions, values and views regarding children and childhood that shape design decisions. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on those assumptions and their possible influence on design decisions? How can we make the assumptions explicit, discuss them in the IDC community, and use the discussion to develop higher-quality design and research? The workshop will support discussion between researchers, designers and practitioners, and intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  3. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of and increased transparency in the reporting of statistical assumption checking.
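
    The distinction the review stresses can be made concrete: normality should be checked on the residuals, not on the raw variables. A minimal sketch under assumed toy data (Shapiro-Wilk is used here purely as one common test, not as the review's recommendation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.exponential(1.0, 200)              # a skewed predictor is perfectly admissible
y = 2.0 + 1.5 * x + rng.normal(0, 1, 200)  # errors, not variables, are assumed normal

slope, intercept = np.polyfit(x, y, 1)     # ordinary least squares fit
residuals = y - (intercept + slope * x)

# The normality assumption concerns the errors (residuals), not the variables:
print("residuals:", stats.shapiro(residuals).pvalue)   # typically > .05 here
print("predictor:", stats.shapiro(x).pvalue)           # typically < .05 (skewed)
```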

  4. An analog implementation of biologically plausible neurons using CCII building blocks.

    Science.gov (United States)

    Sharifipoor, Ozra; Ahmadi, Arash

    2012-12-01

    This study presents an analog implementation of the spiking neurons based on a piecewise-linear model. This model is a variation of the Izhikevich model, which is capable of reproducing different dynamic behaviors. The proposed circuit utilizes second generation current conveyors (CCII) building blocks. With the same topology and circuit values, this circuit can produce a wide variety of neuron behaviors just by tuning the reference current and voltage sources. In addition, since CCII can be considered as a building block for programmable analog arrays, based on the proposed circuit different neuron types can be implemented on programmable analog platforms. Simulation results are presented for different neuron behaviors with CMOS 350 nm ±1.5 V technology using HSPICE. Copyright © 2012 Elsevier Ltd. All rights reserved.
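
    The circuit realizes a piecewise-linear variant of the Izhikevich model; for orientation, a minimal simulation of the original quadratic Izhikevich equations (not the paper's piecewise-linear version, and not its CCII hardware) is sketched below, with the commonly cited regular-spiking parameters.

```python
import numpy as np

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.5):
    """Forward-Euler simulation of the quadratic Izhikevich neuron model.
    The parameters (a, b, c, d) select the firing pattern; returns the voltage trace."""
    v, u = -65.0, b * -65.0
    trace = []
    for _ in range(int(T / dt)):
        if v >= 30.0:                 # spike: reset membrane and recovery variables
            v, u = c, u + d
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        trace.append(v)
    return np.array(trace)

regular_spiking = izhikevich()        # these defaults give regular spiking
```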

  5. Transient cerebral hypoperfusion and hypertensive events during atrial fibrillation: a plausible mechanism for cognitive impairment

    CERN Document Server

    Anselmino, Matteo; Saglietto, Andrea; Gaita, Fiorenzo; Ridolfi, Luca

    2016-01-01

    Atrial fibrillation (AF) is associated with an increased risk of dementia and cognitive decline, independent of strokes. Several mechanisms have been proposed to explain this association, but altered cerebral blood flow dynamics during AF has been poorly investigated: in particular, it is unknown how AF influences hemodynamic parameters of the distal cerebral circulation, at the arteriolar and capillary level. Two coupled lumped-parameter models (systemic and cerebrovascular circulations, respectively) were here used to simulate sinus rhythm (SR) and AF. For each simulation 5000 cardiac cycles were analyzed and cerebral hemodynamic parameters were calculated. With respect to SR, AF triggered a higher variability of the cerebral hemodynamic variables which increases proceeding towards the distal circulation, reaching the maximum extent at the arteriolar and capillary levels. This variability led to critical cerebral hemodynamic events of excessive pressure or reduced blood flow: 303 hypoperfusions occurred at ...

  6. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    Science.gov (United States)

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for neuronal code generation for odour recognition using an electronic nose (EN). This scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data is further applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation based SpikeProp and a dynamic evolving SNN are used to learn the encoded responses. The effects of information encoding on the performance of SNNs have been investigated. Statistical tests have been performed to determine the contribution of the SNN and the encoding scheme to overall odour discrimination. The approach has been implemented in odour classification of orthodox black tea (Kangra-Himachal Pradesh Region) thereby demonstrating a biomimetic approach for EN data analysis.
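
    A hedged sketch of the general idea of channel encoding with overlapping Gaussian receptive fields (the parameter choices below are assumptions, not the paper's): a strongly activated field fires early, a weakly activated one late.

```python
import numpy as np

def grf_encode(x: float, n_fields: int = 8, x_min: float = 0.0,
               x_max: float = 1.0, t_max: float = 10.0) -> np.ndarray:
    """Encode a scalar sensor reading into one spike latency per receptive field."""
    centers = np.linspace(x_min, x_max, n_fields)
    sigma = (x_max - x_min) / (n_fields - 1) * 1.5       # field width (a common choice)
    activation = np.exp(-0.5 * ((x - centers) / sigma) ** 2)  # in (0, 1]
    return t_max * (1.0 - activation)                    # high activation -> early spike

print(grf_encode(0.3))   # spike times to feed into the spiking neural network
```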

  7. Usefulness of an equal-probability assumption for out-of-equilibrium states: A master equation approach

    KAUST Repository

    Nogawa, Tomoaki

    2012-10-18

    We examine the effectiveness of assuming an equal probability for states far from equilibrium. For this aim, we propose a method to construct a master equation for extensive variables describing nonstationary nonequilibrium dynamics. The key point of the method is the assumption that transient states are equivalent to the equilibrium state that has the same extensive variables, i.e., an equal probability holds for microscopic states in nonequilibrium. We demonstrate an application of this method to the critical relaxation of the two-dimensional Potts model by Monte Carlo simulations. While the one-variable description, which is adequate for equilibrium, yields relaxation dynamics that are very fast, the redundant two-variable description well reproduces the true dynamics quantitatively. These results suggest that some class of the nonequilibrium state can be described with a small extension of degrees of freedom, which may lead to an alternative way to understand nonequilibrium phenomena. © 2012 American Physical Society.

  8. A Note on Unified Statistics Including Fermi-Dirac, Bose-Einstein, and Tsallis Statistics, and Plausible Extension to Anisotropic Effect

    Directory of Open Access Journals (Sweden)

    Christianto V.

    2007-04-01

In the light of some recent hypotheses suggesting a plausible unification of thermostatistics in which Fermi-Dirac, Bose-Einstein and Tsallis statistics become special subsets, we consider a further plausible extension to include non-integer Hausdorff dimension, which becomes a realization of the fractal entropy concept. In the subsequent section, we also discuss a plausible extension of this unified statistics to include anisotropic effects by using a quaternion oscillator, which may be observed in the context of the Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
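
    For context, the classical statistics that such unifications subsume differ only in one constant of the occupation-number formula; the sketch below is an illustration of that baseline, not of the paper's generalized (fractal, anisotropic) statistics.

```python
import numpy as np

def occupation(E, mu=0.0, kT=1.0, kappa=1.0):
    """Mean occupation number n(E) = 1 / (exp((E - mu)/kT) + kappa).
    kappa = +1: Fermi-Dirac; kappa = -1: Bose-Einstein; kappa = 0: Maxwell-Boltzmann."""
    return 1.0 / (np.exp((E - mu) / kT) + kappa)

E = np.linspace(0.5, 5.0, 10)
for kappa in (+1.0, -1.0, 0.0):
    print(kappa, occupation(E, kappa=kappa)[:3])
```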

  9. Evaluating assumptions and parameterization underlying process-based ecosystem models: the case of LPJ-GUESS

    Science.gov (United States)

    Pappas, C.; Fatichi, S.; Leuzinger, S.; Burlando, P.

    2012-04-01

Dynamic vegetation models have been widely used for analyzing ecosystem dynamics and climate feedbacks. Their performance has been tested extensively against observations and by model intercomparison studies. In the present study, the LPJ-GUESS state-of-the-art ecosystem model was evaluated with respect to its structure, hypotheses, and parameterization by performing a global sensitivity analysis (GSA). The study aims at examining potential model limitations, particularly with regard to regional and watershed-scale applications. A detailed GSA based on variance decomposition is presented to investigate the structural assumptions of the model and to highlight processes and parameters that cause the highest variability in the outputs. First-order and total sensitivity indexes were calculated for each of the parameters using Sobol's methodology. To elucidate the role of climate in model sensitivity, synthetic climate scenarios were generated based on climatic data from Switzerland. The results clearly indicate a very high sensitivity of LPJ-GUESS to photosynthetic parameters. Intrinsic quantum efficiency alone is able to explain about 60% of the variability in vegetation carbon fluxes and pools for most of the investigated climate conditions. Processes related to light were also found important, together with parameters affecting plant structure (growth, establishment and mortality). The model shows minor sensitivity to hydrological and soil texture parameters, questioning its skill in representing spatial vegetation heterogeneity at regional or watershed scales. We conclude that the structure of LPJ-GUESS, and possibly that of other, structurally similar, dynamic vegetation models, may need to be reconsidered. Specifically, the oversensitivity of the photosynthetic component deserves particular attention, as it seems to contradict an increasing number of observations suggesting that photosynthesis may be a consequence rather than the driver of plant growth.
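
    Sobol's variance-based first-order indices, on which such analyses rest, can be estimated by Monte Carlo. A minimal sketch with a toy model follows; the estimator form, toy function and sample sizes are illustrative assumptions, not those of the study.

```python
import numpy as np

def sobol_first_order(f, n_params: int, n_samples: int = 10_000, seed: int = 0):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli-style),
    for a model f acting row-wise on parameters sampled uniformly in [0, 1]."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    S1 = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                           # vary only parameter i
        S1[i] = np.mean(fB * (f(ABi) - fA)) / var     # Saltelli (2010) estimator
    return S1

toy = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2]
print(sobol_first_order(toy, 3))   # the first parameter dominates the variance
```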

  10. Exploring plausible formation scenarios for the planet candidate orbiting Proxima Centauri

    CERN Document Server

    Coleman, Gavin A L; Paardekooper, Sijme-Jan; Dreizler, Stefan; Giesers, Benjamin; Anglada-Escude, Guillem

    2016-01-01

    We present a study of 4 different formation scenarios that may explain the origin of the recently announced planet `Proxima b' orbiting the star Proxima Centauri. The aim is to examine how the formation scenarios differ in their predictions for the multiplicity of the Proxima planetary system, the water/volatile content of Proxima b and its eccentricity, so that these can be tested by future observations. A scenario of in situ formation via giant impacts from a locally enhanced disc of planetary embryos and planetesimals, predicts that Proxima b will be a member of a multiplanet system with a measurably finite value of orbital eccentricity. Assuming that the local solid enhancement needed to form a Proxima b analogue with a minimum mass of 1.3 Earth masses arises because of the inwards drift of solids in the form of small planetesimals/boulders, this scenario also likely results in Proxima b analogues that are moderately endowed with water/volatiles, arising from the dynamical diffusion of icy planetesimals f...

  11. Transient cerebral hypoperfusion and hypertensive events during atrial fibrillation: a plausible mechanism for cognitive impairment.

    Science.gov (United States)

    Anselmino, Matteo; Scarsoglio, Stefania; Saglietto, Andrea; Gaita, Fiorenzo; Ridolfi, Luca

    2016-06-23

Atrial fibrillation (AF) is associated with an increased risk of dementia and cognitive decline, independent of strokes. Several mechanisms have been proposed to explain this association, but altered cerebral blood flow dynamics during AF has been poorly investigated: in particular, it is unknown how AF influences hemodynamic parameters of the distal cerebral circulation, at the arteriolar and capillary level. Two coupled lumped-parameter models (systemic and cerebrovascular circulations, respectively) were here used to simulate sinus rhythm (SR) and AF. For each simulation 5000 cardiac cycles were analyzed and cerebral hemodynamic parameters were calculated. With respect to SR, AF triggered a higher variability of the cerebral hemodynamic variables which increases proceeding towards the distal circulation, reaching the maximum extent at the arteriolar and capillary levels. This variability led to critical cerebral hemodynamic events of excessive pressure or reduced blood flow: 303 hypoperfusions occurred at the arteriolar level, while 387 hypertensive events occurred at the capillary level during AF. By contrast, neither hypoperfusions nor hypertensive events occurred during SR. Thus, the impact of AF per se on cerebral hemodynamics is a candidate mechanism in the genesis of AF-related cognitive impairment/dementia.
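
    The driver of the reported variability is the irregular beat-to-beat interval series in AF. A toy illustration is sketched below; the distributional choices and parameter values are assumptions for demonstration, not the authors' lumped-parameter model or its inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_beats = 5000

# Illustrative RR-interval series in seconds (all parameters assumed):
rr_sr = rng.normal(0.8, 0.06, n_beats)                  # sinus rhythm: low variability
rr_af = rng.normal(0.8, 0.19, n_beats).clip(min=0.3)    # AF: larger, uncorrelated spread

for name, rr in (("SR", rr_sr), ("AF", rr_af)):
    print(name, "coefficient of variation =", round(rr.std() / rr.mean(), 3))
```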

  12. Ionic liquid pretreatment of biomass for sugars production: Driving factors with a plausible mechanism for higher enzymatic digestibility.

    Science.gov (United States)

    Raj, Tirath; Gaur, Ruchi; Dixit, Pooja; Gupta, Ravi P; Kagdiyal, V; Kumar, Ravindra; Tuli, Deepak K

    2016-09-20

In this study, five ionic liquids (ILs) were explored for biomass pretreatment for the production of fermentable sugar. We also investigated the driving factors responsible for the improved enzymatic digestibility of the various IL-treated biomasses and postulate a plausible mechanism. Post-pretreatment, two factors mainly impacted enzymatic digestibility: (i) structural deformation (cellulose I to II) along with xylan/lignin removal, and (ii) the properties of the ILs, wherein the Kamlet-Taft (K-T) parameters, viscosity and surface tension had a direct influence on pretreatment. A systematic investigation of these parameters and their impact on enzymatic digestibility is drawn. [C2mim][OAc], with a β-value of 1.32, resulted in a 97.7% glucose yield using 10 FPU/g of biomass. A closer insight into the cellulose structural transformation has prompted a plausible mechanism explaining the better digestibility. The impact of these parameters on digestibility can pave the way to customize the process to make biomass vulnerable to enzymatic attack.

  13. Antimicrobial drug use in Austrian pig farms: plausibility check of electronic on-farm records and estimation of consumption.

    Science.gov (United States)

    Trauffler, M; Griesbacher, A; Fuchs, K; Köfer, J

    2014-10-25

Electronic drug application records from farmers on 75 conventional pig farms were reviewed and checked for plausibility. The registered drug amounts were verified by comparing the farmers' records with veterinarians' dispensary records. Antimicrobial consumption was evaluated from 2008 to 2011 and expressed in weight of active substance(s), number of used daily doses (nUDD), number of animal daily doses (nADD) and number of product-related daily doses (nPrDD). All results were referred to one year and animal bodyweight (kg biomass). The plausibility check revealed that about 14 per cent of the drug amount entries in the farmers' records were unrealistic. The annual antimicrobial consumption was 33.9 mg/kg/year, 4.9 UDDkg/kg/year, 1.9 ADDkg/kg/year and 2.5 PrDDkg/kg/year (average). Most of the antimicrobials were applied orally (86 per cent) and at group level. Main therapy indications were metaphylactic/prophylactic measures (farrow-to-finish and fattening farms) or digestive tract diseases (breeding farms). The proportion of the 'highest priority critically important antimicrobials' was low (12 per cent). After determination of a threshold value, farms with a high antimicrobial use could be detected. Statistical tests showed that the veterinarian had an influence on the dosage, the therapy indication and the active substance. Orally administered antimicrobials were mostly underdosed, while parenterally administered antimicrobials were dosed correctly or overdosed.
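
    The dose-based consumption units can be read, in simplified form, as the administered amount divided by a defined daily dose and the treated biomass. The sketch below is a simplified reading of the ADDkg/kg/year unit, with hypothetical names and values invented for illustration, not the study's exact calculation.

```python
def nADD_per_kg_year(total_mg: float, add_mg_per_kg_day: float,
                     biomass_kg: float, years: float = 1.0) -> float:
    """Treatment intensity: animal daily doses applied per kg of animal
    biomass and year (a simplified reading of the ADDkg/kg/year unit)."""
    return total_mg / (add_mg_per_kg_day * biomass_kg * years)

# Hypothetical example: 100 g of a drug dosed at 10 mg/kg/day over 5000 kg biomass
print(nADD_per_kg_year(100_000, 10.0, 5000.0))  # -> 2.0 ADDkg/kg/year
```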

  14. Virtual neurorobotics (VNR) to accelerate development of plausible neuromorphic brain architectures

    Directory of Open Access Journals (Sweden)

    Philip H Goodman

    2007-11-01

Traditional research in artificial intelligence and machine learning has viewed the brain as a specially adapted information-processing system. More recently the field of social robotics has been advanced to capture the important dynamics of human cognition and interaction. An overarching societal goal of this research is to incorporate the resultant knowledge about intelligence into technology for prosthetic, assistive, security, and decision support applications. However, despite many decades of investment in learning and classification systems, this paradigm has yet to yield truly “intelligent” systems. For this reason, many investigators are now attempting to incorporate more realistic neuromorphic properties into machine learning systems, encouraged by over two decades of neuroscience research that has provided parameters that characterize the brain’s interdependent genomic, proteomic, metabolomic, anatomic, and electrophysiological networks. Given the complexity of neural systems, developing tenable models to capture the essence of natural intelligence for real-time application requires that we discriminate features underlying information processing and intrinsic motivation from those reflecting biological constraints (such as maintaining structural integrity and transporting metabolic products). We propose herein a conceptual framework and an iterative method of virtual neurorobotics (VNR) intended to rapidly forward-engineer and test progressively more complex putative neuromorphic brain prototypes for their ability to support intrinsically intelligent, intentional interaction with humans. The VNR system is based on the viewpoint that a truly intelligent system must be driven by emotion rather than programmed tasking, incorporating intrinsic motivation and intentionality. We report pilot results of a closed-loop, real-time interactive VNR system with a spiking neural brain, and provide a video demonstration as online supplemental material.

  15. Virtual Neurorobotics (VNR) to Accelerate Development of Plausible Neuromorphic Brain Architectures.

    Science.gov (United States)

    Goodman, Philip H; Buntha, Sermsak; Zou, Quan; Dascalu, Sergiu-Mihai

    2007-01-01

    Traditional research in artificial intelligence and machine learning has viewed the brain as a specially adapted information-processing system. More recently the field of social robotics has been advanced to capture the important dynamics of human cognition and interaction. An overarching societal goal of this research is to incorporate the resultant knowledge about intelligence into technology for prosthetic, assistive, security, and decision support applications. However, despite many decades of investment in learning and classification systems, this paradigm has yet to yield truly "intelligent" systems. For this reason, many investigators are now attempting to incorporate more realistic neuromorphic properties into machine learning systems, encouraged by over two decades of neuroscience research that has provided parameters that characterize the brain's interdependent genomic, proteomic, metabolomic, anatomic, and electrophysiological networks. Given the complexity of neural systems, developing tenable models to capture the essence of natural intelligence for real-time application requires that we discriminate features underlying information processing and intrinsic motivation from those reflecting biological constraints (such as maintaining structural integrity and transporting metabolic products). We propose herein a conceptual framework and an iterative method of virtual neurorobotics (VNR) intended to rapidly forward-engineer and test progressively more complex putative neuromorphic brain prototypes for their ability to support intrinsically intelligent, intentional interaction with humans. The VNR system is based on the viewpoint that a truly intelligent system must be driven by emotion rather than programmed tasking, incorporating intrinsic motivation and intentionality. We report pilot results of a closed-loop, real-time interactive VNR system with a spiking neural brain, and provide a video demonstration as online supplemental material.

  16. [Homicide-suicide: Clinical review and psychological assumptions].

    Science.gov (United States)

    Vandevoorde, J; Estano, N; Painset, G

    2017-08-01

Suicide-homicide can be defined as a "suicidal" behaviour that also includes the death of at least one other individual, and sometimes up to hundreds. This literature review intends to highlight some characteristic features that might be found amongst the various types of suicide-homicide. It is a complex phenomenon which can occur in different situations, from a familial and somewhat intimate setting (filicide, uxoricide, marital homicide…) to a public one (workplace shooting, school shooting), and includes a wide range of victims, from a single victim in marital cases of suicide-homicide to hundreds of victims in certain types, such as suicide by aircraft or warrior-like multi-homicides in terrorist acts. This literature review offers a combination of data emanating from scientific publications and case studies from our practices in an attempt to isolate some common factors. A thorough examination of the offenses unravels complex processes, ideations, modus operandi and peculiar cognitive impairments in which familial suicide-homicide could be rooted. Mass murders might also be caused by a psychopathological alloy, made of a Grandiose Self and sub-depressive and even paranoid ideations. Terrorism and multi-homicide-suicide are far more complex phenomena, defined by group-process enrolment and ideological conviction. Beyond epidemiological studies, both descriptive and statistical, this paper's objective is to isolate a hypothesis about the psychopathological ground from which a criminological mechanism could emerge. Despite the lack of blatant psychosis, some traits might be identified in suicide-homicide cases - such as paranoid, psychopathic, narcissistic, melancholic - which can intertwine and potentiate one another, forming a distorted view of the world. The offense dynamic is possibly composed of preparatory behaviours, triggers, the use of death as a narcissistic support, identity choices… METHODS: The data were collected from

  17. Plausibility, necessity and identity: A logic of relative plausibility

    Institute of Scientific and Technical Information of China (English)

    李小五; 文学锋

    2007-01-01

We construct a Hilbert-style system RPL to characterize the notion of plausibility measure introduced by J. Halpern, and prove that RPL is sound and complete with respect to a neighborhood-style semantics. Using the language of RPL, we demonstrate that it can define the well-studied notions of necessity, conditionals and propositional identity.

  18. 3 DOF Spherical Pendulum Oscillations with a Uniform Slewing Pivot Center and a Small Angle Assumption

    Directory of Open Access Journals (Sweden)

    Alexander V. Perig

    2014-01-01

The present paper addresses the derivation of a 3 DOF mathematical model of a spherical pendulum attached to a crane boom tip for uniform slewing motion of the crane. The governing nonlinear DAE-based system for crane boom uniform slewing has been proposed, numerically solved, and experimentally verified. The proposed nonlinear and linearized models have been derived with the introduction of Cartesian coordinates. The linearized model with the small-angle assumption has an analytical solution. The relative and absolute payload trajectories have been derived. The amplitudes of load oscillations, which depend on computed initial conditions, have been estimated. The dependence of natural frequencies on the transport inertia forces and gravity forces has been computed. The conservative system, which contains first time derivatives of coordinates without oscillation damping, has been derived. The dynamic analogy between crane boom-driven payload swaying motion and Foucault’s pendulum motion has been grounded and outlined. For a small swaying angle, good agreement between theoretical and averaged experimental results was obtained.
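
    The small-angle assumption mentioned above reduces each swaying mode to a linear oscillator. As a hedged illustration (not the paper's full 3 DOF DAE model), the sketch below integrates the linearized planar pendulum and compares it with its analytical solution; the physical values are assumptions.

```python
import numpy as np

g, L = 9.81, 0.8          # gravity (m/s^2) and cable length (m); values illustrative
omega = np.sqrt(g / L)    # natural frequency of the linearized pendulum
theta0, dt, steps = 0.05, 1e-3, 5000

# Semi-implicit Euler integration of theta'' = -(g/L) * theta (small-angle model)
theta, dtheta = theta0, 0.0
for _ in range(steps):
    dtheta += -omega**2 * theta * dt
    theta += dtheta * dt

analytic = theta0 * np.cos(omega * steps * dt)
print(theta, analytic)    # numeric and analytic small-angle solutions agree closely
```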

  19. Relaxing the closure assumption in single-season occupancy models: staggered arrival and departure times

    Science.gov (United States)

    Kendall, William L.; Hines, James E.; Nichols, James D.; Grant, Evan H. Campbell

    2013-01-01

    Occupancy statistical models that account for imperfect detection have proved very useful in several areas of ecology, including species distribution and spatial dynamics, disease ecology, and ecological responses to climate change. These models are based on the collection of multiple samples at each of a number of sites within a given season, during which it is assumed the species is either absent or present and available for detection while each sample is taken. However, for some species, individuals are only present or available for detection seasonally. We present a statistical model that relaxes the closure assumption within a season by permitting staggered entry and exit times for the species of interest at each site. Based on simulation, our open model eliminates bias in occupancy estimators and in some cases increases precision. The power to detect the violation of closure is high if detection probability is reasonably high. In addition to providing more robust estimation of occupancy, this model permits comparison of phenology across sites, species, or years, by modeling variation in arrival or departure probabilities. In a comparison of four species of amphibians in Maryland we found that two toad species arrived at breeding sites later in the season than a salamander and frog species, and departed from sites earlier.
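
    For reference, the closed single-season model that this work generalizes has a simple per-site likelihood; a minimal sketch (notation assumed, not the authors' code) is given below. The staggered-entry extension replaces the fixed presence window with site-specific arrival and departure probabilities.

```python
import numpy as np

def closed_occupancy_likelihood(history: np.ndarray, psi: float, p: float) -> float:
    """Single-site likelihood for the standard (closed) single-season occupancy
    model: the species occupies the site with probability psi and, if present,
    is detected in each survey with probability p."""
    h = np.asarray(history)
    present_term = psi * np.prod(p**h * (1 - p) ** (1 - h))
    never_detected = (1 - psi) if h.sum() == 0 else 0.0   # absent sites yield all zeros
    return present_term + never_detected

print(closed_occupancy_likelihood(np.array([0, 1, 0, 1]), psi=0.6, p=0.4))
print(closed_occupancy_likelihood(np.array([0, 0, 0, 0]), psi=0.6, p=0.4))
```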

  20. Numerical simulation of flow in mechanical heart valves: grid resolution and the assumption of flow symmetry.

    Science.gov (United States)

    Ge, Liang; Jones, S Casey; Sotiropoulos, Fotis; Healy, Timothy M; Yoganathan, Ajit P

    2003-10-01

A numerical method is developed for simulating unsteady, 3-D, laminar flow through a bileaflet mechanical heart valve with the leaflets fixed. The method employs a dual-time-stepping artificial-compressibility approach together with overset (Chimera) grids and is second-order accurate in space and time. Calculations are carried out for the full 3-D valve geometry under steady inflow conditions on meshes with a total number of nodes ranging from 4 × 10^5 to 1.6 × 10^6. The computed results show that downstream of the leaflets the flow is dominated by two pairs of counter-rotating vortices, which originate on either side of the central orifice in the aortic sinus and rotate such that the common flow of each pair is directed away from the aortic wall. These vortices intensify with Reynolds number, and at a Reynolds number of approximately 1200 their complex interaction leads to the onset of unsteady flow and the breaking of symmetry with respect to both geometric planes of symmetry. Our results show the highly 3-D structure of the flow; question the validity of computationally expedient assumptions of flow symmetry; and demonstrate the need for highly resolved, fully 3-D simulations if computational fluid dynamics is to accurately predict the flow in prosthetic mechanical heart valves.

  1. Climate Change Impacts on Agriculture and Food Security in 2050 under a Range of Plausible Socioeconomic and Emissions Scenarios

    Science.gov (United States)

    Wiebe, K.; Lotze-Campen, H.; Bodirsky, B.; Kavallari, A.; Mason-d'Croz, D.; van der Mensbrugghe, D.; Robinson, S.; Sands, R.; Tabeau, A.; Willenbockel, D.; Islam, S.; van Meijl, H.; Mueller, C.; Robertson, R.

    2014-12-01

    Previous studies have combined climate, crop and economic models to examine the impact of climate change on agricultural production and food security, but results have varied widely due to differences in models, scenarios and data. Recent work has examined (and narrowed) these differences through systematic model intercomparison using a high-emissions pathway to highlight the differences. New work extends that analysis to cover a range of plausible socioeconomic scenarios and emission pathways. Results from three general circulation models are combined with one crop model and five global economic models to examine the global and regional impacts of climate change on yields, area, production, prices and trade for coarse grains, rice, wheat, oilseeds and sugar to 2050. Results show that yield impacts vary with changes in population, income and technology as well as emissions, but are reduced in all cases by endogenous changes in prices and other variables.

  2. Gene-ontology enrichment analysis in two independent family-based samples highlights biologically plausible processes for autism spectrum disorders.

    LENUS (Irish Health Repository)

    Anney, Richard J L

    2012-02-01

Recent genome-wide association studies (GWAS) have implicated a range of genes from discrete biological pathways in the aetiology of autism. However, despite the strong influence of genetic factors, association studies have yet to identify statistically robust, replicated major-effect genes or SNPs. We apply the principle of the SNP ratio test methodology described by O'Dushlaine et al to over 2100 families from the Autism Genome Project (AGP). Using a two-stage design we examine association enrichment in 5955 unique gene-ontology classifications across four groupings based on two phenotypic and two ancestral classifications. Based on estimates from simulation we identify an excess of association enrichment across all analyses. We observe enrichment in association for sets of genes involved in diverse biological processes, including pyruvate metabolism, transcription factor activation, cell signalling and cell-cycle regulation. Both the genes and the processes that show enrichment have previously been examined in autistic disorders and offer biological plausibility to these findings.

  3. The role of adverse childhood experiences in cardiovascular disease risk: a review with emphasis on plausible mechanisms.

    Science.gov (United States)

    Su, Shaoyong; Jimenez, Marcia P; Roberts, Cole T F; Loucks, Eric B

    2015-10-01

    Childhood adversity, characterized by abuse, neglect, and household dysfunction, is a problem that exerts a significant impact on individuals, families, and society. Growing evidence suggests that adverse childhood experiences (ACEs) are associated with health decline in adulthood, including cardiovascular disease (CVD). In the current review, we first provide an overview of the association between ACEs and CVD risk, with updates on the latest epidemiological evidence. Second, we briefly review plausible pathways by which ACEs could influence CVD risk, including traditional risk factors and novel mechanisms. Finally, we highlight the potential implications of ACEs in clinical and public health. Information gleaned from this review should help physicians and researchers in better understanding potential long-term consequences of ACEs and considering adapting current strategies in treatment or intervention for patients with ACEs.

  4. A hitherto undescribed case of cerebellar ataxia as the sole presentation of thyrotoxicosis in a young man: a plausible association.

    Science.gov (United States)

    Elhadd, Tarik Abdelkareim; Linton, Kathryn; McCoy, Caoihme; Saha, Subrata; Holden, Roger

    2014-01-01

A 16-year-old male presented to hospital following an episode of unusual behavior on the football pitch, where he was witnessed as grossly ataxic by his teammates. The assessment demonstrated marked cerebellar signs on examination but no other neurological deficit. Investigation showed evidence of biochemical thyrotoxicosis, with free T4 at 37 pmol/L (normal reference range: 11-27) and a suppressed thyrotropin (TSH). The association was considered plausible because alternative etiologies were excluded, and the normalization of thyroid function with treatment was coupled with complete resolution of the neurological syndrome. Cerebellar syndromes may well be one of the presenting features of thyrotoxicosis, and this should be in the list of its differential diagnoses.

  5. Influence of the Aqueous Environment on Protein Structure—A Plausible Hypothesis Concerning the Mechanism of Amyloidogenesis

    Directory of Open Access Journals (Sweden)

    Irena Roterman

    2016-09-01

The aqueous environment is a pervasive factor which, in many ways, determines the protein folding process and consequently the activity of proteins. Proteins are unable to perform their function unless immersed in water (membrane proteins are excluded from this statement). Tertiary conformational stabilization is dependent on the presence of internal force fields (nonbonding interactions between atoms), as well as an external force field generated by water. The hitherto unknown structuralization of water as the aqueous environment may be elucidated by analyzing its effects on protein structure and function. Our study is based on the fuzzy oil drop model, a mechanism which describes the formation of a hydrophobic core and attempts to explain the emergence of amyloid-like fibrils. A set of proteins which vary with respect to their fuzzy oil drop status (including titin, transthyretin and a prion protein) has been selected for in-depth analysis to suggest a plausible mechanism of amyloidogenesis.

  6. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model

    Science.gov (United States)

    Aberg, Kristoffer C.; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
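
    At the core of such models is a trial-by-trial expected value updated by a reward prediction error. A minimal delta-rule sketch follows; it is a simplification of the authors' computational model, and the learning rate and reward schedule are assumptions.

```python
import numpy as np

def delta_rule(rewards: np.ndarray, alpha: float = 0.2):
    """Trial-by-trial expected value V and reward prediction error PE = r - V,
    updated with a simple delta (Rescorla-Wagner) rule."""
    V, values, errors = 0.0, [], []
    for r in rewards:
        pe = r - V              # prediction error on this trial
        values.append(V)        # expected value *before* the outcome
        errors.append(pe)
        V += alpha * pe         # learning update
    return np.array(values), np.array(errors)

rng = np.random.default_rng(2)
rewards = (rng.random(40) < 0.8).astype(float)   # an 80%-rewarded stimulus (assumed)
V, PE = delta_rule(rewards)
```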

  8. Making Sense out of Sex Stereotypes in Advertising: A Feminist Analysis of Assumptions.

    Science.gov (United States)

    Ferrante, Karlene

    Sexism and racism in advertising have been well documented, but feminist research aimed at social change must go beyond existing content analyses to ask how advertising is created. Analysis of the "mirror assumption" (advertising reflects society) and the "gender assumption" (advertising speaks in a male voice to female…

  9. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    Full Text Available In the Classical School founded by Adam Smith, which gives prominence to supply and adopts an approach of neutral public finance, the economy is always in a state of full-employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that prices and wages are flexible, and which regards public debt as an extraordinary instrument, state interference in economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy rests on three basic assumptions: the "Consumer State" assumption, the assumption that "public expenditures are always ineffectual", and the assumption of the "impartiality of the tax and expenditure policies implemented by the state". The Keynesian School founded by John Maynard Keynes, by contrast, gives prominence to demand, adopts the approach of functional finance, and asserts that underemployment and over-employment equilibria exist in the economy alongside full-employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, that state interference is essential, and that fiscal policy must therefore be utilized effectively. Keynesian fiscal policy likewise rests on three primary assumptions: the "Filter State" assumption, the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  10. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Everyday millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  11. Keeping Things Simple: Why the Human Development Index Should Not Diverge from Its Equal Weights Assumption

    Science.gov (United States)

    Stapleton, Lee M.; Garrod, Guy D.

    2007-01-01

    Using a range of statistical criteria rooted in Information Theory we show that there is little justification for relaxing the equal weights assumption underlying the United Nation's Human Development Index (HDI) even if the true HDI diverges significantly from this assumption. Put differently, the additional model complexity that unequal weights…

  12. Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie; Bekker, Tilde; Barendregt, Wolmet

    2016-01-01

    on those assumptions and the possible influences on their design decisions? How can we make the assumptions explicit, discuss them in the IDC community and use the discussion to develop higher quality design and research? The workshop will support discussion between researchers, designers and practitioners...

  13. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove Bell's theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without introducing additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  14. Making Foundational Assumptions Transparent: Framing the Discussion about Group Communication and Influence

    Science.gov (United States)

    Meyers, Renee A.; Seibold, David R.

    2009-01-01

    In this article, the authors seek to augment Dean Hewes's (1986, 1996) intriguing bracketing and admirable larger effort to "return to basic theorizing in the study of group communication" by making transparent the foundational, and debatable, assumptions that underlie those models. Although these assumptions are addressed indirectly by Hewes, the…

  15. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  16. Assessing Key Assumptions of Network Meta-Analysis: A Review of Methods

    Science.gov (United States)

    Donegan, Sarah; Williamson, Paula; D'Alessandro, Umberto; Tudur Smith, Catrin

    2013-01-01

    Background: Homogeneity and consistency assumptions underlie network meta-analysis (NMA). Methods exist to assess the assumptions but they are rarely and poorly applied. We review and illustrate methods to assess homogeneity and consistency. Methods: Eligible articles focussed on indirect comparison or NMA methodology. Articles were sought by…

  17. Teaching Lessons in Exclusion: Researchers' Assumptions and the Ideology of Normality

    Science.gov (United States)

    Benincasa, Luciana

    2012-01-01

    Filling in a research questionnaire means coming into contact with the researchers' assumptions. In this sense filling in a questionnaire may be described as a learning situation. In this paper I carry out discourse analysis of selected questionnaire items from a number of studies, in order to highlight underlying values and assumptions, and their…

  18. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. They lead to using linear regression when it is inappropriate, and to employing alternative procedures with less statistical power when it is unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for a heightened awareness of and increased transparency in the reporting of statistical assumption checking. PMID:28533971
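
    For readers wanting to avoid the misconception the review documents, the normality check belongs on the residuals, not on the raw variables. A minimal sketch with made-up data; the Shapiro-Wilk test is one common choice here, not a recommendation from the paper.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)      # illustrative data only

model = sm.OLS(y, sm.add_constant(x)).fit()

# The normality assumption concerns the errors, so test the residuals,
# not x or y themselves.
w, p = stats.shapiro(model.resid)
print(f"Shapiro-Wilk on residuals: W={w:.3f}, p={p:.3f}")
```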

  19. Assessing the assumption of symmetric proximity measures in the context of multidimensional scaling.

    Science.gov (United States)

    Kelley, Ken

    2004-01-01

    Applications of multidimensional scaling often make the assumption of symmetry for the population matrix of proximity measures. Although the likelihood of such an assumption holding true varies from one area of research to another, formal assessment of such an assumption has received little attention. The present article develops a nonparametric procedure that can be used in a confirmatory fashion or in an exploratory fashion in order to probabilistically assess the assumption of population symmetry for proximity measures in a multidimensional scaling context. The proposed procedure makes use of the bootstrap technique and alleviates the assumptions of parametric statistical procedures. Computer code for R and S-Plus is included in an appendix in order to carry out the proposed procedures.
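
    The paper supplies its own R and S-Plus code in an appendix; the sketch below is only a rough Python analogue of the general idea, comparing the observed asymmetry of the mean proximity matrix against a transposition null. The statistic, the null, and the resampling scheme are assumptions for illustration, not the author's exact procedure.

```python
import numpy as np

def asymmetry(mat):
    """Largest absolute difference between mirrored cells of a square matrix."""
    return np.max(np.abs(mat - mat.T))

def permutation_symmetry(data, n_perm=2000, seed=0):
    """Illustrative check of population symmetry for subject-wise proximity
    matrices. data: array of shape (n_subjects, k, k). Under the null that a
    matrix and its transpose are exchangeable, randomly transposing each
    subject's matrix leaves the statistic's distribution unchanged."""
    rng = np.random.default_rng(seed)
    observed = asymmetry(data.mean(axis=0))
    count = 0
    for _ in range(n_perm):
        flip = rng.random(len(data)) < 0.5
        perm = np.where(flip[:, None, None], data.transpose(0, 2, 1), data)
        if asymmetry(perm.mean(axis=0)) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

# Hypothetical use: 30 subjects rating proximities among 5 stimuli.
rng = np.random.default_rng(1)
data = rng.random((30, 5, 5))
print(permutation_symmetry(data))
```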

  20. Modelling and analysis of dynamics of viral infection of cells and of interferon resistance

    Science.gov (United States)

    Getto, Ph.; Kimmel, M.; Marciniak-Czochra, A.

    2008-08-01

    Interferons are active biomolecules, which help fight viral infections by spreading from infected to uninfected cells and activate effector molecules, which confer resistance from the virus on cells. We propose a new model of dynamics of viral infection, including endocytosis, cell death, production of interferon and development of resistance. The novel element is a specific biologically justified mechanism of interferon action, which results in dynamics different from other infection models. The model reflects conditions prevailing in liquid cultures (ideal mixing), and the absence of cells or virus influx from outside. The basic model is a nonlinear system of five ordinary differential equations. For this variant, it is possible to characterise global behaviour, using a conservation law. Analytic results are supplemented by computational studies. The second variant of the model includes age-of-infection structure of infected cells, which is described by a transport-type partial differential equation for infected cells. The conclusions are: (i) If virus mortality is included, the virus becomes eventually extinct and subpopulations of uninfected and resistant cells are established. (ii) If virus mortality is not included, the dynamics may lead to extinction of uninfected cells. (iii) Switching off the interferon defense results in a decrease of the sum total of uninfected and resistant cells. (iv) Infection-age structure of infected cells may result in stabilisation or destabilisation of the system, depending on detailed assumptions. Our work seems to constitute the first comprehensive mathematical analysis of the cell-virus-interferon system based on biologically plausible hypotheses.
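
    The abstract does not list the five equations, so the sketch below is a hypothetical system with the stated compartments (uninfected, infected, and resistant cells, free virus, interferon) and the stated mechanisms (endocytosis, cell death, interferon production, acquired resistance). All rates and functional forms are assumptions, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, beta=1e-3, phi=5e-4, delta=0.5, p=10.0, c=0.3, q=1.0, d=0.8):
    """Illustrative five-ODE cell-virus-interferon system.
    U: uninfected cells, I: infected cells, R: resistant cells,
    V: free virus, F: interferon. Parameters are placeholders."""
    U, I, R, V, F = y
    infection = beta * U * V        # endocytosis of virus by uninfected cells
    protection = phi * U * F        # interferon confers resistance
    dU = -infection - protection
    dI = infection - delta * I      # infected cells die at rate delta
    dR = protection
    dV = p * I - c * V - infection  # virus production, mortality, uptake
    dF = q * I - d * F              # interferon secreted by infected cells
    return [dU, dI, dR, dV, dF]

sol = solve_ivp(rhs, (0.0, 50.0), [1e3, 0.0, 0.0, 10.0, 0.0])
U, I, R, V, F = sol.y[:, -1]   # with virus mortality c > 0, V decays toward 0
```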

  1. Pathways to plausibility

    DEFF Research Database (Denmark)

    Wahlberg, Ayo

    2008-01-01

    Herbal medicine has long been contrasted to modern medicine in terms of a holistic approach to healing, vitalistic theories of health and illness and an emphasis on the body’s innate self-healing capacities. At the same time, since the early 20th century, the cultivation, preparation and mass...

  2. Pathways to plausibility

    DEFF Research Database (Denmark)

    Wahlberg, Ayo

    2008-01-01

    Herbal medicine has long been contrasted to modern medicine in terms of a holistic approach to healing, vitalistic theories of health and illness and an emphasis on the body’s innate self-healing capacities. At the same time, since the early 20th century, the cultivation, preparation and mass pro...... as normalised, with herbalists, phytochemists and pharmacologists working to develop standardised production procedures as well as to identify ‘plausible’ explanations for the efficacy of these remedies....

  3. High dynamic range images for enhancing low dynamic range content

    OpenAIRE

    Banterle, Francesco; Dellepiane, Matteo; Scopigno, Roberto

    2011-01-01

    This poster presents a practical system for enhancing the quality of Low Dynamic Range (LDR) videos using High Dynamic Range (HDR) background images. Our technique relies on the assumption that the HDR information is static in the video footage. This assumption can be valid in many scenarios where moving subjects are the main focus of the footage and do not have to interact with moving light sources or highly reflective objects. Another valid scenario is teleconferencing via webcams, where th...

  4. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment.
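
    As one concrete example of an assumption linking raw counts to expression measures: median-of-ratios normalization (the DESeq approach, used here for illustration and not necessarily the article's choice) assumes most genes are not differentially expressed, so the median ratio to a pseudo-reference captures each sample's scaling. A minimal sketch:

```python
import numpy as np

def median_of_ratios(counts):
    """Size-factor normalization assuming most genes are not differentially
    expressed (DESeq-style median of ratios).

    counts: (genes, samples) matrix of raw read counts.
    """
    log_counts = np.log(counts.astype(float))
    log_ref = log_counts.mean(axis=1)          # geometric-mean pseudo-reference
    ok = np.isfinite(log_ref)                  # drop genes with any zero count
    size_factors = np.exp(
        np.median(log_counts[ok] - log_ref[ok, None], axis=0)
    )
    return counts / size_factors               # normalized counts

counts = np.array([[100, 200], [50, 110], [30, 55], [80, 170]])
print(median_of_ratios(counts))
```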

  5. An Exploration of Dental Students' Assumptions About Community-Based Clinical Experiences.

    Science.gov (United States)

    Major, Nicole; McQuistan, Michelle R

    2016-03-01

    The aim of this study was to ascertain which assumptions dental students recalled feeling prior to beginning community-based clinical experiences and whether those assumptions were fulfilled or challenged. All fourth-year students at the University of Iowa College of Dentistry & Dental Clinics participate in community-based clinical experiences. At the completion of their rotations, they write a guided reflection paper detailing the assumptions they had prior to beginning their rotations and assessing the accuracy of their assumptions. For this qualitative descriptive study, the 218 papers from three classes (2011-13) were analyzed for common themes. The results showed that the students had a variety of assumptions about their rotations. They were apprehensive about working with challenging patients, performing procedures for which they had minimal experience, and working too slowly. In contrast, they looked forward to improving their clinical and patient management skills and knowledge. Other assumptions involved the site (e.g., the equipment/facility would be outdated; protocols/procedures would be similar to the dental school's). Upon reflection, students reported experiences that both fulfilled and challenged their assumptions. Some continued to feel apprehensive about treating certain patient populations, while others found it easier than anticipated. Students were able to treat multiple patients per day, which led to increased speed and patient management skills. However, some reported challenges with time management. Similarly, students were surprised to discover some clinics were new/updated although some had limited instruments and materials. Based on this study's findings about students' recalled assumptions and reflective experiences, educators should consider assessing and addressing their students' assumptions prior to beginning community-based dental education experiences.

  6. Some Finite Sample Properties and Assumptions of Methods for Determining Treatment Effects

    DEFF Research Database (Denmark)

    Petrovski, Erik

    2016-01-01

    This paper will compare assumptions and properties of select methods for determining treatment effects with Monte Carlo simulation. The comparison will highlight the pros and cons of using one method over another and the assumptions that researchers need to make for the method they choose. Three popular methods for determining treatment effects were chosen: ordinary least squares regression, propensity score matching, and inverse probability weighting. The assumptions and properties tested across these methods are: unconfoundedness, differences in average treatment effects and treatment effects on the treated, overlap…

  7. Troubling 'lived experience': a post-structural critique of mental health nursing qualitative research assumptions.

    Science.gov (United States)

    Grant, A

    2014-08-01

    Qualitative studies in mental health nursing research deploying the 'lived experience' construct are often written on the basis of conventional qualitative inquiry assumptions. These include the presentation of the 'authentic voice' of research participants, related to their 'lived experience' and underpinned by a meta-assumption of the 'metaphysics of presence'. This set of assumptions is critiqued on the basis of contemporary post-structural qualitative scholarship. Implications for mental health nursing qualitative research emerging from this critique are described in relation to illustrative published work, and some benefits and challenges for researchers embracing post-structural sensibilities are outlined.

  8. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    …the very large number of flows explaining the observed secular variation under the frozen-flux assumption alone. More recently, it has been shown that the combined frozen-flux and tangentially geostrophic assumptions translate into constraints on the secular variation whose mathematics are now well understood. Using these constraints, we test the combined frozen-flux and tangentially geostrophic assumptions against recent, high-precision magnetic data provided by the Ørsted and CHAMP satellites. The methodology involves building constrained field models using least-squares methods. Two types of models…

  9. The induction ability of qualitative plausibility measures in default reasoning

    Institute of Scientific and Technical Information of China (English)

    霍旭辉; 寇辉

    2011-01-01

    In this paper, the authors investigate the induction ability of qualitative plausibility measures in default reasoning logic systems (System P), and obtain the conditions under which general qualitative plausibility measures and possibility measures have the same induction ability in default reasoning.

  10. Plausible Drug Targets in the Streptococcus mutans Quorum Sensing Pathways to Combat Dental Biofilms and Associated Risks.

    Science.gov (United States)

    Kaur, Gurmeet; Rajesh, Shrinidhi; Princy, S Adline

    2015-12-01

    Streptococcus mutans, a Gram-positive facultative anaerobe, is one among the approximately seven hundred bacterial species that exist in the human buccal cavity and cause dental caries. Quorum sensing (QS) is a cell-density-dependent communication process that responds to inter/intra-species signals and elicits behavioral changes that shift the bacteria to more aggressive forms. In accordance with this phenomenon, S. mutans harbors a Competence Stimulating Peptide (CSP)-mediated quorum-sensing system, ComCDE (a two-component regulatory system), to regulate several virulence-associated traits, including the formation of the oral biofilm (dental plaque), genetic competence, and acidogenicity. The QS-mediated adherence of S. mutans to the tooth surface (dental plaque) imparts antibiotic resistance to the bacterium and can progress to a chronic state known as periodontitis. In recent years, the oral streptococcus S. mutans has been recognized not only for its cariogenic potential but also for worsening infective endocarditis, owing to its inherent ability to colonize and form biofilm on heart valves. The review highlights the increasing complexity of the CSP-mediated quorum-sensing pathway, with a special emphasis on identifying plausible drug targets within the system for the development of anti-quorum drugs to control biofilm formation and associated risks.

  11. Three-layered metallodielectric nanoshells: plausible meta-atoms for metamaterials with isotropic negative refractive index at visible wavelengths.

    Science.gov (United States)

    Wu, DaJian; Jiang, ShuMin; Cheng, Ying; Liu, XiaoJun

    2013-01-14

    A three-layered Ag-low-permittivity (LP)-high-permittivity (HP) nanoshell is proposed as a plausible meta-atom for building three-dimensional isotropic negative-refractive-index metamaterials (NIMs). The overlap between the electric and magnetic responses of the Ag-LP-HP nanoshell can be realized by designing the geometry of the particle, which can lead to negative electric and magnetic polarizabilities. A negative refractive index is then found in a random arrangement of Ag-LP-HP nanoshells. In particular, modulation of the middle LP layer can move the negative-refractive-index range into the visible region. Because the responses arise from each meta-atom, the metamaterial is intrinsically isotropic and polarization independent. It is further found that, as the LP layer thickness increases, the negative-refractive-index range of the random arrangement shows a large blue-shift and becomes narrower. As the filling fraction decreases, the negative-refractive-index range blue-shifts and narrows, while the maximum magnitude of the negative refractive index decreases.

  12. Non-canonical 3'-5' extension of RNA with prebiotically plausible ribonucleoside 2',3'-cyclic phosphates.

    Science.gov (United States)

    Mutschler, Hannes; Holliger, Philipp

    2014-04-09

    Ribonucleoside 2',3'-cyclic phosphates (N>p's) are generated by multiple prebiotically plausible processes and are credible building blocks for the assembly of early RNA oligomers. While N>p's can be polymerized into short RNAs by non-enzymatic processes with variable efficiency and regioselectivity, no enzymatic route for RNA synthesis had been described. Here we report such a non-canonical 3'-5' nucleotidyl transferase activity. We engineered a variant of the hairpin ribozyme to catalyze addition of all four N>p's (2',3'-cyclic A-, G-, U-, and CMP) to the 5'-hydroxyl termini of RNA strands with 5' nucleotide addition enhanced in all cases by eutectic ice phase formation at -7 °C. We also observed 5' addition of 2',3'-cyclic phosphate-activated β-nicotinamide adenine dinucleotide (NAD>p) and ACA>p RNA trinucleotide, and multiple additions of GUCCA>p RNA pentamers. Our results establish a new mode of RNA 3'-5' extension with implications for RNA oligomer synthesis from prebiotic nucleotide pools.

  13. Plausibility of stromal initiation of epithelial cancers without a mutation in the epithelium: a computer simulation of morphostats

    Directory of Open Access Journals (Sweden)

    Cappuccio Antonio

    2009-03-01

    Full Text Available Background: There is experimental evidence from animal models favoring the notion that the disruption of interactions between stroma and epithelium plays an important role in the initiation of carcinogenesis. These disrupted interactions are hypothesized to be mediated by molecules, termed morphostats, which diffuse through the tissue to determine cell phenotype and maintain tissue architecture. Methods: We developed a computer simulation based on simple properties of cell renewal and morphostats. Results: Under the computer simulation, the disruption of the morphostat gradient in the stroma generated epithelial precursors of cancer without any mutation in the epithelium. Conclusion: The model is consistent with the possibility that the accumulation of genetic and epigenetic changes found in tumors could arise after the formation of a founder population of aberrant cells, defined as cells that are created by low or insufficient morphostat levels and that no longer respond to morphostat concentrations. Because the model is biologically plausible, we hope that these results will stimulate further experiments.
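
    The paper's simulation is not reproduced in the abstract; the toy sketch below only illustrates the core mechanism it describes: a stromal source of a diffusing morphostat whose loss creates aberrant (low-morphostat) epithelial cells without any mutation. The geometry, parameters, and update rule are all assumptions.

```python
import numpy as np

def simulate_morphostat(n_cells=50, steps=5000, D=0.4, decay=0.001,
                        source=1.0, threshold=0.05, disrupt=False):
    """Toy 1-D rendering of the morphostat idea (all values illustrative):
    stroma at one end secretes a diffusible factor; epithelial cells whose
    local level falls below a threshold adopt an aberrant phenotype."""
    m = np.zeros(n_cells)
    for _ in range(steps):
        lap = np.zeros_like(m)
        lap[1:-1] = m[:-2] + m[2:] - 2.0 * m[1:-1]
        lap[-1] = 2.0 * (m[-2] - m[-1])        # no-flux far boundary
        m = m + D * lap - decay * m            # explicit diffusion + decay
        m[0] = 0.0 if disrupt else source      # stromal source (or its loss)
    return m < threshold                        # True = aberrant cell

print(simulate_morphostat().sum(), "aberrant cells with intact stroma")
print(simulate_morphostat(disrupt=True).sum(), "after stromal disruption")
```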

  14. Bilinguals' Plausibility Judgments for Phrases with a Literal vs. Non-literal Meaning: The Influence of Language Brokering Experience

    Directory of Open Access Journals (Sweden)

    Belem G. López

    2017-09-01

    Full Text Available Previous work has shown that prior experience in language brokering (informal translation) may facilitate the processing of meaning within and across language boundaries. The present investigation examined the influence of brokering on bilinguals' processing of two-word collocations with either a literal or a figurative meaning in each language. Proficient Spanish-English bilinguals classified as brokers or non-brokers were asked to judge whether adjective+noun phrases presented in each language made sense or not. Phrases with a literal meaning (e.g., stinging insect) were interspersed with phrases with a figurative meaning (e.g., stinging insult) and nonsensical phrases (e.g., stinging picnic). It was hypothesized that plausibility judgments would be facilitated for literal relative to figurative meanings in each language, but that experience in language brokering would be associated with a more equivalent pattern of responding across languages. These predictions were confirmed. The findings add to the body of empirical work on individual differences in language processing in bilinguals associated with prior language brokering experience.

  15. Synchronous volcanic eruptions and abrupt climate change ∼17.7 ka plausibly linked by stratospheric ozone depletion.

    Science.gov (United States)

    McConnell, Joseph R; Burke, Andrea; Dunbar, Nelia W; Köhler, Peter; Thomas, Jennie L; Arienzo, Monica M; Chellman, Nathan J; Maselli, Olivia J; Sigl, Michael; Adkins, Jess F; Baggenstos, Daniel; Burkhart, John F; Brook, Edward J; Buizert, Christo; Cole-Dai, Jihong; Fudge, T J; Knorr, Gregor; Graf, Hans-F; Grieman, Mackenzie M; Iverson, Nels; McGwire, Kenneth C; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H; Saltzman, Eric S; Severinghaus, Jeffrey P; Steffensen, Jørgen Peder; Taylor, Kendrick C; Winckler, Gisela

    2017-09-05

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ∼17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ∼192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics-similar to those associated with modern stratospheric ozone depletion over Antarctica-plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ∼17.7 ka.

  16. X-ray investigation of the diffuse emission around plausible gamma-ray emitting pulsar wind nebulae in Kookaburra region

    CERN Document Server

    Kishishita, Tetsuichi; Uchiyama, Yasunobu; Tanaka, Yasuyuki; Takahashi, Tadayuki

    2012-01-01

    We report on the results from Suzaku X-ray observations of the radio complex region called Kookaburra, which includes two adjacent TeV γ-ray sources HESS J1418-609 and HESS J1420-607. The Suzaku observation revealed X-ray diffuse emission around a middle-aged pulsar PSR J1420-6048 and a plausible PWN Rabbit with elongated sizes of σ_X = 1′.66 and σ_X = 1′.49, respectively. The peaks of the diffuse X-ray emission are located within the γ-ray excess maps obtained by H.E.S.S. and the offsets from the γ-ray peaks are 2′.8 for PSR J1420-6048 and 4′.5 for Rabbit. The X-ray spectra of the two sources were well reproduced by absorbed power-law models with Γ = 1.7-2.3. The spectral shapes tend to become softer according to the distance from the X-ray peaks. Assuming the one-zone electron emission model as the first-order approximation, the ambient magnetic field strengths of HESS J1420-607 and HESS J1418-609 can be estimate...

  17. Bethe-Heitler cascades as a plausible origin of hard spectra in distant TeV blazars

    CERN Document Server

    Zheng, Y G; Kang, S J

    2016-01-01

    Context. Very high-energy (VHE) γ-ray measurements of distant TeV blazars can be nicely explained by TeV spectra induced by ultra-high-energy cosmic rays. Aims. We develop a model for a plausible origin of hard spectra in distant TeV blazars. Methods. In the model, the TeV emission in distant TeV blazars is dominated by two mixed components. The first is the internal component, with photon energies around 1 TeV, produced by inverse Compton scattering of the relativistic electrons on the synchrotron photons (SSC), with a correction for extragalactic background light absorption; the other is the external component, with photon energies above 1 TeV, produced by cascade emission from high-energy protons propagating through intergalactic space. Results. Assuming suitable model parameters, we apply the model to the observed spectra of the distant TeV blazar 1ES 0229+200. Our results show that 1) the observed spectrum properties of 1ES 0229+200, especially the TeV γ-ray tail of the observed spect...

  18. Simultaneous observations of a pair of kilohertz QPOs and a plausible 1860 Hz QPO from an accreting neutron star system

    CERN Document Server

    Bhattacharyya, Sudip

    2009-01-01

    We report an indication (3.22 sigma) of ~ 1860 Hz quasi-periodic oscillations from a neutron star low-mass X-ray binary 4U 1636-536. If confirmed, this will be by far the highest frequency feature observed from an accreting neutron star system, and hence could be very useful to understand such systems. This plausible timing feature was observed simultaneously with lower (~ 585 Hz) and upper (~ 904 Hz) kilohertz quasi-periodic oscillations. The two kilohertz quasi-periodic oscillation frequencies had the ratio of ~ 1.5, and the frequency of the alleged ~ 1860 Hz feature was close to the triple and the double of these frequencies. This can be useful to constrain the models of all the three features. In particular, the ~ 1860 Hz feature could be (1) from a new and heretofore unknown class of quasi-periodic oscillations, or (2) the first observed overtone of lower or upper kilohertz quasi-periodic oscillations. Finally we note that, although the relatively low significance of the ~ 1860 Hz feature argues for caut...

  19. Synchronous volcanic eruptions and abrupt climate change ˜17.7 ka plausibly linked by stratospheric ozone depletion

    Science.gov (United States)

    McConnell, Joseph R.; Burke, Andrea; Dunbar, Nelia W.; Köhler, Peter; Thomas, Jennie L.; Arienzo, Monica M.; Chellman, Nathan J.; Maselli, Olivia J.; Sigl, Michael; Adkins, Jess F.; Baggenstos, Daniel; Burkhart, John F.; Brook, Edward J.; Buizert, Christo; Cole-Dai, Jihong; Fudge, T. J.; Knorr, Gregor; Graf, Hans-F.; Grieman, Mackenzie M.; Iverson, Nels; McGwire, Kenneth C.; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H.; Saltzman, Eric S.; Severinghaus, Jeffrey P.; Steffensen, Jørgen Peder; Taylor, Kendrick C.; Winckler, Gisela

    2017-09-01

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ˜17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ˜192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics—similar to those associated with modern stratospheric ozone depletion over Antarctica—plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ˜17.7 ka.

  20. Removal of hazardous organics from water using metal-organic frameworks (MOFs): plausible mechanisms for selective adsorptions.

    Science.gov (United States)

    Hasan, Zubair; Jhung, Sung Hwa

    2015-01-01

    Provision of clean water is one of the most important issues worldwide because of continuing economic development and the steady increase in the global population. However, clean water resources are decreasing everyday, because of contamination with various pollutants including organic chemicals. Pharmaceutical and personal care products, herbicides/pesticides, dyes, phenolics, and aromatics (from sources such as spilled oil) are typical organics that should be removed from water. Because of their huge porosities, designable pore structures, and facile modification, metal-organic frameworks (MOFs) are used in various adsorption, separation, storage, and delivery applications. In this review, the adsorptive purifications of contaminated water with MOFs are discussed, in order to understand possible applications of MOFs in clean water provision. More importantly, plausible adsorption or interaction mechanisms and selective adsorptions are summarized. The mechanisms of interactions such as electrostatic interaction, acid-base interaction, hydrogen bonding, π-π stacking/interaction, and hydrophobic interaction are discussed for the selective adsorption of organics over MOFs. The adsorption mechanisms will be very helpful not only for understanding adsorptions but also for applications of adsorptions in selective removal, storage, delivery and so on.

  1. A new scenario framework for climate change research: the concept of shared climate policy assumptions

    NARCIS (Netherlands)

    Kriegler, E.; Edmonds, J.; Hallegatte, S.; Ebi, K.L.; Kram, T.; Riahi, K.; Winkler, J.; van Vuuren, Detlef|info:eu-repo/dai/nl/11522016X

    2014-01-01

    The new scenario framework facilitates the coupling of multiple socioeconomic reference pathways with climate model products using the representative concentration pathways. This will allow for improved assessment of climate impacts, adaptation and mitigation. Assumptions about climate policy play a

  2. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  3. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  4. Learning disabilities theory and Soviet psychology: a comparison of basic assumptions.

    Science.gov (United States)

    Coles, G S

    1982-09-01

    Critics both within and outside the Learning Disabilities (LD) field have pointed to the weaknesses of LD theory. Beginning with the premise that a significant problem of LD theory has been its failure to explore fully its fundamental assumptions, this paper examines a number of these assumptions about individual and social development, cognition, and learning. These assumptions are compared with a contrasting body of premises found in Soviet psychology, particularly in the works of Vygotsky, Leontiev, and Luria. An examination of the basic assumptions of LD theory and Soviet psychology shows that a major difference lies in their respective nondialectical and dialectical interpretation of the relationship of social factors and cognition, learning, and neurological development.

  5. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    Full Text Available The present study addresses the effects of traumatic events, such as the September 11 attacks, on victims' fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth; they thus ground people's sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were deeply traumatic for Americans because they fundamentally changed how Americans understood many aspects of life, and they led many individuals to build new kinds of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident, and Don DeLillo's Falling Man reflects the traumatic repercussions of the disaster on Americans' fundamental assumptions. The objective of this study is to examine the novel from the perspective of the trauma that afflicted the victims' fundamental understandings of the world and the self. Individuals' fundamental understandings can be changed or modified by exposure to certain types of events, such as war, terrorism, political violence, or even a sense of alienation. Ronnie Janoff-Bulman's Assumptive World theory is used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perspective on the field of trauma that can help trauma victims adopt alternative assumptions, or reshape their previous ones, to heal from traumatic effects.

  6. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

    Full Text Available The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers in this matter belongs to Milton Friedman, who defends the use of unrealistic assumptions not only on pragmatic grounds but also because of the intrinsic difficulty of determining the extent of realism. Realists, on the other hand, have criticized (and still criticize) the use of unrealistic assumptions such as rational choice, perfect information, and homogeneous goods. However, they have not accompanied their statements with a proper epistemological argument to support their positions. This work aims to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (real economies) is compatible not with the logic of invariance and of mechanisms, but with the logic of possibility trees. Because of this, models will function not as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world will be examined in terms of the verisimilitude of a class of model assumptions.

  7. Post-traumatic stress and world assumptions: the effects of religious coping.

    Science.gov (United States)

    Zukerman, Gil; Korn, Liat

    2014-12-01

    Religiosity has been shown to moderate the negative effects of traumatic event experiences. The current study was designed to examine the relationships among post-traumatic stress (PTS) following traumatic event exposure; world assumptions, defined as basic cognitive schemas regarding the world and the self; and religious coping, conceptualized as drawing on religious beliefs and practices for understanding and dealing with life stressors. The study examined 777 Israeli undergraduate students who completed several questionnaires sampling individual world assumptions and religious coping, in addition to measuring PTS as manifested by the PTSD Checklist. Results indicate that positive religious coping was significantly associated with more positive world assumptions, while negative religious coping was significantly associated with more negative world assumptions. Additionally, negative world assumptions were significantly associated with more avoidance symptoms, while reporting higher rates of traumatic event exposure was significantly associated with more hyper-arousal. These findings suggest that religious-related cognitive schemas directly affect world assumptions by creating protective shields that may prevent the negative effects of confronting an extreme negative experience.

  8. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive map (FCM) is developed to quantify the framing assumptions in the assessment stage of a HIA, and is then applied to a housing intervention (tightening insulation) as a case-study. Framing assumptions of the case-study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in a HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
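
    A fuzzy cognitive map is, at its core, a signed weight matrix iterated through a squashing function until concept activations settle. A minimal sketch using the case-study's named variables (air-tightness, ventilation, indoor air quality, mould/humidity); the weights are invented for illustration and are not the paper's elicited values.

```python
import numpy as np

def run_fcm(W, x0, steps=50, lam=1.0):
    """Iterate a fuzzy cognitive map toward a (possible) fixed point.
    W[i, j] is the signed influence of concept i on concept j;
    activations are squashed to (0, 1) with a logistic function."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-lam * (x + x @ W)))
    return x

# Illustrative concepts: 0 insulation, 1 air-tightness, 2 ventilation,
# 3 indoor air quality, 4 mould/humidity, 5 health. Weights are made up.
W = np.zeros((6, 6))
W[0, 1] = 0.8    # tighter insulation raises air-tightness
W[1, 2] = -0.7   # air-tightness reduces ventilation
W[2, 3] = 0.6    # ventilation improves indoor air quality
W[2, 4] = -0.5   # ventilation suppresses mould/humidity
W[3, 5] = 0.7    # air quality improves health
W[4, 5] = -0.8   # mould harms health
print(run_fcm(W, [1, 0, 0.5, 0.5, 0.5, 0.5]))
```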

  9. Agreement Theorems in Dynamic-Epistemic Logic

    NARCIS (Netherlands)

    Degremont, Cedric; Roy, Oliver

    2012-01-01

    This paper introduces Agreement Theorems to dynamic-epistemic logic. We show first that common belief of posteriors is sufficient for agreement in epistemic-plausibility models, under common and well-founded priors. We do not restrict ourselves to the finite case, showing that in countable structure

  10. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…
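
    Matrix sampling leaves each respondent with planned missingness on the context questionnaire, which chained-equations imputation then fills variable by variable conditional on the rest. A rough sketch of that general idea with scikit-learn; this is an analogue only, not the operational PISA machinery or the article's exact methods.

```python
import numpy as np
# IterativeImputer is experimental in scikit-learn and must be enabled first.
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
X[:, 3] += X[:, :3].sum(axis=1)          # correlated background variable
mask = rng.random(X.shape) < 0.3         # planned missingness by design
X_obs = np.where(mask, np.nan, X)

# sample_posterior=True draws from the conditional model, as needed for
# proper multiple imputation; rerun with different seeds for each dataset.
imputed = IterativeImputer(sample_posterior=True,
                           random_state=0).fit_transform(X_obs)
```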

  11. World Literacy Prospects at the Turn of the Century: Is the Objective of Literacy for All by the Year 2000 Statistically Plausible?

    Science.gov (United States)

    Carceles, Gabriel

    1990-01-01

    Describes status and challenge of worldwide illiteracy. Discusses statistical plausibility of universal literacy by 2000. Predicts literacy universalization will take from 14 to 21 years, depending on region, if 1980s trends continue. Implies literacy work requires action strategies commensurate with problem, including national programs and mass…

  12. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas they are remarkably overestimated under the random overlap (RO) assumption, in comparison with those using CRM inherent cloud geometry. These biases can reach as high as 100 W m−2 for SWCF and 60 W m−2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in the GenO assumption. Further examinations indicate that the CRM-diagnosed Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account the variation of Lcf in the vertical well reproduces such a relationship and
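
    The Hogan and Illingworth (2000) combination the abstract builds on can be stated compactly: the projected cover of two layers is a blend of the maximum-overlap and random-overlap limits, weighted by exp(-Δz/Lcf). A minimal sketch; the layer covers, separation, and decorrelation length below are illustrative values.

```python
import numpy as np

def combined_cover(c1, c2, dz, lcf):
    """Total projected cover of two cloud layers under the general overlap
    (GenO) assumption: a blend of maximum and random overlap weighted by
    alpha = exp(-dz / Lcf)."""
    c_max = max(c1, c2)              # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2       # random-overlap limit
    alpha = np.exp(-dz / lcf)        # decorrelation weighting
    return alpha * c_max + (1.0 - alpha) * c_rand

# Nearby layers overlap nearly maximally; distant layers nearly randomly.
print(combined_cover(0.4, 0.5, dz=0.2, lcf=2.0))   # close to max(0.4, 0.5)
print(combined_cover(0.4, 0.5, dz=10.0, lcf=2.0))  # close to 0.4 + 0.5 - 0.2
```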

  13. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    Science.gov (United States)

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  14. Combination of monoclonal antibodies and DPP-IV inhibitors in the treatment of type 1 diabetes: a plausible treatment modality?

    Science.gov (United States)

    Dubala, Anil; Gupta, Ankur; Samanta, Malay K

    2014-07-01

    Regulatory T cells (Tregs) are crucial for the maintenance of immunological tolerance, and type 1 diabetes (T1D) occurs when this immune-regulatory mechanism fails. T1D can be reversed by islet transplantation, but at the cost of the adverse effects of persistent immune suppression. T1D is believed to depend on the activation of type-1 helper T (Th1) cells, and a failure of immune tolerance underlies that activation. The important role of Th1 cells in the pathology of T1D motivated the depletion of CD4(+) T cells, which initiated the use of monoclonal antibodies (mAbs) against CD4(+) T cells to interfere with the induction of T1D. Preventing autoimmunity is not only a step forward for the treatment of T1D but could also restore the β-cell mass. Glucagon-like peptide (GLP)-1 stimulates β-cell proliferation and also has anti-apoptotic effects on these cells. However, the potential use of GLP-1 to restore pancreatic β-cells is limited by its rapid degradation by dipeptidyl peptidase (DPP)-IV. We hypothesize that treatment with a combination of CD4 mAbs and DPP-IV inhibitors could prevent or reverse T1D: CD4 mAbs can induce immune tolerance, thereby arresting further progression of T1D, while DPP-IV inhibitors can regenerate the β-cell mass. Consequently, the combination of CD4 mAbs and a DPP-IV inhibitor could avoid, or at least minimize, the constraints of intensive subcutaneous insulin therapy. We presume that if this hypothesis proves correct, it may become one of the plausible therapeutic options for T1D.

  15. Collapsin Response Mediator Protein-2 (CRMP2) is a Plausible Etiological Factor and Potential Therapeutic Target in Alzheimer's Disease: Comparison and Contrast with Microtubule-Associated Protein Tau.

    Science.gov (United States)

    Hensley, Kenneth; Kursula, Petri

    2016-04-15

    Alzheimer's disease (AD) has long been viewed as a pathology that must be caused either by aberrant amyloid-β protein precursor (AβPP) processing, dysfunctional tau protein processing, or a combination of these two factors. This is a reasonable assumption because amyloid-β peptide (Aβ) accumulation and tau hyperphosphorylation are the defining histological features in AD, and because AβPP and tau mutations can cause AD in humans or AD-like features in animal models. Nonetheless, other protein players are emerging that one can argue are significant etiological players in subsets of AD and potentially novel, druggable targets. In particular, the microtubule-associated protein CRMP2 (collapsin response mediator protein-2) bears striking analogies to tau and is similarly relevant to AD. Like tau, CRMP2 dynamically regulates microtubule stability; it is acted upon by the same kinases; collects similarly in neurofibrillary tangles (NFTs); and when sequestered in NFTs, complexes with critical synapse-stabilizing factors. Additionally, CRMP2 is becoming recognized as an important adaptor protein involved in vesicle trafficking, amyloidogenesis and autophagy, in ways that tau is not. This review systematically compares the biology of CRMP2 to that of tau in the context of AD and explores the hypothesis that CRMP2 is an etiologically significant protein in AD and participates in pathways that can be rationally engaged for therapeutic benefit.

  16. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    Full Text Available The general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and the preference for a general organizational model, as well as for mechanisms of human potential management. The research was carried out in 2005/2006. The sample consisted of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and in a group of small business enterprises differing in ownership structure and type of activity. The general hypothesis, that assumptions on human nature and work are statistically significantly connected to the preferred approach (model) to work motivation and commitment, has been confirmed. The specific hypotheses have also been confirmed: ·Assumptions of a human as a rational economic being correlate significantly with only two mechanisms of the traditional model, the mechanism of work-method control and the working-discipline mechanism. ·Assumptions of a human as a social being correlate significantly with all mechanisms of engaging employees belonging to the human relations model, except the mechanism of introducing an adequate type of reward for all employees independently of working results. ·Assumptions of a human as a creative being correlate significantly and positively with the preference for two mechanisms belonging to the human resource model: investing in education and training, and creating conditions for the application of knowledge and skills. Young respondents holding assumptions of a human as a creative being prefer a much broader repertoire of mechanisms belonging to the human resources model than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engagement appears especially in the sub-sample of managers, in the category of young subjects

  17. Comprehensive analysis of schizophrenia-associated loci highlights ion channel pathways and biologically plausible candidate causal genes

    DEFF Research Database (Denmark)

    Pers, Tune H; Timshel, Pascal; Ripke, Stephan;

    2016-01-01

    Over 100 genetic loci have been robustly associated with schizophrenia. Gene prioritization and pathway analysis have focused on a priori hypotheses and thus may have been unduly influenced by prior assumptions and missed important causal genes and pathways. Using a data-driven approach, we show that genes in associated loci: (1) are highly expressed in cortical brain areas; (2) are enriched for ion channel pathways (false discovery rates…); and are functionally related to each other, hence representing promising candidates for experimental follow-up. We validate the relevance of the prioritized genes by showing that they are enriched for rare disruptive variants and de novo variants from schizophrenia sequencing studies (odds ratio 1.67, P=0.039), and are enriched for genes encoding members of mouse and human postsynaptic density proteomes (odds ratio 4…).

  18. World assumptions, posttraumatic stress and quality of life after a natural disaster: A longitudinal study

    Directory of Open Access Journals (Sweden)

    Nygaard Egil

    2012-06-01

    Full Text Available Abstract Background Changes in world assumptions are a fundamental concept within theories that explain posttraumatic stress disorder. The objective of the present study was to gain a greater understanding of how changes in world assumptions are related to quality of life and posttraumatic stress symptoms after a natural disaster. Methods A longitudinal study of 574 Norwegian adults who survived the Southeast Asian tsunami in 2004 was undertaken. Multilevel analyses were used to identify which factors at six months post-tsunami predicted quality of life and posttraumatic stress symptoms two years post-tsunami. Results Good quality of life and posttraumatic stress symptoms were negatively related. However, major differences in the predictors of these outcomes were found. Females reported significantly higher quality of life and more posttraumatic stress than men. The association between level of exposure to the tsunami and quality of life seemed to be mediated by posttraumatic stress. Negative perceived changes in the assumption “the world is just” were related to adverse outcome in both quality of life and posttraumatic stress. Positive perceived changes in the assumptions “life is meaningful” and “feeling that I am a valuable human” were associated with higher levels of quality of life but not with posttraumatic stress. Conclusions Quality of life and posttraumatic stress symptoms demonstrate differences in their etiology. World assumptions may be less specifically related to posttraumatic stress than has been postulated in some cognitive theories.

  19. Are nest sites actively chosen? Testing a common assumption for three non-resource limited birds

    Science.gov (United States)

    Goodenough, A. E.; Elliot, S. L.; Hart, A. G.

    2009-09-01

    Many widely-accepted ecological concepts are simplified assumptions about complex situations that remain largely untested. One example is the assumption that nest-building species choose nest sites actively when they are not resource limited. This assumption has seen little direct empirical testing: most studies on nest-site selection simply assume that sites are chosen actively (and seek explanations for such behaviour) without considering that sites may be selected randomly. We used 15 years of data from a nestbox scheme in the UK to test the assumption of active nest-site choice in three cavity-nesting bird species that differ in breeding and migratory strategy: blue tit (Cyanistes caeruleus), great tit (Parus major) and pied flycatcher (Ficedula hypoleuca). Nest-site selection was non-random (implying active nest-site choice) for blue and great tits, but not for pied flycatchers. We also considered the relative importance of year-specific and site-specific factors in determining occupation of nest sites. Site-specific factors were more important than year-specific factors for the tit species, while the reverse was true for pied flycatchers. Our results show that nest-site selection, in birds at least, is not always the result of active choice, such that choice should not be assumed automatically in studies of nesting behaviour. We use this example to highlight the need to test key ecological assumptions empirically, and the importance of doing so across taxa rather than for single "model" species.

  20. Assumptions and moral understanding of the wish to hasten death: a philosophical review of qualitative studies.

    Science.gov (United States)

    Rodríguez-Prat, Andrea; van Leeuwen, Evert

    2017-07-01

    It is not uncommon for patients with advanced disease to express a wish to hasten death (WTHD). Qualitative studies of the WTHD have found that such a wish may have different meanings, none of which can be understood outside of the patient's personal and sociocultural background, and none of which necessarily implies taking concrete steps to end one's life. The starting point for the present study was a previous systematic review of qualitative studies of the WTHD in patients with advanced disease. Here we analyse in greater detail the statements made by patients included in that review in order to examine their moral understandings and representations of illness, the dying process and death. We identify and discuss four classes of assumptions: (1) assumptions related to patients' moral understandings in terms of dignity, autonomy and authenticity; (2) assumptions related to social interactions; (3) assumptions related to the value of life; and (4) assumptions related to medicalisation as an overarching context within which the WTHD is expressed. Our analysis shows how a philosophical perspective can add to an understanding of the WTHD by taking into account cultural and anthropological aspects of the phenomenon. We conclude that the knowledge gained through exploring patients' experience and moral understandings in the end-of-life context may serve as the basis for care plans and interventions that can help them experience their final days as a meaningful period of life, restoring some sense of personal dignity in those patients who feel this has been lost.

  1. Modelling sexual transmission of HIV: testing the assumptions, validating the predictions

    Science.gov (United States)

    Baggaley, Rebecca F.; Fraser, Christophe

    2010-01-01

    Purpose of review To discuss the role of mathematical models of sexual transmission of HIV: the methods used and their impact. Recent findings We use mathematical modelling of “universal test and treat” as a case study to illustrate wider issues relevant to all modelling of sexual HIV transmission. Summary Mathematical models are used extensively in HIV epidemiology to deduce the logical conclusions arising from one or more sets of assumptions. Simple models lead to broad qualitative understanding, while complex models can encode more realistic assumptions and thus be used for predictive or operational purposes. An overreliance on model analysis where assumptions are untested and input parameters cannot be estimated should be avoided. Simple models providing bold assertions have provided compelling arguments in recent public health policy, but may not adequately reflect the uncertainty inherent in the analysis. PMID:20543600

  2. COMPETITION VERSUS COLLUSION: THE PARALLEL BEHAVIOUR IN THE ABSENCE OF THE SYMMETRY ASSUMPTION

    Directory of Open Access Journals (Sweden)

    Romano Oana Maria

    2012-07-01

    Full Text Available Cartel detection is usually viewed as a key task of competition authorities. A special case of cartel is parallel behaviour in selling prices. This type of behaviour is difficult to assess and its analysis does not always yield conclusive results. To evaluate such behaviour, the available data are compared with theoretical values obtained using a competitive or a collusive model. When different competitive or collusive models are considered, economists use, for simplicity of calculation, the assumption of symmetry in costs and quantities produced/sold. This assumption has the disadvantage that the theoretical values obtained may deviate significantly from the real values on the market, which can sometimes lead to ambiguous results. The present paper analyses the parallel behaviour of economic agents in the absence of the symmetry assumption and studies the identification of the model under these conditions.

  3. Assumptions and Axioms: Mathematical Structures to Describe the Physics of Rigid Bodies

    CERN Document Server

    Butler, Philip H; Renaud, Peter F

    2010-01-01

    This paper challenges some of the common assumptions underlying the mathematics used to describe the physical world. We start by reviewing many of the assumptions underlying the concepts of real, physical, rigid bodies and the translational and rotational properties of such rigid bodies. Nearly all elementary and advanced texts make physical assumptions that are subtly different from ours, and as a result we develop a mathematical description that is subtly different from the standard mathematical structure. Using the homogeneity and isotropy of space, we investigate the translational and rotational features of rigid bodies in two and three dimensions. We find that the concept of rigid bodies and the concept of the homogeneity of space are intrinsically linked. The geometric study of rotations of rigid objects leads to a geometric product relationship for lines and vectors. By requiring this product to be both associative and to satisfy Pythagoras' theorem, we obtain a choice of Clifford algebras. We extend o...

  4. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  5. Impact of unseen assumptions on communication of atmospheric carbon mitigation options

    Science.gov (United States)

    Elliot, T. R.; Celia, M. A.; Court, B.

    2010-12-01

    With the rapid access and dissemination of information made available through online and digital pathways, there is a need for concurrent openness and transparency in the communication of scientific investigation. Even with open communication it is essential that the scientific community continue to provide impartial, result-driven information. An unknown factor in climate literacy is the influence of an impartial presentation of scientific investigation that has utilized biased base assumptions. A formal publication appendix, and additional digital material, provides active investigators a suitable framework and ancillary material to make informed statements weighted by the assumptions made in a study. However, informal media and rapid communiqués rarely make such investigatory attempts, often citing headline or key phrasing within a written work. This presentation is focused on Geologic Carbon Sequestration (GCS) as a proxy for the wider field of climate science communication, wherein we primarily investigate recent publications in the GCS literature that produce scenario outcomes using apparently biased pro- or con- assumptions. A general review of scenario economics, capture process efficacy and specific examination of sequestration site assumptions and processes reveals an apparent misrepresentation of what we consider to be a base-case GCS system. The authors demonstrate the influence of the apparent bias in primary assumptions on results from commonly referenced subsurface hydrology models. By using moderate semi-analytical model simplification and Monte Carlo analysis of outcomes, we can establish the likely reality of any GCS scenario within a pragmatic middle ground. Secondarily, we review the development of publicly available web-based computational tools and recent workshops where we presented interactive educational opportunities for public and institutional participants, with the goal of base-assumption awareness playing a central role. Through a series of

  6. cBrother: relaxing parental tree assumptions for Bayesian recombination detection.

    Science.gov (United States)

    Fang, Fang; Ding, Jing; Minin, Vladimir N; Suchard, Marc A; Dorman, Karin S

    2007-02-15

    Bayesian multiple change-point models accurately detect recombination in molecular sequence data. Previous Java-based implementations assume a fixed topology for the representative parental data. cBrother is a novel C language implementation that capitalizes on reduced computational time to relax the fixed tree assumption. We show that cBrother is 19 times faster than its predecessor and that the fixed tree assumption can influence estimates of recombination in a medically relevant dataset. cBrother can be freely downloaded from http://www.biomath.org/dormanks/ and can be compiled on Linux, Macintosh and Windows operating systems. Online documentation and a tutorial are also available at the site.

  7. Automatic ethics: the effects of implicit assumptions and contextual cues on moral behavior.

    Science.gov (United States)

    Reynolds, Scott J; Leavitt, Keith; DeCelles, Katherine A

    2010-07-01

    We empirically examine the reflexive or automatic aspects of moral decision making. To begin, we develop and validate a measure of an individual's implicit assumption regarding the inherent morality of business. Then, using an in-basket exercise, we demonstrate that an implicit assumption that business is inherently moral impacts day-to-day business decisions and interacts with contextual cues to shape moral behavior. Ultimately, we offer evidence supporting a characterization of employees as reflexive interactionists: moral agents whose automatic decision-making processes interact with the environment to shape their moral behavior.

  8. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Science.gov (United States)

    Fang, L.; Sun, X. Y.; Liu, Y. W.

    2016-12-01

    To shed light on the subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, and then show by a generalized derivation that if a modelling involves multiple stationary restrictions, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology.
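
    For context, the abstract does not name the test equation; the usual one-dimensional nonlinear advection test case is the inviscid Burgers equation (an assumption here), whose filtered form produces the SGS term that any assumption function must model:

```latex
% Inviscid Burgers equation (1D nonlinear advection) and its filtered form
\partial_t u + u\,\partial_x u = 0
\qquad\Longrightarrow\qquad
\partial_t \bar{u} + \bar{u}\,\partial_x \bar{u} = -\,\partial_x \tau,
\qquad
\tau = \tfrac{1}{2}\left(\overline{u^{2}} - \bar{u}^{2}\right)
```

    The SGS stress \tau is the quantity for which modelling assumptions and restrictions are posed.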

  10. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus.

    Science.gov (United States)

    Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel

    2017-10-01

    The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on the key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data regarding renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  11. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of economic depreciation rates of decline may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  12. Validity of the Michaelis-Menten equation--steady-state or reactant stationary assumption: that is the question.

    Science.gov (United States)

    Schnell, Santiago

    2014-01-01

    The Michaelis-Menten equation is generally used to estimate the kinetic parameters, V and K_M, when the steady-state assumption is valid. Following a brief overview of the derivation of the Michaelis-Menten equation for the single-enzyme, single-substrate reaction, a critical review of the criteria for validity of the steady-state assumption is presented. The application of the steady-state assumption makes the implicit assumption that there is an initial transient during which the substrate concentration remains approximately constant, equal to the initial substrate concentration, while the enzyme-substrate complex concentration builds up. This implicit assumption is known as the reactant stationary assumption. This review presents evidence showing that the reactant stationary assumption is distinct from and independent of the steady-state assumption. Contrary to the widely believed notion that the Michaelis-Menten equation can always be applied under the steady-state assumption, the reactant stationary assumption is truly the necessary condition for validity of the Michaelis-Menten equation to estimate kinetic parameters. Therefore, the application of the Michaelis-Menten equation only leads to accurate estimation of kinetic parameters when it is used under experimental conditions meeting the reactant stationary assumption. The criterion for validity of the reactant stationary assumption does not require the restrictive condition of choosing a substrate concentration that is much higher than the enzyme concentration in initial rate experiments.
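
    For the reader's convenience, the equation and the two assumptions discussed above can be stated compactly (a sketch in standard notation; the inequality is the commonly cited sufficient condition, e.g. Segel's, and the paper itself should be consulted for the precise criterion):

```latex
% Michaelis-Menten rate law for E + S <=> ES -> E + P
v = \frac{V s}{K_M + s},
\qquad V = k_{\mathrm{cat}}\, e_0,
\qquad K_M = \frac{k_{-1} + k_{\mathrm{cat}}}{k_1}

% Steady-state assumption: the complex is quasi-stationary after a fast transient
\frac{d[ES]}{dt} \approx 0

% Reactant stationary assumption: s stays near s_0 during that transient,
% which holds when enzyme is scarce relative to K_M + s_0
\frac{e_0}{K_M + s_0} \ll 1
```

    Note how the last condition can hold even when s_0 is not much larger than e_0, matching the abstract's point that a large substrate excess is not required.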

  13. Comparing the Performance of Approaches for Testing the Homogeneity of Variance Assumption in One-Factor ANOVA Models

    Science.gov (United States)

    Wang, Yan; Rodríguez de Gil, Patricia; Chen, Yi-Hsin; Kromrey, Jeffrey D.; Kim, Eun Sook; Pham, Thanh; Nguyen, Diep; Romano, Jeanine L.

    2017-01-01

    Various tests to check the homogeneity of variance assumption have been proposed in the literature, yet there is no consensus as to their robustness when the assumption of normality does not hold. This simulation study evaluated the performance of 14 tests for the homogeneity of variance assumption in one-way ANOVA models in terms of Type I error…
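
    As a minimal illustration of the kind of comparison such simulation studies perform (hypothetical data; Levene's and Bartlett's tests are just two of the procedures typically included):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Three groups with equal variances but skewed (non-normal) distributions:
# exactly the situation in which the robustness of variance tests is at issue.
groups = [rng.exponential(scale=2.0, size=30) for _ in range(3)]

# Bartlett's test is exact under normality but can be liberal for skewed data;
# the median-centered Levene test (Brown-Forsythe) is generally more robust.
stat_b, p_b = stats.bartlett(*groups)
stat_l, p_l = stats.levene(*groups, center="median")

print(f"Bartlett: stat={stat_b:.3f}, p={p_b:.3f}")
print(f"Levene:   stat={stat_l:.3f}, p={p_l:.3f}")
```

    Repeating such draws many times under a true null and recording how often p < 0.05 yields the empirical Type I error rates that the study tabulates.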

  14. Comprehensive analysis of schizophrenia-associated loci highlights ion channel pathways and biologically plausible candidate causal genes.

    Science.gov (United States)

    Pers, Tune H; Timshel, Pascal; Ripke, Stephan; Lent, Samantha; Sullivan, Patrick F; O'Donovan, Michael C; Franke, Lude; Hirschhorn, Joel N

    2016-03-15

    Over 100 genetic loci have been robustly associated with schizophrenia. Gene prioritization and pathway analysis have focused on a priori hypotheses and thus may have been unduly influenced by prior assumptions and missed important causal genes and pathways. Using a data-driven approach, we show that genes in associated loci: (1) are highly expressed in cortical brain areas; (2) are enriched for ion channel pathways (false discovery rates ...); and (3) include genes that are functionally related to each other and hence represent promising candidates for experimental follow up. We validate the relevance of the prioritized genes by showing that they are enriched for rare disruptive variants and de novo variants from schizophrenia sequencing studies (odds ratio 1.67, P = 0.039), and are enriched for genes encoding members of mouse and human postsynaptic density proteomes (odds ratio 4.56, P = 5.00 × 10^-4; odds ratio 2.60, P = 0.049). The authors wish it to be known that, in their opinion, the first 2 authors should be regarded as joint First Author.

  15. Conceptualizing Identity Development: Unmasking the Assumptions within Inventories Measuring Identity Development

    Science.gov (United States)

    Moran, Christy D.

    2009-01-01

    The purpose of this qualitative research was to analyze the dimensions and manifestations of identity development embedded within commonly used instruments measuring student identity development. To this end, a content analysis of ten identity assessment tools was conducted to determine the assumptions about identity development contained therein.…

  16. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant...

  17. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval, and crown archetypes are typically assumed to contain a turbid medium.

  18. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  19. Mutual assumptions and facts about nondisclosure among clinical supervisors and students in group supervision

    DEFF Research Database (Denmark)

    Nielsen, Geir Høstmark; Skjerve, Jan; Jacobsen, Claus Haugaard;

    2009-01-01

    In the two preceding papers of this issue of Nordic Psychology the authors report findings from a study of nondisclosure among student therapists and clinical supervisors. The findings were reported separately for each group. In this article, the two sets of findings are held together and compared......, so as to draw a picture of mutual assumptions and facts about nondisclosure among students and supervisors....

  20. Complex Learning Theory--Its Epistemology and Its Assumptions about Learning: Implications for Physical Education

    Science.gov (United States)

    Light, Richard

    2008-01-01

    Davis and Sumara (2003) argue that differences between commonsense assumptions about learning and those upon which constructivism rests present a significant challenge for the fostering of constructivist approaches to teaching in schools. Indeed, as Rink (2001) suggests, initiating any change process for teaching method needs to involve some…

  1. 76 FR 17158 - Assumption Buster Workshop: Distributed Data Schemes Provide Security

    Science.gov (United States)

    2011-03-28

    ... group that coordinates cyber security research activities in support of national security systems, is...: There is a strong and often repeated call for research to provide novel cyber security solutions. The... capable, and that re-examining cyber security solutions in the context of these assumptions will result in...

  2. Kinematic and static assumptions for homogenization in micromechanics of granular materials

    NARCIS (Netherlands)

    Kruyt, N.P.; Rothenburg, L.

    2004-01-01

    A study is made of kinematic and static assumptions for homogenization in micromechanics of granular materials for two cases. The first case considered deals with the elastic behaviour of isotropic, two-dimensional assemblies with bonded contacts. Using a minimum potential energy principle and estim

  3. Is a "Complex" Task Really Complex? Validating the Assumption of Cognitive Task Complexity

    Science.gov (United States)

    Sasayama, Shoko

    2016-01-01

    In research on task-based learning and teaching, it has traditionally been assumed that differing degrees of cognitive task complexity can be inferred through task design and/or observations of differing qualities in linguistic production elicited by second language (L2) communication tasks. Without validating this assumption, however, it is…

  4. How Do People Learn at the Workplace? Investigating Four Workplace Learning Assumptions

    NARCIS (Netherlands)

    Kooken, Jose; Ley, Tobias; de Hoog, Robert; Duval, Erik; Klamma, Ralf

    2007-01-01

    Any software development project is based on assumptions about the state of the world that probably will hold when it is fielded. Investigating whether they are true can be seen as an important task. This paper describes how an empirical investigation was designed and conducted for the EU funded APO

  5. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  7. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  8. Credit Transfer amongst Students in Contrasting Disciplines: Examining Assumptions about Wastage, Mobility and Lifelong Learning

    Science.gov (United States)

    Di Paolo, Terry; Pegg, Ann

    2013-01-01

    While arrangements for credit transfer exist across the UK higher education sector, little is known about credit-transfer students or why they re-engage with study. Policy makers have cited credit transfer as a mechanism for reducing wastage and drop-out, but this paper challenges this assumption and instead examines how credit transfer serves…

  9. Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning

    Science.gov (United States)

    Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

    2010-01-01

    This paper takes as its starting point assumptions about use of information and communication technology (ICT) by people born after 1983, the so called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

  10. An Algorithm for Determining Database Consistency Under the Closed World Assumption

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    It is well-known that there are circumstances where applying Reiter's closed world assumption (CWA) will lead to logical inconsistencies. In this paper, a new characterization of CWA consistency is presented and an algorithm is proposed for determining whether a database without function symbols is consistent with the CWA. The algorithm is shown to be efficient.
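
    To make the notion concrete, here is a brute-force propositional sketch of CWA consistency (illustrative only; this is not the paper's algorithm, which is designed to be efficient):

```python
from itertools import product

def satisfies(model, clauses):
    # A clause is a list of (polarity, atom) literals; a model maps atoms to bools.
    return all(any(model[a] if pos else not model[a] for pos, a in clause)
               for clause in clauses)

def entails(clauses, atom, atoms):
    # DB |= atom iff every model of the clauses makes the atom true.
    return all(model[atom]
               for bits in product([False, True], repeat=len(atoms))
               for model in [dict(zip(atoms, bits))]
               if satisfies(model, clauses))

def cwa_consistent(clauses):
    atoms = sorted({a for clause in clauses for _, a in clause})
    # CWA: assume the negation of every atom the DB does not entail.
    negated = [a for a in atoms if not entails(clauses, a, atoms)]
    # Consistent iff some model satisfies the DB with all negated atoms false.
    return any(satisfies(model, clauses) and not any(model[a] for a in negated)
               for bits in product([False, True], repeat=len(atoms))
               for model in [dict(zip(atoms, bits))])

print(cwa_consistent([[(True, "p")]]))               # True: definite facts are safe
print(cwa_consistent([[(True, "p"), (True, "q")]]))  # False: {p or q} under CWA
```

    The second example is Reiter's classic failure case: the database entails neither p nor q, so the CWA adds both negations, contradicting the disjunction.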

  11. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    In 1984, Jean-Louis Le Mouël published a paper suggesting that the flow at the top of the Earth’s core is tangentially geostrophic, i.e., the Lorentz force is much smaller than the Coriolis force in this particular region of the core. This new assumption was subsequently used to discriminate among

  12. Exploring the Estimation of Examinee Locations Using Multidimensional Latent Trait Models under Different Distributional Assumptions

    Science.gov (United States)

    Jang, Hyesuk

    2014-01-01

    This study aims to evaluate a multidimensional latent trait model to determine how well the model works in various empirical contexts. Contrary to the assumption of these latent trait models that the traits are normally distributed, situations in which the latent trait is not shaped with a normal distribution may occur (Sass et al, 2008; Woods…

  13. H-INFINITY-OPTIMIZATION WITHOUT ASSUMPTIONS ON FINITE OR INFINITE ZEROS

    NARCIS (Netherlands)

    SCHERER, C

    1992-01-01

    Explicit algebraic conditions are presented for the suboptimality of some parameter in the H(infinity)-optimization problem by output measurement control. Apart from two strict properness conditions, no artificial assumptions restrict the underlying system. In particular, the plant may have zeros on

  14. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  15. HIERARCHICAL STRUCTURE IN ADL AND IADL - ANALYTICAL ASSUMPTIONS AND APPLICATIONS FOR CLINICIANS AND RESEARCHERS

    NARCIS (Netherlands)

    KEMPEN, GIJM; MYERS, AM; POWELL, LE

    1995-01-01

    The results of a Canadian study have shown that a set of 12 (I)ADL items did not meet the criteria of Guttman's scalogram program, questioning the assumption of hierarchical ordering. In this article, the hierarchical structure of (I)ADL items from the Canadian elderly sample is retested with another

  16. The Impact of Feedback Frequency on Learning and Task Performance: Challenging the "More Is Better" Assumption

    Science.gov (United States)

    Lam, Chak Fu; DeRue, D. Scott; Karam, Elizabeth P.; Hollenbeck, John R.

    2011-01-01

    Previous research on feedback frequency suggests that more frequent feedback improves learning and task performance (Salmoni, Schmidt, & Walter, 1984). Drawing from resource allocation theory (Kanfer & Ackerman, 1989), we challenge the "more is better" assumption and propose that frequent feedback can overwhelm an individual's cognitive resource…

  17. The Mediating Effect of World Assumptions on the Relationship between Trauma Exposure and Depression

    Science.gov (United States)

    Lilly, Michelle M.; Valdez, Christine E.; Graham-Bermann, Sandra A.

    2011-01-01

    The association between trauma exposure and mental health-related challenges such as depression are well documented in the research literature. The assumptive world theory was used to explore this relationship in 97 female survivors of intimate partner violence (IPV). Participants completed self-report questionnaires that assessed trauma history,…

  19. 76 FR 22925 - Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors

    Science.gov (United States)

    2011-04-25

    ... Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors AGENCY: The National... assumptionbusters@nitrd.gov . Travel expenses will be paid at the government rate for selected participants who live... behavioral models to monitor the size and destinations of financial transfers, and/or on-line...

  20. World assumptions, religiosity, and PTSD in survivors of intimate partner violence.

    Science.gov (United States)

    Lilly, Michelle M; Howell, Kathryn H; Graham-Bermann, Sandra

    2015-01-01

    Intimate partner violence (IPV) is among the most frequent types of violence annually affecting women. One frequent outcome of violence exposure is posttraumatic stress disorder (PTSD). The theory of shattered world assumptions represents one possible explanation for adverse mental health outcomes following trauma, contending that trauma disintegrates individuals' core assumptions that the world is safe and meaningful, and that the self is worthy. Research exploring world assumptions in relation to survivors of IPV has been absent. The present study found that world assumptions significantly mediated the relationship between IPV exposure and PTSD symptoms. Religiosity was also significantly and positively related to PTSD symptoms, but was not significantly related to the amount of IPV exposure. Though African American women reported more IPV exposure and greater religiosity than European American women in the sample, there were no interethnic differences in PTSD symptom endorsement. Implications of these findings are discussed.

  2. Challenging Assumptions about Values, Interests and Power in Further and Higher Education Partnerships

    Science.gov (United States)

    Elliott, Geoffrey

    2017-01-01

    This article raises questions that challenge assumptions about values, interests and power in further and higher education partnerships. These topics were explored in a series of semi-structured interviews with a sample of principals and senior higher education partnership managers of colleges spread across a single region in England. The data…

  3. Comparison of Three Common Experimental Designs to Improve Statistical Power When Data Violate Parametric Assumptions.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…
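
    In the same spirit, a tiny Monte Carlo sketch (hypothetical settings: skewed data with a true location shift) comparing the empirical power of a parametric test with its nonparametric analogue:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, alpha, shift = 15, 2000, 0.05, 1.0

hits_t = hits_u = 0
for _ in range(reps):
    # Exponential data violate the normality assumption of the t test.
    a = rng.exponential(scale=2.0, size=n)
    b = rng.exponential(scale=2.0, size=n) + shift
    hits_t += stats.ttest_ind(a, b).pvalue < alpha       # parametric test
    hits_u += stats.mannwhitneyu(a, b).pvalue < alpha    # nonparametric analogue

print(f"t test power:       {hits_t / reps:.2f}")
print(f"Mann-Whitney power: {hits_u / reps:.2f}")
```

    Setting shift = 0 instead estimates each test's empirical Type I error rate, the other quantity such comparisons report.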

  4. Exploring Epistemologies: Social Work Action as a Reflection of Philosophical Assumptions.

    Science.gov (United States)

    Dean, Ruth G.; Fenby, Barbara L.

    1989-01-01

    Two major philosophical assumptions underlying the literature, practice, and teaching of social work are reviewed: empiricism and existentialism. Two newer theoretical positions, critical theory and deconstruction, are also introduced. The implications for using each position as a context for teaching are considered. (MSE)

  5. Monitoring long-lasting insecticidal net (LLIN) durability to validate net serviceable life assumptions, in Rwanda

    NARCIS (Netherlands)

    Hakizimana, E.; Cyubahiro, B.; Rukundo, A.; Kabayiza, A.; Mutabazi, A.; Beach, R.; Patel, R.; Tongren, J.E.; Karema, C.

    2014-01-01

    Background To validate assumptions about the length of the distribution–replacement cycle for long-lasting insecticidal nets (LLINs) in Rwanda, the Malaria and other Parasitic Diseases Division, Rwanda Ministry of Health, used World Health Organization methods to independently confirm the three-year

  6. The National Teacher Corps: A Study of Shifting Goals and Changing Assumptions

    Science.gov (United States)

    Eckert, Sarah Anne

    2011-01-01

    This article investigates the lasting legacy of the National Teacher Corps (NTC), which was created in 1965 by the U.S. federal government with two crucial assumptions: that teaching poor urban children required a very specific skill set and that teacher preparation programs were not providing adequate training in these skills. Analysis reveals…

  7. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  8. Principle Assumption of Space Object Detection Using Shipborne Great Aperture Photoelectrical Theodolite

    Science.gov (United States)

    Ouyang, Jia; Zhang, Tong-shuang; Wang, Qian-xue

    2016-02-01

    This paper introduces the use of space object detection. By analyzing the current state of research on space object detection using photoelectrical equipment, a shipborne great-aperture photoelectrical theodolite is designed, and the principle assumption of space object detection using this theodolite is put forward.

  10. Herd immunity effect of the HPV vaccination program in Australia under different assumptions regarding natural immunity against re-infection.

    Science.gov (United States)

    Korostil, Igor A; Peters, Gareth W; Law, Matthew G; Regan, David G

    2013-04-01

    Deterministic dynamic compartmental transmission models (DDCTMs) of human papillomavirus (HPV) transmission have been used in a number of studies to estimate the potential impact of HPV vaccination programs. In most cases, the models were built under the assumption that an individual who cleared HPV infection develops (life-long) natural immunity against re-infection with the same HPV type (this is known as the SIR scenario). This assumption was also made by two Australian modelling studies evaluating the impact of the National HPV Vaccination Program to assist in the health-economic assessment of male vaccination. An alternative view denying natural immunity after clearance (the SIS scenario) was only presented in one study, although neither scenario has been supported by strong evidence. Some recent findings, however, provide arguments in favour of SIS. We developed HPV transmission models implementing life-time (SIR), limited, and non-existent (SIS) natural immunity. For each model we estimated the herd immunity effect of the ongoing Australian HPV vaccination program and its extension to cover males. Given the Australian setting, we aimed to clarify the extent to which the choice of model structure would influence estimation of this effect. A statistically robust and efficient calibration methodology was applied to ensure the credibility of our results. We observed that for non-SIR models the herd immunity effect, measured in relative reductions in HPV prevalence in the unvaccinated population, was much more pronounced than for the SIR model. For example, with vaccine efficacy of 95% for females and 90% for males, the reductions for HPV-16 were 3% in females and 28% in males for the SIR model, and at least 30% (females) and 60% (males) for non-SIR models. The magnitude of these differences implies that evaluations of the impact of vaccination programs using DDCTMs should incorporate several model structures until our understanding of natural immunity is improved.
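
    To make the SIR/SIS distinction concrete, here is a minimal sketch of the two structures under vaccination (illustrative parameters and a perfect life-long vaccine are assumed; this is not the paper's calibrated HPV model):

```python
def endemic_prevalence(model, beta=8.0, gamma=1.0, mu=0.02, p=0.7,
                       years=400, dt=0.01):
    """Euler run of a toy vaccinated SIR or SIS model to near steady state.

    S, I, R, V are population fractions; a fraction p of entrants receives
    a perfect life-long vaccine; mu is the demographic turnover rate.
    """
    S, I, R, V = 1.0 - p - 1e-3, 1e-3, 0.0, p
    for _ in range(int(years / dt)):
        new_inf = beta * S * I
        dS = mu * (1 - p) - new_inf - mu * S
        dI = new_inf - (gamma + mu) * I
        if model == "SIR":       # clearance confers life-long natural immunity
            dR = gamma * I - mu * R
        else:                    # SIS: cleared individuals are susceptible again
            dS += gamma * I
            dR = 0.0
        dV = mu * p - mu * V
        S, I, R, V = S + dt * dS, I + dt * dI, R + dt * dR, V + dt * dV
    return I / (1 - p)           # prevalence within the unvaccinated share

for model in ("SIR", "SIS"):
    print(f"{model}: prevalence among unvaccinated ~ {endemic_prevalence(model):.3f}")
```

    With these toy numbers the SIS structure sustains a far larger infected reservoir among the unvaccinated, so the same vaccination program produces a much larger relative reduction, which is the qualitative pattern the study reports.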

  11. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    Science.gov (United States)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before, and rapid decisions during, future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and the short warning time afforded by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million in damages to cargo and additional

  12. Bipolar-pulses observed by the LRS/WFC-L onboard KAGUYA - Plausible evidence of lunar dust impact -

    Science.gov (United States)

    Kasahara, Yoshiya; Horie, Hiroki; Hashimoto, Kozo; Omura, Yoshiharu; Goto, Yoshitaka; Kumamoto, Atsushi; Ono, Takayuki; Tsunakawa, Hideo; Lrs/Wfc Team; Map/Lmag Team

    2010-05-01

    same) and thus most of the bipolar pulses which can be detected in MONO mode are cancelled in DIFF mode. This fact suggests that these bipolar pulses are not a kind of natural wave but are caused by instantaneous potential changes of the KAGUYA spacecraft. Discussion: A similar type of bipolar pulse has been observed by the monopole antenna measurements using the Radio and Plasma Wave Science (RPWS) instruments on board Cassini around Saturn [4]. They demonstrated that these bipolar pulses are caused by impacts of dust floating around Saturn. It is well known that lunar dust is widely distributed over a broad altitude range around the moon, and it is plausible that these bipolar pulses are caused by lunar dust impacts. In the presentation, we show the detailed characteristics of the bipolar pulses detected by the WFC-L onboard KAGUYA. References: [1] Y. Kasahara et al., Earth, Planets and Space, 60(4), 341-351, 2008. [2] T. Ono et al., Earth, Planets and Space, 60(4), 321-332, 2008. [3] K. Hashimoto et al., The 4th SELENE (KAGUYA) Science Working Team Meeting, (this issue), 2010. [4] W.S. Kurth et al., Planetary and Space Science, 54(9-10), 988-998, 2006.

  13. On assumption in low-altitude investigation of dayside magnetospheric phenomena

    Science.gov (United States)

    Koskinen, H. E. J.

    In the physics of large-scale phenomena in complicated media, such as space plasmas, the chain of reasoning from the fundamental physics to conceptual models is a long and winding road, requiring much physical insight and reliance on various assumptions and approximations. The low-altitude investigation of dayside phenomena provides numerous examples of problems arising from the necessity to make strong assumptions. In this paper we discuss some important assumptions that are either unavoidable or at least widely used. Two examples are the concepts of frozen-in field lines and convection velocity. Instead of asking what violates the frozen-in condition, it is quite legitimate to ask what freezes the plasma and the magnetic field in the first place. Another important complex of problems are the limitations introduced by a two-dimensional approach or linearization of equations. Although modern research is more and more moving toward three-dimensional and time-dependent models, limitations in computing power often make a two-dimensional approach tempting. In a similar way, linearization makes equations analytically tractable. Finally, a very central question is the mapping. In the first approximation, the entire dayside magnetopause maps down to the ionosphere through the dayside cusp region. From the mapping viewpoint, the cusp is one of the most difficult regions and assumptions needed to perform the mapping in practice must be considered with the greatest possible care. We can never avoid assumptions but we must always make them clear to ourselves and also to the readers of our papers.

  14. Neural circuits as computational dynamical systems.

    Science.gov (United States)

    Sussillo, David

    2014-04-01

    Many recent studies of neurons recorded from cortex reveal complex temporal dynamics. How such dynamics embody the computations that ultimately lead to behavior remains a mystery. Approaching this issue requires developing plausible hypotheses couched in terms of neural dynamics. A tool ideally suited to aid in this question is the recurrent neural network (RNN). RNNs straddle the fields of nonlinear dynamical systems and machine learning and have recently seen great advances in both theory and application. I summarize recent theoretical and technological advances and highlight an example of how RNNs helped to explain perplexing high-dimensional neurophysiological data in the prefrontal cortex.
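
    As a concrete toy example of the class of models being described (a minimal sketch of a standard continuous-time rate RNN; the parameters are illustrative and not tied to any particular study):

```python
import numpy as np

# Rate RNN:  tau * dx/dt = -x + W @ tanh(x)
# With random coupling gain g > 1 the network generates rich intrinsic
# dynamics, the raw material that training shapes into a computation.
rng = np.random.default_rng(0)
N, g, tau, dt, steps = 200, 1.5, 1.0, 0.01, 5000

W = g * rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # random recurrent weights
x = rng.normal(size=N)                                   # initial state

trajectory = np.empty((steps, N))
for t in range(steps):
    x = x + (dt / tau) * (-x + W @ np.tanh(x))           # Euler integration step
    trajectory[t] = x

# A few snapshots of the state norm show sustained, non-decaying activity.
print(np.linalg.norm(trajectory[::1000], axis=1).round(2))
```

    Training a readout (or W itself) against behavioral targets, and then reverse-engineering the resulting dynamics, is the program the review describes.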

  15. Attrition of Knowledge Workforce in Healthcare in Northern parts of India – Health Information Technology as a Plausible Retention Strategy

    Directory of Open Access Journals (Sweden)

    Indrajit Bhattacharya

    2012-06-01

    Full Text Available Faced with a global shortage of skilled health workers due to attrition, countries are struggling to build and maintain an optimum knowledge workforce in healthcare for delivering quality healthcare services. The forces that affect healthcare professionals' turnover need to be addressed before a competent, uniformly adoptable strategy can be proposed for mitigating the problem. In this study we investigate the effects of socio-demographic characteristics on the attrition of the healthcare knowledge workforce in northern parts of India, across a wide rural-urban gradient and taking into account both public and private healthcare organizations. For this purpose a healthcare professional attrition tracking survey (HATS) was designed. Data were collected from a random sample of 807 respondents consisting of doctors, nurses, paramedics and administrators to explore the relationships between various factors acting as antecedents of job satisfaction, commitment and a healthcare professional's intention to stay in the job. Structured questionnaires were used as the data collection tools. Descriptive statistics, factor analysis and path analysis were carried out using multiple regression and correlation to propose a model that best explains the theoretical assumption of factors leading to attrition. Six factors of attrition, namely compensation and perks, work-life balance, sense of accomplishment, workload, need for automation and technology improvement, and substandard nature of work, have been identified as the main factors, with a data reliability of 0.809. It has also been identified that the intention to shift is a major determinant of attrition and is in turn affected by job satisfaction dimensions. Based on the survey response and analysis, a highly promising strategy of utilizing information technology implementation for increasing worker motivation, job satisfaction and commitment to reduce attrition has been

  16. Rethinking individualism and collectivism: evaluation of theoretical assumptions and meta-analyses.

    Science.gov (United States)

    Oyserman, Daphna; Coon, Heather M; Kemmelmeier, Markus

    2002-01-01

    Are Americans more individualistic and less collectivistic than members of other groups? The authors summarize plausible psychological implications of individualism-collectivism (IND-COL), meta-analyze cross-national and within-United States IND-COL differences, and review evidence for effects of IND-COL on self-concept, well-being, cognition, and relationality. European Americans were found to be both more individualistic-valuing personal independence more-and less collectivistic-feeling duty to in-groups less-than others. However, European Americans were not more individualistic than African Americans, or Latinos, and not less collectivistic than Japanese or Koreans. Among Asians, only Chinese showed large effects, being both less individualistic and more collectivistic. Moderate IND-COL effects were found on self-concept and relationality, and large effects were found on attribution and cognitive style.

  17. An optimization framework of biological dynamical systems.

    Science.gov (United States)

    Horie, Ryota

    2008-07-07

    Different biological dynamics are often described by different mathematical equations. On the other hand, some mathematical models describe many biological dynamics universally. Here, we focus on three biological dynamics: the Lotka-Volterra equation, the Hopfield neural networks, and the replicator equation. We describe these three dynamical models using a single optimization framework, which is constructed by employing Riemannian geometry. Then, we show that the optimization structures of these dynamics are identical, and that the differences among the three dynamics lie only in the constraints of the optimization. From this perspective, we discuss a unified view of biological dynamics. We also discuss plausible categorizations, the fundamental nature, and efficient modeling of biological dynamics, as seen from the optimization perspective on dynamical systems.
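
    For reference, the three dynamics named above in their standard textbook forms (the optimization/Riemannian construction itself is the paper's contribution and is not reproduced here):

```latex
% Generalized Lotka-Volterra:
\dot{x}_i = x_i \Big( r_i + \sum_j a_{ij} x_j \Big)

% Continuous Hopfield network:
\tau \dot{u}_i = -u_i + \sum_j w_{ij}\, \sigma(u_j) + b_i

% Replicator equation:
\dot{x}_i = x_i \big( (A x)_i - x^{\top} A x \big)
```

    The replicator equation already hints at the shared structure: for symmetric A it is a gradient flow with respect to the Shahshahani metric on the simplex, the kind of Riemannian structure such a framework can exploit.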

  18. Sampling dynamics: an alternative to payoff-monotone selection dynamics

    DEFF Research Database (Denmark)

    Berkemer, Rainer

    Osborne and Rubinstein introduced sampling equilibria which are based on the concept of ``procedural rationality''. Sethi extended their idea to a dynamic framework which leads to the so-called sampling dynamics. Unlike e.g. the replicator dynamics, this selection dynamics turns out to be neither... One can use the fact that strict Nash equilibria must also be sampling equilibria to test for the ``plausibility'' of the standard game theory result. Both analytical tools and agent-based simulation are used to investigate the dynamic stability of sampling equilibria in a generalized travelers dilemma. Two parameters are of interest: the number of strategy options (m) available to each traveler and an experience parameter... options this result is rather counter-intuitive, and indeed there is experimental evidence which indicates that deviation will be likely even though the Nash equilibrium is strict.

  19. Common-sense chemistry: The use of assumptions and heuristics in problem solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. In particular, many

  20. Analysis of Modeling Assumptions used in Production Cost Models for Renewable Integration Studies

    Energy Technology Data Exchange (ETDEWEB)

    Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-01

    Renewable energy integration studies have been published for many different regions exploring the question of how higher penetration of renewable energy will impact the electric grid. These studies each make assumptions about the systems they are analyzing; however, the effect of many of these assumptions has not yet been examined and published. In this paper we analyze the impact of modeling assumptions in renewable integration studies, including the optimization method used (linear or mixed-integer programming) and the temporal resolution of the dispatch stage (hourly or sub-hourly). We analyze each of these assumptions on a large and a small system and determine the impact of each assumption on key metrics including the total production cost, curtailment of renewables, CO2 emissions, and generator starts and ramps. Additionally, we identify the impact on these metrics if a four-hour-ahead commitment step is included before the dispatch step, and the impact of retiring generators to reduce the degree to which the system is overbuilt. We find that the largest effect of these assumptions is at the unit level on starts and ramps, particularly for the temporal resolution, with a smaller impact at the aggregate level on system costs and emissions. For each fossil fuel generator type we measured the average capacity started, average run-time per start, and average number of ramps. Linear programming results saw up to a 20% difference in the number of starts and average run time of traditional generators, and up to a 4% difference in the number of ramps, when compared to mixed-integer programming. Utilizing hourly dispatch instead of sub-hourly dispatch saw no difference in coal or gas CC units for either start metric, while gas CT units had a 5% increase in the number of starts and a 2% increase in the average on-time per start. The number of ramps decreased up to 44%. The smallest effect seen was on the CO2 emissions and total production cost, with a 0.8% and 0

  1. Modelling the dynamics of youth subcultures

    CERN Document Server

    Holme, P; Holme, Petter; Gronlund, Andreas

    2005-01-01

    What are the dynamics behind youth subcultures such as punk, hippie, or hip-hop cultures? How does the global dynamics of these subcultures relate to the individual's search for a personal identity? We propose a simple dynamical model to address these questions and find that only a few assumptions about the individual's behaviour are necessary to reproduce known features of youth culture.

  2. Structural dynamics

    CERN Document Server

    Strømmen, Einar N

    2014-01-01

    This book introduces the theory of structural dynamics, with focus on civil engineering structures that may be described by line-like beam or beam-column type systems, or by a system of rectangular plates. Throughout the book the mathematical presentation contains a classical analytical description as well as a description in a discrete finite element format, covering the mathematical development from basic assumptions to the final equations ready for practical dynamic response predictions. Solutions are presented in the time domain as well as in the frequency domain. Structural Dynamics starts off at a basic level and step by step brings the reader up to a level where the safety considerations necessary for dynamic design problems induced by wind or horizontal ground motion can be performed. The special theory of the tuned mass damper is given a comprehensive treatment, as this theory is not fully covered elsewhere. For the same reason a chapter on the problem of moving loads on beams has been included.

  3. Minimal Braid in Applied Symbolic Dynamics

    Institute of Scientific and Technical Information of China (English)

    张成; 张亚刚; 彭守礼

    2003-01-01

    Based on the minimal braid assumption, three-dimensional periodic flows of a dynamical system are reconstructed in the case of the unimodal map, and their topological structures are compared with those of the periodic orbits of the Rossler system in phase space through numerical experiments. The numerical results justify the validity of the minimal braid assumption, which provides a suspension from one-dimensional symbolic dynamics in the Poincare section to the knots of three-dimensional periodic flows.

  4. Innovation or 'Inventions'? The conflict between latent assumptions in marine aquaculture and local fishery.

    Science.gov (United States)

    Martínez-Novo, Rodrigo; Lizcano, Emmánuel; Herrera-Racionero, Paloma; Miret-Pastor, Lluís

    2016-06-01

    Recent European policy highlights the need to promote local fishery and aquaculture by means of innovation and joint participation in fishery management as one of the keys to achieving the sustainability of our seas. However, the implicit assumptions held by the actors in the two main groups involved - innovators (scientists, businessmen and administration managers) and local fishermen - can complicate, perhaps even render impossible, mutual understanding and co-operation. A qualitative analysis of interviews with members of both groups in the Valencian Community (Spain) reveals those latent assumptions and their impact on the respective practices. The analysis shows that the innovation narrative on which one group relies and the inventions narrative used by the other are rooted in two dramatically different, even antagonistic, collective worldviews. Any environmental policy that involves these groups should take these strong discords into account.

  5. IRT models with relaxed assumptions in eRm: A manual-like instruction

    Directory of Open Access Journals (Sweden)

    REINHOLD HATZINGER

    2009-03-01

    Full Text Available Linear logistic models with relaxed assumptions (LLRA) as introduced by Fischer (1974) are a flexible tool for the measurement of change for dichotomous or polytomous responses. As opposed to the Rasch model, assumptions on the dimensionality of items, their mutual dependencies and the distribution of the latent trait in the population of subjects are relaxed. Conditional maximum likelihood estimation allows for inference about treatment, covariate or trend effect parameters without taking the subjects' latent trait values into account. In this paper we show how LLRAs based on the LLTM, LRSM and LPCM can be used to answer various questions about the measurement of change and how they can be fitted in R using the eRm package. A number of small didactic examples are provided that can easily be used as templates for real data sets. All data files used in this paper are available from http://eRm.R-Forge.R-project.org/

  6. NGNP: High Temperature Gas-Cooled Reactor Key Definitions, Plant Capabilities, and Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Phillip Mills

    2012-02-01

    This document is intended to provide a Next Generation Nuclear Plant (NGNP) Project tool with which to collect and identify key definitions, plant capabilities, and inputs and assumptions to be used in ongoing efforts related to the licensing and deployment of a high temperature gas-cooled reactor (HTGR). These definitions, capabilities, and assumptions are extracted from a number of sources, including NGNP Project documents such as licensing-related white papers [References 1-11] and previously issued requirement documents [References 13-15]. Also included is information agreed upon by the NGNP Regulatory Affairs group's Licensing Working Group and Configuration Council. The NGNP Project approach to licensing an HTGR plant via a combined license (COL) is defined within the referenced white papers and reference [12], and is not duplicated here.

  7. THE HISTORY OF BUILDING THE NORTHERN FRATERNAL CELLS OF VIRGIN MARY ASSUMPTION MONASTERY IN TIKHVIN

    Directory of Open Access Journals (Sweden)

    Tatiana Nikolaevna PYATNITSKAYA

    2014-01-01

    Full Text Available The article focuses on the formation of one of the fraternal houses of the Virgin Mary Assumption Monastery in Tikhvin (Leningrad region), whose volume-spatial composition developed during the second half of the 17th century. It describes the history of the complex's origin around the Assumption Cathedral of the 16th century and the location of the cell housing within the wooden and stone ensembles. By comparing archival documents with data obtained from field studies, the initial planning and design features of the northern fraternal cells were identified. The research identified the brigades of Tikhvin masons of 1680-1690 who worked on the construction of the building. Fragments of the original architectural decorations and facade colors were found. The research also produced graphic reconstructions, giving an idea not only of the original appearance of the building, but also of the history of its changes.

  8. Load assumption for fatigue design of structures and components counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions which have been measured under operational conditions to spectra or matrices with the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...

  9. The sexual victimization of men in America: new data challenge old assumptions.

    Science.gov (United States)

    Stemple, Lara; Meyer, Ilan H

    2014-06-01

    We assessed 12-month prevalence and incidence data on sexual victimization in 5 federal surveys that the Bureau of Justice Statistics, the Centers for Disease Control and Prevention, and the Federal Bureau of Investigation conducted independently in 2010 through 2012. We used these data to examine the prevailing assumption that men rarely experience sexual victimization. We concluded that federal surveys detect a high prevalence of sexual victimization among men, in many circumstances similar to the prevalence found among women. We identified factors that perpetuate misperceptions about men's sexual victimization: reliance on traditional gender stereotypes, outdated and inconsistent definitions, and methodological sampling biases that exclude inmates. We recommend changes that move beyond regressive gender assumptions, which can harm both women and men.

  10. Heterosexual assumptions in verbal and non-verbal communication in nursing.

    Science.gov (United States)

    Röndahl, Gerd; Innala, Sune; Carlsson, Marianne

    2006-11-01

    This paper reports a study of what lesbian women and gay men had to say, as patients and as partners, about their experiences of nursing in hospital care, and what they regarded as important to communicate about homosexuality and nursing. The social life of heterosexual cultures is based on the assumption that all people are heterosexual, thereby making homosexuality socially invisible. Nurses may assume that all patients and significant others are heterosexual, and these heteronormative assumptions may lead to poor communication that affects nursing quality by leading nurses to ask the wrong questions and make incorrect judgements. A qualitative interview study was carried out in the spring of 2004. Seventeen women and 10 men ranging in age from 23 to 65 years from different parts of Sweden participated. They described 46 experiences as patients and 31 as partners. Heteronormativity was communicated in waiting rooms, in patient documents and when registering for admission, and nursing staff sometimes showed perplexity when an informant deviated from this heteronormative assumption. Informants had often met nursing staff who showed fear of behaving incorrectly, which could lead to a sense of insecurity, thereby impeding further communication. As partners of gay patients, informants felt that they had to deal with heterosexual assumptions more than they did when they were patients, and the consequences were feelings of not being accepted as a 'true' relative, of exclusion and neglect. Almost all participants offered recommendations about how nursing staff could facilitate communication. Heterosexual norms communicated unconsciously by nursing staff contribute to ambivalent attitudes and feelings of insecurity that prevent communication and easily lead to misconceptions. Educational and management interventions, as well as increased communication, could make gay people more visible and thereby encourage openness and awareness by hospital staff of the norms that they

  11. Determination of the optimal periodic maintenance policy under imperfect repair assumption

    OpenAIRE

    Maria Luiza Guerra de Toledo

    2014-01-01

    An appropriate maintenance policy is essential to reduce expenses and risks related to repairable systems failures. The usual assumptions of minimal or perfect repair at failures are not suitable for many real systems, requiring the application of Imperfect Repair models. In this work, the classes Arithmetic Reduction of Age and Arithmetic Reduction of Intensity, proposed by Doyen and Gaudoin (2004), are explored. Likelihood functions for such models are derived, and the parameters are es...

  12. RateMyProfessors.com: Testing Assumptions about Student Use and Misuse

    Science.gov (United States)

    Bleske-Rechek, April; Michels, Kelsey

    2010-01-01

    Since its inception in 1999, the RateMyProfessors.com (RMP.com) website has grown in popularity and, with that, notoriety. In this research we tested three assumptions about the website: (1) Students use RMP.com to either rant or rave; (2) Students who post on RMP.com are different from students who do not post; and (3) Students reward easiness by…

  13. Assumptions in quantitative analyses of health risks of overhead power lines

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, A.; Wardekker, J.A.; Van der Sluijs, J.P. [Department of Science, Technology and Society, Copernicus Institute, Utrecht University, Budapestlaan 6, 3584 CD Utrecht (Netherlands)

    2012-02-15

    One of the major issues hampering the formulation of uncontested policy decisions on contemporary risks is the presence of uncertainties in various stages of the policy cycle. In the literature, different lines are suggested to address the problem of provisional and uncertain evidence. Reflective approaches such as pedigree analysis can be used to explore the quality of evidence when quantification of uncertainties is at stake. One of the issues where the quality of evidence impedes policy making is the case of electromagnetic fields (EMF). In this case, a (statistical) association was suggested with an increased risk of childhood leukaemia in the vicinity of overhead power lines. However, a biophysical mechanism that could support this association has not been found to date. The Dutch government bases its policy concerning overhead power lines on the precautionary principle. For The Netherlands, previous studies have assessed the potential number of extra cases of childhood leukaemia due to the presence of overhead power lines. However, such a quantification of the health risk of EMF entails a (large) number of assumptions, both prior to and in the calculation chain. In this study, these assumptions were prioritized and critically appraised in an expert elicitation workshop, using a pedigree matrix for the characterization of assumptions in assessments. It appeared that assumptions regarded as important in quantifying the health risks show a high value-ladenness. The results show that, given the present state of knowledge, quantification of the health risks of EMF is premature. We consider the current implementation of the precautionary principle by the Dutch government to be adequate.

  14. Risk Pooling, Commitment and Information: An experimental test of two fundamental assumptions

    OpenAIRE

    Abigail Barr

    2003-01-01

    This paper presents rigorous and direct tests of two assumptions, relating to limited commitment and asymmetric information, that currently underpin models of risk pooling. A specially designed economic experiment involving 678 subjects across 23 Zimbabwean villages is used to solve the problems of observability and quantification that have frustrated previous attempts to conduct such tests. I find that more extrinsic commitment is associated with more risk pooling, but that more informat...

  15. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    Checkland, P. and J. Poulter (2006). Learning for Action: A Definitive Account of Soft Systems Methodology and its use for Practitioners, Teachers and ... what Checkland and Poulter's (2006) Soft Systems Methodology alerts us to as differing 'world views'. These are contrasted with assumptions about the causal linkages of the implementation ... the problem and of the population, and the boundary, or limiting conditions, of the effects of the program ...

  16. Bootstrapping realized volatility and realized beta under a local Gaussianity assumption

    DEFF Research Database (Denmark)

    Hounyo, Ulrich

    The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency setting. ... We provide simulations and use empirical data to compare the finite sample accuracy of our new bootstrap confidence intervals for integrated volatility and integrated beta with the existing results....
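
    A heavily simplified Python sketch of the local-Gaussianity idea (not Hounyo's exact procedure, which also studentizes the statistic and covers realized beta): within short blocks volatility is treated as constant and returns as Gaussian, so bootstrap samples can be redrawn as Gaussians with each block's local variance.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def local_gaussian_bootstrap_ci(returns, block=39, n_boot=999, alpha=0.05):
        """Percentile bootstrap interval for integrated volatility: redraw
        returns block by block as i.i.d. Gaussians with the block's local
        variance (local Gaussianity + locally constant volatility)."""
        r = np.asarray(returns, dtype=float)
        n = len(r) - len(r) % block           # trim to whole blocks
        blocks = r[:n].reshape(-1, block)
        local_var = (blocks ** 2).mean(axis=1, keepdims=True)
        rv = float((r[:n] ** 2).sum())        # realized volatility (variance units)
        draws = np.empty(n_boot)
        for b in range(n_boot):
            r_star = np.sqrt(local_var) * rng.standard_normal(blocks.shape)
            draws[b] = (r_star ** 2).sum()
        lo, hi = np.quantile(draws, [alpha / 2, 1 - alpha / 2])
        return rv, (lo, hi)

    # Toy usage: 390 one-minute returns with slowly varying volatility.
    sigma = 0.01 * (1 + 0.5 * np.sin(np.linspace(0, np.pi, 390)))
    print(local_gaussian_bootstrap_ci(sigma * rng.standard_normal(390)))
    ```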

  17. Camera traps and mark-resight models: The value of ancillary data for evaluating assumptions

    Science.gov (United States)

    Parsons, Arielle W.; Simons, Theodore R.; Pollock, Kenneth H.; Stoskopf, Michael K.; Stocking, Jessica J.; O'Connell, Allan F.

    2015-01-01

    Unbiased estimators of abundance and density are fundamental to the study of animal ecology and critical for making sound management decisions. Capture–recapture models are generally considered the most robust approach for estimating these parameters but rely on a number of assumptions that are often violated but rarely validated. Mark-resight models, a form of capture–recapture, are well suited for use with noninvasive sampling methods and allow for a number of assumptions to be relaxed. We used ancillary data from continuous video and radio telemetry to evaluate the assumptions of mark-resight models for abundance estimation on a barrier island raccoon (Procyon lotor) population using camera traps. Our island study site was geographically closed, allowing us to estimate real survival and in situ recruitment in addition to population size. We found several sources of bias due to heterogeneity of capture probabilities in our study, including camera placement, animal movement, island physiography, and animal behavior. Almost all sources of heterogeneity could be accounted for using the sophisticated mark-resight models developed by McClintock et al. (2009b) and this model generated estimates similar to a spatially explicit mark-resight model previously developed for this population during our study. Spatially explicit capture–recapture models have become an important tool in ecology and confer a number of advantages; however, non-spatial models that account for inherent individual heterogeneity may perform nearly as well, especially where immigration and emigration are limited. Non-spatial models are computationally less demanding, do not make implicit assumptions related to the isotropy of home ranges, and can provide insights with respect to the biological traits of the local population.

  19. Understanding the multiple realities of everyday life: basic assumptions in focus-group methodology.

    Science.gov (United States)

    Ivanoff, Synneve Dahlin; Hultberg, John

    2006-06-01

    In recent years, there has been a notable growth in the use of focus groups within occupational therapy. It is important to understand what kind of knowledge focus-group methodology is meant to acquire. The purpose of this article is to create an understanding of the basic assumptions within focus-group methodology from a theory of science perspective in order to elucidate and encourage reflection on the paradigm. This will be done based on a study of contemporary literature. To further the knowledge of basic assumptions the article will focus on the following themes: the focus-group research arena, the foundation and its core components; subjects, the role of the researcher and the participants; activities, the specific tasks and procedures. Focus-group methodology can be regarded as a specific research method within qualitative methodology with its own form of methodological criteria, as well as its own research procedures. Participants construct a framework to make sense of their experiences, and in interaction with others these experiences will be modified, leading to the construction of new knowledge. The role of the group leader is to facilitate a fruitful environment for the meaning to emerge and to ensure that the understanding of the meaning emerges independently of the interpreter. Focus-group methodology thus shares, in the authors' view, some basic assumptions with social constructivism.

  20. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Science.gov (United States)

    Hsu, Anne; Griffiths, Thomas L

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
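
    The contrast between the two assumptions can be seen in a toy Bayesian calculation (a hypothetical two-language setup, not the authors' experimental materials): under strong sampling the likelihood embodies a size principle, so unobserved constructions count as evidence; under weak sampling they do not.

    ```python
    # Two hypothetical languages: H1 allows constructions {a}, H2 allows {a, b}.
    # The learner observes n occurrences of 'a' and never sees 'b'.

    def posterior(n, prior=(0.5, 0.5), strong=True):
        """Posterior over (H1, H2) after n observations of 'a'."""
        if strong:
            # Strong sampling: data drawn from the language's own distribution,
            # here assumed uniform over its constructions (the size principle).
            like1, like2 = 1.0 ** n, 0.5 ** n
        else:
            # Weak sampling: any consistent hypothesis fits the data equally well.
            like1, like2 = 1.0, 1.0
        z = prior[0] * like1 + prior[1] * like2
        return prior[0] * like1 / z, prior[1] * like2 / z

    for n in (1, 5, 10):
        print(n, posterior(n, strong=True), posterior(n, strong=False))
    # Under strong sampling the absence of 'b' increasingly favors H1;
    # under weak sampling the posterior never moves.
    ```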

  1. Combustion Effects in Laser-oxygen Cutting: Basic Assumptions, Numerical Simulation and High Speed Visualization

    Science.gov (United States)

    Zaitsev, Alexander V.; Ermolaev, Grigory V.

    Laser-oxygen cutting is a technological process that is very complicated to describe theoretically. Iron-oxygen combustion plays a leading role in it, making the process highly effective and able to cut thicker plates while, at the same time, producing special types of striations and other defects on the cut surface. In this paper, results of numerical simulation based on elementary assumptions about iron-oxygen combustion are verified with high speed visualization of the laser-oxygen cutting process. On the basis of the assumption that iron oxide loses its protective properties after melting, a simulation of striation formation due to cycles of laser-induced non-self-sustained combustion is proposed. The assumption that the limiting factor of the reaction is oxygen transport from the jet to the cutting front allows the reaction intensity to be calculated by solving the Navier-Stokes and diffusion system in the gas phase. The influence of oxygen purity and pressure is studied theoretically. The results of numerical simulation are examined with high speed visualization of laser-oxygen cutting of 4-20 mm mild steel plates at cutting conditions close to industrial ones.

  2. Efficient Accountable Authority Identity-Based Encryption under Static Complexity Assumptions

    CERN Document Server

    Libert, Benoît

    2008-01-01

    At Crypto'07, Goyal introduced the concept of Accountable Authority Identity-Based Encryption (A-IBE) as a convenient means to reduce the amount of trust in authorities in Identity-Based Encryption (IBE). In this model, if the Private Key Generator (PKG) maliciously re-distributes users' decryption keys, it runs the risk of being caught and prosecuted. Goyal proposed two constructions: a first one based on Gentry's IBE which relies on strong assumptions (such as q-Bilinear Diffie-Hellman Inversion) and a second one resting on the more classical Decision Bilinear Diffie-Hellman (DBDH) assumption but that is too inefficient for practical use. In this work, we propose a new construction that is secure assuming the hardness of the DBDH problem. The efficiency of our scheme is comparable with that of Goyal's main proposal, with the advantage of relying on static assumptions (i.e., assumptions whose strength does not depend on the number of queries allowed to the adversary). By limiting the number of adversarial rewinds i...

  3. Bayesian Mass Estimates of the Milky Way II: The dark and light sides of parameter assumptions

    CERN Document Server

    Eadie, Gwendolyn M

    2016-01-01

    We present mass and mass profile estimates for the Milky Way Galaxy using the Bayesian analysis developed by Eadie et al. (2015b) and using globular clusters (GCs) as tracers of the Galactic potential. The dark matter and GCs are assumed to follow different spatial distributions; we assume power-law model profiles and use the model distribution functions described in Evans et al. (1997) and Deason et al. (2011, 2012a). We explore the relationships between assumptions about model parameters and how these assumptions affect mass profile estimates. We also explore how using subsamples of the GC population beyond certain radii affects mass estimates. After exploring the posterior distributions of different parameter assumption scenarios, we conclude that a conservative estimate of the Galaxy's mass within 125 kpc is $5.22\times10^{11} M_{\odot}$, with a $50\%$ probability region of $(4.79, 5.63) \times10^{11} M_{\odot}$. Extrapolating out to the virial radius, we obtain a virial mass for the Milky Way of $6.82\times10^{...

  4. Differentiating Different Modeling Assumptions in Simulations of MagLIF loads on the Z Generator

    Science.gov (United States)

    Jennings, C. A.; Gomez, M. R.; Harding, E. C.; Knapp, P. F.; Ampleford, D. J.; Hansen, S. B.; Weis, M. R.; Glinsky, M. E.; Peterson, K.; Chittenden, J. P.

    2016-10-01

    Metal liners imploded by a fast rising current ... MagLIF experiments have had some success. While experiments are increasingly well diagnosed, many of the measurements (particularly during stagnation) are time-integrated, limited in spatial resolution, or require additional assumptions to interpret in the context of a structured, rapidly evolving system. As such, in validating MHD calculations, there is the potential for the same observables in the experimental data to be reproduced under different modeling assumptions. Using synthetic diagnostics of the results of different pre-heat, implosion and stagnation simulations run with the Gorgon MHD code, we discuss how the interpretation of typical Z diagnostics relates to more fundamental simulation parameters. We then explore the extent to which different assumptions on instability development, current delivery, high-Z mix into the fuel and initial laser deposition can be differentiated in our existing measurements. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.

  5. Effects of various assumptions on the calculated liquid fraction in isentropic saturated equilibrium expansions

    Science.gov (United States)

    Bursik, J. W.; Hall, R. M.

    1980-01-01

    The saturated equilibrium expansion approximation for two phase flow often involves ideal-gas and latent-heat assumptions to simplify the solution procedure. This approach is well documented by Wegener and Mack and works best at low pressures where deviations from ideal-gas behavior are small. A thermodynamic expression for liquid mass fraction that is decoupled from the equations of fluid mechanics is used to compare the effects of the various assumptions on nitrogen-gas saturated equilibrium expansion flow starting at 8.81 atm, 2.99 atm, and 0.45 atm, which are conditions representative of transonic cryogenic wind tunnels. For the highest pressure case, the entire set of ideal-gas and latent-heat assumptions is shown to be in error by 62 percent for the values of heat capacity and latent heat. An approximation of the exact, real-gas expression is also developed using a constant, two phase isentropic expansion coefficient, which results in an error of only 2 percent for the high pressure case.

  6. What is this Substance? What Makes it Different? Mapping Progression in Students' Assumptions about Chemical Identity

    Science.gov (United States)

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-09-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical analysis relies. We conceive chemical identity as a core crosscutting disciplinary concept which can bring coherence and relevance to chemistry curricula at all educational levels, primary through tertiary. Although chemical identity is not a concept explicitly addressed by traditional chemistry curricula, its understanding can be expected to evolve as students are asked to recognize different types of substances and explore their properties. The goal of this contribution is to characterize students' assumptions about factors that determine chemical identity and to map how core assumptions change with training in the discipline. Our work is based on the review and critical analysis of existing research findings on students' alternative conceptions in chemistry education, and historical and philosophical analyses of chemistry. From this perspective, our analysis contributes to the growing body of research in the area of learning progressions. In particular, it reveals areas in which our understanding of students' ideas about chemical identity is quite robust, but also highlights the existence of major knowledge gaps that should be filled in to better foster student understanding. We provide suggestions in this area and discuss implications for the teaching of chemistry.

  7. The Universality of Intuition an aposteriori Criticize to an apriori Assumption

    Directory of Open Access Journals (Sweden)

    Roohollah Haghshenas

    2015-03-01

    Full Text Available Intuition has a central role in philosophy: the role of arbitrating between different opinions. When a philosopher shows that "intuition" supports his view, he takes this as a good reason in its favor. In contrast, if we show contradictions between intuition and a theory or some implications of it, we think a replacement or at least some revisions would be needed. There are some well-known examples of this role for intuition in many fields of philosophy: the transplant case in ethics, the Chinese nation case in philosophy of mind and the Gettier examples in epistemology. But there is an assumption here: we suppose all people think in the same manner, i.e. we think intuition(s) is universal. Experimental philosophy tries to study this assumption experimentally. This project continues Quine's movement toward the "pursuit of truth" from a naturalistic point of view, making epistemology "a branch of natural science." The work of experimental philosophy shows that in many cases people with different cultural backgrounds respond differently to specific moral or epistemological cases, like the Gettier examples, and thus intuition is not universal. So, many problems that are based on this assumption may be dissolved, may have plural forms for plural cultures, or may be bounded to some specific cultures (Western culture in many cases).

  8. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  9. [The methods selected and the assumption fixed for the population projection by region: the case of Japan since the 1960s (author's transl)].

    Science.gov (United States)

    Kawabe, H

    1983-01-01

    interrelationships between migration and socioeconomic factors which affect migration. It was emphasized that migration rates should be estimated by use of econometric techniques such as the system dynamics model. The careful analysis of the assumptions fixed in 21 cases allowed the authors to conclude that 1 of these projections had not succeeded in obtaining satisfactory results, even when estimated by econometric techniques. This is partly due to the lack of information on the relationships between migration and socioeconomic factors affecting migration, and partly due to the difficulties in estimating the future trend of socioeconomic variables. (author's modified)

  10. Presupuestos hermenéuticos de la teoría comunicacional del derecho de Gregorio Robles | Hermeneutical Assumptions of Gregorio Robles’s Communicational Theory of Law

    Directory of Open Access Journals (Sweden)

    José Antonio Santos Arnaiz

    2017-06-01

    Full Text Available The present paper focuses on the hermeneutical assumptions underlying Gregorio Robles's communicational theory of law, a doctrine that shows a way of overcoming the dichotomy between analytical philosophy and hermeneutics in order to make the language of jurists clearer. To this end, four of the author's works with the strongest hermeneutical imprint are analyzed from a descriptive and critical point of view: Introducción a la Teoría del Derecho, El Derecho como texto, Teoría del Derecho, and Comunicación, lenguaje y derecho. The paper addresses the context in which the author moves, his influences, the plausibility of the theory itself within the analytical-hermeneutical framework, and offers a descriptive and critical analysis of the use of certain concepts from the hermeneutical debate in the communicational theory of law.

  11. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift is critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  13. Modelling N2O dynamics in the engineered N cycle: Observations, assumptions, knowns, and unknowns

    DEFF Research Database (Denmark)

    Smets, Barth F.; Pellicer i Nàcher, Carles; Jensen, Marlene Mark;

    of the main microbial processes responsible for its production and consumption. The conceptualization of these pathways in mathematical models has the potential to become a key tool to increase our understanding of the complex interrelationships within these ecosystems and to develop strategies to minimize the carbon footprint of wastewater treatment plants. Unfortunately, existing model structures are limited to describing the emissions of individual microbial pathways in an attempt to decrease their complexity and facilitate their calibration. The present contribution summarizes the recent developments ...

  14. Adapting forest science, practice, and policy to shifting ground: From steady-state assumptions to dynamic change

    Science.gov (United States)

    Daniel B. Botkin

    2014-01-01

    What forestry needs in the Anthropogenic Era is what has been needed for the past 30 years. The proper methods, theory, and goals have been clear and are available; the failure has been, and continues to be, that our laws, policies, and actions are misdirected because we confuse a truly scientific base with nonscientific beliefs. The result is a confusion of folklore...

  15. Dynamical Supersymmetry Breaking

    CERN Document Server

    Shadmi, Y; Shadmi, Yael; Shirman, Yuri

    2000-01-01

    Supersymmetry is one of the most plausible and theoretically motivated frameworks for extending the Standard Model. However, any supersymmetry in Nature must be a broken symmetry. Dynamical supersymmetry breaking (DSB) is an attractive idea for incorporating supersymmetry into a successful description of Nature. The study of DSB has recently enjoyed dramatic progress, fueled by advances in our understanding of the dynamics of supersymmetric field theories. These advances have allowed for direct analysis of DSB in strongly coupled theories, and for the discovery of new DSB theories, some of which contradict early criteria for DSB. We review these criteria, emphasizing recently discovered exceptions. We also describe, through many examples, various techniques for directly establishing DSB by studying the infrared theory, including both older techniques in regions of weak coupling, and new techniques in regions of strong coupling. Finally, we present a list of representative DSB models, their main properties, an...

  16. One-pot synthesis of tetrazole-1,2,5,6-tetrahydronicotinonitriles and cholinesterase inhibition: Probing the plausible reaction mechanism via computational studies.

    Science.gov (United States)

    Hameed, Abdul; Zehra, Syeda Tazeen; Abbas, Saba; Nisa, Riffat Un; Mahmood, Tariq; Ayub, Khurshid; Al-Rashida, Mariya; Bajorath, Jürgen; Khan, Khalid Mohammed; Iqbal, Jamshed

    2016-04-01

    In the present study, the one-pot synthesis of 1H-tetrazole linked 1,2,5,6-tetrahydronicotinonitriles under solvent-free conditions has been carried out in the presence of tetra-n-butylammonium fluoride trihydrate (TBAF) as catalyst and solvent. Computational studies have been conducted to elaborate two plausible mechanistic pathways of this one-pot reaction. Moreover, the synthesized compounds were screened for inhibition of cholinesterases (acetylcholinesterase and butyrylcholinesterase), which are considered to be major malefactors of Alzheimer's disease (AD), to find lead compounds for further research in AD therapy.

  17. Phylogenetic analysis of NS5B gene of classical swine fever virus isolates indicates plausible Chinese origin of Indian subgroup 2.2 viruses.

    Science.gov (United States)

    Patil, S S; Hemadri, D; Veeresh, H; Sreekala, K; Gajendragad, M R; Prabhudas, K

    2012-02-01

    Twenty-three CSFV isolates recovered from field outbreaks in various parts of India during 2006-2009 were used for genetic analysis in the NS5B region (409 nts). Seventeen of these were studied earlier [16] in the 5'UTR region. Phylogenetic analysis indicated the continued dominance of subgroup 1.1 strains in the country. Detailed analysis of a subgroup 2.2 virus indicated the plausible Chinese origin of this subgroup in India and provided indirect evidence of routes of CSFV movement within South East Asia region.

  18. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Science.gov (United States)

    Tran, Van; McCall, Matthew N.; McMurray, Helene R.; Almudevar, Anthony

    2013-01-01

    Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks (GRN). We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles. PMID:24376454
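
    The linear threshold update at the heart of a TBN fits in a few lines of Python; the weights and thresholds below are an illustrative toy network, not the yeast cell-cycle network analyzed in the paper.

    ```python
    import itertools
    import numpy as np

    # Toy 3-gene threshold Boolean network: gene i is ON at t+1 iff the
    # weighted sum of its regulators' states at t exceeds its threshold.
    W = np.array([[ 0,  1, -1],    # W[i, j]: effect of gene j on gene i;
                  [ 1,  0,  0],    # negative entries model repression, and
                  [-1,  1,  0]])   # W[i, i] = -1 would model self-degradation
    theta = np.zeros(3)

    def step(x):
        """Synchronous linear-threshold update."""
        return (W @ np.asarray(x) > theta).astype(int)

    # The state space is finite, so steady states can be found by enumeration.
    fixed_points = [x for x in itertools.product([0, 1], repeat=3)
                    if tuple(step(x)) == x]
    print(fixed_points)
    ```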

  19. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Directory of Open Access Journals (Sweden)

    Van eTran

    2013-12-01

    Full Text Available Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks. We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles.

  20. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    David B. Flora

    2012-03-01

    Full Text Available We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  1. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis.

    Science.gov (United States)

    Flora, David B; Labrish, Cathy; Chalmers, R Philip

    2012-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy, and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental effort, there is a need for accurate and reliable computational hot spot prediction methods. Compared to supervised hot spot prediction algorithms, semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and has demonstrated better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction that considers all three semi-supervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which are implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms state-of-the-art transductive learning methods.
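    The propagate-then-promote loop described here can be illustrated with a generic graph-based propagator; in the sketch below, scikit-learn's LabelPropagation stands in for the propagation step (it is not the authors' IterPropMCS), and the features, labels, and confidence threshold are all invented for illustration:

```python
import numpy as np
from sklearn.semi_supervised import LabelPropagation

rng = np.random.default_rng(1)

# Synthetic stand-in for structure-based residue features:
# 60 residues, 8 features, only 10 residues labeled
X = rng.normal(size=(60, 8))
X[:30] += 1.5                      # first 30 residues form a distinct cluster
y = np.full(60, -1)                # -1 marks unlabeled residues
y[:5] = 1                          # a few labeled hot spots in that cluster
y[30:35] = 0                       # a few labeled non-hot-spots

model = LabelPropagation(kernel="rbf", gamma=0.5)
model.fit(X, y)

# Per-residue class probabilities; a confidence threshold could be used to
# promote the most confident residues to "labeled" status for a next round,
# mimicking the iterative scheme described in the abstract.
proba = model.predict_proba(X)
confident = np.where(proba.max(axis=1) > 0.9)[0]
print(f"{len(confident)} residues exceed the 0.9 confidence threshold")
```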

  3. On the relevance of assumptions associated with classical factor analytic approaches.

    Science.gov (United States)

    Kasper, Daniel; Unlü, Ali

    2013-01-01

A personal trait, for example a person's cognitive ability, represents a theoretical concept postulated to explain behavior. Interesting constructs are latent, that is, they cannot be observed. Latent variable modeling constitutes a methodology to deal with such hypothetical constructs. Constructs are modeled as random variables and become components of a statistical model. As random variables, they possess a probability distribution in the population of reference. In applications, this distribution is typically assumed to be the normal distribution. The normality assumption may be reasonable in many cases, but there are situations where it cannot be justified. For example, this is true for criterion-referenced tests or for background characteristics of students in large-scale assessment studies. Nevertheless, the normal procedures in combination with classical factor analytic methods are frequently pursued, even though the effects of violating this "implicit" assumption are not clear in general. In a simulation study, we investigate whether classical factor analytic approaches can be instrumental in estimating the factorial structure and properties of the population distribution of a latent personal trait from educational test data when violations of classical assumptions such as those mentioned above are present. The results indicate that a non-normal latent distribution clearly affects the estimation of the distribution of the factor scores and properties thereof. Thus, when the population distribution of a personal trait is assumed to be non-symmetric, we recommend avoiding these factor analytic approaches for estimation of a person's factor score, even though the number of extracted factors and the estimated loading matrix may not be strongly affected. An application to the Progress in International Reading Literacy Study (PIRLS) is given. Comments on possible implications for the Programme for International Student Assessment (PISA) complete the presentation.
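    The simulation design is easy to reproduce in miniature. A sketch (one factor, six indicators; the chi-square-distributed trait and the loading value are arbitrary choices, not the study's conditions) showing how a skewed latent trait distorts the distribution of estimated factor scores:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p, loading = 5000, 6, 0.7

# Skewed latent trait: standardized chi-square(3) instead of normal
f = rng.chisquare(df=3, size=n)
f = (f - f.mean()) / f.std()

# One-factor model: x_j = loading * f + noise
lam = np.full(p, loading)
noise_sd = np.sqrt(1 - loading**2)
X = np.outer(f, lam) + rng.normal(scale=noise_sd, size=(n, p))

# Regression (Thurstone) factor scores: f_hat = lambda' R^{-1} x
R = np.corrcoef(X, rowvar=False)
w = np.linalg.solve(R, lam)
f_hat = (X - X.mean(axis=0)) @ w

print(f"skewness of true factor:      {stats.skew(f):.2f}")
print(f"skewness of estimated scores: {stats.skew(f_hat):.2f}")
# The estimated scores inherit only part of the non-normality,
# illustrating how distributional properties of the trait are distorted.
```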

  4. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

Full Text Available The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever-increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. We tested the assumption that a breakpoint exists, which we term a morbidity tipping point, separating a period of relative health from a subsequent deterioration in health status; an analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point: an ever-increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
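    The "hockey stick regression" mentioned here can be sketched in a few lines; a toy version (synthetic data; the breakpoint function, starting values, and noise level are illustrative choices, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit

def hockey_stick(age, tip, base, slope):
    """Flat at `base` before the tipping point, linear increase after."""
    return base + slope * np.clip(age - tip, 0, None)

rng = np.random.default_rng(3)
age = rng.uniform(20, 80, size=2000)
# Synthetic morbidity index with a true tipping point at 45
morbidity = hockey_stick(age, 45, 2.0, 0.15) + rng.normal(0, 0.5, size=age.size)

params, cov = curve_fit(hockey_stick, age, morbidity, p0=[50, 1.0, 0.1])
tip, base, slope = params
tip_se = np.sqrt(cov[0, 0])
print(f"estimated tipping point: {tip:.1f} years (SE {tip_se:.1f})")
```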

  5. Bayesian Mass Estimates of the Milky Way: The Dark and Light Sides of Parameter Assumptions

    Science.gov (United States)

    Eadie, Gwendolyn M.; Harris, William E.

    2016-10-01

We present mass and mass profile estimates for the Milky Way (MW) Galaxy using the Bayesian analysis developed by Eadie et al. and using globular clusters (GCs) as tracers of the Galactic potential. The dark matter and GCs are assumed to follow different spatial distributions; we assume power-law model profiles and use the model distribution functions described in Evans et al. and Deason et al. We explore the relationships between assumptions about model parameters and how these assumptions affect mass profile estimates. We also explore how using subsamples of the GC population beyond certain radii affects mass estimates. After exploring the posterior distributions of different parameter assumption scenarios, we conclude that a conservative estimate of the Galaxy's mass within 125 kpc is 5.22 × 10^11 M_⊙, with a 50% probability region of (4.79, 5.63) × 10^11 M_⊙. Extrapolating out to the virial radius, we obtain a virial mass for the MW of 6.82 × 10^11 M_⊙ with 50% credible region of (6.06, 7.53) × 10^11 M_⊙ (r_vir = 185 ± 7 kpc). If we consider only the GCs beyond 10 kpc, then the virial mass is 9.02 (5.69, 10.86) × 10^11 M_⊙ (r_vir = 198 +19/-24 kpc). We also arrive at an estimate of the velocity anisotropy parameter β of the GC population, which is β = 0.28 with a 50% credible region of (0.21, 0.35). Interestingly, the mass estimates are sensitive to both the dark matter halo potential and visible matter tracer parameters, but are not very sensitive to the anisotropy parameter.
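    As a rough illustration of the kind of inference involved, here is a deliberately crude toy, not the Eadie et al. machinery: it assumes a point-mass potential, isotropic Gaussian velocities, a flat prior, and invented tracer data, then reads off the median and 50% credible region from a grid posterior:

```python
import numpy as np

rng = np.random.default_rng(4)
G = 4.30e-6  # gravitational constant in kpc (km/s)^2 / Msun

# Toy tracer sample: line-of-sight velocities drawn from an isotropic
# Gaussian whose dispersion follows a point-mass potential,
# sigma^2(r) = G * M / (3 r); a crude stand-in for a real tracer DF
M_true = 7e11
r = rng.uniform(10.0, 125.0, size=70)
v = rng.normal(0.0, np.sqrt(G * M_true / (3 * r)))

# Grid posterior over the total mass M with a flat prior
M_grid = np.linspace(1e11, 2e12, 2000)
dM = M_grid[1] - M_grid[0]
sig2 = G * M_grid[:, None] / (3 * r[None, :])
loglike = -0.5 * (np.log(2 * np.pi * sig2) + v[None, :] ** 2 / sig2).sum(axis=1)
post = np.exp(loglike - loglike.max())
post /= post.sum() * dM

cdf = np.cumsum(post) * dM
med, lo, hi = np.interp([0.5, 0.25, 0.75], cdf, M_grid)
print(f"median M = {med:.2e} Msun, 50% credible region = ({lo:.2e}, {hi:.2e})")
```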

  6. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions.

    Science.gov (United States)

    Flores-Alsina, Xavier; Gernaey, Krist V; Jeppsson, Ulf

    2012-01-01

This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant was compared for a series of model assumptions. Three different model approaches describing BNR are considered. In the reference case, the original model implementations are used to simulate WWTP1 (ASM1 & 3) and WWTP2 (ASM2d). The second set of models includes a reactive settler, which extends the description of the non-reactive TSS sedimentation and transport in the reference case with the full set of ASM processes. Finally, the third set of models is based on including electron acceptor dependency of biomass decay rates for ASM1 (WWTP1) and ASM2d (WWTP2). The results show that incorporation of a reactive settler: (1) increases the hydrolysis of particulates; (2) increases the plant's overall denitrification efficiency by reducing the S(NOx) concentration at the bottom of the clarifier; (3) increases the oxidation of COD compounds; (4) increases X(OHO) and X(ANO) decay; and, finally, (5) increases the growth of X(PAO) and formation of X(PHA,Stor) for ASM2d, which has a major impact on the whole P removal system. Introduction of electron acceptor dependent decay leads to a substantial increase in the concentration of X(ANO), X(OHO) and X(PAO) at the bottom of the clarifier. The paper ends with a critical discussion of the influence of the different model assumptions, and emphasizes the need for a model user to understand the significant differences in simulation results that are obtained when applying different combinations of 'standard' models.
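    The third model assumption, electron-acceptor-dependent decay, amounts to a switching function on the decay rate. A minimal toy of that mechanism (the rate values, half-saturation constant, and aeration pattern are illustrative, not calibrated ASM parameters):

```python
from scipy.integrate import solve_ivp

# Heterotroph decay whose rate drops under anoxic conditions
b_aerobic, eta_anoxic = 0.4, 0.5          # 1/d, and anoxic reduction factor

def oxygen(t):
    return 2.0 if (t % 1.0) < 0.5 else 0.0  # alternating aeration, mg O2/L

def decay(t, X, K_O=0.2):
    S_O = oxygen(t)
    # Monod-style switching: full decay when O2 is present, reduced otherwise
    b = b_aerobic * (S_O / (K_O + S_O) + eta_anoxic * K_O / (K_O + S_O))
    return -b * X

sol = solve_ivp(decay, (0, 10), [1000.0])   # X in g COD/m3
print(f"biomass after 10 d: {sol.y[0, -1]:.0f} g COD/m3")
```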

  7. From the lab to the world: The paradigmatic assumption and the functional cognition of avian foraging

    Institute of Scientific and Technical Information of China (English)

    Danielle SULIKOWSKI; Darren BURKE

    2015-01-01

Mechanisms of animal learning and memory were traditionally studied without reference to niche-specific functional considerations. More recently, ecological demands have informed such investigations, most notably with respect to foraging in birds. In parallel, behavioural ecologists, primarily concerned with functional optimization, have begun to consider the role of mechanistic factors, including cognition, to explain apparent deviations from optimal predictions. In the present paper we discuss the application of laboratory-based constructs and paradigms of cognition to the real-world challenges faced by avian foragers. We argue that such applications have been handicapped by what we term the 'paradigmatic assumption': the assumption that a given laboratory paradigm maps well enough onto a congruent cognitive mechanism (or cognitive ability) to justify conflation of the two. We present evidence against the paradigmatic assumption and suggest that to achieve a profitable integration between function and mechanism, with respect to animal cognition, a new conceptualization of cognitive mechanisms, functional cognition, is required. This new conceptualization should define cognitive mechanisms based on the informational properties of the animal's environment and the adaptive challenges faced. Cognitive mechanisms must be examined in settings that mimic the important aspects of the natural environment, using customized tasks designed to probe defined aspects of the mechanisms' operation. We suggest that this approach will facilitate investigations of the functional and evolutionary relevance of cognitive mechanisms, as well as the patterns of divergence, convergence and specialization of cognitive mechanisms within and between species [Current Zoology 61 (2): 328-340, 2015].

8. Assumption- versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2016-08-04

For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies go unnoticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of their reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, techniques that integrate primary data on biodiversity occurrence with remotely sensed data summarizing environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively, providing a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.
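    The quantitative evaluation step the authors emphasize is straightforward to sketch for a data-driven distribution model. In the toy below, the environmental layers and occurrence data are synthetic, and LogisticRegression is just a stand-in for any niche-modeling algorithm:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)

# Stand-ins for remotely sensed environmental layers at 1000 sites:
# e.g., temperature, precipitation, elevation (synthetic, standardized)
env = rng.normal(size=(1000, 3))
# Synthetic presence/absence driven by two of the three variables
logit = 1.5 * env[:, 0] - 2.0 * env[:, 2]
present = rng.random(1000) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(env, present, random_state=0)
sdm = LogisticRegression().fit(X_tr, y_tr)

# The key step: quantitative evaluation on data held out from fitting
auc = roc_auc_score(y_te, sdm.predict_proba(X_te)[:, 1])
print(f"held-out AUC: {auc:.2f}")
```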

  9. Tests of data quality, scaling assumptions, and reliability of the Danish SF-36

    DEFF Research Database (Denmark)

    Bjorner, J B; Damsgaard, M T; Watt, T;

    1998-01-01

    We used general population data (n = 4084) to examine data completeness, response consistency, tests of scaling assumptions, and reliability of the Danish SF-36 Health Survey. We compared traditional multitrait scaling analyses to analyses using polychoric correlations and Spearman correlations...... discriminant validity, equal item-own scale correlations, and equal variances) were satisfactory in the total sample and in all subgroups. The SF-36 could discriminate between levels of health in all subgroups, but there were skewness, kurtosis, and ceiling effects in many subgroups (elderly people and people...
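    The multitrait scaling logic, that each item should correlate more strongly with its own scale (corrected for overlap) than with competing scales, can be sketched as follows (synthetic items; the trait correlation and loadings are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500

# Two synthetic scales of 4 items each, driven by correlated traits
t = rng.multivariate_normal([0, 0], [[1, .4], [.4, 1]], size=n)
items_a = t[:, [0]] * 0.7 + rng.normal(scale=0.7, size=(n, 4))
items_b = t[:, [1]] * 0.7 + rng.normal(scale=0.7, size=(n, 4))

# Each item of scale A vs. its own scale (without the item itself,
# i.e., corrected for overlap) and vs. the other scale
for j in range(4):
    own = items_a[:, [k for k in range(4) if k != j]].sum(axis=1)
    other = items_b.sum(axis=1)
    r_own = np.corrcoef(items_a[:, j], own)[0, 1]
    r_other = np.corrcoef(items_a[:, j], other)[0, 1]
    print(f"item {j}: r_own = {r_own:.2f}, r_other = {r_other:.2f}")
```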

  10. Science with the Square Kilometer Array: Motivation, Key Science Projects, Standards and Assumptions

    CERN Document Server

    Carilli, C

    2004-01-01

The Square Kilometer Array (SKA) represents the next major, and natural, step in radio astronomical facilities, providing a two-orders-of-magnitude increase in collecting area over existing telescopes. In a series of meetings, starting in Groningen, the Netherlands (August 2002) and culminating in a 'science retreat' in Leiden (November 2003), the SKA International Science Advisory Committee (ISAC) conceived of, and carried out, a complete revision of the SKA science case (to appear in New Astronomy Reviews). This preface includes: (i) general introductory material, (ii) summaries of the key science programs, and (iii) a detailed listing of the standards and assumptions used in the revised science case.

  11. Condition for Energy Efficient Watermarking with Random Vector Model without WSS Assumption

    CERN Document Server

    Yan, Bin; Guo, Yinjing

    2009-01-01

Energy efficient watermarking preserves the watermark energy after a linear attack as much as possible. In this letter, we consider non-stationary signal models and derive conditions for energy efficient watermarking under a random vector model without the wide-sense stationarity (WSS) assumption. We find that the covariance matrix of the energy efficient watermark should be proportional to the host covariance matrix to best resist optimal linear removal attacks. For WSS processes, our result reduces to the well-known power spectrum condition. An intuitive geometric interpretation of the results is also discussed, which in turn provides a simpler proof of the main results.
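    The stated condition is easy to demonstrate numerically. A minimal sketch (the host covariance construction and the proportionality constant are arbitrary): draw the watermark from a covariance proportional to the host's and verify the proportionality empirically.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 8

# Non-stationary host: an arbitrary (non-Toeplitz) covariance matrix
A = rng.normal(size=(n, n))
C_host = A @ A.T + n * np.eye(n)

# The condition from the letter: watermark covariance proportional
# to the host covariance, C_w = alpha * C_x
alpha = 0.01
L = np.linalg.cholesky(alpha * C_host)
w = L @ rng.normal(size=n)          # one watermark realization

# Sanity check over many realizations: the sample covariance of the
# watermark should match alpha * C_x (deviation should be small)
W = L @ rng.normal(size=(n, 20000))
C_w_hat = W @ W.T / 20000
print("max deviation from alpha * C_x:",
      np.abs(C_w_hat - alpha * C_host).max())
```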

  12. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... discusses how to create a Colored Petri Nets (CPN) model that formally expresses the following elements in a clearly separated structure: (1) assumptions about the behavior of the environment of the component, (2) real-time requirements for the component, and (3) a possible solution in terms of an algorithm...

  13. Local conservation scores without a priori assumptions on neutral substitution rates

    Directory of Open Access Journals (Sweden)

    Hagenauer Joachim

    2008-04-01

Full Text Available Background: Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, and conservation is estimated by measuring the deviation of a particular genomic region from this rate. Consequently, it is a reasonable assumption that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates. Results: We propose a conservation signal that is produced by local Maximum Likelihood estimation of evolutionary parameters using an optimized sliding window, and present a Kullback-Leibler projection that allows multiple different estimated parameters to be transformed into a conservation measure. This conservation measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions on the properties of the conserved regions are imposed. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns from phastCons are observed. As opposed to standard methods, KuLCons can be extended to more complex evolutionary models, e.g., taking insertion and deletion events into account, and corresponding results show that scores obtained under this model can diverge significantly from scores using the simpler model.
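    A toy version of the sliding-window idea, with a Bernoulli mismatch model standing in for the full evolutionary model and the Kullback-Leibler projection reduced to a scalar divergence (the window size, rates, and region layout are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy alignment summary: per-column mismatch indicator across a region,
# with a conserved island in the middle
p_bg, p_cons = 0.3, 0.05
mismatch = rng.random(600) < p_bg
mismatch[250:350] = rng.random(100) < p_cons

def kl_bernoulli(p, q, eps=1e-9):
    p, q = np.clip(p, eps, 1 - eps), np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

# Local ML estimate of the substitution probability in a sliding window,
# scored by KL divergence from the local estimate to the region-wide
# background estimate; no fixed neutral rate is assumed
win = 50
p_hat_bg = mismatch.mean()
scores = [
    kl_bernoulli(mismatch[i:i + win].mean(), p_hat_bg)
    for i in range(len(mismatch) - win)
]
print(f"peak score at window start {int(np.argmax(scores))}")
```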

  14. Untested assumptions: psychological research and credibility assessment in legal decision-making

    Directory of Open Access Journals (Sweden)

    Jane Herlihy

    2015-05-01

    Full Text Available Background: Trauma survivors often have to negotiate legal systems such as refugee status determination or the criminal justice system. Methods & results: We outline and discuss the contribution which research on trauma and related psychological processes can make to two particular areas of law where complex and difficult legal decisions must be made: in claims for refugee and humanitarian protection, and in reporting and prosecuting sexual assault in the criminal justice system. Conclusion: There is a breadth of psychological knowledge that, if correctly applied, would limit the inappropriate reliance on assumptions and myth in legal decision-making in these settings. Specific recommendations are made for further study.

15. What is a god? Metatheistic assumptions in Old Testament Yahwism(s)

    Directory of Open Access Journals (Sweden)

    J W Gericke

    2006-09-01

Full Text Available In this article, the author provides a prolegomenon to further research attempting to answer a most fundamental and basic question, much more so than what has thus far been the case in the disciplines of Old Testament theology and history of Israelite religion. It concerns the implicit assumptions in the Hebrew Bible's discourse about the fundamental nature of deity. In other words, the question is not, 'What is YHWH like?' but rather, 'What, according to the Old Testament texts, is a god?'

  16. Bases, Assumptions, and Results of the Flowsheet Calculations for the Decision Phase Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.; Jacobs, R.A.; Taylor, G.A.; Durate, O.E.; Paul, P.K.; Elder, H.H.; Pike, J.A.; Fowler, J.R.; Rutland, P.L.; Gregory, M.V.; Smith III, F.G.; Hang, T.; Subosits, S.G.; Campbell, S.G.

    2001-03-26

    The High Level Waste (HLW) Salt Disposition Systems Engineering Team was formed on March 13, 1998, and chartered to identify options, evaluate alternatives, and recommend a selected alternative(s) for processing HLW salt to a permitted wasteform. This requirement arises because the existing In-Tank Precipitation process at the Savannah River Site, as currently configured, cannot simultaneously meet the HLW production and Authorization Basis safety requirements. This engineering study was performed in four phases. This document provides the technical bases, assumptions, and results of this engineering study.

  17. Identification of the bkdAB gene cluster, a plausible source of the starter-unit for virginiamycin M production in Streptomyces virginiae.

    Science.gov (United States)

    Pulsawat, Nattika; Kitani, Shigeru; Kinoshita, Hiroshi; Lee, Chang Kwon; Nihira, Takuya

    2007-06-01

    The bkdAB gene cluster, which encodes plausible E1 and E2 components of the branched-chain alpha-keto acid dehydrogenase (BCDH) complex, was isolated from Streptomyces virginiae in the vicinity of a regulatory island for virginiamycin production. Gene disruption of bkdA completely abolished the production of virginiamycin M (a polyketide-peptide antibiotic), while the production of virginiamycin S (a cyclodepsipeptide antibiotic) was unaffected. Complementation of the bkdA disruptant by genome-integration of intact bkdA completely restored the virginiamycin M production, indicating that the bkdAB cluster is essential for virginiamycin M biosynthesis, plausibly via the provision of isobutyryl-CoA as a primer unit. In contrast to a feature usually seen in the Streptomyces E1 component, namely, the separate encoding of the alpha and beta subunits, S. virginiae bkdA seemed to encode the fused form of the alpha and beta subunits, which was verified by the actual catalytic activity of the fused protein in vitro using recombinant BkdA overexpressed in Escherichia coli. Supply of an additional bkdA gene under the strong and constitutive promoter ermE* in the wild-type strain of S. virginiae resulted in enhanced production of virginiamycin M, suggesting that the supply of isobutyryl-CoA is one of the rate-limiting factors in the biosynthesis of virginiamycin M.

18. Analysis of multi-domain hypothetical proteins containing iron-sulphur clusters and FAD ligands reveals Rieske dioxygenase activity, suggesting their plausible roles in bioremediation.

    Science.gov (United States)

    Sathyanarayanan, Nitish; Nagendra, Holenarasipur Gundurao

    2012-01-01

'Conserved hypothetical' proteins pose a challenge not just for functional genomics, but also to biology in general. As long as there are hundreds of conserved proteins with unknown function in model organisms such as Escherichia coli, Bacillus subtilis or Saccharomyces cerevisiae, any discussion towards a 'complete' understanding of these biological systems will remain wishful thinking. In silico approaches show great promise in attempts to appreciate the plausible roles of these hypothetical proteins. The majority of genomic proteins (two-thirds in unicellular organisms and more than 80% in metazoa) are multi-domain proteins, created as a result of gene duplication events. Aromatic ring-hydroxylating dioxygenases, also called Rieske dioxygenases (RDOs), are a class of multi-domain proteins that catalyze the initial step in microbial aerobic degradation of many aromatic compounds. The investigations here address the computational characterization of hypothetical proteins containing ferredoxin and flavodoxin signatures. A consensus sequence for each class of oxidoreductase was obtained by a phylogenetic analysis, involving clustering methods based on evolutionary relationships. A synthetic sequence was developed by combining the consensus sequences, which was used as the basis to search for their homologs via BLAST. The exercise yielded 129 multi-domain hypothetical proteins containing both 2Fe-2S (ferredoxin) and FNR (flavodoxin) domains. In the current study, 40 proteins with an N-terminal 2Fe-2S domain and a C-terminal FNR domain are characterized through homology modelling and docking exercises, which suggest dioxygenase activity, indicating their plausible roles in the degradation of aromatic moieties.
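    The consensus-sequence step can be sketched with a simple per-column majority vote (the fragments below are invented, and a real pipeline would work from a proper multiple alignment rather than pre-aligned toy strings):

```python
from collections import Counter

def consensus(aligned_seqs, threshold=0.5):
    """Majority-vote consensus over the columns of a set of equal-length,
    pre-aligned sequences; 'X' where no residue dominates."""
    out = []
    for col in zip(*aligned_seqs):
        residue, count = Counter(col).most_common(1)[0]
        out.append(residue if count / len(aligned_seqs) >= threshold else "X")
    return "".join(out)

# Toy ferredoxin-like fragments (hypothetical sequences, for illustration)
seqs = ["CPTCGACLA", "CPSCGACIA", "CPTCGSCLA"]
print(consensus(seqs))  # -> CPTCGACLA
```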

  19. Examining Assumptions and Limitations of Research on the Effects of Emerging Technologies for Teaching and Learning in Higher Education

    Science.gov (United States)

    Kirkwood, Adrian; Price, Linda

    2013-01-01

    This paper examines assumptions and beliefs underpinning research into educational technology. It critically reviews some approaches used to investigate the impact of technologies for teaching and learning. It focuses on comparative studies, performance comparisons and attitudinal studies to illustrate how under-examined assumptions lead to…

  20. The Assumption of Proportional Components when Candecomp Is Applied to Symmetric Matrices in the Context of INDSCAL

    Science.gov (United States)

    Dosse, Mohammed Bennani; Berge, Jos M. F.

    2008-01-01

    The use of Candecomp to fit scalar products in the context of INDSCAL is based on the assumption that the symmetry of the data matrices involved causes the component matrices to be equal when Candecomp converges. Ten Berge and Kiers gave examples where this assumption is violated for Gramian data matrices. These examples are believed to be local…
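    The assumption can be probed numerically: build symmetric (Gramian) slices, run plain Candecomp ALS with independent component matrices, and check whether they coincide at convergence. A sketch assuming a standard CP-ALS formulation (all sizes and data are invented); with random starts it typically, though, per the cited counterexamples, not necessarily, recovers equal matrices:

```python
import numpy as np

rng = np.random.default_rng(8)
I, K, R = 6, 4, 2

# Symmetric (Gramian) slices X_k = A diag(c_k) A^T, as in INDSCAL
A_true = rng.normal(size=(I, R))
C_true = rng.uniform(0.5, 1.5, size=(K, R))
X = np.einsum('ir,kr,jr->ijk', A_true, C_true, A_true)

def unfold(T, mode):
    # Kolda-style mode-n unfolding
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order='F')

def khatri_rao(U, V):
    # column-wise Kronecker product
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

# Plain Candecomp ALS with *independent* matrices for the two symmetric modes
A = rng.normal(size=(I, R))
B = rng.normal(size=(I, R))
C = rng.normal(size=(K, R))
for _ in range(500):
    A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
    B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
    C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))

# The assumption under scrutiny: does symmetry force A = B at convergence?
# Compare the two after fixing CP's column scale/sign indeterminacy.
A_n = A / np.linalg.norm(A, axis=0)
B_n = B / np.linalg.norm(B, axis=0)
sign = np.sign(np.sum(A_n * B_n, axis=0))
print("max |A - B| after matching scale and sign:",
      np.abs(A_n - B_n * sign).max())
```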