WorldWideScience

Sample records for plausible systems-level theory

  1. Physiological Plausibility and Boundary Conditions of Theories of Risk Sensitivity

    DEFF Research Database (Denmark)

    Marchiori, Davide; Elqayam, Shira

    2012-01-01

    ... and physiological underpinnings of one of the central topics in judgment and decision-making (JDM) research – choice behavior in decisions from experience. Y&T successfully contribute to this goal by demonstrating a novel effect: losses increase experimental participants' arousal as measured by pupil dilatation, which in turn positively correlates with risk-averse behavior. They hypothesize that participants' attention is increased in decision problems involving losses, which trigger an innate prudent behavior in situations entailing danger and/or hazard. Interestingly, Y&T find that the nature of attention is not selective, i.e., when losses are present, participants are shown to devote more attention to the task as a whole rather than to the single negative outcomes, in contrast to Prospect Theory's loss aversion.

  2. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible.

    Science.gov (United States)

    Raab, Marius Hans; Auer, Nikolas; Ortlieb, Stefan A; Carbon, Claus-Christian

    2013-01-01

    Reptile prime ministers and flying Nazi saucers: extreme and sometimes off-the-wall conclusions are typical ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not so far been examined in an experimental design. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deem most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story's plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany's most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  3. Quantum theory as plausible reasoning applied to data obtained by robust experiments.

    Science.gov (United States)

    De Raedt, H; Katsnelson, M I; Michielsen, K

    2016-05-28

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent, and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out, yields, without introducing any concept of quantum theory, the quantum theoretical description of the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments in terms of the Schrödinger or the Pauli equation. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is, common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).

  4. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    Science.gov (United States)

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  5. On the Biological Plausibility of Grandmother Cells: Implications for Neural Network Theories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.

    2009-01-01

    A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog"), that is, coded with their own dedicated…

  6. Pathways to plausibility

    DEFF Research Database (Denmark)

    Wahlberg, Ayo

    2008-01-01

    Herbal medicine has long been contrasted to modern medicine in terms of a holistic approach to healing, vitalistic theories of health and illness and an emphasis on the body's innate self-healing capacities. At the same time, since the early 20th century, the cultivation, preparation and mass production of herbal medicines have become increasingly industrialised, scientificised and commercialised. What is more, phytochemical efforts to identify and isolate particular 'active ingredients' from whole-plant extracts have intensified, often in response to increasing regulatory scrutiny of the safety and quality of herbal medicinal products. In this paper, I examine whether describing these developments in terms of a biomedical 'colonisation' of herbal medicine, as has been common, allows us to sufficiently account for the mundane collaborative efforts of herbalists, botanists, phytochemists...

  7. Application of plausible reasoning to AI-based control systems

    Science.gov (United States)

    Berenji, Hamid; Lum, Henry, Jr.

    1987-01-01

    Some current approaches to plausible reasoning in artificial intelligence are reviewed and discussed. Some of the most significant recent advances in plausible and approximate reasoning are examined. A synergism among the techniques of uncertainty management is advocated, and brief discussions on the certainty factor approach, probabilistic approach, Dempster-Shafer theory of evidence, possibility theory, linguistic variables, and fuzzy control are presented. Some extensions to these methods are described, and the applications of the methods are considered.
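
    To make the combination of uncertain evidence concrete, here is a minimal sketch of Dempster's rule of combination, one of the techniques the survey reviews. The fault hypotheses and the two sensors' mass assignments are hypothetical, chosen only for illustration.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule: intersect focal elements and renormalise by
        1 - K, where K is the mass falling on empty (conflicting) sets."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y
        if conflict >= 1.0:
            raise ValueError("total conflict: sources are incompatible")
        return {h: v / (1.0 - conflict) for h, v in combined.items()}

    # Two sensors assign mass over fault hypotheses {A, B, C} (hypothetical).
    m_sensor1 = {frozenset("A"): 0.6, frozenset("AB"): 0.3, frozenset("ABC"): 0.1}
    m_sensor2 = {frozenset("A"): 0.5, frozenset("BC"): 0.3, frozenset("ABC"): 0.2}
    print(dempster_combine(m_sensor1, m_sensor2))
    ```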

  8. Optimality and Plausibility in Language Design

    Directory of Open Access Journals (Sweden)

    Michael R. Levot

    2016-12-01

    The Minimalist Program in generative syntax has been the subject of much rancour, a good proportion of it stoked by Noam Chomsky’s suggestion that language may represent “a ‘perfect solution’ to minimal design specifications.” A particular flash point has been the application of Minimalist principles to speculations about how language evolved in the human species. This paper argues that Minimalism is well supported as a plausible approach to language evolution. It is claimed that an assumption of minimal design specifications like that employed in MP syntax satisfies three key desiderata of evolutionary and general scientific plausibility: Physical Optimism, Rational Optimism, and Darwin’s Problem. In support of this claim, the methodologies employed in MP to maximise parsimony are characterised through an analysis of recent theories in Minimalist syntax, and those methodologies are defended with reference to practices and arguments from evolutionary biology and other natural sciences.

  9. Heuristic Elements of Plausible Reasoning.

    Science.gov (United States)

    Dudczak, Craig A.

    At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…

  10. Plausible values in statistical inference

    NARCIS (Netherlands)

    Marsman, M.

    2014-01-01

    In Chapter 2 it is shown that the marginal distribution of plausible values is a consistent estimator of the true latent variable distribution, and, furthermore, that convergence is monotone in an embedding in which the number of items tends to infinity. This result is used to clarify some of the
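
    As a rough illustration of how plausible values work (a sketch, not Marsman's embedding argument), the following draws one plausible value per person from a gridded posterior under a Rasch model and checks that the marginal distribution of the draws tracks the true latent distribution. The model, prior, and all numbers are assumptions made for the demo.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_persons, n_items = 2000, 30
    theta = rng.normal(0.0, 1.0, n_persons)        # true latent values
    b = rng.normal(0.0, 1.0, n_items)              # Rasch item difficulties
    x = rng.binomial(1, 1 / (1 + np.exp(-(theta[:, None] - b))))  # responses

    # Per-person posterior of theta on a grid, standard-normal prior.
    grid = np.linspace(-4, 4, 161)                               # (G,)
    p = 1 / (1 + np.exp(-(grid[:, None] - b[None, :])))          # (G, n_items)
    loglik = x @ np.log(p).T + (1 - x) @ np.log(1 - p).T         # (n_persons, G)
    logpost = loglik - 0.5 * grid**2
    post = np.exp(logpost - logpost.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)

    # One plausible value per person = one draw from that posterior.
    pv = np.array([rng.choice(grid, p=row) for row in post])
    print(pv.mean(), pv.std())  # marginal of the PVs tracks the N(0, 1) population
    ```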

  11. System level ESD protection

    CERN Document Server

    Vashchenko, Vladislav

    2014-01-01

    This book addresses key aspects of analog integrated circuits and systems design related to system level electrostatic discharge (ESD) protection.  It is an invaluable reference for anyone developing systems-on-chip (SoC) and systems-on-package (SoP), integrated with system-level ESD protection. The book focuses on both the design of semiconductor integrated circuit (IC) components with embedded, on-chip system level protection and IC-system co-design. The readers will be enabled to bring the system level ESD protection solutions to the level of integrated circuits, thereby reducing or completely eliminating the need for additional, discrete components on the printed circuit board (PCB) and meeting system-level ESD requirements. The authors take a systematic approach, based on IC-system ESD protection co-design. A detailed description of the available IC-level ESD testing methods is provided, together with a discussion of the correlation between IC-level and system-level ESD testing methods. The IC-level ESD...

  12. Plausibility orderings in dynamic games

    NARCIS (Netherlands)

    Perea ý Monsuwé, A.

    2014-01-01

    In this paper we explore game-theoretic reasoning in dynamic games within the framework of belief revision theory. More precisely, we focus on the forward induction concept of ‘common strong belief in rationality’ (Battigalli and Siniscalchi (2002)) and the backward induction concept of ‘common

  13. Bisimulation for Single-Agent Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; van Ditmarsch, H.

    2013-01-01

    We define a proper notion of bisimulation, and prove that bisimulation corresponds to logical equivalence on image-finite models. We relate our results to other epistemic notions, such as safe belief and degrees of belief. Our results imply that there are only finitely many non-bisimilar single-agent epistemic plausibility models on a finite set of propositions. This gives decidability for single-agent epistemic plausibility planning.

  14. Anatomically Plausible Surface Alignment and Reconstruction

    DEFF Research Database (Denmark)

    Paulsen, Rasmus R.; Larsen, Rasmus

    2010-01-01

    With the increasing clinical use of 3D surface scanners, there is a need for accurate and reliable algorithms that can produce anatomically plausible surfaces. In this paper, a combined method for surface alignment and reconstruction is proposed. It is based on an implicit surface representation...

  15. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.
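
    The Bayesian segmentation strategy investigated in the paper is not reproduced here, but the core mechanism of scoring candidate segmentations under a probabilistic lexicon can be sketched with a unigram Viterbi segmenter. The lexicon and its probabilities are hypothetical; real models learn them from phonemically transcribed child-directed speech.

    ```python
    import math

    # Toy unigram lexicon with made-up probabilities.
    lexicon = {"the": 0.15, "dog": 0.05, "dogs": 0.03, "see": 0.04,
               "a": 0.10, "cat": 0.05, "sat": 0.03, "on": 0.06, "mat": 0.02}

    def segment(utterance):
        """Most probable segmentation under a unigram model: Viterbi over
        character positions; substrings outside the lexicon are disallowed."""
        n = len(utterance)
        best = [(-math.inf, None)] * (n + 1)
        best[0] = (0.0, None)
        for end in range(1, n + 1):
            for start in range(end):
                word = utterance[start:end]
                if word in lexicon and best[start][0] > -math.inf:
                    score = best[start][0] + math.log(lexicon[word])
                    if score > best[end][0]:
                        best[end] = (score, start)
        if best[n][0] == -math.inf:
            return None
        words, end = [], n
        while end > 0:                      # backtrace the best path
            start = best[end][1]
            words.append(utterance[start:end])
            end = start
        return list(reversed(words))

    print(segment("thedogsatonthemat"))  # ['the', 'dog', 'sat', 'on', 'the', 'mat']
    ```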

  16. Plausibility and evidence: the case of homeopathy.

    Science.gov (United States)

    Rutten, Lex; Mathie, Robert T; Fisher, Peter; Goossens, Maria; van Wassenhoven, Michel

    2013-08-01

    Homeopathy is controversial and hotly debated. The conclusions of systematic reviews of randomised controlled trials of homeopathy vary from 'comparable to conventional medicine' to 'no evidence of effects beyond placebo'. It is claimed that homeopathy conflicts with scientific laws and that homoeopaths reject the naturalistic outlook, but no evidence has been cited. We are homeopathic physicians and researchers who do not reject the scientific outlook; we believe that examination of the prior beliefs underlying this enduring stand-off can advance the debate. We show that interpretations of the same set of evidence--for homeopathy and for conventional medicine--can diverge. Prior disbelief in homeopathy is rooted in the perceived implausibility of any conceivable mechanism of action. Using the 'crossword analogy', we demonstrate that plausibility bias impedes assessment of the clinical evidence. Sweeping statements about the scientific impossibility of homeopathy are themselves unscientific: scientific statements must be precise and testable. There is growing evidence that homeopathic preparations can exert biological effects; due consideration of such research would reduce the influence of prior beliefs on the assessment of systematic review evidence.

  17. A cognitively plausible model for grammar induction

    Directory of Open Access Journals (Sweden)

    Roni Katzir

    2015-01-01

    This paper aims to bring theoretical linguistics and cognition-general theories of learning into closer contact. I argue that linguists' notions of rich UGs are well-founded, but that cognition-general learning approaches are viable as well and that the two can and should co-exist and support each other. Specifically, I use the observation that any theory of UG provides a learning criterion -- the total memory space used to store a grammar and its encoding of the input -- that supports learning according to the principle of Minimum Description Length. This mapping from UGs to learners maintains a minimal ontological commitment: the learner for a particular UG uses only what is already required to account for linguistic competence in adults. I suggest that such learners should be our null hypothesis regarding the child's learning mechanism, and that, furthermore, the mapping from theories of UG to learners provides a framework for comparing theories of UG.
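
    The learning criterion sketched in the abstract can be stated compactly in Minimum Description Length terms (the notation here is assumed, not quoted from the paper): the learner selects, within the class of grammars UG makes available, the grammar minimizing the storage for the grammar itself plus the input as encoded by it.

    ```latex
    \hat{G} \;=\; \operatorname*{arg\,min}_{G \,\in\, \mathrm{UG}}
        \big[\, L(G) \;+\; L(D \mid G) \,\big]
    ```

    Here L(G) is the encoding length of grammar G and L(D | G) the length of the observed input D encoded with G; different theories of UG induce different learners by changing the class of candidate grammars and their encoding.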

  18. Analytic models of plausible gravitational lens potentials

    International Nuclear Information System (INIS)

    Baltz, Edward A.; Marshall, Phil; Oguri, Masamune

    2009-01-01

    Gravitational lenses on galaxy scales are plausibly modelled as having ellipsoidal symmetry and a universal dark matter density profile, with a Sérsic profile to describe the distribution of baryonic matter. Predicting all lensing effects requires knowledge of the total lens potential: in this work we give analytic forms for that of the above hybrid model. Emphasising that complex lens potentials can be constructed from simpler components in linear combination, we provide a recipe for attaining elliptical symmetry in either projected mass or lens potential. We also provide analytic formulae for the lens potentials of Sérsic profiles for integer and half-integer index. We then present formulae describing the gravitational lensing effects due to smoothly-truncated universal density profiles in the cold dark matter model. For our isolated haloes the density profile falls off as radius to the minus fifth or seventh power beyond the tidal radius, functional forms that allow all orders of lens potential derivatives to be calculated analytically, while ensuring a non-divergent total mass. We show how the observables predicted by this profile differ from those of the original infinite-mass NFW profile. Expressions for the gravitational flexion are highlighted. We show how decreasing the tidal radius allows stripped haloes to be modelled, providing a framework for a fuller investigation of dark matter substructure in galaxies and clusters. Finally we remark on the need for finite-mass halo profiles when doing cosmological ray-tracing simulations, and the need for readily-calculable higher-order derivatives of the lens potential when studying catastrophes in strong lenses.
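
    For reference, a smoothly truncated NFW-like density with the falloff the abstract describes can be written as below; the specific truncation factor and notation are assumptions consistent with the stated behaviour, not formulas quoted from the paper.

    ```latex
    \rho(r) \;=\; \frac{\rho_s}{(r/r_s)\,(1+r/r_s)^{2}}
                  \left(\frac{r_t^{2}}{r^{2}+r_t^{2}}\right)^{\!n},
    \qquad n \in \{1,\,2\}
    ```

    The NFW factor falls as r^{-3} at large radius and the truncation factor contributes a further r^{-2n}, so the density falls as r^{-5} (n = 1) or r^{-7} (n = 2) beyond the tidal radius r_t, which is what keeps the total mass finite.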

  19. Resolution of cosmological singularity and a plausible mechanism of the big bang

    International Nuclear Information System (INIS)

    Choudhury, D.C.

    2002-01-01

    The initial cosmological singularity in the framework of the general theory of relativity is resolved by introducing the effect of the uncertainty principle of quantum theory without violating conventional laws of physics. A plausible account of the mechanism of the big bang, analogous to that of a nuclear explosion, is given, and the currently accepted Planck temperature of ≅10^32 K at the beginning of the big bang is predicted.

  20. Space elevator systems level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Laubscher, B. E. (Bryan E.)

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development, which in turn implies a high level of risk for the SE. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair and climber technology.

  1. System-level musings about system-level science (Invited)

    Science.gov (United States)

    Liu, W.

    2009-12-01

    In teleology, a system has a purpose. In physics, a system has a tendency. For example, a mechanical system has a tendency to lower its potential energy. A thermodynamic system has a tendency to increase its entropy. Therefore, if geospace is seen as a system, what is its tendency? Surprisingly or not, there is no simple answer to this question. Or, to flip the statement, the answer is complex, or complexity. We can understand generally why complexity arises, as the geospace boundary is open to influences from the solar wind and Earth’s atmosphere and components of the system couple to each other in a myriad of ways to make the systemic behavior highly nonlinear. But this still begs the question: What is the system-level approach to geospace science? A reductionist view might assert that as our understanding of a component or subsystem progresses to a certain point, we can couple some together to understand the system on a higher level. However, in practice, a subsystem can almost never be observed in isolation from others. Even if such is possible, there is no guarantee that the subsystem behavior will not change when coupled to others. Hence, there is no guarantee that a subsystem, such as the ring current, has an innate and intrinsic behavior like a hydrogen atom. An absolutist conclusion from this logic can be sobering, as one would have to trace a flash of aurora to the nucleosynthesis in the solar core. The practical answer, however, is more promising; it is a mix of the common sense we call reductionism and awareness that, especially when strongly coupled, subsystems can experience behavioral changes, breakdowns, and catastrophes. If the stock answer to the systemic tendency of geospace is complexity, the objective of the system-level approach to geospace science is to define, measure, and understand this complexity. I will use the example of magnetotail dynamics to illuminate some key points in this talk.

  2. Resolution of Cosmological Singularity and a Plausible Mechanism of the Big Bang

    OpenAIRE

    Choudhury, D. C.

    2001-01-01

    The initial cosmological singularity in the framework of the general theory of relativity is resolved by introducing the effect of the uncertainty principle of quantum theory without violating conventional laws of physics. A plausible account of the mechanism of the big bang, analogous to that of a nuclear explosion, is given and the currently accepted Planck temperature of about 10^(32) K at the beginning of the big bang is predicted.

  3. Pilgrims sailing the Titanic: plausibility effects on memory for misinformation.

    Science.gov (United States)

    Hinze, Scott R; Slaten, Daniel G; Horton, William S; Jenkins, Ryan; Rapp, David N

    2014-02-01

    People rely on information they read even when it is inaccurate (Marsh, Meade, & Roediger, Journal of Memory and Language 49:519-536, 2003), but how ubiquitous is this phenomenon? In two experiments, we investigated whether this tendency to encode and rely on inaccuracies from text might be influenced by the plausibility of misinformation. In Experiment 1, we presented stories containing inaccurate plausible statements (e.g., "The Pilgrims' ship was the Godspeed"), inaccurate implausible statements (e.g., . . . the Titanic), or accurate statements (e.g., . . . the Mayflower). On a subsequent test of general knowledge, participants relied significantly less on implausible than on plausible inaccuracies from the texts but continued to rely on accurate information. In Experiment 2, we replicated these results with the addition of a think-aloud procedure to elicit information about readers' noticing and evaluative processes for plausible and implausible misinformation. Participants indicated more skepticism and less acceptance of implausible than of plausible inaccuracies. In contrast, they often failed to notice, completely ignored, and at times even explicitly accepted the misinformation provided by plausible lures. These results offer insight into the conditions under which reliance on inaccurate information occurs and suggest potential mechanisms that may underlie reported misinformation effects.

  4. SPAR thermal analysis processors reference manual, system level 16. Volume 1: Program executive. Volume 2: Theory. Volume 3: Demonstration problems. Volume 4: Experimental thermal element capability. Volume 5: Programmer reference

    Science.gov (United States)

    Marlowe, M. B.; Moore, R. A.; Whetstone, W. D.

    1979-01-01

    User instructions are given for performing linear and nonlinear steady state and transient thermal analyses with SPAR thermal analysis processors TGEO, SSTA, and TRTA. It is assumed that the user is familiar with basic SPAR operations and basic heat transfer theory.

  5. Of paradox and plausibility: the dynamic of change in medical law.

    Science.gov (United States)

    Harrington, John

    2014-01-01

    This article develops a model of change in medical law. Drawing on systems theory, it argues that medical law participates in a dynamic of 'deparadoxification' and 'reparadoxification' whereby the underlying contingency of the law is variously concealed through plausible argumentation, or revealed by critical challenge. Medical law is, thus, thoroughly rhetorical. An examination of the development of the law on abortion and on the sterilization of incompetent adults shows that plausibility is achieved through the deployment of substantive common sense and formal stylistic devices. It is undermined where these elements are shown to be arbitrary and constructed. In conclusion, it is argued that the politics of medical law are constituted by this antagonistic process of establishing and challenging provisionally stable normative regimes. © The Author [2014]. Published by Oxford University Press; all rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Searching for Plausible N-k Contingencies Endangering Voltage Stability

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Van Cutsem, Thierry

    2017-01-01

    This paper presents a novel search algorithm using time-domain simulations to identify plausible N − k contingencies endangering voltage stability. Starting from an initial list of disturbances, progressively more severe contingencies are investigated. After simulation of an N − k contingency, the simulation results are assessed. If the system response is unstable, a plausible harmful contingency sequence has been found. Otherwise, components affected by the contingencies are considered as candidate next events leading to N − (k + 1) contingencies. This implicitly takes into account hidden failures...
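
    A minimal sketch of such a search loop is given below, with the time-domain simulator and the "affected components" logic stubbed out as hypothetical callables. It mirrors the structure described in the abstract, not the authors' implementation.

    ```python
    from itertools import count

    def search_nk(initial_contingencies, simulate, affected, max_k=3):
        """Breadth-first search for plausible harmful N-k sequences.

        simulate(seq) -> (stable: bool, response)   # time-domain simulation
        affected(seq, response) -> components disturbed by seq, which become
        candidate next events (the plausibility restriction from the paper).
        """
        harmful = []
        frontier = [(c,) for c in initial_contingencies]
        for k in count(1):
            if k > max_k or not frontier:
                break
            next_frontier = []
            for seq in frontier:
                stable, response = simulate(seq)
                if not stable:
                    harmful.append(seq)      # plausible harmful N-k sequence
                else:                        # extend to N-(k+1) candidates
                    next_frontier += [seq + (c,) for c in affected(seq, response)
                                      if c not in seq]
            frontier = next_frontier
        return harmful

    # Toy usage with stub models (purely illustrative):
    demo_affected = {("L1",): ["L2"], ("L1", "L2"): ["L3"]}
    simulate = lambda seq: (len(seq) < 3, None)      # unstable at k = 3
    affected = lambda seq, resp: demo_affected.get(seq, [])
    print(search_nk(["L1"], simulate, affected))     # [('L1', 'L2', 'L3')]
    ```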

  7. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    Science.gov (United States)

    2017-09-14

    The dissertation addresses uncertainty arising from the model parameters that are inputs to the computer model (mathematical model) but whose exact values are unknown to experimentalists, and develops methods for computing plausibility exceedance probabilities.

  8. Endocrine disrupting chemicals and human health: The plausibility ...

    African Journals Online (AJOL)

    The plausibility of research results on DDT and reproductive health ... chemicals in the environment and that human health is inextricably linked to the health of ... periods of folliculo-genesis or embryo-genesis that increase risk for adverse effects.

  9. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

    The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem level analyses and models reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  10. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provide...
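
    As a tiny worked example of the belief-network style of inference the book formalizes, the following computes an exact posterior by enumeration on a three-variable network; the network and the conditional probabilities are illustrative, not taken from the book.

    ```python
    from itertools import product

    # Network: Rain -> Sprinkler, and (Rain, Sprinkler) -> Wet grass.
    # CPT numbers are made up for the demo.
    def p_rain(r):
        return 0.2 if r else 0.8

    def p_sprinkler(s, r):          # sprinkler use depends on rain
        p = 0.01 if r else 0.4
        return p if s else 1 - p

    def p_wet(w, r, s):             # grass wet given rain and sprinkler
        p = {(True, True): 0.99, (True, False): 0.8,
             (False, True): 0.9, (False, False): 0.0}[(r, s)]
        return p if w else 1 - p

    def joint(r, s, w):
        return p_rain(r) * p_sprinkler(s, r) * p_wet(w, r, s)

    # Posterior P(Rain | Wet = True) by summing out the hidden variable.
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
    print(f"P(rain | wet) = {num / den:.3f}")
    ```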

  11. Generation of Plausible Hurricane Tracks for Preparedness Exercises

    Science.gov (United States)

    2017-04-25

    ... MATLAB and leverages HURDAT2 to construct data-driven statistical models that can generate plausible yet never-before-seen storm behaviors. The Storm level models the number of waypoints M, birth and death locations w1 and wM, and total number of steps L. The Stage level models the ... product kernel. KDE with a beta kernel generates maximum sustained winds, and linear regression simulates minimum central pressure. Maximum significant ...
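
    A rough sketch of the KDE-resampling idea, in Python rather than MATLAB, with a Gaussian kernel standing in for the beta and product kernels, and synthetic stand-in data in place of parsed HURDAT2 records:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)

    # Stand-in for HURDAT2-derived waypoints: (lon, lat, max wind) triples.
    historical = np.column_stack([
        rng.normal(-75, 8, 500),    # longitude (deg)
        rng.normal(25, 5, 500),     # latitude (deg)
        rng.gamma(4, 12, 500),      # max sustained wind (kt)
    ]).T                            # gaussian_kde expects shape (dims, n)

    kde = gaussian_kde(historical)  # Gaussian kernel, for illustration only
    synthetic = kde.resample(10).T  # ten plausible, never-observed points
    print(synthetic[:3])
    ```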

  12. Credibility judgments of narratives: language, plausibility, and absorption.

    Science.gov (United States)

    Nahari, Galit; Glicksohn, Joseph; Nachson, Israel

    2010-01-01

    Two experiments were conducted in order to find out whether textual features of narratives differentially affect credibility judgments made by judges having different levels of absorption (a disposition associated with rich visual imagination). Participants in both experiments were exposed to a textual narrative and requested to judge whether the narrator actually experienced the event he described in his story. In Experiment 1, the narrative varied in terms of language (literal, figurative) and plausibility (ordinary, anomalous). In Experiment 2, the narrative varied in terms of language only. The participants' perceptions of the plausibility of the story described and the extent to which they were absorbed in reading were measured. The data from both experiments together suggest that the groups applied entirely different criteria in credibility judgments. For high-absorption individuals, their credibility judgment depends on the degree to which the text can be assimilated into their own vivid imagination, whereas for low-absorption individuals it depends mainly on plausibility. That is, high-absorption individuals applied an experiential mental set while judging the credibility of the narrator, whereas low-absorption individuals applied an instrumental mental set. Possible cognitive mechanisms and implications for credibility judgments are discussed.

  13. System level ESD co-design

    CERN Document Server

    Gossner, Harald

    2015-01-01

    An effective and cost efficient protection of electronic system against ESD stress pulses specified by IEC 61000-4-2 is paramount for any system design. This pioneering book presents the collective knowledge of system designers and system testing experts and state-of-the-art techniques for achieving efficient system-level ESD protection, with minimum impact on the system performance. All categories of system failures ranging from ‘hard’ to ‘soft’ types are considered to review simulation and tool applications that can be used. The principal focus of System Level ESD Co-Design is defining and establishing the importance of co-design efforts from both IC supplier and system builder perspectives. ESD designers often face challenges in meeting customers' system-level ESD requirements and, therefore, a clear understanding of the techniques presented here will facilitate effective simulation approaches leading to better solutions without compromising system performance. With contributions from Robert Asht...

  14. Morality Principles for Risk Modelling: Needs and Links with the Origins of Plausible Inference

    Science.gov (United States)

    Solana-Ortega, Alberto; Solana, Vicente

    2009-12-01

    In comparison with the foundations of probability calculus, the inescapable and controversial issue of how to assign probabilities has only recently become a matter of formal study. The introduction of information as a technical concept was a milestone, but the most promising entropic assignment methods still face unsolved difficulties, manifesting the incompleteness of plausible inference theory. In this paper we examine the situation faced by risk analysts in the critical field of extreme events modelling, where the former difficulties are especially visible, due to scarcity of observational data, the large impact of these phenomena and the obligation to assume professional responsibilities. To respond to the claim for a sound framework to deal with extremes, we propose a metafoundational approach to inference, based on a canon of extramathematical requirements. We highlight their strong moral content, and show how this emphasis in morality, far from being new, is connected with the historic origins of plausible inference. Special attention is paid to the contributions of Caramuel, a contemporary of Pascal, unfortunately ignored in the usual mathematical accounts of probability.

  15. Neural networks, nativism, and the plausibility of constructivism.

    Science.gov (United States)

    Quartz, S R

    1993-09-01

    Recent interest in PDP (parallel distributed processing) models is due in part to the widely held belief that they challenge many of the assumptions of classical cognitive science. In the domain of language acquisition, for example, there has been much interest in the claim that PDP models might undermine nativism. Related arguments based on PDP learning have also been given against Fodor's anti-constructivist position--a position that has contributed to the widespread dismissal of constructivism. A limitation of many of the claims regarding PDP learning, however, is that the principles underlying this learning have not been rigorously characterized. In this paper, I examine PDP models from within the framework of Valiant's PAC (probably approximately correct) model of learning, now the dominant model in machine learning, and which applies naturally to neural network learning. From this perspective, I evaluate the implications of PDP models for nativism and Fodor's influential anti-constructivist position. In particular, I demonstrate that, contrary to a number of claims, PDP models are nativist in a robust sense. I also demonstrate that PDP models actually serve as a good illustration of Fodor's anti-constructivist position. While these results may at first suggest that neural network models in general are incapable of the sort of concept acquisition that is required to refute Fodor's anti-constructivist position, I suggest that there is an alternative form of neural network learning that demonstrates the plausibility of constructivism. This alternative form of learning is a natural interpretation of the constructivist position in terms of neural network learning, as it employs learning algorithms that incorporate the addition of structure in addition to weight modification schemes. By demonstrating that there is a natural and plausible interpretation of constructivism in terms of neural network learning, the position that nativism is the only plausible model of

  16. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    ... called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  17. Features, Events, and Processes: system Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  18. Features, Events, and Processes: system Level

    International Nuclear Information System (INIS)

    D. McGregor

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760)

  19. The ethical plausibility of the 'Right To Try' laws.

    Science.gov (United States)

    Carrieri, D; Peccatori, F A; Boniolo, G

    2018-02-01

    'Right To Try' (RTT) laws originated in the USA to allow terminally ill patients to request access to early stage experimental medical products directly from the producer, removing the oversight and approval of the Food and Drug Administration. These laws have received significant media attention and almost equally unanimous criticism by the bioethics, clinical and scientific communities. They touch indeed on complex issues such as the conflict between individual and public interest, and the public understanding of medical research and its regulation. The increased awareness around RTT laws means that healthcare providers directly involved in the management of patients with life-threatening conditions such as cancer, infective, or neurologic conditions will deal more frequently with patients' requests of access to experimental medical products. This paper aims to assess the ethical plausibility of the RTT laws, and to suggest some possible ethical tools and considerations to address the main issues they touch. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. On the biological plausibility of Wind Turbine Syndrome.

    Science.gov (United States)

    Harrison, Robert V

    2015-01-01

    An emerging environmental health issue relates to potential ill-effects of wind turbine noise. There have been numerous suggestions that the low-frequency acoustic components in wind turbine signals can cause symptoms associated with vestibular system disorders, namely vertigo, nausea, and nystagmus. This constellation of symptoms has been labeled as Wind Turbine Syndrome, and has been identified in case studies of individuals living close to wind farms. This review discusses whether it is biologically plausible for the turbine noise to stimulate the vestibular parts of the inner ear and, by extension, cause Wind Turbine Syndrome. We consider the sound levels that can activate the semicircular canals or otolith end organs in normal subjects, as well as in those with preexisting conditions known to lower vestibular threshold to sound stimulation.

  1. Plausible scenarios for the radiography profession in Sweden in 2025

    International Nuclear Information System (INIS)

    Björkman, B.; Fridell, K.; Tavakol Olofsson, P.

    2017-01-01

    Introduction: Radiography is a healthcare speciality with many technical challenges. Advances in engineering and information technology applications may continue to drive and be driven by radiographers. The world of diagnostic imaging is changing rapidly and radiographers must be proactive in order to survive. To ensure sustainable development, organisations have to identify future opportunities and threats in a timely manner and incorporate them into their strategic planning. Hence, the aim of this study was to analyse and describe plausible scenarios for the radiography profession in 2025. Method: The study has a qualitative design with an inductive approach based on focus group interviews. The interviews were inspired by the Scenario-Planning method. Results: Of the seven trends identified in a previous study, the radiographers considered two as the most uncertain scenarios that would have the greatest impact on the profession should they occur. These trends, labelled “Access to career advancement” and “A sufficient number of radiographers”, were inserted into the scenario cross. The resulting four plausible future scenarios were: The happy radiographer, the specialist radiographer, the dying profession and the assembly line. Conclusion: It is suggested that “The dying profession” scenario could probably be turned in the opposite direction by facilitating career development opportunities for radiographers within the profession. Changing the direction would probably lead to a profession composed of “happy radiographers” who are specialists, proud of their profession and competent to carry out advanced tasks, in contrast to being solely occupied by “the assembly line”. - Highlights: • The world of radiography is changing rapidly and radiographers must be proactive in order to survive. • Future opportunities and threats should be identified and incorporated into the strategic planning. • Appropriate actions can probably change the

  2. Plausible inference: A multi-valued logic for problem solving

    Science.gov (United States)

    Friedman, L.

    1979-01-01

    A new logic is developed which permits continuously variable strength of belief in the truth of assertions. Four inference rules result, with formal logic as a limiting case. Quantification of belief is defined. Propagation of belief to linked assertions results from dependency-based techniques of truth maintenance so that local consistency is achieved or contradiction discovered in problem solving. Rules for combining, confirming, or disconfirming beliefs are given, and several heuristics are suggested that apply to revising already formed beliefs in the light of new evidence. The strength of belief that results from such revisions based on conflicting evidence is a highly subjective phenomenon. Certain quantification rules appear to reflect an orderliness in the subjectivity. Several examples of reasoning by plausible inference are given, including a legal example and one from robot learning. Propagation of belief takes place in directions forbidden in formal logic and this results in conclusions becoming possible for a given set of assertions that are not reachable by formal logic.
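
    Friedman's quantification rules are not reproduced in the abstract. As a stand-in illustration of how confirming and disconfirming evidence can revise a graded belief, here is the classical MYCIN-style certainty-factor combination (explicitly not the paper's rules):

    ```python
    def cf_combine(cf1, cf2):
        """Classical MYCIN-style combination of two certainty factors in
        [-1, 1]. A stand-in illustration, not Friedman's quantification."""
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)            # mutually confirming
        if cf1 <= 0 and cf2 <= 0:
            return cf1 + cf2 * (1 + cf1)            # mutually disconfirming
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))  # conflicting

    print(cf_combine(0.6, 0.5))    # 0.8  : two confirmations strengthen belief
    print(cf_combine(0.6, -0.4))   # 0.33 : disconfirming evidence weakens it
    ```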

  3. Liderazgo preventivo para la universidad. Una experiencia plausible

    Directory of Open Access Journals (Sweden)

    Alejandro Rodríguez Rodríguez

    2015-06-01

    The development of leadership in higher education seeks immediately applicable solutions for the contexts in which every leader operates, but the theoretical-practical grounding of the leader's formation, which would make it possible to understand the intellective processes at play during decision-making, gets diluted. The paradigm of convergence between the Lonerganian anthropological method, the Vygotskian learning community and a re-reading of the Salesian preventive system is presented as a plausible proposal for formation in preventive leadership among the various actors of a university community. A case study of the Salesian University in Mexico, employing a mixed-methods research design, facilitates a re-reading of leadership from a preventive perspective as a possibility of convergence in an interdisciplinary dialogue. The theoretical-practical results proposed and examined prove to be a useful tool for evaluating, enriching and renewing theory about the leader and leadership development in universities facing a globalised society.

  4. Metal ion binding with dehydroannulenes – Plausible two ...

    Indian Academy of Sciences (India)

    WINTEC

    Theoretical investigations have been carried out at the B3LYP/6-311++G** level of theory to study the binding ... can be discriminated from larger metal ions by running ... Keywords: alkali metals; dehydroannulenes; binding energy; penetration barrier.

  5. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Science.gov (United States)

    Kizach, Johannes; Nyvad, Anne Mette; Christensen, Ken Ramshøj

    2013-01-01

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  6. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Directory of Open Access Journals (Sweden)

    Johannes Kizach

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  7. Hologenomics: Systems-Level Host Biology.

    Science.gov (United States)

    Theis, Kevin R

    2018-01-01

    The hologenome concept of evolution is a hypothesis explaining host evolution in the context of the host microbiomes. As a hypothesis, it needs to be evaluated, especially with respect to the extent of fidelity of transgenerational coassociation of host and microbial lineages and the relative fitness consequences of repeated associations within natural holobiont populations. Behavioral ecologists are in a prime position to test these predictions because they typically focus on animal phenotypes that are quantifiable, conduct studies over multiple generations within natural animal populations, and collect metadata on genetic relatedness and relative reproductive success within these populations. Regardless of the conclusion on the hologenome concept as an evolutionary hypothesis, a hologenomic perspective has applied value as a systems-level framework for host biology, including in medicine. Specifically, it emphasizes investigating the multivarious and dynamic interactions between patient genomes and the genomes of their diverse microbiota when attempting to elucidate etiologies of complex, noninfectious diseases.

  8. System Level Analysis of LTE-Advanced

    DEFF Research Database (Denmark)

    Wang, Yuanye

    This PhD thesis focuses on system level analysis of Multi-Component Carrier (CC) management for Long Term Evolution (LTE)-Advanced. Cases where multiple CCs are aggregated to form a larger bandwidth are studied. The analysis is performed for both local area and wide area networks. In local area ... reduction. Compared to the case of reuse-1, they achieve a gain of 50∼500% in cell edge user throughput, with small or no loss in average cell throughput. For the wide area network, effort is devoted to the downlink of LTE-Advanced. Such a system is assumed to be backwards compatible to LTE release 8, i.e., ... scheme is recommended. It reduces the CQI by 94% at low load, and 79∼93% at medium to high load, with reasonable loss in downlink performance. To reduce the ACK/NACK feedback, multiple ACK/NACKs can be bundled, with slightly degraded downlink throughput.

  9. Compressed sensing along physically plausible sampling trajectories in MRI

    International Nuclear Information System (INIS)

    Chauffert, Nicolas

    2015-01-01

    Magnetic Resonance Imaging (MRI) is a non-invasive and non-ionizing imaging technique that provides images of body tissues, using the contrast sensitivity coming from the magnetic parameters (T_1, T_2 and proton density). Data are acquired in the k-space, corresponding to spatial Fourier frequencies. Because of physical constraints, the displacement in the k-space is subject to kinematic constraints. Indeed, magnetic field gradients and their temporal derivatives are upper bounded. Hence, the scanning time increases with the image resolution. Decreasing scanning time is crucial to improve patient comfort, decrease exam costs, limit image distortions (e.g., created by patient movement), or decrease temporal resolution in functional MRI. Reducing scanning time can be addressed by Compressed Sensing (CS) theory. The latter is a technique that guarantees the perfect recovery of an image from undersampled data in k-space, by assuming that the image is sparse in a wavelet basis. Unfortunately, CS theory cannot be directly cast to the MRI setting. The reasons are: i) acquisition (Fourier) and representation (wavelets) bases are coherent and ii) sampling schemes obtained using CS theorems are composed of isolated measurements and cannot be realistically implemented by magnetic field gradients: the sampling is usually performed along continuous or more regular curves. However, heuristic application of CS in MRI has provided promising results. In this thesis, we aim to develop theoretical tools to apply CS to MRI and other modalities. On the one hand, we propose a variable density sampling theory to answer the first impediment. The more the sample contains information, the more it is likely to be drawn. On the other hand, we propose sampling schemes and design sampling trajectories that fulfill acquisition constraints, while traversing the k-space with the sampling density advocated by the theory. The second point is complex and is thus addressed step by step.
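
    The variable-density idea can be sketched in a few lines: draw k-space locations with a probability that decays away from the centre, so low frequencies are measured densely and high frequencies sparsely. The decay profile below is an arbitrary illustrative choice, not the density advocated in the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 256                         # k-space lines in one dimension
    undersampling = 0.25            # keep 25% of the lines

    # Variable-density profile: low frequencies (most image energy, coarse
    # wavelet coefficients) are sampled much more densely than high ones.
    freq = np.abs(np.arange(n) - n // 2)
    density = 1.0 / (1.0 + freq) ** 2
    density /= density.sum()

    chosen = rng.choice(n, size=int(undersampling * n), replace=False, p=density)
    mask = np.zeros(n, dtype=bool)
    mask[chosen] = True             # acquire only these phase-encode lines
    print(mask.sum(), "of", n, "lines sampled")
    ```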

  10. Quantum theory as plausible reasoning applied to data obtained by robust experiments

    NARCIS (Netherlands)

    De Raedt, H.; Katsnelson, M. I.; Michielsen, K.

    2016-01-01

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are

  11. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  12. Stereotyping to infer group membership creates plausible deniability for prejudice-based aggression.

    Science.gov (United States)

    Cox, William T L; Devine, Patricia G

    2014-02-01

    In the present study, participants administered painful electric shocks to an unseen male opponent who was either explicitly labeled as gay or stereotypically implied to be gay. Identifying the opponent with a gay-stereotypic attribute produced a situation in which the target's group status was privately inferred but plausibly deniable to others. To test the plausible deniability hypothesis, we examined aggression levels as a function of internal (personal) and external (social) motivation to respond without prejudice. Whether plausible deniability was present or absent, participants high in internal motivation aggressed at low levels, and participants low in both internal and external motivation aggressed at high levels. The behavior of participants low in internal and high in external motivation, however, depended on experimental condition. They aggressed at low levels when observers could plausibly attribute their behavior to prejudice and aggressed at high levels when the situation granted plausible deniability. This work has implications for both obstacles to and potential avenues for prejudice-reduction efforts.

  13. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  14. High School Students' Evaluations, Plausibility (Re) Appraisals, and Knowledge about Topics in Earth Science

    Science.gov (United States)

    Lombardi, Doug; Bickel, Elliot S.; Bailey, Janelle M.; Burrell, Shondricka

    2018-01-01

    Evaluation is an important aspect of science and is receiving increasing attention in science education. The present study investigated (1) changes to plausibility judgments and knowledge as a result of a series of instructional scaffolds, called model-evidence link activities, that facilitated evaluation of scientific and alternative models in…

  15. Preview Effects of Plausibility and Character Order in Reading Chinese Transposed Words: Evidence from Eye Movements

    Science.gov (United States)

    Yang, Jinmian

    2013-01-01

    The current paper examined the role of plausibility information in the parafovea for Chinese readers by using two-character transposed words (in which the order of the component characters is reversed but the result is still a word). In two eye-tracking experiments, readers received a preview of a target word that was (1) identical to the target word, (2) a…

  16. The Radical Promise of Reformist Zeal: What Makes "Inquiry for Equity" Plausible?

    Science.gov (United States)

    Lashaw, Amanda

    2010-01-01

    Education reform movements often promise more than they deliver. Why are such promises plausible in light of seemingly perpetual education reform? Drawing on ethnographic fieldwork based in a nonprofit education reform organization, this article explores the appeal of popular notions about "using data to close the racial achievement…

  17. A Distributed Approach to System-Level Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, Indranil

    2012-01-01

    Prognostics, which deals with predicting remaining useful life of components, subsystems, and systems, is a key technology for systems health management that leads to improved safety and reliability with reduced costs. The prognostics problem is often approached from a component-centric view. However, in most cases, it is not specifically component lifetimes that are important, but, rather, the lifetimes of the systems in which these components reside. The system-level prognostics problem can be quite difficult due to the increased scale and scope of the prognostics problem and the relative lack of scalability and efficiency of typical prognostics approaches. In order to address these issues, we develop a distributed solution to the system-level prognostics problem, based on the concept of structural model decomposition. The system model is decomposed into independent submodels. Independent local prognostics subproblems are then formed based on these local submodels, resulting in a scalable, efficient, and flexible distributed approach to the system-level prognostics problem. We provide a formulation of the system-level prognostics problem and demonstrate the approach on a four-wheeled rover simulation testbed. The results show that the system-level prognostics problem can be accurately and efficiently solved in a distributed fashion.
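
    A minimal sketch of the decomposition idea (ours, not the paper's algorithm; the component names and lifetimes are invented): split the structural dependency graph into connected submodels, prognose each submodel locally, and take the earliest local end of life as the system-level estimate.

        def connected_submodels(edges, components):
            # Union-find: components that share a structural dependency end up
            # in the same independent submodel (a connected component).
            parent = {c: c for c in components}
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for a, b in edges:
                parent[find(a)] = find(b)
            groups = {}
            for c in components:
                groups.setdefault(find(c), []).append(c)
            return list(groups.values())

        def system_rul(local_rul, submodels):
            # Each submodel is prognosed independently; the system's remaining
            # useful life is set by whichever submodel reaches end of life first.
            return min(min(local_rul[c] for c in sub) for sub in submodels)

        components = ["wheel_FL", "wheel_FR", "motor_L", "motor_R", "battery"]
        edges = [("wheel_FL", "motor_L"), ("wheel_FR", "motor_R")]
        rul = {"wheel_FL": 120.0, "wheel_FR": 95.0, "motor_L": 200.0,
               "motor_R": 150.0, "battery": 80.0}
        subs = connected_submodels(edges, components)
        print(subs)                   # three independent submodels
        print(system_rul(rul, subs))  # 80.0: the battery limits the system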

  18. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    Science.gov (United States)

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be compared directly using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
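
    To make the two TTB variants concrete, here is a toy sketch (ours, not the authors' formal model; cue names and error rates are invented). Deterministic TTB follows the first discriminating cue outright; the probabilistic version follows its verdict only with probability 1 - error, with error probabilities ordered by cue rank as in the abstract's sequential-processing account.

        import random

        def take_the_best(cues_a, cues_b, cue_order, errors=None, rng=random):
            # Inspect cues in descending validity; decide on the first one
            # that discriminates between the two options.
            for rank, cue in enumerate(cue_order):
                if cues_a[cue] != cues_b[cue]:
                    verdict = "A" if cues_a[cue] > cues_b[cue] else "B"
                    if errors is None or rng.random() < 1 - errors[rank]:
                        return verdict
                    return "B" if verdict == "A" else "A"
            return rng.choice(["A", "B"])   # no cue discriminates: guess

        a = {"recognition": 1, "size": 0, "capital": 1}
        b = {"recognition": 1, "size": 1, "capital": 0}
        order = ["recognition", "size", "capital"]
        print(take_the_best(a, b, order))                             # deterministic: B
        print(take_the_best(a, b, order, errors=[0.05, 0.10, 0.15]))  # B in ~90% of runs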

  19. Epidemiologic studies of occupational pesticide exposure and cancer: regulatory risk assessments and biologic plausibility.

    Science.gov (United States)

    Acquavella, John; Doe, John; Tomenson, John; Chester, Graham; Cowell, John; Bloemen, Louis

    2003-01-01

    Epidemiologic studies frequently show associations between self-reported use of specific pesticides and human cancers. These findings have engendered debate largely on methodologic grounds. However, biologic plausibility is a more fundamental issue that has received only superficial attention. The purpose of this commentary is to review briefly the toxicology and exposure data that are developed as part of the pesticide regulatory process and to discuss the applicability of these data to epidemiologic research. The authors also provide a generic example of how worker pesticide exposures might be estimated and compared to relevant toxicologic dose levels. This example provides guidance for better characterization of exposure and for consideration of biologic plausibility in epidemiologic studies of pesticides.

  20. L’Analyse du Risque Géopolitique: du Plausible au Probable

    OpenAIRE

    Adib Bencherif

    2015-01-01

    This paper explores the logical process behind risk analysis, particularly in geopolitics. The main goal is to demonstrate the ambiguities behind risk calculation and to highlight the continuum between plausibility and probability in risk analysis. To demonstrate this, the author introduces two notions: the inference of abduction, often neglected in the social sciences literature, and the Bayesian calculation. Inspired by the works of Louise Amoore, this paper tries to go further by ...
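
    The Bayesian calculation that moves a hypothesis along the continuum from plausible to probable is compact enough to spell out. A minimal sketch, with invented numbers rather than an example from the paper:

        def bayes_update(prior, p_e_given_h, p_e_given_not_h):
            # Posterior P(H | E) by Bayes' rule: the step that turns a merely
            # plausible hypothesis into a graded probability as evidence arrives.
            joint_h = prior * p_e_given_h
            return joint_h / (joint_h + (1 - prior) * p_e_given_not_h)

        # A risk hypothesis held at a 10% prior; an observed indicator is
        # three times likelier under the hypothesis than without it.
        print(bayes_update(0.10, 0.60, 0.20))   # -> 0.25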

  1. On Metaphysical Cases against Political Theories

    NARCIS (Netherlands)

    M. Buitenhuis (Manuel)

    2014-01-01

    This paper considers arguments against particular political theories on metaphysical grounds. These arguments contain the implicit premise that political theories are only viable if they are grounded in a plausible metaphysical theory. This thesis was called the…

  2. Design for testability and diagnosis at the system-level

    Science.gov (United States)

    Simpson, William R.; Sheppard, John W.

    1993-01-01

    The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information-theoretic and dependency-modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature makes the approach applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program), which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnosis, and the STAMP analysis approach, as well as a few STAMP applications.
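
    The dependency-modeling half of this approach is easy to miniaturize. Below is a toy sketch (ours; the abstract does not disclose STAMP's actual algorithms, and the test and unit names are invented): each test exercises a set of functional units, a passing test exonerates its units, and fault isolation, under a single-fault assumption, intersects the dependency sets of the failing tests.

        def isolate_faults(dependencies, outcomes):
            # A passing test exonerates every unit it depends on; a failing
            # test implicates its dependency set. Candidates are the
            # intersection of the failed sets minus the exonerated units.
            exonerated, suspects = set(), None
            for test, passed in outcomes.items():
                if passed:
                    exonerated |= dependencies[test]
                else:
                    suspects = (set(dependencies[test]) if suspects is None
                                else suspects & dependencies[test])
            return (suspects or set()) - exonerated

        deps = {"t_input":  {"ADC"},
                "t_fft":    {"ADC", "FFT"},
                "t_output": {"ADC", "FFT", "DAC"}}
        print(isolate_faults(deps, {"t_input": True, "t_fft": False, "t_output": False}))
        # -> {'FFT'}: the FFT block is suspect; the ADC is exonerated by t_input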

  3. A biologically plausible transform for visual recognition that is invariant to translation, scale and rotation

    Directory of Open Access Journals (Sweden)

    Pavel Sountsov

    2011-11-01

    Full Text Available Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled or rotated.

  4. A Biologically Plausible Transform for Visual Recognition that is Invariant to Translation, Scale, and Rotation.

    Science.gov (United States)

    Sountsov, Pavel; Santucci, David M; Lisman, John E

    2011-01-01

    Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled, or rotated.
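
    The two records above describe the same transform. A classical relative, the Fourier-Mellin construction, shows in a few lines how iterated spectral analysis plus logarithmic mapping yields the claimed invariances; the sketch below is that textbook relative (our illustration, not the authors' model). The FFT magnitude discards translation, log-polar resampling turns scale and rotation into shifts, and a second FFT magnitude discards those shifts.

        import numpy as np

        def fourier_mellin_signature(img, n_r=64, n_theta=64):
            # |FFT| removes translation.
            mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
            cy, cx = mag.shape[0] / 2, mag.shape[1] / 2
            # Log-polar resampling: scaling becomes a radial shift,
            # rotation becomes an angular shift.
            r = np.logspace(0, np.log10(min(cy, cx) - 1), n_r)
            theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
            ys = (cy + np.outer(r, np.sin(theta))).astype(int)
            xs = (cx + np.outer(r, np.cos(theta))).astype(int)
            logpolar = mag[ys, xs]          # nearest-neighbour sampling
            # A second |FFT| magnitude removes those shifts, leaving a
            # translation-, scale- and rotation-invariant signature.
            return np.abs(np.fft.fft2(logpolar))

        img = np.zeros((128, 128))
        img[40:70, 50:90] = 1.0             # toy "object"
        signature = fourier_mellin_signature(img)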

  5. System-level modeling of acetone-butanol-ethanol fermentation.

    Science.gov (United States)

    Liao, Chen; Seo, Seung-Oh; Lu, Ting

    2016-05-01

    Acetone-butanol-ethanol (ABE) fermentation is a metabolic process of clostridia that produces bio-based solvents including butanol. It is enabled by an underlying metabolic reaction network and modulated by cellular gene regulation and environmental cues. Mathematical modeling has served as a valuable strategy to facilitate the understanding, characterization and optimization of this process. In this review, we highlight recent advances in system-level, quantitative modeling of ABE fermentation. We begin with an overview of integrative processes underlying the fermentation. Next we survey modeling efforts including early simple models, models with a systematic metabolic description, and those incorporating metabolism through simple gene regulation. Particular focus is given to a recent system-level model that integrates the metabolic reactions, gene regulation and environmental cues. We conclude by discussing the remaining challenges and future directions towards predictive understanding of ABE fermentation. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. System-level Modeling of Wireless Integrated Sensor Networks

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Hansen, Knud; Madsen, Jan

    2005-01-01

    Wireless integrated sensor networks have emerged as a promising infrastructure for a new generation of monitoring and tracking applications. In order to efficiently utilize the extremely limited resources of wireless sensor nodes, accurate modeling of the key aspects of wireless sensor networks...... is necessary so that system-level design decisions can be made about the hardware and the software (applications and real-time operating system) architecture of sensor nodes. In this paper, we present a SystemC-based abstract modeling framework that enables system-level modeling of sensor network behavior...... by modeling the applications, real-time operating system, sensors, processor, and radio transceiver at the sensor node level and environmental phenomena, including radio signal propagation, at the sensor network level. We demonstrate the potential of our modeling framework by simulating and analyzing a small...

  7. Neural correlates of early-closure garden-path processing: Effects of prosody and plausibility.

    Science.gov (United States)

    den Ouden, Dirk-Bart; Dickey, Michael Walsh; Anderson, Catherine; Christianson, Kiel

    2016-01-01

    Functional magnetic resonance imaging (fMRI) was used to investigate neural correlates of early-closure garden-path sentence processing and use of extrasyntactic information to resolve temporary syntactic ambiguities. Sixteen participants performed an auditory picture verification task on sentences presented with natural versus flat intonation. Stimuli included sentences in which the garden-path interpretation was plausible, implausible because of a late pragmatic cue, or implausible because of a semantic mismatch between an optionally transitive verb and the following noun. Natural sentence intonation was correlated with left-hemisphere temporal activation, but also with activation that suggests the allocation of more resources to interpretation when natural prosody is provided. Garden-path processing was associated with upregulation in bilateral inferior parietal and right-hemisphere dorsolateral prefrontal and inferior frontal cortex, while differences between the strength and type of plausibility cues were also reflected in activation patterns. Region of interest (ROI) analyses in regions associated with complex syntactic processing are consistent with a role for posterior temporal cortex supporting access to verb argument structure. Furthermore, ROI analyses within left-hemisphere inferior frontal gyrus suggest a division of labour, with the anterior-ventral part primarily involved in syntactic-semantic mismatch detection, the central part supporting structural reanalysis, and the posterior-dorsal part showing a general structural complexity effect.

  8. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
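
    Of the two swarm components, the PSO half is the easier to miniaturize. A generic sketch (ours, not the paper's implementation; the quadratic toy objective stands in for the RNN reconstruction error, and all hyperparameters are illustrative):

        import numpy as np

        def pso(objective, dim, n_particles=30, iters=200,
                w=0.7, c1=1.5, c2=1.5, seed=0):
            # Minimal particle swarm optimization over continuous parameters.
            rng = np.random.default_rng(seed)
            x = rng.uniform(-1, 1, (n_particles, dim))   # positions (parameters)
            v = np.zeros_like(x)                         # velocities
            pbest = x.copy()
            pbest_f = np.array([objective(p) for p in x])
            g = pbest[pbest_f.argmin()].copy()           # global best
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = x + v
                f = np.array([objective(p) for p in x])
                better = f < pbest_f
                pbest[better], pbest_f[better] = x[better], f[better]
                g = pbest[pbest_f.argmin()].copy()
            return g, pbest_f.min()

        # Toy stand-in for the RNN reconstruction error on expression data.
        best, err = pso(lambda p: np.sum((p - 0.5) ** 2), dim=5)
        print(best, err)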

  9. Particulate air pollution and increased mortality: Biological plausibility for causal relationship

    International Nuclear Information System (INIS)

    Henderson, R.F.

    1995-01-01

    Recently, a number of epidemiological studies have concluded that ambient particulate exposure is associated with increased mortality and morbidity at PM concentrations well below those previously thought to affect human health. These studies have been conducted in several different geographical locations and have involved a range of populations. While the consistency of the findings and the presence of an apparent concentration response relationship provide a strong argument for causality, epidemiological studies can only conclude this based upon inference from statistical associations. The biological plausibility of a causal relationship between low concentrations of PM and daily mortality and morbidity rates is neither intuitively obvious nor expected based on past experimental studies on the toxicity of inhaled particles. Chronic toxicity from inhaled, poorly soluble particles has been observed based on the slow accumulation of large lung burdens of particles, not on small daily fluctuations in PM levels. Acute toxicity from inhaled particles is associated mainly with acidic particles and is observed at much higher concentrations than those observed in the epidemiology studies reporting an association between PM concentrations and morbidity/mortality. To approach the difficult problem of determining if the association between PM concentrations and daily morbidity and mortality is biologically plausible and causal, one must consider (1) the chemical and physical characteristics of the particles in the inhaled atmospheres, (2) the characteristics of the morbidity/mortality observed and the people who are affected, and (3) potential mechanisms that might link the two

  10. Biologically plausible learning in neural networks: a lesson from bacterial chemotaxis.

    Science.gov (United States)

    Shimansky, Yury P

    2009-12-01

    Learning processes in the brain are usually associated with plastic changes made to optimize the strength of connections between neurons. Although many details related to biophysical mechanisms of synaptic plasticity have been discovered, it is unclear how the concurrent performance of adaptive modifications in a huge number of spatial locations is organized to minimize a given objective function. Since direct experimental observation of even a relatively small subset of such changes is not feasible, computational modeling is an indispensable investigation tool for solving this problem. However, the conventional method of error back-propagation (EBP) employed for optimizing synaptic weights in artificial neural networks is not biologically plausible. This study based on computational experiments demonstrated that such optimization can be performed rather efficiently using the same general method that bacteria employ for moving closer to an attractant or away from a repellent. With regard to neural network optimization, this method consists of regulating the probability of an abrupt change in the direction of synaptic weight modification according to the temporal gradient of the objective function. Neural networks utilizing this method (regulation of modification probability, RMP) can be viewed as analogous to swimming in the multidimensional space of their parameters in the flow of biochemical agents carrying information about the optimality criterion. The efficiency of RMP is comparable to that of EBP, while RMP has several important advantages. Since the biological plausibility of RMP is beyond a reasonable doubt, the RMP concept provides a constructive framework for the experimental analysis of learning in natural neural networks.
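
    The RMP rule itself is simple enough to state in code. A toy sketch of our reading of the abstract (not the author's model; step size, sigmoid gain, and objective are invented): every weight keeps drifting in its current direction, and the probability that a weight reverses direction follows the recent change in the objective, so a worsening objective triggers frequent reversals, much like a bacterium tumbling when the attractant gradient turns against it.

        import numpy as np

        def rmp_optimize(objective, w, steps=5000, eta=0.01, k=50.0, seed=0):
            # Regulation of modification probability: the flip probability is
            # a sigmoid of the temporal change in the objective function.
            rng = np.random.default_rng(seed)
            direction = rng.choice([-1.0, 1.0], size=w.shape)
            prev = objective(w)
            for _ in range(steps):
                w = w + eta * direction
                curr = objective(w)
                p_flip = 1.0 / (1.0 + np.exp(-k * (curr - prev)))
                flips = rng.random(w.shape) < p_flip   # reverse some weights
                direction[flips] *= -1.0
                prev = curr
            return w

        w = rmp_optimize(lambda w: np.sum((w - 1.0) ** 2), np.zeros(4))
        print(w)   # stochastically approaches the optimum at all-ones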

  11. The effect of decentralized behavioral decision making on system-level risk.

    Science.gov (United States)

    Kaivanto, Kim

    2014-12-01

    Certain classes of system-level risk depend partly on decentralized lay decision making. For instance, an organization's network security risk depends partly on its employees' responses to phishing attacks. On a larger scale, the risk within a financial system depends partly on households' responses to mortgage sales pitches. Behavioral economics shows that lay decisionmakers typically depart in systematic ways from the normative rationality of expected utility (EU), and instead display heuristics and biases as captured in the more descriptively accurate prospect theory (PT). In turn, psychological studies show that successful deception ploys eschew direct logical argumentation and instead employ peripheral-route persuasion, manipulation of visceral emotions, urgency, and familiar contextual cues. The detection of phishing emails and inappropriate mortgage contracts may be framed as a binary classification task. Signal detection theory (SDT) offers the standard normative solution, formulated as an optimal cutoff threshold, for distinguishing between good/bad emails or mortgages. In this article, we extend SDT behaviorally by rederiving the optimal cutoff threshold under PT. Furthermore, we incorporate the psychology of deception into determination of SDT's discriminability parameter. With the neo-additive probability weighting function, the optimal cutoff threshold under PT is rendered unique under well-behaved sampling distributions, tractable in computation, and transparent in interpretation. The PT-based cutoff threshold is (i) independent of loss aversion and (ii) more conservative than the classical SDT cutoff threshold. Independently of any possible misalignment between individual-level and system-level misclassification costs, decentralized behavioral decisionmakers are biased toward underdetection, and system-level risk is consequently greater than in analyses predicated upon normative rationality. © 2014 Society for Risk Analysis.
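
    Both the classical SDT threshold and a PT-style counterpart can be found by direct search over cutoffs. The sketch below is a simplified reconstruction, not the paper's derivation: it applies the neo-additive weight to the joint outcome probabilities (a simplification of rank-dependent weighting) and uses standard illustrative prospect-theory parameters.

        import numpy as np
        from scipy.stats import norm

        def best_cutoff(d_prime, p_signal, cost_miss, cost_fa, pt=False,
                        a=0.05, b=0.90, lam=2.25, beta=0.88):
            # Equal-variance Gaussian SDT: noise ~ N(0,1), signal ~ N(d',1).
            # Classical mode minimizes expected cost; PT mode passes loss
            # magnitudes through a power value function and probabilities
            # through a neo-additive weight w(p) = a + b*p.
            w = (lambda p: a + b * p) if pt else (lambda p: p)
            v = (lambda c: lam * c ** beta) if pt else (lambda c: c)
            cuts = np.linspace(-4.0, d_prime + 4.0, 801)
            p_miss = norm.cdf(cuts - d_prime)
            p_fa = norm.sf(cuts)
            subjective_cost = (w(p_signal * p_miss) * v(cost_miss)
                               + w((1 - p_signal) * p_fa) * v(cost_fa))
            return cuts[np.argmin(subjective_cost)]

        print(best_cutoff(2.0, 0.5, 1.0, 4.0))           # classical SDT cutoff
        print(best_cutoff(2.0, 0.5, 1.0, 4.0, pt=True))  # PT-based cutoff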

  12. System level modeling and component level control of fuel cells

    Science.gov (United States)

    Xue, Xingjian

    This dissertation investigates the fuel cell systems and the related technologies in three aspects: (1) system-level dynamic modeling of both PEM fuel cell (PEMFC) and solid oxide fuel cell (SOFC); (2) condition monitoring scheme development of PEM fuel cell system using model-based statistical method; and (3) strategy and algorithm development of precision control with potential application in energy systems. The dissertation first presents a system level dynamic modeling strategy for PEM fuel cells. It is well known that water plays a critical role in PEM fuel cell operations. It makes the membrane function appropriately and improves the durability. The low temperature operating conditions, however, impose modeling difficulties in characterizing the liquid-vapor two phase change phenomenon, which becomes even more complex under dynamic operating conditions. This dissertation proposes an innovative method to characterize this phenomenon, and builds a comprehensive model for PEM fuel cell at the system level. The model features the complete characterization of multi-physics dynamic coupling effects with the inclusion of dynamic phase change. The model is validated using Ballard stack experimental result from open literature. The system behavior and the internal coupling effects are also investigated using this model under various operating conditions. Anode-supported tubular SOFC is also investigated in the dissertation. While the Nernst potential plays a central role in characterizing the electrochemical performance, the traditional Nernst equation may lead to incorrect analysis results under dynamic operating conditions due to the current reverse flow phenomenon. This dissertation presents a systematic study in this regard to incorporate a modified Nernst potential expression and the heat/mass transfer into the analysis. The model is used to investigate the limitations and optimal results of various operating conditions; it can also be utilized to perform the

  13. System-Level Shared Governance Structures and Processes in Healthcare Systems With Magnet®-Designated Hospitals: A Descriptive Study.

    Science.gov (United States)

    Underwood, Carlisa M; Hayne, Arlene N

    The purpose was to identify and describe structures and processes of best practices for system-level shared governance in healthcare systems. Currently, more than 64.6% of US community hospitals are part of a system. System chief nurse executives (SCNEs) are challenged to establish leadership structures and processes that effectively and efficiently disseminate best practices for patients and staff across complex organizations, geographically dispersed locations, and populations. Eleven US healthcare SCNEs from the American Nurses Credentialing Center's repository of Magnet®-designated facilities participated in a 35-question interview based on Kanter's Theory of Organizational Empowerment. Most SCNEs reported the presence of more than 50% of the empowerment structures and processes in system-level shared governance. Despite the difficulties and complexities of growing health systems, SCNEs have replicated empowerment characteristics of hospital shared governance structures and processes at the system level.

  14. Accelerating next generation sequencing data analysis with system level optimizations.

    Science.gov (United States)

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

    Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system level parameters before running the application. We studied GATK HaplotypeCaller, a part of common NGS workflows that consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized via various system level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) over-clocking the default 'on-demand' CPU frequency mode by using 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.

  15. A systems-level approach for investigating organophosphorus pesticide toxicity.

    Science.gov (United States)

    Zhu, Jingbo; Wang, Jing; Ding, Yan; Liu, Baoyue; Xiao, Wei

    2018-03-01

    The full understanding of the single and joint toxicity of a variety of organophosphorus (OP) pesticides is still unavailable because of the extremely complex mechanisms of action. This study established a systems-level approach based on systems toxicology to investigate OP pesticide toxicity by incorporating ADME/T properties, protein prediction, and network and pathway analysis. The results showed that most OP pesticides are highly toxic according to the ADME/T parameters, and can interact with significant receptor proteins to cooperatively lead to various diseases, as shown by the established OP pesticide-protein and protein-disease networks. Furthermore, the finding that multiple OP pesticides potentially act on the same receptor proteins and/or on functionally diverse proteins explains how multiple OP pesticides could act synergistically or additively on a molecular/systematic level. Finally, the integrated pathways revealed the mechanism of toxicity of the interaction of OP pesticides and elucidated the pathogenesis induced by OP pesticides. This study demonstrates a systems-level approach for investigating OP pesticide toxicity that can be further applied to risk assessments of various toxins, which is of significant interest to food security and environmental protection. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. MEDICARE PAYMENTS AND SYSTEM-LEVEL HEALTH-CARE USE

    Science.gov (United States)

    ROBBINS, JACOB A.

    2015-01-01

    The rapid growth of Medicare managed care over the past decade has the potential to increase the efficiency of health-care delivery. Improvements in care management for some may improve efficiency system-wide, with implications for optimal payment policy in public insurance programs. These system-level effects may depend on local health-care market structure and vary based on patient characteristics. We use exogenous variation in the Medicare payment schedule to isolate the effects of market-level managed care enrollment on the quantity and quality of care delivered. We find that in areas with greater enrollment of Medicare beneficiaries in managed care, the non–managed care beneficiaries have fewer days in the hospital but more outpatient visits, consistent with a substitution of less expensive outpatient care for more expensive inpatient care, particularly at high levels of managed care. We find no evidence that care is of lower quality. Optimal payment policies for Medicare managed care enrollees that account for system-level spillovers may thus be higher than those that do not. PMID:27042687

  17. Measuring healthcare productivity - from unit to system level.

    Science.gov (United States)

    Kämäräinen, Vesa Johannes; Peltokorpi, Antti; Torkki, Paulus; Tallbacka, Kaj

    2016-04-18

    Purpose - Healthcare productivity is a growing issue in most Western countries where healthcare expenditure is rapidly increasing. Therefore, accurate productivity metrics are essential to avoid sub-optimization within a healthcare system. The purpose of this paper is to focus on healthcare production system productivity measurement. Design/methodology/approach - Traditionally, healthcare productivity has been studied and measured independently at the unit, organization and system level. Suggesting that productivity measurement should be done in different levels, while simultaneously linking productivity measurement to incentives, this study presents the challenges of productivity measurement at the different levels. The study introduces different methods to measure productivity in healthcare. In addition, it provides background information on the methods used to measure productivity and the parameters used in these methods. A pilot investigation of productivity measurement is used to illustrate the challenges of measurement, to test the developed measures and to prove the practical information for managers. Findings - The study introduces different approaches and methods to measure productivity in healthcare. Practical implications - A pilot investigation of productivity measurement is used to illustrate the challenges of measurement, to test the developed measures and to prove the practical benefits for managers. Originality/value - The authors focus on the measurement of the whole healthcare production system and try to avoid sub-optimization. Additionally considering an individual patient approach, productivity measurement is examined at the unit level, the organizational level and the system level.

  18. Signature of Plausible Accreting Supermassive Black Holes in Mrk 261/262 and Mrk 266

    Directory of Open Access Journals (Sweden)

    Gagik Ter-Kazarian

    2013-01-01

    Full Text Available We address the neutrino radiation of plausible accreting supermassive black holes closely linked to the 5 nuclear components of the galaxy samples Mrk 261/262 and Mrk 266. We predict a time delay before neutrino emission of the same scale as the age of the Universe. The ultrahigh energy neutrinos are produced in a superdense protomatter medium via simple (quark or pionic) reactions or modified URCA processes (G. Gamow was inspired to name the process URCA after the name of a casino in Rio de Janeiro). The resulting neutrino fluxes for quark reactions range from … to …, where … is the opening parameter. For pionic and modified URCA reactions, the fluxes are … and …, respectively. These fluxes are highly beamed along the plane of the accretion disk, peaked at ultrahigh energies, and collimated in a smaller opening angle ….

  19. Nitrogenous Derivatives of Phosphorus and the Origins of Life: Plausible Prebiotic Phosphorylating Agents in Water

    Directory of Open Access Journals (Sweden)

    Megha Karki

    2017-07-01

    Full Text Available Phosphorylation under plausible prebiotic conditions continues to be one of the defining issues for the role of phosphorus in the origins of life processes. In this review, we cover the reactions of alternative forms of phosphate, specifically the nitrogenous versions of phosphate (and other forms of reduced phosphorus species), from a prebiotic, synthetic organic and biochemistry perspective. The ease with which such amidophosphates or phosphoramidate derivatives phosphorylate a wide variety of substrates suggests that alternative forms of phosphate could have played a role in overcoming the “phosphorylation in water problem”. We submit that serious consideration should be given to the search for primordial sources of nitrogenous versions of phosphate and other versions of phosphorus.

  20. Reciprocity-based reasons for benefiting research participants: most fail, the most plausible is problematic.

    Science.gov (United States)

    Sofaer, Neema

    2014-11-01

    A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it. © 2013 John Wiley & Sons Ltd.

  1. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    International Nuclear Information System (INIS)

    D.L. McGregor

    2000-01-01

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  2. System-level techniques for analog performance enhancement

    CERN Document Server

    Song, Bang-Sup

    2016-01-01

    This book shows readers how to avoid common mistakes in circuit design and presents classic circuit concepts and design approaches from the transistor to the system level. The discussion is geared to be accessible and optimized for practical designers who want to learn to create circuits without simulations. Topic by topic, the author guides designers to learn the classic analog design skills by understanding basic electronics principles correctly, and further prepares them to feel confident in designing high-performance, state-of-the-art CMOS analog systems. This book combines and presents all the necessary in-depth information to perform various design tasks, so that readers can grasp essential material without reading through the entire book. This top-down approach helps readers to build practical design expertise quickly, starting from their understanding of electronics fundamentals.

  3. Process for Selecting System Level Assessments for Human System Technologies

    Science.gov (United States)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component levels. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  4. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  5. Systems-Level Synthetic Biology for Advanced Biofuel Production

    Energy Technology Data Exchange (ETDEWEB)

    Ruffing, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Travis J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strickland, Lucas Marshall [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tallant, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  6. System-level integration of active silicon photonic biosensors

    Science.gov (United States)

    Laplatine, L.; Al'Mrayat, O.; Luan, E.; Fang, C.; Rezaiezadeh, S.; Ratner, D. M.; Cheung, K.; Dattner, Y.; Chrostowski, L.

    2017-02-01

    Biosensors based on silicon photonic integrated circuits have attracted a growing interest in recent years. The use of sub-micron silicon waveguides to propagate near-infrared light allows for the drastic reduction of the optical system size, while increasing its complexity and sensitivity. Using silicon as the propagating medium also leverages the fabrication capabilities of CMOS foundries, which offer low-cost mass production. Researchers have deeply investigated photonic sensor devices, such as ring resonators, interferometers and photonic crystals, but the practical integration of silicon photonic biochips as part of a complete system has received less attention. Herein, we present a practical system-level architecture which can be employed to integrate the aforementioned photonic biosensors. We describe a system based on 1 mm2 dies that integrate germanium photodetectors and a single light coupling device. The dies are embedded into a 16x16 mm2 epoxy package to enable microfluidic and electrical integration. First, we demonstrate a simple process to mimic Fan-Out Wafer-level-Packaging, which enables low-cost mass production. We then characterize the photodetectors in the photovoltaic mode, which exhibit high sensitivity at low optical power. Finally, we present a new grating coupler concept to relax the lateral alignment tolerance down to +/- 50 μm at 1-dB (80%) power penalty, which should permit non-experts to use the biochips in a "plug-and-play" style. The system-level integration demonstrated in this study paves the way towards the mass production of low-cost and highly sensitive biosensors, and can facilitate their wide adoption for biomedical and agro-environmental applications.

  7. Public health preparedness in Alberta: a systems-level study.

    Science.gov (United States)

    Moore, Douglas; Shiell, Alan; Noseworthy, Tom; Russell, Margaret; Predy, Gerald

    2006-12-28

    Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research. We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus. The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.

  8. Promoting system-level learning from project-level lessons

    International Nuclear Information System (INIS)

    Jong, Amos A. de; Runhaar, Hens A.C.; Runhaar, Piety R.; Kolhoff, Arend J.; Driessen, Peter P.J.

    2012-01-01

    A growing number of low and middle income nations (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, generally many of these EIA systems are characterised by a low performance in terms of timely information dissemination, monitoring and enforcement after licencing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at EIA system-level via project level donor interventions. This ‘indirect’ learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performances. Our exploratory research in Ghana and the Maldives shows that thus far, ‘indirect’ learning only occurs incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, ‘indirect’ learning seems to flourish best in large projects where donors achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels donors should thereby present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options and stimulate the use of their advisory reports to generate organisational memory and ensure a better information

  9. Public health preparedness in Alberta: a systems-level study

    Directory of Open Access Journals (Sweden)

    Noseworthy Tom

    2006-12-01

    Full Text Available Abstract Background Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research. Methods/Design We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus. Discussion The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.

  10. Promoting system-level learning from project-level lessons

    Energy Technology Data Exchange (ETDEWEB)

    Jong, Amos A. de, E-mail: amosdejong@gmail.com [Innovation Management, Utrecht (Netherlands); Runhaar, Hens A.C., E-mail: h.a.c.runhaar@uu.nl [Section of Environmental Governance, Utrecht University, Utrecht (Netherlands); Runhaar, Piety R., E-mail: piety.runhaar@wur.nl [Organisational Psychology and Human Resource Development, University of Twente, Enschede (Netherlands); Kolhoff, Arend J., E-mail: Akolhoff@eia.nl [The Netherlands Commission for Environmental Assessment, Utrecht (Netherlands); Driessen, Peter P.J., E-mail: p.driessen@geo.uu.nl [Department of Innovation and Environment Sciences, Utrecht University, Utrecht (Netherlands)

    2012-02-15

    A growing number of low and middle income nations (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, generally many of these EIA systems are characterised by a low performance in terms of timely information dissemination, monitoring and enforcement after licencing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at EIA system-level via project level donor interventions. This 'indirect' learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performances. Our exploratory research in Ghana and the Maldives shows that thus far, 'indirect' learning only occurs incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, 'indirect' learning seems to flourish best in large projects where donors achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels donors should thereby present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options and stimulate the use of their advisory reports to generate organisational memory and ensure a better

  11. On the plausibility of socioeconomic mortality estimates derived from linked data: a demographic approach.

    Science.gov (United States)

    Lerch, Mathias; Spoerri, Adrian; Jasilionis, Domantas; Viciana Fernandèz, Francisco

    2017-07-14

    Reliable estimates of mortality according to socioeconomic status play a crucial role in informing the policy debate about social inequality, social cohesion, and exclusion as well as about the reform of pension systems. Linked mortality data have become a gold standard for monitoring socioeconomic differentials in survival. Several approaches have been proposed to assess the quality of the linkage, in order to avoid the misclassification of deaths according to socioeconomic status. However, the plausibility of mortality estimates has never been scrutinized from a demographic perspective, and the potential problems with the quality of the data on the at-risk populations have been overlooked. Using indirect demographic estimation (i.e., the synthetic extinct generation method), we analyze the plausibility of old-age mortality estimates according to educational attainment in four European data contexts with different quality issues: deterministic and probabilistic linkage of deaths, as well as differences in the methodology of the collection of educational data. We evaluate whether the at-risk population according to educational attainment is misclassified and/or misestimated, correct these biases, and estimate the education-specific linkage rates of deaths. The results confirm a good linkage of death records within different educational strata, even when probabilistic matching is used. The main biases in mortality estimates concern the classification and estimation of the person-years of exposure according to educational attainment. Changes in the census questions about educational attainment led to inconsistent information over time, which misclassified the at-risk population. Sample censuses also misestimated the at-risk populations according to educational attainment. The synthetic extinct generation method can be recommended for quality assessments of linked data because it is capable not only of quantifying linkage precision, but also of tracking problems in
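
    For orientation, the synthetic extinct generations estimator can be written, in its constant-growth form, as $\hat{N}(a) = \sum_{x \ge a} D(x)\, e^{r(x-a)}$: the population currently aged $a$ is reconstructed from the deaths $D(x)$ it will generate at ages $x \ge a$, inflated by the growth rate $r$, and comparing $\hat{N}(a)$ with the recorded population flags coverage or classification problems. (This is the textbook form of the method, not a formula quoted from the paper.)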

  12. The missing link between sleep disorders and age-related dementia: recent evidence and plausible mechanisms.

    Science.gov (United States)

    Zhang, Feng; Zhong, Rujia; Li, Song; Chang, Raymond Chuen-Chung; Le, Weidong

    2017-05-01

    Sleep disorders are among the most common clinical problems and pose a significant concern for the geriatric population. More importantly, while around 40% of elderly adults have sleep-related complaints, sleep disorders are more frequently associated with co-morbidities including age-related neurodegenerative diseases and mild cognitive impairment. Recently, increasing evidence has indicated that disturbed sleep may not only be a consequence of brain atrophy, but may also contribute to the pathogenesis of dementia and, therefore, significantly increase dementia risk. Since current therapeutic interventions lack the efficacy to prevent, delay or reverse the pathological progress of dementia, a better understanding of the underlying mechanisms by which sleep disorders interact with the pathogenesis of dementia will provide possible targets for the prevention and treatment of dementia. In this review, we briefly describe the physiological roles of sleep in learning/memory, and specifically update the recent research evidence demonstrating the association between sleep disorders and dementia. Plausible mechanisms are further discussed. Moreover, we also evaluate the possibility of sleep therapy as a potential intervention for dementia.

  13. Mindfulness and Cardiovascular Disease Risk: State of the Evidence, Plausible Mechanisms, and Theoretical Framework

    Science.gov (United States)

    Schuman-Olivier, Zev; Britton, Willoughby B.; Fresco, David M.; Desbordes, Gaelle; Brewer, Judson A.; Fulwiler, Carl

    2016-01-01

    The purpose of this review is to provide (1) a synopsis of the relations of mindfulness with cardiovascular disease (CVD) and major CVD risk factors, and (2) an initial consensus-based overview of the mechanisms and theoretical framework by which mindfulness might influence CVD. Initial evidence, often of limited methodological quality, suggests possible impacts of mindfulness on CVD risk factors including physical activity, smoking, diet, obesity, blood pressure, and diabetes regulation. Plausible mechanisms include (1) improved attention control (e.g., the ability to hold attention on experiences related to CVD risk, such as smoking, diet, physical activity, and medication adherence), (2) emotion regulation (e.g., improved stress response, self-efficacy, and skills to manage craving for cigarettes, palatable foods, and sedentary activities), and (3) self-awareness (e.g., self-referential processing and awareness of physical sensations due to CVD risk factors). Understanding these mechanisms and this theoretical framework should improve etiologic knowledge and provide customized mindfulness intervention targets that could enable greater intervention efficacy. PMID:26482755

  14. Phthalates impact human health: Epidemiological evidences and plausible mechanism of action.

    Science.gov (United States)

    Benjamin, Sailas; Masai, Eiji; Kamimura, Naofumi; Takahashi, Kenji; Anderson, Robin C; Faisal, Panichikkal Abdul

    2017-10-15

    Despite rising alarm over the hazardous nature of various phthalates and their metabolites, the heavy use of phthalates as plasticizers in plastics and as additives in innumerable consumer products continues, owing to their low cost, attractive properties, and the lack of suitable alternatives. Globally, in silico computational, in vitro mechanistic, in vivo preclinical, and limited clinical or epidemiological human studies have shown that over a dozen phthalates and their metabolites, passively ingested by humans from the general environment, foods, drinks, breathing air, and routine household products, cause various dysfunctions. This review therefore addresses the health hazards posed by phthalates: effects on children and adolescents, epigenetic modulation, reproductive toxicity in women and men, insulin resistance and type II diabetes, overweight and obesity, skeletal anomalies, allergy and asthma, cancer, etc., coupled with a description of the major phthalates and their general uses, phthalate exposure routes, biomonitoring and risk assessment, a special account of endocrine disruption, and, finally, a plausible molecular cross-talk with a unique mechanism of action. This clinically focused, comprehensive review of the hazards of phthalates should help the general population, academia, scientists, clinicians, environmentalists, and law- or policy-makers decide whether the use of phthalates should continue unabated, be regulated by law, or be phased out altogether. Copyright © 2017. Published by Elsevier B.V.

  15. A plausible mechanism of biosorption in dual symbioses by vesicular-arbuscular mycorrhizal in plants.

    Science.gov (United States)

    Azmat, Rafia; Hamid, Neelofer

    2015-03-01

    Dual symbioses of vesicular-arbuscular mycorrhizal (VAM) fungi with the growth of Momordica charantia are elucidated in terms of a plausible mechanism of biosorption in this article. The experiment was conducted in a greenhouse, and a mixed inoculum of the VAM fungi was used in the three replicates. Results demonstrated that starch contents were the main source of C for the VAM fungi to build their hyphae. The increased plant height and leaf surface area were explained in relation to an increase in photosynthetic rates, which rapidly produce the sugar contents needed for the survival of the plants. A decrease in protein and amino acid contents and an increase in proline content and protease activity in VAM plants suggested that these contents are the main bio-indicators of plants under biotic stress. The decline in protein may be due to the degradation of these contents, which were later converted into dextrose that can easily be absorbed during the period of symbiosis. A mechanism of C chemisorption in relation to the physiology and morphology of the plant was discussed.

  16. Non-specific effects of vaccines: plausible and potentially important, but implications uncertain.

    Science.gov (United States)

    Pollard, Andrew J; Finn, Adam; Curtis, Nigel

    2017-11-01

    Non-specific effects (NSE) or heterologous effects of vaccines are proposed to explain observations in some studies that certain vaccines have an impact beyond the direct protection against infection with the specific pathogen for which the vaccines were designed. The importance and implications of such effects remain controversial. There are several known immunological mechanisms which could lead to NSE, since it is widely recognised that the generation of specific immunity is initiated by non-specific innate immune mechanisms that may also have wider effects on adaptive immune function. However, there are no published studies that demonstrate a mechanistic link between such immunological phenomena and clinically relevant NSE in humans. While it is highly plausible that some vaccines do have NSE, their magnitude and duration, and thus importance, remain uncertain. Although the WHO recently concluded that current evidence does not justify changes to immunisation policy, further studies of sufficient size and quality are needed to assess the importance of NSE for all-cause mortality. This could provide insights into vaccine immunobiology with important implications for infant health and survival. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Photoinduced catalytic synthesis of biologically important metabolites from formaldehyde and ammonia under plausible "prebiotic" conditions

    Science.gov (United States)

    Delidovich, I. V.; Taran, O. P.; Simonov, A. N.; Matvienko, L. G.; Parmon, V. N.

    2011-08-01

    The article analyzes new and previously reported data on several catalytic and photochemical processes yielding biologically important molecules. UV irradiation of an aqueous formaldehyde solution yields acetaldehyde, glyoxal, glycolaldehyde and glyceraldehyde, which can serve as precursors of more complex biochemically relevant compounds. Photolysis of an aqueous solution of acetaldehyde and ammonium nitrate results in the formation of alanine and pyruvic acid. Dehydration of glyceraldehyde catalyzed by the zeolite HZSM-5-17 yields pyruvaldehyde. Monosaccharides are formed in the course of phosphate-catalyzed aldol condensation reactions of glycolaldehyde, glyceraldehyde and formaldehyde. The possibility of the direct synthesis of tetroses and keto- and aldo-pentoses from pure formaldehyde, through the combination of the photochemical production of glycolaldehyde and phosphate-catalyzed carbohydrate chain growth, is demonstrated. Erythrulose and 3-pentulose are the main products of such combined synthesis, with selectivity up to 10%. Biologically relevant aldotetroses and aldo- and ketopentoses are more resistant to photochemical destruction owing to stabilization in hemiacetal cyclic forms; they are formed as products of isomerization of erythrulose and 3-pentulose. The combination of the reactions considered constitutes a plausible route to the formation of sugars, amino acids and organic acids from formaldehyde and ammonia under presumed 'prebiotic' conditions.

  18. A Systems-Level Approach to Characterizing Effects of ENMs ...

    Science.gov (United States)

    Engineered nanomaterials (ENMs) represent a new regulatory challenge because of their unique properties and their potential to interact with ecological organisms at various developmental stages, in numerous environmental compartments. Traditional toxicity tests have proven to be unreliable due to their short-term nature and the subtle responses often observed following ENM exposure. In order to fully assess the potential for various ENMs to affect responses in organisms and ecosystems, we are using a systems-level framework to link molecular initiating events with changes in whole-organism responses, and to identify how these changes may translate across scales to disrupt important ecosystem processes. This framework utilizes information from nanoparticle characteristics and exposures to help make linkages across scales. We have used Arabidopsis thaliana as a model organism to identify potential transcriptome changes in response to specific ENMs. In addition, we have focused on plant species of agronomic importance to follow multi-generational changes in physiology and phenology, as well as epigenetic markers to identify possible mechanisms of inheritance. We are employing and developing complementary analytical tools (plasma-based and synchrotron spectroscopies, microscopy, and molecular and stable-isotopic techniques) to follow movement of ENMs and ENM products in plants as they develop. These studies have revealed that changes in gene expression do not a

  19. System level traffic shaping in disk servers with heterogeneous protocols

    International Nuclear Information System (INIS)

    Cano, Eric; Kruse, Daniele Francesco

    2014-01-01

    Disk access and tape migrations compete for network bandwidth in CASTOR's disk servers, over various protocols: RFIO, Xroot, root and GridFTP. As there are a limited number of tape drives, it is important to keep them busy all the time, at their nominal speed. With potentially hundreds of user read streams per server, the bandwidth for tape migrations has to be guaranteed at a controlled level, rather than the fair share the system gives by default. Xroot provides a prioritization mechanism, but using it would mean moving exclusively to the Xroot protocol, which is not possible in the short to mid term, as users make equal use of all the protocols. The greatest commonality among these protocols is nothing more than their use of TCP/IP. We therefore investigated the Linux kernel traffic shaper to control TCP/IP bandwidth. The performance and limitations of the traffic shaper were characterized in a test environment, and a satisfactory working point was found for production. Notably, the negative impact of TCP offload engines on traffic shaping, and the limitations on the length of traffic-shaping rule sets, were discovered and measured. Traffic shaping is now successfully deployed in the CASTOR production systems at CERN. This system-level approach could easily be transposed to other environments.
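
    The "system level" lever described here is the stock Linux traffic shaper (tc with HTB queueing); a minimal sketch driving it from Python follows. The device name, rates, and the port used to classify tape-migration traffic are illustrative assumptions, not values from the paper, and the commands require root privileges:

        import subprocess

        DEV = "eth0"  # illustrative device name

        def tc(*args):
            """Run a single tc(8) command, raising on failure."""
            subprocess.run(["tc", *args], check=True)

        # Root HTB qdisc; unclassified traffic falls into class 1:20.
        tc("qdisc", "add", "dev", DEV, "root", "handle", "1:", "htb", "default", "20")

        # Guaranteed class for tape migrations: shaping at the TCP/IP level
        # works identically for RFIO, Xroot, root and GridFTP streams.
        tc("class", "add", "dev", DEV, "parent", "1:", "classid", "1:10",
           "htb", "rate", "500mbit", "ceil", "1000mbit")

        # Fair-share class for the potentially hundreds of user read streams.
        tc("class", "add", "dev", DEV, "parent", "1:", "classid", "1:20",
           "htb", "rate", "100mbit", "ceil", "1000mbit")

        # Steer traffic for the (assumed) tape-mover port into the guaranteed class.
        tc("filter", "add", "dev", DEV, "protocol", "ip", "parent", "1:",
           "prio", "1", "u32", "match", "ip", "dport", "5001", "0xffff",
           "flowid", "1:10")

    Because the classification key is plain TCP/IP (addresses and ports), nothing protocol-specific is required, which is exactly the commonality the abstract exploits.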

  20. A system-level model for the microbial regulatory genome.

    Science.gov (United States)

    Brooks, Aaron N; Reiss, David J; Allard, Antoine; Wu, Wei-Ju; Salvanha, Diego M; Plaisier, Christopher L; Chandrasekaran, Sriram; Pan, Min; Kaur, Amardeep; Baliga, Nitin S

    2014-07-15

    Microbes can tailor transcriptional responses to diverse environmental challenges despite having streamlined genomes and a limited number of regulators. Here, we present data-driven models that capture the dynamic interplay of the environment and genome-encoded regulatory programs of two types of prokaryotes: Escherichia coli (a bacterium) and Halobacterium salinarum (an archaeon). The models reveal how the genome-wide distributions of cis-acting gene regulatory elements and the conditional influences of transcription factors at each of those elements encode programs for eliciting a wide array of environment-specific responses. We demonstrate how these programs partition transcriptional regulation of genes within regulons and operons to re-organize gene-gene functional associations in each environment. The models capture fitness-relevant co-regulation by different transcriptional control mechanisms acting across the entire genome, to define a generalized, system-level organizing principle for prokaryotic gene regulatory networks that goes well beyond existing paradigms of gene regulation. An online resource (http://egrin2.systemsbiology.net) has been developed to facilitate multiscale exploration of conditional gene regulation in the two prokaryotes. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  1. Evidence for systems-level molecular mechanisms of tumorigenesis

    Directory of Open Access Journals (Sweden)

    Capellá Gabriel

    2007-06-01

    Background: Cancer arises from the consecutive acquisition of genetic alterations. Increasing evidence suggests that, as a consequence of these alterations, molecular interactions are reprogrammed in the context of highly connected and regulated cellular networks. Coordinated reprogramming would allow the cell to acquire the capabilities for malignant growth. Results: Here, we determine the coordinated function of cancer gene products (i.e., proteins encoded by differentially expressed genes in tumors relative to healthy tissue counterparts, hereafter referred to as "CGPs"), defined as their topological properties and organization in the interactome network. We show that CGPs are central to information exchange and propagation and that they are specifically organized to promote tumorigenesis. Centrality is identified by both local (degree) and global (betweenness and closeness) measures, and systematically appears in down-regulated CGPs. Up-regulated CGPs do not consistently exhibit centrality, but both types of cancer products determine the overall integrity of the network structure. In addition to centrality, down-regulated CGPs show topological association that correlates with common biological processes and pathways involved in tumorigenesis. Conclusion: Given the current limited coverage of the human interactome, this study proposes that tumorigenesis takes place in a specific and organized way at the molecular systems-level and suggests a model that comprises the precise down-regulation of groups of topologically-associated proteins involved in particular functions, orchestrated with the up-regulation of specific proteins.
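
    The three centrality notions invoked above are standard graph measures; a minimal sketch with networkx on a toy interaction network (the gene names and edges are invented for illustration) shows how they are computed:

        import networkx as nx

        # Toy protein-protein interaction network; nodes and edges are invented.
        G = nx.Graph([
            ("TP53", "MDM2"), ("TP53", "BRCA1"), ("BRCA1", "RAD51"),
            ("MDM2", "CDKN1A"), ("CDKN1A", "CCND1"), ("CCND1", "RAD51"),
        ])

        degree = nx.degree_centrality(G)            # local: direct partners
        betweenness = nx.betweenness_centrality(G)  # global: shortest-path brokerage
        closeness = nx.closeness_centrality(G)      # global: proximity to all nodes

        for gene in G:
            print(f"{gene}: deg={degree[gene]:.2f} "
                  f"btw={betweenness[gene]:.2f} cls={closeness[gene]:.2f}")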

  2. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% missing values.
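
    As an illustration of the similarity-guided (analogical) step described above, here is a toy sketch that fills in one missing fact from the most similar complete record; the patients and attributes are invented, and a real system would use ontology-based semantic similarity rather than raw attribute overlap:

        # Toy patient records; None marks a gap in the knowledge base. All data invented.
        patients = {
            "p1": {"fever": True, "fatigue": True, "jaundice": True, "hepatitis": True},
            "p2": {"fever": False, "fatigue": False, "jaundice": False, "hepatitis": False},
            "p3": {"fever": True, "fatigue": True, "jaundice": True, "hepatitis": None},
        }

        def similarity(a, b, target):
            """Fraction of shared non-target attributes with equal, known values."""
            keys = [k for k in a if k != target and a[k] is not None and b[k] is not None]
            return sum(a[k] == b[k] for k in keys) / len(keys) if keys else 0.0

        def analogical_infer(query, target):
            """Plausibly infer the target attribute from the most similar complete record."""
            known = [p for p in patients.values() if p[target] is not None]
            best = max(known, key=lambda p: similarity(patients[query], p, target))
            return best[target]

        print(analogical_infer("p3", "hepatitis"))  # -> True, by analogy with p1

    Inferences of this kind are tentative by construction, which is why the paper reports them as extending coverage rather than as deductively certain conclusions.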

  3. A System-Level Throughput Model for Quantum Key Distribution

    Science.gov (United States)

    2015-09-17


  4. Estimating yield gaps at the cropping system level.

    Science.gov (United States)

    Guilpart, Nicolas; Grassini, Patricio; Sadras, Victor O; Timsina, Jagadish; Cassman, Kenneth G

    2017-05-01

    Yield gap analyses of individual crops have been used to estimate opportunities for increasing crop production at local to global scales, thus providing information crucial to food security. However, increases in crop production can also be achieved by improving cropping system yield through modification of the spatial and temporal arrangement of individual crops. In this paper we define the cropping system yield potential as the output from the combination of crops that gives the highest energy yield per unit of land and time, and the cropping system yield gap as the difference between the actual energy yield of an existing cropping system and the cropping system yield potential. We then provide a framework to identify alternative cropping systems which can be evaluated against current ones. A proof of concept is provided for irrigated rice-maize systems at four locations in Bangladesh that represent a range of climatic conditions in that country. The proposed framework identified (i) realistic alternative cropping systems at each location, and (ii) two locations where expected improvements in crop production from changes in cropping intensity (number of crops per year) were 43% to 64% higher than from improving the management of individual crops within the current cropping systems. The proposed framework provides a tool to help assess the food production capacity of new systems (e.g., with increased cropping intensity) arising from climate change, and to assess the resource requirements (water and N) and associated environmental footprint per unit of land and production of these new systems. By expanding yield gap analysis from individual crops to the cropping system level and applying it to new systems, this framework could also help bridge the gap between yield gap analysis and cropping/farming system design.
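
    The definitions above translate directly into arithmetic; a minimal sketch, with invented energy yields, comparing candidate crop sequences per unit of land and time:

        # Annual energy yields (GJ/ha/yr) per crop in each candidate sequence; numbers invented.
        candidate_systems = {
            "rice-fallow":      [65.0],                  # one crop per year
            "rice-maize":       [65.0, 95.0],            # two crops per year
            "rice-maize-maize": [60.0, 90.0, 85.0],      # three crops, some per-crop penalty
        }

        def system_energy_yield(crop_yields):
            """Cropping system yield: total energy output per unit land and time."""
            return sum(crop_yields)

        # Cropping system yield potential: the best-performing combination of crops.
        potential = max(map(system_energy_yield, candidate_systems.values()))

        actual = system_energy_yield(candidate_systems["rice-fallow"])  # current practice
        print(f"potential={potential}, actual={actual}, gap={potential - actual} GJ/ha/yr")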

  5. Climate change impacts on agriculture in 2050 under a range of plausible socioeconomic and emissions scenarios

    International Nuclear Information System (INIS)

    Wiebe, Keith; Islam, Shahnila; Mason-D’Croz, Daniel; Robertson, Richard; Robinson, Sherman; Lotze-Campen, Hermann; Biewald, Anne; Bodirsky, Benjamin; Müller, Christoph; Popp, Alexander; Sands, Ronald; Tabeau, Andrzej; Van Meijl, Hans; Van der Mensbrugghe, Dominique; Kavallari, Aikaterini; Willenbockel, Dirk

    2015-01-01

    Previous studies have combined climate, crop and economic models to examine the impact of climate change on agricultural production and food security, but results have varied widely due to differences in models, scenarios and input data. Recent work has examined (and narrowed) these differences through systematic model intercomparison using a high-emissions pathway to highlight the differences. This paper extends that analysis to explore a range of plausible socioeconomic scenarios and emission pathways. Results from multiple climate and economic models are combined to examine the global and regional impacts of climate change on agricultural yields, area, production, consumption, prices and trade for coarse grains, rice, wheat, oilseeds and sugar crops to 2050. We find that climate impacts on global average yields, area, production and consumption are similar across shared socioeconomic pathways (SSP 1, 2 and 3, as we implement them based on population, income and productivity drivers), except when changes in trade policies are included. Impacts on trade and prices are higher for SSP 3 than SSP 2, and higher for SSP 2 than for SSP 1. Climate impacts for all variables are similar across low to moderate emissions pathways (RCP 4.5 and RCP 6.0), but increase for a higher emissions pathway (RCP 8.5). It is important to note that these global averages may hide regional variations. Projected reductions in agricultural yields due to climate change by 2050 are larger for some crops than those estimated for the past half century, but smaller than projected increases to 2050 due to rising demand and intrinsic productivity growth. Results illustrate the sensitivity of climate change impacts to differences in socioeconomic and emissions pathways. Yield impacts increase at high emissions levels and vary with changes in population, income and technology, but are reduced in all cases by endogenous changes in prices and other variables. (paper)

  6. Evaporative water loss is a plausible explanation for mortality of bats from white-nose syndrome.

    Science.gov (United States)

    Willis, Craig K R; Menzies, Allyson K; Boyles, Justin G; Wojciechowski, Michal S

    2011-09-01

    White-nose syndrome (WNS) has caused alarming declines of North American bat populations in the 5 years since its discovery. Affected bats appear to starve during hibernation, possibly because of disruption of normal cycles of torpor and arousal. The importance of hydration state and evaporative water loss (EWL) in influencing the duration of torpor bouts in hibernating mammals recently led to "the dehydration hypothesis": that cutaneous infection of the wing membranes of bats with the fungus Geomyces destructans causes dehydration, which in turn increases arousal frequency during hibernation. This hypothesis predicts that uninfected individuals of species most susceptible to WNS, like little brown bats (Myotis lucifugus), exhibit high rates of EWL compared to less susceptible species. We tested the feasibility of this prediction using data from the literature and new data quantifying EWL in Natterer's bats (Myotis nattereri), a species that is, like other European bats, sympatric with G. destructans but does not appear to suffer significant mortality from WNS. We found that little brown bats exhibited significantly higher rates of normothermic EWL than did other bat species for which comparable EWL data are available. We also found that Natterer's bats exhibited significantly lower rates of EWL, in both wet and dry air, compared with values predicted for little brown bats exposed to identical relative humidity (RH). We used a population model to show that the increase in EWL required to cause the pattern of mortality observed for WNS-affected little brown bats was small, equivalent to a solitary bat hibernating exposed to RH of ∼95%, or clusters hibernating in ∼87% RH, as opposed to typical near-saturation conditions. Both of these results suggest the dehydration hypothesis is plausible and worth pursuing as a possible explanation for mortality of bats from WNS.

  7. Flux-based transport enhancement as a plausible unifying mechanism for auxin transport in meristem development.

    Directory of Open Access Journals (Sweden)

    Szymon Stoma

    2008-10-01

    Plants continuously generate new organs through the activity of populations of stem cells called meristems. The shoot apical meristem initiates leaves, flowers, and lateral meristems in highly ordered, spiralled, or whorled patterns via a process called phyllotaxis. It is commonly accepted that the active transport of the plant hormone auxin plays a major role in this process. Current hypotheses propose that cellular hormone transporters of the PIN family would create local auxin maxima at precise positions, which in turn would lead to organ initiation. To explain how auxin transporters could create hormone fluxes to distinct regions within the plant, different concepts have been proposed. A major hypothesis, canalization, proposes that the auxin transporters act by amplifying and stabilizing existing fluxes, which could be initiated, for example, by local diffusion. This convincingly explains the organised auxin fluxes during vein formation, but for the shoot apical meristem a second hypothesis was proposed, where the hormone would be systematically transported towards the areas with the highest concentrations. This implies the coexistence of two radically different mechanisms for PIN allocation in the membrane, one based on flux sensing and the other on local concentration sensing. Because these patterning processes require the interaction of hundreds of cells, it is impossible to estimate on a purely intuitive basis if a particular scenario is plausible or not. Therefore, computational modelling provides a powerful means to test this type of complex hypothesis. Here, using a dedicated computer simulation tool, we show that a flux-based polarization hypothesis is able to explain auxin transport at the shoot meristem as well, thus providing a unifying concept for the control of auxin distribution in the plant. Further experiments are now required to distinguish between flux-based polarization and other hypotheses.
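
    A heavily simplified numerical sketch of the flux-based enhancement idea follows (our illustration, not the authors' simulation tool): in a one-dimensional file of cells, carriers accumulate on walls that already carry auxin flux, so an initial source-to-sink flux is amplified. Only rightward carriers are modelled, and the parameter values, the linear flux response, and the saturation cap are all invented:

        import numpy as np

        n, steps, dt = 10, 2000, 0.01
        c = np.ones(n)           # auxin concentration per cell (arbitrary units)
        p = np.full(n - 1, 0.1)  # PIN carrier level on each rightward wall

        for _ in range(steps):
            c[0], c[-1] = 2.0, 0.1                 # auxin source and sink held fixed
            flux = p * c[:-1] - 0.05 * c[1:]       # active transport minus backflow
            # Flux-based enhancement: carriers accumulate on flux-carrying walls.
            p += dt * (0.5 * np.maximum(flux, 0.0) - 0.1 * p)
            p = np.minimum(p, 1.0)                 # finite carrier pool (saturation)
            c[:-1] -= dt * flux                    # auxin leaves the upstream cell...
            c[1:] += dt * flux                     # ...and enters the downstream cell

        print(np.round(p, 3))  # walls along the source-to-sink path end up PIN-rich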

  8. Bio-physically plausible visualization of highly scattering fluorescent neocortical models for in silico experimentation

    KAUST Repository

    Abdellah, Marwan

    2017-02-15

    Background: We present a visualization pipeline capable of accurately rendering highly scattering, fluorescent neocortical neuronal models. The pipeline is mainly developed to serve the computational neurobiology community. It allows scientists to visualize the results of virtual experiments performed in computer simulations, or in silico. The presented pipeline opens novel avenues for assisting neuroscientists in building biologically accurate models of the brain. These models result from computer simulations of physical experiments that use fluorescence imaging to understand the structural and functional aspects of the brain. Because current visualization workflows have limited capabilities for handling fluorescent volumetric datasets, we propose a physically-based optical model that can accurately simulate light interaction with fluorescent-tagged scattering media based on the basic principles of geometric optics and Monte Carlo path tracing. We also develop an automated and efficient framework for generating dense fluorescent tissue blocks from a neocortical column model that is composed of approximately 31000 neurons. Results: Our pipeline is used to visualize a virtual fluorescent tissue block of 50 μm³ that is reconstructed from the somatosensory cortex of a juvenile rat. The fluorescence optical model is qualitatively analyzed and validated against experimental emission spectra of different fluorescent dyes from the Alexa Fluor family. Conclusion: We discussed a scientific visualization pipeline for creating images of synthetic neocortical neuronal models that are tagged virtually with fluorescent labels on a physically-plausible basis. The pipeline is applied to analyze and validate simulation data generated from neuroscientific in silico experiments.
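
    At its core, the optical model samples random photon paths through a scattering, fluorescent medium; a drastically reduced sketch (slab geometry, isotropic scattering, a single wavelength pair, all coefficients invented) illustrates the Monte Carlo step:

        import math, random

        MU_S, MU_A = 10.0, 0.5   # scattering / absorption coefficients (1/mm); invented
        QUANTUM_YIELD = 0.9      # re-emission probability after absorption; invented
        HALF_THICKNESS = 1.0     # slab half-thickness (mm); invented

        def trace_photon():
            """Random-walk one photon launched at the slab centre."""
            z, fluorescent, mu_t = 0.0, False, MU_S + MU_A
            for _ in range(10_000):
                step = -math.log(1.0 - random.random()) / mu_t  # Beer-Lambert free path
                z += step * (2.0 * random.random() - 1.0)       # isotropic direction (z part)
                if abs(z) > HALF_THICKNESS:
                    return "fluorescence" if fluorescent else "excitation"
                if random.random() < MU_A / mu_t:               # absorption event
                    if random.random() > QUANTUM_YIELD:
                        return "lost"                           # non-radiative decay
                    fluorescent = True                          # Stokes-shifted re-emission
            return "lost"

        counts = {"excitation": 0, "fluorescence": 0, "lost": 0}
        for _ in range(20_000):
            counts[trace_photon()] += 1
        print(counts)

    A production renderer would additionally track full emission spectra, anisotropic phase functions, and the reconstructed tissue geometry, as the abstract describes.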

  9. A plausible (overlooked) super-luminous supernova in the Sloan digital sky survey stripe 82 data

    International Nuclear Information System (INIS)

    Kostrzewa-Rutkowska, Zuzanna; Kozłowski, Szymon; Wyrzykowski, Łukasz; Djorgovski, S. George; Mahabal, Ashish A.; Glikman, Eilat; Koposov, Sergey

    2013-01-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova (SN) peaked at m_g < 19.4 mag in the second half of 2005 September, but was missed by the real-time SN hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline lasted at least 70 days (observed frame), closely resembling other SLSNe of the SN 2007bi type. The spectrum of the host galaxy reveals a redshift of z = 0.281 and a distance modulus of μ = 40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with an absolute magnitude of M_B = −18.2 ± 0.2 mag and an oxygen abundance of 12 + log[O/H] = 8.3 ± 0.2; hence, the SN peaked at M_g < −21.3 mag. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence that SLSNe explode only in low-luminosity (dwarf) galaxies. The available information on the PSN 000123+000504 light curve suggests the magnetar-powered model as a likely scenario for this event. This SLSN is a new addition to a quickly growing family of super-luminous SNe.
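
    As a consistency check (ours, not the authors' calculation), the quoted peak absolute magnitude follows from the standard distance-modulus relation:

        M_g = m_g - \mu < 19.4 - 40.77 \approx -21.4 \ \text{mag},

    in line with the reported M_g < −21.3 mag; the small difference is expected from extinction and K-correction terms neglected in this back-of-the-envelope form.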

  10. A plausible neural circuit for decision making and its formation based on reinforcement learning.

    Science.gov (United States)

    Wei, Hui; Dai, Dawei; Bu, Yijie

    2017-06-01

    A human's, or a lower insect's, behavior is dominated by its nervous system. Each stable behavior has its own inner steps and control rules, and is regulated by a neural circuit. Understanding how the brain influences perception, thought, and behavior is a central mandate of neuroscience. The phototactic flight of insects is a widely observed deterministic behavior. Since the movement is not stochastic, the behavior should be dominated by a neural circuit. Based on the basic firing characteristics of biological neurons and the constitution of neural circuits, we designed a plausible neural circuit for this phototactic behavior from a logic perspective. The circuit's output layer, which generates a stable spike firing rate to encode flight commands, controls the insect's angular velocity when flying. The firing pattern and connection types of excitatory and inhibitory neurons are considered in this computational model. We simulated the circuit's information processing using a distributed PC array, and used the real-time average firing rate of the output neuron clusters to drive a flying-behavior simulation. In this paper, we also explore how a correct neural decision circuit is generated, from a network-flow view, through a bee behavior experiment based on a reward-and-punishment feedback mechanism. The significance of this study is fourfold. First, we designed a neural circuit that achieves the behavioral logic rules while strictly following the electrophysiological characteristics of biological neurons and anatomical facts. Second, the circuit's generality permits the design and implementation of behavioral logic rules based on the most general information processing and activity modes of biological neurons. Third, through computer simulation, we achieved new understanding of the cooperative conditions under which multiple neurons achieve behavioral control. Fourth, this study aims at understanding the information encoding mechanism and how neural circuits achieve behavior control.
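
    A toy rate-based reading of such a circuit maps the light difference between the two eyes onto an angular-velocity command through the average firing rate of two output clusters; the sigmoidal rate function and all constants are invented for illustration:

        import math

        def firing_rate(drive, gain=5.0, threshold=0.5):
            """Sigmoidal population firing rate (0..1) for a given synaptic drive; invented."""
            return 1.0 / (1.0 + math.exp(-gain * (drive - threshold)))

        def angular_velocity(light_left, light_right, max_turn=2.0):
            """Each output cluster is excited by one eye and inhibited by the other;
            the rate difference is read out as a turning command (invented scale)."""
            left_cluster = firing_rate(light_left - 0.5 * light_right)
            right_cluster = firing_rate(light_right - 0.5 * light_left)
            return max_turn * (right_cluster - left_cluster)

        print(angular_velocity(0.2, 0.9))  # positive: turn right, toward the light
        print(angular_velocity(0.9, 0.2))  # negative: turn left, toward the light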

  11. Is knowing believing? The role of event plausibility and background knowledge in planting false beliefs about the personal past.

    Science.gov (United States)

    Pezdek, Kathy; Blandon-Gitlin, Iris; Lam, Shirley; Hart, Rhiannon Ellis; Schooler, Jonathan W

    2006-12-01

    False memories are more likely to be planted for plausible than for implausible events, but does just knowing about an implausible event make individuals more likely to think that the event happened to them? Two experiments assessed the independent contributions of plausibility and background knowledge to planting false beliefs. In Experiment 1, subjects rated 20 childhood events as to the likelihood of each event having happened to them. The list included the implausible target event "received an enema," a critical target event of Pezdek, Finger, and Hodge (1997). Two weeks later, subjects were presented with (1) information regarding the high prevalence rate of enemas; (2) background information on how to administer an enema; (3) neither type of information; or (4) both. Immediately or 2 weeks later, they rated the 20 childhood events again. Only plausibility significantly increased occurrence ratings. In Experiment 2, the target event was changed from "barium enema administered in a hospital" to "home enema for constipation"; significant effects of both plausibility and background knowledge resulted. The results suggest that providing background knowledge can increase beliefs about personal events, but that its impact is limited by the extent of the individual's familiarity with the context of the suggested target event.

  12. Relevance theory: pragmatics and cognition.

    Science.gov (United States)

    Wearing, Catherine J

    2015-01-01

    Relevance Theory is a cognitively oriented theory of pragmatics, i.e., a theory of language use. It builds on the seminal work of H.P. Grice (1) to develop a pragmatic theory which is at once philosophically sensitive and empirically plausible (in both psychological and evolutionary terms). This entry reviews the central commitments and chief contributions of Relevance Theory, including its Gricean commitment to the centrality of intention-reading and inference in communication; the cognitively grounded notion of relevance which provides the mechanism for explaining pragmatic interpretation as an intention-driven, inferential process; and several key applications of the theory (lexical pragmatics, metaphor and irony, procedural meaning). Relevance Theory is an important contribution to our understanding of the pragmatics of communication. © 2014 John Wiley & Sons, Ltd.

  13. Systems-level mechanisms of action of Panax ginseng: a network pharmacological approach.

    Science.gov (United States)

    Park, Sa-Yoon; Park, Ji-Hun; Kim, Hyo-Su; Lee, Choong-Yeol; Lee, Hae-Jeung; Kang, Ki Sung; Kim, Chang-Eop

    2018-01-01

    Panax ginseng has been used since ancient times, based on traditional Asian medicine theory and clinical experience, and is currently one of the most popular herbs in the world. To date, most studies concerning P. ginseng have focused on the specific mechanisms of action of individual constituents. However, in spite of many studies on the molecular mechanisms of P. ginseng, it remains unclear how the multiple active ingredients of P. ginseng interact with multiple targets simultaneously, producing multidimensional effects on various conditions and diseases. In order to decipher the systems-level mechanism of the multiple ingredients of P. ginseng, a novel approach is needed beyond conventional reductive analysis. We aim to review the systems-level mechanism of P. ginseng by adopting a novel analytical framework: network pharmacology. Here, we constructed a compound-target network of P. ginseng using experimentally validated and machine learning-based prediction results. The targets of the network were analyzed in terms of related biological processes, pathways, and diseases. The majority of targets were found to be related to primary metabolic process, signal transduction, nitrogen compound metabolic process, blood circulation, immune system process, cell-cell signaling, biosynthetic process, and neurological system process. In the pathway enrichment analysis of targets, mainly the terms related to neural activity showed significant enrichment and formed a cluster. Finally, relative degree analysis of the target-disease associations of P. ginseng revealed several categories of related diseases, including respiratory, psychiatric, and cardiovascular diseases.
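
    A minimal sketch of the compound-target network construction with networkx; the ginsenoside-target pairings below are placeholders, not the paper's validated interactions:

        import networkx as nx
        from networkx.algorithms import bipartite

        # Toy bipartite compound-target edges; the pairings are invented placeholders.
        edges = [
            ("ginsenoside_Rb1", "ACHE"), ("ginsenoside_Rb1", "NOS3"),
            ("ginsenoside_Rg1", "NOS3"), ("ginsenoside_Rg1", "SLC6A4"),
            ("ginsenoside_Re",  "SLC6A4"), ("ginsenoside_Re",  "ACHE"),
        ]
        B = nx.Graph(edges)
        compounds = {c for c, _ in edges}

        # Project onto targets: two targets connect when some compound hits both,
        # capturing the multi-ingredient, multi-target interplay described above.
        targets = bipartite.projected_graph(B, set(B) - compounds)

        # Degree as a crude proxy for how widely a target is co-addressed.
        print(sorted(targets.degree, key=lambda kv: -kv[1]))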

  14. Understanding Whole Systems Change in Health Care: Insights into System Level Diffusion from Nursing Service Delivery Innovations--A Multiple Case Study

    Science.gov (United States)

    Berta, Whitney; Virani, Tazim; Bajnok, Irmajean; Edwards, Nancy; Rowan, Margo

    2014-01-01

    Our study responds to calls for theory-driven approaches to studying innovation diffusion processes in health care. While most research on diffusion in health care is situated at the service delivery level, we study innovations and associated processes that have diffused to the system level, and refer to work on complex adaptive systems and whole…

  15. Small- and large-stakes risk aversion: implications of concavity calibration for decision theory

    NARCIS (Netherlands)

    Cox, J.C.; Sadiraj, V.

    2006-01-01

    A growing literature reports the conclusions that: (a) expected utility theory does not provide a plausible theory of risk aversion for both small-stakes and large-stakes gambles; and (b) this decision theory should be replaced with an alternative theory characterized by loss aversion. This paper

  16. Effective Teacher Practice on the Plausibility of Human-Induced Climate Change

    Science.gov (United States)

    Niepold, F.; Sinatra, G. M.; Lombardi, D.

    2013-12-01

    Climate change education programs in the United States seek to promote a deeper understanding of the science of climate change, behavior change and stewardship, and support informed decision making by individuals, organizations, and institutions, all of which is summarized under the term 'climate literacy.' The ultimate goal of climate literacy is to enable actors to address climate change, both in terms of stabilizing and reducing emissions of greenhouse gases and in terms of an increased capacity to prepare for the consequences and opportunities of climate change. However, the long-term nature of climate change and the required societal response involve changing students' ideas about controversial scientific issues, which presents unique challenges for educators (Lombardi & Sinatra, 2010; Sinatra & Mason, 2008). This session will explore how United States educational efforts focus on three distinct, but related, areas: the science of climate change, the human-climate interaction, and using climate education to promote informed decision making. Each of these approaches is represented in the Atlas of Science Literacy (American Association for the Advancement of Science, 2007) and in the conceptual framework for science education developed at the National Research Council (NRC) in 2012. Instruction to develop these fundamental thinking skills (e.g., critical evaluation and plausibility reappraisal) has been called for by the Next Generation Science Standards (NGSS) (Achieve, 2013), an innovative and research-based way to address climate change education within the decentralized U.S. education system. The promise of the NGSS is that students will have more time to build mastery of the subjects, but the form of the instructional practice has been shown to be critical. Research has shown that effective instructional activities that promote evaluation of evidence improve students' understanding of, and acceptance toward, the scientifically accepted model of human-induced climate change.

  17. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support

    OpenAIRE

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Background: Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mech...

  18. Systematic reviews need to consider applicability to disadvantaged populations: inter-rater agreement for a health equity plausibility algorithm.

    Science.gov (United States)

    Welch, Vivian; Brand, Kevin; Kristjansson, Elizabeth; Smylie, Janet; Wells, George; Tugwell, Peter

    2012-12-19

    Systematic reviews have been challenged to consider effects on disadvantaged groups. A priori specification of subgroup analyses is recommended to increase the credibility of these analyses. This study aimed to develop and assess inter-rater agreement for an algorithm for systematic review authors to predict whether differences in effect measures are likely for disadvantaged populations relative to advantaged populations (only relative effect measures were addressed). A health equity plausibility algorithm was developed using clinimetric methods, with three items based on literature review, key informant interviews and methodology studies. The three items dealt with the plausibility of differences in relative effects across sex or socioeconomic status (SES) due to: (1) patient characteristics; (2) intervention delivery (i.e., implementation); and (3) comparators. Thirty-five respondents (clinicians, methodologists and research users) assessed the likelihood of differences across sex and SES for ten systematic reviews with these questions. We assessed inter-rater reliability using Fleiss' multi-rater kappa. The proportion agreement was 66% for patient characteristics (95% confidence interval: 61% to 71%), 67% for intervention delivery (95% confidence interval: 62% to 72%) and 55% for the comparator (95% confidence interval: 50% to 60%). Inter-rater agreement, assessed with Fleiss' kappa, ranged from 0 to 0.199, representing very low agreement beyond chance. Users of systematic reviews rated that important differences in relative effects across sex and socioeconomic status were plausible for a range of individual and population-level interventions. However, there was very low inter-rater agreement for these assessments. There is an unmet need for discussion of the plausibility of differential effects in systematic reviews. Increased consideration of external validity and applicability to different populations and settings is warranted in systematic reviews to meet this need.
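
    Fleiss' multi-rater kappa used here is available off the shelf; a minimal sketch with statsmodels, using invented ratings (each row one review item, each column one rater, 1 = "differences plausible", 0 = "not plausible"):

        import numpy as np
        from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

        rng = np.random.default_rng(0)
        ratings = rng.integers(0, 2, size=(10, 5))  # 10 items x 5 raters; invented

        # aggregate_raters turns subject-by-rater codes into subject-by-category counts.
        table, _categories = aggregate_raters(ratings)
        print(fleiss_kappa(table, method="fleiss"))

    Random ratings give a kappa near zero, which is the "very low agreement beyond chance" regime the study reports.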

  19. Uncertain socioeconomic projections used in travel demand and emissions models: could plausible errors result in air quality nonconformity?

    International Nuclear Information System (INIS)

    Rodier, C.J.; Johnston, R.A.

    2002-01-01

    A sensitivity analysis of plausible errors in population, employment, fuel price, and income projections is conducted using the travel demand and emissions models of the Sacramento, CA, USA, region for its transportation plan. The results of the analyses indicate that plausible error ranges for household income and fuel prices are not a significant source of uncertainty with respect to the region's travel demand and emissions projections. However, plausible errors in population and employment projections (within approximately one standard deviation) may result in the region's transportation plan not meeting the conformity test for oxides of nitrogen (NOx) in the year 2005 (i.e., an approximately 16% probability). This outcome is also possible in the year 2015 but less likely (within approximately two standard deviations, or a 2.5% probability). Errors in socioeconomic projections are only one of many sources of error in travel demand and emissions models. These results have several policy implications. First, regions like Sacramento that meet their conformity tests by a very small margin should rethink new highway investment and consider contingency transportation plans that incorporate more aggressive emissions reduction policies. Second, regional transportation planning agencies should conduct sensitivity analyses as part of their conformity analysis, to make significant uncertainties in the methods explicit and to identify the probability of their transportation plan not conforming. Third, the US Environmental Protection Agency (EPA) should clarify the interpretation of "demonstrate" conformity of transportation plans; that is, specify the level of certainty that it considers a sufficient demonstration of conformity. (author)

  20. Vulnerabilities to agricultural production shocks: An extreme, plausible scenario for assessment of risk for the insurance sector

    Directory of Open Access Journals (Sweden)

    Tobias Lunt

    2016-01-01

    Climate risks pose a threat to the function of the global food system and therefore also a hazard to the global financial sector, the stability of governments, and the food security and health of the world's population. This paper presents a method to assess the plausible impacts of an agricultural production shock and their potential materiality for global insurers. A hypothetical, near-term, plausible, extreme scenario was developed based upon modules of historical agricultural production shocks, linked under a warm-phase El Niño-Southern Oscillation (ENSO) meteorological framework. The scenario included teleconnected floods and droughts in disparate agricultural production regions around the world, as well as plausible, extreme biotic shocks. In this scenario, global crop yield declines of 10% for maize, 11% for soy, 7% for wheat and 7% for rice result in quadrupled commodity prices and commodity stock fluctuations, civil unrest, significant negative humanitarian consequences and major financial losses worldwide. This work illustrates a need for the scientific community to partner across sectors and industries towards better-integrated global data, modeling and analytical capacities, to better respond to and prepare for concurrent agricultural failure. Governments, humanitarian organizations and the private sector may collectively recognize significant benefits from more systematic assessment of exposure to agricultural climate risk.

  1. Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes.

    Science.gov (United States)

    Jabeen, Sumera

    2018-06-01

    Social development programmes are deliberate attempts to bring about change, and unintended outcomes can be considered inherent to any such intervention. There is now a solid consensus among the international evaluation community regarding the need to consider unintended outcomes as a key aspect of any evaluative study. However, this concern often equates to nothing more than false piety. Existing evaluation theory suffers from overlapping terminology, inadequate categorisation of unintended outcomes and a lack of guidance on how to study them. To advance the knowledge of evaluation theory, methods and practice, the author has developed an evaluation approach to study unintended effects using a theory building, testing and refinement process. A comprehensive classification of unintended outcomes on the basis of knowability, value, distribution and temporality helped specify various types of unintended outcomes for programme evaluation. Corresponding to this classification, a three-step evaluation process was proposed, including (a) outlining programme intentions, (b) forecasting likely unintended effects, and (c) mapping the anticipated and understanding the unanticipated unintended outcomes. This unintended outcomes evaluation approach (UOEA) was then trialled by undertaking a multi-site and multi-method case study of a poverty alleviation programme in Pakistan, and refinements were made to the approach. The case study revealed that this programme was producing a number of unintended effects, mostly negative, affecting those already disadvantaged, such as the poorest, women and children. The trialling process demonstrated the effectiveness of the UOEA and suggests that it can serve as a useful guide for future evaluation practice. It also provides the discipline of evaluation with an empirically-based reference point for further theoretical developments in the study of unintended outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. The Conceptual Copy Theory for the Origin of Language

    NARCIS (Netherlands)

    Odijk, J.E.J.M.

    2013-01-01

    The CC-Theory is, if correct, an attractive theory: almost all of (3b) is explained from a very small evolutionary change. The character of the evolutionary change is biologically and evolutionarily plausible. Also, Chomsky needs a second evolutionary event to account for externalization. The

  3. Moral Contract Theory and Social Cognition : An Empirical Perspective

    NARCIS (Netherlands)

    Timmerman, Peter

    2014-01-01

    This interdisciplinary work draws on research from psychology and behavioral economics to evaluate the plausibility of moral contract theory. In a compelling manner with implications for moral theory more broadly, the author’s novel approach resolves a number of key contingencies in contractarianism

  4. Multiscale cosmology and structure-emerging dark energy: A plausibility analysis

    International Nuclear Information System (INIS)

    Wiegand, Alexander; Buchert, Thomas

    2010-01-01

    Cosmological backreaction suggests a link between structure formation and the expansion history of the Universe. In order to quantitatively examine this connection, we dynamically investigate a volume partition of the Universe into over- and underdense regions. This allows us to trace structure formation using the volume fraction of the overdense regions, λ_M, as its characterizing parameter. Employing results from cosmological perturbation theory and extrapolating the leading mode into the nonlinear regime, we construct a three-parameter model for the effective cosmic expansion history, involving λ_M0, the matter density Ω_m^D0, and the Hubble rate H_D0 of today's Universe. Taking standard values for Ω_m^D0 and H_D0, as well as a reasonable value for λ_M0 that we derive from N-body simulations, we determine the corresponding amounts of backreaction and spatial curvature. We find that the values sufficient to generate today's structure also lead to a ΛCDM-like behavior of the scale factor, parametrized by the same parameters Ω_m^D0 and H_D0, but without a cosmological constant. However, the temporal behavior of λ_M does not faithfully reproduce the structure formation history. Surprisingly, the model matches structure formation under the assumption of a low matter content, Ω_m^D0 ≈ 3%, a result that hints at a different interpretation of part of the backreaction effect as kinematical dark matter. A complementary investigation assumes the ΛCDM fit-model for the evolution of the global scale factor by imposing a global replacement of the cosmological constant through backreaction, and also supposes that a Newtonian simulation of structure formation provides the correct volume partition into over- and underdense regions. From these assumptions we derive the corresponding evolution laws for backreaction and spatial curvature on the partitioned domains. We find the correct scaling limit predicted by perturbation

  5. Bohm's theory versus dynamical reduction

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Grassi, R.

    1995-10-01

    This essay begins with a comparison between Bohm's theory and the dynamical reduction program. While there are similarities (e.g., the preferred basis), there are also important differences (e.g., the type of nonlocality or of Lorentz invariance). In particular, it is made plausible that theories which exhibit parameter dependence effects cannot be ''genuinely Lorentz invariant''. For the two approaches under consideration, this analysis provides a comparison that can produce a richer understanding both of the pilot wave and of the dynamical reduction mechanism. (author). 33 refs, 1 fig

  6. A Note on Unified Statistics Including Fermi-Dirac, Bose-Einstein, and Tsallis Statistics, and Plausible Extension to Anisotropic Effect

    Directory of Open Access Journals (Sweden)

    Christianto V.

    2007-04-01

    In the light of some recent hypotheses suggesting a plausible unification of thermostatistics, in which Fermi-Dirac, Bose-Einstein and Tsallis statistics become its special subsets, we consider a further plausible extension to include non-integer Hausdorff dimension, which becomes a realization of the fractal entropy concept. In the subsequent section, we also discuss a plausible extension of this unified statistics to include anisotropic effects by using a quaternion oscillator, which may be observed in the context of the Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
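
    For reference, the Tsallis entropy behind the q-generalized statistics mentioned here is the standard expression (not reproduced from the article):

        S_q = k \frac{1 - \sum_i p_i^{\,q}}{q - 1},
        \qquad
        \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,

    so the Boltzmann-Gibbs form, and with it the familiar quantum-statistics limits, is recovered as q approaches 1.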

  7. DAEDALUS: System-Level Design Methodology for Streaming Multiprocessor Embedded Systems on Chips

    NARCIS (Netherlands)

    Stefanov, T.; Pimentel, A.; Nikolov, H.; Ha, S.; Teich, J.

    2017-01-01

    The complexity of modern embedded systems, which are increasingly based on heterogeneous multiprocessor system-on-chip (MPSoC) architectures, has led to the emergence of system-level design. To cope with this design complexity, system-level design aims at raising the abstraction level of the design

  8. Power monitors: A framework for system-level power estimation using heterogeneous power models

    NARCIS (Netherlands)

    Bansal, N.; Lahiri, K.; Raghunathan, A.; Chakradhar, S.T.

    2005-01-01

    Power analysis early in the design cycle is critical for the design of low-power systems. With the move to system-level specifications and design methodologies, there has been significant research interest in system-level power estimation. However, as demonstrated in this paper, the addition of

  9. NASA: A generic infrastructure for system-level MP-SoC design space exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Pimentel, A.D.; Thompson, M.; Bautista, T.; Núñez, A.

    2010-01-01

    System-level simulation and design space exploration (DSE) are key ingredients for the design of multiprocessor system-on-chip (MP-SoC) based embedded systems. The efforts in this area, however, typically use ad-hoc software infrastructures to facilitate and support the system-level DSE experiments.

  10. Exploiting Domain Knowledge in System-level MPSoC Design Space Exploration

    NARCIS (Netherlands)

    Thompson, M.; Pimentel, A.D.

    2013-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded multimedia systems. During system-level DSE, system parameters like, e.g., the number and type of processors, and the mapping of

  11. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    Science.gov (United States)

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  12. Design space pruning through hybrid analysis in system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system archi- tectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size

  13. Interleaving methods for hybrid system-level MPSoC design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.; McAllister, J.; Bhattacharyya, S.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of

  14. Pruning techniques for multi-objective system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.

    2014-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of

  15. Dedicated clock/timing-circuit theories of time perception and timed performance

    NARCIS (Netherlands)

    van Rijn, Hedderik; Gu, Bon-Mi; Meck, Warren H

    2014-01-01

    Scalar Timing Theory (an information-processing version of Scalar Expectancy Theory) and its evolution into the neurobiologically plausible Striatal Beat-Frequency (SBF) theory of interval timing are reviewed. These pacemaker/accumulator or oscillation/coincidence detection models are then

  16. Gene-ontology enrichment analysis in two independent family-based samples highlights biologically plausible processes for autism spectrum disorders.

    LENUS (Irish Health Repository)

    Anney, Richard J L

    2012-02-01

    Recent genome-wide association studies (GWAS) have implicated a range of genes from discrete biological pathways in the aetiology of autism. However, despite the strong influence of genetic factors, association studies have yet to identify statistically robust, replicated major effect genes or SNPs. We apply the principle of the SNP ratio test methodology described by O'Dushlaine et al to over 2100 families from the Autism Genome Project (AGP). Using a two-stage design we examine association enrichment in 5955 unique gene-ontology classifications across four groupings based on two phenotypic and two ancestral classifications. Based on estimates from simulation we identify an excess of association enrichment across all analyses. We observe enrichment in association for sets of genes involved in diverse biological processes, including pyruvate metabolism, transcription factor activation, cell-signalling and cell-cycle regulation. Both genes and processes that show enrichment have previously been examined in autistic disorders and offer biological plausibility to these findings.
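    As a rough illustration of the enrichment logic described above, the sketch below computes an SNP-ratio-style statistic and an empirical p-value. It is a simplified stand-in: the published method permutes phenotype labels rather than drawing random SNP sets, and all names and thresholds here are assumptions.

```python
import numpy as np

def snp_ratio_enrichment(pvals, in_pathway, alpha=0.05, n_perm=10_000, seed=0):
    """Toy SNP-ratio enrichment test.

    pvals      : array of per-SNP association p-values
    in_pathway : boolean mask marking SNPs mapped to the gene set
    Returns the observed enrichment ratio and an empirical p-value
    obtained by drawing random SNP sets of the same size.
    """
    rng = np.random.default_rng(seed)
    sig = pvals < alpha                      # "significant" SNPs
    k = int(in_pathway.sum())
    base = max(sig.mean(), 1e-12)            # genome-wide significant fraction
    observed = sig[in_pathway].mean() / base
    null = np.empty(n_perm)
    for i in range(n_perm):
        idx = rng.choice(pvals.size, size=k, replace=False)
        null[i] = sig[idx].mean() / base
    p_emp = (np.sum(null >= observed) + 1) / (n_perm + 1)
    return observed, p_emp
```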

  17. ASPECT (Automated System-level Performance Evaluation and Characterization Tool), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI has developed a suite of SAA tools and an analysis capability referred to as ASPECT (Automated System-level Performance Evaluation and Characterization Tool)....

  18. Vocation in theology-based nursing theories.

    Science.gov (United States)

    Lundmark, Mikael

    2007-11-01

    By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.

  19. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  20. MRI Proton Density Fat Fraction Is Robust Across the Biologically Plausible Range of Triglyceride Spectra in Adults With Nonalcoholic Steatohepatitis

    Science.gov (United States)

    Hong, Cheng William; Mamidipalli, Adrija; Hooker, Jonathan C.; Hamilton, Gavin; Wolfson, Tanya; Chen, Dennis H.; Dehkordy, Soudabeh Fazeli; Middleton, Michael S.; Reeder, Scott B.; Loomba, Rohit; Sirlin, Claude B.

    2017-01-01

    Background Proton density fat fraction (PDFF) estimation requires spectral modeling of the hepatic triglyceride (TG) signal. Deviations in the TG spectrum may occur, leading to bias in PDFF quantification. Purpose To investigate the effects of varying six-peak TG spectral models on PDFF estimation bias. Study Type Retrospective secondary analysis of prospectively acquired clinical research data. Population Forty-four adults with biopsy-confirmed nonalcoholic steatohepatitis. Field Strength/Sequence Confounder-corrected chemical-shift-encoded 3T MRI (using a 2D multiecho gradient-recalled echo technique with magnitude reconstruction) and MR spectroscopy. Assessment In each patient, 61 pairs of colocalized MRI-PDFF and MRS-PDFF values were estimated: one pair used the standard six-peak spectral model, the other 60 were six-peak variants calculated by adjusting spectral model parameters over their biologically plausible ranges. MRI-PDFF values calculated using each variant model and the standard model were compared, and the agreement between MRI-PDFF and MRS-PDFF was assessed. Statistical Tests MRS-PDFF and MRI-PDFF were summarized descriptively. Bland–Altman (BA) analyses were performed between PDFF values calculated using each variant model and the standard model. Linear regressions were performed between BA biases and mean PDFF values for each variant model, and between MRI-PDFF and MRS-PDFF. Results Using the standard model, mean MRS-PDFF of the study population was 17.9±8.0% (range: 4.1–34.3%). The difference between the highest and lowest mean variant MRI-PDFF values was 1.5%. Relative to the standard model, the model with the greatest absolute BA bias overestimated PDFF by 1.2%. Bias increased with increasing PDFF. Data Conclusion At hepatic fat content levels encountered clinically, PDFF estimation is robust across the biologically plausible range of TG spectra. Although absolute estimation bias increased with higher PDFF, its magnitude was small and unlikely to be clinically meaningful. Level of
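    The Bland–Altman comparison named in the abstract is straightforward to reproduce in outline. The sketch below is a generic implementation, not code from the study; the 1.96·SD limits of agreement and the regression of differences on means are the textbook choices:

```python
import numpy as np
from scipy import stats

def bland_altman(variant_pdff, standard_pdff):
    """Bland-Altman comparison of a variant spectral model against the
    standard six-peak model (illustrative; variable names are assumed)."""
    variant = np.asarray(variant_pdff, dtype=float)
    standard = np.asarray(standard_pdff, dtype=float)
    diff = variant - standard
    mean = (variant + standard) / 2.0
    bias = diff.mean()                      # systematic offset
    half_width = 1.96 * diff.std(ddof=1)    # limits-of-agreement half-width
    # Does the bias grow with fat fraction? Regress differences on means.
    slope, intercept, r, p, se = stats.linregress(mean, diff)
    return bias, (bias - half_width, bias + half_width), slope, p
```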

  1. Reconstruction of abstract quantum theory

    International Nuclear Information System (INIS)

    Drieschner, M.; Goernitz, T.; von Weizsaecker, C.F.

    1988-01-01

    Understanding quantum theory as a general theory of prediction, we reconstruct abstract quantum theory. Abstract means the general frame of quantum theory, without reference to a three-dimensional position space, to concepts like particle or field, or to special laws of dynamics. Reconstruction is the attempt to do this by formulating simple and plausible postulates on prediction in order to derive the basic concepts of quantum theory from them. Thereby no law of classical physics is presupposed which would then have to be quantized. We briefly discuss the relationship of theory and interpretation in physics and the fundamental role of time as a basic concept for physics. Then a number of assertions are given, formulated as succinctly as possible in order to make them easily quotable and comparable. The assertions are arranged in four groups: heuristic principles, verbal definitions of some terms, three basic postulates, and consequences. The three postulates of separable alternatives, indeterminism, and kinematics are the central points of this work. These brief assertions are commented upon, and their relationship with the interpretation of quantum theory is discussed. Also given are an outlook on the further development into concrete quantum theory and some philosophical reflections

  2. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  3. System-Level Design Methodologies for Networked Multiprocessor Systems-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir

    2008-01-01

    is the first such attempt in the published literature. The second part of the thesis deals with the issues related to the development of system-level design methodologies for networked multiprocessor systems-on-chip at various levels of design abstraction with special focus on the modeling and design...... at the system-level. The multiprocessor modeling framework is then extended to include models of networked multiprocessor systems-on-chip which is then employed to model wireless sensor networks both at the sensor node level as well as the wireless network level. In the third and the final part, the thesis...... to the transaction-level model. The thesis, as a whole makes contributions by describing a design methodology for networked multiprocessor embedded systems at three layers of abstraction from system-level through transaction-level to the cycle accurate level as well as demonstrating it practically by implementing...

  4. System-Level Modelling and Simulation of MEMS-Based Sensors

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Madsen, Jan; Shafique, Mohammad

    2005-01-01

    The growing complexity of MEMS devices and their increased use in embedded systems (e.g., wireless integrated sensor networks) demands a disciplined approach for MEMS design as well as the development of techniques for system-level modeling of these devices so that a seamless integration with the existing embedded system design methodologies is possible. In this paper, we present a MEMS design methodology that uses a VHDL-AMS based system-level model of a MEMS device as a starting point and combines the top-down and bottom-up design approaches for design, verification, and optimization

  5. Plausibility of stromal initiation of epithelial cancers without a mutation in the epithelium: a computer simulation of morphostats

    Directory of Open Access Journals (Sweden)

    Cappuccio Antonio

    2009-03-01

    Background There is experimental evidence from animal models favoring the notion that the disruption of interactions between stroma and epithelium plays an important role in the initiation of carcinogenesis. These disrupted interactions are hypothesized to be mediated by molecules, termed morphostats, which diffuse through the tissue to determine cell phenotype and maintain tissue architecture. Methods We developed a computer simulation based on simple properties of cell renewal and morphostats. Results Under the computer simulation, the disruption of the morphostat gradient in the stroma generated epithelial precursors of cancer without any mutation in the epithelium. Conclusion The model is consistent with the possibility that the accumulation of genetic and epigenetic changes found in tumors could arise after the formation of a founder population of aberrant cells, defined as cells that are created by low or insufficient morphostat levels and that no longer respond to morphostat concentrations. Because the model is biologically plausible, we hope that these results will stimulate further experiments.

  6. Self-focused and other-focused resiliency: Plausible mechanisms linking early family adversity to health problems in college women.

    Science.gov (United States)

    Coleman, Sulamunn R M; Zawadzki, Matthew J; Heron, Kristin E; Vartanian, Lenny R; Smyth, Joshua M

    2016-01-01

    This study examined whether self-focused and other-focused resiliency help explain how early family adversity relates to perceived stress, subjective health, and health behaviors in college women. Female students (N = 795) participated between October 2009 and May 2010. Participants completed self-report measures of early family adversity, self-focused (self-esteem, personal growth initiative) and other-focused (perceived social support, gratitude) resiliency, stress, subjective health, and health behaviors. Using structural equation modeling, self-focused resiliency associated with less stress, better subjective health, more sleep, less smoking, and less weekend alcohol consumption. Other-focused resiliency associated with more exercise, greater stress, and more weekend alcohol consumption. Early family adversity was indirectly related to all health outcomes, except smoking, via self-focused and other-focused resiliency. Self-focused and other-focused resiliency represent plausible mechanisms through which early family adversity relates to stress and health in college women. This highlights areas for future research in disease prevention and management.

  7. Synchronous volcanic eruptions and abrupt climate change ∼17.7 ka plausibly linked by stratospheric ozone depletion.

    Science.gov (United States)

    McConnell, Joseph R; Burke, Andrea; Dunbar, Nelia W; Köhler, Peter; Thomas, Jennie L; Arienzo, Monica M; Chellman, Nathan J; Maselli, Olivia J; Sigl, Michael; Adkins, Jess F; Baggenstos, Daniel; Burkhart, John F; Brook, Edward J; Buizert, Christo; Cole-Dai, Jihong; Fudge, T J; Knorr, Gregor; Graf, Hans-F; Grieman, Mackenzie M; Iverson, Nels; McGwire, Kenneth C; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H; Saltzman, Eric S; Severinghaus, Jeffrey P; Steffensen, Jørgen Peder; Taylor, Kendrick C; Winckler, Gisela

    2017-09-19

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ∼17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ∼192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics-similar to those associated with modern stratospheric ozone depletion over Antarctica-plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ∼17.7 ka.

  8. The Plausibility of Tonal Evolution in the Malay Dialect Spoken in Thailand: Evidence from an Acoustic Study

    Directory of Open Access Journals (Sweden)

    Phanintra Teeranon

    2007-12-01

    The F0 values of vowels following voiceless consonants are higher than those of vowels following voiced consonants; high vowels have a higher F0 than low vowels. It has also been found that when high vowels follow voiced consonants, the F0 values decrease. In contrast, low vowels following voiceless consonants show increasing F0 values. In other words, the voicing of initial consonants has been found to counterbalance the intrinsic F0 values of high and low vowels (House and Fairbanks 1953, Lehiste and Peterson 1961, Lehiste 1970, Laver 1994, Teeranon 2006. To test whether these three findings are applicable to a disyllabic language, the F0 values of high and low vowels following voiceless and voiced consonants were studied in a Malay dialect of the Austronesian language family spoken in Pathumthani Province, Thailand. The data was collected from three male informants, aged 30-35. The Praat program was used for acoustic analysis. The findings revealed the influence of the voicing of initial consonants on the F0 of vowels to be greater than that of the influence of vowel height. Evidence from this acoustic study shows the plausibility for the Malay dialect spoken in Pathumthani to become a tonal language by the influence of initial consonants rather than by the influence of the high-low vowel dimension.

  9. Contrast normalization contributes to a biologically-plausible model of receptive-field development in primary visual cortex (V1)

    Science.gov (United States)

    Willmore, Ben D.B.; Bulstrode, Harry; Tolhurst, David J.

    2012-01-01

    Neuronal populations in the primary visual cortex (V1) of mammals exhibit contrast normalization. Neurons that respond strongly to simple visual stimuli – such as sinusoidal gratings – respond less well to the same stimuli when they are presented as part of a more complex stimulus which also excites other, neighboring neurons. This phenomenon is generally attributed to generalized patterns of inhibitory connections between nearby V1 neurons. The Bienenstock, Cooper and Munro (BCM) rule is a neural network learning rule that, when trained on natural images, produces model neurons which, individually, have many tuning properties in common with real V1 neurons. However, when viewed as a population, a BCM network is very different from V1 – each member of the BCM population tends to respond to the same dominant features of visual input, producing an incomplete, highly redundant code for visual information. Here, we demonstrate that, by adding contrast normalization into the BCM rule, we arrive at a neurally-plausible Hebbian learning rule that can learn an efficient sparse, overcomplete representation that is a better model for stimulus selectivity in V1. This suggests that one role of contrast normalization in V1 is to guide the neonatal development of receptive fields, so that neurons respond to different features of visual input. PMID:22230381
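    As a minimal sketch of the idea, the following combines the classical BCM update (weight change proportional to y(y − θ)x, with the threshold θ tracking E[y²]) with a divisive contrast-normalization step across the population. Parameter values and the exact normalization form are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def bcm_with_normalization(X, n_units=8, eta=1e-3, tau=0.01, sigma=1.0,
                           epochs=10, seed=0):
    """Toy BCM network with divisive contrast normalization.

    X : (n_samples, n_inputs) array of zero-mean image patches.
    Returns the learned weight matrix W of shape (n_units, n_inputs).
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_units, X.shape[1]))
    theta = np.ones(n_units)                    # sliding modification thresholds
    for _ in range(epochs):
        for x in X:
            r = W @ x                           # raw linear responses
            y = r / (sigma + np.abs(r).sum())   # divisive contrast normalization
            W += eta * np.outer(y * (y - theta), x)   # BCM weight update
            theta += tau * (y**2 - theta)       # theta tracks E[y^2]
    return W
```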

  10. Bilinguals' Plausibility Judgments for Phrases with a Literal vs. Non-literal Meaning: The Influence of Language Brokering Experience

    Directory of Open Access Journals (Sweden)

    Belem G. López

    2017-09-01

    Previous work has shown that prior experience in language brokering (informal translation may facilitate the processing of meaning within and across language boundaries. The present investigation examined the influence of brokering on bilinguals' processing of two word collocations with either a literal or a figurative meaning in each language. Proficient Spanish-English bilinguals classified as brokers or non-brokers were asked to judge if adjective+noun phrases presented in each language made sense or not. Phrases with a literal meaning (e.g., stinging insect) were interspersed with phrases with a figurative meaning (e.g., stinging insult) and non-sensical phrases (e.g., stinging picnic). It was hypothesized that plausibility judgments would be facilitated for literal relative to figurative meanings in each language but that experience in language brokering would be associated with a more equivalent pattern of responding across languages. These predictions were confirmed. The findings add to the body of empirical work on individual differences in language processing in bilinguals associated with prior language brokering experience.

  11. A theory evaluation of an induction programme

    Directory of Open Access Journals (Sweden)

    Kenrick Hendricks

    2012-07-01

    Orientation: An induction programme is commonly used to help new employees understand their job within the organisation. Research purpose: The main aim of this study was to examine whether or not the programme theory of an induction programme was plausible and would lead to the intended outcomes as described by the programme manager. Motivation for the study: Induction training is one of the most common training programmes in an organisation. However, there is little research to evaluate whether or not the activities of an induction programme will lead to the intended outcomes of such a programme. Research design, approach and method: This theory evaluation used a descriptive design. One hundred and thirteen employees of a media company completed a ten-item, five-point Likert scale which measured their perceptions of the programme’s outcome, identification with the organisation and intentions to stay with the organisation. Main findings: From this theory evaluation it was apparent that an induction programme based on an implausible programme theory could be problematic. An implausible programme theory affects the design of the programme activities and unsuitable activities may not deliver the desired outcomes. Practical/managerial implications: The intention of the evaluation is to guide human resource managers through a process of replacing an implausible programme theory with one that is plausible, and which ensures better alignment of programme activities and outcomes. Contribution/value-add: The evaluators showed how a plausible programme theory could improve programme design. This redesigned induction programme may lead to benefits, such as staff retention and company identification, rather than the vague assumption that it has been conforming to a legal obligation.

  12. System-Level Sensitivity Analysis of SiNW-bioFET-Based Biosensing Using Lockin Amplification

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Kjærgaard, Claus

    2017-01-01

    We carry out for the first time the system-level sensitivity analysis of a generic SiNW-bioFET model coupled to a custom-design instrument based on the lock-in amplifier. By investigating a large parametric space spanning over both sensor and instrumentation specifications, we demonstrate that systemwide

  13. The Artemis workbench for system-level performance evaluation of embedded systems

    NARCIS (Netherlands)

    Pimentel, A.D.

    2008-01-01

    In this paper, we present an overview of the Artemis workbench, which provides modelling and simulation methods and tools for efficient performance evaluation and exploration of heterogeneous embedded multimedia systems. More specifically, we describe the Artemis system-level modelling methodology,

  14. A system-level modelling perspective of the KwaZulu-Natal Bight ...

    African Journals Online (AJOL)

    Requirements to take the hydrodynamic, biogeochemical and first ecosystem modelling efforts towards a meaningful predictive capability are discussed. The importance of adopting a system-level view of the bight and its connected systems for realistic exploration of global change scenarios is highlighted.

  15. System-level modelling of dynamic reconfigurable designs using functional programming abstractions

    NARCIS (Netherlands)

    Uchevler, B.N.; Svarstad, Kjetil; Kuper, Jan; Baaij, C.P.R.

    With the increasing size and complexity of designs in electronics, new approaches are required for the description and verification of digital circuits, specifically at the system level. Functional HDLs can appear as an advantageous choice for formal verification and high-level descriptions. In this

  16. A Probabilistic Approach for the System-Level Design of Multi-ASIP Platforms

    DEFF Research Database (Denmark)

    Micconi, Laura

    We introduce a system-level Design Space Exploration (DSE) for the very early phases of the design that automates part of the multi-ASIP design flow. Our DSE is responsible for assigning the tasks to the different ASIPs exploring different platform alternatives. We perform a schedulability analysis for each

  17. System-Level Design of an Integrated Receiver Front End for a Wireless Ultrasound Probe

    DEFF Research Database (Denmark)

    di Ianni, Tommaso; Hemmsen, Martin Christian; Llimos Muntal, Pere

    2016-01-01

    In this paper, a system-level design is presented for an integrated receive circuit for a wireless ultrasound probe, which includes analog front ends and beamformation modules. This paper focuses on the investigation of the effects of architectural design choices on the image quality. The point...

  18. Strawsonian Libertarianism: A Theory of Free Will and Moral Responsibility

    OpenAIRE

    Franklin, Christopher

    2010-01-01

    My dissertation develops a novel theory of free will and moral responsibility, Strawsonian libertarianism, which combines Strawsonianism about the concept of moral responsibility with event-causal libertarianism concerning its conditions of application. I construct this theory in light of and response to the three main objections to libertarianism: the moral shallowness objection, the intelligibility objection, and the empirical plausibility objection. The moral shallowness objection contends...

  19. Theory of bending waves with applications to disk galaxies

    International Nuclear Information System (INIS)

    Mark, J.W.K.

    1982-01-01

    A theory of bending waves is surveyed which provides an explanation for the required amplification of the warp in the Milky Way. It also provides for self-generated warps in isolated external galaxies. The shape of observed warps and partly their existence in isolated galaxies are indicative of substantial spheroidal components. The theory also provides a plausible explanation for the bending of the inner disk (<2 kpc) of the Milky Way

  20. The Mediterranean dietary pattern as the diet of choice for non-alcoholic fatty liver disease: Evidence and plausible mechanisms.

    Science.gov (United States)

    Zelber-Sagi, Shira; Salomone, Federico; Mlynarsky, Liat

    2017-07-01

    Non-alcoholic fatty liver disease (NAFLD) has become a major global health burden, leading to increased risk for cirrhosis, hepatocellular carcinoma, type-2 diabetes and cardiovascular disease. Lifestyle intervention aiming at weight reduction is the most established treatment. However, changing the dietary composition even without weight loss can also reduce steatosis and improve metabolic alterations such as insulin resistance and lipid profile. The Mediterranean diet (MD) pattern has been proposed as appropriate for this goal, and was recommended as the diet of choice for the treatment of NAFLD by the EASL-EASD-EASO Clinical Practice Guidelines. The MD has an established superiority in long-term weight reduction over low-fat diets, but it improves metabolic status and steatosis even without it. However, the effect on liver inflammation and fibrosis was tested only in a few observational studies, with positive results. Furthermore, considering the strong association between NAFLD and diabetes and CVD, the MD has a highly established advantage in the prevention of these diseases, demonstrated in randomized clinical trials. The individual components of the MD, such as olive oil, fish, nuts, whole grains, fruits, and vegetables, have been shown to beneficially affect or negatively correlate with NAFLD, while consumption of components that characterize a Western dietary pattern, such as soft drinks, fructose, meat and saturated fatty acids, has been shown to have a detrimental association with NAFLD. In this review we will cover the epidemiological evidence and the plausible molecular mechanisms by which the MD as a whole and each of its components can be of benefit in NAFLD. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    Science.gov (United States)

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  2. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…

  3. Bohm's theory versus dynamical reduction

    Energy Technology Data Exchange (ETDEWEB)

    Ghirardi, G C [International Centre for Theoretical Physics, Trieste (Italy)]; Grassi, R [Udine Univ., Udine (Italy). Dept. of Civil Engineering]

    1995-10-01

    This essay begins with a comparison between Bohm's theory and the dynamical reduction program. While there are similarities (e.g., the preferred basis), there are also important differences (e.g., the type of nonlocality or of Lorentz invariance). In particular, it is made plausible that theories which exhibit parameter dependence effects cannot be "genuinely Lorentz invariant". For the two approaches under consideration, this analysis provides a comparison that can produce a richer understanding both of the pilot wave and of the dynamical reduction mechanism. (author). 33 refs, 1 fig.

  4. System Level Design of a Continuous-Time Delta-Sigma Modulator for Portable Ultrasound Scanners

    DEFF Research Database (Denmark)

    Llimos Muntal, Pere; Færch, Kjartan; Jørgensen, Ivan Harald Holger

    2015-01-01

    In this paper the system level design of a continuous-time ∆Σ modulator for portable ultrasound scanners is presented. The overall required signal-to-noise ratio (SNR) is derived to be 42 dB and the sampling frequency used is 320 MHz for an oversampling ratio of 16. In order to match these requirements, the performance of the ∆Σ modulator versus various block performance parameters is presented as trade-off curves, based on high-level VerilogA simulations. Based on these results, the block specifications are derived.
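    The 42 dB target can be sanity-checked against the textbook peak SQNR of an ideal Lth-order modulator; the quantizer resolution below is an assumed example, since the abstract does not state it:

```python
import math

def ideal_sqnr_db(n_bits, order, osr):
    """Peak SQNR of an ideal order-L delta-sigma modulator with an
    n-bit quantizer (standard textbook formula)."""
    L = order
    return (6.02 * n_bits + 1.76
            + 10 * math.log10((2 * L + 1) / math.pi ** (2 * L))
            + 10 * (2 * L + 1) * math.log10(osr))

# e.g. an assumed 2nd-order, 1-bit design at the abstract's OSR of 16:
print(ideal_sqnr_db(1, 2, 16))   # ~55 dB, comfortably above the 42 dB target
```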

  5. System-Level Optimization of a DAC for Hearing-Aid Audio Class D Output Stage

    DEFF Research Database (Denmark)

    Pracný, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2013-01-01

    This paper deals with system-level optimization of a digital-to-analog converter (DAC) for a hearing-aid audio Class D output stage. We discuss the ΣΔ modulator system-level design parameters – the order, the oversampling ratio (OSR) and the number of bits in the quantizer. We show that combining a reduction of the OSR with an increase of the order results in considerable power savings while the audio quality is kept. For further savings in the ΣΔ modulator, overdesign and subsequent coarse coefficient quantization are used. A figure of merit (FOM) is introduced to confirm this optimization approach by comparing two ΣΔ modulator designs. The proposed optimization has impact on the whole hearing-aid audio back-end system, including less hardware in the interpolation filter and half the switching rate in the digital-pulse-width-modulation (DPWM) block and Class D output stage.

  6. Next Generation Civil Transport Aircraft Design Considerations for Improving Vehicle and System-Level Efficiency

    Science.gov (United States)

    Acosta, Diana M.; Guynn, Mark D.; Wahls, Richard A.; DelRosario, Ruben

    2013-01-01

    The future of aviation will benefit from research in aircraft design and air transportation management aimed at improving efficiency and reducing environmental impacts. This paper presents civil transport aircraft design trends and opportunities for improving vehicle and system-level efficiency. Aircraft design concepts and the emerging technologies critical to reducing thrust specific fuel consumption, reducing weight, and increasing lift to drag ratio currently being developed by NASA are discussed. Advancements in the air transportation system aimed towards system-level efficiency are discussed as well. Finally, the paper describes the relationship between the air transportation system, aircraft, and efficiency. This relationship is characterized by operational constraints imposed by the air transportation system that influence aircraft design, and operational capabilities inherent to an aircraft design that impact the air transportation system.

  7. A system level boundary scan controller board for VME applications [to CERN experiments]

    CERN Document Server

    Cardoso, N; Da Silva, J C

    2000-01-01

    This work is the result of a collaboration between INESC and LIP in the CMS experiment being conducted at CERN. The collaboration addresses the application of boundary scan test at system level, namely the development of a VME boundary scan controller (BSC) board prototype and the corresponding software. This prototype uses the MTM bus existing in the VME64* backplane to apply the 1149.1 test vectors to a system composed of nineteen boards, called here units under test (UUTs). A top-down approach is used to describe our work. The paper begins with some insights about the experiment being conducted at CERN, proceeds with system-level considerations concerning our work, and gives some details about the BSC board. The results obtained so far and the proposed work are reviewed at the end of this contribution. (11 refs).

  8. System-level perturbations of cell metabolism using CRISPR/Cas9

    DEFF Research Database (Denmark)

    Jakociunas, Tadas; Jensen, Michael Krogh; Keasling, Jay

    2017-01-01

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies more advanced and cost-effective. For metabolic engineering purposes, the CRISPR-based tools have been applied...... previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering....

  9. Competition, liquidity and stability: international evidence at the bank and systemic levels

    OpenAIRE

    Nguyen, Thi Ngoc My

    2017-01-01

    This thesis investigates the impact of market power on bank liquidity; the association between competition and systemic liquidity; and whether the associations between liquidity and stability at both bank- and systemic- levels are affected by competition. The first research question is explored in the context of 101 countries over 1996-2013 while the second and the third, which require listed banks, use a smaller sample of 32 nations during 2001-2013. The Panel Least Squares and the system Ge...

  10. System-level modeling for economic evaluation of geological CO2 storage in gas reservoirs

    International Nuclear Information System (INIS)

    Zhang, Yingqi; Oldenburg, Curtis M.; Finsterle, Stefan; Bodvarsson, Gudmundur S.

    2007-01-01

    One way to reduce the effects of anthropogenic greenhouse gases on climate is to inject carbon dioxide (CO2) from industrial sources into deep geological formations such as brine aquifers or depleted oil or gas reservoirs. Research is being conducted to improve understanding of factors affecting particular aspects of geological CO2 storage (such as storage performance, storage capacity, and health, safety and environmental (HSE) issues) as well as to lower the cost of CO2 capture and related processes. However, there has been less emphasis to date on system-level analyses of geological CO2 storage that consider geological, economic, and environmental issues by linking detailed process models to representations of engineering components and associated economic models. The objective of this study is to develop a system-level model for geological CO2 storage, including CO2 capture and separation, compression, pipeline transportation to the storage site, and CO2 injection. Within our system model we are incorporating detailed reservoir simulations of CO2 injection into a gas reservoir and related enhanced production of methane. Potential leakage and associated environmental impacts are also considered. The platform for the system-level model is GoldSim [GoldSim User's Guide. GoldSim Technology Group; 2006, http://www.goldsim.com]. The application of the system model focuses on evaluating the feasibility of carbon sequestration with enhanced gas recovery (CSEGR) in the Rio Vista region of California. The reservoir simulations are performed using a special module of the TOUGH2 simulator, EOS7C, for multicomponent gas mixtures of methane and CO2. Using a system-level modeling approach, the economic benefits of enhanced gas recovery can be directly weighed against the costs and benefits of CO2 injection.

  11. System-level energy efficiency is the greatest barrier to development of the hydrogen economy

    International Nuclear Information System (INIS)

    Page, Shannon; Krumdieck, Susan

    2009-01-01

    Current energy research investment policy in New Zealand is based on assumed benefits of transitioning to hydrogen as a transport fuel and as storage for electricity from renewable resources. The hydrogen economy concept, as set out in recent commissioned research investment policy advice documents, includes a range of hydrogen energy supply and consumption chains for transport and residential energy services. The benefits of research and development investments in these advice documents were not fully analyzed by cost or improvements in energy efficiency or greenhouse gas emissions reduction. This paper sets out a straightforward method to quantify the system-level efficiency of these energy chains. The method was applied to transportation and stationary heat and power, with hydrogen generated from wind energy, natural gas and coal. The system-level efficiencies for the hydrogen chains were compared to direct use of conventionally generated electricity, and with internal combustion engines operating on gas- or coal-derived fuel. The hydrogen energy chains were shown to provide little or no system-level efficiency improvement over conventional technology. The current research investment policy is aimed at enabling a hydrogen economy without considering the dramatic loss of efficiency that would result from using this energy carrier.
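    The chain-efficiency bookkeeping behind such comparisons is simply the product of stage efficiencies. The sketch below illustrates the method with round, assumed stage values, not the study's figures:

```python
# System-level efficiency of an energy chain = product of stage efficiencies.
def chain_efficiency(stages):
    eff = 1.0
    for name, eta in stages:
        eff *= eta
    return eff

# Illustrative (assumed) stage values for two wind-powered chains:
wind_h2_fc_vehicle = [("electrolysis", 0.70), ("compression", 0.90),
                      ("transport/storage", 0.95), ("fuel cell", 0.50),
                      ("electric drivetrain", 0.90)]
wind_battery_ev = [("transmission", 0.92), ("charge/discharge", 0.85),
                   ("electric drivetrain", 0.90)]

print(chain_efficiency(wind_h2_fc_vehicle))  # ~0.27
print(chain_efficiency(wind_battery_ev))     # ~0.70
```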

  12. Integrating Omics Technologies to Study Pulmonary Physiology and Pathology at the Systems Level

    Directory of Open Access Journals (Sweden)

    Ravi Ramesh Pathak

    2014-04-01

    Assimilation and integration of “omics” technologies, including genomics, epigenomics, proteomics, and metabolomics has readily altered the landscape of medical research in the last decade. The vast and complex nature of omics data can only be interpreted by linking molecular information at the organismic level, forming the foundation of systems biology. Research in pulmonary biology/medicine has necessitated integration of omics, network, systems and computational biology data to differentially diagnose, interpret, and prognosticate pulmonary diseases, facilitating improvement in therapy and treatment modalities. This review describes how to leverage this emerging technology in understanding pulmonary diseases at the systems level -called a “systomic” approach. Considering the operational wholeness of cellular and organ systems, diseased genome, proteome, and the metabolome needs to be conceptualized at the systems level to understand disease pathogenesis and progression. Currently available omics technology and resources require a certain degree of training and proficiency in addition to dedicated hardware and applications, making them relatively less user friendly for the pulmonary biologist and clinicians. Herein, we discuss the various strategies, computational tools and approaches required to study pulmonary diseases at the systems level for biomedical scientists and clinical researchers.

  13. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    Science.gov (United States)

    Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
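    A minimal sketch of the ISLOCE idea, under stated assumptions: an expensive subsystem analysis is sampled offline, a neural-network surrogate is fitted to the samples, and a system-level optimizer then searches the cheap surrogate. The mock model, network size, and bounds below are all hypothetical, not the Shuttle-tank setup:

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

def subsystem_model(x):
    # Stand-in for an expensive disciplinary analysis (mock function).
    return (x[0] - 1.0) ** 2 + np.sin(3 * x[1]) + x[1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))           # sampled design points
y = np.array([subsystem_model(x) for x in X])   # expensive evaluations

# Parametric neural-network approximation of the subsystem model.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# System-level optimizer searches the cheap surrogate instead.
res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
               x0=np.zeros(2), bounds=[(-2, 2), (-2, 2)])
print(res.x, subsystem_model(res.x))   # verify surrogate optimum on true model
```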

  14. Optimal maintenance policy incorporating system level and unit level for mechanical systems

    Science.gov (United States)

    Duan, Chaoqun; Deng, Chao; Wang, Bingran

    2018-04-01

    This study develops a multi-level maintenance policy combining the system level and the unit level under soft and hard failure modes. The system experiences system-level preventive maintenance (SLPM) when the conditional reliability of the entire system exceeds the SLPM threshold, and each single unit also undergoes a two-level maintenance: one level is initiated when a unit exceeds its preventive maintenance (PM) threshold, and the other is performed simultaneously whenever any other unit goes in for maintenance. The units experience both periodic inspections and aperiodic inspections prompted by failures of hard-type units. To model practical situations, two types of economic dependence have been taken into account: set-up cost dependence and maintenance expertise dependence, which arises because the same technology and tools/equipment can be utilised. The optimisation problem is formulated and solved in a semi-Markov decision process framework. The objective is to find the optimal system-level threshold and unit-level thresholds by minimising the long-run expected average cost per unit time. A formula for the mean residual life is derived for the proposed multi-level maintenance policy. The method is illustrated by a real case study of the feed subsystem of a boring machine, and a comparison with other policies demonstrates the effectiveness of our approach.

  15. Waltz's Theory of Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2009-01-01

    … anti-empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory, shows the power of a dominant philosophy of science in US IR, and thus the challenge facing any ambitious theorising. The article suggests a possible movement of fronts away from the ‘fourth debate’ between rationalism and reflectivism towards one of theory against empiricism. To help this new agenda

  16. Organizational- and system-level characteristics that influence implementation of shared decision-making and strategies to address them - a scoping review.

    Science.gov (United States)

    Scholl, Isabelle; LaRussa, Allison; Hahlweg, Pola; Kobrin, Sarah; Elwyn, Glyn

    2018-03-09

    failure to implement SDM in routine care. A wide range of characteristics described as supporting and inhibiting implementation were identified. Future studies should assess the impact of these characteristics on SDM implementation more thoroughly, quantify likely interactions, and assess how characteristics might operate across types of systems and areas of healthcare. Organizations that wish to support the adoption of SDM should carefully consider the role of organizational- and system-level characteristics. Implementation and organizational theory could provide useful guidance for how to address facilitators and barriers to change.

  17. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    Science.gov (United States)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before and rapid decisions during future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and short warning time, caused by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million in damages to cargo and additional

  18. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    Science.gov (United States)

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  19. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    Science.gov (United States)

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  20. Study on the system-level test method of digital metering in smart substation

    Science.gov (United States)

    Zhang, Xiang; Yang, Min; Hu, Juan; Li, Fuchao; Luo, Ruixi; Li, Jinsong; Ai, Bing

    2017-03-01

    Nowadays, the test methods for the digital metering system in a smart substation are used to test and evaluate the performance of a single device, but these methods can only guarantee the accuracy and reliability of the measurement results of a digital metering device in a single run; they do not fully reflect the performance when the devices operate together as a complete system. This paper introduces the shortcomings of the existing test methods. A system-level test method for digital metering in smart substations is proposed, and the feasibility of the method is demonstrated by an actual test.

  1. Enhanced Discrete-Time Scheduler Engine for MBMS E-UMTS System Level Simulator

    DEFF Research Database (Denmark)

    Pratas, Nuno; Rodrigues, António

    2007-01-01

    In this paper the design of an E-UMTS system level simulator developed for the study of optimization methods for the MBMS is presented. The simulator uses a discrete event based philosophy, which captures the dynamic behavior of the Radio Network System. This dynamic behavior includes the user mobility, radio interfaces and the Radio Access Network. Emphasis is given to the enhancements developed for the simulator core, the Event Scheduler Engine. Two implementations for the Event Scheduler Engine are proposed, one optimized for single-core processors and the other for multi-core ones.
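    The core of such an engine is a time-ordered priority queue of events. The sketch below shows a minimal single-core variant of a discrete-event scheduler; it is illustrative and omits the multi-core optimizations the paper proposes:

```python
import heapq
import itertools

class EventScheduler:
    """Minimal discrete-event scheduler engine (single-core variant)."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()   # tie-breaker for equal timestamps
        self.now = 0.0

    def schedule(self, time, action, *args):
        heapq.heappush(self._queue, (time, next(self._counter), action, args))

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(*args)                   # an action may schedule new events

# usage: model a hypothetical user handover event
sched = EventScheduler()
sched.schedule(1.5, lambda uid: print(f"t={sched.now}: handover of user {uid}"), 42)
sched.run()
```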

  2. Exploration of a digital audio processing platform using a compositional system level performance estimation framework

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents the application of a compositional simulation based system-level performance estimation framework on a non-trivial industrial case study. The case study is provided by the Danish company Bang & Olufsen ICEpower a/s and focuses on the exploration of a digital mobile audio...... processing platform. A short overview of the compositional performance estimation framework used is given, followed by a presentation of how it is used for performance estimation using an iterative refinement process towards the final implementation. Finally, an evaluation in terms of accuracy and speed...

  3. System-level modeling and simulation of the cell culture microfluidic biochip ProCell

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2010-01-01

    Microfluidic biochips offer a promising alternative to a conventional biochemical laboratory. There are two technologies for the microfluidic biochips: droplet-based and flow-based. In this paper we are interested in flow-based microfluidic biochips, where the liquid flows continuously through pre-defined micro-channels using valves and pumps. We present an approach to the system-level modeling and simulation of a cell culture microfluidic biochip called ProCell, Programmable Cell Culture Chip. ProCell contains a cell culture chamber, which is envisioned to run 256 simultaneous experiments (viewed

  4. Abstract Radio Resource Management Framework for System Level Simulations in LTE-A Systems

    DEFF Research Database (Denmark)

    Fotiadis, Panagiotis; Viering, Ingo; Zanier, Paolo

    2014-01-01

    This paper provides a simple mathematical model of different packet scheduling policies in Long Term Evolution-Advanced (LTE-A) systems, by investigating the performance of Proportional Fair (PF) and the generalized cross-Component Carrier scheduler from a theoretical perspective. For that purpose......, an abstract Radio Resource Management (RRM) framework has been developed and tested for different ratios of users with Carrier Aggregation (CA) capabilities. The conducted system level simulations confirm that the proposed model can satisfactorily capture the main properties of the aforementioned scheduling...
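
    For reference, the classic PF rule schedules, in each transmission interval, the user with the highest ratio of instantaneous achievable rate to exponentially smoothed average throughput. A minimal sketch of that textbook formulation (not necessarily the exact abstraction used in this paper):

    def pf_select(inst_rates, avg_thr):
        """Proportional Fair: pick the user maximizing r_i / R_i."""
        return max(range(len(inst_rates)), key=lambda i: inst_rates[i] / avg_thr[i])

    def pf_update(avg_thr, served, inst_rates, tc=100.0):
        """Exponential moving-average throughput update with time constant tc."""
        for i in range(len(avg_thr)):
            r = inst_rates[i] if i == served else 0.0
            avg_thr[i] = (1.0 - 1.0 / tc) * avg_thr[i] + (1.0 / tc) * r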

  5. Design of power converter in DFIG wind turbine with enhanced system-level reliability

    DEFF Research Database (Denmark)

    Zhou, Dao; Zhang, Guanguan; Blaabjerg, Frede

    2017-01-01

    With the increasing penetration of wind power, reliable and cost-effective wind energy production is of increasing importance. As one of the promising configurations, the doubly-fed induction generator based partial-scale wind power converter is still dominant in existing wind farms...... margin. It can be seen that the B1 lifetimes of the grid-side converter and the rotor-side converter deviate considerably when the electrical stresses are considered, while they become more balanced by using an optimized reliable design. The system-level lifetime increases significantly with an appropriate design...

  6. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model system-on-chips (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of network-on-chip and...... SoC design. We show how a hand-held multimedia terminal, consisting of JPEG, MP3 and GSM applications, can be modeled as a multiprocessor SoC in our framework....

  7. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifyin

  8. System-level perturbations of cell metabolism using CRISPR/Cas9

    Energy Technology Data Exchange (ETDEWEB)

    Jakočiūnas, Tadas [Technical Univ. of Denmark, Lyngby (Denmark); Jensen, Michael K. [Technical Univ. of Denmark, Lyngby (Denmark); Keasling, Jay D. [Technical Univ. of Denmark, Lyngby (Denmark); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States)

    2017-03-30

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies much more advanced and cost-effective. For metabolic engineering purposes, the CRISPR-based tools have been applied to single and multiplex pathway modifications and transcriptional regulations. The effectiveness of these tools allows researchers to implement genome-wide perturbations, test model-guided genome editing strategies, and perform transcriptional reprogramming perturbations in a more advanced manner than previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering.

  9. System Level Power Optimization of Digital Audio Back End for Hearing Aids

    DEFF Research Database (Denmark)

    Pracny, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2017-01-01

    This work deals with power optimization of the audio processing back end for hearing aids - the interpolation filter (IF), the sigma-delta (SD) modulator and the Class D power amplifier (PA) - as a whole. Specifications are derived and insight into the tradeoffs involved is used to optimize...... the interpolation filter and the SD modulator on the system level so that the switching frequency of the Class D PA - the main power consumer in the back end - is minimized. A figure-of-merit (FOM) which allows judging the power consumption of the digital part of the back end early in the design process is used...

  10. Empirical LTE Smartphone Power Model with DRX Operation for System Level Simulations

    DEFF Research Database (Denmark)

    Lauridsen, Mads; Noël, Laurent; Mogensen, Preben

    2013-01-01

    An LTE smartphone power model is presented to enable academia and industry to evaluate users' battery life at the system level. The model is based on empirical measurements on a smartphone using a second generation LTE chipset, and the model includes functions of receive and transmit data rates...... and power levels. The first comprehensive Discontinuous Reception (DRX) power consumption measurements are reported together with cell bandwidth, screen and CPU power consumption. The transmit power level and to some extent the receive data rate constitute the overall power consumption, while DRX proves......
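
    The structure of such an empirical model can be sketched as a base power plus rate- and power-dependent terms. The coefficients below are illustrative placeholders only; the calibrated values belong to the paper's measurements.

    def phone_power_mw(rx_mbps=0.0, tx_dbm=0.0, drx_sleep=False):
        """Toy LTE handset power model in milliwatts (illustrative numbers)."""
        if drx_sleep:
            return 25.0                    # DRX sleep floor (assumed)
        p = 1200.0                         # RRC-connected baseline (assumed)
        p += 8.0 * rx_mbps                 # receive data rate term (assumed slope)
        p += 15.0 * max(0.0, tx_dbm)       # transmit power term (coarse, assumed)
        return p

    print(phone_power_mw(rx_mbps=50.0, tx_dbm=10.0))   # 1750.0 mW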

  11. An investigation into soft error detection efficiency at operating system level.

    Science.gov (United States)

    Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan

    2014-01-01

    Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation, which gives rise to permanent and transient errors in microelectronic components. The occurrence rate of transient errors is significantly higher than that of permanent errors. Transient errors, or soft errors, emerge in two forms: control flow errors (CFEs) and data errors. Valuable research results have already appeared in the literature at the hardware and software levels for their alleviation. However, these works rest on the basic assumption that the operating system is reliable, and they focus on other system levels. In this paper, we investigate the effects of soft errors on operating system components and compare their vulnerability with that of application-level components. Results show that soft errors in operating system components affect both operating system and application-level components. Therefore, by providing endurance to operating-system-level components against soft errors, both operating system and application-level components gain tolerance.
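
    One classic software countermeasure for CFEs is signature monitoring: each basic block carries a signature, and a run-time monitor checks every transition against the program's control-flow graph. A toy sketch of the idea (the block names and graph are made up for illustration):

    # Allowed control-flow-graph edges between basic-block signatures.
    ALLOWED = {("B0", "B1"), ("B1", "B2"), ("B1", "B3"), ("B2", "B3")}

    class CFEMonitor:
        """Flags transitions that do not correspond to a legal CFG edge."""
        def __init__(self, start="B0"):
            self.current = start

        def enter(self, block):
            if (self.current, block) not in ALLOWED:
                raise RuntimeError(f"control flow error: {self.current} -> {block}")
            self.current = block

    m = CFEMonitor()
    m.enter("B1")
    m.enter("B3")        # legal path B0 -> B1 -> B3
    # m.enter("B2")      # would raise: B3 -> B2 is not a CFG edge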

  12. Local and System Level Considerations for Plasma-Based Techniques in Hypersonic Flight

    Science.gov (United States)

    Suchomel, Charles; Gaitonde, Datta

    2007-01-01

    The harsh environment encountered in hypersonic flight, particularly when air-breathing propulsion devices are utilized, poses daunting challenges to the successful maturation of suitable technologies. This has spurred the quest for revolutionary solutions, particularly those exploiting the fact that air under these conditions can become electrically conducting, either naturally or through artificial enhancement. Optimized development of such concepts must emphasize not only the detailed physics by which the fluid interacts with the imposed electromagnetic fields, but must also simultaneously identify the system-level issues of integration and efficiency that provide the greatest leverage. This paper presents some recent advances at both levels. At the system level, an analysis is summarized that incorporates the interdependencies occurring between weight, power and flow field performance improvements. Cruise performance comparisons highlight how one drag reduction device interacts with the vehicle to improve range. Quantified parameter interactions allow specification of system requirements and energy consuming technologies that affect overall flight vehicle performance. Results based on the fundamental physics are presented by distilling numerous computational studies into a few guiding principles. These highlight the complex non-intuitive relationships between the various fluid and electromagnetic fields, together with thermodynamic considerations. Generally, energy extraction is an efficient process, while the reverse is accompanied by significant dissipative heating and inefficiency. Velocity distortions can be detrimental to plasma operation, but can be exploited to tailor flows through innovative electromagnetic configurations.

  13. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work is aimed at delivering a high simulation throughput and, at the same time, guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator can attain a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications show that the difference between the simulated and measured timing performance is within 10%, which in the past could only be attained by cycle-accurate models.

  14. Self-Driving Cars and Engineering Ethics: The Need for a System Level Analysis.

    Science.gov (United States)

    Borenstein, Jason; Herkert, Joseph R; Miller, Keith W

    2017-11-13

    The literature on self-driving cars and ethics continues to grow. Yet much of it focuses on ethical complexities emerging from an individual vehicle. That is an important but insufficient step towards determining how the technology will impact human lives and society more generally. What must complement ongoing discussions is a broader, system level of analysis that engages with the interactions and effects that these cars will have on one another and on the socio-technical systems in which they are embedded. To bring the conversation of self-driving cars to the system level, we make use of two traffic scenarios which highlight some of the complexities that designers, policymakers, and others should consider related to the technology. We then describe three approaches that could be used to address such complexities and their associated shortcomings. We conclude by bringing attention to the "Moral Responsibility for Computing Artifacts: The Rules", a framework that can provide insight into how to approach ethical issues related to self-driving cars.

  15. Value of information in sequential decision making: Component inspection, permanent monitoring and system-level scheduling

    International Nuclear Information System (INIS)

    Memarzadeh, Milad; Pozzi, Matteo

    2016-01-01

    We illustrate how to assess the Value of Information (VoI) in sequential decision making problems modeled by Partially Observable Markov Decision Processes (POMDPs). POMDPs provide a general framework for modeling the management of infrastructure components, including operation and maintenance, when only partial or noisy observations are available; VoI is a key concept for selecting explorative actions, with application to component inspection and monitoring. Furthermore, component-level VoI can serve as an effective heuristic for assigning priorities in system-level inspection scheduling. We introduce two alternative models for the availability of information, and derive the VoI in each of those settings: the Stochastic Allocation (SA) model assumes that observations are collected with a given probability, while the Fee-based Allocation (FA) model assumes that they are available at a given cost. After presenting these models at the component level, we investigate how they perform for system-level inspection scheduling.
    - Highlights:
    • On the Value of Information in POMDPs, for optimal exploration of systems.
    • A method for assessing the Value of Information of permanent monitoring.
    • A method for allocating inspections in systems made up of parallel POMDPs.
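
    The component-level computation can be made concrete with a toy two-state example: VoI is the drop in expected cost when the repair decision is allowed to react to one noisy inspection. All numbers and the simple cost structure below are illustrative assumptions, not the paper's model.

    def expected_cost(p_fail, action, c_repair=1.0, c_failure=10.0):
        return c_repair if action == "repair" else p_fail * c_failure

    def voi_single_inspection(p_fail=0.2, q=0.9):
        """VoI of one inspection with hit/correct-rejection accuracy q."""
        actions = ("repair", "ignore")
        # Best action under the prior, without observing.
        prior = min(expected_cost(p_fail, a) for a in actions)
        # Bayes update for each observation outcome, then act optimally.
        p_alarm = q * p_fail + (1 - q) * (1 - p_fail)
        post_alarm = q * p_fail / p_alarm
        post_clear = (1 - q) * p_fail / (1 - p_alarm)
        posterior = (p_alarm * min(expected_cost(post_alarm, a) for a in actions)
                     + (1 - p_alarm) * min(expected_cost(post_clear, a) for a in actions))
        return prior - posterior

    print(voi_single_inspection())   # ~0.54: worth inspecting if the fee is lower than this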

  16. An Investigation into Soft Error Detection Efficiency at Operating System Level

    Directory of Open Access Journals (Sweden)

    Seyyed Amir Asghari

    2014-01-01

    Full Text Available Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation, which gives rise to permanent and transient errors in microelectronic components. The occurrence rate of transient errors is significantly higher than that of permanent errors. Transient errors, or soft errors, emerge in two forms: control flow errors (CFEs) and data errors. Valuable research results have already appeared in the literature at the hardware and software levels for their alleviation. However, these works rest on the basic assumption that the operating system is reliable, and they focus on other system levels. In this paper, we investigate the effects of soft errors on operating system components and compare their vulnerability with that of application-level components. Results show that soft errors in operating system components affect both operating system and application-level components. Therefore, by providing endurance to operating-system-level components against soft errors, both operating system and application-level components gain tolerance.

  17. Virtual design and optimization studies for industrial silicon microphones applying tailored system-level modeling

    Science.gov (United States)

    Kuenzig, Thomas; Dehé, Alfons; Krumbein, Ulrich; Schrag, Gabriele

    2018-05-01

    Maxing out the technological limits in order to satisfy the customers' demands and obtain the best performance of micro-devices and -systems is a challenge for today's manufacturers. Dedicated system simulation is key to investigating the potential of device and system concepts in order to identify the best design w.r.t. the given requirements. We present a tailored, physics-based system-level modeling approach combining lumped with distributed models that provides detailed insight into the device and system operation at low computational expense. The resulting transparent, scalable (i.e. reusable) and modularly composed models explicitly contain the physical dependency on all relevant parameters, thus being well suited for dedicated investigation and optimization of MEMS devices and systems. This is demonstrated for an industrial capacitive silicon microphone. The performance of such microphones is determined by distributed effects like viscous damping and inhomogeneous capacitance variation across the membrane as well as by system-level phenomena like package-induced acoustic effects and the impact of the electronic circuitry for biasing and read-out. The model presented here covers all relevant figures of merit and thus enables evaluation of the optimization potential of silicon microphones towards high-fidelity applications. This work was carried out at the Technical University of Munich, Chair for Physics of Electrotechnology. Thomas Kuenzig is now with Infineon Technologies AG, Neubiberg.

  18. A System-level Infrastructure for Multi-dimensional MP-SoC Design Space Co-exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Bautista, T.; Nunez, A.; Pimentel, A.D.; Thompson, M.

    2013-01-01

    In this article, we present a flexible and extensible system-level MP-SoC design space exploration (DSE) infrastructure, called NASA. This highly modular framework uses well-defined interfaces to easily integrate different system-level simulation tools as well as different combinations of search

  19. Ring Theory

    CERN Document Server

    Jara, Pascual; Torrecillas, Blas

    1988-01-01

    The papers in this proceedings volume are selected research papers in different areas of ring theory, including graded rings, differential operator rings, K-theory of noetherian rings, torsion theory, regular rings, cohomology of algebras, local cohomology of noncommutative rings. The book will be important for mathematicians active in research in ring theory.

  20. Game theory

    DEFF Research Database (Denmark)

    Hendricks, Vincent F.

    Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....

  1. String theory

    International Nuclear Information System (INIS)

    Chan Hongmo.

    1987-10-01

    The paper traces the development of the String Theory, and was presented at Professor Sir Rudolf Peierls' 80th Birthday Symposium. The String theory is discussed with respect to the interaction of strings, the inclusion of both gauge theory and gravitation, inconsistencies in the theory, and the role of space-time. The physical principles underlying string theory are also outlined. (U.K.)

  2. Electrochemical reverse engineering: A systems-level tool to probe the redox-based molecular communication of biology.

    Science.gov (United States)

    Li, Jinyang; Liu, Yi; Kim, Eunkyoung; March, John C; Bentley, William E; Payne, Gregory F

    2017-04-01

    The intestine is the site of digestion and forms a critical interface between the host and the outside world. This interface is composed of host epithelium and a complex microbiota which is "connected" through an extensive web of chemical and biological interactions that determine the balance between health and disease for the host. This biology and the associated chemical dialogues occur within a context of a steep oxygen gradient that provides the driving force for a variety of reduction and oxidation (redox) reactions. While some redox couples (e.g., catecholics) can spontaneously exchange electrons, many others are kinetically "insulated" (e.g., biothiols) allowing the biology to set and control their redox states far from equilibrium. It is well known that within cells, such non-equilibrated redox couples are poised to transfer electrons to perform reactions essential to immune defense (e.g., transfer from NADH to O2 for reactive oxygen species, ROS, generation) and protection from such oxidative stresses (e.g., glutathione-based reduction of ROS). More recently, it has been recognized that some of these redox-active species (e.g., H2O2) cross membranes and diffuse into the extracellular environment including lumen to transmit redox information that is received by atomically-specific receptors (e.g., cysteine-based sulfur switches) that regulate biological functions. Thus, redox has emerged as an important modality in the chemical signaling that occurs in the intestine and there have been emerging efforts to develop the experimental tools needed to probe this modality. We suggest that electrochemistry provides a unique tool to experimentally probe redox interactions at a systems level. Importantly, electrochemistry offers the potential to enlist the extensive theories established in signal processing in an effort to "reverse engineer" the molecular communication occurring in this complex biological system. Here, we review our efforts to develop this

  3. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of quantum field theory: although it describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review [ru]

  4. Efficient Uplink Modeling for Dynamic System-Level Simulations of Cellular and Mobile Networks

    Directory of Open Access Journals (Sweden)

    Lobinger Andreas

    2010-01-01

    Full Text Available A novel theoretical framework for uplink simulations is proposed. It allows investigations which have to cover a very long (real-)time and which at the same time require a certain level of accuracy in terms of radio resource management, quality of service, and mobility. This is of particular importance for simulations of self-organizing networks. For this purpose, conventional system level simulators are not suitable due to slow simulation speeds far beyond real-time. Simpler, snapshot-based tools are lacking the aforementioned accuracy. The runtime improvements are achieved by deriving abstract theoretical models for the MAC layer behavior. The focus in this work is Long Term Evolution (LTE), and the most important uplink effects such as fluctuating interference, power control, power limitation, adaptive transmission bandwidth, and control channel limitations are considered. Limitations of the abstract models will be discussed as well. Exemplary results are given at the end to demonstrate the capability of the derived framework.

  5. Fused Silica Final Optics for Inertial Fusion Energy: Radiation Studies and System-Level Analysis

    International Nuclear Information System (INIS)

    Latkowski, Jeffery F.; Kubota, Alison; Caturla, Maria J.; Dixit, Sham N.; Speth, Joel A.; Payne, Stephen A.

    2003-01-01

    The survivability of the final optic, which must sit in the line of sight of high-energy neutrons and gamma rays, is a key issue for any laser-driven inertial fusion energy (IFE) concept. Previous work has concentrated on the use of reflective optics. Here, we introduce and analyze the use of a transmissive final optic for the IFE application. Our experimental work has been conducted at a range of doses and dose rates, including those comparable to the conditions at the IFE final optic. The experimental work, in conjunction with detailed analysis, suggests that a thin, fused silica Fresnel lens may be an attractive option when used at a wavelength of 351 nm. Our measurements and molecular dynamics simulations provide convincing evidence that the radiation damage, which leads to optical absorption, not only saturates but that a 'radiation annealing' effect is observed. A system-level description is provided, including Fresnel lens and phase plate designs.

  6. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  7. Unravelling evolutionary strategies of yeast for improving galactose utilization through integrated systems level analysis

    DEFF Research Database (Denmark)

    Hong, Kuk-Ki; Vongsangnak, Wanwipa; Vemuri, Goutham N

    2011-01-01

    Identification of the underlying molecular mechanisms for a derived phenotype by adaptive evolution is difficult. Here, we performed a systems-level inquiry into the metabolic changes occurring in the yeast Saccharomyces cerevisiae as a result of its adaptive evolution to increase its specific...... showed changes in ergosterol biosynthesis. Mutations were identified in proteins involved in the global carbon sensing Ras/PKA pathway, which is known to regulate the reserve carbohydrates metabolism. We evaluated one of the identified mutations, RAS2(Tyr112), and this mutation resulted in an increased...... design in bioengineering of improved strains and, that through systems biology, it is possible to identify mutations in evolved strain that can serve as unforeseen metabolic engineering targets for improving microbial strains for production of biofuels and chemicals....

  8. System-level Reliability Assessment of Power Stage in Fuel Cell Application

    DEFF Research Database (Denmark)

    Zhou, Dao; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    reliability. In a case study of a 5 kW fuel cell power stage, the parameter variations of the lifetime model prove that the exponential factor of the junction temperature fluctuation is the most sensitive parameter. Besides, if a 5-out-of-6 redundancy is used, it is concluded that both the B10 and the B1 system......Highly efficient and low-pollution fuel cell stacks are emerging as strong candidates for the power solution used in mobile base stations. In backup power applications, availability and reliability hold the highest priority. This paper considers the reliability metrics from...... the component level to the system level for the power stage used in a fuel cell application. It starts with an estimation of the annual accumulated damage for the key power electronic components according to the real mission profile of the fuel cell system. Then, considering the parameter variations in both...
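
    The redundancy arithmetic behind such a conclusion is the standard k-out-of-n formula. A short sketch with a placeholder module reliability (the paper's lifetime-derived figures are not reproduced here):

    from math import comb

    def k_out_of_n_reliability(k, n, r):
        """System survives if at least k of n identical modules survive."""
        return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

    # Illustrative module reliability of 0.95 over the mission time.
    print(k_out_of_n_reliability(5, 6, 0.95))   # ~0.967 for 5-out-of-6
    print(k_out_of_n_reliability(6, 6, 0.95))   # ~0.735 if all six must survive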

  9. System-Level Model for OFDM WiMAX Transceiver in Radiation Environment

    International Nuclear Information System (INIS)

    Abdel Alim, O.; Elboghdadly, N.; Ashour, M.M.; Elaskary, A.M.

    2008-01-01

    WiMAX (Worldwide Interoperability for Microwave Access), an evolving standard for point-to-multipoint wireless networking, provides "last mile" connections that can replace optical fiber networks without the need to add more infrastructure within crowded areas. Optical fiber technology is seriously considered for communication and monitoring applications in space and around nuclear reactors. Space and nuclear environments are characterized, in particular, by the presence of ionizing radiation fields. Therefore the influence of radiation on such networks needs to be investigated. This paper has the objective of building a system-level model for a WiMAX OFDM (Orthogonal Frequency Division Multiplexing) based transceiver. Irradiation noise is modeled as an external effect added to the additive white Gaussian noise (AWGN). The results are then analyzed and discussed based on a qualitative performance evaluation using BER calculations for the radiation environment.
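
    The qualitative effect of treating irradiation as extra Gaussian noise can be sketched directly: the added noise power degrades the effective Eb/N0 and thus raises the BER. The QPSK formula below is standard; the equal-power radiation assumption is purely illustrative.

    from math import erfc, sqrt

    def qpsk_ber(ebn0_db):
        """Theoretical QPSK bit error rate over an AWGN channel."""
        ebn0 = 10 ** (ebn0_db / 10)
        return 0.5 * erfc(sqrt(ebn0))

    def qpsk_ber_with_radiation(ebn0_db, radiation_to_awgn_ratio):
        """Irradiation modeled as extra Gaussian noise power added to N0."""
        ebn0 = 10 ** (ebn0_db / 10) / (1 + radiation_to_awgn_ratio)
        return 0.5 * erfc(sqrt(ebn0))

    print(qpsk_ber(8.0))                       # ~1.9e-4 without radiation
    print(qpsk_ber_with_radiation(8.0, 1.0))   # ~5.7e-3 with equal radiation power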

  10. Goal-directed behaviour and instrumental devaluation: a neural system-level computational model

    Directory of Open Access Journals (Sweden)

    Francesco Mannella

    2016-10-01

    Full Text Available Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviours guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or 'action-outcomes', e.g. foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) the three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and integrates the results of different devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behaviour.

  11. A system-level cost-of-energy wind farm layout optimization with landowner modeling

    International Nuclear Information System (INIS)

    Chen, Le; MacDonald, Erin

    2014-01-01

    Highlights:
    • We model the role of landowners in determining the success of wind projects.
    • A cost-of-energy (COE) model with realistic landowner remittances is developed.
    • These models are included in a system-level wind farm layout optimization.
    • Basic verification indicates the optimal COE is in line with real-world data.
    • Land plots crucial to a project's success can be identified with the approach.
    - Abstract: This work applies an enhanced levelized wind farm cost model, including landowner remittance fees, to determine optimal turbine placements under three landowner participation scenarios and two land-plot shapes. Instead of assuming that a continuous piece of land is available for wind farm construction, as in most layout optimizations, the problem formulation represents landowner participation scenarios as a binary string variable, along with the number of turbines. The cost parameters and model are a combination of models from the National Renewable Energy Laboratory (NREL), Lawrence Berkeley National Laboratory, and Windustry. The system-level cost-of-energy (COE) optimization model is also tested under two land-plot shapes: equally-sized square land plots and unequal rectangular land plots. The optimal COE results are compared to actual COE data and found to be realistic. The results show that landowner remittances account for approximately 10% of farm operating costs across all cases. Irregular land-plot shapes are easily handled by the model. We find that larger land plots do not necessarily receive higher remittance fees. The model can help site developers identify the most crucial land plots for project success and the optimal positions of turbines, with realistic estimates of costs and profitability.
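
    The levelized COE structure with remittances folded into operating cost can be sketched compactly. All dollar figures, the fixed charge rate, and the per-plot fee below are hypothetical placeholders, not values from the paper.

    def cost_of_energy(icc, aoe_base, aep_kwh, remit_per_plot, plots):
        """Levelized COE in $/kWh; plots is a binary string, '1' = participating landowner."""
        fcr = 0.10                                   # fixed charge rate (assumed)
        remittances = remit_per_plot * plots.count("1")
        return (fcr * icc + aoe_base + remittances) / aep_kwh

    # Hypothetical farm: $60M installed cost, $2M/yr other operations, 120 GWh/yr.
    print(cost_of_energy(60e6, 2.0e6, 120e6, 25_000, "110101"))   # ~0.0675 $/kWh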

  12. System-level hazard analysis using the sequence-tree method

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih Chunkuan; Yih Swu; Chen, M.-H.

    2008-01-01

    A system-level PHA using the sequence-tree method is presented to perform safety-related digital I and C system SSA. The conventional PHA involves brainstorming among experts on various portions of the system to identify hazards through discussions. However, since the conventional PHA is not a systematic technique, the analysis results depend strongly on the experts' subjective opinions, and the quality of the analysis cannot be appropriately controlled. Therefore, this study presents a system-level sequence-tree-based PHA, which can clarify the relationships among the major digital I and C systems. This sequence-tree-based technique has two major phases. The first phase adopts a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as the RPS. The second phase adopts a sequence tree to recognize the I and C systems involved in the event, the working of the safety-related systems, and how the backup systems can be activated to mitigate the consequences if the primary safety systems fail. The defense-in-depth echelons, namely the Control echelon, Reactor trip echelon, ESFAS echelon, and Monitoring and indicator echelon, are arranged to build the sequence-tree structure. All the related I and C systems, including the digital systems and the analog back-up systems, are allocated to their specific echelons. This system-centric sequence-tree analysis not only systematically identifies preliminary hazards, but also vulnerabilities in a nuclear power plant. Hence, an effective simplified D3 evaluation can also be conducted.

  13. Neural nets for the plausibility check of measured values in the integrated measurement and information system for the surveillance of environmental radioactivity (IMIS)

    International Nuclear Information System (INIS)

    Haase, G.

    2003-01-01

    Neural nets for the plausibility check of measured values in the "Integrated Measurement and Information System for the Surveillance of Environmental Radioactivity (IMIS)" is a research project supported by the Federal Minister for the Environment, Nature Conservation and Nuclear Safety. A goal of this project was the automatic recognition of implausible measured values in the ORACLE database, which contained measured values from the surveillance of environmental radioactivity in the most diverse environmental media. The implementation of this project [1] was realized by the Institute of Logic, Complexity and Deduction Systems of the University of Karlsruhe under the direction of Professor Dr. Menzel, Dr. Martin Riedmueller and Martin Lauer. (orig.)

  14. The Maximum Entropy Principle and the Modern Portfolio Theory

    Directory of Open Access Journals (Sweden)

    Ailton Cassetari

    2003-12-01

    Full Text Available In this work, a capital allocation methodology based on the Principle of Maximum Entropy was developed. Shannon's entropy is used as the measure, and its connections with the Modern Portfolio Theory are also discussed. In particular, the methodology is tested by making a systematic comparison to: (1) the mean-variance (Markowitz) approach and (2) the mean-VaR approach (capital allocations based on the Value at Risk concept). In principle, such comparisons show the plausibility and effectiveness of the developed method.
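
    The optimization itself is small: maximize the Shannon entropy of the weight vector subject to full investment and a target mean return. A sketch of the idea (not necessarily the paper's exact formulation):

    import numpy as np
    from scipy.optimize import minimize

    def maxent_weights(mu, target_return):
        """Weights maximizing Shannon entropy subject to a target mean return."""
        n = len(mu)
        neg_entropy = lambda w: np.sum(w * np.log(np.clip(w, 1e-12, None)))
        cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
                {"type": "eq", "fun": lambda w: w @ mu - target_return})
        res = minimize(neg_entropy, np.full(n, 1.0 / n),
                       bounds=[(0.0, 1.0)] * n, constraints=cons)
        return res.x

    print(maxent_weights(np.array([0.05, 0.08, 0.12]), 0.08))
    # Returns the most "spread out" portfolio consistent with the 8% target.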

  15. Aspects of affine Toda field theory

    International Nuclear Information System (INIS)

    Braden, H.W.; Corrigan, E.; Dorey, P.E.; Sasaki, R.

    1990-05-01

    The report is devoted to properties of the affine Toda field theory, the intention being to highlight a selection of curious properties that should be explicable in terms of the underlying group theory but for which in most cases there is no explanation. The motivation for exploring the ideas contained in this report came principally from the recent work of Zamolodchikov concerning the two dimensional Ising model at critical temperature perturbed by a magnetic field. Hollowood and Mansfield pointed out that since Toda field theory is conformal the perturbation considered by Zamolodchikov might well be best regarded as a perturbation of a Toda field theory. This work made it seem plausible that the theory sought by Zamolodchikov was actually affine E8 Toda field theory. However, this connection required an imaginary value of the coupling constant. Investigations here concerning exact S-matrices use a perturbative approach based on real coupling and the results differ in various ways from those thought to correspond to perturbed conformal field theory. A further motivation is to explore the connection between conformal and perturbed conformal field theories in other contexts using similar ideas. (N.K.)

  16. Supergravity theories

    International Nuclear Information System (INIS)

    Uehara, S.

    1985-01-01

    Of all supergravity theories, the maximal one, i.e., N = 8 in 4 dimensions or N = 1 in 11 dimensions, should achieve the unification since it has the highest degree of symmetry. As to the N = 1, d = 11 theory, it has been investigated how to compactify it to the d = 4 theories. From the phenomenological point of view, local SUSY GUTs, i.e., N = 1 SUSY GUTs with soft breaking terms, have been studied from various angles. The structures of extended supergravity theories are less understood than those of N = 1 supergravity theories, and matter couplings in N = 2 extended supergravity theories are under investigation. The harmonic superspace was recently proposed, which may be useful for investigating the quantum effects of extended supersymmetry and supergravity theories. As to the so-called Kaluza-Klein supergravity, there is another possibility. (Mori, K.)

  17. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, other subjects. 1977 edition.

  18. Cancellation of infrared and collinear singularities in relativistic thermal field theories. Pt. 2

    International Nuclear Information System (INIS)

    Le Bellac, M.; Reynaud, P.

    1992-01-01

    We study the infrared and collinear divergences of a renormalizable scalar field theory at finite temperature. We give the final results of an investigation undertaken in a previous work by showing the complete cancellation of all divergences at two-loop order in a physical process. This result makes the validity of the Kinoshita-Lee-Nauenberg theorem at finite temperature extremely plausible. (orig.)

  19. Beliefs in Context: Understanding Language Policy Implementation at a Systems Level

    Science.gov (United States)

    Hopkins, Megan

    2016-01-01

    Drawing on institutional theory, this study describes how cognitive, normative, and regulative mechanisms shape bilingual teachers' language policy implementation in both English-only and bilingual contexts. Aligned with prior educational language policy research, findings indicate the important role that teachers' beliefs play in the policy…

  20. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  1. Gauge theories

    International Nuclear Information System (INIS)

    Lee, B.W.

    1976-01-01

    Some introductory remarks to Yang-Mills fields are given and the problem of the Coulomb gauge is considered. The perturbation expansion for quantized gauge theories is discussed and a survey of renormalization schemes is made. The role of Ward-Takahashi identities in gauge theories is discussed. The author then discusses the renormalization of pure gauge theories and theories with spontaneously broken symmetry. (B.R.H.)

  2. A mathematical model of metabolism and regulation provides a systems-level view of how Escherichia coli responds to oxygen

    NARCIS (Netherlands)

    Ederer, M.; Steinsiek, S.; Stagge, S.; Rolfe, M.D.; ter Beek, A.; Knies, D.; Teixeira De Mattos, M.J.; Sauter, T.; Green, J.; Poole, R.K.; Bettenbrock, K.; Sawodny, O.

    2014-01-01

    The efficient redesign of bacteria for biotechnological purposes, such as biofuel production, waste disposal or specific biocatalytic functions, requires a quantitative systems-level understanding of energy supply, carbon, and redox metabolism. The measurement of transcript levels, metabolite

  3. Quantum theory as an emergent phenomenon the statistical mechanics of matrix models as the precursor of quantum field theory

    CERN Document Server

    Adler, Stephen L

    2004-01-01

    Quantum mechanics is our most successful physical theory. However, it raises conceptual issues that have perplexed physicists and philosophers of science for decades. This 2004 book develops an approach, based on the proposal that quantum theory is not a complete, final theory, but is in fact an emergent phenomenon arising from a deeper level of dynamics. The dynamics at this deeper level are taken to be an extension of classical dynamics to non-commuting matrix variables, with cyclic permutation inside a trace used as the basic calculational tool. With plausible assumptions, quantum theory is shown to emerge as the statistical thermodynamics of this underlying theory, with the canonical commutation/anticommutation relations derived from a generalized equipartition theorem. Brownian motion corrections to this thermodynamics are argued to lead to state vector reduction and to the probabilistic interpretation of quantum theory, making contact with phenomenological proposals for stochastic modifications to Schrödinger...

  4. Atomic theories

    CERN Document Server

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfeld extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  5. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  6. Number theory via Representation theory

    Indian Academy of Sciences (India)

    2014-11-09

    Number theory via Representation theory. Eknath Ghate. November 9, 2014. Eightieth Annual Meeting, Chennai, Indian Academy of Sciences. (This is a non-technical 20-minute talk intended for a general Academy audience.)

  7. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions

  8. System-level tools and reconfigurable computing for next-generation HWIL systems

    Science.gov (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct their final system. The paper will present the work in the area of integration of system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 x 1024 resolution at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of Tera (10^12) operations per second.
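
    A back-of-the-envelope check makes the data-rate claim concrete:

    pixels_per_frame = 1024 * 1024
    frames_per_second = 400
    print(pixels_per_frame * frames_per_second)   # ~4.2e8 pixel updates per second

    Sustaining roughly 420 million pixel updates per second, plus the per-pixel processing on top, is what motivates FPGA-based parallelism over a purely software pipeline.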

  9. A Systems-Level Analysis Reveals Circadian Regulation of Splicing in Colorectal Cancer.

    Science.gov (United States)

    El-Athman, Rukeia; Fuhr, Luise; Relógio, Angela

    2018-06-20

    Accumulating evidence points to a significant role of the circadian clock in the regulation of splicing in various organisms, including mammals. Both dysregulated circadian rhythms and aberrant pre-mRNA splicing are frequently implicated in human disease, in particular in cancer. To investigate the role of the circadian clock in the regulation of splicing in a cancer progression context at the systems-level, we conducted a genome-wide analysis and compared the rhythmic transcriptional profiles of colon carcinoma cell lines SW480 and SW620, derived from primary and metastatic sites of the same patient, respectively. We identified spliceosome components and splicing factors with cell-specific circadian expression patterns including SRSF1, HNRNPLL, ESRP1, and RBM8A, as well as altered alternative splicing events and circadian alternative splicing patterns of output genes (e.g., VEGFA, NCAM1, FGFR2, CD44) in our cellular model. Our data reveals a remarkable interplay between the circadian clock and pre-mRNA splicing with putative consequences in tumor progression and metastasis. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
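
    A common way to flag rhythmic transcripts in such analyses is a cosinor fit, i.e., least-squares regression onto a 24 h cosine. The sketch below illustrates the idea on synthetic data; it is not the authors' actual pipeline.

    import numpy as np

    def cosinor_fit(t_hours, expression, period=24.0):
        """Return (mesor, amplitude, peak phase in hours) of a cosine fit."""
        w = 2 * np.pi / period
        X = np.column_stack([np.ones_like(t_hours),
                             np.cos(w * t_hours), np.sin(w * t_hours)])
        beta, *_ = np.linalg.lstsq(X, expression, rcond=None)
        mesor, a, b = beta
        amplitude = np.hypot(a, b)
        phase_h = (np.arctan2(b, a) % (2 * np.pi)) / w
        return mesor, amplitude, phase_h

    t = np.arange(0, 48, 3.0)                        # sample every 3 h for two days
    y = 10 + 2 * np.cos(2 * np.pi / 24 * (t - 6))    # synthetic transcript peaking at 6 h
    print(cosinor_fit(t, y))                         # ~(10.0, 2.0, 6.0)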

  10. The next generation in optical transport semiconductors: IC solutions at the system level

    Science.gov (United States)

    Gomatam, Badri N.

    2005-02-01

    In this tutorial overview, we survey some of the challenging problems facing Optical Transport and their solutions using new semiconductor-based technologies. Advances in 0.13um CMOS, SiGe/HBT and InP/HBT IC process technologies and mixed-signal design strategies are the fundamental breakthroughs that have made these solutions possible. In combination with innovative packaging and transponder/transceiver architectures, IC approaches have clearly demonstrated enhanced optical link budgets with simultaneously lower (perhaps the lowest to date) cost and manufacturability tradeoffs. This paper will describe:
    * Electronic Dispersion Compensation, broadly viewed as the overcoming of dispersion-based limits to OC-192 links and extending link budgets,
    * Error Control/Coding, also known as Forward Error Correction (FEC),
    * Adaptive Receivers for signal quality monitoring, for real-time estimation of Q/OSNR, eye-pattern, signal BER and related temporal statistics (such as jitter).
    We will discuss the theoretical underpinnings of these receiver and transmitter architectures, provide examples of system performance and conclude with general market trends. These Physical layer IC solutions represent a fundamental new toolbox of options for equipment designers in addressing systems-level problems. With unmatched cost and yield/performance tradeoffs, it is expected that IC approaches will provide significant flexibility, in turn, for carriers and service providers who must ultimately manage the network and assure acceptable quality of service under stringent cost constraints.
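
    The Q-to-BER relation underlying such signal-quality monitors is compact enough to state directly; the formula below is the standard Gaussian-noise receiver model.

    from math import erfc, sqrt

    def ber_from_q(q_linear):
        """BER of a binary receiver with Gaussian noise and decision Q-factor."""
        return 0.5 * erfc(q_linear / sqrt(2))

    print(ber_from_q(6.0))   # ~1e-9, a common pre-FEC target at Q = 6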

  11. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    Science.gov (United States)

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Tinnitus: pathology of synaptic plasticity at the cellular and system levels

    Directory of Open Access Journals (Sweden)

    Matthieu J Guitton

    2012-03-01

    Full Text Available Despite being more and more common, and having a high impact on the quality of life of sufferers, tinnitus does not yet have a cure. This has been mostly the result of limited knowledge of the biological mechanisms underlying this adverse pathology. However, the last decade has witnessed tremendous progress in our understanding of the pathophysiology of tinnitus. Animal models have demonstrated that tinnitus is a pathology of neural plasticity, and has two main components: a molecular, peripheral component related to the initiation phase of tinnitus; and a system-level, central component related to the long-term maintenance of tinnitus. Using the most recent experimental data and the molecular/system dichotomy as a framework, we describe here the biological basis of tinnitus. We then discuss these mechanisms from an evolutionary perspective, highlighting similarities with memory. Finally, we consider how these discoveries can translate into therapies, and we suggest operative strategies to design new and effective combined therapeutic solutions using both pharmacological (local and systemic) and behavioral tools (e.g., using tele-medicine and virtual reality settings).

  13. System-Level Testing of the Advanced Stirling Radioisotope Generator Engineering Hardware

    Science.gov (United States)

    Chan, Jack; Wiser, Jack; Brown, Greg; Florin, Dominic; Oriti, Salvatore M.

    2014-01-01

    To support future NASA deep space missions, a radioisotope power system utilizing Stirling power conversion technology was under development. This development effort was performed under the joint sponsorship of the Department of Energy and NASA, until its termination at the end of 2013 due to budget constraints. The higher conversion efficiency of the Stirling cycle compared with that of the Radioisotope Thermoelectric Generators (RTGs) used in previous missions (Viking, Pioneer, Voyager, Galileo, Ulysses, Cassini, Pluto New Horizons and Mars Science Laboratory) offers the advantage of a four-fold reduction in Pu-238 fuel, thereby extending its limited domestic supply. As part of closeout activities, system-level testing of flight-like Advanced Stirling Convertors (ASCs) with a flight-like ASC Controller Unit (ACU) was performed in February 2014. This hardware is the most representative of the flight design tested to date. The test fully demonstrates the following ACU and system functionality: system startup; ASC control and operation at nominal and worst-case operating conditions; power rectification; DC output power management throughout nominal and out-of-range host voltage levels; ACU fault management, and system command / telemetry via MIL-STD 1553 bus. This testing shows the viability of such a system for future deep space missions and bolsters confidence in the maturity of the flight design.

  14. System Level Analysis of a Water PCM HX Integrated into Orion's Thermal Control System

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Seth, Rubik; Ungar, Eugene

    2015-01-01

    In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development an Orion system level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system in a 100 km Lunar orbit. The study verified the thermal model using a wax PCM and analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS), 2) the use of 30/70 PGW versus 50/50 PGW, and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that, for the assumed operating and boundary conditions, utilizing a water PCM HX on Orion is not a viable option for any case. Additionally, it was found that the radiator area would have to be increased by at least 40% in order to support a viable water-based PCM HX.

  15. System Level Analysis of a Water PCM HX Integrated Into Orion's Thermal Control System Abstract

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Ungar, Eugene; Sheth, Rubik

    2015-01-01

    In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development an Orion system level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system in a 100 km Lunar orbit. The study analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS), 2) the use of 30/70 PGW versus 50/50 PGW, and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that, for the assumed operating and boundary conditions, utilizing a water PCM HX on Orion is not a viable option. Additionally, it was found that the radiator area would have to be increased by over 20% in order to have a viable water-based PCM HX.
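
    To make concrete the kind of screening such a study performs before detailed Thermal Desktop modeling, here is a minimal first-order energy-balance sketch. Every number in it (orbit period, heat loads, radiator capacity) is a hypothetical placeholder, not a value from the Orion analysis.

```python
# Hedged sketch: first-order energy balance for a PCM HX "topper" in a
# cyclical orbital heat-load environment. All numbers are hypothetical.

ORBIT_PERIOD_S = 2.0 * 3600   # ~2 h low Lunar orbit (assumed)
HOT_FRACTION = 0.5            # fraction of the orbit at peak heat load (assumed)

Q_LOAD_HOT_W = 6000.0         # vehicle heat load on the hot side (hypothetical)
Q_LOAD_COLD_W = 4000.0        # heat load on the cold side (hypothetical)
Q_RADIATOR_W = 5000.0         # radiator rejection capacity (hypothetical)

# Excess heat the PCM must absorb during the hot part of the orbit:
t_hot = ORBIT_PERIOD_S * HOT_FRACTION
excess_J = max(Q_LOAD_HOT_W - Q_RADIATOR_W, 0.0) * t_hot

# Margin available to refreeze the PCM during the cold part:
t_cold = ORBIT_PERIOD_S * (1.0 - HOT_FRACTION)
refreeze_J = max(Q_RADIATOR_W - Q_LOAD_COLD_W, 0.0) * t_cold

# Water PCM sizing from the latent heat of fusion (~334 kJ/kg):
LATENT_HEAT_J_PER_KG = 334e3
pcm_mass_kg = excess_J / LATENT_HEAT_J_PER_KG

print(f"PCM must absorb {excess_J/1e6:.1f} MJ -> ~{pcm_mass_kg:.0f} kg water PCM")
print("PCM refreezes each orbit:", refreeze_J >= excess_J)
```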

  16. Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review

    Directory of Open Access Journals (Sweden)

    Jessie-Lee D. McIsaac

    2016-02-01

    Full Text Available Health promoting schools (HPS) is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research to support the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best, for whom, and in what circumstances, to create healthier schools and students.

  17. Calibration and Evaluation of Fixed and Mobile Relay-Based System Level Simulator

    Directory of Open Access Journals (Sweden)

    Shahid Mumtaz

    2010-01-01

    Full Text Available Future wireless communication systems are expected to provide more stable and higher data rate transmissions throughout OFDMA networks, but the mobile stations (MSs) at the cell boundary experience poor spectral efficiency due to the path loss from the transmitting antenna and interference from adjacent cells. Therefore, satisfying QoS (Quality of Service) requirements of each MS at the cell boundary has been an important issue. To resolve this spectral efficiency problem at the cell boundary, deploying relay stations has been actively considered. As multihop/relay has complex interactions between the routing and medium access control decisions, the extent to which analytical expressions can be used to explore its benefits is limited. Consequently, simulations tend to be the preferred way of assessing the performance of relays. In this paper, we evaluate the performance of relay-assisted OFDMA networks by means of a system-level simulator (SLS). We consistently observed that throughput is increased and outage is decreased in the relay-assisted OFDMA network, which translates into range extension without any capacity penalty, over the realistic range of propagation and other system parameter values investigated.
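
    To illustrate the kind of comparison a system-level simulator makes, the toy Monte-Carlo sketch below contrasts cell-edge capacity with and without a half-duplex relay. The geometry, transmit powers, and path-loss exponent are illustrative assumptions, not the simulator's calibrated parameters.

```python
# Hedged sketch: toy Monte-Carlo comparison of cell-edge capacity for direct
# vs. relay-assisted links. All values are illustrative assumptions.
import math
import random

PL_EXP = 3.5                    # path-loss exponent (assumed urban macro)
NOISE_MW = 1e-9                 # receiver noise power (assumed)
TX_BS_MW, TX_RS_MW = 1000.0, 100.0  # base-station / relay transmit power

def rx_power(tx_mw, d_m):
    return tx_mw * d_m ** (-PL_EXP)

def trial():
    d_bs = random.uniform(800, 1000)   # MS near the cell boundary
    d_rs = random.uniform(100, 300)    # MS-to-relay distance
    interference = rx_power(TX_BS_MW, random.uniform(1200, 2000))  # adjacent cell
    sinr_direct = rx_power(TX_BS_MW, d_bs) / (interference + NOISE_MW)
    sinr_relay = rx_power(TX_RS_MW, d_rs) / (interference + NOISE_MW)
    # Two-hop transmission halves the usable time; Shannon capacity as proxy.
    return math.log2(1 + sinr_direct), 0.5 * math.log2(1 + sinr_relay)

results = [trial() for _ in range(10000)]
print("mean direct :", sum(r[0] for r in results) / len(results), "bit/s/Hz")
print("mean relayed:", sum(r[1] for r in results) / len(results), "bit/s/Hz")
```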

  18. System Level Design of Reconfigurable Server Farms Using Elliptic Curve Cryptography Processor Engines

    Directory of Open Access Journals (Sweden)

    Sangook Moon

    2014-01-01

    Full Text Available As today’s hardware architecture becomes more and more complicated, it is getting harder to modify or improve the microarchitecture of a design at the register transfer level (RTL). Consequently, the traditional methods we have used to develop a design are not capable of coping with complex designs. In this paper, we suggest a way of designing complex digital logic circuits with a soft, advanced style of SystemVerilog at the electronic system level. We apply the concept of design-and-reuse with a high level of abstraction to implement elliptic curve crypto-processor server farms. With a level of abstraction superior to the RTL used in traditional HDL design, we successfully achieved the soft implementation of the crypto-processor server farms as well as robust test bench code with trivial effort in the same simulation environment. Otherwise, it could have required error-prone Verilog simulations for the hardware IPs and other time-consuming jobs such as C/SystemC verification for the software, sacrificing more time and effort. In the design of the elliptic curve cryptography processor engine, we propose a 3X faster GF(2^m) serial multiplication architecture.
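
    For readers unfamiliar with the arithmetic such a serial multiplier implements, the sketch below shows bit-serial multiplication in GF(2^m) in software. The choice of GF(2^163) with the NIST B-163 reduction polynomial is an assumption for illustration; the paper's exact field and datapath may differ.

```python
# Hedged sketch: MSB-first bit-serial multiplication in GF(2^m), the kind of
# operation a serial multiplier architecture performs one bit per clock.
# Field choice (GF(2^163), NIST B-163 polynomial) is an assumption.

M = 163
# x^163 + x^7 + x^6 + x^3 + 1 (NIST B-163 reduction polynomial)
R = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1

def gf2m_mul(a: int, b: int) -> int:
    """Multiply field elements a, b (bit i of the int = coefficient of x^i)."""
    acc = 0
    for i in reversed(range(M)):   # one bit of b per "clock", MSB first
        acc <<= 1                  # acc *= x
        if acc >> M:               # reduce modulo the field polynomial
            acc ^= R
        if (b >> i) & 1:           # conditionally accumulate a
            acc ^= a
    return acc

# Sanity check: x * x = x^2
assert gf2m_mul(0b10, 0b10) == 0b100
```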

  19. Female mating preferences determine system-level evolution in a gene network model.

    Science.gov (United States)

    Fierst, Janna L

    2013-06-01

    Environmental patterns of directional, stabilizing and fluctuating selection can influence the evolution of system-level properties like evolvability and mutational robustness. Intersexual selection produces strong phenotypic selection and these dynamics may also affect the response to mutation and the potential for future adaptation. In order to assess the influence of mating preferences on these evolutionary properties, I modeled a male trait and female preference determined by separate gene regulatory networks. I studied three sexual selection scenarios: sexual conflict, a Gaussian model of the Fisher process described in Lande (Proc Natl Acad Sci 78(6):3721-3725, 1981), and a good genes model in which the male trait signalled his mutational condition. I measured the effects these mating preferences had on the potential for traits and preferences to evolve towards new states, and mutational robustness of both the phenotype and the individual's overall viability. All types of sexual selection increased male phenotypic robustness relative to a randomly mating population. The Fisher model also reduced male evolvability and mutational robustness for viability. Under good genes sexual selection, males evolved an increased mutational robustness for viability. Females choosing their mates is a scenario that is sufficient to create selective forces that impact genetic evolution and shape the evolutionary response to mutation and environmental selection. These dynamics will inevitably develop in any population where sexual selection is operating, and affect the potential for future adaptation.
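
    As a minimal illustration of how mutational robustness can be quantified in a gene network model, the sketch below uses a generic Wagner-style sign-threshold network; this stands in for, and is not identical to, the study's trait and preference networks.

```python
# Hedged sketch: mutational robustness in a Wagner-style gene regulatory
# network (sign-threshold dynamics). A generic illustration, not the paper's
# exact model of traits and preferences.
import numpy as np

rng = np.random.default_rng(42)
N = 10
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.3)  # sparse regulation
s0 = np.sign(rng.normal(size=N))                          # initial expression

def phenotype(W, s, steps=50):
    for _ in range(steps):
        s = np.sign(W @ s)
    return s

base = phenotype(W, s0)

def robustness(W, trials=500):
    """Fraction of single-interaction mutations preserving the phenotype."""
    hits = 0
    for _ in range(trials):
        Wm = W.copy()
        i, j = rng.integers(N, size=2)
        Wm[i, j] = rng.normal()        # mutate one regulatory weight
        hits += np.array_equal(phenotype(Wm, s0), base)
    return hits / trials

print("mutational robustness:", robustness(W))
```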

  20. On-Site Renewable Energy and Green Buildings: A System-Level Analysis.

    Science.gov (United States)

    Al-Ghamdi, Sami G; Bilec, Melissa M

    2016-05-03

    Adopting a green building rating system (GBRS) that strongly considers use of renewable energy can have important environmental consequences, particularly in developing countries. In this paper, we studied on-site renewable energy and GBRSs at the system level to explore potential benefits and challenges. While we have focused on GBRSs, the findings can offer additional insight for renewable incentives across sectors. An energy model was built for 25 sites to compute the potential solar and wind power production on-site, given the building footprint and regional climate. A life-cycle approach and cost analysis were then completed to analyze the environmental and economic impacts. Environmental impacts of renewable energy varied dramatically between sites; in some cases the environmental benefits were limited despite the significant economic burden of the on-site renewable systems, and vice versa. Our recommendation for GBRSs, and broader policies and regulations, is to require buildings with higher environmental impacts to achieve higher levels of energy performance and on-site renewable energy utilization, instead of fixed percentages.

  1. Metabolic Compartmentation – A System Level Property of Muscle Cells

    Directory of Open Access Journals (Sweden)

    Theo Wallimann

    2008-05-01

    Full Text Available Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for living cells are discussed. It is shown that the formal theoretical analysis of diffusion of metabolites based on Fick’s equation and using fixed diffusion coefficients for diluted homogenous aqueous solutions, but applied to biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions, which are contradictory to most biological observations. However, if the same theoretical methods are used for analysis of actual experimental data, the apparent diffusion constants obtained are orders of magnitude lower than those in diluted aqueous solutions. Thus, it can be concluded that local restrictions of diffusion of metabolites in a cell are a system-level property caused by the complex structural organization of the cells, macromolecular crowding, cytoskeletal networks and organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, their channeling between enzymes, and modular organization of cellular metabolic networks. The perspectives of further studies of these complex intracellular interactions in the framework of Systems Biology are discussed.
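
    The Fick-based analysis the abstract refers to can be stated compactly; below, D_0 denotes the dilute-solution diffusion coefficient and D_app the apparent coefficient fitted to intracellular data.

```latex
% Fick's second law for a metabolite concentration c(x,t), with the apparent
% intracellular diffusion coefficient D_app replacing the dilute-solution D_0:
\[
  \frac{\partial c}{\partial t} = D_{\mathrm{app}} \nabla^{2} c,
  \qquad D_{\mathrm{app}} \ll D_{0}.
\]
% Intracellular hindrance is commonly expressed via the mean squared
% displacement over time t in n spatial dimensions:
\[
  \langle r^{2}(t) \rangle = 2 n \, D_{\mathrm{app}} \, t .
\]
```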

  2. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

    Full Text Available The complexity of today's embedded electronic systems, as well as their demanding performance and reliability requirements, is such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  3. Heightened systemic levels of neutrophil and eosinophil granular proteins in pulmonary tuberculosis and reversal following treatment.

    Science.gov (United States)

    Moideen, Kadar; Kumar, Nathella Pavan; Nair, Dina; Banurekha, Vaithilingam V; Bethunaickan, Ramalingam; Babu, Subash

    2018-04-09

    Granulocytes are activated during tuberculosis (TB) infection, act as immune effector cells, and their responses are implicated in TB pathogenesis. Plasma levels of neutrophil and eosinophil granular proteins provide an indirect measure of degranulation. In this study, we wanted to examine the levels of neutrophil and eosinophil granular proteins in individuals with pulmonary tuberculosis (PTB) and to compare them with the levels in latent TB (LTB) individuals. Hence, we measured the plasma levels of myeloperoxidase (MPO), neutrophil elastase, and proteinase-3; major basic protein (MBP), eosinophil derived neurotoxin (EDN), eosinophil cationic protein (ECP) and eosinophil peroxidase (EPX) in these individuals. Finally, we also measured the levels of all of these parameters in PTB individuals following anti-tuberculosis (ATT) treatment. Our data reveal that PTB individuals are characterized by significantly higher plasma levels of MPO, elastase, human proteinase 3 as well as MBP and EDN in comparison to LTB individuals. Our data also reveal that ATT resulted in reversal of all of these changes, indicating an association with TB disease. Finally, our data show that the systemic levels of MPO and proteinase-3 can significantly discriminate PTB from LTB individuals. Thus, our data suggest that neutrophil and eosinophil granular proteins could play a potential role in the innate immune response and therefore, the pathogenesis of pulmonary TB. Copyright © 2018 American Society for Microbiology.

  4. Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review

    Science.gov (United States)

    McIsaac, Jessie-Lee D.; Hernandez, Kimberley J.; Kirk, Sara F.L.; Curran, Janet A.

    2016-01-01

    Health promoting schools (HPS) is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research to support the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best, for whom, and in what circumstances, to create healthier schools and students. PMID:26861376

  5. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, Andrei V

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility of formulating the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)

  6. Dependence theory via game theory

    NARCIS (Netherlands)

    Grossi, D.; Turrini, P.

    2011-01-01

    In the multi-agent systems community, dependence theory and game theory are often presented as two alternative perspectives on the analysis of social interaction. Up till now no research has been done relating these two approaches. The unification presented provides dependence theory with the sort

  7. Infrared Constraint on Ultraviolet Theories

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Yuhsin [Cornell Univ., Ithaca, NY (United States)

    2012-08-01

    While our current paradigm of particle physics, the Standard Model (SM), has been extremely successful at explaining experiments, it is theoretically incomplete and must be embedded into a larger framework. In this thesis, we review the main motivations for theories beyond the SM (BSM) and the ways such theories can be constrained using low energy physics. The hierarchy problem, neutrino mass and the existence of dark matter (DM) are the main reasons why the SM is incomplete. Two of the most plausible theories that may solve the hierarchy problem are the Randall-Sundrum (RS) models and supersymmetry (SUSY). RS models usually suffer from strong flavor constraints, while SUSY models produce extra degrees of freedom that need to be hidden from current experiments. To show the importance of infrared (IR) physics constraints, we discuss the flavor bounds on the anarchic RS model in both the lepton and quark sectors. For SUSY models, we discuss the difficulties in obtaining a phenomenologically allowed gaugino mass, its relation to R-symmetry breaking, and how to build a model that avoids this problem. For the neutrino mass problem, we discuss the idea of generating small neutrino masses using compositeness. By requiring successful leptogenesis and the existence of warm dark matter (WDM), we can set various constraints on the hidden composite sector. Finally, to give an example of model independent bounds from collider experiments, we show how to constrain the DM–SM particle interactions using collider results with an effective coupling description.

  8. Anticipating and Communicating Plausible Environmental and Health Concerns Associated with Future Disasters: The ShakeOut and ARkStorm Scenarios as Examples

    Science.gov (United States)

    Plumlee, G. S.; Morman, S. A.; Alpers, C. N.; Hoefen, T. M.; Meeker, G. P.

    2010-12-01

    Disasters commonly pose immediate threats to human safety, but can also produce hazardous materials (HM) that pose short- and long-term environmental-health threats. The U.S. Geological Survey (USGS) has helped assess potential environmental health characteristics of HM produced by various natural and anthropogenic disasters, such as the 2001 World Trade Center collapse, 2005 hurricanes Katrina and Rita, 2007-2009 southern California wildfires, various volcanic eruptions, and others. Building upon experience gained from these responses, we are now developing methods to anticipate plausible environmental and health implications of the 2008 Great Southern California ShakeOut scenario (which modeled the impacts of a 7.8 magnitude earthquake on the southern San Andreas fault, http://urbanearth.gps.caltech.edu/scenario08/), and the recent ARkStorm scenario (modeling the impacts of a major, weeks-long winter storm hitting nearly all of California, http://urbanearth.gps.caltech.edu/winter-storm/). Environmental-health impacts of various past earthquakes and extreme storms are first used to identify plausible impacts that could be associated with the disaster scenarios. Substantial insights can then be gleaned using a Geographic Information Systems (GIS) approach to link ShakeOut and ARkStorm effects maps with data extracted from diverse database sources containing geologic, hazards, and environmental information. This type of analysis helps constrain where potential geogenic (natural) and anthropogenic sources of HM (and their likely types of contaminants or pathogens) fall within areas of predicted ShakeOut-related shaking, firestorms, and landslides, and predicted ARkStorm-related precipitation, flooding, and winds. Because of uncertainties in the event models and many uncertainties in the databases used (e.g., incorrect location information, lack of detailed information on specific facilities, etc.), this approach should only be considered as the first of multiple steps.
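
    At its core, the GIS-overlay step reduces to intersecting facility locations with modeled hazard footprints. The hedged sketch below illustrates this with the shapely library, using made-up coordinates and attributes rather than any real ShakeOut or ARkStorm layer; a real analysis would read hazard polygons and facility databases from GIS layers.

```python
# Hedged sketch: testing which hazardous-material facilities fall inside a
# modeled hazard footprint. Coordinates and attributes are hypothetical.
from shapely.geometry import Point, Polygon

# Hypothetical polygon of predicted strong shaking (lon, lat vertices):
shaking_zone = Polygon([(-118.5, 33.8), (-117.0, 33.8),
                        (-117.0, 34.5), (-118.5, 34.5)])

# Hypothetical facility records: (name, lon, lat, contaminant of concern)
facilities = [
    ("plating_shop_A", -118.2, 34.0, "chromium"),
    ("fuel_depot_B",   -116.5, 33.9, "petroleum"),
]

for name, lon, lat, contaminant in facilities:
    if shaking_zone.contains(Point(lon, lat)):
        print(f"{name}: potential {contaminant} release inside shaking zone")
```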

  9. Randomization and resilience of brain functional networks as systems-level endophenotypes of schizophrenia.

    Science.gov (United States)

    Lo, Chun-Yi Zac; Su, Tsung-Wei; Huang, Chu-Chung; Hung, Chia-Chun; Chen, Wei-Ling; Lan, Tsuo-Hung; Lin, Ching-Po; Bullmore, Edward T

    2015-07-21

    Schizophrenia is increasingly conceived as a disorder of brain network organization or dysconnectivity syndrome. Functional MRI (fMRI) networks in schizophrenia have been characterized by abnormally random topology. We tested the hypothesis that network randomization is an endophenotype of schizophrenia and therefore evident also in nonpsychotic relatives of patients. Head movement-corrected, resting-state fMRI data were acquired from 25 patients with schizophrenia, 25 first-degree relatives of patients, and 29 healthy volunteers. Graphs were used to model functional connectivity as a set of edges between regional nodes. We estimated the topological efficiency, clustering, degree distribution, resilience, and connection distance (in millimeters) of each functional network. The schizophrenic group demonstrated significant randomization of global network metrics (reduced clustering, greater efficiency), a shift in the degree distribution to a more homogeneous form (fewer hubs), a shift in the distance distribution (proportionally more long-distance edges), and greater resilience to targeted attack on network hubs. The networks of the relatives also demonstrated abnormal randomization and resilience compared with healthy volunteers, but they were typically less topologically abnormal than the patients' networks and did not have abnormal connection distances. We conclude that schizophrenia is associated with replicable and convergent evidence for functional network randomization, and a similar topological profile was evident also in nonpsychotic relatives, suggesting that this is a systems-level endophenotype or marker of familial risk. We speculate that the greater resilience of brain networks may confer some fitness advantages on nonpsychotic relatives that could explain persistence of this endophenotype in the population.
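
    The network metrics named here are straightforward to compute with standard graph tooling. The sketch below uses networkx on a toy small-world graph as a stand-in for a thresholded fMRI correlation network; it is an illustration of the metrics, not the study's pipeline.

```python
# Hedged sketch: clustering, global efficiency, and targeted-attack resilience
# on a toy graph standing in for a thresholded fMRI connectivity network.
import networkx as nx

G = nx.watts_strogatz_graph(n=90, k=6, p=0.1, seed=1)  # toy "brain" graph

print("clustering :", nx.average_clustering(G))
print("efficiency :", nx.global_efficiency(G))

# Resilience to targeted attack: remove hubs in decreasing-degree order and
# track the size of the largest connected component.
H = G.copy()
for node, _deg in sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:10]:
    H.remove_node(node)
giant = max(nx.connected_components(H), key=len)
print("largest component after attacking 10 hubs:",
      len(giant), "of the 80 remaining nodes")
```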

  10. Unravelling evolutionary strategies of yeast for improving galactose utilization through integrated systems level analysis.

    Science.gov (United States)

    Hong, Kuk-Ki; Vongsangnak, Wanwipa; Vemuri, Goutham N; Nielsen, Jens

    2011-07-19

    Identification of the underlying molecular mechanisms for a derived phenotype by adaptive evolution is difficult. Here, we performed a systems-level inquiry into the metabolic changes occurring in the yeast Saccharomyces cerevisiae as a result of its adaptive evolution to increase its specific growth rate on galactose and related these changes to the acquired phenotypic properties. Three evolved mutants (62A, 62B, and 62C) with higher specific growth rates and faster specific galactose uptake were isolated. The evolved mutants were compared with a reference strain and two engineered strains, SO16 and PGM2, which also showed higher galactose uptake rate in previous studies. The profile of intermediates in galactose metabolism was similar in evolved and engineered mutants, whereas reserve carbohydrates metabolism was specifically elevated in the evolved mutants and one evolved strain showed changes in ergosterol biosynthesis. Mutations were identified in proteins involved in the global carbon sensing Ras/PKA pathway, which is known to regulate the reserve carbohydrates metabolism. We evaluated one of the identified mutations, RAS2(Tyr112), and this mutation resulted in an increased specific growth rate on galactose. These results show that adaptive evolution results in the utilization of unpredicted routes to accommodate increased galactose flux, in contrast to rationally engineered strains. Our study demonstrates that adaptive evolution represents a valuable alternative to rational design in the bioengineering of improved strains and that, through systems biology, it is possible to identify mutations in evolved strains that can serve as unforeseen metabolic engineering targets for improving microbial strains for the production of biofuels and chemicals.

  11. Systems-level thinking for nanoparticle-mediated therapeutic delivery to neurological diseases.

    Science.gov (United States)

    Curtis, Chad; Zhang, Mengying; Liao, Rick; Wood, Thomas; Nance, Elizabeth

    2017-03-01

    Neurological diseases account for 13% of the global burden of disease. As a result, treating these diseases costs $750 billion a year. Nanotechnology, which consists of small (~1-100 nm) but highly tailorable platforms, can provide significant opportunities for improving therapeutic delivery to the brain. Nanoparticles can increase drug solubility, overcome the blood-brain and brain penetration barriers, and provide timed release of a drug at a site of interest. Many researchers have successfully used nanotechnology to overcome individual barriers to therapeutic delivery to the brain, yet no platform has translated into a standard of care for any neurological disease. The challenge in translating nanotechnology platforms into clinical use for patients with neurological disease necessitates a new approach to: (1) collect information from the fields associated with understanding and treating brain diseases and (2) apply that information using scalable technologies in a clinically-relevant way. This approach requires systems-level thinking to integrate an understanding of biological barriers to therapeutic intervention in the brain with the engineering of nanoparticle material properties to overcome those barriers. To demonstrate how a systems perspective can tackle the challenge of treating neurological diseases using nanotechnology, this review will first present physiological barriers to drug delivery in the brain and common neurological disease hallmarks that influence these barriers. We will then analyze the design of nanotechnology platforms in preclinical in vivo efficacy studies for treatment of neurological disease, and map concepts for the interaction of nanoparticle physicochemical properties and pathophysiological hallmarks in the brain. WIREs Nanomed Nanobiotechnol 2017, 9:e1422. doi: 10.1002/wnan.1422 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  12. Highlighting the Need for Systems-level Experimental Characterization of Plant Metabolic Enzymes

    Directory of Open Access Journals (Sweden)

    Martin Karl Magnus Engqvist

    2016-07-01

    Full Text Available The biology of living organisms is determined by the action and interaction of a large number of individual gene products, each with specific functions. Discovering and annotating the function of gene products is key to our understanding of these organisms. Controlled experiments and bioinformatic predictions both contribute to functional gene annotation. For most species it is difficult to gain an overview of what portion of gene annotations are based on experiments and what portion represent predictions. Here, I survey the current state of experimental knowledge of enzymes and metabolism in Arabidopsis thaliana as well as eleven economically important crops and forestry trees – with a particular focus on reactions involving organic acids in central metabolism. I illustrate the limited availability of experimental data for functional annotation of enzymes in most of these species. Many enzymes involved in metabolism of citrate, malate, fumarate, lactate, and glycolate in crops and forestry trees have not been characterized. Furthermore, enzymes involved in key biosynthetic pathways which shape important traits in crops and forestry trees have not been characterized. I argue for the development of novel high-throughput platforms with which limited functional characterization of gene products can be performed quickly and relatively cheaply. I refer to this approach as systems-level experimental characterization. The data collected from such platforms would form a layer intermediate between bioinformatic gene function predictions and in-depth experimental studies of these functions. Such a data layer would greatly aid in the pursuit of understanding a multiplicity of biological processes in living organisms.

  13. Virtual Systems Pharmacology (ViSP) software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey eErmakov

    2014-10-01

    Full Text Available Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs, and so on. Therefore it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user’s particular needs, and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
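
    The core pattern described here, a model packaged as a self-contained executable with every parameter exposed as an input, can be illustrated in a few lines. The one-compartment model and the command-line flags below are hypothetical illustrations, not ViSP's actual model or interface.

```python
# Hedged sketch: a "self-contained model executable" whose parameters are all
# exposed as inputs. The PK model and flag names are hypothetical.
import argparse

def simulate(dose_mg: float, ke_per_h: float, vd_l: float, hours: int):
    """Euler integration of one-compartment elimination: dA/dt = -ke * A."""
    amount, dt, series = dose_mg, 0.1, []
    for step in range(int(hours / dt)):
        amount -= ke_per_h * amount * dt
        series.append((step * dt, amount / vd_l))  # (time, concentration)
    return series

if __name__ == "__main__":
    p = argparse.ArgumentParser(description="Self-contained model executable")
    p.add_argument("--dose-mg", type=float, default=500.0)
    p.add_argument("--ke-per-h", type=float, default=0.3)
    p.add_argument("--vd-l", type=float, default=42.0)
    p.add_argument("--hours", type=int, default=24)
    args = p.parse_args()
    for t, c in simulate(args.dose_mg, args.ke_per_h, args.vd_l, args.hours)[::40]:
        print(f"t={t:5.1f} h  C={c:7.3f} mg/L")
```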

  14. System-Level Power Consumption Analysis of the Wearable Asthmatic Wheeze Quantification

    Directory of Open Access Journals (Sweden)

    Dinko Oletic

    2018-01-01

    Full Text Available Long-term quantification of asthmatic wheezing envisions an m-Health sensor system consisting of a smartphone and a body-worn wireless acoustic sensor. As both devices are power constrained, the main criterion guiding the system design comes down to minimization of power consumption, while retaining sufficient respiratory sound classification accuracy (i.e., wheeze detection). Crucial for assessment of the system-level power consumption is an understanding of the trade-off between the power cost of computationally intensive local processing and that of communication. Therefore, we analyze the power requirements of signal acquisition, processing, and communication in three typical operating scenarios: (1) streaming of the uncompressed respiratory signal to a smartphone for classification, (2) signal streaming utilizing compressive sensing (CS) for reduction of the data rate, and (3) respiratory sound classification onboard the wearable sensor. The study shows that the third scenario, featuring the lowest communication cost, enables the lowest total sensor system power consumption, ranging from 328 to 428 μW. In such a scenario, 32-bit ARM Cortex M3/M4 cores typically embedded within Bluetooth 4 SoC modules feature the optimal trade-off between onboard classification performance and consumption. On the other hand, the study confirms that CS enables the most power-efficient design of the wearable sensor (216 to 357 μW) in the compressed signal streaming of the second scenario. In such a case, a single low-power ARM Cortex-A53 core is sufficient for simultaneous real-time CS reconstruction and classification on the smartphone, while keeping the total system power within the budget for uncompressed streaming.
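
    The compute-versus-communication trade-off driving the three scenarios can be sketched as a simple sensor-side power budget. The per-block figures below are assumed placeholders of plausible magnitude, not the paper's measurements; they merely show how shifting work from the radio to the processor (or vice versa) moves the total.

```python
# Hedged sketch: wearable-sensor power budget across the three scenarios.
# All per-block figures are hypothetical placeholders (microwatts).

ACQ_UW = 100.0  # signal acquisition front-end (assumed)

def sensor_total_uw(processing_uw: float, radio_uw: float) -> float:
    return ACQ_UW + processing_uw + radio_uw

scenarios = {
    "1: uncompressed streaming":  sensor_total_uw(20.0,  500.0),
    "2: compressive sensing":     sensor_total_uw(60.0,  150.0),
    "3: on-board classification": sensor_total_uw(250.0, 20.0),
}
for name, p in scenarios.items():
    print(f"{name}: {p:.0f} uW")
```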

  15. Multiple fMRI system-level baseline connectivity is disrupted in patients with consciousness alterations.

    Science.gov (United States)

    Demertzi, Athena; Gómez, Francisco; Crone, Julia Sophia; Vanhaudenhuyse, Audrey; Tshibanda, Luaba; Noirhomme, Quentin; Thonnard, Marie; Charland-Verville, Vanessa; Kirsch, Murielle; Laureys, Steven; Soddu, Andrea

    2014-03-01

    In healthy conditions, group-level fMRI resting state analyses identify ten resting state networks (RSNs) of cognitive relevance. Here, we aim to assess the ten-network model in severely brain-injured patients suffering from disorders of consciousness and to identify those networks which will be most relevant to discriminate between patients and healthy subjects. 300 fMRI volumes were obtained in 27 healthy controls and 53 patients in minimally conscious state (MCS), vegetative state/unresponsive wakefulness syndrome (VS/UWS) and coma. Independent component analysis (ICA) reduced data dimensionality. The ten networks were identified by means of a multiple template-matching procedure and were tested for neuronal properties (neuronal vs non-neuronal) in a data-driven way. Univariate analyses detected between-group differences in networks' neuronal properties and estimated voxel-wise functional connectivity in the networks, which were significantly less identifiable in patients. A nearest-neighbor "clinical" classifier was used to determine the networks with high between-group discriminative accuracy. Healthy controls were characterized by more neuronal components compared to patients in VS/UWS and in coma. Compared to healthy controls, fewer patients in MCS and VS/UWS showed components of neuronal origin for the left executive control network, default mode network (DMN), auditory, and right executive control network. The "clinical" classifier indicated the DMN and auditory network with the highest accuracy (85.3%) in discriminating patients from healthy subjects. FMRI multiple-network resting state connectivity is disrupted in severely brain-injured patients suffering from disorders of consciousness. When performing ICA, multiple-network testing and control for neuronal properties of the identified RSNs can advance fMRI system-level characterization. Automatic data-driven patient classification is the first step towards future single-subject objective diagnostics.

  16. Advancements toward a Systems Level Understanding of the Human Oral Microbiome

    Directory of Open Access Journals (Sweden)

    Jeffrey Scott Mclean

    2014-07-01

    Full Text Available Oral microbes represent one of the most well-studied microbial communities owing to the fact that they are a fundamental part of human development influencing health and disease, an easily accessible human microbiome, a highly structured and remarkably resilient biofilm, as well as a model of bacteria-bacteria and bacteria-host interactions. In the last eighty years since oral plaque was first characterized for its functionally stable physiological properties, such as the highly repeatable rapid pH decrease upon carbohydrate addition and subsequent recovery phase, the fundamental approaches to study the oral microbiome have cycled back and forth between community-level investigations and characterizing individual model isolates. Since that time, many individual species have been well characterized and the development of the early plaque community, which involves many cell–cell binding interactions, has been carefully described. With high throughput sequencing enabling the enormous diversity of the oral cavity to be realized, a number of new challenges to progress were revealed. The large number of uncultivated oral species, the high interpersonal variability of taxonomic carriage and the possibility of multiple pathways to dysbiosis pose major hurdles to obtaining a systems-level understanding from the community to the gene level. It is now possible, however, to start connecting the insights gained from single species with community-wide approaches. This review will discuss some of the recent insights into the oral microbiome at a fundamental level, existing knowledge gaps, as well as challenges that have surfaced and the approaches to address them.

  17. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, and Skolem functions.

  18. Viability Theory

    CERN Document Server

    Aubin, Jean-Pierre; Saint-Pierre, Patrick

    2011-01-01

    Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explai

  19. Galois Theory

    CERN Document Server

    Cox, David A

    2012-01-01

    Praise for the First Edition ". . .will certainly fascinate anyone interested in abstract algebra: a remarkable book!"—Monatshefte fur Mathematik Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel’s theory of Abelian equations, casus irreducibilis, and the Galois theory of origami.

  20. Game theory.

    Science.gov (United States)

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.

  1. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2009-01-01

    This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains physical backgrounds with illustrations and provides descriptions of detailed derivation processes.

  2. Causal quantum theory and the collapse locality loophole

    International Nuclear Information System (INIS)

    Kent, Adrian

    2005-01-01

    Causal quantum theory is an umbrella term for ordinary quantum theory modified by two hypotheses: state vector reduction is a well-defined process, and strict local causality applies. The first of these holds in some versions of Copenhagen quantum theory and need not necessarily imply practically testable deviations from ordinary quantum theory. The second implies that measurement events which are spacelike separated have no nonlocal correlations. To test this prediction, which sharply differs from standard quantum theory, requires a precise definition of state vector reduction. Formally speaking, any precise version of causal quantum theory defines a local hidden variable theory. However, causal quantum theory is most naturally seen as a variant of standard quantum theory. For that reason it seems a more serious rival to standard quantum theory than local hidden variable models relying on the locality or detector efficiency loopholes. Some plausible versions of causal quantum theory are not refuted by any Bell experiments to date, nor is it evident that they are inconsistent with other experiments. They evade refutation via a neglected loophole in Bell experiments--the collapse locality loophole--which exists because of the possible time lag between a particle entering a measurement device and a collapse taking place. Fairly definitive tests of causal versus standard quantum theory could be made by observing entangled particles separated by ≅0.1 light seconds

  3. Integration Strategy for Free-form Lithium Ion Battery: Material, Design to System level Applications

    KAUST Repository

    Kutbee, Arwa T.

    2017-10-31

    Power supply in any electronic system is a crucial necessity. Especially so in fully compliant personalized advanced healthcare electronic self-powered systems where we envision seamless integration of sensors and actuators with data management components in a single freeform platform to augment the quality of our healthcare, smart living and sustainable future. However, the status-quo energy storage (battery) options require packaging to protect the indwelling toxic materials against the harsh physiological environment and vice versa, compromising mechanical flexibility, conformability and wearability at the highest electrochemical performance. Therefore, clean and safe energy storage solutions for wearable and implantable electronics are needed to replace the commercially used unsafe lithium-ion batteries. This dissertation discusses a highly manufacturable integration strategy for a free-form lithium-ion battery towards a genuine mechanically compliant wearable system. We sequentially start with the optimization process for the preparation of the all solid-state material comprising a "lithium-free" lithium-ion microbattery, with a focus on thin film texture optimization of the cathode material. State of the art complementary metal oxide semiconductor technology was used for the thin film based battery. Additionally, this thesis reports successful development of a transfer-less scheme for a flexible battery with small footprint and free form factor in a high yield production process. The reliable process for the flexible lithium-ion battery achieves an enhanced energy density by three orders of magnitude compared to the available rigid ones. Interconnection and bonding procedures of the developed batteries are discussed for a reliable back end of line process for flexible, stretchable and stackable modules. Special attention is paid to the advanced bonding, handling and packaging strategies of flexible batteries towards system-level applications. Finally, this

  4. Striatal response to reward anticipation: evidence for a systems-level intermediate phenotype for schizophrenia.

    Science.gov (United States)

    Grimm, Oliver; Heinz, Andreas; Walter, Henrik; Kirsch, Peter; Erk, Susanne; Haddad, Leila; Plichta, Michael M; Romanczuk-Seiferth, Nina; Pöhland, Lydia; Mohnke, Sebastian; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Schäfer, Axel; Cichon, Sven; Nöthen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2014-05-01

    Attenuated ventral striatal response during reward anticipation is a core feature of schizophrenia that is seen in prodromal, drug-naive, and chronic schizophrenic patients. Schizophrenia is highly heritable, raising the possibility that this phenotype is related to the genetic risk for the disorder. To examine a large sample of healthy first-degree relatives of schizophrenic patients and compare their neural responses to reward anticipation with those of carefully matched controls without a family psychiatric history. To further support the utility of this phenotype, we studied its test-retest reliability, its potential brain structural contributions, and the effects of a protective missense variant in neuregulin 1 (NRG1) linked to schizophrenia by meta-analysis (i.e., rs10503929). Examination of a well-established monetary reward anticipation paradigm during functional magnetic resonance imaging at a university hospital; voxel-based morphometry; test-retest reliability analysis of striatal activations in an independent sample of 25 healthy participants scanned twice with the same task; and imaging genetics analysis of the control group. A total of 54 healthy first-degree relatives of schizophrenic patients and 80 controls matched for demographic, psychological, clinical, and task performance characteristics were studied. Blood oxygen level-dependent response during reward anticipation, analysis of intraclass correlations of functional contrasts, and associations between striatal gray matter volume and NRG1 genotype. Compared with controls, healthy first-degree relatives showed a highly significant decrease in ventral striatal activation during reward anticipation (familywise error corrected). This systems-level functional phenotype is reliable (with intraclass correlation coefficients of 0.59-0.73), independent of local gray matter volume (with no corresponding group differences, no correlation to function, and all uncorrected P values >.05), and affected by NRG1 genotype.

  5. Systems-level organization of non-alcoholic fatty liver disease progression network

    Directory of Open Access Journals (Sweden)

    K. Shubham

    2017-10-01

    the coordination of metabolism and inflammation in NAFLD patients. We found that genes of arachidonic acid, sphingolipid and glycosphingolipid metabolism were upregulated and co-expressed with genes of proinflammatory signaling pathways and hypoxia in NASH/NASH with fibrosis. These metabolic alterations might play a role in sustaining VAT inflammation. Further, the inflammation-related genes were also co-expressed with genes involved in ECM degradation. We interlink these cellular processes to obtain a systems-level understanding of NAFLD.

  6. Perturbation theory

    International Nuclear Information System (INIS)

    Bartlett, R.; Kirtman, B.; Davidson, E.R.

    1978-01-01

    After noting some advantages of using perturbation theory, some of the various types are related on a chart and described, including many-body nonlinear summations, quartic force-field fits for geometry, fourth-order correlation approximations, and a survey of some recent work. Alternative initial approximations in perturbation theory are also discussed. 25 references

  7. Need theory

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2014-01-01

    Need theory of happiness is linked to affect theory, which holds that happiness is a reflection of how well we feel generally. In this view, we do not "calculate" happiness but rather "infer" it, the typical heuristic being "I feel good most of the time, hence I must be happy".

  8. Diffraction theory

    NARCIS (Netherlands)

    Bouwkamp, C.J.

    1954-01-01

    A critical review is presented of recent progress in classical diffraction theory. Both scalar and electromagnetic problems are discussed. The report may serve as an introduction to general diffraction theory although the main emphasis is on diffraction by plane obstacles. Various modifications of

  9. Potential Theory

    CERN Document Server

    Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří

    1988-01-01

    Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24, July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag). Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...

  10. Conspiracy Theory

    DEFF Research Database (Denmark)

    Bjerg, Ole; Presskorn-Thygesen, Thomas

    2017-01-01

    The paper is a contribution to current debates about conspiracy theories within philosophy and cultural studies. Wittgenstein’s understanding of language is invoked to analyse the epistemological effects of designating particular questions and explanations as a ‘conspiracy theory’. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben’s concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned to alleged conspiracy theories within our prevalent paradigms of knowledge and truth is compared to the exceptional legal status assigned to individuals accused of terrorism under the War on Terror. The paper concludes by discussing the relation between conspiracy theory and ‘the paranoid style’.

  11. Field theory

    CERN Multimedia

    1999-11-08

    In these lectures I will build up the concept of field theory using the language of Feynman diagrams. As a starting point, field theory in zero spacetime dimensions is used as a vehicle to develop all the necessary techniques: path integral, Feynman diagrams, Schwinger-Dyson equations, asymptotic series, effective action, renormalization etc. The theory is then extended to more dimensions, with emphasis on the combinatorial aspects of the diagrams rather than their particular mathematical structure. The concept of unitarity is used to, finally, arrive at the various Feynman rules in an actual, four-dimensional theory. The concept of gauge-invariance is developed, and the structure of a non-abelian gauge theory is discussed, again on the level of Feynman diagrams and Feynman rules.
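
    The zero-dimensional starting point mentioned above collapses the path integral to a single ordinary integral. A minimal statement, assuming the standard quartic toy interaction used in such treatments:

```latex
% Zero-dimensional "field theory": the path integral becomes an ordinary
% integral whose expansion in g generates Feynman diagrams.
\[
  Z(g) \;=\; \int_{-\infty}^{\infty} \frac{d\varphi}{\sqrt{2\pi}}\,
  \exp\!\Big( -\tfrac{1}{2}\varphi^{2} - \tfrac{g}{4!}\,\varphi^{4} \Big)
  \;=\; \sum_{n \ge 0} \frac{1}{n!}\Big(-\frac{g}{4!}\Big)^{\!n}
        \big\langle \varphi^{4n} \big\rangle_{0},
\]
% with Gaussian moments counted by Wick pairings,
% <phi^{2k}>_0 = (2k-1)!!, yielding the asymptotic series the summary mentions.
```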

  12. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  13. Concept theory

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2009-01-01

    Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism).

  14. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation, which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. Monte-Carlo simulations obtained by parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms. Copyright © 2013 Elsevier Ltd. All rights reserved.
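
    A minimal sketch of one update step of a noisy majority-vote cellular automaton, the broad family of models neuropercolation builds on, is given below. The specific rule and noise parameter are generic illustrations, not the model's exact definitions.

```python
# Hedged sketch: a probabilistic (random) cellular automaton update step.
# Majority vote over the 4-neighborhood plus self, flipped with probability
# eps -- the noise that drives phase-transition behavior in such models.
import numpy as np

rng = np.random.default_rng(0)

def step(grid: np.ndarray, eps: float = 0.05) -> np.ndarray:
    # Count active cells among the four von Neumann neighbors (toroidal grid).
    n = sum(np.roll(grid, s, axis=a) for s in (-1, 1) for a in (0, 1))
    majority = (n + grid) * 2 > 5              # >2.5 of 5 cells active
    flip = rng.random(grid.shape) < eps        # random flips (noise)
    return np.where(flip, ~majority, majority).astype(np.uint8)

grid = rng.integers(0, 2, size=(128, 128), dtype=np.uint8)
for _ in range(100):
    grid = step(grid)
print("active fraction after 100 steps:", grid.mean())
```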

  15. ∑∆ Modulator System-Level Considerations for Hearing-Aid Audio Class-D Output Stage Application

    DEFF Research Database (Denmark)

    Pracný, Peter; Bruun, Erik

    2012-01-01

    This paper deals with a system-level design of a digital sigma-delta (∑∆) modulator for hearing-aid audio Class D output stage application. The aim of this paper is to provide a thorough discussion on various possibilities and tradeoffs of ∑∆ modulator system-level design parameter combinations...... - order, oversampling ratio (OSR) and number of bits in the quantizer - including their impact on interpolation filter design as well. The system is kept in digital domain up to the input of the Class D power stage including the digital pulse width modulation (DPWM) block. Notes on the impact of the DPWM...
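
    A rough way to see how the three design parameters interact is the textbook peak-SQNR estimate for an ideal L-th order modulator with a B-bit quantizer. The sketch below uses that standard formula (not taken from the paper, and ignoring the interpolation-filter considerations the authors also discuss) to compare a few hypothetical parameter combinations:

```python
import math

def sqnr_db(order, osr, bits):
    """Textbook peak SQNR estimate for an ideal order-L sigma-delta
    modulator with a B-bit quantizer at oversampling ratio OSR."""
    L, B = order, bits
    return (6.02 * B + 1.76
            - 10 * math.log10(math.pi ** (2 * L) / (2 * L + 1))
            + (2 * L + 1) * 10 * math.log10(osr))

# Hypothetical candidates targeting ~16-bit audio quality (~98 dB)
for L, osr, B in [(2, 64, 1), (3, 64, 1), (2, 128, 1), (3, 32, 3)]:
    print(f"order={L} OSR={osr} bits={B}: {sqnr_db(L, osr, B):.1f} dB")
```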

  16. Theories and models on the biology of cells in space

    Science.gov (United States)

    Todd, P.; Klaus, D. M.

    1996-01-01

    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/s² and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  17. Plausibility Arguments and Universal Gravitation

    Science.gov (United States)

    Cunha, Ricardo F. F.; Tort, A. C.

    2017-01-01

    Newton's law of universal gravitation underpins our understanding of the dynamics of the Solar System and of a good portion of the observable universe. Generally, in the classroom or in textbooks, the law is presented initially in a qualitative way and at some point during the exposition its mathematical formulation is written on the blackboard…
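
    For reference, the mathematical formulation alluded to here is the familiar inverse-square law, given below in its standard textbook form (the paper's specific pedagogical treatment is not reproduced):

```latex
F \;=\; G\,\frac{m_1 m_2}{r^2},
\qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}}
```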

  18. Towards a more plausible dragon

    Science.gov (United States)

    Efthimiou, Costas

    2014-08-01

    Wizards, mermaids, dragons and aliens. Walking, running, flying and space travel. A hi-tech elevator, a computer, a propulsion engine and a black hole. What do all of these things have in common? This might seem like a really hard brainteaser but the answer is simple: they all obey the fundamental laws of our universe.

  19. System-level power optimization for real-time distributed embedded systems

    Science.gov (United States)

    Luo, Jiong

    Power optimization is one of the crucial design considerations for modern electronic systems. In this thesis, we present several system-level power optimization techniques for real-time distributed embedded systems, based on dynamic voltage scaling, dynamic power management, and management of peak power and variance of the power profile. Dynamic voltage scaling has been widely acknowledged as an important and powerful technique to trade off dynamic power consumption and delay. Efficient dynamic voltage scaling requires effective variable-voltage scheduling mechanisms that can adjust voltages and clock frequencies adaptively based on workloads and timing constraints. For this purpose, we propose static variable-voltage scheduling algorithms utilizing critical-path-driven timing analysis for the case when tasks are assumed to have uniform switching activities, as well as energy-gradient-driven slack allocation for a more general scenario. The proposed techniques can achieve close-to-optimal power savings with very low computational complexity, without violating any real-time constraints. We also present algorithms for power-efficient joint scheduling of multi-rate periodic task graphs along with soft aperiodic tasks. The power issue is addressed through both dynamic voltage scaling and power management. Periodic task graphs are scheduled statically. Flexibility is introduced into the static schedule to allow the on-line scheduler to make local changes to PE schedules through resource reclaiming and slack stealing, without interfering with the validity of the global schedule. We provide a unified framework in which the response times of aperiodic tasks and power consumption are dynamically optimized simultaneously. Interconnection network fabrics point to a new generation of power-efficient and scalable interconnection architectures for distributed embedded systems. As the system bandwidth continues to increase, interconnection networks become power/energy limited as
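
    The leverage that dynamic voltage scaling offers comes from the quadratic dependence of dynamic energy on supply voltage. A minimal sketch under common simplifying assumptions (E = C_eff * V^2 per cycle, frequency scaling roughly linearly with voltage; all constants illustrative, not from the thesis) shows why schedule slack converts directly into energy savings:

```python
def dynamic_energy(c_eff, v, f, cycles):
    """Dynamic energy for a task of `cycles` clock cycles:
    E = C_eff * V^2 * cycles  (P = C_eff * V^2 * f and time = cycles / f,
    so f cancels in the energy; it only sets the task's latency)."""
    return c_eff * v ** 2 * cycles

# Halving voltage and frequency quarters the energy per task while
# doubling its latency, which is why slack can be traded for savings.
c_eff, cycles = 1e-9, 2e6
e_full = dynamic_energy(c_eff, v=1.2, f=1.0e9, cycles=cycles)
e_half = dynamic_energy(c_eff, v=0.6, f=0.5e9, cycles=cycles)
print(f"full speed: {e_full*1e3:.2f} mJ, half speed: {e_half*1e3:.2f} mJ")
```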

  20. Number theory

    CERN Document Server

    Andrews, George E

    1994-01-01

    Although mathematics majors are usually conversant with number theory by the time they have completed a course in abstract algebra, other undergraduates, especially those in education and the liberal arts, often need a more basic introduction to the topic.In this book the author solves the problem of maintaining the interest of students at both levels by offering a combinatorial approach to elementary number theory. In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simpl

  1. Risk theory

    CERN Document Server

    Schmidli, Hanspeter

    2017-01-01

    This book provides an overview of classical actuarial techniques, including material that is not readily accessible elsewhere such as the Ammeter risk model and the Markov-modulated risk model. Other topics covered include utility theory, credibility theory, claims reserving and ruin theory. The author treats both theoretical and practical aspects and also discusses links to Solvency II. Written by one of the leading experts in the field, these lecture notes serve as a valuable introduction to some of the most frequently used methods in non-life insurance. They will be of particular interest to graduate students, researchers and practitioners in insurance, finance and risk management.

  2. Mapping Theory

    DEFF Research Database (Denmark)

    Smith, Shelley

    This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research...... project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept...

  3. Plasticity theory

    CERN Document Server

    Lubliner, Jacob

    2008-01-01

    The aim of Plasticity Theory is to provide a comprehensive introduction to the contemporary state of knowledge in basic plasticity theory and to its applications. It treats several areas not commonly found between the covers of a single book: the physics of plasticity, constitutive theory, dynamic plasticity, large-deformation plasticity, and numerical methods, in addition to a representative survey of problems treated by classical methods, such as elastic-plastic problems, plane plastic flow, and limit analysis; the problems discussed come from areas of interest to mechanical, structural, and

  4. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex...... ante (“hidden characteristics”) as well as ex post information asymmetry (“hidden action”), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....

  5. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    2015-01-01

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex...... ante (‘hidden characteristics’) as well as ex post information asymmetry (‘hidden action’), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....

  6. How to Do Things with Mouse Clicks: Applying Austin's Speech Act Theory to Explain Learning in Virtual Worlds

    Science.gov (United States)

    Loke, Swee-Kin; Golding, Clinton

    2016-01-01

    This article addresses learning in desktop virtual worlds where students role play for professional education. When students role play in such virtual worlds, they can learn some knowledge and skills that are useful in the physical world. However, existing learning theories do not provide a plausible explanation of how performing non-verbal…

  7. System-Level Demonstration of a Dynamically Reconfigured Burst-Mode Link Using a Nanosecond Si-Photonic Switch

    DEFF Research Database (Denmark)

    Forencich, Alex; Kamchevska, Valerija; Dupuis, Nicolas

    2018-01-01

    Using a novel FPGA-based network emulator, microsecond-scale packets with 12.5-20-Gb/s data are generated, routed through a nanosecond Si-photonic switch, and received in a fast-locking burst-mode receiver. Error-free links with <382-ns system-level switching are shown....

  8. Is health workforce planning recognising the dynamic interplay between health literacy at an individual, organisation and system level?

    Science.gov (United States)

    Naccarella, Lucio; Wraight, Brenda; Gorman, Des

    2016-02-01

    The growing demands on the health system to adapt to constant change have led to investment in health workforce planning agencies and approaches. Health workforce planning approaches focusing on identifying, predicting and modelling workforce supply and demand are criticised as being simplistic and not contributing to system-level resiliency. Alternative evidence- and needs-based health workforce planning approaches are being suggested. However, to contribute to system-level resiliency, workforce planning approaches need to also adopt system-based approaches. The increased complexity and fragmentation of the healthcare system, especially for patients with complex and chronic conditions, has also led to a focus on health literacy not simply as an individual trait, but also as a dynamic product of the interaction between individual (patients, workforce)-, organisational- and system-level health literacy. Although it is absolutely essential that patients have a level of health literacy that enables them to navigate and make decisions, the health workforce, organisations and indeed the system itself also need to be health literate. Herein we explore whether health workforce planning is recognising the dynamic interplay between health literacy at an individual, organisation and system level, and the potential for strengthening resiliency across all those levels.

  9. ARTS: A System-Level Framework for Modeling MPSoC Components and Analysis of their Causality

    DEFF Research Database (Denmark)

    Mahadevan, Shankar; Storgaard, Michael; Madsen, Jan

    2005-01-01

    Designing complex heterogeneous multiprocessor System-on-Chip (MPSoC) requires support for modeling and analysis of the different layers, i.e. application, operating system (OS) and platform architecture. This paper presents an abstract system-level modeling framework, called ARTS, to support...

  10. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    Science.gov (United States)

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, are analyzed remotely in a distributed manner, and the user's health status is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management.

  11. Continuity theory

    CERN Document Server

    Nel, Louis

    2016-01-01

    This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...

  12. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  13. Interpolation theory

    CERN Document Server

    Lunardi, Alessandra

    2018-01-01

    This book is the third edition of the 1999 lecture notes of the courses on interpolation theory that the author delivered at the Scuola Normale in 1998 and 1999. In the mathematical literature there are many good books on the subject, but none of them is very elementary, and in many cases the basic principles are hidden below great generality. In this book the principles of interpolation theory are illustrated aiming at simplification rather than at generality. The abstract theory is reduced as far as possible, and many examples and applications are given, especially to operator theory and to regularity in partial differential equations. Moreover the treatment is self-contained, the only prerequisite being the knowledge of basic functional analysis.

  14. [Nuclear theory]

    International Nuclear Information System (INIS)

    1989-06-01

    This report discusses concepts in nuclear theory such as: neutrino nucleosynthesis; double beta decay; neutrino oscillations; chiral symmetry breaking; T invariance; quark propagator; cold fusion; and other related topics

  15. Livability theory

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2014-01-01

    Assumptions: Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how

  16. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2015-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  17. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2017-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  18. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  19. Bacterially-Associated Transcriptional Remodelling in a Distinct Genomic Subtype of Colorectal Cancer Provides a Plausible Molecular Basis for Disease Development.

    Directory of Open Access Journals (Sweden)

    Katie S Lennard

    Full Text Available The relevance of specific microbial colonisation to colorectal cancer (CRC) disease pathogenesis is increasingly recognised, but our understanding of possible underlying molecular mechanisms that may link colonisation to disease in vivo remains limited. Here, we investigate the relationships between the most commonly studied CRC-associated bacteria (Enterotoxigenic Bacteroides fragilis, pks+ Escherichia coli, Fusobacterium spp., afaC+ E. coli, Enterococcus faecalis & Enteropathogenic E. coli) and altered transcriptomic and methylation profiles of CRC patients, in order to gain insight into the potential contribution of these bacteria to the aetiopathogenesis of CRC. We show that colonisation by E. faecalis and high levels of Fusobacterium is associated with a specific transcriptomic subtype of CRC that is characterised by CpG island methylation, microsatellite instability and a significant increase in inflammatory and DNA damage pathways. Analysis of the significant, bacterially-associated changes in host gene expression, both at the level of individual genes as well as pathways, revealed a transcriptional remodelling that provides a plausible mechanistic link between specific bacterial colonisation and colorectal cancer disease development and progression in this subtype; these included upregulation of REG3A, REG1A and REG1P in the case of high-level colonisation by Fusobacterium, and CXCL10 and BMI1 in the case of colonisation by E. faecalis. The enrichment of both E. faecalis and Fusobacterium in this CRC subtype suggests that polymicrobial colonisation of the colonic epithelium may well be an important aspect of colonic tumourigenesis.

  20. Nevanlinna theory

    CERN Document Server

    Kodaira, Kunihiko

    2017-01-01

    This book deals with the classical theory of Nevanlinna on the value distribution of meromorphic functions of one complex variable, based on minimum prerequisites for complex manifolds. The theory was extended to several variables by S. Kobayashi, T. Ochiai, J. Carlson, and P. Griffiths in the early 1970s. K. Kodaira took up this subject in his course at The University of Tokyo in 1973 and gave an introductory account of this development in the context of his final paper, contained in this book. The first three chapters are devoted to holomorphic mappings from C to complex manifolds. In the fourth chapter, holomorphic mappings between higher dimensional manifolds are covered. The book is a valuable treatise on the Nevanlinna theory, of special interests to those who want to understand Kodaira's unique approach to basic questions on complex manifolds.

  1. Gauge theories

    International Nuclear Information System (INIS)

    Kenyon, I.R.

    1986-01-01

    Modern theories of the interactions between fundamental particles are all gauge theories. In the case of gravitation, application of this principle to space-time leads to Einstein's theory of general relativity. All the other interactions involve the application of the gauge principle to internal spaces. Electromagnetism serves to introduce the idea of a gauge field, in this case the electromagnetic field. The next example, the strong force, shows unique features at long and short range which have their origin in the self-coupling of the gauge fields. Finally the unification of the description of the superficially dissimilar electromagnetic and weak nuclear forces completes the picture of successes of the gauge principle. (author)

  2. Galois theory

    CERN Document Server

    Stewart, Ian

    2003-01-01

    Ian Stewart's Galois Theory has been in print for 30 years. Resoundingly popular, it still serves its purpose exceedingly well. Yet mathematics education has changed considerably since 1973, when theory took precedence over examples, and the time has come to bring this presentation in line with more modern approaches.To this end, the story now begins with polynomials over the complex numbers, and the central quest is to understand when such polynomials have solutions that can be expressed by radicals. Reorganization of the material places the concrete before the abstract, thus motivating the g

  3. Scattering theory

    International Nuclear Information System (INIS)

    Sitenko, A.

    1991-01-01

    This book emerged out of graduate lectures given by the author at the University of Kiev and is intended as a graduate text. The fundamentals of non-relativistic quantum scattering theory are covered, including some topics, such as the phase-function formalism, separable potentials, and inverse scattering, which are not always covered in textbooks on scattering theory. Criticisms of the text are minor, but the reviewer feels an inadequate index is provided and the citing of references in the Russian language is a hindrance in a graduate text

  4. Systems-level analysis of Escherichia coli response to silver nanoparticles: the roles of anaerobic respiration in microbial resistance.

    Science.gov (United States)

    Du, Huamao; Lo, Tat-Ming; Sitompul, Johnner; Chang, Matthew Wook

    2012-08-10

    Despite extensive use of silver nanoparticles for antimicrobial applications, cellular mechanisms underlying microbial response to silver nanoparticles remain to be further elucidated at the systems level. Here, we report the systems-level response of Escherichia coli to silver nanoparticles using transcriptome-based biochemical and phenotype assays. Notably, we provide evidence that anaerobic respiration is induced upon exposure to silver nanoparticles. Further, we show that anaerobic respiration-related regulators and enzymes play an important role in E. coli resistance to silver nanoparticles. In particular, our results suggest that arcA is essential for resistance against silver nanoparticles and that the deletion of fnr, fdnH and narH significantly increases resistance. We envision that this study offers novel insights into the modes of antimicrobial action of silver nanoparticles and the cellular mechanisms contributing to the development of microbial resistance to silver nanoparticles. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Systems-level modeling the effects of arsenic exposure with sequential pulsed and fluctuating patterns for tilapia and freshwater clam

    International Nuclear Information System (INIS)

    Chen, W.-Y.; Tsai, J.-W.; Ju, Y.-R.; Liao, C.-M.

    2010-01-01

    The purpose of this paper was to use a quantitative systems-level approach, employing a biotic ligand model-based threshold damage model, to examine physiological responses of tilapia and freshwater clam to sequential pulsed and fluctuating arsenic concentrations. We tested the present model and its triggering mechanisms by carrying out a series of modeling experiments in which we used periodic pulses and sine waves as featured exposures. Our results indicate that changes in the dominant frequencies and pulse timing can shift the safe rate distributions for tilapia, but not for the freshwater clam. We found that tilapia incur increased bioenergetic costs to maintain acclimation during pulsed and sine-wave exposures. Our ability to predict the consequences of physiological variation under time-varying exposure patterns also has implications for optimizing species growth, cultivation strategies, and risk assessment in realistic situations. - Systems-level modeling of pulsed and fluctuating arsenic exposures.

  6. Systems-level modeling the effects of arsenic exposure with sequential pulsed and fluctuating patterns for tilapia and freshwater clam

    Energy Technology Data Exchange (ETDEWEB)

    Chen, W.-Y. [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Tsai, J.-W. [Institute of Ecology and Evolutionary Ecology, China Medical University, Taichung 40402, Taiwan (China); Ju, Y.-R. [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Liao, C.-M., E-mail: cmliao@ntu.edu.t [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China)

    2010-05-15

    The purpose of this paper was to use a quantitative systems-level approach, employing a biotic ligand model-based threshold damage model, to examine physiological responses of tilapia and freshwater clam to sequential pulsed and fluctuating arsenic concentrations. We tested the present model and its triggering mechanisms by carrying out a series of modeling experiments in which we used periodic pulses and sine waves as featured exposures. Our results indicate that changes in the dominant frequencies and pulse timing can shift the safe rate distributions for tilapia, but not for the freshwater clam. We found that tilapia incur increased bioenergetic costs to maintain acclimation during pulsed and sine-wave exposures. Our ability to predict the consequences of physiological variation under time-varying exposure patterns also has implications for optimizing species growth, cultivation strategies, and risk assessment in realistic situations. - Systems-level modeling of pulsed and fluctuating arsenic exposures.
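
    A minimal sketch of the two featured exposure patterns, periodic pulses and a sine wave, is given below; the concentrations, periods, and time horizon are hypothetical values chosen here so that the two signals have the same time-weighted mean, not the study's actual exposure parameters:

```python
import numpy as np

def pulsed_exposure(t, base, peak, period, pulse_width):
    """Square-pulse concentration: `peak` for the first `pulse_width`
    hours of every `period`, otherwise `base`."""
    return np.where((t % period) < pulse_width, peak, base)

def sinewave_exposure(t, mean, amplitude, period):
    """Sinusoidally fluctuating concentration around `mean`."""
    return mean + amplitude * np.sin(2 * np.pi * t / period)

t = np.arange(0.0, 21 * 24.0, 1.0)  # 21 days sampled hourly (hypothetical)
pulse = pulsed_exposure(t, base=0.1, peak=1.0, period=24.0, pulse_width=6.0)
sine = sinewave_exposure(t, mean=0.325, amplitude=0.2, period=24.0)
print(f"time-weighted means: pulsed={pulse.mean():.3f}, sine={sine.mean():.3f} mg/L")
```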

  7. Application of Dempster–Shafer theory in dose response outcome analysis

    International Nuclear Information System (INIS)

    Chen Wenzhou; Cui Yunfeng; Yu Yan; Galvin, James; Xiao Ying; He Yanyan; Hussaini, Yousuff M

    2012-01-01

    The Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) reviews summarize the currently available three-dimensional dose/volume/outcome data from multi-institutions and numerous articles to update and refine the normal tissue dose/volume tolerance guidelines. As pointed out in the review, the data have limitations and even some inconsistency. However, with the help of new physical and statistical techniques, the information in the review could be updated so that patient care can be continually improved. The purpose of this work is to demonstrate the application of a mathematical theory, the Dempster–Shafer theory, in dose/volume/outcome data analysis. We applied this theory to the original data obtained from published clinical studies describing dose response for radiation pneumonitis. Belief and plausibility concepts were introduced for dose response evaluation. We were also able to consider the uncertainty and inconsistency of the data from these studies with Yager's combination rule, a special methodology of Dempster–Shafer theory, to fuse the data at several specific doses. The values of belief and plausibility functions were obtained at the corresponding doses. Then we applied the Lyman–Kutcher–Burman (LKB) model to fit these values and a belief–plausibility range was obtained. This range could be considered as a probability range to assist physicians and treatment planners in determining acceptable dose–volume constraints. Finally, the parameters obtained from the LKB model fitting were compared with those in Emami and Burman's papers and those from other frequentist statistics methods. We found that Emami and Burman's parameters are within the belief–plausibility range we calculated by the Dempster–Shafer theory. (paper)

  8. Application of Dempster-Shafer theory in dose response outcome analysis

    Science.gov (United States)

    Chen, Wenzhou; Cui, Yunfeng; He, Yanyan; Yu, Yan; Galvin, James; Hussaini, Yousuff M.; Xiao, Ying

    2012-09-01

    The Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) reviews summarize the currently available three-dimensional dose/volume/outcome data from multi-institutions and numerous articles to update and refine the normal tissue dose/volume tolerance guidelines. As pointed out in the review, the data have limitations and even some inconsistency. However, with the help of new physical and statistical techniques, the information in the review could be updated so that patient care can be continually improved. The purpose of this work is to demonstrate the application of a mathematical theory, the Dempster-Shafer theory, in dose/volume/outcome data analysis. We applied this theory to the original data obtained from published clinical studies describing dose response for radiation pneumonitis. Belief and plausibility concepts were introduced for dose response evaluation. We were also able to consider the uncertainty and inconsistency of the data from these studies with Yager's combination rule, a special methodology of Dempster-Shafer theory, to fuse the data at several specific doses. The values of belief and plausibility functions were obtained at the corresponding doses. Then we applied the Lyman-Kutcher-Burman (LKB) model to fit these values and a belief-plausibility range was obtained. This range could be considered as a probability range to assist physicians and treatment planners in determining acceptable dose-volume constraints. Finally, the parameters obtained from the LKB model fitting were compared with those in Emami and Burman's papers and those from other frequentist statistics methods. We found that Emami and Burman's parameters are within the belief-plausibility range we calculated by the Dempster-Shafer theory.
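
    As a concrete illustration of the belief and plausibility concepts used in these two studies, a minimal sketch follows. The frame of discernment and the mass values are hypothetical, not the papers' fitted data; the resulting interval [Bel, Pl] is the kind of probability range the authors fit with the LKB model.

```python
def belief_plausibility(masses, query):
    """Compute Bel(A) and Pl(A) from a mass function whose focal elements
    are frozensets: Bel sums masses of subsets of A, Pl sums masses of
    sets that intersect A."""
    bel = sum(m for s, m in masses.items() if s <= query)
    pl = sum(m for s, m in masses.items() if s & query)
    return bel, pl

# Hypothetical two-outcome frame for "pneumonitis at a given dose":
# mass on the whole frame {Y, N} encodes the ignorance left after fusing
# inconsistent studies (cf. Yager's rule, which assigns conflict mass to
# the whole frame rather than renormalizing it away).
masses = {
    frozenset({"Y"}): 0.30,
    frozenset({"N"}): 0.45,
    frozenset({"Y", "N"}): 0.25,
}
bel, pl = belief_plausibility(masses, frozenset({"Y"}))
print(f"Bel(pneumonitis)={bel:.2f}, Pl(pneumonitis)={pl:.2f}")  # 0.30, 0.55
```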

  9. System-Level Coupled Modeling of Piezoelectric Vibration Energy Harvesting Systems by Joint Finite Element and Circuit Analysis

    Directory of Open Access Journals (Sweden)

    Congcong Cheng

    2016-01-01

    Full Text Available A practical piezoelectric vibration energy harvesting (PVEH) system is usually composed of two coupled parts: a harvesting structure and an interface circuit. Thus, it is essential to build system-level coupled models for analyzing PVEH systems, so that the whole PVEH system can be optimized to obtain a high overall efficiency. In this paper, two classes of coupled models are proposed by joint finite element and circuit analysis. The first is to integrate the equivalent circuit model of the harvesting structure with the interface circuit, and the second is to integrate the equivalent electrical impedance of the interface circuit into the finite element model of the harvesting structure. Then equivalent circuit model parameters of the harvesting structure are estimated by finite element analysis, and the equivalent electrical impedance of the interface circuit is derived by circuit analysis. In the end, simulations are performed to validate and compare the proposed two classes of system-level coupled models. The results demonstrate that harvested powers from the two classes of coupled models closely match theoretical values. Thus, the proposed coupled models can be used for system-level optimizations in engineering applications.
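
    To illustrate the kind of coupled electromechanical evaluation such system-level models enable, here is a minimal sketch of the standard lumped two-port harvester model driving a purely resistive load. All parameter values and the restriction to a resistive interface are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def harvested_power(omega, R, m=1e-3, c=0.05, k=1000.0, theta=1e-3, Cp=50e-9, A=10.0):
    """Average power into load R for the lumped harvester model
    m*x'' + c*x' + k*x + theta*v = -m*A*cos(wt),  Cp*v' + v/R = theta*x',
    solved in the frequency domain (phasors)."""
    F = m * A                                  # force phasor magnitude
    Ye = 1j * omega * Cp + 1.0 / R             # electrical admittance
    Zm = -m * omega**2 + 1j * omega * c + k    # mechanical dynamic stiffness
    X = F / (Zm + 1j * omega * theta**2 / Ye)  # displacement phasor
    V = 1j * omega * theta * X / Ye            # voltage phasor
    return 0.5 * np.abs(V) ** 2 / R

w0 = np.sqrt(1000.0 / 1e-3)                    # ~1000 rad/s (~159 Hz) resonance
for R in [1e3, 1e4, 1e5]:
    print(f"R={R:8.0f} ohm -> P={harvested_power(w0, R)*1e6:.2f} uW")
```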

  10. Theta-Gamma Coding Meets Communication-through-Coherence: Neuronal Oscillatory Multiplexing Theories Reconciled.

    Science.gov (United States)

    McLelland, Douglas; VanRullen, Rufin

    2016-10-01

    Several theories have been advanced to explain how cross-frequency coupling, the interaction of neuronal oscillations at different frequencies, could enable item multiplexing in neural systems. The communication-through-coherence theory proposes that phase-matching of gamma oscillations between areas enables selective processing of a single item at a time, and a later refinement of the theory includes a theta-frequency oscillation that provides a periodic reset of the system. Alternatively, the theta-gamma neural code theory proposes that a sequence of items is processed, one per gamma cycle, and that this sequence is repeated or updated across theta cycles. In short, both theories serve to segregate representations via the temporal domain, but differ on the number of objects concurrently represented. In this study, we set out to test whether each of these theories is actually physiologically plausible, by implementing them within a single model inspired by physiological data. Using a spiking network model of visual processing, we show that each of these theories is physiologically plausible and computationally useful. Both theories were implemented within a single network architecture, with two areas connected in a feedforward manner, and gamma oscillations generated by feedback inhibition within areas. Simply increasing the amplitude of global inhibition in the lower area, equivalent to an increase in the spatial scope of the gamma oscillation, yielded a switch from one mode to the other. Thus, these different processing modes may co-exist in the brain, enabling dynamic switching between exploratory and selective modes of attention.

  11. Leadership Theories.

    Science.gov (United States)

    Sferra, Bobbie A.; Paddock, Susan C.

    This booklet describes various theoretical aspects of leadership, including the proper exercise of authority, effective delegation, goal setting, exercise of control, assignment of responsibility, performance evaluation, and group process facilitation. It begins by describing the evolution of general theories of leadership from historic concepts…

  12. Combinatorial Theory

    CERN Document Server

    Hall, Marshall

    2011-01-01

    Includes proof of van der Waerden's 1926 conjecture on permanents, Wilson's theorem on asymptotic existence, and other developments in combinatorics since 1967. Also covers coding theory and its important connection with designs, problems of enumeration, and partition. Presents fundamentals in addition to latest advances, with illustrative problems at the end of each chapter. Enlarged appendixes include a longer list of block designs.

  13. Control Theory.

    Science.gov (United States)

    Toso, Robert B.

    2000-01-01

    Inspired by William Glasser's Reality Therapy ideas, Control Theory (CT) is a disciplinary approach that stresses people's ability to control only their own behavior, based on internal motivations to satisfy five basic needs. At one North Dakota high school, CT-trained teachers are the program's best recruiters. (MLH)

  14. Framing theory

    NARCIS (Netherlands)

    de Vreese, C.H.; Lecheler, S.; Mazzoleni, G.; Barnhurst, K.G.; Ikeda, K.; Maia, R.C.M.; Wessler, H.

    2016-01-01

    Political issues can be viewed from different perspectives and they can be defined differently in the news media by emphasizing some aspects and leaving others aside. This is at the core of news framing theory. Framing originates within sociology and psychology and has become one of the most used

  15. Electricity Theory

    International Nuclear Information System (INIS)

    Gong, Ha Soung

    2006-12-01

    The textbook is composed of five parts: a summary of the book; an arrangement of electricity theory including electricity and magnetism, direct current, and alternating current; two dictionaries of electricity terms and synonyms; and an appendix. It is intended as preparation for the examinations for officers, electricity engineers, and fire-fighting engineers.

  16. Theory U

    DEFF Research Database (Denmark)

    Monthoux, Pierre Guillet de; Statler, Matt

    2014-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer’s Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  17. Theory U

    DEFF Research Database (Denmark)

    Guillet de Monthoux, Pierre; Statler, Matt

    2017-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer's Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  18. Theory summary

    International Nuclear Information System (INIS)

    Tang, W.M.

    2001-01-01

    This is a summary of the advances in magnetic fusion energy theory research presented at the 17th International Atomic Energy Agency Fusion Energy Conference from 19 to 24 October 1998 in Yokohama, Japan. Theory and simulation results from this conference provided encouraging evidence of significant progress in understanding the physics of thermonuclear plasmas. Indeed, the grand challenge for this field is to acquire the basic understanding that can readily enable the innovations which would make fusion energy practical. In this sense, research in fusion energy is increasingly able to be categorized as fitting well the 'Pasteur's Quadrant' paradigm, where the research strongly couples basic science ('Bohr's Quadrant') to technological impact ('Edison's Quadrant'). As supported by some of the work presented at this conference, this trend will be further enhanced by advanced simulations. Eventually, realistic three-dimensional modeling capabilities, when properly combined with rapid and complete data interpretation of results from both experiments and simulations, can contribute to a greatly enhanced cycle of understanding and innovation. Plasma science theory and simulation have provided reliable foundations for this improved modeling capability, and the exciting advances in high-performance computational resources have further accelerated progress. There were 68 papers presented at this conference in the area of magnetic fusion energy theory

  19. Communication Theory.

    Science.gov (United States)

    Penland, Patrick R.

    Three papers are presented which delineate the foundation of theory and principles which underlie the research and instructional approach to communications at the Graduate School of Library and Information Science, University of Pittsburgh. Cybernetic principles provide the integration, and validation is based in part on a situation-producing…

  20. Complexity Theory

    Science.gov (United States)

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.

  1. Matching theory

    CERN Document Server

    Plummer, MD

    1986-01-01

    This study of matching theory deals with bipartite matching, network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors and vertex packing.

  2. Activity Theory

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege; Bødker, Susanne

    2003-01-01

    the young HCI research tradition. But HCI was already facing problems: lack of consideration for other aspects of human behavior, for interaction with other people, for culture. Cognitive science-based theories lacked means to address several issues that came out of the empirical projects....

  3. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    Science.gov (United States)

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
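
    A minimal delta-rule sketch of the two quantities the modeling disentangles, expected value (reward anticipation) and prediction error (reward delivery), is shown below. The learning rate, reward probability, and the delta rule itself are generic reinforcement-learning assumptions, not the authors' exact computational model:

```python
import random

def run_task(n_trials=100, alpha=0.2, p_reward=0.8, seed=1):
    """Trial-by-trial expected value V and reward prediction error delta
    in a simple delta-rule learner: delta = r - V; V <- V + alpha*delta.
    In an analysis like the study's, memory for an item encoded on trial
    t would be related to |delta_t| (delivery) and V_t (anticipation)."""
    rng = random.Random(seed)
    v, trace = 0.0, []
    for t in range(n_trials):
        r = 1.0 if rng.random() < p_reward else 0.0
        delta = r - v                 # prediction error at outcome
        trace.append((t, v, delta))   # V is the pre-outcome expectation
        v += alpha * delta            # value update for the next trial
    return trace

for t, v, d in run_task()[:5]:
    print(f"trial {t}: expected value V={v:.3f}, prediction error delta={d:+.3f}")
```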

  4. The erotetic theory of delusional thinking.

    Science.gov (United States)

    Parrott, Matthew; Koralus, Philipp

    2015-01-01

    In this paper, we argue for a novel account of one cognitive factor implicated in delusional cognition. According to the erotetic theory of delusion we present, the central cognitive factor in delusion is impaired endogenous question raising. After presenting the erotetic theory, we draw on it to model three distinct patterns of reasoning exhibited by delusional and schizophrenic patients, and contrast our explanations with Bayesian alternatives. We argue that the erotetic theory has considerable advantages over Bayesian models. Specifically, we show that it offers a superior explanation of three phenomena: the onset and persistence of the Capgras delusion; recent data indicating that schizophrenic subjects manifest superior reasoning with conditionals in certain contexts; and evidence that schizophrenic and delusional subjects have a tendency to "jump to conclusions." Moreover, since the cognitive mechanisms we appeal to are independently motivated, we avoid having to posit distinct epistemic states that are intrinsically irrational in order to fit our model to the variety of data. In contrast to Bayesian models, the erotetic theory offers a simple, unified explanation of a range of empirical data. We therefore conclude that it offers a more plausible framework for explaining delusional cognition.

  5. Communication theory

    DEFF Research Database (Denmark)

    Stein, Irene F.; Stelter, Reinhard

    2011-01-01

    Communication theory covers a wide variety of theories related to the communication process (Littlejohn, 1999). Communication is not simply an exchange of information, in which we have a sender and a receiver. This very technical concept of communication is clearly outdated; a human being...... is not a data processing device. In this chapter, communication is understood as a process of shared meaning-making (Bruner, 1990). Human beings interpret their environment, other people, and themselves on the basis of their dynamic interaction with the surrounding world. Meaning is essential because people...... ascribe specific meanings to their experiences, their actions in life or work, and their interactions. Meaning is reshaped, adapted, and transformed in every communication encounter. Furthermore, meaning is cocreated in dialogues or in communities of practice, such as in teams at a workplace or in school...

  6. Operator theory

    CERN Document Server

    2015-01-01

    A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.

  7. Potential theory

    CERN Document Server

    Helms, Lester L

    2014-01-01

    Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...

  8. Practical theories

    DEFF Research Database (Denmark)

    Jensen, Klaus Bruhn

    2016-01-01

    This article revisits the place of normative and other practical issues in the wider conceptual architecture of communication theory, building on the tradition of philosophical pragmatism. The article first characterizes everyday concepts of communication as the accumulated outcome of natural...... evolution and history: practical resources for human existence and social coexistence. Such practical concepts have served as the point of departure for diverse theoretical conceptions of what communication is. The second part of the article highlights the past neglect and current potential of normative...... communication theories that ask, in addition, what communication ought to be, and what it could be, taking the relationship between communication and justice as a case in point. The final section returns to empirical conceptualizations of different institutions, practices and discourses of communication...

  9. Gauge theories

    International Nuclear Information System (INIS)

    Jarlskog, C.

    An introduction to the unified gauge theories of weak and electromagnetic interactions is given. The ingredients of gauge theories and symmetries and conservation laws lead to discussion of local gauge invariance and QED, followed by weak interactions and quantum flavor dynamics. The construction of the standard SU(2)xU(1) model precedes discussion of the unification of weak and electromagnetic interactions and weak neutral current couplings in this model. Presentation of spontaneous symmetry breaking and spontaneous breaking of a local symmetry leads to a spontaneous breaking scheme for the standard SU(2)xU(1) model. Consideration of quarks, leptons, masses and the Cabibbo angles, of the four-quark and six-quark models and CP violation leads finally to grand unification, followed by discussion of mixing angles in the Georgi-Glashow model, the Higgses of the SU(5) model and proton/neutron decay in SU(5). (JIW)

  10. Twistor theory

    International Nuclear Information System (INIS)

    Perjes, Z.

    1982-01-01

    Particle models in twistor theory are reviewed, starting with an introduction into the kinematical-twistor formalism which describes massive particles in Minkowski space-time. The internal transformations of constituent twistors are then discussed. The quantization rules available from a study of twistor scattering situations are used to construct quantum models of fundamental particles. The theory allows the introduction of an internal space with a Kaehlerian metric where hadron structure is described by spherical states of bound constituents. It is conjectured that the spectrum of successive families of hadrons might approach an accumulation point in energy. Above this threshold energy, the Kaehlerian analog of ionization could occur wherein the zero-mass constituents (twistors) of the particle break free. (Auth.)

  11. Biocultural Theory

    DEFF Research Database (Denmark)

    Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie

    2017-01-01

    Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human...... of research as contributions to a coherent, collective research program. This article argues that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological...... forms of birth, growth, survival, mating, parenting, and sociality. Conversely, from the biocultural perspective, human biological processes are constrained, organized, and developed by culture, which includes technology, culturally specific socioeconomic and political structures, religious...

  12. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  13. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2014-01-01

    This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...

  14. Livability theory

    OpenAIRE

    Veenhoven, Ruut

    2014-01-01

    Assumptions: Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how much we like the life we live (happiness). Hence, happiness depends on need gratification. 4. Need gratification depends on both external living conditions and inner abilities to use these. Hence, bad living...

  15. Testing theories

    International Nuclear Information System (INIS)

    Casten, R F

    2015-01-01

    This paper discusses some simple issues that arise in testing models, with a focus on models for low energy nuclear structure. By way of simplified examples, we illustrate some dangers in blind statistical assessments, pointing out especially the need to include theoretical uncertainties, the danger of over-weighting precise or physically redundant experimental results, the need to assess competing theories with independent and physically sensitive observables, and the value of statistical tests properly evaluated. (paper)

  16. Graph theory

    CERN Document Server

    Diestel, Reinhard

    2017-01-01

    This standard textbook of modern graph theory, now in its fifth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: “This outstanding book cannot be substituted with any other book on the present textbook market. It has every chance of becoming the standard textbook for graph theory.”Acta Scientiarum Mathematicarum “Deep, clear, wonderful. This is a serious book about the heart of graph theory. It has depth and integrity. ”Persi Diaconis & Ron Graham, SIAM Review “The book has received a very enthusiastic reception, which it amply deserves. A masterly elucidation of modern graph theo...

  17. Scattering theory

    CERN Document Server

    Friedrich, Harald

    2016-01-01

    This corrected and updated second edition of "Scattering Theory" presents a concise and modern coverage of the subject. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. The book contains sections on special topics such as near-threshold quantization, quantum reflection, Feshbach resonances and the quantum description of scattering in two dimensions. The level of abstraction is k...

  18. A new definition of entropy of belief functions in the Dempster-Shafer theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim; Shenoy, P. P.

    2018-01-01

    Roč. 92, č. 1 (2018), s. 49-65 ISSN 0888-613X Grant - others:GA ČR(CZ) GA15-00215S Institutional support: RVO:67985556 Keywords : Dempster-Shafer theory * Dempster’s rule of combination * Plausibility transform Subject RIV: AH - Economics Impact factor: 2.845, year: 2016 http://library.utia.cas.cz/separaty/2017/MTR/jirousek-0481470.pdf
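
    For readers unfamiliar with the keywords: the plausibility transform mentioned there converts a Dempster-Shafer basic probability assignment into an ordinary probability distribution by normalizing singleton plausibilities. A minimal sketch with illustrative masses (not code from the paper):

```python
# Plausibility transform over a three-element frame {'a', 'b', 'c'}.
# The basic probability assignment (masses over subsets) must sum to 1.
m = {
    frozenset({'a'}): 0.4,
    frozenset({'a', 'b'}): 0.3,
    frozenset({'a', 'b', 'c'}): 0.3,
}

def plausibility(x, bpa):
    """Pl({x}) = total mass of focal sets that intersect {x}."""
    return sum(mass for focal, mass in bpa.items() if x in focal)

frame = {'a', 'b', 'c'}
pl = {x: plausibility(x, m) for x in frame}
total = sum(pl.values())                      # 1.0 + 0.6 + 0.3 = 1.9
prob = {x: v / total for x, v in pl.items()}  # a: ~0.526, b: ~0.316, c: ~0.158
print(prob)
```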

  19. Using Agent-Based Modeling to Enhance System-Level Real-time Control of Urban Stormwater Systems

    Science.gov (United States)

    Rimer, S.; Mullapudi, A. M.; Kerkez, B.

    2017-12-01

    The ability to reduce combined-sewer overflow (CSO) events is an issue that challenges over 800 U.S. municipalities. When the volume of a combined sewer system or wastewater treatment plant is exceeded, untreated wastewater then overflows (a CSO event) into nearby streams, rivers, or other water bodies, causing localized urban flooding and pollution. The likelihood and impact of CSO events have only been exacerbated by urbanization, population growth, climate change, aging infrastructure, and system complexity. Thus, there is an urgent need for urban areas to manage CSO events. Traditionally, mitigating CSO events has been carried out via structural interventions such as retention basins or sewer separation, which are able to reduce CSO events but are costly, time-intensive, and only provide a fixed solution to a dynamic problem. Real-time control (RTC) of urban drainage systems using sensor and actuator networks has served as an inexpensive and versatile alternative to traditional CSO intervention. In particular, retrofitting individual stormwater elements for sensing and automated active distributed control has been shown to significantly reduce the volume of discharge during CSO events, with some RTC models demonstrating a reduction upwards of 90% when compared to traditional passive systems. As more stormwater elements become retrofitted for RTC, system-level RTC across complete watersheds is an attainable possibility. However, when considering the diverse set of control needs of each of these individual stormwater elements, such system-level RTC becomes a far more complex problem. To address such diverse control needs, agent-based modeling is employed such that each individual stormwater element is treated as an autonomous agent with diverse decision-making capabilities. We present preliminary results and limitations of utilizing the agent-based modeling computational framework for the system-level control of diverse, interacting
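
    A toy version of the agent idea, with each storage element acting on purely local state, might look as follows (the class and control rule are hypothetical, not the authors' framework):

```python
# Each basin is an autonomous agent: it sees only its own fill level and
# decides its own release. Class name and rule are hypothetical.
class BasinAgent:
    def __init__(self, capacity_m3, target_fill=0.5):
        self.capacity = capacity_m3
        self.volume = 0.0
        self.target_fill = target_fill

    def step(self, inflow_m3):
        """Receive inflow, then choose an outflow from local state only."""
        self.volume = min(self.volume + inflow_m3, self.capacity)
        fill = self.volume / self.capacity
        # Proportional rule: release only when above the target fraction.
        release = max(0.0, fill - self.target_fill) * self.capacity
        self.volume -= release
        return release

# Two basins in series: upstream releases become downstream inflow.
upstream, downstream = BasinAgent(1000.0), BasinAgent(500.0)
for inflow in [300.0, 400.0, 250.0, 100.0]:   # storm hydrograph (m^3/step)
    outflow = downstream.step(upstream.step(inflow))
```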

  20. Single event and TREE latchup mitigation for a star tracker sensor: An innovative approach to system level latchup mitigation

    International Nuclear Information System (INIS)

    Kimbrough, J.R.; Colella, N.J.; Davis, R.W.; Bruener, D.B.; Coakley, P.G.; Lutjens, S.W.; Mallon, C.E.

    1994-08-01

    Electronic packages designed for spacecraft should be fault-tolerant and operate without ground control intervention through extremes in the space radiation environment. If designed for military use, the electronics must survive and function in a nuclear radiation environment. This paper presents an innovative "blink" approach rather than the typical "operate through" approach to achieve system-level latchup mitigation on a prototype star tracker camera. Included are circuit designs, flash x-ray test data, and heavy ion data demonstrating latchup mitigation protecting micro-electronics from current latchup and burnout due to Single Event Latchup (SEL) and Transient Radiation Effects on Electronics (TREE)

  1. System-Level Power Optimization for a ΣΔ D/A Converter for Hearing-Aid Application

    DEFF Research Database (Denmark)

    Pracný, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2013-01-01

    This paper deals with a system-level optimization of the back-end of an audio signal processing chain for hearing aids, including a sigma-delta modulator digital-to-analog converter (DAC) and a Class D power amplifier. Compared to other state-of-the-art designs dealing with sigma-delta modulator design … hearing-aid audio back-end system resulting in less hardware and power consumption in the interpolation filter, in the sigma-delta modulator and reduced switching rate of the Class D output stage.

  2. The utility of system-level RAM analysis and standards for the US nuclear waste management system

    International Nuclear Information System (INIS)

    Rod, S.R.; Adickes, M.D.; Paul, B.K.

    1992-03-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing a system to manage spent nuclear fuel and high-level radioactive waste in accordance with the Nuclear Waste Policy Act of 1982 and its subsequent amendments. Pacific Northwest Laboratory (PNL) is assisting OCRWM in its investigation of whether system-level reliability, availability, and maintainability (RAM) requirements are appropriate for the waste management system and, if they are, what the appropriate form of such requirements should be. Results and recommendations are presented

  3. Pathogenesis of chronic pancreatitis: an evidence-based review of past theories and recent developments.

    Science.gov (United States)

    Stevens, Tyler; Conwell, Darwin L; Zuccaro, Gregory

    2004-11-01

    In the past several decades, four prominent theories of chronic pancreatitis pathogenesis have emerged: the toxic-metabolic theory, the oxidative stress hypothesis, the stone and duct obstruction theory, and the necrosis-fibrosis hypothesis. Although these traditional theories are formulated based on compelling scientific observations, substantial contradictory data also exist for each. Furthermore, the basic premises of some of these theories are directly contradictory. Because of the recent scientific progress in the underlying genetic, cellular, and molecular pathophysiology, there have been substantial advances in the understanding of chronic pancreatitis pathogenesis. This paper will provide an evidence-based review and critique of the traditional pathogenic theories, followed by a discussion of the new advances in pancreatic fibrogenesis. Moreover, we will discuss plausible pathogenic sequences applied to each of the known etiologies.

  4. Communication theory

    CERN Document Server

    Goldie, Charles M

    1991-01-01

    This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.

  5. Design theory

    CERN Document Server

    2009-01-01

    This book deals with the basic subjects of design theory. It begins with balanced incomplete block designs, various constructions of which are described in ample detail. In particular, finite projective and affine planes, difference sets and Hadamard matrices, as tools to construct balanced incomplete block designs, are included. Orthogonal latin squares are also treated in detail. Zhu's simpler proof of the falsity of Euler's conjecture is included. Constructions of some classes of balanced incomplete block designs, such as Steiner triple systems and Kirkman triple systems, are also given.
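
    As a concrete instance of the Steiner triple systems mentioned, STS(7), the Fano plane, can be written down and its defining property checked directly (a standard example, not taken from the book):

```python
# STS(7), the Fano plane: 7 points, 7 triples, every pair of points in
# exactly one triple (easy to verify by brute force).
from itertools import combinations

blocks = [
    {1, 2, 3}, {1, 4, 5}, {1, 6, 7},
    {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6},
]

for pair in combinations(range(1, 8), 2):
    assert sum(set(pair) <= b for b in blocks) == 1, pair
print("every pair lies in exactly one block")
```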

  6. Ruling by canal: Governance and system-level design characteristics of large scale irrigation infrastructure in India and Uzbekistan

    Directory of Open Access Journals (Sweden)

    Peter Mollinga

    2016-06-01

    This paper explores the relationship between governance regime and large-scale irrigation system design by investigating three cases: (1) protective irrigation design in post-independence South India; (2) canal irrigation system design in Khorezm Province, Uzbekistan, as implemented in the USSR period; and (3) canal design by the Madras Irrigation and Canal Company, as part of an experiment to do canal irrigation development in colonial India on commercial terms in the 1850s-1860s. The mutual shaping of irrigation infrastructure design characteristics on the one hand and management requirements and conditions on the other has been documented primarily at lower, within-system levels of the irrigation systems, notably at the level of division structures. Taking a 'social construction of technology' perspective, the paper analyses the relationship between technological structures and management and governance arrangements at irrigation system level. The paper finds qualitative differences in the infrastructural configuration of the three irrigation systems expressing and facilitating particular forms of governance and rule, differences that matter for management and use, and their effects and impacts.

  7. A Mathematical Model of Metabolism and Regulation Provides a Systems-Level View of How Escherichia coli Responds to Oxygen

    Directory of Open Access Journals (Sweden)

    Michael Ederer

    2014-03-01

    The efficient redesign of bacteria for biotechnological purposes, such as biofuel production, waste disposal or specific biocatalytic functions, requires a quantitative systems-level understanding of energy supply, carbon and redox metabolism. The measurement of transcript levels, metabolite concentrations and metabolic fluxes per se gives an incomplete picture. An appreciation of the interdependencies between the different measurement values is essential for systems-level understanding. Mathematical modeling has the potential to provide a coherent and quantitative description of the interplay between gene expression, metabolite concentrations and metabolic fluxes. Escherichia coli undergoes major adaptations in central metabolism when the availability of oxygen changes. Thus, an integrated description of the oxygen response provides a benchmark of our understanding of carbon, energy and redox metabolism. We present the first comprehensive model of the central metabolism of E. coli that describes steady-state metabolism at different levels of oxygen availability. Variables of the model are metabolite concentrations, gene expression levels, transcription factor activities, metabolic fluxes and biomass concentration. We analyze the model with respect to the production capabilities of central metabolism of E. coli. In particular, we predict how precursor and biomass concentration are affected by product formation.

  8. Transcriptome-Based Analysis in Lactobacillus plantarum WCFS1 Reveals New Insights into Resveratrol Effects at System Level.

    Science.gov (United States)

    Reverón, Inés; Plaza-Vinuesa, Laura; Franch, Mónica; de Las Rivas, Blanca; Muñoz, Rosario; López de Felipe, Félix

    2018-05-01

    This study was undertaken to expand our insights into the mechanisms involved in the tolerance to resveratrol (RSV) that operate at system-level in gut microorganisms and advance knowledge on new RSV-responsive gene circuits. Whole genome transcriptional profiling was used to characterize the molecular response of Lactobacillus plantarum WCFS1 to RSV. DNA repair mechanisms were induced by RSV and responses were triggered to decrease the load of copper, a metal required for RSV-mediated DNA cleavage, and H₂S, a genotoxic gas. To counter the effects of RSV, L. plantarum strongly up- or downregulated efflux systems and ABC transporters pointing to transport control of RSV across the membrane as a key mechanism for RSV tolerance. L. plantarum also downregulated tRNAs, induced chaperones, and reprogrammed its transcriptome to tightly control ammonia levels. RSV induced a probiotic effector gene and a likely deoxycholate transporter, two functions that improve the host health status. Our data identify novel protective mechanisms involved in RSV tolerance operating at system level in a gut microbe. These insights could influence the way RSV is used for a better management of gut microbial ecosystems to obtain associated health benefits. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. International business theory and marketing theory

    OpenAIRE

    Soldner, Helmut

    1984-01-01

    International business theory and marketing theory : elements for internat. marketing theory building. - In: Marketing aspects of international business / Gerald M. Hampton ... (eds.). - Boston u.a. : Kluwer, 1984. - S. 25-57

  10. Options theory

    International Nuclear Information System (INIS)

    Markland, J.T.

    1992-01-01

    Techniques used in conventional project appraisal are mathematically very simple in comparison to those used in reservoir modelling and in the geosciences. Clearly it would be possible to value assets in mathematically more sophisticated ways if it were meaningful and worthwhile to do so. The DCF approach in common use has recognized limitations; the inability to select a meaningful discount rate being particularly significant. Financial theory has advanced enormously over the last few years, along with computational techniques, and methods are beginning to appear which may change the way we do project evaluations in practice. The starting point for all of this was a paper by Black and Scholes, which asserts that almost all corporate liabilities can be viewed as options of varying degrees of complexity. Although the financial presentation may be unfamiliar to engineers and geoscientists, some of the concepts used will not be. This paper outlines, in plain English, the basis of option pricing theory for assessing the market value of a project. It also attempts to assess the future role of this type of approach in practical petroleum exploration and engineering economics. Reference is made to relevant published natural resource literature
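
    The Black-Scholes result referred to above gives a closed-form price for a European call; a minimal sketch treating an undeveloped asset as a call option (standard formula; the numbers are illustrative, not from the paper):

```python
# Black-Scholes price of a European call (standard textbook formula).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """S: asset value, K: strike, T: years to expiry,
    r: risk-free rate, sigma: volatility of the asset value."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Viewing an undeveloped field as a call: value of reserves S, development
# cost K, time to licence expiry T (illustrative numbers).
print(bs_call(S=100.0, K=120.0, T=5.0, r=0.05, sigma=0.3))
```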

  11. Quantum gravity from descriptive set theory

    International Nuclear Information System (INIS)

    El Naschie, M.S.

    2004-01-01

    We start from Hilbert's criticism of the axioms of classical geometry and the possibility of abandoning the Archimedean axiom. Subsequently we proceed to the physical possibility of a fundamental limitation on the smallest length, connected to certain singular points in spacetime, below which measurements become meaningless. Finally we arrive at the conclusion that maximising the Hawking-Bekenstein informational content of spacetime makes the existence of a transfinite geometry for physical 'spacetime' not only plausible but probably inevitable. The main part of the paper is then concerned with a proposal for a mathematical description of a transfinite, non-Archimedean geometry using descriptive set theory. Nevertheless, and despite all abstract mathematics, we remain quite close to similar lines of investigation initiated by physicists like A. Wheeler, D. Finkelstein and G. 't Hooft. In particular we introduce a logarithmic gauge transformation linking classical gravity with the electroweak force via a version of informational entropy. That way we may claim to have accomplished an important step towards a general theory of quantum gravity using ε^(∞) and complexity theory, finding that α_G = (2)^(ᾱ_ew − 1) ≅ (1.7)(10)^38, where α_G is the dimensionless Newton gravity constant and ᾱ_ew ≅ 128 is the fine structure constant at the electroweak scale
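
    Reading the exponent as ᾱ_ew − 1 (my interpretation of the record's notation, not stated explicitly in it), the quoted numbers are mutually consistent:

```latex
\alpha_G \;=\; 2^{\bar{\alpha}_{ew}-1} \;=\; 2^{128-1} \;=\; 2^{127} \;\approx\; 1.70\times 10^{38}
```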

  12. Toward a holographic theory for general spacetimes

    Science.gov (United States)

    Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio; Weinberg, Sean J.

    2017-04-01

    We study a holographic theory of general spacetimes that does not rely on the existence of asymptotic regions. This theory is to be formulated in a holographic space. When a semiclassical description is applicable, the holographic space is assumed to be a holographic screen: a codimension-1 surface that is capable of encoding states of the gravitational spacetime. Our analysis is guided by conjectured relationships between gravitational spacetime and quantum entanglement in the holographic description. To understand basic features of this picture, we catalog predictions for the holographic entanglement structure of cosmological spacetimes. We find that qualitative features of holographic entanglement entropies for such spacetimes differ from those in AdS/CFT but that the former reduce to the latter in the appropriate limit. The Hilbert space of the theory is analyzed, and two plausible structures are found: a direct-sum and "spacetime-equals-entanglement" structure. The former preserves a naive relationship between linear operators and observable quantities, while the latter respects a more direct connection between holographic entanglement and spacetime. We also discuss the issue of selecting a state in quantum gravity, in particular how the state of the multiverse may be selected in the landscape.

  13. Review of Mechanisms and Theories of Aging

    Directory of Open Access Journals (Sweden)

    Gholam Reza Azari

    2006-10-01

    Several factors motivate the study of aging, including the increase in the average and maximum human life span, the growing percentage of elderly people in societies, and the proportion of national expenditure utilized by them. Recent views indicate that aging is an extremely complex, multifactorial process, in contrast to earlier views that attributed aging to a single definite cause, such as a gene or the decline of a key factor (1). This brief review inspects aging at the molecular, cellular, and systemic levels and considers the interaction between genetic and environmental factors. Evolutionary theories argue that aging results from a decline in the force of natural selection. Molecular theories, on the other hand, emphasize the genetic regulation of aging and argue that aging results from changes in genes. Among the cellular theories, the telomere theory is the most famous; stress-induced aging belongs to this group as well, and the free radical theory is another recognized pathway to cellular damage. Finally, systemic theories comprise two main groups, the neuroendocrine and immunologic theories.

  14. Systems Level Dissection of Anaerobic Methane Cycling: Quantitative Measurements of Single Cell Ecophysiology, Genetic Mechanisms, and Microbial Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Orphan, Victoria [California Inst. of Technology (CalTech), Pasadena, CA (United States); Tyson, Gene [University of Queensland, Brisbane Australia; Meile, Christof [University of Georgia, Athens, Georgia; McGlynn, Shawn [California Inst. of Technology (CalTech), Pasadena, CA (United States); Yu, Hang [California Inst. of Technology (CalTech), Pasadena, CA (United States); Chadwick, Grayson [California Inst. of Technology (CalTech), Pasadena, CA (United States); Marlow, Jeffrey [California Inst. of Technology (CalTech), Pasadena, CA (United States); Trembath-Reichert, Elizabeth [California Inst. of Technology (CalTech), Pasadena, CA (United States); Dekas, Anne [California Inst. of Technology (CalTech), Pasadena, CA (United States); Hettich, Robert [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pan, Chongle [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellisman, Mark [University of California San Diego; Hatzenpichler, Roland [California Inst. of Technology (CalTech), Pasadena, CA (United States); Skennerton, Connor [California Inst. of Technology (CalTech), Pasadena, CA (United States); Scheller, Silvan [California Inst. of Technology (CalTech), Pasadena, CA (United States)

    2017-12-25

    The global biological CH4 cycle is largely controlled through coordinated and often intimate microbial interactions between archaea and bacteria, the majority of which are still unknown or have been only cursorily identified. Members of the methanotrophic archaea, aka ‘ANME’, are believed to play a major role in the cycling of methane in anoxic environments coupled to sulfate, nitrate, and possibly iron and manganese oxides, frequently forming diverse physical and metabolic partnerships with a range of bacteria. The thermodynamic challenges overcome by the ANME and their bacterial partners and corresponding slow rates of growth are common characteristics in anaerobic ecosystems, and, in stark contrast to most cultured microorganisms, this type of energy and resource limited microbial lifestyle is likely the norm in the environment. While we have gained an in-depth systems level understanding of fast-growing, energy-replete microorganisms, comparatively little is known about the dynamics of cell respiration, growth, protein turnover, gene expression, and energy storage in the slow-growing microbial majority. These fundamental properties, combined with the observed metabolic and symbiotic versatility of methanotrophic ANME, make these cooperative microbial systems a relevant (albeit challenging) system to study and for which to develop and optimize culture-independent methodologies, which enable a systems-level understanding of microbial interactions and metabolic networks. We used an integrative systems biology approach to study anaerobic sediment microcosms and methane-oxidizing bioreactors and expanded our understanding of the methanotrophic ANME archaea, their interactions with physically-associated bacteria, ecophysiological characteristics, and underlying genetic basis for cooperative microbial methane-oxidation linked with different terminal electron acceptors. Our approach is inherently multi-disciplinary and multi-scaled, combining transcriptional and

  15. Electric theory

    International Nuclear Information System (INIS)

    Gong, Ha Seong

    2006-02-01

    This book explains electric theory in four chapters. The first chapter covers electricity and materials, electric fields, capacitance, magnetic fields and electromagnetic force, and inductance. The second chapter covers electronic circuit analysis, electric resistance, heating and power, and the chemical effects of current, batteries and electrolysis. The third chapter deals with alternating current circuits: the basics of an AC circuit, the operation of resistance, inductance and capacitance, series and parallel RLC circuits, three-phase alternating current, two-terminal-pair networks, and voltage and current in non-linear circuits. The last chapter explains transient phenomena in RC, RL and RLC series circuits and in alternating current circuits.
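
    The transient phenomena listed in the last chapter all follow the same first-order pattern; for an RC series circuit charging from a DC step V₀ (a standard result, not quoted from the book):

```latex
v_C(t) \;=\; V_0\left(1 - e^{-t/RC}\right), \qquad \tau = RC
% e.g. R = 1\,\mathrm{k\Omega},\ C = 100\,\mu\mathrm{F} \;\Rightarrow\; \tau = 0.1\,\mathrm{s}
```

    At t = τ the capacitor voltage has reached about 63 % of V₀, since 1 − e⁻¹ ≈ 0.632.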

  16. Sustainable growth theories

    International Nuclear Information System (INIS)

    Nobile, G.

    1993-07-01

    With reference to the highly debated sustainable growth strategies proposed to counter pressing, interrelated global environmental and socio-economic problems, this paper reviews economic and resource development theories proposed by classical and neoclassical economists. The review highlights the growing debate among public administration decision makers regarding appropriate methods to assess the worth of natural resources and ecosystems. Proposed methods tend to be biased either towards environmental protection or towards economic development. Two major difficulties in the effective implementation of sustainable growth strategies are also identified: the management of such strategies would require appropriate revisions to national accounting systems, and the dynamic flow of energy and materials between an economic system and the environment would generate a sequence of unstable structures evolving in a chaotic and unpredictable way

  17. Systems Level Analysis of Histone H3 Post-translational Modifications (PTMs) Reveals Features of PTM Crosstalk in Chromatin Regulation

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Sidoli, Simone; Ruminowicz, Chrystian

    2016-01-01

    … molecules contain multiple coexisting PTMs, some of which exhibit crosstalk, i.e. coordinated or mutually exclusive activities. Here, we present an integrated experimental and computational systems level molecular characterization of histone PTMs and PTM crosstalk. Using wild type and engineered mouse … We characterized combinatorial PTM features across the four mESC lines and then applied statistical data analysis to predict crosstalk between histone H3 PTMs. We detected an overrepresentation of positive crosstalk (codependent marks) between adjacent mono-methylated and acetylated marks …, and negative crosstalk (mutually exclusive marks) among most of the seven characterized di- and tri-methylated lysine residues in the H3 tails. We report novel features of PTM interplay involving hitherto poorly characterized arginine methylation and lysine methylation sites, including H3R2me, H3R8me and H3K37…

  18. A wearable 3D motion sensing system integrated with a Bluetooth smart phone application: A system level overview

    KAUST Repository

    Karimi, Muhammad Akram

    2018-01-02

    An era of ubiquitous motion sensing has just begun. All electronic gadgets ranging from game consoles to mobile phones have some sort of motion sensors in them. In contrast to rigid motion sensing systems, this paper presents a system level description of a wearable 3D motion sensor. The sensing mechanism is based upon the well-established magnetic and inertial measurement unit (MIMU), which integrates accelerometer, gyroscope and magnetometer data. Two sensor boards have been integrated within a wearable arm sleeve to capture the 3D orientation of the human arm. The sensors have been interfaced with a Bluetooth transceiver chip, which transmits data to a mobile phone app using the standard Bluetooth protocol. An Android mobile phone app has been developed to display the human arm motion in real time.
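
    A common way to fuse MIMU data of this kind is a complementary filter; a one-axis sketch of the idea (the function, sample values and blending constant are mine, and the paper's system estimates full 3D orientation):

```python
# One-axis complementary filter: blend gyroscope integration (accurate
# over short horizons, but drifts) with an accelerometer tilt estimate
# (noisy, but drift-free).
from math import atan2

def complementary_filter(angle, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """angle: previous pitch estimate (rad); gyro_rate: rad/s;
    accel_y, accel_z: accelerometer components (g); dt: sample period (s)."""
    gyro_angle = angle + gyro_rate * dt    # integrate angular rate
    accel_angle = atan2(accel_y, accel_z)  # tilt referenced to gravity
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

angle = 0.0
samples = [(0.10, 0.05, 0.99), (0.12, 0.06, 0.99)]  # fake (gyro, ay, az)
for gyro, ay, az in samples:
    angle = complementary_filter(angle, gyro, ay, az, dt=0.01)
```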

  19. Unified System-Level Modeling of Intermittent Renewable Energy Sources and Energy Storage for Power System Operation

    DEFF Research Database (Denmark)

    Heussen, Kai; Koch, Stephan; Ulbig, Andreas

    2011-01-01

    The system-level consideration of intermittent renewable energy sources and small-scale energy storage in power systems remains a challenge as either type is incompatible with traditional operation concepts. Non-controllability and energy-constraints are still considered contingent cases … in market-based operation. The design of operation strategies for up to 100 % renewable energy systems requires an explicit consideration of non-dispatchable generation and storage capacities, as well as the evaluation of operational performance in terms of energy efficiency, reliability, environmental … impact and cost. By abstracting from technology-dependent and physical unit properties, the modeling framework presented and extended in this paper allows the modeling of a technologically diverse unit portfolio with a unified approach, whilst establishing the feasibility of energy-storage consideration…

  20. Strategies and Systems-Level Interventions to Combat or Prevent Drug Counterfeiting: A Systematic Review of Evidence Beyond Effectiveness.

    Science.gov (United States)

    Fadlallah, Racha; El-Jardali, Fadi; Annan, Farah; Azzam, Hayat; Akl, Elie A

    2016-01-01

    A recent systematic review suggested that drug registrations and onsite quality inspections may be effective in reducing the prevalence of counterfeit and substandard drugs. However, simply replicating the most effective interventions is problematic, as it denotes implementing the intervention without further adaptation. The aim was to systematically review the evidence beyond effectiveness for systems-level interventions to combat or prevent drug counterfeiting. We conducted an extensive search, including an electronic search of 14 databases. We included studies examining the efficiency, feasibility, reliability, and economic outcomes of the interventions, as well as barriers and facilitators to their implementation. Two reviewers selected eligible studies and abstracted data in duplicate and independently. We synthesized the results narratively, stratified by type of intervention. Of 10,220 captured citations, 19 met our inclusion criteria. The findings suggest that the following may strengthen regulatory measures (e.g., registration): minimizing drug diversion, enhancing lines of communications, ensuring feedback on drug quality, and promoting strict licensing criteria. There is evidence that onsite quality surveillance and inspection systems may be efficient and cost-effective for preliminary testing of large samples of drugs. Laws and legislation need to be specific to counterfeit drugs, include firm penalties, address online purchasing of drugs, and be complemented by education of judges and lawyers. Public awareness and education should rely on multiple platforms and comprehensive and dedicated content. While product authentication technologies may be efficient and reliable in detecting counterfeit drugs in the supply chain, they require a strong information system infrastructure. As for pharmacovigilance systems, it is critical to tackle the issue of underreporting, to enhance their chances of success. Several factors are critical to the successful design

  1. Integrating community-based verbal autopsy into civil registration and vital statistics (CRVS): system-level considerations

    Science.gov (United States)

    de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.

    2017-01-01

    Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194

  2. Human- Versus System-Level Factors and Their Effect on Electronic Work List Variation: Challenging Radiology's Fundamental Attribution Error.

    Science.gov (United States)

    Davenport, Matthew S; Khalatbari, Shokoufeh; Platt, Joel F

    2015-09-01

    The aim of this study was to analyze sources of variation influencing the unread volume on an electronic abdominopelvic CT work list and to compare those results with blinded radiologist perception. The requirement for institutional review board approval was waived for this HIPAA-compliant quality improvement effort. Data pertaining to an electronic abdominopelvic CT work list were analyzed retrospectively from July 1, 2013, to June 30, 2014, and modeled with respect to the unread case total at 6 pm (Monday through Friday, excluding holidays). Eighteen system-level factors outside individual control (eg, number of workers, workload) and 7 human-level factors within individual control (eg, individual productivity) were studied. Attending radiologist perception was assessed with a blinded anonymous survey (n = 12 of 15 surveys completed). The mean daily unread total was 24 (range, 3-72). The upper control limit (48 CT studies [3 SDs above the mean]) was exceeded 10 times. Multivariate analysis revealed that the rate of unread CT studies was affected principally by system-level factors, including the number of experienced trainees on service (postgraduate year 5 residents [odds ratio, 0.83; 95% confidence interval, 0.74-0.92; P = .0008] and fellows [odds ratio, 0.84; 95% confidence interval, 0.74-0.95; P = .005]) and the daily workload (P = .02 to P < .001). System-level factors best predict the variation in unread CT examinations, but blinded faculty radiologists believe that it relates most strongly to variable individual effort. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. Parental concern about vaccine safety in Canadian children partially immunized at age 2: a multivariable model including system level factors.

    Science.gov (United States)

    MacDonald, Shannon E; Schopflocher, Donald P; Vaudry, Wendy

    2014-01-01

    Children who begin but do not fully complete the recommended series of childhood vaccines by 2 y of age are a much larger group than those who receive no vaccines. While parents who refuse all vaccines typically express concern about vaccine safety, it is critical to determine what influences parents of 'partially' immunized children. This case-control study examined whether parental concern about vaccine safety was responsible for partial immunization, and whether other personal or system-level factors played an important role. A random sample of parents of partially and completely immunized 2 y old children were selected from a Canadian regional immunization registry and completed a postal survey assessing various personal and system-level factors. Unadjusted odds ratios (OR) and adjusted ORs (aOR) were calculated with logistic regression. While vaccine safety concern was associated with partial immunization (OR 7.338, 95% CI 4.138-13.012), other variables were more strongly associated and reduced the strength of the relationship between concern and partial immunization in multivariable analysis (aOR 2.829, 95% CI 1.151-6.957). Other important factors included perceived disease susceptibility and severity (aOR 4.629, 95% CI 2.017-10.625), residential mobility (aOR 3.908, 95% CI 2.075-7.358), daycare use (aOR 0.310, 95% CI 0.144-0.671), number of needles administered at each visit (aOR 7.734, 95% CI 2.598-23.025) and access to a regular physician (aOR 0.219, 95% CI 0.057-0.846). While concern about vaccine safety may be addressed through educational strategies, this study suggests that additional program and policy-level strategies may positively impact immunization uptake.
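
    For readers outside epidemiology, the reported ORs are exponentiated logistic-regression coefficients; on a plain 2×2 table the unadjusted OR reduces to the cross-product ratio. A minimal illustration (synthetic counts, not the study's data):

```python
# Unadjusted odds ratio from a 2x2 table, and its identity with the
# exponentiated logistic-regression coefficient. Counts are synthetic.
from math import exp, log

a, b = 40, 60    # parents with safety concern: partial, complete
c, d = 20, 180   # parents without concern:     partial, complete

beta = log(a / b) - log(c / d)   # logistic coefficient for "concern"
print(exp(beta))                 # OR = (a * d) / (b * c) = 6.0
```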

  4. A Future of Communication Theory: Systems Theory.

    Science.gov (United States)

    Lindsey, Georg N.

    Concepts of general systems theory, cybernetics and the like may provide the methodology for communication theory to move from a level of technology to a level of pure science. It was the purpose of this paper to (1) demonstrate the necessity of applying systems theory to the construction of communication theory, (2) review relevant systems…

  5. Implausibility of the vibrational theory of olfaction.

    Science.gov (United States)

    Block, Eric; Jang, Seogjoo; Matsunami, Hiroaki; Sekharan, Sivakumar; Dethier, Bérénice; Ertem, Mehmed Z; Gundala, Sivaji; Pan, Yi; Li, Shengju; Li, Zhen; Lodge, Stephene N; Ozbil, Mehmet; Jiang, Huihong; Penalba, Sonia F; Batista, Victor S; Zhuang, Hanyi

    2015-05-26

    The vibrational theory of olfaction assumes that electron transfer occurs across odorants at the active sites of odorant receptors (ORs), serving as a sensitive measure of odorant vibrational frequencies, ultimately leading to olfactory perception. A previous study reported that human subjects differentiated hydrogen/deuterium isotopomers (isomers with isotopic atoms) of the musk compound cyclopentadecanone as evidence supporting the theory. Here, we find no evidence for such differentiation at the molecular level. In fact, we find that the human musk-recognizing receptor, OR5AN1, identified using a heterologous OR expression system and robustly responding to cyclopentadecanone and muscone, fails to distinguish isotopomers of these compounds in vitro. Furthermore, the mouse (methylthio)methanethiol-recognizing receptor, MOR244-3, as well as other selected human and mouse ORs, responded similarly to normal, deuterated, and ¹³C isotopomers of their respective ligands, paralleling our results with the musk receptor OR5AN1. These findings suggest that the proposed vibration theory does not apply to the human musk receptor OR5AN1, mouse thiol receptor MOR244-3, or other ORs examined. Also, contrary to the vibration theory predictions, muscone-d30 lacks the 1,380- to 1,550-cm⁻¹ IR bands claimed to be essential for musk odor. Furthermore, our theoretical analysis shows that the proposed electron transfer mechanism of the vibrational frequencies of odorants could be easily suppressed by quantum effects of nonodorant molecular vibrational modes. These and other concerns about electron transfer at ORs, together with our extensive experimental data, argue against the plausibility of the vibration theory.

  6. SAM Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Rui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  7. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling

    Science.gov (United States)

    Kane, Patrick; Zollman, Kevin J. S.

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the “hybrid equilibrium,” to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith’s Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory. PMID:26348617
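
    The kind of evolutionary simulation described can be sketched with discrete-time replicator dynamics; the payoff matrix below is a generic anti-coordination example, not the Sir Philip Sidney game itself:

```python
# One discrete-time replicator-dynamics step: strategies grow in
# proportion to their payoff relative to the population mean.
def replicator_step(freqs, payoff):
    """freqs: strategy frequencies summing to 1;
    payoff[i][j]: payoff to strategy i against strategy j."""
    n = len(freqs)
    fitness = [sum(payoff[i][j] * freqs[j] for j in range(n)) for i in range(n)]
    mean_fit = sum(f * w for f, w in zip(freqs, fitness))
    return [f * w / mean_fit for f, w in zip(freqs, fitness)]

# Generic anti-coordination payoffs: the mixed equilibrium is at 50/50,
# and the dynamics approach it from an asymmetric start.
payoff = [[0.0, 3.0], [1.0, 2.0]]
freqs = [0.2, 0.8]
for _ in range(200):
    freqs = replicator_step(freqs, payoff)
print(freqs)  # close to [0.5, 0.5]
```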

  8. Competition explains limited attention and perceptual resources: implications for perceptual load and dilution theories

    Directory of Open Access Journals (Sweden)

    Paige E. Scalf

    2013-05-01

    Both perceptual load theory and dilution theory purport to explain when and why task-irrelevant information, or so-called distractors are processed. Central to both explanations is the notion of limited resources, although the theories differ in the precise way in which those limitations affect distractor processing. We have recently proposed a neurally plausible explanation of limited resources in which neural competition among stimuli hinders their representation in the brain. This view of limited capacity can also explain distractor processing, whereby the competitive interactions and bias imposed to resolve the competition determine the extent to which a distractor is processed. This idea is compatible with aspects of both perceptual load and dilution models of distractor processing, but also serves to highlight their differences. Here we review the evidence in favor of a biased competition view of limited resources and relate these ideas to both classic perceptual load theory and dilution theory.

  9. Competition explains limited attention and perceptual resources: implications for perceptual load and dilution theories.

    Science.gov (United States)

    Scalf, Paige E; Torralbo, Ana; Tapia, Evelina; Beck, Diane M

    2013-01-01

    Both perceptual load theory and dilution theory purport to explain when and why task-irrelevant information, or so-called distractors are processed. Central to both explanations is the notion of limited resources, although the theories differ in the precise way in which those limitations affect distractor processing. We have recently proposed a neurally plausible explanation of limited resources in which neural competition among stimuli hinders their representation in the brain. This view of limited capacity can also explain distractor processing, whereby the competitive interactions and bias imposed to resolve the competition determine the extent to which a distractor is processed. This idea is compatible with aspects of both perceptual load and dilution models of distractor processing, but also serves to highlight their differences. Here we review the evidence in favor of a biased competition view of limited resources and relate these ideas to both classic perceptual load theory and dilution theory.

  10. Rate theory

    International Nuclear Information System (INIS)

    Maillard, S.; Skorek, R.; Maugis, P.; Dumont, M.

    2015-01-01

    This chapter presents the basic principles of cluster dynamics as a particular case of the mesoscopic rate theory models developed to investigate fuel behaviour under irradiation, such as in UO₂. Because this method simulates the evolution of the concentration of every type of point or aggregated defect in a grain of material, it produces rich information that sheds light on the mechanisms involved in microstructure evolution and gas behaviour that are not accessible through conventional models, and can thereby provide for improvements in those models. Cluster dynamics parameters are mainly the energetic values governing the basic evolution mechanisms of the material (diffusion, trapping and thermal resolution). In this sense, the model has general applicability to very different operational situations (irradiation, ion-beam implantation, annealing) provided that they rely on the same basic mechanisms, without requiring additional data fitting, as is required for more empirical conventional models. This technique, when applied to krypton-implanted and annealed samples, yields a precise interpretation of the release curves and helps assess migration mechanisms and the krypton diffusion coefficient, for which data is very difficult to obtain due to the low solubility of the gas. (authors)
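
    A minimal flavour of such a rate-theory model, two defect populations coupled by aggregation, thermal resolution and loss to sinks, might look as follows (the coefficients are invented for illustration, not UO₂ data):

```python
# Two coupled rate equations: vacancy monomers (c1) and di-vacancy
# clusters (c2), with production G, aggregation, thermal resolution,
# and monomer loss to fixed sinks. Explicit-Euler time stepping.
def step(c1, c2, dt, G=1e-6, k_agg=1e-3, k_diss=1e-5, k_sink=1e-4):
    agg = k_agg * c1 * c1     # two monomers combine into a cluster
    diss = k_diss * c2        # a cluster re-dissolves into two monomers
    dc1 = G - 2.0 * agg + 2.0 * diss - k_sink * c1
    dc2 = agg - diss
    return c1 + dc1 * dt, c2 + dc2 * dt

c1, c2 = 0.0, 0.0
for _ in range(200_000):      # march towards steady state
    c1, c2 = step(c1, c2, dt=1.0)
print(c1, c2)                 # approaches c1 = G / k_sink = 0.01
```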

  11. Derivation of Einstein-Cartan theory from general relativity

    Science.gov (United States)

    Petti, Richard

    2015-04-01

    General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.

  12. Divorce and health: good data in need of better theory.

    Science.gov (United States)

    Sbarra, David A; Coan, James A

    2017-02-01

    A very large literature links the experiences of marital separation and divorce to risk for a range of poor distal health outcomes, including early death. What is far less clear, however, is the mechanistic pathways that convey this risk. Several plausible mechanisms are identified in the literature, and the central thesis of this paper is that the empirical study of divorce and health will benefit enormously from a renewed reliance on theory to dictate how these mechanisms of action may unfold over time. This review emphasizes the roles of attachment and social baseline theories in making specific mechanistic predictions and highlights the ways in which these perspectives can contribute new empirical knowledge on risk and resilience following marital dissolution. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  14. System-Level Process Change Improves Communication and Follow-Up for Emergency Department Patients With Incidental Radiology Findings.

    Science.gov (United States)

    Baccei, Steven J; Chinai, Sneha A; Reznek, Martin; Henderson, Scott; Reynolds, Kevin; Brush, D Eric

    2018-04-01

    The appropriate communication and management of incidental findings on emergency department (ED) radiology studies is an important component of patient safety. Guidelines have been issued by the ACR and other medical associations that best define incidental findings across various modalities and imaging studies. However, there are few examples of health care facilities designing ways to manage incidental findings. Our institution aimed to improve communication and follow-up of incidental radiology findings in ED patients through the collaborative development and implementation of system-level process changes, including a standardized loop-closure method. We assembled a multidisciplinary team to address the nature of these incidental findings and designed new workflows and operational pathways for both radiology and ED staff to properly communicate incidental findings. Our results are based on all incidental findings received and acknowledged between November 1, 2016, and May 30, 2017. The total number of incidental findings discovered was 1,409. Compliance fluctuated between 45% and 95% initially after implementation. However, after overcoming various challenges through optimization, our system reached a compliance rate of 93% to 95%. Through the implementation of our new, standardized communication system, a high degree of compliance with loop closure for ED incidental radiology findings was achieved at our institution. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  15. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    Science.gov (United States)

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

    Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
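
    A re-creation in the spirit of the "simple simulation example" mentioned, Monte Carlo reliability of a serial response chain from per-component failure probabilities (the component names and numbers are invented, not the report's):

```python
# Monte Carlo estimate of end-to-end success for a serial response chain.
import random

failure_probs = {          # probability each step fails, independently
    "request_stockpile": 0.02,
    "transport": 0.05,
    "local_distribution": 0.10,
    "dispensing": 0.15,
}

def mission_succeeds():
    return all(random.random() >= p for p in failure_probs.values())

trials = 100_000
reliability = sum(mission_succeeds() for _ in range(trials)) / trials
# Analytic check: prod(1 - p) = 0.98 * 0.95 * 0.90 * 0.85 ≈ 0.712
print(reliability)
```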

  16. Alternative conceptions of memory consolidation and the role of the hippocampus at the systems level in rodents.

    Science.gov (United States)

    Sutherland, R J; Lehmann, H

    2011-06-01

    We discuss very recent experiments with rodents addressing the idea that long-term memories initially depending on the hippocampus, over a prolonged period, become independent of it. No unambiguous recent evidence exists to substantiate that this occurs. Most experiments find that recent and remote memories are equally affected by hippocampus damage. Nearly all experiments that report spared remote memories suffer from two problems: retrieval could be based upon substantial regions of spared hippocampus and recent memory is tested at intervals that are of the same order of magnitude as cellular consolidation. Accordingly, we point the way beyond systems consolidation theories, both the Standard Model of Consolidation and the Multiple Trace Theory, and propose a simpler multiple storage site hypothesis. On this view, with event reiterations, different memory representations are independently established in multiple networks. Many detailed memories always depend on the hippocampus; the others may be established and maintained independently. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Recursion Theory Week

    CERN Document Server

    Müller, Gert; Sacks, Gerald

    1990-01-01

    These proceedings contain research and survey papers from many subfields of recursion theory, with emphasis on degree theory, in particular the development of frameworks for current techniques in this field. Other topics covered include computational complexity theory, generalized recursion theory, proof theoretic questions in recursion theory, and recursive mathematics.

  18. K-theory and representation theory

    International Nuclear Information System (INIS)

    Kuku, A.O.

    2003-01-01

    This contribution includes K-theory of orders, group-rings and modules over EI categories, equivariant higher algebraic K-theory for finite, profinite and compact Lie group actions together with their relative generalisations and applications

  19. Gravity, general relativity theory and alternative theories

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.; Grishchuk, L.P.; Moskovskij Gosudarstvennyj Univ.

    1986-01-01

    The main steps in the construction of the current theory of gravitation and some prospects for its subsequent development are reviewed. The attention is concentrated on a comparison of the relativistic gravitational field with other physical fields. Two equivalent formulations of general relativity (GR) - geometrical and field-theoretical - are considered in detail. It is shown that some theories of gravity constructed as field theories on a flat background space-time are in fact just different formulations of GR and not alternative theories

  20. Generalizability theory and item response theory

    OpenAIRE

    Glas, Cornelis A.W.; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item response theory is usually applied to items with a selected-response format, such as multiple choice items, whereas generalizability theory is usually applied to constructed-response tasks assessed by raters. However, in many situations, raters may use rating scales consisting of items with a selected-response format. This chapter presents a short overview of how item response theory and generalizability theory were integrated to model such assessments. Further, the precision of the esti...

  1. String Theory and M-Theory

    Science.gov (United States)

    Becker, Katrin; Becker, Melanie; Schwarz, John H.

    String theory is one of the most exciting and challenging areas of modern theoretical physics. This book guides the reader from the basics of string theory to recent developments. It introduces the basics of perturbative string theory, world-sheet supersymmetry, space-time supersymmetry, conformal field theory and the heterotic string, before describing modern developments, including D-branes, string dualities and M-theory. It then covers string geometry and flux compactifications, applications to cosmology and particle physics, black holes in string theory and M-theory, and the microscopic origin of black-hole entropy. It concludes with Matrix theory, the AdS/CFT duality and its generalizations. This book is ideal for graduate students and researchers in modern string theory, and will make an excellent textbook for a one-year course on string theory. It contains over 120 exercises with solutions, and over 200 homework problems with solutions available on a password protected website for lecturers at www.cambridge.org/9780521860697. Comprehensive coverage of topics from the basics of string theory to recent developments; ideal textbook for a one-year course in string theory; includes exercises with solutions; contains homework problems with solutions available to lecturers on-line.

  2. Intrinsically motivated action-outcome learning and goal-based action recall: a system-level bio-constrained computational model.

    Science.gov (United States)

    Baldassarre, Gianluca; Mannella, Francesco; Fiore, Vincenzo G; Redgrave, Peter; Gurney, Kevin; Mirolli, Marco

    2013-05-01

    Reinforcement (trial-and-error) learning in animals is driven by a multitude of processes. Most animals have evolved several sophisticated systems of 'extrinsic motivations' (EMs) that guide them to acquire behaviours allowing them to maintain their bodies, defend against threat, and reproduce. Animals have also evolved various systems of 'intrinsic motivations' (IMs) that allow them to acquire actions in the absence of extrinsic rewards. These actions are used later to pursue such rewards when they become available. Intrinsic motivations have been studied in Psychology for many decades and their biological substrates are now being elucidated by neuroscientists. In the last two decades, investigators in computational modelling, robotics and machine learning have proposed various mechanisms that capture certain aspects of IMs. However, we still lack models of IMs that attempt to integrate all key aspects of intrinsically motivated learning and behaviour while taking into account the relevant neurobiological constraints. This paper proposes a bio-constrained system-level model that contributes a major step towards this integration. The model focusses on three processes related to IMs and on the neural mechanisms underlying them: (a) the acquisition of action-outcome associations (internal models of the agent-environment interaction) driven by phasic dopamine signals caused by sudden, unexpected changes in the environment; (b) the transient focussing of visual gaze and actions on salient portions of the environment; (c) the subsequent recall of actions to pursue extrinsic rewards based on goal-directed reactivation of the representations of their outcomes. The tests of the model, including a series of selective lesions, show how the focussing processes lead to a faster learning of action-outcome associations, and how these associations can be recruited for accomplishing goal-directed behaviours. The model, together with the background knowledge reviewed in the paper
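
    The gist of process (a) can be conveyed in a few lines. The sketch below is not the authors' bio-constrained model; it only illustrates, under invented parameters, how a phasic surprise signal triggered by unexpected outcomes can gate the learning of action-outcome associations.

```python
import random

random.seed(0)
alpha = 0.2                                      # learning rate (assumed)
true_p = {"press": 0.9, "pull": 0.1}             # hypothetical environment
prediction = {a: 0.0 for a in true_p}            # learned P(outcome | action)

for _ in range(500):
    action = random.choice(list(true_p))
    outcome = 1.0 if random.random() < true_p[action] else 0.0
    surprise = outcome - prediction[action]      # "phasic dopamine": unpredicted change
    prediction[action] += alpha * surprise       # update gated by surprise

print({a: round(p, 2) for a, p in prediction.items()})   # approaches true_p
```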

  3. A system-level mathematical model of Basal Ganglia motor-circuit for kinematic planning of arm movements.

    Science.gov (United States)

    Salimi-Badr, Armin; Ebadzadeh, Mohammad Mehdi; Darlot, Christian

    2018-01-01

    In this paper, a novel system-level mathematical model of the Basal Ganglia (BG) for kinematic planning is proposed. An arm composed of several segments presents a geometric redundancy. Thus, selecting one trajectory among an infinite number of possible ones requires overcoming redundancy, according to some kind of optimization. Solving this optimization is assumed to be the function of BG in planning. In the proposed model, first, a mathematical solution of kinematic planning is proposed for movements of a redundant arm in a plane, based on minimizing energy consumption. Next, the function of each part in the model is interpreted as a possible role of a nucleus of BG. Since the kinematic variables are considered as vectors, the proposed model is presented based on vector calculus. This vector model predicts different neuronal populations in BG, which is in accordance with some recent experimental studies. According to the proposed model, the function of the direct pathway is to calculate the necessary rotation of each joint, and the function of the indirect pathway is to control each joint rotation considering the movement of the other joints. In the proposed model, the local feedback loop between the Subthalamic Nucleus and Globus Pallidus externus is interpreted as a local memory to store the previous amounts of movements of the other joints, which are utilized by the indirect pathway. In this model, activities of dopaminergic neurons would encode, at short term, the error between the desired and actual positions of the end-effector. The short-term modulating effect of dopamine on the Striatum is also modeled as a cross product. The model is simulated to generate the commands of a redundant manipulator. The performance of the model is studied for different reaching movements between 8 points in a plane. Finally, some symptoms of Parkinson's disease such as bradykinesia and akinesia are simulated by modifying the model parameters, inspired by the dopamine depletion
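
    As a toy illustration of the cross-product encoding mentioned in the abstract (our reading, with invented vectors; the paper's vector model is far richer), a planar "dopamine" error can be written as the z-component of the cross product of the desired and actual end-effector positions:

```python
def planar_cross(desired, actual):
    """z-component of the 2-D cross product: the sign gives the direction of
    the required correction, the magnitude grows with the mismatch."""
    dx, dy = desired
    ax, ay = actual
    return dx * ay - dy * ax

print(planar_cross((1.0, 0.0), (0.9, 0.1)))   # 0.1: small corrective signal
```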

  4. Unitary field theories

    International Nuclear Information System (INIS)

    Bergmann, P.G.

    1980-01-01

    A problem of construction of the unitary field theory is discussed. The preconditions of the theory are briefly described. The main attention is paid to the geometrical interpretation of physical fields. The meaning of the conceptions of diversity and exfoliation is elucidated. Two unitary field theories are described: Weyl's conformal geometry and Kaluza's five-dimensional theory. It is proposed to consider supersymmetrical theories as a new approach to the problem of a unitary field theory. It is noted that the supergravitational theories are really unitary theories, since the fields figuring there do not assume invariant expansion

  5. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  6. Elementary particle theory

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1984-12-01

    The present state of the art in elementary particle theory is reviewed. Topics include quantum electrodynamics, weak interactions, electroweak unification, quantum chromodynamics, and grand unified theories. 113 references

  7. Dedicated clock/timing-circuit theories of time perception and timed performance.

    Science.gov (United States)

    van Rijn, Hedderik; Gu, Bon-Mi; Meck, Warren H

    2014-01-01

    Scalar Timing Theory (an information-processing version of Scalar Expectancy Theory) and its evolution into the neurobiologically plausible Striatal Beat-Frequency (SBF) theory of interval timing are reviewed. These pacemaker/accumulator or oscillation/coincidence detection models are then integrated with the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture as dedicated timing modules that are able to make use of the memory and decision-making mechanisms contained in ACT-R. The different predictions made by the incorporation of these timing modules into ACT-R are discussed as well as the potential limitations. Novel implementations of the original SBF model that allow it to be incorporated into ACT-R in a more fundamental fashion than the earlier simulations of Scalar Timing Theory are also considered in conjunction with the proposed properties and neural correlates of the "internal clock".
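
    The coincidence-detection core of SBF fits in a few lines. The sketch below uses illustrative frequencies, not the published model's parameters: cortical oscillators are phase-reset at the start of an interval, and a downstream detector responds when their phases re-align.

```python
import numpy as np

freqs = np.array([5.0, 6.0, 7.0, 8.0])            # Hz; reset in phase at t = 0 (assumed)
t = np.linspace(0.0, 3.0, 3001)                   # seconds
phases = np.cos(2 * np.pi * freqs[:, None] * t)   # one row per oscillator
coincidence = phases.prod(axis=0)                 # large only when phases re-align

peak = t[1:][coincidence[1:].argmax()]            # skip the trivial peak at t = 0
print(f"oscillators re-align near t = {peak:.2f} s")   # 1.00 s for this integer-Hz bank
```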

  8. Local homotopy theory

    CERN Document Server

    Jardine, John F

    2015-01-01

    This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...

  9. Structural and electronic properties of barbituric acid and melamine-containing ribonucleosides as plausible components of prebiotic RNA: implications for prebiotic self-assembly.

    Science.gov (United States)

    Kaur, Sarabjeet; Sharma, Purshotam; Wetmore, Stacey D

    2017-11-22

    The RNA world hypothesis assumes that RNA was the first informational polymer that originated from prebiotic chemical soup. However, since the reaction of d-ribose with canonical nucleobases (A, C, G and U) fails to yield ribonucleosides (rNs) in substantial amounts, the spontaneous origin of rNs and the subsequent synthesis of RNA remains an unsolved mystery. To this end, it has been suggested that RNA may have evolved from primitive genetic material (preRNA) composed of simpler prebiotic heterocycles that spontaneously form glycosidic bonds with ribose. As an effort toward evaluating this hypothesis, the present study uses density functional theory (DFT) to assess the suitability of barbituric acid (BA) and melamine (MM) to act as prebiotic nucleobases, both of which have recently been shown to spontaneously form a glycosidic bond with ribose and organize into supramolecular assemblies in solution. The significant strength of hydrogen bonds involving BA and MM indicates that such interactions may have played a crucial role in their preferential selection over competing heterocycles that interact solely through stacking interactions from the primordial soup during the early phase of evolution. However, the greater stability of stacked dimers involving BA or MM and the canonical nucleobases compared to those consisting solely of BA and/or MM points towards the possible evolution of intermediate informational polymers consisting of prebiotic and canonical nucleobases, which could have eventually evolved into RNA. Analysis of the associated rNs reveals an anti conformational preference for the biologically-relevant β-anomer of both BA and MM rNs, which will allow complementary WC-like hydrogen bonding that can stabilize preRNA polymers. Large calculated deglycosylation barriers suggest BA rNs containing C-C glycosidic bonds are relevant in challenging prebiotic environments such as volcanic geotherms, while lower barriers indicate the MM rNs containing C

  10. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Following Kuhn's main thesis according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking the question in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I will point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, the need for a model which could better deal with the different aspects of theory replacement. I will show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I will also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning collective choice of a paradigm.

  11. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series

    Directory of Open Access Journals (Sweden)

    Mittman Brian S

    2008-05-01

    Background: The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, in particular related to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers. The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI. Strategic approach to organizational change: QUERI used an evidence-based organizational framework focused on three contextual elements: (1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; (2) capacity, in this case among researchers and key partners to engage in implementation research; and (3) supportive infrastructures to reinforce expectations for change and to sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery. Conclusion: QUERI's experience and success provide a case study in organizational change. It demonstrates that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. VA's commitment to QUERI came in the

  12. From chaos to unification: U theory vs. M theory

    International Nuclear Information System (INIS)

    Ye, Fred Y.

    2009-01-01

    A unified physical theory called U theory, that is different from M theory, is defined and characterized. U theory, which includes spinor and twistor theory, loop quantum gravity, causal dynamical triangulations, E-infinity unification theory, and Clifford-Finslerian unifications, is based on physical tradition and experimental foundations. In contrast, M theory pays more attention to mathematical forms. While M theory is characterized by supersymmetry string theory, U theory is characterized by non-supersymmetry unified field theory.

  13. Contemporary theories of democracy

    Directory of Open Access Journals (Sweden)

    Mladenović Ivan

    2008-01-01

    The aim of this paper is two-fold: first, to analyze several contemporary theories of democracy, and secondly, to propose a theoretical framework for further investigations based on analyzed theories. The following four theories will be analyzed: pluralism, social choice theory, deliberative democracy and participatory democracy.

  14. Descriptive set theory

    CERN Document Server

    Moschovakis, YN

    1987-01-01

    Now available in paperback, this monograph is a self-contained exposition of the main results and methods of descriptive set theory. It develops all the necessary background material from logic and recursion theory, and treats both classical descriptive set theory and the effective theory developed by logicians.

  15. A theory of everything?

    CERN Multimedia

    't Hooft, Gerardus; Witten, Edward

    2005-01-01

    In his later years, Einstein sought a unified theory that would extend general relativity and provide an alternative to quantum theory. There is now talk of a "theory of everything"; fifty years after his death, how close are we to such a theory? (3 pages)

  16. Game theory in philosophy

    NARCIS (Netherlands)

    de Bruin, B.P.

    2005-01-01

    Game theory is the mathematical study of strategy and conflict. It has wide applications in economics, political science, sociology, and, to some extent, in philosophy. Where rational choice theory or decision theory is concerned with individual agents facing games against nature, game theory deals

  17. Integrating pro-environmental behavior with transportation network modeling: User and system level strategies, implementation, and evaluation

    Science.gov (United States)

    Aziz, H. M. Abdul

    Personal transport is a leading contributor to fossil fuel consumption and greenhouse gas (GHG) emissions in the U.S. The U.S. Energy Information Administration (EIA) reports that light-duty vehicles (LDV) are responsible for 61% of all transportation related energy consumption in 2012, which is equivalent to 8.4 million barrels of oil (fossil fuel) per day. The carbon content in fossil fuels is the primary source of GHG emissions linked to the challenge of climate change. Evidently, it is high time to develop actionable and innovative strategies to reduce fuel consumption and GHG emissions from road transportation networks. This dissertation integrates the broader goal of minimizing energy and emissions into the transportation planning process using novel systems modeling approaches. This research aims to find, investigate, and evaluate strategies that minimize carbon-based fuel consumption and emissions for a transportation network. We propose user and system level strategies that can influence travel decisions and can reinforce pro-environmental attitudes of road users. Further, we develop strategies that system operators can implement to optimize traffic operations with an emissions minimization goal. To complete the framework we develop an integrated traffic-emissions (EPA-MOVES) simulation framework that can assess the effectiveness of the strategies with computational efficiency and reasonable accuracy. The dissertation begins by exploring the trade-off between emissions and travel time in the context of daily travel decisions and its heterogeneous nature. Data are collected from a web-based survey and the trade-off values indicating the average additional travel minutes a person is willing to consider for reducing a lb. of GHG emissions are estimated from random parameter models. Results indicate different trade-off values for male and female groups. Further, participants from high-income households are found to have higher trade-off values
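
    A back-of-envelope sketch of how such trade-off values are typically read off a utility model (the coefficients here are invented, not the dissertation's estimates): if utility is linear in travel minutes and pounds of GHG, the minutes accepted per pound avoided is the ratio of the two coefficients.

```python
b_time = -0.08   # disutility per extra travel minute (hypothetical)
b_ghg = -0.24    # disutility per lb of GHG emitted (hypothetical)

tradeoff = b_ghg / b_time      # minutes a traveler trades for 1 lb of GHG
print(f"{tradeoff:.1f} extra minutes accepted per lb of GHG avoided")
# A random-parameter model lets b_ghg vary across travelers, yielding
# different trade-offs for, e.g., male/female or income groups.
```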

  18. Introduction to game theory

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The basic ideas of game theory originated from the maximum-minimum problems posed by J. von Neumann in 1928. Later, wars accelerated the study of game theory, and many optimization problems arising in the process of economic development contributed to its advancement. Scientists applied mathematical methods to game theory to make the theory more profound and complete. The axiomatic structure of game theory was nearly complete in 1944. Game theory developed from finite to infinite games, from two players to many players, from payoffs expressed as quantities to outcomes expressed as abstract results, and from deterministic problems to random problems. The development of game theory is thus closely related to economic development. In recent years, research on the non-differentiability of the Shapley value, posed by the Belgian mathematician Mertens, has been one of the advanced studies in game theory.
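
    Von Neumann's 1928 starting point can be shown concretely. The sketch below (payoffs invented) compares the row player's security level (maximin) with the column player's (minimax) for a zero-sum matrix game; when they differ, no saddle point exists and optimal play requires mixed strategies.

```python
payoff = [            # row player's gains in a zero-sum game (made-up numbers)
    [3, -1, 2],
    [1,  0, 4],
    [-2, 5, 1],
]

maximin = max(min(row) for row in payoff)                       # row security level
minimax = min(max(row[j] for row in payoff) for j in range(3))  # column security level
print(maximin, minimax)   # 0 vs 3: no saddle point, so mixed strategies are needed
```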

  19. Nonrelativistic superstring theories

    International Nuclear Information System (INIS)

    Kim, Bom Soo

    2007-01-01

    We construct a supersymmetric version of the critical nonrelativistic bosonic string theory [B. S. Kim, Phys. Rev. D 76, 106007 (2007).] with its manifest global symmetry. We introduce the anticommuting bc conformal field theory (CFT) which is the super partner of the βγ CFT. The conformal weights of the b and c fields are both 1/2. The action of the fermionic sector can be transformed into that of the relativistic superstring theory. We explicitly quantize the theory with manifest SO(8) symmetry and find that the spectrum is similar to that of type IIB superstring theory. There is one notable difference: the fermions are nonchiral. We further consider noncritical generalizations of the supersymmetric theory using the superspace formulation. There is an infinite range of possible string theories similar to the supercritical string theories. We comment on the connection between the critical nonrelativistic string theory and the lightlike linear dilaton theory

  20. Nonrelativistic closed string theory

    International Nuclear Information System (INIS)

    Gomis, Jaume; Ooguri, Hirosi

    2001-01-01

    We construct a Galilean invariant nongravitational closed string theory whose excitations satisfy a nonrelativistic dispersion relation. This theory can be obtained by taking a consistent low energy limit of any of the conventional string theories, including the heterotic string. We give a finite first order worldsheet Hamiltonian for this theory and show that this string theory has a sensible perturbative expansion, interesting high energy behavior of scattering amplitudes and a Hagedorn transition of the thermal ensemble. The strong coupling duals of the Galilean superstring theories are considered and are shown to be described by an eleven-dimensional Galilean invariant theory of light membrane fluctuations. A new class of Galilean invariant nongravitational theories of light-brane excitations are obtained. We exhibit dual formulations of the strong coupling limits of these Galilean invariant theories and show that they exhibit many of the conventional dualities of M theory in a nonrelativistic setting

  1. Gauge theory loop operators and Liouville theory

    International Nuclear Information System (INIS)

    Drukker, Nadav; Teschner, Joerg

    2009-10-01

    We propose a correspondence between loop operators in a family of four dimensional N=2 gauge theories on S^4 - including Wilson, 't Hooft and dyonic operators - and Liouville theory loop operators on a Riemann surface. This extends the beautiful relation between the partition function of these N=2 gauge theories and Liouville correlators found by Alday, Gaiotto and Tachikawa. We show that the computation of these Liouville correlators with the insertion of a Liouville loop operator reproduces Pestun's formula capturing the expectation value of a Wilson loop operator in the corresponding gauge theory. We prove that our definition of Liouville loop operators is invariant under modular transformations, which given our correspondence, implies the conjectured action of S-duality on the gauge theory loop operators. Our computations in Liouville theory make an explicit prediction for the exact expectation value of 't Hooft and dyonic loop operators in these N=2 gauge theories. The Liouville loop operators are also found to admit a simple geometric interpretation within quantum Teichmueller theory as the quantum operators representing the length of geodesics. We study the algebra of Liouville loop operators and show that it gives evidence for our proposal as well as providing definite predictions for the operator product expansion of loop operators in gauge theory. (orig.)

  2. Identity theory and personality theory: mutual relevance.

    Science.gov (United States)

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out, subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  3. Towards a theory of spacetime theories

    CERN Document Server

    Schiemann, Gregor; Scholz, Erhard

    2017-01-01

    This contributed volume is the result of a July 2010 workshop at the University of Wuppertal Interdisciplinary Centre for Science and Technology Studies which brought together world-wide experts from physics, philosophy and history, in order to address a set of questions first posed in the 1950s: How do we compare spacetime theories? How do we judge, objectively, which is the “best” theory? Is there even a unique answer to this question? The goal of the workshop, and of this book, is to contribute to the development of a meta-theory of spacetime theories. Such a meta-theory would reveal insights about specific spacetime theories by distilling their essential similarities and differences, deliver a framework for a class of theories that could be helpful as a blueprint to build other meta-theories, and provide a higher level viewpoint for judging which theory most accurately describes nature. But rather than drawing a map in broad strokes, the focus is on particularly rich regions in the “space of spaceti...

  4. Gauge theory loop operators and Liouville theory

    Energy Technology Data Exchange (ETDEWEB)

    Drukker, Nadav [Humboldt Univ. Berlin (Germany). Inst. fuer Physik; Gomis, Jaume; Okuda, Takuda [Perimeter Inst. for Theoretical Physics, Waterloo, ON (Canada); Teschner, Joerg [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2009-10-15

    We propose a correspondence between loop operators in a family of four dimensional N=2 gauge theories on S{sup 4} - including Wilson, 't Hooft and dyonic operators - and Liouville theory loop operators on a Riemann surface. This extends the beautiful relation between the partition function of these N=2 gauge theories and Liouville correlators found by Alday, Gaiotto and Tachikawa. We show that the computation of these Liouville correlators with the insertion of a Liouville loop operator reproduces Pestun's formula capturing the expectation value of a Wilson loop operator in the corresponding gauge theory. We prove that our definition of Liouville loop operators is invariant under modular transformations, which given our correspondence, implies the conjectured action of S-duality on the gauge theory loop operators. Our computations in Liouville theory make an explicit prediction for the exact expectation value of 't Hooft and dyonic loop operators in these N=2 gauge theories. The Liouville loop operators are also found to admit a simple geometric interpretation within quantum Teichmueller theory as the quantum operators representing the length of geodesics. We study the algebra of Liouville loop operators and show that it gives evidence for our proposal as well as providing definite predictions for the operator product expansion of loop operators in gauge theory. (orig.)

  5. What genre theory does

    DEFF Research Database (Denmark)

    Andersen, Jack

    2015-01-01

    Purpose: To provide a small overview of genre theory and its associated concepts and to show how genre theory has had its antecedents in certain parts of the social sciences and not in the humanities. Findings: The chapter argues that the explanatory force of genre theory may be explained by its emphasis on everyday genres, de facto genres. Originality/value: By providing an overview of genre theory, the chapter demonstrates the wealth and richness of forms of explanations in genre theory.

  6. Why string theory?

    CERN Document Server

    Conlon, Joseph

    2016-01-01

    Is string theory a fraud or one of the great scientific advances? Why do so many physicists work on string theory if it cannot be tested? This book provides insight into why such a theory, with little direct experimental support, plays such a prominent role in theoretical physics. The book gives a modern and accurate account of string theory and science, explaining what string theory is, why it is regarded as so promising, and why it is hard to test.

  7. Teaching Theory X and Theory Y in Organizational Communication

    Science.gov (United States)

    Noland, Carey

    2014-01-01

    The purpose of the activity described here is to integrate McGregor's Theory X and Theory Y into a group application: design a syllabus that embodies either Theory X or Theory Y tenets. Students should be able to differentiate between Theory X and Theory Y, create a syllabus based on Theory X or Theory Y tenets, evaluate the different syllabi…

  8. The need for theory evaluation in global citizenship programmes: The case of the GCSA programme.

    Science.gov (United States)

    Goodier, Sarah; Field, Carren; Goodman, Suki

    2018-02-01

    Many education programmes lack a documented programme theory. This is a problem for programme planners and evaluators as the ability to measure programme success is grounded in the plausibility of the programme's underlying causal logic. Where the programme theory has not been documented, conducting a theory evaluation offers a foundational evaluation step as it gives an indication of whether the theory behind a programme is sound. This paper presents a case of a theory evaluation of a Global Citizenship programme at a top-ranking university in South Africa, subsequently called the GCSA Programme. This evaluation highlights the need for documented programme theory in global citizenship-type programmes for future programme development. An articulated programme theory produced for the GCSA Programme, analysed against the available social science literature, indicated it is comparable to other such programmes in terms of its overarching framework. What the research found is that most other global citizenship programmes do not have an articulated programme theory. These programmes also do not explicitly link their specific activities to their intended outcomes, making demonstrating impact impossible. In conclusion, we argue that taking a theory-based approach can strengthen and enable outcome evaluations in global citizenship programmes. Copyright © 2017. Published by Elsevier Ltd.

  9. Psyche=singularity: A comparison of Carl Jung's transpersonal psychology and Leonard Susskind's holographic string theory

    Science.gov (United States)

    Desmond, Timothy

    In this dissertation I discern what Carl Jung calls the mandala image of the ultimate archetype of unity underlying and structuring cosmos and psyche by pointing out parallels between his transpersonal psychology and Stanford physicist Leonard Susskind's string theory. Despite his atheistic, materialistically reductionist interpretation of it, I demonstrate how Susskind's string theory of holographic information conservation at the event horizons of black holes, and the cosmic horizon of the universe, corroborates the following four topics about which Jung wrote: (1) his near-death experience of the cosmic horizon after a heart attack in 1944; (2) his equation relating psychic energy to mass, "Psyche=highest intensity in the smallest space" (1997, 162), which I translate into the equation Psyche=Singularity; (3) his theory that the mandala, a circle or sphere with a central point, is the symbolic image of the ultimate archetype of unity through the union of opposites, which structures both cosmos and psyche, and which rises spontaneously from the collective unconscious to compensate a conscious mind torn by irreconcilable demands (1989, 334-335, 396-397); and (4) his theory of synchronicity. I argue that Susskind's inside-out black hole model of our Big Bang universe forms a geometrically perfect mandala: a central Singularity encompassed by a two-dimensional sphere which serves as a universal memory bank. Moreover, in precise fulfillment of Jung's theory, Susskind used that mandala to reconcile the notoriously incommensurable paradigms of general relativity and quantum mechanics, providing in the process a mathematically plausible explanation for Jung's near-death experience of his past, present, and future life simultaneously at the cosmic horizon. Finally, Susskind's theory also provides a plausible cosmological model to explain Jung's theory of synchronicity--meaningful coincidences may be tied together by strings at the cosmic horizon, from which they

  10. Direction: unified theory of interactions

    International Nuclear Information System (INIS)

    Valko, P.

    1987-01-01

    Briefly characterized are the individual theories, namely, the general relativity theory, the Kaluza-Klein theory, the Weyl theory, the unified theory of electromagnetic and weak interactions, the supergravity theory, and the superstring theory. The history is recalled of efforts aimed at creating a unified theory of interactions, and future prospects are outlined. (M.D.). 2 figs

  11. A theory for the Langmuir waves in the electron foreshock

    International Nuclear Information System (INIS)

    Cairns, I.H.

    1987-01-01

    A theory for the Langmuir (L) waves observed in the electron foreshock is suggested. Free energy for the Langmuir wave growth is contained in cutoff distributions of energetic electrons streaming from the bow shock. These cutoff distributions drive Langmuir wave growth primarily by the kinetic version of the beam instability, and wave growth is limited by quasi-linear relaxation. The observed bump-on-tail electron distributions are interpreted as the remnants of cutoff distributions after quasi-linear relaxation has limited the wave growth. Only plausibility arguments for this theory are given since suitable treatments of quasi-linear relaxation are not presently available. However, it is shown that the wave processes L ± S → L' and L ± S → T (where S and T denote ion sound and transverse waves, respectively), refraction in steady-state density structures, diffusion due to interactions with ion sound turbulence, and effects due to wave convection and spatial gradients in the beam velocity, are unable to suppress the beam instability. The theory leads to natural interpretations of the Langmuir electric field waveforms observed and of the decrease in the Langmuir wave electric fields with increasing distance from the foreshock boundary. The theory for the beam instability is reviewed, and previous analytic and numerical treatments of the beam instability are related

  12. Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.

    Science.gov (United States)

    Wikman-Svahn, Per; Lindblom, Lars

    2018-03-05

    Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
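
    One way to see the proposed operationalization (the adjustment rule, concave transform, and numbers below are our illustrative assumptions, not Wikman-Svahn and Lindblom's specification): apply a concave transform, which gives extra weight to the worse off, to utilities discounted by each person's responsibility for the risk.

```python
import math

def adjusted(utility, responsibility):
    # Assumed rule: discount the utility claim by half of the agent's
    # responsibility share for the risk (illustration only).
    return utility * (1.0 - 0.5 * responsibility)

def prioritarian_swf(people):
    # sqrt is one concave transform: gains to the worse off count for more.
    return sum(math.sqrt(adjusted(u, r)) for u, r in people)

# (utility, own responsibility for the risk) under two policy options
risky_option = [(4.0, 0.8), (9.0, 0.0)]
safe_option = [(6.0, 0.0), (7.0, 0.0)]
print(prioritarian_swf(risky_option), prioritarian_swf(safe_option))
# The safer, more equal option scores higher here.
```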

  13. On Born's deformed reciprocal complex gravitational theory and noncommutative gravity

    International Nuclear Information System (INIS)

    Castro, Carlos

    2008-01-01

    Born's reciprocal relativity in flat spacetimes is based on the principle of a maximal speed limit (speed of light) and a maximal proper force (which is also compatible with a maximal and minimal length duality) and where coordinates and momenta are unified on a single footing. We extend Born's theory to the case of curved spacetimes and construct a deformed Born reciprocal general relativity theory in curved spacetimes (without the need to introduce star products) as a local gauge theory of the deformed Quaplectic group that is given by the semi-direct product of U(1,3) with the deformed (noncommutative) Weyl-Heisenberg group corresponding to noncommutative generators [Z_a, Z_b] ≠ 0. The Hermitian metric is complex-valued with symmetric and nonsymmetric components and there are two different complex-valued Hermitian Ricci tensors R_{μν}, S_{μν}. The deformed Born's reciprocal gravitational action linear in the Ricci scalars R, S with Torsion-squared terms and BF terms is presented. The plausible interpretation of Z^μ = E^μ_a Z^a as noncommuting p-brane background complex spacetime coordinates is discussed in the conclusion, where E^μ_a is the complex vielbein associated with the Hermitian metric G_{μν} = g_{(μν)} + i g_{[μν]} = E_μ^a (E-bar)_ν^b η_{ab}. This could be one of the underlying reasons why string-theory involves gravity

  14. Implementing a Multi-Tiered System of Support (MTSS): Collaboration between School Psychologists and Administrators to Promote Systems-Level Change

    Science.gov (United States)

    Eagle, John W.; Dowd-Eagle, Shannon E.; Snyder, Andrew; Holtzman, Elizabeth Gibbons

    2015-01-01

    Current educational reform mandates the implementation of school-based models for early identification and intervention, progress monitoring, and data-based assessment of student progress. This article provides an overview of interdisciplinary collaboration for systems-level consultation within a Multi-Tiered System of Support (MTSS) framework.…

  15. Integrating Operational Energy Implications into System-Level Combat Effects Modeling: Assessing the Combat Effectiveness and Fuel Use of ABCT 2020 and Current ABCT

    Science.gov (United States)

    2015-01-01

    Endy M. Daehner, John Matsumura, Thomas J. Herbert, Jeremy R. Kurz, Keith Walters

  16. Wear-out Failure Analysis of an Impedance-Source PV Microinverter Based on System-Level Electro-Thermal Modeling

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Chub, Andrii; Wang, Huai

    2018-01-01

    and system-level finite element method (FEM) simulations, the electro-thermal models are built for the most reliability-critical components, i.e., power semi-conductor devices and capacitors. The dependence of the power loss on the junction/hotspot temperature is considered, the enclosure temperature...
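
    The loss-temperature coupling at the heart of such electro-thermal models can be sketched as a fixed-point iteration (all device and thermal parameters below are placeholders, not values from the paper): conduction loss rises with junction temperature, which in turn rises with loss.

```python
T_AMB = 45.0     # enclosure temperature, deg C (assumed)
R_TH = 2.0       # junction-to-ambient thermal resistance, K/W (assumed)
I_RMS = 3.0      # device RMS current, A (assumed)

def r_dson(t_j):
    # MOSFET on-resistance with a typical positive temperature coefficient.
    return 0.10 * (1.0 + 0.006 * (t_j - 25.0))

t_j = T_AMB
for _ in range(50):                      # iterate the coupled loop to a fixed point
    loss = I_RMS ** 2 * r_dson(t_j)      # conduction loss at the current T_j
    t_j = T_AMB + R_TH * loss            # thermal-network update
print(f"converged junction temperature: {t_j:.1f} deg C")   # ~47.0
```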

  17. Building theory through design

    DEFF Research Database (Denmark)

    Markussen, Thomas

    2017-01-01

    This chapter deals with a fundamental matter of concern in research through design: how can design work lead to the building of new theory? Controversy exists about the balance between theory and design work in research through design. While some researchers see theory production as the scientific hallmark of this type of research, others argue for design work being the primary achievement, with theory serving the auxiliary function of inspiring new designs. This paper demonstrates how design work and theory can be appreciated as two equally important outcomes of research through design. To set the scene, it starts out by briefly examining ideas on this issue presented in existing research literature. Hereafter, it introduces three basic forms in which design work can lead to theory, referred to as extending theories, scaffolding theories and blending theories. Finally, it is discussed how...

  18. Generalizability Theory and Classical Test Theory

    Science.gov (United States)

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  19. Generalizability theory and item response theory

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item response theory is usually applied to items with a selected-response format, such as multiple choice items, whereas generalizability theory is usually applied to constructed-response tasks assessed by raters. However, in many situations, raters may use rating scales consisting of items with a

  20. String field theory

    International Nuclear Information System (INIS)

    Kaku, M.

    1987-01-01

    In this article, the authors summarize the rapid progress in constructing string field theory actions, such as the development of the covariant BRST theory. They also present the newer geometric formulation of string field theory, from which the BRST theory and the older light cone theory can be derived from first principles. This geometric formulation allows us to derive the complete field theory of strings from two geometric principles, in the same way that general relativity and Yang-Mills theory can be derived from two principles based on global and local symmetry. The geometric formalism therefore reduces string field theory to a problem of finding an invariant under a new local gauge group they call the universal string group (USG). Thus, string field theory is the gauge theory of the universal string group in much the same way that Yang-Mills theory is the gauge theory of SU(N). The geometric formulation places superstring theory on the same rigorous group theoretical level as general relativity and gauge theory

  1. System-level planning, coordination, and communication: care of the critically ill and injured during pandemics and disasters: CHEST consensus statement.

    Science.gov (United States)

    Dichter, Jeffrey R; Kanter, Robert K; Dries, David; Luyckx, Valerie; Lim, Matthew L; Wilgis, John; Anderson, Michael R; Sarani, Babak; Hupert, Nathaniel; Mutter, Ryan; Devereaux, Asha V; Christian, Michael D; Kissoon, Niranjan

    2014-10-01

    System-level planning involves uniting hospitals and health systems, local/regional government agencies, emergency medical services, and other health-care entities involved in coordinating and enabling care in a major disaster. We reviewed the literature and sought expert opinions concerning system-level planning and engagement for mass critical care due to disasters or pandemics and offer suggestions for system-planning, coordination, communication, and response. The suggestions in this chapter are important for all of those involved in a pandemic or disaster with multiple critically ill or injured patients, including front-line clinicians, hospital administrators, and public health or government officials. The American College of Chest Physicians (CHEST) consensus statement development process was followed in developing suggestions. Task Force members met in person to develop nine key questions believed to be most relevant for system-planning, coordination, and communication. A systematic literature review was then performed for relevant articles and documents, reports, and other publications reported since 1993. No studies of sufficient quality were identified upon which to make evidence-based recommendations. Therefore, the panel developed expert opinion-based suggestions using a modified Delphi process. Suggestions were developed and grouped according to the following thematic elements: (1) national government support of health-care coalitions/regional health authorities (HC/RHAs), (2) teamwork within HC/RHAs, (3) system-level communication, (4) system-level surge capacity and capability, (5) pediatric patients and special populations, (6) HC/RHAs and networks, (7) models of advanced regional care systems, and (8) the use of simulation for preparedness and planning. System-level planning is essential to provide care for large numbers of critically ill patients because of disaster or pandemic. It also entails a departure from the routine, independent system and

  2. Effective quantum field theories

    International Nuclear Information System (INIS)

    Georgi, H.M.

    1993-01-01

    The most appropriate description of particle interactions in the language of quantum field theory depends on the energy at which the interactions are studied; the description is in terms of an 'effective field theory' that contains explicit reference only to those particles that are actually important at the energy being studied. The various themes of the article are: local quantum field theory, quantum electrodynamics, new physics, dimensional parameters and renormalizability, socio-dynamics of particle theory, spontaneously broken gauge theories, scale dependence, grand unified and effective field theories. 2 figs

  3. The Grounded Theory Bookshelf

    Directory of Open Access Journals (Sweden)

    Vivian B. Martin, Ph.D.

    2005-03-01

    Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theory. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory. Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

  4. Quantum potential theory

    CERN Document Server

    Schürmann, Michael

    2008-01-01

    This volume contains the revised and completed notes of lectures given at the school "Quantum Potential Theory: Structure and Applications to Physics," held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald from February 26 to March 10, 2007. Quantum potential theory studies noncommutative (or quantum) analogs of classical potential theory. These lectures provide an introduction to this theory, concentrating on probabilistic potential theory and its quantum analogs, i.e. quantum Markov processes and semigroups, quantum random walks, Dirichlet forms on C* and von Neumann algebras, and boundary theory. Applications to quantum physics, in particular the filtering problem in quantum optics, are also presented.

  5. Historicizing affordance theory

    DEFF Research Database (Denmark)

    Pedersen, Sofie; Bang, Jytte Susanne

    2017-01-01

    The aim of this article is to discuss how mutually enriching points from both affordance theory and cultural-historical activity theory can promote theoretical ideas which may prove useful as analytical tools for the study of human life and human development. There are two issues that need to be overcome in order to explore the potentials of James Gibson's affordance theory: it does not sufficiently theorize (a) development and (b) society. We claim that Gibson's affordance theory still needs to be brought beyond "the axiom of immediacy." Ambivalences in Gibson's affordance theory ... societal character of affordance theory.

  6. [Contemporary cognitive theories about developmental dyscalculia].

    Science.gov (United States)

    Castro-Cañizares, D; Estévez-Pérez, N; Reigosa-Crespo, V

    To analyze the current theories describing the cognitive mechanisms underlying developmental dyscalculia. The four most researched hypotheses concerning the cognitive deficits related to developmental dyscalculia, as well as experimental evidence supporting or refuting them, are presented. The first hypothesis states that developmental dyscalculia is a consequence of domain-general cognitive deficits. The second hypothesis suggests that it is due to a failure in the development of specialized brain systems dedicated to numerosity processing. The third hypothesis asserts the disorder is caused by a deficit in accessing quantity representation through numerical symbols. The last hypothesis states developmental dyscalculia appears as a consequence of impairments in a generalized magnitude system dedicated to the processing of continuous and discrete magnitudes. None of the hypotheses has been proven more plausible than the rest. Relevant issues raised by them need to be revisited and answered in the light of new experimental designs. In recent years the understanding of cognitive disorders involved in developmental dyscalculia has remarkably increased, but it is nonetheless insufficient. Additional research is required in order to achieve a comprehensive cognitive model of numerical processing development and its disorders. This will improve the diagnostic precision and the effectiveness of developmental dyscalculia intervention strategies.

  7. The chaperone-like activity of α-synuclein attenuates aggregation of its alternatively spliced isoform, 112-synuclein in vitro: plausible cross-talk between isoforms in protein aggregation.

    Directory of Open Access Journals (Sweden)

    Krishna Madhuri Manda

    Abnormal oligomerization and aggregation of α-synuclein (α-syn/WT-syn) has been shown to be a precipitating factor in the pathophysiology of Parkinson's disease (PD). Earlier observations on the induced-alternative splicing of α-syn by Parkinsonism mimetics as well as identification of region specific abnormalities in the transcript levels of 112-synuclein (112-syn) in diseased subjects underscores the role of 112-syn in the pathophysiology of PD. In the present study, we sought to identify the aggregation potential of 112-syn in the presence or absence of WT-syn to predict its plausible role in protein aggregation events. Results demonstrate that unlike WT-syn, lack of 28 aa in the C-terminus results in the loss of chaperone-like activity with a concomitant gain in vulnerability to heat-induced aggregation and time-dependent fibrillation. The effects were dose and time-dependent and a significant aggregation of 112-syn was evident at as low as 45 °C following 10 min of incubation. The heat-induced aggregates were found to be ill-defined structures and weakly positive towards Thioflavin-T (ThT) staining as compared to clearly distinguishable ThT positive extended fibrils resulting upon 24 h of incubation at 37 °C. Further, the chaperone-like activity of WT-syn significantly attenuated heat-induced aggregation of 112-syn in a dose and time-dependent manner. On the contrary, WT-syn synergistically enhanced fibrillation of 112-syn. Overall, the present findings highlight a plausible cross-talk between isoforms of α-syn and the relative abundance of these isoforms may dictate the nature and fate of protein aggregates.

  8. “Making difference: theories on gender, body and behaviour”

    Directory of Open Access Journals (Sweden)

    Maria Teresa Citeli

    2001-01-01

    Since the end of the nineteenth century, when Darwin published his work on evolution, several female scientists have reacted by adopting basically two points of view: while some deny the potential of the biological sciences to explain social arrangements, others reinterpret biology studies on sex differences, admitting that these may explain human behavior and social inequality. In an attempt to appraise how social differences are assigned to the human body, this article discusses theoretical trends in recent works of the biological sciences, which try to either reaffirm or deny the plausibility of theories that resort to sex differences presumably located in the body (brains, genes, male and female physiology) to explain variations in human beings' skills, abilities, cognitive patterns, and sexuality. And, given the influence of the media on our views on male and female, it also discusses the repercussion of such essentialist views on national and international print media.

  9. Systems level analysis of systemic sclerosis shows a network of immune and profibrotic pathways connected with genetic polymorphisms.

    Directory of Open Access Journals (Sweden)

    J Matthew Mahoney

    2015-01-01

    Systemic sclerosis (SSc) is a rare systemic autoimmune disease characterized by skin and organ fibrosis. The pathogenesis of SSc and its progression are poorly understood. The SSc intrinsic gene expression subsets (inflammatory, fibroproliferative, normal-like, and limited) are observed in multiple clinical cohorts of patients with SSc. Analysis of longitudinal skin biopsies suggests that a patient's subset assignment is stable over 6-12 months. Genetically, SSc is multi-factorial with many genetic risk loci for SSc generally and for specific clinical manifestations. Here we identify the genes consistently associated with the intrinsic subsets across three independent cohorts, show the relationship between these genes using a gene-gene interaction network, and place the genetic risk loci in the context of the intrinsic subsets. To identify gene expression modules common to three independent datasets from three different clinical centers, we developed a consensus clustering procedure based on mutual information of partitions, an information theory concept, and performed a meta-analysis of these genome-wide gene expression datasets. We created a gene-gene interaction network of the conserved molecular features across the intrinsic subsets and analyzed their connections with SSc-associated genetic polymorphisms. The network is composed of distinct, but interconnected, components related to interferon activation, M2 macrophages, adaptive immunity, extracellular matrix remodeling, and cell proliferation. The network shows extensive connections between the inflammatory- and fibroproliferative-specific genes. The network also shows connections between these subset-specific genes and 30 SSc-associated polymorphic genes including STAT4, BLK, IRF7, NOTCH4, PLAUR, CSK, IRAK1, and several human leukocyte antigen (HLA) genes. Our analyses suggest that the gene expression changes underlying the SSc subsets may be long-lived, but mechanistically interconnected
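
    The information-theoretic ingredient named here, mutual information between two partitions of the same samples, is easy to state in code (the labels below are invented; the paper's consensus procedure builds further machinery on top of this quantity):

```python
from collections import Counter
from math import log

def mutual_information(a, b):
    """MI (in nats) between two label vectors over the same samples."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum((nab / n) * log(nab * n / (pa[x] * pb[y]))
               for (x, y), nab in pab.items())

part1 = [0, 0, 0, 1, 1, 1, 2, 2]   # cluster labels from one dataset
part2 = [0, 0, 1, 1, 1, 1, 2, 2]   # labels for the same samples, second dataset
print(f"MI = {mutual_information(part1, part2):.3f} nats")
```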

  10. Rigour and grounded theory.

    Science.gov (United States)

    Cooney, Adeline

    2011-01-01

    This paper explores ways to enhance and demonstrate rigour in a grounded theory study. Grounded theory is sometimes criticised for a lack of rigour. Beck (1993) identified credibility, auditability and fittingness as the main standards of rigour for qualitative research methods. These criteria were evaluated for applicability to a Straussian grounded theory study and expanded or refocused where necessary. The author uses a Straussian grounded theory study (Cooney, in press) to examine how the revised criteria can be applied when conducting a grounded theory study. Strauss and Corbin's (1998b) criteria for judging the adequacy of a grounded theory were examined in the context of the wider literature examining rigour in qualitative research studies in general and grounded theory studies in particular. A literature search for 'rigour' and 'grounded theory' was carried out to support this analysis. Criteria are suggested for enhancing and demonstrating the rigour of a Straussian grounded theory study. These include: cross-checking emerging concepts against participants' meanings, asking experts if the theory 'fit' their experiences, and recording detailed memos outlining all analytical and sampling decisions. IMPLICATIONS FOR RESEARCH PRACTICE: The criteria identified have been expressed as questions to enable novice researchers to audit the extent to which they are demonstrating rigour when writing up their studies. However, it should not be forgotten that rigour is built into the grounded theory method through the inductive-deductive cycle of theory generation. Care in applying the grounded theory methodology correctly is the single most important factor in ensuring rigour.

  11. Application of fixed point theory to chaotic attractors of forced oscillators

    International Nuclear Information System (INIS)

    Stewart, H.B.

    1990-11-01

    A review of the structure of chaotic attractors of periodically forced second order nonlinear oscillators suggests that the theory of fixed points of transformations gives information about the fundamental topological structure of attractors. First a simple extension of the Levinson index formula is proved. Then numerical evidence is used to formulate plausible conjectures about absorbing regions containing chaotic attractors in forced oscillators. Applying the Levinson formula suggests a fundamental relation between the number of fixed points or periodic points in a section of the chaotic attractor on the one hand, and a topological invariant of an absorbing region on the other hand. (author)
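
    Schematically (the precise hypotheses are in the report; this rendering is only an assumption about the shape of such index formulas), the relation connects the fixed points of the Poincaré map P inside an absorbing region A to a topological invariant of A, in the spirit of the Lefschetz/Poincaré-Hopf theorems:

        \sum_{p \,\in\, \mathrm{Fix}(P) \,\cap\, A} \operatorname{ind}(p) \;=\; \chi(A)

    Here ind(p) is the fixed-point index and χ(A) the Euler characteristic of the absorbing region; for an absorbing disk χ(A) = 1, so the indices of the fixed and periodic points seen in a section of the attractor are constrained to sum to one.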

  12. Theories of Career Development. A Comparison of the Theories.

    Science.gov (United States)

    Osipow, Samuel H.

    These seven theories of career development are examined in previous chapters: (1) Roe's personality theory, (2) Holland's career typology theory, (3) the Ginzberg, Ginsburg, Axelrod, and Herma Theory, (4) psychoanalytic conceptions, (5) Super's developmental self-concept theory, (6) other personality theories, and (7) social systems theories.…

  13. Supersymmetric gauge theories from string theory

    International Nuclear Information System (INIS)

    Metzger, St.

    2005-12-01

    This thesis presents various ways to construct four-dimensional quantum field theories from string theory. In a first part we study the generation of a supersymmetric Yang-Mills theory, coupled to an adjoint chiral superfield, from type IIB string theory on non-compact Calabi-Yau manifolds, with D-branes wrapping certain sub-cycles. Properties of the gauge theory are then mapped to the geometric structure of the Calabi-Yau space. Even if the Calabi-Yau geometry is too complicated to evaluate the geometric integrals explicitly, one can then always use matrix model perturbation theory to calculate the effective superpotential. The second part of this work covers the generation of four-dimensional supersymmetric gauge theories, carrying several important characteristic features of the standard model, from compactifications of eleven-dimensional supergravity on G2-manifolds. If the latter contain conical singularities, chiral fermions are present in the four-dimensional gauge theory, which potentially lead to anomalies. We show that, locally at each singularity, these anomalies are cancelled by the non-invariance of the classical action through a mechanism called 'anomaly inflow'. Unfortunately, no explicit metric of a compact G2-manifold is known. Here we construct families of metrics on compact weak G2-manifolds, which contain two conical singularities. Weak G2-manifolds have properties similar to those of proper G2-manifolds, and hence the explicit examples might be useful to better understand the generic situation. Finally, we reconsider the relation between eleven-dimensional supergravity and the E8 x E8 heterotic string. This is done by carefully studying the anomalies that appear if the supergravity theory is formulated on a ten-manifold times the interval. Again we find that the anomalies cancel locally at the boundaries of the interval through anomaly inflow, provided one suitably modifies the classical action. (author)

  14. Can Innate, modular "foundations" explain morality? Challenges for Haidt's Moral Foundations Theory.

    Science.gov (United States)

    Suhler, Christopher L; Churchland, Patricia

    2011-09-01

    Jonathan Haidt's Moral Foundations Theory is an influential scientific account of morality incorporating psychological, developmental, and evolutionary perspectives. The theory proposes that morality is built upon five innate "foundations," each of which is believed to have been selected for during human evolution and, subsequently, tuned-up by learning during development. We argue here that although some general elements of Haidt's theory are plausible, many other important aspects of his account are seriously flawed. First, innateness and modularity figure centrally in Haidt's account, but terminological and conceptual problems foster confusion and ambiguities. Second, both the theory's proposed number of moral foundations and its taxonomy of the moral domain appear contrived, ignoring equally good candidate foundations and the possibility of substantial intergroup differences in the foundations' contents. Third, the mechanisms (viz., modules) and categorical distinctions (viz., between foundations) proposed by the theory are not consilient with discoveries in contemporary neuroscience concerning the organization, functioning, and development of the brain. In light of these difficulties, we suggest that Haidt's theory is inadequate as a scientific account of morality. Nevertheless, the theory's weaknesses are instructive, and hence, criticism may be useful to psychologists, neuroscientists, and philosophers attempting to advance theories of morality, as well as to researchers wishing to invoke concepts such as innateness and modularity more generally.

  15. Higher spin gauge theories

    CERN Document Server

    Henneaux, Marc; Vasiliev, Mikhail A

    2017-01-01

    Symmetries play a fundamental role in physics. Non-Abelian gauge symmetries are the symmetries behind theories for massless spin-1 particles, while the reparametrization symmetry is behind Einstein's gravity theory for massless spin-2 particles. In supersymmetric theories these particles can be connected also to massless fermionic particles. Does Nature stop at spin-2, or can there also be massless higher spin theories? In the past, strong indications have been given that such theories do not exist. However, in recent times ways to evade those constraints have been found and higher spin gauge theories have been constructed. With the advent of the AdS/CFT duality correspondence even stronger indications have been given that higher spin gauge theories play an important role in fundamental physics. All these issues were discussed at an international workshop in Singapore in November 2015 where the leading scientists in the field participated. This volume presents an up-to-date, detailed overview of the theories i...

  16. Covariant Noncommutative Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Estrada-Jimenez, S [Licenciaturas en Fisica y en Matematicas, Facultad de Ingenieria, Universidad Autonoma de Chiapas Calle 4a Ote. Nte. 1428, Tuxtla Gutierrez, Chiapas (Mexico); Garcia-Compean, H [Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN P.O. Box 14-740, 07000 Mexico D.F., Mexico and Centro de Investigacion y de Estudios Avanzados del IPN, Unidad Monterrey Via del Conocimiento 201, Parque de Investigacion e Innovacion Tecnologica (PIIT) Autopista nueva al Aeropuerto km 9.5, Lote 1, Manzana 29, cp. 66600 Apodaca Nuevo Leon (Mexico); Obregon, O [Instituto de Fisica de la Universidad de Guanajuato P.O. Box E-143, 37150 Leon Gto. (Mexico); Ramirez, C [Facultad de Ciencias Fisico Matematicas, Universidad Autonoma de Puebla, P.O. Box 1364, 72000 Puebla (Mexico)

    2008-07-02

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced.

  17. Theory of calorimetry

    CERN Document Server

    Zielenkiewicz, Wojciech

    2004-01-01

    The purpose of this book is to give a comprehensive description of the theoretical fundamentals of calorimetry. The considerations are based on the relations deduced from the laws and general equations of heat exchange theory and steering theory.

  18. Covariant Noncommutative Field Theory

    International Nuclear Information System (INIS)

    Estrada-Jimenez, S.; Garcia-Compean, H.; Obregon, O.; Ramirez, C.

    2008-01-01

    The covariant approach to noncommutative field and gauge theories is revisited. In the process the formalism is applied to field theories invariant under diffeomorphisms. Local differentiable forms are defined in this context. The lagrangian and hamiltonian formalism is consistently introduced

  19. Stabilizing bottomless action theories

    International Nuclear Information System (INIS)

    Greensite, J.; Halpern, M.B.

    1983-12-01

    The authors show how to construct the Euclidean quantum theory corresponding to classical actions which are unbounded from below. The method preserves the classical limit, the large-N limit, and the perturbative expansion of the unstabilized theories. (Auth.)

  20. Algebraic conformal field theory

    International Nuclear Information System (INIS)

    Fuchs, J.; Nationaal Inst. voor Kernfysica en Hoge-Energiefysica

    1991-11-01

    Many conformal field theory features are special versions of structures which are present in arbitrary 2-dimensional quantum field theories. It therefore makes sense to describe 2-dimensional conformal field theories in the context of the algebraic theory of superselection sectors. While most of the results of the algebraic theory are rather abstract, conformal field theories offer the possibility to work out many formulae explicitly. In particular, one can construct the full algebra A-bar of global observables and the endomorphisms of A-bar which represent the superselection sectors. Some explicit results are presented for the level 1 so(N) WZW theories; the algebra A-bar is found to be the enveloping algebra of a Lie algebra L-bar which is an extension of the chiral symmetry algebra of the WZW theory. (author). 21 refs., 6 figs

  1. The theory of remainders

    CERN Document Server

    Rothbart, Andrea

    2012-01-01

    An imaginative introduction to number theory and abstract algebra, this unique approach employs a pair of fictional characters whose dialogues explain theories and demonstrate applications in terms of football scoring, chess moves, and more.

  2. Economic theories of dictatorship

    OpenAIRE

    Alexandre Debs

    2010-01-01

    This article reviews recent advances in economic theories of dictatorships and their lessons for the political stability and economic performance of dictatorships. It reflects on the general usefulness of economic theories of dictatorship, with an application to foreign relations.

  3. Plasma kinetic theory

    International Nuclear Information System (INIS)

    Elliott, J.A.

    1993-01-01

    Plasma kinetic theory is discussed and a comparison made with the kinetic theory of gases. The plasma is described by a modified set of fluid equations and it is shown how these fluid equations can be derived. (UK)

  4. Inflationary string theory?

    Indian Academy of Sciences (India)

    strongly motivate a detailed search for inflation within string theory, although it has ... between string theory and observations provides a strong incentive for ... reasonably be expected to arise for any system having very many degrees of freedom.

  5. Field theory and strings

    International Nuclear Information System (INIS)

    Bonara, L.; Cotta-Ramusino, P.; Rinaldi, M.

    1987-01-01

    It is well known that type I and heterotic superstring theories have a zero-mass spectrum which corresponds to the field content of N=1 supergravity theory coupled to supersymmetric Yang-Mills theory in 10-D. The authors study the field theory 'per se', in the hope that simple consistency requirements will determine the theory completely once one knows the field content inherited from string theory. The simplest consistency requirements are: N=1 supersymmetry; and absence of chiral anomalies. This is what the authors discuss in this paper, leaving undetermined the question of the range of validity of the resulting field theory. As is known, a model of N=1 supergravity (SUGRA) coupled to supersymmetric Yang-Mills (SYM) theory was given by Chapline and Manton. The coupling of SUGRA to SYM was determined by the definition of the 'field strength' 3-form H in this paper

  6. Introduction to percolation theory

    CERN Document Server

    Stauffer, Dietrich

    1991-01-01

    Percolation theory deals with clustering, criticality, diffusion, fractals, phase transitions and disordered systems. This book covers the basic theory for graduate students, and also for professionals dealing with it for the first time

  7. Making HCI Theory Work

    DEFF Research Database (Denmark)

    Clemmensen, Torkil; Kaptelinin, Victor; Nardi, Bonnie

    2016-01-01

    This paper reports a study of the use of activity theory in human–computer interaction (HCI) research. We analyse activity theory in HCI since its first appearance about 25 years ago. Through an analysis and meta-synthesis of 109 selected HCI activity theory papers, we created a taxonomy of 5 different ways of using activity theory: (1) analysing unique features, principles, and problematic aspects of the theory; (2) identifying domain-specific requirements for new theoretical tools; (3) developing new conceptual accounts of issues in the field of HCI; (4) guiding and supporting empirical analyses of HCI phenomena; and (5) providing new design illustrations, claims, and guidelines. We conclude that HCI researchers are not only users of imported theory, but also theory-makers who adapt and develop theory for different purposes.

  8. Introduction of the transtheoretical model and organisational development theory in weight management: A narrative review.

    Science.gov (United States)

    Wu, Ya-Ke; Chu, Nain-Feng

    2015-01-01

    Overweight and obesity are serious public health and medical problems among children and adults worldwide. Behavioural change has been demonstrably contributory to weight management programs, and behavioural change-based weight loss programs require a theoretical framework. We review the transtheoretical model and the organisational development theory in weight management. The transtheoretical model is an individual-level behaviour theory frequently used in weight management programs. The organisational development theory is a more complicated behaviour theory that applies to behavioural change at the system level. Both theories have their respective strengths and weaknesses. In this manuscript, we introduce the transtheoretical model and the organisational development theory in the context of weight loss programs among populations that are overweight or obese. Ultimately, we wish to present a new framework/strategy for weight management by integrating these two theories. Copyright © 2015 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  9. Nonlocal gauge theories

    International Nuclear Information System (INIS)

    Partovi, M.H.

    1982-01-01

    From a generalization of the covariant derivative, nonlocal gauge theories are developed. These theories enjoy local gauge invariance and associated Ward identities, a corresponding locally conserved current, and a locally conserved energy-momentum tensor, with the Ward identities implying the masslessness of the gauge field as in local theories. Their ultraviolet behavior allows the presence as well as the absence of the Adler-Bell-Jackiw anomaly, the latter in analogy with lattice theories

  10. Noncommutative field theory

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Nekrasov, Nikita A.

    2001-01-01

    This article reviews the generalization of field theory to space-time with noncommuting coordinates, starting with the basics and covering most of the active directions of research. Such theories are now known to emerge from limits of M theory and string theory and to describe quantum Hall states. In the last few years they have been studied intensively, and many qualitatively new phenomena have been discovered, on both the classical and the quantum level

  11. Problems in particle theory

    International Nuclear Information System (INIS)

    Adler, S.L.; Wilczek, F.

    1993-11-01

    Areas of emphasis include acceleration algorithms for the Monte Carlo analysis of lattice field and gauge theories, quaternionic generalizations of complex quantum mechanics and field theory, application of the renormalization group to the QCD phase transition, the quantum Hall effect, and black holes. Other work involved string theory, statistical properties of energy levels in integrable quantum systems, baryon asymmetry and the electroweak phase transition, anisotropies of the cosmic microwave background, and theory of superconductors

  12. Multiscale System Theory

    Science.gov (United States)

    1990-02-21

    LIDS-P-1953, Multiscale System Theory. Albert Benveniste (IRISA-INRIA, Campus de Beaulieu, 35042 Rennes Cedex, France) and Ramine Nikoukhah (INRIA). [Standard report-documentation form fields omitted.] ... the development of a corresponding system theory and a theory of stochastic processes and their estimation. The research presented in this and several

  13. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  14. Modern Theories of Language.

    Science.gov (United States)

    Davis, Philip W.

    This volume objectively explores the essential characteristics of nine twentieth-century linguistic theories, with the theoretical variant chosen for discussion being one closely representative of work within a given approach or usually associated with the name of the theory. First, the theory of Ferdinand de Saussure is discussed, based on his book,…

  15. Constructivist Grounded Theory?

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2012-06-01

    Full Text Available Abstract: I refer to and use, as scholarly inspiration, Charmaz's excellent article on constructivist grounded theory as a tool for getting to the fundamental issues of why grounded theory is not constructivist. I show that constructivist data, if it exists at all, is a very, very small part of the data that grounded theory uses.

  16. Essays in auction theory

    NARCIS (Netherlands)

    Maasland, E.

    2012-01-01

    Auction theory is a branch of game theory that considers human behavior in auction markets and the ensuing market outcomes. It is also successfully used as a tool to design real-life auctions. This thesis contains five essays addressing a variety of topics within the realm of auction theory. The

  17. Unified field theory

    International Nuclear Information System (INIS)

    Prasad, R.

    1975-01-01

    Results of research into Unified Field Theory over the past seven years are presented. The subject is dealt with in chapters entitled: the choice of affine connection, algebraic properties of the vector fields, field laws obtained from the affine connection based on the path integral method, application to quantum theory and cosmology, interpretation of physical theory in terms of geometry. (U.K.)

  18. Frankl's Theory and Therapy.

    Science.gov (United States)

    Missinne, Leo E.; Wilcox, Victoria

    This paper discusses the life, theories, and therapeutic techniques of psychotherapist, Viktor E. Frankl. A brief biography of Frankl is included discussing the relationship of his early experiences as a physician to his theory of personality. Frankl's theory focusing on man's need for meaning and emphasizing the spiritual dimension in each human…

  19. Cognitive Theories of Autism

    Science.gov (United States)

    Rajendran, Gnanathusharan; Mitchell, Peter

    2007-01-01

    This article considers three theories of autism: The Theory of Mind Deficit, Executive Dysfunction and the Weak Central Coherence accounts. It outlines each along with studies relevant to their emergence, their expansion, their limitations and their possible integration. Furthermore, consideration is given to any implication from the theories in…

  20. Lattice gauge theory

    International Nuclear Information System (INIS)

    Mack, G.

    1982-01-01

    After a description of a pure Yang-Mills theory on a lattice, the author considers a three-dimensional pure U(1) lattice gauge theory. Thereafter he discusses the exact relation between lattice gauge theories with the gauge groups SU(2) and SO(3). Finally he presents Monte Carlo data on phase transitions in SU(2) and SO(3) lattice gauge models. (HSI)

  1. Papers in auction theory

    NARCIS (Netherlands)

    Onderstal, A.M.

    2002-01-01

    This thesis is a collection of six papers in auction theory, with several economic applications, both to real life auctions and to other economic phenomena. In the introduction to the thesis, Onderstal argues why auction theory is an important branch of economic theory, and discusses several

  2. Introduction to number theory

    CERN Document Server

    Vazzana, Anthony; Garth, David

    2007-01-01

    One of the oldest branches of mathematics, number theory is a vast field devoted to studying the properties of whole numbers. Offering a flexible format for a one- or two-semester course, Introduction to Number Theory uses worked examples, numerous exercises, and two popular software packages to describe a diverse array of number theory topics.

  3. Reflections on Activity Theory

    Science.gov (United States)

    Bakhurst, David

    2009-01-01

    It is sometimes suggested that activity theory represents the most important legacy of Soviet philosophy and psychology. But what exactly "is" activity theory? The canonical account in the West is given by Engestrom, who identifies three stages in the theory's development: from Vygotsky's insights, through Leontiev's articulation of the…

  4. Superspace conformal field theory

    Energy Technology Data Exchange (ETDEWEB)

    Quella, Thomas [Koeln Univ. (Germany). Inst. fuer Theoretische Physik; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-07-15

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  5. Superspace conformal field theory

    International Nuclear Information System (INIS)

    Quella, Thomas

    2013-07-01

    Conformal sigma models and WZW models on coset superspaces provide important examples of logarithmic conformal field theories. They possess many applications to problems in string and condensed matter theory. We review recent results and developments, including the general construction of WZW models on type I supergroups, the classification of conformal sigma models and their embedding into string theory.

  6. Gauge theory and gravitation

    International Nuclear Information System (INIS)

    Kikkawa, Keiji; Nakanishi, Noboru; Nariai, Hidekazu

    1983-01-01

    These proceedings contain the articles presented at the named symposium. They deal with geometrical aspects of gauge theory and gravitation, special problems in gauge theories, quantum field theory in curved space-time, quantum gravity, supersymmetry including supergravity, and grand unification. See hints under the relevant topics. (HSI)

  7. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  8. Endogenous Prospect Theory

    OpenAIRE

    Schmidt, Ulrich; Zank, Horst

    2010-01-01

    In previous models of (cumulative) prospect theory reference-dependence of preferences is imposed beforehand and the location of the reference point is exogenously determined. This paper provides an axiomatization of a new specification of cumulative prospect theory, termed endogenous prospect theory, where reference-dependence is derived from preference conditions and a unique reference point arises endogenously.

  9. Sharpening Intertemporal Prospect Theory

    OpenAIRE

    Pushpa, Rathie; Carlos, Radavelli; Sergio, Da Silva

    2006-01-01

    Prospect theory [4] of risky choices has been extended to encompass intertemporal choices [6]. Presentation of intertemporal prospect theory suffers from minor mistakes, however [2]. To clarify the theory we restate it and show further mistakes in current presentations ([6], [2]) of value and discount functions.

  10. Gauge theories as theories of spontaneous breakdown

    International Nuclear Information System (INIS)

    Ivanov, E.A.; Ogievetsky, V.I.

    1976-01-01

    Any gauge theory is proved to arise from spontaneous breakdown of symmetry under a certain infinite-parameter group, the corresponding gauge field being the Goldstone field by which this breakdown is accompanied.

  11. Theory and context / Theory in context

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    2014-01-01

    It is debatable whether the psychology of creativity is a field in crisis or not. There are clear signs of increased fragmentation and a scarcity of integrative efforts, but is this necessarily bad? Do we need more comprehensive theories of creativity and a return to old epistemological questions? This depends on how one understands theory. Against a view of theoretical work as aiming towards generality, universality, uniformity, completeness, and singularity, I advocate for a dynamic perspective in which theory is plural, multifaceted, and contextual. Far from ‘waiting for the Messiah’ ... trans-disciplinary manner. Consideration needs to be given as well to connected scholarship focusing on imagination, innovation, and improvisation. Last but not least, an expanded theory of context cannot ignore the institutional context of doing research on creativity. Creativity scholars are facing...

  12. Invariant Theory (IT) & Standard Monomial Theory (SMT)

    Indian Academy of Sciences (India)

    2013-07-06

    Why invariant theory? (continued). Now imagine algebraic calculations being made, with the two different sets of co-ordinates, about something of geometrical or physical interest concerning the configuration of points, ...

  13. Nuclear structure theory

    CERN Document Server

    Irvine, J M

    1972-01-01

    Nuclear Structure Theory provides a guide to nuclear structure theory. The book comprises 23 chapters organized into four parts, each covering an aspect of nuclear structure theory. The first part discusses the experimentally observed phenomena that nuclear structure theories need to address and details the information that supports those theories. The second part of the book deals with the phenomenological nucleon-nucleon potentials derived from phase shift analysis of nucleon-nucleon scattering. Part III talks about the phenomenological parameters used to de

  14. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter depending integrals, as well as the Radon-Nikodym theorem. The fi
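
    For reference, the Radon-Nikodym theorem highlighted above can be stated in its standard form (given here as background, not as a quotation from the book): if ν and μ are σ-finite measures on a measurable space (X, Σ) with ν absolutely continuous with respect to μ (ν ≪ μ), then there is a measurable f ≥ 0, unique μ-almost everywhere, with

        \nu(A) \;=\; \int_A f \, d\mu \quad \text{for all } A \in \Sigma,
        \qquad f \;=\; \frac{d\nu}{d\mu}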

  15. Special theory of relativity

    CERN Document Server

    Kilmister, Clive William

    1970-01-01

    Special Theory of Relativity provides a discussion of the special theory of relativity. Special relativity is not, like other scientific theories, a statement about the matter that forms the physical world, but has the form of a condition that the explicit physical theories must satisfy. It is thus a form of description, playing to some extent the role of the grammar of physics, prescribing which combinations of theoretical statements are admissible as descriptions of the physical world. Thus, to describe it, one needs also to describe those specific theories and to say how much they are limit

  16. Variational Transition State Theory

    Energy Technology Data Exchange (ETDEWEB)

    Truhlar, Donald G. [Univ. of Minnesota, Minneapolis, MN (United States)

    2016-09-29

    This is the final report on a project involving the development and applications of variational transition state theory. This project involved the development of variational transition state theory for gas-phase reactions, including optimized multidimensional tunneling contributions and the application of this theory to gas-phase reactions with a special emphasis on developing reaction rate theory in directions that are important for applications to combustion. The development of variational transition state theory with optimized multidimensional tunneling as a useful computational tool for combustion kinetics involved eight objectives.
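
    In its canonical form (a standard statement of the theory, offered here for orientation rather than as an excerpt from the report), variational transition state theory minimises the generalized transition-state rate constant over the location s of the dividing surface along the reaction path:

        k^{\mathrm{CVT}}(T) \;=\; \min_{s}\, k^{\mathrm{GT}}(T,s),
        \qquad
        k^{\mathrm{GT}}(T,s) \;=\; \frac{k_B T}{h}\,
            \frac{Q^{\mathrm{GT}}(T,s)}{\Phi^{\mathrm{R}}(T)}\;
            e^{-V_{\mathrm{MEP}}(s)/k_B T}

    Here Q^GT is the partition function of the generalized transition state, Φ^R that of the reactants (per unit volume for bimolecular reactions), and V_MEP the potential along the minimum-energy path; the optimized multidimensional tunneling contributions mentioned above enter as a multiplicative transmission coefficient κ(T).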

  17. Extremal graph theory

    CERN Document Server

    Bollobas, Bela

    2004-01-01

    The ever-expanding field of extremal graph theory encompasses a diverse array of problem-solving methods, including applications to economics, computer science, and optimization theory. This volume, based on a series of lectures delivered to graduate students at the University of Cambridge, presents a concise yet comprehensive treatment of extremal graph theory.Unlike most graph theory treatises, this text features complete proofs for almost all of its results. Further insights into theory are provided by the numerous exercises of varying degrees of difficulty that accompany each chapter. A

  18. Introduction to spectral theory

    CERN Document Server

    Levitan, B M

    1975-01-01

    This monograph is devoted to the spectral theory of the Sturm-Liouville operator and to the spectral theory of the Dirac system. In addition, some results are given for nth order ordinary differential operators. Those parts of this book which concern nth order operators can serve simply as an introduction to this domain, which has by now become very broad. For the convenience of the reader who is not familiar with abstract spectral theory, the authors have inserted a chapter (Chapter 13) in which they discuss this theory, concisely and in the main without proofs, and indicate various connections with the spectral theory of differential operators.

  19. Game theory an introduction

    CERN Document Server

    Barron, E N

    2013-01-01

    An exciting new edition of the popular introduction to game theory and its applications The thoroughly expanded Second Edition presents a unique, hands-on approach to game theory. While most books on the subject are too abstract or too basic for mathematicians, Game Theory: An Introduction, Second Edition offers a blend of theory and applications, allowing readers to use theory and software to create and analyze real-world decision-making models. With a rigorous, yet accessible, treatment of mathematics, the book focuses on results that can be used to

  20. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  1. Finite quantum field theories

    International Nuclear Information System (INIS)

    Lucha, W.; Neufeld, H.

    1986-01-01

    We investigate the relation between finiteness of a four-dimensional quantum field theory and global supersymmetry. To this end we consider the most general quantum field theory and analyse the finiteness conditions resulting from the requirement of the absence of divergent contributions to the renormalizations of the parameters of the theory. In addition to the gauge bosons, both fermions and scalar bosons turn out to be a necessary ingredient in a non-trivial finite gauge theory. In all cases discussed, the supersymmetric theory restricted by two well-known constraints on the dimensionless couplings proves to be the unique solution of the finiteness conditions. (Author)

  2. The SusHouse project. Use and maintenance of clothing as an example. Environmental analysis of system-level future scenarios

    International Nuclear Information System (INIS)

    Knot, M.; Bras-Klapwijk, R.M.

    2001-01-01

    The SusHouse project assumed that system-level innovations are necessary for sustainable development, involving new arrangements and combined innovations in technology, organisation and behaviour. The life cycle analysis (LCA) method has been used and adapted to evaluate the potential of such complex system-level strategies to reduce environmental impact. This article explains and discusses this approach and presents some assessment results for the SusHouse research into clothing. The requirements and systems used were found to yield interesting insights into relevant solutions and strategies. The future scenarios for clothing promise major improvements in most of the environmental indicators, with particular contributions from changes in the quantity and quality of clothing consumed. The article recommends extending the approach with a 'turning points' analysis, because of the many uncertainties, as well as the use of more differentiated indicators and the inclusion of a focused trend analysis. 17 refs

  3. Toward Petascale Biologically Plausible Neural Networks

    Science.gov (United States)

    Long, Lyle

    This talk will describe an approach to achieving petascale neural networks. Artificial intelligence has been oversold for many decades. Computers in the beginning could only do about 16,000 operations per second. Computer processing power, however, has been doubling every two years thanks to Moore's law, and growing even faster due to massively parallel architectures. Finally, 60 years after the first AI conference we have computers on the order of the performance of the human brain (10^16 operations per second). The main issues now are algorithms, software, and learning. We have excellent models of neurons, such as the Hodgkin-Huxley model, but we do not know how the human neurons are wired together. With careful attention to efficient parallel computing, event-driven programming, table lookups, and memory minimization, massive scale simulations can be performed. The code that will be described was written in C++ and uses the Message Passing Interface (MPI). It uses the full Hodgkin-Huxley neuron model, not a simplified model. It also allows arbitrary network structures (deep, recurrent, convolutional, all-to-all, etc.). The code is scalable, and has, so far, been tested on up to 2,048 processor cores using 10^7 neurons and 10^9 synapses.
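
    The production code described above is C++/MPI; as a single-neuron sketch of the Hodgkin-Huxley update such simulators time-step, the following uses the textbook squid-axon parameters and a forward-Euler scheme chosen for illustration only:

        import numpy as np

        # Standard Hodgkin-Huxley constants (squid axon; mV, ms, uF/cm^2, mS/cm^2)
        C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
        E_Na, E_K, E_L = 50.0, -77.0, -54.387

        def rates(V):
            """Voltage-dependent opening/closing rates for the m, h, n gates."""
            a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
            b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
            a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
            b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
            a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
            b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
            return (a_m, b_m), (a_h, b_h), (a_n, b_n)

        # Resting state and a constant injected current (uA/cm^2)
        V, m, h, n, I_ext, dt = -65.0, 0.053, 0.596, 0.318, 10.0, 0.01
        for step in range(int(50.0 / dt)):                # simulate 50 ms
            (a_m, b_m), (a_h, b_h), (a_n, b_n) = rates(V)
            # Forward-Euler update of the gating variables ...
            m += dt * (a_m * (1.0 - m) - b_m * m)
            h += dt * (a_h * (1.0 - h) - b_h * h)
            n += dt * (a_n * (1.0 - n) - b_n * n)
            # ... and of the membrane potential
            I_ion = (g_Na * m**3 * h * (V - E_Na)
                     + g_K * n**4 * (V - E_K)
                     + g_L * (V - E_L))
            V += dt * (I_ext - I_ion) / C_m
        print(f"V after 50 ms: {V:.2f} mV")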

  4. Speech recognition employing biologically plausible receptive fields

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Bothe, Hans-Heinrich

    2011-01-01

    spectro-temporal receptive fields to auditory spectrogram input, motivated by the auditory pathway of humans, and ii) the adaptation or learning algorithms involved are biologically inspired. This is in contrast to state-of-the-art combinations of Mel-frequency cepstral coefficients and Hidden Markov...

  5. Migraine & paediatric obesity: a plausible link?

    Directory of Open Access Journals (Sweden)

    Sarit Ravid

    2014-01-01

    Full Text Available Obesity and migraine are both highly prevalent disorders in the general population, influenced by genetic and environmental risk factors. In recent studies, obesity was found to be a strong risk factor for transformed migraine and, among migraineurs, obesity was associated with frequent headaches and higher disability scores. Suggested mechanisms included: (i) obesity as a pro-inflammatory state may be associated with neurovascular inflammation in patients with migraine; (ii) elevated levels of plasma calcitonin gene-related peptide (CGRP) in obese individuals may play a role as an important post-synaptic mediator of trigeminovascular inflammation in migraine; (iii) dysmodulation of the hypothalamic neuropeptide orexin in obese persons may be associated with increased susceptibility to neurogenic inflammation causing migraine attacks; and (iv) leptin and adiponectin can activate proinflammatory cytokine release that is involved in the pathogenesis of migraine. In addition, both conditions are associated with psychiatric co-morbidities, such as depression and anxiety, that can further increase headache frequency and disability. Therefore, the effect of obesity on migraine outcome is important. Weight and BMI should be measured and calculated in all children presenting with migraine, and weight control should be a part of the treatment.

  6. DIOXINS AND ENDOMETRIOSIS: A PLAUSIBLE HYPOTHESIS

    Science.gov (United States)

    A potential connection exists between the increasing prevalence of endometriosis and exposure to organochlorine chemicals. There is evidence that dioxin (2,3,7,8-TCDD) can increase the incidence and severity of the disease in monkeys and can promote the growth or survival of end...

  7. Geophysical Field Theory

    International Nuclear Information System (INIS)

    Eloranta, E.

    2003-11-01

    The geophysical field theory includes the basic principles of electromagnetism, continuum mechanics, and potential theory, upon which the computational modelling of geophysical phenomena is based. Vector analysis is the main mathematical tool in the field analyses. Electrostatics, stationary electric current, magnetostatics, and electrodynamics form a central part of electromagnetism in geophysical field theory. Potential theory concerns especially gravity, but also electrostatics and magnetostatics. Solid state mechanics and fluid mechanics are central parts of continuum mechanics. The theories of elastic waves and rock mechanics also belong to geophysical solid state mechanics. The theories of geohydrology and mass transport form one central field theory in geophysical fluid mechanics. Heat transfer is also included in continuum mechanics. (orig.)
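
    A common thread in the potential-theory parts named above (stated here as standard background, not as a quotation from the report) is Poisson's equation, which governs electrostatics and Newtonian gravity alike:

        \nabla^2 \phi \;=\; -\,\rho/\varepsilon_0 \quad\text{(electrostatics)},
        \qquad
        \nabla^2 \phi \;=\; 4\pi G\,\rho \quad\text{(gravity)}

    In source-free regions both reduce to the Laplace equation, ∇²φ = 0, which also governs the magnetostatic scalar potential where no currents flow.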

  8. Theory of superconductivity

    International Nuclear Information System (INIS)

    Crisan, M.

    1988-01-01

    This book discusses the most important aspects of the theory. The phenomenological model is followed by the microscopic theory of superconductivity, in which the modern formalism of many-body theory is used to treat the most important problems, such as superconducting alloys, coexistence of superconductivity with magnetic order, and superconductivity in quasi-one-dimensional systems. It concludes with a discussion of models for exotic and high temperature superconductivity. Its main aim is to review, as completely as possible, the theory of superconductivity from classical models and methods up to the 1987 results on high temperature superconductivity. Contents: Phenomenological Theory of Superconductivity; Microscopic Theory of Superconductivity; Theory of Superconducting Alloys; Superconductors in a Magnetic Field; Superconductivity and Magnetic Order; Superconductivity in Quasi-One-Dimensional Systems; and Non-Conventional Superconductivity

  9. Gravitation and source theory

    International Nuclear Information System (INIS)

    Yilmaz, H.

    1975-01-01

    Schwinger's source theory is applied to the problem of gravitation and its quantization. It is shown that within the framework of a flat space the source theory implementation leads to a violation of probability. To avoid the difficulty one must introduce a curved space-time; hence the source concept may be said to necessitate the transition to a curved-space theory of gravitation. It is further shown that the curved-space theory of gravitation implied by the source theory is not equivalent to the conventional Einstein theory. The source concept leads to a different theory where the gravitational field has a stress-energy tensor t^nu_mu which contributes to geometric curvatures

  10. Introduction to representation theory

    CERN Document Server

    Etingof, Pavel; Hensel, Sebastian; Liu, Tiankai; Schwendner, Alex

    2011-01-01

    Very roughly speaking, representation theory studies symmetry in linear spaces. It is a beautiful mathematical subject which has many applications, ranging from number theory and combinatorics to geometry, probability theory, quantum mechanics, and quantum field theory. The goal of this book is to give a "holistic" introduction to representation theory, presenting it as a unified subject which studies representations of associative algebras and treating the representation theories of groups, Lie algebras, and quivers as special cases. Using this approach, the book covers a number of standard topics in the representation theories of these structures. Theoretical material in the book is supplemented by many problems and exercises which touch upon a lot of additional topics; the more difficult exercises are provided with hints. The book is designed as a textbook for advanced undergraduate and beginning graduate students. It should be accessible to students with a strong background in linear algebra and a basic k...

  11. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  12. Applied number theory

    CERN Document Server

    Niederreiter, Harald

    2015-01-01

    This textbook effectively builds a bridge from basic number theory to recent advances in applied number theory. It presents the first unified account of the four major areas of application where number theory plays a fundamental role, namely cryptography, coding theory, quasi-Monte Carlo methods, and pseudorandom number generation, allowing the authors to delineate the manifold links and interrelations between these areas.  Number theory, which Carl-Friedrich Gauss famously dubbed the queen of mathematics, has always been considered a very beautiful field of mathematics, producing lovely results and elegant proofs. While only very few real-life applications were known in the past, today number theory can be found in everyday life: in supermarket bar code scanners, in our cars’ GPS systems, in online banking, etc.  Starting with a brief introductory course on number theory in Chapter 1, which makes the book more accessible for undergraduates, the authors describe the four main application areas in Chapters...
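
    As a small, self-contained taste of the bar-code application mentioned above (an illustration of the standard EAN-13 rule, not an excerpt from the book), the 13th digit of a supermarket bar code is a mod-10 checksum of the first twelve:

        def ean13_check_digit(digits12: str) -> int:
            """Check digit for a 12-digit EAN-13 payload (weights 1,3,1,3,...)."""
            total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(digits12))
            return (10 - total % 10) % 10

        # Example: payload 400638133393 yields check digit 1 -> full code 4006381333931
        print(ean13_check_digit("400638133393"))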

  13. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  14. Politics, Security, Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2011-01-01

    This article outlines three ways of analysing the ‘politics of securitization’, emphasizing an often-overlooked form of politics practised through theory design. The structure and nature of a theory can have systematic political implications. Analysis of this ‘politics of securitization’ is distinct from both the study of political practices of securitization and explorations of competing concepts of politics among security theories. It means tracking what kinds of analysis the theory can produce and whether such analysis systematically impacts real-life political struggles. Securitization theory is found to ‘act politically’ through three structural features that systematically shape the political effects of using the theory. The article further discusses – on the basis of the preceding articles in the special issue – three emerging debates around securitization theory: ethics...

  15. Development of Accounting Theories Specific to the National Accounting Literature of the First Half of Twentieth Century

    Directory of Open Access Journals (Sweden)

    Sorin Damian

    2011-05-01

    Full Text Available The need to identify plausible explanations of the principles underlying double-entry accounting has, over time, prompted various manifestations of thought that have resulted in many theories. All these theories set out to explain and substantiate the dopic formalism, but many of them today retain no more than a purely historical value. Representative of such theories are the many pages written by Romanian and foreign authors in the first half of the twentieth century. Among the Romanian authors we mention Ioan E. Evian, D. Voina, C. G. Demetrescu, S. Iacobescu, Al. Sorescu, C. Pantu, C. Petrescu, Grigore Trancu-Iaşi and others. The accounting bibliography of the period divided these theories into embryos of theories and scientific theories.

  16. Microcanonical quantum field theory

    International Nuclear Information System (INIS)

    Strominger, A.

    1983-01-01

    Euclidean quantum field theory is equivalent to the equilibrium statistical mechanics of classical fields in 4+1 dimensions at temperature h. It is well known in statistical mechanics that the theory of systems at fixed temperature is embedded within the more general and fundamental theory of systems at fixed energy. We therefore develop, in precise analogy, a fixed-action (microcanonical) formulation of quantum field theory. For the case of ordinary renormalizable field theories, we show (with one exception) that the microcanonical is entirely equivalent to the canonical formulation. That is, for some particular fixed value of the total action, the Green's functions of the microcanonical theory are equal, in the bulk limit, to those of the canonical theory. The microcanonical perturbation expansion is developed in some detail for λφ^4. The particular value of the action for which the two formulations are equivalent can be calculated to all orders in perturbation theory. We prove, using Lehmann's Theorem, that this value is one-half Planck unit per degree of freedom, if fermionic degrees of freedom are counted negatively. This is the 4+1 dimensional analog of the equipartition theorem. The one exception to this is supersymmetric theories. A microcanonical formulation exists if and only if supersymmetry is broken. In statistical mechanics and in field theory there are systems for which the canonical description is pathological, but the microcanonical is not. An example of such a field theory is found in one dimension. A semiclassical expansion of the microcanonical theory is well defined, while an expansion of the canonical theory is hopelessly divergent.
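
    The ensemble relation invoked above can be written schematically (this display is our gloss on the abstract, in the notation usual for statistical mechanics): the canonical functional integral is the Laplace transform of a microcanonical density of configurations at fixed total action,

        Z(\hbar) \;=\; \int \mathcal{D}\phi\; e^{-S[\phi]/\hbar}
                 \;=\; \int_0^\infty \! dS_0\; e^{-S_0/\hbar}\, \Omega(S_0),
        \qquad
        \Omega(S_0) \;=\; \int \mathcal{D}\phi\; \delta\big(S[\phi]-S_0\big)

    Fixing S_0 at its saddle-point (equipartition) value, one-half Planck unit per degree of freedom, is then what reproduces the canonical Green's functions in the bulk limit.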

  17. Criteria for selecting implementation science theories and frameworks: results from an international survey

    Directory of Open Access Journals (Sweden)

    Sarah A. Birken

    2017-10-01

    Full Text Available Abstract: Background: Theories provide a synthesizing architecture for implementation science. The underuse, superficial use, and misuse of theories pose a substantial scientific challenge for implementation science and may relate to challenges in selecting from the many theories in the field. Implementation scientists may benefit from guidance for selecting a theory for a specific study or project. Understanding how implementation scientists select theories will help inform efforts to develop such guidance. Our objective was to identify which theories implementation scientists use, how they use theories, and the criteria used to select theories. Methods: We identified initial lists of uses and criteria for selecting implementation theories based on seminal articles and an iterative consensus process. We incorporated these lists into a self-administered survey for completion by self-identified implementation scientists. We recruited potential respondents at the 8th Annual Conference on the Science of Dissemination and Implementation in Health and via several international email lists. We used frequencies and percentages to report results. Results: Two hundred twenty-three implementation scientists from 12 countries responded to the survey. They reported using more than 100 different theories spanning several disciplines. Respondents reported using theories primarily to identify implementation determinants, inform data collection, enhance conceptual clarity, and guide implementation planning. Of the 19 criteria presented in the survey, the criteria used by the most respondents to select theory included analytic level (58%), logical consistency/plausibility (56%), empirical support (53%), and description of a change process (54%). The criteria used by the fewest respondents included fecundity (10%), uniqueness (12%), and falsifiability (15%). Conclusions: Implementation scientists use a large number of criteria to select theories, but there is little

  18. Criteria for selecting implementation science theories and frameworks: results from an international survey.

    Science.gov (United States)

    Birken, Sarah A; Powell, Byron J; Shea, Christopher M; Haines, Emily R; Alexis Kirk, M; Leeman, Jennifer; Rohweder, Catherine; Damschroder, Laura; Presseau, Justin

    2017-10-30

    Theories provide a synthesizing architecture for implementation science. The underuse, superficial use, and misuse of theories pose a substantial scientific challenge for implementation science and may relate to challenges in selecting from the many theories in the field. Implementation scientists may benefit from guidance for selecting a theory for a specific study or project. Understanding how implementation scientists select theories will help inform efforts to develop such guidance. Our objective was to identify which theories implementation scientists use, how they use theories, and the criteria used to select theories. We identified initial lists of uses and criteria for selecting implementation theories based on seminal articles and an iterative consensus process. We incorporated these lists into a self-administered survey for completion by self-identified implementation scientists. We recruited potential respondents at the 8th Annual Conference on the Science of Dissemination and Implementation in Health and via several international email lists. We used frequencies and percentages to report results. Two hundred twenty-three implementation scientists from 12 countries responded to the survey. They reported using more than 100 different theories spanning several disciplines. Respondents reported using theories primarily to identify implementation determinants, inform data collection, enhance conceptual clarity, and guide implementation planning. Of the 19 criteria presented in the survey, the criteria used by the most respondents to select theory included analytic level (58%), logical consistency/plausibility (56%), empirical support (53%), and description of a change process (54%). The criteria used by the fewest respondents included fecundity (10%), uniqueness (12%), and falsifiability (15%). Implementation scientists use a large number of criteria to select theories, but there is little consensus on which are most important. Our results suggest that the
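
    The frequencies-and-percentages reporting described in both records above amounts to a simple tally over multi-select survey responses. A minimal sketch (the criterion names and responses below are hypothetical, not the study's data):

        from collections import Counter

        # Hypothetical multi-select responses: the criteria each respondent used
        responses = [
            {"analytic level", "empirical support"},
            {"analytic level", "logical consistency/plausibility"},
            {"description of a change process"},
        ]

        # Count each criterion once per respondent, then report as a percentage
        counts = Counter(c for chosen in responses for c in chosen)
        for criterion, n in counts.most_common():
            print(f"{criterion}: {100 * n / len(responses):.0f}%")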

  19. Classical field theory

    CERN Document Server

    Franklin, Joel

    2017-01-01

    Classical field theory, which concerns the generation and interaction of fields, is a logical precursor to quantum field theory, and can be used to describe phenomena such as gravity and electromagnetism. Written for advanced undergraduates, and appropriate for graduate level classes, this book provides a comprehensive introduction to field theories, with a focus on their relativistic structural elements. Such structural notions enable a deeper understanding of Maxwell's equations, which lie at the heart of electromagnetism, and can also be applied to modern variants such as Chern–Simons and Born–Infeld. The structure of field theories and their physical predictions are illustrated with compelling examples, making this book perfect as a text in a dedicated field theory course, for self-study, or as a reference for those interested in classical field theory, advanced electromagnetism, or general relativity. Demonstrating a modern approach to model building, this text is also ideal for students of theoretic...

  20. Algebraic quantum field theory

    International Nuclear Information System (INIS)

    Foroutan, A.

    1996-12-01

    The basic assumption that the complete information relevant for a relativistic, local quantum theory is contained in the net structure of the local observables of this theory results first of all in a concise formulation of the algebraic structure of the superselection theory and an intrinsic formulation of charge composition, charge conjugation and the statistics of an algebraic quantum field theory. In a next step, the locality of massive particles together with their spectral properties is used for the formulation of a selection criterion which opens the access to the massive, non-abelian quantum gauge theories. The role of the electric charge as a superselection rule results in the introduction of charge classes which in turn lead to a set of quantum states with optimum localization properties. Finally, the asymptotic observables of quantum electrodynamics are investigated within the framework of algebraic quantum field theory. (author)