WorldWideScience

Sample records for plausible systems-level theory

  1. Some Remarks on the Model Theory of Epistemic Plausibility Models

    CERN Document Server

    Demey, Lorenz

    2010-01-01

    Classical logics of knowledge and belief are usually interpreted on Kripke models, for which a mathematically well-developed model theory is available. However, such models are inadequate to capture dynamic phenomena. Therefore, epistemic plausibility models have been introduced. Because these are much richer structures than Kripke models, they do not straightforwardly inherit the model-theoretical results of modal logic. Hence, while epistemic plausibility structures are well-suited for modeling purposes, an extensive investigation of their model theory has been lacking so far. The aim of the present paper is to fill exactly this gap by initiating a systematic exploration of the model theory of epistemic plausibility models. As in 'ordinary' modal logic, the focus will be on the notion of bisimulation. We define various notions of bisimulation (parametrized by a language L) and show that L-bisimilarity implies L-equivalence. We prove a Hennessy-Milner type result, and also two undefinability results. ...
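
    The back-and-forth conditions that such notions refine are the standard ones from modal logic. As a hedged sketch (our notation, not the paper's): for Kripke models M = (W, R, V) and M' = (W', R', V'), a relation Z between W and W' is a bisimulation when

    ```latex
    % Standard Kripke bisimulation; the paper's L-parametrized notions add
    % analogous forth/back clauses for each relation of the richer
    % epistemic plausibility structures.
    \begin{aligned}
    \text{(atoms)} &\quad wZw' \Rightarrow \bigl(w \in V(p) \iff w' \in V'(p)\bigr)\ \text{for all atoms } p,\\
    \text{(forth)} &\quad wZw' \wedge Rwv \Rightarrow \exists v' \in W'\,\bigl(R'w'v' \wedge vZv'\bigr),\\
    \text{(back)}  &\quad wZw' \wedge R'w'v' \Rightarrow \exists v \in W\,\bigl(Rwv \wedge vZv'\bigr).
    \end{aligned}
    ```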

  2. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible.

    Science.gov (United States)

    Raab, Marius Hans; Auer, Nikolas; Ortlieb, Stefan A; Carbon, Claus-Christian

    2013-01-01

    Reptile prime ministers and flying Nazi saucers: extreme and sometimes off-the-wall conclusions are typical ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not been examined in an experimental design so far. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deemed most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story's plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany's most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  3. Quantum theory as plausible reasoning applied to data obtained by robust experiments.

    Science.gov (United States)

    De Raedt, H; Katsnelson, M I; Michielsen, K

    2016-05-28

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference, applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out, yields, without introducing any concept of quantum theory, the quantum theoretical description, in terms of the Schrödinger or the Pauli equation, of the Stern-Gerlach and Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is, common sense, applied to reproducible and robust experimental data.

  4. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible

    Directory of Open Access Journals (Sweden)

    Marius Hans Raab

    2013-07-01

    Reptile prime ministers and flying Nazi saucers—extreme and sometimes off-the-wall conclusions are common ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not been examined in an experimental design so far. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deemed most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story’s plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany’s most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  5. Looking for plausibility

    CERN Document Server

    Abdullah, Wan Ahmad Tajuddin Wan

    2010-01-01

    In the interpretation of experimental data, one is actually looking for plausible explanations. We look for a measure of plausibility, with which we can compare different possible explanations, and which can be combined when there are different sets of data. This is contrasted to the conventional measure for probabilities as well as to the proposed measure of possibilities. We define what characteristics this measure of plausibility should have. In getting to the conception of this measure, we explore the relation of plausibility to abductive reasoning, and to Bayesian probabilities. We also compare with the Dempster-Shafer theory of evidence, which also has its own definition for plausibility. Abduction can be associated with biconditionality in inference rules, and this provides a platform to relate to the Collins-Michalski theory of plausibility. Finally, using a formalism for wiring logic onto Hopfield neural networks, we ask if this is relevant in obtaining this measure.
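
    Since the abstract compares against Dempster-Shafer plausibility, a minimal sketch of that measure may help orient the reader; the frame and mass function below are illustrative assumptions, not the author's formalism:

    ```python
    # Dempster-Shafer theory: a mass function assigns belief mass to
    # subsets of a frame of discernment. Belief sums the masses of all
    # subsets of A; plausibility sums the masses of all sets that
    # intersect A, so Bel(A) <= Pl(A).
    masses = {  # hypothetical mass function over the frame {'x', 'y', 'z'}
        frozenset({'x'}): 0.4,
        frozenset({'x', 'y'}): 0.3,
        frozenset({'x', 'y', 'z'}): 0.3,
    }

    def belief(a: frozenset) -> float:
        return sum(m for s, m in masses.items() if s <= a)

    def plausibility(a: frozenset) -> float:
        return sum(m for s, m in masses.items() if s & a)

    a = frozenset({'x'})
    print(belief(a), plausibility(a))  # 0.4 1.0
    ```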

  6. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible

    OpenAIRE

    Raab, Marius Hans; Auer, Nikolas; Ortlieb, Stefan A.; Carbon, Claus-Christian

    2013-01-01

    Reptile prime ministers and flying Nazi saucers—extreme and sometimes off-the-wall conclusions are typical ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not been examined in an experimental design so far. We identified six morphological components of conspiracy theories empirically. On the basis of these content categori...

  7. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible

    OpenAIRE

    Marius Hans Raab; Nikolas Auer; Ortlieb, Stefan A.; Claus-Christian Carbon

    2013-01-01

    Reptile prime ministers and flying Nazi saucers—extreme and sometimes off-the-wall conclusions are common ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not been examined in an experimental design so far. We identified six morphological components of conspiracy theories empirically. On the basis of these content categorie...

  8. On the Biological Plausibility of Grandmother Cells: Implications for Neural Network Theories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.

    2009-01-01

    A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog"), that is, coded with their own dedicated…

  9. Plausibility functions and exact frequentist inference

    CERN Document Server

    Martin, Ryan

    2012-01-01

    In the frequentist program, inferential methods with exact control on error rates are a primary focus. Methods based on asymptotic distribution theory may not be suitable in a particular problem, in which case, a numerical method is needed. This paper presents a general, Monte Carlo-driven framework for the construction of frequentist procedures based on plausibility functions. It is proved that the suitably defined plausibility function-based tests and confidence regions have desired frequentist properties. Moreover, in an important special case involving likelihood ratios, conditions are given such that the plausibility function behaves asymptotically like a consistent Bayesian posterior distribution. An extension of the proposed method is also given for the case where nuisance parameters are present. A number of examples are given which illustrate the method and demonstrate its strong performance compared to other popular existing methods.
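
    A minimal Monte Carlo sketch in the spirit of such plausibility functions, using a relative-likelihood statistic for a normal mean; the specifics below are our illustrative assumptions, not the paper's exact construction:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rel_lik(x: np.ndarray, theta: float) -> float:
        # Relative likelihood for a N(theta, 1) mean: L(theta) / L(theta_hat),
        # which reduces to exp(-n * (xbar - theta)**2 / 2).
        n, xbar = len(x), x.mean()
        return float(np.exp(-0.5 * n * (xbar - theta) ** 2))

    def plausibility(x: np.ndarray, theta: float, m: int = 5000) -> float:
        # Monte Carlo estimate of pl(theta) = P_theta{ T(X, theta) <= T(x, theta) },
        # where T is the relative likelihood and X is drawn under theta.
        t_obs = rel_lik(x, theta)
        sims = rng.normal(theta, 1.0, size=(m, len(x)))
        t_sim = np.exp(-0.5 * len(x) * (sims.mean(axis=1) - theta) ** 2)
        return float((t_sim <= t_obs).mean())

    x = rng.normal(0.3, 1.0, size=20)
    # A 90% plausibility region is {theta : plausibility(x, theta) > 0.1}.
    print(plausibility(x, 0.3), plausibility(x, 1.5))
    ```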

  10. Pathways to plausibility

    DEFF Research Database (Denmark)

    Wahlberg, Ayo

    2008-01-01

    Herbal medicine has long been contrasted to modern medicine in terms of a holistic approach to healing, vitalistic theories of health and illness and an emphasis on the body’s innate self-healing capacities. At the same time, since the early 20th century, the cultivation, preparation and mass…

  11. Pathways to plausibility

    DEFF Research Database (Denmark)

    Wahlberg, Ayo

    2008-01-01

    Herbal medicine has long been contrasted to modern medicine in terms of a holistic approach to healing, vitalistic theories of health and illness and an emphasis on the body’s innate self-healing capacities. At the same time, since the early 20th century, the cultivation, preparation and mass pro...... as normalised, with herbalists, phytochemists and pharmacologists working to develop standardised production procedures as well as to identify ‘plausible’ explanations for the efficacy of these remedies....

  12. System level ESD protection

    CERN Document Server

    Vashchenko, Vladislav

    2014-01-01

    This book addresses key aspects of analog integrated circuits and systems design related to system-level electrostatic discharge (ESD) protection. It is an invaluable reference for anyone developing systems-on-chip (SoC) and systems-on-package (SoP) integrated with system-level ESD protection. The book focuses on both the design of semiconductor integrated circuit (IC) components with embedded, on-chip system-level protection and IC-system co-design. Readers will be enabled to bring system-level ESD protection solutions to the level of integrated circuits, thereby reducing or completely eliminating the need for additional, discrete components on the printed circuit board (PCB) and meeting system-level ESD requirements. The authors take a systematic approach, based on IC-system ESD protection co-design. A detailed description of the available IC-level ESD testing methods is provided, together with a discussion of the correlation between IC-level and system-level ESD testing methods. The IC-level ESD...

  13. The Plausibility of a String Quartet Performance in Virtual Reality.

    Science.gov (United States)

    Bergstrom, Ilias; Azevedo, Sergio; Papiotis, Panos; Saldanha, Nuno; Slater, Mel

    2017-04-01

    We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a virtual environment that depicts the performance of a string quartet. 'Plausibility' refers to the component of presence that is the illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians ignored the participant, or sometimes looked towards and followed the participant's movements), Sound Spatialization (Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived, or reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, or birdsong and wind corresponding to the outside scene). We adopted a methodology based on color matching theory, where 20 participants were first able to assess their feeling of plausibility in the environment with each of the four features at its highest setting. Then, five times, participants started from a low setting on all features and were able to make transitions from one system configuration to another until they matched their original feeling of plausibility. From these transitions a Markov transition matrix was constructed, along with probabilities of a match conditional on feature configuration. The results show that Environment and Gaze were individually the most important factors influencing the level of plausibility. The highest-probability transitions were to improve Environment and Gaze, and then Auralization and Spatialization. We present this work both as a contribution to the methodology of assessing presence without questionnaires, and as a demonstration of how various aspects of a musical performance can influence plausibility.
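
    A minimal sketch of estimating a Markov transition matrix from observed configuration-to-configuration transitions, as in the matching procedure described (state indices and counts are invented for illustration):

    ```python
    import numpy as np

    # Each observed transition is a (from_state, to_state) pair, e.g.
    # indices into an enumeration of feature configurations.
    transitions = [(0, 1), (0, 1), (1, 2), (1, 1), (2, 2), (0, 2)]
    n_states = 3

    counts = np.zeros((n_states, n_states))
    for src, dst in transitions:
        counts[src, dst] += 1

    # Row-normalise counts into a transition matrix; rows with no
    # observations are left uniform.
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.where(row_sums > 0,
                 counts / np.where(row_sums == 0, 1, row_sums),
                 1.0 / n_states)
    print(P)
    ```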

  14. What can we learn from Plausible Values?

    Science.gov (United States)

    Marsman, Maarten; Maris, Gunter; Bechger, Timo; Glas, Cees

    2016-06-01

    In this paper, we show that the marginal distribution of plausible values is a consistent estimator of the true latent variable distribution, and, furthermore, that convergence is monotone in an embedding in which the number of items tends to infinity. We use this result to clarify some of the misconceptions that exist about plausible values, and also show how they can be used in the analyses of educational surveys.
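
    In practice, each analysis is run once per plausible value and the results are pooled with Rubin's combining rules; a minimal sketch (the estimates and variances below are invented placeholders):

    ```python
    import numpy as np

    def combine_plausible_values(estimates, variances):
        """Rubin's combining rules across M plausible-value analyses:
        pooled estimate, and total variance = within + (1 + 1/M) * between."""
        estimates = np.asarray(estimates, dtype=float)
        variances = np.asarray(variances, dtype=float)
        m = len(estimates)
        pooled = estimates.mean()
        within = variances.mean()
        between = estimates.var(ddof=1)
        total_var = within + (1.0 + 1.0 / m) * between
        return pooled, total_var

    # e.g. a mean ability estimate and its sampling variance from each of
    # M = 5 plausible values (numbers are illustrative).
    est = [0.52, 0.49, 0.55, 0.50, 0.53]
    var = [0.012, 0.011, 0.013, 0.012, 0.012]
    print(combine_plausible_values(est, var))
    ```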

  15. Bisimulation for Single-Agent Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; van Ditmarsch, H.;

    2013-01-01

    Epistemic plausibility models are Kripke models agents use to reason about the knowledge and beliefs of themselves and each other. Restricting ourselves to the single-agent case, we determine when such models are indistinguishable in the logical language containing conditional belief, i.e., we...... define a proper notion of bisimulation, and prove that bisimulation corresponds to logical equivalence on image-finite models. We relate our results to other epistemic notions, such as safe belief and degrees of belief. Our results imply that there are only finitely many non-bisimilar single......-agent epistemic plausibility models on a finite set of propositions. This gives decidability for single-agent epistemic plausibility planning....

  16. Biologically Plausible, Human-scale Knowledge Representation

    Science.gov (United States)

    Crawford, Eric; Gingerich, Matthew; Eliasmith, Chris

    2016-01-01

    Several approaches to implementing symbol-like representations in neurally plausible models have been proposed. These approaches include binding through synchrony (Shastri & Ajjanagadde, 1993), "mesh" binding (van der Velde & de Kamps, 2006), and conjunctive binding (Smolensky, 1990). Recent theoretical work has suggested that…

  17. System-Level Radiation Hardening

    Science.gov (United States)

    Ladbury, Ray

    2014-01-01

    Although system-level radiation hardening can enable the use of high-performance components and enhance the capabilities of a spacecraft, hardening techniques can be costly and can compromise the very performance designers sought from the high-performance components. Moreover, such techniques often result in a complicated design, especially if several complex commercial microcircuits are used, each posing its own hardening challenges. The latter risk is particularly acute for Commercial-Off-The-Shelf components since high-performance parts (e.g. double-data-rate synchronous dynamic random access memories - DDR SDRAMs) may require other high-performance commercial parts (e.g. processors) to support their operation. For these reasons, it is essential that system-level radiation hardening be a coordinated effort, from setting requirements through testing up to and including validation.

  18. Anatomically Plausible Surface Alignment and Reconstruction

    DEFF Research Database (Denmark)

    Paulsen, Rasmus R.; Larsen, Rasmus

    2010-01-01

    With the increasing clinical use of 3D surface scanners, there is a need for accurate and reliable algorithms that can produce anatomically plausible surfaces. In this paper, a combined method for surface alignment and reconstruction is proposed. It is based on an implicit surface representation...... combined with a Markov Random Field regularisation method. Conceptually, the method maintains an implicit ideal description of the sought surface. This implicit surface is iteratively updated by realigning the input point sets and Markov Random Field regularisation. The regularisation is based on a prior...... energy that has earlier proved to be particularly well suited for human surface scans. The method has been tested on full cranial scans of ten test subjects and on several scans of the outer human ear....

  19. The Role of Plausible Values in Large-Scale Surveys

    Science.gov (United States)

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1) address…

  20. Comprehending Conflicting Science-Related Texts: Graphs as Plausibility Cues

    Science.gov (United States)

    Isberner, Maj-Britt; Richter, Tobias; Maier, Johanna; Knuth-Herzig, Katja; Horz, Holger; Schnotz, Wolfgang

    2013-01-01

    When reading conflicting science-related texts, readers may attend to cues which allow them to assess plausibility. One such plausibility cue is the use of graphs in the texts, which are regarded as typical of "hard science." The goal of our study was to investigate the effects of the presence of graphs on the perceived plausibility and…

  1. Invariant visual object recognition: biologically plausible approaches.

    Science.gov (United States)

    Robinson, Leigh; Rolls, Edmund T

    2015-10-01

    Key properties of inferior temporal cortex neurons are described, and then, the biological plausibility of two leading approaches to invariant visual object recognition in the ventral visual system is assessed to investigate whether they account for these properties. Experiment 1 shows that VisNet performs object classification with random exemplars comparably to HMAX, except that the final layer C neurons of HMAX have a very non-sparse representation (unlike that in the brain) that provides little information in the single-neuron responses about the object class. Experiment 2 shows that VisNet forms invariant representations when trained with different views of each object, whereas HMAX performs poorly when assessed with a biologically plausible pattern association network, as HMAX has no mechanism to learn view invariance. Experiment 3 shows that VisNet neurons do not respond to scrambled images of faces, and thus encode shape information. HMAX neurons responded with similarly high rates to the unscrambled and scrambled faces, indicating that low-level features including texture may be relevant to HMAX performance. Experiment 4 shows that VisNet can learn to recognize objects even when the view provided by the object changes catastrophically as it transforms, whereas HMAX has no learning mechanism in its S-C hierarchy that provides for view-invariant learning. This highlights some requirements for the neurobiological mechanisms of high-level vision, and how some different approaches perform, in order to help understand the fundamental underlying principles of invariant visual object recognition in the ventral visual stream.

  2. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition.

  3. Physiological Plausibility and Boundary Conditions of Theories of Risk Sensitivity

    DEFF Research Database (Denmark)

    Marchiori, Davide; Elqayam, Shira

    2012-01-01

    … dilatation, which in turn positively correlates with risk-averse behavior. They hypothesize that participants’ attention is increased in decision problems involving losses, which trigger an innate prudent behavior in situations entailing danger and/or hazard. Interestingly, Y&T find that the nature…

  4. Using critical evaluation to reappraise plausibility judgments: A critical cognitive component of conceptual change

    Science.gov (United States)

    Lombardi, D.

    2011-12-01

    Plausibility judgments, although well represented in conceptual change theories (see, for example, Chi, 2005; diSessa, 1993; Dole & Sinatra, 1998; Posner et al., 1982), have received little empirical attention until our recent work investigating teachers' and students' understanding of and perceptions about human-induced climate change (Lombardi & Sinatra, 2010, 2011). In our first study with undergraduate students, we found that greater plausibility perceptions of human-induced climate change accounted for significantly greater understanding of weather and climate distinctions after instruction, even after accounting for students' prior knowledge (Lombardi & Sinatra, 2010). In a follow-up study with inservice science and preservice elementary teachers, we showed that anger about the topic of climate change and teaching about climate change was significantly related to implausible perceptions about human-induced climate change (Lombardi & Sinatra, 2011). Results from our recent studies helped to inform our development of a model of the role of plausibility judgments in conceptual change situations. The model applies to situations involving cognitive dissonance, where background knowledge conflicts with an incoming message. In such situations, we define plausibility as a judgment of the relative potential truthfulness of incoming information compared to one's existing mental representations (Rescher, 1976). Students may not consciously think when making plausibility judgments, expending only minimal mental effort in what is referred to as an automatic cognitive process (Stanovich, 2009). However, well-designed instruction could facilitate students' reappraisal of plausibility judgments in more effortful and conscious cognitive processing. Critical evaluation specifically may be one effective method to promote plausibility reappraisal in a classroom setting (Lombardi & Sinatra, in progress). In science education, critical evaluation involves the analysis of how evidentiary...

  5. System-level flight test

    Energy Technology Data Exchange (ETDEWEB)

    Cornwall, J. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Dyson, F. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Eardley, D. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Happer, W. [The MITRE Corporation, McLean, VA (US). JASON Program Office; LeLevier, R. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Nierenberg, W. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Press, W. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Ruderman, M. [The MITRE Corporation, McLean, VA (US). JASON Program Office; Sullivan, J. [The MITRE Corporation, McLean, VA (US). JASON Program Office; York, H. [The MITRE Corporation, McLean, VA (US). JASON Program Office

    1999-11-23

    System-level flight tests are an important part of the overall effort by the United States to maintain confidence in the reliability, safety, and performance of its nuclear deterrent forces. This study of activities by the Department of Energy in support of operational tests by the Department of Defense was originally suggested by Dr. Rick Wayne, Director, National Security Programs, Sandia National Laboratory/Livermore, and undertaken at the request of the Department of Energy, Defense Programs Division. It follows two 1997 studies by JASON that focused on the Department of Energy's Enhanced Surveillance Program for the physics package — i.e. the nuclear warhead.

  6. A perspective on SIDS pathogenesis. The hypotheses: plausibility and evidence

    Directory of Open Access Journals (Sweden)

    Goldwater Paul N

    2011-05-01

    Several theories of the underlying mechanisms of Sudden Infant Death Syndrome (SIDS) have been proposed. These theories have borne relatively narrow beachhead research programs attracting generous research funding sustained for many years at expense to the public purse. This perspective endeavors to critically examine the evidence and bases of these theories and determine their plausibility, and questions whether or not a safe and reasoned hypothesis lies at their foundation. The Opinion sets specific criteria by asking the following questions: 1. Does the hypothesis take into account the key pathological findings in SIDS? 2. Is the hypothesis congruent with the key epidemiological risk factors? 3. Does it link 1 and 2? Falling short on any one of these questions would, by inference, imply insufficient grounds for a sustainable hypothesis. Some of the hypotheses overlap; for instance, notional respiratory failure may encompass apnea, prone sleep position, and asphyxia, which may be seen to be linked to co-sleeping. For the purposes of this paper, each element will be assessed on the above criteria.

  7. System-level musings about system-level science (Invited)

    Science.gov (United States)

    Liu, W.

    2009-12-01

    In teleology, a system has a purpose. In physics, a system has a tendency. For example, a mechanical system has a tendency to lower its potential energy. A thermodynamic system has a tendency to increase its entropy. Therefore, if geospace is seen as a system, what is its tendency? Surprisingly or not, there is no simple answer to this question. Or, to flip the statement, the answer is complex, or complexity. We can understand generally why complexity arises, as the geospace boundary is open to influences from the solar wind and Earth’s atmosphere, and components of the system couple to each other in a myriad of ways to make the systemic behavior highly nonlinear. But this still begs the question: What is the system-level approach to geospace science? A reductionist view might assert that as our understanding of a component or subsystem progresses to a certain point, we can couple some together to understand the system on a higher level. However, in practice, a subsystem can almost never be observed in isolation from the others. Even if that were possible, there is no guarantee that the subsystem behavior will not change when coupled to others. Hence, there is no guarantee that a subsystem, such as the ring current, has an innate and intrinsic behavior like a hydrogen atom. An absolutist conclusion from this logic can be sobering, as one would have to trace a flash of aurora to the nucleosynthesis in the solar core. The practical answer, however, is more promising; it is a mix of the common sense we call reductionism and an awareness that, especially when strongly coupled, subsystems can experience behavioral changes, breakdowns, and catastrophes. If the stock answer to the systemic tendency of geospace is complexity, the objective of the system-level approach to geospace science is to define, measure, and understand this complexity. I will use the example of magnetotail dynamics to illuminate some key points in this talk.

  8. Plausibility and evidence: the case of homeopathy.

    Science.gov (United States)

    Rutten, Lex; Mathie, Robert T; Fisher, Peter; Goossens, Maria; van Wassenhoven, Michel

    2013-08-01

    Homeopathy is controversial and hotly debated. The conclusions of systematic reviews of randomised controlled trials of homeopathy vary from 'comparable to conventional medicine' to 'no evidence of effects beyond placebo'. It is claimed that homeopathy conflicts with scientific laws and that homoeopaths reject the naturalistic outlook, but no evidence has been cited. We are homeopathic physicians and researchers who do not reject the scientific outlook; we believe that examination of the prior beliefs underlying this enduring stand-off can advance the debate. We show that interpretations of the same set of evidence, for homeopathy and for conventional medicine, can diverge. Prior disbelief in homeopathy is rooted in the perceived implausibility of any conceivable mechanism of action. Using the 'crossword analogy', we demonstrate that plausibility bias impedes assessment of the clinical evidence. Sweeping statements about the scientific impossibility of homeopathy are themselves unscientific: scientific statements must be precise and testable. There is growing evidence that homeopathic preparations can exert biological effects; due consideration of such research would reduce the influence of prior beliefs on the assessment of systematic review evidence.

  9. A cognitively plausible model for grammar induction

    Directory of Open Access Journals (Sweden)

    Roni Katzir

    2015-01-01

    This paper aims to bring theoretical linguistics and cognition-general theories of learning into closer contact. I argue that linguists' notions of rich UGs are well-founded, but that cognition-general learning approaches are viable as well, and that the two can and should co-exist and support each other. Specifically, I use the observation that any theory of UG provides a learning criterion, namely the total memory space used to store a grammar and its encoding of the input, that supports learning according to the principle of Minimum Description Length. This mapping from UGs to learners maintains a minimal ontological commitment: the learner for a particular UG uses only what is already required to account for linguistic competence in adults. I suggest that such learners should be our null hypothesis regarding the child's learning mechanism, and that, furthermore, the mapping from theories of UG to learners provides a framework for comparing theories of UG.
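
    The learning criterion described, two-part Minimum Description Length relative to a given UG, can be stated compactly (notation ours, not the paper's):

    ```latex
    % Choose the grammar minimising its own encoding length plus the
    % length of the data D as encoded by that grammar, both relative
    % to the grammars made available by the UG.
    G^{*} \;=\; \arg\min_{G \in \mathcal{G}_{\mathrm{UG}}} \bigl[\, |G| + |D : G| \,\bigr]
    ```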

  10. Analytic Models of Plausible Gravitational Lens Potentials

    Energy Technology Data Exchange (ETDEWEB)

    Baltz, Edward A.; Marshall, Phil; Oguri, Masamune

    2007-05-04

    Gravitational lenses on galaxy scales are plausibly modeled as having ellipsoidal symmetry and a universal dark matter density profile, with a Sersic profile to describe the distribution of baryonic matter. Predicting all lensing effects requires knowledge of the total lens potential: in this work we give analytic forms for that of the above hybrid model. Emphasizing that complex lens potentials can be constructed from simpler components in linear combination, we provide a recipe for attaining elliptical symmetry in either projected mass or lens potential. We also provide analytic formulae for the lens potentials of Sersic profiles of integer and half-integer index. We then present formulae describing the gravitational lensing effects of smoothly truncated universal density profiles in the cold dark matter model. For our isolated haloes the density profile falls off as radius to the minus fifth or seventh power beyond the tidal radius, functional forms that allow all orders of lens potential derivatives to be calculated analytically while ensuring a non-divergent total mass. We show how the observables predicted by this profile differ from those of the original infinite-mass NFW profile. Expressions for the gravitational flexion are highlighted. We show how decreasing the tidal radius allows stripped haloes to be modeled, providing a framework for a fuller investigation of dark matter substructure in galaxies and clusters. Finally, we remark on the need for finite-mass halo profiles when doing cosmological ray-tracing simulations, and the need for readily calculable higher-order derivatives of the lens potential when studying catastrophes in strong lenses.
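
    The truncation described is consistent with an NFW density multiplied by a smooth truncation factor; a hedged reconstruction of the functional form (normalisation details omitted):

    ```latex
    % NFW density times a truncation factor; for n = 1 the density falls
    % as r^{-5} beyond the tidal radius r_t, and for n = 2 as r^{-7},
    % giving finite total mass with analytic lens-potential derivatives.
    \rho(r) \;=\; \frac{\rho_s}{(r/r_s)\,(1 + r/r_s)^{2}}
    \left(\frac{r_t^{2}}{r^{2} + r_t^{2}}\right)^{\! n},
    \qquad n \in \{1, 2\}
    ```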

  11. Encoding the target or the plausible preview word? The nature of the plausibility preview benefit in reading Chinese.

    Science.gov (United States)

    Yang, Jinmian; Li, Nan; Wang, Suiping; Slattery, Timothy J; Rayner, Keith

    2014-01-01

    Previous studies have shown that a plausible preview word can facilitate the processing of a target word as compared to an implausible preview word (a plausibility preview benefit effect) when reading Chinese (Yang, Wang, Tong, & Rayner, 2012; Yang, 2013). Regarding the nature of this effect, it is possible that readers processed the meaning of the plausible preview word and did not actually encode the target word (given that the parafoveal preview word lies close to the fovea). The current experiment examined this possibility with three conditions wherein readers received a preview of a target word that was either (1) identical to the target word (identical preview), (2) a plausible continuation of the pre-target text, but with the post-target text in the sentence incompatible with it (initially plausible preview), or (3) not a plausible continuation of the pre-target text, nor compatible with the post-target text (implausible preview). Gaze durations on target words were longer in the initially plausible condition than in the identical condition. Overall, the results showed a typical preview benefit, but also implied that readers did not encode the initially plausible preview. Also, a plausibility preview benefit was replicated: gaze durations were longer with implausible previews than with the initially plausible ones. Furthermore, late eye movement measures did not reveal differences between the initially plausible and the implausible preview conditions, which argues against the possibility of misreading the plausible preview word as the target word. In sum, these results suggest that a plausible preview word provides a benefit in processing the target word as compared to an implausible preview word, and this benefit is present only in early but not late eye movement measures.

  12. Plausibility Judgments in Conceptual Change and Epistemic Cognition

    Science.gov (United States)

    Lombardi, Doug; Nussbaum, E. Michael; Sinatra, Gale M.

    2016-01-01

    Plausibility judgments rarely have been addressed empirically in conceptual change research. Recent research, however, suggests that these judgments may be pivotal to conceptual change about certain topics where a gap exists between what scientists and laypersons find plausible. Based on a philosophical and empirical foundation, this article…

  13. Source Effects and Plausibility Judgments When Reading about Climate Change

    Science.gov (United States)

    Lombardi, Doug; Seyranian, Viviane; Sinatra, Gale M.

    2014-01-01

    Gaps between what scientists and laypeople find plausible may act as a barrier to learning complex and/or controversial socioscientific concepts. For example, individuals may consider scientific explanations that human activities are causing current climate change as implausible. This plausibility judgment may be due, in part, to individuals'…

  14. Plausible cloth animation using dynamic bending model

    Institute of Scientific and Technical Information of China (English)

    Chuan Zhou; Xiaogang Jin; Charlie C.L. Wang; Jieqing Feng

    2008-01-01

    Simulating the mechanical behavior of cloth is a challenging and important problem in computer animation. Most existing cloth simulation approaches model bending on the assumption that the cloth deviates little from a plate shape. Based on thin-plate theory, these bending models therefore do not account for the fact that, under large deformations, the current shape of the cloth can no longer be regarded as an approximation to its shape before deformation, which leads to unrealistic static bending. This paper introduces a dynamic bending model that is appropriate for describing large out-of-plane deformations such as cloth buckling and bending, and develops a compact implementation of the new model on spring-mass systems. Experimental results show that wrinkles and folds generated with this technique in cloth simulation appear and vanish more naturally than in other approaches.

  15. Plausible values: how to deal with their limitations.

    Science.gov (United States)

    Monseur, Christian; Adams, Raymond

    2009-01-01

    Rasch modeling and plausible values methodology were used to scale and report the results of the Organization for Economic Cooperation and Development's Programme for International Student Assessment (PISA). This article describes the scaling approach adopted in PISA. In particular, it focuses on the use of plausible values, a multiple imputation approach that is now commonly used in large-scale assessment. As with all imputation models, the plausible values must be generated using models that are consistent with those used in subsequent data analysis. In the case of PISA, the plausible value generation assumes a flat linear regression with all students' background variables collected through the international student questionnaire included as regressors. Further, like most linear models, homoscedasticity and normality of the conditional variance are assumed. This article explores some of the implications of this approach. First, we discuss the conditions under which secondary analyses on variables not included in the model for generating the plausible values might be biased. Second, as plausible values were not drawn from a multi-level model, the article explores the adequacy of the PISA procedures for estimating variance components when the data have a hierarchical structure.

  16. Of paradox and plausibility: the dynamic of change in medical law.

    Science.gov (United States)

    Harrington, John

    2014-01-01

    This article develops a model of change in medical law. Drawing on systems theory, it argues that medical law participates in a dynamic of 'deparadoxification' and 'reparadoxification' whereby the underlying contingency of the law is variously concealed through plausible argumentation, or revealed by critical challenge. Medical law is, thus, thoroughly rhetorical. An examination of the development of the law on abortion and on the sterilization of incompetent adults shows that plausibility is achieved through the deployment of substantive common sense and formal stylistic devices. It is undermined where these elements are shown to be arbitrary and constructed. In conclusion, it is argued that the politics of medical law are constituted by this antagonistic process of establishing and challenging provisionally stable normative regimes.

  17. Cultural group selection is plausible, but the predictions of its hypotheses should be tested with real-world data.

    Science.gov (United States)

    Turchin, Peter; Currie, Thomas E

    2016-01-01

    The evidence compiled in the target article demonstrates that the assumptions of cultural group selection (CGS) theory are often met, and it is therefore a useful framework for generating plausible hypotheses. However, more can be said about how we can test the predictions of CGS hypotheses against competing explanations using historical, archaeological, and anthropological data.

  18. Classification using sparse representations: a biologically plausible approach.

    Science.gov (United States)

    Spratling, M W

    2014-02-01

    Representing signals as linear combinations of basis vectors sparsely selected from an overcomplete dictionary has proven to be advantageous for many applications in pattern recognition, machine learning, signal processing, and computer vision. While this approach was originally inspired by insights into cortical information processing, biologically plausible approaches have been limited to exploring the functionality of early sensory processing in the brain, while more practical applications have employed non-biologically plausible sparse coding algorithms. Here, a biologically plausible algorithm is proposed that can be applied to practical problems. This algorithm is evaluated using standard benchmark tasks in the domain of pattern classification, and its performance is compared to a wide range of alternative algorithms that are widely used in signal and image processing. The results show that for the classification tasks performed here, the proposed method is competitive with the best of the alternative algorithms that have been evaluated. This demonstrates that classification using sparse representations can be performed in a neurally plausible manner, and hence, that this mechanism of classification might be exploited by the brain.

  19. System level ESD co-design

    CERN Document Server

    Gossner, Harald

    2015-01-01

    An effective and cost-efficient protection of electronic systems against ESD stress pulses specified by IEC 61000-4-2 is paramount for any system design. This pioneering book presents the collective knowledge of system designers and system testing experts and state-of-the-art techniques for achieving efficient system-level ESD protection with minimum impact on system performance. All categories of system failures, ranging from 'hard' to 'soft' types, are considered in reviewing the simulation and tool applications that can be used. The principal focus of System Level ESD Co-Design is defining and establishing the importance of co-design efforts from both IC supplier and system builder perspectives. ESD designers often face challenges in meeting customers' system-level ESD requirements and, therefore, a clear understanding of the techniques presented here will facilitate effective simulation approaches leading to better solutions without compromising system performance. With contributions from Robert Asht...

  20. Families of Plausible Solutions to the Puzzle of Boyajian's Star

    CERN Document Server

    Wright, Jason T

    2016-01-01

    Good explanations for the unusual light curve of Boyajian's Star have been hard to find. Recent results by Montet & Simon lend strength and plausibility to the conclusion of Schaefer that in addition to short-term dimmings, the star also experiences large, secular decreases in brightness on decadal timescales. This, combined with a lack of long-wavelength excess in the star's spectral energy distribution, strongly constrains scenarios involving circumstellar material, including hypotheses invoking a spherical cloud of artifacts. We show that the timings of the deepest dimmings appear consistent with being randomly distributed, and that the star's reddening and narrow sodium absorption is consistent with the total, long-term dimming observed. Following Montet & Simon's encouragement to generate alternative hypotheses, we attempt to circumscribe the space of possible explanations with a range of plausibilities, including: a cloud in the outer solar system, structure in the ISM, natural and artificial ma...

  1. Representations of physical plausibility revealed by event-related potentials.

    Science.gov (United States)

    Roser, Matthew E; Fugelsang, Jonathan A; Handy, Todd C; Dunbar, Kevin N; Gazzaniga, Michael S

    2009-08-05

    Maintaining an accurate mental representation of the current environment is crucial to detecting change in that environment and ensuring behavioral coherence. Past experience with interactions between objects, such as collisions, has been shown to influence the perception of object interactions. To assess whether mental representations of object interactions derived from experience influence the maintenance of a mental model of the current stimulus environment, we presented physically plausible and implausible collision events while recording brain electrical activity. The parietal P300 response to 'oddball' events was found to be modulated by the physical plausibility of the stimuli, suggesting that past experience of object interactions can influence working memory processes involved in monitoring ongoing changes to the environment.

  2. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provid
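
    A minimal sketch of the central representational idea, a belief network whose joint distribution factorises over local conditional tables, queried by enumeration (the network and numbers are a textbook-style illustration, not taken from the book):

    ```python
    from itertools import product

    # Sprinkler network: Rain -> WetGrass <- Sprinkler. The joint
    # factorises as P(r) * P(s) * P(w | r, s), the core idea of belief nets.
    P_rain = {True: 0.2, False: 0.8}
    P_sprinkler = {True: 0.1, False: 0.9}
    P_wet = {  # P(wet = True | rain, sprinkler)
        (True, True): 0.99, (True, False): 0.9,
        (False, True): 0.85, (False, False): 0.05,
    }

    def joint(r: bool, s: bool, w: bool) -> float:
        pw = P_wet[(r, s)]
        return P_rain[r] * P_sprinkler[s] * (pw if w else 1.0 - pw)

    # Query P(rain | wet grass) by summing the joint over the hidden variable.
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
    print(num / den)  # posterior belief in rain given wet grass
    ```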

  3. Complex Learning in Bio-plausible Memristive Networks

    OpenAIRE

    Deng, Lei; Li, Guoqi; Deng, Ning; Wang, Dong; Zhang, Ziyang; He, Wei; Li, Huanglong; Pei, Jing; Shi, Luping

    2015-01-01

    The emerging memristor-based neuromorphic engineering promises an efficient computing paradigm. However, the lack of both internal dynamics in the previous feedforward memristive networks and efficient learning algorithms in recurrent networks, fundamentally limits the learning ability of existing systems. In this work, we propose a framework to support complex learning functions by introducing dedicated learning algorithms to a bio-plausible recurrent memristive network with internal dynamic...

  4. Features, Events, and Processes: system Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  5. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    …, called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we…

  6. Towards a system-level science support

    NARCIS (Netherlands)

    Gubala, T.; Kasztelnik, M.; Malawski, M.; Bubak, M.

    2008-01-01

    Recently, there has been a growing need for an information technology solution to support a new methodology of scientific investigation, called system-level science. This paper presents a new approach to the development and execution of collaborative applications. These applications are built as experiment…

  7. Plausible scenarios for the radiography profession in Sweden in 2025.

    Science.gov (United States)

    Björkman, B; Fridell, K; Tavakol Olofsson, P

    2017-11-01

    Radiography is a healthcare speciality with many technical challenges. Advances in engineering and information technology applications may continue to drive and be driven by radiographers. The world of diagnostic imaging is changing rapidly and radiographers must be proactive in order to survive. To ensure sustainable development, organisations have to identify future opportunities and threats in a timely manner and incorporate them into their strategic planning. Hence, the aim of this study was to analyse and describe plausible scenarios for the radiography profession in 2025. The study has a qualitative design with an inductive approach based on focus group interviews. The interviews were inspired by the Scenario-Planning method. Of the seven trends identified in a previous study, the radiographers considered two as the most uncertain scenarios that would have the greatest impact on the profession should they occur. These trends, labelled "Access to career advancement" and "A sufficient number of radiographers", were inserted into the scenario cross. The resulting four plausible future scenarios were: The happy radiographer, the specialist radiographer, the dying profession and the assembly line. It is suggested that "The dying profession" scenario could probably be turned in the opposite direction by facilitating career development opportunities for radiographers within the profession. Changing the direction would probably lead to a profession composed of "happy radiographers" who are specialists, proud of their profession and competent to carry out advanced tasks, in contrast to being solely occupied by "the assembly line". Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  8. Prebiotically plausible mechanisms increase compositional diversity of nucleic acid sequences.

    Science.gov (United States)

    Derr, Julien; Manapat, Michael L; Rajamani, Sudha; Leu, Kevin; Xulvi-Brunet, Ramon; Joseph, Isaac; Nowak, Martin A; Chen, Irene A

    2012-05-01

    During the origin of life, the biological information of nucleic acid polymers must have increased to encode functional molecules (the RNA world). Ribozymes tend to be compositionally unbiased, as is the vast majority of possible sequence space. However, ribonucleotides vary greatly in synthetic yield, reactivity and degradation rate, and their non-enzymatic polymerization results in compositionally biased sequences. While natural selection could lead to complex sequences, molecules with some activity are required to begin this process. Was the emergence of compositionally diverse sequences a matter of chance, or could prebiotically plausible reactions counter chemical biases to increase the probability of finding a ribozyme? Our in silico simulations using a two-letter alphabet show that template-directed ligation and high concatenation rates counter compositional bias and shift the pool toward longer sequences, permitting greater exploration of sequence space and stable folding. We verified experimentally that unbiased DNA sequences are more efficient templates for ligation, thus increasing the compositional diversity of the pool. Our work suggests that prebiotically plausible chemical mechanisms of nucleic acid polymerization and ligation could predispose toward a diverse pool of longer, potentially structured molecules. Such mechanisms could have set the stage for the appearance of functional activity very early in the emergence of life.
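
    A toy flavour of the two-letter in silico setup: a compositionally biased monomer pool grows by ligation, with success gated on product composition to caricature the finding that unbiased sequences template more efficiently (all rates, sizes, and the gating rule are invented for illustration):

    ```python
    import random

    random.seed(1)

    # Two-letter pool seeded with compositionally biased monomers (70% 'A'),
    # mimicking biased non-enzymatic synthesis.
    pool = ["A" if random.random() < 0.7 else "B" for _ in range(2000)]

    def balance(seq):
        # 1.0 for a 50/50 composition, 0.0 for a homopolymer.
        frac_a = seq.count("A") / len(seq)
        return 1.0 - abs(2.0 * frac_a - 1.0)

    def ligation_step(pool, n_attempts):
        # Caricature of template-directed ligation: joining two random
        # sequences succeeds with probability equal to the product's
        # compositional balance.
        for _ in range(n_attempts):
            if len(pool) < 2:
                return
            i, j = random.sample(range(len(pool)), 2)
            joined = pool[i] + pool[j]
            if random.random() < balance(joined):
                for k in sorted((i, j), reverse=True):
                    pool.pop(k)
                pool.append(joined)

    for step in range(10):
        ligation_step(pool, n_attempts=300)
        long_seqs = [s for s in pool if len(s) >= 4]
        if long_seqs:
            frac_a = sum(s.count("A") for s in long_seqs) / sum(map(len, long_seqs))
            print(step, len(long_seqs), round(frac_a, 3))
    ```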

  9. Traffic Modeling in WCDMA System Level Simulations

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traffic modeling is a crucial element in WCDMA system-level simulations. A clear understanding of the nature of traffic in the WCDMA system and subsequent selection of an appropriate random traffic model are critical to the success of the modeling enterprise. The resulting performance will evidently be a function of how well our design is adapted to the traffic, channel, and user mobility models, and of how accurate these models are. In this article, our attention is focused on modeling voice and WWW data traffic with the SBBP model and the Victor model, respectively.
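
    A minimal sketch of a two-state switched batch Bernoulli source of the kind the SBBP model generalises (all parameters are invented for illustration):

    ```python
    import random

    random.seed(0)

    # Switched Batch Bernoulli Process, simplified: a two-state Markov
    # chain modulates the per-slot arrival process. Each slot the source
    # stays or switches state, then emits a batch whose size distribution
    # depends on the current state.
    STAY = {0: 0.95, 1: 0.80}        # P(remain in state) per slot
    ARRIVAL_P = {0: 0.1, 1: 0.9}     # per-slot probability of a batch
    BATCH_MAX = {0: 1, 1: 8}         # batch size ~ Uniform{1..max} if one occurs

    def sbbp(n_slots: int):
        state, out = 0, []
        for _ in range(n_slots):
            if random.random() > STAY[state]:
                state = 1 - state
            if random.random() < ARRIVAL_P[state]:
                out.append(random.randint(1, BATCH_MAX[state]))
            else:
                out.append(0)
        return out

    trace = sbbp(10_000)
    print(sum(trace) / len(trace))  # mean packets per slot
    ```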

  10. Neural networks, nativism, and the plausibility of constructivism.

    Science.gov (United States)

    Quartz, S R

    1993-09-01

    Recent interest in PDP (parallel distributed processing) models is due in part to the widely held belief that they challenge many of the assumptions of classical cognitive science. In the domain of language acquisition, for example, there has been much interest in the claim that PDP models might undermine nativism. Related arguments based on PDP learning have also been given against Fodor's anti-constructivist position, a position that has contributed to the widespread dismissal of constructivism. A limitation of many of the claims regarding PDP learning, however, is that the principles underlying this learning have not been rigorously characterized. In this paper, I examine PDP models from within the framework of Valiant's PAC (probably approximately correct) model of learning, now the dominant model in machine learning, which applies naturally to neural network learning. From this perspective, I evaluate the implications of PDP models for nativism and Fodor's influential anti-constructivist position. In particular, I demonstrate that, contrary to a number of claims, PDP models are nativist in a robust sense. I also demonstrate that PDP models actually serve as a good illustration of Fodor's anti-constructivist position. While these results may at first suggest that neural network models in general are incapable of the sort of concept acquisition that is required to refute Fodor's anti-constructivist position, I suggest that there is an alternative form of neural network learning that demonstrates the plausibility of constructivism. This alternative form of learning is a natural interpretation of the constructivist position in terms of neural network learning, as it employs learning algorithms that incorporate the addition of structure in addition to weight modification schemes. By demonstrating that there is a natural and plausible interpretation of constructivism in terms of neural network learning, the position that nativism is the only plausible model of…

  11. On the biological plausibility of Wind Turbine Syndrome.

    Science.gov (United States)

    Harrison, Robert V

    2015-01-01

    An emerging environmental health issue relates to potential ill-effects of wind turbine noise. There have been numerous suggestions that the low-frequency acoustic components in wind turbine signals can cause symptoms associated with vestibular system disorders, namely vertigo, nausea, and nystagmus. This constellation of symptoms has been labeled as Wind Turbine Syndrome, and has been identified in case studies of individuals living close to wind farms. This review discusses whether it is biologically plausible for the turbine noise to stimulate the vestibular parts of the inner ear and, by extension, cause Wind Turbine Syndrome. We consider the sound levels that can activate the semicircular canals or otolith end organs in normal subjects, as well as in those with preexisting conditions known to lower vestibular threshold to sound stimulation.

  12. Hamiltonian formulation of time-dependent plausible inference

    CERN Document Server

    Davis, Sergio

    2014-01-01

    Maximization of the path information entropy is a clear prescription for performing time-dependent plausible inference. Here it is shown that, following this prescription under the assumption of arbitrary instantaneous constraints on position and velocity, a Lagrangian emerges which determines the most probable trajectory. Deviations from the probability maximum can be consistently described as slices in time by a Hamiltonian, according to a nonlinear Langevin equation and its associated Fokker-Planck equation. The connections unveiled between the maximization of path entropy and the Langevin/Fokker-Planck equations imply that missing information about the phase space coordinate never decreases in time, a purely information-theoretical version of the Second Law of Thermodynamics. All of these results are independent of any physical assumptions, and thus valid for any generalized coordinate as a function of time, or any other parameter. This reinforces the view that the Second Law is a fundamental property of ...
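
    The Langevin and Fokker-Planck equations referred to take their standard forms, shown here for a single coordinate $x$ with drift $f$ and diffusion constant $D$ (the paper's generalized-coordinate version is analogous):

$$ \dot{x} = f(x,t) + \eta(t), \qquad \langle \eta(t)\,\eta(t') \rangle = 2D\,\delta(t-t'), $$

$$ \frac{\partial P(x,t)}{\partial t} = -\frac{\partial}{\partial x}\left[ f(x,t)\,P(x,t) \right] + D\,\frac{\partial^{2} P(x,t)}{\partial x^{2}}. $$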

  13. Alkaloids from Pandanus amaryllifolius: Isolation and Their Plausible Biosynthetic Formation.

    Science.gov (United States)

    Tsai, Yu-Chi; Yu, Meng-Lun; El-Shazly, Mohamed; Beerhues, Ludger; Cheng, Yuan-Bin; Chen, Lei-Chin; Hwang, Tsong-Long; Chen, Hui-Fen; Chung, Yu-Ming; Hou, Ming-Feng; Wu, Yang-Chang; Chang, Fang-Rong

    2015-10-23

    Pandanus amaryllifolius Roxb. (Pandanaceae) is used as a flavoring and in folk medicine in Southeast Asia. The ethanolic crude extract of the aerial parts of P. amaryllifolius exhibited antioxidant, antibiofilm, and anti-inflammatory activities in previous studies. In the current investigation, the purification of the ethanolic extract yielded nine new compounds, including N-acetylnorpandamarilactonines A (1) and B (2); pandalizines A (3) and B (4); pandanmenyamine (5); pandamarilactones 2 (6) and 3 (7), and 5(E)-pandamarilactonine-32 (8); and pandalactonine (9). The isolated alkaloids, with either a γ-alkylidene-α,β-unsaturated-γ-lactone or γ-alkylidene-α,β-unsaturated-γ-lactam system, can be classified into five skeletons including norpandamarilactonine, indolizinone, pandanamine, pandamarilactone, and pandamarilactonine. A plausible biosynthetic route toward 1-5, 7, and 9 is proposed.

  14. Complex Learning in Bio-plausible Memristive Networks.

    Science.gov (United States)

    Deng, Lei; Li, Guoqi; Deng, Ning; Wang, Dong; Zhang, Ziyang; He, Wei; Li, Huanglong; Pei, Jing; Shi, Luping

    2015-06-19

    The emerging memristor-based neuromorphic engineering promises an efficient computing paradigm. However, the lack of both internal dynamics in the previous feedforward memristive networks and efficient learning algorithms in recurrent networks, fundamentally limits the learning ability of existing systems. In this work, we propose a framework to support complex learning functions by introducing dedicated learning algorithms to a bio-plausible recurrent memristive network with internal dynamics. We fabricate iron oxide memristor-based synapses, with well controllable plasticity and a wide dynamic range of excitatory/inhibitory connection weights, to build the network. To adaptively modify the synaptic weights, the comprehensive recursive least-squares (RLS) learning algorithm is introduced. Based on the proposed framework, the learning of various timing patterns and a complex spatiotemporal pattern of human motor is demonstrated. This work paves a new way to explore the brain-inspired complex learning in neuromorphic systems.
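
    The RLS read-out update at the heart of such training loops takes a standard form; below is a minimal sketch (random stand-in activity instead of a simulated memristive network, and illustrative sizes and target; not the authors' hardware-calibrated implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100               # number of recurrent units
w = np.zeros(N)       # read-out weights (the trained "synapses")
P = np.eye(N)         # running estimate of the inverse correlation matrix

def rls_step(r, target):
    """One recursive least-squares update for rate vector r and desired output."""
    global w, P
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)   # gain vector
    e = w @ r - target        # error before the update
    w -= e * k                # weight update
    P -= np.outer(k, Pr)      # inverse-correlation update (P is symmetric)

# Train the read-out to reproduce a simple timing pattern.
for t in np.linspace(0.0, 2.0 * np.pi, 500):
    r = np.tanh(rng.normal(size=N))   # stand-in for recurrent activity
    rls_step(r, np.sin(t))            # teach a periodic target
```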

  15. System Level Analysis of LTE-Advanced

    DEFF Research Database (Denmark)

    Wang, Yuanye

    This PhD thesis focuses on system level analysis of Multi-Component Carrier (CC) management for Long Term Evolution (LTE)-Advanced. Cases where multiple CCs are aggregated to form a larger bandwidth are studied. The analysis is performed for both local area and wide area networks. In local area ..., Time Division Duplexing (TDD) is chosen as the duplexing mode in this study. The performance with different network time synchronization levels is compared, and it is observed that achieving time synchronization significantly improves the uplink performance without penalizing much of the downlink. ... i.e., some users can access all CCs (LTE-Advanced users), whereas some are restricted to operate within a single CC (release 8 users). First, load balancing across the multiple CCs is analyzed. Several known approaches are studied and the best one is identified. A cross-CC packet scheduler is afterwards ...

  16. A neurophysiologically plausible population code model for feature integration explains visual crowding.

    Directory of Open Access Journals (Sweden)

    Ronald van den Berg

    2010-01-01

    An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.

  17. A neurophysiologically plausible population code model for feature integration explains visual crowding.

    Science.gov (United States)

    van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W

    2010-01-22

    An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
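
    The core mechanism (orientation signals carried by tuned populations, pooled over space, then decoded) can be caricatured in a few lines. The sketch below uses illustrative tuning widths and a crude winner-take-all read-out, not the authors' full model, but it reproduces "compulsory averaging" of a target and a flanker:

```python
import numpy as np

prefs = np.linspace(-90.0, 90.0, 181)   # preferred orientations (degrees)

def pop_response(theta, sigma=25.0):
    """Gaussian-tuned population response to an oriented stimulus."""
    return np.exp(-0.5 * ((prefs - theta) / sigma) ** 2)

def decode(r):
    """Read the perceived orientation out of a population response."""
    return prefs[np.argmax(r)]

target, flanker = -20.0, 20.0
pooled = pop_response(target) + pop_response(flanker)   # spatial integration

print(decode(pop_response(target)))   # -20.0: isolated target read correctly
print(decode(pooled))                 #   0.0: pooled code reports the average
```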

  18. Is the de Broglie-Bohm interpretation of quantum mechanics really plausible?

    Science.gov (United States)

    Jung, Kurt

    2013-06-01

    Bohmian mechanics, also known as de Broglie-Bohm theory, is the most popular alternative approach to quantum mechanics. Whereas the standard interpretation of quantum mechanics is based on the complementarity principle, Bohmian mechanics assumes that both particle and wave are concrete physical objects. In 1993 Peter Holland wrote an ardent account of the plausibility of the de Broglie-Bohm theory. He proved that it fully reproduces quantum mechanics if the initial particle distribution is consistent with a solution of the Schrödinger equation. What, then, might be the reasons that Bohmian mechanics has not yet found global acceptance? In this article it will be shown that predicted properties of atoms and molecules are in conflict with experimental findings. Moreover it will be demonstrated that repeatedly published ensembles of trajectories illustrating double-slit diffraction processes do not agree with quantum mechanics. The credibility of a theory is undermined when recognizably wrong data, presented repeatedly over years, are never declared obsolete.

  19. Preventive leadership for the university: a plausible experience

    Directory of Open Access Journals (Sweden)

    Alejandro Rodríguez Rodríguez

    2015-06-01

    Leadership development in higher education seeks immediately applicable solutions for the contexts in which every leader operates, but the theoretical and practical grounding of leader formation, which would make it possible to understand the intellective processes at work during decision making, gets diluted. The paradigm of convergence between the Lonerganian anthropological method, the Vygotskian learning community, and a rereading of the Salesian preventive system is presented as a plausible proposal for forming preventive leadership among the various actors of a university community. A case study of the Salesian University in Mexico, employing a mixed research method, facilitates a rereading of leadership from a preventive perspective as a possibility of convergence in an interdisciplinary dialogue. The theoretical and practical results proposed and examined prove to be a useful tool for evaluating, enriching, and renewing theory about the leader and leadership development in universities facing a globalized society.

  20. A plausible explanation for male dominance in typhoid ileal perforation

    Directory of Open Access Journals (Sweden)

    Khan M

    2012-11-01

    Mohammad Khan, Department of Microbiology, College of Medicine, Chichiri, Blantyre, Malawi. Abstract: The phenomenon of consistent male dominance in typhoid ileal perforation (TIP) is not well understood. It cannot be explained on the basis of microbial virulence, Peyer's patch anatomy, ileal wall thickness, gastric acidity, host genetic factors, or sex-linked bias in hospital attendance. The cytokine response to an intestinal infection in males is predominantly proinflammatory as compared with that in females, presumably due to differences in the sex hormonal milieu. Sex hormone receptors have been detected on lymphocytes and macrophages, including on Peyer's patches, inflammation of which (probably similar to the Shwartzman reaction/Koch phenomenon) is the forerunner of TIP, and is not excluded from the regulatory effects of sex hormones. Hormonal control of host-pathogen interaction may override genetic control. Environmental exposure to Salmonella typhi may be more frequent in males, presumably due to sex-linked differences in hygiene practices and dining-out behavior. A plausible explanation of male dominance in TIP could include sex-linked differences in the degree of natural exposure of Peyer's patches to S. typhi. An alternative explanation may include sexual dimorphism in host inflammatory response patterns in Peyer's patches that have been induced by S. typhi. Both hypotheses are testable. Keywords: explanation, dominance, male, perforation, ileum, typhoid

  1. A plausible explanation for male dominance in typhoid ileal perforation.

    Science.gov (United States)

    Khan, Mohammad

    2012-01-01

    The phenomenon of consistent male dominance in typhoid ileal perforation (TIP) is not well understood. It cannot be explained on the basis of microbial virulence, Peyer's patch anatomy, ileal wall thickness, gastric acidity, host genetic factors, or sex-linked bias in hospital attendance. The cytokine response to an intestinal infection in males is predominantly proinflammatory as compared with that in females, presumably due to differences in the sex hormonal milieu. Sex hormone receptors have been detected on lymphocytes and macrophages, including on Peyer's patches, inflammation of which (probably similar to the Shwartzman reaction/Koch phenomenon) is the forerunner of TIP, and is not excluded from the regulatory effects of sex hormones. Hormonal control of host-pathogen interaction may override genetic control. Environmental exposure to Salmonella typhi may be more frequent in males, presumably due to sex-linked differences in hygiene practices and dining-out behavior. A plausible explanation of male dominance in TIP could include sex-linked differences in the degree of natural exposure of Peyer's patches to S. typhi. An alternative explanation may include sexual dimorphism in host inflammatory response patterns in Peyer's patches that have been induced by S. typhi. Both hypotheses are testable.

  2. Plausible rice yield losses under future climate warming.

    Science.gov (United States)

    Zhao, Chuang; Piao, Shilong; Wang, Xuhui; Huang, Yao; Ciais, Philippe; Elliott, Joshua; Huang, Mengtian; Janssens, Ivan A; Li, Tao; Lian, Xu; Liu, Yongwen; Müller, Christoph; Peng, Shushi; Wang, Tao; Zeng, Zhenzhong; Peñuelas, Josep

    2016-12-19

    Rice is the staple food for more than 50% of the world's population (refs 1-3). Reliable prediction of changes in rice yield is thus central for maintaining global food security. This is an extraordinary challenge. Here, we compare the sensitivity of rice yield to temperature increase derived from field warming experiments and three modelling approaches: statistical models, local crop models and global gridded crop models. Field warming experiments produce a substantial rice yield loss under warming, with an average temperature sensitivity of -5.2 ± 1.4% K⁻¹. Local crop models give a similar sensitivity (-6.3 ± 0.4% K⁻¹), but statistical and global gridded crop models both suggest less negative impacts of warming on yields (-0.8 ± 0.3% and -2.4 ± 3.7% K⁻¹, respectively). Using data from field warming experiments, we further propose a conditional probability approach to constrain the large range of global gridded crop model results for the future yield changes in response to warming by the end of the century (from -1.3% to -9.3% K⁻¹). The constraint implies a more negative response to warming (-8.3 ± 1.4% K⁻¹) and reduces the spread of the model ensemble by 33%. This yield reduction exceeds that estimated by the International Food Policy Research Institute assessment (-4.2 to -6.4% K⁻¹) (ref. 4). Our study suggests that without CO2 fertilization, effective adaptation and genetic improvement, severe rice yield losses are plausible under intensive climate warming scenarios.

  3. A biologically plausible embodied model of action discovery

    Directory of Open Access Journals (Sweden)

    Rufino eBolado-Gomez

    2013-03-01

    During development, animals can spontaneously discover action-outcome pairings enabling subsequent achievement of their goals. We present a biologically plausible embodied model addressing key aspects of this process. The biomimetic model core comprises the basal ganglia and its loops through cortex and thalamus. We incorporate reinforcement learning with phasic dopamine supplying a sensory prediction error, signalling 'surprising' outcomes. Phasic dopamine is used in a corticostriatal learning rule which is consistent with recent data. We also hypothesised that objects associated with surprising outcomes acquire 'novelty salience' contingent on the predictability of the outcome. To test this idea we used a simple model of prediction governing the dynamics of novelty salience and phasic dopamine. The task of the virtual robotic agent mimicked an in vivo counterpart (Gancarz et al., 2011) and involved interaction with a target object which caused a light flash, or a control object which did not. Learning took place according to two schedules. In one, the phasic outcome was delivered after interaction with the target in an unpredictable way which emulated the in vivo protocol. Without novelty salience, the model was unable to account for the experimental data. In the other schedule, the phasic outcome was reliably delivered and the agent showed a rapid increase in the number of interactions with the target which then decreased over subsequent sessions. We argue this is precisely the kind of change in behaviour required to repeatedly present representations of context, action and outcome, to neural networks responsible for learning action-outcome contingency. The model also showed corticostriatal plasticity consistent with learning a new action in basal ganglia. We conclude that action learning is underpinned by a complex interplay of plasticity and stimulus salience, and that our model contains many of the elements for biological action discovery to take place.
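
    The prediction/salience interplay described can be caricatured with a Rescorla-Wagner-style update (an illustrative reading with invented parameters; not the paper's equations):

```python
alpha = 0.3        # learning rate (illustrative)
prediction = 0.0   # predicted outcome for interacting with the target object

def trial(outcome):
    """One interaction: return the phasic (prediction-error) signal and the
    resulting novelty salience of the object."""
    global prediction
    surprise = outcome - prediction   # sensory prediction error
    prediction += alpha * surprise    # update the outcome prediction
    return surprise, abs(surprise)    # dopamine-like signal, salience

# Reliable schedule: the light flash always follows interaction, so the
# prediction converges and novelty salience (hence behaviour) falls away.
for t in range(10):
    dopamine, salience = trial(outcome=1.0)
    print(f"trial {t}: phasic signal = {dopamine:.3f}, salience = {salience:.3f}")
```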

  4. Changing beliefs about implausible autobiographical events: a little plausibility goes a long way.

    Science.gov (United States)

    Mazzoni, G A; Loftus, E F; Kirsch, I

    2001-03-01

    Three experiments investigated the malleability of perceived plausibility and the subjective likelihood of occurrence of plausible and implausible events among participants who had no recollection of experiencing them. In Experiment 1, a plausibility-enhancing manipulation (reading accounts of the occurrence of events) combined with a personalized suggestion increased the perceived plausibility of the implausible event, as well as participants' ratings of the likelihood that they had experienced it. Plausibility and likelihood ratings were uncorrelated. Subsequent studies showed that the plausibility manipulation alone was sufficient to increase likelihood ratings but only if the accounts that participants read were set in a contemporary context. These data suggest that false autobiographical beliefs can be induced in clinical and forensic contexts even for initially implausible events.

  5. Plausible inference and the interpretation of quantitative data

    Energy Technology Data Exchange (ETDEWEB)

    Nakhleh, C.W.

    1998-02-01

    The analysis of quantitative data is central to scientific investigation. Probability theory, which is founded on two rules, the sum and product rules, provides the unique, logically consistent method for drawing valid inferences from quantitative data. This primer on the use of probability theory is meant to fulfill a pedagogical purpose. The discussion begins at the foundation of scientific inference by showing how the sum and product rules of probability theory follow from some very basic considerations of logical consistency. The authors then develop general methods of probability theory that are essential to the analysis and interpretation of data. They discuss how to assign probability distributions using the principle of maximum entropy, how to estimate parameters from data, how to handle nuisance parameters whose values are of little interest, and how to determine which of a set of models is most justified by a data set. All these methods are used together in most realistic data analyses. Examples are given throughout to illustrate the basic points.
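
    The two rules in question, and Bayes' theorem as their immediate consequence, take the standard forms

$$ p(A \mid C) + p(\bar{A} \mid C) = 1 \qquad \text{(sum rule)}, $$

$$ p(AB \mid C) = p(A \mid BC)\,p(B \mid C) \qquad \text{(product rule)}, $$

    and, since the product rule can equally be factored as $p(AB \mid C) = p(B \mid AC)\,p(A \mid C)$,

$$ p(A \mid BC) = \frac{p(B \mid AC)\,p(A \mid C)}{p(B \mid C)} \qquad \text{(Bayes' theorem)}. $$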

  6. Plausible Reasoning in Modular Robotics and Human Reasoning

    Directory of Open Access Journals (Sweden)

    Claudiu Pozna

    2007-12-01

    The present paper continues research on cognitive system design. The goal of the paper is to illustrate the variety of models which can be constructed using the Bayesian plausible reasoning theory. The first case study develops a classical inverse kinematical model into a Bayesian model. The second case study models the human reasoning presented by the famous story of Sun Tzu: 'Advance to Chengang by a hidden path'.

  7. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  8. System-level modeling of MEMS v.10

    CERN Document Server

    Schrag, Gabriele; Hierold, Christofer; Korvink, Jan G

    2012-01-01

    System-level modeling of MEMS - microelectromechanical systems - comprises integrated approaches to simulate, understand, and optimize the performance of sensors, actuators, and microsystems, taking into account the intricacies of the interplay between mechanical and electrical properties, circuitry, packaging, and design considerations. Thereby, system-level modeling overcomes the limitations inherent to methods that focus only on one of these aspects and do not incorporate their mutual dependencies. The book addresses the two most important approaches of system-level modeling, namely physics

  9. Quantum theory as plausible reasoning applied to data obtained by robust experiments

    NARCIS (Netherlands)

    De Raedt, H.; Katsnelson, M. I.; Michielsen, K.

    2016-01-01

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent ...

  11. Developmental dynamics: toward a biologically plausible evolutionary psychology.

    Science.gov (United States)

    Lickliter, Robert; Honeycutt, Hunter

    2003-11-01

    There has been a conceptual revolution in the biological sciences over the past several decades. Evidence from genetics, embryology, and developmental biology has converged to offer a more epigenetic, contingent, and dynamic view of how organisms develop. Despite these advances, arguments for the heuristic value of a gene-centered, predeterministic approach to the study of human behavior and development have become increasingly evident in the psychological sciences during this time. In this article, the authors review recent advances in genetics, embryology, and developmental biology that have transformed contemporary developmental and evolutionary theory and explore how these advances challenge gene-centered explanations of human behavior that ignore the complex, highly coordinated system of regulatory dynamics involved in development and evolution.

  12. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation

    National Research Council Canada - National Science Library

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility...

  13. Choosing diverse sets of plausible scenarios in multidimensional exploratory futures techniques

    NARCIS (Netherlands)

    Lord, Steven; Helfgott, Ariella; Vervoort, Joost M.

    2016-01-01

    Morphological analysis allows any number of dimensions to be retained when framing future conditions, and techniques within morphological analysis determine which combinations of those dimensions represent plausible futures. However, even a relatively low number of dimensions in future conditions ...

  14. Program Theory Evaluation: Logic Analysis

    Science.gov (United States)

    Brousselle, Astrid; Champagne, Francois

    2011-01-01

    Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.

  15. System-Level Autonomy Trust Enabler (SLATE) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR project will achieve trusted, reconfigurable, intelligent autonomy through system-level validation. The goal is to design and develop a representation and...

  16. Interactions between visual and motor areas during the recognition of plausible actions as revealed by magnetoencephalography.

    Science.gov (United States)

    Pavlidou, Anastasia; Schnitzler, Alfons; Lange, Joachim

    2014-02-01

    Several studies have shown activation of the mirror neuron system (MNS), comprising the temporal, posterior parietal, and sensorimotor areas when observing plausible actions, but far less is known on how these cortical areas interact during the recognition of a plausible action. Here, we recorded neural activity with magnetoencephalography while subjects viewed point-light displays of biologically plausible and scrambled versions of actions. We were interested in modulations of oscillatory activity and, specifically, in coupling of oscillatory activity between visual and motor areas. Both plausible and scrambled actions elicited modulations of θ (5-7 Hz), α (7-13 Hz), β (13-35 Hz), and γ (55-100 Hz) power within visual and motor areas. When comparing between the two actions, we observed sequential and spatially distinct increases of γ (∼65 Hz), β (∼25 Hz), and α (∼11 Hz) power between 0.5 and 1.3 s in parieto-occipital, sensorimotor, and left temporal areas. In addition, significant clusters of γ (∼65 Hz) and α/β (∼15 Hz) power decrease were observed in right temporal and parieto-occipital areas between 1.3 and 2.0 s. We found β-power in sensorimotor areas to be positively correlated on a trial-by-trial basis with parieto-occipital γ and left temporal α-power for the plausible but not for the scrambled condition. These results provide new insights in the neuronal oscillatory activity of the areas involved in the recognition of plausible action movements and their interaction. The power correlations between specific areas underscore the importance of interactions between visual and motor areas of the MNS during the recognition of a plausible action.

  17. Stereotyping to infer group membership creates plausible deniability for prejudice-based aggression.

    Science.gov (United States)

    Cox, William T L; Devine, Patricia G

    2014-02-01

    In the present study, participants administered painful electric shocks to an unseen male opponent who was either explicitly labeled as gay or stereotypically implied to be gay. Identifying the opponent with a gay-stereotypic attribute produced a situation in which the target's group status was privately inferred but plausibly deniable to others. To test the plausible deniability hypothesis, we examined aggression levels as a function of internal (personal) and external (social) motivation to respond without prejudice. Whether plausible deniability was present or absent, participants high in internal motivation aggressed at low levels, and participants low in both internal and external motivation aggressed at high levels. The behavior of participants low in internal and high in external motivation, however, depended on experimental condition. They aggressed at low levels when observers could plausibly attribute their behavior to prejudice and aggressed at high levels when the situation granted plausible deniability. This work has implications for both obstacles to and potential avenues for prejudice-reduction efforts.

  18. A system-level pathway-phenotype association analysis using synthetic feature random forest.

    Science.gov (United States)

    Pan, Qinxin; Hu, Ting; Malley, James D; Andrew, Angeline S; Karagas, Margaret R; Moore, Jason H

    2014-04-01

    As the cost of genome-wide genotyping decreases, the number of genome-wide association studies (GWAS) has increased considerably. However, the transition from GWAS findings to the underlying biology of various phenotypes remains challenging. As a result, due to its system-level interpretability, pathway analysis has become a popular tool for gaining insights on the underlying biology from high-throughput genetic association data. In pathway analyses, gene sets representing particular biological processes are tested for significant associations with a given phenotype. Most existing pathway analysis approaches rely on single-marker statistics and assume that pathways are independent of each other. As biological systems are driven by complex biomolecular interactions, embracing the complex relationships between single-nucleotide polymorphisms (SNPs) and pathways needs to be addressed. To incorporate the complexity of gene-gene interactions and pathway-pathway relationships, we propose a system-level pathway analysis approach, synthetic feature random forest (SF-RF), which is designed to detect pathway-phenotype associations without making assumptions about the relationships among SNPs or pathways. In our approach, the genotypes of SNPs in a particular pathway are aggregated into a synthetic feature representing that pathway via Random Forest (RF). Multiple synthetic features are analyzed using RF simultaneously and the significance of a synthetic feature indicates the significance of the corresponding pathway. We further complement SF-RF with pathway-based Statistical Epistasis Network (SEN) analysis that evaluates interactions among pathways. By investigating the pathway SEN, we hope to gain additional insights into the genetic mechanisms contributing to the pathway-phenotype association. We apply SF-RF to a population-based genetic study of bladder cancer and further investigate the mechanisms that help explain the pathway-phenotype associations using SEN. The
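
    In outline, the two-stage construction can be sketched with scikit-learn's random forest (hypothetical toy data and pathway definitions; the actual study works on real genotypes and complements this with the SEN analysis, which is not sketched here):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data: 200 subjects x 50 SNPs, binary phenotype.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 50))   # genotypes coded 0/1/2
y = rng.integers(0, 2, size=200)         # case/control status
pathways = {"pathway_A": range(0, 20), "pathway_B": range(20, 50)}

# Step 1: collapse each pathway's SNPs into one synthetic feature, using
# out-of-bag predictions to avoid leaking the labels into the feature.
synthetic = np.column_stack([
    RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0)
        .fit(X[:, list(snps)], y)
        .oob_decision_function_[:, 1]    # out-of-bag case probability
    for snps in pathways.values()
])

# Step 2: analyze all synthetic features together; a feature's importance
# stands in for the significance of the corresponding pathway.
final = RandomForestClassifier(n_estimators=100, random_state=0).fit(synthetic, y)
for name, imp in zip(pathways, final.feature_importances_):
    print(name, round(imp, 3))
```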

  19. A Distributed Approach to System-Level Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, Indranil

    2012-01-01

    Prognostics, which deals with predicting remaining useful life of components, subsystems, and systems, is a key technology for systems health management that leads to improved safety and reliability with reduced costs. The prognostics problem is often approached from a component-centric view. However, in most cases, it is not specifically component lifetimes that are important, but, rather, the lifetimes of the systems in which these components reside. The system-level prognostics problem can be quite difficult due to the increased scale and scope of the prognostics problem and the relative lack of scalability and efficiency of typical prognostics approaches. In order to address these issues, we develop a distributed solution to the system-level prognostics problem, based on the concept of structural model decomposition. The system model is decomposed into independent submodels. Independent local prognostics subproblems are then formed based on these local submodels, resulting in a scalable, efficient, and flexible distributed approach to the system-level prognostics problem. We provide a formulation of the system-level prognostics problem and demonstrate the approach on a four-wheeled rover simulation testbed. The results show that the system-level prognostics problem can be accurately and efficiently solved in a distributed fashion.

  20. Don't Plan for the Unexpected: Planning Based on Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; Jensen, Martin Holm

    2015-01-01

    We present a framework for automated planning based on plausibility models, as well as algorithms for computing plans in this framework. Our plausibility models include postconditions, as ontic effects are essential for most planning purposes. The framework presented extends a previously developed framework based on dynamic epistemic logic (DEL), without plausibilities/beliefs. In the pure epistemic framework, one can distinguish between strong and weak epistemic plans for achieving some, possibly epistemic, goal. By taking all possible outcomes of actions into account, a strong plan guarantees that the agent achieves this goal. Conversely, a weak plan promises only the possibility of leading to the goal. In real-life planning scenarios where the planning agent is faced with a high degree of uncertainty and an almost endless number of possible exogenous events, strong epistemic planning ...

  1. Inference and Plausible Reasoning in a Natural Language Understanding System Based on Object-Oriented Semantics

    CERN Document Server

    Ostapov, Yuriy

    2012-01-01

    Algorithms of inference in a computer system oriented to input and semantic processing of text information are presented. Such inference is necessary for logical questions when the direct comparison of objects from a question and the database cannot give a result. The following classes of problems are considered: checking hypotheses about persons and non-typical actions, determining the persons and circumstances of non-typical actions, planning actions, and determining the cause of events and the state of persons. To form an answer, both deduction and plausible reasoning are used. As the knowledge domain under consideration is the social behavior of persons, plausible reasoning is based on laws of social psychology. The proposed algorithms of inference and plausible reasoning can be realized in computer systems closely connected with text processing (criminology, business operations, medicine, document systems).

  2. Biologically plausible and evidence-based risk intervals in immunization safety research.

    Science.gov (United States)

    Rowhani-Rahbar, Ali; Klein, Nicola P; Dekker, Cornelia L; Edwards, Kathryn M; Marchant, Colin D; Vellozzi, Claudia; Fireman, Bruce; Sejvar, James J; Halsey, Neal A; Baxter, Roger

    2012-12-17

    In immunization safety research, individuals are considered at risk for the development of certain adverse events following immunization (AEFI) within a specific period of time referred to as the risk interval. These intervals should ideally be determined based on biologic plausibility considering features of the AEFI, presumed or known pathologic mechanism, and the vaccine. Misspecification of the length and timing of these intervals may result in introducing bias in epidemiologic and clinical studies of immunization safety. To date, little work has been done to formally assess and determine biologically plausible and evidence-based risk intervals in immunization safety research. In this report, we present a systematic process to define biologically plausible and evidence-based risk interval estimates for two specific AEFIs, febrile seizures and acute disseminated encephalomyelitis. In addition, we review methodologic issues related to the determination of risk intervals for consideration in future studies of immunization safety.

  3. The semiosis of prayer and the creation of plausible fictional worlds

    Directory of Open Access Journals (Sweden)

    J. Peter Södergård

    1999-01-01

    Prayer and incantation can perhaps be said to be 'mechanisms' that promise that lack will be liquidated and that there is an unlimited signator, a father, or some other metaphysical creature, standing behind and legitimizing the discourse. A way of communicating with the Unlimited that is privileged by an interpretive community that read the prayers aloud and enacted the magical stage-scripts. These highly overlapping categories function as one of the most common subforms of religious discourse for the creation, actualization and maintenance of plausible fictional worlds. They are liminal and transitional mechanisms that manipulate an empirical reader to phase-shift from an actual world to a plausible, by being inscribed in a possible and fictional world, thus creating a model reader, that perceives and acts according to the plausible world outlined by a given interpretive community, and that hears god talking in voces magicae and in god-speaking silence.

  5. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    ... an efficient system level design methodology, a modelling framework for performance estimation and design space exploration at the system level is required. This thesis presents a novel component based modelling framework for system level modelling and performance estimation of embedded systems. The framework is simulation based and allows performance estimation to be carried out throughout all design phases ranging from early functional to cycle accurate and bit true descriptions of the system, modelling both hardware and software components in a unified way. Design space exploration and performance estimation is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project ...

  6. System-Level Design Methodologies for Networked Multiprocessor Systems-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir

    2008-01-01

    The first part of the thesis presents an overview of the existing theories and practices of modeling and simulation of multiprocessor systems-on-chip. The systematic categorization of the plethora of existing programming models at various levels of abstraction is the main contribution here which is the first such attempt in the published literature. The second part of the thesis deals with the issues related to the development of system-level design methodologies for networked multiprocessor systems-on-chip at various levels of design abstraction with special focus on the modeling and design of wireless integrated sensor networks which are an emerging class of networked embedded computer systems. The work described here demonstrates how to model multiprocessor systems-on-chip at the system level by abstracting away most of the lower-level details albeit retaining the parameters most relevant ...

  7. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  9. Plausible Explanation of Quantization of Intrinsic Redshift from Hall Effect and Weyl Quantization

    Directory of Open Access Journals (Sweden)

    Smarandache F.

    2006-10-01

    Using the phion condensate model as described by Moffat [1], we consider a plausible explanation of (Tifft) intrinsic redshift quantization as described by Bell [6] as a result of the Hall effect in a rotating frame. We also discuss another alternative to explain redshift quantization from the viewpoint of Weyl quantization, which could yield Bohr-Sommerfeld quantization.

  10. Analysis of Plausible Reasoning (“合情推理”辨析)

    Institute of Scientific and Technical Information of China (English)

    连四清; 方运加

    2012-01-01

    After Polya's model of "plausible reasoning" was introduced into China's mathematics curriculum standards, it became a keyword of mathematics education research in China. However, its scientific soundness still needs scrutiny: (1) its Chinese rendering is ambiguous in meaning; (2) it does not satisfy the objectivity requirement of a reasoning schema and has obvious defects; (3) overemphasizing the "plausible reasoning" model amounts to overemphasizing the distinction between inductive and deductive reasoning, and easily severs the relationship between them.

  11. From bone to plausible bipedal locomotion. Part II: Complete motion synthesis for bipedal primates.

    Science.gov (United States)

    Nicolas, Guillaume; Multon, Franck; Berillon, Gilles

    2009-05-29

    This paper addresses the problem of synthesizing plausible bipedal locomotion according to 3D anatomical reconstruction and general hypotheses on human motion control strategies. In a previous paper [Nicolas, G., Multon, F., Berillon, G., Marchal, F., 2007. From bone to plausible bipedal locomotion using inverse kinematics. Journal of Biomechanics 40 (5), 1048-1057], we validated a method based on using inverse kinematics to obtain plausible lower-limb motions knowing the trajectory of the ankle. In this paper, we propose a more general approach that also involves computing a plausible trajectory of the ankles for a given skeleton. The inputs are the anatomical descriptions of the bipedal species, imposed footprints and a rest posture. This process is based on optimizing a reference ankle trajectory until a set of criteria is minimized. This optimization loop is based on the assumption that a plausible motion involves little internal mechanical work and is as smooth (minimally jerky) as possible. For each tested ankle trajectory, inverse kinematics is used to compute a lower-body motion that enables us to compute the resulting mechanical work and jerk. This method was tested on a set of modern humans (male and female, with various anthropometric properties). We show that the results obtained with this method are close to experimental data for most of the subjects. We also demonstrate that the method is not sensitive to the choice of the reference ankle trajectory; any ankle trajectory leads to very similar results. We finally apply the method to a skeleton of Pan paniscus (Bonobo), and compare the resulting motion to those described by zoologists.
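
    In spirit, the optimization criterion described combines the two assumptions in a weighted cost of the following general shape (an illustrative formulation; the weights $w_1, w_2$ and the exact terms are assumptions, not the paper's equations):

$$ C = w_{1}\, W_{\mathrm{mech}} + w_{2} \int_{0}^{T} \left\lVert \dddot{\mathbf{x}}_{\mathrm{ankle}}(t) \right\rVert^{2} dt, $$

    to be minimized over candidate ankle trajectories, with $W_{\mathrm{mech}}$ the internal mechanical work and the integral a standard jerk penalty.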

  12. Behavioural modelling and system-level simulation of micromechanical beam resonators

    Science.gov (United States)

    Khine, Lynn; Palaniapan, Moorthi

    2006-04-01

    This paper presents a behavioural modelling technique for micromechanical beam resonators that enables the simulation of a MEMS resonator model in Analog Hardware Description Language (AHDL) format within a system-level circuit simulation. A 1.13 MHz clamped-clamped beam resonator and a 10.4 MHz free-free beam resonator have been modelled in Verilog-A code and successfully simulated with Spectre in Cadence. Analysis has shown that both models behave well and their electrical characteristics are in agreement with theory.
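
    Behind such behavioural models there is typically a lumped mass-spring-damper equivalent of the beam; a generic form (not the paper's extracted parameters) is

$$ m_{\mathrm{eff}}\,\ddot{x} + \frac{\omega_{0} m_{\mathrm{eff}}}{Q}\,\dot{x} + k_{\mathrm{eff}}\,x = F_{\mathrm{el}}(t), \qquad f_{0} = \frac{1}{2\pi}\sqrt{\frac{k_{\mathrm{eff}}}{m_{\mathrm{eff}}}}, $$

    which is what a Verilog-A behavioural description encodes, together with the electrostatic transduction at the drive and sense electrodes.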

  13. System-level modeling for geological storage of CO2

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yingqi; Oldenburg, Curtis M.; Finsterle, Stefan; Bodvarsson, Gudmundur S.

    2006-04-24

    One way to reduce the effects of anthropogenic greenhouse gases on climate is to inject carbon dioxide (CO2) from industrial sources into deep geological formations such as brine formations or depleted oil or gas reservoirs. Research has been and is being conducted to improve understanding of factors affecting particular aspects of geological CO2 storage, such as performance, capacity, and health, safety and environmental (HSE) issues, as well as to lower the cost of CO2 capture and related processes. However, there has been less emphasis to date on system-level analyses of geological CO2 storage that consider geological, economic, and environmental issues by linking detailed representations of engineering components and associated economic models. The objective of this study is to develop a system-level model for geological CO2 storage, including CO2 capture and separation, compression, pipeline transportation to the storage site, and CO2 injection. Within our system model we are incorporating detailed reservoir simulations of CO2 injection and potential leakage with associated HSE effects. The platform of the system-level modeling is GoldSim [GoldSim, 2006]. The application of the system model is focused on evaluating the feasibility of carbon sequestration with enhanced gas recovery (CSEGR) in the Rio Vista region of California. The reservoir simulations are performed using a special module of the TOUGH2 simulator, EOS7C, for multicomponent gas mixtures of methane and CO2 or methane and nitrogen. Using this approach, the economic benefits of enhanced gas recovery can be directly weighed against the costs, risks, and benefits of CO2 injection.

  14. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.

    Science.gov (United States)

    Miconi, Thomas

    2017-02-23

    Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible, and/or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial. Networks endowed with this learning rule can successfully learn nontrivial tasks requiring flexible (context-dependent) associations, memory maintenance, nonlinear mixed selectivities, and coordination among multiple outputs. The resulting networks replicate complex dynamics previously observed in animal cortex, such as dynamic encoding of task features and selective integration of sensory inputs. We conclude that recurrent neural networks offer a plausible model of cortical dynamics during both learning and performance of flexible behavior.

  15. Identifying plausible genetic models based on association and linkage results: application to type 2 diabetes.

    Science.gov (United States)

    Guan, Weihua; Boehnke, Michael; Pluzhnikov, Anna; Cox, Nancy J; Scott, Laura J

    2012-12-01

    When planning resequencing studies for complex diseases, previous association and linkage studies can constrain the range of plausible genetic models for a given locus. Here, we explore the combinations of causal risk allele frequency (RAFC) and genotype relative risk (GRRC) consistent with no or limited evidence for affected sibling pair (ASP) linkage and strong evidence for case-control association. We find that significant evidence for case-control association combined with no or moderate evidence for ASP linkage can define a lower bound for the plausible RAFC. Using data from large type 2 diabetes (T2D) linkage and genome-wide association study meta-analyses, we find that under reasonable model assumptions, 23 of 36 autosomal T2D risk loci are unlikely to be due to causal variants with combined RAFC < 0.005, and four of the 23 are unlikely to be due to causal variants with combined RAFC < 0.05.

  16. Acquiring Plausible Predications from MEDLINE by Clustering MeSH Annotations.

    Science.gov (United States)

    Miñarro-Giménez, Jose Antonio; Kreuzthaler, Markus; Bernhardt-Melischnig, Johannes; Martínez-Costa, Catalina; Schulz, Stefan

    2015-01-01

    The massive accumulation of biomedical knowledge is reflected by the growth of the literature database MEDLINE with over 23 million bibliographic records. All records are manually indexed by MeSH descriptors, many of them refined by MeSH subheadings. We use subheading information to cluster types of MeSH descriptor co-occurrences in MEDLINE by processing co-occurrence information provided by the UMLS. The goal is to infer plausible predicates to each resulting cluster. In an initial experiment this was done by grouping disease-pharmacologic substance co-occurrences into six clusters. Then, a domain expert manually performed the assignment of meaningful predicates to the clusters. The mean accuracy of the best ten generated biomedical facts of each cluster was 85%. This result supports the evidence of the potential of MeSH subheadings for extracting plausible medical predications from MEDLINE.

  17. Spelling in oral deaf and hearing dyslexic children: A comparison of phonologically plausible errors.

    Science.gov (United States)

    Roy, P; Shergold, Z; Kyle, F E; Herman, R

    2014-11-01

    A written single word spelling to dictation test and a single word reading test were given to 68 severe-profoundly oral deaf 10-11-year-old children and 20 hearing children with a diagnosis of dyslexia. The literacy scores of the deaf children and the hearing children with dyslexia were lower than expected for children of their age and did not differ from each other. Three quarters of the spelling errors of hearing children with dyslexia compared with just over half the errors of the oral deaf group were phonologically plausible. Expressive vocabulary and speech intelligibility predicted the percentage of phonologically plausible errors in the deaf group only. Implications of findings for the phonological decoding self-teaching model and for supporting literacy development are discussed.

  18. On the plausible association between environmental conditions and human eye damage.

    Science.gov (United States)

    Feretis, Elias; Theodorakopoulos, Panagiotis; Varotsos, Costas; Efstathiou, Maria; Tzanis, Christos; Xirou, Tzina; Alexandridou, Nancy; Aggelou, Michael

    2002-01-01

    The increase in solar ultraviolet radiation can have various direct and indirect effects on human health, like the incidence of ocular damage. Data of eye damage in residents of three suburban regions in Greece and in two groups of monks/nuns and fishermen are examined here. The statistics performed on these data provides new information about the plausible association between increased levels of solar ultraviolet radiation, air-pollution at ground level, and the development of ocular defects.

  19. Families of Plausible Solutions to the Puzzle of Boyajian’s Star

    Science.gov (United States)

    Wright, Jason T.; Sigurðsson, Steinn

    2016-09-01

    Good explanations for the unusual light curve of Boyajian's Star have been hard to find. Recent results by Montet & Simon lend strength and plausibility to the conclusion of Schaefer that in addition to short-term dimmings, the star also experiences large, secular decreases in brightness on decadal timescales. This, combined with a lack of long-wavelength excess in the star's spectral energy distribution, strongly constrains scenarios involving circumstellar material, including hypotheses invoking a spherical cloud of artifacts. We show that the timings of the deepest dimmings appear consistent with being randomly distributed, and that the star's reddening and narrow sodium absorption is consistent with the total, long-term dimming observed. Following Montet & Simon's encouragement to generate alternative hypotheses, we attempt to circumscribe the space of possible explanations with a range of plausibilities, including: a cloud in the outer solar system, structure in the interstellar medium (ISM), natural and artificial material orbiting Boyajian's Star, an intervening object with a large disk, and variations in Boyajian's Star itself. We find the ISM and intervening disk models more plausible than the other natural models.

  20. What happened (and what didn't): Discourse constraints on encoding of plausible alternatives.

    Science.gov (United States)

    Fraundorf, Scott H; Benjamin, Aaron S; Watson, Duane G

    2013-10-01

    Three experiments investigated how font emphasis influences reading and remembering discourse. Although past work suggests that contrastive pitch contours benefit memory by promoting encoding of salient alternatives, it is unclear both whether this effect generalizes to other forms of linguistic prominence and how the set of alternatives is constrained. Participants read discourses in which some true propositions had salient alternatives (e.g., British scientists found the endangered monkey when the discourse also mentioned French scientists) and completed a recognition memory test. In Experiments 1 and 2, font emphasis in the initial presentation increased participants' ability to later reject false statements about salient alternatives but not about unmentioned items (e.g., Portuguese scientists). In Experiment 3, font emphasis helped reject false statements about plausible alternatives, but not about less plausible alternatives that were nevertheless established in the discourse. These results suggest readers encode a narrow set of only those alternatives plausible in the particular discourse. They also indicate that multiple manipulations of linguistic prominence, not just prosody, can lead to consideration of alternatives.

  1. A biologically plausible model of time-scale invariant interval timing.

    Science.gov (United States)

    Almeida, Rita; Ledberg, Anders

    2010-02-01

    The temporal durations between events often exert a strong influence over behavior. The details of this influence have been extensively characterized in behavioral experiments in different animal species. A remarkable feature of the data collected in these experiments is that they are often time-scale invariant. This means that response measurements obtained under intervals of different durations coincide when plotted as functions of relative time. Here we describe a biologically plausible model of an interval timing device and show that it is consistent with time-scale invariant behavior over a substantial range of interval durations. The model consists of a set of bistable units that switch from one state to the other at random times. We first use an abstract formulation of the model to derive exact expressions for some key quantities and to demonstrate time-scale invariance for any range of interval durations. We then show how the model could be implemented in the nervous system through a generic and biologically plausible mechanism. In particular, we show that any system that can display noise-driven transitions from one stable state to another can be used to implement the timing device. Our work demonstrates that a biologically plausible model can qualitatively account for a large body of data and thus provides a link between the biology and behavior of interval timing.
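
    To make the mechanism concrete, the sketch below simulates a population of bistable units that each switch state at an exponentially distributed random time; rescaling time by the switching rate collapses the response curves, which is the time-scale invariance described above. All values are illustrative, not taken from the article.

        import numpy as np

        rng = np.random.default_rng(0)

        def fraction_switched(rate, t_grid, n_units=1000):
            """Fraction of bistable units that have switched by each time in
            t_grid; each unit flips once, at an exponential random time."""
            switch_times = rng.exponential(1.0 / rate, size=n_units)
            return np.array([(switch_times <= t).mean() for t in t_grid])

        # Two interval durations, encoded by two switching rates:
        for rate in (1.0, 0.25):
            t = np.linspace(0.0, 5.0 / rate, 6)   # same grid in relative time
            print([f"{f:.2f}" for f in fraction_switched(rate, t)])
        # The two printed curves agree (up to sampling noise) because they
        # are functions of relative time rate*t only.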

  2. The effect of decentralized behavioral decision making on system-level risk.

    Science.gov (United States)

    Kaivanto, Kim

    2014-12-01

    Certain classes of system-level risk depend partly on decentralized lay decision making. For instance, an organization's network security risk depends partly on its employees' responses to phishing attacks. On a larger scale, the risk within a financial system depends partly on households' responses to mortgage sales pitches. Behavioral economics shows that lay decisionmakers typically depart in systematic ways from the normative rationality of expected utility (EU), and instead display heuristics and biases as captured in the more descriptively accurate prospect theory (PT). In turn, psychological studies show that successful deception ploys eschew direct logical argumentation and instead employ peripheral-route persuasion, manipulation of visceral emotions, urgency, and familiar contextual cues. The detection of phishing emails and inappropriate mortgage contracts may be framed as a binary classification task. Signal detection theory (SDT) offers the standard normative solution, formulated as an optimal cutoff threshold, for distinguishing between good/bad emails or mortgages. In this article, we extend SDT behaviorally by rederiving the optimal cutoff threshold under PT. Furthermore, we incorporate the psychology of deception into determination of SDT's discriminability parameter. With the neo-additive probability weighting function, the optimal cutoff threshold under PT is rendered unique under well-behaved sampling distributions, tractable in computation, and transparent in interpretation. The PT-based cutoff threshold is (i) independent of loss aversion and (ii) more conservative than the classical SDT cutoff threshold. Independently of any possible misalignment between individual-level and system-level misclassification costs, decentralized behavioral decisionmakers are biased toward underdetection, and system-level risk is consequently greater than in analyses predicated upon normative rationality.
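
    The flavour of the derivation can be illustrated numerically. The sketch below computes the classical equal-variance Gaussian SDT likelihood-ratio cutoff and then a prospect-theory-style cutoff found by minimizing a loss in which probabilities are distorted by a neo-additive weighting function w(p) = delta + gamma*p. The base rate, costs and weighting parameters are illustrative assumptions, not the article's values or its exact derivation.

        import numpy as np
        from scipy.stats import norm

        d_prime, p_signal = 1.5, 0.2      # discriminability and base rate (assumed)
        c_fa, c_miss = 1.0, 4.0           # misclassification costs (assumed)

        # Classical SDT: likelihood-ratio criterion beta and cutoff x* for
        # noise ~ N(0,1) and signal ~ N(d',1).
        beta = ((1 - p_signal) / p_signal) * (c_fa / c_miss)
        x_star = np.log(beta) / d_prime + d_prime / 2

        def w(p, delta=0.05, gamma=0.8):
            """Neo-additive probability weighting (illustrative parameters)."""
            return np.where((p > 0) & (p < 1), delta + gamma * p, p)

        def pt_expected_loss(x):
            p_fa = 1 - norm.cdf(x)              # noise called 'signal'
            p_miss = norm.cdf(x, loc=d_prime)   # signal called 'noise'
            return w((1 - p_signal) * p_fa) * c_fa + w(p_signal * p_miss) * c_miss

        grid = np.linspace(-3, 5, 2001)
        x_pt = grid[np.argmin(pt_expected_loss(grid))]
        print(f"classical cutoff = {x_star:.3f}, PT-style cutoff = {x_pt:.3f}")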

  3. System level modeling and component level control of fuel cells

    Science.gov (United States)

    Xue, Xingjian

    This dissertation investigates fuel cell systems and the related technologies in three aspects: (1) system-level dynamic modeling of both PEM fuel cells (PEMFC) and solid oxide fuel cells (SOFC); (2) development of a condition monitoring scheme for PEM fuel cell systems using a model-based statistical method; and (3) development of strategies and algorithms for precision control with potential application in energy systems. The dissertation first presents a system-level dynamic modeling strategy for PEM fuel cells. It is well known that water plays a critical role in PEM fuel cell operation: it makes the membrane function properly and improves durability. The low-temperature operating conditions, however, impose modeling difficulties in characterizing the liquid-vapor two-phase change phenomenon, which becomes even more complex under dynamic operating conditions. This dissertation proposes an innovative method to characterize this phenomenon and builds a comprehensive model for the PEM fuel cell at the system level. The model features a complete characterization of multi-physics dynamic coupling effects, including dynamic phase change. The model is validated against Ballard stack experimental results from the open literature. The system behavior and the internal coupling effects are also investigated with this model under various operating conditions. Anode-supported tubular SOFCs are also investigated in the dissertation. While the Nernst potential plays a central role in characterizing the electrochemical performance, the traditional Nernst equation may lead to incorrect analysis results under dynamic operating conditions due to the current reverse flow phenomenon. This dissertation presents a systematic study in this regard, incorporating a modified Nernst potential expression and heat/mass transfer into the analysis. The model is used to investigate the limitations and optimal results of various operating conditions; it can also be utilized to perform the …

  4. Accelerating next generation sequencing data analysis with system level optimizations.

    Science.gov (United States)

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

    Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system-level parameters before running the application. We studied GATK-HaplotypeCaller, a part of common NGS workflows that consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked and the execution time of HaplotypeCaller was optimized through various system-level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the CPU frequency governor from the default 'on-demand' mode to 'performance' mode to accelerate the Java multi-threading. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
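
    A rough sketch of two of these knobs, assuming a Linux host with root privileges and a GATK 3.x jar in the working directory (file names and thread counts are placeholders): the CPU frequency governor is pinned to 'performance' and HaplotypeCaller is launched with the parallel garbage collector.

        import glob, subprocess

        # Switch every core's frequency governor to 'performance' (root required).
        for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
            with open(path, "w") as f:
                f.write("performance")

        # Run GATK 3.x HaplotypeCaller with parallel garbage collection.
        subprocess.run([
            "java", "-XX:+UseParallelGC", "-XX:ParallelGCThreads=16",
            "-jar", "GenomeAnalysisTK.jar", "-T", "HaplotypeCaller",
            "-R", "reference.fasta", "-I", "sample.bam", "-o", "sample.vcf",
        ], check=True)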

  5. System-level simulation of liquid filling in microfluidic chips.

    Science.gov (United States)

    Song, Hongjun; Wang, Yi; Pant, Kapil

    2011-06-01

    Liquid filling in microfluidic channels is a complex process that depends on a variety of geometric, operating, and material parameters such as microchannel geometry, flow velocity/pressure, liquid surface tension, and the contact angle of the channel surface. Accurate analysis of the filling process can provide key insights into the filling time, air bubble trapping, and dead zone formation, help evaluate trade-offs among the various design parameters, and lead to optimal chip design. However, efficient modeling of liquid filling in complex microfluidic networks continues to be a significant challenge. High-fidelity computational methods, such as the volume of fluid method, are prohibitively expensive from a computational standpoint. Analytical models, on the other hand, are primarily applicable to idealized geometries and, hence, are unable to accurately capture chip-level behavior of complex microfluidic systems. This paper presents a parametrized dynamic model for the system-level analysis of liquid filling in three-dimensional (3D) microfluidic networks. In our approach, a complex microfluidic network is deconstructed into a set of commonly used components, such as reservoirs, microchannels, and junctions. The components are then assembled according to their spatial layout and operating rationale to achieve a rapid system-level model. A dynamic model based on the transient momentum equation is developed to track the liquid front in the microchannels. The principle of mass conservation at the junction is used to link the fluidic parameters in the microchannels emanating from the junction. Assembly of these component models yields a set of differential and algebraic equations, which upon integration provides temporal information about the liquid filling process, particularly liquid front propagation (i.e., the arrival time). The models are used to simulate the transient liquid filling process in a variety of microfluidic constructs and in a multiplexer, representing a …
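
    As a toy version of one component model, the sketch below estimates the arrival time of the liquid front in a single channel from Poiseuille flow behind an advancing meniscus. The circular cross-section, geometry and fluid properties are assumptions for illustration; the paper's full network assembly is not reproduced.

        import numpy as np

        # Poiseuille flow behind the front gives x*dx/dt = a^2*dP/(8*mu),
        # i.e. a Washburn-type filling time t = 4*mu*L^2 / (a^2*dP).
        a, mu = 50e-6, 1e-3                    # channel radius [m], viscosity [Pa s]
        sigma, theta = 0.072, np.deg2rad(30)   # surface tension [N/m], contact angle
        p_drive, L = 1e3, 10e-3                # applied pressure [Pa], length [m]

        p_cap = 2 * sigma * np.cos(theta) / a              # capillary pressure [Pa]
        t_fill = 4 * mu * L**2 / (a**2 * (p_drive + p_cap))
        print(f"arrival time at channel outlet ~ {t_fill * 1e3:.1f} ms")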

  6. Understanding Karma Police: The Perceived Plausibility of Noun Compounds as Predicted by Distributional Models of Semantic Representation

    Science.gov (United States)

    Günther, Fritz; Marelli, Marco

    2016-01-01

    Noun compounds, consisting of two nouns (the head and the modifier) that are combined into a single concept, differ in terms of their plausibility: school bus is a more plausible compound than saddle olive. The present study investigates which factors influence the plausibility of attested and novel noun compounds. Distributional Semantic Models (DSMs) are used to obtain formal (vector) representations of word meanings, and compositional methods in DSMs are employed to obtain such representations for noun compounds. From these representations, different plausibility measures are computed. Three of those measures contribute to predicting the plausibility of noun compounds: the relatedness between the meaning of the head noun and the compound (Head Proximity), the relatedness between the meaning of the modifier noun and the compound (Modifier Proximity), and the similarity between the head noun and the modifier noun (Constituent Similarity). We find non-linear interactions between Head Proximity and Modifier Proximity, as well as between Modifier Proximity and Constituent Similarity. Furthermore, Constituent Similarity interacts non-linearly with familiarity of the compound. These results suggest that a compound is perceived as more plausible if it can be categorized as an instance of the category denoted by the head noun, if the contribution of the modifier to the compound meaning is clear but not redundant, and if the constituents are sufficiently similar in cases where this contribution is not clear. Furthermore, compounds are perceived to be more plausible if they are more familiar, but mostly in cases where the relation between the constituents is less clear. PMID:27732599
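
    With toy vectors, the three contributing measures reduce to cosine similarities between the constituent vectors and a composed compound vector. The sketch below uses simple additive composition, which is only one of the composition methods used with DSMs; all vector values are invented.

        import numpy as np

        def cos(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

        head = np.array([0.9, 0.1, 0.3, 0.0, 0.2])      # e.g. "bus" (toy vector)
        modifier = np.array([0.7, 0.5, 0.1, 0.1, 0.0])  # e.g. "school" (toy vector)
        compound = head + modifier                      # additive composition

        print("Head Proximity:        ", cos(head, compound))
        print("Modifier Proximity:    ", cos(modifier, compound))
        print("Constituent Similarity:", cos(head, modifier))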

  7. Systems-Level Synthetic Biology for Advanced Biofuel Production

    Energy Technology Data Exchange (ETDEWEB)

    Ruffing, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Travis J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strickland, Lucas Marshall [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Meserole, Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tallant, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  8. Quantized Feedback Control Software Synthesis from System Level Formal Specifications

    CERN Document Server

    Mari, Federico; Salvo, Ivano; Tronci, Enrico

    2011-01-01

    Many embedded systems are in fact Software Based Control Systems (SBCSs), that is, control systems whose controller consists of control software running on a microcontroller device. This motivates investigation of formal model-based design approaches for the automatic synthesis of SBCS control software. We present an algorithm, along with a tool QKS implementing it, that takes as input a formal model of the controlled system (plant), given as a Discrete Time Linear Hybrid System (DTLHS), implementation specifications (that is, the number of bits in the analog-to-digital (AD) conversion) and system-level formal specifications (that is, safety and liveness requirements for the closed-loop system), and returns correct-by-construction control software that has a Worst Case Execution Time (WCET) linear in the number of AD bits and meets the given specifications. We show the feasibility of our approach by presenting experimental results on using it to synthesize control software for a buck DC-DC converter, a widely used mixed-mode analog circuit.
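
    The sketch below conveys the shape of such a controller: the plant state is read only through an n-bit AD quantizer, and control actions come from a finite lookup table over quantizer codes. A hand-written bang-bang rule stands in for the table that QKS would synthesize, and the first-order plant dynamics are invented.

        N_BITS = 8                       # AD conversion resolution
        X_MIN, X_MAX = 0.0, 10.0         # admissible state range (assumed)

        def quantize(x):
            """Map a plant state to an n-bit code, as an AD converter would."""
            levels = 2 ** N_BITS - 1
            x = min(max(x, X_MIN), X_MAX)
            return round((x - X_MIN) / (X_MAX - X_MIN) * levels)

        # Stand-in for a synthesized controller: a lookup table over AD codes.
        TARGET = 5.0
        table = {code: 1 if code < quantize(TARGET) else 0
                 for code in range(2 ** N_BITS)}

        x, dt = 0.0, 1e-4                # crude buck-converter-like plant
        for _ in range(2000):
            u = table[quantize(x)]           # control sees quantized state only
            x += dt * (-5.0 * x + 60.0 * u)  # assumed plant dynamics
        print(f"output settles near {x:.2f}")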

  9. System-level techniques for analog performance enhancement

    CERN Document Server

    Song, Bang-Sup

    2016-01-01

    This book shows readers how to avoid common mistakes in circuit design, and presents classic circuit concepts and design approaches from the transistor to the system level. The discussion is geared to be accessible and optimized for practical designers who want to learn to create circuits without simulations. Topic by topic, the author guides designers to learn the classic analog design skills by understanding the basic electronics principles correctly, and further prepares them to feel confident in designing high-performance, state-of-the-art CMOS analog systems. This book combines and presents all the in-depth information necessary to perform various design tasks, so that readers can grasp essential material without reading through the entire book. This top-down approach helps readers to build practical design expertise quickly, starting from their understanding of electronics fundamentals.

  10. Modelling system level health information exchange: an ontological approach.

    Science.gov (United States)

    McMurray, J; Zhu, L; McKillop, I; Chen, H

    2015-01-01

    Investment of resources to purposively improve the movement of information between health system providers is currently made with imperfect information. No inventories of system-level digital information flows currently exist, nor do measures of inter-organizational electronic health information exchange (HIE). Using Protégé 4, an open-source OWL Web ontology language editor and knowledge-based framework, we formalized a model that decomposes inter-organizational electronic health information flow into derivative concepts such as diversity, breadth, volume, structure, standardization and connectivity. Self-reported data from a regional health system are used to measure HIE; the ontology identifies providers with low and high HIE (useful for planners) and, together with a related database, is used to monitor data quality.

  11. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  12. Promoting system-level learning from project-level lessons

    Energy Technology Data Exchange (ETDEWEB)

    Jong, Amos A. de, E-mail: amosdejong@gmail.com [Innovation Management, Utrecht (Netherlands); Runhaar, Hens A.C., E-mail: h.a.c.runhaar@uu.nl [Section of Environmental Governance, Utrecht University, Utrecht (Netherlands); Runhaar, Piety R., E-mail: piety.runhaar@wur.nl [Organisational Psychology and Human Resource Development, University of Twente, Enschede (Netherlands); Kolhoff, Arend J., E-mail: Akolhoff@eia.nl [The Netherlands Commission for Environmental Assessment, Utrecht (Netherlands); Driessen, Peter P.J., E-mail: p.driessen@geo.uu.nl [Department of Innovation and Environment Sciences, Utrecht University, Utrecht (Netherlands)

    2012-02-15

    A growing number of low and middle income countries (LMCs) have adopted some sort of system for environmental impact assessment (EIA). However, many of these EIA systems are characterised by low performance in terms of timely information dissemination, monitoring and enforcement after licensing. Donor actors (such as the World Bank) have attempted to contribute to a higher performance of EIA systems in LMCs by intervening at two levels: the project level (e.g. by providing scoping advice or EIS quality review) and the system level (e.g. by advising on EIA legislation or by capacity building). The aims of these interventions are environmental protection in concrete cases and enforcing the institutionalisation of environmental protection, respectively. Learning by the actors involved is an important condition for realising these aims. A relatively underexplored form of learning concerns learning at the EIA system level via project-level donor interventions. This 'indirect' learning potentially results in system changes that better fit the specific context(s) and hence contribute to higher performance. Our exploratory research in Ghana and the Maldives shows that thus far, 'indirect' learning occurs only incidentally and that donors play a modest role in promoting it. Barriers to indirect learning are related to the institutional context rather than to individual characteristics. Moreover, 'indirect' learning seems to flourish best in large projects where donors have achieved a position of influence that they can use to evoke reflection upon system malfunctions. In order to enhance learning at all levels, donors should present the outcomes of the intervention elaborately (i.e. discuss the outcomes with a large audience), include practical suggestions about post-EIS activities such as monitoring procedures and enforcement options, and stimulate the use of their advisory reports to generate organisational memory and ensure a better …

  13. Public health preparedness in Alberta: a systems-level study

    Directory of Open Access Journals (Sweden)

    Noseworthy Tom

    2006-12-01

    Background: Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research. Methods/Design: We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus. Discussion: The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.

  14. A Novel Discovery of Growth Process for Ag Nanowires and Plausible Mechanism

    Directory of Open Access Journals (Sweden)

    Jiejun Zhu

    2016-01-01

    A novel growth process of silver nanowires was revealed by tracing the morphology evolution of Ag nanostructures fabricated by an improved polyol process. A mixture of Ag nanowires and nanoparticles was obtained using PVP-K25 (MW = 38,000). The products sampled at different reaction times were studied in detail using UV-visible absorption spectra and transmission electron microscopy (TEM). A previously unreported phenomenon was observed, in which Ag nanoparticles undergo an important dissolution-recrystallization process and Ag nanowires are formed at the expense of the preformed Ag nanoparticles. A plausible novel growth mechanism for the silver nanowires was proposed.

  15. ‘One of the Challenges that Can Plausibly Be Raised Against Them’?

    DEFF Research Database (Denmark)

    Holtermann, Jakob v. H.

    2017-01-01

    International criminal tribunals (ICTs) are epistemic engines in the sense that they find (or claim to find) factual truths about such past events that qualify as genocide, crimes against humanity and war crimes. The value of this kind of knowledge would seem to be beyond dispute. Yet, in general... in law is intimately connected to ordinary truth. Truth-finding capacity therefore does belong in legitimacy debates, as a challenge that can plausibly be raised against them. This, in turn, makes it relevant, in future research, to map, analyse and interrelate the various critiques that have been launched...

  16. A biologically plausible Generalized Leaky Integrate-and-Fire neuron model.

    Science.gov (United States)

    Wang, Zhenzhong; Guo, Lilin; Adjouadi, Malek

    2014-01-01

    This study introduces a new Generalized Leaky Integrate-and-Fire (GLIF) neuron model. Unlike in normal Leaky Integrate-and-Fire (NLIF) models, the leak resistor in the GLIF model equation is assumed to be variable, and a bias-current term is added to the model equation to improve accuracy. By adjusting the parameters defined for the leak resistor and bias current, a GLIF model can be accurately matched to any Hodgkin-Huxley (HH) model and reproduce plausible biological neuron behaviors.
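
    A minimal Euler-integrated sketch of such a GLIF neuron is given below; the particular voltage dependence chosen for the leak resistor and all parameter values are illustrative assumptions, not those of the study.

        def simulate_glif(i_input, t_max=0.5, dt=1e-4):
            """GLIF neuron with a variable leak resistor and a bias current."""
            C, E_L = 1e-9, -70e-3            # capacitance [F], resting potential [V]
            v_th, v_reset = -50e-3, -65e-3   # spike threshold and reset [V]
            i_bias = 0.1e-9                  # bias-current term [A]
            v, spikes = E_L, []
            for step in range(int(t_max / dt)):
                R = 1e8 * (1.0 + 2.0 * (v - E_L) / 20e-3)   # variable leak resistor
                v += dt * (-(v - E_L) / R + i_input + i_bias) / C
                if v >= v_th:                # threshold crossing: spike and reset
                    spikes.append(step * dt)
                    v = v_reset
            return spikes

        print(len(simulate_glif(0.3e-9)), "spikes in 0.5 s")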

  17. Higher Data Quality by Online Data-Entry and Automated Plausibility Checks

    Science.gov (United States)

    Pietragalla, Barbara; Sigg, Christian; Güsewell, Sabine; Clot, Bernard

    2014-05-01

    Long-term phenological observations are now recognized as important indicators for climate change impact studies. With the increased need for phenological data, there is also an increased need for higher data quality. Since 1951 MeteoSwiss has been operating a national phenological observation network. Currently the network consists of about 150 active stations observing up to 69 different phenophases. An important aim of a three-year project now under way at MeteoSwiss is to further increase the quality of the collected data. The higher data quality will be achieved by an automated procedure performing plausibility checks on the data and by online data-entry. Further measures such as intensified observer instruction and the collection of more detailed metadata also contribute to a high data quality standard. The plausibility checks include the natural order of the phenophases within a species and also between different species (with regard to possible natural deviation). Additionally, it is checked whether the observed date differs by less than two standard deviations from the average for this phenophase at the altitude of the station. A value outside these limits is not necessarily false, since extreme values will occur beyond them; therefore, within this check the timing of the season in the respective year is also taken into account. In case of an implausible value, a comparison with other stations in the same region and at the same altitude is proposed. A further possibility for data quality control could be to model the different phenophases statistically and to use this model for estimating the likelihood of observed values. An overall exploratory data analysis is currently being performed, providing a solid basis for implementing the best possible methods for the plausibility checks. Important advantages of online data-entry are the near real-time availability of the data as well as the avoidance of various kinds of typical mistakes …
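
    The two checks described above reduce to a few lines of code; the sketch below flags an observation that falls outside the altitude-specific mean ± 2 standard deviations or that precedes a phenophase it should follow. The interface is hypothetical.

        def plausibility_flags(obs_doy, mean_doy, sd_doy, earlier_phase_doy=None):
            """Plausibility warnings for one observation (days of year).
            mean_doy/sd_doy: climatology for this phenophase at the station
            altitude; earlier_phase_doy: a phenophase that must come first."""
            flags = []
            if abs(obs_doy - mean_doy) > 2 * sd_doy:
                flags.append("outside mean ± 2 SD for this altitude")
            if earlier_phase_doy is not None and obs_doy < earlier_phase_doy:
                flags.append("violates natural order of phenophases")
            return flags   # non-empty: compare with neighbouring stations

        print(plausibility_flags(95, 120, 8, earlier_phase_doy=100))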

  18. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    Science.gov (United States)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by the findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated from different inventories, classifiers and predictors appeared different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful in exposing spatially varying inconsistencies of the modelling results, while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for …
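
    Spatial cross-validation of such models can be sketched with standard tooling: samples are grouped into spatial blocks and whole blocks are held out per fold. The example below uses scikit-learn with synthetic data standing in for the terrain predictors and the landslide inventory.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import GroupKFold

        rng = np.random.default_rng(42)
        X = rng.normal(size=(1000, 5))             # terrain predictors (synthetic)
        y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)  # slide / no slide
        blocks = rng.integers(0, 10, size=1000)    # spatial block id per sample

        aucs = []
        for train, test in GroupKFold(n_splits=5).split(X, y, groups=blocks):
            model = LogisticRegression().fit(X[train], y[train])
            aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
        print(f"spatial CV AUROC = {np.mean(aucs):.3f}")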

  19. A biologically plausible transform for visual recognition that is invariant to translation, scale and rotation

    Directory of Open Access Journals (Sweden)

    Pavel Sountsov

    2011-11-01

    Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled or rotated.
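
    A closely related classical construction is the Fourier-Mellin transform: a Fourier magnitude discards translation, a log-polar remap converts scaling and rotation into shifts, and a second Fourier magnitude discards those shifts. The sketch below implements that analogue, not the article's exact oriented-segment transform.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def invariant_signature(img, n_r=64, n_theta=64):
            """Translation-, scale- and rotation-insensitive signature."""
            mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))   # drops translation
            cy, cx = np.array(mag.shape) / 2.0
            r = np.exp(np.linspace(0, np.log(min(cy, cx) - 1), n_r))  # log radii
            th = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
            rows = cy + np.outer(r, np.sin(th))               # log-polar grid
            cols = cx + np.outer(r, np.cos(th))
            logpolar = map_coordinates(mag, [rows, cols], order=1)
            return np.abs(np.fft.fft2(logpolar))              # drops the shifts

        img = np.zeros((128, 128))
        img[40:80, 50:90] = 1.0
        print(invariant_signature(img).shape)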

  20. A Biomass-based Model to Estimate the Plausibility of Exoplanet Biosignature Gases

    CERN Document Server

    Seager, S; Hu, R

    2013-01-01

    Biosignature gas detection is one of the ultimate future goals for exoplanet atmosphere studies. We have created a framework for linking biosignature gas detectability to biomass estimates, including atmospheric photochemistry and biological thermodynamics. The new framework is intended to liberate predictive atmosphere models from requiring fixed, Earth-like biosignature gas source fluxes. New biosignature gases can be considered with a check that the biomass estimate is physically plausible. We have validated the models on terrestrial production of NO, H2S, CH4, CH3Cl, and DMS. We have applied the models to propose NH3 as a biosignature gas on a "cold Haber World," a planet with a N2-H2 atmosphere, and to demonstrate why gases such as CH3Cl would require too large a biomass to be plausible biosignature gases on planets with Earth or early-Earth-like atmospheres orbiting a Sun-like star. To construct the biomass models, we developed a functional classification of biosignature gases, and found that gases (such...

  1. Self-assembly of phosphate amphiphiles in mixtures of prebiotically plausible surfactants.

    Science.gov (United States)

    Albertsen, A N; Duffy, C D; Sutherland, J D; Monnard, P-A

    2014-06-01

    The spontaneous formation of closed bilayer structures from prebiotically plausible amphiphiles is an essential requirement for the emergence of early cells on the prebiotic Earth. The sources of amphiphiles could have been both endogenous and exogenous (accretion of meteorite carbonaceous material or interstellar dust particles). Among all possible prebiotic amphiphile candidates, those containing phosphate are the least investigated species because their self-assembly occurs in a seemingly too narrow range of conditions. The self-assembly of simple phosphate amphiphiles should, however, be of great interest, as contemporary membranes predominantly contain phospholipids. In contrast to common expectations, we show that these amphiphiles can be easily synthesized under prebiotically plausible environmental conditions and can efficiently form bilayer structures in the presence of various co-surfactants across a large range of pH values. Vesiculation was even observed in crude reaction mixtures that contained 1-decanol as the amphiphile precursor. The two best co-surfactants promoted vesicle formation over the entire pH range in aqueous solutions. Expanding the pH range where bilayer membranes self-assemble and remain intact is a prerequisite for the emergence of early cell-like compartments and their preservation under fluctuating environmental conditions. These mixed bilayers also retained small charged solutes, such as dyes. These results demonstrate that alkyl phosphate amphiphiles might have played a significant role as early compartment building blocks.

  2. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
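
    The continuous half of the search can be illustrated with a bare-bones PSO tuning a parameter vector against an arbitrary objective; in the paper's setting the objective would be the RNN's error in reproducing the expression time series, and the discrete ACO topology search is omitted here. All constants are generic choices.

        import numpy as np

        def pso(objective, dim, n_particles=30, iters=200, seed=0):
            """Minimal particle swarm optimizer (global-best variant)."""
            rng = np.random.default_rng(seed)
            x = rng.uniform(-1, 1, (n_particles, dim))   # candidate RNN weights
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
            for _ in range(iters):
                gbest = pbest[np.argmin(pbest_f)]
                r1, r2 = rng.random((2, n_particles, dim))
                v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
                x = x + v
                f = np.array([objective(p) for p in x])
                improved = f < pbest_f
                pbest[improved], pbest_f[improved] = x[improved], f[improved]
            return pbest[np.argmin(pbest_f)]

        # Toy objective standing in for the model-fit error:
        print(pso(lambda wts: float(np.sum((wts - 0.3) ** 2)), dim=4))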

  3. Nanomagnet logic: progress toward system-level integration.

    Science.gov (United States)

    Niemier, M T; Bernstein, G H; Csaba, G; Dingler, A; Hu, X S; Kurtz, S; Liu, S; Nahas, J; Porod, W; Siddiq, M; Varga, E

    2011-12-14

    Quoting the International Technology Roadmap for Semiconductors (ITRS) 2009 Emerging Research Devices section, 'Nanomagnetic logic (NML) has potential advantages relative to CMOS of being non-volatile, dense, low-power, and radiation-hard. Such magnetic elements are compatible with MRAM technology, which can provide input-output interfaces. Compatibility with MRAM also promises a natural integration of memory and logic. Nanomagnetic logic also appears to be scalable to the ultimate limit of using individual atomic spins.' This article reviews progress toward complete and reliable NML systems. More specifically, we (i) review experimental progress toward the fundamental characteristics a device must possess if it is to be used in a digital system, (ii) consider how the NML design space may impact the system-level energy (especially when considering the clock needed to drive a computation), (iii) explain, using both the NML design space and a discussion of clocking as context, how reliable circuit operation may be achieved, (iv) highlight experimental efforts regarding CMOS-friendly clock structures for NML systems, (v) explain how electrical I/O could be achieved, and (vi) conclude with a brief discussion of suitable architectures for this technology. Throughout the article, we attempt to identify important areas for future work.

  4. System-level challenges in pressure-operated soft robotics

    Science.gov (United States)

    Onal, Cagdas D.

    2016-05-01

    The last decade witnessed the revival of fluidic soft actuation. As pressure-operated soft robotics becomes more popular, with promising recent results, system integration remains an outstanding challenge. Inspired greatly by biology, we envision future robotic systems to embrace mechanical compliance, with bodies composed of soft and hard components as well as electronic and sensing sub-systems, such that robot maintenance starts to resemble surgery. In this vision, portable energy sources and driving infrastructure play a key role in offering autonomous many-DoF soft actuation. On the other hand, while offering many advantages in safety and adaptability for interacting with unstructured environments, objects, and human bodies, mechanical compliance also violates many inherent assumptions of traditional rigid-body robotics. Thus, a complete soft robotic system requires new approaches to utilize proprioception that provides rich sensory information while remaining flexible, and to control motion under significant time delay. This paper discusses our proposed solutions for each of these system-level challenges in soft robotics research.

  5. Node Grouping in System-Level Fault Diagnosis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Dafang; XIE Gaogang; MIN Yinghua

    2001-01-01

    With the popularization of network applications and multiprocessor systems, the dependability of systems has drawn considerable attention. This paper presents a new technique of node grouping for system-level fault diagnosis, which simplifies the diagnosis of large systems. The technique transforms a complicated system into a group network, where each group may consist of many nodes that are either fault-free or faulty. It is proven that the transformation leads to a unique group network, which eases system diagnosis. The paper then systematically studies the one-step t-fault diagnosis problem based on node grouping, by means of the concept of independent point sets, and gives a simple necessary and sufficient condition. A diagnosis procedure for t-diagnosable systems is presented. Furthermore, an efficient probabilistic diagnosis algorithm for practical applications is proposed, based on the assumption that most of the nodes in a system are fault-free. Software simulation shows that the probabilistic diagnosis provides a high probability of correct diagnosis at low diagnosis cost, and is suitable for systems of any topology.

  6. Systems-level perspective of sudden infant death syndrome.

    Science.gov (United States)

    Salomonis, Nathan

    2014-09-01

    Sudden infant death syndrome (SIDS) remains one of the primary causes of infant mortality in developed countries. Although the causes of SIDS remain largely inconclusive, some of the most informative associations implicate molecular, genetic, anatomical, physiological, and environmental (i.e., infant sleep) factors. Thus, a comprehensive and evolving systems-level model is required to understand SIDS susceptibility. Such models, by being powerful enough to uncover indirect associations, could be used to expand our list of candidate targets for in-depth analysis. We present an integrated WikiPathways model for SIDS susceptibility that includes associated cell systems, signaling pathways, genetics, and animal phenotypes. Experimental and literature-based gene-regulatory data have been integrated into this model to identify intersecting upstream control elements and associated interactions. To expand this pathway model, we performed a comprehensive analysis of existing proteomics data from brainstem samples of infants with SIDS. From this analysis, we discovered changes in the expression of several proteins linked to known SIDS pathologies, including factors involved in glial cell production, hypoxia regulation, and synaptic vesicle release, in addition to interactions with annotated SIDS markers. Our results highlight new targets for further consideration that further enrich this pathway model, which, over time, can improve as a wiki-based, community curation project.

  7. System Level Uncertainty Assessment for Collaborative RLV Design

    Science.gov (United States)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limited financial resources of both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present characteristic of the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use tools of different fidelities. The disciplinary engineering models are used in a collaborative engineering framework built on Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and from modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
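
    The probabilistic core of the approach is ordinary Monte Carlo propagation: sample the experts' input uncertainty distributions, push each sample through the coupled discipline models, and read off the probability of meeting a program target. The sketch below uses invented distributions and a closed-form stand-in for the real discipline codes.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000

        # Hypothetical input uncertainties elicited from discipline experts:
        isp = rng.triangular(430, 455, 470, N)      # engine Isp [s]
        dry_mass = rng.normal(1.00, 0.08, N)        # dry-mass growth factor
        cost_factor = rng.lognormal(0.0, 0.15, N)   # cost-model multiplier

        # Stand-in system-level metric (a real pipeline would call the coupled
        # trajectory/mass/cost models through the integration framework):
        recurring_cost = 100.0 * cost_factor * dry_mass * (455.0 / isp)

        target = 110.0   # affordability target (assumed units)
        print(f"P(cost <= target) = {(recurring_cost <= target).mean():.2%}")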

  8. System level traffic shaping in disk servers with heterogeneous protocols

    Science.gov (United States)

    Cano, Eric; Kruse, Daniele Francesco

    2014-06-01

    Disk access and tape migrations compete for network bandwidth in CASTOR's disk servers, over various protocols: RFIO, Xroot, root and GridFTP. As there are a limited number of tape drives, it is important to keep them busy all the time, at their nominal speed. With potentially hundreds of user read streams per server, the bandwidth for the tape migrations has to be guaranteed at a controlled level, and not left to the fair share the system gives by default. Xroot provides a prioritization mechanism, but using it implies moving exclusively to the Xroot protocol, which is not possible in the short to mid term, as users make equal use of all protocols. The greatest commonality of all these protocols is nothing more than their use of TCP/IP. We therefore investigated the Linux kernel traffic shaper to control TCP/IP bandwidth. The performance and limitations of the traffic shaper have been characterized in a test environment, and a satisfactory working point has been found for production. Notably, the negative impact of TCP offload engines on traffic shaping and the limitations on the length of traffic-shaping rule sets were discovered and measured. Traffic shaping is now successfully deployed in the CASTOR production systems at CERN. This system-level approach could easily be transposed to other environments.
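
    On a Linux disk server, a token-bucket qdisc is one building block for such shaping. The sketch below applies a device-wide tbf cap via the tc utility; the production setup would additionally classify tape-migration traffic separately, and the device name and rates here are placeholders.

        import subprocess

        def shape(dev="eth0", rate="5gbit", burst="64kb", latency="50ms"):
            """Cap outbound bandwidth on one device with a token-bucket filter."""
            subprocess.run(["tc", "qdisc", "replace", "dev", dev, "root",
                            "tbf", "rate", rate, "burst", burst,
                            "latency", latency], check=True)

        shape()   # requires root privileges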

  9. Systems-Level Smoking Cessation Activities by Private Health Plans

    Directory of Open Access Journals (Sweden)

    Sharon Reif, PhD

    2011-01-01

    Introduction: The US Public Health Service urges providers to screen patients for smoking and advise smokers to quit. Yet these practices are not widely implemented in clinical practice. This study provides national estimates of systems-level strategies used by private health insurance plans to influence provider delivery of smoking cessation activities. Methods: Data are from a nationally representative survey of health plans for benefit year 2003, across product types offered by insurers, including health maintenance organizations (HMOs), preferred provider organizations, and point-of-service products, regarding alcohol, tobacco, drug, and mental health services. Executive directors of 368 health plans responded to the administrative module (83% response rate). Medical directors of 347 of those health plans, representing 771 products, completed the clinical module, in which health plan respondents were asked about screening for smoking, guideline distribution, and incentives for guideline adherence. Results: Only 9% of products require, and 12% verify, that primary care providers (PCPs) screen for smoking. HMOs are more likely than other product types to require screening. Only 17% of products distribute smoking cessation guidelines to PCPs, and HMOs are more likely to do this. Feedback to PCPs was most frequently used to encourage guideline adherence; financial incentives were rarely used. Furthermore, health plans that did require screening often conducted other cessation activities. Conclusion: Few private health plans have adopted techniques to encourage the use of smoking cessation activities by their providers. Increased health plan involvement is necessary to reduce tobacco use and concomitant disease in the United States.

  10. Estimating yield gaps at the cropping system level.

    Science.gov (United States)

    Guilpart, Nicolas; Grassini, Patricio; Sadras, Victor O; Timsina, Jagadish; Cassman, Kenneth G

    2017-05-01

    Yield gap analyses of individual crops have been used to estimate opportunities for increasing crop production at local to global scales, thus providing information crucial to food security. However, increases in crop production can also be achieved by improving cropping system yield through modification of the spatial and temporal arrangement of individual crops. In this paper we define the cropping system yield potential as the output from the combination of crops that gives the highest energy yield per unit of land and time, and the cropping system yield gap as the difference between the actual energy yield of an existing cropping system and the cropping system yield potential. We then provide a framework to identify alternative cropping systems which can be evaluated against the current ones. A proof-of-concept is provided with irrigated rice-maize systems at four locations in Bangladesh that represent a range of climatic conditions in that country. The proposed framework identified (i) realistic alternative cropping systems at each location, and (ii) two locations where expected improvements in crop production from changes in cropping intensity (number of crops per year) were 43% to 64% higher than from improving the management of individual crops within the current cropping systems. The proposed framework provides a tool to help assess the food production capacity of new systems (e.g. with increased cropping intensity) arising from climate change, and to assess the resource requirements (water and N) and associated environmental footprint per unit of land and production of these new systems. By expanding yield gap analysis from individual crops to the cropping system level and applying it to new systems, this framework could also help bridge the gap between yield gap analysis and cropping/farming system design.
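
    In energy terms the bookkeeping is simple, as the toy example below shows: sum the annual energy yields of the crops in the current system, compare with the best attainable crop combination per unit of land and time, and the difference is the cropping system yield gap. All numbers are invented.

        # Energy yields in GJ per hectare per year (invented values).
        actual = {"rice": 85.0, "maize": 95.0}       # current rice-maize system
        potential = {"rice": 110.0, "maize": 140.0,  # same crops at potential...
                     "extra_maize": 60.0}            # ...plus one more crop per year

        ya = sum(actual.values())       # actual cropping-system energy yield
        yp = sum(potential.values())    # cropping-system yield potential
        print(f"cropping system yield gap = {yp - ya:.0f} GJ/ha/yr")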

  11. Plausible families of compact objects with a Non Local Equation of State

    CERN Document Server

    Hernández, H

    2012-01-01

    We investigate the plausibility of some models emerging from an algorithm devised to generate a one-parameter family of interior solutions of the Einstein equations, exploring how their physical variables change as the family parameter varies. The models studied correspond to anisotropic spherical matter configurations having a non-local equation of state. This particular type of equation of state, which poses no causality problems, provides the radial pressure at a given point not only as a function of the density but as a functional of the enclosed matter distribution. We have found several model-independent tendencies as the parameter increases: the equation of state tends to become stiffer and the total mass approaches half of the external radius. Drawing on the concept of cracking of materials in General Relativity, we find that these models become more stable as the family parameter increases.

  12. Signature of Plausible Accreting Supermassive Black Holes in Mrk 261/262 and Mrk 266

    Directory of Open Access Journals (Sweden)

    Gagik Ter-Kazarian

    2013-01-01

    We address the neutrino radiation of plausible accreting supermassive black holes closely linked to the five nuclear components of the galaxy samples Mrk 261/262 and Mrk 266. We predict a time delay before neutrino emission of the same scale as the age of the Universe. The ultrahigh energy neutrinos are produced in a superdense protomatter medium via simple (quark or pionic) reactions or modified URCA processes (G. Gamow was inspired to name the process URCA after a casino in Rio de Janeiro). The resulting neutrino fluxes for quark reactions range from … to …, where … is the opening parameter. For pionic and modified URCA reactions, the fluxes are … and …, respectively. These fluxes are highly beamed along the plane of the accretion disk, peaked at ultrahigh energies, and collimated in a small opening angle ….

  13. Plausible role of nanoparticle contamination in the synthesis and properties of organic electronic materials

    Science.gov (United States)

    Ananikov, Valentine P.

    2016-12-01

    Traceless transition metal catalysis (Pd, Ni, Cu, etc.) is very difficult to achieve. Metal contamination in the synthesized products is unavoidable and the most important questions are: How to control metal impurities? What amount of metal impurities can be tolerated? What is the influence of metal impurities? In this brief review, the plausible origins of nanoparticle contamination are discussed in the framework of catalytic synthesis of organic electronic materials. Key factors responsible for increasing the probability of contamination are considered from the point of view of catalytic reaction mechanisms. The purity of the catalyst may greatly affect the molecular weight of a polymer, reaction yield, selectivity and several other parameters. Metal contamination in the final polymeric products may induce some changes in the electric conductivity, charge transport properties, photovoltaic performance and other important parameters.

  14. Spontaneous formation and base pairing of plausible prebiotic nucleotides in water.

    Science.gov (United States)

    Cafferty, Brian J; Fialho, David M; Khanam, Jaheda; Krishnamurthy, Ramanarayanan; Hud, Nicholas V

    2016-04-25

    The RNA World hypothesis presupposes that abiotic reactions originally produced nucleotides, the monomers of RNA and universal constituents of metabolism. However, compatible prebiotic reactions for the synthesis of complementary (that is, base pairing) nucleotides and mechanisms for their mutual selection within a complex chemical environment have not been reported. Here we show that two plausible prebiotic heterocycles, melamine and barbituric acid, form glycosidic linkages with ribose and ribose-5-phosphate in water to produce nucleosides and nucleotides in good yields. Even without purification, these nucleotides base pair in aqueous solution to create linear supramolecular assemblies containing thousands of ordered nucleotides. Nucleotide anomerization and supramolecular assemblies favour the biologically relevant β-anomer form of these ribonucleotides, revealing abiotic mechanisms by which nucleotide structure and configuration could have been originally favoured. These findings indicate that nucleotide formation and selection may have been robust processes on the prebiotic Earth, if other nucleobases preceded those of extant life.

  15. Complex adaptive HIV/AIDS risk reduction: Plausible implications from findings in Limpopo Province, South Africa.

    Science.gov (United States)

    Burman, Chris J; Aphane, Marota A

    2016-05-16

    This article emphasises that when working with complex adaptive systems it is possible to stimulate new social practices and/or cognitive perspectives that contribute to risk reduction, associated with reducing aggregate community viral loads. The process of achieving this is highly participatory and is methodologically possible because evidence of 'attractors' that influence the social practices can be identified using qualitative research techniques. Using findings from Limpopo Province, South Africa, we argue that working with 'wellness attractors' and increasing their presence within the HIV/AIDS landscape could influence aggregate community viral loads. While the analysis that is presented is unconventional, it is plausible that this perspective may hold potential to develop a biosocial response - which the Joint United Nations Programme on HIV and AIDS (UNAIDS) has called for - that reinforces the biomedical opportunities that are now available to achieve the ambition of ending AIDS by 2030.

  16. Reciprocity-based reasons for benefiting research participants: most fail, the most plausible is problematic.

    Science.gov (United States)

    Sofaer, Neema

    2014-11-01

    A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it.

  17. Oxidation of cefazolin by potassium permanganate: Transformation products and plausible pathways.

    Science.gov (United States)

    Li, Liping; Wei, Dongbin; Wei, Guohua; Du, Yuguo

    2016-04-01

    Cefazolin was demonstrated to exert high reactivity toward permanganate (Mn(VII)), a common oxidant in water pre-oxidation treatment. In this study, five transformation products were found, which can be classified into three categories according to their characteristic functional groups: three (di-)sulfoxide products, one sulfone product and one di-ketone product. Product analyses showed that two kinds of reactions, oxidation of the thioether and cleavage of the unsaturated C=C double bond, occurred during the transformation of cefazolin by Mn(VII). Subsequently, plausible transformation pathways under different pH conditions were proposed based on the identified products and chemical reaction principles. More importantly, simulation with a real surface-water matrix indicated that the proposed transformation pathways of cefazolin would also occur in real water treatment practice.

  18. Plausible authentication of manuka honey and related products by measuring leptosperin with methyl syringate.

    Science.gov (United States)

    Kato, Yoji; Fujinaka, Rie; Ishisaka, Akari; Nitta, Yoko; Kitamoto, Noritoshi; Takimoto, Yosuke

    2014-07-01

    Manuka honey, obtained from Leptospermum scoparium flowers in New Zealand, has strong antibacterial properties. In this study, plausible authentication of manuka honey was investigated by measuring leptosperin (methyl syringate 4-O-β-D-gentiobioside) along with methyl syringate. Despite a gradual decrease in methyl syringate content over 30 days at 50 °C, and even at a moderate 37 °C, leptosperin remained stable. A considerable correlation between nonperoxide antibacterial activity and leptosperin content was observed in 20 certified manuka honey samples. Leptosperin and methyl syringate in manuka honey and related products were analyzed using HPLC coupled with mass spectrometry. One noncertified brand displayed significant variations in leptosperin and methyl syringate contents between two samples obtained from different regions. Therefore, certification is clearly required to protect consumers from disguised and/or low-quality honey. Because leptosperin is stable during storage and specific to manuka honey, its measurement may be applicable for manuka honey authentication.

  19. A plausible simultaneous synthesis of amino acids and simple peptides on the primordial Earth.

    Science.gov (United States)

    Parker, Eric T; Zhou, Manshui; Burton, Aaron S; Glavin, Daniel P; Dworkin, Jason P; Krishnamurthy, Ramanarayanan; Fernández, Facundo M; Bada, Jeffrey L

    2014-07-28

    Following his seminal work in 1953, Stanley Miller conducted an experiment in 1958 to study the polymerization of amino acids under simulated early Earth conditions. In the experiment, Miller sparked a gas mixture of CH4, NH3, and H2O, while intermittently adding the plausible prebiotic condensing reagent cyanamide. For unknown reasons, an analysis of the samples was not reported. We analyzed the archived samples for amino acids, dipeptides, and diketopiperazines by liquid chromatography, ion mobility spectrometry, and mass spectrometry. A dozen amino acids, 10 glycine-containing dipeptides, and 3 glycine-containing diketopiperazines were detected. Miller's experiment was repeated and similar polymerization products were observed. Aqueous heating experiments indicate that Strecker synthesis intermediates play a key role in facilitating polymerization. These results highlight the potential importance of condensing reagents in generating diversity within the prebiotic chemical inventory.

  20. Evaluation and integration of cancer gene classifiers: identification and ranking of plausible drivers.

    Science.gov (United States)

    Liu, Yang; Tian, Feng; Hu, Zhenjun; DeLisi, Charles

    2015-05-11

    The number of mutated genes in cancer cells is far larger than the number of mutations that drive cancer. The difficulty this creates for identifying relevant alterations has stimulated the development of various computational approaches to distinguishing drivers from bystanders. We develop an ensemble classifier (EC) machine learning method, which integrates 10 publicly available classifiers, and apply it to breast and ovarian cancer. In particular we find the following: (1) Using both standard and non-standard metrics, EC almost always outperforms single-method classifiers, often by wide margins. (2) Of the 50 highest ranked genes for breast (ovarian) cancer, 34 (30) are associated with other cancers in either the OMIM, CGC or NCG database, a statistically significant enrichment; the remaining high-ranked genes are plausible candidate drivers. Biological implications are briefly discussed. Source codes and detailed results are available at http://www.visantnet.org/misi/driver_integration.zip.
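
    The record above integrates 10 published classifiers into one ensemble but does not spell out the integration scheme, so the following is only a minimal, hypothetical sketch of one common option, rank aggregation by Borda count; the gene names and the three input rankings are invented.

    ```python
    # Minimal sketch: rank-aggregation ensemble for candidate driver genes.
    # The individual classifier outputs below are hypothetical; the paper's EC
    # integrates 10 published classifiers whose actual scores differ.
    from collections import defaultdict

    def borda_ensemble(rankings):
        """Combine several ranked gene lists into one consensus ranking.

        rankings: list of lists, each ordered best-first by one classifier.
        Returns genes sorted by total Borda score (higher = more plausible driver).
        """
        scores = defaultdict(float)
        for ranking in rankings:
            n = len(ranking)
            for position, gene in enumerate(ranking):
                scores[gene] += n - position  # top rank earns the most points
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical outputs of three single-method classifiers:
    r1 = ["TP53", "BRCA1", "PIK3CA", "TTN"]
    r2 = ["BRCA1", "TP53", "TTN", "PIK3CA"]
    r3 = ["TP53", "PIK3CA", "BRCA1", "MUC16"]

    print(borda_ensemble([r1, r2, r3]))  # consensus list, TP53 first here
    ```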

  1. Probability, plausibility, and adequacy evaluations of the Oriente Study demonstrate that supplementation improved child growth.

    Science.gov (United States)

    Habicht, Jean-Pierre; Martorell, Reynaldo

    2010-02-01

    This article presents evidence that the high-nutrient supplement in the Oriente study (Atole) improved child growth. The evidence is presented at 4 levels. There was a causal effect of the intervention on child length, as assessed by probability analyses of the randomized, controlled trial (P < 0.05). The plausibility analyses, which included an examination of wasting, showed that the nutritional impact was due to the Atole, especially in those who were <3 y old and who suffered from diarrhea. The adequacy analyses revealed excellent biological efficacy of the Atole at the individual level. At the level of the whole population, the efficacy of impact was much less, because many children did not participate fully in the supplementation program. The external validity of the biological impact is likely to be good for populations with similar diets and medical care.

  2. Relatively Selective Production of the Simplest Criegee Intermediate in a CH4/O2 Electric Discharge: Kinetic Analysis of a Plausible Mechanism.

    Science.gov (United States)

    Nguyen, Thanh Lam; McCarthy, Michael C; Stanton, John F

    2015-07-16

    High-accuracy coupled cluster methods in combination with microcanonical semiclassical transition state theory are used to investigate a plausible formation mechanism of the simplest Criegee intermediate in a CH4/O2 discharge experiment. Our results suggest that the Criegee intermediate is produced in a three-step process: (i) production of methyl radical by cleavage of a C-H bond of CH4; (ii) association of methyl radical with molecular oxygen to form a vibrationally excited methyl peroxy radical, which is in rapid microequilibrium with the reactants; and finally, (iii) H-abstraction from CH3OO by O2, which results in the formation of cool CH2OO that has insufficient internal energy to rearrange to dioxirane.

  3. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% of missing values.
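
    As an illustration of the analogical-reasoning ingredient described above, the sketch below plausibly infers a missing value in an incomplete record from the most similar complete record. It is a deliberately flat simplification: the attribute names, values, and similarity measure are hypothetical, and the actual system reasons over Semantic Web ontologies rather than flat dictionaries.

    ```python
    # Minimal sketch of analogical (similarity-guided) plausible reasoning:
    # fill a missing attribute of an incomplete patient record from the most
    # similar complete record in a small knowledge base.

    def similarity(a, b, attrs):
        """Fraction of shared attributes on which two records agree."""
        shared = [k for k in attrs if k in a and k in b]
        if not shared:
            return 0.0
        return sum(a[k] == b[k] for k in shared) / len(shared)

    def infer_missing(patient, knowledge_base, target, attrs):
        """Return a tentative, plausibly-inferred value for `target`."""
        candidates = [r for r in knowledge_base if target in r]
        if not candidates:
            return None  # no plausible inference possible
        best = max(candidates, key=lambda r: similarity(patient, r, attrs))
        return best[target]

    # Hypothetical hepatitis-style records:
    attrs = ["fatigue", "jaundice", "ast_high"]
    kb = [
        {"fatigue": True, "jaundice": True, "ast_high": True, "biopsy": "fibrosis"},
        {"fatigue": False, "jaundice": False, "ast_high": False, "biopsy": "normal"},
    ]
    incomplete = {"fatigue": True, "jaundice": True}   # 'biopsy' is missing
    print(infer_missing(incomplete, kb, "biopsy", attrs))  # -> 'fibrosis'
    ```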

  4. Understanding Whole Systems Change in Health Care: Insights into System Level Diffusion from Nursing Service Delivery Innovations--A Multiple Case Study

    Science.gov (United States)

    Berta, Whitney; Virani, Tazim; Bajnok, Irmajean; Edwards, Nancy; Rowan, Margo

    2014-01-01

    Our study responds to calls for theory-driven approaches to studying innovation diffusion processes in health care. While most research on diffusion in health care is situated at the service delivery level, we study innovations and associated processes that have diffused to the system level, and refer to work on complex adaptive systems and whole systems change.

  5. Biologic plausibility, cellular effects, and molecular mechanisms of eicosapentaenoic acid (EPA) in atherosclerosis.

    Science.gov (United States)

    Borow, Kenneth M; Nelson, John R; Mason, R Preston

    2015-09-01

    Residual cardiovascular (CV) risk remains in dyslipidemic patients despite intensive statin therapy, underscoring the need for additional intervention. Eicosapentaenoic acid (EPA), an omega-3 polyunsaturated fatty acid, is incorporated into membrane phospholipids and atherosclerotic plaques and exerts beneficial effects on the pathophysiologic cascade from onset of plaque formation through rupture. Specific salutary actions have been reported relating to endothelial function, oxidative stress, foam cell formation, inflammation, plaque formation/progression, platelet aggregation, thrombus formation, and plaque rupture. EPA also improves atherogenic dyslipidemia characterized by reduction of triglycerides without raising low-density lipoprotein cholesterol. Other beneficial effects of EPA include vasodilation, resulting in blood pressure reductions, as well as improved membrane fluidity. EPA's effects are at least additive to those of statins when given as adjunctive therapy. In this review, we present data supporting the biologic plausibility of EPA as an anti-atherosclerotic agent with potential clinical benefit for prevention of CV events, as well as its cellular effects and molecular mechanisms of action. REDUCE-IT is an ongoing, randomized, controlled study evaluating whether the high-purity ethyl ester of EPA (icosapent ethyl) at 4 g/day combined with statin therapy is superior to statin therapy alone for reducing CV events in high-risk patients with mixed dyslipidemia. The results from this study are expected to clarify the role of EPA as adjunctive therapy to a statin for reduction of residual CV risk.

  6. Mindfulness and Cardiovascular Disease Risk: State of the Evidence, Plausible Mechanisms, and Theoretical Framework.

    Science.gov (United States)

    Loucks, Eric B; Schuman-Olivier, Zev; Britton, Willoughby B; Fresco, David M; Desbordes, Gaelle; Brewer, Judson A; Fulwiler, Carl

    2015-12-01

    The purpose of this review is to provide (1) a synopsis on relations of mindfulness with cardiovascular disease (CVD) and major CVD risk factors, and (2) an initial consensus-based overview of mechanisms and theoretical framework by which mindfulness might influence CVD. Initial evidence, often of limited methodological quality, suggests possible impacts of mindfulness on CVD risk factors including physical activity, smoking, diet, obesity, blood pressure, and diabetes regulation. Plausible mechanisms include (1) improved attention control (e.g., ability to hold attention on experiences related to CVD risk, such as smoking, diet, physical activity, and medication adherence), (2) emotion regulation (e.g., improved stress response, self-efficacy, and skills to manage craving for cigarettes, palatable foods, and sedentary activities), and (3) self-awareness (e.g., self-referential processing and awareness of physical sensations due to CVD risk factors). Understanding mechanisms and theoretical framework should improve etiologic knowledge, providing customized mindfulness intervention targets that could enable greater mindfulness intervention efficacy.

  7. A plausible (overlooked) super-luminous supernova in the SDSS Stripe 82 data

    CERN Document Server

    Kostrzewa-Rutkowska, Zuzanna; Wyrzykowski, Lukasz; Djorgovski, S George; Glikman, Eilat; Mahabal, Ashish A

    2013-01-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova peaked at M_g < -21.3 mag in the second half of September 2005, but was missed by the real-time supernova hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline time lasted at least 70 days (observed frame), closely resembling other SLSNe of SN2007bi type. Spectrum of the host galaxy reveals a redshift of z = 0.281 and a distance modulus of μ = 40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with an absolute magnitude of M_B = -18.2 ± 0.2 mag and an oxygen abundance of 12 + log[O/H] = 8.3 ± 0.2. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence for SLSNe to explode in low-luminosity ...

  8. From ether to acid: A plausible degradation pathway of glycerol dialkyl glycerol tetraethers

    Science.gov (United States)

    Liu, Xiao-Lei; Birgel, Daniel; Elling, Felix J.; Sutton, Paul A.; Lipp, Julius S.; Zhu, Rong; Zhang, Chuanlun; Könneke, Martin; Peckmann, Jörn; Rowland, Steven J.; Summons, Roger E.; Hinrichs, Kai-Uwe

    2016-06-01

    Glycerol dialkyl glycerol tetraethers (GDGTs) are ubiquitous microbial lipids with extensive demonstrated and potential roles as paleoenvironmental proxies. Despite the great attention they receive, comparatively little is known regarding their diagenetic fate. Putative degradation products of GDGTs, identified as hydroxyl and carboxyl derivatives, were detected in lipid extracts of marine sediment, seep carbonate, hot spring sediment and cells of the marine thaumarchaeon Nitrosopumilus maritimus. The distribution of GDGT degradation products in environmental samples suggests that both biotic and abiotic processes act as sinks for GDGTs. More than a hundred newly recognized degradation products afford a view of the stepwise degradation of GDGT via (1) ether bond hydrolysis yielding hydroxyl isoprenoids, namely, GDGTol (glycerol dialkyl glycerol triether alcohol), GMGD (glycerol monobiphytanyl glycerol diether), GDD (glycerol dibiphytanol diether), GMM (glycerol monobiphytanol monoether) and bpdiol (biphytanic diol); (2) oxidation of isoprenoidal alcohols into corresponding carboxyl derivatives and (3) chain shortening to yield C39 and smaller isoprenoids. This plausible GDGT degradation pathway from glycerol ethers to isoprenoidal fatty acids provides the link to commonly detected head-to-head linked long chain isoprenoidal hydrocarbons in petroleum and sediment samples. The problematic C80 to C82 tetraacids that cause naphthenate deposits in some oil production facilities can be generated from H-shaped glycerol monoalkyl glycerol tetraethers (GMGTs) following the same process, as indicated by the distribution of related derivatives in hydrothermally influenced sediments.

  9. Plausible molecular and crystal structures of chitosan/HI type II salt.

    Science.gov (United States)

    Lertworasirikul, Amornrat; Noguchi, Keiichi; Ogawa, Kozo; Okuyama, Kenji

    2004-03-15

    Chitosan/HI type II salt prepared from crab tendon was investigated by X-ray fiber diffraction. Two polymer chains and 16 iodide ions (I⁻) crystallized in a tetragonal unit cell with lattice parameters a = b = 10.68(3) Å, c (fiber axis) = 40.77(13) Å, and space group P4₁. Chitosan forms a fourfold helix with a 40.77 Å fiber period having a disaccharide as the helical asymmetric unit. One of the O-3⋯O-5 intramolecular hydrogen bonds at the glycosidic linkage is weakened by interaction with iodide ions, which seems to cause the polymer to take the 4/1-helical symmetry rather than the extended 2/1-helix. The plausible orientations of the two O-6 atoms in the helical asymmetric unit were found to be gt and gg. Two chains run through the corner and the center of the unit cell along the c-axis. They are linked by hydrogen bonds between the N-21 and O-61 atoms. Two of the four independent iodide ions are packed between the corner chains, while the other two are packed between the corner and center chains when viewed along the ab-plane. The crystal structure of the salt is stabilized by hydrogen bonds between these iodide ions and the N-21, N-22, O-32, O-61, and O-62 atoms of the polymer chains.

  10. Solvent effects on the photochemistry of 4-aminoimidazole-5-carbonitrile, a prebiotically plausible precursor of purines.

    Science.gov (United States)

    Szabla, Rafał; Sponer, Judit E; Sponer, Jiří; Sobolewski, Andrzej L; Góra, Robert W

    2014-09-01

    4-Aminoimidazole-5-carbonitrile (AICN) was suggested as a prebiotically plausible precursor of purine nucleobases and nucleotides. Although it can be formed in a sequence of photoreactions, AICN is immune to further irradiation with UV light. We present state-of-the-art multi-reference quantum-chemical calculations of potential energy surface cuts and conical intersection optimizations to explain the molecular mechanisms underlying the photostability of this compound. We have identified the N-H bond stretching and ring-puckering mechanisms that should be responsible for the photochemistry of AICN in the gas phase. We have further considered the photochemistry of AICN-water clusters, while including up to six explicit water molecules. The calculations reveal charge transfer to solvent followed by formation of an H3O⁺ cation, both of which occur on the ¹πσ* hypersurface. Interestingly, a second proton transfer to an adjacent water molecule leads to a ¹πσ*/S0 conical intersection. We suggest that this electron-driven proton relay might be characteristic of low-lying ¹πσ* states in chromophore-water clusters. Owing to its nature, this mechanism might also be responsible for the photostability of analogous organic molecules in bulk water.

  11. A biologically plausible learning rule for the Infomax on recurrent neural networks.

    Science.gov (United States)

    Hayakawa, Takashi; Kaneko, Takeshi; Aoyagi, Toshio

    2014-01-01

    A fundamental issue in neuroscience is to understand how neuronal circuits in the cerebral cortex play their functional roles through their characteristic firing activity. Several characteristics of spontaneous and sensory-evoked cortical activity have been reproduced by Infomax learning of neural networks in computational studies. There are, however, still few models of the underlying learning mechanisms that allow cortical circuits to maximize information and produce the characteristics of spontaneous and sensory-evoked cortical activity. In the present article, we derive a biologically plausible learning rule for the maximization of information retained through time in dynamics of simple recurrent neural networks. Applying the derived learning rule in a numerical simulation, we reproduce the characteristics of spontaneous and sensory-evoked cortical activity: cell-assembly-like repeats of precise firing sequences, neuronal avalanches, spontaneous replays of learned firing sequences and orientation selectivity observed in the primary visual cortex. We further discuss the similarity between the derived learning rule and the spike timing-dependent plasticity of cortical neurons.
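
    The derived rule itself is not reproduced in this record. Purely to give a flavor of what "biologically plausible" means here, the toy below updates each recurrent weight using only locally available pre- and postsynaptic activity (a generic Hebbian-style covariance rule with weight decay); it is emphatically not the Infomax rule of the paper, and all parameters are arbitrary.

    ```python
    import numpy as np

    # Toy network: 50 rate units with leaky dynamics and a local,
    # Hebbian-style covariance update. Each weight change depends only on
    # its own pre- and postsynaptic activity, which is the locality
    # ("biological plausibility") constraint the paper is after.
    rng = np.random.default_rng(0)
    n, eta, leak = 50, 1e-3, 0.1
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    x = rng.normal(size=n)
    mean = np.zeros(n)                       # running mean of each unit's rate

    for _ in range(5000):
        drive = np.tanh(W @ x + 0.1 * rng.normal(size=n))  # noisy recurrent input
        x = (1 - leak) * x + leak * drive                  # leaky rate dynamics
        mean += 0.01 * (x - mean)
        xc = x - mean
        W += eta * (np.outer(xc, xc) - 0.1 * W)            # local update + decay

    print("spectral radius after learning:",
          round(float(np.abs(np.linalg.eigvals(W)).max()), 3))
    ```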

  12. Plausible ergogenic effects of vitamin D on athletic performance and recovery.

    Science.gov (United States)

    Dahlquist, Dylan T; Dieter, Brad P; Koehle, Michael S

    2015-01-01

    The purpose of this review is to examine vitamin D in the context of sport nutrition and its potential role in optimizing athletic performance. Vitamin D receptors (VDR) and vitamin D response elements (VDREs) are located in almost every tissue within the human body including skeletal muscle. The hormonally-active form of vitamin D, 1,25-dihydroxyvitamin D, has been shown to play critical roles in the human body and regulates over 900 gene variants. Based on the literature presented, it is plausible that vitamin D levels above the normal reference range (up to 100 nmol/L) might increase skeletal muscle function, decrease recovery time from training, increase both force and power production, and increase testosterone production, each of which could potentiate athletic performance. Therefore, maintaining higher levels of vitamin D could prove beneficial for athletic performance. Despite this situation, large portions of athletic populations are vitamin D deficient. Currently, the research is inconclusive with regards to the optimal intake of vitamin D, the specific forms of vitamin D one should ingest, and the distinct nutrient-nutrient interactions of vitamin D with vitamin K that affect arterial calcification and hypervitaminosis. Furthermore, it is possible that dosages exceeding the recommendations for vitamin D (i.e. dosages up to 4000-5000 IU/day), in combination with 50 to 1000 mcg/day of vitamin K1 and K2 could aid athletic performance. This review will investigate these topics, and specifically their relevance to athletic performance.

  13. A simple biophysically plausible model for long time constants in single neurons.

    Science.gov (United States)

    Tiganj, Zoran; Hasselmo, Michael E; Howard, Marc W

    2015-01-01

    Recent work in computational neuroscience and cognitive psychology suggests that a set of cells that decay exponentially could be used to support memory for the time at which events took place. Analytically and through simulations on a biophysical model of an individual neuron, we demonstrate that exponentially decaying firing with a range of time constants up to minutes could be implemented using a simple combination of well-known neural mechanisms. In particular, we consider firing supported by calcium-controlled cation current. When the amount of calcium leaving the cell during an interspike interval is larger than the calcium influx during a spike, the overall decay in calcium concentration can be exponential, resulting in exponential decay of the firing rate. The time constant of the decay can be several orders of magnitude larger than the time constant of calcium clearance, and it could be controlled externally via a variety of biologically plausible ways. The ability to flexibly and rapidly control time constants could enable working memory of temporal history to be generalized to other variables in computing spatial and ordinal representations.
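
    A minimal numerical sketch of the mechanism described above, with toy parameters chosen for clarity rather than biological realism: calcium enters with each spike, is cleared exponentially between spikes, and drives the firing rate. Because the influx per spike nearly cancels the efflux per interspike interval, the fitted decay constant of the firing rate comes out far longer than the calcium clearance time constant.

    ```python
    import numpy as np

    tau_ca = 0.05    # calcium clearance time constant (s)
    gain   = 9800.0  # firing rate per unit calcium concentration (Hz)
    influx = 0.002   # Ca2+ entering per spike; almost balances the efflux
                     # during one interspike interval, so [Ca] decays slowly

    ca, t = 1.0, 0.0
    ts, ln_rates = [], []
    while t < 2.5:
        rate = gain * ca                 # firing rate tracks calcium level
        isi = 1.0 / rate                 # next interspike interval
        ca = (ca + influx) * np.exp(-isi / tau_ca)  # influx, then clearance
        t += isi
        ts.append(t)
        ln_rates.append(np.log(rate))

    # The firing rate decays close to exponentially; fit its time constant:
    slope = np.polyfit(ts, ln_rates, 1)[0]
    print(f"effective decay tau ~ {-1.0 / slope:.1f} s, "
          f"calcium clearance tau_ca = {tau_ca} s")
    ```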

  14. A plausible mechanism of biosorption in dual symbioses by vesicular-arbuscular mycorrhizal in plants.

    Science.gov (United States)

    Azmat, Rafia; Hamid, Neelofer

    2015-03-01

    The dual symbiosis of vesicular-arbuscular mycorrhizal (VAM) fungi with Momordica charantia was elucidated in terms of a plausible mechanism of biosorption in this article. The experiment was conducted in a greenhouse, using a mixed inoculum of VAM fungi in three replicates. Results demonstrated that starch was the main carbon source used by the VAM fungi to build their hyphae. The increased plant height and leaf surface area were explained by an increase in photosynthetic rate, which rapidly produces the sugars needed for plant survival. Decreased protein and amino acid contents, together with increased proline content and protease activity in VAM plants, suggested that these are the main bio-indicators of plants under biotic stress. The decline in protein may be due to its degradation into products that are later converted into dextrose, which the fungus can easily absorb during symbiosis. A mechanism of carbon chemisorption in relation to plant physiology and morphology is discussed.

  15. Vitamin D in primary biliary cirrhosis, a plausible marker of advanced disease.

    Science.gov (United States)

    Agmon-Levin, Nancy; Kopilov, Ron; Selmi, Carlo; Nussinovitch, Udi; Sánchez-Castañón, María; López-Hoyos, Marcos; Amital, Howie; Kivity, Shaye; Gershwin, Eric M; Shoenfeld, Yehuda

    2015-02-01

    The immune-modulating effects of vitamin D have been extensively studied, and low levels have been linked with autoimmune diseases. The associations of vitamin D with autoimmune diseases of the liver, and particularly primary biliary cirrhosis (PBC), are yet to be defined. Hence, in this study, serum levels of vitamin D were determined in 79 patients with PBC and 70 age- and sex-matched controls by LIAISON chemiluminescent immunoassays (DiaSorin, Italy). Clinical and serological parameters of patients were analyzed with respect to vitamin D status. Mean levels of vitamin D were significantly lower among patients with PBC compared with controls (16.8 ± 9 vs. 22.1 ± 9 ng/ml; p = 0.029), and vitamin D deficiency (≤10 ng/ml) was documented in 33% of patients with PBC versus 7% of controls. These findings support plausible roles of vitamin D as a prognostic marker of PBC severity, and as a potential player in this disease's pathogenesis. While further studies are awaited, monitoring vitamin D in patients with PBC and the use of supplements may be advisable.

  16. Plausible futures of a social-ecological system: Yahara watershed, Wisconsin, USA

    Directory of Open Access Journals (Sweden)

    Stephen R. Carpenter

    2015-06-01

    Agricultural watersheds are affected by changes in climate, land use, agricultural practices, and human demand for energy, food, and water resources. In this context, we analyzed the agricultural, urbanizing Yahara watershed (size: 1345 km²; population: 372,000) to assess its responses to multiple changing drivers. We measured recent trends in land use/cover and water quality of the watershed, spatial patterns of 10 ecosystem services, and spatial patterns and nestedness of governance. We developed scenarios for the future of the Yahara watershed by integrating trends and events from the global scenarios literature, perspectives of stakeholders, and models of biophysical drivers and ecosystem services. Four qualitative scenarios were created to explore plausible trajectories to the year 2070 in the watershed's social-ecological system under different regimes: no action on environmental trends, accelerated technological development, strong intervention by government, and shifting values toward sustainability. Quantitative time-series for 2010-2070 were developed for weather and land use/cover during each scenario as inputs to model changes in ecosystem services. Ultimately, our goal is to understand how changes in the social-ecological system of the Yahara watershed, including management of land and water resources, can build or impair resilience to shifting drivers, including climate.

  17. Plausible impact of global climate change on water resources in the Tarim River Basin

    Institute of Scientific and Technical Information of China (English)

    CHEN Yaning; XU Zongxue

    2005-01-01

    Combining temperature and precipitation data from 77 climatological stations with climatic and hydrological change data from the three headstreams of the Tarim River (the Hotan, Yarkant, and Aksu), the plausible association between climate change and the variability of water resources in the Tarim River Basin in recent years was investigated, long-term trends in the hydrological time series (temperature, precipitation, and streamflow) were detected, and the possible association between the El Niño/Southern Oscillation (ENSO) and these three time series was tested. The results show that over the past years the temperature experienced a significant monotonic increase, amounting to nearly a 1 °C rise; precipitation decreased significantly in the 1970s and increased significantly in the 1980s and 1990s, with average annual precipitation rising by 6.8 mm per decade. A step change occurred in both the temperature and precipitation series around 1986, which may reflect global climate change. Climate change increased streamflow at the headwaters of the Tarim River, but anthropogenic activities such as over-depletion of surface water decreased streamflow in the lower reaches of the Tarim River. The study also found no significant association between ENSO and the temperature, precipitation, or streamflow series.
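
    The abstract does not name the trend-detection method used; monotonic trends in hydrological series like these are commonly detected with the nonparametric Mann-Kendall test, sketched below without tie correction and with hypothetical data.

    ```python
    import math

    def mann_kendall_z(series):
        """Mann-Kendall trend statistic Z for a time series (no tie correction)."""
        n = len(series)
        s = sum(
            (series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1)
            for j in range(i + 1, n)
        )
        var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S under no trend
        if s > 0:
            return (s - 1) / math.sqrt(var_s)
        if s < 0:
            return (s + 1) / math.sqrt(var_s)
        return 0.0

    # Hypothetical annual mean temperatures (deg C); |Z| > 1.96 means a
    # significant monotonic trend at the 5% level:
    temps = [8.1, 8.0, 8.3, 8.2, 8.4, 8.6, 8.5, 8.8, 8.9, 9.1]
    print(f"Z = {mann_kendall_z(temps):.2f}")
    ```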

  18. Flux-based transport enhancement as a plausible unifying mechanism for auxin transport in meristem development.

    Directory of Open Access Journals (Sweden)

    Szymon Stoma

    2008-10-01

    Plants continuously generate new organs through the activity of populations of stem cells called meristems. The shoot apical meristem initiates leaves, flowers, and lateral meristems in highly ordered, spiralled, or whorled patterns via a process called phyllotaxis. It is commonly accepted that the active transport of the plant hormone auxin plays a major role in this process. Current hypotheses propose that cellular hormone transporters of the PIN family would create local auxin maxima at precise positions, which in turn would lead to organ initiation. To explain how auxin transporters could create hormone fluxes to distinct regions within the plant, different concepts have been proposed. A major hypothesis, canalization, proposes that the auxin transporters act by amplifying and stabilizing existing fluxes, which could be initiated, for example, by local diffusion. This convincingly explains the organised auxin fluxes during vein formation, but for the shoot apical meristem a second hypothesis was proposed, where the hormone would be systematically transported towards the areas with the highest concentrations. This implies the coexistence of two radically different mechanisms for PIN allocation in the membrane, one based on flux sensing and the other on local concentration sensing. Because these patterning processes require the interaction of hundreds of cells, it is impossible to estimate on a purely intuitive basis if a particular scenario is plausible or not. Therefore, computational modelling provides a powerful means to test this type of complex hypothesis. Here, using a dedicated computer simulation tool, we show that a flux-based polarization hypothesis is able to explain auxin transport at the shoot meristem as well, thus providing a unifying concept for the control of auxin distribution in the plant. Further experiments are now required to distinguish between flux-based polarization and other hypotheses.
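
    A schematic 1-D illustration of flux-based polarization, loosely in the spirit of canalization models (not the authors' dedicated simulation tool): carriers on each cell wall are reinforced in proportion to the auxin flux already crossing it, so a small initial bias near an auxin source grows into a polarized transport file. All parameters are illustrative.

    ```python
    import numpy as np

    n_cells, dt, steps = 10, 0.01, 5000
    diff, decay, prod = 0.1, 0.1, 0.1
    a = np.ones(n_cells)                  # auxin per cell
    p_right = np.full(n_cells - 1, 0.5)   # carriers pumping cell i -> i+1
    p_left  = np.full(n_cells - 1, 0.5)   # carriers pumping cell i+1 -> i
    p_right[0] = 0.6                      # small initial bias at the left end

    for _ in range(steps):
        # Net rightward flux across each wall: facilitated efflux + diffusion
        flux = p_right * a[:-1] - p_left * a[1:] + diff * (a[:-1] - a[1:])
        # Flux feedback: walls carrying more flux recruit more carriers
        p_right += dt * (np.maximum(flux, 0.0) - decay * p_right)
        p_left  += dt * (np.maximum(-flux, 0.0) - decay * p_left)
        a[0]  += dt * prod                # auxin source at the left end
        a[:-1] -= dt * flux               # transport moves auxin down the file
        a[1:]  += dt * flux
        a[-1]  -= dt * a[-1]              # sink at the right end

    print("polarity (right minus left carriers):", np.round(p_right - p_left, 2))
    ```

    After a few thousand steps every wall ends up polarized toward the sink, which is the signature behaviour of flux-based reinforcement.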

  19. Identifying and reducing potentially wrong immunoassay results even when plausible and "not-unreasonable".

    Science.gov (United States)

    Ismail, Adel A A

    2014-01-01

    The primary role of the clinical laboratory is to report accurate results for the diagnosis of disease and the management of illness. This goal has, to a large extent, been achieved for routine biochemical tests, but not for immunoassays, which remain susceptible to interference from endogenous immunoglobulin antibodies, causing false and clinically misleading results. Clinicians regard all abnormal results, including false ones, as "pathological," necessitating further investigations or leading to erroneous diagnoses. Even more seriously, false-negative results may wrongly exclude pathology, thus denying patients necessary treatment. The analytical error rate in immunoassays is relatively high, ranging from 0.4% to 4.0%. Because analytical interference from endogenous antibodies is confined to individual sera, it can be inconspicuous, pernicious, sporadic, and insidious, as it cannot be detected by internal or external quality assessment procedures. An approach based on Bayesian reasoning can enhance the robustness of clinical validation in highlighting potentially erroneous immunoassay results. When this rational clinical/statistical approach is followed by affirmative analytical follow-up tests, it can help identify inaccurate and clinically misleading immunoassay data even when they appear plausible and "not-unreasonable." This chapter is largely based on peer-reviewed articles associated with and related to this approach. The first section underlines (without mathematical equations) the dominance and misuse of conventional statistics and the underuse of the Bayesian paradigm, and shows that laboratorians are intuitively (albeit unwittingly) practicing Bayesians. Secondly, because interference from endogenous antibodies is method-dependent (with numerous formats and different reagents), it is almost impossible to accurately assess its incidence in all differently formulated immunoassays and for each analyte/biomarker. However, reiterating the basic concepts
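
    The core of the Bayesian argument is that a result's plausibility depends on the pre-test probability, not only on assay performance. A minimal sketch with hypothetical numbers: even a 99%-sensitive, 99%-specific assay yields mostly false positives when prevalence is low.

    ```python
    def posterior_prob_true_positive(prevalence, sensitivity, specificity):
        """P(disease | positive immunoassay) by Bayes' theorem."""
        p_pos_given_d = sensitivity
        p_pos_given_not_d = 1.0 - specificity
        p_pos = (prevalence * p_pos_given_d
                 + (1.0 - prevalence) * p_pos_given_not_d)
        return prevalence * p_pos_given_d / p_pos

    # Hypothetical assay and population: at 0.5% prevalence, a positive
    # result is more likely false than true (~0.33 posterior probability):
    print(posterior_prob_true_positive(prevalence=0.005,
                                       sensitivity=0.99,
                                       specificity=0.99))
    ```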

  20. A plausible (overlooked) super-luminous supernova in the Sloan digital sky survey stripe 82 data

    Energy Technology Data Exchange (ETDEWEB)

    Kostrzewa-Rutkowska, Zuzanna; Kozłowski, Szymon; Wyrzykowski, Łukasz [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Djorgovski, S. George; Mahabal, Ashish A. [California Institute of Technology, 1200 E California Blvd., Pasadena, CA 91125 (United States); Glikman, Eilat [Department of Physics and Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520-8121 (United States); Koposov, Sergey, E-mail: zkostrzewa@astrouw.edu.pl, E-mail: simkoz@astrouw.edu.pl, E-mail: wyrzykow@astrouw.edu.pl [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom)

    2013-12-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova (SN) peaked at m_g < 19.4 mag in the second half of 2005 September, but was missed by the real-time SN hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline time lasted at least 70 days (observed frame), closely resembling other SLSNe of SN 2007bi type. The spectrum of the host galaxy reveals a redshift of z = 0.281 and a distance modulus of μ = 40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with an absolute magnitude of M_B = -18.2 ± 0.2 mag and an oxygen abundance of 12 + log[O/H] = 8.3 ± 0.2; hence, the SN peaked at M_g < -21.3 mag. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence for SLSNe to explode in low-luminosity (dwarf) galaxies only. The available information on the PSN 000123+000504 light curve suggests the magnetar-powered model as a likely scenario for this event. This SLSN is a new addition to a quickly growing family of super-luminous SNe.

  1. Bio-physically plausible visualization of highly scattering fluorescent neocortical models for in silico experimentation

    KAUST Repository

    Abdellah, Marwan

    2017-02-15

    Background: We present a visualization pipeline capable of accurate rendering of highly scattering fluorescent neocortical neuronal models. The pipeline is mainly developed to serve the computational neurobiology community. It allows the scientists to visualize the results of their virtual experiments that are performed in computer simulations, or in silico. The impact of the presented pipeline opens novel avenues for assisting the neuroscientists to build biologically accurate models of the brain. These models result from computer simulations of physical experiments that use fluorescence imaging to understand the structural and functional aspects of the brain. Due to the limited capabilities of the current visualization workflows to handle fluorescent volumetric datasets, we propose a physically-based optical model that can accurately simulate light interaction with fluorescent-tagged scattering media based on the basic principles of geometric optics and Monte Carlo path tracing. We also develop an automated and efficient framework for generating dense fluorescent tissue blocks from a neocortical column model that is composed of approximately 31000 neurons. Results: Our pipeline is used to visualize a virtual fluorescent tissue block of 50 μm³ that is reconstructed from the somatosensory cortex of juvenile rat. The fluorescence optical model is qualitatively analyzed and validated against experimental emission spectra of different fluorescent dyes from the Alexa Fluor family. Conclusion: We discussed a scientific visualization pipeline for creating images of synthetic neocortical neuronal models that are tagged virtually with fluorescent labels on a physically-plausible basis. The pipeline is applied to analyze and validate simulation data generated from neuroscientific in silico experiments.
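
    As a rough sketch of the Monte Carlo ingredient only, the toy below traces photons through a scattering, absorbing, fluorescent slab with exponential free paths and a fixed quantum yield. All coefficients and the trivial geometry are hypothetical; the paper's model is a full physically-based path tracer validated against measured emission spectra.

    ```python
    import math, random

    mu_s, mu_a = 10.0, 1.0    # scattering / absorption coefficients (1/mm)
    quantum_yield = 0.8       # chance an absorbed photon is re-emitted
    thickness = 1.0           # slab thickness (mm)

    def trace_photon(rng):
        z, fluoresced = 0.0, False
        while True:
            # Exponentially distributed free path, isotropic direction cosine:
            step = -math.log(1.0 - rng.random()) / (mu_s + mu_a)
            z += step * (2.0 * rng.random() - 1.0)
            if z < 0.0 or z > thickness:
                return fluoresced              # photon escaped the slab
            if rng.random() < mu_a / (mu_s + mu_a):  # interaction = absorption
                if fluoresced or rng.random() > quantum_yield:
                    return None                # photon lost inside the medium
                fluoresced = True              # re-emitted once, keep tracing
            # otherwise: elastic scattering, continue the walk

    rng = random.Random(42)
    photons = [trace_photon(rng) for _ in range(20_000)]
    print("escaping fluorescence fraction:",
          sum(1 for p in photons if p) / len(photons))
    ```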

  2. System-Level Validation through Post-Flight Reconstruction and Anchoring

    Science.gov (United States)

    2008-12-30

    System-Level Post-Flight Reconstruction (PFR): manually recreate and run a past flight; perform cause analysis of the system-level anomalies found in the PFR; and generate, test, and implement M&S improvements to address those anomalies. System-level validation is built on individual element validation.

  3. Relevance theory: pragmatics and cognition.

    Science.gov (United States)

    Wearing, Catherine J

    2015-01-01

    Relevance Theory is a cognitively oriented theory of pragmatics, i.e., a theory of language use. It builds on the seminal work of H.P. Grice(1) to develop a pragmatic theory which is at once philosophically sensitive and empirically plausible (in both psychological and evolutionary terms). This entry reviews the central commitments and chief contributions of Relevance Theory, including its Gricean commitment to the centrality of intention-reading and inference in communication; the cognitively grounded notion of relevance which provides the mechanism for explaining pragmatic interpretation as an intention-driven, inferential process; and several key applications of the theory (lexical pragmatics, metaphor and irony, procedural meaning). Relevance Theory is an important contribution to our understanding of the pragmatics of communication.

  4. Plausible antioxidant biomechanics and anticonvulsant pharmacological activity of brain-targeted β-carotene nanoparticles.

    Science.gov (United States)

    Yusuf, Mohammad; Khan, Riaz A; Khan, Maria; Ahmed, Bahar

    2012-01-01

    General tonic–clonic seizure latency was increased significantly to 191.0 ± 9.80 seconds with BCNP, and was further increased in P-80-BCNP to 231.0 ± 16.30 seconds, as compared to PTZ (120.10 ± 4.50 seconds) and placebo control (120.30 ± 7.4 seconds). The results of this study demonstrate a plausible novel anticonvulsant activity of β-carotene at a low dose of 2 mg/kg, with brain-targeted nanodelivery, thus increasing its bioavailability and stability.

  5. Plausibility check of a redesigned rain-on-snow simulator (RASA)

    Science.gov (United States)

    Rössler, Ole; Probst, Sabine; Weingartner, Rolf

    2016-04-01

    Rain-on-snow events are fascinating but still not completely understood processes. Although several studies describing past events and theoretical treatments have been published over the decades, empirical data on what happens inside the snow cover are far scarcer. Rain-on-snow simulators are one way to fill this gap in empirical data. In 2013, Juras et al. published their inspiring idea of a portable rain-on-snow simulator. The huge advantage of this device, in contrast to other purely field-based experiments, is its fixed, largely standardized conditions and the possibility of measuring, at the same time, all the data required to monitor water fluxes and melting processes. Mounted in a convenient location, a large number of experiments can be conducted relatively easily. We applied and further developed the original device and checked the plausibility of the results of this redesigned version, called RASA. The principal design was borrowed from the original: a frame with a sprinkler on top and a snow sample in a box at the bottom, from which the outflow is measured with a tipping gauge. We added a moving sprinkling plate to ensure a uniform distribution of raindrops on the snow and, most importantly, we suspended the watered snow sample on weighing cells. The latter enables continuous weighing of the snow sample throughout the experiment and thus the indirect quantification of liquid water saturation, water holding capacity, and snowmelt amount via balance equations. As it remains unclear whether such a device can reproduce known processes, a hypothesis-based plausibility check was carried out: eight hypotheses were derived from the literature and tested in 28 experiments with the RASA mounted at 2000 m elevation. In general, we were able to reproduce most of the hypotheses. The RASA proved to be a very valuable device that generates suitable results and has the potential to extend the empirical-experimental data
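
    A schematic of the balance bookkeeping just described: assuming the sample mass changes only through sprinkled rain input and measured outflow (i.e., no melt during sprinkling), the retained liquid water, and from it a water-holding-capacity estimate, follows directly. All series and numbers below are hypothetical simplifications of what the real device measures continuously.

    ```python
    def retention_series(rain_in, outflow):
        """Cumulative liquid water retained in the sample (kg), assuming the
        sample mass changes only through rain input and measured outflow."""
        retained, series = 0.0, []
        for p, q in zip(rain_in, outflow):
            retained += p - q          # mass balance per time step
            series.append(retained)
        return series

    # Hypothetical per-minute series (kg): sprinkling at a constant rate,
    # with outflow starting only once the snow can hold no more liquid water.
    rain = [0.02] * 10
    flow = [0.0] * 6 + [0.02] * 4
    retained = retention_series(rain, flow)

    snow_mass = 2.0                    # initial dry snow sample mass (kg)
    capacity = max(retained) / snow_mass
    print(f"water holding capacity ~ {capacity:.1%} of snow mass")
    ```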

  6. Exploring apposite therapeutic target for apoptosis in filarial parasite: a plausible hypothesis.

    Science.gov (United States)

    Hande, Sneha; Goswami, Kalyan; Jena, Lingaraj; Reddy, Maryada Venkata Rami

    2014-03-01

    Human lymphatic filariasis is a parasitic disease with a profound socioeconomic burden owing to its associated disability, affecting predominantly, but not exclusively, the developing nations of the tropics and subtropics. Several technical issues, such as a poor therapeutic and preventive repertoire, as well as administrative and infrastructural limitations, jeopardize salvage measures and further complicate the plight. Therefore, considering the gravity of the problem, the WHO has mandated (under its tropical disease research scheme) an emphasis on validating novel therapeutic targets against this disease, which carries the unfortunate tag of 'neglected tropical disease'. However, a dearth of knowledge of parasite biology, coupled with difficulty of access to parasitic material from suitable animal models and the growing cost burden of high-end research, poses a formidable challenge. Based on recent research evidence, we here propose targeted apoptotic impact as a novel rationale to be exploited in anti-parasitic drug development. The new era of bioinformatics ushers in new optimism, with a wide range of genomic and proteomic databases in the public domain. Such platforms might offer wonders for drug research but require highly selective criteria. To test our hypothesis presumptively, we deployed a scheme for identifying target proteins of filarial parasitic origin through a wide database search, with the precise criteria of non-homology against the host and functional essentiality for the parasite. Proteins with growth potential were then screened from this list of essential non-homologous proteins to mine out a suitable representative target for ensuing apoptotic impact through effective inhibitors. A unique enzyme, RNA-dependent RNA polymerase, which besides its vital role in RNA viruses is believed to have a regulatory role in gene expression, emerged as a plausible target. This protein

  7. Lead-induced SCC of alloy 600 in plausible steam generator crevice environments

    Energy Technology Data Exchange (ETDEWEB)

    Wright, M.D. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Manolescu, A. [Ontario Hydro Technologies, Toronto, Ontario (Canada); Mirzai, M. [Ontario Hydro, Toronto, Ontario (Canada)

    1998-07-01

    Laboratory stress corrosion cracking (SCC) test environments developed to simulate representative BNGS-A steam generator (SG) crevice chemistries have been used to determine the susceptibility of Alloy 600 to lead-induced SCC under plausible SG conditions. Test environments were based on plant SG hideout return data and analysis of removed tubes and deposits. Deviations from the normal near neutral crevice pH environment were considered to simulate possible faulted excursion crevice chemistry and to bound the postulated crevice pH range of 3-9 (at temperature). The effect of lead contamination up to 1000 ppm, but with an emphasis on the 100 to 500 ppm range, was determined. SCC susceptibility was investigated using constant extension rate tensile (CERT) tests and encapsulated C-ring tests. CERT tests were performed at 305 degrees C on tubing representative of BNGS-A SG U-bends. The C-ring test method allowed a wider test matrix covering three temperatures (280, 304 and 315 degrees C), three strain levels (0.2%, 2% and 4%) and tubing representative of U-bends plus tubing given a simulated stress relief to represent material at the tubesheet. The results of this test program confirmed that in the absence of lead contamination, cracking does not occur in these concentrated, 3.3 to 8.9 pH range, crevice environments. Also, it appears that the concentrated crevice environments suppress lead-induced cracking relative to that seen in all-volatile-treatment (AVT) water. For the (static) C-ring tests, lead-induced SCC was only produced in the near-neutral crevice environment and was more severe at 500 ppm than 100 ppm PbO. This trend was also observed in CERT tests but some cracking/grain boundary attack occurred in acidic (pH 3.3) and alkaline (pH 8.9) environments. The C-ring tests indicated that a certain amount of resistance to cracking was imparted by simulated stress relief of the tubing. This heat treatment, confirmed to have resulted in sensitization, promoted

  8. Lead-induced stress-corrosion cracking of alloy 600 in plausible steam generator crevice environments

    Energy Technology Data Exchange (ETDEWEB)

    Wright, M.D. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Manolescu, A. [Ontario Hydro Technologies, Toronto, Ontario (Canada); Mirzai, M. [Ontario Hydro, Toronto, Ontario (Canada)

    1999-03-01

    Laboratory stress-corrosion cracking (SCC) test environments were developed to simulate crevice chemistries representative of Bruce Nuclear Generating Station A (BNPD A) steam generators (SGs); these test environments were used to determine the susceptibility of Alloy 600 to lead-induced SCC under plausible SG conditions. Test environments were based on plant SG hideout return data and analysis of removed tubes and deposits. Deviations from the normal near-neutral crevice pH environment were considered to simulate possible faulted excursion crevice chemistry and to bound the postulated crevice pH range of 3 to 9 (at temperature). The effect of lead contamination up to 1000 ppm, but with an emphasis on the 100- to 500-ppm range, was determined. SCC susceptibility was investigated using constant extension rate tensile (CERT) tests and encapsulated C-ring tests. CERT tests were performed at 305 degrees C on tubing representative of BNPD A SG U-bends. The C-ring test method allowed a wider test matrix, covering 3 temperatures (280 degrees C, 304 degrees C and 315 degrees C), 3 strain levels (0.2%, 2% and 4%), and tubing representative of U-bends plus tubing given a simulated stress relief to represent material at the tube sheet. The results of this test program confirmed that in the absence of lead contamination, cracking does not occur in these concentrated, 3.3 to 8.9 pH range, crevice environments. Also, it appears that the concentrated crevice environments suppress lead-induced cracking relative to that seen in all-volatile-treatment (AVT) water. For the (static) C-ring tests, lead-induced SCC was only produced in the near-neutral crevice environment and was more severe at 500 ppm than at 100 ppm PbO. This trend was also observed in CERT tests, but some cracking-grain boundary attack occurred in acidic (pH 3.3) and alkaline (pH 8.9) environments. The C-ring tests indicated that a certain amount of resistance to cracking was imparted by simulated stress relief of

  9. Design space pruning through hybrid analysis in system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system archi- tectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size

  10. Pruning techniques for multi-objective system-level design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.

    2014-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of
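
    A minimal sketch of the pruning operation at the heart of multi-objective DSE: reducing a set of candidate design points to its Pareto front. The design points and the two objectives (latency, energy) are hypothetical; in real system-level DSE each candidate's objectives come from simulation or analytical models of the architecture.

    ```python
    def pareto_front(points):
        """Keep only non-dominated (latency, energy) points (lower is better)."""
        front = []
        for p in points:
            dominated = any(
                q[0] <= p[0] and q[1] <= p[1] and q != p for q in points
            )
            if not dominated:
                front.append(p)
        return front

    # (latency ms, energy mJ) for candidate processor/mapping configurations:
    candidates = [(10, 50), (12, 40), (9, 70), (15, 35), (11, 45), (14, 60)]
    print(pareto_front(candidates))  # dominated designs are pruned away
    ```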

  11. NASA: A generic infrastructure for system-level MP-SoC design space exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Pimentel, A.D.; Thompson, M.; Bautista, T.; Núñez, A.

    2010-01-01

    System-level simulation and design space exploration (DSE) are key ingredients for the design of multiprocessor system-on-chip (MP-SoC) based embedded systems. The efforts in this area, however, typically use ad-hoc software infrastructures to facilitate and support the system-level DSE experiments.

  12. System-level synthesis of multi-ASIP platforms using an uncertainty model

    DEFF Research Database (Denmark)

    Micconi, Laura; Madsen, Jan; Pop, Paul

    2015-01-01

    In this paper we propose a system-level synthesis for MPSoCs that integrates multiple Application Specific Instruction Set Processors (ASIPs). Each ASIP is customized for a specific set of tasks. The system-level synthesis is responsible for assigning the tasks to the ASIPs, exploring different...

  13. Interleaving methods for hybrid system-level MPSoC design space exploration

    NARCIS (Netherlands)

    Piscitelli, R.; Pimentel, A.D.; McAllister, J.; Bhattacharyya, S.

    2012-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded system architectures. During system-level DSE, system parameters like, e.g., the number and type of processors, the type and size of

  15. Exploiting Domain Knowledge in System-level MPSoC Design Space Exploration

    NARCIS (Netherlands)

    Thompson, M.; Pimentel, A.D.

    2013-01-01

    System-level design space exploration (DSE), which is performed early in the design process, is of eminent importance to the design of complex multi-processor embedded multimedia systems. During system-level DSE, system parameters like, e.g., the number and type of processors, and the mapping of

  17. Homelessness Outcome Reporting Normative Framework: Systems-Level Evaluation of Progress in Ending Homelessness

    Science.gov (United States)

    Austen, Tyrone; Pauly, Bernie

    2012-01-01

    Homelessness is a serious and growing issue. Evaluations of systemic-level changes are needed to determine progress in reducing or ending homelessness. The report card methodology is one means of systems-level assessment. Rather than solely establishing an enumeration, homelessness report cards can capture pertinent information about structural…

  18. Is air pollution a plausible candidate for prenatal exposure in autism spectrum disorder (ASD)?: a systematic review / by Dhanashree Vernekar

    OpenAIRE

    Vernekar, Dhanashree

    2013-01-01

    Objective: To present a systematic review of existing literature investigating the biological plausibility of prenatal exposure to hazardous air pollutants (HAPs) in the etiology of autism spectrum disorder (ASD) and related outcomes. Method: The electronic databases Pubmed, Biomed Central and the National Database for Autism Research, as well as grey literature pertaining to the association of air pollution with ASD and related outcomes, were searched using specific keywords. The search included 190 HAPs as defi...

  19. Metal ion binding with dehydroannulenes - Plausible two-dimensional molecular sieves

    Indian Academy of Sciences (India)

    B Sateesh; Y Soujanya; G Narahari Sastry

    2007-09-01

    Theoretical investigations have been carried out at the B3LYP/6-311++G** level of theory to study the binding interactions of various metal ions (Li+, Na+ and K+) with dehydroannulene systems. The present study reveals that alkali metal ions bind strongly to dehydroannulenes and that passage through the central cavity is controlled by the size of the metal ion and the dimensions of the dehydroannulene cavity.

  20. Effective Teacher Practice on the Plausibility of Human-Induced Climate Change

    Science.gov (United States)

    Niepold, F.; Sinatra, G. M.; Lombardi, D.

    2013-12-01

    Climate change education programs in the United States seek to promote a deeper understanding of the science of climate change, behavior change and stewardship, and support informed decision making by individuals, organizations, and institutions, all of which are summarized under the term 'climate literacy.' The ultimate goal of climate literacy is to enable actors to address climate change, both in terms of stabilizing and reducing emissions of greenhouse gases, and in terms of an increased capacity to prepare for the consequences and opportunities of climate change. However, the long-term nature of climate change and the required societal response involve changing students' ideas about controversial scientific issues, which presents unique challenges for educators (Lombardi & Sinatra, 2010; Sinatra & Mason, 2008). This session will explore how United States educational efforts focus on three distinct but related areas: the science of climate change, the human-climate interaction, and using climate education to promote informed decision making. Each of these approaches is represented in the Atlas of Science Literacy (American Association for the Advancement of Science, 2007) and in the conceptual framework for science education developed at the National Research Council (NRC) in 2012. Instruction to develop these fundamental thinking skills (e.g., critical evaluation and plausibility reappraisal) has been called for by the Next Generation Science Standards (NGSS) (Achieve, 2013), an innovative and research-based way to address climate change education within the decentralized U.S. education system. The promise of the NGSS is that students will have more time to build mastery of the subjects, but the form of that instructional practice has been shown to be critical. Research has shown that effective instructional activities that promote evaluation of evidence improve students' understanding and acceptance of the scientifically accepted model of human

  1. Plausible antioxidant biomechanics and anticonvulsant pharmacological activity of brain-targeted β-carotene nanoparticles

    Directory of Open Access Journals (Sweden)

    Yusuf M

    2012-08-01

    general tonic–clonic seizures reduced significantly to 2.90 ± 0.98 seconds by the use of BCNP and was further reduced on P-80-BCNP to 1.20 ± 0.20 seconds as compared to PTZ control and PTZ-placebo control (8.09 ± 0.26 seconds). General tonic–clonic seizure latency was increased significantly to 191.0 ± 9.80 seconds in BCNP and was further increased in P-80-BCNP to 231.0 ± 16.30 seconds, as compared to PTZ (120.10 ± 4.50 seconds) and placebo control (120.30 ± 7.4 seconds). The results of this study demonstrate a plausible novel anticonvulsant activity of β-carotene at a low dose of 2 mg/kg, with brain-targeted nanodelivery, thus increasing its bioavailability and stability. Keywords: anticonvulsant, blood–brain barrier (BBB), targeted brain delivery, polysorbate-80-coated β-carotene nanoparticles (P-80-BCNP), maximal electroshock seizure (MES), pentylenetetrazole (PTZ)

  2. Vulnerabilities to agricultural production shocks: An extreme, plausible scenario for assessment of risk for the insurance sector

    Directory of Open Access Journals (Sweden)

    Tobias Lunt

    2016-01-01

    Climate risks pose a threat to the function of the global food system and therefore also a hazard to the global financial sector, the stability of governments, and the food security and health of the world's population. This paper presents a method to assess plausible impacts of an agricultural production shock and its potential materiality for global insurers. A hypothetical, near-term, plausible, extreme scenario was developed based upon modules of historical agricultural production shocks, linked under a warm-phase El Niño-Southern Oscillation (ENSO) meteorological framework. The scenario included teleconnected floods and droughts in disparate agricultural production regions around the world, as well as plausible, extreme biotic shocks. In this scenario, global crop yield declines of 10% for maize, 11% for soy, 7% for wheat and 7% for rice result in quadrupled commodity prices and commodity stock fluctuations, civil unrest, significant negative humanitarian consequences and major financial losses worldwide. This work illustrates a need for the scientific community to partner across sectors and industries towards better-integrated global data, modeling and analytical capacities, to better respond to and prepare for concurrent agricultural failure. Governments, humanitarian organizations and the private sector collectively may recognize significant benefits from more systematic assessment of exposure to agricultural climate risk.

  3. Universality and string theory

    Science.gov (United States)

    Bachlechner, Thomas Christian

    The first run at the Large Hadron Collider has deeply challenged conventional notions of naturalness, and CMB polarization experiments are about to open a new window to early universe cosmology. As a compelling candidate for the ultraviolet completion of the standard model, string theory provides a prime opportunity to study both early universe cosmology and particle physics. However, relating low energy observations to ultraviolet physics requires knowledge of the metastable states of string theory through the study of vacua. While it is difficult to directly obtain infrared data from explicit string theory constructions, string theory imposes constraints on low energy physics. The study of ensembles of low energy theories consistent with ultraviolet constraints provides insight on generic features we might expect to occur in string compactifications. In this thesis we present a statistical treatment of vacuum stability and vacuum properties in the context of random supergravity theories motivated by string theory. Early universe cosmology provides another avenue to high energy physics. From the low energy perspective large field inflation is typically considered highly unnatural: the scale relevant for the diameter of flat regions in moduli space is sub-Planckian in regions of perturbative control. To approach this problem, we consider generic Calabi-Yau compactifications of string theory and find that super-Planckian diameters of axion fundamental domains in fact arise generically. We further demonstrate that such super-Planckian flat regions are plausibly consistent with the Weak Gravity Conjecture.

  4. Preliminary Study on Plausible Reasoning in Chemistry Teaching of Senior Middle School

    Institute of Scientific and Technical Information of China (English)

    杨健; 吴俊明; 骆红山

    2009-01-01

    Plausible reasoning is significant to science education. Discussions in the philosophy of science and logic, together with historical examples, show that scientific discovery is inseparable from plausible reasoning, so science education must pay attention to developing students' plausible reasoning ability. The paper also discusses the possibility, objects, and content of teaching plausible reasoning in senior middle school chemistry.

  5. Moral Contract Theory and Social Cognition : An Empirical Perspective

    NARCIS (Netherlands)

    Timmerman, Peter

    2014-01-01

    This interdisciplinary work draws on research from psychology and behavioral economics to evaluate the plausibility of moral contract theory. In a compelling manner, with implications for moral theory more broadly, the author’s novel approach resolves a number of key contingencies in contractarianism.

  7. The Conceptual Copy Theory for the Origin of Language

    NARCIS (Netherlands)

    Odijk, J.E.J.M.

    2013-01-01

    The CC-Theory is, if correct, an attractive theory: almost all of (3b) is explained from a very small evolutionary change. The character of the evolutionary change is biologically and evolutionarily plausible. Also, Chomsky needs a second evolutionary event to account for externalization.

  8. Plausible mechanisms for brain structural and size changes in human evolution.

    Science.gov (United States)

    Blazek, Vladimir; Brùzek, Jaroslav; Casanova, Manuel F

    2011-09-01

    Encephalization has many contexts and implications. On the one hand, it is concerned with the transformation of eating habits, social relationships and communication, cognitive skills and the mind; on the other hand, along with the increase in brain size, encephalization is connected with the creation of more complex brain structures, namely in the cerebral cortex. It is imperative to inquire into the mechanisms which are linked with brain growth and to find out which of these mechanisms allow and determine it. There exist a number of theories for understanding human brain evolution which originate from the neurological sciences: the concept of radial units, minicolumns, mirror neurons, and neurocognitive networks. Over the course of evolution, it is evident that a whole range of changes have taken place with regard to heredity, including new mutations of genes in the microcephalin complex, gene duplications, gene co-expression, and genomic imprinting. This complex study of the growth and reorganization of the brain and the functioning of hereditary factors and their external influences creates an opportunity to consider the implications of cultural evolution and cognitive faculties.

  9. Procedural Semantics as a Theory of Meaning.

    Science.gov (United States)

    1981-03-01

    ...of the system that uses it. I will begin with a brief outline of a theory of intelligence put forward by Daniel Dennett [1974], which I find highly satisfying and a useful precursor to a theory of the evolution of natural languages. Dennett presents a plausibly mechanistic account for an array of... (Several colleagues commented on the draft, including especially Eugene Charniak, Daniel Dennett, Gerald Gazdar, Steven Isard, David Israel, and Philip Johnson-Laird.)

  10. Protoflight photovoltaic power module system-level tests in the space power facility

    Science.gov (United States)

    Rivera, Juan C.; Kirch, Luke A.

    1989-01-01

    Work Package Four, which includes NASA-Lewis and Rocketdyne, has selected an approach for Space Station Freedom Photovoltaic (PV) Power Module flight certification that combines system-level qualification and acceptance testing in the thermal vacuum environment: the protoflight vehicle approach. This approach maximizes ground test verification to assure system-level performance and to minimize the risk of on-orbit failures. The preliminary plans for system-level thermal vacuum environmental testing of the protoflight PV Power Module in the NASA-Lewis Space Power Facility (SPF) are addressed. Details of the facility modifications to refurbish the SPF, after 13 years of downtime, are briefly discussed. The results of an evaluation of the effectiveness of system-level environmental testing in screening out incipient part and workmanship defects and unique failure modes are discussed. Preliminary test objectives, test hardware configurations, test support equipment, and operations are presented.

  11. A System Level Tool for Translating Software to Reconfigurable Hardware Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this research we will develop a system level tool to translate binary code of a general-purpose processor into Register Transfer Level VHDL code to be mapped onto...

  12. On the Plausible-but-Wrong Translation of Figures of Speech in Literary Works

    Institute of Scientific and Technical Information of China (English)

    廖巧云

    2011-01-01

    Figures of speech are a language variant and a phenomenon common to all languages. For various reasons, mistranslation of figures of speech is not uncommon in translation practice, and a large share of such mistranslations are plausible but wrong: they look completely correct on the surface and are formally equivalent across the two languages, which makes them more readily accepted by target-language readers, who are thus unknowingly led to accept erroneous information. Drawing on Nida's theory of functional equivalence, this paper analyzes cases of plausible-but-wrong translations of figures of speech in literary works, so as to provide a reference for translators in future work and to reduce such plausible mistranslations.

  13. Delinquency, Social Skills and the Structure of Peer Relations: Assessing Criminological Theories by Social Network Theory

    Science.gov (United States)

    Smangs, Mattias

    2010-01-01

    This article explores the plausibility of the conflicting theoretical assumptions underlying the main criminological perspectives on juvenile delinquents, their peer relations and social skills: the social ability model, represented by Sutherland's theory of differential associations, and the social disability model, represented by Hirschi's…

  14. Theories and mechanisms of aging.

    Science.gov (United States)

    Cefalu, Charles A

    2011-11-01

    This article discusses various theories of aging and their relative plausibility related to the human aging process. Structural and physiologic changes of aging are discussed in detail by organ system. Each of the organ systems is discussed when applicable to the various theories of aging. Normal versus abnormal aging is discussed in the context of specific aging processes, with atypical presentations of disease and general links to life expectancy. Life expectancy and lifespan are discussed in the context of advances in medical science and the potential ultimate link to human life span.

  15. A Note on Unified Statistics Including Fermi-Dirac, Bose-Einstein, and Tsallis Statistics, and Plausible Extension to Anisotropic Effect

    Directory of Open Access Journals (Sweden)

    Christianto V.

    2007-04-01

    In the light of some recent hypotheses suggesting a plausible unification of thermostatistics in which Fermi-Dirac, Bose-Einstein and Tsallis statistics become special subsets, we consider a further plausible extension to include non-integer Hausdorff dimension, which becomes a realization of the fractal entropy concept. In the subsequent section, we also discuss a plausible extension of this unified statistics to include anisotropic effects by using a quaternion oscillator, which may be observed in the context of the Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
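
    For orientation, the following standard relations (textbook facts, not the paper's specific construction) show why such a unification is natural. Fermi-Dirac and Bose-Einstein statistics differ only in a sign in the mean occupation number,

        \[
        \bar{n}(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/k_{B}T} + a},
        \qquad a = +1 \ \text{(Fermi--Dirac)}, \quad
        a = -1 \ \text{(Bose--Einstein)}, \quad
        a \to 0 \ \text{(Maxwell--Boltzmann)},
        \]

    while Tsallis statistics deforms the Boltzmann exponential into the $q$-exponential $e_{q}(x) = [1 + (1-q)x]^{1/(1-q)}$, which recovers $e^{x}$ as $q \to 1$; a unified statistics therefore has to interpolate over both $a$ and $q$.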

  16. Dedicated clock/timing-circuit theories of time perception and timed performance

    NARCIS (Netherlands)

    van Rijn, Hedderik; Gu, Bon-Mi; Meck, Warren H

    2014-01-01

    Scalar Timing Theory (an information-processing version of Scalar Expectancy Theory) and its evolution into the neurobiologically plausible Striatal Beat-Frequency (SBF) theory of interval timing are reviewed. These pacemaker/accumulator or oscillation/coincidence detection models are then integrated.

  18. Ionic liquid pretreatment of biomass for sugars production: Driving factors with a plausible mechanism for higher enzymatic digestibility.

    Science.gov (United States)

    Raj, Tirath; Gaur, Ruchi; Dixit, Pooja; Gupta, Ravi P; Kagdiyal, V; Kumar, Ravindra; Tuli, Deepak K

    2016-09-20

    In this study, five ionic liquids (ILs) were explored for biomass pretreatment for the production of fermentable sugar. We also investigated the driving factors responsible for the improved enzymatic digestibility of biomass treated with the various ILs, and postulate a plausible mechanism. Post-pretreatment, two main factors impacted enzymatic digestibility: (i) structural deformation (cellulose I to II) along with xylan/lignin removal, and (ii) properties of the ILs, wherein the Kamlet-Taft (K-T) parameters, viscosity and surface tension had a direct influence on pretreatment. A systematic investigation of these parameters and their impact on enzymatic digestibility is drawn. [C2mim][OAc], with a β-value of 1.32, resulted in a 97.7% glucose yield using 10 FPU/g of biomass. A closer insight into the cellulose structural transformation prompted a plausible mechanism explaining the better digestibility. The impact of these parameters on digestibility can pave the way to customizing the process to make biomass vulnerable to enzymatic attack.

  19. Antimicrobial drug use in Austrian pig farms: plausibility check of electronic on-farm records and estimation of consumption.

    Science.gov (United States)

    Trauffler, M; Griesbacher, A; Fuchs, K; Köfer, J

    2014-10-25

    Electronic drug application records from farmers on 75 conventional pig farms were revised and checked for their plausibility. The registered drug amounts were verified by comparing the farmers' records with the veterinarians' dispensary records. Antimicrobial consumption was evaluated from 2008 to 2011 and expressed in weight of active substance(s), number of used daily doses (nUDD), number of animal daily doses (nADD) and number of product-related daily doses (nPrDD). All results were referred to one year and animal bodyweight (kg biomass). The data plausibility check revealed that about 14 per cent of the drug amount entries in the farmers' records were unrealistic. The annual antimicrobial consumption was 33.9 mg/kg/year, 4.9 UDDkg/kg/year, 1.9 ADDkg/kg/year and 2.5 PrDDkg/kg/year (average). Most of the antimicrobials were applied orally (86 per cent) and at group level. The main therapy indications were metaphylactic/prophylactic measures (farrow-to-finish and fattening farms) or digestive tract diseases (breeding farms). The proportion of the 'highest priority critically important antimicrobials' was low (12 per cent). After determination of a threshold value, farms with a high antimicrobial use could be detected. Statistical tests showed that the veterinarian had an influence on the dosage, the therapy indication and the active substance. Orally administered antimicrobials were mostly underdosed, whereas parenterally administered antimicrobials were dosed correctly or overdosed.

  20. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  1. System-Level Modelling and Simulation of MEMS-Based Sensors

    DEFF Research Database (Denmark)

    Virk, Kashif M.; Madsen, Jan; Shafique, Mohammad

    2005-01-01

    The growing complexity of MEMS devices and their increased use in embedded systems (e.g., wireless integrated sensor networks) demands a disciplined approach to MEMS design as well as the development of techniques for system-level modeling of these devices so that a seamless integration with the existing embedded system design methodologies is possible. In this paper, we present a MEMS design methodology that uses a VHDL-AMS based system-level model of a MEMS device as a starting point and combines the top-down and bottom-up design approaches for design, verification, and optimization.

  2. System Level Power Optimization of Digital Audio Back End for Hearing Aids

    DEFF Research Database (Denmark)

    Pracny, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2017-01-01

    This work deals with power optimization of the audio processing back end for hearing aids - the interpolation filter (IF), the sigma-delta (SD) modulator and the Class D power amplifier (PA) - as a whole. Specifications are derived, and insight into the tradeoffs involved is used to optimize the interpolation filter and the SD modulator at the system level so that the switching frequency of the Class D PA - the main power consumer in the back end - is minimized. A figure-of-merit (FOM), which allows judging the power consumption of the digital part of the back end early in the design process, is used to track the hardware and power demands as the tradeoffs of the system-level parameters are investigated. The result is the digital part of the back end optimized with respect to power, providing audio performance comparable to the state of the art, together with a combination of system-level parameters leading to minimized power consumption.
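
    To make the order/OSR/switching-frequency trade-off concrete, here is a minimal Python sketch built on the textbook peak-SQNR formula for an ideal L-th order sigma-delta modulator; the paper's actual figure-of-merit and target specifications are not reproduced, so the 100 dB target below is an illustrative assumption:

        import math

        def peak_sqnr_db(order, osr, bits):
            """Textbook peak SQNR of an ideal sigma-delta modulator with
            noise-shaping order `order`, oversampling ratio `osr`, and a
            `bits`-bit internal quantizer."""
            return (6.02 * bits + 1.76
                    - 10.0 * math.log10(math.pi ** (2 * order) / (2 * order + 1))
                    + (20.0 * order + 10.0) * math.log10(osr))

        TARGET_DB = 100.0  # illustrative audio requirement, not from the paper

        # A lower OSR means a lower Class D switching frequency (the dominant
        # power consumer); a higher order buys back the lost noise shaping.
        for order in (2, 3, 4):
            for osr in (16, 32, 64, 128):
                sqnr = peak_sqnr_db(order, osr, bits=1)
                ok = "meets target" if sqnr >= TARGET_DB else ""
                print(f"L={order}  OSR={osr:4d}  SQNR={sqnr:6.1f} dB  {ok}")

    Sweeping the table in this way is the kind of early, system-level bookkeeping the abstract's figure-of-merit enables: one can pick the lowest OSR, and hence the lowest switching frequency, whose order still meets the audio specification.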

  3. Plausibility, necessity and identity: A logic of relative plausibility

    Institute of Scientific and Technical Information of China (English)

    李小五; 文学锋

    2007-01-01

    We construct a Hilbert-style system RPL to capture the notion of plausibility measure introduced by J. Halpern, and prove that RPL is sound and complete with respect to a neighborhood-style semantics. Using the language of RPL, we show that it can define the well-studied notions of necessity, conditionals and propositional identity.
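
    As background for readers new to this notion, and independent of the RPL paper itself, Halpern's plausibility measures are standardly defined as follows (a sketch of the textbook definition, not of RPL's axiomatization): a plausibility measure on a set of worlds $W$ is a map into a partially ordered domain $D$,

        \[
        \mathrm{Pl} : 2^{W} \to D, \qquad
        \mathrm{Pl}(\emptyset) = \bot_{D}, \qquad
        \mathrm{Pl}(W) = \top_{D}, \qquad
        U \subseteq V \implies \mathrm{Pl}(U) \le_{D} \mathrm{Pl}(V).
        \]

    Probability measures, possibility measures and ordinal ranking functions all arise as special cases by suitable choices of $D$ and $\le_{D}$, which is what makes it plausible that a single system such as RPL can define necessity, conditionals and propositional identity.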

  4. System-Level Optimization of a DAC for Hearing-Aid Audio Class D Output Stage

    DEFF Research Database (Denmark)

    Pracný, Peter; Jørgensen, Ivan Harald Holger; Bruun, Erik

    2013-01-01

    This paper deals with system-level optimization of a digital-to-analog converter (DAC) for a hearing-aid audio Class D output stage. We discuss the ΣΔ modulator system-level design parameters – the order, the oversampling ratio (OSR) and the number of bits in the quantizer. We show that combining a reduction of the OSR with an increase of the order results in considerable power savings while the audio quality is kept, as demonstrated by comparing two ΣΔ modulator designs. The proposed optimization has impact on the whole hearing-aid audio back-end system, including less hardware in the interpolation filter and half the switching rate in the digital-pulse-width-modulation (DPWM) block and Class D output stage.

  7. Sociological Theory or Psychological Types: A Case Study of Attitudes to-wards Relationships

    OpenAIRE

    Watt, Laura; Elliot, Mark

    2014-01-01

    Sociological theories can be viewed as models of (sub)-populations. In this paper we explore the possibility of representing social theories as attitudinal types rather than as descriptions of society at large. To test this idea we investigate the relevance of four different theories of couple relationships to the attitudes of 18- to 30-year-olds. Rather than testing these theories via aggregate social trends, we investigate the plausibility of treating the four social theories as attitudinal types.

  8. Climate Change Impacts on Agriculture and Food Security in 2050 under a Range of Plausible Socioeconomic and Emissions Scenarios

    Science.gov (United States)

    Wiebe, K.; Lotze-Campen, H.; Bodirsky, B.; Kavallari, A.; Mason-d'Croz, D.; van der Mensbrugghe, D.; Robinson, S.; Sands, R.; Tabeau, A.; Willenbockel, D.; Islam, S.; van Meijl, H.; Mueller, C.; Robertson, R.

    2014-12-01

    Previous studies have combined climate, crop and economic models to examine the impact of climate change on agricultural production and food security, but results have varied widely due to differences in models, scenarios and data. Recent work has examined (and narrowed) these differences through systematic model intercomparison using a high-emissions pathway to highlight the differences. New work extends that analysis to cover a range of plausible socioeconomic scenarios and emission pathways. Results from three general circulation models are combined with one crop model and five global economic models to examine the global and regional impacts of climate change on yields, area, production, prices and trade for coarse grains, rice, wheat, oilseeds and sugar to 2050. Results show that yield impacts vary with changes in population, income and technology as well as emissions, but are reduced in all cases by endogenous changes in prices and other variables.

  9. Gene-ontology enrichment analysis in two independent family-based samples highlights biologically plausible processes for autism spectrum disorders.

    LENUS (Irish Health Repository)

    Anney, Richard J L

    2012-02-01

    Recent genome-wide association studies (GWAS) have implicated a range of genes from discrete biological pathways in the aetiology of autism. However, despite the strong influence of genetic factors, association studies have yet to identify statistically robust, replicated major-effect genes or SNPs. We apply the principle of the SNP ratio test methodology described by O'Dushlaine et al to over 2100 families from the Autism Genome Project (AGP). Using a two-stage design, we examine association enrichment in 5955 unique gene-ontology classifications across four groupings based on two phenotypic and two ancestral classifications. Based on estimates from simulation, we identify an excess of association enrichment across all analyses. We observe enrichment in association for sets of genes involved in diverse biological processes, including pyruvate metabolism, transcription factor activation, cell-signalling and cell-cycle regulation. Both the genes and the processes that show enrichment have previously been examined in autistic disorders, lending biological plausibility to these findings.

  10. The role of adverse childhood experiences in cardiovascular disease risk: a review with emphasis on plausible mechanisms.

    Science.gov (United States)

    Su, Shaoyong; Jimenez, Marcia P; Roberts, Cole T F; Loucks, Eric B

    2015-10-01

    Childhood adversity, characterized by abuse, neglect, and household dysfunction, is a problem that exerts a significant impact on individuals, families, and society. Growing evidence suggests that adverse childhood experiences (ACEs) are associated with health decline in adulthood, including cardiovascular disease (CVD). In the current review, we first provide an overview of the association between ACEs and CVD risk, with updates on the latest epidemiological evidence. Second, we briefly review plausible pathways by which ACEs could influence CVD risk, including traditional risk factors and novel mechanisms. Finally, we highlight the potential implications of ACEs in clinical and public health. Information gleaned from this review should help physicians and researchers in better understanding potential long-term consequences of ACEs and considering adapting current strategies in treatment or intervention for patients with ACEs.

  11. A hitherto undescribed case of cerebellar ataxia as the sole presentation of thyrotoxicosis in a young man: a plausible association.

    Science.gov (United States)

    Elhadd, Tarik Abdelkareim; Linton, Kathryn; McCoy, Caoihme; Saha, Subrata; Holden, Roger

    2014-01-01

    A 16-year-old male presented to hospital following an episode of unusual behavior on the football pitch, where he was witnessed as grossly ataxic by his teammates. The assessment demonstrated marked cerebellar signs on examination but no other neurological deficit. The investigation showed evidence of biochemical thyrotoxicosis, with free T4 at 37 pmol/L (normal reference range: 11-27) and a suppressed thyrotropin (TSH). The association was considered plausible because alternative etiologies were excluded, and the normalization of thyroid function with treatment was coupled with complete resolution of the neurological syndrome. Cerebellar syndromes may well be one of the presenting features of thyrotoxicosis, and this should be on the list of its differential diagnoses.

  12. Influence of the Aqueous Environment on Protein Structure—A Plausible Hypothesis Concerning the Mechanism of Amyloidogenesis

    Directory of Open Access Journals (Sweden)

    Irena Roterman

    2016-09-01

    The aqueous environment is a pervasive factor which, in many ways, determines the protein folding process and consequently the activity of proteins. Proteins are unable to perform their function unless immersed in water (membrane proteins are excluded from this statement). Tertiary conformational stabilization is dependent on the presence of internal force fields (nonbonding interactions between atoms), as well as an external force field generated by water. The hitherto unknown structuralization of water as the aqueous environment may be elucidated by analyzing its effects on protein structure and function. Our study is based on the fuzzy oil drop model, a mechanism which describes the formation of a hydrophobic core and attempts to explain the emergence of amyloid-like fibrils. A set of proteins which vary with respect to their fuzzy oil drop status (including titin, transthyretin and a prion protein) have been selected for in-depth analysis to suggest a plausible mechanism of amyloidogenesis.

  13. Charting plausible futures for diabetes prevalence in the United States: a role for system dynamics simulation modeling.

    Science.gov (United States)

    Milstein, Bobby; Jones, Andrew; Homer, Jack B; Murphy, Dara; Essien, Joyce; Seville, Don

    2007-07-01

    Healthy People 2010 (HP 2010) objectives call for a 38% reduction in the prevalence of diagnosed diabetes mellitus, type 1 and type 2, by the year 2010. The process for setting this objective, however, did not focus on the achievability or the compatibility of this objective with other national public health objectives. We used a dynamic simulation model to explore plausible trajectories for diabetes prevalence in the wake of rising levels of obesity in the U.S. population. The model helps to interpret historic trends in diabetes prevalence in the United States and to anticipate plausible future trends through 2010. We conducted simulation experiments using a computer model of diabetes population dynamics to 1) track the rates at which people develop diabetes, are diagnosed with the disease, and die, and 2) assess the effects of various preventive-care interventions. System dynamics modeling methodology based on data from multiple sources guided the analyses. With the number of new cases of diabetes being much greater than the number of deaths among those with the disease, the prevalence of diagnosed diabetes in the United States is likely to continue to increase. Even a 29% reduction in the number of new cases (the HP 2010 objective) would only slow the growth, not reverse it. Increased diabetes detection rates or decreased mortality rates--also HP 2010 objectives--would further increase diagnosed prevalence. The HP 2010 objective for reducing diabetes prevalence is unattainable given the historical processes that are affecting incidence, diagnosis, and mortality, and even a zero-growth future is unlikely. System dynamics modeling shows why interventions to protect against chronic diseases have only gradual effects on their diagnosed prevalence.
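
    A toy stock-and-flow version of this argument can be written in a few lines; every parameter below is an illustrative assumption rather than a value from the paper's calibrated model:

        # Diagnosed-diabetes prevalence as a stock with an inflow (newly
        # diagnosed cases) and an outflow (deaths among the diagnosed).

        def simulate(incidence_cut, years=10):
            prevalence = 12.0    # million diagnosed at start (illustrative)
            new_cases = 1.4      # million newly diagnosed per year
            death_rate = 0.028   # annual mortality among the diagnosed
            for _ in range(years):
                prevalence += new_cases * (1.0 - incidence_cut)
                prevalence -= death_rate * prevalence
            return prevalence

        print(f"no intervention:  {simulate(0.00):5.1f} M after 10 years")
        print(f"29% fewer cases:  {simulate(0.29):5.1f} M after 10 years")
        # Both runs end well above the starting 12.0 M: as long as the
        # inflow exceeds the outflow, cutting incidence only slows growth.

    Because any realistic inflow still dwarfs the outflow, the stock keeps rising under both policies, which is the system dynamics insight behind the paper's conclusion that the HP 2010 prevalence objective is unattainable.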

  14. Exploration of a digital audio processing platform using a compositional system level performance estimation framework

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents the application of a compositional simulation-based system-level performance estimation framework on a non-trivial industrial case study. The case study is provided by the Danish company Bang & Olufsen ICEpower a/s and focuses on the exploration of a digital mobile audio processing platform.

  15. System-level modeling and simulation of the cell culture microfluidic biochip ProCell

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2010-01-01

    Flow-based microfluidic biochips manipulate liquids in pre-defined micro-channels using valves and pumps. We present an approach to the system-level modeling and simulation of a cell culture microfluidic biochip called ProCell, Programmable Cell Culture Chip. ProCell contains a cell culture chamber, which is envisioned to run 256 simultaneous experiments.

  16. On the Implementation of AM/AM AM/PM Behavioral Models in System Level Simulation

    NARCIS (Netherlands)

    Shen, Y.; Tauritz, J.L.

    2003-01-01

    The use of nonlinear device behavioral models offers an economical way of simulating the performance of complex communication systems. A concrete method for implementing the AM/AM AM/PM behavioral model in system-level simulation using ADS is developed. This method seamlessly transfers the data from the device level into the system-level simulation.

  17. Systems-level modeling of mycobacterial metabolism for the identification of new (multi-)drug targets

    NARCIS (Netherlands)

    Rienksma, R.A.; Suarez Diez, M.; Spina, L.; Schaap, P.J.; Martins dos Santos, V.A.P.

    2014-01-01

    Systems-level metabolic network reconstructions and the derived constraint-based (CB) mathematical models are efficient tools to explore bacterial metabolism. Approximately one-fourth of the Mycobacterium tuberculosis (Mtb) genome contains genes that encode proteins directly involved in its metabolism.

  18. The Artemis workbench for system-level performance evaluation of embedded systems

    NARCIS (Netherlands)

    A.D. Pimentel

    2008-01-01

    In this paper, we present an overview of the Artemis workbench, which provides modelling and simulation methods and tools for efficient performance evaluation and exploration of heterogeneous embedded multimedia systems. More specifically, we describe the Artemis system-level modelling methodology.

  19. Empirical LTE Smartphone Power Model with DRX Operation for System Level Simulations

    DEFF Research Database (Denmark)

    Lauridsen, Mads; Noël, Laurent; Mogensen, Preben

    2013-01-01

    An LTE smartphone power model is presented to enable academia and industry to evaluate users’ battery life at the system level. The model is based on empirical measurements on a smartphone using a second-generation LTE chipset, and the model includes functions of receive and transmit data rates.
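
    The general shape of such a model can be sketched as a base power plus rate-dependent terms, averaged over the DRX duty cycle; the functional form and every coefficient below are placeholders, not the paper's fitted measurements:

        # Hypothetical LTE smartphone power model (all values illustrative).

        def active_power_mw(rx_mbps, tx_mbps):
            p_base = 1200.0   # active RF + baseband floor [mW]
            m_rx = 4.0        # receive-rate slope [mW per Mbit/s]
            m_tx = 9.0        # transmit-rate slope [mW per Mbit/s]
            return p_base + m_rx * rx_mbps + m_tx * tx_mbps

        def drx_average_power_mw(rx_mbps, tx_mbps, active_fraction):
            """DRX-averaged power: full power while the radio is active,
            a light-sleep floor for the rest of the cycle."""
            p_sleep = 25.0    # DRX light-sleep power [mW], placeholder
            return (active_fraction * active_power_mw(rx_mbps, tx_mbps)
                    + (1.0 - active_fraction) * p_sleep)

        # Bursty traffic with the radio active 10% of the time:
        print(f"{drx_average_power_mw(20.0, 2.0, 0.1):.0f} mW average")

    Plugging such a model into a system-level simulator lets battery life be estimated per user from the scheduler's rate and DRX decisions, which is the use case the abstract describes.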

  20. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model systems-on-chip (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures, in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of networks-on-chip.

  1. Vocation in theology-based nursing theories.

    Science.gov (United States)

    Lundmark, Mikael

    2007-11-01

    By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.

  2. Expanding the Role of Connectionism in SLA Theory

    Science.gov (United States)

    Language Learning, 2013

    2013-01-01

    In this article, I explore how connectionism might expand its role in second language acquisition (SLA) theory by showing how some symbolic models of bilingual and second language lexical memory can be reduced to a biologically realistic (i.e., neurally plausible) connectionist model. This integration or hybridization of the two models follows the…

  4. Rationales for Indirect Speech: The Theory of the Strategic Speaker

    Science.gov (United States)

    Lee, James J.; Pinker, Steven

    2010-01-01

    Speakers often do not state requests directly but employ innuendos such as "Would you like to see my etchings?" Though such indirectness seems puzzlingly inefficient, it can be explained by a theory of the "strategic speaker", who seeks plausible deniability when he or she is uncertain of whether the hearer is cooperative or…

  5. Axions in String Theory

    Energy Technology Data Exchange (ETDEWEB)

    Svrcek, Peter; /Stanford U., Phys. Dept. /SLAC; Witten, Edward; /Princeton, Inst. Advanced Study

    2006-06-09

    In the context of string theory, axions appear to provide the most plausible solution of the strong CP problem. However, as has been known for a long time, in many string-based models, the axion coupling parameter Fa is several orders of magnitude higher than the standard cosmological bounds. We re-examine this problem in a variety of models, showing that Fa is close to the GUT scale or above in many models that have GUT-like phenomenology, as well as some that do not. On the other hand, in some models with Standard Model gauge fields supported on vanishing cycles, it is possible for Fa to be well below the GUT scale.

  6. System-level analysis of single event upset susceptibility in RRAM architectures

    Science.gov (United States)

    Liu, Rui; Barnaby, Hugh J.; Yu, Shimeng

    2016-12-01

    In this work, the single event upset susceptibility of a resistive random access memory (RRAM) system with 1-transistor-1-resistor (1T1R) and crossbar architectures to heavy ion strikes is investigated from the circuit-level to the system-level. From a circuit-level perspective, the 1T1R is only susceptible to single-bit-upset (SBU) due to the isolation of cells, while in the crossbar, multiple-bit-upsets may occur because ion-induced voltage spikes generated on drivers may propagate along rows or columns. Three factors are considered to evaluate system-level susceptibility: the upset rate, the sensitive area, and the vulnerable time window. Our analysis indicates that the crossbar architecture has a smaller maximum bit-error-rate per day as compared to the 1T1R architecture for a given sub-array size, I/O width and susceptible time window.
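
    As a back-of-the-envelope illustration of how the three factors combine (the rate expression and every number below are assumptions for illustration, not values or formulas from the paper):

        # Toy single-event-upset estimate: upsets/day taken to scale with
        # particle flux, per-bit sensitive area, and the fraction of time a
        # bit is vulnerable. All numbers are placeholders.

        def upsets_per_day(flux_cm2_day, area_cm2, vulnerable_fraction, bits):
            return flux_cm2_day * area_cm2 * vulnerable_fraction * bits

        FLUX = 2.0e3        # particles / cm^2 / day (placeholder environment)
        BITS = 1 << 20      # 1 Mbit sub-array

        # 1T1R: isolated cells, single-bit upsets only.
        rate_1t1r = upsets_per_day(FLUX, 1.0e-9, 0.5, BITS)

        # Crossbar: assume a smaller per-cell sensitive area, but let a
        # driver strike flip several bits at once (average multiplicity 3).
        rate_xbar = upsets_per_day(FLUX, 4.0e-10, 0.5, BITS) * 3.0

        print(f"1T1R    : {rate_1t1r:.3e} bit upsets/day")
        print(f"crossbar: {rate_xbar:.3e} bit upsets/day")

    Which architecture fares worse then depends entirely on how area, vulnerable window, and upset multiplicity trade off, which is why the abstract states its conclusion per sub-array size and I/O width.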

  7. System-Level Modeling and Synthesis of Flow-Based Microfluidic Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2011-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers and are able to integrate the necessary functions for biochemical analysis on-chip. There are several types of microfluidic biochips, each having its advantages and limitations. In this paper we are interested in flow-based biochips, in which the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units, such as micropumps, switches, mixers, and multiplexers, can be built. Although researchers have proposed significant work on the system-level synthesis of droplet-based biochips, which manipulate droplets on a two-dimensional array of electrodes, no research on system-level synthesis of flow-based biochips has been reported so far. The focus has been on application modeling and component-level simulation. Therefore, for the first time to our knowledge, we propose a system-level modeling and synthesis approach for flow-based biochips.

  8. A system level boundary scan controller board for VME applications [to CERN experiments]

    CERN Document Server

    Cardoso, N; Da Silva, J C

    2000-01-01

    This work is the result of a collaboration between INESC and LIP in the CMS experiment being conducted at CERN. The collaboration addresses the application of boundary scan test at system level, namely the development of a VME boundary scan controller (BSC) board prototype and the corresponding software. This prototype uses the MTM bus existing in the VME64* backplane to apply the 1149.1 test vectors to a system composed of nineteen boards, called here units under test (UUTs). A top-down approach is used to describe our work. The paper begins with some insights about the experiment being conducted at CERN, proceeds with system-level considerations concerning our work, and gives some details about the BSC board. The results obtained so far and the proposed work are reviewed at the end of this contribution. (11 refs).

  9. Equation-Free Multiscale Computation enabling microscopic simulators to perform system-level tasks

    CERN Document Server

    Kevrekidis, Yu G; Hyman, J M; Kevrekidis, P G; Runborg, O; Theodoropoulos, C; Kevrekidis, Ioannis G.; Hyman, James M.; Kevrekidis, Panagiotis G.; Runborg, Olof; Theodoropoulos, Constantinos

    2002-01-01

    We present and discuss a framework for computer-aided multiscale analysis, which enables models at a "fine" (microscopic/stochastic) level of description to perform modeling tasks at a "coarse" (macroscopic, systems) level. These macroscopic modeling tasks, yielding information over long time and large space scales, are accomplished through appropriately initialized calls to the microscopic simulator for only short times and small spatial domains. Our equation-free (EF) approach, when successful, can bypass the derivation of the macroscopic evolution equations when these equations conceptually exist but are not available in closed form. We discuss how the mathematics-assisted development of a computational superstructure may enable alternative descriptions of the problem physics (e.g. Lattice Boltzmann (LB), kinetic Monte Carlo (KMC) or Molecular Dynamics (MD) microscopic simulators, executed over relatively short time and space scales) to perform systems-level tasks (integration over relatively large time and space scales).
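
    One core ingredient, coarse projective integration, fits in a few lines of Python; the "microscopic simulator" here is just a small-step integrator of a toy relaxation law standing in for an LB/KMC/MD code, and all step sizes are illustrative:

        # Coarse projective integration: run the fine-scale simulator for a
        # short burst, estimate the coarse time derivative from its output,
        # then leap forward without any closed-form macroscopic equation.

        def run_micro(x, n_steps, dt):
            """Black-box fine-scale simulator (toy dynamics dx/dt = x - x^3)."""
            for _ in range(n_steps):
                x += dt * (x - x ** 3)
            return x

        def projective_step(x, dt_micro=1e-3, burst=50, dt_leap=0.2):
            x_a = run_micro(x, burst, dt_micro)   # let fast transients heal
            x_b = run_micro(x_a, 1, dt_micro)     # one extra micro step
            slope = (x_b - x_a) / dt_micro        # estimated coarse derivative
            return x_b + dt_leap * slope          # projective leap in time

        x, t = 0.1, 0.0
        for _ in range(25):
            x = projective_step(x)
            t += 51 * 1e-3 + 0.2
        print(f"t = {t:.2f}, x = {x:.4f} (relaxes toward the stable state 1)")

    The leap covers a macroscopic time span while the expensive simulator only runs for short bursts, which is exactly the "short times, small domains" economy the framework advertises.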

  10. System-Level Reform in Healthcare Delivery for Patients and Populations Living with Chronic Disease.

    Science.gov (United States)

    Wedge, Richard; Currie, Douglas W

    2016-01-01

    Healthcare in Canada has generally not kept pace with the evolving needs of patients since the creation of medicare in the 1960s. Budgets for hospitals, physicians and prescription drugs make up the bulk of spending in health, despite the need for better prevention and management of chronic disease, the needed expansion of home-based care services and the call for reform of front-line primary care. Over the past decade, a number of Canadian health authorities have adopted the US-based Institute for Healthcare Improvement Triple Aim philosophy (better population health, better patient experience and better per capita cost of care) in order to build system-level change. The Atlantic Healthcare Collaboration was one attempt to initiate system-level reform in healthcare delivery for patients living with chronic disease.

  11. System-level perturbations of cell metabolism using CRISPR/Cas9

    DEFF Research Database (Denmark)

    Jakociunas, Tadas; Jensen, Michael Krogh; Keasling, Jay

    2017-01-01

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies more advanced and cost-effective. For metabolic engineering purposes, CRISPR-based tools have been applied to perturb cell metabolism at a scale and precision not previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering.

  12. Unravelling evolutionary strategies of yeast for improving galactose utilization through integrated systems level analysis

    DEFF Research Database (Denmark)

    Hong, Kuk-Ki; Vongsangnak, Wanwipa; Vemuri, Goutham N

    2011-01-01

    Identification of the underlying molecular mechanisms for a derived phenotype by adaptive evolution is difficult. Here, we performed a systems-level inquiry into the metabolic changes occurring in the yeast Saccharomyces cerevisiae as a result of its adaptive evolution to increase its specific galactose uptake rate. The results demonstrate the relevance of such analysis for strain design in bioengineering of improved strains and show that, through systems biology, it is possible to identify mutations in evolved strains that can serve as unforeseen metabolic engineering targets for improving microbial strains for the production of biofuels and chemicals.

  13. Using High Performance Computing to Realize a System-Level RBDO for Military Ground Vehicles

    Science.gov (United States)

    2008-07-14

    Briefing: Using High Performance Computing to Realize a System-Level RBDO for Military Ground Vehicles; David A. Lamb, Ph.D., Computational Reliability and... The number of fictitious load cases is the number of design variables times the number of static load cases (6 x 24 = 144 for the Stryker A-arm). [Slide residue, RBDO flowchart: pre-processor, mesh-based geometry morpher, finite element analysis, durability sensitivity, RBDO/PBDO, FE re-analysis for design sensitivity analysis (DSA), sensitivity of SIC and fatigue.]

  14. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    Science.gov (United States)

    Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
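
    The idea can be miniaturized to a dozen lines with a polynomial surrogate standing in for the paper's neural networks; the subsystem function, sample count, and polynomial degree are illustrative assumptions:

        import numpy as np

        # Expensive subsystem analysis (stand-in for a discipline code).
        def subsystem_mass(x):
            return (x - 1.7) ** 2 + 0.1 * np.sin(8.0 * x)

        # Fit a cheap parametric surrogate from a handful of evaluations.
        xs = np.linspace(0.0, 3.0, 12)                  # design of experiments
        surrogate = np.poly1d(np.polyfit(xs, subsystem_mass(xs), 4))

        # The system-level optimizer searches only the surrogate.
        grid = np.linspace(0.0, 3.0, 3001)
        x_star = grid[np.argmin(surrogate(grid))]

        print(f"surrogate optimum x = {x_star:.3f}")
        print(f"true mass there     = {subsystem_mass(x_star):.4f}")
        # The expensive code is called only for the 12 training points and a
        # final check, which is what lets the optimization run unobtrusively
        # in the background of a concurrent-engineering session.

    The same division of labor, cheap parametric subsystem approximations feeding a fast system-level optimizer, is what makes background optimization compatible with the pace of a Team X style design session.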

  15. Integrating Omics Technologies to Study Pulmonary Physiology and Pathology at the Systems Level

    Directory of Open Access Journals (Sweden)

    Ravi Ramesh Pathak

    2014-04-01

    Assimilation and integration of “omics” technologies, including genomics, epigenomics, proteomics, and metabolomics, has readily altered the landscape of medical research in the last decade. The vast and complex nature of omics data can only be interpreted by linking molecular information at the organismic level, forming the foundation of systems biology. Research in pulmonary biology/medicine has necessitated integration of omics, network, systems and computational biology data to differentially diagnose, interpret, and prognosticate pulmonary diseases, facilitating improvement in therapy and treatment modalities. This review describes how to leverage this emerging technology in understanding pulmonary diseases at the systems level, called a “systomic” approach. Considering the operational wholeness of cellular and organ systems, the diseased genome, proteome, and metabolome need to be conceptualized at the systems level to understand disease pathogenesis and progression. Currently available omics technology and resources require a certain degree of training and proficiency in addition to dedicated hardware and applications, making them relatively less user-friendly for pulmonary biologists and clinicians. Herein, we discuss the various strategies, computational tools and approaches required to study pulmonary diseases at the systems level for biomedical scientists and clinical researchers.

  16. Studying the Relationship between System-Level and Component-Level Resilience

    Directory of Open Access Journals (Sweden)

    Michael D. Mitchell

    2015-01-01

    The capacity to maintain stability in a system relies on the components which make up the system. This study explores the relationship between component-level resilience and system-level resilience with the aim of identifying policies which foster system-level resilience in situations where existing incentives might undermine it. We use an abstract model of interacting specialized resource users and producers which can be parameterized to represent specific real systems. We want to understand how features, such as stockpiles, influence system versus component resilience. Systems are subject to perturbations of varying intensity and frequency. For our study, we create a simplified economy in which an inventory carrying cost is imposed to incentivize smaller inventories and examine how components with varying inventory levels compete in environments subject to periods of resource scarcity. The results show that policies requiring larger inventories foster higher component-level resilience but do not foster higher system-level resilience. Inventory carrying costs reduce production efficiency as inventory sizes increase. JIT inventory strategies improve production efficiency but do not afford any buffer against future uncertainty of resource availability.
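
    An abstract toy version of the experiment makes the tension visible; the randomness, carrying cost, and disruption probability below are all illustrative assumptions, not the study's parameterization:

        import random

        # Toy producer: one unit of input is needed each period; supply fails
        # with probability p_disrupt. A stockpile buffers failures but incurs
        # a per-unit, per-period carrying cost.

        def run_producer(buffer_size, periods=1000, p_disrupt=0.08,
                         carrying_cost=0.05, seed=1):
            rng = random.Random(seed)
            inventory, served, cost = buffer_size, 0, 0.0
            for _ in range(periods):
                supplied = 0 if rng.random() < p_disrupt else 1
                available = supplied + inventory
                if available > 0:              # meet this period's demand
                    served += 1
                    available -= 1
                inventory = min(available, buffer_size)
                cost += carrying_cost * inventory
            return served / periods, served - cost

        print("buffer  resilience  net output")
        for size in (0, 1, 2, 5, 10):
            resilience, net = run_producer(size)
            print(f"{size:6d}  {resilience:10.3f}  {net:11.1f}")

    Larger buffers push component-level resilience toward 1.0 while the carrying cost steadily erodes net output, echoing in miniature the study's finding that policies hardening individual components need not translate into system-level gains.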

  17. Evaluation of architectures for an ASP MPEG-4 decoder using a system-level design methodology

    Science.gov (United States)

    Garcia, Luz; Reyes, Victor; Barreto, Dacil; Marrero, Gustavo; Bautista, Tomas; Nunez, Antonio

    2005-06-01

    Trends in multimedia consumer electronics, digital video and audio, aim to reach users through low-cost mobile devices connected to data broadcasting networks with limited bandwidth. An emergent broadcasting network is the digital audio broadcasting network (DAB), which provides CD-quality audio transmission together with robustness and efficiency techniques to allow good-quality reception in motion conditions. This paper focuses on the system-level evaluation of different architectural options to allow low-bandwidth digital video reception over DAB, based on video compression techniques. Profiling and design space exploration techniques are applied to the ASP MPEG-4 decoder in order to find the best HW/SW partition given the application and platform constraints. An innovative SystemC-based system-level design tool, called CASSE, is being used for modelling, exploration and evaluation of different ASP MPEG-4 decoder HW/SW partitions. System-level trade-offs and quantitative data derived from this analysis are also presented in this work.

  18. A theory evaluation of an induction programme

    Directory of Open Access Journals (Sweden)

    Kenrick Hendricks

    2012-07-01

    Orientation: An induction programme is commonly used to help new employees understand their job within the organisation. Research purpose: The main aim of this study was to examine whether or not the programme theory of an induction programme was plausible and would lead to the intended outcomes as described by the programme manager. Motivation for the study: Induction training is one of the most common training programmes in an organisation. However, there is little research to evaluate whether or not the activities of an induction programme will lead to the intended outcomes of such a programme. Research design, approach and method: This theory evaluation used a descriptive design. One hundred and thirteen employees of a media company completed a ten-item, five-point Likert scale which measured their perceptions of the programme’s outcome, identification with the organisation and intentions to stay with the organisation. Main findings: From this theory evaluation it was apparent that an induction programme based on an implausible programme theory could be problematic. An implausible programme theory affects the design of the programme activities, and unsuitable activities may not deliver the desired outcomes. Practical/managerial implications: The intention of the evaluation is to guide human resource managers through a process of replacing an implausible programme theory with one that is plausible, and which ensures better alignment of programme activities and outcomes. Contribution/value-add: The evaluators showed how a plausible programme theory could improve programme design. This redesigned induction programme may lead to benefits, such as staff retention and company identification, rather than the vague assumption that it has been conforming to a legal obligation.

  19. The induction ability of qualitative plausibility measures in default reasoning

    Institute of Scientific and Technical Information of China (English)

    霍旭辉; 寇辉

    2011-01-01

    In this paper, the authors investigate the induction ability of qualitative plausibility measures in default reasoning (system P), and obtain conditions under which general qualitative plausibility measures and possibility measures have the same induction ability in default reasoning.

  20. Plausible Drug Targets in the Streptococcus mutans Quorum Sensing Pathways to Combat Dental Biofilms and Associated Risks.

    Science.gov (United States)

    Kaur, Gurmeet; Rajesh, Shrinidhi; Princy, S Adline

    2015-12-01

    Streptococcus mutans, a Gram-positive facultative anaerobe, is one among the approximately seven hundred bacterial species that exist in the human buccal cavity and cause dental caries. Quorum sensing (QS) is a cell-density-dependent communication process that responds to inter/intra-species signals and elicits responses that shift the bacteria's behavior to more aggressive forms. In accordance with this phenomenon, S. mutans harbors a competence-stimulating peptide (CSP)-mediated quorum sensing system, ComCDE (a two-component regulatory system), to regulate several virulence-associated traits, including the formation of the oral biofilm (dental plaque), genetic competence and acidogenicity. The QS-mediated adherence of S. mutans on the tooth surface (dental plaque) imparts antibiotic resistance to the bacterium and can progress to a chronic state known as periodontitis. In recent years, the oral streptococcus S. mutans has been recognized not only for its cariogenic potential but also for worsening infective endocarditis, due to its inherent ability to colonize and form biofilm on heart valves. The review appreciates the increasing complexity of the CSP-mediated quorum-sensing pathway, with special emphasis on identifying plausible drug targets within the system for the development of anti-quorum drugs to control biofilm formation and associated risks.

  1. Three-layered metallodielectric nanoshells: plausible meta-atoms for metamaterials with isotropic negative refractive index at visible wavelengths.

    Science.gov (United States)

    Wu, DaJian; Jiang, ShuMin; Cheng, Ying; Liu, XiaoJun

    2013-01-14

    A three-layered Ag-low-permittivity (LP)-high-permittivity (HP) nanoshell is proposed as a plausible meta-atom for building three-dimensional isotropic negative-refractive-index metamaterials (NIMs). The overlap between the electric and magnetic responses of the Ag-LP-HP nanoshell can be realized by designing the geometry of the particle, which can lead to negative electric and magnetic polarizabilities. A negative refractive index is then found in a random arrangement of Ag-LP-HP nanoshells. In particular, modulation of the middle LP layer can move the negative-refractive-index range into the visible region. Because the responses arise from each meta-atom, the metamaterial is intrinsically isotropic and polarization independent. It is further found that, with increasing LP layer thickness, the negative-refractive-index range of the random arrangement shows a large blue-shift and becomes narrow. With decreasing filling fraction, the negative-refractive-index range shows a blue-shift and becomes narrow while the maximum of the negative refractive index decreases.

  2. Non-canonical 3'-5' extension of RNA with prebiotically plausible ribonucleoside 2',3'-cyclic phosphates.

    Science.gov (United States)

    Mutschler, Hannes; Holliger, Philipp

    2014-04-09

    Ribonucleoside 2',3'-cyclic phosphates (N>p's) are generated by multiple prebiotically plausible processes and are credible building blocks for the assembly of early RNA oligomers. While N>p's can be polymerized into short RNAs by non-enzymatic processes with variable efficiency and regioselectivity, no enzymatic route for RNA synthesis had been described. Here we report such a non-canonical 3'-5' nucleotidyl transferase activity. We engineered a variant of the hairpin ribozyme to catalyze addition of all four N>p's (2',3'-cyclic A-, G-, U-, and CMP) to the 5'-hydroxyl termini of RNA strands with 5' nucleotide addition enhanced in all cases by eutectic ice phase formation at -7 °C. We also observed 5' addition of 2',3'-cyclic phosphate-activated β-nicotinamide adenine dinucleotide (NAD>p) and ACA>p RNA trinucleotide, and multiple additions of GUCCA>p RNA pentamers. Our results establish a new mode of RNA 3'-5' extension with implications for RNA oligomer synthesis from prebiotic nucleotide pools.

  3. Plausibility of stromal initiation of epithelial cancers without a mutation in the epithelium: a computer simulation of morphostats

    Directory of Open Access Journals (Sweden)

    Cappuccio Antonio

    2009-03-01

    Full Text Available Abstract. Background: There is experimental evidence from animal models favoring the notion that the disruption of interactions between stroma and epithelium plays an important role in the initiation of carcinogenesis. These disrupted interactions are hypothesized to be mediated by molecules, termed morphostats, which diffuse through the tissue to determine cell phenotype and maintain tissue architecture. Methods: We developed a computer simulation based on simple properties of cell renewal and morphostats. Results: Under the computer simulation, the disruption of the morphostat gradient in the stroma generated epithelial precursors of cancer without any mutation in the epithelium. Conclusion: The model is consistent with the possibility that the accumulation of genetic and epigenetic changes found in tumors could arise after the formation of a founder population of aberrant cells, defined as cells that are created by low or insufficient morphostat levels and that no longer respond to morphostat concentrations. Because the model is biologically plausible, we hope that these results will stimulate further experiments.
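
    A minimal sketch of the underlying idea, under stated assumptions: a morphostat secreted by the stroma diffuses and decays along a one-dimensional tissue, and cells falling below a threshold lose phenotype control. This is an illustration of the mechanism, not the authors' simulation; all rates, sizes, and the threshold are invented.

    ```python
    import numpy as np

    # 1-D tissue: stromal morphostat source at the left end, epithelium beyond.
    n_cells, steps = 100, 5000
    D, decay, source = 0.4, 0.001, 1.0  # diffusion, degradation, secretion (invented)
    m = np.zeros(n_cells)               # morphostat concentration along the tissue

    for _ in range(steps):
        lap = np.roll(m, 1) + np.roll(m, -1) - 2 * m
        lap[0] = m[1] - m[0]            # no-flux boundary conditions
        lap[-1] = m[-2] - m[-1]
        m += D * lap - decay * m
        m[0] += source                  # intact stromal source
        # Disrupting the stroma = removing the line above: the gradient
        # collapses and the whole epithelium falls below threshold
        # without any mutation in the epithelium itself.

    threshold = 0.05 * m.max()          # cells need enough morphostat to stay normal
    print(f"cells below threshold (candidate founder population): "
          f"{np.sum(m < threshold)}")
    ```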

  4. Bilinguals' Plausibility Judgments for Phrases with a Literal vs. Non-literal Meaning: The Influence of Language Brokering Experience

    Directory of Open Access Journals (Sweden)

    Belem G. López

    2017-09-01

    Full Text Available Previous work has shown that prior experience in language brokering (informal translation) may facilitate the processing of meaning within and across language boundaries. The present investigation examined the influence of brokering on bilinguals' processing of two-word collocations with either a literal or a figurative meaning in each language. Proficient Spanish-English bilinguals classified as brokers or non-brokers were asked to judge if adjective+noun phrases presented in each language made sense or not. Phrases with a literal meaning (e.g., stinging insect) were interspersed with phrases with a figurative meaning (e.g., stinging insult) and non-sensical phrases (e.g., stinging picnic). It was hypothesized that plausibility judgments would be facilitated for literal relative to figurative meanings in each language but that experience in language brokering would be associated with a more equivalent pattern of responding across languages. These predictions were confirmed. The findings add to the body of empirical work on individual differences in language processing in bilinguals associated with prior language brokering experience.

  5. Synchronous volcanic eruptions and abrupt climate change ∼17.7 ka plausibly linked by stratospheric ozone depletion.

    Science.gov (United States)

    McConnell, Joseph R; Burke, Andrea; Dunbar, Nelia W; Köhler, Peter; Thomas, Jennie L; Arienzo, Monica M; Chellman, Nathan J; Maselli, Olivia J; Sigl, Michael; Adkins, Jess F; Baggenstos, Daniel; Burkhart, John F; Brook, Edward J; Buizert, Christo; Cole-Dai, Jihong; Fudge, T J; Knorr, Gregor; Graf, Hans-F; Grieman, Mackenzie M; Iverson, Nels; McGwire, Kenneth C; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H; Saltzman, Eric S; Severinghaus, Jeffrey P; Steffensen, Jørgen Peder; Taylor, Kendrick C; Winckler, Gisela

    2017-09-05

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ∼17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ∼192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics-similar to those associated with modern stratospheric ozone depletion over Antarctica-plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ∼17.7 ka.

  6. X-ray investigation of the diffuse emission around plausible gamma-ray emitting pulsar wind nebulae in Kookaburra region

    CERN Document Server

    Kishishita, Tetsuichi; Uchiyama, Yasunobu; Tanaka, Yasuyuki; Takahashi, Tadayuki

    2012-01-01

    We report on the results from {\\it Suzaku} X-ray observations of the radio complex region called Kookaburra, which includes the two adjacent TeV $\\gamma$-ray sources HESS J1418-609 and HESS J1420-607. The {\\it Suzaku} observation revealed diffuse X-ray emission around the middle-aged pulsar PSR J1420-6048 and the plausible PWN Rabbit, with elongated sizes of $\\sigma_{\\rm X}=1^{\\prime}.66$ and $\\sigma_{\\rm X}=1^{\\prime}.49$, respectively. The peaks of the diffuse X-ray emission are located within the $\\gamma$-ray excess maps obtained by H.E.S.S., and the offsets from the $\\gamma$-ray peaks are $2^{\\prime}.8$ for PSR J1420-6048 and $4^{\\prime}.5$ for Rabbit. The X-ray spectra of the two sources were well reproduced by absorbed power-law models with $\\Gamma=1.7-2.3$. The spectral shapes tend to become softer with distance from the X-ray peaks. Assuming a one-zone electron emission model as a first-order approximation, the ambient magnetic field strengths of HESS J1420-607 and HESS J1418-609 can be estimate...

  7. Bethe-Heitler cascades as a plausible origin of hard spectra in distant TeV blazars

    CERN Document Server

    Zheng, Y G; Kang, S J

    2016-01-01

    Context. Very high-energy (VHE) $\\gamma$-ray measurements of distant TeV blazars can be nicely explained by TeV spectra induced by ultra-high-energy cosmic rays. Aims. We develop a model for a plausible origin of hard spectra in distant TeV blazars. Methods. In the model, the TeV emission in distant TeV blazars is dominated by two mixed components. The first is the internal component, with photon energies around 1 TeV, produced by inverse Compton scattering of the relativistic electrons on the synchrotron photons (SSC) with a correction for extragalactic background light absorption; the other is the external component, with photon energies above 1 TeV, produced by the cascade emission from high-energy protons propagating through intergalactic space. Results. Assuming suitable model parameters, we apply the model to the observed spectra of the distant TeV blazar 1ES 0229+200. Our results show that 1) the observed spectrum properties of 1ES 0229+200, especially the TeV $\\gamma$-ray tail of the observed spect...

  8. Simultaneous observations of a pair of kilohertz QPOs and a plausible 1860 Hz QPO from an accreting neutron star system

    CERN Document Server

    Bhattacharyya, Sudip

    2009-01-01

    We report an indication (3.22 sigma) of ~ 1860 Hz quasi-periodic oscillations from the neutron star low-mass X-ray binary 4U 1636-536. If confirmed, this will be by far the highest-frequency feature observed from an accreting neutron star system, and hence could be very useful for understanding such systems. This plausible timing feature was observed simultaneously with lower (~ 585 Hz) and upper (~ 904 Hz) kilohertz quasi-periodic oscillations. The two kilohertz quasi-periodic oscillation frequencies had a ratio of ~ 1.5, and the frequency of the alleged ~ 1860 Hz feature was close to the triple of the lower and the double of the upper frequency. This can be useful to constrain the models of all three features. In particular, the ~ 1860 Hz feature could be (1) from a new and heretofore unknown class of quasi-periodic oscillations, or (2) the first observed overtone of the lower or upper kilohertz quasi-periodic oscillations. Finally we note that, although the relatively low significance of the ~ 1860 Hz feature argues for caut...
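
    The claimed harmonic relations are easy to check numerically with the frequencies quoted in the abstract; a quick sketch:

    ```python
    lower, upper, candidate = 585.0, 904.0, 1860.0  # Hz, from the abstract

    print(f"upper/lower ratio: {upper / lower:.2f}")  # ~1.5
    print(f"3 x lower = {3 * lower:.0f} Hz, 2 x upper = {2 * upper:.0f} Hz")
    print(f"candidate feature: {candidate:.0f} Hz")   # close to both, within errors
    ```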

  9. Synchronous volcanic eruptions and abrupt climate change ˜17.7 ka plausibly linked by stratospheric ozone depletion

    Science.gov (United States)

    McConnell, Joseph R.; Burke, Andrea; Dunbar, Nelia W.; Köhler, Peter; Thomas, Jennie L.; Arienzo, Monica M.; Chellman, Nathan J.; Maselli, Olivia J.; Sigl, Michael; Adkins, Jess F.; Baggenstos, Daniel; Burkhart, John F.; Brook, Edward J.; Buizert, Christo; Cole-Dai, Jihong; Fudge, T. J.; Knorr, Gregor; Graf, Hans-F.; Grieman, Mackenzie M.; Iverson, Nels; McGwire, Kenneth C.; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H.; Saltzman, Eric S.; Severinghaus, Jeffrey P.; Steffensen, Jørgen Peder; Taylor, Kendrick C.; Winckler, Gisela

    2017-09-01

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ˜17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ˜192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics—similar to those associated with modern stratospheric ozone depletion over Antarctica—plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ˜17.7 ka.

  10. Removal of hazardous organics from water using metal-organic frameworks (MOFs): plausible mechanisms for selective adsorptions.

    Science.gov (United States)

    Hasan, Zubair; Jhung, Sung Hwa

    2015-01-01

    Provision of clean water is one of the most important issues worldwide because of continuing economic development and the steady increase in the global population. However, clean water resources are decreasing every day because of contamination with various pollutants, including organic chemicals. Pharmaceutical and personal care products, herbicides/pesticides, dyes, phenolics, and aromatics (from sources such as spilled oil) are typical organics that should be removed from water. Because of their huge porosities, designable pore structures, and facile modification, metal-organic frameworks (MOFs) are used in various adsorption, separation, storage, and delivery applications. In this review, the adsorptive purification of contaminated water with MOFs is discussed in order to understand possible applications of MOFs in clean water provision. More importantly, plausible adsorption or interaction mechanisms and selective adsorptions are summarized. Mechanisms of interaction such as electrostatic interaction, acid-base interaction, hydrogen bonding, π-π stacking/interaction, and hydrophobic interaction are discussed for the selective adsorption of organics over MOFs. These adsorption mechanisms will be very helpful not only for understanding adsorption but also for applications of adsorption in selective removal, storage, delivery, and so on.

  11. System of Systems (SoS) M&S VV&A Decomposition: Integrated System Level VV&A (ISLA)

    Science.gov (United States)

    2009-01-01

    Fragments recovered from the record describe the post-flight reconstruction (PFR) process: manually recreate and run a past flight test scenario in a test venue, performing system-level comparative analysis against the real-world data; perform root cause analysis of the system-level anomalies found in the PFR; generate, test, and implement M&S improvements to address those anomalies. The remainder of the record is slide residue from the "M&S System-Level PFR Preparation Process" briefing (flight test target specifications and timeline).

  12. Small Spacecraft System-Level Design and Optimization for Interplanetary Trajectories

    Science.gov (United States)

    Spangelo, Sara; Dalle, Derek; Longmier, Ben

    2014-01-01

    The feasibility of an interplanetary mission for a CubeSat, a type of miniaturized spacecraft, using an emerging technology, the CubeSat Ambipolar Thruster (CAT), is investigated. CAT is a large delta-V propulsion system built around a high-density plasma source that has been miniaturized for small spacecraft applications. An initial feasibility assessment demonstrating escape from Low Earth Orbit (LEO) and Earth-escape trajectories with a 3U CubeSat and this thruster technology was presented in previous work. We examine a mission architecture with a trajectory that begins in Earth orbits such as LEO and Geostationary Earth Orbit (GEO), escapes Earth orbit, and travels to Mars, Jupiter, or Saturn. The goal was to minimize travel time to reach the destinations while considering trade-offs among spacecraft dry mass, fuel mass, and solar power array size. Sensitivities to spacecraft dry mass and available power are considered. CubeSats are extremely size-, mass-, and power-constrained, and their subsystems are tightly coupled, limiting their performance potential. System-level modeling, simulation, and optimization approaches are necessary to find feasible and optimal operational solutions and to ensure system-level interactions are modeled. Thus, propulsion, power/energy, attitude, and orbit transfer models are integrated to enable systems-level analysis and trades. The CAT technology broadens the possible missions achievable with small satellites. In particular, this technology enables more sophisticated maneuvers by small spacecraft, such as polar orbit insertion from an equatorial orbit, LEO-to-GEO transfers, Earth-escape trajectories, and transfers to other interplanetary bodies. This work lays the groundwork for upcoming CubeSat launch opportunities and supports future development of interplanetary and constellation CubeSat and small satellite mission concepts.
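
    The core trade between dry mass, fuel mass, and achievable delta-V in such a study follows from the Tsiolkovsky rocket equation. The sketch below illustrates it under stated assumptions: the dry mass, specific impulse, and delta-V values are hypothetical placeholders, not the CAT's published figures.

    ```python
    import math

    def propellant_mass(dry_mass_kg: float, delta_v_ms: float, isp_s: float) -> float:
        """Tsiolkovsky rocket equation: propellant needed for a given delta-v."""
        g0 = 9.80665  # standard gravity, m/s^2
        return dry_mass_kg * (math.exp(delta_v_ms / (isp_s * g0)) - 1.0)

    # Hypothetical numbers for illustration only (not published CAT specs):
    dry_mass = 3.0  # kg, roughly a 3U CubeSat
    isp = 1000.0    # s, plausible for a miniature plasma thruster
    for dv in (1000.0, 3000.0, 8000.0):  # m/s: orbit raising up to escape-class
        mp = propellant_mass(dry_mass, dv, isp)
        print(f"delta-v {dv:5.0f} m/s -> propellant {mp:.3f} kg")
    ```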

  13. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…
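
    The chained-equations imputation the study examines can be sketched with scikit-learn's IterativeImputer as a stand-in for the operational machinery; the questionnaire data and the 30% planned missingness below are synthetic, and the setup is only an illustration of the method, not the PISA pipeline.

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)

    # Fake questionnaire data: 500 respondents, 4 correlated context variables.
    latent = rng.normal(size=(500, 1))
    X = latent + rng.normal(scale=0.5, size=(500, 4))

    # Matrix sampling: each respondent answers only a subset of the items.
    mask = rng.random(X.shape) < 0.3  # ~30% planned missingness
    X_obs = X.copy()
    X_obs[mask] = np.nan

    # Chained equations: each variable is regressed on the others in turn.
    imputer = IterativeImputer(max_iter=10, sample_posterior=True, random_state=0)
    X_imp = imputer.fit_transform(X_obs)

    rmse = np.sqrt(np.mean((X_imp[mask] - X[mask]) ** 2))
    print(f"RMSE on masked entries: {rmse:.3f}")
    ```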

  14. World Literacy Prospects at the Turn of the Century: Is the Objective of Literacy for All by the Year 2000 Statistically Plausible?

    Science.gov (United States)

    Carceles, Gabriel

    1990-01-01

    Describes status and challenge of worldwide illiteracy. Discusses statistical plausibility of universal literacy by 2000. Predicts literacy universalization will take from 14 to 21 years, depending on region, if 1980s trends continue. Implies literacy work requires action strategies commensurate with problem, including national programs and mass…

  15. Study on the system-level test method of digital metering in smart substation

    Science.gov (United States)

    Zhang, Xiang; Yang, Min; Hu, Juan; Li, Fuchao; Luo, Ruixi; Li, Jinsong; Ai, Bing

    2017-03-01

    The existing test methods for the digital metering system in a smart substation test and evaluate the performance of a single device. These methods can effectively guarantee the accuracy and reliability of the measurement results of a digital metering device in a single run, but they do not reflect the performance when the devices operate together as a complete system. This paper introduces the shortcomings of the existing test methods, proposes a system-level test method for digital metering in smart substations, and demonstrates the feasibility of the method through an actual test.

  16. Abstract Radio Resource Management Framework for System Level Simulations in LTE-A Systems

    DEFF Research Database (Denmark)

    Fotiadis, Panagiotis; Viering, Ingo; Zanier, Paolo;

    2014-01-01

    This paper provides a simple mathematical model of different packet scheduling policies in Long Term Evolution-Advanced (LTE-A) systems, by investigating the performance of Proportional Fair (PF) and the generalized cross-Component Carrier scheduler from a theoretical perspective. For that purpose, an abstract Radio Resource Management (RRM) framework has been developed and tested for different ratios of users with Carrier Aggregation (CA) capabilities. The conducted system level simulations confirm that the proposed model can satisfactorily capture the main properties of the aforementioned scheduling policies...
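
    The Proportional Fair policy mentioned here picks, in each transmission interval, the user with the highest ratio of instantaneous achievable rate to exponentially averaged served throughput. A minimal sketch, with invented fading statistics and averaging window:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_users, n_tti = 8, 2000
    avg_thr = np.full(n_users, 1e-3)  # EWMA of served throughput per user
    served = np.zeros(n_users)
    beta = 1.0 / 100                  # EWMA forgetting factor (~100-TTI window)

    for _ in range(n_tti):
        # Instantaneous achievable rates (arbitrary units, toy fading model).
        inst_rate = rng.exponential(scale=1.0, size=n_users)
        # PF metric: instantaneous rate over long-term average throughput.
        user = np.argmax(inst_rate / avg_thr)
        served[user] += inst_rate[user]
        delivered = np.zeros(n_users)
        delivered[user] = inst_rate[user]
        avg_thr = (1 - beta) * avg_thr + beta * delivered

    print("per-user share of served traffic:", np.round(served / served.sum(), 3))
    ```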

  17. Out-of-order parallel discrete event simulation for electronic system-level design

    CERN Document Server

    Chen, Weiwei

    2014-01-01

    This book offers readers a set of new approaches, tools, and techniques for facing the challenges of parallelization in the design of embedded systems. It provides an advanced parallel simulation infrastructure for efficient and effective system-level model validation and development, so as to build better products in less time. Since parallel discrete event simulation (PDES) has the potential to exploit the underlying parallel computational capability in today's multi-core simulation hosts, the author begins by reviewing the parallelization of discrete event simulation, identifyin...
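
    For readers unfamiliar with the kernel being parallelized, a sequential discrete event simulator is essentially a timestamp-ordered event loop. The toy sketch below (with invented producer/consumer modules) shows the sequential loop that out-of-order PDES techniques aim to execute concurrently:

    ```python
    import heapq

    class Simulator:
        """Minimal sequential discrete-event kernel: the loop PDES parallelizes."""
        def __init__(self):
            self.now = 0.0
            self._queue = []  # heap of (timestamp, seq, callback)
            self._seq = 0     # tie-breaker keeps event ordering deterministic

        def schedule(self, delay, callback):
            heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
            self._seq += 1

        def run(self, until=float("inf")):
            while self._queue and self._queue[0][0] <= until:
                self.now, _, callback = heapq.heappop(self._queue)
                callback(self)

    # Two toy "modules" exchanging events, as in a system-level SoC model.
    def producer(sim):
        print(f"t={sim.now:4.1f}  producer fires")
        sim.schedule(1.5, consumer)

    def consumer(sim):
        print(f"t={sim.now:4.1f}  consumer fires")
        if sim.now < 6.0:
            sim.schedule(2.0, producer)

    sim = Simulator()
    sim.schedule(0.0, producer)
    sim.run(until=10.0)
    ```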

  18. System Level Modelling of RF IC in SystemC-WMS

    Directory of Open Access Journals (Sweden)

    Massimo Conti

    2008-06-01

    Full Text Available This paper proposes a methodology for modelling and simulation of RF systems in SystemC-WMS. Analog RF modules have been described at system level only by using their specifications. A complete Bluetooth transceiver, consisting of digital and analog blocks, has been modelled and simulated using the proposed design methodology. The developed transceiver modules have been connected to the higher levels of the Bluetooth stack described in SystemC, allowing the analysis of the performance of the Bluetooth protocol at all the different layers of the protocol stack.

  19. System Level Design of a Continuous-Time Delta-Sigma Modulator for Portable Ultrasound Scanners

    DEFF Research Database (Denmark)

    Llimos Muntal, Pere; Færch, Kjartan; Jørgensen, Ivan Harald Holger;

    2015-01-01

    In this paper the system level design of a continuous-time ∆Σ modulator for portable ultrasound scanners is presented. The overall required signal-to-noise ratio (SNR) is derived to be 42 dB and the sampling frequency used is 320 MHz for an oversampling ratio of 16. In order to match these requirements, a fourth order, 1-bit modulator with optimal zero placing is used. An analysis shows that the thermal noise from the resistors and operational transconductance amplifier is not a limiting factor due to the low required SNR, leading to an inherently very low-power implementation. Furthermore, based on high-level VerilogA simulations, the performance of the ∆Σ modulator versus various block performance parameters is presented as trade-off curves. Based on these results, the block specifications are derived.

  20. System-level view of geospace dynamics: Challenges for high-latitude ground-based observations

    Science.gov (United States)

    Donovan, E.

    2014-12-01

    Increasingly, research programs including GEM, CEDAR, GEMSIS, GO Canada, and others are focusing on how geospace works as a system. Coupling sits at the heart of system-level dynamics. In all cases, coupling is accomplished via fundamental processes such as reconnection and plasma waves, and can be between regions, energy ranges, species, scales, and energy reservoirs. Three views of geospace are required to attack system-level questions. First, we must observe the fundamental processes that accomplish the coupling. This "observatory view" requires in situ measurements by satellite-borne instruments or remote sensing from powerful, well-instrumented ground-based observatories organized around, for example, Incoherent Scatter Radars. Second, we need to see how this coupling is controlled and what it accomplishes. This demands quantitative observations of the system elements that are being coupled. This "multi-scale view" is accomplished by networks of ground-based instruments, and by global imaging from space. Third, if we take geospace as a whole, the system is too complicated, so at the top level we need time series of simple quantities such as indices that capture important aspects of the system-level dynamics. This requires a "key parameter view" that is typically provided through indices such as AE and Dst. With the launch of MMS, and ongoing missions such as THEMIS, Cluster, Swarm, RBSP, and ePOP, we are entering a once-in-a-lifetime epoch with a remarkable fleet of satellites probing processes at key regions throughout geospace, so the observatory view is secure. With a few exceptions, our key parameter view provides what we need. The multi-scale view, however, is compromised by space/time scales that are important but under-sampled, combined extent of coverage and resolution that falls short of what we need, and inadequate conjugate observations. In this talk, I present an overview of what we need for taking system-level research to its next level, and how...

  1. System-Level Design of an Integrated Receiver Front End for a Wireless Ultrasound Probe

    DEFF Research Database (Denmark)

    di Ianni, Tommaso; Hemmsen, Martin Christian; Llimos Muntal, Pere;

    2016-01-01

    In this paper, a system-level design is presented for an integrated receive circuit for a wireless ultrasound probe, which includes analog front ends and beamformation modules. This paper focuses on the investigation of the effects of architectural design choices on the image quality. The point... The designs that minimally satisfy the specifications are based on an 8-b 30-MSPS Nyquist converter and a single-bit third-order 240-MSPS modulator, with an SNR for the LNA in both cases equal to 64 dB. The mean lateral FWHM and CR are 2.4% and 7.1% lower for the ∆Σ-based architecture compared with the Nyquist-rate one...

  2. System-level perturbations of cell metabolism using CRISPR/Cas9

    Energy Technology Data Exchange (ETDEWEB)

    Jakočiūnas, Tadas [Technical Univ. of Denmark, Lyngby (Denmark); Jensen, Michael K. [Technical Univ. of Denmark, Lyngby (Denmark); Keasling, Jay D. [Technical Univ. of Denmark, Lyngby (Denmark); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States)

    2017-03-30

    CRISPR/Cas9 (clustered regularly interspaced palindromic repeats and the associated protein Cas9) techniques have made genome engineering and transcriptional reprogramming studies much more advanced and cost-effective. For metabolic engineering purposes, the CRISPR-based tools have been applied to single and multiplex pathway modifications and transcriptional regulations. The effectiveness of these tools allows researchers to implement genome-wide perturbations, test model-guided genome editing strategies, and perform transcriptional reprogramming perturbations in a more advanced manner than previously possible. In this mini-review we highlight recent studies adopting CRISPR/Cas9 for systems-level perturbations and model-guided metabolic engineering.

  3. System Level Modelling of RF IC in SystemC-WMS

    Directory of Open Access Journals (Sweden)

    Orcioni Simone

    2008-01-01

    Full Text Available Abstract This paper proposes a methodology for modelling and simulation of RF systems in SystemC-WMS. Analog RF modules have been described at system level only by using their specifications. A complete Bluetooth transceiver, consisting of digital and analog blocks, has been modelled and simulated using the proposed design methodology. The developed transceiver modules have been connected to the higher levels of the Bluetooth stack described in SystemC, allowing the analysis of the performance of the Bluetooth protocol at all the different layers of the protocol stack.

  4. Waltz's Theory of Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2009-01-01

    Kenneth N. Waltz's 1979 book, Theory of International Politics, is the most influential in the history of the discipline. It worked its effects to a large extent through raising the bar for what counted as theoretical work, in effect reshaping not only realism but rivals like liberalism and reflectivism. Yet, ironically, there has been little attention to Waltz's very explicit and original arguments about the nature of theory. This article explores and explicates Waltz's theory of theory. Central attention is paid to his definition of theory as 'a picture, mentally formed' and to the radical anti-empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory...

  5. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    Science.gov (United States)

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  6. Testing the physiological plausibility of conflicting psychological models of response inhibition: A forward inference fMRI study.

    Science.gov (United States)

    Criaud, Marion; Longcamp, Marieke; Anton, Jean-Luc; Nazarian, Bruno; Roth, Muriel; Sescousse, Guillaume; Strafella, Antonio P; Ballanger, Bénédicte; Boulinguez, Philippe

    2017-08-30

    The neural mechanisms underlying response inhibition and related disorders are unclear and controversial for several reasons. First, it is a major challenge to assess the psychological bases of behaviour, and ultimately brain-behaviour relationships, of a function which is precisely intended to suppress overt measurable behaviours. Second, response inhibition is difficult to disentangle from other parallel processes involved in more general aspects of cognitive control. Consequently, different psychological and anatomo-functional models coexist, which often appear in conflict with each other even though they are not necessarily mutually exclusive. The standard model of response inhibition in go/no-go tasks assumes that inhibitory processes are reactively and selectively triggered by the stimulus that participants must refrain from reacting to. Recent alternative models suggest that action restraint could instead rely on reactive but non-selective mechanisms (all automatic responses are automatically inhibited in uncertain contexts) or on proactive and non-selective mechanisms (a gating function by which reaction to any stimulus is prevented in anticipation of stimulation when the situation is unpredictable). Here, we assessed the physiological plausibility of these different models by testing their respective predictions regarding event-related BOLD modulations (forward inference using fMRI). We set up a single fMRI design which allowed for us to record simultaneously the different possible forms of inhibition while limiting confounds between response inhibition and parallel cognitive processes. We found BOLD dynamics consistent with non-selective models. These results provide new theoretical and methodological lines of inquiry for the study of basic functions involved in behavioural control and related disorders. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Combination of monoclonal antibodies and DPP-IV inhibitors in the treatment of type 1 diabetes: a plausible treatment modality?

    Science.gov (United States)

    Dubala, Anil; Gupta, Ankur; Samanta, Malay K

    2014-07-01

    Regulatory T cells (Tregs) are crucial for the maintenance of immunological tolerance. Type 1 diabetes (T1D) occurs when the immune-regulatory mechanism fails. In fact, T1D is reversed by islet transplantation but is associated with hostile effects of persistent immune suppression. T1D is believed to be dependent on the activation of type-1 helper T (Th1) cells. Immune tolerance is liable for the activation of the Th1 cells. The important role of Th1 cells in pathology of T1D entails the depletion of CD4(+) T cells, which initiated the use of monoclonal antibodies (mAbs) against CD4(+) T cells to interfere with induction of T1D. Prevention of autoimmunity is not only a step forward for the treatment of T1D, but could also restore the β-cell mass. Glucagon-like peptide (GLP)-1 stimulates β-cell proliferation and also has anti-apoptotic effects on them. However, the potential use of GLP-1 as a possible method to restore pancreatic β-cells is limited due to rapid degradation by dipeptidyl peptidase (DPP)-IV. We hypothesize that treatment with combination of CD4 mAbs and DPP-IV inhibitors could prevent/reverse T1D. CD4 mAbs have the ability to induce immune tolerance, thereby arresting further progression of T1D; DPP-IV inhibitors have the capability to regenerate the β-cell mass. Consequently, the combination of CD4 mAbs and DPP-IV inhibitor could avoid or at least minimize the constraints of intensive subcutaneous insulin therapy. We presume that if this hypothesis proves correct, it may become one of the plausible therapeutic options for T1D.

  8. In Silico Structure Prediction of Human Fatty Acid Synthase-Dehydratase: A Plausible Model for Understanding Active Site Interactions.

    Science.gov (United States)

    John, Arun; Umashankar, Vetrivel; Samdani, A; Sangeetha, Manoharan; Krishnakumar, Subramanian; Deepa, Perinkulam Ravi

    2016-01-01

    Fatty acid synthase (FASN, UniProt ID: P49327) is a multienzyme dimer complex that plays a critical role in lipogenesis. Consequently, this lipogenic enzyme has gained tremendous biomedical importance. The role of FASN and its inhibition is being extensively researched in several clinical conditions, such as cancers, obesity, and diabetes. X-ray crystallographic structures of some of its domains, such as β-ketoacyl synthase, acetyl transacylase, malonyl transacylase, enoyl reductase, β-ketoacyl reductase, and thioesterase (TE), are already reported. Here, we have attempted an in silico elucidation of the uncrystallized dehydratase (DH) catalytic domain of human FASN. This theoretical model of the DH domain was predicted using comparative modeling methods. Different stand-alone tools and servers were used to validate and check the reliability of the predicted models, which suggested it to be a highly plausible model. The stereochemical analysis showed 92.0% of residues in the favorable region of the Ramachandran plot. The initial physiological substrate β-hydroxybutyryl group was docked into the active site of the DH domain using Glide. The molecular dynamics simulations carried out for 20 ns in apo and holo states indicated the stability and accuracy of the predicted structure in solvated condition. The predicted model provided useful biochemical insights into the substrate-active site binding mechanisms. This model was then used for identifying potential FASN inhibitors using high-throughput virtual screening of the National Cancer Institute database of chemical ligands. The inhibitory efficacy of the top-hit ligands was validated by performing molecular dynamics simulation for 20 ns, wherein the ligand NSC71039 exhibited good enzyme inhibition characteristics and dose-dependent anticancer cytotoxicity in retinoblastoma cancer cells in vitro.

  9. Assessing the Sensitivity of a Reservoir Management System Under Plausible Assumptions About Future Climate Over Seasons to Decades

    Science.gov (United States)

    Ward, M. N.; Brown, C. M.; Baroang, K. M.; Kaheil, Y. H.

    2011-12-01

    We illustrate an analysis procedure that explores the robustness and overall productivity of a reservoir management system under plausible assumptions about climate fluctuation and change. Results are presented based on a stylized version of a multi-use reservoir management model adapted from Angat Dam, Philippines. It represents a modest-sized seasonal storage reservoir in a climate with a pronounced dry season. The reservoir management model focuses on October-March, during which climatological inflow declines due to the arrival of the dry season, and reservoir management becomes critical and challenging. Inflow is assumed to be impacted by climate fluctuations representing interannual variation (white noise), decadal to multidecadal variation (MDV, here represented by a stochastic autoregressive process), and global change (GC), here represented by a systematic linear trend in seasonal inflow totals over the simulation period of 2008-2047. Reservoir reliability, and the risk of extreme persistent water shortfall, is assessed under different combinations and magnitudes of GC and MDV. We include an illustration of adaptive management, using seasonal forecasts and updated climate normals. A set of seasonal forecast and observed inflow values are generated for 2008-2047 by randomly rearranging the forecast-observed pairs for 1968-2007. Then, trends are imposed on the observed series, with differing assumptions about the extent to which the seasonal forecasts can be expected to track the trend. We consider the framework presented here well-suited to providing insights about managing climate risks in reservoir operations, providing guidance on the expected benefits and risks of different strategies and climate scenarios.
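
    The three stochastic ingredients named here (white-noise interannual variation, an autoregressive MDV component, and a linear GC trend) compose additively into a synthetic inflow series. A minimal sketch, with all magnitudes invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    years = np.arange(2008, 2048)
    n = years.size

    # Interannual variability: white noise around a seasonal-total mean.
    mean_inflow, iav_sd = 100.0, 15.0  # arbitrary volume units
    white = rng.normal(0.0, iav_sd, n)

    # Decadal-to-multidecadal variability: AR(1) process (illustrative phi).
    phi, mdv_sd = 0.9, 5.0
    mdv = np.zeros(n)
    for t in range(1, n):
        mdv[t] = phi * mdv[t - 1] + rng.normal(0.0, mdv_sd)

    # Global change: linear trend over the 2008-2047 simulation period.
    trend = np.linspace(0.0, -20.0, n)  # e.g., a 20-unit decline by 2047

    inflow = mean_inflow + white + mdv + trend

    # Crude reliability metric: fraction of years meeting a demand threshold.
    demand = 80.0
    print(f"reliability (inflow >= demand): {np.mean(inflow >= demand):.2f}")
    ```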

  10. Systems-level characterization of the kernel mechanism of the cyanobacterial circadian oscillator.

    Science.gov (United States)

    Ma, Lan; Ranganathan, Rama

    2014-03-01

    The circadian clock is an essential molecular regulatory mechanism that coordinates daily biological processes. Toward understanding the design principles of the circadian mechanism in cyanobacteria, the only prokaryotes reported to possess circadian rhythmicity, mathematical models have been used as important tools to help elucidate the complicated biochemical processes. In this study, we focus on elucidating the underlying systems properties that drive the oscillation of the cyanobacterial clockwork. We apply combined methods of time-scale separation, phase space analysis, bifurcation analysis, and sensitivity analysis to a model of the in vitro cyanobacterial circadian clock recently proposed by us. The original model is reduced to a three-dimensional slow subsystem by time-scale separation. Phase space analysis of the reduced subsystem shows that the null-surface of the Serine-phosphorylated state (S-state) of KaiC is a bistable surface, and the characteristics of the phase portrait indicate that the kernel mechanism of the clockwork behaves as a relaxation oscillator induced by interlinked positive and negative feedback loops. Phase space analysis together with perturbation analysis supports our previous viewpoint that the S-state of KaiC is plausibly a key component of the protein regulatory network of the cyanobacterial circadian clock.
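
    The relaxation-oscillator behavior identified here can be illustrated with the textbook van der Pol system, whose fast/slow structure is exactly what time-scale separation exposes. This generic sketch is not the authors' KaiC model, and the parameter values are arbitrary.

    ```python
    import numpy as np

    # Van der Pol oscillator in Lienard form: large mu yields the fast/slow
    # (relaxation) dynamics characteristic of interlinked feedback loops.
    mu, dt, t_end = 8.0, 1e-3, 60.0
    steps = int(t_end / dt)
    x, y = 2.0, 0.0
    trace = np.empty(steps)

    for i in range(steps):
        dx = mu * (x - x**3 / 3.0 - y)  # fast variable
        dy = x / mu                     # slow variable (the feedback loop)
        x += dt * dx
        y += dt * dy
        trace[i] = x

    # Count upward zero crossings to estimate the oscillation period.
    crossings = np.where(np.diff(np.sign(trace)) > 0)[0]
    if len(crossings) > 1:
        period = np.mean(np.diff(crossings)) * dt
        print(f"estimated period: {period:.2f} time units")
    ```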

  11. ∑∆ Modulator System-Level Considerations for Hearing-Aid Audio Class-D Output Stage Application

    DEFF Research Database (Denmark)

    Pracný, Peter; Bruun, Erik

    2012-01-01

    This paper deals with a system-level design of a digital sigma-delta (∑∆) modulator for hearing-aid audio Class D output stage application. The aim of this paper is to provide a thorough discussion on various possibilities and tradeoffs of ∑∆ modulator system-level design parameter combinations...

  12. A System-level Infrastructure for Multi-dimensional MP-SoC Design Space Co-exploration

    NARCIS (Netherlands)

    Jia, Z.J.; Bautista, T.; Nunez, A.; Pimentel, A.D.; Thompson, M.

    2013-01-01

    In this article, we present a flexible and extensible system-level MP-SoC design space exploration (DSE) infrastructure, called NASA. This highly modular framework uses well-defined interfaces to easily integrate different system-level simulation tools as well as different combinations of search strategies...

  13. System Level Design of a Continuous-Time Delta-Sigma Modulator for Portable Ultrasound Scanners

    DEFF Research Database (Denmark)

    Llimos Muntal, Pere; Færch, Kjartan; Jørgensen, Ivan Harald Holger

    2015-01-01

    In this paper the system level design of a continuous-time ∆Σ modulator for portable ultrasound scanners is presented. The overall required signal-to-noise ratio (SNR) is derived to be 42 dB and the sampling frequency used is 320 MHz for an oversampling ratio of 16. In order to match these requirements, a fourth order, 1-bit modulator with optimal zero placing is used. An analysis shows that the thermal noise from the resistors and operational transconductance amplifier is not a limiting factor due to the low required SNR, leading to an inherently very low-power implementation. Furthermore, based on high-level VerilogA simulations, the performance of the ∆Σ modulator versus various block performance parameters is presented as trade-off curves. Based on these results, the block specifications are derived.
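
    A quick sanity check on these figures uses the standard idealized peak-SQNR approximation for an order-L, B-bit ∆Σ modulator (a textbook formula, not the paper's analysis). The quantization-noise-limited SNR at OSR = 16 sits far above the required 42 dB, consistent with thermal noise, rather than quantization noise, setting the design floor.

    ```python
    import math

    def peak_sqnr_db(order: int, bits: int, osr: int) -> float:
        """Idealized peak SQNR of an order-L, B-bit delta-sigma modulator."""
        L, B = order, bits
        return (6.02 * B + 1.76
                - 10 * math.log10(math.pi ** (2 * L) / (2 * L + 1))
                + (2 * L + 1) * 10 * math.log10(osr))

    # Figures from the abstract: 4th order, 1-bit, OSR = 16.
    print(f"ideal peak SQNR: {peak_sqnr_db(4, 1, 16):.1f} dB (required: 42 dB)")
    ```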

  14. Performance Evaluation at the System Level of Reconfigurable Space-Time Coding Techniques for HSDPA

    Directory of Open Access Journals (Sweden)

    Alexiou Angeliki

    2005-01-01

    Full Text Available A reconfigurable space-time coding technique that combats the effects of antenna correlation is investigated for a high-speed downlink packet access multiple-antenna network. Reconfigurability is achieved at the link level by introducing a linear precoder in a space-time block coded system. The technique assumes knowledge of the long-term characteristics of the channel, namely the channel correlation matrix, at the transmitter. The benefits of the proposed reconfigurable technique as compared to the conventional non-reconfigurable versions are evaluated via system-level simulations. In order to characterize the system-level performance accurately and, at the same time, use a feasible approach in terms of computational complexity, a suitable link-to-system interface has been developed. The average system throughput and the number of satisfied users are the performance metrics of interest. Simulation results demonstrate the performance enhancements achieved by the application of reconfigurable techniques as compared to their conventional counterparts.
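
    The conventional baseline that such precoded schemes build on is the 2x1 Alamouti space-time block code. A self-contained sketch of its encode/combine steps follows (random QPSK symbols and channel gains; this is the textbook scheme, not the paper's precoder):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Two QPSK symbols, sent over two antennas and two time slots.
    bits = rng.integers(0, 2, size=(2, 2))
    syms = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    s1, s2 = syms

    # Flat-fading gains from the two transmit antennas (known at the receiver).
    h1, h2 = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)
    noise = 0.05 * (rng.normal(size=2) + 1j * rng.normal(size=2))

    # Alamouti encoding: slot 1 sends (s1, s2); slot 2 sends (-s2*, s1*).
    r1 = h1 * s1 + h2 * s2 + noise[0]
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise[1]

    # Linear combining recovers both symbols with full transmit diversity.
    gain = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
    s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain

    print("sent:     ", np.round([s1, s2], 3))
    print("recovered:", np.round([s1_hat, s2_hat], 3))
    ```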

  15. Integrated Strategies to Gain a Systems-Level View of Dynamic Signaling Networks.

    Science.gov (United States)

    Newman, Robert H; Zhang, Jin

    2017-01-01

    In order to survive and function properly in the face of an ever changing environment, cells must be able to sense changes in their surroundings and respond accordingly. Cells process information about their environment through complex signaling networks composed of many discrete signaling molecules. Individual pathways within these networks are often tightly integrated and highly dynamic, allowing cells to respond to a given stimulus (or, as is typically the case under physiological conditions, a combination of stimuli) in a specific and appropriate manner. However, due to the size and complexity of many cellular signaling networks, it is often difficult to predict how cellular signaling networks will respond under a particular set of conditions. Indeed, crosstalk between individual signaling pathways may lead to responses that are nonintuitive (or even counterintuitive) based on examination of the individual pathways in isolation. Therefore, to gain a more comprehensive view of cell signaling processes, it is important to understand how signaling networks behave at the systems level. This requires integrated strategies that combine quantitative experimental data with computational models. In this chapter, we first examine some of the progress that has recently been made toward understanding the systems-level regulation of cellular signaling networks, with a particular emphasis on phosphorylation-dependent signaling networks. We then discuss how genetically targetable fluorescent biosensors are being used together with computational models to gain unique insights into the spatiotemporal regulation of signaling networks within single, living cells.

  16. An investigation into soft error detection efficiency at operating system level.

    Science.gov (United States)

    Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan

    2014-01-01

    Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation that gives rise to permanent and transient errors on microelectronic components. The occurrence rate of transient errors is significantly more than permanent errors. The transient errors, or soft errors, emerge in two formats: control flow errors (CFEs) and data errors. Valuable research results have already appeared in literature at hardware and software levels for their alleviation. However, there is the basic assumption behind these works that the operating system is reliable and the focus is on other system levels. In this paper, we investigate the effects of soft errors on the operating system components and compare their vulnerability with that of application level components. Results show that soft errors in operating system components affect both operating system and application level components. Therefore, by providing endurance to operating system level components against soft errors, both operating system and application level components gain tolerance.

  17. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Full Text Available Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, System Performance Simulation Implementation Mechanism, or SPSIM. Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work is aimed at delivering a high simulation throughput and, at the same time, guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator attains a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications proved that the difference between the simulated and measured results of timing performance is within 10%, which in the past could only be attained by cycle-accurate models.

  18. Water Phase Change Heat Exchanger System Level Analysis for Low Lunar Orbit

    Science.gov (United States)

    Navarro, Moses; Ungar, Eugene; Sheth, Rubik; Hansen, Scott

    2016-01-01

    In low Lunar orbit (LLO) the thermal environment is cyclic: extremely cold in the eclipse and as warm as room temperature near the subsolar point. Phase change material heat exchangers (PCHXs) are the best option for long-term missions in these environments. The Orion spacecraft will use an n-pentadecane wax PCHX for its envisioned mission to LLO. Using water as a PCM is attractive because its higher heat of fusion and greater density result in a lighter, more compact PCHX. To assess the use of a water PCHX for a human spacecraft in a circular LLO, a system-level analysis was performed for the Orion spacecraft. Three cases were evaluated: 1) a one-to-one replacement of the wax PCHX on the internal thermal control loop with a water PCHX (including the appropriate control modifications), 2) reducing the radiator return setpoint temperature below Orion's value to enhance PCHX freezing, and 3) placing the water PCM on the external loop. The model showed that the water PCHX could not be used as a drop-in replacement for the wax PCHX. It did not freeze fully during the eclipse owing to its low freezing point. To obtain equivalent performance, 40% more radiator area than the Orion baseline was required. The study shows that, although water PCHXs are attractive at the component level, system-level effects mean that they are not the best choice for LLO.

  19. Preprogrammed capillarity to passively control system-level sequential and parallel microfluidic flows.

    Science.gov (United States)

    Kim, Sung-Jin; Paczesny, Sophie; Takayama, Shuichi; Kurabayashi, Katsuo

    2013-06-01

    In microfluidics, capillarity-driven solution flow is often beneficial, owing to its inherently spontaneous motion. However, it is commonly perceived that, in an integrated microfluidic system, the passive capillarity control alone can hardly achieve well-controlled sequential and parallel flow of multiple solutions. Despite this common notion, we hereby demonstrate system-level sequential and parallel microfluidic flow processing by fully passive capillarity-driven control. After manual loading of solutions with a pipette, a network of microfluidic channels passively regulates the flow timing of the multiple solution menisci in a sequential and synchronous manner. Also, use of auxiliary channels and preprogramming of inlet-well meniscus pressure and channel fluidic conductance allow for controlling the flow direction of multiple solutions in our microfluidic system. With those components orchestrated in a single device chip, we show preprogrammed flow control of 10 solutions. The demonstrated system-level flow control proves capillarity as a useful means even for sophisticated microfluidic processing without any actively controlled valves and pumps.

  20. A Fast Approach for System-Level Power Modeling and Simulation

    Institute of Scientific and Technical Information of China (English)

    XIAJun; ZOUXuecheng

    2004-01-01

    Power is one of the main constraints in SOC (System-on-a-chip) design. System-level power modeling and simulation help to reduce power dissipation at an early stage. But because of the variability of system architectures, the amount of simulation required is vast, resulting in unacceptable simulation times. On the basis of previous work, a modified hybrid approach for core-based system-level power modeling is proposed in this paper, which enables SOC designers to estimate system power consumption under different core parameters while simulating the system only once. Thereafter designers can rapidly make trade-offs between performance characteristics (such as power, area, speed, and testability) and decide which architecture is the best solution to implement the system functionality. The key of our approach is to help core designers provide a power metric function to core users, and the effectiveness and efficiency of our approach hinge on whether the IP provider can provide an accurate power model of each core. A linear model is chosen to describe the relationship between power consumption and parameters, and least-squares error is selected as the optimization criterion to mitigate error. Although power is given as an example, our approach can also be applied to speed and area trade-offs, because speed and area performance modeling is relatively simpler.
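
    A linear power metric function of the kind described can be fitted with ordinary least squares. In the sketch below, the core parameters (frequency, toggle activity, cache size) and all coefficients are invented placeholders standing in for an IP provider's characterization data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic characterization data: each row is one simulation run with
    # parameters (frequency GHz, toggle activity, cache size MB); invented.
    params = rng.uniform([0.5, 0.1, 0.25], [2.0, 1.0, 4.0], size=(40, 3))
    true_coeffs = np.array([120.0, 300.0, 15.0])  # mW per unit of each parameter
    power = params @ true_coeffs + 50.0 + rng.normal(0.0, 5.0, 40)

    # Linear power model P = c0 + c1*f + c2*a + c3*s fitted by least squares.
    A = np.column_stack([np.ones(len(params)), params])
    coeffs, *_ = np.linalg.lstsq(A, power, rcond=None)

    print("fitted power metric function (mW):")
    print(f"  P = {coeffs[0]:.1f} + {coeffs[1]:.1f}*f "
          f"+ {coeffs[2]:.1f}*a + {coeffs[3]:.1f}*s")
    ```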

  1. Electrochemical reverse engineering: A systems-level tool to probe the redox-based molecular communication of biology.

    Science.gov (United States)

    Li, Jinyang; Liu, Yi; Kim, Eunkyoung; March, John C; Bentley, William E; Payne, Gregory F

    2016-12-29

    The intestine is the site of digestion and forms a critical interface between the host and the outside world. This interface is composed of host epithelium and a complex microbiota which is "connected" through an extensive web of chemical and biological interactions that determine the balance between health and disease for the host. This biology and the associated chemical dialogues occur within a context of a steep oxygen gradient that provides the driving force for a variety of reduction and oxidation (redox) reactions. While some redox couples (e.g., catecholics) can spontaneously exchange electrons, many others are kinetically "insulated" (e.g., biothiols) allowing the biology to set and control their redox states far from equilibrium. It is well known that within cells, such non-equilibrated redox couples are poised to transfer electrons to perform reactions essential to immune defense (e.g., transfer from NADH to O2 for reactive oxygen species, ROS, generation) and protection from such oxidative stresses (e.g., glutathione-based reduction of ROS). More recently, it has been recognized that some of these redox-active species (e.g., H2O2) cross membranes and diffuse into the extracellular environment including lumen to transmit redox information that is received by atomically-specific receptors (e.g., cysteine-based sulfur switches) that regulate biological functions. Thus, redox has emerged as an important modality in the chemical signaling that occurs in the intestine and there have been emerging efforts to develop the experimental tools needed to probe this modality. We suggest that electrochemistry provides a unique tool to experimentally probe redox interactions at a systems level. Importantly, electrochemistry offers the potential to enlist the extensive theories established in signal processing in an effort to "reverse engineer" the molecular communication occurring in this complex biological system. Here, we review our efforts to develop this

  2. Systems-level regulation of microRNA networks by miR-130/301 promotes pulmonary hypertension

    Science.gov (United States)

    Bertero, Thomas; Lu, Yu; Annis, Sofia; Hale, Andrew; Bhat, Balkrishen; Saggar, Rajan; Saggar, Rajeev; Wallace, W. Dean; Ross, David J.; Vargas, Sara O.; Graham, Brian B.; Kumar, Rahul; Black, Stephen M.; Fratz, Sohrab; Fineman, Jeffrey R.; West, James D.; Haley, Kathleen J.; Waxman, Aaron B.; Chau, B. Nelson; Cottrill, Katherine A.; Chan, Stephen Y.

    2014-01-01

    Development of the vascular disease pulmonary hypertension (PH) involves disparate molecular pathways that span multiple cell types. MicroRNAs (miRNAs) may coordinately regulate PH progression, but the integrative functions of miRNAs in this process have been challenging to define with conventional approaches. Here, analysis of the molecular network architecture specific to PH predicted that the miR-130/301 family is a master regulator of cellular proliferation in PH via regulation of subordinate miRNA pathways with unexpected connections to one another. In validation of this model, diseased pulmonary vessels and plasma from mammalian models and human PH subjects exhibited upregulation of miR-130/301 expression. Evaluation of pulmonary arterial endothelial cells and smooth muscle cells revealed that miR-130/301 targeted PPARγ with distinct consequences. In endothelial cells, miR-130/301 modulated apelin-miR-424/503-FGF2 signaling, while in smooth muscle cells, miR-130/301 modulated STAT3-miR-204 signaling to promote PH-associated phenotypes. In murine models, induction of miR-130/301 promoted pathogenic PH-associated effects, while miR-130/301 inhibition prevented PH pathogenesis. Together, these results provide insight into the systems-level regulation of miRNA-disease gene networks in PH with broad implications for miRNA-based therapeutics in this disease. Furthermore, these findings provide critical validation for the evolving application of network theory to the discovery of the miRNA-based origins of PH and other diseases. PMID:24960162

  3. A Probabilistic Approach for the System-Level Design of Multi-ASIP Platforms

    DEFF Research Database (Denmark)

    Micconi, Laura

    ...with a relatively short time-to-market. While there are several commercial tools for the design of a single ASIP, there is still a lack of automation in the design of multi-ASIP platforms. In this thesis we consider multi-ASIP platforms for real-time applications. Each ASIP is designed to run a specific group of tasks. We introduce a system-level Design Space Exploration (DSE) for the very early phases of the design that automates part of the multi-ASIP design flow. Our DSE is responsible for assigning the tasks to the different ASIPs, exploring different platform alternatives. We perform a schedulability analysis for each solution to determine which one has the highest chance of meeting the deadlines of the applications and should be considered in the next stages of the multi-ASIP design flow.
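
    The assign-then-check loop described here can be illustrated with a toy first-fit task assignment guarded by the classical Liu & Layland rate-monotonic utilization bound. This is a generic sketch of the idea, not the thesis' analysis; the task set and the number of ASIPs are invented:

    ```python
    # Toy task set: (name, wcet_ms, period_ms); numbers are invented.
    tasks = [("t1", 2, 10), ("t2", 3, 20), ("t3", 5, 25),
             ("t4", 4, 40), ("t5", 6, 30)]
    n_asips = 2

    def utilization_bound(n):  # Liu & Layland bound for RM scheduling
        return n * (2 ** (1 / n) - 1)

    # First-fit: place each task on the first ASIP that stays schedulable.
    asips = [[] for _ in range(n_asips)]
    for task in sorted(tasks, key=lambda t: t[2]):  # rate-monotonic order
        placed = False
        for group in asips:
            candidate = group + [task]
            u = sum(c / p for _, c, p in candidate)
            if u <= utilization_bound(len(candidate)):
                group.append(task)
                placed = True
                break
        if not placed:
            print(f"{task[0]} unschedulable on {n_asips} ASIPs under the RM bound")

    for i, group in enumerate(asips):
        u = sum(c / p for _, c, p in group)
        print(f"ASIP{i}: {[t[0] for t in group]} (U = {u:.2f})")
    ```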

  4. System-level Reliability Assessment of Power Stage in Fuel Cell Application

    DEFF Research Database (Denmark)

    Zhou, Dao; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    Highly efficient and less polluting fuel cell stacks are emerging as strong candidates for the power solution used in mobile base stations. In backup power applications, availability and reliability hold the highest priority. This paper considers reliability metrics from the component level to the system level for the power stage used in a fuel cell application. It starts with an estimation of the annual accumulated damage for the key power electronic components according to the real mission profile of the fuel cell system. Then, considering the parameter variations in both... reliability. In a case study of a 5 kW fuel cell power stage, the parameter variations of the lifetime model prove that the exponential factor of the junction temperature fluctuation is the most sensitive parameter. Besides, if a 5-out-of-6 redundancy is used, it is concluded that both the B10 and the B1 system...
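
    The 5-out-of-6 redundancy argument can be made concrete with the standard k-out-of-n reliability formula; the module reliability value below is an assumed placeholder, not the paper's number:

    ```python
    from math import comb

    def k_out_of_n_reliability(k: int, n: int, r: float) -> float:
        """System works if at least k of n identical modules work (reliability r)."""
        return sum(comb(n, i) * r**i * (1 - r) ** (n - i) for i in range(k, n + 1))

    r_module = 0.95  # assumed single power-module reliability at mission time
    for k, n in ((6, 6), (5, 6)):
        print(f"{k}-out-of-{n}: {k_out_of_n_reliability(k, n, r_module):.4f}")
    ```

    With these assumed numbers, tolerating one failed module lifts system reliability from roughly 0.735 to 0.967, which is the qualitative benefit the redundancy scheme targets.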

  5. Efficient Uplink Modeling for Dynamic System-Level Simulations of Cellular and Mobile Networks

    Directory of Open Access Journals (Sweden)

    Lobinger Andreas

    2010-01-01

    Full Text Available A novel theoretical framework for uplink simulations is proposed. It allows investigations which have to cover a very long (real) time span and which at the same time require a certain level of accuracy in terms of radio resource management, quality of service, and mobility. This is of particular importance for simulations of self-organizing networks. For this purpose, conventional system-level simulators are not suitable due to slow simulation speeds far beyond real-time. Simpler, snapshot-based tools lack the aforementioned accuracy. The runtime improvements are achieved by deriving abstract theoretical models for the MAC layer behavior. The focus of this work is Long-Term Evolution (LTE), and the most important uplink effects, such as fluctuating interference, power control, power limitation, adaptive transmission bandwidth, and control channel limitations, are considered. Limitations of the abstract models are discussed as well. Exemplary results are given at the end to demonstrate the capability of the derived framework.

  6. A system-level bandwidth design method for wormhole network-on-chip

    Science.gov (United States)

    Wang, Jian; Li, Yubai; Liao, Changjun

    2016-11-01

    To improve Network-on-Chip (NoC) performance, we propose a system-level bandwidth design method that customises the bandwidths of the NoC links. In detail, we first build a mathematical model to capture the relationship between NoC communication latency and NoC link bandwidth, and then develop a bandwidth allocation algorithm to automatically optimise the bandwidth of each NoC link. The experimental results show that our bandwidth-customising method improves NoC performance compared to the traditional uniform bandwidth allocation method. Besides, it can also make the NoC achieve the same communication performance level as a uniform-bandwidth NoC while using fewer bandwidth resources, which helps save NoC area and power.
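
    The flavor of such per-link customisation can be illustrated with a greedy allocation loop under an assumed M/M/1-style latency model (latency ~ 1/(bandwidth - load)); this is a sketch of the idea, not the paper's algorithm, and the loads and budget below are invented.

```python
# Greedy per-link bandwidth customisation (illustrative): give spare
# budget to whichever link currently dominates communication latency.

def link_latency(bw, load):
    return float("inf") if bw <= load else 1.0 / (bw - load)

def allocate(loads, total_bw, step=0.01):
    # Start just above each link's offered load, then spend the remaining
    # budget where it reduces the worst latency the most.
    bw = [load + step for load in loads]
    budget = total_bw - sum(bw)
    assert budget >= 0, "total bandwidth too small for offered loads"
    while budget >= step:
        worst = max(range(len(bw)), key=lambda i: link_latency(bw[i], loads[i]))
        bw[worst] += step
        budget -= step
    return bw

loads = [0.30, 0.10, 0.55]          # normalised offered load per NoC link
bw = allocate(loads, total_bw=2.0)
print([round(b, 2) for b in bw])
print([round(link_latency(b, l), 1) for b, l in zip(bw, loads)])
```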

  7. Design of a Computationally Efficient Dynamic System-Level Simulator for Enterprise LTE Femtocell Scenarios

    Directory of Open Access Journals (Sweden)

    J. M. Ruiz-Avilés

    2012-01-01

    Full Text Available In the context of Long-Term Evolution (LTE), the next-generation mobile telecommunication network, femtocells are low-power base stations that efficiently provide coverage and capacity indoors. This paper presents a computationally efficient dynamic system-level LTE simulator for enterprise femtocell scenarios. The simulator includes specific mobility, traffic, and propagation models for indoor environments. A physical layer abstraction is performed to predict link-layer performance with low computational cost. At link layer, two important functions are included to increase network capacity: Link Adaptation and Dynamic Scheduling. At network layer, other Radio Resource Management functionalities, such as Admission Control and Mobility Management, are also included. The resulting tool can be used to test and validate optimization algorithms in the context of Self-Organizing Networks (SON).
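
    One widely used form of physical-layer abstraction, assumed here for illustration (the paper does not spell out this exact mapping), replaces link-level simulation with an attenuated, capped Shannon curve from SINR to LTE spectral efficiency; the alpha/max_se values below are typical placeholders, not the simulator's calibration.

```python
# Attenuated-and-capped Shannon abstraction: SINR -> spectral efficiency
# -> throughput, avoiding full link-level simulation.
import math

def lte_spectral_efficiency(sinr_db, alpha=0.75, max_se=4.8):
    """alpha models implementation loss; max_se caps at the highest MCS."""
    sinr = 10 ** (sinr_db / 10.0)
    return min(alpha * math.log2(1.0 + sinr), max_se)

def throughput_mbps(sinr_db, n_prb=50):
    # One LTE PRB spans 180 kHz; throughput = bandwidth * spectral efficiency.
    return n_prb * 180e3 * lte_spectral_efficiency(sinr_db) / 1e6

for sinr_db in (0, 10, 20, 30):
    print(f"{sinr_db:>3} dB -> {throughput_mbps(sinr_db):6.1f} Mbit/s")
```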

  8. Development of a Software Framework for System-Level Carbon Sequestration Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.

    2013-02-28

    The overall purpose of this project was to identify, evaluate, select, develop, and test a suite of enhancements to the GoldSim software program, in order to make it a better tool for use in support of Carbon Capture and Sequestration (CCS) projects. The GoldSim software is a foundational tool used by scientists at NETL and at other laboratories and research institutions to evaluate system-level risks of proposed CCS projects. The primary product of the project was a series of successively improved versions of the GoldSim software, supported by an extensive User’s Guide. All of the enhancements were tested by scientists at Los Alamos National Laboratory, and several of the enhancements have already been incorporated into the CO2-PENS sequestration model.

  9. System-level integrated circuit (SLIC) development for phased array antenna applications

    Science.gov (United States)

    Shalkhauser, K. A.; Raquet, C. A.

    1991-01-01

    A microwave/millimeter wave system-level integrated circuit (SLIC) being developed for use in phased array antenna applications is described. The program goal is to design, fabricate, test, and deliver an advanced integrated circuit that merges radio frequency (RF) monolithic microwave integrated circuit (MMIC) technologies with digital, photonic, and analog circuitry that provides control, support, and interface functions. As a whole, the SLIC will offer improvements in RF device performance, uniformity, and stability while enabling accurate, rapid, repeatable control of the RF signal. Furthermore, the SLIC program addresses issues relating to insertion of solid state devices into antenna systems, such as the reduction in number of bias, control, and signal lines. Program goals, approach, and status are discussed.

  10. Knowledge representation model for systems-level analysis of signal transduction networks.

    Science.gov (United States)

    Lee, Dong-Yup; Zimmer, Ralf; Lee, Sang-Yup; Hanisch, Daniel; Park, Sunwon

    2004-01-01

    A Petri-net based model for knowledge representation has been developed to describe as explicitly and formally as possible the molecular mechanisms of cell signaling and their pathological implications. A conceptual framework has been established for reconstructing and analyzing signal transduction networks on the basis of the formal representation. Such a conceptual framework renders it possible to qualitatively understand cell signaling behavior at the systems level. The mechanisms of the complex signaling network are explored by applying the established framework to the signal transduction induced by the potent proinflammatory cytokines IL-1beta and TNF-alpha. The corresponding expert-knowledge network is constructed to evaluate its mechanisms in detail. This strategy should be useful in drug target discovery and its validation.
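
    A minimal sketch of Petri-net style execution makes the representation concrete: places hold tokens, and a transition fires when all its input places are marked, moving tokens downstream. The three-step cascade below (bind/activate/transcribe) is a hypothetical toy, not the paper's IL-1beta/TNF-alpha network.

```python
# Tiny condition/event Petri-net interpreter over a hypothetical
# two-step signalling cascade (set-based markings for simplicity).

transitions = {
    "bind":       ({"ligand", "receptor"},  {"complex"}),
    "activate":   ({"complex", "kinase"},   {"complex", "kinase_p"}),
    "transcribe": ({"kinase_p"},            {"target_gene_on"}),
}

def run(marking, max_steps=10):
    marking = set(marking)
    for _ in range(max_steps):
        fired = False
        for name, (inputs, outputs) in transitions.items():
            if inputs <= marking:                       # transition enabled
                marking = (marking - inputs) | outputs  # fire it
                print(f"fired {name}: {sorted(marking)}")
                fired = True
                break
        if not fired:                                   # deadlock: stop
            break
    return marking

run({"ligand", "receptor", "kinase"})
```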

  11. Computational methodology to predict satellite system-level effects from impacts of untrackable space debris

    Science.gov (United States)

    Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.

    2013-07-01

    This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
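
    The final step of such a methodology, mapping component failure probabilities through a Boolean model of the functional architecture, can be sketched as below. The component names, probabilities, and architecture are invented placeholders (standing in for the flux-model plus ballistic-limit results), and failures are assumed independent.

```python
# Boolean functional-architecture roll-up of per-component failure
# probabilities (all numbers and the architecture are illustrative).

p_fail = {          # probability each component fails by end of mission
    "obc": 0.02,
    "battery_a": 0.05, "battery_b": 0.05,    # redundant pair
    "transponder": 0.03,
}

def p_survive(name):
    return 1.0 - p_fail[name]

# Assumed architecture: the satellite functions if the OBC works
# AND (battery A OR battery B works) AND the transponder works.
p_power = 1.0 - p_fail["battery_a"] * p_fail["battery_b"]   # OR of the pair
p_system = p_survive("obc") * p_power * p_survive("transponder")

print(f"P(system functional)      = {p_system:.4f}")
print(f"P(functional impairment)  = {1 - p_system:.4f}")
```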

  12. Relative performance analysis of IR FPA technologies from the perspective of system level performance

    Science.gov (United States)

    Haran, Terence L.; James, J. Christopher; Cincotta, Tomas E.

    2017-08-01

    The majority of high performance infrared systems today utilize FPAs composed of intrinsic direct bandgap semiconductor photon detectors such as MCT or InSb. Quantum well detector technologies such as QWIPs, QDIPs, and SLS photodetectors are potentially lower-cost alternatives to MCT and InSb, but the relative performance of these technologies has not been sufficiently high to allow widespread adoption outside of a handful of applications. While detectors are often evaluated using figures of merit such as NETD or D∗, these metrics, which include many underlying aspects such as spectral quantum efficiency, dark current, well size, MTF, and array response uniformity, may be far removed from the performance metrics used to judge performance of a system in an operationally relevant scenario. True comparisons of performance for various detector technologies from the perspective of end-to-end system performance have rarely been conducted, especially considering the rapid progress of the newer quantum well technologies. System-level models such as the US Army's Night Vision Integrated Performance Model (NV-IPM) can calculate image contrast and spatial frequency content using data from the target/background, intervening atmosphere, and system components. This paper includes results from a performance parameter sensitivity analysis using NV-IPM to determine the relative importance of various FPA performance parameters to the overall performance of a long range imaging system. Parameters included are: QE, dark current density, quantum well capacity, downstream readout noise, well fill, image frame rate, frame averaging, and residual fixed pattern noise. The state of the art for XBn, QWIP, and SLS detector technologies operating in the MWIR and LWIR bands will be surveyed to assess performance of quantum structures compared to MCT and InSb. The intent is to provide a comprehensive assessment of quantum detector performance and to identify areas where increased research

  13. Goal-directed behaviour and instrumental devaluation: a neural system-level computational model

    Directory of Open Access Journals (Sweden)

    Francesco Mannella

    2016-10-01

    Full Text Available Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviours guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or `action-outcomes', e.g. foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) the three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and integrates the results of different devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behaviour.

  14. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model.

    Science.gov (United States)

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or "action-outcomes", e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior.

  15. Z Theory

    OpenAIRE

    Nekrasov, Nikita

    2004-01-01

    We present the evidence for the existence of the topological string analogue of M-theory, which we call Z-theory. The corners of Z-theory moduli space correspond to the Donaldson-Thomas theory, Kodaira-Spencer theory, Gromov-Witten theory, and Donaldson-Witten theory. We discuss the relations of Z-theory with Hitchin's gravities in six and seven dimensions, and make our own proposal, involving spinor generalization of Chern-Simons theory of three-forms. Based on the talk at Strings'04 in Paris.

  16. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    Science.gov (United States)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before and rapid decisions during future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and short warning time, caused by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million in damage to cargo and additional

  17. Bipolar-pulses observed by the LRS/WFC-L onboard KAGUYA - Plausible evidence of lunar dust impact -

    Science.gov (United States)

    Kasahara, Yoshiya; Horie, Hiroki; Hashimoto, Kozo; Omura, Yoshiharu; Goto, Yoshitaka; Kumamoto, Atsushi; Ono, Takayuki; Tsunakawa, Hideo; Lrs/Wfc Team; Map/Lmag Team

    2010-05-01

    same) and thus most of the bipolar pulses that can be detected in MONO mode are cancelled in DIFF mode. This fact suggests that these bipolar pulses are not a kind of natural wave but are instead caused by instantaneous potential changes of the KAGUYA spacecraft. Discussion: A similar type of bipolar pulse has been observed in monopole antenna measurements using the Radio and Plasma Wave Science (RPWS) instrument on board Cassini at Saturn [4]. They demonstrated that these bipolar pulses are caused by impacts of dust floating around Saturn. It is well known that lunar dust is widely distributed up to high altitudes around the Moon, and it is plausible that these bipolar pulses are caused by lunar dust impacts. In the presentation, we show the detailed characteristics of the bipolar pulses detected by the WFC-L onboard KAGUYA. References: [1] Y. Kasahara et al., Earth, Planets and Space, 60(4), 341-351, 2008. [2] T. Ono et al., Earth, Planets and Space, 60(4), 321-332, 2008. [3] K. Hashimoto et al., The 4th SELENE (KAGUYA) Science Working Team Meeting, (this issue), 2010. [4] W.S. Kurth et al, Planetary and Space Science, 54(9-10), 988-998, 2006.

  18. Beliefs in Context: Understanding Language Policy Implementation at a Systems Level

    Science.gov (United States)

    Hopkins, Megan

    2016-01-01

    Drawing on institutional theory, this study describes how cognitive, normative, and regulative mechanisms shape bilingual teachers' language policy implementation in both English-only and bilingual contexts. Aligned with prior educational language policy research, findings indicate the important role that teachers' beliefs play in the policy…

  19. Multiscale computational analysis of Xenopus laevis morphogenesis reveals key insights of systems-level behavior

    Directory of Open Access Journals (Sweden)

    DeSimone Douglas W

    2007-10-01

    Full Text Available Abstract Background: Tissue morphogenesis is a complex process whereby tissue structures self-assemble by the aggregate behaviors of independently acting cells responding to both intracellular and extracellular cues in their environment. During embryonic development, morphogenesis is particularly important for organizing cells into tissues, and although key regulatory events of this process are well studied in isolation, a number of important systems-level questions remain unanswered. This is due, in part, to a lack of integrative tools that enable the coupling of biological phenomena across spatial and temporal scales. Here, we present a new computational framework that integrates intracellular signaling information with multi-cell behaviors in the context of a spatially heterogeneous tissue environment. Results: We have developed a computational simulation of mesendoderm migration in the Xenopus laevis explant model, which is a well-studied biological model of tissue morphogenesis that recapitulates many features of this process during development in humans. The simulation couples, via a Java interface, an ordinary differential equation-based mass action kinetics model to compute intracellular Wnt/β-catenin signaling with an agent-based model of mesendoderm migration across a fibronectin extracellular matrix substrate. The emergent cell behaviors in the simulation suggest the following properties of the system: maintaining the integrity of cell-to-cell contact signals is necessary for preventing fractionation of cells as they move, contact with the Fn substrate and the existence of a Fn gradient provides an extracellular feedback loop that governs migration speed, the incorporation of polarity signals is required for cells to migrate in the same direction, and a delicate balance of integrin and cadherin interactions is needed to reproduce experimentally observed migratory behaviors. Conclusion: Our computational framework couples two different
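
    The coupling pattern described (an intracellular ODE integrated inside each agent, whose output drives the agent's migration rule along a fibronectin gradient) can be sketched in miniature as follows. This is a toy with invented rate constants and a one-variable stand-in for the Wnt/β-catenin model, not the published simulation.

```python
# Toy ODE/agent coupling: each cell integrates ds/dt = k_on*Fn(x) - k_off*s
# (Euler step), and its migration speed scales with the signalling level s.
import random

def fn_density(x):
    return min(1.0, max(0.0, x / 100.0))        # assumed linear Fn gradient

class Cell:
    def __init__(self, x):
        self.x = x                               # position along the substrate
        self.s = 0.0                             # signalling activity

    def step(self, dt=0.1):
        # intracellular ODE (invented constants k_on=2.0, k_off=1.0)
        self.s += dt * (2.0 * fn_density(self.x) - 1.0 * self.s)
        # agent rule: biased random migration, speed scales with s
        self.x += dt * (5.0 * self.s + random.uniform(-0.5, 0.5))

cells = [Cell(x=random.uniform(0, 10)) for _ in range(5)]
for _ in range(200):
    for c in cells:
        c.step()
print([round(c.x, 1) for c in cells])            # cells advance up the gradient
```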

  20. Possibilistic systems within a general information theory

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C.

    1999-06-01

    The author surveys possibilistic systems theory and places it in the context of Imprecise Probabilities and General Information Theory (GIT). In particular, he argues that possibilistic systems hold a distinct position within a broadly conceived, synthetic GIT. The focus is on systems and applications which are semantically grounded by empirical measurement methods (statistical counting), rather than epistemic or subjective knowledge elicitation or assessment methods. Regarding fuzzy measures as special previsions, and evidence measures (belief and plausibility measures) as special fuzzy measures, he can thereby measure imprecise probabilities directly and empirically from set-valued frequencies (random set measurement). More specifically, measurements of random intervals yield empirical fuzzy intervals. In the random set (Dempster-Shafer) context, probability and possibility measures stand as special plausibility measures in that their distributionality (decomposability) maps directly to an aggregable structure of the focal classes of their random sets. Further, possibility measures share with imprecise probabilities the ability to better handle open-world problems where the universe of discourse is not specified in advance. In addition to empirically grounded measurement methods, possibility theory also provides another crucial component of a full systems theory, namely prediction methods in the form of finite (Markov) processes which are strictly analogous to the probabilistic forms.
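
    The "random set measurement" idea can be made concrete with a minimal sketch: set-valued observations are counted into an empirical basic probability assignment, from which belief and plausibility of any event follow directly. The observation data below are invented.

```python
# Empirical Dempster-Shafer measures from set-valued frequencies.
from collections import Counter
from fractions import Fraction

# Each observation is a *set* of possible values (an imprecise measurement).
observations = [
    frozenset({1}), frozenset({1, 2}), frozenset({2, 3}),
    frozenset({1, 2}), frozenset({3}),
]

counts = Counter(observations)
n = len(observations)
m = {focal: Fraction(c, n) for focal, c in counts.items()}   # empirical BPA

def belief(event):        # mass of focal sets wholly inside the event
    return sum(mass for focal, mass in m.items() if focal <= event)

def plausibility(event):  # mass of focal sets overlapping the event
    return sum(mass for focal, mass in m.items() if focal & event)

event = frozenset({1, 2})
print("Bel:", belief(event), " Pl:", plausibility(event))    # 3/5 and 4/5
```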

  1. A theory-driven evaluation of a wellness initiative

    Directory of Open Access Journals (Sweden)

    Carren Field

    2012-07-01

    Full Text Available Orientation: By reporting on an evaluation of a wellness initiative, this article brings together an element of organisational development (employee wellness) with an approach to programme evaluation (programme theory-driven evaluation). Research purpose: Two questions were addressed: ‘What is the causal logic of the wellness initiative?’ and ‘Is this a plausible programme theory according to social science research and literature?’ Motivation for the study: A study that could demonstrate the usefulness of the theory-driven evaluation approach, especially in the local human resource (HR) domain, was considered to be valuable. In addition, this evaluation provided a careful consideration of how plausible it is for such interventions to achieve what they set out to do. Research design, approach and method: The evaluation relied mainly on qualitative methods (the examination of secondary data and interviewing) to extract programme theory, and on literature to assess plausibility. Main findings: The study had two main outcomes: the finalisation of a model of how the programme is supposed to work according to programme staff, and the conclusion that the model is plausible, provided it is implemented at full strength. Practical/managerial implications: Programme staff are advised to pay particular attention to implementation fidelity, especially to employee participation and involvement in the programme’s activities. A number of strategies are recommended to strengthen the effect of the model. Contribution/value-add: This evaluation showed the importance of conducting a theory-driven evaluation, not only in order to understand the programme and its context, but also to provide a basis for an implementation and outcome evaluation.

  2. Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review

    Directory of Open Access Journals (Sweden)

    Jessie-Lee D. McIsaac

    2016-02-01

    Full Text Available Health promoting schools (HPS is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research to support the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best for whom, in what circumstance to create healthier schools and students.

  3. Systems level circuit model of C. elegans undulatory locomotion: mathematical modeling and molecular genetics.

    Science.gov (United States)

    Karbowski, Jan; Schindelman, Gary; Cronin, Christopher J; Seah, Adeline; Sternberg, Paul W

    2008-06-01

    To establish the relationship between locomotory behavior and dynamics of neural circuits in the nematode C. elegans we combined molecular and theoretical approaches. In particular, we quantitatively analyzed the motion of C. elegans with defective synaptic GABA and acetylcholine transmission, defective muscle calcium signaling, and defective muscles and cuticle structures, and compared the data with our systems level circuit model. The major experimental findings are: (1) anterior-to-posterior gradients of body bending flex for almost all strains both for forward and backward motion, and for neuronal mutants, also analogous weak gradients of undulatory frequency, (2) existence of some form of neuromuscular (stretch receptor) feedback, (3) invariance of neuromuscular wavelength, (4) biphasic dependence of frequency on synaptic signaling, and (5) decrease of frequency with increase of the muscle time constant. Based on (1) we hypothesize that the Central Pattern Generator (CPG) is located in the head both for forward and backward motion. Points (1) and (2) are the starting assumptions for our theoretical model, whose dynamical patterns are qualitatively insensitive to the details of the CPG design if stretch receptor feedback is sufficiently strong and slow. The model reveals that stretch receptor coupling in the body wall is critical for generation of the neuromuscular wave. Our model agrees with our behavioral data (3), (4), and (5), and with other pertinent published data, e.g., that frequency is an increasing function of muscle gap-junction coupling.

  4. SPATIAL: A System-level PAThway Impact AnaLysis approach

    Science.gov (United States)

    Bokanizad, Behzad; Tagett, Rebecca; Ansari, Sahar; Helmi, B. Hoda; Draghici, Sorin

    2016-01-01

    The goal of pathway analysis is to identify the pathways that are significantly impacted when a biological system is perturbed, e.g. by a disease or drug. Current methods treat pathways as independent entities. However, many signals are constantly sent from one pathway to another, essentially linking all pathways into a global, system-wide complex. In this work, we propose a set of three pathway analysis methods based on the impact analysis that perform a system-level analysis by considering all signals between pathways, as well as their overlaps. Briefly, the global system is modeled in two ways: (i) considering the inter-pathway interaction exchange for each individual pathway, and (ii) combining all individual pathways to form a global, system-wide graph. The third analysis method is a hybrid of these two models. The new methods were compared with DAVID, GSEA, GSA, PathNet, Crosstalk and SPIA on 23 GEO data sets involving 19 tissues investigated in 12 conditions. The results show that both the ranking and the P-values of the target pathways are substantially improved when the analysis considers the system-wide dependencies and interactions between pathways. PMID:27193997
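
    Modeling approach (ii), combining individual pathways into one system-wide graph, can be illustrated in a few lines (requires networkx); the two pathways below are invented toys, and shared nodes mark the crosstalk that per-pathway methods miss.

```python
# Merging pathway graphs into a global, system-wide graph.
import networkx as nx

pathway_a = nx.DiGraph([("EGFR", "RAS"), ("RAS", "ERK")])    # invented
pathway_b = nx.DiGraph([("RAS", "PI3K"), ("PI3K", "AKT")])   # invented

global_graph = nx.compose(pathway_a, pathway_b)   # one system-wide graph

# Nodes shared between pathways carry inter-pathway signal exchange.
crosstalk = set(pathway_a) & set(pathway_b)
print("crosstalk nodes:", crosstalk)              # {'RAS'}
print("global edges:", list(global_graph.edges))
```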

  5. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    Science.gov (United States)

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding."

  6. System Level Analysis of a Water PCM HX Integrated Into Orion's Thermal Control System

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Ungar, Eugene; Sheth, Rubik

    2015-01-01

    In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development an Orion system level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system, in a 100 km Lunar orbit. The study analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS) 2) use of 30/70 PGW versus 50/50 PGW and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that for the assumed operating and boundary conditions utilizing a water PCM HX on Orion is not a viable option. Additionally, it was found that the radiator area would have to be increased over 20% in order to have a viable water-based PCM HX.

  7. System Level Analysis of a Water PCM HX Integrated into Orion's Thermal Control System

    Science.gov (United States)

    Navarro, Moses; Hansen, Scott; Seth, Rubik; Ungar, Eugene

    2015-01-01

    In a cyclical heat load environment such as low Lunar orbit, a spacecraft's radiators are not sized to reject the full heat load requirement. Traditionally, a supplemental heat rejection device (SHReD) such as an evaporator or sublimator is used to act as a "topper" to meet the additional heat rejection demands. Utilizing a Phase Change Material (PCM) heat exchanger (HX) as a SHReD provides an attractive alternative to evaporators and sublimators as PCM HXs do not use a consumable, thereby leading to reduced launch mass and volume requirements. In continued pursuit of water PCM HX development an Orion system level analysis was performed using Thermal Desktop for a water PCM HX integrated into Orion's thermal control system in a 100 km Lunar orbit. The study verified the thermal model using a wax PCM and analyzed 1) placing the PCM on the Internal Thermal Control System (ITCS) versus the External Thermal Control System (ETCS) 2) use of 30/70 PGW versus 50/50 PGW and 3) increasing the radiator area in order to reduce PCM freeze times. The analysis showed that for the assumed operating and boundary conditions utilizing a water PCM HX on Orion is not a viable option for any case. Additionally, it was found that the radiator area would have to be increased by at least 40% in order to support a viable water-based PCM HX.
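
    A back-of-envelope sizing calculation illustrates why freeze time is the limiting issue for a water PCM in this orbit. The heat loads, hot-period duration, and radiator margin below are invented placeholders, not Orion values; only water's latent heat of fusion is a physical constant.

```python
# PCM "topper" sizing: the PCM absorbs the supplemental load during the
# orbital hot case and must refreeze on the cold side before the next one.
LATENT_HEAT_FUSION = 334e3        # J/kg, water ice <-> liquid

def pcm_mass_kg(supplemental_load_w, hot_period_min):
    """PCM mass needed to buffer the load for one hot period."""
    energy_j = supplemental_load_w * hot_period_min * 60
    return energy_j / LATENT_HEAT_FUSION

def freeze_time_min(mass_kg, rejection_margin_w):
    """Time to refreeze the melted PCM given spare radiator capacity."""
    return mass_kg * LATENT_HEAT_FUSION / rejection_margin_w / 60

m = pcm_mass_kg(supplemental_load_w=1500, hot_period_min=40)
print(f"PCM mass: {m:.1f} kg")                           # ~10.8 kg
print(f"Refreeze at 500 W margin: {freeze_time_min(m, 500):.0f} min")
# ~120 min -- about as long as an entire ~118 min low Lunar orbit,
# which is why freeze times drive the radiator-area trade.
```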

  8. System-level design trade-offs for truly wearable wireless medical devices.

    Science.gov (United States)

    Chen, Guangwei; Rodriguez-Villegas, Esther

    2010-01-01

    Power and current management in emerging wearable medical devices, intended to continuously monitor physiological signals, are crucial design issues. The overall size of the electronic part of these systems is generally going to be dominated by the size of the batteries. Unfortunately, opting for smaller batteries does not only come at the expense of lower capacity and hence shorter operation time; it also significantly constrains the amount of current available to the different electronic blocks, as well as their operating power supply voltage. This paper discusses all the typical power and current management system-level issues in the design of a typical miniature wearable wireless medical device. The discussion is illustrated with experimental results obtained with two devices built using two of the currently most popular low power commercial transceivers in the market, the Texas Instruments (TI) CC2500 and the Nordic Semiconductor nRF24L01+. The numbers presented can be used as more realistic guidance on the energy per bit required in a real system implementation, as opposed to the ideal figures normally quoted by manufacturers. Furthermore, the analysis in this paper can also be extrapolated to the design of future wireless monitoring wearable devices with further optimized radio transceivers.
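
    The gap between datasheet figures and realistic energy per bit comes largely from radio start-up time and protocol overhead, which a quick estimate can expose. All numbers in the sketch below are invented placeholders, explicitly not measurements of the CC2500 or nRF24L01+.

```python
# Energy per *usefully transmitted* bit, charging start-up and
# header/ACK overhead against the payload (illustrative numbers only).

def energy_per_bit(v_supply, i_tx_ma, raw_rate_bps, payload, overhead,
                   t_startup_ms=0.3):
    """Average energy (J/bit) for one packet."""
    total_bits = payload + overhead
    t_packet = total_bits / raw_rate_bps + t_startup_ms / 1e3
    energy = v_supply * (i_tx_ma / 1e3) * t_packet
    return energy / payload

e = energy_per_bit(v_supply=3.0, i_tx_ma=15.0, raw_rate_bps=250e3,
                   payload=256, overhead=80)
print(f"{e * 1e9:.1f} nJ per payload bit")   # ~289 nJ with these numbers
```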

  9. System Level Design of Reconfigurable Server Farms Using Elliptic Curve Cryptography Processor Engines

    Directory of Open Access Journals (Sweden)

    Sangook Moon

    2014-01-01

    Full Text Available As today’s hardware architecture becomes more and more complicated, it is getting harder to modify or improve the microarchitecture of a design at register transfer level (RTL). Consequently, the traditional methods we have used to develop a design are not capable of coping with complex designs. In this paper, we suggest a way of designing complex digital logic circuits with a soft and advanced type of SystemVerilog at the electronic system level. We apply the concept of design-and-reuse with a high level of abstraction to implement elliptic curve crypto-processor server farms. With this superior level of abstraction compared to the RTL used in traditional HDL design, we successfully achieved a soft implementation of the crypto-processor server farms as well as robust test bench code with trivial effort in the same simulation environment. Otherwise, it could have required error-prone Verilog simulations for the hardware IPs and other time-consuming jobs such as C/SystemC verification for the software, sacrificing more time and effort. In the design of the elliptic curve cryptography processor engine, we propose a 3X faster GF(2^m) serial multiplication architecture.
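
    A software reference model clarifies what GF(2^m) serial multiplication computes: shift-and-add (XOR) of partial products, one bit per clock, followed by reduction modulo the field polynomial. This illustrates the arithmetic the crypto-processor implements, not the paper's 3X-faster architecture itself; GF(2^8) with the AES polynomial is used only as a checkable example.

```python
# Bit-serial GF(2^m) multiplication reference model.

def gf2m_mul(a, b, m=8, poly=0b100011011):   # example field: GF(2^8), AES poly
    """Multiply field elements a, b (integers < 2^m) modulo `poly`."""
    result = 0
    for i in range(m):                        # one bit of b per "clock cycle"
        if (b >> i) & 1:
            result ^= a << i                  # add (XOR) shifted partial product
    for i in range(2 * m - 2, m - 1, -1):     # reduce back below degree m
        if (result >> i) & 1:
            result ^= poly << (i - m)
    return result

# Sanity check against a known GF(2^8) product (FIPS-197): 0x57*0x83 = 0xC1.
assert gf2m_mul(0x57, 0x83) == 0xC1
print(hex(gf2m_mul(0x57, 0x83)))
```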

  10. System-level modeling and verification of a micro pitch-tunable grating

    Science.gov (United States)

    Lv, Xianglian; Xu, Jinghui; Yu, Yiting; He, Yang; Yuan, Weizheng

    2010-10-01

    A micro pitch-tunable grating based on microelectromechanical systems (MEMS) technology can modulate the grating period dynamically by controlling the drive voltage. The device is too complex to model and simulate with the FEA method or with analytical macromodels alone. In this paper, a new hybrid system-level modeling method is presented. First, the grating was decomposed into functional components such as the grating beams, supporting beams, and electrostatic comb-driver. The block Arnoldi algorithm was used to obtain numerical macromodels of the grating beams and supporting beams; analytical macromodels, called multi-port elements (MPEs), of the comb-driver and other parts were also established, and the elements were connected together to form a hybrid network representing the system-level model of the grating in MEMS Garden, a MEMS CAD tool developed by the Micro and Nano Electromechanical Systems Laboratory, Northwestern Polytechnical University. Both frequency- and time-domain simulations were implemented. The grating was fabricated using a silicon-on-glass (SOG) process. The measured working displacement is 16.5μm at a driving voltage of 40V; the simulation result is 17.6μm, which agrees with the measurement within an error tolerance of 6.7%. The method proposed in this paper solves the voltage-displacement simulation problem for this kind of complex grating. It can also be adapted to simulations of similar MEMS/MOEMS devices.
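
    The block Arnoldi step, projecting a large FEA-sized state-space model onto a small block Krylov subspace to get a numerical macromodel, can be sketched as below (requires numpy). The toy system, its sizes, and the block count are invented for illustration.

```python
# Block-Arnoldi model order reduction sketch: build an orthonormal basis V
# for span{B, A B, A^2 B, ...} and project the dynamics x' = A x + B u.
import numpy as np

def block_arnoldi(A, B, n_blocks):
    """Orthonormal basis for the block Krylov subspace."""
    Q, _ = np.linalg.qr(B)
    blocks = [Q]
    for _ in range(n_blocks - 1):
        W = A @ blocks[-1]
        for Qi in blocks:                 # orthogonalise against earlier blocks
            W -= Qi @ (Qi.T @ W)
        Q, _ = np.linalg.qr(W)
        blocks.append(Q)
    return np.hstack(blocks)

rng = np.random.default_rng(0)
n, p = 200, 2                             # full order, number of inputs (toy)
A = -np.diag(rng.uniform(1, 10, n))       # stable toy dynamics
B = rng.standard_normal((n, p))

V = block_arnoldi(A, B, n_blocks=5)       # reduced order = 5 * p = 10
A_red, B_red = V.T @ A @ V, V.T @ B       # projected macromodel
print(A.shape, "->", A_red.shape)         # (200, 200) -> (10, 10)
```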

  11. A Platform-Based Methodology for System-Level Mixed-Signal Design

    Directory of Open Access Journals (Sweden)

    Alberto Sangiovanni-Vincentelli

    2010-01-01

    Full Text Available The complexity of today's embedded electronic systems as well as their demanding performance and reliability requirements are such that their design can no longer be tackled with ad hoc techniques while still meeting tight time-to-market constraints. In this paper, we present a system-level design approach for electronic circuits, utilizing the platform-based design (PBD) paradigm as the natural framework for mixed-domain design formalization. In PBD, a meet-in-the-middle approach allows systematic exploration of the design space through a series of top-down mappings of system constraints onto component feasibility models in a platform library, which is based on bottom-up characterizations. In this framework, new designs can be assembled from the precharacterized library components, giving the highest priority to design reuse, correct assembly, and efficient design flow from specifications to implementation. We apply concepts from design centering to enforce robustness to modeling errors as well as process, voltage, and temperature variations, which are currently plaguing embedded system design in deep-submicron technologies. The effectiveness of our methodology is finally shown on the design of a pipeline A/D converter and two receiver front-ends for UMTS and UWB communications.

  12. System-Level Integrated Circuit (SLIC) Technology Development for Phased Array Antenna Applications

    Science.gov (United States)

    Windyka, John A.; Zablocki, Ed G.

    1997-01-01

    This report documents the efforts and progress in developing a 'system-level' integrated circuit, or SLIC, for application in advanced phased array antenna systems. The SLIC combines radio-frequency (RF) microelectronics, digital and analog support circuitry, and photonic interfaces into a single micro-hybrid assembly. Together, these technologies provide not only the amplitude and phase control necessary for electronic beam steering in the phased array, but also add thermally-compensated automatic gain control, health and status feedback, bias regulation, and reduced interconnect complexity. All circuitry is integrated into a compact, multilayer structure configured for use as a two-by-four element phased array module, operating at 20 Gigahertz, using a Microwave High-Density Interconnect (MHDI) process. The resultant hardware is constructed without conventional wirebonds, maintains tight inter-element spacing, and leads toward low-cost mass production. The measured performances and development issues associated with both the two-by-four element module and the constituent elements are presented. Additionally, a section of the report describes alternative architectures and applications supported by the SLIC electronics. Test results show excellent yield and performance of RF circuitry and full automatic gain control for multiple, independent channels. Digital control function, while suffering from lower manufacturing yield, also proved successful.

  13. Highlighting the Need for Systems-Level Experimental Characterization of Plant Metabolic Enzymes.

    Science.gov (United States)

    Engqvist, Martin K M

    2016-01-01

    The biology of living organisms is determined by the action and interaction of a large number of individual gene products, each with specific functions. Discovering and annotating the function of gene products is key to our understanding of these organisms. Controlled experiments and bioinformatic predictions both contribute to functional gene annotation. For most species it is difficult to gain an overview of what portion of gene annotations are based on experiments and what portion represent predictions. Here, I survey the current state of experimental knowledge of enzymes and metabolism in Arabidopsis thaliana as well as eleven economically important crops and forestry trees - with a particular focus on reactions involving organic acids in central metabolism. I illustrate the limited availability of experimental data for functional annotation of enzymes in most of these species. Many enzymes involved in metabolism of citrate, malate, fumarate, lactate, and glycolate in crops and forestry trees have not been characterized. Furthermore, enzymes involved in key biosynthetic pathways which shape important traits in crops and forestry trees have not been characterized. I argue for the development of novel high-throughput platforms with which limited functional characterization of gene products can be performed quickly and relatively cheaply. I refer to this approach as systems-level experimental characterization. The data collected from such platforms would form a layer intermediate between bioinformatic gene function predictions and in-depth experimental studies of these functions. Such a data layer would greatly aid in the pursuit of understanding a multiplicity of biological processes in living organisms.

  14. Tinnitus: pathology of synaptic plasticity at the cellular and system levels

    Directory of Open Access Journals (Sweden)

    Matthieu J Guitton

    2012-03-01

    Full Text Available Despite being more and more common, and having a high impact on the quality of life of sufferers, tinnitus does not yet have a cure. This has been mostly the result of limited knowledge of the biological mechanisms underlying this adverse pathology. However, the last decade has witnessed tremendous progress in our understanding on the pathophysiology of tinnitus. Animal models have demonstrated that tinnitus is a pathology of neural plasticity, and has two main components: a molecular, peripheral component related to the initiation phase of tinnitus; and a system-level, central component related to the long-term maintenance of tinnitus. Using the most recent experimental data and the molecular/system dichotomy as a framework, we describe here the biological basis of tinnitus. We then discuss these mechanisms from an evolutionary perspective, highlighting similarities with memory. Finally, we consider how these discoveries can translate into therapies, and we suggest operative strategies to design new and effective combined therapeutic solutions using both pharmacological (local and systemic) and behavioral tools (e.g., using tele-medicine and virtual reality settings).

  15. Towards a predictive systems-level model of the human microbiome: progress, challenges, and opportunities.

    Science.gov (United States)

    Greenblum, Sharon; Chiu, Hsuan-Chao; Levy, Roie; Carr, Rogan; Borenstein, Elhanan

    2013-08-01

    The human microbiome represents a vastly complex ecosystem that is tightly linked to our development, physiology, and health. Our increased capacity to generate multiple channels of omic data from this system, brought about by recent advances in high throughput molecular technologies, calls for the development of systems-level methods and models that take into account not only the composition of genes and species in a microbiome but also the interactions between these components. Such models should aim to study the microbiome as a community of species whose metabolisms are tightly intertwined with each other and with that of the host, and should be developed with a view towards an integrated, comprehensive, and predictive modeling framework. Here, we review recent work specifically in metabolic modeling of the human microbiome, highlighting both novel methodologies and pressing challenges. We discuss various modeling approaches that lay the foundation for a full-scale predictive model, focusing on models of interactions between microbial species, metagenome-scale models of community-level metabolism, and models of the interaction between the microbiome and the host. Continued development of such models and of their integration into a multi-scale model of the microbiome will lead to a deeper mechanistic understanding of how variation in the microbiome impacts the host, and will promote the discovery of clinically relevant and ecologically relevant insights from the rich trove of data now available.

  16. System-Level Design of an Integrated Receiver Front End for a Wireless Ultrasound Probe

    DEFF Research Database (Denmark)

    di Ianni, Tommaso; Hemmsen, Martin Christian; Llimos Muntal, Pere

    2016-01-01

    In this paper, a system-level design is presented for an integrated receive circuit for a wireless ultrasound probe, which includes analog front ends and beamformation modules. This paper focuses on the investigation of the effects of architectural design choices on the image quality. The point...... spread function is simulated in Field II from 10 to 160 mm using a convex array transducer. A noise analysis is performed, and the minimum signal-to-noise ratio (SNR) requirements are derived for the low-noise amplifiers (LNAs) and A/D converters (ADCs) to fulfill the design specifications of a dynamic......). The designs that minimally satisfy the specifications are based on an 8-b 30-MSPS Nyquist converter and a single-bit third-order 240-MSPS ΣΔ modulator, with an SNR for the LNA in both cases equal to 64 dB. The mean lateral FWHM and CR are 2.4% and 7.1% lower for the ΣΔ architecture compared with the Nyquist-rate one...

  17. System-Level Testing of the Advanced Stirling Radioisotope Generator Engineering Hardware

    Science.gov (United States)

    Chan, Jack; Wiser, Jack; Brown, Greg; Florin, Dominic; Oriti, Salvatore M.

    2014-01-01

    To support future NASA deep space missions, a radioisotope power system utilizing Stirling power conversion technology was under development. This development effort was performed under the joint sponsorship of the Department of Energy and NASA, until its termination at the end of 2013 due to budget constraints. The higher conversion efficiency of the Stirling cycle compared with that of the Radioisotope Thermoelectric Generators (RTGs) used in previous missions (Viking, Pioneer, Voyager, Galileo, Ulysses, Cassini, Pluto New Horizons and Mars Science Laboratory) offers the advantage of a four-fold reduction in Pu-238 fuel, thereby extending its limited domestic supply. As part of closeout activities, system-level testing of flight-like Advanced Stirling Convertors (ASCs) with a flight-like ASC Controller Unit (ACU) was performed in February 2014. This hardware is the most representative of the flight design tested to date. The test fully demonstrates the following ACU and system functionality: system startup; ASC control and operation at nominal and worst-case operating conditions; power rectification; DC output power management throughout nominal and out-of-range host voltage levels; ACU fault management, and system command / telemetry via MIL-STD 1553 bus. This testing shows the viability of such a system for future deep space missions and bolsters confidence in the maturity of the flight design.

  18. The next generation in optical transport semiconductors: IC solutions at the system level

    Science.gov (United States)

    Gomatam, Badri N.

    2005-02-01

    In this tutorial overview, we survey some of the challenging problems facing Optical Transport and their solutions using new semiconductor-based technologies. Advances in 0.13um CMOS, SiGe/HBT and InP/HBT IC process technologies and mixed-signal design strategies are the fundamental breakthroughs that have made these solutions possible. In combination with innovative packaging and transponder/transceiver architectures, IC approaches have clearly demonstrated enhanced optical link budgets with simultaneously lower (perhaps the lowest to date) cost and manufacturability tradeoffs. This paper will describe:
    * Electronic Dispersion Compensation, broadly viewed as the overcoming of dispersion-based limits to OC-192 links and the extension of link budgets,
    * Error Control/Coding, also known as Forward Error Correction (FEC),
    * Adaptive Receivers for signal quality monitoring, i.e., real-time estimation of Q/OSNR, eye-pattern, signal BER and related temporal statistics (such as jitter).
    We will discuss the theoretical underpinnings of these receiver and transmitter architectures, provide examples of system performance and conclude with general market trends. These Physical-layer IC solutions represent a fundamental new toolbox of options for equipment designers in addressing systems-level problems. With unmatched cost and yield/performance tradeoffs, it is expected that IC approaches will provide significant flexibility, in turn, for carriers and service providers who must ultimately manage the network and assure acceptable quality of service under stringent cost constraints.

  19. A DC-DC Converter Efficiency Model for System Level Analysis in Ultra Low Power Applications

    Directory of Open Access Journals (Sweden)

    Benton H. Calhoun

    2013-06-01

    Full Text Available This paper presents a model of inductor-based DC-DC converters that can be used to study the impact of power management techniques such as dynamic voltage and frequency scaling (DVFS). System-level power models of low power systems on chip (SoCs) and power management strategies cannot be correctly established without accounting for the associated overhead related to the DC-DC converters that provide regulated power to the system. The proposed model accurately predicts the efficiency of inductor-based DC-DC converters with varying topologies and control schemes across a range of output voltage and current loads. It also accounts for the energy and timing overhead associated with the change in the operating condition of the regulator. Since modern SoCs employ power management techniques that vary the voltage and current loads seen by the converter, accurate modeling of the impact on the converter efficiency becomes critical. We use this model to compute the overall cost of two power distribution strategies for a SoC with multiple voltage islands. The proposed model helps us to obtain the energy benefits of a power management technique and can also be used as a basis for comparison between power management techniques or as a tool for design space exploration early in a SoC design cycle.
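
    A minimal loss-based sketch shows the kind of operating-point dependence such a model captures: conduction, switching, and fixed controller losses as functions of output voltage and load current. The component values below are illustrative assumptions, not the paper's calibrated model.

```python
# Buck converter efficiency vs operating point (illustrative values):
# conduction loss grows with load, switching loss is load-independent,
# so efficiency droops at the light loads typical of DVFS low-power states.

def buck_efficiency(v_out, i_out, v_in=3.3, f_sw=1e6,
                    r_dcr=0.1, r_on=0.05, q_g=50e-12, p_fixed=5e-6):
    p_out = v_out * i_out
    p_conduction = i_out**2 * (r_dcr + r_on)   # inductor DCR + switch Rds(on)
    p_switching = q_g * v_in * f_sw            # gate-charge loss
    p_loss = p_conduction + p_switching + p_fixed
    return p_out / (p_out + p_loss)

# Efficiency across a DVFS-style range of loads at two output voltages:
for v in (0.6, 1.2):
    effs = [f"{buck_efficiency(v, i):.2f}" for i in (1e-4, 1e-3, 1e-2)]
    print(f"Vout={v} V:", effs)   # light loads pay heavily for switching loss
```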

  20. On-Site Renewable Energy and Green Buildings: A System-Level Analysis.

    Science.gov (United States)

    Al-Ghamdi, Sami G; Bilec, Melissa M

    2016-05-03

    Adopting a green building rating system (GBRS) that strongly considers use of renewable energy can have important environmental consequences, particularly in developing countries. In this paper, we studied on-site renewable energy and GBRSs at the system level to explore potential benefits and challenges. While we have focused on GBRSs, the findings can offer additional insight for renewable incentives across sectors. An energy model was built for 25 sites to compute the potential solar and wind power production on-site and available within the building footprint and regional climate. A life-cycle approach and cost analysis were then completed to analyze the environmental and economic impacts. Environmental impacts of renewable energy varied dramatically between sites; in some cases, the environmental benefits were limited despite the significant economic burden of those renewable systems on-site, and vice versa. Our recommendation for GBRSs, and broader policies and regulations, is to require buildings with higher environmental impacts to achieve higher levels of energy performance and on-site renewable energy utilization, instead of fixed percentages.

  1. Optimal unified combination rule in application of Dempster-Shafer theory to lung cancer radiotherapy dose response outcome analysis.

    Science.gov (United States)

    He, Yanyan; Hussaini, M Yousuff; Gong, Yutao U T; Xiao, Ying

    2016-01-08

    Our previous study demonstrated the application of the Dempster-Shafer theory of evidence to dose/volume/outcome data analysis. Specifically, it provided Yager's rule to fuse data from different institutions pertaining to radiotherapy pneumonitis versus mean lung dose. The present work is a follow-on study that employs the optimal unified combination rule, which optimizes data similarity among independent sources. Specifically, we construct belief and plausibility functions on the lung cancer radiotherapy dose outcome datasets, and then apply the optimal unified combination rule to obtain combined belief and plausibility, which bound the probabilities of pneumonitis incidence. To estimate the incidence of pneumonitis at any value of mean lung dose, we use the Lyman-Kutcher-Burman (LKB) model to fit the combined belief and plausibility curves. The results show that the optimal unified combination rule yields a narrower uncertainty range (as represented by the belief-plausibility range) than Yager's rule, which is also theoretically proven.
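
    The final fitting step can be sketched as follows (requires numpy/scipy): the standard LKB probit form is fitted separately to the combined belief and plausibility curves, giving TD50 and m for each bound. The dose-response points below are synthetic stand-ins, not the study's combined data.

```python
# Fit the Lyman-Kutcher-Burman (LKB) model to belief/plausibility bounds.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def lkb_ntcp(mld, td50, m):
    """LKB complication probability vs mean lung dose (Gy)."""
    return norm.cdf((mld - td50) / (m * td50))

mld = np.array([5, 10, 15, 20, 25, 30], dtype=float)        # Gy (synthetic)
belief = np.array([0.01, 0.04, 0.10, 0.20, 0.33, 0.48])     # lower bound
plaus = np.array([0.05, 0.11, 0.22, 0.37, 0.52, 0.67])      # upper bound

for name, y in (("belief", belief), ("plausibility", plaus)):
    (td50, m), _ = curve_fit(lkb_ntcp, mld, y, p0=(30.0, 0.4))
    print(f"{name:>12}: TD50 = {td50:5.1f} Gy, m = {m:.2f}")
```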

  2. Dempster-Shafer information measures in category theory

    Science.gov (United States)

    Peri, Joseph S. J.

    2016-05-01

    In the Dempster-Shafer context, one can construct new types of information measures based on belief and plausibility functions. These measures differ from those in Shannon's theory because, in his theory, information measures are based on probability functions. Other types of information measures were discovered by Kampe de Feriet and his colleagues in the French and Italian schools of mathematics. The objective of this paper is to construct a new category of information. I use category theory to construct a general setting in which the various types of information measures are special cases.

  3. String theory

    OpenAIRE

    Marino Beiras, Marcos

    2001-01-01

    We give an overview of the relations between matrix models and string theory, focusing on topological string theory and the Dijkgraaf--Vafa correspondence. We discuss applications of this correspondence and its generalizations to supersymmetric gauge theory, enumerative geometry and mirror symmetry. We also present a brief overview of matrix quantum mechanical models in superstring theory.

  4. Game Theory

    DEFF Research Database (Denmark)

    Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....

  5. Randomization and resilience of brain functional networks as systems-level endophenotypes of schizophrenia.

    Science.gov (United States)

    Lo, Chun-Yi Zac; Su, Tsung-Wei; Huang, Chu-Chung; Hung, Chia-Chun; Chen, Wei-Ling; Lan, Tsuo-Hung; Lin, Ching-Po; Bullmore, Edward T

    2015-07-21

    Schizophrenia is increasingly conceived as a disorder of brain network organization or dysconnectivity syndrome. Functional MRI (fMRI) networks in schizophrenia have been characterized by abnormally random topology. We tested the hypothesis that network randomization is an endophenotype of schizophrenia and therefore evident also in nonpsychotic relatives of patients. Head movement-corrected, resting-state fMRI data were acquired from 25 patients with schizophrenia, 25 first-degree relatives of patients, and 29 healthy volunteers. Graphs were used to model functional connectivity as a set of edges between regional nodes. We estimated the topological efficiency, clustering, degree distribution, resilience, and connection distance (in millimeters) of each functional network. The schizophrenic group demonstrated significant randomization of global network metrics (reduced clustering, greater efficiency), a shift in the degree distribution to a more homogeneous form (fewer hubs), a shift in the distance distribution (proportionally more long-distance edges), and greater resilience to targeted attack on network hubs. The networks of the relatives also demonstrated abnormal randomization and resilience compared with healthy volunteers, but they were typically less topologically abnormal than the patients' networks and did not have abnormal connection distances. We conclude that schizophrenia is associated with replicable and convergent evidence for functional network randomization, and a similar topological profile was evident also in nonpsychotic relatives, suggesting that this is a systems-level endophenotype or marker of familial risk. We speculate that the greater resilience of brain networks may confer some fitness advantages on nonpsychotic relatives that could explain persistence of this endophenotype in the population.
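
    The global metrics reported here are standard graph measures. A minimal sketch of how they might be computed with networkx, using a random surrogate network rather than the study's fMRI data, is:

```python
# Clustering, efficiency, hubs, and a crude targeted-attack resilience probe
# on a surrogate random graph (illustrative stand-in for a functional network).
import networkx as nx

G = nx.erdos_renyi_graph(n=90, p=0.1, seed=1)

clustering = nx.average_clustering(G)
efficiency = nx.global_efficiency(G)
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5]

H = G.copy()
H.remove_nodes_from([n for n, _ in hubs])   # "attack" the highest-degree hubs
print(f"clustering={clustering:.3f}  efficiency={efficiency:.3f}")
print(f"efficiency after hub removal={nx.global_efficiency(H):.3f}")
```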

  6. Virtual Systems Pharmacology (ViSP software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey eErmakov

    2014-10-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  7. Thread mapping using system-level model for shared memory multicores

    Science.gov (United States)

    Mitra, Reshmi

    Exploring thread-to-core mapping options for a parallel application on a multicore architecture is computationally very expensive. For the same algorithm, the mapping strategy (MS) with the best response time may change with data size and thread counts. The primary challenge is to design a fast, accurate and automatic framework for exploring these MSs for large data-intensive applications, so that users can explore the design space within reasonable machine hours, without a thorough understanding of how the code interacts with the platform. Response time is related to the cycles per instruction retired (CPI), taking into account both active and sleep states of the pipeline. This work establishes a hybrid approach, based on a Markov Chain Model (MCM) and a Model Tree (MT), for system-level steady-state CPI prediction. It is designed for shared memory multicore processors with coarse-grained multithreading. Thread status is represented by the MCM states. The program characteristics are modeled as the transition probabilities, representing the system moving between active and suspended thread states. The MT model extrapolates these probabilities for the actual application size (AS) from smaller-AS performance. This aspect of the framework, along with the use of mathematical expressions for the actual-AS performance information, results in a tremendous reduction in the CPI prediction time. The framework is validated using an electromagnetics application. The average performance prediction error for steady-state CPI results with 12 different MSs is less than 1%. The total run time of the model is of the order of minutes, whereas the actual application execution time is in terms of days.
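
    The steady-state idea can be illustrated with a two-state (active/suspended) Markov chain; the transition probabilities and per-state CPI values below are hypothetical placeholders, not the thesis's fitted parameters:

```python
# Steady-state CPI from a two-state Markov chain (toy version).
import numpy as np

P = np.array([[0.95, 0.05],    # active    -> active / suspended
              [0.30, 0.70]])   # suspended -> active / suspended

# Stationary distribution = left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

cpi_per_state = np.array([1.2, 8.0])   # assumed CPI in each state
print("steady state:", pi, "-> predicted CPI:", pi @ cpi_per_state)
```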

  8. Towards a systems-level understanding of gene regulatory, protein interaction, and metabolic networks in cyanobacteria

    Directory of Open Access Journals (Sweden)

    Miguel Angel Hernández-Prieto

    2014-07-01

    Cyanobacteria are essential primary producers in marine ecosystems, playing an important role in both the carbon and nitrogen cycles. In the last decade, various genome sequencing and metagenomic projects have generated large amounts of genetic data for cyanobacteria. This wealth of data provides researchers with a new basis for the study of molecular adaptation, ecology and evolution of cyanobacteria, as well as for developing biotechnological applications. It also facilitates the use of multiplex techniques, i.e., expression profiling by high-throughput technologies such as microarrays, RNA-seq, and proteomics. However, exploration and analysis of these data is challenging, and often requires advanced computational methods. The data also need to be integrated into our existing framework of knowledge before reliable biological conclusions can be drawn. Here, systems biology provides important tools. In particular, the construction and analysis of molecular networks has emerged as a powerful systems-level framework with which to integrate such data and to better understand biologically relevant processes in these organisms. In this review, we provide an overview of the advances and experimental approaches undertaken using multiplex data from genomic, transcriptomic, proteomic, and metabolomic studies in cyanobacteria. Furthermore, we summarize currently available web-based tools dedicated to cyanobacteria, namely CyanoBase, CyanoEXpress, ProPortal, Cyanorak, CyanoBIKE, and CINPER. Finally, we present a case study for the freshwater model cyanobacterium Synechocystis sp. PCC6803 to show the power of meta-analysis, and the potential to extrapolate acquired knowledge to the ecologically important marine cyanobacteria genus Prochlorococcus.

  9. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    Science.gov (United States)

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  10. System-Level Design of an Integrated Receiver Front End for a Wireless Ultrasound Probe.

    Science.gov (United States)

    Di Ianni, Tommaso; Hemmsen, Martin Christian; Llimos Muntal, Pere; Jorgensen, Ivan Harald Holger; Jensen, Jorgen Arendt

    2016-11-01

    In this paper, a system-level design is presented for an integrated receive circuit for a wireless ultrasound probe, which includes analog front ends and beamformation modules. This paper focuses on the investigation of the effects of architectural design choices on the image quality. The point spread function is simulated in Field II from 10 to 160 mm using a convex array transducer. A noise analysis is performed, and the minimum signal-to-noise ratio (SNR) requirements are derived for the low-noise amplifiers (LNAs) and A/D converters (ADCs) to fulfill the design specifications of a dynamic range of 60 dB and a penetration depth of 160 mm in the B-mode image. Six front-end implementations are compared using Nyquist-rate and Σ∆ modulator ADCs. The image quality is evaluated as a function of the depth in terms of lateral full-width at half-maximum (FWHM) and -12-dB cystic resolution (CR). The designs that minimally satisfy the specifications are based on an 8-b 30-MSPS Nyquist converter and a single-bit third-order 240-MSPS Σ∆ modulator, with an SNR for the LNA in both cases equal to 64 dB. The mean lateral FWHM and CR are 2.4% and 7.1% lower for the Σ∆ architecture compared with the Nyquist-rate one. However, the results generally show minimal differences between equivalent architectures. Advantages and drawbacks are finally discussed for the two families of converters.
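
    A back-of-the-envelope check on the two converter families can be made with the standard ideal-quantizer SQNR formulas. The sketch below ignores the array/beamforming gain and circuit non-idealities that a full design analysis would account for, and the assumed 5 MHz signal band (hence an OSR of about 24 at 240 MSPS) is purely illustrative:

```python
# Ideal signal-to-quantization-noise ratio for Nyquist-rate and
# noise-shaped (sigma-delta) converters; textbook formulas only.
import math

def sqnr_nyquist(bits):
    return 6.02 * bits + 1.76          # dB, full-scale sine input

def sqnr_sigma_delta(bits, order, osr):
    # Ideal L-th order modulator with oversampling ratio OSR.
    return (6.02 * bits + 1.76
            - 10 * math.log10(math.pi ** (2 * order) / (2 * order + 1))
            + (2 * order + 1) * 10 * math.log10(osr))

print(f"8-bit Nyquist:                 {sqnr_nyquist(8):.1f} dB")
print(f"1-bit 3rd-order SD at OSR=24:  {sqnr_sigma_delta(1, 3, 24):.1f} dB")
```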

  11. Theories on educational effectiveness and ineffectiveness

    NARCIS (Netherlands)

    Scheerens, Jaap

    2015-01-01

    Following Snow’s (1973) description of an “inductive” process of theory formation, this article addresses the organization of the knowledge base on school effectiveness. A multilevel presentation stimulated the conceptualization of educational effectiveness as an integration of system-level, school-level, and classroom-level…

  12. Theories on Educational Effectiveness and Ineffectiveness

    Science.gov (United States)

    Scheerens, Jaap

    2015-01-01

    Following Snow's (1973) description of an "inductive" process of theory formation, this article addresses the organization of the knowledge base on school effectiveness. A multilevel presentation stimulated the conceptualization of educational effectiveness as an integration of system-level, school-level, and classroom-level…

  13. A neutral oxygen-vacancy center in diamond: A plausible qubit candidate and its spintronic and electronic properties

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Y. G.; Tang, Z., E-mail: ztang@ee.ecnu.edu.cn; Zhao, X. G.; Cheng, G. D.; Tu, Y.; Cong, W. T.; Zhu, Z. Q.; Chu, J. H. [Key Laboratory of Polar Materials and Devices, Ministry of Education of China, East China Normal University, Shanghai 200241 (China); Peng, W., E-mail: wpeng@ecnu.edu.cn [Supercomputer Center, Administration Department of Equipments, East China Normal University, Shanghai 200062 (China)

    2014-08-04

    Spintronic and electronic properties of a neutral oxygen-vacancy (O-V) center, an isoelectronic defect similar to the negatively charged nitrogen-vacancy center in diamond, were studied by combining first-principles calculations and a mean-field theory of the spin hyperfine interaction. It is elucidated that the neutral O-V center is stable in p-type diamond and possesses an S = 1 triplet ground state and four spin-conserved excited states, with spin coherence times on the order of seconds at T = 0 K. The results indicate that the neutral O-V center is another promising candidate for coherent spin manipulation and qubit operation.

  14. Plausibility of the implausible: is it possible that ultra-high dilutions ‘without biological activity’ cause adverse effects?

    Directory of Open Access Journals (Sweden)

    Marcus Zulian Teixeira

    2013-06-01

    .7% of cases the potencies were described as below the 12th centesimal, the point beyond which the likelihood of a single molecule being present in the remedy approaches zero”, and the authors claim that “in the majority of cases, the possible mechanism of action involved allergic reactions or ingestion of toxic substances”. With this approach, the authors seek to dismiss the biological effects of ultra-high dilutions, because if such dilutions caused AEs, the plausibility of their possible therapeutic effects would be confirmed. However, toxicological tests are required to affirm that AEs are a consequence of toxic (allergic) effects of the substances or of ‘imponderable’ effects of ultra-high dilutions. Consider the recent report cited in the review [12], in which a complex homeopathic medicine indicated for treating infant colic (Gali-col Baby, GCB) caused apparent life-threatening events (ALTEs, described by the National Institutes of Health consensus group in 1986 as “an episode that is frightening to the observer and that is characterized by some combination of apnea (central or occasionally obstructive), color change (usually cyanotic or pallid but occasionally erythematous or plethoric), a marked change in muscle tone (usually marked limpness), choking or gagging” [13]) in consequence of the ‘toxicity of active ingredients’ (Citrullus colocynthis, Matricaria chamomilla, Bryonia alba, Nux vomica, Veratrum album, Magnesia phosphorica and Cuprum metallicum) at potencies between 4C and 5C. Oberbaum et al. [14] performed a toxicological study of these components showing that “doses ingested in the GCB series were 10-13 orders of magnitude smaller than those reported to cause toxic reactions in humans” and that “there was poor correlation between symptoms with GCB and toxic profiles of the components”. As an alternative explanation, they suggest that “four components (Veratrum album, Cuprum metallicum, Bryonia alba and Matricaria chamomilla) have an
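
    The “12th centesimal” cutoff invoked above is simple arithmetic: each centesimal (C) step is a 100-fold dilution, so an nC potency retains a 10^(-2n) fraction of the starting material. A short calculation makes the contrast between 4C-5C and 12C concrete (the starting quantity is an illustrative assumption):

```python
# Expected number of molecules remaining after an nC serial dilution.
N_A = 6.022e23   # Avogadro's number

def expected_molecules(moles_start, c_potency):
    return moles_start * N_A * 10.0 ** (-2 * c_potency)

for c in (4, 5, 12):
    # assume 1 millimole of active substance in the starting solution
    print(f"{c:2d}C -> ~{expected_molecules(1e-3, c):.2e} molecules")
# 4C and 5C still leave ~1e12 and ~1e10 molecules; 12C leaves fewer than one.
```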

  15. From Number Agreement to the Subjunctive: Evidence for Processability Theory in L2 Spanish

    Science.gov (United States)

    Bonilla, Carrie L.

    2015-01-01

    This article contributes to the typological plausibility of Processability Theory (PT) (Pienemann, 1998, 2005) by providing empirical data showing that the stages predicted by PT are followed in the second language (L2) acquisition of Spanish syntax and morphology. In the present article, the PT stages for L2 Spanish morphology and syntax are first…

  16. One-pot synthesis of tetrazole-1,2,5,6-tetrahydronicotinonitriles and cholinesterase inhibition: Probing the plausible reaction mechanism via computational studies.

    Science.gov (United States)

    Hameed, Abdul; Zehra, Syeda Tazeen; Abbas, Saba; Nisa, Riffat Un; Mahmood, Tariq; Ayub, Khurshid; Al-Rashida, Mariya; Bajorath, Jürgen; Khan, Khalid Mohammed; Iqbal, Jamshed

    2016-04-01

    In the present study, a one-pot synthesis of 1H-tetrazole-linked 1,2,5,6-tetrahydronicotinonitriles under solvent-free conditions has been carried out in the presence of tetra-n-butylammonium fluoride trihydrate (TBAF) as catalyst and solvent. Computational studies have been conducted to elaborate two plausible mechanistic pathways for this one-pot reaction. Moreover, the synthesized compounds were screened for inhibition of the cholinesterases (acetylcholinesterase and butyrylcholinesterase), which are considered major malefactors in Alzheimer's disease (AD), in order to find lead compounds for further research in AD therapy.

  17. Phylogenetic analysis of NS5B gene of classical swine fever virus isolates indicates plausible Chinese origin of Indian subgroup 2.2 viruses.

    Science.gov (United States)

    Patil, S S; Hemadri, D; Veeresh, H; Sreekala, K; Gajendragad, M R; Prabhudas, K

    2012-02-01

    Twenty-three CSFV isolates recovered from field outbreaks in various parts of India during 2006-2009 were used for genetic analysis of the NS5B region (409 nts). Seventeen of these were studied earlier [16] in the 5'UTR region. Phylogenetic analysis indicated the continued dominance of subgroup 1.1 strains in the country. Detailed analysis of a subgroup 2.2 virus indicated the plausible Chinese origin of this subgroup in India and provided indirect evidence of the routes of CSFV movement within the South East Asia region.

  18. Toward a Holographic Theory for General Spacetimes

    CERN Document Server

    Nomura, Yasunori; Sanches, Fabio; Weinberg, Sean J

    2016-01-01

    We study a holographic theory of general spacetimes that does not rely on the existence of asymptotic regions. This theory is to be formulated in a holographic space. When a semiclassical description is applicable, the holographic space is assumed to be a holographic screen: a codimension-1 surface that is capable of encoding states of the gravitational spacetime. Our analysis is guided by conjectured relationships between gravitational spacetime and quantum entanglement in the holographic description. To understand basic features of this picture, we catalog predictions for the holographic entanglement structure of cosmological spacetimes. We find that qualitative features of holographic entanglement entropies for such spacetimes differ from those in AdS/CFT but that the former reduce to the latter in the appropriate limit. The Hilbert space of the theory is analyzed, and two plausible structures are found: a direct sum and "spacetime equals entanglement" structure. The former preserves a naive relationship b...

  19. Supersymmetric Microscopic Theory of the Standard Model

    CERN Document Server

    Ter-Kazarian, G T

    2000-01-01

    We promote the microscopic theory of the standard model (MSM, hep-ph/0007077) into a supersymmetric framework in order to solve its technical problems of vacuum zero-point energy and hierarchy, and attempt, further, to develop a realistic, viable, minimal SUSY extension of it. Among other things, the MSM provides a natural unification of geometry and field theory, clarifies the physical conditions in which geometry and particles come into being, enables, in a microscopic sense, an insight into key problems of particle phenomenology, and answers some of its nagging questions; the present approach also leads to quite a new realization of SUSY, yielding a physically realistic particle spectrum. It stems from the special subquark algebra, from which the nilpotent supercharge operators are derived. The resulting theory makes plausible the following testable implications for the current experiments at LEP2, at the Tevatron and at the LHC, drastically different from those of the conventional MSSM models: 1. All t...

  20. Microscopic Theory of the Standard Model

    CERN Document Server

    Ter-Kazarian, G T

    2000-01-01

    The operator manifold formalism (part I) enables the unification of geometry and field theory, and yields the quantization of geometry. This is the mathematical framework for our physical outlook that geometry and fields, with the internal symmetries and all interactions, as well as the four major principles of relativity (special and general), quantum, gauge and colour confinement, are derivative, and come into being simultaneously in the stable system of the underlying "primordial structures". In part II we attempt to develop further the microscopic approach to the Standard Model of particle physics, which enables an insight into the key problems of particle phenomenology. We suggest a microscopic theory of the unified electroweak interactions. The Higgs bosons arise on an analogy with the Cooper pairs in superconductivity. Besides a microscopic interpretation of all physical parameters, the resulting theory also makes plausible the following testable implications for the current experiments: 1...

  1. Towards a communication-theoretic understanding of system-level power consumption

    CERN Document Server

    Grover, Pulkit; Sahai, Anant

    2010-01-01

    Traditional communication theory focuses on minimizing transmit power. Increasingly, however, communication links are operating at shorter ranges where transmit power can drop below the power consumed in decoding. In this paper, we model the required decoding power and investigate the minimization of total system power from two complementary perspectives. First, an isolated point-to-point link is considered. Using new lower bounds on the complexity of message-passing decoding, lower bounds are derived on decoding power. These bounds show that 1) there is a fundamental tradeoff between transmit and decoding power; 2) unlike the implications of the traditional "waterfall" curve, which focuses on transmit power, the total power must diverge to infinity as error probability goes to zero; 3) regular LDPCs, and not their capacity-achieving counterparts, can be shown to be power order-optimal in some cases; and 4) the optimizing transmit power is bounded away from the Shannon limit. Second, we consider a collection...
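
    The qualitative tradeoff can be reproduced with a toy model in a few lines: once a decoding-power term that decreases with SNR is added to the transmit power, the total power acquires an interior minimum. The decoding model below is an arbitrary stand-in chosen only for illustration, not the paper's lower bound:

```python
# Toy transmit-vs-decoding power tradeoff (normalized units).
import numpy as np

snr_db = np.linspace(0.5, 20, 400)
p_tx = 10 ** (snr_db / 10)      # transmit power grows with target SNR
p_dec = 50.0 / snr_db           # assumed: decoding gets cheaper at high SNR
p_total = p_tx + p_dec

i = np.argmin(p_total)
print(f"optimum near {snr_db[i]:.1f} dB SNR, total power {p_total[i]:.1f}")
# Minimizing transmit power alone would instead push SNR toward the Shannon limit.
```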

  2. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tool to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation, which has distinct advantages compared with traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. Monte Carlo simulations, implemented via parallel computing, point to the importance of implementations using very-large-scale integration (VLSI) and analog platforms.
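
    A minimal probabilistic cellular automaton in this spirit, a local majority update perturbed by a noise parameter, fits in a few lines of numpy; the lattice size and noise level are illustrative:

```python
# Noisy-majority cellular automaton on a torus (neuropercolation-flavored toy).
import numpy as np

rng = np.random.default_rng(0)
N, EPS, STEPS = 64, 0.13, 200
state = rng.integers(0, 2, size=(N, N))

for _ in range(STEPS):
    neigh = (np.roll(state, 1, 0) + np.roll(state, -1, 0) +
             np.roll(state, 1, 1) + np.roll(state, -1, 1) + state)
    majority = (neigh >= 3).astype(int)   # 5-cell local majority vote
    flip = rng.random((N, N)) < EPS       # noisy override with probability EPS
    state = np.where(flip, 1 - majority, majority)

print("mean activity:", state.mean())  # swings widely near the critical EPS
```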

  3. Emergent structured transition from variation to repetition in a biologically-plausible model of learning in basal ganglia.

    Directory of Open Access Journals (Sweden)

    Ashvin eShah

    2014-02-01

    Often, when animals encounter an unexpected sensory event, they transition from executing a variety of movements to repeating the movement(s) that may have caused the event. According to a recent theory of action discovery (Redgrave and Gurney, 2006), repetition allows the animal to represent those movements, and the outcome, as an action for later recruitment. The transition from variation to repetition often follows a non-random, structured pattern. While the structure of the pattern can be explained by sophisticated cognitive mechanisms, simpler mechanisms based on dopaminergic modulation of basal ganglia (BG) activity are thought to underlie action discovery (Redgrave and Gurney, 2006). In this paper we ask the question: can simple BG-mediated mechanisms account for a structured transition from variation to repetition, or are more sophisticated cognitive mechanisms always necessary? To address this question, we present a computational model of BG-mediated biasing of behavior. In our model, unlike most other models of BG function, the BG biases behavior through modulation of cortical response to excitation; many possible movements are represented by the cortical area; and excitation to the cortical area is topographically organized. We subject the model to simple reaching tasks, inspired by behavioral studies, in which a location to which to reach must be selected. Locations within a target area elicit a reinforcement signal. A structured transition from variation to repetition emerges from simple BG-mediated biasing of cortical response to excitation. We show how the structured pattern influences behavior in simple and complicated tasks. We also present analyses that describe the structured transition from variation to repetition due to BG-mediated biasing and from biasing that would be expected from a type of cognitive biasing, allowing us to compare behavior resulting from these types of biasing and make connections with future behavioural

  4. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, and other subjects. 1977 edition.

  5. Identification of the bkdAB gene cluster, a plausible source of the starter-unit for virginiamycin M production in Streptomyces virginiae.

    Science.gov (United States)

    Pulsawat, Nattika; Kitani, Shigeru; Kinoshita, Hiroshi; Lee, Chang Kwon; Nihira, Takuya

    2007-06-01

    The bkdAB gene cluster, which encodes plausible E1 and E2 components of the branched-chain alpha-keto acid dehydrogenase (BCDH) complex, was isolated from Streptomyces virginiae in the vicinity of a regulatory island for virginiamycin production. Gene disruption of bkdA completely abolished the production of virginiamycin M (a polyketide-peptide antibiotic), while the production of virginiamycin S (a cyclodepsipeptide antibiotic) was unaffected. Complementation of the bkdA disruptant by genome-integration of intact bkdA completely restored the virginiamycin M production, indicating that the bkdAB cluster is essential for virginiamycin M biosynthesis, plausibly via the provision of isobutyryl-CoA as a primer unit. In contrast to a feature usually seen in the Streptomyces E1 component, namely, the separate encoding of the alpha and beta subunits, S. virginiae bkdA seemed to encode the fused form of the alpha and beta subunits, which was verified by the actual catalytic activity of the fused protein in vitro using recombinant BkdA overexpressed in Escherichia coli. Supply of an additional bkdA gene under the strong and constitutive promoter ermE* in the wild-type strain of S. virginiae resulted in enhanced production of virginiamycin M, suggesting that the supply of isobutyryl-CoA is one of the rate-limiting factors in the biosynthesis of virginiamycin M.

  6. Analysis of multi-domain hypothetical proteins containing iron-sulphur clusters and fad ligands reveal rieske dioxygenase activity suggesting their plausible roles in bioremediation.

    Science.gov (United States)

    Sathyanarayanan, Nitish; Nagendra, Holenarasipur Gundurao

    2012-01-01

    'Conserved hypothetical' proteins pose a challenge not just for functional genomics, but for biology in general. As long as there are hundreds of conserved proteins with unknown function in model organisms such as Escherichia coli, Bacillus subtilis or Saccharomyces cerevisiae, any discussion towards a 'complete' understanding of these biological systems will remain wishful thinking. In silico approaches show great promise for elucidating the plausible roles of these hypothetical proteins. The majority of genomic proteins, two-thirds in unicellular organisms and more than 80% in metazoa, are multi-domain proteins created as a result of gene duplication events. Aromatic ring-hydroxylating dioxygenases, also called Rieske dioxygenases (RDOs), are a class of multi-domain proteins that catalyze the initial step in microbial aerobic degradation of many aromatic compounds. The investigations here address the computational characterization of hypothetical proteins containing Ferredoxin and Flavodoxin signatures. A consensus sequence for each class of oxidoreductase was obtained by phylogenetic analysis, involving clustering methods based on evolutionary relationships. A synthetic sequence was developed by combining the consensus sequences, and was used as the basis to search for their homologs via BLAST. The exercise yielded 129 multi-domain hypothetical proteins containing both 2Fe-2S (Ferredoxin) and FNR (Flavodoxin) domains. In the current study, 40 proteins with an N-terminal 2Fe-2S domain and a C-terminal FNR domain are characterized through homology modelling and docking exercises, which suggest dioxygenase activity, indicating their plausible roles in the degradation of aromatic moieties.

  7. System-level power optimization for real-time distributed embedded systems

    Science.gov (United States)

    Luo, Jiong

    Power optimization is one of the crucial design considerations for modern electronic systems. In this thesis, we present several system-level power optimization techniques for real-time distributed embedded systems, based on dynamic voltage scaling, dynamic power management, and management of peak power and variance of the power profile. Dynamic voltage scaling has been widely acknowledged as an important and powerful technique to trade off dynamic power consumption and delay. Efficient dynamic voltage scaling requires effective variable-voltage scheduling mechanisms that can adjust voltages and clock frequencies adaptively based on workloads and timing constraints. For this purpose, we propose static variable-voltage scheduling algorithms utilizing critical-path driven timing analysis for the case when tasks are assumed to have uniform switching activities, as well as energy-gradient driven slack allocation for a more general scenario. The proposed techniques can achieve close-to-optimal power savings with very low computational complexity, without violating any real-time constraints. We also present algorithms for power-efficient joint scheduling of multi-rate periodic task graphs along with soft aperiodic tasks. The power issue is addressed through both dynamic voltage scaling and power management. Periodic task graphs are scheduled statically. Flexibility is introduced into the static schedule to allow the on-line scheduler to make local changes to PE schedules through resource reclaiming and slack stealing, without interfering with the validity of the global schedule. We provide a unified framework in which the response times of aperiodic tasks and power consumption are dynamically optimized simultaneously. Interconnection network fabrics point to a new generation of power-efficient and scalable interconnection architectures for distributed embedded systems. As the system bandwidth continues to increase, interconnection networks become power/energy limited as
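
    The leverage behind dynamic voltage scaling is standard first-order scaling arithmetic: dynamic energy per cycle goes roughly as V^2 while frequency falls roughly as (V - Vt)^2 / V (the alpha-power model with alpha = 2). A quick sketch with illustrative voltages, not the thesis's scheduling algorithms:

```python
# First-order DVFS arithmetic: energy vs. runtime when lowering the voltage.
V_NOM, V_LOW, V_T = 1.0, 0.8, 0.3   # volts (illustrative)

def rel_freq(v):
    """Alpha-power delay model with alpha = 2 (a common approximation)."""
    return (v - V_T) ** 2 / v

energy_ratio = (V_LOW / V_NOM) ** 2
slowdown = rel_freq(V_NOM) / rel_freq(V_LOW)
print(f"energy x{energy_ratio:.2f}, runtime x{slowdown:.2f}")
# -> ~36% energy saving if the schedule has ~57% slack to absorb the slowdown.
```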

  8. Packaging Theory.

    Science.gov (United States)

    Williams, Jeffrey

    1994-01-01

    Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting theory and proposes alternative ways of organizing theory…

  9. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex...... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism....

  10. Systems Level Analysis of Histone H3 Post-translational Modifications (PTMs) Reveals Features of PTM Crosstalk in Chromatin Regulation

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Sidoli, Simone; Ruminowicz, Chrystian

    2016-01-01

    molecules contain multiple coexisting PTMs, some of which exhibit crosstalk, i.e. coordinated or mutually exclusive activities. Here, we present an integrated experimental and computational systems-level molecular characterization of histone PTMs and PTM crosstalk. Using wild-type and engineered mouse

  11. A Generic System-Level Framework for Self-Serve Health Monitoring System through Internet of Things (IoT).

    Science.gov (United States)

    Ahmed, Mobyen Uddin; Björkman, Mats; Lindén, Maria

    2015-01-01

    Sensor data travel from sensors to a remote server, are analyzed remotely in a distributed manner, and the health status of a user is presented in real time. This paper presents a generic system-level framework for a self-served health monitoring system through the Internet of Things (IoT) to facilitate efficient sensor data management.

  12. Is health workforce planning recognising the dynamic interplay between health literacy at an individual, organisation and system level?

    Science.gov (United States)

    Naccarella, Lucio; Wraighe, Brenda; Gorman, Des

    2016-02-01

    The growing demands on the health system to adapt to constant change have led to investment in health workforce planning agencies and approaches. Health workforce planning approaches focusing on identifying, predicting and modelling workforce supply and demand are criticised as being simplistic and as not contributing to system-level resiliency. Alternative evidence- and needs-based health workforce planning approaches are being suggested. However, to contribute to system-level resiliency, workforce planning approaches need also to adopt system-based approaches. The increased complexity and fragmentation of the healthcare system, especially for patients with complex and chronic conditions, has also led to a focus on health literacy not simply as an individual trait, but as a dynamic product of the interaction between individual-level (patients, workforce), organisational-level and system-level health literacy. Although it is absolutely essential that patients have a level of health literacy that enables them to navigate and make decisions, the health workforce, organisations, and indeed the system itself also need to be health literate. Herein we explore whether health workforce planning recognises the dynamic interplay between health literacy at the individual, organisation and system levels, and the potential for strengthening resiliency across all of those levels.

  13. ArchSim: A System-Level Parallel Simulation Platform for the Architecture Design of High Performance Computer

    Institute of Scientific and Technical Information of China (English)

    Yong-Qin Huang; Hong-Liang Li; Xiang-Hui Xie; Lei Qian; Zi-Yu Hao; Feng Guo; Kun Zhang

    2009-01-01

    A high performance computer (HPC) is a complex, huge system, whose architecture design meets increasing difficulties and risks. Traditional methods, such as theoretical analysis, component-level simulation and sequential simulation, are not applicable to system-level simulations of HPC systems. Even parallel simulation using large-scale parallel machines has many difficulties in scalability, reliability and generality, as well as efficiency. According to the current needs of HPC architecture design, this paper proposes a system-level parallel simulation platform: ArchSim. We first introduce the architecture of the ArchSim simulation platform, which is composed of a global server (GS), local server agents (LSAs) and entities. Secondly, we emphasize some key techniques of ArchSim, including the synchronization protocol, the communication mechanism and the distributed checkpointing/restart mechanism. We then test some main performance indices of ArchSim with the PHOLD benchmark and analyze the extra overhead generated by ArchSim. Finally, based on ArchSim, we construct a parallel event-driven interconnection network simulator and a system-level simulator for a small-scale HPC system with 256 processors. The results of the performance test and HPC system simulations demonstrate that ArchSim can achieve a high speedup ratio and high scalability on a parallel host machine and can support system-level simulations for the architecture design of HPC systems.

  14. Infrared Constraint on Ultraviolet Theories

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Yuhsin [Cornell Univ., Ithaca, NY (United States)

    2012-08-01

    While our current paradigm of particle physics, the Standard Model (SM), has been extremely successful at explaining experiments, it is theoretically incomplete and must be embedded into a larger framework. In this thesis, we review the main motivations for theories beyond the SM (BSM) and the ways such theories can be constrained using low-energy physics. The hierarchy problem, neutrino mass and the existence of dark matter (DM) are the main reasons why the SM is incomplete. Two of the most plausible theories that may solve the hierarchy problem are the Randall-Sundrum (RS) models and supersymmetry (SUSY). RS models usually suffer from strong flavor constraints, while SUSY models produce extra degrees of freedom that need to be hidden from current experiments. To show the importance of infrared (IR) physics constraints, we discuss the flavor bounds on the anarchic RS model in both the lepton and quark sectors. For SUSY models, we discuss the difficulties in obtaining a phenomenologically allowed gaugino mass, its relation to R-symmetry breaking, and how to build a model that avoids this problem. For the neutrino mass problem, we discuss the idea of generating small neutrino masses using compositeness. By requiring successful leptogenesis and the existence of warm dark matter (WDM), we can set various constraints on the hidden composite sector. Finally, to give an example of model-independent bounds from collider experiments, we show how to constrain the DM–SM particle interactions using collider results with an effective coupling description.

  15. Atomic theories

    CERN Document Server

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfeld extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  16. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    2015-01-01

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex ... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  17. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex ... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  18. Ring theory

    CERN Document Server

    Rowen, Louis H

    1991-01-01

    This is an abridged edition of the author's previous two-volume work, Ring Theory, which concentrates on essential material for a general ring theory course while omitting much of the material intended for ring theory specialists. It has been praised by reviewers: "As a textbook for graduate students, Ring Theory joins the best.... The experts will find several attractive and pleasant features in Ring Theory. The most noteworthy is the inclusion, usually in supplements and appendices, of many useful constructions which are hard to locate outside of the original sources.... The audience of non

  19. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  20. Anticipating and Communicating Plausible Environmental and Health Concerns Associated with Future Disasters: The ShakeOut and ARkStorm Scenarios as Examples

    Science.gov (United States)

    Plumlee, G. S.; Morman, S. A.; Alpers, C. N.; Hoefen, T. M.; Meeker, G. P.

    2010-12-01

    Disasters commonly pose immediate threats to human safety, but can also produce hazardous materials (HM) that pose short- and long-term environmental-health threats. The U.S. Geological Survey (USGS) has helped assess potential environmental health characteristics of HM produced by various natural and anthropogenic disasters, such as the 2001 World Trade Center collapse, 2005 hurricanes Katrina and Rita, 2007-2009 southern California wildfires, various volcanic eruptions, and others. Building upon experience gained from these responses, we are now developing methods to anticipate plausible environmental and health implications of the 2008 Great Southern California ShakeOut scenario (which modeled the impacts of a 7.8 magnitude earthquake on the southern San Andreas fault, http://urbanearth.gps.caltech.edu/scenario08/), and the recent ARkStorm scenario (modeling the impacts of a major, weeks-long winter storm hitting nearly all of California, http://urbanearth.gps.caltech.edu/winter-storm/). Environmental-health impacts of various past earthquakes and extreme storms are first used to identify plausible impacts that could be associated with the disaster scenarios. Substantial insights can then be gleaned using a Geographic Information Systems (GIS) approach to link ShakeOut and ARkStorm effects maps with data extracted from diverse database sources containing geologic, hazards, and environmental information. This type of analysis helps constrain where potential geogenic (natural) and anthropogenic sources of HM (and their likely types of contaminants or pathogens) fall within areas of predicted ShakeOut-related shaking, firestorms, and landslides, and predicted ARkStorm-related precipitation, flooding, and winds. Because of uncertainties in the event models and many uncertainties in the databases used (e.g., incorrect location information, lack of detailed information on specific facilities, etc.), this approach should only be considered as the first of multiple steps

  1. Quantum theory as a relevant framework for the statement of probabilistic and many-valued logic

    CERN Document Server

    Vol, E D

    2012-01-01

    Based on ideas from the quantum theory of open systems, we propose a consistent approach to the formulation of a logic of plausible propositions. To this end, we associate with every plausible proposition a diagonal matrix of its likelihood and examine it as the density matrix of a relevant quantum system. We show that all logical connectives between plausible propositions can be represented as special positive-valued transformations of these matrices. We also demonstrate that the above transformations can be realized in relevant composite quantum systems by quantum engineering methods. The approach proposed allows one not only to reproduce and generalize results of well-known logical systems (Boolean, Lukasiewicz and so on) but also to classify and analyze, from a unified point of view, various actual problems in psychophysics and the social sciences.

  2. Insertion of O-H Bond of Rh(Ⅱ)-methylene Carbene into Alcohols: A Stepwise Mechanism More Plausible than a Concerted Mechanism

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The mechanisms of insertion of the O-H bond of Rh(II)-methylene carbene into methanol and ethanol were studied using the B3LYP functional, both in the gas phase and in CH2Cl2. The formation of free alcoholic oxonium ylides is found to be impossible. Alcoholic oxonium ylides are formed as intermediates before both the stepwise and the concerted transition states of insertion of the O-H bond of Rh(II)-methylene carbene into methanol and ethanol. With regard to the mechanisms of insertion, analysis of the energy barriers of the two pathways indicates that the stepwise mechanism is more plausible than the concerted one.
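
    How a barrier difference translates into a preference between pathways can be made concrete with the Eyring equation; the 2 kcal/mol gap below is an assumed value for illustration, not the paper's computed barrier difference:

```python
# Rate ratio implied by a difference in activation free energies (Eyring).
import math

R = 1.987e-3    # kcal/(mol K)
T = 298.15      # K
ddG = 2.0       # assumed barrier difference in kcal/mol

ratio = math.exp(ddG / (R * T))
print(f"a {ddG} kcal/mol lower barrier -> ~{ratio:.0f}x faster at {T:.0f} K")
```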

  3. Is the Framework of Cohn's 'Tritope Model' for How T Cell Receptors Recognize Peptide/Self-MHC Complexes and Allo-MHC Plausible?

    Science.gov (United States)

    Bretscher, Peter A

    2016-05-01

    Cohn has developed the Tritope Model to describe how distinct domains of the T cell receptor (TcR) recognize peptide/self-MHC complexes and allo-MHC. Over the years, he has employed this model as a framework for considering how the TcR might mediate various signals [1-5]. In a recent publication [5], Cohn employs the Tritope Model to propose a detailed mechanism for the T cell receptor's involvement in positive thymic selection. During a review of this proposal, I became uneasy about the plausibility of the underlying framework of the Tritope Model. I outline here the evolutionary considerations that make me question this framework. I also suggest that the proposed framework underlying the Tritope Model makes strong predictions whose validity can most probably be assessed by considering observations reported in the literature.

  4. An Italian population-based case-control study on the association between farming and cancer: Are pesticides a plausible risk factor?

    Science.gov (United States)

    Salerno, Christian; Carcagnì, Antonella; Sacco, Sara; Palin, Lucio Antonio; Vanhaecht, Kris; Panella, Massimiliano; Guido, Davide

    2016-05-01

    This population-based case-control study investigated the association between farming (a proxy for pesticide exposure) and cancer in the Vercelli suburban area (northwest Italy). Residents aged 25 to 79 years in the above-mentioned area during the period 2002-2009 were considered. Cases were all first hospital admissions for cancer. Controls were all subjects not included in the cases and not excluded from the study. Cases and controls were classified according to whether they had occupationally been farmers or nonfarmers during the period 1965-2009. Cancer odds ratios (ORs) between farmers and nonfarmers were calculated with generalized linear mixed models adjusted by gender and age. Farmers showed higher odds for all cancers (OR=1.459; p …), consistent with a plausible association between pesticide exposure and cancer occurrence.
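
    The adjusted odds-ratio computation can be sketched with a plain logistic regression on synthetic data (the study used generalized linear mixed models; that simplification, and every number below, is an assumption):

```python
# Odds ratio for farming adjusted by gender and age, on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "farmer": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "age": rng.integers(25, 80, n),
})
logit_p = -3 + 0.38 * df.farmer + 0.2 * df.male + 0.02 * df.age
df["cancer"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("cancer ~ farmer + male + age", data=df).fit(disp=0)
print("OR(farmer) =", np.exp(fit.params["farmer"]))  # ~exp(0.38) = 1.46
```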

  5. Changes in the Distribution of Lesser Adjutant Storks (Leptoptilos javanicus in South and Southeast Asia: A Plausible Evidence of Global Climate and Land-use Change Effect

    Directory of Open Access Journals (Sweden)

    Kapil K. Khadka

    2014-01-01

    Species distribution models (SDMs) illustrate the relation between species and environmental variables. In an attempt to model the historical and current distribution of the Lesser Adjutant Stork (Leptoptilos javanicus) and gain qualitative insight into range shift, the MaxEnt modeling approach was applied. The model was projected onto maps to illustrate the variation in the spatial distribution of the species in South and Southeast Asia over time. A distributional shift was observed towards the north, accompanied by range contraction in the south and expansion in the north. In addition, annual precipitation and the temperature of the coldest period of the year appeared to be the major climatic determinants of the species' distribution. This provides plausible evidence of global climate and land-use change effects on the bird's distribution and suggests avenues for further research.

  6. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  7. Viability Theory

    CERN Document Server

    Aubin, Jean-Pierre; Saint-Pierre, Patrick

    2011-01-01

    Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explai

  8. Systems-level modeling the effects of arsenic exposure with sequential pulsed and fluctuating patterns for tilapia and freshwater clam

    Energy Technology Data Exchange (ETDEWEB)

    Chen, W.-Y. [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Tsai, J.-W. [Institute of Ecology and Evolutionary Ecology, China Medical University, Taichung 40402, Taiwan (China); Ju, Y.-R. [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Liao, C.-M., E-mail: cmliao@ntu.edu.t [Department of Bioenvironmental Systems Engineering, National Taiwan University, Taipei 10617, Taiwan (China)

    2010-05-15

    The purpose of this paper was to use a quantitative systems-level approach, employing a biotic ligand model-based threshold damage model, to examine the physiological responses of tilapia and freshwater clam to sequential pulsed and fluctuating arsenic concentrations. We tested the present model and its triggering mechanisms by carrying out a series of modeling experiments in which we used periodic pulses and sine waves as featured exposures. Our results indicate that changes in the dominant frequencies and pulse timing can shift the safe rate distributions for tilapia, but not for freshwater clam. We found that tilapia increase bioenergetic costs to maintain acclimation during pulsed and sine-wave exposures. Our ability to predict the consequences of physiological variation under time-varying exposure patterns also has implications for optimizing species growth, cultivation strategies, and risk assessment in realistic situations.
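
    The two featured exposure patterns can be sketched as signals with the same time-weighted average concentration but very different peak structure; the concentrations and timing below are illustrative, not the study's doses:

```python
# Pulsed vs. sine-wave exposure time series with equal means.
import numpy as np

t = np.arange(0, 96, 0.5)                        # hours, 4 full days
mean_c = 1.0                                     # ug/L, assumed baseline

pulsed = np.where((t % 24) < 6, 4 * mean_c, 0)   # one 6-h pulse per day
sine = mean_c * (1 + np.sin(2 * np.pi * t / 24)) # 24-h sinusoid

for name, c in (("pulsed", pulsed), ("sine", sine)):
    print(f"{name:6s} mean={c.mean():.2f}  peak={c.max():.2f}")
```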

  9. Field theory

    CERN Document Server

    Roman, Steven

    2006-01-01

    Intended for graduate courses or for independent study, this book presents the basic theory of fields. The first part begins with a discussion of polynomials over a ring, the division algorithm, irreducibility, field extensions, and embeddings. The second part is devoted to Galois theory. The third part of the book treats the theory of binomials. The book concludes with a chapter on families of binomials - the Kummer theory. This new edition has been completely rewritten in order to improve the pedagogy and to make the text more accessible to graduate students.  The exercises have also been im

  10. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2009-01-01

    This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains physical backgrounds with illustrations and provides descriptions of detailed derivation processes.

  11. Galois Theory

    CERN Document Server

    Cox, David A

    2012-01-01

    Praise for the First Edition: ". . . will certainly fascinate anyone interested in abstract algebra: a remarkable book!"—Monatshefte fur Mathematik. Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel's theory of Abelian equations, casus irreducibilis, and the Galo

  12. Game theory.

    Science.gov (United States)

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website.

  13. Penalty for Fuel Economy - System Level Perspectives on the Reliability of Hybrid Electric Vehicles During Normal and Graceful Degradation Operation

    Science.gov (United States)

    2008-08-27

    ...the issue of system-level reliability in hybrid electric vehicles from a quantitative point of view. It also introduces a quantitative meaning to the... internal combustion engine based vehicles and the later transition of those to hybrid electric vehicles. The paper intends to drive home the point that in HEVs... Generally, people tend to think only in terms of fuel economy and the additional cost premium on vehicle price when discussing hybrid electric...

  14. Identifying Second Language Errors: How Plausible are Plausible Reconstructions?

    Science.gov (United States)

    Hamid, Obaidul

    2007-01-01

    The research reported in the study was undertaken to measure English language teachers' ability to interpret second language learners' intended meanings in idiosyncratic utterances in written English. In doing so, it also aimed at verifying Corder's (1981) speculation that language teachers can correctly guess the intended meanings of learners in…

  15. Theories and models on the biology of cells in space

    Science.gov (United States)

    Todd, P.; Klaus, D. M.

    1996-01-01

    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/sq s and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  16. How to Do Things with Mouse Clicks: Applying Austin's Speech Act Theory to Explain Learning in Virtual Worlds

    Science.gov (United States)

    Loke, Swee-Kin; Golding, Clinton

    2016-01-01

    This article addresses learning in desktop virtual worlds where students role play for professional education. When students role play in such virtual worlds, they can learn some knowledge and skills that are useful in the physical world. However, existing learning theories do not provide a plausible explanation of how performing non-verbal…

  18. Quantum Theory

    CERN Document Server

    Manning, Phillip

    2011-01-01

    The study of quantum theory allowed twentieth-century scientists to examine the world in a new way, one that was filled with uncertainties and probabilities. Further study also led to the development of lasers, the atomic bomb, and the computer. This exciting new book clearly explains quantum theory and its everyday uses in our world.

  19. Shielding Theory

    Directory of Open Access Journals (Sweden)

    Ion N.Chiuta

    2009-05-01

    Full Text Available The paper determines relations for shielding effectiveness relative to several variables, including metal type, metal properties, thickness, distance, frequency, etc. It starts by presenting some relationships regarding magnetic, electric and electromagnetic fields as a pertinent background to understanding and applying field theory. Since the literature on electromagnetic compatibility is replete with discussions of Maxwell equations and field theory, only a few aspects are presented.

  20. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  1. Conspiracy Theory

    DEFF Research Database (Denmark)

    Bjerg, Ole; Presskorn-Thygesen, Thomas

    2017-01-01

    The paper is a contribution to current debates about conspiracy theories within philosophy and cultural studies. Wittgenstein’s understanding of language is invoked to analyse the epistemological effects of designating particular questions and explanations as a ‘conspiracy theory......’. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben’s concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned...... to alleged conspiracy theories within our prevalent paradigms of knowledge and truth is compared to the exceptional legal status assigned to individuals accused of terrorism under the War on Terror. The paper concludes by discussing the relation between conspiracy theory and ‘the paranoid style...

  2. Potential Theory

    CERN Document Server

    Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří

    1988-01-01

    Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24 July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag). Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...

  3. Concept theory

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2009-01-01

    Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge...... organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe......, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward the argument that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism...

  5. Implicit Theories Relate to Youth Psychopathology, But How? A Longitudinal Test of Two Predictive Models.

    Science.gov (United States)

    Schleider, Jessica L; Weisz, John R

    2016-08-01

    Research shows relations between entity theories (i.e., beliefs that traits and abilities are unchangeable) and youth psychopathology. A common interpretation has been that entity theories lead to psychopathology, but another possibility is that psychopathology predicts entity theories. The two models carry different implications for developmental psychopathology and intervention design. We tested each model's plausibility, examining longitudinal associations between entity theories of thoughts, feelings, and behavior and psychopathology in early adolescents across one school year (N = 59, 52% female, ages 11-14, 0% attrition). Baseline entity theories did not predict increases in psychopathology; instead, baseline psychopathology predicted increased entity theories over time. When symptom clusters were assessed individually, greater youth internalizing (but not externalizing) problems predicted subsequent increases in entity theories. Findings suggest that the commonly proposed predictive model may not be the only one warranting attention. They suggest that youth psychopathology may contribute to the development of certain kinds of entity theories.
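
    The two predictive models can be contrasted with cross-lagged regressions of the kind implied by this design: regress each Time-2 measure on its own baseline plus the other construct's baseline. The sketch below runs this on synthetic data; variable names, coefficients and noise levels are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 59  # sample size matching the study; data entirely synthetic
entity_t1 = rng.normal(size=n)
symptoms_t1 = rng.normal(size=n)
# Simulate the direction the study reported: symptoms -> later entity theories.
entity_t2 = 0.6 * entity_t1 + 0.3 * symptoms_t1 + rng.normal(scale=0.5, size=n)
symptoms_t2 = 0.6 * symptoms_t1 + 0.0 * entity_t1 + rng.normal(scale=0.5, size=n)

def cross_lag(outcome_t2, outcome_t1, predictor_t1):
    """OLS of the T2 outcome on its own T1 level plus the T1 cross-lagged predictor."""
    X = np.column_stack([np.ones(n), outcome_t1, predictor_t1])
    beta, *_ = np.linalg.lstsq(X, outcome_t2, rcond=None)
    return beta[2]  # the cross-lagged coefficient

print(cross_lag(entity_t2, entity_t1, symptoms_t1))    # sizeable: symptoms -> entity
print(cross_lag(symptoms_t2, symptoms_t1, entity_t1))  # near zero: entity -> symptoms
```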

  6. Plausible Reasoning in Tactical Planning.

    Science.gov (United States)

    1987-04-01

    [Abstract not available; the extracted excerpt is garbled OCR of the report's figures. Recoverable content: frame-based memory examples ("Frames in a Person's Memory") in which a flower frame has slots such as type-of = (plant); types = {rose, daffodil, peony, bougainvillea, ...}; parts = {petals, stem, ...}; colors = {pink, ...}; a part hierarchy spanning World, South America, Subtropical, Surrey, and flower types (daffodils, bougainvillea, peonies, roses, yellow roses); Figure 4, "Part Hierarchy for England"; and example plausible-inference statements over flower-type(England).]

  7. Plausibility Arguments and Universal Gravitation

    Science.gov (United States)

    Cunha, Ricardo F. F.; Tort, A. C.

    2017-01-01

    Newton's law of universal gravitation underpins our understanding of the dynamics of the Solar System and of a good portion of the observable universe. Generally, in the classroom or in textbooks, the law is presented initially in a qualitative way and at some point during the exposition its mathematical formulation is written on the blackboard…

  8. Plausible Mechanisms of Cadmium Carcinogenesis

    Science.gov (United States)

    Cadmium is a transition metal and a ubiquitous environmental and industrial pollutant. Laboratory animal studies and epidemiological studies have shown that exposure to cadmium is associated with various organ toxicities and carcinogenic effects. Several national and internation...

  9. Bayesian Theory

    CERN Document Server

    Bernardo, Jose M

    2000-01-01

    This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called prior ignorance. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica...

  10. Psychodynamic Theory

    Directory of Open Access Journals (Sweden)

    Kathleen Holtz Deal

    2007-05-01

    Full Text Available Psychodynamic theory, a theory of personality originated by Sigmund Freud, has a long and complex history within social work and continues to be utilized by social workers. This article traces the theory’s development and explains key concepts with an emphasis on its current relational focus within object relations theory and self-psychology. Empirical support for theoretical concepts and the effectiveness of psychodynamic therapies is reviewed and critiqued. Future directions are discussed, including addressing cultural considerations, increasing research, and emphasizing a relational paradigm

  11. Number theory

    CERN Document Server

    Andrews, George E

    1994-01-01

    Although mathematics majors are usually conversant with number theory by the time they have completed a course in abstract algebra, other undergraduates, especially those in education and the liberal arts, often need a more basic introduction to the topic.In this book the author solves the problem of maintaining the interest of students at both levels by offering a combinatorial approach to elementary number theory. In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simpl

  12. Mapping Theory

    DEFF Research Database (Denmark)

    Smith, Shelley

    This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research...... project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept...

  13. Plasticity theory

    CERN Document Server

    Lubliner, Jacob

    2008-01-01

    The aim of Plasticity Theory is to provide a comprehensive introduction to the contemporary state of knowledge in basic plasticity theory and to its applications. It treats several areas not commonly found between the covers of a single book: the physics of plasticity, constitutive theory, dynamic plasticity, large-deformation plasticity, and numerical methods, in addition to a representative survey of problems treated by classical methods, such as elastic-plastic problems, plane plastic flow, and limit analysis; the problem discussed come from areas of interest to mechanical, structural, and

  14. African Trypanosomiasis Detection using Dempster-Shafer Theory

    CERN Document Server

    Maseleno, Andino

    2012-01-01

    The World Health Organization reports that African trypanosomiasis mostly affects poor populations living in remote rural areas of Africa and can be fatal if not properly treated. This paper presents Dempster-Shafer theory for the detection of African trypanosomiasis. Sustainable elimination of African trypanosomiasis as a public-health problem is feasible and requires continuous efforts and innovative approaches. In this research, we implement Dempster-Shafer theory for detecting African trypanosomiasis and displaying the result of the detection process. We describe eleven major symptoms: fever, red urine, skin rash, paralysis, headache, bleeding around the bite, joint pain, swollen lymph nodes, sleep disturbances, meningitis and arthritis. Our approach uses Dempster-Shafer theory to quantify the degree of belief, to combine beliefs under conditions of uncertainty and ignorance, and to allow quantitative measurement of the belief and plausibility in our identificat...
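
    The core machinery here is Dempster's rule of combination together with the belief and plausibility measures. The following is a minimal, self-contained sketch, not the paper's implementation; the two-element frame of discernment, the symptom sources, and the mass values are illustrative assumptions.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass assigned to contradictory intersections
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

def belief(m, a):
    """Belief: total mass of focal elements contained in the hypothesis."""
    return sum(v for b, v in m.items() if b <= a)

def plausibility(m, a):
    """Plausibility: total mass of focal elements consistent with the hypothesis."""
    return sum(v for b, v in m.items() if b & a)

# Illustrative frame: trypanosomiasis vs. some other cause.
T, O = frozenset({"tryp"}), frozenset({"other"})
theta = T | O                            # the whole frame (ignorance)
m_fever = {T: 0.4, theta: 0.6}           # hypothetical mass from fever
m_sleep = {T: 0.7, theta: 0.3}           # hypothetical mass from sleep disturbance
m = combine(m_fever, m_sleep)
print(belief(m, T), plausibility(m, T))  # 0.82 and 1.0 for this toy evidence
```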

  15. The utility of system-level RAM analysis and standards for the US nuclear waste management system

    Energy Technology Data Exchange (ETDEWEB)

    Rod, S.R.; Adickes, M.D.; Paul, B.K.

    1992-03-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing a system to manage spent nuclear fuel and high-level radioactive waste in accordance with the Nuclear Waste Policy Act of 1982 and its subsequent amendments. Pacific Northwest Laboratory (PNL) is assisting OCRWM in its investigation of whether system-level reliability, availability, and maintainability (RAM) requirements are appropriate for the waste management system and, if they are, what the appropriate form of such requirements should be. Results and recommendations are presented.

  16. Towards system-level understanding of baculovirus host cell interactions: from molecular fundamental studies to large-scale proteomics approaches

    Directory of Open Access Journals (Sweden)

    Francisca eMonteiro

    2012-11-01

    Full Text Available Baculoviruses are insect viruses extensively exploited as eukaryotic protein expression vectors. Molecular biology studies have provided exciting discoveries on virus-host interactions, but the application of high-throughput omics techniques to the baculovirus-insect cell system has been hampered by the lack of a sequenced host genome. While a broader, systems-level analysis of biological responses to infection is urgently needed, recent advances in proteomic studies have yielded new insights into the impact of infection on the host cell. These works are reviewed and critically assessed in the light of current biological knowledge of the molecular biology of baculoviruses and insect cells.

  17. Continuity theory

    CERN Document Server

    Nel, Louis

    2016-01-01

    This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...

  18. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  19. Activity Theory.

    Science.gov (United States)

    Koschmann, Timothy; Roschelle, Jeremy; Nardi, Bonnie A.

    1998-01-01

    Includes three articles that discuss activity theory, based on "Context and Consciousness." Topics include human-computer interaction; computer interfaces; hierarchical structuring; mediation; contradictions and development; failure analysis; and designing educational technology. (LRW)

  20. Invited review: Theories of aging.

    Science.gov (United States)

    Weinert, Brian T; Timiras, Paola S

    2003-10-01

    Several factors (the lengthening of the average and, to a lesser extent, of the maximum human life span; the increase in the percentage of elderly in the population and in the proportion of the national expenditure utilized by the elderly) have stimulated and continue to expand the study of aging. Recently, the view of aging as an extremely complex multifactorial process has replaced the earlier search for a distinct cause such as a single gene or the decline of a key body system. This minireview keeps in mind the multiplicity of mechanisms regulating aging; examines them at the molecular, cellular, and systemic levels; and explores the possibility of interactions at these three levels. The heterogeneity of the aging phenotype among individuals of the same species and differences in longevity among species underline the contribution of both genetic and environmental factors in shaping the life span. Thus, the presence of several trajectories of the life span, from incidence of disease and disability to absence of pathology and persistence of function, suggests that it is possible to experimentally (e.g., by calorie restriction) prolong functional plasticity and life span. In this minireview, several theories are identified only briefly; a few (evolutionary, gene regulation, cellular senescence, free radical, and neuroendocrine-immune theories) are discussed in more detail, at molecular, cellular, and systemic levels.

  1. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  2. Matrix Theory

    Science.gov (United States)

    1988-06-30

    MATRICES. The monograph Nonnegative Matrices [6] is an advanced book on all aspects of the theory of nonnegative matrices and... and on inverse eigenvalue problems for nonnegative matrices. The work explores some of the most recent developments in the theory of nonnegative... $t_{k-1}, t_k$. Define the associated polynomial of type $\langle z \rangle$: $x^t - x^{t-t_2} - x^{t-t_3} - \cdots - x^{t-t_{k-1}}$, where $t = t_k$. The...

  3. Diagnostic Plausibility of MTBDRplus and MTBDRsl Line Probe Assays for Rapid Drug Susceptibility Testing of Drug Resistant Mycobacterium tuberculosis Strains in Pakistan

    Directory of Open Access Journals (Sweden)

    Javaid

    2016-06-01

    Full Text Available Background: The World Health Organization (WHO) recommends the use of line probe assays (LiPAs) for rapid drug susceptibility testing (DST). However, only a limited number of studies from Pakistan have documented the performance characteristics of line probe assays in testing multi-drug resistant (MDR) strains of Mycobacterium tuberculosis (MTB). Objectives: The objective of this work is to evaluate the diagnostic plausibility of the LiPA tests MTBDRplus and MTBDRsl on MDR MTB isolates from Pakistan. Patients and Methods: This was a cross-sectional study conducted at the Indus hospital, Karachi. LiPA testing was performed on 196 smear-positive samples using BACTEC MGIT 960 as a gold standard. Results: The sensitivity of MTBDRplus for isoniazid and rifampicin was found to be 88.8% and 90.2%, respectively, while the sensitivity of MTBDRsl for fluoroquinolones, amikacin/capreomycin, and ethambutol was found to be 72.9%, 81.8%, and 56.6%, respectively. Conclusions: The MTBDRplus and MTBDRsl genotypic tests can serve as useful additional tools for DST in a high-burden country like Pakistan, provided they are used in combination with phenotypic testing.
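
    For reference, the sensitivity figures above are the proportion of isolates called resistant by the gold standard (BACTEC MGIT 960) that the LiPA also calls resistant. A one-function sketch with illustrative counts, not the study's actual data:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of gold-standard-resistant isolates that the assay detects."""
    return true_pos / (true_pos + false_neg)

print(sensitivity(true_pos=111, false_neg=14))  # 0.888, cf. 88.8% for isoniazid
```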

  4. Observational evidence for the plausible linkage of Equatorial Electrojet (EEJ electric field variations with the post sunset F-region electrodynamics

    Directory of Open Access Journals (Sweden)

    V. Sreeja

    2009-11-01

    Full Text Available The paper is based on a detailed observational study of Equatorial Spread F (ESF) events on geomagnetically quiet (Ap ≤ 20) days of the solar maximum (2001), moderate (2004) and minimum (2006) years, using the ionograms and magnetograms from the magnetic equatorial location of Trivandrum (8.5° N; 77° E; dip lat ~0.5° N) in India. The study brings out some interesting aspects of the daytime Equatorial Electrojet (EEJ)-related electric field variations and the post-sunset F-region electrodynamics governing the nature of the seasonal characteristics of the ESF phenomena during these years. The observed results seem to indicate a plausible linkage of daytime EEJ-related electric field variations with the pre-reversal enhancement, which in turn is related to the occurrence of ESF. These electric field variations are shown to be better represented through a parameter, termed "E", in the context of possible coupling between the E- and F-regions of the ionosphere. The observed similarities in the gross features of the variations in the parameter "E" and the F-region vertical drift (Vz) point towards the potential usage of the EEJ-related parameter "E" as a useful index for the assessment of Vz prior to the occurrence of ESF.

  5. Effect of central metal ions of analogous metal-organic frameworks on adsorption of organoarsenic compounds from water: plausible mechanism of adsorption and water purification.

    Science.gov (United States)

    Jun, Jong Won; Tong, Minman; Jung, Beom K; Hasan, Zubair; Zhong, Chongli; Jhung, Sung Hwa

    2015-01-02

    The adsorptive removal of organoarsenic compounds such as p-arsanilic acid (ASA) and roxarsone (ROX) from water using metal-organic frameworks (MOFs) has been investigated for the first time. A MOF, iron benzenetricarboxylate (also called MIL-100-Fe) exhibits a much higher adsorption capacity for ASA and ROX than activated carbon, zeolite (HY), goethite, and other MOFs. The adsorption of ASA and ROX over MIL-100-Fe is also much more rapid than that over activated carbon. Moreover, the used MIL-100-Fe can be recycled by simply washing with acidic ethanol. Therefore, it is determined that a MOF such as MIL-100-Fe can be used to remove organoarsenic compounds from contaminated water because of its high adsorption capacity, rapid adsorption, and ready regeneration. Moreover, only one of three analogous MIL-100 species (MIL-100-Fe, rather than MIL-100-Al or MIL-100-Cr) can effectively remove the organoarsenic compounds. This selective and high adsorption over MIL-100-Fe, different from other analogous MIL-100 species, can be explained (through calculations) by the facile desorption of water from MIL-100-Fe as well as the large (absolute value) replacement energy (difference between the adsorption energies of the organoarsenic compounds and water) exhibited by MIL-100-Fe. A plausible adsorption/desorption mechanism is proposed based on the surface charge of the MOFs, FTIR results, calculations, and the reactivation results with respect to the solvents used in the experiments.

  6. Investigation of plausible mechanistic pathways in hydrogenation of η⁵-(C₅H₅)₂Ta(H)=CH₂: an analysis using DFT and AIM techniques.

    Science.gov (United States)

    Neogi, Soumya Ganguly; Das, Anita; Chaudhury, Pinaki

    2014-03-01

    In this manuscript, we investigate two plausible pathways for the addition of H₂ across the Ta=C bond in η⁵-(C₅H₅)₂Ta(H)=CH₂. One of the investigated reaction pathways involves a single concerted step with a four-membered transition state, keeping the oxidation state of tantalum unaltered, whereas the other pathway deals with a two-step reaction involving α-insertion of H₂ to produce a 16e⁻ Ta(III)-methyl species and a subsequent oxidative addition. We must emphasize that an experimental study by Bregel et al. [J Am Chem Soc 2002, 124:13827-13832] on a derivative of the chemical system investigated in the present study showed that the two-step strategy of α-insertion followed by subsequent oxidative addition is the preferred one. Our numerical investigations using DFT and AIM calculations lead to a similar conclusion. To establish our conclusion, we employ various basis sets to obtain the free energy of activation of the reaction. The AIM technique especially helps us to characterize the bond critical points at the optimized geometries of the reactants, products, transition states, and intermediates for the two-step mechanism.

  7. Untemplated nonenzymatic polymerization of 3',5'cGMP: a plausible route to 3',5'-linked oligonucleotides in primordia.

    Science.gov (United States)

    Šponer, Judit E; Šponer, Jiří; Giorgi, Alessandra; Di Mauro, Ernesto; Pino, Samanta; Costanzo, Giovanna

    2015-02-19

    The high-energy 3',5' phosphodiester linkages conserved in 3',5' cyclic GMPs offer a genuine solution for the monomer activation required by the transphosphorylation reactions that could lead to the emergence of the first simple oligonucleotide sequences on the early Earth. In this work we provide an in-depth characterization of the effect of the reaction conditions on the yield of the polymerization reaction of 3',5' cyclic GMPs, both in an aqueous environment and under dehydrating conditions. We show that the threshold temperature of the polymerization is about 30 °C lower under dehydrating conditions than in solution. In addition, we present a plausible exergonic reaction pathway for the polymerization reaction, which involves the transient formation of anionic centers at the O3' positions of the participating riboses. We suggest that excess Na(+) cations inhibit the polymerization reaction because they block the anionic mechanism by neutralizing the negatively charged O3'. Our experimental findings are compatible with a prebiotic scenario where gradual desiccation of the environment could induce polymerization of 3',5' cyclic GMPs synthesized in liquid.

  8. A plausible worst-case scenario of increasing multidrug resistance as a tool for assessing societal risks and capabilities in Sweden.

    Science.gov (United States)

    Roffey, Roger; Lindberg, Anna; Molin, Lena; Wikman-Svahn, Per

    2015-01-01

    A "plausible worst-case scenario" of a gradually increasing level of multidrug-resistant bacteria (carbapenem-resistant E. coli) in the human population was developed and used to study how Swedish authorities would manage this situation and to identify preventive measures that could be taken. Key findings include: (1) a scenario in which 5% of the population in southern Sweden become carriers of carbapenem-resistant E. coli is possible or even likely in 10 to 15 years; (2) it is not clear when and how the increase of E. coli resistant to carbapenems as in the scenario would be detected in the general human population; (3) identified negative consequences of the scenario on society were primarily due to increased demands on the healthcare system and potential consequences for food-producing animals, food safety, and environmental health; and (4) a number of preventive and mitigation measures were suggested, including initiating long-term screening programs for public and animal health as well as for food and water production to monitor increasing levels of carbapenem resistance. Strategies and plans to prevent and handle future increasing prevalence of multidrug-resistant bacteria need to be developed.

  9. Wedelolactone mitigates UVB induced oxidative stress, inflammation and early tumor promotion events in murine skin: plausible role of NFkB pathway.

    Science.gov (United States)

    Ali, Farrah; Khan, Bilal Azhar; Sultana, Sarwat

    2016-09-05

    UVB (ultraviolet B) radiation is one of the major etiological factors in various dermal pathologies, viz. dermatitis, actinic folliculitis, solar urticaria, psoriasis and cancer, among many others. UVB causes toxic manifestations in tissues by inciting inflammatory and tumor-promoting events. We designed this study to assess the anti-inflammatory and anti-tumor-promotion effects of wedelolactone (WDL), a specific IKK inhibitor. Results indicate significant restoration of anti-oxidative enzymes due to WDL treatment. We also found that WDL was effective in mitigating inflammatory markers, including MPO (myeloperoxidase), mast cell trafficking, Langerhans cell suppression and COX-2 up-regulation due to UVB exposure. We also deduce that WDL presents a promising intervention for attenuating early tumor promotion events caused by UVB exposure, as indicated by the results for ODC (ornithine decarboxylase), the thymidine assay, and vimentin and VEGF (vascular endothelial growth factor) expression. This study provides substantial cues for the therapeutic ability of wedelolactone against inflammatory and tumor-promoting events in murine skin, depicting a plausible role of the NFkB pathway.

  10. Opioid analgesics and P-glycoprotein efflux transporters: a potential systems-level contribution to analgesic tolerance.

    Science.gov (United States)

    Mercer, Susan L; Coop, Andrew

    2011-01-01

    Chronic clinical pain remains poorly treated. Despite attempts to develop novel analgesic agents, opioids remain the standard analgesics of choice in the clinical management of chronic and severe pain. However, mu opioid analgesics have undesired side effects including, but not limited to, respiratory depression, physical dependence and tolerance. A growing body of evidence suggests that P-glycoprotein (P-gp), an efflux transporter, may make a systems-level contribution to the development of opioid tolerance. Herein, we describe current in vitro and in vivo methodology available to analyze interactions between opioids and P-gp and critically analyze P-gp data associated with six commonly used mu opioids: morphine, methadone, loperamide, meperidine, oxycodone, and fentanyl. Recent studies focused on the development of opioids lacking P-gp substrate activity are explored, concentrating on structure-activity relationships to develop an optimal opioid analgesic lacking this systems-level contribution to tolerance development. Continued work in this area will potentially allow for delineation of the mechanism responsible for opioid-related P-gp up-regulation and provide further support for evidence-based medicine supporting clinical opioid rotation.

  11. A Mathematical Model of Metabolism and Regulation Provides a Systems-Level View of How Escherichia coli Responds to Oxygen

    Directory of Open Access Journals (Sweden)

    Michael eEderer

    2014-03-01

    Full Text Available The efficient redesign of bacteria for biotechnological purposes, such as biofuel production, waste disposal or specific biocatalytic functions, requires a quantitative systems-level understanding of energy supply, carbon and redox metabolism. The measurement of transcript levels, metabolite concentrations and metabolic fluxes per se gives an incomplete picture. An appreciation of the interdependencies between the different measurement values is essential for systems-level understanding. Mathematical modeling has the potential to provide a coherent and quantitative description of the interplay between gene expression, metabolite concentrations and metabolic fluxes. Escherichia coli undergoes major adaptations in central metabolism when the availability of oxygen changes. Thus, an integrated description of the oxygen response provides a benchmark of our understanding of carbon, energy and redox metabolism. We present the first comprehensive model of the central metabolism of E. coli that describes steady-state metabolism at different levels of oxygen availability. Variables of the model are metabolite concentrations, gene expression levels, transcription factor activities, metabolic fluxes and biomass concentration. We analyze the model with respect to the production capabilities of central metabolism of E. coli. In particular, we predict how precursor and biomass concentration are affected by product formation.

  12. Possibility Theory versus Probability Theory in Fuzzy Measure Theory

    Directory of Open Access Journals (Sweden)

    Parul Agarwal

    2015-05-01

    Full Text Available The purpose of this paper is to compare probability theory with possibility theory, and to use this comparison in comparing probability theory with fuzzy set theory. The best way of comparing probabilistic and possibilistic conceptualizations of uncertainty is to examine the two theories from a broader perspective. Such a perspective is offered by evidence theory, within which probability theory and possibility theory are recognized as special branches. While the various characteristics of possibility theory within the broader framework of evidence theory are expounded in this paper, we need to introduce their probabilistic counterparts to facilitate our discussion.
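
    The key structural contrast is that probability is additive while possibility is maxitive, with necessity as its dual. A minimal sketch, with invented distributions:

```python
def prob_measure(p, event):
    """Probability of an event: sum of elementary probabilities (additive)."""
    return sum(p[x] for x in event)

def poss_measure(pi, event):
    """Possibility of an event: maximum of elementary possibilities (maxitive)."""
    return max(pi[x] for x in event)

p  = {"a": 0.5, "b": 0.3, "c": 0.2}   # probabilities must sum to 1
pi = {"a": 1.0, "b": 0.7, "c": 0.2}   # at least one element fully possible

event = {"a", "b"}
print(prob_measure(p, event))   # 0.8
print(poss_measure(pi, event))  # 1.0
# Necessity is the dual of possibility: N(A) = 1 - Pi(complement of A).
universe = set(pi)
print(1 - poss_measure(pi, universe - event))  # 0.8
```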

  13. Intention, emotion, and action: a neural theory based on semantic pointers.

    Science.gov (United States)

    Schröder, Tobias; Stewart, Terrence C; Thagard, Paul

    2014-06-01

    We propose a unified theory of intentions as neural processes that integrate representations of states of affairs, actions, and emotional evaluation. We show how this theory provides answers to philosophical questions about the concept of intention, psychological questions about human behavior, computational questions about the relations between belief and action, and neuroscientific questions about how the brain produces actions. Our theory of intention ties together biologically plausible mechanisms for belief, planning, and motor control. The computational feasibility of these mechanisms is shown by a model that simulates psychologically important cases of intention. © 2013 Cognitive Science Society, Inc.

  14. Biocultural Theory

    DEFF Research Database (Denmark)

    Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie

    2017-01-01

    Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human...... and ideological beliefs, and artistic practices such as music, dance, painting, and storytelling. Establishing biocultural theory as a program that self-consciously encompasses the different particular forms of human evolutionary research could help scholars and scientists envision their own specialized areas...... of research as contributions to a coherent, collective research program. This article argues that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological...

  15. Lattice theory

    CERN Document Server

    Donnellan, Thomas; Maxwell, E A; Plumpton, C

    1968-01-01

    Lattice Theory presents an elementary account of a significant branch of contemporary mathematics concerning lattice theory. This book discusses the unusual features, which include the presentation and exploitation of partitions of a finite set. Organized into six chapters, this book begins with an overview of the concept of several topics, including sets in general, the relations and operations, the relation of equivalence, and the relation of congruence. This text then defines the relation of partial order and then partially ordered sets, including chains. Other chapters examine the properti

  16. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  17. Galois theory

    CERN Document Server

    Stewart, Ian

    2003-01-01

    Ian Stewart's Galois Theory has been in print for 30 years. Resoundingly popular, it still serves its purpose exceedingly well. Yet mathematics education has changed considerably since 1973, when theory took precedence over examples, and the time has come to bring this presentation in line with more modern approaches.To this end, the story now begins with polynomials over the complex numbers, and the central quest is to understand when such polynomials have solutions that can be expressed by radicals. Reorganization of the material places the concrete before the abstract, thus motivating the g

  18. Plausible mechanisms of the Fenton-like reactions, M = Fe(II) and Co(II), in the presence of RCO2(-) substrates: are OH(•) radicals formed in the process?

    Science.gov (United States)

    Kornweitz, Haya; Burg, Ariela; Meyerstein, Dan

    2015-05-01

    DFT calculations concerning the plausible mechanism of Fenton-like reactions catalyzed by Fe(II) and Co(II) cations in the presence of carboxylate ligands suggest that hydroxyl radicals are not formed in these reactions. This conclusion suggests that the commonly accepted mechanisms of Fenton-like reactions induced oxidative stress and advanced oxidation processes have to be reconsidered.

  19. Intrinsically motivated action-outcome learning and goal-based action recall: a system-level bio-constrained computational model.

    Science.gov (United States)

    Baldassarre, Gianluca; Mannella, Francesco; Fiore, Vincenzo G; Redgrave, Peter; Gurney, Kevin; Mirolli, Marco

    2013-05-01

    Reinforcement (trial-and-error) learning in animals is driven by a multitude of processes. Most animals have evolved several sophisticated systems of 'extrinsic motivations' (EMs) that guide them to acquire behaviours allowing them to maintain their bodies, defend against threat, and reproduce. Animals have also evolved various systems of 'intrinsic motivations' (IMs) that allow them to acquire actions in the absence of extrinsic rewards. These actions are used later to pursue such rewards when they become available. Intrinsic motivations have been studied in Psychology for many decades and their biological substrates are now being elucidated by neuroscientists. In the last two decades, investigators in computational modelling, robotics and machine learning have proposed various mechanisms that capture certain aspects of IMs. However, we still lack models of IMs that attempt to integrate all key aspects of intrinsically motivated learning and behaviour while taking into account the relevant neurobiological constraints. This paper proposes a bio-constrained system-level model that contributes a major step towards this integration. The model focusses on three processes related to IMs and on the neural mechanisms underlying them: (a) the acquisition of action-outcome associations (internal models of the agent-environment interaction) driven by phasic dopamine signals caused by sudden, unexpected changes in the environment; (b) the transient focussing of visual gaze and actions on salient portions of the environment; (c) the subsequent recall of actions to pursue extrinsic rewards based on goal-directed reactivation of the representations of their outcomes. The tests of the model, including a series of selective lesions, show how the focussing processes lead to a faster learning of action-outcome associations, and how these associations can be recruited for accomplishing goal-directed behaviours. The model, together with the background knowledge reviewed in the paper
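
    Process (a) and the goal-based recall in process (c) can be caricatured in a few lines: a phasic surprise signal (an outcome not predicted by the internal model) reinforces an action-outcome association, which is later recalled starting from a desired outcome. The sketch below is a toy abstraction with assumed names and a deterministic environment, not the bio-constrained model itself.

```python
import random

actions = ["press_lever", "pull_chain", "poke_nose"]
outcomes = {"press_lever": "light_on", "pull_chain": "tone", "poke_nose": "nothing"}
predicted = {}   # internal model: action -> predicted outcome
strength = {}    # action-outcome association strengths

random.seed(0)
for trial in range(50):
    a = random.choice(actions)                         # exploratory action
    o = outcomes[a]                                    # observed outcome
    surprise = 1.0 if predicted.get(a) != o else 0.0   # phasic "dopamine-like" signal
    strength[(a, o)] = strength.get((a, o), 0.0) + 0.3 * surprise
    predicted[a] = o                                   # update the internal model

# Goal-directed recall: to obtain "light_on", pick the action whose
# association with that outcome is strongest.
goal = "light_on"
best = max((k for k in strength if k[1] == goal), key=lambda k: strength[k])
print(best[0])   # -> press_lever
```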

  20. An architectural model of conscious and unconscious brain functions: Global Workspace Theory and IDA.

    Science.gov (United States)

    Baars, Bernard J; Franklin, Stan

    2007-11-01

    While neural net models have been developed to a high degree of sophistication, they have some drawbacks at a more integrative, "architectural" level of analysis. We describe a "hybrid" cognitive architecture that is implementable in neuronal nets, and which has uniform brainlike features, including activation-passing and highly distributed "codelets," implementable as small-scale neural nets. Empirically, this cognitive architecture accounts qualitatively for the data described by Baars' Global Workspace Theory (GWT) and Franklin's LIDA architecture, including state-of-the-art models of conscious contents in action-planning, Baddeley-style working memory, and working models of episodic and semantic long-term memory. These terms are defined both conceptually and empirically for the current theoretical domain. The resulting architecture meets four desirable goals for a unified theory of cognition: practical workability, autonomous agency, a plausible role for conscious cognition, and translatability into plausible neural terms. It also generates testable predictions, both empirical and computational.
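
    The architectural core, competition among distributed codelets for access to a global workspace followed by a system-wide broadcast, can be sketched abstractly as below. The codelet names and random activation scheme are placeholders, not the IDA/LIDA implementation.

```python
import random

class Codelet:
    """A small special-purpose processor bidding for workspace access."""
    def __init__(self, name):
        self.name = name
        self.activation = random.random()  # salience of this processor's content
        self.received = []                 # broadcasts seen so far

    def act(self, broadcast):
        # Each codelet adjusts to the broadcast; here we simply record it.
        self.received.append(broadcast)

random.seed(0)
codelets = [Codelet(n) for n in ("vision", "audition", "memory", "planning")]
for cycle in range(3):
    winner = max(codelets, key=lambda c: c.activation)  # competition for access
    broadcast = (cycle, winner.name)                    # the "conscious" content
    for c in codelets:
        c.act(broadcast)                                # global broadcast to all
        c.activation = random.random()                  # fresh bids next cycle
```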

  1. Effective theories of universal theories

    CERN Document Server

    Wells, James D

    2015-01-01

    It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably $S$ and $T$ parameters) are only applicable to a special class of new physics scenarios known as universal theories. In the effective field theory (EFT) framework, the oblique parameters should not be associated with Wilson coefficients in a particular operator basis, unless restrictions have been imposed on the EFT so that it describes universal theories. We work out these restrictions, and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly-adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the $h^3$, $hff$, $hVV$ vertices, 3 parameters for $hVV$ vertices absent in the Standard Model, and 1 four-fermion coupling of order $y_f^2$. All these parameters are defined in an unambiguous and basis-indepen...

  2. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model

    Science.gov (United States)

    Aberg, Kristoffer C.; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
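
    The trial-by-trial quantities referred to above, prediction errors and expected values, are the standard outputs of a delta-rule learner. A minimal Rescorla-Wagner-style sketch follows, with an assumed learning rate and synthetic rewards rather than the paper's fitted model.

```python
import random

alpha = 0.1   # learning rate (assumed)
value = 0.0   # expected value of the cue (reward anticipation)
random.seed(1)
for trial in range(100):
    reward = random.choice([0.0, 1.0])   # stochastic reward delivery
    prediction_error = reward - value    # delta: obtained minus expected
    value += alpha * prediction_error    # update the expectation
    # In the analysis above, |prediction_error| (reward delivery) and value
    # (reward anticipation) would each modulate encoding of the item
    # presented on this trial.
```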

  4. From provocative narrative scenarios to quantitative biophysical model results: Simulating plausible futures to 2070 in an urbanizing agricultural watershed in Wisconsin, USA

    Science.gov (United States)

    Booth, E.; Chen, X.; Motew, M.; Qiu, J.; Zipper, S. C.; Carpenter, S. R.; Kucharik, C. J.; Steven, L. I.

    2015-12-01

    Scenario analysis is a powerful tool for envisioning future social-ecological change and its consequences on human well-being. Scenarios that integrate qualitative storylines and quantitative biophysical models can create a vivid picture of these potential futures but the integration process is not straightforward. We present - using the Yahara Watershed in southern Wisconsin (USA) as a case study - a method for developing quantitative inputs (climate, land use/cover, and land management) to drive a biophysical modeling suite based on four provocative and contrasting narrative scenarios that describe plausible futures of the watershed to 2070. The modeling suite consists of an agroecosystem model (AgroIBIS-VSF), hydrologic routing model (THMB), and empirical lake water quality model and estimates several biophysical indicators to evaluate the watershed system under each scenario. These indicators include water supply, lake flooding, agricultural production, and lake water quality. Climate (daily precipitation and air temperature) for each scenario was determined using statistics from 210 different downscaled future climate projections for two 20-year time periods (2046-2065 and 2081-2100) and modified using a stochastic weather generator to allow flexibility for matching specific climate events within the scenario narratives. Land use/cover for each scenario was determined first by quantifying changes in areal extent every decade for 15 categories at the watershed scale to be consistent with the storyline events and theme. Next, these changes were spatially distributed using a rule-based framework based on land suitability metrics that determine transition probabilities. Finally, agricultural inputs including manure and fertilizer application rates were determined for each scenario based on the prevalence of livestock, water quality regulations, and technological innovations. Each scenario is compared using model inputs (maps and time-series of land use/cover and
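
    The rule-based spatial allocation step can be illustrated schematically: a decadal areal target is met by converting the most suitable cells first, a deterministic stand-in for the suitability-based transition probabilities described above. All categories, targets and suitability scores below are invented.

```python
import random

random.seed(0)
cells = [{"cover": "cropland", "suitability": random.random()} for _ in range(1000)]
target_urban_cells = 120   # decadal areal target for one scenario (assumed)

# Rank convertible cells by suitability and convert the top of the list.
candidates = sorted((c for c in cells if c["cover"] == "cropland"),
                    key=lambda c: c["suitability"], reverse=True)
for c in candidates[:target_urban_cells]:
    c["cover"] = "urban"

print(sum(c["cover"] == "urban" for c in cells))   # 120
```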

  5. Plausible domain configurations and phase contents in two- and three-phase BaTiO3-based lead-free ferroelectrics

    Science.gov (United States)

    Topolov, Vitaly Yu; Brajesh, Kumar; Ranjan, Rajeev; Panich, Anatoly E.

    2017-02-01

    We have carried out a comparative study of plausible non-180° domain configurations in the two- and three-phase states of lead-free ferroelectrics Ba(Ti1-x Zr x )O3 (0.02  ⩽  x  ⩽  0.08) and (Ba0.85Ca0.15)(Ti0.90Zr0.10)O3, respectively, using the elastic matching approach. The phase contents and stress-relief conditions in Ba(Ti0.93Zr0.07)O3 and (Ba0.85Ca0.15)(Ti0.90Zr0.10)O3 strongly depend on domain types in the rhombohedral R3m phase, whereas domains of the orthorhombic Amm2 phase influence two-phase states in Ba(Ti0.98Zr0.02)O3. Changes in the unit-cell parameters of (Ba0.85Ca0.15)(Ti0.90Zr0.10)O3 on poling lead to complete stress relief in three-phase (P4mm  +  Amm2  +  R3m) structures by increasing the volume fraction of the R3m phase. A link between the heterophase/domain structures and the high piezoelectric activity in (Ba0.85Ca0.15)(Ti0.90Zr0.10)O3 is discussed. Based on our results, we state that equal or almost equal volume fractions of the domain types at the three-phase coexistence in (Ba0.85Ca0.15)(Ti0.90Zr0.10)O3 can lead to an enhanced contribution from domain-wall displacements and, therefore, to the large piezoelectric response in this important lead-free ferroelectric compound.

  6. FunnyBase: a systems level functional annotation of Fundulus ESTs for the analysis of gene expression

    Directory of Open Access Journals (Sweden)

    Kolell Kevin J

    2004-12-01

    Full Text Available Abstract Background: While studies of non-model organisms are critical for many research areas, such as evolution, development, and environmental biology, they present particular challenges for both experimental and computational genomic level research. Resources such as mass-produced microarrays and the computational tools linking these data to functional annotation at the system and pathway level are rarely available for non-model species. This type of "systems-level" analysis is critical to the understanding of patterns of gene expression that underlie biological processes. Results: We describe a bioinformatics pipeline known as FunnyBase that has been used to store, annotate, and analyze 40,363 expressed sequence tags (ESTs) from the heart and liver of the fish Fundulus heteroclitus. Primary annotations based on sequence similarity are linked to networks of systematic annotation in Gene Ontology (GO) and the Kyoto Encyclopedia of Genes and Genomes (KEGG) and can be queried and computationally utilized in downstream analyses. Steps are taken to ensure that the annotation is self-consistent and that the structure of GO is used to identify higher-level functions that may not be annotated directly. An integrated framework for cDNA library production, sequencing, quality control, expression data generation, and systems-level analysis is presented and utilized. In a case study, a set of genes that had statistically significant regression between gene expression levels and environmental temperature along the Atlantic Coast shows a statistically significant (P ... Conclusion: The methods described have application for functional genomics studies, particularly among non-model organisms. The web interface for FunnyBase can be accessed at http://genomics.rsmas.miami.edu/funnybase/super_craw4/. Data and source code are available by request at jpaschall@bioinfobase.umkc.edu.

  7. Integrating community-based verbal autopsy into civil registration and vital statistics (CRVS): system-level considerations

    Science.gov (United States)

    de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.

    2017-01-01

    Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194

  8. Autophagy Regulatory Network - a systems-level bioinformatics resource for studying the mechanism and regulation of autophagy.

    Science.gov (United States)

    Türei, Dénes; Földvári-Nagy, László; Fazekas, Dávid; Módos, Dezső; Kubisch, János; Kadlecsik, Tamás; Demeter, Amanda; Lenti, Katalin; Csermely, Péter; Vellai, Tibor; Korcsmáros, Tamás

    2015-01-01

    Autophagy is a complex cellular process having multiple roles, depending on tissue, physiological, or pathological conditions. Major post-translational regulators of autophagy are well known; however, they have not yet been collected comprehensively. The precise and context-dependent regulation of autophagy necessitates additional regulators, including transcriptional and post-transcriptional components that are listed in various datasets. Prompted by the lack of systems-level autophagy-related information, we manually collected the literature and integrated external resources to build a high-coverage autophagy database. We developed an online resource, Autophagy Regulatory Network (ARN; http://autophagy-regulation.org), to provide an integrated and systems-level database for autophagy research. ARN contains manually curated, imported, and predicted interactions of autophagy components (1,485 proteins with 4,013 interactions) in humans. We listed 413 transcription factors and 386 miRNAs that could regulate autophagy components or their protein regulators. We also connected the above-mentioned autophagy components and regulators with signaling pathways from the SignaLink 2 resource. The user-friendly website of ARN allows researchers without a computational background to search, browse, and download the database. The database can be downloaded in SQL, CSV, BioPAX, SBML, PSI-MI, and Cytoscape CYS file formats. ARN has the potential to facilitate the experimental validation of novel autophagy components and regulators. In addition, ARN helps the investigation of transcription factors, miRNAs and signaling pathways implicated in the control of the autophagic pathway. The list of such known and predicted regulators could be important in pharmacological attempts against cancer and neurodegenerative diseases.
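
    As a usage illustration of the downloadable formats listed above, the sketch below reads a hypothetical CSV export of directed interactions and indexes the regulators of each component. The file name and the "source"/"target" column names are assumptions for illustration; the actual ARN export's header should be checked before use.

    ```python
    # Minimal sketch of working with a downloaded interaction table such as a
    # CSV export. Column names "source" and "target" are assumed, not verified
    # against the real ARN file format.
    import csv
    from collections import defaultdict

    def load_regulators(path):
        """Map each node to the set of nodes that point at it (its regulators)."""
        regulators = defaultdict(set)
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                regulators[row["target"]].add(row["source"])
        return regulators

    # Example: list everything reported to act on a component of interest.
    # regs = load_regulators("arn_interactions.csv")   # hypothetical file name
    # print(sorted(regs.get("MAP1LC3B", set())))
    ```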

  9. Theory Overview

    CERN Document Server

    Lenz, Alexander

    2016-01-01

    We set the scene for theoretical issues in charm physics that were discussed at CHARM 2016 in Bologna. In particular, we emphasize the importance of improving our understanding of standard model contributions to numerous charm observables, and we also discuss possible tests of our theory tools, such as the Heavy Quark Expansion via the lifetime ratios of $D$-mesons.

  10. Scattering theory

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, Harald [Technische Univ. Muenchen, Garching (Germany). Physik-Department

    2013-08-01

    Written by the author of the widely acclaimed textbook Theoretical Atomic Physics. Includes sections on quantum reflection, tunable Feshbach resonances and Efimov states. Useful for advanced students and researchers. This book presents a concise and modern coverage of scattering theory. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. The level of abstraction is kept as low as at all possible, and deeper questions related to mathematical foundations of scattering theory are passed by. The book should be understandable for anyone with a basic knowledge of nonrelativistic quantum mechanics. It is intended for advanced students and researchers, and it is hoped that it will be useful for theorists and experimentalists alike.

  11. Matching theory

    CERN Document Server

    Plummer, MD

    1986-01-01

    This study of matching theory deals with bipartite matching and network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors, and vertex packing.

  12. Livability theory

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2014-01-01

    Abstract: Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how much we

  13. Theory U

    DEFF Research Database (Denmark)

    Monthoux, Pierre Guillet de; Statler, Matt

    2014-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer’s Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  14. Theory U

    DEFF Research Database (Denmark)

    Guillet de Monthoux, Pierre; Statler, Matt

    2017-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer's Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  15. Framing theory

    NARCIS (Netherlands)

    de Vreese, C.H.; Lecheler, S.; Mazzoleni, G.; Barnhurst, K.G.; Ikeda, K.; Maia, R.C.M.; Wessler, H.

    2016-01-01

    Political issues can be viewed from different perspectives and they can be defined differently in the news media by emphasizing some aspects and leaving others aside. This is at the core of news framing theory. Framing originates within sociology and psychology and has become one of the most used theories…

  16. Combinatorial Theory

    CERN Document Server

    Hall, Marshall

    2011-01-01

    Includes proof of van der Waerden's 1926 conjecture on permanents, Wilson's theorem on asymptotic existence, and other developments in combinatorics since 1967. Also covers coding theory and its important connection with designs, problems of enumeration, and partition. Presents fundamentals in addition to latest advances, with illustrative problems at the end of each chapter. Enlarged appendixes include a longer list of block designs.

  17. Activity Theory

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege; Bødker, Susanne

    2003-01-01

    …the young HCI research tradition. But HCI was already facing problems: lack of consideration for other aspects of human behavior, for interaction with other people, and for culture. Cognitive science-based theories lacked the means to address several issues that came out of the empirical projects…

  19. In defense of psychosomatic theory: A critical analysis of Allison and Heshka's critical analysis

    OpenAIRE

    van Strien, T.

    1995-01-01

    This article analyses Allison and Heshka's (International Journal of Eating Disorders, 13, 289–295, 1993) critical analysis of studies supporting psychosomatic theory. Questioned first is Allison and Heshka's contention that the obese overreport emotional eating as a result of demand characteristics, social desirability, and interpersonal expectancies. These effects, however, indicate that a more plausible response would be an underreport of emotional eating. Also addressed is A...

  20. Communication theory

    DEFF Research Database (Denmark)

    Stein, Irene F.; Stelter, Reinhard

    2011-01-01

    Communication theory covers a wide variety of theories related to the communication process (Littlejohn, 1999). Communication is not simply an exchange of information, in which we have a sender and a receiver. This very technical concept of communication is clearly outdated; a human being is not a data processing device. In this chapter, communication is understood as a process of shared meaning-making (Bruner, 1990). Human beings interpret their environment, other people, and themselves on the basis of their dynamic interaction with the surrounding world. Meaning is essential because people ascribe specific meanings to their experiences, their actions in life or work, and their interactions. Meaning is reshaped, adapted, and transformed in every communication encounter. Furthermore, meaning is cocreated in dialogues or in communities of practice, such as in teams at a workplace or in school…

  1. Potential theory

    CERN Document Server

    Helms, Lester L

    2014-01-01

    Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...

  2. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2014-01-01

    This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...

  3. Operator theory

    CERN Document Server

    2015-01-01

    A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.

  4. Graph theory

    CERN Document Server

    Diestel, Reinhard

    2017-01-01

    This standard textbook of modern graph theory, now in its fifth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: “This outstanding book cannot be substituted with any other book on the present textbook market. It has every chance of becoming the standard textbook for graph theory.” (Acta Scientiarum Mathematicarum) “Deep, clear, wonderful. This is a serious book about the heart of graph theory. It has depth and integrity.” (Persi Diaconis & Ron Graham, SIAM Review) “The book has received a very enthusiastic reception, which it amply deserves. A masterly elucidation of modern graph theo...

  5. Scattering theory

    CERN Document Server

    Friedrich, Harald

    2016-01-01

    This corrected and updated second edition of "Scattering Theory" presents a concise and modern coverage of the subject. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. The book contains sections on special topics such as near-threshold quantization, quantum reflection, Feshbach resonances and the quantum description of scattering in two dimensions. The level of abstraction is k...

  6. Regulated Extracellular Choline Acetyltransferase Activity- The Plausible Missing Link of the Distant Action of Acetylcholine in the Cholinergic Anti-Inflammatory Pathway.

    Directory of Open Access Journals (Sweden)

    Swetha Vijayaraghavan

    Full Text Available Acetylcholine (ACh), the classical neurotransmitter, also affects a variety of nonexcitable cells, such as endothelia, microglia, astrocytes and lymphocytes in both the nervous system and secondary lymphoid organs. Most of these cells are very distant from cholinergic synapses. The action of ACh on these distant cells is unlikely to occur through diffusion, given that ACh is very short-lived in the presence of acetylcholinesterase (AChE) and butyrylcholinesterase (BuChE), two extremely efficient ACh-degrading enzymes abundantly present in extracellular fluids. In this study, we show compelling evidence for the presence of a high concentration and activity of the ACh-synthesizing enzyme, choline-acetyltransferase (ChAT), in human cerebrospinal fluid (CSF) and plasma. We show that ChAT levels are physiologically balanced to the levels of its counteracting enzymes, AChE and BuChE, in human plasma and CSF. Equilibrium analyses show that soluble ChAT maintains a steady-state ACh level in the presence of physiological levels of fully active ACh-degrading enzymes. We show that ChAT is secreted by cultured human-brain astrocytes, and that activated spleen lymphocytes release ChAT itself rather than ACh. We further report differential CSF levels of ChAT in relation to Alzheimer's disease risk genotypes, as well as in patients with multiple sclerosis, a chronic neuroinflammatory disease, compared to controls. Interestingly, soluble CSF ChAT levels show strong correlation with soluble complement factor levels, supporting a role in inflammatory regulation. This study provides a plausible explanation for the long-distance action of ACh through continuous renewal of ACh in extracellular fluids by soluble ChAT, and thereby maintenance of a steady-state equilibrium between hydrolysis and synthesis of this ubiquitous cholinergic signal substance in the brain and peripheral compartments. These findings may have important implications for the role of cholinergic

  7. Biological plausibility as a tool to associate analytical data for micropollutants and effect potentials in wastewater, surface water, and sediments with effects in fishes.

    Science.gov (United States)

    Maier, Diana; Blaha, Ludek; Giesy, John P; Henneberg, Anja; Köhler, Heinz-R; Kuch, Bertram; Osterauer, Raphaela; Peschke, Katharina; Richter, Doreen; Scheurer, Marco; Triebskorn, Rita

    2015-04-01

    …of micronuclei in erythrocytes of chub from the river. Chemicals potentially responsible for effects on DNA were identified. Embryotoxic effects on zebrafish (Danio rerio), investigated in the laboratory, were associated with embryotoxic effects in trout exposed in streamwater bypass systems at the two rivers. In general, responses at all levels of organization were more pronounced in samples from the Schussen than in those from the Argen. These results are consistent with the magnitudes of chemical pollution in these two streams. Plausibility chains to establish causality between exposures and effects and to predict effects in biota in the river from studies in the laboratory are discussed.

  8. Determination of nitric oxide metabolites, nitrate and nitrite, in Anopheles culicifacies mosquito midgut and haemolymph by anion exchange high-performance liquid chromatography: plausible mechanism of refractoriness

    Directory of Open Access Journals (Sweden)

    Adak Tridibesh

    2008-04-01

    Full Text Available Abstract Background: The diverse physiological and pathological roles of nitric oxide in innate immune defenses against many intra- and extracellular pathogens have led to the development of various methods for determining nitric oxide (NO) synthesis. The NO metabolites nitrite (NO2−) and nitrate (NO3−) are produced by the action of an inducible Anopheles culicifacies NO synthase (AcNOS) in mosquito mid-guts and may be central to the anti-parasitic arsenal of these mosquitoes. Method: While exploring a plausible mechanism of refractoriness based on nitric oxide synthase physiology among the sibling species of An. culicifacies, a sensitive, specific and cost-effective high-performance liquid chromatography (HPLC) method, not influenced by the presence of biogenic amines, was developed for the determination of NO2− and NO3− from mosquito mid-guts and haemolymph. Results: This method is based on extraction efficiency, assay reproducibility and contaminant minimization. It entails de-proteinization by centrifugal ultrafiltration through an Ultracel 3K filter and analysis by high-performance anion-exchange liquid chromatography (Sphereclone 5 μ SAX column) with UV detection at 214 nm. The lower detection limit of the assay procedure is 50 pmol in all midgut and haemolymph samples. Retention times for NO2− and NO3− in standards and in mid-gut samples were 3.42 and 4.53 min, respectively. Assay linearity for standards ranged between 50 nM and 1 mM. Recoveries of NO2− and NO3− from spiked samples (1–100 μM) and from the extracted standards (1–100 μM) were calculated to be 100%. Intra-assay and inter-assay variations and relative standard deviations (RSDs) for NO2− and NO3− in spiked and un-spiked midgut samples were 5.7% or less (the recovery and RSD computations are sketched below). Increased levels of NO2− and NO3− in midguts and haemolymph of An. culicifacies sibling species B in comparison to species A point towards a mechanism of refractoriness based on AcNOS physiology. Conclusion: HPLC is a sensitive
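
    The recovery and precision figures quoted above follow from two standard formulas; the sketch below shows those computations with invented example readings, not data from the study.

    ```python
    # Standard spike-recovery and relative-standard-deviation calculations.
    # All numeric values here are made-up illustrations.
    import statistics

    def percent_recovery(measured_spiked, measured_unspiked, amount_spiked):
        """Recovery (%) = (spiked reading - unspiked reading) / amount added * 100."""
        return (measured_spiked - measured_unspiked) / amount_spiked * 100.0

    def relative_std_dev(replicates):
        """RSD (%) = sample standard deviation / mean * 100."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

    print(percent_recovery(measured_spiked=12.0, measured_unspiked=2.0,
                           amount_spiked=10.0))              # 100.0 (% recovery)
    print(round(relative_std_dev([9.6, 10.1, 10.3, 9.9]), 2))  # intra-assay RSD, %
    ```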

  9. System-level change in cultural and linguistic competence (CLC): how changes in CLC are related to service experience outcomes in systems of care.

    Science.gov (United States)

    Barksdale, Crystal L; Ottley, Phyllis Gyamfi; Stephens, Robert; Gebreselassie, Tesfayi; Fua, Imogen; Azur, Melissa; Walrath-Greene, Christine

    2012-06-01

    As US demographic trends shift toward more diversity, it becomes increasingly necessary to address differential needs of diverse groups of youth in mental health service systems. Cultural and linguistic competence (CLC) is essential to providing the most appropriate mental health services to youth and their families. The successful implementation of CLC often begins at the system level. Though various factors may affect change and system-level factors set the tone for broad acceptance of CLC within systems, there is limited empirical evidence linking culturally competent practices to outcomes. The purpose of the present study was to examine system-level CLC changes over time within systems of care and their associations with service experiences among youth and their families. Participants were 4,512 youth and their families enrolled in the national evaluation of the Children's Mental Health Initiative (CMHI). Results suggest that implementation of CLC at the system level improves over time in funded systems of care. Further, variation exists in specific system-level components of CLC. In addition, the changes in CLC at the system level are related to family/caregiver participation in treatment. Implications for supporting positive changes in CLC among systems of care communities, and specific strategies for community psychologists, are discussed.

  10. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    Science.gov (United States)

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

    Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example.
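
    The "simple simulation example" mentioned above can be caricatured in a short sketch: per-component failure probabilities, of the kind one might derive from technical assistance reviews and drills, are combined into an end-to-end reliability estimate for a serial response chain. The component names and probabilities here are invented, not taken from the report.

    ```python
    # Hedged sketch: Monte Carlo estimate of end-to-end success for a serial
    # response process, cross-checked against the closed-form series formula.
    import random

    FAILURE_PROBS = {          # per-activation failure probability (invented)
        "request_and_approval": 0.02,
        "stockpile_delivery":   0.05,
        "local_distribution":   0.10,
        "dispensing_to_public": 0.15,
    }

    def simulate_response(trials=100_000, probs=FAILURE_PROBS, seed=1):
        """Fraction of trials in which every component succeeds."""
        rng = random.Random(seed)
        successes = sum(
            all(rng.random() >= p for p in probs.values()) for _ in range(trials)
        )
        return successes / trials

    analytic = 1.0
    for p in FAILURE_PROBS.values():
        analytic *= 1.0 - p    # series-system reliability: product of successes

    print(f"simulated={simulate_response():.3f}  analytic={analytic:.3f}")
    ```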

  11. Systems biology of the cell cycle of Saccharomyces cerevisiae: From network mining to system-level properties.

    Science.gov (United States)

    Alberghina, Lilia; Coccetti, Paola; Orlandi, Ivan

    2009-01-01

    Following a brief description of the operational procedures of systems biology (SB), the cell cycle of budding yeast is discussed as a successful example of a top-down SB analysis. After the reconstruction of the steps that have led to the identification of a sizer plus timer network in the G1 to S transition, it is shown that basic functions of the cell cycle (the setting of the critical cell size and the accuracy of DNA replication) are system-level properties, detected only by integrating molecular analysis with modelling and simulation of their underlying networks. A detailed network structure of a second relevant regulatory step of the cell cycle, the exit from mitosis, derived from extensive data mining, is constructed and discussed. To reach a quantitative understanding of how nutrients control, through signalling, metabolism and transcription, cell growth and cycle is a very relevant aim of SB. Since we know that about 900 gene products are required for cell cycle execution and control in budding yeast, it is quite clear that a purely systematic approach would require too much time. Therefore lines for a modular SB approach, which prioritises molecular and computational investigations for faster cell cycle understanding, are proposed. The relevance of the insight coming from the cell cycle SB studies in developing a new framework for tackling very complex biological processes, such as cancer and aging, is discussed.

  12. A systems level strategy for analyzing the cell death network: implication in exploring the apoptosis/autophagy connection.

    Science.gov (United States)

    Zalckvar, E; Yosef, N; Reef, S; Ber, Y; Rubinstein, A D; Mor, I; Sharan, R; Ruppin, E; Kimchi, A

    2010-08-01

    The mammalian cell death network comprises three distinct functional modules: apoptosis, autophagy and programmed necrosis. Currently, the field lacks systems-level approaches to assess the extent to which the intermodular connectivity affects cell death performance. Here, we developed a platform that is based on single and double sets of RNAi-mediated perturbations targeting combinations of apoptotic and autophagic genes. The outcome of perturbations is measured both at the level of the overall cell death responses, using an unbiased quantitative reporter, and by assessing the molecular responses within the different functional modules. Epistatic analyses determine whether seemingly unrelated pairs of proteins are genetically linked. The initial running of this platform in etoposide-treated cells, using a few single and double perturbations, identified several levels of connectivity between apoptosis and autophagy. The knockdown of caspase3 turned on a switch toward autophagic cell death, which requires Atg5 or Beclin-1. In addition, a reciprocal connection between these two autophagic genes and apoptosis was identified. By applying computational tools based on mining the protein-protein interaction database, a novel biochemical pathway connecting Atg5 and caspase3 is suggested. Scaling up this platform into hundreds of perturbations potentially has a wide, general scope of applicability and will provide the basis for future modeling of the cell death network.
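
    One common way to score epistatic analyses of the kind described above is a multiplicative null model: the interaction score is the deviation of the observed double-perturbation phenotype from the product of the single-perturbation phenotypes. The abstract does not state which formulation the study used, so the sketch below is illustrative, with invented readouts.

    ```python
    # Multiplicative epistasis score: epsilon = W_ab - W_a * W_b.
    # A value near zero means the double perturbation behaves as expected from
    # the singles; a large deviation suggests a genetic interaction.

    def epistasis_score(w_a, w_b, w_ab):
        """Deviation of the observed double-perturbation value from W_a * W_b."""
        return w_ab - w_a * w_b

    # Cell-death reporter readouts normalized to control (hypothetical values):
    w_casp3, w_atg5, w_double = 0.60, 0.85, 0.80
    eps = epistasis_score(w_casp3, w_atg5, w_double)
    print(f"epsilon = {eps:+.2f}")   # positive here: the pair buffers one another
    ```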

  13. An Innovative Hybrid 3D Analytic-Numerical Approach for System Level Modelling of PEM Fuel Cells

    Directory of Open Access Journals (Sweden)

    Gregor Tavčar

    2013-10-01

    Full Text Available The PEM fuel cell model presented in this paper is based on modelling species transport and coupling electrochemical reactions to species transport in an innovative way. Species transport is modelled by obtaining a 2D analytic solution for species concentration distribution in the plane perpendicular to the gas-flow and coupling consecutive 2D solutions by means of a 1D numerical gas-flow model. The 2D solution is devised on a jigsaw puzzle of multiple coupled domains which enables the modelling of parallel straight channel fuel cells with realistic geometries. Electrochemical and other nonlinear phenomena are coupled to the species transport by a routine that uses derivative approximation with prediction-iteration. A hybrid 3D analytic-numerical fuel cell model of a laboratory test fuel cell is presented and evaluated against a professional 3D computational fluid dynamic (CFD simulation tool. This comparative evaluation shows very good agreement between results of the presented model and those of the CFD simulation. Furthermore, high accuracy results are achieved at computational times short enough to be suitable for system level simulations. This computational efficiency is owed to the semi-analytic nature of its species transport modelling and to the efficient computational coupling of electrochemical kinetics and species transport.
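
    The coupling idea described above, consecutive cross-section solutions linked by a 1D gas-flow model, can be caricatured in a few lines: the sketch marches down the channel and depletes the oxygen molar flow segment by segment, with a constant current density standing in for the paper's 2D analytic cross-section solution and electrochemical coupling. All parameter values are invented.

    ```python
    # Much-simplified sketch of the 1D marching side of such a hybrid scheme.
    # Per segment, O2 consumption follows Faraday's law: i * A / (4 F) mol/s.

    F = 96485.0  # Faraday constant, C/mol

    def march_channel(n_segments=10, molar_flow_o2=2e-5, seg_area=1e-4,
                      current_density=5000.0):
        """Deplete the O2 molar flow along the channel, segment by segment."""
        flows = [molar_flow_o2]
        for _ in range(n_segments):
            consumed = current_density * seg_area / (4.0 * F)  # mol/s per segment
            flows.append(max(flows[-1] - consumed, 0.0))
        return flows

    for i, f in enumerate(march_channel()):
        print(f"segment {i:2d}: O2 molar flow = {f:.2e} mol/s")
    ```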

  14. Combining metagenomics, metatranscriptomics and viromics to explore novel microbial interactions: towards a systems-level understanding of human microbiome.

    Science.gov (United States)

    Bikel, Shirley; Valdez-Lara, Alejandra; Cornejo-Granados, Fernanda; Rico, Karina; Canizales-Quinteros, Samuel; Soberón, Xavier; Del Pozo-Yauner, Luis; Ochoa-Leyva, Adrián

    2015-01-01

    The advances in experimental methods and the development of high-performance bioinformatic tools have substantially improved our understanding of microbial communities associated with human niches. Many studies have documented that changes in the microbial abundance and composition of the human microbiome are associated with health and disease states. The majority of research on the human microbiome typically focuses on one level of biological information, i.e., metagenomics or metatranscriptomics. In this review, we describe some of the different experimental and bioinformatic strategies applied to analyze 16S rRNA gene profiling and shotgun sequencing data of the human microbiome. We also discuss how some of the recent insights from the combination of metagenomics, metatranscriptomics and viromics can provide a more detailed description of the interactions between microorganisms and viruses in oral and gut microbiomes. Recent studies on viromics have begun to gain importance due to the potential involvement of viruses in microbial dysbiosis. In addition, metatranscriptomic analysis combined with metagenomic analysis has shown that a substantial fraction of microbial transcripts can be differentially regulated relative to their microbial genomic abundances, as the sketch below illustrates. Thus, understanding the molecular interactions in the microbiome using the combination of metagenomics, metatranscriptomics and viromics is one of the main challenges towards a systems-level understanding of the human microbiome.
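
    A minimal sketch of that RNA-versus-DNA comparison: a gene's metatranscriptomic abundance is judged against its metagenomic abundance, so a log2 ratio away from zero flags transcription regulated beyond what genome copy number alone would predict. The gene names and abundances below are invented.

    ```python
    # log2(RNA relative abundance / DNA relative abundance), with a small
    # pseudocount to guard against zero abundances. Values are illustrative.
    import math

    def log2_expression_ratio(rna_abundance, dna_abundance, pseudo=1e-9):
        """log2 of metatranscriptomic over metagenomic relative abundance."""
        return math.log2((rna_abundance + pseudo) / (dna_abundance + pseudo))

    samples = {"geneA": (0.040, 0.010), "geneB": (0.005, 0.020)}
    for gene, (rna, dna) in samples.items():
        print(gene, f"{log2_expression_ratio(rna, dna):+.2f}")
    # geneA +2.00  (transcribed above its genomic abundance)
    # geneB -2.00  (transcribed below its genomic abundance)
    ```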

  15. Residuation theory

    CERN Document Server

    Blyth, T S; Sneddon, I N; Stark, M

    1972-01-01

    Residuation Theory aims to contribute to literature in the field of ordered algebraic structures, especially on the subject of residual mappings. The book is divided into three chapters. Chapter 1 focuses on ordered sets; directed sets; semilattices; lattices; and complete lattices. Chapter 2 tackles Baer rings; Baer semigroups; Foulis semigroups; residual mappings; the notion of involution; and Boolean algebras. Chapter 3 covers residuated groupoids and semigroups; group homomorphic and isotone homomorphic Boolean images of ordered semigroups; Dubreil-Jacotin and Brouwer semigroups; and loli

  16. Graph theory

    CERN Document Server

    Diestel, Reinhard

    2012-01-01

    This standard textbook of modern graph theory, now in its fourth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. Review: "Deep, clear, wonderful. This is a serious book about the

  17. Design theory

    CERN Document Server

    2009-01-01

    This book deals with the basic subjects of design theory. It begins with balanced incomplete block designs, various constructions of which are described in ample detail. In particular, finite projective and affine planes, difference sets and Hadamard matrices, as tools to construct balanced incomplete block designs, are included. Orthogonal latin squares are also treated in detail. Zhu's simpler proof of the falsity of Euler's conjecture is included. The construction of some classes of balanced incomplete block designs, such as Steiner triple systems and Kirkman triple systems, are also given.

  18. Communication theory

    CERN Document Server

    Goldie, Charles M

    1991-01-01

    This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.

  19. Graph theory

    CERN Document Server

    Merris, Russell

    2001-01-01

    A lively invitation to the flavor, elegance, and power of graph theory. This mathematically rigorous introduction is tempered and enlivened by numerous illustrations, revealing examples, seductive applications, and historical references. An award-winning teacher, Russ Merris has crafted a book designed to attract and engage through its spirited exposition, a rich assortment of well-chosen exercises, and a selection of topics that emphasizes the kinds of things that can be manipulated, counted, and pictured. Intended neither to be a comprehensive overview nor an encyclopedic reference, th...

  20. Quantum gravity from descriptive set theory

    Energy Technology Data Exchange (ETDEWEB)

    El Naschie, M.S

    2004-03-01

    We start from Hilbert's criticism of the axioms of classical geometry and the possibility of abandoning the Archimedean axiom. Subsequently we proceed to the physical possibility of a fundamental limitation on the smallest length, connected to certain singular points in spacetime, below which measurements become meaningless. Finally we arrive at the conclusion that maximising the Hawking-Bekenstein informational content of spacetime makes the existence of a transfinite geometry for physical 'spacetime' not only plausible but probably inevitable. The main part of the paper is then concerned with a proposal for a mathematical description of a transfinite, non-Archimedean geometry using descriptive set theory. Nevertheless, and despite all abstract mathematics, we remain quite close to similar lines of investigation initiated by physicists like J. A. Wheeler, D. Finkelstein and G. 't Hooft. In particular we introduce a logarithmic gauge transformation linking classical gravity with the electroweak interaction via a version of informational entropy. That way we may claim to have accomplished an important step towards a general theory of quantum gravity using $\varepsilon^{(\infty)}$ and complexity theory, finding that $\alpha_G = 2^{\bar{\alpha}_{ew}-1} \cong 1.7 \times 10^{38}$, where $\alpha_G$ is the dimensionless Newton gravity constant and $\alpha_{ew} \approx 128$ is the fine structure constant at the electroweak scale.
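
    Taking the quoted $\alpha_{ew} \approx 128$ for the barred exponent, the reconstructed relation can be checked by direct arithmetic:

    $$\alpha_G = 2^{\bar{\alpha}_{ew} - 1} = 2^{127} \approx 1.70 \times 10^{38},$$

    in agreement with the value $1.7 \times 10^{38}$ given in the abstract.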

  1. Alternative conceptions of memory consolidation and the role of the hippocampus at the systems level in rodents.

    Science.gov (United States)

    Sutherland, R J; Lehmann, H

    2011-06-01

    We discuss very recent experiments with rodents addressing the idea that long-term memories initially depending on the hippocampus, over a prolonged period, become independent of it. No unambiguous recent evidence exists to substantiate that this occurs. Most experiments find that recent and remote memories are equally affected by hippocampus damage. Nearly all experiments that report spared remote memories suffer from two problems: retrieval could be based upon substantial regions of spared hippocampus and recent memory is tested at intervals that are of the same order of magnitude as cellular consolidation. Accordingly, we point the way beyond systems consolidation theories, both the Standard Model of Consolidation and the Multiple Trace Theory, and propose a simpler multiple storage site hypothesis. On this view, with event reiterations, different memory representations are independently established in multiple networks. Many detailed memories always depend on the hippocampus; the others may be established and maintained independently.

  2. Characterizations of Hausdorff L-closure spaces obtained by plausible reasoning

    Institute of Scientific and Technical Information of China (English)

    李生刚; 陆汉川; 伏文清; 黄秦安

    2011-01-01

    Aim: To demonstrate the applications of plausible reasoning in mathematical learning and research. Methods: Analogy, induction, and generalization. Results: Some characterizations of Hausdorff L-closure spaces are obtained. Conclusion: Some conclusions in general topology are generalized, and plausible reasoning is shown to be of much importance in mathematical learning.

  3. System-level considerations for the front-end readout ASIC in the CBM experiment from the power supply perspective

    Science.gov (United States)

    Kasinski, K.; Koczon, P.; Ayet, S.; Löchner, S.; Schmidt, C. J.

    2017-03-01

    New fixed target experiments using high intensity beams with energy up to 10 AGeV from the SIS100 synchrotron presently being constructed at FAIR/GSI are under preparation. Most of the readout electronics and power supplies are expected to be exposed to a very high flux of nuclear reaction products and have to be radiation tolerant up to 3 MRad (TID) and sustain up to $10^{14}$/cm$^2$ of 1 MeV neutron equivalent over their lifetime. Moreover, the mostly minimum ionising particles under investigation leave very little signal in the sensors. Therefore, very low-noise amplitude measurements are required of the front-end electronics for effective tracking. Sensor and interconnecting micro-cable capacitance and series resistance, in conjunction with the intrinsic noise of the charge-sensitive amplifier, are the dominant noise sources in the system. However, the single-ended architecture of the amplifiers employed for the charge-processing channels implies a potential problem with noise contributions from power supply sources. Strict system-level constraints leave very little freedom in selecting a power supply structure optimal with respect to power efficiency, cooling capabilities and power density on modules, but also noise injection into the front-end via the power supply lines. The design of the power supply and distribution system of the Silicon Tracking System in the CBM experiment, together with details on the front-end ASICs (STS-XYTER2) and measurement results for the power supply and conditioning electronics (selected DC/DC converter and LDO regulators), are presented.

  4. Two-component signal transduction pathways regulating growth and cell cycle progression in a bacterium: a system-level analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey M Skerker

    2005-10-01

    Full Text Available Two-component signal transduction systems, comprised of histidine kinases and their response regulator substrates, are the predominant means by which bacteria sense and respond to extracellular signals. These systems allow cells to adapt to prevailing conditions by modifying cellular physiology, including initiating programs of gene expression, catalyzing reactions, or modifying protein-protein interactions. These signaling pathways have also been demonstrated to play a role in coordinating bacterial cell cycle progression and development. Here we report a system-level investigation of two-component pathways in the model organism Caulobacter crescentus. First, by a comprehensive deletion analysis we show that at least 39 of the 106 two-component genes are required for cell cycle progression, growth, or morphogenesis. These include nine genes essential for growth or viability of the organism. We then use a systematic biochemical approach, called phosphotransfer profiling, to map the connectivity of histidine kinases and response regulators. Combining these genetic and biochemical approaches, we identify a new, highly conserved essential signaling pathway from the histidine kinase CenK to the response regulator CenR, which plays a critical role in controlling cell envelope biogenesis and structure. Depletion of either cenK or cenR leads to an unusual, severe blebbing of cell envelope material, whereas constitutive activation of the pathway compromises cell envelope integrity, resulting in cell lysis and death. We propose that the CenK-CenR pathway may be a suitable target for new antibiotic development, given previous successes in targeting the bacterial cell wall. Finally, the ability of our in vitro phosphotransfer profiling method to identify signaling pathways that operate in vivo takes advantage of an observation that histidine kinases are endowed with a global kinetic preference for their cognate response regulators. We propose that this

  5. Two-Component Signal Transduction Pathways Regulating Growth and Cell Cycle Progression in a Bacterium: A System-Level Analysis

    Science.gov (United States)

    Skerker, Jeffrey M; Prasol, Melanie S; Perchuk, Barrett S; Biondi, Emanuele G

    2005-01-01

    Two-component signal transduction systems, comprised of histidine kinases and their response regulator substrates, are the predominant means by which bacteria sense and respond to extracellular signals. These systems allow cells to adapt to prevailing conditions by modifying cellular physiology, including initiating programs of gene expression, catalyzing reactions, or modifying protein–protein interactions. These signaling pathways have also been demonstrated to play a role in coordinating bacterial cell cycle progression and development. Here we report a system-level investigation of two-component pathways in the model organism Caulobacter crescentus. First, by a comprehensive deletion analysis we show that at least 39 of the 106 two-component genes are required for cell cycle progression, growth, or morphogenesis. These include nine genes essential for growth or viability of the organism. We then use a systematic biochemical approach, called phosphotransfer profiling, to map the connectivity of histidine kinases and response regulators. Combining these genetic and biochemical approaches, we identify a new, highly conserved essential signaling pathway from the histidine kinase CenK to the response regulator CenR, which plays a critical role in controlling cell envelope biogenesis and structure. Depletion of either cenK or cenR leads to an unusual, severe blebbing of cell envelope material, whereas constitutive activation of the pathway compromises cell envelope integrity, resulting in cell lysis and death. We propose that the CenK–CenR pathway may be a suitable target for new antibiotic development, given previous successes in targeting the bacterial cell wall. Finally, the ability of our in vitro phosphotransfer profiling method to identify signaling pathways that operate in vivo takes advantage of an observation that histidine kinases are endowed with a global kinetic preference for their cognate response regulators. We propose that this system

  6. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling.

    Science.gov (United States)

    Kane, Patrick; Zollman, Kevin J S

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the "hybrid equilibrium," to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith's Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory.

  7. SAM Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Rui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  8. Graph theory

    CERN Document Server

    Diestel, Reinhard

    2000-01-01

    This book is a concise, yet carefully written, introduction to modern graph theory, covering all its major recent developments. It can be used both as a reliable textbook for an introductory course and as a graduate text: on each topic it covers all the basic material in full detail, and adds one or two deeper results (again with detailed proofs) to illustrate the more advanced methods of that field. This second edition extends the first in two ways. It offers a thoroughly revised and updated chapter on graph minors, which now includes full new proofs of two of the central Robertson-Seymour theorems (as well as a detailed sketch of the entire proof of their celebrated Graph Minor Theorem). Second, there is now a section of hints for all the exercises, to enhance their value for both individual study and classroom use.

  9. Implausibility of the vibrational theory of olfaction.

    Science.gov (United States)

    Block, Eric; Jang, Seogjoo; Matsunami, Hiroaki; Sekharan, Sivakumar; Dethier, Bérénice; Ertem, Mehmed Z; Gundala, Sivaji; Pan, Yi; Li, Shengju; Li, Zhen; Lodge, Stephene N; Ozbil, Mehmet; Jiang, Huihong; Penalba, Sonia F; Batista, Victor S; Zhuang, Hanyi

    2015-05-26

    The vibrational theory of olfaction assumes that electron transfer occurs across odorants at the active sites of odorant receptors (ORs), serving as a sensitive measure of odorant vibrational frequencies, ultimately leading to olfactory perception. A previous study reported that human subjects differentiated hydrogen/deuterium isotopomers (isomers with isotopic atoms) of the musk compound cyclopentadecanone as evidence supporting the theory. Here, we find no evidence for such differentiation at the molecular level. In fact, we find that the human musk-recognizing receptor, OR5AN1, identified using a heterologous OR expression system and robustly responding to cyclopentadecanone and muscone, fails to distinguish isotopomers of these compounds in vitro. Furthermore, the mouse (methylthio)methanethiol-recognizing receptor, MOR244-3, as well as other selected human and mouse ORs, responded similarly to normal, deuterated, and $^{13}$C isotopomers of their respective ligands, paralleling our results with the musk receptor OR5AN1. These findings suggest that the proposed vibration theory does not apply to the human musk receptor OR5AN1, mouse thiol receptor MOR244-3, or other ORs examined. Also, contrary to the vibration theory predictions, muscone-d30 lacks the 1,380- to 1,550-cm$^{-1}$ IR bands claimed to be essential for musk odor. Furthermore, our theoretical analysis shows that the proposed electron transfer mechanism of the vibrational frequencies of odorants could be easily suppressed by quantum effects of nonodorant molecular vibrational modes. These and other concerns about electron transfer at ORs, together with our extensive experimental data, argue against the plausibility of the vibration theory.

  10. Unified-theory-of-reinforcement neural networks do not simulate the blocking effect.

    Science.gov (United States)

    Calvin, Nicholas T; J McDowell, J

    2015-11-01

    For the last 20 years the unified theory of reinforcement (Donahoe et al., 1993) has been used to develop computer simulations to evaluate its plausibility as an account for behavior. The unified theory of reinforcement states that operant and respondent learning occurs via the same neural mechanisms. As part of a larger project to evaluate the operant behavior predicted by the theory, this project was the first replication of neural network models based on the unified theory of reinforcement. In the process of replicating these neural network models it became apparent that a previously published finding, namely, that the networks simulate the blocking phenomenon (Donahoe et al., 1993), was a misinterpretation of the data. We show that the apparent blocking produced by these networks is an artifact of the inability of these networks to generate the same conditioned response to multiple stimuli. The piecemeal approach to evaluate the unified theory of reinforcement via simulation is critiqued and alternatives are discussed.

  11. General Theories of Regulation

    NARCIS (Netherlands)

    Hertog, J.A. den

    1999-01-01

    This chapter makes a distinction between three types of theories of regulation: public interest theories, the Chicago theory of regulation, and the public choice theories. The Chicago theory is mainly directed at the explanation of economic regulation; public interest theories and public choice theories…

  12. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series

    Directory of Open Access Journals (Sweden)

    Mittman Brian S

    2008-05-01

    Full Text Available Abstract Background: The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, in particular related to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers. The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI. Strategic approach to organizational change: QUERI used an evidence-based organizational framework focused on three contextual elements: 1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; 2) capacity, in this case among researchers and key partners to engage in implementation research; and 3) supportive infrastructures to reinforce expectations for change and to sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery. Conclusion: QUERI's experience and success provide a case study in organizational change. It demonstrates that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. VA's commitment to QUERI came in the

  13. Generalized energy conditions in Extended Theories of Gravity

    CERN Document Server

    Capozziello, Salvatore; Mimoso, José P

    2014-01-01

    Theories of physics can be considered viable if the initial value problem and the energy conditions are formulated self-consistently. The former allow a uniquely determined dynamical evolution of the system, and the latter guarantee that causality is preserved and that "plausible" physical sources have been considered. In this work, we consider the further degrees of freedom related to curvature invariants and scalar fields in Extended Theories of Gravity (ETG). These new degrees of freedom can be recast as effective perfect fluids that carry different meanings with respect to the standard matter fluids generally adopted as sources of the field equations. It is thus somewhat misleading to apply the standard general relativistic energy conditions to this effective energy-momentum, as the latter contains the matter content and a geometrical quantity, which arises from the ETG considered. Here, we explore this subtlety, extending on previous work, in particular, to cases with the contracted Bianchi identities wi...

  14. Integrating pro-environmental behavior with transportation network modeling: User and system level strategies, implementation, and evaluation

    Science.gov (United States)

    Aziz, H. M. Abdul

    Personal transport is a leading contributor to fossil fuel consumption and greenhouse gas (GHG) emissions in the U.S. The U.S. Energy Information Administration (EIA) reports that light-duty vehicles (LDV) were responsible for 61% of all transportation-related energy consumption in 2012, equivalent to 8.4 million barrels of oil (fossil fuel) per day. The carbon content of fossil fuels is the primary source of the GHG emissions linked to the challenge of climate change. It is therefore high time to develop actionable and innovative strategies to reduce fuel consumption and GHG emissions from road transportation networks. This dissertation integrates the broader goal of minimizing energy and emissions into the transportation planning process using novel systems modeling approaches. This research aims to find, investigate, and evaluate strategies that minimize carbon-based fuel consumption and emissions for a transportation network. We propose user- and system-level strategies that can influence travel decisions and reinforce the pro-environmental attitudes of road users. Further, we develop strategies that system operators can implement to optimize traffic operations with an emissions minimization goal. To complete the framework, we develop an integrated traffic-emissions (EPA-MOVES) simulation framework that can assess the effectiveness of the strategies with computational efficiency and reasonable accuracy. The dissertation begins by exploring the trade-off between emissions and travel time in the context of daily travel decisions and its heterogeneous nature. Data are collected from a web-based survey, and trade-off values, indicating the average additional travel minutes a person is willing to accept to reduce a lb. of GHG emissions, are estimated from random parameter models. Results indicate different trade-off values for male and female groups. Further, participants from high-income households are found to have higher trade-off values...
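
    As an illustration of how such trade-off values are typically recovered from a random parameter (mixed logit) model, the sketch below takes the ratio of marginal disutilities; the coefficient values and distribution are hypothetical, not the dissertation's estimates.

        import numpy as np

        # Hypothetical coefficients from a random-parameter choice model.
        beta_time = -0.12   # disutility per additional travel minute
        beta_ghg = -0.45    # disutility per additional lb of GHG emitted

        # Trade-off: extra minutes accepted to avoid 1 lb of GHG
        # = ratio of marginal disutilities.
        print(f"{beta_ghg / beta_time:.2f} min per lb of GHG avoided")

        # Random parameters make the trade-off heterogeneous across people:
        # simulate its distribution by drawing beta_time from its density.
        rng = np.random.default_rng(0)
        draws = beta_ghg / rng.normal(-0.12, 0.03, 10_000)
        print(f"median {np.median(draws):.2f}, "
              f"90th percentile {np.percentile(draws, 90):.2f} min/lb")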

  15. An Estimation of Turbulent Kinetic Energy and Energy Dissipation Rate Based on Atmospheric Boundary Layer Similarity Theory

    Science.gov (United States)

    Han, Jongil; Arya, S. Pal; Shaohua, Shen; Lin, Yuh-Lang; Proctor, Fred H. (Technical Monitor)

    2000-01-01

    Algorithms are developed to extract atmospheric boundary layer profiles for turbulence kinetic energy (TKE) and energy dissipation rate (EDR), with data from a meteorological tower as input. The profiles are based on similarity theory and scalings for the atmospheric boundary layer. The calculated profiles of EDR and TKE are required to match the observed values at 5 and 40 m. The algorithms are coded for operational use and yield plausible profiles over the diurnal variation of the atmospheric boundary layer.
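
    A minimal sketch of the kind of similarity-theory scaling involved, restricted to the neutral surface layer (the operational algorithms additionally handle stability dependence and force agreement with the 5 m and 40 m tower observations):

        import numpy as np

        KAPPA = 0.4  # von Karman constant

        def neutral_surface_layer(z, u_star):
            """Neutral surface-layer scalings:
            EDR eps(z) = u*^3 / (kappa * z); TKE ~ 5.5 u*^2, roughly
            height-independent near the surface."""
            z = np.asarray(z, dtype=float)
            edr = u_star**3 / (KAPPA * z)
            tke = np.full_like(z, 5.5 * u_star**2)
            return tke, edr

        heights = np.array([5.0, 10.0, 20.0, 40.0])  # meters
        tke, edr = neutral_surface_layer(heights, u_star=0.35)
        for z, e, d in zip(heights, tke, edr):
            print(f"z={z:5.1f} m  TKE={e:.3f} m2/s2  EDR={d:.5f} m2/s3")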

  16. THEORIES OF CORPORATE GOVERNANCE

    Directory of Open Access Journals (Sweden)

    Sorin Nicolae BORLEA

    2013-03-01

    Full Text Available This study attempts to provide a theoretical framework for the corporate governance debate. The review of various corporate governance theories underscores the major objective of corporate governance: maximizing value for shareholders while ensuring good social and environmental performance. Theories of corporate governance are rooted in agency theory, with the implications of moral hazard theory, developing further within stewardship theory and stakeholder theory and evolving into resource dependence theory, transaction cost theory, and political theory. Ethics theory, information asymmetry theory, and the theory of efficient markets were added later. These theories are defined by the causes and effects of variables such as the configuration of the board of directors, the audit committee, the independence of managers, the role of top management, and their social relations beyond the legal regulatory framework. Effective corporate governance requires applying a combination...

  17. Gauge theory and little gauge theory

    CERN Document Server

    Koizumi, Kozo

    2016-01-01

    Gauge theory is the most important type of field theory, in which the interactions of the elementary particles are described by the exchange of gauge bosons. In this article, gauge theory is reexamined as geometry of a vector space, and a new concept of "little gauge theory" is introduced. A key peculiarity of the little gauge theory is that it restricts the form of the connection field. Based on the little gauge theory, Cartan geometry, a charged boson, and the Dirac fermion field theory are investigated. In particular, the Dirac fermion field theory leads to an extension of Sogami's covariant derivative, and the Higgs bosons are interpreted as being contained in the new fields introduced in this article.

  18. Magnetron theory

    Science.gov (United States)

    Riyopoulos, Spilios

    1996-03-01

    A guiding center fluid theory is applied to model steady-state, single mode, high-power magnetron operation. A hub of uniform, prescribed density feeds the current spokes. The spoke charge follows from the continuity equation and the incompressibility of the guiding center flow. Included are the spoke self-fields (DC and AC), obtained by an expansion around the unperturbed (zero-spoke charge) flow in powers of ν/V1, ν and V1 being the effective charge density and AC amplitude. The spoke current is obtained as a nonlinear function of the detuning from the synchronous (Buneman-Hartree, BH) voltage Vs; the spoke charge is included in the self-consistent definition of Vs. It is shown that there is a DC voltage region of width |V − Vs| ~ V1 where the spoke width is constant and the spoke current is simply proportional to the AC voltage. The magnetron characteristic curves are "flat" in that range, and are approximated by a linear expansion around Vs. The derived formulas differ from earlier results [J. F. Hull, in Cross Field Microwave Devices, edited by E. Okress (Academic, New York, 1961), pp. 496-527] in that (a) there is no current cutoff at synchronism; the tube operates well below as well as above the BH voltage; (b) the characteristics are single valued within the synchronous voltage range; (c) the hub top is not treated as a virtual cathode; and (d) the hub density is not equal to the Brillouin density; comparisons with tube measurements show the best agreement for hub density near half the Brillouin density. It is also shown that at low space charge and low power the gain curve is symmetric relative to the voltage (frequency) detuning, while symmetry is broken at high-power/high-space-charge magnetron operation; the BH voltage remains between the current cutoff voltages.
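
    For orientation, in the planar approximation the synchronous (BH) voltage referred to above is commonly written in the textbook form below; the paper's Vs additionally includes the spoke charge self-consistently:

        V_{\mathrm{BH}} \;=\; \frac{\omega}{k}\,B\,d \;-\; \frac{m}{2e}\left(\frac{\omega}{k}\right)^{2},

    where omega/k is the phase velocity of the operating mode, B the applied magnetic field, d the anode-cathode gap, and m/e the electron mass-to-charge ratio.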

  19. Integrating Operational Energy Implications into System-Level Combat Effects Modeling: Assessing the Combat Effectiveness and Fuel Use of ABCT 2020 and Current ABCT

    Science.gov (United States)

    2015-01-01

    Endy M. Daehner, John Matsumura, Thomas J. Herbert, Jeremy R. Kurz, and Keith Walters, Integrating Operational Energy Implications into System-Level Combat Effects Modeling: Assessing the Combat Effectiveness and Fuel Use of ABCT 2020 and Current ABCT. (Only front-matter and acknowledgment fragments survive in this record's snippet.)

  20. Recursion Theory Week

    CERN Document Server

    Müller, Gert; Sacks, Gerald

    1990-01-01

    These proceedings contain research and survey papers from many subfields of recursion theory, with emphasis on degree theory, in particular the development of frameworks for current techniques in this field. Other topics covered include computational complexity theory, generalized recursion theory, proof theoretic questions in recursion theory, and recursive mathematics.

  1. Adaptive Motivation Theory.

    Science.gov (United States)

    1982-02-01

    Fragments only: "...of collections of associations; Need theory consists of interrelated concepts; social learning theory consists of rule application in the social..." The snippet also names Ryan's learning subdivisions (hierarchically arranged), expectancy theory, effectance theory, social learning theory, and self-esteem.

  2. System-level analysis of tryptophan regulation in Escherichia coli--performance under starved and well-fed conditions.

    Science.gov (United States)

    Chaudhary, N; Bhartiya, S; Venkatesh, K V

    2007-05-01

    Biological systems respond appropriately to a variety of environments, thus representing complex systems with rich physiological behaviour. Quantitative models can be used to identify the design components that give rise to this complexity. In this work, we discuss the tryptophan system of Escherichia coli, which rapidly synthesises tryptophan internally when faced with starvation and sluggishly shuts off synthesis when the cells are exposed to a tryptophan-replete medium. The evolved regulatory design is capable of providing such an asymmetric response, an appropriate behaviour to ensure survival. The tryptophan system uses three distinct regulatory mechanisms, namely genetic regulation, transcriptional attenuation, and enzyme inhibition, to achieve its goals. It is shown that genetic repression and attenuation are the only active regulatory mechanisms during moderate and severe starvation. However, as the degree of starvation increases, repression is relieved prior to attenuation. The analysis also shows that enzyme inhibition plays no role under severe starvation and only a marginal role in increasing the rate of repression when the cells are exposed to well-fed conditions. Finally, we use tools from linear systems theory to rationalise the above observations in terms of the poles and zeros of an approximated linear system.
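
    A toy version of the closing step, assuming a minimal two-state negative-feedback loop (enzyme and tryptophan) rather than the paper's full three-mechanism model: linearize around the steady state and read the poles off the Jacobian.

        import numpy as np
        from scipy.optimize import brentq

        # Toy repressible synthesis loop (illustrative parameters):
        #   dx1/dt = k1 / (1 + (x2/K)**n) - d1*x1   # enzyme expression
        #   dx2/dt = k2*x1 - d2*x2                  # tryptophan balance
        k1, k2, d1, d2, K, n = 1.0, 1.0, 0.1, 0.5, 1.0, 2.0

        # Steady state, then Jacobian eigenvalues = poles of the
        # approximated linear system.
        f = lambda x2: k2 * (k1 / (1 + (x2 / K)**n)) / d1 - d2 * x2
        x2s = brentq(f, 1e-6, 100.0)
        x1s = (k1 / (1 + (x2s / K)**n)) / d1
        dg = -k1 * n * (x2s / K)**(n - 1) / (K * (1 + (x2s / K)**n)**2)
        J = np.array([[-d1, dg],
                      [k2, -d2]])
        print("steady state:", (round(x1s, 3), round(x2s, 3)))
        print("poles:", np.linalg.eigvals(J))

    With these parameters the poles come out complex with negative real part, i.e. an oscillatory but stable return to steady state; asymmetries of the kind the paper reports show up in how the linearization changes between starved and well-fed operating points.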

  3. Systems-level chromosomal parameters represent a suprachromosomal basis for the non-random chromosomal arrangement in human interphase nuclei

    Science.gov (United States)

    Fatakia, Sarosh N.; Mehta, Ishita S.; Rao, Basuthkar J.

    2016-01-01

    Forty-six chromosome territories (CTs) are positioned uniquely in human interphase nuclei, wherein each of their positions can range from the centre of the nucleus to its periphery. A non-empirical basis for their non-random arrangement remains unreported. Here, we derive a suprachromosomal basis of that overall arrangement (which we refer to as a CT constellation), and report a hierarchical nature of the same. Using matrix algebra, we unify intrinsic chromosomal parameters (e.g., chromosomal length, gene density, and the number of genes per chromosome) to derive an extrinsic effective gene density matrix, the hierarchy of which is dominated largely by extrinsic mathematical coupling of HSA19, followed by HSA17 (human chromosomes 19 and 17, both preferentially interior CTs), with all CTs. We corroborate predicted constellations and the effective gene density hierarchy with published reports from fluorescence in situ hybridization based microscopy and Hi-C techniques, and delineate analogous hierarchies in disparate vertebrates. Our theory accurately predicts CTs localised to the nuclear interior, which interestingly share conserved synteny with HSA19 and/or HSA17. Finally, the effective gene density hierarchy dictates how permutations among CT positions represent the plasticity within constellations, based on which we suggest that a differential mix of coding and noncoding genome modulates the same. PMID:27845379
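
    The record does not spell out the matrix construction, so the following is only a schematic guess at the flavor of the computation: couple chromosomes through a symmetric function of their gene densities and rank them by total coupling. Names are real chromosome identifiers, but the counts are approximate public figures used purely for illustration.

        import numpy as np

        names = ["HSA17", "HSA18", "HSA19", "HSA20"]      # subset of CTs
        genes = np.array([1200.0, 270.0, 1470.0, 544.0])  # approx. gene counts
        length_mb = np.array([83.3, 80.4, 58.6, 64.4])    # approx. lengths, Mb
        density = genes / length_mb                       # genes per Mb

        # Toy symmetric coupling (geometric mean of densities); the paper's
        # effective gene density matrix is derived differently.
        coupling = np.sqrt(np.outer(density, density))
        hierarchy = coupling.sum(axis=1)
        for nm, h in sorted(zip(names, hierarchy), key=lambda t: -t[1]):
            print(f"{nm}: {h:.1f}")
        # Gene-dense HSA19 and HSA17 top the ranking, consistent with the
        # interior positioning reported for these CTs.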

  4. Composite Photon Theory Versus Elementary Photon Theory

    CERN Document Server

    Perkins, Walton A

    2015-01-01

    The purpose of this paper is to show that the composite photon theory measures up well against the Standard Model's elementary photon theory. This is done by comparing the two theories area by area. Although the predictions of quantum electrodynamics are in excellent agreement with experiment (as in the anomalous magnetic moment of the electron), there are some problems, such as the difficulty in describing the electromagnetic field with the four-component vector potential because the photon has only two polarization states. In most areas the two theories give similar results, so it is impossible to rule out the composite photon theory. Pryce's arguments in 1938 against a composite photon theory are shown to be invalid or irrelevant. Recently, it has been realized that in the composite theory the antiphoton does not interact with matter because it is formed of a neutrino and an antineutrino with the wrong helicity. This leads to experimental tests that can determine which theory is correct.

  5. Decidability of formal theories and hyperincursivity theory

    Science.gov (United States)

    Grappone, Arturo G.

    2000-05-01

    This paper shows the limits of the Proof Standard Theory (briefly, PST) and gives some ideas of how to build a proof anticipatory theory (briefly, PAT) that has no such limits. The paper also argues that Gödel's proof of the undecidability of the Principia Mathematica formal theory is not valid for axiomatic theories that use a PAT to build their proofs, because (hyper)incursive functions are self-representable.
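
    For readers unfamiliar with the terminology, the recursive/incursive/hyperincursive distinction (following Dubois' usage, on which this line of work builds; paraphrased here, not quoted from the paper) is:

        \begin{align*}
          \text{recursive:}      \quad & x(t+1) = f\big(x(t),\,t\big),\\
          \text{incursive:}      \quad & x(t+1) = f\big(x(t),\,x(t+1),\,t\big),\\
          \text{hyperincursive:} \quad & x(t+1) \ \text{defined only implicitly, admitting several solutions.}
        \end{align*}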

  6. Program theory-driven evaluation science in a youth development context.

    Science.gov (United States)

    Deane, Kelsey L; Harré, Niki

    2014-08-01

    Program theory-driven evaluation science (PTDES) provides a useful framework for uncovering the mechanisms responsible for positive change resulting from participation in youth development (YD) programs. Yet it is difficult to find examples of PTDES that capture the complexity of such experiences. This article offers a much-needed example of PTDES applied to Project K, a youth development program with adventure, service-learning and mentoring components. Findings from eight program staff focus groups, 351 youth participants' comments, four key program documents, and results from six previous Project K research projects were integrated to produce a theory of change for the program. A direct logic analysis was then conducted to assess the plausibility of the proposed theory against relevant research literature. This demonstrated that Project K incorporates many of the best practice principles discussed in the literature that covers the three components of the program. The contributions of this theory-building process to organizational learning and development are discussed.

  7. Multi-Sensory Cognitive Learning as Facilitated in a Multimedia Tutorial for Item Response Theory

    Directory of Open Access Journals (Sweden)

    Chong Ho Yu

    2007-08-01

    Full Text Available The objective of this paper is to introduce an application of multi-sensory cognitive learning theory into the development of a multimedia tutorial for Item Response Theory. Cognitive multimedia theory suggests that visual and auditory material should be presented simultaneously to reinforce the retention of learned materials. A computer-assisted module was carefully designed based upon the preceding theory, and an experiment was conducted to examine the effect of audio type (human audio, computer audio, and no audio) on learner performance measured by an objective test. It was found that while there is no significant performance gap between the human audio and the no audio group, the two groups substantively outperform the computer audio group. A plausible explanation is that unnatural audio requires additional cognitive power to process the information, and this distraction affects performance.

  8. Dedicated clock/timing-circuit theories of time perception and timed performance.

    Science.gov (United States)

    van Rijn, Hedderik; Gu, Bon-Mi; Meck, Warren H

    2014-01-01

    Scalar Timing Theory (an information-processing version of Scalar Expectancy Theory) and its evolution into the neurobiologically plausible Striatal Beat-Frequency (SBF) theory of interval timing are reviewed. These pacemaker/accumulator or oscillation/coincidence detection models are then integrated with the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture as dedicated timing modules that are able to make use of the memory and decision-making mechanisms contained in ACT-R. The different predictions made by the incorporation of these timing modules into ACT-R are discussed as well as the potential limitations. Novel implementations of the original SBF model that allow it to be incorporated into ACT-R in a more fundamental fashion than the earlier simulations of Scalar Timing Theory are also considered in conjunction with the proposed properties and neural correlates of the "internal clock".
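
    A toy numerical sketch of the SBF idea (frequencies and threshold are illustrative; real models add drift, noise, and learned weights): oscillators phase-reset at onset peak together again only at their beat period, which a coincidence detector can read out as a long timed interval.

        import numpy as np

        freqs = np.array([5.0, 5.5, 6.0])      # oscillator frequencies, Hz
        t = np.linspace(0.0, 4.0, 40_001)      # time axis, s
        units = np.cos(2 * np.pi * freqs[:, None] * t)

        # Coincidence detection: strong response only when all oscillators
        # are simultaneously near their peaks.
        activity = units.prod(axis=0)
        hits = t[(activity > 0.95) & (t > 0.1)]
        if hits.size:
            print(f"first strong coincidence at t = {hits[0]:.2f} s")

    With 5.0, 5.5, and 6.0 Hz oscillators the first full coincidence falls near 2 s, far longer than any single oscillator's period.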

  9. Model Theory and Applications

    CERN Document Server

    Mangani, P

    2011-01-01

    This title includes lectures (G.E. Sacks, "Model theory and applications"; H.J. Keisler, "Constructions in model theory") and seminars (M. Servi, "SH formulas and generalized exponential"; J.A. Makowski, "Topological model theory").

  10. Decoding the architectural theory

    Institute of Scientific and Technical Information of China (English)

    Gu Mengchao

    2008-01-01

    Starting from the definition and concept of architectural theory, the author establishes his own understanding of the framework of architectural theory and of theoretical innovation with Chinese characteristics.

  11. What is Literary Theory?

    OpenAIRE

    Murray, Paul R.; Paul R., Murray

    2001-01-01

    This paper deals with two difficult questions: (1) What is literary theory? and (2) What does literary theory do? Literary theory is contrasted to literary criticism, and theory is found to be a more all-embracing, inclusive field than criticism, which is tied more closely to literature itself. Literary theory is shown to be a multitude of differing ways of looking at literature, with each theory yielding differing results.

  12. Review of Hydroelasticity Theories

    DEFF Research Database (Denmark)

    Chen, Xu-jun; Wu, You-sheng; Cui, Wei-cheng

    2006-01-01

    Existing hydroelastic theories are reviewed. The theories are classified into different types: two-dimensional linear theory, two-dimensional nonlinear theory, three-dimensional linear theory, and three-dimensional nonlinear theory. Applications to the analysis of very large floating structures (VLFS) are reviewed and discussed in detail. Special emphasis is placed on papers from China and Japan (in their native languages), as these papers are not generally known in the rest of the world.

  13. Grounded theory, feminist theory, critical theory: toward theoretical triangulation.

    Science.gov (United States)

    Kushner, Kaysi Eastlick; Morrow, Raymond

    2003-01-01

    Nursing and social science scholars have examined the compatibility between feminist and grounded theory traditions in scientific knowledge generation, concluding that they are complementary, yet not without certain tensions. This line of inquiry is extended to propose a critical feminist grounded theory methodology. The construction of symbolic interactionist, feminist, and critical feminist variants of grounded theory methodology is examined in terms of the presuppositions of each tradition and their interplay as a process of theoretical triangulation.

  14. Foundations of Galois theory

    CERN Document Server

    Postnikov, MM; Stark, M; Ulam, S

    1962-01-01

    Foundations of Galois Theory is an introduction to group theory, field theory, and the basic concepts of abstract algebra. The text is divided into two parts. Part I presents the elements of Galois theory, with chapters devoted to the elements of field theory, facts from the theory of groups, and the applications of Galois theory. Part II focuses on the development of general Galois theory and its use in the solution of equations by radicals, covering equations that are solvable by radicals, the construction of equations solvable by radicals, and the unsolvability by radicals...
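
    The pivot of Part II is the Galois correspondence; in its standard textbook form (added here for orientation), for a finite Galois extension $L/K$ with group $G = \mathrm{Gal}(L/K)$:

        H \le G \;\longmapsto\; L^{H},
        \qquad
        K \subseteq M \subseteq L \;\longmapsto\; \mathrm{Gal}(L/M),

    two mutually inverse, inclusion-reversing bijections; $M/K$ is Galois exactly when $H$ is normal in $G$, and then $\mathrm{Gal}(M/K) \cong G/H$. The application to radicals: a polynomial equation over a field of characteristic zero is solvable by radicals if and only if its Galois group is solvable.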

  15. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  16. Local homotopy theory

    CERN Document Server

    Jardine, John F

    2015-01-01

    This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...

  17. The Extended Relativity Theory in Clifford Spaces

    Directory of Open Access Journals (Sweden)

    Castro C.

    2005-04-01

    Full Text Available An introduction to some of the most important features of the Extended Relativity theory in Clifford spaces (C-spaces) is presented, whose “point” coordinates are non-commuting Clifford-valued quantities which incorporate lines, areas, volumes, hyper-volumes... degrees of freedom associated with the collective particle, string, membrane, p-brane... dynamics of p-loops (closed p-branes) in target D-dimensional spacetime backgrounds. C-space Relativity naturally incorporates the ideas of an invariant length (the Planck scale), maximal acceleration, non-commuting coordinates, supersymmetry, holography, higher derivative gravity with torsion, and variable dimensions/signatures. It permits the study of the dynamics of all (closed) p-branes, for all values of p, on a unified footing. It resolves the ordering ambiguities in QFT and the problem of time in Cosmology, and admits superluminal propagation (tachyons) without violations of causality. A discussion of the maximal-acceleration Relativity principle in phase-spaces follows, and the study of the invariance group of symmetry transformations in phase-space allows one to show why Planck areas are invariant under acceleration-boosts transformations. This invariance feature suggests that a maximal-string-tension principle may be operating in Nature. We continue by pointing out how the relativity of signatures of the underlying n-dimensional spacetime results from taking different n-dimensional slices through C-space. The conformal group in spacetime emerges as a natural subgroup of the Clifford group, and Relativity in C-spaces involves natural scale changes in the sizes of physical objects without the introduction of forces or Weyl's gauge field of dilations. We conclude by constructing the generalization of Maxwell's theory of Electrodynamics of point charges to a theory in C-spaces that involves extended charges coupled to antisymmetric tensor fields of arbitrary rank. In the concluding remarks we briefly outline...
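
    For concreteness, the C-space "point" the abstract describes is usually written as a polyvector (a standard form in this literature, reproduced here from memory; normalizations vary by author):

        X \;=\; \sigma\,\mathbf{1} \;+\; x^{\mu}\gamma_{\mu}
          \;+\; x^{\mu\nu}\,\gamma_{\mu}\wedge\gamma_{\nu}
          \;+\; x^{\mu\nu\rho}\,\gamma_{\mu}\wedge\gamma_{\nu}\wedge\gamma_{\rho}
          \;+\;\cdots

    where sigma is the scalar (Planck-scale) coordinate and the antisymmetric tensors x^{mu nu}, x^{mu nu rho}, ... are the area, volume, ... coordinates of the p-loop degrees of freedom.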

  18. Comparative systems-level analysis of the G1/S transition in yeast and higher eukaryotes: focusing on the Whi5/Rb network and initiation of DNA replication

    OpenAIRE

    Hasan

    2013-01-01

    Molecular systems biology holds that biological processes are the result of complex, coordinated, dynamical, non-linear interactions that generate the corresponding function as an emergent property of the system, one that is therefore not found in individual components, but only in their networking. Conversely, it has been shown that even a single, multi-domain protein may present a system-level behavior that can be described by adapting the formalism used to describe inter-molecular networks. A...

  19. A theory of everything?

    CERN Multimedia

    't Hooft, Gerardus; Witten, Edward

    2005-01-01

    In his later years, Einstein sought a unified theory that would extend general relativity and provide an alternative to quantum theory. There is now talk of a "theory of everything"; fifty years after his death, how close are we to such a theory? (3 pages)

  20. Game Theory: 5 Questions

    DEFF Research Database (Denmark)

    Hendricks, Vincent F.

    Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....