WorldWideScience

Sample records for budget theory model

  1. Application of Dynamic Energy Budget theory for conservation relevant modelling of Bird life histories

    NARCIS (Netherlands)

    Teixeira, C.M.G.L.

    2016-01-01

    Teixeira, C.M.G.L. (2016, January 13). Application of Dynamic Energy Budget Theory for Conservation Relevant Modelling of Bird Life Histories. Vrije Universiteit Amsterdam. Promotor/co-promotor: prof. dr. S.A.L.M. Kooijman & T. Sousa.

  2. Dynamic energy budget theory meets individual-based modelling: a generic and accessible implementation.

    NARCIS (Netherlands)

    Martin, B.; Zimmer, E.; Grimm, V.; Jager, T.

    2012-01-01

    1. Dynamic Energy Budget (DEB) theory was designed to understand the dynamics of biological systems from cells to populations and ecosystems via a mass balance approach of individuals. However, most work so far has focused on the level of the individual. To encourage further use of DEB theory in a

  3. Modeling the eco-physiology of the purple mauve stinger, Pelagia noctiluca using Dynamic Energy Budget theory

    Science.gov (United States)

    Augustine, Starrlight; Rosa, Sara; Kooijman, Sebastiaan A. L. M.; Carlotti, François; Poggiale, Jean-Christophe

    2014-11-01

    Parameters for the standard Dynamic Energy Budget (DEB) model were estimated for the purple mauve stinger, Pelagia noctiluca, using literature data. Overall, the model predictions are in good agreement with data covering the full life-cycle. The parameter set we obtain suggests that P. noctiluca is well adapted to survive long periods of starvation since the predicted maximum reserve capacity is extremely high. Moreover we predict that the reproductive output of larger individuals is relatively insensitive to changes in food level while wet mass and length are. Furthermore, the parameters imply that even if food were scarce (ingestion levels only 14% of the maximum for a given size) an individual would still mature and be able to reproduce. We present detailed model predictions for embryo development and discuss the developmental energetics of the species such as the fact that the metabolism of ephyrae accelerates for several days after birth. Finally we explore a number of concrete testable model predictions which will help to guide future research. The application of DEB theory to the collected data allowed us to conclude that P. noctiluca combines maximizing allocation to reproduction with rather extreme capabilities to survive starvation. The combination of these properties might explain why P. noctiluca is a rapidly growing concern to fisheries and tourism.

  4. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  5. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  6. A Theory of the Perturbed Consumer with General Budgets

    DEFF Research Database (Denmark)

    McFadden, Daniel L; Fosgerau, Mogens

    We consider demand systems for utility-maximizing consumers facing general budget constraints whose utilities are perturbed by additive linear shifts in marginal utilities. Budgets are required to be compact but are not required to be convex. We define demand generating functions (DGF) whose subgradients with respect to these perturbations are convex hulls of the utility-maximizing demands. We give necessary as well as sufficient conditions for DGF to be consistent with utility maximization, and establish under quite general conditions that utility-maximizing demands are almost everywhere single-valued and smooth in their arguments. We also give sufficient conditions for integrability of perturbed demand. Our analysis provides a foundation for applications of consumer theory to problems with nonlinear budget constraints.
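
    A compact way to state the construction sketched above, under the additive-perturbation setting of the abstract (the notation is ours, not the paper's):

      \[
        G(\varepsilon) \;=\; \max_{x \in B}\,\bigl[\, u(x) + \varepsilon' x \,\bigr],
        \qquad
        \partial_{\varepsilon} G(\varepsilon) \;=\; \operatorname{conv}\,\bigl\{\, x \in B : u(x) + \varepsilon' x = G(\varepsilon) \,\bigr\},
      \]
      so that wherever the demand generating function \(G\) is differentiable the demand is single-valued, \(x(\varepsilon) = \nabla_{\varepsilon} G(\varepsilon)\). Since \(G\) is a pointwise maximum of functions affine in \(\varepsilon\), it is convex and hence differentiable almost everywhere, which is one route to the almost-everywhere single-valuedness stated in the abstract.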

  7. From food-dependent statistics to metabolic parameters, a practical guide to the use of dynamic energy budget theory.

    NARCIS (Netherlands)

    Kooijman, S.A.L.M.; Sousa, T; Pecquerie, L; van der Meer, J.; Jager, T.

    2008-01-01

    The standard model of the dynamic energy budget theory for metabolic organisation has variables and parameters that can be quantified using indirect methods only. We present new methods (and software) to extract food-independent parameter values of the energy budget from food-dependent quantities

  8. Nambe Pueblo Water Budget and Forecasting model.

    Energy Technology Data Exchange (ETDEWEB)

    Brainard, James Robert

    2009-10-01

    This report documents The Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components, the Water Forecast Component, Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Water Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and the Reservoir Model components. The Reservoir Component contains two sections, (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model where historic, pre-dam or above dam USGS stream flow data is fed into the model and releases are calculated.
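
    As an illustration of the Water Forecast Component's domestic-demand logic only, here is a minimal sketch in Python (the report's model is built in Powersim Studio; the function name and all figures below are ours, not the report's):

      # Illustrative domestic water demand forecast (not the Powersim Studio model itself).
      def domestic_demand(pop0, growth_rate, per_capita_m3, years):
          """Project annual domestic demand (m^3/yr) from a starting population,
          an annual growth rate and a per-capita consumption figure."""
          demand = []
          pop = pop0
          for _ in range(years):
              demand.append(pop * per_capita_m3)
              pop *= (1.0 + growth_rate)
          return demand

      # Example: 2,000 people, 1.5 %/yr growth, 90 m^3 per person per year, 10-year horizon.
      print(domestic_demand(2000, 0.015, 90.0, 10))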

  9. The role of Dynamic Energy Budget theory in predictive modeling of stressor impacts on ecological systems. Comment on: "Physics of metabolic organization" by Marko Jusup et al.

    Science.gov (United States)

    Galic, Nika; Forbes, Valery E.

    2017-03-01

    Human activities have been modifying ecosystems for centuries, from pressures on wild populations we harvest to modifying habitats through urbanization and agricultural activities. Changes in global climate patterns are adding another layer of, often unpredictable, perturbations to ecosystems on which we rely for life support [1,2]. To ensure the sustainability of ecosystem services, especially at this point in time when the human population is estimated to grow by another 2 billion by 2050 [3], we need to predict possible consequences of our actions and suggest relevant solutions [4,5]. We face several challenges when estimating adverse impacts of our actions on ecosystems. We describe these in the context of ecological risk assessment of chemicals. Firstly, when attempting to assess risk from exposure to chemicals, we base our decisions on a very limited number of species that are easily cultured and kept in the lab. We assume that preventing risk to these species will also protect all of the untested species present in natural ecosystems [6]. Secondly, although we know that chemicals interact with other stressors in the field, the number of stressors that we can test is limited due to logistical and ethical reasons. Similarly, empirical approaches are limited in both spatial and temporal scale due to logistical, financial and ethical reasons [7,8]. To bypass these challenges, we can develop ecological models that integrate relevant life history and other information and make testable predictions across relevant spatial and temporal scales [8-10].

  10. Model theory and applications

    CERN Document Server

    Belegradek, OV

    1999-01-01

    This volume is a collection of papers on model theory and its applications. The longest paper, "Model Theory of Unitriangular Groups" by O. V. Belegradek, forms a subtle general theory behind Mal'tsev's famous correspondence between rings and groups. This is the first published paper on the topic. Given the present model-theoretic interest in algebraic groups, Belegradek's work is of particular interest to logicians and algebraists. The rest of the collection consists of papers on various questions of model theory, mainly on stability theory. Contributors are leading Russian researchers in the

  11. THE METHODOLOGICAL APPROACHES TO MODEL CREATION OF ENTERPRISE BUDGETING MANAGEMENT

    Directory of Open Access Journals (Sweden)

    N.A. Shpak

    2007-03-01

    There is a direct relationship between the quality of enterprise management and enterprise competitiveness in the conditions of the modern market. Most Russian and Western enterprises use the technology of budget management in finance for operational management. However, as practice shows, enterprises often use a budgeting model that is not adequate to modern market demands and to the specifics of the modern Russian economy. Modern budgeting models, their advantages and disadvantages, are considered and analyzed in this article.

  12. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  13. An information theory approach for evaluating earth radiation budget (ERB) measurements - Nonuniform sampling of reflected shortwave radiation

    Science.gov (United States)

    Barkstrom, Bruce R.; Direskeneli, Haldun; Halyo, Nesim

    1992-01-01

    An information theory approach to examine the temporal nonuniform sampling characteristics of shortwave (SW) flux for earth radiation budget (ERB) measurements is suggested. The information gain is computed from the information content before and after the measurements. A stochastic diurnal model for the SW flux is developed, and measurements for different orbital parameters are examined. The methodology is applied to specific NASA Polar platform and Tropical Rainfall Measuring Mission (TRMM) orbital parameters. The information theory approach, coupled with the developed SW diurnal model, is found to be promising for measurements involving nonuniform orbital sampling characteristics.
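
    For readers unfamiliar with this bookkeeping, a toy version under a Gaussian assumption; this stands in for, and is far simpler than, the authors' stochastic diurnal SW-flux model:

      import math

      def gaussian_entropy(variance):
          """Differential entropy (nats) of a Gaussian with the given variance."""
          return 0.5 * math.log(2.0 * math.pi * math.e * variance)

      def information_gain(prior_var, posterior_var):
          """Information gained by the measurements = H(prior) - H(posterior)."""
          return gaussian_entropy(prior_var) - gaussian_entropy(posterior_var)

      # Example: sampling that narrows the SW-flux uncertainty from (30 W m^-2)^2
      # to (12 W m^-2)^2 yields about 0.92 nats of information.
      print(information_gain(30.0**2, 12.0**2))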

  14. The comparative topology of energy allocation in budget models.

    NARCIS (Netherlands)

    Lika, K.; Kooijman, S.A.L.M.

    2011-01-01

    The standard Dynamic Energy Budget (DEB) model assumes that assimilates of an isomorphic individual are first added to reserve, a fraction κ of mobilised reserve is allocated to soma (somatic maintenance plus growth of structure), and the rest to maturity maintenance and maturation or reproduction.
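
    A minimal numerical sketch of the κ-rule allocation just described, using conventional standard-DEB symbols; the parameter values are placeholders rather than estimates for any species:

      # Sketch of the standard DEB kappa-rule (symbols follow DEB conventions;
      # parameter values are placeholders, not estimates for any species).
      def deb_step(E, V, E_H, f, dt, p):
          L = V ** (1.0 / 3.0)                       # structural length
          E_dens = E / V                             # reserve density [E]
          p_A = p['p_Am'] * f * L**2                 # assimilation
          p_S = p['p_M'] * V                         # somatic maintenance (volume-linked only)
          p_C = E_dens * (p['E_G'] * p['v'] * L**2 + p_S) / (p['kap'] * E_dens + p['E_G'])
          p_J = p['k_J'] * E_H                       # maturity maintenance
          E += dt * (p_A - p_C)                      # reserve dynamics
          V += dt * max(p['kap'] * p_C - p_S, 0.0) / p['E_G']   # growth (no shrinking here)
          E_H += dt * max((1.0 - p['kap']) * p_C - p_J, 0.0)    # maturation / reproduction buffer
          return E, V, E_H

      params = dict(p_Am=22.5, v=0.02, kap=0.8, p_M=18.0, E_G=2800.0, k_J=0.002)
      state = (1.0, 0.001, 0.0)                      # E (J), V (cm^3), E_H (J) -- arbitrary start
      for _ in range(1000):
          state = deb_step(*state, f=1.0, dt=0.1, p=params)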

  15. Electric solar wind sail mass budget model

    Directory of Open Access Journals (Sweden)

    P. Janhunen

    2013-02-01

    The electric solar wind sail (E-sail) is a new type of propellantless propulsion system for Solar System transportation, which uses the natural solar wind to produce spacecraft propulsion. The E-sail consists of thin centrifugally stretched tethers that are kept charged by an onboard electron gun and, as such, experience Coulomb drag through the high-speed solar wind plasma stream. This paper discusses a mass breakdown and a performance model for an E-sail spacecraft that hosts a mission-specific payload of prescribed mass. In particular, the model is able to estimate the total spacecraft mass and its propulsive acceleration as a function of various design parameters such as the number of tethers and their length. A number of subsystem masses are calculated assuming existing or near-term E-sail technology. In light of the obtained performance estimates, an E-sail represents a promising propulsion system for a variety of transportation needs in the Solar System.
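
    The mass-breakdown logic reduces to summing subsystem masses that scale with tether number and length, then dividing thrust by total mass; a toy sketch (all coefficients below are invented for illustration and are not the paper's values):

      # Toy E-sail mass budget: total mass and characteristic acceleration as a
      # function of tether number and length. All coefficients are illustrative.
      def esail_mass_budget(n_tethers, tether_len_km, payload_kg):
          tether_kg = n_tethers * tether_len_km * 0.011                 # ~11 g per km of tether (assumed)
          reels_kg = n_tethers * 0.10                                   # deployment reels (assumed)
          gun_and_power_kg = 5.0 + 0.002 * n_tethers * tether_len_km    # electron gun + HV supply (assumed)
          bus_kg = 40.0                                                 # spacecraft bus (assumed)
          total = payload_kg + tether_kg + reels_kg + gun_and_power_kg + bus_kg
          thrust_N = n_tethers * tether_len_km * 1e3 * 500e-9           # ~500 nN per metre of tether (assumed)
          return total, thrust_N / total                                # (kg, m/s^2)

      mass, accel = esail_mass_budget(n_tethers=44, tether_len_km=20, payload_kg=100)
      print(f"total mass {mass:.0f} kg, characteristic acceleration {accel*1e3:.2f} mm/s^2")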

  16. Challenges for dynamic energy budget theory. Comment on "Physics of metabolic organization" by Marko Jusup et al.

    Science.gov (United States)

    Nisbet, Roger M.

    2017-03-01

    Jusup et al. [1] provide a comprehensive review of Dynamic Energy Budget (DEB) theory - a theory of metabolic organization that has its roots in a model by S.A.L.M. Kooijman [2] and has evolved over three decades into a remarkable general theory whose use appears to be growing exponentially. The definitive text on DEB theory [3] is a challenging (though exceptionally rewarding) read, and previous reviews (e.g. [4,5]) have provided focused summaries of some of its main themes, targeted at specific groups of readers. The strong case for a further review is well captured in the abstract: "Hitherto, the foundations were more accessible to physicists or mathematicians, and the applications to biologists, causing a dichotomy in what always should have been a single body of work." In response to this need, Jusup et al. provide a review that combines a lucid, rigorous exposition of the core components of DEB theory with a diverse collection of DEB applications. They also highlight some recent advances, notably the rapidly growing on-line database of DEB model parameters (451 species on 15 August 2016 according to [1], now, just a few months later, over 500 species).

  17. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    1989-01-01

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  18. Qualitative use of Dynamic Energy Budget theory in ecotoxicology : case study on oil contamination and Arctic copepods

    NARCIS (Netherlands)

    Klok, T.C.; Hjorth, M.; Dahlloef, I.

    2012-01-01

    The Dynamic Energy Budget (DEB) theory provides a logical and consistent framework to evaluate ecotoxicological test results. Currently this framework is not regularly applied in ecotoxicology given its perceived complexity and data needs. However, even in the case of low data availability the DEB theory

  19. ENSIS, Pollution inventory, pollution budget model, water quality model and scenario handling. Functional specification

    OpenAIRE

    Bakken, T.H.; Bjørkenes, A.; Dagestad, K.

    2003-01-01

    This is the functional specification of a complete pollution budget model for water. A crucial improvement of this model is implementation of new pollution sources and modification of existing sources. The specification of a water quality model, based on the results from the pollution budget model is also included. The document is intended to give a cost and time estimate of the programming of the functionality it describes, and will be the guideline for implementation of the...

  20. Analyzing variations in life-history traits of Pacific salmon in the context of Dynamic Energy Budget (DEB) theory

    Science.gov (United States)

    Pecquerie, Laure; Johnson, Leah R.; Kooijman, Sebastiaan A. L. M.; Nisbet, Roger M.

    2011-11-01

    To determine the response of Pacific salmon ( Oncorhynchus spp.) populations to environmental change, we need to understand impacts on all life stages. However, an integrative and mechanistic approach is particularly challenging for Pacific salmon as they use multiple habitats (river, estuarine and marine) during their life cycle. Here we develop a bioenergetic model that predicts development, growth and reproduction of a Pacific salmon in a dynamic environment, from an egg to a reproducing female, and that links female state to egg traits. This model uses Dynamic Energy Budget (DEB) theory to predict how life history traits vary among five species of Pacific salmon: Pink, Sockeye, Coho, Chum and Chinook. Supplemented with a limited number of assumptions on anadromy and semelparity and external signals for migrations, the model reproduces the qualitative patterns in egg size, fry size and fecundity both at the inter- and intra-species levels. Our results highlight how modeling all life stages within a single framework enables us to better understand complex life-history patterns. Additionally we show that body size scaling relationships implied by DEB theory provide a simple way to transfer model parameters among Pacific salmon species, thus providing a generic approach to study the impact of environmental conditions on the life cycle of Pacific salmon.

  1. Model Theory for Process Algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2004-01-01

    We present a first-order extension of the algebraic theory about processes known as ACP and its main models. Useful predicates on processes, such as deadlock freedom and determinism, can be added to this theory through first-order definitional extensions. Model theory is used to analyse the

  2. Late Budgets

    DEFF Research Database (Denmark)

    Andersen, Asger Lau; Lassen, David Dreyer; Nielsen, Lasse Holbøll Westh

    The budget forms the legal basis of government spending. If a budget is not in place at the beginning of the fiscal year, planning as well as current spending are jeopardized and government shutdown may result. This paper develops a continuous-time war-of-attrition model of budgeting...

  3. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WERE PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  4. Body size scaling relationships in bivalves: a comparison of field data with predictions by dynamic energy budget (DEB) theory.

    NARCIS (Netherlands)

    Cardoso, J.F.M.F.; van der Veer, H.W.; Kooijman, S.A.L.M.

    2006-01-01

    In this paper, we apply the Dynamic Energy Budget (DEB) theory to bivalve species (1) to provide basic body-size scaling relationships that can be used to predict species characteristics when basic information is lacking, and (2) to analyse the discrepancy between DEB predictions based on energetic

  5. A Model of Tax Compliance Under Budget-Constrained Auditors

    OpenAIRE

    Graetz, Michael J.; Reinganum, Jennifer F.; Wilde, Louis L.

    1984-01-01

    In the midst of various taxpayer "revolts" and federal budget deficits of unprecedented magnitude, noncompliance with federal and state income tax laws has become an issue of significant policy concern. If the IRS' budget is limited, the probability that any individual taxpayer will be audited depends on the behavior of other taxpayers. Thus the problem of compliance involves a "congestion" effect, which generates strategic interaction among taxpayers as well as between taxpayers and the IRS....

  6. Lectures on algebraic model theory

    CERN Document Server

    Hart, Bradd

    2001-01-01

    In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.

  7. A Computational Theory of Modelling

    Science.gov (United States)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some ``basic theory'', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, s.b.) optimal algorithm which generates data that describe the model's state or evolution complying with a ``reduced theory''. Theories are represented by classes of (in a similar sense, s.b.) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  8. BEYOND BUDGETING

    OpenAIRE

    Edo Cvrkalj; Denis Smolar

    2015-01-01

    Traditional budgeting principles, with strictly defined business goals, have been, since 1998, slowly growing into more sophisticated and organization-adjusted alternative budgeting concepts. One of those alternative concepts is the “Beyond budgeting” model with an implemented performance effects measuring process. In order for the model to be practicable, budget planning and control has to be reoriented to the “bottom up” planning and control approach. In today’s modern bus...

  9. Revisiting the global surface energy budgets with maximum-entropy-production model of surface heat fluxes

    Science.gov (United States)

    Huang, Shih-Yu; Deng, Yi; Wang, Jingfeng

    2017-09-01

    The maximum-entropy-production (MEP) model of surface heat fluxes, based on contemporary non-equilibrium thermodynamics, information theory, and atmospheric turbulence theory, is used to re-estimate the global surface heat fluxes. The surface fluxes predicted by the MEP model automatically balance the surface energy budgets at all time and space scales without the explicit use of near-surface temperature and moisture gradients, wind speed and surface roughness data. The new MEP-based global annual mean fluxes over the land surface, using input data of surface radiation, temperature data from National Aeronautics and Space Administration-Clouds and the Earth's Radiant Energy System (NASA CERES) supplemented by surface specific humidity data from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), agree closely with previous estimates. The new estimate of ocean evaporation, not using the MERRA reanalysis data as model inputs, is lower than previous estimates, while the new estimate of ocean sensible heat flux is higher than previously reported. The MEP model also produces the first global map of ocean surface heat flux that is not available from existing global reanalysis products.

  10. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  11. Towards the determination of Mytilus edulis food preferences using the dynamic energy budget (DEB) theory.

    Science.gov (United States)

    Picoche, Coralie; Le Gendre, Romain; Flye-Sainte-Marie, Jonathan; Françoise, Sylvaine; Maheux, Frank; Simon, Benjamin; Gangnery, Aline

    2014-01-01

    The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide a better insight into mussel-environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measures revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference between most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and both passive and active selections, and confirms the need to adjust DEB models to different

  12. Towards the Determination of Mytilus edulis Food Preferences Using the Dynamic Energy Budget (DEB) Theory

    Science.gov (United States)

    Picoche, Coralie; Le Gendre, Romain; Flye-Sainte-Marie, Jonathan; Françoise, Sylvaine; Maheux, Frank; Simon, Benjamin; Gangnery, Aline

    2014-01-01

    The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide a better insight into mussel–environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measures revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference between most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and both passive and active selections, and confirms the need to adjust DEB models to different

  13. Towards the determination of Mytilus edulis food preferences using the dynamic energy budget (DEB) theory.

    Directory of Open Access Journals (Sweden)

    Coralie Picoche

    The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide a better insight into mussel-environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measures revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference between most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and both passive and active selections, and confirms the need to adjust DEB models to

  14. Budget constraint and vaccine dosing: A mathematical modelling exercise

    NARCIS (Netherlands)

    Standaert, Baudouin A.; Curran, Desmond; Postma, Maarten J.

    2014-01-01

    Background: Increasing the number of vaccine doses may potentially improve overall efficacy. Decision-makers need information about choosing the most efficient dose schedule to maximise the total health gain of a population when operating under a constrained budget. The objective of this study is to

  15. Neuronet Modelling of the Processes of Budgeting and Use of Labour Resources at Coal Mining Enterprises

    Directory of Open Access Journals (Sweden)

    Hlіnska Olha M.

    2014-01-01

    The article considers issues of the efficient budgeting and use of labour resources at coal mining enterprises. It demonstrates the expediency of using a modern neural network, namely a multilayer perceptron, for modelling the process of budgeting and use of labour resources at coal mining enterprises. It shows that Statistica is the best-suited software package for creating neural networks with the multilayer perceptron architecture. On the basis of analysis and a comparative characterisation, the article selects the topology and builds a neural network model of budgeting and use of labour resources at coal mining enterprises.

  16. The "covariation method" for estimating the parameters of the standard Dynamic Energy Budget model I: Philosophy and approach

    Science.gov (United States)

    Lika, Konstadia; Kearney, Michael R.; Freitas, Vânia; van der Veer, Henk W.; van der Meer, Jaap; Wijsman, Johannes W. M.; Pecquerie, Laure; Kooijman, Sebastiaan A. L. M.

    2011-11-01

    The Dynamic Energy Budget (DEB) theory for metabolic organisation captures the processes of development, growth, maintenance, reproduction and ageing for any kind of organism throughout its life-cycle. However, the application of DEB theory is challenging because the state variables and parameters are abstract quantities that are not directly observable. We here present a new approach to parameter estimation, the covariation method, which permits all parameters of the standard Dynamic Energy Budget (DEB) model to be estimated from standard empirical datasets. Parameter estimates are based on the simultaneous minimization of a weighted sum of squared deviations between a number of data sets and model predictions or the minimisation of the negative log likelihood function, both in a single-step procedure. The structure of DEB theory permits the unusual situation of using single data-points (such as the maximum reproduction rate), which we call "zero-variate" data, for estimating parameters. We also introduce the concept of "pseudo-data", exploiting the rules for the covariation of parameter values among species that are implied by the standard DEB model. This allows us to introduce the concept of a generalised animal, which has specified parameter values. We here outline the philosophy behind the approach and its technical implementation. In a companion paper, we assess the behaviour of the estimation procedure and present preliminary findings of emerging patterns in parameter values across diverse taxa.
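
    The estimation step described above amounts to a single weighted least-squares objective over heterogeneous data sets, including one-point "zero-variate" entries and weakly weighted pseudo-data; a schematic of that objective (not the DEBtool implementation):

      # Schematic covariation-method objective: one weighted sum of squared
      # (relative) deviations across all data sets, zero-variate data and pseudo-data.
      def weighted_ssq(datasets, predict, params):
          """datasets: list of (name, observed_values, weight); predict(name, params)
          must return model predictions of matching shape."""
          total = 0.0
          for name, obs, weight in datasets:
              pred = predict(name, params)
              total += weight * sum(
                  ((p - o) / o) ** 2 for p, o in zip(pred, obs) if o != 0
              )
          return total

      # A zero-variate entry is simply a one-point data set, e.g.
      # ("max_reproduction_rate", [230.0], 1.0); pseudo-data would get a small weight.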

  17. Qualitative use of Dynamic Energy Budget theory in ecotoxicology. Case study on oil contamination and Arctic copepods

    Science.gov (United States)

    Klok, Chris; Hjorth, Morten; Dahllöf, Ingela

    2012-10-01

    The Dynamic Energy Budget (DEB) theory provides a logical and consistent framework to evaluate ecotoxicological test results. Currently this framework is not regularly applied in ecotoxicology given its perceived complexity and data needs. However, even in the case of low data availability the DEB theory is already useful. In this paper we apply the DEB theory to evaluate the results in three previously published papers on the effects of PAHs on Arctic copepods. Since these results do not allow for a quantitative application we used DEB qualitatively. The ecotoxicological results were thereby set in a wider ecological context and we found a logical explanation for an unexpected decline in hatching success described in one of these papers. Moreover, the DEB evaluation helped to derive relevant ecological questions that can guide future experimental work on this subject.

  18. Modelling shellfish growth with dynamic energy budget models: an application for cockles and mussels in the Oosterschelde (southwest Netherlands)

    NARCIS (Netherlands)

    Troost, T.A.; Wijsman, J.W.M.; Saraiva, S.; Freitas, V.

    2010-01-01

    Dynamic energy budget models for growth of individual cockles (Cerastoderma edule) and mussels (Mytilus edulis) are adjusted and calibrated to the Oosterschelde by formulating and parametrizing their functional responses using an extensive set of field observations. The resulting model predictions

  19. A conceptual framework for budget allocation in the RIVM Chronic Disease Model - A case study of Diabetes Mellitus

    NARCIS (Netherlands)

    Hoogenveen RT; Feenstra TL; Baal PHM van; Baan CA; PZO

    2005-01-01

    The research project 'Priority setting in chronic diseases: methodology for budget allocation' aims to develop a methodology to support optimal allocation of the health care budget with respect to chronic diseases. The current report describes the modelling steps required to address budget

  20. Stochastic Climate Theory and Modelling

    CERN Document Server

    Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio

    2014-01-01

    Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...

  1. Outcomes analysis of hospital management model in restricted budget conditions

    Directory of Open Access Journals (Sweden)

    Virsavia Vaseva

    2016-03-01

    Facing the conditions of a market economy and financial crisis, the head of any healthcare facility has to take adequate decisions about the cost-effective functioning of the hospital. Along with cost reduction, the main problem is how to maintain a high level of health services. The aim of our study was to analyse the quality of healthcare services after the implementation of control over expenses due to a reduction in the budgetary resources of the Military Medical Academy (MMA), Sofia, Bulgaria. Data from the hospital information system and the Financial Department about the incomes and expenditures for patient treatment were used. We conducted a retrospective study on the main components of clinical indicators in 2013 to reveal the main problems in hospital management. In 2014, control was imposed on the use of the most expensive medicines and consumables. A comparative analysis was made of the results of the medical services in MMA for 2013 and 2014. Our results showed that despite the limited budget in MMA over the last year, the policy of control over operational costs succeeded in maintaining the quality of healthcare services. While reducing the expenses for medicines, consumables and laboratory investigations by ∼26%, some quality criteria for healthcare services were observed to improve by ∼9%. Financial crisis and budget reduction urge healthcare economists to create adequate economic instruments to assist the normal functioning of hospital facilities. Our analysis showed that when the right policy is chosen, better results may be achieved with fewer resources.

  2. A Methodological Review of US Budget-Impact Models for New Drugs.

    Science.gov (United States)

    Mauskopf, Josephine; Earnshaw, Stephanie

    2016-11-01

    A budget-impact analysis is required by many jurisdictions when adding a new drug to the formulary. However, previous reviews have indicated that adherence to methodological guidelines is variable. In this methodological review, we assess the extent to which US budget-impact analyses for new drugs use recommended practices. We describe recommended practice for seven key elements in the design of a budget-impact analysis. Targeted literature searches for US studies reporting estimates of the budget impact of a new drug were performed and we prepared a summary of how each study addressed the seven key elements. The primary finding from this review is that recommended practice is not followed in many budget-impact analyses. For example, we found that growth in the treated population size and/or changes in disease-related costs expected during the model time horizon for more effective treatments was not included in several analyses for chronic conditions. In addition, all drug-related costs were not captured in the majority of the models. Finally, for most studies, one-way sensitivity and scenario analyses were very limited, and the ranges used in one-way sensitivity analyses were frequently arbitrary percentages rather than being data driven. The conclusions from our review are that changes in population size, disease severity mix, and/or disease-related costs should be properly accounted for to avoid over- or underestimating the budget impact. Since each budget holder might have different perspectives and different values for many of the input parameters, it is also critical for published budget-impact analyses to include extensive sensitivity and scenario analyses based on realistic input values.
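
    The calculation whose reporting the review audits reduces, in its simplest form, to comparing treated-population costs under the current and the new market mix over the model horizon; a minimal sketch with hypothetical drug names and figures:

      # Minimal budget-impact calculation: cost of the reference market mix vs. the
      # mix after the new drug enters, per year of the time horizon. All figures are hypothetical.
      def budget_impact(pop_by_year, annual_cost, shares_ref, shares_new):
          impact = []
          for n in pop_by_year:
              cost_ref = n * sum(shares_ref[d] * annual_cost[d] for d in annual_cost)
              cost_new = n * sum(shares_new[d] * annual_cost[d] for d in annual_cost)
              impact.append(cost_new - cost_ref)
          return impact

      annual_cost = {"old_drug": 8000.0, "new_drug": 15000.0}
      shares_ref = {"old_drug": 1.0, "new_drug": 0.0}
      shares_new = {"old_drug": 0.7, "new_drug": 0.3}
      # Treated population grows over a 3-year horizon, as the review recommends modelling.
      print(budget_impact([10000, 10500, 11000], annual_cost, shares_ref, shares_new))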

  3. Assessing GFDL high-resolution climate model water and energy budgets from AMIP simulations over Africa

    Science.gov (United States)

    Tian, Di; Pan, Ming; Jia, Liwei; Vecchi, Gabriel; Wood, Eric F.

    2016-07-01

    This study assessed surface water and energy budgets in Atmospheric Model Intercomparison Project (AMIP) simulations of a coupled atmosphere-land model developed by the Geophysical Fluid Dynamics Laboratory, the AM2.5 atmospheric general circulation model. The AM2.5 water and energy budget variables were compared with four reanalysis data sets and an observation-based reference, the Variable Infiltration Capacity model simulations forced by the Princeton Global Meteorological Forcing (PGF/VIC), over the 20-year period 1991-2010 in nine African river basins. Results showed that AM2.5 has closed water and energy budgets. However, the discrepancies between AM2.5 and other data sets were notable in terms of their long-term averages. For the water budget, the AM2.5 mostly overestimated precipitation, evapotranspiration, and runoff compared to PGF/VIC and reanalyses. The AM2.5, reanalyses, and PGF/VIC showed similar seasonal cycles but discrepant amplitudes. For the energy budget, while the AM2.5 has relatively consistent net radiation with other data sets, it generally showed higher latent heat, lower sensible heat, and lower Bowen ratio than reanalyses and PGF/VIC. In addition, the AM2.5 water and energy budget terms mostly had the smallest interannual variability compared to both reanalyses and PGF/VIC. The spatial differences in long-term mean precipitation, runoff, evapotranspiration, and latent heat between AM2.5 and other data sets were reasonably small in dry regions. On average, AM2.5 is closer to PGF/VIC than R2 and 20CR are, but not as close as the Modern-Era Retrospective analysis for Research and Applications and the Climate Forecast System Reanalysis. The bias in AM2.5 water and energy budget terms may be associated with an excessively wet surface and the parameterization of moisture advection from ocean to land.

  4. Effect of sulfate aerosol on tropospheric NOx and ozone budgets: Model simulations and TOPSE evidence

    Science.gov (United States)

    Tie, Xuexi; Emmons, Louisa; Horowitz, Larry; Brasseur, Guy; Ridley, Brian; Atlas, Elliot; Stround, Craig; Hess, Peter; Klonecki, Andrzej; Madronich, Sasha; Talbot, Robert; Dibb, Jack

    2003-02-01

    The distributions of NOx and O3 are analyzed during TOPSE (Tropospheric Ozone Production about the Spring Equinox). In this study these data are compared with the calculations of a global chemical/transport model (Model for OZone And Related chemical Tracers (MOZART)). Specifically, the effect that hydrolysis of N2O5 on sulfate aerosols has on tropospheric NOx and O3 budgets is studied. The results show that without this heterogeneous reaction, the model significantly overestimates NOx concentrations at high latitudes of the Northern Hemisphere (NH) in winter and spring in comparison to the observations during TOPSE; with this reaction, modeled NOx concentrations are close to the measured values. This comparison provides evidence that the hydrolysis of N2O5 on sulfate aerosol plays an important role in controlling the tropospheric NOx and O3 budgets. The calculated reduction of NOx attributed to this reaction is 80 to 90% in winter at high latitudes over North America. Because of the reduction of NOx, O3 concentrations are also decreased. The maximum O3 reduction occurs in spring although the maximum NOx reduction occurs in winter when photochemical O3 production is relatively low. The uncertainties related to the uptake coefficient and aerosol loading in the model are analyzed. The analysis indicates that the changes in NOx due to these uncertainties are much smaller than the impact of hydrolysis of N2O5 on sulfate aerosol. The effect of N2O5 hydrolysis on the global NOx and O3 budgets is also assessed by the model. The results suggest that in the Northern Hemisphere, the average NOx budget decreases 50% due to this reaction in winter and 5% in summer. The average O3 budget is reduced by 8% in winter and 6% in summer. In the Southern Hemisphere (SH), the sulfate aerosol loading is significantly smaller than in the Northern Hemisphere. As a result, sulfate aerosol has little impact on the NOx and O3 budgets of the Southern Hemisphere.
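
    The heterogeneous loss examined here is usually parameterized as a first-order rate set by the uptake coefficient, the mean molecular speed and the aerosol surface-area density; a sketch of that standard parameterization (MOZART's actual implementation may differ in detail):

      import math

      def n2o5_het_rate(gamma, temp_K, surface_area_cm2_per_cm3):
          """First-order loss rate (s^-1) of N2O5 on sulfate aerosol:
          k = gamma * mean_speed * A / 4 (free-molecular uptake limit)."""
          M = 0.108                                                   # kg/mol, N2O5
          R = 8.314
          mean_speed = math.sqrt(8.0 * R * temp_K / (math.pi * M))    # m/s
          A = surface_area_cm2_per_cm3 * 1.0e-4 * 1.0e6               # cm^2/cm^3 -> m^2/m^3
          return 0.25 * gamma * mean_speed * A

      # Example: gamma = 0.04, 260 K, 100 um^2/cm^3 of sulfate surface area.
      print(n2o5_het_rate(0.04, 260.0, 100e-8))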

  5. Budget calculations for ozone and its precursors: Seasonal and episodic features based on model simulations

    NARCIS (Netherlands)

    Memmesheimer, M.; Ebel, A.; Roemer, M.

    1997-01-01

    Results from two air quality models (LOTOS, EURAD) have been used to analyse the contribution of the different terms in the continuity equation to the budget of ozone, NO(x) and PAN. Both models cover large parts of Europe and describe the processes relevant for tropospheric chemistry and dynamics.

  6. BEYOND BUDGETING

    Directory of Open Access Journals (Sweden)

    Edo Cvrkalj

    2015-12-01

    Traditional budgeting principles, with strictly defined business goals, have been, since 1998, slowly growing into more sophisticated and organization-adjusted alternative budgeting concepts. One of those alternative concepts is the “Beyond budgeting” model with an implemented performance effects measuring process. In order for the model to be practicable, budget planning and control have to be reoriented to the “bottom up” planning and control approach. In today’s modern business surroundings one has to take both present and future opportunities and threats into consideration, by valorizing them in a budget which would allow a company to realize a whole palette of advantages over the traditional budgeting principles, which are presented later in the article. It is essential to emphasize the importance of successfully implementing the new budgeting principles within an organization. If the implementation has been lacking and done without a higher goal in mind, it is easily possible that the process has been implemented without a coordination, planning and control framework within the organization itself. Further in the article we present an overview of managerial techniques and instruments within the “Beyond budgeting” model such as the balanced scorecard, rolling forecasts, dashboards, KPIs and other supporting instruments. Lastly we define seven steps for implementing the “Beyond budgeting” model and offer a comparison of the “Beyond budgeting” model against traditional budgeting principles which lists twelve reasons why “Beyond budgeting” is better suited to modern and market-oriented organizations. Each company faces those challenges in its own characteristic way but implementing new dynamic planning models will soon become essential for surviving in the market.

  7. Models in cooperative game theory

    CERN Document Server

    Branzei, Rodica; Tijs, Stef

    2008-01-01

    This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is considerably enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
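
    For the crisp games mentioned above, the most familiar one-point solution concept is the Shapley value; a small self-contained illustration (the fuzzy and multi-choice extensions treated in the book are not sketched here):

      import math
      from itertools import permutations

      def shapley_value(players, v):
          """Shapley value of a crisp TU-game; v maps a frozenset of players to its worth."""
          phi = {p: 0.0 for p in players}
          for order in permutations(players):
              coalition = frozenset()
              for p in order:
                  phi[p] += v(coalition | {p}) - v(coalition)
                  coalition = coalition | {p}
          n_fact = math.factorial(len(players))
          return {p: x / n_fact for p, x in phi.items()}

      # A 3-player example in which players 1 and 2 are symmetric.
      worth = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
               frozenset({1, 2}): 60, frozenset({1, 3}): 40, frozenset({2, 3}): 40,
               frozenset({1, 2, 3}): 100}
      print(shapley_value([1, 2, 3], lambda s: worth[s]))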

  8. Political Budget Cycles in the European Union

    Directory of Open Access Journals (Sweden)

    Jiří Gregor

    2016-01-01

    This paper provides research on the theme of political budget cycles. The goal is to find out whether or not governments try to manipulate the state budget and its components for the purpose of re-election across the countries of the European Union. In order to verify this theory, a dynamic panel data model was used. The results were significant, but only if predetermined elections were not included in the estimations. In that case, the theory of political budget cycles could be accepted as valid for the EU countries. The main driving force of political budget cycles across the countries of the European Union is the fluctuation of government expenditures. During an election year, government expenditures are higher, and in the year after the election, government expenditures are lower. This is reflected in the state budget balance.
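
    A dynamic-panel specification of the kind used for such a test, written out for concreteness (the variable names are generic, not necessarily the paper's exact regressors):

      \[
        b_{it} \;=\; \alpha\, b_{i,t-1} \;+\; \beta\,\mathrm{ELEC}_{it} \;+\; \gamma' x_{it} \;+\; \mu_i \;+\; \varepsilon_{it},
      \]
      where \(b_{it}\) is the budget balance (or a spending component) of country \(i\) in year \(t\), \(\mathrm{ELEC}_{it}\) is an election-year dummy, \(x_{it}\) collects macroeconomic controls and \(\mu_i\) is a country fixed effect; a political budget cycle corresponds to a significant \(\beta\), here higher expenditures and a weaker balance in election years.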

  9. A Sediment Budget Case Study: Comparing Watershed Scale Erosion Estimates to Modeled and Empirical Sediment Loads

    Science.gov (United States)

    McDavitt, B.; O'Connor, M.

    2003-12-01

    The Pacific Lumber Company Habitat Conservation Plan requires watershed analyses to be conducted on their property. This paper summarizes a portion of that analysis focusing on erosion and sedimentation processes and rates coupled with downstream sediment routing in the Freshwater Creek watershed in northwest California. Watershed scale erosion sources from hillslopes, roads, and channel banks were quantified using field surveys, aerial photo interpretation, and empirical modeling approaches for different elements of the study. Sediment transport rates for bedload were modeled, and sediment transport rates for suspended sediment were estimated based on size distribution of sediment inputs in relation to sizes transported in suspension. Recent short-term, high-quality estimates of suspended sediment yield that a community watershed group collected with technical assistance from the US Forest Service were used to validate the resulting sediment budget. Bedload yield data from an adjacent watershed, Jacoby Creek, provided another check on the sediment budget. The sediment budget techniques and bedload routing models used for this study generated sediment yield estimates that are in good agreement with available data. These results suggest that sediment budget techniques that require moderate levels of fieldwork can be used to provide relatively accurate technical assessments. Ongoing monitoring of sediment sources coupled with sediment routing models and reach scale field data allows for predictions to be made regarding in-channel sediment storage.

  10. Link Budget Analysis and Modeling of Short-Range UWB Channels

    NARCIS (Netherlands)

    Irahhauten, Z.; Dacuna, J.; Janssen, G.J.M.; Nikookar, H.; Yarovoy, A.G.; Ligthart, L.P.

    2008-01-01

    Ultrawideband (UWB) technology is an attractive alternative for short-range applications, e.g., wireless personal area networks. In these applications, transmit and receive antennas are very close to each other and the far-field condition assumed in most of the link budget models may not be

  11. Multi-Sensor Model-Data Assimilation for Improved Modeling of Savanna Carbon and Water Budgets

    Science.gov (United States)

    Barrett, D. J.; Renzullo, L. J.; Guerschman, J.; Hill, M. J.

    2007-12-01

    Model-data assimilation methods are increasingly being used to improve model predictions of carbon pools and fluxes, soil profile moisture contents, and evapotranspiration at catchment to regional scales. In this talk, I will discuss the development of model-data assimilation methods for application to parameter and state estimation problems in the context of savanna carbon and water cycles. A particular focus of this talk will be on the integration of in situ datasets and multiple types of satellite observations with radiative transfer, surface energy balance, and carbon budget models. An example will be drawn from existing work demonstrating regional estimation of soil profile moisture content based on multiple satellite sensors. The data assimilation scheme comprised a forward model, observation operators, multiple observation datasets and an optimization scheme. The forward model propagates model state variables in time based on climate forcing, initial conditions and model parameters and includes processes governing evapotranspiration, water budget and carbon cycle processes. The observation operators calculate modeled land surface temperature and microwave brightness temperatures based on the state variables of profile soil moisture and soil surface layer soil moisture at less than 2.5 cm depth. Satellite observations used in the assimilation scheme are surface brightness temperatures from AMSR-E (passive microwave at 6.9GHz at horizontal polarization) and from AVHRR (thermal channels 4 & 5 from NOAA-18), and land surface reflectances from MODIS Terra (channels 1 and 2 at 250m resolution). These three satellite sensors overpass at approximately the same time of day and provide independent observations of the land surface at different wavelengths. The observed brightness temperatures are used as constraints on the coupled energy balance/microwave radiative transfer model, and a canopy optical model was inverted to retrieve leaf area indices from observed
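
    The ingredients listed above (forward model, observation operators, several satellite data streams, an optimization scheme) fit the familiar variational template; a generic sketch of such a multi-sensor cost function, not the authors' actual scheme:

      import numpy as np

      def assimilation_cost(x, x_b, B_inv, obs_sets):
          """Schematic multi-sensor variational cost: background term plus one
          misfit term per sensor; obs_sets is a list of (y, H, R_inv) tuples,
          where H(x) is that sensor's observation operator."""
          dx = x - x_b
          J = 0.5 * dx @ B_inv @ dx
          for y, H, R_inv in obs_sets:
              d = y - H(x)
              J += 0.5 * d @ R_inv @ d
          return J

      # Here the state x could hold, e.g., profile and surface soil moisture, while each H
      # maps it to AMSR-E brightness temperature, AVHRR land surface temperature, or MODIS reflectance.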

  12. Halo modelling in chameleon theories

    Energy Technology Data Exchange (ETDEWEB)

    Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  13. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
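
    As a hedged illustration of what "generating samples of a stochastic model" can look like in practice, the sketch below draws independent sample paths of a stationary Gaussian process by factorizing an assumed squared-exponential covariance matrix; this is a generic textbook construction, not the algorithms developed in the report.

        # Generate independent samples of a stationary Gaussian process on a 1-D grid
        # by Cholesky factorization of its covariance matrix (generic approach).
        import numpy as np

        n, n_samples = 200, 5
        t = np.linspace(0.0, 10.0, n)
        ell = 1.0                                                 # assumed correlation length
        C = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)  # squared-exponential covariance
        L = np.linalg.cholesky(C + 1e-10 * np.eye(n))             # jitter for numerical stability

        rng = np.random.default_rng(0)
        samples = L @ rng.standard_normal((n, n_samples))         # each column is one sample path
        print(samples.shape)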

  14. A conceptual framework for budget allocation in the RIVM Chronic Disease Model - A case study of Diabetes Mellitus

    NARCIS (Netherlands)

    Hoogenveen RT; Feenstra TL; van Baal PHM; Baan CA; PZO

    2005-01-01

    This report describes the elements of a so-called 'budget allocation model'. The model is intended to support policymakers in decisions on allocating budget to primary prevention and/or prevention within the care for chronic diseases. The concrete application chosen is

  15. Using a unit cost model to predict the impact of budget cuts on logistics products and services

    OpenAIRE

    Van Haasteren, Cleve J.

    1992-01-01

    Approved for Public Release; Distribution is Unlimited The Director of the Trident Integrated Logistics Support Division at the Naval Sea Systems Command manages a complex and dynamic budget that supports the provision of logistics products and services to the Trident submarine fleet. This thesis focuses on analyzing the Logistics Division budget and developing a model where the impact of a budget cut can be predicted by employing marginal cost. The thesis also explores ...

  16. The "covariation method" for estimating the parameters of the standard Dynamic Energy Budget model I: Philosophy and approach

    NARCIS (Netherlands)

    Lika, K.; Kearney, M.R.; Freitas, V.; van der Veer, H.W.; van der Meer, J.; Wijsman, J.W.M.; Pecquerie, L.; Kooijman, S.A.L.M.

    2011-01-01

    The Dynamic Energy Budget (DEB) theory for metabolic organisation captures the processes of development, growth, maintenance, reproduction and ageing for any kind of organism throughout its life-cycle. However, the application of DEB theory is challenging because the state variables and parameters

  17. Economic Modelling in Institutional Economic Theory

    National Research Council Canada - National Science Library

    Wadim Strielkowski; Evgeny Popov

    2017-01-01

    Our paper is centered on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory...

  18. Demonstration of the gypsy moth energy budget microclimate model

    Science.gov (United States)

    D. E. Anderson; D. R. Miller; W. E. Wallner

    1991-01-01

    The use of a "User friendly" version of "GMMICRO" model to quantify the local environment and resulting core temperature of GM larvae under different conditions of canopy defoliation, different forest sites, and different weather conditions was demonstrated.

  19. The effects of atmospheric chemistry on radiation budget in the Community Earth Systems Model

    Science.gov (United States)

    Choi, Y.; Czader, B.; Diao, L.; Rodriguez, J.; Jeong, G.

    2013-12-01

    The Community Earth Systems Model (CESM)-Whole Atmosphere Community Climate Model (WACCM) simulations were performed to study the impact of atmospheric chemistry on the radiation budget over the surface within a weather prediction time scale. The secondary goal is to obtain a simplified and optimized chemistry module for this short time period. Three different chemistry modules were utilized to represent tropospheric and stratospheric chemistry, which differ in how their reactions and species are represented: (1) simplified tropospheric and stratospheric chemistry (approximately 30 species), (2) simplified tropospheric chemistry and comprehensive stratospheric chemistry from the Model of Ozone and Related Chemical Tracers, version 3 (MOZART-3, approximately 60 species), and (3) comprehensive tropospheric and stratospheric chemistry (MOZART-4, approximately 120 species). Our results indicate that the differences in chemistry treatment among these model components affect the surface temperature and the radiation budget.

  20. Stream Heat Budget Modeling of Groundwater Inputs: Model Development and Validation

    Science.gov (United States)

    Glose, A.; Lautz, L. K.

    2012-12-01

    Models of physical processes in fluvial systems are useful for improving understanding of hydrologic systems and for predicting future conditions. Process-based models of fluid flow and heat transport in fluvial systems can be used to quantify unknown spatial and temporal patterns of hydrologic fluxes, such as groundwater discharge, and to predict system response to future change. In this study, a stream heat budget model was developed and calibrated to observed stream water temperature data for Meadowbrook Creek in Syracuse, NY. The one-dimensional (longitudinal), transient stream temperature model is programmed in Matlab and solves the equations for heat and fluid transport using a Crank-Nicolson finite difference scheme. The model considers four meteorologically driven heat fluxes: shortwave solar radiation, longwave radiation, latent heat flux, and sensible heat flux. Streambed conduction is also considered. Input data for the model were collected from June 13-18, 2012 over a 500 m reach of Meadowbrook Creek, a first-order urban stream that drains a retention pond in the city of Syracuse, NY. Stream temperature data were recorded every 20 m longitudinally in the stream at 5-minute intervals using iButtons (model DS1922L, accuracy of ±0.5°C, resolution of 0.0625°C). Meteorological data, including air temperature, solar radiation, relative humidity, and wind speed, were recorded at 5-minute intervals using an on-site weather station. Groundwater temperature was measured in wells adjacent to the stream. Stream dimensions, bed temperatures, and type of bed sediments were also collected. A constant-rate tracer injection of Rhodamine WT was used to independently quantify groundwater inputs every 10 m and validate model results. Stream temperatures fluctuated diurnally by ~3-5 °C during the observation period, with temperatures peaking around 2 pm and cooling overnight, reaching a minimum between 6 and 7 am. Spatially, the stream shows a cooling trend along the
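
    For orientation, the sketch below advances a 1-D temperature profile by one Crank-Nicolson step for pure longitudinal conduction; grid spacing, diffusivity and boundary values are placeholders, and the meteorological heat-flux and groundwater terms of the actual model are omitted.

        # One Crank-Nicolson step for 1-D heat diffusion along a stream reach.
        # Parameters are placeholders; this is only the numerical skeleton.
        import numpy as np

        nx, dx, dt, alpha = 26, 20.0, 300.0, 0.05   # 500 m reach, 20 m nodes, 5-min step, m2/s
        r = alpha * dt / (2.0 * dx**2)

        T = np.full(nx, 18.0)                        # initial stream temperature (deg C)
        T[0], T[-1] = 20.0, 17.0                     # fixed boundary temperatures

        # Tridiagonal Crank-Nicolson system A * T_new = B * T_old for the interior nodes.
        A = np.diag((1 + 2*r) * np.ones(nx)) + np.diag(-r * np.ones(nx-1), 1) + np.diag(-r * np.ones(nx-1), -1)
        B = np.diag((1 - 2*r) * np.ones(nx)) + np.diag(r * np.ones(nx-1), 1) + np.diag(r * np.ones(nx-1), -1)
        A[0, :], A[-1, :] = 0, 0
        A[0, 0], A[-1, -1] = 1, 1                    # Dirichlet boundaries
        B[0, :], B[-1, :] = 0, 0
        B[0, 0], B[-1, -1] = 1, 1

        T_new = np.linalg.solve(A, B @ T)
        print(T_new.round(2))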

  1. Participative Budgeting as a Communication Process: A Model and Experiment.

    Science.gov (United States)

    1978-01-01

    If two persons share neither the same comparison objects (that is, are not co-orientated) nor the attributes concerning these objects, other ... effects of this model operation. By incorporating the suggestions of the prior research and basing the model on well established budget

  2. A Dynamic Energy Budget (DEB) model to describe Laternula elliptica (King, 1832) seasonal feeding and metabolism.

    Directory of Open Access Journals (Sweden)

    Antonio Agüera

    Full Text Available Antarctic marine organisms are adapted to an extreme environment, characterized by a very low but stable temperature and a strong seasonality in food availability arising from variations in day length. Ocean organisms are particularly vulnerable to global climate change with some regions being impacted by temperature increase and changes in primary production. Climate change also affects the biotic components of marine ecosystems and has an impact on the distribution and seasonal physiology of Antarctic marine organisms. Knowledge on the impact of climate change in key species is highly important because their performance affects ecosystem functioning. To predict the effects of climate change on marine ecosystems, a holistic understanding of the life history and physiology of Antarctic key species is urgently needed. DEB (Dynamic Energy Budget) theory captures the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model is a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. In this study, we estimate the DEB model parameters for the bivalve Laternula elliptica using literature-extracted and field data. The DEB model we present here aims at better understanding the biology of L. elliptica and its levels of adaptation to its habitat with a special focus on food seasonality. The model parameters describe a metabolism specifically adapted to low temperatures, with a low maintenance cost and a high capacity to uptake and mobilise energy, providing this organism with a level of energetic performance matching that of related species from temperate regions. It was also found that L. elliptica has a large energy reserve that allows enduring long periods of starvation. Additionally, we applied DEB parameters to time-series data on biological traits (organism condition, gonad growth) to describe the

  3. A Dynamic Energy Budget (DEB) model to describe Laternula elliptica (King, 1832) seasonal feeding and metabolism.

    Science.gov (United States)

    Agüera, Antonio; Ahn, In-Young; Guillaumot, Charlène; Danis, Bruno

    2017-01-01

    Antarctic marine organisms are adapted to an extreme environment, characterized by a very low but stable temperature and a strong seasonality in food availability arising from variations in day length. Ocean organisms are particularly vulnerable to global climate change with some regions being impacted by temperature increase and changes in primary production. Climate change also affects the biotic components of marine ecosystems and has an impact on the distribution and seasonal physiology of Antarctic marine organisms. Knowledge on the impact of climate change in key species is highly important because their performance affects ecosystem functioning. To predict the effects of climate change on marine ecosystems, a holistic understanding of the life history and physiology of Antarctic key species is urgently needed. DEB (Dynamic Energy Budget) theory captures the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model is a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. In this study, we estimate the DEB model parameters for the bivalve Laternula elliptica using literature-extracted and field data. The DEB model we present here aims at better understanding the biology of L. elliptica and its levels of adaptation to its habitat with a special focus on food seasonality. The model parameters describe a metabolism specifically adapted to low temperatures, with a low maintenance cost and a high capacity to uptake and mobilise energy, providing this organism with a level of energetic performance matching that of related species from temperate regions. It was also found that L. elliptica has a large energy reserve that allows enduring long periods of starvation. Additionally, we applied DEB parameters to time-series data on biological traits (organism condition, gonad growth) to describe the effect of a

  4. New Pathways between Group Theory and Model Theory

    CERN Document Server

    Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz

    2017-01-01

    This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...

  5. Evaluating model assumptions in item response theory

    NARCIS (Netherlands)

    Tijmstra, J.

    2013-01-01

    This dissertation deals with the evaluation of model assumptions in the context of item response theory. Item response theory, also known as modern test theory, provides a statistical framework for the measurement of psychological constructs that cannot be observed directly, such as intelligence or

  6. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  7. Water Budget Model for a Remnant of the Historic Northern Everglades

    Science.gov (United States)

    Arceneaux, J. C.; Meselhe, E. A.; Habib, E.; Waldon, M. G.

    2006-12-01

    The Arthur R. Marshall Loxahatchee National Wildlife Refuge overlays an area termed Water Conservation Area 1 (WCA-1), a 143,000-acre (58,000 ha) freshwater wetland. It is a remnant of the northern Everglades in Palm Beach County, Florida, USA. Sheetflow that naturally would flow across the Refuge wetlands was disrupted in the 1950s and early 1960s by construction of stormwater pumps and levees with associated borrow canals, which hydraulically isolated the Refuge from its watershed. The U.S. Fish and Wildlife Service (USFWS) concludes that changes in the water quantity, timing, and quality have caused negative impacts to the Refuge ecosystem. It is a top priority of the Refuge to ensure appropriate management that will produce maximum benefits for fish and wildlife, while meeting flood control and water supply needs. Models can improve our understanding and support improvement in these management decisions. The development of a water budget for the Loxahatchee Refuge will provide one useful modeling tool in support of Refuge water management decisions. The water budget model reported here was developed as a double-box (2-compartment) model with a daily time step that predicts temporal variations of water level in the Refuge rim canal and interior marsh based on observed inflows, outflows, precipitation, and evapotranspiration. The water budget model was implemented using Microsoft EXCEL. The model calibration period was from January 1, 1995 to December 31, 1999; the validation period extended from January 1, 2000 to December 31, 2004. Statistical analyses demonstrate the utility of this simple water budget model to predict the temporal variation of water levels in both the Refuge marsh and rim canal. The Refuge water budget model is currently being applied to evaluate various water management scenarios for the Refuge. Preliminary results modeling the mass balance of water quality constituents, including chloride and total phosphorus, are encouraging. Success of this
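
    A minimal sketch of a double-box daily water-budget step of this general kind is given below; the areas, exchange coefficient and forcing numbers are invented for illustration and are not the calibrated Refuge model.

        # Minimal two-compartment daily water budget (rim canal + interior marsh).
        # All parameter values are invented placeholders.
        def daily_step(stage_canal, stage_marsh, inflow, outflow, precip, et,
                       area_canal=5.0e6, area_marsh=5.5e8, k_exchange=1.0e5):
            """Advance canal and marsh stages (m) by one day; flows in m3/day, precip/et in m/day."""
            exchange = k_exchange * (stage_canal - stage_marsh)          # canal -> marsh if positive
            d_canal = (inflow - outflow - exchange) / area_canal + precip - et
            d_marsh = exchange / area_marsh + precip - et
            return stage_canal + d_canal, stage_marsh + d_marsh

        canal, marsh = 5.2, 5.0
        for day in range(3):
            canal, marsh = daily_step(canal, marsh, inflow=4.0e5, outflow=3.0e5,
                                      precip=0.004, et=0.005)
            print(f"day {day+1}: canal={canal:.3f} m, marsh={marsh:.3f} m")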

  8. Quantum field theory competitive models

    CERN Document Server

    Tolksdorf, Jürgen; Zeidler, Eberhard

    2009-01-01

    For more than 70 years, quantum field theory (QFT) has been a driving force in the development of theoretical physics. Equally fascinating is the fruitful impact which QFT has had in rather remote areas of mathematics. The present book features some of the different approaches, different physical viewpoints and techniques used to make the notion of quantum field theory more precise. For example, the present book contains a discussion including general considerations, stochastic methods, deformation theory and the holographic AdS/CFT correspondence. It also contains a discussion of more recent developments like the use of category theory and topos theoretic methods to describe QFT. The present volume emerged from the 3rd 'Blaubeuren Workshop: Recent Developments in Quantum Field Theory', held in July 2007 at the Max Planck Institute of Mathematics in the Sciences in Leipzig/Germany. All of the contributions are committed to the idea of this workshop series: 'To bring together outstanding experts working in...

  9. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  10. Subcellular metabolic organization in the context of dynamic energy budget and biochemical systems theories

    NARCIS (Netherlands)

    Vinga, S.; Neves, A.S.; Santos, H.; Brandt, B.W.; Kooijman, S.A.L.M.

    2010-01-01

    The dynamic modelling of metabolic networks aims to describe the temporal evolution of metabolite concentrations in cells. This area has attracted increasing attention in recent years owing to the availability of high-throughput data and the general development of systems biology as a promising

  11. The Ozone Budget in the Upper Troposphere from Global Modeling Initiative (GMI) Simulations

    Science.gov (United States)

    Rodriquez, J.; Duncan, Bryan N.; Logan, Jennifer A.

    2006-01-01

    Ozone concentrations in the upper troposphere are influenced by in-situ production, long-range tropospheric transport, and influx of stratospheric ozone, as well as by photochemical removal. Since ozone is an important greenhouse gas in this region, it is particularly important to understand how it will respond to changes in anthropogenic emissions and changes in stratospheric ozone fluxes. This response will be determined by the relative balance of the different production, loss and transport processes. Ozone concentrations calculated by models will differ depending on the adopted meteorological fields, their chemical scheme, anthropogenic emissions, and treatment of the stratospheric influx. We performed simulations using the chemical-transport model from the Global Modeling Initiative (GMI) with meteorological fields from (1) the NASA Goddard Institute for Space Studies (GISS) general circulation model (GCM), (2) the atmospheric GCM from NASA's Global Modeling and Assimilation Office (GMAO), and (3) assimilated winds from GMAO. These simulations adopt the same chemical mechanism and emissions, and adopt the Synthetic Ozone (SYNOZ) approach for treating the influx of stratospheric ozone. In addition, we also performed simulations for a coupled troposphere-stratosphere model with a subset of the same winds. Simulations were done for both 4° x 5° and 2° x 2.5° resolution. Model results are being tested through comparison with a suite of atmospheric observations. In this presentation, we diagnose the ozone budget in the upper troposphere utilizing the suite of GMI simulations, to address the sensitivity of this budget to: a) the different meteorological fields used; b) the adoption of the SYNOZ boundary condition versus inclusion of a full stratosphere; c) model horizontal resolution. Model results are compared to observations to determine biases in particular simulations; by examining these comparisons in conjunction with the derived budgets, we may pinpoint

  12. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state, such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
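
    For reference, the basic structure of an f-theory viscosity model (a hedged summary of the general form, not equations taken from this record) separates a dilute-gas term from a friction term built on the attractive and repulsive pressure contributions p_a and p_r of the cubic equation of state:

        \[
        \eta \;=\; \eta_0 + \eta_f ,
        \qquad
        \eta_f \;=\; \kappa_a\, p_a \;+\; \kappa_r\, p_r \;+\; \kappa_{rr}\, p_r^{\,2},
        \]

    where the kappa coefficients are the friction (tuning) parameters adjusted in the procedure described above.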

  13. Domain Theory, Its Models and Concepts

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt

    2014-01-01

    The Domain Theory is a model-based theory, which means it is composed of concepts and models which explain certain design phenomena. It is not aiming to create normative methods but rather the creation of a collection of concepts related to design phenomena, which can support design work and form elements of designers' mindsets and thereby their practice. Many similar theories are described in the literature, with differences in the set of concepts, but assumingly all valid. The theory contains a rich ontology of interrelated concepts and has industrial applications, especially for the DFX areas (not reported here) and for product modelling. The Domain Theory cannot be falsified or proven, but its value may be seen spanning from its range and productivity as described in the article.

  14. Development of Turbulent Diffusion Transfer Model to Estimate Hydrologic Budget of Upper Klamath Lake Oregon, USA

    Science.gov (United States)

    Sahoo, G. B.; Schladow, G.

    2013-12-01

    Detailed and accurate hydrologic budgets of lakes or reservoirs are essential for sustainable water supply and ecosystem management due to increasing water demand and uncertainties related to climate change. Ensuring sustainable water allocation to stakeholders requires accurate heat and hydrologic budgets. A number of micrometeorological methods have been developed to approximate heat budget components, such as evaporative and sensible heat loss, that are not directly measurable. Although micrometeorological methods estimate the sensible and evaporative loss well under stationary (i.e. ideal) conditions, they are rarely applicable under non-idealized conditions. We developed a turbulent diffusion transfer model and coupled it to the dynamic lake model (DLM-WQ), developed at UC Davis, with the goal of correctly estimating the hydrologic budget of Upper Klamath Lake, Oregon, USA. The measured and DLM-WQ estimated lake water temperatures and water elevation are in excellent agreement, with correlation coefficients of 0.95 and 0.99, respectively. Consistent with previous studies, the sensible and latent heat exchange coefficients were found to be site specific. Estimated lake mixing shows that the lake became strongly stratified during summer (between late April and the end of August). For the hypereutrophic shallow Upper Klamath Lake, longer stratification results in low dissolved oxygen (DO) concentration at the sediment surface, causing destruction of DO-sensitive habitat and ecological problems. The updated DLM-WQ can provide quantitative estimates of hydrologic components and predict the effects of natural- or human-induced changes in one component of the hydrologic cycle on the lake supplies and associated consequences.

  15. The AquaDEB project: Physiological flexibility of aquatic animals analysed with a generic dynamic energy budget model (phase II).

    NARCIS (Netherlands)

    Alunno-Bruscia, M.; v.d. Veer, H.; Kooijman, S.A.L.M.

    2011-01-01

    This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this five-year project,

  16. Reconciliation Model of Transparency Value and Bureaucracy Secretion in Management of Local Government Budget

    Directory of Open Access Journals (Sweden)

    I Putu Yoga Bumi Pradana

    2015-02-01

    Full Text Available This study aims to present a reconciliation model of the bureaucratic principle (secrecy) and the democratic principle (transparency) through a mapping of which public information about local government budget management is accessible to the public and which is excluded (secret), based on the perceptions of the bureaucracy and the public. This study uses a mixed method with a sequential exploratory design; data were collected through surveys, in-depth interviews, and documents, and validated using source triangulation techniques. The subjects of this study were divided into two groups of informants, namely the government bureaucracy and the public of Kupang, selected purposively. The results show that the Kupang government bureaucracy perceives 22 types of information (33.85%) as open and 42 types of information (64.62%) as closed, while the public perceives 29 types of information (44.62%) as open and 26 types of information (40%) as closed. Therefore, to achieve reconciliation and end the conflict between bureaucracy and public, the budget management information classified as open comprises 32 types of information (49.2%) and the information classified as closed comprises 33 types of information (50.8%) of the 65 types of budget management information under Regulation No. 13 of 2006 on Local Financial Management.

  17. The measurement of the earth's radiation budget as a problem in information theory - A tool for the rational design of earth observing systems

    Science.gov (United States)

    Barkstrom, B. R.

    1983-01-01

    The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.
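
    In generic notation (standard linear-estimation bookkeeping, not formulas quoted from the paper), the approximately linear measurement and the associated error propagation can be written as:

        \[
        \mathbf{y} \;=\; \mathbf{H}\,\mathbf{x} \;+\; \boldsymbol{\varepsilon},
        \qquad
        \mathbf{S}_y \;=\; \mathbf{H}\,\mathbf{S}_x\,\mathbf{H}^{\mathsf T} \;+\; \mathbf{S}_\varepsilon ,
        \]

    where x is the field of radiant exitances, H the measurement (sampling) operator, epsilon the measurement error, and S_x, S_epsilon the covariance matrices describing field variability and error propagation.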

  18. Estimating the impact of petroleum substances on survival in early life stages of cod (Gadus morhua) using the Dynamic Energy Budget theory

    NARCIS (Netherlands)

    Klok, T.C.; Nordtug, T.; Tamis, J.E.

    2014-01-01

    To estimate the impact of accidental oil-spills on cod fisheries a model framework is developed in which a Dynamic Energy Budget (DEB) model is applied to assess mortality caused by petroleum substances in early life stages. In this paper we report on a literature search and DEB analyses, aiming for

  19. Modelling the carbon budget of intensive forest monitoring sites in Germany using the simulation model BIOME-BGC

    OpenAIRE

    Jochheim, H.; Puhlmann, M.; Beese, F.; Berthold, D.; Einert, P.; Kallweit, R.; Konopatzky, A.; Meesenburg, H.; Meiwes, K.-J.; Raspe, S.; Schulte-Bisping, H.; Schulz, C.

    2008-01-01

    It is shown that by calibrating the simulation model BIOME-BGC with mandatory and optional Level II data within the ICP Forests programme, a well-founded calculation of the carbon budget of forest stands is achievable and that, based on successful calibration, the modified BIOME-BGC model is a useful tool to assess the effect of climate change on forest ecosystems.

  20. System of Budget Planning, Programming, Development and Execution and the Defence Resources Management Model (DRMM)

    Directory of Open Access Journals (Sweden)

    Davor Čutić

    2010-07-01

    Full Text Available The system of budget planning, programming, development and execution of the Ministry of Defence of the Republic of Croatia (henceforth: the Croatian acronym SPPIIP) is the basic system for the strategic management of defence resources through which an effective and rational distribution of available resources is conducted, based on the goals of national security of the Republic of Croatia. This system sets the principles of transparency and democratic management of defence resources while respecting the specificities of the defence system. The SPPIIP allows for decision making based on complete information about alternatives and the choice of the most economical and most efficient way to reach the goal. It unites the strategic plan, program and budget. It consists of four continuous, independent and interconnected phases: planning, programming, development and the execution of the budget. The processes of the phases are dynamic and cyclic. In addition to the SPPIIP, the Defence Resources Management Model (DRMM; Croatian acronym: MURO) has also been developed. This is an analytic tool which serves as a decision support system in the SPPIIP. The DRMM is a complex computer model showing graphical and tabular overviews over a multi-year period. The model examines three areas: the strength of the forces, expenses and defence programs. The purpose of the model is cost and strength analysis and the analysis of compromise and feasibility, i.e. how sensitive the programs are to fiscal movements in the sphere of the MoD budget in the course of a multi-year cycle, until a certain project ends. The analysis results are an easily understandable basis for decision making. The SPPIIP and the DRMM are mutually independent systems, but they complement each other well. The SPPIIP uses the DRMM in design and resource allocation based on the goals set. The quality of the DRMM depends on the amount and quality of data in its database. The DRMM can be used as a basis for

  1. Budgeting and Finance

    NARCIS (Netherlands)

    F.K.M. van Nispen tot Pannerden (Frans)

    2012-01-01

    The Call for a Budgetary Theory: The appeal of Valdimer Key for a budgetary theory marks the interest in public budgeting in modern history. He clearly referred to a normative theory, raising the question: ‘on what basis shall it be decided to allocate X dollars to activity A instead of

  2. Constraint theory multidimensional mathematical model management

    CERN Document Server

    Friedman, George J

    2017-01-01

    Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...

  3. Staircase Models from Affine Toda Field Theory

    CERN Document Server

    Dorey, P; Dorey, Patrick; Ravanini, Francesco

    1993-01-01

    We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.

  4. A Theory-Based Computer Tutorial Model.

    Science.gov (United States)

    Dixon, Robert C.; Clapp, Elizabeth J.

    Because of the need for models to illustrate some possible answers to practical courseware development questions, a specific, three-section model incorporating the Corrective Feedback Paradigm (CFP) is advanced for applying theory to courseware. The model is reconstructed feature-by-feature against a framework of a hypothetical, one-to-one,…

  5. A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France

    Energy Technology Data Exchange (ETDEWEB)

    De Cock, Erwin, E-mail: erwin.decock@unitedbiosource.com [United BioSource Corporation, Peri- and Post-Approval Services (Spain); Sapoval, Marc, E-mail: Marc.sapoval2@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Department of Cardiovascular and Interventional Radiology (France); Julia, Pierre, E-mail: pierre.julia@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Cardiovascular Surgery Department (France); Lissovoy, Greg de, E-mail: gdelisso@jhsph.edu [Johns Hopkins Bloomberg School of Public Health, Department of Health Policy and Management (United States); Lopes, Sandra, E-mail: Sandra.Lopes@CookMedical.com [Cook Medical, Health Economics and Reimbursement (Denmark)

    2013-04-15

    The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis from the payer perspective used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9 %) and BMS rates reported in the literature (average 16 and 22 %) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) for a scenario where BMS is progressively replaced by Zilver PTX compared to a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.
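
    The arithmetic of such a budget-impact comparison is sketched below with entirely hypothetical volumes, tariffs and TLR rates (the published French inputs are not reproduced here); the net impact is the cost of the mixed DES/BMS scenario minus the cost of the BMS-only scenario, so a negative value indicates net savings.

        # Schematic net-budget-impact calculation. All numbers are hypothetical
        # placeholders, not the published inputs of the French analysis.
        def scenario_cost(n_patients, share_des, tariff_des, tariff_bms,
                          tlr_rate_des, tlr_rate_bms, tariff_tlr):
            n_des = n_patients * share_des
            n_bms = n_patients - n_des
            primary = n_des * tariff_des + n_bms * tariff_bms
            reinterventions = (n_des * tlr_rate_des + n_bms * tlr_rate_bms) * tariff_tlr
            return primary + reinterventions

        n_patients = 16000                          # hypothetical annual SFA stenting volume
        cost_mixed = scenario_cost(n_patients, share_des=0.25, tariff_des=4200.0,
                                   tariff_bms=3800.0, tlr_rate_des=0.09,
                                   tlr_rate_bms=0.22, tariff_tlr=5200.0)
        cost_bms_only = scenario_cost(n_patients, share_des=0.0, tariff_des=4200.0,
                                      tariff_bms=3800.0, tlr_rate_des=0.09,
                                      tlr_rate_bms=0.22, tariff_tlr=5200.0)
        print("net budget impact (EUR):", round(cost_mixed - cost_bms_only))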

  6. A course on basic model theory

    CERN Document Server

    Sarbadhikari, Haimanti

    2017-01-01

    This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute the first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.

  7. Developing integrated parametric planning models for budgeting and managing complex projects

    Science.gov (United States)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

    The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
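
    As a small illustration of the trapezoidal-segmentation idea, the sketch below integrates a piecewise-linear cost-loading function over an interval to obtain a segment cost; the breakpoints and rates are invented, and the original implementation vehicle was a spreadsheet rather than Python.

        # Cost of one project segment as the integral of a linearly segmented
        # (piecewise-linear) cost-loading function; breakpoints and rates are invented.
        import numpy as np

        t_breaks = np.array([0.0, 2.0, 6.0, 8.0])        # months
        rate = np.array([0.0, 50.0, 50.0, 0.0])          # cost-loading rate (k$/month) at breakpoints

        def segment_cost(t0, t1, t_breaks, rate, n=1000):
            """Integrate the piecewise-linear loading rate from t0 to t1 (trapezoid rule)."""
            t = np.linspace(t0, t1, n)
            y = np.interp(t, t_breaks, rate)
            return np.sum(0.5 * (y[:-1] + y[1:]) * np.diff(t))

        total = segment_cost(0.0, 8.0, t_breaks, rate)   # full trapezoidal segment
        print(f"segment cost: {total:.1f} k$")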

  8. Simulated effects of nitrogen saturation on the global carbon budget using the IBIS model

    Science.gov (United States)

    Lu, Xuehe; Jiang, Hong; Liu, Jinxun; Zhang, Xiuying; Jin, Jiaxin; Zhu, Qiuan; Zhang, Zhen; Peng, Changhui

    2016-01-01

    Over the past 100 years, human activity has greatly changed the rate of atmospheric N (nitrogen) deposition in terrestrial ecosystems, resulting in N saturation in some regions of the world. The contribution of N saturation to the global carbon budget remains uncertain due to the complicated nature of C-N (carbon-nitrogen) interactions and diverse geography. Although N deposition is included in most terrestrial ecosystem models, the effect of N saturation is frequently overlooked. In this study, the IBIS (Integrated BIosphere Simulator) was used to simulate the global-scale effects of N saturation during the period 1961–2009. The results of this model indicate that N saturation reduced global NPP (Net Primary Productivity) and NEP (Net Ecosystem Productivity) by 0.26 and 0.03 Pg C yr−1, respectively. The negative effects of N saturation on carbon sequestration occurred primarily in temperate forests and grasslands. In response to elevated CO2 levels, global N turnover slowed due to increased biomass growth, resulting in a decline in soil mineral N. These changes in N cycling reduced the impact of N saturation on the global carbon budget. However, elevated N deposition in certain regions may further alter N saturation and C-N coupling.

  9. Economic Modelling in Institutional Economic Theory

    Directory of Open Access Journals (Sweden)

    Wadim Strielkowski

    2017-06-01

    Full Text Available Our paper is centered on the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of this institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the basics of institutional economics. We propose scientific ideas concerning new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalising our results and maximising the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.

  10. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
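
    As background only (the classical Warner design, which may differ from the specific RR variant modeled in this paper), the observed proportion of 'yes' answers relates to the sensitive-trait prevalence pi through:

        \[
        P(\text{yes}) \;=\; p\,\pi + (1-p)(1-\pi),
        \qquad
        \hat{\pi} \;=\; \frac{\hat{\lambda} + p - 1}{2p - 1}, \quad p \neq \tfrac12 ,
        \]

    where p is the known probability that a respondent answers the sensitive question rather than its complement and lambda-hat is the observed 'yes' proportion.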

  11. Ozone Budgets from a Global Chemistry/ Transport Model and Comparison to Observations from POLARIS

    Science.gov (United States)

    Kawa, S. Randy

    1999-01-01

    The objective of the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) field mission was to obtain data to better characterize the summertime seasonal decrease of ozone at mid to high latitudes. The decrease in ozone occurs mainly in the lower stratosphere and is expected to result from in situ chemical destruction. Instrumented balloons and aircraft were used in POLARIS, along with satellites, to measure ozone and chemical species which are involved with stratospheric ozone chemistry. In order to close the seasonal ozone budget, however, ozone transport must also be estimated. Comparison to a global chemistry and transport model (CTM) of the stratosphere indicates how well the summertime ozone loss processes are simulated and thus how well we can predict the ozone response to changing amounts of chemical source gases. Moreover, the model gives insight into the possible relative magnitude of transport contributions to the seasonal ozone decline. Initial comparison to the Goddard CTM, which uses transport winds and temperatures from meteorological data assimilation, shows a high ozone bias in the model and an attenuated summertime ozone loss cycle. Comparison of the model chemical partitioning, and ozone catalytic loss rates to those derived from measurements shows fairly close agreement both at ER-2 altitudes (20 km) and higher. This suggests that the model transport is too active in resupplying ozone to the high latitude region, although chemistry failings cannot be completely ruled out. Comparison of ozone and related species will be shown along with a full diagnosis of the model ozone budget and its possible sources of error.

  12. Supersymmetric SYK model and random matrix theory

    Science.gov (United States)

    Li, Tianlin; Liu, Junyu; Xin, Yuan; Zhou, Yehao

    2017-06-01

    In this paper, we investigate the effect of supersymmetry on the symmetry classification of random matrix theory ensembles. We mainly consider the random matrix behaviors in the N=1 supersymmetric generalization of Sachdev-Ye-Kitaev (SYK) model, a toy model for two-dimensional quantum black hole with supersymmetric constraint. Some analytical arguments and numerical results are given to show that the statistics of the supersymmetric SYK model could be interpreted as random matrix theory ensembles, with a different eight-fold classification from the original SYK model and some new features. The time-dependent evolution of the spectral form factor is also investigated, where predictions from random matrix theory are governing the late time behavior of the chaotic hamiltonian with supersymmetry.

  13. Graphical Model Theory for Wireless Sensor Networks

    Energy Technology Data Exchange (ETDEWEB)

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.

  14. Some Results in Dynamic Model Theory

    Science.gov (United States)

    2004-01-01

    Dexter Kozen, Science of Computer Programming 51 (2004) 3–22. At the first-order level, we recall the definition of Tarskian frames over a first-order signature.

  15. A water-budget model and estimates of groundwater recharge for Guam

    Science.gov (United States)

    Johnson, Adam G.

    2012-01-01

    On Guam, demand for groundwater tripled from the early 1970s to 2010. The demand for groundwater is anticipated to further increase in the near future because of population growth and a proposed military relocation to Guam. Uncertainty regarding the availability of groundwater resources to support the increased demand has prompted an investigation of groundwater recharge on Guam using the most current data and accepted methods. For this investigation, a daily water-budget model was developed and used to estimate mean recharge for various land-cover and rainfall conditions. Recharge was also estimated for part of the island using the chloride mass-balance method. Using the daily water-budget model, estimated mean annual recharge on Guam is 394.1 million gallons per day, which is 39 percent of mean annual rainfall (999.0 million gallons per day). Although minor in comparison to rainfall on the island, water inflows from water-main leakage, septic-system leachate, and stormwater runoff may be several times greater than rainfall at areas that receive these inflows. Recharge is highest in areas that are underlain by limestone, where recharge is typically between 40 and 60 percent of total water inflow. Recharge is relatively high in areas that receive stormwater runoff from storm-drain systems, but is relatively low in urbanized areas where stormwater runoff is routed to the ocean or to other areas. In most of the volcanic uplands in southern Guam where runoff is substantial, recharge is less than 30 percent of total water inflow. The water-budget model in this study differs from all previous water-budget investigations on Guam by directly accounting for canopy evaporation in forested areas, quantifying the evapotranspiration rate of each land-cover type, and accounting for evaporation from impervious areas. For the northern groundwater subbasins defined in Camp, Dresser & McKee Inc. (1982), mean annual baseline recharge computed in this study is 159.1 million gallons

  16. Earth Radiation Budget and Cloudiness Simulations with a General Circulation Model.

    Science.gov (United States)

    Harshvardhan; Randall, David A.; Corsetti, Thomas G.

    1989-07-01

    The UCLA/GLA general circulation model has been endowed with new parameterizations of solar and terrestrial radiation, as well as new parameterized cloud optical properties. A simple representation of the cloud liquid water feedback is included. We have used the model and several observational datasets to analyze the effects of cloudiness on the Earth's radiation budget. Analysis of January and July results obtained with the full model shows that the simulated Earth radiation budget is in reasonable agreement with Nimbus 7 data. The globally averaged planetary albedo and outgoing longwave radiation are both slightly less than observed. A tropical minimum of the outgoing longwave radiation is simulated, but is weaker than observed. Comparisons of the simulated cloudiness with observations from ISCCP and HIRS2/MSU show that the model overpredicts subtropical and midlatitude cloudiness. The simulated cloud radiative forcings at the top of the atmosphere, at the Earth's surface, and across the atmosphere are discussed, and comparisons are made with the limited observations available. The simulated atmospheric cloud radiative forcing (ACRF) is comparable in magnitude to the latent heating. We have compared the clear-sky radiation fields obtained using Methods I and II of Cess and Potter; the results show significant differences between the two methods, primarily due to systematic variations of the cloudiness with time of day. An important feature of the new terrestrial radiation parameterization is its incorporation (for the first time in this GCM) of the effects of the water vapor continuum. To determine the effects of this change on the model results, we performed a numerical experiment in which the effects of the water vapor continuum were neglected. The troposphere warmed dramatically, shallow convection weakened, and the radiative effects of the clouds were significantly enhanced.

  17. Security Theorems via Model Theory

    Directory of Open Access Journals (Sweden)

    Joshua Guttman

    2009-11-01

    Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form. They are quantified implications for all xs . (phi implies for some ys . psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces for all xs . (phi implies for some ys . psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.

  18. 3D modeling of phytoplankton seasonal variation and nutrient budget in a southern Mediterranean Lagoon.

    Science.gov (United States)

    Béjaoui, Béchir; Solidoro, Cosimo; Harzallah, Ali; Chevalier, Cristèle; Chapelle, Annie; Zaaboub, Noureddine; Aleya, Lotfi

    2017-01-30

    A 3D coupled physical-biogeochemical model is developed and applied to Bizerte Lagoon (Tunisia), in order to understand and quantitatively assess its hydrobiological functioning and nutrients budget. The biogeochemical module accounts for nitrogen and phosphorus and includes the water column and upper sediment layer. The simulations showed that water circulation and the seasonal patterns of nutrients, phytoplankton and dissolved oxygen were satisfactorily reproduced. Model results indicate that water circulation in the lagoon is driven mainly by tide and wind. Plankton primary production is co-limited by phosphorus and nitrogen, and is highest in the inner part of the lagoon, due to the combined effects of high water residence time and high nutrient inputs from the boundary. However, a sensitivity analysis highlights the importance of exchanges with the Mediterranean Sea in maintaining a high level of productivity. Intensive use of fertilizers in the catchment area has a significant effect on phytoplankton biomass increase.

  19. Spectral Energy Budget of High Resolution General Circulation Models: Simulation of the Direct Energy Cascade

    Science.gov (United States)

    Augier, P.; Lindborg, E.

    2012-12-01

    Nastrom and Gage (1985) showed that the atmospheric kinetic energy and potential temperature spectra measured in the upper troposphere and lower stratosphere present two inertial ranges. At the mesoscales, the spectra have a k_h^(-5/3) power-law dependence. At larger scales, there is a narrow range where the spectra show a k_h^(-3) dependence. Recently, there has been considerable progress in simulating the observed spectra with some high resolution General Circulation Models (GCMs) (see e.g. Hamilton et al., 2008). Our aim is to understand fundamental mechanisms of energy transfer between different scales and how well these mechanisms are described by different GCMs. In particular, we wish to test the hypothesis recently proposed by Vallgren, Deusebio & Lindborg (2011), that the atmospheric kinetic and potential energy spectra can be explained by assuming that there are two cascade processes emanating from the same large-scale energy source at scales of thousands of kilometers. In order to do this, we calculate the spectral budgets of energy using data from different GCMs, including data from the T639L24 AFES model and the T1279L91 ECMWF Integrated Forecast System. The concept of available potential energy (APE, Lorenz, 1955) has been used to formulate the spectral budgets of the so-called "primitive equations" in pressure coordinates, with spherical harmonics as the base functions, and taking into account the topography. The ratio of the total APE over the total kinetic energy (KE) is large, of the order of 3. This is due to a larger magnitude of the APE spectrum at the very large scales of the atmosphere (total wavenumber l ≤ 3). At the other scales, APE and KE spectra are of the same order of magnitude. For the ECMWF model and at the synoptic scales, the APE spectrum is half the KE spectrum as predicted by Charney (1971). The main terms of the spectral energy budget are computed, which allows us to present a spectral representation of the Lorenz energy cycle
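
    Written out in the horizontal-wavenumber notation used above, the two observed inertial ranges are:

        \[
        E(k_h) \;\propto\; k_h^{-3} \quad \text{(synoptic scales)},
        \qquad
        E(k_h) \;\propto\; k_h^{-5/3} \quad \text{(mesoscales)}.
        \]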

  20. Upper Blue Nile basin water budget from a multi-model perspective

    Science.gov (United States)

    Jung, Hahn Chul; Getirana, Augusto; Policelli, Frederick; McNally, Amy; Arsenault, Kristi R.; Kumar, Sujay; Tadesse, Tsegaye; Peters-Lidard, Christa D.

    2017-12-01

    Improved understanding of the water balance in the Blue Nile is of critical importance because of increasingly frequent hydroclimatic extremes under a changing climate. The intercomparison and evaluation of multiple land surface models (LSMs) associated with different meteorological forcing and precipitation datasets can offer a moderate range of water budget variable estimates. In this context, two LSMs, Noah version 3.3 (Noah3.3) and Catchment LSM version Fortuna 2.5 (CLSMF2.5) coupled with the Hydrological Modeling and Analysis Platform (HyMAP) river routing scheme are used to produce hydrological estimates over the region. The two LSMs were forced with different combinations of two reanalysis-based meteorological datasets from the Modern-Era Retrospective analysis for Research and Applications datasets (i.e., MERRA-Land and MERRA-2) and three observation-based precipitation datasets, generating a total of 16 experiments. Modeled evapotranspiration (ET), streamflow, and terrestrial water storage estimates were evaluated against the Atmosphere-Land Exchange Inverse (ALEXI) ET, in-situ streamflow observations, and NASA Gravity Recovery and Climate Experiment (GRACE) products, respectively. Results show that CLSMF2.5 provided better representation of the water budget variables than Noah3.3 in terms of Nash-Sutcliffe coefficient when considering all meteorological forcing datasets and precipitation datasets. The model experiments forced with observation-based products, the Climate Hazards group Infrared Precipitation with Stations (CHIRPS) and the Tropical Rainfall Measuring Mission (TRMM) Multi-Satellite Precipitation Analysis (TMPA), outperform those run with MERRA-Land and MERRA-2 precipitation. The results presented in this paper would suggest that the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System incorporate CLSMF2.5 and HyMAP routing scheme to better represent the water balance in this region.
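
    The skill score used for this comparison, the Nash-Sutcliffe efficiency, has the standard definition (quoted here for reference, not taken from the paper):

        \[
        \mathrm{NSE} \;=\; 1 \;-\; \frac{\sum_t \left(Q^{\,t}_{\mathrm{sim}} - Q^{\,t}_{\mathrm{obs}}\right)^2}{\sum_t \left(Q^{\,t}_{\mathrm{obs}} - \overline{Q}_{\mathrm{obs}}\right)^2},
        \]

    with NSE = 1 for a perfect match and NSE ≤ 0 when the model is no more skilful than the observed mean.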

  1. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process, which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
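
    A minimal simulation sketch of one such system, an M/M/1 queue with multiple exponential vacations (the server keeps taking vacations while the queue is empty), with illustrative rates and function names:

        import random

        def mm1_multiple_vacations(lam, mu, theta, n_customers, seed=1):
            """Mean waiting time in queue for an M/M/1 queue with multiple
            vacations: lam = arrival rate, mu = service rate,
            theta = vacation rate (mean vacation length 1/theta)."""
            rng = random.Random(seed)
            t_arrival = 0.0
            server_free_at = 0.0   # time the server finishes its current work
            total_wait = 0.0
            for _ in range(n_customers):
                t_arrival += rng.expovariate(lam)
                if t_arrival >= server_free_at:
                    # The system emptied at server_free_at, so vacations have
                    # been running since then; service starts when the vacation
                    # in progress at the arrival instant ends.
                    t = server_free_at
                    while t <= t_arrival:
                        t += rng.expovariate(theta)
                    start_service = t
                else:
                    start_service = server_free_at
                total_wait += start_service - t_arrival
                server_free_at = start_service + rng.expovariate(mu)
            return total_wait / n_customers

        # The classical decomposition result predicts roughly the M/M/1 waiting
        # time plus the mean residual vacation 1/theta, e.g. about 1.5 for
        # mm1_multiple_vacations(0.5, 1.0, 2.0, 200000).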

  2. Motivation in Beyond Budgeting: A Motivational Paradox?

    DEFF Research Database (Denmark)

    Sandalgaard, Niels; Bukh, Per Nikolaj

    In this paper we discuss the role of motivation in relation to budgeting and we analyse how the Beyond Budgeting model functions compared with traditional budgeting. In the paper we focus on budget related motivation (and motivation in general) and conclude that the Beyond Budgeting model...... is a motivational paradox....

  3. From Dreams to Dollars: Joining the Theory of Planning with the Practicality of Budget to Maximize Both

    Science.gov (United States)

    Dorsey, Myrtle E. B.

    2008-01-01

    The integrated online planning and budget development system at Baton Rouge Community College is an innovative approach to systematically link college strategic priorities and unit plan objectives with financial resources. Using two industry standards (Microsoft Access and Sungard Banner), a user-friendly program was developed that has facilitated…

  4. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  5. The Energy Budget of Earthquake Rupture: a View From Spontaneous Rupture Modeling and Finite-Source Models

    Science.gov (United States)

    Mai, P.; Guatteri, M.

    2003-12-01

    It is a common and frustrating experience of many dynamic modelers to initiate spontaneous rupture calculations that subsequently abort before rupturing to the desired earthquake size [Nielsen and Olsen, 2000; Oglesby and Day, 2002]. Source parameters in such dynamic source models are strongly correlated, but stress drop is the main factor affecting the distribution of the other dynamic rupture parameters. Additionally, the position of the hypocenter exerts a strong influence on the dynamic properties of the earthquake, and certain hypocenter positions are not plausible as those would not lead to spontaneous rupture propagation. To further investigate this last statement, we analyze the energy budget during earthquake rupture using spontaneous dynamic rupture calculations and finite-source rupture models. In describing the energy budget during earthquake rupture, we follow Favreau and Archuleta [2003]. Each point on the fault contributes to the radiated seismic energy E_rs = E_el - E_fr - E_rx, where E_el denotes the elasto-static energy and E_fr the fracture energy. In this study we neglect for simplicity the relaxation work E_rx spent during the stopping of the earthquake. A rupture can be characterized by locally negative seismic energy density values, but its integral over the fault plane must be positive. The fundamental condition for rupture growth is therefore that the integral of E_rs over the rupture area remains positive at all times during rupture propagation. Based on a simple energy budget calculation, we focus on identifying those target slip/stress distributions in dynamic rupture modeling that, for a given hypocenter location, fail to rupture spontaneously. Additionally, we study the energy budget of finite-source rupture models by analyzing the integrated seismic energy for the inferred slip maps, also using hypocenter positions other than the network location. These results indicate how rupture was promoted for the true hypocenter while randomized hypocenters may not
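
    Schematically, the pointwise budget and the growth condition described here can be written as:

        E_{rs} = E_{el} - E_{fr} - E_{rx}, \qquad
        \int_{A} E_{rs}\,\mathrm{d}A > 0 \quad \text{throughout rupture propagation}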

  6. Investigating the relation of thermodynamic processes to local budgets in a mesoscale weather prediction model

    Science.gov (United States)

    Petrik, R.; Gassmann, A.; Schlünzen, H.

    2009-09-01

    Recent models apply the non-hydrostatic compressible equations and include various physical parameterizations. On the one hand, such models are able to resolve flow structures on a very wide range of spatial and temporal scales. On the other hand, their complexity makes it difficult to evaluate and, later on, to improve the model. One usually verifies the model with meteorological data coming from remote sensing systems or in-situ measurements. Besides the evaluation of the model results, it is essential to evaluate the physical adequacy of the model itself. In this context, a finite volume diagnostic approach, which diagnoses the local budget of various quantities like energy, water mass and total mass in a predefined control volume, is applied for evaluating the physical quality of the mesoscale model COSMO. The monitoring of the conservation properties is essential for model development and for the investigation of the hydrological cycle as well. E.g., the application of different discretization schemes, a variety of physical parameterizations and even non-physical artificial damping mechanisms, added explicitly and implicitly, can detrimentally influence the desired conservation properties. In this talk, we present how the introduced diagnostic approach should be applied in order to minimize errors originating from discrete grids and flux reconstructions, using an idealized test bed. Starting with a first dry convection test case, the application of our tool to the COSMO model shows good conservation properties far away from the lateral and upper relaxation boundaries. If cloud or rain processes are involved in the simulations, large errors in energy and total mass conservation are revealed. Interestingly, the water mass budget is not affected. It is shown how physical processes and numerical schemes contaminate the local budgets. Regarding this fact, it is demonstrated how to construct a saturation adjustment technique (SAT) for COSMO to reduce these errors
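
    A minimal sketch of such a control-volume budget check (illustrative only, not the COSMO diagnostic itself):

        def budget_residual(q_start, q_end, flux_in, flux_out, sources, dt):
            """Residual of a control-volume budget over one interval dt.

            q_start, q_end : volume-integrated quantity at start/end (e.g. kg)
            flux_in/out    : boundary fluxes integrated over the faces (kg/s)
            sources        : interior sources/sinks (kg/s), e.g. microphysics
            A residual far from zero points to non-conservative numerics.
            """
            storage_change = q_end - q_start
            budget = (flux_in - flux_out + sources) * dt
            return storage_change - budget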

  7. Spatiotemporal Variability of the Urban Water Budget and Implications for Distributed Modeling

    Science.gov (United States)

    Bhaskar, A. S.; Welty, C.; Maxwell, R. M.

    2011-12-01

    In seeking to understand the feedbacks between urban development and water availability, we are in the process of coupling an integrated hydrologic model with an urban growth model, both of the Baltimore, Maryland, USA region. We are implementing ParFlow.CLM as the integrated hydrologic model (a subsurface-surface flow/land surface processes model) for the 13,000 sq km Baltimore metropolitan area. This work requires an understanding of the distribution of flows and making decisions on how to best model the short-circuiting of water and other phenomena unique to urban systems. In order to assess the attributes of available data, we conducted a study of the urban water budget from 2000 to 2009 and across an urban to rural gradient of development. For 65 watersheds in the Baltimore metropolitan area we quantified both natural (precipitation, evapotranspiration and streamflow) and engineered or piped (wastewater infiltration and inflow, lawn irrigation, water supply pipe leakage and reservoir withdrawals) water budget components on a monthly basis. We used monthly PRISM grids for precipitation, the land surface model GLDAS-Noah for gridded evapotranspiration estimates and streamflow from USGS gage records. For piped components, we used Baltimore City's comprehensive wastewater monitoring program data, which has infiltration and inflow estimates for most of the city's sewer basins, as well as estimates of lawn irrigation from fine-scale land cover data and lawn watering estimates, and water supply pipe leakage based on system wide values and the distribution of water supply pipes. We found that when solely considering natural components, urban watersheds generally appeared to have excess water, although the spatial variability was much higher for urban watersheds as compared to rural ones. This apparent excess water was more than accounted for by the most significant piped component, the export of groundwater and rainwater by cracks and improper connections to the
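
    A minimal sketch of the monthly accounting implied by these natural and piped components, with illustrative variable names and sign conventions:

        def monthly_urban_budget(P, ET, Q, II, irrigation, leakage, withdrawals):
            """Apparent storage change for one watershed and month (depth units).

            Natural terms: precipitation P, evapotranspiration ET, streamflow Q.
            Piped terms: wastewater infiltration and inflow II (water leaving via
            sewers), lawn irrigation and water-supply pipe leakage (water added),
            and reservoir withdrawals (water removed).
            """
            natural = P - ET - Q
            piped = irrigation + leakage - II - withdrawals
            return natural + piped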

  8. An approach for modeling sediment budgets in supply-limited rivers

    Science.gov (United States)

    Wright, Scott A.; Topping, David J.; Rubin, David M.; Melis, Theodore S.

    2010-01-01

    Reliable predictions of sediment transport and river morphology in response to variations in natural and human-induced drivers are necessary for river engineering and management. Because engineering and management applications may span a wide range of space and time scales, a broad spectrum of modeling approaches has been developed, ranging from suspended-sediment "rating curves" to complex three-dimensional morphodynamic models. Suspended sediment rating curves are an attractive approach for evaluating changes in multi-year sediment budgets resulting from changes in flow regimes because they are simple to implement, computationally efficient, and the empirical parameters can be estimated from quantities that are commonly measured in the field (i.e., suspended sediment concentration and water discharge). However, the standard rating curve approach assumes a unique suspended sediment concentration for a given water discharge. This assumption is not valid in rivers where sediment supply varies enough to cause changes in particle size or changes in areal coverage of sediment on the bed; both of these changes cause variations in suspended sediment concentration for a given water discharge. More complex numerical models of hydraulics and morphodynamics have been developed to address such physical changes of the bed. This additional complexity comes at a cost in terms of computations as well as the type and amount of data required for model setup, calibration, and testing. Moreover, application of the resulting sediment-transport models may require observations of bed-sediment boundary conditions that require extensive (and expensive) observations or, alternatively, require the use of an additional model (subject to its own errors) merely to predict the bed-sediment boundary conditions for use by the transport model. In this paper we present a hybrid approach that combines aspects of the rating curve method and the more complex morphodynamic models. Our primary objective
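
    A minimal sketch of the standard power-law rating-curve step that the hybrid approach builds on, assuming NumPy and hypothetical input arrays:

        import numpy as np

        def fit_rating_curve(discharge, concentration):
            """Fit C = a * Q**b by least squares in log-log space."""
            b, log_a = np.polyfit(np.log(discharge), np.log(concentration), 1)
            return np.exp(log_a), b

        def predict_concentration(a, b, discharge):
            """Suspended sediment concentration for a given water discharge."""
            return a * discharge ** b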

  9. Density functional theory and multiscale materials modeling*

    Indian Academy of Sciences (India)

    Unknown

    A wide class of problems involving nanomaterials, interfacial science and soft condensed matter has been addressed using the density based ... Keywords: density functional theory; soft condensed matter; materials modeling. ... the basic laws of quantum mechanics, their prediction through a direct ab initio ...

  10. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  11. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  12. Distributed modeling of landsurface water and energy budgets in the inland Heihe river basin of China

    Science.gov (United States)

    Jia, Y.; Ding, X.; Qin, C.; Wang, H.

    2009-10-01

    A distributed model for simulating the land surface hydrological processes in the Heihe river basin was developed and validated on the basis of the physical mechanisms of the hydrological cycle and the artificial water utilization system in the basin. The modeling approach for each component process is introduced from two aspects, i.e., the water cycle and the energy cycle. The hydrological processes include evapotranspiration, infiltration, runoff, groundwater flow, interaction between groundwater and river water, overland flow, river flow and artificial cycle processes of water utilization. A simulation of 21 years from 1982 to 2002 was carried out after obtaining various input data and model parameters. The model was validated for both the simulation of monthly discharge process and that of daily discharge process. Water budgets and spatial and temporal variations of hydrological cycle components as well as energy cycle components in the upper and middle reach Heihe basin (36 728 km2) were studied by using the distributed hydrological model. In addition, the model was further used to predict the water budgets under future land surface change scenarios in the basin. The modeling results show: (1) in the upper reach watershed, the annual average evapotranspiration and runoff account for 63% and 37% of the annual precipitation, respectively, the snow melting runoff accounts for 19% of the total runoff and 41% of the direct runoff, and the groundwater storage has no obvious change; (2) in the middle reach basin, the annual average evapotranspiration is 52 mm more than the local annual precipitation, and the groundwater storage shows an obvious declining trend because of irrigation water consumption; (3) for the scenario of conservation forest construction in the upper reach basin, although the evapotranspiration from interception may increase, the soil evaporation may reduce at the same time, therefore the total evapotranspiration may not increase obviously; the

  13. An information theory approach for evaluating earth radiation budget (ERB) measurements - Nonuniform sampling of diurnal longwave flux variations

    Science.gov (United States)

    Halyo, Nesim; Direskeneli, Haldun; Barkstrom, Bruce R.

    1991-01-01

    Satellite measurements are subject to a wide range of uncertainties due to their temporal, spatial, and directional sampling characteristics. An information-theory approach is suggested to examine the nonuniform temporal sampling of ERB measurements. The information (i.e., its entropy or uncertainty) before and after the measurements is determined, and information gain (IG) is defined as a reduction in the uncertainties involved. A stochastic model for the diurnal outgoing flux variations that affect the ERB is developed. Using Gaussian distributions for the a priori and measured radiant exitance fields, the IG is obtained by computing the a posteriori covariance. The IG for the monthly outgoing flux measurements is examined for different orbital parameters and orbital tracks, using the Earth Observing System orbital parameters as specific examples. Variations in IG due to changes in the orbit's inclination angle and the initial ascending node local time are investigated.
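
    For Gaussian a priori and a posteriori fields, the information gain reduces to half the log ratio of the covariance determinants; a minimal sketch, assuming NumPy and hypothetical covariance matrices:

        import numpy as np

        def information_gain(cov_prior, cov_post):
            """IG = H_prior - H_post = 0.5 * ln(det(C_prior) / det(C_post))
            for multivariate Gaussian fields (in nats)."""
            _, logdet_prior = np.linalg.slogdet(cov_prior)
            _, logdet_post = np.linalg.slogdet(cov_post)
            return 0.5 * (logdet_prior - logdet_post)

        # A measurement that shrinks the posterior covariance yields IG > 0.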

  14. Precipitation recycling in West Africa - regional modeling, evaporation tagging and atmospheric water budget analysis

    Science.gov (United States)

    Arnault, Joel; Kunstmann, Harald; Knoche, Hans-Richard

    2015-04-01

    Many numerical studies have shown that the West African monsoon is highly sensitive to the state of the land surface. It is however questionable to what extent a local change of land surface properties would affect the local climate, especially with respect to precipitation. This issue is traditionally addressed with the concept of precipitation recycling, defined as the contribution of local surface evaporation to local precipitation. For this study the West African monsoon has been simulated with the Weather Research and Forecasting (WRF) model using explicit convection, for the domain (1°S-21°N, 18°W-14°E) at a spatial resolution of 10 km, for the period January-October 2013, and using ERA-Interim reanalyses as driving data. This WRF configuration has been selected for its ability to simulate monthly precipitation amounts and daily histograms close to TRMM (Tropical Rainfall Measuring Mission) data. In order to investigate precipitation recycling in this WRF simulation, surface evaporation tagging has been implemented in the WRF source code, as well as the budget of total and tagged atmospheric water. Surface evaporation tagging consists of duplicating all water species and the respective prognostic equations in the source code. Then, tagged water species are set to zero at the lateral boundaries of the simulated domain (no inflow of tagged water vapor), and tagged surface evaporation is considered only in a specified region. All the source terms of the prognostic equations of total and tagged water species are finally saved in the outputs for the budget analysis. This allows quantifying the respective contribution of total and tagged atmospheric water to atmospheric precipitation processes. The WRF simulation with surface evaporation tagging and budgets has been conducted two times, first with a 100 km2 tagged region (11-12°N, 1-2°W), and second with a 1000 km2 tagged region (7-16°N, 6°W-3°E). In this presentation we will investigate hydro
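
    Given the tagged and total precipitation fields from such a run, the regional recycling ratio is their area-weighted ratio; a minimal sketch with placeholder arrays:

        import numpy as np

        def recycling_ratio(p_tagged, p_total, cell_area):
            """Fraction of precipitation in a target region that originates from
            surface evaporation inside the tagged source region."""
            return np.sum(p_tagged * cell_area) / np.sum(p_total * cell_area)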

  15. Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Cheryl [Michigan State University, East Lansing; Nisbet, Roger [University of California Santa Barbara; Antczak, Philipp [University of Liverpool, UK; Reyero, Natalia [Army Corps of Engineers, Vicksburg; Gergs, Andre [Gaiac; Lika, Dina [University of Crete; Mathews, Teresa J. [ORNL; Muller, Eric [University of California, Santa Barbara; Nacci, Dianne [U.S. Environmental Protection Agency (EPA); Peace, Angela L. [ORNL; Remien, Chris [University of Idaho; Schulz, Irv [Pacific Northwest National Laboratory (PNNL); Watanabe, Karen [Arizona State University

    2018-02-01

    Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Center for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.

  16. Modelling the reactive nitrogen budget across Germany using LOTOS-EUROS between 2000 and 2013

    Science.gov (United States)

    Schaap, Martijn; Banzhaf, Sabine; Hendriks, Carlijn; Kranenburg, Richard

    2017-04-01

    Nitrogen deposition causes soil acidification and enhances eutrophication, causing biodiversity loss. Currently, a major contribution to N-deposition derives from ammonia. Furthermore, ammonia contributes to the formation of secondary inorganic aerosol, a major contributor to atmospheric particulate matter levels. The aerosol formation provides a means of long range transport of reactive nitrogen, as the lifetime of the aerosols is larger than that of ammonia itself. Despite its central role in these environmental threats, little is known about the ammonia budget. In this study we report on a recent modelling study to assess the ammonia and reactive nitrogen budget over Germany for a period of 14 years (2000-2013). Prior to the long term simulation the process descriptions in the LOTOS-EUROS CTM were updated and a sensitivity simulation was performed, showing that the inclusion of a compensation point for ammonia and the changes in aerosol deposition had the largest impact relative to earlier studies. Next, sensitivity simulations were performed to assess the impact of newly reported emission totals (with 30% higher emissions caused by adjusted emission factors for fertilizer spreading) and of different spatial and temporal emission variability. The long term evaluation showed that the model is well able to reproduce the variability in wet deposition fluxes induced by varying precipitation amounts, but that systematic differences remain. These sensitivity simulations showed that detailing the seasonal emission variability is more important for removing systematic differences than lowering the uncertainty in the dry deposition parametrization. Evaluation with the ammonia retrievals of the IASI satellite confirms that the newly reported emission data for fertilizer application have positive impacts on the modelled ammonia distribution. The new emission information confirms an emission area observed by the satellite in the northeast of Germany, which was previously absent from the national scale

  17. Lattice gauge theories and spin models

    Science.gov (United States)

    Mathur, Manu; Sreeraj, T. P.

    2016-10-01

    The Wegner Z2 gauge theory-Z2 Ising spin model duality in (2+1) dimensions is revisited and derived through a series of canonical transformations. The Kramers-Wannier duality is similarly obtained. The Wegner Z2 gauge-spin duality is directly generalized to SU(N) lattice gauge theory in (2+1) dimensions to obtain the SU(N) spin model in terms of the SU(N) magnetic fields and their conjugate SU(N) electric scalar potentials. The exact and complete solutions of the Z2, U(1), SU(N) Gauss law constraints in terms of the corresponding spin or dual potential operators are given. The gauge-spin duality naturally leads to a new gauge invariant magnetic disorder operator for SU(N) lattice gauge theory which produces a magnetic vortex on the plaquette. A variational ground state of the SU(2) spin model with nearest neighbor interactions is constructed to analyze SU(2) gauge theory.

  18. The AquaDEB project: Physiological flexibility of aquatic animals analysed with a generic dynamic energy budget model (phase II)

    Science.gov (United States)

    Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.

    2011-11-01

    This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this 5-year project, present the context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability within the context of DEB theory for metabolic organisation, and (ii) to evaluate the inter-relationships between different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). AquaDEB phase I focussed on quantifying bio-energetic processes of various aquatic species (e.g. molluscs, fish, crustaceans, algae) and phase II on: (i) comparing energetic and physiological strategies among species through the DEB parameter values and identifying the factors responsible for any differences in bioenergetics and physiology; (ii) considering different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) scaling up the models for a few species from the individual level up to the level of evolutionary processes. Apart from the three special issues in the Journal of Sea Research — including the DEBIB collaboration (see vol. 65 issue 2), a theme issue on DEB theory appeared in the Philosophical Transactions of the Royal Society B (vol 365, 2010); a large number of publications were produced; the third edition of the DEB book appeared (2010); open-source software was substantially expanded (over 1000 functions); a large open-source systematic collection of ecophysiological data and DEB parameters has been set up; and a series of DEB

  19. Budget of tropospheric ozone during TOPSE from two chemical transport models

    Science.gov (United States)

    Emmons, L. K.; Hess, P.; Klonecki, A.; Tie, X.; Horowitz, L.; Lamarque, J.-F.; Kinnison, D.; Brasseur, G.; Atlas, E.; Browell, E.; Cantrell, C.; Eisele, F.; Mauldin, R. L.; Merrill, J.; Ridley, B.; Shetter, R.

    2003-04-01

    The tropospheric ozone budget during the Tropospheric Ozone Production about the Spring Equinox (TOPSE) campaign has been studied using two chemical transport models (CTMs): HANK and the Model of Ozone and Related chemical Tracers, version 2 (MOZART-2). The two models have similar chemical schemes but use different meteorological fields, with HANK using MM5 (Pennsylvania State University, National Center for Atmospheric Research Mesoscale Modeling System) and MOZART-2 driven by European Centre for Medium-Range Weather Forecasts (ECMWF) fields. Both models simulate ozone in good agreement with the observations but underestimate NOx. The models indicate that in the troposphere, averaged over the northern middle and high latitudes, chemical production of ozone drives the increase of ozone seen in the spring. Both ozone gross chemical production and loss increase greatly over the spring months. The in situ production is much larger than the net stratospheric input, and the deposition and horizontal fluxes are relatively small in comparison to chemical destruction. The net production depends sensitively on the concentrations of H2O, HO2 and NO, which differ slightly in the two models. Both models underestimate the chemical production calculated in a steady state model using TOPSE measurements, but the chemical loss rates agree well. Measures of the stratospheric influence on tropospheric ozone in relation to in situ ozone production are discussed. Two different estimates of the stratospheric fraction of O3 in the Northern Hemisphere troposphere indicate it decreases from 30-50% in February to 15-30% in June. A sensitivity study of the effect of a perturbation in the vertical flux on tropospheric ozone indicates the contribution from the stratosphere is approximately 15%.

  20. Density Functional Theory Models for Radiation Damage

    Science.gov (United States)

    Dudarev, S. L.

    2013-07-01

    Density functional theory models developed over the past decade provide unique information about the structure of nanoscale defects produced by irradiation and about the nature of short-range interaction between radiation defects, clustering of defects, and their migration pathways. These ab initio models, involving no experimental input parameters, appear to be as quantitatively accurate and informative as the most advanced experimental techniques developed for the observation of radiation damage phenomena. Density functional theory models have effectively created a new paradigm for the scientific investigation and assessment of radiation damage effects, offering new insight into the origin of temperature- and dose-dependent response of materials to irradiation, a problem of pivotal significance for applications.

  1. Crack propagation modeling using Peridynamic theory

    Science.gov (United States)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory based analysis tool is the unifying approach towards material behavior modeling - irrespective of whether the crack is formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations, such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.

  2. A budget impact model for biosimilar infliximab in Crohn's disease in Bulgaria, the Czech Republic, Hungary, Poland, Romania, and Slovakia.

    Science.gov (United States)

    Brodszky, Valentin; Rencz, Fanni; Péntek, Márta; Baji, Petra; Lakatos, Péter L; Gulácsi, László

    2016-01-01

    To estimate the budget impact of the introduction of biosimilar infliximab for the treatment of Crohn's disease (CD) in Bulgaria, the Czech Republic, Hungary, Poland, Romania and Slovakia. A 3-year, prevalence-based budget impact analysis for biosimilar infliximab to treat CD was developed from the third-party payers' perspective. The model included various scenarios depending on whether interchanging originator infliximab with biosimilar infliximab was allowed or not. Total cost savings achieved in biosimilar scenario 1 (BSc1, interchanging not allowed) and biosimilar scenario 2 (BSc2, interchanging allowed in 80% of the patients) were estimated at €8.0 million and €16.9 million, respectively, in the six countries. Budget savings may cover biosimilar infliximab therapy for 722-1530 additional CD patients. Introduction of biosimilar infliximab to treat CD may offset the inequity in access to biological therapy for CD between Central and Eastern European countries.
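
    A minimal sketch of the structure of such a prevalence-based budget impact calculation, with purely hypothetical prices, patient numbers and uptake:

        def budget_impact(n_patients, cost_originator, cost_biosimilar, switch_share):
            """Annual savings when a share of prevalent patients is treated with
            (or switched to) the biosimilar instead of the originator, and the
            number of additional patients those savings could treat."""
            switched = n_patients * switch_share
            savings = switched * (cost_originator - cost_biosimilar)
            extra_patients_treatable = savings / cost_biosimilar
            return savings, extra_patients_treatable

        # Hypothetical illustration only:
        # budget_impact(n_patients=5000, cost_originator=10000.0,
        #               cost_biosimilar=7000.0, switch_share=0.8)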

  3. The Body Model Theory of Somatosensory Cortex.

    Science.gov (United States)

    Brecht, Michael

    2017-06-07

    I outline a microcircuit theory of somatosensory cortex as a body model serving both for body representation and "body simulation." A modular model of innervated and non-innervated body parts resides in somatosensory cortical layer 4. This body model is continuously updated and compares to an avatar (an animatable puppet) rather than a mere sensory map. Superficial layers provide context and store sensory memories, whereas layer 5 provides motor output and stores motor memories. I predict that layer-6-to-layer-4 inputs initiate body simulations allowing rehearsal and risk assessment of difficult actions, such as jumps. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Operational budgeting using fuzzy goal programming

    Directory of Open Access Journals (Sweden)

    Saeed Mohammadi

    2013-10-01

    Full Text Available An efficient budget normally has different advantages, such as measuring the performance of various organizations, setting appropriate targets and promoting managers based on their achievements. However, any budget planning requires prediction of different cost components. There are various methods for budget planning, such as incremental budgeting, program budgeting, zero based budgeting and performance budgeting. In this paper, we present a fuzzy goal programming model to estimate the operational budget. The proposed model uses triangular fuzzy as well as interval numbers to estimate budgeting expenses. The proposed study of this paper is implemented for a real-world case study in the province of Qom, Iran, and the results are analyzed.
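
    A minimal sketch of the fuzzy ingredients, assuming triangular fuzzy numbers defuzzified by their centroid and a single budget goal (not the authors' model):

        def centroid(tri):
            """Crisp (defuzzified) value of a triangular fuzzy number (l, m, u)."""
            l, m, u = tri
            return (l + m + u) / 3.0

        def goal_deviations(fuzzy_estimates, budget_goal):
            """Over- and under-shoot of the total crisp estimate relative to a
            budget goal; a goal program would minimize a weighted sum of these
            deviation variables."""
            total = sum(centroid(t) for t in fuzzy_estimates)
            over = max(total - budget_goal, 0.0)
            under = max(budget_goal - total, 0.0)
            return over, under

        # e.g. goal_deviations([(90, 100, 120), (40, 50, 65)], budget_goal=160.0)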

  5. Processo Orçamentário: uma aplicação da análise substantiva com utilização da grounded theory [Budgeting: substantive analysis using grounded theory]

    Directory of Open Access Journals (Sweden)

    Tânia Regina Sordi Relvas

    2011-09-01

    Full Text Available Given the observation that studies on budgeting explore the phenomenon in a reductionist way, this article aims to propose a comprehensive substantive theory, grounded in empirical data, for the analysis of budgeting. This approach considers its constituent elements and their interdependencies. This was done by applying the inductive approach grounded in empirical data (grounded theory), under the qualitative paradigm. The focus of analysis was a large financial institution, and the fieldwork was carried out over two years, involving several management levels. The contribution of the work comes from making available a framework for treating the theme in a broad context, which made it possible to understand aspects that would not have been considered under a more restricted and less comprehensive analytical approach. As a product of the substantive theory, five propositions were developed with a view to being applied in organizations. --- Budgeting: substantive analysis using grounded theory --- Abstract --- Considering the fact that studies into budgeting basically use a reductionist approach, this paper proposes a comprehensive substantive theory based on empirical data to be used in budget analysis. This approach takes into consideration its elements and interdependence by applying the inductive approach based on empirical data (grounded theory) on a qualitative paradigm. The focus was an in-depth two-year study of a large Brazilian financial institution involving several management levels. The main contribution of the study is as a framework that treats all elements of the budget process in a comprehensive and coherent fashion, otherwise impossible using a reductionist approach. As products of the substantive theory, five propositions were developed to be applied in organizations.

  6. Topos models for physics and topos theory

    Energy Technology Data Exchange (ETDEWEB)

    Wolters, Sander, E-mail: s.wolters@math.ru.nl [Radboud Universiteit Nijmegen, Institute for Mathematics, Astrophysics, and Particle Physics (Netherlands)

    2014-08-15

    What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.

  7. Development of a Water and Enthalpy Budget-based Glacier mass balance Model (WEB-GM) and its preliminary validation

    Science.gov (United States)

    Ding, Baohong; Yang, Kun; Yang, Wei; He, Xiaobo; Chen, Yingying; Lazhu; Guo, Xiaofeng; Wang, Lei; Wu, Hui; Yao, Tandong

    2017-04-01

    This paper presents a new water and energy budget-based glacier mass balance model. Enthalpy, rather than temperature, is used in the energy balance equations to simplify the computation of the energy transfers through the water phase change and the movement of liquid water in the snow. A new parameterization for albedo estimation and state-of-the-art parameterization schemes for rainfall/snowfall type identification and surface turbulent heat flux calculations are implemented in the model. This model was driven with meteorological data and evaluated using mass balance and turbulent flux data collected during a field experiment implemented in the ablation zone of the Parlung No. 4 Glacier on the Southeast Tibetan Plateau during 2009 and 2015-2016. The evaluation shows that the model can reproduce the observed glacier ablation depth, surface albedo, surface temperature, sensible heat flux, and latent heat flux with high accuracy. Compared with a traditional energy budget-based glacier mass balance model, this enthalpy-based model shows higher simulation accuracy. Therefore, this model can reasonably simulate the energy budget and mass balance of glacier melting in this region and be used as a component of land surface models and hydrological models.
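
    A minimal sketch of the central energy-to-mass conversion in any energy-budget melt model (a simplification, not the enthalpy formulation itself):

        LATENT_HEAT_FUSION = 3.34e5   # J kg-1
        RHO_WATER = 1000.0            # kg m-3

        def melt_rate(q_net):
            """Melt (m water equivalent per second) from a net surface energy
            flux q_net in W m-2; no melt when the surface energy budget is
            negative."""
            return max(q_net, 0.0) / (RHO_WATER * LATENT_HEAT_FUSION)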

  8. Capital budgeting practices in Spain

    Directory of Open Access Journals (Sweden)

    Pablo de Andrés

    2015-01-01

    Full Text Available This paper seeks to shed further light on the capital budgeting techniques used by Spanish companies. Our paper posits that the gap between theory and practice might be related to the nature of sources of value and to the efficiency of mechanisms aligning managerial and shareholder incentives, rather than to resource restrictions or model misinterpretation. We analyze data from a survey conducted in 2011, the final sample comprising 140 non-financial Spanish firms. Our findings show a behaviour pattern similar to that reported in prior research for firms in other countries. Particularly noteworthy is that payback appears to be the most widely used tool, while real options are used relatively little. Our results confirm that size and industry are related to the frequency of use of certain capital budgeting techniques. Further, we find that the relevance of growth opportunities and flexibility is an important factor explaining the use of real options.

  9. A flexible tool for diagnosing water, energy, and entropy budgets in climate models

    Science.gov (United States)

    Lembo, Valerio; Lucarini, Valerio

    2017-04-01

    We have developed a new, flexible software tool for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program so that it can be included in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.

  10. Atlantic tropical cyclones water budget in observations and CNRM-CM5 model

    Science.gov (United States)

    Chauvin, Fabrice; Douville, Hervé; Ribes, Aurélien

    2017-12-01

    Water budgets in tropical cyclones (TCs) are computed in the ERA-Interim (ERAI) reanalysis and the CNRM-CM5 model for the late 20th and 21st centuries. At a 6-hourly timescale and averaged over a 5° × 5° box around a TC center, the main contribution to rainfall is moisture convergence, with a decreasing contribution of evaporation for increasing rainfall intensities. It is found that TC rainfall in ERAI and the model is underestimated when compared with the Tropical Rainfall Measuring Mission (TRMM), probably due to underestimated TC winds in ERAI vs. observed TCs. It is also found that the relative increase in TC rainfall between the second half of the 20th and 21st centuries may surpass the rate of change suggested by the Clausius-Clapeyron formula. It may even reach twice this rate for reduced spatial domains corresponding to the highest cyclonic rainfall. This is in agreement with an expected positive feedback between TC rainfall intensity and dynamics.
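
    A minimal sketch of the Clausius-Clapeyron rate referred to here, roughly 6-7% more saturation vapour pressure per kelvin near surface temperatures:

        def clausius_clapeyron_rate(T, L=2.5e6, Rv=461.5):
            """Fractional increase of saturation vapour pressure per kelvin,
            d(ln e_s)/dT = L / (Rv * T**2), with T in kelvin."""
            return L / (Rv * T ** 2)

        # clausius_clapeyron_rate(288.0) is about 0.065, i.e. ~6.5% per K.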

  11. The carbon budget of a large catchment in the Argentine Pampa plain through hydrochemical modeling.

    Science.gov (United States)

    Glok Galli, M; Martínez, D E; Kruse, E E

    2014-09-15

    Mar Chiquita is a coastal lagoon located in the Argentine Buenos Aires province in South America. The aim of this study is to estimate the annual contribution of inland waters to the carbon cycle in this lagoon's catchment by estimating the corresponding local carbon budget. Fifteen pairs of water samples were chosen to carry out hydrogeochemical modeling using PHREEQC software. Groundwater samples were considered as recharge water (initial solutions), while streamwater samples were taken as groundwater discharge (final solutions for inverse modeling/reference solutions for direct modeling). Fifteen direct models were performed, where each groundwater sample was constrained to calcite equilibrium under two different carbon dioxide partial pressure (PCO2) conditions: atmospheric conditions (log PCO2 (atm) = -3.5) and a PCO2 value of log PCO2 (atm) = -3. Groundwater samples are close to calcite equilibrium conditions. The calcite precipitation process is kinetically slower than gas diffusion, causing oversaturation of this reactant phase in streamwater samples. This was accompanied by a pH increase of approximately two units due to a PCO2 decrease. From the fifteen inverse models it was estimated that, of the total carbon that enters per year in the hydrological cycle of the study area, about 11.9% is delivered to the atmosphere as CO2 and around 6.7% is buried in sediments. This would indicate that 81.4% of the remaining carbon is retained in equilibrium within the system or discharged into the Mar Chiquita lagoon and/or directly to the ocean through regional flows. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Forecasting Rainfall Induced Landslide using High Resolution DEM and Simple Water Budget Model

    Science.gov (United States)

    Luzon, P. K. D.; Lagmay, A. M. F. A.

    2014-12-01

    The Philippines is hit by an average of 20 typhoons per year, bringing large amounts of rainfall. Monsoon rains coming from the southwest of the country also contribute to the annual total rainfall and to the associated hazards. One such hazard is shallow landsliding, mainly triggered by high soil saturation due to continuous downpours lasting from hours to days. A recent event of this kind happened in Zambales province in September 2013, where torrential rain fell for 24 hours, amounting to half a month of rain. Rainfall intensity measured by the nearest weather station averaged 21 mm/hr from 10 pm on the 22nd until 10 am the following day. The monsoon rains were intensified by the presence of Typhoon Usagi, positioned north and heading northwest of the country. A number of landslides occurred in three municipalities: Subic, San Marcelino and Castillejos. The disaster took 30 lives in the province. Monitoring such areas across the entire country is a major challenge for disaster preparedness and management. The approach of this paper is to utilize available rainfall forecasts to monitor highly hazardous areas during the rainy season and to forecast possible landslides. A simple water budget model following the equation Perc_t = P_t - RO_t - ΔST_t - AET_t (where the terms are percolation, runoff, change in storage, and actual evapotranspiration) was implemented to quantify all the water budget components. Computations are carried out in a Python-scripted grid system utilizing widely used GIS formats for easy data transfer and faster calculation. Successive runs use percolation and change in water storage as indicators of possible landslides. This approach needs three primary sets of data: weather data, topographic data, and soil parameters. This research uses a 5 m resolution DEM (IfSAR) to define the topography. Soil parameters are from fieldwork conducted. Weather data are from the Philippine
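
    A minimal per-cell sketch of this water budget bookkeeping, assuming a bucket-style soil storage (P is the precipitation term, RO the runoff, AET the actual evapotranspiration):

        def water_budget_step(P, RO, AET, storage, capacity):
            """One step of Perc_t = P_t - RO_t - dST_t - AET_t for a grid cell.
            Surplus water first refills soil storage (dST), the rest percolates;
            a deficit draws the storage down instead."""
            available = P - RO - AET
            if available >= 0.0:
                dST = min(available, capacity - storage)
                perc = available - dST
            else:
                dST = max(available, -storage)   # storage cannot go negative
                perc = 0.0
            return perc, storage + dST

        # Cells with persistently high percolation and near-full storage would
        # be flagged as indicators of possible shallow landsliding.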

  13. Public Health and Budget Impact of Probiotics on Common Respiratory Tract Infections: A Modelling Study

    Science.gov (United States)

    Lenoir-Wijnkoop, Irene; Gerlier, Laetitia; Bresson, Jean-Louis; Le Pen, Claude; Berdeaux, Gilles

    2015-01-01

    Objectives Two recent meta-analyses by the York Health Economics Consortium (YHEC) and Cochrane demonstrated probiotic efficacy in reducing the duration and number of common respiratory tract infections (CRTI) and associated antibiotic prescriptions. A health-economic analysis was undertaken to estimate the public health and budget consequences of a generalized probiotic consumption in France. Methods A virtual age- and gender-standardized population was generated using a Markov microsimulation model. CRTI risk factors incorporated into this model were age, active/passive smoking and living in a community setting. Incidence rates and resource utilization were based on the 2011-2012 flu season and retrieved from the French GPs Sentinelles network. Results of both meta-analyses were independently applied to the French population to estimate CRTI events, assuming a generalized probiotic use compared to no probiotics during winter months: -0.77 days/CRTI episode (YHEC scenario) or odds-ratio 0.58 for ≥1 CRTI episode (Cochrane scenario) with vs. without probiotics. Economic perspectives were National Health System (NHS), society, family. Outcomes included cost savings related to the reduced numbers of CRTI episodes, days of illness, number of antibiotic courses, sick leave days, medical and indirect costs. Results For France, generalized probiotic use would save 2.4 million CRTI-days, 291,000 antibiotic courses and 581,000 sick leave days, based on YHEC data. Applying the Cochrane data, reductions were 6.6 million CRTI days, 473,000 antibiotic courses and 1.5 million sick days. From the NHS perspective, probiotics’ economic impact was about €14.6 million saved according to YHEC and €37.7 million according to Cochrane. Higher savings were observed in children, active smokers and people with more frequent human contacts. Conclusions Public health and budget impact of probiotics are substantial, whether they reduce CRTI episodes frequency or duration. Noteworthy

  14. Public health and budget impact of probiotics on common respiratory tract infections: a modelling study.

    Directory of Open Access Journals (Sweden)

    Irene Lenoir-Wijnkoop

    Full Text Available Two recent meta-analyses by the York Health Economics Consortium (YHEC) and Cochrane demonstrated probiotic efficacy in reducing the duration and number of common respiratory tract infections (CRTI) and associated antibiotic prescriptions. A health-economic analysis was undertaken to estimate the public health and budget consequences of a generalized probiotic consumption in France. A virtual age- and gender-standardized population was generated using a Markov microsimulation model. CRTI risk factors incorporated into this model were age, active/passive smoking and living in a community setting. Incidence rates and resource utilization were based on the 2011-2012 flu season and retrieved from the French GPs Sentinelles network. Results of both meta-analyses were independently applied to the French population to estimate CRTI events, assuming a generalized probiotic use compared to no probiotics during winter months: -0.77 days/CRTI episode (YHEC scenario) or odds-ratio 0.58 for ≥1 CRTI episode (Cochrane scenario) with vs. without probiotics. Economic perspectives were National Health System (NHS), society, family. Outcomes included cost savings related to the reduced numbers of CRTI episodes, days of illness, number of antibiotic courses, sick leave days, medical and indirect costs. For France, generalized probiotic use would save 2.4 million CRTI-days, 291,000 antibiotic courses and 581,000 sick leave days, based on YHEC data. Applying the Cochrane data, reductions were 6.6 million CRTI days, 473,000 antibiotic courses and 1.5 million sick days. From the NHS perspective, probiotics' economic impact was about €14.6 million saved according to YHEC and €37.7 million according to Cochrane. Higher savings were observed in children, active smokers and people with more frequent human contacts. Public health and budget impact of probiotics are substantial, whether they reduce CRTI episodes frequency or duration. Noteworthy, the 2011-12 winter CRTI

  15. Public health and budget impact of probiotics on common respiratory tract infections: a modelling study.

    Science.gov (United States)

    Lenoir-Wijnkoop, Irene; Gerlier, Laetitia; Bresson, Jean-Louis; Le Pen, Claude; Berdeaux, Gilles

    2015-01-01

    Two recent meta-analyses by the York Health Economics Consortium (YHEC) and Cochrane demonstrated probiotic efficacy in reducing the duration and number of common respiratory tract infections (CRTI) and associated antibiotic prescriptions. A health-economic analysis was undertaken to estimate the public health and budget consequences of a generalized probiotic consumption in France. A virtual age- and gender-standardized population was generated using a Markov microsimulation model. CRTI risk factors incorporated into this model were age, active/passive smoking and living in a community setting. Incidence rates and resource utilization were based on the 2011-2012 flu season and retrieved from the French GPs Sentinelles network. Results of both meta-analyses were independently applied to the French population to estimate CRTI events, assuming a generalized probiotic use compared to no probiotics during winter months: -0.77 days/CRTI episode (YHEC scenario) or odds-ratio 0.58 for ≥1 CRTI episode (Cochrane scenario) with vs. without probiotics. Economic perspectives were National Health System (NHS), society, family. Outcomes included cost savings related to the reduced numbers of CRTI episodes, days of illness, number of antibiotic courses, sick leave days, medical and indirect costs. For France, generalized probiotic use would save 2.4 million CRTI-days, 291,000 antibiotic courses and 581,000 sick leave days, based on YHEC data. Applying the Cochrane data, reductions were 6.6 million CRTI days, 473,000 antibiotic courses and 1.5 million sick days. From the NHS perspective, probiotics' economic impact was about €14.6 million saved according to YHEC and €37.7 million according to Cochrane. Higher savings were observed in children, active smokers and people with more frequent human contacts. Public health and budget impact of probiotics are substantial, whether they reduce CRTI episodes frequency or duration. Noteworthy, the 2011-12 winter CRTI incidence was low

  16. Sparse modeling theory, algorithms, and applications

    CERN Document Server

    Rish, Irina

    2014-01-01

    ""A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come.""-Ricardo Vilalta, Department of Computer Science, University of Houston""This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course.""-Francis Bach, INRIA - École Normale Supřieure, Paris

  17. Temperature characteristics modeling of Preisach theory

    Directory of Open Access Journals (Sweden)

    Chen Hao

    2017-01-01

    Full Text Available This paper proposes a method for modeling the temperature characteristics of the Preisach theory. On the basis of the classical Preisach hysteresis model, the Curie temperature, the critical exponent and the ambient temperature are introduced, after which the effect of temperature on the magnetic properties of ferromagnetic materials can be accurately reflected. A simulation analysis and a temperature characteristic experiment with silicon steel were carried out. The results are essentially the same, which demonstrates the validity and accuracy of the method.
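
    A minimal sketch of a discrete Preisach plane with a temperature-scaled saturation term; the scaling Ms(T) = Ms0*(1 - T/Tc)**beta with Curie temperature Tc and critical exponent beta is an illustrative assumption, not the authors' exact formulation:

        import numpy as np

        class PreisachModel:
            """Discrete Preisach plane of relay hysterons on a triangular grid
            (switch-up threshold >= switch-down threshold), uniformly weighted."""

            def __init__(self, n=50, h_sat=1.0, ms0=1.0, tc=1018.0, beta=0.36):
                # tc (K) and beta are illustrative values only
                grid = np.linspace(-h_sat, h_sat, n)
                self.up, self.down = np.meshgrid(grid, grid, indexing="ij")
                self.mask = self.up >= self.down          # valid hysterons
                self.state = -np.ones_like(self.up)       # start saturated down
                self.ms0, self.tc, self.beta = ms0, tc, beta

            def ms(self, T):
                """Temperature-scaled saturation, vanishing at the Curie point."""
                return self.ms0 * max(1.0 - T / self.tc, 0.0) ** self.beta

            def apply_field(self, h, T):
                """Switch relays for the applied field h and return the output."""
                self.state[(h >= self.up) & self.mask] = 1.0
                self.state[(h <= self.down) & self.mask] = -1.0
                return self.ms(T) * self.state[self.mask].mean()

        # model = PreisachModel()
        # branch = [model.apply_field(h, T=300.0)
        #           for h in np.linspace(-1.0, 1.0, 200)]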

  18. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  19. Chesapeake Bay nitrogen fluxes derived from a land-estuarine ocean biogeochemical modeling system: Model description, evaluation, and nitrogen budgets.

    Science.gov (United States)

    Feng, Yang; Friedrichs, Marjorie A M; Wilkin, John; Tian, Hanqin; Yang, Qichun; Hofmann, Eileen E; Wiggert, Jerry D; Hood, Raleigh R

    2015-08-01

    The Chesapeake Bay plays an important role in transforming riverine nutrients before they are exported to the adjacent continental shelf. Although the mean nitrogen budget of the Chesapeake Bay has been previously estimated from observations, uncertainties associated with interannually varying hydrological conditions remain. In this study, a land-estuarine-ocean biogeochemical modeling system is developed to quantify Chesapeake riverine nitrogen inputs, within-estuary nitrogen transformation processes and the ultimate export of nitrogen to the coastal ocean. Model skill was evaluated using extensive in situ and satellite-derived data, and a simulation using environmental conditions for 2001-2005 was conducted to quantify the Chesapeake Bay nitrogen budget. The 5 year simulation was characterized by large riverine inputs of nitrogen (154 × 10⁹ g N yr⁻¹) split roughly 60:40 between inorganic:organic components. Much of this was denitrified (34 × 10⁹ g N yr⁻¹) and buried (46 × 10⁹ g N yr⁻¹) within the estuarine system. A positive net annual ecosystem production for the bay further contributed to a large advective export of organic nitrogen to the shelf (91 × 10⁹ g N yr⁻¹) and negligible inorganic nitrogen export. Interannual variability was strong, particularly for the riverine nitrogen fluxes. In years with higher than average riverine nitrogen inputs, most of this excess nitrogen (50-60%) was exported from the bay as organic nitrogen, with the remaining split between burial, denitrification, and inorganic export to the coastal ocean. In comparison to previous simulations using generic shelf biogeochemical model formulations inside the estuary, the estuarine biogeochemical model described here produced more realistic and significantly greater exports of organic nitrogen and lower exports of inorganic nitrogen to the shelf.
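
    The quoted fluxes can be assembled into a simple bookkeeping check. This is illustrative arithmetic only: the abstract does not list every budget term (e.g. atmospheric deposition or ocean exchange), so a non-zero residual is expected.

      # Simple bookkeeping with the fluxes quoted in the abstract (units: 1e9 g N per year).
      # Illustrative only: the full model budget contains terms not listed here,
      # so the residual is non-zero by design.
      riverine_input   = 154.0
      denitrification  = 34.0
      burial           = 46.0
      organic_export   = 91.0
      inorganic_export = 0.0        # described as "negligible"

      residual = riverine_input - (denitrification + burial + organic_export + inorganic_export)
      print(f"Residual of listed terms: {residual:+.0f} x 1e9 g N/yr (balanced by terms not quoted in the abstract)")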

  1. Measurements of hydroxyl and hydroperoxy radicals during CalNex-LA: Model comparisons and radical budgets

    Science.gov (United States)

    Griffith, S. M.; Hansen, R. F.; Dusanter, S.; Michoud, V.; Gilman, J. B.; Kuster, W. C.; Veres, P. R.; Graus, M.; de Gouw, J. A.; Roberts, J.; Young, C.; Washenfelder, R.; Brown, S. S.; Thalman, R.; Waxman, E.; Volkamer, R.; Tsai, C.; Stutz, J.; Flynn, J. H.; Grossberg, N.; Lefer, B.; Alvarez, S. L.; Rappenglueck, B.; Mielke, L. H.; Osthoff, H. D.; Stevens, P. S.

    2016-04-01

    Measurements of hydroxyl (OH) and hydroperoxy (HO2*) radical concentrations were made at the Pasadena ground site during the CalNex-LA 2010 campaign using the laser-induced fluorescence-fluorescence assay by gas expansion technique. The measured concentrations of OH and HO2* exhibited a distinct weekend effect, with higher radical concentrations observed on the weekends corresponding to lower levels of nitrogen oxides (NOx). The radical measurements were compared to results from a zero-dimensional model using the Regional Atmospheric Chemical Mechanism-2 constrained by NOx and other measured trace gases. The chemical model overpredicted measured OH concentrations on the weekends by a factor of approximately 1.4 ± 0.3 (1σ), while the agreement was better on the weekdays (ratio of 1.0 ± 0.2). The model underpredicted HO2* concentrations by a factor of 1.3 ± 0.2 on the weekends and by a factor of 3.0 ± 0.5 on the weekdays. However, increasing the modeled OH reactivity to match the measured total OH reactivity improved the overall agreement for both OH and HO2* on all days. A radical budget analysis suggests that photolysis of carbonyls and formaldehyde together accounted for approximately 40% of radical initiation, with photolysis of nitrous acid accounting for 30% at the measurement height and ozone photolysis contributing less than 20%. An analysis of the ozone production sensitivity reveals that during the week ozone production was limited by volatile organic compounds throughout the day, but on the weekends it was NOx-limited during the afternoon.

  2. A comparison of scope for growth (SFG) and dynamic energy budget (DEB) models applied to the blue mussel (Mytilus edulis)

    OpenAIRE

    Filgueira, Ramón; Rosland, R.; Grant, Jon

    2011-01-01

    Growth of Mytilus edulis was simulated using individual based models following both Scope For Growth (SFG) and Dynamic Energy Budget (DEB) approaches. These models were parameterized using independent studies and calibrated for each dataset by adjusting the half-saturation coefficient of the food ingestion function term, XK, a common parameter in both approaches related to feeding behavior. Auto-calibration was carried out using an optimization tool, which provides an objective way of tuning ...
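
    Both model families in this comparison feed through a saturating ingestion term governed by the half-saturation coefficient XK; a common choice, shown below purely for illustration with made-up numbers rather than the calibrated coefficients of the cited study, is the scaled functional response f = X / (X + XK).

      # Scaled functional response used in DEB-type models: f = X / (X + XK).
      # Illustrative only; XK and the food densities below are made-up values.
      def functional_response(food_density, half_saturation):
          """Fraction of the maximum ingestion rate realised at a given food density."""
          return food_density / (food_density + half_saturation)

      XK = 2.0                                  # hypothetical half-saturation coefficient
      for X in (0.5, 2.0, 8.0):
          print(f"X = {X:4.1f}  ->  f = {functional_response(X, XK):.2f}")

    At X = XK the response equals exactly one half of its maximum, which is what makes XK a natural calibration target in both model families.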

  3. MODEL OF DISTRIBUTION OF THE BUDGET OF THE PORTFOLIO OF IT PROJECTS TAKING INTO ACCOUNT THEIR PRIORITY

    Directory of Open Access Journals (Sweden)

    Anita V. Sotnikova

    2015-01-01

    Full Text Available The article addresses the problem of effectively distributing the overall budget of a portfolio among the IT projects that make up the portfolio, taking their priority into account. The problem is topical in view of the poor performance of consulting companies in the information technology sector. To determine the priority of IT projects, the analytic network process developed by T. Saaty is used. For this purpose, a system of criteria (indicators) is developed that reflects the influence of the portfolio's IT projects on the most significant goals of their implementation. The key performance indicators defined when developing a Balanced Scorecard, which meet the above requirements, are used as this system of criteria. The essence of the analytic network process is the pairwise comparison of key performance indicators with respect to the goal of realizing the portfolio and the IT projects that form it. The result of applying the analytic network process is a priority coefficient for each IT project in the portfolio. These priority coefficients are then used in the proposed model for distributing the portfolio budget among the IT projects. The portfolio budget is thus distributed taking into account not only the income from implementing each IT project but also other criteria that are important for the IT company, for example: the degree to which the IT project matches the company's strategic objectives, which determines whether implementing it is expedient; and the implementation deadline set by the customer. The proposed model of budget distribution is tested on the example of distributing the budget of a portfolio consisting of three IT projects. Taking into account the received
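
    A very reduced version of the allocation step is sketched below. It is illustrative only: the priority coefficients are invented, and the published model also weighs income, strategic fit and deadlines rather than allocating purely in proportion to priority.

      # Proportional allocation of a portfolio budget by priority coefficient.
      # Illustrative sketch: the coefficients below are invented, and the cited model
      # combines priorities with income, strategic fit and deadline constraints.
      def allocate_budget(total_budget, priorities):
          total_weight = sum(priorities.values())
          return {project: total_budget * w / total_weight for project, w in priorities.items()}

      priorities = {"IT project A": 0.52, "IT project B": 0.31, "IT project C": 0.17}   # hypothetical ANP output
      for project, share in allocate_budget(1_000_000, priorities).items():
          print(f"{project}: {share:,.0f}")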

  4. Earth radiation budget from a surface perspective and its representation in CMIP5 models

    Science.gov (United States)

    Wild, M.

    2012-04-01

    The genesis and evolution of Earth's climate is largely regulated by the global energy balance. Despite the central importance of the global energy balance for the climate system and climate change, substantial uncertainties still exist in the quantification of its different components, and their representation in climate models (e.g., Wild et al. 1998 Clim. Dyn., Wild 2008 Tellus). While the net radiative energy flow in and out of the climate system at the top of atmosphere (TOA) is known with considerable accuracy from new satellite programs such as CERES, much less is known about the energy distribution within the climate system and at the Earth surface. Here we use direct surface observations from the Baseline Surface Radiation Network (BSRN) and the Global Energy Balance Archive (GEBA) to provide better constraints on the surface radiative components as well as to investigate their temporal changes. We analyze radiation budgets of the latest generation of global climate models as used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) and in the upcoming Fifth IPCC assessment report (IPCCAR5). Compared to a comprehensive set of surface observations, the CMIP5 models overestimate the shortwave radiation incident at the surface by 5-10 W m⁻² on average, due to a lack of absorption in the atmosphere. This suggests that the best estimate for the global mean absorbed shortwave radiation at the surface should be lower than the simulated estimates, which are on average slightly below 170 W m⁻², so that a value of no more than 160 W m⁻² might be the most realistic estimate for the global mean absorbed shortwave radiation at the surface. In contrast, the longwave downward radiation at the surface is underestimated by a similar amount in these models, suggesting that the best estimate for the global mean downward longwave radiation should be rather around 345 W m⁻² than the model average of 338 W m⁻². There is further increasing evidence from the direct

  5. Global atmospheric sulfur budget under volcanically quiescent conditions: Aerosol-chemistry-climate model predictions and validation

    Science.gov (United States)

    Sheng, Jian-Xiong; Weisenstein, Debra K.; Luo, Bei-Ping; Rozanov, Eugene; Stenke, Andrea; Anet, Julien; Bingemer, Heinz; Peter, Thomas

    2015-01-01

    The global atmospheric sulfur budget and its emission dependence have been investigated using the coupled aerosol-chemistry-climate model SOCOL-AER. The aerosol module comprises gaseous and aqueous sulfur chemistry and comprehensive microphysics. The particle distribution is resolved by 40 size bins spanning radii from 0.39 nm to 3.2 μm, including size-dependent particle composition. Aerosol radiative properties required by the climate model are calculated online from the aerosol module. The model successfully reproduces main features of stratospheric aerosols under nonvolcanic conditions, including aerosol extinctions compared to Stratospheric Aerosol and Gas Experiment II (SAGE II) and Halogen Occultation Experiment, and size distributions compared to in situ measurements. The calculated stratospheric aerosol burden is 109 Gg of sulfur, matching the SAGE II-based estimate (112 Gg). In terms of fluxes through the tropopause, the stratospheric aerosol layer is due to about 43% primary tropospheric aerosol, 28% SO2, 23% carbonyl sulfide (OCS), 4% H2S, and 2% dimethyl sulfide (DMS). Turning off emissions of the short-lived species SO2, H2S, and DMS shows that OCS alone still establishes about 56% of the original stratospheric aerosol burden. Further sensitivity simulations reveal that anticipated increases in anthropogenic SO2 emissions in China and India have a larger influence on stratospheric aerosols than the same increase in Western Europe or the U.S., due to deep convection in the western Pacific region. However, even a doubling of Chinese and Indian emissions is predicted to increase the stratospheric background aerosol burden only by 9%. In contrast, small to moderate volcanic eruptions, such as that of Nabro in 2011, may easily double the stratospheric aerosol loading.

  6. Exploring the Dynamics and Modeling National Budget as a Supply Chain System: A Proposal for Reengineering the Budgeting Process and for Developing a Management Flight Simulator

    Science.gov (United States)

    2012-09-01

    Keynesian and the Balanced-Budget case. Finally, a general framework of forming, implementing, auditing budgets is initialized and, using the ... are other activities that are associated with the implementation stage. Lastly, the review stage consists of all the actions related to audits ... process, which can be proved as a “futile and hopeless labour” (Olivares-Caminal, 2010). However, previous experience of defaults (i.e. Argentina

  7. System Budgets

    DEFF Research Database (Denmark)

    Jeppesen, Palle

    1996-01-01

    The lecture note is aimed at introducing system budgets for optical communication systems. It treats optical fiber communication systems (six generations), system design, bandwidth effects, other system impairments and optical amplifiers.

  8. Standard Model as a Double Field Theory.

    Science.gov (United States)

    Choi, Kang-Sin; Park, Jeong-Hyuck

    2015-10-23

    We show that, without introducing any extra physical degrees of freedom, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes.

  9. Linear sigma model for multiflavor gauge theories

    Science.gov (United States)

    Meurice, Y.

    2017-12-01

    We consider a linear sigma model describing $2N_f^2$ bosons ($\sigma$, $a_0$, $\eta'$ and $\pi$) as an approximate effective theory for an $SU(3)$ local gauge theory with $N_f$ Dirac fermions in the fundamental representation. The model has a renormalizable $U(N_f)_L \otimes U(N_f)_R$ invariant part, which has an approximate $O(2N_f^2)$ symmetry, and two additional terms, one describing the effects of an $SU(N_f)_V$ invariant mass term and the other the effects of the axial anomaly. We calculate the spectrum for arbitrary $N_f$. Using preliminary and published lattice results from the LatKMI collaboration, we found combinations of the masses that vary slowly with the explicit chiral symmetry breaking and $N_f$. This suggests that the anomaly term plays a leading role in the mass spectrum and that simple formulas such as $M_\sigma^2 \simeq (2/N_f - C_\sigma) M_{\eta'}^2$ should apply in the chiral limit. Lattice measurements of $M_{\eta'}^2$ and of approximate constants such as $C_\sigma$ could help in locating the boundary of the conformal window. We show that our calculation can be adapted for arbitrary representations of the gauge group and in particular to the minimal model with two sextets, where similar patterns are likely to apply.

  10. Standard Models from Heterotic M-theory

    CERN Document Server

    Donagi, R Y; Pantev, T; Waldram, D; Donagi, Ron; Ovrut, Burt A.; Pantev, Tony; Waldram, Daniel

    1999-01-01

    We present a class of N=1 supersymmetric models of particle physics, derived directly from heterotic M-theory, that contain three families of chiral quarks and leptons coupled to the gauge group $SU(3)_C\\times SU(2)_{L}\\times U(1)_{Y}$. These models are a fundamental form of ``brane-world'' theories, with an observable and hidden sector each confined, after compactification on a Calabi-Yau threefold, to a BPS three-brane separated by a five dimensional bulk space with size of the order of the intermediate scale. The requirement of three families, coupled to the fundamental conditions of anomaly freedom and supersymmetry, constrains these models to contain additional five-branes wrapped around holomorphic curves in the Calabi-Yau threefold. These five-branes ``live'' in the bulk space and represent new, non-perturbative aspects of these particle physics vacua. We discuss, in detail, the relevant mathematical structure of a class of torus-fibered Calabi-Yau threefolds with non-trivial first homotopy groups and ...

  11. A matrix model from string field theory

    Directory of Open Access Journals (Sweden)

    Syoji Zeze

    2016-09-01

    Full Text Available We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and $U(N)$ vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large $N$. An increase of the potential height is observed in both cases. The large $N$ matrix integral is identified with a system of $N$ ZZ branes and a ghost FZZT brane.

  12. DCS Budget Tracking System

    Data.gov (United States)

    Social Security Administration — DCS Budget Tracking System database contains budget information for the Information Technology budget and the 'Other Objects' budget. This data allows for monitoring...

  13. Verification and calibration of Energy- and Flux-Budget (EFB) turbulence closure model through large eddy simulations and direct numerical simulations

    Science.gov (United States)

    Kadantsev, Evgeny; Fortelius, Carl; Druzhinin, Oleg; Mortikov, Evgeny; Glazunov, Andrey; Zilitinkevich, Sergej

    2016-04-01

    We examine and validate the EFB turbulence closure model (Zilitinkevich et al., 2013), which is based on the budget equations for basic second moments, namely, two energies: turbulent kinetic energy EK and turbulent potential energy EP, and vertical turbulent fluxes of momentum and potential temperature, τi (i = 1, 2) and Fz. Instead of traditional postulation of down-gradient turbulent transport, the EFB closure determines the eddy viscosity and eddy conductivity from the steady-state version of the budget equations for τi and Fz. Furthermore, the EFB closure involves new prognostic equation for turbulent dissipation time scale tT, and extends the theory to non-steady turbulence regimes accounting for non-gradient and non-local turbulent transports (when the traditional concepts of eddy viscosity and eddy conductivity become generally inconsistent). Our special interest is in asymptotic behavior of the EFB closure in strongly stable stratification. For this purpose, we consider plane Couette flow, namely, the flow between two infinite parallel plates, one of which is moving relative to another. We use a set of Direct Numerical Simulation (DNS) experiments at the highest possible Reynolds numbers for different bulk Richardson numbers (Druzhinin et al., 2015). To demonstrate potential improvements in Numerical Weather Prediction models, we test the new closure model in various idealized cases, varying stratification from the neutral and conventionally neutral to stable (GABLS1) running a test RANS model and HARMONIE/AROME model in single-column mode. Results are compared with DNS and LES (Large Eddy Simulation) runs and different numerical weather prediction models.

  14. Nature, theory and modelling of geophysical convective planetary boundary layers

    Science.gov (United States)

    Zilitinkevich, Sergej

    2015-04-01

    horizontal branches of organised structures. This mechanism (Zilitinkevich et al., 2006) was overlooked in conventional local theories, such as the Monin-Obukhov similarity theory, and in the convective heat/mass transfer law Nu ~ Ra^{1/3}, where Nu and Ra are the Nusselt and Rayleigh numbers. References Hellsten A., Zilitinkevich S., 2013: Role of convective structures and background turbulence in the dry convective boundary layer. Boundary-Layer Meteorol. 149, 323-353. Zilitinkevich, S.S., 1973: Shear convection. Boundary-Layer Meteorol. 3, 416-423. Zilitinkevich, S.S., 1991: Turbulent Penetrative Convection, Avebury Technical, Aldershot, 180 pp. Zilitinkevich S.S., 2012: The Height of the Atmospheric Planetary Boundary layer: State of the Art and New Development - Chapter 13 in 'National Security and Human Health Implications of Climate Change', edited by H.J.S. Fernando, Z. Klaić, J.L. McKulley, NATO Science for Peace and Security Series - C: Environmental Security (ISBN 978-94-007-2429-7), Springer, 147-161. Zilitinkevich S.S., 2013: Atmospheric Turbulence and Planetary Boundary Layers. Fizmatlit, Moscow, 248 pp. Zilitinkevich, S.S., Hunt, J.C.R., Grachev, A.A., Esau, I.N., Lalas, D.P., Akylas, E., Tombrou, M., Fairall, C.W., Fernando, H.J.S., Baklanov, A., and Joffre, S.M., 2006: The influence of large convective eddies on the surface layer turbulence. Quart. J. Roy. Met. Soc. 132, 1423-1456. Zilitinkevich S.S., Tyuryakov S.A., Troitskaya Yu. I., Mareev E., 2012: Theoretical models of the height of the atmospheric planetary boundary layer and turbulent entrainment at its upper boundary. Izvestija RAN, FAO, 48, No.1, 150-160. Zilitinkevich, S.S., Elperin, T., Kleeorin, N., Rogachevskii, I., Esau, I.N., 2013: A hierarchy of energy- and flux-budget (EFB) turbulence closure models for stably stratified geophysical flows. Boundary-Layer Meteorol. 146, 341-373.

  15. Using chemical organization theory for model checking.

    Science.gov (United States)

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-08-01

    The increasing number and complexity of biomodels make automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify more accurately [in 26 cases (14%)] the species and reactions that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. All data and a Java applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools. Supplementary data are available at Bioinformatics online.
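
    For readers unfamiliar with OT, the "closed set" half of an organization check is easy to sketch. The toy network and helper below are invented for illustration; they are not the authors' algorithm, nor BioModels data, and the self-maintenance test (which requires linear programming) is omitted.

      # Compute the closure of a species set under a toy reaction network:
      # repeatedly add every product of a reaction whose reactants are already present.
      # An organization in chemical organization theory is a set that is both closed
      # and self-maintaining; only closure is sketched here. Toy network invented.
      def closure(species, reactions):
          closed = set(species)
          changed = True
          while changed:
              changed = False
              for reactants, products in reactions:
                  if set(reactants) <= closed and not set(products) <= closed:
                      closed |= set(products)
                      changed = True
          return closed

      toy_reactions = [
          (["A", "B"], ["C"]),      # A + B -> C
          (["C"], ["A", "D"]),      # C -> A + D
          (["D", "E"], ["E"]),      # D + E -> E  (E acts like a catalyst/modifier)
      ]
      print(sorted(closure({"A", "B"}, toy_reactions)))   # -> ['A', 'B', 'C', 'D']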

  16. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  17. Improved predictive ability of climate-human-behaviour interactions with modifications to the COMFA outdoor energy budget model

    Science.gov (United States)

    Vanos, J. K.; Warland, J. S.; Gillespie, T. J.; Kenny, N. A.

    2012-11-01

    The purpose of this paper is to implement current and novel research techniques in human energy budget estimations to give more accurate and efficient application of models by a variety of users. Using the COMFA model, the conditioning level of an individual is incorporated into overall energy budget predictions, giving more realistic estimations of the metabolism experienced at various fitness levels. Through the use of VO2 reserve estimates, errors are found when an elite athlete is modelled as an unconditioned or a conditioned individual, giving budgets underpredicted significantly by -173 and -123 W m⁻², respectively. Such underprediction can result in critical errors regarding heat stress, particularly in highly motivated individuals; thus this revision is critical for athletic individuals. A further improvement in the COMFA model involves improved adaptation of clothing insulation (Icl), as well as clothing non-uniformity, with changing air temperature (Ta) and metabolic activity (Mact). Equivalent Ta values (for Icl estimation) are calculated in order to lower the Icl value with increasing Mact at equal Ta. Furthermore, threshold Ta values are calculated to predict the point at which an individual will change from a uniform Icl to a segmented Icl (full ensemble to shorts and a T-shirt). Lastly, improved relative velocity (vr) estimates were found with a refined equation accounting for the angle of the wind to the body's movement. Differences between the original and improved vr equations increased with higher wind and activity speeds, and as the wind-to-body angle moved away from 90°. Under moderate microclimate conditions, with wind from behind a person, the convective heat loss and skin temperature estimates were 47 W m⁻² and 1.7°C higher when using the improved vr equation. These model revisions improve the applicability and usability of the COMFA energy budget model for subjects performing physical activity in outdoor environments.
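
    The published refinement is not reproduced here, but the underlying vector geometry is standard: the air speed felt by a moving person is the magnitude of the wind vector minus the body-velocity vector, which depends on the angle between the two. The sketch below illustrates that dependence with made-up speeds; it is the textbook relation, not necessarily the paper's exact equation.

      # Relative air speed experienced by a moving person, from plain vector geometry:
      #   |v_rel| = sqrt(v_wind^2 + v_body^2 - 2*v_wind*v_body*cos(theta)),
      # where theta is the angle between the wind vector and the direction of travel.
      # Illustrative numbers only.
      import math

      def relative_speed(v_wind, v_body, theta_deg):
          theta = math.radians(theta_deg)
          return math.sqrt(v_wind**2 + v_body**2 - 2.0 * v_wind * v_body * math.cos(theta))

      v_wind, v_body = 3.0, 2.0          # m/s, hypothetical
      for angle in (0, 90, 180):         # tailwind, crosswind, headwind
          print(f"theta = {angle:3d} deg  ->  v_rel = {relative_speed(v_wind, v_body, angle):.2f} m/s")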

  18. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  19. Growth potential of blue mussels (M. edulis) exposed to different salinities evaluated by a Dynamic Energy Budget model

    DEFF Research Database (Denmark)

    Maar, Marie; Saurel, Camille; Landes, Anja

    2015-01-01

    For blue mussels, Mytilus edulis, one major constraint in the Baltic Sea is the low salinities that reduce the efficiency of mussel production. However, the effects of living in low and variable salinity regimes are rarely considered in models describing mussel growth. The aim of the present study was to incorporate the effects of low salinity into an eco-physiological model of blue mussels and to identify areas suitable for mussel production. A Dynamic Energy Budget (DEB) model was modified with respect to i) the morphological parameters (DW/WW-ratio, shape factor), ii) change in ingestion rate and iii...

  20. A model to estimate hydrological processes and water budget in an irrigation farm pond

    Science.gov (United States)

    Ying Ouyang; Joel O. Paz; Gary Feng; John J. Read; Ardeshir Adeli; Johnie N. Jenkins

    2017-01-01

    With increased interest in conserving groundwater resources without reducing crop yield potential, more on-farm water storage ponds have been constructed in recent years in the USA and around the world. However, the hydrological processes, water budget, and environmental benefits and consequences of these ponds have not yet been fully quantified. This study developed a...

  1. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on an updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline through the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. All study designs were included. We found 810 articles related to our topic and finally included 63 with full text available. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory, operationalized through a logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, and our proposal format is based on them. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  2. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2017-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  3. Conformal Field Theory Applied to Loop Models

    Science.gov (United States)

    Jacobsen, Jesper Lykke

    The application of methods of quantum field theory to problems of statistical mechanics can in some sense be traced back to Onsager's 1944 solution [1] of the two-dimensional Ising model. It does however appear fair to state that the 1970's witnessed a real gain of momentum for this approach, when Wilson's ideas on scale invariance [2] were applied to study critical phenomena, in the form of the celebrated renormalisation group [3]. In particular, the so-called ɛ expansion permitted the systematic calculation of critical exponents [4], as formal power series in the space dimensionality d, below the upper critical dimension d c . An important lesson of these efforts was that critical exponents often do not depend on the precise details of the microscopic interactions, leading to the notion of a restricted number of distinct universality classes.

  4. An energy budget agent-based model of earthworm populations and its application to study the effects of pesticides.

    Science.gov (United States)

    Johnston, A S A; Hodson, M E; Thorbek, P; Alvarez, T; Sibly, R M

    2014-05-24

    Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies, of limited ecological relevance, supplemented if necessary by site-specific field trials, which sometimes are challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm (Eisenia fetida) in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform, we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation. Although we only incorporate toxic effects at the individual level, we show how ABMs can readily extrapolate to larger scales by providing
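
    A drastically reduced sketch of the kind of individual-level bookkeeping described above is shown below. The allocation rule (maintenance first, then a fixed split between growth and reproduction) and all parameter values are invented for illustration; they are not taken from the published E. fetida model.

      # Minimal individual energy-budget bookkeeping: intake covers maintenance first,
      # the surplus is split between growth and reproduction, and a deficit is paid
      # from body mass (starvation). All rules and numbers are illustrative placeholders.
      class Worm:
          def __init__(self, mass=0.5):                 # g wet weight, hypothetical
              self.mass = mass
              self.cocoons = 0.0

          def step(self, intake, kappa=0.8, maint_per_g=0.3, growth_cost=2.0, cocoon_cost=1.5):
              maintenance = maint_per_g * self.mass
              surplus = intake - maintenance
              if surplus >= 0:
                  self.mass += kappa * surplus / growth_cost            # growth
                  self.cocoons += (1 - kappa) * surplus / cocoon_cost   # reproduction
              else:
                  self.mass = max(0.0, self.mass + surplus / growth_cost)   # shrink when starved

      worm = Worm()
      for food in [1.0, 1.0, 0.1, 0.1, 1.0]:     # arbitrary daily energy intake
          worm.step(food)
      print(f"mass = {worm.mass:.2f} g, cumulative cocoon investment = {worm.cocoons:.2f}")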

  5. A Realizability Model for Impredicative Hoare Type Theory

    DEFF Research Database (Denmark)

    Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar

    2008-01-01

    We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear, and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.

  6. Global budget of tropospheric ozone: Evaluating recent model advances with satellite (OMI), aircraft (IAGOS), and ozonesonde observations

    Science.gov (United States)

    Hu, Lu; Jacob, Daniel J.; Liu, Xiong; Zhang, Yi; Zhang, Lin; Kim, Patrick S.; Sulprizio, Melissa P.; Yantosca, Robert M.

    2017-10-01

    The global budget of tropospheric ozone is governed by a complicated ensemble of coupled chemical and dynamical processes. Simulation of tropospheric ozone has been a major focus of the GEOS-Chem chemical transport model (CTM) over the past 20 years, and many developments over the years have affected the model representation of the ozone budget. Here we conduct a comprehensive evaluation of the standard version of GEOS-Chem (v10-01) with ozone observations from ozonesondes, the OMI satellite instrument, and MOZAIC-IAGOS commercial aircraft for 2012-2013. Global validation of the OMI 700-400 hPa data with ozonesondes shows that OMI maintained persistent high quality and no significant drift over the 2006-2013 period. GEOS-Chem shows no significant seasonal or latitudinal bias relative to OMI and strong correlations in all seasons on the 2° × 2.5° horizontal scale (r = 0.88-0.95), improving on previous model versions. The most pronounced model bias revealed by ozonesondes and MOZAIC-IAGOS is at high northern latitudes in winter-spring where the model is 10-20 ppbv too low. This appears to be due to insufficient stratosphere-troposphere exchange (STE). Model updates to lightning NOx, Asian anthropogenic emissions, bromine chemistry, isoprene chemistry, and meteorological fields over the past decade have overall led to gradual increase in the simulated global tropospheric ozone burden and more active ozone production and loss. From simulations with different versions of GEOS meteorological fields we find that tropospheric ozone in GEOS-Chem v10-01 has a global production rate of 4960-5530 Tg a⁻¹, lifetime of 20.9-24.2 days, burden of 345-357 Tg, and STE of 325-492 Tg a⁻¹. Change in the intensity of tropical deep convection between these different meteorological fields is a major factor driving differences in the ozone budget.
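
    As a rough consistency check on the quoted ranges (illustrative arithmetic only; the model lifetime is defined against chemical loss plus deposition, so exact closure is not expected), the burden divided by the lifetime should be of the same order as the supply implied by chemical production plus stratosphere-troposphere exchange.

      # Rough steady-state check using mid-range values quoted in the abstract.
      burden_tg      = (345 + 357) / 2          # Tg
      lifetime_days  = (20.9 + 24.2) / 2        # days
      production_tg  = (4960 + 5530) / 2        # Tg per year
      ste_tg         = (325 + 492) / 2          # Tg per year

      implied_loss = burden_tg / lifetime_days * 365.0      # Tg per year removed at steady state
      supplied     = production_tg + ste_tg                 # Tg per year supplied
      print(f"implied loss ~ {implied_loss:.0f} Tg/yr vs production + STE ~ {supplied:.0f} Tg/yr")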

  7. Stratospheric water vapour budget and convection overshooting the tropopause: modelling study from SCOUT-AMMA

    Directory of Open Access Journals (Sweden)

    X. M. Liu

    2010-09-01

    Full Text Available The aim of this paper is to study the impacts of overshooting convection at a local scale on the water distribution in the tropical UTLS. Overshooting convection is assumed to be one of the processes controlling the amount of water vapour entering the stratosphere, by injecting ice crystals above the tropopause which later sublimate and hydrate the lower stratosphere. For this purpose, we quantify the individual impact of two cases of overshooting convection in Africa observed during SCOUT-AMMA: the case of 4 August 2006 over southern Chad, which is likely to have influenced the water vapour measurements by micro-SDLA and FLASH-B from Niamey on 5 August, and the case of a mesoscale convective system over Aïr on 5 August 2006. We make use of high-resolution (down to 1 km horizontally) nested grid simulations with the three-dimensional regional atmospheric model BRAMS (Brazilian Regional Atmospheric Modelling System). In both cases, BRAMS succeeds in simulating the main features of the convective activity, as well as overshooting convection, though the exact position and time of the overshoots indicated by the MSG brightness temperature difference are not fully reproduced (typically a 1° displacement in latitude for both cases, and a shift of several hours for the Aïr case on 5 August 2006). Total water budgets associated with these two events show a significant injection of ice particles above the tropopause, with maximum values of about 3.7 ton s⁻¹ for the Chad case (4 August) and 1.4 ton s⁻¹ for the Aïr case (5 August), and a total upward cross-tropopause transport of about 3300 ton h⁻¹ for the Chad case and 2400 ton h⁻¹ for the Aïr case in the third domain of simulation. The order of magnitude of these modelled fluxes is lower than, but comparable with, those of similar studies in other tropical areas based on

  8. Modeling Poker Challenges by Evolutionary Game Theory

    Directory of Open Access Journals (Sweden)

    Marco Alberto Javarone

    2016-12-01

    Full Text Available We introduce a model for studying the evolutionary dynamics of Poker. Despite its wide diffusion and the scientific interest it has raised, Poker still represents an open challenge. Recent attempts to uncover its real nature, based on statistical physics, showed that under some conditions Poker can be considered a skill game. In addition, preliminary investigations reported a clear difference between tournaments and ‘cash game’ challenges, i.e., between the two main configurations for playing Poker. Notably, these previous models analyzed populations composed of rational and irrational agents, identifying as rational those that play Poker using a mathematical strategy, and as irrational those that play randomly. Remarkably, tournaments require very few rational agents to make Poker a skill game, while a ‘cash game’ may require many rational agents in order not to be classified as gambling. In addition, when the agent interactions are based on the ‘cash game’ configuration, the population shows an interesting bistable behavior that deserves further attention. In the proposed model, we aim to study the evolutionary dynamics of Poker within the framework of Evolutionary Game Theory, in order to gain further insight into its nature and to clarify those points that remained open in the previous works (such as the mentioned bistable behavior). In particular, we analyze the dynamics of an agent population composed of rational and irrational agents that modify their behavior through two possible mechanisms: self-evaluation of the gained payoff, and social imitation. The results allow us to identify a relation between the mechanisms for updating the agents' behavior and the final equilibrium of the population. Moreover, the proposed model provides further details on the bistable behavior observed in the ‘cash game’ configuration.

  9. Public Health and Budget Impact of Probiotics on Common Respiratory Tract Infections: A Modelling Study

    OpenAIRE

    Irene Lenoir-Wijnkoop; Laetitia Gerlier; Jean-Louis Bresson; Claude Le Pen; Gilles Berdeaux

    2015-01-01

    Objectives Two recent meta-analyses by the York Health Economics Consortium (YHEC) and Cochrane demonstrated probiotic efficacy in reducing the duration and number of common respiratory tract infections (CRTI) and associated antibiotic prescriptions. A health-economic analysis was undertaken to estimate the public health and budget consequences of a generalized probiotic consumption in France. Methods A virtual age- and gender-standardized population was generated using a Markov microsimulati...

  10. TRADITIONAL BUDGETING VERSUS BEYOND BUDGETING: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    CARDOS ILDIKO REKA

    2014-07-01

    Full Text Available Budgets have been an important part of the business environment since the 1920s and are considered key drivers and evaluators of managerial performance and key elements for planning and control. Budgets are the most powerful tool for management control; they can play an essential role in an organization's power politics because they can increase the power and authority of top management and limit the autonomy of lower-level managers. Besides its advantages, traditional budgeting also has disadvantages. In recent years criticism of traditional budgeting has increased. The basis of this criticism is that traditional budgeting is a relic of the past: it prevents reactions to changes in the market, it cannot keep up with the changes and requirements of today's business world, and it is not useful for business management. In order to address this criticism, researchers and practitioners have developed more systematic and alternative budgeting concepts that better suit the needs of the modern business environment. Beyond budgeting, better budgeting, rolling forecasts and activity-based budgeting are the main alternatives developed in recent years. Of these alternatives, this article examines only beyond budgeting. Our paper discusses how budgeting has evolved into its current state, before examining why this universal technique has come under such heavy criticism of late. The paper is a literature analysis; it contributes to the existing managerial accounting literature and is structured as follows. In the first part the background and evolution of budgeting is presented, followed by an analysis of related theories in traditional budgeting, emphasizing both the advantages and disadvantages of traditional budgeting. The second part of the paper continues with a discussion of alternative budgeting methods, highlighting the pros and cons of the alternatives, especially beyond budgeting. In the third part conducted

  11. Development of a process-based model to predict pathogen budgets for the Sydney drinking water catchment.

    Science.gov (United States)

    Ferguson, Christobel M; Croke, Barry F W; Beatson, Peter J; Ashbolt, Nicholas J; Deere, Daniel A

    2007-06-01

    In drinking water catchments, reduction of pathogen loads delivered to reservoirs is an important priority for the management of raw source water quality. To assist with the evaluation of management options, a process-based mathematical model (pathogen catchment budgets - PCB) is developed to predict Cryptosporidium, Giardia and E. coli loads generated within and exported from drinking water catchments. The model quantifies the key processes affecting the generation and transport of microorganisms from humans and animals using land use and flow data, and catchment specific information including point sources such as sewage treatment plants and on-site systems. The resultant pathogen catchment budgets (PCB) can be used to prioritize the implementation of control measures for the reduction of pathogen risks to drinking water. The model is applied in the Wingecarribee catchment and used to rank those sub-catchments that would contribute the highest pathogen loads in dry weather, and in intermediate and large wet weather events. A sensitivity analysis of the model identifies that pathogen excretion rates from animals and humans, and manure mobilization rates are significant factors determining the output of the model and thus warrant further investigation.
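
    A toy version of the land-use bookkeeping behind such a budget might look like the sketch below. It is illustrative only: the areas, densities, excretion and mobilization rates are invented, and the published PCB model also accounts for hydrology, pathogen survival, on-site systems and treatment plants.

      # Toy pathogen load budget: sum over land uses of
      #   area * animal density * excretion rate * fraction mobilized,
      # plus point sources. All numbers are invented placeholders.
      land_uses = {
          #            area_ha, animals_per_ha, oocysts_per_animal_per_day, mobilization_fraction
          "grazing":   (12_000, 1.5,            1.0e6,                      0.05),
          "forest":    (30_000, 0.1,            5.0e5,                      0.01),
          "urban":     ( 2_000, 0.0,            0.0,                        0.0),
      }
      point_sources_per_day = 2.0e9      # e.g. sewage treatment plant effluent (hypothetical)

      diffuse = sum(area * dens * excr * mob for area, dens, excr, mob in land_uses.values())
      total = diffuse + point_sources_per_day
      print(f"Diffuse load: {diffuse:.2e} oocysts/day, total with point sources: {total:.2e}")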

  12. Characteristics of Water Budget Components in Paddy Rice Field under the Asian Monsoon Climate: Application of HSPF-Paddy Model

    Directory of Open Access Journals (Sweden)

    Young-Jin Kim

    2014-07-01

    Full Text Available The HSPF-Paddy model was applied to the Bochung watershed in Korea to compare water budget components by land use type under the Asian monsoon climate. Calibration of HSPF-Paddy for 1992–2001, using PEST (a parameter-estimation package for optimizing HSPF), and validation for 1985–1991 were carried out. The model efficiencies for monthly stream flow are 0.85 for calibration and 0.84 for validation. The simulation of annual mean runoff met the criteria of water budget analysis with an acceptable error level (less than 10 percent mean error). The simulation of the movement of water from paddy rice fields to the watershed was successful, and applying HSPF-Paddy coupled with PEST improved the accuracy of the simulation while reducing the time and effort needed for calibration. The results of the water budget analysis show that most of the outflow (86%) for the urban area occurred through surface runoff, the highest rate among the land use types compared. Significant amounts of water are supplied to paddy rice fields by irrigation, and both the runoff depth and the evapotranspiration from paddy rice fields are higher than for other land use types. A hydrological characteristic of the paddy rice field is that most of the water movement occurs at the surface, resulting from the low infiltration rate and Manning's coefficient, as well as the ponded water maintained throughout the growing season. The major contributions to water input and output were precipitation and runoff, respectively, influenced by the Asian monsoon climate.
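
    For orientation, the water budget components compared above follow the usual field-scale balance for a ponded paddy; written out in its textbook form (which may differ in detail from the HSPF-Paddy bookkeeping), with all terms expressed as depths of water per accounting period:

      \Delta S \;=\; P + I \;-\; ET \;-\; RO \;-\; DP

    where \Delta S is the change in ponded and soil water storage, P is precipitation, I is irrigation supply, ET is evapotranspiration, RO is surface runoff and DP is deep percolation.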

  13. Theory and Modeling in Support of Tether

    Science.gov (United States)

    Chang, C. L.; Bergeron, G.; Drobot, A. D.; Papadopoulos, K.; Riyopoulos, S.; Szuszczewicz, E.

    1999-01-01

    This final report summarizes the work performed by SAIC's Applied Physics Operation on the modeling and support of Tethered Satellite System missions (TSS-1 and TSS-1R). The SAIC team, known as the Theory and Modeling in Support of Tether (TMST) investigation, was one of the original twelve teams selected in July 1985 for the first TSS mission. The accomplishments described in this report cover the period December 19, 1985 to September 30, 1999 and are the result of a continuous effort aimed at supporting the TSS missions in the following major areas. During the contract period, SAIC's TMST investigation acted to: participate in the planning and execution of both TSS missions; provide scientific understanding of the issues involved in electrodynamic tether system operation prior to the TSS missions; predict ionospheric conditions encountered during the re-flight mission (TSS-1R) based on real-time global ionosonde data; and perform post-mission analyses to enhance our understanding of the TSS results. Specifically, we have 1) constructed and improved current collection models and enhanced our understanding of the current-voltage data; 2) investigated the effects of neutral gas on the current collection processes; 3) conducted laboratory experiments to study the discharge phenomena during and after tether break; and 4) performed numerical simulations to understand data collected by the SPES plasma instrument onboard the TSS satellite. The team also designed and produced a multimedia CD that highlights TSS mission achievements and conveys knowledge of tether technology to the general public. Along with discussions of this work, a list of publications and presentations derived from the TMST investigation spanning the reporting period is compiled.

  14. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  15. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo...... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory....

  16. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work was used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  17. A Leadership Identity Development Model: Applications from a Grounded Theory

    Science.gov (United States)

    Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.

    2006-01-01

    This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…

  18. The big bang theory and Universe modeling. Mistakes in the relativity theory

    OpenAIRE

    Javadov, Khaladdin; Javadli, Elmaddin

    2014-01-01

    This article is about the Big Bang theory and describes some details of Universe modelling. It concerns the physical and mathematical modelling of Universe formation and the application of mathematical and physical formulas to Universe calculations.

  19. Simulating the carbon, water, energy budgets and greenhouse gas emissions of arctic soils with the ISBA land surface model

    Science.gov (United States)

    Morel, Xavier; Decharme, Bertrand; Delire, Christine

    2017-04-01

    Permafrost soils and boreal wetlands represent an important challenge for future climate simulations. Our aim is to be able to correctly represent the most important thermal, hydrologic and carbon-cycle-related processes in boreal areas with our land surface model ISBA (Masson et al, 2013). This is particularly important since ISBA is part of the CNRM-CM Climate Model (Voldoire et al, 2012), which is used for projections of future climate changes. To achieve this goal, we replaced the original one-layer soil carbon module based on the CENTURY model (Parton et al, 1987) with a multi-layer soil carbon module that represents C pools and fluxes (CO2 and CH4), organic matter decomposition, gas diffusion (Khvorostyanov et al., 2008), CH4 ebullition and plant-mediated transport, and cryoturbation (Koven et al., 2009). The carbon budget of the new model is closed. The soil carbon module is tightly coupled to the ISBA energy and water budget module that solves the one-dimensional Fourier law and the mixed form of the Richards equation explicitly to calculate the time evolution of the soil energy and water budgets (Boone et al., 2000; Decharme et al. 2011). The carbon, energy and water modules are solved using the same vertical discretization. Snowpack processes are represented by a multi-layer snow model (Decharme et al, 2016). We test this new model on a pair of monitoring sites in Greenland, one in a permafrost area (Zackenberg Ecological Research Operations, Jensen et al, 2014) and the other in a region without permafrost (Nuuk Ecological Research Operations, Jensen et al, 2013); both sites are established within the GeoBasis part of the Greenland Ecosystem Monitoring (GEM) program. The site of Chokurdakh, in a permafrost area of Siberia, is our third study site. We test the model's ability to represent the physical variables (soil temperature and water profiles, snow height), the energy and water fluxes as well as the carbon dioxide and methane fluxes. We also test the
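
    For reference, the two governing equations mentioned above, written in their common one-dimensional textbook forms (the ISBA discretization may differ in detail), are the Fourier heat-conduction law and the mixed form of the Richards equation:

      C_h \frac{\partial T}{\partial t} = \frac{\partial}{\partial z}\left(\lambda \frac{\partial T}{\partial z}\right),
      \qquad
      \frac{\partial \theta}{\partial t} = \frac{\partial}{\partial z}\left[K(\psi)\left(\frac{\partial \psi}{\partial z} - 1\right)\right] - S,

    with z positive downward, C_h the volumetric heat capacity, \lambda the thermal conductivity, T the soil temperature, \theta the volumetric water content, \psi the pressure head, K the hydraulic conductivity and S a sink term (e.g. root extraction).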

  20. Hydrologic characterization for Spring Creek and hydrologic budget and model scenarios for Sheridan Lake, South Dakota, 1962-2007

    Science.gov (United States)

    Driscoll, Daniel G.; Norton, Parker A.

    2009-01-01

    The U.S. Geological Survey cooperated with South Dakota Game, Fish and Parks to characterize hydrologic information relevant to management of water resources associated with Sheridan Lake, which is formed by a dam on Spring Creek. This effort consisted primarily of characterization of hydrologic data for a base period of 1962 through 2006, development of a hydrologic budget for Sheridan Lake for this timeframe, and development of an associated model for simulation of storage deficits and drawdown in Sheridan Lake for hypothetical release scenarios from the lake. Historically, the dam has been operated primarily as a 'pass-through' system, in which unregulated outflows pass over the spillway; however, the dam recently was retrofitted with an improved control valve system that would allow controlled releases of about 7 cubic feet per second (ft3/s) or less from a fixed depth of about 60 feet (ft). Development of a hydrologic budget for Sheridan Lake involved compilation, estimation, and characterization of data sets for streamflow, precipitation, and evaporation. The most critical data need was for extrapolation of available short-term streamflow records for Spring Creek to be used as the long-term inflow to Sheridan Lake. Available short-term records for water years (WY) 1991-2004 for a gaging station upstream from Sheridan Lake were extrapolated to WY 1962-2006 on the basis of correlations with streamflow records for a downstream station and for stations located along two adjacent streams. Comparisons of data for the two streamflow-gaging stations along Spring Creek indicated that tributary inflow is approximately proportional to the intervening drainage area, which was used as a means of estimating tributary inflow for the hydrologic budget. Analysis of evaporation data shows that sustained daily rates may exceed maximum monthly rates by a factor of about two. A long-term (1962-2006) hydrologic budget was developed for computation of reservoir outflow from
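
    The report's record-extension step relies on correlation with nearby long-record gages; the sketch below illustrates the general idea with a simple log-space regression. The station data, regression form and numbers are hypothetical, not taken from the USGS report.

    ```python
    # Minimal sketch of extending a short streamflow record from correlation with a
    # long-record station, assuming a simple log-space linear regression. Flows,
    # record lengths and the regression form are illustrative only.
    import numpy as np

    def extend_record(short_q, long_q_overlap, long_q_full):
        """Regress log(short-record flows) on log(long-record flows) over the
        overlap period, then predict the short station for the full period."""
        x = np.log(long_q_overlap)
        y = np.log(short_q)
        slope, intercept = np.polyfit(x, y, 1)          # ordinary least squares in log space
        return np.exp(intercept + slope * np.log(long_q_full))

    # Hypothetical daily flows (cfs): 14 overlap years vs. a 45-year target period
    rng = np.random.default_rng(0)
    long_full = np.exp(rng.normal(3.0, 0.8, 45 * 365))  # stand-in for the downstream gage
    long_overlap = long_full[-14 * 365:]
    short = np.exp(0.2 + 0.9 * np.log(long_overlap) + rng.normal(0, 0.2, long_overlap.size))
    estimated_inflow = extend_record(short, long_overlap, long_full)
    ```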

  1. Combining earth observations, gis data and eco-hydrological modelling for predicting carbon budgets and water balance

    Science.gov (United States)

    Boegh, E.; Butts, M.; Hansen, S.; Soegaard, H.; Hasager, C. B.; Pilegaard, K.; Haastrup, M.; Henriksen, H. J.; Jensen, N. O.; Kristensen, M.

    2003-04-01

    Remote sensing data, GIS data and an eco-hydrological model (Daisy) are coupled within the project EO-FLUX-BUDGET for the prediction of CO2 budgets and water balance for Zealand, the major island of Denmark (covering approximately 7,000 km2). In order to capture the surface heterogeneity shaped by the large variety of small fields, a high-resolution (30 m) land surface map is produced from satellite observations and validated using GIS data and national statistics on agricultural land use. GIS information on the housing density of built-up areas was superimposed on the land use map to facilitate the implementation of engineering methods for assessment of surface runoff in these regions. A geological soil map is combined with soil texture data registered in 5439 locations to construct a 3-layer GIS based soil map. The ground water depth is represented by the 10 year average water head elevation which is simulated by a distributed hydrological model (MIKE SHE). The Daisy model is run using grid based meteorological data and the results are evaluated by comparing with eddy covariance atmospheric fluxes recorded in agricultural, forest and urban regions. Temporal maps of vegetation properties are produced using multi-scale remote sensing data (Landsat TM, Terra-MODIS and SPOT-VEGETATION) and used to adjust the simulated leaf area indices. Initial results show that the model efficiency is improved by the incorporation of satellite data.

  2. Modelling canopy radiation budget through multiple scattering approximation: a case study of coniferous forest in Mexico City Valley

    Science.gov (United States)

    Silván-Cárdenas, Jose L.; Corona-Romero, Nirani

    2015-10-01

    In this paper, we describe some results from a study on hyperspectral analysis of coniferous canopy scattering for the purpose of estimating forest biophysical and structural parameters. Georeferenced airborne hyperspectral measurements were taken from a flying helicopter over a coniferous forest dominated by Pinus hartweguii and Abies religiosa within the Federal District Conservation Land in Mexico City. Hyperspectral data was recorded in the optical range from 350 to 2500 nm at 1 nm spectral resolution using the FieldSpec 4 (ASD Inc.). Spectral measurements were also carried out on the ground for vegetation and understory components, including leaf, bark, soil and grass. Measurements were then analyzed through a previously developed multiple scattering approximation (MSA) model, which represents above-canopy spectral reflectance through a non-linear combination of pure spectral components (endmembers), as well as through a set of photon recollision probabilities and interceptance fractions. In this paper we provide an expression for the canopy absorptance as the basis for estimating the components of the canopy radiation budget using the MSA model. Furthermore, since MSA does not prescribe a priori the endmembers to incorporate in the model, a multiple endmember selection method (MESMSA) was developed and tested. Photon recollision probabilities and interceptance fractions were estimated by fitting the model to airborne spectral reflectance, and selected endmembers were then used to estimate the canopy radiation budget at each measured location.
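
    The MSA model itself is non-linear; as a simplified stand-in for the endmember-based decomposition it performs, the sketch below unmixes a canopy spectrum linearly with non-negative least squares. The endmember spectra and fractions are hypothetical.

    ```python
    # Simplified illustration of endmember-based decomposition of a canopy
    # spectrum: linear non-negative unmixing, not the non-linear MSA model of the
    # paper. Endmember shapes and abundances are hypothetical.
    import numpy as np
    from scipy.optimize import nnls

    wavelengths = np.linspace(350, 2500, 2151)            # nm, 1 nm resolution as in the survey
    # Hypothetical endmember reflectances (leaf, bark, soil, grass), shape (bands, 4)
    E = np.column_stack([
        0.05 + 0.4 * np.exp(-((wavelengths - 860) / 400) ** 2),   # leaf
        0.15 + 0.1 * (wavelengths / 2500),                        # bark
        0.10 + 0.2 * (wavelengths / 2500),                        # soil
        0.08 + 0.3 * np.exp(-((wavelengths - 900) / 500) ** 2),   # grass
    ])
    true_frac = np.array([0.6, 0.1, 0.2, 0.1])
    canopy = E @ true_frac + np.random.default_rng(1).normal(0, 0.005, wavelengths.size)

    fractions, residual = nnls(E, canopy)                  # non-negative abundance estimates
    fractions /= fractions.sum()                           # optional sum-to-one normalisation
    print(dict(zip(["leaf", "bark", "soil", "grass"], fractions.round(3))))
    ```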

  3. A mathematical model for maximizing the value of phase 3 drug development portfolios incorporating budget constraints and risk.

    Science.gov (United States)

    Patel, Nitin R; Ankolekar, Suresh; Antonijevic, Zoran; Rajicic, Natasa

    2013-05-10

    We describe a value-driven approach to optimizing pharmaceutical portfolios. Our approach incorporates inputs from research and development and commercial functions by simultaneously addressing internal and external factors. This approach differentiates itself from current practices in that it recognizes the impact of study design parameters, sample size in particular, on the portfolio value. We develop an integer programming (IP) model as the basis for Bayesian decision analysis to optimize phase 3 development portfolios using expected net present value as the criterion. We show how this framework can be used to determine optimal sample sizes and trial schedules to maximize the value of a portfolio under budget constraints. We then illustrate the remarkable flexibility of the IP model to answer a variety of 'what-if' questions that reflect situations that arise in practice. We extend the IP model to a stochastic IP model to incorporate uncertainty in the availability of drugs from earlier development phases for phase 3 development in the future. We show how to use stochastic IP to re-optimize the portfolio development strategy over time as new information accumulates and budget changes occur. Copyright © 2013 John Wiley & Sons, Ltd.
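
    As a rough illustration of the budget-constrained selection at the core of such a model, the sketch below solves a plain 0/1 knapsack over candidate programs, maximizing total expected NPV under a budget cap. It ignores the paper's sample-size and scheduling decisions, and all costs and eNPVs are hypothetical.

    ```python
    # Minimal sketch of budget-constrained portfolio selection: choose a subset of
    # candidate phase 3 programs maximizing total expected NPV under a budget cap.
    # This is a plain 0/1 knapsack via dynamic programming; values are hypothetical.
    def optimize_portfolio(costs, enpvs, budget):
        # best[b] = (value, chosen set) achievable with budget b
        best = [(0.0, frozenset()) for _ in range(budget + 1)]
        for i in range(len(costs)):
            for b in range(budget, costs[i] - 1, -1):      # descending: each program used once
                cand_val = best[b - costs[i]][0] + enpvs[i]
                if cand_val > best[b][0]:
                    best[b] = (cand_val, best[b - costs[i]][1] | {i})
        return best[budget]

    costs = [120, 90, 200, 60, 150]               # phase 3 costs, $M (hypothetical)
    enpvs = [300.0, 180.0, 420.0, 90.0, 260.0]    # expected NPVs, $M (hypothetical)
    value, chosen = optimize_portfolio(costs, enpvs, budget=400)
    print(f"expected NPV {value} from programs {sorted(chosen)}")
    ```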

  4. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium non-steady state process that couples chemical reaction, reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve the control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, the nonequilibrium composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  5. Maintenance Budgeting.

    Science.gov (United States)

    Smith, J. McCree

    Three methods for the preparation of maintenance budgets are discussed--(1) a traditional method, inconclusive and obsolete, based on gross square footage, (2) the formula approach method based on building classification (wood-frame, masonry-wood, masonry-concrete) with maintenance cost factors for each type plus custodial service rates by type of…

  6. A modelling study of the impact of cirrus clouds on the moisture budget of the upper troposphere

    Directory of Open Access Journals (Sweden)

    S. Fueglistaler

    2006-01-01

    Full Text Available We present a modelling study of the effect of cirrus clouds on the moisture budget of the layer wherein the cloud formed. Our framework simplifies many aspects of cloud microphysics and collapses the problem of sedimentation onto a 0-dimensional box model, but retains essential feedbacks between saturation mixing ratio, particle growth, and water removal through particle sedimentation. The water budget is described by two coupled first-order differential equations for dimensionless particle number density and saturation point temperature, where the parameters defining the system (layer depth, reference temperature, amplitude and time scale of temperature perturbation and initial particle number density, which may or may not be a function of reference temperature and cooling rate) are encapsulated in a single coefficient. This allows us to scale the results to a broad range of atmospheric conditions, and to test sensitivities. Results of the moisture budget calculations are presented for a range of atmospheric conditions (T: 238–205 K; p: 325–180 hPa) and a range of time scales τT of the temperature perturbation that induces the cloud formation. The cirrus clouds are found to efficiently remove water for τT longer than a few hours, with longer perturbations (τT≳10 h) required at lower temperatures (T≲210 K). Conversely, we find that temperature perturbations of duration order 1 h and less (a typical timescale for, e.g., gravity waves) do not efficiently dehydrate over most of the upper troposphere. A consequence is that (for particle densities typical of current cirrus clouds) the assumption of complete dehydration to the saturation mixing ratio may yield valid predictions for upper tropospheric moisture distributions if it is based on the large scale temperature field, but this assumption is not necessarily valid if it is based on smaller scale temperature fields.
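
    To make the box-model framework concrete, the sketch below integrates two coupled first-order ODEs for a dimensionless particle number and a saturation-point temperature driven by an imposed temperature perturbation; the right-hand sides are placeholder relaxation terms, not the equations of the paper.

    ```python
    # Illustrative harness for a 0-D cirrus box model: two coupled first-order ODEs
    # for dimensionless particle number n and saturation-point temperature Ts,
    # forced by a sinusoidal temperature perturbation. The right-hand sides are
    # toy relaxation terms, NOT those derived in the paper.
    import numpy as np
    from scipy.integrate import solve_ivp

    T0, dT, tau_T = 215.0, 2.0, 10 * 3600.0      # reference T (K), amplitude (K), time scale (s)

    def rhs(t, y):
        n, Ts = y
        T = T0 - dT * np.sin(2 * np.pi * t / tau_T)   # imposed temperature perturbation
        growth = max(Ts - T, 0.0)                     # supersaturated when Ts > T (toy proxy)
        dn_dt = -0.1 * n * growth / 3600.0            # sedimentation removes particles (toy)
        dTs_dt = -0.5 * growth / 3600.0               # deposition draws Ts toward T (toy)
        return [dn_dt, dTs_dt]

    sol = solve_ivp(rhs, (0.0, 2 * tau_T), y0=[1.0, T0 + 1.0], max_step=60.0)
    print(sol.y[:, -1])   # final particle number and saturation-point temperature
    ```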

  7. Idealized numerical modeling of polar mesocyclones dynamics diagnosed by energy budget

    Science.gov (United States)

    Sergeev, Dennis; Stepanenko, Victor

    2014-05-01

    can be interpreted as the growth rate of the vortex) and energy conversion in the diagnostic equations for kinetic and available potential energy (APE). The energy budget equations are implemented in two forms. The first approach follows the scheme developed by Lorenz (1955) in which KE and APE are broken into a mean component and an eddy component forming a well-known energy cycle. The second method is based on the energy equations that are strictly derived from the governing equations of the numerical mesoscale model used. The latter approach, hence, takes into account all the approximations and numerical features used in the model. Some conclusions based on the comparison of the described methods are presented in the study. A series of high-resolution experiments is carried out using the three-dimensional non-hydrostatic limited-area sigma-coordinate numerical model ReMeDy (Research Mesoscale Dynamics), being developed at Lomonosov Moscow State University [3]. An idealized basic state condition is used for all simulations. It is composed of a zonally oriented baroclinic zone over the sea surface partly covered with ice. To realize a baroclinic channel environment, zero-gradient boundary conditions at the meridional lateral boundaries are imposed, while the zonal boundary conditions are periodic. The initialization of the mesocyclone is achieved by creating a small axis-symmetric vortex in the center of the model domain. The baroclinicity and stratification of the basic state, as well as the surface parameters, are varied in the typically observed range. References 1. Heinemann G, Øyvind S. 2013. Workshop On Polar Lows. Bull. Amer. Meteor. Soc. 94: ES123-ES126. 2. Yanase W, Niino H. 2006. Dependence of Polar Low Development on Baroclinicity and Physical Processes: An Idealized High-Resolution Experiment, J. Atmos. Sci. 64: 3044-3067. 3. Chechin DG et al. 2013. Idealized dry quasi 2-D mesoscale simulations of cold-air outbreaks over the marginal sea ice zone with fine

  8. CDO budgeting

    Science.gov (United States)

    Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian

    2008-04-01

    The Critical dimension off-target (CDO) is a key parameter for mask house customers, directly affecting the performance of the mask. The CDO is the difference between the feature size target and the measured feature size. The change of CD during the process is either compensated within the process or by data correction. These compensation methods are commonly called process bias and data bias, respectively. The difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not take into account the instability of the process bias. This instability is a result of minor variations - instabilities of manufacturing processes and changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. For systematic investigation of the unit process contribution to CDO and analysis of the factors influencing the CDO contributors, a solid understanding of each unit process and a huge number of masks are necessary. Rough identification of contributing processes and splitting of the final CDO variation between processes can be done with approx. 50 masks with identical design, material and process. Such an amount of data allows us to identify the main contributors and estimate their effect by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within the particular unit process; however, it provides a good estimate of the impact of the process on the stability of the manufacturing line. Additionally, this analysis can be used to identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate limits for CDO budgeting models given by the precision and the number of measurements, as well as partitioning the variation within the manufacturing process. The CDO variation splits according to
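
    As one possible way to split CDO variation between unit processes, the sketch below runs a one-way ANOVA over masks grouped by a hypothetical unit-process tool and reports the share of variance the grouping explains; the data and grouping are illustrative only.

    ```python
    # Minimal sketch of attributing CDO variation to a unit process with one-way
    # ANOVA, assuming per-mask CDO values grouped by (hypothetical) process tools.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # Hypothetical CDO (nm) for ~50 identical masks written on three e-beam tools
    cdo_tool_a = rng.normal(0.5, 1.0, 17)
    cdo_tool_b = rng.normal(1.2, 1.0, 16)
    cdo_tool_c = rng.normal(0.4, 1.0, 17)

    f_stat, p_value = stats.f_oneway(cdo_tool_a, cdo_tool_b, cdo_tool_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

    # Share of total variance explained by the tool split (eta squared)
    groups = [cdo_tool_a, cdo_tool_b, cdo_tool_c]
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_total = ((np.concatenate(groups) - grand) ** 2).sum()
    print(f"eta^2 = {ss_between / ss_total:.2f}")
    ```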

  9. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  10. Electroweak theory and the Standard Model

    CERN Multimedia

    CERN. Geneva; Giudice, Gian Francesco

    2004-01-01

    There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.

  11. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithography theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  12. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  13. A review of organizational buyer behaviour models and theories ...

    African Journals Online (AJOL)

    Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...

  14. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables, (x_t, w_t), then selection over the second...

  15. Program evaluation models and related theories: AMEE guide no. 67.

    Science.gov (United States)

    Frye, Ann W; Hemmer, Paul A

    2012-01-01

    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs.

  16. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

    Wesson's scale invariant theory of gravitation with a time-dependent gauge function is investigated. The false vacuum model of the universe is constructed and some physical properties of the model are discussed.

  17. Rigid aleph_epsilon-saturated models of superstable theories

    OpenAIRE

    Shami, Ziv; Shelah, Saharon

    1999-01-01

    In a countable superstable NDOP theory, the existence of a rigid $\aleph_\epsilon$-saturated model implies the existence of $2^\lambda$ rigid $\aleph_\epsilon$-saturated models of power $\lambda$ for every $\lambda > 2^{\aleph_0}$.

  18. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  19. Drafting Multiannual Local Budgets by Economic-Mathematical Modelling of the Evolution of Revenues

    Directory of Open Access Journals (Sweden)

    Ioan Radu

    2009-01-01

    Full Text Available Although seen as a sector with a high degree of inertia and conservatism, the public administration system exposes public institutions to a set of influences from both the internal and external environment. The public administration system is influenced by frequent legislative changes and, recently, by the requirements of the European Union. Given the complexity and dynamics of the competitive environment, the adoption of strategic management tools in public administration becomes increasingly important and necessary. One of the main forms of exercising strategic management is financial planning, moulded into policies, strategies, plans and programmes whose generation is based on multiannual budgets.

  20. Toric Methods in F-Theory Model Building

    Directory of Open Access Journals (Sweden)

    Johanna Knapp

    2011-01-01

    Full Text Available We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.

  1. Theories, models and urban realities. From New York to Kathmandu

    Directory of Open Access Journals (Sweden)

    Román Rodríguez González

    2004-12-01

    Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.

  2. Theories, models and urban realities. From New York to Kathmandu

    Directory of Open Access Journals (Sweden)

    José Somoza Medina

    2004-01-01

    Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.

  3. Model-Based Learning: A Synthesis of Theory and Research

    Science.gov (United States)

    Seel, Norbert M.

    2017-01-01

    This article provides a review of theoretical approaches to model-based learning and related research. In accordance with the definition of model-based learning as an acquisition and utilization of mental models by learners, the first section centers on mental model theory. In accordance with epistemology of modeling the issues of semantics,…

  4. Estimation of energy budget of ionosphere-thermosphere system during two CIR-HSS events: observations and modeling

    Directory of Open Access Journals (Sweden)

    Verkhoglyadova Olga

    2016-01-01

    Full Text Available We analyze the energy budget of the ionosphere-thermosphere (IT) system during two High-Speed Streams (HSSs) on 22–31 January, 2007 (in the descending phase of solar cycle 23) and 25 April–2 May, 2011 (in the ascending phase of solar cycle 24) to understand typical features, similarities, and differences in magnetosphere-ionosphere-thermosphere (IT) coupling during HSS geomagnetic activity. We focus on the solar wind energy input into the magnetosphere (by using coupling functions) and energy partitioning within the IT system during these intervals. The Joule heating is estimated empirically. Hemispheric power is estimated based on satellite measurements. We utilize observations from TIMED/SABER (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry) to estimate nitric oxide (NO) and carbon dioxide (CO2) cooling emission fluxes. We perform a detailed modeling study of these two similar HSS events with the Global Ionosphere-Thermosphere Model (GITM) and different external driving inputs to understand the IT response and to address how well the model reproduces the energy transport. GITM is run in a mode with forecastable inputs. It is shown that the model captures the main features of the energy coupling, but underestimates NO cooling and auroral heating in high latitudes. Lower thermospheric forcing at 100 km altitude is important for correct energy balance of the IT system. We discuss challenges for a physics-based general forecasting approach in modeling the energy budget of moderate IT storms caused by HSSs.

  5. Large-scale dynamical influence of a gravity wave generated over the Antarctic Peninsula – regional modelling and budget analysis

    Directory of Open Access Journals (Sweden)

    JOEL Arnault

    2013-03-01

    Full Text Available The case study of a mountain wave triggered by the Antarctic Peninsula on 6 October 2005, which has already been documented in the literature, is chosen here to quantify the associated gravity wave forcing on the large-scale flow, with a budget analysis of the horizontal wind components and horizontal kinetic energy. In particular, a numerical simulation using the Weather Research and Forecasting (WRF) model is compared to a control simulation with flat orography to separate the contribution of the mountain wave from that of other synoptic processes of non-orographic origin. The so-called differential budgets of horizontal wind components and horizontal kinetic energy (after subtracting the results from the simulation without orography) are then averaged horizontally and vertically in the inner domain of the simulation to quantify the mountain wave dynamical influence at this scale. This allows for a quantitative analysis of the simulated mountain wave's dynamical influence, including the orographically induced pressure drag, the counterbalancing wave-induced vertical transport of momentum from the flow aloft, the momentum and energy exchanges with the outer flow at the lateral and upper boundaries, the effect of turbulent mixing, the dynamics associated with geostrophic re-adjustment of the inner flow, the deceleration of the inner flow, the secondary generation of an inertia–gravity wave and the so-called baroclinic conversion of energy between potential energy and kinetic energy.

  6. [Simulation and data mining model for identifying and prediction budget changes in the care of patients with hypertension].

    Science.gov (United States)

    Joyanes-Aguilar, Luis; Castaño, Néstor J; Osorio, José H

    2015-10-01

    Objective To present a simulation model that establishes the economic impact on the health care system produced by the diagnostic evolution of patients suffering from arterial hypertension. Methodology The information used corresponds to that available in Individual Health Records (RIPs, in Spanish). A statistical characterization was carried out and a model for matrix storage in MATLAB was proposed. Data mining was used to create predictors. Finally, a simulation environment was built to determine the economic cost of diagnostic evolution. Results 5.7 % of the population progresses from the diagnosis, and the cost overrun associated with it is 43.2 %. Conclusions The results show the applicability and possibility of focusing research on establishing diagnosis relationships using all the information reported in the RIPS in order to create econometric indicators that can determine which diagnostic evolutions are most relevant to budget allocation.

  7. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    Science.gov (United States)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.

  8. Dimensional reduction of Markov state models from renormalization group theory.

    Science.gov (United States)

    Orioli, S; Faccioli, P

    2016-09-28

    Renormalization Group (RG) theory provides the theoretical framework to define rigorous effective theories, i.e., systematic low-resolution approximations of arbitrary microscopic models. Markov state models are shown to be rigorous effective theories for Molecular Dynamics (MD). Based on this fact, we use real space RG to vary the resolution of the stochastic model and define an algorithm for clustering microstates into macrostates. The result is a lower dimensional stochastic model which, by construction, provides the optimal coarse-grained Markovian representation of the system's relaxation kinetics. To illustrate and validate our theory, we analyze a number of test systems of increasing complexity, ranging from synthetic toy models to two realistic applications, built from all-atom MD simulations. The computational cost of computing the low-dimensional model remains affordable on a desktop computer even for thousands of microstates.
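
    A generic flavor of the microstate-to-macrostate lumping can be sketched as follows: estimate a microstate transition matrix from a discrete trajectory, cluster states on its slow right eigenvectors, and aggregate the transition counts. This is ordinary MSM coarse-graining, not the real-space RG construction of the paper; the toy chain below is hypothetical.

    ```python
    # Generic MSM lumping sketch: transition matrix -> slow eigenvectors ->
    # k-means clustering of microstates -> lumped macrostate transition matrix.
    import numpy as np
    from sklearn.cluster import KMeans

    def lump_msm(traj, n_micro, n_macro, lag=1):
        # Transition counts and row-normalised transition matrix at the given lag
        C = np.zeros((n_micro, n_micro))
        for a, b in zip(traj[:-lag], traj[lag:]):
            C[a, b] += 1.0
        T = C / C.sum(axis=1, keepdims=True)

        # Slowest non-trivial right eigenvectors carry the metastable structure
        vals, vecs = np.linalg.eig(T)
        order = np.argsort(-vals.real)
        slow = vecs[:, order[1:n_macro]].real

        labels = KMeans(n_clusters=n_macro, n_init=10, random_state=0).fit_predict(slow)

        # Aggregate counts into macrostates and renormalise
        Cm = np.zeros((n_macro, n_macro))
        for i in range(n_micro):
            for j in range(n_micro):
                Cm[labels[i], labels[j]] += C[i, j]
        return labels, Cm / Cm.sum(axis=1, keepdims=True)

    # Hypothetical 4-microstate chain with two weakly connected blocks {0,1} and {2,3}
    P = np.array([[0.90, 0.09, 0.005, 0.005],
                  [0.09, 0.90, 0.005, 0.005],
                  [0.005, 0.005, 0.90, 0.09],
                  [0.005, 0.005, 0.09, 0.90]])
    rng = np.random.default_rng(3)
    traj = [0]
    for _ in range(20000):
        traj.append(rng.choice(4, p=P[traj[-1]]))
    labels, T_macro = lump_msm(np.array(traj), n_micro=4, n_macro=2)
    ```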

  9. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...

  10. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  11. Carbon, nitrogen, oxygen and sulfide budgets in the Black Sea : a biogeochemical model of the whole water column coupling the oxic and anoxic parts

    NARCIS (Netherlands)

    Grégoire, M.; Soetaert, K.E.R.

    2010-01-01

    Carbon, nitrogen, oxygen and sulfide budgets are derived for the Black Sea water column from a coupled physical–biogeochemical model. The model is applied in the deep part of the sea and simulates processes over the whole water column including the anoxic layer that extends from ≃115 m

  12. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  13. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
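
    One way to make the Markov-chain-plus-information-theory idea concrete is to train a first-order chain on the player's action stream online and track the per-step surprisal of what actually happens; falling surprisal suggests routinized play. The action alphabet, smoothing and session below are illustrative, not the study's design.

    ```python
    # Sketch: fit a discrete-time, discrete-space Markov chain to a player's action
    # stream online and track the model's per-step surprisal (bits).
    import numpy as np

    def surprisal_trace(actions, n_actions, alpha=1.0):
        counts = np.full((n_actions, n_actions), alpha)    # Laplace-smoothed transition counts
        trace = []
        for prev, nxt in zip(actions[:-1], actions[1:]):
            p = counts[prev] / counts[prev].sum()           # current model's prediction
            trace.append(-np.log2(p[nxt]))                  # surprisal of what actually happened
            counts[prev, nxt] += 1.0                        # online update of the chain
        return np.array(trace)

    # Hypothetical session: behaviour becomes more repetitive in the second half
    rng = np.random.default_rng(4)
    early = rng.choice(4, size=500)                         # exploratory play
    late = np.tile([0, 1, 2, 3], 125)                       # routinized loop
    trace = surprisal_trace(np.concatenate([early, late]), n_actions=4)
    print(trace[:400].mean(), trace[-400:].mean())          # mean surprisal drops
    ```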

  14. Posterior Predictive Model Checking for Multidimensionality in Item Response Theory

    Science.gov (United States)

    Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip

    2009-01-01

    If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…

  15. Holomorphy without supersymmetry in the Standard Model Effective Field Theory

    Directory of Open Access Journals (Sweden)

    Rodrigo Alonso

    2014-12-01

    Full Text Available The anomalous dimensions of dimension-six operators in the Standard Model Effective Field Theory (SMEFT respect holomorphy to a large extent. The holomorphy conditions are reminiscent of supersymmetry, even though the SMEFT is not a supersymmetric theory.

  16. Reframing Leadership Pedagogy through Model and Theory Building.

    Science.gov (United States)

    Mello, Jeffrey A.

    1999-01-01

    Leadership theories formed the basis of a course assignment with four objectives: understanding complex factors affecting leadership dynamics, developing abilities to assess organizational factors influencing leadership, practicing model and theory building, and viewing leadership from a multicultural perspective. The assignment was to develop a…

  17. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  19. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic

    2009-05-01

    Full Text Available This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys the model specifications and the estimation methods. Multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC, Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui, 2002, and Engle, 2002. I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The methods used for parameter estimation are maximum log-likelihood (in the BEKK and DVEC models) and the two-step approach (in the CCC model).
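
    A minimal sketch of the CCC case, assuming the third-party `arch` package is available for the univariate GARCH(1,1) fits: estimate each series separately, then take the correlation matrix of the standardized residuals as the constant R. The returns below are simulated; the full BEKK/DVEC/DCC machinery is not reproduced.

    ```python
    # Constant Conditional Correlation sketch: univariate GARCH(1,1) per series,
    # then R = corr of standardized residuals. Data are simulated.
    import numpy as np
    import pandas as pd
    from arch import arch_model   # assumes the `arch` package is installed

    def fit_ccc(returns: pd.DataFrame):
        std_resid = {}
        for col in returns:
            res = arch_model(returns[col], vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
            std_resid[col] = res.resid / res.conditional_volatility
        return pd.DataFrame(std_resid).corr()   # constant conditional correlation matrix R

    # Hypothetical daily log returns (percent) for a stock and an index
    rng = np.random.default_rng(5)
    rets = pd.DataFrame(rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], 1500),
                        columns=["stock", "index"])
    print(fit_ccc(rets))
    ```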

  20. High spatial resolution radiation budget for Europe: derived from satellite data, validation of a regional model; Raeumlich hochaufgeloeste Strahlungsbilanz ueber Europa: Ableitung aus Satellitendaten, Validation eines regionalen Modells

    Energy Technology Data Exchange (ETDEWEB)

    Hollmann, R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik

    2000-07-01

    For forty years, instruments onboard satellites have demonstrated their usefulness for many applications in the fields of meteorology and oceanography. Several experiments, like ERBE, are dedicated to establishing a climatology of the global Earth radiation budget at the top of the atmosphere. The focus has now shifted to the regional scale, e.g. GEWEX with its regional sub-experiments like BALTEX. To obtain a regional radiation budget for Europe, in the first part of the work the well-calibrated measurements from ScaRaB (scanner for radiation budget) are used to derive a narrow-to-broadband conversion, which is applicable to the AVHRR (advanced very high resolution radiometer). It is shown that the accuracy of the method is on the order of that of ScaRaB itself. In the second part of the work, results of REMO are compared with measurements of ScaRaB and AVHRR for March 1994. The model reproduces the measurements well overall, but it overestimates the cold areas and underestimates the warm areas in the longwave spectral domain. Similarly, it overestimates the dark areas and underestimates the bright areas in the solar spectral domain. (orig.)

  1. An Evolutionary Game Theory Model of Spontaneous Brain Functioning

    National Research Council Canada - National Science Library

    Dario Madeo; Agostino Talarico; Alvaro Pascual-Leone; Chiara Mocenni; Emiliano Santarnecchi

    2017-01-01

    ... conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN...

  2. Neurocognitive networks: findings, models, and theory.

    Science.gov (United States)

    Meehan, Timothy P; Bressler, Steven L

    2012-11-01

    Through its early history, cognitive neuroscience largely followed a modular paradigm wherein high-level cognitive functions were mapped onto locally segregated brain regions. However, recent evidence drives a continuing shift away from modular theories of cognitive brain function, and toward theories which hold that cognition arises from the integrated activity of large-scale, distributed networks of brain regions. A growing consensus favors the fundamental concept of this new paradigm: the large-scale cognitive brain network, or neurocognitive network. This consensus was the motivation for Neurocognitive Networks 2010 (NCN 2010), a conference sponsored by the Cognitive Neuroscience Program of the National Science Foundation, organized by Drs. Steven Bressler and Craig Richter of Florida Atlantic University (FAU), and held at FAU in Boca Raton, FL on January 29-30, 2010. NCN 2010 gathered together some of today's leading investigators of neurocognitive networks. This paper serves to review their presentations as they relate to the paradigm of neurocognitive networks, as well as to compile the emergent themes, questions, and possible future research directions that arose from the conference. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Simulating the water budget of a Prairie Potholes complex from LiDAR and hydrological models in North Dakota, USA

    Science.gov (United States)

    Huang, Shengli; Young, Claudia; Abdul-Aziz, Omar I.; Dahal, Devendra; Feng, Min; Liu, Shuguang

    2013-01-01

    Hydrological processes of the wetland complex in the Prairie Pothole Region (PPR) are difficult to model, partly due to a lack of wetland morphology data. We used Light Detection And Ranging (LiDAR) data sets to derive wetland features; we then modelled rainfall, snowfall, snowmelt, runoff, evaporation, the “fill-and-spill” mechanism, shallow groundwater loss, and the effect of wet and dry conditions. For large wetlands with a volume greater than thousands of cubic metres (e.g. about 3000 m3), the modelled water volume agreed fairly well with observations; however, it did not succeed for small wetlands (e.g. volume less than 450 m3). Despite the failure for small wetlands, the modelled water area of the wetland complex coincided well with interpretation of aerial photographs, showing a linear regression with R2 of around 0.80 and a mean average error of around 0.55 km2. The next step is to improve the water budget modelling for small wetlands.
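
    A minimal daily fill-and-spill water balance for a single wetland is sketched below, assuming a fixed surface area and a simple seepage term; the capacity, forcings and coefficients are hypothetical rather than the LiDAR-derived morphology of the study.

    ```python
    # Daily fill-and-spill bucket model for one pothole wetland (illustrative only).
    import numpy as np

    def pothole_budget(precip, et, runoff_in, capacity, area, seep_rate=0.001, v0=0.0):
        """precip/et in m of water depth per day, runoff_in in m3/day,
        capacity in m3, area in m2, seepage as a fraction of storage per day."""
        volume, spill = v0, []
        for p, e, q in zip(precip, et, runoff_in):
            volume += p * area + q - e * area - seep_rate * volume
            volume = max(volume, 0.0)
            over = max(volume - capacity, 0.0)    # fill-and-spill: excess leaves the basin
            spill.append(over)
            volume -= over
        return volume, np.array(spill)

    days = 365
    rng = np.random.default_rng(6)
    precip = rng.exponential(0.002, days)         # m/day
    et = np.full(days, 0.003)                     # m/day
    runoff_in = rng.exponential(5.0, days)        # m3/day
    final_v, spills = pothole_budget(precip, et, runoff_in, capacity=3000.0, area=5000.0)
    ```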

  4. Conceptual development: an adaptive resonance theory model of polysemy

    Science.gov (United States)

    Dunbar, George L.

    1997-04-01

    Adaptive Resonance Theory provides a model of pattern classification that addresses the plasticity--stability dilemma and allows a neural network to detect when to construct a new category without the assistance of a supervisor. We show that Adaptive Resonance Theory can be applied to the study of natural concept development. Specifically, a model is presented which is able to categorize different usages of a common noun and group the polysemous senses appropriately.

  5. Translating caring theory into practice: the Carolina Care Model.

    Science.gov (United States)

    Tonges, Mary; Ray, Joel

    2011-09-01

    This article describes how one organization operationalized Swanson Caring Theory and changed practice to ensure consistently high standards of performance. The Carolina Care Model developed at the University of North Carolina Hospitals is designed to actualize caring theory, support practices that promote patient satisfaction, and transform cultural norms. Evaluation suggests that this approach to care delivery enhances patients' and families' hospital experience and facilitates desired outcomes. The authors outline the Professional Practice Model, key characteristics of Carolina Care, links to caring theory, and development and implementation methodologies.

  6. The Number of Atomic Models of Uncountable Theories

    OpenAIRE

    Ulrich, Douglas

    2016-01-01

    We show there exists a complete theory in a language of size continuum possessing a unique atomic model which is not constructible. We also show it is consistent with $ZFC + \aleph_1 < 2^{\aleph_0}$ that there is a complete theory in a language of size $\aleph_1$ possessing a unique atomic model which is not constructible. Finally we show it is consistent with $ZFC + \aleph_1 < 2^{\aleph_0}$ that for every complete theory $T$ in a language of size $\aleph_1$, if $T$ has uncountable atomic mod...

  7. Bianchi class A models in Sàez-Ballester's theory

    Science.gov (United States)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for Bianchi type I, II and VIh=-1 models. We also find exact quantum solutions to all Bianchi Class A models employing a particular ansatz for the wave function of the universe.

  8. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  9. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  10. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
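
    The sketch below grows a small interaction-driven network with a triadic-closure bias and computes the diagnostics reported for the IB model (clustering coefficient, average path length); the growth rules are a simplified stand-in, not the authors' exact IB mechanism.

    ```python
    # Simplified interaction-driven network growth (not the authors' exact IB rules):
    # actors interact repeatedly, sometimes through a common neighbour, and tie
    # strength grows with repeated interaction.
    import random
    import networkx as nx

    def grow_network(n_actors=200, n_interactions=5000, seed=7):
        rng = random.Random(seed)
        G = nx.Graph()
        G.add_nodes_from(range(n_actors))
        for _ in range(n_interactions):
            a = rng.randrange(n_actors)
            nbrs = list(G[a])
            if nbrs and rng.random() < 0.6:
                friend = rng.choice(nbrs)
                b = rng.choice(list(G[friend]))   # friend-of-a-friend (may be a itself, then skipped)
            else:
                b = rng.randrange(n_actors)
            if a != b:
                w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                G.add_edge(a, b, weight=w + 1)    # repeated interaction strengthens the tie
        return G

    G = grow_network()
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    print(nx.average_clustering(G), nx.average_shortest_path_length(giant))
    ```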

  11. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process Topics and Features * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Doring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  12. Mathematical Modelling and New Theories of Learning.

    Science.gov (United States)

    Boaler, Jo

    2001-01-01

    Demonstrates the importance of expanding notions of learning beyond knowledge to the practices in mathematics classrooms. Considers a three-year study of students who learned through mathematical modeling. Shows that a modeling approach encouraged the development of a range of important practices in addition to knowledge that were useful in real…

  13. Baldrige Theory into Practice: A Generic Model

    Science.gov (United States)

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  14. The Flare Irradiance Spectral Model (FISM) and its Contributions to Space Weather Research, the Flare Energy Budget, and Instrument Design

    Science.gov (United States)

    Chamberlin, Phillip

    2008-01-01

    The Flare Irradiance Spectral Model (FISM) is an empirical model of the solar irradiance spectrum from 0.1 to 190 nm at 1 nm spectral resolution and on a 1-minute time cadence. The goal of FISM is to provide accurate solar spectral irradiances over the vacuum ultraviolet (VUV: 0-200 nm) range as input for ionospheric and thermospheric models. The seminar will begin with a brief overview of the FISM model, and also how the Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) will contribute to improving FISM. Some current studies will then be presented that use FISM estimations of the solar VUV irradiance to quantify the contributions of the increased irradiance from flares to Earth's increased thermospheric and ionospheric densities. Initial results will also be presented from a study looking at the electron density increases in the Martian atmosphere during a solar flare. Results will also be shown quantifying the VUV contributions to the total flare energy budget for both the impulsive and gradual phases of solar flares. Lastly, an example of how FISM can be used to simplify the design of future solar VUV irradiance instruments will be discussed, using the future NOAA GOES-R Extreme Ultraviolet and X-Ray Sensors (EXIS) space weather instrument.

  15. Top-down approach to West Siberian regional carbon budget: combination of the CO2 observations and inverse modeling

    Science.gov (United States)

    Maksyutov, S.; Machida, T.; Shimoyama, K.; Carouge, C.; Peregon, A.; Patra, P.; Arshinov, M.; Krasnov, O.; Belan, B.; Fedoseev, N.; Shvidenko, A.; Inoue, G.

    2006-12-01

    A joint Japanese-Russian project is aiming at a top-down approach to West Siberian regional carbon budget estimation. The study combines three main components: a regional atmospheric CO2 observing network, a regional carbon inventory (bottom-up approach), and an inverse model of atmospheric CO2 surface emissions, sinks and transport that links together CO2 observations and carbon inventories. Airborne air sampling programs and observations have been conducted over Siberia since 1993, now at 4 sites. A tower network has been established in West Siberia since 2002 with a total of 10 planned tower sites, 6 of them operating in 2005. The bottom-up inventory of the regional carbon pools is based on analysis of the forest/wetland biomass inventories and interannual changes in forest survey totals at eco-region level. To support the forward and inverse model simulations, detailed soil and vegetation type maps and soil profile and vegetation structure databases were developed. The inverse model of the surface CO2 sources and sinks was used for observation network design and is now applied to the first complete set of observational data, for year 2005. Preliminary analysis of the multiyear Siberian CO2 observations with the inverse model suggests that a larger carbon sink is needed in Siberia to match the atmospheric data than is implied without the regional observations.
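
    The inverse-modeling step in such top-down studies is often a linear-Gaussian flux inversion; the sketch below applies the standard posterior-update formula with a random stand-in for the transport operator. All dimensions and error statistics are hypothetical.

    ```python
    # Linear-Gaussian flux inversion sketch: posterior fluxes combine a prior x_a
    # with observations y through a transport operator H (random stand-in here).
    import numpy as np

    rng = np.random.default_rng(8)
    n_regions, n_obs = 12, 40                 # flux regions, CO2 observations (hypothetical)
    H = rng.normal(0, 1, (n_obs, n_regions))  # stand-in for the transport model Jacobian
    x_true = rng.normal(0, 1, n_regions)
    y = H @ x_true + rng.normal(0, 0.5, n_obs)

    x_a = np.zeros(n_regions)                 # prior (e.g. inventory-based) fluxes
    B = np.eye(n_regions) * 1.0               # prior error covariance
    R = np.eye(n_obs) * 0.25                  # observation + transport error covariance

    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)      # Kalman-type gain
    x_post = x_a + K @ (y - H @ x_a)                  # posterior flux estimate
    A_post = B - K @ H @ B                            # posterior error covariance
    ```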

  16. Optimal transportation networks models and theory

    CERN Document Server

    Bernot, Marc; Morel, Jean-Michel

    2009-01-01

    The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proofs of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It is mathematically proven that most of these systems fit into the simple model sketched in this volume.

  17. Marketing with limited budget

    OpenAIRE

    Smirnova, Daria

    2017-01-01

    The purpose of this research-based thesis was to understand how the managers of two small, similar hotels in a specific region handle the marketing process on a limited budget. In addition, the aim of the thesis was to examine whether the hotel managers interviewed perceive marketing only as 'promotion' rather than in terms of marketing research, the marketing mix and marketing environment theories. It was also investigated whether the managers of those hotels consider marketing a key to successful h...

  18. Comparison of rainfall based SPI drought indices with SMDI and ETDI indices derived from a soil water budget model

    Science.gov (United States)

    Houcine, A.; Bargaoui, Z.

    2012-04-01

    Modelling the soil water budget is a key issue for assessing drought-awareness indices based on soil moisture estimation. The aim of the study is to compare drought indices based on rainfall time series to those based on soil water content and evapotranspiration time series. To this end, a vertically averaged water budget over the root zone is implemented to assist the estimation of the evapotranspiration flux. A daily time step is adopted to run the water budget model for a lumped 250 km2 watershed under an arid climate, where recorded meteorological and hydrological data are available for a ten-year period. The water balance, which involves 7 parameters, computes evapotranspiration, runoff and leakage. Soil-property-related parameters are derived from pedotransfer functions, while the two remaining parameters are considered data-driven and are subject to calibration. The model is calibrated using daily hydro-meteorological data (solar radiation, air temperature, air humidity, mean areal rainfall) as well as daily runoff records and the average annual (or regional) evapotranspiration. The latter is estimated using an empirical sub-model. A set of acceptable solutions is identified according to the values of the Nash coefficients for annual and decadal runoffs as well as the relative bias of average annual evapotranspiration. Using these acceptable solutions, several drought indices are computed: SPI (standardized precipitation index), SMDI (soil moisture deficit index) and ETDI (evapotranspiration deficit index). While the SPI indicators are based only on monthly precipitation time series, the SMDI is based on weekly mean soil water content as computed by the hydrological model. The ETDI indices, in turn, are based on weekly mean potential and actual evapotranspiration as estimated by the meteorological and hydrological models. For the SPI evaluation various time scales are considered, from one to twelve months (SPI1, SPI3, SPI6, SPI9 and SPI12). For all
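
    The two kinds of indices compared above can be illustrated with a short sketch. The SPI here is simplified to a z-score of aggregated precipitation (the operational SPI fits a gamma distribution before the normal transform), and the SMDI follows the commonly cited recursion SMDI_j = 0.5*SMDI_{j-1} + SD_j/50; both are illustrative rather than the study's exact implementation.

```python
import numpy as np

def spi_like(precip, scale=3):
    """Simplified SPI: standardize precipitation aggregated over `scale` months.
    A plain z-score replaces the usual gamma fit to keep the sketch short."""
    p = np.convolve(precip, np.ones(scale), mode="valid")  # running sums
    return (p - p.mean()) / p.std()

def smdi_like(soil_water):
    """Soil Moisture Deficit Index using one published formulation:
    SD_j is the % departure of weekly soil water from its long-term median,
    and SMDI_j = 0.5*SMDI_{j-1} + SD_j/50 (assumes min < median < max)."""
    sw = np.asarray(soil_water, dtype=float)
    med, lo, hi = np.median(sw), sw.min(), sw.max()
    sd = np.where(sw <= med,
                  (sw - med) / (med - lo) * 100.0,
                  (sw - med) / (hi - med) * 100.0)
    smdi = np.zeros_like(sd)
    for j in range(1, len(sd)):
        smdi[j] = 0.5 * smdi[j - 1] + sd[j] / 50.0
    return smdi

rng = np.random.default_rng(1)
print(spi_like(rng.gamma(2.0, 20.0, size=120))[:5])
print(smdi_like(150 + 30 * np.sin(np.linspace(0, 6, 52)) + rng.normal(0, 5, 52))[:5])
```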

  19. Conformal Field Theory and its application to the Ising model

    Science.gov (United States)

    Meyer, Joshua

    The two-dimensional Ising model was originally solved by Onsager using statistical physics techniques. More recently, it has been found that the derivation of critical exponents and correlation functions can be greatly simplified by using the methods of Conformal Field Theory (CFT). We review these methods and apply them to the two-dimensional Ising model. The connection between the continuum limit Ising model and the field theory of free fermions is explained, resulting in a CFT on the plane with two non-trivial fields. Through the use of bosonization on the plane, the free-field correlation functions of the model are computed.
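
    For reference, the standard CFT data of the critical two-dimensional Ising model (textbook results, not specific to this thesis) are central charge $c = 1/2$ and two non-trivial primary fields: the spin $\sigma$ with conformal weights $(1/16, 1/16)$ and the energy $\epsilon$ with weights $(1/2, 1/2)$, giving the critical correlators

    $$\langle \sigma(r)\,\sigma(0)\rangle \sim r^{-1/4}, \qquad \langle \epsilon(r)\,\epsilon(0)\rangle \sim r^{-2},$$

    which reproduce the lattice exponents $\eta = 1/4$ and $\nu = 1$.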

  20. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  1. Development of Advanced Eco-hydrologic and Biogeochemical Coupling Model to Re-evaluate Greenhouse Gas Budget of Biosphere

    Science.gov (United States)

    Nakayama, T.; Maksyutov, S. S.

    2015-12-01

    Inland waters, including rivers, lakes, and groundwater, are suggested to act as a transport pathway for water and dissolved substances and to play some role in continental biogeochemical cycling (Cole et al., 2007; Battin et al., 2009). The authors have developed the process-based National Integrated Catchment-based Eco-hydrology (NICE) model (2014, 2015, etc.), which includes feedback between hydrologic, geomorphic and ecological processes. In this study, NICE was further developed and coupled with various biogeochemical cycle models: those for the biosphere, those for water quality in aquatic ecosystems, and those for carbon weathering. The NICE-biogeochemical coupled model captures the connectivity of the biogeochemical cycle, carried by the hydrologic cycle, between surface water and groundwater, hillslopes and river networks, and other intermediate regions. The model also includes reactions between inorganic and organic carbon and their relation to nitrogen and phosphorus in the terrestrial-aquatic continuum. The coupled model was shown to improve the representation of inundation stress mechanisms such as photosynthesis and primary production, which improves the simulated CH4 flux in wetlands sensitive to fluctuations of shallow groundwater. The model also simulated CO2 evasion from inland waters at the global scale, in reasonably good agreement with the empirical relation of Aufdenkampe et al. (2011), which itself carries uncertainty in the calculated flux because of missing pCO2 data in some regions and the effect of small tributaries. Further, the model evaluated how the expected CO2 evasion might change as inland waters become polluted with nutrients and eutrophication increases from agriculture and urban areas (Pacheco et al., 2013). This advanced eco-hydrologic and biogeochemical coupled model should play an important role in re-evaluating the greenhouse gas budget of the biosphere and in bridging the gap between top-down and bottom-up approaches (Battin et al., 2009; Regnier et al., 2013).

  2. Homogeneous cosmological models in Yang's gravitation theory

    Science.gov (United States)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  3. Budget Elements of Economic Security: Specifics of Classification

    Directory of Open Access Journals (Sweden)

    О. S.

    2017-02-01

    Full Text Available Theoretical aspects of economic security in conjunction with budget components such as “budget interests” and “budget necessities” are analyzed. Key positions of the categories “budget interests” and “budget necessities” in the theory of economic security in the budgetary area are substantiated given their priority role in setting up its implementation strategy. The category “budget interests” is defined as the system of budget necessities of the interest holders, implemented through budget activities of entities and aimed at seeking benefits through the budget, in order to guarantee functioning and development of the society, the state, legal entities and physical persons. “Budget necessities” are defined as the need in budget funds to achieve and sustain, at a certain level, life activities of individuals, social groups, society, state and legal entities. Classification of budget interests by various criteria is made in the context of their impact on the economic security of the state. It is demonstrated that the four-tier classification of the budget interests by interest holder is essential to guaranteeing economic security in the budgetary area: budget interests of the state: the interests held by central and local power offices; budget interests of legal entities: the interests of profit and non-profit (public, budgetary, party and other organizations; budget interests of individuals: basic necessities of individuals, met by budget transfers, which stand out of the array of public necessities by their individual character.

  4. Alluvial and colluvial sediment storage in the Geul River catchment (The Netherlands) — Combining field and modelling data to construct a Late Holocene sediment budget

    NARCIS (Netherlands)

    de Moor, J.J.W.; Verstraeten, G.

    2007-01-01

    We used a combined approach of a two-dimensional erosion and hillslope sediment delivery model (WATEM/SEDEM) and detailed geomorphological reconstructions to quantify the different components in a sediment budget for the Geul River catchment (southern Netherlands) since the High Middle Ages.

  5. Modeling workplace bullying using catastrophe theory.

    Science.gov (United States)

    Escartin, J; Ceja, L; Navarro, J; Zapf, D

    2013-10-01

    Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
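
    The model-comparison logic reported above can be sketched as follows. The data are synthetic, the "nonlinear" alternative is a simple polynomial stand-in (the genuine cusp catastrophe model has its own likelihood and is usually fitted with specialised software), and only the least-squares form of AIC is shown.

```python
import numpy as np

def aic_ls(y, y_hat, k):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k (constant terms dropped)."""
    resid = y - y_hat
    n = len(y)
    rss = float(resid @ resid)
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 500))   # e.g., safety climate and victimization scores
y = 0.4 * x1 - 0.3 * x2 + rng.normal(scale=1.0, size=500)  # perpetration (synthetic)

# Linear combination of predictors
X_lin = np.column_stack([np.ones(500), x1, x2])
beta_lin, *_ = np.linalg.lstsq(X_lin, y, rcond=None)

# A simple nonlinear alternative (polynomial surface) as a stand-in
X_nl = np.column_stack([X_lin, x1 * x2, x1**2, x2**3])
beta_nl, *_ = np.linalg.lstsq(X_nl, y, rcond=None)

print("AIC linear   :", aic_ls(y, X_lin @ beta_lin, X_lin.shape[1] + 1))
print("AIC nonlinear:", aic_ls(y, X_nl @ beta_nl, X_nl.shape[1] + 1))
```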

  6. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    Science.gov (United States)

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)

  7. Budgeting for School Media Centers.

    Science.gov (United States)

    Drott, M. Carl

    1978-01-01

    Describes various forms of budgets and discusses concepts in budgeting useful to supervisors of school media centers: line item budgets, capital budgets, creating budgets, the budget calendar, innovations, PPBS (Planning, Programing, Budgeting System), zero-based budgeting, cost-benefit analysis, benefits, benefit guidelines, and budgeting for the…

  8. Spatial interaction models facility location using game theory

    CERN Document Server

    D'Amato, Egidio; Pardalos, Panos

    2017-01-01

    Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost or capturing the largest market share. The contributions in this book focus on an approach to facility location theory through game-theoretical tools, highlighting situations where a location decision is faced by several decision makers and leads to a game-theoretical framework with non-cooperative and cooperative methods. Models and methods regarding facility location via game theory are explored, and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working on the theory, applications and computational aspects of facility location problems using game theory will find this book useful.

  9. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    Directory of Open Access Journals (Sweden)

    M. Adachi

    2011-09-01

    Full Text Available More reliable estimates of the carbon (C) stock within forest ecosystems and of C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha−1) and the rainforest in Malaysia (201.5 t C ha−1) indicates that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results for the rainforest were consistent with field data except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha−1 yr−1 in the two primary forests, whereas the model slightly underestimated GPP (26.5–34.5 t C ha−1 yr−1). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of a sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after the deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock at 30 yr following initiation of the plantation. However, there were few field data on C budget and stock, especially for the oil palm plantation. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to

  10. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    Science.gov (United States)

    Adachi, M.; Ito, A.; Ishida, A.; Kadir, W. R.; Ladpala, P.; Yamagata, Y.

    2011-09-01

    More reliable estimates of the carbon (C) stock within forest ecosystems and of C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha-1) and the rainforest in Malaysia (201.5 t C ha-1) indicates that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results for the rainforest were consistent with field data except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha-1 yr-1 in the two primary forests, whereas the model slightly underestimated GPP (26.5-34.5 t C ha-1 yr-1). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of a sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after the deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock at 30 yr following initiation of the plantation. However, there were few field data on C budget and stock, especially for the oil palm plantation. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to understand the effects of climate and land-use conversion on C budgets in tropical forest
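
    For readers comparing the quantities above, the bookkeeping identities behind them are the usual ones (general definitions, not specific to the VISIT model):

    $$\mathrm{NPP} = \mathrm{GPP} - R_a, \qquad \mathrm{NEP} = \mathrm{GPP} - R_a - R_h = \mathrm{NPP} - R_h,$$

    where $R_a$ and $R_h$ are autotrophic and heterotrophic respiration; deforestation and land-use conversion enter the budget as additional carbon losses from biomass and soil pools on top of these fluxes.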

  11. Annual variation in carbon budget using remote-sensing data and a process model in Borneo Island, Southeast Asia

    Science.gov (United States)

    Adachi, M.; Ito, A.; Takeuchi, W.; Yamagata, Y.

    2011-12-01

    Reducing emissions from deforestation and forest degradation in developing countries (REDD) is one of the most important carbon emission reduction efforts in the tropical region. Deforestation and land-use changes are human activities with a major impact on the regional carbon budget and on emissions of other greenhouse gases (CH4 and N2O). Forest carbon biomass in Southeast Asia is the largest in the Asian region; however, the area of primary forest has continuously decreased due to land-use conversion. The objective of the present study was to evaluate the carbon budget and greenhouse gas emissions induced by deforestation on Borneo Island. We used time-series satellite remote-sensing data to track the deforestation history of Borneo Island, Southeast Asia, and estimated the resulting forest carbon budget using a process-based model (VISIT: Vegetation Integrative SImulator for Trace gases). The forest/non-forest area was mapped by applying the ALOS/PALSAR-calibrated threshold value to MODIS, SPOT-VEGETATION, and NOAA-AVHRR images. The model allowed us to estimate changes in the carbon budget and greenhouse gases caused by human disturbances, including land-use conversion from primary forest to cropland (e.g., oil-palm plantation). The estimated carbon stocks, budget, and greenhouse gases were verified using field observations from previous studies at several locations on Borneo Island. Our results suggest that the southern part of Borneo Island was a large carbon source due to deforestation, although the VISIT model needs to be revised to account for tropical peatland.

  12. Rational Budgeting? The Stanford Case.

    Science.gov (United States)

    Chaffee, Ellen Earle

    The budget decision making process at Stanford University, California, from 1970 through 1979 was evaluated in relation to the allocation of general funds to 38 academic departments. Using Simon's theory of bounded rationality and an organizational level of analysis, the Stanford decision process was tested for its rationality through…

  13. Cascade Version 1: Theory and Model Formulation

    Science.gov (United States)

    2006-06-01

    that provides this modeling framework, potentially allowing for an arbitrary number of scales. The coupling between coastal evolution at different... breakpoint. The two equations are written as follows: $H_o^2 C_{go} \cos\theta_o = H_b^2 C_{gb} \cos\theta_b$ (7) and $\sin\theta_o / C_o = \sin\theta_b / C_b$ (8), where H = wave height

  14. Density functional theory and multiscale materials modeling

    Indian Academy of Sciences (India)

    One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.

  15. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry...

  16. C and N gross efficiencies of copepod egg production studies using a dynamic energy budget model.

    NARCIS (Netherlands)

    Kuijper, L.D.J.; Anderson, T.R.; Kooijman, S.A.L.M.

    2004-01-01

    Simple stoichiometric models based on the principle that limiting elements are used with high efficiency have been unable to capture the apparently constant and low nitrogen gross growth efficiency that characterizes egg production in marine copepods. A new model of egg production is presented based

  17. C and N gross efficiencies of copepod egg production studies using a dynamic energy budget model

    NARCIS (Netherlands)

    Kuijper, L.D.J.; Anderson, T.; Kooijman, S.A.L.M.

    2003-01-01

    Simple stoichiometric models based on the principle that limiting elements are used with high efficiency have been unable to capture the apparently constant and low nitrogen gross growth efficiency that characterizes egg production in marine copepods. A new model of egg production is presented based

  18. Applying learning theories and instructional design models for effective instruction.

    Science.gov (United States)

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. Copyright © 2016 The American Physiological Society.

  19. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.

  20. Multilevel Ventilation: Theory and Simplified Mathematical Model

    Directory of Open Access Journals (Sweden)

    P. Torok

    2008-01-01

    Full Text Available Considering the issues of artificial ventilation (AV) in non-homogenous pathological lung processes (acute lung injury, acute respiratory distress syndrome, pneumonia, etc.), the authors created a mathematical model of multicompartment, non-homogenous injured lungs ventilated by a new mode of AV, the so-called three-level ventilation. Multilevel ventilation was defined as a type (modification) of AV whose basic ventilation level is produced by the modes CMV, PCV or PS (ASB) plus an add-on level, with so-called background ventilation generated by the levels of PEEP and high PEEP (PEEPh) with varying frequency and duration. Multilevel ventilation on 3 pressure levels was realized in the mathematical model as a combination of pressure-controlled ventilation (PCV) and two levels of PEEP and PEEPh. The objective was to prove that, in cases of considerably non-homogenous gas distribution in acute pathological disorders of the lungs, gas entry into the so-called slow bronchoalveolar compartments could be improved by multilevel AV without substantially changing the volume of the so-called fast compartments. Material and Method. Multilevel ventilation at 3 pressure levels was realized in the mathematical model as a combination of PCV and two levels of PEEP and PEEPh. Results. By comparing single-level AV in the PCV mode with the so-called three-level ventilation defined as a combination of PCV+PEEPh/PEEP, the authors found that the loading of slow compartments in the model improved considerably, by 50-60% compared with the baseline values. In absolute terms, this difference was as much as 2-10 times the volume. Conclusion. The mathematical model may demonstrate that the application of the so-called three-level AV causes considerable changes in gas distribution in lung parenchyma disordered by a non-homogenous pathological process. The authors state that the proposed mathematical model requires clinical verification in order

  1. From integrable models to gauge theories Festschrift Matinyan (Sergei G)

    CERN Document Server

    Gurzadyan, V G

    2002-01-01

    This collection of twenty articles in honor of the noted physicist and mentor Sergei Matinyan focuses on topics that are of fundamental importance to high-energy physics, field theory and cosmology. The topics range from integrable quantum field theories, three-dimensional Ising models, parton models and tests of the Standard Model, to black holes in loop quantum gravity, the cosmological constant and magnetic fields in cosmology. A pedagogical essay by Lev Okun concentrates on the problem of fundamental units. The articles have been written by well-known experts and are addressed to graduate

  2. Lenses on reading an introduction to theories and models

    CERN Document Server

    Tracey, Diane H

    2017-01-01

    Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a

  3. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...

  6. Combining Theory Generation and Model Checking for Security Protocol Analysis,

    Science.gov (United States)

    2000-01-01

    This paper reviews two relatively new tools for automated formal analysis of security protocols. One applies the formal methods technique of model checking to the task of protocol analysis, while the other utilizes the method of theory generation, which borrows from both model checking and...

  7. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...

  8. [Study of dental model testing tool based on robot theory].

    Science.gov (United States)

    Hu, B; Song, Y; Cheng, L

    1999-09-01

    A new three-dimensional testing and analysing system for dental models is discussed. It is designed based on the motion theory of robots. The system is capable not only of measuring the three-dimensional sizes of dental models, but also of saving and outputting the tested data. The construction of the system is briefly introduced here.

  9. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    Schoonenboom, J.; Sligte, H.; Moghnieh, A.; Specht, M.; Glahn, C.; Stefanov, K.; Navarrete, T.; Blat, J.

    2008-01-01

    This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged

  10. Clinical outcome measurement: Models, theory, psychometrics and practice.

    Science.gov (United States)

    McClimans, Leah; Browne, John; Cano, Stefan

    In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term 'psychometrics'. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers. Copyright © 2017. Published by Elsevier Ltd.

  11. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  12. An Energy Budget Model to Calculate the Low Atmosphere Profiles of Effective Sound Speed at Night

    National Research Council Canada - National Science Library

    Tunick, Arnold

    2003-01-01

    ...) for generating low atmosphere profiles of effective sound speed at night. The alternate model is based on the solution of a quartic equation for surface temperature, which assumes a balance between the net long wave...
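
    A minimal sketch of the approach described above, using a generic nocturnal surface energy balance whose flux terms and coefficients are hypothetical (the report's own formulation is more detailed):

```python
import numpy as np

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
GAMMA, R_D = 1.4, 287.05  # dry-air heat capacity ratio and gas constant

def surface_temperature(eps, h, t_air, r_down, g_soil):
    """Solve the quartic surface energy balance
        eps*SIGMA*T^4 + h*(T - t_air) - r_down - g_soil = 0
    for the physically meaningful (real, positive) root.
    The flux terms are illustrative, not the report's exact formulation."""
    coeffs = [eps * SIGMA, 0.0, 0.0, h, -(h * t_air + r_down + g_soil)]
    roots = np.roots(coeffs)
    real = roots[np.isreal(roots)].real
    return float(real[real > 0].min())

def effective_sound_speed(temp_k, wind_along_path):
    """Adiabatic sound speed plus the along-path wind component."""
    return np.sqrt(GAMMA * R_D * temp_k) + wind_along_path

t_s = surface_temperature(eps=0.97, h=15.0, t_air=283.0, r_down=300.0, g_soil=20.0)
print(t_s, effective_sound_speed(t_s, wind_along_path=2.0))
```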

  13. Advances in cognitive theory and therapy: the generic cognitive model.

    Science.gov (United States)

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  14. Effective Biot theory and its generalization to poroviscoelastic models

    Science.gov (United States)

    Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Greenhalgh, Mark

    2018-02-01

    A method is suggested to express the effective bulk modulus of the solid frame of a poroelastic material as a function of the saturated bulk modulus. This method enables effective Biot theory to be described through the use of seismic dispersion measurements or other models developed for the effective saturated bulk modulus. The effective Biot theory is generalized to a poroviscoelastic model of which the moduli are represented by the relaxation functions of the generalized fractional Zener model. The latter covers the general Zener and the Cole-Cole models as special cases. A global search method is described to determine the parameters of the relaxation functions, and a simple deterministic method is also developed to find the defining parameters of the single Cole-Cole model. These methods enable poroviscoelastic models to be constructed, which are based on measured seismic attenuation functions, and ensure that the model dispersion characteristics match the observations.
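
    One common way to write the relaxation behaviour described above (a generic fractional-Zener/Cole-Cole parameterization, not necessarily the authors' exact notation) is through the complex modulus

    $$M(\omega) = M_R\,\frac{1 + (\mathrm{i}\omega\tau_\epsilon)^{\alpha}}{1 + (\mathrm{i}\omega\tau_\sigma)^{\alpha}}, \qquad 0 < \alpha \le 1,$$

    where $M_R$ is the relaxed modulus, $\tau_\epsilon$ and $\tau_\sigma$ are strain and stress relaxation times, and $\alpha = 1$ recovers the standard Zener element; the quality factor follows from $Q(\omega) \approx \mathrm{Re}\,M(\omega)/\mathrm{Im}\,M(\omega)$.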

  15. Applications of Generalizability Theory and Their Relations to Classical Test Theory and Structural Equation Modeling.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2017-01-23

    Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
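
    As a concrete instance of the quantities discussed above, in the simplest single-facet crossed design (persons p crossed with items i) the generalizability coefficient for relative decisions and the dependability coefficient for absolute decisions are (standard G-theory formulas, not drawn from this article)

    $$E\rho^{2} = \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \sigma^{2}_{pi,e}/n_{i}}, \qquad \Phi = \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \left(\sigma^{2}_{i} + \sigma^{2}_{pi,e}\right)/n_{i}},$$

    where the variance components are estimated from a persons-by-items ANOVA and $n_i$ is the number of items in the intended measurement procedure.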

  16. Modeling molecular recognition: theory and application.

    Science.gov (United States)

    Mardis, K; Luo, R; David, L; Potter, M; Glemza, A; Payne, G; Gilson, M K

    2000-01-01

    Efficient, reliable methods for calculating the binding affinities of noncovalent complexes would allow advances in a variety of areas such as drug discovery and separation science. We have recently described a method that accommodates significant physical detail while remaining fast enough for use in molecular design. This approach uses the predominant states method to compute free energies, an empirical force field, and an implicit solvation model based upon continuum electrostatics. We review applications of this method to systems ranging from small molecules to protein-ligand complexes.

  17. Genetic model compensation: Theory and applications

    Science.gov (United States)

    Cruickshank, David Raymond

    1998-12-01

    The adaptive filtering algorithm known as Genetic Model Compensation (GMC) was originally presented in the author's Master's Thesis. The current work extends this earlier work. GMC uses a genetic algorithm to optimize filter process noise parameters in parallel with the estimation of the state and based only on the observational information available to the filter. The original stochastic state model underlying GMC was inherited from the antecedent, non-adaptive Dynamic Model Compensation (DMC) algorithm. The current work develops the stochastic state model from a linear system viewpoint, avoiding the simplifications and approximations of the earlier development, and establishes Riemann sums as unbiased estimators of the stochastic integrals which describe the evolution of the random state components. These are significant developments which provide GMC with a solid theoretical foundation. Orbit determination is the area of application in this work, and two types of problems are studied: real-time autonomous filtering using absolute GPS measurements and precise post-processed filtering using differential GPS measurements. The first type is studied in a satellite navigation simulation in which pseudorange and pseudorange rate measurements are processed by an Extended Kalman Filter which incorporates both DMC and GMC. Both estimators are initialized by a geometric point solution algorithm. Using measurements corrupted by simulated Selective Availability errors, GMC reduces mean RSS position error by 6.4 percent, reduces mean clock bias error by 46 percent, and displays a marked improvement in covariance consistency relative to DMC. To study the second type of problem, GMC is integrated with NASA Jet Propulsion Laboratory's Gipsy/Oasis-II (GOA-II) precision orbit determination program creating an adaptive version of GOA-II's Reduced Dynamic Tracking (RDT) process noise formulation. When run as a sequential estimator with GPS measurements from the TOPEX satellite and

  18. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    Science.gov (United States)

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  19. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  20. Parameter Estimations of Dynamic Energy Budget (DEB) Model over the Life History of a Key Antarctic Species: The Antarctic Sea Star Odontaster validus Koehler, 1906.

    Science.gov (United States)

    Agüera, Antonio; Collard, Marie; Jossart, Quentin; Moreau, Camille; Danis, Bruno

    2015-01-01

    Marine organisms in Antarctica are adapted to an extreme ecosystem, including extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase of temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for the key benthic Antarctic species, the sea star Odontaster validus, using available information from the literature and from experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. The model parameters describe the slowed-down metabolism of long-lived animals that mature slowly. O. validus has a large reserve that, combined with low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developed within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O. validus allowed us to
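
    Two DEB-theory building blocks mentioned above, the Arrhenius temperature correction of metabolic rates and von Bertalanffy growth at constant food, can be sketched as follows; the parameter values are placeholders, not the estimates obtained for O. validus.

```python
import numpy as np

def arrhenius_correction(temp_k, t_ref=293.15, t_arr=8000.0):
    """Standard DEB temperature correction for metabolic rates:
    rate(T) = rate(T_ref) * exp(T_A/T_ref - T_A/T). T_A here is illustrative."""
    return np.exp(t_arr / t_ref - t_arr / temp_k)

def von_bertalanffy_length(t_days, l_inf, l_birth, r_b, temp_k):
    """Length-at-age under constant food, with the von Bertalanffy growth
    rate r_B scaled by the Arrhenius factor. Values below are placeholders."""
    r = r_b * arrhenius_correction(temp_k)
    return l_inf - (l_inf - l_birth) * np.exp(-r * t_days)

ages = np.array([0, 365, 2 * 365, 5 * 365, 10 * 365], dtype=float)
print(von_bertalanffy_length(ages, l_inf=9.0, l_birth=0.05, r_b=2e-4, temp_k=272.65))
```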

  1. Parameter Estimations of Dynamic Energy Budget (DEB) Model over the Life History of a Key Antarctic Species: The Antarctic Sea Star Odontaster validus Koehler, 1906.

    Directory of Open Access Journals (Sweden)

    Antonio Agüera

    Full Text Available Marine organisms in Antarctica are adapted to an extreme ecosystem, including extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase of temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for the key benthic Antarctic species, the sea star Odontaster validus, using available information from the literature and from experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. The model parameters describe the slowed-down metabolism of long-lived animals that mature slowly. O. validus has a large reserve that, combined with low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developed within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O

  2. Localization landscape theory of disorder in semiconductors I: Theory and modeling

    OpenAIRE

    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana

    2017-01-01

    We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called the localization landscape. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densi...

  3. Does National Culture Impact Capital Budgeting Systems?

    Directory of Open Access Journals (Sweden)

    Peter J. Graham

    2017-06-01

    Full Text Available We examine how national culture impacts organisational selection of capital budgeting systems, to develop our understanding of the influence that a holistic formulation of national culture has on capital budgeting systems. Such an understanding is important: it would not only provide a clearer link between national culture and capital budgeting systems and advance the extant literature, but would also help multinational firms that have business relationships with Indonesian firms to design suitable strategies. We conducted semi-structured interviews with selected finance managers of listed firms in Indonesia and Australia. Consistent with contingency theory, we found that economic, political, legal and social uncertainty have an impact on the use of capital budgeting systems. The levels of uncertainty were higher in Indonesia than Australia and need to be reckoned with in the selection of capital budgeting systems used by firms. We also found that firms are influenced by project size and complexity when selecting capital budgeting systems.

  4. M-theory model-building and proton stability

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J. [CERN, Geneva (Switzerland). Theory Div.; Faraggi, A.E. [Florida Univ., Gainesville, FL (United States). Inst. for Fundamental Theory; Nanopoulos, D.V. [Texas A and M Univ., College Station, TX (United States)]|[Houston Advanced Research Center, The Woodlands, TX (United States). Astroparticle Physics Group]|[Academy of Athens (Greece). Div. of Natural Sciences

    1997-09-01

    The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  5. M-Theory Model-Building and Proton Stability

    CERN Document Server

    Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.

    1998-01-01

    We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \\times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.

  6. Teaching Wound Care Management: A Model for the Budget Conscious Educator

    Science.gov (United States)

    Berry, David C.

    2012-01-01

    For the author, the concept of wound care has always been a challenging topic to demonstrate. How to teach the concept without having a student in need of wound care or without having to spend money to buy another simulation manikin/model? The author has recently created a simulation to demonstrate and practice the cleaning, closing, and dressing…

  7. An end-to-end model of the Earth Radiation Budget Experiment (ERBE) Earth-viewing nonscanning radiometric channels

    OpenAIRE

    Priestly, Kory James

    1993-01-01

    The Earth Radiation Budget Experiment (ERBE) active-cavity radiometers are used to measure the incoming solar, reflected solar, and emitted longwave radiation from the Earth and its atmosphere. The radiometers are carried by the National Aeronautics and Space Administration's Earth Radiation Budget Satellite (ERBS) and the National Oceanic and Atmospheric Administration's NOAA-9 and NOAA-10 spacecraft. Four Earth-viewing nonscanning active-cavity radiometers are carried by e...

  8. Making sense of implementation theories, models and frameworks.

    Science.gov (United States)

    Nilsen, Per

    2015-04-21

    Implementation science has progressed towards increased use of theoretical approaches to provide better understanding and explanation of how and why implementation succeeds or fails. The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection and application of relevant approaches in implementation research and practice and to foster cross-disciplinary dialogue among implementation researchers. Theoretical approaches used in implementation science have three overarching aims: describing and/or guiding the process of translating research into practice (process models); understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, implementation theories); and evaluating implementation (evaluation frameworks). This article proposes five categories of theoretical approaches to achieve three overarching aims. These categories are not always recognized as separate types of approaches in the literature. While there is overlap between some of the theories, models and frameworks, awareness of the differences is important to facilitate the selection of relevant approaches. Most determinant frameworks provide limited "how-to" support for carrying out implementation endeavours since the determinants usually are too generic to provide sufficient detail for guiding an implementation process. And while the relevance of addressing barriers and enablers to translating research into practice is mentioned in many process models, these models do not identify or systematically structure specific determinants associated with implementation success. Furthermore, process models recognize a temporal sequence of implementation endeavours, whereas determinant frameworks do not explicitly take a process perspective of implementation.

  9. The chlorine budget of the present-day atmosphere - A modeling study

    Science.gov (United States)

    Weisenstein, Debra K.; Ko, Malcolm K. W.; Sze, Nien-Dak

    1992-01-01

    The contribution of source gases to the total amount of inorganic chlorine (ClY) is examined analytically with a time-dependent model employing 11 source gases. The source-gas emission data are described, and the modeling methodology is set forth with attention given to the data interpretation. The abundances and distributions are obtained for all 11 source gases with corresponding ClY production rates and mixing ratios. It is shown that the ClY production rate and the ClY mixing ratio for each source gas are spatially dependent, and the change in the relative contributions from 1950 to 1990 is given. Ozone changes in the past decade are characterized by losses in the polar and midlatitude lower stratosphere. The values for CFC-11, CCl4, and CH3CCl3 suggest that they are more evident in the lower stratosphere than is suggested by steady-state estimates based on surface concentrations.

  10. Budgeting based on need: a model to determine sub-national allocation of resources for health services in Indonesia

    Directory of Open Access Journals (Sweden)

    Ensor Tim

    2012-08-01

    Background: Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier-induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low- and middle-income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. Methods: A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation, including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level. It is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province-specific data on the distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Findings: Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population

  11. Budgeting based on need: a model to determine sub-national allocation of resources for health services in Indonesia.

    Science.gov (United States)

    Ensor, Tim; Firdaus, Hafidz; Dunlop, David; Manu, Alex; Mukti, Ali Ghufron; Ayu Puspandari, Diah; von Roenne, Franz; Indradjaya, Stephanus; Suseno, Untung; Vaughan, Patrick

    2012-08-29

    Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier-induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low- and middle-income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation, including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level. It is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province-specific data on the distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population, particularly numbers of births, infants and children and also key
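
    To make the bottom-up logic concrete, the sketch below shows how such a costing model is typically assembled: a district budget is the sum, over the package services, of target population, expected contacts and unit cost, scaled by a location factor. All service names, coverage fractions, prices and multipliers are hypothetical placeholders, not figures from the study.

```python
# Illustrative sketch (not the authors' actual costing tool): a bottom-up
# district budget as the sum of service costs driven by demographics,
# expected caseloads and a location-based cost multiplier. All numbers
# and service names below are hypothetical placeholders.

services = {
    # service: (target_pop_fraction, expected_contacts_per_person, unit_cost_usd)
    "antenatal_care":     (0.025, 4.0, 3.50),
    "skilled_delivery":   (0.024, 1.0, 25.0),
    "child_immunisation": (0.090, 5.0, 1.20),
    "tb_treatment":       (0.002, 1.0, 60.0),
}

def district_budget(population, remoteness_multiplier):
    """Total and per-capita cost of the package for one district."""
    total = 0.0
    for name, (frac, contacts, unit_cost) in services.items():
        total += population * frac * contacts * unit_cost * remoteness_multiplier
    return total, total / population

for district, pop, remote in [("urban_district", 900_000, 1.0),
                              ("remote_district", 150_000, 2.2)]:
    total, per_capita = district_budget(pop, remote)
    print(f"{district}: USD {total:,.0f} total, USD {per_capita:.2f} per capita")
```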

  12. Lenses on Reading: An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  13. Global model simulations of the impact of ocean-going ships on aerosols, clouds, and the radiation budget

    Directory of Open Access Journals (Sweden)

    A. Lauer

    2007-10-01

    International shipping contributes significantly to the fuel consumption of all transport related activities. Specific emissions of pollutants such as sulfur dioxide (SO2) per kg of fuel emitted are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing the solar and thermal radiation and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to study the climate impacts of international shipping. The simulations show that emissions from ships significantly increase the cloud droplet number concentration of low marine water clouds by up to 5% to 30% depending on the ship emission inventory and the geographic region. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to a cloud optical thickness increase of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% of the total sulfate burden and 0.4% to 1.4% of the total black carbon burden in the year 2000 on the global mean. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10% depending on the emission inventory. Changes in aerosol optical thickness caused by shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change in the shortwave radiation budget at the top of the

  14. Modelling study of boundary-layer ozone over northern China - Part I: Ozone budget in summer

    Science.gov (United States)

    Tang, Guiqian; Zhu, Xiaowan; Xin, Jinyuan; Hu, Bo; Song, Tao; Sun, Yang; Zhang, Jinqiang; Wang, Lili; Cheng, Mengtian; Chao, Na; Kong, Lingbin; Li, Xin; Wang, Yuesi

    2017-05-01

    Regional photochemical pollution caused by ozone (O3) is serious in northern China during summer. In this study, we combined network observation data with the Fifth-Generation Pennsylvania State/National Center for Atmospheric Research Mesoscale Model-Community Multiscale Air Quality (MM5-CMAQ) model system to simulate O3 and its precursors' concentrations over northern China in June 2008. Comparisons of the simulations and observations indicate that the model can accurately reproduce the temporal and spatial distributions of temperature, humidity, and wind as well as the evolution of O3 and its precursors over northern China. The monthly mean of the total oxidants (nitrogen dioxide + O3) at 15:00 LT exceeded 90 ppbv across the North China Plain, thereby indicating significant photochemical pollution in this area. Vertical diffusion is the main source of the near-ground O3, with contributions of more than 20 ppbv h−1 in the urban areas. Dry deposition and chemical reactions are the main sinks for O3, with contributions of more than 20 ppbv h−1 and 7 ppbv h−1 in the forest and urban areas, respectively. Although vertical diffusion is the main source of near-ground O3, photochemical reactions dominate the O3 concentrations in the boundary layer because of the circulation between the lower and upper boundary layers. Considering that O3 is mainly produced in the upper boundary layer, both nitrogen oxides and volatile organic compounds should be controlled on the North China Plain. The results presented here are intended to provide guidance for redefining strategies to control photochemical pollution over northern China.
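
    As an illustration of the budget bookkeeping such a study performs (for example with integrated process rate diagnostics), the sketch below sums per-process O3 tendencies for a single grid box; the process list follows the abstract, but the numbers are placeholders rather than the paper's results.

```python
# Minimal sketch of a process-budget bookkeeping exercise: the hourly O3
# change in a grid box is decomposed into contributions from individual
# processes, and their sum gives the net tendency. Values are illustrative,
# not results from the study above.

process_tendencies_ppbv_per_h = {
    "vertical_diffusion":   +22.0,   # main near-ground source in urban areas
    "horizontal_advection":  -3.0,
    "chemistry":             -7.0,   # net chemical loss near the urban surface
    "dry_deposition":       -20.0,
}

net = sum(process_tendencies_ppbv_per_h.values())
print("Net near-ground O3 tendency: %+.1f ppbv/h" % net)
for proc, val in sorted(process_tendencies_ppbv_per_h.items(),
                        key=lambda kv: -abs(kv[1])):
    print(f"  {proc:22s} {val:+6.1f} ppbv/h")
```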

  15. Global atmospheric budget of acetaldehyde: 3-D model analysis and constraints from in-situ and satellite observations

    Directory of Open Access Journals (Sweden)

    D. B. Millet

    2010-04-01

    We construct a global atmospheric budget for acetaldehyde using a 3-D model of atmospheric chemistry (GEOS-Chem), and use an ensemble of observations to evaluate present understanding of its sources and sinks. Hydrocarbon oxidation provides the largest acetaldehyde source in the model (128 Tg a−1, a factor of 4 greater than the previous estimate), with alkanes, alkenes, and ethanol the main precursors. There is also a minor source from isoprene oxidation. We use an updated chemical mechanism for GEOS-Chem, and photochemical acetaldehyde yields are consistent with the Master Chemical Mechanism. We present a new approach to quantifying the acetaldehyde air-sea flux based on the global distribution of light absorption due to colored dissolved organic matter (CDOM) derived from satellite ocean color observations. The resulting net ocean emission is 57 Tg a−1, the second largest global source of acetaldehyde. A key uncertainty is the acetaldehyde turnover time in the ocean mixed layer, with quantitative model evaluation over the ocean complicated by known measurement artifacts in clean air. Simulated concentrations in surface air over the ocean generally agree well with aircraft measurements, though the model tends to overestimate the vertical gradient. PAN:NOx ratios are well-simulated in the marine boundary layer, providing some support for the modeled ocean source. We introduce the Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1) for acetaldehyde and ethanol and use it to quantify their net flux from living terrestrial plants. Including emissions from decaying plants, the total direct acetaldehyde source from the land biosphere is 23 Tg a−1. Other terrestrial acetaldehyde sources include biomass burning (3 Tg a−1) and anthropogenic emissions (2 Tg a−1). Simulated concentrations in the continental boundary layer are generally unbiased and capture the spatial
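
    For orientation, the sketch below evaluates a generic bulk air-sea gas-exchange flux of the form F = k_w (C_w - C_a/H). It only illustrates how an ocean source term enters such a budget; it is not the CDOM-based parameterization developed in the paper, and all values are invented.

```python
# Hedged sketch of a generic bulk air-sea gas exchange flux,
#   F = k_w * (C_w - C_a / H),
# where k_w is a transfer velocity and H the dimensionless gas/liquid
# partition coefficient at equilibrium (C_a,eq / C_w,eq). This is a generic
# formulation, NOT the specific CDOM-based approach of the paper.

def air_sea_flux(c_water_mol_m3, c_air_mol_m3, k_w_m_per_s, H_dimensionless):
    """Positive flux = net emission from ocean to atmosphere (mol m-2 s-1)."""
    return k_w_m_per_s * (c_water_mol_m3 - c_air_mol_m3 / H_dimensionless)

# Illustrative numbers only.
flux = air_sea_flux(c_water_mol_m3=5e-6, c_air_mol_m3=2e-8,
                    k_w_m_per_s=5e-5, H_dimensionless=10.0)
print(f"Net air-sea flux: {flux:.2e} mol m-2 s-1")
```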

  16. Theory, modeling and simulation of superconducting qubits

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU

    2011-01-13

    We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, the resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for the phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty, and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier of frequency about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  17. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, computer, and communication systems. • A chapter on ...

  18. Traffic Games: Modeling Freeway Traffic with Game Theory.

    Science.gov (United States)

    Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R

    2016-01-01

    We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.

  19. Error budget analysis of SCIAMACHY limb ozone profile retrievals using the SCIATRAN model

    Directory of Open Access Journals (Sweden)

    N. Rahpoe

    2013-10-01

    A comprehensive error characterization of SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY) limb ozone profiles has been established based upon SCIATRAN transfer model simulations. The study was carried out in order to evaluate the possible impact of parameter uncertainties, e.g. in albedo, stratospheric aerosol optical extinction, temperature, pressure, pointing, and ozone absorption cross section on the limb ozone retrieval. Together with the a posteriori covariance matrix available from the retrieval, total random and systematic errors are defined for SCIAMACHY ozone profiles. Main error sources are the pointing errors, errors in the knowledge of stratospheric aerosol parameters, and cloud interference. Systematic errors are of the order of 7%, while the random error amounts to 10–15% for most of the stratosphere. These numbers can be used for the interpretation of instrument intercomparison and validation of the SCIAMACHY V 2.5 limb ozone profiles in a rigorous manner.
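
    A minimal sketch of how an error budget of this kind is usually combined: individual parameter-induced errors are assumed independent and added in quadrature, separately for the systematic and random contributions. The error names and percentages below are placeholders, not values from the paper.

```python
# Combine independent error contributions in quadrature, keeping random and
# systematic budgets separate. Numbers are illustrative placeholders.
import math

systematic_pct = {"pointing": 4.0, "aerosol": 4.0, "albedo": 2.0, "cross_section": 3.0}
random_pct     = {"retrieval_noise": 10.0, "temperature": 3.0, "pressure": 2.0}

def combine(errors):
    """Root-sum-square of independent error terms (in percent)."""
    return math.sqrt(sum(e * e for e in errors.values()))

print(f"Total systematic error: {combine(systematic_pct):.1f}%")
print(f"Total random error:     {combine(random_pct):.1f}%")
```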

  20. Theory of positive disintegration as a model of adolescent development.

    Science.gov (United States)

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost a half century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail and it is proposed that they represent the behaviour of the early, middle and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.

  1. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  2. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577

  3. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  4. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739
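
    As a toy illustration of the Bayesian learning mechanism invoked here (my example, not the authors'), the sketch below updates the posterior over two candidate causal hypotheses (a block causes the effect versus no causal link) from a handful of observed trials.

```python
# Toy Bayesian update over two causal hypotheses from observed
# (cause present, effect observed) trials. Probabilities are illustrative.

# H1: "the block is a cause" -> effect occurs with p=0.8 when the block is
# present, p=0.1 otherwise. H0: "no causal link" -> p=0.1 always.
def likelihood(p_effect_given_present, p_effect_baseline, data):
    L = 1.0
    for present, effect in data:
        p = p_effect_given_present if present else p_effect_baseline
        L *= p if effect else (1.0 - p)
    return L

data = [(True, True), (True, True), (False, False), (True, True), (False, False)]
prior_h1, prior_h0 = 0.5, 0.5
l_h1 = likelihood(0.8, 0.1, data)
l_h0 = likelihood(0.1, 0.1, data)
posterior_h1 = prior_h1 * l_h1 / (prior_h1 * l_h1 + prior_h0 * l_h0)
print(f"Posterior probability of the causal hypothesis: {posterior_h1:.3f}")
```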

  5. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of the magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of the three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.

  6. Secondary flow structure in a model curved artery: 3D morphology and circulation budget analysis

    Science.gov (United States)

    Bulusu, Kartik V.; Plesniak, Michael W.

    2015-11-01

    In this study, we examined the rate of change of circulation within control regions encompassing the large-scale vortical structures associated with secondary flows, i.e. deformed Dean-, Lyne- and Wall-type (D-L-W) vortices at planar cross-sections in a 180° curved artery model (curvature ratio, 1/7). Magnetic resonance velocimetry (MRV) and particle image velocimetry (PIV) experiments were performed independently, under the same physiological inflow conditions (Womersley number, 4.2) and using Newtonian blood-analog fluids. The MRV-technique performed at Stanford University produced phase-averaged, three-dimensional velocity fields. Secondary flow field comparisons of MRV-data to PIV-data at various cross-sectional planes and inflow phases were made. A wavelet-decomposition-based approach was implemented to characterize various secondary flow morphologies. We hypothesize that the persistence and decay of arterial secondary flow vortices are intrinsically related to the influence of the out-of-plane flow, tilting, in-plane convection and diffusion-related factors within the control regions. Evaluation of these factors will elucidate secondary flow structures in arterial hemodynamics. Supported by the National Science Foundation under Grant Number CBET-0828903, and GW Center for Biomimetics and Bioinspired Engineering (COBRE). The MRV data were acquired at Stanford University in collaboration with Christopher Elkins and John Eaton.
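
    A rough numerical sketch of the circulation bookkeeping described above: for a planar velocity field (here a synthetic Gaussian vortex rather than PIV/MRV data), the circulation of a control region can be computed either as the boundary line integral of velocity or, via Stokes' theorem, as the area integral of vorticity.

```python
# Circulation of a control region computed two ways for a synthetic
# Gaussian vortex: (1) area integral of vorticity, (2) boundary line
# integral of velocity. Both should approach the imposed circulation.
import numpy as np

n, L = 201, 2.0
x = np.linspace(-L, L, n)
y = np.linspace(-L, L, n)
X, Y = np.meshgrid(x, y, indexing="xy")
dx = x[1] - x[0]

# Gaussian vortex: u_theta = (Gamma0 / (2*pi*r)) * (1 - exp(-r^2 / a^2))
Gamma0, a = 1.0, 0.5
r2 = X**2 + Y**2 + 1e-12
u_theta_over_r = Gamma0 / (2 * np.pi * r2) * (1.0 - np.exp(-r2 / a**2))
U = -Y * u_theta_over_r
V = X * u_theta_over_r

# (1) Area integral of vorticity over the whole window
dVdx = np.gradient(V, dx, axis=1)
dUdy = np.gradient(U, dx, axis=0)
gamma_area = (dVdx - dUdy).sum() * dx * dx

# (2) Counter-clockwise boundary line integral (simple Riemann sums)
gamma_line = ((U[0, :].sum() - U[-1, :].sum()) * dx
              + (V[:, -1].sum() - V[:, 0].sum()) * dx)

print(f"Circulation from vorticity integral: {gamma_area:.3f} (target ~{Gamma0:.3f})")
print(f"Circulation from line integral:      {gamma_line:.3f}")
```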

  7. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for the integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems' properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems' properties that are required to model transitions to sustainability.

  8. A Defense Budget Primer

    Science.gov (United States)

    1998-12-09

    Snippets from the report's discussion of congressional budget practices: Appendix D gives the timetable of congressional action on the FY1999 budget, and Table 6 lists milestone votes on the defense budget. For details on the appropriations process, see James V. Saturno, The Appropriations Process and the Congressional Budget Act, CRS Report 97-947. The Budget Enforcement Act of 1990 and subsequent

  9. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
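
    For readers unfamiliar with the classical risk process that the thesis extends, the sketch below estimates a finite-horizon ruin probability by Monte Carlo for a Cramér-Lundberg surplus process with Poisson claim arrivals and exponential claim sizes. Parameters are illustrative, and the interest-rate extension studied in the thesis is not included.

```python
# Monte Carlo estimate of a finite-horizon ruin probability for the classical
# (Cramér-Lundberg) risk process: surplus = initial capital + premiums - claims.
import random

def ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                     horizon=200.0, n_paths=20_000, seed=1):
    """Fraction of simulated paths whose surplus drops below zero before 'horizon'."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)                    # next claim arrival
            if t > horizon:
                break
            claims += rng.expovariate(1.0 / mean_claim)  # exponential claim size
            if u + c * t - claims < 0:                   # ruin can only occur at claims
                ruined += 1
                break
    return ruined / n_paths

print(f"Estimated ruin probability: {ruin_probability():.4f}")
```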

  10. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    Abstract. In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...

  11. Magnetized cosmological models in bimetric theory of gravitation

    Indian Academy of Sciences (India)

    Abstract. Bianchi type-III magnetized cosmological model when the field of gravitation is governed by either a perfect fluid or cosmic string is investigated in Rosen's [1] bimetric theory of gravitation. To complete determinate solution, the condition, viz., A = (BC)n, where n is a constant, between the metric potentials is used.

  12. Anisotropic cosmological models in f (R, T) theory of gravitation

    Indian Academy of Sciences (India)

    ...cosmologically viable f(R) gravity model, which showed the unification of early-time inflation and late-time acceleration. Harko et al [13] developed f(R, T) modified theory of gravity, where the gravitational Lagrangian is given by an arbitrary function of the Ricci scalar R and the trace T of the energy–momentum tensor. It is to be noted ...

  13. Teaching Model Building to High School Students: Theory and Reality.

    Science.gov (United States)

    Roberts, Nancy; Barclay, Tim

    1988-01-01

    Builds on a National Science Foundation (NSF) microcomputer based laboratory project to introduce system dynamics into the precollege setting. Focuses on providing students with powerful and investigatory theory building tools. Discusses developed hardware, software, and curriculum materials used to introduce model building and simulations into…

  14. A Model to Demonstrate the Place Theory of Hearing

    Science.gov (United States)

    Ganesh, Gnanasenthil; Srinivasan, Venkata Subramanian; Krishnamurthi, Sarayu

    2016-01-01

    In this brief article, the authors discuss Georg von Békésy's experiments showing the existence of traveling waves in the basilar membrane and that maximal displacement of the traveling wave was determined by the frequency of the sound. The place theory of hearing equates the basilar membrane to a frequency analyzer. The model described in this…

  15. Multilevel Higher-Order Item Response Theory Models

    Science.gov (United States)

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…

  16. SIMP model at NNLO in chiral perturbation theory

    DEFF Research Database (Denmark)

    Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.

    2015-01-01

    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles...

  17. Speech act theory in support of idealised warning models | Carstens ...

    African Journals Online (AJOL)

    ... subsuming lower level speech acts such as POINTING OUT/ALERTING, INFORMING and INSTRUCTING. Secondly, the model is used to analyse and evaluate actual warnings collected from information sheets for hair-dryers, indicating the heuristic value of combined insights from document design and speech act theory ...

  18. A Proposed Model of Jazz Theory Knowledge Acquisition

    Science.gov (United States)

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables yielded estimates ranging from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  19. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    Science.gov (United States)

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  20. Dimensions of Genocide: The Circumplex Model Meets Violentization Theory

    Science.gov (United States)

    Winton, Mark A.

    2008-01-01

    The purpose of this study is to examine the use of Olson's (1995, 2000) family therapy based circumplex model and Athens' (1992, 1997, 2003) violentization theory in explaining genocide. The Rwandan genocide of 1994 is used as a case study. Published texts, including interviews with perpetrators, research reports, human rights reports, and court…

  1. Pilot evaluation in TENCompetence: a theory-driven model

    OpenAIRE

    Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen

    2007-01-01

    Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong Competence Development Infrastructures' (pp. 43-50). June, 21-22, 2007, Barcelona, Spain.

  2. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen

    2007-01-01

    Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong

  3. Excellence in Physics Education Award: Modeling Theory for Physics Instruction

    Science.gov (United States)

    Hestenes, David

    2014-03-01

    All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.

  4. Alliance: A common factor of psychotherapy modeled by structural theory

    Directory of Open Access Journals (Sweden)

    Wolfgang eTschacher

    2015-04-01

    There is broad consensus that the therapeutic alliance constitutes a core common factor for all modalities of psychotherapy. Meta-analyses corroborated that alliance, as it emerges from therapeutic process, is a significant predictor of therapy outcome. Psychotherapy process is traditionally described and explored using two categorially different approaches, the experiential (first-person) perspective and the behavioral (third-person) perspective. We propose to add to this duality a third, structural approach. Dynamical systems theory and synergetics on the one hand and enactivist theory on the other together can provide this structural approach, which contributes in specific ways to a clarification of the alliance factor. Systems theory offers concepts and tools for the modeling of the individual self and, building on this, of alliance processes. In the enactive perspective, the self is conceived as a socially enacted autonomous system that strives to maintain identity by observing a two-fold goal: to exist as an individual self in its own right (distinction) while also being open to others (participation). Using this conceptualization, we formalized the therapeutic alliance as a phase space whose potential minima (attractors) can be shifted by the therapist to approximate therapy goals. This mathematical formalization is derived from probability theory and synergetics. We conclude that structural theory provides powerful tools for modeling how therapeutic change is staged by the formation, utilization, and dissolution of the therapeutic alliance. In addition, we point out novel testable hypotheses and future applications.
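
    The phase-space picture can be illustrated with a toy example (mine, not the authors'): an order parameter moves in a double-well potential whose minima play the role of attractors, and a control parameter tilts the landscape so that one attractor becomes dominant.

```python
# Tilted double-well potential as a cartoon of attractor shifting:
# the sign of the tilt b selects which minimum (attractor) is dominant.
import numpy as np

def potential(x, a=1.0, b=0.0):
    """Double well x^4 - a*x^2, tilted by the control parameter b."""
    return x**4 - a * x**2 + b * x

x = np.linspace(-1.5, 1.5, 2001)
for b in (0.3, -0.3):
    V = potential(x, b=b)
    x_star = x[np.argmin(V)]
    print(f"tilt b = {b:+.1f}: dominant attractor near x = {x_star:+.2f}")
```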

  5. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    The traffic behaviours of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers. Due to the lack of timeliness and comprehensiveness, the provided information cannot satisfy the travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by solving the route choice game for its Nash Equilibrium. Simulations and experimental analysis show that the proposed model can describe the commuters' routine route choice decisions exactly and that the provided route is reliable.
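
    A self-contained toy version of the route-choice game idea (not the paper's model): commuters distribute themselves over two routes whose travel times grow with load, and best-response updating is iterated until no driver can improve unilaterally, i.e. a pure Nash equilibrium of this congestion game. All coefficients are invented.

```python
# Two-route congestion game solved by best-response dynamics.
N = 100

def travel_time(route, load):
    # free-flow time + congestion term; coefficients are illustrative
    if route == "A":
        return 10.0 + 0.10 * load
    return 15.0 + 0.05 * load

def equilibrium_split():
    n_a = N  # start with everyone on route A
    while True:
        moved = False
        # a driver on A considers switching to B, and vice versa
        if n_a > 0 and travel_time("B", (N - n_a) + 1) < travel_time("A", n_a):
            n_a -= 1
            moved = True
        elif n_a < N and travel_time("A", n_a + 1) < travel_time("B", N - n_a):
            n_a += 1
            moved = True
        if not moved:
            return n_a

n_a = equilibrium_split()
print(f"Equilibrium: {n_a} drivers on A ({travel_time('A', n_a):.1f} min), "
      f"{N - n_a} on B ({travel_time('B', N - n_a):.1f} min)")
```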

  6. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    Science.gov (United States)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
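
    A minimal sketch of the Model Analysis construction referred to above, with invented data: each student's responses are classified into one of three mental models, the student state vector has components sqrt(n_i/m), and the class density matrix is the average outer product, whose eigenanalysis exposes the dominant model mixture and how consistently it is used.

```python
# Model Analysis sketch: build student state vectors from response counts,
# form the class density matrix, and examine its eigenstructure.
import numpy as np

# rows = students, columns = counts of answers attributed to models 1..3
counts = np.array([
    [8, 2, 0],
    [5, 5, 0],
    [9, 1, 0],
    [3, 6, 1],
    [7, 3, 0],
], dtype=float)

m = counts.sum(axis=1, keepdims=True)                 # questions per student
u = np.sqrt(counts / m)                               # unit-norm student vectors
D = (u[:, :, None] * u[:, None, :]).mean(axis=0)      # class density matrix

eigvals, eigvecs = np.linalg.eigh(D)
order = np.argsort(eigvals)[::-1]
print("Eigenvalues (largest first):", np.round(eigvals[order], 3))
print("Primary eigenvector:        ", np.round(eigvecs[:, order[0]], 3))
```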

  7. Design of formative assessment model for professional behavior using stages of change theory.

    Science.gov (United States)

    Hashemi, Akram; Mirzazadeh, Azim; Shirazi, Mandana; Asghari, Fariba

    2016-01-01

    Background: Professionalism is a core competency of physicians. This study was conducted to design a model for formative assessment of professional commitment in medical students according to stages of change theory. Methods: In this qualitative study, data were collected through literature review and focus group interviews at the Tehran University of Medical Sciences in 2013 and analyzed using a content analysis approach. Results: Review of the literature and the results of focus group interviews led to the design of a formative assessment model of professional commitment in three phases (pre-contemplation, contemplation, and readiness for behavior change), each of which has interventional and assessment components. In the second phase of the study, experts' opinions were collected in two main categories: the educational environment (factors related to students, students' assessment and the educational program) and administrative problems (factors related to subcultures, policymakers or managers, and budget). In addition, there was a set of recommendations for each category related to curriculum, professors, students, assessments, culture building, staff, and reinforcing administrative factors. Conclusion: This type of framework analysis made it possible to develop a conceptual model that could be effective in forming professional commitment and behavioral change in medical students.

  8. FY 1996 Congressional budget request: Budget highlights

    Energy Technology Data Exchange (ETDEWEB)

    1995-02-01

    The FY 1996 budget presentation is organized by the Department's major business lines. An accompanying chart displays the request for new budget authority. The report compares the budget request for FY 1996 with the appropriated FY 1995 funding levels displayed on a comparable basis. The FY 1996 budget represents the first year of a five year plan in which the Department will reduce its spending by $15.8 billion in budget authority and by $14.1 billion in outlays. FY 1996 is a transition year as the Department embarks on its multiyear effort to do more with less. The Budget Highlights are presented by business line; however, the fifth business line, Economic Productivity, which is described in the Policy Overview section, cuts across multiple organizational missions, funding levels and activities and is therefore included in the discussion of the other four business lines.

  9. Cluster variational theory of spin-3/2 Ising models

    CERN Document Server

    Tucker, J W

    2000-01-01

    A cluster variational method for spin-3/2 Ising models on regular lattices is presented that leads to results that are exact for Bethe lattices of the same coordination number. The method is applied to both the Blume-Capel (BC) and the isotropic Blume-Emery-Griffiths model (BEG). In particular, the first-order phase line separating the two low-temperature ferromagnetic phases in the BC model, and the ferrimagnetic phase boundary in the BEG model are studied. Results are compared with those of other theories whose qualitative predictions have been in conflict.
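
    For orientation only, the sketch below solves the much simpler single-site mean-field approximation of the spin-3/2 Blume-Capel model (a weaker approximation than the cluster variational method of the paper, which is exact on Bethe lattices): each spin feels the effective field z*J*m plus the single-ion anisotropy D*S^2, and the magnetization m is iterated to self-consistency.

```python
# Single-site mean-field magnetization of the spin-3/2 Blume-Capel model.
import math

SPINS = (-1.5, -0.5, 0.5, 1.5)

def magnetization(T, z=4, J=1.0, D=0.0, m0=1.0, tol=1e-10, max_iter=10_000):
    """Iterate m = <S> with Boltzmann weights exp(beta*(z*J*m*S + D*S^2))."""
    beta, m = 1.0 / T, m0
    for _ in range(max_iter):
        weights = [math.exp(beta * (z * J * m * S + D * S * S)) for S in SPINS]
        m_new = sum(S * w for S, w in zip(SPINS, weights)) / sum(weights)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

for T in (1.0, 3.0, 6.0, 8.0):
    print(f"T = {T:4.1f}  ->  m = {magnetization(T):+.4f}")
```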

  10. On ADE quiver models and F-theory compactification

    Energy Technology Data Exchange (ETDEWEB)

    Belhaj, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Rasmussen, J [Department of Mathematics and Statistics, University of Melbourne, Parkville, Victoria 3010 (Australia); Sebbar, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Sedra, M B [Laboratoire de Physique de la Matiere et Rayonnement (LPMR), Morocco Faculte des Sciences, Universite Ibn Tofail, Kenitra, Morocco (Morocco)

    2006-07-21

    Based on mirror symmetry, we discuss geometric engineering of N = 1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibred over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.

  11. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    OpenAIRE

    Adachi, M.; Ito, A.; Ishida, A.; W. R. Kadir; P. Ladpala; Yamagata, Y.

    2011-01-01

    More reliable estimates of the carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including...

  12. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
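
    The flavour of the argument can be conveyed with a deliberately simple 2x2 inspection game (my illustration, not taken from the article): the firm chooses to comply or violate, the regulator chooses to inspect or not, no pure-strategy equilibrium exists, and both sides randomize. The payoffs below are invented.

```python
# Mixed-strategy Nash equilibrium of a 2x2 inspection game, found from the
# usual indifference conditions (valid for this payoff structure, which has
# no pure-strategy equilibrium).

# Payoffs: rows = firm action (0=Comply, 1=Violate),
#          cols = regulator action (0=Inspect, 1=Don't inspect).
firm = [[-1.0, -1.0],   # complying costs 1 regardless
        [-4.0,  2.0]]   # violating: fined if inspected, gains otherwise
reg  = [[-1.0,  0.0],   # inspecting a compliant firm wastes resources
        [-2.0, -5.0]]   # undetected violations are the worst outcome

# Regulator mixes so the firm is indifferent between its two actions.
p_inspect = (firm[1][1] - firm[0][1]) / (
    (firm[1][1] - firm[0][1]) + (firm[0][0] - firm[1][0]))
# Firm mixes so the regulator is indifferent between its two actions.
q_comply = (reg[1][1] - reg[1][0]) / (
    (reg[1][1] - reg[1][0]) + (reg[0][0] - reg[0][1]))

print(f"Equilibrium: regulator inspects with prob {p_inspect:.2f}, "
      f"firm complies with prob {q_comply:.2f}")
```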

  13. Extrapolating toxic effects on individuals to the population level: the role of dynamic energy budgets

    NARCIS (Netherlands)

    Jager, T.; Klok, T.C.

    2010-01-01

    The interest of environmental management is in the long-term health of populations and ecosystems. However, toxicity is usually assessed in short-term experiments with individuals. Modelling based on dynamic energy budget (DEB) theory aids the extraction of mechanistic information from the data,

  14. Retirement decisions in a discrete choice model and implications for the government budget: the case of Belgium.

    Science.gov (United States)

    Pepermans, G

    1992-08-01

    "The purpose of this paper was to analyse the determinants of the retirement decision of the elderly in Belgium, and, by making some simulations, to find out what would be the financial implications for the government budget of changes in the social security system.... The largest effect on labor supply is caused by changes in pensionable age. Giving a lump-sum pension to part-time workers seems an interesting policy to withdraw individuals from the labor market at a relatively low cost. Introducing flexible retirement also is beneficial for the government budget and has, especially for women, a relatively large positive effect on labor supply." excerpt

  15. Finite Element and Plate Theory Modeling of Acoustic Emission Waveforms

    Science.gov (United States)

    Prosser, W. H.; Hamstad, M. A.; Gary, J.; OGallagher, A.

    1998-01-01

    A comparison was made between two approaches to predict acoustic emission waveforms in thin plates. A normal mode solution method for Mindlin plate theory was used to predict the response of the flexural plate mode to a point source, step-function load, applied on the plate surface. The second approach used a dynamic finite element method to model the problem using equations of motion based on exact linear elasticity. Calculations were made using properties for both isotropic (aluminum) and anisotropic (unidirectional graphite/epoxy composite) materials. For simulations of anisotropic plates, propagation along multiple directions was evaluated. In general, agreement between the two theoretical approaches was good. Discrepancies in the waveforms at longer times were caused by differences in reflections from the lateral plate boundaries. These differences resulted from the fact that the two methods used different boundary conditions. At shorter times in the signals, before reflections, the slight discrepancies in the waveforms were attributed to limitations of Mindlin plate theory, which is an approximate plate theory. The advantages of the finite element method are that it used the exact linear elasticity solutions, and that it can be used to model real source conditions and complicated, finite specimen geometries as well as thick plates. These advantages come at a cost of increased computational difficulty, requiring lengthy calculations on workstations or supercomputers. The Mindlin plate theory solutions, meanwhile, can be quickly generated on personal computers. Specimens with finite geometry can also be modeled. However, only limited simple geometries such as circular or rectangular plates can easily be accommodated with the normal mode solution technique. Likewise, very limited source configurations can be modeled and plate theory is applicable only to thin plates.
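
    As a low-frequency point of reference (not the Mindlin or finite element calculations of the paper), the sketch below evaluates the flexural dispersion of classical Kirchhoff thin-plate theory, c_p = sqrt(omega) * (D/(rho*h))^(1/4); Mindlin theory adds shear deformation and rotary inertia corrections that matter at higher frequencies. Material values are nominal aluminum and the plate thickness is an assumed example value.

```python
# Flexural phase velocity from classical Kirchhoff thin-plate theory
# (low-frequency limit of the flexural plate mode).
import math

E, nu, rho, h = 70e9, 0.33, 2700.0, 3.175e-3   # Pa, -, kg/m^3, m
D = E * h**3 / (12.0 * (1.0 - nu**2))          # bending stiffness

def phase_velocity(freq_hz):
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(omega) * (D / (rho * h))**0.25

for f in (20e3, 100e3, 300e3, 500e3):
    print(f"{f/1e3:6.0f} kHz : c_p ~ {phase_velocity(f):7.1f} m/s")
```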

  16. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    Directory of Open Access Journals (Sweden)

    Hugo U. R. Strand

    2015-03-01

    We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium “phase diagrams” that map out the different dynamical regimes.

  17. Super Yang-Mills theory as a random matrix model

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, W. [Institute for Theoretical Physics, State University of New York, Stony Brook, New York 11794-3840 (United States)

    1995-07-15

    We generalize the Gervais-Neveu gauge to four-dimensional N=1 superspace. The model describes an N=2 super Yang-Mills theory. All chiral superfields (N=2 matter and ghost multiplets) exactly cancel to all loops. The remaining Hermitian scalar superfield (matrix) has a renormalizable massive propagator and simplified vertices. These properties are associated with N=1 supergraphs describing a superstring theory on a random lattice world sheet. We also consider all possible finite matrix models, and find they have a universal large-color limit. These could describe gravitational strings if the matrix-model coupling is fixed to unity, for exact electric-magnetic self-duality.

  18. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  19. Localization landscape theory of disorder in semiconductors. I. Theory and modeling

    Science.gov (United States)

    Filoche, Marcel; Piccardo, Marco; Wu, Yuh-Renn; Li, Chi-Kang; Weisbuch, Claude; Mayboroda, Svitlana

    2017-04-01

    We present here a model of carrier distribution and transport in semiconductor alloys accounting for quantum localization effects in disordered materials. This model is based on the recent development of a mathematical theory of quantum localization which introduces for each type of carrier a spatial function called localization landscape. These landscapes allow us to predict the localization regions of electron and hole quantum states, their corresponding energies, and the local densities of states. We show how the various outputs of these landscapes can be directly implemented into a drift-diffusion model of carrier transport and into the calculation of absorption/emission transitions. This creates a new computational model which accounts for disorder localization effects while also capturing two major effects of quantum mechanics, namely, the reduction of barrier height (tunneling effect) and the raising of energy ground states (quantum confinement effect), without having to solve the Schrödinger equation. Finally, this model is applied to several one-dimensional structures such as single quantum wells, ordered and disordered superlattices, or multiquantum wells, where comparisons with exact Schrödinger calculations demonstrate the excellent accuracy of the approximation provided by the landscape theory.
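
    A one-dimensional illustration of the landscape construction described above: for a finite-difference Hamiltonian H = -d^2/dx^2 + V(x) with a random potential, the landscape u solves Hu = 1 and the effective potential W = 1/u outlines where low-energy states localize. The discretization and parameter values are mine, chosen only for illustration.

```python
# 1D localization landscape: solve H u = 1 and compare W = 1/u with the
# ground state of the same disordered Hamiltonian.
import numpy as np

rng = np.random.default_rng(0)
n, dx = 400, 1.0
V = rng.uniform(0.0, 4.0, n)                  # disordered on-site potential

# Dirichlet finite-difference Hamiltonian: H = -Laplacian/dx^2 + diag(V)
main = 2.0 / dx**2 + V
off = -np.ones(n - 1) / dx**2
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

u = np.linalg.solve(H, np.ones(n))            # landscape: H u = 1
W = 1.0 / u                                   # effective confining potential

# Landscape theory predicts low-energy states localize in deep valleys of W;
# compare the ground-state peak location with the deepest valley.
eigvals, eigvecs = np.linalg.eigh(H)
ground = np.abs(eigvecs[:, 0])
print("Ground-state peak at site", int(np.argmax(ground)),
      "| deepest landscape valley at site", int(np.argmin(W)))
```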

  20. Integration of mathematical models in marketing theory and practice

    Directory of Open Access Journals (Sweden)

    Ioana Olariu

    2012-12-01

    Full Text Available This article is a theoretical approach to the main mathematical models used in marketing practice. The application of general systems theory in marketing involves setting behavioral assumptions as models of various processes. These models have, on the one hand, to describe the interactions between environmental and system factors and, on the other, to identify the causal dependencies existing in these interactions. Since models are the means by which consequences can be drawn from possible solutions, they occupy a central role in the design of a system to solve a marketing problem. A model is a simplified representation that describes and conceptualizes phenomena and real-life situations. The purpose of a model is to facilitate understanding of the real system. Models are widely used in marketing and take different forms that facilitate understanding of marketing realities.

  1. Heat, Moisture, and Momentum Budgets of Isolated Deep Midlatitude and Tropical Convective Clouds as Diagnosed from Three-Dimensional Model Output. Part I: Control Experiments.

    Science.gov (United States)

    Schlesinger, Robert E.

    1994-12-01

    This project uses a three-dimensional anelastic cloud model with a simple ice phase parameterization to evaluate the feedback between isolated deep convective clouds and their near surroundings. The horizontal Reynolds averaging approach of Anthes is adopted to diagnose the vertical profiles of the individual budget terms for heat, moisture, and horizontal momentum, as well as the resultant effects of each budget as defined by apparent sources or sinks. The averaging area, 33.75 km on a side, is comparable to one grid cell for typical mesoscale numerical weather prediction models. Two comparative simulations are run, one for a severe Oklahoma thunderstorm in strong vertical wind shear and the other for a tropical Atlantic cumulonimbus in much weaker shear. The midlatitude cloud evolves to a vigorous quasi-steady mature stage with several supercell characteristics including an erect large-diameter updraft, a strong and vertically extensive mesolow, and a well-developed highly asymmetric cold pool that spreads rapidly. In contrast, the tropical updraft is much narrower and slower with a shallow weak midlevel mesolow, leans markedly downshear, and evolves early into slow decay modulated by bubblelike pulsations, while the cold pool is weak and quasi-circular and spreads slowly. There are several similarities between corresponding budgets in the two runs. Most notably: 1) The heat and moisture budgets are dominated by condensation, which is maximized in the midtroposphere. 2) The horizontal pressure gradient force dominates the momentum budget. 3) Vertical eddy transport (flux divergence) is highly important to each budget. Thermodynamically, it acts to mainly cool and dry the lower troposphere, while warming and moistening the upper troposphere, though with a lower crossover level for moisture than for heat. 4) The altitudes of the peak apparent heat sources are determined by the vertical eddy transport of heat. 5) Net evaporation has 40% as much amplitude as the
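
    As a reminder of what such Reynolds-averaged budget diagnostics typically compute, the apparent heat source and apparent moisture sink can be written in the standard (Yanai-type) form, which is assumed here since the record does not reproduce the equations; overbars denote the horizontal average over the budget area:

        $$ Q_1 \equiv \frac{\partial \bar{s}}{\partial t} + \nabla\cdot(\bar{s}\,\bar{\mathbf{v}}) + \frac{\partial (\bar{s}\,\bar{\omega})}{\partial p}, \qquad Q_2 \equiv -L\left[\frac{\partial \bar{q}}{\partial t} + \nabla\cdot(\bar{q}\,\bar{\mathbf{v}}) + \frac{\partial (\bar{q}\,\bar{\omega})}{\partial p}\right], $$

    with s the dry static energy and q the water-vapour mixing ratio; Q_1 and Q_2 then decompose into net condensation heating, radiative heating and the vertical eddy-flux convergence terms whose dominant role is summarised in the numbered points above.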

  2. Estimation of a four-parameter item response theory model.

    Science.gov (United States)

    Loken, Eric; Rulison, Kelly L

    2010-11-01

    We explore the justification and formulation of a four-parameter item response theory model (4PM) and employ a Bayesian approach to recover successfully parameter estimates for items and respondents. For data generated using a 4PM item response model, overall fit is improved when using the 4PM rather than the 3PM or the 2PM. Furthermore, although estimated trait scores under the various models correlate almost perfectly, inferences at the high and low ends of the trait continuum are compromised, with poorer coverage of the confidence intervals when the wrong model is used. We also show in an empirical example that the 4PM can yield new insights into the properties of a widely used delinquency scale. We discuss the implications for building appropriate measurement models in education and psychology to model more accurately the underlying response process.
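
    For reference, the four-parameter logistic item response function referred to here has the standard form (notation assumed; the record does not spell it out):

        $$ P(X_{ij}=1 \mid \theta_i) = c_j + (d_j - c_j)\,\frac{1}{1 + \exp[-a_j(\theta_i - b_j)]}, \qquad 0 \le c_j < d_j \le 1, $$

    where a_j, b_j, c_j and d_j are the discrimination, difficulty, lower asymptote ("guessing") and upper asymptote ("slipping") parameters; the 3PM is recovered by fixing d_j = 1 and the 2PM by additionally fixing c_j = 0.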

  3. A parameter optimization tool for evaluating the physical consistency of the plot-scale water budget of the integrated eco-hydrological model GEOtop in complex terrain

    Science.gov (United States)

    Bertoldi, Giacomo; Cordano, Emanuele; Brenner, Johannes; Senoner, Samuel; Della Chiesa, Stefano; Niedrist, Georg

    2017-04-01

    In mountain regions, the plot- and catchment-scale water and energy budgets are controlled by a complex interplay of abiotic (i.e. topography, geology, climate) and biotic (i.e. vegetation, land management) factors. When integrated, physically-based eco-hydrological models are used in mountain areas, a large number of parameters, topographic settings and boundary conditions need to be chosen. However, data on soil and land-cover properties are relatively scarce and do not reflect the strong variability at the local scale. For this reason, tools for uncertainty quantification and optimal parameter identification are essential not only to improve model performance, but also to identify the most relevant parameters to be measured in the field and to evaluate the impact of different assumptions for topographic and boundary conditions (surface, lateral and subsurface water and energy fluxes), which are usually unknown. In this contribution, we present the results of a sensitivity analysis exercise for a set of 20 experimental stations located in the Italian Alps, representative of different conditions in terms of topography (elevation, slope, aspect), land use (pastures, meadows, and apple orchards), soil type and groundwater influence. Besides micrometeorological parameters, each station provides soil water content at different depths, and three stations (one for each land cover) also provide eddy covariance fluxes. The aims of this work are: (I) to present an approach for improving calibration of plot-scale soil moisture and evapotranspiration (ET); (II) to identify the most sensitive parameters and relevant factors controlling temporal and spatial differences among sites; (III) to identify possible model structural deficiencies or uncertainties in boundary conditions. Simulations have been performed with the GEOtop 2.0 model, which is a physically-based, fully distributed integrated eco-hydrological model that has been specifically designed for mountain

  4. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  5. Spatially random models, estimation theory, and robot arm dynamics

    Science.gov (United States)

    Rodriguez, G.

    1987-01-01

    Spatially random models provide an alternative to the more traditional deterministic models used to describe robot arm dynamics. These alternative models can be used to establish a relationship between the methodologies of estimation theory and robot dynamics. A new class of algorithms for many of the fundamental robotics problems of inverse and forward dynamics, inverse kinematics, etc. can be developed that use computations typical in estimation theory. The algorithms make extensive use of the difference equations of Kalman filtering and Bryson-Frazier smoothing to conduct spatial recursions. The spatially random models are very easy to describe and are based on the assumption that all of the inertial (D'Alembert) forces in the system are represented by a spatially distributed white-noise model. The models can also be used to generate numerically the composite multibody system inertia matrix. This is done without resorting to the more common methods of deterministic modeling involving Lagrangian dynamics, Newton-Euler equations, etc. These methods make substantial use of human knowledge in derivation and manipulation of equations of motion for complex mechanical systems.
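
    Since the abstract singles out the difference equations of Kalman filtering as the computational core of these spatially recursive algorithms, a minimal, generic sketch of that recursion is given below; the state-space matrices and the toy data are illustrative placeholders, not the robot-dynamics quantities of the paper.

        # Minimal predict/update recursion of the discrete Kalman filter.
        # F, H, Q, R and the measurements below are made-up placeholders.
        import numpy as np

        def kalman_filter(F, H, Q, R, x0, P0, measurements):
            """Run the standard Kalman predict/update steps over a sequence of measurements."""
            x, P = x0, P0
            estimates = []
            for z in measurements:
                # Predict: propagate the state mean and covariance one step along the chain.
                x = F @ x
                P = F @ P @ F.T + Q
                # Update: fold in the new measurement.
                S = H @ P @ H.T + R                 # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
                x = x + K @ (z - H @ x)
                P = (np.eye(len(x)) - K @ H) @ P
                estimates.append(x.copy())
            return estimates

        # Toy usage: a scalar random-walk state observed with noise.
        F = np.array([[1.0]]); H = np.array([[1.0]])
        Q = np.array([[0.01]]); R = np.array([[0.1]])
        zs = [np.array([v]) for v in (0.9, 1.1, 1.0, 1.2)]
        print(kalman_filter(F, H, Q, R, np.array([0.0]), np.array([[1.0]]), zs))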

  6. The "covariation method" for estimating the parameters of the standard Dynamic Energy Budget model II: Properties and preliminary patterns

    Science.gov (United States)

    Lika, Konstadia; Kearney, Michael R.; Kooijman, Sebastiaan A. L. M.

    2011-11-01

    The covariation method for estimating the parameters of the standard Dynamic Energy Budget (DEB) model provides a single-step method of accessing all the core DEB parameters from commonly available empirical data. In this study, we assess the robustness of this parameter estimation procedure and analyse the role of pseudo-data using elasticity coefficients. In particular, we compare the performance of Maximum Likelihood (ML) vs. Weighted Least Squares (WLS) approaches and find that the two approaches tend to converge in performance as the number of uni-variate data sets increases, but that WLS is more robust when data sets comprise single points (zero-variate data). The efficiency of the approach is shown to be high, and the prior parameter estimates (pseudo-data) have very little influence if the real data contain information about the parameter values. For instance, the effect of the pseudo-value for the allocation fraction κ is reduced when there is information for both growth and reproduction, that for the energy conductance is reduced when information on age at birth and puberty is given, and the effect of the pseudo-value for the maturity maintenance rate coefficient is insignificant. The estimation of some parameters (e.g., the zoom factor and the shape coefficient) requires little information, while that of others (e.g., maturity maintenance rate, puberty threshold and reproduction efficiency) requires data at several food levels. The generality of the standard DEB model, in combination with the estimation of all of its parameters, allows comparison of species on the basis of parameter values. We discuss a number of preliminary patterns emerging from the present collection of parameter estimates across a wide variety of taxa. We make the observation that the estimated value of the fraction κ of mobilised reserve that is allocated to soma is far away from the value that maximises reproduction. We recognise this as the reason why two very different
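
    To make the estimation idea concrete, here is a minimal sketch of a weighted-least-squares objective in which uni-variate data sets, zero-variate (single-point) data and pseudo-data all enter one loss; the toy growth model, weights and numbers are assumptions of this illustration, not DEB theory itself.

        # Weighted least squares over a mixture of uni-variate and zero-variate data sets.
        import numpy as np
        from scipy.optimize import minimize

        def wls_loss(params, datasets):
            """datasets: list of (predict_fn, observed_values, weight) tuples."""
            loss = 0.0
            for predict, obs, w in datasets:
                pred = np.atleast_1d(predict(params))
                obs = np.atleast_1d(obs)
                # Relative squared error, so data sets with different units are comparable.
                loss += w * np.mean(((pred - obs) / obs) ** 2)
            return loss

        # Toy usage: fit ultimate length, growth rate and length at birth from a
        # length-at-age curve (uni-variate) plus one zero-variate datum.
        ages = np.array([0.5, 1.0, 2.0, 4.0])
        lengths = np.array([2.1, 3.4, 4.9, 6.0])

        def length_at_age(p):
            L_inf, k, L_b = p
            return L_b + (L_inf - L_b) * (1 - np.exp(-k * ages))

        datasets = [
            (length_at_age, lengths, 1.0),   # uni-variate data set
            (lambda p: p[2], 1.1, 1.0),      # zero-variate datum: length at birth
        ]
        res = minimize(wls_loss, x0=[6.0, 0.5, 1.0], args=(datasets,), method="Nelder-Mead")
        print(res.x)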

  7. Nonlinear structural mechanics theory, dynamical phenomena and modeling

    CERN Document Server

    Lacarbonara, Walter

    2013-01-01

    Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...

  8. Modeling of active transmembrane transport in a mixture theory framework.

    Science.gov (United States)

    Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T

    2010-05-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.

  9. Groundwater Modeling as an Alternative Approach to Limited Data in the Northeastern Part of Mt. Hermon (Syria, to Develop a Preliminary Water Budget

    Directory of Open Access Journals (Sweden)

    Nazeer M. Asmael

    2015-07-01

    Full Text Available In developing countries such as Syria, the lack of hydrological data affects groundwater resource assessment. Groundwater models provide the means to fill the gaps in the available data in order to improve the understanding of groundwater systems. The study area can be considered as the main recharge area of the eastern side of Barada and Awaj basin in the eastern part of Mt. Hermon. The withdrawal for agricultural and domestic purposes removes a considerable amount of water. The steady-state three-dimensional (3D) groundwater model (FEFLOW), which is an advanced finite element groundwater flow and transport modeling tool, was used to quantify groundwater budget components by using all available data of hydrological year 2009–2010. The results obtained may be considered as an essential tool for groundwater management options in the study area. The calibrated model demonstrates a good agreement between the observed and simulated hydraulic head. The result of the sensitivity analysis shows that the model is highly sensitive to hydraulic conductivity changes and sensitive to a lesser extent to water recharge amount. Regarding the upper aquifer horizon, the water budget under steady-state condition indicates that the lateral groundwater inflow from the Jurassic aquifer into this horizon is the most important recharge component. The major discharge component from this aquifer horizon occurs at its eastern boundary toward the outside of the model domain. The model was able to produce a satisfying estimation of the preliminary water budget of the upper aquifer horizon, which indicates a positive imbalance of 4.6 Mm³·y⁻¹.

  10. An organizational model to distinguish between and integrate research and evaluation activities in a theory based evaluation.

    Science.gov (United States)

    Sample McMeeking, Laura B; Basile, Carole; Brian Cobb, R

    2012-11-01

    Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its well-implemented use, probably due to the time and money needed for planning and to confusion over the definitions of research and evaluation functions and roles. In this paper, we describe the development of a theory-based evaluation design in a Math and Science Partnership (MSP) research project funded by the U.S. National Science Foundation (NSF). Through this work we developed an organizational model distinguishing between and integrating evaluation and research functions, explicating personnel roles and responsibilities, and highlighting connections between research and evaluation work. Although the research and evaluation components operated with independent budgets, staffing, and implementation activities, we were able to combine datasets across activities to allow us to assess the integrity of the program theory, not just the hypothesized connections within it. This model has since been used for proposal development and has been invaluable as it creates a research and evaluation plan that is seamless from the beginning.

  11. Molecular Thermodynamic Modeling of Fluctuation Solution Theory Properties

    DEFF Research Database (Denmark)

    O’Connell, John P.; Abildskov, Jens

    2013-01-01

    Fluctuation Solution Theory provides relationships between integrals of the molecular pair total and direct correlation functions and the pressure derivative of solution density, partial molar volumes, and composition derivatives of activity coefficients. For dense fluids, the integrals follow a relatively simple corresponding-states behavior even for complex systems, show well-defined relationships for infinite dilution properties in complex and near-critical systems, allow estimation of mixed-solvent solubilities of gases and pharmaceuticals, and can be expressed by simple perturbation models...
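
    The kind of relationship meant here can be illustrated with the standard Kirkwood-Buff/compressibility form, quoted as an assumption of this note rather than from the abstract: the total-correlation-function integral

        $$ G_{ij} = 4\pi \int_0^\infty \left[ g_{ij}(r) - 1 \right] r^2 \, dr $$

    links microscopic structure to macroscopic thermodynamics; for a pure fluid, for example,

        $$ \rho k_B T\, \kappa_T = 1 + \rho G = \frac{1}{1 - \rho \hat{c}(0)}, \qquad \hat{c}(0) = \int c(r)\, d^3r, $$

    with analogous expressions giving partial molar volumes and activity-coefficient derivatives in mixtures.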

  12. Historical analysis and modeling of the forest carbon dynamics using the Carbon Budget Model: an example for the Trento Province (NE, Italy

    Directory of Open Access Journals (Sweden)

    Pilli R

    2014-02-01

    Full Text Available The Carbon Budget Model (CBM-CFS3) developed by the Canadian Forest Service was applied to data collected by the last Italian National Forest Inventory (INFC) for the Trento Province (NE, Italy). CBM was modified and adapted to the different management types (i.e., even-aged high forests, uneven-aged high forests and coppices) and silvicultural systems (including clear cuts, single tree selection systems and thinning) applied in this province. The aim of this study was to provide an example of down-scaling of this model from a national to a regional scale, providing (i) an historical analysis, from 1995 to 2011, and (ii) a projection, from 2012 to 2020, of the forest biomass and the carbon stock evolution. The analysis was based on the harvest rate reported by the Italian National Institute of Statistics (from 1995 to 2011), corrected according to the last INFC data and distinguished between timber and fuel woods and between conifers and broadleaves. Since 2012, we applied a constant harvest rate, equal to about 1300 Mm3 yr-1, estimated from the average harvest rate for the period 2006-2011. Model results were consistent with similar data reported in the literature. The average biomass C stock was 90 Mg C ha-1 and the biomass C stock change was 0.97 Mg C ha-1 yr-1 and 0.87 Mg C ha-1 yr-1, for the periods 1995-2011 and 2012-2020, respectively. The C stock cumulated by the timber products since 1995 was 96 Gg C yr-1, i.e., about 28% of the average annual C stock change of the forests, equal to 345 Gg C yr-1. CBM also provided estimates on the evolution of the age class distribution of the even-aged forests and on the C stock of the DOM forest pools (litter, dead wood and soil). This study demonstrates the utility of CBM to provide estimates at a regional or local scale, using not only the data provided by the forest

  13. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier

    2016-12-01

    Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimaginings: Agent-Based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  14. Game Theory Models for Multi-Robot Patrolling of Infrastructures

    Directory of Open Access Journals (Sweden)

    Erik Hernández

    2013-03-01

    Full Text Available This work is focused on the problem of performing multi-robot patrolling for infrastructure security applications in order to protect a known environment at critical facilities. Thus, given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Existing solutions for these types of applications are predictable and inflexible. Moreover, most of the previous work has tackled the patrolling problem with centralized and deterministic solutions, and only a few efforts have been made to integrate dynamic methods. Therefore, one of the main contributions of this work is the development of new dynamic and decentralized collaborative approaches in order to solve the aforementioned problem by implementing learning models from Game Theory. The model selected in this work, which includes belief-based and reinforcement models as special cases, is called Experience-Weighted Attraction. The problem has been defined using concepts of Graph Theory to represent the environment in order to work with such Game Theory techniques. Finally, the proposed methods have been evaluated experimentally by using a patrolling simulator. The results obtained have been compared with previously available approaches.
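
    As a concrete illustration of the learning rule named above, the following is a minimal sketch of an Experience-Weighted Attraction update in the Camerer-Ho style; the payoff model, parameter values and the "patrol route" interpretation are placeholders invented for this sketch, not details from the paper.

        # Experience-Weighted Attraction (EWA): attractions blend reinforcement of the
        # chosen strategy with delta-weighted reinforcement of forgone payoffs.
        import numpy as np

        def ewa_update(A, N, chosen, payoffs, phi=0.9, rho=0.9, delta=0.5):
            """One EWA step. A: attractions, N: experience weight,
            chosen: index of the strategy actually played,
            payoffs: payoff each strategy would have earned this round."""
            N_new = rho * N + 1.0
            weight = delta + (1.0 - delta) * (np.arange(len(A)) == chosen)
            A_new = (phi * N * A + weight * payoffs) / N_new
            return A_new, N_new

        def choice_probabilities(A, lam=2.0):
            """Logit response: higher attraction means a higher choice probability."""
            e = np.exp(lam * (A - A.max()))
            return e / e.sum()

        # Toy usage: an agent repeatedly choosing among 3 candidate patrol routes.
        rng = np.random.default_rng(0)
        A, N = np.zeros(3), 1.0
        for _ in range(50):
            chosen = rng.choice(3, p=choice_probabilities(A))
            payoffs = rng.normal(loc=[1.0, 0.5, 0.2], scale=0.1)  # noisy route payoffs
            A, N = ewa_update(A, N, chosen, payoffs)
        print(choice_probabilities(A))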

  15. Beyond Zero Based Budgeting.

    Science.gov (United States)

    Ogden, Daniel M., Jr.

    1978-01-01

    Suggests that the most practical budgeting system for most managers is a formalized combination of incremental and zero-based analysis because little can be learned about most programs from an annual zero-based budget. (Author/IRT)

  16. Fiscal Year 2015 Budget

    Data.gov (United States)

    Montgomery County of Maryland — This dataset includes the Fiscal Year 2015 Council-approved operating budget for Montgomery County. The dataset does not include revenues and detailed agency budget...

  17. Quantile hydrologic model selection and model structure deficiency assessment : 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  18. Application of the evolution theory in modelling of innovation diffusion

    Directory of Open Access Journals (Sweden)

    Krstić Milan

    2016-01-01

    Full Text Available The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In this sense, so-called 'memetic evolution' is now widely accepted. Memes constitute a complex adaptive system, in which one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information that can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area in which the theory of evolution can also be successfully applied. In this work the authors start from the assumption that the theory of evolution can likewise be applied to the modelling of the process of innovation diffusion. Based on the theoretical research conducted, the authors conclude that the diffusion of an innovation, interpreted in terms of 'memes', is in fact the imitation of the innovation 'meme'. Since certain 'memes' replicate more successfully than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their longevity, fecundity and faithfulness of replication are of key importance. The results of the research confirm the assumption that the theory of evolution can be applied to innovation diffusion by means of innovation 'memes', which opens up perspectives for new research on the subject.

  19. Models of rational decision making in contemporary economic theory

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2015-01-01

    Full Text Available The aim of this paper is to show that economists cannot adequately explain rational behavior if they focus only on the scientific observations derived from the model of full rationality and the model of instrumental rationality, and that including related models yields a 'larger view' which, as a more representative reflection of rational behavior, provides a solid basis for constructing a model of decision making in contemporary economic science. Taking into account the goal of the work and its specific character, we have structured the paper accordingly. In the first part, we define the model of full rationality and its important characteristics. In the second part, we analyze the model of instrumental rationality; in this analysis, we start from the statement, given in economic theory, that the rational actor uses the best means to achieve their objectives. In the third part, we consider the basics of the model of value rationality. In the fourth part, we consider the key characteristics of the model of bounded rationality. In the last part, we focus on questioning the basic assumptions of the model of full rationality and the model of instrumental rationality. In particular, we analyze the personal and social goal preferences of high school and university students.

  20. CMB anomalies from an inflationary model in string theory

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhi-Guo; Piao, Yun-Song [University of Chinese Academy of Sciences, School of Physics, Beijing (China); Guo, Zong-Kuan [Chinese Academy of Sciences, State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, P.O. Box 2735, Beijing (China)

    2014-08-15

    Recent Planck measurements show some CMB anomalies on large angular scales, which confirms the early observations by WMAP. We show that an inflationary model, in which before the slow-roll inflation the Universe is in a superinflationary phase, can generate a large-scale cutoff in the primordial power spectrum, which may account for not only the power suppression on large angular scales, but also a large dipole power asymmetry in the CMB. We discuss an implementation of our model in string theory. (orig.)

  1. Theory and Circuit Model for Lossy Coaxial Transmission Line

    Energy Technology Data Exchange (ETDEWEB)

    Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert

    2017-04-01

    The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
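
    For orientation, the classical high-frequency skin-effect approximations that such an analysis refines are as follows (these textbook formulas are an assumption of this note, not the paper's new expressions; a and b are the inner and outer conductor radii):

        $$ Z_0 \simeq \frac{1}{2\pi}\sqrt{\frac{\mu}{\epsilon}}\,\ln\frac{b}{a}, \qquad R_s = \sqrt{\frac{\pi f \mu_c}{\sigma}}, \qquad R' = \frac{R_s}{2\pi}\left(\frac{1}{a} + \frac{1}{b}\right), \qquad \alpha_c \simeq \frac{R'}{2 Z_0}, $$

    where R_s is the surface resistance of the conductors, R' the series resistance per unit length, and α_c the conductor-loss attenuation in nepers per unit length; the improved formulas reported above are stated to remain accurate from DC (where this simple skin-effect picture breaks down) up to 100 GHz.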

  2. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling several phenomena in nature and society, highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli

  3. Thick brane models in generalized theories of gravity

    Directory of Open Access Journals (Sweden)

    D. Bazeia

    2015-04-01

    Full Text Available This work deals with thick braneworld models, in an environment where the Ricci scalar is changed to accommodate the addition of two extra terms, one depending on the Ricci scalar itself, and the other, which takes into account the trace of the energy–momentum tensor of the scalar field that sources the braneworld scenario. We suppose that the scalar field engenders standard kinematics, and we show explicitly that the gravity sector of this new braneworld scenario is linearly stable. We illustrate the general results investigating two distinct models, focusing on how the brane profile is changed in the modified theories.

  4. Theories linguistiques, modeles informatiques, experimentation psycholinguistique (Linguistic Theories, Information-Processing Models, Psycholinguistic Experimentation)

    Science.gov (United States)

    Dubois, Daniele

    1975-01-01

    Delineates and elaborates upon the underlying psychological postulates in linguistic and information-processing models, and shows the interdependence of psycholinguistics and linguistic analysis. (Text is in French.) (DB)

  5. Between Bedside and Budget

    NARCIS (Netherlands)

    J.L.T. Blank; E. Eggink

    1998-01-01

    Original title: Tussen bed en budget. The report Between bedside and budget (Tussen bed en budget) describes an extensive empirical study of the efficiency of general and university hospitals in the Netherlands. A policy summary recaps the main findings of the study. Those findings

  6. Library Budget Primer.

    Science.gov (United States)

    Warner, Alice Sizer

    1993-01-01

    Discusses the advantages and disadvantages of six types of budgets commonly used by many different kinds of libraries. The budget types covered are lump-sum; formula; line or line-item; program; performance or function; and zero-based. Accompanying figures demonstrate the differences between four of the budget types. (three references) (KRN)

  7. Understanding Long-term, Large-scale Shoreline Change and the Sediment Budget on Fire Island, NY, using a 3D hydrodynamics-based model

    Science.gov (United States)

    List, J. H.; Safak, I.; Warner, J. C.; Schwab, W. C.; Hapke, C. J.; Lentz, E. E.

    2016-02-01

    The processes responsible for long-term (decadal) shoreline change and the related imbalance in the sediment budget on Fire Island, a 50 km long barrier island on the south coast of Long Island, NY, have been the subject of debate. The estimated net rate of sediment leaving the barrier at the west end of the island is approximately double the estimated net rate of sediment entering in the east, but the island-wide average sediment volume change associated with shoreline change is near zero and cannot account for this deficit. A long-held hypothesis is that onshore sediment flux from the inner continental shelf within the western half of the island is responsible for balancing the sediment budget. To investigate this possibility, we use a nested, 3-D, hydrodynamics-based modeling system (COAWST) to simulate the island-wide alongshore and cross-shore transport, in combination with shoreline change observations. The modeled, net alongshore transport gradients in the nearshore predict that the central part of Fire Island should be erosional, yet shoreline change observations show this area to be accretionary. We compare the model-predicted alongshore transport gradients with the flux gradients that would be required to generate the observed shoreline change, to give the pattern of sediment volume gains or losses that cannot be explained by the modeled alongshore transport gradients. Results show that the western 30 km of coast requires an input of sediment, supporting the hypothesis of onshore flux in this area. The modeled cross-shore flux of sediment between the shoreface and inner shelf is consistent with these results, with onshore-directed bottom currents creating an environment more conducive to onshore sediment flux in the western 30 km of the island compared to the eastern 20 km. We conclude that the cross-shore flux of sediment can explain the shoreline change observations, and is an integral component of Fire Island's sediment budget.
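
    The budget argument sketched above is essentially a one-dimensional sediment conservation statement; in the schematic form assumed here (not quoted from the study),

        $$ \frac{\partial V}{\partial t} = -\frac{\partial Q_y}{\partial y} + q_x, $$

    where V is the sediment volume per unit length of shoreline (inferred from the observed shoreline change), Q_y the net alongshore transport and q_x the cross-shore (onshore) supply per unit length; the residual between the observed volume change and the modelled alongshore-gradient term is what is attributed to onshore flux from the inner shelf.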

  8. Standard Model in multiscale theories and observational constraints

    Science.gov (United States)

    Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David

    2016-08-01

    We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, obtaining an absolute upper bound on t* (corresponding to an energy scale of order 28 TeV). Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain independent absolute bounds on the characteristic scales (of order 35 MeV); for α0 = 1/2, the Lamb shift alone yields a bound of order 450 GeV.

  9. The linear model and hypothesis a general unifying theory

    CERN Document Server

    Seber, George

    2015-01-01

    This book provides a concise and integrated overview of hypothesis testing in four important subject areas, namely linear and nonlinear models, multivariate analysis, and large sample theory. The approach used is a geometrical one based on the concept of projections and their associated idempotent matrices, thus largely avoiding the need to involve matrix ranks. It is shown that all the hypotheses encountered are either linear or asymptotically linear, and that all the underlying models used are either exactly or asymptotically linear normal models. This equivalence can be used, for example, to extend the concept of orthogonality in the analysis of variance to other models, and to show that the asymptotic equivalence of the likelihood ratio, Wald, and Score (Lagrange Multiplier) hypothesis tests generally applies.

  10. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for construction of necessary conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.
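
    The Data Rate Theorem invoked here has a compact standard statement (given in its usual control-theory form, which is an assumption of this note rather than text from the article): a noisy, unstable linear plant can be stabilised by feedback only if the information rate of the control channel exceeds the rate at which the plant generates instability,

        $$ R > \sum_{|\lambda_i| \ge 1} \log_2 |\lambda_i|, $$

    where the λ_i are the eigenvalues of the open-loop system matrix; the article's argument transfers this threshold behaviour to physiological regulation such as the HPA axis, where chronic stress effectively reduces the available regulatory "channel capacity".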

  11. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
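
    Schematically, the one-loop running implemented by the SMEFTrunner module is of the standard form (generic notation assumed here; for the actual conventions and operator basis see the DsixTools documentation and the original anomalous-dimension papers):

        $$ \mu \frac{d C_i(\mu)}{d\mu} = \frac{1}{16\pi^2} \sum_j \gamma_{ij}\, C_j(\mu), $$

    where the C_i are the Warsaw-basis dimension-six Wilson coefficients and γ_ij is the one-loop anomalous-dimension matrix; an analogous evolution is applied to the Weak Effective Theory coefficients below the electroweak scale.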

  12. Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model

    Science.gov (United States)

    Aktaruzzaman, Md; Plunkett, Margaret

    2016-01-01

    Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…

  13. Plane answers to complex questions the theory of linear models

    CERN Document Server

    Christensen, Ronald

    1987-01-01

    This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...

  14. Comparison of Polytomous Parametric and Nonparametric Item Response Theory Models

    Directory of Open Access Journals (Sweden)

    Özge BIKMAZ BİLGEN

    2017-12-01

    Full Text Available This research aimed to identify the effects of independent variables such as sample size, sample distribution, the number of items in the test, and the number of response categories of the items on estimates obtained with the Graded Response Model (GRM) under Parametric Item Response Theory (PIRT) and with the Monotone Homogeneity Model (MHM) under Non-Parametric Item Response Theory (NIRT) for polytomously scored items. To achieve this aim, a simulation study was performed in which 192 conditions were designed by combining sample size, sample distribution, the number of items, and the number of categories of items. Estimates from GRM and MHM were examined under different levels of sample size (N = 100, 250, 500, 1000), sample distribution (normal, skewed), number of items (10, 20, 40, 80), and number of categories of items (3, 5, 7), by calculating model-data fit, reliability values, and the standard errors of the parameters. The study found that, because the values used to evaluate model-data fit are influenced by changes in the independent variables and cannot be interpreted on their own, it is difficult to compare and generalize the results; the fact that model-data fit in MHM can be calculated in a practical way and interpreted without the need for another value gives it an advantage over GRM. Another result is that the reliability values are similar for both models. The standard errors of the MHM parameter estimates are lower than those of the GRM under small-sample and few-item conditions, and the standard errors of the MHM parameter estimates are close to each other under all conditions.
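
    For reference, the Graded Response Model compared here specifies cumulative category probabilities of the two-parameter logistic form (standard Samejima notation, assumed rather than quoted from the record): for an item j with ordered categories k = 0, ..., m,

        $$ \Pr(X_j \ge k \mid \theta) = \frac{1}{1 + \exp[-a_j(\theta - b_{jk})]} \quad (k \ge 1), \qquad \Pr(X_j = k \mid \theta) = \Pr(X_j \ge k \mid \theta) - \Pr(X_j \ge k+1 \mid \theta), $$

    with Pr(X_j ≥ 0 | θ) = 1 and Pr(X_j ≥ m+1 | θ) = 0; the Monotone Homogeneity Model, by contrast, only requires these item step response functions to be non-decreasing in θ, without assuming a parametric shape.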

  15. Zero Based Budgeting for Voc Ed

    Science.gov (United States)

    Chuang, Ying C.

    1977-01-01

    To help vocational education budget planners take a good look each year at where they are going, what they are trying to accomplish, and where to put their money, this article describes the 12 steps in a model commonly used for zero based budgeting. (Author/HD)

  16. Future projections of the surface heat and water budgets of the Mediterranean Sea in an ensemble of coupled atmosphere-ocean regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, C.; Somot, S.; Deque, M.; Sevault, F. [CNRM-GAME, Meteo-France, CNRS, Toulouse (France); Calmanti, S.; Carillo, A.; Dell' Aquilla, A.; Sannino, G. [ENEA, Rome (Italy); Elizalde, A.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Gualdi, S.; Oddo, P.; Scoccimarro, E. [INGV, Bologna (Italy); L' Heveder, B.; Li, L. [Laboratoire de Meteorologie Dynamique, Paris (France)

    2012-10-15

    Within the CIRCE project "Climate change and Impact Research: the Mediterranean Environment", an ensemble of high resolution coupled atmosphere-ocean regional climate models (AORCMs) is used to simulate the Mediterranean climate for the period 1950-2050. For the first time, realistic net surface air-sea fluxes are obtained. The sea surface temperature (SST) variability is consistent with the atmospheric forcing above it and with oceanic constraints. The surface fluxes respond to external forcing under a warming climate and show an equivalent trend in all models. This study focuses on the present day and on the evolution of the heat and water budget over the Mediterranean Sea under the SRES-A1B scenario. Contrary to previous studies, the net total heat budget is negative over the present period in all AORCMs and satisfies heat budget closure, controlled by a net positive heat gain at the Strait of Gibraltar in the present climate. Under the climate change scenario, some models predict a warming of the Mediterranean Sea from the ocean surface (positive net heat flux) in addition to the positive flux at the Strait of Gibraltar for the 2021-2050 period. The shortwave and latent fluxes increase and the longwave and sensible fluxes decrease compared to the 1961-1990 period, due to a reduction of the cloud cover and an increase in greenhouse gases (GHGs) and SSTs over the 2021-2050 period. The AORCMs provide good estimates of the water budget, with a drying of the region during the twenty-first century. For the ensemble mean, the decrease in precipitation and runoff is about 10 and 15% respectively, and the increase in evaporation is much weaker, about 2%, compared to the 1961-1990 period, which confirms results obtained in recent studies. Despite a clear consistency in the trends and results between the models, this study also underlines important differences in the model set-ups, methodology and choices of some physical parameters inducing
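
    The closure argument in this record can be written compactly; in the schematic sign convention assumed here (fluxes positive into the sea), the basin-mean surface heat flux and the exchange at Gibraltar must together balance the heat-content change,

        $$ Q_{net} = Q_{SW} - Q_{LW} - Q_{LE} - Q_{H}, \qquad \overline{Q_{net}}\,A + H_{Gib} \approx \rho c_p V \frac{d\langle T \rangle}{dt}, $$

    so a negative present-day basin-mean surface flux is compensated by a positive heat gain at the strait; the analogous water-budget statement balances the freshwater deficit E − P − R against the net inflow through Gibraltar.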

  17. Theory, modelling and simulation in origins of life studies.

    Science.gov (United States)

    Coveney, Peter V; Swadling, Jacob B; Wattis, Jonathan A D; Greenwell, H Christopher

    2012-08-21

    Origins of life studies represent an exciting and highly multidisciplinary research field. In this review we focus on the contributions made by theory, modelling and simulation to addressing fundamental issues in the domain and the advances these approaches have helped to make in the field. Theoretical approaches will continue to make a major impact at the "systems chemistry" level based on the analysis of the remarkable properties of nonlinear catalytic chemical reaction networks, which arise due to the auto-catalytic and cross-catalytic nature of so many of the putative processes associated with self-replication and self-reproduction. In this way, we describe inter alia nonlinear kinetic models of RNA replication within a primordial Darwinian soup, the origins of homochirality and homochiral polymerization. We then discuss state-of-the-art computationally-based molecular modelling techniques that are currently being deployed to investigate various scenarios relevant to the origins of life.

  18. Structure and asymptotic theory for nonlinear models with GARCH errors

    Directory of Open Access Journals (Sweden)

    Felix Chan

    2015-01-01

    Full Text Available Nonlinear time series models, especially those with regime-switching and/or conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we derive sufficient conditions for strict stationarity and ergodicity of three different specifications of the first-order smooth transition autoregressions with heteroskedastic errors. This is essential, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. We also provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator for a general nonlinear conditional mean model with first-order GARCH errors.
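
    As a concrete instance of the model class analysed (a first-order logistic smooth transition autoregression with GARCH(1,1) errors; the exact specifications treated in the paper may differ in detail), one can write

        $$ y_t = \phi_1 y_{t-1} + \phi_2 y_{t-1}\, G(y_{t-1}; \gamma, c) + \varepsilon_t, \qquad G(z; \gamma, c) = \frac{1}{1 + \exp[-\gamma(z - c)]}, $$
        $$ \varepsilon_t = \eta_t \sqrt{h_t}, \qquad h_t = \omega + \alpha \varepsilon_{t-1}^2 + \beta h_{t-1}, \qquad \eta_t \sim \text{iid}(0,1), $$

    and the stationarity, ergodicity and asymptotic-normality conditions referred to above amount to restrictions on (φ_1, φ_2, γ, ω, α, β) and the moments of η_t.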

  19. Dynamic statistical models of biological cognition: insights from communications theory

    Science.gov (United States)

    Wallace, Rodrick

    2014-10-01

    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  20. A queueing theory based model for business continuity in hospitals.

    Science.gov (United States)

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precisely defined succession of events, in which every phase is characterized by a waiting time that includes the working duration and possible delays. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, combining a risk analysis approach and a queueing theory model, to define the number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.
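
    A minimal sketch of the kind of queueing calculation implied here is an M/M/c (Erlang-C) sizing of the device pool; the arrival rate, service rate and waiting-probability target below are illustrative assumptions, not figures from the case study.

        # Erlang-C sizing: smallest number of devices c such that the probability
        # that a clinical request has to wait stays below a target.
        import math

        def erlang_c(c, lam, mu):
            """Probability of waiting in an M/M/c queue (requires lam < c * mu)."""
            a = lam / mu                                   # offered load in Erlangs
            rho = a / c
            summation = sum(a**k / math.factorial(k) for k in range(c))
            top = a**c / (math.factorial(c) * (1.0 - rho))
            return top / (summation + top)

        def devices_needed(lam, mu, max_wait_prob=0.05):
            """Increase c until the waiting-probability target is met."""
            c = math.floor(lam / mu) + 1                   # smallest c keeping the queue stable
            while erlang_c(c, lam, mu) > max_wait_prob:
                c += 1
            return c

        # Toy usage: 12 requests per hour, each occupying a device for 10 minutes on average.
        print(devices_needed(lam=12.0, mu=6.0))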

  1. Modelling apical constriction in epithelia using elastic shell theory.

    Science.gov (United States)

    Jones, Gareth Wyn; Chapman, S Jonathan

    2010-06-01

    Apical constriction is one of the fundamental mechanisms by which embryonic tissue is deformed, giving rise to the shape and form of the fully-developed organism. The mechanism involves a contraction of fibres embedded in the apical side of epithelial tissues, leading to an invagination or folding of the cell sheet. In this article the phenomenon is modelled mechanically by describing the epithelial sheet as an elastic shell, which contains a surface representing the continuous mesh formed from the embedded fibres. Allowing this mesh to contract, an enhanced shell theory is developed in which the stiffness and bending tensors of the shell are modified to include the fibres' stiffness, and in which the active effects of the contraction appear as body forces in the shell equilibrium equations. Numerical examples are presented at the end, including the bending of a plate and a cylindrical shell (modelling neurulation) and the invagination of a spherical shell (modelling simple gastrulation).

  2. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$ mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$ mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$ mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) ...

  3. Density Functional Theory and Materials Modeling at Atomistic Length Scales

    Directory of Open Access Journals (Sweden)

    Swapan K. Ghosh

    2002-04-01

    Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT) as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT, the central equation is a one-particle Schrödinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.
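
    For readers who want the equations behind this summary, the quantum side reduces to the Kohn-Sham system (written here in atomic units; this standard form is assumed, not quoted from the article):

        $$ \left[ -\tfrac{1}{2}\nabla^2 + v_{ext}(\mathbf{r}) + \int \frac{\rho(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\, d\mathbf{r}' + v_{xc}[\rho](\mathbf{r}) \right] \psi_i(\mathbf{r}) = \varepsilon_i \psi_i(\mathbf{r}), \qquad \rho(\mathbf{r}) = \sum_i^{occ} |\psi_i(\mathbf{r})|^2, $$

    while the classical side leads to Boltzmann-type profiles of the form ρ(r) ∝ exp{−β[v_ext(r) + δF_ex/δρ(r)]}, both describing noninteracting particles in a density-dependent effective potential whose exchange-correlation or excess-free-energy part must be approximated.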

  4. Criticism of the Classical Theory of Macroeconomic Modeling

    Directory of Open Access Journals (Sweden)

    Konstantin K. Kumehov

    2015-01-01

    Full Text Available Abstract: Current approaches and methods for modeling macroeconomic systems do not generate research ideas that can be used in applications. This is largely because the dominant economic schools and research directions build their theories on misconceptions about the economic system as an object of modeling, and have no common methodological approach to the design of macroeconomic models. All of them focus on building models aimed at establishing equilibrium parameters of supply and demand, production and consumption. At the same time, the resource potential and the needs of society for material and other benefits are not considered as underlying factors. In addition, there is no unity in the choice of elements or of the mechanisms of interaction between them: it is not established by what criteria the elements of a model are determined, whether they are institutions, industries, the population, banks, classes, etc. From a methodological point of view, the best-known authors all design their models by extrapolating past states or past events. As a result, by the time a model is ready the situation has changed, the past parameters underlying the model lose relevance, and at best the researcher is left interpreting events and parameters that will not recur in the future. In this paper, based on an analysis of the works of well-known authors belonging to different schools and directions, the weaknesses of their proposed macroeconomic models that prevent their use for solving applied problems of economic development are revealed. Fundamentally new approaches and methods are proposed with which macroeconomic models can be constructed that take into account both the theoretical and the applied aspects of modeling, and the basic methodological requirements are formulated.

  5. Multiagent model and mean field theory of complex auction dynamics

    Science.gov (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.

  6. Effective Field Theory and the Gamow Shell Model

    OpenAIRE

    Rotureau, J.; van Kolck, U.

    2013-01-01

    We combine Halo/Cluster Effective Field Theory (H/CEFT) and the Gamow Shell Model (GSM) to describe the $0^+$ ground state of $^{6}$He as a three-body halo system. We use two-body interactions for the neutron-alpha particle and two-neutron pairs obtained from H/CEFT at leading order, with parameters determined from scattering in the p$_{3/2}$ and s$_0$ channels, respectively. The three-body dynamics of the system is solved using the GSM formalism, where the continuum states are incorporate...

  7. Mean-field theory and self-consistent dynamo modeling

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)
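
    For orientation, the turbulent electromotive force in this class of mean-field models is often parameterized as follows (a commonly quoted form; the precise closure coefficients used by the authors may differ):

      \boldsymbol{\mathcal{E}} \equiv \langle \mathbf{u}'\times\mathbf{b}' \rangle
      \simeq \alpha\,\langle\mathbf{B}\rangle \;-\; \beta\,\nabla\times\langle\mathbf{B}\rangle \;+\; \gamma\,\nabla\times\langle\mathbf{U}\rangle,

    where $\alpha$ relates to the turbulent residual helicity, $\beta$ is a turbulent diffusivity, and $\gamma$ multiplies the cross-helicity contribution mentioned in the abstract.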

  8. Model for urban and indoor cellular propagation using percolation theory

    Science.gov (United States)

    Franceschetti, G.; Marano, S.; Pasquino, N.; Pinto, I. M.

    2000-03-01

    A method for the analysis and statistical characterization of wave propagation in indoor and urban cellular radio channels is presented, based on a percolation model. Pertinent principles of the theory are briefly reviewed, and applied to the problem of interest. Relevant quantities, such as pulsed-signal arrival rate, number of reflections against obstacles, and path lengths are deduced and related to basic environment parameters such as obstacle density and transmitter-receiver separation. Results are found to be in good agreement with alternative simulations and measurements.

  9. Theory and Modeling of High-Power Gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Nusinovich, Gregory Semeon [Univ. of Maryland, College Park, MD (United States)

    2016-04-29

    This report summarized results of the work performed at the Institute for Research in Electronics and Applied Physics of the University of Maryland (College Park, MD) in the framework of the DOE Grant “Theory and Modeling of High-Power Gyrotrons”. The report covers the work performed in 2011-2014. The research work was performed in three directions: - possibilities of stable gyrotron operation in very high-order modes offering the output power exceeding 1 MW level in long-pulse/continuous-wave regimes, - effect of small imperfections in gyrotron fabrication and alignment on the gyrotron efficiency and operation, - some issues in physics of beam-wave interaction in gyrotrons.

  10. : The origins of the random walk model in financial theory

    OpenAIRE

    Walter, Christian

    2013-01-01

    This text constitutes chapter 2 of the book Le modèle de marche au hasard en finance by Christian Walter, forthcoming from Economica in the "Audit, assurance, actuariat" collection, June 2013. It is published here with the publisher's agreement. Three main concerns pave the way for the birth of the random walk model in financial theory: an ethical issue with Jules Regnault (1834-1894), a scientific issue with Louis Bachelier (1870-1946) and a practical issue with Alfred Cowles (1891-1984). Three to...

  11. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...... questionnaires of personality traits (NEO-FFI) and causality orientations (GCOS). To test whether covariance between traits and orientations could be attributed to shared or separate latent variables we conducted joint factor analyses. Results reveal that the Autonomy orientation can be distinguished from...

  12. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Science.gov (United States)

    Oizumi, Ryo; Kuniya, Toshikazu; Enatsu, Yoichi

    2016-01-01

    Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.
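
    As background, the r/K terminology refers to the parameters of the textbook logistic growth model (a standard reminder, not an equation taken from this paper):

      \frac{dN}{dt} = r\,N\left(1 - \frac{N}{K}\right),

    with intrinsic growth rate $r$ and carrying capacity $K$; r-selection favors traits that raise $r$ when the population is far below $K$, while K-selection favors performance as $N$ approaches $K$.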

  13. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
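
    As a point of reference, the multinomial logit model used as a baseline here assigns choice probabilities by a softmax over option utilities. A minimal sketch follows (Python, with made-up utilities; MDFT's sequential sampling process is not implemented):

      import math

      def logit_choice_probabilities(utilities, scale=1.0):
          """Multinomial logit: P(i) = exp(u_i / scale) / sum_j exp(u_j / scale)."""
          exps = [math.exp(u / scale) for u in utilities]
          total = sum(exps)
          return [e / total for e in exps]

      # Three hypothetical products summarized by a single overall utility each.
      print(logit_choice_probabilities([1.2, 0.8, 1.0]))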

  14. Corvid re-caching without 'theory of mind': a model.

    Directory of Open Access Journals (Sweden)

    Elske van der Vaart

    Full Text Available Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  15. THE REAL OPTIONS OF CAPITAL BUDGET

    Directory of Open Access Journals (Sweden)

    Antonio Lopo Martins

    2008-07-01

    Full Text Available The traditional techniques of capital budgeting, such as discounted cash flow and net present value, do not incorporate the flexibilities existing in an investment project, and they tend to distort the value of certain investments, mainly those considered under scenarios of uncertainty and risk. This study therefore intends to demonstrate that the Real Options Theory (TOR) is a useful methodology to evaluate and to indicate the best option for an expansion investment project. To reach this objective, a case study was conducted, the case unit being the Resort Praia Hotel do Litoral Norte of Salvador. The study was developed as follows: first, the traditional net present value was determined, and then the volatility of each analyzed uncertainty was incorporated. Second, since real options are analogous to financial options, it was necessary to map the elements of the project onto the terminology of financial options in order to obtain the value of the real option. For this, the Black & Scholes option pricing model was used together with a computational simulator (SLS) to obtain the expanded net present value. As a result, it was possible to show that using the traditional capital budgeting tool the net present value (VPL) is negative, so the hotel expansion project would be rejected, whereas applying the TOR methodology the project presents a positive expanded present value, which represents an excellent investment opportunity. Keywords: Capital budget, Real options, Analysis of investments.
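
    A minimal sketch of the pricing step described here, in Python: a standard Black & Scholes call valuation added to a static NPV to give an "expanded" NPV. All numbers are placeholders rather than the hotel case data, and the SLS simulator itself is not reproduced.

      import math

      def norm_cdf(x):
          return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

      def black_scholes_call(S, K, r, sigma, T):
          """European call value; S = present value of expansion cash flows, K = investment cost."""
          d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
          d2 = d1 - sigma * math.sqrt(T)
          return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

      # Placeholder figures, not the hotel case data.
      static_npv = -0.4e6                  # negative traditional NPV
      option_value = black_scholes_call(S=5.0e6, K=5.5e6, r=0.08, sigma=0.35, T=3.0)
      expanded_npv = static_npv + option_value
      print(round(option_value), round(expanded_npv))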

  16. Understanding the Budget Process

    Directory of Open Access Journals (Sweden)

    Mesut Yalvaç

    2000-03-01

    Full Text Available Many different budgeting techniques can be used in libraries, and some combination of these will be appropriate for almost any individual situation. Line-item, program, performance, formula, variable, and zero-base budgets all have features that may prove beneficial in the preparation of a budget. Budgets also serve a variety of functions, providing for short-term and long-term financial planning as well as for cash management over a period of time. Short-term plans are reflected in the operating budget, while long-term plans are reflected in the capital budget. Since the time when cash is available to an organization does not usually coincide with the time that disbursements must be made, it is also important to carefully plan for the inflow and outflow of funds by means of a cash budget. During the budget process an organization selects its programs and activities by providing the necessary funding; the library, along with others in the organization, must justify its requests. Because of the cyclical nature of the budget process, it is possible continually to gather information and evaluate alternatives for the next budget period so that the library may achieve its maximum potential for service to its patrons.

  17. Fuzzy structure theory modeling of sound-insulation layers in complex vibroacoustic uncertain systems: theory and experimental validation.

    Science.gov (United States)

    Fernandez, Charles; Soize, Christian; Gagliardini, Laurent

    2009-01-01

    The fuzzy structure theory was introduced 20 years ago in order to model the effects of complex subsystems imprecisely known on a master structure. This theory was only aimed at structural dynamics. In this paper, an extension of that theory is proposed in developing an elastoacoustic element useful to model sound-insulation layers for computational vibroacoustics of complex systems. The simplified model constructed enhances computation time and memory allocation because the number of physical and generalized degrees of freedom in the computational vibroacoustic model is not increased. However, these simplifications introduce model uncertainties. In order to take into account these uncertainties, the nonparametric probabilistic approach recently introduced is used. A robust simplified model for sound-insulation layers is then obtained. This model is controlled by a small number of physical and dispersion parameters. First, the extension of the fuzzy structure theory to elastoacoustic element is presented. Second, the computational vibroacoustic model including such an elastoacoustic element to model sound-insulation layer is given. Then, a design methodology to identify the model parameters with experiments is proposed and is experimentally validated. Finally, the theory is applied to an uncertain vibroacoustic system.

  18. Comparing theory-based condom interventions: health belief model versus theory of planned behavior.

    Science.gov (United States)

    Montanaro, Erika A; Bryan, Angela D

    2014-10-01

    This study sought to experimentally manipulate the core constructs of the Health Belief Model (HBM) and the Theory of Planned Behavior (TPB) in order to compare the success of interventions to increase preparatory condom use behavior (i.e., purchasing condoms, talking to a boyfriend or girlfriend about using condoms, and carrying condoms) based on these theories. A total of 258 participants were randomly assigned to one of three computer-based interventions (HBM, TPB, or information-only control). A total of 204 (79.1%) completed follow-up assessments 1 month later. Regression analyses were conducted to determine which set of theoretical constructs accounted for the most variance in behavior at baseline. A series of structural equation models were estimated to determine which constructs were the "active ingredients" of change. The TPB accounted for 32.8% of the variance in risky sexual behavior at baseline, while the HBM only explained 1.6% of the variance. Mediational analyses revealed differential intervention effects on perceived susceptibility, perceived benefits, and attitudes toward condom use. However, it was attitudes toward condom use and condom use self-efficacy that were associated with intentions, which then predicted preparatory condom use behavior at follow-up. Except for attitudes, the mediators that were successfully manipulated by the interventions (i.e., perceived susceptibility, perceived severity, and attitudes) were not the same constructs that predicted intentions (i.e., attitudes and condom use self-efficacy), and subsequently predicted behavior. This suggests that the constructs that explain behavior are not the same as those that produce behavior change.

  19. Theory-guided exploration with structural equation model forests.

    Science.gov (United States)

    Brandmaier, Andreas M; Prindle, John J; McArdle, John J; Lindenberger, Ulman

    2016-12-01

    Structural equation model (SEM) trees, a combination of SEMs and decision trees, have been proposed as a data-analytic tool for theory-guided exploration of empirical data. With respect to a hypothesized model of multivariate outcomes, such trees recursively find subgroups with similar patterns of observed data. SEM trees allow for the automatic selection of variables that predict differences across individuals in specific theoretical models, for instance, differences in latent factor profiles or developmental trajectories. However, SEM trees are unstable: small variations in the data can result in different trees. As a remedy, SEM forests, which are ensembles of SEM trees based on resamplings of the original dataset, provide increased stability. Because large forests are less suitable for visual inspection and interpretation, aggregate measures provide researchers with hints on how to improve their models: (a) variable importance is based on random permutations of the out-of-bag (OOB) samples of the individual trees and quantifies, for each variable, the average reduction of uncertainty about the model-predicted distribution; and (b) case proximity enables researchers to perform clustering and outlier detection. We provide an overview of SEM forests and illustrate their utility in the context of cross-sectional factor models of intelligence and episodic memory. We discuss benefits and limitations, and provide advice on how and when to use SEM trees and forests in future research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
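
    The OOB-permutation idea behind the variable-importance measure can be sketched generically (Python; `model_loss`, the row layout and the repeat count are illustrative assumptions, and nothing here is tied to a particular SEM package):

      import random

      def permutation_importance(model_loss, rows, column, n_repeats=20, seed=1):
          """Average increase in loss when `column` is randomly permuted across rows.

          `model_loss(rows) -> float` is assumed to score an already fitted model on a
          list of dict-like rows; this mirrors the OOB-permutation idea generically.
          """
          rng = random.Random(seed)
          baseline = model_loss(rows)
          increases = []
          for _ in range(n_repeats):
              shuffled_values = [row[column] for row in rows]
              rng.shuffle(shuffled_values)
              permuted = [dict(row, **{column: v}) for row, v in zip(rows, shuffled_values)]
              increases.append(model_loss(permuted) - baseline)
          return sum(increases) / n_repeats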

  20. Exploring Bayesian model selection methods for effective field theory expansions

    Science.gov (United States)

    Schaffner, Taylor; Yamauchi, Yukari; Furnstahl, Richard

    2017-09-01

    A fundamental understanding of the microscopic properties and interactions of nuclei has long evaded physicists due to the complex nature of quantum chromodynamics (QCD). One approach to modeling nuclear interactions is known as chiral effective field theory (EFT). Today, the method's greatest limitation lies in the approximation of interaction potentials and their corresponding uncertainties. Computing EFT expansion coefficients, known as Low-Energy Constants (LECs), from experimental data reduces to a problem of statistics and fitting. In the conventional approach, the fitting is done using frequentist methods that fail to evaluate the quality of the model itself (e.g., how many orders to use) in addition to its fit to the data. By utilizing Bayesian statistical methods for model selection, the model's quality can be taken into account, providing a more controlled and robust EFT expansion. My research involves probing different Bayesian model checking techniques to determine the most effective means for use with estimating the values of LECs. In particular, we are using model problems to explore the Bayesian calculation of an EFT expansion's evidence and an approximation to this value known as the WAIC (Widely Applicable Information Criterion). This work was supported in part by the National Science Foundation under Grant No. PHY-1306250.
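
    For reference, the WAIC mentioned above is commonly computed from $S$ posterior draws $\theta_s$ as follows (standard definition from the Bayesian data-analysis literature, not specific to this abstract):

      \mathrm{lppd} = \sum_{i=1}^{n} \log\!\left(\frac{1}{S}\sum_{s=1}^{S} p(y_i\mid\theta_s)\right),
      \qquad
      p_{\mathrm{WAIC}} = \sum_{i=1}^{n} \operatorname{Var}_{s}\!\left[\log p(y_i\mid\theta_s)\right],
      \qquad
      \mathrm{WAIC} = -2\left(\mathrm{lppd} - p_{\mathrm{WAIC}}\right).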

  1. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models

    Science.gov (United States)

    Funnell, Sue C.; Rogers, Patricia J.

    2011-01-01

    Between good intentions and great results lies a program theory--not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. "Purposeful Program Theory" shows how to develop,…

  2. DEBtox theory and matrix population models as helpful tools in understanding the interaction between toxic cyanobacteria and zooplankton.

    Science.gov (United States)

    Billoir, Elise; da Silva Ferrão-Filho, Aloysio; Laure Delignette-Muller, Marie; Charles, Sandrine

    2009-06-07

    Bioassays were performed to find out how field samples of the toxic cyanobacteria Microcystis aeruginosa affect Moina micrura, a cladoceran found in the tropical Jacarepagua Lagoon (Rio de Janeiro, Brazil). The DEBtox (Dynamic Energy Budget theory applied to toxicity data) approach has been proposed for use in analysing chronic toxicity tests as an alternative to calculating the usual safety parameters (NOEC, ECx). DEBtox theory deals with the energy balance between physiological processes (assimilation, maintenance, growth and reproduction), and it can be used to investigate and compare various hypotheses concerning the mechanism of action of a toxicant. Even though the DEBtox framework was designed for standard toxicity bioassays carried out with standard species (fish, daphnids), we applied the growth and reproduction models to M. micrura, by adapting the data available using a weight-length allometric relationship. Our modelling approach appeared to be very relevant at the individual level, and confirmed previous conclusions about the toxic mechanism. In our study we also wanted to assess the toxic effects at the population level, which is a more relevant endpoint in risk assessment. We therefore incorporated both lethal and sublethal toxic effects in a matrix population model used to calculate the finite rate of population change as a continuous function of the exposure concentration. Alongside this calculation, we constructed a confidence band to predict the critical exposure concentration for population health. Finally, we discuss our findings with regard to the prospects for further refining the analysis of ecotoxicological data.
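
    A minimal sketch of the population-level step described here: build a stage-structured projection matrix and read off the finite rate of population change as its dominant eigenvalue (Python/NumPy; the matrix entries are placeholders, and the lethal/sublethal toxicant effects would enter by scaling survival and fecundity terms as functions of the exposure concentration).

      import numpy as np

      def finite_rate_of_increase(A):
          """Finite rate of population change: dominant eigenvalue of projection matrix A."""
          return float(np.max(np.abs(np.linalg.eigvals(A))))

      # Placeholder 3-stage matrix: fecundities on the first row, survival/transition
      # probabilities below. Toxicant effects would scale these entries as functions
      # of the exposure concentration.
      A = np.array([[0.0, 2.0, 5.0],
                    [0.3, 0.0, 0.0],
                    [0.0, 0.5, 0.8]])
      print(finite_rate_of_increase(A))  # > 1 indicates a growing population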

  3. An Evolutionary Game Theory Model of Spontaneous Brain Functioning.

    Science.gov (United States)

    Madeo, Dario; Talarico, Agostino; Pascual-Leone, Alvaro; Mocenni, Chiara; Santarnecchi, Emiliano

    2017-11-22

    Our brain is a complex system of interconnected regions spontaneously organized into distinct networks. The integration of information between and within these networks is a continuous process that can be observed even when the brain is at rest, i.e. not engaged in any particular task. Moreover, such spontaneous dynamics show predictive value over individual cognitive profile and constitute a potential marker in neurological and psychiatric conditions, making its understanding of fundamental importance in modern neuroscience. Here we present a theoretical and mathematical model based on an extension of evolutionary game theory on networks (EGN), able to capture brain's interregional dynamics by balancing emulative and non-emulative attitudes among brain regions. This results in the net behavior of nodes composing resting-state networks identified using functional magnetic resonance imaging (fMRI), determining their moment-to-moment level of activation and inhibition as expressed by positive and negative shifts in BOLD fMRI signal. By spontaneously generating low-frequency oscillatory behaviors, the EGN model is able to mimic functional connectivity dynamics, approximate fMRI time series on the basis of initial subset of available data, as well as simulate the impact of network lesions and provide evidence of compensation mechanisms across networks. Results suggest evolutionary game theory on networks as a new potential framework for the understanding of human brain network dynamics.
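
    As a point of reference, the classical replicator dynamics that EGN-type models generalize to networked brain regions can be written in its textbook form (the paper's node-level equations and payoff structure are not reproduced here):

      \dot{x}_i = x_i\left[(A\mathbf{x})_i - \mathbf{x}^{\mathsf T} A\,\mathbf{x}\right],

    where $x_i$ is the frequency of strategy $i$ and $A$ is the payoff matrix.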

  4. POLITICAL BUDGET CYCLES: EVIDENCE FROM TURKEY

    Directory of Open Access Journals (Sweden)

    FİLİZ ERYILMAZ

    2015-04-01

    Full Text Available The theoretical literature on “Political Business Cycles” presents important insights on the extent to which politicians attempt to manipulate government monetary and fiscal policies to influence electoral outcomes, in particular with the aim of re-election. In recent years, “Political Budget Cycles” has been one of the most important topics in the Political Business Cycles literature. According to Political Budget Cycles Theory, some components of the government budget are influenced by the electoral cycle, with government spending increasing or taxes decreasing in an election year, leading to a larger fiscal deficit. This fiscal manipulation is a tool that incumbents possess to increase their chances of re-election. In this paper we investigate the presence of Political Budget Cycles using a data set of budget balance, total expenditure and total revenue over the period 1994–2012. Our findings suggest that incumbents in Turkey use fiscal policy to increase their popularity and win elections; therefore fiscal manipulation was rewarded rather than punished by Turkish voters. The meaning of this result is that Political Budget Cycles Theory is valid for Turkey between 1994 and 2012.

  5. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    May R. D.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.

  7. Modeling of plucking piezoelectric energy harvesters with contact theory

    Science.gov (United States)

    Fu, Xinlei; Liao, Wei-Hsin

    2017-04-01

    Non-harmonic excitations are widely available in our daily environment and can be used to pluck piezoelectric energy harvesters. Plucking piezoelectric energy harvesting can overcome the frequency gap and achieve a frequency-up effect. However, there has not been a thorough analysis of plucking piezoelectric energy harvesting, in particular one with a good understanding of the plucking mechanism. This paper aims to develop a model to investigate the plucking mechanism and predict the responses of plucking piezoelectric energy harvesters under different kinds of excitations. In the electromechanical model, Hertzian contact theory is applied to account for the interaction between the plectrum and the piezoelectric beam. The plucking mechanism is clarified as a cantilever beam impacted by an infinitely heavy mass, in which the multi-impact process prematurely terminates at the separation time. We numerically predict the plucking force, which depends on the piezoelectric beam, the Hertzian contact stiffness, the overlap area and the plucking velocity. The energy distribution is investigated with a connected resistor.
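
    For orientation, the Hertzian contact law for a sphere pressed against a flat surface is commonly written as follows (a textbook form; the plectrum-beam geometry and contact stiffness used in the paper may differ):

      F = \frac{4}{3}\,E^{*}\sqrt{R}\;\delta^{3/2},
      \qquad
      \frac{1}{E^{*}} = \frac{1-\nu_1^{2}}{E_1} + \frac{1-\nu_2^{2}}{E_2},

    with indentation depth $\delta$, effective radius $R$, and reduced modulus $E^{*}$.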

  8. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data......, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...
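
    A minimal sketch of the verification idea, assuming independent components combined by root-sum-square and a crude consistency factor; this is an illustration of the principle, not the authors' statistical procedure, and all numbers are hypothetical.

      import math
      import statistics

      def combined_standard_uncertainty(components):
          """Root-sum-square combination of independent standard uncertainties."""
          return math.sqrt(sum(u * u for u in components))

      def budget_is_consistent(replicates, components, factor=2.0):
          """Crude check: is the observed replicate standard deviation within `factor`
          times the budgeted combined uncertainty? (Illustrative criterion only.)"""
          observed_sd = statistics.stdev(replicates)
          budgeted = combined_standard_uncertainty(components)
          return observed_sd <= factor * budgeted, observed_sd, budgeted

      # Hypothetical replicate results and budgeted uncertainty components.
      print(budget_is_consistent([10.2, 10.5, 10.1, 10.4, 10.3], [0.10, 0.12, 0.05]))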

  9. A Proof Theory for Model Checking: An Extended Abstract

    Directory of Open Access Journals (Sweden)

    Quentin Heath

    2017-01-01

    Full Text Available While model checking has often been considered as a practical alternative to building formal proofs, we argue here that the theory of sequent calculus proofs can be used to provide an appealing foundation for model checking. Since the emphasis of model checking is on establishing the truth of a property in a model, we rely on the proof theoretic notion of additive inference rules, since such rules allow provability to directly describe truth conditions. Unfortunately, the additive treatment of quantifiers requires inference rules to have infinite sets of premises and the additive treatment of model descriptions provides no natural notion of state exploration. By employing a focused proof system, it is possible to construct large scale, synthetic rules that also qualify as additive but contain elements of multiplicative inference. These additive synthetic rules—essentially rules built from the description of a model—allow a direct treatment of state exploration. This proof theoretic framework provides a natural treatment of reachability and non-reachability problems, as well as tabled deduction, bisimulation, and winning strategies.

  10. A Systems Model of Parkinson's Disease Using Biochemical Systems Theory.

    Science.gov (United States)

    Sasidharakurup, Hemalatha; Melethadathil, Nidheesh; Nair, Bipin; Diwakar, Shyam

    2017-08-01

    Parkinson's disease (PD), a neurodegenerative disorder, affects millions of people and has gained attention because of its clinical roles affecting behaviors related to motor and nonmotor symptoms. Although studies on PD from various aspects are becoming popular, few rely on predictive systems modeling approaches. Using Biochemical Systems Theory (BST), this article attempts to model and characterize dopaminergic cell death and understand pathophysiology of progression of PD. PD pathways were modeled using stochastic differential equations incorporating law of mass action, and initial concentrations for the modeled proteins were obtained from literature. Simulations suggest that dopamine levels were reduced significantly due to an increase in dopaminergic quinones and 3,4-dihydroxyphenylacetaldehyde (DOPAL) relating to imbalances compared to control during PD progression. Associating to clinically observed PD-related cell death, simulations show abnormal parkin and reactive oxygen species levels with an increase in neurofibrillary tangles. While relating molecular mechanistic roles, the BST modeling helps predicting dopaminergic cell death processes involved in the progression of PD and provides a predictive understanding of neuronal dysfunction for translational neuroscience.

  11. Theory and modeling of cylindrical thermo-acoustic transduction

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Lihong, E-mail: lhtong@ecjtu.edu.cn [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China); Lim, C.W. [Department of Architecture and Civil Engineering, City University of Hong Kong, Kowloon, Hong Kong SAR (China); Zhao, Xiushao; Geng, Daxing [School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang, Jiangxi (China)

    2016-06-03

    Models both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed and the corresponding acoustic pressure solutions are obtained. The acoustic pressure for an individual carbon nanotube (CNT) as a function of input power is investigated analytically and it is verified by comparing with the published experimental data. Further numerical analysis on the acoustic pressure response and characteristics for varying input frequency and distance are also examined both for solid and thinfilm-solid cylindrical thermo-acoustic transductions. Through detailed theoretical and numerical studies on the acoustic pressure solution for thinfilm-solid cylindrical transduction, it is concluded that a solid with smaller thermal conductivity favors to improve the acoustic performance. In general, the proposed models are applicable to a variety of cylindrical thermo-acoustic devices performing in different gaseous media. - Highlights: • Theory and modeling both for solid and thinfilm-solid cylindrical thermo-acoustic transductions are proposed. • The modeling is verified by comparing with the published experimental data. • Acoustic response characteristics of cylindrical thermo-acoustic transductions are predicted by the proposed model.

  12. Changes in water budgets and sediment yields from a hypothetical agricultural field as a function of landscape and management characteristics--A unit field modeling approach

    Science.gov (United States)

    Roth, Jason L.; Capel, Paul D.

    2012-01-01

    Crop agriculture occupies 13 percent of the conterminous United States. Agricultural management practices, such as crop and tillage types, affect the hydrologic flow paths through the landscape. Some agricultural practices, such as drainage and irrigation, create entirely new hydrologic flow paths upon the landscapes where they are implemented. These hydrologic changes can affect the magnitude and partitioning of water budgets and sediment erosion. Given the wide degree of variability amongst agricultural settings, changes in the magnitudes of hydrologic flow paths and sediment erosion induced by agricultural management practices commonly are difficult to characterize, quantify, and compare using only field observations. The Water Erosion Prediction Project (WEPP) model was used to simulate two landscape characteristics (slope and soil texture) and three agricultural management practices (land cover/crop type, tillage type, and selected agricultural land management practices) to evaluate their effects on the water budgets of and sediment yield from agricultural lands. An array of sixty-eight 60-year simulations were run, each representing a distinct natural or agricultural scenario with various slopes, soil textures, crop or land cover types, tillage types, and select agricultural management practices on an isolated 16.2-hectare field. Simulations were made to represent two common agricultural climate regimes: arid with sprinkler irrigation and humid. These climate regimes were constructed with actual climate and irrigation data. The results of these simulations demonstrate the magnitudes of potential changes in water budgets and sediment yields from lands as a result of landscape characteristics and agricultural practices adopted on them. These simulations showed that variations in landscape characteristics, such as slope and soil type, had appreciable effects on water budgets and sediment yields. As slopes increased, sediment yields increased in both the arid and

  13. Hydrology and sediment budget of Los Laureles Canyon, Tijuana, MX: Modelling channel, gully, and rill erosion with 3D photo-reconstruction, CONCEPTS, and AnnAGNPS

    Science.gov (United States)

    Taniguchi, Kristine; Gudiño, Napoleon; Biggs, Trent; Castillo, Carlos; Langendoen, Eddy; Bingner, Ron; Taguas, Encarnación; Liden, Douglas; Yuan, Yongping

    2015-04-01

    Several watersheds cross the US-Mexico boundary, resulting in trans-boundary environmental problems. Erosion in Tijuana, Mexico, increases the rate of sediment deposition in the Tijuana Estuary in the United States, altering the structure and function of the ecosystem. The well-being of residents in Tijuana is compromised by damage to infrastructure and homes built adjacent to stream channels, gully formation in dirt roads, and deposition of trash. We aim to understand the dominant source of sediment contributing to the sediment budget of the watershed (channel, gully, or rill erosion), where the hotspots of erosion are located, and what the impact of future planned and unplanned land use changes and Best Management Practices (BMPs) will be on sediment and storm flow. We will be using a mix of field methods, including 3D photo-reconstruction of stream channels, with two models, CONCEPTS and AnnAGNPS to constrain estimates of the sediment budget and impacts of land use change. Our research provides an example of how 3D photo-reconstruction and Structure from Motion (SfM) can be used to model channel evolution.

  14. 7 CFR 3402.14 - Budget and budget narrative.

    Science.gov (United States)

    2010-01-01

    7 CFR 3402.14 (2010), Budget and budget narrative. Regulations of the Department of Agriculture, Cooperative State Research, Education... Applicants must prepare the Budget, Form CSREES-2004, and a budget narrative...

  15. Dynamical influence of gravity waves generated by the Vestfjella Mountains in Antarctica: radar observations, fine-scale modelling and kinetic energy budget analysis

    Directory of Open Access Journals (Sweden)

    Joel Arnault

    2012-02-01

    Full Text Available Gravity waves generated by the Vestfjella Mountains (in western Dronning Maud Land, Antarctica, southwest of the Finnish/Swedish Aboa/Wasa station) have been observed with the Moveable Atmospheric Radar for Antarctica (MARA) during the SWEDish Antarctic Research Programme (SWEDARP) in December 2007/January 2008. These radar observations are compared with a 2-month Weather Research and Forecasting (WRF) model experiment operated at 2 km horizontal resolution. A control simulation without orography is also operated in order to separate unambiguously the contribution of the mountain waves to the simulated atmospheric flow. This contribution is then quantified with a kinetic energy budget analysis computed in the two simulations. The results of this study confirm that mountain waves reaching lower-stratospheric heights break through convective overturning and generate inertia gravity waves with a smaller vertical wavelength, in association with a brief depletion of kinetic energy through frictional dissipation and negative vertical advection. The kinetic energy budget also shows that gravity waves have a strong influence on the other terms of the budget, i.e. horizontal advection and horizontal work of pressure forces, so evaluating the influence of gravity waves on the mean flow with the vertical advection term alone is not sufficient, at least in this case. We finally find that gravity waves generated by the Vestfjella Mountains reaching lower stratospheric heights generally deplete (create) kinetic energy in the lower troposphere (upper troposphere–lower stratosphere), in contradiction with the usual decelerating effect attributed to gravity waves on the zonal circulation in the upper troposphere–lower stratosphere.

  16. A Well-Designed Budget Yields Long-Term Rewards.

    Science.gov (United States)

    Pinola, Mary; Knirk, Frederick G.

    1984-01-01

    Defines zero-based budgeting, compares it to traditional budgeting, and discusses five steps of a zero-base budget model: determining organization's goals and refining them into objectives; listing activities to achieve objectives in decision packages; evaluating decision packages; ranking packages by order of importance; and funding decision…

  17. Stakeholder Theory and Value Creation Models in Brazilian Firms

    Directory of Open Access Journals (Sweden)

    Natalia Giugni Vidal

    2015-09-01

    Full Text Available Objective – The purpose of this study is to understand how top Brazilian firms think about and communicate value creation to their stakeholders. Design/methodology/approach – We use qualitative content analysis methodology to analyze the sustainability or annual integrated reports of the top 25 Brazilian firms by sales revenue. Findings – Based on our analysis, these firms were classified into three main types of stakeholder value creation models: narrow, broad, or transitioning from narrow to broad. We find that many of the firms in our sample are in a transition state between narrow and broad stakeholder value creation models. We also identify seven areas of concentration discussed by firms in creating value for stakeholders: better stakeholder relationships, better work environment, environmental preservation, increased customer base, local development, reputation, and stakeholder dialogue. Practical implications – This study shows a trend towards broader stakeholder value creation models in Brazilian firms. The findings of this study may inform practitioners interested in broadening their value creation models. Originality/value – This study adds to the discussion of stakeholder theory in the Brazilian context by understanding variations in value creation orientation in Brazil.

  18. Lorentz Violation of the Photon Sector in Field Theory Models

    Directory of Open Access Journals (Sweden)

    Lingli Zhou

    2014-01-01

    Full Text Available We compare the Lorentz violation terms of the pure photon sector between two field theory models, namely, the minimal standard model extension (SME) and the standard model supplement (SMS). From the requirement of the identity of the intersection for the two models, we find that the free photon sector of the SMS can be a subset of the photon sector of the minimal SME. We not only obtain some relations between the SME parameters but also get some constraints on the SMS parameters from the SME parameters. The CPT-odd coefficients $(k_{AF})^{\alpha}$ of the SME are predicted to be zero. There are 15 degrees of freedom in the Lorentz violation matrix $\Delta^{\alpha\beta}$ of free photons of the SMS related with the same number of degrees of freedom in the tensor coefficients $(k_F)^{\alpha\beta\mu\nu}$, which are independent from each other in the minimal SME but are interrelated in the intersection of the SMS and the minimal SME. With the related degrees of freedom, we obtain the conservative constraints (2σ) on the elements of the photon Lorentz violation matrix. The detailed structure of the photon Lorentz violation matrix suggests some applications to the Lorentz violation experiments for photons.

  19. Towards viable cosmological models of disformal theories of gravity

    Science.gov (United States)

    Sakstein, Jeremy

    2015-01-01

    The late-time cosmological dynamics of disformal gravity are investigated using dynamical systems methods. It is shown that in the general case there are no stable attractors that screen fifth forces locally and simultaneously describe a dark energy dominated universe. Viable scenarios have late-time properties that are independent of the disformal parameters and are identical to the equivalent conformal quintessence model. Our analysis reveals that configurations where the Jordan frame metric becomes singular are only reached in the infinite future, thus explaining the natural pathology resistance observed numerically by several previous works. The viability of models where this can happen is discussed in terms of both the cosmological dynamics and local phenomena. We identify a special parameter tuning such that there is a new fixed point that can match the presently observed dark energy density and equation of state. This model is unviable when the scalar couples to the visible sector but may provide a good candidate model for theories where only dark matter is disformally coupled.

  20. CONTEMPORARY ECONOMIC GROWTH MODELS AND THEORIES: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    Ilkhom SHARIPOV

    2015-11-01

    Full Text Available One of the most important aspects of human development is the ability to have a decent standard of living. The secret of the "economic miracle" of many countries with a high standard of living is, in fact, simple and quite obvious. All these countries are characterized by high and sustained development of the national economy, a low unemployment rate, and growth of income and consumption. There is no doubt that economic growth leads to an increase in the wealth of the country as a whole, extending its potential in the fight against poverty, unemployment and other social problems. That is why a high level of economic growth is one of the main targets of economic policy in many countries around the world. This brief literature review discusses the main existing theories and models of economic growth and their endogenous and exogenous aspects. The main purpose of this paper is to determine the current state of development of economic growth theories and their future directions of development.

  1. Closing the gap between budget formulation and execution

    OpenAIRE

    Lowery, Erainust

    2000-01-01

    Approved for public release; distribution is unlimited This thesis is a case study analysis of the Resource Management Office of the Bureau of Naval Personnel (PERS-02). Specifically, an analysis of projected versus actual budget figures was conducted. The purpose of the research was to explain anomalies in the budget formulation figures as compared to actual budget execution figures and define ways to improve the protocol between budget activities. Based on model comparisons, document rev...

  2. Multi-unit auctions with budget-constrained bidders

    Science.gov (United States)

    Ghosh, Gagan Pratap

    assumptions, there always exist bidder-types who submit unequal bids in equilibrium, (2) the equilibrium is monotonic in the sense that bidders with higher valuations prefer more unequal splits of their budgets than bidders with lower valuations and the same budget-level. With a formal theory in place, I carry out a quantitative exercise, using data from the 1970 OCS auction. I show that the model is able to match many aspects of the data. (1) In the data, the number of tracts bidders submit bids on is positively correlated with budgets (an R2 of 0.84), even though this relationship is non-monotonic; my model is able to capture this non-monotonicity, while producing an R2 of 0.89 (2) In the data, the average number of bids per tract is 8.21; for the model, this number is 10.09. (3) Auction revenue in the data was 1.927 billion; the model produced a mean revenue of 1.944 billion.

  3. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    Science.gov (United States)

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  4. Behavioral and Social Sciences Theories and Models: Are They Used in Unintentional Injury Prevention Research?

    Science.gov (United States)

    Trifiletti, L. B.; Gielen, A. C.; Sleet, D. A.; Hopkins, K.

    2005-01-01

    Behavioral and social sciences theories and models have the potential to enhance efforts to reduce unintentional injuries. The authors reviewed the published literature on behavioral and social science theory applications to unintentional injury problems to enumerate and categorize the ways different theories and models are used in injury…

  5. Modelling effects of temperature and oxygen on the population dynamics of the European sturgeon using dynamic energy budget theory

    OpenAIRE

    Vaugeois, M.; Lambert, P.; Baudrimont, M.; J. Cachot

    2016-01-01

    European sturgeon (Acipenser sturio) is an anadromous fish that breeds in rivers and which was previously found on most coasts of Europe. The last population of this species, nowadays listed as critically endangered, is reproducing in the Garonne basin near Bordeaux, south-west of France. In order to avoid extinction, the applied strategy since 1985 has been to release young fish into natural environment. These young individuals resulted from the assisted reproduction of wild and/or captive m...

  6. A brief history of string theory from dual models to M-theory

    CERN Document Server

    Rickles, Dean

    2014-01-01

    During its forty year lifespan, string theory has always had the power to divide, being called both a 'theory of everything' and a 'theory of nothing'. Critics have even questioned whether it qualifies as a scientific theory at all. This book adopts an objective stance, standing back from the question of the truth or falsity of string theory and instead focusing on how it came to be and how it came to occupy its present position in physics. An unexpectedly rich history is revealed, with deep connections to our most well-established physical theories. Fully self-contained and written in a lively fashion, the book will appeal to a wide variety of readers from novice to specialist.

  7. A hierarchy of energy- and flux-budget (EFB) turbulence closure models for stably stratified geophysical flows

    CERN Document Server

    Zilitinkevich, S S; Kleeorin, N; Rogachevskii, I; Esau, I

    2011-01-01

    In this paper we advance the physical background of the EFB turbulence closure and present its comprehensive description. It is based on four budget equations for the second moments: turbulent kinetic and potential energies (TKE and TPE) and vertical turbulent fluxes of momentum and buoyancy; a new relaxation equation for the turbulent dissipation time-scale; and an advanced concept of the inter-component exchange of TKE. The EFB closure is designed for stratified, rotating geophysical flows from neutral to very stable. In accordance with modern experimental evidence, it grants maintaining turbulence by the velocity shear at any gradient Richardson number Ri, and distinguishes between the two principally different regimes: "strong turbulence" at Ri << 1, and "weak turbulence" at Ri >> 1 typical of the free atmosphere or deep ocean, where Pr_T asymptotically increases linearly with increasing Ri, which implies strong suppression of the heat transfer compared to momentum transfer. For use in different applications, the EFB turbulence closure is formulated a...
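
    For orientation, the TKE budget and the gradient Richardson number referred to above are conventionally written as follows (standard boundary-layer forms, not the paper's full equation set):

      \frac{D E_K}{D t} = -\,\overline{u'w'}\,\frac{\partial U}{\partial z}
      \;+\; \frac{g}{\theta_0}\,\overline{w'\theta'} \;-\; \varepsilon \;+\; \text{transport terms},
      \qquad
      \mathrm{Ri} = \frac{(g/\theta_0)\,\partial\Theta/\partial z}{\left(\partial U/\partial z\right)^{2}},

    where the first term is shear production, the second the buoyancy flux, and $\varepsilon$ the viscous dissipation.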

  8. Augmenting Parametric Optimal Ascent Trajectory Modeling with Graph Theory

    Science.gov (United States)

    Dees, Patrick D.; Zwack, Matthew R.; Edwards, Stephen; Steffens, Michael

    2016-01-01

    It has been well documented that decisions made in the early stages of Conceptual and Pre-Conceptual design commit up to 80% of total Life-Cycle Cost (LCC) while engineers know the least about the product they are designing [1]. Once within Preliminary and Detailed design however, making changes to the design becomes far more difficult to enact in both cost and schedule. Primarily this has been due to a lack of detailed data usually uncovered later during the Preliminary and Detailed design phases. In our current budget-constrained environment, making decisions within Conceptual and Pre-Conceptual design which minimize LCC while meeting requirements is paramount to a program's success. Within the arena of launch vehicle design, optimizing the ascent trajectory is critical for minimizing the costs present within such concerns as propellant, aerodynamic, aeroheating, and acceleration loads while meeting requirements such as payload delivered to a desired orbit. In order to optimize the vehicle design its constraints and requirements must be known, however as the design cycle proceeds it is all but inevitable that the conditions will change. Upon that change, the previously optimized trajectory may no longer be optimal, or meet design requirements. The current paradigm for adjusting to these updates is generating point solutions for every change in the design's requirements [2]. This can be a tedious, time-consuming task as changes in virtually any piece of a launch vehicle's design can have a disproportionately large effect on the ascent trajectory, as the solution space of the trajectory optimization problem is both non-linear and multimodal [3]. In addition, an industry standard tool, Program to Optimize Simulated Trajectories (POST), requires an expert analyst to produce simulated trajectories that are feasible and optimal [4]. In a previous publication the authors presented a method for combatting these challenges [5]. In order to bring more detailed information

  9. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail, how the theoretical model and its......, it is demonstrated how other controversial hypotheses such as Rational Expectations can be formulated directly as restrictions on the CVAR-parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well. Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations) are left for future papers...

  10. Integrating the effects of salinity on the physiology of the eastern oyster, Crassostrea virginica, in the northern Gulf of Mexico through a Dynamic Energy Budget model

    Science.gov (United States)

    Lavaud, Romain; LaPeyre, Megan K.; Casas, Sandra M.; Bacher, C.; La Peyre, Jerome F.

    2017-01-01

    We present a Dynamic Energy Budget (DEB) model for the eastern oyster, Crassostrea virginica, which enables the inclusion of salinity as a third environmental variable, on top of the standard food and temperature variables. Salinity changes have various effects on the physiology of oysters, potentially altering filtration and respiration rates, and ultimately impacting growth, reproduction and mortality. We tested different hypotheses as to how to include these effects in a DEB model for C. virginica. Specifically, we tested two potential mechanisms to explain changes in oyster shell growth (cm), tissue dry weight (g) and gonad dry weight (g) when salinity moves away from the ideal range: 1) a negative effect on filtration rate and 2) an additional somatic maintenance cost. Comparative simulations of shell growth, dry tissue biomass and dry gonad weight in two monitored sites in coastal Louisiana experiencing salinity from 0 to 28 were statistically analyzed to determine the best hypothesis. Model parameters were estimated through the covariation method, using literature data and a set of specifically designed ecophysiological experiments. The model was validated through independent field studies in estuaries along the northern Gulf of Mexico. Our results suggest that salinity impacts C. virginica’s energy budget predominantly through effects on filtration rate. With an overwhelming number of environmental factors impacting organisms, and increasing exposure to novel and extreme conditions, the mechanistic nature of the DEB model with its ability to incorporate more than the standard food and temperature variables provides a powerful tool to verify hypotheses and predict individual organism performance across a range of conditions.
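
    A minimal sketch of standard-DEB reserve and structure dynamics with hypothesis 1 implemented as a salinity multiplier on assimilation (Python; all parameter values and the salinity factor are illustrative placeholders, not the estimated C. virginica parameter set):

      # Minimal sketch of standard-DEB reserve/structure dynamics with hypothesis 1
      # implemented as a salinity multiplier s on assimilation. Parameter values and
      # the value of s are illustrative placeholders, not the C. virginica estimates.
      def deb_step(E, V, f, s, dt,
                   p_Am=50.0,    # max surface-specific assimilation rate, J/cm^2/d (placeholder)
                   v=0.02,       # energy conductance, cm/d (placeholder)
                   p_M=20.0,     # volume-specific somatic maintenance, J/cm^3/d (placeholder)
                   E_G=2000.0,   # volume-specific cost of structure, J/cm^3 (placeholder)
                   kappa=0.8):   # fraction of mobilized reserve allocated to soma
          L = V ** (1.0 / 3.0)                                         # structural length, cm
          p_A = s * f * p_Am * L ** 2                                  # assimilation, scaled by salinity factor s
          p_S = p_M * V                                                # somatic maintenance
          p_C = E * (E_G * v * L ** 2 + p_S) / (E_G * V + kappa * E)   # reserve mobilization
          dE = (p_A - p_C) * dt                                        # reserve dynamics
          dV = max(kappa * p_C - p_S, 0.0) / E_G * dt                  # structural growth (no shrinking here)
          return E + dE, V + dV

      # One year of daily steps at constant food level f and a mild salinity penalty s.
      E, V = 500.0, 0.5
      for _ in range(365):
          E, V = deb_step(E, V, f=0.8, s=0.9, dt=1.0)
      print(round(V ** (1.0 / 3.0), 3), "cm structural length after one year")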

  11. WFIRST: Coronagraph Systems Engineering and Performance Budgets

    Science.gov (United States)

    Poberezhskiy, Ilya; cady, eric; Frerking, Margaret A.; Kern, Brian; Nemati, Bijan; Noecker, Martin; Seo, Byoung-Joon; Zhao, Feng; Zhou, Hanying

    2018-01-01

    The WFIRST coronagraph instrument (CGI) will be the first in-space coronagraph using active wavefront control to directly image and characterize mature exoplanets and zodiacal disks in reflected starlight. For CGI systems engineering, including requirements development, CGI performance is predicted using a hierarchy of performance budgets to estimate various noise components — spatial and temporal flux variations — that obscure exoplanet signals in direct imaging and spectroscopy configurations. These performance budgets are validated through robust integrated modeling and testbed model validation efforts. We present the performance budgeting framework used by WFIRST for the flow-down of coronagraph science requirements, mission constraints, and observatory interfaces to measurable instrument engineering parameters.

  12. Generalized Potts-Models and their Relevance for Gauge Theories

    Directory of Open Access Journals (Sweden)

    Andreas Wipf

    2007-01-01

    Full Text Available We study the Polyakov loop dynamics originating from finite-temperature Yang-Mills theory. The effective actions contain center-symmetric terms involving powers of the Polyakov loop, each with its own coupling. For a subclass with two couplings we perform a detailed analysis of the statistical mechanics involved. To this end we employ a modified mean field approximation and Monte Carlo simulations based on a novel cluster algorithm. We find excellent agreement of both approaches. The phase diagram exhibits both first and second order transitions between symmetric, ferromagnetic and antiferromagnetic phases with phase boundaries merging at three tricritical points. The critical exponents ν and γ at the continuous transition between symmetric and antiferromagnetic phases are the same as for the 3-state spin Potts model.
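
    For readers unfamiliar with Potts dynamics, the sketch below is a generic Metropolis simulation of the ordinary ferromagnetic 3-state Potts model on a small two-dimensional lattice; it is not the paper's generalized Polyakov-loop action, its modified mean-field treatment, or its cluster algorithm, and the lattice size, coupling and sweep count are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        q, L, beta, sweeps = 3, 16, 1.0, 200   # states, lattice size, coupling, sweeps

        spins = rng.integers(q, size=(L, L))

        def site_energy(s, i, j, val):
            # Minus the number of like nearest neighbours (periodic boundaries):
            # the ferromagnetic Potts site energy with J = 1.
            nbrs = (s[(i + 1) % L, j], s[(i - 1) % L, j],
                    s[i, (j + 1) % L], s[i, (j - 1) % L])
            return -sum(val == n for n in nbrs)

        for _ in range(sweeps):
            for _ in range(L * L):             # one sweep = L*L single-site updates
                i, j = rng.integers(L, size=2)
                new = rng.integers(q)
                dE = site_energy(spins, i, j, new) - site_energy(spins, i, j, spins[i, j])
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    spins[i, j] = new          # Metropolis acceptance step

        counts = np.bincount(spins.ravel(), minlength=q) / spins.size
        print("majority-state fraction (crude order parameter):", counts.max())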

  13. Basic first-order model theory in Mizar

    Directory of Open Access Journals (Sweden)

    Marco Bright Caminati

    2010-01-01

    Full Text Available The author has submitted to Mizar Mathematical Library a series of five articles introducing a framework for the formalization of classical first-order model theory. In them, Gödel's completeness and Löwenheim-Skolem theorems have also been formalized for the countable case, to offer a first application of it and to showcase its utility. This is an overview and commentary on some key aspects of this setup. It features exposition and discussion of a new encoding of basic definitions and theoretical gears needed for the task, remarks about the design strategies and approaches adopted in their implementation, and more general reflections about proof checking induced by the work done.

  14. Modeling of tethered satellite formations using graph theory

    DEFF Research Database (Denmark)

    Larsen, Martin Birkelund; Smith, Roy S; Blanke, Mogens

    2011-01-01

    Tethered satellite formations have recently gained increasing attention due to future mission proposals. Several different formations have been investigated for their dynamic properties, and control schemes have been suggested. Formulating the equations of motion and investigating which geometries could form stable formations in space are cumbersome when done on a case-by-case basis, and a common framework providing a basic model of the dynamics of tethered satellite formations can therefore be advantageous. This paper suggests the use of graph-theoretical quantities to describe a tethered satellite formation and proposes a method to deduce the equations of motion for the attitude dynamics of the formation in a compact form. The use of graph theory and Lagrange mechanics together allows a broad class of formations to be described using the same framework. A method is stated for finding...

  15. Partial differential equations in action from modelling to theory

    CERN Document Server

    Salsa, Sandro

    2016-01-01

    The book is intended as an advanced undergraduate or first-year graduate course for students from various disciplines, including applied mathematics, physics and engineering. It has evolved from courses offered on partial differential equations (PDEs) over the last several years at the Politecnico di Milano. These courses had a twofold purpose: on the one hand, to teach students to appreciate the interplay between theory and modeling in problems arising in the applied sciences, and on the other to provide them with a solid theoretical background in numerical methods, such as finite elements. Accordingly, this textbook is divided into two parts. The first part, chapters 2 to 5, is more elementary in nature and focuses on developing and studying basic problems from the macro-areas of diffusion, propagation and transport, waves and vibrations. In turn the second part, chapters 6 to 11, concentrates on the development of Hilbert spaces methods for the variational formulation and the analysis of (mainly) linear bo...

  16. Partial differential equations in action from modelling to theory

    CERN Document Server

    Salsa, Sandro

    2015-01-01

    The book is intended as an advanced undergraduate or first-year graduate course for students from various disciplines, including applied mathematics, physics and engineering. It has evolved from courses offered on partial differential equations (PDEs) over the last several years at the Politecnico di Milano. These courses had a twofold purpose: on the one hand, to teach students to appreciate the interplay between theory and modeling in problems arising in the applied sciences, and on the other to provide them with a solid theoretical background in numerical methods, such as finite elements. Accordingly, this textbook is divided into two parts. The first part, chapters 2 to 5, is more elementary in nature and focuses on developing and studying basic problems from the macro-areas of diffusion, propagation and transport, waves and vibrations. In turn the second part, chapters 6 to 11, concentrates on the development of Hilbert spaces methods for the variational formulation and the analysis of (mainly) linear bo...

  17. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  18. Multimodal Transport System Coevolution Model Based on Synergetic Theory

    Directory of Open Access Journals (Sweden)

    Fenling Feng

    2015-01-01

    Full Text Available This study investigates the evolution law of the multimodal transport system with consideration of synergetic theory. Compared with previous studies, this paper focuses on understanding the influencing factors of collaborative system development. In particular, we apply a multimodal system order parameter model to obtain the order parameters. Based on the order parameters, the coevolution equations of the multimodal transport system are constructed with consideration of the cooperative and competitive relationships between the subsystems. We find that the multimodal system follows the coevolution law of the freight system and is dominated by the combined effects of the order parameters line length and freight density. The results show that the coordination effects between the railway, road, and water subsystems are stronger than for the aviation subsystem; the railway system is the short plank of the system. Some functional implications from this study are also discussed. Finally, the results indicate that expansion of railway system capacity and mutual cooperation within the subsystems are required to reach an optimal multimodal transport system.
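
    The paper's order-parameter equations are not reproduced in the record above; as a generic, hypothetical illustration of cooperative and competitive coupling between transport subsystems, one can integrate Lotka-Volterra-type equations in which an interaction matrix encodes cooperation (positive entries) and competition (negative entries). All coefficients below are invented.

        import numpy as np

        # x = (rail, road, water, air) activity levels; A encodes cooperation (>0)
        # and competition (<0) between subsystems. All numbers are invented.
        r = np.array([0.05, 0.06, 0.04, 0.03])        # intrinsic growth rates
        K = np.ones(4)                                 # carrying capacities
        A = np.array([[ 0.00,  0.01,  0.01, -0.01],
                      [ 0.01,  0.00,  0.01, -0.01],
                      [ 0.01,  0.01,  0.00, -0.01],
                      [-0.01, -0.01, -0.01,  0.00]])

        def step(x, dt=0.1):
            # logistic self-limitation plus pairwise interaction terms
            return x + dt * (r * x * (1.0 - x / K) + x * (A @ x))

        x = np.array([0.2, 0.4, 0.1, 0.05])
        for _ in range(20000):
            x = step(x)
        print("quasi-steady activity levels (rail, road, water, air):", np.round(x, 3))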

  19. Theories and models on the biology of cells in space

    Science.gov (United States)

    Todd, P.; Klaus, D. M.

    1996-01-01

    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/s^2 and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  20. Probing flame chemistry with MBMS, theory, and modeling

    Energy Technology Data Exchange (ETDEWEB)

    Westmoreland, P.R. [Univ. of Massachusetts, Amherst (United States)

    1993-12-01

    The objective is to establish kinetics of combustion and molecular-weight growth in C3 hydrocarbon flames as part of an ongoing study of flame chemistry. Specific reactions being studied are (1) the growth reactions of C3H5 and C3H3 with themselves and with unsaturated hydrocarbons and (2) the oxidation reactions of O and OH with C3 species. This approach combines molecular-beam mass spectrometry (MBMS) experiments on low-pressure flat flames; theoretical predictions of rate constants by thermochemical kinetics, Bimolecular Quantum-RRK, RRKM, and master-equation theory; and whole-flame modeling using full mechanisms of elementary reactions.

  1. Dynamic density functional theory of solid tumor growth: Preliminary models

    Directory of Open Access Journals (Sweden)

    Arnaud Chauviere

    2012-03-01

    Full Text Available Cancer is a disease that can be seen as a complex system whose dynamics and growth result from nonlinear processes coupled across wide ranges of spatio-temporal scales. The current mathematical modeling literature addresses issues at various scales, but the development of theoretical methodologies capable of bridging gaps across scales needs further study. We present a new theoretical framework based on Dynamic Density Functional Theory (DDFT), extended for the first time to the dynamics of living tissues by accounting for cell density correlations, different cell types, phenotypes and cell birth/death processes, in order to provide a biophysically consistent description of processes across the scales. We present an application of this approach to tumor growth.

  2. Development of a dynamic computational model of social cognitive theory

    National Research Council Canada - National Science Library

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-01-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors...

  3. Quantifying the impacts of land surface schemes and dynamic vegetation on the model dependency of projected changes in surface energy and water budgets

    Science.gov (United States)

    Yu, Miao; Wang, Guiling; Chen, Haishan

    2016-03-01

    Assessing and quantifying the uncertainties in projected future changes of energy and water budgets over land surface are important steps toward improving our confidence in climate change projections. In this study, the contribution of land surface models to the inter-GCM variation of projected future changes in land surface energy and water fluxes is assessed based on output from 19 global climate models (GCMs) and offline Community Land Model version 4 (CLM4) simulations driven by meteorological forcing from the 19 GCMs. Similar offline simulations using CLM4 with its dynamic vegetation submodel are also conducted to investigate how dynamic vegetation feedback, a process that is being added to more earth system models, may amplify or moderate the intermodel variations of projected future changes. Projected changes are quantified as the difference between the 2081-2100 period from the Representative Concentration Pathway 8.5 (RCP8.5) future experiment and the 1981-2000 period from the historical simulation. Under RCP8.5, projected changes in surface water and heat fluxes show a high degree of model dependency across the globe. Although precipitation is very likely to increase in the high latitudes of the Northern Hemisphere, a high degree of model-related uncertainty exists for evapotranspiration, soil water content, and surface runoff, suggesting discrepancies among land surface models (LSMs) in simulating the surface hydrological processes and snow-related processes. Large model-related uncertainties for the surface water budget also exist in the Tropics including southeastern South America and Central Africa. These uncertainties would be reduced in the hypothetical scenario of a single near-perfect land surface model being used across all GCMs, suggesting the potential to reduce uncertainties through the use of more consistent approaches toward land surface model development. Under such a scenario, the most significant reduction is likely to be seen in the
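
    As a hypothetical illustration of how the model-related uncertainty discussed above is typically quantified, the sketch below computes each model's projected change as a future-minus-historical difference and reports the multi-model mean and the inter-model spread (one standard deviation). The synthetic evapotranspiration numbers are invented placeholders, not CMIP5 or CLM4 output.

        import numpy as np

        rng = np.random.default_rng(42)
        n_models = 19
        # Invented period-mean evapotranspiration (mm/yr) per model for the
        # historical (1981-2000) and future (2081-2100) windows.
        historical_et = rng.normal(loc=480.0, scale=25.0, size=n_models)
        future_et = historical_et + rng.normal(loc=35.0, scale=20.0, size=n_models)

        change = future_et - historical_et            # projected change per model
        print(f"multi-model mean change:     {change.mean():6.1f} mm/yr")
        print(f"inter-model spread (1 s.d.): {change.std(ddof=1):6.1f} mm/yr")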

  4. Multimedia Budget Presentations.

    Science.gov (United States)

    Hughes, Jonathon T.; Rodabaugh, Karl

    This book provides an overview of the potential of multimedia budget proposals. The text reviews the fundamentals of multimedia, emphasizing how it improves communication by using multiple levels of input. A process for analyzing many of the budget decisions that must be made, as adapted from Robert Finney's five-step process of "Gap…

  5. SUSY Breaking in Local String/F-Theory Models

    CERN Document Server

    Blumenhagen, R; Krippendorf, S; Moster, S; Quevedo, F

    2009-01-01

    We investigate bulk moduli stabilisation and supersymmetry breaking in local string/F-theory models where the Standard Model is supported on a del Pezzo surface or singularity. Computing the gravity mediated soft terms on the Standard Model brane induced by bulk supersymmetry breaking in the LARGE volume scenario, we explicitly find suppressions by M_s/M_P ~ V^{-1/2} compared to M_{3/2}. This gives rise to several phenomenological scenarios, depending on the strength of perturbative corrections to the effective action and the source of de Sitter lifting, in which the soft terms are suppressed by at least M_P/V^{3/2} and may be as small as M_P/V^2. Since the gravitino mass is of order M_{3/2} ~ M_P/V, for TeV soft terms all these scenarios give a very heavy gravitino (M_{3/2} >= 10^8 GeV) and generically the lightest moduli field is also heavy enough (m >= 10 TeV) to avoid the cosmological moduli problem. For TeV soft terms, these scenarios predict a minimal value of the volume to be V ~ 10^{6-7} in string uni...

  6. Theory, Modeling and Simulation: Research progress report 1994--1995

    Energy Technology Data Exchange (ETDEWEB)

    Garrett, B.C.; Dixon, D.A.; Dunning, T.H.

    1997-01-01

    The Pacific Northwest National Laboratory (PNNL) has established the Environmental Molecular Sciences Laboratory (EMSL). In April 1994, construction began on the new EMSL, a collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TM and S) program will play a critical role in understanding molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and ground water, developing processes for isolation and processing of pollutants, developing improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TM and S program are fivefold: to apply available electronic structure and dynamics techniques to study fundamental molecular processes involved in the chemistry of natural and contaminated systems; to extend current electronic structure and dynamics techniques to treat molecular systems of future importance and to develop new techniques for addressing problems that are computationally intractable at present; to apply available molecular modeling techniques to simulate molecular processes occurring in the multi-species, multi-phase systems characteristic of natural and polluted environments; to extend current molecular modeling techniques to treat ever more complex molecular systems and to improve the reliability and accuracy of such simulations; and to develop technologies for advanced parallel architectural computer systems. Research highlights of 82 projects are given.

  7. Optimization models using fuzzy sets and possibility theory

    CERN Document Server

    Orlovski, S

    1987-01-01

    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp", i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be...

  8. Preparing the operating budget.

    Science.gov (United States)

    Williams, R B

    1983-12-01

    The process of preparing a hospital pharmacy budget is presented. The desired characteristics of a budget and the process by which it is developed and approved are described. Fixed, flexible, and zero-based budget types are explained, as are the major components of a well-developed budget: expense, workload, productivity, revenue, and capital equipment and other expenditures. Specific methods for projecting expenses and revenues, based on historical data, are presented along with a discussion of variables that must be considered in order to achieve an accurate and useful budget. The current shift in emphasis away from revenue capture toward critical analysis of pharmacy costs underscores the importance of budgetary analysis for hospital pharmacy managers.
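
    As a toy illustration of the flexible-budget idea mentioned above (restating the expense target for the workload actually experienced rather than the workload originally assumed), the sketch below compares a static and a flexible budget and computes a spending variance. All figures are invented.

        # Toy flexible-budget sketch for a hospital pharmacy (all figures invented).
        # A flexible budget separates fixed costs from costs that vary with workload,
        # so the expense target can be restated for the actual workload.
        fixed_costs = 250_000.0            # permanent staff, space, etc.
        variable_cost_per_order = 14.50    # drugs and supplies per medication order
        budgeted_orders = 120_000          # workload assumed when the budget was built
        actual_orders = 131_500            # workload actually experienced
        actual_expense = 2_180_000.0

        static_budget = fixed_costs + variable_cost_per_order * budgeted_orders
        flexible_budget = fixed_costs + variable_cost_per_order * actual_orders

        print(f"static budget:   {static_budget:>12,.0f}")
        print(f"flexible budget: {flexible_budget:>12,.0f}")
        print(f"spending variance vs flexible budget: {actual_expense - flexible_budget:,.0f}")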

  9. Attachment Theory and Theory of Planned Behavior: An Integrative Model Predicting Underage Drinking

    Science.gov (United States)

    Lac, Andrew; Crano, William D.; Berger, Dale E.; Alvaro, Eusebio M.

    2013-01-01

    Research indicates that peer and maternal bonds play important but sometimes contrasting roles in the outcomes of children. Less is known about attachment bonds to these 2 reference groups in young adults. Using a sample of 351 participants (18 to 20 years of age), the research integrated two theoretical traditions: attachment theory and theory of…

  10. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    Science.gov (United States)

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  11. Eye growth and myopia development: Unifying theory and Matlab model.

    Science.gov (United States)

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal

    2016-03-01

    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. These model simulation programs

  12. Theory and Low-Order Modeling of Unsteady Airfoil Flows

    Science.gov (United States)

    Ramesh, Kiran

    Unsteady flow phenomena are prevalent in a wide range of problems in nature and engineering. These include, but are not limited to, aerodynamics of insect flight, dynamic stall in rotorcraft and wind turbines, leading-edge vortices in delta wings, micro-air vehicle (MAV) design, gust handling and flow control. The most significant characteristics of unsteady flows are rapid changes in the circulation of the airfoil, apparent-mass effects, flow separation and the leading-edge vortex (LEV) phenomenon. Although experimental techniques and computational fluid dynamics (CFD) methods have enabled the detailed study of unsteady flows and their underlying features, a reliable and inexpensive low-order method for fast prediction and for use in control and design is still required. In this research, a low-order methodology based on physical principles rather than empirical fitting is proposed. The objective of such an approach is to enable insights into unsteady phenomena while developing approaches to model them. The basis of the low-order model developed here is unsteady thin-airfoil theory. A time-stepping approach is used to solve for the vorticity on an airfoil camberline, allowing for large amplitudes and nonplanar wakes. On comparing lift coefficients from this method against data from CFD and experiments for some unsteady test cases, it is seen that the method predicts well so long as LEV formation does not occur and flow over the airfoil is attached. The formation of leading-edge vortices (LEVs) in unsteady flows is initiated by flow separation and the formation of a shear layer at the airfoil's leading edge. This phenomenon has been observed to have both detrimental (dynamic stall in helicopters) and beneficial (high-lift flight in insects) effects. To predict the formation of LEVs in unsteady flows, a Leading Edge Suction Parameter (LESP) is proposed. This parameter is calculated from inviscid theory and is a measure of the suction at the airfoil's leading edge. It

  13. Building dynamic models and theories to advance the science of symptom management research.

    Science.gov (United States)

    Brant, Jeannine M; Beck, Susan; Miaskowski, Christine

    2010-01-01

    This paper is a description, comparison, and critique of two models and two theories used to guide symptom management research, and a proposal of directions for new theory or model development. Symptom management research has undergone a paradigmatic shift to include symptom clusters, longitudinal studies that examine symptom trajectories, and the effects of interventions on patient outcomes. Models and theories are used to guide descriptive and intervention research. Over the past 15 years, four conceptual models or theories (i.e. Theory of Symptom Management, the Theory of Unpleasant Symptoms, the Symptoms Experience Model and the Symptoms Experience in Time Model) were used in a variety of symptom management studies. Literature searches were performed in Medline and the Cumulative Index of Nursing and Allied Health Literature between 1990 and 2008 for models and theories that guide symptom management research. Related papers and book chapters were used as supporting documentation. Comparison and critique of the models and theories revealed important gaps including lack of consideration of symptom clusters, failure to incorporate temporal aspects of the symptom experience and failure to incorporate the impact of interventions on patient outcomes. New models and theories should incorporate current trends in symptom management research, capture the dynamic nature of symptoms and incorporate concepts that will facilitate transdisciplinary research in symptom management. Researchers and clinicians need to build more expansive and dynamic symptom management models and theories that parallel advances in symptom research and practice.

  14. Theory development for HIV behavioral health: empirical validation of behavior health models specific to HIV risk.

    Science.gov (United States)

    Traube, Dorian E; Holloway, Ian W; Smith, Lana

    2011-06-01

    In the presence of numerous health behavior theories, it is difficult to determine which of the many theories is most precise in explaining health-related behavior. New models continue to be introduced to the field, despite already existing disparity, overlap, and lack of unification among health promotion theories. This paper will provide an overview of current arguments and frameworks for testing and developing a comprehensive set of health behavior theories. In addition, the authors make a unique contribution to the HIV health behavior theory literature by moving beyond current health behavior theory critiques to argue that one of the field's preexisting, but less popular theories, Social Action Theory (SAT), offers a pragmatic and broad framework to address many of the accuracy issues within HIV health behavior theory. The authors conclude this article by offering a comprehensive plan for validating model accuracy, variable influence, and behavioral applicability of SAT.

  15. Integrating Beck's cognitive model and the response style theory in an adolescent sample.

    Science.gov (United States)

    Winkeljohn Black, Stephanie; Pössel, Patrick

    2015-01-01

    Depression becomes more prevalent as individuals progress from childhood to adulthood. Thus, empirically supported and popular cognitive vulnerability theories to explain depression in adulthood have begun to be tested in younger age groups, particularly adolescence, a time of significant cognitive development. Beck's cognitive theory and the response style theory are well known, empirically supported theories of depression. The current, two-wave longitudinal study (N = 462; mean age = 16.01 years; SD = 0.69; 63.9% female) tested various proposed integrative models of Beck's cognitive theory and the response style theory, as well as the original theories themselves, to determine if and how these cognitive vulnerabilities begin to intertwine in adolescence. Of the integrative models tested (all with structural equation modeling in AMOS 21), the best-fitting integrative model was a moderation model wherein schemata influenced rumination, and rumination then influenced other cognitive variables in Beck's model. Findings revealed that this integrated model fit the data better than the response style theory and explained 1.2% more variance in depressive symptoms. Additionally, multigroup analyses comparing the fit of the best-fitting integrated model across adolescents with clinical and subclinical depressive symptoms revealed that the model was not stable between these two subsamples. However, of the hypotheses relevant to the integrative model, only 1 of the 18 associations was significantly different between the clinical and subclinical samples. Regardless, the integrated model was not superior to the more parsimonious model from Beck's cognitive theory. Implications and limitations are discussed.

  16. A practitioner's guide to persuasion: an overview of 15 selected persuasion theories, models and frameworks.

    Science.gov (United States)

    Cameron, Kenzie A

    2009-03-01

    To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.

  17. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).
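
    The sketch below is only a hypothetical illustration of the final step described above: forming a workload profile by dividing task demand by measured capacity for each basic performance resource and taking the largest ratio as a single-number cognitive workload (CW). It does not implement NCRA's nonlinear resource-demand estimation, and all numbers and resource names are invented.

        # Hypothetical workload profile: demand imposed by the task divided by the
        # individual's measured capacity, per basic cognitive performance resource.
        task_demand = {
            "information_processing_speed": 42.0,   # e.g. items/s required by the task
            "short_term_memory_capacity":   5.0,    # e.g. chunks that must be held
            "visual_attention_span":        3.5,
        }
        subject_capacity = {
            "information_processing_speed": 60.0,
            "short_term_memory_capacity":   7.0,
            "visual_attention_span":        4.0,
        }

        profile = {k: task_demand[k] / subject_capacity[k] for k in task_demand}
        cw = max(profile.values())   # single-number CW = most heavily loaded resource

        for resource, load in sorted(profile.items(), key=lambda kv: -kv[1]):
            print(f"{resource:32s} utilisation = {load:.2f}")
        print(f"cognitive workload (limiting resource) = {cw:.2f}")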

  18. The physical theory and propagation model of THz atmospheric propagation

    Science.gov (United States)

    Wang, R.; Yao, J. Q.; Xu, D. G.; Wang, J. L.; Wang, P.

    2011-02-01

    Terahertz (THz) radiation is extensively applied in diverse fields, such as space communication, Earth environment observation, atmosphere science, remote sensing and so on, and research on the propagation features of THz waves in the atmosphere is becoming more and more important. This paper first illuminates the advantages and outlook of THz in space technology. Then it introduces the theoretical framework of THz atmospheric propagation, including some fundamental physical concepts and processes. The attenuation effect (especially the absorption of water vapor), the scattering of aerosol particles and the effect of turbulent flow mainly influence THz atmospheric propagation. Fundamental physical laws are illuminated as well, such as the Lambert-Beer law, Mie scattering theory and the radiative transfer equation. The last part comprises the demonstration and comparison of THz atmospheric propagation models like Moliere (V5), SARTre and AMATERASU. The essential problems are the deep analysis of the physical mechanism of this process, the construction of atmospheric propagation models and databases of every kind of material in the atmosphere, and the standardization of measurement procedures.
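
    The Lambert-Beer law mentioned above states that power decays exponentially along a homogeneous absorbing path, I(L) = I0 exp(-alpha L). The sketch below evaluates this for a few path lengths; the absorption coefficient is an invented placeholder, since real THz attenuation depends strongly on frequency and humidity.

        import numpy as np

        def transmitted_power(p0_w, alpha_per_km, path_km):
            # Lambert-Beer law: exponential attenuation along a homogeneous path.
            return p0_w * np.exp(-alpha_per_km * path_km)

        p0 = 1e-3          # 1 mW transmitted power (placeholder)
        alpha = 5.0        # absorption coefficient in 1/km (invented placeholder)
        for d_km in (0.1, 0.5, 1.0, 2.0):
            p = transmitted_power(p0, alpha, d_km)
            print(f"{d_km:4.1f} km: {p:.3e} W  ({10 * np.log10(p / p0):7.1f} dB)")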

  19. Global Carbon Budget 2016

    Science.gov (United States)

    Le Quéré, Corinne; Andrew, Robbie M.; Canadell, Josep G.; Sitch, Stephen; Korsbakken, Jan Ivar; Peters, Glen P.; Manning, Andrew C.; Boden, Thomas A.; Tans, Pieter P.; Houghton, Richard A.; Keeling, Ralph F.; Alin, Simone; Andrews, Oliver D.; Anthoni, Peter; Barbero, Leticia; Bopp, Laurent; Chevallier, Frédéric; Chini, Louise P.; Ciais, Philippe; Currie, Kim; Delire, Christine; Doney, Scott C.; Friedlingstein, Pierre; Gkritzalis, Thanos; Harris, Ian; Hauck, Judith; Haverd, Vanessa; Hoppema, Mario; Klein Goldewijk, Kees; Jain, Atul K.; Kato, Etsushi; Körtzinger, Arne; Landschützer, Peter; Lefèvre, Nathalie; Lenton, Andrew; Lienert, Sebastian; Lombardozzi, Danica; Melton, Joe R.; Metzl, Nicolas; Millero, Frank; Monteiro, Pedro M. S.; Munro, David R.; Nabel, Julia E. M. S.; Nakaoka, Shin-ichiro; O'Brien, Kevin; Olsen, Are; Omar, Abdirahman M.; Ono, Tsuneo; Pierrot, Denis; Poulter, Benjamin; Rödenbeck, Christian; Salisbury, Joe; Schuster, Ute; Schwinger, Jörg; Séférian, Roland; Skjelvan, Ingunn; Stocker, Benjamin D.; Sutton, Adrienne J.; Takahashi, Taro; Tian, Hanqin; Tilbrook, Bronte; van der Laan-Luijkx, Ingrid T.; van der Werf, Guido R.; Viovy, Nicolas; Walker, Anthony P.; Wiltshire, Andrew J.; Zaehle, Sönke

    2016-11-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere - the "global carbon budget" - is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates and consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models. We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2006-2015), EFF was 9
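
    The budget identity underlying the record above can be written EFF + ELUC = GATM + SOCEAN + SLAND, so the residual land sink is obtained by closing the budget from the other terms. The sketch below does exactly that with placeholder flux values, not the paper's estimates.

        # Global carbon budget identity (all fluxes in GtC per year):
        #   E_FF + E_LUC = G_ATM + S_OCEAN + S_LAND
        # so the residual terrestrial sink is estimated as the closing term.
        # The values below are placeholders for illustration, not the paper's numbers.
        e_ff, e_luc = 9.3, 1.0        # fossil fuel/industry and land-use change emissions
        g_atm, s_ocean = 4.5, 2.6     # atmospheric growth and ocean sink

        s_land = (e_ff + e_luc) - (g_atm + s_ocean)
        print(f"residual land sink S_LAND ~ {s_land:.1f} GtC/yr")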

  20. Global Carbon Budget 2016

    Science.gov (United States)

    Quéré, Corinne Le; Andrew, Robbie M.; Canadell, Josep G.; Sitch, Stephen; Korsbakken, Jan Ivar; Peters, Glen P.; Manning, Andrew C.; Boden, Thomas A.; Tans, Pieter P.; Houghton, Richard A.

    2016-01-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere (the "global carbon budget") is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates and consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models. We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterize the annual estimates of each component of the global carbon budget. For the last decade available (2006-2015), EFF was 9

  1. A theory and a computational model of spatial reasoning with preferred mental models.

    Science.gov (United States)

    Ragni, Marco; Knauff, Markus

    2013-07-01

    Inferences about spatial arrangements and relations like "The Porsche is parked to the left of the Dodge and the Ferrari is parked to the right of the Dodge, thus, the Porsche is parked to the left of the Ferrari," are ubiquitous. However, spatial descriptions are often interpretable in many different ways and compatible with several alternative mental models. This article suggests that individuals tackle such indeterminate multiple-model problems by constructing a single, simple, and typical mental model but neglect other possible models. The model that first comes to reasoners' minds is the preferred mental model. It helps save cognitive resources but also leads to reasoning errors and illusory inferences. The article presents a preferred model theory and an instantiation of this theory in the form of a computational model, preferred inferences in reasoning with spatial mental models (PRISM). PRISM can be used to simulate and explain how preferred models are constructed, inspected, and varied in a spatial array that functions as if it were a spatial working memory. A spatial focus inserts tokens into the array, inspects the array to find new spatial relations, and relocates tokens in the array to generate alternative models of the problem description, if necessary. The article also introduces a general measure of difficulty based on the number of necessary focus operations (rather than the number of models). A comparison with results from psychological experiments shows that the theory can explain preferences, errors, and the difficulty of spatial reasoning problems. PsycINFO Database Record (c) 2013 APA, all rights reserved.
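
    The sketch below is not PRISM; it is a minimal illustration of the single-model idea: tokens from "left of"/"right of" premises are inserted into a one-dimensional array using a simple first-fit placement, and a conclusion is read off the resulting (preferred) model. The premises reuse the example from the record above.

        def build_preferred_model(premises):
            # Insert each token directly next to its reference object in a 1-D
            # array (a simple "first fit" placement) -- a toy illustration of
            # constructing a single preferred model, not the PRISM algorithm.
            model = []
            for subject, relation, obj in premises:   # e.g. ("Porsche", "left_of", "Dodge")
                if not model:
                    model = [subject, obj] if relation == "left_of" else [obj, subject]
                    continue
                anchor = obj if obj in model else subject
                new = subject if anchor == obj else obj
                i = model.index(anchor)
                goes_left = (relation == "left_of") == (new == subject)
                model.insert(i if goes_left else i + 1, new)
            return model

        def holds(model, subject, relation, obj):
            return (model.index(subject) < model.index(obj)) == (relation == "left_of")

        premises = [("Porsche", "left_of", "Dodge"), ("Ferrari", "right_of", "Dodge")]
        m = build_preferred_model(premises)
        print(m, "->", holds(m, "Porsche", "left_of", "Ferrari"))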

  2. Cold and hot cognition: quantum probability theory and realistic psychological modeling.

    Science.gov (United States)

    Corr, Philip J

    2013-06-01

    Typically, human decision making is emotionally "hot" and does not conform to "cold" classical probability (CP) theory. As quantum probability (QP) theory emphasises order, context, superimposition states, and nonlinear dynamic effects, one of its major strengths may be its power to unify formal modeling and realistic psychological theory (e.g., information uncertainty, anxiety, and indecision, as seen in the Prisoner's Dilemma).

  3. Dynamic chemical process modelling and validation : Theory and application to industrial and literature case study

    NARCIS (Netherlands)

    Schmal, J.P.

    2014-01-01

    Dynamic chemical process modelling is still largely considered an art. In this thesis the theory of large-scale chemical process modelling and validation is discussed and initial steps to extend the theory are explored. In particular we pay attention to the effect of the level of detail on the model

  4. Density functional theory and dynamical mean-field theory. A way to model strongly correlated systems

    Energy Technology Data Exchange (ETDEWEB)

    Backes, Steffen

    2017-04-15

    The study of the electronic properties of correlated systems is a very diverse field and has led to valuable insight into the physics of real materials. In these systems, the decisive factor that governs the physical properties is the ratio between the electronic kinetic energy, which promotes delocalization over the lattice, and the Coulomb interaction, which instead favours localized electronic states. Due to this competition, correlated electronic systems can show unique and interesting properties like the Metal-Insulator transition, diverse phase diagrams, strong temperature dependence and in general a high sensitivity to the environmental conditions. A theoretical description of these systems is not an easy task, since perturbative approaches that do not preserve the competition between the kinetic and interaction terms can only be applied in special limiting cases. One of the most famous approaches to obtain the electronic properties of a real material is the ab initio density functional theory (DFT) method. It allows one to obtain the ground state density of the system under investigation by mapping onto an effective non-interacting system that has to be found self-consistently. While being an exact theory, in practical implementations certain approximations have to be made to the exchange-correlation potential. The local density approximation (LDA), which approximates the exchange-correlation contribution to the total energy by that of a homogeneous electron gas with the corresponding density, has proven quite successful in many cases. However, this approximation in general leads to an underestimation of electronic correlations and is not able to describe a metal-insulator transition due to electronic localization in the presence of strong Coulomb interaction. A different approach to the interacting electronic problem is the dynamical mean-field theory (DMFT), which is non-perturbative in the kinetic and interaction term but neglects all non

  5. FY 1997 congressional budget request: Budget highlights

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    This is an overview of the 1997 budget request for the US DOE. The topics of the overview include a policy overview, the budget by business line, business lines by organization, crosswalk from business line to appropriation, summary by appropriation, energy supply research and development, uranium supply and enrichment activities, uranium enrichment decontamination and decommissioning fund, general science and research, weapons activities, defense environmental restoration and waste management, defense nuclear waste disposal, departmental administration, Office of the Inspector General, power marketing administrations, Federal Energy Regulatory Commission, nuclear waste disposal fund, fossil energy research and development, naval petroleum and oil shale reserves, energy conservation, economic regulation, strategic petroleum reserve, energy information administration, clean coal technology and a Department of Energy Field Facilities map.

  6. Towards theory integration: Threshold model as a link between signal detection theory, fast-and-frugal trees and evidence accumulation theory.

    Science.gov (United States)

    Hozo, Iztok; Djulbegovic, Benjamin; Luan, Shenghua; Tsalatsanis, Athanasios; Gigerenzer, Gerd

    2017-02-01

    Theories of decision making are divided between those aiming to help decision makers in the real, 'large' world and those who study decisions in idealized 'small' world settings. For the most part, these large- and small-world decision theories remain disconnected. We linked the small-world decision theoretic concepts of signal detection theory (SDT) and evidence accumulation theory (EAT) to the threshold model and the large world of heuristic decision making that rely on fast-and-frugal decision trees (FFT). We connected these large- and small-world theories by demonstrating that seemingly different decision-making concepts are actually equivalent. In doing so, we were able (1) to link the threshold model to EAT and FFT, thereby creating decision criteria that take into account both the classification accuracy of FFT and the consequences built in the threshold model; (2) to demonstrate how threshold criteria can be used as a strategy for optimal selection of cues when constructing FFT; and (3) to show that the compensatory strategy expressed in the threshold model can be linked to a non-compensatory FFT approach to decision making. We also showed how construction and performance of FFT depend on having reliable information - the results were highly sensitive to the estimates of benefits and harms of health interventions. We illustrate the practical usefulness of our analysis by describing an FFT we developed for prescribing statins for primary prevention of cardiovascular disease. By linking SDT and EAT to the compensatory threshold model and to non-compensatory heuristic decision making (FFT), we showed how these two decision strategies are ultimately linked within a broader theoretical framework and thereby respond to calls for integrating decision theory paradigms. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
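
    The compensatory threshold calculation that the record above builds on can be illustrated with the classic decision threshold Pt = H / (H + B): act when the estimated probability of the event exceeds Pt. The benefit and harm values below are placeholders, not the statin numbers used in the paper.

        def treatment_threshold(benefit, harm):
            # Threshold probability above which acting (e.g. prescribing a statin)
            # is expected to do more good than harm: Pt = H / (H + B).
            return harm / (harm + benefit)

        def decide(p_event, benefit, harm):
            return "treat" if p_event >= treatment_threshold(benefit, harm) else "do not treat"

        # Placeholder numbers: benefit = absolute risk reduction from treating a true
        # case, harm = net harm of treating someone who would not have had an event.
        benefit, harm = 0.20, 0.02
        for p in (0.02, 0.05, 0.15, 0.30):
            print(f"estimated risk {p:.2f}: threshold {treatment_threshold(benefit, harm):.3f}"
                  f" -> {decide(p, benefit, harm)}")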

  7. Validation of cloud forcing simulated by the National Center for Atmospheric Research Community Climate Model using observations from the Earth Radiation Budget Experiment

    Science.gov (United States)

    Soden, B. J.

    1992-01-01

    Satellite measurements of the effect of clouds on the top of atmosphere radiative energy budget are used to validate model simulations from the National Center for Atmospheric Research Community Climate Model (NCAR CCM). The ability of the NCAR CCM to reproduce the monthly mean global distribution and temporal variability on both daily and seasonal time scales is assessed. The comparison reveals several deficiencies in the CCM cloud representation. Most notable are the difficulties in properly simulating the effect of clouds on the planetary albedo. This problem arises from discrepancies in the model's portrayal of low-level cloudiness and leads to significant errors in the absorbed solar radiation simulated by the model. The CCM performs much better in simulating the effect of clouds on the longwave radiation emitted to space, indicating its relative success in capturing the vertical distribution of cloudiness. The daily variability of the radiative effects of clouds in both the shortwave and longwave spectral regions is systematically overestimated. Analysis of the seasonal variations illustrates a distinct lack of coupling in the seasonal changes in the radiative effects of cloudiness between the tropics and mid-latitudes and between the Northern and Southern Hemisphere. Much of this problem also arises from difficulties in simulating low-level cloudiness, placing further emphasis on the need for better model parameterizations of boundary layer clouds.

  8. Queuing theory models used for port equipment sizing

    Science.gov (United States)

    Dragu, V.; Dinu, O.; Ruscă, A.; Burciu, Ş.; Roman, E. A.

    2017-08-01

    The significant growth of volumes and distances in road transportation has led to the need for solutions that increase the market share of water transportation, together with the handling and transfer technologies within its terminals. It is widely known that the largest share of transport time is consumed within terminals (loading/unloading/transfer), hence the need to constantly develop handling techniques and technologies matched to the size of the goods flows so that the total waiting time of ships within ports is reduced. Port development should be achieved by harmonizing the contradictory interests of port administration and users: port administrators aim to increase profit, whereas users want savings through an increased consumers' surplus. The difficulty is that the transport demand-supply equilibrium must be reached at costs and transiting goods quantities that satisfy the interests of both parties involved. This paper presents a port equipment sizing model based on queueing theory that minimizes the sum of the ships' waiting costs and the equipment usage costs. Ship operation within the port is modelled as a waiting (mass service) system whose parameters are then used to determine the main costs for ships and port equipment.
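
    A common way to size port equipment with queueing theory, consistent with the cost trade-off described above, is to model berth/equipment service as an M/M/c queue and pick the number of servers that minimizes waiting cost plus equipment cost. The sketch below uses the standard Erlang-C formula; the arrival rate, service rate and unit costs are invented, and the M/M/c assumption is a simplification rather than the paper's model.

        import math

        def erlang_c(c, a):
            # Probability of waiting in an M/M/c queue with offered load a = lam/mu.
            if a >= c:
                return 1.0                       # unstable regime: everyone waits
            s = sum(a**k / math.factorial(k) for k in range(c))
            top = a**c / math.factorial(c) * c / (c - a)
            return top / (s + top)

        def mean_wait(c, lam, mu):
            # Mean time in queue (hours) for ships arriving at rate lam and
            # served at rate mu per equipped berth.
            return erlang_c(c, lam / mu) / (c * mu - lam)

        lam, mu = 2.0, 0.6            # ships/hour arriving, ships/hour per berth (invented)
        ship_wait_cost = 800.0        # cost of one ship-hour of waiting (invented)
        berth_cost = 350.0            # hourly cost of one equipped berth (invented)

        best = min(
            range(4, 15),
            key=lambda c: lam * mean_wait(c, lam, mu) * ship_wait_cost + c * berth_cost,
        )
        print("cost-minimising number of equipped berths:", best)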

  9. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    Science.gov (United States)

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  10. Multiscale modeling of lymphatic drainage from tissues using homogenization theory.

    Science.gov (United States)

    Roose, Tiina; Swartz, Melody A

    2012-01-03

    Lymphatic capillary drainage of interstitial fluid under both steady-state and inflammatory conditions is important for tissue fluid balance, cancer metastasis, and immunity. Lymphatic drainage function is critically coupled to the fluid mechanical properties of the interstitium, yet this coupling is poorly understood. Here we sought to effectively model the lymphatic-interstitial fluid coupling and ask why the lymphatic capillary network often appears with roughly a hexagonal architecture. We use the homogenization method, which allows tissue-scale lymph flow to be integrated with the microstructural details of the lymphatic capillaries, thus gaining insight into the functionality of lymphatic anatomy. We first describe flow in lymphatic capillaries using the Navier-Stokes equations and flow through the interstitium using Darcy's law. We then use multiscale homogenization to derive macroscale equations describing lymphatic drainage, with the mouse tail skin as a basis. We find that the limiting resistance for fluid drainage is that from the interstitium into the capillaries rather than within the capillaries. We also find that between hexagonal, square, and parallel tube configurations of lymphatic capillary networks, the hexagonal structure is the most efficient architecture for coupled interstitial and capillary fluid transport; that is, it clears the most interstitial fluid for a given network density and baseline interstitial fluid pressure. Thus, using homogenization theory, one can assess how vessel microstructure influences the macroscale fluid drainage by the lymphatics and demonstrate why the hexagonal network of dermal lymphatic capillaries is optimal for interstitial tissue fluid clearance. Copyright © 2011 Elsevier Ltd. All rights reserved.
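
    Darcy's law, used above for interstitial flow, relates the volumetric flux to the pressure gradient through the permeability and viscosity. The sketch below evaluates q = (K/mu) * (dP/L) for order-of-magnitude placeholder values; it is not the paper's homogenised model.

        # Minimal Darcy's-law sketch for interstitial flow (placeholder values only).
        K = 1e-14          # interstitial permeability, m^2 (order-of-magnitude guess)
        mu = 1e-3          # fluid viscosity, Pa*s (roughly water)
        dP = 50.0          # interstitium-to-capillary pressure difference, Pa
        L = 50e-6          # drainage path length, m

        q = (K / mu) * (dP / L)          # Darcy flux, m/s (volume per area per time)
        print(f"Darcy flux ~ {q:.2e} m/s")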

  11. Sustainable theory of a logistic model - Fisher information approach.

    Science.gov (United States)

    Al-Saffar, Avan; Kim, Eun-Jin

    2017-03-01

    Information theory provides a useful tool to understand the evolution of complex nonlinear systems and their sustainability. In particular, Fisher information has been evoked as a useful measure of sustainability and the variability of dynamical systems including self-organising systems. By utilising Fisher information, we investigate the sustainability of the logistic model for different perturbations in the positive and/or negative feedback. Specifically, we consider different oscillatory modulations in the parameters for positive and negative feedback and investigate their effect on the evolution of the system and Probability Density Functions (PDFs). Depending on the relative time scale of the perturbation to the response time of the system (the linear growth rate), we demonstrate the maintenance of the initial condition for a long time, manifested by a broad bimodal PDF. We present the analysis of Fisher information in different cases and elucidate its implications for the sustainability of population dynamics. We also show that a purely oscillatory growth rate can lead to a finite amplitude solution while self-organisation of these systems can break down with an exponentially growing solution due to the periodic fluctuations in negative feedback. Copyright © 2017 Elsevier Inc. All rights reserved.
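
    As a rough, hypothetical illustration of the kind of analysis described above (not the paper's derivation), the sketch below integrates a logistic model whose negative feedback is perturbed periodically, builds a histogram-based PDF of the state over the second half of the run, and evaluates the Fisher information of that PDF numerically. All parameter values are illustrative.

        import numpy as np

        dt, steps = 0.01, 100_000
        r, d0, eps, omega = 1.0, 1.0, 0.3, 0.5   # growth, baseline decay, perturbation

        x = 0.5
        trajectory = np.empty(steps)
        for n in range(steps):
            d = d0 * (1.0 + eps * np.sin(omega * n * dt))   # oscillating negative feedback
            x += dt * x * (r - d * x)                       # perturbed logistic dynamics
            trajectory[n] = x

        # PDF of the state over the second half of the run, then Fisher information
        # I = integral of (dp/dx)^2 / p dx evaluated on the histogram grid.
        p, edges = np.histogram(trajectory[steps // 2:], bins=100, density=True)
        dx = edges[1] - edges[0]
        dpdx = np.gradient(p, dx)
        mask = p > 0
        fisher = np.sum(dpdx[mask] ** 2 / p[mask]) * dx
        print(f"Fisher information of the quasi-stationary PDF ~ {fisher:.3g}")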

  12. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  13. A New Approach in Public Budgeting: Citizens' Budget

    Science.gov (United States)

    Bilge, Semih

    2015-01-01

    Change and transformation in the understanding and definition of citizenship has led to the emergence of citizen-oriented public service approach. This approach also raised a new term and concept in the field of public budgeting because of the transformation in the processes of public budgeting: citizens' budget. The citizens' budget which seeks…

  14. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game theory, a branch of operations research, has been successfully applied to solve various categories of problems arising from human decision making, characterized by the complexity of situations and the limits of individual processing abilities. This paper shows that game theory and linear programming problem are ...

  15. Non-static plane symmetric cosmological model in Wesson's theory

    Indian Academy of Sciences (India)

    B. Mishra (The ICFAI Institute of Science and Technology, Fortune Towers A, Bhubaneswar 751 023, India; bivudutta@yahoo.com). A non-static plane symmetric cosmological model in Wesson's scale-invariant theory of gravitation with a time-dependent gauge function is investigated. ...

  16. Social Construction Theory and the Satir Model: Toward a Synthesis.

    Science.gov (United States)

    Cheung, Maria

    1997-01-01

    Synthesizes social construction theory and the Satir approach to family therapy as a process of cocreation of reality, the use of language and narrative, and the therapist's role as a participant-facilitator. Presents a theory-building process of the Satir approach to family therapy. (Author/MKA)

  17. Relevance Theory as model for analysing visual and multimodal communication

    NARCIS (Netherlands)

    Forceville, C.; Machin, D.

    2014-01-01

    Elaborating on my earlier work (Forceville 1996: chapter 5, 2005, 2009; see also Yus 2008), I will here sketch how discussions of visual and multimodal discourse can be embedded in a more general theory of communication and cognition: Sperber and Wilson’s Relevance Theory/RT (Sperber and Wilson

  18. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    Science.gov (United States)

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  19. Applying Learning Theories and Instructional Design Models for Effective Instruction

    Science.gov (United States)

    Khalil, Mohammed K.; Elkhider, Ihsan A.

    2016-01-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…

  20. Modeling and Performing Relational Theories in the Classroom

    Science.gov (United States)

    Suter, Elizabeth A.; West, Carrie L.

    2011-01-01

    Although directly related to students' everyday lives, the abstract and even intimidating nature of relational theories often bars students from recognizing the immediate relevance to their relationships. The theories of symbolic interactionism, social exchange, relational dialectics, social penetration, and uncertainty reduction offer students…

  1. Models and theories of prescribing decisions: A review and suggested a new model

    Science.gov (United States)

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory approach rather than a theoretical one. Therefore, this review is an attempt to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, it identifies and uses several valuable perspectives, such as persuasion theory (the elaboration likelihood model), the stimuli-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  2. Landscape Water Budget Tool

    Science.gov (United States)

    WaterSense created the Water Budget Tool as one option to help builders, landscape professionals, and irrigation professionals certified by a WaterSense labeled program meet the criteria specified in the WaterSense New Home Specification.

  3. Budget and Actuals

    Data.gov (United States)

    Town of Chapel Hill, North Carolina — This dataset contains the Town's Year-to-Date Budget and Actuals for Fiscal Years 2016, 2017, and 2018. Fiscal years run from July 1 to June 30. The data comes from...

  4. Budget Automation System

    Data.gov (United States)

    U.S. Environmental Protection Agency — BAS is the central Agency system used to integrate strategic planning, annual planning, budgeting and financial management. BAS contains resource (dollars and FTE),...

  5. Using energetic budgets to assess the effects of environmental stress on corals: are we measuring the right things?

    Science.gov (United States)

    Lesser, M. P.

    2013-03-01

    Historically, the response of marine invertebrates to their environment, and environmentally induced stress, has included some measurement of their physiology or metabolism. Eventually, this approach developed into comparative energetics and the construction of energetic budgets. More recently, coral reefs, and scleractinian corals in particular, have suffered significant declines due to climate change-related environmental stress. In addition to a number of physiological, biophysical and molecular measurements to assess "coral health," there has been increased use of energetic approaches that have included the measurement of specific biochemical constituents (i.e., lipid concentrations) as a proxy for energy available to assess the potential outcomes of environmental stress on corals. In reading these studies, there appears to be some confusion between energy budgets and carbon budgets. Additionally, many assumptions regarding proximate biochemical composition, metabolic fuel preferences and metabolic quotients have been made, all of which are essential to construct accurate energy budgets and to convert elemental composition (i.e., carbon) to energy equivalents. Additionally, models of energetics such as the metabolic theory of ecology or dynamic energy budgets are being applied to coral physiology and include several assumptions that are not appropriate for scleractinian corals. As we assess the independent and interactive effects of multiple stressors on corals, efforts to construct quantitative energetic budgets should be a priority component of realistic multifactor experiments that would then improve the use of models as predictors of outcomes related to the effects of environmental change on corals.
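
    For readers unfamiliar with the distinction drawn here, a conventional organismal energy budget and the substrate-dependent conversion from respirometry to energy can be sketched as follows (a generic textbook formulation with approximate coefficient ranges, not values from this paper):

        % Energy acquired from consumption (C) is partitioned as
        C = P + R + U + F ,
        % with P production (growth and reproduction), R respiration, U excretion, F egesta.
        % Converting measured oxygen consumption to energy requires an assumption about the
        % metabolic fuel being oxidised:
        R_{\mathrm{energy}} \approx \dot{V}_{\mathrm{O_2}} \times Q_{\mathrm{ox}}, \qquad
        Q_{\mathrm{ox}} \approx 440\text{--}530\ \mathrm{kJ\,(mol\ O_2)^{-1}} ,
        % where Q_ox depends on whether lipid, carbohydrate or protein is the substrate.
        % A carbon budget instead tracks mass of C, so the two budgets are interconvertible
        % only with explicit assumptions about biochemical composition and metabolic quotients.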

  6. Modeling the water budget of the Upper Blue Nile basin using the JGrass-NewAge model system and satellite data

    Science.gov (United States)

    Abera, Wuletawu; Formetta, Giuseppe; Brocca, Luca; Rigon, Riccardo

    2017-06-01

    The Upper Blue Nile basin is one of the most data-scarce regions in developing countries, and hence the hydrological information required for informed decision making in water resource management is limited. The hydrological complexity of the basin, tied with the lack of hydrometeorological data, means that most hydrological studies in the region are either restricted to small subbasins where there are relatively better hydrometeorological data available, or on the whole-basin scale but at very coarse timescales and spatial resolutions. In this study we develop a methodology that can improve the state of the art by using available, but sparse, hydrometeorological data and satellite products to obtain the estimates of all the components of the hydrological cycle (precipitation, evapotranspiration, discharge, and storage). To obtain the water-budget closure, we use the JGrass-NewAge system and various remote sensing products. The satellite product SM2R-CCI is used for obtaining the rainfall inputs, SAF EUMETSAT for cloud cover fraction for proper net radiation estimation, GLEAM for comparison with NewAge-estimated evapotranspiration, and GRACE gravimetry data for comparison of the total water storage amounts available in the whole basin. Results are obtained at daily time steps for the period 1994-2009 (16 years), and they can be used as a reference for any water resource development activities in the region. The overall water-budget analysis shows that precipitation of the basin is 1360 ± 230 mm per year. Evapotranspiration accounts for 56 % of the annual water budget, runoff is 33 %, storage varies from -10 to +17 % of the water budget.
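
    A quick arithmetic check of the closure implied by those figures (an illustrative calculation using only the percentages quoted above, not JGrass-NewAge output):

        # Back-of-the-envelope closure check from the quoted figures
        # (P = 1360 mm/yr; ET = 56 % and runoff = 33 % of the annual budget).
        P = 1360.0                  # precipitation, mm per year
        et = 0.56 * P               # evapotranspiration, about 762 mm
        q = 0.33 * P                # runoff, about 449 mm
        storage = P - et - q        # residual, about 150 mm
        print(f"ET ~ {et:.0f} mm, runoff ~ {q:.0f} mm, "
              f"storage residual ~ {storage:.0f} mm ({100 * storage / P:.0f} % of P)")

    The roughly 11 % residual is consistent with the -10 to +17 % storage range reported above.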

  7. Competing Strategies to Balance the Budgets of Publicly Funded Higher Education Institutions

    Science.gov (United States)

    Askari, Mahmoud Yousef

    2017-01-01

    This paper compares and contrasts different strategies to balance academic institutions' operating budgets. Some strategies use economic theory to recommend a budgeting technique, others use management methods to cut cost, and some strategies use a management accounting approach to reach a balanced budget. Through the use of a simplified numerical…

  8. Women and Budget Deficits

    OpenAIRE

    Signe Krogstrup; Sébastien Wälti

    2007-01-01

    If women have different economic preferences than men, then female economic and political empowerment is likely to change policy and household decisions, and in turn macroeconomic outcomes. We test the hypothesis that female enfranchisement leads to lower government budget deficits due to gender differences in preferences over fiscal outcomes. Estimating the impact of women's vote on budget deficits in a differences-in-differences regression for Swiss cantonal panel data, we find that including ...
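
    A generic differences-in-differences panel specification of the kind described (purely illustrative notation, not the authors' exact regression) is:

        % Canton fixed effects \alpha_c, year effects \gamma_t, controls x_{ct}:
        \mathit{deficit}_{ct} = \alpha_{c} + \gamma_{t} + \beta\,\mathit{suffrage}_{ct} + \mathbf{x}_{ct}'\boldsymbol{\delta} + \varepsilon_{ct} ,
        % where suffrage_{ct} switches from 0 to 1 when canton c enfranchises women and
        % \beta is the differences-in-differences estimate of the effect on the deficit.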

  9. Budget Constraints Affect Male Rats’ Choices between Differently Priced Commodities

    Science.gov (United States)

    Kalenscher, Tobias

    2015-01-01

    Demand theory can be applied to analyse how a human or animal consumer changes her selection of commodities within a certain budget in response to changes in price of those commodities. This change in consumption assessed over a range of prices is defined as demand elasticity. Previously, income-compensated and income-uncompensated price changes have been investigated using human and animal consumers, as demand theory predicts different elasticities for both conditions. However, in these studies, demand elasticity was only evaluated over the entirety of choices made from a budget. As compensating budgets changes the number of attainable commodities relative to uncompensated conditions, and thus the number of choices, it remained unclear whether budget compensation has a trivial effect on demand elasticity by simply sampling from a different total number of choices or has a direct effect on consumers’ sequential choice structure. If the budget context independently changes choices between commodities over and above price effects, this should become apparent when demand elasticity is assessed over choice sets of any reasonable size that are matched in choice opportunities between budget conditions. To gain more detailed insight in the sequential choice dynamics underlying differences in demand elasticity between budget conditions, we trained N=8 rat consumers to spend a daily budget by making a number of nosepokes to obtain two liquid commodities under different price regimes, in sessions with and without budget compensation. We confirmed that demand elasticity for both commodities differed between compensated and uncompensated budget conditions, also when the number of choices considered was matched, and showed that these elasticity differences emerge early in the sessions. These differences in demand elasticity were driven by a higher choice rate and an increased reselection bias for the preferred commodity in compensated compared to uncompensated budget

  11. Budget Constraints Affect Male Rats' Choices between Differently Priced Commodities.

    Science.gov (United States)

    van Wingerden, Marijn; Marx, Christine; Kalenscher, Tobias

    2015-01-01

    Demand theory can be applied to analyse how a human or animal consumer changes her selection of commodities within a certain budget in response to changes in price of those commodities. This change in consumption assessed over a range of prices is defined as demand elasticity. Previously, income-compensated and income-uncompensated price changes have been investigated using human and animal consumers, as demand theory predicts different elasticities for both conditions. However, in these studies, demand elasticity was only evaluated over the entirety of choices made from a budget. As compensating budgets changes the number of attainable commodities relative to uncompensated conditions, and thus the number of choices, it remained unclear whether budget compensation has a trivial effect on demand elasticity by simply sampling from a different total number of choices or has a direct effect on consumers' sequential choice structure. If the budget context independently changes choices between commodities over and above price effects, this should become apparent when demand elasticity is assessed over choice sets of any reasonable size that are matched in choice opportunities between budget conditions. To gain more detailed insight in the sequential choice dynamics underlying differences in demand elasticity between budget conditions, we trained N=8 rat consumers to spend a daily budget by making a number of nosepokes to obtain two liquid commodities under different price regimes, in sessions with and without budget compensation. We confirmed that demand elasticity for both commodities differed between compensated and uncompensated budget conditions, also when the number of choices considered was matched, and showed that these elasticity differences emerge early in the sessions. These differences in demand elasticity were driven by a higher choice rate and an increased reselection bias for the preferred commodity in compensated compared to uncompensated budget conditions
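
    To illustrate the elasticity concept used in this record (a minimal sketch with hypothetical numbers, not the authors' analysis code), own-price demand elasticity can be estimated as the slope of consumption against price on log-log axes:

        # Illustrative sketch: demand elasticity as the slope of a log-log
        # regression of quantity consumed on price (nosepokes per reward).
        import numpy as np

        def demand_elasticity(prices, quantities):
            """Return the own-price elasticity, d ln(Q) / d ln(P)."""
            log_p = np.log(np.asarray(prices, dtype=float))
            log_q = np.log(np.asarray(quantities, dtype=float))
            slope, _intercept = np.polyfit(log_p, log_q, 1)
            return slope

        # Hypothetical data: price in nosepokes per reward vs. rewards earned.
        prices = [1, 2, 4, 8]
        quantities = [120, 70, 35, 20]
        print(f"estimated elasticity: {demand_elasticity(prices, quantities):.2f}")

    An elasticity more negative than -1 indicates elastic demand; whether budget compensation shifts this slope is the kind of question the study addresses.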

  12. Atmospheric boundary layers in storms: advanced theory and modelling applications

    Directory of Open Access Journals (Sweden)

    S. S. Zilitinkevich

    2005-01-01

    Turbulent planetary boundary layers (PBLs) control the exchange processes between the atmosphere and the ocean/land. The key problems of PBL physics are to determine the PBL height, the momentum, energy and matter fluxes at the surface, and the mean wind and scalar profiles throughout the layer in a range of regimes from stable and neutral to convective. Until now, the PBLs typical of stormy weather were always considered neutrally stratified. Recent works have disclosed that such PBLs are in fact very strongly affected by the static stability of the free atmosphere and must be treated as factually stable (we call this type of PBL "conventionally neutral", in contrast to the "truly neutral" PBLs that develop against a neutrally stratified free flow). It is common knowledge that basic features of PBLs exhibit a noticeable dependence on the free-flow static stability and baroclinicity. However, the concern of the traditional theory of neutral and stable PBLs was almost without exception the barotropic nocturnal PBL, which develops at mid latitudes over a few hours in the night against the background of a neutral or slightly stable residual layer. The latter separates this type of PBL from the free atmosphere. It is not surprising that the nature of turbulence in such regimes is basically local and does not depend on the properties of the free atmosphere. By contrast, long-lived neutral (in fact only conventionally neutral) or stable PBLs, which have much more time to grow, sit immediately below the stably stratified free flow. Under these conditions, the turbulent transports of momentum and scalars even in the surface layer - far away from the PBL outer boundary - depend on the free-flow Brunt-Väisälä frequency, N. Furthermore, integral measures of the long-lived PBLs (their depths and the resistance-law functions) depend on N and also on the baroclinic shear, S. In the traditional PBL models both non-local parameters N and S
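
    For reference, the free-flow Brunt-Väisälä frequency N invoked above has the standard definition (standard meteorological notation, not quoted from the paper):

        N = \sqrt{\frac{g}{\theta_{0}}\,\frac{\partial\theta}{\partial z}} ,
        % where \theta is the potential temperature of the free atmosphere above the PBL;
        % larger N (stronger free-flow stratification) limits the depth of long-lived
        % "conventionally neutral" and stable PBLs.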

  13. Global Carbon Budget 2015

    Science.gov (United States)

    Le Quéré, C.; Moriarty, R.; Andrew, R. M.; Canadell, J. G.; Sitch, S.; Korsbakken, J. I.; Friedlingstein, P.; Peters, G. P.; Andres, R. J.; Boden, T. A.; Houghton, R. A.; House, J. I.; Keeling, R. F.; Tans, P.; Arneth, A.; Bakker, D. C. E.; Barbero, L.; Bopp, L.; Chang, J.; Chevallier, F.; Chini, L. P.; Ciais, P.; Fader, M.; Feely, R. A.; Gkritzalis, T.; Harris, I.; Hauck, J.; Ilyina, T.; Jain, A. K.; Kato, E.; Kitidis, V.; Klein Goldewijk, K.; Koven, C.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Lenton, A.; Lima, I. D.; Metzl, N.; Millero, F.; Munro, D. R.; Murata, A.; Nabel, J. E. M. S.; Nakaoka, S.; Nojiri, Y.; O'Brien, K.; Olsen, A.; Ono, T.; Pérez, F. F.; Pfeil, B.; Pierrot, D.; Poulter, B.; Rehder, G.; Rödenbeck, C.; Saito, S.; Schuster, U.; Schwinger, J.; Séférian, R.; Steinhoff, T.; Stocker, B. D.; Sutton, A. J.; Takahashi, T.; Tilbrook, B.; van der Laan-Luijkx, I. T.; van der Werf, G. R.; van Heuven, S.; Vandemark, D.; Viovy, N.; Wiltshire, A.; Zaehle, S.; Zeng, N.

    2015-12-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates as well as consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover change (some including nitrogen-carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global
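
    The budget relationship described above, with the residual land sink closing the budget, can be written compactly using the component symbols defined in the abstract:

        E_{\mathrm{FF}} + E_{\mathrm{LUC}} = G_{\mathrm{ATM}} + S_{\mathrm{OCEAN}} + S_{\mathrm{LAND}}
        \quad\Longrightarrow\quad
        S_{\mathrm{LAND}} = E_{\mathrm{FF}} + E_{\mathrm{LUC}} - G_{\mathrm{ATM}} - S_{\mathrm{OCEAN}} .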

  14. Confronting the WRF and RAMS mesoscale models with innovative observations in the Netherlands: Evaluating the boundary layer heat budget

    NARCIS (Netherlands)

    Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.

    2011-01-01

    The Weather Research and Forecasting Model (WRF) and the Regional Atmospheric Mesoscale Model System (RAMS) are frequently used for (regional) weather, climate and air quality studies. This paper covers an evaluation of these models for a windy and calm episode against Cabauw tower observations

  15. BUDGET AND PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Morar Ioan Dan

    2014-12-01

    Public budgeting is a central issue for state public policy, for the simple reason that no public policy can be promoted without money from the state budget. Budgetary policy mirrors the official government doctrine and vision, and it is also a starting point for other public policies, which are in turn financed from the public budget. Through the fiscal policy instruments at its disposal, the state acts on both the public sector, in its various structures, and the private sector. Tools such as grants, budgetary allocations, taxes, welfare benefits in their various forms, direct investment and, not least, state aid are used by governments through their budgetary policies to influence the public and private sectors directly and indirectly. Fiscal policies can be grouped, according to the structure of the public sector, into the following components: tax policy, budgeting and resource-allocation policies, and policies for financing the budget deficit. The financing of the budget deficit is an important budgetary-policy issue. There are two funding possibilities: raising existing taxes or introducing new ones, or, second, resorting to public loans. Both options demand extra effort from taxpayers, either in the current fiscal year, when they pay higher taxes, or in a future period, when the public loans must be repaid. By virtue of the "fiscal pact", the structural deficits of EU member countries are limited by the European Commission according to the macro-structural and budgetary stability of each Member State. This tempers to some extent the budgetary appetite of member-state governments, but it does not solve the problem of chronic budget deficits. Another issue addressed in this paper is public debt: its absolute amount, its level relative to GDP, the financing of public debt, and the sources of its repayment. The sources of public debt issuance and their monetary impact on the budget and on monetary stability are variables that must underpin the justification of budgetary

  16. The Dynamic Definition of the Advertising Campaign Budget

    Directory of Open Access Journals (Sweden)

    Ostrianyn Serhii O.

    2017-10-01

    The article aims to explore means of optimizing the budgeting of advertising activity and the optimal distribution of the advertising budget among the several products advertised by a company and the several advertising channels used to distribute an advertisement. The current status of optimization-based budgeting models worldwide and in Ukraine is analyzed, and the topicality of scientific developments in this direction is substantiated. The model is based on a non-linear logistic function describing the sales response to the cost of placing an advertising message, including market-saturation effects and the accumulation of the promotional effect. A complex advertising budgeting model is proposed, which includes the dynamic definition of a budget constraint based on the expected sales return from the advertising campaign whose budget is being optimized. The proposed model makes it possible to schedule advertising activities under uncertainty and a rapidly changing environment.
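
    The kind of saturating response curve and budget split described above can be sketched as follows (a minimal illustration with hypothetical channels and parameters, not the article's actual model or data):

        # Minimal sketch: logistic sales response with saturation, and a greedy
        # split of a fixed budget across channels by marginal response gain.
        import math

        def response(spend, s_max, k, b0):
            """Logistic sales response to advertising spend; saturates at s_max."""
            return s_max / (1.0 + math.exp(-k * (spend - b0)))

        # Hypothetical channels: (saturation level, steepness, half-saturation spend).
        channels = {"tv": (100.0, 0.08, 40.0), "online": (60.0, 0.15, 20.0)}

        budget, step = 80.0, 1.0
        spend = {name: 0.0 for name in channels}
        while budget >= step:
            # Put the next increment where the marginal gain in response is largest.
            best = max(
                channels,
                key=lambda n: response(spend[n] + step, *channels[n])
                - response(spend[n], *channels[n]),
            )
            spend[best] += step
            budget -= step

        print(spend)
        print({n: round(response(spend[n], *channels[n]), 1) for n in channels})

    One way to move this toward the article's accumulation of promotional effect would be to carry over a decayed fraction of past spend into the response argument.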

  17. Supersymmetry and string theory beyond the standard model

    CERN Document Server

    Dine, Michael

    2015-01-01

    The past decade has witnessed dramatic developments in the fields of experimental and theoretical particle physics and cosmology. This fully updated second edition is a comprehensive introduction to these recent developments and brings this self-contained textbook right up to date. Brand new material for this edition includes the groundbreaking Higgs discovery and results of the WMAP and Planck experiments. Extensive discussion of theories of dynamical electroweak symmetry breaking, a new chapter on the landscape, and a completely rewritten coda on future directions give readers a modern perspective on this developing field. A focus on three principal areas - supersymmetry, string theory, and astrophysics and cosmology - provides the structure for this book, which will be of great interest to graduates and researchers in the fields of particle theory, string theory, astrophysics and cosmology. The book contains several problems, and password-protected solutions will be available to lecturers at www.cambrid...

  18. Studying the impact of overshooting convection on the tropical tropopause layer (TTL) water vapor budget at the continental scale using a mesoscale model

    Science.gov (United States)

    Behera, Abhinna; Rivière, Emmanuel; Marécal, Virginie; Claud, Chantal; Rysman, Jean-François; Geneviève, Seze

    2016-04-01

    The water vapour budget is a key component of the Earth's climate system. In the tropical upper troposphere and lower stratosphere (UTLS), it plays a central role in both the radiative and the chemical budget. Its abundance is mostly driven by slow ascent above the level of net zero radiative heating, followed by ice crystal formation and sedimentation, the so-called cold trap. In contrast to this large-scale, temperature-driven process, overshooting convection penetrating the stratosphere could be one piece of the puzzle: it has been proven to hydrate the lower stratosphere at the local scale. Satellite-borne H2O instruments cannot measure the water vapour enhancements caused by overshooting convection at a fine enough resolution. The consequence is that it is difficult to estimate the role of overshooting deep convection at the global scale. Using a mesoscale model, the Brazilian Regional Atmospheric Modelling System (BRAMS), past atmospheric conditions have been simulated for a full wet season (Nov 2012 to Mar 2013) on a single grid with a horizontal resolution of 20 km × 20 km covering a large part of Brazil and South America. This resolution is too coarse to reproduce overshooting convection in the model, so this simulation serves as a reference (REF) simulation without the impact of overshooting convection on the TTL water budget. European Centre for Medium-Range Weather Forecasts (ECMWF) analyses have been used for initialisation and for nudging the grid boundaries every 6 hours. The size distribution of hydrometeors and the number of cloud condensation nuclei (CCN) are fitted to best reproduce accumulated precipitation derived from the Tropical Rainfall Measuring Mission (TRMM). Similarly, GOES and MSG IR images have been thoroughly compared with the model outputs, using image-correlation statistics for the position of the clouds. The modelled H2O variability during the wet season is compared with the in situ balloon-borne measurements during

  19. Forecasting carbon budget under climate change and CO2 fertilization for subtropical region in China using integrated biosphere simulator (IBIS) model

    Science.gov (United States)

    Zhu, Q.; Jiang, H.; Liu, J.; Peng, C.; Fang, X.; Yu, S.; Zhou, G.; Wei, X.; Ju, W.

    2011-01-01

    The regional carbon budget of the climatic transition zone may be very sensitive to climate change and increasing atmospheric CO2 concentrations. This study simulated the carbon cycles under these changes using process-based ecosystem models. The Integrated Biosphere Simulator (IBIS), a Dynamic Global Vegetation Model (DGVM), was used to evaluate the impacts of climate change and CO2 fertilization on net primary production (NPP), net ecosystem production (NEP), and the vegetation structure of terrestrial ecosystems in Zhejiang province (area 101,800 km2, mainly covered by subtropical evergreen forest and warm-temperate evergreen broadleaf forest) which is located in the subtropical climate area of China. Two general circulation models (HADCM3 and CGCM3) representing four IPCC climate change scenarios (HC3AA, HC3GG, CGCM-sresa2, and CGCM-sresb1) were used as climate inputs for IBIS. Results show that simulated historical biomass and NPP are consistent with field and other modelled data, which makes the analysis of future carbon budget reliable. The results indicate that NPP over the entire Zhejiang province was about 55 Mt C yr-1 during the last half of the 21st century. An NPP increase of about 24 Mt C by the end of the 21st century was estimated with the combined effects of increasing CO2 and climate change. A slight NPP increase of about 5 Mt C was estimated under the climate change alone scenario. Forests in Zhejiang are currently acting as a carbon sink with an average NEP of about 2.5 Mt C yr-1. NEP will increase to about 5 Mt C yr-1 by the end of the 21st century with the increasing atmospheric CO2 concentration and climate change. However, climate change alone will reduce the forest carbon sequestration of Zhejiang's forests. Future climate warming will substantially change the vegetation cover types; warm-temperate evergreen broadleaf forest will be gradually substituted by subtropical evergreen forest. An increasing CO2 concentration will have little

  20. Maximum Likelihood Item Easiness Models for Test Theory without an Answer Key

    Science.gov (United States)

    France, Stephen L.; Batchelder, William H.

    2015-01-01

    Cultural consensus theory (CCT) is a data aggregation technique with many applications in the social and behavioral sciences. We describe the intuition and theory behind a set of CCT models for continuous type data using maximum likelihood inference methodology. We describe how bias parameters can be incorporated into these models. We introduce…