WorldWideScience

Sample records for budget theory model

  1. Sublethal toxicant effects with dynamic energy budget theory: model formulation.

    Science.gov (United States)

    Muller, Erik B; Nisbet, Roger M; Berkley, Heather A

    2010-01-01

We develop and test a general modeling framework to describe the sublethal effects of pollutants by adding toxicity modules to an established dynamic energy budget (DEB) model. The DEB model describes the rates of energy acquisition and expenditure by individual organisms; the toxicity modules describe how toxicants affect these rates by changing the value of one or more DEB parameters, notably the parameters quantifying the rates of feeding and maintenance. We investigate four toxicity modules that assume: (1) effects on feeding only; (2) effects on maintenance only; (3) effects on feeding and maintenance with similar values for the toxicity parameters; and (4) effects on feeding and maintenance with different values for the toxicity parameters. We test the toxicity modules by fitting each to published data on feeding, respiration, growth and reproduction. Among the pollutants tested are metals (mercury and copper) and various organic compounds (chlorophenols, toluene, polycyclic aromatic hydrocarbons, tetradifon and pyridine); organisms include mussels, oysters, earthworms, water fleas and zebrafish. In most cases, the data sets could be adequately described with any of the toxicity modules, and no single module gave superior fits to all data sets. We therefore propose that for many applications, it is reasonable to use the most general and parameter-sparse module, i.e. module 3, which assumes similar effects on feeding and maintenance, as a default. For one example (water fleas), we use parameter estimates to calculate the impact of food availability and toxicant levels on the long-term population growth rate. PMID:19633955
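
    A minimal sketch (not the authors' formulation) of how such a toxicity module can enter a DEB-type growth equation: a toxicant-dependent stress factor scales the assimilation (feeding) and/or maintenance terms, mirroring modules (1)-(3) above. All function names and parameter values below are illustrative placeholders.

```python
# Illustrative DEB-style growth sketch with a toxicity "module" (placeholders only).
import numpy as np
from scipy.integrate import odeint

def deb_growth(y, t, f, p_Am, p_M, E_G, kappa, c_int, c_T, mode):
    """y = [V]: structural volume. f: scaled food level (0-1). c_int: internal
    toxicant concentration; c_T: tolerance concentration; mode selects which
    rates the toxicant affects ("feeding", "maintenance" or "both")."""
    V = y[0]
    stress = 1.0 + c_int / c_T                    # hypothetical linear stress factor
    assim = kappa * f * p_Am * V ** (2.0 / 3.0)   # surface-area-proportional assimilation to soma
    maint = p_M * V                               # volume-proportional maintenance
    if mode in ("feeding", "both"):
        assim /= stress                           # toxicant depresses feeding/assimilation
    if mode in ("maintenance", "both"):
        maint *= stress                           # toxicant inflates maintenance costs
    return [(assim - maint) / E_G]                # growth of structural volume

t = np.linspace(0.0, 100.0, 201)
V = odeint(deb_growth, [0.01], t, args=(1.0, 22.5, 11.6, 2800.0, 0.8, 5.0, 10.0, "both"))
```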

  2. Linking individual-based models and dynamic energy budget theory : lessons for ecology and ecotoxicology

    OpenAIRE

    Martin, Benjamin

    2013-01-01

    In the context of ecological risk assessment of chemicals, individual-based population models hold great potential to increase the ecological realism of current regulatory risk assessment procedures. However, developing and parameterizing such models is time-consuming and often ad hoc. Using standardized, tested submodels of individual organisms would make individual-based modelling more efficient and coherent. In this thesis, I explored whether Dynamic Energy Budget (DEB) theory is suitable ...

  3. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  4. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  5. THE CONCEPTUAL CONTENT OF STATE BUDGET PROCESS IN ECONOMIC THEORY

    OpenAIRE

    Žubule, Ērika; Puzule, Anita

    2015-01-01

Evaluating the role of the budget in the economy, we may state that the budget process should favour the social and economic development of the state. The aim of the research is to explore and evaluate theoretical aspects of the state budget process as a component of the state financial policy and to work out proposals for improvement of the state budget process, based on the theoretical and empirical findings. The main objectives of the research were to study the foreign economic scientific literat...

  6. Nambe Pueblo Water Budget and Forecasting model.

    Energy Technology Data Exchange (ETDEWEB)

    Brainard, James Robert

    2009-10-01

    This report documents The Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components, the Water Forecast Component, Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Water Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and the Reservoir Model components. The Reservoir Component contains two sections, (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model where historic, pre-dam or above dam USGS stream flow data is fed into the model and releases are calculated.

  7. Stochastic Models for Budget Optimization in Search-Based Advertising

    OpenAIRE

    Muthukrishnan, S.; Pal, Martin; Svitkina, Zoya

    2006-01-01

    Internet search companies sell advertisement slots based on users' search queries via an auction. Advertisers have to determine how to place bids on the keywords of their interest in order to maximize their return for a given budget: this is the budget optimization problem. The solution depends on the distribution of future queries. In this paper, we formulate stochastic versions of the budget optimization problem based on natural probabilistic models of distribution over future queries, and ...

  8. Validation of a Dynamic Energy Budget (DEB) model for the blue mussel

    NARCIS (Netherlands)

    Saraiva, S.; van der Meer, J.; Kooijman, S.A.L.M.; Witbaard, R.; Philippart, C.J.M.; Hippler, D.; Parker, R.

    2012-01-01

    A model for bivalve growth was developed and the results were tested against field observations. The model is based on the Dynamic Energy Budget (DEB) theory and includes an extension of the standard DEB model to cope with changing food quantity and quality. At 4 different locations in the North Sea

  9. Model Theory and Applications

    CERN Document Server

    Mangani, P

    2011-01-01

    This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.

  10. A Theory of the Perturbed Consumer with General Budgets

    DEFF Research Database (Denmark)

    McFadden, Daniel L; Fosgerau, Mogens

We consider demand systems for utility-maximizing consumers facing general budget constraints whose utilities are perturbed by additive linear shifts in marginal utilities. Budgets are required to be compact but are not required to be convex. We define demand generating functions (DGF) whose subgradients with respect to these perturbations are convex hulls of the utility-maximizing demands. We give necessary as well as sufficient conditions for DGF to be consistent with utility maximization, and establish under quite general conditions that utility-maximizing demands are almost everywhere single...

  11. Modelling nematode life cycles using dynamic energy budgets

    NARCIS (Netherlands)

    Jager, T.; Alda Alvarez, O.; Kammenga, J.E.; Kooijman, S.A.L.M.

    2005-01-01

    1. To understand the life cycle of an organism, it is important to understand the underlying physiological mechanisms of their life histories. We here use the theory of dynamic energy budgets (DEB) to investigate the close relationships between growth, reproduction and respiration in nematodes. 2. U

  12. 3D modeling of satellite spectral images, radiation budget and energy budget of urban landscapes

    Science.gov (United States)

    Gastellu-Etchegorry, J. P.

    2008-12-01

    DART EB is a model that is being developed for simulating the 3D (3 dimensional) energy budget of urban and natural scenes, possibly with topography and atmosphere. It simulates all non radiative energy mechanisms (heat conduction, turbulent momentum and heat fluxes, water reservoir evolution, etc.). It uses DART model (Discrete Anisotropic Radiative Transfer) for simulating radiative mechanisms: 3D radiative budget of 3D scenes and their remote sensing images expressed in terms of reflectance or brightness temperature values, for any atmosphere, wavelength, sun/view direction, altitude and spatial resolution. It uses an innovative multispectral approach (ray tracing, exact kernel, discrete ordinate techniques) over the whole optical domain. This paper presents two major and recent improvements of DART for adapting it to urban canopies. (1) Simulation of the geometry and optical characteristics of urban elements (houses, etc.). (2) Modeling of thermal infrared emission by vegetation and urban elements. The new DART version was used in the context of the CAPITOUL project. For that, districts of the Toulouse urban data base (Autocad format) were translated into DART scenes. This allowed us to simulate visible, near infrared and thermal infrared satellite images of Toulouse districts. Moreover, the 3D radiation budget was used by DARTEB for simulating the time evolution of a number of geophysical quantities of various surface elements (roads, walls, roofs). Results were successfully compared with ground measurements of the CAPITOUL project.

  13. An information theory approach for evaluating earth radiation budget (ERB) measurements - Nonuniform sampling of reflected shortwave radiation

    Science.gov (United States)

    Barkstrom, Bruce R.; Direskeneli, Haldun; Halyo, Nesim

    1992-01-01

An information theory approach to examine the temporal nonuniform sampling characteristics of shortwave (SW) flux for earth radiation budget (ERB) measurements is suggested. The information gain is obtained by comparing the information content before and after the measurements. A stochastic diurnal model for the SW flux is developed, and measurements for different orbital parameters are examined. The methodology is applied to specific NASA Polar platform and Tropical Rainfall Measuring Mission (TRMM) orbital parameters. The information theory approach, coupled with the developed SW diurnal model, is found to be promising for measurements involving nonuniform orbital sampling characteristics.
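
    For context, an information gain of this kind is typically quantified as the reduction in entropy of the flux estimate from before to after the measurements; for Gaussian errors this reduces to a log-determinant ratio of the prior and posterior error covariances. This is a generic expression, not necessarily the exact metric used in the paper:

$$
\Delta I \;=\; H_{\mathrm{before}} - H_{\mathrm{after}}
\;=\; \tfrac{1}{2}\,\log_{2}\!\frac{\det \mathbf{P}_{\mathrm{before}}}{\det \mathbf{P}_{\mathrm{after}}},
$$

where $\mathbf{P}$ denotes the estimation error covariance of the SW flux field.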

  14. Modelling the energy budget and prey choice of eider ducks

    NARCIS (Netherlands)

    Brinkman, A.G.; Ens, B.J.; Kats, R.K.H.

    2003-01-01

    We developed an energy and heat budget model for eider ducks. All relevant processes have been quantified. Food processing, diving costs, prey heating, the costs of crushing mussel shells, heat losses during diving as well as during resting, and heat production as a result of muscle activity are dis

  15. Model theory and modules

    CERN Document Server

    Prest, M

    1988-01-01

    In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module

  16. Electric solar wind sail mass budget model

    Directory of Open Access Journals (Sweden)

    P. Janhunen

    2013-02-01

The electric solar wind sail (E-sail) is a new type of propellantless propulsion system for Solar System transportation, which uses the natural solar wind to produce spacecraft propulsion. The E-sail consists of thin centrifugally stretched tethers that are kept charged by an onboard electron gun and, as such, experience Coulomb drag through the high-speed solar wind plasma stream. This paper discusses a mass breakdown and a performance model for an E-sail spacecraft that hosts a mission-specific payload of prescribed mass. In particular, the model is able to estimate the total spacecraft mass and its propulsive acceleration as a function of various design parameters such as the number of tethers and their length. A number of subsystem masses are calculated assuming existing or near-term E-sail technology. In light of the obtained performance estimates, an E-sail represents a promising propulsion system for a variety of transportation needs in the Solar System.

  17. Theory Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack [Los Alamos National Laboratory

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  18. ENSIS, Pollution inventory, pollution budget model, water quality model and scenario handling. Functional specification

    OpenAIRE

    Bakken, T.H.; Bjørkenes, A.; Dagestad, K

    2003-01-01

    This is the functional specification of a complete pollution budget model for water. A crucial improvement of this model is implementation of new pollution sources and modification of existing sources. The specification of a water quality model, based on the results from the pollution budget model is also included. The document is intended to give a cost and time estimate of the programming of the functionality it describes, and will be the guideline for implementation of the specified functi...

  19. Late Budgets

    DEFF Research Database (Denmark)

    Andersen, Asger Lau; Lassen, David Dreyer; Nielsen, Lasse Holbøll Westh

The budget forms the legal basis of government spending. If a budget is not in place at the beginning of the fiscal year, planning as well as current spending are jeopardized and government shutdown may result. This paper develops a continuous-time war-of-attrition model of budgeting in a presidential-style democracy to explain the duration of budget negotiations. We build our model around budget baselines as reference points for loss-averse negotiators. We derive three testable hypotheses: there are more late budgets, and they are more late, when fiscal circumstances change; when such changes are negative rather than positive; and when there is divided government. We test the hypotheses of the model using a unique data set of late budgets for US state governments, based on dates of budget approval collected from news reports and a survey of state budget officers for the period 1988...

  20. How processing digital elevation models can affect simulated water budgets

    Science.gov (United States)

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
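
    A minimal sketch of the two DEM-processing choices compared here, assuming a fine DEM raster and a coarser model grid whose cells each cover an integer block of DEM pixels. Function and variable names are illustrative, not from the study.

```python
import numpy as np

def cell_land_surface(dem, block):
    """Return (centroid, mean) land-surface values for each model cell,
    where each cell covers a block x block tile of DEM pixels."""
    ny, nx = dem.shape[0] // block, dem.shape[1] // block
    centroid = np.empty((ny, nx))
    mean = np.empty((ny, nx))
    for j in range(ny):
        for i in range(nx):
            tile = dem[j * block:(j + 1) * block, i * block:(i + 1) * block]
            centroid[j, i] = tile[block // 2, block // 2]  # value nearest the cell centroid
            mean[j, i] = tile.mean()                       # average over the whole cell
    return centroid, mean

# Example: a synthetic 400 x 400 DEM aggregated to 10 x 10 model cells.
dem = np.random.default_rng(0).normal(100.0, 10.0, size=(400, 400))
centroid_vals, mean_vals = cell_land_surface(dem, 40)
```

    Because the regression-predicted water table inherits these land-surface values, the steeper apparent gradients produced by centroid sampling translate directly into larger simulated flows, as the abstract reports.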

  1. Corruption, Public Procurement, and the Budget Composition : Theory and Evidence from OECD Countries

    OpenAIRE

    Zohal Hessami

    2013-01-01

    This paper examines the relationship between corruption and the composition of public expenditures. First, I derive a theoretical model that links the degree of corruption in a country - to be understood as the prevailing culture of corruption - to distortions in the budget composition. The transmission channel is a rent-seeking contest where firms from different sectors pay bribes to politicians and bureaucrats to influence public procurement decisions, which give rise to endogenous rents. I...

  2. Modelling mussel growth in ecosystems with low suspended matter loads using a Dynamic Energy Budget approach

    Science.gov (United States)

    Duarte, P.; Fernández-Reiriz, M. J.; Labarta, U.

    2012-01-01

The environmental and the economic importance of shellfish stimulated a great deal of studies on their physiology over the last decades, with many attempts to model their growth. The first models developed to simulate bivalve growth were predominantly based on the Scope For Growth (SFG) paradigm. In recent years there has been a shift towards the Dynamic Energy Budget (DEB) paradigm. The general objective of this work is to contribute to the evaluation of different approaches to simulate bivalve growth in low seston waters by: (i) implementing a model to simulate mussel growth in low suspended matter ecosystems based on the DEB theory (Kooijman, S.A.L.M., 2000. Dynamic Energy and Mass Budgets in Biological Systems, Cambridge University Press); (ii) comparing and discussing different approaches to simulate feeding processes, in the light of recently published works both on experimental physiology and physiology modeling; (iii) comparing and discussing results obtained with a model based on EMMY (Scholten and Smaal, 1998). The model implemented allowed us to successfully simulate mussel feeding and shell length growth in two different Galician Rias. The results obtained, together with literature data, suggest that modeling of bivalve feeding should incorporate physiological feedbacks related to food digestibility. In spite of considerable advances in bivalve modeling, a number of issues are yet to be resolved, with emphasis on the way food sources are represented and feeding processes formulated.

  3. Exploring the Beta Model Using Proportional Budget Information in a Contingent Valuation Study

    OpenAIRE

    Hui Li; Robert P. Berrens; Bohara, Alok K.; Hank C. Jenkins-Smith; Silva, Carol L.; Weimer, David L

    2005-01-01

    Using a set of random telephone and Internet (web-based) survey samples for a national advisory referendum, we implement Beta models to handle proportional budget information, and allow for consistency in modeling assumptions and the calculation of estimated willingness to pay (WTP). Results indicate significant budget constraint effects and demonstrate the potential for Beta models in handling mental-accounting type information.

  4. Theory and modeling group

    Science.gov (United States)

    Holman, Gordon D.

    The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.

  5. Modelling the global tropospheric ozone budget: exploring the variability in current models

    Directory of Open Access Journals (Sweden)

    O. Wild

    2007-02-01

What are the largest uncertainties in modelling ozone in the troposphere, and how do they affect the calculated ozone budget? Published chemistry-transport model studies of tropospheric ozone differ significantly in their conclusions regarding the importance of the key processes controlling the ozone budget: influx from the stratosphere, chemical processing and surface deposition. This study surveys ozone budgets from previous studies and demonstrates that about two-thirds of the increase in ozone production seen between early assessments and more recent model intercomparisons can be accounted for by increased precursor emissions. Model studies using recent estimates of emissions compare better with ozonesonde measurements than studies using older data, and the tropospheric burden of ozone is closer to that derived here from measurement climatologies, 335±10 Tg. However, differences between individual model studies remain large and cannot be explained by surface precursor emissions alone; cross-tropopause transport, wet and dry deposition, humidity, and lightning make large contributions to the differences seen between models. The importance of these processes is examined here using a chemistry-transport model to investigate the sensitivity of the calculated ozone budget to different assumptions about emissions, physical processes, meteorology and model resolution. The budget is particularly sensitive to the magnitude and location of lightning NOx emissions, which remain poorly constrained; the 3–8 TgN/yr range in recent model studies may account for a 10% difference in tropospheric ozone burden and a 1.4 year difference in CH4 lifetime. Differences in humidity and dry deposition account for some of the variability in ozone abundance and loss seen in previous studies, with smaller contributions from wet deposition and stratospheric influx. At coarse model resolutions stratospheric influx is systematically overestimated

  6. Accounting changes and budgeting practices in the Tanzanian central government: a theory of struggling for conformance

    OpenAIRE

    Mkasiwa, Tausi

    2011-01-01

    This research investigates the phenomenon of budgeting practices in the Tanzanian Central Government. It seeks to understand how budgeting systems under the New Public Management (NPM), World Bank- and IMF-exhorted systems were adopted and implemented. There were several motives for this research: the significance of budgeting in financial management, the sparsity of empirical studies on NPM in developing countries, and a call for an understanding of the local contexts of the country and an e...

  7. Dependent-Chance Programming Models for Capital Budgeting in Fuzzy Environments

    Institute of Scientific and Technical Information of China (English)

    LIANG Rui; GAO Jinwu

    2008-01-01

Capital budgeting is concerned with maximizing the total net profit subject to budget constraints by selecting an appropriate combination of projects. This paper presents chance maximizing models for capital budgeting with fuzzy input data and multiple conflicting objectives. When the decision maker sets a prospective profit level and wants to maximize the chances of the total profit achieving the prospective profit level, a fuzzy dependent-chance programming model, a fuzzy multi-objective dependent-chance programming model, and a fuzzy goal dependent-chance programming model are used to formulate the fuzzy capital budgeting problem. A fuzzy simulation based genetic algorithm is used to solve these models. Numerical examples are provided to illustrate the effectiveness of the simulation-based genetic algorithm and the potential applications of these models.
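
    For orientation, the crisp core of capital budgeting described in the first sentence is a 0/1 knapsack problem. The brute-force sketch below covers only this deterministic special case (the paper's fuzzy dependent-chance models and the fuzzy-simulation-based genetic algorithm are not reproduced); names and figures are illustrative.

```python
from itertools import combinations

def best_portfolio(costs, profits, budget):
    """Enumerate project subsets and return the most profitable one within budget."""
    n = len(costs)
    best, best_profit = (), 0.0
    for r in range(1, n + 1):
        for subset in combinations(range(n), r):
            cost = sum(costs[i] for i in subset)
            profit = sum(profits[i] for i in subset)
            if cost <= budget and profit > best_profit:
                best, best_profit = subset, profit
    return best, best_profit

# Example with illustrative figures (costs, profits and budget in the same units).
print(best_portfolio(costs=[40, 30, 50, 10], profits=[12, 9, 16, 3], budget=80))
```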

  8. DART : a 3D model for remote sensing images and radiative budget of earth surfaces

    OpenAIRE

    Gastellu-Etchegorry, J.P.; Grau, E.; Lauret, N.

    2012-01-01

    Modeling the radiative behavior and the energy budget of land surfaces is relevant for many scientific domains such as the study of vegetation functioning with remotely acquired information. DART model (Discrete Anisotropic Radiative Transfer) is developed since 1992. It is one of the most complete 3D models in this domain. It simulates radiative transfer (R.T.) in the optical domain: 3D radiative budget and remote sensing images (i.e., radiance, reflectance, brightness temperature) of vegeta...

  9. Budget constraint and vaccine dosing: A mathematical modelling exercise

    NARCIS (Netherlands)

    Standaert, Baudouin A.; Curran, Desmond; Postma, Maarten J.

    2014-01-01

    Background: Increasing the number of vaccine doses may potentially improve overall efficacy. Decision-makers need information about choosing the most efficient dose schedule to maximise the total health gain of a population when operating under a constrained budget. The objective of this study is to

  10. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.

  11. Towards the determination of Mytilus edulis food preferences using the dynamic energy budget (DEB) theory.

    Science.gov (United States)

    Picoche, Coralie; Le Gendre, Romain; Flye-Sainte-Marie, Jonathan; Françoise, Sylvaine; Maheux, Frank; Simon, Benjamin; Gangnery, Aline

    2014-01-01

    The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide a better insight into mussel-environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measures revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference between most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and both passive and active selections, and confirms the need to adjust DEB models to different
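
    As a rough sketch of the first reconstruction method (inverting the growth equation): under the standard DEB model at roughly constant conditions, shell length follows a von Bertalanffy curve, so an observed length increment can be solved for the scaled food level. This is a generic illustration with generic symbols, not the authors' exact equations:

$$
\frac{dL}{dt} \;=\; \dot r_B\,\bigl(f\,L_m - L\bigr)
\quad\Longrightarrow\quad
f \;=\; \frac{1}{L_m}\left(\frac{1}{\dot r_B}\,\frac{dL}{dt} + L\right),
$$

where $L$ is shell length, $L_m$ the maximum length, $\dot r_B$ the von Bertalanffy growth rate and $f$ the scaled functional response to be reconstructed.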

  12. Estimation of IT energy budget during the St. Patrick's Day storm 2015: observations, modeling and challenges.

    Science.gov (United States)

    Verkhoglyadova, O. P.; Meng, X.; Mannucci, A. J.; Mlynczak, M. G.; Hunt, L. A.; Tsurutani, B.

    2015-12-01

    We present estimates for the energy budget of the 2015 St. Patrick's Day storm. Empirical models and coupling functions are used as proxies for energy input due to solar wind-magnetosphere coupling. Fluxes of thermospheric nitric oxide and carbon dioxide cooling emissions are estimated in several latitude ranges. Solar wind data and the Weimer 2005 model for high-latitude electrodynamics are used to drive GITM modeling for the storm. Model estimations for energy partitioning, Joule heating, NO cooling are compared with observations and empirical proxies. We outline challenges in the estimation of the IT energy budget (Joule heating, Poynting flux, particle precipitation) during geomagnetic storms.

  13. Warped models in string theory

    International Nuclear Information System (INIS)

    Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)

  14. Surface heat budget over the Weddell Sea: Buoy results and model comparisons

    Science.gov (United States)

    Vihma, Timo; Uotila, Juha; Cheng, Bin; Launiainen, Jouko

    2002-02-01

    The surface heat budget over the Weddell Sea ice cover in 1996 was studied on the basis of data from Argos buoys equipped with meteorological sensors. In addition, a thermodynamic sea ice model, satellite-based data on the sea ice concentration, sonar results on ice thickness distribution, and output from large-scale meteorological models were all utilized. Applying the buoy data, the sensible heat flux over sea ice was calculated by Monin-Obukhov theory using the gradient method, and the latent heat flux was obtained by the bulk method. A second estimate for the surface fluxes was obtained from the thermodynamic sea ice model, which was forced by the buoy observations. The results showed a reasonable agreement. The dominating component in the heat budget over ice was the net longwave radiation, which had a mean annual cooling effect of -28 W m-2. This was balanced by the net shortwave radiation (annual mean 13 W m-2), the sensible (13 W m-2) and latent (-3 W m-2) heat fluxes, and the conductive heat flux through the ice (5 W m-2). The regional surface fluxes over the fractured ice cover were estimated using the buoy data and Special Sensor Microwave Imager (SSMI)-derived ice concentrations. In winter the regional surface sensible heat flux was sensitive to the ice concentration and thickness distribution. The estimate for the area-averaged formation rate of new ice in leads in winter varies from 0.05 to 0.21 m per month depending on the SSMI processing algorithm applied. Countergradient fluxes occurred 8-10% of the time. The buoy observations were compared with the operational analyses of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the reanalyses of the National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR). The 2 m air temperature and surface temperature were 3.5° and 4.4°C too high, respectively, in the ECMWF and 3.2° and 3.0°C too low in the NCEP/NCAR fields, but the models reproduced the
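
    For a quick consistency check, the annual mean fluxes quoted above balance exactly (fluxes in W m-2, positive terms warming the ice surface):

$$
\underbrace{-28}_{\text{net longwave}} \;+\; \underbrace{13}_{\text{net shortwave}} \;+\; \underbrace{13}_{\text{sensible}} \;+\; \underbrace{(-3)}_{\text{latent}} \;+\; \underbrace{5}_{\text{conductive}} \;=\; 0 \ \mathrm{W\,m^{-2}}.
$$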

  15. Standard Model Theory

    OpenAIRE

    Hollik, W.

    2015-01-01

    In this conference report a summary is given on the theoretical work that has contributed to provide accurate theoretical predictions for testing the standard model in present and future experiments. Precision calculations for the vector boson masses, for the Z resonance, W pair production, and for the g-2 of the muon are reviewed and the theoretical situation for the Higgs sector is summarized. The status of the standard model is discussed in the light of the recent high and low energy data....

  16. Exploring the effects of temperature and resource limitation on mercury bioaccumulation in Fundulus heteroclitus using dynamic energy budget modeling

    Science.gov (United States)

    Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we conducted growth and bioaccumulation studies that contribute t...

  17. Budget calculations for ozone and its precursors: Seasonal and episodic features based on model simulations

    NARCIS (Netherlands)

    Memmesheimer, M.; Ebel, A.; Roemer, M.

    1997-01-01

    Results from two air quality models (LOTOS, EURAD) have been used to analyse the contribution of the different terms in the continuity equation to the budget of ozone, NO(x) and PAN. Both models cover large parts of Europe and describe the processes relevant for tropospheric chemistry and dynamics.

  18. A multi-layer land surface energy budget model for implicit coupling with global atmospheric simulations

    Directory of Open Access Journals (Sweden)

    J. Ryder

    2014-12-01

In Earth system modelling, a description of the energy budget of the vegetated surface layer is fundamental as it determines the meteorological conditions in the planetary boundary layer and as such contributes to the atmospheric conditions and its circulation. The energy budget in most Earth system models has long been based on a "big-leaf approach", with averaging schemes that represent in-canopy processes. Such models have difficulties in reproducing consistently the energy balance in field observations. We here outline a newly developed numerical model for energy budget simulation, as a component of the land surface model ORCHIDEE-CAN (Organising Carbon and Hydrology In Dynamic Ecosystems – CANopy). This new model implements techniques from single-site canopy models in a practical way. It includes representation of in-canopy transport, a multilayer longwave radiation budget, height-specific calculation of aerodynamic and stomatal conductance, and interaction with the bare soil flux within the canopy space. Significantly, it avoids iterations over the height of the canopy and so maintains implicit coupling to the atmospheric model LMDz. As a first test, the model is evaluated against data from both an intensive measurement campaign and longer-term eddy covariance measurements for the intensively studied Eucalyptus stand at Tumbarumba, Australia. The model performs well in replicating both diurnal and annual cycles of fluxes, as well as the gradients of sensible heat fluxes. However, the model overestimates sensible heat flux against an underestimate of the radiation budget. Improved performance is expected through the implementation of a more detailed calculation of stand albedo and a more up-to-date stomatal conductance calculation.

  19. Effects of activity and energy budget balancing algorithm on laboratory performance of a fish bioenergetics model

    Science.gov (United States)

    Madenjian, Charles P.; David, Solomon R.; Pothoven, Steven A.

    2012-01-01

    We evaluated the performance of the Wisconsin bioenergetics model for lake trout Salvelinus namaycush that were fed ad libitum in laboratory tanks under regimes of low activity and high activity. In addition, we compared model performance under two different model algorithms: (1) balancing the lake trout energy budget on day t based on lake trout energy density on day t and (2) balancing the lake trout energy budget on day t based on lake trout energy density on day t + 1. Results indicated that the model significantly underestimated consumption for both inactive and active lake trout when algorithm 1 was used and that the degree of underestimation was similar for the two activity levels. In contrast, model performance substantially improved when using algorithm 2, as no detectable bias was found in model predictions of consumption for inactive fish and only a slight degree of overestimation was detected for active fish. The energy budget was accurately balanced by using algorithm 2 but not by using algorithm 1. Based on the results of this study, we recommend the use of algorithm 2 to estimate food consumption by fish in the field. Our study results highlight the importance of accurately accounting for changes in fish energy density when balancing the energy budget; furthermore, these results have implications for the science of evaluating fish bioenergetics model performance and for more accurate estimation of food consumption by fish in the field when fish energy density undergoes relatively rapid changes.
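
    A toy daily energy balance that contrasts the two balancing algorithms (a crude sketch, not the Wisconsin model itself; the lumped loss term, names and numbers are placeholders): algorithm 1 converts the day-t growth increment to energy using the day-t energy density, whereas algorithm 2 uses the day-(t+1) energy density.

```python
def daily_consumption(mass, energy_density, losses, algorithm=2):
    """Toy balance: consumption energy (J/day) = growth energy + lumped losses.
    mass[t]: body mass (g); energy_density[t]: J/g; losses[t]: respiration,
    egestion, excretion etc. lumped into one placeholder term (J/day)."""
    estimates = []
    for t in range(len(mass) - 1):
        ed = energy_density[t] if algorithm == 1 else energy_density[t + 1]
        growth_energy = (mass[t + 1] - mass[t]) * ed   # convert mass gain to energy
        estimates.append(growth_energy + losses[t])
    return estimates

# Illustrative numbers only.
print(daily_consumption(mass=[100.0, 103.0, 107.0],
                        energy_density=[5000.0, 5100.0, 5250.0],
                        losses=[2000.0, 2100.0]))
```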

  20. Optimality test in fuzzy inventory model for restricted budget and space: Move forward to a non-linear programming approach

    Directory of Open Access Journals (Sweden)

    Pattnaik Monalisha

    2015-01-01

In this paper, the concept of a fuzzy non-linear programming technique is applied to solve an economic order quantity (EOQ) model for restricted budget and space. Since various types of uncertainties and imprecision are inherent in real inventory problems, they are classically modeled using approaches from probability theory. However, there are uncertainties that cannot be appropriately treated by the usual probabilistic models. The questions are how to define inventory optimization tasks in such an environment and how to interpret the optimal solutions. This paper allows the modification of the single-item EOQ model in the presence of a fuzzy decision-making process where demand is related to the unit price, and the setup cost varies with the quantity produced/purchased. The modification of the objective function, budget, and storage area in the presence of imprecisely estimated parameters is considered. The model is developed by employing different approaches over an infinite planning horizon. It incorporates all the concepts of a fuzzy arithmetic approach and a comparative analysis with other non-linear models. Investigation of the properties of an optimal solution allows developing an algorithm whose validity is illustrated by an example problem, and two- and three-dimensional diagrams of this application are produced with MATLAB (R2009a) software. Sensitivity analysis of the optimal solution is studied with respect to the changes of different parameter values for obtaining managerial insights of the decision problem.
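
    For reference, the crisp (non-fuzzy) core of the problem is the classical EOQ with budget and storage constraints; the symbols below are generic, not the paper's notation:

$$
\min_{q>0}\; \mathrm{TC}(q) \;=\; \frac{D}{q}\,K + \frac{q}{2}\,h
\qquad \text{subject to}\qquad p\,q \le B, \quad a\,q \le S,
$$

where $D$ is the demand rate, $K$ the ordering (setup) cost, $h$ the unit holding cost, $p$ the unit price, $a$ the storage area per unit, $B$ the budget and $S$ the available space; without the constraints the optimum is the familiar $q^{*} = \sqrt{2DK/h}$.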

  1. BEYOND BUDGETING

    Directory of Open Access Journals (Sweden)

    Edo Cvrkalj

    2015-12-01

Traditional budgeting principles, with strictly defined business goals, have been slowly growing, since 1998, into more sophisticated and organization-adjusted alternative budgeting concepts. One of those alternative concepts is the "Beyond budgeting" model with an implemented performance effects measuring process. In order for the model to be practicable, budget planning and control have to be reoriented to the "bottom up" planning and control approach. In today's modern business surroundings one has to take both present and future opportunities and threats into consideration, by valorizing them in a budget which would allow a company to realize a whole palette of advantages over the traditional budgeting principles, which are presented later in the article. It is essential to emphasize the importance of successfully implementing the new budgeting principles within an organization. If the implementation has been lacking and done without a higher goal in mind, it is easily possible that the process has been implemented without a coordination, planning and control framework within the organization itself. Further in the article we present an overview of managerial techniques and instruments within the "Beyond budgeting" model, such as the balanced scorecard, rolling forecast, dashboard, KPIs and other supporting instruments. Lastly, we define seven steps for implementing the "Beyond budgeting" model and offer a comparison of the "Beyond budgeting" model against traditional budgeting principles, listing twelve reasons why "Beyond budgeting" is better suited to modern and market-oriented organizations. Each company faces those challenges in their own characteristic way, but implementing new dynamic planning models will soon become essential for surviving in the market.

  2. Budgeting Based on Results: A Return-on-Investment Model Contributes to More Effective Annual Spending Choices

    Science.gov (United States)

    Cooper, Kelt L.

    2011-01-01

    One major problem in developing school district budgets immune to cuts is the model administrators traditionally use--an expenditure model. The simplicity of this model is seductive: What were the revenues and expenditures last year? What are the expected revenues and expenditures this year? A few adjustments here and there and one has a budget.…

  3. Surface Water and Energy Budgets for Sub-Saharan Africa in GFDL Coupled Climate Model

    Science.gov (United States)

    Tian, D.; Wood, E. F.; Vecchi, G. A.; Jia, L.; Pan, M.

    2015-12-01

This study compares surface water and energy budget variables from the Geophysical Fluid Dynamics Laboratory (GFDL) FLOR models with the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis (CFSR), the Princeton University Global Meteorological Forcing Dataset (PGF), and PGF-driven Variable Infiltration Capacity (VIC) model outputs, as well as available observations over sub-Saharan Africa. The comparison was made for four configurations of the FLOR models: FLOR phase 1 (FLOR-p1) and phase 2 (FLOR-p2), and two phases of flux-adjusted versions (FLOR-FA-p1 and FLOR-FA-p2). Compared to p1, simulated atmospheric states in p2 were nudged to the Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis. The seasonal cycle and annual mean of major surface water variables (precipitation, evapotranspiration, runoff, and change of storage) and energy variables (sensible heat, ground heat, latent heat, net solar radiation, net longwave radiation, and skin temperature) over a 34-yr period during 1981-2014 were compared in different regions of sub-Saharan Africa (West Africa, East Africa, and Southern Africa). In addition to evaluating the means in three sub-regions, empirical orthogonal function (EOF) analyses were conducted to compare both spatial and temporal characteristics of water and energy budget variables from the four versions of GFDL FLOR, NCEP CFSR, PGF, and VIC outputs. This presentation will show how well each coupled climate model represented land surface physics and reproduced spatiotemporal characteristics of surface water and energy budget variables. We discuss what caused differences in surface water and energy budgets among the land surface components of the coupled climate model, the climate reanalysis, and the reanalysis-driven land surface model. The comparisons will reveal whether flux adjustment and nudging would improve the depiction of the surface water and energy budgets in coupled climate models.

  4. Growth of cockles (Cerastoderma edule) in the Oosterschelde described by a Dynamic Energy Budget model

    NARCIS (Netherlands)

    Wijsman, J.W.M.; Smaal, A.C.

    2011-01-01

    A Dynamic Energy Budget (DEB) model for cockles is presented and calibrated using detailed data on cockle growth and water quality in the Oosterschelde. Cockles in the intertidal areas of the Oosterschelde have an important function as a food source for wading birds and as such for the natural value

  5. A Sediment Budget Case Study: Comparing Watershed Scale Erosion Estimates to Modeled and Empirical Sediment Loads

    Science.gov (United States)

    McDavitt, B.; O'Connor, M.

    2003-12-01

    The Pacific Lumber Company Habitat Conservation Plan requires watershed analyses to be conducted on their property. This paper summarizes a portion of that analysis focusing on erosion and sedimentation processes and rates coupled with downstream sediment routing in the Freshwater Creek watershed in northwest California. Watershed scale erosion sources from hillslopes, roads, and channel banks were quantified using field surveys, aerial photo interpretation, and empirical modeling approaches for different elements of the study. Sediment transport rates for bedload were modeled, and sediment transport rates for suspended sediment were estimated based on size distribution of sediment inputs in relation to sizes transported in suspension. Recent short-term, high-quality estimates of suspended sediment yield that a community watershed group collected with technical assistance from the US Forest Service were used to validate the resulting sediment budget. Bedload yield data from an adjacent watershed, Jacoby Creek, provided another check on the sediment budget. The sediment budget techniques and bedload routing models used for this study generated sediment yield estimates that are in good agreement with available data. These results suggest that sediment budget techniques that require moderate levels of fieldwork can be used to provide relatively accurate technical assessments. Ongoing monitoring of sediment sources coupled with sediment routing models and reach scale field data allows for predictions to be made regarding in-channel sediment storage.

  6. Towards a formalization of budgets

    NARCIS (Netherlands)

    J.A. Bergstra; S. Nolst Trenité; M.B. van der Zwaag

    2008-01-01

    We go into the need for, and the requirements on, a formal theory of budgets. We present a simple algebraic theory of rational budgets, i.e., budgets in which amounts of money are specified by functions on the rational numbers. This theory is based on the tuplix calculus. We go into the importance o

  7. Regional atmospheric budgets of reduced nitrogen over the British isles assessed using a multi-layer atmospheric transport model

    NARCIS (Netherlands)

    Fournier, N.; Tang, Y.S.; Dragosits, U.; Kluizenaar, Y.de; Sutton, M.A.

    2005-01-01

    Atmospheric budgets of reduced nitrogen for the major political regions of the British Isles are investigated with a multi-layer atmospheric transport model. The model is validated against measurements of NH3 concentration and is developed to provide atmospheric budgets for defined subdomains of the

  8. Using a unit cost model to predict the impact of budget cuts on logistics products and services

    OpenAIRE

    Van Haasteren, Cleve J.

    1992-01-01

    Approved for Public Release; Distribution is Unlimited The Director of the Trident Integrated Logistics Support Division at the Naval Sea Systems Command manages a complex and dynamic budget that supports the provision of logistics products and services to the Trident submarine fleet. This thesis focuses on analyzing the Logistics Division budget and developing a model where the impact of a budget cut can be predicted by employing marginal cost. The thesis also explores ...

  9. Stochastic Climate Theory and Modelling

    CERN Document Server

    Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio

    2014-01-01

    Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...

  10. The "covariation method" for estimating the parameters of the standard Dynamic Energy Budget model I: Philosophy and approach

    NARCIS (Netherlands)

    Lika, K.; Kearney, M.R.; Freitas, V.; van der Veer, H.W.; van der Meer, J.; Wijsman, J.W.M.; Pecquerie, L.; Kooijman, S.A.L.M.

    2011-01-01

    The Dynamic Energy Budget (DEB) theory for metabolic organisation captures the processes of development, growth, maintenance, reproduction and ageing for any kind of organism throughout its life-cycle. However, the application of DEB theory is challenging because the state variables and parameters a

  11. Turbulence kinetic energy budget during the afternoon transition - Part 2: A simple TKE model

    Science.gov (United States)

    Nilsson, Erik; Lothon, Marie; Lohou, Fabienne; Pardyjak, Eric; Hartogensis, Oscar; Darbieu, Clara

    2016-07-01

A simple model for turbulence kinetic energy (TKE) and the TKE budget is presented for sheared convective atmospheric conditions based on observations from the Boundary Layer Late Afternoon and Sunset Turbulence (BLLAST) field campaign. It is based on an idealized mixed-layer approximation and a simplified near-surface TKE budget. In this model, the TKE is dependent on four budget terms (turbulent dissipation rate, buoyancy production, shear production and vertical transport of TKE) and only requires measurements of three available inputs (near-surface buoyancy flux, boundary layer depth and wind speed at one height in the surface layer) to predict vertical profiles of TKE and TKE budget terms. This simple model is shown to reproduce some of the observed variations between the different studied days in terms of near-surface TKE and its decay during the afternoon transition reasonably well. It is subsequently used to systematically study the effects of buoyancy and shear on TKE evolution using idealized constant and time-varying winds during the afternoon transition. From this, we conclude that many different TKE decay rates are possible under time-varying winds and that generalizing the decay with simple scaling laws for near-surface TKE of the form t^α may be questionable. The model's errors result from the exclusion of processes such as elevated shear production and horizontal advection. The model also produces an overly rapid decay of shear production with height. However, the most influential budget terms governing near-surface TKE in the observed sheared convective boundary layers are included, while only second-order factors are neglected. Comparison between modeled and averaged observed estimates of dissipation rate illustrates that the overall behavior of the model is often quite reasonable. Therefore, we use the model to discuss the low-turbulence conditions that form first in the upper parts of the boundary layer during the afternoon transition and are only
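
    For reference, the four terms correspond to the standard horizontally homogeneous TKE budget (with pressure transport and advection neglected); the notation below is generic, not necessarily the paper's:

$$
\frac{\partial \overline{e}}{\partial t}
\;=\;
\underbrace{-\,\overline{u'w'}\,\frac{\partial \overline{u}}{\partial z}}_{\text{shear production}}
\;+\;
\underbrace{\frac{g}{\overline{\theta_v}}\,\overline{w'\theta_v'}}_{\text{buoyancy production}}
\;-\;
\underbrace{\frac{\partial \overline{w'e}}{\partial z}}_{\text{turbulent transport}}
\;-\;
\varepsilon .
$$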

  12. Impact of land-surface elevation and riparian evapotranspiration seasonality on groundwater budget in MODFLOW models

    Science.gov (United States)

    Ajami, Hoori; Meixner, Thomas; Maddock, Thomas; Hogan, James F.; Guertin, D. Phillip

    2011-09-01

    Riparian groundwater evapotranspiration (ETg) constitutes a major component of the water balance especially in many arid and semi-arid environments. Although spatial and temporal variability of riparian ETg are controlled by climate, vegetation and subsurface characteristics, depth to water table (DTWT) is often considered the major controlling factor. Relationships between ETg rates and DTWT, referred to as ETg curves, are implemented in MODFLOW ETg packages (EVT, ETS1 and RIP-ET) with different functional forms. Here, the sensitivity of the groundwater budget in MODFLOW groundwater models to ETg parameters (including ETg curves, land-surface elevation and ETg seasonality) are investigated. A MODFLOW model of the hypothetical Dry Alkaline Valley in the Southwestern USA is used to show how spatial representation of riparian vegetation and digital elevation model (DEM) processing methods impact the water budget when RIPGIS-NET (a GIS-based ETg program) is used with MODFLOW's RIP-ET package, and results are compared with the EVT and ETS1 packages. Results show considerable impact on ETg and other groundwater budget components caused by spatial representation of riparian vegetation, vegetation type, fractional coverage areas and land-surface elevation. RIPGIS-NET enhances ETg estimation in MODFLOW by incorporating vegetation and land-surface parameters, providing a tool for ecohydrology studies, riparian ecosystem management and stream restoration.
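
    A minimal sketch of the kind of ETg curve these packages encode: the EVT package uses a single linear segment between the ET surface and the extinction depth, while ETS1 and RIP-ET generalize this to segmented, plant-group-specific curves. Function and variable names and the example values are illustrative.

```python
def etg_rate(dtwt, et_max, extinction_depth):
    """Piecewise-linear groundwater-ET rate versus depth to water table (DTWT):
    full rate at or above the ET surface (DTWT <= 0), decreasing linearly to
    zero at the extinction depth (EVT-style curve)."""
    if dtwt <= 0.0:
        return et_max
    if dtwt >= extinction_depth:
        return 0.0
    return et_max * (1.0 - dtwt / extinction_depth)

# Example: maximum ET of 5 mm/day, extinction depth of 3 m.
print([etg_rate(d, 5.0, 3.0) for d in (0.0, 1.5, 3.0, 4.0)])
```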

  13. Logic in the 1930s: type theory and model theory

    OpenAIRE

    Schiemer, Georg; Reck, Erich H.

    2013-01-01

    In historical discussions of twentieth-century logic, it is typically assumed that model theory emerged within the tradition that adopted first-order logic as the standard framework. Work within the type-theoretic tradition, in the style of Principia Mathematica, tends to be downplayed or ignored in this connection. Indeed, the shift from type theory to first-order logic is sometimes seen as involving a radical break that first made possible the rise of modern model theory. ...

  14. Models in theory building: the case of early string theory

    International Nuclear Information System (INIS)

The history of the origins and first steps of string theory, from Veneziano's formulation of his famous scattering amplitude in 1968 to the 'first string revolution' in 1984, provides rich material for discussing traditional issues in the philosophy of science. This paper focusses on the initial phase of this history, that is, the making of early string theory out of the 'dual theory of strong interactions', motivated by the aim of finding a viable theory of hadrons in the framework of the so-called S-matrix theory of the Sixties: from the first two models proposed (the Dual Resonance Model and the Shapiro-Virasoro Model) to all the subsequent endeavours to extend and complete the theory, including its string interpretation. As this paper aims to show, by representing an exemplary illustration of the building of a scientific theory out of tentative and partial models, this is a particularly fruitful case study for the current philosophical discussion on how to characterize a scientific model, a scientific theory, and the relation between models and theories.

  15. Halo modelling in chameleon theories

    CERN Document Server

    Lombriser, Lucas; Li, Baojiu

    2013-01-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a LCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke p...

  16. Modelling the water budget and the riverflows of the Maritsa basin in Bulgaria

    Directory of Open Access Journals (Sweden)

    E. Artinyan

    2008-01-01

    A soil-vegetation-atmosphere transfer model coupled with a macroscale distributed hydrological model was used to simulate the water cycle for a large region in Bulgaria. To do so, an atmospheric forcing was built for two hydrological years (1 October 1995 to 30 September 1997) at an 8 km resolution. The impact of human activities on the rivers (especially hydropower and irrigation) was taken into account. An improvement of the hydrometeorological model was made: for a better simulation of summer riverflow, two additional reservoirs were added to simulate the slow component of the runoff. Those reservoirs were calibrated using the observed data of the first year, while the second year was used for validation. 56 hydrologic stations and 12 dams were used for the model calibration, while 41 river gauges were used for the validation of the model. The results compare well with the daily observed discharges, with good results obtained for more than 25% of the river gauges. The simulated snow depth was compared to daily measurements at 174 stations, and the evolution of the snow water equivalent was validated at 5 sites. The process of melting and refreezing of snow was found to be important in this region. The comparison of the normalized values of simulated versus measured soil moisture showed good correlation. The surface water budget shows large spatial variations due to the influence of elevation on precipitation, soil properties and vegetation variability. An inter-annual difference was observed in the water cycle, as the first year was more influenced by a Mediterranean climate, while the second year was characterised by a continental influence. The energy budget shows a dominating sensible heat component in summer, because water stress limits the evaporation. This study is a first step towards the implementation of an operational hydrometeorological model that could be used for real-time monitoring and forecasting of the water budget
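
    The "two additional reservoirs" added above to reproduce the slow component of summer riverflow are, in spirit, linear stores. A minimal sketch of a single linear reservoir (dS/dt = R - kS, Q = kS) follows; the recession constant and the forcing values are invented for illustration and are not the calibrated values of the study.

    ```python
    def linear_reservoir(recharge, k=0.1, s0=0.0):
        """Route a daily input series (mm/d) through a linear store:
        dS/dt = R - k*S, Q = k*S. k is an illustrative recession constant."""
        storage, outflow = s0, []
        for r in recharge:
            storage += r            # add today's input to the store
            q = k * storage         # outflow proportional to storage
            storage -= q
            outflow.append(q)
        return outflow

    # illustrative: a recharge pulse followed by a dry spell
    print(linear_reservoir([10, 5, 0, 0, 0, 0, 0]))
    ```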

  17. The Governance Slack Model: A cash flow approach for the budgeting and accountability of some corporate governance issues

    OpenAIRE

    Apreda, Rodolfo

    2002-01-01

    This paper introduces a cash flow model to budget and monitor distinctive matters usually arising in corporate governance. By enlarging the standard cash flow model widely used in Finance, and avoiding some of its downsides, it sets up a composite of cash flows called governance slack, which amounts to a comprehensive budget for the most usual governance issues. This slack has a dual structure whose dynamics keeps track of uses and sources of its components, preventing likely agency problems ...

  18. Turbulence Kinetic Energy budget during the afternoon transition – Part 2: A simple TKE model

    Directory of Open Access Journals (Sweden)

    E. Nilsson

    2015-11-01

    A simple model for turbulence kinetic energy (TKE) and the TKE budget is presented for sheared convective atmospheric conditions based on observations from the Boundary Layer Late Afternoon and Sunset Turbulence (BLLAST) field campaign. It is based on an idealized mixed-layer approximation and a simplified near-surface TKE budget. In this model, the TKE is dependent on four budget terms (turbulent dissipation rate, buoyancy production, shear production and vertical transport of TKE) and requires only three inputs (near-surface buoyancy flux, boundary-layer depth and wind speed at one height in the surface layer). This simple model is shown to reproduce some of the observed variations between the different studied days in terms of near-surface TKE and its decay during the afternoon transition reasonably well. It is subsequently used to systematically study the effects of buoyancy and shear on TKE evolution using idealized constant and time-varying winds during the afternoon transition. From this, we conclude that many different TKE decay rates are possible under time-varying winds and that generalizing the decay with simple scaling laws for near-surface TKE of the form t^α may be questionable. The model's errors result from the exclusion of processes such as elevated shear production and horizontal advection. The model also produces an overly rapid decay of shear production with height. However, the most influential budget terms governing near-surface TKE in the observed sheared convective boundary layers are included, while only second-order factors are neglected. Comparison between modeled and averaged observed estimates of dissipation rate illustrates that the overall behavior of the model is often quite reasonable. Therefore, we use the model to discuss the low turbulence conditions that form first in the upper parts of the boundary layer during the afternoon transition and are only apparent later near the surface. This
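
    A minimal sketch of the kind of near-surface TKE budget integration described above is given below. The four budget terms are represented by crude, hypothetical closures (the constants c_* and l_eps are placeholders), so this is an illustration of the bookkeeping only, not the calibrated BLLAST model.

    ```python
    def integrate_tke(buoyancy_flux, wind_speed, bl_depth, dt=60.0, e0=0.5,
                      c_eps=0.7, l_eps=50.0, c_shear=0.05, c_transport=0.1):
        """Euler integration of de/dt = shear + buoyancy - dissipation + transport.

        buoyancy_flux -- near-surface buoyancy flux series (m^2 s^-3)
        wind_speed    -- surface-layer wind speed series (m s^-1)
        bl_depth      -- boundary-layer depth series (m)
        The closure constants (c_*, l_eps) are hypothetical placeholders.
        """
        e, series = e0, []
        for b, u, h in zip(buoyancy_flux, wind_speed, bl_depth):
            shear = c_shear * u ** 3 / h            # crude surrogate for shear production
            dissipation = c_eps * e ** 1.5 / l_eps  # e^(3/2) over a dissipation length scale
            transport = -c_transport * e * u / h    # export of TKE by vertical transport
            e = max(e + dt * (shear + b - dissipation + transport), 1e-4)
            series.append(e)
        return series

    # illustrative afternoon transition: buoyancy flux decays linearly to zero over 4 h
    n = 240
    forcing = [4e-3 * (1.0 - i / n) for i in range(n)]
    print(round(integrate_tke(forcing, [5.0] * n, [1000.0] * n)[-1], 3))
    ```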

  19. Field theory and the Standard Model

    CERN Document Server

    Dudas, E

    2014-01-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions

  20. Lattice models and conformal field theories

    International Nuclear Information System (INIS)

    Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated to a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied

  1. Using "snapshot" measurements of CH4 fluxes from peatlands to estimate annual budgets: interpolation vs. modelling.

    Science.gov (United States)

    Green, Sophie M.; Baird, Andy J.

    2016-04-01

    There is growing interest in estimating annual budgets of peatland-atmosphere carbon dioxide (CO2) and methane (CH4) exchanges. Such budgeting is required for calculating the peatland carbon balance and the radiative forcing impact of peatlands on climate. There have been multiple approaches used to estimate CO2 budgets; however, there is a limited literature regarding the modelling of annual CH4 budgets. Using data collected from flux-chamber tests in an area of blanket peatland in North Wales, we compared annual estimates of peatland-atmosphere CH4 emissions using an interpolation approach and an additive and multiplicative modelling approach. Flux-chamber measurements represent a snapshot of the conditions on a particular site. In contrast to CO2, most studies that have estimated the time-integrated flux of CH4 have not used models. Typically, linear interpolation is used to estimate CH4 fluxes during the time periods between flux-chamber measurements. It is unclear how much error is involved with such a simple integration method. CH4 fluxes generally show a rise followed by a fall through the growing season that may be captured reasonably well by interpolation, provided there are sufficiently frequent measurements. However, day-to-day and week-to-week variability is also often evident in CH4 flux data, and will not necessarily be properly represented by interpolation. Our fits of the CH4 flux models yielded r² > 0.5 in 38 of the 48 models constructed, with 55% of these having a weighted rw² > 0.4. Comparison of annualised CH4 fluxes estimated by interpolation and modelling reveals no correlation between the two data sets; indeed, in some cases even the sign of the flux differs. The difference between the methods also seems to be related to the size of the flux: for modest annual fluxes there is a fairly even scatter of points around the 1:1 line, whereas when the modelled fluxes are high, the corresponding interpolated fluxes tend to be low. We consider the
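
    The interpolation approach discussed above amounts to trapezoidal integration of snapshot fluxes between measurement dates. A minimal sketch, with invented snapshot values:

    ```python
    def budget_by_interpolation(days, fluxes):
        """Time-integrate snapshot fluxes (e.g. mg CH4 m^-2 d^-1) measured on the
        given days of year by linear interpolation (trapezoid rule).
        Returns the total (mg CH4 m^-2) over the period covered by the measurements;
        an annual budget additionally requires covering or extrapolating the full year."""
        total = 0.0
        for i in range(1, len(days)):
            total += 0.5 * (fluxes[i - 1] + fluxes[i]) * (days[i] - days[i - 1])
        return total

    # invented snapshot data: a mid-season rise and fall
    days = [30, 90, 150, 210, 270, 330]
    fluxes = [2.0, 5.0, 18.0, 25.0, 9.0, 3.0]
    print(budget_by_interpolation(days, fluxes))
    ```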

  2. Halo modelling in chameleon theories

    Science.gov (United States)

    Lombriser, Lucas; Koyama, Kazuya; Li, Baojiu

    2014-03-01

    We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.

  3. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.

  4. Modelling the carbon budget of intensive forest monitoring sites in Germany using the simulation model BIOME-BGC

    Directory of Open Access Journals (Sweden)

    Schulz C

    2009-01-01

    It is shown that by calibrating the simulation model BIOME-BGC with mandatory and optional Level II data within the ICP Forests programme, a well-founded calculation of the carbon budget of forest stands is achievable and that, based on a successful calibration, the modified BIOME-BGC model is a useful tool to assess the effect of climate change on forest ecosystems.

  5. Reconciled climate response estimates from climate models and the energy budget of Earth

    Science.gov (United States)

    Richardson, Mark; Cowtan, Kevin; Hawkins, Ed; Stolpe, Martin B.

    2016-10-01

    Climate risks increase with mean global temperature, so knowledge about the amount of future global warming should better inform risk assessments for policymakers. Expected near-term warming is encapsulated by the transient climate response (TCR), formally defined as the warming following 70 years of 1% per year increases in atmospheric CO2 concentration, by which point atmospheric CO2 has doubled. Studies based on Earth's historical energy budget have typically estimated lower values of TCR than climate models, suggesting that some models could overestimate future warming. However, energy-budget estimates rely on historical temperature records that are geographically incomplete and blend air temperatures over land and sea ice with water temperatures over open oceans. We show that there is no evidence that climate models overestimate TCR when their output is processed in the same way as the HadCRUT4 observation-based temperature record. Models suggest that air-temperature warming is 24% greater than observed by HadCRUT4 over 1861-2009 because slower-warming regions are preferentially sampled and water warms less than air. Correcting for these biases and accounting for wider uncertainties in radiative forcing based on recent evidence, we infer an observation-based best estimate for TCR of 1.66 °C, with a 5-95% range of 1.0-3.3 °C, consistent with the climate models considered in the IPCC 5th Assessment Report.
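
    For readers unfamiliar with the energy-budget approach mentioned above, the standard estimator scales the observed warming by the forcing of doubled CO2, TCR ≈ F_2x ΔT / ΔF. The sketch below uses placeholder values only, not the paper's inputs or its bias corrections.

    ```python
    def tcr_energy_budget(delta_t, delta_f, f_2xco2=3.7):
        """Energy-budget TCR estimate: TCR ~= F_2x * dT / dF.
        (Ocean heat uptake enters equilibrium-sensitivity estimates, not TCR.)

        delta_t -- historical change in global temperature (K), placeholder here
        delta_f -- historical change in radiative forcing (W m^-2), placeholder here
        """
        return f_2xco2 * delta_t / delta_f

    # placeholder historical changes: 0.75 K warming for 2.0 W m^-2 of forcing
    print(round(tcr_energy_budget(delta_t=0.75, delta_f=2.0), 2))  # about 1.39 K
    ```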

  6. Impact of surface wind biases on the Antarctic sea ice concentration budget in climate models

    Science.gov (United States)

    Lecomte, O.; Goosse, H.; Fichefet, T.; Holland, P. R.; Uotila, P.; Zunz, V.; Kimura, N.

    2016-09-01

    We derive the terms in the Antarctic sea ice concentration budget from the output of three models, and compare them to observations of the same terms. Those models include two climate models from the 5th Coupled Model Intercomparison Project (CMIP5) and one ocean-sea ice coupled model with prescribed atmospheric forcing. Sea ice drift and wind fields from those models, averaged over April-October 1992-2005, all exhibit large differences with the available observational or reanalysis datasets. However, the discrepancies between the two distinct ice drift products or the two wind reanalyses used here are sometimes even greater than those differences. Two major findings stand out from the analysis. First, large biases in sea ice drift speed and direction in exterior sectors of the sea ice covered region tend to be systematic and consistent with those in winds. This suggests that sea ice errors in these areas are most likely wind-driven, as are errors in the simulated ice motion vectors. The systematic nature of these biases is less prominent in interior sectors, nearer the coast, where sea ice is mechanically constrained and its motion in response to the wind forcing depends more on the model rheology. Second, the intimate relationship between winds, sea ice drift and the sea ice concentration budget gives insight into ways to categorize models with regard to errors in their ice dynamics. In exterior regions, models with seemingly too weak winds and slow ice drift consistently yield too little ice velocity divergence and hence an incorrect wintertime sea ice growth rate. In interior sectors, too slow ice drift, presumably originating from issues in the physical representation of sea ice dynamics as much as from errors in surface winds, leads to incorrect timing of the late winter ice retreat. Those results illustrate that the applied methodology provides a valuable tool for prioritizing model improvements based on the ice concentration budget-ice drift biases-wind biases

  7. Theory and Experience in Deliberative Democracy and Budget Review

    Institute of Scientific and Technical Information of China (English)

    顾维萌

    2012-01-01

    The theory of deliberative democracy emphasizes that citizens should participate in public affairs through free and equal rational dialogue, debate, consultation and deliberation, thereby conferring legitimacy on legislation and decision-making; this is of great significance for budget review. The participatory budgeting practices of Porto Alegre in Brazil and Wenling in China are typical cases of deliberative democracy. Drawing on the theory of deliberative democracy and the practice of participatory budgeting, the improvement of China's budget review system should start with perfecting the legal framework, constructing procedural mechanisms and strengthening citizens' capacity for autonomous participation.

  8. A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France

    International Nuclear Information System (INIS)

    The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis from the payer perspective used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9 %) and BMS rates reported in the literature (average 16 and 22 %) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) for a scenario where BMS is progressively replaced by Zilver PTX compared to a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.

  9. A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France

    Energy Technology Data Exchange (ETDEWEB)

    De Cock, Erwin, E-mail: erwin.decock@unitedbiosource.com [United BioSource Corporation, Peri- and Post-Approval Services (Spain); Sapoval, Marc, E-mail: Marc.sapoval2@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Department of Cardiovascular and Interventional Radiology (France); Julia, Pierre, E-mail: pierre.julia@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Cardiovascular Surgery Department (France); Lissovoy, Greg de, E-mail: gdelisso@jhsph.edu [Johns Hopkins Bloomberg School of Public Health, Department of Health Policy and Management (United States); Lopes, Sandra, E-mail: Sandra.Lopes@CookMedical.com [Cook Medical, Health Economics and Reimbursement (Denmark)

    2013-04-15

    The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis from the payer perspective used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9 %) and BMS rates reported in the literature (average 16 and 22 %) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) for a scenario where BMS is progressively replaced by Zilver PTX compared to a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.
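
    The budget-impact arithmetic described in the two records above reduces to comparing primary-stenting costs plus reintervention costs between scenarios. The sketch below reproduces only that logic; all tariffs, TLR rates, uptake shares and patient counts are hypothetical placeholders, not the inputs of the French model.

    ```python
    def scenario_cost(patients_per_year, des_share_by_year, tariff_bms, tariff_des,
                      tlr_rate_bms, tlr_rate_des, tariff_tlr):
        """Cumulative cost of primary stenting plus target-lesion revascularisations."""
        total = 0.0
        for share_des in des_share_by_year:
            n_des = patients_per_year * share_des
            n_bms = patients_per_year * (1.0 - share_des)
            total += n_des * (tariff_des + tlr_rate_des * tariff_tlr)
            total += n_bms * (tariff_bms + tlr_rate_bms * tariff_tlr)
        return total

    # hypothetical tariffs (EUR), TLR rates and uptake path -- placeholders only
    kwargs = dict(patients_per_year=16000, tariff_bms=3000.0, tariff_des=3600.0,
                  tlr_rate_bms=0.22, tlr_rate_des=0.06, tariff_tlr=6000.0)
    cost_mixed = scenario_cost(des_share_by_year=[0.1, 0.2, 0.3, 0.4, 0.5], **kwargs)
    cost_bms_only = scenario_cost(des_share_by_year=[0.0] * 5, **kwargs)
    print(round(cost_mixed - cost_bms_only))  # negative value = cumulative savings
    ```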

  10. Evaluation of Mediterranean Sea water and heat budgets simulated by an ensemble of high resolution regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Gomez, E. [CERFACS/CNRS, SUC URA1875, Toulouse Cedex (France); Somot, S.; Dubois, C.; Deque, M. [CNRM/GAME, Meteo-France/CNRS, Toulouse (France); Josey, S.A. [National Oceanography Centre, Southampton (United Kingdom); Elguindi, N. [LA, CNRS, Toulouse (France)

    2011-11-15

    Air-sea heat and freshwater fluxes in the Mediterranean Sea play a crucial role in dense water formation. Here, we compare estimates of Mediterranean Sea heat and water budgets from a range of observational datasets and discuss the main differences between them. Taking into account the closure hypothesis at the Gibraltar Strait, we have built several observational estimates of the water and heat budgets by combining their different observational components. We then provide three estimates for the water budget and one for the heat budget that satisfy the closure hypothesis. We then use these observational estimates to assess the ability of an ensemble of ERA40-driven high-resolution (25 km) Regional Climate Models (RCMs) from the FP6-EU ENSEMBLES database to simulate the various components, and net values, of the water and heat budgets. Most of the RCM Mediterranean basin means are within the range spanned by the observational estimates of the different budget components, though in some cases the RCMs have a tendency to overestimate the latent heat flux (or evaporation) with respect to observations. The RCMs do not show significant improvements of the total water budget estimates compared to ERA40. Moreover, given the large spread found in observational estimates of precipitation over the sea, it is difficult to draw conclusions on the performance of the RCMs for the freshwater budget, and this underlines the need for better precipitation observations. The original ERA40 value for the basin-mean net heat flux is -15 W/m², which is 10 W/m² less than the value of -5 W/m² inferred from the transport measurements at the Gibraltar Strait. The ensemble of heat budget values estimated from the models shows that most of the RCMs do not achieve heat budget closure. However, the ensemble mean value for the net heat flux is -7 ± 21 W/m², which is close to the Gibraltar value, although the spread between the RCMs is large. Since the RCMs are forced by the same
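
    The closure bookkeeping referred to above is simple: the basin-mean net surface heat flux (shortwave gain minus longwave, latent and sensible losses) should, in the long-term mean, balance the roughly -5 W/m² of heat imported through the Gibraltar Strait. A minimal sketch with illustrative component values (not the ERA40 or RCM numbers):

    ```python
    def net_surface_heat_flux(q_sw, q_lw, q_latent, q_sensible):
        """Net downward heat flux (W m^-2): shortwave gain minus longwave,
        latent and sensible losses (losses given as positive numbers)."""
        return q_sw - q_lw - q_latent - q_sensible

    # illustrative basin means only
    q_net = net_surface_heat_flux(q_sw=195.0, q_lw=80.0, q_latent=113.0, q_sensible=7.0)
    gibraltar_constraint = -5.0  # W m^-2 inferred from strait transport measurements
    print(q_net, "closure residual:", q_net - gibraltar_constraint)
    ```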

  11. On Dimer Models and Closed String Theories

    OpenAIRE

    Sarkar, Tapobrata

    2007-01-01

    We study some aspects of the recently discovered connection between dimer models and D-brane gauge theories. We argue that dimer models are also naturally related to closed string theories on non-compact orbifolds of $\mathbb{C}^2$ and $\mathbb{C}^3$, via their twisted sector R charges, and show that perfect matchings in dimer models correspond to twisted sector states in the closed string theory. We also use this formalism to study the combinatorics of some unstable orbifolds of $\mathbb{C}^2$.

  12. The Friction Theory for Viscosity Modeling

    DEFF Research Database (Denmark)

    Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan

    2001-01-01

    In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet ... below the saturation pressure. In addition, a tuned f-theory general model delivers accurate modeling of different kinds of light and heavy oils. Thus, the simplicity and stability of the f-theory general models make them a powerful tool for applications such as reservoir simulations, among others.

  13. The measurement of the earth's radiation budget as a problem in information theory - A tool for the rational design of earth observing systems

    Science.gov (United States)

    Barkstrom, B. R.

    1983-01-01

    The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.

  14. Assessing the O2 budget under sea ice: An experimental and modelling approach

    Directory of Open Access Journals (Sweden)

    S. Moreau

    2015-12-01

    The objective of this study was to assess the O2 budget in the water under sea ice by combining observations and modelling. Modelling was used to discriminate between physical processes, gas-specific transport (i.e., ice-atmosphere gas fluxes and gas bubble buoyancy) and bacterial respiration (BR), and to constrain the bacterial growth efficiency (BGE). A module describing the changes of the under-ice water properties, due to brine rejection and temperature-dependent BR, was implemented in the one-dimensional halo-thermodynamic sea ice model LIM1D. Our results show that BR was the dominant biogeochemical driver of O2 concentration in the water under ice (in a system without primary producers), followed by gas-specific transport. The model suggests that the actual contribution of BR and gas-specific transport to the change in seawater O2 concentration was 37% during ice growth and 48% during melt. BGE in the water under sea ice, as retrieved from the simulated O2 budget, was found to be between 0.4 and 0.5, which is in line with published BGE values for cold marine waters. Given the importance of BR to seawater O2 in the present study, it can be assumed that bacteria contribute substantially to organic matter consumption and gas fluxes in ice-covered polar oceans. In addition, we propose a parameterization of polar marine bacterial respiration, based on the strong temperature dependence of bacterial respiration and the high growth efficiency observed here, for further biogeochemical ocean modelling applications, such as regional or large-scale Earth System models.
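
    A minimal sketch of the under-ice O2 bookkeeping and the BGE retrieval described above, with invented daily rates (this is an illustration of the budget terms, not the LIM1D implementation):

    ```python
    def o2_timeseries(o2_initial, respiration, gas_exchange):
        """Daily O2 concentration from respiration losses and ice-atmosphere /
        bubble-transport gains or losses (all rates in mmol O2 m^-3 d^-1)."""
        o2, series = o2_initial, []
        for br, flux in zip(respiration, gas_exchange):
            o2 += flux - br          # positive flux = gain from gas-specific transport
            series.append(o2)
        return series

    def bacterial_growth_efficiency(bacterial_production, bacterial_respiration):
        """BGE = BP / (BP + BR), in carbon units."""
        return bacterial_production / (bacterial_production + bacterial_respiration)

    # invented rates for a 5-day ice-growth period
    print(o2_timeseries(350.0, respiration=[1.2] * 5, gas_exchange=[0.8] * 5))
    print(round(bacterial_growth_efficiency(0.9, 1.0), 2))  # ~0.47, within 0.4-0.5
    ```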

  15. Ozone Budgets from a Global Chemistry/ Transport Model and Comparison to Observations from POLARIS

    Science.gov (United States)

    Kawa, S. Randy

    1999-01-01

    The objective of the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) field mission was to obtain data to better characterize the summertime seasonal decrease of ozone at mid to high latitudes. The decrease in ozone occurs mainly in the lower stratosphere and is expected to result from in situ chemical destruction. Instrumented balloons and aircraft were used in POLARIS, along with satellites, to measure ozone and chemical species which are involved with stratospheric ozone chemistry. In order to close the seasonal ozone budget, however, ozone transport must also be estimated. Comparison to a global chemistry and transport model (CTM) of the stratosphere indicates how well the summertime ozone loss processes are simulated and thus how well we can predict the ozone response to changing amounts of chemical source gases. Moreover, the model gives insight into the possible relative magnitude of transport contributions to the seasonal ozone decline. Initial comparison to the Goddard CTM, which uses transport winds and temperatures from meteorological data assimilation, shows a high ozone bias in the model and an attenuated summertime ozone loss cycle. Comparison of the model chemical partitioning, and ozone catalytic loss rates to those derived from measurements shows fairly close agreement both at ER-2 altitudes (20 km) and higher. This suggests that the model transport is too active in resupplying ozone to the high latitude region, although chemistry failings cannot be completely ruled out. Comparison of ozone and related species will be shown along with a full diagnosis of the model ozone budget and its possible sources of error.

  16. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  17. Quantum field theory competitive models

    CERN Document Server

    Tolksdorf, Jürgen; Zeidler, Eberhard

    2009-01-01

    For more than 70 years, quantum field theory (QFT) has been a driving force in the development of theoretical physics. Equally fascinating is the fruitful impact which QFT has had on rather remote areas of mathematics. The present book features some of the different approaches, physical viewpoints and techniques used to make the notion of quantum field theory more precise. For example, the present book contains a discussion including general considerations, stochastic methods, deformation theory and the holographic AdS/CFT correspondence. It also contains a discussion of more recent developments like the use of category theory and topos-theoretic methods to describe QFT. The present volume emerged from the 3rd 'Blaubeuren Workshop: Recent Developments in Quantum Field Theory', held in July 2007 at the Max Planck Institute of Mathematics in the Sciences in Leipzig, Germany. All of the contributions are committed to the idea of this workshop series: 'To bring together outstanding experts working in...

  18. Domain Theory, Its Models and Concepts

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt

    2014-01-01

    Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts, and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contributions and industrial applications, especially in the DFX areas (not reported here) and in product modelling. The theory therefore contains a rich ontology of interrelated concepts. The Domain Theory does not aim to create normative methods but rather a collection of concepts related to design phenomena, which can support design work and form elements of designers' mindsets and thereby their practice. The theory is a model-based theory, which means it is composed of concepts and models that explain certain design phenomena. Many similar theories are described in the literature with differences...

  19. Motivation in Beyond Budgeting: A Motivational Paradox?

    DEFF Research Database (Denmark)

    Sandalgaard, Niels; Bukh, Per Nikolaj

    In this paper we discuss the role of motivation in relation to budgeting and we analyse how the Beyond Budgeting model functions compared with traditional budgeting. In the paper we focus on budget-related motivation (and motivation in general) and conclude that the Beyond Budgeting model is a motivational paradox.

  20. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the w

  1. Development of a dynamic energy budget modeling approach to investigate the effects of temperature and resource limitation on mercury bioaccumulation in Fundulus heteroclitus.

    Science.gov (United States)

    Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we are developing growth and bioaccumulation studies that contrib...

  2. Development of a dynamic energy budget modeling approach to investigate the effects of temperature and resource limitation on mercury bioaccumulation in Fundulus heteroclitus-presentation

    Science.gov (United States)

    Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we are conducting growth and bioaccumulation studies that contrib...

  3. A photochemical modeling study of ozone and formaldehyde generation and budget in the Po basin

    Science.gov (United States)

    Liu, L.; Andreani-Aksoyoglu, S.; Keller, J.; Ordóñez, C.; Junkermann, W.; Hak, C.; Braathen, G. O.; Reimann, S.; Astorga-Llorens, C.; Schultz, M.; Prévôt, A. S. H.; Isaksen, I. S. A.

    2007-11-01

    In this work, a photochemical dispersion model, CAMx (Comprehensive Air quality Model with eXtensions), was used to simulate a high ozone episode observed in the Po basin during the 2003 FORMAT (Formaldehyde as a Tracer of Oxidation in the Troposphere) campaign. The study focuses on formaldehyde and ozone, and a budget analysis was set up for interpreting the importance of different processes, namely emission, chemistry, transport and deposition, for three different areas (urban, downwind, suburban) around the Milan metropolitan region. In addition, a sensitivity study was carried out based on 11 different VOC emission scenarios. The results of the budget study show that the strongest O3 production rate (4 ppbv/hour) occurs in the downwind area of the city of Milan, and that accumulated O3 is transported back to Milan city during nighttime. More than 80% of the HCHO concentration over the Milan metropolitan region is secondary, i.e., photochemically produced from other VOCs. The sensitivity study shows that the emissions of isoprene are not, on average, a controlling factor for the peak concentrations of O3 and HCHO over the model domain because there are very few oak trees in this region. Although the paraffinic (PAR) emissions are fairly large, a 20% reduction of PAR yields only a 1.7% reduction in HCHO and a 2.7% reduction of the O3 peak. The largest reduction of O3 levels can be obtained by reduced xylene (XYL) emissions. A 20% reduction of the total anthropogenic VOC emissions leads to a 15.5% (20.3 ppbv) reduction of O3 peak levels over the Milan metropolitan region.

  4. Water budget comparison of global climate models and experimental data in Onça Creek basin, Brazil

    Science.gov (United States)

    Melo, D. C. D.; Marin, I. S. P.; Wendland, E.

    2014-09-01

    Groundwater is an important part of the hydrological cycle, accounting for more than 25% of human needs on the global scale. As a result of aquifer overexploitation associated with climate change, even in the most conservative future climate scenarios, mean water-table levels can experience drastic drops. Although there are efforts to include groundwater dynamics in global climate models (GCMs), its influence is still not taken into full account in GCM water budgets, although it is as important as the other water sources considered. To assess the role of percolation in the water balance, we compared the water budget from climate forcing scenarios using 10 GCMs with the water budget from experimental data of a basin in São Paulo state, Brazil. We used the delta factor approach to correct the bias of the model's temperature and precipitation for a control period from 1970 to 1999, and calculated evapotranspiration using the Thornthwaite method. Experimental data for runoff and interception were derived for the basin's representative crops (sugar cane and pasture) for both water budgets. As the GCMs ignore subsurface flow and the only input considered is precipitation and snow melt, the excess surface water is assumed to be redistributed among the other water budget components. The experimental data shows that there is enough available water for infiltration, indicating that recharge cannot be ignored in the water balance. This leads to the possibility of the models' overestimating the other components to compensate for the ignored recharge.
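
    The delta-factor correction mentioned above is usually applied additively to temperature and multiplicatively to precipitation, with the deltas computed over the 1970-1999 control period. A minimal sketch with invented climatologies:

    ```python
    def delta_correct(obs_baseline, model_baseline, model_scenario, multiplicative=False):
        """Apply the delta-change method to a climatological value.

        Additive (temperature):   corrected = obs + (scenario - baseline)
        Multiplicative (precip):  corrected = obs * scenario / baseline
        """
        if multiplicative:
            return obs_baseline * model_scenario / model_baseline
        return obs_baseline + (model_scenario - model_baseline)

    # invented monthly climatologies
    print(delta_correct(obs_baseline=22.0, model_baseline=23.5, model_scenario=25.0))   # deg C
    print(delta_correct(obs_baseline=120.0, model_baseline=90.0, model_scenario=80.0,
                        multiplicative=True))                                           # mm/month
    ```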

  5. A water-budget model and estimates of groundwater recharge for Guam

    Science.gov (United States)

    Johnson, Adam G.

    2012-01-01

    On Guam, demand for groundwater tripled from the early 1970s to 2010. The demand for groundwater is anticipated to further increase in the near future because of population growth and a proposed military relocation to Guam. Uncertainty regarding the availability of groundwater resources to support the increased demand has prompted an investigation of groundwater recharge on Guam using the most current data and accepted methods. For this investigation, a daily water-budget model was developed and used to estimate mean recharge for various land-cover and rainfall conditions. Recharge was also estimated for part of the island using the chloride mass-balance method. Using the daily water-budget model, estimated mean annual recharge on Guam is 394.1 million gallons per day, which is 39 percent of mean annual rainfall (999.0 million gallons per day). Although minor in comparison to rainfall on the island, water inflows from water-main leakage, septic-system leachate, and stormwater runoff may be several times greater than rainfall at areas that receive these inflows. Recharge is highest in areas that are underlain by limestone, where recharge is typically between 40 and 60 percent of total water inflow. Recharge is relatively high in areas that receive stormwater runoff from storm-drain systems, but is relatively low in urbanized areas where stormwater runoff is routed to the ocean or to other areas. In most of the volcanic uplands in southern Guam where runoff is substantial, recharge is less than 30 percent of total water inflow. The water-budget model in this study differs from all previous water-budget investigations on Guam by directly accounting for canopy evaporation in forested areas, quantifying the evapotranspiration rate of each land-cover type, and accounting for evaporation from impervious areas. For the northern groundwater subbasins defined in Camp, Dresser & McKee Inc. (1982), mean annual baseline recharge computed in this study is 159.1 million gallons
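
    A minimal sketch of daily water-budget bookkeeping of the kind described above: a single root-zone store from which recharge is the water in excess of capacity. The capacities and forcing below are invented and far simpler than the Guam model, which also tracks canopy evaporation, impervious areas and land-cover-specific evapotranspiration.

    ```python
    def daily_recharge(rain, runoff, et, soil_capacity=100.0, soil0=50.0):
        """Daily soil-moisture accounting: recharge = water in excess of capacity.

        rain, runoff, et -- daily series in mm/d (runoff and et are losses)
        """
        soil, recharge = soil0, []
        for p, q, e in zip(rain, runoff, et):
            soil += p - q - e           # today's net input to the root zone
            soil = max(soil, 0.0)       # the store cannot dry below zero in this sketch
            r = max(soil - soil_capacity, 0.0)
            soil -= r                   # excess percolates below the root zone
            recharge.append(r)
        return recharge

    # invented wet-season days
    print(daily_recharge(rain=[40, 60, 5, 0, 80], runoff=[5, 10, 0, 0, 15],
                         et=[4, 4, 5, 5, 4]))
    ```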

  6. Volcanic aquifers of Hawai‘i—Hydrogeology, water budgets, and conceptual models

    Science.gov (United States)

    Izuka, Scot K.; Engott, John A.; Bassiouni, Maoya; Johnson, Adam G.; Miller, Lisa D.; Rotzoll, Kolja; Mair, Alan

    2016-06-13

    Hawai‘i’s aquifers have limited capacity to store fresh groundwater because each island is small and surrounded by saltwater. Saltwater also underlies much of the fresh groundwater. Fresh groundwater resources are, therefore, particularly vulnerable to human activity, short-term climate cycles, and long-term climate change. Availability of fresh groundwater for human use is constrained by the degree to which the impacts of withdrawal—such as lowering of the water table, saltwater intrusion, and reduction in the natural discharge to springs, streams, wetlands, and submarine seeps—are deemed acceptable. This report describes the hydrogeologic framework, groundwater budgets (inflows and outflows), conceptual models of groundwater occurrence and movement, and the factors limiting groundwater availability for the largest and most populated of the Hawaiian Islands—Kaua‘i, O‘ahu, Maui, and Hawai‘i Island.

  7. An approach for modeling sediment budgets in supply-limited rivers

    Science.gov (United States)

    Wright, Scott A.; Topping, David J.; Rubin, David M.; Melis, Theodore S.

    2010-01-01

    Reliable predictions of sediment transport and river morphology in response to variations in natural and human-induced drivers are necessary for river engineering and management. Because engineering and management applications may span a wide range of space and time scales, a broad spectrum of modeling approaches has been developed, ranging from suspended-sediment "rating curves" to complex three-dimensional morphodynamic models. Suspended sediment rating curves are an attractive approach for evaluating changes in multi-year sediment budgets resulting from changes in flow regimes because they are simple to implement, computationally efficient, and the empirical parameters can be estimated from quantities that are commonly measured in the field (i.e., suspended sediment concentration and water discharge). However, the standard rating curve approach assumes a unique suspended sediment concentration for a given water discharge. This assumption is not valid in rivers where sediment supply varies enough to cause changes in particle size or changes in areal coverage of sediment on the bed; both of these changes cause variations in suspended sediment concentration for a given water discharge. More complex numerical models of hydraulics and morphodynamics have been developed to address such physical changes of the bed. This additional complexity comes at a cost in terms of computations as well as the type and amount of data required for model setup, calibration, and testing. Moreover, application of the resulting sediment-transport models may require observations of bed-sediment boundary conditions that require extensive (and expensive) observations or, alternatively, require the use of an additional model (subject to its own errors) merely to predict the bed-sediment boundary conditions for use by the transport model. In this paper we present a hybrid approach that combines aspects of the rating curve method and the more complex morphodynamic models. Our primary objective
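
    For contrast with the hybrid approach proposed above, a minimal sketch of the standard rating-curve budget it seeks to improve on: a power-law concentration-discharge relation integrated over a discharge series (the coefficients and discharges are invented):

    ```python
    def rating_curve_load(discharge, a=0.5, b=1.3, dt_seconds=86400.0):
        """Suspended-sediment budget from a rating curve C = a * Q**b.

        discharge -- daily mean water discharge Q (m^3 s^-1)
        C         -- suspended-sediment concentration (mg L^-1); a and b are
                     empirical fit coefficients (invented values here)
        Returns the total load in tonnes over the series.
        """
        total_kg = 0.0
        for q in discharge:
            conc_mg_per_l = a * q ** b          # unique C for a given Q: the key assumption
            conc_kg_per_m3 = conc_mg_per_l / 1000.0
            total_kg += conc_kg_per_m3 * q * dt_seconds
        return total_kg / 1000.0

    # invented daily discharges (m^3 s^-1) spanning a small flood
    print(round(rating_curve_load([120, 340, 560, 300, 150])))
    ```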

  8. Autotrophic carbon budget in coral tissue: a new 13C-based model of photosynthate translocation.

    Science.gov (United States)

    Tremblay, Pascale; Grover, Renaud; Maguer, Jean François; Legendre, Louis; Ferrier-Pagès, Christine

    2012-04-15

    Corals live in symbiosis with dinoflagellates of the genus Symbiodinium. These dinoflagellates translocate a large part of the photosynthetically fixed carbon to the host, which in turn uses it for its own needs. Assessing the carbon budget in coral tissue is a central question in reef studies that still vexes ecophysiologists. The amount of carbon fixed by the symbiotic association can be determined by measuring the rate of photosynthesis, but the amount of carbon translocated by the symbionts to the host and the fate of this carbon are more difficult to assess. In the present study, we propose a novel approach to calculate the budget of autotrophic carbon in the tissue of scleractinian corals, based on a new model and measurements made with the stable isotope (13)C. Colonies of the scleractinian coral Stylophora pistillata were incubated in H(13)CO3(-)-enriched seawater, after which the fate of (13)C was followed in the symbionts, the coral tissue and the released particulate organic carbon (i.e. mucus). Results obtained showed that after 15 min, ca. 60% of the carbon fixed was already translocated to the host, and after 48 h, this value reached 78%. However, ca. 48% of the photosynthetically fixed carbon was respired by the symbiotic association, and 28% was released as dissolved organic carbon. This is different from other coral species, where ... coral tissue after 48 h. Results show that our (13)C-based model could successfully trace the carbon flow from the symbionts to the host, and the photosynthetically acquired carbon lost from the symbiotic association. PMID:22442377

  9. Growth of cockles ( Cerastoderma edule) in the Oosterschelde described by a Dynamic Energy Budget model

    Science.gov (United States)

    Wijsman, Johannes W. M.; Smaal, Aad C.

    2011-11-01

    A Dynamic Energy Budget (DEB) model for cockles is presented and calibrated using detailed data on cockle growth and water quality in the Oosterschelde. Cockles in the intertidal areas of the Oosterschelde have an important function as a food source for wading birds and as such for the natural values of the ecosystem. In the presented model, special attention is paid to the formulation and parameter estimation of the functional response. With this functional response, the food quantity and quality variables such as Chlorophyll-a, POM, POC and TPM are translated into food ingestion rate for the cockles. The calibration of the specific parameters included in this functional response is done using a detailed, long-term dataset (1992-2007) of cockle growth in the Oosterschelde estuary. This dataset gives a good overview of the development of the cockle population in relation to the environmental conditions (food availability and ambient temperature). The DEB model was able to describe the spatial variation in cockle growth in the Oosterschelde as a function of environmental conditions and the parameters of the functional response. Both the data and the model show that growth performance of cockles is highest in the western and central part of the Oosterschelde due to the higher concentrations of Chlorophyll-a, which is an important food source for cockles. The model failed to describe the large variation in ash-free dry weight during the season. It is tested whether this is caused by aggregating the data by running the model for the full life cycle of year class 2001 at a specific location in the western part of the Oosterschelde. Finally, the model simulations have been compared to growth simulations obtained with an existing ecophysiological model for cockles in the Oosterschelde, the COCO model, with identical forcing. The COCO model showed higher growth in terms of shell length compared to the DEB model and the field observations.

  10. Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets

    Science.gov (United States)

    Mathur, Rohit

    1997-01-01

    This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.

  11. Particle hopping models and traffic flow theory

    OpenAIRE

    Nagel, Kai

    1995-01-01

    This paper shows how particle hopping models fit into the context of traffic flow theory. Connections between fluid-dynamical traffic flow models, which derive from the Navier-Stokes-equations, and particle hopping models are shown. In some cases, these connections are exact and have long been established, but have never been viewed in the context of traffic theory. In other cases, critical behavior of traffic jam clusters can be compared to instabilities in the partial differential equations...

  12. Retrofitting step by step. Model project with potentials for different budgets; Modernisieren Schritt fuer Schritt. Modellprojekt mit Potenzialen fuer unterschiedliche Budgets

    Energy Technology Data Exchange (ETDEWEB)

    Unger, Astrid [VELUX Deutschland GmbH (Germany)]

    2011-07-01

    Within a Europe-wide experiment, Velux (Schwalmstadt, Federal Republic of Germany) is building six trend-setting concept houses in Denmark, Austria, England, France and Germany between 2009 and 2011 under its Model Home 2020 programme. The vision: buildings with an optimal energy design that simultaneously offer the highest quality of life. In Germany, the company has taken on a task of social relevance by retrofitting a settler's house into a light-active house. Nearly one half of the 39 million residential units in Germany is between 30 and 60 years old and must be retrofitted energetically. Enormous potential therefore exists; however, not everyone can afford a sophisticated premium modernization. For the German concept house, the expert team has therefore also calculated two additional modernization variants for smaller budgets.

  13. Internal variability of Earth’s energy budget simulated by CMIP5 climate models

    International Nuclear Information System (INIS)

    We analyse a large number of multi-century pre-industrial control simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) to investigate relationships between net top-of-atmosphere radiation (TOA), globally averaged surface temperature (GST), and globally integrated ocean heat content (OHC) on decadal timescales. Consistent with previous studies, we find that large trends (∼0.3 K dec⁻¹) in GST can arise from internal climate variability and that these trends are generally an unreliable indicator of TOA over the same period. In contrast, trends in total OHC explain 95% or more of the variance in TOA for two-thirds of the models analysed, emphasizing the oceans' role as Earth's primary energy store. Correlation of trends in total system energy (TE ≡ time-integrated TOA) against trends in OHC suggests that for most models the ocean becomes the dominant term in the planetary energy budget on a timescale of about 12 months. In the context of the recent pause in global surface temperature rise, we investigate the potential importance of internal climate variability in both TOA and ocean heat rearrangement. The model simulations suggest that both factors can account for O(0.1 W m⁻²) on decadal timescales and may play an important role in the recently observed trends in GST and 0–700 m (and 0–1800 m) ocean heat uptake.
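
    A minimal sketch of the diagnostic relationship described above: total system energy TE as the time integral of net TOA flux, compared with an OHC series through decadal trends and correlation. The synthetic series below merely stand in for CMIP5 control-run output.

    ```python
    import numpy as np

    SECONDS_PER_YEAR = 3.15576e7
    EARTH_AREA = 5.10e14  # m^2

    def total_system_energy(toa_wm2, dt_years=1.0):
        """TE(t) = time-integrated net TOA flux, in joules."""
        return np.cumsum(np.asarray(toa_wm2) * EARTH_AREA * dt_years * SECONDS_PER_YEAR)

    def decadal_trend(series, dt_years=1.0):
        """Least-squares linear trend per decade."""
        t = np.arange(len(series)) * dt_years
        return np.polyfit(t, series, 1)[0] * 10.0

    # synthetic control-run-like variability (no external forcing)
    rng = np.random.default_rng(0)
    toa = 0.3 * rng.standard_normal(100)               # annual-mean net TOA flux, W m^-2
    te = total_system_energy(toa)
    ohc = 0.9 * te + 1e22 * rng.standard_normal(100)   # ocean stores most of TE, plus noise
    print(np.corrcoef(te, ohc)[0, 1], decadal_trend(ohc))
    ```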

  14. Modeling long-term, large-scale sediment storage using a simple sediment budget approach

    Science.gov (United States)

    Naipal, Victoria; Reick, Christian; Van Oost, Kristof; Hoffmann, Thomas; Pongratz, Julia

    2016-05-01

    Currently, the anthropogenic perturbation of the biogeochemical cycles remains unquantified due to the poor representation of lateral fluxes of carbon and nutrients in Earth system models (ESMs). This lateral transport of carbon and nutrients between terrestrial ecosystems is strongly affected by accelerated soil erosion rates. However, the quantification of global soil erosion by rainfall and runoff, and the resulting redistribution is missing. This study aims at developing new tools and methods to estimate global soil erosion and redistribution by presenting and evaluating a new large-scale coarse-resolution sediment budget model that is compatible with ESMs. This model can simulate spatial patterns and long-term trends of soil redistribution in floodplains and on hillslopes, resulting from external forces such as climate and land use change. We applied the model to the Rhine catchment using climate and land cover data from the Max Planck Institute Earth System Model (MPI-ESM) for the last millennium (here AD 850-2005). Validation is done using observed Holocene sediment storage data and observed scaling between sediment storage and catchment area. We find that the model reproduces the spatial distribution of floodplain sediment storage and the scaling behavior for floodplains and hillslopes as found in observations. After analyzing the dependence of the scaling behavior on the main parameters of the model, we argue that the scaling is an emergent feature of the model and mainly dependent on the underlying topography. Furthermore, we find that land use change is the main contributor to the change in sediment storage in the Rhine catchment during the last millennium. Land use change also explains most of the temporal variability in sediment storage in floodplains and on hillslopes.

  15. From Dreams to Dollars: Joining the Theory of Planning with the Practicality of Budget to Maximize Both

    Science.gov (United States)

    Dorsey, Myrtle E. B.

    2008-01-01

    The integrated online planning and budget development system at Baton Rouge Community College is an innovative approach to systematically link college strategic priorities and unit plan objectives with financial resources. Using two industry standards (Microsoft Access and Sungard Banner), a user-friendly program was developed that has facilitated…

  16. Budgeting tool for Restaurant X

    OpenAIRE

    Nguyen, Uyen

    2014-01-01

    In order to improve profitability and advance a company's commitment to organize growth, detailed plans, which are called budgets, are required. A budgeting tool is a beneficial asset for a company because it makes the budget preparation process easier and faster. Thus, the aim of this thesis is to create a budgeting tool for Restaurant X. This thesis is product-orientated. There are three tasks conducted in this thesis. The first one is to cover all relevant theories about a budget. T...

  17. A numerical modelling study on regional mercury budget for eastern North America

    Directory of Open Access Journals (Sweden)

    X. Lin

    2003-01-01

    In this study, we have integrated an up-to-date physico-chemical transformation mechanism for Hg into the framework of the US EPA's CMAQ model system. In addition, the model adopted detailed calculations of the air-surface exchange of Hg to properly describe Hg re-emissions from, and dry deposition to, natural surfaces. The mechanism covers Hg in three categories: elemental Hg (Hg0), reactive gaseous Hg (RGM) and particulate Hg (HgP). With interfaces to MM5 (meteorology processor) and SMOKE (emission processor), we applied the model to a 4-week period in June/July 1995 on a domain covering most of eastern North America. Results indicate that the model simulates reasonably well the levels of total gaseous Hg (TGM) and the specific Hg wet deposition measurements made by the Mercury Deposition Network (MDN). Moreover, results from various scenario runs reveal that the Hg system behaves in a closely linear way in terms of contributions from different source categories, i.e. anthropogenic emissions, natural re-emissions and background. Analyses of the scenario results suggest that 37% of anthropogenically emitted Hg was deposited back within the model domain, with 5155 kg of anthropogenic Hg moving out of the domain during the simulation period. Overall, the domain served as a net source, which supplied about half a ton of Hg to the global background pool over the period. Our model validation and a sensitivity test further rationalized the rate constant for the gaseous oxidation of Hg0 by the hydroxyl radical OH used in the global-scale modelling study by Bergan and Rodhe (2001). A further laboratory determination of the reaction rate constant, including its temperature dependence, stands as one of the important issues critical to improving our knowledge of the budget and cycling of Hg.
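
    The last sentences above concern the Hg0 + OH rate constant. A minimal sketch of the pseudo-first-order lifetime implied by a given rate constant and OH level follows; the numerical values are illustrative placeholders, not the constants evaluated in the study.

    ```python
    def hg0_lifetime_days(k_oh, oh_concentration):
        """Pseudo-first-order lifetime of Hg0 against oxidation by OH:
        d[Hg0]/dt = -k*[OH]*[Hg0]  ->  tau = 1 / (k*[OH]).

        k_oh             -- rate constant (cm^3 molecule^-1 s^-1), illustrative value
        oh_concentration -- OH number density (molecule cm^-3), illustrative value
        """
        return 1.0 / (k_oh * oh_concentration) / 86400.0

    # illustrative: k ~ 9e-14 cm^3 molecule^-1 s^-1 and a global-mean [OH] ~ 1e6 cm^-3
    print(round(hg0_lifetime_days(9e-14, 1e6)))  # ~129 days
    ```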

  18. The AquaDEB project: Physiological flexibility of aquatic animals analysed with a generic dynamic energy budget model (phase II)

    Science.gov (United States)

    Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.

    2011-11-01

    This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European research project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this 5-year project, present the context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability within the context of DEB theory for metabolic organisation, and (ii) to evaluate the inter-relationships between different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). AquaDEB phase I focussed on quantifying bio-energetic processes of various aquatic species (e.g. molluscs, fish, crustaceans, algae), and phase II on: (i) comparing energetic and physiological strategies among species through the DEB parameter values and identifying the factors responsible for any differences in bioenergetics and physiology; (ii) considering different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) scaling up the models for a few species from the individual level up to the level of evolutionary processes. Apart from the three special issues in the Journal of Sea Research — including the DEBIB collaboration (see vol. 65, issue 2), a theme issue on DEB theory appeared in the Philosophical Transactions of the Royal Society B (vol. 365, 2010); a large number of publications were produced; the third edition of the DEB book appeared (2010); open-source software was substantially expanded (over 1000 functions); a large open-source systematic collection of ecophysiological data and DEB parameters has been set up; and a series of DEB

  19. The Nomad Model: Theory, Developments and Applications

    NARCIS (Netherlands)

    Campanella, M.; Hoogendoorn, S.P.; Daamen, W.

    2014-01-01

    This paper presents details of the developments of the Nomad model since it was introduced more than 12 years ago. The model is derived from a normative theory of pedestrian behavior, making it unique among microscopic models. Nomad has been successfully applied in several cases, indicating that it ful

  20. Carbon budget estimation of a subarctic catchment using a dynamic ecosystem model at high spatial resolution

    Science.gov (United States)

    Tang, J.; Miller, P. A.; Persson, A.; Olefeldt, D.; Pilesjo, P.; Heliasz, M.; Jackowicz-Korczynski, M.; Yang, Z.; Smith, B.; Callaghan, T. V.; Christensen, T. R.

    2015-05-01

    A large amount of organic carbon is stored in high-latitude soils. A substantial proportion of this carbon stock is vulnerable and may decompose rapidly due to temperature increases that are already greater than the global average. It is therefore crucial to quantify and understand carbon exchange between the atmosphere and subarctic/arctic ecosystems. In this paper, we combine an Arctic-enabled version of the process-based dynamic ecosystem model, LPJ-GUESS (version LPJG-WHyMe-TFM) with comprehensive observations of terrestrial and aquatic carbon fluxes to simulate long-term carbon exchange in a subarctic catchment at 50 m resolution. Integrating the observed carbon fluxes from aquatic systems with the modeled terrestrial carbon fluxes across the whole catchment, we estimate that the area is a carbon sink at present and will become an even stronger carbon sink by 2080, which is mainly a result of a projected densification of birch forest and its encroachment into tundra heath. However, the magnitudes of the modeled sinks are very dependent on future atmospheric CO2 concentrations. Furthermore, comparisons of global warming potentials between two simulations with and without CO2 increase since 1960 reveal that the increased methane emission from the peatland could double the warming effects of the whole catchment by 2080 in the absence of CO2 fertilization of the vegetation. This is the first process-based model study of the temporal evolution of a catchment-level carbon budget at high spatial resolution, including both terrestrial and aquatic carbon. Though this study also highlights some limitations in modeling subarctic ecosystem responses to climate change, such as aquatic system flux dynamics, nutrient limitation, herbivory and other disturbances, and peatland expansion, our study provides one process-based approach to resolve the complexity of carbon cycling in subarctic ecosystems while simultaneously pointing out the key model developments for capturing

  1. Simulation of arctic surface radiation and energy budget during the summertime using the single-column model

    Institute of Scientific and Technical Information of China (English)

    LI Xiang; WANG Hui; ZHANG Zhanhai; WU Huiding

    2008-01-01

    The Surface Heat Budget of the Arctic Ocean (SHEBA) project has shown that the study of surface heat budget characteristics is crucial to understanding the interface processes and environmental change in the polar region. An arctic single-column model (ARCSCM) of Colorado University is used to simulate the arctic surface radiation and energy budget during the summertime. The simulation results are analyzed and compared with the SHEBA measurements. Sensitivity analyses are performed to test microphysical and radiative parameterizations in this model. The results show that the ARCSCM model is able to simulate the surface radiation and energy budget in the Arctic during the summertime, and that the different parameterizations have a significant influence on the results. The combination of the cloud microphysics and RRTM parameterizations reproduces the surface solar shortwave radiation and downwelling longwave radiation fluxes fairly well, but this cloud microphysics parameterization scheme deviates notably in the simulation of surface sensible and latent heat fluxes. Further improvement of the parameterization schemes applied to the Arctic regions is necessary.

  2. Integrable Models, SUSY Gauge Theories, and String Theory

    CERN Document Server

    Nam, S

    1996-01-01

    We consider the close relation between duality in N=2 SUSY gauge theories and integrable models. Various integrable models ranging from Toda lattices, Calogero models, spinning tops, and spin chains are related to the quantum moduli space of vacua of N=2 SUSY gauge theories. In particular, SU(3) gauge theories with two flavors of massless quarks in the fundamental representation can be related to the spectral curve of the Goryachev-Chaplygin top, which is a Nahm's equation in disguise. This can be generalized to the cases with massive quarks, and N_f = 0,1,2, where a system with seven dimensional phase space has the relevant hyperelliptic curve appear in the Painlevé test. To understand the stringy origin of the integrability of these theories we obtain the exact nonperturbative point particle limit of type II string compactified on a Calabi-Yau manifold, which gives the hyperelliptic curve of SU(2) QCD with N_f = 1 hypermultiplet.

  3. Evaluation of water and energy budgets in regional climate models applied over Europe

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, S.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Machenhauer, B.; Christensen, O.B. [Danish Meteorological Institute, Climate Research Division, Copenhagen Oe (Denmark); Jones, R. [Meteorological Office Hadley Centre, Bracknell (United Kingdom); Deque, M. [Meteo-France CNRM/GMGEC/EAC, Toulouse Cedex 01 (France); Vidale, P.L. [Climate Research ETH, Zuerich (Switzerland)

    2004-10-01

    This study presents a model intercomparison of four regional climate models (RCMs) and one variable resolution atmospheric general circulation model (AGCM) applied over Europe with special focus on the hydrological cycle and the surface energy budget. The models simulated the 15 years from 1979 to 1993 by using quasi-observed boundary conditions derived from ECMWF re-analyses (ERA). The model intercomparison focuses on two large catchments representing two different climate conditions covering two areas of major research interest within Europe. The first is the Danube catchment which represents a continental climate dominated by advection from the surrounding land areas. It is used to analyse the common model error of a too dry and too warm simulation of the summertime climate of southeastern Europe. This summer warming and drying problem is seen in many RCMs, and to a lesser extent in GCMs. The second area is the Baltic Sea catchment which represents a maritime climate dominated by advection from the ocean and from the Baltic Sea. This catchment is a research area of many studies within Europe and also covered by the BALTEX program. The observed data used are monthly mean surface air temperature, precipitation and river discharge. For all models, these are used to estimate mean monthly biases of all components of the hydrological cycle over land. In addition, the mean monthly deviations of the surface energy fluxes from ERA data are computed. Atmospheric moisture fluxes from ERA are compared with those of one model to provide an independent estimate of the convergence bias derived from the observed data. These help to add weight to some of the inferred estimates and explain some of the discrepancies between them. An evaluation of these biases and deviations suggests possible sources of error in each of the models. For the Danube catchment, systematic errors in the dynamics cause the prominent summer drying problem for three of the RCMs, while for the fourth RCM this is

  4. Forecasting Rainfall Induced Landslide using High Resolution DEM and Simple Water Budget Model

    Science.gov (United States)

    Luzon, P. K. D.; Lagmay, A. M. F. A.

    2014-12-01

    The Philippines is hit by an average of 20 typhoons per year, bringing large amounts of rainfall. Monsoon rain coming from the southwest of the country also contributes to the annual total rainfall and causes various hazards. One such hazard is shallow landsliding, mainly triggered by high soil saturation due to continuous downpours that can last from hours to days. A recent event of this kind happened in Zambales province in September 2013, where torrential rain fell for 24 hours, amounting to half a month's worth of rain. The rainfall intensity measured by the nearest weather station averaged 21 mm/h from 10 pm of the 22nd until 10 am the following day. The monsoon rains were intensified by the presence of Typhoon Usagi, positioned north of and heading northwest of the country. A number of landslides due to this event happened in three different municipalities: Subic, San Marcelino and Castillejos. The disaster took 30 lives in the province. Monitoring such areas across the entire country is a major challenge in all aspects of disaster preparedness and management. The approach of this paper is to utilize available forecasts of rainfall amount to monitor highly hazardous areas during the rainy season and to forecast possible landslides. A simple water budget model following the equation Perc_t = P_t - RO_t - ΔST_t - AET_t (where the terms are percolation, precipitation, runoff, change in storage, and actual evapotranspiration) was implemented to quantify all the water budget components. Computations are done in a Python-scripted grid system utilizing widely used GIS formats for easy transfer of data and faster calculation. Results of successive runs use percolation and change in water storage as indicators of possible landslides. This approach needs three primary sets of data: weather data, topographic data, and soil parameters. This research uses a 5 m resolution DEM (IfSAR) to define the topography. Soil parameters are from fieldwork conducted. Weather data are from the Philippine
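
    As an illustration of this kind of bookkeeping, the sketch below steps the stated water balance, Perc_t = P_t - RO_t - ΔST_t - AET_t, through one time interval. It is a minimal reading of the equation, not the authors' Python/GIS grid implementation; the runoff coefficient, storage capacity and function names are illustrative assumptions.

        # Minimal single-cell water-budget step (illustrative parameters, not the paper's model).
        def water_budget_step(precip_mm, aet_mm, storage_mm, storage_capacity_mm, runoff_coeff=0.3):
            """Return (percolation, runoff, new_storage) in mm for one time step."""
            runoff = runoff_coeff * precip_mm                      # RO_t: assumed simple runoff fraction
            available = precip_mm - runoff - aet_mm                # water left after runoff and AET_t
            new_storage = min(storage_capacity_mm, max(0.0, storage_mm + available))
            delta_storage = new_storage - storage_mm               # ΔST_t
            percolation = max(0.0, available - delta_storage)      # Perc_t, used as a landslide indicator
            return percolation, runoff, new_storage

        # Example: 24 h of rain at ~21 mm/h falling on nearly saturated soil.
        perc, ro, st = water_budget_step(precip_mm=21 * 24, aet_mm=4.0,
                                         storage_mm=180.0, storage_capacity_mm=200.0)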

  5. An approach to improve precipitation estimation to model the water budget in Alpine catchments

    Science.gov (United States)

    Mair, E.; Bertoldi, G.; Della Chiesa, S.; Niedrist, G.; Egarter Vigl, L.; Tappeiner, U.

    2012-04-01

    Accurate quantification of precipitation is still one of the major sources of uncertainty in quantifying the water budget of Alpine catchments. Despite increasing data availability, most stations are located at the bottom of the valleys, while at high elevations rain gauge accuracy is limited by snow and wind, leading to strong underestimation of the total precipitation. Similar problems exist for snow measurement devices. In this contribution we present a novel empirical approach to improve precipitation estimation using rain gauge data, snow height and standard meteorological observations, and we evaluate the improvements in estimating the water budget of the Mazia Valley (100 km², Central Alps, South Tyrol, Italy). Due to the screening effect of the surrounding mountains (mostly glaciated, maximum elevation 3750 m a.s.l.), this valley has a relatively dry, cold continental climate with strong precipitation gradients. In the framework of the projects "Klimawandel" and "HydroAlp", 17 monitoring stations were installed to measure standard micrometeorological variables, vegetation properties and soil moisture. For a correct climate analysis, a distinction between snow and rainfall is necessary. Due to energy limitations in remote alpine areas, no heated rain gauges were installed. However, four stations are equipped with snow height sensors, from which snow data can be retrieved. For the other stations the calculation of the snow water equivalent was more complicated because of the lack of snow height sensors. In the empirical approach, the snow height change for every registered precipitation record was reviewed and compared to air temperature and relative humidity, as well as to the calculated wet-bulb temperature, in order to distinguish between rainfall and snowfall events. The global solar radiation was also checked to identify meltwater production from snow accumulated on top of the unheated rain gauges. With a formula
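
    A schematic version of such a rain/snow separation is sketched below. It combines a snow-height check with a wet-bulb temperature threshold; the Stull (2011) approximation for the wet-bulb temperature and the 1.0 °C threshold are assumptions for illustration, not the calibrated procedure of the authors.

        import math

        def wet_bulb_stull(t_air_c, rh_pct):
            """Approximate wet-bulb temperature (°C) from air temperature (°C) and relative
            humidity (%), using Stull's (2011) empirical fit; adequate for a screening example."""
            return (t_air_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
                    + math.atan(t_air_c + rh_pct) - math.atan(rh_pct - 1.676331)
                    + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
                    - 4.686035)

        def precipitation_phase(t_air_c, rh_pct, snow_height_change_m, tw_threshold_c=1.0):
            """Classify a precipitation record as 'snow' or 'rain' (threshold is an assumed value)."""
            if snow_height_change_m > 0.0 or wet_bulb_stull(t_air_c, rh_pct) <= tw_threshold_c:
                return "snow"
            return "rain"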

  6. Measurements of hydroxyl and hydroperoxy radicals during CalNex-LA: Model comparisons and radical budgets

    Science.gov (United States)

    Griffith, S. M.; Hansen, R. F.; Dusanter, S.; Michoud, V.; Gilman, J. B.; Kuster, W. C.; Veres, P. R.; Graus, M.; Gouw, J. A.; Roberts, J.; Young, C.; Washenfelder, R.; Brown, S. S.; Thalman, R.; Waxman, E.; Volkamer, R.; Tsai, C.; Stutz, J.; Flynn, J. H.; Grossberg, N.; Lefer, B.; Alvarez, S. L.; Rappenglueck, B.; Mielke, L. H.; Osthoff, H. D.; Stevens, P. S.

    2016-04-01

    Measurements of hydroxyl (OH) and hydroperoxy (HO2*) radical concentrations were made at the Pasadena ground site during the CalNex-LA 2010 campaign using the laser-induced fluorescence-fluorescence assay by gas expansion technique. The measured concentrations of OH and HO2* exhibited a distinct weekend effect, with higher radical concentrations observed on the weekends corresponding to lower levels of nitrogen oxides (NOx). The radical measurements were compared to results from a zero-dimensional model using the Regional Atmospheric Chemical Mechanism-2 constrained by NOx and other measured trace gases. The chemical model overpredicted measured OH concentrations during the weekends by a factor of approximately 1.4 ± 0.3 (1σ), while the agreement was better during the weekdays (ratio of 1.0 ± 0.2). The model underpredicted measured HO2* concentrations by a factor of 1.3 ± 0.2 on the weekends and by a factor of 3.0 ± 0.5 on weekdays. However, increasing the modeled OH reactivity to match the measured total OH reactivity improved the overall agreement for both OH and HO2* on all days. A radical budget analysis suggests that photolysis of carbonyls and formaldehyde together accounted for approximately 40% of radical initiation, with photolysis of nitrous acid accounting for 30% at the measurement height and ozone photolysis contributing less than 20%. An analysis of the ozone production sensitivity reveals that, during the campaign, ozone production was limited by volatile organic compounds throughout the day on weekdays but was NOx-limited during weekend afternoons.

  7. Calibration, Sensor Model Improvements and Uncertainty Budget of the Airborne Imaging Spectrometer APEX

    Science.gov (United States)

    Hueni, A.

    2015-12-01

    ESA's Airborne Imaging Spectrometer APEX (Airborne Prism Experiment) was developed under the PRODEX (PROgramme de Développement d'EXpériences scientifiques) program by a Swiss-Belgian consortium and entered its operational phase at the end of 2010 (Schaepman et al., 2015). Work on the sensor model has been carried out extensively within the framework of the European Metrology Research Programme, as part of the Metrology for Earth Observation and Climate projects (MetEOC and MetEOC2). The focus has been to improve laboratory calibration procedures in order to reduce uncertainties, to establish a laboratory uncertainty budget and to upgrade the sensor model to compensate for sensor-specific biases. The updated sensor model relies largely on data collected during dedicated characterisation experiments in the APEX calibration home base, but includes airborne data as well where the simulation of environmental conditions in the given laboratory setup was not feasible. The additions to the model deal with artefacts caused by environmental changes and electronic features, namely the impact of ambient air pressure changes on the radiometry in combination with dichroic coatings, the influences of external air temperatures and consequently instrument baffle temperatures on the radiometry, and electronic anomalies causing radiometric errors in the four shortwave infrared detector readout blocks. Many of these resolved issues might be expected to be present in other imaging spectrometers to some degree or in some variation. Consequently, the work clearly shows the difficulties of extending a laboratory-based uncertainty to data collected under in-flight conditions. The results are hence not only of interest to the calibration scientist but also to the spectroscopy end user, in particular when commercial sensor systems are used for data collection and relevant sensor characteristic information tends to be sparse. Schaepman et al., 2015. Advanced radiometry measurements and Earth science

  8. Grey-theory based intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi

    2006-01-01

    To address the problem that current intrusion detection models need large-scale data for model formulation in real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of requiring less original data, imposing fewer restrictions on the distribution pattern, and using a simpler modeling algorithm. With these merits, GTIDS constructs its model from a partial time sequence for rapid detection of intrusive acts in a secure system. In this detection model, the rates of false drops and false retrievals are effectively reduced through modeling twice and repeated detection on the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through emulation experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
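
    For readers unfamiliar with grey modeling, the sketch below fits the standard GM(1,1) grey model to a short series and produces a one-step forecast. This is the textbook grey model that needs only a handful of data points; it is not the specific detection algorithm implemented in GTIDS.

        import numpy as np

        def gm11_forecast(x0, n_ahead=1):
            """Fit a GM(1,1) grey model to a short positive series x0 and forecast n_ahead values."""
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                                    # accumulated generating operation
            z1 = 0.5 * (x1[1:] + x1[:-1])                         # background (mean) sequence
            B = np.column_stack((-z1, np.ones_like(z1)))
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]      # develop coefficient a, grey input b
            k = np.arange(1, len(x0) + n_ahead)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a     # solution of the whitened equation
            x0_hat = np.diff(np.concatenate(([x0[0]], x1_hat)))   # restore by first-order differencing
            return x0_hat[-n_ahead:]

        # Example: one-step forecast of a small, slowly growing traffic-volume series.
        print(gm11_forecast([120, 131, 145, 160, 178], n_ahead=1))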

  9. An information theory approach for evaluating earth radiation budget (ERB) measurements - Nonuniform sampling of diurnal longwave flux variations

    Science.gov (United States)

    Halyo, Nesim; Direskeneli, Haldun; Barkstrom, Bruce R.

    1991-01-01

    Satellite measurements are subject to a wide range of uncertainties due to their temporal, spatial, and directional sampling characteristics. An information-theory approach is suggested to examine the nonuniform temporal sampling of ERB measurements. The information (i.e., its entropy or uncertainty) before and after the measurements is determined, and information gain (IG) is defined as a reduction in the uncertainties involved. A stochastic model for the diurnal outgoing flux variations that affect the ERB is developed. Using Gaussian distributions for the a priori and measured radiant exitance fields, the IG is obtained by computing the a posteriori covariance. The IG for the monthly outgoing flux measurements is examined for different orbital parameters and orbital tracks, using the Earth Observing System orbital parameters as specific examples. Variations in IG due to changes in the orbit's inclination angle and the initial ascending node local time are investigated.
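
    For Gaussian prior and posterior fields, the information gain described above has a simple closed form, namely half the log ratio of the covariance determinants; the sketch below illustrates this with placeholder covariance matrices.

        import numpy as np

        def gaussian_information_gain_bits(cov_prior, cov_post):
            """Entropy reduction (bits) when a Gaussian prior is updated to a Gaussian posterior:
            IG = 0.5 * log2(det(cov_prior) / det(cov_post))."""
            sign_prior, logdet_prior = np.linalg.slogdet(cov_prior)
            sign_post, logdet_post = np.linalg.slogdet(cov_post)
            if sign_prior <= 0 or sign_post <= 0:
                raise ValueError("covariance matrices must be positive definite")
            return 0.5 * (logdet_prior - logdet_post) / np.log(2.0)

        # Example: a measurement that halves the variance of two independent flux components
        # yields one bit of information in total.
        print(gaussian_information_gain_bits(np.diag([4.0, 4.0]), np.diag([2.0, 2.0])))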

  10. Metabolic programming of zebra fish Danio rerio uncovered. Physiological performance as explained by dynamic energy budget theory and life-cycle consequence of uranium induced perturbations

    International Nuclear Information System (INIS)

    The aim of this dissertation is to characterize the toxicity of uranium on the metabolism of zebra fish, Danio rerio. The first three chapters of this manuscript are dedicated to characterizing the blank metabolism of zebra fish. I used the Dynamic Energy Budget (DEB) theory for this characterisation; it is presently the only theory that covers the full life cycle of the organism and quantifies feeding, assimilation, growth, reproduction, maturation, maintenance and ageing. Any metabolic effect of uranium should appear as effects on one or more of these fundamental processes. Since the life span of zebra fish is some four and a half years, and larger individuals respond more slowly to chemical stress, the focus was on the early life stages. Considerable breakthroughs in the quantification of zebra fish development, growth and reproduction have been made. It turned out that the zebra fish accelerates its metabolism from birth until metamorphosis, when the acceleration ceases. This process is seen in some, but not all, species of fish. Another striking conclusion was that somatic maintenance was much higher than is typical for fish. We do not yet have an explanation for this finding. Further, it turned out that the details of reproduction matter: allocation to reproduction (in adults) accumulates in a reproduction buffer, and this buffer is used to prepare batches of eggs. We needed to detail this preparation process to understand how zebra fish can eliminate uranium via eggs. DEB theory specifies that a particular developmental stage (birth, metamorphosis, puberty) is reached at specified levels of maturity. For different temperatures and food levels, that can occur at different ages and body sizes. We extended this idea to include all the morphologically defined developmental stages of the zebra fish described in the literature; the observed variations in ages and body sizes can now be explained by DEB theory. To test if DEB theory can also explain perturbations of maturation, we

  11. Graphical Model Theory for Wireless Sensor Networks

    Energy Technology Data Exchange (ETDEWEB)

    Davis, William B.

    2002-12-08

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.

  12. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm

  13. F-theory and linear sigma models

    CERN Document Server

    Bershadsky, M; Greene, Brian R; Johansen, A; Lazaroiu, C I

    1998-01-01

    We present an explicit method for translating between the linear sigma model and the spectral cover description of SU(r) stable bundles over an elliptically fibered Calabi-Yau manifold. We use this to investigate the 4-dimensional duality between (0,2) heterotic and F-theory compactifications. We indirectly find that much interesting heterotic information must be contained in the `spectral bundle' and in its dual description as a gauge theory on multiple F-theory 7-branes. A by-product of these efforts is a method for analyzing semistability and the splitting type of vector bundles over an elliptic curve given as the sheaf cohomology of a monad.

  14. Integrable Lattice Models From Gauge Theory

    CERN Document Server

    Witten, Edward

    2016-01-01

    These notes provide an introduction to recent work by Kevin Costello in which integrable lattice models of classical statistical mechanics in two dimensions are understood in terms of quantum gauge theory in four dimensions. This construction will be compared to the more familiar relationship between quantum knot invariants in three dimensions and Chern-Simons gauge theory. (Based on a Whittaker Colloquium at the University of Edinburgh and a lecture at Strings 2016 in Beijing.)

  15. Participatory Budgeting

    OpenAIRE

    Innovation for Development and South-South Cooperation, IDEASS

    2007-01-01

    This book provides an overview of the principles underlying participatory budgeting. It analyzes the merits and demerits of participatory budgeting practices around the world with a view to guiding policy makers and practitioners on improving such practices in the interest of inclusive governance. This publication includes five regional surveys, and seven country case studies can be found ...

  16. Budget timetable

    Science.gov (United States)

    This is a timetable for congressional action under the Balanced Budget and Emergency Deficit Control Act of 1985 (Gramm-Rudman-Hollings). These deadlines apply to fiscal years (FY) 1987-1991. The Congress missed a number of these deadlines last year. The deficit reduction measures in Gramm-Rudman-Hollings would lead to a balanced budget in 1991.

  17. System Budgets

    DEFF Research Database (Denmark)

    Jeppesen, Palle

    1996-01-01

    The lecture note is aimed at introducing system budgets for optical communication systems. It treats optical fiber communication systems (six generations), system design, bandwidth effects, other system impairments and optical amplifiers.

  18. Security Theorems via Model Theory

    Directory of Open Access Journals (Sweden)

    Joshua Guttman

    2009-11-01

    A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications ∀xs. (φ ⟹ ∃ys. ψ). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If φ is the antecedent of a security goal, then there is a skeleton A_φ such that, for every skeleton B, φ is satisfied in B iff there is a homomorphism from A_φ to B. (2) A protocol enforces ∀xs. (φ ⟹ ∃ys. ψ) iff every realized homomorphic image of A_φ satisfies ψ. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes", that are homomorphic images of A_φ. If ψ holds in each of these shapes, then the goal holds.

  19. Regional differences in the surface energy budget over China: an evaluation of a selection of CMIP5 models

    Science.gov (United States)

    Zhou, Lian-Tong; Du, Zhencai

    2016-04-01

    The present study provides an evaluation of the regional differences over China in surface energy budget components as simulated by a selection of models from phase five of the Coupled Model Intercomparison Project (CMIP5), covering the period 1960-2005. Similarities and differences exist among the models in terms of both spatial and magnitude patterns. For climatology, the CMIP5 models show quite different spatial distributions of shortwave radiation and sensible heat flux. In terms of seasonal variation, the surface energy budgets are remarkably different between western and eastern China. The discrepancies in the seasonal variation of sensible heat flux are mainly attributable to temperature differences and wind speed, while those of shortwave radiation are caused by the seasonal variation in total cloud cover. Cloudiness is one of the most crucial parameters in estimating the surface energy budget. In addition, the study also reveals that the magnitudes of the various components show larger (more than two-fold) differences between western and eastern parts of China, especially in net longwave and upward shortwave radiation, as well as latent and sensible heat fluxes. The results for surface soil heat flux show that there is more incoming energy during spring and summer and more outgoing energy during fall and winter in both western and eastern China. Furthermore, compared to NCEP2 data, the ERA-40 reanalysis product produces results more similar to the multi-model ensemble mean for most components.

  20. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  1. Reconstructing bidimensional scalar field theory models

    Energy Technology Data Exchange (ETDEWEB)

    Flores, Gabriel H.; Svaiter, N.F. [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil)]. E-mail: gflores@cbpf.br; nfuxsvai@cbpf.br

    2001-07-01

    In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtained two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)

  2. Algebraic model theory for languages without equality

    OpenAIRE

    Elgueta Montó, Raimon

    1994-01-01

    In our opinion, it is fair to distinguish two separate branches in the origins of model theory. The first one, the model theory of first-order logic, can be traced back to the pioneering work of L. Löwenheim, T. Skolem, K. Gödel, A. Tarski and A. I. Mal'cev, published before the mid 30's. This branch was put forward during the 40's and 50's by several authors, including A. Tarski, L. Henkin, A. Robinson and J. Łoś. Their contribution, however, was rather influenced by modern algebra, a disciplin...

  3. Quantum field theory and the standard model

    CERN Document Server

    Schwartz, Matthew D

    2014-01-01

    Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, with direct connections to experimental data, and mathematical rigor make the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...

  4. On the algebraic theory of kink sectors: Application to quantum field theory models and collision theory

    International Nuclear Information System (INIS)

    Several two-dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate different vacua. Familiar quantum field theory models, for which the existence of kink states has been proven, are the Sine-Gordon and the φ⁴₂ model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: which are sufficient conditions a pair of vacuum states has to fulfill, such that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)₂ models. We identify a large class of vacuum states, including the vacua of the P(φ)₂ models, the Yukawa₂-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)

  5. Use of a process-based model for assessing the methane budgets of global terrestrial ecosystems and evaluation of uncertainty

    Directory of Open Access Journals (Sweden)

    A. Ito

    2012-02-01

    We assessed the global terrestrial budget of methane (CH4) by using a process-based biogeochemical model (VISIT) and inventory data for components of the budget that were not included in the model. Emissions from wetlands, paddy fields, biomass burning, and plants, as well as oxidative consumption by upland soils, were simulated by the model. Emissions from ruminant livestock and termites were evaluated by using an inventory approach. These CH4 flows were estimated for each of the model's 0.5° × 0.5° grid cells from 1901 to 2009, while accounting for atmospheric composition, meteorological factors, and land-use changes. Estimation uncertainties were examined through ensemble simulations using different parameterization schemes and input data (e.g., different wetland maps and emission factors). From 1996 to 2005, the average global terrestrial CH4 budget was estimated on the basis of 1152 simulations, and terrestrial ecosystems were found to be a net source of 308.3 ± 20.7 Tg CH4 yr−1. Wetland and livestock ruminant emissions were the primary sources. The results of our simulations indicate that sources and sinks are distributed highly heterogeneously over the Earth's land surface. Seasonal and interannual variability in the terrestrial budget was also assessed. The trend of increasing net emission from terrestrial sources and its relationship with temperature variability imply that terrestrial CH4 feedbacks will play an increasingly important role as a result of future climatic change.
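
    The bookkeeping behind such a budget is a signed sum of component fluxes, repeated over the ensemble members; the sketch below shows the idea with placeholder numbers (they are not the component estimates of this study).

        import numpy as np

        # Entries of each array: ensemble members; values in Tg CH4 per year (placeholders).
        # Sources are positive; the upland-soil sink is negative.
        components = {
            "wetlands":        np.array([180.0, 195.0, 170.0]),
            "rice_paddies":    np.array([35.0, 38.0, 33.0]),
            "biomass_burning": np.array([20.0, 22.0, 18.0]),
            "plants":          np.array([15.0, 10.0, 20.0]),
            "ruminants":       np.array([90.0, 90.0, 90.0]),   # inventory term, identical across members
            "termites":        np.array([11.0, 11.0, 11.0]),   # inventory term
            "soil_uptake":     np.array([-30.0, -28.0, -33.0]),
        }

        net = sum(components.values())   # net terrestrial CH4 source per ensemble member
        print(f"net source: {net.mean():.1f} +/- {net.std(ddof=1):.1f} Tg CH4 per year")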

  6. Processo Orçamentário: uma aplicação da análise substantiva com utilização da grounded theory [Budgeting: substantive analysis using grounded theory]

    Directory of Open Access Journals (Sweden)

    Tânia Regina Sordi Relvas

    2011-09-01

    Considering the fact that studies into budgeting basically use a reductionist approach, this paper proposes a comprehensive substantive theory based on empirical data to be used in budget analysis. This approach takes into consideration the budget's constituent elements and their interdependence by applying an inductive approach based on empirical data (grounded theory) under a qualitative paradigm. The focus was an in-depth two-year study of a large Brazilian financial institution involving several management levels. The main contribution of the study is a framework that treats all elements of the budget process in a comprehensive and coherent fashion, which would be impossible using a reductionist approach. As products of the substantive theory, five propositions were developed to be applied in organizations.

  7. Improved predictive ability of climate-human-behaviour interactions with modifications to the COMFA outdoor energy budget model

    Science.gov (United States)

    Vanos, J. K.; Warland, J. S.; Gillespie, T. J.; Kenny, N. A.

    2012-11-01

    The purpose of this paper is to implement current and novel research techniques in human energy budget estimation to allow more accurate and efficient application of such models by a variety of users. Using the COMFA model, the conditioning level of an individual is incorporated into overall energy budget predictions, giving more realistic estimates of the metabolism experienced at various fitness levels. Through the use of VO2 reserve estimates, errors are found when an elite athlete is modelled as an unconditioned or a conditioned individual, giving budgets underpredicted significantly, by -173 and -123 W m-2 respectively. Such underprediction can result in critical errors regarding heat stress, particularly in highly motivated individuals; this revision is therefore critical for athletic individuals. A further improvement to the COMFA model involves improved adaptation of clothing insulation (I_cl), as well as clothing non-uniformity, with changing air temperature (T_a) and metabolic activity (M_act). Equivalent T_a values (for I_cl estimation) are calculated in order to lower the I_cl value with increasing M_act at equal T_a. Furthermore, threshold T_a values are calculated to predict the point at which an individual will change from a uniform I_cl to a segmented I_cl (full ensemble to shorts and a T-shirt). Lastly, improved relative velocity (v_r) estimates were obtained with a refined equation accounting for the angle of the wind to the body's movement. Differences between the original and improved v_r equations increased with higher wind and activity speeds, and as the wind-to-body angle moved away from 90°. Under moderate microclimate conditions and wind from behind a person, the convective heat loss and skin temperature estimates were 47 W m-2 and 1.7°C higher when using the improved v_r equation. These model revisions improve the applicability and usability of the COMFA energy budget model for subjects performing physical activity in outdoor environments.
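
    The relative air speed v_r experienced by a moving person follows from plain vector arithmetic; the sketch below uses the law of cosines with the angle between the wind vector and the direction of travel. It is a generic formulation for illustration, not necessarily the exact refined equation of the paper.

        import math

        def relative_air_speed(wind_speed, walk_speed, angle_deg):
            """Magnitude of the air velocity relative to a moving person (m/s).
            angle_deg is the angle between the wind vector (direction the air moves towards)
            and the direction of travel; 0° is a tailwind, 180° a headwind."""
            theta = math.radians(angle_deg)
            return math.sqrt(wind_speed**2 + walk_speed**2
                             - 2.0 * wind_speed * walk_speed * math.cos(theta))

        # Example: walking at 1.5 m/s in a 3 m/s wind.
        print(relative_air_speed(3.0, 1.5, 180.0))  # headwind -> 4.5 m/s
        print(relative_air_speed(3.0, 1.5, 0.0))    # tailwind -> 1.5 m/s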

  8. Verification and calibration of Energy- and Flux-Budget (EFB) turbulence closure model through large eddy simulations and direct numerical simulations

    Science.gov (United States)

    Kadantsev, Evgeny; Fortelius, Carl; Druzhinin, Oleg; Mortikov, Evgeny; Glazunov, Andrey; Zilitinkevich, Sergej

    2016-04-01

    We examine and validate the EFB turbulence closure model (Zilitinkevich et al., 2013), which is based on the budget equations for basic second moments, namely two energies, the turbulent kinetic energy EK and the turbulent potential energy EP, and the vertical turbulent fluxes of momentum and potential temperature, τi (i = 1, 2) and Fz. Instead of the traditional postulation of down-gradient turbulent transport, the EFB closure determines the eddy viscosity and eddy conductivity from the steady-state version of the budget equations for τi and Fz. Furthermore, the EFB closure involves a new prognostic equation for the turbulent dissipation time scale tT, and extends the theory to non-steady turbulence regimes, accounting for non-gradient and non-local turbulent transports (when the traditional concepts of eddy viscosity and eddy conductivity become generally inconsistent). Our special interest is in the asymptotic behavior of the EFB closure in strongly stable stratification. For this purpose, we consider plane Couette flow, namely the flow between two infinite parallel plates, one of which is moving relative to the other. We use a set of Direct Numerical Simulation (DNS) experiments at the highest possible Reynolds numbers for different bulk Richardson numbers (Druzhinin et al., 2015). To demonstrate potential improvements in Numerical Weather Prediction models, we test the new closure model in various idealized cases, varying stratification from neutral and conventionally neutral to stable (GABLS1), running a test RANS model and the HARMONIE/AROME model in single-column mode. Results are compared with DNS and LES (Large Eddy Simulation) runs and with different numerical weather prediction models.

  9. Using chemical organization theory for model checking

    OpenAIRE

    Kaleta, Christoph; Richter, Stephan; Dittrich, Peter

    2009-01-01

    Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients re...

  10. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  11. Engaging Theories and Models to Inform Practice

    Science.gov (United States)

    Kraus, Amanda

    2012-01-01

    Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…

  12. Open Budget

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Government initiatives to publicize budgetary information allow for greater public supervision In an unprecedented move,four ministries under the Central Government recently posted their 2010 budgets on their official websites.

  13. Confronting the WRF and RAMS mesoscale models with innovative observations in the Netherlands: Evaluating the boundary layer heat budget

    Science.gov (United States)

    Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.

    2011-12-01

    The Weather Research and Forecasting model (WRF) and the Regional Atmospheric Mesoscale Model System (RAMS) are frequently used for (regional) weather, climate and air quality studies. This paper covers an evaluation of these models for a windy and a calm episode against Cabauw tower observations (Netherlands), with a special focus on the representation of the physical processes in the atmospheric boundary layer (ABL). In addition, area-averaged sensible heat flux observations by scintillometry are utilized, which enables evaluation of grid-scale model fluxes and flux observations at the same horizontal scale. Also, novel ABL height observations by ceilometry and of the near-surface longwave radiation divergence are utilized. It appears that WRF in its basic set-up shows satisfactory model results for nearly all atmospheric near-surface variables compared to field observations, while RAMS needed refinement of its ABL scheme. An important inconsistency was found regarding the ABL daytime heat budget: both model versions are only able to correctly forecast the ABL thermodynamic structure when the modeled surface sensible heat flux is much larger than both the eddy-covariance and scintillometer observations indicate. In order to clarify this discrepancy, the model results for each term of the heat budget equation are evaluated against field observations. Sensitivity studies and evaluation of radiative tendencies and entrainment reveal that possible errors in these variables cannot explain the overestimation of the sensible heat flux within the current model infrastructure.
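
    A minimal slab-model version of the daytime ABL heat budget referred to above is sketched below: the mixed layer warms through the surface sensible heat flux plus an entrainment flux, with optional advective and radiative tendencies. The fixed air density, the 0.2 entrainment ratio and the function names are assumptions for illustration only.

        RHO_AIR = 1.2     # air density, kg m-3 (assumed constant)
        CP_AIR = 1005.0   # specific heat of air, J kg-1 K-1

        def mixed_layer_warming_rate(h_sensible_wm2, abl_depth_m, entrainment_ratio=0.2,
                                     advection_k_per_s=0.0, radiation_k_per_s=0.0):
            """Warming rate of a well-mixed convective boundary layer (K/s)."""
            surface_flux = h_sensible_wm2 / (RHO_AIR * CP_AIR)     # kinematic heat flux, K m/s
            entrainment_flux = entrainment_ratio * surface_flux    # assumed fraction of surface flux
            return (surface_flux + entrainment_flux) / abl_depth_m + advection_k_per_s + radiation_k_per_s

        # Example: 150 W m-2 surface flux into a 1000 m deep ABL, expressed in K per hour.
        print(3600.0 * mixed_layer_warming_rate(150.0, 1000.0))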

  14. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro-Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new physical

  15. A comprehensive theory-based transport model

    International Nuclear Information System (INIS)

    Full text: A new theory based transport model with comprehensive physics (trapping, general toroidal geometry, finite beta, collisions) has been developed. The core of the model is the new trapped-gyro- Landau-fluid (TGLF) equations which provide a fast and accurate approximation to the linear eigenmodes for gyrokinetic drift-wave instabilities (trapped ion and electron modes, ion and electron temperature gradient modes and kinetic ballooning modes). This new TGLF transport model removes the limitation of its predecessor GLF23 and is valid for the same conditions as the gyrokinetic equations. A theory-based philosophy is used in the model construction. The closure coefficients of the TGLF equations are fit to local kinetic theory to give accurate linear eigenmodes. The saturation model is fit to non-linear turbulence simulations. No fitting to experiment is done so applying the model to experiments is a true test of the theory it is approximating. The TGLF model unifies trapped and passing particles in a single set of gyrofluid equations. A model for the averaging of the Landau resonance by the trapped particles makes the equations work seamlessly over the whole drift-wave wavenumber range from trapped ion modes to electron temperature gradient modes. A fast eigenmode solution method enables unrestricted magnetic geometry. Electron-ion collisions and full electromagnetic fluctuations round out the physics. The linear eigenmodes have been benchmarked against comprehensive physics gyrokinetic calculations over a large range of plasma parameters. The deviation between the gyrokinetic and TGLF linear growth rates averages 11.4% in shifted circle geometry. The transport model uses the TGLF eigenmodes to compute quasilinear fluxes of energy and particles. A model for the saturated amplitude of the turbulence completes the calculation. The saturation model is constructed to fit a large set of nonlinear gyrokinetic turbulence simulations. The TGLF model is valid in new

  16. Crack propagation modeling using Peridynamic theory

    Science.gov (United States)

    Hafezi, M. H.; Alebrahim, R.; Kundu, T.

    2016-04-01

    Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory based analysis tool is its unifying approach to material behavior modeling, irrespective of whether a crack has formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum mechanics based numerical tools (e.g. FEM, XFEM, etc.) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously and not necessarily along a prescribed path. However, in some special situations, such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state, rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling, the bond is simply broken when the failure criterion is satisfied. This simulation work helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft, etc.). It also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement the peridynamic theory based modeling tool for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
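
    In the bond-based variant of the theory, brittle damage is introduced exactly as described: a bond is broken irreversibly once its stretch exceeds a critical value. A schematic two-dimensional sketch of that bookkeeping is given below; the critical stretch of 0.01 and the array layout are illustrative assumptions, not the authors' code.

        import numpy as np

        def bond_stretches(ref_coords, cur_coords, bond_pairs):
            """Stretch s = (|xi + eta| - |xi|) / |xi| for each bond (i, j) of a bond-based
            peridynamic grid; ref_coords and cur_coords are (n, 2) arrays of node positions,
            bond_pairs is an (m, 2) integer array of node indices."""
            i, j = bond_pairs[:, 0], bond_pairs[:, 1]
            xi = ref_coords[j] - ref_coords[i]             # reference bond vectors
            xi_eta = cur_coords[j] - cur_coords[i]         # deformed bond vectors
            ref_len = np.linalg.norm(xi, axis=1)
            return (np.linalg.norm(xi_eta, axis=1) - ref_len) / ref_len

        def update_intact_bonds(stretch, intact, critical_stretch=0.01):
            """Break bonds whose stretch exceeds the critical stretch; breakage is irreversible."""
            return intact & (stretch <= critical_stretch)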

  17. Budget Management Model Based on Zero-based Budget [基于零基预算的预算管理模式研究]

    Institute of Scientific and Technical Information of China (English)

    王海玲

    2013-01-01

    Starting from the broader context of China's national budget management system, this paper studies and analyzes, from the perspective of budget management in higher vocational colleges, the problems that currently exist in the budget management of these colleges, and puts forward countermeasures and suggestions for adapting to the new public finance budget system. This study has important practical significance for strengthening budget management in higher vocational colleges and improving their level of financial management.

  18. Stratospheric water vapour budget and convection overshooting the tropopause: modelling study from SCOUT-AMMA

    Directory of Open Access Journals (Sweden)

    X. M. Liu

    2010-09-01

    The aim of this paper is to study the impacts of overshooting convection at a local scale on the water distribution in the tropical UTLS. Overshooting convection is assumed to be one of the processes controlling the entry value of the water vapour mixing ratio in the stratosphere, by injecting ice crystals above the tropopause which later sublimate and hydrate the lower stratosphere. For this purpose, we quantify the individual impact of two cases of overshooting convection in Africa observed during SCOUT-AMMA: the case of 4 August 2006 over southern Chad, which is likely to have influenced the water vapour measurements by micro-SDLA and FLASH-B from Niamey on 5 August, and the case of a mesoscale convective system over Aïr on 5 August 2006. We make use of high-resolution (down to 1 km horizontally) nested grid simulations with the three-dimensional regional atmospheric model BRAMS (Brazilian Regional Atmospheric Modelling System). In both cases, BRAMS succeeds in simulating the main features of the convective activity, as well as overshooting convection, though the exact position and time of the overshoots indicated by the MSG brightness temperature difference are not fully reproduced (typically a 1° displacement in latitude for both cases, and a shift of several hours for the Aïr case on 5 August 2006). Total water budgets associated with these two events show a significant injection of ice particles above the tropopause, with maximum values of about 3.7 ton s−1 for the Chad case (4 August) and 1.4 ton s−1 for the Aïr case (5 August), and a total upward cross-tropopause transport of about 3300 ton h−1 for the Chad case and 2400 ton h−1 for the Aïr case in the third domain of simulation. The order of magnitude of these modelled fluxes is lower but comparable with similar studies in other tropical areas based on

  19. A tidal creek water budget: Estimation of groundwater discharge and overland flow using hydrologic modeling in the Southern Everglades

    Science.gov (United States)

    Michot, Béatrice; Meselhe, Ehab A.; Rivera-Monroy, Victor H.; Coronado-Molina, Carlos; Twilley, Robert R.

    2011-07-01

    Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The EMER ecological function is critical since it mediates freshwater and nutrient inputs and controls the water quality in Eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings as a result of upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Program (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine scale (~1 km²) hydrodynamic models of this system that can be utilized as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area, based on long-term hydrologic data (1999-2007) and supplemented by hydrodynamic modeling using a MIKE FLOOD (DHI, http://dhigroup.com/) model to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (average Taylor River wet vs. dry season outflow was 6 to 1 during 1999-2006) with a pronounced interannual variability of flow. The water budget shows a net dominance of through flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area during approximately 80 days per year on average. The groundwater field measurements indicate a mostly upwards-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest a high importance of groundwater contribution to the water salinity in the EMER. The model performance is satisfactory

  20. A Membrane Model from Implicit Elasticity Theory

    Science.gov (United States)

    Freed, A. D.; Liao, J.; Einstein, D. R.

    2014-01-01

    A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079

  1. Quantum mechanical model in gravity theory

    Science.gov (United States)

    Losyakov, V. V.

    2016-05-01

    We consider a model of a real massive scalar field defined as homogeneous on a d-dimensional sphere such that the sphere radius, time scale, and scalar field are related by the equations of the general theory of relativity. We quantize this system with three degrees of freedom, define the observables, and find dynamical mean values of observables in the regime where the scalar field mass is much less than the Planck mass.

  2. MODELS AND THE DYNAMICS OF THEORIES

    Directory of Open Access Journals (Sweden)

    Paulo Abrantes

    2007-12-01

    Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.

  3. Estimation of energy budget of ionosphere-thermosphere system during two CIR-HSS events: observations and modeling

    Science.gov (United States)

    Verkhoglyadova, Olga; Meng, Xing; Mannucci, Anthony J.; Tsurutani, Bruce T.; Hunt, Linda A.; Mlynczak, Martin G.; Hajra, Rajkumar; Emery, Barbara A.

    2016-04-01

    We analyze the energy budget of the ionosphere-thermosphere (IT) system during two High-Speed Streams (HSSs) on 22-31 January, 2007 (in the descending phase of solar cycle 23) and 25 April-2 May, 2011 (in the ascending phase of solar cycle 24) to understand typical features, similarities, and differences in magnetosphere-ionosphere-thermosphere (IT) coupling during HSS geomagnetic activity. We focus on the solar wind energy input into the magnetosphere (by using coupling functions) and energy partitioning within the IT system during these intervals. The Joule heating is estimated empirically. Hemispheric power is estimated based on satellite measurements. We utilize observations from TIMED/SABER (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry) to estimate nitric oxide (NO) and carbon dioxide (CO2) cooling emission fluxes. We perform a detailed modeling study of these two similar HSS events with the Global Ionosphere-Thermosphere Model (GITM) and different external driving inputs to understand the IT response and to address how well the model reproduces the energy transport. GITM is run in a mode with forecastable inputs. It is shown that the model captures the main features of the energy coupling, but underestimates NO cooling and auroral heating in high latitudes. Lower thermospheric forcing at 100 km altitude is important for correct energy balance of the IT system. We discuss challenges for a physics-based general forecasting approach in modeling the energy budget of moderate IT storms caused by HSSs.
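    Solar wind-magnetosphere coupling functions come in several forms; as a hedged example of the kind of quantity referred to above, the sketch below evaluates one classical choice, the Akasofu epsilon parameter, from solar wind speed, IMF magnitude and clock angle. It is not necessarily the coupling function used in this study, and the input values are illustrative.

```python
# Illustrative evaluation of one classical solar wind-magnetosphere coupling
# function (the Akasofu epsilon parameter); treat this purely as an example.
import numpy as np

MU0 = 4e-7 * np.pi        # vacuum permeability [H/m]
R_E = 6.371e6             # Earth radius [m]
L0 = 7 * R_E              # empirical scaling length used in epsilon

def akasofu_epsilon(v_kms, b_nt, clock_angle_deg):
    """Akasofu epsilon [W] from solar wind speed, IMF magnitude and clock angle."""
    v = v_kms * 1e3                      # m/s
    b = b_nt * 1e-9                      # T
    theta = np.radians(clock_angle_deg)  # IMF clock angle
    return (4 * np.pi / MU0) * v * b**2 * np.sin(theta / 2) ** 4 * L0**2

# Typical high-speed-stream values (illustrative only).
print(f"epsilon ~ {akasofu_epsilon(600.0, 8.0, 150.0):.2e} W")
```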

  4. TRADITIONAL BUDGETING VERSUS BEYOND BUDGETING: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    CARDOS ILDIKO REKA

    2014-07-01

    Full Text Available Budgets have been an important part of the business environment since 1920 and are considered to be key drivers and evaluators of managerial performance, and key elements for planning and control. Budgets are the most powerful tool for management control; they can play an essential role in an organization's power politics because they can increase the power and authority of top management and limit the autonomy of lower-level managers. Besides its advantages, traditional budgeting also has disadvantages, and in recent years criticism of it has increased. The basis of this criticism is that traditional budgeting is a relic of the past: it prevents reactions to changes in the market, it cannot keep up with the changes and requirements of today's business world, and it is not useful for business management. To address this criticism, researchers and practitioners have developed more systematic and alternative concepts of budgeting that better suit the needs of the modern business environment. Beyond budgeting, better budgeting, rolling forecasts and activity-based budgeting are the main alternatives developed in recent years; of these, this article examines only beyond budgeting. Our paper discusses how budgeting has evolved into its current state before examining why this universal technique has come under such heavy criticism of late. The paper is a literature analysis, it contributes to the existing managerial accounting literature, and it is structured as follows. In the first part the background and evolution of budgeting are presented, followed by an analysis of related theories in traditional budgeting, emphasizing both the advantages and disadvantages of traditional budgeting. The second part of the paper continues with a discussion of alternative budgeting methods, highlighting the pros and cons of the alternatives, especially beyond budgeting. In the third part conducted

  5. Sparse modeling theory, algorithms, and applications

    CERN Document Server

    Rish, Irina

    2014-01-01

    ""A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come.""-Ricardo Vilalta, Department of Computer Science, University of Houston""This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course.""-Francis Bach, INRIA - École Normale Supřieure, Paris

  6. Modelling canopy radiation budget through multiple scattering approximation: a case study of coniferous forest in Mexico City Valley

    Science.gov (United States)

    Silván-Cárdenas, Jose L.; Corona-Romero, Nirani

    2015-10-01

    In this paper, we describe some results from a study on hyperspectral analysis of coniferous canopy scattering for the purpose of estimating forest biophysical and structural parameters. Georeferenced airborne hyperspectral measurements were taken from a flying helicopter over a coniferous forest dominated by Pinus hartweguii and Abies religiosa within the Federal District Conservation Land in Mexico City. Hyperspectral data were recorded in the optical range from 350 to 2500 nm at 1 nm spectral resolution using the FieldSpec 4 (ASD Inc.). Spectral measurements were also carried out on the ground for vegetation and understory components, including leaf, bark, soil and grass. Measurements were then analyzed through a previously developed multiple scattering approximation (MSA) model, which represents above-canopy spectral reflectance through a non-linear combination of pure spectral components (endmembers), as well as through a set of photon recollision probabilities and interceptance fractions. In this paper we provide an expression for the canopy absorptance as the basis for estimating the components of the canopy radiation budget using the MSA model. Furthermore, since MSA does not prescribe a priori the endmembers to incorporate in the model, a multiple endmember selection method (MESMSA) was developed and tested. Photon recollision probabilities and interceptance fractions were estimated by fitting the model to airborne spectral reflectance, and the selected endmembers were then used to estimate the canopy radiation budget at each measured location.
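    The MSA model itself is not reproduced here, but the role of the photon recollision probability in a canopy radiation budget can be illustrated with the standard spectral-invariants relations, in which canopy absorption and scattering follow from the leaf single-scattering albedo, the recollision probability p and the canopy interceptance i0. This is a generic sketch with invented values, not the authors' parameterisation.

```python
# Generic spectral-invariants sketch (not the paper's MSA model): canopy
# absorption and scattering from leaf albedo w_leaf, recollision probability p
# and canopy interceptance i0. All values are illustrative.
def canopy_budget(w_leaf, p, i0):
    absorbed = i0 * (1.0 - w_leaf) / (1.0 - p * w_leaf)
    scattered = i0 * w_leaf * (1.0 - p) / (1.0 - p * w_leaf)
    return absorbed, scattered

w_leaf = 0.15   # leaf single-scattering albedo in the red (illustrative)
p = 0.6         # recollision probability for a dense conifer canopy (illustrative)
i0 = 0.9        # canopy interceptance (illustrative)
a, s = canopy_budget(w_leaf, p, i0)
print(f"absorbed {a:.3f}, scattered {s:.3f}, sum {a + s:.3f} (= i0)")
```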

  7. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  8. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  9. Open Budget

    Institute of Scientific and Technical Information of China (English)

    LI LI

    2010-01-01

    In an unprecedented move, four ministries under the Central Government recently posted their 2010 budgets on their official websites. This move has been greeted with mixed reactions, with some netizens complaining about a lack of details and explanations of different items.

  10. An Optimization Model Based on Game Theory

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2014-04-01

    Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into an optimization algorithm and propose a new optimization model that can be widely used in optimization processing. The model comes in two variants, called "complete consistency" and "partial consistency"; partial consistency adds a disturbance strategy on top of complete consistency. When the model's consistency condition is satisfied, the Nash equilibrium of the optimization model is the global optimum; when it is not met, the presence of the perturbation strategy can broaden the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and good performance, and it offers a new idea for some intractable problems in the field of artificial intelligence
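    The abstract leaves the optimization model itself abstract, so the sketch below is only a generic, hedged illustration of casting optimization as a game (not the authors' algorithm): two players with quadratic costs repeatedly play best responses until the joint strategy settles at a Nash equilibrium.

```python
# Generic best-response iteration for a two-player quadratic game (illustrative only).
def best_response_1(x2, a1=1.0, c=0.5):
    # argmin over x1 of x1**2 - 2*a1*x1 + c*x1*x2
    return a1 - c * x2 / 2.0

def best_response_2(x1, a2=2.0, c=0.5):
    # argmin over x2 of x2**2 - 2*a2*x2 + c*x1*x2
    return a2 - c * x1 / 2.0

x1, x2 = 0.0, 0.0
for _ in range(100):
    x1_new, x2_new = best_response_1(x2), best_response_2(x1)
    if abs(x1_new - x1) + abs(x2_new - x2) < 1e-12:
        break
    x1, x2 = x1_new, x2_new

print(f"Nash equilibrium approximately at x1={x1:.4f}, x2={x2:.4f}")
```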

  11. String field theory inspired phantom model

    International Nuclear Information System (INIS)

    An exact solution to the Friedmann equations with a string-inspired phantom field is constructed. The Universe is considered as a slowly decaying D3-brane, described in the string field theory framework. The notable features of this exactly solvable stringy dark energy (DE) model are the ghost sign of the kinetic term and a special polynomial form of the effective tachyon potential. Cosmological consequences of adding cold dark matter (CDM) to this model are investigated as well. Solutions with a large initial value of the CDM energy density, attracted by the exact solution without CDM, are constructed numerically. In contrast to the ΛCDM model, the Hubble parameter in our model is not a monotonic function of time. For specific initial data the DE state parameter wDE is also not a monotonic function of time; for these cases there are two separate domains of time where wDE, while less than -1, is close to -1. Stability conditions, under which the constructed solution is stable with respect to small fluctuations of the initial conditions, including the CDM energy density, are found. Keywords: string field theory, cosmology, tachyon, phantom, dark energy, cold dark matter, Big Rip (authors)

  12. A comprehensive study on performance-based budgeting model: A case study of Iran's policy making, implementing and monitoring

    Directory of Open Access Journals (Sweden)

    Ghodratollah Talebnia

    2012-10-01

    Full Text Available Performance-based budgeting (PBB) is the latest attempt to use performance indicators in the allocation of resources in the public sector. PBB experts normally place emphasis on outputs and outcomes instead of inputs. Iran has made efforts to establish a PBB system, but so far this goal has not been realized. The methodology of the research is descriptive, using a survey-analytical approach. The research examines the possibility of establishing PBB in Iran from three perspectives (policymaking, implementation, and monitoring). The conceptual model of this research is formed through a comprehensive review of the PBB literature worldwide. First, based on an extensive review of the literature on countries that have implemented PBB or are trying to implement it, we identify the variables necessary for a suitable performance budgeting model. PBB experts then test the necessity of these variables in Iran, and finally their presence in the Iranian model is assessed with statistical methods.

  13. A modelling study of the impact of cirrus clouds on the moisture budget of the upper troposphere

    Directory of Open Access Journals (Sweden)

    S. Fueglistaler

    2006-01-01

    Full Text Available We present a modelling study of the effect of cirrus clouds on the moisture budget of the layer in which the cloud formed. Our framework simplifies many aspects of cloud microphysics and collapses the problem of sedimentation onto a 0-dimensional box model, but retains essential feedbacks between saturation mixing ratio, particle growth, and water removal through particle sedimentation. The water budget is described by two coupled first-order differential equations for dimensionless particle number density and saturation point temperature, where the parameters defining the system (layer depth, reference temperature, amplitude and time scale of the temperature perturbation, and initial particle number density, which may or may not be a function of reference temperature and cooling rate) are encapsulated in a single coefficient. This allows us to scale the results to a broad range of atmospheric conditions, and to test sensitivities. Results of the moisture budget calculations are presented for a range of atmospheric conditions (T: 238–205 K; p: 325–180 hPa) and a range of time scales τT of the temperature perturbation that induces the cloud formation. The cirrus clouds are found to efficiently remove water for τT longer than a few hours, with longer perturbations (τT≳10 h) required at lower temperatures (T≲210 K). Conversely, we find that temperature perturbations of duration of order 1 h and less (a typical timescale for, e.g., gravity waves) do not efficiently dehydrate over most of the upper troposphere. A consequence is that (for particle densities typical of current cirrus clouds) the assumption of complete dehydration to the saturation mixing ratio may yield valid predictions for upper tropospheric moisture distributions if it is based on the large-scale temperature field, but this assumption is not necessarily valid if it is based on smaller-scale temperature fields.

  14. Parameterisation and validation of a resource budget model for masting using spatiotemporal flowering data of individual trees.

    Science.gov (United States)

    Abe, Tomoyuki; Tachiki, Yuuya; Kon, Hirokazu; Nagasaka, Akiko; Onodera, Kensuke; Minamino, Kazuhiro; Han, Qingmin; Satake, Akiko

    2016-09-01

    Synchronised and fluctuating reproduction by plant populations, called masting, is widespread in diverse taxonomic groups. Here, we propose a new method to explore the proximate mechanism of masting by combining spatiotemporal flowering data, biochemical analysis of resource allocation and mathematical modelling. Flowering data of 170 trees over 13 years showed the emergence of clustering with trees in a given cluster mutually synchronised in reproduction, which was successfully explained by resource budget models. Analysis of resources invested in the development of reproductive organs showed that parametric values used in the model are significantly different between nitrogen and carbon. Using a fully parameterised model, we showed that the observed flowering pattern is explained only when the interplay between nitrogen dynamics and climatic cues was considered. This result indicates that our approach successfully identified resource type-specific roles on masting and that the method is suitable for a wide range of plant species. PMID:27449602
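    The resource budget models referred to here descend from the classic Isagi / Satake-Iwasa formulation; the sketch below implements that textbook version (a reserve accumulates each year, flowering occurs when a threshold is exceeded, and fruiting depletes the reserve), with illustrative parameters rather than the nitrogen- and carbon-specific values fitted in the paper. With a depletion coefficient greater than one, this simple rule already produces the intermittent, fluctuating reproduction characteristic of masting.

```python
# Sketch of the classic resource budget model underlying masting studies.
# Parameters are illustrative, not the fitted values from the paper.
import numpy as np

def resource_budget(years=30, n_trees=5, Ps=1.0, L_T=4.0, R_c=2.0, seed=1):
    rng = np.random.default_rng(seed)
    S = rng.uniform(0.0, L_T, n_trees)         # initial reserves differ among trees
    flowering = np.zeros((years, n_trees))
    for t in range(years):
        S = S + Ps                              # annual photosynthate gain
        mast = S > L_T                          # trees whose reserve crosses the threshold
        Cf = np.where(mast, S - L_T, 0.0)       # flowering expenditure
        flowering[t] = Cf
        S = S - Cf - R_c * Cf                   # fruiting costs R_c * Cf on top of Cf
    return flowering

out = resource_budget()
print(np.round(out.sum(axis=1), 2))             # population-level flowering each year
```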

  15. Queuing theory models for computer networks

    Science.gov (United States)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
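    The kind of elementary queueing formula such spreadsheet models rest on can be written in a few lines; the sketch below computes the M/M/1 mean response time of a single link, with channel capacity, message length and loads chosen purely for illustration.

```python
# Mean response time of an M/M/1 queue applied to a single link (illustrative numbers).
def mm1_response_time(arrival_rate, service_rate):
    """Mean time in system (queueing + service) for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("utilisation >= 1: the queue is unstable")
    return 1.0 / (service_rate - arrival_rate)

capacity_bps = 10e6                              # channel capacity
mean_msg_bits = 8000                             # mean message length
service_rate = capacity_bps / mean_msg_bits      # messages per second
for load in (0.3, 0.6, 0.9):
    lam = load * service_rate
    print(f"utilisation {load:.1f}: mean response {1e3 * mm1_response_time(lam, service_rate):.2f} ms")
```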

  16. A lattice gauge theory model for graphene

    CERN Document Server

    Porta, Marcello

    2011-01-01

    In this Ph.D. thesis a model for graphene in the presence of quantized electromagnetic interactions is introduced. The zero and low temperature properties of the model are studied using rigorous renormalization group methods and lattice Ward identities. In particular, it is shown that, at all orders in renormalized perturbation theory, the Schwinger functions and the response functions decay with interaction dependent anomalous exponents. Regarding the 2-point Schwinger function, the wave function renormalization diverges in the infrared limit, while the effective Fermi velocity flows to the speed of light. Concerning the response functions, those associated to a Kekulé distortion of the honeycomb lattice and to a charge density wave instability are enhanced by the electromagnetic electron-electron interactions (their scaling in real space is depressed), while the lowest order correction to the scaling exponent of the density-density response function is vanishing. Then, the model in the presence of a fixed Kekulé...

  17. Economic contract theory tests models of mutualism.

    Science.gov (United States)

    Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E

    2010-09-01

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.

  18. A matrix model from string field theory

    Science.gov (United States)

    Zeze, Syoji

    2016-09-01

    We demonstrate that a Hermitian matrix model can be derived from level truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. Effective potential for the scalar is evaluated both for finite and large N. Increase of potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.

  19. Quantum Link Models: A Discrete Approach to Gauge Theories

    OpenAIRE

    Chandrasekharan, S; Wiese, U.-J.

    1996-01-01

    We construct lattice gauge theories in which the elements of the link matrices are represented by non-commuting operators acting in a Hilbert space. These quantum link models are related to ordinary lattice gauge theories in the same way as quantum spin models are related to ordinary classical spin systems. Here U(1) and SU(2) quantum link models are constructed explicitly. As Hamiltonian theories quantum link models are nonrelativistic gauge theories with potential applications in condensed ...

  20. Nature, theory and modelling of geophysical convective planetary boundary layers

    Science.gov (United States)

    Zilitinkevich, Sergej

    2015-04-01

    horizontal branches of organised structures. This mechanism (Zilitinkevich et al., 2006) was overlooked in conventional local theories, such as the Monin-Obukhov similarity theory, and in the convective heat/mass transfer law Nu ~ Ra^(1/3), where Nu and Ra are the Nusselt and Rayleigh numbers. References Hellsten A., Zilitinkevich S., 2013: Role of convective structures and background turbulence in the dry convective boundary layer. Boundary-Layer Meteorol. 149, 323-353. Zilitinkevich, S.S., 1973: Shear convection. Boundary-Layer Meteorol. 3, 416-423. Zilitinkevich, S.S., 1991: Turbulent Penetrative Convection, Avebury Technical, Aldershot, 180 pp. Zilitinkevich S.S., 2012: The Height of the Atmospheric Planetary Boundary layer: State of the Art and New Development - Chapter 13 in 'National Security and Human Health Implications of Climate Change', edited by H.J.S. Fernando, Z. Klaić, J.L. McKulley, NATO Science for Peace and Security Series - C: Environmental Security (ISBN 978-94-007-2429-7), Springer, 147-161. Zilitinkevich S.S., 2013: Atmospheric Turbulence and Planetary Boundary Layers. Fizmatlit, Moscow, 248 pp. Zilitinkevich, S.S., Hunt, J.C.R., Grachev, A.A., Esau, I.N., Lalas, D.P., Akylas, E., Tombrou, M., Fairall, C.W., Fernando, H.J.S., Baklanov, A., and Joffre, S.M., 2006: The influence of large convective eddies on the surface layer turbulence. Quart. J. Roy. Met. Soc. 132, 1423-1456. Zilitinkevich S.S., Tyuryakov S.A., Troitskaya Yu. I., Mareev E., 2012: Theoretical models of the height of the atmospheric planetary boundary layer and turbulent entrainment at its upper boundary. Izvestija RAN, FAO, 48, No.1, 150-160. Zilitinkevich, S.S., Elperin, T., Kleeorin, N., Rogachevskii, I., Esau, I.N., 2013: A hierarchy of energy- and flux-budget (EFB) turbulence closure models for stably stratified geophysical flows. Boundary-Layer Meteorol. 146, 341-373.

  1. Theory and Modelling of Electrolytes and Chain Molecules

    OpenAIRE

    Li, Ming

    2011-01-01

    An aqueous solution of electrolytes can be modelled simplistically as charged hard spheres dispersed in a dielectric continuum. We review various classical theories for hard sphere systems including the Percus-Yevick theory, the mean spherical approximation, the Debye-Hückel theory and the hyper-netted chain theory, and we compare the predictions of the theories with simulation results. The statistical associating fluid theory (SAFT) has proved to be accurate for neutral polymers. It is mo...
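    As a concrete anchor for the primitive-model picture of charged hard spheres in a dielectric continuum, the following generic calculation (not code from the thesis) evaluates the Debye screening length that underlies Debye-Hückel theory for a 1:1 electrolyte.

```python
# Generic Debye-Hückel screening length for a 1:1 electrolyte (illustrative values).
import math

EPS0, KB, E, NA = 8.854e-12, 1.381e-23, 1.602e-19, 6.022e23

def debye_length(conc_mol_per_L, eps_r=78.5, T=298.15):
    ionic_strength = conc_mol_per_L * 1e3            # mol/m^3 for a 1:1 salt
    return math.sqrt(EPS0 * eps_r * KB * T / (2 * NA * E**2 * ionic_strength))

for c in (0.001, 0.01, 0.1):
    print(f"{c:.3f} M 1:1 salt: Debye length {1e9 * debye_length(c):.2f} nm")
```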

  2. Mean-velocity profile of smooth channel flow explained by a cospectral budget model with wall-blockage

    Science.gov (United States)

    McColl, Kaighin A.; Katul, Gabriel G.; Gentine, Pierre; Entekhabi, Dara

    2016-03-01

    A series of recent studies has shown that a model of the turbulent vertical velocity variance spectrum (Fvv) combined with a simplified cospectral budget can reproduce many macroscopic flow properties of turbulent wall-bounded flows, including various features of the mean-velocity profile (MVP), i.e., the "law of the wall". While the approach reasonably models the MVP's logarithmic layer, the buffer layer displays insufficient curvature compared to measurements. The assumptions are re-examined here using a direct numerical simulation (DNS) dataset at moderate Reynolds number that includes all the requisite spectral and co-spectral information. Starting with several hypotheses for the cause of the "missing" curvature in the buffer layer, it is shown that the curvature deficit is mainly due to mismatches between (i) the modelled and DNS-observed pressure-strain terms in the cospectral budget and (ii) the DNS-observed Fvv and the idealized form used in previous models. By replacing the current parameterization for the pressure-strain term with an expansive version that directly accounts for wall-blocking effects, the modelled and DNS reported pressure-strain profiles match each other in the buffer and logarithmic layers. Forcing the new model with DNS-reported Fvv rather than the idealized form previously used reproduces the missing buffer layer curvature to high fidelity thereby confirming the "spectral link" between Fvv and the MVP across the full profile. A broad implication of this work is that much of the macroscopic properties of the flow (such as the MVP) may be derived from the energy distribution in turbulent eddies (i.e., Fvv) representing the microstate of the flow, provided the link between them accounts for wall-blocking.

  3. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
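    The feedback-driven behaviour described above is easiest to see in the simplest chaotic system of all; the sketch below iterates the logistic map as a stand-in (it is not the dissertation's moral-behaviour model) for how a nonlinear feedback loop can produce convergence, oscillation or chaotic fluctuation depending on a control parameter.

```python
# The logistic map as a minimal example of feedback-driven dynamics.
def iterate_logistic(r, x0=0.3, n=60):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))   # nonlinear feedback loop
    return xs

for r in (2.8, 3.2, 3.9):                        # fixed point, 2-cycle, chaos
    tail = iterate_logistic(r)[-4:]
    print(f"r={r}: last iterates {[round(x, 3) for x in tail]}")
```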

  4. An Inflationary Model in String Theory

    CERN Document Server

    Iizuka, N; Iizuka, Norihiro; Trivedi, Sandip P.

    2004-01-01

    We construct a model of inflation in string theory after carefully taking into account moduli stabilization. The setting is a warped compactification of Type IIB string theory in the presence of D3 and anti-D3-branes. The inflaton is the position of a D3-brane in the internal space. By suitably adjusting fluxes and the location of symmetrically placed anti-D3-branes, we show that at a point of enhanced symmetry, the inflaton potential V can have a broad maximum, satisfying the condition V''/V << 1 in Planck units. On starting close to the top of this potential the slow-roll conditions can be met. Observational constraints impose significant restrictions. As a first pass we show that these can be satisfied and determine the important scales in the compactification to within an order of magnitude. One robust feature is that the scale of inflation is low, H = O(10^{10}) GeV. Removing the observational constraints makes it much easier to construct a slow-roll inflationary model. Generalizations and conseque...
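    The condition V''/V << 1 quoted above is one of the standard slow-roll conditions; as a hedged illustration, the sketch below evaluates the usual potential slow-roll parameters for a toy hilltop potential in Planck units. The potential and field values are invented and are not the D3-brane potential constructed in the paper.

```python
# Standard potential slow-roll parameters in Planck units (M_pl = 1) for a toy
# hilltop potential; purely illustrative, not the paper's brane potential.
V0, mu = 1.0, 10.0
def V(phi):   return V0 * (1 - (phi / mu) ** 2)
def dV(phi):  return -2 * V0 * phi / mu**2
def d2V(phi): return -2 * V0 / mu**2

def slow_roll(phi):
    eps = 0.5 * (dV(phi) / V(phi)) ** 2      # epsilon_V = (1/2)(V'/V)^2
    eta = d2V(phi) / V(phi)                  # eta_V = V''/V
    return eps, eta

for phi in (0.1, 1.0, 3.0):
    eps, eta = slow_roll(phi)
    print(f"phi={phi}: eps_V={eps:.2e}, eta_V={eta:+.3f}")
```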

  5. Modified perturbation theory for the Yukawa model

    CERN Document Server

    Poluektov, Yu M

    2016-01-01

    A new formulation of perturbation theory for a description of the Dirac and scalar fields (the Yukawa model) is suggested. As the main approximation the self-consistent field model is chosen, which allows one, to a certain degree, to account for the effects caused by the interaction of fields. Such a choice of the main approximation leads to a normally ordered form of the interaction Hamiltonian. Generation of the fermion mass due to the interaction with exchange of the scalar boson is investigated. It is demonstrated that, for zero bare mass, the fermion can acquire mass only if the coupling constant exceeds the critical value determined by the boson mass. In this connection, the problem of the neutrino mass is discussed.

  6. Model theory and the Tannakian formalism

    CERN Document Server

    Kamensky, Moshe

    2009-01-01

    We draw the connection between the model theoretic notions of internality and the binding group on one hand, and the Tannakian formalism on the other. More precisely, we deduce the fundamental results of the Tannakian formalism by associating to a Tannakian category a first order theory, and applying the results on internality there. In the other direction, we formulate and prove a general categorical statement that can be viewed as a "non-linear" version of the Tannakian formalism, and deduce the model theoretic result from it. For dessert, we formulate a version of the Tannakian formalism for differential linear groups, and show how the same techniques can be used to deduce the analogous results in that context.

  7. Drafting Multiannual Local Budgets by Economic-Mathematical Modelling of the Evolution of Revenues

    Directory of Open Access Journals (Sweden)

    Ioan Radu

    2009-01-01

    Full Text Available Although the public administration system is often seen as a sector with a high degree of inertia and conservatism, public institutions are subject to a set of influences from both the internal and the external environment. The public administration system is affected by frequent legislative changes and, more recently, by the requirements of the European Union. Given the complexity and dynamics of the competitive environment, the use of strategic management tools at the level of public administration becomes more and more important and necessary. One of the main forms of strategic management is financial planning, moulded into policies, strategies, plans and programmes whose preparation is based on multiannual budgets.

  8. PARFUME Theory and Model basis Report

    Energy Technology Data Exchange (ETDEWEB)

    Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  9. The annual ammonia budget of fertilised cut grassland – Part 2: Seasonal variations and compensation point modeling

    Directory of Open Access Journals (Sweden)

    C. R. Flechard

    2009-10-01

    Full Text Available The net annual NH3 exchange budget of a fertilised, cut grassland in Central Switzerland is presented. The observation-based budget was computed from semi-continuous micrometeorological fluxes over a time period of 16 months and using a process-based gap-filling procedure. The data for emission peak events following the application of cattle slurry and for background exchange were analysed separately to distinguish short-term perturbations from longer-term ecosystem functioning. A canopy compensation point model of background exchange is parameterised on the basis of measured data and applied for the purposes of gap-filling. The data show that, outside fertilisation events, grassland behaves as a net sink for atmospheric NH3 with an annual dry deposition flux of −3.0 kg N ha−1 yr−1, although small NH3 emissions by the canopy were measured in dry daytime conditions. The median Γs ratio in the apoplast (= [NH4+]/[H+]) estimated from micrometeorological measurements was 620, equivalent to a stomatal compensation point of 1.3 μg NH3 m−3 at 15°C. Non-stomatal resistance to deposition Rw was shown to increase with temperature and decrease with surface relative humidity, and Rw values were among the highest published for European grasslands, consistent with a relatively high ratio of NH3 to acid gases in the boundary layer at this site. Since the gross annual NH3 emission by slurry spreading was of the order of +20 kg N ha−1 yr−1, the fertilised grassland was a net NH3 source of +17 kg N ha−1 yr−1. A comparison with the few other measurement-based budget values from the literature reveals considerable variability, demonstrating both the influence of soil, climate, management and grassland type on the NH
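    The canopy compensation point model used for gap-filling can be sketched as a small resistance network. The version below follows the standard single-layer formulation (atmospheric concentration, a stomatal compensation point derived from Γs, and a cuticular resistance Rw); the thermodynamic constants and resistances are approximate, illustrative values, not the site parameterisation of the paper. With Γs = 620 at 15°C these approximate constants give a stomatal compensation point close to the 1.3 μg m−3 quoted above, which serves as a sanity check on the sketch.

```python
# Sketch of a single-layer canopy compensation point resistance model.
# Thermodynamic constants and resistances are approximate, illustrative values.
import math

R = 8.314            # J mol-1 K-1
M_NH3 = 17.03        # g mol-1

def stomatal_chi(gamma_s, T_kelvin):
    """Stomatal compensation point [ug NH3 m-3] from Gamma_s = [NH4+]/[H+]."""
    # Henry and NH4+ dissociation constants with approximate van 't Hoff
    # temperature dependence (reference values at 298.15 K).
    K_H = 60.0 * math.exp(4100.0 * (1.0 / T_kelvin - 1.0 / 298.15))       # M atm-1
    K_a = 5.67e-10 * math.exp(-6286.0 * (1.0 / T_kelvin - 1.0 / 298.15))  # M
    p_nh3_atm = gamma_s * K_a / K_H
    conc_mol_m3 = p_nh3_atm * 101325.0 / (R * T_kelvin)
    return conc_mol_m3 * M_NH3 * 1e6

def canopy_exchange(chi_a, gamma_s, T_kelvin, Ra=30.0, Rb=20.0, Rs=100.0, Rw=80.0):
    """Canopy compensation point chi_c and net flux (positive = emission)."""
    chi_s = stomatal_chi(gamma_s, T_kelvin)
    chi_c = (chi_a / (Ra + Rb) + chi_s / Rs) / (1.0 / (Ra + Rb) + 1.0 / Rs + 1.0 / Rw)
    flux = (chi_c - chi_a) / (Ra + Rb)      # ug m-2 s-1 for chi in ug m-3, R in s m-1
    return chi_s, chi_c, flux

chi_s, chi_c, flux = canopy_exchange(chi_a=1.0, gamma_s=620.0, T_kelvin=288.15)
print(f"chi_s={chi_s:.2f} ug m-3, chi_c={chi_c:.2f} ug m-3, flux={1e3 * flux:.2f} ng m-2 s-1")
```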

  10. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
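    As a toy illustration of the recourse models such texts treat (not an example taken from the book), the sketch below solves the deterministic equivalent of a two-stage newsvendor-style problem with scipy.optimize.linprog; all costs, prices and scenarios are invented.

```python
# Toy two-stage stochastic LP (newsvendor with recourse) via its deterministic equivalent.
import numpy as np
from scipy.optimize import linprog

cost, price = 3.0, 5.0
demand = np.array([60.0, 100.0, 140.0])    # demand scenarios
prob = np.array([0.3, 0.5, 0.2])           # scenario probabilities
S = len(demand)

# Variables: z = [x, y_1, ..., y_S] (order quantity, scenario sales).
c_vec = np.concatenate([[cost], -prob * price])   # minimise cost minus expected revenue

A_ub, b_ub = [], []
for s in range(S):
    row = np.zeros(1 + S)
    row[0], row[1 + s] = -1.0, 1.0         # y_s <= x
    A_ub.append(row)
    b_ub.append(0.0)
    row = np.zeros(1 + S)
    row[1 + s] = 1.0                       # y_s <= d_s
    A_ub.append(row)
    b_ub.append(demand[s])

res = linprog(c_vec, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=[(0, None)] * (1 + S))
print(f"optimal order quantity: {res.x[0]:.1f}, expected profit: {-res.fun:.1f}")
```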

  11. Estimation of energy budget of ionosphere-thermosphere system during two CIR-HSS events: observations and modeling

    Directory of Open Access Journals (Sweden)

    Verkhoglyadova Olga

    2016-01-01

    Full Text Available We analyze the energy budget of the ionosphere-thermosphere (IT) system during two High-Speed Streams (HSSs) on 22–31 January, 2007 (in the descending phase of solar cycle 23) and 25 April–2 May, 2011 (in the ascending phase of solar cycle 24) to understand typical features, similarities, and differences in magnetosphere-ionosphere-thermosphere (IT) coupling during HSS geomagnetic activity. We focus on the solar wind energy input into the magnetosphere (by using coupling functions) and energy partitioning within the IT system during these intervals. The Joule heating is estimated empirically. Hemispheric power is estimated based on satellite measurements. We utilize observations from TIMED/SABER (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry) to estimate nitric oxide (NO) and carbon dioxide (CO2) cooling emission fluxes. We perform a detailed modeling study of these two similar HSS events with the Global Ionosphere-Thermosphere Model (GITM) and different external driving inputs to understand the IT response and to address how well the model reproduces the energy transport. GITM is run in a mode with forecastable inputs. It is shown that the model captures the main features of the energy coupling, but underestimates NO cooling and auroral heating in high latitudes. Lower thermospheric forcing at 100 km altitude is important for correct energy balance of the IT system. We discuss challenges for a physics-based general forecasting approach in modeling the energy budget of moderate IT storms caused by HSSs.

  12. Research on an Equipment Outlay Budget Project Portfolio Optimization Decision-making Model

    Institute of Scientific and Technical Information of China (English)

    胡玉清; 张帅; 苑明; 张永顺

    2015-01-01

    Applying portfolio management theory and taking into account both the actual needs of equipment development and the funds likely to be available, a portfolio optimization decision-making model for equipment outlay budget projects is established. The model takes maximum expected military benefit as its objective, subject to constraints on the equipment outlay budget control targets, on relationships between projects, and on project decomposability. It provides a feasible decision-making method for preparing the equipment outlay budget and effectively improves the scientific rigour, fairness and transparency of budget preparation.
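    A minimal stand-in for this class of model is the 0/1 knapsack selection below: choose projects to maximise a benefit score under a single budget ceiling. The costs and scores are hypothetical, and the paper's actual model additionally handles project-relationship and decomposability constraints, which are omitted here.

```python
# 0/1 knapsack selection of budget projects under a budget ceiling (hypothetical data).
def select_projects(costs, benefits, budget):
    n = len(costs)
    best = [[0.0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]                      # skip project i
            if costs[i - 1] <= b:                            # or fund it
                cand = best[i - 1][b - costs[i - 1]] + benefits[i - 1]
                best[i][b] = max(best[i][b], cand)
    # Recover the chosen set by walking the table backwards.
    chosen, b = [], budget
    for i in range(n, 0, -1):
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

costs = [40, 60, 30, 70, 20]              # hypothetical project costs
benefits = [9.0, 12.0, 6.5, 13.0, 4.0]    # hypothetical benefit scores
total, chosen = select_projects(costs, benefits, budget=120)
print(f"selected projects {chosen} with total benefit {total}")
```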

  13. Research on the Financial Budget of Private Colleges Based on Control Theory

    Institute of Scientific and Technical Information of China (English)

    罗小兰

    2016-01-01

    The effectiveness of financial budget management in private colleges depends on at least two factors: first, the preparation of the budget must pay attention to method, so that the accuracy of the budget is improved as far as possible; second, the budget must be used as the standard for strict control, with particular attention to the results of budget execution. Budget execution control is the key to successful budget management in private colleges; only by attaching great importance to this link can the purpose of financial budget management be achieved. Starting from the perspective of budget execution in private colleges, this paper explores financial budget management methods suitable for such institutions.

  14. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline through the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. We included all study designs; we found 810 articles related to our topic and finally included 63 full-text articles. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory, operationalised through a logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, an evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  15. Blue mussel (Mytilus edulis) growth at various salinity regime determined by a Dynamic Energy Budget model

    DEFF Research Database (Denmark)

    Saurel, Camille; Maar, Marie; Landes, Anja;

    The blue mussel (Mytilus edulis) is a key euryhaline species in coastal areas that has been used in eutrophied waters in mitigation cultures to improve water clarity by filtering the excess phytoplankton resulting from nutrient enrichment. Mussel growth rates depend mainly on key environmental conditions...... such as food supply, temperature and salinity. In the Baltic Sea, a highly disturbed, eutrophied environment, mussel growth efficiency is limited by the very low levels of salinity, and in areas where the salinity is below 8 psu mussels occur in a dwarf form. The aim of the present study was to incorporate...... the effects of low salinity into an eco-physiological model of blue mussels and to identify areas suitable for cost-effective mussel production for mitigation culture. A standard Dynamic Energy Budget (DEB) model was modified with respect to i) the morphological parameters (DW/WW-ratio, shape factor), ii...
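    The modified DEB model itself is not reproduced here. As a hedged illustration of the kind of salinity effect described, the sketch below uses the von Bertalanffy growth curve to which a standard DEB model reduces under constant food, with an invented salinity factor lowering the attainable size at low salinity; none of the parameter values are the fitted ones.

```python
# Hedged illustration: von Bertalanffy growth with an invented salinity factor
# lowering the ultimate length at low salinity ("dwarf" mussels).
import numpy as np

def salinity_factor(sal_psu, s_low=8.0, s_high=16.0):
    """1 above s_high, linear decline towards s_low; purely illustrative shape."""
    return float(np.clip((sal_psu - s_low) / (s_high - s_low), 0.05, 1.0))

def grow(sal_psu, years=5.0, L_inf=70.0, r_B=0.4, L0=1.0, dt=0.01):
    L, t = L0, 0.0
    L_eff = L_inf * salinity_factor(sal_psu)   # lower size ceiling at low salinity
    while t < years:
        L += r_B * (L_eff - L) * dt            # von Bertalanffy growth step
        t += dt
    return L

for sal in (25.0, 12.0, 8.5):
    print(f"salinity {sal:4.1f} psu -> shell length after 5 y: {grow(sal):.1f} mm")
```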

  16. Projected Impact of Climate Change on the Water and Salt Budgets of the Arctic Ocean by a Global Climate Model

    Science.gov (United States)

    Miller, James R.; Russell, Gary L.

    1996-01-01

    The annual flux of freshwater into the Arctic Ocean by the atmosphere and rivers is balanced by the export of sea ice and oceanic freshwater. Two 150-year simulations of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. Relative to the control, the last 50-year period of the GHG experiment indicates that the total inflow of water from the atmosphere and rivers increases by 10% primarily due to an increase in river discharge, the annual sea-ice export decreases by about half, the oceanic liquid water export increases, salinity decreases, sea-ice cover decreases, and the total mass and sea-surface height of the Arctic Ocean increase. The closed, compact, and multi-phased nature of the hydrologic cycle in the Arctic Ocean makes it an ideal test of water budgets that could be included in model intercomparisons.

  17. A Comparison of Explicit Algebraic Turbulence Models and the Energy-Flux Budget (EFB) Closure in Gabls

    Science.gov (United States)

    Lazeroms, W. M.; Bazile, E.; Brethouwer, G.; Wallin, S.; Johansson, A. V.; Svensson, G.

    2014-12-01

    Turbulent flows with buoyancy effects occur in many situations, both in industry and in the atmosphere. It is challenging to correctly model such flows, especially in the case of stably stratified turbulence, where vertical motions are damped by buoyancy forces. For this purpose, we have derived a so-called explicit algebraic model for the Reynolds stresses and turbulent heat flux that gives accurate predictions in flows with buoyancy effects. Although inspired by turbulence models from engineering, the main aim of our work is to improve the parametrization of turbulence in the atmospheric boundary layer (ABL). Explicit algebraic turbulence models are a class of parametrizations that, on the one hand, are more advanced than standard eddy-diffusivity relations. On the other hand, they are significantly easier to handle numerically than models that require the solution of the full flux-budget equations. To derive the algebraic model, we apply the assumption that transport terms of dimensionless fluxes can be neglected. Careful considerations of the algebra lead to a consistent formulation of the Reynolds stresses and turbulent heat flux, which is more general and robust than previous models of a similar kind. The model is shown to give good results compared to direct numerical simulations of engineering test cases, such as turbulent channel flow. Recent work has been aimed at testing the model in an atmospheric context. The first of these tests makes use of the GABLS1 case, in which a stable atmospheric boundary layer develops through a constant surface cooling rate. The model is able to give good predictions of this case compared to LES (see attached figure). Interestingly, the results are very close to the outcome of the recently developed Energy-Flux-Budget (EFB) closure by Zilitinkevich et al. (2013). A detailed discussion of the similarities and differences between these models will be given, which can give insight into the more general gap between engineering and

  18. Nonsingular models of universes in teleparallel theories.

    Science.gov (United States)

    de Haro, Jaume; Amoros, Jaume

    2013-02-15

    Different models of universes are considered in the context of teleparallel theories. Assuming that the universe is filled by a fluid with an equation of state P=-ρ-f(ρ), we study its dynamics for different teleparallel theories and different equations of state. Two particular cases are studied in detail: in the first one we consider a function f with two zeros (two de Sitter solutions) that mimics a huge cosmological constant at early times and a pressureless fluid at late times; in the second one, in the context of loop quantum cosmology with a small cosmological constant, we consider a pressureless fluid (P=0⇔f(ρ)=-ρ), which means there are de Sitter and anti-de Sitter solutions. In both cases one obtains a nonsingular universe that at early times is in an inflationary phase; after leaving this phase, it passes through a matter dominated phase and finally at late times it expands in an accelerated way. PMID:25166366

  19. California's Methane Budget derived from CalNex P-3 Aircraft Observations and the WRF-STILT Lagrangian Transport Model

    Science.gov (United States)

    Santoni, G. W.; Xiang, B.; Kort, E. A.; Daube, B.; Andrews, A. E.; Sweeney, C.; Wecht, K.; Peischl, J.; Ryerson, T. B.; Angevine, W. M.; Trainer, M.; Nehrkorn, T.; Eluszkiewicz, J.; Wofsy, S. C.

    2012-12-01

    We present constraints on California emission inventories of methane (CH4) using atmospheric observations from nine NOAA P-3 flights during the California Nexus (CalNex) campaign in May and June of 2010. Measurements were made using a quantum cascade laser spectrometer (QCLS) and a cavity ring-down spectrometer (CRDS) and calibrated to NOAA standards in-flight. Five flights sampled above the northern and southern central valley and an additional four flights probed the south coast air basin, quantifying emissions from the Los Angeles basin. The data show large (>100 ppb) CH4 enhancements associated with point and area sources such as cattle and manure management, landfills, wastewater treatment, gas production and distribution infrastructure, and rice agriculture. We compare aircraft observations to modeled CH4 distributions by accounting for a) transport using the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by Weather Research and Forecasting (WRF) meteorology, b) emissions from inventories such as EDGAR and ones constructed from California-specific state and county databases, each gridded to 0.1° x 0.1° resolution, and c) spatially and temporally evolving boundary conditions such as GEOS-Chem and a NOAA aircraft profile measurement derived curtain imposed at the edge of the WRF domain. After accounting for errors associated with transport, planetary boundary layer height, lateral boundary conditions, seasonality of emissions, and the spatial resolution of surface emission prior estimates, we find that the California Air Resources Board (CARB) CH4 budget is a factor of 1.64 too low. Using a Bayesian inversion to the flight data, we estimate California's CH4 budget to be 2.5 TgCH4/yr, with emissions from cattle and manure management, landfills, rice, and natural gas infrastructure, representing roughly 82%, 26%, 9% and 32% (sum = 149% with other sources accounting for the additional 15%) of the current CARB CH4 budget estimate of 1.52 TgCH4
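    The Bayesian scaling of inventory sectors against aircraft observations can be illustrated with the standard linear-Gaussian update; the sketch below applies it to a made-up toy problem with two observations and three source sectors, not to the CalNex data, footprints or covariances.

```python
# Standard linear-Gaussian Bayesian update; all numbers are made-up toy values.
import numpy as np

x_a = np.array([1.0, 1.0, 1.0])            # prior scale factors for 3 sectors
B = np.diag([0.5**2, 0.5**2, 0.5**2])      # prior error covariance
H = np.array([[0.8, 0.3, 0.1],             # sensitivity of 2 obs to each sector
              [0.2, 0.6, 0.4]])
R = np.diag([0.05**2, 0.05**2])            # observation error covariance
y = np.array([1.7, 1.5])                   # observed enhancements (toy values)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
x_post = x_a + K @ (y - H @ x_a)
A_post = B - K @ H @ B                         # posterior error covariance

print("posterior scale factors:", np.round(x_post, 2))
print("posterior std devs:     ", np.round(np.sqrt(np.diag(A_post)), 2))
```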

  20. Galaxy alignments: Theory, modelling and simulations

    CERN Document Server

    Kiessling, Alina; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L; Rassat, Anais

    2015-01-01

    The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in large-scale structure tend to align the shapes and angular momenta of nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both $N$-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the ...

  1. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2015-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  2. Modeling active memory: Experiment, theory and simulation

    Science.gov (United States)

    Amit, Daniel J.

    2001-06-01

    Neuro-physiological experiments on cognitively performing primates are described to argue that strong evidence exists for localized, non-ergodic (stimulus-specific) attractor dynamics in the cortex. The specific phenomena are delay activity distributions, i.e. enhanced spike-rate distributions resulting from training, which we associate with working memory. The anatomy of the relevant cortex region and the physiological characteristics of the participating elements (neural cells) are reviewed to provide a substrate for modeling the observed phenomena. Modeling is based on the properties of the integrate-and-fire neural element in the presence of an input current with a Gaussian distribution. The theory of stochastic processes provides an expression for the spike emission rate as a function of the mean and the variance of the current distribution. Mean-field theory is then based on the assumption that spike emission processes in different neurons in the network are independent, and hence the input current to a neuron is Gaussian. Consequently, the dynamics of the interacting network is reduced to the computation of the mean and the variance of the current received by a cell of a given population in terms of the constitutive parameters of the network and the emission rates of the neurons in the different populations. Within this logic we analyze the stationary states of an unstructured network, corresponding to spontaneous activity, and show that it can be stable only if locally the net input current of a neuron is inhibitory. This is then tested against simulations, and it is found that the agreement is excellent down to great detail, confirming the independence hypothesis. On top of stable spontaneous activity, keeping all parameters fixed, training is described by (Hebbian) modification of synapses between neurons responsive to a stimulus and other neurons in the module: synapses are potentiated between two excited neurons and depressed between an excited and a quiescent neuron
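
    A minimal simulation sketch of the class of model described, an integrate-and-fire neuron driven by an input current of given mean and variance; parameter values and units are hypothetical, and this is not the paper's network model:

      # Leaky integrate-and-fire neuron with Gaussian input current; the spike
      # rate rises with both the mean and the variance of the current.
      import numpy as np

      def lif_rate(mu, sigma, theta=20.0, v_reset=10.0, tau=20.0, dt=0.1, t_max=10_000.0):
          """Estimate the firing rate (Hz) by Euler-Maruyama simulation."""
          rng = np.random.default_rng(1)
          v, spikes = v_reset, 0
          for _ in range(int(t_max / dt)):
              noise = sigma * np.sqrt(dt) * rng.standard_normal()
              v += dt * (-v / tau + mu) + noise   # membrane integration step
              if v >= theta:                      # threshold crossing -> spike
                  spikes += 1
                  v = v_reset                     # reset after the spike
          return 1000.0 * spikes / t_max          # t_max is in ms, so convert to Hz

      print(lif_rate(mu=1.2, sigma=1.0))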

  3. Gravothermal Star Clusters - Theory and Computer Modelling

    Science.gov (United States)

    Spurzem, Rainer

    2010-11-01

    In the George Darwin lecture delivered to the Royal Astronomical Society in 1960, Viktor A. Ambartsumian wrote that the evolution of stellar systems can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk will show how this physical concept has inspired theoretical modeling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend to local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as was found later. Here the state-of-the-art of modeling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (Fokker-Planck approximation) will be reviewed. Recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also been used very successfully to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in centres of galaxies (here again briefly touching one of the many research fields of V.A. Ambartsumian). In the present era of high-speed supercomputing, where we are tackling direct N-body simulations of star clusters, we will show that such direct modeling supports and proves the concept of the statistical models based on the Fokker-Planck theory, and that both theoretical concepts and direct computer simulations are necessary to support each other and make scientific progress in the study of star cluster evolution.

  4. A Mathematical Theory of the Gauged Linear Sigma Model

    CERN Document Server

    Fan, Huijun; Ruan, Yongbin

    2015-01-01

    We construct a rigorous mathematical theory of Witten's Gauged Linear Sigma Model (GLSM). Our theory applies to a wide range of examples, including many cases with non-Abelian gauge group. Both the Gromov-Witten theory of a Calabi-Yau complete intersection X and the Landau-Ginzburg dual (FJRW-theory) of X can be expressed as gauged linear sigma models. Furthermore, the Landau-Ginzburg/Calabi-Yau correspondence can be interpreted as a variation of the moment map or a deformation of GIT in the GLSM. This paper focuses primarily on the algebraic theory, while a companion article will treat the analytic theory.

  5. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: The economic relations correspond to cointegrating vectors and exogeneity...... in the economic model is related to econometric concepts of exogeneity. The economic equilibrium corresponds to the so-called long-run value (Johansen 2005), the long-run impact matrix, C; captures the comparative statics and the exogenous variables are the common trends. The adjustment parameters of the CVAR...

  6. Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    2008-01-01

    Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: The economic relations correspond to cointegrating vectors and exogeneity...... in the economic model implies the econometric concept of strong exogeneity for ß. The economic equilibrium corresponds to the so-called long-run value (Johansen 2005), the comparative statics are captured by the long-run impact matrix, C; and the exogenous variables are the common trends. Also, the adjustment...

  7. High spatial resolution radiation budget for Europe: derived from satellite data, validation of a regional model; Raeumlich hochaufgeloeste Strahlungsbilanz ueber Europa: Ableitung aus Satellitendaten, Validation eines regionalen Modells

    Energy Technology Data Exchange (ETDEWEB)

    Hollmann, R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik

    2000-07-01

    For forty years, instruments onboard satellites have demonstrated their usefulness for many applications in the field of meteorology and oceanography. Several experiments, like ERBE, have been dedicated to establishing a climatology of the global Earth radiation budget at the top of the atmosphere. The focus has now shifted to the regional scale, e.g. GEWEX with its regional sub-experiments like BALTEX. To obtain a regional radiation budget for Europe, in the first part of the work the well calibrated measurements from ScaRaB (scanner for radiation budget) are used to derive a narrow-to-broadband conversion, which is applicable to the AVHRR (advanced very high resolution radiometer). It is shown that the accuracy of the method is of the order of that of ScaRaB itself. In the second part of the work, results of REMO have been compared with measurements of ScaRaB and AVHRR for March 1994. The model reproduces the measurements well overall, but it overestimates the cold areas and underestimates the warm areas in the longwave spectral domain. Similarly, it overestimates the dark areas and underestimates the bright areas in the solar spectral domain. (orig.)

  8. Density functional theory and multiscale materials modeling

    Indian Academy of Sciences (India)

    Swapan K Ghosh

    2003-01-01

    One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. At the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids. At the intermediate mesoscopic length scale, an appropriate picture of the equilibrium and dynamical processes has been obtained through the single particle number density of the constituent atoms or molecules. A wide class of problems involving nanomaterials, interfacial science and soft condensed matter has been addressed using the density-based theoretical formalism as well as atomistic simulation in this regime. At the macroscopic length scale, however, matter is usually treated as a continuous medium and a description using local mass density, energy density and other related density functions has been found to be quite appropriate. A unique single unified theoretical framework that emerges through the density concept at these diverse length scales and is applicable to both quantum and classical systems is the so-called density functional theory (DFT), which essentially provides a vehicle to project the many-particle picture to a single particle one. Thus, the central equation for quantum DFT is a one-particle Schrödinger-like Kohn–Sham equation, while the same for classical DFT consists of Boltzmann-type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential. Selected illustrative applications of quantum DFT to microscopic modeling of intermolecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are presented.
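
    For reference, the one-particle Kohn–Sham equation referred to above takes the standard form

        \[ \Big[-\tfrac{\hbar^2}{2m}\nabla^2 + v_{\mathrm{eff}}[n](\mathbf{r})\Big]\,\phi_i(\mathbf{r}) \;=\; \varepsilon_i\,\phi_i(\mathbf{r}), \qquad n(\mathbf{r}) \;=\; \sum_{i}^{\mathrm{occ}} |\phi_i(\mathbf{r})|^2, \qquad v_{\mathrm{eff}} \;=\; v_{\mathrm{ext}} + v_{\mathrm{H}}[n] + v_{\mathrm{xc}}[n], \]

    i.e. a system of noninteracting particles moving in a density-dependent effective potential, exactly as described in the abstract.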

  9. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    OpenAIRE

    Chung-Hung Tsai

    2014-01-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The propos...

  10. A model of PCF in guarded type theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy...

  11. A Punctuated Equilibrium in French Budgeting Processes

    OpenAIRE

    B. Baumgartner, Frank; Foucault, Martial; François, Abel

    2006-01-01

    We use data on French budgeting to test models of friction, incrementalism and punctuated equilibrium. Data include the overall state budget since 1820; ministerial budgets for seven ministries since 1868; and a more complete ministerial series covering ten ministries since 1947. Our results in every case are remarkably similar to the highly leptokurtic distributions that Jones and Baumgartner (2005) demonstrated in US budgeting processes. This suggests that general characteristics of adminis...

  12. Models of Particle Physics from Type IIB String Theory and F-theory: A Review

    CERN Document Server

    Maharana, Anshuman

    2012-01-01

    We review particle physics model building in type IIB string theory and F-theory. This is a region in the landscape where in principle many of the key ingredients required for a realistic model of particle physics can be combined successfully. We begin by reviewing moduli stabilisation within this framework and its implications for supersymmetry breaking. We then review model building tools and developments in the weakly coupled type IIB limit, for both local D3-branes at singularities and global models of intersecting D7-branes. Much of recent model building work has been in the strongly coupled regime of F-theory due to the presence of exceptional symmetries which allow for the construction of phenomenologically appealing Grand Unified Theories. We review both local and global F-theory model building starting from the fundamental concepts and tools regarding how the gauge group, matter sector and operators arise, and ranging to detailed phenomenological properties explored in the literature.

  13. Big Bang Models in String Theory

    OpenAIRE

    Craps, Ben

    2006-01-01

    These proceedings are based on lectures delivered at the "RTN Winter School on Strings, Supergravity and Gauge Theories", CERN, January 16 - January 20, 2006. The school was mainly aimed at Ph.D. students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne...

  14. The Flare Irradiance Spectral Model (FISM) and its Contributions to Space Weather Research, the Flare Energy Budget, and Instrument Design

    Science.gov (United States)

    Chamberlin, Phillip

    2008-01-01

    The Flare Irradiance Spectral Model (FISM) is an empirical model of the solar irradiance spectrum from 0.1 to 190 nm at 1 nm spectral resolution and on a 1-minute time cadence. The goal of FISM is to provide accurate solar spectral irradiances over the vacuum ultraviolet (VUV: 0-200 nm) range as input for ionospheric and thermospheric models. The seminar will begin with a brief overview of the FISM model, and also how the Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) will contribute to improving FISM. Some current studies will then be presented that use FISM estimations of the solar VUV irradiance to quantify the contributions of the increased irradiance from flares to Earth's increased thermospheric and ionospheric densities. Initial results will also be presented from a study looking at the electron density increases in the Martian atmosphere during a solar flare. Results will also be shown quantifying the VUV contributions to the total flare energy budget for both the impulsive and gradual phases of solar flares. Lastly, an example of how FISM can be used to simplify the design of future solar VUV irradiance instruments will be discussed, using the future NOAA GOES-R Extreme Ultraviolet and X-Ray Sensors (EXIS) space weather instrument.

  15. String-Like Dual Models for Scalar Theories

    CERN Document Server

    Baadsgaard, Christian; Bourjaily, Jacob L; Damgaard, Poul H

    2016-01-01

    We show that all tree-level amplitudes in $\varphi^p$ scalar field theory can be represented as the $\alpha'\to0$ limit of an $SL(2,R)$-invariant, string-theory-like dual model integral. These dual models are constructed according to constraints that admit families of solutions. We derive these dual models, and give closed formulae for all tree-level amplitudes of any $\varphi^p$ scalar field theory.

  16. Local Models in F-Theory and M-Theory with Three Generations

    OpenAIRE

    Bourjaily, Jacob L.

    2009-01-01

    We describe a general framework that can be used to geometrically engineer local, phenomenological models in F-theory and M-theory based on ALE-fibrations, and we present several concrete examples of such models that feature three generations of matter with semi-realistic phenomenology. We show that the geometric structures required for generating interactions--triple-intersections of matter-curves in F-theory and supersymmetric three-cycles supporting multiple conical singularities in M-theo...

  17. Comparison of rainfall based SPI drought indices with SMDI and ETDI indices derived from a soil water budget model

    Science.gov (United States)

    Houcine, A.; Bargaoui, Z.

    2012-04-01

    Modelling the soil water budget is a key issue for assessing drought awareness indices based on soil moisture estimation. The aim of the study is to compare drought indices based on rainfall time series to those based on soil water content time series and evapotranspiration time series. To this end, a vertically averaged water budget over the root zone is implemented to assist the estimation of evapotranspiration flux. A daily time step is adopted to run the water budget model for a lumped watershed of 250 km2 under an arid climate where recorded meteorological and hydrological data are available for a ten-year period. The water balance, which involves 7 parameters, is computed including evapotranspiration, runoff and leakage. Soil-property-related parameters are derived according to pedo-transfer functions, while the two remaining parameters are considered data-driven and are subject to calibration. The model is calibrated using daily hydro-meteorological data (solar radiation, air temperature, air humidity, mean areal rainfall) as well as daily runoff records and also average annual (or regional) evapotranspiration. The latter is estimated using an empirical sub-model. A set of acceptable solutions is identified according to the values of the Nash coefficients for annual and decadal runoffs as well as the relative bias for average annual evapotranspiration. Using these acceptable solutions several drought indices are computed: SPI (standardized precipitation index), SMDI (soil moisture deficit index) and ETDI (evapotranspiration deficit index). While SPI indicators are based only on monthly precipitation time series, SMDI is based on the weekly mean soil water content as computed by the hydrological model. On the other hand, ETDI indices are based on the weekly mean potential and actual evapotranspiration as estimated by the meteorological and hydrological models. For SPI evaluation various time scales are considered from one to twelve months (SPI1, SPI3, SPI6, SPI9 and SPI12). For all
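
    For concreteness, a simplified textbook-style SPI computation (aggregate precipitation over the chosen time scale, fit a gamma distribution, map to standard normal quantiles) might look as follows; month-by-month fitting and the treatment of zero-precipitation cases are omitted, and this is not the authors' implementation:

      # Simplified SPI from a monthly precipitation series.
      import numpy as np
      from scipy import stats

      def spi(monthly_precip, scale=3):
          """Return the SPI series for the given aggregation scale in months."""
          p = np.asarray(monthly_precip, dtype=float)
          agg = np.convolve(p, np.ones(scale), mode="valid")        # rolling sum
          shape, loc, scale_par = stats.gamma.fit(agg, floc=0)      # gamma fit, location fixed at 0
          cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=scale_par)
          return stats.norm.ppf(cdf)                                # standard normal quantiles

      rng = np.random.default_rng(2)
      precip = rng.gamma(shape=2.0, scale=30.0, size=240)           # 20 years of synthetic rainfall
      print(spi(precip, scale=3)[:12])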

  18. The Theory of Finite Models without Equal Sign

    Institute of Scientific and Technical Information of China (English)

    Li Bo LUO

    2006-01-01

    In this paper, we suggest for the first time that the model theory of all finite structures be studied and that the equal sign be put in the same situation as the other relations. Using formulas of infinite lengths we obtain new theorems for the preservation of model extensions, submodels, model homomorphisms and inverse homomorphisms. These kinds of theorems were discussed in Chang and Keisler's Model Theory, systematically for general models, but Gurevich obtained some different theorems in this direction for finite models. In our paper the old theorems manage to survive in finite model theory. There are some differences between into homomorphisms and onto homomorphisms in preservation theorems too. We also study reduced models and minimum models. The characterization sentence of a model is given, which yields a general result that any theory T is equivalent to a set of existential-universal sentences. Some results about completeness and model completeness are also given.

  19. Theory and modeling of active brazing.

    Energy Technology Data Exchange (ETDEWEB)

    van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.

    2013-09-01

    Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, and the nonequilibrium composition-dependent surface tension as well as the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.

  20. THE BIG BANG THEORY AND UNIVERSE MODELING. MISTAKES IN THE RELATIVITY THEORY

    OpenAIRE

    Javadov, Khaladdin; Javadli, Elmaddin

    2014-01-01

    This article is about the Big Bang theory and describes some details of universe modelling. It presents physical and mathematical modeling of universe formation and the application of mathematical and physical formulas to universe calculations.

  1. Eco-budget: A Model of Urban Environmental Management (生态预算:一种城市环境管理模型)

    Institute of Scientific and Technical Information of China (English)

    郝韦霞

    2005-01-01

    In this paper, a new model of urban environmental resource management is introduced. The article analyzes the gap between urban environmental management and the management of economy and human resources. The significance, the key points, and the implementation procedures and steps of the eco-budget cycle are discussed.

  2. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... spectrum is proven to hold globally and scattering theory of the model is studied using time-dependent methods, of which the main result is asymptotic completeness....

  3. QUANTUM THEORY FOR THE BINOMIAL MODEL IN FINANCE THEORY

    Institute of Scientific and Technical Information of China (English)

    CHEN Zeqian

    2004-01-01

    In this paper, a quantum model for the binomial market in finance is proposed. We show that its risk-neutral world exhibits an intriguing structure as a disk in the unit ball of R^3, whose radius is a function of the risk-free interest rate with two thresholds which prevent arbitrage opportunities from this quantum market. Furthermore, from the quantum mechanical point of view we re-deduce the Cox-Ross-Rubinstein binomial option pricing formula by considering Maxwell-Boltzmann statistics of the system of N distinguishable particles.
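
    For reference, the classical (non-quantum) Cox-Ross-Rubinstein backward-induction pricing that the paper re-derives can be sketched as follows; the parameter values are illustrative only:

      # Standard CRR binomial pricing of a European call option.
      import math

      def crr_call(S0, K, r, sigma, T, n):
          dt = T / n
          u = math.exp(sigma * math.sqrt(dt))       # up factor
          d = 1.0 / u                               # down factor
          p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
          disc = math.exp(-r * dt)
          # terminal payoffs at the n+1 lattice nodes
          values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
          # backward induction through the lattice
          for step in range(n, 0, -1):
              values = [disc * (p * values[j + 1] + (1 - p) * values[j]) for j in range(step)]
          return values[0]

      print(crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))  # close to the Black-Scholes value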

  4. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    Directory of Open Access Journals (Sweden)

    M. Adachi

    2011-09-01

    More reliable estimates of the carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha−1) and the rainforest in Malaysia (201.5 t C ha−1) indicate that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results in rainforests were consistent with field data, except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha−1 yr−1 in the two primary forests, whereas the model slightly underestimated GPP (26.5–34.5 t C ha−1 yr−1). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after the deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock at 30 yr following initiation of the plantation. However, there were few field data on C budget and stock, especially in the oil palm plantation. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to

  5. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    Science.gov (United States)

    Adachi, M.; Ito, A.; Ishida, A.; Kadir, W. R.; Ladpala, P.; Yamagata, Y.

    2011-09-01

    More reliable estimates of the carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha-1) and the rainforest in Malaysia (201.5 t C ha-1) indicate that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results in rainforests were consistent with field data, except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha-1 yr-1 in the two primary forests, whereas the model slightly underestimated GPP (26.5-34.5 t C ha-1 yr-1). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after the deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock at 30 yr following initiation of the plantation. However, there were few field data on C budget and stock, especially in the oil palm plantation. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to understand the effects of climate and land-use conversion on C budgets in tropical forest

  6. Hypergame Theory: A Model for Conflict, Misperception, and Deception

    Directory of Open Access Journals (Sweden)

    Nicholas S. Kovach

    2015-01-01

    When dealing with conflicts, game theory and decision theory can be used to model the interactions of the decision-makers. To date, game theory and decision theory have received considerable modeling focus, while hypergame theory has not. A metagame, known as a hypergame, occurs when one player does not know or fully understand all the strategies of a game. Hypergame theory extends the advantages of game theory by allowing a player to outmaneuver an opponent and obtain a more preferred outcome with a higher utility. The ability to outmaneuver an opponent occurs in the hypergame because the different views (perception or deception) of opponents are captured in the model, through the incorporation of information unknown to other players (misperception or intentional deception). The hypergame model provides more accurate solutions for complex theoretic modeling of conflicts than game theory and excels where perception or information differences exist between players. This paper explores the current research in hypergame theory and presents a broad overview of the historical literature on hypergame theory.

  7. MULTI-FLEXIBLE SYSTEM DYNAMIC MODELING THEORY AND APPLICATION

    Institute of Scientific and Technical Information of China (English)

    仲昕; 周兵; 杨汝清

    2001-01-01

    The theory of flexible body modeling was demonstrated. An example of modeling an automobile front suspension as a multi-flexible system was shown. Finally, it is shown that the simulation results of the multi-flexible dynamic model approach the road test data more closely than those of the multi-rigid dynamic model do. Thus, it is fully demonstrated that using multi-flexible body theory for modeling is necessary and effective.

  8. Implementation ambiguity: The fifth element long lost in uncertainty budgets for land biogeochemical modeling

    Science.gov (United States)

    Tang, J.; Riley, W. J.

    2015-12-01

    Previous studies have identified four major sources of predictive uncertainty in modeling land biogeochemical (BGC) processes: (1) imperfect initial conditions (e.g., assumption of preindustrial equilibrium); (2) imperfect boundary conditions (e.g., climate forcing data); (3) parameterization (type I equifinality); and (4) model structure (type II equifinality). As if that were not enough to cause substantial sleep loss in modelers, we propose here a fifth element of uncertainty that results from implementation ambiguity that occurs when the model's mathematical description is translated into computational code. We demonstrate the implementation ambiguity using the example of nitrogen down regulation, a necessary process in modeling carbon-climate feedbacks. We show that, depending on common land BGC model interpretations of the governing equations for mineral nitrogen, there are three different implementations of nitrogen down regulation. We coded these three implementations in the ACME land model (ALM), and explored how they lead to different preindustrial and contemporary land biogeochemical states and fluxes. We also show how this implementation ambiguity can lead to different carbon-climate feedback estimates across the RCP scenarios. We conclude by suggesting how to avoid such implementation ambiguity in ESM BGC models.
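
    A toy sketch of the kind of implementation ambiguity described above, where one verbal rule for allocating scarce mineral nitrogen can be coded in more than one mass-conserving way; the pool names and numbers are hypothetical and are not the ACME/ALM implementations discussed in the abstract:

      # Two codings of "down-regulate N uptake and immobilization when mineral N
      # is insufficient"; both conserve mass but distribute the deficit differently.

      def proportional_downregulation(n_mineral, demand_plant, demand_microbe):
          total = demand_plant + demand_microbe
          f = min(1.0, n_mineral / total) if total > 0 else 0.0
          return demand_plant * f, demand_microbe * f      # both demands scaled equally

      def sequential_downregulation(n_mineral, demand_plant, demand_microbe):
          uptake_microbe = min(demand_microbe, n_mineral)   # microbes served first
          uptake_plant = min(demand_plant, n_mineral - uptake_microbe)
          return uptake_plant, uptake_microbe

      # same state, same demands, different realized fluxes:
      print(proportional_downregulation(1.0, 0.8, 0.6))     # (0.571..., 0.428...)
      print(sequential_downregulation(1.0, 0.8, 0.6))       # (0.4, 0.6)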

  9. Mean Kinetic Energy Budget of Wakes Within Model Wind Farms: Comparison of an Array of Model Wind Turbines and Porous Discs

    Science.gov (United States)

    Camp, E.; Cal, R. B.

    2015-12-01

    To optimize the power production of large wind farms, it is important to understand the flow within the wind turbine array as well as its interaction with the surrounding atmosphere. Computational simulations are often employed to study both the velocity field within and immediately above wind farms. In many computational studies, wind turbines are modeled as stationary, porous actuator discs. A wind tunnel study is done in order to compare the wakes within an array of porous discs and an equivalent array of model wind turbines. To characterize the wakes within a 4×3 model wind farm, stereoscopic particle image velocimetry (SPIV) is employed. SPIV measurements focus on the region along the centerline of the array upstream and downstream of the center turbine in the fourth row. The computed mean flow fields and turbulent stresses provide a basis to compare the near and far wakes of the turbines with those of the porous discs. The detailed analysis of the wakes for each case focuses on the mean kinetic energy budget within the wakes. The mean kinetic energy budget is examined by computing the mean kinetic energy, the flux of kinetic energy, and the production of turbulence, which together are analogous to a measure of extracted power.
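
    For reference, two of the budget terms named above are often written in the following simplified form in wind-farm boundary-layer studies (only the dominant vertical-transport and shear-production contributions are shown; the full budget also contains advection, pressure and viscous terms, and the exact decomposition used by the authors may differ):

        \[ \underbrace{-\,\frac{\partial}{\partial z}\!\left(\overline{u'w'}\,\bar{U}\right)}_{\text{vertical turbulent flux of MKE}} \qquad\text{and}\qquad \underbrace{-\,\overline{u'w'}\,\frac{\partial \bar{U}}{\partial z}}_{\text{production of TKE (loss of MKE)}}, \]

    with the mean kinetic energy defined as \tfrac{1}{2}\bar{U}_i\bar{U}_i.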

  10. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  11. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations...

  12. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    OpenAIRE

    Camaren Peter; Mark Swilling

    2014-01-01

    In this paper, we deploy complexity theory as the foundation for the integration of different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going...

  13. Gaining insight in the interaction of zinc and population density with a combined dynamic energy budget and population model.

    Science.gov (United States)

    Klok, Chris

    2008-12-01

    Laboratory tests are typically conducted under optimal conditions, testing the single effect of a toxicant. In the field, due to suboptimal conditions, density dependence can both diminish and enhance the effects of toxicants on populations. A review of the literature indicated that general insight on the interaction of density and toxicants is lacking, and therefore no predictions on their combined action can be made. In this paper the influence of zinc was tested at different population densities on the demographic rates: growth, reproduction, and survival in the earthworm Lumbricus rubellus. Changes in these rates were extrapolated with a combined dynamic energy budget (DEB) and population model to assess consequences at the population level. Inference from the DEB model indicated that density decreased the assimilation of food whereas zinc increased the maintenance costs. The combined effects of density and zinc resulted in a decrease in the intrinsic rate of population increase, which suddenly dropped to zero at combinations of zinc and density where development is so strongly retarded that individuals do not mature. This already happened at zinc levels where zinc-induced mortality is low; therefore density enhances zinc effects, and density-dependent compensation is not expected. PMID:19192801

  14. Dust emission size distribution impact on aerosol budget and radiative forcing over the Mediterranean region: a regional climate model approach

    Directory of Open Access Journals (Sweden)

    P. Nabat

    2012-07-01

    The present study investigates the dust emission and load over the Mediterranean basin using the coupled-chemistry-aerosol regional climate model RegCM-4. The first step of this work focuses on dust particle emission size distribution modeling. We compare a parameterization in which the emission is based on the individual kinetic energy of the aggregates striking the surface to a recent parameterization based on an analogy with the fragmentation of brittle materials. The main difference between the two dust schemes concerns the mass proportion of fine aerosol, which is reduced in the case of the new dust parameterization, with consequences for optical properties. At the episodic scale, comparisons between RegCM-4 simulations, satellite and ground-based data show a clear improvement using the new dust distribution in terms of Aerosol Optical Depth (AOD) values and geographic gradients. These results are confirmed at the seasonal scale for the investigated year 2008. A multi-annual simulation is finally carried out using the new dust distribution over the period 2000–2009. This change of dust distribution has significant impacts on the simulated regional dust budget, notably dry dust deposition and the regional direct aerosol radiative forcing over the Mediterranean basin. This could clearly modify the possible effects of dust aerosols on the biogeochemical activity and climate of the Mediterranean basin. In particular, we find that the new size distribution produces a higher dust deposition flux and a smaller top-of-atmosphere (TOA) dust radiative cooling.

  15. Evolutionary Economics, Endogenous Growth Models, and Resource-Advantage Theory

    OpenAIRE

    Shelby D. Hunt

    1997-01-01

    The gap between evolutionary and neoclassical economics remains large. This article proposes that the gap can be narrowed by evolutionary economics developing process theories that can provide evolutionary theoretical foundations for formal models in the neoclassical equilibrium tradition. This article argues that a process theory of competition, labeled "resource-advantage theory," can provide an evolutionary theoretical foundation for formal models of endogenous economic growth.

  16. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_{t}, within the larger set of m+k candidate variables, (x_{t}, w_{t}), then selection over the second...

  17. The logical foundations of scientific theories languages, structures, and models

    CERN Document Server

    Krause, Decio

    2016-01-01

    This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...

  18. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer-Assisted Design (CAD), Computer-Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as the practical skills needed to apply this understanding in real-world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. In Rapid Prototyping, the author illustrates stereolithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  19. Studies of the Earth Energy Budget and Water Cycle Using Satellite Observations and Model Analyses

    Science.gov (United States)

    Campbell, G. G.; VonderHarr, T. H.; Randel, D. L.; Kidder, S. Q.

    1997-01-01

    During this research period we have utilized the ERBE data set in comparisons to surface properties and water vapor observations in the atmosphere. A relationship between cloudiness and surface temperature anomalies was found. This same relationship was found in a general circulation model, verifying the model. The attempt to construct a homogeneous time series from Nimbus 6, Nimbus 7 and ERBE data is not complete because we are still waiting for the ERBE reanalysis to be completed. It will be difficult to merge the Nimbus 6 data in because its observations occurred when the average weather was different than the other periods, so regression adjustments are not effective.

  20. Large field inflation models from higher-dimensional gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Furuuchi, Kazuyuki [Manipal Centre for Natural Sciences, Manipal University, Manipal, Karnataka 576104 (India); Koyama, Yoji [Department of Physics, National Tsing-Hua University, Hsinchu 30013, Taiwan R.O.C. (China)

    2015-02-23

    Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante’s Inferno model turns out to be the most preferred model in this framework.

  1. Teaching Wound Care Management: A Model for the Budget Conscious Educator

    Science.gov (United States)

    Berry, David C.

    2012-01-01

    For the author, the concept of wound care has always been a challenging topic to demonstrate. How to teach the concept without having a student in need of wound care or without having to spend money to buy another simulation manikin/model? The author has recently created a simulation to demonstrate and practice the cleaning, closing, and dressing…

  2. Theory of stellar convection - II. First stellar models

    Science.gov (United States)

    Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.

    2016-07-01

    We present here the first stellar models on the Hertzsprung-Russell diagram, in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both ML theory and SFC theory. The key temperature-over-pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the `calibrated' ML theory for main-sequence stars. We conclude that the old scale-dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.

  3. An end-to-end model of the Earth Radiation Budget Experiment (ERBE) Earth-viewing nonscanning radiometric channels

    OpenAIRE

    Priestly, Kory James

    1993-01-01

    The Earth Radiation Budget Experiment (ERBE) active-cavity radiometers are used to measure the incoming solar, reflected solar, and emitted longwave radiation from the Earth and its atmosphere. The radiometers are carried by the National Aeronautics and Space Administration's Earth Radiation Budget Satellite (ERBS) and the National Oceanic and Atmospheric Administration's NOAA-9 and NOAA-10 spacecraft. Four Earth-viewing nonscanning active-cavity radiometers are carried by e...

  4. The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives

    CERN Document Server

    Badesa, Calixto

    2008-01-01

    Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali

  5. Atmospheric water budget over the western Himalayas in a regional climate model

    Indian Academy of Sciences (India)

    A P Dimri

    2012-08-01

    During winter months (December, January, February – DJF), the western Himalayas (WH) receive precipitation from eastward-moving extratropical cyclones, called western disturbances (WDs) in Indian parlance. The winter precipitation–moisture convergence–evaporation (P–C–E) cycle is analyzed for a period of 22 years (1981–2002: 1980(D)–1981(J, F) to 2001(D)–2002(J, F)) with observed and modelled (RegCM3) climatological estimates over WH. Remarkable model skill is observed in depicting the hydrological cycle over WH. Although precipitation biases exist, a similar spatial precipitation pattern with two well-marked maxima is simulated by the model. As the season advances, the temporal distribution shows higher precipitation in the simulation than in the observations. However, the P–C–E cycle shows similar peaks of moisture convergence and evaporation in the daily climatologies, though with varying maxima/minima. In the first half of winter, evaporation over WH is mainly driven by the ground surface and 2 m air temperatures. The lowest temperatures during mid-winter also correspond to the lowest evaporation-to-precipitation ratio.
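
    For orientation, the column-integrated atmospheric water budget usually analysed in such studies can be written as (a generic textbook form; the exact terms evaluated in the paper may differ in detail):

        \[ \frac{\partial W}{\partial t} + \nabla\!\cdot\!\mathbf{Q} \;=\; E - P \qquad\Longleftrightarrow\qquad P \;=\; E + C - \frac{\partial W}{\partial t}, \quad C \equiv -\nabla\!\cdot\!\mathbf{Q}, \]

    where W is the column precipitable water, \mathbf{Q} the vertically integrated moisture flux, P precipitation, E evaporation and C the moisture convergence.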

  6. Dynamic Electrothermal Model of a Sputtered Thermopile Thermal Radiation Detector for Earth Radiation Budget Applications

    Science.gov (United States)

    Weckmann, Stephanie

    1997-01-01

    The Clouds and the Earth's Radiant Energy System (CERES) is a program sponsored by the National Aeronautics and Space Administration (NASA) aimed at evaluating the global energy balance. Current scanning radiometers used for CERES consist of thin-film thermistor bolometers viewing the Earth through a Cassegrain telescope. The Thermal Radiation Group, a laboratory in the Department of Mechanical Engineering at Virginia Polytechnic Institute and State University, is currently studying a new sensor concept to replace the current bolometer: a thermopile thermal radiation detector. This next-generation detector would consist of a thermal sensor array made of thermocouple junction pairs, or thermopiles. The objective of the current research is to perform a thermal analysis of the thermopile. Numerical thermal models are particularly suited to solve problems for which temperature is the dominant mechanism of the operation of the device (through the thermoelectric effect), as well as for complex geometries composed of numerous different materials. Feasibility and design specifications are studied by developing a dynamic electrothermal model of the thermopile using the finite element method. A commercial finite element-modeling package, ALGOR, is used.

  7. Measurement-derived heat-budget approaches for simulating coastal wetland temperature with a hydrodynamic model

    Science.gov (United States)

    Swain, Eric; Decker, Jeremy

    2010-01-01

    Numerical modeling is needed to predict environmental temperatures, which affect a number of biota in southern Florida, U.S.A., such as the West Indian manatee (Trichechus manatus), which uses thermal basins for refuge from lethal winter cold fronts. To numerically simulate heat-transport through a dynamic coastal wetland region, an algorithm was developed for the FTLOADDS coupled hydrodynamic surface-water/ground-water model that uses formulations and coefficients suited to the coastal wetland thermal environment. In this study, two field sites provided atmospheric data to develop coefficients for the heat flux terms representing this particular study area. Several methods were examined to represent the heat-flux components used to compute temperature. A Dalton equation was compared with a Penman formulation for latent heat computations, producing similar daily-average temperatures. Simulation of heat-transport in the southern Everglades indicates that the model represents the daily fluctuation in coastal temperatures better than at inland locations; possibly due to the lack of information on the spatial variations in heat-transport parameters such as soil heat capacity and surface albedo. These simulation results indicate that the new formulation is suitable for defining the existing thermohydrologic system and evaluating the ecological effect of proposed restoration efforts in the southern Everglades of Florida.
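
    As a rough guide to the two latent-heat formulations compared above, generic textbook forms are (the coefficients actually used in FTLOADDS were fitted to the wetland sites and are not reproduced here):

        \[ E_{\mathrm{Dalton}} \;\propto\; f(u)\,\big(e_s(T_s) - e_a\big), \qquad \lambda E_{\mathrm{Penman}} \;=\; \frac{\Delta\,(R_n - G) \;+\; \gamma\,\lambda\,f(u)\,(e_s - e_a)}{\Delta + \gamma}, \]

    where e_s - e_a is the vapour-pressure deficit, f(u) a wind function, \Delta the slope of the saturation vapour-pressure curve, \gamma the psychrometric constant, R_n the net radiation and G the ground heat flux.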

  8. Evaluating the performance of the land surface model ORCHIDEE-CAN on water and energy flux estimation with a single- and a multi- layer energy budget scheme

    OpenAIRE

    Chen, Yiying; Ryder, James; Bastrikov, Vladislav; McGrath, Matthew J.; NAUDTS Kim; Otto, Juliane; Ottlé, Catherine; Peylin, Philippe; Polcher, Jan; VALADE Aude; Black, Andrew; Elbers, Jan A.; Moors, Eddy; Foken, Thomas; Gorsel, Eva van

    2016-01-01

    Canopy structure is one of the most important vegetation characteristics for land-atmosphere interactions, as it determines the energy and scalar exchanges between the land surface and the overlying air mass. In this study we evaluated the performance of a newly developed multi-layer energy budget in the land surface model ORCHIDEE-CAN (Organising Carbon and Hydrology In Dynamic Ecosystems – CANopy), which simulates canopy structure and can be coupled to an atmospheric model using an im...

  9. Parameter Estimations of Dynamic Energy Budget (DEB) Model over the Life History of a Key Antarctic Species: The Antarctic Sea Star Odontaster validus Koehler, 1906.

    Directory of Open Access Journals (Sweden)

    Antonio Agüera

    Marine organisms in Antarctica are adapted to an extreme ecosystem, including extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase of temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for the key benthic Antarctic species, the sea star Odontaster validus, using available information from the literature and experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages, and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. Model parameters describe a slowed-down metabolism of long-living animals that mature slowly. O. validus has a large reserve that, matching low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developed within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O

  10. Estimation of Carbon Budgets for Croplands by Combining High Resolution Remote Sensing Data with a Crop Model and Validation Ground Data

    Science.gov (United States)

    Mangiarotti, S.; Veloso, A.; Ceschia, E.; Tallec, T.; Dejoux, J. F.

    2015-12-01

    Croplands occupy large areas of Earth's land surface, playing a key role in the terrestrial carbon cycle. Hence, it is essential to quantify and analyze the carbon fluxes from those agro-ecosystems, since they contribute to climate change and are impacted by the environmental conditions. In this study we propose a regional modeling approach that combines high spatial and temporal resolution (HSTR) optical remote sensing data with a crop model and a large set of in-situ measurements for model calibration and validation. The study area is located in southwest France, and the model that we evaluate, called SAFY-CO2, is a semi-empirical one based on Monteith's light-use efficiency theory and adapted for simulating the components of the net ecosystem CO2 fluxes (NEE) and of the annual net ecosystem carbon budgets (NECB) at a daily time step. The approach is based on the assimilation of satellite-derived green area index (GAI) maps for calibrating a number of the SAFY-CO2 parameters linked to crop phenology. HSTR data from the Formosat-2 and SPOT satellites were used to produce the GAI maps. The experimental data set includes eddy covariance measurements of net CO2 fluxes from two experimental sites, partitioned into gross primary production (GPP) and ecosystem respiration (Reco). It also includes measurements of GAI, biomass and yield between 2005 and 2011, focusing on the winter wheat crop. The results showed that the SAFY-CO2 model correctly reproduced the biomass production, its dynamics and the yield (relative errors about 24%) in contrasting climatic, environmental and management conditions. The net CO2 flux components estimated with the model were overall in agreement with the ground data, presenting good correlations (R² about 0.93 for GPP, 0.77 for Reco and 0.86 for NEE). The evaluation of the modelled NECB for the different site-years highlighted the importance of having accurate estimates of each component of the NECB. Future work aims at considering
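
    For orientation, a generic light-use efficiency formulation of the kind referred to above can be written as (schematic only; the actual SAFY-CO2 equations, stress functions and sign conventions may differ):

        \[ \mathrm{GPP} \;=\; \varepsilon \cdot f_{\mathrm{APAR}} \cdot \mathrm{PAR}, \qquad \mathrm{NEE} \;=\; R_{\mathrm{eco}} - \mathrm{GPP}, \]

    where \varepsilon is the light-use efficiency, f_{\mathrm{APAR}} the fraction of absorbed photosynthetically active radiation and PAR the incident photosynthetically active radiation; the NECB additionally accounts for lateral carbon fluxes such as harvest export and organic inputs.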

  11. Budgeting based on need: a model to determine sub-national allocation of resources for health services in Indonesia

    Directory of Open Access Journals (Sweden)

    Ensor Tim

    2012-08-01

    Full Text Available Background: Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier-induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low- and middle-income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. Methods: A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation, including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level. It is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province-specific data on the distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Findings: Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population
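    To make the bottom-up logic concrete, the toy calculation below prices a hypothetical service package for two districts as the sum over services of target population, expected coverage and unit cost, scaled by a remoteness multiplier. Every service name, rate and cost in it is an invented placeholder, not a figure from the Indonesian costing study.

```python
# Hypothetical bottom-up costing of a basic service package per district.
# Service names, coverage rates, unit costs and remoteness multipliers are
# illustrative assumptions only.
services = [
    # (name, fraction of population in target group, expected coverage, unit cost USD)
    ("antenatal care",       0.040, 0.90, 12.0),
    ("child immunisation",   0.100, 0.95,  8.0),
    ("TB case management",   0.002, 0.70, 60.0),
    ("outpatient curative",  1.000, 0.60,  3.5),
]

def package_cost(population, remoteness_multiplier=1.0):
    """Total and per-capita cost of delivering the package in one district."""
    total = 0.0
    for name, target_frac, coverage, unit_cost in services:
        total += population * target_frac * coverage * unit_cost
    total *= remoteness_multiplier          # higher delivery costs in sparse areas
    return total, total / population

for district, pop, remote in [("urban district", 500_000, 1.0),
                              ("remote district", 120_000, 2.2)]:
    total, per_capita = package_cost(pop, remote)
    print(f"{district}: USD {total:,.0f} total, USD {per_capita:.2f} per capita")
```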

  12. Global model simulations of the impact of ocean-going ships on aerosols, clouds, and the radiation budget

    Science.gov (United States)

    Lauer, A.; Eyring, V.; Hendricks, J.; Jöckel, P.; Lohmann, U.

    2007-07-01

    International shipping contributes significantly to the fuel consumption of all transport-related activities. Specific emissions of pollutants such as sulfur dioxide (SO2) per kg of fuel emitted are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing incoming solar radiation and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to show that emissions from ships significantly increase the cloud droplet number concentration of low maritime water clouds. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to a cloud optical thickness increase of up to 5-10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% to the total sulfate burden and 0.4% to 1.4% to the total black carbon burden in the year 2000. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8-10% depending on the emission inventory. Changes in aerosol optical thickness caused by the shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change in the net top-of-the-atmosphere (ToA) clear-sky radiation of about -0.013 W/m2 to -0.036 W/m2 on global annual average. The estimated all-sky direct aerosol effect calculated from these changes ranges between -0.009 W/m2 and -0.014 W/m2. The indirect aerosol effect of ships
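    The droplet-number effect described above can be sanity-checked with the classic Twomey scaling: at fixed liquid water content the effective radius varies as (LWC/N)^(1/3), so cloud optical thickness varies roughly as N^(1/3). The snippet below applies that textbook relation; it is only a back-of-the-envelope check, not the ECHAM5/MESSy1-MADE cloud microphysics.

```python
def optical_thickness_change(cdnc_increase_percent):
    """Relative change in cloud optical thickness for a given relative change in
    cloud droplet number concentration (CDNC), at fixed liquid water content.

    Uses the classic Twomey scaling tau ~ N**(1/3); this is a textbook
    approximation, not the parameterization used in the cited model."""
    factor = 1.0 + cdnc_increase_percent / 100.0
    return (factor ** (1.0 / 3.0) - 1.0) * 100.0

for dn in (5, 15, 30):
    print(f"CDNC +{dn}% -> optical thickness {optical_thickness_change(dn):+.1f}%")
```

A 30% increase in droplet number gives roughly a 9% increase in optical thickness under this scaling, consistent in magnitude with the 5-10% range reported above.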

  13. Global model simulations of the impact of ocean-going ships on aerosols, clouds, and the radiation budget

    Directory of Open Access Journals (Sweden)

    A. Lauer

    2007-10-01

    Full Text Available International shipping contributes significantly to the fuel consumption of all transport-related activities. Specific emissions of pollutants such as sulfur dioxide (SO2) per kg of fuel emitted are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing the solar and thermal radiation and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to study the climate impacts of international shipping. The simulations show that emissions from ships significantly increase the cloud droplet number concentration of low marine water clouds by up to 5% to 30% depending on the ship emission inventory and the geographic region. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to a cloud optical thickness increase of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% of the total sulfate burden and 0.4% to 1.4% of the total black carbon burden in the year 2000 on the global mean. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10% depending on the emission inventory. Changes in aerosol optical thickness caused by the shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change in the shortwave radiation budget at the top of the

  14. Fractional Order Modelling Using State Space Theory

    Directory of Open Access Journals (Sweden)

    Pritesh Shah

    2013-06-01

    Full Text Available Various fractional order systems exist. This paper deals with the modelling of fractional order systems using an old and unique model structure, i.e. the state space model. A fractional order process system can be mathematically modelled by a state space model. Simulation results validated that the fractional order model using state space compares favourably with other models such as a first order model with delay.
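    As a concrete illustration of what a fractional order state space model looks like, the sketch below simulates D^alpha x = A x + B u with a Grunwald-Letnikov discretization. The explicit scheme, step size and system matrices are textbook-style assumptions chosen for demonstration, not the formulation used in the paper.

```python
import numpy as np

def simulate_fractional_ss(A, B, u, alpha, h, x0):
    """Simulate D^alpha x = A x + B u using a Grunwald-Letnikov discretization.

    A, B  : state-space matrices (numpy arrays)
    u     : input sequence, shape (N, m)
    alpha : fractional order (0 < alpha <= 1)
    h     : time step
    x0    : initial state
    This explicit scheme is a common textbook approximation, shown only to
    illustrate the idea of a fractional-order state-space model.
    """
    N, n = len(u), len(x0)
    x = np.zeros((N, n))
    x[0] = x0
    # Grunwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j), computed recursively
    c = np.zeros(N)
    c[0] = 1.0
    for j in range(1, N):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    for k in range(1, N):
        memory = sum(c[j] * x[k - j] for j in range(1, k + 1))
        x[k] = h ** alpha * (A @ x[k - 1] + B @ u[k - 1]) - memory
    return x

# Toy first-order fractional system: D^0.8 x = -x + u, with a unit step input
A = np.array([[-1.0]]); B = np.array([[1.0]])
u = np.ones((200, 1))
x = simulate_fractional_ss(A, B, u, alpha=0.8, h=0.05, x0=np.array([0.0]))
print("final state:", x[-1])
```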

  15. General autocatalytic theory and simple model of financial markets

    Science.gov (United States)

    Thuy Anh, Chu; Lan, Nguyen Tri; Viet, Nguyen Ai

    2015-06-01

    The concept of autocatalytic theory has become a powerful tool for understanding evolutionary processes in complex systems. A generalization of autocatalytic theory is obtained by assuming that the initial element is described by a distribution instead of a constant value, as in the traditional theory. This initial condition implies that the final element may have a distribution too. A simple physics model for financial markets is proposed using this general autocatalytic theory. Some general behaviours of the evolution process and the risk moment of a financial market are also investigated in the framework of this simple model.
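    The point about distributed initial conditions can be illustrated with the most elementary autocatalytic rate law, dx/dt = kx, whose solution is x(t) = x0*exp(kt). The Monte Carlo sketch below samples x0 from a lognormal distribution and propagates it; the rate constant, horizon and choice of distribution are arbitrary, and the snippet is not the financial-market model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def autocatalytic_final(x0, k=0.05, t=50.0):
    """Closed-form solution of the elementary autocatalytic rate law dx/dt = k*x.
    Parameter values are arbitrary and purely illustrative."""
    return x0 * np.exp(k * t)

# Traditional setting: a single fixed initial value -> a single final value
print("fixed x0 = 1.0  ->", autocatalytic_final(1.0))

# Generalized setting: the initial element is a distribution, so the final
# element is a distribution too (here a scaled lognormal, by linearity).
x0_samples = rng.lognormal(mean=0.0, sigma=0.3, size=100_000)
final = autocatalytic_final(x0_samples)
print("distributed x0  -> final mean %.2f, std %.2f" % (final.mean(), final.std()))
```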

  16. Quantum Link Models and Quantum Simulation of Gauge Theories

    International Nuclear Information System (INIS)

    This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing and introduces pioneers of quantum computing and quantum simulations of quantum spin systems. The second part is about High-Temperature Superconductors versus QCD, Wilson’s Lattice QCD and Abelian Quantum Link Models. The third part deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking ‘Nuclear’ physics and the continuum limit of D-theory models. (nowak)

  17. Non-linear sigma-models and string theories

    Energy Technology Data Exchange (ETDEWEB)

    Sen, A.

    1986-10-01

    The connection between sigma-models and string theories is discussed, as well as how the sigma-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs. (LEW)

  18. Global atmospheric budget of acetaldehyde: 3-D model analysis and constraints from in-situ and satellite observations

    Directory of Open Access Journals (Sweden)

    D. B. Millet

    2010-04-01

    Full Text Available We construct a global atmospheric budget for acetaldehyde using a 3-D model of atmospheric chemistry (GEOS-Chem), and use an ensemble of observations to evaluate present understanding of its sources and sinks. Hydrocarbon oxidation provides the largest acetaldehyde source in the model (128 Tg a−1, a factor of 4 greater than the previous estimate), with alkanes, alkenes, and ethanol the main precursors. There is also a minor source from isoprene oxidation. We use an updated chemical mechanism for GEOS-Chem, and photochemical acetaldehyde yields are consistent with the Master Chemical Mechanism. We present a new approach to quantifying the acetaldehyde air-sea flux based on the global distribution of light absorption due to colored dissolved organic matter (CDOM) derived from satellite ocean color observations. The resulting net ocean emission is 57 Tg a−1, the second largest global source of acetaldehyde. A key uncertainty is the acetaldehyde turnover time in the ocean mixed layer, with quantitative model evaluation over the ocean complicated by known measurement artifacts in clean air. Simulated concentrations in surface air over the ocean generally agree well with aircraft measurements, though the model tends to overestimate the vertical gradient. PAN:NOx ratios are well-simulated in the marine boundary layer, providing some support for the modeled ocean source. We introduce the Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1) for acetaldehyde and ethanol and use it to quantify their net flux from living terrestrial plants. Including emissions from decaying plants, the total direct acetaldehyde source from the land biosphere is 23 Tg a−1. Other terrestrial acetaldehyde sources include biomass burning (3 Tg a−1) and anthropogenic emissions (2 Tg a−1). Simulated concentrations in the continental boundary layer are generally unbiased and capture the spatial

  19. Error budget analysis of SCIAMACHY limb ozone profile retrievals using the SCIATRAN model

    Directory of Open Access Journals (Sweden)

    N. Rahpoe

    2013-10-01

    Full Text Available A comprehensive error characterization of SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY) limb ozone profiles has been established based upon SCIATRAN transfer model simulations. The study was carried out in order to evaluate the possible impact of parameter uncertainties, e.g. in albedo, stratospheric aerosol optical extinction, temperature, pressure, pointing, and ozone absorption cross section, on the limb ozone retrieval. Together with the a posteriori covariance matrix available from the retrieval, total random and systematic errors are defined for SCIAMACHY ozone profiles. Main error sources are the pointing errors, errors in the knowledge of stratospheric aerosol parameters, and cloud interference. Systematic errors are of the order of 7%, while the random error amounts to 10–15% for most of the stratosphere. These numbers can be used for the interpretation of instrument intercomparison and validation of the SCIAMACHY V 2.5 limb ozone profiles in a rigorous manner.
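    A common convention in error budgets of this kind is to combine independent contributions in quadrature while keeping random and systematic terms separate. The snippet below shows that root-sum-square bookkeeping with made-up component values whose totals merely echo the magnitudes quoted above; it is not the SCIATRAN-based error propagation itself.

```python
import math

def combine_in_quadrature(components_percent):
    """Root-sum-square combination of independent error contributions (in %)."""
    return math.sqrt(sum(c * c for c in components_percent))

# Illustrative contributions to a limb ozone-profile error budget; the individual
# numbers are invented, only the overall magnitudes echo the abstract above.
random_components = [8.0, 6.0, 5.0]       # e.g. noise, pointing, aerosol parameters
systematic_components = [5.0, 4.0, 2.5]   # e.g. cross sections, albedo, cloud interference

print("total random error     ~ %.1f %%" % combine_in_quadrature(random_components))
print("total systematic error ~ %.1f %%" % combine_in_quadrature(systematic_components))
```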

  20. KAOS: A Kinetic Theory Tool for Modeling Complex Social Systems

    Directory of Open Access Journals (Sweden)

    Bruneo Dario

    2016-01-01

    Full Text Available The kinetic theory approach is successfully used to model complex phenomena related to social systems, allowing the prediction of the dynamics and emergent behavior of large populations of agents. In particular, kinetic theory for active particles (KTAP) models are usually analyzed by numerically solving the underlying Boltzmann-type differential equations through ad-hoc implementations. In this paper, we present KAOS: a kinetic theory of active particles modeling and analysis software tool. To the best of our knowledge, KAOS represents the first attempt to design and implement a comprehensive tool that assists the user in all the steps of the modeling process in the framework of the kinetic theories, from the model definition to the representation of transient solutions. To show the KAOS features, we present a new model capturing the competition/cooperation dynamics of a socio-economic system with welfare dynamics, in different socio-political conditions.

  1. An Application of Rough Set Theory to Modelling and Utilising Data Warehouses

    Institute of Scientific and Technical Information of China (English)

    DENG Ming-rong; YANG Jian-bo; PAN Yun-he

    2001-01-01

    A data warehouse often accommodates enormous amounts of summary information at various granularities and is mainly used to support on-line analytical processing. Ideally, all detailed data should be accessible, residing in legacy systems or on-line transaction processing systems. In many cases, however, the available data sources are themselves summary data, due to technological problems or budget limits and also because different aggregation hierarchies may be used among various transaction systems. In such circumstances, it is necessary to investigate how to design dimensions, which play a major role in the dimensional model of a data warehouse, and how to estimate summary information that is not stored in the data warehouse. In this paper, rough set theory is applied to support dimension design and information estimation.
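    For readers unfamiliar with rough sets, the sketch below computes the two basic constructs the paper builds on: the lower and upper approximations of a target concept under the indiscernibility relation induced by a set of attributes. The objects and attributes are invented toy records, not the paper's data warehouse dimensions.

```python
from collections import defaultdict

def approximations(objects, attributes, target):
    """Lower and upper rough-set approximations of `target` under the
    indiscernibility relation induced by `attributes`.

    objects    : dict mapping object name -> dict of attribute values
    attributes : attributes used to build equivalence classes
    target     : set of object names to approximate
    """
    classes = defaultdict(set)
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        classes[key].add(name)
    lower, upper = set(), set()
    for eq_class in classes.values():
        if eq_class <= target:
            lower |= eq_class          # certainly in the target concept
        if eq_class & target:
            upper |= eq_class          # possibly in the target concept
    return lower, upper

# Toy "summary data" example with hypothetical records and attributes
objects = {
    "r1": {"region": "north", "size": "large"},
    "r2": {"region": "north", "size": "large"},
    "r3": {"region": "south", "size": "small"},
    "r4": {"region": "south", "size": "large"},
}
lower, upper = approximations(objects, ["region", "size"], target={"r1", "r4"})
print("lower:", sorted(lower), "upper:", sorted(upper))
```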

  2. What Is Your Budget Saying about Your Library?

    Science.gov (United States)

    Jacobs, Leslie; Strouse, Roger

    2002-01-01

    Discusses budgeting for corporate libraries and how to keep budgets from getting cut. Topics include whether the budget is considered corporate overhead; recovering costs; models for content cost recovery; showing return on library investment; marketing library value to senior management; user needs and satisfaction; and comparing budgets to other…

  3. Experimental and modeling study of the impact of vertical transport processes from the boundary-layer on the variability and the budget of tropospheric ozone

    International Nuclear Information System (INIS)

    Closing the tropospheric ozone budget requires a better understanding of the role of transport processes from the major reservoirs: the planetary boundary layer and the stratosphere. Case studies lead to the identification of the mechanisms involved as well as their efficiency. However, their global impact on the budget must be addressed on a climatological basis. This manuscript is thus divided into two parts. First, we present case studies based on ozone LIDAR measurements performed during the ESCOMPTE campaign. This work consists of a data analysis investigation by means of a hybrid-Lagrangian study involving global meteorological analyses, Lagrangian particle dispersion computation, and mesoscale, chemistry-transport, and Lagrangian photochemistry modeling. Our aim is to document the amount of observed ozone variability related to transport processes and, when appropriate, to infer the role of tropospheric photochemical production. Second, we propose a climatological analysis of the respective impact of transport from the boundary layer and from the tropopause region on the tropospheric ozone budget. A multivariate analysis is presented and compared to a trajectography approach. Once validated, this algorithm is applied to the whole database of ozone profiles collected above Europe during the past 30 years in order to discuss the seasonal, geographical and temporal variability of transport processes as well as their impact on the tropospheric ozone budget. The variability of turbulent mixing and its impact on the persistence of tropospheric layers will also be discussed. (author)

  4. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. The semiparametric-theory-based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that the semiparametric-theory-based modeling and kernel estimation are effective in combating this kind of interference.
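    Kernel estimation of an unknown nonlinear term is the nonparametric half of a semiparametric model. The sketch below shows a plain Nadaraya-Watson estimator with a Gaussian kernel recovering a synthetic nonlinear interference function; it is a generic illustration, not the estimator or the MIMO channel model derived in the article.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel: a generic
    nonparametric estimator of an unknown nonlinear function, of the kind used
    in semiparametric models; not the article's exact estimator."""
    diffs = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * diffs ** 2)
    return (weights @ y_train) / weights.sum(axis=1)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(-2, 2, 200))
y = np.sin(2 * x) + 0.1 * rng.normal(size=x.size)   # unknown nonlinear interference + noise
xq = np.linspace(-2, 2, 5)
print("estimate    :", np.round(nadaraya_watson(x, y, xq), 2))
print("ground truth:", np.round(np.sin(2 * xq), 2))
```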

  5. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: approach from a process model and field measurements

    Science.gov (United States)

    Adachi, M.; Ito, A.; Ishida, A.; Kadir, W. R.; Ladpala, P.; Yamagata, Y.

    2010-12-01

    More reliable estimation of the carbon (C) stock within forest ecosystems and of the C emission induced by deforestation is one of the most urgent tasks for model researchers. A process-based terrestrial biogeochemical model (VISIT) was applied to two types of tropical forest (a seasonal dry forest in Thailand and a moist forest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems, including the impacts of land use conversion, in Southeast Asia. Observations and VISIT model simulations indicated that the two types of tropical forest had comparably high photosynthetic uptake; gross primary production was estimated at 32.2-37.6 t C ha-1 y-1. The rain forest had a higher total C stock (plant biomass and soil organic matter, 301.5 t C ha-1) than the seasonal dry forest (266.5 t C ha-1). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land use conversions on the C budget, and a sensitivity analysis implied that the remaining ratio of abandoned debris would be a key parameter determining the soil C budget after deforestation events. Although the model tended to overestimate oil palm biomass in comparison with field data, the C stock of the oil palm plantation was about half of the forest C at 30 years after the plantation was established, when the remaining rate of residual debris was about 34%. These results indicate that adequate forest management is important for reducing C emission from soil, and that the C budget of each ecosystem needs to be evaluated over the long term using both models and observations.

  6. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    Science.gov (United States)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.

  7. Secondary flow structure in a model curved artery: 3D morphology and circulation budget analysis

    Science.gov (United States)

    Bulusu, Kartik V.; Plesniak, Michael W.

    2015-11-01

    In this study, we examined the rate of change of circulation within control regions encompassing the large-scale vortical structures associated with secondary flows, i.e. deformed Dean-, Lyne- and Wall-type (D-L-W) vortices at planar cross-sections in a 180° curved artery model (curvature ratio, 1/7). Magnetic resonance velocimetry (MRV) and particle image velocimetry (PIV) experiments were performed independently, under the same physiological inflow conditions (Womersley number, 4.2) and using Newtonian blood-analog fluids. The MRV-technique performed at Stanford University produced phase-averaged, three-dimensional velocity fields. Secondary flow field comparisons of MRV-data to PIV-data at various cross-sectional planes and inflow phases were made. A wavelet-decomposition-based approach was implemented to characterize various secondary flow morphologies. We hypothesize that the persistence and decay of arterial secondary flow vortices is intrinsically related to the influence of the out-of-plane flow, tilting, in-plane convection and diffusion-related factors within the control regions. Evaluation of these factors will elucidate secondary flow structures in arterial hemodynamics. Supported by the National Science Foundation under Grant Number CBET-0828903, and GW Center for Biomimetics and Bioinspired Engineering (COBRE). The MRV data were acquired at Stanford University in collaboration with Christopher Elkins and John Eaton.
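    A circulation budget starts from the circulation itself, which for planar PIV-like data can be computed as the area integral of the out-of-plane vorticity over the control region (Stokes' theorem). The sketch below does this for a synthetic solid-body-rotation patch; it illustrates only the basic quantity, not the wavelet-based vortex identification or the full budget terms discussed above, and all numbers are arbitrary illustration values.

```python
import numpy as np

def circulation(u, v, dx, dy):
    """Circulation over a rectangular control region of a planar velocity field,
    computed as the area integral of the out-of-plane vorticity (Stokes' theorem)."""
    vorticity = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)
    # plain Riemann-sum quadrature; counts n*dx per side, slightly larger than L
    return np.sum(vorticity) * dx * dy

# Synthetic solid-body rotation patch (vorticity = 2*omega everywhere) as a
# stand-in for a vortex core.
n, L, omega = 101, 0.02, 50.0                   # grid points, domain size (m), rad/s
x = np.linspace(-L / 2, L / 2, n)
X, Y = np.meshgrid(x, x)
u, v = -omega * Y, omega * X
dx = dy = x[1] - x[0]
gamma = circulation(u, v, dx, dy)
print("circulation %.4f m^2/s (analytic 2*omega*L^2 = %.4f)" % (gamma, 2 * omega * L ** 2))
```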

  8. Modeling Carbon and Water Budgets in the Lushi Basin with Biome-BGC

    Institute of Scientific and Technical Information of China (English)

    Dong Wenjuan; Qi Ye; Li Huimin; Zhou Dajie; Shi Duanhua; Sun Liying

    2005-01-01

    In this article, annual evapotranspiration (ET) and net primary productivity (NPP) of four types of vegetation were estimated for the Lushi basin, a subbasin of the Yellow River in China. The four vegetation types are deciduous broadleaf forest, evergreen needleleaf forest, dwarf shrub and grass. Biome-BGC, a biogeochemical process model, was used to calculate annual ET and NPP for each vegetation type in the study area from 1954 to 2000. Daily microclimate data for 47 years monitored by the Lushi meteorological station were extrapolated to cover the basin using MT-CLIM, a mountain microclimate simulator. The output files of MT-CLIM were used to feed Biome-BGC. Average ecophysiological values for each vegetation type supplied by the Numerical Terradynamic Simulation Group (NTSG) at the University of Montana were used as the input ecophysiological constants file. The estimates of daily NPP in early July and annual ET for these four biome groups were compared with field measurements and other studies. Measured daily gross primary production (GPP) of the evergreen needleleaf forest was very close to the output of Biome-BGC, but measurements for the broadleaf forest and dwarf shrub were much smaller than the simulation results. Simulated annual ET and NPP had a significant correlation with precipitation, indicating that precipitation is the major environmental factor affecting ET and NPP in the study area. Precipitation is also the key climatic factor for the interannual ET and NPP variations.

  9. Conditional spatial policy dependence: theory and model specification

    OpenAIRE

    Neumayer, Eric; Plümper, Thomas

    2012-01-01

    The authors discuss how scholars can bring theories of spatial policy dependence and empirical model specifications closer in line so that the empirical analysis actually tests the theoretical predictions. Comprehensive theories of spatial policy dependence typically suggest that the jurisdictions receiving spatial stimuli systematically differ in their exposure to such signals as a function of the intensity of their interaction with other jurisdictions. Similarly, theories often predict that...

  10. Gromov-Witten theory, Hurwitz numbers, and Matrix models, I

    OpenAIRE

    Okounkov, Andrei; Pandharipande, Rahul

    2001-01-01

    The main goal of the paper is to present a new approach via Hurwitz numbers to Kontsevich's combinatorial/matrix model for the intersection theory of the moduli space of curves. A secondary goal is to present an exposition of the circle of ideas involved: Hurwitz numbers, Gromov-Witten theory of the projective line, matrix integrals, and the theory of random trees. Further topics will be treated in a sequel.

  11. Iterated perturbation theory for the attractive Holstein and Hubbard models

    OpenAIRE

    Freericks, J. K.; Jarrell, Mark (Eds. )

    1994-01-01

    A strictly truncated (weak-coupling) perturbation theory is applied to the attractive Holstein and Hubbard models in infinite dimensions. These results are qualified by comparison with essentially exact Monte Carlo results. The second order iterated perturbation theory is shown to be quite accurate in calculating transition temperatures for retarded interactions, but is not as accurate for the self energy or the irreducible vertex functions themselves. Iterated perturbation theory is carried ...

  12. Gutzwiller variational theory for the Hubbard model with attractive interaction.

    Science.gov (United States)

    Bünemann, Jörg; Gebhard, Florian; Radnóczi, Katalin; Fazekas, Patrik

    2005-06-29

    We investigate the electronic and superconducting properties of a negative-U Hubbard model. For this purpose we evaluate a recently introduced variational theory based on Gutzwiller-correlated BCS wavefunctions. We find significant differences between our approach and standard BCS theory, especially for the superconducting gap. For small values of |U|, we derive analytical expressions for the order parameter and the superconducting gap which we compare to exact results from perturbation theory.

  13. Geometry model construction in infrared image theory simulation of buildings

    Institute of Scientific and Technical Information of China (English)

    谢鸣; 李玉秀; 徐辉; 谈和平

    2004-01-01

    Geometric model construction is the basis of infrared image theory simulation. Taking the construction of the geometric model of one building in Harbin as an example, this paper analyzes the theoretical groundings of simplification and principles of geometric model construction of buildings. It then discusses some particular treatment methods in calculating the radiation transfer coefficient in geometric model construction using the Monte Carlo Method.

  14. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  15. The danger model: questioning an unconvincing theory.

    Science.gov (United States)

    Józefowski, Szczepan

    2016-02-01

    Janeway's pattern recognition theory holds that the immune system detects infection through a limited number of so-called pattern recognition receptors (PRRs). These receptors bind specific chemical compounds expressed by entire groups of related pathogens, but not by host cells (pathogen-associated molecular patterns, PAMPs). In contrast, Matzinger's danger hypothesis postulates that products released from stressed or damaged cells have a more important role in the activation of the immune system than the recognition of nonself. These products, named by analogy to PAMPs as danger-associated molecular patterns (DAMPs), are proposed to act through the same receptors (PRRs) as PAMPs and, consequently, to stimulate largely similar responses. Herein, I review direct and indirect evidence that contradicts the widely accepted danger theory, and suggest that it may be false.

  16. Theories and models of globalization ethicizing

    Directory of Open Access Journals (Sweden)

    Dritan Abazović

    2016-05-01

    Full Text Available Globalization as a phenomenon is under the magnifying glass of many philosophical discussions and theoretical deliberations. While most theorists deal with issues that are predominantly of an economic or political character, this article follows a different logic. The article presents six theories which, each in its own way, explain the need to move towards ethicizing globalization. Globalization is a process that affects everyone and as such has become inevitable, but it is up to the people to determine its course and make it either functional or uncontrolled. The survival and development of any society is measured primarily by the quality of its moral and ethical foundation. Therefore, it is clear that a global society can survive and be functional only if it finds a minimum consensus on ethical norms or, as theory puts it, if it establishes the ethical system on which it would be built and developed.

  17. [Models of economic theory of population growth].

    Science.gov (United States)

    Von Zameck, W

    1987-01-01

    "The economic theory of population growth applies the opportunity cost approach to the fertility decision. Variations and differentials in fertility are caused by the available resources and relative prices or by the relative production costs of child services. Pure changes in real income raise the demand for children or the total amount spent on children. If relative prices or production costs and real income are affected together the effect on fertility requires separate consideration." (SUMMARY IN ENG)

  18. Measurement Models for Reasoned Action Theory

    OpenAIRE

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...

  19. The model equation of soliton theory

    OpenAIRE

    Adler, V. E.; Shabat, A. B.

    2007-01-01

    We consider an hierarchy of integrable 1+2-dimensional equations related to Lie algebra of the vector fields on the line. The solutions in quadratures are constructed depending on $n$ arbitrary functions of one argument. The most interesting result is the simple equation for the generating function of the hierarchy which defines the dynamics for the negative times and also has applications to the second order spectral problems. A rather general theory of integrable 1+1-dimensional equations c...

  20. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions. PMID:22934539

  1. Theory and model use in social marketing health interventions.

    Science.gov (United States)

    Luca, Nadina Raluca; Suggs, L Suzanne

    2013-01-01

    The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.

  2. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael;

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
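    As a minimal sketch of the idea (not the authors' implementation), the code below trains a first-order Markov chain on a sequence of player actions and measures the average number of bits per action needed to encode a sequence under that model; a routinized, repetitive action stream is cheap to encode, while an exploratory one is not. The smoothing constant, toy sequences and bits-per-action measure are assumptions made for the example.

```python
import math
from collections import defaultdict

def train_markov(sequence, alpha=1.0):
    """First-order Markov transition probabilities with Laplace smoothing."""
    states = sorted(set(sequence))
    counts = defaultdict(lambda: defaultdict(float))
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1.0
    model = {}
    for a in states:
        total = sum(counts[a].values()) + alpha * len(states)
        model[a] = {b: (counts[a][b] + alpha) / total for b in states}
    return model

def cross_entropy(model, sequence):
    """Average number of bits per action the model needs to encode the sequence."""
    bits = [-math.log2(model[a][b]) for a, b in zip(sequence, sequence[1:])]
    return sum(bits) / len(bits)

# Toy action logs: a routinized player repeats a fixed pattern, an explorer does not.
routine  = list("ABCABCABCABCABCABC")
explorer = list("ABACCBACBBCACBACAB")
model = train_markov(routine)
print("routine under its own model: %.2f bits/action" % cross_entropy(model, routine))
print("explorer under same model  : %.2f bits/action" % cross_entropy(model, explorer))
```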

  3. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    Directory of Open Access Journals (Sweden)

    M. Adachi

    2011-03-01

    Full Text Available More reliable estimates of carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems, including the impacts of land-use conversion, in Southeast Asia. Observations and VISIT model simulations indicated that the primary forests had high photosynthetic uptake: gross primary production was estimated at 31.5–35.5 t C ha−1 yr−1. In the VISIT model simulation, the rainforest had a higher total C stock (plant biomass and soil organic matter, 301.5 t C ha−1) than the seasonal dry forest (266.5 t C ha−1) in 2008. The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of a sensitivity analysis implied that the ratio of remaining residual debris was a key parameter determining the soil C budget after deforestation events. The C stock of the oil palm plantation was about 46% of the rainforest's C at 30 yr following initiation of the plantation, when the ratio of remaining residual debris was assumed to be about 33%. These results show that adequate forest management is important for reducing C emission from soil, and the C budget of each ecosystem must be evaluated over the long term using both model simulations and observations.
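    The sensitivity result above singles out the fraction of residual debris left on site after clearing. The toy bookkeeping below shows why that single number matters: it scales the post-clearing debris carbon pool, which is then assumed to decay exponentially. The initial pool, decay rate and single-pool structure are illustrative simplifications; VISIT itself resolves many more pools and processes.

```python
import numpy as np

def debris_carbon_after_clearing(initial_debris, remaining_ratio, k_decay, years):
    """Very simple bookkeeping of the residual-debris carbon pool after clearing:
    a fraction `remaining_ratio` of the pre-clearing debris stays on site and then
    decays exponentially at rate `k_decay` (per year). Numbers are illustrative only."""
    t = np.arange(years + 1)
    return remaining_ratio * initial_debris * np.exp(-k_decay * t)

pool = debris_carbon_after_clearing(initial_debris=120.0,  # t C/ha, assumed
                                    remaining_ratio=0.33,   # roughly the 33% scenario above
                                    k_decay=0.08, years=30)
print("debris C after 0, 10, 30 yr:", np.round(pool[[0, 10, 30]], 1), "t C/ha")
```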

  4. FY 1996 Congressional budget request: Budget highlights

    Energy Technology Data Exchange (ETDEWEB)

    1995-02-01

    The FY 1996 budget presentation is organized by the Department's major business lines. An accompanying chart displays the request for new budget authority. The report compares the budget request for FY 1996 with the appropriated FY 1995 funding levels displayed on a comparable basis. The FY 1996 budget represents the first year of a five year plan in which the Department will reduce its spending by $15.8 billion in budget authority and by $14.1 billion in outlays. FY 1996 is a transition year as the Department embarks on its multiyear effort to do more with less. The Budget Highlights are presented by business line; however, the fifth business line, Economic Productivity, which is described in the Policy Overview section, cuts across multiple organizational missions, funding levels and activities and is therefore included in the discussion of the other four business lines.

  5. Alluvial and colluvial sediment storage in the Geul River catchment (The Netherlands) — Combining field and modelling data to construct a Late Holocene sediment budget

    Science.gov (United States)

    de Moor, J. J. W.; Verstraeten, G.

    2008-03-01

    We used a combined approach of a two-dimensional erosion and hillslope sediment delivery model (WATEM/SEDEM) and detailed geomorphological reconstructions to quantify the different components in a sediment budget for the Geul River catchment (southern Netherlands) since the High Middle Ages. Hillslope erosion and colluvium deposition were calculated using the model, while floodplain storage was estimated using field data. Our results show that more than 80% of the total sediment production in the catchment has been stored as colluvium (mostly generated by hillslope erosion), while almost 13% is stored in the floodplain since the High Middle Ages (this situation resembles a capacity-limited system). Model results for the period prior to the High Middle Ages (with a nearly completely forested catchment) show that far less sediment was generated and that most of the sediments were directly transported to the main river valleys or out of the catchment (a supply-limited system). Geomorphological analysis of a large alluvial fan shows the sensitivity of the study area to changes in the percentage of arable land. Our combined field data-modeling study presents an elegant method to calculate a catchment sediment budget for a longer period and is able to identify and quantify the most important sediment storage elements. Furthermore, it provides a valuable tool to calculate a sediment budget while only limited dated fluvial sediment sequences are available.

  6. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic

    2009-05-01

    Full Text Available This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys the model specifications and the estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC, Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui, 2002, and Engle, 2002. I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The parameter estimation methods used are maximum log-likelihood (in the BEKK and DVEC models) and a two-step approach (in the CCC model).
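    To give a flavour of this model class, the sketch below iterates a diagonal-VEC-style covariance recursion, H_t = C + A * (eps_{t-1} eps_{t-1}') + B * H_{t-1} with elementwise products, on simulated returns. The parameter matrices are fixed, hand-picked values; in the article they are estimated by maximum log-likelihood, which is not reproduced here.

```python
import numpy as np

def dvec_covariances(returns, C, A, B):
    """Conditional covariance matrices from a diagonal-VEC style recursion
        H_t = C + A * (eps_{t-1} eps_{t-1}') + B * H_{t-1}   (elementwise products).
    Parameters here are fixed illustrative values, not maximum-likelihood estimates."""
    T, k = returns.shape
    H = np.zeros((T, k, k))
    H[0] = np.cov(returns.T)                  # initialise with the sample covariance
    for t in range(1, T):
        eps = returns[t - 1][:, None]         # column vector of lagged innovations
        H[t] = C + A * (eps @ eps.T) + B * H[t - 1]
    return H

rng = np.random.default_rng(1)
returns = rng.normal(scale=0.01, size=(500, 2))    # toy daily log returns, 2 assets
C = np.array([[2e-6, 5e-7], [5e-7, 2e-6]])
A = np.full((2, 2), 0.05)
B = np.full((2, 2), 0.90)
H = dvec_covariances(returns, C, A, B)
corr = H[-1][0, 1] / np.sqrt(H[-1][0, 0] * H[-1][1, 1])
print("last conditional correlation: %.3f" % corr)
```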

  7. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    OpenAIRE

    Adachi, M; Ito, A; A. Ishida; W. R. Kadir; P. Ladpala; Yamagata, Y

    2011-01-01

    More reliable estimates of carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems, including the impacts of...

  8. Carbon budget of tropical forests in Southeast Asia and the effects of deforestation: an approach using a process-based model and field measurements

    OpenAIRE

    Adachi, M; Ito, A; A. Ishida; W. R. Kadir; P. Ladpala; Yamagata, Y

    2011-01-01

    More reliable estimates of the carbon (C) stock within forest ecosystems and C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including...

  9. Resource management model based on budget mechanism

    Institute of Scientific and Technical Information of China (English)

    罗红兵; 王伟; 张晓霞; 武林平

    2011-01-01

    There is a wide gap between existing resource management techniques in batch job systems and actual demands, such as ensuring fairness and reasonableness of resource use and job quality of service (QoS). Based on principles of economics, a resource management model named BB-RAM is presented, which uses a budget mechanism to exert macro-control over the allocation and use of computing resources, ultimately achieving optimal resource use and guaranteed job QoS. Simulations based on real job traces showed that parallel job scheduling under BB-RAM outperforms traditional scheduling strategies both on QoS metrics, such as job delay rate and benefit value, and on traditional metrics such as average turnaround time and slowdown.
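    The sketch below is only a toy proportional-share rule meant to illustrate the general idea of budget-mediated resource allocation: pending jobs receive cores in proportion to their owners' remaining budgets, capped by what they requested. It is not the BB-RAM algorithm or its QoS accounting; the job names, budgets and rounding rule are invented for the example.

```python
def allocate(jobs, total_cores):
    """Share a fixed pool of cores among pending jobs in proportion to each
    user's remaining budget, capped by the job's own request.

    jobs: list of dicts with 'name', 'request' (cores) and 'budget' (remaining credit)
    """
    pending = [j for j in jobs if j["request"] > 0 and j["budget"] > 0]
    total_budget = sum(j["budget"] for j in pending)
    allocation = {}
    for job in sorted(pending, key=lambda j: j["budget"], reverse=True):
        share = int(round(total_cores * job["budget"] / total_budget))
        allocation[job["name"]] = min(job["request"], share)
    # any cores left over after capping could be redistributed; omitted for brevity
    return allocation

jobs = [
    {"name": "climate_run", "request": 64, "budget": 1200.0},
    {"name": "cfd_sweep",   "request": 32, "budget": 400.0},
    {"name": "postproc",    "request": 8,  "budget": 100.0},
]
print(allocate(jobs, total_cores=96))
```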

  10. Classical conformality in the Standard Model from Coleman's theory

    CERN Document Server

    Kawana, Kiyoharu

    2016-01-01

    Classical conformality is one of the possible candidates for explaining the gauge hierarchy of the Standard Model. We show that it is naturally obtained from Coleman's theory of the baby universe.

  11. A price response model developed from perceptual theories

    OpenAIRE

    Gurumurthy, K.; Little, John D. C.; University of Texas at Dallas. Marketing Center.

    2003-01-01

    "June, 1989." "Revised version of a paper originally presented at the Marketing Science in Dallas, March 1986, under the title 'A pricing model based on perception theories and its testing on scanner panel data'."

  12. Convergent perturbation theory for lattice models with fermions

    Science.gov (United States)

    Sazonov, V. K.

    2016-05-01

    The standard perturbation theory in QFT and lattice models leads to asymptotic expansions. However, an appropriate regularization of the path or lattice integrals allows one to construct convergent series with an infinite radius of convergence. In earlier studies, this approach was applied to purely bosonic systems. Here, using bosonization, we develop a convergent perturbation theory for a toy lattice model with interacting fermionic and bosonic fields.

  13. A QCD Model Using Generalized Yang-Mills Theory

    Institute of Scientific and Technical Information of China (English)

    WANG Dian-Fu; SONG He-Shan; KOU Li-Na

    2007-01-01

    Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.

  14. Consumer preference models: fuzzy theory approach

    Science.gov (United States)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  15. A Dynamic Systems Theory Model of Visual Perception Development

    Science.gov (United States)

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  16. Bianchi class A models in Sàez-Ballester's theory

    Science.gov (United States)

    Socorro, J.; Espinoza-García, Abraham

    2012-08-01

    We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for Bianchi type I, II and VIh=-1 models. We also find exact quantum solutions to all Bianchi Class A models employing a particular ansatz for the wave function of the universe.

  17. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications regarding the load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on the trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement- based load modeling are also discussed in the paper.

  18. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Doring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  19. Topological B-model and $\hat{c}=1$ String Theory

    CERN Document Server

    Hyun, S; Park, J D; Yi, S H; Hyun, Seungjoon; Oh, Kyungho; Park, Jong-Dae; Yi, Sang-Heon

    2005-01-01

    We study the topological B-model on a deformed $\mathbb{Z}_2$ orbifolded conifold by investigating the variation of complex structures via quantum Kodaira-Spencer theories. The fermionic/brane formulation together with systematic utilization of symmetries of the geometry gives rise to a free fermion realization of the amplitudes. We derive Ward identities which solve the perturbed free energy exactly. We also obtain the corresponding Kontsevich-like matrix model. All these confirm the recent conjecture on the connection of the theory with $\hat{c}=1$ type 0A string theory compactified at the radius $R=\sqrt{\alpha'/2}$.

  20. The monster sporadic group and a theory underlying superstring models

    International Nuclear Information System (INIS)

    The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs

  1. Vessel Route Choice Theory and Modeling

    NARCIS (Netherlands)

    Shu, Y.; Daamen, W.; Ligteringen, H.; Hoogendoorn, S.P.

    2015-01-01

    A new maritime traffic model describes vessel traffic in ports and inland waterways better. In this research, vessel behavior is categorized into a tactical level (route choice) and an operational level (dynamics of vessel behavior). This new maritime traffic model comprises two parts. The route cho

  2. Optimal transportation networks models and theory

    CERN Document Server

    Bernot, Marc; Morel, Jean-Michel

    2009-01-01

    The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.

  3. Global model simulations of the impact of ocean-going ships on aerosols, clouds, and the radiation budget

    Directory of Open Access Journals (Sweden)

    A. Lauer

    2007-07-01

    Full Text Available International shipping contributes significantly to the fuel consumption of all transport-related activities. Specific emissions of pollutants such as sulfur dioxide (SO2) per kg of fuel emitted are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing incoming solar radiation and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to show that emissions from ships significantly increase the cloud droplet number concentration of low maritime water clouds. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to a cloud optical thickness increase of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% to the total sulfate burden and 0.4% to 1.4% to the total black carbon burden in the year 2000. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10% depending on the emission inventory. Changes in aerosol optical thickness caused by the shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change of the net top-of-the-atmosphere (ToA) clear-sky radiation of about −0.013 W/m2 to −0.036 W/m2 on global annual average. The estimated all-sky direct aerosol effect calculated from these changes ranges between −0

  4. Stratigraphy of two conjugate margins (Gulf of Lion and West Sardinia): modeling of vertical movements and sediment budgets

    Science.gov (United States)

    Leroux, Estelle; Gorini, Christian; Aslanian, Daniel; Rabineau, Marina; Blanpied, Christian; Rubino, Jean-Loup; Robin, Cécile; Granjeon, Didier; Taillepierre, Rachel

    2016-04-01

    The post-rift (~20-0 Ma) vertical movements of the Provence Basin (West Mediterranean) are quantified on both of its conjugate margins (the Gulf of Lion and West Sardinia). This work is based on the stratigraphic study of sedimentary markers using a large 3D grid of seismic data, correlations with existing drillings, and refraction data. The post-rift subsidence is measured by the direct use of sedimentary geometries analysed in 3D [Gorini et al., 2015; Rabineau et al., 2014] and validated by numerical stratigraphic modelling. Three domains were found: on the platform (1) and slope (2), the subsidence takes the form of a seaward tilting with different amplitudes, whereas the deep basin (3) subsides purely vertically [Leroux et al., 2015a]. These domains correspond to the deeper crustal domains respectively highlighted by wide-angle seismic data. The continental crust (1) and the thinned continental crust (2) are tilted, whereas the intermediate crust, identified as exhumed lower continental crust [Moulin et al., 2015; Afilhado et al., 2015] (3), sagged. The post-break-up subsidence re-uses the initial hinge lines of the rifting phase. This striking correlation between surface geologic processes and deep earth dynamic processes emphasizes that the sedimentary record and sedimentary markers are a window into deep geodynamic processes and dynamic topography. Pliocene-Pleistocene seismic markers enabled high resolution quantification of sediment budgets over the past 6 Myr [Leroux et al., in press]. The sediment budget history is here completed for the Miocene interval. Thus, the controlling factors (climate, tectonics and eustasy) are discussed. Afilhado, A., Moulin, M., Aslanian, D., Schnürle, P., Klingelhoefer, F., Nouzé, H., Rabineau, M., Leroux, E. & Beslier, M.-O. (2015). Deep crustal structure across a young passive margin from wide-angle and reflection seismic data (The SARDINIA Experiment) - II. Sardinia's margin. Bull. Soc. géol. France, 186, ILP Spec. issue, 4

  5. Measurement Models for Reasoned Action Theory.

    Science.gov (United States)

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  6. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    Science.gov (United States)

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)
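
    For readers who want the baseline model in concrete form, here is a minimal Python sketch of the dichotomous Rasch item response function (standard textbook form, not code from the article).

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model.

    theta : person ability (logits)
    b     : item difficulty (logits)
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Example: a person 1 logit above an item's difficulty answers correctly ~73% of the time.
print(round(rasch_probability(theta=0.5, b=-0.5), 3))  # ~0.731
```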

  7. The Family FIRO Model: The Integration of Group Theory and Family Theory.

    Science.gov (United States)

    Colangelo, Nicholas; Doherty, William J.

    1988-01-01

    Presents the Family Fundamental Interpersonal Relations Orientation (Family FIRO) Model, an integration of small-group theory and family therapy. The model is offered as a framework for organizing family issues. Discusses three fundamental issues of human relatedness and their applicability to group dynamics. (Author/NB)

  8. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  9. Twinlike models in scalar field theories

    International Nuclear Information System (INIS)

    This work deals with the presence of defect structures in models described by a real scalar field in a diversity of scenarios. The defect structures that we consider are static solutions of the equations of motion that depend on a single spatial dimension. We search for different models, which support the same defect solution, with the very same energy density. We work in flat spacetime, where we introduce and investigate a new class of models. We also work in curved spacetime, within the braneworld context, with a single extra dimension of infinite extent, and there we show how the brane is formed from the static field configuration.

  10. Sticker DNA computer model--Part Ⅰ:Theory

    Institute of Scientific and Technical Information of China (English)

    XU Jin; DONG Yafei; WEI Xiaopeng

    2004-01-01

    The sticker model is one of the basic models of DNA computing. This model is coded with single- and double-stranded DNA molecules. It has the advantages that the operations require no strand extension and use no enzymes; moreover, the materials are reusable. It has therefore attracted the attention and interest of scientists in many fields. In this paper, we systematically analyze the theories and applications of the model, summarize other scientists' contributions in this field, and propose our research results. This paper is the theoretical portion of our work on the sticker model of DNA computing, and includes an introduction to the basic model of sticker computing. Firstly, we systematically introduce the basic theories of the classic models of sticker computing; secondly, we discuss the sticker system, which is an abstract computing model based on the sticker model and formal languages; finally, we extend and perfect the model, and present two types of models that are more extensive in their applications and more complete in their theory than past models: one is the so-called k-bit sticker model, the other is the full-message sticker DNA computing model.
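
    To make the bit encoding concrete, the following Python sketch models a sticker-model memory complex as a fixed-length bit vector with set, clear and separate operations; the class and function names are illustrative and not taken from the paper.

```python
class StickerComplex:
    """Toy sticker-model memory complex: a memory strand with k bit regions.

    A region with a sticker annealed is 'on' (1); a bare region is 'off' (0).
    """

    def __init__(self, k: int, on_regions=()):
        self.bits = [1 if i in set(on_regions) else 0 for i in range(k)]

    def set(self, i: int):      # anneal a sticker to region i
        self.bits[i] = 1

    def clear(self, i: int):    # strip the sticker from region i
        self.bits[i] = 0

    def value(self) -> str:
        return "".join(map(str, self.bits))


def separate(tube, i):
    """Split a tube of complexes on bit i (the model's 'separate' operation)."""
    on = [c for c in tube if c.bits[i] == 1]
    off = [c for c in tube if c.bits[i] == 0]
    return on, off


# Example: a 5-bit complex with bits 0 and 2 set.
c = StickerComplex(5, on_regions=(0, 2))
print(c.value())  # 10100
```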

  11. Homogeneous cosmological models in Yang's gravitation theory

    Science.gov (United States)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  12. Solid mechanics theory, modeling, and problems

    CERN Document Server

    Bertram, Albrecht

    2015-01-01

    This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.

  13. Holographic Models for Theories with Hyperscaling Violation

    CERN Document Server

    Gath, Jakob; Monteiro, Ricardo; Obers, Niels A

    2013-01-01

    We study in detail a variety of gravitational toy models for hyperscaling-violating Lifshitz (hvLif) space-times. These space-times have been recently explored as holographic dual models for condensed matter systems. We start by considering a model of gravity coupled to a massive vector field and a dilaton with a potential. This model supports the full class of hvLif space-times and special attention is given to the particular values of the scaling exponents appearing in certain non-Fermi liquids. We study linearized perturbations in this model, and consider probe fields whose interactions mimic those of the perturbations. The resulting equations of motion for the probe fields are invariant under the Lifshitz scaling. We derive Breitenlohner-Freedman-type bounds for these new probe fields. For the cases of interest the hvLif space-times have curvature invariants that blow up in the UV. We study the problem of constructing models in which the hvLif space-time can have an AdS or Lifshitz UV completion. We also ...

  14. Theory of stellar convection II: first stellar models

    CERN Document Server

    Pasetto, S; Chiosi, E; Cropper, M; Weiss, A

    2015-01-01

    We present here the first stellar models on the Hertzsprung-Russell diagram (HRD), in which convection is treated according to the novel scale-free convection theory (SFC theory) by Pasetto et al. (2014). The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few percent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients of the ambient and of the convective element, and energy fluxes that are very similar to those derived from the "calibrated" ML theory for main s...

  15. Assessing the carbon sink of afforestation with the Carbon Budget Model at the country level: an example for Italy

    Directory of Open Access Journals (Sweden)

    Pilli R

    2015-08-01

    Full Text Available In the context of the Kyoto Protocol, the mandatory accounting of Afforestation and Reforestation (AR) activities requires estimating the forest carbon (C) stock changes for any direct human-induced expansion of forest since 1990. We used the Carbon Budget Model (CBM) to estimate C stock changes and emissions from fires on AR lands at country level. Italy was chosen because it has one of the highest annual rates of AR in Europe and the same model was recently applied to Italy’s forest management area. We considered the time period 1990-2020 with two case studies reflecting different average annual rates of AR: 78 kha yr-1, based on the 2013 Italian National Inventory Report (NIR, official estimates), and 28 kha yr-1, based on the Italian Land Use Inventory System (IUTI) estimates. We compared these two different AR rates with eight regional forest inventories and three independent local studies. The average annual C stock change estimated by CBM, excluding harvest or natural disturbances, was equal to 1738 Gg C yr-1 (official estimates) and 630 Gg C yr-1 (IUTI estimates). Results for the official estimates are consistent with the estimates reported by Italy to the KP for the period 2008-2010; for 2011 our estimates are about 20% higher than the country’s data, probably due to different assumptions on the fire disturbances, the AR rate and the dead wood and litter pools. Furthermore, our analysis suggests that: (i) the impact on the AR sink of different assumptions of species composition is small; (ii) the amount of harvest provided by AR has been negligible in the past (< 3%) and is expected to be small in the near future (up to 8% in 2020); (iii) forest fires up to 2011 had a small impact on the AR sink (on average, < 100 Gg C yr-1). Finally the comparison of the historical AR rates reported by NIR and IUTI with other independent sources gives mixed results: the regional inventories support the AR rates reported by the NIR, while some local studies

  16. 2017 Budget Outlays

    Data.gov (United States)

    Executive Office of the President — This dataset includes three data files that contain an extract of the Office of Management and Budget (OMB) budget database. These files can be used to reproduce...

  17. 2017 Budget Receipts

    Data.gov (United States)

    Executive Office of the President — This dataset includes three data files that contain an extract of the Office of Management and Budget (OMB) budget database. These files can be used to reproduce...

  18. Fiscal Year 2015 Budget

    Data.gov (United States)

    Montgomery County of Maryland — This dataset includes the Fiscal Year 2015 Council-approved operating budget for Montgomery County. The dataset does not include revenues and detailed agency budget...

  19. Electrorheological fluids modeling and mathematical theory

    CERN Document Server

    Růžička, Michael

    2000-01-01

    This is the first book to present a model, based on rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDE systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and a nontrivial nonlinear right-hand side, together with the first ever results for parabolic systems with non-standard growth conditions, are given. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.

  20. Historical analysis and modeling of the forest carbon dynamics using the Carbon Budget Model: an example for the Trento Province (NE, Italy

    Directory of Open Access Journals (Sweden)

    Pilli R

    2014-02-01

    Full Text Available The Carbon Budget Model (CBM-CFS3) developed by the Canadian Forest Service was applied to data collected by the last Italian National Forest Inventory (INFC) for the Trento Province (NE, Italy). CBM was modified and adapted to the different management types (i.e., even-aged high forests, uneven-aged high forests and coppices) and silvicultural systems (including clear cuts, single tree selection systems and thinning) applied in this province. The aim of this study was to provide an example of down-scaling of this model from a national to a regional scale, providing (i) a historical analysis, from 1995 to 2011, and (ii) a projection, from 2012 to 2020, of the evolution of forest biomass and carbon stock. The analysis was based on the harvest rate reported by the Italian National Institute of Statistics (from 1995 to 2011), corrected according to the last INFC data and distinguished between timber and fuel wood and between conifers and broadleaves. From 2012 onwards, we applied a constant harvest rate, equal to about 1300 Mm3 yr-1, estimated from the average harvest rate for the period 2006-2011. Model results were consistent with similar data reported in the literature. The average biomass C stock was 90 Mg C ha-1 and the biomass C stock change was 0.97 Mg C ha-1 yr-1 and 0.87 Mg C ha-1 yr-1 for the periods 1995-2011 and 2012-2020, respectively. The C stock accumulated in timber products since 1995 was 96 Gg C yr-1, i.e., about 28% of the average annual C stock change of the forests, equal to 345 Gg C yr-1. CBM also provided estimates of the evolution of the age class distribution of the even-aged forests and of the C stock of the DOM forest pools (litter, dead wood and soil). This study demonstrates the utility of CBM for providing estimates at a regional or local scale, using not only the data provided by the forest
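
    A quick arithmetic check on the product-pool share quoted above (reader's calculation, not a figure from the paper):

```latex
% Share of the average annual forest C stock change transferred to timber products since 1995:
\frac{96\ \mathrm{Gg\,C\,yr^{-1}}}{345\ \mathrm{Gg\,C\,yr^{-1}}} \approx 0.28 \approx 28\%
```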

  1. Microbial community modeling using reliability theory.

    Science.gov (United States)

    Zilles, Julie L; Rodríguez, Luis F; Bartolerio, Nicholas A; Kent, Angela D

    2016-08-01

    Linking microbial community composition with the corresponding ecosystem functions remains challenging. Because microbial communities can differ in their functional responses, this knowledge gap limits ecosystem assessment, design and management. To develop models that explicitly incorporate microbial populations and guide efforts to characterize their functional differences, we propose a novel approach derived from reliability engineering. This reliability modeling approach is illustrated here using a microbial ecology dataset from denitrifying bioreactors. Reliability modeling is well-suited for analyzing the stability of complex networks composed of many microbial populations. It could also be applied to evaluate the redundancy within a particular biochemical pathway in a microbial community. Reliability modeling allows characterization of the system's resilience and identification of failure-prone functional groups or biochemical steps, which can then be targeted for monitoring or enhancement. The reliability engineering approach provides a new perspective for unraveling the interactions between microbial community diversity, functional redundancy and ecosystem services, as well as practical tools for the design and management of engineered ecosystems. PMID:26882268
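
    As an illustration of the kind of calculation borrowed from reliability engineering, this minimal Python sketch treats functionally redundant populations as parallel components within a serial pathway; the pathway structure and numbers are hypothetical, not taken from the bioreactor dataset.

```python
from math import prod

def parallel(reliabilities):
    """A functional step works if at least one redundant population performs it."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

def series(reliabilities):
    """The pathway works only if every step in the chain works."""
    return prod(reliabilities)

# Hypothetical three-step pathway: each step is carried out by a redundant
# guild of populations, and the steps are chained in series.
step_guilds = [
    [0.80, 0.70, 0.60],   # step 1: three populations able to perform it
    [0.90, 0.50],         # step 2: two populations
    [0.95],               # step 3: a single population -> failure-prone step
]

pathway_reliability = series(parallel(g) for g in step_guilds)
print(round(pathway_reliability, 3))  # ~0.881
```

    Steps served by a single population dominate the overall failure risk, which is the sense in which the approach flags failure-prone functional groups or biochemical steps.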

  2. Theory and Model for Martensitic Transformations

    DEFF Research Database (Denmark)

    Lindgård, Per-Anker; Mouritsen, Ole G.

    1986-01-01

    Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry...

  3. Modeling developmental transitions in adaptive resonance theory

    NARCIS (Netherlands)

    M.E.J. Raijmakers; P.C.M. Molenaar

    2004-01-01

    Neural networks are applied to a theoretical subject in developmental psychology: modeling developmental transitions. Two issues that are involved will be discussed: discontinuities and acquiring qualitatively new knowledge. We will argue that by the appearance of a bifurcation, a neural network can

  4. Study on Strand Space Model Theory

    Institute of Scientific and Technical Information of China (English)

    JI QingGuang(季庆光); QING SiHan(卿斯汉); ZHOU YongBin(周永彬); FENG DengGuo(冯登国)

    2003-01-01

    The growing interest in the application of formal methods to cryptographic protocol analysis has led to the development of a number of different ways of analyzing protocols. In this paper, it is strictly proved that if for any strand there exists at least one bundle containing it, then an entity authentication protocol is secure in the strand space model (SSM) with some small extensions. Unfortunately, the results of attack scenarios demonstrate that this protocol, as well as the Yahalom protocol and its modification, are de facto insecure. By analyzing the reasons for the failure of formal inference in the strand space model, some deficiencies in the original SSM are pointed out. In order to break through these limitations of the analytic capability of SSM, the generalized strand space model (GSSM) induced by some protocol is proposed. In this model, some new classes of strands, oracle strands, high-order oracle strands etc., are developed, and some notions are formalized strictly in GSSM, such as protocol attacks, valid protocol run and successful protocol run. GSSM can then be used to further analyze the entity authentication protocol. This analysis sheds light on why this protocol would be vulnerable, while it illustrates that GSSM not only can prove a security protocol correct, but also can be efficiently used to construct protocol attacks. It is also pointed out that using another protocol to attack some given protocol is essentially the same as the case of using most of the protocol itself.

  5. A catastrophe theory model of the conflict helix, with tests.

    Science.gov (United States)

    Rummel, R J

    1987-10-01

    Macro social field theory has undergone extensive development and testing since the 1960s. One of these has been the articulation of an appropriate conceptual micro model--called the conflict helix--for understanding the process from conflict to cooperation and vice versa. Conflict and cooperation are viewed as distinct equilibria of forces in a social field; the movement between these equilibria is a jump, energized by a gap between social expectations and power, and triggered by some minor event. Quite independently, there also has been much recent application of catastrophe theory to social behavior, but usually without a clear substantive theory and lacking empirical testing. This paper uses catastrophe theory--namely, the butterfly model--mathematically to structure the conflict helix. The social field framework and helix provide the substantive interpretation for the catastrophe theory; and catastrophe theory provides a suitable mathematical model for the conflict helix. The model is tested on the annual conflict and cooperation between India and Pakistan, 1948 to 1973. The results are generally positive and encouraging.
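
    For orientation, the butterfly catastrophe invoked here is governed by the standard one-variable potential with four control parameters (canonical textbook form, up to coefficient conventions; the paper supplies the substantive mapping of the controls onto field quantities such as the gap between social expectations and power).

```latex
% Canonical butterfly catastrophe: one state variable x, four control parameters a, b, c, d.
V(x) = x^{6} + a\,x^{4} + b\,x^{3} + c\,x^{2} + d\,x ,
\qquad
\partial V / \partial x = 0 \ \text{defines the equilibrium surface (conflict and cooperation states).}
```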

  6. $f(T)$ theories from holographic dark energy models

    OpenAIRE

    Huang, Peng; Huang, Yong-Chang(Institute of Theoretical Physics, Beijing University of Technology, 100124, Beijing, China)

    2013-01-01

    We reconstruct $f(T)$ theories from three different holographic dark energy models in different time durations. For the HDE model, the dark energy dominated era, with a new setup, is chosen for reconstruction, and the radiation dominated era is chosen when the model involved changes to NADE. For the RDE model, the radiation, matter and dark energy dominated time durations are all investigated. We also investigate the limitation which prevents an arbitrary choice of the time duration for recons...

  7. Federal budget timetable

    Science.gov (United States)

    This is the federal budget timetable under the Balanced Budget and Emergency Deficit Control Act of 1985 (Gramm-Rudman-Hollings). These deadlines apply to fiscal years (FY) 1987-1991. The deficit reduction measures in Gramm-Rudman-Hollings would lead to a balanced budget in 1991.

  8. Budgeting and Beyond

    DEFF Research Database (Denmark)

    Rohde, Carsten

    Budgets and budget control have been known since the early 19th century. However, until the beginning of the 1920s the use of budget control in the US was primarily related to governmental units and states and only to a minor extent to business units in practice. At that time James McKinsey describes

  9. Theory and Model of Agricultural Insurance Subsidy

    Institute of Scientific and Technical Information of China (English)

    Wan Kailiang; Long Wenjun

    2007-01-01

    The issue of agricultural insurance subsidy is discussed in this paper with the aim of making such subsidies more rational and scientific. The paper starts from the connection between agricultural insurance and financial subsidy. Financial subsidy is necessary and crucial because of the poor operational performance of agricultural insurance, especially in developing countries. But the subsidy should be provided more rationally, because financial subsidy also has many negative effects. A model of competitive insurance markets developed by Ahsan et al. (1982) and a farmers' decision model are developed to solve for the optimal subsidized rate, and an equation to calculate it is obtained. A quantitative subsidized rate is not given here, because the calculation requires some restrictive conditions that are often absent in developing countries. The government should therefore provide some subsidy for the ex ante research and preparation needed to obtain the scientific probability and premium rate.

  10. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  11. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  12. From integrable models to gauge theories Festschrift Matinyan (Sergei G)

    CERN Document Server

    Gurzadyan, V G

    2002-01-01

    This collection of twenty articles in honor of the noted physicist and mentor Sergei Matinyan focuses on topics that are of fundamental importance to high-energy physics, field theory and cosmology. The topics range from integrable quantum field theories, three-dimensional Ising models, parton models and tests of the Standard Model, to black holes in loop quantum gravity, the cosmological constant and magnetic fields in cosmology. A pedagogical essay by Lev Okun concentrates on the problem of fundamental units. The articles have been written by well-known experts and are addressed to graduate

  13. L∞-algebra models and higher Chern-Simons theories

    Science.gov (United States)

    Ritter, Patricia; Sämann, Christian

    2016-10-01

    We continue our study of zero-dimensional field theories in which the fields take values in a strong homotopy Lie algebra. In the first part, we review in detail how higher Chern-Simons theories arise in the AKSZ-formalism. These theories form a universal starting point for the construction of L∞-algebra models. We then show how to describe superconformal field theories and how to perform dimensional reductions in this context. In the second part, we demonstrate that Nambu-Poisson and multisymplectic manifolds are closely related via their Heisenberg algebras. As a byproduct of our discussion, we find central Lie p-algebra extensions of 𝔰𝔬(p + 2). Finally, we study a number of L∞-algebra models which are physically interesting and which exhibit quantized multisymplectic manifolds as vacuum solutions.

  14. Rock mechanics modeling based on soft granulation theory

    CERN Document Server

    Owladeghaffari, H

    2008-01-01

    This paper describes the application of information granulation theory to the design of rock engineering flowcharts. Firstly, an overall flowchart based on information granulation theory is highlighted. Information granulation theory, in crisp (non-fuzzy) or fuzzy format, can take engineering experience (especially fuzzy, incomplete or superfluous information) and engineering judgments into account at each step of the design procedure, while suitable modelling instruments are employed. In this manner, and as an extension of soft modelling instruments, crisp and fuzzy granules are obtained from monitored data sets using three combinations of Self Organizing Map (SOM), Neuro-Fuzzy Inference System (NFIS), and Rough Set Theory (RST). The core of our algorithms is the balancing of crisp (rough or non-fuzzy) granules and sub-fuzzy granules within non-fuzzy information (initial granulation) over the open-close iterations. Using different criteria on balancing best granules (information pock...

  15. Applying learning theories and instructional design models for effective instruction.

    Science.gov (United States)

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.

  16. New theories of root growth modelling

    Science.gov (United States)

    Landl, Magdalena; Schnepf, Andrea; Vanderborght, Jan; Huber, Katrin; Javaux, Mathieu; Bengough, A. Glyn; Vereecken, Harry

    2016-04-01

    In dynamic root architecture models, root growth is represented by moving root tips whose line trajectory results in the creation of new root segments. Typically, the direction of root growth is calculated as the vector sum of various direction-affecting components. However, in our simulations this did not reproduce experimental observations of root growth in structured soil. We therefore developed a new approach to predict the root growth direction. In this approach we distinguish between, firstly, driving forces for root growth, i.e. the force exerted by the root which points in the direction of the previous root segment and gravitropism, and, secondly, the soil mechanical resistance to root growth or penetration resistance. The latter can be anisotropic, i.e. depending on the direction of growth, which leads to a difference between the direction of the driving force and the direction of the root tip movement. Anisotropy of penetration resistance can be caused either by microscale differences in soil structure or by macroscale features, including macropores. Anisotropy at the microscale is neglected in our model. To allow for this, we include a normally distributed random deflection angle α to the force which points in the direction of the previous root segment with zero mean and a standard deviation σ. The standard deviation σ is scaled, so that the deflection from the original root tip location does not depend on the spatial resolution of the root system model. Similarly to the water flow equation, the direction of the root tip movement corresponds to the water flux vector while the driving forces are related to the water potential gradient. The analogue of the hydraulic conductivity tensor is the root penetrability tensor. It is determined by the inverse of soil penetration resistance and describes the ease with which a root can penetrate the soil. By adapting the three dimensional soil and root water uptake model R-SWMS (Javaux et al., 2008) in this way
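
    A minimal numerical sketch of the growth-direction rule described above, in Python: the driving force (previous segment direction plus gravitropism) is perturbed by a normally distributed deflection angle and scaled by a direction-dependent penetrability. The 2-D simplification, parameter names and values are illustrative assumptions, not the R-SWMS implementation.

```python
import numpy as np

def next_tip_direction(prev_dir, g_dir=np.array([0.0, -1.0]),
                       w_prev=1.0, w_grav=0.3, sigma=0.2,
                       penetrability=lambda d: 1.0, rng=np.random.default_rng(0)):
    """Return the unit direction of the next root segment (2-D toy version).

    prev_dir      : unit vector of the previous segment (driving-force direction)
    g_dir         : gravitropism direction (downwards)
    sigma         : std. dev. of the random deflection angle (rad); in the full model
                    it is scaled so the spread is independent of segment length
    penetrability : direction-dependent factor (inverse of penetration resistance)
    """
    alpha = rng.normal(0.0, sigma)                      # random micro-scale deflection
    c, s = np.cos(alpha), np.sin(alpha)
    deflected = np.array([c * prev_dir[0] - s * prev_dir[1],
                          s * prev_dir[0] + c * prev_dir[1]])
    drive = w_prev * deflected + w_grav * g_dir         # driving "force" vector
    moved = penetrability(drive) * drive                # anisotropic soil response
    return moved / np.linalg.norm(moved)

print(next_tip_direction(np.array([0.0, -1.0])))
```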

  17. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    DEFF Research Database (Denmark)

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing;

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
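
    For reference, one common two-parameter Margules form that such FST-derived parameters could feed is shown below; the exact modified variant used in the paper may differ.

```latex
% Two-parameter Margules excess Gibbs energy model for a binary mixture.
\frac{G^{E}}{RT} = x_{1}x_{2}\left(A_{21}x_{1} + A_{12}x_{2}\right),
\qquad
\ln\gamma_{1} = x_{2}^{2}\left[A_{12} + 2\left(A_{21}-A_{12}\right)x_{1}\right],
\quad
\ln\gamma_{2} = x_{1}^{2}\left[A_{21} + 2\left(A_{12}-A_{21}\right)x_{2}\right]
```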

  18. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    J. Schoonenboom; H. Sligte; A. Moghnieh; M. Specht; C. Glahn; K. Stefanov

    2008-01-01

    This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged infrastr

  19. Optimal velocity difference model for a car-following theory

    International Nuclear Information System (INIS)

    In this Letter, we present a new optimal velocity difference model for a car-following theory based on the full velocity difference model. The linear stability condition of the new model is obtained by using the linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of negative velocity occurring at small sensitivity coefficient λ in the full velocity difference model by adjusting the coefficient of the optimal velocity difference, which shows that collision can disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The effects of the optimal velocity difference can avoid the disadvantage of negative velocity.

  20. Optimal velocity difference model for a car-following theory

    Energy Technology Data Exchange (ETDEWEB)

    Peng, G.H., E-mail: pengguanghan@yahoo.com.cn [College of Physics and Electronics, Hunan University of Arts and Science, Changde 415000 (China); Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X. [College of Physics and Electronics, Hunan University of Arts and Science, Changde 415000 (China)

    2011-10-31

    In this Letter, we present a new optimal velocity difference model for a car-following theory based on the full velocity difference model. The linear stability condition of the new model is obtained by using the linear stability theory. The unrealistically high deceleration does not appear in the OVDM. Numerical simulation of traffic dynamics shows that the new model can avoid the disadvantage of negative velocity occurring at small sensitivity coefficient λ in the full velocity difference model by adjusting the coefficient of the optimal velocity difference, which shows that collision can disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow have been explored. → The starting and braking processes were carried out through simulation. → The effects of the optimal velocity difference can avoid the disadvantage of negative velocity.
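
    To make the structure of these car-following models concrete, here is a short Python sketch of the full velocity difference model that the Letter extends, advanced with an explicit Euler step; the optimal-velocity function and all parameter values are illustrative, and the Letter's additional optimal-velocity-difference term is only indicated in a comment.

```python
import math

def V(dx, v_max=2.0, d_safe=2.0):
    """Illustrative optimal velocity function (a common tanh form, not from the Letter)."""
    return 0.5 * v_max * (math.tanh(dx - d_safe) + math.tanh(d_safe))

def fvd_acceleration(dx, v, dv, a=1.0, lam=0.5):
    """Full velocity difference model: a*[V(dx) - v] + lam*dv,
    where dx is the headway and dv the velocity difference to the car ahead.
    The Letter's OVDM adds a further term built from the difference of optimal
    velocities between successive vehicles, with its own coefficient."""
    return a * (V(dx) - v) + lam * dv

# One follower starting at rest behind a leader moving at constant speed.
dt, x_f, v_f, x_l, v_l = 0.1, 0.0, 0.0, 5.0, 1.0
for _ in range(100):
    acc = fvd_acceleration(dx=x_l - x_f, v=v_f, dv=v_l - v_f)
    v_f += acc * dt
    x_f += v_f * dt
    x_l += v_l * dt
print(round(v_f, 2))  # the follower accelerates from rest toward an equilibrium headway and speed
```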

  1. Kinetic theories for spin models for cooperative relaxation dynamics

    Science.gov (United States)

    Pitts, Steven Jerome

    The facilitated kinetic Ising models with asymmetric spin flip constraints introduced by Jackle and co-workers [J. Jackle, S. Eisinger, Z. Phys. B 84, 115 (1991); J. Reiter, F. Mauch, J. Jackle, Physica A 184, 458 (1992)] exhibit complex relaxation behavior in their associated spin density time correlation functions. This includes the growth of relaxation times over many orders of magnitude when the thermodynamic control parameter is varied, and, in some cases, ergodic-nonergodic transitions. Relaxation equations for the time dependence of the spin density autocorrelation function for a set of these models are developed that relate this autocorrelation function to the irreducible memory function of Kawasaki [K. Kawasaki, Physica A 215, 61 (1995)] using a novel diagrammatic series approach. It is shown that the irreducible memory function in a theory of the relaxation of an autocorrelation function in a Markov model with detailed balance plays the same role as the part of the memory function approximated by a polynomial function of the autocorrelation function with positive coefficients in schematic simple mode coupling theories for supercooled liquids [W. Gotze, in Liquids, Freezing and the Glass Transition, D. Levesque, J. P. Hansen, J. Zinn-Justin eds., 287 (North Holland, New York, 1991)]. Sets of diagrams in the series for the irreducible memory function are summed which lead to approximations of this type. The behavior of these approximations is compared with known results from previous analytical calculations and from numerical simulations. For the simplest one dimensional model, relaxation equations that are closely related to schematic extended mode coupling theories [W. Gotze, ibid] are also derived using the diagrammatic series. Comparison of the results of these approximate theories with simulation data shows that these theories improve significantly on the results of the theories of the simple schematic mode coupling theory type. The potential

  2. Network Proactive Defense Model Based on Immune Danger Theory

    OpenAIRE

    Zhenxing Wang; Liancheng Zhang; Yazhou Kong; Yu Wang

    2016-01-01

    Recent investigations into proactive network defense have not produced a systematic methodology and structure; in addition, issues including multi-source information fusion and attacking behavior analysis have not been resolved. Borrowing ideas of danger sensing and immune response from danger theory, a proactive network defense model based on danger theory is proposed. This paper defines the signals and antigens in the network environment as well as attacking behavior analysis algorithm, pro...

  3. Lorentz breaking effective field theory models for matter and gravity: theory and observational constraints

    CERN Document Server

    Liberati, Stefano

    2012-01-01

    A number of different approaches to quantum gravity are at least partly phenomenologically characterized by their treatment of Lorentz symmetry, in particular whether the symmetry is exact or modified/broken at the smallest scales. For example, string theory generally preserves Lorentz symmetry while analog gravity and Lifshitz models break it at microscopic scales. In models with broken Lorentz symmetry there are a vast number of constraints on departures from Lorentz invariance that can be established with low energy experiments by employing the techniques of effective field theory in both the matter and gravitational sectors. We shall review here the low energy effective field theory approach to Lorentz breaking in these sectors and present various constraints provided by available observations.

  4. Bouncing Model in Brane World Theory

    CERN Document Server

    Maier, Rodrigo; Soares, Ivano Damião

    2013-01-01

    We examine the nonlinear dynamics of a closed Friedmann-Robertson-Walker universe in the framework of Brane World formalism with a timelike extra dimension. In this scenario, the Friedmann equations contain additional terms arising from the bulk-brane interaction which provide a concrete model for nonsingular bounces in the early phase of the Universe. We construct a nonsingular cosmological scenario sourced with dust, radiation and a cosmological constant. The structure of the phase space shows a nonsingular orbit with two accelerated phases, separated by a smooth transition corresponding to a decelerated expansion. Given observational parameters we connect such phases to a primordial accelerated phase, a soft transition to Friedmann (where the classical regime is valid), and a graceful exit to a de Sitter accelerated phase.

  5. Improvement of the competitiveness of budget hotels in China based on customer value innovation theory

    Institute of Scientific and Technical Information of China (English)

    裴沛

    2015-01-01

    In recent years, revenue growth at many budget hotels in China has slowed, and some have even run at a loss. To reverse this situation, attract customers and improve competitiveness, these hotels must change their traditional way of thinking and create new value for their guests. Drawing on customer value innovation theory, this paper analyzes in depth the customer needs of China's budget hotels, draws a new budget-hotel customer value curve across ten dimensions, and puts forward several countermeasures for improving competitiveness, such as making the guest room the core product to guarantee product quality, paying attention to service details to raise service quality, and taking online hotel reviews seriously.

  6. Scaling Theory and Modeling of DNA Evolution

    Science.gov (United States)

    Buldyrev, Sergey V.

    1998-03-01

    We present evidence supporting the possibility that the nucleotide sequence in noncoding DNA is power-law correlated. We do not find such long-range correlation in the coding regions of the gene, so we build a ``coding sequence finder'' to locate the coding regions of an unknown DNA sequence. We also propose a different coding sequence finding algorithm, based on the concept of mutual information(I. Große, S. V. Buldyrev, H. Herzel, H. E. Stanley, (preprint).). We describe our recent work on quantification of DNA patchiness, using long-range correlation measures (G. M. Viswanathan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Biophysical Journal 72), 866-875 (1997).. We also present our recent study of the simple repeat length distributions. We find that the distributions of some simple repeats in noncoding DNA have long power-law tails, while in coding DNA all simple repeat distributions decay exponentially. (N. V. Dokholyan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Phys. Rev. Lett (in press).) We discuss several models based on insertion-deletion and mutation-duplication mechanisms that relate long-range correlations in non-coding DNA to DNA evolution. Specifically, we relate long-range correlations in non-coding DNA to simple repeat expansion, and propose an evolutionary model that reproduces the power law distribution of simple repeat lengths. We argue that the absence of long-range correlations in protein coding sequences is related to their highly conserved primary structure which is necessary to insure protein folding.

  7. An Abstraction Theory for Qualitative Models of Biological Systems

    CERN Document Server

    Banks, Richard; 10.4204/EPTCS.40.3

    2010-01-01

    Multi-valued network models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.

  8. Atmospheric deposition impacts on nutrients and biological budgets of the Mediterranean Sea, results from the high resolution coupled model NEMOMED12/PISCES

    Science.gov (United States)

    Richon, Camille; Dutay, Jean-Claude; Dulac, François; Desboeufs, Karine; Nabat, Pierre; Guieu, Cécile; Aumont, Olivier; Palmieri, Julien

    2016-04-01

    Atmospheric deposition is at present not included in regional oceanic biogeochemical models of the Mediterranean Sea, whereas, along with river inputs, it represents a significant source of nutrients at the basin scale, especially through intense desert dust events. Moreover, observations (e.g. DUNE campaign, Guieu et al. 2010) show that these events significantly modify the biogeochemistry of the oligotrophic Mediterranean Sea. We use a high resolution (1/12°) version of the 3D coupled model NEMOMED12/PISCES to investigate the effects of high resolution atmospheric dust deposition forcings on the biogeochemistry of the Mediterranean basin. The biogeochemical model PISCES represents the evolution of 24 prognostic tracers including five nutrients (nitrate, ammonium, phosphate, silicate and iron) and two phytoplankton and two zooplankton groups (Palmiéri, 2014). From decadal simulations (1982-2012) we evaluate the influence of natural dust and anthropogenic nitrogen deposition on the budget of nutrients in the basin and its impact on the biogeochemistry (primary production, plankton distributions...). Our results show that natural dust deposition accounts for 15% of the overall PO4 budget and that it influences primarily the southern part of the basin. Anthropogenic nitrogen accounts for 50% of the bioavailable N supply in the northern part. Deposition events significantly affect biological production; primary productivity enhancement can be as high as 30% in the areas of high deposition, especially during the stratified period. Further developments of the model will include 0D and 1D modeling of bacteria in the frame of the PEACETIME project.

  9. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    Science.gov (United States)

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  10. Summary of papers presented in the Theory and Modelling session

    Directory of Open Access Journals (Sweden)

    Lin-Liu Y.R.

    2012-09-01

    Full Text Available A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17. One Theory and Modelling paper each was included in the ITER ECRH and ECE sessions. Three papers were in the area of nonlinear physics, discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive; three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fluctuations and related phenomena. In this summary, we briefly describe the highlights of these contributions. Finally, the three papers concerning modelling of various aspects of ECE are reported in the ECE session.

  11. Artificial Immune Danger Theory Based Model for Network Security Evaluation

    Directory of Open Access Journals (Sweden)

    Feixian Sun

    2011-02-01

    Full Text Available Inspired by the principles of immune danger theory, a danger theory based model for network security risk assessment is presented in this paper. Firstly, the principle of the danger theory is introduced. And then, with the improved concepts and formal definitions of antigen, antibody, danger signal, and detection lymphocyte for network security risk assessment presented, the distributed architecture of the proposed model is described. Following that, the principle of network intrusion detection is expounded. Finally, the method of network security risk assessment is given. Theoretical analysis and simulation results show that the proposed model can evaluate the network attack threats in real time. Thus, it provides an effective risk evaluation solution to network security.

  12. Perturbation theory for string sigma models

    CERN Document Server

    Bianchi, Lorenzo

    2016-01-01

    In this thesis we investigate quantum aspects of the Green-Schwarz superstring in various AdS backgrounds relevant for the AdS/CFT correspondence, providing several examples of perturbative computations in the corresponding integrable sigma-models. We start by reviewing in detail the supercoset construction of the superstring action in $AdS_5 \times S^5$, pointing out the limits of this procedure for $AdS_4$ and $AdS_3$ backgrounds. For the $AdS_4 \times CP^3$ case we give a thorough derivation of an alternative action, based on the double-dimensional reduction of eleven-dimensional super-membranes. We then consider the expansion about the BMN vacuum and the S-matrix for the scattering of worldsheet excitations in the decompactification limit. To evaluate its elements efficiently we describe a unitarity-based method resulting in a very compact formula yielding the cut-constructible part of any one-loop two-dimensional S-matrix. In the second part of this review we analyze the superstring action on $AdS_4 \ti...

  13. Magnetized cosmological models in bimetric theory of gravitation

    Indian Academy of Sciences (India)

    S D Katore; R S Rane

    2006-08-01

    Bianchi type-III magnetized cosmological model when the field of gravitation is governed by either a perfect fluid or cosmic string is investigated in Rosen's [1] bimetric theory of gravitation. To complete determinate solution, the condition, viz., = (), where is a constant, between the metric potentials is used. We have assumed different equations of state for cosmic string [2] for the complete solution of the model. Some physical and geometrical properties of the exhibited model are discussed and studied.

  14. A Model of Resurgence Based on Behavioral Momentum Theory

    OpenAIRE

    Shahan, Timothy A; Sweeney, Mary M.

    2011-01-01

    Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforc...

  15. N = 2 Supersymmetric Quantum Mechanical Models and Hodge Theory

    CERN Document Server

    Malik, R P

    2012-01-01

    We demonstrate the existence of a novel set of discrete symmetries in the context of N = 2 supersymmetric quantum mechanical (SQM) models in one and two dimensions. We derive the underlying algebra of the continuous symmetry transformations and establish its relevance to the algebraic structures of the de Rham cohomological operators of differential geometry. We further show that the discrete symmetry transformations of our models correspond to the Hodge duality operation so that these models provide interesting examples of Hodge theory.

  16. Hydrodynamics Research on Amphibious Vehicle Systems:Modeling Theory

    Institute of Scientific and Technical Information of China (English)

    JU Nai-jun

    2006-01-01

    To support the hydrodynamics software development and engineering application research on amphibious vehicle systems, the hydrodynamic modeling theory of such systems is elaborated. This includes building the dynamic system models of amphibious vehicle motion on water, gun tracking-aiming-firing, bullet hit and armored target checking, and gunner operating control, as well as the time-domain simulation model for random sea waves.

  17. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
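
    As a toy illustration of the Bayesian learning mechanism discussed here, the following Python sketch updates the posterior over two causal hypotheses (the block does or does not activate the machine) from a handful of binary trials; the priors, likelihoods and data are invented for illustration.

```python
def posterior(prior_h1, p_effect_given_h1, p_effect_given_h0, observations):
    """Sequential Bayes update over two hypotheses from binary observations.

    observations: list of booleans, True = effect observed on that trial.
    """
    p_h1, p_h0 = prior_h1, 1.0 - prior_h1
    for effect in observations:
        like_h1 = p_effect_given_h1 if effect else 1.0 - p_effect_given_h1
        like_h0 = p_effect_given_h0 if effect else 1.0 - p_effect_given_h0
        p_h1, p_h0 = p_h1 * like_h1, p_h0 * like_h0
        total = p_h1 + p_h0
        p_h1, p_h0 = p_h1 / total, p_h0 / total
    return p_h1

# Block placed on the machine 5 times; the machine lit up 4 times.
print(round(posterior(0.5, 0.8, 0.1, [True, True, True, False, True]), 3))  # ~0.999
```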

  18. Theory of compressive modeling and simulation

    Science.gov (United States)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limits on computing power and turnaround time. We suggest a third approach. We call it (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural network (LCNN) algorithm. CM&S based on MFE can generalize LCNN to 2nd order as a nonlinear augmented LCNN. For example, during the sunset, we can avoid a reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon. With CM&S we can take, instead of a day camera, a night vision camera. We decomposed the long-wave infrared (LWIR) band with a filter into 2 vector components (8~10μm and 10~12μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then, we up-shifted consistently, according to the de-mixed sources map, to the sub-micron RGB color image. Moreover, the night vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, suffering less blur owing to dusty smoke scattering and enjoying the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in the reflectivity and another two orders in the propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computing in the sense of super-saving CS: measuring one & getting one's neighborhood free.

  19. Projected Impact of Climate Change on the Energy Budget of the Arctic Ocean by a Global Climate Model

    Science.gov (United States)

    Miller, James R.; Russell, Gary L.; Hansen, James E. (Technical Monitor)

    2001-01-01

    The annual energy budget of the Arctic Ocean is characterized by a net heat loss at the air-sea interface that is balanced by oceanic heat transport into the Arctic. The energy loss at the air-sea interface is due to the combined effects of radiative, sensible, and latent heat fluxes. The inflow of heat by the ocean can be divided into two components: the transport of water masses of different temperatures between the Arctic and the Atlantic and Pacific Oceans and the export of sea ice, primarily through Fram Strait. Two 150-year simulations (1950-2099) of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. One is a control simulation for the present climate with constant 1950 atmospheric composition, and the other is a transient experiment with observed GHGs from 1950 to 1990 and 0.5% annual compounded increases of CO2 after 1990. For the present climate the model agrees well with observations of radiative fluxes at the top of the atmosphere, atmospheric advective energy transport into the Arctic, and surface air temperature. It also simulates the seasonal cycle and summer increase of cloud cover and the seasonal cycle of sea-ice cover. In addition, the changes in high-latitude surface air temperature and sea-ice cover in the GHG experiment are consistent with observed changes during the last 40 and 20 years, respectively. Relative to the control, the last 50-year period of the GHG experiment indicates that even though the net annual incident solar radiation at the surface decreases by 4.6 W m-2 (because of greater cloud cover and increased cloud optical depth), the absorbed solar radiation increases by 2.8 W m-2 (because of less sea ice). Increased cloud cover and warmer air also cause increased downward thermal radiation at the surface so that the net radiation into the ocean increases by 5.0 W m-2. The annual increase in radiation into the ocean, however, is
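
    Piecing the quoted surface numbers together (a reader's arithmetic, not a figure stated in the abstract), the implied net thermal contribution is:

```latex
% Change in net surface radiation = change in absorbed solar + change in net thermal radiation:
\Delta Q_{\mathrm{net}} = \Delta SW_{\mathrm{abs}} + \Delta LW_{\mathrm{net}}
\quad\Longrightarrow\quad
\Delta LW_{\mathrm{net}} \approx 5.0 - 2.8 = 2.2\ \mathrm{W\,m^{-2}}
```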

  20. Budget support, conditionality and poverty.

    OpenAIRE

    Mosley, P.; Suleiman, A.

    2005-01-01

    This paper examines the effectiveness of budget support aid as an anti-poverty instrument. We argue that a major determinant of this effectiveness is the element of trust – or 'social capital', as it may be seen – which builds up between representatives of the donor and recipient. Thus we model the conditionality processes attending budget support aid, not purely in the conventional way as a non-cooperative two-person game, but rather as a non-cooperative game which may mutate into a collabor...

  1. Future projections of the surface heat and water budgets of the Mediterranean Sea in an ensemble of coupled atmosphere-ocean regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, C.; Somot, S.; Deque, M.; Sevault, F. [CNRM-GAME, Meteo-France, CNRS, Toulouse (France); Calmanti, S.; Carillo, A.; Dell' Aquilla, A.; Sannino, G. [ENEA, Rome (Italy); Elizalde, A.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Gualdi, S.; Oddo, P.; Scoccimarro, E. [INGV, Bologna (Italy); L' Heveder, B.; Li, L. [Laboratoire de Meteorologie Dynamique, Paris (France)

    2012-10-15

    Within the CIRCE project "Climate change and Impact Research: the Mediterranean Environment", an ensemble of high resolution coupled atmosphere-ocean regional climate models (AORCMs) is used to simulate the Mediterranean climate for the period 1950-2050. For the first time, realistic net surface air-sea fluxes are obtained. The sea surface temperature (SST) variability is consistent with the atmospheric forcing above it and oceanic constraints. The surface fluxes respond to external forcing under a warming climate and show an equivalent trend in all models. This study focuses on the present day and on the evolution of the heat and water budget over the Mediterranean Sea under the SRES-A1B scenario. In contrast to previous studies, the net total heat budget is negative over the present period in all AORCMs and satisfies the heat budget closure controlled by a net positive heat gain at the Strait of Gibraltar in the present climate. Under the climate change scenario, some models predict a warming of the Mediterranean Sea from the ocean surface (positive net heat flux) in addition to the positive flux at the Strait of Gibraltar for the 2021-2050 period. The shortwave and latent fluxes are increasing and the longwave and sensible fluxes are decreasing compared to the 1961-1990 period, due to a reduction of the cloud cover and an increase in greenhouse gases (GHGs) and SSTs over the 2021-2050 period. The AORCMs provide good estimates of the water budget, with a drying of the region during the twenty-first century. For the ensemble mean, the decrease in precipitation and runoff is about 10 and 15% respectively, and the increase in evaporation is much weaker, about 2% compared to the 1961-1990 period, which confirms results obtained in recent studies. Despite a clear consistency in the trends and results between the models, this study also underlines important differences in the model set-ups, methodology and choices of some physical parameters inducing

  2. Lenses on Reading An Introduction to Theories and Models

    CERN Document Server

    Tracey, Diane H

    2012-01-01

    This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition

  3. Cosmological models in Weyl geometrical scalar-tensor theory

    Science.gov (United States)

    Pucheu, M. L.; Alves Junior, F. A. P.; Barreto, A. B.; Romero, C.

    2016-09-01

    We investigate cosmological models in a recently proposed geometrical theory of gravity, in which the scalar field appears as part of the spacetime geometry. We extend the previous theory to include a scalar potential in the action. We solve the vacuum field equations for different choices of the scalar potential and give a detailed analysis of the solutions. We show that, in some cases, a cosmological scenario is found that seems to suggest the appearance of a geometric phase transition. We build a toy model, in which the accelerated expansion of the early Universe is driven by pure geometry.

  4. Effective Field Theory and the No-Core Shell Model

    Directory of Open Access Journals (Sweden)

    Stetcua I.

    2010-04-01

    In a finite model space suitable for many-body calculations via the no-core shell model (NCSM), I illustrate the direct application of the effective field theory (EFT) principles to solving the many-body Schrödinger equation. Two different avenues for fixing the low-energy constants naturally arising in an EFT approach are discussed. I review results for both nuclear and trapped atomic systems, using effective theories that are formally similar, albeit describing different underlying physics.

  5. A Study of the Stable Boundary Layer Based on a Single-Column K-Theory Model

    Science.gov (United States)

    Sorbjan, Zbigniew

    2012-01-01

    We document numerical experiments with a single-column, high-resolution model of the stable boundary layer. The model resolves the logarithmic layer, and does not require inverting the Monin-Obukhov similarity functions in order to calculate the surface fluxes. The turbulence closure is based on the K-theory approach, with a new form of stability functions of the Richardson number, evaluated by using the Surface Heat Budget of the Arctic Ocean (SHEBA) and the Cooperative Atmosphere-Surface Exchange Study (CASES-99) data. A comparison with two high-resolution large-eddy simulation models shows very good agreement. The reported numerical experiments test the effects of shear, surface cooling, the Coriolis parameter, subsidence, and baroclinicity. The time evolution of the drag coefficient, the heat-transfer coefficient, and the cross-isobar angle is also evaluated.
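
    The K-theory closure described above can be sketched in a few lines: an eddy diffusivity built from a mixing length, the local shear, and a stability function of the gradient Richardson number. The particular stability function, mixing length and profiles below are illustrative assumptions, not the SHEBA/CASES-99-based forms fitted in the paper.

```python
import numpy as np

def eddy_diffusivity(z, u, theta, kappa=0.4, g=9.81, theta0=290.0):
    """Local K-theory closure: K = l^2 * |dU/dz| * f(Ri).

    The stability function f(Ri) = (1 + 5 Ri)^-2 and the Blackadar-type
    mixing length are illustrative choices, not the paper's fitted forms.
    """
    dudz = np.gradient(u, z)
    dthdz = np.gradient(theta, z)
    shear2 = np.maximum(dudz**2, 1e-10)
    ri = (g / theta0) * dthdz / shear2            # gradient Richardson number
    l = kappa * z / (1.0 + kappa * z / 40.0)      # mixing length [m]
    f = (1.0 + 5.0 * np.maximum(ri, 0.0)) ** -2   # damping under stable stratification
    return l**2 * np.sqrt(shear2) * f             # eddy diffusivity [m^2/s]

z = np.linspace(1.0, 200.0, 100)                  # heights above the surface [m]
u = 8.0 * np.log(z / 0.1) / np.log(200.0 / 0.1)   # idealised log wind profile [m/s]
theta = 285.0 + 0.01 * z                          # stably stratified temperature [K]
print(eddy_diffusivity(z, u, theta)[:5])
```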

  6. Theory, modeling and simulation of superconducting qubits

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU

    2011-01-13

    We analyze the dynamics of a qubit-resonator system coupled with a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field, which includes the resonator-drive, the resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency, caused by the qubit-resonator interaction, is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied. The relation between the fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for the phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator. The resonator phase shift can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty, and can be detected using a classical phase-meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier of frequency about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high

  7. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...

  8. Cosmological Model Based on Gauge Theory of Gravity

    Institute of Scientific and Technical Information of China (English)

    WU Ning

    2005-01-01

    A cosmological model based on gauge theory of gravity is proposed in this paper. Combining cosmological principle and field equation of gravitational gauge field, dynamical equations of the scale factor R(t) of our universe can be obtained. This set of equations has three different solutions. A prediction of the present model is that, if the energy density of the universe is not zero and the universe is expanding, the universe must be space-flat, the total energy density must be the critical density ρc of the universe. For space-flat case, this model gives the same solution as that of the Friedmann model. In other words, though they have different dynamics of gravitational interactions, general relativity and gauge theory of gravity give the same cosmological model.

  9. Bridging emotion theory and neurobiology through dynamic systems modeling.

    Science.gov (United States)

    Lewis, Marc D

    2005-04-01

    Efforts to bridge emotion theory with neurobiology can be facilitated by dynamic systems (DS) modeling. DS principles stipulate higher-order wholes emerging from lower-order constituents through bidirectional causal processes--offering a common language for psychological and neurobiological models. After identifying some limitations of mainstream emotion theory, I apply DS principles to emotion-cognition relations. I then present a psychological model based on this reconceptualization, identifying trigger, self-amplification, and self-stabilization phases of emotion-appraisal states, leading to consolidating traits. The article goes on to describe neural structures and functions involved in appraisal and emotion, as well as DS mechanisms of integration by which they interact. These mechanisms include nested feedback interactions, global effects of neuromodulation, vertical integration, action-monitoring, and synaptic plasticity, and they are modeled in terms of both functional integration and temporal synchronization. I end by elaborating the psychological model of emotion-appraisal states with reference to neural processes.

  10. A Model of PCF in Guarded Type Theory

    DEFF Research Database (Denmark)

    Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars

    2015-01-01

    Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.

  11. Integrating Social Capital Theory, Social Cognitive Theory, and the Technology Acceptance Model to Explore a Behavioral Model of Telehealth Systems

    Directory of Open Access Journals (Sweden)

    Chung-Hung Tsai

    2014-05-01

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  12. Integrating social capital theory, social cognitive theory, and the technology acceptance model to explore a behavioral model of telehealth systems.

    Science.gov (United States)

    Tsai, Chung-Hung

    2014-05-07

    Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The finding indicates that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, regarding the samples, the proposed model fitted considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.

  13. MODEL OF DISTRIBUTION OF THE BUDGET OF THE PORTFOLIO OF IT PROJECTS TAKING INTO ACCOUNT THEIR PRIORITY

    OpenAIRE

    Sotnikova, A

    2015-01-01

    The article is devoted to the problem of effectively distributing the overall budget of a portfolio among the IT projects that constitute it, taking into account their priority. The problem is topical in view of the poor performance of consulting companies in the field of information technologies. To determine the priority of IT projects, the method of analytic networks developed by T. Saaty is used. For the purpose of applying this method, a system of criteria (ind...

  14. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of complex systems’ properties that characterize the different theories that deal with transitions to sustainability. We argue that adopting a complexity theory based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.

  15. The early years of string theory: The dual resonance model

    International Nuclear Information System (INIS)

    This paper reviews the past quantum mechanical history of the dual resonance model which is an early string theory. The content of this paper is listed as follows: historical review, the Veneziano amplitude, the operator formalism, the ghost story, and the string story

  16. A Proposed Model of Jazz Theory Knowledge Acquisition

    Science.gov (United States)

    Ciorba, Charles R.; Russell, Brian E.

    2014-01-01

    The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…

  17. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
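
    One simple information theory-based metric of the kind alluded to here is the mutual information between observed and simulated streamflow, estimated from a joint histogram. The estimator and the synthetic series below are illustrative assumptions, not the specific metrics evaluated in this work.

```python
import numpy as np

def mutual_information(obs, sim, bins=20):
    """Mutual information (in bits) between observed and simulated series.

    Simple histogram estimator, used here only as an illustrative
    complementary score to accuracy-based metrics.
    """
    joint, _, _ = np.histogram2d(obs, sim, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
observed = rng.gamma(2.0, 3.0, size=1000)               # synthetic streamflow
simulated = observed * 0.9 + rng.normal(0, 1.0, 1000)   # imperfect model output
print("mutual information [bits]:", mutual_information(observed, simulated))
```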

  18. The Five-Factor Model and Self-Determination Theory

    DEFF Research Database (Denmark)

    Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette;

    This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...

  19. Item Response Theory Modeling of the Philadelphia Naming Test

    Science.gov (United States)

    Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.

    2015-01-01

    Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…

  20. Conceptualizations of Creativity: Comparing Theories and Models of Giftedness

    Science.gov (United States)

    Miller, Angie L.

    2012-01-01

    This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…

  1. A Model to Demonstrate the Place Theory of Hearing

    Science.gov (United States)

    Ganesh, Gnanasenthil; Srinivasan, Venkata Subramanian; Krishnamurthi, Sarayu

    2016-01-01

    In this brief article, the authors discuss Georg von Békésy's experiments showing the existence of traveling waves in the basilar membrane and that maximal displacement of the traveling wave was determined by the frequency of the sound. The place theory of hearing equates the basilar membrane to a frequency analyzer. The model described in this…

  2. Anisotropic cosmological models and generalized scalar tensor theory

    Indian Academy of Sciences (India)

    Subenoy Chakraborty; Batul Chandra Santra; Nabajit Chakravarty

    2003-10-01

    In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the physical parameters and solutions have been discussed.

  3. Unified theory of particle decay modes in electonic model

    Energy Technology Data Exchange (ETDEWEB)

    Jian, C.X.

    1982-06-01

    In a previous paper we have given a reasonable description of the total number of constituents (eletons and antieletons) in Santilli's structure model of hadrons. In this paper we shall extend these results to include decays of unstable hadrons. We shall continue to use the theory of stable and unstable groups with particular reference to Euler's function.

  4. Factorization in Dual Models and Functional Integration in String Theory

    CERN Document Server

    Mandelstam, Stanley

    2008-01-01

    This article contains a summary of the author's contributions, one in collaboration with K. Bardakci, to dual models and string theory prior to the mid-seventies. Other workers' contributions, during and subsequent to this period, are mentioned in order to relate our work to the general development of the subject

  5. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    Science.gov (United States)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
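
    The eigenanalysis step of Model Analysis Theory can be sketched as follows: each student's response pattern becomes a unit "model state" vector, the class density matrix is formed, and its eigenvalues and eigenvectors summarize which mental models dominate and how consistently they are used. The response counts below are hypothetical, and the sketch does not include the paper's numerical treatment of random errors.

```python
import numpy as np

# Hypothetical counts: rows = students, columns = how often each of three
# mental models was used across the multiple-choice items (illustrative only).
counts = np.array([[4, 1, 0],
                   [3, 2, 0],
                   [1, 4, 0],
                   [2, 2, 1]], dtype=float)

# Each student's model state is the unit vector of square-rooted fractions.
u = np.sqrt(counts / counts.sum(axis=1, keepdims=True))

# Class density matrix and its eigenanalysis.
D = u.T @ u / u.shape[0]
eigvals, eigvecs = np.linalg.eigh(D)
order = np.argsort(eigvals)[::-1]
print("eigenvalues (model consistency):", eigvals[order])
print("primary class model vector:", eigvecs[:, order[0]])
```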

  6. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...
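
    A minimal sketch of such a verification is a variance test of replicate results against the budgeted standard uncertainty; the chi-square criterion and the numbers below are illustrative assumptions and not necessarily the exact procedure used by the authors.

```python
import numpy as np

def budget_consistent(replicates, u_budget, chi2_crit):
    """Chi-square variance test of replicate results against a budgeted
    standard uncertainty u_budget (a simple sketch, not necessarily the
    authors' verification procedure)."""
    x = np.asarray(replicates, dtype=float)
    n = x.size
    s = x.std(ddof=1)                       # replicate standard deviation
    t = (n - 1) * s**2 / u_budget**2        # observed vs budgeted variance
    return t <= chi2_crit, s

# Five replicates of a measurand and a budgeted standard uncertainty of 0.3
# (made-up numbers); 9.49 is the 95% chi-square quantile for 4 degrees of freedom.
ok, s = budget_consistent([10.2, 10.5, 9.9, 10.4, 10.1], u_budget=0.3, chi2_crit=9.49)
print(f"replicate s = {s:.3f}; consistent with the uncertainty budget: {ok}")
```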

  7. THE REAL OPTIONS OF CAPITAL BUDGET

    Directory of Open Access Journals (Sweden)

    Antonio Lopo Martins

    2008-07-01

    The traditional techniques of capital budgeting, such as discounted cash flow and net present value, do not incorporate the flexibilities existing in an investment project and tend to distort the value of certain investments, mainly those considered under scenarios of uncertainty and risk. Therefore, this study intends to demonstrate that Real Options Theory (TOR) is a useful methodology to evaluate and indicate the best option for an expansion investment project. To reach this objective, the case study method was used, with the Resort Praia Hotel do Litoral Norte of Salvador as the unit of analysis. The study was developed as follows: first, the traditional net present value was calculated, and then the volatility of each analyzed uncertainty was incorporated. Second, as real options are analogous to financial options, it was necessary to identify the elements that map onto the terminology of financial options in order to obtain the value of the real option. For this, the Black & Scholes option pricing model was used together with a computational simulator (SLS) to obtain the expanded net present value. As a result of this study it was possible to show that, using the traditional capital budgeting tool, the net present value (VPL) is negative, and therefore the expansion project of the hotel would be rejected, whereas applying the TOR methodology the project presents a positive expanded present value, which would represent an excellent investment opportunity. Keywords: capital budgeting, real options, investment analysis.
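
    The option-pricing step of the study can be sketched with the Black & Scholes formula for a European call, where the underlying is the present value of the expansion's cash flows and the exercise price is the investment outlay. The figures below are hypothetical and are not taken from the hotel case; the SLS simulator used in the study is not reproduced here.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black & Scholes value of a European call; in real-options terms S is
    the present value of the project's cash flows and K the expansion cost."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Hypothetical figures (not from the hotel case): PV of expansion cash flows,
# investment outlay, a 3-year option window, 8% risk-free rate, 35% volatility.
S, K = 8.0e6, 9.0e6
static_npv = S - K                                   # traditional NPV, negative here
option_value = black_scholes_call(S, K, T=3.0, r=0.08, sigma=0.35)
# The study's "expanded" value combines the static NPV with the flexibility value.
print(f"static NPV: {static_npv:,.0f}   option (flexibility) value: {option_value:,.0f}")
```

    The point of the sketch is the qualitative result reported in the abstract: a project with a negative traditional NPV can still carry a substantial positive option value once managerial flexibility is priced.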

  8. Alliance: A common factor of psychotherapy modeled by structural theory

    Directory of Open Access Journals (Sweden)

    Wolfgang eTschacher

    2015-04-01

    There is broad consensus that the therapeutic alliance constitutes a core common factor for all modalities of psychotherapy. Meta-analyses have corroborated that alliance, as it emerges from the therapeutic process, is a significant predictor of therapy outcome. Psychotherapy process is traditionally described and explored using two categorially different approaches, the experiential (first-person) perspective and the behavioral (third-person) perspective. We propose to add to this duality a third, structural approach. Dynamical systems theory and synergetics on the one hand and enactivist theory on the other together can provide this structural approach, which contributes in specific ways to a clarification of the alliance factor. Systems theory offers concepts and tools for the modeling of the individual self and, building on this, of alliance processes. In the enactive perspective, the self is conceived as a socially enacted autonomous system that strives to maintain identity by observing a two-fold goal: to exist as an individual self in its own right (distinction) while also being open to others (participation). Using this conceptualization, we formalized the therapeutic alliance as a phase space whose potential minima (attractors) can be shifted by the therapist to approximate therapy goals. This mathematical formalization is derived from probability theory and synergetics. We conclude that structural theory provides powerful tools for modeling how therapeutic change is staged by the formation, utilization, and dissolution of the therapeutic alliance. In addition, we point out novel testable hypotheses and future applications.
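
    The idea of a phase space whose attractors are shifted by the therapist can be illustrated with a toy potential landscape: an overdamped state settles into a minimum, and a control parameter tilts the landscape so that a different minimum becomes dominant. The quartic potential and parameter values below are illustrative assumptions, not the authors' formalization.

```python
def potential(x, tilt):
    """Toy double-well potential V(x); the linear 'tilt' term stands in for an
    intervention that shifts the relative depth of the two attractors."""
    return x**4 - 2.0 * x**2 + tilt * x

def settle(x0, tilt, dt=0.01, steps=5000):
    """Overdamped gradient descent dx/dt = -V'(x): the state relaxes into an
    attractor (potential minimum) of the current landscape."""
    x = x0
    for _ in range(steps):
        x -= dt * (4 * x**3 - 4 * x + tilt)   # -V'(x) with V as defined above
    return x

for tilt in (0.0, 0.8, -0.8):                 # neutral and two tilted landscapes
    print(f"tilt={tilt:+.1f} -> state settles near x={settle(0.1, tilt):+.2f}")
```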

  9. THE NEW CLASSICAL THEORY AND THE REAL BUSINESS CYCLE MODEL

    OpenAIRE

    Oana Simona HUDEA (CARAMAN); Sorin George TOMA; Marin BURCEA

    2014-01-01

    The present paper aims to describe some key elements of the model related to new classical theory, namely the Real Business Cycle model, which mainly describes the economy from the perspective of a perfectly competitive market characterised by price, wage and interest rate flexibility. The rendered impulse-response functions, which help reveal the capacity of the model variables to return to their steady state under the impact of a structural shock, be it technology or monetary policy oriented, ...

  10. Cohomological gauge theory, quiver matrix models and Donaldson-Thomas theory

    NARCIS (Netherlands)

    Cirafici, M.; Sinkovics, A.; Szabo, R.J.

    2009-01-01

    We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques

  11. Cluster variational theory of spin ((3)/(2)) Ising models

    CERN Document Server

    Tucker, J W

    2000-01-01

    A cluster variational method for spin ((3)/(2)) Ising models on regular lattices is presented that leads to results that are exact for Bethe lattices of the same coordination number. The method is applied to both the Blume-Capel (BC) and the isotropic Blume-Emery-Griffiths model (BEG). In particular, the first-order phase line separating the two low-temperature ferromagnetic phases in the BC model, and the ferrimagnetic phase boundary in the BEG model are studied. Results are compared with those of other theories whose qualitative predictions have been in conflict.

  12. Cluster variational theory of spin ((3)/(2)) Ising models

    International Nuclear Information System (INIS)

    A cluster variational method for spin ((3)/(2)) Ising models on regular lattices is presented that leads to results that are exact for Bethe lattices of the same coordination number. The method is applied to both the Blume-Capel (BC) and the isotropic Blume-Emery-Griffiths model (BEG). In particular, the first-order phase line separating the two low-temperature ferromagnetic phases in the BC model, and the ferrimagnetic phase boundary in the BEG model are studied. Results are compared with those of other theories whose qualitative predictions have been in conflict

  13. Modeling DNA loops using the theory of elasticity

    CERN Document Server

    Balaeff, A; Schulten, K; Balaeff, Alexander; Schulten, Klaus

    2003-01-01

    A versatile approach to modeling the conformations and energetics of DNA loops is presented. The model is based on the classical theory of elasticity, modified to describe the intrinsic twist and curvature of DNA, the DNA bending anisotropy, and electrostatic properties. All the model parameters are considered to be functions of the loop arclength, so that the DNA sequence-specific properties can be modeled. The model is applied to the test case study of a DNA loop clamped by the lac repressor protein. Several topologically different conformations are predicted for various lengths of the loop. The dependence of the predicted conformations on the parameters of the problem is systematically investigated. Extensions of the presented model and the scope of the model's applicability, including multi-scale simulations of protein-DNA complexes and building all-atom structures on the basis of the model, are discussed.

  14. Theory and application of experimental model analysis in earthquake engineering

    Science.gov (United States)

    Moncarz, P. D.

    The feasibility and limitations of small-scale model studies in earthquake engineering research and practice is considered with emphasis on dynamic modeling theory, a study of the mechanical properties of model materials, the development of suitable model construction techniques and an evaluation of the accuracy of prototype response prediction through model case studies on components and simple steel and reinforced concrete structures. It is demonstrated that model analysis can be used in many cases to obtain quantitative information on the seismic behavior of complex structures which cannot be analyzed confidently by conventional techniques. Methodologies for model testing and response evaluation are developed in the project and applications of model analysis in seismic response studies on various types of civil engineering structures (buildings, bridges, dams, etc.) are evaluated.

  15. Theory-based Bayesian models of inductive learning and reasoning.

    Science.gov (United States)

    Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles

    2006-07-01

    Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.

  16. Changes in water budgets and sediment yields from a hypothetical agricultural field as a function of landscape and management characteristics--A unit field modeling approach

    Science.gov (United States)

    Roth, Jason L.; Capel, Paul D.

    2012-01-01

    Crop agriculture occupies 13 percent of the conterminous United States. Agricultural management practices, such as crop and tillage types, affect the hydrologic flow paths through the landscape. Some agricultural practices, such as drainage and irrigation, create entirely new hydrologic flow paths upon the landscapes where they are implemented. These hydrologic changes can affect the magnitude and partitioning of water budgets and sediment erosion. Given the wide degree of variability amongst agricultural settings, changes in the magnitudes of hydrologic flow paths and sediment erosion induced by agricultural management practices commonly are difficult to characterize, quantify, and compare using only field observations. The Water Erosion Prediction Project (WEPP) model was used to simulate two landscape characteristics (slope and soil texture) and three agricultural management practices (land cover/crop type, tillage type, and selected agricultural land management practices) to evaluate their effects on the water budgets of and sediment yield from agricultural lands. An array of sixty-eight 60-year simulations were run, each representing a distinct natural or agricultural scenario with various slopes, soil textures, crop or land cover types, tillage types, and select agricultural management practices on an isolated 16.2-hectare field. Simulations were made to represent two common agricultural climate regimes: arid with sprinkler irrigation and humid. These climate regimes were constructed with actual climate and irrigation data. The results of these simulations demonstrate the magnitudes of potential changes in water budgets and sediment yields from lands as a result of landscape characteristics and agricultural practices adopted on them. These simulations showed that variations in landscape characteristics, such as slope and soil type, had appreciable effects on water budgets and sediment yields. As slopes increased, sediment yields increased in both the arid and

  17. 7 CFR 3402.14 - Budget and budget narrative.

    Science.gov (United States)

    2010-01-01

    Agriculture Regulations of the Department of Agriculture (Continued), COOPERATIVE STATE RESEARCH, EDUCATION..., Section 3402.14, Budget and budget narrative: Applicants must prepare the Budget, Form CSREES-2004, and a budget...

  18. Logic without borders essays on set theory, model theory, philosophical logic and philosophy of mathematics

    CERN Document Server

    Hirvonen, Åsa; Kossak, Roman; Villaveces, Andrés

    2015-01-01

    In recent years, mathematical logic has developed in many directions, the initial unity of its subject matter giving way to a myriad of seemingly unrelated areas. The articles collected here, which range from historical scholarship to recent research in geometric model theory, squarely address this development. These articles also connect to the diverse work of Väänänen, whose ecumenical approach to logic reflects the unity of the discipline.

  19. POLITICAL BUDGET CYCLES: EVIDENCE FROM TURKEY

    Directory of Open Access Journals (Sweden)

    FİLİZ ERYILMAZ

    2015-04-01

    The theoretical literature on "Political Business Cycles" presents important insights on the extent to which politicians attempt to manipulate government monetary and fiscal policies to influence electoral outcomes, in particular with the aim of re-election. In recent years, "Political Budget Cycles" has been one of the most important topics in the Political Business Cycles literature. According to Political Budget Cycles Theory, some components of the government budget are influenced by the electoral cycle; consequently, government spending increases or taxes decrease in an election year, leading to a larger fiscal deficit. This fiscal manipulation by the incumbent is a tool that governments possess to increase their chances of re-election. In this paper we investigate the presence of Political Budget Cycles using a data set of budget balance, total expenditure and total revenue over the period 1994–2012. Our findings suggest that incumbents in Turkey use fiscal policy to increase their popularity and win elections; therefore fiscal manipulation was rewarded rather than punished by Turkish voters. This result means that Political Budget Cycles Theory is valid for Turkey between 1994 and 2012.
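
    The standard empirical test behind such findings regresses a fiscal aggregate on an election-year dummy, usually with further controls. The sketch below uses synthetic data and a plain OLS fit; the election years, coefficients and noise level are illustrative assumptions rather than the paper's Turkish data or estimation strategy.

```python
import numpy as np

# Synthetic annual data standing in for the budget balance (% of GDP) and an
# election-year dummy; the paper uses actual data for 1994-2012.
rng = np.random.default_rng(42)
years = np.arange(1994, 2013)
election = np.isin(years, [1995, 1999, 2002, 2007, 2011]).astype(float)  # illustrative
balance = -3.0 - 1.5 * election + rng.normal(0, 0.8, years.size)

# OLS of the balance on a constant and the election dummy: a negative coefficient
# on 'election' is the usual signature of a political budget cycle.
X = np.column_stack([np.ones_like(election), election])
beta, *_ = np.linalg.lstsq(X, balance, rcond=None)
print(f"estimated election-year effect on the balance: {beta[1]:.2f} pp of GDP")
```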

  20. Dynamical influence of gravity waves generated by the Vestfjella Mountains in Antarctica: radar observations, fine-scale modelling and kinetic energy budget analysis

    Directory of Open Access Journals (Sweden)

    Joel Arnault

    2012-02-01

    Gravity waves generated by the Vestfjella Mountains (in western Dronning Maud Land, Antarctica, southwest of the Finnish/Swedish Aboa/Wasa station) have been observed with the Moveable atmospheric radar for Antarctica (MARA) during the SWEDish Antarctic Research Programme (SWEDARP) in December 2007/January 2008. These radar observations are compared with a 2-month Weather Research Forecast (WRF) model experiment operated at 2 km horizontal resolution. A control simulation without orography is also operated in order to separate unambiguously the contribution of the mountain waves on the simulated atmospheric flow. This contribution is then quantified with a kinetic energy budget analysis computed in the two simulations. The results of this study confirm that mountain waves reaching lower-stratospheric heights break through convective overturning and generate inertia gravity waves with a smaller vertical wavelength, in association with a brief depletion of kinetic energy through frictional dissipation and negative vertical advection. The kinetic energy budget also shows that gravity waves have a strong influence on the other terms of the budget, i.e. horizontal advection and horizontal work of pressure forces, so evaluating the influence of gravity waves on the mean flow with the vertical advection term alone is not sufficient, at least in this case. We finally obtain that gravity waves generated by the Vestfjella Mountains reaching lower stratospheric heights generally deplete (create) kinetic energy in the lower troposphere (upper troposphere–lower stratosphere), in contradiction with the usual decelerating effect attributed to gravity waves on the zonal circulation in the upper troposphere–lower stratosphere.

  1. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  2. Matrix models and stochastic growth in Donaldson-Thomas theory

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)

    2012-10-15

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  3. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the sample features better reflect and represent the population. The forewarning level is judged by the maximum probability rule, and management strategies are then proposed for local conditions, with the effect of reducing heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk from 2000 to 2009 against the actual and simulated data, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in results and simple in structure, and that it has strong logical superiority and regional adaptability, providing a new way to forewarn of water pollution risk. PMID: 24194413
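
    The Bayes update and maximum-probability decision described here can be sketched with a toy one-dimensional risk index: a prior over warning levels is combined with a likelihood for the observed index value, and the level with the highest posterior probability is issued. The levels, priors and Gaussian likelihoods below are illustrative assumptions, not the paper's index system for the Taihu Basin.

```python
import numpy as np

# Hypothetical setup: three warning levels with prior probabilities and, for a
# single pollution index, a Gaussian likelihood for each level (made-up numbers;
# the paper builds its indexes from hydrological simulation).
levels = ["light", "moderate", "severe"]
prior = np.array([0.5, 0.3, 0.2])
means, sds = np.array([0.3, 0.6, 0.9]), np.array([0.15, 0.15, 0.15])

def posterior(x):
    """Bayes rule: posterior proportional to prior * likelihood, then normalise."""
    like = np.exp(-0.5 * ((x - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
    post = prior * like
    return post / post.sum()

x_obs = 0.82                       # observed (or simulated) risk index value
post = posterior(x_obs)
print(dict(zip(levels, np.round(post, 3))))
print("forewarning level (maximum-probability rule):", levels[int(post.argmax())])
```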

  4. Matrix models and stochastic growth in Donaldson-Thomas theory

    Science.gov (United States)

    Szabo, Richard J.; Tierz, Miguel

    2012-10-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.

  5. Advertising Budget Allocation under Uncertainty

    OpenAIRE

    Duncan M. Holthausen, Jr.; Gert Assmus

    1982-01-01

    This article presents a model for the allocation of an advertising budget to geographic market segments, or territories, when the sales response to advertising in each segment is characterized by a probability distribution. It is shown that allocation decisions based on the expected sales response may be associated with a relatively large degree of risk and may, therefore, be non-optimal for a risk-averse manager. The model derives an "efficient frontier" in terms of the expected profit and ...
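
    The trade-off the article describes can be sketched with a two-territory allocation under a concave sales response and multiplicative response uncertainty: sweeping a risk-aversion weight traces out an expected-profit versus risk frontier. The square-root response, margins and uncertainty levels below are illustrative assumptions, not the article's model.

```python
import numpy as np

# Two territories with a concave (square-root) sales response to advertising
# and multiplicative response uncertainty; all numbers are illustrative.
budget = 100.0
margin = np.array([12.0, 9.0])      # expected gross margin per sqrt($ spent)
sigma = np.array([0.10, 0.40])      # relative uncertainty of each response

b1 = np.linspace(1.0, budget - 1.0, 500)      # candidate allocations to territory 1
alloc = np.column_stack([b1, budget - b1])
mean_profit = (margin * np.sqrt(alloc)).sum(axis=1) - budget
var_profit = ((margin * np.sqrt(alloc) * sigma) ** 2).sum(axis=1)

for lam in (0.0, 0.05, 0.2):        # increasing risk aversion
    i = np.argmax(mean_profit - lam * var_profit)   # mean-variance criterion
    print(f"lambda={lam:>4}: b1={b1[i]:5.1f}  E[profit]={mean_profit[i]:6.1f}  "
          f"sd={np.sqrt(var_profit[i]):5.1f}")
```

    As risk aversion grows, spending shifts toward the territory with the less uncertain response, even though its expected response is weaker, which is the intuition behind the article's efficient frontier.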

  6. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    Science.gov (United States)

    Strand, Hugo U. R.; Eckstein, Martin; Werner, Philipp

    2015-01-01

    We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium "phase diagrams" that map out the different dynamical regimes.

  7. A Model-Theoretic Framework for Theories of Syntax

    CERN Document Server

    Rogers, J

    1996-01-01

    A natural next step in the evolution of constraint-based grammar formalisms from rewriting formalisms is to abstract fully away from the details of the grammar mechanism---to express syntactic theories purely in terms of the properties of the class of structures they license. By focusing on the structural properties of languages rather than on mechanisms for generating or checking structures that exhibit those properties, this model-theoretic approach can offer simpler and significantly clearer expression of theories and can potentially provide a uniform formalization, allowing disparate theories to be compared on the basis of those properties. We discuss $\LKP$, a monadic second-order logical framework for such an approach to syntax that has the distinctive virtue of being superficially expressive---supporting direct statement of most linguistically significant syntactic properties---but having well-defined strong generative capacity---languages are definable in $\LKP$ iff they are strongly context-free. We ...

  8. Should the model for risk-informed regulation be game theory rather than decision theory?

    Science.gov (United States)

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation.

  9. A new formulation of the atmospheric spectral energy budget, with application to two high-resolution general circulation models

    CERN Document Server

    Augier, Pierre

    2012-01-01

    A new formulation of the spectral energy budget of kinetic and available potential energies of the atmosphere is derived, with spherical harmonics as base functions. Compared to previous formulations, there are three main improvements: (i) the topography is taken into account, (ii) the exact three-dimensional advection terms are considered and (iii) the vertical flux is separated from the energy transfer between different spherical harmonics. Using this formulation, results from two different high resolution GCMs are analyzed: the AFES T639L24 and the ECMWF IFS T1279L91. The spectral fluxes show that the AFES, which reproduces realistic horizontal spectra with a $k^{-5/3}$ inertial range at the mesoscales, simulates a strong downscale energy cascade. In contrast, neither the $k^{-5/3}$ vertically integrated spectra nor the downscale energy cascade are produced by the ECMWF IFS.

  10. A hierarchy of energy- and flux-budget (EFB) turbulence closure models for stably stratified geophysical flows

    CERN Document Server

    Zilitinkevich, S S; Kleeorin, N; Rogachevskii, I; Esau, I

    2011-01-01

    In this paper we advance the physical background of the EFB turbulence closure and present its comprehensive description. It is based on four budget equations for the second moments: turbulent kinetic and potential energies (TKE and TPE) and vertical turbulent fluxes of momentum and buoyancy; a new relaxation equation for the turbulent dissipation time-scale; and an advanced concept of the inter-component exchange of TKE. The EFB closure is designed for stratified, rotating geophysical flows from neutral to very stable. In accordance with modern experimental evidence, it allows turbulence to be maintained by the velocity shear at any gradient Richardson number Ri, and distinguishes between two principally different regimes: "strong turbulence" at small Ri and "weak turbulence" at Ri > 1, the latter typical of the free atmosphere or deep ocean, where Pr_T asymptotically increases linearly with increasing Ri, which implies strong suppression of the heat transfer compared to momentum transfer. For use in different applications, the EFB turbulence closure is formulated a...

  11. Nonrelativistic factorizable scattering theory of multicomponent Calogero-Sutherland model

    CERN Document Server

    Ahn, C; Nam, S; Ahn, Changrim; Lee, Kong Ju Bock; Nam, Soonkeon

    1995-01-01

    We relate two integrable models in (1+1) dimensions, namely, multicomponent Calogero-Sutherland model with particles and antiparticles interacting via the hyperbolic potential and the nonrelativistic factorizable S-matrix theory with SU(N)-invariance. We find complete solutions of the Yang-Baxter equations without implementing the crossing symmetry, and one of them is identified with the scattering amplitudes derived from the Schrödinger equation of the Calogero-Sutherland model. This particular solution is of interest in that it cannot be obtained as a nonrelativistic limit of any known relativistic solutions of the SU(N)-invariant Yang-Baxter equations.

  12. The interconnection of wet and dry deposition and the alteration of deposition budgets due to incorporation of new process understanding in regional models

    Science.gov (United States)

    Dennis, R. L.; Bash, J. O.; Foley, K. M.; Gilliam, R.; Pinder, R. W.

    2013-12-01

    Deposition is affected by the chemical and physical processes represented in the regional models as well as source strength. The overall production and loss budget (wet and dry deposition) is dynamically connected and adjusts internally to changes in process representation. In addition, the scrubbing of pollutants from the atmosphere by precipitation is one of several processes that remove pollutants, creating a coupling with the atmospheric aqueous and gas phase chemistry that can influence wet deposition rates in a nonlinear manner. We explore through model sensitivities with the regional Community Multiscale Air Quality (CMAQ) model the influence on wet and dry deposition, and the overall continental nitrogen budget, of changes in three process representations in the model: (1) incorporation of lightning generated NO, (2) improved representation of convective precipitation, and (3) replacement of the typical unidirectional dry deposition of NH3 with a state of the science representation of NH3 bi-directional air-surface exchange. Results of the sensitivity studies will be presented. (1) Incorporation of lightning generated NO significantly reduces a negative bias in summer wet nitrate deposition, but is sensitive to the choice of convective parameterization. (2) Use of a less active trigger of convective precipitation in the WRF meteorological model to reduce summertime precipitation over prediction bias reduces the generation of NO from lightning. It also reduces the wet deposition of nitrate and increases the dry deposition of oxidized nitrogen, as well as changing (reducing) the surface level exposure to ozone. Improvements in the convective precipitation processes also result in more non-precipitating clouds leading to an increase in SO4 production through the aqueous pathway resulting in improvements in summertime SO4 ambient aerosol estimates.(3) Incorporation of state of the science ammonia bi-directional air surface exchange affects both the dry

  13. The iron budget in ocean surface waters in the 20th and 21st centuries: projections by the Community Earth System Model version 1

    Directory of Open Access Journals (Sweden)

    K. Misumi

    2013-05-01

    We investigated the simulated iron budget in ocean surface waters in the 1990s and 2090s using the Community Earth System Model version 1 and the Representative Concentration Pathway 8.5 future CO2 emission scenario. We assumed that exogenous iron inputs did not change during the whole simulation period; thus, iron budget changes were attributed solely to changes in ocean circulation and mixing in response to projected global warming. The model simulated the major features of ocean circulation and dissolved iron distribution for the present climate reasonably well. Detailed iron budget analysis revealed that roughly 70% of the iron supplied to surface waters in high-nutrient, low-chlorophyll (HNLC) regions is contributed by ocean circulation and mixing processes, but the dominant supply mechanism differed in each HNLC region: vertical mixing in the Southern Ocean, upwelling in the eastern equatorial Pacific, and deposition of iron-bearing dust in the subarctic North Pacific. In the 2090s, our model projected an increased iron supply to HNLC surface waters, even though enhanced stratification was predicted to reduce iron entrainment from deeper waters. This unexpected result could be attributed largely to changes in the meridional overturning and gyre-scale circulations that intensified the advective supply of iron to surface waters, especially in the eastern equatorial Pacific. The simulated primary and export productions in the 2090s decreased globally by 6% and 13%, respectively, whereas in the HNLC regions, they increased by 11% and 6%, respectively. Roughly half of the elevated production could be attributed to the intensified iron supply. The projected ocean circulation and mixing changes are consistent with recent observations of responses to the warming climate and with other Coupled Model Intercomparison Project model projections. We conclude that future ocean circulation and mixing changes will likely elevate the iron supply to HNLC

  14. Integrating field measurements, a geomorphological map and stochastic modelling to estimate the spatially distributed rockfall sediment budget of the Upper Kaunertal, Austrian Central Alps

    Science.gov (United States)

    Heckmann, Tobias; Hilger, Ludwig; Vehling, Lucas; Becht, Michael

    2016-05-01

    The estimation of catchment-scale rockfall rates relies on the regionalisation of local measurements. Here, we propose a new framework for such a regionalisation by the example of a case study in the Upper Kaunertal, Austrian Central Alps (62.5 km²). Measurements of rockfall deposition during 12 months onto six collector nets within the study area were combined with published mean annual rates from the literature, and a probability density function was fitted to these data. A numerical model involving a random walk routing scheme and a one-parameter friction model was used to simulate rockfall trajectories, starting from potential rockfall source areas that were delineated from a digital elevation model. Rockfall rates sampled from the fitted probability density function were assigned to these trajectories in order to model the spatial distribution and to estimate the amount of rockfall deposition. By recording all trajectories as edges of a network of raster cells, and by aggregating the latter to landforms (or landform types) as delineated in a geomorphological map of the study area, rockfall sediment flux from sources to different landforms could be quantified. Specifically, the geomorphic coupling of rockfall sources to storage landforms and the glacial and fluvial sediment cascade was investigated using this network model. The total rockfall contribution to the sediment budget of the Upper Kaunertal is estimated at c. 8000 Mg yr⁻¹, 16.5% of which is delivered to the glaciers, and hence to the proglacial zone. The network approach is favourable, for example because multiple scenarios (involving different probability density functions) can be calculated on the basis of the same set of trajectories, and because deposits can be back-linked to their respective sources. While the methodological framework constitutes the main aim of our paper, we also discuss how the estimation of the budget can be improved on the basis of spatially distributed production rates.
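    A minimal Monte Carlo sketch (in Python) of the regionalisation step described above, assuming a lognormal fit to the measured deposition rates and hypothetical trajectory endpoints and landform labels; it is not the authors' implementation, only an illustration of sampling rates from a fitted density and aggregating flux by landform type.

      # Illustrative sketch (not the authors' code): regionalising point measurements of
      # rockfall deposition by sampling rates from a fitted probability density and
      # assigning them to simulated trajectories. All numbers are hypothetical.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      # Local measurements: mean annual deposition rates (Mg yr^-1 per source cell),
      # combined from collector nets and literature values (hypothetical numbers).
      observed_rates = np.array([0.02, 0.05, 0.08, 0.11, 0.20, 0.35, 0.60, 1.10])

      # Fit a probability density function to the observations (lognormal assumed here).
      shape, loc, scale = stats.lognorm.fit(observed_rates, floc=0)

      # Each simulated trajectory starts in a source cell and ends on a landform class
      # delineated in the geomorphological map (here encoded as integer labels).
      n_trajectories = 10_000
      end_landform = rng.integers(0, 4, size=n_trajectories)   # 0..3: talus, glacier, channel, other
      landform_names = {0: "talus slope", 1: "glacier", 2: "channel", 3: "other storage"}

      # Assign a rate drawn from the fitted PDF to every trajectory and aggregate
      # the deposition flux per landform type (one Monte Carlo scenario).
      rates = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=n_trajectories, random_state=rng)
      for code, name in landform_names.items():
          flux = rates[end_landform == code].sum()
          print(f"{name}: {flux:.1f} Mg yr^-1 (share {flux / rates.sum():.1%})")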

  15. Budgeting Based on Results

    Science.gov (United States)

    Cooper, Kelt L.

    2011-01-01

    Every program in a school or school district has, or once had, a purpose. The purpose was most likely promoted, argued and debated among school constituencies--parents, teachers, administrators and school board members--before it was eventually approved. This process occurs year after year, budget after budget. In itself, this is not necessarily a…

  16. Learning From Low Budgets

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Chinese filmmakers turn small-budget productions into box-office successes. Organizers of China’s upcoming film festivals are finally giving recognition to the little guys—low-budget films—to encourage a generation of young, talented directors.

  17. Reading Institutional Budgets.

    Science.gov (United States)

    Chabot, Barry

    2001-01-01

    Prepares two tables to illustrate how it is helpful to place worries about much smaller sums in the context of Miami University's overall academic budget. One table summarizes the academic budgets for every department during the 1997-98 academic year and a second contains the income-expense ratios for all Oxford departments over a five-year…

  18. Little string theory from a double-scaled matrix model

    International Nuclear Information System (INIS)

    Following Lin and Maldacena, we find exact supergravity solutions dual to a class of vacua of the plane wave matrix model by solving an electrostatics problem. These are asymptotically near-horizon D0-brane solutions with a throat associated with NS5-brane degrees of freedom. We determine the precise limit required to decouple the asymptotic geometry and leave an infinite throat solution found earlier by Lin and Maldacena, dual to Little String Theory on S⁵. By matching parameters with the gauge theory, we find that this corresponds to a double scaling limit of the plane wave matrix model in which N→∞ and the 't Hooft coupling λ scales as ln⁴(N), which we speculate allows all terms in the genus expansion to contribute even at infinite N. Thus, the double-scaled matrix quantum mechanics gives a Lagrangian description of Little String Theory on S⁵, or equivalently a ten-dimensional string theory with linear dilaton background.

  19. Little String Theory from a Double-Scaled Matrix Model

    CERN Document Server

    Ling, Henry; Mohazab, Ali Reza; Shieh, Hsien-Hang; Van Anders, Greg; Van Raamsdonk, Mark

    2006-01-01

    Following Lin and Maldacena, we find exact supergravity solutions dual to a class of vacua of the plane wave matrix model by solving an electrostatics problem. These are asymptotically near-horizon D0-brane solutions with a throat associated with NS5-brane degrees of freedom. We determine the precise limit required to decouple the asymptotic geometry and leave an infinite throat solution found earlier by Lin and Maldacena, dual to Little String Theory on S^5. By matching parameters with the gauge theory, we find that this corresponds to a double scaling limit of the plane wave matrix model in which N \\to \\infty and the 't Hooft coupling \\lambda scales as \\ln^4(N), which we speculate allows all terms in the genus expansion to contribute even at infinite N. Thus, the double-scaled matrix quantum mechanics gives a Lagrangian description of Little String Theory on S^5, or equivalently a ten-dimensional string theory with linear dilaton background.

  20. Circuit theory and model-based inference for landscape connectivity

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.

    2013-01-01

    Circuit theory has seen extensive recent use in the field of ecology, where it is often applied to study functional connectivity. The landscape is typically represented by a network of nodes and resistors, with the resistance between nodes a function of landscape characteristics. The effective distance between two locations on a landscape is represented by the resistance distance between the nodes in the network. Circuit theory has been applied to many other scientific fields for exploratory analyses, but parametric models for circuits are not common in the scientific literature. To model circuits explicitly, we demonstrate a link between Gaussian Markov random fields and contemporary circuit theory using a covariance structure that induces the necessary resistance distance. This provides a parametric model for second-order observations from such a system. In the landscape ecology setting, the proposed model provides a simple framework where inference can be obtained for effects that landscape features have on functional connectivity. We illustrate the approach through a landscape genetics study linking gene flow in alpine chamois (Rupicapra rupicapra) to the underlying landscape.
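    The effective (resistance) distance mentioned above can be illustrated with a small sketch: for a graph Laplacian L built from conductances, the resistance distance between nodes i and j is L+_ii + L+_jj - 2 L+_ij, where L+ is the Moore-Penrose pseudoinverse. The 4-node network below is hypothetical and the snippet is only illustrative, not the authors' code.

      # Illustrative sketch: resistance distance on a landscape network from the
      # graph Laplacian, the quantity the paper links to a Gaussian Markov random
      # field covariance structure. Conductance values are hypothetical.
      import numpy as np

      # Entry (i, j) is the conductance (1 / resistance) between nodes i and j,
      # derived from landscape characteristics.
      conductance = np.array([
          [0.0, 1.0, 0.5, 0.0],
          [1.0, 0.0, 1.0, 0.2],
          [0.5, 1.0, 0.0, 1.0],
          [0.0, 0.2, 1.0, 0.0],
      ])

      laplacian = np.diag(conductance.sum(axis=1)) - conductance
      l_pinv = np.linalg.pinv(laplacian)          # Moore-Penrose pseudoinverse

      def resistance_distance(i: int, j: int) -> float:
          """Effective resistance between nodes i and j."""
          return l_pinv[i, i] + l_pinv[j, j] - 2.0 * l_pinv[i, j]

      print(resistance_distance(0, 3))  # effective distance between two landscape locations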

  1. Matrix models and stochastic growth in Donaldson-Thomas theory

    CERN Document Server

    Szabo, Richard J

    2010-01-01

    We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating ...

  2. MJO theory in relation to comprehensive models and observations

    Science.gov (United States)

    Sobel, A. H.

    2015-12-01

    I will describe some recent thinking on the fundamental physics of the MJO. Much recent work portrays the MJO as some form of "moisture mode" - meaning that the prognostic variable of greatest importance is moisture or a conserved variable closely related to it, such as moist static energy or moist entropy - and I will argue for this line of thinking, presenting evidence in favor of it, as well as unresolved problems. I will focus on the relationship between theory, numerical modeling at a range of scales, and observations. We have reached a new phase in the study of the MJO: the phenomenon has been described in great detail from an observational point of view, and perhaps even more importantly, the best numerical models can now simulate it well. I will argue that in this situation, theory is most compelling when its core assumptions can be specifically defended using both observations and comprehensive numerical models. I will make connections between the moisture mode view and results from several different types of numerical models: numerical weather prediction models, global climate models, and small-domain high-resolution cloud resolving models.

  3. Budget brief, 1981

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The FY 1981 DOE budget totals $12.6 billion in budget authority and $11.1 billion in budget outlays. The budget authority being requested consists of $10.3 billion in new authority and a $2.3 billion reappropriation of expiring funds for the Strategic Petroleum Reserve. Areas covered in the Energy budget are: energy conservation; research, development, and applications; regulation and information; direct energy production; strategic energy production; and energy security reserve. Other areas include: general science; defense activities; departmental administration; and legislative proposal - spent fuel. Budget totals are compared for 1980 and 1981. A detailed discussion of the FY 1981 activities to be undertaken in these areas is provided. (MCW)

  4. Quantifying the impacts of land surface schemes and dynamic vegetation on the model dependency of projected changes in surface energy and water budgets

    Science.gov (United States)

    Yu, Miao; Wang, Guiling; Chen, Haishan

    2016-03-01

    Assessing and quantifying the uncertainties in projected future changes of energy and water budgets over land surface are important steps toward improving our confidence in climate change projections. In this study, the contribution of land surface models to the inter-GCM variation of projected future changes in land surface energy and water fluxes are assessed based on output from 19 global climate models (GCMs) and offline Community Land Model version 4 (CLM4) simulations driven by meteorological forcing from the 19 GCMs. Similar offline simulations using CLM4 with its dynamic vegetation submodel are also conducted to investigate how dynamic vegetation feedback, a process that is being added to more earth system models, may amplify or moderate the intermodel variations of projected future changes. Projected changes are quantified as the difference between the 2081-2100 period from the Representative Concentration Pathway 8.5 (RCP8.5) future experiment and the 1981-2000 period from the historical simulation. Under RCP8.5, projected changes in surface water and heat fluxes show a high degree of model dependency across the globe. Although precipitation is very likely to increase in the high latitudes of the Northern Hemisphere, a high degree of model-related uncertainty exists for evapotranspiration, soil water content, and surface runoff, suggesting discrepancy among land surface models (LSMs) in simulating the surface hydrological processes and snow-related processes. Large model-related uncertainties for the surface water budget also exist in the Tropics including southeastern South America and Central Africa. These uncertainties would be reduced in the hypothetical scenario of a single near-perfect land surface model being used across all GCMs, suggesting the potential to reduce uncertainties through the use of more consistent approaches toward land surface model development. Under such a scenario, the most significant reduction is likely to be seen in the

  5. Budget 2011: A budget lacking in ambition

    OpenAIRE

    Dolphin, Tony

    2011-01-01

    Growth is key to the government’s plans for the recovery. Tony Dolphin, Senior Economist at the Institute for Public Policy Research looks at this year’s budget and finds that while it may promote growth now, a broader strategy may be needed in the long term.

  6. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  7. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail, how the theoretical model and its...

  8. Quantile hydrologic model selection and model structure deficiency assessment: 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies structur

  9. Supersymmetric Theory of Stochastic ABC Model: A Numerical Study

    CERN Document Server

    Ovchinnikov, Igor V; Ensslin, Torsten A; Wang, Kang L

    2016-01-01

    In this paper, we investigate numerically the stochastic ABC model, a toy model in the theory of astrophysical kinematic dynamos, within the recently proposed supersymmetric theory of stochastics (STS). STS characterises stochastic differential equations (SDEs) by the spectrum of the stochastic evolution operator (SEO) on elements of the exterior algebra or differential forms over the system's phase space, X. STS can thereby classify SDEs as chaotic or non-chaotic by identifying the phenomenon of stochastic chaos with the spontaneously broken topological supersymmetry that all SDEs possess. We demonstrate the following three properties of the SEO, deduced previously analytically and from physical arguments: the SEO spectra for zeroth and top degree forms never break topological supersymmetry, all SDEs possess pseudo-time-reversal symmetry, and each de Rham cohomology class provides one supersymmetric eigenstate. Our results also suggest that the SEO spectra for forms of complementary degrees, i.e., k and ...

  10. Ranking streamflow model performance based on Information theory metrics

    Science.gov (United States)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as the complexity of the models increased, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
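    As an illustration of the symbolization step described above (not the paper's code), the sketch below maps a synthetic streamflow series onto quantile-based symbols and computes the mean information gain as a difference of block entropies; the number of symbols, the block length and the synthetic gamma-distributed flows are assumptions.

      # Illustrative sketch: quantile symbolization of a streamflow series and the
      # mean information gain of the resulting symbol string. Data are synthetic.
      import numpy as np
      from collections import Counter

      def symbolize(series: np.ndarray, n_symbols: int = 4) -> np.ndarray:
          """Map each value to the index of the quantile bin it falls into."""
          edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
          return np.digitize(series, edges)

      def mean_information_gain(symbols: np.ndarray, block: int = 2) -> float:
          """H(block) - H(block - 1): average information gained by one more symbol."""
          def block_entropy(length: int) -> float:
              words = [tuple(symbols[i:i + length]) for i in range(len(symbols) - length + 1)]
              counts = np.array(list(Counter(words).values()), dtype=float)
              p = counts / counts.sum()
              return float(-(p * np.log2(p)).sum())
          return block_entropy(block) - block_entropy(block - 1)

      rng = np.random.default_rng(0)
      streamflow = rng.gamma(shape=2.0, scale=5.0, size=3650)   # synthetic daily flows
      symbols = symbolize(streamflow)
      print(mean_information_gain(symbols))   # lower values indicate a less random signal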

  11. Collective String Field Theory of Matrix Models in the BMN Limit

    OpenAIRE

    Koch, Robert de Mello; Jevicki, Antal; Rodrigues, Joao P.

    2002-01-01

    We develop a systematic procedure for deriving canonical string field theory from large N matrix models in the Berenstein-Maldacena-Nastase limit. The approach, based on collective field theory, provides a generalization of standard string field theory.

  12. Holographic Dark Energy Model and Scalar-Tensor Theories

    OpenAIRE

    Bisabr, Yousef

    2008-01-01

    We study the holographic dark energy model in a generalized scalar tensor theory. In a universe filled with cold dark matter and dark energy, the effect of the potential of the scalar field on the equation of state parameter is investigated. We show that for various types of potentials, the equation of state parameter is negative and a transition from decelerated to accelerated expansion of the universe is possible.

  13. Properties of lattice gauge theory models at low temperatures

    International Nuclear Information System (INIS)

    The Z(N) theory of quark confinement is discussed, including how fluctuations of Z(N) gauge fields may continue to be important in the continuum limit. A model in four dimensions is pointed out in which confinement of (scalar) quarks can be shown to persist in the continuum limit. This article is based on the author's Cargese lectures 1979. Some of its results are published here for the first time. (orig.)

  14. The dorsal stream in speech processing: Model and theory

    OpenAIRE

    Keidel JJ, Welbourne SR, Lambon Ralph MA.

    2009-01-01

    The ability to produce and comprehend spoken language requires an internal understanding of the complex relations between articulatory gestures and their acoustic consequences. Recent theories of speech processing propose a division between the ventral stream, which involves the mapping of acoustic signals to lexical/semantic representations, and the dorsal stream, which mediates the mapping between incoming auditory signals and articulatory output. We present a connectionist model of the dor...

  15. Analytical theory of Doppler reflectometry in slab plasma model

    CERN Document Server

    Gusakov, Evgeniy; Surkov, Alexander

    2004-01-01

    Doppler reflectometry is considered in a slab plasma model within the framework of analytical theory. The locality of the diagnostic is analyzed for both regimes: linear and nonlinear in turbulence amplitude. Toroidal antenna focusing of the probing beam onto the cut-off is proposed and discussed as a method to increase the spatial resolution of the diagnostic. It is shown that even in the case of the nonlinear regime of multiple scattering, the diagnostic can be used to estimate (with certain accuracy) the plasma poloidal rotation profile.

  16. A model theory for tachyons in two dimensions

    International Nuclear Information System (INIS)

    The paper is divided into two parts, the first of which has nothing to do with tachyons. In fact, to prepare the ground, in part one (sect. 2) it is shown that special relativity, even without tachyons, can be given a form suitable for describing both particles and antiparticles. The plan of part two is confined only to a model theory in two dimensions, for the reasons stated in sect. 3.

  17. Stochastic models in risk theory and management accounting

    OpenAIRE

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest force. The other extension deals with the insurer's strategy with respect to maintaining the current premium system when the insurer does not have complete knowledge about the distribution of the risk...

  18. Dynamic density functional theory of solid tumor growth: Preliminary models

    OpenAIRE

    Arnaud Chauviere; Haralambos Hatzikirou; Kevrekidis, Ioannis G.; Lowengrub, John S.; Vittorio Cristini

    2012-01-01

    Cancer is a disease that can be seen as a complex system whose dynamics and growth result from nonlinear processes coupled across wide ranges of spatio-temporal scales. The current mathematical modeling literature addresses issues at various scales but the development of theoretical methodologies capable of bridging gaps across scales needs further study. We present a new theoretical framework based on Dynamic Density Functional Theory (DDFT) extended, for the first time, to the dynamics of l...

  19. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  20. Embankment deformation analyzed by elastoplastic damage model coupling consolidation theory

    Institute of Scientific and Technical Information of China (English)

    Hong SUN; Xihong ZHAO

    2006-01-01

    The deformation of an embankment has serious influences on neighboring structures and infrastructure. A trial embankment is reanalyzed with an elastoplastic damage model coupled with Biot's consolidation theory. As the loading time increases, the accumulated damage becomes larger. Damage becomes serious under the centre and the toe of the embankment. Under the centre of the embankment, vertical damage values are larger than horizontal ones; under the toe of the embankment, horizontal damage values are larger than vertical ones.

  1. Molecular Thermodynamic Modeling of Fluctuation Solution Theory Properties

    DEFF Research Database (Denmark)

    O’Connell, John P.; Abildskov, Jens

    2013-01-01

    Fluctuation Solution Theory provides relationships between integrals of the molecular pair total and direct correlation functions and the pressure derivative of solution density, partial molar volumes, and composition derivatives of activity coefficients. For dense fluids, the integrals follow a relatively simple corresponding-states behavior even for complex systems, show well-defined relationships for infinite dilution properties in complex and near-critical systems, allow estimation of mixed-solvent solubilities of gases and pharmaceuticals, and can be expressed by simple perturbation models...

  2. Game Theory Models for Multi-Robot Patrolling of Infrastructures

    Directory of Open Access Journals (Sweden)

    Erik Hernández

    2013-03-01

    Full Text Available This work is focused on the problem of performing multi‐robot patrolling for infrastructure security applications in order to protect a known environment at critical facilities. Thus, given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Current existing solutions for these types of applications are predictable and inflexible. Moreover, most of the previous work has tackled the patrolling problem with centralized and deterministic solutions and only few efforts have been made to integrate dynamic methods. Therefore, one of the main contributions of this work is the development of new dynamic and decentralized collaborative approaches in order to solve the aforementioned problem by implementing learning models from Game Theory. The model selected in this work that includes belief‐based and reinforcement models as special cases is called Experience‐Weighted Attraction. The problem has been defined using concepts of Graph Theory to represent the environment in order to work with such Game Theory techniques. Finally, the proposed methods have been evaluated experimentally by using a patrolling simulator. The results obtained have been compared with previous available approaches.
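    A rough sketch of the Experience-Weighted Attraction update adopted in the paper, as commonly stated for the Camerer-Ho model; the parameter values, payoff function and number of patrol actions below are hypothetical, and the snippet is illustrative rather than the authors' implementation.

      # Illustrative sketch of an Experience-Weighted Attraction (EWA) learner, which
      # contains belief-based and reinforcement learning as special cases. All
      # parameter values and the stand-in payoffs are hypothetical.
      import numpy as np

      class EWALearner:
          def __init__(self, n_actions, phi=0.9, rho=0.9, delta=0.5, lam=2.0):
              self.A = np.zeros(n_actions)   # attractions, one per patrol route / node
              self.N = 1.0                   # experience weight
              self.phi, self.rho, self.delta, self.lam = phi, rho, delta, lam

          def choose(self, rng):
              p = np.exp(self.lam * self.A)
              p /= p.sum()                   # logit choice rule over attractions
              return rng.choice(len(self.A), p=p)

          def update(self, chosen, payoffs):
              """payoffs[j] = payoff action j would have earned against the others' play."""
              n_prev = self.N
              self.N = self.rho * n_prev + 1.0
              indicator = np.zeros_like(self.A)
              indicator[chosen] = 1.0
              weight = self.delta + (1.0 - self.delta) * indicator
              self.A = (self.phi * n_prev * self.A + weight * payoffs) / self.N

      rng = np.random.default_rng(1)
      robot = EWALearner(n_actions=3)
      for _ in range(100):
          a = robot.choose(rng)
          payoffs = rng.uniform(0, 1, size=3)   # stand-in for patrolling utilities
          robot.update(a, payoffs)
      print(robot.A)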

  3. Shell-model Hamiltonians from Density Functional Theory

    CERN Document Server

    Alhassid, Y; Fang, L; Sabbey, B

    2005-01-01

    The density functional theory of nuclear structure provides a many-particle wave function that is useful for static properties, but an extension of the theory is necessary to describe correlation effects or other dynamic properties. Here we propose a procedure to extend the theory by mapping the properties of the self-consistent mean-field Hamiltonian onto an effective shell-model Hamiltonian with two-body interactions. In this initial study, we consider the sd-shell nuclei Ne-20, Mg-24, Si-28, and Ar-36. Our first application is in the framework of the USD shell-model Hamiltonian, using its mean-field approximation to construct an effective Hamiltonian and partially recover correlation effects. We find that more than half of the correlation energy is due to the quadrupole interaction. We then follow a similar procedure but using the SLy4 Skyrme energy functional as our starting point and truncating the space to the spherical $sd$ shell. The constructed shell-model Hamiltonian is found to satisfy minimal cons...

  4. Theories linguistiques, modeles informatiques, experimentation psycholinguistique (Linguistic Theories, Information-Processing Models, Psycholinguistic Experimentation)

    Science.gov (United States)

    Dubois, Daniele

    1975-01-01

    Delineates and elaborates upon the underlying psychological postulates in linguistic and information-processing models, and shows the interdependence of psycholinguistics and linguistic analysis. (Text is in French.) (DB)

  5. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society-highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli

  6. Time-dependent Gutzwiller theory for multiband Hubbard models.

    Science.gov (United States)

    Oelsen, E v; Seibold, G; Bünemann, J

    2011-08-12

    Based on the variational Gutzwiller theory, we present a method for the computation of response functions for multiband Hubbard models with general local Coulomb interactions. The improvement over the conventional random-phase approximation is exemplified for an infinite-dimensional two-band Hubbard model where the incorporation of the local multiplet structure leads to a much larger sensitivity of ferromagnetism on the Hund coupling. Our method can be implemented into local-density approximation and Gutzwiller schemes and will therefore be an important tool for the computation of response functions for strongly correlated materials.

  7. THE NEW CLASSICAL THEORY AND THE REAL BUSINESS CYCLE MODEL

    Directory of Open Access Journals (Sweden)

    Oana Simona HUDEA (CARAMAN)

    2014-11-01

    Full Text Available The present paper aims at describing some key elements of the model related to the new classical theory, namely the Real Business Cycle model, which describes the economy from the perspective of a perfectly competitive market characterised by price, wage and interest rate flexibility. The rendered impulse-response functions, which help reveal the capacity of the model variables to return to their steady state under the impact of a structural shock, be it technology or monetary policy oriented, point to the neutrality of the monetary authority's decisions, therefore confirming the well-known classical dichotomy between the nominal and the real factors of the economy.

  8. The Standard Model as a 2T-physics Theory

    CERN Document Server

    Bars, Itzhak

    2007-01-01

    New developments in 2T-physics, that connect 2T-physics field theory directly to the real world, are reported in this talk. An action is proposed in field theory in 4+2 dimensions which correctly reproduces the Standard Model (SM) in 3+1 dimensions (and no junk). Everything that is known to work in the SM still works in the emergent 3+1 theory, but some of the problems of the SM get resolved. The resolution is due to new restrictions on interactions inherited from 4+2 dimensions that lead to some interesting physics and new points of view not discussed before in 3+1 dimensions. In particular the strong CP violation problem is resolved without an axion, and the electro-weak symmetry breakdown that generates masses requires the participation of the dilaton, thus relating the electro-weak phase transition to other phase transitions (such as evolution of the universe, vacuum selection in string theory, etc.) that also require the participation of the dilaton. The underlying principle of 2T-physics is the local sy...

  9. Who needs budgets?

    Science.gov (United States)

    Hope, Jeremy; Fraser, Robin

    2003-02-01

    Budgeting, as most corporations practice it, should be abolished. That may sound radical, but doing so would further companies' long-running efforts to transform themselves into developed networks that can nimbly adjust to market conditions. Most other building blocks are in place, but companies continue to restrict themselves by relying on inflexible budget processes and the command-and-control culture that budgeting entails. A number of companies have rejected the foregone conclusions embedded in budgets, and they've given up the self-interested wrangling over what the data indicate. In the absence of budgets, alternative goals and measures--some financial, such as cost-to-income ratios, and some nonfinancial, such as time to market--move to the foreground. Companies that have rejected budgets require employees to measure themselves against the performance of competitors and against internal peer groups. Because employees don't know whether they've succeeded until they can look back on the results of a given period, they must use every ounce of energy to ensure that they beat the competition. A key feature of many companies that have rejected budgets is the use of rolling forecasts, which are created every few months and typically cover five to eight quarters. Because the forecasts are regularly revised, they allow companies to continuously adapt to market conditions. The forecasting practices of two such companies, both based in Sweden, are examined in detail: the bank Svenska Handelsbanken and the wholesaler Ahlsell. Though the first companies to reject budgets were located in Northern Europe, organizations that have gone beyond budgeting can be found in a range of countries and industries. Their practices allow them to unleash the power of today's management tools and realize the potential of a fully decentralized organization.

  10. Bayesian Decision Theory Guiding Educational Decision-Making: Theories, Models and Application

    Science.gov (United States)

    Pan, Yilin

    2016-01-01

    Given the importance of education and the growing public demand for improving education quality under tight budget constraints, there has been an emerging movement to call for research-informed decisions in educational resource allocation. Despite the abundance of rigorous studies on the effectiveness, cost, and implementation of educational…

  11. The linear model and hypothesis a general unifying theory

    CERN Document Server

    Seber, George

    2015-01-01

    This book provides a concise and integrated overview of hypothesis testing in four important subject areas, namely linear and nonlinear models, multivariate analysis, and large sample theory. The approach used is a geometrical one based on the concept of projections and their associated idempotent matrices, thus largely avoiding the need to involve matrix ranks. It is shown that all the hypotheses encountered are either linear or asymptotically linear, and that all the underlying models used are either exactly or asymptotically linear normal models. This equivalence can be used, for example, to extend the concept of orthogonality in the analysis of variance to other models, and to show that the asymptotic equivalence of the likelihood ratio, Wald, and Score (Lagrange Multiplier) hypothesis tests generally applies.
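    The geometrical idea of projections and idempotent matrices underlying the book can be illustrated with a short sketch (synthetic data, not taken from the book): the hat matrix P = X (X'X)^{-1} X' projects onto the column space of the design matrix, is idempotent and symmetric, and splits y into orthogonal fitted and residual components.

      # Illustrative sketch of the projection (hat) matrix in a linear model.
      # The design matrix, coefficients and noise level are hypothetical.
      import numpy as np

      rng = np.random.default_rng(0)
      n, p = 50, 3
      X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])   # design matrix
      y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.3, size=n)

      P = X @ np.linalg.solve(X.T @ X, X.T)     # projection onto col(X)
      fitted = P @ y
      residuals = (np.eye(n) - P) @ y           # projection onto the orthogonal complement

      print(np.allclose(P @ P, P))              # idempotent: P^2 = P
      print(np.allclose(P, P.T))                # symmetric
      print(abs(fitted @ residuals) < 1e-8)     # fitted values orthogonal to residuals
      print(round(np.trace(P)))                 # rank = number of columns of X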

  12. Symmetry-guided large-scale shell-model theory

    Science.gov (United States)

    Launey, Kristina D.; Dytrych, Tomas; Draayer, Jerry P.

    2016-07-01

    In this review, we present a symmetry-guided strategy that utilizes exact as well as partial symmetries for enabling a deeper understanding of and advancing ab initio studies for determining the microscopic structure of atomic nuclei. These symmetries expose physically relevant degrees of freedom that, for large-scale calculations with QCD-inspired interactions, allow the model space size to be reduced through a very structured selection of the basis states to physically relevant subspaces. This can guide explorations of simple patterns in nuclei and how they emerge from first principles, as well as extensions of the theory beyond current limitations toward heavier nuclei and larger model spaces. This is illustrated for the ab initio symmetry-adapted no-core shell model (SA-NCSM) and two significant underlying symmetries, the symplectic Sp(3,R) group and its deformation-related SU(3) subgroup. We review the broad scope of nuclei where these symmetries have been found to play a key role, from the light p-shell systems, such as 6Li, 8B, 8Be, 12C, and 16O, and sd-shell nuclei exemplified by 20Ne, based on first-principle explorations; through the Hoyle state in 12C and enhanced collectivity in intermediate-mass nuclei, within a no-core shell-model perspective; up to strongly deformed species of the rare-earth and actinide regions, as investigated in earlier studies. A complementary picture, driven by symmetries dual to Sp(3,R), is also discussed. We briefly review symmetry-guided techniques that prove useful in various nuclear-theory models, such as the Elliott model, ab initio SA-NCSM, symplectic model, pseudo-SU(3) and pseudo-symplectic models, ab initio hyperspherical harmonics method, ab initio lattice effective field theory, exact pairing-plus-shell model approaches, and cluster models, including the resonating-group method. Important implications of these approaches that have deepened our understanding of emergent phenomena in nuclei, such as enhanced

  13. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  14. FY 1997 congressional budget request: Budget highlights

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    This is an overview of the 1997 budget request for the US DOE. The topics of the overview include a policy overview, the budget by business line, business lines by organization, crosswalk from business line to appropriation, summary by appropriation, energy supply research and development, uranium supply and enrichment activities, uranium enrichment decontamination and decommissioning fund, general science and research, weapons activities, defense environmental restoration and waste management, defense nuclear waste disposal, departmental administration, Office of the Inspector General, power marketing administrations, Federal Energy Regulatory Commission, nuclear waste disposal fund, fossil energy research and development, naval petroleum and oil shale reserves, energy conservation, economic regulation, strategic petroleum reserve, energy information administration, clean coal technology and a Department of Energy Field Facilities map.

  15. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for construction of necessary conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  16. A model theoretic Baire category theorem for simple theories

    CERN Document Server

    Shami, Ziv

    2009-01-01

    We define the class of $\tilde\tau_{low}^f$-sets. This is a class of type-definable sets defined in terms of forking by low formulas. We prove a model theoretic Baire category theorem for $\tilde\tau_{low}^f$-sets in a countable simple theory in which the extension property is first-order and show some of its applications. A typical application is the following. Let $T$ be a countable theory with the wnfcp (weak nonfinite cover property) and assume for every non-algebraic $a$ there exists a non-algebraic $a' \in \acl(a)$ such that $SU(a')<\omega$. Then there exists a weakly-minimal formula with parameters.

  17. Applications of the Likelihood Theory in Finance: Modelling and Pricing

    CERN Document Server

    Janssen, Arnold

    2012-01-01

    This paper discusses the connection between mathematical finance and statistical modelling which turns out to be more than a formal mathematical correspondence. We like to figure out how common results and notions in statistics and their meaning can be translated to the world of mathematical finance and vice versa. A lot of similarities can be expressed in terms of LeCam's theory for statistical experiments which is the theory of the behaviour of likelihood processes. For positive prices the arbitrage free financial assets fit into filtered experiments. It is shown that they are given by filtered likelihood ratio processes. From the statistical point of view, martingale measures, completeness and pricing formulas are revisited. The pricing formulas for various options are connected with the power functions of tests. For instance the Black-Scholes price of a European option has an interpretation as Bayes risk of a Neyman Pearson test. Under contiguity the convergence of financial experiments and option prices ...

  18. Modelling robotic palletising process with two robots using queuing theory

    Directory of Open Access Journals (Sweden)

    J. Li

    2008-12-01

    Full Text Available Purpose: This paper presents the modeling of a typical high-speed robotic palletizing process involving two robots using queuing theory, with the aim of providing design guidelines of such a system for dynamic material flows. Design/methodology/approach: In this study, our calculation is carried out using the queue models and the production parameters as given in Section 2. We select three types of performance indexes, i.e. average length of queue (Lq), utilization of server (US) and mean waiting time (MWT) in the queue, to interpret the prediction of performance of the high-speed palletizing systems to handle the prescribed material flow. Findings: The time-average properties of the system, such as the average number of cartons in the system, mean waiting time, the congestion probability and utilization of the machine, have been calculated. The calculation results have shown that the work performance of the system is related to the characteristics of the material flow, and the product design should consider both the time-average parameters and the dynamic features of the material flow. Practical implications: The paper has described a methodology of modeling a high-speed palletizing process with two robotic servers to handle high-speed dynamic material flow using the queuing theory technique. Originality/value: Based on the established model, a general design scheme can be derived to show how to consider the dynamic properties through the system design while meeting the prescribed utilization and congestion requirements of such a process.
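    For orientation, the three indexes named above (Lq, US and MWT) can be computed for a two-server Markovian queue with the standard Erlang C formulas; the arrival and service rates in the sketch below are hypothetical and the snippet is not the paper's model, only an illustration.

      # Illustrative M/M/c queue metrics -- average queue length (Lq), server
      # utilization (US) and mean waiting time (MWT). Parameters are hypothetical.
      import math

      def mmc_metrics(arrival_rate: float, service_rate: float, servers: int):
          rho = arrival_rate / (servers * service_rate)        # utilization per server (US)
          if rho >= 1.0:
              raise ValueError("unstable queue: utilization must be below 1")
          a = arrival_rate / service_rate                      # offered load in Erlangs
          # Erlang C probability that an arriving carton has to wait
          p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(servers))
                      + a**servers / (math.factorial(servers) * (1.0 - rho)))
          p_wait = (a**servers / (math.factorial(servers) * (1.0 - rho))) * p0
          lq = p_wait * rho / (1.0 - rho)                      # average queue length (Lq)
          mwt = lq / arrival_rate                              # mean waiting time via Little's law
          return lq, rho, mwt

      # e.g. 50 cartons/min arriving, each robot palletizes 30 cartons/min, two robots
      lq, us, mwt = mmc_metrics(arrival_rate=50.0, service_rate=30.0, servers=2)
      print(f"Lq={lq:.2f} cartons, US={us:.2f}, MWT={mwt*60:.1f} s")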

  19. Applications of queueing theory to stochastic models of gene expression

    Science.gov (United States)

    Kulkarni, Rahul

    2012-02-01

    The intrinsic stochasticity of cellular processes implies that analysis of fluctuations (`noise') is often essential for quantitative modeling of gene expression. Recent single-cell experiments have carried out such analysis to characterize moments and entire probability distributions for quantities of interest, e.g. mRNA and protein levels across a population of cells. Correspondingly, there is a need to develop general analytical tools for modeling and interpretation of data obtained from such single-cell experiments. One such approach involves the mapping between models of stochastic gene expression and systems analyzed in queueing theory. The talk will provide an overview of this approach and discuss how theorems from queueing theory (e.g. Little's Law) can be used to derive exact results for general stochastic models of gene expression. In the limit that gene expression occurs in bursts, analytical results can be obtained which provide insight into the effects of different regulatory mechanisms on the noise in protein steady-state distributions. In particular, the approach can be used to analyze the effect of post-transcriptional regulation by non-coding RNAs leading to new insights and experimentally testable predictions.
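    A minimal sketch of the mapping described above: treating transcription events as arrivals and degradation as service, Little's law gives the mean steady-state mRNA number, and in the bursty limit the copy-number distribution is often written as a negative binomial. All rates below are hypothetical and the snippet is illustrative, not taken from the talk.

      # Illustrative sketch: Little's law (L = lambda * W) applied to bursty gene
      # expression, plus the commonly used negative binomial steady-state form.
      # All rate values are hypothetical.
      import numpy as np
      from scipy import stats

      k_on_burst = 2.0      # bursts per hour (arrival rate of bursts)
      mean_burst_size = 5.0 # mean transcripts per burst (geometric bursts assumed)
      gamma = 1.0           # mRNA degradation rate per hour (1 / mean lifetime)

      # Little's law: mean number in "system" = effective arrival rate * mean residence time
      mean_mrna = k_on_burst * mean_burst_size / gamma
      print("mean mRNA copy number:", mean_mrna)

      # In the bursty limit the steady-state copy-number distribution is often modeled
      # as negative binomial with r = burst frequency / degradation rate.
      r = k_on_burst / gamma
      p = 1.0 / (1.0 + mean_burst_size)
      x = np.arange(0, 40)
      pmf = stats.nbinom.pmf(x, r, p)
      print("P(0 transcripts) =", pmf[0])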

  20. Plane answers to complex questions the theory of linear models

    CERN Document Server

    Christensen, Ronald

    1987-01-01

    This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...

  1. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion. (1) Anisotropic temperature and rotation upgrades; (2) Modeling for relativistic ECRH; (3) Further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project.

  2. Budget Automation System

    Data.gov (United States)

    U.S. Environmental Protection Agency — BAS is the central Agency system used to integrate strategic planning, annual planning, budgeting and financial management. BAS contains resource (dollars and FTE),...

  3. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the complex nature of the capital market. Considering the capital market a complicated, interactive and adaptable dynamic system, with complexity science as the method for researching the operation law of the capital market, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on the research, the paper foretells the developing direction of complexity science in a capital market.

  4. Cyclical budget balance measurement

    OpenAIRE

    C. AUDENIS; C. PROST

    2000-01-01

    Government balances are often adjusted for changes in economic activity in order to draw a clearer picture of the underlying fiscal situation and to use this as a guide to fiscal policy analysis. International organisations estimate the cyclical component of economic activity by the current level of the output gap. Using elasticities of tax and public expenditures to GDP, they compute the cyclical part of budget balance. The structural budget balance is defined as the remainder. Our approach ...

  5. Learning From Low Budgets

    Institute of Scientific and Technical Information of China (English)

    TANG YUANKAI

    2011-01-01

    Organizers of China's upcoming film festivals are finally giving recognition to the little guys--low-budget films--to encourage a generation of young, talented directors. Several nominees were announced on September 10 to compete for the Small- and Medium-Budget Film Prize of the annual Golden Rooster and Hundred Flowers Film Festival, which will kick off on October 19.

  6. Gender budget pilot project

    OpenAIRE

    Barry, Ursula; Pillinger, Jane; Quinn, Sheila; Cashman, Aileen

    2004-01-01

    This Report presents the findings of the first Irish research project on gender budgeting. It explores recent international and Irish experiences of strategies towards greater gender equality and develops a template for applying a gender budget approach in selected local development organisations. The research was funded by the Gender Equality Unit of the Department of Justice, Equality and Law Reform who have responsibility for promoting and monitoring gender mainstreaming in the Irish Natio...

  7. Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model

    Science.gov (United States)

    Aktaruzzaman, Md; Plunkett, Margaret

    2016-01-01

    Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…

  8. BASIC THEORY AND MATHEMATICAL MODELING OF URBAN RAINSTORM WATER LOGGING

    Institute of Scientific and Technical Information of China (English)

    LI Da-ming; ZHANG Hong-ping; LI Bing-fei; XIE Yi-yang; LI Pei-yan; HAN Su-qin

    2004-01-01

    In this paper, a mathematical model for the urban rainstorm water logging was established on the basis of one- and two-dimensional unsteady flow theory and the technique of non-structural irregular grid division. The continuity equation was discretized with the finite volume method. And the momentum equations were differently simplified and discretized for different cases. A method of "special passage" was proposed to deal with small-scale rivers and open channels. The urban drainage system was simplified and simulated in the model. The method of "open slot" was applied to coordinate the alternate calculation of open channel flow and pressure flow in drainage pipes. The model has been applied in Tianjin City and the verification is quite satisfactory.

  9. Study of a model equation in detonation theory: multidimensional effects

    CERN Document Server

    Faria, Luiz M; Rosales, Rodolfo R

    2015-01-01

    We extend the reactive Burgers equation presented in Kasimov et al. Phys. Rev. Lett., 110 (2013) and Faria et al. SIAM J. Appl. Maths, 74 (2014), to include multidimensional effects. Furthermore, we explain how the model can be rationally justified following the ideas of the asymptotic theory developed in Faria et al. JFM (2015). The proposed model is a forced version of the unsteady small disturbance transonic flow equations. We show that for physically reasonable choices of forcing functions, traveling wave solutions akin to detonation waves exist. It is demonstrated that multidimensional effects play an important role in the stability and dynamics of the traveling waves. Numerical simulations indicate that solutions of the model tend to form multi-dimensional patterns analogous to cells in gaseous detonations.

  10. Modelling apical constriction in epithelia using elastic shell theory.

    Science.gov (United States)

    Jones, Gareth Wyn; Chapman, S Jonathan

    2010-06-01

    Apical constriction is one of the fundamental mechanisms by which embryonic tissue is deformed, giving rise to the shape and form of the fully-developed organism. The mechanism involves a contraction of fibres embedded in the apical side of epithelial tissues, leading to an invagination or folding of the cell sheet. In this article the phenomenon is modelled mechanically by describing the epithelial sheet as an elastic shell, which contains a surface representing the continuous mesh formed from the embedded fibres. Allowing this mesh to contract, an enhanced shell theory is developed in which the stiffness and bending tensors of the shell are modified to include the fibres' stiffness, and in which the active effects of the contraction appear as body forces in the shell equilibrium equations. Numerical examples are presented at the end, including the bending of a plate and a cylindrical shell (modelling neurulation) and the invagination of a spherical shell (modelling simple gastrulation). PMID:19859751

  11. Dynamic statistical models of biological cognition: insights from communications theory

    Science.gov (United States)

    Wallace, Rodrick

    2014-10-01

    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  12. Density Functional Theory and Materials Modeling at Atomistic Length Scales

    Directory of Open Access Journals (Sweden)

    Swapan K. Ghosh

    2002-04-01

    Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT, the central equation is a one-particle Schrodinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.

  13. Rigid Rotor as a Toy Model for Hodge Theory

    CERN Document Server

    Gupta, Saurabh

    2009-01-01

    We apply the superfield approach to the toy model of a rigid rotor and show the existence of the nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) and anti-BRST symmetry transformations, under which, the kinetic term and Lagrangian remain invariant. Furthermore, we also derive the off-shell nilpotent and absolutely anticommuting (anti-) co-BRST symmetry transformations, under which, the gauge-fixing term and Lagrangian remain invariant. The anticommutator of the above nilpotent symmetry transformations leads to the derivation of a bosonic symmetry transformation, under which, the ghost terms and Lagrangian remain invariant. Together, the above transformations (and their corresponding generators) respect an algebra that turns out to be the realization of the algebra obeyed by the de Rham cohomological operators of differential geometry. Thus, our present model is a toy model for the Hodge theory.

  14. Structure and asymptotic theory for nonlinear models with GARCH errors

    Directory of Open Access Journals (Sweden)

    Felix Chan

    2015-01-01

    Full Text Available Nonlinear time series models, especially those with regime-switching and/or conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we derive sufficient conditions for strict stationarity and ergodicity of three different specifications of the first-order smooth transition autoregressions with heteroskedastic errors. This is essential, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. We also provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator for a general nonlinear conditional mean model with first-order GARCH errors.
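
    A minimal simulation of the class of processes the paper analyses, a first-order logistic smooth transition autoregression with GARCH(1,1) errors, is sketched below; all parameter values are illustrative assumptions rather than estimates from the paper.

```python
# Sketch: LSTAR(1) conditional mean with GARCH(1,1) errors; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T = 2000
phi1, phi2 = 0.3, -0.4                 # AR coefficients in the two regimes
gamma, c = 2.0, 0.0                    # transition slope and location
omega, alpha, beta = 0.1, 0.05, 0.9    # GARCH(1,1), alpha + beta < 1

y = np.zeros(T)
eps = np.zeros(T)
h = np.full(T, omega / (1 - alpha - beta))   # conditional variance
for t in range(1, T):
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    G = 1.0 / (1.0 + np.exp(-gamma * (y[t - 1] - c)))   # logistic transition function
    y[t] = phi1 * y[t - 1] + (phi2 - phi1) * G * y[t - 1] + eps[t]

print("sample variance of the simulated series:", round(float(y.var()), 3))
```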

  15. A queueing theory based model for business continuity in hospitals.

    Science.gov (United States)

    Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R

    2013-01-01

    Clinical activities can be seen as the result of a precise and defined succession of events, where every phase is characterized by a waiting time that includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used for evaluating the possible interventions and for protecting the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals. PMID:24109839
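
    The abstract does not spell out the queueing formulas used; as a hedged illustration, a standard M/M/c building block for sizing the number of devices is the Erlang C waiting probability, sketched below with made-up arrival and service rates.

```python
# Hypothetical M/M/c sizing sketch: smallest number of identical devices c such that
# the probability a clinical request must wait (Erlang C) stays below a target.
# Arrival rate, service rate and the 5% target are illustrative assumptions.
from math import factorial

def erlang_c(c, a):
    """Probability of waiting in an M/M/c queue with offered load a = lambda/mu (a < c)."""
    top = a ** c / (factorial(c) * (1 - a / c))
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    return top / bottom

lam, mu, target = 8.0, 1.5, 0.05      # requests/hour, services/hour per device
a = lam / mu                          # offered load
c = int(a) + 1
while erlang_c(c, a) > target:
    c += 1
print("devices needed to keep waiting probability below 5%:", c)
```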

  16. Threshold effects in the US budget deficit

    OpenAIRE

    Arestis, Philip; Cipollini, Andrea; Fattouh, Bassam

    2002-01-01

    We contribute to the debate on whether the large U.S. federal budget deficits are sustainable in the long run. We model the U.S. government deficit per capita as a threshold autoregressive process. We find evidence that the U.S. budget deficit is sustainable in the long run and that economic policymakers will intervene to reduce per capita deficit only when it reaches a certain threshold.
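
    A minimal sketch of a two-regime threshold autoregression of the type described, where the dynamics of the per-capita deficit switch once a threshold is crossed (mimicking policy intervention), is given below; the coefficients and threshold are illustrative assumptions.

```python
# SETAR(2;1,1) sketch: near unit-root behaviour below the threshold, mean reversion above it.
import numpy as np

rng = np.random.default_rng(1)
T, threshold = 500, 1.0
phi_low, phi_high = 1.0, 0.7     # AR(1) coefficients below / above the threshold
d = np.zeros(T)                  # per-capita deficit (illustrative units)
for t in range(1, T):
    phi = phi_low if d[t - 1] <= threshold else phi_high
    d[t] = phi * d[t - 1] + 0.1 * rng.standard_normal()

print("share of periods above the threshold:", round(float((d > threshold).mean()), 3))
```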

  17. Budget Setting Strategies for the Company's Divisions

    OpenAIRE

    Berg, M; Brekelmans, R.C.M.; De Waegenaere, A.M.B.

    1997-01-01

    The paper deals with the issue of budget setting to the divisions of a company. The approach is quantitative in nature both in the formulation of the requirements for the set-budgets, as related to different general managerial objectives of interest, and in the modelling of the inherent uncertainties in the divisions' revenues. Solutions are provided for specific cases and conclusions are drawn on different aspects of this issue based on analytical and numerical analysis of the results. From ...

  18. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
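
    For reference, the random utility benchmark against which the cognitive model is compared can be as simple as a multinomial logit over attribute-based utilities; the sketch below uses a hypothetical attribute matrix and weights, not the study's stimuli.

```python
# Multinomial logit sketch: choice probabilities are a softmax over deterministic utilities.
import numpy as np

attributes = np.array([[0.8, 0.3],    # option A: attribute 1, attribute 2 (hypothetical)
                       [0.5, 0.6],    # option B
                       [0.2, 0.9]])   # option C
weights = np.array([0.6, 0.4])        # hypothetical attribute weights
utility = attributes @ weights
p = np.exp(utility) / np.exp(utility).sum()   # logit choice probabilities
print(dict(zip("ABC", p.round(3))))
```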

  19. New Trends in Model Coupling Theory, Numerics and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Coquel, F. [CMAP Ecole Polytech, CNRS, UMR 7641, F-91128 Palaiseau (France); Godlewski, E. [UPMC Univ Paris 6, UMR 7598, Lab Jacques Louis Lions, F-75005 Paris (France); Herard, J. M. [EDF RD, F-78400 Chatou (France); Segre, J. [CEA Saclay, DEN, DM2S, F-91191 Gif Sur Yvette (France)

    2010-07-01

    This special issue comprises selected papers from the workshop New Trends in Model Coupling, Theory, Numerics and Applications (NTMC'09), which took place in Paris, September 2 - 4, 2009. The search for optimal technological solutions in a large number of industrial systems requires performing numerical simulations of complex phenomena which are often characterized by the coupling of models related to various space and/or time scales. Thus, so-called multi-scale modelling has been a thriving scientific activity which connects applied mathematics and other disciplines such as physics, chemistry, biology or even social sciences. To illustrate the variety of fields concerned by the natural occurrence of model coupling we may quote: meteorology, where it is required to take into account several turbulence scales or the interaction between oceans and atmosphere, but also regional models in a global description; solid mechanics, where a thorough understanding of complex phenomena such as the propagation of cracks needs to couple various models from the atomistic level to the macroscopic level; plasma physics for fusion energy, for instance, where dense plasmas and collisionless plasmas coexist; multiphase fluid dynamics, when several types of flow corresponding to several types of models are present simultaneously in complex circuits; and social behaviour analysis, with interaction between individual actions and collective behaviour. (authors)

  20. Assembly models for Papovaviridae based on tiling theory

    Science.gov (United States)

    Keef, T.; Taormina, A.; Twarock, R.

    2005-09-01

    A vital constituent of a virus is its protein shell, called the viral capsid, that encapsulates and hence provides protection for the viral genome. Assembly models are developed for viral capsids built from protein building blocks that can assume different local bonding structures in the capsid. This situation occurs, for example, for viruses in the family of Papovaviridae, which are linked to cancer and are hence of particular interest for the health sector. More specifically, the viral capsids of the (pseudo-) T = 7 particles in this family consist of pentamers that exhibit two different types of bonding structures. While this scenario cannot be described mathematically in terms of Caspar-Klug theory (Caspar D L D and Klug A 1962 Cold Spring Harbor Symp. Quant. Biol. 27 1), it can be modelled via tiling theory (Twarock R 2004 J. Theor. Biol. 226 477). The latter is used to encode the local bonding environment of the building blocks in a combinatorial structure, called the assembly tree, which is a basic ingredient in the derivation of assembly models for Papovaviridae along the lines of the equilibrium approach of Zlotnick (Zlotnick A 1994 J. Mol. Biol. 241 59). A phase space formalism is introduced to characterize the changes in the assembly pathways and intermediates triggered by the variations in the association energies characterizing the bonds between the building blocks in the capsid. Furthermore, the assembly pathways and concentrations of the statistically dominant assembly intermediates are determined. The example of Simian virus 40 is discussed in detail.

  1. Global Carbon Budget 2015

    Science.gov (United States)

    Le Quéré, C.; Moriarty, R.; Andrew, R. M.; Canadell, J. G.; Sitch, S.; Korsbakken, J. I.; Friedlingstein, P.; Peters, G. P.; Andres, R. J.; Boden, T. A.; Houghton, R. A.; House, J. I.; Keeling, R. F.; Tans, P.; Arneth, A.; Bakker, D. C. E.; Barbero, L.; Bopp, L.; Chang, J.; Chevallier, F.; Chini, L. P.; Ciais, P.; Fader, M.; Feely, R. A.; Gkritzalis, T.; Harris, I.; Hauck, J.; Ilyina, T.; Jain, A. K.; Kato, E.; Kitidis, V.; Klein Goldewijk, K.; Koven, C.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Lenton, A.; Lima, I. D.; Metzl, N.; Millero, F.; Munro, D. R.; Murata, A.; Nabel, J. E. M. S.; Nakaoka, S.; Nojiri, Y.; O'Brien, K.; Olsen, A.; Ono, T.; Pérez, F. F.; Pfeil, B.; Pierrot, D.; Poulter, B.; Rehder, G.; Rödenbeck, C.; Saito, S.; Schuster, U.; Schwinger, J.; Séférian, R.; Steinhoff, T.; Stocker, B. D.; Sutton, A. J.; Takahashi, T.; Tilbrook, B.; van der Laan-Luijkx, I. T.; van der Werf, G. R.; van Heuven, S.; Vandemark, D.; Viovy, N.; Wiltshire, A.; Zaehle, S.; Zeng, N.

    2015-12-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates as well as consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover change (some including nitrogen-carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global
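
    The residual land sink described above follows from the budget identity SLAND = EFF + ELUC - GATM - SOCEAN; the snippet below evaluates it with round illustrative numbers (in GtC/yr), not the paper's published estimates.

```python
# Worked example of the global carbon budget residual; inputs are illustrative round values.
def residual_land_sink(e_ff, e_luc, g_atm, s_ocean):
    """Residual terrestrial sink as the difference of the other budget terms (GtC/yr)."""
    return e_ff + e_luc - g_atm - s_ocean

print(residual_land_sink(e_ff=9.8, e_luc=1.0, g_atm=4.4, s_ocean=3.0))  # -> 3.4 GtC/yr
```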

  2. Global carbon budget 2014

    Science.gov (United States)

    Le Quéré, C.; Moriarty, R.; Andrew, R. M.; Peters, G. P.; Ciais, P.; Friedlingstein, P.; Jones, S. D.; Sitch, S.; Tans, P.; Arneth, A.; Boden, T. A.; Bopp, L.; Bozec, Y.; Canadell, J. G.; Chini, L. P.; Chevallier, F.; Cosca, C. E.; Harris, I.; Hoppema, M.; Houghton, R. A.; House, J. I.; Jain, A. K.; Johannessen, T.; Kato, E.; Keeling, R. F.; Kitidis, V.; Klein Goldewijk, K.; Koven, C.; Landa, C. S.; Landschützer, P.; Lenton, A.; Lima, I. D.; Marland, G.; Mathis, J. T.; Metzl, N.; Nojiri, Y.; Olsen, A.; Ono, T.; Peng, S.; Peters, W.; Pfeil, B.; Poulter, B.; Raupach, M. R.; Regnier, P.; Rödenbeck, C.; Saito, S.; Salisbury, J. E.; Schuster, U.; Schwinger, J.; Séférian, R.; Segschneider, J.; Steinhoff, T.; Stocker, B. D.; Sutton, A. J.; Takahashi, T.; Tilbrook, B.; van der Werf, G. R.; Viovy, N.; Wang, Y.-P.; Wanninkhof, R.; Wiltshire, A.; Zeng, N.

    2015-05-01

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuel combustion and cement production (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover-change (some including nitrogen-carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each

  3. DEBtox theory and matrix population models as helpful tools in understanding the interaction between toxic cyanobacteria and zooplankton.

    Science.gov (United States)

    Billoir, Elise; da Silva Ferrão-Filho, Aloysio; Laure Delignette-Muller, Marie; Charles, Sandrine

    2009-06-01

    Bioassays were performed to find out how field samples of the toxic cyanobacteria Microcystis aeruginosa affect Moina micrura, a cladoceran found in the tropical Jacarepagua Lagoon (Rio de Janeiro, Brazil). The DEBtox (Dynamic Energy Budget theory applied to toxicity data) approach has been proposed for use in analysing chronic toxicity tests as an alternative to calculating the usual safety parameters (NOEC, ECx). DEBtox theory deals with the energy balance between physiological processes (assimilation, maintenance, growth and reproduction), and it can be used to investigate and compare various hypotheses concerning the mechanism of action of a toxicant. Even though the DEBtox framework was designed for standard toxicity bioassays carried out with standard species (fish, daphnids), we applied the growth and reproduction models to M. micrura, by adapting the data available using a weight-length allometric relationship. Our modelling approach appeared to be very relevant at the individual level, and confirmed previous conclusions about the toxic mechanism. In our study we also wanted to assess the toxic effects at the population level, which is a more relevant endpoint in risk assessment. We therefore incorporated both lethal and sublethal toxic effects in a matrix population model used to calculate the finite rate of population change as a continuous function of the exposure concentration. Alongside this calculation, we constructed a confidence band to predict the critical exposure concentration for population health. Finally, we discuss our findings with regard to the prospects for further refining the analysis of ecotoxicological data. PMID:18706427
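
    A minimal sketch of the final step described above, folding lethal and sublethal effects into a matrix population model and reading off the finite rate of increase as the dominant eigenvalue, is shown below; the stage structure and dose-response forms are illustrative assumptions, not the fitted DEBtox model.

```python
# Toxicant effects in a 3-stage matrix population model; lambda is the dominant eigenvalue.
# Stage structure, baseline vital rates and dose-response curves are illustrative assumptions.
import numpy as np

def finite_rate_of_increase(conc, ec50_surv=5.0, ec50_repro=2.0):
    surv = 0.8 / (1.0 + conc / ec50_surv)    # lethal effect scales survival down
    fec = 6.0 / (1.0 + conc / ec50_repro)    # sublethal effect scales fecundity down
    leslie = np.array([[0.0,  fec,  fec],
                       [surv, 0.0,  0.0],
                       [0.0,  surv, 0.0]])
    return float(np.max(np.real(np.linalg.eigvals(leslie))))

for conc in (0.0, 1.0, 5.0):
    print(f"exposure {conc}: lambda = {finite_rate_of_increase(conc):.2f}")
```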

  4. The Gaussian streaming model and Lagrangian effective field theory

    CERN Document Server

    Vlah, Zvonimir; White, Martin

    2016-01-01

    We update the ingredients of the Gaussian streaming model (GSM) for the redshift-space clustering of biased tracers using the techniques of Lagrangian perturbation theory, effective field theory (EFT) and a generalized Lagrangian bias expansion. After relating the GSM to the cumulant expansion, we present new results for the real-space correlation function, mean pairwise velocity and pairwise velocity dispersion including counter terms from EFT and bias terms through third order in the linear density, its leading derivatives and its shear up to second order. We discuss the connection to the Gaussian peaks formalism. We compare the ingredients of the GSM to a suite of large N-body simulations, and show the performance of the theory on the low order multipoles of the redshift-space correlation function and power spectrum. We highlight the importance of a general biasing scheme, which we find to be as important as higher-order corrections due to non-linear evolution for the halos we consider on the scales of int...

  5. Theory and Modeling of High-Power Gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Nusinovich, Gregory Semeon [Univ. of Maryland, College Park, MD (United States)

    2016-04-29

    This report summarizes the results of the work performed at the Institute for Research in Electronics and Applied Physics of the University of Maryland (College Park, MD) in the framework of the DOE Grant “Theory and Modeling of High-Power Gyrotrons”. The report covers the work performed in 2011-2014. The research work was performed in three directions: - possibilities of stable gyrotron operation in very high-order modes offering output power exceeding the 1 MW level in long-pulse/continuous-wave regimes, - the effect of small imperfections in gyrotron fabrication and alignment on gyrotron efficiency and operation, - some issues in the physics of beam-wave interaction in gyrotrons.

  6. Mean-field theory and self-consistent dynamo modeling

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  7. Wireless network traffic modeling based on extreme value theory

    Science.gov (United States)

    Liu, Chunfeng; Shu, Yantai; Yang, Oliver W. W.; Liu, Jiakun; Dong, Linfang

    2006-10-01

    In this paper, Extreme Value Theory (EVT) is presented to analyze wireless network traffic. The role of EVT is to allow the development of procedures that are scientifically and statistically rational for estimating the extreme behavior of random processes. There are two primary methods for studying extremes: the Block Maximum (BM) method and the Points Over Threshold (POT) method. Using only the traffic data that exceeds the threshold value, our experiment and analysis show that the wireless network traffic model obtained with EVT fits the empirical traffic distribution well, illustrating that EVT has good application prospects in the analysis of wireless network traffic.
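
    The POT step mentioned above is typically implemented by fitting a generalized Pareto distribution to exceedances over a high threshold; the sketch below does this on synthetic stand-in traffic, since the measured traces are not available here.

```python
# Peaks/points-over-threshold sketch: fit a generalized Pareto distribution to exceedances.
# The synthetic "traffic" series and the 95th-percentile threshold are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
traffic = rng.lognormal(mean=2.0, sigma=0.6, size=5000)   # stand-in for measured traffic
u = np.quantile(traffic, 0.95)                            # high threshold
exceedances = traffic[traffic > u] - u
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"threshold={u:.1f}, GPD shape={shape:.3f}, scale={scale:.3f}")
```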

  8. BUDGET AND PUBLIC DEBT

    Directory of Open Access Journals (Sweden)

    Morar Ioan Dan

    2014-12-01

    Full Text Available Public budgeting is an important issue for the public policy of the state, for the simple reason that without money from the state budget no public policy can be promoted. Budgetary policy mirrors the official government doctrine and also represents a starting point for other public policies, which in turn are financed by the public budget. Through the fiscal policy instruments at its disposal, the state influences both the public sector, in its structure, and the private sector. Tools such as grants, budgetary allocations, taxes, welfare in its various forms, direct investment and, not least, state aid are used by the state through its budgetary policies to influence the public and private sectors directly and indirectly. Fiscal policies can be grouped, according to the structure of the public sector, into the following components: fiscal policy, budgeting and resource allocation policies, and policies for financing the budget deficit. An important issue is the financing of the budget deficit. There are two funding possibilities: higher or additional taxes on the one hand, and recourse to public loans on the other. Both options involve extra effort from taxpayers, either in the current fiscal year when they pay higher taxes, or in a future period when the public loans will be repaid. By virtue of the "fiscal pact", the structural deficits of the EU member countries are limited by the European Commission, according to the macro-structural and budgetary stability of each Member State. This tempers to some extent the budgetary appetite of the Member States' governments, but does not solve the problem of chronic budget deficits. Another issue addressed in this paper is public debt: its absolute amount, its relative level with respect to GDP, the financing of public debt and the sources of its repayment. The sources of public debt issuance and their monetary impact on the budget and on monetary stability are variables that must underpin the justification of budgetary

  9. Corvid re-caching without 'theory of mind': a model.

    Directory of Open Access Journals (Sweden)

    Elske van der Vaart

    Full Text Available Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  10. Adjusting Economic of the Romania’s GDP Using Econometric Model of the System: Budget Expenditure - GDP

    OpenAIRE

    Nadia Stoicuţa; Ana Maria Giurgiulescu; Olimpiu Stoicuţa

    2009-01-01

    The paper presents a model for the economic adjustment of Romania's GDP using an econometric model that takes budget expenditure as input and Romania's GDP as output. The adjustment is based on a discrete-time linear quadratic regulator.
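
    The discrete-time linear quadratic regulator mentioned above can be illustrated with a scalar sketch: the deviation of GDP from target is the state, budget expenditure the control, and the feedback gain comes from a backward Riccati recursion; all coefficients are illustrative assumptions, not estimates from the paper.

```python
# Scalar discrete-time LQR sketch: x[t+1] = a*x[t] + b*u[t], cost sum(q*x^2 + r*u^2).
# System and cost parameters are illustrative assumptions.
a, b = 1.02, 0.5      # state and input coefficients
q, r = 1.0, 0.2       # state and control weights
p = q
for _ in range(500):                      # iterate the Riccati recursion to convergence
    k = (b * p * a) / (r + b * p * b)     # feedback gain, control law u = -k * x
    p = q + a * p * a - a * p * b * k
print(f"steady-state feedback gain k = {k:.3f}")
```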

  11. A mercury transport and fate model (LM2-mercury) for mass budget assessment of mercury cycling in Lake Michigan

    Science.gov (United States)

    LM2-Mercury, a mercury mass balance model, was developed to simulate and evaluate the transport, fate, and biogeochemical transformations of mercury in Lake Michigan. The model simulates total suspended solids (TSS), disolved organic carbon (DOC), and total, elemental, divalent, ...

  12. Modeling the impact of tropical mesoscale convective systems on Sahelian mineral dust budget: a case study during AMMA SOPs 1-2

    Science.gov (United States)

    Bouet, C.; Cautenet, G.; Marticorena, B.; Bergametti, G.; Chatenet, B.; Rajot, J.-L.; Descroix, L.

    2009-04-01

    Tropical mesoscale convective systems (MCSs) are a prominent feature of the African meteorology. A continuous monitoring of the aeolian activity in an experimental site located in Niger showed that such events are responsible for the major part of the annual local wind erosion, i.e. for most of the Sahelian dust emission [Rajot, 2001]. However, the net effect of these MCSs on mineral dust budget has to be estimated: on the one hand, these systems produce extremely high surface wind velocities leading to intense dust uptake, but on the other hand, rainfalls associated with these systems can efficiently remove the emitted dust from the atmosphere. High resolution modeling of MCSs appears as the most relevant approach to assess the budget between dust emission and deposition in such local meteorological systems. As a first step, in order to properly estimate dust emissions, it is necessary to accurately describe the surface wind fields at the local scale. Indeed, dust emission is a threshold phenomenon that depends on the third power of surface wind velocity. This study focuses on a case study of dust emission associated with the passage of a MCS observed during one of the intensive observation period of the international African Monsoon Multidisciplinary Analysis (AMMA - SOPs 1-2) program. The simulations were made using the Regional Atmospheric Modeling System (RAMS) coupled online with the dust production model (DPM) developed by Marticorena and Bergametti [1995] and recently improved by Laurent et al. [2008] for Africa. Two horizontal resolutions were tested (5 km and 2.5 km) as well as two microphysical schemes (a 1-moment scheme [Walko et al., 1995] and a 2-moment scheme [Meyers et al., 1997]). The use of the two convective parameterizations now available in the version 6 of RAMS (Kuo [1995] modified by Molinari [1985] and Molinari and Corsetti [1985], and Kain and Fritsch [1992; 1993]) to simulate cloud convection was also tested. Sensitivity tests have been

  13. Confronting the WRF and RAMS mesoscale models with innovative observations in the Netherlands: Evaluating the boundary layer heat budget

    NARCIS (Netherlands)

    Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.

    2011-01-01

    The Weather Research and Forecasting Model (WRF) and the Regional Atmospheric Modeling System (RAMS) are frequently used for (regional) weather, climate and air quality studies. This paper covers an evaluation of these models for a windy and calm episode against Cabauw tower observations (Net

  14. Reconsideration of r/K Selection Theory Using Stochastic Control Theory and Nonlinear Structured Population Models.

    Directory of Open Access Journals (Sweden)

    Ryo Oizumi

    Full Text Available Despite the fact that density effects and individual differences in life history are considered to be important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size is still unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose a few equations on adaptive life strategies in r/K selection where density effects are absent or present. The equations detail not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, which are referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies in both selections maximize an identical function, providing both population growth rate and carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than those obtained with other strategies. This study proposes that the diversity of life strategies arises due to the effects of density and internal stochasticity.

  15. A theoretical model for Reynolds-stress and dissipation-rate budgets in near-wall region

    Institute of Scientific and Technical Information of China (English)

    陆利蓬; 陈矛章

    2000-01-01

    A 3-D wave model for the turbulent coherent structures in near-wall region is proposed. The transport nature of the Reynolds stresses and dissipation rate of the turbulence kinetic energy are shown via computation based on the theoretical model. The mean velocity profile is also computed by using the same theoretical model. The theoretical results are in good agreement with those found from DNS, indicating that the theoretical model proposed can correctly describe the physical mechanism of turbulence in near wail region and it thus possibly opens a new way for turbulence modeling in this region.

  16. A theoretical model for Reynolds-stress and dissipation-rate budgets in near-wall region

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A 3-D wave model for the turbulent coherent structures in near-wall region is proposed. The transport nature of the Reynolds stresses and dissipation rate of the turbulence kinetic energy are shown via computation based on the theoretical model. The mean velocity profile is also computed by using the same theoretical model. The theoretical results are in good agreement with those found from DNS, indicating that the theoretical model proposed can correctly describe the physical mechanism of turbulence in near wall region and it thus possibly opens a new way for turbulence modeling in this region.

  17. Modelling the growth of white seabream (Diplodus sargus) and gilthead seabream (Sparus aurata) in semi-intensive earth production ponds using the Dynamic Energy Budget approach

    Science.gov (United States)

    Serpa, Dalila; Ferreira, Pedro Pousão; Ferreira, Hugo; da Fonseca, Luís Cancela; Dinis, Maria Teresa; Duarte, Pedro

    2013-02-01

    Fish growth models may help understanding the influence of environmental, physiological and husbandry factors on fish production, providing crucial information to maximize the growth rates of cultivated species. The main objectives of this work were to: i) develop and implement an Individual Based Model using a Dynamic Energy Budget (IBM-DEB) approach to simulate the growth of two commercially important Sparidae species in semi-intensive earth ponds, the white seabream which is considered as a potential candidate for Mediterranean aquaculture and the gilthead seabream that has been cultivated since the early 80s; ii) evaluate which model parameters are more likely to affect fish performance, and iii) investigate which parameters might account for growth differences between the cultivated species. The model may be run in two modes: the "state variable" mode, in which an average fish is simulated with a particular parameter set and the "Individual Based Model" (IBM) mode that simulates a population of n fishes, each with its specific parameter set assigned randomly. The IBM mode has the advantage of allowing a quick model calibration and an evaluation of the parameter sets that produce the best fit between predicted and observed fish growth. Results revealed that the model reproduces reasonably well the growth of the two seabreams. Fish performance was mainly affected by parameters related to feed ingestion/assimilation and reserves utilization, suggesting that special attention should be taken in the estimation of these parameters when applying the model to other species. Comparing the DEB parameters set of the two sparids it seems that the white seabream's low growth rates are a result of higher maintenance costs and a lower feed assimilation efficiency. Hence, the development of new feed formulations may be crucial for the success of white seabream production in semi-intensive earth ponds.
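
    Under constant food, DEB theory reduces structural growth to von Bertalanffy form, dL/dt = rB(f·Lm - L); the sketch below integrates this for a hypothetical parameter set, not the seabream parameters fitted in the paper.

```python
# DEB-style growth sketch under constant food: von Bertalanffy growth of structural length.
# rB (growth rate), Lm (maximum length) and f (scaled functional response) are illustrative.
import numpy as np

rB, Lm, f = 0.01, 35.0, 0.8      # 1/day, cm, dimensionless food level
dt, days = 1.0, 720
L = np.empty(days)
L[0] = 2.0                        # initial length (cm)
for t in range(1, days):
    L[t] = L[t - 1] + dt * rB * (f * Lm - L[t - 1])   # dL/dt = rB*(f*Lm - L)

print(f"length after {days} days: {L[-1]:.1f} cm (asymptote {f * Lm:.1f} cm)")
```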

  18. The JGrass-NewAge system for forecasting and managing the hydrological budgets at the basin scale: the models of flow generation, propagation, and aggregation

    Directory of Open Access Journals (Sweden)

    G. Formetta

    2011-04-01

    Full Text Available This paper presents a discussion of the predictive capacity of the first implementation of the semi-distributed hydrological modeling system JGrass-NewAge. This model focuses on the hydrological balance of medium scale to large scale basins, and considers statistics of the processes at the hillslope scale. The whole modeling system consists of six main parts: (i) estimation of energy balance; (ii) estimation of evapotranspiration; (iii) snow modelling; (iv) estimation of runoff production; (v) aggregation and propagation of flows in channels, and (vi) description of intakes, out-takes, and reservoirs. This paper details the processes of runoff production and aggregation/propagation of flows on a river network. The system is based on a hillslope-link geometrical partition of the landscape, so the basic unit, where the budget is evaluated, consists of hillslopes that drain into a single associated link rather than cells or pixels. To this conceptual partition corresponds an implementation of informatics that uses vectorial features for channels, and raster data for hillslopes. Runoff production at each channel link is estimated through a combination of the Duffy (1996) model and a GIUH model for estimating residence times in hillslopes. Routing in channels uses equations integrated for any channel link, and produces discharges at any link end, for any link in the river network. The model has been tested against measured discharges according to some indexes of goodness of fit such as RMSE and Nash-Sutcliffe. The characteristic ability to reproduce discharge at any point of the river network is used to infer some statistics, and notably, the scaling properties of the modeled discharge.

  19. The JGrass-NewAge system for forecasting and managing the hydrological budgets at the basin scale: the models of flow generation, propagation, and aggregation

    Science.gov (United States)

    Formetta, G.; Mantilla, R.; Franceschi, S.; Antonello, A.; Rigon, R.

    2011-04-01

    This paper presents a discussion of the predictive capacity of the first implementation of the semi-distributed hydrological modeling system JGrass-NewAge. This model focuses on the hydrological balance of medium scale to large scale basins, and considers statistics of the processes at the hillslope scale. The whole modeling system consists of six main parts: (i) estimation of energy balance; (ii) estimation of evapotranspiration; (iii) snow modelling; (iv) estimation of runoff production; (v) aggregation and propagation of flows in channels, and (vi) description of intakes, out-takes, and reservoirs. This paper details the processes of runoff production and aggregation/propagation of flows on a river network. The system is based on a hillslope-link geometrical partition of the landscape, so the basic unit, where the budget is evaluated, consists of hillslopes that drain into a single associated link rather than cells or pixels. To this conceptual partition corresponds an implementation of informatics that uses vectorial features for channels, and raster data for hillslopes. Runoff production at each channel link is estimated through a combination of the Duffy (1996) model and a GIUH model for estimating residence times in hillslopes. Routing in channels uses equations integrated for any channel link, and produces discharges at any link end, for any link in the river network. The model has been tested against measured discharges according to some indexes of goodness of fit such as RMSE and Nash-Sutcliffe. The characteristic ability to reproduce discharge at any point of the river network is used to infer some statistics, and notably, the scaling properties of the modeled discharge.

  20. Budget Constraints Affect Male Rats' Choices between Differently Priced Commodities.

    Science.gov (United States)

    van Wingerden, Marijn; Marx, Christine; Kalenscher, Tobias

    2015-01-01

    Demand theory can be applied to analyse how a human or animal consumer changes her selection of commodities within a certain budget in response to changes in price of those commodities. This change in consumption assessed over a range of prices is defined as demand elasticity. Previously, income-compensated and income-uncompensated price changes have been investigated using human and animal consumers, as demand theory predicts different elasticities for both conditions. However, in these studies, demand elasticity was only evaluated over the entirety of choices made from a budget. As compensating budgets changes the number of attainable commodities relative to uncompensated conditions, and thus the number of choices, it remained unclear whether budget compensation has a trivial effect on demand elasticity by simply sampling from a different total number of choices or has a direct effect on consumers' sequential choice structure. If the budget context independently changes choices between commodities over and above price effects, this should become apparent when demand elasticity is assessed over choice sets of any reasonable size that are matched in choice opportunities between budget conditions. To gain more detailed insight in the sequential choice dynamics underlying differences in demand elasticity between budget conditions, we trained N=8 rat consumers to spend a daily budget by making a number of nosepokes to obtain two liquid commodities under different price regimes, in sessions with and without budget compensation. We confirmed that demand elasticity for both commodities differed between compensated and uncompensated budget conditions, also when the number of choices considered was matched, and showed that these elasticity differences emerge early in the sessions. These differences in demand elasticity were driven by a higher choice rate and an increased reselection bias for the preferred commodity in compensated compared to uncompensated budget conditions
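
    Demand elasticity as used above is commonly estimated as the slope of log consumption against log price; the sketch below runs that regression on made-up consumption counts, not the rats' data.

```python
# Demand elasticity sketch: slope of log(consumption) on log(price). Numbers are made up.
import numpy as np

prices = np.array([1, 2, 4, 8])            # nosepokes required per reward
consumed = np.array([120, 80, 45, 20])     # rewards earned at each price (illustrative)
elasticity = np.polyfit(np.log(prices), np.log(consumed), 1)[0]
print(f"demand elasticity: {elasticity:.2f}")   # more negative = more elastic demand
```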

  1. Budget Constraints Affect Male Rats' Choices between Differently Priced Commodities.

    Directory of Open Access Journals (Sweden)

    Marijn van Wingerden

    Full Text Available Demand theory can be applied to analyse how a human or animal consumer changes her selection of commodities within a certain budget in response to changes in price of those commodities. This change in consumption assessed over a range of prices is defined as demand elasticity. Previously, income-compensated and income-uncompensated price changes have been investigated using human and animal consumers, as demand theory predicts different elasticities for both conditions. However, in these studies, demand elasticity was only evaluated over the entirety of choices made from a budget. As compensating budgets changes the number of attainable commodities relative to uncompensated conditions, and thus the number of choices, it remained unclear whether budget compensation has a trivial effect on demand elasticity by simply sampling from a different total number of choices or has a direct effect on consumers' sequential choice structure. If the budget context independently changes choices between commodities over and above price effects, this should become apparent when demand elasticity is assessed over choice sets of any reasonable size that are matched in choice opportunities between budget conditions. To gain more detailed insight in the sequential choice dynamics underlying differences in demand elasticity between budget conditions, we trained N=8 rat consumers to spend a daily budget by making a number of nosepokes to obtain two liquid commodities under different price regimes, in sessions with and without budget compensation. We confirmed that demand elasticity for both commodities differed between compensated and uncompensated budget conditions, also when the number of choices considered was matched, and showed that these elasticity differences emerge early in the sessions. These differences in demand elasticity were driven by a higher choice rate and an increased reselection bias for the preferred commodity in compensated compared to

  2. Using energetic budgets to assess the effects of environmental stress on corals: are we measuring the right things?

    Science.gov (United States)

    Lesser, M. P.

    2013-03-01

    Historically, the response of marine invertebrates to their environment, and environmentally induced stress, has included some measurement of their physiology or metabolism. Eventually, this approach developed into comparative energetics and the construction of energetic budgets. More recently, coral reefs, and scleractinian corals in particular, have suffered significant declines due to climate change-related environmental stress. In addition to a number of physiological, biophysical and molecular measurements to assess "coral health," there has been increased use of energetic approaches that have included the measurement of specific biochemical constituents (i.e., lipid concentrations) as a proxy for energy available to assess the potential outcomes of environmental stress on corals. In reading these studies, there appears to be some confusion between energy budgets and carbon budgets. Additionally, many assumptions regarding proximate biochemical composition, metabolic fuel preferences and metabolic quotients have been made, all of which are essential to construct accurate energy budgets and to convert elemental composition (i.e., carbon) to energy equivalents. Additionally, models of energetics such as the metabolic theory of ecology or dynamic energy budgets are being applied to coral physiology and include several assumptions that are not appropriate for scleractinian corals. As we assess the independent and interactive effects of multiple stressors on corals, efforts to construct quantitative energetic budgets should be a priority component of realistic multifactor experiments that would then improve the use of models as predictors of outcomes related to the effects of environmental change on corals.

  3. Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.

    Science.gov (United States)

    Merrick, Jason R W; Leclerc, Philip

    2016-04-01

    Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek for the rational decision, but we use prospect theory to solve for the attacker's decision to descriptively model the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. PMID:25039254
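
    The attacker-side ingredients mentioned above, loss aversion and likelihood insensitivity, are captured by the prospect-theory value and probability-weighting functions; the sketch below uses the commonly cited Tversky-Kahneman parameter estimates purely for illustration, with a simplified treatment of mixed gambles.

```python
# Prospect theory sketch: loss-averse value function and inverse-S probability weighting.
# Parameter values (alpha, lambda, gamma) are standard published estimates, used here only
# for illustration; the gamble itself is hypothetical.
def value(x, alpha=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma=0.61):
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1.0 / gamma)

# Prospect value of a gamble: gain 100 with probability 0.1, lose 20 otherwise.
pv = weight(0.1) * value(100) + weight(0.9) * value(-20)
print(f"prospect value: {pv:.2f}")
```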

  4. Theory and Modeling for the Magnetospheric Multiscale Mission

    Science.gov (United States)

    Hesse, M.; Aunai, N.; Birn, J.; Cassak, P.; Denton, R. E.; Drake, J. F.; Gombosi, T.; Hoshino, M.; Matthaeus, W.; Sibeck, D.; Zenitani, S.

    2016-03-01

    The Magnetospheric Multiscale (MMS) mission will provide measurement capabilities, which will exceed those of earlier and even contemporary missions by orders of magnitude. MMS will, for the first time, be able to measure directly and with sufficient resolution key features of the magnetic reconnection process, down to the critical electron scales, which need to be resolved to understand how reconnection works. Owing to the complexity and extremely high spatial resolution required, no prior measurements exist, which could be employed to guide the definition of measurement requirements, and consequently set essential parameters for mission planning and execution. Insight into expected details of the reconnection process could hence only be obtained from theory and modern kinetic modeling. This situation was recognized early on by MMS leadership, which supported the formation of a fully integrated Theory and Modeling Team (TMT). The TMT participated in all aspects of mission planning, from the proposal stage to individual aspects of instrument performance characteristics. It provided and continues to provide to the mission the latest insights regarding the kinetic physics of magnetic reconnection, as well as associated particle acceleration and turbulence, assuring that, to the best of modern knowledge, the mission is prepared to resolve the inner workings of the magnetic reconnection process. The present paper provides a summary of key recent results of reconnection research by TMT members.

  5. Pluralistic and stochastic gene regulation: examples, models and consistent theory.

    Science.gov (United States)

    Salas, Elisa N; Shu, Jiang; Cserhati, Matyas F; Weeks, Donald P; Ladunga, Istvan

    2016-06-01

    We present a theory of pluralistic and stochastic gene regulation. To bridge the gap between empirical studies and mathematical models, we integrate pre-existing observations with our meta-analyses of the ENCODE ChIP-Seq experiments. Earlier evidence includes fluctuations in levels, location, activity, and binding of transcription factors, variable DNA motifs, and bursts in gene expression. Stochastic regulation is also indicated by frequently subdued effects of knockout mutants of regulators, their evolutionary losses/gains and massive rewiring of regulatory sites. We report widespread pluralistic regulation in ≈800 000 tightly co-expressed pairs of diverse human genes. Typically, half of ≈50 observed regulators bind to both genes reproducibly, twice as many as in independently expressed gene pairs. We also examine the largest set of co-expressed genes, which code for cytoplasmic ribosomal proteins. Numerous regulatory complexes are highly significantly enriched in ribosomal genes compared to highly expressed non-ribosomal genes. We could not find any DNA-associated, strict-sense master regulator. Despite major fluctuations in transcription factor binding, our machine learning model accurately predicted transcript levels using binding sites of 20+ regulators. Our pluralistic and stochastic theory is consistent with partially random binding patterns, redundancy, stochastic regulator binding, burst-like expression, degeneracy of binding motifs and massive regulatory rewiring during evolution.

  6. Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.

    Science.gov (United States)

    Merrick, Jason R W; Leclerc, Philip

    2016-04-01

    Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek for the rational decision, but we use prospect theory to solve for the attacker's decision to descriptively model the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions.

  7. Studying the impact of overshooting convection on the tropical tropopause layer (TTL) water vapor budget at the continental scale using a mesoscale model

    Science.gov (United States)

    Behera, Abhinna; Rivière, Emmanuel; Marécal, Virginie; Claud, Chantal; Rysman, Jean-François; Geneviève, Seze

    2016-04-01

    Water vapour budget is a key component of the earth climate system. In the tropical upper troposphere and lower stratosphere (UTLS), it plays a central role in both the radiative and the chemical budget. Its abundance is mostly driven by slow ascent above the level of net zero radiative heating, followed by ice crystal formation and sedimentation, the so-called cold trap. In contrast to this large-scale, temperature-driven process, overshooting convection penetrating the stratosphere could be one piece of the puzzle. It has been proven to hydrate the lower stratosphere at the local scale. Satellite-borne H2O instruments cannot measure with a fine enough resolution the water vapour enhancements caused by overshooting convection. The consequence is that it is difficult to estimate the role of overshooting deep convection at the global scale. Using a mesoscale model, i.e. the Brazilian Regional Atmospheric Modelling System (BRAMS), past atmospheric conditions have been simulated for the full wet season, i.e. Nov 2012 to Mar 2013, on a single grid with a horizontal resolution of 20 km × 20 km over a large part of Brazil and South America. This resolution is too coarse to reproduce overshooting convection in the model, so that this simulation should be used as a reference (REF) simulation, without the impact of overshooting convection on the TTL water budget. For initialisation, as well as nudging the grid boundaries every 6 hours, European Centre for Medium-Range Weather Forecasts (ECMWF) analyses have been used. The size distribution of hydrometeors and the number of cloud condensation nuclei (CCN) are fitted in order to best reproduce accumulated precipitation derived from the Tropical Rainfall Measuring Mission (TRMM). Similarly, GOES and MSG IR images have been thoroughly compared with the model's outputs, using image correlation statistics for the position of the clouds. The model H2O variability during the wet season is compared with the in situ balloon-borne measurements during

  8. Forecasting carbon budget under climate change and CO2 fertilization for subtropical region in China using integrated biosphere simulator (IBIS) model

    Science.gov (United States)

    Zhu, Q.; Jiang, H.; Liu, J.; Peng, C.; Fang, X.; Yu, S.; Zhou, G.; Wei, X.; Ju, W.

    2011-01-01

    The regional carbon budget of the climatic transition zone may be very sensitive to climate change and increasing atmospheric CO2 concentrations. This study simulated the carbon cycles under these changes using process-based ecosystem models. The Integrated Biosphere Simulator (IBIS), a Dynamic Global Vegetation Model (DGVM), was used to evaluate the impacts of climate change and CO2 fertilization on net primary production (NPP), net ecosystem production (NEP), and the vegetation structure of terrestrial ecosystems in Zhejiang province (area 101,800 km2, mainly covered by subtropical evergreen forest and warm-temperate evergreen broadleaf forest) which is located in the subtropical climate area of China. Two general circulation models (HADCM3 and CGCM3) representing four IPCC climate change scenarios (HC3AA, HC3GG, CGCM-sresa2, and CGCM-sresb1) were used as climate inputs for IBIS. Results show that simulated historical biomass and NPP are consistent with field and other modelled data, which makes the analysis of future carbon budget reliable. The results indicate that NPP over the entire Zhejiang province was about 55 Mt C yr-1 during the last half of the 21st century. An NPP increase of about 24 Mt C by the end of the 21st century was estimated with the combined effects of increasing CO2 and climate change. A slight NPP increase of about 5 Mt C was estimated under the climate change alone scenario. Forests in Zhejiang are currently acting as a carbon sink with an average NEP of about 2.5 Mt C yr-1. NEP will increase to about 5 Mt C yr-1 by the end of the 21st century with the increasing atmospheric CO2 concentration and climate change. However, climate change alone will reduce the forest carbon sequestration of Zhejiang's forests. Future climate warming will substantially change the vegetation cover types; warm-temperate evergreen broadleaf forest will be gradually substituted by subtropical evergreen forest. An increasing CO2 concentration will have little

  9. A spatial implementation of the BIOME-BGC to model grassland GPP production and water budgets in the Ecuadorian Andean Region

    Science.gov (United States)

    Minaya, Veronica; Corzo, Gerald; van der Kwast, Johannes; Mynett, Arthur

    2016-04-01

    Many terrestrial biogeochemistry process models have been applied around the world at different scales and for a large range of ecosystems. Grasslands, and in particular the ones located in the Andean Region, are essential ecosystems that sustain important ecological processes; however, only a few efforts have been made to estimate the gross primary production (GPP) and the hydrological budgets for this specific ecosystem along an altitudinal gradient. A previous study, which is one of the few available in the region, considered the heterogeneity of the main properties of the páramo vegetation and showed significant differences in plant functional types, site/soil parameters and daily meteorology. This study extends the above-mentioned work and uses spatio-temporal analysis of the BIOME-BGC model results. This was done to simulate the GPP and the water fluxes in space and time, by applying altitudinal analysis. The catchment located on the southwestern slope of the Antisana volcano in Ecuador was selected as a representative area of the Andean páramos and for its hydrological importance as one of the main sources of a water supply reservoir in the region. An accurate estimation of temporal changes in GPP in the region is important for carbon budget assessments, evaluation of the impact of climate change and biomass productivity. This complex and yet interesting problem was addressed with the ecosystem process model BIOME-BGC; the results were evaluated and associated with the land cover map where the growth forms of vegetation were identified. The responses of GPP and the water fluxes were not only dependent on the environmental drivers but also on the ecophysiology and the site-specific parameters. The model estimated that the GPP at lower elevations doubles the amount estimated at higher elevations, which might have a large implication during extrapolations at larger spatio-temporal scales. The outcomes of the stand hydrological processes demonstrated a wrong

  10. Forecasting carbon budget under climate change and CO2 fertilization for subtropical region in China using integrated biosphere simulator (IBIS) model

    Science.gov (United States)

    Zhu, Q.; Jiang, H.; Liu, J.; Peng, C.; Fang, X.; Yu, S.; Zhou, G.; Wei, X.; Ju, W.

    2011-01-01

    The regional carbon budget of the climatic transition zone may be very sensitive to climate change and increasing atmospheric CO2 concentrations. This study simulated the carbon cycles under these changes using process-based ecosystem models. The Integrated Biosphere Simulator (IBIS), a Dynamic Global Vegetation Model (DGVM), was used to evaluate the impacts of climate change and CO2 fertilization on net primary production (NPP), net ecosystem production (NEP), and the vegetation structure of terrestrial ecosystems in Zhejiang province (area 101,800 km2, mainly covered by subtropical evergreen forest and warm-temperate evergreen broadleaf forest) which is located in the subtropical climate area of China. Two general circulation models (HADCM3 and CGCM3) representing four IPCC climate change scenarios (HC3AA, HC3GG, CGCM-sresa2, and CGCM-sresb1) were used as climate inputs for IBIS. Results show that simulated historical biomass and NPP are consistent with field and other modelled data, which makes the analysis of the future carbon budget reliable. The results indicate that NPP over the entire Zhejiang province was about 55 Mt C yr-1 during the last half of the 21st century. An NPP increase of about 24 Mt C by the end of the 21st century was estimated with the combined effects of increasing CO2 and climate change. A slight NPP increase of about 5 Mt C was estimated under the climate change alone scenario. Forests in Zhejiang are currently acting as a carbon sink with an average NEP of about 2.5 Mt C yr-1. NEP will increase to about 5 Mt C yr-1 by the end of the 21st century with the increasing atmospheric CO2 concentration and climate change. However, climate change alone will reduce the carbon sequestration of Zhejiang's forests. Future climate warming will substantially change the vegetation cover types; warm-temperate evergreen broadleaf forest will be gradually substituted by subtropical evergreen forest. An increasing CO2 concentration will have
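
    The reported sink figures follow the usual budget identity NEP = NPP − Rh, where Rh is heterotrophic respiration. The sketch below uses only the abstract's round numbers and treats Rh as an inferred quantity, so it is bookkeeping for orientation rather than an IBIS calculation.

```python
# Minimal bookkeeping sketch (not IBIS itself): the provincial carbon sink is
# expressed as NEP = NPP - Rh (heterotrophic respiration). The numbers below
# are the abstract's round figures; Rh is inferred, not reported.
def nep(npp_mtc_per_yr: float, rh_mtc_per_yr: float) -> float:
    """Net ecosystem production in Mt C yr-1."""
    return npp_mtc_per_yr - rh_mtc_per_yr

npp_today = 55.0                     # Mt C yr-1, reported provincial NPP
nep_today = 2.5                      # Mt C yr-1, reported sink strength
rh_today = npp_today - nep_today     # implied heterotrophic respiration
print(f"implied Rh today ~ {rh_today:.1f} Mt C yr-1")

# End-of-century case under combined CO2 + climate forcing (abstract values):
npp_2100 = npp_today + 24.0
nep_2100 = 5.0
print(f"implied Rh in 2100 ~ {npp_2100 - nep_2100:.1f} Mt C yr-1")
```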

  11. SIMP model at NNLO in chiral perturbation theory

    DEFF Research Database (Denmark)

    Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.

    2015-01-01

    We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles. By performing a consistent next-to-leading and next-to-next-to-leading order chiral perturbative investigation we demonstrate that the leading order analysis cannot be used to draw conclusions about the viability of the model. We further show that higher order corrections substantially increase the tension with phenomenological constraints, challenging the viability of the simplest realisation of the strongly interacting massive particle (SIMP) paradigm.
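
    For orientation, the relic abundance in such 3-to-2 frameworks is usually tracked with a Boltzmann equation of the schematic form below. This is the standard expression used in the SIMP literature, quoted as an assumption for context rather than reproduced from this record.

```latex
% Schematic 3-to-2 freeze-out equation used in the SIMP literature (quoted
% for orientation): n is the dark-pion number density, H the Hubble rate,
% and <sigma v^2> the thermally averaged 3->2 cross section.
\frac{\mathrm{d}n}{\mathrm{d}t} + 3 H n
  \;=\; -\,\langle \sigma v^{2} \rangle_{3\to 2}\,
        \left( n^{3} - n^{2}\, n_{\mathrm{eq}} \right)
```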

  12. A Quantitative Theory Model of a Photobleaching Mechanism

    Institute of Scientific and Technical Information of China (English)

    陈同生; 曾绍群; 周炜; 骆清铭

    2003-01-01

    A photobleaching model, the D-P (dye-photon interaction) and D-O (dye-oxygen oxidative reaction) photobleaching theory, is proposed. The quantitative power dependences of photobleaching rates under both one- and two-photon excitation (1PE and TPE) are obtained. This photobleaching model accounts well for our experimental results and those of others. Experimental studies of the photobleaching rates of rhodamine B under TPE in unsaturated conditions reveal that the power dependence of the photobleaching rate increases with increasing dye concentration, and that the photobleaching rate of a single molecule increases as the second power of the excitation intensity, in contrast to the high-order (> 3) nonlinear dependence of ensemble molecules.
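
    The power dependences quoted here are exponents of a power law, R_bleach ∝ I^n. A minimal, assumed sketch of extracting n from measured rates by a log-log fit (not the authors' analysis) is:

```python
# Minimal sketch (assumed, not the authors' analysis): estimate the power-law
# exponent n in R_bleach ~ I**n from measured bleaching rates via a log-log fit.
import numpy as np

def power_exponent(intensity: np.ndarray, rate: np.ndarray) -> float:
    """Slope of log(rate) vs log(intensity), i.e. the power dependence n."""
    slope, _intercept = np.polyfit(np.log(intensity), np.log(rate), 1)
    return slope

# Synthetic example: a rate scaling as I**2, as reported for single molecules
# under two-photon excitation.
I = np.array([1.0, 2.0, 4.0, 8.0])
R = 0.05 * I**2
print(f"fitted exponent n ~ {power_exponent(I, R):.2f}")  # ~2.0
```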

  13. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    May R. D.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.
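
    The "polytrope modelling of the equation of state" mentioned here starts from the classical Lane-Emden equation. The sketch below solves that textbook Newtonian equation for the standard n = 3 solar polytrope; it does not include the paper's 3-space modifications and is given only as background.

```python
# Minimal sketch of standard polytrope modelling (Lane-Emden equation), the
# Newtonian starting point that modified solar models build on.
from scipy.integrate import solve_ivp

def lane_emden(n: float, xi_max: float = 10.0):
    """Integrate theta'' = -theta**n - (2/xi) theta' from a small xi0 outward."""
    def rhs(xi, y):
        theta, dtheta = y
        return [dtheta, -max(theta, 0.0) ** n - 2.0 * dtheta / xi]

    xi0 = 1e-6
    y0 = [1.0 - xi0**2 / 6.0, -xi0 / 3.0]   # series expansion near the centre
    surface = lambda xi, y: y[0]             # stop at the first zero (the surface)
    surface.terminal = True
    surface.direction = -1
    return solve_ivp(rhs, (xi0, xi_max), y0, events=surface, max_step=0.01)

sol = lane_emden(n=3.0)                      # n = 3: the classic stellar polytrope
print(f"surface radius xi1 ~ {sol.t_events[0][0]:.3f}")   # ~6.897 for n = 3
```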

  14. HRM Model in Tourism, Based on Dialectical Systems Theory

    Directory of Open Access Journals (Sweden)

    Simona Šarotar Žižek

    2015-12-01

    Full Text Available A human resources management (HRM) model integrating trends in HRM with trends in tourism into a dialectical system according to the Dialectical Systems Theory (DST). The HRM strategy, integrated within the tourism organization's (TO's) strategy, is implemented through functional strategies that help their users achieve a requisitely holistic (RH) HRM strategy replacing the prevailing one-sided ones. The TO's strategy covers employees' (1) planning, (2) acquisition and selection, (3) development and training, (4) diversity management, (5) teamwork and creativity, (6) motivation and rewarding, (7) stress reduction and health, (8) relationships, (9) personal holism, (10) well-being, and (11) work and results assessment, etc. Everyone matters; their synergy is crucial. The result is an innovated HRM model for TOs that applies the requisite holism of employees and organizations and integrates new knowledge about HRM. HRM belongs among central managers' tools; their HRM must be adapted to TOs, where employees are crucial.

  15. Dynamical 3-Space Gravity Theory: Effects on Polytropic Solar Models

    Directory of Open Access Journals (Sweden)

    Cahill R. T.

    2011-01-01

    Full Text Available Numerous experiments and observations have confirmed the existence of a dynamical 3-space, detectable directly by light-speed anisotropy experiments, and indirectly by means of novel gravitational effects, such as bore hole g-anomalies, predictable black hole masses, flat spiral-galaxy rotation curves, and the expansion of the universe, all without dark matter and dark energy. The dynamics for this 3-space follows from a unique generalisation of Newtonian gravity, once that is cast into a velocity formalism. This new theory of gravity is applied to the solar model of the sun to compute new density, pressure and temperature profiles, using polytrope modelling of the equation of state for the matter. These results should be applied to a re-analysis of solar neutrino production, and to stellar evolution in general.

  16. Eye growth and myopia development: Unifying theory and Matlab model.

    Science.gov (United States)

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal

    2016-03-01

    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. These model simulation programs
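
    As a rough illustration of the two components described here (a genetically pre-programmed growth term that decays with age plus blur-driven feedback toward the emmetropic length), a toy sketch is given below. The linear feedback law and all parameter values are assumptions for illustration, not the article's MATLAB/Simulink implementation.

```python
# Toy sketch (assumed, not the authors' MATLAB/Simulink model): axial eye
# growth as a decaying pre-programmed term plus blur-driven feedback.
import numpy as np

def simulate_eye_growth(days=1500, g_pre=0.005, tau=200.0,
                        k_feedback=0.002, target_length=24.0,
                        start_length=22.0):
    """Axial length (mm) under pre-programmed growth plus blur feedback."""
    length = start_length
    history = []
    for t in range(days):
        defocus = target_length - length              # crude proxy for retinal blur
        growth = g_pre * np.exp(-t / tau) + k_feedback * defocus
        length += max(growth, 0.0)                    # the eye does not shrink
        history.append(length)
    return np.array(history)

trace = simulate_eye_growth()
print(f"final axial length ~ {trace[-1]:.2f} mm")     # settles near the 24 mm target
```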

  17. The Budget-Related Antecedents of Job Performance

    Directory of Open Access Journals (Sweden)

    Emine Yilmaz Karakoc

    2016-04-01

    Full Text Available This study aims to investigate budget-related antecedents of managers' job performance. For this purpose, the relationships among budgetary participation, budget goal commitment, information sharing, and managers' job performance were examined. The sample consists of managers who are responsible for the budgets of their units in different private enterprises located in Turkey. Survey data were analyzed with confirmatory factor analyses and structural equation modeling. Results indicate that budgetary participation has a statistically significant and positive impact on job performance; it also positively affects budget goal commitment and information sharing. Budget goal commitment and information sharing have a significant and positive impact on job performance. In addition, budget goal commitment positively affects managers' information sharing. The analyses also revealed that budget goal commitment and information sharing have a partial mediation effect on the relationship between budgetary participation and job performance.
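
    The partial-mediation claim can be illustrated with a regression-based check of the kind often reported alongside SEM results. The sketch below uses synthetic data and hypothetical variable names; it is not the authors' model or data.

```python
# Minimal sketch (assumed, not the authors' SEM): a regression-based check of
# whether budget goal commitment partially mediates the participation ->
# job performance link. Data and variable names are synthetic/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
participation = rng.normal(size=n)
commitment = 0.5 * participation + rng.normal(scale=0.8, size=n)
performance = 0.3 * participation + 0.4 * commitment + rng.normal(scale=0.8, size=n)
df = pd.DataFrame({"participation": participation,
                   "commitment": commitment,
                   "performance": performance})

total  = smf.ols("performance ~ participation", df).fit()
direct = smf.ols("performance ~ participation + commitment", df).fit()
print("total effect:  %.3f" % total.params["participation"])
print("direct effect: %.3f" % direct.params["participation"])
# A direct effect that shrinks but remains significant once the mediator is
# included is the usual signature of partial mediation.
```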

  18. How to use the Standard Model effective field theory

    Science.gov (United States)

    Henning, Brian; Lu, Xiaochuan; Murayama, Hitoshi

    2016-01-01

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.
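
    The covariant derivative expansion referred to here evaluates the one-loop effective action in the schematic form below. This is the standard gauge-covariant expression used in that literature, quoted for orientation as an assumption rather than reproduced from this record; the constant and the mass-matrix term depend on the heavy field being integrated out.

```latex
% Schematic one-loop contribution from integrating out a heavy field of mass m
% in a gauge-covariant way (covariant derivative expansion). Quoted for
% orientation only: c_s is a spin/statistics-dependent constant, P_mu = i D_mu,
% and U(x) collects the light-field-dependent mass terms.
\Delta S_{\text{eff}} \;=\; i\, c_s\, \mathrm{Tr}\,\log\!\bigl(-P^{2} + m^{2} + U(x)\bigr)
```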

  19. The Attributive Theory of Quality: A Model for Quality Measurement in Higher Education.

    Science.gov (United States)

    Afshar, Arash

    A theoretical basis for defining and measuring the quality of institutions of higher education, namely for accreditation purposes, is developed. The theory, the Attributive Theory of Quality, is illustrated using a calculation model that is based on general systems theory. The theory postulates that quality only exists in relation to the…

  20. Multiscale Multiphysics and Multidomain Models I: Basic Theory.

    Science.gov (United States)

    Wei, Guo-Wei

    2013-12-01

    This work extends our earlier two-domain formulation of a differential geometry based multiscale paradigm into a multidomain theory, which endows us with the ability to simultaneously accommodate multiphysical descriptions of aqueous chemical, physical and biological systems, such as fuel cells, solar cells, nanofluidics, ion channels, viruses, RNA polymerases, molecular motors and large macromolecular complexes. The essential idea is to use the differential geometry theory of surfaces as a natural means to geometrically separate the macroscopic domain of solvent from the microscopic domain of solute, and to dynamically couple continuum and discrete descriptions. Our main strategy is to construct energy functionals that put the multiphysics on an equal footing, including polar (i.e., electrostatic) solvation, nonpolar solvation, chemical potential, quantum mechanics, fluid mechanics, molecular mechanics, coarse-grained dynamics and elastic dynamics. The variational principle is applied to the energy functionals to derive the desired governing equations, such as multidomain Laplace-Beltrami (LB) equations for macromolecular morphologies, multidomain Poisson-Boltzmann (PB) equations or Poisson equations for the electrostatic potential, generalized Nernst-Planck (NP) equations for the dynamics of charged solvent species, generalized Navier-Stokes (NS) equations for fluid dynamics, generalized Newton's equations for molecular dynamics (MD) or coarse-grained dynamics, and equations of motion for elastic dynamics. Unlike the classical PB equation, our PB equation is an integro-differential equation due to solvent-solute interactions. To illustrate the proposed formalism, we have explicitly constructed three models: a multidomain solvation model, a multidomain charge transport model and a multidomain chemo-electro-fluid-MD-elastic model. Each solute domain is equipped with distinct surface tension, pressure, dielectric function, and charge density distribution. In addition to long
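
    For reference, the classical Poisson-Boltzmann equation that the multidomain formulation generalizes has the familiar form below; it is quoted for orientation only, since the paper's version adds domain characteristic functions and solvent-solute interaction terms that make it integro-differential.

```latex
% Classical Poisson-Boltzmann equation, quoted for orientation; Phi is the
% electrostatic potential, epsilon(r) the dielectric function, and c_i, q_i
% the bulk concentration and charge of mobile ion species i.
-\nabla \cdot \bigl(\epsilon(\mathbf{r})\,\nabla \Phi(\mathbf{r})\bigr)
  \;=\; \rho_{\text{solute}}(\mathbf{r})
  \;+\; \sum_{i} c_{i}\, q_{i}\, e^{-q_{i}\,\Phi(\mathbf{r})/k_{B} T}
```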