Integrating dynamic energy budget (DEB) theory with traditional bioenergetic models.
Nisbet, Roger M; Jusup, Marko; Klanjscek, Tin; Pecquerie, Laure
2012-03-15
Dynamic energy budget (DEB) theory offers a systematic, though abstract, way to describe how an organism acquires and uses energy and essential elements for physiological processes, in addition to how physiological performance is influenced by environmental variables such as food density and temperature. A 'standard' DEB model describes the performance (growth, development, reproduction, respiration, etc.) of all life stages of an animal (embryo to adult), and predicts both intraspecific and interspecific variation in physiological rates. This approach contrasts with a long tradition of more phenomenological and parameter-rich bioenergetic models that are used to make predictions from species-specific rate measurements. These less abstract models are widely used in fisheries studies; they are more readily interpretable than DEB models, but lack the generality of DEB models. We review the interconnections between the two approaches and present formulae relating the state variables and fluxes in the standard DEB model to measured bioenergetic rate processes. We illustrate this synthesis for two large fishes: Pacific bluefin tuna (Thunnus orientalis) and Pacific salmon (Oncorhynchus spp.). For each, we have a parameter-sparse, full-life-cycle DEB model that requires adding only a few species-specific features to the standard model. Both models allow powerful integration of knowledge derived from data restricted to certain life stages, processes and environments.
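The 'standard' DEB model referred to above can be sketched as coupled balance equations for reserve energy E and structural volume V. The formulation below follows the commonly published standard-model equations; all parameter names and values (p_Am, v, p_M, E_G, kappa) are illustrative assumptions, not estimates for tuna or salmon:

```python
def simulate_deb(days=2000.0, dt=0.5, f=1.0,
                 p_Am=22.5,    # surface-specific max assimilation, J/(cm^2 d) (assumed)
                 v=0.02,       # energy conductance, cm/d (assumed)
                 p_M=18.0,     # volume-specific maintenance, J/(cm^3 d) (assumed)
                 E_G=2800.0,   # volume-specific cost of structure, J/cm^3 (assumed)
                 kappa=0.8):   # fraction of mobilized reserve allocated to soma
    V = 1e-3                   # structural volume, cm^3 (small initial individual)
    E = V * p_Am / v           # reserve energy, J, starting at max reserve density
    for _ in range(int(days / dt)):
        p_A = f * p_Am * V ** (2.0 / 3.0)            # assimilation, surface-area scaled
        p_C = ((E / V) * (E_G * v * V ** (2.0 / 3.0) + p_M * V)
               / (E_G + kappa * E / V))              # reserve mobilization
        V += dt * (kappa * p_C - p_M * V) / E_G      # growth of structure
        E += dt * (p_A - p_C)                        # reserve balance
    return V, E

V_end, E_end = simulate_deb()
```

With these illustrative parameters the asymptotic structural volume is (kappa * p_Am / p_M)^3 = 1 cm^3, and the simulated trajectory approaches it along a von Bertalanffy-like growth curve.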
A Theory of the Perturbed Consumer with General Budgets
DEFF Research Database (Denmark)
McFadden, Daniel L; Fosgerau, Mogens
We consider demand systems for utility-maximizing consumers facing general budget constraints whose utilities are perturbed by additive linear shifts in marginal utilities. Budgets are required to be compact but are not required to be convex. We define demand generating functions (DGF) whose ...-valued and smooth in their arguments. We also give sufficient conditions for integrability of perturbed demand. Our analysis provides a foundation for applications of consumer theory to problems with nonlinear budget constraints...
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, and Skolem functions.
A Public Choice Theory of Budgets: Implications for Education in Less Developed Countries.
Gallagher, Mark
1993-01-01
The rate of growth in government spending (particularly, slow growth or decline) has an important impact on the effectiveness of resource allocation. Data from 47 developing nations were used to test a model, based on public choice theory, of interest-group behavior and educational budget growth/decline. Government spending trends were related to…
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
THE CONCEPTUAL CONTENT OF STATE BUDGET PROCESS IN ECONOMIC THEORY
Žubule, Ērika; Puzule, Anita
2015-01-01
Evaluating the role of the budget in the economy, we may state that the budget process should favour the social and economic development of the state. The aim of the research is to explore and evaluate theoretical aspects of the state budget process as a component of the state financial policy and to work out proposals for improvement of the state budget process, based on the theoretical and empirical findings. The main objectives of the research were to study the foreign economic scientific literature...
Nambe Pueblo Water Budget and Forecasting model.
Energy Technology Data Exchange (ETDEWEB)
Brainard, James Robert
2009-10-01
This report documents The Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components, the Water Forecast Component, Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Water Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and the Reservoir Model components. The Reservoir Component contains two sections, (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model where historic, pre-dam or above dam USGS stream flow data is fed into the model and releases are calculated.
Harrison, D. E.; Holland, W. R.
1981-01-01
A mean vorticity budget analysis is presented of Holland's (1978) numerical ocean general circulation experiment. The stable budgets are compared with classical circulation theory to emphasize the ways in which the mesoscale motions of the model alter (or leave unaltered) classical vorticity balances. The basinwide meridional transports of vorticity by the mean flow and by the mesoscale flow in the mean are evaluated to establish the role(s) of the mesoscale in the larger scale equilibrium vorticity transports. The vorticity equation for this model fluid system is presented and the budget analysis method is described. Vorticity budgets over the selected regions and on a larger scale are given, and a summary of budget results is provided along with remarks about the utility of this type of analysis.
Validation of a Dynamic Energy Budget (DEB) model for the blue mussel
Saraiva, S.; van der Meer, J.; Kooijman, S.A.L.M.; Witbaard, R.; Philippart, C.J.M.; Hippler, D.; Parker, R.
2012-01-01
A model for bivalve growth was developed and the results were tested against field observations. The model is based on the Dynamic Energy Budget (DEB) theory and includes an extension of the standard DEB model to cope with changing food quantity and quality. At 4 different locations in the North Sea
Mangani, P
2011-01-01
This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.
Galic, Nika; Forbes, Valery E.
2017-03-01
Human activities have been modifying ecosystems for centuries, from pressures on wild populations we harvest to modifying habitats through urbanization and agricultural activities. Changes in global climate patterns are adding another layer of, often unpredictable, perturbations to ecosystems on which we rely for life support [1,2]. To ensure the sustainability of ecosystem services, especially at this point in time when the human population is estimated to grow by another 2 billion by 2050 [3], we need to predict possible consequences of our actions and suggest relevant solutions [4,5]. We face several challenges when estimating adverse impacts of our actions on ecosystems. We describe these in the context of ecological risk assessment of chemicals. Firstly, when attempting to assess risk from exposure to chemicals, we base our decisions on a very limited number of species that are easily cultured and kept in the lab. We assume that preventing risk to these species will also protect all of the untested species present in natural ecosystems [6]. Secondly, although we know that chemicals interact with other stressors in the field, the number of stressors that we can test is limited due to logistical and ethical reasons. Similarly, empirical approaches are limited in both spatial and temporal scale due to logistical, financial and ethical reasons [7,8]. To bypass these challenges, we can develop ecological models that integrate relevant life history and other information and make testable predictions across relevant spatial and temporal scales [8-10].
Geček, Sunčana
2017-03-01
Jusup and colleagues in their recent review on the physics of metabolic organization [1] discuss in detail the motivational considerations and common assumptions of Dynamic Energy Budget (DEB) theory, supply readers with a practical guide to DEB-based modeling, demonstrate the construction and dynamics of the standard DEB model, and illustrate several applications. The authors take a step forward from the existing literature by seamlessly bridging the dichotomy between (i) the thermodynamic foundations of the theory (which are often more accessible and understandable to physicists and mathematicians), and (ii) the resulting bioenergetic models (mostly used by biologists in real-world applications).
Barkstrom, Bruce R.; Direskeneli, Haldun; Halyo, Nesim
1992-01-01
An information theory approach to examining the temporal nonuniform sampling characteristics of shortwave (SW) flux for earth radiation budget (ERB) measurements is suggested. The information gain is obtained by comparing the information content before and after the measurements. A stochastic diurnal model for the SW flux is developed, and measurements for different orbital parameters are examined. The methodology is applied to specific NASA Polar platform and Tropical Rainfall Measuring Mission (TRMM) orbital parameters. The information theory approach, coupled with the developed SW diurnal model, is found to be promising for measurements involving nonuniform orbital sampling characteristics.
Modeling Budget Optimum Allocation of Khorasan Razavi Province Agriculture Sector
Directory of Open Access Journals (Sweden)
Seyed Mohammad Fahimifard
2016-09-01
Introduction: Capital shortage is one of the impasses of development in developing countries, and the agriculture sector has faced the greatest limitations from it. The share of Iran's agricultural sector in total investment after the Islamic revolution (1979) has been just 5.5 percent. This causes low efficiency in Iran's agriculture sector. For instance, each cubic meter of water in Iran's agriculture sector produces less than 1 kilogram of dry food, and each Iranian farmer achieves a lower annual income and has less mechanization in comparison with the similar countries of Iran's 1404 perspective document. It is therefore clear that increasing investment in the agriculture sector and optimizing the budget allocated to this sector are necessary, yet this allocation had not been adequately and scientifically revised until now. Thus, in this research the optimum budget allocation of the agriculture sector of Iran's Khorasan Razavi province was modeled. Materials and Methods: To model the optimum budget allocation of Khorasan Razavi province's agriculture sector, the allocation between agriculture programs was first modeled by combining three indexes: 1. the priorities of the province's agriculture-sector experts, analyzed with the Analytical Hierarchy Process (AHP); 2. the average share of agriculture-sector programs in the 4th national development program for the province's agriculture sector; and 3. the average share of agriculture-sector programs in the 5th national development program for the province's agriculture sector. Then, using the Delphi technique, the potential indexes of each program were determined. The determined potential indexes were then weighted using AHP, and finally a numerical taxonomy model was used to optimize the allocation of each program's budget between cities under two scenarios. The required data were also gathered from the budget and planning
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of module
Electric solar wind sail mass budget model
Directory of Open Access Journals (Sweden)
P. Janhunen
2013-02-01
The electric solar wind sail (E-sail) is a new type of propellantless propulsion system for Solar System transportation, which uses the natural solar wind to produce spacecraft propulsion. The E-sail consists of thin centrifugally stretched tethers that are kept charged by an onboard electron gun and, as such, experience Coulomb drag through the high-speed solar wind plasma stream. This paper discusses a mass breakdown and a performance model for an E-sail spacecraft that hosts a mission-specific payload of prescribed mass. In particular, the model is able to estimate the total spacecraft mass and its propulsive acceleration as a function of various design parameters such as the number of tethers and their length. A number of subsystem masses are calculated assuming existing or near-term E-sail technology. In light of the obtained performance estimates, an E-sail represents a promising propulsion system for a variety of transportation needs in the Solar System.
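The kind of mass breakdown and performance model described here can be sketched parametrically. Every coefficient below (tether linear density, thrust per unit tether length, subsystem overheads) is an assumed placeholder for illustration, not a value from the paper:

```python
def esail_performance(n_tethers=100, tether_length_km=20.0, payload_kg=100.0,
                      tether_kg_per_km=0.011,   # assumed: thin multi-wire Al tether
                      thrust_N_per_km=5e-4,     # assumed: ~500 nN/m in solar wind at 1 au
                      gun_and_bus_kg=50.0,      # assumed: electron gun + bus overhead
                      reel_kg_per_tether=0.1):  # assumed: per-tether reel hardware
    tether_mass = n_tethers * tether_length_km * tether_kg_per_km
    reel_mass = n_tethers * reel_kg_per_tether
    total_mass = payload_kg + gun_and_bus_kg + tether_mass + reel_mass  # kg
    thrust = n_tethers * tether_length_km * thrust_N_per_km             # N
    accel_mm_s2 = 1e3 * thrust / total_mass                            # mm/s^2
    return total_mass, thrust, accel_mm_s2

mass, thrust, accel = esail_performance()  # 100 tethers x 20 km baseline
```

The point of such a model is the trade-off it exposes: adding tethers or length increases thrust linearly but also adds tether and reel mass, so the propulsive acceleration saturates for a fixed payload.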
DEFF Research Database (Denmark)
Andersen, Asger Lau; Lassen, David Dreyer; Nielsen, Lasse Holbøll Westh
The budget forms the legal basis of government spending. If a budget is not in place at the beginning of the fiscal year, planning as well as current spending are jeopardized and a government shutdown may result. This paper develops a continuous-time war-of-attrition model of budgeting in a presidential-style democracy to explain the duration of budget negotiations. We build our model around budget baselines as reference points for loss-averse negotiators. We derive three testable hypotheses: there are more late budgets, and they are later, when fiscal circumstances change; when such changes are negative rather than positive; and when there is divided government. We test the hypotheses of the model using a unique data set of late budgets for US state governments, based on dates of budget approval collected from news reports and a survey of state budget officers for the period 1988...
Nisbet, Roger M.
2017-03-01
Jusup et al. [1] provide a comprehensive review of Dynamic Energy Budget (DEB) theory - a theory of metabolic organization that has its roots in a model by S.A.L.M. Kooijman [2] and has evolved over three decades into a remarkable general theory whose use appears to be growing exponentially. The definitive text on DEB theory [3] is a challenging (though exceptionally rewarding) read, and previous reviews (e.g. [4,5]) have provided focused summaries of some of its main themes, targeted at specific groups of readers. The strong case for a further review is well captured in the abstract: "Hitherto, the foundations were more accessible to physicists or mathematicians, and the applications to biologists, causing a dichotomy in what always should have been a single body of work." In response to this need, Jusup et al. provide a review that combines a lucid, rigorous exposition of the core components of DEB theory with a diverse collection of DEB applications. They also highlight some recent advances, notably the rapidly growing on-line database of DEB model parameters (451 species on 15 August 2016 according to [1]; now, just a few months later, over 500 species).
How processing digital elevation models can affect simulated water budgets.
Kuniansky, Eve L; Lowery, Mark A; Campbell, Bruce G
2009-01-01
For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
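The two DEM-processing choices compared in this article can be illustrated on a toy model cell. The 3x3 block of elevation samples below is invented for illustration, not data from the study area:

```python
def centroid_value(dem_cell):
    # DEM sample nearest the cell centroid (odd-sized sample grid assumed)
    return dem_cell[len(dem_cell) // 2][len(dem_cell[0]) // 2]

def mean_value(dem_cell):
    # mean of all DEM samples falling inside the cell
    flat = [z for row in dem_cell for z in row]
    return sum(flat) / len(flat)

# Invented 3x3 block of DEM elevations (m) inside one model cell on a slope.
cell = [[10.0, 12.0, 15.0],
        [11.0, 14.0, 18.0],
        [12.0, 16.0, 21.0]]
# centroid_value(cell) -> 14.0, mean_value(cell) -> ~14.33: on sloping
# terrain the two estimates differ, so the regressed water-table surface,
# the simulated gradients, and hence the water budget differ too.
```

This makes the article's point concrete: the centroid method samples a single point that may sit above or below the cell-average land surface, while the mean smooths local relief, which is why the mean is more representative of the whole cell.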
2012-09-01
Depression, it was generally accepted that the budget should be balanced; Keynes proposed that budget deficits may be desirable in periods of recession to... debt matter? As aforementioned, balanced budgets were the prevailing strategy until Keynes developed his theory, according to which deficits may be... schools of economic thought seems to have started in October 1932. In particular, Keynes and five more economists (MacGregor, Pigou, Layton, Salter and
Duarte, P.; Fernández-Reiriz, M. J.; Labarta, U.
2012-01-01
The environmental and economic importance of shellfish has stimulated a great deal of studies on their physiology over the last decades, with many attempts to model their growth. The first models developed to simulate bivalve growth were predominantly based on the Scope For Growth (SFG) paradigm. In recent years there has been a shift towards the Dynamic Energy Budget (DEB) paradigm. The general objective of this work is to contribute to the evaluation of different approaches to simulating bivalve growth in low-seston waters by: (i) implementing a model to simulate mussel growth in ecosystems with low suspended matter, based on the DEB theory (Kooijman, S.A.L.M., 2000. Dynamic Energy and Mass Budgets in Biological Systems, Cambridge University Press); (ii) comparing and discussing different approaches to simulating feeding processes, in the light of recently published work on both experimental physiology and physiological modeling; (iii) comparing and discussing results obtained with a model based on EMMY (Scholten and Smaal, 1998). The implemented model successfully simulated mussel feeding and shell-length growth in two different Galician Rias. The results obtained, together with literature data, suggest that modeling of bivalve feeding should incorporate physiological feedbacks related to food digestibility. In spite of considerable advances in bivalve modeling, a number of issues are yet to be resolved, particularly the way food sources are represented and feeding processes are formulated.
Pecquerie, Laure; Johnson, Leah R.; Kooijman, Sebastiaan A. L. M.; Nisbet, Roger M.
2011-11-01
To determine the response of Pacific salmon ( Oncorhynchus spp.) populations to environmental change, we need to understand impacts on all life stages. However, an integrative and mechanistic approach is particularly challenging for Pacific salmon as they use multiple habitats (river, estuarine and marine) during their life cycle. Here we develop a bioenergetic model that predicts development, growth and reproduction of a Pacific salmon in a dynamic environment, from an egg to a reproducing female, and that links female state to egg traits. This model uses Dynamic Energy Budget (DEB) theory to predict how life history traits vary among five species of Pacific salmon: Pink, Sockeye, Coho, Chum and Chinook. Supplemented with a limited number of assumptions on anadromy and semelparity and external signals for migrations, the model reproduces the qualitative patterns in egg size, fry size and fecundity both at the inter- and intra-species levels. Our results highlight how modeling all life stages within a single framework enables us to better understand complex life-history patterns. Additionally we show that body size scaling relationships implied by DEB theory provide a simple way to transfer model parameters among Pacific salmon species, thus providing a generic approach to study the impact of environmental conditions on the life cycle of Pacific salmon.
Modelling the global tropospheric ozone budget: exploring the variability in current models
Directory of Open Access Journals (Sweden)
O. Wild
2007-02-01
What are the largest uncertainties in modelling ozone in the troposphere, and how do they affect the calculated ozone budget? Published chemistry-transport model studies of tropospheric ozone differ significantly in their conclusions regarding the importance of the key processes controlling the ozone budget: influx from the stratosphere, chemical processing and surface deposition. This study surveys ozone budgets from previous studies and demonstrates that about two thirds of the increase in ozone production seen between early assessments and more recent model intercomparisons can be accounted for by increased precursor emissions. Model studies using recent estimates of emissions compare better with ozonesonde measurements than studies using older data, and the tropospheric burden of ozone is closer to that derived here from measurement climatologies, 335±10 Tg. However, differences between individual model studies remain large and cannot be explained by surface precursor emissions alone; cross-tropopause transport, wet and dry deposition, humidity, and lightning make large contributions to the differences seen between models. The importance of these processes is examined here using a chemistry-transport model to investigate the sensitivity of the calculated ozone budget to different assumptions about emissions, physical processes, meteorology and model resolution. The budget is particularly sensitive to the magnitude and location of lightning NOx emissions, which remain poorly constrained; the 3-8 TgN/yr range in recent model studies may account for a 10% difference in tropospheric ozone burden and a 1.4-year difference in CH4 lifetime. Differences in humidity and dry deposition account for some of the variability in ozone abundance and loss seen in previous studies, with smaller contributions from wet deposition and stratospheric influx. At coarse model resolutions, stratospheric influx is systematically overestimated
Dependent-Chance Programming Models for Capital Budgeting in Fuzzy Environments
Institute of Scientific and Technical Information of China (English)
LIANG Rui; GAO Jinwu
2008-01-01
Capital budgeting is concerned with maximizing the total net profit subject to budget constraints by selecting an appropriate combination of projects. This paper presents chance-maximizing models for capital budgeting with fuzzy input data and multiple conflicting objectives. When the decision maker sets a prospective profit level and wants to maximize the chances of the total profit achieving the prospective profit level, a fuzzy dependent-chance programming model, a fuzzy multi-objective dependent-chance programming model, and a fuzzy goal dependent-chance programming model are used to formulate the fuzzy capital budgeting problem. A fuzzy-simulation-based genetic algorithm is used to solve these models. Numerical examples are provided to illustrate the effectiveness of the simulation-based genetic algorithm and the potential applications of these models.
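The dependent-chance idea can be sketched in simplified form: pick a subset of projects, within budget, that maximizes the estimated chance of total profit reaching a target level. The paper's fuzzy credibility measure and genetic algorithm are substituted here by Monte Carlo sampling of triangular profits and brute-force enumeration, and the project data are invented:

```python
import random
from itertools import combinations

random.seed(42)
# name: (cost, (low, high, mode) triangular profit) -- invented example data
projects = {
    "A": (40, (20, 60, 35)),
    "B": (30, (10, 45, 25)),
    "C": (50, (30, 70, 45)),
    "D": (20, (5, 30, 15)),
}
BUDGET, TARGET, N = 90, 60, 2000

def chance(subset):
    # Monte Carlo estimate of P(total profit >= TARGET) for a project subset
    hits = 0
    for _ in range(N):
        total = sum(random.triangular(*projects[p][1]) for p in subset)
        hits += total >= TARGET
    return hits / N

best, best_chance = (), 0.0
for r in range(1, len(projects) + 1):
    for subset in combinations(projects, r):
        if sum(projects[p][0] for p in subset) <= BUDGET:  # budget constraint
            c = chance(subset)
            if c > best_chance:
                best, best_chance = subset, c
```

Enumeration is only viable for a handful of projects; for realistic problem sizes the search over subsets is exactly where the paper's genetic algorithm comes in.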
Huang, Shih-Yu; Deng, Yi; Wang, Jingfeng
2016-10-01
The maximum-entropy-production (MEP) model of surface heat fluxes, based on contemporary non-equilibrium thermodynamics, information theory, and atmospheric turbulence theory, is used to re-estimate the global surface heat fluxes. The surface fluxes predicted by the MEP model automatically balance the surface energy budget at all time and space scales without the explicit use of near-surface temperature and moisture gradients, wind speed, or surface roughness data. The new MEP-based global annual mean fluxes over the land surface, using input data of surface radiation and temperature from the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System (NASA CERES), supplemented by surface specific humidity data from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), agree closely with previous estimates. The new estimate of ocean evaporation, which does not use the MERRA reanalysis data as model input, is lower than previous estimates, while the new estimate of ocean sensible heat flux is higher than previously reported. The MEP model also produces the first global map of ocean surface heat flux that is not available from existing global reanalysis products.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Directory of Open Access Journals (Sweden)
Hlіnska Olha M.
2014-01-01
The article considers issues of efficient budgeting and use of labour resources at coal-mining enterprises. It demonstrates the expediency of using modern neural networks, namely the multilayer perceptron, for modelling the process of budgeting and the use of labour resources at coal-mining enterprises. It shows that Statistica is the best-suited software package for building neural networks with the multilayer perceptron architecture. On the basis of analysis and comparative characteristics, the article selects a topology and builds a neural network model of budgeting and use of labour resources at coal-mining enterprises.
Budget constraint and vaccine dosing: A mathematical modelling exercise
Standaert, Baudouin A.; Curran, Desmond; Postma, Maarten J.
2014-01-01
Background: Increasing the number of vaccine doses may potentially improve overall efficacy. Decision-makers need information about choosing the most efficient dose schedule to maximise the total health gain of a population when operating under a constrained budget. The objective of this study is to
Participative Budgeting as a Communication Process: A Model and Experiment.
1978-01-01
Participative Budgeting on Managerial Behavior. New York: National Association of Accountants, 1975. Vroom, Victor H. Some Personality Determinants of the Effects of Participation. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1960. Vroom, Victor H. and Yetton, Phillip W. Leadership and... about which you are concerned 3. Does your immediate superior ask your opinion when a problem comes up that involves your work... Victor H. Vroom
Picoche, Coralie; Le Gendre, Romain; Flye-Sainte-Marie, Jonathan; Françoise, Sylvaine; Maheux, Frank; Simon, Benjamin; Gangnery, Aline
2014-01-01
The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics, but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide a better insight into mussel-environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measurements revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference among most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and of both passive and active selection, and confirms the need to adjust DEB models to different
Evaluation Theory, Models, and Applications
Stufflebeam, Daniel L.; Shinkfield, Anthony J.
2007-01-01
"Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…
Outcomes analysis of hospital management model in restricted budget conditions
Directory of Open Access Journals (Sweden)
Virsavia Vaseva
2016-03-01
Facing the conditions of a market economy and financial crisis, the head of any healthcare facility has to take adequate decisions about the cost-effective functioning of the hospital. Along with cost reduction, the main problem is how to maintain a high level of health services. The aim of our study was to analyse the quality of healthcare services after the implementation of control over expenses due to a reduction in the budgetary resources of the Military Medical Academy (MMA), Sofia, Bulgaria. Data from the hospital information system and the Financial Department on the income and expenditure for patient treatment were used. We conducted a retrospective study of the main components of clinical indicators in 2013 to reveal the main problems in hospital management. In 2014, control was imposed on the use of the most expensive medicines and consumables. A comparative analysis was made of the results of the medical services in MMA for 2013 and 2014. Our results showed that despite the limited budget in MMA over the last year, the policy of control over operational costs succeeded in maintaining the quality of healthcare services. While the expenses for medicines, consumables and laboratory investigations were reduced by ∼26%, some quality criteria for healthcare services improved by ∼9%. Financial crisis and budget reduction urge healthcare economists to create adequate economic instruments to assist the normal functioning of hospital facilities. Our analysis showed that when the right policy is chosen, better results may be achieved with fewer resources.
Verkhoglyadova, O. P.; Meng, X.; Mannucci, A. J.; Mlynczak, M. G.; Hunt, L. A.; Tsurutani, B.
2015-12-01
We present estimates of the energy budget of the 2015 St. Patrick's Day storm. Empirical models and coupling functions are used as proxies for energy input due to solar wind-magnetosphere coupling. Fluxes of thermospheric nitric oxide and carbon dioxide cooling emissions are estimated in several latitude ranges. Solar wind data and the Weimer 2005 model for high-latitude electrodynamics are used to drive GITM modeling for the storm. Model estimates of energy partitioning, Joule heating, and NO cooling are compared with observations and empirical proxies. We outline challenges in the estimation of the IT energy budget (Joule heating, Poynting flux, particle precipitation) during geomagnetic storms.
A Methodological Review of US Budget-Impact Models for New Drugs.
Mauskopf, Josephine; Earnshaw, Stephanie
2016-11-01
A budget-impact analysis is required by many jurisdictions when adding a new drug to the formulary. However, previous reviews have indicated that adherence to methodological guidelines is variable. In this methodological review, we assess the extent to which US budget-impact analyses for new drugs use recommended practices. We describe recommended practice for seven key elements in the design of a budget-impact analysis. Targeted literature searches for US studies reporting estimates of the budget impact of a new drug were performed and we prepared a summary of how each study addressed the seven key elements. The primary finding from this review is that recommended practice is not followed in many budget-impact analyses. For example, we found that growth in the treated population size and/or changes in disease-related costs expected during the model time horizon for more effective treatments was not included in several analyses for chronic conditions. In addition, the majority of the models did not capture all drug-related costs. Finally, for most studies, one-way sensitivity and scenario analyses were very limited, and the ranges used in one-way sensitivity analyses were frequently arbitrary percentages rather than being data driven. The conclusions from our review are that changes in population size, disease severity mix, and/or disease-related costs should be properly accounted for to avoid over- or underestimating the budget impact. Since each budget holder might have different perspectives and different values for many of the input parameters, it is also critical for published budget-impact analyses to include extensive sensitivity and scenario analyses based on realistic input values.
Ecosystem Modeling of Biological Processes to Global Budgets
Christopher, Potter S.; Condon, Estelle (Technical Monitor)
2000-01-01
biosphere effects on atmospheric composition is the ecosystem level. These assumptions are the foundation for developing modern emission budgets for biogenic gases such as carbon dioxide, methane, carbon monoxide, isoprene, nitrous and nitric oxide, and ammonia. Such emission budgets commonly include information on seasonal flux patterns, typical diurnal profiles, and spatial resolution of at least one degree latitude/longitude for the globe. On the basis of these budgets, it is possible to compute 'base emission rates' for the major biogenic trace gases from both terrestrial and ocean sources, which may be useful benchmarks for defining the gas production rates of organisms, especially those from early Earth history, which are required to generate a detectable signal on a global atmosphere. This type of analysis is also the starting point for evaluation of the 'biological processes to global gas budget' extrapolation procedure described above for early Earth ecosystems.
Klok, Chris; Hjorth, Morten; Dahllöf, Ingela
2012-10-01
The Dynamic Energy Budget (DEB) theory provides a logical and consistent framework to evaluate ecotoxicological test results. Currently this framework is not regularly applied in ecotoxicology, given its perceived complexity and data needs. However, even when data availability is low, DEB theory is already useful. In this paper we apply DEB theory to evaluate the results in three previously published papers on the effects of PAHs on Arctic copepods. Since these results do not allow for a quantitative application, we used DEB qualitatively. The ecotoxicological results were thereby set in a wider ecological context, and we found a logical explanation for an unexpected decline in hatching success described in one of these papers. Moreover, the DEB evaluation helped to derive relevant ecological questions that can guide future experimental work on this subject.
Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we conducted growth and bioaccumulation studies that contribute t...
Matzelle, A.; Montalto, V.; Sarà, G.; Zippay, M.; Helmuth, B.
2014-11-01
Dynamic Energy Budget (DEB) models serve as a powerful tool for describing the flow of energy through organisms from assimilation of food to utilization for maintenance, growth and reproduction. The DEB theory has been successfully applied to several bivalve species to compare bioenergetic and physiological strategies for the utilization of energy. In particular, mussels within the Mytilus edulis complex (M. edulis, M. galloprovincialis, and M. trossulus) have been the focus of many studies due to their economic and ecological importance, and their worldwide distribution. However, DEB parameter values have never been estimated for Mytilus californianus, a species that is an ecological dominant on rocky intertidal shores on the west coast of North America and which likely varies considerably from mussels in the M. edulis complex in its physiology. We estimated a set of DEB parameters for M. californianus using the covariation method estimation procedure and compared these to parameter values from other bivalve species. Model parameters were used to compare sensitivity to environmental variability among species, as a first examination of how strategies for physiologically contending with environmental change by M. californianus may differ from those of other bivalves. Results suggest that based on the parameter set obtained, M. californianus has favorable energetic strategies enabling it to contend with a range of environmental conditions. For instance, the allocation fraction of reserve to soma (κ) is among the highest of any bivalves, which is consistent with the observation that this species can survive over a wide range of environmental conditions, including prolonged periods of starvation.
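Several records above invoke the standard DEB model. As a didactic sketch (not the calibrated M. californianus model, and with illustrative parameter values only), the core reserve/structure dynamics, including the allocation fraction κ discussed above, can be written as a single Euler step:

```python
def deb_step(E, V, dt, p_Am, v, kappa, p_M, E_G, f):
    """One Euler step of a simplified standard-DEB reserve/structure model.

    E: reserve energy; V: structural volume; f: scaled functional response
    (0..1); p_Am (max surface-specific assimilation), v (energy conductance),
    kappa (allocation fraction to soma), p_M (volume-specific maintenance),
    E_G (cost of structure) are DEB parameters. Didactic sketch only.
    """
    L = V ** (1.0 / 3.0)                      # structural length
    p_A = f * p_Am * L ** 2                   # assimilation flux
    # mobilisation flux from reserve (standard DEB form)
    p_C = E * (E_G * v * L ** 2 + p_M * V) / (kappa * E + E_G * V)
    dE = p_A - p_C                            # reserve dynamics
    dV = (kappa * p_C - p_M * V) / E_G        # growth of structure
    return E + dE * dt, V + dV * dt
```

A high κ, as estimated above for M. californianus, routes more of the mobilised flux p_C to soma, which buffers structure against starvation.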
Tie, Xuexi; Emmons, Louisa; Horowitz, Larry; Brasseur, Guy; Ridley, Brian; Atlas, Elliot; Stround, Craig; Hess, Peter; Klonecki, Andrzej; Madronich, Sasha; Talbot, Robert; Dibb, Jack
2003-02-01
The distributions of NOx and O3 are analyzed during TOPSE (Tropospheric Ozone Production about the Spring Equinox). In this study these data are compared with the calculations of a global chemical/transport model (Model for OZone And Related chemical Tracers (MOZART)). Specifically, the effect that hydrolysis of N2O5 on sulfate aerosols has on tropospheric NOx and O3 budgets is studied. The results show that without this heterogeneous reaction, the model significantly overestimates NOx concentrations at high latitudes of the Northern Hemisphere (NH) in winter and spring in comparison to the observations during TOPSE; with this reaction, modeled NOx concentrations are close to the measured values. This comparison provides evidence that the hydrolysis of N2O5 on sulfate aerosol plays an important role in controlling the tropospheric NOx and O3 budgets. The calculated reduction of NOx attributed to this reaction is 80 to 90% in winter at high latitudes over North America. Because of the reduction of NOx, O3 concentrations are also decreased. The maximum O3 reduction occurs in spring, although the maximum NOx reduction occurs in winter, when photochemical O3 production is relatively low. The uncertainties related to the uptake coefficient and aerosol loading in the model are analyzed. The analysis indicates that the changes in NOx due to these uncertainties are much smaller than the impact of hydrolysis of N2O5 on sulfate aerosol. The effect of N2O5 hydrolysis on the global NOx and O3 budgets is also assessed by the model. The results suggest that in the Northern Hemisphere, the average NOx budget decreases by 50% due to this reaction in winter and by 5% in summer. The average O3 budget is reduced by 8% in winter and 6% in summer. In the Southern Hemisphere (SH), the sulfate aerosol loading is significantly smaller than in the Northern Hemisphere. As a result, sulfate aerosol has little impact on NOx and O3 budgets of the Southern Hemisphere.
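The uptake-coefficient sensitivity discussed above rests on the standard first-order parameterization of heterogeneous loss, k = γ·c̄·A/4, where γ is the uptake coefficient, c̄ the mean molecular speed, and A the aerosol surface area density. A minimal sketch (illustrative values, not the MOZART implementation):

```python
import math

def n2o5_uptake_rate(gamma, surface_area_density, temp_k, molar_mass=0.108):
    """First-order heterogeneous loss rate (s^-1) of N2O5 on aerosol:
    k = gamma * c_bar * A / 4.

    gamma                -- uptake coefficient (dimensionless)
    surface_area_density -- aerosol surface area per volume of air (m^2/m^3)
    temp_k               -- air temperature (K)
    molar_mass           -- kg/mol (0.108 for N2O5)
    """
    R = 8.314  # J/(mol K)
    c_bar = math.sqrt(8.0 * R * temp_k / (math.pi * molar_mass))  # m/s
    return gamma * surface_area_density * c_bar / 4.0
```

Because k scales linearly in both γ and A, the model's NOx response to their uncertainties is correspondingly linear to first order.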
Memmesheimer, M.; Ebel, A.; Roemer, M.
1997-01-01
Results from two air quality models (LOTOS, EURAD) have been used to analyse the contribution of the different terms in the continuity equation to the budget of ozone, NO(x) and PAN. Both models cover large parts of Europe and describe the processes relevant for tropospheric chemistry and dynamics.
Directory of Open Access Journals (Sweden)
J. Ryder
2014-12-01
In Earth system modelling, a description of the energy budget of the vegetated surface layer is fundamental, as it determines the meteorological conditions in the planetary boundary layer and as such contributes to the atmospheric conditions and circulation. The energy budget in most Earth system models has long been based on a "big-leaf approach", with averaging schemes that represent in-canopy processes. Such models have difficulties in consistently reproducing the energy balance in field observations. We here outline a newly developed numerical model for energy budget simulation, as a component of the land surface model ORCHIDEE-CAN (Organising Carbon and Hydrology In Dynamic Ecosystems – CANopy). This new model implements techniques from single-site canopy models in a practical way. It includes representation of in-canopy transport, a multilayer longwave radiation budget, height-specific calculation of aerodynamic and stomatal conductance, and interaction with the bare soil flux within the canopy space. Significantly, it avoids iterations over the height of the canopy and so maintains implicit coupling to the atmospheric model LMDz. As a first test, the model is evaluated against data from both an intensive measurement campaign and longer-term eddy covariance measurements for the intensively studied Eucalyptus stand at Tumbarumba, Australia. The model performs well in replicating both diurnal and annual cycles of fluxes, as well as the gradients of sensible heat fluxes. However, the model overestimates sensible heat flux while underestimating the radiation budget. Improved performance is expected through the implementation of a more detailed calculation of stand albedo and a more up-to-date stomatal conductance calculation.
Directory of Open Access Journals (Sweden)
Edo Cvrkalj
2015-12-01
Traditional budgeting principles, with strictly defined business goals, have been, since 1998, slowly growing into more sophisticated and organization-adjusted alternative budgeting concepts. One of those alternative concepts is the "Beyond budgeting" model with an implemented performance effects measuring process. In order for the model to be practicable, budget planning and control have to be reoriented to the "bottom up" planning and control approach. In today's modern business surroundings one has to take both present and future opportunities and threats into consideration, by valorizing them in a budget which would allow a company to realize a whole palette of advantages over the traditional budgeting principles, which are presented later in the article. It is essential to emphasize the importance of successfully implementing the new budgeting principles within an organization. If the implementation is lacking and done without a higher goal in mind, the process may easily end up without a coordination, planning and control framework within the organization itself. Later in the article we present an overview of managerial techniques and instruments within the "Beyond budgeting" model, such as the balanced scorecard, rolling forecasts, dashboards, KPIs and other supporting instruments. Lastly we define seven steps for implementing the "Beyond budgeting" model and offer a comparison of the "Beyond budgeting" model against traditional budgeting principles, listing twelve reasons why "Beyond budgeting" is better suited to modern and market-oriented organizations. Each company faces these challenges in its own characteristic way, but implementing new dynamic planning models will soon become essential for surviving in the market.
Madenjian, Charles P.; David, Solomon R.; Pothoven, Steven A.
2012-01-01
We evaluated the performance of the Wisconsin bioenergetics model for lake trout Salvelinus namaycush that were fed ad libitum in laboratory tanks under regimes of low activity and high activity. In addition, we compared model performance under two different model algorithms: (1) balancing the lake trout energy budget on day t based on lake trout energy density on day t and (2) balancing the lake trout energy budget on day t based on lake trout energy density on day t + 1. Results indicated that the model significantly underestimated consumption for both inactive and active lake trout when algorithm 1 was used and that the degree of underestimation was similar for the two activity levels. In contrast, model performance substantially improved when using algorithm 2, as no detectable bias was found in model predictions of consumption for inactive fish and only a slight degree of overestimation was detected for active fish. The energy budget was accurately balanced by using algorithm 2 but not by using algorithm 1. Based on the results of this study, we recommend the use of algorithm 2 to estimate food consumption by fish in the field. Our study results highlight the importance of accurately accounting for changes in fish energy density when balancing the energy budget; furthermore, these results have implications for the science of evaluating fish bioenergetics model performance and for more accurate estimation of food consumption by fish in the field when fish energy density undergoes relatively rapid changes.
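The two budget-balancing algorithms compared above can be sketched directly. The function below is an illustrative reconstruction, not the Wisconsin model itself: it estimates daily consumption by valuing the change in body energy with the energy density of day t (algorithm 1) or day t + 1 (algorithm 2), with respiration and waste lumped into a single daily cost.

```python
def consumption_estimates(mass, energy_density, metabolic_cost,
                          use_day_t_plus_1):
    """Estimate daily consumption (J/day) by balancing a fish energy budget.

    mass[t], energy_density[t] -- body mass (g) and energy density (J/g) on
    day t; metabolic_cost lumps respiration and waste losses (J/day).
    use_day_t_plus_1=False -> algorithm 1; True -> algorithm 2.
    Illustrative sketch of the two algorithms discussed above.
    """
    estimates = []
    for t in range(len(mass) - 1):
        if use_day_t_plus_1:  # algorithm 2: value end-of-day energy at ED[t+1]
            growth_energy = (mass[t + 1] * energy_density[t + 1]
                             - mass[t] * energy_density[t])
        else:                 # algorithm 1: value growth at ED[t] only
            growth_energy = (mass[t + 1] - mass[t]) * energy_density[t]
        estimates.append(growth_energy + metabolic_cost)
    return estimates
```

When energy density is rising, algorithm 1 ignores the energy stored by the densification of existing tissue, which is one plausible route to the underestimation of consumption reported above.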
Directory of Open Access Journals (Sweden)
Pattnaik Monalisha
2015-01-01
In this paper, the concept of fuzzy non-linear programming is applied to solve an economic order quantity (EOQ) model under restricted budget and space. Since various types of uncertainty and imprecision are inherent in real inventory problems, they are classically modeled using approaches from probability theory. However, there are uncertainties that cannot be appropriately treated by the usual probabilistic models. The questions are how to define inventory optimization tasks in such an environment and how to interpret the optimal solutions. This paper allows the modification of the single-item EOQ model in the presence of a fuzzy decision-making process, where demand is related to the unit price and the setup cost varies with the quantity produced/purchased. The modification of the objective function, budget, and storage area in the presence of imprecisely estimated parameters is considered. The model is developed by employing different approaches over an infinite planning horizon. It incorporates all the concepts of a fuzzy arithmetic approach and a comparative analysis with other non-linear models. Investigation of the properties of an optimal solution allows developing an algorithm whose validity is illustrated by an example problem; two- and three-dimensional diagrams of this application are produced using MATLAB (R2009a) software. Sensitivity analysis of the optimal solution is studied with respect to changes of different parameter values to obtain managerial insights into the decision problem.
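Before fuzzification, the crisp version of the constrained EOQ problem above has a simple closed form: the total cost D·K/Q + h·Q/2 is convex in Q, so if the unconstrained optimum violates the budget (p·Q ≤ B) or space (w·Q ≤ S) cap, the optimum sits at the tightest cap. A minimal sketch (crisp model only, not the paper's fuzzy formulation):

```python
import math

def constrained_eoq(demand, setup_cost, holding_cost,
                    unit_price, budget, unit_space, space):
    """Crisp EOQ under budget and space constraints.

    Unconstrained EOQ: Q* = sqrt(2*D*K/h). Because total cost is convex
    in Q, a binding budget (p*Q <= B) or space (w*Q <= S) constraint
    moves the optimum to the cap.
    """
    q_star = math.sqrt(2.0 * demand * setup_cost / holding_cost)
    q_cap = min(budget / unit_price, space / unit_space)
    return min(q_star, q_cap)
```

The fuzzy model in the paper replaces the crisp budget and space bounds with imprecise ones; this sketch only shows the underlying crisp structure being fuzzified.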
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Wijsman, J.W.M.; Smaal, A.C.
2011-01-01
A Dynamic Energy Budget (DEB) model for cockles is presented and calibrated using detailed data on cockle growth and water quality in the Oosterschelde. Cockles in the intertidal areas of the Oosterschelde have an important function as a food source for wading birds and as such for the natural value
McDavitt, B.; O'Connor, M.
2003-12-01
The Pacific Lumber Company Habitat Conservation Plan requires watershed analyses to be conducted on their property. This paper summarizes a portion of that analysis focusing on erosion and sedimentation processes and rates coupled with downstream sediment routing in the Freshwater Creek watershed in northwest California. Watershed scale erosion sources from hillslopes, roads, and channel banks were quantified using field surveys, aerial photo interpretation, and empirical modeling approaches for different elements of the study. Sediment transport rates for bedload were modeled, and sediment transport rates for suspended sediment were estimated based on size distribution of sediment inputs in relation to sizes transported in suspension. Recent short-term, high-quality estimates of suspended sediment yield that a community watershed group collected with technical assistance from the US Forest Service were used to validate the resulting sediment budget. Bedload yield data from an adjacent watershed, Jacoby Creek, provided another check on the sediment budget. The sediment budget techniques and bedload routing models used for this study generated sediment yield estimates that are in good agreement with available data. These results suggest that sediment budget techniques that require moderate levels of fieldwork can be used to provide relatively accurate technical assessments. Ongoing monitoring of sediment sources coupled with sediment routing models and reach scale field data allows for predictions to be made regarding in-channel sediment storage.
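The validation step described above (checking budget-predicted yields against independently measured yields) amounts to a mass-balance closure test. A minimal sketch, with hypothetical source categories and units:

```python
def budget_closure(inputs_t_per_yr, storage_change_t_per_yr,
                   measured_yield_t_per_yr):
    """Check a sediment budget against a measured yield.

    Predicted yield = total inputs - net change in channel storage
    (tonnes/yr). Returns the predicted yield and its relative error
    against the measurement. Source categories are illustrative.
    """
    predicted = sum(inputs_t_per_yr.values()) - storage_change_t_per_yr
    rel_error = (predicted - measured_yield_t_per_yr) / measured_yield_t_per_yr
    return predicted, rel_error
```

A small relative error, as reported above for Freshwater Creek, indicates the field-based source estimates and routing assumptions are mutually consistent.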
Swinkels, J.M.; Hogeveen, H.; Zadoks, R.N.
2005-01-01
Clinical Staphylococcus aureus mastitis is rarely treated during lactation because it is widely believed to be uneconomical, although there are no economic studies that support this view. Partial budgeting was used to develop a deterministic simulation model to estimate the net cost or benefit of ant
Fournier, N.; Tang, Y.S.; Dragosits, U.; Kluizenaar, Y.de; Sutton, M.A.
2005-01-01
Atmospheric budgets of reduced nitrogen for the major political regions of the British Isles are investigated with a multi-layer atmospheric transport model. The model is validated against measurements of NH3 concentration and is developed to provide atmospheric budgets for defined subdomains of the
Multi-Sensor Model-Data Assimilation for Improved Modeling of Savanna Carbon and Water Budgets
Barrett, D. J.; Renzullo, L. J.; Guerschman, J.; Hill, M. J.
2007-12-01
Model-data assimilation methods are increasingly being used to improve model predictions of carbon pools and fluxes, soil profile moisture contents, and evapotranspiration at catchment to regional scales. In this talk, I will discuss the development of model-data assimilation methods for application to parameter and state estimation problems in the context of savanna carbon and water cycles. A particular focus of this talk will be on the integration of in situ datasets and multiple types of satellite observations with radiative transfer, surface energy balance, and carbon budget models. An example will be drawn from existing work demonstrating regional estimation of soil profile moisture content based on multiple satellite sensors. The data assimilation scheme comprised a forward model, observation operators, multiple observation datasets and an optimization scheme. The forward model propagates model state variables in time based on climate forcing, initial conditions and model parameters and includes processes governing evapotranspiration, water budget and carbon cycle processes. The observation operators calculate modeled land surface temperature and microwave brightness temperatures based on the state variables of profile soil moisture and soil surface layer soil moisture at less than 2.5 cm depth. Satellite observations used in the assimilation scheme are surface brightness temperatures from AMSR-E (passive microwave at 6.9GHz at horizontal polarization) and from AVHRR (thermal channels 4 & 5 from NOAA-18), and land surface reflectances from MODIS Terra (channels 1 and 2 at 250m resolution). These three satellite sensors overpass at approximately the same time of day and provide independent observations of the land surface at different wavelengths. The observed brightness temperatures are used as constraints on the coupled energy balance/microwave radiative transfer model, and a canopy optical model was inverted to retrieve leaf area indices from observed
Using a unit cost model to predict the impact of budget cuts on logistics products and services
Van Haasteren, Cleve J.
1992-01-01
Approved for Public Release; Distribution is Unlimited The Director of the Trident Integrated Logistics Support Division at the Naval Sea Systems Command manages a complex and dynamic budget that supports the provision of logistics products and services to the Trident submarine fleet. This thesis focuses on analyzing the Logistics Division budget and developing a model where the impact of a budget cut can be predicted by employing marginal cost. The thesis also explores ...
The effects of atmospheric chemistry on the radiation budget in the Community Earth System Model
Choi, Y.; Czader, B.; Diao, L.; Rodriguez, J.; Jeong, G.
2013-12-01
Community Earth System Model (CESM)-Whole Atmosphere Community Climate Model (WACCM) simulations were performed to study the impact of atmospheric chemistry on the radiation budget over the surface within a weather prediction time scale. A secondary goal is to obtain a simplified and optimized chemistry module for the short time period. Three different chemistry modules were utilized to represent tropospheric and stratospheric chemistry, which differ in how their reactions and species are represented: (1) simplified tropospheric and stratospheric chemistry (approximately 30 species), (2) simplified tropospheric chemistry and comprehensive stratospheric chemistry from the Model of Ozone and Related Chemical Tracers, version 3 (MOZART-3, approximately 60 species), and (3) comprehensive tropospheric and stratospheric chemistry (MOZART-4, approximately 120 species). Our results indicate that the differing levels of detail in chemistry treatment across these model components affect the surface temperature and impact the radiation budget.
Lika, K.; Kearney, M.R.; Freitas, V.; Veer, van der H.W.; Meer, van der J.; Wijsman, J.W.M.; Pecquerie, L.; Kooijman, S.A.L.M.
2011-01-01
The Dynamic Energy Budget (DEB) theory for metabolic organisation captures the processes of development, growth, maintenance, reproduction and ageing for any kind of organism throughout its life-cycle. However, the application of DEB theory is challenging because the state variables and parameters a
The paper applies mathematical control theory to accounting network flows, where the flow rates are constrained by linear inequalities. The...cross-section phase of the problem, which is characterized by linear programming, and the dynamic phase of the problem, which is characterized by control theory. (Author)
Stochastic Climate Theory and Modelling
Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio
2014-01-01
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...
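The subgrid-scale parameterizations mentioned above are often built from red noise. A minimal sketch of the usual building block, an AR(1) process scaled to a prescribed stationary standard deviation (a generic illustration, not any specific scheme from the paper):

```python
import math
import random

def ar1_series(n, phi, sigma, seed=0):
    """AR(1) red-noise series, a common building block for stochastic
    subgrid parameterizations and model-error representations.

    phi   -- lag-1 autocorrelation (0 <= phi < 1)
    sigma -- target stationary standard deviation; the innovation is
             scaled by sqrt(1 - phi^2) so the series is stationary at sigma
    """
    rng = random.Random(seed)
    x, out = 0.0, []
    innovation_scale = sigma * math.sqrt(1.0 - phi ** 2)
    for _ in range(n):
        x = phi * x + innovation_scale * rng.gauss(0.0, 1.0)
        out.append(x)
    return out
```

Choosing phi sets the decorrelation time of the perturbation, which is the main tuning knob when such noise is applied to parameterization tendencies.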
Model companions of theories with an automorphism
Kikyo, Hirotaka
1998-01-01
For a theory $T$ in $L$, $T_\sigma$ is the theory of the models of $T$ with an automorphism $\sigma$. If $T$ is an unstable model complete theory without the independence property, then $T_\sigma$ has no model companion. If $T$ is an unstable model complete theory and $T_\sigma$ has the amalgamation property, then $T_\sigma$ has no model companion. If $T$ is model complete and has the fcp, then $T_\sigma$ has no model completion.
Fos, Peter J; Miller, Danny L; Amy, Brian W; Zuniga, Miguel A
2004-01-01
State public health agencies are charged with providing and overseeing the management of basic public health services on a population-wide basis. These activities have a re-emphasized focus as a result of the events of September 11, 2001, the subsequent anthrax events, and the continuing importance placed on bioterrorism preparedness, West Nile virus, and emerging infectious diseases (e.g., monkeypox, SARS). This has added to the tension that exists in budgeting and planning, given the diverse constituencies that are served in each state. State health agencies must be prepared to allocate finite resources in a more formal manner to be able to provide basic public health services on a routine basis, as well as during outbreaks. This article describes the use of an analytical approach to assist financial analysis that is used for budgeting and planning in a state health agency. The combined benefits of decision science and financial analysis are needed to adequately and appropriately plan and budget to meet the diverse needs of the populations within a state. Health and financial indicators are incorporated into a decision model, based on multicriteria decision theory, that has been employed to acquire information about counties and public health program areas within a county that reflects the impact of planning and budgeting efforts. This information can be used to allocate resources, to distribute funds for health care services, and to guide public health finance policy formulation and implementation.
Directory of Open Access Journals (Sweden)
I Putu Yoga Bumi Pradana
2015-02-01
This study aims to present a reconciliation model of the bureaucratic principle (secrecy) and the democratic principle (transparency) by mapping which public information about local government budget management is accessible to the public and which is excluded (secret), based on the perceptions of the bureaucracy and the public. This study uses a mixed method with a sequential exploratory design; data were collected through surveys, in-depth interviews, and documents, and validated by source triangulation. The subjects of this study were divided into two groups of informants, the government bureaucracy and the public of Kupang, determined purposively. The results show that the Kupang government bureaucracy perceives 22 types of information (33.85%) as open and 42 types (64.62%) as closed, while the public perceives 29 types (44.62%) as open and 26 types (40%) as closed. Therefore, to achieve reconciliation and end the conflict between the bureaucracy and the public, budget management information should comprise 32 types of open information (49.2%) and 33 types of closed information (50.8%) out of the 65 types of budget management information defined by Regulation No. 13 of 2006 on Local Financial Management.
Turbulence Kinetic Energy budget during the afternoon transition – Part 2: A simple TKE model
Directory of Open Access Journals (Sweden)
E. Nilsson
2015-11-01
A simple model for turbulence kinetic energy (TKE) and the TKE budget is presented for sheared convective atmospheric conditions, based on observations from the Boundary Layer Late Afternoon and Sunset Turbulence (BLLAST) field campaign. It is based on an idealized mixed-layer approximation and a simplified near-surface TKE budget. In this model, the TKE depends on four budget terms (turbulent dissipation rate, buoyancy production, shear production and vertical transport of TKE) and requires only three input measurements (near-surface buoyancy flux, boundary layer depth and wind speed at one height in the surface layer). This simple model is shown to reproduce reasonably well some of the observed variations between the different studied days in terms of near-surface TKE and its decay during the afternoon transition. It is subsequently used to systematically study the effects of buoyancy and shear on TKE evolution using idealized constant and time-varying winds during the afternoon transition. From this, we conclude that many different TKE decay rates are possible under time-varying winds and that generalizing the decay with simple scaling laws for near-surface TKE of the form t^α may be questionable. The model's errors result from the exclusion of processes such as elevated shear production and horizontal advection. The model also produces an overly rapid decay of shear production with height. However, the most influential budget terms governing near-surface TKE in the observed sheared convective boundary layers are included, while only second-order factors are neglected. Comparison between modeled and averaged observed estimates of dissipation rate illustrates that the overall behavior of the model is often quite reasonable. Therefore, we use the model to discuss the low turbulence conditions that form first in the upper parts of the boundary layer during the afternoon transition and are only apparent later near the surface. This
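The four-term budget described above can be sketched as a toy integration. This is not the BLLAST model itself: the dissipation closure via a mixing length L (ε = e^{3/2}/L) and the constant forcing terms are illustrative assumptions.

```python
def integrate_tke(e0, dt, nsteps, buoyancy, shear, transport, length_scale):
    """Euler integration of a four-term near-surface TKE budget,
    de/dt = B + S + T - e^(3/2)/L, with dissipation closed through a
    mixing length L. All forcings held constant here for simplicity.
    """
    e = e0
    series = [e]
    for _ in range(nsteps):
        dissipation = e ** 1.5 / length_scale
        e = max(e + dt * (buoyancy + shear + transport - dissipation), 0.0)
        series.append(e)
    return series
```

With time-varying shear production fed in instead of a constant, the decay departs from any single t^α law, which is the point made above about scaling-law generality.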
Reconciled climate response estimates from climate models and the energy budget of Earth
Richardson, Mark; Cowtan, Kevin; Hawkins, Ed; Stolpe, Martin B.
2016-10-01
Climate risks increase with mean global temperature, so knowledge about the amount of future global warming should better inform risk assessments for policymakers. Expected near-term warming is encapsulated by the transient climate response (TCR), formally defined as the warming following 70 years of 1% per year increases in atmospheric CO2 concentration, by which point atmospheric CO2 has doubled. Studies based on Earth's historical energy budget have typically estimated lower values of TCR than climate models, suggesting that some models could overestimate future warming. However, energy-budget estimates rely on historical temperature records that are geographically incomplete and blend air temperatures over land and sea ice with water temperatures over open oceans. We show that there is no evidence that climate models overestimate TCR when their output is processed in the same way as the HadCRUT4 observation-based temperature record. Models suggest that air-temperature warming is 24% greater than observed by HadCRUT4 over 1861-2009 because slower-warming regions are preferentially sampled and water warms less than air. Correcting for these biases and accounting for wider uncertainties in radiative forcing based on recent evidence, we infer an observation-based best estimate for TCR of 1.66 °C, with a 5-95% range of 1.0-3.3 °C, consistent with the climate models considered in the IPCC 5th Assessment Report.
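In its simplest form, the energy-budget approach referred to above reduces to scaling observed warming by the ratio of the doubled-CO2 forcing to the realized forcing change. A minimal sketch, with round illustrative numbers rather than the study's inputs:

```python
# One-line energy-budget estimator for the transient climate response:
#   TCR = F_2x * dT / dF
# dT: observed warming (deg C), dF: radiative forcing change (W m^-2)
# over the same period, F_2x: forcing from doubled CO2.
# All numbers below are round illustrative values, not the paper's data.

def tcr_energy_budget(delta_T, delta_F, F_2x=3.7):
    """Return TCR in deg C given warming (deg C) and forcings (W m^-2)."""
    return F_2x * delta_T / delta_F

# A blended, geographically incomplete record understates warming; a 24%
# upward correction to delta_T illustrates the bias discussed above:
tcr_blended = tcr_energy_budget(delta_T=0.75, delta_F=2.0)
tcr_corrected = tcr_energy_budget(delta_T=0.75 * 1.24, delta_F=2.0)
```

The correction raises the inferred TCR, which is the direction of the reconciliation reported in the abstract.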
Stochastic models: theory and simulation.
Energy Technology Data Exchange (ETDEWEB)
Field, Richard V., Jr.
2008-03-01
Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
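As a taste of the sample-generation algorithms the report describes, here is a minimal sketch of one standard method, the spectral representation of a stationary Gaussian process. The spectral density, frequency grid and mode count are illustrative assumptions, not choices from the report.

```python
# Spectral-representation sampler for a stationary Gaussian process:
#   X(t) = sum_k sqrt(2 S(w_k) dw) [A_k cos(w_k t) + B_k sin(w_k t)]
# with A_k, B_k independent standard normals.

import numpy as np

def sample_stationary_gp(t, spectral_density, w_max=10.0, n_modes=512, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    w = np.linspace(w_max / n_modes, w_max, n_modes)   # frequency grid
    dw = w[1] - w[0]
    amp = np.sqrt(2.0 * spectral_density(w) * dw)       # mode amplitudes
    A = rng.standard_normal(n_modes)
    B = rng.standard_normal(n_modes)
    # superpose cosine/sine modes weighted by independent Gaussians
    modes = A * np.cos(np.outer(t, w)) + B * np.sin(np.outer(t, w))
    return (amp * modes).sum(axis=1)

S = lambda w: 1.0 / (1.0 + w**2)   # Lorentzian-type density (illustrative)
t = np.linspace(0.0, 20.0, 200)
x = sample_stationary_gp(t, S, rng=np.random.default_rng(0))
```

Each call yields an independent sample path, which is exactly the role such generators play as inputs to deterministic simulation codes.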
Theory and Experience in Deliberative Democracy and Budget Review
Institute of Scientific and Technical Information of China (English)
顾维萌
2012-01-01
The theory of deliberative democracy emphasizes that citizens should participate in public affairs through free and equal rational dialogue, debate, consultation and deliberation, conferring legitimacy on legislation and decision-making; this is of great significance for budget review. The participatory budgeting practices of Porto Alegre in Brazil and Wenling in China are typical cases of deliberative democracy. Drawing on deliberative democracy theory and participatory budgeting practice, improving China's budget review system should start with consummating the legal framework, constructing procedural mechanisms, and strengthening citizens' capacity for autonomous participation.
Impact of surface wind biases on the Antarctic sea ice concentration budget in climate models
Lecomte, O.; Goosse, H.; Fichefet, T.; Holland, P. R.; Uotila, P.; Zunz, V.; Kimura, N.
2016-09-01
We derive the terms in the Antarctic sea ice concentration budget from the output of three models and compare them to observations of the same terms. The models include two climate models from the 5th Coupled Model Intercomparison Project (CMIP5) and one ocean-sea ice coupled model with prescribed atmospheric forcing. Sea ice drift and wind fields from these models, averaged over April-October 1992-2005, all exhibit large differences from the available observational or reanalysis datasets. However, the discrepancies between the two distinct ice drift products, or between the two wind reanalyses used here, are sometimes even greater than those differences. Two major findings stand out from the analysis. First, large biases in sea ice drift speed and direction in exterior sectors of the sea-ice-covered region tend to be systematic and consistent with those in winds. This suggests that sea ice errors in these areas are most likely wind-driven, as are errors in the simulated ice motion vectors. The systematic nature of these biases is less prominent in interior sectors, nearer the coast, where sea ice is mechanically constrained and its motion in response to the wind forcing depends more on the model rheology. Second, the intimate relationship between winds, sea ice drift and the sea ice concentration budget gives insight into ways to categorize models with regard to errors in their ice dynamics. In exterior regions, models with seemingly too-weak winds and slow ice drift consistently yield a lack of ice velocity divergence and hence a wrong wintertime sea ice growth rate. In interior sectors, too-slow ice drift, presumably originating from issues in the physical representation of sea ice dynamics as much as from errors in surface winds, leads to wrong timing of the late-winter ice retreat. These results illustrate that the applied methodology provides a valuable tool for prioritizing model improvements based on the ice concentration budget-ice drift biases-wind biases
Rosland, R.; Strand, Ø.; Alunno-Bruscia, M.; Bacher, C.; Strohmeier, T.
2009-08-01
A Dynamic Energy Budget (DEB) model for simulating growth and bioenergetics of blue mussels (Mytilus edulis) has been tested at three low-seston sites in southern Norway. The observations comprise four datasets from laboratory experiments (physiological and biometrical mussel data) and three datasets from in situ growth experiments (biometrical mussel data). Additional in situ data from commercial farms in southern Norway were used to estimate biometrical relationships in the mussels. Three DEB parameters (shape coefficient, half-saturation coefficient, and somatic maintenance rate coefficient) were estimated from experimental data, and the estimated parameters were complemented with parameter values from the literature to establish a basic parameter set. Model simulations based on the basic parameter set and site-specific environmental forcing matched observations fairly well, but the model was not successful in simulating growth under the extremely low seston regimes in the laboratory experiments, in which the long period of negative growth caused negative reproductive mass. Sensitivity analysis indicated that the model was moderately sensitive to changes in the parameters and initial conditions. The results show the robust properties of the DEB model, as it manages to simulate mussel growth in several independent datasets from a common basic parameter set. However, the results also demonstrate the limitations of Chl a as a food proxy for blue mussels and the limitations of the DEB model in simulating long-term starvation. Future work should aim at establishing better food proxies and improving the model formulations of the processes involved in food ingestion and assimilation. The current DEB model should also be elaborated to allow shrinking of the structural tissue in order to produce more realistic growth simulations during long periods of starvation.
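For readers unfamiliar with the state equations behind such models, a minimal sketch of the standard DEB dynamics (reserve E, structural volume V, reproduction buffer E_R) follows. The parameter values are illustrative placeholders, not the estimated blue-mussel set.

```python
# Euler integration of the standard DEB model. All parameter values
# are illustrative placeholders, not fitted species parameters.

def deb_step(E, V, E_R, f, dt, p_Am=100.0, v=0.02, EG=5000.0, pM=20.0, kap=0.8):
    """Advance the standard DEB state one Euler step of dt days."""
    L = V ** (1.0 / 3.0)                 # structural length
    p_A = f * p_Am * L**2                # assimilation (f = scaled food density)
    E_dens = E / V                       # reserve density
    # mobilization flux from reserve (standard DEB form)
    p_C = E_dens * (EG * v * L**2 + pM * V) / (kap * E_dens + EG)
    p_M = pM * V                         # somatic maintenance
    growth = max(kap * p_C - p_M, 0.0) / EG
    E += dt * (p_A - p_C)                # reserve dynamics
    V += dt * growth                     # structure grows from kappa branch
    E_R += dt * (1.0 - kap) * p_C        # remainder to maturity/reproduction
    return E, V, E_R

E, V, E_R = 500.0, 1.0, 0.0
for _ in range(365):                     # one year at constant food
    E, V, E_R = deb_step(E, V, E_R, f=0.8, dt=1.0)
```

The kappa rule visible in `growth` and `E_R` is what lets a single parameter set describe growth and reproduction simultaneously, which is the property exploited in the study above.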
A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France
Energy Technology Data Exchange (ETDEWEB)
De Cock, Erwin, E-mail: erwin.decock@unitedbiosource.com [United BioSource Corporation, Peri- and Post-Approval Services (Spain); Sapoval, Marc, E-mail: Marc.sapoval2@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Department of Cardiovascular and Interventional Radiology (France); Julia, Pierre, E-mail: pierre.julia@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Cardiovascular Surgery Department (France); Lissovoy, Greg de, E-mail: gdelisso@jhsph.edu [Johns Hopkins Bloomberg School of Public Health, Department of Health Policy and Management (United States); Lopes, Sandra, E-mail: Sandra.Lopes@CookMedical.com [Cook Medical, Health Economics and Reimbursement (Denmark)
2013-04-15
The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis, from the payer perspective, used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9%) and BMS rates reported in the literature (average 16 and 22%) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) between a scenario where BMS is progressively replaced by Zilver PTX and a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.
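The budget-impact arithmetic described above can be sketched in a few lines: total cost is primary stenting plus TLR reinterventions, compared between a BMS-only scenario and a mixed scenario. All tariffs, TLR rates and uptake shares below are illustrative assumptions, not the published French inputs.

```python
# Toy budget-impact comparison: drug-eluting stent (DES) vs bare-metal
# stent (BMS). All prices and rates are hypothetical round numbers.

def scenario_cost(n_patients, des_share, tlr_des=0.09, tlr_bms=0.22,
                  cost_stent_des=1400.0, cost_stent_bms=800.0,
                  cost_reintervention=6000.0):
    """Total cost (EUR) of primary stenting plus TLR reinterventions."""
    n_des = n_patients * des_share
    n_bms = n_patients - n_des
    primary = n_des * cost_stent_des + n_bms * cost_stent_bms
    reinterventions = (n_des * tlr_des + n_bms * tlr_bms) * cost_reintervention
    return primary + reinterventions

n = 80_000  # hypothetical 5-year patient population
net_impact = scenario_cost(n, des_share=0.25) - scenario_cost(n, des_share=0.0)
```

With these numbers the avoided-reintervention saving per switched patient (0.13 x 6000 = 780) exceeds the stent price premium (600), so `net_impact` is negative, i.e. a budget reduction, mirroring the study's qualitative conclusion.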
Energy Technology Data Exchange (ETDEWEB)
Sanchez-Gomez, E. [CERFACS/CNRS, SUC URA1875, Toulouse Cedex (France); Somot, S.; Dubois, C.; Deque, M. [CNRM/GAME, Meteo-France/CNRS, Toulouse (France); Josey, S.A. [National Oceanography Centre, Southampton (United Kingdom); Elguindi, N. [LA, CNRS, Toulouse (France)
2011-11-15
Air-sea heat and freshwater fluxes in the Mediterranean Sea play a crucial role in dense water formation. Here, we compare estimates of Mediterranean Sea heat and water budgets from a range of observational datasets and discuss the main differences between them. Taking into account the closure hypothesis at the Gibraltar Strait, we have built several observational estimates of the water and heat budgets by combining their different observational components. We then provide three estimates for the water budget and one for the heat budget that satisfy the closure hypothesis. We use these observational estimates to assess the ability of an ensemble of ERA40-driven high-resolution (25 km) Regional Climate Models (RCMs) from the FP6-EU ENSEMBLES database to simulate the various components, and net values, of the water and heat budgets. Most of the RCM Mediterranean basin means are within the range spanned by the observational estimates of the different budget components, though in some cases the RCMs tend to overestimate the latent heat flux (or evaporation) with respect to observations. The RCMs do not show significant improvements in the total water budget estimates compared to ERA40. Moreover, given the large spread found in observational estimates of precipitation over the sea, it is difficult to draw conclusions on the performance of the RCMs for the freshwater budget, which underlines the need for better precipitation observations. The original ERA40 value for the basin-mean net heat flux is -15 W/m², which is 10 W/m² less than the value of -5 W/m² inferred from the transport measurements at the Gibraltar Strait. The heat budget values estimated from the models show that most of the RCMs do not achieve heat budget closure. However, the ensemble mean value for the net heat flux is -7 ± 21 W/m², which is close to the Gibraltar value, although the spread between the RCMs is large. Since the RCMs are forced by the same
Barkstrom, B. R.
1983-01-01
The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.
An energy-budget-based glacier melting model for the Tibetan Plateau
Ding, Baohong; Yang, Kun; Chen, Yingying
2013-04-01
There have been rapid glacier retreats on the Tibetan Plateau during the past few decades, which not only have far-reaching impacts on the water resources in this region, but also potentially threaten downstream areas with glacial lake outburst floods. It is therefore important to model the physical link between glacier melting and climate change and its implications for water resources. Existing glacier melting models are limited in their applicability to particular areas, and their simulation capability also needs improvement. This paper presents a new energy-budget-based model for the melting of mountain glaciers. Enthalpy, rather than temperature, is used in the energy balance equations to simplify the computation of the energy transfer through water phase transitions and within-snow liquid water movement. Heat transfer is computed in both snow and ice layers, and an inhomogeneous layering method is employed to better describe the temperature profiles, especially at the interface between snow and atmosphere as well as that between snow and ice. A new parameterization scheme is introduced into the model to calculate turbulent heat transfer over glacier surfaces. The model was validated with data collected during a field experiment in the melting zone of the Parlung No. 4 Glacier in the southeastern Tibetan Plateau from May to August 2009. The results show that the RMSE of the simulated hourly surface temperature is about 0.97 °C and the R² is 0.81. The RMSEs of the simulated hourly latent and sensible heat fluxes are 14.5 W m⁻² and 23.5 W m⁻², respectively, and the R² values are 0.92 and 0.93. In general, this energy-budget-based model can reasonably simulate the glacier melting process. The model is still under development for a better simulation of glacier melting and its contribution to water resources.
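The melt step that a surface-energy-budget model of this kind performs at each time step can be sketched as follows. The flux values are illustrative, and the actual model's enthalpy bookkeeping and turbulent-flux parameterization are not reproduced here.

```python
# Surface energy balance melt: net energy at a 0 deg C surface goes
# into melt at the latent heat of fusion. Fluxes are illustrative.

L_F = 334_000.0  # latent heat of fusion, J kg^-1

def hourly_melt(sw_net, lw_net, sensible, latent, surface_at_melting=True):
    """Return melt in kg m^-2 (= mm w.e.) over one hour."""
    q = sw_net + lw_net + sensible + latent   # W m^-2, positive toward surface
    if not surface_at_melting or q <= 0.0:
        return 0.0   # energy instead warms or cools the snow/ice column
    return q * 3600.0 / L_F

# A sunny midday hour on a melting surface:
melt_mm = hourly_melt(sw_net=450.0, lw_net=-60.0, sensible=40.0, latent=-15.0)
```

Summing such hourly melt amounts over a season is what links the simulated energy budget to the glacier's contribution to runoff.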
Simulated effects of nitrogen saturation on the global carbon budget using the IBIS model
Lu, Xuehe; Jiang, Hong; Liu, Jinxun; Zhang, Xiuying; Jin, Jiaxin; Zhu, Qiuan; Zhang, Zhen; Peng, Changhui
2016-12-01
Over the past 100 years, human activity has greatly changed the rate of atmospheric N (nitrogen) deposition in terrestrial ecosystems, resulting in N saturation in some regions of the world. The contribution of N saturation to the global carbon budget remains uncertain due to the complicated nature of C-N (carbon-nitrogen) interactions and diverse geography. Although N deposition is included in most terrestrial ecosystem models, the effect of N saturation is frequently overlooked. In this study, the IBIS (Integrated BIosphere Simulator) was used to simulate the global-scale effects of N saturation during the period 1961–2009. The results of this model indicate that N saturation reduced global NPP (Net Primary Productivity) and NEP (Net Ecosystem Productivity) by 0.26 and 0.03 Pg C yr‑1, respectively. The negative effects of N saturation on carbon sequestration occurred primarily in temperate forests and grasslands. In response to elevated CO2 levels, global N turnover slowed due to increased biomass growth, resulting in a decline in soil mineral N. These changes in N cycling reduced the impact of N saturation on the global carbon budget. However, elevated N deposition in certain regions may further alter N saturation and C-N coupling.
DEFF Research Database (Denmark)
Saurel, Camille; Maar, Marie; Landes, Anja
such as food supply, temperature and salinity. In the Baltic Sea, a highly disturbed, eutrophied environment, mussel growth efficiency is limited by the very low salinity, and in areas where the salinity is below 8 psu, mussels appear in a dwarf form. The aim of the present study was to incorporate...... the effects of low salinity into an eco-physiological model of blue mussels and to identify areas suitable for cost-effective mussel production for mitigation culture. A standard Dynamic Energy Budget (DEB) model was modified with respect to i) the morphological parameters (DW/WW-ratio, shape factor), ii......) change in ingestion rate and iii) metabolic costs due to osmotic regulatory mechanisms for adapting to different salinity environments. The modified DEB model was validated with experimental data from different locations in the Western Baltic Sea, including the Limfjorden, with salinities ranging from 8
Motivation in Beyond Budgeting: A Motivational Paradox?
DEFF Research Database (Denmark)
Sandalgaard, Niels; Bukh, Per Nikolaj
In this paper we discuss the role of motivation in relation to budgeting, and we analyse how the Beyond Budgeting model functions compared with traditional budgeting. We focus on budget-related motivation (and motivation in general) and conclude that the Beyond Budgeting model...... is a motivational paradox....
Healy, Richard W.; Scanlon, Bridget R.
2010-01-01
A water budget is an accounting of water movement into and out of, and storage change within, some control volume. Universal and adaptable are adjectives that reflect key features of water-budget methods for estimating recharge. The universal concept of mass conservation of water implies that water-budget methods are applicable over any space and time scales (Healy et al., 2007). The water budget of a soil column in a laboratory can be studied at scales of millimeters and seconds. A water-budget equation is also an integral component of atmospheric general circulation models used to predict global climates over periods of decades or more. Water-budget equations can be easily customized by adding or removing terms to accurately portray the peculiarities of any hydrologic system. The equations are generally not bound by assumptions on mechanisms by which water moves into, through, and out of the control volume of interest. So water-budget methods can be used to estimate both diffuse and focused recharge, and recharge estimates are unaffected by phenomena such as preferential flow paths within the unsaturated zone. Water-budget methods represent the largest class of techniques for estimating recharge. Most hydrologic models are derived from a water-budget equation and can therefore be classified as water-budget models. It is not feasible to address all water-budget methods in a single chapter. This chapter is limited to discussion of the “residual” water-budget approach, whereby all variables in a water-budget equation, except for recharge, are independently measured or estimated and recharge is set equal to the residual. This chapter is closely linked with Chapter 3, on modeling methods, because the equations presented here form the basis of many models and because models are often used to estimate individual components in water-budget studies. Water budgets for streams and other surface-water bodies are addressed in Chapter 4. The use of soil-water budgets and
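The residual water-budget approach described in this chapter amounts to a single bookkeeping line: every term except recharge is measured or estimated, and recharge is set equal to the residual. The numbers below are illustrative annual totals, not values from any study.

```python
# Residual water-budget method: recharge is what is left of precipitation
# after evapotranspiration, runoff and storage change are accounted for.
# Example values are illustrative annual totals.

def residual_recharge(precip, et, runoff, delta_storage):
    """All terms in the same units (e.g. mm/yr); recharge is the residual."""
    return precip - et - runoff - delta_storage

R = residual_recharge(precip=1000.0, et=600.0, runoff=250.0, delta_storage=20.0)
```

A known caveat of the residual approach, implicit in the chapter, is that errors in the measured terms accumulate directly in the recharge estimate.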
Assessing the O2 budget under sea ice: An experimental and modelling approach
Directory of Open Access Journals (Sweden)
S. Moreau
2015-12-01
Full Text Available The objective of this study was to assess the O2 budget in the water under sea ice, combining observations and modelling. Modelling was used to discriminate between physical processes, gas-specific transport (i.e., ice-atmosphere gas fluxes and gas bubble buoyancy) and bacterial respiration (BR), and to constrain bacterial growth efficiency (BGE). A module describing the changes in the under-ice water properties, due to brine rejection and temperature-dependent BR, was implemented in the one-dimensional halo-thermodynamic sea ice model LIM1D. Our results show that BR was the dominant biogeochemical driver of O2 concentration in the water under ice (in a system without primary producers), followed by gas-specific transport. The model suggests that the actual contribution of BR and gas-specific transport to the change in seawater O2 concentration was 37% during ice growth and 48% during melt. BGE in the water under sea ice, as retrieved from the simulated O2 budget, was found to be between 0.4 and 0.5, which is in line with published BGE values for cold marine waters. Given the importance of BR to seawater O2 in the present study, it can be assumed that bacteria contribute substantially to organic matter consumption and gas fluxes in ice-covered polar oceans. In addition, we propose a parameterization of polar marine bacterial respiration, based on the strong temperature dependence of bacterial respiration and the high growth efficiency observed here, for further biogeochemical ocean modelling applications, such as regional or large-scale Earth System models.
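The temperature-dependent respiration term and the BGE relation discussed above can be sketched as follows. The Q10, reference rate and BGE value are illustrative placeholders rather than the study's fitted parameterization.

```python
# Q10-type temperature dependence for bacterial respiration (BR), plus
# the link from BGE to total organic carbon demand. All parameter
# values are illustrative placeholders.

def bacterial_respiration(T, BR_ref=1.0, T_ref=0.0, Q10=3.0):
    """O2 consumption rate (arbitrary units) at temperature T (deg C)."""
    return BR_ref * Q10 ** ((T - T_ref) / 10.0)

def organic_carbon_demand(BR, BGE=0.45):
    """Organic C processed per unit respired, given growth efficiency."""
    # BGE = growth / (growth + respiration)  =>  demand = BR / (1 - BGE)
    return BR / (1.0 - BGE)

br_cold = bacterial_respiration(T=-1.8)   # under-ice seawater
br_warm = bacterial_respiration(T=2.0)    # post-melt surface water
```

The second function illustrates why a BGE of 0.4-0.5 matters: roughly twice the respired carbon must be supplied as organic matter, which underpins the abstract's conclusion about bacterial contributions to gas fluxes.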
Ozone Budgets from a Global Chemistry/ Transport Model and Comparison to Observations from POLARIS
Kawa, S. Randy
1999-01-01
The objective of the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) field mission was to obtain data to better characterize the summertime seasonal decrease of ozone at mid to high latitudes. The decrease in ozone occurs mainly in the lower stratosphere and is expected to result from in situ chemical destruction. Instrumented balloons and aircraft were used in POLARIS, along with satellites, to measure ozone and chemical species involved in stratospheric ozone chemistry. In order to close the seasonal ozone budget, however, ozone transport must also be estimated. Comparison to a global chemistry and transport model (CTM) of the stratosphere indicates how well the summertime ozone loss processes are simulated and thus how well we can predict the ozone response to changing amounts of chemical source gases. Moreover, the model gives insight into the possible relative magnitude of transport contributions to the seasonal ozone decline. Initial comparison to the Goddard CTM, which uses transport winds and temperatures from meteorological data assimilation, shows a high ozone bias in the model and an attenuated summertime ozone loss cycle. Comparison of the model's chemical partitioning and ozone catalytic loss rates to those derived from measurements shows fairly close agreement both at ER-2 altitudes (20 km) and higher. This suggests that the model transport is too active in resupplying ozone to the high-latitude region, although chemistry failings cannot be completely ruled out. Comparisons of ozone and related species will be shown, along with a full diagnosis of the model ozone budget and its possible sources of error.
On Dimer Models and Closed String Theories
Sarkar, Tapobrata
2007-01-01
We study some aspects of the recently discovered connection between dimer models and D-brane gauge theories. We argue that dimer models are also naturally related to closed string theories on non-compact orbifolds of $\mathbb{C}^2$ and $\mathbb{C}^3$, via their twisted sector R charges, and show that perfect matchings in dimer models correspond to twisted sector states in the closed string theory. We also use this formalism to study the combinatorics of some unstable orbifolds of $\mathbb{C}^2$.
A water-budget model and estimates of groundwater recharge for Guam
Johnson, Adam G.
2012-01-01
On Guam, demand for groundwater tripled from the early 1970s to 2010. The demand for groundwater is anticipated to further increase in the near future because of population growth and a proposed military relocation to Guam. Uncertainty regarding the availability of groundwater resources to support the increased demand has prompted an investigation of groundwater recharge on Guam using the most current data and accepted methods. For this investigation, a daily water-budget model was developed and used to estimate mean recharge for various land-cover and rainfall conditions. Recharge was also estimated for part of the island using the chloride mass-balance method. Using the daily water-budget model, estimated mean annual recharge on Guam is 394.1 million gallons per day, which is 39 percent of mean annual rainfall (999.0 million gallons per day). Although minor in comparison to rainfall on the island, water inflows from water-main leakage, septic-system leachate, and stormwater runoff may be several times greater than rainfall at areas that receive these inflows. Recharge is highest in areas that are underlain by limestone, where recharge is typically between 40 and 60 percent of total water inflow. Recharge is relatively high in areas that receive stormwater runoff from storm-drain systems, but is relatively low in urbanized areas where stormwater runoff is routed to the ocean or to other areas. In most of the volcanic uplands in southern Guam where runoff is substantial, recharge is less than 30 percent of total water inflow. The water-budget model in this study differs from all previous water-budget investigations on Guam by directly accounting for canopy evaporation in forested areas, quantifying the evapotranspiration rate of each land-cover type, and accounting for evaporation from impervious areas. For the northern groundwater subbasins defined in Camp, Dresser & McKee Inc. (1982), mean annual baseline recharge computed in this study is 159.1 million gallons
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we are developing growth and bioaccumulation studies that contrib...
Domain Theory, Its Models and Concepts
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt
2014-01-01
Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts, and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contributions...... and industrial applications, especially in the DFX areas (not reported here) and in product modelling. The theory therefore contains a rich ontology of interrelated concepts. The Domain Theory does not aim to create normative methods but rather a collection of concepts related to design phenomena......, which can support design work and form elements of designers' mindsets and thereby their practice. It is a model-based theory, meaning that it is composed of concepts and models that explain certain design phenomena. Many similar theories are described in the literature with differences...
Quantum field theory competitive models
Tolksdorf, Jürgen; Zeidler, Eberhard
2009-01-01
For more than 70 years, quantum field theory (QFT) has been a driving force in the development of theoretical physics. Equally fascinating is the fruitful impact which QFT has had on rather remote areas of mathematics. The present book features some of the different approaches, different physical viewpoints and techniques used to make the notion of quantum field theory more precise. For example, the present book contains a discussion including general considerations, stochastic methods, deformation theory and the holographic AdS/CFT correspondence. It also contains a discussion of more recent developments, such as the use of category theory and topos-theoretic methods to describe QFT. The present volume emerged from the 3rd 'Blaubeuren Workshop: Recent Developments in Quantum Field Theory', held in July 2007 at the Max Planck Institute for Mathematics in the Sciences in Leipzig, Germany. All of the contributions are committed to the idea of this workshop series: 'To bring together outstanding experts working in...
A coupled biogeochemical-Dynamic Energy Budget model as a tool for managing fish production ponds.
Serpa, Dalila; Pousão-Ferreira, Pedro; Caetano, Miguel; Cancela da Fonseca, Luís; Dinis, Maria Teresa; Duarte, Pedro
2013-10-01
The sustainability of semi-intensive aquaculture relies on management practices that simultaneously improve production efficiency and minimize the environmental impacts of this activity. The purpose of the present work was to develop a mathematical model that reproduced the dynamics of a semi-intensive fish earth pond, to simulate different management scenarios for optimizing fish production. The modeling approach consisted of coupling a biogeochemical model that simulated the dynamics of the elements that are more likely to affect fish production and cause undesirable environmental impacts (nitrogen, phosphorus and oxygen) to a fish growth model based on the Dynamic Energy Budget approach. The biogeochemical sub-model successfully simulated most water column and sediment variables. A good model fit was also found between predicted and observed white seabream (Diplodus sargus) growth data over a production cycle. In order to optimize fish production, different management scenarios were analysed with the model (e.g. increase stocking densities, decrease/increase water exchange rates, decrease/increase feeding rates, decrease phosphorus content in fish feeds, increase food assimilation efficiency and decrease pellets sinking velocity) to test their effects on the pond environment as well as on fish yields and effluent nutrient discharges. Scenarios were quantitatively evaluated and compared using the Analytical Hierarchical Process (AHP) methodology. The best management options that allow the maximization of fish production while maintaining a good pond environment and minimum impacts on the adjacent coastal system were to double standard stocking densities and to improve food assimilation efficiency.
A water-budget model and estimates of groundwater recharge for Guam
Johnson, Adam G.
2012-01-01
On Guam, demand for groundwater tripled from the early 1970s to 2010. The demand for groundwater is anticipated to further increase in the near future because of population growth and a proposed military relocation to Guam. Uncertainty regarding the availability of groundwater resources to support the increased demand has prompted an investigation of groundwater recharge on Guam using the most current data and accepted methods. For this investigation, a daily water-budget model was developed and used to estimate mean recharge for various land-cover and rainfall conditions. Recharge was also estimated for part of the island using the chloride mass-balance method. Using the daily water-budget model, estimated mean annual recharge on Guam is 394.1 million gallons per day, which is 39 percent of mean annual rainfall (999.0 million gallons per day). Although minor in comparison to rainfall on the island, water inflows from water-main leakage, septic-system leachate, and stormwater runoff may be several times greater than rainfall at areas that receive these inflows. Recharge is highest in areas that are underlain by limestone, where recharge is typically between 40 and 60 percent of total water inflow. Recharge is relatively high in areas that receive stormwater runoff from storm-drain systems, but is relatively low in urbanized areas where stormwater runoff is routed to the ocean or to other areas. In most of the volcanic uplands in southern Guam where runoff is substantial, recharge is less than 30 percent of total water inflow. The water-budget model in this study differs from all previous water-budget investigations on Guam by directly accounting for canopy evaporation in forested areas, quantifying the evapotranspiration rate of each land-cover type, and accounting for evaporation from impervious areas. For the northern groundwater subbasins defined in Camp, Dresser & McKee Inc. (1982), mean annual baseline recharge computed in this study is 159.1 million gallons
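The daily bookkeeping in a water-budget model of this kind can be sketched as a simple soil-moisture bucket; the formulation, runoff fraction and parameter values below are illustrative assumptions, not the report's actual model:

```python
def daily_recharge(rain, runoff_frac, et, storage, capacity):
    """One day of a simple bucket water budget.

    rain and et in mm/day; storage and capacity in mm.
    Returns (recharge_mm, new_storage_mm).
    """
    runoff = rain * runoff_frac
    infiltration = rain - runoff
    storage = storage + infiltration - et
    recharge = max(storage - capacity, 0.0)  # overflow percolates below the root zone
    storage = min(max(storage, 0.0), capacity)
    return recharge, storage

# Five days over a limestone area (low runoff fraction): once the soil
# store fills, most of the remaining inflow becomes recharge.
storage = 90.0
total = 0.0
for rain in [40.0, 0.0, 25.0, 10.0, 0.0]:
    r, storage = daily_recharge(rain, runoff_frac=0.05, et=4.0,
                                storage=storage, capacity=100.0)
    total += r
```

This toy bucket omits the canopy evaporation, land-cover-specific evapotranspiration and impervious-area evaporation terms that distinguish the study's model from earlier Guam water budgets.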
Modelling developmental changes in the carbon and nitrogen budgets of larval brachyuran crabs
Anger, K.
1990-03-01
The uptake and partitioning of nutritional carbon (C) and nitrogen (N) were studied during the complete larval development of a brachyuran crab, Hyas araneus, reared under constant conditions in the laboratory. Biochemical and physiological data were published in a preceding paper, and complete budgets of C and N were constructed here from those data. Regression equations describing rates of feeding (F), growth (G), respiration (R) and ammonia excretion (U) as functions of time during individual larval moult cycles were inserted into a simulation model in order to analyse time-dependent (i.e. developmental) patterns of variation in these parameters as well as in bioenergetic efficiencies. Absolute daily feeding rates (F; per individual) as well as carbon- and nitrogen-specific rates (F/C, F/N) are in general maximal in early, and minimal in late, stages of individual larval moult cycles (postmoult and premoult, respectively). Early crab zoeae may ingest equivalents of up to ca 40% body C and 30% body N per day, respectively, whereas megalopa larvae usually eat less than 10%. Growth rates (G; G/C, G/N) also show decreasing tendencies, both during individual moult cycles and, on average, in subsequent instars. Conversion of the C and N data to lipid and protein, respectively, suggests that in all larval instars there is initially an increase in the lipid:protein ratio. Protein, however, remains clearly the predominant biochemical constituent of larval biomass. The absolute and specific values of respiration (R; R/C) and excretion (U; U/N) vary only slightly during the course of individual moult cycles. Thus, their significance relative to G increases within the C and N budgets, and net growth efficiency (K2) decreases concurrently. Gross growth and assimilation efficiencies (K1; A/F) are also, in general, maximal in early stages of the moult cycle (postmoult). Biochemical data suggest that lipid utilization efficiency is particularly high in early moult
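The bioenergetic efficiencies discussed can be sketched directly from the budget terms, using the conventional definitions (assimilation efficiency A/F, gross growth efficiency K1 = G/F, net growth efficiency K2 = G/A); the numbers are hypothetical:

```python
# Hypothetical carbon budget terms, in micrograms C per individual per day.
F = 12.0   # feeding (ingestion)
A = 9.0    # assimilation = ingestion minus egested material
R = 4.5    # respiration
G = A - R  # growth; excretory C losses are neglected in this sketch

assimilation_efficiency = A / F   # A/F
gross_growth_efficiency = G / F   # K1
net_growth_efficiency = G / A     # K2
```

With fixed A and G, a rise in R alone lowers G and hence both K1 and K2, which is the pattern the abstract describes over the moult cycle.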
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the w
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state...... such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet......, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures...
Volcanic aquifers of Hawai‘i—Hydrogeology, water budgets, and conceptual models
Izuka, Scot K.; Engott, John A.; Bassiouni, Maoya; Johnson, Adam G.; Miller, Lisa D.; Rotzoll, Kolja; Mair, Alan
2016-06-13
Hawai‘i’s aquifers have limited capacity to store fresh groundwater because each island is small and surrounded by saltwater. Saltwater also underlies much of the fresh groundwater. Fresh groundwater resources are, therefore, particularly vulnerable to human activity, short-term climate cycles, and long-term climate change. Availability of fresh groundwater for human use is constrained by the degree to which the impacts of withdrawal—such as lowering of the water table, saltwater intrusion, and reduction in the natural discharge to springs, streams, wetlands, and submarine seeps—are deemed acceptable. This report describes the hydrogeologic framework, groundwater budgets (inflows and outflows), conceptual models of groundwater occurrence and movement, and the factors limiting groundwater availability for the largest and most populated of the Hawaiian Islands—Kaua‘i, O‘ahu, Maui, and Hawai‘i Island.
Spatiotemporal Variability of the Urban Water Budget and Implications for Distributed Modeling
Bhaskar, A. S.; Welty, C.; Maxwell, R. M.
2011-12-01
In seeking to understand the feedbacks between urban development and water availability, we are in the process of coupling an integrated hydrologic model with an urban growth model, both for the Baltimore, Maryland, USA region. We are implementing ParFlow.CLM as the integrated hydrologic model (a subsurface-surface flow/land surface processes model) for the 13,000 sq km Baltimore metropolitan area. This work requires an understanding of the distribution of flows and making decisions on how best to model the short-circuiting of water and other phenomena unique to urban systems. In order to assess the attributes of the available data, we conducted a study of the urban water budget from 2000 to 2009 and across an urban-to-rural gradient of development. For 65 watersheds in the Baltimore metropolitan area we quantified both natural (precipitation, evapotranspiration and streamflow) and engineered or piped (wastewater infiltration and inflow, lawn irrigation, water supply pipe leakage and reservoir withdrawals) water budget components on a monthly basis. We used monthly PRISM grids for precipitation, the land surface model GLDAS-Noah for gridded evapotranspiration estimates, and streamflow from USGS gage records. For the piped components, we used Baltimore City's comprehensive wastewater monitoring program data, which include infiltration and inflow estimates for most of the city's sewer basins, as well as estimates of lawn irrigation from fine-scale land cover data and lawn watering estimates, and water supply pipe leakage based on system-wide values and the distribution of water supply pipes. We found that when solely considering natural components, urban watersheds generally appeared to have excess water, although the spatial variability was much higher for urban watersheds as compared to rural ones. This apparent excess water was more than accounted for by the most significant piped component, the export of groundwater and rainwater by cracks and improper connections to the
An approach for modeling sediment budgets in supply-limited rivers
Wright, Scott A.; Topping, David J.; Rubin, David M.; Melis, Theodore S.
2010-10-01
Reliable predictions of sediment transport and river morphology in response to variations in natural and human-induced drivers are necessary for river engineering and management. Because engineering and management applications may span a wide range of space and time scales, a broad spectrum of modeling approaches has been developed, ranging from suspended-sediment "rating curves" to complex three-dimensional morphodynamic models. Suspended sediment rating curves are an attractive approach for evaluating changes in multi-year sediment budgets resulting from changes in flow regimes because they are simple to implement, computationally efficient, and the empirical parameters can be estimated from quantities that are commonly measured in the field (i.e., suspended sediment concentration and water discharge). However, the standard rating curve approach assumes a unique suspended sediment concentration for a given water discharge. This assumption is not valid in rivers where sediment supply varies enough to cause changes in particle size or changes in areal coverage of sediment on the bed; both of these changes cause variations in suspended sediment concentration for a given water discharge. More complex numerical models of hydraulics and morphodynamics have been developed to address such physical changes of the bed. This additional complexity comes at a cost in terms of computations as well as the type and amount of data required for model setup, calibration, and testing. Moreover, application of the resulting sediment-transport models may require observations of bed-sediment boundary conditions that require extensive (and expensive) observations or, alternatively, require the use of an additional model (subject to its own errors) merely to predict the bed-sediment boundary conditions for use by the transport model. In this paper we present a hybrid approach that combines aspects of the rating curve method and the more complex morphodynamic models. Our primary objective
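The rating-curve baseline that the hybrid approach builds on fits C = a·Q^b by linear regression in log-log space. A minimal sketch with synthetic, noise-free data (the coefficients are illustrative):

```python
import math

def fit_rating_curve(Q, C):
    """Least-squares fit of log C = log a + b * log Q; returns (a, b)."""
    x = [math.log(q) for q in Q]
    y = [math.log(c) for c in C]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic discharge/concentration pairs generated from C = 0.02 * Q**1.5,
# so the fit should recover those coefficients.
Q = [50.0, 100.0, 200.0, 400.0]
C = [0.02 * q ** 1.5 for q in Q]
a, b = fit_rating_curve(Q, C)
```

The paper's central caveat is visible in the formulation: the fitted curve assigns one concentration per discharge, so it cannot represent supply-driven shifts in bed coverage or grain size; those require the state-dependent terms of the hybrid model.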
Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets
Mathur, Rohit
1997-01-01
This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.
Energy Technology Data Exchange (ETDEWEB)
Unger, Astrid [VELUX Deutschland GmbH (Germany)
2011-07-01
Within a Europe-wide experiment, Velux (Schwalmstadt, Federal Republic of Germany) is establishing six trend-setting concept houses in Denmark, Austria, England, France and Germany between 2009 and 2011 under the Model Home 2020 programme. The vision: buildings with an optimal energy design that simultaneously offer the highest quality of life. In Germany, the company has taken on the socially relevant task of retrofitting a settler's house into a light-active house. Nearly one half of the 39 million residential units in Germany are between 30 and 60 years old and must be retrofitted energetically. Enormous potential exists here; however, not everyone can afford a sophisticated premium modernization. The expert team has therefore calculated two additional modernization variants for small budgets in the case of the German concept house.
Operational budgeting using fuzzy goal programming
Directory of Open Access Journals (Sweden)
Saeed Mohammadi
2013-10-01
Having an efficient budget normally has different advantages, such as measuring the performance of various organizations, setting appropriate targets and promoting managers based on their achievements. However, any budget planning requires the prediction of different cost components. There are various methods for budget planning, such as incremental budgeting, program budgeting, zero-based budgeting and performance budgeting. In this paper, we present a fuzzy goal programming model to estimate an operational budget. The proposed model uses triangular fuzzy numbers as well as interval numbers to estimate budget expenses. The proposed study is implemented for a real-world case study in the province of Qom, Iran, and the results are analyzed.
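The triangular fuzzy numbers used in such a goal-programming model carry a piecewise-linear membership function. A minimal sketch; the (l, m, u) budget figures are illustrative:

```python
def tri_membership(x, l, m, u):
    """Membership of x in the triangular fuzzy number (l, m, u):
    rises linearly from 0 at l to 1 at the peak m, falls back to 0 at u."""
    if x <= l or x >= u:
        return 0.0
    if x <= m:
        return (x - l) / (m - l)
    return (u - x) / (u - m)

# A budgeted expense "about 100, certainly between 80 and 130":
mu_peak = tri_membership(100, 80, 100, 130)  # at the most plausible value
mu_low = tri_membership(90, 80, 100, 130)    # halfway up the left slope
```

In the goal-programming step, each fuzzy goal contributes such a membership value, and the solver maximizes the (typically minimum) satisfaction level across goals.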
Budget of tropospheric ozone during TOPSE from two chemical transport models
Emmons, L. K.; Hess, P.; Klonecki, A.; Tie, X.; Horowitz, L.; Lamarque, J.-F.; Kinnison, D.; Brasseur, G.; Atlas, E.; Browell, E.; Cantrell, C.; Eisele, F.; Mauldin, R. L.; Merrill, J.; Ridley, B.; Shetter, R.
2003-04-01
The tropospheric ozone budget during the Tropospheric Ozone Production about the Spring Equinox (TOPSE) campaign has been studied using two chemical transport models (CTMs): HANK and the Model of Ozone and Related chemical Tracers, version 2 (MOZART-2). The two models have similar chemical schemes but use different meteorological fields, with HANK using MM5 (Pennsylvania State University, National Center for Atmospheric Research Mesoscale Modeling System) and MOZART-2 driven by European Centre for Medium-Range Weather Forecasts (ECMWF) fields. Both models simulate ozone in good agreement with the observations but underestimate NOx. The models indicate that in the troposphere, averaged over the northern middle and high latitudes, chemical production of ozone drives the increase of ozone seen in the spring. Both ozone gross chemical production and loss increase greatly over the spring months. The in situ production is much larger than the net stratospheric input, and the deposition and horizontal fluxes are relatively small in comparison to chemical destruction. The net production depends sensitively on the concentrations of H2O, HO2 and NO, which differ slightly in the two models. Both models underestimate the chemical production calculated in a steady state model using TOPSE measurements, but the chemical loss rates agree well. Measures of the stratospheric influence on tropospheric ozone in relation to in situ ozone production are discussed. Two different estimates of the stratospheric fraction of O3 in the Northern Hemisphere troposphere indicate it decreases from 30-50% in February to 15-30% in June. A sensitivity study of the effect of a perturbation in the vertical flux on tropospheric ozone indicates the contribution from the stratosphere is approximately 15%.
Application of arrangement theory to unfolding models
Kamiya, Hidehiko; Tokushige, Norihide
2010-01-01
Arrangement theory plays an essential role in the study of the unfolding model used in many fields. This paper describes how arrangement theory can be usefully employed in solving the problems of counting (i) the number of admissible rankings in an unfolding model and (ii) the number of ranking patterns generated by unfolding models. The paper is mostly expository but also contains some new results such as simple upper and lower bounds for the number of ranking patterns in the unidimensional case.
Scientific Theories, Models and the Semantic Approach
Directory of Open Access Journals (Sweden)
Décio Krause
2007-12-01
According to the semantic view, a theory is characterized by a class of models. In this paper, we examine critically some of the assumptions that underlie this approach. First, we recall that models are models of something. Thus we cannot leave completely aside the axiomatization of the theories under consideration, nor can we ignore the metamathematics used to elaborate these models, for changes in the metamathematics often impose restrictions on the resulting models. Second, based on a parallel between van Fraassen's modal interpretation of quantum mechanics and Skolem's relativism regarding set-theoretic concepts, we introduce a distinction between relative and absolute concepts in the context of the models of a scientific theory, and we discuss the significance of that distinction. Finally, by focusing on contemporary particle physics, we raise the question: since there is no generally accepted unification of the parts of the standard model (namely, QED and QCD), we have no theory in the usual sense of the term. This poses a difficulty: if there is no theory, how can we speak of its models? What are the latter models of? We conclude by noting that it is unclear that the semantic view can be applied to contemporary physical theories.
Modeling long-term, large-scale sediment storage using a simple sediment budget approach
Naipal, Victoria; Reick, Christian; Van Oost, Kristof; Hoffmann, Thomas; Pongratz, Julia
2016-05-01
Currently, the anthropogenic perturbation of the biogeochemical cycles remains unquantified due to the poor representation of lateral fluxes of carbon and nutrients in Earth system models (ESMs). This lateral transport of carbon and nutrients between terrestrial ecosystems is strongly affected by accelerated soil erosion rates. However, the quantification of global soil erosion by rainfall and runoff, and the resulting redistribution is missing. This study aims at developing new tools and methods to estimate global soil erosion and redistribution by presenting and evaluating a new large-scale coarse-resolution sediment budget model that is compatible with ESMs. This model can simulate spatial patterns and long-term trends of soil redistribution in floodplains and on hillslopes, resulting from external forces such as climate and land use change. We applied the model to the Rhine catchment using climate and land cover data from the Max Planck Institute Earth System Model (MPI-ESM) for the last millennium (here AD 850-2005). Validation is done using observed Holocene sediment storage data and observed scaling between sediment storage and catchment area. We find that the model reproduces the spatial distribution of floodplain sediment storage and the scaling behavior for floodplains and hillslopes as found in observations. After analyzing the dependence of the scaling behavior on the main parameters of the model, we argue that the scaling is an emergent feature of the model and mainly dependent on the underlying topography. Furthermore, we find that land use change is the main contributor to the change in sediment storage in the Rhine catchment during the last millennium. Land use change also explains most of the temporal variability in sediment storage in floodplains and on hillslopes.
Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.
2011-11-01
This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this five-year project, present the context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability within the context of DEB theory for metabolic organisation, and (ii) to evaluate the inter-relationships between different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). AquaDEB phase I focussed on quantifying bio-energetic processes of various aquatic species (e.g. molluscs, fish, crustaceans, algae) and phase II on: (i) comparing energetic and physiological strategies among species through the DEB parameter values and identifying the factors responsible for any differences in bioenergetics and physiology; (ii) considering different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) scaling up the models for a few species from the individual level up to the level of evolutionary processes. Apart from the three special issues in the Journal of Sea Research, including the DEBIB collaboration (see vol. 65, issue 2), a theme issue on DEB theory appeared in the Philosophical Transactions of the Royal Society B (vol. 365, 2010); a large number of publications were produced; the third edition of the DEB book appeared (2010); open-source software was substantially expanded (over 1000 functions); a large open-source systematic collection of ecophysiological data and DEB parameters has been set up; and a series of DEB
Institute of Scientific and Technical Information of China (English)
LI Xiang; WANG Hui; ZHANG Zhanhai; WU Huiding
2008-01-01
The Surface Heat Budget of the Arctic Ocean (SHEBA) project has shown that the study of surface heat budget characteristics is crucial to understanding interface processes and environmental change in the polar region. An Arctic single-column model (ARCSCM) from the University of Colorado is used to simulate the Arctic surface radiation and energy budget during summertime. The simulation results are analyzed and compared with the SHEBA measurements. Sensitivity analyses are performed to test the microphysical and radiative parameterizations in this model. The results show that the ARCSCM model is able to simulate the surface radiation and energy budget in the Arctic during summertime, and that the different parameterizations have a significant influence on the results. The combination of the cloud microphysics and RRTM parameterizations can fairly well reproduce the surface solar shortwave radiation and downwelling longwave radiation fluxes, but this cloud microphysics parameterization scheme deviates notably in its simulation of the surface sensible and latent heat fluxes. Further improvement of the parameterization schemes applied to Arctic regions is necessary.
Tang, J.; Miller, P. A.; Persson, A.; Olefeldt, D.; Pilesjo, P.; Heliasz, M.; Jackowicz-Korczynski, M.; Yang, Z.; Smith, B.; Callaghan, T. V.; Christensen, T. R.
2015-05-01
A large amount of organic carbon is stored in high-latitude soils. A substantial proportion of this carbon stock is vulnerable and may decompose rapidly due to temperature increases that are already greater than the global average. It is therefore crucial to quantify and understand carbon exchange between the atmosphere and subarctic/arctic ecosystems. In this paper, we combine an Arctic-enabled version of the process-based dynamic ecosystem model, LPJ-GUESS (version LPJG-WHyMe-TFM) with comprehensive observations of terrestrial and aquatic carbon fluxes to simulate long-term carbon exchange in a subarctic catchment at 50 m resolution. Integrating the observed carbon fluxes from aquatic systems with the modeled terrestrial carbon fluxes across the whole catchment, we estimate that the area is a carbon sink at present and will become an even stronger carbon sink by 2080, which is mainly a result of a projected densification of birch forest and its encroachment into tundra heath. However, the magnitudes of the modeled sinks are very dependent on future atmospheric CO2 concentrations. Furthermore, comparisons of global warming potentials between two simulations with and without CO2 increase since 1960 reveal that the increased methane emission from the peatland could double the warming effects of the whole catchment by 2080 in the absence of CO2 fertilization of the vegetation. This is the first process-based model study of the temporal evolution of a catchment-level carbon budget at high spatial resolution, including both terrestrial and aquatic carbon. Though this study also highlights some limitations in modeling subarctic ecosystem responses to climate change, such as aquatic system flux dynamics, nutrient limitation, herbivory and other disturbances, and peatland expansion, our study provides one process-based approach to resolve the complexity of carbon cycling in subarctic ecosystems while simultaneously pointing out the key model developments for capturing
Evaluation of water and energy budgets in regional climate models applied over Europe
Energy Technology Data Exchange (ETDEWEB)
Hagemann, S.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Machenhauer, B.; Christensen, O.B. [Danish Meteorological Institute, Climate Research Division, Copenhagen Oe (Denmark); Jones, R. [Meteorological Office Hadley Centre, Bracknell (United Kingdom); Deque, M. [Meteo-France CNRM/GMGEC/EAC, Toulouse Cedex 01 (France); Vidale, P.L. [Climate Research ETH, Zuerich (Switzerland)
2004-10-01
This study presents a model intercomparison of four regional climate models (RCMs) and one variable-resolution atmospheric general circulation model (AGCM) applied over Europe, with special focus on the hydrological cycle and the surface energy budget. The models simulated the 15 years from 1979 to 1993 using quasi-observed boundary conditions derived from ECMWF re-analyses (ERA). The model intercomparison focuses on two large catchments representing two different climate conditions and covering two areas of major research interest within Europe. The first is the Danube catchment, which represents a continental climate dominated by advection from the surrounding land areas. It is used to analyse the common model error of a too dry and too warm simulation of the summertime climate of southeastern Europe. This summer warming and drying problem is seen in many RCMs, and to a lesser extent in GCMs. The second area is the Baltic Sea catchment, which represents a maritime climate dominated by advection from the ocean and from the Baltic Sea. This catchment is a research area of many studies within Europe and is also covered by the BALTEX program. The observed data used are monthly mean surface air temperature, precipitation and river discharge. For all models, these are used to estimate mean monthly biases of all components of the hydrological cycle over land. In addition, the mean monthly deviations of the surface energy fluxes from ERA data are computed. Atmospheric moisture fluxes from ERA are compared with those of one model to provide an independent estimate of the convergence bias derived from the observed data. These help to add weight to some of the inferred estimates and explain some of the discrepancies between them. An evaluation of these biases and deviations suggests possible sources of error in each of the models. For the Danube catchment, systematic errors in the dynamics cause the prominent summer drying problem for three of the RCMs, while for the fourth RCM this is
Directory of Open Access Journals (Sweden)
Irene Lenoir-Wijnkoop
Full Text Available Two recent meta-analyses by the York Health Economics Consortium (YHEC) and Cochrane demonstrated probiotic efficacy in reducing the duration and number of common respiratory tract infections (CRTI) and associated antibiotic prescriptions. A health-economic analysis was undertaken to estimate the public health and budget consequences of a generalized probiotic consumption in France. A virtual age- and gender-standardized population was generated using a Markov microsimulation model. CRTI risk factors incorporated into this model were age, active/passive smoking and living in a community setting. Incidence rates and resource utilization were based on the 2011-2012 flu season and retrieved from the French GPs Sentinelles network. Results of both meta-analyses were independently applied to the French population to estimate CRTI events, assuming a generalized probiotic use compared to no probiotics during winter months: -0.77 days/CRTI episode (YHEC scenario) or odds-ratio 0.58 for ≥1 CRTI episode (Cochrane scenario) with vs. without probiotics. Economic perspectives were National Health System (NHS), society and family. Outcomes included cost savings related to the reduced numbers of CRTI episodes, days of illness, number of antibiotic courses, sick leave days, medical and indirect costs. For France, generalized probiotic use would save 2.4 million CRTI-days, 291,000 antibiotic courses and 581,000 sick leave days, based on YHEC data. Applying the Cochrane data, reductions were 6.6 million CRTI days, 473,000 antibiotic courses and 1.5 million sick days. From the NHS perspective, probiotics' economic impact was about €14.6 million saved according to YHEC and €37.7 million according to Cochrane. Higher savings were observed in children, active smokers and people with more frequent human contacts. Public health and budget impact of probiotics are substantial, whether they reduce CRTI episode frequency or duration. Notably, the 2011-12 winter CRTI
The Nomad Model: Theory, Developments and Applications
Campanella, M.; Hoogendoorn, S.P.; Daamen, W.
2014-01-01
This paper presents details of the developments of the Nomad model since it was introduced more than 12 years ago. The model is derived from a normative theory of pedestrian behavior, making it unique among microscopic models. Nomad has been successfully applied in several cases indicating that it ful
Integrable Models, SUSY Gauge Theories, and String Theory
Nam, S
1996-01-01
We consider the close relation between duality in N=2 SUSY gauge theories and integrable models. Various integrable models ranging from Toda lattices, Calogero models, spinning tops, and spin chains are related to the quantum moduli space of vacua of N=2 SUSY gauge theories. In particular, SU(3) gauge theories with two flavors of massless quarks in the fundamental representation can be related to the spectral curve of the Goryachev-Chaplygin top, which is a Nahm's equation in disguise. This can be generalized to the cases with massive quarks, and N_f = 0,1,2, where a system with seven dimensional phase space has the relevant hyperelliptic curve appear in the Painlevé test. To understand the stringy origin of the integrability of these theories we obtain the exact nonperturbative point particle limit of type II string theory compactified on a Calabi-Yau manifold, which gives the hyperelliptic curve of SU(2) QCD with N_f = 1 hypermultiplet.
Lattice Gauge Theories and Spin Models
Mathur, Manu
2016-01-01
The Wegner $Z_2$ gauge theory-$Z_2$ Ising spin model duality in $(2+1)$ dimensions is revisited and derived through a series of canonical transformations. These $Z_2$ results are directly generalized to SU(N) lattice gauge theory in $(2+1)$ dimensions to obtain a dual SU(N) spin model in terms of the SU(N) magnetic fields and electric scalar potentials. The gauge-spin duality naturally leads to a new gauge invariant disorder operator for SU(N) lattice gauge theory. A variational ground state of the dual SU(2) spin model with only nearest neighbour interactions is constructed to analyze SU(2) lattice gauge theory.
Hueni, A.
2015-12-01
ESA's Airborne Imaging Spectrometer APEX (Airborne Prism Experiment) was developed under the PRODEX (PROgramme de Développement d'EXpériences scientifiques) program by a Swiss-Belgian consortium and entered its operational phase at the end of 2010 (Schaepman et al., 2015). Work on the sensor model has been carried out extensively within the framework of European Metrology Research Program as part of the Metrology for Earth Observation and Climate (MetEOC and MetEOC2). The focus has been to improve laboratory calibration procedures in order to reduce uncertainties, to establish a laboratory uncertainty budget and to upgrade the sensor model to compensate for sensor specific biases. The updated sensor model relies largely on data collected during dedicated characterisation experiments in the APEX calibration home base but includes airborne data as well where the simulation of environmental conditions in the given laboratory setup was not feasible. The additions to the model deal with artefacts caused by environmental changes and electronic features, namely the impact of ambient air pressure changes on the radiometry in combination with dichroic coatings, influences of external air temperatures and consequently instrument baffle temperatures on the radiometry, and electronic anomalies causing radiometric errors in the four shortwave infrared detector readout blocks. Many of these resolved issues might be expected to be present in other imaging spectrometers to some degree or in some variation. Consequently, the work clearly shows the difficulties of extending a laboratory-based uncertainty to data collected under in-flight conditions. The results are hence not only of interest to the calibration scientist but also to the spectroscopy end user, in particular when commercial sensor systems are used for data collection and relevant sensor characteristic information tends to be sparse. Schaepman, et al, 2015. Advanced radiometry measurements and Earth science
Griffith, S. M.; Hansen, R. F.; Dusanter, S.; Michoud, V.; Gilman, J. B.; Kuster, W. C.; Veres, P. R.; Graus, M.; Gouw, J. A.; Roberts, J.; Young, C.; Washenfelder, R.; Brown, S. S.; Thalman, R.; Waxman, E.; Volkamer, R.; Tsai, C.; Stutz, J.; Flynn, J. H.; Grossberg, N.; Lefer, B.; Alvarez, S. L.; Rappenglueck, B.; Mielke, L. H.; Osthoff, H. D.; Stevens, P. S.
2016-04-01
Measurements of hydroxyl (OH) and hydroperoxy (HO2*) radical concentrations were made at the Pasadena ground site during the CalNex-LA 2010 campaign using the laser-induced fluorescence-fluorescence assay by gas expansion technique. The measured concentrations of OH and HO2* exhibited a distinct weekend effect, with higher radical concentrations observed on the weekends corresponding to lower levels of nitrogen oxides (NOx). The radical measurements were compared to results from a zero-dimensional model using the Regional Atmospheric Chemical Mechanism-2 constrained by NOx and other measured trace gases. The chemical model overpredicted measured OH concentrations during the weekends by a factor of approximately 1.4 ± 0.3 (1σ), but the agreement was better during the weekdays (ratio of 1.0 ± 0.2). The model underpredicted HO2* concentrations by a factor of 1.3 ± 0.2 on the weekends, while measured weekday concentrations were underpredicted by a factor of 3.0 ± 0.5. However, increasing the modeled OH reactivity to match the measured total OH reactivity improved the overall agreement for both OH and HO2* on all days. A radical budget analysis suggests that photolysis of carbonyls and formaldehyde together accounted for approximately 40% of radical initiation, with photolysis of nitrous acid accounting for 30% at the measurement height and ozone photolysis contributing less than 20%. An analysis of ozone production sensitivity reveals that during the week, ozone production was limited by volatile organic compounds throughout the day, but was NOx-limited during weekend afternoons.
Gauge theories and integrable lattice models
Witten, Edward
1989-08-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question — previously considered in both the knot theory and statistical mechanics — are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be presented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory.
Modeling Techniques: Theory and Practice
Odd A. Asbjørnsen
1985-01-01
A survey is given of some crucial concepts in chemical process modeling. Those are the concepts of physical unit invariance, of reaction invariance and stoichiometry, the chromatographic effect in heterogeneous systems, the conservation and balance principles and the fundamental structures of cause and effect relationships. As an example, it is shown how the concept of reaction invariance may simplify the homogeneous reactor modeling to a large extent by an orthogonal decomposition of the pro...
Social Security Administration — DCS Budget Tracking System database contains budget information for the Information Technology budget and the 'Other Objects' budget. This data allows for monitoring...
Halyo, Nesim; Direskeneli, Haldun; Barkstrom, Bruce R.
1991-01-01
Satellite measurements are subject to a wide range of uncertainties due to their temporal, spatial, and directional sampling characteristics. An information-theory approach is suggested to examine the nonuniform temporal sampling of ERB measurements. The information (i.e., its entropy or uncertainty) before and after the measurements is determined, and information gain (IG) is defined as a reduction in the uncertainties involved. A stochastic model for the diurnal outgoing flux variations that affect the ERB is developed. Using Gaussian distributions for the a priori and measured radiant exitance fields, the IG is obtained by computing the a posteriori covariance. The IG for the monthly outgoing flux measurements is examined for different orbital parameters and orbital tracks, using the Earth Observing System orbital parameters as specific examples. Variations in IG due to changes in the orbit's inclination angle and the initial ascending node local time are investigated.
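The entropy-reduction idea described above can be sketched in a few lines. This is a toy one-dimensional Gaussian case only, not the paper's full covariance treatment of the radiant exitance field; all numbers are illustrative:

```python
import math

def gaussian_entropy(var):
    """Differential entropy (in nats) of a 1-D Gaussian with variance `var`."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

def information_gain(prior_var, post_var):
    """Information gain = entropy before measurement minus entropy after."""
    return gaussian_entropy(prior_var) - gaussian_entropy(post_var)

# A measurement that halves the variance yields 0.5 * ln(2) ~ 0.347 nats,
# regardless of the absolute variance scale.
ig = information_gain(4.0, 2.0)
```

The key property visible here is that the gain depends only on the ratio of prior to posterior variance, which is why sampling geometry (which sets the achievable posterior covariance) drives the gain in the paper's orbital comparisons.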
This is a timetable for congressional action under the Balanced Budget and Emergency Deficit Control Act of 1985 (Gramm-Rudman-Hollings). These deadlines apply to fiscal years (FY) 1987-1991. The Congress missed a number of these deadlines last year. The deficit reduction measures in Gramm-Rudman-Hollings would lead to a balanced budget in 1991.
DEFF Research Database (Denmark)
Jeppesen, Palle
1996-01-01
The lecture note is aimed at introducing system budgets for optical communication systems. It treats optical fiber communication systems (six generations), system design, bandwidth effects, other system impairments and optical amplifiers.
Grey-theory based intrusion detection model
Institute of Scientific and Technical Information of China (English)
Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi
2006-01-01
To solve the problem that current intrusion detection models need large-scale data for model formulation in real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of fewer requirements on original data scale, less limitation on the distribution pattern, and a simpler modeling algorithm. With these merits, GTIDS constructs its model from a partial time sequence for rapid detection of intrusive acts in a secure system. In this detection model, the rates of false drops and false retrievals are effectively reduced through repeated modeling and repeated detection of target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through simulated experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
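The grey modeling referred to above is commonly the GM(1,1) model, which fits a short series via accumulation and a two-parameter least-squares step. The sketch below is the textbook GM(1,1) procedure, not necessarily the exact formulation used in GTIDS:

```python
import math

def gm11_predict(x0, steps=1):
    """Fit a GM(1,1) grey model to a short positive series and forecast ahead.

    Illustrative sketch of the standard GM(1,1) procedure; the paper's
    variant may differ in detail.
    """
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # 1-AGO accumulation
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]  # background values
    # Least-squares fit of the grey equation x0(k) + a*z(k) = b
    m = n - 1
    sz = sum(z)
    szz = sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(zi * yi for zi, yi in zip(z, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    # Time-response function for x1, then inverse accumulation for forecasts
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]
```

Because only a handful of recent observations are needed, such a model can be refitted continuously over a sliding window, which is the property the abstract credits for rapid detection.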
Modeling Techniques: Theory and Practice
Directory of Open Access Journals (Sweden)
Odd A. Asbjørnsen
1985-07-01
Full Text Available A survey is given of some crucial concepts in chemical process modeling. These are the concepts of physical unit invariance, of reaction invariance and stoichiometry, the chromatographic effect in heterogeneous systems, the conservation and balance principles, and the fundamental structures of cause-and-effect relationships. As an example, it is shown how the concept of reaction invariance may simplify homogeneous reactor modeling to a large extent by an orthogonal decomposition of the process variables. This allows residence time distribution function parameters to be estimated with the reaction in situ, but without any correlation between the estimated residence time distribution parameters and the estimated reaction kinetic parameters. A general warning is given against choosing the wrong mathematical structure for models.
Graphical Model Theory for Wireless Sensor Networks
Energy Technology Data Exchange (ETDEWEB)
Davis, William B.
2002-12-08
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
企业全面预算管理成熟度模型构建研究%Research on the Construction of the Enterprise Comprehensive Budget Maturity Model
Institute of Scientific and Technical Information of China (English)
刘凌冰; 韩向东
2015-01-01
Taking internal control theory (its five elements) as the theoretical foundation, this paper systematically analyzes the elements and influencing factors of the enterprise comprehensive budget management system. The construction and grading approach draws on the Capability Maturity Model Integration (CMMI), and the resulting Comprehensive Budget Maturity Model (CBMM) is developed in combination with the budget management practice of Chinese enterprises. Built on the budget management experience of Chinese and foreign enterprises and the concept of comprehensive budgeting, CBMM guides enterprises in applying budgeting techniques and implementation processes and in designing a compatible management system. CBMM achieves quantitative measurement of the level of enterprise budget management and can serve as a budget management benchmark for Chinese enterprises, helping them improve the effectiveness, efficiency, and processes of budget management.
F-theory and linear sigma models
Bershadsky, M; Greene, Brian R; Johansen, A; Lazaroiu, C I
1998-01-01
We present an explicit method for translating between the linear sigma model and the spectral cover description of SU(r) stable bundles over an elliptically fibered Calabi-Yau manifold. We use this to investigate the 4-dimensional duality between (0,2) heterotic and F-theory compactifications. We indirectly find that much interesting heterotic information must be contained in the `spectral bundle' and in its dual description as a gauge theory on multiple F-theory 7-branes. A by-product of these efforts is a method for analyzing semistability and the splitting type of vector bundles over an elliptic curve given as the sheaf cohomology of a monad.
Integrable Lattice Models From Gauge Theory
Witten, Edward
2016-01-01
These notes provide an introduction to recent work by Kevin Costello in which integrable lattice models of classical statistical mechanics in two dimensions are understood in terms of quantum gauge theory in four dimensions. This construction will be compared to the more familiar relationship between quantum knot invariants in three dimensions and Chern-Simons gauge theory. (Based on a Whittaker Colloquium at the University of Edinburgh and a lecture at Strings 2016 in Beijing.)
Spreading Models in Banach Space Theory
Argyros, S A; Tyros, K
2010-01-01
We extend the classical Brunel-Sucheston definition of the spreading model by introducing the $\mathcal{F}$-sequences $(x_s)_{s\in\mathcal{F}}$ in a Banach space and the plegma families in $\mathcal{F}$ where $\mathcal{F}$ is a regular thin family. The new concept yields a transfinite increasing hierarchy of classes of 1-subsymmetric sequences. We explore the corresponding theory and we present examples establishing this hierarchy and illustrating the limitations of the theory.
Lu, Fei; Wang, Xiao-Ke; Han, Bing; Ouyang, Zhi-Yun; Zheng, Hua
2010-05-01
Straw returning is considered to be one of the most promising carbon sequestration measures in China's cropland. A compound model, namely "Straw Returning and Burning Model-Expansion" (SRBME), was built to estimate the net mitigation potential, economic benefits, and air pollutant reduction of straw returning. Three scenarios, that is, baseline, "full popularization of straw returning (FP)," and "full popularization of straw returning and precision fertilization (FP + P)," were set to reflect popularization of straw returning. The results of the SRBME indicated that (1) compared with the soil carbon sequestration of 13.37 Tg/yr, the net mitigation potentials, which were 6.328 Tg/yr for the FP scenario and 9.179 Tg/yr for the FP + P scenario, had different trends when the full budget of the greenhouse gases was considered; (2) when the feasibility in connection with greenhouse gas (GHG) mitigation, economic benefits, and environmental benefits was taken into consideration, straw returning was feasible in 15 provinces in the FP scenario, with a total net mitigation potential of 7.192 TgCe/yr and the total benefits of CNY 1.473 billion (USD 216.6 million); (3) in the FP + P scenario, with the implementation of precision fertilization, straw returning was feasible in 26 provinces with a total net mitigation potential of 10.39 TgCe/yr and the total benefits of CNY 5.466 billion (USD 803.8 million); (4) any extent of change in the treatment of straw from being burnt to being returned would contribute to air pollution reduction; (5) some countermeasures, such as CH4 reduction in rice paddies, precision fertilization, financial support, education and propaganda, would promote the feasibility of straw returning as a mitigation measure.
建筑工程造价预算控制理论与方法研究%Research on the Theory and Method of Construction Cost Budget Control
Institute of Scientific and Technical Information of China (English)
黄凯
2016-01-01
Project cost budget control plays an important role in improving a project's economic efficiency and the working efficiency of budget staff. This paper comprehensively studies how to make project budget control reasonable and standardized. The current state of construction project cost budget control and its common problems are presented from four perspectives, and measures to strengthen cost budget control are proposed regarding methods, personnel, systems, and other aspects. Finally, the paper examines the theory and methods of construction cost budget control, covering BIM-based budget control, budget control under whole-life-cycle costing, and subjective-probability budgeting, aiming to provide guidance for project cost budget control work.
Security Theorems via Model Theory
Directory of Open Access Journals (Sweden)
Joshua Guttman
2009-11-01
Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications, for all xs. (phi implies for some ys. psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces "for all xs. (phi implies for some ys. psi)" iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.
Directory of Open Access Journals (Sweden)
Tânia Regina Sordi Relvas
2011-09-01
Full Text Available Budgeting: substantive analysis using grounded theory. Considering the fact that studies into budgeting basically use a reductionist approach, this paper proposes a comprehensive substantive theory based on empirical data to be used in budget analysis. This approach takes into consideration the budget's constituent elements and their interdependence by applying the inductive approach grounded in empirical data (grounded theory) under a qualitative paradigm. The focus was an in-depth two-year study of a large Brazilian financial institution involving several management levels. The main contribution of the study is a framework that treats all elements of the budget process in a comprehensive and coherent fashion, otherwise impossible using a reductionist approach. As products of the substantive theory, five propositions were developed to be applied in organizations.
Directory of Open Access Journals (Sweden)
A. Ito
2012-02-01
Full Text Available We assessed the global terrestrial budget of methane (CH_{4}) by using a process-based biogeochemical model (VISIT) and inventory data for components of the budget that were not included in the model. Emissions from wetlands, paddy fields, biomass burning, and plants, as well as oxidative consumption by upland soils, were simulated by the model. Emissions from ruminant livestock and termites were evaluated by using an inventory approach. These CH_{4} flows were estimated for each of the model's 0.5° × 0.5° grid cells from 1901 to 2009, while accounting for atmospheric composition, meteorological factors, and land-use changes. Estimation uncertainties were examined through ensemble simulations using different parameterization schemes and input data (e.g., different wetland maps and emission factors). From 1996 to 2005, the average global terrestrial CH_{4} budget was estimated on the basis of 1152 simulations, and terrestrial ecosystems were found to be a net source of 308.3 ± 20.7 Tg CH_{4} yr^{−1}. Wetland and livestock ruminant emissions were the primary sources. The results of our simulations indicate that sources and sinks are distributed highly heterogeneously over the Earth's land surface. Seasonal and interannual variability in the terrestrial budget was also assessed. The trend of increasing net emission from terrestrial sources and its relationship with temperature variability imply that terrestrial CH_{4} feedbacks will play an increasingly important role as a result of future climatic change.
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts: arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part, the vacation process, which is governed by a vacation policy characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule; and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
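A central result in this literature is the decomposition property: for an M/M/1 queue with multiple server vacations, the mean wait splits into the ordinary M/M/1 delay plus the mean residual vacation time. The sketch below assumes Poisson arrivals, exponential service, and a multiple-vacation policy, and uses that standard decomposition (not a formula taken verbatim from the book):

```python
def mm1_vacation_wait(lam, mu, ev, ev2):
    """Mean queueing delay in an M/M/1 queue with multiple server vacations.

    lam: arrival rate; mu: service rate; ev, ev2: first two moments of the
    vacation duration. Uses the classical decomposition: ordinary M/M/1
    delay plus the mean residual vacation E[V^2] / (2 E[V]).
    """
    assert lam < mu, "queue must be stable (lambda < mu)"
    wq_mm1 = lam / (mu * (mu - lam))        # waiting time in a plain M/M/1
    residual_vacation = ev2 / (2.0 * ev)    # extra delay caused by vacations
    return wq_mm1 + residual_vacation
```

For exponentially distributed vacations with mean v, E[V^2] = 2 v^2, so the vacation penalty is exactly v; heavier-tailed vacation distributions add more than their mean, which is the practical message of the decomposition.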
Some Remarks on the Model Theory of Epistemic Plausibility Models
Demey, Lorenz
2010-01-01
Classical logics of knowledge and belief are usually interpreted on Kripke models, for which a mathematically well-developed model theory is available. However, such models are inadequate to capture dynamic phenomena. Therefore, epistemic plausibility models have been introduced. Because these are much richer structures than Kripke models, they do not straightforwardly inherit the model-theoretical results of modal logic. Therefore, while epistemic plausibility structures are well-suited for modeling purposes, an extensive investigation of their model theory has been lacking so far. The aim of the present paper is to fill exactly this gap, by initiating a systematic exploration of the model theory of epistemic plausibility models. Like in 'ordinary' modal logic, the focus will be on the notion of bisimulation. We define various notions of bisimulations (parametrized by a language L) and show that L-bisimilarity implies L-equivalence. We prove a Hennessy-Milner type result, and also two undefinability results. ...
Energy Technology Data Exchange (ETDEWEB)
Kheshgi, Haroon S. [Corporate Research Laboratories, Exxon Research and Engineering Company, Annandale, New Jersey (United States); Jain, Atul K. [Department of Atmospheric Sciences, University of Illinois, Urbana (United States); Wuebbles, Donald J. [Department of Atmospheric Sciences, University of Illinois, Urbana (United States)
1999-12-27
A global carbon cycle model is used to reconstruct the carbon budget, balancing emissions from fossil fuel and land use with carbon uptake by the oceans and the terrestrial biosphere. We apply Bayesian statistics to estimate uncertainty of carbon uptake by the oceans and the terrestrial biosphere based on carbon dioxide and carbon isotope records, and prior information on model parameter probability distributions. This results in a quantitative reconstruction of the past carbon budget and its uncertainty derived from an explicit choice of model, data-based constraints, and prior distribution of parameters. Our estimated ocean sink for the 1980s is 17{+-}7 Gt C (90% confidence interval) and is comparable to the estimate of 20{+-}8 Gt C given in the recent Intergovernmental Panel on Climate Change assessment [Schimel et al., 1996]. Constraint choice is tested to determine which records have the most influence over estimates of the past carbon budget; records individually (e.g., bomb-radiocarbon inventory) have little effect since there are other records which form similar constraints. (c) 1999 American Geophysical Union.
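The core Bayesian step described above, combining a prior on an uptake parameter with data-based constraints, can be illustrated with a one-dimensional conjugate-Gaussian update. This is a toy stand-in for the paper's full model; all numbers are invented for the example:

```python
import math

def gaussian_update(prior_mean, prior_sd, obs, obs_sd):
    """Posterior mean and sd when a Gaussian prior meets one Gaussian datum.

    Precision-weighted average: an illustrative one-parameter analogue of
    constraining a carbon-sink parameter with an observational record.
    """
    w_prior = 1.0 / prior_sd ** 2     # prior precision
    w_obs = 1.0 / obs_sd ** 2         # observation precision
    post_var = 1.0 / (w_prior + w_obs)
    post_mean = post_var * (w_prior * prior_mean + w_obs * obs)
    return post_mean, math.sqrt(post_var)
```

Adding further independent constraints repeats the same update; a record whose implied constraint duplicates an existing one barely moves the posterior, which mirrors the abstract's finding that individual records (e.g., bomb radiocarbon) have little marginal effect.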
Vanos, J. K.; Warland, J. S.; Gillespie, T. J.; Kenny, N. A.
2012-11-01
The purpose of this paper is to implement current and novel research techniques in human energy budget estimations to give more accurate and efficient application of models by a variety of users. Using the COMFA model, the conditioning level of an individual is incorporated into overall energy budget predictions, giving more realistic estimations of the metabolism experienced at various fitness levels. Through the use of VO2 reserve estimates, errors are found when an elite athlete is modelled as an unconditioned or a conditioned individual, giving budgets underpredicted significantly by -173 and -123 W m-2, respectively. Such underprediction can result in critical errors regarding heat stress, particularly in highly motivated individuals; thus this revision is critical for athletic individuals. A further improvement in the COMFA model involves improved adaptation of clothing insulation (I_cl), as well as clothing non-uniformity, with changing air temperature (T_a) and metabolic activity (M_act). Equivalent T_a values (for I_cl estimation) are calculated in order to lower the I_cl value with increasing M_act at equal T_a. Furthermore, threshold T_a values are calculated to predict the point at which an individual will change from a uniform I_cl to a segmented I_cl (full ensemble to shorts and a T-shirt). Lastly, improved relative velocity (v_r) estimates were found with a refined equation accounting for the angle between the wind and body movement. Differences between the original and improved v_r equations increased with higher wind and activity speeds, and as the wind-to-body angle moved away from 90°. Under moderate microclimate conditions, and wind from behind a person, the convective heat loss and skin temperature estimates were 47 W m-2 and 1.7°C higher when using the improved v_r equation. These model revisions improve the applicability and usability of the COMFA energy budget model for subjects performing physical activity in outdoor environments.
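A generic way to account for the wind-to-body angle in the relative velocity is a vector combination of wind speed and body speed (law of cosines). This is a plausible sketch of the idea, not necessarily the paper's exact revised equation, whose form and sign conventions are given in the paper itself:

```python
import math

def relative_velocity(v_wind, v_body, angle_deg):
    """Relative air speed over a moving person (vector-combination sketch).

    angle_deg is the angle between the wind direction and the direction of
    body movement: 0° = tailwind moving with the person, 180° = headwind.
    """
    theta = math.radians(angle_deg)
    return math.sqrt(v_wind ** 2 + v_body ** 2
                     - 2.0 * v_wind * v_body * math.cos(theta))
```

The limiting cases behave as expected: a pure tailwind gives |v_wind - v_body|, a pure headwind gives v_wind + v_body, and the difference from a simple scalar sum grows as the angle moves away from 90°, consistent with the trend reported in the abstract.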
Supersymmetric Microscopic Theory of the Standard Model
Ter-Kazarian, G T
2000-01-01
We promote the microscopic theory of the standard model (MSM, hep-ph/0007077) into a supersymmetric framework in order to solve its technical problems of vacuum zero-point energy and hierarchy, and attempt, further, to develop a realistic, viable minimal SUSY extension of it. Beyond what the MSM already provides (a natural unification of geometry and field theory, clarification of the physical conditions in which geometry and particles come into being, and, in a microscopic sense, insight into key problems of particle phenomenology and answers to some of its nagging questions), the present approach also leads to quite a new realization of SUSY, yielding a physically realistic particle spectrum. It stems from the special subquark algebra, from which the nilpotent supercharge operators are derived. The resulting theory makes plausible the following testable implications for the current experiments at LEP2, at the Tevatron and at the LHC, drastically different from those of the conventional MSSM models: 1. All t...
PRINCIPLES OF FORMATION OF INNOVATIVE MODEL OF EFFICIENCY OF BUDGET FUNDS USE
Directory of Open Access Journals (Sweden)
Elena I. Chibisova
2013-01-01
The article describes an innovative approach to the use of performance indicators to improve the quality of control over the efficiency of budget-fund use, and proposes creating an internal budgetary administrative system of control over the effectiveness and appropriateness of budgetary-fund use.
Aligning Grammatical Theories and Language Processing Models
Lewis, Shevaun; Phillips, Colin
2015-01-01
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…
Engaging Theories and Models to Inform Practice
Kraus, Amanda
2012-01-01
Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…
Budget Management Model Based on Zero-based Budget
Institute of Scientific and Technical Information of China (English)
王海玲
2013-01-01
Starting from China's national budget management system, this paper studies and analyzes, from the perspective of budget management in higher vocational colleges, the problems that exist in their current budget management, and puts forward countermeasures and suggestions adapted to the new public finance budget system. The study has important practical significance for strengthening budget management and improving the level of financial management in higher vocational colleges.
Lattice gauge theories and spin models
Mathur, Manu; Sreeraj, T. P.
2016-10-01
The Wegner Z2 gauge theory-Z2 Ising spin model duality in (2+1) dimensions is revisited and derived through a series of canonical transformations. The Kramers-Wannier duality is similarly obtained. The Wegner Z2 gauge-spin duality is directly generalized to SU(N) lattice gauge theory in (2+1) dimensions to obtain the SU(N) spin model in terms of the SU(N) magnetic fields and their conjugate SU(N) electric scalar potentials. The exact and complete solutions of the Z2, U(1), SU(N) Gauss law constraints in terms of the corresponding spin or dual potential operators are given. The gauge-spin duality naturally leads to a new gauge-invariant magnetic disorder operator for SU(N) lattice gauge theory which produces a magnetic vortex on the plaquette. A variational ground state of the SU(2) spin model with nearest-neighbour interactions is constructed to analyze SU(2) gauge theory.
Michot, Béatrice; Meselhe, Ehab A.; Rivera-Monroy, Victor H.; Coronado-Molina, Carlos; Twilley, Robert R.
2011-07-01
Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The EMER ecological function is critical since it mediates freshwater and nutrient inputs and controls the water quality in Eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings as a result of upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Program (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine-scale (~1 km²) hydrodynamic models of this system that can be utilized as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area, based on long-term hydrologic data (1999-2007) and supplemented by hydrodynamic modeling using a MIKE FLOOD (DHI, http://dhigroup.com/) model to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (average Taylor River wet vs. dry season outflow was 6 to 1 during 1999-2006) with a pronounced interannual variability of flow. The water budget shows a net dominance of through flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area during approximately 80 days per year on average. The groundwater field measurements indicate a mostly upwards-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest a high importance of groundwater contribution to the water salinity in the EMER. The model performance is satisfactory.
Directory of Open Access Journals (Sweden)
X. M. Liu
2010-09-01
The aim of this paper is to study the impacts of overshooting convection at a local scale on the water distribution in the tropical UTLS. Overshooting convection is assumed to be one of the processes controlling the entry of water vapour mixing ratio into the stratosphere, by injecting ice crystals above the tropopause which later sublimate and hydrate the lower stratosphere. For this purpose, we quantify the individual impact of two cases of overshooting convection in Africa observed during SCOUT-AMMA: the case of 4 August 2006 over Southern Chad, which is likely to have influenced the water vapour measurements by micro-SDLA and FLASH-B from Niamey on 5 August, and the case of a mesoscale convective system over Aïr on 5 August 2006. We make use of high-resolution (down to 1 km horizontally) nested grid simulations with the three-dimensional regional atmospheric model BRAMS (Brazilian Regional Atmospheric Modelling System). In both cases, BRAMS succeeds in simulating the main features of the convective activity, as well as overshooting convection, though the exact position and time of the overshoots indicated by MSG brightness temperature differences are not fully reproduced (typically a 1° displacement in latitude for both cases, and a shift of several hours for the Aïr case on 5 August 2006). Total water budgets associated with these two events show a significant injection of ice particles above the tropopause, with maximum values of about 3.7 ton s−1 for the Chad case (4 August) and 1.4 ton s−1 for the Aïr case (5 August), and a total upward cross-tropopause transport of about 3300 ton h−1 for the Chad case and 2400 ton h−1 for the Aïr case in the third domain of simulation. The order of magnitude of these modelled fluxes is lower but comparable with similar studies in other tropical areas based on
TRADITIONAL BUDGETING VERSUS BEYOND BUDGETING: A LITERATURE REVIEW
Directory of Open Access Journals (Sweden)
CARDOS ILDIKO REKA
2014-07-01
Budgets have been an important part of the business environment since the 1920s and are considered to be key drivers and evaluators of managerial performance, and key elements for planning and control. Budgets are the most powerful tool for management control; they can play an essential role in an organization's power politics because they can increase the power and authority of top management and limit the autonomy of lower-level managers. Besides its advantages, traditional budgeting also presents disadvantages. In recent years criticism of traditional budgeting has increased. The basis of this criticism is that traditional budgeting is a relic of the past: it prevents reactions to changes in the market, it cannot keep up with the changes and requirements of today's business world, and it is not useful for business management. In order to address this criticism, researchers and practitioners have developed more systematic and alternative concepts of budgeting that better suit the needs of the modern business environment. Beyond budgeting, better budgeting, rolling forecasts and activity-based budgeting are the main alternatives developed in recent years. Of these alternatives, this article examines only beyond budgeting. Our paper discusses how budgeting has evolved into its current state, before examining why this universal technique has come under such heavy criticism of late. The paper is a literature analysis; it contributes to the existing managerial accounting literature and is structured as follows. In the first part the background and evolution of budgeting are presented, followed by an analysis of related theories in traditional budgeting, emphasizing both the advantages and disadvantages of traditional budgeting. The second part of the paper continues with a discussion of alternative budgeting methods, highlighting the pros and cons of the alternatives, especially beyond budgeting. In the third part conducted
Microscopic Theory of the Standard Model
Ter-Kazarian, G T
2000-01-01
The operator manifold formalism (part I) enables the unification of geometry and field theory, and yields the quantization of geometry. This is the mathematical framework for our physical outlook that geometry and fields, with the internal symmetries and all interactions, as well as the four major principles of relativity (special and general), quantum, gauge and colour confinement, are derivative, and come into being simultaneously in the stable system of the underlying "primordial structures". In part II we attempt to develop further the microscopic approach to the Standard Model of particle physics, which enables an insight into the key problems of particle phenomenology. We suggest a microscopic theory of the unified electroweak interactions. The Higgs bosons arise in analogy with the Cooper pairs in superconductivity. Besides a microscopic interpretation of all physical parameters, the resulting theory also makes plausible the following testable implications for the current experiments: 1...
F-theory and linear sigma models
Energy Technology Data Exchange (ETDEWEB)
Bershadsky, M.; Johansen, A. [Harvard Univ., Cambridge, MA (United States). Lyman Lab. of Physics; Chiang, T.M. [Newman Laboratory of Nuclear Studies, Cornell University, Ithaca, NY 14850 (United States); Greene, B.R.; Lazaroiu, C.I. [Departments of Physics and Mathematics, Columbia University, New York, NY 10027 (United States)
1998-09-07
We present an explicit method for translating between the linear sigma model and the spectral cover description of SU(r) stable bundles over an elliptically fibered Calabi-Yau manifold. We use this to investigate the four-dimensional duality between (0,2) heterotic and F-theory compactifications. We indirectly find that much interesting heterotic information must be contained in the 'spectral bundle' and in its dual description as a gauge theory on multiple F-theory 7-branes. A by-product of these efforts is a method for analyzing semistability and the splitting type of vector bundles over an elliptic curve given as the sheaf cohomology of a monad. (orig.) 24 refs.
Crack propagation modeling using Peridynamic theory
Hafezi, M. H.; Alebrahim, R.; Kundu, T.
2016-04-01
Crack propagation and branching are modeled using nonlocal peridynamic theory. One major advantage of this nonlocal theory-based analysis tool is its unified approach to material behavior modeling, irrespective of whether a crack has formed in the material or not. No separate damage law is needed for crack initiation and propagation. This theory overcomes the weaknesses of existing continuum-mechanics-based numerical tools (e.g. FEM, XFEM) for identifying fracture modes and does not require any simplifying assumptions. Cracks grow autonomously, not necessarily along a prescribed path. However, in some special situations, such as ductile fracture, the damage evolution and failure depend on parameters characterizing the local stress state rather than on the peridynamic damage modeling technique developed for brittle fracture. For brittle fracture modeling, the bond is simply broken when the failure criterion is satisfied. This simulation helps us to design a more reliable modeling tool for crack propagation and branching in both brittle and ductile materials. Peridynamic analysis has been found to be very demanding computationally, particularly for real-world structures (e.g. vehicles, aircraft), and it also requires a very expensive visualization process. The goal of this paper is to make researchers aware of the impact of this cutting-edge simulation tool for a better understanding of the cracked material response. A computer code has been developed to implement peridynamic theory-based modeling for two-dimensional analysis. A good agreement between our predictions and previously published results is observed. Some interesting new results that have not been reported earlier by others are also obtained and presented in this paper. The final objective of this investigation is to increase the mechanics knowledge of self-similar and self-affine cracks.
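For bond-based peridynamics with brittle failure, the criterion mentioned above (break a bond irreversibly once its stretch exceeds a critical value) can be sketched as follows; the 2-D setting and the critical stretch `s0` are illustrative assumptions, not values from the paper:

```python
import math

def bond_stretch(xi, eta):
    """Stretch of a peridynamic bond: relative elongation of the bond.
    xi  = reference bond vector between two material points (2-D tuple)
    eta = relative displacement of those points (2-D tuple)"""
    ref = math.hypot(xi[0], xi[1])
    cur = math.hypot(xi[0] + eta[0], xi[1] + eta[1])
    return (cur - ref) / ref

def bond_intact(xi, eta, s0):
    """Brittle failure criterion: the bond survives while stretch <= s0;
    once broken it carries no force (the irreversibility bookkeeping is
    omitted here). s0 is a hypothetical material parameter."""
    return bond_stretch(xi, eta) <= s0

# A unit bond stretched by 10% survives a critical stretch of 0.15
# but fails for a critical stretch of 0.05.
print(bond_intact((1.0, 0.0), (0.1, 0.0), 0.15))  # → True
print(bond_intact((1.0, 0.0), (0.1, 0.0), 0.05))  # → False
```

In a full simulation, local damage at a point is then the fraction of its broken bonds, which is how cracks emerge without a prescribed path.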
Verkhoglyadova, Olga; Meng, Xing; Mannucci, Anthony J.; Tsurutani, Bruce T.; Hunt, Linda A.; Mlynczak, Martin G.; Hajra, Rajkumar; Emery, Barbara A.
2016-04-01
We analyze the energy budget of the ionosphere-thermosphere (IT) system during two High-Speed Streams (HSSs) on 22-31 January, 2007 (in the descending phase of solar cycle 23) and 25 April-2 May, 2011 (in the ascending phase of solar cycle 24) to understand typical features, similarities, and differences in magnetosphere-ionosphere-thermosphere (IT) coupling during HSS geomagnetic activity. We focus on the solar wind energy input into the magnetosphere (by using coupling functions) and energy partitioning within the IT system during these intervals. The Joule heating is estimated empirically. Hemispheric power is estimated based on satellite measurements. We utilize observations from TIMED/SABER (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry) to estimate nitric oxide (NO) and carbon dioxide (CO2) cooling emission fluxes. We perform a detailed modeling study of these two similar HSS events with the Global Ionosphere-Thermosphere Model (GITM) and different external driving inputs to understand the IT response and to address how well the model reproduces the energy transport. GITM is run in a mode with forecastable inputs. It is shown that the model captures the main features of the energy coupling, but underestimates NO cooling and auroral heating in high latitudes. Lower thermospheric forcing at 100 km altitude is important for correct energy balance of the IT system. We discuss challenges for a physics-based general forecasting approach in modeling the energy budget of moderate IT storms caused by HSSs.
Topos models for physics and topos theory
Wolters, Sander
2014-08-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a "quantum logic" in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos.
A Membrane Model from Implicit Elasticity Theory
Freed, A. D.; Liao, J.; Einstein, D. R.
2014-01-01
A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
Driscoll, Daniel G.; Norton, Parker A.
2009-01-01
The U.S. Geological Survey cooperated with South Dakota Game, Fish and Parks to characterize hydrologic information relevant to management of water resources associated with Sheridan Lake, which is formed by a dam on Spring Creek. This effort consisted primarily of characterization of hydrologic data for a base period of 1962 through 2006, development of a hydrologic budget for Sheridan Lake for this timeframe, and development of an associated model for simulation of storage deficits and drawdown in Sheridan Lake for hypothetical release scenarios from the lake. Historically, the dam has been operated primarily as a 'pass-through' system, in which unregulated outflows pass over the spillway; however, the dam recently was retrofitted with an improved control valve system that would allow controlled releases of about 7 cubic feet per second (ft3/s) or less from a fixed depth of about 60 feet (ft). Development of a hydrologic budget for Sheridan Lake involved compilation, estimation, and characterization of data sets for streamflow, precipitation, and evaporation. The most critical data need was for extrapolation of available short-term streamflow records for Spring Creek to be used as the long-term inflow to Sheridan Lake. Available short-term records for water years (WY) 1991-2004 for a gaging station upstream from Sheridan Lake were extrapolated to WY 1962-2006 on the basis of correlations with streamflow records for a downstream station and for stations located along two adjacent streams. Comparisons of data for the two streamflow-gaging stations along Spring Creek indicated that tributary inflow is approximately proportional to the intervening drainage area, which was used as a means of estimating tributary inflow for the hydrologic budget. Analysis of evaporation data shows that sustained daily rates may exceed maximum monthly rates by a factor of about two. A long-term (1962-2006) hydrologic budget was developed for computation of reservoir outflow from
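The pass-through reservoir budget described above (inflow plus precipitation, minus evaporation and controlled release, with excess spilling over the spillway) can be sketched as a simple daily mass balance. All names, units, and capacity figures are illustrative, not values from the Sheridan Lake model:

```python
def simulate_storage(s0, inflow, precip, evap, release, s_max):
    """Daily reservoir water budget (illustrative units, e.g. acre-ft/day).
    Storage above capacity s_max spills over the uncontrolled spillway,
    mimicking 'pass-through' operation; controlled release is capped only
    by available storage here."""
    storage = s0
    trace, spill = [], []
    for q_in, p, e, r in zip(inflow, precip, evap, release):
        storage = storage + q_in + p - e - r
        over = max(0.0, storage - s_max)   # spillway overflow
        storage = max(storage - over, 0.0) # storage cannot go negative
        trace.append(storage)
        spill.append(over)
    return trace, spill

# Two days: net gain of 1 unit/day with no spill under a 12-unit capacity.
trace, spill = simulate_storage(10.0, [5, 5], [1, 1], [2, 2], [3, 3], 12.0)
print(trace, spill)  # → [11.0, 12.0] [0.0, 0.0]
```

A storage-deficit/drawdown analysis like the one in the report would then compare `trace` against full-pool storage for hypothetical `release` scenarios.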
Network Data: Statistical Theory and New Models
2016-02-17
The major goals of this project are to develop and implement algorithms based on high-dimensional statistics theory, especially … invariance to local deformation. These techniques have been adapted to modeling higher-order visual areas such as area MT on two experimental datasets provided
Theory, Modeling and Simulation Annual Report 2000
Energy Technology Data Exchange (ETDEWEB)
Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.
2001-11-01
This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.
Patel, Nitin R; Ankolekar, Suresh; Antonijevic, Zoran; Rajicic, Natasa
2013-05-10
We describe a value-driven approach to optimizing pharmaceutical portfolios. Our approach incorporates inputs from research and development and commercial functions by simultaneously addressing internal and external factors. This approach differentiates itself from current practices in that it recognizes the impact of study design parameters, sample size in particular, on the portfolio value. We develop an integer programming (IP) model as the basis for Bayesian decision analysis to optimize phase 3 development portfolios using expected net present value as the criterion. We show how this framework can be used to determine optimal sample sizes and trial schedules to maximize the value of a portfolio under budget constraints. We then illustrate the remarkable flexibility of the IP model to answer a variety of 'what-if' questions that reflect situations that arise in practice. We extend the IP model to a stochastic IP model to incorporate uncertainty in the availability of drugs from earlier development phases for phase 3 development in the future. We show how to use stochastic IP to re-optimize the portfolio development strategy over time as new information accumulates and budget changes occur.
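The core selection problem in such a portfolio model (maximize expected net present value under a budget cap) is, in its simplest form, a 0/1 knapsack. The brute-force sketch below, with made-up project data, stands in for the paper's integer program; a real instance would also encode sample sizes and trial schedules as decision variables:

```python
from itertools import combinations

def best_portfolio(projects, budget):
    """Exhaustive 0/1 selection maximizing expected NPV under a budget cap.
    projects: list of (name, cost, enpv) tuples (illustrative values).
    Returns (selected_names, total_enpv)."""
    best_names, best_value = (), 0.0
    for k in range(len(projects) + 1):
        for combo in combinations(projects, k):
            cost = sum(c for _, c, _ in combo)
            value = sum(v for _, _, v in combo)
            if cost <= budget and value > best_value:
                best_names = tuple(n for n, _, _ in combo)
                best_value = value
    return best_names, best_value

# Hypothetical phase 3 candidates: (name, cost, expected NPV).
projects = [("A", 3, 10), ("B", 4, 14), ("C", 5, 13)]
print(best_portfolio(projects, budget=8))  # → (('A', 'B'), 24)
```

The 'what-if' re-optimization described in the abstract corresponds to re-running such a solve as budgets change or new drugs become available.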
Kuribayashi, Masatoshi; Noh, Nam-Jin; Saitoh, Taku M.; Ito, Akihiko; Wakazuki, Yasutaka; Muraoka, Hiroyuki
2016-12-01
Accurate projection of the carbon budget in forest ecosystems under future climate and atmospheric carbon dioxide (CO2) concentrations is important for evaluating the function of terrestrial ecosystems, which serve as a major sink of atmospheric CO2. In this study, we examined the effects of the spatial resolution of meteorological data on the accuracy of ecosystem model simulations of canopy phenology and carbon budget components such as gross primary production (GPP), ecosystem respiration (ER), and net ecosystem production (NEP) of a deciduous forest in Japan. We then simulated the future (around 2085) changes in canopy phenology and carbon budget of the forest by incorporating high-resolution meteorological data downscaled by a regional climate model. The ecosystem model overestimated GPP and ER when we inputted low-resolution data, which have warming biases over mountainous landscapes, but it reproduced canopy phenology and the carbon budget well when we inputted high-resolution data. Under the future climate, earlier leaf expansion and delayed leaf fall by about 10 days compared with the present state were simulated, and GPP, ER and NEP were estimated to increase by 25.2%, 23.7% and 35.4%, respectively. Sensitivity analysis showed that the increase in NEP in June and October would be mainly caused by rising temperature, whereas that in July and August would be largely attributable to CO2 fertilization. This study suggests that downscaling future climate data enables us to project a more reliable carbon budget for forest ecosystems in mountainous landscapes than low-resolution simulation, owing to better predictions of leaf expansion and shedding.
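Since NEP is the balance GPP − ER, the reported fractional increases combine nonlinearly: a larger relative jump in NEP than in either flux is expected whenever ER is a large fraction of GPP. The sketch below checks this arithmetic; the baseline ER/GPP ratio of 0.87 is an assumption chosen for illustration, not a figure from the study:

```python
def nep_change(gpp0, er0, d_gpp, d_er):
    """Fractional NEP change implied by fractional changes in GPP and ER,
    using NEP = GPP - ER. Baseline values are illustrative."""
    nep0 = gpp0 - er0
    nep1 = gpp0 * (1.0 + d_gpp) - er0 * (1.0 + d_er)
    return (nep1 - nep0) / nep0

# With ER ~87% of GPP at baseline, GPP +25.2% and ER +23.7%
# amplify to roughly a +35% NEP increase, as in the abstract.
print(round(nep_change(1.0, 0.872, 0.252, 0.237), 3))  # → 0.354
```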
Sparse modeling theory, algorithms, and applications
Rish, Irina
2014-01-01
"A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come." - Ricardo Vilalta, Department of Computer Science, University of Houston. "This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course." - Francis Bach, INRIA - École Normale Supérieure, Paris.
Directory of Open Access Journals (Sweden)
Ghodratollah Talebnia
2012-10-01
Performance-based budgeting (PBB) is the latest attempt to use performance indicators in the allocation of resources in the public sector. PBB experts normally attempt to place emphasis on outputs and outcomes instead of inputs. Iran has made efforts to establish a PBB system, but so far this goal has not been realized. The methodology of the research is descriptive, using a survey-analytical approach. In the research, the possibility of establishing PBB in Iran is examined from three perspectives (policymaking, implementation, and monitoring). The conceptual model of this research is formed through a comprehensive review of the PBB literature worldwide. First, with an extensive review of the literature in countries that have implemented PBB or are trying to implement it, we identify all the variables necessary for a suitable performance budgeting model. PBB experts then test the necessity of these variables in Iran, and finally their presence in the Iranian model is verified by statistical methods.
Zilitinkevich, S. S.; Elperin, T.; Kleeorin, N.; Rogachevskii, I.; Esau, I.
2013-03-01
Here we advance the physical background of the energy- and flux-budget turbulence closures based on the budget equations for the turbulent kinetic and potential energies and turbulent fluxes of momentum and buoyancy, and a new relaxation equation for the turbulent dissipation time scale. The closure is designed for stratified geophysical flows from neutral to very stable and accounts for the Earth's rotation. In accordance with modern experimental evidence, the closure implies the maintenance of turbulence by the velocity shear at any gradient Richardson number Ri, and distinguishes between two principally different regimes: "strong turbulence" at Ri ≪ 1, typical of boundary-layer flows and characterized by a practically constant turbulent Prandtl number Pr_T; and "weak turbulence" at Ri > 1, typical of the free atmosphere or deep ocean, where Pr_T asymptotically increases linearly with increasing Ri (which implies very strong suppression of the heat transfer compared to the momentum transfer). For use in different applications, the closure is formulated at different levels of complexity, from a local algebraic model relevant to the steady-state regime of turbulence to a hierarchy of non-local closures including simpler down-gradient models, presented in terms of the eddy viscosity and eddy conductivity, and a general non-gradient model based on prognostic equations for all the basic parameters of turbulence including turbulent fluxes.
Directory of Open Access Journals (Sweden)
H. Chakroun
2012-05-01
The use of remote sensing at different spatio-temporal resolutions has become common in recent decades, since sensors offer many inputs to water budget estimation. Various water balance models use the LAI as a parameter for accounting for water interception, evapotranspiration, runoff and available ground water. The objective of the present work is to improve vegetation stress monitoring at regional scale for a natural forested ecosystem. LAI-MODIS and spatialized vegetation, soil and climatic data have been integrated into a water budget model that simulates evapotranspiration and soil water content at a daily step. We first explore LAI-MODIS in the specific context of a Mediterranean natural ecosystem. Results showed that despite the coarse resolution of the LAI-MODIS product (1 km), it was possible to discriminate evergreen and coniferous vegetation, and that LAI values are influenced by the water-holding capacity of the underlying soil. The dynamics of vegetation have been integrated into the water budget model through weekly varying LAI-MODIS. Results of the simulations were analysed in terms of actual evapotranspiration, deficit of soil water to field capacity, and a vegetation stress index based on actual and potential evapotranspiration. Comparing the dynamic LAI variation afforded by MODIS to a hypothetical constant LAI over the whole year corresponds to a 30% increase in fAPAR. A sensitivity analysis of the simulation outputs to this fAPAR variation reveals that the increases in both the deficit of soil water to field capacity and the stress index are respectively 18% and 27% (in terms of RMSE, these variations are respectively 1258 mm yr−1 and 11 days yr−1). These results are consistent with previous studies at local scale showing that an LAI increase is accompanied by an increase in stress conditions in Mediterranean natural ecosystems. In this study, we also showed that spatial modelling of drought conditions based on water budget simulations is an adequate tool for
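A daily water budget step of the kind described above can be sketched as a bucket model in which actual evapotranspiration scales with LAI and soil-water availability. The functional form and the coefficient `k` are assumptions for illustration, not the model used in the study:

```python
def soil_water_step(w, precip, pet, lai, w_fc, k=0.5):
    """One daily step of a bucket water-budget model.
    w      : soil water content (mm)
    precip : daily precipitation (mm)
    pet    : potential evapotranspiration (mm)
    lai    : leaf area index (from e.g. weekly MODIS values)
    w_fc   : soil water content at field capacity (mm)
    k      : illustrative LAI-scaling coefficient (assumption)
    Actual ET scales with canopy cover (via LAI, capped at full cover)
    and with soil-water availability (w / w_fc)."""
    aet = pet * min(1.0, k * lai) * (w / w_fc)
    total = w + precip - aet
    runoff = max(0.0, total - w_fc)      # excess above field capacity
    w_new = max(0.0, min(total, w_fc))
    return w_new, aet, runoff

# Half-full soil, full canopy cover: AET is half of PET.
print(soil_water_step(50.0, 10.0, 4.0, 3.0, 100.0))  # → (58.0, 2.0, 0.0)
```

The deficit of soil water to field capacity used as a stress indicator above is then simply `w_fc - w_new` accumulated over the simulation period.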
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TMS) program, which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
An Optimization Model Based on Game Theory
Directory of Open Access Journals (Sweden)
Yang Shi
2014-04-01
Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into optimization and propose a new optimization model that can be widely applied in optimization processing. The model comes in two types, called "complete consistency" and "partial consistency"; the partial-consistency type adds a disturbance strategy on top of the complete-consistency type. When the model's consistency is satisfied, the Nash equilibrium of the optimization model is the global optimum; when consistency is not met, the presence of the perturbation strategy broadens the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and good performance, and it offers a new idea for some intractable problems in the field of artificial intelligence.
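The "complete consistency" case, where every player shares one objective, can be sketched as coordinate-wise best-response dynamics. The toy below is our own construction for illustration, not the paper's algorithm: in a common-interest game, a point no player wants to deviate from is a Nash equilibrium, and here it coincides with the global optimum of the shared objective.

```python
import numpy as np

def best_response_descent(f, x0, sweeps=50, grid=np.linspace(-5.0, 5.0, 201)):
    """Each coordinate acts as a 'player' that best-responds by minimizing
    the shared objective f over its own coordinate while the others are
    held fixed. In such a common-interest ('completely consistent') game,
    a point no player wants to leave is a Nash equilibrium."""
    x = np.array(x0, dtype=float)
    for _ in range(sweeps):
        for i in range(len(x)):
            trials = np.tile(x, (len(grid), 1))
            trials[:, i] = grid
            x[i] = grid[np.argmin([f(t) for t in trials])]
    return x

# Shared objective with its global minimum at (1, -2)
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x_star = best_response_descent(f, x0=[4.0, 4.0])
print(x_star)   # converges to (1, -2): the Nash equilibrium is the global optimum
```

For a separable objective like this one, a single sweep already lands each player on its best response; the "partial consistency" variant of the paper would perturb these best responses.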
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and parameters of the CVAR are shown to be interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
Institute of Scientific and Technical Information of China (English)
胡玉清; 张帅; 苑明; 张永顺
2015-01-01
Applying portfolio-management theory, and weighing the actual needs of equipment construction against the available funding, a portfolio optimization model for equipment outlay budget projects is designed. The model maximizes expected military benefit subject to budget-control, project-relationship, and project-decomposition constraints. It provides a feasible decision-making method for preparing equipment outlay budgets and makes budget preparation more scientific, fair, and transparent.
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-01
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
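The PFF mechanism can be illustrated with a toy replicator model. This is our own simplification, not the authors' coevolutionary model: each symbiont stays with one host (fidelity), so a cooperator's investment returns to it through host productivity, and cooperation can spread without any punishment.

```python
def pff_replicator(p0, feedback, cost=0.1, benefit=0.5, gens=100):
    """Toy partner-fidelity-feedback (PFF) model. Each symbiont stays on
    one host: cooperators pay `cost` and raise host productivity by
    `benefit`, of which a fraction `feedback` flows back to the resident
    symbiont lineage. Cheaters pay nothing and receive no host boost.
    There are no sanctions anywhere in the model."""
    w_cheat = 1.0
    w_coop = (1.0 - cost) * (1.0 + feedback * benefit)
    p = p0
    for _ in range(gens):   # replicator dynamics on cooperator frequency p
        p = p * w_coop / (p * w_coop + (1.0 - p) * w_cheat)
    return p

p_strong = pff_replicator(p0=0.01, feedback=1.0)   # strong fidelity feedback
p_none = pff_replicator(p0=0.99, feedback=0.0)     # feedback switched off
print(p_strong, p_none)  # cooperation fixes with feedback, collapses without it
```

Cooperation invades whenever (1 − cost)(1 + feedback × benefit) > 1, i.e. when host-mediated returns outweigh the investment; no host punishment term is needed, which is the core of the PFF argument.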
A matrix model from string field theory
Directory of Open Access Journals (Sweden)
Syoji Zeze
2016-09-01
Full Text Available We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N. An increase of the potential height is observed in both cases. The large N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.
Polarimetric clutter modeling: Theory and application
Kong, J. A.; Lin, F. C.; Borgeaud, M.; Yueh, H. A.; Swartz, A. A.; Lim, H. H.; Shim, R. T.; Novak, L. M.
1988-01-01
The two-layer anisotropic random medium model is used to investigate fully polarimetric scattering properties of earth terrain media. The polarization covariance matrices for the untilted and tilted uniaxial random medium are evaluated using the strong fluctuation theory and distorted Born approximation. In order to account for the azimuthal randomness in the growth direction of leaves in tree and grass fields, an averaging scheme over the azimuthal direction is also applied. It is found that characteristics of terrain clutter can be identified through the analysis of each element of the covariance matrix. Theoretical results are illustrated by comparison with experimental data provided by MIT Lincoln Laboratory for tree and grass fields.
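The azimuthal averaging step can be sketched numerically: build the covariance matrix C = ⟨k k^H⟩ of the target vector k = [S_hh, √2 S_hv, S_vv] from scattering matrices rotated by uniformly random azimuth angles. The synthetic real-valued scatterers below are our own stand-in, not the strong-fluctuation-theory solution of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotate(S, phi):
    """Rotate a 2x2 scattering matrix about the line of sight by phi."""
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, -s], [s, c]])
    return R @ S @ R.T

def covariance(samples):
    """Polarimetric covariance C = <k k^H> for k = [S_hh, sqrt(2) S_hv, S_vv]
    (monostatic case, so S_hv = S_vh)."""
    ks = np.array([[S[0, 0], np.sqrt(2) * S[0, 1], S[1, 1]] for S in samples])
    return ks.conj().T @ ks / len(ks)

# Synthetic anisotropic (leaf-like) scatterers with random overall amplitude,
# averaged over a uniformly random azimuthal growth direction.
base = np.array([[1.0, 0.1], [0.1, 0.4]])
samples = [rotate(base * (1.0 + 0.1 * rng.standard_normal()), phi)
           for phi in rng.uniform(0.0, 2.0 * np.pi, 2000)]
C = covariance(samples)

# Azimuthal symmetry drives the co-/cross-pol correlation elements toward zero
print(abs(C[0, 1]) / C[0, 0], abs(C[1, 2]) / C[2, 2])  # both small
```

The vanishing C12 and C23 elements under azimuthal averaging are exactly the kind of covariance-matrix signature the abstract refers to when distinguishing terrain clutter types.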
Directory of Open Access Journals (Sweden)
C. R. Flechard
2009-10-01
Full Text Available The net annual NH_{3} exchange budget of a fertilised, cut grassland in Central Switzerland is presented. The observation-based budget was computed from semi-continuous micrometeorological fluxes over a time period of 16 months and using a process-based gap-filling procedure. The data for emission peak events following the application of cattle slurry and for background exchange were analysed separately to distinguish short-term perturbations from longer-term ecosystem functioning. A canopy compensation point model of background exchange is parameterised on the basis of measured data and applied for the purposes of gap-filling. The data show that, outside fertilisation events, grassland behaves as a net sink for atmospheric NH_{3} with an annual dry deposition flux of −3.0 kg N ha^{−1} yr^{−1}, although small NH_{3} emissions by the canopy were measured in dry daytime conditions. The median Γ_{s} ratio in the apoplast (=[NH_{4}^{+}]/[H^{+}]) estimated from micrometeorological measurements was 620, equivalent to a stomatal compensation point of 1.3 μg NH_{3} m^{−3} at 15°C. Non-stomatal resistance to deposition R_{w} was shown to increase with temperature and decrease with surface relative humidity, and R_{w} values were among the highest published for European grasslands, consistent with a relatively high ratio of NH_{3} to acid gases in the boundary layer at this site. Since the gross annual NH_{3} emission by slurry spreading was of the order of +20 kg N ha^{−1} yr^{−1}, the fertilised grassland was a net NH_{3} source of +17 kg N ha^{−1} yr^{−1}. A comparison with the few other measurement-based budget values from the literature reveals considerable variability, demonstrating both the influence of soil, climate, management and grassland type on the NH
Quantum Model Theory (QMod): Modeling Contextual Emergent Entangled Interfering Entities
Aerts, Diederik
2012-01-01
In this paper we present 'Quantum Model Theory' (QMod), a theory we developed to model entities that entail the typical quantum effects of 'contextuality', 'superposition', 'interference', 'entanglement' and 'emergence'. The aim of QMod is to put forward a theoretical framework that has the technical power of standard quantum mechanics, namely it makes explicit use of the standard complex Hilbert space and its quantum mechanical calculus, but is also more general than standard quantum mechanics, in the sense that it only uses this quantum calculus locally, i.e. for each context corresponding to a measurement. In this sense, QMod is a generalization of quantum mechanics, similar to how the manifold formalism of general relativity is a generalization of special relativity and classical physics. We prove by means of a representation theorem that QMod can be used for any entity entailing the typical quantum effects mentioned above. Some examples of application of QMod in concept theory and macroscopic...
Research on the Financial Budget of Private Colleges Based on Control Theory
Institute of Scientific and Technical Information of China (English)
罗小兰
2016-01-01
The effectiveness of financial budget management in private colleges depends on at least two factors: first, budget preparation must pay attention to method, improving the accuracy of the budget as far as possible; second, the budget must serve as the standard for strict control, with a focus on the results of budget execution. Budget execution control is the key to successful budget management in private colleges; only by attaching great importance to this control link can the purpose of financial budget management be achieved. Starting from the perspective of budget execution, this paper explores financial budget management methods suitable for private colleges.
Drafting Multiannual Local Budgets by Economic-Mathematical Modelling of the Evolution of Revenues
Directory of Open Access Journals (Sweden)
Ioan Radu
2009-01-01
Full Text Available Although the public administration system is often seen as a sector with a high degree of inertia and conservatism, public institutions register influences from both their internal and external environments. The system is shaped by frequent legislative changes and, more recently, by requirements imposed by the European Union. Given the complexity and dynamics of the competitive environment, strategic management tools become ever more important and necessary at the level of public administration. One of the main expressions of strategic management is financial planning, moulded into policies, strategies, plans and programmes whose generation is based on multiannual budgets.
Directory of Open Access Journals (Sweden)
Verkhoglyadova Olga
2016-01-01
Full Text Available We analyze the energy budget of the ionosphere-thermosphere (IT) system during two High-Speed Streams (HSSs) on 22–31 January, 2007 (in the descending phase of solar cycle 23) and 25 April–2 May, 2011 (in the ascending phase of solar cycle 24) to understand typical features, similarities, and differences in magnetosphere-ionosphere-thermosphere (IT) coupling during HSS geomagnetic activity. We focus on the solar wind energy input into the magnetosphere (by using coupling functions) and energy partitioning within the IT system during these intervals. The Joule heating is estimated empirically. Hemispheric power is estimated based on satellite measurements. We utilize observations from TIMED/SABER (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics/Sounding of the Atmosphere using Broadband Emission Radiometry) to estimate nitric oxide (NO) and carbon dioxide (CO2) cooling emission fluxes. We perform a detailed modeling study of these two similar HSS events with the Global Ionosphere-Thermosphere Model (GITM) and different external driving inputs to understand the IT response and to address how well the model reproduces the energy transport. GITM is run in a mode with forecastable inputs. It is shown that the model captures the main features of the energy coupling, but underestimates NO cooling and auroral heating in high latitudes. Lower thermospheric forcing at 100 km altitude is important for correct energy balance of the IT system. We discuss challenges for a physics-based general forecasting approach in modeling the energy budget of moderate IT storms caused by HSSs.
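Coupling functions of the kind used to estimate solar-wind energy input can be illustrated with Akasofu's epsilon parameter. This is our choice for illustration, not necessarily the coupling function used in the paper, and the driving conditions below are typical HSS values, not the events' data.

```python
import numpy as np

MU0 = 4e-7 * np.pi          # vacuum permeability [H/m]
R_E = 6.371e6               # Earth radius [m]

def akasofu_epsilon(v, B, clock_angle, l0=7 * R_E):
    """Akasofu epsilon coupling function [W]: an empirical estimate of the
    solar-wind power input to the magnetosphere,
        epsilon = (4*pi/mu0) * v * B^2 * sin^4(theta/2) * l0^2.
    v: solar wind speed [m/s]; B: IMF magnitude [T];
    clock_angle: IMF clock angle theta [rad]; l0: empirical length scale."""
    return (4 * np.pi / MU0) * v * B**2 * np.sin(clock_angle / 2.0) ** 4 * l0**2

# Typical high-speed-stream conditions: 600 km/s, 8 nT, 135-degree clock angle
eps = akasofu_epsilon(600e3, 8e-9, np.radians(135.0))
print(f"{eps:.2e} W")   # a few 1e11 W, i.e. a few tenths of a terawatt
```

A few tenths of a terawatt is the right order of magnitude for moderate HSS-driven activity, to be partitioned among Joule heating, auroral precipitation, and ring-current energization.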
Educational Program Evaluation Model, From the Perspective of the New Theories
Directory of Open Access Journals (Sweden)
Soleiman Ahmady
2014-05-01
Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education regarding current evaluation models and theories. We included all study designs. We found 810 articles related to our topic and finally included 63 with full text. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory, using the logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model, on which our proposal format is based, are situation, inputs, outputs, and outcomes. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.
Miller, James R.; Russell, Gary L.
1996-01-01
The annual flux of freshwater into the Arctic Ocean by the atmosphere and rivers is balanced by the export of sea ice and oceanic freshwater. Two 150-year simulations of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. Relative to the control, the last 50-year period of the GHG experiment indicates that the total inflow of water from the atmosphere and rivers increases by 10% primarily due to an increase in river discharge, the annual sea-ice export decreases by about half, the oceanic liquid water export increases, salinity decreases, sea-ice cover decreases, and the total mass and sea-surface height of the Arctic Ocean increase. The closed, compact, and multi-phased nature of the hydrologic cycle in the Arctic Ocean makes it an ideal test of water budgets that could be included in model intercomparisons.
An Inflationary Model in String Theory
Iizuka, N; Iizuka, Norihiro; Trivedi, Sandip P.
2004-01-01
We construct a model of inflation in string theory after carefully taking into account moduli stabilization. The setting is a warped compactification of Type IIB string theory in the presence of D3 and anti-D3-branes. The inflaton is the position of a D3-brane in the internal space. By suitably adjusting fluxes and the location of symmetrically placed anti-D3-branes, we show that at a point of enhanced symmetry, the inflaton potential V can have a broad maximum, satisfying the condition V''/V << 1 in Planck units. On starting close to the top of this potential the slow-roll conditions can be met. Observational constraints impose significant restrictions. As a first pass we show that these can be satisfied and determine the important scales in the compactification to within an order of magnitude. One robust feature is that the scale of inflation is low, H = O(10^{10}) GeV. Removing the observational constraints makes it much easier to construct a slow-roll inflationary model. Generalizations and conseque...
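The slow-roll condition V''/V << 1 near a broad maximum can be checked numerically. The potential below is a toy with a broad maximum at phi = 0, not the warped-compactification potential of the paper; we work in reduced Planck units.

```python
def slow_roll(V, phi, h=1e-5, m_p=1.0):
    """First and second slow-roll parameters from finite differences:
        epsilon = (m_p^2 / 2) * (V'/V)^2   and   eta = m_p^2 * V''/V.
    Slow roll requires both |epsilon| << 1 and |eta| << 1."""
    V0 = V(phi)
    Vp = (V(phi + h) - V(phi - h)) / (2.0 * h)          # central difference V'
    Vpp = (V(phi + h) - 2.0 * V0 + V(phi - h)) / h**2   # central difference V''
    return 0.5 * m_p**2 * (Vp / V0) ** 2, m_p**2 * Vpp / V0

# Toy potential with a broad maximum at phi = 0 (illustration only)
V = lambda phi: 1.0 - 0.01 * phi**2 + 1e-4 * phi**4

eps, eta = slow_roll(V, phi=0.1)
print(eps, eta)   # both magnitudes << 1 near the top, so slow roll can proceed
```

Starting the field close to such a broad maximum keeps eta = V''/V small, which is the condition the abstract highlights at the point of enhanced symmetry.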
Modified perturbation theory for the Yukawa model
Poluektov, Yu M
2016-01-01
A new formulation of perturbation theory for the description of the Dirac and scalar fields (the Yukawa model) is suggested. The self-consistent field model is chosen as the main approximation, which allows one to account, to a certain degree, for the effects caused by the interaction of fields. This choice of the main approximation leads to a normally ordered form of the interaction Hamiltonian. Generation of the fermion mass due to the interaction with exchange of the scalar boson is investigated. It is demonstrated that, for zero bare mass, the fermion can acquire mass only if the coupling constant exceeds a critical value determined by the boson mass. In this connection, the problem of the neutrino mass is discussed.
PARFUME Theory and Model basis Report
Energy Technology Data Exchange (ETDEWEB)
Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
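One generic route to a particle failure probability (item 5 in the list above) is Monte Carlo sampling of particle-to-particle variability. This is a hedged sketch of that generic style with made-up numbers; it is not PARFUME's actual stress solution or material data.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_failure_probability(n=200_000):
    """Monte Carlo failure probability in the generic style of coated-particle
    fuel performance codes: sample a stress and a Weibull-distributed strength
    for each particle and count exceedances. All numbers are illustrative
    placeholders, not PARFUME material properties."""
    stress = rng.normal(300.0, 40.0, n)        # MPa, sampled SiC-layer stress
    m, s0 = 6.0, 500.0                         # Weibull modulus and scale, MPa
    strength = s0 * rng.weibull(m, n)          # MPa, sampled layer strength
    return float(np.mean(stress > strength))   # fraction of failed particles

p_fail = mc_failure_probability()
print(p_fail)   # a few percent for these toy numbers
```

Sampling-based estimates like this complement analytical (integration-based) failure probability methods, which is why codes of this kind carry two independent methods.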
Stochastic linear programming models, theory, and computation
Kall, Peter
2011-01-01
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
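The flavor of a scenario-based stochastic program can be conveyed by the classic newsvendor recourse problem, solved here by brute force over a grid rather than with an SLP solver. This is an illustration in the spirit of the book's subject matter, not an example taken from it.

```python
import numpy as np

def newsvendor(order_grid, demand_scenarios, probs, price=5.0, cost=3.0):
    """Tiny scenario-based stochastic program: choose the order quantity q
    now (first stage); the recourse is simply that unmet demand is lost and
    leftovers are wasted. Maximize expected profit over demand scenarios."""
    best_q, best_val = None, -np.inf
    for q in order_grid:
        sales = np.minimum(q, demand_scenarios)      # second-stage outcome
        expected_profit = probs @ (price * sales - cost * q)
        if expected_profit > best_val:
            best_q, best_val = q, expected_profit
    return best_q, best_val

q, v = newsvendor(np.arange(0, 21),
                  demand_scenarios=np.array([5, 10, 15, 20]),
                  probs=np.array([0.25, 0.25, 0.25, 0.25]))
print(q, v)   # q = 10, expected profit 13.75
```

The optimum sits at the critical fractile (price − cost)/price = 0.4 of the demand distribution, i.e. the smallest q with P(D ≤ q) ≥ 0.4; chance constraints and multi-stage recourse, the book's machinery, generalize exactly this trade-off.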
Santoni, G. W.; Xiang, B.; Kort, E. A.; Daube, B.; Andrews, A. E.; Sweeney, C.; Wecht, K.; Peischl, J.; Ryerson, T. B.; Angevine, W. M.; Trainer, M.; Nehrkorn, T.; Eluszkiewicz, J.; Wofsy, S. C.
2012-12-01
We present constraints on California emission inventories of methane (CH4) using atmospheric observations from nine NOAA P-3 flights during the California Nexus (CalNex) campaign in May and June of 2010. Measurements were made using a quantum cascade laser spectrometer (QCLS) and a cavity ring-down spectrometer (CRDS) and calibrated to NOAA standards in-flight. Five flights sampled above the northern and southern central valley and an additional four flights probed the south coast air basin, quantifying emissions from the Los Angeles basin. The data show large (>100 ppb) CH4 enhancements associated with point and area sources such as cattle and manure management, landfills, wastewater treatment, gas production and distribution infrastructure, and rice agriculture. We compare aircraft observations to modeled CH4 distributions by accounting for a) transport using the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by Weather Research and Forecasting (WRF) meteorology, b) emissions from inventories such as EDGAR and ones constructed from California-specific state and county databases, each gridded to 0.1° x 0.1° resolution, and c) spatially and temporally evolving boundary conditions such as GEOS-Chem and a NOAA aircraft profile measurement derived curtain imposed at the edge of the WRF domain. After accounting for errors associated with transport, planetary boundary layer height, lateral boundary conditions, seasonality of emissions, and the spatial resolution of surface emission prior estimates, we find that the California Air Resources Board (CARB) CH4 budget is a factor of 1.64 too low. Using a Bayesian inversion to the flight data, we estimate California's CH4 budget to be 2.5 TgCH4/yr, with emissions from cattle and manure management, landfills, rice, and natural gas infrastructure, representing roughly 82%, 26%, 9% and 32% (sum = 149% with other sources accounting for the additional 15%) of the current CARB CH4 budget estimate of 1.52 TgCH4
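The figures quoted above are mutually consistent, which is easy to verify with pure arithmetic: scaling the CARB inventory by the inferred factor reproduces the Bayesian estimate, and the listed source-category shares sum to that same factor.

```python
# Consistency check of the figures quoted in the abstract (pure arithmetic).
carb_budget = 1.52            # TgCH4/yr, CARB prior inventory
scale_factor = 1.64           # factor inferred from the flight data
estimate = carb_budget * scale_factor
print(round(estimate, 2))     # 2.49, matching the quoted ~2.5 TgCH4/yr

# Source-category shares of the CARB budget quoted in the abstract:
# 82% + 26% + 9% + 32% = 149%, plus 15% from other sources
shares = {"cattle/manure": 0.82, "landfills": 0.26, "rice": 0.09,
          "natural gas": 0.32, "other": 0.15}
print(round(sum(shares.values()), 2))   # 1.64, the same overall factor
```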
Grégoire, M.; Soetaert, K.E.R.
2010-01-01
Carbon, nitrogen, oxygen and sulfide budgets are derived for the Black Sea water column from a coupled physical–biogeochemical model. The model is applied in the deep part of the sea and simulates processes over the whole water column including the anoxic layer that extends from ≃115 m
A Content Analysis of Defense Budget Rhetoric
2011-06-01
President’s budget. SUBJECT TERMS: Defense Budget, Content Analysis, Political Discourse, Budget Rhetoric, Political Communication, Senate Armed... represent the most recent paradigm shift in political communication research (Scheufele & Tewksbury, 2007, p. 10). These three models combine to construct... this study was to fill the gap in political communication research by examining whether Congress was responsive to framing by the President’s budget.
Galaxy alignments: Theory, modelling and simulations
Kiessling, Alina; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L; Rassat, Anais
2015-01-01
The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in large-scale structure tend to align the shapes and angular momenta of nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both $N$-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the ...
Theory and modelling of nanocarbon phase stability.
Energy Technology Data Exchange (ETDEWEB)
Barnard, A. S.
2006-01-01
The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.
Modeling and Optimization : Theory and Applications Conference
Terlaky, Tamás
2015-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Modeling missing data in knowledge space theory.
de Chiusole, Debora; Stefanutti, Luca; Anselmi, Pasquale; Robusto, Egidio
2015-12-01
Missing data are a well known issue in statistical inference, because some responses may be missing, even when data are collected carefully. The problem that arises in these cases is how to deal with missing data. In this article, the missingness is analyzed in knowledge space theory, and in particular when the basic local independence model (BLIM) is applied to the data. Two extensions of the BLIM to missing data are proposed: The former, called ignorable missing BLIM (IMBLIM), assumes that missing data are missing completely at random; the latter, called missing BLIM (MissBLIM), introduces specific dependencies of the missing data on the knowledge states, thus assuming that the missing data are missing not at random. The IMBLIM and the MissBLIM modeled the missingness in a satisfactory way, in both a simulation study and an empirical application, depending on the process that generates the missingness: If the missing data-generating process is of type missing completely at random, then either IMBLIM or MissBLIM provide adequate fit to the data. However, if the pattern of missingness is functionally dependent upon unobservable features of the data (e.g., missing answers are more likely to be wrong), then only a correctly specified model of the missingness distribution provides an adequate fit to the data.
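The BLIM likelihood and the MCAR-style treatment of missing responses can be sketched as follows. This is our simplified reading of the IMBLIM idea (a missing response simply contributes no factor to the pattern likelihood); the actual IMBLIM and MissBLIM models are richer.

```python
def blim_pattern_prob(pattern, state, beta, eta):
    """Basic local independence model (BLIM): given a knowledge state
    (the set of mastered items), a mastered item is answered correctly
    with probability 1 - beta[item] (careless error rate beta) and an
    unmastered one with probability eta[item] (lucky guess rate eta).
    Following the MCAR idea behind IMBLIM (our simplification), a missing
    response (None) contributes no factor to the likelihood."""
    p = 1.0
    for item, correct in pattern.items():
        if correct is None:                     # missing: ignorable under MCAR
            continue
        if item in state:
            p *= (1.0 - beta[item]) if correct else beta[item]
        else:
            p *= eta[item] if correct else (1.0 - eta[item])
    return p

beta = {"a": 0.1, "b": 0.1}
eta = {"a": 0.2, "b": 0.2}
# State {"a"}: correct on a, missing on b -> just the 1 - beta["a"] factor
print(blim_pattern_prob({"a": True, "b": None}, {"a"}, beta, eta))   # 0.9
```

A MissBLIM-style model would instead replace the `continue` with a state-dependent probability of the response being missing, which is exactly what makes the missingness non-ignorable.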
Energy Technology Data Exchange (ETDEWEB)
Hollmann, R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik
2000-07-01
For forty years, instruments onboard satellites have demonstrated their usefulness for many applications in the fields of meteorology and oceanography. Several experiments, like ERBE, are dedicated to establishing a climatology of the global Earth radiation budget at the top of the atmosphere. The focus has now shifted to the regional scale, e.g. GEWEX with its regional sub-experiments like BALTEX. To obtain a regional radiation budget for Europe, in the first part of this work the well-calibrated measurements from ScaRaB (scanner for radiation budget) are used to derive a narrow-to-broadband conversion applicable to the AVHRR (advanced very high resolution radiometer). It is shown that the accuracy of the method is of the order of that of ScaRaB itself. In the second part of the work, results of REMO are compared with measurements of ScaRaB and AVHRR for March 1994. The model reproduces the measurements well overall, but it overestimates cold areas and underestimates warm areas in the longwave spectral domain; similarly, it overestimates dark areas and underestimates bright areas in the solar spectral domain. (orig.)
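A narrow-to-broadband conversion is, in its generic form, a regression of broadband flux on narrowband channel radiances. The sketch below fits synthetic data with made-up coefficients; it is not the ScaRaB/AVHRR regression of the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)

def fit_narrow_to_broadband(narrow, broad):
    """Least-squares narrow-to-broadband conversion: regress broadband flux
    on the narrowband channel radiances plus an offset. This is the generic
    form of such conversions; coefficients are fitted to synthetic data."""
    X = np.column_stack([np.ones(len(narrow)), narrow])
    coef, *_ = np.linalg.lstsq(X, broad, rcond=None)
    return coef

# Synthetic 'collocated' samples: two AVHRR-like channels, one broadband flux
n = 500
ch1 = rng.uniform(0.0, 1.0, n)          # visible-channel reflectance
ch2 = rng.uniform(0.0, 1.0, n)          # near-IR-channel reflectance
broad = 10.0 + 300.0 * ch1 + 120.0 * ch2 + rng.normal(0.0, 5.0, n)

coef = fit_narrow_to_broadband(np.column_stack([ch1, ch2]), broad)
print(coef)   # close to the generating coefficients [10, 300, 120]
```

In practice such regressions are stratified by scene type and viewing geometry before being applied to the narrowband imager, which is what ties the imager's coverage to the broadband reference.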
A Mathematical Theory of the Gauged Linear Sigma Model
Fan, Huijun; Ruan, Yongbin
2015-01-01
We construct a rigorous mathematical theory of Witten's Gauged Linear Sigma Model (GLSM). Our theory applies to a wide range of examples, including many cases with non-Abelian gauge group. Both the Gromov-Witten theory of a Calabi-Yau complete intersection X and the Landau-Ginzburg dual (FJRW-theory) of X can be expressed as gauged linear sigma models. Furthermore, the Landau-Ginzburg/Calabi-Yau correspondence can be interpreted as a variation of the moment map or a deformation of GIT in the GLSM. This paper focuses primarily on the algebraic theory, while a companion article will treat the analytic theory.
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model that specifies the correct set of m relevant exogenous variables, x{t}, is embedded within the larger set of m+k candidate variables, (x{t},w{t}), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w{t} are relevant.
Gravothermal Star Clusters - Theory and Computer Modelling
Spurzem, Rainer
2010-11-01
In the George Darwin Lecture delivered to the Royal Astronomical Society in 1960, Viktor A. Ambartsumian wrote that the evolution of stellar systems can be described by the "dynamic evolution of a gravitating gas" complemented by "a statistical description of the changes in the physical states of stars". This talk shows how this physical concept has inspired theoretical modelling of star clusters in the following decades up to the present day. The application of principles of thermodynamics shows, as Ambartsumian argued in his 1960 lecture, that there is no stable state of equilibrium of a gravitating star cluster. The trend towards local thermodynamic equilibrium is always disturbed by escaping stars (Ambartsumian), as well as by gravothermal and gravogyro instabilities, as was discovered later. The state of the art in modelling the evolution of dense stellar systems based on principles of thermodynamics and statistical mechanics (the Fokker-Planck approximation) is reviewed, and recent progress including rotation and internal correlations (primordial binaries) is presented. The models have also been used very successfully to study dense star clusters around massive black holes in galactic nuclei and even (in a few cases) relativistic supermassive dense objects in the centres of galaxies (again briefly touching one of the many research fields of V. A. Ambartsumian). For the present era of high-speed supercomputing, in which direct N-body simulations of star clusters are being tackled, we show that such direct modelling supports and confirms the concept of the statistical models based on Fokker-Planck theory, and that theoretical concepts and direct computer simulations are both necessary, supporting each other in making scientific progress in the study of star cluster evolution.
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
Model of Polyakov duality: String field theory Hamiltonians from Yang-Mills theories
Periwal, Vipul
2000-08-01
Polyakov has conjectured that Yang-Mills theory should be equivalent to a noncritical string theory. I point out, based on the work of Marchesini, Ishibashi, Kawai and collaborators, and Jevicki and Rodrigues, that the loop operator of the Yang-Mills theory is the temporal gauge string field theory Hamiltonian of a noncritical string theory. The consistency condition of the string interpretation is the zig-zag symmetry emphasized by Polyakov. I explicitly show how this works for the one-plaquette model, providing a consistent direct string interpretation of the unitary matrix model for the first time.
Chamberlin, Phillip
2008-01-01
The Flare Irradiance Spectral Model (FISM) is an empirical model of the solar irradiance spectrum from 0.1 to 190 nm at 1 nm spectral resolution and on a 1-minute time cadence. The goal of FISM is to provide accurate solar spectral irradiances over the vacuum ultraviolet (VUV: 0-200 nm) range as input for ionospheric and thermospheric models. The seminar will begin with a brief overview of the FISM model, and also how the Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) will contribute to improving FISM. Some current studies will then be presented that use FISM estimates of the solar VUV irradiance to quantify the contributions of the increased irradiance from flares to Earth's increased thermospheric and ionospheric densities. Initial results will also be presented from a study of the electron density increases in the Martian atmosphere during a solar flare. Results will also be shown quantifying the VUV contributions to the total flare energy budget for both the impulsive and gradual phases of solar flares. Lastly, an example of how FISM can be used to simplify the design of future solar VUV irradiance instruments will be discussed, using the future NOAA GOES-R Extreme Ultraviolet and X-Ray Sensors (EXIS) space weather instrument.
Smirnova, Daria
2017-01-01
The purpose of this research-based thesis was to gain insight into how the managers of two similar small hotels in a specific region deal with the marketing process on a limited budget. In addition, the thesis aimed to examine whether the hotel managers interviewed perceive marketing only as 'promotion', rather than in terms of marketing research, the marketing mix, and marketing environment theories. It was also investigated whether the managers of those hotels consider marketing a key to successful h...
Houcine, A.; Bargaoui, Z.
2012-04-01
Modelling the soil water budget is a key issue for assessing drought awareness indices based on soil moisture estimation. The aim of the study is to compare drought indices based on rainfall time series to those based on soil water content and evapotranspiration time series. To this end, a vertically averaged water budget over the root zone is implemented to assist the estimation of the evapotranspiration flux. A daily time step is adopted to run the water budget model for a lumped watershed of 250 km² under an arid climate, where recorded meteorological and hydrological data are available for a ten-year period. The water balance, which involves 7 parameters, accounts for evapotranspiration, runoff and leakage. Soil-property-related parameters are derived from pedotransfer functions, while the two remaining parameters are considered data-driven and are subject to calibration. The model is calibrated using daily hydrometeorological data (solar radiation, air temperature, air humidity, mean areal rainfall) as well as daily runoff records and average annual (or regional) evapotranspiration; the latter is estimated using an empirical sub-model. A set of acceptable solutions is identified according to the values of the Nash coefficients for annual and decadal runoffs as well as the relative bias for average annual evapotranspiration. Using these acceptable solutions, several drought indices are computed: SPI (standardized precipitation index), SMDI (soil moisture deficit index) and ETDI (evapotranspiration deficit index). While SPI indicators are based only on monthly precipitation time series, SMDI is based on weekly mean soil water content as computed by the hydrological model. On the other hand, ETDI indices are based on weekly mean potential and actual evapotranspiration as estimated by the meteorological and hydrological models. For SPI evaluation, various time scales are considered, from one to twelve months (SPI1, SPI3, SPI6, SPI9 and SPI12). For all
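The SPI mentioned above standardizes precipitation accumulated over a chosen time scale. A minimal illustrative sketch, substituting a plain z-score for the operational gamma-distribution fit (the function name and data below are hypothetical):

```python
import statistics

def spi_like_index(monthly_precip, window=3):
    """Simplified drought index: rolling precipitation sums standardized
    to zero mean and unit variance (z-scores).

    The operational SPI fits a gamma distribution to the accumulated sums
    and maps them through a standard normal; the plain z-score here is
    only an illustrative stand-in for that transformation."""
    sums = [sum(monthly_precip[i:i + window])
            for i in range(len(monthly_precip) - window + 1)]
    mu = statistics.fmean(sums)
    sd = statistics.stdev(sums)
    return [(s - mu) / sd for s in sums]
```

Negative values flag drier-than-usual windows; the most negative value marks the driest accumulation period in the record.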
Density functional theory and multiscale materials modeling
Indian Academy of Sciences (India)
Swapan K Ghosh
2003-01-01
One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids. In the intermediate mesoscopic length scale, an appropriate picture of the equilibrium and dynamical processes has been obtained through the single particle number density of the constituent atoms or molecules. A wide class of problems involving nanomaterials, interfacial science and soft condensed matter has been addressed using the density based theoretical formalism as well as atomistic simulation in this regime. In the macroscopic length scale, however, matter is usually treated as a continuous medium and a description using local mass density, energy density and other related density functions has been found to be quite appropriate. A unique single unified theoretical framework that emerges through the density concept at these diverse length scales and is applicable to both quantum and classical systems is the so called density functional theory (DFT) which essentially provides a vehicle to project the many-particle picture to a single particle one. Thus, the central equation for quantum DFT is a one-particle Schrödinger-like Kohn–Sham equation, while the same for classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential. Selected illustrative applications of quantum DFT to microscopic modeling of intermolecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are presented.
Models of Particle Physics from Type IIB String Theory and F-theory: A Review
Maharana, Anshuman
2012-01-01
We review particle physics model building in type IIB string theory and F-theory. This is a region in the landscape where in principle many of the key ingredients required for a realistic model of particle physics can be combined successfully. We begin by reviewing moduli stabilisation within this framework and its implications for supersymmetry breaking. We then review model building tools and developments in the weakly coupled type IIB limit, for both local D3-branes at singularities and global models of intersecting D7-branes. Much of recent model building work has been in the strongly coupled regime of F-theory due to the presence of exceptional symmetries which allow for the construction of phenomenologically appealing Grand Unified Theories. We review both local and global F-theory model building starting from the fundamental concepts and tools regarding how the gauge group, matter sector and operators arise, and ranging to detailed phenomenological properties explored in the literature.
Directory of Open Access Journals (Sweden)
M. Adachi
2011-09-01
More reliable estimates of the carbon (C) stock within forest ecosystems and of C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha^{−1}) and the rainforest in Malaysia (201.5 t C ha^{−1}) indicates that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results in the rainforest were consistent with field data, except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha^{−1} yr^{−1} in the two primary forests, whereas the model slightly underestimated GPP (26.5–34.5 t C ha^{−1} yr^{−1}). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after a deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock 30 yr after initiation of the plantation. However, there are few field data on C budget and stock, especially for oil palm plantations. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to
String-like dual models for scalar theories
Baadsgaard, Christian; Bjerrum-Bohr, N. E. J.; Bourjaily, Jacob; Damgaard, Poul H.
2016-12-01
We show that all tree-level amplitudes in φ^p scalar field theory can be represented as the α′ → 0 limit of an SL(2, ℝ)-invariant, string-theory-like dual model integral. These dual models are constructed according to constraints that admit families of solutions. We derive these dual models, and give closed formulae for all tree-level amplitudes of any φ^p scalar field theory.
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
The Theory of Finite Models without Equal Sign
Institute of Scientific and Technical Information of China (English)
Li Bo LUO
2006-01-01
In this paper we propose, for the first time, to study the model theory of all finite structures and to put the equal sign in the same situation as the other relations. Using formulas of infinite length, we obtain new theorems for the preservation of model extensions, submodels, model homomorphisms and inverse homomorphisms. Theorems of this kind were discussed systematically for general models in Chang and Keisler's Model Theory, and Gurevich obtained some different theorems in this direction for finite models. In our paper the old theorems survive in finite model theory. There are also differences between into-homomorphisms and onto-homomorphisms in the preservation theorems. We also study reduced models and minimum models. The characterization sentence of a model is given, which yields a general result: any theory T is equivalent to a set of existential-universal sentences. Some results about completeness and model completeness are also given.
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
Adachi, M.; Ito, A.; Takeuchi, W.; Yamagata, Y.
2011-12-01
Reducing emissions from deforestation and forest degradation in developing countries (REDD) is one of the most important carbon emission reduction efforts in the tropical region. Deforestation and land-use changes are human activities with a major impact on the regional carbon budget and on emissions of other greenhouse gases (CH4 and N2O). Forest carbon biomass in Southeast Asia is the largest in the Asian region; however, the area of primary forest has continuously decreased due to land-use conversion. The objective of the present study was to evaluate the carbon budget and greenhouse gas emissions induced by deforestation on Borneo Island. We used time-series satellite remote-sensing data to track the deforestation history of Borneo Island, Southeast Asia, and estimated the resulting forest carbon budget using a process-based model (VISIT: Vegetation Integrative SImulator for Trace gases). The forest/non-forest area was mapped by applying the ALOS/PALSAR-calibrated threshold value to MODIS, SPOT-VEGETATION, and NOAA-AVHRR images. The model allowed us to estimate changes in the carbon budget and greenhouse gases caused by human disturbances, including land-use conversion from primary forest to cropland (e.g., oil-palm plantation). The estimated carbon stocks, budget, and greenhouse gases were verified using field observations from previous studies at several sites on Borneo Island. Our results suggest that the southern part of Borneo Island was a large carbon source due to deforestation, although the VISIT model needs to be revised to account for tropical peatland.
Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...
Theory and modeling of active brazing.
Energy Technology Data Exchange (ETDEWEB)
van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.
2013-09-01
Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve its wetting and spreading behavior. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, the reactant and product diffusion rates, the nonequilibrium composition-dependent surface tension, and the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.
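One of the coupled subprocesses named above, reactant diffusion, can be illustrated with a minimal one-dimensional explicit finite-difference update. This is a generic diffusion sketch, not the report's model, which additionally couples reaction, rheology, and wetting:

```python
def diffuse_1d(conc, D, dx, dt, steps):
    """Explicit (FTCS) finite-difference update for 1-D diffusion of a
    dissolved additive; the endpoints are held fixed.

    Stability of the explicit scheme requires r = D*dt/dx**2 <= 0.5."""
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    c = list(conc)
    for _ in range(steps):
        # Interior points: c_i += r * (c_{i+1} - 2 c_i + c_{i-1})
        c = [c[0]] + [c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
                      for i in range(1, len(c) - 1)] + [c[-1]]
    return c
```

An initial concentration spike spreads symmetrically into its neighbors, as expected of diffusive transport.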
Directory of Open Access Journals (Sweden)
P. Martinerie
2009-01-01
The budgets of seven halogenated gases (CFC-11, CFC-12, CFC-113, CFC-114, CFC-115, CCl_{4} and SF_{6}) are studied by comparing measurements in polar firn air from two Arctic and three Antarctic sites with simulation results from two numerical models: a 2-D atmospheric chemistry model and a 1-D firn diffusion model. The first is used to calculate atmospheric concentrations from emission trends based on industrial inventories; the calculated concentration trends are used by the second to produce depth-concentration profiles in the firn. The 2-D atmospheric model is validated in the boundary layer by comparison with atmospheric station measurements, and vertically for CFC-12 by comparison with balloon and FTIR measurements. Firn air measurements provide constraints on historical atmospheric concentrations over the last century. Age distributions in the firn are discussed using a Green function approach. Finally, our results are used as input to a radiative model in order to evaluate the radiative forcing of our target gases. Multi-species and multi-site firn air studies make it possible to better constrain atmospheric trends. The low concentrations of all studied gases at the bottom of the firn, and their consistency with our model results, confirm that their natural sources are insignificant. Our results indicate that the emissions, sinks and trends of CFC-11, CFC-12, CFC-113, CFC-115 and SF_{6} are well constrained, whereas this is not the case for CFC-114 and CCl_{4}. Significant emission-dependent changes in the lifetimes of halocarbons destroyed in the stratosphere were obtained; these result from the time needed for their transport from the surface, where they are emitted, to the stratosphere, where they are destroyed. Efforts should be made to update and reduce the large uncertainties on CFC lifetimes.
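The Green function approach mentioned above treats a firn-air sample as a mixture of air of different ages: the measured concentration at a given depth is the atmospheric history weighted by the air-age distribution. A minimal sketch of that bookkeeping (all names and numbers are illustrative, not the study's 1-D diffusion model):

```python
def firn_concentration(atm_history, age_pdf):
    """Mean firn-air concentration at one depth as the convolution of the
    atmospheric history with the air-age distribution (Green function).

    atm_history[k] is the atmospheric mixing ratio k years before sampling;
    age_pdf[k] is the probability the sampled air is k years old."""
    assert abs(sum(age_pdf) - 1.0) < 1e-9, "age distribution must sum to 1"
    return sum(c * w for c, w in zip(atm_history, age_pdf))
```

A constant atmospheric history yields that same constant at every depth; a trend is smoothed by the width of the age distribution.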
A Multi-Model Comparison of Black Carbon Budgets in the Arctic Region.
Mahmood, R.; von Salzen, K.; Flanner, M.; Sand, M.; Langner, J.; Wang, H.; Huang, L.
2015-12-01
In this study we quantify modeled aerosol processes related to black carbon (BC) concentrations in the Arctic region in several general circulation models used by the Expert Group of the Arctic Monitoring and Assessment Programme (AMAP). All models simulated the observed seasonal cycle of BC concentrations in the high Canadian Arctic well; however, most models (except CanAM) underestimate the total concentrations. Transport of BC from lower latitudes is the major source for the Arctic region, where emissions are small. The models produce a similar seasonal cycle of BC transport towards the Arctic, with maximum transport in July. However, substantial differences were found among the models in simulated BC burdens and vertical distributions, with some models producing a very weak seasonal cycle while others produce stronger seasonality. The annual mean BC residence times in the models also differ markedly, with CanAM having the shortest residence times, followed by SMHI-MATCH, CESM and NorESM. There are substantial differences among the models in the relative roles of wet and dry deposition rates, which is one of the major factors causing variations in the seasonality of BC burdens in the models. Similarly, significant differences in wet deposition efficiencies among the models exist and are the leading cause of differences in simulated BC burdens. To further explore these processes, we performed several sensitivity tests in CanAM and CESM. Overall, the results indicate that scavenging of BC in convective clouds, as compared to simulations without convective BC scavenging, substantially increases the overall efficiency of BC wet deposition, which leads to low BC burdens and a more pronounced seasonal cycle.
Hypergame Theory: A Model for Conflict, Misperception, and Deception
Directory of Open Access Journals (Sweden)
Nicholas S. Kovach
2015-01-01
When dealing with conflicts, game theory and decision theory can be used to model the interactions of the decision-makers. To date, game theory and decision theory have received considerable modeling focus, while hypergame theory has not. A metagame, known as a hypergame, occurs when one player does not know or fully understand all the strategies of a game. Hypergame theory extends the advantages of game theory by allowing a player to outmaneuver an opponent and obtain a more preferred outcome with a higher utility. The ability to outmaneuver an opponent arises in a hypergame because the different views (perceptions or deceptions) of opponents are captured in the model, through the incorporation of information unknown to other players (misperception or intentional deception). The hypergame model provides solutions for complex conflicts more accurately than game-theoretic models and excels where perception or information differences exist between players. This paper explores current research in hypergame theory and presents a broad overview of the historical literature on hypergame theory.
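The core hypergame idea, players best-responding to different perceived games, can be sketched with a toy 2x2 example. All payoff matrices below are hypothetical and chosen only to make the misperception cost visible:

```python
def dominant_row(payoffs):
    """Return the row that is at least as good against every column, else None."""
    for r in (0, 1):
        if all(payoffs[r][c] >= payoffs[1 - r][c] for c in (0, 1)):
            return r
    return None

def best_col(p2_payoffs, predicted_row):
    """Player 2's best column given the row it predicts player 1 will play."""
    return max((0, 1), key=lambda c: p2_payoffs[predicted_row][c])

# Hypothetical matrices: player 2 misperceives player 1's payoffs.
p1_true = [[4, 3], [2, 1]]            # row 0 is truly dominant for player 1
p1_misperceived = [[1, 2], [3, 4]]    # to player 2, row 1 appears dominant
p2_payoffs = [[0, 5], [5, 0]]         # player 2's own payoffs

predicted = dominant_row(p1_misperceived)     # player 2 expects row 1
chosen_col = best_col(p2_payoffs, predicted)  # so it commits to column 0
actual_row = dominant_row(p1_true)            # player 1 actually plays row 0
outcome_for_p2 = p2_payoffs[actual_row][chosen_col]
```

Here player 2's misperception of player 1's game leads it to the worse of its two possible outcomes, the asymmetry that hypergame analysis captures and ordinary game theory does not.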
Information Theory: a Multifaceted Model of Information
Directory of Open Access Journals (Sweden)
Mark Burgin
2003-06-01
A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new approach, called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and sociological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are Shannon's quantity of information, the algorithmic quantity of information, and the volume of information. It is demonstrated that all other known directions of information theory may be treated within the general theory of information as particular cases.
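Shannon's quantity of information, cited above as one of the measures the general theory subsumes, can be computed directly for a discrete probability distribution:

```python
import math

def shannon_information(probs):
    """Shannon's quantity of information (entropy), in bits, of a discrete
    probability distribution. Zero-probability outcomes contribute nothing."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A fair coin carries exactly one bit; a certain outcome carries none, which is the sense in which entropy measures the information gained by observing the outcome.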
MULTI-FLEXIBLE SYSTEM DYNAMIC MODELING THEORY AND APPLICATION
Institute of Scientific and Technical Information of China (English)
仲昕; 周兵; 杨汝清
2001-01-01
The flexible-body modeling theory was demonstrated, with an example of modeling a kind of automobile front suspension as a multi-flexible system. The simulation results of the multi-flexible dynamic model match the road test data more closely than those of the multi-rigid dynamic model do. This testifies that modeling with multi-flexible body theory is both necessary and effective.
QUANTUM THEORY FOR THE BINOMIAL MODEL IN FINANCE THEORY
Institute of Scientific and Technical Information of China (English)
CHEN Zeqian
2004-01-01
In this paper, a quantum model for the binomial market in finance is proposed. We show that its risk-neutral world exhibits an intriguing structure as a disk in the unit ball of R^3, whose radius is a function of the risk-free interest rate, with two thresholds which prevent arbitrage opportunities in this quantum market. Furthermore, from the quantum mechanical point of view we re-derive the Cox-Ross-Rubinstein binomial option pricing formula by considering Maxwell-Boltzmann statistics of a system of N distinguishable particles.
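The Cox-Ross-Rubinstein formula that the paper re-derives prices a European option on a recombining binomial tree; a standard sketch of the classical formula (not the paper's quantum construction; parameter values in the usage are illustrative):

```python
import math

def crr_call_price(s0, k, r, sigma, t, n):
    """European call price on a Cox-Ross-Rubinstein recombining binomial tree.

    s0: spot price, k: strike, r: risk-free rate, sigma: volatility,
    t: time to maturity in years, n: number of tree steps."""
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))    # up factor per step
    d = 1.0 / u                            # down factor
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    # Discounted risk-neutral expectation of the terminal call payoff.
    payoff_sum = sum(
        math.comb(n, j) * q ** j * (1.0 - q) ** (n - j)
        * max(s0 * u ** j * d ** (n - j) - k, 0.0)
        for j in range(n + 1)
    )
    return math.exp(-r * t) * payoff_sum
```

As the number of steps n grows, the price converges to the continuous-time Black-Scholes value.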
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.
Dynamic Energy Budgets and Bioaccumulation: A Model for Marine Mammals and Marine Mammal Populations
2006-06-01
Summary of papers presented in the Theory and Modelling session
Lin-Liu Y.R.; Westerhof E.
2012-01-01
A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17, and one Theory and Modelling paper was included in each of the ITER ECRH and ECE sessions. Three papers were in the area of nonlinear physics, discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive; three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fl...
Matrix models, topological strings, and supersymmetric gauge theories
Dijkgraaf, Robbert; Vafa, Cumrun
2002-11-01
We show that B-model topological strings on local Calabi-Yau threefolds are large-N duals of matrix models, which in the planar limit naturally give rise to special geometry. These matrix models directly compute F-terms in an associated N=1 supersymmetric gauge theory, obtained by deforming N=2 theories by a superpotential term that can be directly identified with the potential of the matrix model. Moreover, by tuning some of the parameters of the geometry in a double scaling limit we recover (p, q) conformal minimal models coupled to 2d gravity, thereby relating non-critical string theories to type II superstrings on Calabi-Yau backgrounds.
A Quantitative Causal Model Theory of Conditional Reasoning
Fernbach, Philip M.; Erb, Christopher D.
2013-01-01
The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
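The Bezier spline curve theory covered in the CAD section has a classic algorithmic core: de Casteljau's repeated linear interpolation of the control polygon. A minimal sketch:

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] via de Casteljau's
    algorithm: repeatedly interpolate adjacent control points until one
    point remains. Works for any dimension and any number of control points."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]
```

The algorithm is numerically stable, and the curve interpolates its first and last control points by construction.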
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion with the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...
Item response theory modeling with nonignorable missing data
Pimentel, Jonald L.
2005-01-01
This thesis discusses methods to detect nonignorable missing data and methods to adjust for the bias caused by nonignorable missing data, both by introducing a model for the missing data indicator using item response theory (IRT) models.
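IRT approaches like this build on an item response function; a minimal sketch of the two-parameter logistic (2PL) model, with purely illustrative parameter values, might look like:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct response
    given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Illustrative values: an item of average difficulty (b = 0)
# answered by examinees of low, average, and high ability.
probs = [p_correct(theta, a=1.0, b=0.0) for theta in (-2.0, 0.0, 2.0)]
```

A missing-data indicator model in this spirit would add a second latent trait governing the propensity to respond, alongside the ability trait sketched here.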
Application of multidimensional item response theory models to longitudinal data
Marvelde, te Janneke M.; Glas, Cees A.W.; Van Landeghem, Georges; Van Damme, Jan
2006-01-01
The application of multidimensional item response theory (IRT) models to longitudinal educational surveys where students are repeatedly measured is discussed and exemplified. A marginal maximum likelihood (MML) method to estimate the parameters of a multidimensional generalized partial credit model
Bhatti, Asif M.; Koike, Toshio; Shrestha, Maheswor
2016-12-01
A water and energy budget-based distributed hydrological model with improved snow physics (WEB-DHM-S) was applied to elucidate the impact of climate change on mountain snow hydrology in the Shubuto River basin, Hokkaido, Japan. The simulated spatial distribution of snow cover was evaluated using the Moderate Resolution Imaging Spectroradiometer (MODIS) 8-day maximum snow-cover extent (MOD10A2) product, which revealed the model's capability for capturing the spatiotemporal variations in snow cover within the study area. Four Atmosphere Ocean General Circulation Models (AOGCMs) were selected and the SRESA1B emission scenario of the Intergovernmental Panel on Climate Change was used to describe climate predictions in the basin. All AOGCMs predict a future decrease in snowmelt contribution to total discharge of 11-22% and an average decrease in SWE of 36%, with a shift in peak SWE by 4-14 days. The shift in runoff regime is broadly consistent between the AOGCMs, with snowmelt-induced peak discharge expected to occur on average about two weeks earlier in the future hydrological year. The warming climate will drive a shift in runoff regime from a combined rainfall- and snowmelt-driven regime to one with a reduced contribution from snowmelt. The results of the study revealed that the model could be successfully applied on the basin scale to simulate river discharge and snow processes and to investigate the effect of climate change on hydrological processes. This research contributes to improving the understanding of basin hydrological responses and the pace of change associated with climate variability.
Agüera, Antonio; Collard, Marie; Jossart, Quentin; Moreau, Camille; Danis, Bruno
2015-01-01
Marine organisms in Antarctica are adapted to an extreme ecosystem including extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase of temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for the key benthic Antarctic species, the sea star Odontaster validus, using available information from literature and experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages, and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. Model parameters describe the slowed-down metabolism of long-living animals that mature slowly. O. validus has a large reserve that, matching low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developed within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O. validus allowed us to
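In DEB theory, metabolic rates are scaled across temperatures with an Arrhenius relation, which matters for a species living at stable near-zero temperatures. A minimal sketch follows; the Arrhenius temperature T_A = 8000 K is an illustrative placeholder, not the value estimated for O. validus:

```python
import math

def arrhenius_correction(T, T_ref=293.15, T_A=8000.0):
    """Arrhenius factor used in DEB theory to scale a metabolic
    rate from reference temperature T_ref to body temperature T
    (both in kelvin); T_A is an illustrative Arrhenius temperature."""
    return math.exp(T_A / T_ref - T_A / T)

# A rate measured at 20 degC, evaluated at 0 degC, roughly the
# stable temperature regime of Antarctic shelf waters.
rate_20C = 1.0
rate_0C = rate_20C * arrhenius_correction(273.15)
```

At the reference temperature the factor is exactly 1, and rates fall off smoothly toward colder temperatures, which is how a single parameter set can describe performance across thermal regimes.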
Mangiarotti, S.; Veloso, A.; Ceschia, E.; Tallec, T.; Dejoux, J. F.
2015-12-01
Croplands occupy large areas of Earth's land surface and play a key role in the terrestrial carbon cycle. Hence, it is essential to quantify and analyze the carbon fluxes from those agro-ecosystems, since they contribute to climate change and are impacted by environmental conditions. In this study we propose a regional modeling approach that combines high spatial and temporal resolution (HSTR) optical remote sensing data with a crop model and a large set of in-situ measurements for model calibration and validation. The study area is located in southwest France, and the model that we evaluate, called SAFY-CO2, is a semi-empirical model based on Monteith's light-use efficiency theory and adapted for simulating the components of the net ecosystem CO2 fluxes (NEE) and of the annual net ecosystem carbon budgets (NECB) at a daily time step. The approach is based on the assimilation of satellite-derived green area index (GAI) maps for calibrating a number of the SAFY-CO2 parameters linked to crop phenology. HSTR data from the Formosat-2 and SPOT satellites were used to produce the GAI maps. The experimental data set includes eddy covariance measurements of net CO2 fluxes from two experimental sites, partitioned into gross primary production (GPP) and ecosystem respiration (Reco). It also includes measurements of GAI, biomass and yield between 2005 and 2011, focusing on the winter wheat crop. The results showed that the SAFY-CO2 model correctly reproduced the biomass production, its dynamics, and the yield (relative errors about 24%) in contrasted climatic, environmental and management conditions. The net CO2 flux components estimated with the model were overall in agreement with the ground data, presenting good correlations (R² about 0.93 for GPP, 0.77 for Reco and 0.86 for NEE). The evaluation of the modelled NECB for the different site-years highlighted the importance of having accurate estimates of each component of the NECB. Future works aim at considering
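The Monteith light-use efficiency idea underlying SAFY-CO2 can be sketched in a few lines. The coefficients below (Beer-Lambert extinction coefficient k and the light-use efficiency value) are illustrative placeholders, not calibrated SAFY-CO2 parameters:

```python
import math

def fapar_from_gai(gai, k=0.5):
    """Beer-Lambert light interception: fraction of PAR absorbed
    by a canopy of green area index gai (k is illustrative)."""
    return 1.0 - math.exp(-k * gai)

def daily_gpp(par, fapar, lue=2.5):
    """Monteith light-use efficiency: daily gross primary production
    (g C m-2 d-1) from incident PAR (MJ m-2 d-1), the absorbed
    fraction fAPAR, and a light-use efficiency (g C MJ-1)."""
    return lue * fapar * par

# A mid-season winter wheat canopy (GAI ~ 3) on a bright day.
gpp = daily_gpp(par=10.0, fapar=fapar_from_gai(3.0))
```

Assimilating satellite-derived GAI maps, as the paper does, effectively constrains the fAPAR trajectory through the season, which is why phenology-linked parameters can be calibrated from remote sensing alone.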
Eid, E.M.; Shaltout, K.H.; Al-Sodany, Y.M.; Soetaert, K.E.R.; Jensen, K.
2010-01-01
Phragmites australis is the major component of reed stands covering some 8200 ha along the shores of Lake Burullus (Egypt). We applied a published temperate zone reed model to assess growth and cycling of C and nutrients among the various organs of P. australis in this sub-tropical lake. We aim to q
Teaching Wound Care Management: A Model for the Budget Conscious Educator
Berry, David C.
2012-01-01
For the author, the concept of wound care has always been a challenging topic to demonstrate. How to teach the concept without having a student in need of wound care or without having to spend money to buy another simulation manikin/model? The author has recently created a simulation to demonstrate and practice the cleaning, closing, and dressing…
Theory of stellar convection - II. First stellar models
Pasetto, S.; Chiosi, C.; Chiosi, E.; Cropper, M.; Weiss, A.
2016-07-01
We present here the first stellar models on the Hertzsprung-Russell diagram, in which convection is treated according to the new scale-free convection theory (SFC theory) by Pasetto et al. The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few per cent of the stellar mass using both ML theory and SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients ∇ and ∇e, and energy fluxes that are very similar to those derived from the `calibrated' ML theory for main-sequence stars. We conclude that the old scale dependent ML theory can now be replaced with a self-consistent scale-free theory able to predict correct results, as it is more physically grounded than the ML theory. Fundamentally, the SFC theory offers a deeper insight into the underlying physics than numerical simulations.
Large field inflation models from higher-dimensional gauge theories
Energy Technology Data Exchange (ETDEWEB)
Furuuchi, Kazuyuki [Manipal Centre for Natural Sciences, Manipal University, Manipal, Karnataka 576104 (India); Koyama, Yoji [Department of Physics, National Tsing-Hua University, Hsinchu 30013, Taiwan R.O.C. (China)
2015-02-23
Motivated by the recent detection of B-mode polarization of the CMB by BICEP2, which is possibly of primordial origin, we study large field inflation models which can be obtained from higher-dimensional gauge theories. The constraints from CMB observations on the gauge theory parameters are given, and their naturalness is discussed. Among the models analyzed, Dante’s Inferno model turns out to be the most preferred model in this framework.
Directory of Open Access Journals (Sweden)
Ensor Tim
2012-08-01
Full Text Available Abstract Background Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low and middle income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. Methods A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level. It is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province specific data on distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Findings Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population
Directory of Open Access Journals (Sweden)
A. Lauer
2007-10-01
Full Text Available International shipping contributes significantly to the fuel consumption of all transport related activities. Specific emissions of pollutants such as sulfur dioxide (SO_{2}) per kg of fuel emitted are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing the solar and thermal radiation and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to study the climate impacts of international shipping. The simulations show that emissions from ships significantly increase the cloud droplet number concentration of low marine water clouds by up to 5% to 30% depending on the ship emission inventory and the geographic region. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to cloud optical thickness increases of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% of the total sulfate burden and 0.4% to 1.4% of the total black carbon burden in the year 2000 on the global mean. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10% depending on the emission inventory. Changes in aerosol optical thickness caused by shipping induced modification of aerosol particle number concentration and chemical composition lead to a change in the shortwave radiation budget at the top of the
Atmospheric water budget over the western Himalayas in a regional climate model
Indian Academy of Sciences (India)
A P Dimri
2012-08-01
During winter months (December, January, February – DJF), the western Himalayas (WH) receive precipitation from eastward moving extratropical cyclones, called western disturbances (WDs) in Indian parlance. Winter precipitation–moisture convergence–evaporation (P–C–E) cycle is analyzed for a period of 22 years (1981–2002: 1980(D)–1981(J, F) to 2001(D)–2002(J, F)) with observed and modelled (RegCM3) climatological estimates over WH. Remarkable model skills have been observed in depicting the hydrological cycle over WH. Although precipitation biases exist, similar spatial precipitation with well marked two maxima is simulated by the model. As season advances, temporal distribution shows higher precipitation in simulation than the observed. However, P–C–E cycle shows similar peaks of moisture convergence and evaporation in daily climatologies though with varying maxima/minima. In the first half of winter, evaporation over WH is mainly driven by ground surface and 2 m air temperature. Lowest temperatures during mid-winter correspond to lowest evaporation to precipitation ratio as well.
An Application of Rough Set Theory to Modelling and Utilising Data Warehouses
Institute of Scientific and Technical Information of China (English)
DENG Ming-rong; YANG Jian-bo; PAN Yun-he
2001-01-01
A data warehouse often accommodates enormous summary information in various granularities and is mainly used to support on-line analytical processing. Ideally all detailed data should be accessible by residing in some legacy systems or on-line transaction processing systems. In many cases, however, data sources in computers are also kinds of summary data due to technological problems or budget limits and also because different aggregation hierarchies may need to be used among various transaction systems. In such circumstances, it is necessary to investigate how to design dimensions, which play a major role in dimensional model for a data warehouse, and how to estimate summary information, which is not stored in the data warehouse. In this paper, the rough set theory is applied to support the dimension design and information estimation.
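The lower and upper approximations at the heart of rough set theory can be sketched as follows; the data are a toy example, not the paper's actual dimension-design procedure:

```python
def approximations(partition, target):
    """Rough-set lower and upper approximations of a target set
    with respect to a partition into indiscernibility classes."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:        # block entirely inside the concept
            lower |= block
        if block & target:         # block overlaps the concept
            upper |= block
    return lower, upper

# Toy example: 6 records partitioned by an attribute into 3 classes.
blocks = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}                      # concept to approximate
low, up = approximations(blocks, X)
# low == {1, 2}; up == {1, 2, 3, 4}
```

The gap between the two approximations (here record 4, indiscernible from record 3) is exactly the kind of uncertainty that arises when estimating summary information not stored at the granularity the query requires.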
Directory of Open Access Journals (Sweden)
D. B. Millet
2010-04-01
Full Text Available We construct a global atmospheric budget for acetaldehyde using a 3-D model of atmospheric chemistry (GEOS-Chem), and use an ensemble of observations to evaluate present understanding of its sources and sinks. Hydrocarbon oxidation provides the largest acetaldehyde source in the model (128 Tg a^{−1}), a factor of 4 greater than the previous estimate, with alkanes, alkenes, and ethanol the main precursors. There is also a minor source from isoprene oxidation. We use an updated chemical mechanism for GEOS-Chem, and photochemical acetaldehyde yields are consistent with the Master Chemical Mechanism. We present a new approach to quantifying the acetaldehyde air-sea flux based on the global distribution of light absorption due to colored dissolved organic matter (CDOM) derived from satellite ocean color observations. The resulting net ocean emission is 57 Tg a^{−1}, the second largest global source of acetaldehyde. A key uncertainty is the acetaldehyde turnover time in the ocean mixed layer, with quantitative model evaluation over the ocean complicated by known measurement artifacts in clean air. Simulated concentrations in surface air over the ocean generally agree well with aircraft measurements, though the model tends to overestimate the vertical gradient. PAN:NO_{x} ratios are well-simulated in the marine boundary layer, providing some support for the modeled ocean source. We introduce the Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1) for acetaldehyde and ethanol and use it to quantify their net flux from living terrestrial plants. Including emissions from decaying plants, the total direct acetaldehyde source from the land biosphere is 23 Tg a^{−1}. Other terrestrial acetaldehyde sources include biomass burning (3 Tg a^{−1}) and anthropogenic emissions (2 Tg a^{−1}). Simulated concentrations in the continental boundary layer are generally unbiased and capture the spatial
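A budget like this implies a steady-state atmospheric lifetime via burden divided by total loss; a minimal sketch, using a hypothetical burden since the abstract quotes sources rather than the burden:

```python
def lifetime_days(burden_tg, total_source_tg_per_yr):
    """Steady-state atmospheric lifetime implied by a global budget:
    lifetime = burden / total loss (= total source at steady state)."""
    return 365.0 * burden_tg / total_source_tg_per_yr

# Hypothetical 0.3 Tg burden against the ~213 Tg/yr of sources
# quoted in the abstract (128 + 57 + 23 + 3 + 2): well under a day.
tau = lifetime_days(0.3, 213.0)
```

Such a short lifetime is why surface concentrations track local sources closely, making the regional evaluation against aircraft and boundary-layer measurements meaningful.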
Goeckede, M.; Michalak, A. M.; Vickers, D.; Turner, D.; Law, B.
2009-04-01
The study presented is embedded within the NACP (North American Carbon Program) West Coast project ORCA2, which aims at determining the regional carbon balance of the US states Oregon, California and Washington. Our work specifically focuses on the effect of disturbance history and climate variability, aiming at improving our understanding of the effects of e.g. drought stress and stand age on carbon sources and sinks in complex terrain with fine-scale variability in land cover types. The ORCA2 atmospheric inverse modeling approach has been set up to capture flux variability on the regional scale at high temporal and spatial resolution. Atmospheric transport is simulated by coupling the mesoscale model WRF (Weather Research and Forecast) with the STILT (Stochastic Time Inverted Lagrangian Transport) footprint model. This setup allows identifying sources and sinks that influence atmospheric observations with highly resolved mass transport fields and realistic turbulent mixing. Terrestrial biosphere carbon fluxes are simulated at spatial resolutions of up to 1 km and subdaily timesteps, considering effects of ecoregion, land cover type and disturbance regime on the carbon budgets. Our approach assimilates high-precision atmospheric CO2 concentration measurements and eddy-covariance data from several sites throughout the model domain, as well as high-resolution remote sensing products (e.g. LandSat, MODIS) and interpolated surface meteorology (DayMet, SOGS, PRISM). We present top-down modeling results that have been optimized using Bayesian inversion, reflecting the information on regional scale carbon processes provided by the network of high-precision CO2 observations. We address the level of detail (e.g. spatial and temporal resolution) that can be resolved by top-down modeling on the regional scale, given the uncertainties introduced by various sources of model-data mismatch. Our results demonstrate the importance of accurate modeling of carbon-water coupling, with the
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
Full Text Available We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
General autocatalytic theory and simple model of financial markets
Thuy Anh, Chu; Lan, Nguyen Tri; Viet, Nguyen Ai
2015-06-01
The concept of autocatalytic theory has become a powerful tool in understanding evolutionary processes in complex systems. A generalization of autocatalytic theory is obtained by taking the initial element to be a distribution rather than a constant value as in the traditional theory. This initial condition implies that the final element may follow a distribution too. A simple physics model for financial markets is proposed using this general autocatalytic theory. Some general behaviours of the evolution process and the risk moment of a financial market are also investigated in the framework of this simple model.
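The generalization described, replacing a constant initial element with a distribution, can be illustrated with a Monte Carlo sketch of a discrete logistic (autocatalytic) growth law. All rates and the initial distribution here are illustrative assumptions, not the paper's model:

```python
import random

def autocatalytic_growth(x0, r=0.5, K=1.0, steps=50, dt=0.1):
    """Discrete-time logistic (autocatalytic) growth from x0."""
    x = x0
    for _ in range(steps):
        x += r * x * (1.0 - x / K) * dt
    return x

random.seed(0)
# Traditional theory: a single constant initial value.
single = autocatalytic_growth(0.05)
# Generalized theory: the initial element follows a distribution,
# so the final element is a distribution too (Monte Carlo sample).
finals = [autocatalytic_growth(random.uniform(0.01, 0.1))
          for _ in range(1000)]
spread = max(finals) - min(finals)
```

The nonzero spread of the final values is the point of the generalization: uncertainty in the initial condition propagates into a distribution of outcomes, from which a risk moment can be computed.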
Theories, models and urban realities. From New York to Kathmandu
Directory of Open Access Journals (Sweden)
Román Rodríguez González
2004-12-01
Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.
Theories, models and urban realities. From New York to Kathmandu
Directory of Open Access Journals (Sweden)
José Somoza Medina
2004-01-01
Full Text Available At the beginning of the 21st century, there are various social theories that speak of global changes in the history of human civilization. Urban models have been through obvious changes throughout the last century according to the important transformations that are proposed by previous general theories. Nevertheless, global diversity contradicts the generalization of these theories and models. From our own simple observations and reflections we arrive at conclusions that distance themselves from the prevailing theory of our civilized world. New York, Delhi, Salvador de Bahia, Bruges, Paris, Cartagena de Indias or Kathmandu still have more internal differences than similarities.
Institute of Scientific and Technical Information of China (English)
李秀英
2014-01-01
A comprehensive budget is the budget arrangement an enterprise makes for its operating, investing, and financing activities over a given period. As an all-dimensional, whole-process, all-staff mode of budget preparation and implementation, the comprehensive budget relies on its integrated management functions of planning, coordination, control, motivation, and evaluation to integrate and optimize the allocation of enterprise resources and improve operating efficiency, serving as an important lever for realizing the enterprise's development strategy. The comprehensive budget is prepared on the basis of the enterprise's development strategy: it quantifies the strategy in concrete terms, is compiled level by level and implemented layer by layer, and takes the fulfillment of budget indicators as the basis for performance appraisal.
Modeling Water and Carbon Budgets in Current and Future Agricultural Land Use
Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, R.; Jacob, R.
2008-12-01
Biofuels are a key component of the renewable energy mix proposed as a substitute for fossil fuels. Biofuels are suggested to be both economical and to have potential for reducing atmospheric emissions of carbon from the transportation sector, by building up soil carbon levels when planted on lands where these levels have been reduced by intensive tillage. The purpose of this research is to develop a carbon-nitrogen based crop module (CNC) for the community land model (CLM) and to improve the characterization of below and above ground carbon sequestration for bioenergy crops. The CNC simulates planting, growing, maturing and harvesting stages for three major crops: maize, soybean and wheat. In addition, a dynamic root module is implemented to simulate fine root distribution and development based on relative availability of soil water and nitrogen in the root zone. The coupled CLM-CNC model is used to study crop yields, geographic locations for bioenergy crop production and soil carbon changes. Bioenergy crop cultivation is based on current crop cultivation and a future land use change dataset. Soil carbon change has been simulated based on carbon input to the soil from leaves, stems and roots, and carbon emission from soil carbon decomposition. Simulated water and carbon fluxes have been compared with field observations and soil carbon content has been examined under different harvest practices.
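The soil carbon balance described, litter input against decomposition loss, can be sketched with a single first-order pool; the pool size, input rate, and decay constant below are illustrative, not CLM-CNC values:

```python
def soil_carbon_step(stock, litter_in, k=0.02):
    """One annual update of a single first-order soil carbon pool:
    decomposition loss k*stock balanced by litter input from
    leaves, stems, and roots (all values illustrative, t C/ha)."""
    return stock + litter_in - k * stock

# A tillage-depleted soil rebuilding carbon under constant input.
stock = 30.0
for year in range(100):
    stock = soil_carbon_step(stock, litter_in=1.0)
# The pool relaxes toward its equilibrium litter_in / k = 50 t C/ha.
```

Harvest practices enter such a model through the litter input term: removing more residue at harvest lowers litter_in and therefore the equilibrium soil carbon stock.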
Error budget analysis of SCIAMACHY limb ozone profile retrievals using the SCIATRAN model
Directory of Open Access Journals (Sweden)
N. Rahpoe
2013-10-01
Full Text Available A comprehensive error characterization of SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric CHartographY limb ozone profiles has been established based upon SCIATRAN transfer model simulations. The study was carried out in order to evaluate the possible impact of parameter uncertainties, e.g. in albedo, stratospheric aerosol optical extinction, temperature, pressure, pointing, and ozone absorption cross section on the limb ozone retrieval. Together with the a posteriori covariance matrix available from the retrieval, total random and systematic errors are defined for SCIAMACHY ozone profiles. Main error sources are the pointing errors, errors in the knowledge of stratospheric aerosol parameters, and cloud interference. Systematic errors are of the order of 7%, while the random error amounts to 10–15% for most of the stratosphere. These numbers can be used for the interpretation of instrument intercomparison and validation of the SCIAMACHY V 2.5 limb ozone profiles in a rigorous manner.
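Independent error components are rolled into a total error budget by combining them in quadrature; applying this to the quoted figures:

```python
import math

def total_error(systematic_pct, random_pct):
    """Combine independent error components in quadrature, as when
    rolling individual terms into a total retrieval error budget."""
    return math.sqrt(systematic_pct ** 2 + random_pct ** 2)

# Figures quoted for the SCIAMACHY limb ozone profiles: ~7%
# systematic and 10-15% random over most of the stratosphere.
total_low = total_error(7.0, 10.0)     # ~12.2%
total_high = total_error(7.0, 15.0)    # ~16.6%
```

Quadrature addition assumes the components are uncorrelated; correlated terms (e.g. pointing and aerosol errors acting on the same altitude range) would need the full covariance treatment the retrieval's a posteriori matrix provides.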
Suppressing breakers with polar oil films: Using an epic sea rescue to model wave energy budgets
Cox, Charles S.; Zhang, Xin; Duda, Timothy F.
2017-02-01
Oil has been used to still stormy seas for centuries, but the mechanisms are poorly understood. Here we examine the processes by using quantitative information from a remarkable 1883 sea rescue where oil was used to reduce large breakers during a storm. Modeling of the oil film's extent and waves under the film suggests that large breakers were suppressed by a reduction of wind energy input. Modification of surface roughness by the film is hypothesized to alter the wind profile above the sea and the energy flow. The results are central to understanding air-sea momentum exchange, including its role in such processes as cyclone growth and storm surge, although they address only one aspect of the complex problem of wind interaction with the ocean surface.
FY 1996 Congressional budget request: Budget highlights
Energy Technology Data Exchange (ETDEWEB)
1995-02-01
The FY 1996 budget presentation is organized by the Department's major business lines. An accompanying chart displays the request for new budget authority. The report compares the budget request for FY 1996 with the appropriated FY 1995 funding levels displayed on a comparable basis. The FY 1996 budget represents the first year of a five year plan in which the Department will reduce its spending by $15.8 billion in budget authority and by $14.1 billion in outlays. FY 1996 is a transition year as the Department embarks on its multiyear effort to do more with less. The Budget Highlights are presented by business line; however, the fifth business line, Economic Productivity, which is described in the Policy Overview section, cuts across multiple organizational missions, funding levels and activities and is therefore included in the discussion of the other four business lines.
Nieradzik, L. P.; Haverd, V. E.; Briggs, P.; Meyer, C. P.; Canadell, J.
2015-12-01
Fires play a major role in the carbon cycle and the development of global vegetation, especially on the continent of Australia, where vegetation is prone to frequent fire occurrences and where regional composition and stand-age distribution are regulated by fire. Furthermore, the probable changes of fire behaviour under a changing climate are still poorly understood and require further investigation. In this presentation we introduce the fire model BLAZE (BLAZe induced land-atmosphere flux Estimator), designed for a novel approach to simulate fire frequencies, fire intensities, fire-related fluxes and the responses in vegetation. Fire frequencies are prescribed using SIMFIRE (Knorr et al., 2014) or GFED3 (e.g. Giglio et al., 2013). Fire-Line-Intensity (FLI) is computed from meteorological information and fuel loads, which are state variables within the C-cycle component of CABLE (Community Atmosphere-Biosphere-Land Exchange model). This FLI is used as an input to the tree-demography model POP (Population-Order-Physiology; Haverd et al., 2014). Within POP the fire mortality depends on FLI and tree height distribution. Intensity-dependent combustion factors (CF) are then generated for and applied to live and litter carbon pools as well as the transfers from live pools to litter caused by fire. Thus, both fire and stand characteristics are taken into account, which has a legacy effect on future events. Gross C-CO2 emissions from Australian wild fires are larger than Australian territorial fossil fuel emissions. However, the net effect of fire on the Australian terrestrial carbon budget is unknown. We address this by applying the newly-developed fire module, integrated within the CABLE land surface model, and optimised for the Australian region, to a reassessment of the Australian Terrestrial Carbon Budget.
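The abstract does not spell out BLAZE's FLI calculation; the standard reference quantity is Byram's (1959) fire-line intensity, sketched here with hypothetical parameter values:

```python
def byram_fli(heat_yield_kj_per_kg, fuel_load_kg_m2, spread_rate_m_s):
    """Byram (1959) fire-line intensity in kW/m: heat yield x fuel consumed
    per unit area x rate of spread. A stand-in sketch; BLAZE's actual FLI
    formulation is not given in the abstract."""
    return heat_yield_kj_per_kg * fuel_load_kg_m2 * spread_rate_m_s

# Hypothetical inputs: 18,000 kJ/kg heat yield, 1.2 kg/m^2 fuel, 0.05 m/s spread
print(byram_fli(18000, 1.2, 0.05))  # 1080.0 kW/m
```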
Directory of Open Access Journals (Sweden)
M. Adachi
2011-03-01
Full Text Available More reliable estimates of carbon (C) stocks within forest ecosystems and of C emissions induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. Observations and VISIT simulations indicated that the primary forests had high photosynthetic uptake: gross primary production was estimated at 31.5–35.5 t C ha^{−1} yr^{−1}. In the VISIT simulation, the rainforest had a higher total C stock (plant biomass and soil organic matter; 301.5 t C ha^{−1}) than the seasonal dry forest (266.5 t C ha^{−1}) in 2008. The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversion on the C budget. Sensitivity analysis implied that the ratio of remaining residual debris is a key parameter determining the soil C budget after deforestation events. The C stock of the oil palm plantation was about 46% of the rainforest's C stock 30 yr after initiation of the plantation, when the ratio of remaining residual debris was assumed to be about 33%. These results show that adequate forest management is important for reducing C emissions from soil, and that the C budget of each ecosystem must be evaluated over the long term using both model simulations and observations.
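The sensitivity of the post-deforestation soil C budget to the residual-debris ratio can be sketched with a first-order decay pool. The decay constant and functional form are assumptions made for illustration; only the ~33% debris ratio and the 301.5 t C ha^{-1} stock come from the abstract:

```python
import math

def residual_debris_c(initial_stock, debris_ratio, k_decay, years):
    """Carbon remaining in residual debris after deforestation, assuming
    a single first-order decay pool. k_decay is a hypothetical parameter;
    the abstract reports only that the debris ratio was a key sensitivity."""
    return initial_stock * debris_ratio * math.exp(-k_decay * years)

# Rainforest stock of 301.5 t C/ha from the abstract; assumed k = 0.1 /yr
remaining = residual_debris_c(301.5, 0.33, 0.1, 30)
print(round(remaining, 1))
```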
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2017-01-23
Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve the psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced.
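For the simplest one-facet person-by-item design, the generalizability coefficient the article discusses reduces to a ratio of variance components. A minimal sketch with hypothetical variance components (not values from the article):

```python
def g_coefficient(var_person, var_interaction, n_items):
    """Generalizability coefficient (E-rho^2) for a one-facet p x i design:
    universe-score variance over itself plus relative error variance.
    Variance components are assumed inputs, e.g. from a random-effects ANOVA."""
    rel_error = var_interaction / n_items
    return var_person / (var_person + rel_error)

# Hypothetical components: sigma^2_p = 4.0, sigma^2_pi,e = 6.0, 10 items
print(round(g_coefficient(4.0, 6.0, 10), 3))  # 0.87
```

Adding items shrinks the relative error term, which is exactly the "changes made to measurement procedures" scenario the abstract mentions.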
The mathematical theory of reduced MHD models for fusion plasmas
Guillard, Hervé
2015-01-01
The derivation of reduced MHD models for fusion plasmas is here formulated as a special instance of the general theory of singular limits of hyperbolic systems of PDEs with a large operator. This formulation makes it possible to use the general results of this theory and to prove rigorously that reduced MHD models are valid approximations of the full MHD equations. In particular, it is proven that the solutions of the full MHD system converge to the solutions of an appropriate reduced model.
Modeling Carbon and Water Budgets in the Lushi Basin with Biome-BGC
Institute of Scientific and Technical Information of China (English)
Dong Wenjuan; Qi Ye; Li Huimin; Zhou Dajie; Shi Duanhua; Sun Liying
2005-01-01
In this article, annual evapotranspiration (ET) and net primary productivity (NPP) of four types of vegetation were estimated for the Lushi basin, a subbasin of the Yellow River in China. The four vegetation types are deciduous broadleaf forest, evergreen needleleaf forest, dwarf shrub and grass. Biome-BGC, a biogeochemical process model, was used to calculate annual ET and NPP for each vegetation type in the study area from 1954 to 2000. Forty-seven years of daily microclimate data recorded by the Lushi meteorological station were extrapolated to cover the basin using MT-CLIM, a mountain microclimate simulator. The output files of MT-CLIM were used to feed Biome-BGC. We used average ecophysiological values for each vegetation type, supplied by the Numerical Terradynamic Simulation Group (NTSG) at the University of Montana, as the input ecophysiological constants file. The estimates of daily NPP in early July and of annual ET for these four biome groups were compared with field measurements and other studies. Measured daily gross primary production (GPP) of evergreen needleleaf forest was very close to the Biome-BGC output, but measurements for broadleaf forest and dwarf shrub were much smaller than the simulation results. Simulated annual ET and NPP correlated significantly with precipitation, indicating that precipitation is the major environmental factor affecting ET and NPP in the study area. Precipitation is also the key climatic factor for interannual ET and NPP variations.
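The reported precipitation-NPP relationship rests on a simple correlation of annual series; a sketch with hypothetical data (not the Lushi series):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient, as used to relate annual
    precipitation to simulated ET/NPP. Data below are hypothetical."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

precip = [420, 510, 610, 550, 480]   # mm/yr (hypothetical)
npp    = [310, 360, 450, 410, 340]   # g C/m^2/yr (hypothetical)
print(round(pearson_r(precip, npp), 2))  # 0.99: strong positive association
```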
Secondary flow structure in a model curved artery: 3D morphology and circulation budget analysis
Bulusu, Kartik V.; Plesniak, Michael W.
2015-11-01
In this study, we examined the rate of change of circulation within control regions encompassing the large-scale vortical structures associated with secondary flows, i.e. deformed Dean-, Lyne- and Wall-type (D-L-W) vortices at planar cross-sections in a 180° curved artery model (curvature ratio, 1/7). Magnetic resonance velocimetry (MRV) and particle image velocimetry (PIV) experiments were performed independently, under the same physiological inflow conditions (Womersley number, 4.2) and using Newtonian blood-analog fluids. The MRV-technique performed at Stanford University produced phase-averaged, three-dimensional velocity fields. Secondary flow field comparisons of MRV-data to PIV-data at various cross-sectional planes and inflow phases were made. A wavelet-decomposition-based approach was implemented to characterize various secondary flow morphologies. We hypothesize that the persistence and decay of arterial secondary flow vortices is intrinsically related to the influence of the out-of-plane flow, tilting, in-plane convection and diffusion-related factors within the control regions. Evaluation of these factors will elucidate secondary flow structures in arterial hemodynamics. Supported by the National Science Foundation under Grant Number CBET-0828903, and GW Center for Biomimetics and Bioinspired Engineering (COBRE). The MRV data were acquired at Stanford University in collaboration with Christopher Elkins and John Eaton.
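A circulation budget of the kind described can be written in generic notation (a sketch based on the standard vorticity-transport decomposition for incompressible flow; symbols are not necessarily the authors' exact formulation):

```latex
% Circulation over a planar cross-sectional control region R,
% with omega_z the out-of-plane vorticity and w the out-of-plane velocity
\Gamma(t) = \int_R \omega_z \, dA, \qquad
\frac{d\Gamma}{dt} = \int_R \Big[
  \underbrace{-(\mathbf{u}\cdot\nabla)\,\omega_z}_{\text{in- and out-of-plane convection}}
  + \underbrace{(\boldsymbol{\omega}\cdot\nabla)\, w}_{\text{tilting/stretching}}
  + \underbrace{\nu\,\nabla^{2}\omega_z}_{\text{diffusion}}
\Big]\, dA
```

The three bracketed groups correspond to the out-of-plane flow, tilting, in-plane convection and diffusion factors hypothesized in the abstract to control the persistence and decay of the D-L-W vortices.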
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Gutzwiller variational theory for the Hubbard model with attractive interaction.
Bünemann, Jörg; Gebhard, Florian; Radnóczi, Katalin; Fazekas, Patrik
2005-06-29
We investigate the electronic and superconducting properties of a negative-U Hubbard model. For this purpose we evaluate a recently introduced variational theory based on Gutzwiller-correlated BCS wavefunctions. We find significant differences between our approach and standard BCS theory, especially for the superconducting gap. For small values of |U|, we derive analytical expressions for the order parameter and the superconducting gap which we compare to exact results from perturbation theory.
Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models
2015-01-01
We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bos...
Geometry model construction in infrared image theory simulation of buildings
Institute of Scientific and Technical Information of China (English)
谢鸣; 李玉秀; 徐辉; 谈和平
2004-01-01
Geometric model construction is the basis of infrared image theory simulation. Taking the construction of the geometric model of one building in Harbin as an example, this paper analyzes the theoretical groundings of simplification and principles of geometric model construction of buildings. It then discusses some particular treatment methods in calculating the radiation transfer coefficient in geometric model construction using the Monte Carlo Method.
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the 6 social marketing benchmarks criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
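The trained-model-versus-interaction comparison can be sketched with a first-order Markov chain and an information-theoretic surprise measure. The action labels and the surprisal metric below are illustrative assumptions, not the paper's exact formulation:

```python
import math
from collections import defaultdict

def train_markov(actions):
    """First-order Markov chain over discrete player actions: returns
    P(next | current) transition probabilities. A sketch of a
    discrete-time, discrete-space model; action labels are made up."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(actions, actions[1:]):
        counts[cur][nxt] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

def surprisal(model, actions):
    """Mean information-theoretic surprise (bits) of an action stream under
    the trained model; lower values indicate more routinized play."""
    bits = [-math.log2(model[c][n]) for c, n in zip(actions, actions[1:])]
    return sum(bits) / len(bits)

trace = list("ABABABABAB")      # a perfectly routinized sequence
model = train_markov(trace)
print(surprisal(model, trace))  # 0.0 bits: fully predictable
```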
Theory, modeling, and simulation annual report, 1992
Energy Technology Data Exchange (ETDEWEB)
1993-05-01
This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.
The danger model: questioning an unconvincing theory.
Józefowski, Szczepan
2016-02-01
Janeway's pattern recognition theory holds that the immune system detects infection through a limited number of so-called pattern recognition receptors (PRRs). These receptors bind specific chemical compounds expressed by entire groups of related pathogens, but not by host cells (pathogen-associated molecular patterns, PAMPs). In contrast, Matzinger's danger hypothesis postulates that products released from stressed or damaged cells have a more important role in the activation of the immune system than the recognition of nonself. These products, named by analogy to PAMPs as danger-associated molecular patterns (DAMPs), are proposed to act through the same receptors (PRRs) as PAMPs and, consequently, to stimulate largely similar responses. Herein, I review direct and indirect evidence that contradicts the widely accepted danger theory, and suggest that it may be false.
Theories and models of globalization ethicizing
Directory of Open Access Journals (Sweden)
Dritan Abazović
2016-05-01
Full Text Available Globalization as a phenomenon is under the magnifying glass of many philosophical discussions and theoretical deliberations. While most theorists deal with issues that are predominantly of economic or political character, this article has a different logic. The article presents six theories, each of which in its own way explains the need to move toward ethicizing globalization. Globalization is a process that affects all, and as such it has become inevitable, but it is up to the people to determine its course and make it either functional or uncontrolled. The survival and development of any society is measured primarily by the quality of its moral and ethical foundation. Therefore, it is clear that global society can survive and be functional only if it finds a minimum consensus on ethical norms or, as said in theory, if it establishes an ethical system on which it can be built and developed.
Measurement Models for Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-01-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...
[Models of economic theory of population growth].
Von Zameck, W
1987-01-01
"The economic theory of population growth applies the opportunity cost approach to the fertility decision. Variations and differentials in fertility are caused by the available resources and relative prices or by the relative production costs of child services. Pure changes in real income raise the demand for children or the total amount spent on children. If relative prices or production costs and real income are affected together the effect on fertility requires separate consideration." (SUMMARY IN ENG)
A Model of the Economic Theory of Regulation for Undergraduates.
Wilson, Brooks
1995-01-01
Presents a model of the economic theory of regulation and recommends its use in undergraduate economics classes. Describes the use of computer-assisted instruction to teach the theory. Maintains that the approach enables students to gain access to graphs and tables that they produce themselves. (CFR)
A continuum theory for modeling the dynamics of crystalline materials.
Xiong, Liming; Chen, Youping; Lee, James D
2009-02-01
This paper introduces a multiscale field theory for modeling and simulation of the dynamics of crystalline materials. The atomistic formulation of the multiscale field theory is briefly introduced and its applicability discussed. A few application examples are presented, including phonon dispersion relations of the ferroelectric material BiScO3 and of an MgO nanodot under compression.
Resource management model based on budget mechanism
Institute of Scientific and Technical Information of China (English)
罗红兵; 王伟; 张晓霞; 武林平
2011-01-01
There is a wide gap between existing resource management techniques and actual demands, such as ensuring fairness and quality-of-service (QoS) metrics. Based on principles of economics, a resource management model named BB-RAM is presented, which uses a budget mechanism to implement macro-control over computing resource allocation, ultimately achieving optimal resource use and guaranteed job QoS. Simulations based on real job traces show that parallel job scheduling under BB-RAM outperforms traditional strategies on all metrics, including QoS indicators such as deadline-miss rate and benefit value, as well as traditional measures such as average turnaround time and slowdown.
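The abstract does not describe BB-RAM's allocation rule; a hypothetical sketch of budget-driven macro-control is to rank runnable jobs by remaining budget per requested resource (all names and numbers below are invented for illustration):

```python
def schedule(jobs, free_cpus):
    """Start runnable jobs in order of budget per requested CPU.
    A hypothetical greedy sketch of budget-based resource allocation;
    the actual BB-RAM policy is not specified in the abstract."""
    order = sorted(jobs, key=lambda j: j["budget"] / j["cpus"], reverse=True)
    started = []
    for job in order:
        if job["cpus"] <= free_cpus:
            free_cpus -= job["cpus"]
            started.append(job["name"])
    return started

jobs = [{"name": "A", "budget": 100, "cpus": 10},
        {"name": "B", "budget": 90,  "cpus": 5},
        {"name": "C", "budget": 30,  "cpus": 2}]
print(schedule(jobs, 12))  # B (18/cpu) then C (15/cpu); A (10/cpu) does not fit
```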
Halyo, Nesim; Choi, Sang H.; Chrisman, Dan A., Jr.; Samms, Richard W.
1987-01-01
Dynamic models and computer simulations were developed for the radiometric sensors utilized in the Earth Radiation Budget Experiment (ERBE). The models were developed to understand performance, improve measurement accuracy by updating model parameters and provide the constants needed for the count conversion algorithms. Model simulations were compared with the sensor's actual responses demonstrated in the ground and inflight calibrations. The models consider thermal and radiative exchange effects, surface specularity, spectral dependence of a filter, radiative interactions among an enclosure's nodes, partial specular and diffuse enclosure surface characteristics and steady-state and transient sensor responses. Relatively few sensor nodes were chosen for the models since there is an accuracy tradeoff between increasing the number of nodes and approximating parameters such as the sensor's size, material properties, geometry, and enclosure surface characteristics. Given that the temperature gradients within a node and between nodes are small enough, approximating with only a few nodes does not jeopardize the accuracy required to perform the parameter estimates and error analyses.
Modeling Multivariate Volatility Processes: Theory and Evidence
Directory of Open Access Journals (Sweden)
Jelena Z. Minovic
2009-05-01
Full Text Available This article presents theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes, surveying the model specifications and estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC; Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui (2002) and Engle (2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The estimation methods used are maximum log-likelihood (for the BEKK and DVEC models) and a two-step approach (for the CCC model).
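The CCC model's second step reduces to estimating a constant correlation matrix from standardized residuals. A minimal sketch with simulated residuals standing in for the output of the univariate GARCH step (the 0.6 correlation and series length are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
# Standardized residuals z_t = D_t^{-1} eps_t from two univariate GARCH fits
# (here simply simulated; a real CCC fit would estimate the GARCH step first).
z = rng.standard_normal((1000, 2))
z[:, 1] = 0.6 * z[:, 0] + (1 - 0.6**2) ** 0.5 * z[:, 1]

# CCC (Bollerslev, 1990): the conditional correlation matrix is the constant
# sample correlation of the standardized residuals.
R = np.corrcoef(z, rowvar=False)
print(R.shape)  # (2, 2); the off-diagonal entry estimates the true 0.6
```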
Nieradzik, L. P.; Haverd, V. E.; Briggs, P.; Meyer, C. P.; Canadell, J.
2014-12-01
Fires play a major role in the carbon cycle and the development of global vegetation, especially on the continent of Australia, where vegetation is prone to frequent fire occurrences and where regional composition and stand-age distribution are regulated by fire. Furthermore, the probable changes of fire behaviour under a changing climate are still poorly understood and require further investigation. In this presentation we introduce a novel approach to simulate fire frequencies, fire intensities and the responses in vegetation. Fire frequencies are prescribed using SIMFIRE (Knorr et al., 2014) or GFED3 (e.g. Giglio et al., 2013). Fire-Line-Intensity (FLI) is computed from meteorological information and fuel loads, which are state variables within the C-cycle component of CABLE. This FLI is used as an input to the tree-demography model POP (Population-Order-Physiology; Haverd et al., 2014). Within POP the fire mortality depends on FLI and tree height distribution. Intensity-dependent combustion factors (CF) are then generated for and applied to live and litter carbon pools as well as the transfers from live pools to litter caused by fire. Thus, both fire and stand characteristics are taken into account, which has a legacy effect on future events. Gross C-CO2 emissions from Australian wild fires are larger than Australian territorial fossil fuel emissions. However, the net effect of fire on the Australian terrestrial carbon budget is unknown. We address this by applying the newly-developed fire module, integrated within the CABLE land surface model, and optimised for the Australian region, to a reassessment of the Australian Terrestrial Carbon Budget.
A Model of PCF in Guarded Type Theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of them. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
Classical conformality in the Standard Model from Coleman's theory
Kawana, Kiyoharu
2016-01-01
Classical conformality is one of the possible candidates for explaining the gauge hierarchy of the Standard Model. We show that it is naturally obtained from Coleman's theory of baby universes.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
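The cascade case study can be illustrated with a linear state-space model of a two-gene chain. The rate constants below are hypothetical, chosen only to show the time-domain method the abstract names:

```python
import numpy as np

# Two-gene cascade as a linear state-space model dx/dt = A x + B u:
# gene 1 is induced by the input u; gene 2 is induced by gene 1; both degrade.
# All rate constants are hypothetical, not from the paper's case studies.
k1, k2 = 1.0, 0.8          # production rates
g1, g2 = 0.5, 0.5          # first-order degradation rates
A = np.array([[-g1, 0.0],
              [ k2, -g2]])
B = np.array([[k1], [0.0]])

# Steady state for a constant unit input u = 1: x* = -A^{-1} B u
x_ss = -np.linalg.solve(A, B) * 1.0
print(np.round(x_ss.ravel(), 2))  # steady states 2.0 (gene 1) and 3.2 (gene 2)
```

The same A and B matrices yield the transfer function G(s) = C (sI - A)^{-1} B, connecting the time-domain and frequency-domain analyses the abstract describes.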
Song, C. H.; Kim, H. S.; von Glasow, R.; Brimblecombe, P.; Kim, J.; Park, R. J.; Woo, J. H.
2010-06-01
Elevated levels of formaldehyde (HCHO) along the ship corridors have been observed by satellite sensors, such as ESA/ERS-2 GOME (Global Ozone Monitoring Experiment), and were also predicted by global 3-D chemistry-transport models. In this study, three likely sources of the elevated HCHO levels were investigated to identify the detailed sources and examine the contributions of the sources (budget) of the elevated levels of HCHO in the ship corridors using a newly-developed ship-plume photochemical/dynamic model: (1) primary HCHO emission from ships; (2) secondary HCHO production via the atmospheric oxidation of non-methane volatile organic compounds (NMVOCs) emitted from ships; and (3) atmospheric oxidation of CH4 within the ship plumes. From multiple ship-plume model simulations, CH4 oxidation by elevated levels of in-plume OH radicals was found to be the main factor responsible for the elevated levels of HCHO in the ship corridors. More than ~91% of the HCHO for the base ship-plume case (ITCT 2K2 ship-plume case) is produced by this atmospheric chemical process, except in the areas close to the ship stacks, where the main source of the elevated HCHO levels would be primary HCHO from the ships (due to the deactivation of CH4 oxidation from the depletion of in-plume OH radicals). Because of active CH4 oxidation (chemical destruction of CH4) by OH radicals, the instantaneous chemical lifetime of CH4 (τ_CH4) decreased to ~0.45 yr inside the ship plume, in contrast to a τ_CH4 of ~1.1 yr in the background (up to a ~41% decrease). A variety of likely ship-plume situations at three locations at different latitudes within the global ship corridors was also studied to determine the extent of the enhancements in the HCHO levels in the marine boundary layer (MBL) influenced by ship emissions. It was found that the ship-plume HCHO levels could be 20.5-434.9 pptv higher than the background HCHO levels depending on the latitudinal locations of the ship plumes (i
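The quoted lifetimes follow from the instantaneous relation τ_CH4 = 1/(k·[OH]). The rate constant is an approximate 298 K literature value and the OH concentrations are chosen here so the sketch reproduces the abstract's ~1.1 yr background and ~0.45 yr in-plume figures; they are not the plume model's actual inputs:

```python
SECONDS_PER_YEAR = 3.156e7

def ch4_lifetime_yr(k_oh, oh_conc):
    """Instantaneous chemical lifetime of CH4 against OH oxidation,
    tau = 1 / (k * [OH]), converted to years. Inputs are illustrative."""
    return 1.0 / (k_oh * oh_conc) / SECONDS_PER_YEAR

k = 6.4e-15                                  # cm^3 molecule^-1 s^-1, approx. 298 K
print(round(ch4_lifetime_yr(k, 4.5e6), 2))   # 1.1 yr: background-like OH
print(round(ch4_lifetime_yr(k, 1.1e7), 2))   # 0.45 yr: elevated in-plume OH
```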
Directory of Open Access Journals (Sweden)
A. Lauer
2007-07-01
Full Text Available International shipping contributes significantly to the fuel consumption of all transport-related activities. Specific emissions of pollutants such as sulfur dioxide (SO_{2}) per kg of fuel burned are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly by scattering and absorbing incoming solar radiation, and indirectly by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to show that emissions from ships significantly increase the cloud droplet number concentration of low maritime water clouds. Whereas the cloud liquid water content remains nearly unchanged in these simulations, effective radii of cloud droplets decrease, leading to cloud optical thickness increases of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% of the total sulfate burden and 0.4% to 1.4% of the total black carbon burden in the year 2000. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. by up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10% depending on the emission inventory. Changes in aerosol optical thickness caused by the shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change of the net top-of-the-atmosphere (ToA) clear-sky radiation of about −0.013 W/m^{2} to −0.036 W/m^{2} on global annual average. The estimated all-sky direct aerosol effect calculated from these changes ranges between −0
Leroux, Estelle; Gorini, Christian; Aslanian, Daniel; Rabineau, Marina; Blanpied, Christian; Rubino, Jean-Loup; Robin, Cécile; Granjeon, Didier; Taillepierre, Rachel
2016-04-01
The post-rift (~20-0 Ma) vertical movements of the Provence Basin (West Mediterranean) are quantified on both of its conjugate margins (the Gulf of Lion and West Sardinia). This work is based on the stratigraphic study of sedimentary markers using a large 3D grid of seismic data, correlations with existing drillings, and refraction data. The post-rift subsidence is measured by the direct use of sedimentary geometries analysed in 3D [Gorini et al., 2015; Rabineau et al., 2014] and validated by numerical stratigraphic modelling. Three domains were found: on the platform (1) and slope (2), the subsidence takes the form of a seaward tilting with different amplitudes, whereas the deep basin (3) subsides purely vertically [Leroux et al., 2015a]. These domains correspond to the deeper crustal domains highlighted by wide-angle seismic data. The continental crust (1) and the thinned continental crust (2) are tilted, whereas the intermediate crust, identified as exhumed lower continental crust [Moulin et al., 2015; Afilhado et al., 2015] (3), sagged. The post-break-up subsidence re-uses the initial hinge lines of the rifting phase. This striking correlation between surface geologic processes and deep earth dynamic processes emphasizes that the sedimentary record and sedimentary markers are a window into deep geodynamic processes and dynamic topography. Pliocene-Pleistocene seismic markers enabled high-resolution quantification of sediment budgets over the past 6 Myr [Leroux et al., in press]. The sediment budget history is here completed over the Miocene interval, and the controlling factors (climate, tectonics and eustasy) are discussed. Afilhado, A., Moulin, M., Aslanian, D., Schnürle, P., Klingelhoefer, F., Nouzé, H., Rabineau, M., Leroux, E. & Beslier, M.-O. (2015). Deep crustal structure across a young passive margin from wide-angle and reflection seismic data (The SARDINIA Experiment) - II. Sardinia's margin. Bull. Soc. géol. France, 186, ILP Spec. issue, 4
A QCD Model Using Generalized Yang-Mills Theory
Institute of Scientific and Technical Information of China (English)
WANG Dian-Fu; SONG He-Shan; KOU Li-Na
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong-interaction model using the group U(4). Using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain asymptotic behavior and color confinement.
Bianchi class A models in Sàez-Ballester's theory
Socorro, J.; Espinoza-García, Abraham
2012-08-01
We apply the Sàez-Ballester (SB) theory to Bianchi class A models, with a barotropic perfect fluid in a stiff matter epoch. We obtain exact classical solutions à la Hamilton for the Bianchi type I, II and VI (h = -1) models. We also find exact quantum solutions to all Bianchi class A models employing a particular ansatz for the wave function of the universe.
A Dynamic Systems Theory Model of Visual Perception Development
Coté, Carol A.
2015-01-01
This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…
Mathematical System Theory and System Modeling
1980-01-01
Choosing models related effectively to the questions to be addressed is a central issue in the craft of systems analysis. Since the mathematical description the analyst chooses constrains the types of issues he can deal with, it is important for these models to be selected so as to yield limitations that are acceptable in view of the questions the systems analysis seeks to answer. In this paper, the author gives an overview of the central issues affecting the question of model choice. To ...
Modeling in applied sciences a kinetic theory approach
Pulvirenti, Mario
2000-01-01
Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...
Measurement-based load modeling: Theory and application
Institute of Scientific and Technical Information of China (English)
MA Jin; HAN Dong; HE RenMu
2007-01-01
The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China over more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future research directions on measurement-based load modeling are also discussed.
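The 13-parameter composite model itself is not reproduced in the abstract, but the identification idea it describes can be sketched on a much simpler, hypothetical ZIP (constant-impedance/current/power) static load model: perturb one parameter, recompute the simulated power response along a recorded voltage trajectory, and use the trajectory sensitivity to judge whether that parameter is identifiable. All names and numbers below are illustrative assumptions, not the paper's model.

```python
# Illustrative ZIP static load model and a finite-difference trajectory
# sensitivity, in the spirit of measurement-based load modeling.

def zip_load(v, p0, a_z, a_i, a_p):
    """Active power drawn at per-unit voltage v (a_z + a_i + a_p should be 1)."""
    return p0 * (a_z * v**2 + a_i * v + a_p)

def trajectory_sensitivity(v_traj, params, name, eps=1e-6):
    """dP/d(param) along a voltage trajectory, by central finite differences."""
    hi = dict(params); hi[name] += eps
    lo = dict(params); lo[name] -= eps
    return [(zip_load(v, **hi) - zip_load(v, **lo)) / (2 * eps) for v in v_traj]

v_traj = [1.0, 0.95, 0.9, 0.95, 1.0]   # a recorded voltage dip (p.u.)
params = dict(p0=100.0, a_z=0.4, a_i=0.3, a_p=0.3)

# Large sensitivity magnitudes along the disturbance suggest the parameter
# is identifiable from the measurement; near-zero ones suggest it is not.
s_p0 = trajectory_sensitivity(v_traj, params, "p0")  # ≈ 1.0 at v = 1 p.u.
```

The same finite-difference loop applied to each parameter in turn gives a crude version of the sensitivity analysis the abstract attributes to trajectory sensitivity theory.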
The Neuman Systems Model Institute: testing middle-range theories.
Gigliotti, Eileen
2003-07-01
The credibility of the Neuman systems model can only be established through the generation and testing of Neuman systems model-derived middle-range theories. However, due to the number and complexity of Neuman systems model concepts/concept interrelations and the diversity of middle-range theory concepts linked to these Neuman systems model concepts by researchers, no explicit middle-range theories have yet been derived from the Neuman systems model. This article describes the development of an organized program for the systematic study of the Neuman systems model. Preliminary work, already accomplished, is detailed, and a tentative plan for completing the remaining preliminary work and beginning the research phase itself is proposed.
Consumer preference models: fuzzy theory approach
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
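A minimal sketch of the core ingredient such a model needs: representing a linguistic variable (here, attribute "importance" on a 0-10 rating scale) with triangular fuzzy membership functions and defuzzifying it back to a crisp value. The terms, supports, and scale are our own toy assumptions, not taken from the article.

```python
# Triangular fuzzy sets for a linguistic "importance" variable, plus a simple
# centroid-style defuzzification.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for attribute importance on a 0-10 scale
TERMS = {"low": (-5, 0, 5), "medium": (0, 5, 10), "high": (5, 10, 15)}

def memberships(x):
    """Degree to which rating x matches each linguistic term."""
    return {term: tri(x, *abc) for term, abc in TERMS.items()}

def crisp_importance(x):
    """Membership-weighted average of term peaks."""
    mu = memberships(x)
    return sum(m * TERMS[t][1] for t, m in mu.items()) / sum(mu.values())

mu = memberships(7)   # a rating of 7 is 0.6 "medium" and 0.4 "high"
```

In an individual-level preference model, such fuzzy importances would weight attribute part-worths in place of the crisp weights of a conventional conjoint model.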
Modeling acquaintance networks based on balance theory
Directory of Open Access Journals (Sweden)
Vukašinović Vida
2014-09-01
Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, future interactions are more likely to happen between actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model is sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network, because the properties of the IB model more closely matched those of the e-mail URV network than those of the other models.
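The feedback loop the abstract describes (interactions adjust tie strength; stronger ties attract further interactions) can be sketched as a toy simulation. The update rule, probabilities, and constants below are our own illustrative choices, not the published IB model.

```python
# Toy interaction-based network: positive interactions strengthen ties,
# negative ones weaken them, and existing strong ties are preferentially
# chosen for future interactions.
import random

def simulate_ib(n=30, steps=2000, p_reinforce=0.7, p_positive=0.8, seed=1):
    rng = random.Random(seed)
    w = {}  # (i, j) with i < j  ->  tie strength
    for _ in range(steps):
        if w and rng.random() < p_reinforce:
            # Future interactions are more likely along existing, stronger ties
            pairs, weights = zip(*w.items())
            pair = rng.choices(pairs, weights=weights)[0]
        else:
            i, j = rng.sample(range(n), 2)
            pair = (min(i, j), max(i, j))
        # Positive interactions strengthen the tie, negative ones weaken it
        w[pair] = w.get(pair, 0) + (1 if rng.random() < p_positive else -1)
        if w[pair] <= 0:
            del w[pair]        # the tie dissolves
    return w

net = simulate_ib()            # far fewer ties than the n*(n-1)/2 possible
```

Even this crude version reproduces the qualitative sparseness noted in the evaluation, since reinforcement concentrates interactions on a subset of pairs.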
The Family FIRO Model: The Integration of Group Theory and Family Theory.
Colangelo, Nicholas; Doherty, William J.
1988-01-01
Presents the Family Fundamental Interpersonal Relations Orientation (Family FIRO) Model, an integration of small-group theory and family therapy. The model is offered as a framework for organizing family issues. Discusses three fundamental issues of human relatedness and their applicability to group dynamics. (Author/NB)
Directory of Open Access Journals (Sweden)
Pilli R
2015-08-01
Full Text Available In the context of the Kyoto Protocol, the mandatory accounting of Afforestation and Reforestation (AR) activities requires estimating the forest carbon (C) stock changes for any direct human-induced expansion of forest since 1990. We used the Carbon Budget Model (CBM) to estimate C stock changes and emissions from fires on AR lands at country level. Italy was chosen because it has one of the highest annual rates of AR in Europe and the same model was recently applied to Italy's forest management area. We considered the time period 1990-2020 with two case studies reflecting different average annual rates of AR: 78 kha yr-1, based on the 2013 Italian National Inventory Report (NIR, official estimates), and 28 kha yr-1, based on the Italian Land Use Inventory System (IUTI) estimates. We compared these two different AR rates with eight regional forest inventories and three independent local studies. The average annual C stock change estimated by CBM, excluding harvest or natural disturbances, was equal to 1738 Gg C yr-1 (official estimates) and 630 Gg C yr-1 (IUTI estimates). Results for the official estimates are consistent with the estimates reported by Italy to the KP for the period 2008-2010; for 2011 our estimates are about 20% higher than the country's data, probably due to different assumptions on the fire disturbances, the AR rate and the dead wood and litter pools. Furthermore, our analysis suggests that: (i) the impact on the AR sink of different assumptions of species composition is small; (ii) the amount of harvest provided by AR has been negligible in the past (< 3%) and is expected to be small in the near future (up to 8% in 2020); (iii) forest fires up to 2011 had a small impact on the AR sink (on average, < 100 Gg C yr-1). Finally, the comparison of the historical AR rates reported by NIR and IUTI with other independent sources gives mixed results: the regional inventories support the AR rates reported by the NIR, while some local studies
Training evaluation models: Theory and applications
Carbone, V.; MORVILLO, A
2002-01-01
This chapter has the following aims: 1. Compare the various conceptual models for evaluation, identifying their strengths and weaknesses; 2. Define an evaluation model consistent with the aims and constraints of the fit project; 3. Describe, in critical fashion, operative tools for evaluating training which are reliable, flexible and analytical.
Optimal transportation networks models and theory
Bernot, Marc; Morel, Jean-Michel
2009-01-01
The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovich problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems, and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proofs of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation, with erosion models in geomorphology, and with discrete telecommunication and transportation models is discussed. It is mathematically proven that the majority of these fit the simple model sketched in this volume.
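The "energy functional whose essential feature is to favour wide roads" is, in the standard formulation of branched transport, the Gilbert energy; a sketch of its form (notation ours):

```latex
% Branched-transport (Gilbert) energy of a network G whose edge e carries flow \theta(e):
E^{\alpha}(G) \;=\; \sum_{e \in G} \theta(e)^{\alpha}\,\mathcal{H}^{1}(e),
\qquad 0 \le \alpha < 1,
% where \mathcal{H}^{1}(e) is the length of edge e. Concavity of t \mapsto t^{\alpha}
% makes one wide road cheaper than two parallel narrow ones:
(\theta_1 + \theta_2)^{\alpha} \;<\; \theta_1^{\alpha} + \theta_2^{\alpha}.
```

Minimizers of this energy are exactly the branched structures the lectures study: merging flows is strictly favoured, which produces tree-like networks.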
Measurement Models for Reasoned Action Theory.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-03-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
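The contrast between the two indicator types can be made concrete: causal (formative) belief items *define* an index, while effect (reflective) attitude items are interchangeable reflections that are averaged into a scale. The sketch below uses toy numbers invented for illustration, not the study's data; the expectancy-value product form for the belief index is a standard reasoned-action convention.

```python
# Formative index from belief items vs. reflective scale from attitude items.
from math import sqrt
from statistics import mean

# Each respondent: (belief strengths, outcome evaluations, attitude items)
respondents = [
    ([3, 2, 1], [2, 3, 1], [4, 5, 4]),
    ([1, 1, 2], [1, 2, 1], [2, 2, 3]),
    ([2, 3, 3], [3, 3, 2], [5, 4, 5]),
    ([1, 2, 1], [2, 1, 1], [3, 2, 2]),
]

# Causal indicators: the sum of belief x evaluation products *defines* the index
index = [sum(b * e for b, e in zip(bs, es)) for bs, es, _ in respondents]
# Effect indicators: attitude items *reflect* a latent scale, so average them
scale = [mean(att) for _, _, att in respondents]

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

r = pearson(index, scale)   # the belief-attitude association of interest
```

The analytic problems the article discusses arise because the index has no measurement error model of its own, while the scale does; treating both alike in one structural model is what goes wrong.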
Verification of uncertainty budgets
DEFF Research Database (Denmark)
Heydorn, Kaj; Madsen, B.S.
2005-01-01
The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and th...
This is the federal budget timetable under the Balanced Budget and Emergency Deficit Control Act of 1985 (Gramm-Rudman-Hollings). These deadlines apply to fiscal years (FY) 1987-1991. The deficit reduction measures in Gramm-Rudman-Hollings would lead to a balanced budget in 1991.
Sticker DNA computer model--Part Ⅰ:Theory
Institute of Scientific and Technical Information of China (English)
XU Jin; DONG Yafei; WEI Xiaopeng
2004-01-01
The sticker model is one of the basic models of DNA computing. It is coded with single-double stranded DNA molecules, and it has the advantages that its operations require no strand extension and use no enzymes, and that its materials are reusable. It has therefore attracted the attention and interest of scientists in many fields. In this paper, we systematically analyze the theory and applications of the model, summarize the contributions of other scientists in this field, and present our research results. This paper is the theoretical portion of the sticker model of DNA computing, and includes an introduction to the basic model of sticker computing. Firstly, we systematically introduce the basic theories of classic models of sticker computing; secondly, we discuss the sticker system, an abstract computing model based on the sticker model and formal languages; finally, we extend and perfect the model, presenting two types of models that are more extensive in their applications and more complete in their theory than past models: the so-called k-bit sticker model and the full-message sticker DNA computing model.
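The basic sticker model admits a very direct plain-Python simulation: a memory complex is a bit tuple (bit i is 1 when a sticker is annealed to region i), a tube is a list of complexes, and the standard four operations act on tubes. This follows the usual sticker-model formulation in spirit only; it is ordinary code, not molecular biology.

```python
# Minimal simulation of the four sticker-model operations on tubes of
# memory complexes represented as bit tuples.
from itertools import product

def combine(t1, t2):
    """Pour two tubes together."""
    return t1 + t2

def separate(tube, i):
    """Split a tube by bit i into (on, off) tubes."""
    return [s for s in tube if s[i]], [s for s in tube if not s[i]]

def set_bit(tube, i):
    """Anneal a sticker to region i of every complex."""
    return [s[:i] + (1,) + s[i + 1:] for s in tube]

def clear_bit(tube, i):
    """Strip the sticker from region i of every complex."""
    return [s[:i] + (0,) + s[i + 1:] for s in tube]

# Initial library for k = 3: every possible 3-bit memory complex
tube = list(product((0, 1), repeat=3))
on, off = separate(tube, 0)    # 4 complexes with bit 0 set, 4 without
```

A sticker "program" is then just a sequence of these tube operations followed by reading out whichever tube is non-empty.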
Quantum Field Theory and the Electroweak Standard Model
Boos, E
2015-01-01
The Standard Model is one of the main intellectual achievements of about the last 50 years, a result of many theoretical and experimental studies. In this lecture a brief introduction to the electroweak part of the Standard Model is given. Since the Standard Model is a quantum field theory, some aspects of the quantization of abelian and non-abelian gauge theories are also briefly discussed. It is demonstrated how well the electroweak Standard Model works in describing a large variety of precise experimental measurements at lepton and hadron colliders.
Directory of Open Access Journals (Sweden)
Pilli R
2014-02-01
Full Text Available Historical analysis and modeling of the forest carbon dynamics using the Carbon Budget Model: an example for the Trento Province (NE Italy). The Carbon Budget Model (CBM-CFS3) developed by the Canadian Forest Service was applied to data collected by the last Italian National Forest Inventory (INFC) for the Trento Province (NE Italy). CBM was modified and adapted to the different management types (i.e., even-aged high forests, uneven-aged high forests and coppices) and silvicultural systems (including clear cuts, single tree selection systems and thinning) applied in this province. The aim of this study was to provide an example of down-scaling of this model from a national to a regional scale, providing (i) a historical analysis, from 1995 to 2011, and (ii) a projection, from 2012 to 2020, of the forest biomass and the carbon stock evolution. The analysis was based on the harvest rate reported by the Italian National Institute of Statistics (from 1995 to 2011), corrected according to the last INFC data and distinguished between timber and fuel wood and between conifers and broadleaves. Since 2012, we applied a constant harvest rate, equal to about 1300 Mm3 yr-1, estimated from the average harvest rate for the period 2006-2011. Model results were consistent with similar data reported in the literature. The average biomass C stock was 90 Mg C ha-1, and the biomass C stock change was 0.97 Mg C ha-1 yr-1 and 0.87 Mg C ha-1 yr-1 for the periods 1995-2011 and 2012-2020, respectively. The C stock cumulated by the timber products since 1995 was 96 Gg C yr-1, i.e., about 28% of the average annual C stock change of the forests, equal to 345 Gg C yr-1. CBM also provided estimates of the evolution of the age class distribution of the even-aged forests and of the C stock of the DOM forest pools (litter, dead wood and soil). This study demonstrates the utility of CBM for providing estimates at a regional or local scale, using not only the data provided by the forest
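The quoted share is easy to verify from the two figures in the abstract; a one-line arithmetic check:

```python
# Check the reported proportion: C accumulated in harvested timber products
# vs. the average annual forest C stock change (both in Gg C per year).
timber_products_sink = 96.0    # Gg C / yr, cumulated in timber products since 1995
forest_stock_change = 345.0    # Gg C / yr, average annual forest C stock change
share = timber_products_sink / forest_stock_change
print(f"{share:.0%}")          # matches the "about 28%" quoted in the abstract
```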
Mixed models theory and applications with R
Demidenko, Eugene
2013-01-01
Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g
Theory of stellar convection II: first stellar models
Pasetto, S; Chiosi, E; Cropper, M; Weiss, A
2015-01-01
We present here the first stellar models on the Hertzsprung-Russell diagram (HRD) in which convection is treated according to the novel scale-free convection theory (SFC theory) by Pasetto et al. (2014). The aim is to compare the results of the new theory with those from the classical, calibrated mixing-length (ML) theory to examine differences and similarities. We integrate the equations describing the structure of the atmosphere from the stellar surface down to a few percent of the stellar mass using both the ML theory and the SFC theory. The key temperature over pressure gradients, the energy fluxes, and the extension of the convective zones are compared in both theories. The analysis is first made for the Sun and then extended to other stars of different mass and evolutionary stage. The results are adequate: the SFC theory yields convective zones, temperature gradients of the ambient and of the convective element, and energy fluxes that are very similar to those derived from the "calibrated" ML theory for main s...
Solid mechanics theory, modeling, and problems
Bertram, Albrecht
2015-01-01
This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.
Matrix Models, Topological Strings, and Supersymmetric Gauge Theories
Dijkgraaf, R; Dijkgraaf, Robbert; Vafa, Cumrun
2002-01-01
We show that B-model topological strings on local Calabi-Yau threefolds are large N duals of matrix models, which in the planar limit naturally give rise to special geometry. These matrix models directly compute F-terms in an associated N=1 supersymmetric gauge theory, obtained by deforming N=2 theories by a superpotential term that can be directly identified with the potential of the matrix model. Moreover by tuning some of the parameters of the geometry in a double scaling limit we recover (p,q) conformal minimal models coupled to 2d gravity, thereby relating non-critical string theories to type II superstrings on Calabi-Yau backgrounds.
Matrix models, topological strings, and supersymmetric gauge theories
Energy Technology Data Exchange (ETDEWEB)
Dijkgraaf, Robbert E-mail: rhd@science.uva.nl; Vafa, Cumrun
2002-11-11
We show that B-model topological strings on local Calabi-Yau threefolds are large-N duals of matrix models, which in the planar limit naturally give rise to special geometry. These matrix models directly compute F-terms in an associated N=1 supersymmetric gauge theory, obtained by deforming N=2 theories by a superpotential term that can be directly identified with the potential of the matrix model. Moreover by tuning some of the parameters of the geometry in a double scaling limit we recover (p,q) conformal minimal models coupled to 2d gravity, thereby relating non-critical string theories to type II superstrings on Calabi-Yau backgrounds.
Theory and modeling of electron fishbones
Vlad, G.; Fusco, V.; Briguglio, S.; Fogaccia, G.; Zonca, F.; Wang, X.
2016-10-01
Internal kink instabilities exhibiting fishbone-like behavior have been observed in a variety of experiments where a high-energy electron population, generated by strong auxiliary heating and/or current drive systems, was present. After briefly reviewing the experimental evidence of energetic-electron-driven fishbones, and the main results of the linear and nonlinear theory of electron fishbones, the results of global, self-consistent, nonlinear hybrid MHD-Gyrokinetic simulations will be presented. To this purpose, the extended/hybrid MHD-Gyrokinetic code XHMGC will be used. Linear dynamics analysis will highlight the effect of considering kinetic thermal ion compressibility and diamagnetic response, and kinetic thermal electron compressibility, in addition to the energetic electron contribution. Nonlinear saturation and energetic electron transport will also be addressed, making extensive use of Hamiltonian mapping techniques and discussing both centrally peaked and off-axis-peaked energetic electron profiles. It will be shown that centrally peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of deeply trapped energetic electrons. On the other hand, off-axis-peaked energetic electron profiles are characterized by resonant excitation and nonlinear response of barely circulating energetic electrons, which experience toroidal precession reversal of their motion.
Dritselis, Chris D.
2017-04-01
In the first part of this study (Dritselis 2016 Fluid Dyn. Res. 48 015507), the Reynolds stress budgets were evaluated through point-particle direct numerical simulations (pp-DNSs) for the particle-laden turbulent flow in a vertical channel with two- and four-way coupling effects. Here several turbulence models are assessed by direct comparison of the particle contribution terms to the budgets, the dissipation rate, the pressure-strain rate, and the transport rate with the model expressions using the pp-DNS data. It is found that the models of the particle sources to the equations of fluid turbulent kinetic energy and dissipation rate cannot represent correctly the physics of the complex interaction between turbulence and particles. A relatively poor performance of the pressure-strain term models is revealed in the particulate flows, while the algebraic models for the dissipation rate of the fluid turbulence kinetic energy and the transport rate terms can adequately reproduce the main trends due to the presence of particles. Further work is generally needed to improve the models in order to account properly for the momentum exchange between the two phases and the effects of particle inertia, gravity and inter-particle collisions.
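The "particle sources to the equations of fluid turbulent kinetic energy" assessed here typically enter the modeled k-equation as an extra exchange term; a commonly used generic form (notation ours, not the specific closures tested in the paper):

```latex
% Modeled transport of fluid turbulent kinetic energy k with two-way coupling:
\frac{Dk}{Dt} \;=\; P_k \;-\; \varepsilon
\;+\; \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)
\frac{\partial k}{\partial x_j}\right] \;+\; \Pi_k,
\qquad
\Pi_k \;=\; \overline{f'_i\, u'_i},
```

where \(f_i\) is the particle feedback force per unit mass of fluid and \(\Pi_k\) is the particle source term. It is closures for \(\Pi_k\) (and its analogue in the \(\varepsilon\)-equation) that the pp-DNS budgets show to misrepresent the turbulence-particle interaction.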
Electrorheological fluids modeling and mathematical theory
Růžička, Michael
2000-01-01
This is the first book to present a model, based on the rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDE systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and a nontrivial nonlinear right-hand side, and the first ever results for parabolic systems with non-standard growth conditions, are given. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.
A catastrophe theory model of the conflict helix, with tests.
Rummel, R J
1987-10-01
Macro social field theory has undergone extensive development and testing since the 1960s. One of these developments has been the articulation of an appropriate conceptual micro model, called the conflict helix, for understanding the process from conflict to cooperation and vice versa. Conflict and cooperation are viewed as distinct equilibria of forces in a social field; the movement between these equilibria is a jump, energized by a gap between social expectations and power, and triggered by some minor event. Quite independently, there has also been much recent application of catastrophe theory to social behavior, but usually without a clear substantive theory and lacking empirical testing. This paper uses catastrophe theory, namely the butterfly model, to structure the conflict helix mathematically. The social field framework and helix provide the substantive interpretation for the catastrophe theory, and catastrophe theory provides a suitable mathematical model for the conflict helix. The model is tested on the annual conflict and cooperation between India and Pakistan, 1948 to 1973. The results are generally positive and encouraging.
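The butterfly is the elementary catastrophe with one state variable and four control parameters; its canonical potential is standard (the paper's substantive mapping of the controls onto expectations and power is not reproduced here):

```latex
% Canonical butterfly catastrophe: state variable x, controls a, b, c, d
V(x; a, b, c, d) \;=\; \tfrac{1}{6}x^{6} + \tfrac{1}{4}a\,x^{4}
+ \tfrac{1}{3}b\,x^{3} + \tfrac{1}{2}c\,x^{2} + d\,x,
\qquad
\frac{\partial V}{\partial x} \;=\; x^{5} + a\,x^{3} + b\,x^{2} + c\,x + d \;=\; 0.
```

Equilibria are the roots of \(\partial V / \partial x = 0\); for suitable control values the potential has coexisting minima, and the jump between them is what the helix interprets as the transition between conflict and cooperation.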
Institute of Scientific and Technical Information of China (English)
裴沛
2015-01-01
In recent years, revenue growth at many domestic budget hotels in China has slowed, and some have even run at a loss. To reverse this situation, attract customers, and improve competitiveness, budget hotels must move beyond traditional ways of thinking and create new hotel value for their guests. Drawing on customer value innovation theory, this paper analyzes the customer needs of China's domestic budget hotels in depth, plots a new budget-hotel customer value curve across ten dimensions, and proposes several measures to improve competitiveness: guaranteeing product quality with the guest room at the center, improving service quality through attention to service details, and taking online hotel reviews seriously.
Mean field theory, topological field theory, and multi-matrix models
Energy Technology Data Exchange (ETDEWEB)
Dijkgraaf, R. (Princeton Univ., NJ (USA). Joseph Henry Labs.); Witten, E. (Institute for Advanced Study, Princeton, NJ (USA). School of Natural Sciences)
1990-10-08
We show that the genus zero correlation functions of an arbitrary topological field theory coupled to two-dimensional topological gravity are determined by an appropriate Landau-Ginzburg potential. We determine the potentials that arise for topological sigma models with CP^1 or a Calabi-Yau manifold for target space. We present substantial evidence that the multi-matrix models that have been studied recently are equivalent to certain topological field theories coupled to topological gravity. We also describe a topological version of the general 'string equation'. (orig.).
Mean field theory, topological field theory, and multi-matrix models
Dijkgraaf, Robbert; Witten, Edward
1990-10-01
We show that the genus zero correlation functions of an arbitrary topological field theory coupled to two-dimensional topological gravity are determined by an appropriate Landau-Ginzburg potential. We determine the potentials that arise for topological sigma models with CP^1 or a Calabi-Yau manifold for target space. We present substantial evidence that the multi-matrix models that have been studied recently are equivalent to certain topological field theories coupled to topological gravity. We also describe a topological version of the general "string equation".
Theory and Model for Martensitic Transformations
DEFF Research Database (Denmark)
Lindgård, Per-Anker; Mouritsen, Ole G.
1986-01-01
Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry...
Study on Strand Space Model Theory
Institute of Scientific and Technical Information of China (English)
JI QingGuang (季庆光); QING SiHan (卿斯汉); ZHOU YongBin (周永彬); FENG DengGuo (冯登国)
2003-01-01
The growing interest in the application of formal methods of cryptographic protocol analysis has led to the development of a number of different ways of analyzing protocols. In this paper, it is strictly proved that if for any strand there exists at least one bundle containing it, then an entity authentication protocol is secure in the strand space model (SSM) with some small extensions. Unfortunately, the results of the attack scenario demonstrate that this protocol, as well as the Yahalom protocol and its modification, are de facto insecure. By analyzing the reasons for the failure of formal inference in the strand space model, some deficiencies in the original SSM are pointed out. In order to break through these limitations of the analytic capability of SSM, the generalized strand space model (GSSM), induced by some protocol, is proposed. In this model, some new classes of strands, such as oracle strands and high-order oracle strands, are developed, and some notions are formalized strictly in GSSM, such as protocol attacks, valid protocol runs, and successful protocol runs. GSSM can then be used to further analyze the entity authentication protocol. This analysis sheds light on why this protocol would be vulnerable, and it illustrates that GSSM not only can prove a security protocol correct, but also can be efficiently used to construct protocol attacks. It is also pointed out that using another protocol to attack a given protocol is essentially the same as using the protocol itself.
Modeling Environmental Concern: Theory and Application.
Hackett, Paul M. W.
1993-01-01
Human concern for the quality and protection of the natural environment forms the basis of successful environmental conservation activities. Considers environmental concern research and proposes a model that incorporates the multiple dimensions of research through which environmental concern may be evaluated. (MDH)
Budget Setting Strategies for the Company's Divisions
Berg, M.; Brekelmans, R.C.M.; De Waegenaere, A.M.B.
1997-01-01
The paper deals with the issue of budget setting for the divisions of a company. The approach is quantitative in nature, both in the formulation of the requirements for the set budgets, as related to different general managerial objectives of interest, and in the modelling of the inherent uncertainties
Applying learning theories and instructional design models for effective instruction.
Khalil, Mohammed K; Elkhider, Ihsan A
2016-06-01
Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.
Rock mechanics modeling based on soft granulation theory
Owladeghaffari, H
2008-01-01
This paper describes the application of information granulation theory to the design of rock engineering flowcharts. First, an overall flowchart based on information granulation theory is presented. Information granulation theory, in crisp (non-fuzzy) or fuzzy form, can take engineering experience (especially fuzzy, incomplete or superfluous information) and engineering judgment into account at each step of the design procedure, while suitable modeling instruments are employed. In this manner, and as an extension of soft modeling instruments, three combinations of Self-Organizing Maps (SOM), Neuro-Fuzzy Inference Systems (NFIS) and Rough Set Theory (RST) are used to obtain crisp and fuzzy granules from monitored data sets. The core of our algorithms is the balancing of crisp (rough, non-fuzzy) granules and sub-fuzzy granules within non-fuzzy information (the initial granulation) over open-close iterations. Using different criteria for balancing the best granules (information pock...
L∞-algebra models and higher Chern-Simons theories
Ritter, Patricia; Sämann, Christian
2016-10-01
We continue our study of zero-dimensional field theories in which the fields take values in a strong homotopy Lie algebra. In the first part, we review in detail how higher Chern-Simons theories arise in the AKSZ-formalism. These theories form a universal starting point for the construction of L∞-algebra models. We then show how to describe superconformal field theories and how to perform dimensional reductions in this context. In the second part, we demonstrate that Nambu-Poisson and multisymplectic manifolds are closely related via their Heisenberg algebras. As a byproduct of our discussion, we find central Lie p-algebra extensions of 𝔰𝔬(p + 2). Finally, we study a number of L∞-algebra models which are physically interesting and which exhibit quantized multisymplectic manifolds as vacuum solutions.
From integrable models to gauge theories Festschrift Matinyan (Sergei G)
Gurzadyan, V G
2002-01-01
This collection of twenty articles in honor of the noted physicist and mentor Sergei Matinyan focuses on topics that are of fundamental importance to high-energy physics, field theory and cosmology. The topics range from integrable quantum field theories, three-dimensional Ising models, parton models and tests of the Standard Model, to black holes in loop quantum gravity, the cosmological constant and magnetic fields in cosmology. A pedagogical essay by Lev Okun concentrates on the problem of fundamental units. The articles have been written by well-known experts and are addressed to graduate students.
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple module workflow we have implemented that consists of a set of interoperable systems biology computational modules.
Pilot evaluation in TENCompetence: a theory-driven model
J. Schoonenboom; H. Sligte; A. Moghnieh; M. Specht; C. Glahn; K. Stefanov
2008-01-01
This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged infrastructure...
Reciprocal Ontological Models Show Indeterminism Comparable to Quantum Theory
Bandyopadhyay, Somshubhro; Banik, Manik; Bhattacharya, Some Sankar; Ghosh, Sibasish; Kar, Guruprasad; Mukherjee, Amit; Roy, Arup
2016-12-01
We show that within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator for which it is impossible, even in principle, to construct a reciprocal deterministic model.
Chiral field theories as models for hadron substructure
Energy Technology Data Exchange (ETDEWEB)
Kahana, S.H.
1987-03-01
A model for the nucleon as soliton of quarks interacting with classical meson fields is described. The theory, based on the linear sigma model, is renormalizable and capable of including sea quarks straightforwardly. Application to nuclear matter is made in a Wigner-Seitz approximation.
Spectral and scattering theory for translation invariant models in quantum field theory
DEFF Research Database (Denmark)
Rasmussen, Morten Grud
This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... of the essential energy-momentum spectrum and either the two-body threshold, if there are no exited isolated mass shells, or the one-body threshold pertaining to the first exited isolated mass shell, if it exists. For the model restricted to the vacuum and one-particle sectors, the absence of singular continuous...... spectrum is proven to hold globally and scattering theory of the model is studied using time-dependent methods, of which the main result is asymptotic completeness....
Rogers, B. M.; Randerson, J. T.; Bonan, G. B.
2011-12-01
Vegetation compositions of boreal forests are determined largely by recovery patterns after large-scale disturbances, the most notable of which is wildfire. Forest compositions exert large controls on regional energy and greenhouse gas budgets by affecting surface albedo, net radiation, turbulent energy fluxes, and carbon stocks. Impacts of boreal forest fires on climate are therefore products of direct fire effects, including charred surfaces and emitted aerosols and greenhouse gasses, and post-fire vegetation succession, which affects carbon and energy exchange for many decades after the initial disturbance. Climate changes are expected to be greatest at high latitudes, leading many to project increases in boreal forest fires. While numerous studies have documented the effects of post-fire landscape on energy and gas budgets in boreal forests, to date no continental analysis using a coupled model has been performed. In this study we quantified the effects of boreal forest fires and post-fire succession on regional and global climate using model experiments in the Community Earth System Model. We used 20th century climate data and MODIS vegetation continuous fields and land cover classes to identify boreal forests across North America and Eurasia. Historical fire return intervals were derived from a regression approach utilizing the Canadian and Alaskan Large Fire Databases, the Global Fire Emissions Database v3, and land cover and climate data. Succession trajectories were derived from the literature and MODIS land cover over known fire scars. Major improvements in model-data comparisons of long-term energy budgets were observed by prescribing post-fire vegetation succession. Global simulations using historical and future burn area scenarios highlight the potential impacts on climate from changing fire regimes and provide motivation for including vegetation succession in coupled simulations.
Nonlinear model predictive control theory and algorithms
Grüne, Lars
2017-01-01
This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
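The receding-horizon principle the book formalizes can be illustrated with a minimal sketch: at each step, optimize a finite-horizon cost over candidate input sequences, apply only the first input, then repeat from the new state. The toy system, brute-force optimizer, horizon length, and candidate grid below are all hypothetical choices for illustration, not examples from the book:

```python
import itertools

def nmpc_step(x, f, stage_cost, horizon, candidates):
    """Return the first input of the best open-loop input sequence,
    found here by brute-force search over a discrete candidate set."""
    best_u, best_cost = None, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        xk, cost = x, 0.0
        for u in seq:
            cost += stage_cost(xk, u)
            xk = f(xk, u)
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# toy unstable scalar system x+ = 2x + u with a quadratic stage cost
f = lambda x, u: 2.0 * x + u
ell = lambda x, u: x * x + 0.1 * u * u

x = 1.0
for _ in range(10):  # closed loop: apply first input, shift the horizon
    u = nmpc_step(x, f, ell, horizon=4, candidates=[-2.0, -1.0, 0.0, 1.0, 2.0])
    x = f(x, u)
```

Even with this crude optimizer, the closed loop stabilizes the unstable toy system, which is the property (closed-loop stability of the receding-horizon scheme) that the book analyzes rigorously.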
Computational hemodynamics theory, modelling and applications
Tu, Jiyuan; Wong, Kelvin Kian Loong
2015-01-01
This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...
Fuzzy Stochastic Optimization Theory, Models and Applications
Wang, Shuming
2012-01-01
Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies. The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...
Theory and Model of Agricultural Insurance Subsidy
Institute of Scientific and Technical Information of China (English)
Wan Kailiang; Long Wenjun
2007-01-01
The issue of agricultural insurance subsidies is discussed in this paper, with the aim of making them more rational and scientific. We start from the connection between agricultural insurance and financial subsidy. Financial subsidy is necessary and crucial because of the poor operational performance of agricultural insurance, especially in developing countries. But the subsidy should be provided more rationally, because financial subsidies have many negative effects. A model of competitive insurance markets developed by Ahsan et al. (1982) and a farmers' decision model are developed to solve for the optimal subsidy rate, and an equation for calculating it is obtained. A quantitative subsidy rate is not given here, because the calculation requires some restrictive conditions that are often absent in developing countries. The government should therefore provide some subsidy for ex ante research and preparation, in order to obtain scientific probabilities and premium rates.
The origin of discrete symmetries in F-theory models
2015-01-01
While non-abelian groups are undoubtedly the cornerstone of Grand Unified Theories (GUTs), phenomenology shows that the role of abelian and discrete symmetries is equally important in model building. The latter are the appropriate tool to suppress undesired proton decay operators and various flavour violating interactions, to generate a hierarchical fermion mass spectrum, etc. In F-theory, GUT symmetries are linked to the singularities of the elliptically fibred K3 manifolds; they are of ADE ...
Kinetic theories for spin models for cooperative relaxation dynamics
Pitts, Steven Jerome
The facilitated kinetic Ising models with asymmetric spin flip constraints introduced by Jackle and co-workers [J. Jackle, S. Eisinger, Z. Phys. B 84, 115 (1991); J. Reiter, F. Mauch, J. Jackle, Physica A 184, 458 (1992)] exhibit complex relaxation behavior in their associated spin density time correlation functions. This includes the growth of relaxation times over many orders of magnitude when the thermodynamic control parameter is varied, and, in some cases, ergodic-nonergodic transitions. Relaxation equations for the time dependence of the spin density autocorrelation function for a set of these models are developed that relate this autocorrelation function to the irreducible memory function of Kawasaki [K. Kawasaki, Physica A 215, 61 (1995)] using a novel diagrammatic series approach. It is shown that the irreducible memory function in a theory of the relaxation of an autocorrelation function in a Markov model with detailed balance plays the same role as the part of the memory function approximated by a polynomial function of the autocorrelation function with positive coefficients in schematic simple mode coupling theories for supercooled liquids [W. Gotze, in Liquids, Freezing and the Glass Transition, D. Levesque, J. P. Hansen, J. Zinn-Justin eds., 287 (North Holland, New York, 1991)]. Sets of diagrams in the series for the irreducible memory function are summed which lead to approximations of this type. The behavior of these approximations is compared with known results from previous analytical calculations and from numerical simulations. For the simplest one dimensional model, relaxation equations that are closely related to schematic extended mode coupling theories [W. Gotze, ibid] are also derived using the diagrammatic series. Comparison of the results of these approximate theories with simulation data shows that these theories improve significantly on the results of the theories of the simple schematic mode coupling theory type. The potential
Richon, Camille; Dutay, Jean-Claude; Dulac, François; Desboeufs, Karine; Nabat, Pierre; Guieu, Cécile; Aumont, Olivier; Palmieri, Julien
2016-04-01
Atmospheric deposition is at present not included in regional oceanic biogeochemical models of the Mediterranean Sea, whereas, along with river inputs, it represents a significant source of nutrients at the basin scale, especially through intense desert dust events. Moreover, observations (e.g. the DUNE campaign, Guieu et al. 2010) show that these events significantly modify the biogeochemistry of the oligotrophic Mediterranean Sea. We use a high-resolution (1/12°) version of the 3D coupled model NEMOMED12/PISCES to investigate the effects of high-resolution atmospheric dust deposition forcings on the biogeochemistry of the Mediterranean basin. The biogeochemical model PISCES represents the evolution of 24 prognostic tracers, including five nutrients (nitrate, ammonium, phosphate, silicate and iron) and two phytoplankton and two zooplankton groups (Palmiéri, 2014). From decadal simulations (1982-2012) we evaluate the influence of natural dust and anthropogenic nitrogen deposition on the budget of nutrients in the basin and its impact on the biogeochemistry (primary production, plankton distributions...). Our results show that natural dust deposition accounts for 15% of the global PO4 budget and that it influences primarily the southern part of the basin. Anthropogenic nitrogen accounts for 50% of the bioavailable N supply in the northern part. Deposition events significantly affect biological production; primary productivity enhancement can be as high as 30% in areas of high deposition, especially during the stratified period. Further developments of the model will include 0D and 1D modeling of bacteria in the frame of the PEACETIME project.
Nanofluid Drop Evaporation: Experiment, Theory, and Modeling
Gerken, William James
Nanofluids, stable colloidal suspensions of nanoparticles in a base fluid, have potential applications in the heat transfer, combustion and propulsion, manufacturing, and medical fields. Experiments were conducted to determine the evaporation rate of room temperature, millimeter-sized pendant drops of ethanol laden with varying amounts (0-3% by weight) of 40-60 nm aluminum nanoparticles (nAl). Time-resolved high-resolution drop images were collected for the determination of early-time evaporation rate (D²/D₀² > 0.75), shown to exhibit D-square law behavior, and surface tension. Results show an asymptotic decrease in pendant drop evaporation rate with increasing nAl loading. The evaporation rate decreases by approximately 15% at around 1% to 3% nAl loading relative to the evaporation rate of pure ethanol. Surface tension was observed to be unaffected by nAl loading up to 3% by weight. A model was developed to describe the evaporation of the nanofluid pendant drops based on D-square law analysis for the gas domain and a description of the reduction in liquid fraction available for evaporation due to nanoparticle agglomerate packing near the evaporating drop surface. Model predictions are in relatively good agreement with experiment, within a few percent of measured nanofluid pendant drop evaporation rate. The evaporation of pinned nanofluid sessile drops was also considered via modeling. It was found that the same mechanism for nanofluid evaporation rate reduction used to explain pendant drops could be used for sessile drops. That mechanism is a reduction in evaporation rate due to a reduction in available ethanol for evaporation at the drop surface caused by the packing of nanoparticle agglomerates near the drop surface. Comparisons of the present modeling predictions with sessile drop evaporation rate measurements reported for nAl/ethanol nanofluids by Sefiane and Bennacer [11] are in fairly good agreement. Portions of this abstract previously appeared as: W. J
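The D-square law referred to above has a standard compact form (this is the textbook statement of the law, not values specific to this thesis; K is the evaporation-rate constant fitted from the drop images):

```latex
% D-square law of droplet evaporation: the squared drop diameter
% decreases linearly in time,
D^2(t) = D_0^2 - K\,t ,
% so the measured evaporation rate is the slope K, and the "early time"
% regime quoted in the abstract corresponds to D^2/D_0^2 > 0.75.
```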
Energy Technology Data Exchange (ETDEWEB)
Dubois, C.; Somot, S.; Deque, M.; Sevault, F. [CNRM-GAME, Meteo-France, CNRS, Toulouse (France); Calmanti, S.; Carillo, A.; Dell' Aquilla, A.; Sannino, G. [ENEA, Rome (Italy); Elizalde, A.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Gualdi, S.; Oddo, P.; Scoccimarro, E. [INGV, Bologna (Italy); L' Heveder, B.; Li, L. [Laboratoire de Meteorologie Dynamique, Paris (France)
2012-10-15
Within the CIRCE project ''Climate change and Impact Research: the Mediterranean Environment'', an ensemble of high-resolution coupled atmosphere-ocean regional climate models (AORCMs) is used to simulate the Mediterranean climate for the period 1950-2050. For the first time, realistic net surface air-sea fluxes are obtained. The sea surface temperature (SST) variability is consistent with the atmospheric forcing above it and with oceanic constraints. The surface fluxes respond to external forcing under a warming climate and show an equivalent trend in all models. This study focuses on the present day and on the evolution of the heat and water budget over the Mediterranean Sea under the SRES-A1B scenario. In contrast to previous studies, the net total heat budget is negative over the present period in all AORCMs and satisfies the heat budget closure controlled by a net positive heat gain at the Strait of Gibraltar in the present climate. Under the climate change scenario, some models predict a warming of the Mediterranean Sea from the ocean surface (positive net heat flux), in addition to the positive flux at the Strait of Gibraltar, for the 2021-2050 period. The shortwave and latent fluxes are increasing and the longwave and sensible fluxes are decreasing compared to the 1961-1990 period, due to a reduction of the cloud cover and an increase in greenhouse gases (GHGs) and SSTs over the 2021-2050 period. The AORCMs provide good estimates of the water budget, with a drying of the region during the twenty-first century. For the ensemble mean, the decrease in precipitation and runoff is about 10 and 15% respectively, and the increase in evaporation is much weaker, about 2%, compared to the 1961-1990 period, which confirms results obtained in recent studies. Despite a clear consistency in the trends and results between the models, this study also underlines important differences in the model set-ups, methodology and choices of some physical parameters inducing
Miller, James R.; Russell, Gary L.; Hansen, James E. (Technical Monitor)
2001-01-01
The annual energy budget of the Arctic Ocean is characterized by a net heat loss at the air-sea interface that is balanced by oceanic heat transport into the Arctic. The energy loss at the air-sea interface is due to the combined effects of radiative, sensible, and latent heat fluxes. The inflow of heat by the ocean can be divided into two components: the transport of water masses of different temperatures between the Arctic and the Atlantic and Pacific Oceans, and the export of sea ice, primarily through Fram Strait. Two 150-year simulations (1950-2099) of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. One is a control simulation for the present climate with constant 1950 atmospheric composition, and the other is a transient experiment with observed GHGs from 1950 to 1990 and 0.5% annual compounded increases of CO2 after 1990. For the present climate the model agrees well with observations of radiative fluxes at the top of the atmosphere, atmospheric advective energy transport into the Arctic, and surface air temperature. It also simulates the seasonal cycle and summer increase of cloud cover and the seasonal cycle of sea-ice cover. In addition, the changes in high-latitude surface air temperature and sea-ice cover in the GHG experiment are consistent with observed changes during the last 40 and 20 years, respectively. Relative to the control, the last 50-year period of the GHG experiment indicates that even though the net annual incident solar radiation at the surface decreases by 4.6 W m-2 (because of greater cloud cover and increased cloud optical depth), the absorbed solar radiation increases by 2.8 W m-2 (because of less sea ice). Increased cloud cover and warmer air also cause increased downward thermal radiation at the surface, so that the net radiation into the ocean increases by 5.0 W m-2. The annual increase in radiation into the ocean, however, is
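The surface-budget numbers in the abstract can be cross-checked with simple arithmetic: if absorbed shortwave rises by 2.8 W m-2 while total net radiation into the ocean rises by 5.0 W m-2, the remaining 2.2 W m-2 must come from the net longwave (thermal) side. The numbers are from the abstract; the decomposition itself is my back-of-envelope inference, not a figure from the paper:

```python
# Back-of-envelope decomposition of the reported surface-budget changes.
d_incident_sw = -4.6    # W m^-2: less incident solar (more cloud)
d_absorbed_sw = +2.8    # W m^-2: more absorbed solar (less sea ice)
d_net_radiation = +5.0  # W m^-2: total increase in radiation into the ocean

# Whatever is not absorbed shortwave must come from net longwave changes
# (increased downward thermal radiation from clouds and warmer air):
d_net_lw = d_net_radiation - d_absorbed_sw
print(f"implied net longwave change: {d_net_lw:+.1f} W m^-2")
```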
Bouncing Model in Brane World Theory
Maier, Rodrigo; Soares, Ivano Damião
2013-01-01
We examine the nonlinear dynamics of a closed Friedmann-Robertson-Walker universe in the framework of Brane World formalism with a timelike extra dimension. In this scenario, the Friedmann equations contain additional terms arising from the bulk-brane interaction which provide a concrete model for nonsingular bounces in the early phase of the Universe. We construct a nonsingular cosmological scenario sourced with dust, radiation and a cosmological constant. The structure of the phase space shows a nonsingular orbit with two accelerated phases, separated by a smooth transition corresponding to a decelerated expansion. Given observational parameters we connect such phases to a primordial accelerated phase, a soft transition to Friedmann (where the classical regime is valid), and a graceful exit to a de Sitter accelerated phase.
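The bounce mechanism described above can be sketched schematically. In the commonly quoted brane-world form, a timelike extra dimension flips the sign of the quadratic density correction relative to the spacelike case; the exact equations of the paper may carry additional bulk terms, so this is only the generic structure:

```latex
% Brane-modified Friedmann equation with a timelike extra dimension
% (sigma is the brane tension; sign of the rho^2 term flipped relative
% to the usual spacelike Randall-Sundrum case):
H^2 + \frac{k}{a^2} = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{2\sigma}\right) + \frac{\Lambda}{3}
% For k = 0, Lambda = 0, the expansion rate H vanishes at the finite
% density rho = 2*sigma, so contraction halts and reverses: a nonsingular
% bounce replaces the big-bang singularity.
```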
An Abstraction Theory for Qualitative Models of Biological Systems
Banks, Richard; 10.4204/EPTCS.40.3
2010-01-01
Multi-valued network models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.
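The core abstraction idea, collapsing concrete states into abstract classes while keeping every transition the concrete model can make, can be sketched in a few lines. The toy two-gene network and the abstraction map below are hypothetical illustrations, not models from the paper:

```python
from itertools import product

def abstract_transitions(states, step, alpha):
    """Lift a concrete transition function through an abstraction map alpha:
    an abstract edge exists iff some concrete state in the class makes it."""
    edges = set()
    for s in states:
        edges.add((alpha(s), alpha(step(s))))
    return edges

# toy 2-gene multi-valued network: gene a in {0,1,2}, gene b in {0,1}
def step(s):
    a, b = s
    return (min(a + 1, 2) if b else max(a - 1, 0), 1 if a == 2 else 0)

alpha = lambda s: (min(s[0], 1), s[1])  # collapse levels 1 and 2 of gene a
states = list(product(range(3), range(2)))
edges = abstract_transitions(states, step, alpha)
print(sorted(edges))
```

Note that the abstract model becomes nondeterministic (abstract state (1, 0) has two successors): abstraction over-approximates behaviour, which is exactly why care is needed to identify abstractions that still preserve the properties of interest.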
Scaling Theory and Modeling of DNA Evolution
Buldyrev, Sergey V.
1998-03-01
We present evidence supporting the possibility that the nucleotide sequence in noncoding DNA is power-law correlated. We do not find such long-range correlation in the coding regions of the gene, so we build a ``coding sequence finder'' to locate the coding regions of an unknown DNA sequence. We also propose a different coding-sequence-finding algorithm, based on the concept of mutual information (I. Große, S. V. Buldyrev, H. Herzel, H. E. Stanley, preprint). We describe our recent work on quantification of DNA patchiness, using long-range correlation measures (G. M. Viswanathan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Biophysical Journal 72, 866-875 (1997)). We also present our recent study of simple repeat length distributions. We find that the distributions of some simple repeats in noncoding DNA have long power-law tails, while in coding DNA all simple repeat distributions decay exponentially (N. V. Dokholyan, S. V. Buldyrev, S. Havlin, and H. E. Stanley, Phys. Rev. Lett., in press). We discuss several models based on insertion-deletion and mutation-duplication mechanisms that relate long-range correlations in noncoding DNA to DNA evolution. Specifically, we relate long-range correlations in noncoding DNA to simple repeat expansion, and propose an evolutionary model that reproduces the power-law distribution of simple repeat lengths. We argue that the absence of long-range correlations in protein coding sequences is related to their highly conserved primary structure, which is necessary to ensure protein folding.
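The long-range correlation measures referred to above are typically computed on a ``DNA walk'': map purines to +1 and pyrimidines to -1, cumulate, and measure how the walk's fluctuation grows with window size (growth faster than the square root of the window signals long-range correlation). The sketch below uses the standard purine/pyrimidine mapping but an artificial uncorrelated sequence and illustrative window sizes:

```python
import random

def dna_walk_fluctuation(seq, window):
    """Root-mean-square fluctuation of the purine/pyrimidine DNA walk
    over windows of the given length (Peng-style analysis)."""
    steps = [1 if c in "AG" else -1 for c in seq]  # purine +1, pyrimidine -1
    y = [0]
    for s in steps:                # cumulative DNA walk
        y.append(y[-1] + s)
    diffs = [(y[i + window] - y[i]) ** 2 for i in range(len(y) - window)]
    return (sum(diffs) / len(diffs)) ** 0.5

random.seed(0)
uncorrelated = "".join(random.choice("ACGT") for _ in range(20000))
# for an uncorrelated sequence F(l) grows like l**0.5, so F(64)/F(8) ~ sqrt(8)
f8 = dna_walk_fluctuation(uncorrelated, 8)
f64 = dna_walk_fluctuation(uncorrelated, 64)
```

A power-law-correlated sequence would give a ratio systematically larger than the square-root prediction, which is the signature reported for noncoding DNA.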
Hannah, David R.; Venkatachary, Ranga
2010-01-01
In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…
Summary of papers presented in the Theory and Modelling session
Directory of Open Access Journals (Sweden)
Lin-Liu Y.R.
2012-09-01
A total of 14 contributions were presented in the Theory and Modelling sessions at EC-17, and one Theory and Modelling paper each was included in the ITER ECRH and ECE sessions. Three papers were in the area of nonlinear physics, discussing parametric processes accompanying ECRH. Eight papers were based on the quasi-linear theory of wave heating and current drive; three of these addressed the application of ECCD for NTM stabilization. Two papers considered scattering of EC waves by edge density fluctuations and related phenomena. In this summary, we briefly describe the highlights of these contributions. Finally, the three papers concerning modelling of various aspects of ECE are reported in the ECE session.
Design of formative assessment model for professional behavior using stages of change theory.
Hashemi, Akram; Mirzazadeh, Azim; Shirazi, Mandana; Asghari, Fariba
2016-01-01
Background: Professionalism is a core competency of physicians. This study was conducted to design a model for the formative assessment of professional commitment in medical students according to stages-of-change theory. Methods: In this qualitative study, data were collected through literature review and focus-group interviews at Tehran University of Medical Sciences in 2013, and analyzed using a content-analysis approach. Results: The literature review and the focus-group interviews led to the design of a formative assessment model of professional commitment in three phases (pre-contemplation, contemplation, and readiness for behavior change), each with interventional and assessment components. In the second phase of the study, experts' opinions were collected in two main categories: the educational environment (factors related to students, student assessment and the educational program) and administrative problems (factors related to subcultures, policymakers or managers, and budget). There was also a set of recommendations for each category, relating to the curriculum, professors, students, assessment, culture building, staff, and reinforcing administrative factors. Conclusion: This type of framework analysis made it possible to develop a conceptual model that could be effective in forming professional commitment and behavioral change in medical students.
A Model of Resurgence Based on Behavioral Momentum Theory
Shahan, Timothy A; Sweeney, Mary M.
2011-01-01
Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforc...
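The quantitative core of such momentum-based models can be sketched numerically. The functional form below follows the commonly cited behavioral-momentum extinction equation (response strength decays with accumulated disruption, scaled by baseline reinforcement rate); every parameter value here is hypothetical, and the published model may include additional terms:

```python
import math

def predicted_response(B0, t, c, d, Ra, r, b=0.5):
    """Proportion-of-baseline response rate after t sessions of extinction,
    disrupted by extinction itself (c) and by alternative reinforcement
    at rate Ra (weighted by d), resisted in proportion to r**b."""
    return B0 * math.exp(-t * (c + d * Ra) / r ** b)

B0, c, d, r = 1.0, 1.0, 0.01, 100.0   # hypothetical parameter values
# Phase 2: extinction with alternative reinforcement (Ra = 20) for 10 sessions
phase2_end = predicted_response(B0, t=10, c=c, d=d, Ra=20.0, r=r)
# Phase 3: alternative reinforcement removed (Ra = 0) at session 11
phase3_start = predicted_response(B0, t=11, c=c, d=d, Ra=0.0, r=r)
print(phase2_end, phase3_start)
```

Removing the alternative reinforcement removes its disruptive impact, so the predicted response rate steps back up even though extinction continues: that step up is the model's account of resurgence.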
Magnetized cosmological models in bimetric theory of gravitation
Indian Academy of Sciences (India)
S D Katore; R S Rane
2006-08-01
A Bianchi type-III magnetized cosmological model, with the gravitational field governed by either a perfect fluid or a cosmic string, is investigated in Rosen's [1] bimetric theory of gravitation. To obtain a determinate solution, a condition between the metric potentials (one potential equal to a constant power of another) is used. We have assumed different equations of state for the cosmic string [2] to complete the solution of the model. Some physical and geometrical properties of the exhibited model are discussed.
Dynamics in Nonlocal Cosmological Models Derived from String Field Theory
Joukovskaya, Liudmila
2007-01-01
A general class of nonlocal cosmological models is considered. A new method for solving nonlocal Friedmann equations is proposed, and solutions of the Friedmann equations with a nonlocal operator are presented. The cosmological properties of these solutions are discussed. In particular, we consider the $p$-adic cosmological model, in which we have obtained a nonsingular bouncing solution, and the string field theory tachyon model, in which we have obtained the full solution of the nonlocal Friedmann equations with $w=...
Hydrodynamics Research on Amphibious Vehicle Systems:Modeling Theory
Institute of Scientific and Technical Information of China (English)
JU Nai-jun
2006-01-01
To support the hydrodynamics software development and engineering application research on amphibious vehicle systems, the hydrodynamic modeling theory of such systems is elaborated. This includes building dynamic system models of amphibious vehicle motion on water, gun tracking-aiming-firing, bullet hit and armored target checking, and gunner operating control, as well as a time-domain simulation model of random sea waves.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
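The causal statistical learning described above can be illustrated with a toy Bayesian update. The hypotheses and detector probabilities below are invented for illustration and are not taken from the paper:

```python
# Toy sketch: Bayesian updating over two causal hypotheses about a
# "blicket-detector"-style machine (hypothetical numbers).
# H1: block A activates the machine (P(activate | A) = 0.9)
# H2: block A is inert            (P(activate | A) = 0.1)
def posterior(prior_h1, likelihood_h1, likelihood_h2):
    """Return P(H1 | data) by Bayes' rule for two hypotheses."""
    num = prior_h1 * likelihood_h1
    return num / (num + (1 - prior_h1) * likelihood_h2)

p = 0.5  # flat prior over the two hypotheses
for observed_activation in [True, True, True]:
    l1 = 0.9 if observed_activation else 0.1
    l2 = 0.1 if observed_activation else 0.9
    p = posterior(p, l1, l2)

print(round(p, 3))  # belief in H1 after three activations -> 0.999
```

Three consistent observations push the learner's belief in the causal hypothesis from 0.5 to about 0.999, mirroring how statistical evidence can drive rapid theory revision.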
Population changes: contemporary models and theories.
Sauvy, A
1981-01-01
In many developing countries rapid population growth has promoted a renewed interest in the study of the effect of population growth on economic development. This research takes either the macroeconomic viewpoint, where the nation is the framework, or the microeconomic perspective, where the family is the framework. For expository purposes, the macroeconomic viewpoint is assumed, and an example of such an investment is presented. Attention is directed to the following: a simplified model--housing; the lessons learned from experience (primitive populations, Spain in the 17th and 18th centuries, comparing development in Spain and Italy, 19th century Western Europe, and underdeveloped countries); the positive factors of population growth; and the concept of the optimal rate of growth. Housing is the typical investment that an individual makes. Hence, the housing per person (roughly 1/3 of the necessary amount of housing per family) is taken as a unit, and the calculations are made using averages. The conclusion is that growth is expensive. A population decrease might be advantageous, for this decrease would enable the entire population to benefit from past capital accumulation. It is also believed, "a priori," that population growth is more expensive for a developed than for a developing country. This belief may be attributable to the fact that the capital per person tends to be high in the developed countries. Any further increase in the population requires additional capital investments, driving this ratio even higher. Yet, investment is not the only factor inhibiting economic development. The literature describes factors regarding population growth, yet this writer prefers to emphasize 2 other factors that have been the subject of less study: a growing population's ease of adaptation and the human factor--behavior. A growing population adapts better to new conditions than does a stationary or declining population, and contrary to "a priori" belief, a growing
Perturbation theory for string sigma models
Bianchi, Lorenzo
2016-01-01
In this thesis we investigate quantum aspects of the Green-Schwarz superstring in various AdS backgrounds relevant for the AdS/CFT correspondence, providing several examples of perturbative computations in the corresponding integrable sigma-models. We start by reviewing in detail the supercoset construction of the superstring action in $AdS_5 \\times S^5$, pointing out the limits of this procedure for $AdS_4$ and $AdS_3$ backgrounds. For the $AdS_4 \\times CP^3$ case we give a thorough derivation of an alternative action, based on the double-dimensional reduction of eleven-dimensional super-membranes. We then consider the expansion about the BMN vacuum and the S-matrix for the scattering of worldsheet excitations in the decompactification limit. To evaluate its elements efficiently we describe a unitarity-based method resulting in a very compact formula yielding the cut-constructible part of any one-loop two-dimensional S-matrix. In the second part of this review we analyze the superstring action on $AdS_4 \\ti...
THE REAL OPTIONS OF CAPITAL BUDGET
Directory of Open Access Journals (Sweden)
Antonio Lopo Martins
2008-07-01
Full Text Available Traditional capital budgeting techniques, such as discounted cash flow and net present value (NPV), do not incorporate the flexibilities embedded in an investment project and tend to distort the value of certain investments, mainly those made under uncertainty and risk. This study therefore aims to demonstrate that Real Options Theory (TOR) is a useful methodology for evaluating and identifying the best option for an expansion investment project. The procedure adopted was a case study, with the Resort Praia Hotel do Litoral Norte of Salvador as the unit of analysis. The study was developed as follows: first, the traditional NPV was identified and the volatility of each analyzed uncertainty was incorporated. Second, since real options are analogous to financial options, it was necessary to identify the elements of financial option terminology in order to obtain the value of the real option. The Black & Scholes option pricing model, together with a computational simulator (SLS), was used to obtain the expanded NPV. The study shows that under the traditional capital budgeting tool, NPV is negative, so the hotel expansion project would be rejected, whereas under TOR the project presents a positive expanded present value, representing an excellent investment opportunity. Key-words: Capital budgeting, Real options, Investment analysis.
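As a rough illustration of the approach in the abstract, the sketch below prices a hypothetical expansion option with the standard Black-Scholes call formula. All project figures (cash-flow value, outlay, volatility, horizon) are made up, and the SLS simulator is not reproduced:

```python
# Hedged sketch: a negative static NPV can become a positive "expanded NPV"
# once the value of the option to expand is priced (hypothetical numbers).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call: S is the present value of the expansion's
    cash flows, K the investment outlay, sigma the project volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

npv = 10.0 - 12.0  # static NPV (in currency millions): negative, so rejected
option = bs_call(S=10.0, K=12.0, r=0.05, sigma=0.6, T=2.0)
expanded_npv = npv + option  # turns positive once flexibility is priced
```

With these illustrative inputs the option value (about 3.0) outweighs the static shortfall of 2.0, reproducing the paper's qualitative finding that the expanded present value can reverse a rejection decision.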
Lenses on Reading An Introduction to Theories and Models
Tracey, Diane H
2012-01-01
This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition
Theory of compressive modeling and simulation
Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith
2013-05-01
Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limits on computing power and turnaround time. We suggest a third approach, (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) principle facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. MFE-based CM&S can generalize LCNN to second order as a nonlinear augmented LCNN. For example, at sunset we can avoid the reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon by using a night-vision camera instead of a day camera. We decomposed the long-wave infrared (LWIR) band with a filter into two vector components (8-10 μm and 10-12 μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then we up-shifted consistently, according to the de-mixed source map, to a sub-micron RGB color image. Moreover, night-vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, which suffers less blur from scattering by dusty smoke and enjoys the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in spatial Rayleigh resolution, but gains two orders of magnitude in reflectivity and another two orders in propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, it can reduce unnecessary measurements and their associated cost and computation, in the sense of super-saving CS: measure one and get its neighborhood free.
Consistent constraints on the Standard Model Effective Field Theory
Berthier, Laure
2015-01-01
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEPI and LEP II, as well as low energy precision data. We fit one hundred observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, $\\Lambda \\gtrsim \\, 3 \\, {\\rm TeV}$. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an $\\rm S,T$ analysis is modified by the theory errors we include as an illustrative example.
Ulses, C.; Auger, P.-A.; Soetaert, K.; Marsaleix, P.; Diaz, F.; Coppola, L.; Herrmann, M. J.; Kessouri, F.; Estournel, C.
2016-09-01
A 3-D hydrodynamic-biogeochemical coupled model has been used to estimate a budget of organic carbon and its interannual variability over the 5-year period 2004-2008 in the North-Western Mediterranean Open Sea (NWMOS). Comparison of its results with in situ and satellite observations reveals that the timing and magnitude of the convection and bloom processes during the study period, marked by contrasting atmospheric conditions, are reasonably well reproduced by the model. Model outputs show that the amount of nutrients annually injected into the surface layer is clearly linked to the intensity of winter convection events. During cold winters, primary production is reduced by intense mixing events but then increases spectacularly when the water column restratifies. In contrast, during mild winters, primary production increases progressively and continuously, sustained by moderate new production followed by regenerated production. Overall, interannual variability in annual primary production is low. Export in the subsurface and at mid-depth is, however, affected by the intensity of the convection process, with annual values twice as high during cold winters as during mild winters. Finally, the estimation of a global budget of organic carbon reveals that the NWMOS acts as a sink for the shallower areas and as a source for the Algerian and Balearic subbasins.
Theories beyond the standard model, one year before the LHC
Dimopoulos, Savas
2006-04-01
Next year the Large Hadron Collider (LHC) at CERN will begin what may well be a new golden era of particle physics. I will discuss three theories that will be tested at the LHC. I will begin with the supersymmetric standard model, proposed with Howard Georgi in 1981. This theory made a precise quantitative prediction, the unification of couplings, that was experimentally confirmed in 1991 by experiments at CERN and SLAC. This established it as the leading theory for physics beyond the standard model. Its main prediction, the existence of supersymmetric particles, will be tested at the LHC. I will next overview theories with large new dimensions, proposed with Nima Arkani-Hamed and Gia Dvali in 1998. This framework links the weakness of gravity to the presence of sub-millimeter-size dimensions, which are presently searched for in experiments looking for deviations from Newton's law at short distances. In this framework quantum gravity, string theory, and black holes may be experimentally investigated at the LHC. I will end with the recent proposal of split supersymmetry with Nima Arkani-Hamed. This theory is motivated by the possible existence of an enormous number of ground states in the fundamental theory, as suggested by the cosmological constant problem and recent developments in string theory and cosmology. It can be tested at the LHC and, if confirmed, it will lend support to the idea that our universe and its laws are not unique and that there is an enormous variety of universes, each with its own distinct physical laws.
Cosmological Model Based on Gauge Theory of Gravity
Institute of Scientific and Technical Information of China (English)
WU Ning
2005-01-01
A cosmological model based on a gauge theory of gravity is proposed in this paper. Combining the cosmological principle with the field equation of the gravitational gauge field, dynamical equations for the scale factor R(t) of our universe can be obtained. This set of equations has three different solutions. A prediction of the present model is that, if the energy density of the universe is not zero and the universe is expanding, the universe must be spatially flat and its total energy density must equal the critical density ρc. For the spatially flat case, this model gives the same solution as the Friedmann model. In other words, though they have different dynamics of gravitational interactions, general relativity and gauge theory of gravity give the same cosmological model.
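For reference, the critical density invoked in the abstract is the standard cosmological quantity ρc = 3H²/(8πG). The short sketch below evaluates it for an assumed Hubble constant of 70 km/s/Mpc, a conventional round value rather than one taken from the paper:

```python
# Sketch: the critical density of a spatially flat universe,
# rho_c = 3 H^2 / (8 pi G), for an assumed H0 of 70 km/s/Mpc.
from math import pi

G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
H0 = 70 * 1000 / 3.086e22     # 70 km/s/Mpc converted to s^-1

rho_c = 3 * H0**2 / (8 * pi * G)
print(f"{rho_c:.2e} kg/m^3")  # ~9e-27 kg/m^3, a few protons per cubic metre
```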
A model of PCF in guarded type theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction, and the proof of adequacy...
Higher-Rank Supersymmetric Models and Topological Field Theory
Kawai, T; Yang, S K; Kawai, Toshiya; Uchino, Taku; Yang, Sung-Kil
1993-01-01
In the first part of this paper we investigate the operator aspects of the higher-rank supersymmetric model, which is introduced as a Lie-theoretic extension of the $N=2$ minimal model, with the simplest case $su(2)$ corresponding to the $N=2$ minimal model. In particular we identify the analogs of the chirality conditions and the chiral ring. In the second part we construct a class of topological conformal field theories starting from this higher-rank supersymmetric model. We show the BRST-exactness of the twisted stress-energy tensor, identify the physical observables, and discuss how to compute their correlation functions. It is emphasized that in the case of $su(2)$ the topological field theory constructed in this paper is distinct from the one obtained by twisting the $N=2$ minimal model through the usual procedure.
An introduction to queueing theory modeling and analysis in applications
Bhat, U Narayan
2015-01-01
This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing and, computer and communication systems. • A chapter on ...
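As a flavor of the models such a course builds on, here is a minimal sketch (not taken from the book) of the steady-state formulas for the basic M/M/1 queue:

```python
# Sketch: steady-state metrics of an M/M/1 queue (Poisson arrivals at rate
# lam, exponential service at rate mu, single server).
def mm1_metrics(lam, mu):
    """Return (utilization, mean number in system L, mean time in system W)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu        # server utilization
    L = rho / (1 - rho)   # mean number in system
    W = L / lam           # mean time in system, via Little's law L = lam * W
    return rho, L, W

rho, L, W = mm1_metrics(lam=4.0, mu=5.0)  # 4 arrivals/hr vs 5 served/hr
# rho = 0.8, L = 4 customers, W = 1 hour
```

Note how close-to-saturation utilization inflates both queue length and waiting time, the core qualitative lesson of elementary queueing models.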
Bridging emotion theory and neurobiology through dynamic systems modeling.
Lewis, Marc D
2005-04-01
Efforts to bridge emotion theory with neurobiology can be facilitated by dynamic systems (DS) modeling. DS principles stipulate higher-order wholes emerging from lower-order constituents through bidirectional causal processes--offering a common language for psychological and neurobiological models. After identifying some limitations of mainstream emotion theory, I apply DS principles to emotion-cognition relations. I then present a psychological model based on this reconceptualization, identifying trigger, self-amplification, and self-stabilization phases of emotion-appraisal states, leading to consolidating traits. The article goes on to describe neural structures and functions involved in appraisal and emotion, as well as DS mechanisms of integration by which they interact. These mechanisms include nested feedback interactions, global effects of neuromodulation, vertical integration, action-monitoring, and synaptic plasticity, and they are modeled in terms of both functional integration and temporal synchronization. I end by elaborating the psychological model of emotion-appraisal states with reference to neural processes.
Twisted gauge theories in 3D Walker-Wang models
Wang, Zitao
2016-01-01
Three dimensional gauge theories with a discrete gauge group can emerge from spin models as a gapped topological phase with fractional point excitations (gauge charge) and loop excitations (gauge flux). It is known that 3D gauge theories can be "twisted", in the sense that the gauge flux loops can have nontrivial braiding statistics among themselves and such twisted gauge theories are realized in models discovered by Dijkgraaf and Witten. A different framework to systematically construct three dimensional topological phases was proposed by Walker and Wang and a series of examples have been studied. Can the Walker Wang construction be used to realize the topological order in twisted gauge theories? This is not immediately clear because the Walker-Wang construction is based on a loop condensation picture while the Dijkgraaf-Witten theory is based on a membrane condensation picture. In this paper, we show that the answer to this question is Yes, by presenting an explicit construction of the Walker Wang models wh...
Structural properties of effective potential model by liquid state theories
Institute of Scientific and Technical Information of China (English)
Xiang Yuan-Tao; Andrej Jamnik; Yang Kai-Wei
2010-01-01
This paper investigates the structural properties of a model fluid governed by an effective inter-particle oscillatory potential, using grand canonical ensemble Monte Carlo (GCEMC) simulation and classical liquid-state theories. The chosen oscillatory potential incorporates basic interaction terms used in modeling various complex fluids composed of mesoscopic particles dispersed in a solvent bath. The studied structural properties include the radial distribution function in bulk and the inhomogeneous density distribution profile under the influence of several external fields. The GCEMC results are employed to test the validity of two recently proposed theoretical approaches from the field of atomic fluids: an Ornstein-Zernike integral equation theory, and a third-order + second-order perturbation density functional theory. Satisfactory agreement between the GCEMC simulations and the pure theories indicates the ready adaptability of atomic fluid theories to effective model potentials in complex fluids, and establishes the proposed theoretical approaches as convenient tools for investigating complex fluids under the single-component macro-fluid approximation.
Traffic Games: Modeling Freeway Traffic with Game Theory.
Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R
2016-01-01
We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
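The driver-driver interaction idea can be sketched as a 2x2 game; the strategies and payoff numbers below are hypothetical and do not reproduce the paper's actual payoff structure:

```python
# Hypothetical sketch: a lane-change encounter as a 2x2 game between two
# drivers. Strategies: "yield" (cooperate) or "push" (defect into the gap).
PAYOFFS = {  # (row move, column move) -> (row payoff, column payoff)
    ("yield", "yield"): (3, 3),  # both flow smoothly
    ("yield", "push"):  (1, 4),  # pusher gains the gap
    ("push",  "yield"): (4, 1),
    ("push",  "push"):  (0, 0),  # conflict: both brake
}

def best_response(opponent_move):
    """Row driver's best reply given the opponent's move."""
    return max(("yield", "push"),
               key=lambda mine: PAYOFFS[(mine, opponent_move)][0])

# A Chicken-like structure: the best reply to a yielder is to push,
# and the best reply to a pusher is to yield.
print(best_response("yield"), best_response("push"))  # push yield
```

Iterating such interactions over many simulated encounters is what lets lane-changing behaviors and cooperation levels be tracked as the abstract describes.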
Theory, modeling and simulation of superconducting qubits
Energy Technology Data Exchange (ETDEWEB)
Berman, Gennady P [Los Alamos National Laboratory; Kamenev, Dmitry I [Los Alamos National Laboratory; Chumak, Alexander [INSTIT OF PHYSICS, KIEV; Kinion, Carin [LLNL; Tsifrinovich, Vladimir [POLYTECHNIC INSTIT OF NYU
2011-01-13
We analyze the dynamics of a qubit-resonator system coupled to a thermal bath and external electromagnetic fields. Using the evolution equations for the set of Heisenberg operators that describe the whole system, we derive an expression for the resonator field that includes the resonator-drive, resonator-bath, and resonator-qubit interactions. The renormalization of the resonator frequency caused by the qubit-resonator interaction is accounted for. Using the solutions for the resonator field, we derive the equation that describes the qubit dynamics. The dependence of the qubit evolution during the measurement time on the fidelity of a single-shot measurement is studied, and the relation between fidelity and measurement time is shown explicitly. We propose a novel adiabatic method for phase qubit measurement. The method utilizes a low-frequency, quasi-classical resonator inductively coupled to the qubit. The resonator modulates the qubit energy, and the back reaction of the qubit causes a shift in the phase of the resonator, which can be used to determine the qubit state. We have simulated this measurement taking into account the energy levels outside the phase qubit manifold. We have shown that, for qubit frequencies in the range of 8-12 GHz, a resonator frequency of 500 MHz, and a measurement time of 100 ns, the phase difference between the two qubit states is greater than 0.2 rad. This phase difference exceeds the measurement uncertainty and can be detected using a classical phase meter. A fidelity of 0.9999 can be achieved for a relaxation time of 0.5 ms. We also model and simulate a microstrip-SQUID amplifier with a frequency of about 500 MHz, which could be used to amplify the resonator oscillations in the phase qubit adiabatic measurement. The voltage gain and the amplifier noise temperature are calculated. We simulate the preparation of a generalized Bell state and compute the relaxation times required for achieving high
POLITICAL BUDGET CYCLES: EVIDENCE FROM TURKEY
Directory of Open Access Journals (Sweden)
FİLİZ ERYILMAZ
2015-04-01
Full Text Available The theoretical literature on "Political Business Cycles" presents important insights into the extent to which politicians attempt to manipulate government monetary and fiscal policies to influence electoral outcomes, in particular with the aim of re-election. In recent years, "Political Budget Cycles" has become one of the most important topics in the Political Business Cycles literature. According to Political Budget Cycles Theory, some components of the government budget are influenced by the electoral cycle; consequently, government spending increases or taxes decrease in an election year, leading to larger fiscal deficits. This fiscal manipulation is a tool incumbents possess to increase their chances of re-election. In this paper we investigate the presence of Political Budget Cycles using a data set of budget balance, total expenditure, and total revenue over the period 1994-2012. Our findings suggest that incumbents in Turkey use fiscal policy to increase their popularity and win elections; fiscal manipulation was therefore rewarded rather than punished by Turkish voters. This result means that Political Budget Cycles Theory is valid for Turkey between 1994 and 2012.
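A minimal sketch of the kind of test the abstract describes, using synthetic data rather than the paper's actual series: regress the budget balance on an election-year dummy and look for a significant deterioration. The balance figures below are simulated, with Turkish general election years used only to build the dummy:

```python
# Illustrative sketch (synthetic data): detecting a political budget cycle
# via OLS with an election-year dummy over 1994-2012.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1994, 2013)
election = np.isin(years, [1995, 1999, 2002, 2007, 2011]).astype(float)
# simulated budget balance (% of GDP) with a built-in -1.5 point
# deterioration in election years plus noise
balance = -3.0 - 1.5 * election + rng.normal(0, 0.5, years.size)

X = np.column_stack([np.ones_like(election), election])  # intercept + dummy
coef, *_ = np.linalg.lstsq(X, balance, rcond=None)
intercept, election_effect = coef
# a clearly negative election_effect is consistent with
# pre-electoral fiscal loosening
```

On real data one would of course add controls (growth, debt, global shocks) before attributing the election-year gap to deliberate manipulation.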
Thimble regularization at work besides toy models: from Random Matrix Theory to Gauge Theories
Eruzzi, G
2015-01-01
Thimble regularization as a solution to the sign problem has been successfully put to work for a few toy models. Given the non-trivial nature of the method (also from the algorithmic point of view), it is compelling to provide evidence that it works for realistic models. A chiral random matrix theory has been studied in detail. The known analytical solution shows that the model is non-trivial with respect to the sign problem (in particular, phase-quenched results can be very far from the exact solution). This study gave us the chance to address a couple of key issues: how many thimbles contribute to the solution of a realistic problem? Can one devise algorithms that are robust in staying on the correct manifold? The obvious step forward consists of applications to gauge theories.
7 CFR 3402.14 - Budget and budget narrative.
2010-01-01
Agriculture Regulations of the Department of Agriculture (Continued), Cooperative State Research, Education... Applicants must prepare the Budget, Form CSREES-2004, and a budget...
Pension Benefit Guaranty Corporation — The Summary of Changes dataset extracted from PBGC's congressional budget justification. It contains all administrative and program increases and decreases including...
Matrix Factorizations for Local F-Theory Models
Omer, Harun
2016-01-01
I use matrix factorizations to describe branes at simple singularities as they appear in elliptic fibrations of local F-theory models. Each node of the corresponding Dynkin diagrams of the ADE-type singularities is associated with one indecomposable matrix factorization which can be deformed into one or more factorizations of lower rank. Branes with internal fluxes arise naturally as bound states of the indecomposable factorizations. Describing branes in such a way avoids the need to resolve singularities and encodes information which is neglected in conventional F-theory treatments. This paper aims to show how branes arising in local F-theory models around simple singularities can be described in this framework.
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution for delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions of the telehealth system, and show that social capital factors (social trust, institutional trust, and social participation) significantly and positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influence usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fit the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in government health departments, hospitals, and rural communities.
Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions
Directory of Open Access Journals (Sweden)
Camaren Peter
2014-03-01
Full Text Available In this paper, we deploy complexity theory as the foundation for the integration of different theoretical approaches to sustainability, and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex-systems properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach to modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address the particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach to modeling sustainability transitions that caters for the broad range of complex-systems properties required to model transitions to sustainability.
Roth, Jason L.; Capel, Paul D.
2012-01-01
Crop agriculture occupies 13 percent of the conterminous United States. Agricultural management practices, such as crop and tillage types, affect the hydrologic flow paths through the landscape. Some agricultural practices, such as drainage and irrigation, create entirely new hydrologic flow paths upon the landscapes where they are implemented. These hydrologic changes can affect the magnitude and partitioning of water budgets and sediment erosion. Given the wide degree of variability amongst agricultural settings, changes in the magnitudes of hydrologic flow paths and sediment erosion induced by agricultural management practices commonly are difficult to characterize, quantify, and compare using only field observations. The Water Erosion Prediction Project (WEPP) model was used to simulate two landscape characteristics (slope and soil texture) and three agricultural management practices (land cover/crop type, tillage type, and selected agricultural land management practices) to evaluate their effects on the water budgets of and sediment yield from agricultural lands. An array of sixty-eight 60-year simulations was run, each representing a distinct natural or agricultural scenario with various slopes, soil textures, crop or land cover types, tillage types, and select agricultural management practices on an isolated 16.2-hectare field. Simulations were made to represent two common agricultural climate regimes: arid with sprinkler irrigation and humid. These climate regimes were constructed with actual climate and irrigation data. The results of these simulations demonstrate the magnitudes of potential changes in water budgets and sediment yields from lands as a result of landscape characteristics and the agricultural practices adopted on them. These simulations showed that variations in landscape characteristics, such as slope and soil type, had appreciable effects on water budgets and sediment yields. As slopes increased, sediment yields increased in both the arid and
Conceptualizations of Creativity: Comparing Theories and Models of Giftedness
Miller, Angie L.
2012-01-01
This article reviews seven different theories of giftedness that include creativity as a component, comparing and contrasting how each one conceptualizes creativity as a part of giftedness. The functions of creativity vary across the models, suggesting that while the field of gifted education often cites the importance of creativity, the…
Multilevel Higher-Order Item Response Theory Models
Huang, Hung-Yu; Wang, Wen-Chung
2014-01-01
In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…
[General systems theory, analog models and essential arterial hypertension].
Indovina, I; Bonelli, M
1991-02-15
The application of the General System Theory to the fields of biology and particularly of medicine is fraught with many difficulties deriving from the mathematical complexities of application. The authors suggest that these difficulties can be overcome by applying analogical models, thus opening new prospects for the resolution of the manifold problems involved in connection with the study of arterial hypertension.
Application of Health Promotion Theories and Models for Environmental Health
Parker, Edith A.; Baldwin, Grant T.; Israel, Barbara; Salinas, Maria A.
2004-01-01
The field of environmental health promotion gained new prominence in recent years as awareness of physical environmental stressors and exposures increased in communities across the country and the world. Although many theories and conceptual models are used routinely to guide health promotion and health education interventions, they are rarely…
Classical and Quantum Theory of Perturbations in Inflationary Universe Models
Brandenberger, R H; Mukhanov, V
1993-01-01
A brief introduction to the gauge invariant classical and quantum theory of cosmological perturbations is given. The formalism is applied to inflationary Universe models and yields a consistent and unified description of the generation and evolution of fluctuations. A general formula for the amplitude of cosmological perturbations in inflationary cosmology is derived.
Pilot evaluation in TENCompetence: a theory-driven model
Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Specht, Marcus; Glahn, Christian; Stefanov, Krassen
2007-01-01
Schoonenboom, J., Sligte, H., Moghnieh, A., Specht, M., Glahn, C., & Stefanov, K. (2007). Pilot evaluation in TENCompetence: a theory-driven model. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong Com
A Proposed Model of Jazz Theory Knowledge Acquisition
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
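The reliability coefficients reported for the latent variables (0.92 to 0.94) are internal-consistency estimates; one common such estimate, Cronbach's alpha, can be computed directly from an item-score matrix. This is a generic sketch of the statistic, not the authors' code:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Three perfectly parallel items give alpha = 1.0
base = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([base, base, base])
alpha = cronbach_alpha(X)
```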
Evaluating hydrological model performance using information theory-based metrics
The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
Stochastic models in risk theory and management accounting
Brekelmans, R.C.M.
2000-01-01
This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest for
Anisotropic cosmological models and generalized scalar tensor theory
Indian Academy of Sciences (India)
Subenoy Chakraborty; Batul Chandra Santra; Nabajit Chakravarty
2003-10-01
In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-times. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the physical parameters and solutions have been discussed.
Using SAS PROC MCMC for Item Response Theory Models
Ames, Allison J.; Samonte, Kelli
2015-01-01
Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
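The kind of Bayesian estimation that PROC MCMC (or WinBUGS, BMIRT, and the R packages mentioned) performs for item response models can be illustrated with a hand-rolled random-walk Metropolis sampler for a single examinee's ability θ under a Rasch model with known item difficulties. This is a didactic toy, not SAS syntax, and all numbers are assumptions:

```python
import numpy as np

def log_post(theta, resp, b):
    """Log posterior of ability theta: Rasch likelihood + N(0, 1) prior."""
    p = 1.0 / (1.0 + np.exp(-(theta - b)))  # P(correct) per item
    loglik = np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))
    return loglik - theta ** 2 / 2.0

rng = np.random.default_rng(0)
b = np.array([-1.0, 0.0, 1.0])   # known item difficulties
resp = np.array([1, 1, 0])       # observed responses (1 = correct)

theta = 0.0
chain = []
for _ in range(2000):
    prop = theta + rng.normal(scale=0.5)  # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop, resp, b) - log_post(theta, resp, b):
        theta = prop
    chain.append(theta)

est = np.mean(chain[500:])  # posterior mean after burn-in
```

Real MCMC software adds adaptive tuning, multiple chains, and convergence diagnostics; the acceptance step above is the core of all of them.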
Taniguchi, Kristine; Gudiño, Napoleon; Biggs, Trent; Castillo, Carlos; Langendoen, Eddy; Bingner, Ron; Taguas, Encarnación; Liden, Douglas; Yuan, Yongping
2015-04-01
Several watersheds cross the US-Mexico boundary, resulting in trans-boundary environmental problems. Erosion in Tijuana, Mexico, increases the rate of sediment deposition in the Tijuana Estuary in the United States, altering the structure and function of the ecosystem. The well-being of residents in Tijuana is compromised by damage to infrastructure and homes built adjacent to stream channels, gully formation in dirt roads, and deposition of trash. We aim to understand the dominant source of sediment contributing to the sediment budget of the watershed (channel, gully, or rill erosion), where the hotspots of erosion are located, and what the impact of future planned and unplanned land use changes and Best Management Practices (BMPs) will be on sediment and storm flow. We will be using a mix of field methods, including 3D photo-reconstruction of stream channels, with two models, CONCEPTS and AnnAGNPS to constrain estimates of the sediment budget and impacts of land use change. Our research provides an example of how 3D photo-reconstruction and Structure from Motion (SfM) can be used to model channel evolution.
Directory of Open Access Journals (Sweden)
Joel Arnault
2012-02-01
Full Text Available Gravity waves generated by the Vestfjella Mountains (in western Dronning Maud Land, Antarctica, southwest of the Finnish/Swedish Aboa/Wasa station) have been observed with the Moveable Atmospheric Radar for Antarctica (MARA) during the SWEDish Antarctic Research Programme (SWEDARP) in December 2007/January 2008. These radar observations are compared with a 2-month Weather Research and Forecasting (WRF) model experiment operated at 2 km horizontal resolution. A control simulation without orography is also operated in order to separate unambiguously the contribution of the mountain waves to the simulated atmospheric flow. This contribution is then quantified with a kinetic energy budget analysis computed in the two simulations. The results of this study confirm that mountain waves reaching lower-stratospheric heights break through convective overturning and generate inertia gravity waves with a smaller vertical wavelength, in association with a brief depletion of kinetic energy through frictional dissipation and negative vertical advection. The kinetic energy budget also shows that gravity waves have a strong influence on the other terms of the budget, i.e. horizontal advection and the horizontal work of pressure forces, so evaluating the influence of gravity waves on the mean flow with the vertical advection term alone is not sufficient, at least in this case. We finally find that gravity waves generated by the Vestfjella Mountains reaching lower-stratospheric heights generally deplete (create) kinetic energy in the lower troposphere (upper troposphere–lower stratosphere), in contradiction with the usual decelerating effect attributed to gravity waves on the zonal circulation in the upper troposphere–lower stratosphere.
Theory and modelling of diamond fracture from an atomic perspective.
Brenner, Donald W; Shenderova, Olga A
2015-03-28
Discussed in this paper are several theoretical and computational approaches that have been used to better understand the fracture of both single-crystal and polycrystalline diamond at the atomic level. The studies, which include first principles calculations, analytic models and molecular simulations, have been chosen to illustrate the different ways in which this problem has been approached, the conclusions and their reliability that have been reached by these methods, and how these theory and modelling methods can be effectively used together.
Cirafici, M.; Sinkovics, A.; Szabo, R.J.
2009-01-01
We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
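A toy instance of such theory-based induction is the "number game": hypotheses are structured concepts, the prior is uniform, and the size principle assigns likelihood (1/|h|)^n to a hypothesis h consistent with n observations, so smaller consistent hypotheses win. A minimal sketch (hypothesis space and data chosen here for illustration):

```python
# Structured hypothesis space over the integers 1..100.
hypotheses = {
    "even":        {n for n in range(1, 101) if n % 2 == 0},
    "powers_of_2": {2 ** k for k in range(1, 7)},  # {2, 4, ..., 64}
    "mult_of_10":  {n for n in range(10, 101, 10)},
}

def posterior(data):
    """P(h | data) under a uniform prior and the size-principle likelihood."""
    scores = {}
    for name, h in hypotheses.items():
        if all(x in h for x in data):
            scores[name] = (1.0 / len(h)) ** len(data)  # (1/|h|)^n
        else:
            scores[name] = 0.0  # hypothesis inconsistent with the data
    z = sum(scores.values())
    return {name: s / z for name, s in scores.items()}

post = posterior([16, 8, 2])
best = max(post, key=post.get)
```

Although "even" also contains 16, 8, and 2, it has 50 members versus 6, so the size principle concentrates the posterior on "powers_of_2", mirroring how structured knowledge plus statistics yields sharp generalization from sparse data.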
On ADE Quiver Models and F-Theory Compactification
Belhaj, A; Sebbar, A; Sedra, M B
2006-01-01
Based on mirror symmetry, we discuss geometric engineering of N=1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibered over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.
On ADE quiver models and F-theory compactification
Energy Technology Data Exchange (ETDEWEB)
Belhaj, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Rasmussen, J [Department of Mathematics and Statistics, University of Melbourne, Parkville, Victoria 3010 (Australia); Sebbar, A [Department of Mathematics and Statistics, University of Ottawa, 585 King Edward Ave., Ottawa, ON, K1N 6N5 (Canada); Sedra, M B [Laboratoire de Physique de la Matiere et Rayonnement (LPMR), Morocco Faculte des Sciences, Universite Ibn Tofail, Kenitra, Morocco (Morocco)
2006-07-21
Based on mirror symmetry, we discuss geometric engineering of N = 1 ADE quiver models from F-theory compactifications on elliptic K3 surfaces fibred over certain four-dimensional base spaces. The latter are constructed as intersecting 4-cycles according to ADE Dynkin diagrams, thereby mimicking the construction of Calabi-Yau threefolds used in geometric engineering in type II superstring theory. Matter is incorporated by considering D7-branes wrapping these 4-cycles. Using a geometric procedure referred to as folding, we discuss how the corresponding physics can be converted into a scenario with D5-branes wrapping 2-cycles of ALE spaces.
Sample McMeeking, Laura B; Basile, Carole; Brian Cobb, R
2012-11-01
Theory-based evaluation (TBE) is an evaluation method that shows how a program will work under certain conditions and has been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible. Despite the model's widely accepted theoretical appeal, there are few examples of its well-implemented use, probably owing to the time and money needed for planning and to confusion over the distinction between research and evaluation functions and roles. In this paper, we describe the development of a theory-based evaluation design in a Math and Science Partnership (MSP) research project funded by the U.S. National Science Foundation (NSF). Through this work we developed an organizational model distinguishing between and integrating evaluation and research functions, explicating personnel roles and responsibilities, and highlighting connections between research and evaluation work. Although the research and evaluation components operated with independent budgets, staffing, and implementation activities, we were able to combine datasets across activities, allowing us to assess the integrity of the program theory, not just the hypothesized connections within it. This model has since been used for proposal development and has been invaluable, as it creates a research and evaluation plan that is seamless from the beginning.
Hirvonen, Åsa; Kossak, Roman; Villaveces, Andrés
2015-01-01
In recent years, mathematical logic has developed in many directions, the initial unity of its subject matter giving way to a myriad of seemingly unrelated areas. The articles collected here, which range from historical scholarship to recent research in geometric model theory, squarely address this development. These articles also connect to the diverse work of Väänänen, whose ecumenical approach to logic reflects the unity of the discipline.
Faller, Martha Lewkus
1984-01-01
Looks at the factors complicating the management of student worker budgets in libraries (e.g., the number of separate but interrelated budgets involved). Proposes a budgetary system incorporating double-entry bookkeeping, continuous proving, and combination receipts and disbursements. Considers the advantages of the system and details procedures.…
Institute of Scientific and Technical Information of China (English)
Anonymous
2011-01-01
Chinese filmmakers turn small-budget productions into box-office successes. Organizers of China's upcoming film festivals are finally giving recognition to the little guys, low-budget films, to encourage a generation of young, talented directors.
How Budget Deficit Impairs Long-Term Growth and Welfare under Perfect Capital Mobility
2014-01-01
This paper investigates the implications of the size of budget deficit in the open economy under perfect mobility of capital. For that purpose we construct a general equilibrium model with consumers maximizing the discounted utility of consumption, and firms maximizing profits. Government sets the size of the deficit relative to GDP and controls the structure of public debt. Using standard methods of optimal control theory we solve the model, i.e. we find explicit formulas for all trajectorie...
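The flavor of such a model can be conveyed by the standard debt-dynamics identity: if the government fixes the deficit at a constant share d of GDP and GDP grows at rate g, the debt-to-GDP ratio b evolves as b' = d − g·b and converges to the steady state b* = d/g. A hedged numerical sketch of this textbook relation (not the paper's actual general-equilibrium model):

```python
def debt_ratio_path(d, g, b0, dt=0.01, T=200.0):
    """Euler integration of b' = d - g*b (debt-to-GDP dynamics)."""
    b, t, path = b0, 0.0, []
    while t < T:
        path.append(b)
        b += (d - g * b) * dt
        t += dt
    return path

# A 3% deficit with 5% nominal growth drives debt toward 60% of GDP.
path = debt_ratio_path(d=0.03, g=0.05, b0=0.0)
```

The larger the deficit ratio relative to growth, the higher the long-run debt burden, which is the channel through which a large deficit can impair growth and welfare in models of this type.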
Directory of Open Access Journals (Sweden)
K. Misumi
2013-05-01
Full Text Available We investigated the simulated iron budget in ocean surface waters in the 1990s and 2090s using the Community Earth System Model version 1 and the Representative Concentration Pathway 8.5 future CO2 emission scenario. We assumed that exogenous iron inputs did not change during the whole simulation period; thus, iron budget changes were attributed solely to changes in ocean circulation and mixing in response to projected global warming. The model simulated the major features of ocean circulation and dissolved iron distribution for the present climate reasonably well. Detailed iron budget analysis revealed that roughly 70% of the iron supplied to surface waters in high-nutrient, low-chlorophyll (HNLC) regions is contributed by ocean circulation and mixing processes, but the dominant supply mechanism differed in each HNLC region: vertical mixing in the Southern Ocean, upwelling in the eastern equatorial Pacific, and deposition of iron-bearing dust in the subarctic North Pacific. In the 2090s, our model projected an increased iron supply to HNLC surface waters, even though enhanced stratification was predicted to reduce iron entrainment from deeper waters. This unexpected result could be attributed largely to changes in the meridional overturning and gyre-scale circulations that intensified the advective supply of iron to surface waters, especially in the eastern equatorial Pacific. The simulated primary and export productions in the 2090s decreased globally by 6% and 13%, respectively, whereas in the HNLC regions, they increased by 11% and 6%, respectively. Roughly half of the elevated production could be attributed to the intensified iron supply. The projected ocean circulation and mixing changes are consistent with recent observations of responses to the warming climate and with other Coupled Model Intercomparison Project model projections. We conclude that future ocean circulation and mixing changes will likely elevate the iron supply to HNLC
Dennis, R. L.; Bash, J. O.; Foley, K. M.; Gilliam, R.; Pinder, R. W.
2013-12-01
Deposition is affected by the chemical and physical processes represented in the regional models as well as source strength. The overall production and loss budget (wet and dry deposition) is dynamically connected and adjusts internally to changes in process representation. In addition, the scrubbing of pollutants from the atmosphere by precipitation is one of several processes that remove pollutants, creating a coupling with the atmospheric aqueous and gas phase chemistry that can influence wet deposition rates in a nonlinear manner. We explore through model sensitivities with the regional Community Multiscale Air Quality (CMAQ) model the influence on wet and dry deposition, and the overall continental nitrogen budget, of changes in three process representations in the model: (1) incorporation of lightning-generated NO, (2) improved representation of convective precipitation, and (3) replacement of the typical unidirectional dry deposition of NH3 with a state-of-the-science representation of NH3 bi-directional air-surface exchange. Results of the sensitivity studies will be presented. (1) Incorporation of lightning-generated NO significantly reduces a negative bias in summer wet nitrate deposition, but is sensitive to the choice of convective parameterization. (2) Use of a less active trigger of convective precipitation in the WRF meteorological model to reduce the summertime precipitation overprediction bias reduces the generation of NO from lightning. It also reduces the wet deposition of nitrate and increases the dry deposition of oxidized nitrogen, as well as changing (reducing) the surface-level exposure to ozone. Improvements in the convective precipitation processes also result in more non-precipitating clouds, leading to an increase in SO4 production through the aqueous pathway, resulting in improvements in summertime SO4 ambient aerosol estimates. (3) Incorporation of state-of-the-science ammonia bi-directional air-surface exchange affects both the dry
Cluster density functional theory for lattice models based on the theory of Möbius functions
Lafuente, Luis; Cuesta, José A.
2005-08-01
Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Möbius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Möbius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
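The Möbius function of a finite poset, which supplies the cluster-expansion coefficients above, is defined recursively by μ(x, x) = 1 and μ(x, y) = −Σ μ(x, z), the sum running over z with x ≤ z < y; on the divisor lattice it reduces to the number-theoretic Möbius function. A small, self-contained illustration:

```python
def poset_mobius(elements, le):
    """Mobius function mu(x, y) of a finite poset with order relation le(x, y)."""
    mu = {}

    def m(x, y):
        if (x, y) in mu:
            return mu[(x, y)]
        if x == y:
            val = 1
        elif not le(x, y):
            val = 0  # mu vanishes outside the order relation
        else:
            # mu(x, y) = - sum of mu(x, z) over x <= z < y
            val = -sum(m(x, z) for z in elements
                       if le(x, z) and le(z, y) and z != y)
        mu[(x, y)] = val
        return val

    for x in elements:
        for y in elements:
            m(x, y)
    return mu

# Divisors of 12 ordered by divisibility.
divisors = [1, 2, 3, 4, 6, 12]
mu = poset_mobius(divisors, lambda a, b: b % a == 0)
```

For this poset, mu[(1, n)] agrees with the classical Möbius function μ(n): −1 on primes, +1 on 6 = 2·3, and 0 on 12, which is divisible by the square 4.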
Cluster density functional theory for lattice models based on the theory of Moebius functions
Energy Technology Data Exchange (ETDEWEB)
Lafuente, Luis; Cuesta, Jose A [Grupo Interdisciplinar de Sistemas Complejos (GISC), Departamento de Matematicas, Universidad Carlos III de Madrid, 28911 Leganes, Madrid (Spain)
2005-08-26
Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Moebius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Moebius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
Zilitinkevich, S S; Kleeorin, N; Rogachevskii, I; Esau, I
2011-01-01
In this paper we advance the physical background of the EFB turbulence closure and present its comprehensive description. It is based on four budget equations for the second moments: turbulent kinetic and potential energies (TKE and TPE) and the vertical turbulent fluxes of momentum and buoyancy; a new relaxation equation for the turbulent dissipation time-scale; and an advanced concept of the inter-component exchange of TKE. The EFB closure is designed for stratified, rotating geophysical flows from neutral to very stable. In accordance with modern experimental evidence, it grants maintaining turbulence by the velocity shear at any gradient Richardson number Ri, and distinguishes between two principally different regimes: "strong turbulence" at Ri ≪ 1, typical of boundary-layer flows, and "weak turbulence" at Ri > 1, typical of the free atmosphere or deep ocean, where Pr_T asymptotically linearly increases with increasing Ri, implying strong suppression of heat transfer compared to momentum transfer. For use in different applications, the EFB turbulence closure is formulated a...
Heckmann, Tobias; Hilger, Ludwig; Vehling, Lucas; Becht, Michael
2016-05-01
The estimation of catchment-scale rockfall rates relies on the regionalisation of local measurements. Here, we propose a new framework for such a regionalisation by the example of a case study in the Upper Kaunertal, Austrian Central Alps (62.5 km²). Measurements of rockfall deposition during 12 months onto six collector nets within the study area were combined with published mean annual rates from the literature, and a probability density function was fitted to these data. A numerical model involving a random walk routing scheme and a one-parameter friction model was used to simulate rockfall trajectories, starting from potential rockfall source areas that were delineated from a digital elevation model. Rockfall rates sampled from the fitted probability density function were assigned to these trajectories in order to model the spatial distribution and to estimate the amount of rockfall deposition. By recording all trajectories as edges of a network of raster cells, and by aggregating the latter to landforms (or landform types) as delineated in a geomorphological map of the study area, rockfall sediment flux from sources to different landforms could be quantified. Specifically, the geomorphic coupling of rockfall sources to storage landforms and the glacial and fluvial sediment cascade was investigated using this network model. The total rockfall contribution to the sediment budget of the Upper Kaunertal is estimated at c. 8000 Mg yr⁻¹, 16.5% of which is delivered to the glaciers, and hence to the proglacial zone. The network approach is favourable, for example because multiple scenarios (involving different probability density functions) can be calculated on the basis of the same set of trajectories, and because deposits can be back-linked to their respective sources. While the methodological framework constitutes the main aim of our paper, we also discuss how the estimation of the budget can be improved on the basis of spatially distributed production rates.
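The random-walk routing idea can be sketched on a toy digital elevation model: from each cell, a trajectory steps to a randomly chosen lower neighbour (here weighted by elevation drop) until no lower neighbour exists, where deposition is recorded. This is a generic illustration of such routing schemes, not the authors' implementation, and the weighting rule is an assumption:

```python
import numpy as np

def random_walk_path(dem, start, rng):
    """Route one trajectory downslope; step probability proportional to drop."""
    rows, cols = dem.shape
    r, c = start
    path = [(r, c)]
    while True:
        nbrs, drops = [], []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = dem[r, c] - dem[rr, cc]
                    if drop > 0:
                        nbrs.append((rr, cc))
                        drops.append(drop)
        if not nbrs:  # local minimum: deposition site
            return path
        p = np.array(drops) / sum(drops)
        r, c = nbrs[rng.choice(len(nbrs), p=p)]
        path.append((r, c))

# Tilted-plane DEM: elevation decreases with row index.
dem = np.fromfunction(lambda i, j: 10.0 - i, (6, 6))
rng = np.random.default_rng(1)
path = random_walk_path(dem, (0, 3), rng)
```

Running many such trajectories from mapped source cells and tallying where they terminate yields the kind of raster-cell network that the study aggregates to landforms.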
Models for probability and statistical inference theory and applications
Stapleton, James H
2007-01-01
This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...
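Limit theory, singled out above as the hardest topic, can be made concrete numerically: the Binomial(n, λ/n) pmf converges to the Poisson(λ) pmf as n grows. A quick check of this classic limit (parameters chosen for illustration):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam ** k / factorial(k)

lam = 3.0

def gap(n):
    """Largest pmf discrepancy over k = 0..10 for Binomial(n, lam/n) vs Poisson(lam)."""
    return max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
               for k in range(11))
```

The discrepancy shrinks roughly like λ²/n, which is why simulations of this kind build intuition for limit theorems faster than the formal proofs do.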
Modeling size effect in the SMA response: a gradient theory
Tabesh, Majid; Boyd, James G.; Lagoudas, Dimitris C.
2014-03-01
Shape memory alloys (SMAs) show a size effect in their response. The critical stresses for the start of martensite and austenite transformations, for instance, are reported to increase in some SMA wires for diameters below 100 μm. Simulation of such behavior cannot be achieved using conventional theories that lack an intrinsic length scale in their constitutive modeling. To capture the size effect, a thermodynamically consistent constitutive model is developed that, in addition to the conventional internal variables of martensitic volume fraction and transformation strain, contains the spatial gradient of the martensitic volume fraction as an internal variable. The developed theory is simplified for 1D cases and analytical solutions for pure bending of SMA beams are presented. The gradient model captures the size effect in the response of the studied SMA structures.
Social learning theory and the Health Belief Model.
Rosenstock, I M; Strecher, V J; Becker, M H
1988-01-01
The Health Belief Model, social learning theory (recently relabelled social cognitive theory), self-efficacy, and locus of control have all been applied with varying success to problems of explaining, predicting, and influencing behavior. Yet, there is conceptual confusion among researchers and practitioners about the interrelationships of these theories and variables. This article attempts to show how these explanatory factors may be related, and in so doing, posits a revised explanatory model which incorporates self-efficacy into the Health Belief Model. Specifically, self-efficacy is proposed as a separate independent variable along with the traditional health belief variables of perceived susceptibility, severity, benefits, and barriers. Incentive to behave (health motivation) is also a component of the model. Locus of control is not included explicitly because it is believed to be incorporated within other elements of the model. It is predicted that the new formulation will more fully account for health-related behavior than did earlier formulations, and will suggest more effective behavioral interventions than have hitherto been available to health educators.
Fracture and ductile vs. brittle behavior -- Theory, modeling and experiment
Energy Technology Data Exchange (ETDEWEB)
Beltz, G.E. [ed.] [Univ. of California, Santa Barbara, CA (United States); Selinger, R.L.B. [ed.] [Catholic Univ., Washington, DC (United States); Kim, K.S. [ed.] [Brown Univ., Providence, RI (United States); Marder, M.P. [ed.] [Univ. of Texas, Austin, TX (United States)
1999-08-01
The symposium brought together the many communities that investigate the fundamentals of fracture, with special emphasis on the ductile/brittle transition across a broad spectrum of material classes, fracture at interfaces, and modelling fracture over various length scales. Theoretical techniques discussed ranged from first-principles electronic structure theory to atomistic simulation to mesoscale and continuum theories, along with studies of fractals and scaling in fracture. Experimental and theoretical talks were interspersed throughout all sessions, rather than being segregated. The contributions to this volume generally follow the topical outline upon which the symposium was organized. The first part, dealing with ductile vs. brittle behavior in metals, concerns itself with investigations of high-strength steel, magnesium alloys, ordered intermetallics, and Fe-Cr-Al alloys. The development of analytical methods based on micromechanical models, such as dislocation mechanics and cohesive/contact zone models, are covered in a follow-up section. Nonmetals, including silicon, are considered in Parts 3 and 4. Fractals, chaos, and scaling theories are taken up in Part 5, with a special emphasis on fracture in heterogeneous solids. Modelling based on large populations of dislocations has substantially progressed during the past three years; hence, a section devoted to crystal plasticity and mesoscale dislocation modelling appears next. Finally, the technologically significant area of interfacial fracture, with applications to composites and intergranular fracture, is taken up in Part 7. Separate abstracts were prepared for most of the papers in this book.
Matrix models and stochastic growth in Donaldson-Thomas theory
Energy Technology Data Exchange (ETDEWEB)
Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)
2012-10-15
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kaehler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models
Strand, Hugo U. R.; Eckstein, Martin; Werner, Philipp
2015-01-01
We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium "phase diagrams" that map out the different dynamical regimes.
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation.
Yu, Miao; Wang, Guiling; Chen, Haishan
2016-03-01
Assessing and quantifying the uncertainties in projected future changes of energy and water budgets over the land surface are important steps toward improving our confidence in climate change projections. In this study, the contribution of land surface models to the inter-GCM variation of projected future changes in land surface energy and water fluxes is assessed based on output from 19 global climate models (GCMs) and offline Community Land Model version 4 (CLM4) simulations driven by meteorological forcing from the 19 GCMs. Similar offline simulations using CLM4 with its dynamic vegetation submodel are also conducted to investigate how dynamic vegetation feedback, a process that is being added to more earth system models, may amplify or moderate the intermodel variations of projected future changes. Projected changes are quantified as the difference between the 2081-2100 period from the Representative Concentration Pathway 8.5 (RCP8.5) future experiment and the 1981-2000 period from the historical simulation. Under RCP8.5, projected changes in surface water and heat fluxes show a high degree of model dependency across the globe. Although precipitation is very likely to increase in the high latitudes of the Northern Hemisphere, a high degree of model-related uncertainty exists for evapotranspiration, soil water content, and surface runoff, suggesting discrepancies among land surface models (LSMs) in simulating surface hydrological and snow-related processes. Large model-related uncertainties for the surface water budget also exist in the Tropics, including southeastern South America and Central Africa. These uncertainties would be reduced in the hypothetical scenario of a single near-perfect land surface model being used across all GCMs, suggesting the potential to reduce uncertainties through more consistent approaches toward land surface model development. Under such a scenario, the most significant reduction is likely to be seen in the
Nonrelativistic factorizable scattering theory of multicomponent Calogero-Sutherland model
Ahn, C; Nam, S; Ahn, Changrim; Lee, Kong Ju Bock; Nam, Soonkeon
1995-01-01
We relate two integrable models in (1+1) dimensions, namely, multicomponent Calogero-Sutherland model with particles and antiparticles interacting via the hyperbolic potential and the nonrelativistic factorizable S-matrix theory with SU(N)-invariance. We find complete solutions of the Yang-Baxter equations without implementing the crossing symmetry, and one of them is identified with the scattering amplitudes derived from the Schrödinger equation of the Calogero-Sutherland model. This particular solution is of interest in that it cannot be obtained as a nonrelativistic limit of any known relativistic solutions of the SU(N)-invariant Yang-Baxter equations.
Massive mu pair production in a vector field theory model
Halliday, I G
1976-01-01
Massive electrodynamics is treated as a model for the production of massive mu pairs in high-energy hadronic collisions. The dominant diagrams in perturbation theory are identified and analyzed. These graphs have an eikonal structure which leads to enormous cancellations in the two-particle inclusive cross section but not in the n-particle production cross sections. Under the assumption that these cancellations are complete, a Drell-Yan structure appears in the inclusive cross section but the particles accompanying the mu pairs have a very different structure compared to the parton model. The pionization region is no longer empty of particles as in single parton models. (10 refs).
Entanglement of Conceptual Entities in Quantum Model Theory (QMod)
Aerts, Diederik
2012-01-01
We have recently elaborated 'Quantum Model Theory' (QMod) to model situations where the quantum effects of contextuality, interference, superposition, entanglement and emergence appear without the entities giving rise to these situations necessarily being of a microscopic nature. We have shown that QMod models these situations without introducing linearity for the set of states. In this paper we prove that QMod, although not using linearity for the state space, provides a method for identifying entangled states and an intuitive explanation for their occurrence. We illustrate this method of entanglement identification with concrete examples.
Tao, W.-K.; Johnson, D.; Shie, C.-L.; Simpson, J.
2004-10-01
A two-dimensional version of the Goddard Cumulus Ensemble (GCE) model is used to simulate convective systems that developed in various geographic locations (east Atlantic, west Pacific, South China Sea, and Great Plains in the United States). Observed large-scale advective tendencies for potential temperature, water vapor mixing ratio, and horizontal momentum derived from field campaigns are used as the main forcing. The atmospheric temperature and water vapor budgets from the model results show that the two largest terms are net condensation (heating/drying) and imposed large-scale forcing (cooling/moistening) for tropical oceanic cases though not for midlatitude continental cases. These two terms are opposite in sign, however, and are not the dominant terms in the moist static energy budget. The balance between net radiation, surface latent heat flux, and net condensational heating varies among these tropical cases, however. For cloud systems that developed over the South China Sea and eastern Atlantic, net radiation (cooling) is not negligible in the temperature budget; it is as large as 20% of the net condensation. However, shortwave heating and longwave cooling are in balance with each other for cloud systems over the west Pacific region, such that the net radiation is very small. This is due to the thick anvil clouds simulated in the cloud systems over the Pacific region. The large-scale advection of moist static energy is negative, as a result of a larger absolute value of large-scale advection of sensible heat (cooling) compared to large-scale latent heat (moistening) advection in the Pacific and Atlantic cases. For three cloud systems that developed over a midlatitude continent, the net radiation and sensible and latent heat fluxes play a much more important role. This means that the accurate measurement of surface fluxes and radiation is crucial for simulating these midlatitude cases. The results showed that large-scale mean (multiday) precipitation efficiency
Two velocity difference model for a car following theory
Ge, H. X.; Cheng, R. J.; Li, Z. P.
2008-09-01
In the light of the optimal velocity model, a two velocity difference model for a car-following theory is put forward considering navigation in modern traffic. To our knowledge, the model is an improvement over the previous ones theoretically, because it considers more aspects in the car-following process than others. Then we investigate the property of the model using linear and nonlinear analyses. The Korteweg-de Vries equation (for short, the KdV equation) near the neutral stability line and the modified Korteweg-de Vries equation (for short, the mKdV equation) around the critical point are derived by applying the reductive perturbation method. The traffic jam could be thus described by the KdV soliton and the kink-anti-kink soliton for the KdV equation and mKdV equation, respectively. Numerical simulations are made to verify the model, and good results are obtained with the new model.
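The setup described above can be sketched numerically: cars on a ring road accelerate toward an optimal velocity and react to the velocity differences with the leader and the next car ahead. The tanh optimal-velocity function and all parameter values below are common illustrative choices, not the ones used in the paper.

```python
import numpy as np

# Sketch of a two-velocity-difference car-following model on a ring road.
# Parameters (a, lam1, lam2, vmax, hc) are illustrative assumptions.

def V(dx, vmax=2.0, hc=4.0):
    """Optimal velocity function (a common tanh form)."""
    return 0.5 * vmax * (np.tanh(dx - hc) + np.tanh(hc))

def step(x, v, L, a=1.0, lam1=0.2, lam2=0.1, dt=0.1):
    """One Euler step; each car reacts to its leader and to the next leader."""
    dx = (np.roll(x, -1) - x) % L          # headway to the car ahead
    dv1 = np.roll(v, -1) - v               # velocity difference to the leader
    dv2 = np.roll(v, -2) - np.roll(v, -1)  # difference one car further ahead
    acc = a * (V(dx) - v) + lam1 * dv1 + lam2 * dv2
    return (x + v * dt) % L, v + acc * dt

N, L = 20, 100.0
x = np.linspace(0.0, L, N, endpoint=False)
v = np.full(N, V(L / N))                   # start at the uniform-flow speed
for _ in range(1000):
    x, v = step(x, v, L)
print("mean speed:", v.mean())
```

Started in uniform flow with these (stable) parameters, the platoon stays at the equilibrium speed; perturbing a headway with a less stable parameter set is how the KdV/mKdV jam solitons discussed in the abstract are typically visualized.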
Spatially random models, estimation theory, and robot arm dynamics
Rodriguez, G.
1987-01-01
Spatially random models provide an alternative to the more traditional deterministic models used to describe robot arm dynamics. These alternative models can be used to establish a relationship between the methodologies of estimation theory and robot dynamics. A new class of algorithms for many of the fundamental robotics problems of inverse and forward dynamics, inverse kinematics, etc. can be developed that use computations typical in estimation theory. The algorithms make extensive use of the difference equations of Kalman filtering and Bryson-Frazier smoothing to conduct spatial recursions. The spatially random models are very easy to describe and are based on the assumption that all of the inertial (D'Alembert) forces in the system are represented by a spatially distributed white-noise model. The models can also be used to generate numerically the composite multibody system inertia matrix. This is done without resorting to the more common methods of deterministic modeling involving Lagrangian dynamics, Newton-Euler equations, etc. These methods make substantial use of human knowledge in the derivation and manipulation of equations of motion for complex mechanical systems.
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the {mu} problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Quantile hydrologic model selection and model structure deficiency assessment: 1. Theory
Pande, S.
2013-01-01
A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies structur
Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector-Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail, how the theoretical model and its...
Le Quéré, Corinne; Andrew, Robbie M.; Canadell, Josep G.; Sitch, Stephen; Korsbakken, Jan Ivar; Peters, Glen P.; Manning, Andrew C.; Boden, Thomas A.; Tans, Pieter P.; Houghton, Richard A.; Keeling, Ralph F.; Alin, Simone; Andrews, Oliver D.; Anthoni, Peter; Barbero, Leticia; Bopp, Laurent; Chevallier, Frédéric; Chini, Louise P.; Ciais, Philippe; Currie, Kim; Delire, Christine; Doney, Scott C.; Friedlingstein, Pierre; Gkritzalis, Thanos; Harris, Ian; Hauck, Judith; Haverd, Vanessa; Hoppema, Mario; Klein Goldewijk, Kees; Jain, Atul K.; Kato, Etsushi; Körtzinger, Arne; Landschützer, Peter; Lefèvre, Nathalie; Lenton, Andrew; Lienert, Sebastian; Lombardozzi, Danica; Melton, Joe R.; Metzl, Nicolas; Millero, Frank; Monteiro, Pedro M. S.; Munro, David R.; Nabel, Julia E. M. S.; Nakaoka, Shin-ichiro; O'Brien, Kevin; Olsen, Are; Omar, Abdirahman M.; Ono, Tsuneo; Pierrot, Denis; Poulter, Benjamin; Rödenbeck, Christian; Salisbury, Joe; Schuster, Ute; Schwinger, Jörg; Séférian, Roland; Skjelvan, Ingunn; Stocker, Benjamin D.; Sutton, Adrienne J.; Takahashi, Taro; Tian, Hanqin; Tilbrook, Bronte; van der Laan-Luijkx, Ingrid T.; van der Werf, Guido R.; Viovy, Nicolas; Walker, Anthony P.; Wiltshire, Andrew J.; Zaehle, Sönke
2016-11-01
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere - the "global carbon budget" - is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates and consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models. We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2006-2015), EFF was 9
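Since SLAND is described as the residual of the other budget terms, the closure can be sketched directly. The flux magnitudes below (GtC/yr) are illustrative placeholders, not the estimates reported in the budget itself.

```python
# The residual land sink closes the global carbon budget:
#   S_LAND = E_FF + E_LUC - G_ATM - S_OCEAN
# All values below are illustrative placeholders (GtC/yr).

def residual_land_sink(e_ff, e_luc, g_atm, s_ocean):
    """Land sink inferred as the residual of the other budget terms."""
    return e_ff + e_luc - g_atm - s_ocean

s_land = residual_land_sink(e_ff=9.0, e_luc=1.0, g_atm=4.5, s_ocean=2.5)
print(f"residual land sink: {s_land:.1f} GtC/yr")
```

Because SLAND absorbs the uncertainties of every other term, comparing it against independent dynamic global vegetation models, as the abstract describes, is a natural consistency check.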
Standard Model in multi-scale theories and observational constraints
Calcagni, Gianluca; Rodríguez-Fernández, David
2015-01-01
We construct and analyze the Standard Model of electroweak and strong interactions in multi-scale spacetimes with (i) weighted derivatives and (ii) $q$-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multi-scale measures with only one characteristic time, length and energy scale $t_*$, $\\ell_*$ and $E_*$, we compute the Lamb shift in the hydrogen atom and constrain the multi-scale correction to the ordi...
Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic
DEFF Research Database (Denmark)
Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe
2008-01-01
Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can...... reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin.In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise...... to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results...
Supersymmetric Theory of Stochastic ABC Model: A Numerical Study
Ovchinnikov, Igor V; Ensslin, Torsten A; Wang, Kang L
2016-01-01
In this paper, we investigate numerically the stochastic ABC model, a toy model in the theory of astrophysical kinematic dynamos, within the recently proposed supersymmetric theory of stochastics (STS). STS characterises stochastic differential equations (SDEs) by the spectrum of the stochastic evolution operator (SEO) on elements of the exterior algebra or differential forms over the system's phase space, X. STS can thereby classify SDEs as chaotic or non-chaotic by identifying the phenomenon of stochastic chaos with the spontaneously broken topological supersymmetry that all SDEs possess. We demonstrate the following three properties of the SEO, deduced previously analytically and from physical arguments: the SEO spectra for zeroth and top degree forms never break topological supersymmetry, all SDEs possess pseudo-time-reversal symmetry, and each de Rham cohomology class provides one supersymmetric eigenstate. Our results also suggest that the SEO spectra for forms of complementary degrees, i.e., k and ...
Building Better Ecological Machines: Complexity Theory and Alternative Economic Models
Directory of Open Access Journals (Sweden)
Jess Bier
2016-12-01
Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimaginings: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.
Ranking streamflow model performance based on Information theory metrics
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow; a finite symbol alphabet was used. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics in simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
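The symbolize-then-measure procedure can be sketched as follows. This is a hedged illustration: the block-entropy definition of mean information gain, the binary alphabet, and the synthetic series are assumptions, not details taken from the study.

```python
import numpy as np
from collections import Counter

# Sketch: symbolize a series by quantile bins, then estimate the mean
# information gain as the block-entropy difference H(2) - H(1), one common
# definition. The alphabet size and synthetic data are illustrative.

def symbolize(series, n_symbols=2):
    """Map each value to the index of its quantile bin."""
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def block_entropy(symbols, L):
    """Shannon entropy (bits) of length-L words in the symbol string."""
    words = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    p = np.array(list(words.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def mean_information_gain(symbols):
    return block_entropy(symbols, 2) - block_entropy(symbols, 1)

rng = np.random.default_rng(0)
random_series = rng.random(5000)            # white noise: high randomness
persistent = np.cumsum(rng.random(5000))    # monotone trend: highly predictable
print("MIG, random    :", mean_information_gain(symbolize(random_series)))
print("MIG, persistent:", mean_information_gain(symbolize(persistent)))
```

The random series yields a mean information gain near the maximum for the alphabet, while the persistent series yields a value near zero, which is the sense in which a watershed "filtering" precipitation into smoother streamflow lowers the metric.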
Hope, Jeremy; Fraser, Robin
2003-02-01
Budgeting, as most corporations practice it, should be abolished. That may sound radical, but doing so would further companies' long-running efforts to transform themselves into devolved networks that can nimbly adjust to market conditions. Most other building blocks are in place, but companies continue to restrict themselves by relying on inflexible budget processes and the command-and-control culture that budgeting entails. A number of companies have rejected the foregone conclusions embedded in budgets, and they've given up the self-interested wrangling over what the data indicate. In the absence of budgets, alternative goals and measures--some financial, such as cost-to-income ratios, and some nonfinancial, such as time to market--move to the foreground. Companies that have rejected budgets require employees to measure themselves against the performance of competitors and against internal peer groups. Because employees don't know whether they've succeeded until they can look back on the results of a given period, they must use every ounce of energy to ensure that they beat the competition. A key feature of many companies that have rejected budgets is the use of rolling forecasts, which are created every few months and typically cover five to eight quarters. Because the forecasts are regularly revised, they allow companies to continuously adapt to market conditions. The forecasting practices of two such companies, both based in Sweden, are examined in detail: the bank Svenska Handelsbanken and the wholesaler Ahlsell. Though the first companies to reject budgets were located in Northern Europe, organizations that have gone beyond budgeting can be found in a range of countries and industries. Their practices allow them to unleash the power of today's management tools and realize the potential of a fully decentralized organization.
Goeckede, M.; Michalak, A. M.; Vickers, D.; Turner, D.; Law, B.
2008-12-01
The ORCA project aims at determining the regional carbon balance of Oregon, California and Washington, with a special focus on the effect of disturbance history and climate variability on carbon sources and sinks. ORCA provides a regional test of the overall NACP strategy by demonstrating bottom-up and top-down modeling approaches to derive carbon balances at subregional to regional scales. The ORCA top-down modeling component has been set up to capture flux variability on the regional scale at high temporal and spatial resolution. Atmospheric transport is simulated coupling the mesoscale model WRF (Weather Research and Forecast) with the STILT (Stochastic Time Inverted Lagrangian Transport) footprint model. This setup allows identifying sources and sinks that influence atmospheric observations with highly resolved mass transport fields and realistic turbulent mixing. High-precision atmospheric CO2 concentrations are monitored as continuous time series in hourly timesteps at 5 locations within the model domain, west to east from the Pacific Coast to the Great Basin, and include two flux sites for evaluation of computed fluxes. Terrestrial biosphere carbon fluxes are simulated at an effective spatial resolution of smaller than 1km and subdaily timesteps, considering effects of ecoregion, land cover type and disturbance regime on the carbon budgets. Flux computation assimilates high-resolution remote sensing products (e.g. LandSat, MODIS) and interpolated surface meteorology (DayMet, SOGS, PRISM). We present results on regional carbon budgets for the ORCA modeling domain that have been optimized using Bayesian inversion and the information provided by the network of high-precision CO2 observations. We address the influence of spatial and temporal resolution in the general modeling setup on the findings, and test the level of detail that can be resolved by top-down modeling on the regional scale, given the uncertainties introduced by various sources for model
Embankment deformation analyzed by elastoplastic damage model coupling consolidation theory
Institute of Scientific and Technical Information of China (English)
Hong SUN; Xihong ZHAO
2006-01-01
Embankment deformation can seriously affect neighbouring structures and infrastructure. A trial embankment is reanalyzed with an elastoplastic damage model coupled to Biot's consolidation theory. Damage accumulates as loading time increases and is most severe beneath the centre and toe of the embankment. Beneath the centre, vertical damage values exceed horizontal ones; beneath the toe, horizontal damage values exceed vertical ones.
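The abstract couples a damage model to Biot's consolidation theory. As a simpler classical point of reference (a sketch only, not the coupled model used in the paper), the average degree of consolidation in Terzaghi's one-dimensional theory follows from its series solution U(Tv) = 1 - Σ (2/M²) exp(-M² Tv), with M = π(2m+1)/2:

```python
import math

def degree_of_consolidation(Tv, terms=100):
    """Average degree of consolidation U for time factor Tv
    (Terzaghi 1-D consolidation theory, series solution)."""
    U = 1.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2.0
        U -= (2.0 / M**2) * math.exp(-M**2 * Tv)
    return U
```

The textbook benchmark values are Tv ≈ 0.197 for 50% consolidation and Tv ≈ 0.848 for 90%.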
Regression modeling methods, theory, and computation with SAS
Panik, Michael
2009-01-01
Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,
Pizzuto, J. E.; Ackerman, T. R.
2012-12-01
Mercury (Hg) was released into the South River, VA, from an industrial source from 1929-1950. Because of mercury's affinity for fine grained particles, a budget for fine sediment can be used to model the trajectories of Hg through the alluvial valley. We adopt Malmon's (2002) model, which requires each storage compartment to be "well-mixed". Our sediment budget quantifies residence times, exchange rates, and sediment storage volumes in the floodplain (FP), hyporheic zone, and in fine-grained channel margin (FGCM) deposits that form in the lee of obstructions (chiefly downed trees) along the sides of the wetted perimeter of the channel. This simple model with only 3 storage compartments fails to fit Hg concentration histories in the FGCM and underpredicts contemporary mercury loading to the channel from bank erosion. We speculate that the FP and FGCM deposits are not well-mixed. Mercury is preferentially stored and remobilized from frequently-inundated, low elevation floodplain areas near the stream channel. Radiometric dates from FGCM deposits suggest that most sediments are reworked within a few years, but a small fraction of the deposits remains in storage for decades. We therefore partition the FP and FGCM deposits into multiple reservoirs, each with a different residence time. We divide the FGCM deposits into two sub-reservoirs with characteristic exchange rates and masses that represent the observed age distribution. Sediment accumulation rates on the FP follow an exponential distribution of FP relief, and we divide the floodplain into 5 reservoirs with inundation recurrence intervals of 0.3, 2, 5, 62, and 100 years. Since erosion is assumed to be evenly distributed across each reservoir, FP area as a function of age decreases exponentially. With time, the elevation of floodplains increases through sedimentation, so a portion of each reservoir evolves into a less frequently inundated category every year, creating a unidirectional mass flux from each FP reservoir into the next, less frequently inundated one.
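The well-mixed reservoir budget described above can be sketched as a first-order exchange model: each reservoir releases mass at rate mass/τ (its residence time) and receives an input flux. The masses, residence times, and inputs below are made-up numbers for illustration, not the South River values.

```python
import numpy as np

def evolve_reservoirs(mass, residence_times, inputs, dt, steps):
    """First-order ('well-mixed') reservoir model, forward-Euler stepped.

    Each reservoir loses mass at rate mass/tau and gains a constant
    input flux; at steady state, mass -> inputs * tau."""
    mass = np.asarray(mass, dtype=float).copy()
    k = 1.0 / np.asarray(residence_times, dtype=float)  # exchange rates
    inputs = np.asarray(inputs, dtype=float)
    for _ in range(steps):
        mass += dt * (inputs - k * mass)
    return mass
```

A short-residence-time reservoir (a few years, like the FGCM deposits) equilibrates quickly; a decadal one (like high floodplain surfaces) stores mass far longer, which is why a single lumped compartment misfits the Hg history.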
Game Theory Models for Multi-Robot Patrolling of Infrastructures
Directory of Open Access Journals (Sweden)
Erik Hernández
2013-03-01
Full Text Available This work is focused on the problem of performing multi-robot patrolling for infrastructure security applications in order to protect a known environment at critical facilities. Thus, given a set of robots and a set of points of interest, the patrolling task consists of constantly visiting these points at irregular time intervals for security purposes. Existing solutions for these types of applications are predictable and inflexible. Moreover, most of the previous work has tackled the patrolling problem with centralized and deterministic solutions, and only a few efforts have been made to integrate dynamic methods. Therefore, one of the main contributions of this work is the development of new dynamic and decentralized collaborative approaches that solve the aforementioned problem by implementing learning models from game theory. The model selected in this work, Experience-Weighted Attraction, includes belief-based and reinforcement models as special cases. The problem has been defined using concepts of graph theory to represent the environment in order to work with such game-theoretic techniques. Finally, the proposed methods have been evaluated experimentally using a patrolling simulator, and the results obtained have been compared with previously available approaches.
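The Experience-Weighted Attraction rule named above can be sketched as follows. This is a minimal single-agent version of Camerer and Ho's formulation; the parameter values (φ, δ, ρ, λ) and payoffs are illustrative, not taken from the paper.

```python
import math

def ewa_update(attractions, experience, chosen, payoffs,
               phi=0.9, delta=0.5, rho=0.9):
    """One Experience-Weighted Attraction update (Camerer & Ho).

    attractions: current attraction A_j of each strategy j
    experience:  experience weight N(t-1)
    chosen:      index of the strategy actually played
    payoffs:     payoff pi_j each strategy would have earned against
                 the opponents' realized actions
    """
    N_new = rho * experience + 1.0
    new_attr = []
    for j, (A, pi) in enumerate(zip(attractions, payoffs)):
        # delta weights foregone payoffs; the chosen strategy gets full weight
        weight = delta + (1.0 - delta) * (1.0 if j == chosen else 0.0)
        new_attr.append((phi * experience * A + weight * pi) / N_new)
    return new_attr, N_new

def choice_probs(attractions, lam=1.0):
    """Logit response: choice probabilities from attractions."""
    exps = [math.exp(lam * A) for A in attractions]
    Z = sum(exps)
    return [e / Z for e in exps]
```

Setting δ = 0 recovers pure reinforcement learning (only the played strategy is updated), while δ = 1 recovers belief-based fictitious-play-like updating, which is the sense in which EWA nests both as special cases.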
Directory of Open Access Journals (Sweden)
J.-L. Drouet
2012-05-01
Full Text Available Spatial interactions within a landscape may lead to large inputs of reactive nitrogen (Nr) transferred from cultivated areas and farms to oligotrophic ecosystems, and induce environmental threats such as acidification, nitric pollution or eutrophication of protected areas. The paper presents a new methodology to estimate Nr fluxes at the landscape scale by taking into account spatial interactions between landscape elements. This methodology includes estimates of indirect Nr emissions due to short-range atmospheric and hydrological transfers. We used the NitroScape model, which integrates processes of Nr transformation and short-range transfer in a dynamic and spatially distributed way, to simulate Nr fluxes and budgets at the landscape scale. Four configurations of NitroScape were implemented by taking into account or not the atmospheric, hydrological or both pathways of Nr transfer. We simulated Nr fluxes, especially direct and indirect Nr emissions, within a test landscape including pig farms, croplands and unmanaged ecosystems. Simulation results showed the ability of NitroScape to simulate patterns of Nr emissions and recapture for each landscape element and the whole landscape. NitroScape made it possible to quantify the contribution of both atmospheric and hydrological transfers to Nr fluxes, budgets and indirect Nr emissions. For instance, indirect N2O emissions were estimated at around 21% of the total N2O emissions. They varied within the landscape according to land use, meteorological and soil conditions as well as topography. This first attempt proved that the NitroScape model is a useful tool to estimate the effect of spatial interactions on Nr fluxes and budgets as well as indirect Nr emissions within landscapes. Our approach needs to be further tested by applying NitroScape to other landscapes.
Models and applications of chaos theory in modern sciences
Zeraoulia, Elhadj
2011-01-01
This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanical, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society-highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli
Time-dependent Gutzwiller theory for multiband Hubbard models.
Oelsen, E v; Seibold, G; Bünemann, J
2011-08-12
Based on the variational Gutzwiller theory, we present a method for the computation of response functions for multiband Hubbard models with general local Coulomb interactions. The improvement over the conventional random-phase approximation is exemplified for an infinite-dimensional two-band Hubbard model where the incorporation of the local multiplet structure leads to a much larger sensitivity of ferromagnetism on the Hund coupling. Our method can be implemented into local-density approximation and Gutzwiller schemes and will therefore be an important tool for the computation of response functions for strongly correlated materials.
THE NEW CLASSICAL THEORY AND THE REAL BUSINESS CYCLE MODEL
Directory of Open Access Journals (Sweden)
Oana Simona HUDEA (CARAMAN)
2014-11-01
Full Text Available The present paper aims at describing some key elements of the model associated with the new classical theory, namely the Real Business Cycle model, which describes the economy from the perspective of a perfectly competitive market characterised by price, wage and interest-rate flexibility. The rendered impulse-response functions, which reveal the capacity of the model variables to return to their steady state under the impact of a structural shock, be it technology- or monetary-policy-oriented, point to the neutrality of monetary-authority decisions, thereby confirming the well-known classical dichotomy between the nominal and the real factors of the economy.
Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations
DEFF Research Database (Denmark)
Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing
2007-01-01
Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included......, Mol. Simul. 33 (4–5) (2007) 449–457.]. The new one has advantages for systems with data points at dilute conditions. Prediction of bubble point pressures using parameters from the two objective functions are compared with experimental data for the binary mixtures methyl acetate–n-pentane and methyl...
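As an illustration of the kind of excess-Gibbs-energy model parameterized here, a two-parameter Margules sketch with bubble-point pressure via modified Raoult's law. The parameter values and saturation pressures used in the test are placeholders, not the fitted methyl acetate–n-pentane values.

```python
import math

def margules_gammas(x1, A12, A21):
    """Two-parameter Margules activity coefficients for a binary mixture."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

def bubble_pressure(x1, A12, A21, P1sat, P2sat):
    """Bubble-point pressure via modified Raoult's law:
    P = x1*g1*P1sat + x2*g2*P2sat."""
    g1, g2 = margules_gammas(x1, A12, A21)
    return x1 * g1 * P1sat + (1.0 - x1) * g2 * P2sat
```

At infinite dilution ln γ1 → A12, which is the limit where the choice of objective function matters most for dilute-condition data, as the abstract notes.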
Bayesian Decision Theory Guiding Educational Decision-Making: Theories, Models and Application
Pan, Yilin
2016-01-01
Given the importance of education and the growing public demand for improving education quality under tight budget constraints, there has been an emerging movement to call for research-informed decisions in educational resource allocation. Despite the abundance of rigorous studies on the effectiveness, cost, and implementation of educational…
FY 1997 congressional budget request: Budget highlights
Energy Technology Data Exchange (ETDEWEB)
NONE
1996-03-01
This is an overview of the 1997 budget request for the US DOE. The topics of the overview include a policy overview, the budget by business line, business lines by organization, crosswalk from business line to appropriation, summary by appropriation, energy supply research and development, uranium supply and enrichment activities, uranium enrichment decontamination and decommissioning fund, general science and research, weapons activities, defense environmental restoration and waste management, defense nuclear waste disposal, departmental administration, Office of the Inspector General, power marketing administrations, Federal Energy Regulatory Commission, nuclear waste disposal fund, fossil energy research and development, naval petroleum and oil shale reserves, energy conservation, economic regulation, strategic petroleum reserve, energy information administration, clean coal technology and a Department of Energy Field Facilities map.
Transfer effects between moral dilemmas: a causal model theory.
Wiegmann, Alex; Waldmann, Michael R
2014-04-01
Evaluations of analogous situations are an important source for our moral intuitions. A puzzling recent set of findings in experiments exploring transfer effects between intuitions about moral dilemmas has demonstrated a striking asymmetry. Transfer often occurred with a specific ordering of moral dilemmas, but not when the sequence was reversed. In this article we present a new theory of transfer between moral intuitions that focuses on two components of moral dilemmas, namely their causal structure and their default evaluations. According to this theory, transfer effects are expected when the causal models underlying the considered dilemmas allow for a mapping of the highlighted aspect of the first scenario onto the causal structure of the second dilemma, and when the default evaluations of the two dilemmas substantially differ. The theory's key predictions for the occurrence and the direction of transfer effects between two moral dilemmas are tested in five experiments with various variants of moral dilemmas from different domains. A sixth experiment tests the predictions of the theory for how the target action in the moral dilemmas is represented.
Metacommunity speciation models and their implications for diversification theory.
Hubert, Nicolas; Calcagno, Vincent; Etienne, Rampal S; Mouquet, Nicolas
2015-08-01
The emergence of new frameworks combining evolutionary and ecological dynamics in communities opens new perspectives on the study of speciation. By acknowledging the relative contributions of local and regional dynamics in shaping the complexity of ecological communities, metacommunity theory sheds new light on the mechanisms underlying the emergence of species. Three integrative frameworks have been proposed, involving neutral dynamics, niche theory, and life-history trade-offs respectively. Here, we review these frameworks of metacommunity theory to emphasise that: (1) studies on speciation and community ecology have converged towards similar general principles by acknowledging the central role of dispersal in metacommunity dynamics; (2) considering the conditions of emergence and maintenance of new species in communities has given rise to new models of speciation embedded in metacommunity theory; (3) studies of diversification have shifted from relating phylogenetic patterns to landscapes' spatial and ecological characteristics towards integrative approaches that explicitly consider speciation in a mechanistic ecological framework. We highlight several challenges, in particular the need for a better integration of the eco-evolutionary consequences of dispersal and the need to increase our understanding of the relative rates of evolutionary and ecological changes in communities.
Spinors, strings, integrable models, and decomposed Yang-Mills theory
Ioannidou, Theodora; Jiang, Ying; Niemi, Antti J.
2014-07-01
This paper deals with various interrelations between strings and surfaces in three-dimensional ambient space, two-dimensional integrable models, and two-dimensional and four-dimensional decomposed SU(2) Yang-Mills theories. Initially, a spinor version of the Frenet equation is introduced in order to describe the differential geometry of static three-dimensional stringlike structures. Then its relation to the structure of the su(2) Lie algebra valued Maurer-Cartan one-form is presented, while by introducing time evolution of the string a Lax pair is obtained, as an integrability condition. In addition, it is shown how the Lax pair of the integrable nonlinear Schrödinger equation becomes embedded into the Lax pair of the time extended spinor Frenet equation, and it is described how a spinor-based projection operator formalism can be used to construct the conserved quantities, in the case of the nonlinear Schrödinger equation. Then the Lax pair structure of the time extended spinor Frenet equation is related to properties of flat connections in a two-dimensional decomposed SU(2) Yang-Mills theory. In addition, the connection between the decomposed Yang-Mills and the Gauß-Codazzi equation that describes surfaces in three-dimensional ambient space is presented. In that context the relation between isothermic surfaces and integrable models is discussed. Finally, the utility of the Cartan approach to differential geometry is considered. In particular, the similarities between the Cartan formalism and the structure of both two-dimensional and four-dimensional decomposed SU(2) Yang-Mills theories are discussed, while the description of two-dimensional integrable models as embedded structures in the four-dimensional decomposed SU(2) Yang-Mills theory is presented.
Renormalized parameters and perturbation theory in dynamical mean-field theory for the Hubbard model
Hewson, A. C.
2016-11-01
We calculate the renormalized parameters for the quasiparticles and their interactions for the Hubbard model in the paramagnetic phase as deduced from the low-energy Fermi-liquid fixed point using the results of a numerical renormalization-group calculation (NRG) and dynamical mean-field theory (DMFT). Even in the low-density limit there is significant renormalization of the local quasiparticle interaction U ˜, in agreement with estimates based on the two-particle scattering theory of J. Kanamori [Prog. Theor. Phys. 30, 275 (1963), 10.1143/PTP.30.275]. On the approach to the Mott transition we find a finite ratio for U ˜/D ˜ , where 2 D ˜ is the renormalized bandwidth, which is independent of whether the transition is approached by increasing the on-site interaction U or on increasing the density to half filling. The leading ω2 term in the self-energy and the local dynamical spin and charge susceptibilities are calculated within the renormalized perturbation theory (RPT) and compared with the results calculated directly from the NRG-DMFT. We also suggest, more generally from the DMFT, how an approximate expression for the q ,ω spin susceptibility χ (q ,ω ) can be derived from repeated quasiparticle scattering with a local renormalized scattering vertex.
U.S. Environmental Protection Agency — BAS is the central Agency system used to integrate strategic planning, annual planning, budgeting and financial management. BAS contains resource (dollars and FTE),...
Reasoning with Conditionals: A Test of Formal Models of Four Theories
Oberauer, Klaus
2006-01-01
The four dominant theories of reasoning from conditionals are translated into formal models: The theory of mental models (Johnson-Laird, P. N., & Byrne, R. M. J. (2002). Conditionals: a theory of meaning, pragmatics, and inference. "Psychological Review," 109, 646-678), the suppositional theory (Evans, J. S. B. T., & Over, D. E. (2004). "If."…
Toward a General Research Process for Using Dubin's Theory Building Model
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
The linear model and hypothesis a general unifying theory
Seber, George
2015-01-01
This book provides a concise and integrated overview of hypothesis testing in four important subject areas, namely linear and nonlinear models, multivariate analysis, and large sample theory. The approach used is a geometrical one based on the concept of projections and their associated idempotent matrices, thus largely avoiding the need to involve matrix ranks. It is shown that all the hypotheses encountered are either linear or asymptotically linear, and that all the underlying models used are either exactly or asymptotically linear normal models. This equivalence can be used, for example, to extend the concept of orthogonality in the analysis of variance to other models, and to show that the asymptotic equivalence of the likelihood ratio, Wald, and Score (Lagrange Multiplier) hypothesis tests generally applies.
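The geometric approach described here rests on idempotent projection matrices. A minimal numerical illustration of the hat matrix and its defining properties, using an arbitrary small design matrix:

```python
import numpy as np

def hat_matrix(X):
    """Orthogonal projection onto the column space of X."""
    return X @ np.linalg.inv(X.T @ X) @ X.T

# A small arbitrary design matrix (intercept plus one covariate)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
P = hat_matrix(X)
# P is idempotent (P @ P == P), symmetric, and trace(P) == rank(X),
# the three facts the projective treatment of linear models builds on
```

The classical F statistics then arise from comparing squared lengths of projections of the data onto nested subspaces, which is how the book avoids solving normal equations directly.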
Lesser, M. P.
2013-03-01
Historically, the response of marine invertebrates to their environment, and environmentally induced stress, has included some measurement of their physiology or metabolism. Eventually, this approach developed into comparative energetics and the construction of energetic budgets. More recently, coral reefs, and scleractinian corals in particular, have suffered significant declines due to climate change-related environmental stress. In addition to a number of physiological, biophysical and molecular measurements to assess "coral health," there has been increased use of energetic approaches that have included the measurement of specific biochemical constituents (i.e., lipid concentrations) as a proxy for energy available to assess the potential outcomes of environmental stress on corals. In reading these studies, there appears to be some confusion between energy budgets and carbon budgets. Moreover, many assumptions regarding proximate biochemical composition, metabolic fuel preferences and metabolic quotients have been made, all of which are essential to construct accurate energy budgets and to convert elemental composition (i.e., carbon) to energy equivalents. Furthermore, models of energetics such as the metabolic theory of ecology or dynamic energy budgets are being applied to coral physiology and include several assumptions that are not appropriate for scleractinian corals. As we assess the independent and interactive effects of multiple stressors on corals, efforts to construct quantitative energetic budgets should be a priority component of realistic multifactor experiments that would then improve the use of models as predictors of outcomes related to the effects of environmental change on corals.
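Converting proximate biochemical composition to energy equivalents, as discussed above, amounts to weighting each constituent by an enthalpy of combustion. A sketch using commonly cited approximate values; the exact coefficients vary by source and tissue and should be checked against the literature rather than taken from this example.

```python
# Approximate enthalpies of combustion in kJ per gram (commonly cited
# round numbers; treat these as placeholders, not authoritative values)
ENTHALPY_KJ_PER_G = {"protein": 23.9, "lipid": 39.5, "carbohydrate": 17.5}

def tissue_energy(composition_g):
    """Total energy content (kJ) of a tissue sample, given its proximate
    biochemical composition as grams of each constituent."""
    return sum(ENTHALPY_KJ_PER_G[k] * g for k, g in composition_g.items())
```

Because lipid carries roughly twice the energy per gram of carbohydrate, lipid concentration alone (a carbon-budget proxy) can misrepresent energy available when composition shifts under stress, which is the confusion the abstract flags.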
Standard Model in multiscale theories and observational constraints
Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David
2016-08-01
We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, obtaining an absolute bound on t* corresponding to E* > 28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute bound E* > 35 MeV. For α0 = 1/2, the Lamb shift alone yields E* > 450 GeV.
Budget Constraints Affect Male Rats' Choices between Differently Priced Commodities.
Directory of Open Access Journals (Sweden)
Marijn van Wingerden
Full Text Available Demand theory can be applied to analyse how a human or animal consumer changes her selection of commodities within a certain budget in response to changes in price of those commodities. This change in consumption assessed over a range of prices is defined as demand elasticity. Previously, income-compensated and income-uncompensated price changes have been investigated using human and animal consumers, as demand theory predicts different elasticities for both conditions. However, in these studies, demand elasticity was only evaluated over the entirety of choices made from a budget. As compensating budgets changes the number of attainable commodities relative to uncompensated conditions, and thus the number of choices, it remained unclear whether budget compensation has a trivial effect on demand elasticity by simply sampling from a different total number of choices or has a direct effect on consumers' sequential choice structure. If the budget context independently changes choices between commodities over and above price effects, this should become apparent when demand elasticity is assessed over choice sets of any reasonable size that are matched in choice opportunities between budget conditions. To gain more detailed insight in the sequential choice dynamics underlying differences in demand elasticity between budget conditions, we trained N=8 rat consumers to spend a daily budget by making a number of nosepokes to obtain two liquid commodities under different price regimes, in sessions with and without budget compensation. We confirmed that demand elasticity for both commodities differed between compensated and uncompensated budget conditions, also when the number of choices considered was matched, and showed that these elasticity differences emerge early in the sessions. These differences in demand elasticity were driven by a higher choice rate and an increased reselection bias for the preferred commodity in compensated compared to uncompensated budget conditions.
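Demand elasticity as used above is the slope of log consumption against log price. A minimal estimator (ordinary least squares on logs; the data in the test are synthetic, not the rat-consumer measurements):

```python
import math

def demand_elasticity(prices, quantities):
    """Own-price demand elasticity: OLS slope of log(quantity)
    regressed on log(price)."""
    lp = [math.log(p) for p in prices]
    lq = [math.log(q) for q in quantities]
    mp, mq = sum(lp) / len(lp), sum(lq) / len(lq)
    num = sum((x - mp) * (y - mq) for x, y in zip(lp, lq))
    den = sum((x - mp) ** 2 for x in lp)
    return num / den
```

An elasticity below -1 marks elastic demand (consumption drops faster than price rises); compensated and uncompensated budgets are predicted to yield different slopes for the same commodity.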
Modelling non-ignorable missing-data mechanisms with item response theory models
Holman, Rebecca; Glas, Cees A.W.
2005-01-01
A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled using an item response theory model.
Energy Technology Data Exchange (ETDEWEB)
Cooper, F.
1996-12-31
We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear sigma model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations.
A model theoretic Baire category theorem for simple theories
Shami, Ziv
2009-01-01
We define the class of $\tilde\tau_{low}^f$-sets. This is a class of type-definable sets defined in terms of forking by low formulas. We prove a model theoretic Baire category theorem for $\tilde\tau_{low}^f$-sets in a countable simple theory in which the extension property is first-order and show some of its applications. A typical application is the following. Let $T$ be a countable theory with the wnfcp (weak nonfinite cover property) and assume for every non-algebraic $a$ there exists a non-algebraic $a'\in\acl(a)$ such that $SU(a')<\omega$. Then there exists a weakly-minimal formula with parameters.
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows the construction of necessary-conditions statistical models of body-mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.
D=0 Matrix Model as Conjugate Field Theory
Ben-Menahem, S
1993-01-01
The D=0 matrix model is reformulated as a 2d nonlocal quantum field theory. The interactions occur on the one-dimensional line of hermitian matrix eigenvalues. The field is conjugate to the density of matrix eigenvalues which appears in the Jevicki-Sakita collective field theory. The classical solution of the field equation is either unique or labeled by a discrete index. Such a solution corresponds to the Dyson sea modified by an entropy term. The modification smoothes the sea edges, and interpolates between different eigenvalue bands for multiple-well potentials. Our classical eigenvalue density contains nonplanar effects, and satisfies a local nonlinear Schrödinger equation with similarities to the Marinari-Parisi $D=1$ reformulation. The quantum fluctuations about a classical solution are computable, and the IR and UV divergences are manifestly removed to all orders. The quantum corrections greatly simplify in the double scaling limit, and include both string-perturbative and nonperturbative effects.
A cellular automaton model for evacuation flow using game theory
Guan, Junbiao; Wang, Kaihua; Chen, Fangyue
2016-11-01
Game theory serves as a good tool to explore crowd dynamic conflicts during evacuation processes. The purpose of this study is to simulate the complicated interaction behavior among conflicting pedestrians in an evacuation flow. Two types of pedestrians, namely, defectors and cooperators, are considered, and two important factors, a fear index and a cost coefficient, are taken into account. By combining snowdrift game theory with a cellular automaton (CA) model, it is shown that increasing the fear index and cost coefficient lengthens the evacuation time, an effect that is more apparent for large values of the cost coefficient. Meanwhile, it is found that the defectors-to-cooperators ratio always tends to consistent states despite different parameter values, largely owing to self-organization effects.
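The snowdrift payoffs, together with a Fermi-type imitation rule often combined with CA models, can be sketched as follows. The parameterization (b, c, K) is illustrative, and the paper's exact update rule may differ.

```python
import math

def snowdrift_payoff(me, other, b=1.0, c=0.6):
    """Snowdrift game payoffs for strategies 'C' (cooperate) and 'D'
    (defect), with benefit b > cost c > 0: mutual cooperators split
    the cost, a lone cooperator bears it all, mutual defectors get 0."""
    if me == "C":
        return b - c / 2.0 if other == "C" else b - c
    return b if other == "C" else 0.0

def imitate_prob(my_payoff, neighbor_payoff, K=0.1):
    """Fermi rule: probability a pedestrian adopts a neighbor's strategy;
    K acts as a noise (irrationality) parameter."""
    return 1.0 / (1.0 + math.exp((my_payoff - neighbor_payoff) / K))
```

The snowdrift ordering (defecting against a cooperator beats mutual cooperation, but mutual defection is worst) is what sustains a mixed defector/cooperator population, consistent with the stable ratios reported in the abstract.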
Applications of the Likelihood Theory in Finance: Modelling and Pricing
Janssen, Arnold
2012-01-01
This paper discusses the connection between mathematical finance and statistical modelling which turns out to be more than a formal mathematical correspondence. We like to figure out how common results and notions in statistics and their meaning can be translated to the world of mathematical finance and vice versa. A lot of similarities can be expressed in terms of LeCam's theory for statistical experiments which is the theory of the behaviour of likelihood processes. For positive prices the arbitrage free financial assets fit into filtered experiments. It is shown that they are given by filtered likelihood ratio processes. From the statistical point of view, martingale measures, completeness and pricing formulas are revisited. The pricing formulas for various options are connected with the power functions of tests. For instance the Black-Scholes price of a European option has an interpretation as Bayes risk of a Neyman Pearson test. Under contiguity the convergence of financial experiments and option prices ...
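The Black-Scholes price of a European call mentioned above, in a minimal self-contained form using the error function for the standard normal CDF:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: spot S, strike K,
    maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
```

In the paper's statistical reading, the two normal CDF terms play the role of the power function of a Neyman-Pearson test evaluated under the two relevant measures.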
Plane answers to complex questions the theory of linear models
Christensen, Ronald
1987-01-01
This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and sub spaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...
The adhesion model as a field theory for cosmological clustering
Energy Technology Data Exchange (ETDEWEB)
Rigopoulos, Gerasimos, E-mail: rigopoulos@thphys.uni-heidelberg.de [Institut für Theoretische Physik, Universität Heidelberg, Philosophenweg 12, Heidelberg, 69120 Germany (Germany)
2015-01-01
The adhesion model has been proposed in the past as an improvement of the Zel'dovich approximation, providing a good description of the formation of the cosmic web. We recast the model as a field theory for cosmological large scale structure, adding a stochastic force to account for power generated from very short, highly non-linear scales that is uncorrelated with the initial power spectrum. The dynamics of this Stochastic Adhesion Model (SAM) is reminiscent of the well known Kardar-Parisi-Zhang equation with the difference that the viscosity and the noise spectrum are time dependent. Choosing the viscosity proportional to the growth factor D restricts the form of noise spectrum through a 1-loop renormalization argument. For this choice, the SAM field theory is renormalizable to one loop. We comment on the suitability of this model for describing the non-linear regime of the CDM power spectrum and its utility as a relatively simple approach to cosmological clustering.
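As an illustration (a schematic sketch in standard adhesion-model notation, not an equation taken from the paper), the Stochastic Adhesion Model dynamics described above can be written as a stochastically forced Burgers-type equation for the velocity field:

```latex
\partial_\tau \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = \nu(\tau)\,\nabla^2 \mathbf{u} + \boldsymbol{\eta}(\mathbf{x},\tau),
\qquad \nu(\tau) \propto D(\tau),
```

where the stochastic force η accounts for power generated on highly non-linear scales, and the time dependence of the viscosity ν and of the noise spectrum is what distinguishes this setting from the standard Kardar-Parisi-Zhang equation.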
Le Quéré, C.; Moriarty, R.; Andrew, R. M.; Canadell, J. G.; Sitch, S.; Korsbakken, J. I.; Friedlingstein, P.; Peters, G. P.; Andres, R. J.; Boden, T. A.; Houghton, R. A.; House, J. I.; Keeling, R. F.; Tans, P.; Arneth, A.; Bakker, D. C. E.; Barbero, L.; Bopp, L.; Chang, J.; Chevallier, F.; Chini, L. P.; Ciais, P.; Fader, M.; Feely, R. A.; Gkritzalis, T.; Harris, I.; Hauck, J.; Ilyina, T.; Jain, A. K.; Kato, E.; Kitidis, V.; Klein Goldewijk, K.; Koven, C.; Landschützer, P.; Lauvset, S. K.; Lefèvre, N.; Lenton, A.; Lima, I. D.; Metzl, N.; Millero, F.; Munro, D. R.; Murata, A.; Nabel, J. E. M. S.; Nakaoka, S.; Nojiri, Y.; O'Brien, K.; Olsen, A.; Ono, T.; Pérez, F. F.; Pfeil, B.; Pierrot, D.; Poulter, B.; Rehder, G.; Rödenbeck, C.; Saito, S.; Schuster, U.; Schwinger, J.; Séférian, R.; Steinhoff, T.; Stocker, B. D.; Sutton, A. J.; Takahashi, T.; Tilbrook, B.; van der Laan-Luijkx, I. T.; van der Werf, G. R.; van Heuven, S.; Vandemark, D.; Viovy, N.; Wiltshire, A.; Zaehle, S.; Zeng, N.
2015-12-01
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates as well as consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover change (some including nitrogen-carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global
A study of the logical model of capital market complexity theories
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
This paper analyzes the shortcomings of the classic capital market theories based on the efficient market hypothesis (EMH) and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptive dynamic system, and taking complexity science as the method for researching the operating laws of the capital market, the paper constructs a nonlinear logical model to analyze the scope of application, focal points and interrelationships of dissipative structure theory, chaos theory, fractal theory, synergetics, catastrophe theory and scale theory, and summarizes and discusses the achievements and open problems of each theory. Based on this research, the paper anticipates the future direction of complexity science in capital market research.
Interactions between causal models, theories, and social cognitive development.
Sobel, David M; Buchanan, David W; Butterfield, Jesse; Jenkins, Odest Chadwicke
2010-01-01
We propose a model of social cognitive development based not on a single modeling framework or the hypothesis that a single model accounts for children's developing social cognition. Rather, we advocate a Causal Model approach (cf. Waldmann, 1996), in which models of social cognitive development take the same position as theories of social cognitive development, in that they generate novel empirical hypotheses. We describe this approach and present three examples across various aspects of social cognitive development. Our first example focuses on children's understanding of pretense and involves only considering assumptions made by a computational framework. The second example focuses on children's learning from "testimony". It uses a modeling framework based on Markov random fields as a computational description of a set of empirical phenomena, and then tests a prediction of that description. The third example considers infants' generalization of action learned from imitation. Here, we use a modified version of the Rational Model of Categorization to explain children's inferences. Taken together, these examples suggest that research in social cognitive development can be assisted by considering how computational modeling can lead researchers towards testing novel hypotheses.
Directory of Open Access Journals (Sweden)
Morar Ioan Dan
2014-12-01
Full Text Available Public budgeting is an important issue for the public policy of the state, for the simple reason that without money from the state budget no public policy can be promoted. Budgetary policy mirrors the official doctrine and vision of the government, and is also a starting point for other public policies, which are in turn financed from the public budget. Through its fiscal policy instruments the state influences both the public sector, in all its structure, and the private sector. Tools such as grants, budgetary allocations, taxes, welfare payments in their various forms, direct investment and, not least, state aid are used by the state, through its budgetary policies, to influence the public and private sectors both directly and indirectly. Budgetary policies can be grouped, according to the structure of the public sector, into fiscal policy, budgeting and resource allocation policies, and policies for financing the budget deficit. An important issue for budgetary policy is the financing of the budget deficit. There are two funding possibilities: raising taxes, or resorting to public loans. Both options involve extra effort from taxpayers, either in the current fiscal year when they pay higher taxes, or in a future period when the public loans must be repaid. We know that, by virtue of the "fiscal compact", the structural deficits of the member countries of the EU are limited by the European Commission according to the macro-structural and budgetary stability of each member state. This tempers to some extent the budgetary appetite of member-state governments, but does not solve the problem of chronic budget deficits. Another issue addressed in this paper is public debt: its absolute amount, its relative level with respect to GDP, and the sources for financing and repaying it. The sources of public debt issuance and their monetary impact on the budget and on monetary stability are variables that must underpin the justification of budgetary
Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model
Aktaruzzaman, Md; Plunkett, Margaret
2016-01-01
Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…
Ducrot, Virginie; Péry, Alexandre R R; Mons, Raphaël; Quéau, Hervé; Charles, Sandrine; Garric, Jeanne
2007-08-01
This paper presents original toxicity test designs and mathematical models that may be used to assess the deleterious effects of toxicants on Valvata piscinalis (Mollusca, Gastropoda). Results obtained for zinc, used as a reference toxicant, are presented. The feeding behavior, juvenile survival, growth, age at puberty, onset of reproduction, number of breedings during the life cycle, and fecundity were significantly altered when the snails were exposed to zinc-spiked sediments. Dynamic energy budget models (DEBtox) adequately predicted the effects of zinc on the V. piscinalis life cycle. They also provided estimates for life-cycle parameters that were used to parameterize a demographic model based on a Z-transformed life-cycle graph. The effect threshold for the population growth rate (lambda) was estimated at 259 mg zinc/kg dry sediment, showing that significant changes in abundance may occur at environmental concentrations. Significant effects occurring just above this threshold value were mainly caused by the severe impairment of reproductive endpoints. Sensitivity analysis showed that the value of lambda depended mainly on the juvenile survival rate; the impairment of this parameter may result in extinction of V. piscinalis. Finally, the present study highlights the advantages of the proposed modeling approach for V. piscinalis and its possible transfer to other test species and contaminants.
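To illustrate the demographic step described above: the asymptotic population growth rate lambda of a life-cycle graph is the dominant eigenvalue of its projection matrix. The sketch below uses invented stage classes and vital rates, not the study's estimates for V. piscinalis.

```python
# Hypothetical illustration: lambda as the dominant eigenvalue of a
# life-cycle projection matrix, estimated by power iteration.
# All vital rates below are made up for the sketch.

def dominant_eigenvalue(matrix, iterations=1000):
    """Estimate the dominant eigenvalue of a non-negative matrix by power iteration."""
    n = len(matrix)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iterations):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)       # max-norm of the iterate
        v = [x / lam for x in w]           # renormalize the eigenvector estimate
    return lam

# Toy 3-stage projection matrix: fecundities on the first row,
# survival/transition probabilities below the diagonal.
A = [
    [0.0, 2.0, 6.0],   # offspring per juvenile / adult
    [0.3, 0.0, 0.0],   # juvenile survival (the most sensitive parameter)
    [0.0, 0.5, 0.8],   # maturation and adult survival
]

lam = dominant_eigenvalue(A)
print(f"population growth rate lambda = {lam:.3f}")  # lambda > 1: growing population
```

Lowering the juvenile survival entry (0.3) pushes lambda below 1, which mirrors the sensitivity result reported in the abstract.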
Theory, modelling and simulation in origins of life studies.
Coveney, Peter V; Swadling, Jacob B; Wattis, Jonathan A D; Greenwell, H Christopher
2012-08-21
Origins of life studies represent an exciting and highly multidisciplinary research field. In this review we focus on the contributions made by theory, modelling and simulation to addressing fundamental issues in the domain and the advances these approaches have helped to make in the field. Theoretical approaches will continue to make a major impact at the "systems chemistry" level based on the analysis of the remarkable properties of nonlinear catalytic chemical reaction networks, which arise due to the auto-catalytic and cross-catalytic nature of so many of the putative processes associated with self-replication and self-reproduction. In this way, we describe inter alia nonlinear kinetic models of RNA replication within a primordial Darwinian soup, the origins of homochirality and homochiral polymerization. We then discuss state-of-the-art computationally-based molecular modelling techniques that are currently being deployed to investigate various scenarios relevant to the origins of life.
Rigid Rotor as a Toy Model for Hodge Theory
Gupta, Saurabh
2009-01-01
We apply the superfield approach to the toy model of a rigid rotor and show the existence of the nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) and anti-BRST symmetry transformations, under which, the kinetic term and Lagrangian remain invariant. Furthermore, we also derive the off-shell nilpotent and absolutely anticommuting (anti-) co-BRST symmetry transformations, under which, the gauge-fixing term and Lagrangian remain invariant. The anticommutator of the above nilpotent symmetry transformations leads to the derivation of a bosonic symmetry transformation, under which, the ghost terms and Lagrangian remain invariant. Together, the above transformations (and their corresponding generators) respect an algebra that turns out to be the realization of the algebra obeyed by the de Rham cohomological operators of differential geometry. Thus, our present model is a toy model for the Hodge theory.
Rigid rotor as a toy model for Hodge theory
Gupta, Saurabh; Malik, R. P.
2010-07-01
We apply the superfield approach to the toy model of a rigid rotor and show the existence of the nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) and anti-BRST symmetry transformations, under which, the kinetic term and the action remain invariant. Furthermore, we also derive the off-shell nilpotent and absolutely anticommuting (anti-) co-BRST symmetry transformations, under which, the gauge-fixing term and the Lagrangian remain invariant. The anticommutator of the above nilpotent symmetry transformations leads to the derivation of a bosonic symmetry transformation, under which, the ghost terms and the action remain invariant. Together, the above transformations (and their corresponding generators) respect an algebra that turns out to be a physical realization of the algebra obeyed by the de Rham cohomological operators of differential geometry. Thus, our present model is a toy model for the Hodge theory.
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time that includes the working duration and possible delays. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, combining a risk analysis approach with a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
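A minimal sketch of the device-sizing idea (not the paper's actual model): treat a pool of identical devices as an M/M/c queue and pick the smallest c for which the Erlang-C probability of waiting stays below a target. The arrival and service rates below are invented for illustration.

```python
import math

def erlang_c(c, traffic):
    """Probability that an arrival must wait in an M/M/c queue (traffic = lambda/mu, in Erlangs)."""
    if traffic >= c:
        return 1.0  # unstable system: effectively every arrival waits
    top = (traffic ** c / math.factorial(c)) * (c / (c - traffic))
    bottom = sum(traffic ** k / math.factorial(k) for k in range(c)) + top
    return top / bottom

def min_devices(arrival_rate, service_rate, max_wait_prob):
    """Smallest number of devices keeping the waiting probability below the target."""
    traffic = arrival_rate / service_rate
    c = max(1, math.ceil(traffic))
    while erlang_c(c, traffic) > max_wait_prob:
        c += 1
    return c

# e.g. 12 requests/hour, 30-minute mean use (mu = 2/hour), <5% chance of waiting
print(min_devices(12, 2, 0.05))  # -> 11
```

The point of the exercise is visible in the numbers: pure workload would suggest 6 devices (12/2 Erlangs), but meeting the continuity target requires a substantially larger pool.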
BASIC THEORY AND MATHEMATICAL MODELING OF URBAN RAINSTORM WATER LOGGING
Institute of Scientific and Technical Information of China (English)
LI Da-ming; ZHANG Hong-ping; LI Bing-fei; XIE Yi-yang; LI Pei-yan; HAN Su-qin
2004-01-01
In this paper, a mathematical model for the urban rainstorm water logging was established on the basis of one- and two-dimensional unsteady flow theory and the technique of non-structural irregular grid division. The continuity equation was discretized with the finite volume method. And the momentum equations were differently simplified and discretized for different cases. A method of "special passage" was proposed to deal with small-scale rivers and open channels. The urban drainage system was simplified and simulated in the model. The method of "open slot" was applied to coordinate the alternate calculation of open channel flow and pressure flow in drainage pipes. The model has been applied in Tianjin City and the verification is quite satisfactory.
Study of a model equation in detonation theory: multidimensional effects
Faria, Luiz M; Rosales, Rodolfo R
2015-01-01
We extend the reactive Burgers equation presented in Kasimov et al. Phys. Rev. Lett., 110 (2013) and Faria et al. SIAM J. Appl. Maths, 74 (2014), to include multidimensional effects. Furthermore, we explain how the model can be rationally justified following the ideas of the asymptotic theory developed in Faria et al. JFM (2015). The proposed model is a forced version of the unsteady small disturbance transonic flow equations. We show that for physically reasonable choices of forcing functions, traveling wave solutions akin to detonation waves exist. It is demonstrated that multidimensional effects play an important role in the stability and dynamics of the traveling waves. Numerical simulations indicate that solutions of the model tend to form multi-dimensional patterns analogous to cells in gaseous detonations.
Dynamic statistical models of biological cognition: insights from communications theory
Wallace, Rodrick
2014-10-01
Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.
Density Functional Theory and Materials Modeling at Atomistic Length Scales
Directory of Open Access Journals (Sweden)
Swapan K. Ghosh
2002-04-01
Full Text Available We discuss the basic concepts of density functional theory (DFT) as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT, the central equation is a one-particle Schrödinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.
Structure and asymptotic theory for nonlinear models with GARCH errors
Directory of Open Access Journals (Sweden)
Felix Chan
2015-01-01
Full Text Available Nonlinear time series models, especially those with regime-switching and/or conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we derive sufficient conditions for strict stationarity and ergodicity of three different specifications of the first-order smooth transition autoregressions with heteroskedastic errors. This is essential, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. We also provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator for a general nonlinear conditional mean model with first-order GARCH errors.
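As a concrete illustration of the process class studied above, the sketch below simulates a first-order logistic smooth transition autoregression (LSTAR(1)) with GARCH(1,1) errors. All parameter values are invented and deliberately chosen to satisfy the usual stationarity conditions (alpha + beta < 1 for the variance; |phi1| and |phi1 + phi2| < 1 for the mean).

```python
import math
import random

def simulate_lstar_garch(n, phi1=0.3, phi2=0.4, gamma=5.0, c=0.0,
                         omega=0.1, alpha=0.1, beta=0.8, seed=1):
    """Simulate an LSTAR(1) mean with GARCH(1,1) errors (illustrative parameters)."""
    random.seed(seed)
    y = 0.0
    h = omega / (1 - alpha - beta)  # start at the unconditional variance
    eps = 0.0
    path = []
    for _ in range(n):
        h = omega + alpha * eps ** 2 + beta * h      # GARCH(1,1) conditional variance
        eps = math.sqrt(h) * random.gauss(0, 1)      # heteroskedastic shock
        G = 1 / (1 + math.exp(-gamma * (y - c)))     # logistic transition function
        y = phi1 * y + phi2 * G * y + eps            # regime-weighted AR(1) mean
        path.append(y)
    return path

path = simulate_lstar_garch(5000)
print(len(path), min(path), max(path))
```

With these parameters the simulated path stays bounded, as the stationarity conditions derived in the paper would lead one to expect; setting phi1 + phi2 >= 1 instead produces explosive trajectories in the upper regime.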
Twinlike models for self-dual Maxwell-Higgs theories
Energy Technology Data Exchange (ETDEWEB)
Bazeia, D.; Hora, E. da [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Dept. de Fisica; Menezes, R. [Universidade Federal da Paraiba (UFPB), Rio Tinto, PB (Brazil). Dept. de Ciencias Exatas
2012-07-01
Full text: We present the development of a coherent first-order theoretical framework that allows for the existence of twinlike models in the context of generalized self-dual Maxwell-Higgs theories. Here, the generalized model is controlled by two dimensionless functions, h(|φ|) and w(|φ|), which are functions of the Higgs field only. Furthermore, these functions are assumed to be positive, in order to avoid problems with the energy of the overall model. In this context, for a given constraint between h(|φ|), w(|φ|) and the symmetry-breaking Higgs potential V(|φ|), we find BPS equations by minimizing the generalized total energy. Even in the presence of non-trivial choices of h(|φ|), w(|φ|) and V(|φ|), the resulting equations mimic the standard first-order ones. Therefore, given the usual finite-energy boundary conditions, the generalized model engenders the very same topologically non-trivial configurations as the standard Maxwell-Higgs theory. The energy of such configurations is bounded from below, this bound being equal to the magnetic flux, which is quantized according to the winding number, as expected. Even so, it is important to note that the generalized model is not a parametrization of the usual Maxwell-Higgs one. We study the resulting BPS configurations via the canonical radially symmetric Ansatz, and we find electrically uncharged time-independent vortex solutions that engender the very same magnetic field. Since this field is related to the topological stability of the usual vortex configurations, we believe the stability of the generalized model can be achieved in the same way. We use a combination of theoretical and numerical techniques to show that different choices of h(|φ|) and w(|φ|) engender different energy densities, even in the presence of the very same field solutions. In this case, we show that all these densities give the very same total energy. (author)
Rigorously testing multialternative decision field theory against random utility models.
Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg
2014-06-01
Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
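For reference, the logit benchmark mentioned above reduces to a softmax over option utilities; the utility values below are invented for illustration.

```python
import math

def logit_choice_probabilities(utilities):
    """Multinomial logit: choice probabilities as a softmax over utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

probs = logit_choice_probabilities([1.0, 0.5, 0.0])
print(probs)  # probabilities sum to 1, ordered by utility
```

A design note relevant to Study 2: logit probabilities depend only on the options' own utilities (independence of irrelevant alternatives), so context effects of the kind the study elicits are exactly what this benchmark cannot capture.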
The Gaussian streaming model and Lagrangian effective field theory
Vlah, Zvonimir; White, Martin
2016-01-01
We update the ingredients of the Gaussian streaming model (GSM) for the redshift-space clustering of biased tracers using the techniques of Lagrangian perturbation theory, effective field theory (EFT) and a generalized Lagrangian bias expansion. After relating the GSM to the cumulant expansion, we present new results for the real-space correlation function, mean pairwise velocity and pairwise velocity dispersion including counter terms from EFT and bias terms through third order in the linear density, its leading derivatives and its shear up to second order. We discuss the connection to the Gaussian peaks formalism. We compare the ingredients of the GSM to a suite of large N-body simulations, and show the performance of the theory on the low order multipoles of the redshift-space correlation function and power spectrum. We highlight the importance of a general biasing scheme, which we find to be as important as higher-order corrections due to non-linear evolution for the halos we consider on the scales of int...
The Gaussian streaming model and convolution Lagrangian effective field theory
Vlah, Zvonimir; Castorina, Emanuele; White, Martin
2016-12-01
We update the ingredients of the Gaussian streaming model (GSM) for the redshift-space clustering of biased tracers using the techniques of Lagrangian perturbation theory, effective field theory (EFT) and a generalized Lagrangian bias expansion. After relating the GSM to the cumulant expansion, we present new results for the real-space correlation function, mean pairwise velocity and pairwise velocity dispersion including counter terms from EFT and bias terms through third order in the linear density, its leading derivatives and its shear up to second order. We discuss the connection to the Gaussian peaks formalism. We compare the ingredients of the GSM to a suite of large N-body simulations, and show the performance of the theory on the low order multipoles of the redshift-space correlation function and power spectrum. We highlight the importance of a general biasing scheme, which we find to be as important as higher-order corrections due to non-linear evolution for the halos we consider on the scales of interest to us.
Serpa, Dalila; Ferreira, Pedro Pousão; Ferreira, Hugo; da Fonseca, Luís Cancela; Dinis, Maria Teresa; Duarte, Pedro
2013-02-01
Fish growth models may help understanding the influence of environmental, physiological and husbandry factors on fish production, providing crucial information to maximize the growth rates of cultivated species. The main objectives of this work were to: i) develop and implement an Individual Based Model using a Dynamic Energy Budget (IBM-DEB) approach to simulate the growth of two commercially important Sparidae species in semi-intensive earth ponds, the white seabream which is considered as a potential candidate for Mediterranean aquaculture and the gilthead seabream that has been cultivated since the early 80s; ii) evaluate which model parameters are more likely to affect fish performance, and iii) investigate which parameters might account for growth differences between the cultivated species. The model may be run in two modes: the "state variable" mode, in which an average fish is simulated with a particular parameter set and the "Individual Based Model" (IBM) mode that simulates a population of n fishes, each with its specific parameter set assigned randomly. The IBM mode has the advantage of allowing a quick model calibration and an evaluation of the parameter sets that produce the best fit between predicted and observed fish growth. Results revealed that the model reproduces reasonably well the growth of the two seabreams. Fish performance was mainly affected by parameters related to feed ingestion/assimilation and reserves utilization, suggesting that special attention should be taken in the estimation of these parameters when applying the model to other species. Comparing the DEB parameters set of the two sparids it seems that the white seabream's low growth rates are a result of higher maintenance costs and a lower feed assimilation efficiency. Hence, the development of new feed formulations may be crucial for the success of white seabream production in semi-intensive earth ponds.
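A minimal sketch of the "IBM mode" described above: each of n simulated fish receives its own randomly perturbed parameter set and grows along a von Bertalanffy curve, the length-growth pattern a DEB model predicts at constant food. The parameter values are illustrative, not the calibrated seabream values.

```python
import random

def simulate_population(n, days=365, L0=1.0, seed=42):
    """IBM-style sketch: n fish, each with its own randomly drawn growth parameters."""
    random.seed(seed)
    lengths = []
    for _ in range(n):
        Linf = random.gauss(30.0, 2.0)    # ultimate length (cm), individual-specific
        rB = random.gauss(0.005, 0.0005)  # von Bertalanffy growth rate (1/day)
        L = L0
        for _ in range(days):
            L += rB * (Linf - L)          # dL/dt = rB * (Linf - L), daily Euler step
        lengths.append(L)
    return lengths

pop = simulate_population(200)
print(sum(pop) / len(pop))  # mean length after one year
```

Fitting such a simulated population to observed size distributions is what lets the IBM mode identify which parameter sets best reproduce growth, the calibration shortcut the abstract highlights.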
Directory of Open Access Journals (Sweden)
G. Formetta
2011-04-01
Full Text Available This paper presents a discussion of the predictive capacity of the first implementation of the semi-distributed hydrological modeling system JGrass-NewAge. This model focuses on the hydrological balance of medium-scale to large-scale basins, and considers statistics of the processes at the hillslope scale. The whole modeling system consists of six main parts: (i) estimation of the energy balance; (ii) estimation of evapotranspiration; (iii) snow modelling; (iv) estimation of runoff production; (v) aggregation and propagation of flows in channels; and (vi) description of intakes, out-takes, and reservoirs. This paper details the processes of runoff production and the aggregation/propagation of flows on a river network. The system is based on a hillslope-link geometrical partition of the landscape, so the basic unit where the budget is evaluated consists of hillslopes that drain into a single associated link rather than cells or pixels. To this conceptual partition corresponds an informatics implementation that uses vectorial features for channels and raster data for hillslopes. Runoff production at each channel link is estimated through a combination of the Duffy (1996) model and a GIUH model for estimating residence times in hillslopes. Routing in channels uses equations integrated for each channel link, and produces discharges at any link end, for any link in the river network. The model has been tested against measured discharges according to indexes of goodness of fit such as the RMSE and the Nash-Sutcliffe efficiency. The characteristic ability to reproduce discharge at any point of the river network is used to infer some statistics and, notably, the scaling properties of the modeled discharge.
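The two goodness-of-fit indexes mentioned above have standard textbook definitions, sketched below with made-up discharge values.

```python
import math

def rmse(observed, modeled):
    """Root mean square error between observed and modeled series."""
    return math.sqrt(sum((o - m) ** 2 for o, m in zip(observed, modeled)) / len(observed))

def nash_sutcliffe(observed, modeled):
    """Nash-Sutcliffe efficiency: 1 = perfect fit; <= 0 = no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

obs = [12.0, 15.0, 30.0, 22.0, 18.0]  # hypothetical measured discharges (m^3/s)
sim = [11.0, 16.0, 27.0, 23.0, 17.0]  # hypothetical modeled discharges
print(rmse(obs, sim), nash_sutcliffe(obs, sim))
```

Unlike RMSE, the Nash-Sutcliffe index is dimensionless, which is why it is the customary benchmark for comparing discharge fits across basins of different size.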
Theory and Modeling of High-Power Gyrotrons
Energy Technology Data Exchange (ETDEWEB)
Nusinovich, Gregory Semeon [Univ. of Maryland, College Park, MD (United States)
2016-04-29
This report summarizes the results of the work performed at the Institute for Research in Electronics and Applied Physics of the University of Maryland (College Park, MD) in the framework of the DOE Grant “Theory and Modeling of High-Power Gyrotrons”. The report covers the work performed in 2011-2014. The research was carried out in three directions: - possibilities of stable gyrotron operation in very high-order modes offering output power exceeding the 1 MW level in long-pulse/continuous-wave regimes; - the effect of small imperfections in gyrotron fabrication and alignment on gyrotron efficiency and operation; - some issues in the physics of beam-wave interaction in gyrotrons.
Mean-field theory and self-consistent dynamo modeling
Energy Technology Data Exchange (ETDEWEB)
Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)
2001-12-01
Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)
Psychodynamic theory and pastoral theology: an integrated model.
Garanzini, M J
The article proposes that a pastoral concern for gay and lesbian individuals must be sensitive to the psychological and social dynamics involved in their attachments, separations, and losses. Drawing on object relations theory and insights from self-psychology, a model is proposed whereby counselor and counselee can examine the cycle of attachment, separation, loss and reattachment that characterizes all important relationships. The suggestion is made that this cycle is applicable to the development and reformulation of life-giving myths and ways of being in the world. Finally, an analysis of the role of early narcissistic wounds and the healing process in therapy is presented.
Quantum field theory for the electroweak Standard Model
Kleiss, R
2008-01-01
In these notes I present the content of relativistic quantum field theory, and the way it purports to describe the electroweak Standard Model of particle physics, in the way that most appeals to me. I can claim neither exhaustiveness nor absolute mathematical rigour: after all, the subject is physics, not mathematics. The emphasis is on physicality and applicability, and I therefore concentrate more on Feynman rules and Feynman diagrams than on hypothesized Lagrangians. The unavoidable drawback of this is that symmetry considerations retreat somewhat into the background, leaving the limelight to diagrammatic results. This is all right: for I do not at present believe that symmetry rules the world.
Modeling of tethered satellite formations using graph theory
DEFF Research Database (Denmark)
Larsen, Martin Birkelund; Smith, Roy S; Blanke, Mogens
2011-01-01
Analyses of how tethered satellite systems could form stable formations in space are cumbersome when done on a case-to-case basis, and a common framework providing a basic model of the dynamics of tethered satellite formations can therefore be advantageous. This paper suggests the use of graph-theoretical quantities to describe a tethered satellite formation and proposes a method to deduce the equations of motion for the attitude dynamics of the formation in a compact form. The use of graph theory and Lagrange mechanics together allows a broad class of formations to be described using the same framework. A method is stated for finding …
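As a toy illustration of the graph-theoretic bookkeeping described above (not the paper's actual formulation; the three-satellite ring formation and all names below are invented for the sketch), the tether interconnections can be encoded as a graph whose oriented incidence matrix yields the Laplacian coupling the satellites in a compact equations-of-motion derivation:

```python
import numpy as np
import networkx as nx

# Hypothetical three-satellite ring formation; edges represent tethers.
G = nx.Graph([("sat1", "sat2"), ("sat2", "sat3"), ("sat3", "sat1")])

# Oriented incidence matrix B: one column per tether, -1/+1 at its endpoints.
B = nx.incidence_matrix(G, nodelist=["sat1", "sat2", "sat3"],
                        oriented=True).toarray()

# The graph Laplacian L = B B^T records which satellites each tether couples;
# this is the structural quantity a compact Lagrangian formulation can reuse
# for any formation topology.
L = B @ B.T
```

For the ring, each satellite has degree 2 and is coupled to its two neighbors, so L is the familiar Laplacian of a triangle graph.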
Stochastic Modeling of Viral Disease: Statistical Mechanics and Network Theory
Zhou, Hao; Deem, Michael
2007-04-01
Theoretical methods of statistical mechanics are developed and applied to study the immunological response against viral diseases such as dengue. We use this theory to show how the immune response to the four different dengue serotypes may be sculpted. It is the ability of avian influenza to change and to mix that has given rise to the fear of a new human flu pandemic. Here we propose a stochastic model based on a scale-free network to investigate mitigation strategies and analyze the risk.
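A minimal sketch of the kind of scale-free-network stochastic epidemic model the abstract alludes to: an illustrative discrete-time SIR process on a preferential-attachment graph. The parameter values and function name are chosen for the sketch, not taken from the paper:

```python
import random
import networkx as nx

def sir_on_network(G, beta=0.2, gamma=0.1, seed=0, initial_infected=1,
                   max_steps=500):
    """Discrete-time stochastic SIR on an arbitrary contact network.
    Returns the final number of ever-infected nodes (attack size)."""
    rng = random.Random(seed)
    state = {n: "S" for n in G}
    for n in rng.sample(list(G), initial_infected):
        state[n] = "I"
    for _ in range(max_steps):
        infected = [n for n in G if state[n] == "I"]
        if not infected:
            break  # epidemic over
        new_inf, new_rec = [], []
        for n in infected:
            for nb in G[n]:                      # transmission along edges
                if state[nb] == "S" and rng.random() < beta:
                    new_inf.append(nb)
            if rng.random() < gamma:             # recovery
                new_rec.append(n)
        for n in new_inf:
            state[n] = "I"
        for n in new_rec:
            state[n] = "R"
    return sum(1 for s in state.values() if s != "S")

# Scale-free contact structure, as in preferential-attachment models.
attack_size = sir_on_network(nx.barabasi_albert_graph(200, 2, seed=1), seed=0)
```

Mitigation strategies (e.g. immunizing high-degree hubs) can then be compared by removing nodes before the simulation and re-measuring the attack size.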
Molecular Thermodynamic Modeling of Fluctuation Solution Theory Properties
DEFF Research Database (Denmark)
O’Connell, John P.; Abildskov, Jens
2013-01-01
Fluctuation Solution Theory provides relationships between integrals of the molecular pair total and direct correlation functions and the pressure derivative of solution density, partial molar volumes, and composition derivatives of activity coefficients. For dense fluids, the integrals follow … for densities and gas solubilities, including ionic liquids and complex mixtures such as coal liquids. The approach is especially useful in systems with strong nonidealities. This chapter describes successful applications of such modeling to a wide variety of systems treated over several decades and suggests how …
LM2-Mercury, a mercury mass balance model, was developed to simulate and evaluate the transport, fate, and biogeochemical transformations of mercury in Lake Michigan. The model simulates total suspended solids (TSS), dissolved organic carbon (DOC), and total, elemental, divalent, ...
Behera, Abhinna; Rivière, Emmanuel; Marécal, Virginie; Claud, Chantal; Rysman, Jean-François; Geneviève, Seze
2016-04-01
The water vapour budget is a key component of the Earth's climate system. In the tropical upper troposphere/lower stratosphere (UTLS), it plays a central role in both the radiative and the chemical budgets. Its abundance is mostly driven by slow ascent above the level of net zero radiative heating, followed by ice crystal formation and sedimentation, the so-called cold trap. In contrast to this large-scale, temperature-driven process, overshooting convection penetrating the stratosphere could be one piece of the puzzle: it has been proven to hydrate the lower stratosphere at the local scale. Satellite-borne H2O instruments cannot measure the water vapour enhancements caused by overshooting convection at a fine enough resolution, so it is difficult to estimate the role of overshooting deep convection at the global scale. Using a mesoscale model, the Brazilian Regional Atmospheric Modelling System (BRAMS), past atmospheric conditions have been simulated for a full wet season (Nov 2012 to Mar 2013) on a single grid with a horizontal resolution of 20 km × 20 km covering a large part of Brazil and South America. This resolution is too coarse to reproduce overshooting convection in the model, so this simulation serves as a reference (REF) simulation without the impact of overshooting convection on the TTL water budget. European Centre for Medium-Range Weather Forecasts (ECMWF) analyses have been used for initialisation and for nudging the grid boundaries every 6 hours. The size distributions of hydrometeors and the number of cloud condensation nuclei (CCN) are fitted to best reproduce accumulated precipitation derived from the Tropical Rainfall Measuring Mission (TRMM). Similarly, GOES and MSG IR images have been thoroughly compared with the model's outputs, using image-correlation statistics for the position of the clouds. The model's H2O variability during the wet season is compared with in situ balloon-borne measurements during
Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.
2011-01-01
The Weather Research and Forecasting Model (WRF) and the Regional Atmospheric Mesoscale Model System (RAMS) are frequently used for (regional) weather, climate, and air quality studies. This paper evaluates these models for a windy and a calm episode against Cabauw tower observations (Net
Minaya, Veronica; Corzo, Gerald; van der Kwast, Johannes; Mynett, Arthur
2016-04-01
Many terrestrial biogeochemistry process models have been applied around the world at different scales and for a large range of ecosystems. Grasslands, in particular those located in the Andean region, are essential ecosystems that sustain important ecological processes; however, only a few efforts have been made to estimate gross primary production (GPP) and hydrological budgets for this specific ecosystem along an altitudinal gradient. A previous study, one of the few available in the region, considered the heterogeneity of the main properties of the páramo vegetation and showed significant differences in plant functional types, site/soil parameters, and daily meteorology. This study extends that work with a spatio-temporal analysis of BIOME-BGC model results, simulating GPP and water fluxes in space and time by means of an altitudinal analysis. The catchment on the southwestern slope of the Antisana volcano in Ecuador was selected as representative of the Andean páramos and for its hydrological importance as one of the main sources of a water supply reservoir in the region. An accurate estimation of temporal changes in GPP in the region is important for carbon budget assessments, evaluation of the impact of climate change, and biomass productivity. The problem was addressed with the ecosystem process model BIOME-BGC; the results were evaluated and associated with the land cover map on which the growth forms of vegetation were identified. The responses of GPP and the water fluxes depended not only on the environmental drivers but also on the ecophysiology and the site-specific parameters. The model estimated that GPP at lower elevations is double that at higher elevations, which might have large implications for extrapolation to larger spatio-temporal scales. The outcomes of the stand hydrological processes demonstrated a wrong
Zhu, Q.; Jiang, H.; Liu, J.; Peng, C.; Fang, X.; Yu, S.; Zhou, G.; Wei, X.; Ju, W.
2011-01-01
The regional carbon budget of the climatic transition zone may be very sensitive to climate change and increasing atmospheric CO2 concentrations. This study simulated the carbon cycles under these changes using process-based ecosystem models. The Integrated Biosphere Simulator (IBIS), a Dynamic Global Vegetation Model (DGVM), was used to evaluate the impacts of climate change and CO2 fertilization on net primary production (NPP), net ecosystem production (NEP), and the vegetation structure of terrestrial ecosystems in Zhejiang province (area 101,800 km2, mainly covered by subtropical evergreen forest and warm-temperate evergreen broadleaf forest) which is located in the subtropical climate area of China. Two general circulation models (HADCM3 and CGCM3) representing four IPCC climate change scenarios (HC3AA, HC3GG, CGCM-sresa2, and CGCM-sresb1) were used as climate inputs for IBIS. Results show that simulated historical biomass and NPP are consistent with field and other modelled data, which makes the analysis of future carbon budget reliable. The results indicate that NPP over the entire Zhejiang province was about 55 Mt C yr-1 during the last half of the 21st century. An NPP increase of about 24 Mt C by the end of the 21st century was estimated with the combined effects of increasing CO2 and climate change. A slight NPP increase of about 5 Mt C was estimated under the climate change alone scenario. Forests in Zhejiang are currently acting as a carbon sink with an average NEP of about 2.5 Mt C yr-1. NEP will increase to about 5 Mt C yr-1 by the end of the 21st century with the increasing atmospheric CO2 concentration and climate change. However, climate change alone will reduce the forest carbon sequestration of Zhejiang's forests. Future climate warming will substantially change the vegetation cover types; warm-temperate evergreen broadleaf forest will be gradually substituted by subtropical evergreen forest. An increasing CO2 concentration will have little
Corvid re-caching without 'theory of mind': a model.
Directory of Open Access Journals (Sweden)
Elske van der Vaart
Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds, using a kind of 'virtual bird' whose behavior depends on a set of basic assumptions about corvid cognition and on a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.
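To make the proposed mechanism concrete, here is a deliberately minimal sketch of a stress-driven caching-desire rule in the spirit of the 'virtual bird'. The function name, weights, and the linear form are illustrative assumptions, not the authors' model:

```python
def caching_desire(onlooker_present, onlooker_dominance, failed_recoveries,
                   base=1.0, w_watch=1.0, w_dom=0.5, w_fail=0.5):
    """Illustrative stress model: the desire to (re-)cache grows with being
    watched, with the onlooker's dominance, and with unsuccessful recovery
    attempts (e.g. caused by memory errors). All weights are hypothetical."""
    stress = 0.0
    if onlooker_present:
        stress += w_watch + w_dom * onlooker_dominance   # social stress
    stress += w_fail * failed_recoveries                 # recovery failures
    return base + stress
```

Under such a rule the virtual bird caches more after being observed, and more still after observation by a dominant onlooker, reproducing re-caching patterns without any attribution of mental states.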
A theoretical model for Reynolds-stress and dissipation-rate budgets in near-wall region
Institute of Scientific and Technical Information of China (English)
陆利蓬; 陈矛章
2000-01-01
A 3-D wave model for the turbulent coherent structures in the near-wall region is proposed. The transport nature of the Reynolds stresses and the dissipation rate of the turbulence kinetic energy are shown via computation based on the theoretical model. The mean velocity profile is also computed using the same theoretical model. The theoretical results are in good agreement with those found from DNS, indicating that the proposed theoretical model can correctly describe the physical mechanism of turbulence in the near-wall region, and it thus possibly opens a new way for turbulence modeling in this region.
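For reference, the Reynolds-stress budget addressed by such models has the standard textbook form (shown here for orientation; the notation is not necessarily that of the paper):

```latex
\frac{\partial \langle u_i u_j \rangle}{\partial t}
 + U_k \frac{\partial \langle u_i u_j \rangle}{\partial x_k}
 = P_{ij} + T_{ij} + \Pi_{ij} - \varepsilon_{ij}
 + \nu \nabla^2 \langle u_i u_j \rangle,
\qquad
P_{ij} = -\langle u_i u_k \rangle \frac{\partial U_j}{\partial x_k}
         -\langle u_j u_k \rangle \frac{\partial U_i}{\partial x_k},
```

where \(P_{ij}\) is production, \(T_{ij}\) turbulent transport, \(\Pi_{ij}\) the pressure-strain correlation, and \(\varepsilon_{ij}\) the dissipation tensor; in the near-wall region the transport and dissipation terms dominate the balance that the wave model aims to reproduce.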
Directory of Open Access Journals (Sweden)
Ryo Oizumi
Despite the fact that density effects and individual differences in life history are considered important for evolution, these factors lead to several difficulties in understanding the evolution of life history, especially when population sizes reach the carrying capacity. r/K selection theory explains what types of life strategies evolve in the presence of density effects and individual differences. However, the relationship between the life schedules of individuals and population size remains unclear, even if the theory can classify life strategies appropriately. To address this issue, we propose equations for adaptive life strategies under r/K selection with density effects absent or present. The equations describe not only the adaptive life history but also the population dynamics. Furthermore, the equations can incorporate temporal individual differences, referred to as internal stochasticity. Our framework reveals that maximizing density effects is an evolutionarily stable strategy related to the carrying capacity. A significant consequence of our analysis is that adaptive strategies under both selection regimes maximize an identical function, providing both the population growth rate and the carrying capacity. We apply our method to an optimal foraging problem in a semelparous species model and demonstrate that the adaptive strategy yields a lower intrinsic growth rate as well as a lower basic reproductive number than other strategies. This study proposes that the diversity of life strategies arises from the effects of density and internal stochasticity.
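The r/K dichotomy referenced above is conventionally anchored in the logistic model, shown here only as background (the paper's own equations generalize well beyond it):

```latex
\frac{dN}{dt} = r N \left(1 - \frac{N}{K}\right),
```

where r-selection favors a high intrinsic growth rate \(r\) when density effects are weak, and K-selection favors performance near the carrying capacity \(K\) when the density term dominates.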
The Interface Between Theory and Data in Structural Equation Models
Grace, James B.; Bollen, Kenneth A.
2006-01-01
Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite, for representing general concepts. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling general relationships of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially reduced form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influences of suites of variables are often of interest.
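A minimal numerical sketch of the composite-variable idea described above: a composite as an estimated weighted sum of observed predictors, so that a single variable carries the collective influence of a suite of variables. The variable names and data are invented, and real SEM software would be used in practice:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Two observed soil indicators (hypothetical) and a response they jointly drive.
soil_ph = rng.normal(size=n)
soil_n = rng.normal(size=n)
y = 0.8 * soil_ph + 0.3 * soil_n + rng.normal(scale=0.5, size=n)

# Composite "soil condition": weights estimated from the data, so the single
# composite carries the collective influence of both indicators on y.
X = np.column_stack([soil_ph, soil_n])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
soil_condition = X @ w
```

By construction, regressing y on the composite recovers a unit path coefficient, while the estimated weights summarize how the individual indicators contribute; this is the sense in which composites "specify the influences of collections of other variables."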
Ju, W.; Chen, J.; Liu, R.; Liu, Y.
2013-12-01
The process-based Boreal Ecosystem Productivity Simulator (BEPS) model was employed in conjunction with spatially distributed leaf area index (LAI), land cover, soil, and climate data to simulate the carbon budget of global terrestrial ecosystems during the period from 1981 to 2008. The BEPS model was first calibrated and validated using gross primary productivity (GPP), net primary productivity (NPP), and net ecosystem productivity (NEP) measured in different ecosystems across the world. Then, four global simulations were conducted at daily time steps and a spatial resolution of 8 km to quantify the global terrestrial carbon budget and to identify the relative contributions of changes in climate, atmospheric CO2 concentration, and LAI to the global terrestrial carbon sink. The long-term LAI data used to drive the model were generated by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) and historical Advanced Very High Resolution Radiometer (AVHRR) data pixel by pixel. The meteorological fields were interpolated from the 0.5° global daily meteorological dataset produced by the land surface hydrological research group at Princeton University. The results show that the BEPS model was able to simulate carbon fluxes in different ecosystems. Simulated GPP, NPP, and NEP values and their temporal trends exhibited distinguishable spatial patterns. During the period from 1981 to 2008, global terrestrial ecosystems acted as a carbon sink. The averaged global totals of GPP, NPP, and NEP were 122.70 Pg C yr-1, 56.89 Pg C yr-1, and 2.76 Pg C yr-1, respectively. The global totals of GPP and NPP increased greatly, at rates of 0.43 Pg C yr-2 (R2 = 0.728) and 0.26 Pg C yr-2 (R2 = 0.709), respectively. Global total NEP did not show an apparent increasing trend (R2 = 0.036), averaging 2.26 Pg C yr-1, 3.21 Pg C yr-1, and 2.72 Pg C yr-1 for the periods from 1981 to 1989, from 1990 to 1999, and from 2000 to 2008, respectively. The magnitude and temporal trend of global
Fernandez, Charles; Soize, Christian; Gagliardini, Laurent
2009-01-01
The fuzzy structure theory was introduced 20 years ago to model the effects of imprecisely known complex subsystems on a master structure. This theory was aimed only at structural dynamics. In this paper, an extension of the theory is proposed by developing an elastoacoustic element useful for modeling sound-insulation layers in computational vibroacoustics of complex systems. The simplified model saves computation time and memory because the number of physical and generalized degrees of freedom in the computational vibroacoustic model is not increased. However, these simplifications introduce model uncertainties. To take these uncertainties into account, the recently introduced nonparametric probabilistic approach is used. A robust simplified model for sound-insulation layers is then obtained; this model is controlled by a small number of physical and dispersion parameters. First, the extension of the fuzzy structure theory to an elastoacoustic element is presented. Second, the computational vibroacoustic model including such an elastoacoustic element to model a sound-insulation layer is given. Then, a design methodology to identify the model parameters from experiments is proposed and experimentally validated. Finally, the theory is applied to an uncertain vibroacoustic system.
A theory and a computational model of spatial reasoning with preferred mental models.
Ragni, Marco; Knauff, Markus
2013-07-01
Inferences about spatial arrangements and relations like "The Porsche is parked to the left of the Dodge and the Ferrari is parked to the right of the Dodge, thus, the Porsche is parked to the left of the Ferrari," are ubiquitous. However, spatial descriptions are often interpretable in many different ways and compatible with several alternative mental models. This article suggests that individuals tackle such indeterminate multiple-model problems by constructing a single, simple, and typical mental model but neglect other possible models. The model that first comes to reasoners' minds is the preferred mental model. It helps save cognitive resources but also leads to reasoning errors and illusory inferences. The article presents a preferred model theory and an instantiation of this theory in the form of a computational model, preferred inferences in reasoning with spatial mental models (PRISM). PRISM can be used to simulate and explain how preferred models are constructed, inspected, and varied in a spatial array that functions as if it were a spatial working memory. A spatial focus inserts tokens into the array, inspects the array to find new spatial relations, and relocates tokens in the array to generate alternative models of the problem description, if necessary. The article also introduces a general measure of difficulty based on the number of necessary focus operations (rather than the number of models). A comparison with results from psychological experiments shows that the theory can explain preferences, errors, and the difficulty of spatial reasoning problems.
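As an illustrative, much simplified rendering of preferred-model construction (tokens inserted into an array in premise order, with new relations read off by inspection), the following sketch uses an insertion rule and function names that are our assumptions, not PRISM's full machinery:

```python
def preferred_model(premises):
    """Build one left-to-right array from premises of the form
    (x, 'left-of'/'right-of', y), inserting each new token at the first
    free position next to its already-placed neighbor. The single array
    returned plays the role of the preferred mental model."""
    model = []
    for x, rel, y in premises:
        a, b = (x, y) if rel == "left-of" else (y, x)  # a ends up left of b
        if a in model and b in model:
            continue
        if a in model:
            model.insert(model.index(a) + 1, b)
        elif b in model:
            model.insert(model.index(b), a)
        else:
            model.extend([a, b])
    return model

def holds(model, x, rel, y):
    """Inspect the array for a relation not stated in the premises."""
    left = model.index(x) < model.index(y)
    return left if rel == "left-of" else not left
```

On the Porsche/Dodge/Ferrari example from the abstract, inspection of the single constructed array yields the conclusion directly; indeterminate descriptions would require relocating tokens to generate alternative models, which this sketch omits.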
Pluralistic and stochastic gene regulation: examples, models and consistent theory.
Salas, Elisa N; Shu, Jiang; Cserhati, Matyas F; Weeks, Donald P; Ladunga, Istvan
2016-06-01
We present a theory of pluralistic and stochastic gene regulation. To bridge the gap between empirical studies and mathematical models, we integrate pre-existing observations with our meta-analyses of the ENCODE ChIP-Seq experiments. Earlier evidence includes fluctuations in the levels, location, activity, and binding of transcription factors, variable DNA motifs, and bursts in gene expression. Stochastic regulation is also indicated by the frequently subdued effects of knockout mutants of regulators, their evolutionary losses/gains, and massive rewiring of regulatory sites. We report widespread pluralistic regulation in ≈800 000 tightly co-expressed pairs of diverse human genes. Typically, half of ≈50 observed regulators bind to both genes reproducibly, twice as many as in independently expressed gene pairs. We also examine the largest set of co-expressed genes, which code for cytoplasmic ribosomal proteins. Numerous regulatory complexes are highly significantly enriched in ribosomal genes compared to highly expressed non-ribosomal genes. We could not find any DNA-associated, strict-sense master regulator. Despite major fluctuations in transcription factor binding, our machine learning model accurately predicted transcript levels using the binding sites of 20+ regulators. Our pluralistic and stochastic theory is consistent with partially random binding patterns, redundancy, stochastic regulator binding, burst-like expression, degeneracy of binding motifs, and massive regulatory rewiring during evolution.
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek the rational decision, but we use prospect theory to solve for the attacker's decision, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision: whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions.
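A compact sketch of the prospect-theoretic ingredients named above, using the familiar Tversky-Kahneman functional forms and parameter estimates. This is a simplified separable weighting, not the paper's full treatment, and all parameter values are illustrative:

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses
    (lam > 1 encodes loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def pt_weight(p, gamma=0.61):
    """Probability weighting: overweights small probabilities and
    underweights large ones (likelihood insensitivity)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_utility(outcomes):
    """Evaluate a prospect given as (payoff, probability) pairs."""
    return sum(pt_weight(p) * pt_value(x) for x, p in outcomes)
```

In an attacker-defender model of the kind described, the defender's side would still be evaluated with expected utility, while the attacker's choice among attack options would be ranked by `prospect_utility`, so the defender's optimal screening policy can be re-solved as `lam` and `gamma` vary.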
Theory and Modeling for the Magnetospheric Multiscale Mission
Hesse, M.; Aunai, N.; Birn, J.; Cassak, P.; Denton, R. E.; Drake, J. F.; Gombosi, T.; Hoshino, M.; Matthaeus, W.; Sibeck, D.; Zenitani, S.
2016-03-01
The Magnetospheric Multiscale (MMS) mission will provide measurement capabilities which will exceed those of earlier and even contemporary missions by orders of magnitude. MMS will, for the first time, be able to measure directly and with sufficient resolution key features of the magnetic reconnection process, down to the critical electron scales, which need to be resolved to understand how reconnection works. Owing to the complexity and extremely high spatial resolution required, no prior measurements exist that could be employed to guide the definition of measurement requirements and, consequently, set essential parameters for mission planning and execution. Insight into expected details of the reconnection process could hence only be obtained from theory and modern kinetic modeling. This situation was recognized early on by MMS leadership, which supported the formation of a fully integrated Theory and Modeling Team (TMT). The TMT participated in all aspects of mission planning, from the proposal stage to individual aspects of instrument performance characteristics. It provided and continues to provide to the mission the latest insights regarding the kinetic physics of magnetic reconnection, as well as associated particle acceleration and turbulence, assuring that, to the best of modern knowledge, the mission is prepared to resolve the inner workings of the magnetic reconnection process. The present paper provides a summary of key recent results of reconnection research by TMT members.