When do mixotrophs specialize? Adaptive dynamics theory applied to a dynamic energy budget model.
Troost, T.A.; Kooi, B.W.; Kooijman, S.A.L.M.
2005-01-01
In evolutionary history, several events have occurred at which mixotrophs specialized into pure autotrophs and heterotrophs. We studied the conditions under which such events take place, using the Dynamic Energy Budget (DEB) theory for the physiological rules of the organisms' metabolism and Adaptive Dynamics theory.
Augustine, Starrlight; Rosa, Sara; Kooijman, Sebastiaan A. L. M.; Carlotti, François; Poggiale, Jean-Christophe
2014-11-01
Parameters for the standard Dynamic Energy Budget (DEB) model were estimated for the purple mauve stinger, Pelagia noctiluca, using literature data. Overall, the model predictions are in good agreement with data covering the full life cycle. The parameter set we obtain suggests that P. noctiluca is well adapted to survive long periods of starvation, since the predicted maximum reserve capacity is extremely high. Moreover, we predict that the reproductive output of larger individuals is relatively insensitive to changes in food level, while wet mass and length are. Furthermore, the parameters imply that even if food were scarce (ingestion levels only 14% of the maximum for a given size) an individual would still mature and be able to reproduce. We present detailed model predictions for embryo development and discuss the developmental energetics of the species, such as the fact that the metabolism of ephyrae accelerates for several days after birth. Finally, we explore a number of concrete testable model predictions which will help to guide future research. The application of DEB theory to the collected data allowed us to conclude that P. noctiluca combines maximizing allocation to reproduction with rather extreme capabilities to survive starvation. The combination of these properties might explain why P. noctiluca is a rapidly growing concern to fisheries and tourism.
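The starvation argument can be illustrated with a minimal sketch of standard DEB reserve dynamics (illustrative parameter values, not the estimated P. noctiluca set): under starvation the scaled functional response f is 0, so scaled reserve density e decays as de/dt = -e·v/L, and the time for e to reach a critical threshold grows with initial reserve and with body length.

```python
# Minimal DEB starvation sketch. Assumptions (not from the paper): Euler
# integration of de/dt = (f - e) * v / L with f = 0 under starvation;
# v is the energy conductance (cm/d), L the structural length (cm),
# e_crit the reserve level below which the animal is at risk.

def starvation_time(e0, v, L, e_crit=0.1, dt=0.01):
    """Days until scaled reserve density falls below e_crit while starving."""
    e, t = e0, 0.0
    while e > e_crit:
        e -= e * (v / L) * dt  # f = 0: reserve is only drained
        t += dt
    return t
```

Larger individuals (bigger L) and fuller reserves (bigger e0) last longer, consistent with the prediction that a high reserve capacity underlies the starvation tolerance of P. noctiluca.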
Directory of Open Access Journals (Sweden)
Newland, Mike J.; Rickard, Andrew R.; Sherwen, Tomás; Evans, Mathew J.; Vereecken, Luc; Muñoz, Amalia; Ródenas, Milagros; Bloss, William J.
2018-05-01
The experimental results are interpreted through theoretical studies of the SCI unimolecular reactions and bimolecular reactions with H2O, characterised for α-pinene and β-pinene at the M06-2X/aug-cc-pVTZ level of theory. The theoretically derived rates agree with the experimental results within the uncertainties. A global modelling study, applying the experimental results within the GEOS-Chem chemical transport model, suggests that > 97 % of the total monoterpene-derived global SCI burden is comprised of SCIs with a structure that determines that they react slowly with water and that their atmospheric fate is dominated by unimolecular reactions. Seasonally averaged boundary layer concentrations of monoterpene-derived SCIs reach up to 1.4 × 10⁴ cm⁻³ in regions of elevated monoterpene emissions in the tropics. Reactions of monoterpene-derived SCIs with SO2 account for < 1 % globally but may account for up to 60 % of the gas-phase SO2 removal over areas of tropical forests, with significant localised impacts on the formation of sulfate aerosol and hence the lifetime and distribution of SO2.
Building information modeling in budgeting
Directory of Open Access Journals (Sweden)
Strnad, Michal
2017-12-01
Construction is a financially demanding activity whose place of implementation constantly changes. The basic idea of a budget is to determine all the costs that may arise during construction work. The budget must be a transparent and effective means of communication in supplier-customer relationships. For this reason it is essential to give the budget a structure, which is nowadays represented by the price system. It is important to adhere to the principles of budgeting and to technical standards, and it is necessary to have good source documents for budgeting, such as the project documentation. However, the range of construction products is among the most extensive, and because construction has the longest production cycle of any industry, a product group can change several times during the investment phase, both materially and in cost.
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, and Skolem functions.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Radiation budget measurement/model interface
Vonderhaar, T. H.; Ciesielski, P.; Randel, D.; Stevens, D.
1983-01-01
This final report includes research results from the period February, 1981 through November, 1982. Two new results combine to form the final portion of this work. They are the work by Hanna (1982) and Stevens to successfully test and demonstrate a low-order spectral climate model and the work by Ciesielski et al. (1983) to combine and test the new radiation budget results from NIMBUS-7 with earlier satellite measurements. Together, the two related activities set the stage for future research on radiation budget measurement/model interfacing. Such combination of results will lead to new applications of satellite data to climate problems. The objectives of this research under the present contract are therefore satisfied. Additional research reported herein includes the compilation and documentation of the radiation budget data set at Colorado State University and the definition of climate-related experiments suggested after lengthy analysis of the satellite radiation budget experiments.
A Theory of the Perturbed Consumer with General Budgets
DEFF Research Database (Denmark)
McFadden, Daniel L; Fosgerau, Mogens
We consider demand systems for utility-maximizing consumers facing general budget constraints whose utilities are perturbed by additive linear shifts in marginal utilities. Budgets are required to be compact but are not required to be convex. We define demand generating functions (DGF) whose subgradients with respect to these perturbations are convex hulls of the utility-maximizing demands. We give necessary as well as sufficient conditions for DGF to be consistent with utility maximization, and establish under quite general conditions that utility-maximizing demands are almost everywhere single-valued and smooth in their arguments. We also give sufficient conditions for integrability of perturbed demand. Our analysis provides a foundation for applications of consumer theory to problems with nonlinear budget constraints.
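The DGF construction described above admits a compact sketch (notation assumed here, not taken from the paper): writing ε for the vector of additive perturbations to marginal utilities and B for the compact, possibly non-convex, budget set,

```latex
% Assumed notation: u = utility, B = compact budget set,
% \varepsilon = additive linear perturbation of marginal utilities.
G(\varepsilon) \;=\; \max_{x \in B}\,\bigl[\,u(x) + \varepsilon' x\,\bigr],
\qquad
x^{*}(\varepsilon) \;\in\; \partial G(\varepsilon).
```

As a pointwise maximum of affine functions of ε, G is convex, and by a Danskin-type envelope argument its subdifferential ∂G(ε) is the convex hull of the maximizers, which is the sense in which DGF subgradients recover utility-maximizing demands.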
Radiation budget measurement/model interface research
Vonderhaar, T. H.
1981-01-01
The NIMBUS 6 data were analyzed to form an up-to-date climatology of the Earth radiation budget as a basis for numerical model definition studies. Global maps depicting infrared emitted flux, net flux and albedo from processed NIMBUS 6 data for July, 1977, are presented. Zonal averages of net radiation flux for April, May, and June and zonal mean emitted flux and net flux for the December to January period are also presented. The development of two models is reported. The first is a statistical dynamical model with vertical and horizontal resolution. The second model is a two-level global linear balance model. The results of time integration of the model up to 120 days, to simulate the January circulation, are discussed. Average zonal wind, meridional wind component, vertical velocity, and moisture budget are among the parameters addressed.
Parameterising a generic model for the dynamic energy budget of Antarctic krill, Euphausia superba.
Jager, T.; Ravagnan, E.
2015-01-01
Dynamic Energy Budget (DEB) theory is a generic and comprehensive framework for understanding bioenergetics over the entire life cycle of an organism. Here, we apply a simplified model derived from this theory (DEBkiss) to Antarctic krill Euphausia superba. The model was parameterised using growth data.
Nambe Pueblo Water Budget and Forecasting model.
Energy Technology Data Exchange (ETDEWEB)
Brainard, James Robert
2009-10-01
This report documents the Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components: the Water Forecast Component, the Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Water Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user-specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and Reservoir Model components. The Reservoir Model Component contains two sections: (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model, where historic, pre-dam or above-dam USGS stream flow data are fed into the model and releases are calculated.
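The release-and-shortage bookkeeping described above can be sketched as a simple stock-and-flow loop (a hypothetical minimal Python version with made-up numbers; the actual model is implemented in Powersim Studio and is far richer):

```python
# Toy reservoir mass balance: storage accumulates inflow (spilling above
# capacity) and is drawn down by releases; any demand that cannot be met
# is recorded as shortage. All quantities share one unit (e.g. acre-feet).

def simulate_reservoir(inflows, demands, capacity, storage=0.0):
    """Step through one period per inflow/demand pair; return (storage, shortage)."""
    shortage = 0.0
    for inflow, demand in zip(inflows, demands):
        storage = min(storage + inflow, capacity)  # spill above capacity
        release = min(demand, storage)             # cannot release more than stored
        shortage += demand - release
        storage -= release
    return storage, shortage
```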
Validation of a Dynamic Energy Budget (DEB) model for the blue mussel
Saraiva, S.; van der Meer, J.; Kooijman, S.A.L.M.; Witbaard, R.; Philippart, C.J.M.; Hippler, D.; Parker, R.
2012-01-01
A model for bivalve growth was developed and the results were tested against field observations. The model is based on the Dynamic Energy Budget (DEB) theory and includes an extension of the standard DEB model to cope with changing food quantity and quality. At four different locations in the North Sea, model predictions were compared with field observations.
Galic, Nika; Forbes, Valery E.
2017-03-01
Human activities have been modifying ecosystems for centuries, from harvesting wild populations to modifying habitats through urbanization and agricultural activities. Changes in global climate patterns are adding another layer of often unpredictable perturbations to the ecosystems on which we rely for life support [1,2]. To ensure the sustainability of ecosystem services, especially at this point in time when the human population is estimated to grow by another 2 billion by 2050 [3], we need to predict possible consequences of our actions and suggest relevant solutions [4,5]. We face several challenges when estimating adverse impacts of our actions on ecosystems. We describe these in the context of ecological risk assessment of chemicals. Firstly, when attempting to assess risk from exposure to chemicals, we base our decisions on a very limited number of species that are easily cultured and kept in the lab. We assume that preventing risk to these species will also protect all of the untested species present in natural ecosystems [6]. Secondly, although we know that chemicals interact with other stressors in the field, the number of stressors that we can test is limited for logistical and ethical reasons. Similarly, empirical approaches are limited in both spatial and temporal scale for logistical, financial and ethical reasons [7,8]. To bypass these challenges, we can develop ecological models that integrate relevant life history and other information and make testable predictions across relevant spatial and temporal scales [8-10].
Geček, Sunčana
2017-03-01
Jusup and colleagues, in their recent review on the physics of metabolic organization [1], discuss in detail the motivational considerations and common assumptions of Dynamic Energy Budget (DEB) theory, supply readers with a practical guide to DEB-based modeling, demonstrate the construction and dynamics of the standard DEB model, and illustrate several applications. The authors take a step forward from the existing literature by seamlessly bridging the dichotomy between (i) the thermodynamic foundations of the theory (which are often more accessible and understandable to physicists and mathematicians), and (ii) the resulting bioenergetic models (mostly used by biologists in real-world applications).
Prest, M
1988-01-01
In recent years the interplay between model theory and other branches of mathematics has led to many deep and intriguing results. In this, the first book on the topic, the theme is the interplay between model theory and the theory of modules. The book is intended to be a self-contained introduction to the subject and introduces the requisite model theory and module theory as it is needed. Dr Prest develops the basic ideas concerning what can be said about modules using the information which may be expressed in a first-order language. Later chapters discuss stability-theoretic aspects of modules.
3D modeling of satellite spectral images, radiation budget and energy budget of urban landscapes
Gastellu-Etchegorry, J. P.
2008-12-01
DART EB is a model being developed for simulating the 3D (three-dimensional) energy budget of urban and natural scenes, possibly with topography and atmosphere. It simulates all non-radiative energy mechanisms (heat conduction, turbulent momentum and heat fluxes, water reservoir evolution, etc.). It uses the DART model (Discrete Anisotropic Radiative Transfer) for simulating radiative mechanisms: the 3D radiative budget of 3D scenes and their remote sensing images expressed in terms of reflectance or brightness temperature values, for any atmosphere, wavelength, sun/view direction, altitude and spatial resolution. It uses an innovative multispectral approach (ray tracing, exact kernel, discrete ordinate techniques) over the whole optical domain. This paper presents two major and recent improvements of DART for adapting it to urban canopies: (1) simulation of the geometry and optical characteristics of urban elements (houses, etc.), and (2) modeling of thermal infrared emission by vegetation and urban elements. The new DART version was used in the context of the CAPITOUL project. For that, districts of the Toulouse urban database (AutoCAD format) were translated into DART scenes. This allowed us to simulate visible, near infrared and thermal infrared satellite images of Toulouse districts. Moreover, the 3D radiation budget was used by DART EB for simulating the time evolution of a number of geophysical quantities of various surface elements (roads, walls, roofs). Results were successfully compared with ground measurements of the CAPITOUL project.
Balancing the books - a statistical theory of prospective budgets in Earth System science
O'Kane, J. Philip
An honest declaration of the error in a mass, momentum or energy balance, ɛ, simply raises the question of its acceptability: "At what value of ɛ is the attempted balance to be rejected?" Answering this question requires a reference quantity against which to compare ɛ. This quantity must be a mathematical function of all the data used in making the balance. To deliver this function, a theory grounded in a workable definition of acceptability is essential. A distinction must be drawn between a retrospective balance and a prospective budget in relation to any natural space-filling body. Balances look to the past; budgets look to the future. The theory is built on the application of classical sampling theory to the measurement and closure of a prospective budget. It satisfies R.A. Fisher's "vital requirement that the actual and physical conduct of experiments should govern the statistical procedure of their interpretation". It provides a test, which rejects, or fails to reject, the hypothesis that the closing error on the budget, when realised, was due to sampling error only. By increasing the number of measurements, the discrimination of the test can be improved, controlling both the precision and accuracy of the budget and its components. The cost-effective design of such measurement campaigns is discussed briefly. This analysis may also show when campaigns to close a budget on a particular space-filling body are not worth the effort for either scientific or economic reasons. Other approaches, such as those based on stochastic processes, lack this finality, because they fail to distinguish between different types of error in the mismatch between a set of realisations of the process and the measured data.
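One concrete shape such a closure test can take is the familiar normal-theory form below (an assumed textbook version for illustration, not necessarily the exact statistic the theory delivers): each budget component is a sample mean with a standard error, and under the null hypothesis that the closing error is due to sampling error only, the standardized closing error is approximately standard normal.

```python
import math

# Hypothetical closure test: component_means are the estimated budget terms
# (signed so that a perfect balance sums to zero), component_ses their
# standard errors. Reject closure when |z| exceeds the critical value.

def closure_test(component_means, component_ses, z_crit=1.96):
    eps = sum(component_means)                         # closing error
    se = math.sqrt(sum(s * s for s in component_ses))  # SE of the sum
    z = eps / se
    return z, abs(z) > z_crit                          # (statistic, reject?)
```

More measurements shrink the standard errors and sharpen the test's discrimination, which is the sense in which such a theory can guide the design of measurement campaigns.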
Modeling Budget Optimum Allocation of Khorasan Razavi Province Agriculture Sector
Directory of Open Access Journals (Sweden)
Seyed Mohammad Fahimifard
2016-09-01
Introduction: Capital shortage is one of the main development bottlenecks in developing countries, and the agriculture sector faces the greatest limitations from it. The share of Iran's agricultural sector in total investment since the Islamic revolution (1979) has been just 5.5 percent. This causes low efficiency in Iran's agriculture sector: for instance, each cubic meter of water used in agriculture produces less than 1 kilogram of dry food, and each Iranian farmer earns less annual income and has less mechanization than farmers in the countries used for comparison in Iran's 1404 perspective document. It is therefore clear that increasing investment in the agriculture sector and optimizing its budget allocation are necessary, but these tasks have not yet been adequately and scientifically addressed. Thus, in this research the optimum budget allocation of the agriculture sector of Iran's Khorasan Razavi province was modeled. Materials and Methods: Optimum budget allocation among agriculture programs was first modeled by combining three indexes: (1) the priorities of the province's agriculture-sector experts, elicited with the Analytical Hierarchy Process (AHP); (2) the average share of agriculture programs in the 4th national development program for the province's agriculture sector; and (3) the average share of agriculture programs in the 5th national development program for the province's agriculture sector. The Delphi technique was then used to determine the potential indexes of each program, and the determined potential indexes were weighted using AHP. Finally, a numerical taxonomy model was used to optimize the allocation of each program's budget among cities under two scenarios. Required data were gathered from the budget and planning
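The AHP step used for the first index can be sketched as follows (a hypothetical 3x3 pairwise-comparison matrix, not the study's data): program priorities are the normalized principal eigenvector of the reciprocal comparison matrix, approximated here by power iteration.

```python
# Power iteration for AHP priority weights. M is a reciprocal
# pairwise-comparison matrix: M[i][j] is how many times more important
# item i is judged than item j (so M[j][i] = 1 / M[i][j]).

def ahp_weights(M, iters=100):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]  # renormalize each iteration
    return w

# Hypothetical expert judgments for three programs A, B, C.
M = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
```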
Budget model can aid group practice planning.
Bender, A D
1991-12-01
A medical practice can enhance its planning by developing a budgetary model to test effects of planning assumptions on its profitability and cash requirements. A model focusing on patient visits, payment mix, patient mix, and fee and payment schedules can help assess effects of proposed decisions. A planning model is not a substitute for planning but should complement a plan that includes mission, goals, values, strategic issues, and different outcomes.
Electric solar wind sail mass budget model
Directory of Open Access Journals (Sweden)
P. Janhunen
2013-02-01
The electric solar wind sail (E-sail) is a new type of propellantless propulsion system for Solar System transportation, which uses the natural solar wind to produce spacecraft propulsion. The E-sail consists of thin centrifugally stretched tethers that are kept charged by an onboard electron gun and, as such, experience Coulomb drag through the high-speed solar wind plasma stream. This paper discusses a mass breakdown and a performance model for an E-sail spacecraft that hosts a mission-specific payload of prescribed mass. In particular, the model is able to estimate the total spacecraft mass and its propulsive acceleration as a function of various design parameters such as the number of tethers and their length. A number of subsystem masses are calculated assuming existing or near-term E-sail technology. In light of the obtained performance estimates, an E-sail represents a promising propulsion system for a variety of transportation needs in the Solar System.
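The kind of bookkeeping such a model formalizes can be sketched in a few lines (all figures below are rough illustrative assumptions, not the paper's subsystem values): total mass sums payload, bus and tether-related hardware, and the propulsive acceleration is thrust per unit tether length times total tether length, divided by total mass.

```python
# Back-of-envelope E-sail mass budget. Assumed placeholder values:
# ~11 g/km tether linear density, 0.1 kg of reel hardware per tether,
# and ~0.5 mN of solar-wind thrust per km of charged tether near 1 au.

def esail_performance(n_tethers, tether_len_km,
                      payload_kg=100.0, bus_kg=50.0,
                      tether_kg_per_km=0.011,
                      reel_kg_per_tether=0.1,
                      thrust_N_per_km=5e-4):
    tether_mass = n_tethers * tether_len_km * tether_kg_per_km
    hardware_mass = n_tethers * reel_kg_per_tether
    total_mass = payload_kg + bus_kg + tether_mass + hardware_mass
    thrust = n_tethers * tether_len_km * thrust_N_per_km  # newtons
    return total_mass, thrust / total_mass                # (kg, m/s^2)
```

Varying `n_tethers` and `tether_len_km` then traces out the mass/acceleration trade-off that the paper's full model quantifies with real subsystem data.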
Nisbet, Roger M.
2017-03-01
Jusup et al. [1] provide a comprehensive review of Dynamic Energy Budget (DEB) theory - a theory of metabolic organization that has its roots in a model by S.A.L.M. Kooijman [2] and has evolved over three decades into a remarkably general theory whose use appears to be growing exponentially. The definitive text on DEB theory [3] is a challenging (though exceptionally rewarding) read, and previous reviews (e.g. [4,5]) have provided focused summaries of some of its main themes, targeted at specific groups of readers. The strong case for a further review is well captured in the abstract: "Hitherto, the foundations were more accessible to physicists or mathematicians, and the applications to biologists, causing a dichotomy in what always should have been a single body of work." In response to this need, Jusup et al. provide a review that combines a lucid, rigorous exposition of the core components of DEB theory with a diverse collection of DEB applications. They also highlight some recent advances, notably the rapidly growing online database of DEB model parameters (451 species on 15 August 2016 according to [1]; now, just a few months later, over 500 species).
How processing digital elevation models can affect simulated water budgets
Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.
2009-01-01
For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevations available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is because average groundwater gradients are about one-third greater with the centroid value than with the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate, because the model cell water table value should be representative of the entire cell area, not just the centroid of the model cell.
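The two aggregation choices compared in the article are easy to make concrete (a minimal 1-D sketch with made-up elevations; real processing works on 2-D rasters):

```python
# For each coarse model cell, derive the land surface either from the fine
# DEM value nearest the cell centroid or from the mean of all fine values
# in the cell. Rough terrain makes the two surfaces, and hence the water
# budgets driven by them, diverge.

def cell_elevations(dem_row, cell_size):
    """Return (centroid_value, mean_value) for each cell of a 1-D DEM row."""
    out = []
    for start in range(0, len(dem_row), cell_size):
        block = dem_row[start:start + cell_size]
        centroid = block[len(block) // 2]   # value nearest the cell centre
        mean = sum(block) / len(block)
        out.append((centroid, mean))
    return out
```

In this toy row the centroid method picks up local peaks that the mean smooths away, mirroring the article's finding of steeper gradients, and more simulated flow, with the centroid method.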
Holman, Gordon D.
1989-01-01
The primary purpose of the Theory and Modeling Group meeting was to identify scientists engaged or interested in theoretical work pertinent to the Max '91 program, and to encourage theorists to pursue modeling which is directly relevant to data which can be expected to result from the program. A list of participants and their institutions is presented. Two solar flare paradigms were discussed during the meeting -- the importance of magnetic reconnection in flares and the applicability of numerical simulation results to solar flare studies.
Directory of Open Access Journals (Sweden)
Михаил Николаевич Грачев
2011-09-01
This article shows that the implementation of budget policy at the regional level is not free from contradictions, which arise in the course of allocating resources and expenditure responsibilities between levels of government, and often acquires a conflictual character. Drawing on the basic theories, the authors propose a typology of conflicts between regional governments and municipalities that can be used in inter-budget relationships.
DEFF Research Database (Denmark)
Andersen, Asger Lau; Lassen, David Dreyer; Nielsen, Lasse Holbøll Westh
are negative rather than positive; and when there is divided government. We test the hypotheses of the model using a unique data set of late budgets for US state governments, based on dates of budget approval collected from news reports and a survey of state budget officers for the period 1988
Maccia, Elizabeth S.; and others
An annotated bibliography of 20 items and a discussion of its significance were presented to describe current utilization of subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…
Dependent-Chance Programming Models for Capital Budgeting in Fuzzy Environments
Institute of Scientific and Technical Information of China (English)
LIANG Rui; GAO Jinwu
2008-01-01
Capital budgeting is concerned with maximizing the total net profit subject to budget constraints by selecting an appropriate combination of projects. This paper presents chance-maximizing models for capital budgeting with fuzzy input data and multiple conflicting objectives. When the decision maker sets a prospective profit level and wants to maximize the chances of the total profit achieving the prospective profit level, a fuzzy dependent-chance programming model, a fuzzy multi-objective dependent-chance programming model, and a fuzzy goal dependent-chance programming model are used to formulate the fuzzy capital budgeting problem. A fuzzy simulation based genetic algorithm is used to solve these models. Numerical examples are provided to illustrate the effectiveness of the simulation-based genetic algorithm and the potential applications of these models.
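The dependent-chance idea can be sketched with a deliberately simplified stand-in (plain Monte Carlo over assumed triangular profit distributions instead of the paper's fuzzy simulation, and brute-force enumeration instead of a genetic algorithm; all numbers hypothetical): choose the affordable project subset that maximizes the chance of total profit reaching the prospective level.

```python
import itertools
import random

# Each project has a cost and a (low, high, mode) triangular profit spread.
# best_portfolio enumerates subsets within budget and estimates, by
# simulation, the chance that total profit reaches the target level.

def best_portfolio(costs, profit_tris, budget, target, n_sim=2000, seed=1):
    rng = random.Random(seed)
    best, best_chance = (), -1.0
    for k in range(len(costs) + 1):
        for subset in itertools.combinations(range(len(costs)), k):
            if sum(costs[i] for i in subset) > budget:
                continue  # violates the budget constraint
            hits = sum(
                sum(rng.triangular(*profit_tris[i]) for i in subset) >= target
                for _ in range(n_sim))
            if hits / n_sim > best_chance:
                best, best_chance = subset, hits / n_sim
    return best, best_chance
```

For realistic problem sizes the enumeration is replaced by a heuristic search, which is the role the genetic algorithm plays in the paper.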
Budget constraint and vaccine dosing: A mathematical modelling exercise
Standaert, Baudouin A.; Curran, Desmond; Postma, Maarten J.
2014-01-01
Background: Increasing the number of vaccine doses may potentially improve overall efficacy. Decision-makers need information about choosing the most efficient dose schedule to maximise the total health gain of a population when operating under a constrained budget. The objective of this study is to inform that choice through a mathematical modelling exercise.
Lectures on algebraic model theory
Hart, Bradd
2001-01-01
In recent years, model theory has had remarkable success in solving important problems as well as in shedding new light on our understanding of them. The three lectures collected here present recent developments in three such areas: Anand Pillay on differential fields, Patrick Speissegger on o-minimality and Matthias Clasen and Matthew Valeriote on tame congruence theory.
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integration.
Directory of Open Access Journals (Sweden)
Coralie Picoche
The blue mussel, Mytilus edulis, is a commercially important species, with production based on both fisheries and aquaculture. Dynamic Energy Budget (DEB) models have been extensively applied to study its energetics, but such applications require a deep understanding of its nutrition, from filtration to assimilation. Being filter feeders, mussels show multiple responses to temporal fluctuations in their food and environment, raising questions that can be investigated by modeling. To provide better insight into mussel-environment interactions, an experiment was conducted in one of the main French growing zones (Utah Beach, Normandy). Mussel growth was monitored monthly for 18 months, with a large number of environmental descriptors measured in parallel. Food proxies such as chlorophyll a, particulate organic carbon and phytoplankton were also sampled, in addition to non-nutritious particles. High-frequency physical data recording (e.g., water temperature, immersion duration) completed the habitat description. Measurements revealed an increase in dry flesh mass during the first year, followed by a high mass loss, which could not be completely explained by the DEB model using raw external signals. We propose two methods that reconstruct food from shell length and dry flesh mass variations. The former depends on the inversion of the growth equation while the latter is based on iterative simulations. Assemblages of food proxies are then related to reconstructed food input, with a special focus on plankton species. A characteristic contribution is attributed to these sources to estimate nutritional values for mussels. M. edulis shows no preference between most plankton life history traits. Selection is based on the size of the ingested particles, which is modified by the volume and social behavior of plankton species. This finding reveals the importance of diet diversity and of both passive and active selection, and confirms the need to adjust DEB models to
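The first reconstruction method, inversion of the growth equation, can be sketched as follows (illustrative parameters, not the study's estimates): at constant conditions DEB predicts von Bertalanffy growth dL/dt = r_B (f·L_max − L), so successive length observations can be solved for the scaled functional response f, a proxy for food.

```python
# Invert von Bertalanffy growth for food: given lengths observed dt apart,
# the implied scaled functional response is f = (dL/dt / r_B + L) / L_max.
# r_B is the von Bertalanffy growth rate (1/d), L_max the maximum length.

def reconstruct_food(lengths, dt, r_B, L_max):
    """Return the f implied by each successive pair of length observations."""
    fs = []
    for L0, L1 in zip(lengths, lengths[1:]):
        dL_dt = (L1 - L0) / dt
        fs.append((dL_dt / r_B + L0) / L_max)
    return fs
```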
Warped models in string theory
International Nuclear Information System (INIS)
Acharya, B.S.; Benini, F.; Valandro, R.
2006-12-01
Warped models, originating with the ideas of Randall and Sundrum, provide a fascinating extension of the standard model with interesting consequences for the LHC. We investigate in detail how string theory realises such models, with emphasis on fermion localisation and the computation of Yukawa couplings. We find, in contrast to the 5d models, that fermions can be localised anywhere in the extra dimension, and that there are new mechanisms to generate exponential hierarchies amongst the Yukawa couplings. We also suggest a way to distinguish these string theory models with data from the LHC. (author)
Model Theory in Algebra, Analysis and Arithmetic
Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J
2014-01-01
Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.
Outcomes analysis of hospital management model in restricted budget conditions
Directory of Open Access Journals (Sweden)
Virsavia Vaseva
2016-03-01
Facing conditions of market economy and financial crisis, the head of any healthcare facility has to take adequate decisions about the cost-effective functioning of the hospital. Along with cost reduction, the main problem is how to maintain a high level of health services. The aim of our study was to analyse the quality of healthcare services after the implementation of control over expenses due to a reduction in the budgetary resources in the Military Medical Academy (MMA), Sofia, Bulgaria. Data from the hospital information system and the Financial Department about income and expenditure for patient treatment were used. We conducted a retrospective study on the main components of clinical indicators in 2013 to reveal the main problems in the hospital management. In 2014, control was imposed on the use of the most expensive medicines and consumables. A comparative analysis was made of the results of the medical services in MMA for 2013 and 2014. Our results showed that despite the limited budget in MMA over the last year, the policy of control over operational costs succeeded in maintaining the quality of healthcare services. While reducing the expenses for medicines, consumables and laboratory investigations by ∼26%, some quality criteria for healthcare services were observed to improve by ∼9%. Financial crisis and budget reduction urge healthcare economists to create adequate economic instruments to assist the normal functioning of hospital facilities. Our analysis showed that when the right policy is chosen, better results may be achieved with fewer resources.
A Methodological Review of US Budget-Impact Models for New Drugs.
Mauskopf, Josephine; Earnshaw, Stephanie
2016-11-01
A budget-impact analysis is required by many jurisdictions when adding a new drug to the formulary. However, previous reviews have indicated that adherence to methodological guidelines is variable. In this methodological review, we assess the extent to which US budget-impact analyses for new drugs use recommended practices. We describe recommended practice for seven key elements in the design of a budget-impact analysis. Targeted literature searches for US studies reporting estimates of the budget impact of a new drug were performed and we prepared a summary of how each study addressed the seven key elements. The primary finding from this review is that recommended practice is not followed in many budget-impact analyses. For example, we found that growth in the treated population size and/or changes in disease-related costs expected during the model time horizon for more effective treatments was not included in several analyses for chronic conditions. In addition, not all drug-related costs were captured in the majority of the models. Finally, for most studies, one-way sensitivity and scenario analyses were very limited, and the ranges used in one-way sensitivity analyses were frequently arbitrary percentages rather than being data driven. The conclusions from our review are that changes in population size, disease severity mix, and/or disease-related costs should be properly accounted for to avoid over- or underestimating the budget impact. Since each budget holder might have different perspectives and different values for many of the input parameters, it is also critical for published budget-impact analyses to include extensive sensitivity and scenario analyses based on realistic input values.
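The review's criticism of arbitrary sensitivity ranges can be made concrete with a minimal sketch. The model structure, parameter names, and all values below are hypothetical, not taken from any reviewed analysis; the point is only that one-way ranges should come from data (e.g., confidence intervals) rather than flat plus-or-minus percentages.

```python
# Hypothetical one-way sensitivity analysis for a simple budget-impact model.
# Every parameter and range here is illustrative, not from any cited study.

def budget_impact(pop_size, treated_share, uptake, new_cost, old_cost):
    """Annual budget impact: patients switching to the new drug times the
    incremental annual drug cost."""
    switchers = pop_size * treated_share * uptake
    return switchers * (new_cost - old_cost)

base = dict(pop_size=100_000, treated_share=0.40, uptake=0.25,
            new_cost=12_000.0, old_cost=9_000.0)

# Data-driven ranges (e.g., 95% CIs), not arbitrary +/- percentages.
ranges = {
    "pop_size": (90_000, 110_000),
    "treated_share": (0.35, 0.45),
    "uptake": (0.15, 0.35),
    "new_cost": (11_000.0, 13_000.0),
}

def one_way(param, lo, hi):
    """Recompute the impact with one parameter at its low and high bound."""
    low = budget_impact(**{**base, param: lo})
    high = budget_impact(**{**base, param: hi})
    return low, high

for p, (lo, hi) in ranges.items():
    low, high = one_way(p, lo, hi)
    print(f"{p}: {low:,.0f} .. {high:,.0f}")
```

Sorting parameters by the width of their resulting intervals yields the familiar tornado diagram ordering.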
Luther, M. R.
1981-01-01
The Earth Radiation Budget Experiment (ERBE) is to fly on NASA's Earth Radiation Budget Satellite (ERBS) and on NOAA F and NOAA G. Large spatial scale earth energy budget data will be derived primarily from measurements made by the ERBE nonscanning instrument (ERBE-NS). A description is given of a mathematical model capable of simulating the radiometric response of any of the ERBE-NS earth viewing channels. The model uses a Monte Carlo method to accurately account for directional distributions of emission and reflection from optical surfaces which are neither strictly diffuse nor strictly specular. The model computes radiation exchange factors among optical system components, and determines the distribution in the optical system of energy from an outside source. Attention is also given to an approach for implementing the model and results obtained from the implementation.
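The Monte Carlo treatment of directional emission described above can be illustrated with a toy exchange-factor calculation. This is not the ERBE-NS instrument model; the geometry (a Lambertian point emitter and a coaxial disk) and the sampling scheme are assumptions chosen so the analytic answer, r²/(r² + h²), is available as a check.

```python
import math
import random

def mc_view_factor(radius, height, n=200_000, seed=1):
    """Monte Carlo estimate of the fraction of diffuse (Lambertian) emission
    from a point that reaches a coaxial disk of given radius at given height.
    Analytic value: r^2 / (r^2 + h^2)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Cosine-weighted hemisphere sampling: sin^2(theta) is uniform on [0, 1).
        sin2 = rng.random()
        tan_theta = math.sqrt(sin2 / (1.0 - sin2))
        # The ray hits the disk if its radial offset at z = height is < radius.
        if height * tan_theta < radius:
            hits += 1
    return hits / n

est = mc_view_factor(1.0, 1.0)
print(f"estimated {est:.3f}, analytic {0.5:.3f}")
```

The same hit-counting idea extends to surfaces that are neither strictly diffuse nor strictly specular by sampling the appropriate directional distribution instead of the cosine law.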
Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we conducted growth and bioaccumulation studies that contribute t...
Minisuperspace models in histories theory
International Nuclear Information System (INIS)
Anastopoulos, Charis; Savvidou, Ntina
2005-01-01
We study the Robertson-Walker minisuperspace model in histories theory, motivated by the results that emerged from the histories approach to general relativity. We examine, in particular, the issue of time reparametrization in such systems. The model is quantized using an adaptation of reduced state space quantization. We finally discuss the classical limit, the implementation of initial cosmological conditions and the estimation of probabilities in the histories context.
System dynamics model development for R and D budget allocation strategy
International Nuclear Information System (INIS)
Kwack, S. M.; Kim, D. H.; Lee, Y. S.; Jung, M. T.
2003-01-01
A computer simulation model was developed for R and D budget allocation problems, which usually involve very complex and non-linear social issues. The System Dynamics approach was employed, as it is well suited to modelling complex, non-linear social problems. As an application example, the allocation of budget to each step in five research areas was analyzed. The base scenario, which allocates a large portion of the budget to the demonstration step, was found to have a weakness in the long term. To overcome this weakness, some better alternatives were recommended through the analysis. In addition, this paper suggests ways to utilize the updated model in the future.
Memmesheimer, M.; Ebel, A.; Roemer, M.
1997-01-01
Results from two air quality models (LOTOS, EURAD) have been used to analyse the contribution of the different terms in the continuity equation to the budget of ozone, NO(x) and PAN. Both models cover large parts of Europe and describe the processes relevant for tropospheric chemistry and dynamics.
Directory of Open Access Journals (Sweden)
Edo Cvrkalj
2015-12-01
Traditional budgeting principles, with strictly defined business goals, have since 1998 been slowly growing into more sophisticated and organization-adjusted alternative budgeting concepts. One of those alternative concepts is the "Beyond budgeting" model with an implemented performance effects measuring process. In order for the model to be practicable, budget planning and control have to be reoriented to the "bottom up" planning and control approach. In today's modern business surroundings one has to take both present and future opportunities and threats into consideration, by valorizing them in a budget which would allow a company to realize a whole palette of advantages over the traditional budgeting principles, which are presented later in the article. It is essential to emphasize the importance of successfully implementing the new budgeting principles within an organization. If the implementation is lacking and done without a higher goal in mind, it is easily possible that the process will be carried out without a coordination, planning and control framework within the organization itself. Further in the article we present an overview of managerial techniques and instruments within the "Beyond budgeting" model, such as the balanced scorecard, rolling forecast, dashboard, KPIs and other supporting instruments. Lastly, we define seven steps for implementing the "Beyond budgeting" model and offer a comparison of the "Beyond budgeting" model against traditional budgeting principles, listing twelve reasons why "Beyond budgeting" is better suited to modern and market-oriented organizations. Each company faces these challenges in its own characteristic way, but implementing new dynamic planning models will soon become essential for surviving in the market.
Developing an Earth system Inverse model for the Earth's energy and water budgets.
Haines, K.; Thomas, C.; Liu, C.; Allan, R. P.; Carneiro, D. M.
2017-12-01
The CONCEPT-Heat project aims at developing a consistent energy budget for the Earth system in order to better understand and quantify global change. We advocate a variational "Earth system inverse" solution as the best methodology to bring the necessary expertise from different disciplines together. L'Ecuyer et al (2015) and Rodell et al (2015) first used a variational approach to adjust multiple satellite data products for air-sea-land vertical fluxes of heat and freshwater, achieving closed budgets on a regional and global scale. However their treatment of horizontal energy and water redistribution and its uncertainties was limited. Following the recent work of Liu et al (2015, 2017) which used atmospheric reanalysis convergences to derive a new total surface heat flux product from top of atmosphere fluxes, we have revisited the variational budget approach introducing a more extensive analysis of the role of horizontal transports of heat and freshwater, using multiple atmospheric and ocean reanalysis products. We find considerable improvements in fluxes in regions such as the North Atlantic and Arctic, for example requiring higher atmospheric heat and water convergences over the Arctic than given by ERA-Interim, thereby allowing lower and more realistic oceanic transports. We explore using the variational uncertainty analysis to produce lower resolution corrections to higher resolution flux products and test these against in situ flux data. We also explore the covariance errors implied between component fluxes that are imposed by the regional budget constraints. Finally we propose this as a valuable methodology for developing consistent observational constraints on the energy and water budgets in climate models. We take a first look at the same regional budget quantities in CMIP5 models and consider the implications of the differences for the processes and biases active in the models. Many further avenues of investigation are possible focused on better valuing
Foundations of compositional model theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim
2011-01-01
Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf
Donnelly, Gloria
2005-01-01
In the allocation of resources in academic settings, hierarchies of tradition and status often supersede documented need. Nursing programs sometimes have difficulty in getting what they need to maintain quality programs and to grow. The budget is the crucial tool in documenting nursing program needs and its contributions to the entire academic enterprise. Most nursing programs administrators see only an operating expense budget that may grow or shrink by a rubric that may not fit the reality of the situation. A budget is a quantitative expression of how well a unit is managed. Educational administrators should be paying as much attention to analyzing financial outcomes as they do curricular outcomes. This article describes the development of a model for tracking revenue and expense and a simple rubric for analyzing the relationship between the two. It also discusses how to use financial data to improve the fiscal performance of nursing units and to leverage support during times of growth.
Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001
L. S. Heath; R. A. Birdsey; D. W. Williams
2002-01-01
The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...
Dropouts and Budgets: A Test of a Dropout Reduction Model among Students in Israeli Higher Education
Bar-Am, Ran; Arar, Osama
2017-01-01
This article deals with the problem of student dropout during the first year in a higher education institution. To date, no model on a budget has been developed and tested to prevent dropout among Engineering Students. This case study was conducted among first-year students taking evening classes in two practical engineering colleges in Israel.…
Superfield theory and supermatrix model
International Nuclear Information System (INIS)
Park, Jeong-Hyuck
2003-01-01
We study the noncommutative superspace of arbitrary dimensions in a systematic way. Superfield theories on a noncommutative superspace can be formulated in two ways: through the star product formalism and in terms of supermatrices. We elaborate the duality between them by constructing the isomorphism explicitly and relating the superspace integrations of the star product lagrangian or the superpotential to the traces of the supermatrices. We show there exists an interesting fine-tuned commutative limit where the duality can still be maintained. Namely, on the commutative superspace too, there exists a supermatrix model description for the superfield theory. We interpret the result in the context of the wave-particle duality. The dual particles for the superfields in even and odd spacetime dimensions are D-instantons and D0-branes respectively, consistent with the T-duality. (author)
Therapeutic budget modelling: a possible road to budgetary ...
African Journals Online (AJOL)
... modelling: a possible road to budgetary allocations in the public health care ... on the public health care sector (especially the primary health care structure) for ... budgetary policies for better medicine formulary and resource management.
Demonstration of the gypsy moth energy budget microclimate model
D. E. Anderson; D. R. Miller; W. E. Wallner
1991-01-01
The use of a user-friendly version of the "GMMICRO" model to quantify the local environment and resulting core temperature of gypsy moth (GM) larvae under different conditions of canopy defoliation, different forest sites, and different weather conditions was demonstrated.
Fournier, N.; Tang, Y.S.; Dragosits, U.; Kluizenaar, Y.de; Sutton, M.A.
2005-01-01
Atmospheric budgets of reduced nitrogen for the major political regions of the British Isles are investigated with a multi-layer atmospheric transport model. The model is validated against measurements of NH3 concentration and is developed to provide atmospheric budgets for defined subdomains of the
Implementing Strategy in a Budget: A Model of the Coast Guard Reserve
Bromund, Carl Douglas
1990-01-01
Approved for public release; distribution is unlimited. This thesis discusses the management strategy of the Coast Guard Reserve; it examines the formulation and implementation of strategy. A model to develop and implement strategy is proposed, which defines the role of the budget in this strategic management process. The recent strategy of the Coast Guard Reserve is analyzed using this model. This analysis seems to indicate that the Coast Guard Reserve had no explicit strate...
Models in cooperative game theory
Branzei, Rodica; Tijs, Stef
2008-01-01
This book investigates models in cooperative game theory in which the players have the possibility to cooperate partially. In a crisp game the agents are either fully involved or not involved at all in cooperation with some other agents, while in a fuzzy game players are allowed to cooperate with infinitely many different participation levels, varying from non-cooperation to full cooperation. A multi-choice game describes the intermediate case in which each player may have a fixed number of activity levels. Different set and one-point solution concepts for these games are presented. The properties of these solution concepts and their interrelations on several classes of crisp, fuzzy, and multi-choice games are studied. Applications of the investigated models to many economic situations are indicated as well. The second edition is highly enlarged and contains new results and additional sections in the different chapters as well as one new chapter.
Chance-constrained programming models for capital budgeting with NPV as fuzzy parameters
Huang, Xiaoxia
2007-01-01
In an uncertain economic environment, experts' knowledge about the outlays and cash inflows of available projects involves much vagueness rather than randomness. Investment outlays and annual net cash flows of a project are usually predicted by using experts' knowledge. Fuzzy variables can overcome the difficulties in predicting these parameters. In this paper, the capital budgeting problem with fuzzy investment outlays and fuzzy annual net cash flows is studied based on credibility measure. The net present value (NPV) method is employed, and two fuzzy chance-constrained programming models for the capital budgeting problem are provided. A fuzzy simulation-based genetic algorithm is provided for solving the proposed models. Two numerical examples are also presented to illustrate the modelling idea and the effectiveness of the proposed algorithm.
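A minimal sketch of the credibility-based NPV idea (not the paper's genetic algorithm): when outlays and cash flows are triangular fuzzy numbers and discount factors are positive, the fuzzy NPV stays triangular, so the credibility Cr{NPV ≥ r} has a closed form and no fuzzy simulation is needed for this special case. All numbers below are illustrative.

```python
def tri_npv(outlay, flows, rate):
    """NPV of a project with a triangular fuzzy outlay and cash flows.
    Each fuzzy quantity is a (low, mode, high) triple; positive discount
    factors preserve triangularity under addition and scaling."""
    a = -outlay[2]          # negating reverses the order of the bounds
    b = -outlay[1]
    c = -outlay[0]
    for t, (lo, m, hi) in enumerate(flows, start=1):
        d = (1.0 + rate) ** t
        a += lo / d
        b += m / d
        c += hi / d
    return a, b, c

def credibility_ge(tri, r):
    """Cr{NPV >= r} for a triangular fuzzy number (a, b, c):
    the average of possibility and necessity, as in credibility theory."""
    a, b, c = tri
    if r <= a:
        return 1.0
    if r <= b:
        return (2 * b - a - r) / (2 * (b - a))
    if r <= c:
        return (c - r) / (2 * (c - b))
    return 0.0

npv = tri_npv(outlay=(90.0, 100.0, 110.0),
              flows=[(30.0, 40.0, 50.0)] * 4, rate=0.10)
print(npv, credibility_ge(npv, 0.0))
```

A chance constraint such as Cr{NPV ≥ 0} ≥ 0.8 can then be checked directly for each candidate project portfolio; the genetic algorithm in the paper is needed for the general case where the fuzzy quantities are not all triangular.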
Field theory and the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Dudas, E [Orsay, LPT (France)
2014-07-01
This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.
Lattice models and conformal field theories
International Nuclear Information System (INIS)
Saleur, H.
1988-01-01
Theoretical studies concerning the connection between critical physical systems and conformal theories are reviewed. The conformal theory associated to a critical (integrable) lattice model is derived. The derivation of the central charge, critical exponents and torus partition function, using renormalization group arguments, is shown. The quantum group structure in the integrable lattice models and the theory of Virasoro algebra representations are discussed. The relations between off-critical integrable models and conformal theories, in finite geometries, are studied.
Water Budget Model for a Remnant of the Historic Northern Everglades
Arceneaux, J. C.; Meselhe, E. A.; Habib, E.; Waldon, M. G.
2006-12-01
The Arthur R. Marshall Loxahatchee National Wildlife Refuge overlays an area termed Water Conservation Area 1 (WCA-1), a 143,000-acre (58,000 ha) freshwater wetland. It is a remnant of the northern Everglades in Palm Beach County, Florida, USA. Sheetflow that naturally would flow across the Refuge wetlands was disrupted in the 1950s and early 1960s by construction of stormwater pumps, and levees with associated borrow canals, which hydraulically isolated the Refuge from its watershed. The U.S. Fish and Wildlife Service (USFWS) concludes that changes in water quantity, timing, and quality have caused negative impacts to the Refuge ecosystem. It is a top priority of the Refuge to ensure appropriate management that will produce maximum benefits for fish and wildlife, while meeting flood control and water supply needs. Models can improve our understanding and support improvement in these management decisions. The development of a water budget for the Loxahatchee Refuge will provide one useful modeling tool in support of Refuge water management decisions. The water budget model reported here was developed as a double-box (two-compartment) model with a daily time step that predicts temporal variations of water level in the Refuge rim canal and interior marsh based on observed inflows, outflows, precipitation, and evapotranspiration. The water budget model was implemented using Microsoft EXCEL. The model calibration period was from January 1, 1995 to December 31, 1999; the validation period extended from January 1, 2000 to December 31, 2004. Statistical analyses demonstrate the utility of this simple water budget model to predict the temporal variation of water levels in both the Refuge marsh and rim canal. The Refuge water budget model is currently being applied to evaluate various water management scenarios for the Refuge. Preliminary results modeling the mass balance of water quality constituents, including chloride and total phosphorus, are encouraging. Success of this
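The double-box idea can be sketched in a few lines. This is a generic two-compartment daily balance, not the Refuge's calibrated EXCEL model: the areas, initial stages, the linear stage-difference exchange rule, and all forcing values are assumptions for illustration.

```python
# Minimal two-box (canal + marsh) daily water budget sketch.
# All parameter values and the exchange rule are illustrative assumptions.

def simulate(days, inflow, outflow, rain, et,
             canal_area=1.0e7, marsh_area=5.0e8, k_exchange=0.2,
             canal0=5.0, marsh0=5.0):
    """Daily stage (m) in two linked compartments. inflow/outflow are
    m^3/day into/out of the canal; rain and et are m/day applied to both
    boxes; exchange moves water toward equal stages at rate k_exchange/day,
    conserving volume between the two compartments."""
    canal, marsh = canal0, marsh0
    levels = []
    for t in range(days):
        ex = k_exchange * (canal - marsh)       # m/day of canal stage, canal -> marsh
        canal += (inflow[t] - outflow[t]) / canal_area + rain[t] - et[t] - ex
        marsh += rain[t] - et[t] + ex * canal_area / marsh_area
        levels.append((canal, marsh))
    return levels

days = 30
levels = simulate(days,
                  inflow=[2.0e6] * days, outflow=[1.0e6] * days,
                  rain=[0.003] * days, et=[0.004] * days)
print(levels[-1])
```

With a net pumped inflow to the canal and evapotranspiration slightly exceeding rainfall, the canal stage rises above the marsh stage and the exchange term then feeds the marsh, the qualitative behavior such a model is built to capture.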
Armour, K.
2017-12-01
Global energy budget observations have been widely used to constrain the effective, or instantaneous climate sensitivity (ICS), producing median estimates around 2°C (Otto et al. 2013; Lewis & Curry 2015). A key question is whether the comprehensive climate models used to project future warming are consistent with these energy budget estimates of ICS. Yet, performing such comparisons has proven challenging. Within models, values of ICS robustly vary over time, as surface temperature patterns evolve with transient warming, and are generally smaller than the values of equilibrium climate sensitivity (ECS). Naively comparing values of ECS in CMIP5 models (median of about 3.4°C) to observation-based values of ICS has led to the suggestion that models are overly sensitive. This apparent discrepancy can partially be resolved by (i) comparing observation-based values of ICS to model values of ICS relevant for historical warming (Armour 2017; Proistosescu & Huybers 2017); (ii) taking into account the "efficacies" of non-CO2 radiative forcing agents (Marvel et al. 2015); and (iii) accounting for the sparseness of historical temperature observations and differences in sea-surface temperature and near-surface air temperature over the oceans (Richardson et al. 2016). Another potential source of discrepancy is a mismatch between observed and simulated surface temperature patterns over recent decades, due to either natural variability or model deficiencies in simulating historical warming patterns. The nature of the mismatch is such that simulated patterns can lead to more positive radiative feedbacks (higher ICS) relative to those engendered by observed patterns. The magnitude of this effect has not yet been addressed. Here we outline an approach to perform fully commensurate comparisons of climate models with global energy budget observations that take all of the above effects into account. We find that when apples-to-apples comparisons are made, values of ICS in models are
Halo modelling in chameleon theories
Energy Technology Data Exchange (ETDEWEB)
Lombriser, Lucas; Koyama, Kazuya [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, PO1 3FX (United Kingdom); Li, Baojiu, E-mail: lucas.lombriser@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: baojiu.li@durham.ac.uk [Institute for Computational Cosmology, Ogden Centre for Fundamental Physics, Department of Physics, University of Durham, Science Laboratories, South Road, Durham, DH1 3LE (United Kingdom)
2014-03-01
We analyse modelling techniques for the large-scale structure formed in scalar-tensor theories of constant Brans-Dicke parameter which match the concordance model background expansion history and produce a chameleon suppression of the gravitational modification in high-density regions. Thereby, we use a mass and environment dependent chameleon spherical collapse model, the Sheth-Tormen halo mass function and linear halo bias, the Navarro-Frenk-White halo density profile, and the halo model. Furthermore, using the spherical collapse model, we extrapolate a chameleon mass-concentration scaling relation from a ΛCDM prescription calibrated to N-body simulations. We also provide constraints on the model parameters to ensure viability on local scales. We test our description of the halo mass function and nonlinear matter power spectrum against the respective observables extracted from large-volume and high-resolution N-body simulations in the limiting case of f(R) gravity, corresponding to a vanishing Brans-Dicke parameter. We find good agreement between the two; the halo model provides a good qualitative description of the shape of the relative enhancement of the f(R) matter power spectrum with respect to ΛCDM caused by the extra attractive gravitational force but fails to recover the correct amplitude. Introducing an effective linear power spectrum in the computation of the two-halo term to account for an underestimation of the chameleon suppression at intermediate scales in our approach, we accurately reproduce the measurements from the N-body simulations.
Stochastic models: theory and simulation.
Energy Technology Data Exchange (ETDEWEB)
Field, Richard V., Jr.
2008-03-01
Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
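As a self-contained illustration of the report's second objective, generating independent samples of a stochastic model, the sketch below uses the classic spectral (sum-of-cosines) method to sample a stationary Gaussian process. The exponential covariance, the spectral truncation, and the mode count are illustrative choices, not the report's algorithms.

```python
import math
import random

def gaussian_process_sample(n, dt, corr_length, sigma, n_modes=64, seed=0):
    """Sample path of a zero-mean stationary Gaussian process with
    exponential covariance C(tau) = sigma^2 * exp(-|tau|/corr_length),
    built by superposing cosines with random phases (spectral method)."""
    rng = random.Random(seed)
    w_max = 8.0 / corr_length          # truncation of the spectrum
    dw = w_max / n_modes
    # Two-sided spectral density of the exponential covariance:
    def S(w):
        return (sigma**2 * corr_length / math.pi) / (1.0 + (w * corr_length)**2)
    amps = [math.sqrt(2.0 * S((k + 0.5) * dw) * dw) for k in range(n_modes)]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_modes)]
    path = []
    for i in range(n):
        t = i * dt
        x = sum(math.sqrt(2.0) * amps[k] *
                math.cos((k + 0.5) * dw * t + phases[k])
                for k in range(n_modes))
        path.append(x)
    return path

path = gaussian_process_sample(n=500, dt=0.05, corr_length=1.0, sigma=1.0)
```

Each call with a fresh seed yields an independent realization that can be fed as input or boundary data to a deterministic simulation code; the truncated spectrum slightly underestimates the target variance, which tightening w_max and n_modes reduces.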
Health and budget impact of combined HIV prevention - first results of the BELHIVPREV model.
Vermeersch, Sebastian; Callens, Steven; De Wit, Stéphane; Goffard, Jean-Christophe; Laga, Marie; Van Beckhoven, Dominique; Annemans, Lieven
2018-02-01
We developed a pragmatic modelling approach to estimate the impact of treatment as prevention (TasP); outreach testing strategies; and pre-exposure prophylaxis (PrEP) on the epidemiology of HIV and its associated pharmaceutical expenses. Our model estimates the incremental health (in terms of new HIV diagnoses) and budget impact of two prevention scenarios (outreach+TasP and outreach+TasP+PrEP) against a 'no additional prevention' scenario. Model parameters were estimated from reported Belgian epidemiology and literature data. The analysis was performed from a healthcare payer perspective with a 15-year time horizon. It considers subpopulation differences, HIV infections diagnosed in Belgium having occurred prior to migration, and the effects of an ageing HIV population. Without additional prevention measures, the annual number of new HIV diagnoses rises to over 1350 new diagnoses in 2030 as compared to baseline, resulting in a budget expenditure of €260.5 million. Implementation of outreach+TasP and outreach+TasP+PrEP results in a decrease in the number of new HIV diagnoses to 865 and 663 per year, respectively. Respective budget impacts decrease by €20.6 million and €33.7 million. Foregoing additional investments in prevention is not an option. An approach combining TasP, outreach and PrEP is most effective in reducing the number of new HIV diagnoses and the HIV treatment budget. Our model is the first pragmatic HIV model in Belgium estimating the consequences of a combined preventive approach on the HIV epidemiology and its economic burden, assuming other prevention efforts such as condom use and harm reduction strategies remain the same.
A new modified resource budget model for nonlinear dynamics in citrus production
International Nuclear Information System (INIS)
Ye, Xujun; Sakai, Kenshi
2016-01-01
Highlights: • A theoretical modeling and simulation study of the nonlinear dynamics in citrus is conducted. • New leaf growth is incorporated into the model as a major factor responsible for the yield oscillations. • A Ricker-type equation for the relationship between costs for flowering and fruiting is proposed. • A generic form of the resource budget model for the nonlinear dynamics in citrus is obtained. • The new model is tested with experimental data for two citrus trees. - Abstract: Alternate bearing or masting is a general yield variability phenomenon in perennial tree crops. This paper first presents a theoretical modeling and simulation study of the mechanism behind these dynamics in citrus, and then provides a test of the proposed models using data from a previous 16-year experiment in a citrus orchard. Our previous studies suggest that the mutual effects between vegetative and reproductive growth caused by resource allocation and budgeting in the plant body might be considered a major factor responsible for the yield oscillations in citrus. Based on the resource budget model proposed by Isagi et al. (J Theor Biol. 1997;187:231-9), we first introduce new leaf growth as a major energy consumption component into the model. Further, we introduce a nonlinear Ricker-type equation to replace the linear relationship between costs for flowering and fruiting used in Isagi's model. Model simulations demonstrate that the proposed new models can successfully simulate the reproductive behaviors of citrus trees with different fruiting dynamics. These results may enrich the mechanistic dynamics in tree crop reproductive models and help us to better understand the dynamics of vegetative-reproductive growth interactions in a real environment.
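To make the modelling idea concrete, here is a minimal sketch of an Isagi-type resource budget iteration in which the linear fruiting cost is replaced by an assumed Ricker-type form ca = rc·cf·exp(−cf/k). The exact equation used in the paper may differ, and all parameter values are illustrative; the sketch only demonstrates how such a coupling produces irregular yield oscillations.

```python
import math

def resource_budget(years, ps=1.0, lt=5.0, rc=2.0, k=3.0, s0=4.0):
    """Isagi-type resource budget: the tree stores photosynthate ps each year;
    when reserves exceed the threshold lt, the excess cf goes to flowering and
    a fruiting cost ca is subtracted. Here ca uses an assumed Ricker-type form
    ca = rc * cf * exp(-cf / k) in place of Isagi's linear ca = rc * cf.
    Returns the yearly flowering expenditure cf (a proxy for yield)."""
    s = s0
    crops = []
    for _ in range(years):
        s += ps
        if s > lt:
            cf = s - lt
            ca = rc * cf * math.exp(-cf / k)   # assumed Ricker-type coupling
            s = lt - ca
            crops.append(cf)
        else:
            crops.append(0.0)
    return crops

crops = resource_budget(60)
print(crops[-10:])
```

With these parameters the flowering-year map has an unstable fixed point, so the simulated crop alternates between zero and positive years instead of settling down, the alternate-bearing behavior the model is meant to capture.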
Quiver gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Yagi, Junya
2015-01-01
We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry the structures of TQFTs equipped with line operators, and as a consequence are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
Energy Technology Data Exchange (ETDEWEB)
Martens, Craig C., E-mail: cmartens@uci.edu
2016-12-20
In this paper, we revisit the semiclassical Liouville approach to describing molecular dynamics with electronic transitions using classical trajectories. Key features of the formalism are highlighted. The locality in phase space and presence of nonclassical terms in the generalized Liouville equations are emphasized and discussed in light of trajectory surface hopping methodology. The representation dependence of the coupled semiclassical Liouville equations in the diabatic and adiabatic bases is discussed, and new results for the transformation theory of the Wigner functions representing the corresponding density matrix elements are given. We show that the diagonal energies of the state populations are not conserved during electronic transitions, as energy is stored in the electronic coherence. We discuss the implications of this observation for the validity of imposing strict energy conservation in trajectory-based methods for simulating nonadiabatic processes.
The Ozone Budget in the Upper Troposphere from Global Modeling Initiative (GMI) Simulations
Rodriquez, J.; Duncan, Bryan N.; Logan, Jennifer A.
2006-01-01
Ozone concentrations in the upper troposphere are influenced by in-situ production, long-range tropospheric transport, and influx of stratospheric ozone, as well as by photochemical removal. Since ozone is an important greenhouse gas in this region, it is particularly important to understand how it will respond to changes in anthropogenic emissions and changes in stratospheric ozone fluxes. This response will be determined by the relative balance of the different production, loss and transport processes. Ozone concentrations calculated by models will differ depending on the adopted meteorological fields, their chemical scheme, anthropogenic emissions, and treatment of the stratospheric influx. We performed simulations using the chemical-transport model from the Global Modeling Initiative (GMI) with meteorological fields from (1) the NASA Goddard Institute for Space Studies (GISS) general circulation model (GCM), (2) the atmospheric GCM from NASA's Global Modeling and Assimilation Office (GMAO), and (3) assimilated winds from GMAO. These simulations adopt the same chemical mechanism and emissions, and adopt the Synthetic Ozone (SYNOZ) approach for treating the influx of stratospheric ozone. In addition, we also performed simulations for a coupled troposphere-stratosphere model with a subset of the same winds. Simulations were done at both 4° × 5° and 2° × 2.5° resolution. Model results are being tested through comparison with a suite of atmospheric observations. In this presentation, we diagnose the ozone budget in the upper troposphere utilizing the suite of GMI simulations, to address the sensitivity of this budget to: (a) the different meteorological fields used; (b) the adoption of the SYNOZ boundary condition versus inclusion of a full stratosphere; (c) model horizontal resolution. Model results are compared to observations to determine biases in particular simulations; by examining these comparisons in conjunction with the derived budgets, we may pinpoint
New Pathways between Group Theory and Model Theory
Fuchs, László; Goldsmith, Brendan; Strüngmann, Lutz
2017-01-01
This volume focuses on group theory and model theory with a particular emphasis on the interplay of the two areas. The survey papers provide an overview of the developments across group, module, and model theory while the research papers present the most recent study in those same areas. With introductory sections that make the topics easily accessible to students, the papers in this volume will appeal to beginning graduate students and experienced researchers alike. As a whole, this book offers a cross-section view of the areas in group, module, and model theory, covering topics such as DP-minimal groups, Abelian groups, countable 1-transitive trees, and module approximations. The papers in this book are the proceedings of the conference “New Pathways between Group Theory and Model Theory,” which took place February 1-4, 2016, in Mülheim an der Ruhr, Germany, in honor of the editors’ colleague Rüdiger Göbel. This publication is dedicated to Professor Göbel, who passed away in 2014. He was one of th...
Galaxy Alignments: Theory, Modelling & Simulations
Kiessling, Alina; Cacciato, Marcello; Joachimi, Benjamin; Kirk, Donnacha; Kitching, Thomas D.; Leonard, Adrienne; Mandelbaum, Rachel; Schäfer, Björn Malte; Sifón, Cristóbal; Brown, Michael L.; Rassat, Anais
2015-11-01
The shapes of galaxies are not randomly oriented on the sky. During the galaxy formation and evolution process, environment has a strong influence, as tidal gravitational fields in the large-scale structure tend to align nearby galaxies. Additionally, events such as galaxy mergers affect the relative alignments of both the shapes and angular momenta of galaxies throughout their history. These "intrinsic galaxy alignments" are known to exist, but are still poorly understood. This review will offer a pedagogical introduction to the current theories that describe intrinsic galaxy alignments, including the apparent difference in intrinsic alignment between early- and late-type galaxies and the latest efforts to model them analytically. It will then describe the ongoing efforts to simulate intrinsic alignments using both N-body and hydrodynamic simulations. Due to the relative youth of this field, there is still much to be done to understand intrinsic galaxy alignments and this review summarises the current state of the field, providing a solid basis for future work.
Directory of Open Access Journals (Sweden)
I Putu Yoga Bumi Pradana
2015-02-01
Full Text Available This study aims to present a reconciliation model of the bureaucratic principle of secrecy and the democratic principle of transparency, through a mapping of which public information about local government budget management is accessible to the public and which is excluded (secret), based on the perceptions of the bureaucracy and the public. This study uses a mixed method with a sequential exploratory design; data were collected through surveys, in-depth interviews, and documents, and validated using source triangulation. The subjects of this study were divided into two groups of informants, the government bureaucracy and the public of Kupang, determined purposively. The results of this research show that the Kupang government bureaucracy perceives 22 types of information (33.85%) as open and 42 types of information (64.62%) as closed, while the public perceives 29 types of information (44.62%) as open and 26 types of information (40%) as closed. Therefore, to achieve reconciliation and end the conflict between the bureaucracy and the public, the amount of open budget-management information comes to 32 types (49.2%) and the amount of closed information to 33 types (50.8%) of the 65 types of budget-management information under Regulation No. 13 of 2006 on Local Financial Management.
Applications of model theory to functional analysis
Iovino, Jose
2014-01-01
During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the
National Research Council Canada - National Science Library
2000-01-01
This volume-part of the Congressional Budget Office's (CBO's) annual report to the House and Senate Committees on the Budget-is intended to help inform policymakers about options for the federal budget...
Alunno-Bruscia, M.; v.d. Veer, H.; Kooijman, S.A.L.M.
2011-01-01
This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this 5-year project,
Jochheim, H.; Puhlmann, M.; Beese, F.; Berthold, D.; Einert, P.; Kallweit, R.; Konopatzky, A.; Meesenburg, H.; Meiwes, K.-J.; Raspe, S.; Schulte-Bisping, H.; Schulz, C.
2008-01-01
It is shown that by calibrating the simulation model BIOME-BGC with mandatory and optional Level II data within the ICP Forests programme, a well-founded calculation of the carbon budget of forest stands is achievable and that, based on successful calibration, the modified BIOME-BGC model is a useful tool to assess the effect of climate change on forest ecosystems.
Directory of Open Access Journals (Sweden)
Davor Čutić
2010-07-01
Full Text Available The system of budget planning, programming, development and execution of the Ministry of Defence of the Republic of Croatia (henceforth: the Croatian acronym SPPIIP is the basic system for the strategic management of defence resources through which an effective and rational distribution of available resources is conducted, based on the goals of national security of the Republic of Croatia. This system sets the principles of transparency and democratic management of defence resources while respecting the specificities of the defence system. The SPPIIP allows for decision making based on complete information about alternatives and the choice of the most economical and most efficient way to reach the goal. It unites the strategic plan, program and budget. It consists of four continuous, independent and interconnected phases: planning, programming, development and the execution of the budget. The processes of the phases are dynamic and cyclic. In addition to the SPPIIP, the Defence Resources Management Model (DRMM, Croatian acronym: MURO has also been developed. This is an analytic tool which serves as a decision support system in the SPPIIP. The DRMM is a complex computer model showing graph and tabular overviews in a multi-year period. The model examines three areas: the strength of the forces, expenses and defence programs. The purpose of the model is cost and strength analysis and the analysis of compromise and feasibility, i.e. how sensitive the programs are to fiscal movements in the sphere of the MoD budget in the course of a multiyear cycle, until a certain project ends. The analysis results are an easily understandable basis for decision making. The SPPIIP and the DRMM are mutually independent systems, but they complement each other well. The SPPIIP uses the DRMM in designing and resource allocation based on the goals set. The quality of the DRMM depends on the amount and quality of data in its database. The DRMM can be used as a basis for
The Nitrous Oxide (N2O) Budget: Constraints from Atmospheric Observations and Models
Tian, H.; Thompson, R.; Canadell, J.; Winiwarter, W.; Tian, H.; Thompson, R.; Prather, M. J.
2017-12-01
The increasing global abundance of N2O poses a threat to human health and society over this coming century through both climate change and ozone depletion. In the sense of greenhouse gases, N2O ranks third behind CO2 and CH4. In the sense of ozone depletion, N2O stands alone. In order to identify the cause of these increases and hopefully reverse them, we need to have a thorough understanding of the sources and sinks (a.k.a. the budget) of N2O and how they can be altered. A bottom-up approach to the budget evaluates individual biogeochemical sources of N2O from the land and ocean; whereas, a top-down approach uses atmospheric observations of the variability, combined with modeling of the atmospheric chemistry and transport, to infer the magnitude of sources and sinks throughout the Earth system. This paper reviews top-down approaches using atmospheric data; a similar top-down approach can be taken with oceanic measurements of N2O, but is not covered here. Stratospheric chemistry is the predominant loss of N2O, and here we review how a merging of new measurements with stratospheric chemistry models is able to provide a constrained budget for the global N2O sink. N2O surface sources are transported and mixed throughout the atmosphere, providing positive anomalies in the N2O abundance (mole fraction of N2O with respect to dry air); while N2O-depleted air from the stratosphere provides negative anomalies. With accurate atmospheric transport models, including for stratosphere-troposphere exchange, the observed tropospheric variability in N2O is effectively a fingerprint of the location and magnitude of sources. This inverse modeling of sources is part of the top-down constraints and is reviewed here.
Rosland, R.; Strand, Ø.; Alunno-Bruscia, M.; Bacher, C.; Strohmeier, T.
2009-08-01
A Dynamic Energy Budget (DEB) model for simulation of growth and bioenergetics of blue mussels (Mytilus edulis) has been tested in three low seston sites in southern Norway. The observations comprise four datasets from laboratory experiments (physiological and biometrical mussel data) and three datasets from in situ growth experiments (biometrical mussel data). Additional in situ data from commercial farms in southern Norway were used for estimation of biometrical relationships in the mussels. Three DEB parameters (shape coefficient, half saturation coefficient, and somatic maintenance rate coefficient) were estimated from experimental data, and the estimated parameters were complemented with parameter values from literature to establish a basic parameter set. Model simulations based on the basic parameter set and site-specific environmental forcing matched fairly well with observations, but the model was not successful in simulating growth at the extremely low seston regimes in the laboratory experiments, in which the long period of negative growth caused negative reproductive mass. Sensitivity analysis indicated that the model was moderately sensitive to changes in the parameters and initial conditions. The results show the robust properties of the DEB model, as it manages to simulate mussel growth in several independent datasets from a common basic parameter set. However, the results also demonstrate limitations of Chl a as a food proxy for blue mussels and limitations of the DEB model in simulating long-term starvation. Future work should aim at establishing better food proxies and improving the model formulations of the processes involved in food ingestion and assimilation. The current DEB model should also be elaborated to allow shrinking of the structural tissue in order to produce more realistic growth simulations during long periods of starvation.
Theuerkauf, E. J.; Rodriguez, A. B.
2017-12-01
The size of backbarrier saltmarsh carbon reservoirs is dictated by transgressive processes, such as erosion and overwash, yet these processes are not included in blue carbon budgets. These carbon reservoirs are presumed to increase through time if marsh elevation is keeping pace with sea-level rise. However, changes in marsh width due to erosion and overwash can alter carbon budgets and reservoirs. To explore the impacts of these processes on transgressive barrier island carbon budgets and reservoirs, we developed and tested a transect model. The model couples a carbon storage term driven by backbarrier marsh width and a carbon export term driven by ocean and backbarrier shoreline erosion. We tested the model using data collected from two transgressive barrier islands in North Carolina with different backbarrier settings. Core Banks is an undeveloped barrier island with a wide backbarrier marsh and lagoon; hence, landward migration of the island (rollover) is unimpeded. Barrier rollover is impeded at Onslow Beach, as there is no backbarrier lagoon and the island is immediately adjacent to steeper mainland topography. Sediment cores were collected to determine carbon storage rates as well as the quantity of carbon exported from eroding marsh. Backbarrier marsh erosion rates, ocean shoreline erosion rates, and changes in marsh width were determined from aerial photographs. Output from the model indicated that hurricane erosion and overwash, as well as human disturbance from the construction of the Intracoastal Waterway, temporarily transitioned the Onslow Beach sites to carbon sources. Through time, the carbon reservoir at this barrier continued to decrease as carbon export outpaced carbon storage. The carbon reservoir will continue to be depleted as the ocean shoreline migrates landward, given the inability of new marsh to form during island rollover. At Core Banks, barrier rollover is unimpeded and new saltmarsh can form during transgression. The Core Banks site only
Theories, Models and Methodology in Writing Research
Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel
1996-01-01
Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the
The Friction Theory for Viscosity Modeling
DEFF Research Database (Denmark)
Cisneros, Sergio; Zeberg-Mikkelsen, Claus Kjær; Stenby, Erling Halfdan
2001-01-01
In this work the one-parameter friction theory (f-theory) general models have been extended to the viscosity prediction and modeling of characterized oils. It is demonstrated that these simple models, which take advantage of the repulsive and attractive pressure terms of cubic equations of state such as the SRK, PR and PRSV, can provide accurate viscosity prediction and modeling of characterized oils. In the case of light reservoir oils, whose properties are close to those of normal alkanes, the one-parameter f-theory general models can predict the viscosity of these fluids with good accuracy. Yet, in the case when experimental information is available a more accurate modeling can be obtained by means of a simple tuning procedure. A tuned f-theory general model can deliver highly accurate viscosity modeling above the saturation pressure and good prediction of the liquid-phase viscosity at pressures
Klok, T.C.; Nordtug, T.; Tamis, J.E.
2014-01-01
To estimate the impact of accidental oil-spills on cod fisheries a model framework is developed in which a Dynamic Energy Budget (DEB) model is applied to assess mortality caused by petroleum substances in early life stages. In this paper we report on a literature search and DEB analyses, aiming for
F.K.M. van Nispen tot Pannerden (Frans)
2012-01-01
The Call for a Budgetary Theory: The appeal of Valdimer Key for a budgetary theory marks the interest in public budgeting in modern history. He clearly referred to a normative theory, raising the question: 'on what basis shall it be decided to allocate X dollars to activity A instead of
A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France
International Nuclear Information System (INIS)
De Cock, Erwin; Sapoval, Marc; Julia, Pierre; Lissovoy, Greg de; Lopes, Sandra
2013-01-01
The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis from the payer perspective used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9 %) and BMS rates reported in the literature (average 16 and 22 %) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) for a scenario where BMS is progressively replaced by Zilver PTX compared to a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.
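The scenario comparison described above (total costs of primary stenting plus expected TLR reinterventions, compared between a mixed DES/BMS scenario and a BMS-only scenario) can be sketched in a few lines. The function and all figures in the usage example are illustrative placeholders, not the French tariffs, TLR rates, or patient counts of the published model:

```python
def net_budget_impact(n_patients, uptake, tlr_des, tlr_bms,
                      cost_des, cost_bms, cost_tlr):
    """Net budget impact of partially replacing BMS with a drug-eluting
    stent (DES). Costs per scenario = primary stenting costs plus
    expected reintervention costs (TLR rate * reintervention tariff).
    All inputs are hypothetical placeholders for illustration."""
    def scenario_cost(share_des):
        n_des = n_patients * share_des
        n_bms = n_patients - n_des
        primary = n_des * cost_des + n_bms * cost_bms
        reinterventions = (n_des * tlr_des + n_bms * tlr_bms) * cost_tlr
        return primary + reinterventions
    # Negative result = net saving versus the BMS-only scenario.
    return scenario_cost(uptake) - scenario_cost(0.0)
```

With placeholder inputs in which the DES costs more up front but halves the reintervention rate, the net impact comes out negative (a saving), mirroring the direction of the published result.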
A Budget Impact Model for Paclitaxel-eluting Stent in Femoropopliteal Disease in France
Energy Technology Data Exchange (ETDEWEB)
De Cock, Erwin, E-mail: erwin.decock@unitedbiosource.com [United BioSource Corporation, Peri- and Post-Approval Services (Spain); Sapoval, Marc, E-mail: Marc.sapoval2@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Department of Cardiovascular and Interventional Radiology (France); Julia, Pierre, E-mail: pierre.julia@egp.aphp.fr [Hopital Europeen Georges Pompidou, Universite Rene Descartes, Cardiovascular Surgery Department (France); Lissovoy, Greg de, E-mail: gdelisso@jhsph.edu [Johns Hopkins Bloomberg School of Public Health, Department of Health Policy and Management (United States); Lopes, Sandra, E-mail: Sandra.Lopes@CookMedical.com [Cook Medical, Health Economics and Reimbursement (Denmark)
2013-04-15
The Zilver PTX drug-eluting stent (Cook Ireland Ltd., Limerick, Ireland) represents an advance in endovascular treatments for atherosclerotic superficial femoral artery (SFA) disease. Clinical data demonstrate improved clinical outcomes compared to bare-metal stents (BMS). This analysis assessed the likely impact on the French public health care budget of introducing reimbursement for the Zilver PTX stent. A model was developed in Microsoft Excel to estimate the impact of a progressive transition from BMS to Zilver PTX over a 5-year horizon. The number of patients undergoing SFA stenting was estimated on the basis of hospital episode data. The analysis from the payer perspective used French reimbursement tariffs. Target lesion revascularization (TLR) after primary stent placement was the primary outcome. TLR rates were based on 2-year data from the Zilver PTX single-arm study (6 and 9 %) and BMS rates reported in the literature (average 16 and 22 %) and extrapolated to 5 years. Net budget impact was expressed as the difference in total costs (primary stenting and reinterventions) for a scenario where BMS is progressively replaced by Zilver PTX compared to a scenario of BMS only. The model estimated a net cumulative 5-year budget reduction of €6,807,202 for a projected population of 82,316 patients (21,361 receiving Zilver PTX). Base case results were confirmed in sensitivity analyses. Adoption of Zilver PTX could lead to important savings for the French public health care payer. Despite higher initial reimbursement for the Zilver PTX stent, fewer expected SFA reinterventions after the primary stenting procedure result in net savings.
Potential Applications of Gosat Based Carbon Budget Products to Refine Terrestrial Ecosystem Model
Kondo, M.; Ichii, K.
2011-12-01
Estimation of carbon exchange in terrestrial ecosystems is complicated by the complex entanglement of physical and biological processes: thus, the net ecosystem productivity (NEP) estimated from simulation often differs among process-based terrestrial ecosystem models. In addition to the complexity of the system, validation can only be conducted at point scale, since reliable observations are only available from ground measurements. With a lack of large-scale spatial data, extension of model simulation to the global scale results in significant uncertainty in the future carbon balance and climate change. The Greenhouse gases Observing SATellite (GOSAT), launched by the Japanese space agency (JAXA) in January 2009, is the first operational satellite promising to deliver the net land-atmosphere carbon budget to the terrestrial biosphere research community. Using that information, the model reproducibility of the carbon budget is expected to improve, hence giving a better estimation of future climate change. This initial analysis seeks to evaluate potential applications of GOSAT observations toward the refinement of terrestrial ecosystem models. The present study was conducted in two stages: a site-based analysis using eddy covariance observation data to assess the potential use of terrestrial carbon fluxes (GPP, RE, and NEP) to refine the model, and an extension of the point-scale analysis to the spatial scale using the Carbon Tracker product as a prototype of the GOSAT product. In the first phase of the experiment, it was verified that an optimization routine adapted to a terrestrial model, Biome-BGC, yielded improved results with respect to eddy covariance observation data from the AsiaFlux Network. The spatial data sets used in the second phase consisted of GPP from an empirical algorithm (e.g. support vector machine), NEP from Carbon Tracker, and RE from the combination of these. These spatial carbon flux estimates were used to refine the model applying exactly the same
Economic benefits of safety-engineered sharp devices in Belgium - a budget impact model.
Hanmore, Emma; Maclaine, Grant; Garin, Fiona; Alonso, Alexander; Leroy, Nicolas; Ruff, Lewis
2013-11-25
Measures to protect healthcare workers where there is risk of injury or infection from medical sharps became mandatory in the European Union (EU) from May 2013. Our research objective was to estimate the net budget impact of introducing safety-engineered devices (SEDs) for prevention of needlestick injuries (NSIs) in a Belgian hospital. A 5-year incidence-based budget impact model was developed from the hospital inpatient perspective, comparing costs and outcomes with SEDs and prior-used conventional (non-safety) devices. The model accounts for device acquisition costs and costs of NSI management in 4 areas of application where SEDs are currently used: blood collection, infusion, injection and diabetes insulin administration. Model input data were sourced from the Institut National d'Assurance Maladie-Invalidité, published studies, clinical guidelines and market research. Costs are discounted at 3%. For a 420-bed hospital, 100% substitution of conventional devices by SEDs is estimated to decrease the cumulative 5-year incidence of NSIs from 310 to 75, and those associated with exposure to blood-borne viral diseases from 60 to 15. Cost savings from managing fewer NSIs more than offset increased device acquisition costs, yielding estimated 5-year overall savings of €51,710. The direction of these results is robust to a range of sensitivity and model scenario analyses. The model was most sensitive to variation in the acquisition costs of SEDs, rates of NSI associated with conventional devices, and the acquisition costs of conventional devices. NSIs are a significant potential risk with the use of sharp devices. The incidence of NSIs and the costs associated with their management can be reduced through the adoption of safer work practices, including investment in SEDs. For a Belgian hospital, the budget impact model reports that the incremental acquisition costs of SEDs are offset by the savings from fewer NSIs. The availability of more robust data for NSI reduction
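The 5-year horizon with 3% discounting used in the model can be expressed as a simple present-value sum; the function below is a minimal sketch, and any savings figures passed to it are illustrative, not the hospital's actual data:

```python
def discounted_savings(annual_savings, rate=0.03):
    """Present value of a stream of end-of-year savings, discounted at
    `rate` (the model discounts at 3%). `annual_savings` is a list of
    per-year net savings (device cost offsets minus extra acquisition
    costs); figures are placeholders for illustration."""
    return sum(s / (1 + rate) ** t
               for t, s in enumerate(annual_savings, start=1))
```

Summing five such discounted annual terms is how a figure like the reported cumulative €51,710 saving would be assembled from per-year savings estimates.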
Topsoil N-budget model in orchard farming to evaluate groundwater nitrate contamination
Wijayanti, Yureana; Budihardjo, Kadarwati; Sakamoto, Yasushi; Setyandito, Oki
2017-12-01
A small-scale field study was conducted in an orchard farming area in Kofu, Japan, where nitrate contamination was found in groundwater. The purpose of assessing the leaching of nitrate in this study is to understand the transformation and transport of the N-source in topsoil that leads to nitrate contamination of groundwater. In order to calculate the N-budget in the soil, a model was utilized to predict nitrogen leaching. In this research, the N-budget model was modified to evaluate the influence of precipitation and the application pattern of fertilizer and manure compost. The results show that at the time before the addition of manure compost and fertilizer, about 75% of fertilizer leaches from the topsoil. Every month, the average nitrate remaining in the soil from fertilizer and manure compost is 22% and 50%, respectively. The accumulation of this monthly manure-compost nitrate stored in the soil should be carefully monitored, as it could become a potential source of nitrate leaching to groundwater in the future.
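A toy version of the monthly topsoil N balance implied by those percentages might look like the following; the retention fractions (22% for fertilizer, 50% for compost) come from the abstract, but the mass-balance structure and all other values are simplifying assumptions for illustration:

```python
def topsoil_n(months, fert_in, compost_in, f_keep=0.22, c_keep=0.50):
    """Toy monthly topsoil N balance. Each month, fresh fertilizer and
    manure-compost N are added; a fixed fraction of each pool remains
    in the soil (22% and 50%, per the abstract) and the rest is
    assumed to leach toward groundwater. A sketch of the budget idea,
    not the calibrated model."""
    fert, comp, leached = 0.0, 0.0, 0.0
    for _ in range(months):
        fert += fert_in
        comp += compost_in
        leached += fert * (1 - f_keep) + comp * (1 - c_keep)
        fert *= f_keep    # nitrate retained from fertilizer pool
        comp *= c_keep    # nitrate retained from compost pool
    return fert + comp, leached
```

Because the compost pool retains half its nitrate each month, it accumulates far more slowly-releasing N than the fertilizer pool, which is the accumulation the abstract flags for monitoring.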
Developing integrated parametric planning models for budgeting and managing complex projects
Etnyre, Vance A.; Black, Ken U.
1988-01-01
The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
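The trapezoidal-segmentation scheme described above has a direct closed form: the integral of a linear loading rate over one interval is a trapezoid's area, and total cost is the sum over segments. This sketch uses illustrative names; the original prototype was a Lotus-123 spreadsheet, not code:

```python
def segment_cost(t0, t1, r0, r1):
    """Cost of one project segment: the exact integral of a cost-loading
    rate that varies linearly from r0 (at time t0) to r1 (at time t1),
    i.e. the area of a trapezoid."""
    return 0.5 * (r0 + r1) * (t1 - t0)

def project_cost(breakpoints, rates):
    """Total project cost: sum of segment integrals of a piecewise-linear
    loading function given by matching lists of time breakpoints and
    loading rates at those breakpoints."""
    return sum(segment_cost(breakpoints[i], breakpoints[i + 1],
                            rates[i], rates[i + 1])
               for i in range(len(breakpoints) - 1))
```

A ramp-up/ramp-down loading profile `[0, 10, 0]` over `[0, 2, 4]` gives two trapezoids (here triangles) of area 10 each, so a total cost of 20, which is the algebraic expressibility the abstract refers to.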
Simulated effects of nitrogen saturation on the global carbon budget using the IBIS model
Lu, Xuehe; Jiang, Hong; Liu, Jinxun; Zhang, Xiuying; Jin, Jiaxin; Zhu, Qiuan; Zhang, Zhen; Peng, Changhui
2016-01-01
Over the past 100 years, human activity has greatly changed the rate of atmospheric N (nitrogen) deposition in terrestrial ecosystems, resulting in N saturation in some regions of the world. The contribution of N saturation to the global carbon budget remains uncertain due to the complicated nature of C-N (carbon-nitrogen) interactions and diverse geography. Although N deposition is included in most terrestrial ecosystem models, the effect of N saturation is frequently overlooked. In this study, the IBIS (Integrated BIosphere Simulator) was used to simulate the global-scale effects of N saturation during the period 1961–2009. The results of this model indicate that N saturation reduced global NPP (Net Primary Productivity) and NEP (Net Ecosystem Productivity) by 0.26 and 0.03 Pg C yr−1, respectively. The negative effects of N saturation on carbon sequestration occurred primarily in temperate forests and grasslands. In response to elevated CO2 levels, global N turnover slowed due to increased biomass growth, resulting in a decline in soil mineral N. These changes in N cycling reduced the impact of N saturation on the global carbon budget. However, elevated N deposition in certain regions may further alter N saturation and C-N coupling.
Chhatwal, Jagpreet; Chen, Qiushi; Aggarwal, Rakesh
2018-06-01
Oral direct-acting antiviral agents have revolutionized treatment of hepatitis C virus (HCV) infection. Nonetheless, barriers exist to elimination of HCV as a public health threat, including low uptake of treatment, limited budget allocations for HCV treatment, and low awareness of HCV status among infected people. Mathematical modeling provides a systematic framework to analyze and compare potential solutions and elimination strategies by simulating the HCV epidemic under different conditions. Such models evaluate the impact of interventions in advance of implementation. This article describes key components of developing an HCV burden model and illustrates its use by simulating the HCV epidemic in the United States. Copyright © 2018 Elsevier Inc. All rights reserved.
Murray, L. T.; Strode, S. A.; Fiore, A. M.; Lamarque, J. F.; Prather, M. J.; Thompson, C. R.; Peischl, J.; Ryerson, T. B.; Allen, H.; Blake, D. R.; Crounse, J. D.; Brune, W. H.; Elkins, J. W.; Hall, S. R.; Hintsa, E. J.; Huey, L. G.; Kim, M. J.; Moore, F. L.; Ullmann, K.; Wennberg, P. O.; Wofsy, S. C.
2017-12-01
Nitrogen oxides (NOx ≡ NO + NO2) in the background atmosphere are critical precursors for the formation of tropospheric ozone and OH, thereby exerting strong influence on surface air quality, reactive greenhouse gases, and ecosystem health. The impact of NOx on atmospheric composition and climate is sensitive to the relative partitioning of reactive nitrogen between NOx and longer-lived reservoir species of the total reactive nitrogen family (NOy) such as HNO3, HNO4, PAN and organic nitrates (RONO2). Unfortunately, global chemistry-climate models (CCMs) and chemistry-transport models (CTMs) have historically disagreed in their reactive nitrogen budgets outside of polluted continental regions, and we have lacked in situ observations with which to evaluate them. Here, we compare and evaluate the NOy budget of six global models (GEOS-Chem CTM, GFDL AM3 CCM, GISS E2.1 CCM, GMI CTM, NCAR CAM CCM, and UCI CTM) using new observations of total reactive nitrogen and its member species from the NASA Atmospheric Tomography (ATom) mission. ATom has now completed two of its four planned deployments sampling the remote Pacific and Atlantic basins of both hemispheres with a comprehensive suite of measurements for constraining reactive photochemistry. All six models have simulated conditions climatologically similar to the deployments. The GMI and GEOS-Chem CTMs have in addition performed hindcast simulations using the MERRA-2 reanalysis, and have been sampled along the flight tracks. We evaluate the performance of the models relative to the observations, and identify factors contributing to their disparate behavior using known differences in model oxidation mechanisms, heterogeneous loss pathways, lightning and surface emissions, and physical loss processes.
Crisis in Context Theory: An Ecological Model
Myer, Rick A.; Moore, Holly B.
2006-01-01
This article outlines a theory for understanding the impact of a crisis on individuals and organizations. Crisis in context theory (CCT) is grounded in an ecological model and based on literature in the field of crisis intervention and on personal experiences of the authors. A graphic representation denotes key components and premises of CCT,…
Constraint theory: multidimensional mathematical model management
Friedman, George J
2017-01-01
Packed with new material and research, this second edition of George Friedman’s bestselling Constraint Theory remains an invaluable reference for all engineers, mathematicians, and managers concerned with modeling. As in the first edition, this text analyzes the way Constraint Theory employs bipartite graphs and presents the process of locating the “kernel of constraint” trillions of times faster than brute-force approaches, determining model consistency and computational allowability. Unique in its abundance of topological pictures of the material, this book balances left- and right-brain perceptions to provide a thorough explanation of multidimensional mathematical models. Much of the extended material in this new edition also comes from Phan Phan’s PhD dissertation in 2011, titled “Expanding Constraint Theory to Determine Well-Posedness of Large Mathematical Models.” Praise for the first edition: "Dr. George Friedman is indisputably the father of the very powerful methods of constraint theory...
Upper Blue Nile basin water budget from a multi-model perspective
Jung, Hahn Chul; Getirana, Augusto; Policelli, Frederick; McNally, Amy; Arsenault, Kristi R.; Kumar, Sujay; Tadesse, Tsegaye; Peters-Lidard, Christa D.
2017-12-01
Improved understanding of the water balance in the Blue Nile is of critical importance because of increasingly frequent hydroclimatic extremes under a changing climate. The intercomparison and evaluation of multiple land surface models (LSMs) associated with different meteorological forcing and precipitation datasets can offer a moderate range of water budget variable estimates. In this context, two LSMs, Noah version 3.3 (Noah3.3) and Catchment LSM version Fortuna 2.5 (CLSMF2.5) coupled with the Hydrological Modeling and Analysis Platform (HyMAP) river routing scheme are used to produce hydrological estimates over the region. The two LSMs were forced with different combinations of two reanalysis-based meteorological datasets from the Modern-Era Retrospective analysis for Research and Applications datasets (i.e., MERRA-Land and MERRA-2) and three observation-based precipitation datasets, generating a total of 16 experiments. Modeled evapotranspiration (ET), streamflow, and terrestrial water storage estimates were evaluated against the Atmosphere-Land Exchange Inverse (ALEXI) ET, in-situ streamflow observations, and NASA Gravity Recovery and Climate Experiment (GRACE) products, respectively. Results show that CLSMF2.5 provided better representation of the water budget variables than Noah3.3 in terms of the Nash-Sutcliffe coefficient when considering all meteorological forcing datasets and precipitation datasets. The model experiments forced with observation-based products, the Climate Hazards group Infrared Precipitation with Stations (CHIRPS) and the Tropical Rainfall Measuring Mission (TRMM) Multi-Satellite Precipitation Analysis (TMPA), outperform those run with MERRA-Land and MERRA-2 precipitation. The results presented in this paper suggest that the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System incorporate CLSMF2.5 and the HyMAP routing scheme to better represent the water balance in this region.
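The Nash-Sutcliffe coefficient used above as the model skill metric is simple to compute; a minimal sketch with illustrative streamflow values (not data from the study):

```python
# Nash-Sutcliffe efficiency (NSE): 1.0 is a perfect fit; values <= 0 mean
# the model predicts no better than the mean of the observations.

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

obs = [3.0, 5.0, 9.0, 6.0]       # hypothetical observed streamflow
print(nash_sutcliffe(obs, obs))          # a perfect model
print(nash_sutcliffe(obs, [5.75] * 4))   # a model that only predicts the mean
```

Because the score is normalized by observed variance, it allows the 16 forcing/precipitation experiments to be ranked on a common scale across variables.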
Dynamic energy budget (DEB) theory provides a generalizable and broadly applicable framework to connect sublethal toxic effects on individuals to changes in population survival and growth. To explore this approach, we are conducting growth and bioaccumulation studies that contrib...
Staircase Models from Affine Toda Field Theory
Dorey, Patrick; Ravanini, Francesco
1993-01-01
We propose a class of purely elastic scattering theories generalising the staircase model of Al. B. Zamolodchikov, based on the affine Toda field theories for simply-laced Lie algebras g=A,D,E at suitable complex values of their coupling constants. Considering their Thermodynamic Bethe Ansatz equations, we give analytic arguments in support of a conjectured renormalisation group flow visiting the neighbourhood of each W_g minimal model in turn.
Reconstructing bidimensional scalar field theory models
International Nuclear Information System (INIS)
Flores, Gabriel H.; Svaiter, N.F.
2001-07-01
In this paper we review how to reconstruct scalar field theories in two-dimensional spacetime starting from solvable Schrödinger equations. Three different Schrödinger potentials are analyzed. We obtain two new models starting from the Morse and Scarf II hyperbolic potentials: the U(θ) = θ² ln²(θ²) model and the U(θ) = θ² cos²(ln(θ²)) model, respectively. (author)
Alunno-bruscia, Marianne; Van Der Veer, Henk; Kooijman, S. A. L. M.
2011-01-01
This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007–2011). In this introductory paper we summarise the progress made during the running time of this 5 years’ project, present context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs ...
A course on basic model theory
Sarbadhikari, Haimanti
2017-01-01
This self-contained book is an exposition of the fundamental ideas of model theory. It presents the necessary background from logic, set theory and other topics of mathematics. Only some degree of mathematical maturity and willingness to assimilate ideas from diverse areas are required. The book can be used for both teaching and self-study, ideally over two semesters. It is primarily aimed at graduate students in mathematical logic who want to specialise in model theory. However, the first two chapters constitute the first introduction to the subject and can be covered in a one-semester course for senior undergraduate students in mathematical logic. The book is also suitable for researchers who wish to use model theory in their work.
Motivation in Beyond Budgeting: A Motivational Paradox?
DEFF Research Database (Denmark)
Sandalgaard, Niels; Bukh, Per Nikolaj
In this paper we discuss the role of motivation in relation to budgeting and we analyse how the Beyond Budgeting model functions compared with traditional budgeting. In the paper we focus on budget related motivation (and motivation in general) and conclude that the Beyond Budgeting model...
Gauge theories and integrable lattice models
International Nuclear Information System (INIS)
Witten, E.
1989-01-01
Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)
Cluster model in reaction theory
International Nuclear Information System (INIS)
Adhikari, S.K.
1979-01-01
A recent work by Rosenberg on cluster states in reaction theory is reexamined and generalized to include energies above the threshold for breakup into four composite fragments. The problem of elastic scattering between two interacting composite fragments is reduced to an equivalent two-particle problem with an effective potential to be determined by extremum principles. For energies above the threshold for breakup into three or four composite fragments, effective few-particle potentials are introduced and the problem is reduced to effective three- and four-particle problems. The equivalent three-particle equation contains effective two- and three-particle potentials. The effective potential in the equivalent four-particle equation has two-, three-, and four-body connected parts and a piece which has two independent two-body connected parts. In the equivalent three-particle problem we show how to include the effect of a weak three-body potential perturbatively. In the equivalent four-body problem an approximate simple calculational scheme is given when one neglects the four-particle potential, the effect of which is presumably very small.
Directory of Open Access Journals (Sweden)
Y. Jia
2009-10-01
Full Text Available A distributed model for simulating the land surface hydrological processes in the Heihe river basin was developed and validated on the basis of considering the physical mechanism of hydrological cycle and the artificial system of water utilization in the basin. Modeling approach of every component process was introduced from 2 aspects, i.e., water cycle and energy cycle. The hydrological processes include evapotranspiration, infiltration, runoff, groundwater flow, interaction between groundwater and river water, overland flow, river flow and artificial cycle processes of water utilization. A simulation of 21 years from 1982 to 2002 was carried out after obtaining various input data and model parameters. The model was validated for both the simulation of monthly discharge process and that of daily discharge process. Water budgets and spatial and temporal variations of hydrological cycle components as well as energy cycle components in the upper and middle reach Heihe basin (36 728 km²) were studied by using the distributed hydrological model. In addition, the model was further used to predict the water budgets under the future land surface change scenarios in the basin. The modeling results show: (1) in the upper reach watershed, the annual average evapotranspiration and runoff account for 63% and 37% of the annual precipitation, respectively, the snow melting runoff accounts for 19% of the total runoff and 41% of the direct runoff, and the groundwater storage has no obvious change; (2) in the middle reach basin, the annual average evapotranspiration is 52 mm more than the local annual precipitation, and the groundwater storage is of an obvious declining trend because of irrigation water consumption; (3) for the scenario of conservation forest construction in the upper reach basin, although the evapotranspiration from interception may increase, the soil evaporation may reduce at the same time, therefore the total evapotranspiration may not
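The upper-reach partitioning reported in result (1), evapotranspiration 63% and runoff 37% of annual precipitation with no obvious storage change, amounts to a closed annual water balance P = ET + R + ΔS. A small sketch of that closure check (the precipitation figure is hypothetical, only the fractions come from the abstract):

```python
# Annual water-balance closure, P = ET + R + dS.
# A residual near zero means the budget closes.

def closure_residual(precip, et, runoff, storage_change=0.0):
    """Residual of the annual water balance, in the same units as precip."""
    return precip - (et + runoff + storage_change)

precip = 400.0                 # mm/yr, hypothetical annual precipitation
et = 0.63 * precip             # 63% of precipitation (upper reach)
runoff = 0.37 * precip         # 37% of precipitation (upper reach)
print(closure_residual(precip, et, runoff))  # ≈ 0 up to floating point
```

The same residual, accumulated per grid cell and time step, is how distributed models of this kind are typically checked for internal budget consistency.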
Economic Modelling in Institutional Economic Theory
Directory of Open Access Journals (Sweden)
Wadim Strielkowski
2017-06-01
Full Text Available Our paper is centered around the formation of a theory of institutional modelling that includes principles and ideas reflecting the laws of societal development within the framework of institutional economic theory. We scrutinize and discuss the scientific principles of this institutional modelling that are increasingly postulated by the classics of institutional theory and find their way into the basics of institutional economics. We propose scientific ideas concerning new innovative approaches to institutional modelling. These ideas have been devised and developed on the basis of the results of our own original design, as well as on the formalisation and measurement of economic institutions, their functioning and evolution. Moreover, we consider the applied aspects of the institutional theory of modelling and employ them in our research for formalising our results and maximising the practical outcome of our paper. Our results and findings might be useful for researchers and stakeholders searching for a systematic and comprehensive description of institutional-level modelling, the principles involved in this process and the main provisions of the institutional theory of economic modelling.
Randomized Item Response Theory Models
Fox, Gerardus J.A.
2005-01-01
The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of a true response is modeled by
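The abstract is truncated, but the general idea of modeling the probability of a true response can be illustrated with Warner's classic randomized-response design (a textbook example, not the item response model of this paper):

```python
# Warner's randomized-response design: with probability p the respondent
# answers the sensitive question truthfully, otherwise its negation, so
# P(yes) = p*pi + (1 - p)*(1 - pi), where pi is the true population
# proportion. Privacy is preserved, yet pi remains identifiable.

def prob_yes(pi, p):
    """Probability of observing a 'yes' under Warner's design."""
    return p * pi + (1 - p) * (1 - pi)

def estimate_pi(yes_rate, p):
    """Invert the model to recover pi from the observed 'yes' rate
    (requires p != 0.5, otherwise the design carries no information)."""
    return (yes_rate - (1 - p)) / (2 * p - 1)

pi_true, p = 0.2, 0.7
lam = prob_yes(pi_true, p)            # the observable 'yes' probability
print(round(estimate_pi(lam, p), 6))  # recovers pi_true
```

Item response theory versions of RR, as in this paper, replace the single proportion pi with a person- and item-dependent probability, but the observation model layered on top has this same structure.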
Arnault, Joel; Kunstmann, Harald; Knoche, Hans-Richard
2015-04-01
Many numerical studies have shown that the West African monsoon is highly sensitive to the state of the land surface. It is however questionable to which extent a local change of land surface properties would affect the local climate, especially with respect to precipitation. This issue is traditionally addressed with the concept of precipitation recycling, defined as the contribution of local surface evaporation to local precipitation. For this study the West African monsoon has been simulated with the Weather Research and Forecasting (WRF) model using explicit convection, for the domain (1°S-21°N, 18°W-14°E) at a spatial resolution of 10 km, for the period January-October 2013, and using ERA-Interim reanalyses as driving data. This WRF configuration has been selected for its ability to simulate monthly precipitation amounts and daily histograms close to TRMM (Tropical Rainfall Measuring Mission) data. In order to investigate precipitation recycling in this WRF simulation, surface evaporation tagging has been implemented in the WRF source code, as well as the budget of total and tagged atmospheric water. Surface evaporation tagging consists in duplicating all water species and the respective prognostic equations in the source code. Then, tagged water species are set to zero at the lateral boundaries of the simulated domain (no inflow of tagged water vapor), and tagged surface evaporation is considered only in a specified region. All the source terms of the prognostic equations of total and tagged water species are finally saved in the outputs for the budget analysis. This allows quantifying the respective contributions of total and tagged atmospheric water to atmospheric precipitation processes. The WRF simulation with surface evaporation tagging and budgets has been conducted two times, first with a 100 km² tagged region (11-12°N, 1-2°W), and second with a 1000 km² tagged region (7-16°N, 6°W-3°E). In this presentation we will investigate hydro
Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model
Energy Technology Data Exchange (ETDEWEB)
Murphy, Cheryl [Michigan State University, East Lansing; Nisbet, Roger [University of California Santa Barbara; Antczak, Philipp [University of Liverpool, UK; Reyero, Natalia [Army Corps of Engineers, Vicksburg; Gergs, Andre [Gaiac; Lika, Dina [University of Crete; Mathews, Teresa J. [ORNL; Muller, Eric [University of California, Santa Barbara; Nacci, Dianne [U.S. Environmental Protection Agency (EPA); Peace, Angela L. [ORNL; Remien, Chris [University of Idaho; Schulz, Irv [Pacific Northwest National Laboratory (PNNL); Watanabe, Karen [Arizona State University
2018-02-01
Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Institute for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher-level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.
Graphical Model Theory for Wireless Sensor Networks
International Nuclear Information System (INIS)
Davis, William B.
2002-01-01
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
Topological quantum theories and integrable models
International Nuclear Information System (INIS)
Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.
1991-01-01
The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit
Self Modeling: Expanding the Theories of Learning
Dowrick, Peter W.
2012-01-01
Self modeling (SM) offers a unique expansion of learning theory. For several decades, a steady trickle of empirical studies has reported consistent evidence for the efficacy of SM as a procedure for positive behavior change across physical, social, educational, and diagnostic variations. SM became accepted as an extreme case of model similarity;…
Security Theorems via Model Theory
Directory of Open Access Journals (Sweden)
Joshua Guttman
2009-11-01
Full Text Available A model-theoretic approach can establish security theorems for cryptographic protocols. Formulas expressing authentication and non-disclosure properties of protocols have a special form: they are quantified implications, for all xs . (phi implies for some ys . psi). Models (interpretations) for these formulas are *skeletons*, partially ordered structures consisting of a number of local protocol behaviors. *Realized* skeletons contain enough local sessions to explain all the behavior, when combined with some possible adversary behaviors. We show two results. (1) If phi is the antecedent of a security goal, then there is a skeleton A_phi such that, for every skeleton B, phi is satisfied in B iff there is a homomorphism from A_phi to B. (2) A protocol enforces for all xs . (phi implies for some ys . psi) iff every realized homomorphic image of A_phi satisfies psi. Hence, to verify a security goal, one can use the Cryptographic Protocol Shapes Analyzer CPSA (TACAS, 2007) to identify minimal realized skeletons, or "shapes," that are homomorphic images of A_phi. If psi holds in each of these shapes, then the goal holds.
Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.
2011-11-01
This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this 5 years' project, present context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability within the context of DEB theory for metabolic organisation, and (ii) to evaluate the inter-relationships between different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). AquaDEB phase I focussed on quantifying bio-energetic processes of various aquatic species ( e.g. molluscs, fish, crustaceans, algae) and phase II on: (i) comparing of energetic and physiological strategies among species through the DEB parameter values and identifying the factors responsible for any differences in bioenergetics and physiology; (ii) considering different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) scaling up the models for a few species from the individual level up to the level of evolutionary processes. Apart from the three special issues in the Journal of Sea Research — including the DEBIB collaboration (see vol. 65 issue 2), a theme issue on DEB theory appeared in the Philosophical Transactions of the Royal Society B (vol 365, 2010); a large number of publications were produced; the third edition of the DEB book appeared (2010); open-source software was substantially expanded (over 1000 functions); a large open-source systematic collection of ecophysiological data and DEB parameters has been set up; and a series of DEB
Brodszky, Valentin; Rencz, Fanni; Péntek, Márta; Baji, Petra; Lakatos, Péter L; Gulácsi, László
2016-01-01
To estimate the budget impact of the introduction of biosimilar infliximab for the treatment of Crohn's disease (CD) in Bulgaria, the Czech Republic, Hungary, Poland, Romania and Slovakia. A 3-year, prevalence-based budget impact analysis for biosimilar infliximab to treat CD was developed from the third-party payers' perspective. The model included various scenarios depending on whether interchanging originator infliximab with biosimilar infliximab was allowed or not. Total cost savings achieved in biosimilar scenario 1 (interchanging not allowed) and biosimilar scenario 2 (interchanging allowed in 80% of the patients) were estimated at €8.0 million and €16.9 million in the six countries. Budget savings may cover biosimilar infliximab therapy for 722-1530 additional CD patients. Introduction of biosimilar infliximab to treat CD may offset the inequity in access to biological therapy for CD between Central and Eastern European countries.
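The claim that savings "may cover ... additional CD patients" is a division of total savings by the annual biosimilar therapy cost per patient. A back-of-the-envelope sketch (the per-patient cost below is a hypothetical placeholder; only the savings figures come from the abstract):

```python
# Additional patients covered ≈ total savings / annual cost per patient.
# Illustrative arithmetic only; the per-patient cost is hypothetical.

def patients_covered(total_savings_eur, annual_cost_per_patient_eur):
    """Whole number of additional patient-years the savings would fund."""
    return int(total_savings_eur // annual_cost_per_patient_eur)

cost_per_patient = 11_000  # EUR/yr, hypothetical biosimilar therapy cost
print(patients_covered(8.0e6, cost_per_patient))   # scenario 1 savings
print(patients_covered(16.9e6, cost_per_patient))  # scenario 2 savings
```

With this placeholder cost the two scenarios land near the 722-1530 range quoted in the abstract, which shows how directly the headline figure depends on the assumed per-patient price.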
Vacation queueing models theory and applications
Tian, Naishuo
2006-01-01
A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. By allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...
Brackett, John; And Others
This paper represents a backdrop from which to consider the development of a planning and budgeting model for local education agencies. The first part of the presentation describes the demands and external pressures that affect resource allocation decisions in school districts. The ability of local school officials to link the cost consequences…
A numerical modelling study on regional mercury budget for eastern North America
Directory of Open Access Journals (Sweden)
X. Lin
2003-01-01
Full Text Available In this study, we have integrated an up-to-date physico-chemical transformation mechanism for Hg into the framework of the US EPA's CMAQ model system. In addition, the model adapted detailed calculations of the air-surface exchange of Hg to properly describe Hg re-emissions from, and dry deposition to, natural surfaces. The mechanism covers Hg in three categories: elemental Hg (Hg0), reactive gaseous Hg (RGM) and particulate Hg (HgP). With interfacing to MM5 (meteorology processor) and SMOKE (emission processor), we applied the model to a 4-week period in June/July 1995 on a domain covering most of eastern North America. Results indicate that the model simulates reasonably well the levels of total gaseous Hg (TGM) and the specific Hg wet deposition measurements made by the Hg deposition network (MDN). Moreover, results from various scenario runs reveal that the Hg system behaves in a closely linear way in terms of contributions from different source categories, i.e. anthropogenic emissions, natural re-emissions and background. Analyses of the scenario results suggest that 37% of anthropogenically emitted Hg was deposited back in the model domain, with 5155 kg of anthropogenic Hg moving out of the domain during the simulation period. Overall, the domain served as a net source, which supplied about half a ton of Hg to the global background pool over the period. Our model validation and a sensitivity test further rationalized the rate constant for gaseous oxidation of Hg0 by the hydroxyl radical OH used in the global-scale modelling study by Bergan and Rodhe (2001). A further laboratory determination of the reaction rate constant, including its temperature dependence, stands as one of the important issues critical to improving our knowledge of the budget and cycling of Hg.
Béjaoui-Omri, Amel; Béjaoui, Béchir; Harzallah, Ali; Aloui-Béjaoui, Nejla; El Bour, Monia; Aleya, Lotfi
2014-11-01
Mussel farming is the main economic activity in Bizerte Lagoon, with a production that fluctuates depending on environmental factors. In the present study, we apply a bioenergetic growth model to the mussel Mytilus galloprovincialis, based on dynamic energy budget (DEB) theory, which describes energy flux variation through the different compartments of the mussel body. Thus, the present model simulates both mussel growth and the steps of the sexual cycle according to food availability and water temperature, as well as the effect of climate change on mussel behavior and reproduction. The results point to good concordance between simulations and measured growth parameters (length and weight) for mussels in the lagoon. A heat wave scenario was also simulated using the DEB model, which highlighted periods of mussel mortality during high temperatures.
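At constant food density and temperature, the standard DEB model underlying studies like this one reduces to von Bertalanffy growth in length, dL/dt = rB(f·Lm − L). A minimal sketch of that reduction (all parameter values below are hypothetical, not the fitted values for M. galloprovincialis):

```python
# Von Bertalanffy growth in length, the constant-environment limit of the
# standard DEB model: dL/dt = rB * (f * Lm - L), with closed-form solution
# L(t) = f*Lm - (f*Lm - L0) * exp(-rB * t). Parameters are hypothetical.

import math

def vb_length(t, L0, rB, f, Lm):
    """Length at time t days, starting from L0.
    rB: von Bertalanffy growth rate (1/d)
    f:  scaled functional (food) response in [0, 1]
    Lm: maximum length at abundant food."""
    Linf = f * Lm                       # ultimate length at food level f
    return Linf - (Linf - L0) * math.exp(-rB * t)

L0, rB, f, Lm = 0.5, 0.01, 0.8, 8.0     # cm, 1/d, -, cm (hypothetical)
for t in (0, 100, 365, 1000):
    print(t, round(vb_length(t, L0, rB, f, Lm), 3))
```

In the full model used in the paper, food and temperature vary seasonally, so rB and the ultimate length f·Lm effectively change over time and the reserve and reproduction-buffer dynamics must be integrated numerically; the constant-environment curve above is the backbone those simulations bend around.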
Quantum field theory and the standard model
Schwartz, Matthew D
2014-01-01
Providing a comprehensive introduction to quantum field theory, this textbook covers the development of particle physics from its foundations to the discovery of the Higgs boson. Its combination of clear physical explanations, direct connections to experimental data, and mathematical rigor makes the subject accessible to students with a wide variety of backgrounds and interests. Assuming only an undergraduate-level understanding of quantum mechanics, the book steadily develops the Standard Model and state-of-the-art calculation techniques. It includes multiple derivations of many important results, with modern methods such as effective field theory and the renormalization group playing a prominent role. Numerous worked examples and end-of-chapter problems enable students to reproduce classic results and to master quantum field theory as it is used today. Based on a course taught by the author over many years, this book is ideal for an introductory to advanced quantum field theory sequence or for independe...
International Nuclear Information System (INIS)
Schlingemann, D.
1996-10-01
Several two-dimensional quantum field theory models have more than one vacuum state. An investigation of superselection sectors in two dimensions from an axiomatic point of view suggests that there should also be states, called soliton or kink states, which interpolate between different vacua. Familiar quantum field theory models for which the existence of kink states has been proven are the sine-Gordon and the φ⁴₂-model. In order to establish the existence of kink states for a larger class of models, we investigate the following question: what sufficient conditions must a pair of vacuum states fulfill so that an interpolating kink state can be constructed? We discuss the problem in the framework of algebraic quantum field theory, which includes, for example, the P(φ)₂-models. We identify a large class of vacuum states, including the vacua of the P(φ)₂-models, the Yukawa₂-like models and special types of Wess-Zumino models, for which there is a natural way to construct an interpolating kink state. In two space-time dimensions, massive particle states are kink states. We apply the Haag-Ruelle collision theory to kink sectors in order to analyze the asymptotic scattering states. We show that for special configurations of n kinks the scattering states describe n freely moving non-interacting particles. (orig.)
International Nuclear Information System (INIS)
Starrlight, A.
2012-01-01
The aim of this dissertation is to characterize the toxicity of uranium on the metabolism of the zebrafish, Danio rerio. The first three chapters of this manuscript are dedicated to characterizing the baseline metabolism of the zebrafish. I used the Dynamic Energy Budget (DEB) theory for this characterization; it is presently the only theory that covers the full life cycle of the organism and quantifies feeding, assimilation, growth, reproduction, maturation, maintenance and ageing. Any metabolic effect of uranium should appear as effects on one or more of these fundamental processes. Since the life span of the zebrafish is some four and a half years, and larger individuals respond more slowly to chemical stress, the focus was on the early life stages. Considerable breakthroughs in the quantification of zebrafish development, growth and reproduction have been made. It turned out that the zebrafish accelerates its metabolism after birth until metamorphosis, when acceleration ceases. This process is seen in some, but not all, species of fish. Another striking conclusion was that somatic maintenance was much higher than is typical for fish. We do not yet have an explanation for this finding. Further, it turned out that the details of reproduction matter: allocation to reproduction (in adults) accumulates in a reproduction buffer, and this buffer is used to prepare batches of eggs. We needed to detail this preparation process to understand how the zebrafish can eliminate uranium via eggs. DEB theory specifies that a particular developmental stage (birth, metamorphosis, puberty) is reached at specified levels of maturity. For different temperatures and food levels, that can occur at different ages and body sizes. We extended this idea to include all the morphologically defined developmental stages of the zebrafish described in the literature; the observed variations in ages and body sizes can now be explained by DEB theory. To test if DEB theory can also explain perturbations of maturation, we
Designing A Budgeting Model With Strategic Planning Approach Case Study Of The Ministry Of Energy
Directory of Open Access Journals (Sweden)
Mohammad. Sharif. Malekzadeh
2017-10-01
Full Text Available In traditional costing systems the emphasis is on production volume and product units, and it is assumed that the products consume the resources. In activity-based costing, it is argued that producing products requires activities, and activities are the consumers of resources. Therefore, in activity-based costing, overhead costs are first allocated to activities, accumulated under the title of cost reservoirs, and the costs allocated to the activities are then assigned, based on a factor called a cost driver, to products or production lines. In activity-based costing, the major activities in the production process are divided into four classes: product-unit level, product-category level, product-support level and factory level. In the present research we aim to design a budgeting model with a strategic planning approach. Drawing on the views of experts and previous research, a questionnaire on the intended field was developed, and using structural equation modeling (SEM) a model is presented to evaluate the parameters of the applied strategy in the Ministry of Power. According to the results for the impact coefficients, the greatest coefficient relates to the allocation of financial resources within the financial strategy dimension, with an impact factor value of 4.954.
Fraser, Annemarie; Chan Miller, Christopher; Palmer, Paul I.; Deutscher, Nicholas M.; Jones, Nicholas B.; Griffith, David W. T.
2011-10-01
We investigate the Australian methane budget from 2005-2008 using the GEOS-Chem 3D chemistry transport model, focusing on the relative contribution of emissions from different sectors and the influence of long-range transport. To evaluate the model, we use in situ surface measurements of methane, methane dry air column average (XCH4) from ground-based Fourier transform spectrometers (FTSs), and train-borne surface concentration measurements from an in situ FTS along the north-south continental transect. We use gravity anomaly data from Gravity Recovery and Climate Experiment to describe the spatial and temporal distribution of wetland emissions and scale it to a prior emission estimate, which better describes observed atmospheric methane variability at tropical latitudes. The clean air sites of Cape Ferguson and Cape Grim are the least affected by local emissions, while Wollongong, located in the populated southeast with regional coal mining, samples the most locally polluted air masses (2.5% of the total air mass versus Asia, accounting for ˜25% of the change in surface concentration above background. At Cape Ferguson and Cape Grim, emissions from ruminant animals are the largest source of methane above background, at approximately 20% and 30%, respectively, of the surface concentration. At Wollongong, emissions from coal mining are the largest source above background representing 60% of the surface concentration. The train data provide an effective way of observing transitions between urban, desert, and tropical landscapes.
Operational budgeting using fuzzy goal programming
Directory of Open Access Journals (Sweden)
Saeed Mohammadi
2013-10-01
Full Text Available An efficient budget normally offers several advantages, such as measuring the performance of various organizations, setting appropriate targets and promoting managers based on their achievements. However, any budget planning requires prediction of different cost components. There are various methods for budget planning, such as incremental budgeting, program budgeting, zero-based budgeting and performance budgeting. In this paper, we present a fuzzy goal programming model to estimate an operational budget. The proposed model uses triangular fuzzy numbers as well as interval numbers to estimate budget expenses. The proposed study is implemented for a real-world case study in the province of Qom, Iran, and the results are analyzed.
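A sketch of the kind of fuzzy arithmetic such a model rests on: each uncertain expense is represented as a triangular fuzzy number (pessimistic, most likely, optimistic estimate) and defuzzified before aggregation. The centroid defuzzification and the cost items below are illustrative assumptions, not the paper's actual formulation.

```python
# Sketch: budget expenses as triangular fuzzy numbers, defuzzified by centroid.
def defuzzify_triangular(low, mode, high):
    """Centroid of a triangular fuzzy number (requires low <= mode <= high)."""
    return (low + mode + high) / 3.0

# Hypothetical cost items: (pessimistic, most likely, optimistic) estimates.
items = {"salaries": (900, 1000, 1200), "maintenance": (150, 200, 280)}
budget = sum(defuzzify_triangular(*abc) for abc in items.values())
```

In a full goal programming formulation these crisp values would enter as aspiration levels with membership functions penalizing deviations; the defuzzification step above is only the entry point.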
Introduction to zeolite theory and modelling
Santen, van R.A.; Graaf, van de B.; Smit, B.; Bekkum, van H.
2001-01-01
A review. Some of the recent advances in zeolite theory and modeling are presented. In particular, the current status of computational chemistry in Brønsted acid zeolite catalysis, molecular dynamics simulations of molecules adsorbed in zeolites, and novel Monte Carlo techniques are discussed to simulate the
Prospect Theory in the Heterogeneous Agent Model
Czech Academy of Sciences Publication Activity Database
Polach, J.; Kukačka, Jiří
(2018) ISSN 1860-711X R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional support: RVO:67985556 Keywords: Heterogeneous Agent Model * Prospect Theory * Behavioral finance * Stylized facts Subject RIV: AH - Economics OBOR OECD: Finance Impact factor: 0.931, year: 2016 http://library.utia.cas.cz/separaty/2018/E/kukacka-0488438.pdf
Recursive renormalization group theory based subgrid modeling
Zhou, YE
1991-01-01
The aim is to advance knowledge and understanding of turbulence theory. Specific problems to be addressed include studies of subgrid models to understand the effects of unresolved small-scale dynamics on the large-scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Diagrammatic group theory in quark models
International Nuclear Information System (INIS)
Canning, G.P.
1977-05-01
A simple and systematic diagrammatic method is presented for calculating the numerical factors arising from group theory in quark models: dimensions, Casimir invariants, vector coupling coefficients and especially recoupling coefficients. Some coefficients for the coupling of three-quark objects are listed for SU(n) and SU(2n). (orig.) [de]
Aligning Grammatical Theories and Language Processing Models
Lewis, Shevaun; Phillips, Colin
2015-01-01
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…
Capital budgeting practices in Spain
Directory of Open Access Journals (Sweden)
Pablo de Andrés
2015-01-01
Full Text Available This paper seeks to shed further light on the capital budgeting techniques used by Spanish companies. Our paper posits that the gap between theory and practice might be related to the nature of sources of value and to the efficiency of mechanisms aligning managerial and shareholder incentives, rather than to resource restrictions or model misinterpretation. We analyze data from a survey conducted in 2011, the final sample comprising 140 non-financial Spanish firms. Our findings show a behaviour pattern similar to that reported in prior research for firms in other countries. Particularly noteworthy is that payback appears to be the most widely used tool, while real options are used relatively little. Our results confirm that size and industry are related to the frequency of use of certain capital budgeting techniques. Further, we find that the relevance of growth opportunities and flexibility is an important factor explaining the use of real options.
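The two techniques the survey contrasts most sharply, payback and discounted cash flow, can be sketched as follows; the cash flows and discount rate below are hypothetical round numbers, not survey data.

```python
# Sketch of two standard capital budgeting techniques: NPV and payback period.
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) initial outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """Years until cumulative cash flow turns non-negative (None if never)."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical project: 1000 outlay, four years of 400 inflows.
flows = [-1000, 400, 400, 400, 400]
```

Payback ignores discounting and cash flows after the cutoff, which is precisely why its popularity over NPV-style methods is the kind of theory-practice gap the paper examines.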
Griffith, S. M.; Hansen, R. F.; Dusanter, S.; Michoud, V.; Gilman, J. B.; Kuster, W. C.; Veres, P. R.; Graus, M.; de Gouw, J. A.; Roberts, J.; Young, C.; Washenfelder, R.; Brown, S. S.; Thalman, R.; Waxman, E.; Volkamer, R.; Tsai, C.; Stutz, J.; Flynn, J. H.; Grossberg, N.; Lefer, B.; Alvarez, S. L.; Rappenglueck, B.; Mielke, L. H.; Osthoff, H. D.; Stevens, P. S.
2016-04-01
Measurements of hydroxyl (OH) and hydroperoxy (HO2*) radical concentrations were made at the Pasadena ground site during the CalNex-LA 2010 campaign using the laser-induced fluorescence-fluorescence assay by gas expansion technique. The measured concentrations of OH and HO2* exhibited a distinct weekend effect, with higher radical concentrations observed on the weekends corresponding to lower levels of nitrogen oxides (NOx). The radical measurements were compared to results from a zero-dimensional model using the Regional Atmospheric Chemical Mechanism-2 constrained by NOx and other measured trace gases. The chemical model overpredicted measured OH concentrations during the weekends by a factor of approximately 1.4 ± 0.3 (1σ), but the agreement was better during the weekdays (ratio of 1.0 ± 0.2). The model underpredicted HO2* concentrations by a factor of 1.3 ± 0.2 on the weekends, while measured weekday concentrations were underpredicted by a factor of 3.0 ± 0.5. However, increasing the modeled OH reactivity to match the measured total OH reactivity improved the overall agreement for both OH and HO2* on all days. A radical budget analysis suggests that photolysis of carbonyls and formaldehyde together accounted for approximately 40% of radical initiation, with photolysis of nitrous acid accounting for 30% at the measurement height and ozone photolysis contributing less than 20%. An analysis of the ozone production sensitivity reveals that during the week, ozone production was limited by volatile organic compounds throughout the day during the campaign, but was NOx-limited during the afternoon on the weekends.
The sea ice mass budget of the Arctic and its future change as simulated by coupled climate models
Energy Technology Data Exchange (ETDEWEB)
Holland, Marika M. [National Center for Atmospheric Research, Boulder, CO (United States); Serreze, Mark C.; Stroeve, Julienne [University of Colorado, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, Boulder, CO (United States)
2010-02-15
Arctic sea ice mass budgets for the twentieth century and projected changes through the twenty-first century are assessed from 14 coupled global climate models. Large inter-model scatter in contemporary mass budgets is strongly related to variations in absorbed solar radiation, due in large part to differences in the surface albedo simulation. Over the twenty-first century, all models simulate a decrease in ice volume resulting from increased annual net melt (melt minus growth), partially compensated by reduced transport to lower latitudes. Despite this general agreement, the models vary considerably regarding the magnitude of ice volume loss and the relative roles of changing melt and growth in driving it. Projected changes in sea ice mass budgets depend in part on the initial (mid twentieth century) ice conditions; models with thicker initial ice generally exhibit larger volume losses. Pointing to the importance of evolving surface albedo and cloud properties, inter-model scatter in changing net ice melt is significantly related to changes in downwelling longwave and absorbed shortwave radiation. These factors, along with the simulated mean and spatial distribution of ice thickness, contribute to a large inter-model scatter in the projected onset of seasonally ice-free conditions. (orig.)
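The mass budget the models are compared on reduces to a simple identity: the volume change each year is growth minus melt minus export to lower latitudes. A toy integration under assumed (not model-derived) fluxes:

```python
# Sketch of the sea ice volume budget identity dV/dt = growth - melt - export.
# All flux values are hypothetical, in units of 10^3 km^3 per year.
def ice_volume_series(v0, growth, melt, export, years):
    """Integrate annual ice volume, clipping at zero (seasonally ice-free)."""
    v = v0
    out = [v0]
    for _ in range(years):
        v = max(v + growth - melt - export, 0.0)
        out.append(v)
    return out

# Net melt exceeding growth, only partly offset by reduced export.
series = ice_volume_series(v0=25.0, growth=8.0, melt=9.0, export=0.5, years=10)
```

The inter-model scatter the abstract describes amounts to models disagreeing on the sizes of these three terms while agreeing on the sign of their sum.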
Directory of Open Access Journals (Sweden)
Anita V. Sotnikova
2015-01-01
Full Text Available The article addresses the problem of effectively distributing the overall budget of a portfolio among its constituent IT projects, taking into account their priority. The problem is topical in view of the poor results achieved by consulting companies in the information technology sphere. To determine the priority of IT projects, the analytic network process developed by T. Saaty is used. For the purpose of applying this method, a system of criteria (indicators) is developed that reflects the influence of the portfolio's IT projects on the most significant goals of their implementation. As the system of criteria, key performance indicators defined during development of a Balanced Scorecard, which meet the above-mentioned requirements, are used. The essence of the analytic network process consists in pairwise comparison of key performance indicators with respect to the goal of realizing the portfolio and the IT projects that form part of it. The result of applying the method is a priority coefficient for each IT project in the portfolio. The obtained priority coefficients are used in the proposed model for distributing the portfolio budget among IT projects. Thus, the budget of a portfolio of IT projects is distributed among them taking into account not only the income from implementing each IT project, but also other criteria important for the IT company, for example: the degree of compliance of the IT project with the strategic objectives of the IT company, which determines the expediency of its implementation; and the implementation term determined by the customer. The developed model for distributing the portfolio budget among IT projects is tested on the example of distributing the budget among the IT projects of a portfolio consisting of three IT projects. Taking into account the received
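The computational core of Saaty's method, extracting priority weights from a pairwise comparison matrix and then splitting the budget proportionally, can be sketched as follows; the comparison matrix and the budget figure are hypothetical, not taken from the article.

```python
# Sketch: priority weights from a reciprocal pairwise comparison matrix
# via power iteration (principal eigenvector), then proportional allocation.
def priority_weights(M, iters=100):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]              # normalize each iteration
    return w

# Three hypothetical IT projects compared pairwise on Saaty's 1-9 scale.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = priority_weights(M)
allocation = [round(1_000_000 * wi) for wi in w]   # split a 1M budget
```

The full analytic network process also handles interdependence between criteria via a supermatrix; the eigenvector step above is only the building block.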
Directory of Open Access Journals (Sweden)
Tânia Regina Sordi Relvas
2011-09-01
Full Text Available Budgeting: substantive analysis using grounded theory --- Abstract --- Considering the fact that studies into budgeting basically use a reductionist approach, this paper proposes a comprehensive substantive theory based on empirical data to be used in budget analysis. This approach takes into consideration the budget's constituent elements and their interdependence by applying the inductive approach based on empirical data (grounded theory) under a qualitative paradigm. The focus was an in-depth two-year study of a large Brazilian financial institution involving several management levels. The main contribution of the study is a framework that treats all elements of the budget process in a comprehensive and coherent fashion, otherwise impossible using a reductionist approach. As products of the substantive theory, five propositions were developed to be applied in organizations.
A dynamical theory for the Rishon model
International Nuclear Information System (INIS)
Harari, H.; Seiberg, N.
1980-09-01
We propose a composite model for quarks and leptons based on an exact SU(3)_C x SU(3)_H gauge theory and two fundamental J=1/2 fermions: a charged T-rishon and a neutral V-rishon. Quarks, leptons and W-bosons are SU(3)_H-singlet composites of rishons. A dynamically broken effective SU(3)_C x SU(2)_L x SU(2)_R x U(1)_(B-L) gauge theory emerges at the composite level. The theory is ''natural'', anomaly-free, has no fundamental scalar particles, and describes at least three generations of quarks and leptons. Several ''technicolor'' mechanisms are automatically present. (Author)
Polyacetylene and relativistic field-theory models
International Nuclear Information System (INIS)
Bishop, A.R.; Campbell, D.K.; Fesser, K.
1981-01-01
Connections between continuum, mean-field, adiabatic Peierls-Froehlich theory in the half-filled band limit and known field theory results are discussed. Particular attention is given to the φ⁴ model and to the solvable N = 2 Gross-Neveu model. The latter is equivalent to the Peierls system at a static, semi-classical level. Based on this equivalence we note the prediction of both kink and polaron solitons in models of trans-(CH)_x. Polarons in cis-(CH)_x are compared with those in the trans isomer. Optical absorption from polarons is described, and general experimental consequences of polarons in (CH)_x and other conjugated polymers are discussed.
DEFF Research Database (Denmark)
Jeppesen, Palle
1996-01-01
The lecture note is aimed at introducing system budgets for optical communication systems. It treats optical fiber communication systems (six generations), system design, bandwidth effects, other system impairments and optical amplifiers.
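A minimal example of the kind of power budget such a lecture note sets up: transmitter power minus fiber, connector and splice losses must exceed receiver sensitivity, and the surplus is the link margin. All dB figures below are illustrative round numbers, not values from the note.

```python
# Sketch of an optical link power budget (all quantities in dB / dBm).
def link_margin_db(tx_dbm, rx_sens_dbm, fiber_km, atten_db_per_km,
                   n_connectors, connector_loss_db, splice_loss_db_total):
    """Remaining margin after subtracting all losses from transmit power."""
    losses = (fiber_km * atten_db_per_km
              + n_connectors * connector_loss_db
              + splice_loss_db_total)
    return tx_dbm - losses - rx_sens_dbm

# Hypothetical 40 km single-mode link at 1550 nm-like attenuation.
margin = link_margin_db(tx_dbm=0.0, rx_sens_dbm=-28.0, fiber_km=40.0,
                        atten_db_per_km=0.25, n_connectors=2,
                        connector_loss_db=0.5, splice_loss_db_total=2.0)
```

A positive margin of a few dB is typically reserved for ageing and repair splices; dispersion and other impairments the note treats would further erode it.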
Working memory: theories, models, and controversies.
Baddeley, Alan
2012-01-01
I present an account of the origins and development of the multicomponent approach to working memory, making a distinction between the overall theoretical framework, which has remained relatively stable, and the attempts to build more specific models within this framework. I follow this with a brief discussion of alternative models and their relationship to the framework. I conclude with speculations on further developments and a comment on the value of attempting to apply models and theories beyond the laboratory studies on which they are typically based.
Effective field theory and the quark model
International Nuclear Information System (INIS)
Durand, Loyal; Ha, Phuoc; Jaczko, Gregory
2001-01-01
We analyze the connections between the quark model (QM) and the description of hadrons in the low-momentum limit of heavy-baryon effective field theory in QCD. By using a three-flavor-index representation for the effective baryon fields, we show that the 'nonrelativistic' constituent QM for baryon masses and moments is completely equivalent through O(m_s) to a parametrization of the relativistic field theory in a general spin-flavor basis. The flavor and spin variables can be identified with those of effective valence quarks. Conversely, the spin-flavor description clarifies the structure and dynamical interpretation of the chiral expansion in effective field theory, and provides a direct connection between the field theory and the semirelativistic models for hadrons used in successful dynamical calculations. This allows dynamical information to be incorporated directly into the chiral expansion. We find, for example, that the striking success of the additive QM for baryon magnetic moments is a consequence of the relative smallness of the non-additive spin-dependent corrections
Liangxia Zhang; Ge Sun; Erika Cohen; Steven McNulty; Peter Caldwell; Suzanne Krieger; Jason Christian; Decheng Zhou; Kai Duan; Keren J. Cepero-Pérez
2018-01-01
Quantifying the forest water budget is fundamental to making science-based forest management decisions. This study aimed at developing an improved water budget for the El Yunque National Forest (ENF) in Puerto Rico, one of the wettest forests in the United States. We modified an existing monthly scale water balance model, Water Supply Stress Index (WaSSI), to reflect...
Lu, Fei; Wang, Xiao-Ke; Han, Bing; Ouyang, Zhi-Yun; Zheng, Hua
2010-05-01
Straw returning is considered to be one of the most promising carbon sequestration measures in China's cropland. A compound model, namely "Straw Returning and Burning Model-Expansion" (SRBME), was built to estimate the net mitigation potential, economic benefits, and air pollutant reduction of straw returning. Three scenarios, that is, baseline, "full popularization of straw returning (FP)," and "full popularization of straw returning and precision fertilization (FP + P)," were set to reflect popularization of straw returning. The results of the SRBME indicated that (1) compared with the soil carbon sequestration of 13.37 Tg/yr, the net mitigation potentials, which were 6.328 Tg/yr for the FP scenario and 9.179 Tg/yr for the FP + P scenario, had different trends when the full budget of the greenhouse gases was considered; (2) when the feasibility in connection with greenhouse gas (GHG) mitigation, economic benefits, and environmental benefits was taken into consideration, straw returning was feasible in 15 provinces in the FP scenario, with a total net mitigation potential of 7.192 TgCe/yr and total benefits of CNY 1.473 billion (USD 216.6 million); (3) in the FP + P scenario, with the implementation of precision fertilization, straw returning was feasible in 26 provinces with a total net mitigation potential of 10.39 TgCe/yr and total benefits of CNY 5.466 billion (USD 803.8 million); (4) any extent of change in the treatment of straw from being burnt to being returned would contribute to air pollution reduction; (5) some countermeasures, such as CH4 reduction in rice paddies, precision fertilization, financial support, education and propaganda, would promote the feasibility of straw returning as a mitigation measure.
Social Security Administration — DCS Budget Tracking System database contains budget information for the Information Technology budget and the 'Other Objects' budget. This data allows for monitoring...
Topos models for physics and topos theory
International Nuclear Information System (INIS)
Wolters, Sander
2014-01-01
What is the role of topos theory in the topos models for quantum theory as used by Isham, Butterfield, Döring, Heunen, Landsman, Spitters, and others? In other words, what is the interplay between physical motivation for the models and the mathematical framework used in these models? Concretely, we show that the presheaf topos model of Butterfield, Isham, and Döring resembles classical physics when viewed from the internal language of the presheaf topos, similar to the copresheaf topos model of Heunen, Landsman, and Spitters. Both the presheaf and copresheaf models provide a “quantum logic” in the form of a complete Heyting algebra. Although these algebras are natural from a topos theoretic stance, we seek a physical interpretation for the logical operations. Finally, we investigate dynamics. In particular, we describe how an automorphism on the operator algebra induces a homeomorphism (or isomorphism of locales) on the associated state spaces of the topos models, and how elementary propositions and truth values transform under the action of this homeomorphism. Also with dynamics the focus is on the internal perspective of the topos
Prospects for advanced RF theory and modeling
International Nuclear Information System (INIS)
Batchelor, D. B.
1999-01-01
This paper represents an attempt to express in print the contents of a rather philosophical review talk. The charge for the talk was not to summarize the present status of the field and what we can do, but to assess what we will need to do in the future and where the gaps are in fulfilling these needs. The objective was to be complete, covering all aspects of theory and modeling in all frequency regimes, although in the end the talk mainly focussed on the ion cyclotron range of frequencies (ICRF). In choosing which areas to develop, it is important to keep in mind who the customers for RF modeling are likely to be and what sorts of tasks they will need for RF to do. This occupies the first part of the paper. Then we examine each of the elements of a complete RF theory and try to identify the kinds of advances needed. (c) 1999 American Institute of Physics
A Membrane Model from Implicit Elasticity Theory
Freed, A. D.; Liao, J.; Einstein, D. R.
2014-01-01
A Fungean solid is derived for membranous materials as a body defined by isotropic response functions whose mathematical structure is that of a Hookean solid where the elastic constants are replaced by functions of state derived from an implicit, thermodynamic, internal-energy function. The theory utilizes Biot’s (1939) definitions for stress and strain that, in 1-dimension, are the stress/strain measures adopted by Fung (1967) when he postulated what is now known as Fung’s law. Our Fungean membrane model is parameterized against a biaxial data set acquired from a porcine pleural membrane subjected to three, sequential, proportional, planar extensions. These data support an isotropic/deviatoric split in the stress and strain-rate hypothesized by our theory. These data also demonstrate that the material response is highly non-linear but, otherwise, mechanically isotropic. These data are described reasonably well by our otherwise simple, four-parameter, material model. PMID:24282079
Attribution models and the Cooperative Game Theory
Cano Berlanga, Sebastian; Vilella, Cori
2017-01-01
The current paper studies the attribution model used by Google Analytics. Precisely, we use the Cooperative Game Theory to propose a fair distribution of the revenues among the considered channels, in order to facilitate the cooperation and to guarantee stability. We define a transferable utility convex cooperative game from the observed frequencies and we use the Shapley value to allocate the revenues among the different channels. Furthermore, we evaluate the impact of an advertising...
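The Shapley allocation the paper applies can be sketched directly from its definition as the average marginal contribution of each channel over all orderings; the channel names and coalition revenues below are made-up toy data, not the paper's Google Analytics frequencies.

```python
# Sketch: Shapley-value revenue attribution over channel coalitions.
from itertools import permutations

def shapley(channels, value):
    """value maps a frozenset of channels to that coalition's revenue."""
    phi = {c: 0.0 for c in channels}
    orders = list(permutations(channels))
    for order in orders:
        coalition = frozenset()
        for c in order:
            before = value(coalition)
            coalition = coalition | {c}
            phi[c] += value(coalition) - before   # marginal contribution
    return {c: v / len(orders) for c, v in phi.items()}

# Hypothetical coalition revenues (a convex two-channel toy game).
rev = {frozenset(): 0, frozenset({"search"}): 60, frozenset({"display"}): 20,
       frozenset({"search", "display"}): 100}
phi = shapley(["search", "display"], lambda s: rev[s])
```

With n channels the permutation sum grows as n!, so practical attribution implementations sample orderings; convexity of the game is what guarantees the stability the abstract mentions.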
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
MODELS AND THE DYNAMICS OF THEORIES
Directory of Open Access Journals (Sweden)
Paulo Abrantes
2007-12-01
Full Text Available Abstract: This paper gives a historical overview of the ways various trends in the philosophy of science dealt with models and their relationship with the topics of heuristics and theoretical dynamics. First of all, N. Campbell’s account of analogies as components of scientific theories is presented. Next, the notion of ‘model’ in the reconstruction of the structure of scientific theories proposed by logical empiricists is examined. This overview finishes with M. Hesse’s attempts to develop Campbell’s early ideas in terms of an analogical inference. The final part of the paper points to contemporary developments on these issues which adopt a cognitivist perspective. It is indicated how discussions in the cognitive sciences might help to flesh out some of the insights philosophers of science had concerning the role models and analogies play in actual scientific theorizing. Key words: models, analogical reasoning, metaphors in science, the structure of scientific theories, theoretical dynamics, heuristics, scientific discovery.
Conceptual Models and Theory-Embedded Principles on Effective Schooling.
Scheerens, Jaap
1997-01-01
Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…
Finite Unification: Theory, Models and Predictions
Heinemeyer, S; Zoupanos, G
2011-01-01
All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...
Foglia, L.; McNally, A.; Harter, T.
2012-12-01
The Scott River is one of four major tributaries in the Klamath River Basin that provide cold-water habitat for salmonid populations. The Scott Valley is also a major agricultural region with extensive alfalfa and hay production that is key to the local economy. Due to the Mediterranean climate in the area, discharge rates in the river are highly seasonal. Almost all annual discharge occurs during the winter precipitation season and spring snowmelt. During the summer months (July through September), the main-stem river becomes disconnected from its tributaries throughout much of Scott Valley and relies primarily on baseflow from the Scott Valley aquifer. Scott Valley agriculture relies on a combination of surface water and groundwater supplies for crop irrigation from April through September. Conflicts between the ecosystem-service need to guarantee sustainable water quality (mainly in-stream temperature) for the native salmon population and the water demands of agricultural irrigation motivated the development of a new conceptual model for evaluating the soil-water budget throughout the valley, as a basis for developing alternative surface water and groundwater management practices. The model simulates daily hydrologic fluxes at the individual field scale (100 - 200 m), allocates water resources to nearby irrigation systems, and tracks soil moisture to determine groundwater recharge. The water budget model provides recharge and pumping values for each field. These values in turn are used as inputs for a valley-wide groundwater model developed with MODFLOW-2000. In a first step, separate sensitivity analysis and calibration of the groundwater model are used to provide insights on the accuracy of the recharge and pumping distribution estimated with the water budget model. In a further step, the soil-water budget and groundwater flow models will be coupled and sensitivity analysis and calibration will be performed simultaneously. Field-based, local
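A field-scale daily soil-water budget of the kind described can be sketched as a single-bucket model: precipitation and irrigation fill the bucket, evapotranspiration drains it, and anything above field capacity becomes groundwater recharge. The capacity and daily fluxes below are invented placeholders, not Scott Valley values:

```python
def daily_water_budget(precip, et, irrigation, capacity=150.0, s0=75.0):
    """Toy single-bucket soil-moisture model (all fluxes in mm/day).
    Storage above field capacity drains to the aquifer as recharge;
    ET is limited by the water actually in storage."""
    s, recharge = s0, []
    for p, e, irr in zip(precip, et, irrigation):
        s = s + p + irr - min(e, s)   # add inputs, remove what ET can take
        r = max(0.0, s - capacity)    # excess becomes recharge
        recharge.append(r)
        s -= r
    return s, recharge

# Illustrative 5-day series: a storm on day 4 pushes storage past field
# capacity, generating a recharge pulse.
storage, rech = daily_water_budget(
    precip=[20, 0, 0, 90, 0], et=[3, 4, 4, 2, 3],
    irrigation=[0, 0, 10, 0, 0])
```

Summing the recharge series per field is the kind of output that would feed a MODFLOW-style groundwater model as a boundary flux.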
Theory, modeling and simulation: Annual report 1993
Energy Technology Data Exchange (ETDEWEB)
Dunning, T.H. Jr.; Garrett, B.C.
1994-07-01
Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.
PRINCIPLES OF FORMATION OF INNOVATIVE MODEL OF EFFICIENCY OF BUDGET FUNDS USE
Directory of Open Access Journals (Sweden)
Elena I. Chibisova
2013-01-01
The article describes an innovative approach to using performance indicators to improve the quality of control over the efficiency of budget-fund use, and proposes creating an internal budgetary administrative system to monitor the effectiveness and appropriateness of budgetary spending.
National Research Council Canada - National Science Library
2001-01-01
... (including the off-budget Social Security trust funds) to $281 billion. That surplus would be the largest in history in nominal dollars and the largest since 1948 as a percentage of gross domestic product (GDP...
Learn about the budget for the National Clinical Trials Network (NCTN), a National Cancer Institute program that gives funds and other support to cancer research organizations to conduct cancer clinical trials.
International Nuclear Information System (INIS)
Randjbar-Daemi, S.
1987-01-01
The propagation of closed bosonic strings interacting with background gravitational and dilaton fields is reviewed. The string is treated as a quantum field theory on a compact 2-dimensional manifold. The question is posed as to how the conditions for the vanishing trace anomaly and the ensuing background field equations may depend on global features of the manifold. It is shown that to the leading order in σ-model perturbation theory the string loop effects do not modify the gravitational and the dilaton field equations. However for the purely bosonic strings new terms involving the modular parameter of the world sheet are induced by quantum effects which can be absorbed into a re-definition of the background fields. The authors also discuss some aspects of several regularization schemes such as dimensional, Pauli-Villars and the proper-time cut off in an appendix
Steeneveld, G. J.; Tolk, L. F.; Moene, A. F.; Hartogensis, O. K.; Peters, W.; Holtslag, A. A. M.
2011-12-01
The Weather Research and Forecasting Model (WRF) and the Regional Atmospheric Mesoscale Model System (RAMS) are frequently used for (regional) weather, climate and air quality studies. This paper covers an evaluation of these models for a windy and calm episode against Cabauw tower observations (Netherlands), with a special focus on the representation of the physical processes in the atmospheric boundary layer (ABL). In addition, area-averaged sensible heat flux observations by scintillometry are utilized, which enables evaluation of grid-scale model fluxes and flux observations at the same horizontal scale. Also, novel ABL height observations by ceilometry and of the near-surface longwave radiation divergence are utilized. It appears that WRF in its basic set-up shows satisfactory model results for nearly all atmospheric near-surface variables compared to field observations, while RAMS needed refining of its ABL scheme. An important inconsistency was found regarding the ABL daytime heat budget: both model versions are only able to correctly forecast the ABL thermodynamic structure when the modeled surface sensible heat flux is much larger than both the eddy-covariance and scintillometer observations indicate. In order to clarify this discrepancy, model results for each term of the heat budget equation are evaluated against field observations. Sensitivity studies and evaluation of radiative tendencies and entrainment reveal that possible errors in these variables cannot explain the overestimation of the sensible heat flux within the current model infrastructure.
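The heat-budget check at issue can be illustrated with a slab (mixed-layer) view of the ABL: the warming rate implied by a surface sensible heat flux H over a layer of depth h is (1 + A) H / (rho * cp * h), with A the entrainment ratio. The numbers below are round illustrative values, not the Cabauw data:

```python
def mixed_layer_warming(H, h, A=0.2, rho=1.2, cp=1005.0):
    """Warming rate (K/s) of a slab boundary layer of depth h (m) driven
    by surface sensible heat flux H (W/m^2); entrainment at the layer top
    adds a fraction A of the surface flux."""
    return (1.0 + A) * H / (rho * cp * h)

# With H = 200 W/m^2 and h = 1000 m the layer warms roughly 0.7 K per
# hour; needing a much larger H than observed to reproduce the observed
# warming is exactly the budget inconsistency the paper describes.
rate = mixed_layer_warming(200.0, 1000.0) * 3600.0   # K per hour
```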
Bridging Economic Theory Models and the Cointegrated Vector Autoregressive Model
DEFF Research Database (Denmark)
Møller, Niels Framroze
2008-01-01
Examples of simple economic theory models are analyzed as restrictions on the Cointegrated VAR (CVAR). This establishes a correspondence between basic economic concepts and the econometric concepts of the CVAR: the economic relations correspond to cointegrating vectors, and exogeneity and the parameters of the CVAR are interpretable in terms of expectations formation, market clearing, nominal rigidities, etc. Finally, the general-partial equilibrium distinction is analyzed.
Quantum integrable models of field theory
International Nuclear Information System (INIS)
Faddeev, L.D.
1979-01-01
Fundamental features of the classical method of the inverse problem have been formulated in the form which is convenient for its quantum reformulation. Typical examples are studied which may help to formulate the quantum method of the inverse problem. Examples are considered for interaction with both attraction and repulsion at a final density. The sine-Gordon model and the XYZ model from the quantum theory of magnetics are examined in short. It is noted that all the achievements of the one-dimensional mathematical physics as applied to exactly solvable quantum models may be put to an extent within the framework of the quantum method of the inverse problem. Unsolved questions are enumerated and perspectives of applying the inverse problem method are shown
Theory and Model for Martensitic Transformations
DEFF Research Database (Denmark)
Lindgård, Per-Anker; Mouritsen, Ole G.
1986-01-01
Martensitic transformations are shown to be driven by the interplay between two fluctuating strain components. No soft mode is needed, but a central peak occurs representing the dynamics of strain clusters. A two-dimensional magnetic-analog model with the martensitic-transition symmetry is constructed and analyzed by computer simulation and by a theory which accounts for correlation effects. Dramatic precursor effects at the first-order transition are demonstrated. The model is also of relevance for surface reconstruction transitions.
Economic contract theory tests models of mutualism.
Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E
2010-09-07
Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.
Magnetic flux tube models in superstring theory
Russo, Jorge G
1996-01-01
Superstring models describing curved 4-dimensional magnetic flux tube backgrounds are exactly solvable in terms of free fields. We consider the simplest model of this type (corresponding to the 'Kaluza-Klein' Melvin background). Its 2d action has a flat but topologically non-trivial 10-dimensional target space (there is a mixing of the angular coordinate of the 2-plane with an internal compact coordinate). We demonstrate that this theory has broken supersymmetry but is perturbatively stable if the radius R of the internal coordinate is larger than R_0 = \sqrt{2\alpha'}. In the Green-Schwarz formulation the supersymmetry breaking is a consequence of the presence of a flat but non-trivial connection in the fermionic terms in the action. For R < R_0 and q > R/2\alpha' there appear instabilities corresponding to tachyonic winding states. The torus partition function Z(q,R) is finite for R > R_0 (and vanishes for qR = 2n, n = integer). At the special points qR = 2n (2n+1) the model is equivalent to the free superstring theory compactified on a circle...
Group theory for unified model building
International Nuclear Information System (INIS)
Slansky, R.
1981-01-01
The results gathered here on simple Lie algebras have been selected with attention to the needs of unified model builders who study Yang-Mills theories based on simple, local-symmetry groups that contain as a subgroup the SU(2)_w x U(1)_w x SU(3)_c symmetry of the standard theory of electromagnetic, weak, and strong interactions. The major topics include, after a brief review of the standard model and its unification into a simple group, the use of Dynkin diagrams to analyze the structure of the group generators and to keep track of the weights (quantum numbers) of the representation vectors; an analysis of the subgroup structure of simple groups, including explicit coordinatizations of the projections in weight space; lists of representations, tensor products and branching rules for a number of simple groups; and other details about groups and their representations that are often helpful for surveying unified models, including vector-coupling coefficient calculations. Tabulations of representations, tensor products, and branching rules for E_6, SO_10, SU_6, F_4, SO_9, SO_5, SO_8, SO_7, SU_4, E_7, E_8, SU_8, SO_14, SO_18, SO_22, and, for completeness, SU_3 are included. (These tables may have other applications.) Group-theoretical techniques for analyzing symmetry breaking are described in detail and many examples are reviewed, including explicit parameterizations of mass matrices. (orig.)
A matrix model from string field theory
Directory of Open Access Journals (Sweden)
Syoji Zeze
2016-09-01
We demonstrate that a Hermitian matrix model can be derived from level-truncated open string field theory with Chan-Paton factors. The Hermitian matrix is coupled with a scalar and U(N) vectors which are responsible for the D-brane at the tachyon vacuum. The effective potential for the scalar is evaluated both for finite and large N. An increase of the potential height is observed in both cases. The large-N matrix integral is identified with a system of N ZZ branes and a ghost FZZT brane.
Hu, Lu; Jacob, Daniel J.; Liu, Xiong; Zhang, Yi; Zhang, Lin; Kim, Patrick S.; Sulprizio, Melissa P.; Yantosca, Robert M.
2017-10-01
The global budget of tropospheric ozone is governed by a complicated ensemble of coupled chemical and dynamical processes. Simulation of tropospheric ozone has been a major focus of the GEOS-Chem chemical transport model (CTM) over the past 20 years, and many developments over the years have affected the model representation of the ozone budget. Here we conduct a comprehensive evaluation of the standard version of GEOS-Chem (v10-01) with ozone observations from ozonesondes, the OMI satellite instrument, and MOZAIC-IAGOS commercial aircraft for 2012-2013. Global validation of the OMI 700-400 hPa data with ozonesondes shows that OMI maintained persistent high quality and no significant drift over the 2006-2013 period. GEOS-Chem shows no significant seasonal or latitudinal bias relative to OMI and strong correlations in all seasons on the 2° × 2.5° horizontal scale (r = 0.88-0.95), improving on previous model versions. The most pronounced model bias revealed by ozonesondes and MOZAIC-IAGOS is at high northern latitudes in winter-spring where the model is 10-20 ppbv too low. This appears to be due to insufficient stratosphere-troposphere exchange (STE). Model updates to lightning NOx, Asian anthropogenic emissions, bromine chemistry, isoprene chemistry, and meteorological fields over the past decade have overall led to gradual increase in the simulated global tropospheric ozone burden and more active ozone production and loss. From simulations with different versions of GEOS meteorological fields we find that tropospheric ozone in GEOS-Chem v10-01 has a global production rate of 4960-5530 Tg a⁻¹, lifetime of 20.9-24.2 days, burden of 345-357 Tg, and STE of 325-492 Tg a⁻¹. Change in the intensity of tropical deep convection between these different meteorological fields is a major factor driving differences in the ozone budget.
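The budget quantities above are linked by a simple steady-state identity: lifetime = burden / loss rate. A quick consistency sketch, assuming (our assumption, not the paper's statement) that total loss is of the same order as chemical production plus STE:

```python
def lifetime_days(burden_Tg, loss_Tg_per_yr):
    """Mean tropospheric ozone lifetime implied by a burden (Tg) and a
    total loss rate (Tg per year), at steady state."""
    return burden_Tg / loss_Tg_per_yr * 365.0

# Midpoint of the paper's burden range (~351 Tg) against an assumed loss
# rate comparable to production plus part of STE (~5500 Tg/yr) lands in
# the paper's reported 20.9-24.2 day lifetime range.
tau = lifetime_days(351.0, 5500.0)
```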
On low rank classical groups in string theory, gauge theory and matrix models
International Nuclear Information System (INIS)
Intriligator, Ken; Kraus, Per; Ryzhov, Anton V.; Shigemori, Masaki; Vafa, Cumrun
2004-01-01
We consider N=1 supersymmetric U(N), SO(N), and Sp(N) gauge theories, with two-index tensor matter and added tree-level superpotential, for general breaking patterns of the gauge group. By considering the string theory realization and geometric transitions, we clarify when glueball superfields should be included and extremized, or rather set to zero; this issue arises for unbroken group factors of low rank. The string theory results, which are equivalent to those of the matrix model, refer to a particular UV completion of the gauge theory, which could differ from conventional gauge theory results by residual instanton effects. Often, however, these effects exhibit miraculous cancellations, and the string theory or matrix model results end up agreeing with standard gauge theory. In particular, these string theory considerations explain and remove some apparent discrepancies between gauge theories and matrix models in the literature
Application of Chaos Theory to Psychological Models
Blackerby, Rae Fortunato
This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. In some cases, these fluctuations converge to one value; in other cases, they diverge in
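The convergence-versus-fluctuation behavior described for the moral-behavior model is the signature of a nonlinear map with feedback. The logistic map is the standard minimal example of such a system; it is our stand-in here, not the dissertation's actual equations:

```python
def simulate(r, x0=0.5, steps=200, keep=8):
    """Iterate the logistic map x -> r*x*(1-x), where the current state
    feeds back into the next one, and return the last `keep` values."""
    x, tail = x0, []
    for i in range(steps):
        x = r * x * (1.0 - x)
        if i >= steps - keep:
            tail.append(x)
    return tail

# r = 2.8: the trajectory converges to the fixed point 1 - 1/r, the
# "fluctuations converge to one value" case described above.
tail_conv = simulate(2.8)
# r = 3.9: the same equation never settles -- the chaotic, divergent case.
tail_chaos = simulate(3.9)
```

The point of the analogy is that a single feedback parameter (here r, standing in for the strength of self, family, and community influences) moves the same system between orderly and chaotic regimes.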
PARFUME Theory and Model basis Report
Energy Technology Data Exchange (ETDEWEB)
Darrell L. Knudson; Gregory K Miller; G.K. Miller; D.A. Petti; J.T. Maki; D.L. Knudson
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
Stochastic linear programming models, theory, and computation
Kall, Peter
2011-01-01
This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...
Park, K.; Mak, J. E.; Emmons, L. K.
2008-12-01
Carbon monoxide is not only an important component for determining the atmospheric oxidizing capacity but also a key trace gas in the atmospheric chemistry of the Earth's background environment. The global CO cycle and its change are closely related to both the change in CO mixing ratio and the change in source strength. Previously, most top-down estimates of the global CO budget have used CO concentrations alone. Since CO from certain sources has a unique isotopic signature, its isotopes provide additional information to constrain its sources. Coupling the concentration and isotopic information thus makes it possible to tightly constrain the CO flux by source and allows better estimates of the global CO budget. MOZART4 (Model for Ozone And Related chemical Tracers), a 3-D global chemical transport model developed at NCAR, MPI for Meteorology and NOAA/GFDL, is used to simulate the global CO concentration and its isotopic signature. A tracer version of MOZART4, with C16O and C18O tagged by source region and source type, was also developed to assess their contributions to the atmosphere efficiently. Based on the nine-year simulation results, we analyze the influence of each CO source on the isotopic signature and the concentration. In particular, the evaluations focus on the oxygen isotope of CO (δ18O), which has not yet been extensively studied. To validate the model performance, CO concentrations and isotopic signatures measured at MPI, NIWA and our lab are compared with the modeled results. MOZART4 reproduced the observational data fairly well, especially in the mid- to high-latitude northern hemisphere. Bayesian inversion techniques have been used to estimate the global CO budget by combining observed and modeled CO concentrations. However, previous studies show significant differences in their estimates of CO source strengths. Because, in addition to the CO mixing ratio, isotopic signatures are independent tracers
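The extra constraint isotopes provide can be seen in a minimal two-source mass balance: the total CO amount plus its bulk δ18O determine both source strengths through a 2x2 linear system. The signatures and amounts below are illustrative placeholders of our own (combustion CO is 18O-enriched relative to oxidation-derived CO, but the exact numbers here are assumed):

```python
def solve_sources(total_co, delta_obs, delta_a, delta_b):
    """Two-source isotopic mass balance:
       f_a + f_b = total_co
       f_a * delta_a + f_b * delta_b = total_co * delta_obs
    Solving the 2x2 system gives each source's contribution."""
    f_a = total_co * (delta_obs - delta_b) / (delta_a - delta_b)
    return f_a, total_co - f_a

# Illustrative numbers: 100 units of CO with a bulk signature of 9 permil,
# mixing an enriched source (23 permil) with a 0-permil source.
fa, fb = solve_sources(100.0, 9.0, 23.0, 0.0)
```

With concentrations alone this split would be unidentifiable; the isotope equation is what closes the system, which is why coupling the two tightens the budget.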
Driscoll, Daniel G.; Norton, Parker A.
2009-01-01
The U.S. Geological Survey cooperated with South Dakota Game, Fish and Parks to characterize hydrologic information relevant to management of water resources associated with Sheridan Lake, which is formed by a dam on Spring Creek. This effort consisted primarily of characterization of hydrologic data for a base period of 1962 through 2006, development of a hydrologic budget for Sheridan Lake for this timeframe, and development of an associated model for simulation of storage deficits and drawdown in Sheridan Lake for hypothetical release scenarios from the lake. Historically, the dam has been operated primarily as a 'pass-through' system, in which unregulated outflows pass over the spillway; however, the dam recently was retrofitted with an improved control valve system that would allow controlled releases of about 7 cubic feet per second (ft3/s) or less from a fixed depth of about 60 feet (ft). Development of a hydrologic budget for Sheridan Lake involved compilation, estimation, and characterization of data sets for streamflow, precipitation, and evaporation. The most critical data need was for extrapolation of available short-term streamflow records for Spring Creek to be used as the long-term inflow to Sheridan Lake. Available short-term records for water years (WY) 1991-2004 for a gaging station upstream from Sheridan Lake were extrapolated to WY 1962-2006 on the basis of correlations with streamflow records for a downstream station and for stations located along two adjacent streams. Comparisons of data for the two streamflow-gaging stations along Spring Creek indicated that tributary inflow is approximately proportional to the intervening drainage area, which was used as a means of estimating tributary inflow for the hydrologic budget. Analysis of evaporation data shows that sustained daily rates may exceed maximum monthly rates by a factor of about two. A long-term (1962-2006) hydrologic budget was developed for computation of reservoir outflow from
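Record extension by correlation, as used above for the Spring Creek inflows, can be sketched with ordinary least squares on the overlap period (hydrologists often prefer variance-preserving variants such as MOVE.1; the flow values below are invented for illustration):

```python
def fit_line(x, y):
    """Least-squares slope and intercept relating a short streamflow
    record (y) to a correlated long-term index station (x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical overlapping annual flows (cfs): a short upstream record
# against a long-term index station.
index = [40.0, 55.0, 30.0, 70.0, 50.0]
short = [22.0, 30.0, 17.0, 38.0, 28.0]
slope, intercept = fit_line(index, short)

# Reconstruct the upstream flow for an unmonitored year in which the
# index station recorded 60 cfs.
estimate = slope * 60.0 + intercept
```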
Patel, Nitin R; Ankolekar, Suresh; Antonijevic, Zoran; Rajicic, Natasa
2013-05-10
We describe a value-driven approach to optimizing pharmaceutical portfolios. Our approach incorporates inputs from research and development and commercial functions by simultaneously addressing internal and external factors. This approach differentiates itself from current practices in that it recognizes the impact of study design parameters, sample size in particular, on the portfolio value. We develop an integer programming (IP) model as the basis for Bayesian decision analysis to optimize phase 3 development portfolios using expected net present value as the criterion. We show how this framework can be used to determine optimal sample sizes and trial schedules to maximize the value of a portfolio under budget constraints. We then illustrate the remarkable flexibility of the IP model to answer a variety of 'what-if' questions that reflect situations that arise in practice. We extend the IP model to a stochastic IP model to incorporate uncertainty in the availability of drugs from earlier development phases for phase 3 development in the future. We show how to use stochastic IP to re-optimize the portfolio development strategy over time as new information accumulates and budget changes occur. Copyright © 2013 John Wiley & Sons, Ltd.
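The budget-constrained selection at the heart of such an IP model can be sketched as a toy 0/1 program. The candidate names, trial costs, and expected NPVs below are invented for illustration; a production model would use an integer-programming solver rather than brute-force enumeration:

```python
from itertools import combinations

# Hypothetical phase 3 candidates: (name, trial cost, expected NPV if funded).
# All numbers are illustrative, not from the paper.
candidates = [
    ("drug_A", 40, 120),
    ("drug_B", 60, 150),
    ("drug_C", 30, 70),
    ("drug_D", 50, 110),
]

def best_portfolio(candidates, budget):
    """Exhaustive 0/1 selection: maximize total expected NPV under a budget cap.
    A real IP solver scales far better; this only shows the objective/constraint."""
    best, best_value = (), 0
    for r in range(1, len(candidates) + 1):
        for subset in combinations(candidates, r):
            cost = sum(c for _, c, _ in subset)
            value = sum(v for _, _, v in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return [name for name, _, _ in best], best_value

names, value = best_portfolio(candidates, budget=120)
```

Re-running `best_portfolio` with a different budget is the simplest form of the 'what-if' analysis the abstract describes.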
The Application of a Phosphorus Budget Model Estimating the Carrying Capacity of Kesikköprü Dam Lake
PULATSÜ, Serap
2014-01-01
The aim of this study was to estimate the carrying capacity of Kesikköprü Dam Lake, Ankara, where cage farms for the intensive culture of rainbow trout are located. For this purpose Dillon and Rigler's phosphorus budget model was applied in a series of steps and the carrying capacity of the lake was found to be 3335 tonnes per year. This estimated value was about 10 times higher than the present production level of the lake. It seems possible to orientate the fish culture in cages in...
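The Dillon and Rigler mass-balance on which such carrying-capacity estimates rest can be sketched as follows. The parameter values used in testing are illustrative, not those of Kesikköprü Dam Lake:

```python
def steady_state_p(L, R, z_mean, rho):
    """Dillon-Rigler steady-state total phosphorus concentration (mg/m^3).
    L: areal P loading (mg P/m^2/yr), R: retention coefficient,
    z_mean: mean depth (m), rho: flushing rate (1/yr)."""
    return L * (1 - R) / (z_mean * rho)

def extra_fish_capacity(p_max, p_now, R, z_mean, rho, area_m2, p_per_tonne_fish):
    """Tonnes of fish per year whose phosphorus waste keeps the lake at or
    below p_max. p_per_tonne_fish is the P load (mg) per tonne of production."""
    delta_L = (p_max - p_now) * z_mean * rho / (1 - R)  # allowable extra areal load
    total_load_mg = delta_L * area_m2                   # mg P per year
    return total_load_mg / p_per_tonne_fish
```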
Educational Program Evaluation Model, From the Perspective of the New Theories
Directory of Open Access Journals (Sweden)
Soleiman Ahmady
2014-05-01
Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic; 63 articles with full text were finally included. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.
Kuribayashi, Masatoshi; Noh, Nam-Jin; Saitoh, Taku M; Ito, Akihiko; Wakazuki, Yasutaka; Muraoka, Hiroyuki
2017-06-01
Accurate projection of the carbon budget in forest ecosystems under future climate and atmospheric carbon dioxide (CO2) concentrations is important for evaluating the function of terrestrial ecosystems, which serve as a major sink of atmospheric CO2. In this study, we examined the effects of the spatial resolution of meteorological data on the accuracy of ecosystem model simulations of canopy phenology and the carbon budget, namely gross primary production (GPP), ecosystem respiration (ER), and net ecosystem production (NEP), of a deciduous forest in Japan. We then simulated the future (around 2085) changes in canopy phenology and carbon budget of the forest by incorporating high-resolution meteorological data downscaled by a regional climate model. The ecosystem model overestimated GPP and ER when we input low-resolution data, which have warming biases over mountainous landscapes, but it reproduced canopy phenology and the carbon budget well when we input high-resolution data. Under the future climate, leaf expansion about 10 days earlier and leaf fall about 10 days later than at present were simulated, and GPP, ER and NEP were estimated to increase by 25.2%, 23.7% and 35.4%, respectively. Sensitivity analysis showed that the increase of NEP in June and October would be mainly caused by rising temperature, whereas that in July and August would be largely attributable to CO2 fertilization. This study suggests that downscaling of future climate data enables us to project a more reliable carbon budget for forest ecosystems in mountainous landscapes than low-resolution simulation, owing to better predictions of leaf expansion and shedding.
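Because NEP is the small residual of two large fluxes (NEP = GPP - ER), its percentage change can exceed those of GPP and ER, as in the figures above. A minimal check of that arithmetic, which also recovers the baseline ER/GPP ratio implied by the reported changes:

```python
def nep(gpp, er):
    # Net ecosystem production is the residual of two large opposing fluxes.
    return gpp - er

def implied_er_gpp_ratio(d_gpp, d_er, d_nep):
    """Given fractional changes in GPP, ER and NEP, solve
    d_gpp*G - d_er*E = d_nep*(G - E) for the baseline ratio E/G."""
    return (d_nep - d_gpp) / (d_nep - d_er)

# The reported +25.2% GPP, +23.7% ER, +35.4% NEP imply ER was about 87% of GPP.
ratio = implied_er_gpp_ratio(0.252, 0.237, 0.354)
```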
A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets
Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.
2009-12-01
The effort to constrain regional scale carbon budgets benefits from assimilating as many high quality data sources as possible in order to reduce uncertainties. Two of the most common approaches used in this field, bottom-up and top-down techniques, both have their strengths and weaknesses, and partly build on very different sources of information to train, drive, and validate the models. Within the context of the ORCA2 project, we follow both bottom-up and top-down modeling strategies with the ultimate objective of reconciling their surface flux estimates. The ORCA2 top-down component builds on a coupled WRF-STILT transport module that resolves the footprint function of a CO2 concentration measurement in high temporal and spatial resolution. Datasets involved in the current setup comprise GDAS meteorology, remote sensing products, VULCAN fossil fuel inventories, boundary conditions from CarbonTracker, and high-accuracy time series of atmospheric CO2 concentrations. Surface fluxes of CO2 are normally provided through a simple diagnostic model which is optimized against atmospheric observations. For the present study, we replaced the simple model with fluxes generated by an advanced bottom-up process model, Biome-BGC, which uses state-of-the-art algorithms to resolve plant-physiological processes, and 'grow' a biosphere based on biogeochemical conditions and climate history. This approach provides a more realistic description of biomass and nutrient pools than is the case for the simple model. The process model ingests various remote sensing data sources as well as high-resolution reanalysis meteorology, and can be trained against biometric inventories and eddy-covariance data. Linking the bottom-up flux fields to the atmospheric CO2 concentrations through the transport module allows evaluating the spatial representativeness of the BGC flux fields, and in that way assimilates more of the available information than either of the individual modeling techniques alone
Modeling and Optimization: Theory and Applications Conference
Terlaky, Tamás
2017-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Theory and modelling of nanocarbon phase stability.
Energy Technology Data Exchange (ETDEWEB)
Barnard, A. S.
2006-01-01
The transformation of nanodiamonds into carbon-onions (and vice versa) has been observed experimentally and has been modeled computationally at various levels of sophistication. Also, several analytical theories have been derived to describe the size, temperature and pressure dependence of this phase transition. However, in most cases a pure carbon-onion or nanodiamond is not the final product. More often than not an intermediary is formed, known as a bucky-diamond, with a diamond-like core encased in an onion-like shell. This has prompted a number of studies investigating the relative stability of nanodiamonds, bucky-diamonds, carbon-onions and fullerenes, in various size regimes. Presented here is a review outlining results of numerous theoretical studies examining the phase diagrams and phase stability of carbon nanoparticles, to clarify the complicated relationship between fullerenic and diamond structures at the nanoscale.
Modeling and Optimization: Theory and Applications Conference
Terlaky, Tamás
2015-01-01
This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.
Idealized numerical modeling of polar mesocyclones dynamics diagnosed by energy budget
Sergeev, Dennis; Stepanenko, Victor
2014-05-01
can be interpreted as the growth rate of the vortex) and energy conversion in the diagnostic equations for kinetic and available potential energy (APE). The energy budget equations are implemented in two forms. The first approach follows the scheme developed by Lorenz (1955), in which KE and APE are broken into a mean component and an eddy component, forming a well-known energy cycle. The second method is based on energy equations strictly derived from the governing equations of the numerical mesoscale model used. The latter approach hence takes into account all the approximations and numerical features used in the model. Some conclusions based on the comparison of the described methods are presented in the study. A series of high-resolution experiments is carried out using the three-dimensional non-hydrostatic limited-area sigma-coordinate numerical model ReMeDy (Research Mesoscale Dynamics), being developed at Lomonosov Moscow State University [3]. An idealized basic-state condition is used for all simulations: a zonally oriented baroclinic zone over a sea surface partly covered with ice. To realize a baroclinic channel environment, zero-gradient boundary conditions at the meridional lateral boundaries are imposed, while the zonal boundary conditions are periodic. The mesocyclone is initialized by creating a small axisymmetric vortex in the center of the model domain. The baroclinicity and stratification of the basic state, as well as the surface parameters, are varied in the typically observed range. References 1. Heinemann G, Øyvind S. 2013. Workshop On Polar Lows. Bull. Amer. Meteor. Soc. 94: ES123-ES126. 2. Yanase W, Niino H. 2006. Dependence of Polar Low Development on Baroclinicity and Physical Processes: An Idealized High-Resolution Experiment, J. Atmos. Sci. 64: 3044-3067. 3. Chechin DG et al. 2013. Idealized dry quasi 2-D mesoscale simulations of cold-air outbreaks over the marginal sea ice zone with fine
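The Lorenz-style split of kinetic energy into mean and eddy parts can be sketched in a few lines of numpy. This is only the KE decomposition; the study's full budget also carries APE and the conversion terms between the reservoirs:

```python
import numpy as np

def mean_eddy_ke(u, v, axis=-1):
    """Split horizontal kinetic energy into mean and eddy parts, Lorenz (1955) style.
    u, v: velocity arrays; the average is taken along `axis` (e.g. the zonal direction)."""
    u_mean = u.mean(axis=axis, keepdims=True)
    v_mean = v.mean(axis=axis, keepdims=True)
    up, vp = u - u_mean, v - v_mean                 # eddy (deviation) components
    ke_mean = 0.5 * (u_mean**2 + v_mean**2).mean()  # KE of the mean flow
    ke_eddy = 0.5 * (up**2 + vp**2).mean()          # KE of the eddies
    return ke_mean, ke_eddy
```

Because the cross term averages to zero along the chosen axis, the two parts sum to the total domain-mean KE.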
Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian
2008-04-01
The critical dimension off-target (CDO) is a key parameter for mask house customers, directly affecting the performance of the mask. The CDO is the difference between the feature size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. A difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not take into account the instability of the process bias. This instability results from minor variations - instabilities of manufacturing processes and changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. For systematic investigation of the unit-process contributions to CDO and analysis of the factors influencing them, a solid understanding of each unit process and a huge number of masks are necessary. Rough identification of contributing processes and splitting of the final CDO variation between processes can be done with approx. 50 masks with identical design, material and process. Such an amount of data allows us to identify the main contributors and estimate their effects by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within a particular unit process, but it provides a good estimate of the impact of the process on the stability of the manufacturing line. Additionally, this analysis can be used to identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate limits for CDO budgeting models given by the precision and the number of measurements, as well as partitioning the variation within the manufacturing process. The CDO variation splits according to
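The variance partition that such an ANOVA performs can be sketched in pure Python. The group values in the test stand in for CDO measurements (in nm) grouped by unit process; they are illustrative only:

```python
def cd_off_target(measured, target):
    # CD off-target is simply measured feature size minus its target.
    return [m - target for m in measured]

def variance_partition(groups):
    """One-way ANOVA-style split of the total sum of squares into between-group
    and within-group parts; `groups` is a list of lists of CDO values."""
    allv = [x for g in groups for x in g]
    grand = sum(allv) / len(allv)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ss_total = sum((x - grand) ** 2 for x in allv)
    return ss_between, ss_within, ss_total
```

A large between-group share flags a unit process as a main CDO contributor; it says nothing about the root cause within that process, matching the caveat above.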
Game Theory and its Relationship with Linear Programming Models ...
African Journals Online (AJOL)
Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and the linear programming problem are closely related subjects, since any computing method devised for ...
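The standard reduction behind that relationship - solving a two-player zero-sum matrix game as a linear program - can be sketched with scipy (assumed available):

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(A):
    """Row player's optimal mixed strategy for payoff matrix A via LP:
    maximize v subject to x^T A >= v for every column, sum(x) = 1, x >= 0."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    # Variables: m strategy weights followed by the game value v; minimize -v.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    A_ub = np.hstack([-A.T, np.ones((n, 1))])  # v - x^T A[:, j] <= 0 per column j
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])  # weights sum to 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:m], res.x[-1]

# Matching pennies: the optimal strategy is (1/2, 1/2) with game value 0.
x, v = solve_zero_sum([[1.0, -1.0], [-1.0, 1.0]])
```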
Hosotani model in closed string theory
International Nuclear Information System (INIS)
Shiraishi, Kiyoshi.
1988-11-01
The Hosotani mechanism in closed string theory with current algebra symmetry is described by the (old covariant) operator method. We compare the gauge symmetry breaking mechanism in a string theory which has SU(2) symmetry with the one in an equivalent compactified closed string theory. We also investigate the difference between the Hosotani mechanism and the Higgs mechanism in closed string theories by calculating a four-point amplitude of 'Higgs' bosons at tree level. (author)
A modelling study of the impact of cirrus clouds on the moisture budget of the upper troposphere
Directory of Open Access Journals (Sweden)
S. Fueglistaler
2006-01-01
Full Text Available We present a modelling study of the effect of cirrus clouds on the moisture budget of the layer wherein the cloud formed. Our framework simplifies many aspects of cloud microphysics and collapses the problem of sedimentation onto a 0-dimensional box model, but retains essential feedbacks between saturation mixing ratio, particle growth, and water removal through particle sedimentation. The water budget is described by two coupled first-order differential equations for dimensionless particle number density and saturation point temperature, where the parameters defining the system (layer depth, reference temperature, amplitude and time scale of the temperature perturbation, and initial particle number density, which may or may not be a function of reference temperature and cooling rate) are encapsulated in a single coefficient. This allows us to scale the results to a broad range of atmospheric conditions, and to test sensitivities. Results of the moisture budget calculations are presented for a range of atmospheric conditions (T: 238–205 K; p: 325–180 hPa) and a range of time scales τT of the temperature perturbation that induces the cloud formation. The cirrus clouds are found to efficiently remove water for τT longer than a few hours, with longer perturbations (τT≳10 h) required at lower temperatures (T≲210 K). Conversely, we find that temperature perturbations of duration order 1 h and less (a typical timescale for, e.g., gravity waves) do not efficiently dehydrate over most of the upper troposphere. A consequence is that (for particle densities typical of current cirrus clouds) the assumption of complete dehydration to the saturation mixing ratio may yield valid predictions for upper tropospheric moisture distributions if it is based on the large-scale temperature field, but this assumption is not necessarily valid if it is based on smaller-scale temperature fields.
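A 0-dimensional box model of this kind reduces numerically to stepping a small coupled ODE system forward in time. The sketch below shows only that numerical skeleton; the paper's actual two equations for particle number density and saturation-point temperature are not reproduced here, and the right-hand side passed in the test is a placeholder:

```python
import numpy as np

def integrate_box_model(f, y0, t_end, dt):
    """Forward-Euler skeleton for a 0-D box model dy/dt = f(t, y), where y is a
    small state vector (e.g. particle number density and saturation temperature)."""
    t, y = 0.0, np.asarray(y0, dtype=float)
    while t < t_end:
        y = y + dt * np.asarray(f(t, y))  # one explicit Euler step
        t += dt
    return y
```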
The Properties of Model Selection when Retaining Theory Variables
DEFF Research Database (Denmark)
Hendry, David F.; Johansen, Søren
Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x_t, is embedded within a larger set of m+k candidate variables, (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.
Bounded Rationality and Budgeting
Ibrahim, Mukdad
2016-01-01
This article discusses the theory of bounded rationality, which was introduced by Herbert Simon in the 1950s. Simon introduced the notion of bounded rationality stating that while decision-makers strive for rationality, they are limited by the effect of the environment, their information-processing capacity, and the constraints on their information storage and retrieval capabilities. Moreover, this article tries to specifically blend this notion into budgeting, using the foundations of inc...
System Dynamics as Model-Based Theory Building
Schwaninger, Markus; Grösser, Stefan N.
2008-01-01
This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...
A Realizability Model for Impredicative Hoare Type Theory
DEFF Research Database (Denmark)
Petersen, Rasmus Lerchedal; Birkedal, Lars; Nanevski, Alexandar
2008-01-01
We present a denotational model of impredicative Hoare Type Theory, a very expressive dependent type theory in which one can specify and reason about mutable abstract data types. The model ensures soundness of the extension of Hoare Type Theory with impredicative polymorphism; makes the connections to separation logic clear; and provides a basis for investigation of further sound extensions of the theory, in particular equations between computations and types.
Irreducible integrable theories form tensor products of conformal models
International Nuclear Information System (INIS)
Mathur, S.D.; Warner, N.P.
1991-01-01
By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)
Volcanic aquifers of Hawai‘i—Hydrogeology, water budgets, and conceptual models
Izuka, Scot K.; Engott, John A.; Rotzoll, Kolja; Bassiouni, Maoya; Johnson, Adam G.; Miller, Lisa D.; Mair, Alan
2016-06-13
Hawai‘i’s aquifers have limited capacity to store fresh groundwater because each island is small and surrounded by saltwater. Saltwater also underlies much of the fresh groundwater. Fresh groundwater resources are, therefore, particularly vulnerable to human activity, short-term climate cycles, and long-term climate change. Availability of fresh groundwater for human use is constrained by the degree to which the impacts of withdrawal—such as lowering of the water table, saltwater intrusion, and reduction in the natural discharge to springs, streams, wetlands, and submarine seeps—are deemed acceptable. This report describes the hydrogeologic framework, groundwater budgets (inflows and outflows), conceptual models of groundwater occurrence and movement, and the factors limiting groundwater availability for the largest and most populated of the Hawaiian Islands—Kaua‘i, O‘ahu, Maui, and Hawai‘i Island.The bulk of each of Hawai‘i’s islands is built of many thin lava flows erupted from shield volcanoes; the great piles of lava flows form highly permeable aquifers. In some areas, low-permeability dikes cutting across the lava flows, or low-permeability ash and soil horizons interlayered with the lava flows, can substantially alter groundwater flow. On some islands, sedimentary rocks form thick semiconfining coastal-plain deposits, locally known as caprock, that impede natural groundwater discharge to the ocean. In some regions, thick lava flows that ponded in preexisting depressions form aquifers that are much less permeable than aquifers formed by thin lava flows.Fresh groundwater inflow to Hawai‘i’s aquifers comes from recharge. For predevelopment conditions (1870), estimates of groundwater recharge from this study are 871, 675, 1,279, and 5,291 million gallons per day (Mgal/d) for Kaua‘i, O‘ahu, Maui, and Hawai‘i Island, respectively. Estimates of recharge for recent conditions (2010 land cover and 1978–2007 rainfall for Kaua‘i, O
DEFF Research Database (Denmark)
Maar, Marie; Saurel, Camille; Landes, Anja
2015-01-01
For blue mussels, Mytilus edulis, one major constraint in the Baltic Sea is the low salinities that reduce the efficiency of mussel production. However, the effects of living in low and variable salinity regimes are rarely considered in models describing mussel growth. The aim of the present study was to incorporate the effects of low salinity into an eco-physiological model of blue mussels and to identify areas suitable for mussel production. A Dynamic Energy Budget (DEB) model was modified with respect to i) the morphological parameters (DW/WW-ratio, shape factor), ii) change in ingestion rate and iii) metabolic costs due to osmoregulation in different salinity environments. The modified DEB model was validated with experimental data from different locations in the Western Baltic Sea (including the Limfjorden) with salinities varying from 8.5 to 29.9 psu. The identified areas suitable for mussel production...
International Nuclear Information System (INIS)
Cooper, F.
1996-01-01
We review the assumptions and domain of applicability of Landau's Hydrodynamical Model. By considering two models of particle production, pair production from strong electric fields and particle production in the linear σ model, we demonstrate that many of Landau's ideas are verified in explicit field theory calculations
Chaos Theory as a Model for Managing Issues and Crises.
Murphy, Priscilla
1996-01-01
Uses chaos theory to model public relations situations in which the salient feature is volatility of public perceptions. Discusses the premises of chaos theory and applies them to issues management, the evolution of interest groups, crises, and rumors. Concludes that chaos theory is useful as an analogy to structure image problems and to raise…
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
A Leadership Identity Development Model: Applications from a Grounded Theory
Komives, Susan R.; Mainella, Felicia C.; Longerbeam, Susan D.; Osteen, Laura; Owen, Julie E.
2006-01-01
This article describes a stage-based model of leadership identity development (LID) that resulted from a grounded theory study on developing a leadership identity (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005). The LID model expands on the leadership identity stages, integrates the categories of the grounded theory into the LID model, and…
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Directory of Open Access Journals (Sweden)
JOEL Arnault
2013-03-01
Full Text Available The case study of a mountain wave triggered by the Antarctic Peninsula on 6 October 2005, which has already been documented in the literature, is chosen here to quantify the associated gravity wave forcing on the large-scale flow, with a budget analysis of the horizontal wind components and horizontal kinetic energy. In particular, a numerical simulation using the Weather Research and Forecasting (WRF model is compared to a control simulation with flat orography to separate the contribution of the mountain wave from that of other synoptic processes of non-orographic origin. The so-called differential budgets of horizontal wind components and horizontal kinetic energy (after subtracting the results from the simulation without orography are then averaged horizontally and vertically in the inner domain of the simulation to quantify the mountain wave dynamical influence at this scale. This allows for a quantitative analysis of the simulated mountain wave's dynamical influence, including the orographically induced pressure drag, the counterbalancing wave-induced vertical transport of momentum from the flow aloft, the momentum and energy exchanges with the outer flow at the lateral and upper boundaries, the effect of turbulent mixing, the dynamics associated with geostrophic re-adjustment of the inner flow, the deceleration of the inner flow, the secondary generation of an inertia–gravity wave and the so-called baroclinic conversion of energy between potential energy and kinetic energy.
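The 'differential budget' operation described above - subtracting the flat-orography control run from the full run, then averaging over the inner domain - is simple to express. Term names and array shapes below are placeholders, not WRF variable names:

```python
import numpy as np

def differential_budget(full, control):
    """Differential budget terms: subtract the control simulation (flat orography)
    from the full simulation, then average horizontally and vertically.
    `full` and `control` map budget-term names to 3-D (z, y, x) arrays."""
    return {k: float((full[k] - control[k]).mean()) for k in full}
```

What remains after the subtraction is attributable to the mountain wave, since synoptic processes of non-orographic origin are common to both runs.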
Theory and modeling of active brazing.
Energy Technology Data Exchange (ETDEWEB)
van Swol, Frank B.; Miller, James Edward; Lechman, Jeremy B.; Givler, Richard C.
2013-09-01
Active brazes have been used for many years to produce bonds between metal and ceramic objects. By including a relatively small amount of a reactive additive in the braze, one seeks to improve the wetting and spreading behavior of the braze. The additive modifies the substrate, either by a chemical surface reaction or possibly by alloying. By its nature, the joining process with active brazes is a complex nonequilibrium, non-steady-state process that couples chemical reaction and reactant and product diffusion to the rheology and wetting behavior of the braze. Most of these subprocesses take place in the interfacial region, and most are difficult to access by experiment. To improve control over the brazing process, one requires a better understanding of the melting of the active braze, the rate of the chemical reaction, reactant and product diffusion rates, the nonequilibrium composition-dependent surface tension, and the viscosity. This report identifies ways in which modeling and theory can assist in improving our understanding.
Miller, James R.; Russell, Gary L.
1996-01-01
The annual flux of freshwater into the Arctic Ocean by the atmosphere and rivers is balanced by the export of sea ice and oceanic freshwater. Two 150-year simulations of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. Relative to the control, the last 50-year period of the GHG experiment indicates that the total inflow of water from the atmosphere and rivers increases by 10% primarily due to an increase in river discharge, the annual sea-ice export decreases by about half, the oceanic liquid water export increases, salinity decreases, sea-ice cover decreases, and the total mass and sea-surface height of the Arctic Ocean increase. The closed, compact, and multi-phased nature of the hydrologic cycle in the Arctic Ocean makes it an ideal test of water budgets that could be included in model intercomparisons.
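The balance described above amounts to one-line bookkeeping: freshwater inflow (atmosphere plus rivers) minus export (sea ice plus liquid ocean water). Flux values in the test are illustrative, not the model's:

```python
def storage_change(atm_in, river_in, ice_export, liquid_export):
    """Freshwater mass balance for the basin: inflows minus outflows (same units,
    e.g. km^3/yr). Zero means the budget is closed, as in the control climate;
    a positive value means the ocean accumulates freshwater, as in the GHG run."""
    return (atm_in + river_in) - (ice_export + liquid_export)
```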
Domain Theory, Its Models and Concepts
DEFF Research Database (Denmark)
Andreasen, Mogens Myrup; Howard, Thomas J.; Bruun, Hans Peter Lomholt
2014-01-01
Domain Theory is a systems approach for the analysis and synthesis of products. Its basic idea is to view a product as systems of activities, organs and parts and to define structure, elements, behaviour and function in these domains. The theory is a basis for a long line of research contribution...
Filgueira, Ramón; Rosland, Rune; Grant, Jon
2011-11-01
Growth of Mytilus edulis was simulated using individual based models following both Scope For Growth (SFG) and Dynamic Energy Budget (DEB) approaches. These models were parameterized using independent studies and calibrated for each dataset by adjusting the half-saturation coefficient of the food ingestion function term, XK, a common parameter in both approaches related to feeding behavior. Auto-calibration was carried out using an optimization tool, which provides an objective way of tuning the model. Both approaches yielded similar performance, suggesting that although the basis for constructing the models is different, both can successfully reproduce M. edulis growth. The good performance of both models in different environments achieved by adjusting a single parameter, XK, highlights the potential of these models for (1) producing prospective analysis of mussel growth and (2) investigating mussel feeding response in different ecosystems. Finally, we emphasize that the convergence of two different modeling approaches via calibration of XK, indicates the importance of the feeding behavior and local trophic conditions for bivalve growth performance. Consequently, further investigations should be conducted to explore the relationship of XK to environmental variables and/or to the sophistication of the functional response to food availability with the final objective of creating a general model that can be applied to different ecosystems without the need for calibration.
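The shared parameter XK, the half-saturation coefficient of the food ingestion term, and its per-dataset calibration can be sketched as follows. The saturating functional response and the grid search are stand-ins for the papers' exact formulations and optimization tool:

```python
def ingestion(X, XK, i_max=1.0):
    """Saturating (Holling type II / Michaelis-Menten-style) ingestion response:
    half of i_max is reached when food density X equals XK."""
    return i_max * X / (X + XK)

def calibrate_xk(food, observed, candidates):
    """Pick the candidate XK minimizing squared error between modeled and
    observed ingestion - a crude stand-in for an auto-calibration tool."""
    def sse(xk):
        return sum((ingestion(x, xk) - o) ** 2 for x, o in zip(food, observed))
    return min(candidates, key=sse)
```

Because XK is the only tuned parameter, its fitted value summarizes how local trophic conditions shift the feeding response between ecosystems.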
Big bang models in string theory
Energy Technology Data Exchange (ETDEWEB)
Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes Pleinlaan 2, B-1050 Brussels (Belgium)
2006-11-07
These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.
The Standard Model is Natural as Magnetic Gauge Theory
DEFF Research Database (Denmark)
Sannino, Francesco
2011-01-01
We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations.
Hobbelen, P.H.F.; van Gestel, C.A.M.
2007-01-01
The aim of this study was to predict the dependence on temperature and food density of effects of Cu on the litter consumption by the earthworm Lumbricus rubellus, using a dynamic energy budget model (DEB-model). As a measure of the effects of Cu on food consumption, EC50s (soil concentrations
Chiral gauged Wess-Zumino-Witten theories and coset models in conformal field theory
International Nuclear Information System (INIS)
Chung, S.; Tye, S.H.
1993-01-01
The Wess-Zumino-Witten (WZW) theory has a global symmetry denoted by G_L ⊗ G_R. In the standard gauged WZW theory, vector gauge fields (i.e., with vector gauge couplings) are in the adjoint representation of the subgroup H ⊂ G. In this paper, we show that, in the conformal limit in two dimensions, there is a gauged WZW theory where the gauge fields are chiral and belong to the subgroups H_L and H_R, where H_L and H_R can be different groups. In the special case where H_L = H_R, the theory is equivalent to the vector gauged WZW theory. For general groups H_L and H_R, an examination of the correlation functions (or more precisely, conformal blocks) shows that the chiral gauged WZW theory is equivalent to (G/H_L)_L ⊗ (G/H_R)_R coset models in conformal field theory.
Spatial data modelling and maximum entropy theory
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana; Ocelíková, E.
2005-01-01
Roč. 51, č. 2 (2005), s. 80-83 ISSN 0139-570X Institutional research plan: CEZ:AV0Z10750506 Keywords : spatial data classification * distribution function * error distribution Subject RIV: BD - Theory of Information
Electroweak theory and the Standard Model
CERN. Geneva; Giudice, Gian Francesco
2004-01-01
There is a natural splitting in four sectors of the theory of the ElectroWeak (EW) Interactions, at pretty different levels of development/test. Accordingly, the 5 lectures are organized as follows, with an eye to the future: Lecture 1: The basic structure of the theory; Lecture 2: The gauge sector; Lecture 3: The flavor sector; Lecture 4: The neutrino sector; Lecture 5: The EW symmetry breaking sector.
Statistical Learning Theory: Models, Concepts, and Results
von Luxburg, Ulrike; Schoelkopf, Bernhard
2008-01-01
Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.
Glass Durability Modeling, Activated Complex Theory (ACT)
International Nuclear Information System (INIS)
CAROL, JANTZEN
2005-01-01
atomic ratios is shown to represent the structural effects of the glass on the dissolution and the formation of activated complexes in the glass leached layer. This provides two different methods by which a linear glass durability model can be formulated: one based on the quasi-crystalline mineral species in a glass and one based on cation ratios in the glass; both are related to the activated complexes on the surface by the law of mass action. The former would allow a new Thermodynamic Hydration Energy Model to be developed, based on the hydration of the quasi-crystalline mineral species, if all the pertinent thermodynamic data were available. Since the pertinent thermodynamic data are not available, the quasi-crystalline mineral species and the activated complexes can be related to cation ratios in the glass by the law of mass action. The cation ratio model can thus be used by waste form producers to formulate durable glasses based on fundamental structural and activated complex theories. Moreover, a glass durability model based on atomic ratios simplifies HLW glass process control, in that the measured ratios of only a few waste components and glass formers can be used to predict complex HLW glass performance with a high degree of accuracy, e.g. R² ≈ 0.97.
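At its core, the cation-ratio model is a linear fit of dissolution behaviour to glass composition ratios. The sketch below is illustrative only: the ratios and log dissolution rates are invented, not Jantzen's data, and the two predictor columns stand in for hypothetical ratios such as alkali/Si and B/Si.

```python
import numpy as np

# Invented cation ratios (two hypothetical ratio predictors) and
# log dissolution rates, standing in for the law-of-mass-action relation.
ratios = np.array([[0.20, 0.10],
                   [0.30, 0.12],
                   [0.40, 0.18],
                   [0.50, 0.22],
                   [0.60, 0.30]])
log_rate = np.array([-1.9, -1.6, -1.2, -0.9, -0.4])

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(len(ratios)), ratios])
coef, *_ = np.linalg.lstsq(X, log_rate, rcond=None)

# Coefficient of determination, the R^2 quoted in the abstract.
pred = X @ coef
r2 = 1 - np.sum((log_rate - pred) ** 2) / np.sum((log_rate - log_rate.mean()) ** 2)
print(round(r2, 3))  # near 1 for this well-behaved toy data
```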
Solid modeling and applications rapid prototyping, CAD and CAE theory
Um, Dugan
2016-01-01
The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier Spline curves theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notion for FEM, the stiffness method, and truss Equations. And in Rapid Prototyping, the author illustrates stereo lithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...
Supersymmetry and String Theory: Beyond the Standard Model
International Nuclear Information System (INIS)
Rocek, Martin
2007-01-01
When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focused String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)
Introduction to gauge theories and the Standard Model
de Wit, Bernard
1995-01-01
The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.
A 'theory of everything'? [Extending the Standard Model
International Nuclear Information System (INIS)
Ross, G.G.
1993-01-01
The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate ''theory of everything''. In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)
Neutron Star Models in Alternative Theories of Gravity
Manolidis, Dimitrios
We study the structure of neutron stars in a broad class of alternative theories of gravity. In particular, we focus on Scalar-Tensor theories and f(R) theories of gravity. We construct static and slowly rotating numerical star models for a set of equations of state, including a polytropic model and more realistic equations of state motivated by nuclear physics. Observable quantities such as masses, radii, etc. are calculated for a set of parameters of the theories. Specifically for Scalar-Tensor theories, we also calculate the sensitivities of the mass and moment of inertia of the models to variations in the asymptotic value of the scalar field at infinity. These quantities enter post-Newtonian equations of motion and gravitational waveforms of two-body systems that are used for gravitational-wave parameter estimation, in order to test these theories against observations. The construction of numerical models of neutron stars in f(R) theories of gravity has been difficult in the past. Using a new formalism by Jaime, Patino and Salgado we were able to construct models with high interior pressure, namely p_c > ρ_c/3, both for constant density models and models with a polytropic equation of state. Thus, we have shown that earlier objections to f(R) theories on the basis of the inability to construct viable neutron star models are unfounded.
Generalized algebra-valued models of set theory
Löwe, B.; Tarafder, S.
2015-01-01
We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.
A QCD Model Using Generalized Yang-Mills Theory
International Nuclear Information System (INIS)
Wang Dianfu; Song Heshan; Kou Lina
2007-01-01
Generalized Yang-Mills theory has a covariant derivative, which contains both vector and scalar gauge bosons. Based on this theory, we construct a strong interaction model by using the group U(4). By using this U(4) generalized Yang-Mills model, we also obtain a gauge potential solution, which can be used to explain the asymptotic behavior and color confinement.
A review of organizational buyer behaviour models and theories ...
African Journals Online (AJOL)
Over the years, models have been developed, and theories propounded, to explain the behavior of industrial buyers on the one hand and the nature of the dyadic relationship between organizational buyers and sellers on the other hand. This paper is an attempt at a review of the major models and theories in extant ...
Sahin, Oz; Bertone, Edoardo; Beal, Cara; Stewart, Rodney A
2018-06-01
Population growth, coupled with declining water availability and changes in climatic conditions underline the need for sustainable and responsive water management instruments. Supply augmentation and demand management are the two main strategies used by water utilities. Water demand management has long been acknowledged as a least-cost strategy to maintain water security. This can be achieved in a variety of ways, including: i) educating consumers to limit their water use; ii) imposing restrictions/penalties; iii) using smart and/or efficient technologies; and iv) pricing mechanisms. Changing water consumption behaviours through pricing or restrictions is challenging as it introduces more social and political issues into the already complex water resources management process. This paper employs a participatory systems modelling approach for: (1) evaluating various forms of a proposed tiered scarcity adjusted water budget and pricing structure, and (2) comparing scenario outcomes against the traditional restriction policy regime. System dynamics modelling was applied since it can explicitly account for the feedbacks, interdependencies, and non-linear relations that inherently characterise the water tariff (price)-demand-revenue system. A combination of empirical water use data, billing data and customer feedback on future projected water bills facilitated the assessment of the suitability and likelihood of the adoption of scarcity-driven tariff options for a medium-sized city within Queensland, Australia. Results showed that the tiered scarcity adjusted water budget and pricing structure presented was preferable to restrictions since it could maintain water security more equitably with the lowest overall long-run marginal cost. Copyright © 2018 Elsevier Ltd. All rights reserved.
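The tiered, scarcity-adjusted budget structure can be illustrated with a toy billing function: consumption within each multiple of a household's water budget is charged at an increasing rate, and all rates scale with a scarcity factor. Every tier multiplier, rate and scarcity value below is invented for illustration; this is not the tariff evaluated in the paper.

```python
def water_bill(use_kL, budget_kL, scarcity=1.0, base_rate=2.0,
               tiers=((1.0, 1.0), (1.5, 2.0), (float("inf"), 4.0))):
    """Tiered scarcity-adjusted bill. Each tier is (cap as a multiple of the
    household budget, rate multiplier); scarcity scales all rates, e.g. with
    falling dam levels. All numbers are illustrative."""
    bill, prev_cap = 0.0, 0.0
    for cap_mult, rate_mult in tiers:
        cap = cap_mult * budget_kL
        in_tier = max(0.0, min(use_kL, cap) - prev_cap)
        bill += in_tier * base_rate * rate_mult * scarcity
        prev_cap = cap
    return bill

# A household with a 30 kL/quarter budget using 40 kL:
# 30 kL charged at tier 1, the excess 10 kL at twice the rate.
print(water_bill(40, 30))                # 30*2*1 + 10*2*2 = 100.0
print(water_bill(40, 30, scarcity=1.5))  # drought surcharge: 150.0
```

In a system dynamics setting this function would sit inside a feedback loop, with demand responding to the bill and scarcity responding to aggregate demand.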
Energy Technology Data Exchange (ETDEWEB)
Hollmann, R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik
2000-07-01
For forty years, instruments onboard satellites have demonstrated their usefulness for many applications in the fields of meteorology and oceanography. Several experiments, like ERBE, were dedicated to establishing a climatology of the global Earth radiation budget at the top of the atmosphere. The focus has now shifted to the regional scale, e.g. GEWEX with its regional sub-experiments like BALTEX. To obtain a regional radiation budget for Europe, in the first part of this work the well-calibrated measurements from ScaRaB (scanner for radiation budget) are used to derive a narrow-to-broadband conversion that is applicable to the AVHRR (advanced very high resolution radiometer). It is shown that the accuracy of the method is of the order of that of ScaRaB itself. In the second part of the work, results of REMO are compared with measurements of ScaRaB and AVHRR for March 1994. The model reproduces the measurements well overall, but it overestimates the cold areas and underestimates the warm areas in the longwave spectral domain. Similarly, it overestimates the dark areas and underestimates the bright areas in the solar spectral domain. (orig.)
The Birth of Model Theory Lowenheim's Theorem in the Frame of the Theory of Relatives
Badesa, Calixto
2008-01-01
Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Cali
Non-linear σ-models and string theories
International Nuclear Information System (INIS)
Sen, A.
1986-10-01
The connection between σ-models and string theories is discussed, as well as how the σ-models can be used as tools to prove various results in string theories. Closed bosonic string theory in the light cone gauge is very briefly introduced. Then, closed bosonic string theory in the presence of massless background fields is discussed. The light cone gauge is used, and it is shown that in order to obtain a Lorentz invariant theory, the string theory in the presence of background fields must be described by a two-dimensional conformally invariant theory. The resulting constraints on the background fields are found to be the equations of motion of the string theory. The analysis is extended to the case of the heterotic string theory and the superstring theory in the presence of the massless background fields. It is then shown how to use these results to obtain nontrivial solutions to the string field equations. Another application of these results is shown, namely to prove that the effective cosmological constant after compactification vanishes as a consequence of the classical equations of motion of the string theory. 34 refs
Toric Methods in F-Theory Model Building
Directory of Open Access Journals (Sweden)
Johanna Knapp
2011-01-01
We discuss recent constructions of global F-theory GUT models and explain how to make use of toric geometry to do calculations within this framework. After introducing the basic properties of global F-theory GUTs, we give a self-contained review of toric geometry and introduce all the tools that are necessary to construct and analyze global F-theory models. We will explain how to systematically obtain a large class of compact Calabi-Yau fourfolds which can support F-theory GUTs by using the software package PALP.
Quantum Link Models and Quantum Simulation of Gauge Theories
International Nuclear Information System (INIS)
Wiese, U.J.
2015-01-01
This lecture is about Quantum Link Models and Quantum Simulation of Gauge Theories. The lecture consists of four parts. The first part gives a brief history of computing, and pioneers of quantum computing and quantum simulations of quantum spin systems are introduced. The second lecture is about High-Temperature Superconductors versus QCD, Wilson's Lattice QCD and Abelian Quantum Link Models. The third lecture deals with Quantum Simulators for Abelian Lattice Gauge Theories and Non-Abelian Quantum Link Models. The last part of the lecture discusses Quantum Simulators mimicking 'nuclear' physics and the continuum limit of D-theory models. (nowak)
Chamberlin, Phillip
2008-01-01
The Flare Irradiance Spectral Model (FISM) is an empirical model of the solar irradiance spectrum from 0.1 to 190 nm at 1 nm spectral resolution and on a 1-minute time cadence. The goal of FISM is to provide accurate solar spectral irradiances over the vacuum ultraviolet (VUV: 0-200 nm) range as input for ionospheric and thermospheric models. The seminar will begin with a brief overview of the FISM model, and also how the Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) will contribute to improving FISM. Some current studies will then be presented that use FISM estimations of the solar VUV irradiance to quantify the contributions of the increased irradiance from flares to Earth's increased thermospheric and ionospheric densities. Initial results will also be presented from a study looking at the electron density increases in the Martian atmosphere during a solar flare. Results will also be shown quantifying the VUV contributions to the total flare energy budget for both the impulsive and gradual phases of solar flares. Lastly, an example of how FISM can be used to simplify the design of future solar VUV irradiance instruments will be discussed, using the future NOAA GOES-R Extreme Ultraviolet and X-Ray Sensors (EXIS) space weather instrument.
Smirnova, Daria
2017-01-01
The purpose of this research-based thesis was to gain an idea of how the managers of two small, similar hotels in a specific region handle the marketing process on a limited budget. In addition, the aim of the thesis was to examine whether the hotel managers who were interviewed perceive marketing only as 'promotion', rather than in terms of marketing research, marketing mix and marketing environment theories. It was also examined whether the managers of those hotels consider marketing a key to successful h...
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
The Self-Perception Theory vs. a Dynamic Learning Model
Swank, Otto H.
2006-01-01
Several economists have directed our attention to a finding in the social psychological literature that extrinsic motivation may undermine intrinsic motivation. The self-perception (SP) theory developed by Bem (1972) explains this finding. The crux of this theory is that people remember their past decisions and the extrinsic rewards they received, but they do not recall their intrinsic motives. In this paper I show that the SP theory can be modeled as a variant of a conventional dynamic learn...
Theory and model use in social marketing health interventions.
Luca, Nadina Raluca; Suggs, L Suzanne
2013-01-01
The existing literature suggests that theories and models can serve as valuable frameworks for the design and evaluation of health interventions. However, evidence on the use of theories and models in social marketing interventions is sparse. The purpose of this systematic review is to identify to what extent papers about social marketing health interventions report using theory, which theories are most commonly used, and how theory was used. A systematic search was conducted for articles that reported social marketing interventions for the prevention or management of cancer, diabetes, heart disease, HIV, STDs, and tobacco use, and behaviors related to reproductive health, physical activity, nutrition, and smoking cessation. Articles were published in English, after 1990, reported an evaluation, and met the six social marketing benchmark criteria (behavior change, consumer research, segmentation and targeting, exchange, competition and marketing mix). Twenty-four articles, describing 17 interventions, met the inclusion criteria. Of these 17 interventions, 8 reported using theory and 7 stated how it was used. The transtheoretical model/stages of change was used more often than other theories. Findings highlight an ongoing lack of use or underreporting of the use of theory in social marketing campaigns and reinforce the call to action for applying and reporting theory to guide and evaluate interventions.
Tang, J.; Riley, W. J.
2015-12-01
Previous studies have identified four major sources of predictive uncertainty in modeling land biogeochemical (BGC) processes: (1) imperfect initial conditions (e.g., assumption of preindustrial equilibrium); (2) imperfect boundary conditions (e.g., climate forcing data); (3) parameterization (type I equifinality); and (4) model structure (type II equifinality). As if that were not enough to cause substantial sleep loss in modelers, we propose here a fifth element of uncertainty that results from implementation ambiguity that occurs when the model's mathematical description is translated into computational code. We demonstrate the implementation ambiguity using the example of nitrogen down regulation, a necessary process in modeling carbon-climate feedbacks. We show that, depending on common land BGC model interpretations of the governing equations for mineral nitrogen, there are three different implementations of nitrogen down regulation. We coded these three implementations in the ACME land model (ALM), and explored how they lead to different preindustrial and contemporary land biogeochemical states and fluxes. We also show how this implementation ambiguity can lead to different carbon-climate feedback estimates across the RCP scenarios. We conclude by suggesting how to avoid such implementation ambiguity in ESM BGC models.
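The implementation ambiguity the authors describe can be made concrete. The sketch below shows two possible computational translations of a single "demand limited by mineral N supply" rule: proportional down-regulation of all consumers versus a sequential draw-down of the pool. Identical inputs yield different fluxes. Pool names and numbers are hypothetical, not ALM code.

```python
def downreg_proportional(supply, demands):
    """Implementation A: every consumer is down-regulated by the same
    supply/demand ratio when total demand exceeds the mineral N supply."""
    total = sum(demands)
    f = min(1.0, supply / total) if total > 0 else 0.0
    return [d * f for d in demands]

def downreg_sequential(supply, demands):
    """Implementation B: consumers draw from the mineral N pool in a fixed
    order (e.g. plants before nitrifiers), as some codes effectively do."""
    out = []
    for d in demands:
        take = min(d, supply)
        out.append(take)
        supply -= take
    return out

# Same governing equations, different translations into code:
demands = [3.0, 2.0, 1.0]  # hypothetical plant, nitrifier, denitrifier demands
print(downreg_proportional(4.0, demands))  # [2.0, 1.333..., 0.666...]
print(downreg_sequential(4.0, demands))    # [3.0, 1.0, 0.0]
```

Integrated over decades, such per-timestep differences can propagate into the divergent biogeochemical states and carbon-climate feedbacks the abstract reports.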
Measurement Models for Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-01-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are ...
Budgeting for School Media Centers.
Drott, M. Carl
1978-01-01
Describes various forms of budgets and discusses concepts in budgeting useful to supervisors of school media centers: line item budgets, capital budgets, creating budgets, the budget calendar, innovations, PPBS (Planning, Programing, Budgeting System), zero-based budgeting, cost-benefit analysis, benefits, benefit guidelines, and budgeting for the…
Directory of Open Access Journals (Sweden)
M. Adachi
2011-09-01
More reliable estimates of the carbon (C) stock within forest ecosystems and of the C emission induced by deforestation are urgently needed to mitigate the effects of emissions on climate change. A process-based terrestrial biogeochemical model (VISIT) was applied to tropical primary forests of two types (a seasonal dry forest in Thailand and a rainforest in Malaysia) and one agro-forest (an oil palm plantation in Malaysia) to estimate the C budget of tropical ecosystems in Southeast Asia, including the impacts of land-use conversion. The observed aboveground biomass in the seasonal dry tropical forest in Thailand (226.3 t C ha⁻¹) and the rainforest in Malaysia (201.5 t C ha⁻¹) indicates that tropical forests of Southeast Asia are among the most C-abundant ecosystems in the world. The model simulation results in the rainforest were consistent with field data except for the NEP; however, the VISIT model tended to underestimate the C budget and stock in the seasonal dry tropical forest. The gross primary production (GPP) based on field observations ranged from 32.0 to 39.6 t C ha⁻¹ yr⁻¹ in the two primary forests, whereas the model slightly underestimated GPP (26.5–34.5 t C ha⁻¹ yr⁻¹). The VISIT model appropriately captured the impacts of disturbances such as deforestation and land-use conversions on the C budget. Results of a sensitivity analysis showed that the proportion of remaining residual debris was a key parameter determining the soil C budget after the deforestation event. According to the model simulation, the total C stock (total biomass and soil C) of the oil palm plantation was about 35% of the rainforest's C stock at 30 yr following initiation of the plantation. However, there were few field data on C budget and stock, especially for oil palm plantations. The C budget of each ecosystem must be evaluated over the long term using both the model simulations and observations to
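The bookkeeping behind such C-budget estimates is simple to state. The sketch below uses the standard identity NEP = GPP − Ra − Rh and a toy first-order decay of post-clearing residue, where the residue fraction plays the role of the key sensitivity parameter identified above. The functional form and all numbers are illustrative; this is not the VISIT model.

```python
def annual_c_budget(gpp, ra, rh):
    """Net ecosystem production from the standard bookkeeping identity
    NEP = GPP - Ra - Rh (all fluxes in t C ha^-1 yr^-1)."""
    return gpp - ra - rh

def soil_c_after_clearing(soil_c, biomass, residue_frac, k_decay, years):
    """Toy post-deforestation soil-C trajectory: a fraction of the cleared
    biomass enters the soil as residual debris, which then decays
    first-order at rate k_decay per year."""
    pool = soil_c + residue_frac * biomass
    for _ in range(years):
        pool *= (1.0 - k_decay)
    return pool

# Illustrative numbers only: NEP from a GPP near the modelled range,
# and the soil pool 10 years after clearing 200 t C ha^-1 of biomass.
print(annual_c_budget(34.5, 20.0, 13.0))
print(soil_c_after_clearing(100.0, 200.0, 0.5, 0.05, 10))
```

Varying `residue_frac` in such a toy model shows directly why the retained-debris fraction dominates the post-clearing soil C budget.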
Modeling Routinization in Games: An Information Theory Approach
DEFF Research Database (Denmark)
Wallner, Simon; Pichlmair, Martin; Hecher, Michael
2015-01-01
Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete......-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
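The Markov-chain approach described above can be sketched in a few lines: estimate transition probabilities from an observed action sequence, then measure how poorly the trained model predicts subsequent actions; a fully routinized player drives the error toward zero. This is a minimal illustration with a made-up action alphabet, not the study's implementation.

```python
from collections import defaultdict

def train_markov(actions):
    """Estimate a first-order discrete-time, discrete-space Markov chain
    from a sequence of player actions: P(next | current) by counting."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(actions, actions[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def prediction_error(model, actions):
    """Mean probability mass the model assigns to anything other than the
    action actually taken; routinized (predictable) play pushes this to 0."""
    errs = [1.0 - model.get(a, {}).get(b, 0.0)
            for a, b in zip(actions, actions[1:])]
    return sum(errs) / len(errs)

routine = list("abcabcabcabc")  # hypothetical, fully routinized input
model = train_markov(routine)
print(prediction_error(model, routine))  # 0.0: perfectly predictable
```

In the dynamically-trained setting of the paper, the model would be updated online and the error tracked over time as play becomes routinized.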
Internal Universes in Models of Homotopy Type Theory
DEFF Research Database (Denmark)
Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.
2018-01-01
We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo... that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.
Theory, modeling, and simulation annual report, 1992
Energy Technology Data Exchange (ETDEWEB)
1993-05-01
This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.
Theories of conduct disorder: a causal modelling analysis
Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De
2004-01-01
Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –
Models of Regge behaviour in an asymptotically free theory
International Nuclear Information System (INIS)
Polkinghorne, J.C.
1976-01-01
Two simple Feynman integral models are presented which reproduce the features expected to be of physical importance in the Regge behaviour of asymptotically free theories. Analysis confirms the result, expected on general grounds, that φ³ in six dimensions has an essential singularity at l = -1. The extension to gauge theories is discussed. (Auth.)
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession - client, health/oral health, environment and dental hygiene actions - and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists enter practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Budget Elements of Economic Security: Specifics of Classification
Directory of Open Access Journals (Sweden)
О. S.
2017-02-01
Full Text Available Theoretical aspects of economic security are analyzed in conjunction with budget components such as "budget interests" and "budget necessities". Key positions of the categories "budget interests" and "budget necessities" in the theory of economic security in the budgetary area are substantiated, given their priority role in setting up its implementation strategy. The category "budget interests" is defined as the system of budget necessities of the interest holders, implemented through the budget activities of entities and aimed at seeking benefits through the budget, in order to guarantee the functioning and development of society, the state, legal entities and natural persons. "Budget necessities" are defined as the need for budget funds to achieve and sustain, at a certain level, the life activities of individuals, social groups, society, the state and legal entities. A classification of budget interests by various criteria is made in the context of their impact on the economic security of the state. It is demonstrated that the four-tier classification of budget interests by interest holder is essential to guaranteeing economic security in the budgetary area: budget interests of the state - the interests held by central and local government bodies; budget interests of legal entities - the interests of profit and non-profit (public, budgetary, party and other) organizations; budget interests of individuals - basic necessities of individuals, met by budget transfers, which stand out from the array of public necessities by their individual character.
Extended Nambu models: Their relation to gauge theories
Escobar, C. A.; Urrutia, L. F.
2017-05-01
Yang-Mills theories supplemented by an additional coordinate constraint, which is solved and substituted in the original Lagrangian, provide examples of the so-called Nambu models, in the case where such constraints arise from spontaneous Lorentz symmetry breaking. Some explicit calculations have shown that, after additional conditions are imposed, Nambu models are capable of reproducing the original gauge theories, thus making Lorentz violation unobservable and allowing the interpretation of the corresponding massless gauge bosons as the Goldstone bosons arising from the spontaneous symmetry breaking. A natural question posed by this approach in the realm of gauge theories is to determine under which conditions the recovery of an arbitrary gauge theory from the corresponding Nambu model, defined by a general constraint over the coordinates, becomes possible. We refer to these theories as extended Nambu models (ENM) and emphasize the fact that the defining coordinate constraint is not treated as a standard gauge fixing term. At this level, the mechanism for generating the constraint is irrelevant and the case of spontaneous Lorentz symmetry breaking is taken only as a motivation, which naturally brings this problem under consideration. Using a nonperturbative Hamiltonian analysis we prove that the ENM yields the original gauge theory after we demand current conservation for all time, together with the imposition of the Gauss law constraints as initial conditions upon the dynamics of the ENM. The Nambu models yielding electrodynamics, Yang-Mills theories and linearized gravity are particular examples of our general approach.
Linear control theory for gene network modeling.
Shin, Yong-Jun; Bleris, Leonidas
2010-09-16
Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
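The state-space method mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's code: the two-stage first-order cascade, the gains a and b, and the constant input u are hypothetical placeholders, integrated here by forward Euler.

```python
# Illustrative linear state-space cascade (hypothetical parameters, not from
# the paper): dx1/dt = -a*x1 + u feeds dx2/dt = -b*x2 + x1.

def simulate_cascade(a=1.0, b=2.0, u=1.0, dt=0.01, steps=2000):
    """Integrate the two-state cascade with forward Euler and return (x1, x2)."""
    x1 = x2 = 0.0
    for _ in range(steps):
        dx1 = -a * x1 + u   # first stage driven by the input
        dx2 = -b * x2 + x1  # second stage driven by the first (cascade form)
        x1 += dt * dx1
        x2 += dt * dx2
    return x1, x2

# At steady state the cascade acts as a static gain: x1 -> u/a, x2 -> u/(a*b).
x1, x2 = simulate_cascade()
```

In the frequency-domain view the same cascade is the transfer function G(s) = 1/((s + a)(s + b)), whose DC gain 1/(a·b) matches the steady state above.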
Polling models : from theory to traffic intersections
Boon, M.A.A.
2011-01-01
The subject of the present monograph is the study of polling models, which are queueing models consisting of multiple queues, cyclically attended by one server. Polling models originated in the late 1950s, but did not receive much attention until the 1980s when an abundance of new applications arose
Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
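The fluid-analogy idea described above can be sketched in miniature. This is not the authors' model: the two "reservoirs" (a self-efficacy-like state x and a behavior-like state y), the feedback gains k1 and k2, and the input u are all hypothetical, chosen only to show reciprocal determinism as a coupled dynamical system.

```python
# Hedged sketch of a fluid-analogy dynamical system with reciprocal coupling
# (all names and parameter values are hypothetical, not from the paper).

def simulate_sct(k1=0.5, k2=0.5, u=1.0, dt=0.01, steps=4000):
    """Integrate two coupled first-order 'reservoirs' with forward Euler."""
    x = y = 0.0
    for _ in range(steps):
        dx = -x + k1 * y + u   # x drains, refilled by input u and feedback from y
        dy = -y + k2 * x       # y drains, driven by x (reciprocal determinism)
        x += dt * dx
        y += dt * dy
    return x, y

# Stable whenever k1*k2 < 1; the loop settles at x = u/(1 - k1*k2), y = k2*x.
x, y = simulate_sct()
```

The closed-form steady state makes the model's predictions testable in exactly the sense the abstract emphasizes: change u (an intervention) and the equilibrium shifts by a precise, quantitative amount.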
Contribution to the study of conformal theories and integrable models
International Nuclear Information System (INIS)
Sochen, N.
1992-05-01
The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models with a symmetry larger than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. The bosonization of partition functions of different models is described. A program of rational theory classification is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.
Studies of the Earth Energy Budget and Water Cycle Using Satellite Observations and Model Analyses
Campbell, G. G.; VonderHarr, T. H.; Randel, D. L.; Kidder, S. Q.
1997-01-01
During this research period we have utilized the ERBE data set in comparisons with surface properties and water vapor observations in the atmosphere. A relationship between cloudiness and surface temperature anomalies was found. The same relationship was found in a general circulation model, verifying the model. The attempt to construct a homogeneous time series from Nimbus 6, Nimbus 7 and ERBE data is not complete because we are still waiting for the ERBE reanalysis to be completed. It will be difficult to merge in the Nimbus 6 data because its observations occurred when the average weather differed from that of the other periods, so regression adjustments are not effective.
Three level constraints on conformal field theories and string models
International Nuclear Information System (INIS)
Lewellen, D.C.
1989-05-01
Simple tree level constraints for conformal field theories which follow from the requirement of crossing symmetry of four-point amplitudes are presented, and their utility for probing general properties of string models is briefly illustrated and discussed. 9 refs
Endrizzi, S.; Gruber, S.; Dall'Amico, M.; Rigon, R.
2013-12-01
This contribution describes the new version of GEOtop, which emerges after almost eight years of development from the original version. GEOtop now integrates the 3D Richards equation with a new numerical method; improvements were made in the treatment of surface waters by using the shallow water equation. The freezing-soil module was greatly improved, and the evapotranspiration-vegetation modelling is now based on a double-layer scheme. Here we discuss the rationale for each choice that was made, and we compare the differences between the present solutions and the old ones. In doing so we highlight the issues that we faced during development, including the trade-off between complexity and simplicity of the code, the requirements of shared development, the different branches that were opened during the evolution of the code, and why we think that a code like GEOtop is indeed necessary. Models where the hydrological cycle is simplified can be built on the basis of perceptual models that neglect some fundamental aspects of the hydrological processes, of which some examples are presented. At the same time, process-based models like GEOtop can also neglect some fundamental process: but this is made evident by comparison with measurements, especially when data are imposed ex ante rather than calibrated.
Nematic elastomers: from a microscopic model to macroscopic elasticity theory.
Xing, Xiangjun; Pfahl, Stephan; Mukhopadhyay, Swagatam; Goldbart, Paul M; Zippelius, Annette
2008-05-01
A Landau theory is constructed for the gelation transition in cross-linked polymer systems possessing spontaneous nematic ordering, based on symmetry principles and the concept of an order parameter for the amorphous solid state. This theory is substantiated with help of a simple microscopic model of cross-linked dimers. Minimization of the Landau free energy in the presence of nematic order yields the neoclassical theory of the elasticity of nematic elastomers and, in the isotropic limit, the classical theory of isotropic elasticity. These phenomenological theories of elasticity are thereby derived from a microscopic model, and it is furthermore demonstrated that they are universal mean-field descriptions of the elasticity for all chemical gels and vulcanized media.
Soliton excitations in a class of nonlinear field theory models
International Nuclear Information System (INIS)
Makhan'kov, V.G.; Fedyanin, V.K.
1985-01-01
Results are described from the investigation of nonlinear field-theory models with a Lagrangian. The theory includes both models with a stable zero vacuum (ε=1) and models with a condensate (ε=-1, broken symmetry). Conditions for the existence of particle-like solutions (PLS) and the stability of these solutions are investigated. Soliton dynamics is studied and PLS form factors are calculated. The statistical mechanics of solitons is developed and their dynamic structure factors are calculated.
Two-matrix models and c =1 string theory
International Nuclear Information System (INIS)
Bonora, L.; Xiong Chuansheng
1994-05-01
We show that the most general two-matrix model with bilinear coupling underlies c = 1 string theory. More precisely we prove that the W_{1+∞} constraints, a subset of the correlation functions and the integrable hierarchy characterizing such a two-matrix model correspond exactly to the W_{1+∞} constraints, to the discrete tachyon correlation functions and to the integrable hierarchy of the c = 1 string theory. (orig.)
Planar N = 4 gauge theory and the Hubbard model
International Nuclear Information System (INIS)
Rej, Adam; Serban, Didina; Staudacher, Matthias
2006-01-01
Recently it was established that a certain integrable long-range spin chain describes the dilatation operator of N = 4 gauge theory in the su(2) sector to at least three-loop order, while exhibiting BMN scaling to all orders in perturbation theory. Here we identify this spin chain as an approximation to an integrable short-ranged model of strongly correlated electrons: The Hubbard model
Scattering and short-distance properties in field theory models
International Nuclear Information System (INIS)
Iagolnitzer, D.
1987-01-01
The aim of constructive field theory is not only to define models but also to establish their general properties of physical interest. We here review recent works on scattering and on short-distance properties for weakly coupled theories with mass gap such as typically P(φ) in dimension 2, φ⁴ in dimension 3 and the (renormalizable, asymptotically free) massive Gross-Neveu (GN) model in dimension 2. Many of the ideas would apply similarly to other (possibly non renormalizable) theories that might be defined in a similar way via phase-space analysis
The monster sporadic group and a theory underlying superstring models
International Nuclear Information System (INIS)
Chapline, G.
1996-09-01
The pattern of duality symmetries acting on the states of compactified superstring models reinforces an earlier suggestion that the Monster sporadic group is a hidden symmetry for superstring models. This in turn points to a supersymmetric theory of self-dual and anti-self-dual K3 manifolds joined by Dirac strings and evolving in a 13 dimensional spacetime as the fundamental theory. In addition to the usual graviton and dilaton this theory contains matter-like degrees of freedom resembling the massless states of the heterotic string, thus providing a completely geometric interpretation for ordinary matter. 25 refs
The chlorine budget of the present-day atmosphere - A modeling study
Weisenstein, Debra K.; Ko, Malcolm K. W.; Sze, Nien-Dak
1992-01-01
The contribution of source gases to the total amount of inorganic chlorine (ClY) is examined analytically with a time-dependent model employing 11 source gases. The source-gas emission data are described, and the modeling methodology is set forth with attention given to the data interpretation. The abundances and distributions are obtained for all 11 source gases with corresponding ClY production rates and mixing ratios. It is shown that the ClY production rate and the ClY mixing ratio for each source gas are spatially dependent, and the change in the relative contributions from 1950 to 1990 is given. Ozone changes in the past decade are characterized by losses in the polar and midlatitude lower stratosphere. The values for CFC-11, CCl4, and CH3CCl3 suggest that they are more evident in the lower stratosphere than is suggested by steady-state estimates based on surface concentrations.
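The time-dependent source-gas accounting described above can be reduced to a one-box caricature. This is illustrative only: the paper's model tracks eleven gases with spatial structure, while here a single gas with a hypothetical emission rate E and atmospheric lifetime tau evolves toward its steady-state burden E·tau.

```python
# One-box budget sketch (hypothetical E and tau; the real model resolves
# 11 source gases and spatial dependence).

def box_model(E=1.0, tau=5.0, dt=0.01, years=60):
    """Integrate dC/dt = E - C/tau; C approaches the steady state E*tau."""
    C = 0.0
    for _ in range(int(years / dt)):
        C += dt * (E - C / tau)  # emission source minus first-order loss
    return C

C = box_model()  # after ~12 lifetimes the burden is essentially E*tau = 5
```

The same balance explains why the relative contributions of the source gases shift over time: a gas with a long lifetime keeps accumulating toward E·tau long after short-lived gases have already equilibrated.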
Mandal, Nibir; Sarkar, Shamik; Baruah, Amiya; Dutta, Urmi
2018-04-01
Using an enthalpy-based thermo-mechanical model we provide a theoretical evaluation of melt production beneath mid-ocean ridges (MORs), and demonstrate how the melts subsequently develop their pathways to sustain the major ridge processes. Our model employs a Darcy idealization of the two-phase (solid-melt) system, accounting for enthalpy (ΔH) as a function of the temperature-dependent liquid fraction (ϕ). Random thermal perturbations imposed in this model set up local convection that drives melts to flow through porosity-controlled pathways with a typical mushroom-like 3D structure. We present across- and along-axis model profiles to show the mode of occurrence of melt-rich zones within mushy regions, connected to deeper sources by single or multiple feeders. The upwelling melts experience two synchronous processes: 1) solidification-accretion, and 2) eruption, retaining a large melt fraction in the framework of mantle dynamics. Using a bifurcation analysis we determine the threshold condition for melt eruption, and estimate the potential volumes of eruptible melts (∼3.7 × 10^6 m^3/yr) and sub-crustal solidified masses (∼1-8.8 × 10^6 m^3/yr) on an axis length of 500 km. The solidification process far dominates over the eruption process in the initial phase, but declines rapidly on a time scale (t) of 1 Myr. Consequently, the eruption rate overtakes the solidification rate, but attains a nearly steady value as t > 1.5 Myr. We finally present a melt budget, in which a maximum of ∼5% of the total upwelling melt volume is available for eruption and ∼19% is available for deeper-level solidification; the rest continues to participate in sub-crustal processes.
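The melt budget quoted above is a simple partition and can be checked by arithmetic. The ~5% eruptible and ~19% deeper-solidified fractions come from the abstract; the total upwelling volume passed in is a hypothetical input.

```python
# Back-of-envelope partition of the melt budget (fractions from the abstract,
# total volume hypothetical).

def melt_budget(total_m3_per_yr, f_erupt=0.05, f_solidify=0.19):
    """Split the upwelling melt volume into erupted, solidified and sub-crustal parts."""
    erupted = f_erupt * total_m3_per_yr
    solidified = f_solidify * total_m3_per_yr
    subcrustal = total_m3_per_yr - erupted - solidified  # ~76% stays in sub-crustal processes
    return erupted, solidified, subcrustal
```

By construction the three parts always sum back to the total, which is the bookkeeping constraint a melt budget must satisfy.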
Consumer preference models: fuzzy theory approach
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
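One way to represent the linguistic variables mentioned above is with fuzzy membership functions. The triangular shape, the rating scale and the example values below are assumptions for illustration, not the authors' model.

```python
# Sketch of a fuzzy linguistic rating (assumed form, not the paper's model):
# a triangular fuzzy number (a, b, c) with peak at b represents a rating
# such as "about 7 out of 10".

def triangular(x, a, b, c):
    """Degree of membership of x in the triangular fuzzy number (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)  # rising edge
    return (c - x) / (c - b)      # falling edge

about_seven = lambda x: triangular(x, 5.0, 7.0, 9.0)
```

A fuzzy preference model would then aggregate such memberships across product attributes, in parallel with the crisp part-worths of a conventional conjoint model.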
Narrative theories as computational models: reader-oriented theory and artificial intelligence
Energy Technology Data Exchange (ETDEWEB)
Galloway, P.
1983-12-01
In view of the rapid development of reader-oriented theory and its interest in dynamic models of narrative, the author speculates in a serious way about what such models might look like in computational terms. Researchers in artificial intelligence (AI) have already begun to develop models of story understanding as the emphasis in AI research has shifted toward natural language understanding and as AI has allied itself with cognitive psychology and linguistics to become cognitive science. Research in AI and in narrative theory share many common interests and problems, and both studies might benefit from an exchange of ideas. 11 references.
Multi-model ensemble combinations of the water budget in the East/Japan Sea
HAN, S.; Hirose, N.; Usui, N.; Miyazawa, Y.
2016-02-01
The water balance of the East/Japan Sea is determined mainly by inflow and outflow through the Korea/Tsushima, Tsugaru and Soya/La Perouse Straits. However, the volume transports measured at the three straits remain quantitatively unbalanced. This study examined the seasonal variation of the volume transport using multiple linear regression and ridge regression multi-model ensemble (MME) methods to estimate a physically consistent circulation in the East/Japan Sea from four different data assimilation models. The MME outperformed all of the single models by reducing uncertainties, in particular the multicollinearity problem, with the ridge regression. However, the regression constants turned out to be inconsistent with each other if the MME was applied separately for each strait. The MME for a connected system was thus performed to find common constants for these straits. The estimate from this MME was found to be similar to the MME result for sea level difference (SLD). The estimated mean transport (2.42 Sv) was smaller than the measured value at the Korea/Tsushima Strait, but the calibrated transport through the Tsugaru Strait (1.63 Sv) was larger than the observed value. The MME results for transport and SLD also suggested that the standard deviation (STD) at the Korea/Tsushima Strait is larger than the STD of the observations, whereas the estimates were almost identical to the observations for the Tsugaru and Soya/La Perouse Straits. The similarity between the MME results enhances the reliability of the present MME estimation.
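The ridge regression used for the ensemble combination can be sketched in a toy form. This is not the study's computation: it uses two member models instead of four, made-up data, and no intercept, just to show how the penalty λ enters the normal equations and damps multicollinearity.

```python
# Toy multi-model ensemble combination by ridge regression (two hypothetical
# member models x1, x2 predicting an observed series y; data invented).
# Weights solve (X^T X + lam*I) w = X^T y; lam > 0 shrinks unstable weights.

def ridge_2(x1, x2, y, lam=0.0):
    """Ridge weights for y ~ w1*x1 + w2*x2 (two predictors, no intercept)."""
    s11 = sum(a * a for a in x1) + lam           # X^T X diagonal, plus penalty
    s22 = sum(b * b for b in x2) + lam
    s12 = sum(a * b for a, b in zip(x1, x2))     # off-diagonal cross term
    g1 = sum(a * c for a, c in zip(x1, y))       # X^T y
    g2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12                  # 2x2 system solved in closed form
    return (s22 * g1 - s12 * g2) / det, (s11 * g2 - s12 * g1) / det
```

With lam = 0 this reduces to ordinary multiple linear regression; when the member models are nearly collinear, det approaches zero and a positive lam keeps the weights finite and stable.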
Directory of Open Access Journals (Sweden)
Antonio Agüera
Full Text Available Marine organisms in Antarctica are adapted to an extreme ecosystem, including extremely stable temperatures and strong seasonality due to changes in day length. It is now largely accepted that Southern Ocean organisms are particularly vulnerable to global warming, with some regions already being challenged by a rapid increase in temperature. Climate change affects both the physical and biotic components of marine ecosystems and will have an impact on the distribution and population dynamics of Antarctic marine organisms. To predict and assess the effect of climate change on marine ecosystems, a more comprehensive knowledge of the life history and physiology of key species is urgently needed. In this study we estimate the Dynamic Energy Budget (DEB) model parameters for a key benthic Antarctic species, the sea star Odontaster validus, using available information from the literature and from experiments. The DEB theory is unique in capturing the metabolic processes of an organism through its entire life cycle as a function of temperature and food availability. The DEB model allows for the inclusion of the different life history stages, and thus becomes a tool that can be used to model lifetime feeding, growth, reproduction, and their responses to changes in biotic and abiotic conditions. The DEB model presented here includes the estimation of reproduction handling rules for the development of simultaneous oocyte cohorts within the gonad. Additionally, it links the DEB model reserves to the pyloric caeca, an organ whose function has long been ascribed to energy storage. Model parameters describe the slowed-down metabolism of long-living animals that mature slowly. O. validus has a large reserve that, matching low maintenance costs, allows it to withstand long periods of starvation. Gonad development is continuous, and individual cohorts developed within the gonads grow in biomass following a power function of the age of the cohort. The DEB model developed here for O
A Dynamic Systems Theory Model of Visual Perception Development
Coté, Carol A.
2015-01-01
This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…
Membrane models and generalized Z2 gauge theories
International Nuclear Information System (INIS)
Lowe, M.J.; Wallace, D.J.
1980-01-01
We consider models of (d-n)-dimensional membranes fluctuating in a d-dimensional space under the action of surface tension. We investigate the renormalization properties of these models perturbatively and in the 1/n expansion. The potential relationships of these models to generalized Z_2 gauge theories are indicated. (orig.)
Theories and Frameworks for Online Education: Seeking an Integrated Model
Picciano, Anthony G.
2017-01-01
This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…
Linear control theory for gene network modeling.
Directory of Open Access Journals (Sweden)
Yong-Jun Shin
Full Text Available Systems biology is an interdisciplinary field that aims at understanding complex interactions in cells. Here we demonstrate that linear control theory can provide valuable insight and practical tools for the characterization of complex biological networks. We provide the foundation for such analyses through the study of several case studies including cascade and parallel forms, feedback and feedforward loops. We reproduce experimental results and provide rational analysis of the observed behavior. We demonstrate that methods such as the transfer function (frequency domain) and linear state-space (time domain) can be used to predict reliably the properties and transient behavior of complex network topologies and point to specific design strategies for synthetic networks.
Measurement Models for Reasoned Action Theory.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-03-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
Modeling acquaintance networks based on balance theory
Directory of Open Access Journals (Sweden)
Vukašinović Vida
2014-09-01
Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, future interactions are more likely to happen between actors that are connected by stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network, because the properties of the IB model more closely matched those of the e-mail URV network than did those of the other models.
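A rough sketch in the spirit of the interaction-based idea can be written in a few lines. This is not the authors' IB model: the update rule, parameter values, and the simplification that tie strength biases the sign of the next interaction (rather than partner selection) are all assumptions for illustration.

```python
# Hypothetical interaction-based sketch (not the IB model itself): tie
# strengths start at zero, each interaction nudges a tie up or down, and
# already-strong ties are more likely to receive positive interactions.

import random

def simulate_ties(n=20, steps=2000, delta=0.1, seed=1):
    """Return a dict mapping actor pairs to tie strengths in [-1, 1]."""
    random.seed(seed)
    w = {}
    for _ in range(steps):
        i, j = random.sample(range(n), 2)       # pick two distinct actors
        pair = (min(i, j), max(i, j))
        p_pos = 0.5 + 0.4 * w.get(pair, 0.0)    # strong ties favor positive interactions
        sign = 1.0 if random.random() < p_pos else -1.0
        w[pair] = max(-1.0, min(1.0, w.get(pair, 0.0) + sign * delta))
    return w
```

The reinforcement loop (strong ties attract positive interactions, which strengthen the ties) is what drives such models toward the sparse, clustered structure the evaluation reports.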
Modeling in applied sciences a kinetic theory approach
Pulvirenti, Mario
2000-01-01
Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker-Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...
Baldrige Theory into Practice: A Generic Model
Arif, Mohammed
2007-01-01
Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…
Optimal transportation networks models and theory
Bernot, Marc; Morel, Jean-Michel
2009-01-01
The transportation problem can be formalized as the problem of finding the optimal way to transport a given measure into another with the same mass. In contrast to the Monge-Kantorovitch problem, recent approaches model the branched structure of such supply networks as minima of an energy functional whose essential feature is to favour wide roads. Such a branched structure is observable in ground transportation networks, in draining and irrigation systems, in electrical power supply systems and in natural counterparts such as blood vessels or the branches of trees. These lectures provide mathematical proof of several existence, structure and regularity properties empirically observed in transportation networks. The link with previous discrete physical models of irrigation and erosion models in geomorphology and with discrete telecommunication and transportation models is discussed. It will be mathematically proven that the majority fit in the simple model sketched in this volume.
Directory of Open Access Journals (Sweden)
Ensor Tim
2012-08-01
Full Text Available Abstract Background: Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier-induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low and middle income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. Methods: A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation, including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level. It is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province-specific data on the distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Findings: Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population
Ensor, Tim; Firdaus, Hafidz; Dunlop, David; Manu, Alex; Mukti, Ali Ghufron; Ayu Puspandari, Diah; von Roenne, Franz; Indradjaya, Stephanus; Suseno, Untung; Vaughan, Patrick
2012-08-29
Directory of Open Access Journals (Sweden)
A. Lauer
2007-10-01
Full Text Available International shipping contributes significantly to the fuel consumption of all transport-related activities. Specific emissions of pollutants such as sulfur dioxide (SO_{2}) per kg of fuel are higher than for road transport or aviation. Besides gaseous pollutants, ships also emit various types of particulate matter. The aerosol impacts the Earth's radiation budget directly, by scattering and absorbing solar and thermal radiation, and indirectly, by changing cloud properties. Here we use ECHAM5/MESSy1-MADE, a global climate model with detailed aerosol and cloud microphysics, to study the climate impacts of international shipping. The simulations show that emissions from ships significantly increase the cloud droplet number concentration of low marine water clouds by up to 5% to 30%, depending on the ship emission inventory and the geographic region. Whereas the cloud liquid water content remains nearly unchanged in these simulations, the effective radii of cloud droplets decrease, leading to an increase in cloud optical thickness of up to 5–10%. The sensitivity of the results is estimated by using three different emission inventories for present-day conditions. The sensitivity analysis reveals that shipping contributes 2.3% to 3.6% of the total sulfate burden and 0.4% to 1.4% of the total black carbon burden in the year 2000 on the global mean. In addition to changes in aerosol chemical composition, shipping increases the aerosol number concentration, e.g. by up to 25% in the size range of the accumulation mode (typically >0.1 μm) over the Atlantic. The total aerosol optical thickness over the Indian Ocean, the Gulf of Mexico and the Northeastern Pacific increases by up to 8–10%, depending on the emission inventory. Changes in aerosol optical thickness caused by shipping-induced modification of aerosol particle number concentration and chemical composition lead to a change in the shortwave radiation budget at the top of the
The Relevance of Using Mathematical Models in Macroeconomic Policies Theory
Directory of Open Access Journals (Sweden)
Nora Mihail
2006-11-01
Full Text Available The article presents a survey of the principal mathematical models – starting with the work of Theil, Hansen and Tinbergen – and their results, as used to analyse and design macroeconomic policies. In the modeling field, changes are very rapid, both in the theoretical aspects of modeling the many problems of macroeconomic policy and in the practical use of different policy models. The article points out the problems of static and dynamic theory used in macro-policy modeling.
Fire and Heat Spreading Model Based on Cellular Automata Theory
Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.
2018-05-01
The distinctive feature of the proposed model of fire and heat spreading in premises is its reduced computational complexity, achieved by using cellular automata theory with probabilistic behavior rules. The possibilities and prospects of using this model in practice are noted. The proposed model integrates easily with agent-based evacuation models. The joint use of these models could improve floor plans and reduce evacuation times from premises during fires.
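A probabilistic cellular-automaton fire-spread rule of the kind described can be sketched as follows (the three-state rule, von Neumann neighborhood and ignition probability are assumptions for illustration, not the authors' exact model):

```python
import random

# Minimal probabilistic CA fire-spread sketch: each burning cell ignites each
# unburnt 4-neighbor with probability p at every step, then burns out.
UNBURNT, BURNING, BURNED = 0, 1, 2

def step(grid, p, rng):
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNED
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == UNBURNT:
                        if rng.random() < p:
                            new[rr][cc] = BURNING
    return new

rng = random.Random(42)                    # seeded for reproducibility
grid = [[UNBURNT] * 9 for _ in range(9)]
grid[4][4] = BURNING                       # ignition point in the middle
for _ in range(20):
    grid = step(grid, p=0.5, rng=rng)
burned = sum(row.count(BURNED) for row in grid)
print(burned)
```

The cheapness of the update (a local rule per cell) is exactly what gives such models their low computational cost relative to CFD-based fire simulation.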
Matrix model as a mirror of Chern-Simons theory
International Nuclear Information System (INIS)
Aganagic, Mina; Klemm, Albrecht; Marino, Marcos; Vafa, Cumrun
2004-01-01
Using mirror symmetry, we show that Chern-Simons theory on certain manifolds such as lens spaces reduces to a novel class of Hermitian matrix models, where the measure is that of unitary matrix models. We show that this agrees with the more conventional canonical quantization of Chern-Simons theory. Moreover, large N dualities in this context lead to computation of all genus A-model topological amplitudes on toric Calabi-Yau manifolds in terms of matrix integrals. In the context of type IIA superstring compactifications on these Calabi-Yau manifolds with wrapped D6 branes (which are dual to M-theory on G2 manifolds) this leads to engineering and solving F-terms for N=1 supersymmetric gauge theories with superpotentials involving certain multi-trace operators. (author)
Mixed models theory and applications with R
Demidenko, Eugene
2013-01-01
Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g
Nonconvex Model of Material Growth: Mathematical Theory
Ganghoffer, J. F.; Plotnikov, P. I.; Sokolowski, J.
2018-06-01
The model of volumetric material growth is introduced in the framework of finite elasticity. The new results obtained for the model are presented with complete proofs. The state variables include the deformations, temperature and the growth factor matrix function. The existence of global in time solutions for the quasistatic deformations boundary value problem coupled with the energy balance and the evolution of the growth factor is shown. The mathematical results can be applied to a wide class of growth models in mechanics and biology.
Does National Culture Impact Capital Budgeting Systems?
Directory of Open Access Journals (Sweden)
Peter J. Graham
2017-06-01
Full Text Available We examine how national culture impacts organisational selection of capital budgeting systems, to develop our understanding of the influence that a holistic formulation of national culture has on capital budgeting systems. Such an understanding is important as it would not only provide a clearer link between national culture and capital budgeting systems and advance the extant literature, but would also help multinational firms that have business relationships with Indonesian firms in designing suitable strategies. We conducted semi-structured interviews with selected finance managers of listed firms in Indonesia and Australia. Consistent with contingency theory, we found that economic, political, legal and social uncertainty impact the use of capital budgeting systems. The levels of uncertainty were higher in Indonesia than in Australia and need to be reckoned with in the selection of capital budgeting systems used by firms. We also found that firms are influenced by project size and complexity when selecting capital budgeting systems.
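As a small, self-contained example of one of the capital budgeting systems at issue, here is a net-present-value appraisal in which country uncertainty is reflected as an extra premium on the discount rate (all cash flows and rates are invented for the sketch):

```python
# Discounted-cash-flow appraisal: NPV at a base discount rate versus the same
# project evaluated with an added uncertainty premium. Figures are illustrative.
def npv(rate, cash_flows):
    """cash_flows[0] is the time-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

project = [-1000.0, 400.0, 400.0, 400.0, 400.0]
base = npv(0.08, project)             # stable environment
risky = npv(0.08 + 0.06, project)     # higher-uncertainty environment
print(round(base, 2), round(risky, 2))
```

The same project looks markedly less attractive once the uncertainty premium is applied, which is one concrete way higher-uncertainty settings can alter the choice and outcome of a capital budgeting system.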
Directory of Open Access Journals (Sweden)
D. B. Millet
2010-04-01
Full Text Available We construct a global atmospheric budget for acetaldehyde using a 3-D model of atmospheric chemistry (GEOS-Chem), and use an ensemble of observations to evaluate present understanding of its sources and sinks. Hydrocarbon oxidation provides the largest acetaldehyde source in the model (128 Tg a^{−1}), a factor of 4 greater than the previous estimate, with alkanes, alkenes, and ethanol the main precursors. There is also a minor source from isoprene oxidation. We use an updated chemical mechanism for GEOS-Chem, and photochemical acetaldehyde yields are consistent with the Master Chemical Mechanism. We present a new approach to quantifying the acetaldehyde air-sea flux based on the global distribution of light absorption due to colored dissolved organic matter (CDOM) derived from satellite ocean color observations. The resulting net ocean emission is 57 Tg a^{−1}, the second largest global source of acetaldehyde. A key uncertainty is the acetaldehyde turnover time in the ocean mixed layer, with quantitative model evaluation over the ocean complicated by known measurement artifacts in clean air. Simulated concentrations in surface air over the ocean generally agree well with aircraft measurements, though the model tends to overestimate the vertical gradient. PAN:NO_{x} ratios are well-simulated in the marine boundary layer, providing some support for the modeled ocean source. We introduce the Model of Emissions of Gases and Aerosols from Nature (MEGANv2.1) for acetaldehyde and ethanol and use it to quantify their net flux from living terrestrial plants. Including emissions from decaying plants, the total direct acetaldehyde source from the land biosphere is 23 Tg a^{−1}. Other terrestrial acetaldehyde sources include biomass burning (3 Tg a^{−1}) and anthropogenic emissions (2 Tg a^{−1}). Simulated concentrations in the continental boundary layer are generally unbiased and capture the spatial
Young, Paul. J.; Emmons, Louisa K.; Roberts, James M.; Lamarque, Jean-FrançOis; Wiedinmyer, Christine; Veres, Patrick; Vandenboer, Trevor C.
2012-05-01
This study uses a global chemical transport model to estimate the distribution of isocyanic acid (HNCO). HNCO is toxic, and concentrations exceeding 1 ppbv have been suggested to have negative health effects. Based on fire studies, HNCO emissions were scaled to those of hydrogen cyanide (30%), resulting in yearly total emissions of 1.5 Tg for 2008, from both anthropogenic and biomass burning sources. Loss processes included heterogeneous uptake (pH dependent), dry deposition (like formic acid), and reaction with the OH radical (k = 1 × 10^{−15} molecule^{−1} cm^{3} s^{−1}). Annual mean surface HNCO concentrations were highest over parts of China (maximum of 470 pptv), but episodic fire emissions gave much higher levels, exceeding 4 ppbv in tropical Africa and the Amazon, and exceeding 10 ppbv in Southeast Asia and Siberia. This suggests that large biomass burning events could result in deleterious health effects for populations in these regions. For the tropospheric budget, using the model-calculated pH the HNCO lifetime was 37 days, with the split between dry deposition and heterogeneous loss being 95%:5%. Fixing the heterogeneous loss rate at pH = 7 meant that this process dominated, accounting for ˜70% of the total loss, giving a lifetime of 6 days, and resulting in upper tropospheric concentrations that were essentially zero. However, changing the pH does not notably impact the high concentrations found in biomass burning regions. More observational data are needed to evaluate the model, as well as a better representation of the likely underestimated biofuel emissions, which could mean more populations exposed to elevated HNCO concentrations.
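The budget arithmetic behind lifetimes like these can be sketched directly: independent first-order losses add as rates, so 1/τ_total = Σ 1/τ_i, and each pathway's share of the total loss is its rate fraction. The per-pathway lifetimes below are assumed for illustration, chosen to roughly reproduce a ~37-day total lifetime with a 95%:5% split:

```python
# Combine independent first-order loss pathways: rates add, so the total
# lifetime is the harmonic combination, and each pathway's share is its
# rate fraction of the total. Pathway lifetimes here are illustrative.
def combined_lifetime(lifetimes_days):
    total_rate = sum(1.0 / tau for tau in lifetimes_days)
    tau_total = 1.0 / total_rate
    shares = [(1.0 / tau) / total_rate for tau in lifetimes_days]
    return tau_total, shares

# assumed: dry deposition tau = 39 d, heterogeneous uptake tau = 740 d
tau, (f_dep, f_het) = combined_lifetime([39.0, 740.0])
print(round(tau, 1), round(f_dep, 2), round(f_het, 2))
```

This also makes the paper's second scenario intuitive: shortening one pathway's lifetime (faster heterogeneous loss at fixed pH = 7) drags the total lifetime down toward that pathway and hands it most of the loss share.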
Solid mechanics theory, modeling, and problems
Bertram, Albrecht
2015-01-01
This textbook offers an introduction to modeling the mechanical behavior of solids within continuum mechanics and thermodynamics. To illustrate the fundamental principles, the book starts with an overview of the most important models in one dimension. Tensor calculus, which is called for in three-dimensional modeling, is concisely presented in the second part of the book. Once the reader is equipped with these essential mathematical tools, the third part of the book develops the foundations of continuum mechanics right from the beginning. Lastly, the book’s fourth part focuses on modeling the mechanics of materials and in particular elasticity, viscoelasticity and plasticity. Intended as an introductory textbook for students and for professionals interested in self-study, it also features numerous worked-out examples to aid in understanding.
Modeling workplace bullying using catastrophe theory.
Escartin, J; Ceja, L; Navarro, J; Zapf, D
2013-10-01
Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
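The AIC/BIC comparison used to adjudicate between the linear and cusp models can be sketched with the standard formulas (the log-likelihoods and parameter counts below are hypothetical, chosen only to show how a small likelihood gain can fail to justify extra parameters):

```python
import math

# Information criteria: lower is better; BIC penalizes parameters more
# strongly than AIC for large n. Fit values below are invented.
def aic(log_likelihood, k):
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    return k * math.log(n) - 2 * log_likelihood

n = 5000                                   # sample size
linear = {"ll": -3200.0, "k": 4}           # hypothetical linear fit
cusp = {"ll": -3198.5, "k": 7}             # hypothetical cusp fit: small gain

linear_wins_aic = aic(linear["ll"], linear["k"]) < aic(cusp["ll"], cusp["k"])
linear_wins_bic = bic(linear["ll"], linear["k"], n) < bic(cusp["ll"], cusp["k"], n)
print(linear_wins_aic, linear_wins_bic)
```

In this toy setup the cusp model's extra three parameters buy too little likelihood, so both criteria prefer the linear model, mirroring the study's conclusion.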
Assessment of moisture budget over West Africa using MERRA-2's aerological model and satellite data
Igbawua, Tertsea; Zhang, Jiahua; Yao, Fengmei; Zhang, Da
2018-02-01
The study assessed the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) and MERRA-2 aerological (P-E*) model in reproducing the salient features of the West African water balance, including its components, from 1980 to 2013. In this study we have shown that recent reanalysis efforts have generated imbalances between regionally integrated precipitation (P) and surface evaporation (E), and the effect is greater in the newly released MERRA-2. The atmospheric water balances of MERRA and MERRA-2 were inter-compared and then compared with model forecast output of the European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-I) and the Japanese 55-year Reanalysis (JRA-55). Results indicated that a bias of 12-20 (5-13) mm/month in MERRA-2 (ERA-I) leads to the classification of the Sahel (14°N-20°N) as a moisture source during the West African Summer Monsoon. Comparisons between MERRA/MERRA-2 and prognostic fields from ERA-I and JRA-55 indicated that the average P-E* in MERRA is 18.94 (52.24) mm/month less than ERA-I (JRA-55) over the Guinea domain and 25.03 (4.53) mm/month greater than ERA-I (JRA-55) over the Sahel. In MERRA-2, average P-E* is 25.76 (59.06) mm/month less than ERA-I (JRA-55) over Guinea and 73.72 (94.22) mm/month less than ERA-I (JRA-55) over the Sahel, respectively. These imbalances are due to adjustments in data assimilation methods, satellite calibration and the observational database. The change in convective P parameterization and increased re-evaporation of P in MERRA-2 suggests a cause of the positive biases in P and E. The small disagreements between MERRA/MERRA-2 and CRU precipitation highlight one of the major challenges associated with climate research in West Africa, and major improvements in observation data and surface fluxes from reanalysis remain vital.
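The sign convention behind classifying a region as a moisture source or sink can be illustrated directly: over long averages, P − E approximates the moisture flux convergence, so a region where evaporation exceeds precipitation (negative mean P − E) acts as a net source of moisture to the atmosphere. The monthly values below are invented for the sketch:

```python
# Toy aerological water-balance diagnostic: compute the mean P - E and
# classify the region. Monthly values (mm/month) are illustrative.
precip = [5, 8, 20, 60, 90, 40, 10, 4]
evap = [18, 20, 28, 55, 70, 45, 22, 16]

p_minus_e = [p - e for p, e in zip(precip, evap)]
mean_balance = sum(p_minus_e) / len(p_minus_e)
role = "moisture source" if mean_balance < 0 else "moisture sink"
print(round(mean_balance, 2), role)
```

A bias of the size quoted above (12-20 mm/month) is easily enough to flip the sign of such a mean balance, which is exactly how the Sahel ends up misclassified as a source.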
Mallinson, Daniel J.
2018-01-01
The California Budget Challenge produced by Next10 provides a useful and intuitive tool for instructors to introduce students to public budgeting. Students will reason through a series of budgeting decisions using information provided on the fiscal and practical implications of their choices. The Challenge is updated with each budget cycle, so it…
Britt, Steuart-Henderson
1979-01-01
Methods for establishing an advertising budget are reviewed. They include methods based on percentage of sales or profits, unit of sales, and objective and task. Also discussed are ways to allocate a promotional budget. The most common breakdowns are: departmental budgets, total budget, calendar periods, media, and sales area. (JMD)
Modeling Carbon and Water Budgets in the Lushi Basin with Biome-BGC
Institute of Scientific and Technical Information of China (English)
Dong Wenjuan; Qi Ye; Li Huimin; Zhou Dajie; Shi Duanhua; Sun Liying
2005-01-01
In this article, annual evapotranspiration (ET) and net primary productivity (NPP) of four types of vegetation were estimated for the Lushi basin, a subbasin of the Yellow River in China. These four vegetation types include deciduous broadleaf forest, evergreen needle leaf forest, dwarf shrub, and grass. Biome-BGC, a biogeochemical process model, was used to calculate annual ET and NPP for each vegetation type in the study area from 1954 to 2000. Daily microclimate data for 47 years monitored by the Lushi meteorological station were extrapolated to cover the basin using MT-CLIM, a mountain microclimate simulator. The output files of MT-CLIM were used to feed Biome-BGC. We used average ecophysiological values of each type of vegetation supplied by the Numerical Terradynamic Simulation Group (NTSG) at the University of Montana as the input ecophysiological constants file. The estimates of daily NPP in early July and annual ET for these four biome groups were compared with field measurements and other studies. Daily gross primary production (GPP) measurements of evergreen needle leaf forest were very close to the output of Biome-BGC, but measurements of broadleaf forest and dwarf shrub were much smaller than the simulation results. Simulated annual ET and NPP had a significant correlation with precipitation, indicating precipitation is the major environmental factor affecting ET and NPP in the study area. Precipitation also is the key climatic factor for the interannual ET and NPP variations.
Spatial interaction models facility location using game theory
D'Amato, Egidio; Pardalos, Panos
2017-01-01
Facility location theory develops the idea of locating one or more facilities by optimizing suitable criteria such as minimizing transportation cost, or capturing the largest market share. The contributions in this book focus an approach to facility location theory through game theoretical tools highlighting situations where a location decision is faced by several decision makers and leading to a game theoretical framework in non-cooperative and cooperative methods. Models and methods regarding the facility location via game theory are explored and applications are illustrated through economics, engineering, and physics. Mathematicians, engineers, economists and computer scientists working in theory, applications and computational aspects of facility location problems using game theory will find this book useful.
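A toy instance of the transportation-cost-minimizing location problem described above, a single facility (1-median) chosen by enumeration over candidate sites, can be sketched as follows (coordinates, weights and candidates are illustrative):

```python
# 1-median facility location by enumeration: pick the candidate site that
# minimizes total weighted Euclidean transportation cost to demand points.
def total_cost(site, demands):
    return sum(w * ((site[0] - x) ** 2 + (site[1] - y) ** 2) ** 0.5
               for x, y, w in demands)

demands = [(0, 0, 10), (4, 0, 5), (0, 3, 5), (6, 6, 2)]   # (x, y, weight)
candidates = [(0, 0), (2, 1), (4, 4)]
best = min(candidates, key=lambda s: total_cost(s, demands))
print(best, round(total_cost(best, demands), 2))
```

The game-theoretic settings in the book start from problems like this one but let several decision makers each choose a site, so the cost each faces depends on the others' choices.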
International Nuclear Information System (INIS)
Colette, A.
2005-12-01
Closing the tropospheric ozone budget requires a better understanding of the role of transport processes from the major reservoirs: the planetary boundary layer and the stratosphere. Case studies lead to the identification of the mechanisms involved as well as their efficiency. However, their global impact on the budget must be addressed on a climatological basis. This manuscript is thus divided into two parts. First, we present case studies based on ozone LIDAR measurements performed during the ESCOMPTE campaign. This work consists of a data analysis investigation by means of a hybrid-Lagrangian study involving: global meteorological analyses, Lagrangian particle dispersion computation, and mesoscale, chemistry-transport, and Lagrangian photochemistry modeling. Our aim is to document the amount of observed ozone variability related to transport processes and, when appropriate, to infer the role of tropospheric photochemical production. Second, we propose a climatological analysis of the respective impacts of transport from the boundary layer and from the tropopause region on the tropospheric ozone budget. A multivariate analysis is presented and compared to a trajectography approach. Once validated, this algorithm is applied to the whole database of ozone profiles collected above Europe during the past 30 years in order to discuss the seasonal, geographical and temporal variability of transport processes as well as their impact on the tropospheric ozone budget. The variability of turbulent mixing and its impact on the persistence of tropospheric layers will also be discussed. (author)
Electrorheological fluids modeling and mathematical theory
Růžička, Michael
2000-01-01
This is the first book to present a model, based on the rational mechanics of electrorheological fluids, that takes into account the complex interactions between the electromagnetic fields and the moving liquid. Several constitutive relations for the Cauchy stress tensor are discussed. The main part of the book is devoted to a mathematical investigation of a model possessing shear-dependent viscosities, proving the existence and uniqueness of weak and strong solutions for the steady and the unsteady case. The PDE systems investigated possess so-called non-standard growth conditions. Existence results for elliptic systems with non-standard growth conditions and a nontrivial nonlinear right-hand side, and the first ever results for parabolic systems with non-standard growth conditions, are given. Written for advanced graduate students, as well as for researchers in the field, the discussion of both the modeling and the mathematics is self-contained.
Zhu, Q.; Jiang, H.; Liu, J.; Wei, X.; Peng, C.; Fang, X.; Liu, S.; Zhou, G.; Yu, S.; Ju, W.
2010-01-01
The Integrated Biosphere Simulator is used to evaluate the spatial and temporal patterns of the crucial hydrological variables [run-off and actual evapotranspiration (AET)] of the water balance across China for the period 1951–2006 including a precipitation analysis. Results suggest three major findings. First, simulated run-off captured 85% of the spatial variability and 80% of the temporal variability for 85 hydrological gauges across China. The mean relative errors were within 20% for 66% of the studied stations and within 30% for 86% of the stations. The Nash–Sutcliffe coefficients indicated that the quantity pattern of run-off was also captured acceptably except for some watersheds in southwestern and northwestern China. The possible reasons for underestimation of run-off in the Tibetan plateau include underestimation of precipitation and uncertainties in other meteorological data due to complex topography, and simplified representations of the soil depth attribute and snow processes in the model. Second, simulated AET matched reasonably with estimated values calculated as the residual of precipitation and run-off for watersheds controlled by the hydrological gauges. Finally, trend analysis based on the Mann–Kendall method indicated that significant increasing and decreasing patterns in precipitation appeared in the northwest part of China and the Yellow River region, respectively. Significant increasing and decreasing trends in AET were detected in the Southwest region and the Yangtze River region, respectively. In addition, the Southwest region, northern China (including the Heilongjiang, Liaohe, and Haihe Basins), and the Yellow River Basin showed significant decreasing trends in run-off, and the Zhemin hydrological region showed a significant increasing trend.
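The Nash–Sutcliffe coefficient used above to judge simulated run-off can be computed directly from observed and simulated series: NSE = 1 − Σ(sim − obs)² / Σ(obs − mean(obs))², where 1 is a perfect fit and values ≤ 0 mean the model is no better than the observed mean. The sample values below are invented:

```python
# Nash-Sutcliffe efficiency for a simulated vs. observed series.
def nse(obs, sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

obs = [10.0, 14.0, 20.0, 32.0, 24.0, 16.0]   # illustrative run-off, e.g. mm
sim = [11.0, 13.0, 22.0, 30.0, 25.0, 15.0]
print(round(nse(obs, sim), 3))
```

Because the denominator is the variance of the observations, NSE rewards capturing both the timing and the magnitude of flow peaks, which is why it complements the relative-error statistics quoted in the abstract.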
Density functional theory and multiscale materials modeling
Indian Academy of Sciences (India)
One of the vital ingredients in the theoretical tools useful in materials modeling at all the length scales of interest is the concept of density. In the microscopic length scale, it is the electron density that has played a major role in providing a deeper understanding of chemical binding in atoms, molecules and solids.
Directory of Open Access Journals (Sweden)
Zunjian Bian
2018-05-01
Full Text Available Land surface temperatures (LSTs obtained from remote sensing data are crucial in monitoring the conditions of crops and urban heat islands. However, since retrieved LSTs represent only the average temperature states of pixels, the distributions of temperatures within individual pixels remain unknown. Such data cannot satisfy the requirements of applications such as precision agriculture. Therefore, in this paper, we propose a model that combines a fast radiosity model, the Radiosity Applicable to Porous IndiviDual Objects (RAPID model, and energy budget methods to dynamically simulate brightness temperatures (BTs over complex surfaces. This model represents a model-based tool that can be used to estimate temperature distributions using fine-scale visible as well as near-infrared (VNIR data and temporal variations in meteorological conditions. The proposed model is tested over a study area in an artificial oasis in Northwestern China. The simulated BTs agree well with those measured with the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER. The results reflect root mean squared errors (RMSEs less than 1.6 °C and coefficients of determination (R2 greater than 0.7. In addition, compared to the leaf area index (LAI, this model displays high sensitivity to wind speed during validation. Although simplifications may be adopted for use in specific simulations, this proposed model can be used to support in situ measurements and to provide reference data over heterogeneous vegetation surfaces.
Toda theories, W-algebras, and minimal models
International Nuclear Information System (INIS)
Mansfield, P.; Spence, B.
1991-01-01
We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)
2005-12-01
…capital asset pricing model (CAPM). “According to the CAPM theory, investors determine their required return by adding a risk premium to the interest rate…” Subject terms: capital budgeting; GAO; DOD; capital assets; risk; OMB; NPV; IRR. …needs of the mission, as defined by the strategic plan, and limit the number of “nice to haves” (OMB, 1997). d. Alternatives to Capital Assets
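The CAPM rule quoted above reduces to one line: required return = risk-free rate + β × (market return − risk-free rate). A sketch with invented figures:

```python
# CAPM required return: risk-free rate plus beta times the market risk premium.
def capm_required_return(risk_free, beta, market_return):
    return risk_free + beta * (market_return - risk_free)

# Illustrative inputs: 3% risk-free rate, beta of 1.2, 8% expected market return.
r = capm_required_return(risk_free=0.03, beta=1.2, market_return=0.08)
print(round(r, 3))
```

The resulting rate is what would be used to discount a project's cash flows in an NPV or IRR comparison of the kind listed in the subject terms.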
Automated Physico-Chemical Cell Model Development through Information Theory
Energy Technology Data Exchange (ETDEWEB)
Peter J. Ortoleva
2005-11-29
The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g. cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems biology computational modules.
Computational hemodynamics theory, modelling and applications
Tu, Jiyuan; Wong, Kelvin Kian Loong
2015-01-01
This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...
Fuzzy Stochastic Optimization Theory, Models and Applications
Wang, Shuming
2012-01-01
Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies. The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...
Nonlinear model predictive control theory and algorithms
Grüne, Lars
2017-01-01
This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
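The receding-horizon loop at the core of NMPC can be illustrated with a deliberately tiny sketch: a scalar toy system x⁺ = x + u with a quadratic stage cost, and a brute-force search over a coarse input grid standing in for the nonlinear optimizer. All dynamics, costs, and parameters here are illustrative assumptions, not taken from the book.

```python
import itertools

def nmpc_step(x, horizon=3, candidates=(-1.0, -0.5, 0.0, 0.5, 1.0), r=0.1):
    """One receding-horizon step: brute-force search over a coarse input grid
    stands in for the nonlinear optimization routine at the core of NMPC."""
    best_u0, best_cost = 0.0, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        xi, cost = x, 0.0
        for u in seq:
            xi = xi + u                  # toy dynamics x+ = x + u (assumed)
            cost += xi * xi + r * u * u  # quadratic stage cost (assumed)
        if cost < best_cost:
            best_cost, best_u0 = cost, seq[0]
    return best_u0                       # apply only the first input, then re-plan

x = 2.0
for _ in range(6):
    x += nmpc_step(x)
print(x)  # the closed loop drives the state to the origin: 0.0
```

The loop applies only the first input of each optimized sequence and re-solves at the next step, which is exactly the receding-horizon principle the book formalizes.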
An A_r threesome: Matrix models, 2d conformal field theories, and 4d N=2 gauge theories
International Nuclear Information System (INIS)
Schiappa, Ricardo; Wyllard, Niclas
2010-01-01
We explore the connections between three classes of theories: A_r quiver matrix models, d=2 conformal A_r Toda field theories, and d=4 N=2 supersymmetric conformal A_r quiver gauge theories. In particular, we analyze the quiver matrix models recently introduced by Dijkgraaf and Vafa (unpublished) and make detailed comparisons with the corresponding quantities in the Toda field theories and the N=2 quiver gauge theories. We also make a speculative proposal for how the matrix models should be modified in order for them to reproduce the instanton partition functions in quiver gauge theories in five dimensions.
Lenses on reading an introduction to theories and models
Tracey, Diane H
2017-01-01
Widely adopted as an ideal introduction to the major models of reading, this text guides students to understand and facilitate children's literacy development. Coverage encompasses the full range of theories that have informed reading instruction and research, from classical thinking to cutting-edge cognitive, social learning, physiological, and affective perspectives. Readers learn how theory shapes instructional decision making and how to critically evaluate the assumptions and beliefs that underlie their own teaching. Pedagogical features include framing and discussion questions, learning a
Perturbation theory instead of large scale shell model calculations
International Nuclear Information System (INIS)
Feldmeier, H.; Mankos, P.
1977-01-01
Results of large-scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory, which provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation-theory treatment in an SU(3) basis including 2ℏω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.) [de
Scaling theory of depinning in the Sneppen model
International Nuclear Information System (INIS)
Maslov, S.; Paczuski, M.
1994-01-01
We develop a scaling theory for the critical depinning behavior of the Sneppen interface model [Phys. Rev. Lett. 69, 3539 (1992)]. This theory is based on a ''gap'' equation that describes the self-organization process to a critical state of the depinning transition. All of the critical exponents can be expressed in terms of two independent exponents, ν∥(d) and ν⊥(d), characterizing the divergence of the parallel and perpendicular correlation lengths as the interface approaches its dynamical attractor.
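A minimal extremal-dynamics sketch in the spirit of the Sneppen model shows the "gap" (the largest pinning force selected so far) self-organizing toward a critical threshold below 1. Lattice size, neighborhood-renewal rule, and step count are illustrative assumptions, not the model as specified in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 200                            # interface size (illustrative)
f = rng.random(L)                  # random pinning forces along the interface
gap = 0.0
gaps = []
for _ in range(5000):
    i = int(np.argmin(f))          # extremal rule: the weakest site advances
    gap = max(gap, float(f[i]))    # the "gap": largest force selected so far
    for j in (i - 1, i, i + 1):    # advancing renews the site and its neighbors
        f[j % L] = rng.random()
    gaps.append(gap)
print(round(gaps[-1], 3))          # rises toward a self-organized critical threshold
```

The monotone growth of the gap toward its limiting value is the self-organization process the abstract's "gap equation" describes.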
Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations
DEFF Research Database (Denmark)
Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing
2007-01-01
Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
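As a hedged illustration of the kind of GE-model such parameters feed, here is the two-parameter (modified) Margules form for binary activity coefficients; the parameter values A12 and A21 below are arbitrary placeholders, not values estimated in the study.

```python
import math

def margules_gammas(x1, A12, A21):
    """Two-parameter Margules activity coefficients for a binary mixture.
    A12, A21 are dimensionless interaction parameters (illustrative)."""
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

g1_inf, _ = margules_gammas(0.0, A12=0.5, A21=0.8)
print(round(g1_inf, 4))   # infinite-dilution limit: gamma1 = exp(A12) ≈ 1.6487
```

Fitting A12 and A21 to composition-fluctuation data from MD simulations is the kind of parameter-generation step the abstract outlines.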
The Use of Modelling for Theory Building in Qualitative Analysis
Briggs, Ann R. J.
2007-01-01
The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…
Goodness-of-Fit Assessment of Item Response Theory Models
Maydeu-Olivares, Alberto
2013-01-01
The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…
FY 1996 Congressional budget request: Budget highlights
Energy Technology Data Exchange (ETDEWEB)
1995-02-01
The FY 1996 budget presentation is organized by the Department's major business lines. An accompanying chart displays the request for new budget authority. The report compares the budget request for FY 1996 with the appropriated FY 1995 funding levels displayed on a comparable basis. The FY 1996 budget represents the first year of a five year plan in which the Department will reduce its spending by $15.8 billion in budget authority and by $14.1 billion in outlays. FY 1996 is a transition year as the Department embarks on its multiyear effort to do more with less. The Budget Highlights are presented by business line; however, the fifth business line, Economic Productivity, which is described in the Policy Overview section, cuts across multiple organizational missions, funding levels and activities and is therefore included in the discussion of the other four business lines.
Optimal velocity difference model for a car-following theory
International Nuclear Information System (INIS)
Peng, G.H.; Cai, X.H.; Liu, C.Q.; Cao, B.F.; Tuo, M.X.
2011-01-01
In this Letter, we present a new optimal velocity difference model (OVDM) for a car-following theory, based on the full velocity difference model. The linear stability condition of the new model is obtained by using linear stability theory. The unrealistically high deceleration of the earlier model does not appear in the OVDM. Numerical simulation of traffic dynamics shows that, by adjusting the coefficient of the optimal velocity difference, the new model avoids the negative velocities that occur in the full velocity difference model at small sensitivity coefficient λ, so that collisions disappear in the improved model. -- Highlights: → A new optimal velocity difference car-following model is proposed. → The effects of the optimal velocity difference on the stability of traffic flow are explored. → Starting and braking processes are simulated. → The optimal velocity difference term avoids the disadvantage of negative velocity.
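The underlying car-following structure can be sketched numerically: each follower relaxes toward an optimal velocity of its headway plus a velocity-difference term. The Bando-type optimal-velocity function and the coefficients a and λ below are standard illustrative choices, not the Letter's calibrated values.

```python
import numpy as np

def V(h):
    # illustrative Bando-type optimal-velocity function (assumed form)
    return np.tanh(h - 2.0) + np.tanh(2.0)

a, lam, dt = 1.0, 0.5, 0.05      # sensitivity, difference gain, time step (assumed)
N, steps = 10, 4000
x = np.linspace(36.0, 0.0, N)    # leader first, equilibrium spacing of 4
v = np.zeros(N)
v[0] = V(4.0)                    # leader cruises at the equilibrium speed
for _ in range(steps):
    h = x[:-1] - x[1:]           # followers' headways
    # full-velocity-difference update: relaxation plus velocity-difference term
    v[1:] += (a * (V(h) - v[1:]) + lam * (v[:-1] - v[1:])) * dt
    x += v * dt
print(round(float(v[-1]), 3))    # relaxes toward V(4), with no negative speeds
```

With these stable parameters the platoon settles back to the equilibrium headway and speed; the negative-velocity pathology the Letter addresses arises only for unfavorable coefficient choices.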
Advances in cognitive theory and therapy: the generic cognitive model.
Beck, Aaron T; Haigh, Emily A P
2014-01-01
For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.
Kane, Jacqueline
2004-01-01
Earth science teachers know how frustrating it can be to spend hundreds of dollars on three-dimensional (3-D) models of Earth's geologic features, to use the models for only a few class periods. To avoid emptying an already limited science budget, the author states that teachers can use a simple alternative to the expensive 3-D models--sand. She…
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2018-03-01
Although widely recognized as a comprehensive framework for representing score reliability, generalizability theory (G-theory), despite its potential benefits, has been used sparingly in reporting of results for measures of individual differences. In this article, we highlight many valuable ways that G-theory can be used to quantify, evaluate, and improve psychometric properties of scores. Our illustrations encompass assessment of overall reliability, percentages of score variation accounted for by individual sources of measurement error, dependability of cut-scores for decision making, estimation of reliability and dependability for changes made to measurement procedures, disattenuation of validity coefficients for measurement error, and linkages of G-theory with classical test theory and structural equation modeling. We also identify computer packages for performing G-theory analyses, most of which can be obtained free of charge, and describe how they compare with regard to data input requirements, ease of use, complexity of designs supported, and output produced. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Xie, Zhipeng; Hu, Zeyong; Xie, Zhenghui; Jia, Binghao; Sun, Genhou; Du, Yizhen; Song, Haiqing
2018-02-01
This paper presents the impact of two snow cover schemes (NY07 and SL12) in the Community Land Model version 4.5 (CLM4.5) on the snow distribution and surface energy budget over the Tibetan Plateau. The simulated snow cover fraction (SCF), snow depth, and snow cover days were evaluated against in situ snow depth observations and a satellite-based snow cover product and snow depth dataset. The results show that the SL12 scheme, which considers snow accumulation and snowmelt processes separately, has a higher overall accuracy (81.8%) than NY07 (75.8%); however, SL12 tends to underestimate the SCF (15.1% underestimation rate), whereas NY07 tends to overestimate it (15.2% overestimation rate). Both schemes capture the distribution of the maximum snow depth well but show large positive biases in the average value through all periods (3.37, 3.15, and 1.48 cm for NY07; 3.91, 3.52, and 1.17 cm for SL12) and overestimate snow cover days compared with the satellite-based product and in situ observations. Higher altitudes show larger root-mean-square errors (RMSEs) in the simulations of snow depth and snow cover days during the snow-free period. Moreover, the surface energy flux estimations from the SL12 scheme are generally superior to those from NY07 when evaluated against ground-based observations, in particular for net radiation and sensible heat flux. This study has great implications for further improvement of subgrid-scale snow variations over the Tibetan Plateau.
Hannah, David R.; Venkatachary, Ranga
2010-01-01
In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
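The Bayesian learning mechanism the authors invoke can be illustrated with a toy posterior update over two candidate causal structures for binary variables: "A causes B" versus "A and B are independent." All probabilities below are assumed for illustration only.

```python
# Toy Bayesian comparison of two causal hypotheses from joint observations.

def likelihood(h, a, b):
    p_a = 0.5                             # base rate of A (assumed)
    if h == "A->B":
        p_b_given_a = 0.9 if a else 0.1   # strong causal link (assumed)
    else:
        p_b_given_a = 0.5                 # independence: B ignores A
    return p_a * (p_b_given_a if b else 1.0 - p_b_given_a)

prior = {"A->B": 0.5, "indep": 0.5}
data = [(1, 1), (1, 1), (0, 0), (1, 1), (0, 0)]   # A and B co-occur
post = dict(prior)
for a, b in data:
    post = {h: post[h] * likelihood(h, a, b) for h in post}
    z = sum(post.values())                # normalize by Bayes' rule
    post = {h: p / z for h, p in post.items()}
print(round(post["A->B"], 3))             # the causal hypothesis dominates
```

Scaling this idea from two hypotheses to structured spaces of causal graphs is, in essence, what probabilistic-model accounts of the "theory theory" propose.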
M-Theory Model-Building and Proton Stability
Ellis, Jonathan Richard; Nanopoulos, Dimitri V; Ellis, John; Faraggi, Alon E.
1998-01-01
We study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. We exhibit the underlying geometric (bosonic) interpretation of these models, which have a $Z_2 \times Z_2$ orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
M-theory model-building and proton stability
International Nuclear Information System (INIS)
Ellis, J.; Faraggi, A.E.; Nanopoulos, D.V.; Houston Advanced Research Center, The Woodlands, TX; Academy of Athens
1997-09-01
The authors study the problem of baryon stability in M theory, starting from realistic four-dimensional string models constructed using the free-fermion formulation of the weakly-coupled heterotic string. Suitable variants of these models manifest an enhanced custodial gauge symmetry that forbids to all orders the appearance of dangerous dimension-five baryon-decay operators. The authors exhibit the underlying geometric (bosonic) interpretation of these models, which have a Z_2 x Z_2 orbifold structure similar, but not identical, to the class of Calabi-Yau threefold compactifications of M and F theory investigated by Voisin and Borcea. A related generalization of their work may provide a solution to the problem of proton stability in M theory.
Algebraic computability and enumeration models recursion theory and descriptive complexity
Nourani, Cyrus F
2016-01-01
This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...
Theory to practice: the humanbecoming leading-following model.
Ursel, Karen L
2015-01-01
Guided by the humanbecoming leading-following model, the author designed a nursing theories course with the intention of creating a meaningful nursing theory to practice link. The author perceived that with the implementation of Situation-Background-Assessment-Recommendations (SBAR) communication, nursing staff had drifted away from using the Kardex™ in shift-to-shift reporting. Nursing students, faculty, and staff members supported the creation of a theories project which would engage nursing students in the pursuit of clinical excellence. The project chosen was to revise the existing Kardex™ (predominant nursing communication tool). In the project, guided by a nursing theory, nursing students focused on the unique patient's experience, depicting the specific role of nursing knowledge and the contributions of the registered nurse to the patient's healthcare journey. The emphasis of this theoretical learning was the application of a nursing theory to real-life clinical challenges with communication of relevant, timely, and accurate patient information, recognizing that real problems are often complex and require multi-perspective approaches. This project created learning opportunities where a nursing theory would be chosen by the nursing student clinical group and applied in their clinical specialty area. This practice activity served to broaden student understandings of the role of nursing knowledge and nursing theories in their professional practice. © The Author(s) 2014.
Theory of Time beyond the standard model
International Nuclear Information System (INIS)
Poliakov, Eugene S.
2008-01-01
A frame of non-uniform time is discussed. A concept of 'flow of time' is presented. The principle of time relativity in analogy with the Galilean principle of relativity is set. An equivalence principle is set to state that the outcome of non-uniform time in an inertial frame of reference is equivalent to the outcome of a fictitious gravity force external to the frame of reference. Thus it is flow of time that causes gravity rather than mass. The latter is compared to experimental data, achieving precision of up to 0.0003%. It is shown that the law of energy conservation is inapplicable to frames of non-uniform time. A theoretical model of a physical entity (point mass, photon) travelling in the field of non-uniform time is considered. A generalized law that allows the flow of time to replace classical energy conservation is introduced on the basis of the experiment of Pound and Rebka. It is shown that linear dependence of the flow of time on a spatial coordinate conforms to the inverse square law of universal gravitation and Keplerian mechanics. Momentum is shown to still be conserved.
Standard Model theory calculations and experimental tests
International Nuclear Information System (INIS)
Cacciari, M.; Hamel de Monchenault, G.
2015-01-01
To present knowledge, all the physics at the Large Hadron Collider (LHC) can be described in the framework of the Standard Model (SM) of particle physics. Indeed the newly discovered Higgs boson with a mass close to 125 GeV seems to confirm the predictions of the SM. Thus, besides looking for direct manifestations of the physics beyond the SM, one of the primary missions of the LHC is to perform ever more stringent tests of the SM. This requires not only improved theoretical developments to produce testable predictions and provide experiments with reliable event generators, but also sophisticated analyses techniques to overcome the formidable experimental environment of the LHC and perform precision measurements. In the first section, we describe the state of the art of the theoretical tools and event generators that are used to provide predictions for the production cross sections of the processes of interest. In section 2, inclusive cross section measurements with jets, leptons and vector bosons are presented. Examples of differential cross sections, charge asymmetries and the study of lepton pairs are proposed in section 3. Finally, in section 4, we report studies on the multiple production of gauge bosons and constraints on anomalous gauge couplings
Models with oscillator terms in noncommutative quantum field theory
International Nuclear Information System (INIS)
Kronberger, E.
2010-01-01
The main focus of this Ph.D. thesis is on noncommutative models involving oscillator terms in the action. The first one historically is the successful Grosse-Wulkenhaar (G.W.) model which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably it is furthermore capable of solving the Landau ghost problem. In a first step, we have generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs we have programmed a powerful Mathematica package, which is capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms originally not present in the action arise, which led us to the conclusion that we should instead start from a theory where those terms are already built in. Fortunately there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it contains also some unphysical terms linear in the gauge field. Advantageously we could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as gauge field propagator. Furthermore we were able to calculate the ghost propagator, which turned out to be very involved. Thus we were able to start with the first few loop computations showing the expected behavior. The next step is to show renormalizability of the model, and some hints in this direction are also given. (author) [de
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory, such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio, can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
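The foundational quantity underlying the abstract's first set of concepts, Shannon entropy, can be computed from an empirical symbol distribution in a few lines:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Entropy in bits of the empirical distribution over a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aabb"))   # uniform over two symbols: 1.0 bit
print(shannon_entropy("abcd"))   # four equiprobable symbols: 2.0 bits
```

Applied to neural or behavioral data, such a metric quantifies how much information a signal carries, which is the sense in which the article proposes entropy as an integrity measure.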
A spatial Mankiw-Romer-Weil model: Theory and evidence
Fischer, Manfred M.
2009-01-01
This paper presents a theoretical growth model that extends the Mankiw-Romer-Weil [MRW] model by accounting for technological interdependence among regional economies. Interdependence is assumed to work through spatial externalities caused by disembodied knowledge diffusion. The transition from theory to econometrics leads to a reduced-form empirical spatial Durbin model specification that explains the variation in regional levels of per worker output at steady state. A system ...
Reservoir theory, groundwater transit time distributions, and lumped parameter models
International Nuclear Information System (INIS)
Etcheverry, D.; Perrochet, P.
1999-01-01
The relation between groundwater residence times and transit times is given by the reservoir theory. It allows one to calculate theoretical transit time distributions in a deterministic way, analytically or on numerical models. Two analytical solutions validate the piston-flow and the exponential model for simple conceptual flow systems. A numerical solution of a hypothetical regional groundwater flow shows that lumped parameter models could be applied in some cases to large-scale, heterogeneous aquifers. (author)
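The exponential lumped-parameter model mentioned above has the transit-time distribution g(t) = exp(-t/τ)/τ. A quick numerical check (τ assumed, simple rectangle-rule quadrature) confirms that it is normalized and that its mean recovers τ:

```python
import numpy as np

tau = 10.0                         # mean transit time (assumed, e.g. years)
dt = 0.01
t = np.arange(0.0, 200.0, dt)
g = np.exp(-t / tau) / tau         # exponential-model transit-time distribution
mass = float(g.sum() * dt)         # integral of g: ~1 (a probability density)
mean = float((t * g).sum() * dt)   # first moment: recovers ~tau
print(round(mass, 2), round(mean, 2))
```

Convolving an input tracer signal with such a weighting function is the standard lumped-parameter route from transit-time distributions to observable outlet concentrations.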
Theory of compressive modeling and simulation
Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith
2013-05-01
Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from computing-power and turnaround-time limitations. We suggest a third approach. We call it (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural network (LCNN) algorithm. CM&S based MFE can generalize LCNN to 2nd order as Nonlinear augmented LCNN. For example, during the sunset, we can avoid a reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon. With CM&S we can use a night-vision camera instead of a day camera. We decomposed the long wave infrared (LWIR) band with a filter into 2 vector components (8~10μm and 10~12μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then, we up-shifted consistently, according to the de-mixed sources map, to the sub-micron RGB color image. Moreover, the night vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, suffering less blur owing to dusty smoke scattering and enjoying apparent smoothness of surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution; but gains two orders of magnitude in the reflectivity, and gains another two orders in the propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computing in the sense of super-saving CS: measuring one & getting one's neighborhood free.
Leroux, Estelle; Gorini, Christian; Aslanian, Daniel; Rabineau, Marina; Blanpied, Christian; Rubino, Jean-Loup; Robin, Cécile; Granjeon, Didier; Taillepierre, Rachel
2016-04-01
The post-rift (~20-0 Ma) vertical movements of the Provence Basin (West Mediterranean) are quantified on both of its conjugate margins (the Gulf of Lion and West Sardinia). This work is based on the stratigraphic study of sedimentary markers using a large 3D grid of seismic data, correlations with existing drillings and refraction data. The post-rift subsidence is measured by the direct use of sedimentary geometries analysed in 3D [Gorini et al., 2015; Rabineau et al., 2014] and validated by numerical stratigraphic modelling. Three domains were found: on the platform (1) and slope (2), the subsidence takes the form of a seaward tilting with different amplitudes, whereas the deep basin (3) subsides purely vertically [Leroux et al., 2015a]. These domains correspond to the deeper crustal domains highlighted by wide-angle seismic data. The continental crust (1) and the thinned continental crust (2) are tilted, whereas the intermediate crust, identified as exhumed lower continental crust [Moulin et al., 2015; Afilhado et al., 2015] (3), sagged. The post-break-up subsidence re-uses the initial hinge lines of the rifting phase. This striking correlation between surface geologic processes and deep earth dynamic processes emphasizes that the sedimentary record and sedimentary markers are a window into deep geodynamic processes and dynamic topography. Pliocene-Pleistocene seismic markers enabled high-resolution quantification of sediment budgets over the past 6 Myr [Leroux et al., in press]. Sediment budget history is here completed for the Miocene interval. Thus, the controlling factors (climate, tectonics and eustasy) are discussed. Afilhado, A., Moulin, M., Aslanian, D., Schnürle, P., Klingelhoefer, F., Nouzé, H., Rabineau, M., Leroux, E. & Beslier, M.-O. (2015). Deep crustal structure across a young 1 passive margin from wide-angle and reflection seismic data (The SARDINIA Experiment) - II. Sardinia's margin. Bull. Soc. géol. France, 186, ILP Spec. issue, 4
Consistent constraints on the Standard Model Effective Field Theory
International Nuclear Information System (INIS)
Berthier, Laure; Trott, Michael
2016-01-01
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut-off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S, T analysis is modified by the theory errors we include as an illustrative example.
Effective potential in Lorentz-breaking field theory models
Energy Technology Data Exchange (ETDEWEB)
Baeta Scarpelli, A.P. [Centro Federal de Educacao Tecnologica, Nova Gameleira Belo Horizonte, MG (Brazil); Setor Tecnico-Cientifico, Departamento de Policia Federal, Belo Horizonte, MG (Brazil); Brito, L.C.T. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Felipe, J.C.C. [Universidade Federal de Lavras, Departamento de Fisica, Lavras, MG (Brazil); Universidade Federal dos Vales do Jequitinhonha e Mucuri, Instituto de Engenharia, Ciencia e Tecnologia, Veredas, Janauba, MG (Brazil); Nascimento, J.R.; Petrov, A.Yu. [Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, Paraiba (Brazil)
2017-12-15
We calculate explicitly the one-loop effective potential in different Lorentz-breaking field theory models. First, we consider a Yukawa-like theory and some examples of Lorentz-violating extensions of scalar QED. We observe, for the extended QED models, that the resulting effective potential converges to the known result in the limit in which Lorentz symmetry is restored. Besides, the one-loop corrections to the effective potential in all the cases we study depend on the background tensors responsible for the Lorentz-symmetry violation. This has consequences for physical quantities like, for example, in the induced mass due to the Coleman-Weinberg mechanism. (orig.)
Lenses on Reading An Introduction to Theories and Models
Tracey, Diane H
2012-01-01
This widely adopted text explores key theories and models that frame reading instruction and research. Readers learn why theory matters in designing and implementing high-quality instruction and research; how to critically evaluate the assumptions and beliefs that guide their own work; and what can be gained by looking at reading through multiple theoretical lenses. For each theoretical model, classroom applications are brought to life with engaging vignettes and teacher reflections. Research applications are discussed and illustrated with descriptions of exemplary studies. New to This Edition
Arrospide, Arantzazu; Idigoras, Isabel; Mar, Javier; de Koning, Harry; van der Meulen, Miriam; Soto-Gordoa, Myriam; Martinez-Llorente, Jose Miguel; Portillo, Isabel; Arana-Arri, Eunate; Ibarrondo, Oliver; Lansdorp-Vogelaar, Iris
2018-04-25
The Basque Colorectal Cancer Screening Programme began in 2009 and its implementation has been complete since 2013. Faecal immunological testing was used to screen individuals between 50 and 69 years old. Colorectal cancer in the Basque Country has unusual epidemiological features: colorectal cancer incidence is similar to that of other European countries, while adenoma prevalence is higher. The objective of our study was to evaluate the programme economically via cost-effectiveness and budget impact analyses with microsimulation models. We applied the Microsimulation Screening Analysis (MISCAN)-Colon model to predict trends in colorectal cancer incidence and mortality and to quantify the short- and long-term effects and costs of the Basque Colorectal Cancer Screening Programme. The model was calibrated to the Basque demographics in 2008 and to age-specific colorectal cancer incidence data in the Basque Cancer Registry from 2005 to 2008, before screening began. The model was also calibrated to the high adenoma prevalence observed for the Basque population in a previously published study. The multi-cohort approach used in the model included all the cohorts in the programme during 30 years of implementation, with lifetime follow-up. Unit costs were obtained from the Basque Health Service, and both cost-effectiveness analysis and budget impact analysis were carried out. The goodness-of-fit of the model adaptation to observed programme data was taken as evidence of validity. In the cost-effectiveness analysis, the savings from treatment were larger than the added costs due to screening. Thus, the Basque programme was dominant compared to no screening, as life expectancy increased by 29.3 days per person. The savings in the budget analysis appeared 10 years after the complete implementation of the programme. The average annual budget was €73.4 million from year 2023 onwards. This economic evaluation showed a screening intervention with a major health gain
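The dominance logic used in such a cost-effectiveness analysis can be sketched as follows; only the 29.3-day life-expectancy gain comes from the abstract, and the cost totals are hypothetical:

```python
# Illustrative cost-effectiveness comparison. A strategy "dominates"
# when it is no more expensive and strictly more effective; otherwise
# an incremental cost-effectiveness ratio (ICER) is reported.
def compare(cost_new, cost_old, effect_new, effect_old):
    d_cost = cost_new - cost_old
    d_eff = effect_new - effect_old
    if d_cost <= 0 and d_eff > 0:
        return "dominant"           # cheaper (or equal) and more effective
    if d_eff == 0:
        return "equal effect"
    return f"ICER = {d_cost / d_eff:.2f} per life-year"

gain_ly = 29.3 / 365.0  # life-years gained per person (from the abstract)
# Screening saves money overall because treatment savings exceed
# screening costs (per-person cost totals below are hypothetical):
print(compare(cost_new=900.0, cost_old=1000.0,
              effect_new=gain_ly, effect_old=0.0))  # dominant
```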
Integrable models in 1+1 dimensional quantum field theory
International Nuclear Information System (INIS)
Faddeev, Ludvig.
1982-09-01
The goal of this lecture is to present a unifying view of the exactly soluble models. Several reasons argue in favor of 1+1 dimensional models: every exact solution of a field-theoretical model can teach about the ability of quantum field theory to describe spectrum and scattering, and some 1+1 dimensional models have physical applications in solid state theory. There are several ways to become acquainted with the methods of exactly soluble models: via classical statistical mechanics, via the Bethe Ansatz, or via the inverse scattering method. The fundamental Poisson bracket relation (FPR) and/or fundamental commutation relations (FCR) play a fundamental role. A general classification of FPR is given, with promising generalizations to FCR
A model of PCF in guarded type theory
DEFF Research Database (Denmark)
Paviotti, Marco; Møgelberg, Rasmus Ejlers; Birkedal, Lars
2015-01-01
Guarded recursion is a form of recursion where recursive calls are guarded by delay modalities. Previous work has shown how guarded recursion is useful for constructing logics for reasoning about programming languages with advanced features, as well as for constructing and reasoning about elements of coinductive types. In this paper we investigate how type theory with guarded recursion can be used as a metalanguage for denotational semantics, useful both for constructing models and for proving properties of these. We do this by constructing a fairly intensional model of PCF and proving it computationally adequate. The model construction is related to Escardo's metric model for PCF, but here everything is carried out entirely in type theory with guarded recursion, including the formulation of the operational semantics, the model construction and the proof of adequacy.
An introduction to queueing theory modeling and analysis in applications
Bhat, U Narayan
2015-01-01
This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications, with appropriate references for advanced topics. • Applications in manufacturing, and in computer and communication systems. • A chapter on ...
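As a taste of the models such a course covers, the steady-state M/M/1 formulas (standard textbook results, not excerpted from this book) can be computed directly:

```python
# Basic M/M/1 queue metrics: Poisson arrivals at rate lam, exponential
# service at rate mu, single server, stability requires lam < mu.
def mm1_metrics(lam, mu):
    if lam >= mu:
        raise ValueError("unstable queue: need lam < mu")
    rho = lam / mu                  # server utilisation
    L = rho / (1 - rho)             # mean number in system
    W = 1 / (mu - lam)              # mean time in system
    Lq = rho ** 2 / (1 - rho)       # mean queue length
    Wq = rho / (mu - lam)           # mean wait before service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

m = mm1_metrics(lam=2.0, mu=5.0)
print(m["rho"])  # 0.4
```

Little's law, L = λW, holds by construction and is a quick sanity check on the formulas.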
Lika, Konstadia; Kearney, Michael R.; Kooijman, Sebastiaan A. L. M.
2011-11-01
The covariation method for estimating the parameters of the standard Dynamic Energy Budget (DEB) model provides a single-step method of accessing all the core DEB parameters from commonly available empirical data. In this study, we assess the robustness of this parameter estimation procedure and analyse the role of pseudo-data using elasticity coefficients. In particular, we compare the performance of Maximum Likelihood (ML) vs. Weighted Least Squares (WLS) approaches and find that the two approaches tend to converge in performance as the number of uni-variate data sets increases, but that WLS is more robust when data sets comprise single points (zero-variate data). The efficiency of the approach is shown to be high, and the prior parameter estimates (pseudo-data) have very little influence if the real data contain information about the parameter values. For instance, the effect of the pseudo-value for the allocation fraction κ is reduced when there is information on both growth and reproduction, that for the energy conductance is reduced when information on age at birth and puberty is given, and the effects of the pseudo-value for the maturity maintenance rate coefficient are insignificant. The estimation of some parameters (e.g., the zoom factor and the shape coefficient) requires little information, while that of others (e.g., maturity maintenance rate, puberty threshold and reproduction efficiency) requires data at several food levels. The generality of the standard DEB model, in combination with the estimation of all of its parameters, allows comparison of species on the basis of parameter values. We discuss a number of preliminary patterns emerging from the present collection of parameter estimates across a wide variety of taxa. We make the observation that the estimated value of the fraction κ of mobilised reserve that is allocated to soma is far from the value that maximises reproduction. We recognise this as the reason why two very different
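The pseudo-data idea can be sketched with a minimal weighted-least-squares example (an invented one-parameter linear model, not the actual DEB estimation): pseudo-data enter as extra observations with small weights, so they barely move the estimate when the real data are informative.

```python
import numpy as np

# Fit y ~ a*x by minimising sum(w * (y - a*x)^2); closed-form WLS.
def wls_fit(x, y, w):
    return np.sum(w * x * y) / np.sum(w * x * x)

x_real = np.array([1.0, 2.0, 3.0])
y_real = np.array([2.1, 3.9, 6.2])      # informative real data, slope ~ 2
x_pseudo = np.array([1.0])
y_pseudo = np.array([5.0])              # prior guess, far from the data

x = np.concatenate([x_real, x_pseudo])
y = np.concatenate([y_real, y_pseudo])

a_lo = wls_fit(x, y, w=np.array([1, 1, 1, 0.01]))  # low pseudo-data weight
a_hi = wls_fit(x, y, w=np.array([1, 1, 1, 10.0]))  # high pseudo-data weight
# With informative real data and low pseudo-data weight, the prior
# barely moves the estimate away from the true slope:
print(abs(a_lo - 2.0) < abs(a_hi - 2.0))  # True
```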
Traffic Games: Modeling Freeway Traffic with Game Theory.
Cortés-Berrueco, Luis E; Gershenson, Carlos; Stephens, Christopher R
2016-01-01
We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers' interactions.
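A toy illustration of the game-theoretic ingredient (payoffs invented here, not taken from the paper): a lane-change interaction can be cast as a chicken-like two-driver game whose pure Nash equilibria have exactly one driver insisting.

```python
# Each driver chooses to Yield ("Y") or Insist ("I") during a lane
# change; payoff[(a0, a1)] gives (driver 0, driver 1) mobility payoffs.
payoff = {
    ("Y", "Y"): (2, 2),   # both cooperate: smooth flow
    ("Y", "I"): (1, 3),   # the insisting driver gains mobility
    ("I", "Y"): (3, 1),
    ("I", "I"): (0, 0),   # conflict: both blocked
}

def best_response(opponent_action, player):
    acts = ["Y", "I"]
    if player == 0:
        return max(acts, key=lambda a: payoff[(a, opponent_action)][0])
    return max(acts, key=lambda a: payoff[(opponent_action, a)][1])

def is_nash(a0, a1):
    return best_response(a1, 0) == a0 and best_response(a0, 1) == a1

# The asymmetric profiles are the pure Nash equilibria of this
# chicken-like game; mutual insistence is not stable:
print(is_nash("I", "Y"), is_nash("Y", "I"), is_nash("I", "I"))
```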
Comparison of potential models through heavy quark effective theory
International Nuclear Information System (INIS)
Amundson, J.F.
1995-01-01
I calculate heavy-light decay constants in a nonrelativistic potential model. The resulting estimate of heavy quark symmetry breaking conflicts with similar estimates from lattice QCD. I show that a semirelativistic potential model eliminates the conflict. Using the results of heavy quark effective theory allows me to identify and compensate for shortcomings in the model calculations in addition to isolating the source of the differences in the two models. The results lead to a rule as to where the nonrelativistic quark model gives misleading predictions
International Nuclear Information System (INIS)
Guendelman, E.
2004-01-01
Full Text: The volume element of space-time can be considered as a geometrical object which can be independent of the metric. The use in the action of a volume element which is metric independent leads to the appearance of a measure of integration which is metric independent. This can be applied to all known generally coordinate invariant theories; we will discuss three very important cases: 1. 4-D theories describing gravity and matter fields; 2. parametrization invariant theories of extended objects; and 3. higher dimensional theories including gravity and matter fields. In case 1, a large number of new effects appear: (i) spontaneous breaking of scale invariance associated to integration of degrees of freedom related to the measure; (ii) under normal particle physics laboratory conditions fermions split into three families, but when matter is highly diluted, neutrinos increase their mass and become suitable candidates for dark matter; (iii) cosmic coincidence between dark energy and dark matter is natural; (iv) quintessence scenarios with automatic decoupling of the quintessence scalar to ordinary matter, but not dark matter, are obtained. In case 2, for theories of extended objects, the use of a measure of integration independent of the metric leads to (i) dynamical tension; (ii) string models of non-abelian confinement; (iii) the possibility of new Weyl-invariant light-like branes (WILL branes). These WILL branes dynamically adjust themselves to sit at black hole horizons and, in the context of higher dimensional theories, can provide examples of massless 4-D particles with nontrivial Kaluza-Klein quantum numbers. In case 3, in brane and Kaluza-Klein scenarios, the use of a measure independent of the metric makes it possible to naturally construct models where only the extra dimensions get curved and the 4-D observable space-time remains flat
Theory of positive disintegration as a model of adolescent development.
Laycraft, Krystyna
2011-01-01
This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost a half century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent the behaviour of the early, middle, and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.
International Nuclear Information System (INIS)
Le Quere, C.; Moriarty, R.; Jones, S.D.; Boden, T.A.; Peters, G.P.; Andrew, R.M.; Andres, R.J.; Ciais, P.; Bopp, L.; Maignan, F.; Viovy, N.
2014-01-01
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2 and land cover change (some including nitrogen-carbon interactions). All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2003-2012), EFF was 8.6±0.4 GtC yr-1, ELUC 0.9±0.5 GtC yr-1, GATM 4.3±0
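The residual estimate of the land sink described above can be sketched numerically. The EFF, ELUC and GATM central values are those quoted for 2003-2012; the ocean-sink value and the GATM and ocean-sink uncertainties are illustrative assumptions, since they are not fully given in the text.

```python
import math

# The land sink is the residual of the other budget terms:
#   S_LAND = E_FF + E_LUC - G_ATM - S_OCEAN
eff, d_eff = 8.6, 0.4        # fossil fuel and cement, GtC/yr (quoted)
eluc, d_eluc = 0.9, 0.5      # land-use change, GtC/yr (quoted)
gatm, d_gatm = 4.3, 0.1      # atmospheric growth, GtC/yr (uncertainty assumed)
socean, d_socean = 2.5, 0.5  # ocean sink, GtC/yr (value and uncertainty assumed)

sland = eff + eluc - gatm - socean
# Uncertainties combined in quadrature, treating terms as independent:
d_sland = math.sqrt(d_eff**2 + d_eluc**2 + d_gatm**2 + d_socean**2)
print(round(sland, 1))  # 2.7
```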
Bertoldi, Giacomo; Cordano, Emanuele; Brenner, Johannes; Senoner, Samuel; Della Chiesa, Stefano; Niedrist, Georg
2017-04-01
In mountain regions, the plot- and catchment-scale water and energy budgets are controlled by a complex interplay of different abiotic (i.e. topography, geology, climate) and biotic (i.e. vegetation, land management) controlling factors. When integrated physically-based eco-hydrological models are used in mountain areas, there are a large number of parameters, topographic and boundary conditions that need to be chosen. However, data on soil and land-cover properties are relatively scarce and do not reflect the strong variability at the local scale. For this reason, tools for uncertainty quantification and optimal parameter identification are essential not only to improve model performance, but also to identify the most relevant parameters to be measured in the field and to evaluate the impact of different assumptions for topographic and boundary conditions (surface, lateral and subsurface water and energy fluxes), which are usually unknown. In this contribution, we present the results of a sensitivity analysis exercise for a set of 20 experimental stations located in the Italian Alps, representative of different conditions in terms of topography (elevation, slope, aspect), land use (pastures, meadows, and apple orchards), soil type and groundwater influence. Besides micrometeorological parameters, each station provides soil water content at different depths, and three stations (one for each land cover) provide eddy covariance fluxes. The aims of this work are: (I) To present an approach for improving calibration of plot-scale soil moisture and evapotranspiration (ET). (II) To identify the most sensitive parameters and relevant factors controlling temporal and spatial differences among sites. (III) To identify possible model structural deficiencies or uncertainties in boundary conditions. Simulations have been performed with the GEOtop 2.0 model, which is a physically-based, fully distributed integrated eco-hydrological model that has been specifically designed for mountain
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution to delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions toward the telehealth system. Generally, the findings show that social capital factors (social trust, institutional trust, and social participation) significantly positively affect the technological factors (perceived ease of use and perceived usefulness, respectively), which influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data considerably well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
Modelling machine ensembles with discrete event dynamical system theory
Hunter, Dan
1990-01-01
Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
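A minimal sketch of such a local model (the machine, its events, and its timings are invented for illustration): states, an event alphabet, a partial transition function, and a per-event duration.

```python
# Local model in the DEDS style described above: firing an event that is
# undefined in the current state raises, reflecting the partial
# transition function.
class LocalModel:
    def __init__(self, initial, transitions, durations):
        self.state = initial
        self.transitions = transitions  # (state, event) -> next state
        self.durations = durations      # event -> time required
        self.clock = 0.0

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"event {event!r} not enabled in {self.state!r}")
        self.state = self.transitions[key]
        self.clock += self.durations[event]

# A hypothetical pick-and-place submachine:
robot = LocalModel(
    initial="idle",
    transitions={("idle", "grasp"): "holding",
                 ("holding", "place"): "idle"},
    durations={"grasp": 2.0, "place": 3.0},
)
for ev in ["grasp", "place"]:
    robot.fire(ev)
print(robot.state, robot.clock)  # idle 5.0
```

A global model would compose several such local models and restrict which events may interleave, which is where supervisory control enters.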
Theory, modeling, and integrated studies in the Arase (ERG) project
Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa
2018-02-01
Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of the magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of the three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.
Directory of Open Access Journals (Sweden)
Nazeer M. Asmael
2015-07-01
In developing countries such as Syria, the lack of hydrological data affects groundwater resource assessment. Groundwater models provide the means to fill the gaps in the available data in order to improve the understanding of groundwater systems. The study area can be considered as the main recharge area of the eastern side of the Barada and Awaj basin in the eastern part of Mt. Hermon. Withdrawal for agricultural and domestic purposes removes a considerable amount of water. The steady-state three-dimensional (3D) groundwater model FEFLOW, which is an advanced finite-element groundwater flow and transport modeling tool, was used to quantify groundwater budget components using all available data for the hydrological year 2009-2010. The results obtained may be considered an essential tool for groundwater management options in the study area. The calibrated model demonstrates a good agreement between the observed and simulated hydraulic head. The result of the sensitivity analysis shows that the model is highly sensitive to hydraulic conductivity changes and sensitive to a lesser extent to the water recharge amount. Regarding the upper aquifer horizon, the water budget under steady-state conditions indicates that the lateral groundwater inflow from the Jurassic aquifer into this horizon is the most important recharge component. The major discharge component from this aquifer horizon occurs at its eastern boundary toward the outside of the model domain. The model was able to produce a satisfying estimation of the preliminary water budget of the upper aquifer horizon, which indicates a positive imbalance of 4.6 Mm3·y−1.
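The budget bookkeeping behind the reported imbalance can be sketched as follows; the individual recharge and discharge components are invented, and only the 4.6 Mm³·y⁻¹ imbalance from the abstract is used as the target:

```python
# Steady-state water-budget bookkeeping for an aquifer horizon:
# imbalance = total recharge - total discharge, in Mm^3/yr.
recharge = {
    "lateral_inflow_jurassic": 30.0,  # dominant component (value assumed)
    "direct_recharge": 10.0,          # assumed
}
discharge = {
    "eastern_boundary_outflow": 28.0,  # assumed
    "withdrawal": 7.4,                 # assumed
}

imbalance = sum(recharge.values()) - sum(discharge.values())
print(round(imbalance, 1))  # 4.6
```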
Directory of Open Access Journals (Sweden)
Pilli R
2014-02-01
Historical analysis and modeling of forest carbon dynamics using the Carbon Budget Model: an example for the Trento Province (NE Italy). The Carbon Budget Model (CBM-CFS3) developed by the Canadian Forest Service was applied to data collected by the last Italian National Forest Inventory (INFC) for the Trento Province (NE Italy). CBM was modified and adapted to the different management types (i.e., even-aged high forests, uneven-aged high forests and coppices) and silvicultural systems (including clear cuts, single tree selection systems and thinning) applied in this province. The aim of this study was to provide an example of down-scaling of this model from a national to a regional scale, providing (i) an historical analysis, from 1995 to 2011, and (ii) a projection, from 2012 to 2020, of the forest biomass and carbon stock evolution. The analysis was based on the harvest rate reported by the Italian National Institute of Statistics (from 1995 to 2011), corrected according to the last INFC data and distinguished between timber and fuel woods and between conifers and broadleaves. Since 2012, we applied a constant harvest rate, equal to about 1300 Mm3 yr-1, estimated from the average harvest rate for the period 2006-2011. Model results were consistent with similar data reported in the literature. The average biomass C stock was 90 Mg C ha-1, and the biomass C stock change was 0.97 Mg C ha-1 yr-1 and 0.87 Mg C ha-1 yr-1 for the periods 1995-2011 and 2012-2020, respectively. The C stock accumulated by timber products since 1995 was 96 Gg C yr-1, i.e., about 28% of the average annual C stock change of the forests, equal to 345 Gg C yr-1. CBM also provided estimates of the evolution of the age class distribution of the even-aged forests and of the C stock of the DOM forest pools (litter, dead wood and soil). This study demonstrates the utility of CBM for providing estimates at a regional or local scale, using not only the data provided by the forest
Spectral and scattering theory for translation invariant models in quantum field theory
DEFF Research Database (Denmark)
Rasmussen, Morten Grud
This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar field, and cover the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum, where the fixed momentum fiber Hamiltonians are given by an explicit expression involving the total momentum and the Segal field operator.
Constituency Input into Budget Management.
Miller, Norman E.
1995-01-01
Presents techniques for ensuring constituency involvement in district- and site-level budget management. Outlines four models for securing constituent input and focuses on strategies to orchestrate the more complex model for staff and community participation. Two figures are included. (LMI)
Cirafici, M.; Sinkovics, A.; Szabo, R.J.
2009-01-01
We study the relation between Donaldson–Thomas theory of Calabi–Yau threefolds and a six-dimensional topological Yang–Mills theory. Our main example is the topological U(N) gauge theory on flat space in its Coulomb branch. To evaluate its partition function we use equivariant localization techniques.
Davey, Keith; Chang, Bernard; Purslow, Christine; Clay, Emilie; Vataire, Anne-Lise
2018-04-19
During cataract surgery, maintaining an adequate degree of mydriasis throughout the entire operation is critical to allow for visualisation of the capsulorhexis and the crystalline lens. Good anaesthesia is also essential for safe intraocular surgery. Mydrane® is a new injectable intracameral solution containing two mydriatics (tropicamide 0.02% and phenylephrine 0.31%) and one anaesthetic (lidocaine 1%) that was developed as an alternative to the conventional topical pre-operative mydriatics used in cataract surgery. This study aimed to estimate the budget impact, over a one-year time frame, of using Mydrane® instead of topical dilating eye drops for a UK hospital performing 3,000 cataract operations a year. A budget impact model (BIM) was developed to compare the economic outcomes associated with the use of Mydrane® versus topical drops (tropicamide 0.5% and phenylephrine 10%) in patients undergoing cataract surgery in a UK hospital. The outcomes of interest included costs and resource use (e.g. clinician time, mydriasis failures, operating room time, number of patients per vial of therapy) associated with management of mydriasis in patients undergoing cataract surgery. All model inputs considered the UK hospital perspective without social or geographical variables. Deterministic sensitivity analyses were also performed to assess the model uncertainty. Introduction of Mydrane® is associated with a cost saving of £6,251 over 3,000 cataract surgeries in one year. The acquisition costs of Mydrane® (£18,000 per year vs. £3,330 for eye drops) were more than offset by substantial reductions in nurses' costs and time, plus a smaller contribution from savings in surgeons' costs (£20,511 in total) and lower costs associated with auxiliary dilation (£410, due to avoidance of additional dilation methods). Results of the sensitivity analyses confirmed the robustness of the model to the variation of inputs. Except for the duration of one session of eye drop instillation
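The headline saving can be reproduced from the figures quoted in the abstract (a back-of-envelope reconstruction, not the published model):

```python
# Budget-impact arithmetic for one UK hospital performing 3,000
# cataract operations per year, using the abstract's figures.
acquisition_new = 18_000.0    # Mydrane, per year
acquisition_old = 3_330.0     # topical eye drops, per year
staff_savings = 20_511.0      # nurses' and surgeons' time and costs
auxiliary_savings = 410.0     # avoided additional dilation methods

extra_acquisition = acquisition_new - acquisition_old
net_saving = staff_savings + auxiliary_savings - extra_acquisition
print(round(net_saving))  # 6251 -- matches the reported £6,251 saving
```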
Ogden, Daniel M., Jr.
1978-01-01
Suggests that the most practical budgeting system for most managers is a formalized combination of incremental and zero-based analysis because little can be learned about most programs from an annual zero-based budget. (Author/IRT)
Montgomery County of Maryland — This dataset includes the Fiscal Year 2015 Council-approved operating budget for Montgomery County. The dataset does not include revenues and detailed agency budget...
Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions
Directory of Open Access Journals (Sweden)
Camaren Peter
2014-03-01
Full Text Available In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability, and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex-systems properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach to modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address the particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach to modeling sustainability transitions that caters for the broad range of complex-systems properties required to model transitions to sustainability.
Excellence in Physics Education Award: Modeling Theory for Physics Instruction
Hestenes, David
2014-03-01
All humans create mental models to plan and guide their interactions with the physical world. Science has greatly refined and extended this ability by creating and validating formal scientific models of physical things and processes. Research in physics education has found that mental models created from everyday experience are largely incompatible with scientific models. This suggests that the fundamental problem in learning and understanding science is coordinating mental models with scientific models. Modeling Theory has drawn on resources of cognitive science to work out extensive implications of this suggestion and guide development of an approach to science pedagogy and curriculum design called Modeling Instruction. Modeling Instruction has been widely applied to high school physics and, more recently, to chemistry and biology, with noteworthy results.
A Model of Statistics Performance Based on Achievement Goal Theory.
Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.
2003-01-01
Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…
Anisotropic cosmological models and generalized scalar tensor theory
Indian Academy of Sciences (India)
Abstract. In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions ...
Anisotropic cosmological models and generalized scalar tensor theory
Indian Academy of Sciences (India)
In this paper generalized scalar tensor theory has been considered in the background of anisotropic cosmological models, namely, axially symmetric Bianchi-I, Bianchi-III and Kantowski–Sachs space-time. For bulk viscous fluid, both exponential and power-law solutions have been studied and some assumptions among the ...
Two-dimensional models in statistical mechanics and field theory
International Nuclear Information System (INIS)
Koberle, R.
1980-01-01
Several features of two-dimensional models in statistical mechanics and field theory, such as lattice quantum chromodynamics, Z(N), Gross-Neveu and CP N-1, are discussed. The problems of confinement and dynamical mass generation are also analyzed. (L.C.) [pt
The early years of string theory: The dual resonance model
International Nuclear Information System (INIS)
Ramond, P.
1987-10-01
This paper reviews the early quantum mechanical history of the dual resonance model, which is an early string theory. The content of this paper is as follows: historical review, the Veneziano amplitude, the operator formalism, the ghost story, and the string story.
Interacting bosons model and relation with BCS theory
International Nuclear Information System (INIS)
Diniz, R.
1990-01-01
The Nambu mechanism for BCS theory is extended with the inclusion of quadrupole pairing in addition to the usual monopole pairing. An effective Hamiltonian is constructed and its relation to the IBM is discussed. The difficulties encountered and a possible generalization of this model are discussed. (author)
Symmetry-guided large-scale shell-model theory
Czech Academy of Sciences Publication Activity Database
Launey, K. D.; Dytrych, Tomáš; Draayer, J. P.
2016-01-01
Roč. 89, JUL (2016), s. 101-136 ISSN 0146-6410 R&D Projects: GA ČR GA16-16772S Institutional support: RVO:61389005 Keywords : Ab initio shell-model theory * Symplectic symmetry * Collectivity * Clusters * Hoyle state * Orderly patterns in nuclei from first principles Subject RIV: BE - Theoretical Physics Impact factor: 11.229, year: 2016
The Five-Factor Model and Self-Determination Theory
DEFF Research Database (Denmark)
Olesen, Martin Hammershøj; Thomsen, Dorthe Kirkegaard; Schnieber, Anette
This study investigates conceptual overlap vs. distinction between individual differences in personality traits, i.e. the Five-Factor Model; and Self-determination Theory, i.e. general causality orientations. Twelve-hundred-and-eighty-seven freshmen (mean age 21.71; 64% women) completed electronic...
A Proposed Model of Jazz Theory Knowledge Acquisition
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
S matrix theory of the massive Thirring model
International Nuclear Information System (INIS)
Berg, B.
1980-01-01
The S matrix theory of the massive Thirring model, describing the exact quantum scattering of solitons and their boundstates, is reviewed. Treated are: factorization equations and their solution, boundstates, generalized Jost functions and Levinson's theorem, scattering of boundstates, 'virtual' and anomalous thresholds. (orig.)
Using SAS PROC MCMC for Item Response Theory Models
Ames, Allison J.; Samonte, Kelli
2015-01-01
Interest in using Bayesian methods for estimating item response theory models has grown at a remarkable rate in recent years. This attentiveness to Bayesian estimation has also inspired a growth in available software such as WinBUGS, R packages, BMIRT, MPLUS, and SAS PROC MCMC. This article intends to provide an accessible overview of Bayesian…
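The item response theory models estimated in software such as SAS PROC MCMC build on simple logistic item models. As a concrete anchor, here is a minimal sketch of the two-parameter logistic (2PL) model in Python (illustrative only; not the SAS implementation):

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of a correct response
    for ability theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# When ability equals difficulty (theta == b), the probability is exactly 0.5,
# regardless of the discrimination parameter.
print(p_correct_2pl(theta=0.0, a=1.2, b=0.0))  # 0.5
```

Bayesian estimation then places priors on `a`, `b`, and `theta` and samples their posterior given the observed response matrix.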
Multilevel Higher-Order Item Response Theory Models
Huang, Hung-Yu; Wang, Wen-Chung
2014-01-01
In the social sciences, latent traits often have a hierarchical structure, and data can be sampled from multiple levels. Both hierarchical latent traits and multilevel data can occur simultaneously. In this study, we developed a general class of item response theory models to accommodate both hierarchical latent traits and multilevel data. The…
Item Response Theory Models for Performance Decline during Testing
Jin, Kuan-Yu; Wang, Wen-Chung
2014-01-01
Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…
Item Response Theory Modeling of the Philadelphia Naming Test
Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D.
2015-01-01
Purpose: In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating…
An NCME Instructional Module on Polytomous Item Response Theory Models
Penfield, Randall David
2014-01-01
A polytomous item is one for which the responses are scored according to three or more categories. Given the increasing use of polytomous items in assessment practices, item response theory (IRT) models specialized for polytomous items are becoming increasingly common. The purpose of this ITEMS module is to provide an accessible overview of…
Profiles in Leadership: Enhancing Learning through Model and Theory Building.
Mello, Jeffrey A.
2003-01-01
A class assignment was designed to present factors affecting leadership dynamics, allow practice in model and theory building, and examine leadership from multicultural perspectives. Students developed a profile of a fictional or real leader and analyzed qualities, motivations, context, and effectiveness in written and oral presentations.…
Compositional models and conditional independence in evidence theory
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim; Vejnarová, Jiřina
2011-01-01
Roč. 52, č. 3 (2011), s. 316-334 ISSN 0888-613X Institutional research plan: CEZ:AV0Z10750506 Keywords : Evidence theory * Conditional independence * multidimensional models Subject RIV: BA - General Mathematics Impact factor: 1.948, year: 2011 http://library.utia.cas.cz/separaty/2012/MTR/jirousek-0370515.pdf
Evaluating hydrological model performance using information theory-based metrics
The accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
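One family of information theory-based metrics can be sketched as follows; the specific choice here (Kullback-Leibler divergence between binned flow distributions) is an illustrative assumption, not necessarily the metric used in this work:

```python
import numpy as np

def kl_divergence(observed, simulated, bins=10):
    """Discretize two streamflow series onto a common set of bins and return
    the Kullback-Leibler divergence D(P_obs || P_sim) in nats (0 means the
    two histograms are identical). A small epsilon guards empty bins."""
    eps = 1e-12
    edges = np.histogram_bin_edges(np.concatenate([observed, simulated]), bins=bins)
    p, _ = np.histogram(observed, bins=edges)
    q, _ = np.histogram(simulated, bins=edges)
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))
```

Unlike a squared-error score, this compares the shapes of the flow distributions, so it can flag a model that matches the mean but misses the flow regime.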
Stochastic models in risk theory and management accounting
Brekelmans, R.C.M.
2000-01-01
This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
Conformal field theories, Coulomb gas picture and integrable models
International Nuclear Information System (INIS)
Zuber, J.B.
1988-01-01
The aim of the study is to present the links between some results of conformal field theory, the conventional Coulomb gas picture in statistical mechanics and the approach of integrable models. It is shown that families of conformal theories, related by the coset construction to the SU(2) Kac-Moody algebra, may be regarded as obtained from some free field, and modified by the coupling of its winding numbers to floating charges. This representation reflects the procedure of restriction of the corresponding integrable lattice models. The work may be generalized to models based on the coset construction with higher rank algebras. The corresponding integrable models are identified. In the conformal field description, generalized parafermions appear, and are coupled to free fields living on a higher-dimensional torus. The analysis is not as exhaustive as in the SU(2) case: not all the various restrictions have been identified, nor have the modular invariants been completely classified
Route Choice Model Based on Game Theory for Commuters
Directory of Open Access Journals (Sweden)
Licai Yang
2016-06-01
Full Text Available The traffic behaviours of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers, but due to a lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in actual situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by solving the route choice game for a Nash Equilibrium. Simulations and experimental analysis show that the proposed model can describe the commuters' routine route choice decision exactly and that the provided route is reliable.
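The abstract does not fully specify the game, so as a generic sketch: a pure-strategy Nash equilibrium of a small two-player game can be found by best-response enumeration. The payoff matrices below are invented for illustration (e.g., utilities derived from predicted travel times on two alternative routes):

```python
import numpy as np

def pure_nash_equilibria(payoff_a, payoff_b):
    """Enumerate pure-strategy Nash equilibria of a two-player game.
    Rows index player A's strategies, columns index player B's.
    A cell (i, j) is an equilibrium if neither player can gain by
    unilaterally deviating."""
    A, B = np.asarray(payoff_a), np.asarray(payoff_b)
    equilibria = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                equilibria.append((i, j))
    return equilibria

# Hypothetical 2x2 route-choice game (payoffs are made-up utilities).
A = [[3, 1], [2, 2]]
B = [[3, 2], [1, 2]]
print(pure_nash_equilibria(A, B))  # [(0, 0), (1, 1)]
```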
A budget-impact and cost-effectiveness model for second-line treatment of major depression.
Malone, Daniel C
2007-07-01
Depressed patients who initially fail to achieve remission when placed on a selective serotonin reuptake inhibitor (SSRI) may require a second treatment. The purpose of this study was to evaluate the effectiveness, cost, cost-effectiveness, and budget impact of second-line pharmacologic treatment for major depressive disorder (MDD). A cost-effectiveness analysis was conducted to evaluate second-line therapies (citalopram, escitalopram, fluoxetine, paroxetine, paroxetine controlled release [CR], sertraline, and venlafaxine extended release [XR]) for the treatment of depression. Effectiveness data were obtained from published clinical studies. The primary outcome was remission, defined as a score of 7 or less on the Hamilton Rating Scale for Depression (HAM-D) or a score of 10 or less on the Montgomery-Asberg Depression Rating Scale (MADRS). The wholesale acquisition cost (WAC) for medications and medical treatment costs for depression were included. The perspective was that of a managed care organization (MCO) with 500,000 members, a 1.9% annual incidence of depression, and a treatment duration of 6 months. Assumptions included: second-line treatment is not as effective as first-line treatment, WAC price reflects MCO costs, and side effects were identical. Sensitivity analyses were conducted to determine variables that influenced the results. Second-line remission rates were 20.4% for venlafaxine XR, 16.9% for sertraline, 16.4% for escitalopram, 15.1% for generic SSRIs (weighted average), and 13.6% for paroxetine CR. Pharmacy costs ranged from $163 for generic SSRIs to $319 for venlafaxine XR. Total cost per patient achieving remission was $14,275 for venlafaxine XR, followed by $16,100 for escitalopram. The incremental cost-effectiveness ratio (ICER) for venlafaxine XR compared with generic SSRIs was $2,073 per patient achieving remission, followed by escitalopram with an ICER of $3,566. The model was most sensitive to other therapies
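The ICER reported above follows the standard definition: incremental cost divided by incremental effect. The example below uses only the pharmacy costs and remission rates quoted in the abstract, so it does not reproduce the study's full ICER, which also folds in medical treatment costs:

```python
def icer(cost_new, cost_comp, effect_new, effect_comp):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (here, per additional patient achieving remission)."""
    return (cost_new - cost_comp) / (effect_new - effect_comp)

# Pharmacy costs and remission rates quoted in the abstract
# (venlafaxine XR vs. generic SSRIs); medical costs are omitted,
# so this differs from the study's reported $2,073.
print(round(icer(cost_new=319, cost_comp=163, effect_new=0.204, effect_comp=0.151)))
```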
Modeling Composite Assessment Data Using Item Response Theory
Ueckert, Sebastian
2018-01-01
Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
Constitutive relationships and models in continuum theories of multiphase flows
International Nuclear Information System (INIS)
Decker, R.
1989-09-01
In April 1989, a workshop on constitutive relationships and models in continuum theories of multiphase flows was held at NASA's Marshall Space Flight Center. Topics include constitutive relationships for the partial or per-phase stresses, including the concept of solid-phase pressure. Models used for the exchange of mass, momentum, and energy between the phases in a multiphase flow are also discussed. The program, abstracts, and texts of the presentations from the workshop are included
Perturbation theory around the Wess-Zumino-Witten model
International Nuclear Information System (INIS)
Hasseln, H. v.
1991-05-01
We consider a perturbation of the Wess-Zumino-Witten model in 2D by a current-current interaction. The β-function is computed to third order in the coupling constant and a nontrivial fixed point is found. By non-abelian bosonization, this perturbed WZW-model is shown to have the same β-function (at least to order g²) as the fermionic theory with a four-fermion interaction. (orig.) [de
DEFF Research Database (Denmark)
Rohde, Carsten
Budgets and budget control have been known since the early 19th century. However, until the beginning of the 1920s in the US, the use of budget control was primarily related to governmental units and states and only to a minor extent to business units in practice. At that time James McKinsey describes...
J.L.T. Blank; E. Eggink
1998-01-01
Original title: Tussen bed en budget. The report Between bedside and budget (Tussen bed en budget) describes an extensive empirical study of the efficiency of general and university hospitals in the Netherlands. A policy summary recaps the main findings of the study. Those findings
Warner, Alice Sizer
1993-01-01
Discusses the advantages and disadvantages of six types of budgets commonly used by many different kinds of libraries. The budget types covered are lump-sum; formula; line or line-item; program; performance or function; and zero-based. Accompanying figures demonstrate the differences between four of the budget types. (three references) (KRN)
A general-model-space diagrammatic perturbation theory
International Nuclear Information System (INIS)
Hose, G.; Kaldor, U.
1980-01-01
A diagrammatic many-body perturbation theory applicable to arbitrary model spaces is presented. The necessity of having a complete model space (all possible occupancies of the partially-filled shells) is avoided. This requirement may be troublesome for systems with several well-spaced open shells, such as most atomic and molecular excited states, as a complete model space spans a very broad energy range and leaves out states within that range, leading to poor or no convergence of the perturbation series. The method presented here would be particularly useful for such states. The solution of a model problem (He₂ excited Σ⁺_g states) is demonstrated. (Auth.)
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
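The combination of structured knowledge with statistical inference described above can be illustrated with a toy number-concept example in the spirit of this framework, using the "size principle" likelihood (each observed example is sampled uniformly from the true concept). The hypothesis space and priors below are invented for illustration:

```python
# Toy theory-based Bayesian induction: structured hypotheses (sets of numbers)
# scored by Bayes' rule with a size-principle likelihood.

hypotheses = {
    "powers of two": {1, 2, 4, 8, 16, 32, 64},
    "even numbers": set(range(2, 101, 2)),
}
prior = {name: 0.5 for name in hypotheses}  # assumed uniform prior

def posterior(data, hypotheses, prior):
    post = {}
    for name, h in hypotheses.items():
        if all(x in h for x in data):
            # Size principle: smaller consistent hypotheses get higher likelihood.
            post[name] = prior[name] * (1.0 / len(h)) ** len(data)
        else:
            post[name] = 0.0
    z = sum(post.values())
    return {name: p / z for name, p in post.items()}

# Seeing 16, 8, 2 strongly favors "powers of two" over "even numbers",
# even though both hypotheses are consistent with the data.
print(posterior([16, 8, 2], hypotheses, prior))
```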
Fluid analog model for boundary effects in field theory
International Nuclear Information System (INIS)
Ford, L. H.; Svaiter, N. F.
2009-01-01
Quantum fluctuations in the density of a fluid with a linear phonon dispersion relation are studied. In particular, we treat the changes in these fluctuations due to nonclassical states of phonons and to the presence of boundaries. These effects are analogous to similar effects in relativistic quantum field theory, and we argue that the case of the fluid is a useful analog model for effects in field theory. We further argue that the changes in the mean squared density are, in principle, observable by light scattering experiments.
Chern-Simons Theory, Matrix Models, and Topological Strings
International Nuclear Information System (INIS)
Walcher, J
2006-01-01
This book is a find. Marino meets the challenge of filling in less than 200 pages the need for an accessible review of topological gauge/gravity duality. He is one of the pioneers of the subject and a clear expositor. It is no surprise that reading this book is a great pleasure. The existence of dualities between gauge theories and theories of gravity remains one of the most surprising recent discoveries in mathematical physics. While it is probably fair to say that we do not yet understand the full reach of such a relation, the impressive amount of evidence that has accumulated over the past years can be regarded as a substitute for a proof, and will certainly help to delineate the question of what is the most fundamental quantum mechanical theory. Here is a brief summary of the book. The journey begins with matrix models and an introduction to various techniques for the computation of integrals including perturbative expansion, large-N approximation, saddle point analysis, and the method of orthogonal polynomials. The second chapter, on Chern-Simons theory, is the longest and probably the most complete one in the book. Starting from the action we meet Wilson loop observables, the associated perturbative 3-manifold invariants, Witten's exact solution via the canonical duality to WZW models, the framing ambiguity, as well as a collection of results on knot invariants that can be derived from Chern-Simons theory and the combinatorics of U(∞) representation theory. The chapter also contains a careful derivation of the large-N expansion of the Chern-Simons partition function, which forms the cornerstone of its interpretation as a closed string theory. Finally, we learn that Chern-Simons theory can sometimes also be represented as a matrix model. The story then turns to the gravity side, with an introduction to topological sigma models (chapter 3) and topological string theory (chapter 4). While this presentation is necessarily rather condensed (and the beginner may
Finite-size scaling theory and quantum Hamiltonian field theory: the transverse Ising model
International Nuclear Information System (INIS)
Hamer, C.J.; Barber, M.N.
1979-01-01
Exact results for the mass gap, specific heat and susceptibility of the one-dimensional transverse Ising model on a finite lattice are generated by constructing a finite matrix representation of the Hamiltonian using strong-coupling eigenstates. The critical behaviour of the limiting infinite chain is analysed using finite-size scaling theory. In this way, excellent estimates (to within 1/2% accuracy) are found for the critical coupling and the exponents α, ν and γ
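The finite-lattice construction described above can be reproduced numerically for small chains by building the transverse-field Ising Hamiltonian as an explicit matrix and diagonalizing it. This sketch uses a plain tensor-product matrix representation rather than the paper's strong-coupling eigenstate basis, and the normalization convention (unit coupling, field strength g) is an assumption:

```python
import numpy as np

def tfim_gap(n_sites, g):
    """Mass gap (first excitation energy) of the open transverse-field Ising
    chain H = -sum_i sx_i sx_{i+1} - g * sum_i sz_i, by exact diagonalization.
    Feasible only for small n_sites: the matrix is 2^n x 2^n."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    eye = np.eye(2)

    def site_op(op, i):
        # Tensor product placing `op` at site i and identities elsewhere.
        out = op if i == 0 else eye
        for k in range(1, n_sites):
            out = np.kron(out, op if k == i else eye)
        return out

    H = np.zeros((2 ** n_sites, 2 ** n_sites))
    for i in range(n_sites - 1):
        H -= site_op(sx, i) @ site_op(sx, i + 1)
    for i in range(n_sites):
        H -= g * site_op(sz, i)
    evals = np.linalg.eigvalsh(H)
    return evals[1] - evals[0]

# Deep in the disordered phase (g = 2) the gap stays open; near the critical
# coupling (g = 1) the finite-chain gap shrinks, as finite-size scaling predicts.
print(tfim_gap(8, g=2.0), tfim_gap(8, g=1.0))
```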
A General Framework for Portfolio Theory. Part I: theory and various models
Maier-Paape, Stanislaus; Zhu, Qiji Jim
2017-01-01
Utility and risk are two often competing measurements of investment success. We show that the efficient trade-off between these two measurements for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz [H. Markowitz, Portfolio Selection, 1959] and its natural generalization, the capital market pricing model [W. F. Sharpe, Mutual fund performance, 1966], are spe...
Energy Technology Data Exchange (ETDEWEB)
Dritselis, Chris D, E-mail: dritseli@mie.uth.gr [Mechanical Engineering Department, University of Thessaly, Pedion Areos, 38334 Volos (Greece)
2017-04-15
In the first part of this study (Dritselis 2016 Fluid Dyn. Res. 48 015507), the Reynolds stress budgets were evaluated through point-particle direct numerical simulations (pp-DNSs) for the particle-laden turbulent flow in a vertical channel with two- and four-way coupling effects. Here several turbulence models are assessed by direct comparison of the particle contribution terms to the budgets, the dissipation rate, the pressure-strain rate, and the transport rate with the model expressions using the pp-DNS data. It is found that the models of the particle sources to the equations of fluid turbulent kinetic energy and dissipation rate cannot represent correctly the physics of the complex interaction between turbulence and particles. A relatively poor performance of the pressure-strain term models is revealed in the particulate flows, while the algebraic models for the dissipation rate of the fluid turbulence kinetic energy and the transport rate terms can adequately reproduce the main trends due to the presence of particles. Further work is generally needed to improve the models in order to account properly for the momentum exchange between the two phases and the effects of particle inertia, gravity and inter-particle collisions. (paper)
International Nuclear Information System (INIS)
Dritselis, Chris D
2017-01-01
In the first part of this study (Dritselis 2016 Fluid Dyn. Res. 48 015507), the Reynolds stress budgets were evaluated through point-particle direct numerical simulations (pp-DNSs) for the particle-laden turbulent flow in a vertical channel with two- and four-way coupling effects. Here several turbulence models are assessed by direct comparison of the particle contribution terms to the budgets, the dissipation rate, the pressure-strain rate, and the transport rate with the model expressions using the pp-DNS data. It is found that the models of the particle sources to the equations of fluid turbulent kinetic energy and dissipation rate cannot represent correctly the physics of the complex interaction between turbulence and particles. A relatively poor performance of the pressure-strain term models is revealed in the particulate flows, while the algebraic models for the dissipation rate of the fluid turbulence kinetic energy and the transport rate terms can adequately reproduce the main trends due to the presence of particles. Further work is generally needed to improve the models in order to account properly for the momentum exchange between the two phases and the effects of particle inertia, gravity and inter-particle collisions. (paper)
Blyth, E.; Martinez-de la Torre, A.; Ellis, R.; Robinson, E.
2017-12-01
The fresh-water budget of the Arctic region has a diverse range of impacts: the ecosystems of the region, ocean circulation response to Arctic freshwater, methane emissions through changing wetland extent, as well as the available fresh water for human consumption. But many processes control the budget, including seasonal snow packs building and thawing, freezing soils and permafrost, extensive organic soils, and large wetland systems. All these processes interact to create a complex hydrological system. In this study we examine a suite of 10 models that bring all those processes together in a 25-year reanalysis of the global water budget, and we assess their performance in the Arctic region. There are two approaches to modelling fresh-water flows at large scales, referred to here as 'Hydrological' and 'Land Surface' models. While both approaches include a physically based model of the water stores and fluxes, the Land Surface models link the water flows to an energy-based model for processes such as snow melt and soil freezing. This study analyses the impact of that basic difference on the regional patterns of evapotranspiration, runoff generation and terrestrial water storage. For evapotranspiration, the Hydrological models tend to have a bigger spatial range in the model bias (difference to observations), implying greater errors compared to the Land Surface models. For instance, some regions such as Eastern Siberia have consistently lower evaporation in the Hydrological models than in the Land Surface models. For runoff, however, the results are the other way round, with a slightly higher spatial range in bias for the Land Surface models, implying greater errors than the Hydrological models. A simple reading would suggest that Hydrological models are designed to get the runoff right, while Land Surface models are designed to get the evapotranspiration right. Tracing the source of the difference suggests that the difference comes from the treatment
List, J. H.; Safak, I.; Warner, J. C.; Schwab, W. C.; Hapke, C. J.; Lentz, E. E.
2016-02-01
The processes responsible for long-term (decadal) shoreline change and the related imbalance in the sediment budget on Fire Island, a 50 km long barrier island on the south coast of Long Island, NY, have been the subject of debate. The estimated net rate of sediment leaving the barrier at the west end of the island is approximately double the estimated net rate of sediment entering in the east, but the island-wide average sediment volume change associated with shoreline change is near zero and cannot account for this deficit. A long-held hypothesis is that onshore sediment flux from the inner continental shelf within the western half of the island is responsible for balancing the sediment budget. To investigate this possibility, we use a nested, 3-D, hydrodynamics-based modeling system (COAWST) to simulate the island-wide alongshore and cross-shore transport, in combination with shoreline change observations. The modeled net alongshore transport gradients in the nearshore predict that the central part of Fire Island should be erosional, yet shoreline change observations show this area to be accretionary. We compare the model-predicted alongshore transport gradients with the flux gradients that would be required to generate the observed shoreline change, to give the pattern of sediment volume gains or losses that cannot be explained by the modeled alongshore transport gradients. Results show that the western 30 km of coast requires an input of sediment, supporting the hypothesis of onshore flux in this area. The modeled cross-shore flux of sediment between the shoreface and inner shelf is consistent with these results, with onshore-directed bottom currents creating an environment more conducive to onshore sediment flux in the western 30 km of the island compared to the eastern 20 km. We conclude that the cross-shore flux of sediment can explain the shoreline change observations, and is an integral component of Fire Island's sediment budget.
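The budget-closure step above (comparing modeled alongshore transport gradients with the flux gradients required by observed shoreline change, and attributing the residual to cross-shore flux) can be sketched with invented numbers; none of the values below are Fire Island data:

```python
import numpy as np

# Sediment continuity along the coast: dV/dt = -dQ/dx + q_cross,
# so the cross-shore flux needed to close the budget is the observed
# volume change minus the change explained by alongshore gradients.
# All arrays are made-up illustrative values (m^3/m/yr), west to east.

modeled_gradient = np.array([-2.0, -1.0, 0.5, 1.5])   # dQ_alongshore/dx from the model
observed_change = np.array([2.5, 1.5, 0.0, -1.5])      # dV/dt from shoreline observations

explained_change = -modeled_gradient                    # change the model can account for
onshore_flux_needed = observed_change - explained_change
print(onshore_flux_needed)  # positive entries: cells needing a sediment input
```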
7 CFR 3402.14 - Budget and budget narrative.
2010-01-01
Title 7 (Agriculture), Graduate and Postgraduate Fellowship Grants Program, Preparation of an Application, § 3402.14 Budget and budget narrative: Applicants must prepare the Budget, Form CSREES-2004, and a budget narrative...
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
Sigma model approach to the heterotic string theory
International Nuclear Information System (INIS)
Sen, A.
1985-09-01
Relation between the equations of motion for the massless fields in the heterotic string theory, and the conformal invariance of the sigma model describing the propagation of the heterotic string in arbitrary background massless fields is discussed. It is emphasized that this sigma model contains complete information about the string theory. Finally, we discuss the extension of the Hull-Witten proof of local gauge and Lorentz invariance of the sigma-model to higher order in α', and the modification of the transformation laws of the antisymmetric tensor field under these symmetries. Presence of anomaly in the naive N = 1/2 supersymmetry transformation is also pointed out in this context. 12 refs
Integrable lambda models and Chern-Simons theories
International Nuclear Information System (INIS)
Schmidtt, David M.
2017-01-01
In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.
Models for probability and statistical inference theory and applications
Stapleton, James H
2007-01-01
This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...
Integrable lambda models and Chern-Simons theories
Energy Technology Data Exchange (ETDEWEB)
Schmidtt, David M. [Departamento de Física, Universidade Federal de São Carlos,Caixa Postal 676, CEP 13565-905, São Carlos-SP (Brazil)
2017-05-03
In this note we reveal a connection between the phase space of lambda models on S¹ × ℝ and the phase space of double Chern-Simons theories on D × ℝ and explain in the process the origin of the non-ultralocality of the Maillet bracket, which emerges as a boundary algebra. In particular, this means that the (classical) AdS₅ × S⁵ lambda model can be understood as a double Chern-Simons theory defined on the Lie superalgebra psu(2,2|4) after a proper dependence of the spectral parameter is introduced. This offers a possibility for avoiding the use of the problematic non-ultralocal Poisson algebras that preclude the introduction of lattice regularizations and the application of the QISM to string sigma models. The utility of the equivalence at the quantum level is, however, still to be explored.
Classical nucleation theory in the phase-field crystal model.
Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas
2018-04-01
A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
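The CNT predictions that such simulations are compared against reduce to two formulas: the critical barrier ΔG* = 16πγ³/(3Δg²) and the steady-state rate J = J₀ exp(−ΔG*/kT). A minimal sketch (symbols and units are generic; J₀ is a kinetic prefactor that CNT leaves largely unconstrained):

```python
import math

def cnt_barrier(gamma, delta_g):
    """Critical free-energy barrier for homogeneous nucleation,
    ΔG* = 16πγ³ / (3 Δg²), where γ is the solid-liquid interfacial energy
    and Δg the bulk free-energy difference per unit volume (driving force)."""
    return 16.0 * math.pi * gamma**3 / (3.0 * delta_g**2)

def cnt_rate(j0, gamma, delta_g, kT):
    """Steady-state nucleation rate J = J0 * exp(-ΔG*/kT)."""
    return j0 * math.exp(-cnt_barrier(gamma, delta_g) / kT)
```

A larger driving force Δg or a smaller interfacial energy γ lowers the barrier and raises the rate exponentially, which is the qualitative behavior a simulation can be checked against.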
Matrix models and stochastic growth in Donaldson-Thomas theory
Energy Technology Data Exchange (ETDEWEB)
Szabo, Richard J. [Department of Mathematics, Heriot-Watt University, Colin Maclaurin Building, Riccarton, Edinburgh EH14 4AS, United Kingdom and Maxwell Institute for Mathematical Sciences, Edinburgh (United Kingdom); Tierz, Miguel [Grupo de Fisica Matematica, Complexo Interdisciplinar da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, PT-1649-003 Lisboa (Portugal); Departamento de Analisis Matematico, Facultad de Ciencias Matematicas, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain)
2012-10-15
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Matrix models and stochastic growth in Donaldson-Thomas theory
International Nuclear Information System (INIS)
Szabo, Richard J.; Tierz, Miguel
2012-01-01
We show that the partition functions which enumerate Donaldson-Thomas invariants of local toric Calabi-Yau threefolds without compact divisors can be expressed in terms of specializations of the Schur measure. We also discuss the relevance of the Hall-Littlewood and Jack measures in the context of BPS state counting and study the partition functions at arbitrary points of the Kähler moduli space. This rewriting in terms of symmetric functions leads to a unitary one-matrix model representation for Donaldson-Thomas theory. We describe explicitly how this result is related to the unitary matrix model description of Chern-Simons gauge theory. This representation is used to show that the generating functions for Donaldson-Thomas invariants are related to tau-functions of the integrable Toda and Toeplitz lattice hierarchies. The matrix model also leads to an interpretation of Donaldson-Thomas theory in terms of non-intersecting paths in the lock-step model of vicious walkers. We further show that these generating functions can be interpreted as normalization constants of a corner growth/last-passage stochastic model.
Forewarning model for water pollution risk based on Bayes theory.
Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis
2014-02-01
In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the totals. The forewarning level is judged by the maximum-probability rule and then combined with local conditions to propose management strategies that reduce heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. The model is shown to be rigorous in theory yet flexible in method, and reasonable in result yet simple in structure, with strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
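The Bayes update and maximum-probability rule that the abstract describes can be sketched as a minimal discrete example (hypothetical, not the authors' implementation; the level names and probabilities are made up):

```python
# Discrete Bayes update over forewarning levels, then the
# maximum-probability rule to pick the warning to issue.

def posterior(prior, likelihood):
    """prior[i] = P(level i); likelihood[i] = P(observed index | level i).
    Returns the normalized posterior P(level i | observed index)."""
    joint = [p * l for p, l in zip(prior, likelihood)]
    total = sum(joint)
    return [j / total for j in joint]

def warning_level(prior, likelihood, levels):
    """Judge the forewarning level by the maximum-probability rule."""
    post = posterior(prior, likelihood)
    return levels[max(range(len(post)), key=post.__getitem__)]
```

With a flat prior over three levels and a likelihood that favors the worst one, the rule issues the severe warning; an informative prior from historical data would shift the posterior accordingly.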
Soliton excitations in polyacetylene and relativistic field theory models
International Nuclear Information System (INIS)
Campbell, D.K.; Bishop, A.R.; Los Alamos Scientific Lab., NM
1982-01-01
A continuum model of a Peierls-dimerized chain, as described generally by Brazovskii and discussed for the case of polyacetylene by Takayama, Lin-Liu and Maki (TLM), is considered. The continuum (Bogoliubov-de Gennes) equations arising in this model of interacting electrons and phonons are shown to be equivalent to the static, semiclassical equations for a solvable model field theory of self-coupled fermions - the N = 2 Gross-Neveu model. Based on this equivalence we note the existence of soliton defect states in polyacetylene that are additional to, and qualitatively different from, the amplitude kinks commonly discussed. The new solutions do not have the topological stability of kinks but are essentially conventional strong-coupling polarons in the dimerized chain. They carry spin (1/2) and charge (±e). In addition, we discuss further areas in which known field theory results may apply to a Peierls-dimerized chain, including relations between phenomenological Φ⁴ and continuum electron-phonon models, and the structure of the fully quantum versus mean field theories. (orig.)
Classical nucleation theory in the phase-field crystal model
Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas
2018-04-01
A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
Richon, Camille; Dutay, Jean-Claude; Dulac, François; Desboeufs, Karine; Nabat, Pierre; Guieu, Cécile; Aumont, Olivier; Palmieri, Julien
2016-04-01
Atmospheric deposition is at present not included in regional oceanic biogeochemical models of the Mediterranean Sea, whereas, along with river inputs, it represents a significant source of nutrients at the basin scale, especially through intense desert dust events. Moreover, observations (e.g. the DUNE campaign, Guieu et al. 2010) show that these events significantly modify the biogeochemistry of the oligotrophic Mediterranean Sea. We use a high resolution (1/12°) version of the 3D coupled model NEMOMED12/PISCES to investigate the effects of high resolution atmospheric dust deposition forcings on the biogeochemistry of the Mediterranean basin. The biogeochemical model PISCES represents the evolution of 24 prognostic tracers including five nutrients (nitrate, ammonium, phosphate, silicate and iron) and two phytoplankton and two zooplankton groups (Palmiéri, 2014). From decadal simulations (1982-2012) we evaluate the influence of natural dust and anthropogenic nitrogen deposition on the budget of nutrients in the basin and its impact on the biogeochemistry (primary production, plankton distributions...). Our results show that natural dust deposition accounts for 15% of the global PO4 budget and that it primarily influences the southern part of the basin. Anthropogenic nitrogen accounts for 50% of the bioavailable N supply for the northern part. Deposition events significantly affect biological production; primary productivity enhancement can be as high as 30% in the areas of high deposition, especially during the stratified period. Further developments of the model will include 0D and 1D modeling of bacteria in the frame of the PEACETIME project.
Theory and theory-based models for the pedestal, edge stability and ELMs in tokamaks
International Nuclear Information System (INIS)
Guzdar, P.N.; Mahajan, S.M.; Yoshida, Z.; Dorland, W.; Rogers, B.N.; Bateman, G.; Kritz, A.H.; Pankin, A.; Voitsekhovitch, I.; Onjun, T.; Snyder, S.
2005-01-01
Theories for equilibrium and stability of H-modes, and models for use within integrated modeling codes with the objective of predicting the height, width and shape of the pedestal at the edge of H-mode plasmas in tokamaks, as well as the onset and frequency of Edge Localized Modes (ELMs), are developed. A theory model for relaxed plasma states with flow, which uses two-fluid Hall-MHD equations, predicts that the natural scale length of the pedestal is the ion skin depth and the pedestal width is larger than the ion poloidal gyro-radius, in agreement with experimental observations. Computations with the GS2 code are used to identify micro-instabilities, such as electron drift waves, that survive the strong flow shear, diamagnetic flows, and magnetic shear that are characteristic of the pedestal. Other instabilities on the pedestal and gyro-radius scale, such as the Kelvin-Helmholtz instability, are also investigated. Time-dependent integrated modeling simulations are used to follow the transition from L-mode to H-mode and the subsequent evolution of ELMs as the heating power is increased. The flow shear stabilization that produces the transport barrier at the edge of the plasma reduces different modes of anomalous transport and, consequently, different channels of transport at different rates. ELM crashes are triggered in the model by pressure-driven ballooning modes or by current-driven peeling modes. (author)
Thomas, Yoann; Mazurié, Joseph; Alunno-Bruscia, Marianne; Bacher, Cédric; Bouget, Jean-François; Gohin, Francis; Pouvreau, Stéphane; Struski, Caroline
2011-11-01
In order to assess the potential of various marine ecosystems for shellfish aquaculture and to evaluate their carrying capacities, there is a need to clarify the response of exploited species to environmental variations using robust ecophysiological models and available environmental data. For a large range of applications and comparison purposes, a non-specific approach based on 'generic' individual growth models offers many advantages. In this context, we simulated the response of blue mussel ( Mytilus edulis L.) to the spatio-temporal fluctuations of the environment in Mont Saint-Michel Bay (North Brittany) by forcing a generic growth model based on Dynamic Energy Budgets with satellite-derived environmental data (i.e. temperature and food). After a calibration step based on data from mussel growth surveys, the model was applied over nine years on a large area covering the entire bay. These simulations provide an evaluation of the spatio-temporal variability in mussel growth and also show the ability of the DEB model to integrate satellite-derived data and to predict spatial and temporal growth variability of mussels. Observed seasonal, inter-annual and spatial growth variations are well simulated. The large-scale application highlights the strong link between food and mussel growth. The methodology described in this study may be considered as a suitable approach to account for environmental effects (food and temperature variations) on physiological responses (growth and reproduction) of filter feeders in varying environments. Such physiological responses may then be useful for evaluating the suitability of coastal ecosystems for shellfish aquaculture.
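A caricature of a temperature- and food-forced growth model of the kind described above can be written with two standard DEB ingredients, a scaled functional response for food and an Arrhenius temperature correction, driving a von Bertalanffy length equation. All parameter values below are made up for illustration and are not the calibrated Mytilus edulis parameters:

```python
import math

def scaled_func_response(food, half_sat):
    """Holling type II / DEB scaled functional response f = X / (X + K)."""
    return food / (food + half_sat)

def arrhenius(temp_K, T_ref, T_A):
    """DEB-style Arrhenius correction factor, equal to 1 at T_ref."""
    return math.exp(T_A / T_ref - T_A / temp_K)

def grow(length, food, temp_K, dt, r_B=0.005, L_max=8.0,
         half_sat=1.0, T_ref=293.0, T_A=5800.0):
    """One Euler step of von Bertalanffy growth at scaled food level f:
    dL/dt = r_B * c_T * (f * L_max - L). Units are arbitrary here."""
    f = scaled_func_response(food, half_sat)
    rB_T = r_B * arrhenius(temp_K, T_ref, T_A)
    return length + rB_T * (f * L_max - length) * dt
```

Stepping this equation with satellite-derived temperature and food time series at each grid point is the kind of spatial forcing exercise the abstract describes, although the real application uses the full DEB state variables rather than a single length equation.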
Miller, James R.; Russell, Gary L.; Hansen, James E. (Technical Monitor)
2001-01-01
The annual energy budget of the Arctic Ocean is characterized by a net heat loss at the air-sea interface that is balanced by oceanic heat transport into the Arctic. The energy loss at the air-sea interface is due to the combined effects of radiative, sensible, and latent heat fluxes. The inflow of heat by the ocean can be divided into two components: the transport of water masses of different temperatures between the Arctic and the Atlantic and Pacific Oceans and the export of sea ice, primarily through Fram Strait. Two 150-year simulations (1950-2099) of a global climate model are used to examine how this balance might change if atmospheric greenhouse gases (GHGs) increase. One is a control simulation for the present climate with constant 1950 atmospheric composition, and the other is a transient experiment with observed GHGs from 1950 to 1990 and 0.5% annual compounded increases of CO2 after 1990. For the present climate the model agrees well with observations of radiative fluxes at the top of the atmosphere, atmospheric advective energy transport into the Arctic, and surface air temperature. It also simulates the seasonal cycle and summer increase of cloud cover and the seasonal cycle of sea-ice cover. In addition, the changes in high-latitude surface air temperature and sea-ice cover in the GHG experiment are consistent with observed changes during the last 40 and 20 years, respectively. Relative to the control, the last 50-year period of the GHG experiment indicates that even though the net annual incident solar radiation at the surface decreases by 4.6 W m⁻² (because of greater cloud cover and increased cloud optical depth), the absorbed solar radiation increases by 2.8 W m⁻² (because of less sea ice). Increased cloud cover and warmer air also cause increased downward thermal radiation at the surface so that the net radiation into the ocean increases by 5.0 W m⁻². The annual increase in radiation into the ocean, however, is
Hypersurface Homogeneous Cosmological Model in Modified Theory of Gravitation
Katore, S. D.; Hatkar, S. P.; Baxi, R. J.
2016-12-01
We study a hypersurface homogeneous space-time in the framework of the f (R, T) theory of gravitation in the presence of a perfect fluid. Exact solutions of field equations are obtained for exponential and power law volumetric expansions. We also solve the field equations by assuming the proportionality relation between the shear scalar (σ ) and the expansion scalar (θ ). It is observed that in the exponential model, the universe approaches isotropy at large time (late universe). The investigated model is notably accelerating and expanding. The physical and geometrical properties of the investigated model are also discussed.
Categories of relations as models of quantum theory
Directory of Open Access Journals (Sweden)
Chris Heunen
2015-11-01
Categories of relations over a regular category form a family of models of quantum theory. Using regular logic, many properties of relations over sets lift to these models, including the correspondence between Frobenius structures and internal groupoids. Over compact Hausdorff spaces, this lifting gives continuous symmetric encryption. Over a regular Mal'cev category, this correspondence gives a characterization of categories of completely positive maps, enabling the formulation of quantum features. These models are closer to Hilbert spaces than relations over sets in several respects: Heisenberg uncertainty, impossibility of broadcasting, and behavedness of rank one morphisms.
Massive mu pair production in a vector field theory model
Halliday, I G
1976-01-01
Massive electrodynamics is treated as a model for the production of massive mu pairs in high-energy hadronic collisions. The dominant diagrams in perturbation theory are identified and analyzed. These graphs have an eikonal structure which leads to enormous cancellations in the two-particle inclusive cross section but not in the n-particle production cross sections. Under the assumption that these cancellations are complete, a Drell-Yan structure appears in the inclusive cross section but the particles accompanying the mu pairs have a very different structure compared to the parton model. The pionization region is no longer empty of particles as in single parton models. (10 refs).
Supersymmetric sigma models and composite Yang-Mills theory
International Nuclear Information System (INIS)
Lukierski, J.
1980-04-01
We describe two types of supersymmetric sigma models: with field values in supercoset space and with superfields. The notion of Riemannian symmetric pair (H,G/H) is generalized to supergroups. Using the supercoset approach the superconformal-invariant model of composite U(n) Yang-Mills fields is introduced. In the framework of the superfield approach we present in some detail two versions of the composite N=1 supersymmetric Yang-Mills theory in four dimensions with U(n) and U(m) x U(n) local invariance. We argue that especially the superfield sigma models can be used for the description of pre-QCD supersymmetric dynamics. (author)
Approximate models for broken clouds in stochastic radiative transfer theory
International Nuclear Information System (INIS)
Doicu, Adrian; Efremenko, Dmitry S.; Loyola, Diego; Trautmann, Thomas
2014-01-01
This paper presents approximate models in stochastic radiative transfer theory. The independent column approximation and its modified version with a solar source computed in a full three-dimensional atmosphere are formulated in a stochastic framework and for arbitrary cloud statistics. The nth-order stochastic models describing the independent column approximations are equivalent to the nth-order stochastic models for the original radiance fields in which the gradient vectors are neglected. Fast approximate models are further derived on the basis of zeroth-order stochastic models and the independent column approximation. The so-called “internal mixing” models assume a combination of the optical properties of the cloud and the clear sky, while the “external mixing” models assume a combination of the radiances corresponding to completely overcast and clear skies. A consistent treatment of internal and external mixing models is provided, and a new parameterization of the closure coefficient in the effective thickness approximation is given. An efficient computation of the closure coefficient for internal mixing models, using a previously derived vector stochastic model as a reference, is also presented. Equipped with appropriate look-up tables for the closure coefficient, these models can easily be integrated into operational trace gas retrieval systems that exploit absorption features in the near-IR solar spectrum. - Highlights: • Independent column approximation in a stochastic setting. • Fast internal and external mixing models for total and diffuse radiances. • Efficient optimization of internal mixing models to match reference models
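The internal and external mixing ideas can be illustrated with a toy one-layer example. This is a sketch under stated assumptions, not the paper's parameterization: a simple Beer-Lambert direct transmittance stands in for a full radiative transfer solution, and the closure coefficient is omitted.

```python
import math

def external_mixing(cloud_frac, rad_overcast, rad_clear):
    """'External mixing': combine the radiances of the completely overcast
    and clear skies, weighted by cloud fraction."""
    return cloud_frac * rad_overcast + (1.0 - cloud_frac) * rad_clear

def internal_mixing_transmittance(cloud_frac, tau_cloud, tau_clear, mu=1.0):
    """'Internal mixing': combine the optical properties first (an effective
    optical thickness), then compute a toy direct transmittance exp(-tau/mu)
    for cosine of solar zenith angle mu."""
    tau_eff = cloud_frac * tau_cloud + (1.0 - cloud_frac) * tau_clear
    return math.exp(-tau_eff / mu)
```

Because the exponential is nonlinear, the two approaches generally disagree for 0 < cloud fraction < 1, which is why parameterizations such as the effective thickness approximation introduce a closure coefficient to reconcile them.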
Energy Technology Data Exchange (ETDEWEB)
Dubois, C.; Somot, S.; Deque, M.; Sevault, F. [CNRM-GAME, Meteo-France, CNRS, Toulouse (France); Calmanti, S.; Carillo, A.; Dell' Aquilla, A.; Sannino, G. [ENEA, Rome (Italy); Elizalde, A.; Jacob, D. [Max Planck Institute for Meteorology, Hamburg (Germany); Gualdi, S.; Oddo, P.; Scoccimarro, E. [INGV, Bologna (Italy); L' Heveder, B.; Li, L. [Laboratoire de Meteorologie Dynamique, Paris (France)
2012-10-15
Within the CIRCE project "Climate Change and Impact Research: the Mediterranean Environment", an ensemble of high resolution coupled atmosphere-ocean regional climate models (AORCMs) are used to simulate the Mediterranean climate for the period 1950-2050. For the first time, realistic net surface air-sea fluxes are obtained. The sea surface temperature (SST) variability is consistent with the atmospheric forcing above it and oceanic constraints. The surface fluxes respond to external forcing under a warming climate and show an equivalent trend in all models. This study focuses on the present day and on the evolution of the heat and water budget over the Mediterranean Sea under the SRES-A1B scenario. Contrary to previous studies, the net total heat budget is negative over the present period in all AORCMs and satisfies the heat budget closure controlled by a net positive heat gain at the Strait of Gibraltar in the present climate. Under the climate change scenario, some models predict a warming of the Mediterranean Sea from the ocean surface (positive net heat flux) in addition to the positive flux at the Strait of Gibraltar for the 2021-2050 period. The shortwave and latent fluxes increase and the longwave and sensible fluxes decrease compared to the 1961-1990 period, due to a reduction of the cloud cover and an increase in greenhouse gases (GHGs) and SSTs over the 2021-2050 period. The AORCMs provide good estimates of the water budget, with a drying of the region during the twenty-first century. For the ensemble mean, the decrease in precipitation and runoff is about 10% and 15% respectively, and the increase in evaporation is much weaker, about 2%, compared to the 1961-1990 period, confirming results obtained in recent studies. Despite a clear consistency in the trends and results between the models, this study also underlines important differences in the model set-ups, methodology and choices of some physical parameters inducing
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Oliveira, Arnaldo
2007-01-01
This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.
Cycle-Based Budgeting and Continuous Improvement at Jefferson County Public Schools: Year 1 Report
Yan, Bo
2016-01-01
This report documents the first-year of implementing Cycle-based Budgeting at Jefferson County Public Schools (Louisville, KY). To address the limitations of incremental budgeting and zero-based budgeting, a Cycle-based Budgeting model was developed and implemented in JCPS. Specifically, each new program needs to submit an on-line budget request…
Budget Setting Strategies for the Company's Divisions
Berg, M.; Brekelmans, R.C.M.; De Waegenaere, A.M.B.
1997-01-01
The paper deals with the issue of budget setting to the divisions of a company. The approach is quantitative in nature both in the formulation of the requirements for the set-budgets, as related to different general managerial objectives of interest, and in the modelling of the inherent
Zero Based Budgeting for Voc Ed
Chuang, Ying C.
1977-01-01
To help vocational education budget planners take a good look each year at where they are going, what they are trying to accomplish, and where to put their money, this article describes the 12 steps in a model commonly used for zero based budgeting. (Author/HD)
Nonlinear structural mechanics theory, dynamical phenomena and modeling
Lacarbonara, Walter
2013-01-01
Nonlinear Structural Mechanics: Theory, Dynamical Phenomena and Modeling offers a concise, coherent presentation of the theoretical framework of nonlinear structural mechanics, computational methods, applications, parametric investigations of nonlinear phenomena and their mechanical interpretation towards design. The theoretical and computational tools that enable the formulation, solution, and interpretation of nonlinear structures are presented in a systematic fashion so as to gradually attain an increasing level of complexity of structural behaviors, under the prevailing assumptions on the geometry of deformation, the constitutive aspects and the loading scenarios. Readers will find a treatment of the foundations of nonlinear structural mechanics towards advanced reduced models, unified with modern computational tools in the framework of the prominent nonlinear structural dynamic phenomena while tackling both the mathematical and applied sciences. Nonlinear Structural Mechanics: Theory, Dynamical Phenomena...
Profile-likelihood Confidence Intervals in Item Response Theory Models.
Chalmers, R Philip; Pek, Jolynn; Liu, Yang
2017-01-01
Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
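A PL CI is the set of parameter values whose likelihood-ratio statistic stays below the chi-square critical value, rather than an interval symmetric around the estimate. The sketch below is illustrative (a binomial proportion stands in for an item response theory parameter, so no nuisance parameters need to be profiled out) and finds the 95% bounds by bisection:

```python
import math

CHI2_95_DF1 = 3.841  # 95% quantile of chi-square with 1 degree of freedom

def loglik(p, k, n):
    """Binomial log-likelihood for k successes in n trials."""
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def profile_ci(k, n, crit=CHI2_95_DF1):
    """95% profile-likelihood CI for a binomial proportion: the set of p
    with 2 * (loglik(p_hat) - loglik(p)) <= crit. Requires 0 < k < n."""
    p_hat = k / n
    ll_max = loglik(p_hat, k, n)

    def deviance(p):
        return 2.0 * (ll_max - loglik(p, k, n))

    def bisect(lo, hi):
        # deviance - crit changes sign exactly once between lo and hi
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if (deviance(mid) - crit) * (deviance(lo) - crit) <= 0.0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    return bisect(1e-9, p_hat), bisect(p_hat, 1.0 - 1e-9)
```

Unlike a Wald interval, the resulting bounds are generally asymmetric about p_hat and always stay inside (0, 1), which mirrors the advantage PL CIs show for transformed parameters.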
The QCD model of hadron cores of the meson theory
International Nuclear Information System (INIS)
Pokrovskii, Y.E.
1985-01-01
It was shown that in the previously proposed QCD model of hadron cores, the exchange and self-energy contributions of the virtual quark-antiquark-gluon cloud outside a bag whose radius coincides with the hadron-core radius of the meson theory (∼ 0.4 fm) have been taken into account at the phenomenological level. Simulating this cloud by the meson field yields realistic estimates of the nucleon's electroweak properties, of the momentum fractions carried by gluons, quarks, and antiquarks, and of hadron-hadron interaction cross-sections over a wide range of energies. The author notes that the QCD hadron-core model proposed earlier not only reproduces the hadron masses realistically, but also self-consistently reflects the main elements of hadron structure and interaction, with the quark-gluon bag radius (R ∼ 0.4 fm) close to the core radius of the meson theory
Synthetic Domain Theory and Models of Linear Abadi & Plotkin Logic
DEFF Research Database (Denmark)
Møgelberg, Rasmus Ejlers; Birkedal, Lars; Rosolini, Guiseppe
2008-01-01
Plotkin suggested using a polymorphic dual intuitionistic/linear type theory (PILLY) as a metalanguage for parametric polymorphism and recursion. In recent work the first two authors and R.L. Petersen have defined a notion of parametric LAPL-structure, which are models of PILLY, in which one can reason using parametricity and, for example, solve a large class of domain equations, as suggested by Plotkin. In this paper, we show how an interpretation of a strict version of Bierman, Pitts and Russo's language Lily into synthetic domain theory presented by Simpson and Rosolini gives rise to a parametric LAPL-structure. This adds to the evidence that the notion of LAPL-structure is a general notion, suitable for treating many different parametric models, and it provides formal proofs of consequences of parametricity expected to hold for the interpretation. Finally, we show how these results…
SIMP model at NNLO in chiral perturbation theory
DEFF Research Database (Denmark)
Hansen, Martin Rasmus Lundquist; Langaeble, K.; Sannino, F.
2015-01-01
We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3 to 2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles … with phenomenological constraints challenging the viability of the simplest realisation of the strongly interacting massive particle (SIMP) paradigm…
Regression modeling methods, theory, and computation with SAS
Panik, Michael
2009-01-01
Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,
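The book demonstrates these methods in SAS; as a language-neutral sketch of the OLS computation it builds on, simple linear regression reduces to centered sums (the data here are hypothetical):

```python
def ols_simple(x, y):
    """Ordinary least squares for y = a + b*x via centered sums."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return a, b

a, b = ols_simple([1, 2, 3, 4], [3, 5, 7, 9])  # data lie exactly on y = 1 + 2x
```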
A model theory for tachyons in two dimensions
International Nuclear Information System (INIS)
Recami, E.; Rodrigues, W.A.
1985-01-01
The paper is divided into two parts, the first having nothing to do with tachyons. To prepare the ground, part one (sect. 2) shows that special relativity, even without tachyons, can be given a form that describes both particles and antiparticles. Part two is confined to a model theory in two dimensions, for the reasons stated in sect. 3
A realistic model for quantum theory with a locality property
International Nuclear Information System (INIS)
Eberhard, P.H.
1987-04-01
A model reproducing the predictions of relativistic quantum theory to any desired degree of accuracy is described in this paper. It involves quantities that are independent of the observer's knowledge, and therefore can be called real, and which are defined at each point in space, and therefore can be called local in a rudimentary sense. It involves faster-than-light, but not instantaneous, action at a distance
Theory, Modeling and Simulation Annual Report 2000; FINAL
International Nuclear Information System (INIS)
Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A
2001-01-01
This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems
Properties of lattice gauge theory models at low temperatures
International Nuclear Information System (INIS)
Mack, G.
1980-01-01
The Z(N) theory of quark confinement is discussed, and how fluctuations of Z(N) gauge fields may continue to be important in the continuum limit. Existence of a model in four dimensions is pointed out in which confinement of (scalar) quarks can be shown to persist in the continuum limit. This article is based on the author's Cargese lectures 1979. Some of its results are published here for the first time. (orig.)
Field theory of large amplitude collective motion. A schematic model
International Nuclear Information System (INIS)
Reinhardt, H.
1978-01-01
By using path integral methods the equation for large amplitude collective motion for a schematic two-level model is derived. The original fermion theory is reformulated in terms of a collective (Bose) field. The classical equation of motion for the collective field coincides with the time-dependent Hartree-Fock equation. Its classical solution is quantized by means of the field-theoretical generalization of the WKB method. (author)
Stability Analysis for Car Following Model Based on Control Theory
International Nuclear Information System (INIS)
Meng Xiang-Pei; Li Zhi-Peng; Ge Hong-Xia
2014-01-01
Stability analysis is one of the key issues in car-following theory. The stability analysis with Lyapunov function for the two velocity difference car-following model (for short, TVDM) is conducted and the control method to suppress traffic congestion is introduced. Numerical simulations are given and the results are consistent with the theoretical analysis.
Analytical theory of Doppler reflectometry in slab plasma model
Energy Technology Data Exchange (ETDEWEB)
Gusakov, E.Z.; Surkov, A.V. [Ioffe Institute, Politekhnicheskaya 26, St. Petersburg (Russian Federation)
2004-07-01
Doppler reflectometry is considered in a slab plasma model in the framework of analytical theory. The locality of the diagnostics is analyzed for both regimes: linear and nonlinear in turbulence amplitude. Toroidal antenna focusing of the probing beam to the cut-off is proposed and discussed as a method to increase the diagnostics' spatial resolution. It is shown that even in the case of a nonlinear regime of multiple scattering, the diagnostics can be used for an estimation (with certain accuracy) of the plasma poloidal rotation profile. (authors)
Spherically symmetric star model in the gravitational gauge theory
Energy Technology Data Exchange (ETDEWEB)
Tsou, C. [Peking Observatory, China]; Ch'en, S.; Ho, T.; Kuo, H.
1976-12-01
It is shown that a star model, which is black hole-free and singularity-free, can be obtained naturally in the gravitational gauge theory, provided the space-time is torsion-free and the matter is spinless. The conclusion in a sense shows that the discussions about the black hole and the singularity based on general relativity may not describe nature correctly.
Building Better Ecological Machines: Complexity Theory and Alternative Economic Models
Directory of Open Access Journals (Sweden)
Jess Bier
2016-12-01
Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent product of this reimagining: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.
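A toy exchange economy illustrates the kind of emergent, "ecological" behavior ABMs exhibit (this is a generic illustration under our own assumptions, not one of the models the article analyzes): symmetric micro-rules still produce an unequal wealth distribution that no single agent intends.

```python
import random

def exchange_economy(n_agents=200, steps=20000, seed=42):
    """Toy agent-based model: agents start with equal wealth; at each
    step one randomly chosen agent gives one unit to another. Total
    wealth is conserved, yet the distribution drifts away from equality."""
    random.seed(seed)
    wealth = [50.0] * n_agents
    for _ in range(steps):
        i, j = random.sample(range(n_agents), 2)
        if wealth[i] >= 1.0:   # agents cannot go below zero
            wealth[i] -= 1.0
            wealth[j] += 1.0
    return wealth

w = exchange_economy()
```

The conserved total alongside a dispersed distribution is the simplest version of the market-as-ecology claim: aggregate regularity emerging from decentralized interaction rather than from equilibrium assumptions.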
Le Quéré, Corinne; Andrew, Robbie M.; Friedlingstein, Pierre; Sitch, Stephen; Pongratz, Julia; Manning, Andrew C.; Korsbakken, Jan Ivar; Peters, Glen P.; Canadell, Josep G.; Jackson, Robert B.; Boden, Thomas A.; Tans, Pieter P.; Andrews, Oliver D.; Arora, Vivek K.; Bakker, Dorothee C. E.; Barbero, Leticia; Becker, Meike; Betts, Richard A.; Bopp, Laurent; Chevallier, Frédéric; Chini, Louise P.; Ciais, Philippe; Cosca, Catherine E.; Cross, Jessica; Currie, Kim; Gasser, Thomas; Harris, Ian; Hauck, Judith; Haverd, Vanessa; Houghton, Richard A.; Hunt, Christopher W.; Hurtt, George; Ilyina, Tatiana; Jain, Atul K.; Kato, Etsushi; Kautz, Markus; Keeling, Ralph F.; Klein Goldewijk, Kees; Körtzinger, Arne; Landschützer, Peter; Lefèvre, Nathalie; Lenton, Andrew; Lienert, Sebastian; Lima, Ivan; Lombardozzi, Danica; Metzl, Nicolas; Millero, Frank; Monteiro, Pedro M. S.; Munro, David R.; Nabel, Julia E. M. S.; Nakaoka, Shin-ichiro; Nojiri, Yukihiro; Padin, X. Antonio; Peregon, Anna; Pfeil, Benjamin; Pierrot, Denis; Poulter, Benjamin; Rehder, Gregor; Reimer, Janet; Rödenbeck, Christian; Schwinger, Jörg; Séférian, Roland; Skjelvan, Ingunn; Stocker, Benjamin D.; Tian, Hanqin; Tilbrook, Bronte; Tubiello, Francesco N.; van der Laan-Luijkx, Ingrid T.; van der Werf, Guido R.; van Heuven, Steven; Viovy, Nicolas; Vuichard, Nicolas; Walker, Anthony P.; Watson, Andrew J.; Wiltshire, Andrew J.; Zaehle, Sönke; Zhu, Dan
2018-03-01
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere - the global carbon budget - is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on land-cover change data and bookkeeping models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2007-2016), EFF was 9.4 ± 0.5 GtC yr-1, ELUC 1.3 ± 0.7 GtC yr-1, GATM 4.7 ± 0.1 GtC yr-1, SOCEAN 2.4 ± 0.5 GtC yr-1, and SLAND 3.0 ± 0.8 GtC yr-1, with a budget imbalance BIM of 0.6 GtC yr-1 indicating overestimated emissions and/or underestimated sinks. For year 2016 alone, the growth in EFF was approximately zero and emissions remained at 9.9 ± 0.5 GtC yr-1. Also for 2016, ELUC was 1.3 ± 0.7 GtC yr-1, GATM was 6.1 ± 0.2 GtC yr-1, SOCEAN was 2.6 ± 0.5 GtC yr-1, and SLAND was 2.7 ± 1.0 GtC yr-1, with a small BIM of -0.3 GtC. GATM continued to be higher in 2016 compared to the past decade (2007-2016), reflecting in part the high fossil emissions and the small SLAND
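The budget identity underlying the abstract can be checked directly; the inputs below are the 2007-2016 decadal means quoted above (GtC per year):

```python
def budget_imbalance(e_ff, e_luc, g_atm, s_ocean, s_land):
    """Carbon budget imbalance B_IM = (E_FF + E_LUC) - (G_ATM + S_OCEAN + S_LAND):
    estimated total emissions minus estimated changes in atmosphere,
    ocean, and terrestrial biosphere, all in GtC/yr."""
    return (e_ff + e_luc) - (g_atm + s_ocean + s_land)

# decadal means 2007-2016 from the abstract
b_im = budget_imbalance(9.4, 1.3, 4.7, 2.4, 3.0)  # 0.6 GtC/yr
```

A positive imbalance, as here, indicates overestimated emissions and/or underestimated sinks.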
Understanding the Budget Process
Directory of Open Access Journals (Sweden)
Mesut Yalvaç
2000-03-01
Many different budgeting techniques can be used in libraries, and some combination of these will be appropriate for almost any individual situation. Line-item, program, performance, formula, variable, and zero-base budgets all have features that may prove beneficial in the preparation of a budget. Budgets also serve a variety of functions, providing for short-term and long-term financial planning as well as for cash management over a period of time. Short-term plans are reflected in the operating budget, while long-term plans are reflected in the capital budget. Since the time when cash is available to an organization does not usually coincide with the time that disbursements must be made, it is also important to carefully plan for the inflow and outflow of funds by means of a cash budget. During the budget process an organization selects its programs and activities by providing the necessary funding; the library, along with others in the organization, must justify its requests. Because of the cyclical nature of the budget process, it is possible continually to gather information and evaluate alternatives for the next budget period so that the library may achieve its maximum potential for service to its patrons.
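The cash-budget planning described above reduces to simple period-by-period arithmetic; a minimal sketch with hypothetical monthly figures:

```python
def cash_budget(opening_balance, receipts, disbursements):
    """Closing cash balance for each period; each period opens with
    the previous period's closing balance, so a shortfall in one
    month propagates into the next."""
    balances = []
    balance = opening_balance
    for cash_in, cash_out in zip(receipts, disbursements):
        balance += cash_in - cash_out
        balances.append(balance)
    return balances

# hypothetical monthly library figures
closing = cash_budget(1000.0, [500.0, 700.0, 300.0], [600.0, 400.0, 800.0])
```

The point of the cash budget is visible in the output: even with adequate total funding, the timing mismatch between receipts and disbursements can push an individual period's balance down sharply.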
Noncommutative gauge theory and symmetry breaking in matrix models
International Nuclear Information System (INIS)
Grosse, Harald; Steinacker, Harold; Lizzi, Fedele
2010-01-01
We show how the fields and particles of the standard model can be naturally realized in noncommutative gauge theory. Starting with a Yang-Mills matrix model in more than four dimensions, an SU(n) gauge theory on a Moyal-Weyl space arises with all matter and fields in the adjoint of the gauge group. We show how this gauge symmetry can be broken spontaneously down to SU(3) c xSU(2) L xU(1) Q [resp. SU(3) c xU(1) Q ], which couples appropriately to all fields in the standard model. An additional U(1) B gauge group arises which is anomalous at low energies, while the trace-U(1) sector is understood in terms of emergent gravity. A number of additional fields arise, which we assume to be massive, in a pattern that is reminiscent of supersymmetry. The symmetry breaking might arise via spontaneously generated fuzzy spheres, in which case the mechanism is similar to brane constructions in string theory.
Measuring and modeling salience with the theory of visual attention.
Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid
2017-08-01
For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
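The power-function model of salience growth favored by the article's Bayesian model comparison can be sketched as a log-log linear fit (the fitting method and data below are our own illustration, not the article's TVA-based estimation):

```python
import math

def fit_power(contrast, salience):
    """Fit the power model s = a * c**b by linear regression on logs:
    log s = log a + b * log c."""
    xs = [math.log(c) for c in contrast]
    ys = [math.log(s) for s in salience]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# hypothetical noiseless data generated by s = 2 * c**0.5
contrasts = [1.0, 2.0, 4.0, 8.0]
a, b = fit_power(contrasts, [2.0 * c ** 0.5 for c in contrasts])
```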
Application of the evolution theory in modelling of innovation diffusion
Directory of Open Access Journals (Sweden)
Krstić Milan
2016-01-01
The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In that sense, so-called 'memetic evolution' is now widely accepted. Memes represent a complex adaptive system, where one 'meme' represents an evolutionary cultural element, i.e. the smallest unit of information which can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area where the theory of evolution can be successfully applied. In this work the authors start from the assumption that it is also possible to apply the theory of evolution in the modelling of the process of innovation diffusion. Based on the conducted theoretical research, the authors conclude that the process of innovation diffusion, in the interpretation of a 'meme', is actually the process of imitation of the 'meme' of innovation. Since during the process of their replication certain 'memes' are more successful than others, this eventually leads to their natural selection. For the survival of innovation 'memes', their manifestations are of key importance in the sense of longevity, fruitfulness, and faithful replication. The results of the conducted research confirm the assumption that the theory of evolution can be applied to innovation diffusion with the help of innovation 'memes', which opens up perspectives for new research on the subject.
Schulz, E.; Grasso, F.; Le Hir, P.; Verney, R.; Thouvenin, B.
2018-01-01
Understanding the sediment dynamics in an estuary is important for its morphodynamic and ecological assessment as well as, in case of an anthropogenically controlled system, for its maintenance. However, the quantification of sediment fluxes and budgets is extremely difficult from in-situ data and requires thoroughly validated numerical models. In the study presented here, sediment fluxes and budgets in the lower Seine Estuary were quantified and investigated from seasonal to annual time scales with respect to realistic hydro- and meteorological conditions. A realistic three-dimensional process-based hydro- and sediment-dynamic model was used to quantify mud and sand fluxes through characteristic estuarine cross-sections. In addition to a reference experiment with typical forcing, three experiments were carried out and analyzed, each differing from the reference experiment in either river discharge or wind and waves so that the effects of these forcings could be separated. Hydro- and meteorological conditions affect the sediment fluxes and budgets in different ways and at different locations. Single storm events induce strong erosion in the lower estuary and can have a significant effect on the sediment fluxes offshore of the Seine Estuary mouth, with the flux direction depending on the wind direction. Spring tides cause significant up-estuary fluxes at the mouth. A high river discharge drives barotropic down-estuary fluxes at the upper cross-sections, but baroclinic up-estuary fluxes at the mouth and offshore so that the lower estuary gains sediment during wet years. This behavior is likely to be observed worldwide in estuaries affected by density gradients and turbidity maximum dynamics.
THE REAL OPTIONS OF CAPITAL BUDGET
Directory of Open Access Journals (Sweden)
Antonio Lopo Martins
2008-07-01
Traditional capital-budgeting techniques, such as discounted cash flow and net present value, do not incorporate the flexibilities existing in an investment project and tend to distort the value of certain investments, mainly those evaluated under scenarios of uncertainty and risk. Therefore, this study intends to demonstrate that the Real Options Theory (TOR) is a useful methodology to evaluate and indicate the best option for an expansion investment project. To reach this objective, a case study was used, with the Resort Praia Hotel do Litoral Norte of Salvador as the case unit. The study was developed as follows: first the traditional net present value was identified, and then the volatility of each analyzed uncertainty was incorporated. Second, as real options are analogous to financial options, it was necessary to identify the elements that compose the terminology of financial options in order to obtain the value of the real option. For this, the Black & Scholes option-pricing model was used jointly with a computational simulator (SLS) to obtain the expanded net present value. As a result of this study it was possible to show that, using the traditional capital-budgeting tool, the net present value (VPL) is negative and the hotel's expansion project would therefore be rejected, while under the TOR methodology the project presents a positive expanded present value, which would represent an excellent investment opportunity. Keywords: capital budget, real options, analysis of investments.
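The expanded net present value combines the static NPV with the value of the flexibility, here sketched with the standard Black & Scholes call formula (the study itself used the SLS simulator; the inputs below are hypothetical, not the hotel's figures):

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black & Scholes value of a European call: s = present value of the
    project's expected cash flows, k = investment outlay, t = time until
    the option expires, r = risk-free rate, sigma = project volatility."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def expanded_npv(static_npv, option_value):
    # a negative static NPV can turn positive once flexibility is valued
    return static_npv + option_value

option = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)  # at-the-money 1-year option
```

This is exactly the reversal the study reports: a project rejected on static NPV alone can show a positive expanded NPV once the option value is added.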
H+3 WZNW model from Liouville field theory
International Nuclear Information System (INIS)
Hikida, Yasuaki; Schomerus, Volker
2007-01-01
There exists an intriguing relation between genus zero correlation functions in the H+3 WZNW model and in Liouville field theory. We provide a path integral derivation of the correspondence and then use our new approach to generalize the relation to surfaces of arbitrary genus g. In particular we determine the correlation functions of N primary fields in the WZNW model explicitly through Liouville correlators with N+2g-2 additional insertions of certain degenerate fields. The paper concludes with a list of interesting further extensions and a few comments on the relation to the geometric Langlands program
A model for hot electron phenomena: Theory and general results
International Nuclear Information System (INIS)
Carrillo, J.L.; Rodriquez, M.A.
1988-10-01
We propose a model for the description of hot electron phenomena in semiconductors. Based on this model we are able to reproduce accurately the main characteristics observed in experiments on electric-field transport, optical absorption, steady-state photoluminescence and relaxation processes. Our theory contains no free or adjustable parameters, is computationally fast, and incorporates the main collision mechanisms, including screening and phonon-heating effects. Our description is based on a set of nonlinear rate equations in which the interactions are represented by coupling coefficients or effective frequencies. We calculate three coefficients from the characteristic constants and the band structure of the material. (author). 22 refs, 5 figs, 1 tab
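A rate equation with an effective frequency has the relaxation form the abstract alludes to; the single-channel toy below (our illustration, not the paper's three-coefficient system) integrates a carrier temperature relaxing toward the lattice:

```python
def relax(temp0, lattice_temp, eff_freq, dt, steps):
    """Forward-Euler integration of dT/dt = -eff_freq * (T - T_lattice):
    a hot-carrier temperature relaxing toward the lattice temperature
    at an effective collision frequency (toy illustration)."""
    temp = temp0
    history = [temp]
    for _ in range(steps):
        temp += -eff_freq * (temp - lattice_temp) * dt
        history.append(temp)
    return history

h = relax(1000.0, 300.0, 1.0, 0.01, 1000)  # carriers cool toward 300 K
```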
A possibilistic uncertainty model in classical reliability theory
International Nuclear Information System (INIS)
De Cooman, G.; Capelle, B.
1994-01-01
The authors argue that a possibilistic uncertainty model can be used to represent linguistic uncertainty about the states of a system and of its components. Furthermore, the basic properties of the application of this model to classical reliability theory are studied. The notion of the possibilistic reliability of a system or a component is defined. Based on the concept of a binary structure function, the important notion of a possibilistic structure function is introduced. It allows one to calculate the possibilistic reliability of a system in terms of the possibilistic reliabilities of its components
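Under the usual min/max calculus of possibility theory (an assumption made here for illustration; the paper's formal definitions are richer), the system-level computation the abstract describes reduces to:

```python
def series_reliability(component_rels):
    """Possibilistic reliability of a series system: the system works
    only if every component works, so the weakest component bounds it."""
    return min(component_rels)

def parallel_reliability(component_rels):
    """Possibilistic reliability of a parallel system: any single working
    component suffices, so the strongest component determines it."""
    return max(component_rels)
```

For example, a series arrangement of components with possibilistic reliabilities 0.9, 0.6, and 0.8 is limited to 0.6, while the same components in parallel reach 0.9.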
Theory and Circuit Model for Lossy Coaxial Transmission Line
Energy Technology Data Exchange (ETDEWEB)
Genoni, T. C.; Anderson, C. N.; Clark, R. E.; Gansz-Torres, J.; Rose, D. V.; Welch, Dale Robert
2017-04-01
The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit-model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
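The paper's own approximate formulas are not reproduced in the abstract, but the exact textbook relations they approximate follow from the per-unit-length line constants; a sketch:

```python
import cmath
import math

def lossy_line(r_pul, l_pul, g_pul, c_pul, freq):
    """Characteristic impedance Z0 and attenuation alpha (Np/m) of a
    lossy transmission line from per-unit-length R [ohm/m], L [H/m],
    G [S/m], C [F/m] at frequency freq [Hz]:
        gamma = sqrt((R + jwL)(G + jwC)),  Z0 = sqrt((R + jwL)/(G + jwC))."""
    w = 2.0 * math.pi * freq
    z_series = complex(r_pul, w * l_pul)    # series impedance per metre
    y_shunt = complex(g_pul, w * c_pul)     # shunt admittance per metre
    gamma = cmath.sqrt(z_series * y_shunt)  # propagation constant
    z0 = cmath.sqrt(z_series / y_shunt)
    return z0, gamma.real                   # alpha = Re(gamma)

# lossless sanity check: Z0 = sqrt(L/C) = 50 ohm, alpha = 0
z0, alpha = lossy_line(0.0, 250e-9, 0.0, 100e-12, 1e6)
```

With nonzero R and G, the same two lines yield the frequency-dependent impedance and attenuation that the paper's circuit model is designed to reproduce.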
Models and applications of chaos theory in modern sciences
Zeraoulia, Elhadj
2011-01-01
This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economics, electronics, mechanical engineering, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling the several phenomena in nature and society, highlighting powerful techniques being developed to meet the challenges that stem from the area of nonlinear…
Refined pipe theory for mechanistic modeling of wood development.
Deckmyn, Gaby; Evans, Sam P; Randle, Tim J
2006-06-01
We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
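The refined ratio replaces sapwood area with pipe conductivity; under a Hagen-Poiseuille assumption (ours, for illustration, not the model's full implementation) conductivity scales with the fourth power of pipe radius, so even small shifts in vessel radius move the allocation ratio strongly:

```python
import math

def pipe_conductivity(radius, n_pipes=1, viscosity=1.002e-3):
    """Hagen-Poiseuille conductivity of n_pipes parallel pipes (per unit
    pressure gradient): k = n * pi * r**4 / (8 * eta).
    viscosity defaults to water at 20 C (an assumed value)."""
    return n_pipes * math.pi * radius ** 4 / (8.0 * viscosity)

def allocation_ratio(conductivity, leaf_area):
    # the refined pipe-theory ratio: conductivity per unit leaf area
    return conductivity / leaf_area
```

Doubling the pipe radius multiplies conductivity by 16, which is why the model can hold conductivity per leaf area constant (functional homeostasis) or let it drift with age (hydraulic limitation) through modest changes in allocation or radius.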
Toward a General Research Process for Using Dubin's Theory Building Model
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
Two problems from the theory of semiotic control models. I. Representations of semiotic models
Energy Technology Data Exchange (ETDEWEB)
Osipov, G S
1981-11-01
Two problems from the theory of semiotic control models are stated, in particular the representation of models and the semantic analysis of them. Algebraic representation of semiotic models, covering of representations, their reduction and equivalence are discussed. The interrelations between functional and structural characteristics of semiotic models are investigated. 20 references.
General topology meets model theory, on p and t.
Malliaris, Maryanthe; Shelah, Saharon
2013-08-13
Cantor proved in 1874 [Cantor G (1874) J Reine Angew Math 77:258-262] that the continuum is uncountable, and Hilbert's first problem asks whether it is the smallest uncountable cardinal. A program arose to study cardinal invariants of the continuum, which measure the size of the continuum in various ways. By Gödel [Gödel K (1939) Proc Natl Acad Sci USA 25(4):220-224] and Cohen [Cohen P (1963) Proc Natl Acad Sci USA 50(6):1143-1148], Hilbert's first problem is independent of ZFC (Zermelo-Fraenkel set theory with the axiom of choice). Much work both before and since has been done on inequalities between these cardinal invariants, but some basic questions have remained open despite Cohen's introduction of forcing. The oldest and perhaps most famous of these is whether " p = t," which was proved in a special case by Rothberger [Rothberger F (1948) Fund Math 35:29-46], building on Hausdorff [Hausdorff (1936) Fund Math 26:241-255]. In this paper we explain how our work on the structure of Keisler's order, a large-scale classification problem in model theory, led to the solution of this problem in ZFC as well as of an a priori unrelated open question in model theory.
Item level diagnostics and model - data fit in item response theory ...
African Journals Online (AJOL)
Item response theory (IRT) is a framework for modeling and analyzing item response data. Item-level modeling gives IRT advantages over classical test theory. The fit of an item score pattern to item response theory (IRT) models is a necessary condition that must be assessed for further use of the items and models that best fit ...
Understanding the Budget Process
Mesut Yalvaç
2000-01-01
Many different budgeting techniques can be used in libraries, and some combination of these will be appropriate for almost any individual situation. Line-item, program, performance, formula, variable, and zero-base budgets all have features that may prove beneficial in the preparation of a budget. Budgets also serve a variety of functions, providing for short-term and long-term financial planning as well as for cash management over a period of time. Short-term plans are reflected in the oper...
Ito, A.
2017-12-01
Terrestrial ecosystems are an important sink of carbon dioxide (CO2) but significant sources of other greenhouse gases such as methane (CH4) and nitrous oxide (N2O). To resolve the role of the terrestrial biosphere in the climate system, we need to quantify the total greenhouse gas budget with adequate accuracy. In addition to top-down evaluation on the basis of atmospheric measurements, a model-based approach is required for integration and up-scaling of field data and for prediction under changing environments and different management practices. Since the early 2000s, we have developed a process-based model of terrestrial biogeochemical cycles focusing on atmosphere-ecosystem exchange of trace gases: the Vegetation Integrated SImulator for Trace gases (VISIT). The model includes simple and comprehensive schemes of carbon and nitrogen cycles in terrestrial ecosystems, allowing us to capture the dynamic nature of the greenhouse gas budget. Beginning from natural ecosystems such as temperate and tropical forests, the model is now applicable to croplands through the inclusion of agricultural practices such as planting, harvest, and fertilizer input. Global simulation results have been published in several papers, but model validation and benchmarking using up-to-date observations remain as future work. The model is now applied to several practical issues such as evaluation of N2O emission from bio-fuel croplands, which are expected to help accomplish the mitigation target of the Paris Agreement. We also present several topics in basic model development, such as revised CH4 emission affected by a dynamic water table and refined N2O emission from nitrification.
Roth, Jason L.; Capel, Paul D.
2012-01-01
Crop agriculture occupies 13 percent of the conterminous United States. Agricultural management practices, such as crop and tillage types, affect the hydrologic flow paths through the landscape. Some agricultural practices, such as drainage and irrigation, create entirely new hydrologic flow paths upon the landscapes where they are implemented. These hydrologic changes can affect the magnitude and partitioning of water budgets and sediment erosion. Given the wide degree of variability amongst agricultural settings, changes in the magnitudes of hydrologic flow paths and sediment erosion induced by agricultural management practices commonly are difficult to characterize, quantify, and compare using only field observations. The Water Erosion Prediction Project (WEPP) model was used to simulate two landscape characteristics (slope and soil texture) and three agricultural management practices (land cover/crop type, tillage type, and selected agricultural land management practices) to evaluate their effects on the water budgets of and sediment yield from agricultural lands. An array of sixty-eight 60-year simulations were run, each representing a distinct natural or agricultural scenario with various slopes, soil textures, crop or land cover types, tillage types, and select agricultural management practices on an isolated 16.2-hectare field. Simulations were made to represent two common agricultural climate regimes: arid with sprinkler irrigation and humid. These climate regimes were constructed with actual climate and irrigation data. The results of these simulations demonstrate the magnitudes of potential changes in water budgets and sediment yields from lands as a result of landscape characteristics and agricultural practices adopted on them. These simulations showed that variations in landscape characteristics, such as slope and soil type, had appreciable effects on water budgets and sediment yields. As slopes increased, sediment yields increased in both the arid and
A model-theory for Tachyons in two dimensions
International Nuclear Information System (INIS)
Recami, E.; Rodriques, W.A. Jr.
1986-01-01
The subject of tachyons, even if still speculative, may deserve some attention for reasons that can be divided into a few categories, two of which are as follows. First, the larger scheme to be built up in order to incorporate space-like objects in the relativistic theories allows a better understanding of many aspects of ordinary relativistic physics, even if tachyons do not exist in our cosmos as ''asymptotically free'' objects. Second, superluminal classical objects can have a role in elementary particle interactions (perhaps even in astrophysics), with possible verification of the reproduction of quantum-like behaviour at a classical level once the possible existence of faster-than-light classical particles is taken into account. This paper shows that Special Relativity - even without tachyons - can be given a form which describes both particles and anti-particles. The paper is confined to a ''model theory'' of tachyons in two dimensions.
DsixTools: the standard model effective field theory toolkit
Energy Technology Data Exchange (ETDEWEB)
Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)
2017-06-15
We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
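DsixTools itself is a Mathematica package, so the following is only a language-neutral sketch of the core operation its SMEFTrunner module automates: one-loop RG running of Wilson coefficients, dC/d ln μ = γᵀC/(16π²). The 2×2 anomalous-dimension matrix in the usage below is a toy stand-in for the full Warsaw-basis matrix, not a physical one.

```python
import numpy as np

def run_wilson_coefficients(c0, gamma, mu_high, mu_low, steps=1000):
    """Evolve Wilson coefficients from mu_high down to mu_low by RK4
    integration of dC/dln(mu) = gamma^T @ C / (16 pi^2).
    gamma is a (toy) one-loop anomalous-dimension matrix."""
    t0, t1 = np.log(mu_high), np.log(mu_low)
    h = (t1 - t0) / steps
    k = gamma.T / (16.0 * np.pi ** 2)
    c = np.array(c0, dtype=float)
    f = lambda v: k @ v
    for _ in range(steps):
        k1 = f(c)
        k2 = f(c + 0.5 * h * k1)
        k3 = f(c + 0.5 * h * k2)
        k4 = f(c + h * k3)
        c = c + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return c
```

For a diagonal toy matrix, e.g. `np.diag([8.0, -8.0])`, the numerical result reproduces the analytic power-law running of each coefficient.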
Visceral obesity and psychosocial stress: a generalised control theory model
Wallace, Rodrick
2016-07-01
The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort - an expensive, and likely unsustainable, public policy.
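The Data Rate Theorem invoked here has a compact standard statement: for a discrete-time linear plant x_{t+1} = A x_t + B u_t controlled over a channel of capacity H, stabilization is possible only if the channel carries more information per step than the plant generates through its unstable modes (standard form from the networked-control literature, notation assumed):

```latex
H \;>\; \sum_{\{\,i \,:\, |\lambda_i(A)| \ge 1\,\}} \log_2 |\lambda_i(A)|,
```

where the sum runs over the unstable eigenvalues of A.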
Effective-field theory on the kinetic Ising model
International Nuclear Information System (INIS)
Shi Xiaoling; Wei Guozhu; Li Lin
2008-01-01
As an analytical method, the effective-field theory (EFT) is used to study the dynamical response of the kinetic Ising model in the presence of a sinusoidal oscillating field. The effective-field equations of motion of the average magnetization are given for the square lattice (Z=4) and the simple cubic lattice (Z=6), respectively. The dynamic order parameter, the hysteresis loop area and the dynamic correlation are calculated. In the field amplitude h_0/ZJ versus temperature T/ZJ plane, the phase boundary separating the dynamically ordered and disordered phases has been drawn, and a dynamical tricritical point has been observed. We also compare the results of the EFT with those given by the mean-field theory (MFT).
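The EFT equations of motion depend on the lattice-specific decoupling, so as a minimal runnable illustration of the quantities defined above, the sketch below integrates the simpler mean-field equation of motion dm/dt = -m + tanh[(ZJm + h0 cos ωt)/T] and evaluates the dynamic order parameter Q as the period-average of m. This is the MFT baseline the abstract compares against, not the paper's EFT.

```python
import math

def dynamic_order_parameter(T, h0, ZJ=1.0, period=200.0, n_periods=20, dt=0.02):
    """Euler-integrate dm/dt = -m + tanh((ZJ*m + h0*cos(w*t))/T) and
    return Q = (1/period) * integral of m over the final driving period.
    Temperatures and fields are in units of ZJ."""
    w = 2.0 * math.pi / period
    m, t = 0.9, 0.0
    steps_per_period = int(round(period / dt))
    # relax through the transient
    for _ in range(steps_per_period * (n_periods - 1)):
        m += dt * (-m + math.tanh((ZJ * m + h0 * math.cos(w * t)) / T))
        t += dt
    # accumulate Q over the last period
    q = 0.0
    for _ in range(steps_per_period):
        m += dt * (-m + math.tanh((ZJ * m + h0 * math.cos(w * t)) / T))
        t += dt
        q += m * dt
    return q / period
```

Below the mean-field critical temperature (T/ZJ < 1) and at weak field, Q stays near the spontaneous magnetization (dynamically ordered phase); above it, m oscillates symmetrically about zero and Q vanishes.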
Taniguchi, Kristine; Gudiño, Napoleon; Biggs, Trent; Castillo, Carlos; Langendoen, Eddy; Bingner, Ron; Taguas, Encarnación; Liden, Douglas; Yuan, Yongping
2015-04-01
Several watersheds cross the US-Mexico boundary, resulting in trans-boundary environmental problems. Erosion in Tijuana, Mexico, increases the rate of sediment deposition in the Tijuana Estuary in the United States, altering the structure and function of the ecosystem. The well-being of residents in Tijuana is compromised by damage to infrastructure and homes built adjacent to stream channels, gully formation in dirt roads, and deposition of trash. We aim to understand the dominant source of sediment contributing to the sediment budget of the watershed (channel, gully, or rill erosion), where the hotspots of erosion are located, and what the impact of future planned and unplanned land use changes and Best Management Practices (BMPs) will be on sediment and storm flow. We will be using a mix of field methods, including 3D photo-reconstruction of stream channels, with two models, CONCEPTS and AnnAGNPS to constrain estimates of the sediment budget and impacts of land use change. Our research provides an example of how 3D photo-reconstruction and Structure from Motion (SfM) can be used to model channel evolution.
A study of the logical model of capital market complexity theories
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
Analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the complexity essence of the capital market. Considering the capital market a complicated, interactive and adaptive dynamic system, with complexity science as the method for researching its operating laws, this paper constructs a nonlinear logical model to analyze the applied realm, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper foretells the developing direction of complexity science in the capital market.
Chern-Simons matrix models, two-dimensional Yang-Mills theory and the Sutherland model
International Nuclear Information System (INIS)
Szabo, Richard J; Tierz, Miguel
2010-01-01
We derive some new relationships between matrix models of Chern-Simons gauge theory and of two-dimensional Yang-Mills theory. We show that q-integration of the Stieltjes-Wigert matrix model is the discrete matrix model that describes q-deformed Yang-Mills theory on S^2. We demonstrate that the semiclassical limit of the Chern-Simons matrix model is equivalent to the Gross-Witten model in the weak-coupling phase. We study the strong-coupling limit of the unitary Chern-Simons matrix model and show that it too induces the Gross-Witten model, but as a first-order deformation of Dyson's circular ensemble. We show that the Sutherland model is intimately related to Chern-Simons gauge theory on S^3, and hence to q-deformed Yang-Mills theory on S^2. In particular, the ground-state wavefunction of the Sutherland model in its classical equilibrium configuration describes the Chern-Simons free energy. The correspondence is extended to Wilson line observables and to arbitrary simply laced gauge groups.
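For reference, the Chern-Simons matrix model on S^3 that these manipulations start from is conventionally written as a Gaussian ensemble with a hyperbolic-sine Vandermonde measure (standard form from the matrix-model literature; g_s denotes the coupling, and an overall normalization is omitted):

```latex
Z_{\mathrm{CS}}(S^3) \;\propto\; \int \prod_{i=1}^{N} d\phi_i \; e^{-\phi_i^2/2g_s}
\prod_{i<j} \left( 2\sinh\frac{\phi_i-\phi_j}{2} \right)^{2}.
```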
Plane answers to complex questions: the theory of linear models
Christensen, Ronald
1987-01-01
This book was written to rigorously illustrate the practical application of the projective approach to linear models. To some, this may seem contradictory. I contend that it is possible to be both rigorous and illustrative and that it is possible to use the projective approach in practical applications. Therefore, unlike many other books on linear models, the use of projections and subspaces does not stop after the general theory. They are used wherever I could figure out how to do it. Solving normal equations and using calculus (outside of maximum likelihood theory) are anathema to me. This is because I do not believe that they contribute to the understanding of linear models. I have similar feelings about the use of side conditions. Such topics are mentioned when appropriate and thenceforward avoided like the plague. On the other side of the coin, I just as strenuously reject teaching linear models with a coordinate free approach. Although Joe Eaton assures me that the issues in complicated problems freq...
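The projective approach rests on one identity: for the linear model y = Xβ + e, the least-squares fit is the orthogonal projection of y onto the column space of X. In standard notation (with a generalized inverse denoted by a superscript minus):

```latex
\hat{y} = M y, \qquad M = X\,(X^{\top}X)^{-}X^{\top}, \qquad M^{2} = M = M^{\top}.
```

Every classical result about fitted values, residuals and sums of squares then follows from properties of the idempotent, symmetric matrix M.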
Theory of thermoluminescence gamma dose response: The unified interaction model
International Nuclear Information System (INIS)
Horowitz, Y.S.
2001-01-01
We describe the development of a comprehensive theory of thermoluminescence (TL) dose response, the unified interaction model (UNIM). The UNIM is based on both radiation absorption stage and recombination stage mechanisms and can describe dose response for heavy charged particles (in the framework of the extended track interaction model - ETIM) as well as for isotropically ionising gamma rays and electrons (in the framework of the TC/LC geminate recombination model) in a unified and self-consistent conceptual and mathematical formalism. A theory of optical absorption dose response is also incorporated in the UNIM to describe the radiation absorption stage. The UNIM is applied to the dose response supralinearity characteristics of LiF:Mg,Ti and is especially and uniquely successful in explaining the ionisation density dependence of the supralinearity of composite peak 5 in TLD-100. The UNIM is demonstrated to be capable of explaining either qualitatively or quantitatively all of the major features of TL dose response with many of the variable parameters of the model strongly constrained by ancillary optical absorption and sensitisation measurements
Directory of Open Access Journals (Sweden)
Joel Arnault
2012-02-01
Full Text Available Gravity waves generated by the Vestfjella Mountains (in western Dronning Maud Land, Antarctica, southwest of the Finnish/Swedish Aboa/Wasa station) have been observed with the Moveable Atmospheric Radar for Antarctica (MARA) during the SWEDish Antarctic Research Programme (SWEDARP) in December 2007/January 2008. These radar observations are compared with a 2-month Weather Research and Forecasting (WRF) model experiment operated at 2 km horizontal resolution. A control simulation without orography is also operated in order to separate unambiguously the contribution of the mountain waves to the simulated atmospheric flow. This contribution is then quantified with a kinetic energy budget analysis computed in the two simulations. The results of this study confirm that mountain waves reaching lower-stratospheric heights break through convective overturning and generate inertia gravity waves with a smaller vertical wavelength, in association with a brief depletion of kinetic energy through frictional dissipation and negative vertical advection. The kinetic energy budget also shows that gravity waves have a strong influence on the other terms of the budget, i.e. horizontal advection and the horizontal work of pressure forces, so evaluating the influence of gravity waves on the mean flow with the vertical advection term alone is not sufficient, at least in this case. We finally obtain that gravity waves generated by the Vestfjella Mountains reaching lower-stratospheric heights generally deplete (create) kinetic energy in the lower troposphere (upper troposphere–lower stratosphere), in contradiction with the usual decelerating effect attributed to gravity waves on the zonal circulation in the upper troposphere–lower stratosphere.
Adapting Structuration Theory as a Comprehensive Theory for Distance Education: The ASTIDE Model
Aktaruzzaman, Md; Plunkett, Margaret
2016-01-01
Distance Education (DE) theorists have argued about the requirement for a theory to be comprehensive in a way that can explicate many of the activities associated with DE. Currently, Transactional Distance Theory (TDT) (Moore, 1993) and the Theory of Instructional Dialogue (IDT) (Caspi & Gorsky, 2006) are the most prominent theories, yet they…
Modelling non-ignorable missing data mechanisms with item response theory models
Holman, Rebecca; Glas, Cornelis A.W.
2005-01-01
A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled
Modelling non-ignorable missing-data mechanisms with item response theory models
Holman, Rebecca; Glas, Cees A. W.
2005-01-01
A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled
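A minimal concrete instance of the modelling strategy in these two records is a Rasch measurement model paired with a latent response propensity, so that the probability of omitting an item depends on a person parameter rather than being missing at random. The toy specification below is illustrative only, not the authors' exact model.

```python
import math
import random

def rasch_p(theta, b):
    """Rasch model: probability of a correct (or positive) response
    for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def simulate_person(theta, propensity, items, rng):
    """Return observed item scores for one person; None marks an omitted
    item. Omission depends on the latent 'propensity' parameter, so the
    missingness is non-ignorable whenever propensity relates to theta."""
    out = []
    for b in items:
        if rng.random() < rasch_p(propensity, b):   # does the person respond?
            out.append(1 if rng.random() < rasch_p(theta, b) else 0)
        else:
            out.append(None)                        # missing, not at random
    return out
```

Fitting both the measurement model and the missingness model jointly is what lets the procedure assess whether the missing responses can be ignored.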
The pipe model theory half a century on: a review.
Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick
2018-01-23
More than a half century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the moment of its publication: (1) What determines tree form and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass in an individual plant, in a stand or at even larger scales be estimated? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount rather than the whole of it. The facts that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might cause the commonly
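The linear relationship at the heart of the PMT can be written A_leaf ≈ k · A_sapwood, and the "pipe" coefficient k is then estimated by least squares through the origin. A minimal sketch (the numbers in the test are made up, not data from the review):

```python
def pipe_coefficient(sapwood_area, leaf_area):
    """Estimate k in leaf_area ~= k * sapwood_area by least squares
    through the origin: k = sum(x*y) / sum(x*x)."""
    sxy = sum(x * y for x, y in zip(sapwood_area, leaf_area))
    sxx = sum(x * x for x in sapwood_area)
    return sxy / sxx
```

The review's caveat translates directly: if only the conductive portion of the cross-section scales with foliage, then `sapwood_area` must be the conductive area, which is hard to measure and varies with environment and ontogeny.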
Continued development of modeling tools and theory for RF heating
International Nuclear Information System (INIS)
1998-01-01
Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling Radio Frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project
Diffusion theory model for optimization calculations of cold neutron sources
International Nuclear Information System (INIS)
Azmy, Y.Y.
1987-01-01
Cold neutron sources are becoming increasingly important and common experimental facilities at many research reactors around the world, owing to the high utility of cold neutrons in scattering experiments. The authors describe a simple two-group diffusion model of an infinite-slab LD2 cold source. The simplicity of the model permits an analytical solution, from which one can deduce the reason for the optimum thickness based solely on diffusion-type phenomena. A second, more sophisticated model is also described and the results compared to a deterministic transport calculation. The good (particularly qualitative) agreement between the results suggests that diffusion theory methods can be used in parametric and optimization studies to avoid the generally more expensive transport calculations
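In slab geometry, a two-group model of the kind described couples a source group φ₁ to the cold group φ₂ through slowing-down; a standard form of the coupled diffusion equations (notation assumed here, not taken from the paper) is:

```latex
-D_1 \frac{d^2\phi_1}{dx^2} + \Sigma_{1}\,\phi_1 = S(x),
\qquad
-D_2 \frac{d^2\phi_2}{dx^2} + \Sigma_{a,2}\,\phi_2 = \Sigma_{1\to 2}\,\phi_1,
```

where Σ₁→₂ is the down-scattering cross section feeding the cold group. The optimum slab thickness arises from the competition between cold-neutron production (growing with thickness) and re-absorption and leakage losses.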
Lattice Gauge Theories Within and Beyond the Standard Model
Energy Technology Data Exchange (ETDEWEB)
Gelzer, Zechariah John [Iowa U.
2017-01-01
The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \\to \\pi \\ell \
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise and defined succession of events, in which every phase is characterized by a waiting time that includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.
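A minimal version of the device-sizing step is an M/M/c queue: choose the smallest device count c whose Erlang-C probability of waiting falls below a target. This generic sketch illustrates the queueing-theory component only, not the paper's full integrated risk model:

```python
def erlang_c(c, a):
    """Erlang-C probability that an arriving job must wait, for c servers
    and offered load a = lambda/mu erlangs (requires a < c)."""
    b = 1.0
    for n in range(1, c + 1):          # Erlang-B recursion
        b = a * b / (n + a * b)
    rho = a / c
    return b / (1.0 - rho * (1.0 - b))

def min_devices(lam, mu, target_wait_prob):
    """Smallest device count c giving a stable queue and
    P(wait) below the target."""
    a = lam / mu
    c = int(a) + 1                     # smallest c with a < c
    while erlang_c(c, a) >= target_wait_prob:
        c += 1
    return c
```

For example, with arrivals of 3 requests per hour, a service rate of 1 per device-hour, and a target waiting probability of 0.5, five devices are needed.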
Density Functional Theory and Materials Modeling at Atomistic Length Scales
Directory of Open Access Journals (Sweden)
Swapan K. Ghosh
2002-04-01
Full Text Available Abstract: We discuss the basic concepts of density functional theory (DFT) as applied to materials modeling in the microscopic, mesoscopic and macroscopic length scales. The picture that emerges is that of a single unified framework for the study of both quantum and classical systems. While for quantum DFT the central equation is a one-particle Schrödinger-like Kohn-Sham equation, the classical DFT consists of Boltzmann type distributions, both corresponding to a system of noninteracting particles in the field of a density-dependent effective potential, the exact functional form of which is unknown. One therefore approximates the exchange-correlation potential for quantum systems and the excess free energy density functional or the direct correlation functions for classical systems. Illustrative applications of quantum DFT to microscopic modeling of molecular interaction and that of classical DFT to a mesoscopic modeling of soft condensed matter systems are highlighted.
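The central quantum-DFT equation mentioned above is the one-particle Kohn-Sham equation, shown here in standard notation (atomic units):

```latex
\left[ -\tfrac{1}{2}\nabla^2 + v_{\mathrm{eff}}[\rho](\mathbf{r}) \right] \psi_i(\mathbf{r})
= \varepsilon_i \,\psi_i(\mathbf{r}),
\qquad
\rho(\mathbf{r}) = \sum_{i}^{\mathrm{occ}} |\psi_i(\mathbf{r})|^2,
```

where v_eff contains the external, Hartree and (approximated) exchange-correlation potentials, and the density is rebuilt from the occupied orbitals until self-consistency.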
Criticism of the Classical Theory of Macroeconomic Modeling
Directory of Open Access Journals (Sweden)
Konstantin K. Kumehov
2015-01-01
Full Text Available Abstract: Current approaches to and methods of modeling macroeconomic systems do not generate research ideas that can be used in applications. This is largely because the dominant economic schools and research directions build their theories on misconceptions about the economic system as an object of modeling, and share no common methodological approach to the design of macroeconomic models. All of them focus on building models aimed at establishing equilibrium parameters of supply and demand, production and consumption. At the same time, resource potential and society's needs for material and other benefits are not considered as underlying factors. In addition, there is no unity in the choice of elements or of the mechanisms of interaction between them: it has not been established what the criteria are for determining the elements of a model, whether they are institutions, industries, the population, banks, classes, etc. From a methodological point of view, all the most well-known authors design their models by extrapolating past states or past events. As a result, by the time a model is ready the situation has changed, and the past parameters underlying the model lose relevance, so that at best the researcher can only interpret events and parameters that will not recur in the future. In this paper, based on an analysis of the works of famous authors belonging to different schools and areas, the weaknesses of their proposed macroeconomic models are revealed that do not allow one to use them to solve applied problems of economic development. Fundamentally new approaches and methods are proposed by which it is possible to construct macroeconomic models that take into account the theoretical and applied aspects of modeling, and the basic methodological requirements are formulated.
Lavaud, Romain; LaPeyre, Megan K.; Casas, Sandra M.; Bacher, C.; La Peyre, Jerome F.
2017-01-01
We present a Dynamic Energy Budget (DEB) model for the eastern oyster, Crassostrea virginica, which enables the inclusion of salinity as a third environmental variable, on top of the standard food and temperature variables. Salinity changes have various effects on the physiology of oysters, potentially altering filtration and respiration rates, and ultimately impacting growth, reproduction and mortality. We tested different hypotheses as to how to include these effects in a DEB model for C. virginica. Specifically, we tested two potential mechanisms to explain changes in oyster shell growth (cm), tissue dry weight (g) and gonad dry weight (g) when salinity moves away from the ideal range: 1) a negative effect on filtration rate and 2) an additional somatic maintenance cost. Comparative simulations of shell growth, dry tissue biomass and dry gonad weight in two monitored sites in coastal Louisiana experiencing salinity from 0 to 28 were statistically analyzed to determine the best hypothesis. Model parameters were estimated through the covariation method, using literature data and a set of specifically designed ecophysiological experiments. The model was validated through independent field studies in estuaries along the northern Gulf of Mexico. Our results suggest that salinity impacts C. virginica’s energy budget predominantly through effects on filtration rate. With an overwhelming number of environmental factors impacting organisms, and increasing exposure to novel and extreme conditions, the mechanistic nature of the DEB model with its ability to incorporate more than the standard food and temperature variables provides a powerful tool to verify hypotheses and predict individual organism performance across a range of conditions.
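Hypothesis 1 above, a salinity effect on filtration rate, is the kind of mechanism that slots into a DEB model as a dimensionless multiplier on ingestion. The sketch below uses a piecewise-linear ramp; the functional form and every breakpoint value are hypothetical placeholders, not the study's fitted parameters.

```python
def salinity_correction(s, s_low=10.0, s_opt_lo=15.0, s_opt_hi=25.0, s_high=30.0):
    """Multiplier in [0, 1] applied to the DEB filtration/ingestion rate:
    1 inside an optimal salinity range, ramping linearly to 0 outside.
    All breakpoints here are illustrative, not fitted values."""
    if s <= s_low or s >= s_high:
        return 0.0
    if s < s_opt_lo:
        return (s - s_low) / (s_opt_lo - s_low)
    if s > s_opt_hi:
        return (s_high - s) / (s_high - s_opt_hi)
    return 1.0
```

In a DEB simulation this factor would multiply the ingestion flux alongside the usual temperature and food-density corrections.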
The Gaussian streaming model and convolution Lagrangian effective field theory
Energy Technology Data Exchange (ETDEWEB)
Vlah, Zvonimir [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA 94306 (United States); Castorina, Emanuele; White, Martin, E-mail: zvlah@stanford.edu, E-mail: ecastorina@berkeley.edu, E-mail: mwhite@berkeley.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States)
2016-12-01
We update the ingredients of the Gaussian streaming model (GSM) for the redshift-space clustering of biased tracers using the techniques of Lagrangian perturbation theory, effective field theory (EFT) and a generalized Lagrangian bias expansion. After relating the GSM to the cumulant expansion, we present new results for the real-space correlation function, mean pairwise velocity and pairwise velocity dispersion including counter terms from EFT and bias terms through third order in the linear density, its leading derivatives and its shear up to second order. We discuss the connection to the Gaussian peaks formalism. We compare the ingredients of the GSM to a suite of large N-body simulations, and show the performance of the theory on the low order multipoles of the redshift-space correlation function and power spectrum. We highlight the importance of a general biasing scheme, which we find to be as important as higher-order corrections due to non-linear evolution for the halos we consider on the scales of interest to us.
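For orientation, the GSM maps the real-space ingredients named above (the correlation function ξ, mean pairwise velocity v₁₂ and pairwise velocity dispersion σ₁₂²) into the redshift-space correlation function through a single Gaussian convolution; this is the standard form of the model, with r² = s⊥² + y² and μ = y/r:

```latex
1+\xi_s(s_\perp, s_\parallel)
= \int \frac{dy}{\sqrt{2\pi\,\sigma_{12}^2(r,\mu)}}\,
\bigl[1+\xi(r)\bigr]\,
\exp\!\left\{ -\frac{\bigl[s_\parallel - y - \mu\, v_{12}(r)\bigr]^2}{2\,\sigma_{12}^2(r,\mu)} \right\}.
```

The paper's contribution is to supply ξ, v₁₂ and σ₁₂² from Lagrangian perturbation theory with EFT counter terms and a generalized bias expansion.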
Non local theory of excitations applied to the Hubbard model
International Nuclear Information System (INIS)
Kakehashi, Y; Nakamura, T; Fulde, P
2010-01-01
We propose a nonlocal theory of single-particle excitations. It is based on an off-diagonal effective medium and the projection operator method for treating the retarded Green function. The theory determines the nonlocal effective-medium matrix elements by requiring that they are consistent with those of the self-energy of the Green function. This allows for a description of long-range intersite correlations with high resolution in momentum space. A numerical study of the half-filled Hubbard model on the simple cubic lattice demonstrates that the theory is applicable to the strong-correlation regime as well as the intermediate regime of Coulomb interaction strength. Furthermore, the results show that nonlocal excitations cause sub-bands in the strong Coulomb interaction regime due to strong antiferromagnetic correlations, decrease the quasi-particle peak on the Fermi level with increasing Coulomb interaction, and shift the critical Coulomb interaction U_C2 for the divergence of the effective mass towards higher energies, at least by a factor of two, as compared with the single-site approximation.
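The model under study is the single-band Hubbard Hamiltonian, in standard notation (t the nearest-neighbour hopping amplitude, U the on-site Coulomb interaction):

```latex
H = -t \sum_{\langle i,j\rangle,\sigma} c_{i\sigma}^{\dagger} c_{j\sigma}
+ U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}.
```

Half filling on the simple cubic lattice, as studied here, is the regime where antiferromagnetic correlations and the Mott transition compete most directly.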
A Well-Designed Budget Yields Long-Term Rewards.
Pinola, Mary; Knirk, Frederick G.
1984-01-01
Defines zero-based budgeting, compares it to traditional budgeting, and discusses five steps of a zero-base budget model: determining organization's goals and refining them into objectives; listing activities to achieve objectives in decision packages; evaluating decision packages; ranking packages by order of importance; and funding decision…
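The five steps above can be sketched as a small program. This is a hypothetical illustration of the ranking-and-funding logic, not anything from the article; all package names, costs, and priority scores are invented for the example.

```python
# Hypothetical sketch of the five zero-base budgeting steps described above.
# All names, costs, and priority scores are illustrative, not from the source.

def zero_base_budget(packages, available_funds):
    """Rank decision packages by priority and fund them until money runs out."""
    funded, remaining = [], available_funds
    # Steps 3-4: evaluate and rank packages (here: highest priority score first).
    for pkg in sorted(packages, key=lambda p: p["priority"], reverse=True):
        # Step 5: fund packages in rank order while funds remain.
        if pkg["cost"] <= remaining:
            funded.append(pkg["name"])
            remaining -= pkg["cost"]
    return funded, remaining

# Steps 1-2: objectives refined into decision packages (illustrative data).
packages = [
    {"name": "core instruction", "cost": 50_000, "priority": 10},
    {"name": "library upgrade",  "cost": 30_000, "priority": 7},
    {"name": "new signage",      "cost": 15_000, "priority": 3},
]
funded, left = zero_base_budget(packages, available_funds=70_000)
print(funded, left)  # ['core instruction', 'new signage'] 5000
```

Note how the second-ranked package is skipped when it no longer fits, while a cheaper lower-ranked one can still be funded — one defensible reading of funding "by order of importance" under a fixed total.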
Multiagent model and mean field theory of complex auction dynamics
Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng
2015-09-01
Recent years have witnessed growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics: lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large-bid-price region. To account for this distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean-field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behavior in the competitive environment exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
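The LUBA mechanic described above can be illustrated with a toy agent-based simulation. The bidding rule below (each agent favors low prices with an exponentially decaying preference) is an assumption for illustration only — the paper's actual model couples bids to the winner's attractiveness — but it reproduces the exponential tail of the bid distribution and the lowest-unique-bid winner rule.

```python
import math
import random
from collections import Counter

# Toy LUBA-like simulation. The exponential bidding preference is an
# illustrative assumption, not the paper's exact multi-agent model.
def simulate_luba(n_agents=1000, max_price=100, beta=0.1, seed=1):
    rng = random.Random(seed)
    prices = list(range(1, max_price + 1))
    # Each agent bids price p with probability proportional to exp(-beta * p),
    # giving the exponential decay seen in the large-bid-price region.
    weights = [math.exp(-beta * p) for p in prices]
    bids = rng.choices(prices, weights=weights, k=n_agents)
    counts = Counter(bids)
    # The lowest price chosen by exactly one agent wins the auction.
    unique = [p for p in prices if counts[p] == 1]
    return counts, (min(unique) if unique else None)

counts, winner = simulate_luba()
```

With many agents the low prices are heavily contested, so the winning bid is rarely the minimum price — the tension that makes LUBA strategically interesting.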
Budgeting Approaches in Community Colleges
Palmer, James C.
2014-01-01
Several budgeting approaches have been initiated as alternatives to the traditional, incremental process. These include formula budgeting; zero-base budgeting; planning, programming, and budgeting systems; and responsibility center budgeting. Each is premised on assumptions about how organizations might best make resource allocation decisions.…
Irreducible gauge theory of a consolidated Salam-Weinberg model
International Nuclear Information System (INIS)
Ne'eman, Y.
1979-01-01
The Salam-Weinberg model is derived by gauging an internal simple supergroup SU(2/1). The theory uniquely assigns the correct SU(2)_L × U(1) eigenvalues for all leptons, fixes θ_W = 30°, generates the W±_σ, Z⁰_σ and A_σ together with the Higgs-Goldstone I_L = 1/2 scalar multiplets as gauge fields, and imposes the standard spontaneous breakdown of SU(2)_L × U(1). The masses of intermediate bosons and fermions are directly generated by SU(2/1) universality, which also fixes the Higgs field coupling. (Auth.)
Ferromagnetism in the Hubbard model: a modified perturbation theory
International Nuclear Information System (INIS)
Gangadhar Reddy, G.; Ramakanth, A.; Nolting, W.
2005-01-01
We study the possibility of ferromagnetism in the Hubbard model using the modified perturbation theory. In this approach an Ansatz is made for the self-energy of the electron which contains the second-order contribution developed around the Hartree-Fock solution and two parameters. The parameters are fixed by using a moment method. This self-energy satisfies several known exact limiting cases. Using this self-energy, the Curie temperature T_c as a function of band filling n is investigated. It is found that T_c falls off abruptly as n approaches half filling. The results are in qualitative agreement with earlier calculations using other approximation schemes. (author)
Mean-field theory and self-consistent dynamo modeling
International Nuclear Information System (INIS)
Yoshizawa, Akira; Yokoi, Nobumitsu
2001-12-01
Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)
Morphing the Shell Model into an Effective Theory
International Nuclear Information System (INIS)
Haxton, W. C.; Song, C.-L.
2000-01-01
We describe a strategy for attacking the canonical nuclear structure problem--bound-state properties of a system of point nucleons interacting via a two-body potential--which involves an expansion in the number of particles scattering at high momenta, but is otherwise exact. The required self-consistent solutions of the Bloch-Horowitz equation for effective interactions and operators are obtained by an efficient Green's function method based on the Lanczos algorithm. We carry out this program for the simplest nuclei, d and ³He, in order to explore the consequences of reformulating the shell model as a controlled effective theory. (c) 2000 The American Physical Society
Lagrangian model of conformal invariant interacting quantum field theory
International Nuclear Information System (INIS)
Lukierski, J.
1976-01-01
A Lagrangian model of conformal-invariant interacting quantum field theory is presented. The interacting Lagrangian and free Lagrangian are derived by replacing the canonical field φ with the field operator Φ_d^c and introducing the conformal-invariant interaction Lagrangian. It is suggested that in the conformal-invariant QFT with the dimensionality α_B obtained from the bootstrap equation, the normalization constant c of the propagator and the coupling parameter y do not necessarily need to satisfy the relation x_B = y²c³.
Energy Technology Data Exchange (ETDEWEB)
1981-12-01
In 1982, the CEA budget will amount to 13.4 billion French francs. Its main characteristics are priority for employment and investment. In this budget, programs are adapted to align R and D with government policy: innovation, industrial valorization and fundamental research, especially thermonuclear fusion, and, in the electronuclear field, safety, reprocessing and radioactive waste management.
Budgeting in Nonprofit Organizations.
Kelly, Lauren
1985-01-01
This description of the role of budgets in nonprofit organizations uses libraries as an example. Four types of budgets--legislative, management, cash, and capital--are critiqued in terms of cost effectiveness, implementation, and facilitation of organizational control and objectives. (CLB)
Colorado Children's Budget 2010
Colorado Children's Campaign, 2010
2010-01-01
The "Children's Budget 2010" is intended to be a resource guide for policymakers and advocates who are interested in better understanding how Colorado funds children's programs and services. It attempts to clarify often confusing budget information and describe where the state's investment trends are and where those trends will lead the…
Colorado Children's Budget 2013
Buck, Beverly; Baker, Robin
2013-01-01
The "Colorado Children's Budget" presents and analyzes investments and spending trends during the past five state fiscal years on services that benefit children. The "Children's Budget" focuses mainly on state investment and spending, with some analysis of federal investments and spending to provide broader context of state…
International Nuclear Information System (INIS)
1964-01-01
A total Agency Budget of $10 406 000 for 1965 was approved by the General Conference at its session of September 1964; the Budget for the year 1964 amounted to $9 812 000. The consolidated Budget figures are shown in the table at the end of this article. The Budget falls into two parts - the Regular Budget and the Operational Budget. The Regular Budget provides for the ordinary administrative expenses of the Agency, and for expert panels, special missions, symposia and conferences, distribution of information, and scientific and technical services. In conformity with the Agency's Statute, these expenses are met by contributions made according to a scale of assessments. Voluntary contributions are paid initially into a General Fund established for this purpose, and money for operations is transferred to the respective Operating Funds as appropriate, and as approved by the Board of Governors. The scale of assessments for 1965 is based on the United Nations scale for 1964. The assessments are estimated to yield $7 713 000 - an increase of 6.8 per cent; however, more than three quarters of this increase will be offset by credits which Member States will receive as a result of a cash surplus brought forward. The Operational Budget is financed by voluntary contributions and is divided into two parts - Operating Fund I, devoted to certain laboratory and research projects, and Operating Fund II, for technical assistance, training and research contracts.
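The figures quoted in this record can be checked for internal consistency with a few lines of arithmetic. The computation below uses only the numbers stated above; the implied 1964 assessments figure is a back-calculation, not a figure from the record.

```python
# Consistency check on the figures quoted in the record (all values in USD).
total_1965 = 10_406_000
total_1964 = 9_812_000
assessments_1965 = 7_713_000

# Year-over-year growth of the total Agency Budget.
total_growth = (total_1965 - total_1964) / total_1964
print(f"total budget growth: {total_growth:.1%}")  # about 6.1%

# The record states the assessments rose 6.8 per cent;
# back out the implied 1964 assessments figure.
implied_1964_assessments = assessments_1965 / 1.068
print(f"implied 1964 assessments: ${implied_1964_assessments:,.0f}")  # about $7,222,000
```

So the assessed contributions grew slightly faster (6.8%) than the total budget (about 6.1%), consistent with the voluntarily funded Operational Budget growing more slowly.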
New Trends in Model Coupling Theory, Numerics and Applications
International Nuclear Information System (INIS)
Coquel, F.; Godlewski, E.; Herard, J. M.; Segre, J.
2010-01-01
This special issue comprises selected papers from the workshop New Trends in Model Coupling, Theory, Numerics and Applications (NTMC'09), which took place in Paris, September 2 - 4, 2009. The search for optimal technological solutions in a wide range of industrial systems requires performing numerical simulations of complex phenomena which are often characterized by the coupling of models related to various space and/or time scales. Thus, so-called multi-scale modelling has been a thriving scientific activity which connects applied mathematics and other disciplines such as physics, chemistry, biology or even social sciences. To illustrate the variety of fields in which model coupling naturally occurs, we may quote: meteorology, where it is necessary to take into account several turbulence scales or the interaction between oceans and atmosphere, but also regional models in a global description; solid mechanics, where a thorough understanding of complex phenomena such as the propagation of cracks requires coupling various models from the atomistic level to the macroscopic level; plasma physics for fusion energy, where dense plasmas and collisionless plasmas coexist; multiphase fluid dynamics, when several types of flow corresponding to several types of models are present simultaneously in complex circuits; and social behaviour analysis, with interaction between individual actions and collective behaviour. (authors)
Rigorously testing multialternative decision field theory against random utility models.
Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg
2014-06-01
Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
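The sequential-sampling idea behind MDFT can be sketched in a few lines: preference states for each option accumulate noisy, attention-weighted attribute evaluations until one crosses a threshold. The sketch below is deliberately minimal — it omits MDFT's contrast and feedback matrices, and all options, attribute values, and parameters are illustrative assumptions, not the model or stimuli from the studies.

```python
import random

# Minimal sequential-sampling sketch in the spirit of MDFT. Preferences for
# each option accumulate noisy attention-weighted attribute values until one
# crosses a threshold. Omits MDFT's contrast and feedback matrices; all
# parameters and option values are illustrative.
def sample_choice(options, threshold=5.0, noise=0.5, seed=7):
    """options: dict mapping option name -> list of attribute values."""
    rng = random.Random(seed)
    pref = {name: 0.0 for name in options}
    n_attrs = len(next(iter(options.values())))
    while True:
        # Attention fluctuates stochastically between attributes each step.
        a = rng.randrange(n_attrs)
        for name, values in options.items():
            pref[name] += values[a] + rng.gauss(0.0, noise)
        leader = max(pref, key=pref.get)
        if pref[leader] >= threshold:
            return leader

# Two "extreme" options and one compromise, each scored on two attributes.
choice = sample_choice({"A": [1.0, 0.2], "B": [0.2, 1.0], "C": [0.6, 0.6]})
```

Because attention wanders between attributes during deliberation, which option wins can depend on the composition of the choice set — the mechanism by which models of this family produce the context effects reported in Study 2.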