Magnetization of the canted antiferromagnetic CoCO 3 in Abragam-Pryce approximation
Meshcheryakov, V. F.
2006-05-01
Weiss molecular field theory was used to calculate the magnetization of the canted antiferromagnet CoCO3 (TN = 18.1 K). Wave functions of the magnetic doublets near the Co2+ ground state were determined in the Abragam-Pryce approximation. One of the crystal field variables of the free Co2+ ion, the isotropic exchange interactions within and between the magnetic sublattices, and the rotation angle ϕ characterizing the nonequivalence of the Co2+ ion positions were used as parameters. From comparison with the experimental data, the exchange interaction anisotropy and the g-factors g∥ and g⊥ were obtained. At low temperatures (T < 40 K) the agreement between calculated and experimental results is good, and the g-factor values are almost the same as those obtained from EPR data on Co2+(1%)+CdCO3 single crystals. At high temperatures, in the paramagnetic region, the experimental data differ from the calculated ones by more than a factor of two. It is shown that this discrepancy cannot be explained within the approximations used.
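In its simplest isotropic form, the Weiss molecular-field treatment mentioned in the abstract amounts to a self-consistency condition for each sublattice magnetization. The following generic sketch (with molecular-field constants λ introduced purely for illustration; it is not the authors' full anisotropic two-sublattice calculation) shows the structure:

$$ M_i = M_0\, B_S\!\left(\frac{g\mu_B S\,(H + \lambda_{ii} M_i + \lambda_{ij} M_j)}{k_B T}\right), \qquad i \neq j, $$

where $B_S$ is the Brillouin function, $\lambda_{ii}$ and $\lambda_{ij}$ are the intra- and inter-sublattice molecular-field constants, and $H$ is the applied field. Solving the coupled equations for $M_1$ and $M_2$ self-consistently yields the net magnetization as a function of temperature.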
CERN Press Office. Geneva
1987-01-01
The CERN Council met in Geneva (Switzerland); Josef Rembser elected as President of the Council; Nobel laureate Carlo Rubbia appointed as the next Director General (1989-1993); the Abragam Committee presents its report.
Juel-Christiansen, Carsten
2005-01-01
The article highlights visual rotation - images, drawings, models, works - as the privileged medium in the communication of ideas between creative architects.
Spädtke, P
2013-01-01
Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary as well as magnetic fields caused by coils or permanent magnets have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources will be shown together with suitable models to describe their physics. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$ sources), together with some remarks on beam transport.
This last volume in the series of textbooks on environmental isotopes in the hydrological cycle provides an overview of the basic principles of existing conceptual formulations of modelling approaches. While some of the concepts provided in Chapter 2 and Chapter 3 are of general validity for quantitative interpretation of isotope data, the modelling methodologies commonly employed for incorporating isotope data into evaluations specifically related to groundwater systems are given in this volume together with some illustrative examples. Development of conceptual models for quantitative interpretations of isotope data in hydrogeology and the assessment of their limitations and field verification has been given priority in the research and development efforts of the IAEA during the last decade. Several Co-ordinated Research Projects on this specific topic were implemented and results published by the IAEA. Based on these efforts and contributions made by a number of scientists involved in this specific field, the IAEA has published two Technical Documents entitled ''Mathematical models and their applications to isotope studies in groundwater studies -- IAEA TECDOC-777, 1994'' and ''Manual on Mathematical models in isotope hydrogeology -- IAEA TECDOC-910, 1996''. Results of a recently completed Co-ordinated Research Project by the IAEA entitled ''Use of isotopes for analysis of flow and transport dynamics in groundwater systems'' will also soon be published by the IAEA. This is the reason why the IAEA was involved in the co-ordination required for preparation of this volume; the material presented is a condensed overview prepared by some of the scientists that were involved in the above cited IAEA activities. This volume VI, providing such an overview, was included in the series to make the series self-sufficient in its coverage of the field of Isotope Hydrology. A special chapter on the methodologies and concepts related to geochemical modelling in groundwater...
Muller, Pierre-Alain; Fondement, Frédéric; Baudry, Benoit
2009-01-01
Model-driven engineering and model-based approaches have permeated all branches of software engineering; to the point that it seems that we are using models, as Molière's Monsieur Jourdain was using prose, without knowing it. At the heart of modeling, there is a relation that we establish to represent something by something else. In this paper we review various definitions of models and relations between them. Then, we define a canonical set of relations that can be used to express various ki...
Muller, Pierre-Alain; Fondement, Frédéric; Baudry, Benoit; Combemale, Benoit
2012-01-01
Model-driven engineering and model-based approaches have permeated all branches of software engineering to the point that it seems that we are using models, as Molière's Monsieur Jourdain was using prose, without knowing it. At the heart of modeling, there is a relation that we establish to represent something by something else. In this paper we review various definitions of models and relations between them. Then, we define a canonical set of relations that can be used to express various kin...
Blouin A.; Combemale B.; Baudry B.; Beaudoux O.
2011-01-01
Among model comprehension tools, model slicers extract a subset from a model for a specific purpose, letting modelers rapidly gather relevant knowledge from large models. However, existing slicers are dedicated to one modeling language. This is an issue when we observe that new domain specific modeling languages (DSMLs), for which we want slicing abilities, are created almost on a daily basis. This paper proposes the Kompren l...
Onatski, Alexei; Williams, Noah
2003-01-01
Recently there has been much interest in studying monetary policy under model uncertainty. We develop methods to analyze different sources of uncertainty in one coherent structure useful for policy decisions. We show how to estimate the size of the uncertainty based on time series data, and incorporate this uncertainty in policy optimization. We propose two different approaches to modeling model uncertainty. The first is model error modeling, which imposes additional structure on the errors o...
Baden-Fuller, C.; Morgan, M S
2010-01-01
Drawing on research undertaken in the history and philosophy of science, with particular reference to the extensive literature which discusses the use of models in biology and economics, we explore the question ‘Are Business Models useful?’ We point out that they act as various forms of model: to provide means to describe and classify businesses; to operate as sites for scientific investigation; and to act as recipes for creative managers. We argue that studying business models as models is r...
Model Validation and Model Error Modeling
Ljung, Lennart
1999-01-01
To validate an estimated model and to have a good understanding of its reliability is a central aspect of System Identification. This contribution discusses these aspects in the light of model error models that are explicit descriptions of the model error. A model error model is implicitly present in most model validation methods, so the concept is more of a representation form than a set of new techniques. Traditional model validation is essentially a test of whether the confidence region of...
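As a concrete illustration of residual-based model validation in system identification (a generic sketch in the spirit of the discussion above, not Ljung's specific model error model construction; system, coefficients and thresholds are assumed for the example):

```python
import numpy as np

# Sketch of a model-error check via residual whiteness: fit a model,
# then inspect whether the residuals look like white noise.
rng = np.random.default_rng(0)

# Simulate a first-order system y[t] = 0.8*y[t-1] + u[t-1] + noise
N = 500
u = rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.8 * y[t - 1] + u[t - 1] + 0.1 * rng.standard_normal()

# Estimate the two coefficients by least squares
X = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)

# Residuals of the estimated model; for a good model structure they
# should be close to white (small autocorrelation at nonzero lags).
eps = y[1:] - X @ theta
ac1 = np.corrcoef(eps[:-1], eps[1:])[0, 1]
print(theta, ac1)
```

A large lag-1 autocorrelation of the residuals would signal an unmodeled dynamic, i.e. a non-trivial model error model.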
Poulsen, Helle
1996-01-01
This paper presents a functional modelling method called Actant Modelling rooted in linguistics and semiotics. Actant modelling can be integrated with Multilevel Flow Modelling (MFM) in order to give an interpretation of actants.
Anaïs Schaeffer
2012-01-01
By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. Average transverse momentum (pT) as a function of rapidity loss ∆y: black dots represent LHCf data and the red diamonds represent SPS experiment UA7 results. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99). Among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m on either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known...
Cameron, Ian; Gani, Rafiqul
2011-01-01
This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards “fit-for-purpose” models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very...
Daniel J Kliebenstein
2012-01-01
Models of myriad forms are rapidly becoming central to biology. This ranges from statistical models that are fundamental to the interpretation of experimental results to ODE models that attempt to describe the results in a mechanistic format. Models will be more and more essential to biologists but this growing importance requires all model users to become more sophisticated about what is in a model and how that limits the usability of the model. This review attempts to relay the potential pi...
Li, Qin; Zhao, Yongxin; Wu, Xiaofeng; Liu, Si
There can be multitudinous models specifying aspects of the same system. Each model has a bias towards one aspect. These models often overlap in specific aspects, though they have different expressions. A specification written in one model can be refined by introducing additional information from other models. The paper proposes a concept of promoting models, which is a methodology to obtain refinements with support from cooperating models. It refines a primary model by integrating the information from a secondary model. The promotion principle is not merely an academic point, but also a reliable and robust engineering technique which can be used to develop software and hardware systems. It can also check the consistency between two specifications from different models. A case of modeling a simple online shopping system with the cooperation of the guarded design model and CSP model illustrates the practicability of the promotion principle.
Anyone who worries that physicists are running out of interesting challenges to tackle and important problems to solve should read the two very different feature articles in this issue. In 'Climate change: complexity in action', Klaus Hasselmann and colleagues write about the challenges of including economic and political dimensions in computer simulations of climate change. It is hard to imagine a physics-based topic that has a greater impact on the world at large. In 'Quarks, diquarks and pentaquarks', Robert Jaffe and Frank Wilczek describe our current understanding of quantum chromodynamics and the strong nuclear force. In this case it is hard to think of many more difficult problems in fundamental physics. Traditional climate modelling is difficult enough because a whole range of effects in the atmosphere and the oceans have to be taken into account. It typically takes weeks for a state-of-the-art supercomputer to simulate 100 years of climate change with a horizontal resolution of 100 km. But climate change is about much more than solving difficult differential equations - there are crucial social, political and economic influences as well. Some researchers, including a significant number of physicists, have started to look at this integrated-assessment approach. The first challenge is to develop climate models that take minutes to run on a laptop. The next challenge is to develop analogous models that work in the social, political and economic arenas - which is not a trivial task - and then integrate all these different models and explore all the possible global-warming scenarios. Physicists also hope to integrate quantum chromodynamics (QCD) into the larger framework of a so-called theory of everything. Like climate modellers, particle theorists working on QCD require enormous computational resources for their calculations, and even then there are limits to what can be achieved (e.g. the mass of the proton has yet to be calculated from first principles).
(no author listed)
2003-01-01
This paper puts forward a new concept: the model warehouse. It analyzes the reasons why the model warehouse has appeared and introduces the characteristics and architecture of the model warehouse. Finally, the paper points out that the model warehouse is an important part of WebGIS.
Sales-Cruz, Mauricio; Piccolo, Chiara; Heitzig, Martina;
2011-01-01
This chapter presents various types of constitutive models and their applications. There are 3 aspects dealt with in this chapter, namely: creation and solution of property models, the application of parameter estimation and finally application examples of constitutive models. A systematic procedure is introduced for the analysis and solution of property models. Models that capture and represent the temperature dependent behaviour of physical properties are introduced, as well as equation of state (EOS) models such as the SRK EOS. Modelling of liquid phase activity coefficients is also covered, illustrating several models such as the Wilson equation and NRTL equation, along with their solution strategies. A section shows how to use experimental data to regress the property model parameters using a least squares approach. A full model analysis is applied in each example that discusses...
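The least-squares regression of property-model parameters can be sketched on the Antoine vapour-pressure equation, a common temperature-dependent property model (the coefficients below are assumed, water-like values, and the data points are synthetic, generated from those coefficients rather than measured):

```python
import numpy as np

# Regress Antoine parameters: log10(P) = A - B / (T + C).
A_true, B_true, C_true = 8.07, 1730.6, 233.4   # assumed coefficients
T = np.linspace(20.0, 100.0, 9)                 # temperatures in degC
logP = A_true - B_true / (T + C_true)           # synthetic "data"

# With C held fixed, the model is linear in A and B: logP = A - B * x
x = 1.0 / (T + C_true)
M = np.column_stack([np.ones_like(x), -x])
(A_est, B_est), *_ = np.linalg.lstsq(M, logP, rcond=None)
print(A_est, B_est)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; with real data, the residuals would be inspected as part of the model analysis the chapter describes.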
Katerina Simons
1997-01-01
Modern finance would not have been possible without models. Increasingly complex quantitative models drive financial innovation and the growth of derivatives markets. Models are necessary to value financial instruments and to measure the risks of individual positions and portfolios. Yet when used inappropriately, the models themselves can become an important source of risk. Recently, several well-publicized instances occurred of institutions suffering significant losses attributed to model er...
M Batty
2007-01-01
The term 'model' is now central to our thinking about how we understand and design cities. We suggest a variety of ways in which we use 'models', linking these ideas to Abercrombie's exposition of Town and Country Planning which represented the state of the art fifty years ago. Here we focus on using models as physical representations of the city, tracing the development of symbolic models where the focus is on simulating how function generates form, to iconic models where the focus is on representing...
Yost, S.A.
1991-05-01
Random matrix models based on an integral over supermatrices are proposed as a natural extension of bosonic matrix models. The subtle nature of superspace integration allows these models to have very different properties from the analogous bosonic models. Two choices of integration slice are investigated. One leads to a perturbative structure which is reminiscent of, and perhaps identical to, the usual Hermitian matrix models. Another leads to an eigenvalue reduction which can be described by a two component plasma in one dimension. A stationary point of the model is described.
Larsen, Lars Bjørn; Vesterager, Johan
This report provides an overview of the existing models of global manufacturing, describes the required modelling views and associated methods, and identifies tools which can provide support for this modelling activity. The model adopted for global manufacturing is that of an extended enterprise. One or more units from beyond the network may complement the extended enterprise. The common reference model for this extended enterprise will utilise GERAM (Generalised Enterprise Reference Architecture and Methodology) to provide an architectural framework for the modelling carried out within the...
Contributions to the workshop 'Geochemical modeling' from 19 to 20 September 1990 at the Karlsruhe Nuclear Research Centre. The report contains the programme and a selection of the lectures held at the workshop 'Geochemical modeling'. (BBR)
This presentation presented information on entrainment models. Entrainment models use entrainment hypotheses to express the continuity equation. The advantage is that plume boundaries are known. A major disadvantage is that the problems that can be solved are rather simple. The ...
Fox, Mark S.; Gruninger, Michael
1998-01-01
To remain competitive, enterprises must become increasingly agile and integrated across their functions. Enterprise models play a critical role in this integration, enabling better designs for enterprises, analysis of their performance, and management of their operations. This article motivates the need for enterprise models and introduces the concepts of generic and deductive enterprise models. It reviews research to date on enterprise modeling and considers in detail the Toronto virtual ent...
Jongerden, M.R.; Haverkort, B.R.
2008-01-01
The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However, with these models one can only compute lifetimes for specific discharge profiles, and not for workloads in general. In this paper, we give an overview of the different battery models that are availabl...
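One of the simplest analytical battery models of the kind such overviews compare is Peukert's law, which relates discharge time to discharge current. The sketch below is an illustration of that family of models under assumed parameter values, not a model taken from the paper:

```python
# Battery lifetime under constant-current discharge via Peukert's law.
def peukert_lifetime(capacity_ah, current_a, k=1.2, rated_current_a=1.0):
    """Discharge time in hours for a constant current.

    capacity_ah: rated capacity at the rated discharge current.
    k: Peukert exponent (k = 1 would be an ideal battery whose
       capacity is independent of the discharge rate).
    """
    return (capacity_ah / rated_current_a) * (rated_current_a / current_a) ** k

t1 = peukert_lifetime(10.0, 1.0)   # at rated current: 10 h
t2 = peukert_lifetime(10.0, 2.0)   # doubled current: less than 5 h
print(t1, t2)
```

The super-linear lifetime loss at high currents (t2 < 5 h rather than exactly half of t1) is precisely the rate-dependence that more elaborate battery models, such as kinetic or electrochemical models, capture for arbitrary workloads.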
Turner, Raymond
2009-01-01
Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al
Frampton, Paul H.
1997-01-01
In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly...
In this work the most recent magnetospheric models are reviewed. After a short overview of the particle environment, a synthetic survey of the problem is given. For each feature of magnetospheric modelling (boundary, current sheet, ring-current) the approaches used by different authors are described. In the second part a description is given of the magnetospheric models, divided into four groups. In the last part, the different uses of magnetospheric models are illustrated by means of examples
Phoenix (formerly referred to as the Second Generation Model or SGM) is a global general equilibrium model designed to analyze energy-economy-climate related questions and policy implications in the medium- to long-term. This model disaggregates the global economy into 26 industr...
Sclütter, Flemming; Frigaard, Peter; Liu, Zhou
This report presents the model test results on wave run-up on the Zeebrugge breakwater under the simulated prototype storms. The model test was performed in January 2000 at the Hydraulics & Coastal Engineering Laboratory, Aalborg University. The detailed description of the model is given in...
Ravn, Anders P.; Staunstrup, Jørgen
1994-01-01
This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two. The model describes both functional and timing properties of an interface...
Hydrological models are mediating models
Babel, L. V.; Karssenberg, D.
2013-08-01
Despite the increasing role of models in hydrological research and decision-making processes, only few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It results from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction would deserve closer attention, as they are rarely explicitly presented in peer-reviewed literature. We believe that devoting...
Selén, Yngve
2004-01-01
Before using a parametric model one has to be sure that it offers a reasonable description of the system to be modeled. If a bad model structure is employed, the obtained model will also be bad, no matter how good the parameter estimation method is. There exist many possible ways of validating candidate models. This thesis focuses on one of the most common ways, i.e., the use of information criteria. First, some common information criteria are presented, and in the later chapters, various ext...
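Order selection with information criteria can be shown on a small synthetic example (AIC and BIC in their common Gaussian forms; the polynomial data below are invented for illustration and are not from the thesis):

```python
import numpy as np

# Generate data from a quadratic model plus noise, then score
# candidate polynomial orders with AIC and BIC.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 60)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.standard_normal(x.size)

def aic_bic(order):
    coef = np.polyfit(x, y, order)
    rss = float(np.sum((np.polyval(coef, x) - y) ** 2))
    n, k = x.size, order + 1
    aic = n * np.log(rss / n) + 2 * k          # Akaike
    bic = n * np.log(rss / n) + k * np.log(n)  # Bayesian/Schwarz
    return aic, bic

bics = [aic_bic(m)[1] for m in range(6)]
best = int(np.argmin(bics))
print(best)
```

BIC penalizes extra parameters more heavily than AIC for n > 7, so it tends to pick the more parsimonious structure; here the criterion correctly rejects the under-fitted orders 0 and 1.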
Stubkjær, Erik
2005-01-01
Modeling is a term that refers to a variety of efforts, including data and process modeling. The domain to be modeled may be a department, an organization, or even an industrial sector. E-business presupposes the modeling of an industrial sector, a substantial task. Cadastral modeling compares to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders... to land. The paper advances the position that cadastral modeling has to include not only the physical objects, agents, and information sets of the domain, but also the objectives or requirements of cadastral systems.
Bois, Frederic Y; Brochot, Céline
2016-01-01
Pharmacokinetics is the study of the fate of xenobiotics in a living organism. Physiologically based pharmacokinetic (PBPK) models provide realistic descriptions of xenobiotics' absorption, distribution, metabolism, and excretion processes. They model the body as a set of homogeneous compartments representing organs, and their parameters refer to anatomical, physiological, biochemical, and physicochemical entities. They offer a quantitative mechanistic framework to understand and simulate the time-course of the concentration of a substance in various organs and body fluids. These models are well suited for performing extrapolations inherent to toxicology and pharmacology (e.g., between species or doses) and for integrating data obtained from various sources (e.g., in vitro or in vivo experiments, structure-activity models). In this chapter, we describe the practical development and basic use of a PBPK model from model building to model simulations, through implementation with an easily accessible free software. PMID:27311461
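The simplest member of the pharmacokinetic model family described above is a one-compartment model with first-order elimination; PBPK models extend this by adding physiological compartments and mechanistic parameters. The sketch below uses illustrative parameter values, not values from the chapter:

```python
import numpy as np

# One-compartment PK model after an IV bolus: dC/dt = -ke * C,
# integrated with a simple Euler scheme.
def simulate_concentration(dose_mg, vd_l, ke_per_h, t_end_h=24.0, dt_h=0.01):
    c = dose_mg / vd_l                      # initial concentration (mg/L)
    times = np.arange(0.0, t_end_h, dt_h)
    conc = np.empty_like(times)
    for i in range(times.size):
        conc[i] = c
        c += -ke_per_h * c * dt_h           # first-order elimination
    return times, conc

t, c = simulate_concentration(dose_mg=100.0, vd_l=50.0, ke_per_h=0.1)
print(c[0])   # 2.0 mg/L at t = 0
```

A full PBPK model replaces the single volume of distribution with organ compartments linked by blood flows, but the time-course machinery is the same kind of ODE integration.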
This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. The validity and limitations of three distinct types of modelling codes, which will be contrasted, include discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs
Model choice versus model criticism
Robert, Christian P.; Mengersen, Kerrie; Chen, Carla
2009-01-01
The new perspectives on ABC and Bayesian model criticisms presented in Ratmann et al.(2009) are challenging standard approaches to Bayesian model choice. We discuss here some issues arising from the authors' approach, including prior influence, model assessment and criticism, and the meaning of error in ABC.
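The ABC machinery under discussion can be illustrated with a minimal rejection sampler on a toy problem (a generic sketch of ABC itself, not of Ratmann et al.'s criticism method; the model, prior and tolerance are assumptions of the example):

```python
import numpy as np

# Rejection ABC: infer the mean of a normal model by comparing a
# summary statistic (the sample mean) of simulated and observed data.
rng = np.random.default_rng(2)
observed = rng.normal(3.0, 1.0, size=100)
s_obs = observed.mean()

accepted = []
for _ in range(20000):
    theta = rng.uniform(-10.0, 10.0)           # draw from a flat prior
    sim = rng.normal(theta, 1.0, size=100)     # simulate from the model
    if abs(sim.mean() - s_obs) < 0.1:          # accept if summaries agree
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
print(posterior_mean)
```

The accepted draws approximate the posterior; model criticism in the ABC setting then asks whether the retained simulations can reproduce the observed summaries at all, which is where the notion of "error" discussed above enters.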
H. Yang
1999-11-04
The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future.
Bækgaard, Lars
2001-01-01
The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can... temporarily from bookcases to borrowers. When we characterize events as change agents we focus on concepts like transactions, entity processes, and workflow processes.
This paper is an introductory course in modelling turbulent thermohydraulics, aimed at computational fluid dynamics users. No specific knowledge other than the Navier-Stokes equations is required beforehand. Chapter I (which readers who are not beginners can skip) provides basic ideas on turbulence physics and is taken up in a textbook prepared by the teaching team of the ENPC (Benque, Viollet). Chapter II describes turbulent-viscosity-type modelling and the two-equation k-ε model. It provides details of the channel flow case and the boundary conditions. Chapter III describes the 'standard' Reynolds stress transport model (Rij-ε) and introduces more recent models termed 'realisable'. A second paper deals with heat transfer and the effects of gravity, and returns to the Reynolds stress transport model. (author)
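For orientation, the eddy-viscosity closure at the heart of the two-equation k-ε modelling described in Chapter II takes the standard form (quoted from the general turbulence literature, not from the course notes themselves):

$$ -\overline{u_i' u_j'} = \nu_t\left(\frac{\partial U_i}{\partial x_j} + \frac{\partial U_j}{\partial x_i}\right) - \frac{2}{3}\,k\,\delta_{ij}, \qquad \nu_t = C_\mu \frac{k^2}{\varepsilon}, $$

with the conventional constant $C_\mu \approx 0.09$; separate transport equations for the turbulent kinetic energy $k$ and its dissipation rate $\varepsilon$ close the system.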
Braby, L.A.
1990-09-01
The biological effects of ionizing radiation exposure are the result of a complex sequence of physical, chemical, biochemical, and physiological interactions. One way to begin a search for an understanding of health effects of radiation is through the development of phenomenological models of the response. Many models have been presented and tested in the slowly evolving process of characterizing cellular response. A range of models covering different endpoints and phenomena has developed in parallel. Many of these models employ similar assumptions about some underlying processes while differing about the nature of others. An attempt is made to organize many of the models into groups with similar features and to compare the consequences of those features with the actual experimental observations. It is assumed that by showing that some assumptions are inconsistent with experimental observations, the job of devising and testing mechanistic models can be simplified. 43 refs., 13 figs.
Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens
2011-01-01
In this report a new turbulence model is presented. In contrast to the bulk of modern work, the model is a classical continuum model with a relatively simple constitutive equation. The constitutive equation is, as usual in continuum mechanics, entirely empirical. It has the usual Newton or Stokes...... term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence....... The model is in a virgin state, but a number of numerical tests have been carried out with good results. It is published to encourage other researchers to study the model in order to find its merits and possible limitations....
Blomhøj, Morten
2004-01-01
Developing competences for setting up, analysing and criticising mathematical models is normally seen as relevant only at and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical...... modelling, however, can be seen as a practice of teaching that places the relation between real life and mathematics at the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive...... roots for the construction of important mathematical concepts. In addition, competences for setting up, analysing and criticising modelling processes and the possible use of models is a formative aim in its own right for mathematics teaching in general education. The paper presents a theoretical...
2016-01-01
This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.
Sochůrková, Adéla
2012-01-01
The aim of this thesis is to compile an overview of inventory management methods, describe their principles, and assess the appropriateness of their use. The introductory part of the work, "The nature and importance of inventory management", briefly describes inventory management, the main objectives of inventory control models, the basic classification of inventory types, and the costs of supply. The following chapter, "Overview of inventory control models", includes a breakdown of models from dif...
Epstein, Joshua M.
2008-01-01
This address treats some enduring misconceptions about modeling. One of these is that the goal is always prediction. The lecture distinguishes between explanation and prediction as modeling goals, and offers sixteen reasons other than prediction to build a model. It also challenges the common assumption that scientific theories arise from and 'summarize' data, when often, theories precede and guide data collection; without theory, in other words, it is not clear what data to collect. Among ot...
Marco Antonio Moreira
1996-12-01
The mental models subject is presented particularly in the light of Johnson-Laird's theory. Views from different authors are also presented, but the emphasis lies on Johnson-Laird's approach, proposing mental models as a third path in the images vs. propositions debate. In this perspective, the nature, content, and typology of mental models are discussed, as well as the issues of consciousness and computability. In addition, the methodology of the research studies is provided. Essentially, the aim of the paper is to provide an introduction to the mental models topic, having science education research in mind.
Liu, Zhou; Frigaard, Peter
This report presents the model on wave run-up and run-down on the Zeebrugge breakwater under short-crested oblique wave attacks. The model test was performed in March-April 2000 at the Hydraulics & Coastal Engineering Laboratory, Aalborg University.
Vestergaard, Kristian
engineers, but as the scale and the complexity of the hydraulic works increased, the mathematical models became so complex that a mathematical solution could not be obtained. This created a demand for new methods and again the experimental investigation became popular, but this time as measurements on small-scale models. But still the scale and complexity of hydraulic works were increasing, and soon even small-scale models reached a natural limit for some applications. In the meantime the modern computer was developed, and it became possible to solve complex mathematical models by use of computer-based numerical...
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as outputted from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. The purposes of Revision 01 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a). Specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1), and the downstream applicability of the model results (i.e. wall heat fractions) to initialize post
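The relation between heat removal and wall heat fraction described above is simple bookkeeping; the sketch below states it in code with an invented heat-removal value (the real model derives these fractions, as functions of time and position, from an ANSYS heat-transfer simulation):

```python
# Sketch of the ventilation heat-removal bookkeeping described above.
# The 0.7 below is an invented illustrative value, not a model result.

def wall_heat_fraction(heat_removal):
    """One minus the ventilation heat removal: the fraction of decay heat
    conducted into the surrounding rock mass."""
    if not 0.0 <= heat_removal <= 1.0:
        raise ValueError("heat removal must be a fraction in [0, 1]")
    return 1.0 - heat_removal

# If ventilation air carries away 70% of the decay heat at some time and
# location, 30% is conducted into the rock:
print(round(wall_heat_fraction(0.7), 2))  # 0.3
```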
Modeling Documents with Event Model
Longhui Wang
2015-08-01
Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way in which the brain deals with images and speech. In the field of NLP, a topic model is one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning is to learn the semantics of words based on deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.
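The abstract does not specify the linear mapping, so the following is only a generic stand-in for the idea of a linear document representation over learned word vectors: average the word embeddings and apply a fixed projection matrix. The vocabulary, dimensions, and matrix `W` are all hypothetical.

```python
import numpy as np

# Hypothetical illustration of a linear document representation:
# mean of word embeddings followed by a fixed linear projection W.
# The embeddings here are random placeholders for learned vectors.

rng = np.random.default_rng(0)
embeddings = {w: rng.normal(size=8) for w in ["event", "model", "word"]}
W = rng.normal(size=(3, 8))  # linear projection to a 3-dimensional space

def doc_vector(tokens):
    """Represent a document as a low-dimensional vector by a linear map."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    mean = np.mean(vecs, axis=0)   # linear pooling of word vectors
    return W @ mean                # linear projection to low dimension

v = doc_vector(["event", "model"])
print(v.shape)  # (3,)
```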
Poortman, Sybilla; Sloep, Peter
2006-01-01
"Educational models" describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in teachers' motivation for using and sharing learning objects. This document is aimed at teachers and educational designers.
Højgaard, Tomas; Hansen, Rune
2016-01-01
The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful to...
Jantzen, Jan
1998-01-01
A neural network can approximate a function, but it is impossible to interpret the result in terms of natural language. The fusion of neural networks and fuzzy logic in neurofuzzy models provides learning as well as readability. Control engineers find this useful, because the models can be...
Giandomenico, Rossano
2006-01-01
The model determines a stochastic continuous process as the continuous limit of a stochastic discrete process, so as to show that the stochastic discrete process converges to the stochastic continuous process such that we can integrate it. Furthermore, the model determines the expected volatility and the expected mean, so as to show that the volatility and the mean are increasing functions of time.
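The abstract does not name the process, so the sketch below illustrates the discrete-to-continuous idea with arithmetic Brownian motion, an assumption on my part: the discrete scheme X_{k+1} = X_k + mu*dt + sigma*sqrt(dt)*Z has the continuous limit dX = mu dt + sigma dW, whose mean mu*t and variance sigma^2*t are both increasing in t, as the abstract states.

```python
import random
import statistics

# Discrete process whose continuous limit (dt -> 0) is arithmetic
# Brownian motion. Parameters mu, sigma are illustrative assumptions.

def simulate(mu, sigma, t, steps, rng):
    dt = t / steps
    x = 0.0
    for _ in range(steps):
        x += mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
    return x

rng = random.Random(42)
samples = [simulate(0.1, 0.2, t=1.0, steps=200, rng=rng)
           for _ in range(2000)]
# Sample mean should be close to mu*t = 0.1, sample variance close to
# sigma^2 * t = 0.04, both increasing in t.
print(round(statistics.mean(samples), 2))
print(round(statistics.variance(samples), 2))
```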
Løssing, Ulrik
1986-01-01
Ulrik Løssing has edited, illustrated and translated "Scribe Modeller System, Sheffield, November 1985" by the authors Cedric Green, David Cooper and John Wells.
Gøtze, Jens Peter; Krentz, Andrew
2014-01-01
In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...
Kindler, Ekkart
2009-01-01
There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts, these...... notations have been extended in order to increase expressiveness and to be more competitive. This resulted in an increasing number of notations and formalisms for modelling business processes and in an increase of the different modelling constructs provided by modelling notations, which makes it difficult...... to compare modelling notations and to make transformations between them. One of the reasons is that, in each notation, the new concepts are introduced in a different way by extending the already existing constructs. In this chapter, we go the opposite direction: We show that it is possible to add...
Two dimensional IR-FID-CPMG acquisition and adaptation of a maximum entropy reconstruction
Rondeau-Mouro, C.; Kovrlija, R.; Van Steenberge, E.; Moussaoui, S.
2016-04-01
By acquiring the FID signal in two-dimensional TD-NMR spectroscopy, it is possible to characterize mixtures or complex samples composed of solid and liquid phases. We have developed a new sequence for this purpose, called IR-FID-CPMG, making it possible to correlate spin-lattice T1 and spin-spin T2 relaxation times, including both liquid and solid phases in samples. We demonstrate here the potential of a new algorithm for the 2D inverse Laplace transformation of IR-FID-CPMG data, based on an adapted reconstruction of the maximum entropy method that combines the standard decreasing exponential decay function with an additional term drawn from Abragam's FID function. The results show that the proposed IR-FID-CPMG sequence and its related inversion model allow accurate characterization and quantification of both solid and liquid phases in multiphasic and compartmentalized systems. Moreover, it makes it possible to distinguish between solid phases having different T1 relaxation times and to highlight cross-relaxation phenomena.
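The mixed solid/liquid decay kernel suggested by the abstract can be sketched as a weighted sum of an exponential T2 term and an Abragam-type FID term. The exact parameterization used by the authors is not given here; the form f_A(t) = exp(-(a t)^2/2) * sin(b t)/(b t) is one common version of Abragam's function, and all weights and constants below are illustrative assumptions.

```python
import math

# Weighted solid (Abragam FID) + liquid (exponential T2) decay kernel.
# Parameter values are invented for illustration only.

def abragam(t, a, b):
    """One common form of Abragam's FID function."""
    if t == 0.0:
        return 1.0  # sin(b*t)/(b*t) -> 1 as t -> 0
    return math.exp(-0.5 * (a * t) ** 2) * math.sin(b * t) / (b * t)

def mixed_decay(t, w_solid, a, b, t2_liquid):
    """Solid fraction decays via the Abragam term, liquid via exp(-t/T2)."""
    return (w_solid * abragam(t, a, b)
            + (1.0 - w_solid) * math.exp(-t / t2_liquid))

print(mixed_decay(0.0, 0.3, 50.0, 30.0, 0.1))  # 1.0 at t = 0
```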
Building Models and Building Modelling
Jørgensen, Kaj Asbjørn; Skauge, Jørn
theoretical basis for the chapters with a more theoretical content. The following appendices B-D contain further characteristics of the two modelling CAD programs ArchiCAD and Architectural Desktop, together with a comparison of the two tools. The remaining two appendices describe the special...... issues concerning the modelling of the two "Sorthøjparken" models, and the resulting models are presented and evaluated. The full report is published on the project website: www.iprod.aau.dk/bygit/Web3B/ under Technical Reports....
Veronica J. Rutledge
2013-01-01
The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to
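The breakthrough-curve idea in the abstract can be illustrated with a toy one-dimensional model. Everything below (linear isotherm, linear-driving-force uptake rate, explicit upwind differencing, parameter values) is an assumption chosen for a minimal sketch; OSPREY itself solves a richer non-isothermal, non-isobaric problem.

```python
import numpy as np

# Toy 1D plug-flow adsorption column with a linear-driving-force uptake
# term, solved by explicit upwind differencing. Parameters u, k, K are
# invented for illustration.

nz = 50                                    # grid cells along the column
dz, dt = 1.0 / nz, 0.001                   # cell size, time step
u, k, K = 1.0, 5.0, 2.0                    # velocity, uptake rate, isotherm slope
c = np.zeros(nz)                           # gas-phase concentration
q = np.zeros(nz)                           # adsorbed-phase concentration
outlet = []
for _ in range(20000):
    dq = k * (K * c - q)                   # driving force toward q* = K c
    cin = np.concatenate(([1.0], c[:-1]))  # upwind; inlet concentration = 1
    c = c + dt * (-u * (c - cin) / dz - dq)
    q = q + dt * dq
    outlet.append(c[-1])                   # breakthrough curve at the outlet
print(round(outlet[0], 3), round(outlet[-1], 3))
```

The `outlet` list traces the breakthrough curve: near zero while the bed adsorbs, rising toward the inlet concentration as the bed saturates, which is the data from which bed capacity would be sized.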
Kreiner, Svend; Christensen, Karl Bang
Rasch models; Partial Credit models; Rating Scale models; Item bias; Differential item functioning; Local independence; Graphical models.
Grimaldi, P.
2012-07-01
Stereometric modelling means modelling achieved with: the use of a pair of virtual cameras, with parallel axes and positioned at a mutual distance averaging 1/10 of the camera-object distance (in practice, the realization and use of a stereometric camera in the modelling program); the visualization of the shot in two distinct windows; and the stereoscopic viewing of the shot while modelling. Since the definition of "3D vision" is inaccurately applied to the simple perspective of an object, the word stereo should be added so that "3D stereo vision" stands for "three-dimensional view", making it possible to measure the width, height and depth of the surveyed image. A stereometric model, either real or virtual, is developed through the "materialization", either real or virtual, of the optical-stereometric model made visible with a stereoscope. Continuous online updating of cultural heritage records is feasible with the help of photogrammetry and stereometric modelling. The catalogue of the Architectonic Photogrammetry Laboratory of Politecnico di Bari is available online at: http://rappresentazione.stereofot.it:591/StereoFot/FMPro?-db=StereoFot.fp5&-lay=Scheda&-format=cerca.htm&-view
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Insepov, Zeke; Veitzer, Seth; Mahalingam, Sudhakar
2011-01-01
Although vacuum arcs were first identified over 110 years ago, they are not yet well understood. We have since developed a model of breakdown and gradient limits that tries to explain, in a self-consistent way: arc triggering, plasma initiation, plasma evolution, surface damage and gradient limits. We use simple PIC codes for modeling plasmas, molecular dynamics for modeling surface breakdown and surface damage, and mesoscale surface thermodynamics and finite element electrostatic codes to evaluate surface properties. Since any given experiment seems to have more variables than data points, we have tried to consider a wide variety of arcing (rf structures, e-beam welding, laser ablation, etc.) to help constrain the problem, and concentrate on common mechanisms. While the mechanisms can be comparatively simple, modeling can be challenging.
National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of...
The author's goal is to provide a physical understanding of the ideal MHD model which includes: (1) a basic description of the model, (2) a derivation starting from a more fundamental kinetic model, and (3) a discussion of its range of validity. The ideal MHD model is a single-fluid model that describes the effects of magnetic geometry on the macroscopic equilibrium and stability properties of fusion plasmas. The model is derived in a straightforward manner by forming the mass, momentum, and energy moments of the Boltzmann equation. The moment equations reduce to ideal MHD with the introduction of three critical assumptions: high collisionality, small ion gyroradius, and small resistivity. An analysis of the validity conditions shows that the collision-dominated assumption is never satisfied in plasmas of fusion interest. The remaining two conditions are satisfied by a wide margin. A careful examination of the collision-dominated assumption shows that those particular parts of ideal MHD treated inaccurately (i.e., the parallel momentum and energy equations) play little, if any, practical role in MHD equilibrium and stability. These equations primarily describe compression and expansion of a plasma, whereas most MHD instabilities involve incompressible motions. The model is incorrect only where it does not matter. This realization leads to the introduction of a modified MHD model known as collisionless MHD, which makes predictions nearly identical to those of ideal MHD. It is thus valid for plasmas of fusion interest. The derivation follows from an analysis of single-particle guiding center motion in a collisionless plasma and the subsequent closure of the system by the heuristic assumption that the motions of interest are incompressible.
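As a reminder of the single-fluid model under discussion, the ideal MHD equations obtained from the mass, momentum, and energy moments are, in standard textbook form (not quoted from this paper):

```latex
% Mass, momentum, and energy (adiabatic) equations
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0, \qquad
\rho \frac{d\mathbf{v}}{dt} = \mathbf{J} \times \mathbf{B} - \nabla p, \qquad
\frac{d}{dt}\!\left( \frac{p}{\rho^{\gamma}} \right) = 0,
% closed by the ideal Ohm's law and the low-frequency Maxwell equations
\mathbf{E} + \mathbf{v} \times \mathbf{B} = 0, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J}, \qquad
\frac{\partial \mathbf{B}}{\partial t} = -\nabla \times \mathbf{E}.
```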
Accelerated life models modeling and statistical analysis
Bagdonavicius, Vilijandas
2001-01-01
Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia...
Kocherlakota, Narayana R.
2007-01-01
This paper uses an example to show that a model that fits the available data perfectly may provide worse answers to policy questions than an alternative, imperfectly fitting model. The author argues that, in the context of Bayesian estimation, this result can be interpreted as being due to the use of an inappropriate prior over the parameters of shock processes. He urges the use of priors that are obtained from explicit auxiliary information, not from the desire to obtain identification.
Model composition in model checking
Felscher, Ingo
2014-01-01
Model-checking allows one to formally check properties of systems: these properties are modeled as logic formulas and the systems as structures like transition systems. These transition systems are often composed, i.e., they arise in form of products or sums. The composition technique allows us to deduce the truth of a formula in the composed system from "interface information": the truth of formulas for the component systems and information in which components which of these formulas hold. W...
The invention relates to devices for modelling the space-dependent kinetics of a nuclear reactor. It can be used to advantage in studying the dynamics of the neutron field in the core, to determine the effect of the control rods on the power distribution in the core, and for training purposes. The proposed analog model of a nuclear reactor comprises operational amplifiers and a grid of resistors simulating neutron diffusion. Connected to the grid nodes are supply resistors modelling the absorption and multiplication of neutrons. In the proposed model this is achieved by interconnecting the other leads of all resistors through which power is supplied to the grid nodes and coupling them to the output of an amplifier unit common to all nodes. This amplifier unit models the transfer function of a "point" reactor. Connected to the input of this unit, which includes two to four amplifiers, are resistors for the addition of signals with a grid node. Coupled to the grid nodes via additional resistors are voltage sources simulating reactivity.
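A discrete sketch can illustrate the analogy described above: the resistor grid corresponds to nearest-neighbour diffusion coupling, and the single amplifier unit shared by all nodes corresponds to a common feedback signal applied uniformly to the whole grid. All numbers below are illustrative assumptions, not values from the patent text.

```python
import numpy as np

# Resistor-grid analogy as a discrete diffusion step plus a shared
# "point reactor" feedback term applied to every node.

def step(phi, d=0.2, k=0.05, dt=0.1):
    lap = np.roll(phi, 1) + np.roll(phi, -1) - 2 * phi  # neighbour coupling
    common = k * phi.mean()        # shared amplifier feedback signal
    return phi + dt * (d * lap + common)

phi = np.ones(10)                  # flat neutron field...
phi[5] = 2.0                       # ...with a local perturbation
for _ in range(50):
    phi = step(phi)
print(phi.max())                   # perturbation has diffused: max < 2.0
```

The diffusion term smooths the local perturbation across neighbouring nodes while the common feedback raises the whole field uniformly, mimicking how the grid and the shared amplifier divide the roles of space-dependent and point kinetics.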
Nash, Ulrik William
2014-01-01
Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory of...... probabilistic functionalism, and concerns the environment and the mind, and adaptation by the latter to the former. This entry is about the lens model, and probabilistic functionalism more broadly. Focus will mostly be on firms and their employees, but, to fully appreciate the scope, we have to keep in mind the...
2012-01-01
The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident...... characteristic of architectural practice. But the persistence in persistent modelling can also be understood to apply in other ways, reflecting and anticipating extended roles for representation. This book identifies three principle areas in which these extensions are becoming apparent within contemporary....... It also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience....
Bork Petersen, Franziska
2013-01-01
focus centres on how the catwalk scenography evokes a ‘defiguration’ of the walking models and to what effect. Vibskov’s mobile catwalk draws attention to the walk, which is a key element of models’ performance but which usually functions in fashion shows merely to present clothes in the most...... advantageous manner. Stepping on the catwalk’s sloping, moving surfaces decelerates the models’ walk and makes it cautious, hesitant and shaky: suddenly the models lack exactly the affirmative, staccato, striving quality of motion, and the condescending expression that they perform on most contemporary...... determines the models’ walk. Furthermore, letting the models set off sound through triggers with attached sound samples gives them an implied agency. This calls into question the designer’s unrestricted authorship....
Ling Li; Vasily Volkov
2006-01-01
A physically-based model is presented for the simulation of a new type of deformable objects: inflatable objects, such as shaped balloons, which consist of pressurized air enclosed by an elastic surface. These objects have properties inherent in both 3D and 2D elastic bodies, as they demonstrate the behaviour of 3D shapes using 2D formulations. As there is no internal structure in them, their behaviour is substantially different from that of deformable solid objects. We use one of the few available models for deformable surfaces and enhance it to include the forces of internal and external pressure. These pressure forces may also incorporate buoyancy forces, to allow objects filled with a low-density gas to float in denser media. The obtained models demonstrate rich dynamic behaviour, such as bouncing, floating, deflation and inflation.
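The pressure term the abstract adds to a deformable-surface model can be sketched per triangular face: the net (internal minus external) pressure p contributes a force p * A * n, with A the face area and n its outward unit normal. The geometry and pressure value below are illustrative, and the paper's actual discretization may differ.

```python
import numpy as np

# Pressure force on one triangular face of a surface mesh.
# p is the net internal-minus-external pressure (illustrative value).

def pressure_force(v0, v1, v2, p):
    n = np.cross(v1 - v0, v2 - v0)  # |n| = 2 * area, direction = face normal
    return 0.5 * p * n              # equals p * area * unit_normal

f = pressure_force(np.array([0.0, 0.0, 0.0]),
                   np.array([1.0, 0.0, 0.0]),
                   np.array([0.0, 1.0, 0.0]), p=2.0)
print(f)  # [0. 0. 1.] : area 0.5, outward normal +z, p = 2
```

Summing this contribution over all faces (and adding a buoyancy term proportional to the enclosed volume and the density difference) gives the inflation forces the abstract describes.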
Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the 56Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae based upon a combination of rotation and thermonuclear burning is discussed
Aarti Sharma
2009-01-01
The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease harmful side effects and increase potency. This article provides a detailed description of the techniques employed in molecular modeling. Molecular modeling is a rapidly developing discipline and has been supported by the dramatic improvements in computer hardware and software in recent years.
Arnoldi, Jakob
The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics among other things claim can lead to market manipulation. Drawing on...... two cases, this article shows that manipulation more likely happens in the reverse way, meaning that human traders attempt to make algorithms 'make mistakes' or 'mislead' algos. Thus, it is algorithmic models, not humans, that are manipulated. Such manipulation poses challenges for securities exchanges...
Holmes, Jon L.
1999-06-01
Molecular modeling has trickled down from the realm of pharmaceutical and research laboratories into the realm of undergraduate chemistry instruction. It has opened avenues for the visualization of chemical concepts that previously were difficult or impossible to convey. I am sure that many of you have developed exercises using the various molecular modeling tools. It is the desire of this Journal to become an avenue for you to share these exercises among your colleagues. It is to this end that Ron Starkey has agreed to edit such a column and to publish not only the description of such exercises, but also the software documents they use. The WWW is the obvious medium to distribute this combination and so accepted submissions will appear online as a feature of JCE Internet. Typical molecular modeling exercise: finding conformation energies. Molecular Modeling Exercises and Experiments is the latest feature column of JCE Internet, joining Conceptual Questions and Challenge Problems, Hal's Picks, and Mathcad in the Chemistry Curriculum. JCE Internet continues to seek submissions in these areas of interest and submissions of general interest. If you have developed materials and would like to submit them, please see our Guide to Submissions for more information. The Chemical Education Resource Shelf, Equipment Buyers Guide, and WWW Site Review would also like to hear about chemistry textbooks and software, equipment, and WWW sites, respectively. Please consult JCE Internet Features to learn more about these resources at JCE Online. Email Announcements Would you like to be informed by email when the latest issue of the Journal is available online? when a new JCE Software title is shipping? when a new JCE Internet article has been published or is available for Open Review? when your subscription is about to expire? A new feature of JCE Online makes this possible. Visit our Guestbook to learn how. When you submit the form on this page, which includes your email address
Cardey, Sylviane
2013-01-01
In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language (or languages) in a microscopic manner by means of intra/inter-language norms and divergences, going progressively from languages as systems to linguistic, mathematical and computational models which, being based on a constructive approach, are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro-systems. The abstract model, which contrary to the current state of the art works in int
Sivaram, C.
2007-01-01
An alternate model for gamma ray bursts is suggested. For a very close white dwarf (WD) and neutron star (NS) binary system, the WD (close to the Chandrasekhar mass Mch) can detonate due to tidal heating, leading to a SN. Material falling onto the NS at relativistic velocities can cause its collapse to a magnetar, quark star or black hole, leading to a GRB. As the material smashes onto the NS, the scenario is dubbed the Smashnova model. Here the SN is followed by a GRB. An NS impacting a RG (or RSG) (like in Thorne-Zytkow ob...
Calculations, drawing principally on developments at AERE Harwell, of the relaxation about lattice defects are reviewed, with emphasis on the techniques required for such calculations. The principles of defect modelling are outlined and various programs developed for defect simulations are discussed. Particular calculations for metals, ionic crystals and oxides are considered. (UK)
Michael, John
others' minds. Then (2), in order to bring to light some possible justifications, as well as hazards and criticisms of the methodology of looking time tests, I will take a closer look at the concept of folk psychology and will focus on the idea that folk psychology involves using oneself as a model of...
Olaf eWolkenhauer
2014-01-01
Next generation sequencing technologies are bringing about a renaissance of mining approaches. A comprehensive picture of the genetic landscape of an individual patient will be useful, for example, to identify groups of patients that do or do not respond to certain therapies. The high expectations may however not be satisfied if the number of patient groups with similar characteristics is going to be very large. I therefore doubt that mining sequence data will give us an understanding of why and when therapies work. For understanding the mechanisms underlying diseases, an alternative approach is to model small networks in quantitative mechanistic detail, to elucidate the role of genes and proteins in dynamically changing the functioning of cells. Here an obvious critique is that these models consider too few components, compared to what might be relevant for any particular cell function. I show here that mining approaches and dynamical systems theory are two ends of a spectrum of methodologies to choose from. Drawing upon personal experience in numerous interdisciplinary collaborations, I provide guidance on how to model by discussing the question "Why model?"
Baart, F.; Donchyts, G.; van Dam, A.; Plieger, M.
2015-12-01
The emergence of interactive art has blurred the line between electronics, computer graphics and art. Here we apply this art form to numerical models. We show how the transformation of a numerical model into an interactive painting can both provide insights and solve real world problems. The cases used as examples include forensic reconstructions, dredging optimization, and barrier design. The system can be fed using any source of time-varying vector fields, such as hydrodynamic models. The cases used here, the Indian Ocean (HYCOM), the Wadden Sea (Delft3D Curvilinear), and San Francisco Bay (3Di subgrid and Delft3D Flexible Mesh), show that the method is suitable for different time and spatial scales. High resolution numerical models become interactive paintings by exchanging their velocity fields with a high resolution (>=1M cells) image-based flow visualization that runs in an HTML5-compatible web browser. The image-based flow visualization combines three images into a new image: the current image, a drawing, and a uv + mask field. The advection scheme that computes the resultant image is executed on the graphics card using WebGL, allowing for 1M grid cells at 60 Hz performance on modest graphics cards. The software is provided as open source software. By using different sources for a drawing one can gain insight into several aspects of the velocity fields. These aspects include not only the commonly represented magnitude and direction, but also divergence, topology and turbulence.
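The advection step described above can be sketched in a few lines. The NumPy fragment below is an illustrative, CPU-bound analogue only (nearest-neighbour sampling, invented field names), not the paper's WebGL implementation, which performs the equivalent lookup per pixel on the GPU with bilinear texture sampling:

```python
import numpy as np

def advect(image, u, v, dt=1.0):
    """One backward-advection step: each pixel samples the image at the
    position the flow carried it from (semi-Lagrangian back-trace).
    image: (H, W) scalar field; u, v: (H, W) velocity components."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Trace back along the velocity field and clamp to the domain.
    src_x = np.clip(xs - dt * u, 0, w - 1)
    src_y = np.clip(ys - dt * v, 0, h - 1)
    # Nearest-neighbour sampling keeps the sketch short; a GPU version
    # would use bilinear texture lookups instead.
    return image[src_y.round().astype(int), src_x.round().astype(int)]

# A uniform rightward flow shifts a bright pixel one column per step.
img = np.zeros((4, 4)); img[2, 1] = 1.0
u = np.ones((4, 4)); v = np.zeros((4, 4))
out = advect(img, u, v)
```

Blending the advected image with a drawing and a mask, as the abstract describes, would then be simple per-pixel arithmetic on top of this step.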
Taylor, Julie
2013-01-01
This paper provides a brief overview of the NSPCC/University of Edinburgh Child Protection Research Centre. It highlights the Centre's work, approach, progress to date and direction of travel. The document includes the Centre's Logic Model which details types of research and outcomes.
R.E. Waltz
2007-01-01
There has been remarkable progress during the past decade in understanding and modeling turbulent transport in tokamaks. With some exceptions, the progress is derived from the huge increases in computational power and the ability to simulate tokamak turbulence with ever more fundamental and physically realistic dynamical equations, e.g.
Burianová, Eva
2008-01-01
The aim of the first part of this bachelor's thesis is, through an analysis of the source texts, to give a theoretical summary of the economic models and theories on which the CAPM model rests: Markowitz's portfolio theory model (the analysis of expected-utility maximization and the optimal-portfolio selection model based on it), and Tobin's (an extension of Markowitz's model: splitting optimal portfolio selection into two phases, first determining the optimal combination of risky instruments and then allocating the available capital between this optimal ...
Jensen, Morten S.; Frigaard, Peter
In the following, results from model tests with the Zeebrugge breakwater are presented. The objective of these tests is partly to investigate the influence on wave run-up of a changing water level during a storm. Finally, the influence on wave run-up of an introduced longshore current is...
N. Bosma (Niels); G. de Wit (Gerrit); M.A. Carree (Martin)
2003-01-01
Two approaches can be distinguished with respect to modelling entrepreneurship: (i) the approach focusing on the net development of the number of entrepreneurs in an equilibrium framework, and (ii) the approach focusing on the entries and exits of entrepreneurs. In this paper we unify the
The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations, and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. The criticality
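The lower-bound tolerance limit idea lends itself to a small illustration. The sketch below is a generic one-sided tolerance limit over benchmark k_eff results, with an assumed tolerance factor; it is not the report's actual statistical procedure, where the factor depends on sample size, confidence level and coverage (tabulated values):

```python
import statistics

def lower_bound_tolerance_limit(keff_results, k_factor=2.0):
    """Generic one-sided lower tolerance limit: sample mean minus a
    tolerance factor times the sample standard deviation. The factor
    of 2.0 is a placeholder assumption, not a tabulated value."""
    mean = statistics.fmean(keff_results)
    sdev = statistics.stdev(keff_results)
    return mean - k_factor * sdev

# Hypothetical benchmark-experiment k_eff results.
benchmarks = [0.998, 1.002, 1.001, 0.999, 1.000]
lbtl = lower_bound_tolerance_limit(benchmarks)
```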
A. Alsaed
2004-09-14
Information Model for Product Modeling
焦国方; 刘慎权
1992-01-01
The key problems in product modeling for integrated CAD/CAM systems are the information structures and representations of products. They are taking on ever more important roles in engineering applications. Based on an investigation of engineering product information and from the viewpoint of the industrial process, this paper proposes information models and gives definitions of the framework of product information. The integration and consistency of product information are then discussed by introducing the entity and its instance. In summary, the information structures described in this paper have many advantages and properties helpful in engineering design.
Aarti Sharma
2009-12-01
Almeida, Leandro S.; José Fernando A. Cruz; Ferreira, Helena Isabel dos Santos Ribeiro; Pinto, Alberto
2011-01-01
The Theory of Planned Behavior studies the decision-making mechanisms of individuals. We propose Nash equilibria as one, of many, possible mechanisms for transforming human intentions into behavior. This process corresponds to the best strategic individual decision, taking into account the collective response. We built a game-theoretical model to understand the role of leaders in the decision-making of individuals or groups. We study the characteristics of the leaders that can have a...
Clyde, Merlise; George, Edward I.
2004-01-01
The evolution of Bayesian approaches for model uncertainty over the past decade has been remarkable. Catalyzed by advances in methods and technology for posterior computation, the scope of these methods has widened substantially. Major thrusts of these developments have included new methods for semiautomatic prior specification and posterior exploration. To illustrate key aspects of this evolution, the highlights of some of these developments are described.
This lecture was given at the KEK Summer School on August 3-6, 1993 by Professor N. Sakai. All the available experimental data at low energy can be adequately described by the standard model with the SU(3) x SU(2) x U(1) gauge group. The three different gauge coupling constants originate from the three different interactions, namely, the strong, weak and electromagnetic interactions. The three interactions described by the three different gauge groups can be truly unified if a simple gauge group describing all three interactions is chosen. Even if grand unified theory is not accepted, the existence of the gravitational interaction is certain. There are only two options to explain the gauge hierarchy, namely, the technicolor model and supersymmetry. As an introduction to supersymmetry, spinors and Grassmann numbers, supertransformations, unitary representations, the chiral scalar superfield and supersymmetric Lagrangian field theory are explained. Regarding the supersymmetric SU(3) x SU(2) x U(1) model, the Yukawa couplings and particle content are described. It should be noted that the Higgsino (the chiral fermion associated with the Higgs scalar) in general introduces an anomaly in the gauge currents. The simplest way out of this anomaly problem is to introduce Higgsino doublets in pairs. (K.I.)
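The supertransformations mentioned above are generated by an algebra whose defining relation is standardly written, in two-component spinor notation, as follows; this rendering follows common textbook conventions rather than the lecture notes themselves:

```latex
\{Q_\alpha,\ \bar{Q}_{\dot{\beta}}\} = 2\,\sigma^{\mu}_{\alpha\dot{\beta}}\,P_{\mu},
\qquad
\{Q_\alpha,\ Q_\beta\} = \{\bar{Q}_{\dot{\alpha}},\ \bar{Q}_{\dot{\beta}}\} = 0 .
```

Two successive supersymmetry transformations thus compose to a spacetime translation P_mu, which is what ties supersymmetry to the structure of spacetime rather than to internal symmetries alone.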
Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas
2005-11-01
Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.
Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO2), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically, nitrogen oxides (NOx) and non-methane organic compounds (NMOC) in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NOx concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e. background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NOx coordinates of the point, known as the NMOC/NOx ratio. Results obtained by the described model are presented.
Technological Forecasting---Model Selection, Model Stability, and Combining Models
Nigel Meade; Towhidul Islam
1998-01-01
The paper identifies 29 models that the literature suggests are appropriate for technological forecasting. These models are divided into three classes according to the timing of the point of inflexion in the innovation or substitution process. Faced with a given data set and such a choice, the issue of model selection needs to be addressed. Evidence used to aid model selection is drawn from measures of model fit and model stability. An analysis of the forecasting performance of these models u...
Model Construct Based Enterprise Model Architecture and Its Modeling Approach
Anonymous
2002-01-01
In order to support enterprise integration, a model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of the reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.
Unnikrishnan, A.S.; Manoj, N.T.
the wetted perimeter and A the area of cross section (excluding mud flats); C = (1.49/n)R^(1/6), where n is the Manning coefficient. The numerical scheme used by Harleman and Lee (1969) was used to solve the above equations. In this scheme, the continuity equation is solved at odd grid points to compute η at the next time step, and the momentum equation is solved at even grid points to compute U. The original scheme of Harleman & Lee (1969) was developed for a single channel. For developing a model...
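The staggered arrangement described here (surface elevation at odd grid points, velocity at even points) can be illustrated with a deliberately simplified sketch. The fragment below solves the linearized 1-D shallow-water equations with a flat bottom and no friction; the grid size, depth, and time step are all invented for illustration, and this is not Harleman and Lee's actual scheme:

```python
import numpy as np

g, depth, dx, dt = 9.81, 10.0, 1000.0, 10.0   # illustrative values
n = 101                     # grid points: eta at odd, U at even indices
eta = np.zeros(n)
U = np.zeros(n)
eta[51] = 0.5               # initial hump of water at an odd (eta) point

def step(eta, U):
    eta, U = eta.copy(), U.copy()
    # Continuity at odd points: d(eta)/dt = -depth * dU/dx
    eta[1:-1:2] -= dt * depth * (U[2::2] - U[:-2:2]) / (2 * dx)
    # Momentum at even points: dU/dt = -g * d(eta)/dx
    # (uses the freshly updated eta: a forward-backward update)
    U[2:-2:2] -= dt * g * (eta[3::2] - eta[1:-2:2]) / (2 * dx)
    return eta, U

for _ in range(50):
    eta, U = step(eta, U)
```

The alternating updates mirror the odd/even split in the abstract: each variable is advanced from centered differences of the other, which is what makes the staggering natural for tidal channel models.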
Liu Zhiyang
2011-01-01
Similar to ISO Technical Committees, SAC Technical Committees undertake the management and coordination of standards development and amendment in various sectors of industry, playing the role of a bridge among enterprises, research institutions and the governmental standardization administration. How to fully play this essential role is the vital issue SAC has been committed to resolving. Among hundreds of SAC TCs, one stands out in knitting together those isolated, scattered, but highly competitive enterprises in the same industry with the "Standards" thread, and in achieving remarkable results in promoting industry development with standardization. It sets a role model for other TCs.
From model checking to model measuring
Henzinger, Thomas A.; Otop, Jan
2013-01-01
We define the model-measuring problem: given a model M and a specification φ, what is the maximal distance ρ such that all models M' within distance ρ from M satisfy (or violate) φ? The model-measuring problem presupposes a distance function on models. We concentrate on automatic distance functions, which are defined by weighted automata. The model-measuring problem subsumes several generalizations of the classical model-checking problem, in particular, qu...
ModelWizard: Toward Interactive Model Construction
Hutchison, Dylan
2016-01-01
Data scientists engage in model construction to discover machine learning models that well explain a dataset, in terms of predictiveness, understandability and generalization across domains. Questions such as "what if we model common cause Z" and "what if Y's dependence on X reverses" inspire many candidate models to consider and compare, yet current tools emphasize constructing a final model all at once. To more naturally reflect exploration when debating numerous models, we propose an inter...
Towards a Multi Business Model Innovation Model
Lindgren, Peter; Jørgensen, Rasmus
2012-01-01
This paper studies the evolution of business model (BM) innovations related to a multi business model framework. The paper tries to answer the research questions: • What are the requirements for a multi business model innovation model (BMIM)? • What should a multi business model innovation model look like? Different generations of BMIMs are initially studied in the context of laying the baseline for what the next generation multi BM innovation model (BMIM) should look like. All generations of models are analyzed with the purpose of comparing the characteristics and challenges of previous...
Better Language Models with Model Merging
Brants, T
1996-01-01
This paper investigates model merging, a technique for deriving Markov models from text or speech corpora. Models are derived by starting with a large and specific model and by successively combining states to build smaller and more general models. We present methods to reduce the time complexity of the algorithm and report on experiments on deriving language models for a speech recognition task. The experiments show the advantage of model merging over the standard bigram approach. The merged model assigns a lower perplexity to the test set and uses considerably fewer states.
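A single state-merging step of the kind described, combining two states into one more general state, can be sketched as pooling transition counts. The function and data below are hypothetical illustrations of the idea, not the paper's algorithm:

```python
from collections import defaultdict

def merge_states(transitions, a, b, merged="a|b"):
    """Merge states a and b of a count-based Markov model into one
    state; outgoing and incoming transition counts are summed.
    transitions: {state: {next_state: count}}."""
    out = defaultdict(lambda: defaultdict(int))
    for s, nxt in transitions.items():
        s2 = merged if s in (a, b) else s
        for t, c in nxt.items():
            t2 = merged if t in (a, b) else t
            out[s2][t2] += c
    return {s: dict(n) for s, n in out.items()}

# Toy bigram counts: merging "sun" and "moon" generalizes the model.
t = {"sun": {"shines": 3}, "moon": {"shines": 1}, "shines": {"sun": 2}}
m = merge_states(t, "sun", "moon")
```

Repeating such merges, as the abstract describes, moves from a large, specific model toward smaller, more general ones; each merge trades specificity for fewer states.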
Impedance model for nanostructures
R. S. Akhmedov
2007-06-01
The application of the impedance model to the modelling of nanoelectronic quantum-mechanical structures is described. Characteristics illustrating the efficiency of the model are presented.
Building Mental Models by Dissecting Physical Models
Srivastava, Anveshna
2016-01-01
When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to…
From Product Models to Product State Models
Larsen, Michael Holm
1999-01-01
A well-known technology designed to handle product data is Product Models. Product Models are in their current form not able to handle all types of product state information. Hence, the concept of a Product State Model (PSM) is proposed. The PSM and in particular how to model a PSM is the Research...
The IMACLIM model; Le modele IMACLIM
NONE
2003-07-01
This document provides annexes to the IMACLIM model, offering an updated description of IMACLIM, a model supporting the design of a tool for evaluating greenhouse gas reduction policies. The model is described in a version coupled with POLES, a technical and economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)
Rask, Morten
insight from the literature about business models, international product policy, international entry modes and globalization into a conceptual model of relevant design elements of global business models, enabling global business model innovation to deal with differences in a downstream perspective...
Forward model nonlinearity versus inverse model nonlinearity
Mehl, S.
2007-01-01
The issue of concern is the impact of forward model nonlinearity on the nonlinearity of the inverse model. The question posed is, "Does increased nonlinearity in the head solution (forward model) always result in increased nonlinearity in the inverse solution (estimation of hydraulic conductivity)?" It is shown that the two nonlinearities are separate, and it is not universally true that increased forward model nonlinearity increases inverse model nonlinearity. © 2007 National Ground Water Association.
Combemale, Benoit; Cheng, Betty H.C.; Moreira, Ana; Bruel, Jean-Michel; Gray, Jeff
2015-01-01
Various disciplines use models for different purposes. An engineering model, including a software engineering model, is often developed to guide the construction of a non-existent system. A scientific model is created to better understand a natural phenomenon (i.e., an already existing system). An engineering model may incorporate scientific models to build a system. Sustainability is an area that requires both types of models. Both engineering and scientific models have been used to support ...
A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models have Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to how Friedmann's model generalizes Einstein's. (L.C.)
Concept Modeling vs. Data modeling in Practice
Madsen, Bodil Nistrup; Erdman Thomsen, Hanne
2015-01-01
This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models. We also show how to map from the various elements in the terminological ontology to elements in the data models, and explain the differences between the models. Finally the usefulness of terminological ontologies as a prerequisite for IT development and data modeling is illustrated with examples from
Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik
1997-01-01
This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application of the l...
Optimal predictive model selection
Barbieri, Maria Maddalena; Berger, James O.
2004-01-01
Often the goal of model selection is to choose a model for future prediction, and it is natural to measure the accuracy of a future prediction by squared error loss. Under the Bayesian approach, it is commonly perceived that the optimal predictive model is the model with highest posterior probability, but this is not necessarily the case. In this paper we show that, for selection among normal linear models, the optimal predictive model is often the median probability model, which is defined a...
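The median probability model mentioned here has a one-line characterization: include exactly those predictors whose posterior inclusion probability exceeds one half. A sketch with made-up inclusion probabilities:

```python
# Posterior inclusion probabilities for four candidate predictors.
# The values are invented for illustration, not taken from the paper.
inclusion_prob = {"x1": 0.92, "x2": 0.55, "x3": 0.31, "x4": 0.08}

# Median probability model: keep predictors with probability > 1/2.
median_probability_model = [v for v, p in inclusion_prob.items() if p > 0.5]
```

Note that this selected set need not coincide with the single model of highest posterior probability, which is exactly the distinction the abstract draws.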
Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher
2014-01-01
The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...
Wake modelling combining mesoscale and microscale models
Badger, Jake; Volker, Patrick; Prospathospoulos, J.; Sieros, G.; Ott, Søren; Réthoré, Pierre-Elouan; Hahmann, Andrea N.; Hasager, Charlotte Bay
2013-01-01
In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake...
Model Manipulation for End-User Modelers
Acretoaie, Vlad
End-user modelers are domain experts who create and use models as part of their work. They are typically not Software Engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often requires such experience. These languages are therefore only used by a small subset of the modelers that could, in theory, benefit from them. The goals of this thesis are to substantiate this observation, introduce the concepts and tools required to overcome it, and provide empirical evidence in support... and transformations using their modeling notation and editor of choice. The VM* languages are implemented via a single execution engine, the VM* Runtime, built on top of the Henshin graph-based transformation engine. This approach combines the benefits of flexibility, maturity, and formality. To simplify model editor...
Model Checking of Boolean Process Models
Schneider, Christoph; Wehler, Joachim
2011-01-01
In the field of Business Process Management, formal models for the control flow of business processes have been designed for more than 15 years. Which methods are best suited to verify the bulk of these models? The first step is to select a formal language which fixes the semantics of the models. We adopt the language of Boolean systems as the reference language for Boolean process models. Boolean systems form a simple subclass of coloured Petri nets. Their characteristics are low tokens to mode...
MODEL VALIDATION AND THE PHILIPPINE PROGRAMMING MODEL
Rodriguez, Gil R. Jr.; Kunkel, David E.
1980-01-01
This research demonstrates the need and the procedure for testing sector programming models. It compares the model estimates of endogenous variables to carefully selected base-period parameters. It uses an operational, static, deterministic, and highly aggregate programming model of Philippine agriculture as the framework. Alternative formulations of the Philippine model are also examined for possible errors in the consumption, production, and objective function data sets.
Molecular Models: Construction of Models with Magnets
Kalinovčić P.
2015-01-01
Molecular models are indispensable tools in teaching chemistry. Besides their high price, commercially available models are generally too small for classroom demonstration. This paper suggests how to make space-filling (calotte) models from Styrofoam, with magnetic balls as connectors and disc magnets for showing molecular polarity.
Tahir Abdullah
2012-02-01
Software architecture design and requirements engineering are core, independent areas of engineering. Much research, education and practice address requirements elicitation and refinement, but it remains a major engineering issue. The QSMSR model acts as a bridge between requirements and design, where there is a huge gap between the two areas of software architecture and requirements engineering. The QSMSR model divides into two sub-models, a qualitative model and a principal model; in this research we focus on the qualitative model, which is further divided into two sub-models, a fabricated model and a classified model. The classified model makes sub-groups of the roles and matches them with components. The fabricated model links the QSMSR principal model to an architecture design. In the end, it provides the QSMSR architecture model of the system as output.
Willden, Jeff
2001-01-01
"Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…
Könemann, Patrick
just contain a list of strings, one for each line, whereas the structure of models is defined by their meta models. There are tools available which are able to compute the diff between two models, e.g. RSA or EMF Compare. However, their diff is not model-independent, i.e. it refers to the models it was...
Automated data model evaluation
The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and their relations with ontology tools, are presented. Key words: Database modeling, Data model correctness, Evaluation
Modelling Foundations and Applications
selected from 81 submissions. Papers on all aspects of MDE were received, including topics such as architectural modelling and product lines, code generation, domain-specific modeling, metamodeling, model analysis and verification, model management, model transformation and simulation. The breadth of topics...
Environmental Satellite Models for a Macroeconomic Model
To support national environmental policy, it is desirable to forecast and analyse environmental indicators consistently with economic variables. However, environmental indicators are physical measures linked to physical activities that are not specified in economic models. One way to deal with this is to develop environmental satellite models linked to economic models. The system of models presented gives a frame of reference where emissions of greenhouse gases, acid gases, and leaching of nutrients to the aquatic environment are analysed in line with - and consistently with - macroeconomic variables. This paper gives an overview of the data and the satellite models. Finally, the results of applying the model system to calculate the impacts on emissions and the economy are reviewed in a few illustrative examples. The models have been developed for Denmark; however, most of the environmental data used are from the CORINAIR system implemented in numerous countries
Geologic Framework Model Analysis Model Report
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Collaborative networks: Reference modeling
L.M. Camarinha-Matos; H. Afsarmanesh
2008-01-01
Collaborative Networks: Reference Modeling works to establish a theoretical foundation for Collaborative Networks. Particular emphasis is put on modeling multiple facets of collaborative networks and establishing a comprehensive modeling framework that captures and structures diverse perspectives of
Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire...
LSTM based Conversation Models
Luan, Yi; Ji, Yangfeng; Ostendorf, Mari
2016-01-01
In this paper, we present a conversational model that incorporates both context and participant role for two-party conversations. Different architectures are explored for integrating participant role and context information into a Long Short-term Memory (LSTM) language model. The conversational model can function as a language model or a language generation model. Experiments on the Ubuntu Dialog Corpus show that our model can capture multiple turn interaction between participants. The propos...
Computational neurogenetic modeling
Benuskova, Lubica
2010-01-01
Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol
National Aeronautics and Space Administration — Claire Monteleoni, Gavin Schmidt, and Shailesh Saroha: Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...
Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...
Combustion modeling in a model combustor
L.Y.Jiang; I.Campbell; K.Su
2007-01-01
The flow-field of a propane-air diffusion flame combustor with interior and exterior conjugate heat transfers was numerically studied. Results obtained from four combustion models, combined with the re-normalization group (RNG) k-ε turbulence model, discrete ordinates radiation model and enhanced wall treatment, are presented and discussed. The results are compared with a comprehensive database obtained from a series of experimental measurements. The flow patterns and the recirculation zone length in the combustion chamber are accurately predicted, and the mean axial velocities are in fairly good agreement with the experimental data, particularly at downstream sections for all four combustion models. The mean temperature profiles are captured fairly well by the eddy dissipation (EDS), probability density function (PDF), and laminar flamelet combustion models. However, the EDS-finite-rate combustion model fails to provide an acceptable temperature field. In general, the flamelet model illustrates little superiority over the PDF model, and to some extent the PDF model shows better performance than the EDS model.
ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT
Clinton Lum
2002-02-04
The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997, WA-0358), ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999), and the technical development plan, Rock Properties Model Version 3.1, (CRWMS M&O 1999c). The Interim Change Notice (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3
Business value modeling based on BPMN models
Masoumigoudarzi, Farahnaz
2014-01-01
In this study we will try to clarify the explanation of modeling and measuring 'Business Values', as it is defined in business context, in the business processes of a company and introduce different methods and select the one which is best for modeling the company's business values. These methods have been used by researchers in business analytics and senior managers of many companies. The focus in this project is business value detection and modeling. The basis of this research is on BPM...
A future of the model organism model
Rine, Jasper
2014-01-01
Changes in technology are fundamentally reframing our concept of what constitutes a model organism. Nevertheless, research advances in the more traditional model organisms have enabled fresh and exciting opportunities for young scientists to establish new careers and offer the hope of comprehensive understanding of fundamental processes in life. New advances in translational research can be expected to heighten the importance of basic research in model organisms and expand opportunities. Howe...
Failure prediction model: Model napovedovanja odpovedi:
Čelan, Štefan; Težak, Oto; Žižek, Adolf
2002-01-01
Preventive maintenance is vital for delicate technical products. Electronic components or the whole system must be changed, and thus a good model is needed that will indicate failure accurately. In this paper a stochastic stress-strength quantitative model is presented, following five original hypotheses. The proposed new model of failure prediction could be used in system maintenance. Failure risk could be instantaneously calculated. The given theory considers the influences of stress on the ...
Better models are more effectively connected models
Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John
2016-04-01
The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. The discussion will focus on the different approaches through which connectivity
Rahmani, Fouad Lazhar
2010-11-01
The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling as reported by ISHAM [6].
MacKay, N. J.
2006-01-01
An overview of Lanchester combat models, emphasising their pedagogical possibilities. After a description of the aimed-fire model and comments on the literature, we introduce briefly a range of further topics: a discrete equivalent, the unaimed-fire model, mixed forces, the meaning of a 'unit', support troops, Bracken's generalization and an asymmetric model.
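The aimed-fire model referred to in this overview is the classic pair of coupled ODEs dx/dt = -a·y, dy/dt = -b·x, whose invariant b·x² - a·y² yields Lanchester's square law. A minimal forward-Euler sketch (function name and parameter values are illustrative, not from the paper):

```python
def lanchester_aimed_fire(x0, y0, a, b, dt=1e-4, t_max=10.0):
    """Integrate the Lanchester aimed-fire (square-law) equations
    dx/dt = -a*y, dy/dt = -b*x with forward Euler until one side
    is annihilated or t_max is reached."""
    x, y, t = float(x0), float(y0), 0.0
    while x > 0 and y > 0 and t < t_max:
        # simultaneous update: both derivatives use the old state
        x, y = x - a * y * dt, y - b * x * dt
        t += dt
    return x, y

# Square law: b*x**2 - a*y**2 is conserved, so with equal
# effectiveness (a = b = 1) a 100-vs-80 fight ends with the
# larger side holding sqrt(100**2 - 80**2) = 60 survivors.
survivors = lanchester_aimed_fire(100, 80, 1.0, 1.0)
```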
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
A lumped-parameter model represents the frequency dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. Following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models and Advanced lumped-parameter models. (au)
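The simplest lumped-parameter model of the kind described is a parallel spring-dashpot-mass whose dynamic stiffness is S(ω) = K - ω²M + iωC; a minimal sketch with purely illustrative coefficients (the report's actual models are more elaborate):

```python
def dynamic_stiffness(omega, K, C, M):
    """Dynamic stiffness of the simplest lumped-parameter model:
    spring K, dashpot C and mass M acting in parallel,
    S(omega) = K - omega**2 * M + 1j * omega * C.
    At omega = 0 this reduces to the static stiffness K."""
    return K - omega**2 * M + 1j * omega * C

# illustrative values: K in N/m, C in N*s/m, M in kg
S = dynamic_stiffness(10.0, K=1e6, C=2e4, M=500.0)
```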
Croft, Barbara Y.
2002-01-01
Animal models can be used in the study of disease. This chapter discusses imaging animal models to elucidate the process of human disease. The mouse is used as the primary model. Though this choice simplifies many research choices, it necessitates compromises for in vivo imaging. In the future, we can expect improvements in both animal models and imaging techniques.
Andrist, Rafael B.; Haworth, Guy McCrossan
2005-01-01
A reference model of Fallible Endgame Play has been implemented and exercised with the chess-engine WILHELM. Past experiments have demonstrated the value of the model and the robustness of decisions based on it: experiments agree well with a Markov Model theory. Here, the reference model is exercised on the well-known endgame KBBKN.
Generative Models of Disfluency
Miller, Timothy A.
2010-01-01
This thesis describes a generative model for representing disfluent phenomena in human speech. This model makes use of observed syntactic structure present in disfluent speech, and uses a right-corner transform on syntax trees to model this structure in a very natural way. Specifically, the phenomenon of speech repair is modeled by explicitly…
Modelling Railway Interlocking Systems
Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth
2000-01-01
In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotatio...
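The graph representation described here (nodes as track segments, edges as connectivity) can be sketched directly; the station layout, names and the share-a-segment conflict rule below are illustrative assumptions, not taken from the report:

```python
def build_station(edges):
    """Station topology as an undirected graph: nodes are track
    segments, edges denote physical connectivity between them."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    return graph

def route_exists(graph, start, goal):
    """Depth-first search: is there any path of connected segments?"""
    seen, stack = {start}, [start]
    while stack:
        node = stack.pop()
        if node == goal:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

def routes_conflict(route_a, route_b):
    """Two locked routes conflict if they share a track segment."""
    return bool(set(route_a) & set(route_b))

# a tiny Y-shaped station: point P1 joins tracks T1, T2, T3
station = build_station([("T1", "P1"), ("P1", "T2"), ("P1", "T3")])
```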
On Multiobjective Evolution Model
Ahmed, E; Elettreby, M. F.
2004-01-01
Self-Organized Criticality (SOC) phenomena could have a significant effect on the dynamics of ecosystems. The Bak-Sneppen (BS) model is a simple and robust model of biological evolution that exhibits punctuated equilibrium behavior. Here we will introduce a random version of the BS model. We also generalize the single-objective BS model to a multiobjective one.
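The BS update rule is simple enough to state in a few lines: at each step, the least-fit species and its two lattice neighbours receive new random fitnesses. A minimal single-objective sketch (ring topology and all parameters are illustrative):

```python
import random

def bak_sneppen(n_sites=100, steps=10000, seed=1):
    """Minimal Bak-Sneppen model on a ring of n_sites species.
    Each step: find the species with the lowest fitness and replace
    it and its two neighbours with fresh uniform random fitnesses.
    The system self-organizes so that most fitnesses end up above a
    critical threshold (about 2/3 in 1-D)."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_sites)]
    for _ in range(steps):
        i = min(range(n_sites), key=fitness.__getitem__)
        # i - 1 wraps via Python's negative indexing; i + 1 via modulo
        for j in (i - 1, i, (i + 1) % n_sites):
            fitness[j] = rng.random()
    return fitness
```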
2015-09-01
The Biomass Scenario Model (BSM) is a unique, carefully validated, state-of-the-art dynamic model of the domestic biofuels supply chain which explicitly focuses on policy issues, their feasibility, and potential side effects. It integrates resource availability, physical/technological/economic constraints, behavior, and policy. The model uses a system dynamics simulation (not optimization) to model dynamic interactions across the supply chain.
Masonry behavior and modelling
Angelillo, Maurizio; Lourenço, Paulo B.; Milani, G.
2014-01-01
In this Chapter we present the basic experimental facts on masonry materials and introduce simple and refined models for masonry. The simple models are essentially macroscopic and based on the assumption that the material is incapable of sustaining tensile loads (No-Tension assumption). The refined models account for the microscopic structure of masonry, modeling the interaction between the blocks and the interfaces.
Numerical Modelling of Streams
Vestergaard, Kristian
In recent years there has been a sharp increase in the use of numerical water quality models. Numerical water quality modelling can be divided into three steps: hydrodynamic modelling for the determination of stream flow and water levels; modelling of transport and dispersion of a conservative...
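The transport-and-dispersion step for a conservative solute can be sketched with an explicit upwind/central finite-difference scheme on a 1-D reach; this is a generic textbook illustration, not the scheme used in the work above, and all parameters are illustrative:

```python
def advect_disperse(c, u, D, dx, dt, steps):
    """Explicit 1-D advection-dispersion for a conservative solute:
    upwind differencing for advection (velocity u), central
    differencing for dispersion (coefficient D), zero-gradient ends.
    Stability requires u*dt/dx <= 1 and D*dt/dx**2 <= 0.5."""
    c = list(c)
    n = len(c)
    for _ in range(steps):
        prev = c[:]
        for i in range(n):
            left = prev[i - 1] if i > 0 else prev[0]
            right = prev[i + 1] if i < n - 1 else prev[-1]
            adv = -u * (prev[i] - left) / dx          # upwind advection
            disp = D * (right - 2 * prev[i] + left) / dx**2
            c[i] = prev[i] + dt * (adv + disp)
    return c

# a unit pulse in mid-reach is advected downstream and spreads out
profile = [0.0] * 50
profile[25] = 1.0
out = advect_disperse(profile, u=0.5, D=0.1, dx=1.0, dt=0.5, steps=20)
```

While the pulse stays away from the boundaries, the fluxes telescope and total mass is conserved, which is a useful sanity check on any such scheme.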
Gernaey, Krist; Sin, Gürkan
2008-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including… practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise…
Gernaey, Krist; Sin, Gürkan
2011-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including… WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise…
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid
Sen S; Moha N.; Baudry B.; Jezequel J.-M.
2009-01-01
Large and complex meta-models such as those of UML and its profiles are growing due to the modelling and inter-operability needs of numerous stakeholders. The complexity of such meta-models has led to the coining of the term meta-muddle. Individual users often exercise only a small view of a meta-muddle for tasks ranging from model creation to the construction of model transformations. What is the effective meta-model that represents this view? We present a flexible meta-model p...
Conceptual Model for Communication
Fedaghi, Sabah Al; Fadel, Zahraa
2009-01-01
A variety of idealized models of communication systems exist, and all may have something in common. Starting with Shannon's communication model and ending with the OSI model, this paper presents progressively more advanced forms of modeling of communication systems by tying communication models together based on the notion of flow. The basic communication process is divided into different spheres (sources, channels, and destinations), each with its own five interior stages: receiving, processing, creating, releasing, and transferring of information. The flow of information is ontologically distinguished from the flow of physical signals; accordingly, Shannon's model, network-based OSI models, and TCP/IP are redesigned.
Widera, Paweł
2011-01-01
The process of comparison of computer generated protein structural models is an important element of protein structure prediction. It has many uses including model quality evaluation, selection of the final models from a large set of candidates or optimisation of parameters of energy functions used in template free modelling and refinement. Although many protein comparison methods are available online on numerous web servers, their ability to handle a large scale model comparison is often very limited. Most of the servers offer only a single pairwise structural comparison, and they usually do not provide a model-specific comparison with a fixed alignment between the models. To bridge the gap between the protein and model structure comparison we have developed the Protein Models Comparator (pm-cmp). To be able to deliver the scalability on demand and handle large comparison experiments the pm-cmp was implemented "in the cloud". Protein Models Comparator is a scalable web application for a fast distributed comp...
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these… the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
Visualizing Risk Prediction Models
Vanya Van Belle; Ben Van Calster
2015-01-01
Objective: Risk prediction models can assist clinicians in making decisions. To boost the uptake of these models in clinical practice, it is important that end-users understand how a model works and can efficiently communicate its results. We introduce novel methods for interpretable model visualization. Methods: The proposed visualization techniques are applied to two prediction models from the Framingham Heart Study for the prediction of intermittent claudication and stroke after atrial fib...
William Poole
2006-01-01
Most monetary economists today conduct their analysis within some version of a rational expectations model. A well-defined equilibrium in such a model requires that the private sector understand policy goals and the policymakers' model of the economy. An austere version of the model, with no information asymmetries, is valid only to a first approximation but nevertheless provides core insights to short- and long-run monetary policy. In this model, effective policy requires clarity of policy g...
Slavik Stefan; Bednar Richard
2014-01-01
The term business model has been used in practice for a few years, but companies have created, defined and innovated their models subconsciously since the start of business. Our paper aims to clarify the theory of the business model, hence the definition and all the components that form each business. In the second part, we create an analytical tool and analyze real business models in Slovakia, defining the characteristics of each part of the business model, i.e., customers, distribution, value, resour...
Inference for Multiplicative Models
Wexler, Ydo; Meek, Christopher
2012-01-01
The paper introduces a generalization of known probabilistic models such as log-linear and graphical models, called here multiplicative models. These models, which express probabilities via products of parameters, are shown to capture multiple forms of contextual independence between variables, including decision graphs and noisy-OR functions. An inference algorithm for multiplicative models is provided and its correctness is proved. The complexity analysis of the inference algorithm uses a mor...
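As an example of the product form, the noisy-OR function mentioned in the abstract expresses P(Y=1 | parents) as one minus a product of per-parent parameters. A minimal sketch (the leak term is a common extension, included here as an assumption):

```python
def noisy_or(leak, params, x):
    """Noisy-OR conditional probability:
    P(Y=1 | x) = 1 - (1 - leak) * prod((1 - p_i) ** x_i)
    where params are per-parent activation probabilities and
    x are the binary parent states."""
    prob_off = 1.0 - leak
    for p, xi in zip(params, x):
        if xi:
            prob_off *= 1.0 - p
    return 1.0 - prob_off

# two active parents with activation probs 0.8 and 0.5, no leak:
# P(Y=1) = 1 - 0.2 * 0.5 = 0.9
p = noisy_or(0.0, [0.8, 0.5], [1, 1])
```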
Wortelboer FG
1994-01-01
This report contains descriptions of the models currently used within the National Institute of Public Health and Environmental Protection (RIVM). Each model description contains the following entries: name of the model, contact in RIVM, purpose, policy theme, technical specifications, status, availability, and documentation. In addition, the report contains a list of the models grouped by laboratory, a list of the models grouped by theme, and an index. The purpose of this report is both to give ...
An enhanced communication model
Flensburg, Per
2010-01-01
The concept of information is often more or less taken for granted in research about information systems. This paper introduces a model that starts with the Shannon and Weaver data transmission model and ends with knowledge transfer between individual persons. The model is in fact an enhanced communication model that provides a framework for discussing problems in the communication process. A specific feature of the model is its aim to provide guidelines for designing the communication process. Th...
Model Driven Language Engineering
Patrascoiu, Octavian
2005-01-01
Modeling is a most important exercise in software engineering and development, and one of the current practices is object-oriented (OO) modeling. The Object Management Group (OMG) has defined a standard object-oriented modeling language, the Unified Modeling Language (UML). The OMG is not only interested in modeling languages; its primary aim is to enable easy integration of software systems and components using vendor-neutral technologies. This thesis investigates the possibilities for designi...
Langseth, Helge; Nielsen, Thomas Dyhre
2005-01-01
parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the NB model with a mixture of factor analyzers, thereby relaxing the assumptions of … classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers.
Fundamentals of Friction Modeling
Al-Bender, Farid
2010-01-01
This communication presents an overview of friction model-building, which starts from the generic mechanisms behind friction to construct models that simulate observed macroscopic friction behavior. First, basic friction properties are presented. Then, the generic friction model is outlined. Hereafter, simple heuristic/empirical models are discussed, which are suitable for quick simulation and control purposes. An example of these is the Generalized Maxwell-Slip model.
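The Generalized Maxwell-Slip model cited in the abstract above is built from parallel elasto-slip elements; a minimal sketch of that building block follows. The element count, stiffnesses, and slip thresholds are invented for illustration, and this is only the simple Maxwell-slip core, not the full GMS formulation.

```python
# Several elasto-slip elements in parallel: each element "sticks"
# (deflects elastically with the motion) until its deflection saturates
# at its slip threshold, after which it "slips" at constant deflection.
def step_element(z, dx, z_max):
    """Advance one element's deflection z by displacement increment dx,
    saturating at +/- z_max (the slip threshold)."""
    return max(-z_max, min(z_max, z + dx))

def friction_force(zs, ks):
    """Total friction force is the sum of each element's elastic force."""
    return sum(k * z for k, z in zip(ks, zs))

ks = [100.0, 50.0, 25.0]           # element stiffnesses (illustrative)
z_maxs = [0.001, 0.002, 0.004]     # slip thresholds (illustrative)
zs = [0.0, 0.0, 0.0]
for _ in range(100):               # ramp displacement in 1e-4 increments
    zs = [step_element(z, 1e-4, zm) for z, zm in zip(zs, z_maxs)]
force = friction_force(zs, ks)     # fully developed (Coulomb-like) force
```

Because elements saturate at different deflections, the force rises smoothly with displacement before levelling off, reproducing the presliding-to-sliding transition that such heuristic models are used for.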
Papamakarios, George
2015-01-01
Top-performing machine learning systems, such as deep neural networks, large ensembles and complex probabilistic graphical models, can be expensive to store, slow to evaluate and hard to integrate into larger systems. Ideally, we would like to replace such cumbersome models with simpler models that perform equally well. In this thesis, we study knowledge distillation, the idea of extracting the knowledge contained in a complex model and injecting it into a more convenient model. We present a ...
Tahir Abdullah; Shahbaz Nazeer
2012-01-01
Software architecture design and requirements engineering are core and independent areas of engineering. A great deal of research, education and practice is devoted to requirements elicitation and its refinement, yet it remains a major issue in engineering. There is a huge gap between the two areas of software architecture and requirements engineering, and the QSMSR model acts as a bridge between requirements and design. The QSMSR model is divided into two sub-models, a qualitative model and a principal model, in this resear...
Bubble models, data acquisition and model applicability
Jebavá, Marcela; Kloužek, Jaroslav; Němec, Lubomír
Vsetín : GLASS SERVICE ,INC, 2005, s. 182-191. ISBN 80-239-4687-0. [International Seminar on Mathematical Modeling and Advanced Numerical Methods in Furnace Design and Operation /8./. Velké Karlovice (CZ), 19.05.2005-20.05.2005] Institutional research plan: CEZ:AV0Z40320502 Keywords : bubble models Subject RIV: CA - Inorganic Chemistry
Standard Model Masses and Models of Nuclei
Rivero, Alejandro
2003-01-01
We note an intriguing coincidence in nuclear levels, that the subshells responsible for doubly magic numbers happen to bracket nuclei at the energies of the Standard Model bosons. This could show that these bosons actually contribute to the effective mesons of nuclear models.
Geochemistry Model Validation Report: External Accumulation Model
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation
Pavement Aging Model by Response Surface Modeling
Manzano-Ramírez A.
2011-10-01
In this work, surface course aging was modeled by Response Surface Methodology (RSM). The Marshall specimens were placed in a conventional oven for time and temperature conditions established on the basis of the environmental factors of the region where the surface course is constructed with AC-20 from the Ing. Antonio M. Amor refinery. Volatilized material (VM), load resistance increment (ΔL) and flow resistance increment (ΔF) models were developed by the RSM. Cylindrical specimens with real aging were extracted from the surface course pilot to evaluate the error of the models. The VM model was adequate; in contrast, the ΔL and ΔF models were almost adequate, with an error of 20% that was associated with the other environmental factors, which were not considered at the beginning of the research.
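The RSM fitting step described in the abstract above amounts to least-squares estimation of a second-order polynomial surface in the two factors (aging time and temperature). The sketch below illustrates that step only; the factor levels, coefficients and data are synthetic, not the study's measurements.

```python
# Fit a quadratic response surface y = b0 + b1*t + b2*T + b3*t*T
# + b4*t^2 + b5*T^2 over time (t) and temperature (T) by least squares.
import numpy as np

def fit_quadratic_surface(t, T, y):
    X = np.column_stack([np.ones_like(t), t, T, t * T, t**2, T**2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict(coeffs, t, T):
    return (coeffs[0] + coeffs[1]*t + coeffs[2]*T
            + coeffs[3]*t*T + coeffs[4]*t**2 + coeffs[5]*T**2)

# Synthetic 3x3 design: three aging times crossed with three oven
# temperatures; the response grows with both factors.
t = np.repeat([1.0, 2.0, 3.0], 3)
T = np.tile([60.0, 70.0, 80.0], 3)
y = 0.5 + 0.3*t + 0.01*T + 0.002*t*T
b = fit_quadratic_surface(t, T, y)
```

With a full factorial design like this, the fitted surface can then be evaluated at intermediate conditions, which is how RSM models are used to interpolate aging behaviour between tested levels.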
Model Validation Status Review
E.L. Hardin
2001-11-28
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Cameron, Ian T.; Gani, Rafiqul
This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety … to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.
Modeling extragalactic bowshocks. I. The model.
Ferruit, P.; Binette, L.; Sutherland, R. S.; Pecontal, E.
1997-06-01
To probe the effects of the nuclear activity on the host galaxy, it is essential to disentangle the relative contribution of shock excitation from that of photoionization. One milestone towards this goal is the ability to model the bowshock structures created by the interaction of radio ejecta with their surrounding medium. We have built a bowshock model based on TDA's one (Taylor, Dyson & Axon, 1992MNRAS.255..351T) which was itself derived from an earlier work on Herbig-Haro objects. Since TDA's original model supplied the astronomers with only [OIII]λ5007 fluxes and profiles for various models of bowshocks, we undertook to include magnetic fields and to incorporate all of the atomic data tables of the code Mappings Ic for the computation of ionization states, cooling rates and line emissivities of the gas. This new model allows us to map line ratios and profiles of extragalactic bowshocks for all major lines of astrophysical interest. In this first paper, we present our model, analyse the gas behavior along the bowshock and give some examples of model results.
蒋娜; 谢有琪
2012-01-01
With the development of human society, the social hub has enlarged beyond the single community, to the extent that the world as a whole is deemed a community. Communication therefore plays an increasingly important role in our daily life. As a consequence, a communication model, or rather its definition, is not so much a definition as a guide in communication. However, some existing communication models are not as practical as they once were. This paper tries to make an overall contrast among three communication models, the Coded Model, the Gable Communication Model and the Ostensive-Inferential Model, to see how they assist people to comprehend verbal and non-verbal communication.
Towards Approximate Model Transformations
Troya, Javier; Wimmer, Manuel; Vallecillo, Antonio; Burgueño, Loli
2014-01-01
As the size and complexity of models grow, there is a need to count on novel mechanisms and tools for transforming them. This is required, e.g., when model transformations need to provide target models without having access to the complete source models or in really short time—as it happens, e.g., with streaming models—or with very large models for which the transformation algorithms become too slow to be of practical use if the complete population of a model is investigated. In this pa...
Luiz Carlos Bresser-Pereira
2012-03-01
Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classification that has a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.
Microsoft tabular modeling cookbook
Braak, Paul te
2013-01-01
This book follows a cookbook style with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models. This book is designed for developers who wish to develop powerful and dynamic models for users as well as those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling.
LI Zhi-jia; YAO Cheng; KONG Xiang-guang
2005-01-01
To improve the Xinanjiang model, runoff generation from infiltration excess is added to the model. Another six parameters are added to the Xinanjiang model. In principle, the improved Xinanjiang model can be used to simulate runoff in humid, semi-humid and also semi-arid regions. The application to the Yi River shows that the improved Xinanjiang model can forecast discharge with higher accuracy and satisfy practical requirements. It also shows that the improved model is reasonable.
Hansen, Mads Fogtmann; Fagertun, Jens; Larsen, Rasmus
2011-01-01
This paper presents a fusion of the active appearance model (AAM) and the Riemannian elasticity framework which yields a non-linear shape model and a linear texture model – the active elastic appearance model (EAM). The non-linear elasticity shape model is more flexible than the usual linear subspace model, and it is therefore able to capture more complex shape variations. Local rotation and translation invariance are the primary explanation for the additional flexibility. In addition, we introduce global scale invariance into the Riemannian elasticity framework which together with the local...
Flexible survival regression modelling
Cortese, Giuliana; Scheike, Thomas H; Martinussen, Torben
2009-01-01
Regression analysis of survival data, and more generally event history data, is typically based on Cox's regression model. We here review some recent methodology, focusing on the limitations of Cox's regression model. The key limitation is that the model is not well suited to represent time-varying effects. The introduced models are all applied to data on breast cancer from the Norwegian cancer registry, and these analyses clearly reveal the shortcomings of Cox's regression model and the need for other supplementary analyses with models such as those we present here.
Reiter, E.R.
1980-01-01
A highly sophisticated and accurate approach is described to compute, on an hourly or daily basis, the energy consumption for space heating by individual buildings, urban sectors, and whole cities. The need for models, and specifically for weather-sensitive models, composite models, and space-heating models, is discussed. Development of the Colorado State University Model, based on heat-transfer equations and on a heuristic, adaptive, self-organizing computational learning approach, is described. Results of modeling energy consumption by the cities of Minneapolis and Cheyenne are given. Some data on energy consumption in individual buildings are included.
Modelling farmers' labour supply in CGE models
Gaasland, Ivar
2008-01-01
In most CGE models with a special focus on farm policy, the on-farm wage either follows the ordinary wage in the economy or varies according to an assumption of sector-specific farm labour. This paper demonstrates a practical and empirically consistent way to model farm household allocation of labour in CGE models, assuming that farm labour is partially sector-specific. In this setup, preferences for farming and the relative wage between on-farm and off-farm work determine the allocation...
Modeling Guru: Knowledge Base for NASA Modelers
Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.
2009-05-01
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by the NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the
Empirical Model Building Data, Models, and Reality
Thompson, James R
2011-01-01
Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m
Major Differences between the Jerome Model and the Horace Model
朱艳
2014-01-01
There are three famous translation models in the field of translation: the Jerome model, the Horace model and the Schleiermacher model. The production and development of the three models have had a significant influence on translation. To find the major differences between two western classical translation theoretical models, we discuss the Jerome model and the Horace model in depth in this paper.
D.W. Wu; A.J. Smith
2004-11-08
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Nonlinear Modeling by Assembling Piecewise Linear Models
Yao, Weigang; Liou, Meng-Sing
2013-01-01
To preserve nonlinearity of a full order system over a parameters range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model is robust and accurate for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves nonlinearity of the problems considered in a rather simple and accurate manner.
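The assembly idea described in the abstract above — local first-order Taylor models blended with radial-basis-function weights — can be sketched on a toy 1-D function. The centers, Gaussian width, and the function f(x) = x² are illustrative assumptions; the paper applies this to full flow solutions, not scalars.

```python
# Blend local linear models, each expanded about a sampling state,
# with Gaussian radial-basis-function weights.
import math

def rbf_blend(x, centers, local_models, width=1.0):
    """local_models[i] = (a_i, b_i): the local linear model a_i + b_i*(x - c_i)
    expanded about center c_i (zeroth- and first-order Taylor terms)."""
    weights = [math.exp(-((x - c) / width) ** 2) for c in centers]
    total = sum(weights)
    return sum(w * (a + b * (x - c))
               for w, c, (a, b) in zip(weights, centers, local_models)) / total

# Local linearizations of f(x) = x**2 about the sampling states 0, 1, 2:
centers = [0.0, 1.0, 2.0]
models = [(c * c, 2 * c) for c in centers]   # (f(c), f'(c)) at each center
approx = rbf_blend(1.5, centers, models)     # blended estimate of f(1.5)
```

Between sampling states the nearby tangent-line models dominate the weighted sum, so the assembly tracks the curvature that any single linear model would miss.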
Modeling agriculture in the Community Land Model
B. Drewniak; Song, J.(Pusan National University, Pusan, South Korea); Prell, J.; Kotamarthi, V. R.; Jacob, R.
2012-01-01
The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types – maize, soybean, and spring wheat – into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the...
Modeling agriculture in the Community Land Model
B. Drewniak; Song, J.(Pusan National University, Pusan, South Korea); Prell, J.; Kotamarthi, V. R.; Jacob, R.
2013-01-01
The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types – maize, soybean, and spring wheat – into the coupled carbon–nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the ne...
OPEC model : adjustment or new model
Since the early eighties, the international oil industry has gone through major changes: new financial markets, reintegration, opening of the upstream, liberalization of investments, privatization. This article provides answers to two major questions: What are the reasons for these changes? Do these changes announce the replacement of the OPEC model by a new model in which state intervention is weaker and national companies are more autonomous? This would imply a profound change in the political and institutional systems of oil producing countries. (Author)
Solid Waste Projection Model: Model user's guide
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab
A dispersion model to be used off coastal waters has been developed. The model has been applied to describe the migration of radionuclides in the Baltic Sea. A summary of the results is presented here. (K.A.E)
Modeling Infectious Diseases Fact Sheet: Using computers to prepare ... Predicting the potential spread of an infectious disease requires much more than simply connecting cities on ...
National Aeronautics and Space Administration — The Galactic model is a spatial and spectral template. The model for the Galactic diffuse emission was developed using spectral line surveys of HI and CO (as a...
Modeling EERE deployment programs
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Laboratory of Biological Modeling
Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to...
Højsgaard, Søren; Edwards, David; Lauritzen, Steffen
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...
Modeling in Chemical Engineering
Jaap van Brakel
2000-10-01
Models underlying the use of similarity considerations, dimensionless numbers, and dimensional analysis in chemical engineering are discussed. Special attention is given to the many levels at which models and ceteris paribus conditions play a role and to the modeling of initial and boundary conditions. It is shown that both the laws or dimensionless number correlations and the systems to which they apply are models. More generally, no matter which model or description one picks out, what is being modeled is itself a model of something else. Instead of saying that the artifact S models the given B, it is therefore better to say that S and B jointly make up B and S.
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...
Chip Multithreaded Consistency Model
Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang
2008-01-01
Multithreading is the development trend of high-performance processors. The memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. A chip multithreaded consistency model adapted to multithreaded processors is proposed in this paper. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. With the idea of the critical cycle built by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the criterion of correct execution of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving high performance compared with the sequential consistency model and ensures software compatibility: the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. The implementation strategy of the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed. The Godson-2 SMT processor supports the chip multithreaded consistency model correctly by an exception scheme based on the sequential memory access queue of each thread.
Bennett, Joan
1998-01-01
Recommends the use of a model of DNA made out of Velcro to help students visualize the steps of DNA replication. Includes a materials list, construction directions, and details of the demonstration using the model parts. (DDR)
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of a series of UML diagrams originating from abstract models implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Oleg Svatos
2013-01-01
In this paper we analyze the complexity of the time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support capturing of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given the unsatisfying results of the contemporary process modeling languages, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, the PSD process modeling language is presented, which supports the defined lifecycles of a time limit natively and therefore keeps the models simple and easy to understand.
Lichtenberg, Jakob; Hansen, Michael Reichhardt; Rischel, Hans
This paper presents a solution to the Invoicing case study using the Standard ML programming language for modelling.
Riis, Troels; Jørgensen, John Leif
1999-01-01
This document describes a test of the implementation of the ASC orbit model for the Champ satellite.
Multivariate volatility models
Fengler, Matthias R.; Herwartz, Helmut
2001-01-01
Multivariate Volatility Models belong to the class of nonlinear models for financial data. Here we want to focus on multivariate GARCH models. These models assume that the variance of the innovation distribution follows a time dependent process conditional on information which is generated by the history of the process. In this chapter we demonstrate how to use the bigarch quantlet of XploRe to estimate the conditional covariance of a bivariate (high frequency) return process. In particular w...
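The bivariate conditional covariance described in this abstract can be sketched, under simplifying assumptions, as a constant-conditional-correlation (CCC) GARCH(1,1) recursion. The abstract's bigarch quantlet estimates the parameters; the sketch below instead takes illustrative parameter values (omega, alpha, beta, rho are all hypothetical) and only filters the covariance path:

```python
import numpy as np

def ccc_garch_cov(returns, omega, alpha, beta, rho):
    """Conditional covariance path of a bivariate CCC-GARCH(1,1).

    returns: (T, 2) array of returns; omega, alpha, beta: length-2
    parameter arrays (hypothetical values, not estimates); rho: the
    constant conditional correlation between the two series.
    """
    T = returns.shape[0]
    h = np.empty((T, 2))            # conditional variances
    h[0] = returns.var(axis=0)      # initialise at the sample variance
    for t in range(1, T):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    cov = rho * np.sqrt(h[:, 0] * h[:, 1])  # conditional covariance
    return h, cov

rng = np.random.default_rng(0)
r = rng.standard_normal((500, 2)) * 0.01    # toy return series
h, cov = ccc_garch_cov(r, omega=np.array([1e-6, 1e-6]),
                       alpha=np.array([0.05, 0.05]),
                       beta=np.array([0.9, 0.9]), rho=0.3)
```

A full bivariate GARCH (e.g. BEKK) would also let the covariance dynamics feed back on themselves; the CCC form is the simplest member of the family.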
STRATEGY PATTERNS PREDICTION MODEL
Aram Baruch Gonzalez Perez; Jorge Adolfo Ramirez Uresti
2014-01-01
Multi-agent systems are broadly known for being able to simulate real-life situations which require the interaction and cooperation of individuals. Opponent modeling can be used along with multi-agent systems to model complex situations such as competitions like soccer games. In this study, a model for predicting opponent moves based on their target is presented. The model is composed of an offline step (learning phase) and an online one (execution phase). The offline step gets and analyses p...
Avionics Architecture Modelling Language
Alana, Elena; Naranjo, Hector; Valencia, Raul; Medina, Alberto; Honvault, Christophe; Rugina, Ana; Panunzia, Marco; Dellandrea, Brice; Garcia, Gerald
2014-08-01
This paper presents the ESA AAML (Avionics Architecture Modelling Language) study, which aimed at advancing the avionics engineering practices towards a model-based approach by (i) identifying and prioritising the avionics-relevant analyses, (ii) specifying the modelling language features necessary to support the identified analyses, and (iii) recommending/prototyping software tooling to demonstrate the automation of the selected analyses based on a modelling language and compliant with the defined specification.
Modelling Retail Floorspace Productivity
Thurik, Roy; Kooiman, P.
1986-01-01
This research note presents a "switching regime" model to investigate the impact of environmental factors on the floorspace productivity of individual retail stores. The model includes independent supply and demand functions, which are incorporated within a sales maximizing framework. Unlike previous models, the switching approach allows the model to determine first whether sales are determined by demand or supply side constraints. The appropriate regime is then chosen to estimate spa...
Tashiro, Tohru
2013-01-01
We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we utilize the model to fit the iPod sales data, and a better agreement is obtained than with the Bass model.
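The classic Bass model that this memory model extends can be sketched as a forward-Euler integration of dN/dt = (p + q·N/m)(m − N); the parameter values below are illustrative textbook-style values, not fitted to the iPod data:

```python
import numpy as np

def bass_adopters(p, q, m, T, dt=1.0):
    """Cumulative adopters N(t) under the classic Bass diffusion model,
    dN/dt = (p + q*N/m) * (m - N), integrated with a forward-Euler step.
    p: innovation coefficient, q: imitation coefficient, m: market
    potential (all values here are illustrative)."""
    N = np.zeros(T)
    for t in range(1, T):
        N[t] = N[t - 1] + dt * (p + q * N[t - 1] / m) * (m - N[t - 1])
    return N

N = bass_adopters(p=0.03, q=0.38, m=1000.0, T=60)
```

The memory effect proposed in the abstract would make the adoption rate depend on a non-adopter's accumulated past contacts, which the memoryless Bass equation above cannot express.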
The purpose of this Model Report is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Office of Repository Development (ORD). The UZ contains the unsaturated rock layers overlying the repository and host unit, which constitute a natural barrier to flow, and the unsaturated rock layers below the repository which constitute a natural barrier to flow and transport. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.10.8 [under Work Package (WP) AUZM06, Climate Infiltration and Flow], and Section I-1-1 [in Attachment I, Model Validation Plans]). In Section 4.2, four acceptance criteria (ACs) are identified for acceptance of this Model Report; only one of these (Section 4.2.1.3.6.3, AC 3) was identified in the TWP (BSC 2002 [160819], Table 3-1). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, and drift-scale and mountain-scale coupled-process models from the UZ Flow, Transport and Coupled Processes Department in the Natural Systems Subproject of the Performance Assessment (PA) Project. The Calibrated Properties Model output will also be used by the Engineered Barrier System Department in the Engineering Systems Subproject. The Calibrated Properties Model provides input through the UZ Model and other process models of natural and engineered systems to the Total System Performance Assessment (TSPA) models, in accord with the PA Strategy and Scope in the PA Project of the Bechtel SAIC Company, LLC (BSC). The UZ process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions. UZ flow is a TSPA model component
Alexander Fedorov
2011-01-01
The author supposed that media education models can be divided into the following groups: (1) educational-information models (the study of the theory, history, language of media culture, etc.), based on the cultural, aesthetic, semiotic, socio-cultural theories of media education; (2) educational-ethical models (the study of moral, religious, philosophical problems relying on the ethic, religious, ideological, ecological, protectionist theories of media education); (3) pragmatic models (practical media...
Validation of simulation models
Rehman, Muniza; Pedersen, Stig Andur
2012-01-01
In philosophy of science, the interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety of...... models has been somewhat narrow-minded reducing the notion of validation to establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation....
Modeling Digital Video Database
Anonymous
2001-01-01
The main purpose of the model is to present how the Unified Modeling Language (UML) can be used for modeling a digital video database system (VDBS). It demonstrates the modeling process that can be followed during the analysis phase of complex applications. In order to guarantee the continuity mapping of the models, the authors propose some suggestions to transform the use case diagrams into an object diagram, which is one of the main diagrams for the next development phases.
Artificial neural network modelling
Samarasinghe, Sandhya
2016-01-01
This book covers theoretical aspects as well as recent innovative applications of Artificial Neural Networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity 2) Advances in Modelling Biological and Environmental Systems and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling.
Anonymous
2002-01-01
In order to set up a conceptual data model that reflects the real world as accurately as possible, this paper firstly reviews and analyzes the disadvantages of previous conceptual data models used by traditional GIS in simulating geographic space, gives a new explanation of geographic space and analyzes its various essential characteristics. Finally, this paper proposes several detailed key points for designing a new type of GIS data model and gives a simple holistic GIS data model.
Narayanasamy, Viknashvaran; Wong, Kok Wai; Rai, Shri; Chiou, Andrew
2010-01-01
This paper looks at the game design and engineering approach to model the game design. The game modeling framework discussed in this paper could be a systematic alternative for implementing in the game engine architecture. The suggested game modeling framework incorporates structural game component, temporal game component and boundary game component frameworks. It is suitable to model most complex games and game engines.
Metabolic Model Generalization
Zhukova, Anna
2013-01-01
Genome-scale metabolic models for new organisms include thousands of reactions that are generated automatically: by inferring them from databases of reactions and pathways, existing models for similar organisms, etc. This process includes several iterations of the draft model analysis, error detection, and improvement; starting from more general issues and going deeper into details. Especially in the first iterations model evaluation by a human expert is important. B...
Vépa, Éric; Bézivin, Jean; Brunelière, Hugo; Jouault, Frédéric
2006-01-01
We first present a model repository that has been built as part of the open source Eclipse GMT/AM3 project (Generative Modeling Technology/ATLAS MegaModel Management). Several contributed artifacts present in this repository are organized into sets of models of similar nature called zoos. The structure of the repository will be rapidly described. Its content is very rapidly extending, providing a publicly available source of experimental data to evaluate real life se...
Ravikumar, Pradeep; Lafferty, John; Liu, Han; Wasserman, Larry
2007-01-01
We present a new class of methods for high-dimensional nonparametric regression and classification called sparse additive models (SpAM). Our methods combine ideas from sparse linear modeling and additive nonparametric regression. We derive an algorithm for fitting the models that is practical and effective even when the number of covariates is larger than the sample size. SpAM is closely related to the COSSO model of Lin and Zhang (2006), but decouples smoothing and sparsity, enabling the use...
Thoft-Christensen, Palle
Modelling of corrosion cracking of reinforced concrete structures is complicated as a great number of uncertain factors are involved. To get a reliable modelling a physical and mechanical understanding of the process behind corrosion is needed.
Tashiro, Tohru
2014-03-01
We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we utilize the model to fit the iPod sales data, and a better agreement is obtained than with the Bass model.
Gamayun, I. P.; Cherednichenko, O. Yu.
2015-01-01
The handbook contains the fundamentals of modeling of complex systems. The classification of mathematical models is presented and the methods of their construction are given. The analytical modeling of the basic types of processes in complex systems is considered. The principles of simulation, statistical and business process modeling are described. The handbook is oriented toward students of higher education establishments obtaining a degree in “Software engineering” and ...
Mathematical circulatory system model
Lakin, William D. (Inventor); Stevens, Scott A. (Inventor)
2010-01-01
A system and method of modeling a circulatory system including a regulatory mechanism parameter. In one embodiment, a regulatory mechanism parameter in a lumped parameter model is represented as a logistic function. In another embodiment, the circulatory system model includes a compliant vessel, the model having a parameter representing a change in pressure due to contraction of smooth muscles of a wall of the vessel.
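The logistic representation of a regulatory-mechanism parameter mentioned in the abstract can be sketched as follows; the function name and parameters (set-point pressure, gain bounds, slope) are illustrative assumptions, not the patent's actual formulation:

```python
import math

def regulatory_gain(pressure, p_mid, slope, g_min, g_max):
    """Logistic representation of a regulatory-mechanism parameter in
    a lumped-parameter circulatory model: the gain saturates between
    g_min and g_max around a set-point pressure p_mid, with `slope`
    controlling how sharply regulation switches on. All names and
    values here are hypothetical."""
    return g_min + (g_max - g_min) / (1.0 + math.exp(-slope * (pressure - p_mid)))
```

The appeal of the logistic form is that it is smooth and bounded, so the regulated parameter can never leave its physiological range no matter how far pressure excursions go.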
TAKEDA, Hideaki; Veerkamp, Paul; Yoshikawa, Hiroyuki
1990-01-01
This article discusses building a computable design process model, which is a prerequisite for realizing intelligent computer-aided design systems. First, we introduce general design theory, from which a descriptive model of design processes is derived. In this model, the concept of metamodels plays a crucial role in describing the evolutionary nature of design. Second, we show a cognitive design process model obtained by observing design processes using a protocol analysis method. We then di...
Bootstrapping Macroeconometric Models
2001-01-01
This paper outlines a bootstrapping approach to the estimation and analysis of macroeconometric models. It integrates for dynamic, nonlinear, simultaneous equation models the bootstrapping approach to evaluating estimators initiated by Efron (1979) and the stochastic simulation approach to evaluating models' properties initiated by Adelman and Adelman (1959). It also estimates for a particular model the gain in coverage accuracy from using bootstrap confidence intervals over asymptotic confid...
Transformation survival models
Yulia Marchenko
2014-01-01
The Cox proportional hazards model is one of the most popular methods for analyzing survival or failure-time data. The key assumption underlying the Cox model is that of proportional hazards. This assumption may often be violated in practice. Transformation survival models extend the Cox regression methodology to allow for nonproportional hazards. They represent the class of semiparametric linear transformation models, which relates an unknown transformation of the survival time linearly to c...
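The proportional-hazards special case that transformation survival models generalise can be written as S(t|x) = S0(t)^exp(xβ); a minimal sketch, with an assumed exponential baseline survival:

```python
import numpy as np

def ph_survival(base_surv, x, beta):
    """Survival curve under proportional hazards, the special case that
    transformation survival models relax: S(t | x) = S0(t) ** exp(x @ beta).
    The covariate and coefficient values used below are illustrative."""
    return base_surv ** np.exp(np.dot(x, beta))

t = np.linspace(0.0, 5.0, 50)
s0 = np.exp(-0.3 * t)                     # assumed exponential baseline
s = ph_survival(s0, x=np.array([1.0]), beta=np.array([0.7]))
```

Transformation models replace the fixed log(-log) link implicit in this formula with an unknown monotone transformation, which is what allows hazards to be nonproportional.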
Lakmeche Abdelkader
2015-01-01
A model of prion diseases with impulse effects is studied in this work. First we transform the model into a system of three differential equations with impulse effects in order to study the stability of the periodic solution. After that we study the general model by means of evolution semigroups in order to find conditions for the existence of a mild solution.
Modeling of ultrasound transducers
Bæk, David
This Ph.D. dissertation addresses ultrasound transducer modeling for medical ultrasound imaging and combines the modeling with the ultrasound simulation program Field II. The project firstly presents two new models for spatial impulse responses (SIRs) to a rectangular elevation focused transducer...
The main purpose of the model is a more detailed description of the radionuclide transfer in food chains, including the dynamics in the early period after accidental release. Detailed modelling of the dynamics of radioactive depositions is beyond the purpose of the model. Standard procedures are used for assessing inhalation and external doses. 3 figs, 2 tabs
Gudiksen, Sune Klok; Poulsen, Søren Bolvig; Buur, Jacob
2014-01-01
Well-established companies are currently struggling to secure profits due to the pressure from new players' business models as they take advantage of communication technology and new business-model configurations. Because of this, the business model research field flourishes currently; however, the...
Bogiages, Christopher A.; Lotter, Christine
2011-01-01
In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…
Bergdahl, Basti; Sonnenschein, Nikolaus; Machado, Daniel;
2016-01-01
An introduction to genome-scale models, how to build and use them, will be given in this chapter. Genome-scale models have become an important part of systems biology and metabolic engineering, and are increasingly used in research, both in academia and in industry, both for modeling chemical...
GTM is an economic model capable of examining global forestry land-use, management, and trade responses to policies. In responding to a policy, the model captures afforestation, forest management, and avoided deforestation behavior. The model estimates harvests in industrial fore...
One of the environmental and economic models that the U.S. EPA uses to assess climate change policies is the Second Generation Model (SGM). SGM is a 13 region, 24 sector computable general equilibrium (CGE) model of the world that can be used to estimate the domestic and intern...
Modelling of wastewater systems
Bechmann, Henrik
analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events. Furthermore, the model is used to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space model in continuous time with...
Kokorev, Andrii
2016-01-01
In this article we consider different methods of modeling asteroid shapes, especially the lightcurve inversion technique, and the scattering laws used for it. We also introduce our program, which constructs lightcurves for a given asteroid shape model. It can be used to compare a shape model with observational data.
Lee, Cheng K.; Lee, Jenq-Daw
2006-01-01
A survival model is derived from the exponential function using the concept of fractional differentiation. The hazard function of the proposed model generates various shapes of curves including increasing, increasing-constant-increasing, increasing-decreasing-increasing, and so-called bathtub hazard curve. The model also contains a parameter that is the maximum of the survival time.
Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology]
1996-12-31
In recent years, the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe.
Huang, P; Huang, Yong-Chang
2012-01-01
We suggest a holographic energy model in which the energy coming from spatial curvature, matter and radiation can be obtained by using the particle horizon for the infrared cut-off. We show the consistency between the holographic dark-energy model and the holographic energy model proposed in this paper. Then, we give a holographic description of the universe.
Haworth, Guy McCrossan; Andrist, Rafael B.
2004-01-01
A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.
Modeling EERE Deployment Programs
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.
We review the Skyrme model and discuss a method for incorporating quark degrees of freedom into the model. In addition, by generalizing the Skyrme/quark model to three flavors and taking into account the Wess-Zumino term, we obtain a condition on the SU(3) charges in the quark sector of the theory
Diggle, Peter J
2007-01-01
Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.
Cameron, Ian; Gani, Rafiqul
Engineering of products and processes is increasingly “model-centric”. Models in their multitudinous forms are ubiquitous, being heavily used for a range of decision making activities across all life cycle phases. This chapter gives an overview of what is a model, the principal activities in the ...
Fedorov, Alexander
2011-01-01
The author supposed that media education models can be divided into the following groups: (1) educational-information models (the study of the theory, history, language of media culture, etc.), based on the cultural, aesthetic, semiotic, socio-cultural theories of media education; (2) educational-ethical models (the study of moral, religious,…
Zephyr - the prediction models
Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg;
2001-01-01
utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models....
Thornton, Bradley D.; Smalley, Robert A.
2008-01-01
Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…
Andresen, Mette
2007-01-01
-authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students? work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...
The study has investigated the capabilities of a microdosimetry model to give more understanding of the energy transfer on the cellular scale. A simple mathematical model is constructed and validated by existing radiobiological experiments on cell suspensions. The results are used to indicate an approach to develop a more usable microdosimetry model. (orig.)
Crushed Salt Constitutive Model
The constitutive model used to describe the deformation of crushed salt is presented in this report. Two mechanisms -- dislocation creep and grain boundary diffusional pressure solution -- are combined to form the basis for the constitutive model governing the deformation of crushed salt. The constitutive model is generalized to represent three-dimensional states of stress. Upon complete consolidation, the crushed-salt model reproduces the Multimechanism Deformation (M-D) model typically used for the Waste Isolation Pilot Plant (WIPP) host geological formation salt. New shear consolidation tests are combined with an existing database that includes hydrostatic consolidation and shear consolidation tests conducted on WIPP and southeastern New Mexico salt. Nonlinear least-squares model fitting to the database produced two sets of material parameter values for the model -- one for the shear consolidation tests and one for a combination of the shear and hydrostatic consolidation tests. Using the parameter values determined from the fitted database, the constitutive model is validated against constant strain-rate tests. Shaft seal problems are analyzed to demonstrate model-predicted consolidation of the shaft seal crushed-salt component. Based on the fitting statistics, the ability of the model to predict the test data, and the ability of the model to predict load paths and test data outside of the fitted database, the model appears to capture the creep consolidation behavior of crushed salt reasonably well
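The dislocation-creep mechanism named in this report is commonly modelled with a steady-state power-law kernel of the form eps_dot = A * sigma^n * exp(-Q/(R*T)), which families of models like the Multimechanism Deformation (M-D) model build on. The sketch below uses placeholder parameter values, not the fitted WIPP salt database values:

```python
import math

def steadystate_creep_rate(sigma_mpa, T_kelvin, A=1.0e-6, n=5.0,
                           Q=50e3, R=8.314):
    """Steady-state dislocation-creep strain rate:
    eps_dot = A * sigma^n * exp(-Q / (R*T)).
    A (rate constant), n (stress exponent) and Q (activation energy,
    J/mol) are placeholder values for illustration only."""
    return A * sigma_mpa ** n * math.exp(-Q / (R * T_kelvin))

# creep rate rises strongly with both stress and temperature
r_low = steadystate_creep_rate(10.0, 300.0)
r_stress = steadystate_creep_rate(20.0, 300.0)
r_temp = steadystate_creep_rate(10.0, 350.0)
```

The crushed-salt model described above couples a kernel of this kind with a pressure-solution term and a consolidation state variable, which is what lets it recover the intact-salt M-D behavior at full consolidation.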
Model Breaking Points Conceptualized
Vig, Rozy; Murray, Eileen; Star, Jon R.
2014-01-01
Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…
Long term morphological modelling
Kristensen, Sten Esbjørn; Deigaard, Rolf; Taaning, Martin;
2010-01-01
A morphological modelling concept for long term nearshore morphology is proposed and examples of its application are presented and discussed. The model concept combines parameterised representations of the cross-shore morphology, with a 2DH area model for waves, currents and sediment transport in...
C. Lum
2004-09-16
The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process.
Hydrological land surface modelling
Ridler, Marc-Etienne Francois
and disaster management. The objective of this study is to develop and investigate methods to reduce hydrological model uncertainty by using supplementary data sources. The data is used either for model calibration or for model updating using data assimilation. Satellite estimates of soil moisture and...
FOSS, NICOLAI; Stieglitz, Nils
2014-01-01
We draw on the complementarity literature in economics and management research to dimensionalize business model innovations. Specifically, such innovations can be dimensionalized in terms of the depth and the breadth of the changes to the company’s business model that they imply. In turn, different business model innovations are associated with different management challenges and require different leadership interventions to become successful.
Dynamic Latent Classification Model
Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre;
possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics in...
Breitung, Jörg; Eickmeier, Sandra
2005-01-01
Factor models can cope with many variables without running into scarce degrees of freedom problems often faced in a regression-based analysis. In this article we review recent work on dynamic factor models that have become popular in macroeconomic policy analysis and forecasting. By means of an empirical application we demonstrate that these models turn out to be useful in investigating macroeconomic problems.
Andreasen, Martin Møller; Meldrum, Andrew
This paper studies whether dynamic term structure models for US nominal bond yields should enforce the zero lower bound by a quadratic policy rate or a shadow rate specification. We address the question by estimating quadratic term structure models (QTSMs) and shadow rate models with at most four...
Modelling of Information Systems
Hausman, Halina
1982-01-01
The article discusses selected problems in methodology of designing comprehensive information systems. Main emphasis has been laid on modelling of information systems for companies. Presentation of bases for construction of models and description of their main types provides a basis allowing the author to draw conclusions concerning their application. Modelling of information systems is treated as one of stages in designing information systems.
Hierarchical Models of Attitude.
Reddy, Srinivas K.; LaBarbera, Priscilla A.
1985-01-01
The application and use of hierarchical models is illustrated, using the example of the structure of attitudes toward a new product and a print advertisement. Subjects were college students who responded to seven-point bipolar scales. Hierarchical models were better than nonhierarchical models in conceptualizing attitude but not intention. (GDC)
Model description and evaluation of model performance: DOSDIM model
DOSDIM was developed to assess the impact to man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
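The first-order compartmental transfer described above can be sketched in miniature: two compartments exchanging activity with constant rate coefficients, integrated by explicit Euler. All names and rate values are illustrative, not DOSDIM's.

```python
def euler_step(c1, c2, k12, k21, dt):
    """One explicit-Euler step of the linear two-compartment system
    dc1/dt = -k12*c1 + k21*c2,  dc2/dt = k12*c1 - k21*c2."""
    flow = k12 * c1 - k21 * c2      # net transfer from compartment 1 to 2
    return c1 - dt * flow, c2 + dt * flow

c1, c2 = 1.0, 0.0                   # all activity starts in compartment 1
k12, k21, dt = 0.5, 0.1, 0.01       # illustrative rate constants and time step
for _ in range(10_000):             # integrate to t = 100
    c1, c2 = euler_step(c1, c2, k12, k21, dt)
# At equilibrium c1/c2 -> k21/k12, i.e. c1 -> 1/6 and c2 -> 5/6 here.
```

Writing the step in terms of a single net flow keeps the total activity conserved to floating-point accuracy, mirroring the mass balance a compartmental code must maintain.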
Crystal field and magnetization of canted antiferromagnet CoCO3
Meshcheryakov, V. F.
2007-11-01
The magnetization of the canted antiferromagnet CoCO3 (T_N = 18.1 K) is calculated in the Weiss molecular field approximation taking into account the microscopic state of the Co2+ ion in the entire range of temperatures and magnetic fields. The values of T_N, magnetic susceptibility in the basal plane, and ferromagnetic moment were used as parameters. It is shown that the anisotropy of the g factor and of the exchange interaction at low temperatures (T < 30 K) including the magnetic ordering temperature is correctly described in the Abragam-Pryce approximation. At high temperatures, the g factor increases and becomes isotropic, but it cannot be described using the Abragam-Pryce approximation. The reasons for g factor variation and the magnitude of the magnetic moment are discussed.
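As a rough sketch of the Weiss molecular-field construction used here, consider the generic isotropic single-sublattice version (an assumption for illustration only; the paper's anisotropic two-sublattice Abragam-Pryce treatment is much richer). Below the ordering temperature, the reduced magnetization m follows from solving m = B_J(3J/(J+1)·m/t) self-consistently, with t = T/T_N and B_J the Brillouin function.

```python
import math

def brillouin(J, x):
    """Brillouin function B_J(x); B_J(0) = 0."""
    if abs(x) < 1e-9:
        return 0.0
    a = (2 * J + 1) / (2 * J)
    b = 1.0 / (2 * J)
    return a / math.tanh(a * x) - b / math.tanh(b * x)

def reduced_magnetization(J, t, iters=300):
    """Self-consistent m from m = B_J(3J/(J+1) * m/t), with t = T/T_N."""
    m = 1.0                          # start from saturation
    for _ in range(iters):
        m = brillouin(J, 3 * J / (J + 1) * m / t)
    return m
```

For J = 1/2 this reduces to the familiar m = tanh(m/t): well below T_N the iteration converges to a finite m, while for t > 1 only the trivial solution m = 0 survives.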
Modeling agriculture in the Community Land Model
Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, V. R.; Jacob, R.
2013-04-01
The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types - maize, soybean, and spring wheat - into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements for soybean, but not as well for maize. CLM-Crop yields were comparable with observations in countries such as the United States, Argentina, and China, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation, in agreement with other modeling studies. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management
Blaha, Michael
2010-01-01
Best-selling author and database expert with more than 25 years of experience modeling application and enterprise data, Dr. Michael Blaha provides tried and tested data model patterns to help readers avoid common modeling mistakes and unnecessary frustration on their way to building effective data models. Unlike the typical methodology book, "Patterns of Data Modeling" provides advanced techniques for those who have mastered the basics. Recognizing that database representation sets the path for software, determines its flexibility, affects its quality, and influences whether it succeeds...
Brown, T.W.
2010-11-15
The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super- Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich- Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)
A treatment of nuclear masses and deformations is described which combines the Droplet Model with the folding model surface and Coulomb energy integrals. An additional exponential term, inspired by the folding model, but treated here as an independent contribution with two adjustable parameters, is included. With this term incorporated, the accuracy of the predicted masses and fission barriers was improved significantly, the ability of the Droplet Model to account for isotope shifts in charge radii was retained, and the tendency of the Droplet Model to over-predict the surface-tension squeezing of light nuclei was rectified. 20 references, 4 figures
Blackman, Jonathan; Field, Scott; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel
2015-04-01
With the advanced detector era just around the corner, there is a strong need for fast and accurate models of gravitational waveforms from compact binary coalescence. Fast surrogate models can be built out of an accurate but slow waveform model with minimal to no loss in accuracy, but may require a large number of evaluations of the underlying model. This may be prohibitively expensive if the underlying model is extremely slow, for example if we wish to build a surrogate for numerical relativity. We examine alternate choices for building surrogate models which allow for a more sparse set of input waveforms. Research supported in part by NSERC.
Pediatric Computational Models
Soni, Bharat K.; Kim, Jong-Eun; Ito, Yasushi; Wagner, Christina D.; Yang, King-Hay
A computational model is a computer program that attempts to simulate a behavior of a complex system by solving mathematical equations associated with principles and laws of physics. Computational models can be used to predict the body's response to injury-producing conditions that cannot be simulated experimentally or measured in surrogate/animal experiments. Computational modeling also provides means by which valid experimental animal and cadaveric data can be extrapolated to a living person. Widely used computational models for injury biomechanics include multibody dynamics and finite element (FE) models. Both multibody and FE methods have been used extensively to study adult impact biomechanics in the past couple of decades.
Modeling Epidemic Network Failures
Ruepp, Sarah Renée; Fagertun, Anna Manolova
2013-01-01
This paper presents the implementation of a failure propagation model for transport networks when multiple failures occur, resulting in an epidemic. We implement the Susceptible Infected Disabled (SID) epidemic model and validate it by comparing it to analytical solutions. Furthermore, we evaluate the SID model’s behavior and impact on the network performance, as well as the severity of the infection spreading. The simulations are carried out in OPNET Modeler. The model provides an important input to epidemic connection recovery mechanisms and can, due to its flexibility and versatility, be used to evaluate multiple epidemic scenarios in various network types.
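A minimal sketch of an SID-style compartment flow follows, assuming for illustration that disabled nodes do not recover (the paper's SID dynamics and OPNET implementation are richer, and all rate values here are invented):

```python
def sid_step(s, i, d, beta, gamma, dt):
    """Explicit-Euler step of a minimal SID flow:
    susceptible -> infected at rate beta*s*i,
    infected -> disabled at rate gamma*i (disabled is absorbing here)."""
    infect = beta * s * i
    disable = gamma * i
    return s - dt * infect, i + dt * (infect - disable), d + dt * disable

s, i, d = 0.99, 0.01, 0.0           # fractions of network nodes
beta, gamma, dt = 0.5, 0.1, 0.01    # illustrative rates; R0 = beta/gamma = 5
for _ in range(20_000):             # integrate to t = 200
    s, i, d = sid_step(s, i, d, beta, gamma, dt)
```

With R0 > 1 the infection sweeps the network and most nodes end up disabled, which is the regime where epidemic-aware connection recovery matters.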
Intersection carbon monoxide modeling
In this note the author discusses the need for better air quality mobile source models near roadways and intersections. To develop the improved models, a better understanding of emissions and their relation to ambient concentrations is necessary. The database for the modal model indicates that vehicles do have different emission levels for different engine operating modes. If the modal approach is used information is needed on traffic signal phasing, queue lengths, delay times, acceleration rates, deceleration rates, capacity, etc. Dispersion estimates using current air quality models may be inaccurate because the models do not take into account intersecting traffic streams, multiple buildings of varying setbacks, height, and spacing
Dependence modeling with copulas
Joe, Harry
2014-01-01
Dependence Modeling with Copulas covers the substantial advances that have taken place in the field during the last 15 years, including vine copula modeling of high-dimensional data. Vine copula models are constructed from a sequence of bivariate copulas. The book develops generalizations of vine copula models, including common and structured factor models that extend from the Gaussian assumption to copulas. It also discusses other multivariate constructions and parametric copula families that have different tail properties and presents extensive material on dependence and tail properties to a
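The bivariate building block of such copula constructions can be illustrated by sampling from a Gaussian copula: draw correlated standard normals and push them through the normal CDF to obtain uniform margins. The correlation value, seed, and sample size below are arbitrary illustrations.

```python
import math
import random

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def gaussian_copula_pair(rho, rng):
    """One (u1, u2) draw from a bivariate Gaussian copula with parameter rho."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    return norm_cdf(z1), norm_cdf(z2)

rng = random.Random(42)             # fixed seed for reproducibility
pairs = [gaussian_copula_pair(0.8, rng) for _ in range(4000)]
```

Each margin is uniform on [0, 1] while the pair retains the dependence of the underlying normals; vine constructions chain such bivariate links together.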
Designing Business Model Change
Cavalcante, Sergio Andre
2014-01-01
The aim of this paper is to base organisational change on the firm's business model, an approach that research has only recently started to address. This study adopts a process-based perspective on business models and insights from a variety of theories as the basis for the development of ideas on the design of business model change. This paper offers a new, process-based strategic analytical artefact for the design of business model change, consisting of three main phases. Designing business model change as suggested in this paper allows ex ante analysis of alternative scenarios of change in a...
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions
Sibbertsen, Philipp; Stahl, Gerhard; Luedtke, Corinna
2008-01-01
Model risk as part of the operational risk is a serious problem for financial institutions. As the pricing of derivatives as well as the computation of the market or credit risk of an institution depend on statistical models the application of a wrong model can lead to a serious over- or underestimation of the institution’s risk. Because the underlying data generating process is unknown in practice evaluating the model risk is a challenge. So far, definitions of model risk are either applicat...
Reconstruction of inflation models
Myrzakulov, Ratbay; Sebastiani, Lorenzo [Eurasian National University, Department of General and Theoretical Physics and Eurasian Center for Theoretical Physics, Astana (Kazakhstan); Zerbini, Sergio [Universita di Trento, Dipartimento di Fisica, Trento (Italy); TIFPA, Istituto Nazionale di Fisica Nucleare, Trento (Italy)
2015-05-15
In this paper, we reconstruct viable inflationary models by starting from the spectral index and tensor-to-scalar ratio from Planck observations. We analyze three different kinds of models: scalar field theories, fluid cosmology, and f(R)-modified gravity. We recover the well-known R^2 inflation in Jordan-frame and Einstein-frame representation, the massive scalar inflaton models and two models of inhomogeneous fluid. A model of R^2 correction to Einstein's gravity plus a ''cosmological constant'' with an exact solution for early-time acceleration is reconstructed. (orig.)
Knudsen, Torben
2011-01-01
The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to include and focus on an additive ... model, which turns out not to be useful for prediction of the flow. Moreover, standard Box Jenkins model structures and multiple output auto regressive models prove to be superior, as they can give useful predictions of the flow.
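A toy version of the autoregressive modelling mentioned above: estimate an AR(1) coefficient by least squares and form a one-step-ahead prediction. The coefficient 0.8, the noise model, and the seed are invented for illustration; the deliverable's Box Jenkins and multiple-output AR structures are more elaborate.

```python
import random

def fit_ar1(x):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + noise."""
    num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

rng = random.Random(0)              # fixed seed for reproducibility
x = [0.0]
for _ in range(2000):
    x.append(0.8 * x[-1] + rng.gauss(0.0, 1.0))  # simulate AR(1) data

phi = fit_ar1(x)
one_step_forecast = phi * x[-1]     # prediction of the next observation
```

On simulated data the estimate lands close to the true coefficient, which is exactly the sense in which such structures "give useful predictions of the flow".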
Mathematical modelling techniques
Aris, Rutherford
1995-01-01
""Engaging, elegantly written."" - Applied Mathematical ModellingMathematical modelling is a highly useful methodology designed to enable mathematicians, physicists and other scientists to formulate equations from a given nonmathematical situation. In this elegantly written volume, a distinguished theoretical chemist and engineer sets down helpful rules not only for setting up models but also for solving the mathematical problems they pose and for evaluating models.The author begins with a discussion of the term ""model,"" followed by clearly presented examples of the different types of mode
Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.
The paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS ..., beyond the comparison of models. We apply the MCS procedure to two empirical problems. First, we revisit the inflation forecasting problem posed by Stock and Watson (1999), and compute the MCS for their set of inflation forecasts. Second, we compare a number of Taylor rule regressions and determine the...
Long, John
2014-01-01
Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work products...
The IITRI Urban Fire Spread Model as well as others of similar vintage were constrained by computer size and running costs such that many approximations/generalizations were introduced to reduce program complexity and data storage requirements. Simplifications were introduced both in input data and in fire growth and spread calculations. Modern computational capabilities offer the means to introduce greater detail and to examine its practical significance on urban fire predictions. Selected portions of the model are described as presently configured, and potential modifications are discussed. A single tract model is hypothesized which permits the importance of various model details to be assessed, and, other model applications are identified
The purpose of this model report is to document the calibrated properties model that provides calibrated property sets for unsaturated zone (UZ) flow and transport process models (UZ models). The calibration of the property sets is performed through inverse modeling. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Sections 1.2.6 and 2.1.1.6). Direct inputs to this model report were derived from the following upstream analysis and model reports: ''Analysis of Hydrologic Properties Data'' (BSC 2004 [DIRS 170038]); ''Development of Numerical Grids for UZ Flow and Transport Modeling'' (BSC 2004 [DIRS 169855]); ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]); ''Geologic Framework Model'' (GFM2000) (BSC 2004 [DIRS 170029]). Additionally, this model report incorporates errata of the previous version and closure of the Key Technical Issue agreement TSPAI 3.26 (Section 6.2.2 and Appendix B), and it is revised for improved transparency
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
Titan atmospheric models intercomparison
Pernot, P.
2008-09-01
Several groups over the world have independently developed models of the photochemistry of Titan. The Cassini mission reveals daily that the chemical complexity is beyond our expectations (e.g. observation of heavy positive and negative ions), and the models are updated accordingly. At this stage, there is no consensus on the various input parameters, and it becomes increasingly difficult to compare outputs from different models. An ISSI team of experts on those models will be gathered shortly to proceed to an intercomparison, i.e. to assess how the models behave, given identical sets of inputs (collectively defined). Expected discrepancies will have to be elucidated and reduced. This intercomparison will also be an occasion to estimate explicitly the importance of various physical-chemical processes on model predictions versus observations. More robust and validated models are expected from this study for the interpretation of Titan-related data.
Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann
2008-09-01
In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
Könemann, Patrick
2009-01-01
Computing differences (diffs) and merging different versions is well-known for text files, but for models it is a very young field - especially patches for models are still a matter of research. Text-based and model-based diffs have different starting points because the semantics of their structure is fundamentally different. This paper reports on our ongoing work on model-independent diffs, i.e. a diff that does not directly refer to the models it was created from. Based on that, we present an idea of how the diff could be generalized, e.g. many atomic diffs are merged to a new, generalized diff. One use of these concepts could be a patch for models as it already exists for text files. The advantage of such a generalized diff compared to "normal" diffs is that it is applicable to a higher variety of models.
Renormalization in supersymmetric models
Fonseca, Renato M
2013-01-01
There are reasons to believe that the Standard Model is only an effective theory, with new Physics lying beyond it. Supersymmetric extensions are one possibility: they address some of the Standard Model's shortcomings, such as the instability of the Higgs boson mass under radiative corrections. In this thesis, some topics related to the renormalization of supersymmetric models are analyzed. One of them is the automatic computation of the Lagrangian and the renormalization group equations of these models, which is a hard and error-prone process if carried out by hand. The generic renormalization group equations themselves are extended so as to include those models which have more than a single abelian gauge factor group. Such situations can occur in grand unified theories, for example. For a wide range of SO(10)-inspired supersymmetric models, we also show that the renormalization group imprints on sparticle masses some information on the higher energies behavior of the models. Finally, in some cases these the...
Alexander Fedorov
2011-03-01
The author supposed that media education models can be divided into the following groups:
- educational-information models (the study of the theory, history, and language of media culture), based on the cultural, aesthetic, semiotic, and socio-cultural theories of media education;
- educational-ethical models (the study of moral, religious, and philosophical problems), relying on the ethic, religious, ideological, ecological, and protectionist theories of media education;
- pragmatic models (practical media technology training), based on the uses-and-gratifications and 'practical' theories of media education;
- aesthetical models (aimed above all at the development of artistic taste and enriching the skills of analysis of the best examples of media culture), relying on the aesthetical (art and cultural studies) theory;
- socio-cultural models (socio-cultural development of a creative personality with respect to perception, imagination, visual memory, interpretation, analysis, and autonomous critical thinking), relying on the cultural studies, semiotic, and ethic models of media education.
Sommerlund, Julie
from laboratory studies (Latour 1979; Lynch 1985; Sommerlund 2004 (2007); Sommerlund 2006) and is complemented by the attention paid to the "mediator" by Hennion (1989; 1997; 2005). The empirical focus will be on a central - but overlooked - actor of branding and advertising: the model. The model has solely been theorized within cultural studies (Craik 1994) as feminine spectacle, but has been neglected as mediator and actor. This paper will argue that models are co-producers of brands, and vice versa. Empirically, the paper will present interviews with models, model-scouts, agents, and advertisers using models in branding-campaigns. The paper will contribute to the field of cultural economy by extending the productive methodology of STS into the fields of branding and marketing, and to the understanding of branding and marketing by focusing on an understudied phenomenon - the model - and by...
Testing the applicability of mathematical models with carefully designed experiments is a powerful tool in the investigations of the effects of ionizing radiation on cells. The modeling and cellular studies complement each other, for modeling provides guidance for designing critical experiments which must provide definitive results, while the experiments themselves provide new input to the model. Based on previous experimental results the model for the accumulation of damage in Chlamydomonas reinhardi has been extended to include various multiple two-event combinations. Split dose survival experiments have shown that models tested to date predict most but not all the observed behavior. Stationary-phase mammalian cells, required for tests of other aspects of the model, have been shown to be at different points in the cell cycle depending on how they were forced to stop proliferating. These cultures also demonstrate different capacities for repair of sublethal radiation damage
Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-13
These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
Modelling structured data with Probabilistic Graphical Models
Forbes, F.
2016-05-01
Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph that is not necessarily regular, as a grid is. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations associated with practical work are given.
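The key concept named in this abstract, conditional independence and its link to Markov properties, can be made concrete with a toy Bayesian network. The chain below and all probability tables are invented for illustration; they are not from the chapter:

```python
# Tiny Bayesian network A -> B -> C: the joint factorizes as
# P(a,b,c) = P(a) P(b|a) P(c|b), so C is conditionally independent of A given B.
# All probability tables are invented for illustration.

pA = {0: 0.3, 1: 0.7}
pB_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}   # P(b | a)
pC_B = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.25, 1: 0.75}} # P(c | b)

def joint(a, b, c):
    return pA[a] * pB_A[a][b] * pC_B[b][c]

def p_c_given(a, b):
    """P(C=1 | A=a, B=b), obtained from the joint by conditioning."""
    num = joint(a, b, 1)
    den = joint(a, b, 0) + joint(a, b, 1)
    return num / den

# The Markov property: conditioning on B screens off A, so the value of a
# does not change the conditional distribution of C.
print(round(p_c_given(0, 1), 6), round(p_c_given(1, 1), 6))
```

Both printed values coincide with P(C=1 | B=1) from the table, which is the screening-off behaviour that Markov networks generalize to undirected graphs.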
Modeling electricity markets with hidden Markov model
This paper proposes to model the movements of electricity markets as partially observable Markov processes driven by underlying economic forces. An electricity market is modeled as a dynamic system evolving over time according to Markov processes. At any time interval, the electricity market can be in one state and transition to another state in the next time interval. This paper models the states of an electricity market as partially observable, while each state has incomplete observations such as market-clearing price and quantity. The true market states are hidden from a market participant behind the incomplete observation. The hidden Markov model (HMM) is a more fundamental approach that focuses on capturing the interaction of supply and demand forces on electricity markets. Such an approach is appropriate because the simultaneous production and consumption of electricity eliminates the storage sector, while limited transmission networks segment electricity markets. This model is shown to be able to link the fundamental drivers to the price behaviors; therefore, it provides forecast power for mid-term and long-term price movements. This work applies HMM to historical data from the New York independent system operator (NYISO), and examples are given to illustrate the forecast power of HMM. (author)
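The likelihood computation at the core of this kind of HMM inference is the forward algorithm, which can be sketched in a few lines. The two "demand" states, the observation alphabet, and all probabilities below are invented placeholders, not values estimated from NYISO data:

```python
# Forward algorithm for a 2-state HMM: computes the likelihood of an
# observation sequence by summing over all hidden state paths.
# All probabilities are illustrative, not estimated from market data.

states = ["low", "high"]
start = {"low": 0.6, "high": 0.4}                  # initial state distribution
trans = {"low": {"low": 0.7, "high": 0.3},         # state transition matrix
         "high": {"low": 0.4, "high": 0.6}}
emit = {"low": {"cheap": 0.8, "spike": 0.2},       # observation model (price regime)
        "high": {"cheap": 0.3, "spike": 0.7}}

def forward(obs):
    """Return P(obs sequence) under the model."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

print(round(forward(["cheap", "cheap", "spike"]), 4))
```

The same recursion, run with parameters fitted to historical prices, is what lets an HMM assign probabilities to future price regimes.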
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as outputted from the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. Revision 01 ICN 01 included the results of the unqualified software code MULTIFLUX to assess the influence of moisture on the ventilation efficiency. The purposes of Revision 02 of the Ventilation Model are: (1) To validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0). (2) To satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a). Specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of
Financial modeling using Gaussian process models
Petelin, D.; Šindelář, Jan; Přikryl, Jan; Kocijan, J.
Piscataway: IEEE, 2011, s. 672-677. ISBN 978-1-4577-1424-5. [6th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications. Prague (CZ), 15.09.2011-17.09.2011] R&D Projects: GA MŠk 1M0572; GA TA ČR TA01030603; GA ČR GA102/08/0567; GA MŠk(CZ) MEB091015 Institutional research plan: CEZ:AV0Z10750506 Keywords : gaussian process models * autoregression * financial * efficient markets Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2011/AS/sindelar- financial modeling using gaussian process models.pdf
Possibilistic Graphical Models and Compositional Models
Vejnarová, Jiřina
Vol. I. Berlin Heidelberg: Springer, 2010 - (Hullermaier, E.; Kruse, R.; Hoffman, F.), s. 21-30. (Lecture Notes in Computer Science. 80). ISBN 978-3-642-14054-9. ISSN 1865-0929. [13th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems IPMU'10. Dortmund (DE), 28.06.2010-02.07.2010] R&D Projects: GA ČR GA201/09/1891; GA AV ČR IAA100750603 Grant ostatní: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : possibility distributions * graphical models * triangular norms Subject RIV: BA - General Mathematics http://library.utia.cas.cz/separaty/2010/MTR/vejnarova-possibilistic graphical models and compositional models.pdf
Modelling cointegration in the vector autoregressive model
Johansen, Søren
2000-01-01
A survey is given of some results obtained for the cointegrated VAR. The Granger representation theorem is discussed and the notions of cointegration and common trends are defined. The statistical model for cointegrated I(1) variables is defined, and it is shown how hypotheses on the cointegrating relations can be estimated under suitable identification conditions. The asymptotic theory is briefly mentioned and a few economic applications of the cointegration model are indicated.
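The notion of cointegration the survey builds on can be demonstrated with a simulated pair of I(1) series sharing a common stochastic trend. The data-generating process below is a standard textbook construction, not one taken from the survey:

```python
import random

# Two I(1) series with a common stochastic trend: x_t is a random walk and
# y_t = 2*x_t + u_t with stationary u_t, so (1, -2) is a cointegrating vector.
# Both series wander without bound, but the spread y - 2x stays bounded.
random.seed(0)
x, y, spread = [0.0], [0.0], []
for t in range(5000):
    x.append(x[-1] + random.gauss(0, 1))    # common trend (unit root)
    u = random.gauss(0, 0.5)                # stationary deviation
    y.append(2 * x[-1] + u)
    spread.append(y[-1] - 2 * x[-1])

def var(z):
    m = sum(z) / len(z)
    return sum((v - m) ** 2 for v in z) / len(z)

# The spread's sample variance is tiny compared to the trending series'.
print(var(x) > 100 * var(spread))
```

Estimating that cointegrating vector from data, rather than assuming it known, is exactly what the identification conditions discussed in the survey make possible.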
Personal history of nucleon polarization experiments
The history of nucleon scattering experiments is reviewed, starting with the observation of large proton polarizations in scattering from light elements such as carbon, and ending with the acceleration of polarized proton beams in high-energy synchrotrons. Special mention is made of significant contributions by C.L. Oxley, L. Wolfenstein, R.D. Tripp, T. Ypsilantis, A. Abragam, M. Borghini, T. Niinikoski, Froissart, Stora, A.D. Krisch, and L.G. Ratner.
Modeling agriculture in the Community Land Model
B. Drewniak
2012-12-01
The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types – maize, soybean, and spring wheat – into the coupled carbon-nitrogen version of the Community Land Model (CLM) to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements. CLM-Crop yields were comparable with observations in some regions, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when a low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management practices. Results are encouraging, with improved representation of human influences on the land.
Phyloclimatic modeling: combining phylogenetics and bioclimatic modeling.
Yesson, C; Culham, A
2006-10-01
We investigate the impact of past climates on plant diversification by tracking the "footprint" of climate change on a phylogenetic tree. Diversity within the cosmopolitan carnivorous plant genus Drosera (Droseraceae) is focused within Mediterranean climate regions. We explore whether this diversity is temporally linked to Mediterranean-type climatic shifts of the mid-Miocene and whether climate preferences are conservative over phylogenetic timescales. Phyloclimatic modeling combines environmental niche (bioclimatic) modeling with phylogenetics in order to study evolutionary patterns in relation to climate change. We present the largest and most complete such example to date using Drosera. The bioclimatic models of extant species demonstrate clear phylogenetic patterns; this is particularly evident for the tuberous sundews from southwestern Australia (subgenus Ergaleium). We employ a method for establishing confidence intervals of node ages on a phylogeny using replicates from a Bayesian phylogenetic analysis. This chronogram shows that many clades, including subgenus Ergaleium and section Bryastrum, diversified during the establishment of the Mediterranean-type climate. Ancestral reconstructions of bioclimatic models demonstrate a pattern of preference for this climate type within these groups. Ancestral bioclimatic models are projected into palaeo-climate reconstructions for the time periods indicated by the chronogram. We present two such examples that each generate plausible estimates of ancestral lineage distribution, which are similar to their current distributions. This is the first study to attempt bioclimatic projections on evolutionary time scales. The sundews appear to have diversified in response to local climate development. Some groups are specialized for Mediterranean climates, others show wide-ranging generalism. This demonstrates that Phyloclimatic modeling could be repeated for other plant groups and is fundamental to the understanding of
Multiple Model Approaches to Modelling and Control
The underlying question is 'How should we partition the system - what is "local"?'. The system is decomposed into multiple smaller operating regimes, each of which is associated with a locally valid model or controller. This can often give a simplified and transparent nonlinear model or control representation. In addition, the local approach has computational advantages: it lends itself to adaptation and learning. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data.
Hammerand, Daniel Carl; Scherzinger, William Mark
2007-09-01
The Library of Advanced Materials for Engineering (LAME) provides a common repository for constitutive models that can be used in computational solid mechanics codes. A number of models including both hypoelastic (rate) and hyperelastic (total strain) constitutive forms have been implemented in LAME. The structure and testing of LAME is described in Scherzinger and Hammerand ([3] and [4]). The purpose of the present report is to describe the material models which have already been implemented into LAME. The descriptions are designed to give useful information to both analysts and code developers. Thus far, 33 non-ITAR/non-CRADA protected material models have been incorporated. These include everything from the simple isotropic linear elastic models to a number of elastic-plastic models for metals to models for honeycomb, foams, potting epoxies and rubber. A complete description of each model is outside the scope of the current report. Rather, the aim here is to delineate the properties, state variables, functions, and methods for each model. However, a brief description of some of the constitutive details is provided for a number of the material models. Where appropriate, the SAND reports available for each model have been cited. Many models have state variable aliases for some or all of their state variables. These alias names can be used for outputting desired quantities. The state variable aliases available for results output have been listed in this report. However, not all models use these aliases. For those models, no state variable names are listed. Nevertheless, the number of state variables employed by each model is always given. Currently, there are four possible functions for a material model. This report lists which of these four methods are employed in each material model. As far as analysts are concerned, this information is included only for awareness purposes. The analyst can take confidence in the fact that each model has been properly implemented.
Modelling prokaryote gene content
Edward Susko
2006-01-01
The patchy distribution of genes across the prokaryotes may be caused by multiple gene losses or lateral transfer. Probabilistic models of gene gain and loss are needed to distinguish between these possibilities. Existing models allow only single genes to be gained and lost, despite the empirical evidence for multi-gene events. We compare birth-death models (currently the only widely-used models), in which only one gene can be gained or lost at a time, to blocks models (allowing gain and loss of multiple genes within a family). We analyze two pairs of genomes: two E. coli strains, and the distantly-related Archaeoglobus fulgidus (archaea) and Bacillus subtilis (gram-positive bacteria). Blocks models describe the data much better than birth-death models. Our models suggest that lateral transfers of multiple genes from the same family are rare (although transfers of single genes are probably common). For both pairs, the estimated median time that a gene will remain in the genome is not much greater than the time separating the common ancestors of the archaea and bacteria. Deep phylogenetic reconstruction from sequence data will therefore depend on choosing genes likely to remain in the genome for a long time. Phylogenies based on the blocks model are more biologically plausible than phylogenies based on the birth-death model.
The eight book chapters demonstrate the link between the physical models of the environment and the policy analysis in support of policy making. Each chapter addresses an environmental policy issue using a quantitative modeling approach. The volume addresses three general areas of environmental policy - non-point source pollution in the agricultural sector, pollution generated in the extractive industries, and transboundary pollutants from burning fossil fuels. The book concludes by discussing the modeling efforts and the use of mathematical models in general. Chapters are entitled: modeling environmental policy: an introduction; modeling nonpoint source pollution in an integrated system (agri-ecological); modeling environmental and trade policy linkages: the case of EU and US agriculture; modeling ecosystem constraints in the Clean Water Act: a case study in Clearwater National Forest (subject to discharge from metal mining waste); costs and benefits of coke oven emission controls; modeling equilibria and risk under global environmental constraints (discussing energy and environmental interrelations); relative contribution of the enhanced greenhouse effect on the coastal changes in Louisiana; and the use of mathematical models in policy evaluations: comments. The paper on coke oven emission controls has been abstracted separately for the IEA Coal Research CD-ROM
Geochemical modeling: a review
Two general families of geochemical models presently exist. The ion speciation-solubility group of geochemical models contain submodels to first calculate a distribution of aqueous species and to secondly test the hypothesis that the water is near equilibrium with particular solid phases. These models may or may not calculate the adsorption of dissolved constituents and simulate the dissolution and precipitation (mass transfer) of solid phases. Another family of geochemical models, the reaction path models, simulates the stepwise precipitation of solid phases as a result of reacting specified amounts of water and rock. Reaction path models first perform an aqueous speciation of the dissolved constituents of the water, test solubility hypotheses, then perform the reaction path modeling. Certain improvements in the present versions of these models would enhance their value and usefulness to applications in nuclear-waste isolation, etc. Mass-transfer calculations of limited extent are certainly within the capabilities of state-of-the-art models. However, the reaction path models require an expansion of their thermodynamic data bases and systematic validation before they are generally accepted
Dan Alexandru Anghel
2012-01-01
In semiconductor laser modeling, a good mathematical model gives results close to reality. Three methods of modeling solutions from the rate equations are presented and analyzed. A method based on the rate equations modeled in Simulink to describe quantum well lasers is presented. For different signal types such as step, sawtooth, and sine functions used as input, a good response of the used equations is obtained. A circuit model resulting from one of the rate-equation models is presented and simulated in SPICE. Results show a good modeling behavior. Numerical simulation in MathCad gives satisfactory results for the study of the transitory and dynamic operation at small levels of the injection current. The obtained numerical results show the specific limits of each model, according to theoretical analysis. Based on these results, software can be built that integrates circuit simulation and other modeling methods for quantum well lasers, yielding a tool that models and analyses these devices from all points of view.
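The numerical-simulation route mentioned in this abstract can be sketched with an explicit-Euler integration of a generic two-equation rate model (carrier density N, photon density S). The equations and the normalized parameter values below are generic textbook forms chosen only to show the approach to steady state, not the paper's device parameters:

```python
# Explicit-Euler integration of a minimal two-equation laser rate model.
# N: carrier density, S: photon density, all in normalized (arbitrary) units.
# Parameter values are illustrative placeholders, not device data.

def simulate(I, steps=100000, dt=5e-4):
    N, S = 0.0, 1e-6
    g, N0 = 1.0, 0.5          # gain slope and transparency density
    tau_n, tau_p = 1.0, 1.0   # carrier and photon lifetimes
    for _ in range(steps):
        gain = g * (N - N0)
        dN = I - N / tau_n - gain * S                   # pump, decay, stimulated emission
        dS = (gain - 1.0 / tau_p) * S + 1e-6 * N / tau_n  # net gain plus small spontaneous term
        N += dt * dN
        S += dt * dS
    return N, S

# Above threshold the gain clamps near 1/tau_p, so for I = 2.0 these
# parameters give a steady state near N = 1.5, S = 0.5.
N, S = simulate(I=2.0)
print(round(N, 2), round(S, 2))
```

The same loop, rewritten with device-realistic coefficients and time steps, is the kind of calculation the MathCad study performs for the transient and dynamic regimes.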
Chen, Changyou; Buntine, Wray; Ding, Nan; Xie, Lexing; Du, Lan
2015-02-01
In applications we may want to compare different document collections: they could have shared content but also different and unique aspects in particular collections. This task has been called comparative text mining or cross-collection modeling. We present a differential topic model for this application that models both topic differences and similarities. For this we use hierarchical Bayesian nonparametric models. Moreover, we found it was important to properly model power-law phenomena in topic-word distributions and thus we used the full Pitman-Yor process rather than just a Dirichlet process. Furthermore, we propose the transformed Pitman-Yor process (TPYP) to incorporate prior knowledge such as vocabulary variations in different collections into the model. To deal with the non-conjugate issue between model prior and likelihood in the TPYP, we thus propose an efficient sampling algorithm using a data augmentation technique based on the multinomial theorem. Experimental results show the model discovers interesting aspects of different collections. We also show the proposed MCMC based algorithm achieves a dramatically reduced test perplexity compared to some existing topic models. Finally, we show our model outperforms the state-of-the-art for document classification/ideology prediction on a number of text collections. PMID:26353238
The data presented here includes male and female models for Asian populations in the age groups: Newborn, 1 year, 5 years, 10 years, 15 years and adult. The model for adult male was presented at the 3rd Research Coordination Meeting held in Tianjin, October 1993. At that time, the CRP participants requested Dr. Tanaka to continue development of a female model. The adult female model was developed together with models for five younger age groups. It is intended to provide useful data for radiation protection, and has been submitted to ICRP for use in developing revised models for internal dosimetry. The model is based on normal organ masses as well as physical measurements obtained primarily from Chinese, Indian and Japanese populations. These are believed to be the most extensive data sets available. The data presented here also takes into account the variations found in the data reported by other CRP participants. It should be stressed that the model is, at the same time, based on the approach used by the ICRP Reference Man Task Group in development of their Reference Man. As noted above, the adult male model was presented at the RCM Meeting in Tianjin and approved by the participants as ''Tanaka Model'' that would be convenient for use in internal dosimetry studies for subjects from Asian populations. It is also the essential part of a publication which is a revised edition of the previous work
It has been realized that resilience as a concept involves several contradictory definitions, both for instance resilience as agile adjustment and as robust resistance to situations. Our analysis of resilience concepts and models suggest that beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering, and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, as well as for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies
Modeling Imports in a Keynesian Expenditure Model
Findlay, David W.
2010-01-01
The author discusses several issues that instructors of introductory macroeconomics courses should consider when introducing imports in the Keynesian expenditure model. The analysis suggests that the specification of the import function should partially, if not completely, be the result of a simple discussion about the spending and import…
Tan, A; Lyatskaya, I [Department of Physics, Alabama A and M University, Normal, AL 35762 (United States)], E-mail: arjun.tan@aamu.edu
2009-01-15
The interesting papers by Margaritondo (2005 Eur. J. Phys. 26 401) and by Helene and Yamashita (2006 Eur. J. Phys. 27 855) analysed the great Indian Ocean tsunami of 2004 using a simple one-dimensional canal wave model, which was appropriate for undergraduate students in physics and related fields of discipline. In this paper, two additional, easily understandable models, suitable for the same level of readership, are proposed: one, a two-dimensional model in flat space, and two, the same on a spherical surface. The models are used to study the tsunami produced by the central Kuril earthquake of November 2006. It is shown that the two alternative models, especially the latter one, give better representations of the wave amplitude, especially at far-flung locations. The latter model further demonstrates the enhancing effect on the amplitude due to the curvature of the Earth for far-reaching tsunami propagation.
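The geometric difference between the flat two-dimensional model and the spherical one can be made concrete by comparing how wave energy spreads in each case. The 1/sqrt(circumference) amplitude scaling below is a standard energy-conservation argument, assumed here as a sketch rather than taken from the paper's detailed treatment:

```python
import math

# Geometric spreading of a surface wave from a point source. On a flat ocean
# the energy spreads over circles of circumference 2*pi*r, so the amplitude
# scales as r**-0.5; on a sphere of radius R the circumference is
# 2*pi*R*sin(r/R), so the amplitude scales as (R*sin(r/R))**-0.5, which
# exceeds the flat value at long range (the curvature enhancement).

R = 6371.0  # Earth radius, km

def amp_flat(r, a1=1.0):
    return a1 / math.sqrt(r)          # normalized to a1 at r = 1 km

def amp_sphere(r, a1=1.0):
    return a1 / math.sqrt(R * math.sin(r / R))

for r in (1000.0, 5000.0, 10000.0):
    print(r, round(amp_sphere(r) / amp_flat(r), 3))
```

The ratio grows with distance, which is the enhancing effect of the Earth's curvature on far-reaching tsunami propagation that the spherical model demonstrates.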
STRATEGY PATTERNS PREDICTION MODEL
Aram Baruch Gonzalez Perez
2014-01-01
Multi-agent systems are broadly known for being able to simulate real-life situations which require the interaction and cooperation of individuals. Opponent modeling can be used along with multi-agent systems to model complex situations such as competitions like soccer games. In this study, a model for predicting opponent moves based on their target is presented. The model is composed of an offline step (the learning phase) and an online one (the execution phase). The offline step gathers and analyses previous experiences, while the online step uses the data generated by the offline analysis to predict opponent moves. This model is illustrated by an experiment with the RoboCup 2D Soccer Simulator. The proposed model was tested using 22 games to create the knowledge base, achieving an accuracy rate of over 80%.
Developing mathematical modelling competence
Blomhøj, Morten; Jensen, Tomas Højgaard
2003-01-01
In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences.
There is burgeoning interest in modeling-based accelerator control. With more and more stringent requirements on performance, the importance of knowing, controlling, and predicting the behavior of the accelerator system is growing. Modeling means two things: (1) the development of programs and data which predict the outcome of a measurement, and (2) devising and performing measurements to find the machine physics parameters and their behavior under different conditions. These two sides should be tied together in an iterative process. With knowledge gained on the real system, the model will be modified, calibrated, and fine-tuned. The model of a system consists of data and the modeling program. The Modeling Based Control Programs (MBC) should, in the on-line mode, control, optimize, and correct the machine. In the off-line mode, the MBC is used to simulate the machine as well as explore and study its behavior and responses under a wide variety of circumstances. 15 refs., 3 figs
North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.
1981-01-01
An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes are solved, and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
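The simplest member of this model hierarchy, a zero-dimensional energy balance with an ice-albedo feedback, can be solved in a few lines. The linearized infrared law and the albedo ramp below use textbook-style placeholder values, not numbers from the survey:

```python
# Zero-dimensional energy-balance model: absorbed solar = emitted infrared,
# Q*(1 - albedo(T)) = A + B*T, with a linearized outgoing-longwave law and a
# crude temperature-dependent albedo standing in for the ice feedback.
# A, B, Q and the albedo ramp are illustrative textbook-style values.

Q = 342.0          # mean incoming solar flux, W/m^2
A, B = 202.0, 1.9  # linearized OLR: A + B*T (T in deg C), W/m^2

def albedo(T):
    if T <= -10.0:
        return 0.62                                # ice-covered planet
    if T >= 0.0:
        return 0.30                                # ice-free planet
    return 0.30 + (0.62 - 0.30) * (-T / 10.0)      # partial ice cover

def equilibrium(T=15.0, iters=200):
    """Fixed-point iteration T <- (Q*(1 - albedo(T)) - A) / B."""
    for _ in range(iters):
        T = (Q * (1.0 - albedo(T)) - A) / B
    return T

print(round(equilibrium(15.0), 2))    # warm, ice-free equilibrium branch
print(round(equilibrium(-40.0), 2))   # cold, ice-covered equilibrium branch
```

The two coexisting equilibria found from different initial guesses illustrate the multiple-solution structure whose stability the survey's linear and nonlinear analyses address.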
Decomposing model systematic error
Keenlyside, Noel; Shen, Mao-Lin
2014-05-01
Seasonal forecasts made with a single model are generally overconfident. The standard approach to improving forecast reliability is to account for structural uncertainties through a multi-model ensemble (i.e., an ensemble of opportunity). Here we analyse a multi-model set of seasonal forecasts available through the ENSEMBLES and DEMETER EU projects. We partition forecast uncertainties into initial-value and structural uncertainties, as a function of lead time and region. Statistical analysis is used to investigate sources of initial condition uncertainty, and which regions and variables lead to the largest forecast error. Similar analysis is then performed to identify common elements of model error. Results of this analysis will be used to discuss possibilities to reduce forecast uncertainty and improve models. In particular, better understanding of error growth will be useful for the design of interactive multi-model ensembles.
Electricity market modeling trends
The trend towards competition in the electricity sector has led to efforts by the research community to develop decision and analysis support models adapted to the new market context. This paper focuses on electricity generation market modeling. Its aim is to help identify, classify and characterize the somewhat confusing diversity of approaches that can be found in the technical literature on the subject. The paper presents a survey of the most relevant publications regarding electricity market modeling, identifying three major trends: optimization models, equilibrium models and simulation models. It introduces a classification according to their most relevant attributes. Finally, it identifies the most suitable approaches for conducting various types of planning studies or market analysis in this new context.
1985-01-01
The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model, with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.
Teaching macromolecular modeling.
Harvey, S C; Tan, R K
1992-12-01
Training newcomers to the field of macromolecular modeling is as difficult as is training beginners in x-ray crystallography, nuclear magnetic resonance, or other methods in structural biology. In one or two lectures, the most that can be conveyed is a general sense of the relationship between modeling and other structural methods. If a full semester is available, then students can be taught how molecular structures are built, manipulated, refined, and analyzed on a computer. Here we describe a one-semester modeling course that combines lectures, discussions, and a laboratory using a commercial modeling package. In the laboratory, students carry out prescribed exercises that are coordinated to the lectures, and they complete a term project on a modeling problem of their choice. The goal is to give students an understanding of what kinds of problems can be attacked by molecular modeling methods and which problems are beyond the current capabilities of those methods. PMID:1489919
Identification of physical models
Melgaard, Henrik
1994-01-01
The problem of identification of physical models is considered within the frame of stochastic differential equations. Methods for estimation of parameters of these continuous-time models based on discrete-time measurements are discussed. The important algorithms of a computer program for ML or MAP...... design of experiments, which is for instance the design of input signals that are optimal according to a criterion based on the information provided by the experiment. Also model validation is discussed. An important verification of a physical model is to compare the physical characteristics of the...... model with the available prior knowledge. The methods for identification of physical models have been applied in two different case studies. One case is the identification of thermal dynamics of building components. The work is related to a CEC research project called PASSYS (Passive Solar Components...
Essentials of econophysics modelling
Slanina, Frantisek
2014-01-01
This book is a course in methods and models rooted in physics and used in modelling economic and social phenomena. It covers the discipline of econophysics, which creates an interface between physics and economics. Besides the main theme, it touches on the theory of complex networks and simulations of social phenomena in general. After a brief historical introduction, the book starts with a list of basic empirical data and proceeds to thorough investigation of mathematical and computer models. Many of the models are based on hypotheses of the behaviour of simplified agents. These comprise strategic thinking, imitation, herding, and the gem of econophysics, the so-called minority game. At the same time, many other models view the economic processes as interactions of inanimate particles. Here, the methods of physics are especially useful. Examples of systems modelled in such a way include books of stock-market orders, and redistribution of wealth among individuals. Network effects are investigated in the inter...
King, S F
2004-01-01
This is a review article about neutrino mass models, particularly see-saw models involving three active neutrinos which are capable of describing both the atmospheric neutrino oscillation data, and the large mixing angle MSW solar solution, which is now uniquely specified by recent data. We briefly review the current experimental status, show how to parametrise and construct the neutrino mixing matrix, and present the leading order neutrino Majorana mass matrices. We then introduce the see-saw mechanism, and discuss a natural application of it to current data using the sequential dominance mechanism, which we compare to an early proposal for obtaining large mixing angles. We show how both the Standard Model and the Minimal Supersymmetric Standard Model may be extended to incorporate the see-saw mechanism, and show how the latter case leads to the expectation of lepton flavour violation. The see-saw mechanism motivates models with additional symmetries such as unification and family symmetry models, and we tab...
Andersen, Kasper Winther
Three main topics are presented in this thesis. The first and largest topic concerns network modelling of functional Magnetic Resonance Imaging (fMRI) and Diffusion Weighted Imaging (DWI). In particular nonparametric Bayesian methods are used to model brain networks derived from resting state f...... for their ability to reproduce node clustering and predict unseen data. Comparing the models on whole brain networks, BCD and IRM showed better reproducibility and predictability than IDM, suggesting that resting state networks exhibit community structure. This also points to the importance of using models, which...... allow for complex interactions between all pairs of clusters. In addition, it is demonstrated how the IRM can be used for segmenting brain structures into functionally coherent clusters. A new nonparametric Bayesian network model is presented. The model builds upon the IRM and can be used to infer...
Aeroservoelasticity modeling and control
Tewari, Ashish
2015-01-01
This monograph presents the state of the art in aeroservoelastic (ASE) modeling and analysis and develops a systematic theoretical and computational framework for use by researchers and practicing engineers. It is the first book to focus on the mathematical modeling of structural dynamics, unsteady aerodynamics, and control systems to evolve a generic procedure to be applied for ASE synthesis. Existing robust, nonlinear, and adaptive control methodology is applied and extended to some interesting ASE problems, such as transonic flutter and buffet, post-stall buffet and maneuvers, and flapping flexible wing. The author derives a general aeroservoelastic plant via the finite-element structural dynamic model, unsteady aerodynamic models for various regimes in the frequency domain, and the associated state-space model by rational function approximations. For more advanced models, the full-potential, Euler, and Navier-Stokes methods for treating transonic and separated flows are also briefly addressed. Essential A...
Lawson, Andrew B
2002-01-01
Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal ...
Sanzheeva, Larisa
2014-01-01
This article analyses a synergetic model of culture. The ontology of the sense of life and the semantic interrelations of the subject-subject and subject-object elements of culture as systems are considered. The need to design a model of culture, in order to identify the synergetic mechanisms of fluctuation and transformation in the multipurpose systems of the person, society and nature, is argued. The synergetic model of culture, its design, structural forms, and levels in the complete system of li...
Supersymmetric nonlinear sigma models
Supersymmetric nonlinear sigma models are formulated as gauge theories. Auxiliary chiral superfields are introduced to impose supersymmetric constraints of F-type. Target manifolds defined by F-type constraints are always non-compact. In order to obtain nonlinear sigma models on compact manifolds, we have to introduce gauge symmetry to eliminate the degrees of freedom in non-compact directions. All supersymmetric nonlinear sigma models defined on the hermitian symmetric spaces are successfully formulated as gauge theories. (author)
Introduction to Graphical Modelling
Scutari, Marco
2010-01-01
The aim of this chapter is twofold. In the first part we will provide a brief overview of the mathematical and statistical foundations of graphical models, along with their fundamental properties, estimation and basic inference procedures. In particular we will develop Markov networks (also known as Markov random fields) and Bayesian networks, which comprise most past and current literature on graphical models. In the second part we will review some applications of graphical models in systems biology.
Motl, L
2001-01-01
In this short note we construct the DLCQ description of the flux seven-branes in type IIA string theory and discuss its basic properties. The matrix model involves dipole fields. We explain the relation of this nonlocal matrix model to various orbifolds. We also give a spacetime interpretation of the Seiberg-Witten-like map, proposed in a different context first by Bergman and Ganor, that converts this matrix model to a local, highly nonlinear theory.
Rosenthal, Dale W.R.
2008-01-01
The problem of classifying trades as buys or sells is examined. I propose estimated quotes for midpoint and bid/ask tests and a modeling approach to classification. Prevailing quotes are estimated using flexible approximations to the distribution for delays of quotes relative to trade timestamps. Classification is done by a generalized linear model which includes improved versions of midpoint, tick, and bid/ask tests. The model also considers the relative strengths of these tests, can accou...
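The classical tests the abstract builds on can be sketched in a few lines. This is the standard quote-midpoint test with a tick-test fallback (the Lee-Ready-style heuristic), not the paper's generalized linear model or its estimated-quote machinery; prices below are made up.

```python
def midpoint_test(price, bid, ask):
    """Buy if the trade prints above the quote midpoint, sell if below."""
    mid = (bid + ask) / 2.0
    if price > mid:
        return "buy"
    if price < mid:
        return "sell"
    return None  # at the midpoint: indeterminate

def tick_test(price, prev_price):
    """Buy on an uptick, sell on a downtick."""
    if price > prev_price:
        return "buy"
    if price < prev_price:
        return "sell"
    return None

def classify(price, prev_price, bid, ask):
    """Midpoint test first; fall back to the tick test when it is indeterminate."""
    return midpoint_test(price, bid, ask) or tick_test(price, prev_price)

print(classify(10.03, 10.00, 10.00, 10.04))  # trade above the midpoint
```

The paper's contribution is precisely that these deterministic rules misfire when quotes are delayed relative to trade timestamps, which motivates modeling the delay distribution.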
Modeling multiphase materials processes
Iguchi, Manabu
2010-01-01
"Modeling Multiphase Materials Processes: Gas-Liquid Systems" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of
Leimkuhler, F.F.
1988-01-01
A common feature in bibliometric studies is the use of mathematical models to analyze fundamental problems arising from the operation of information systems. As bibliometrics develops, more explicit attention needs to be given to the modeling process as a unifying activity within the field, a vital link to other fields of study, and an avenue to future growth. In this paper the author draws on his experience with bibliometric modeling to demonstrate its practical and theoretical significance,...
Modeling American Marriage Patterns
Bloom, David E.; Neil G. Bennett
1990-01-01
This paper investigates the application of the three-parameter, Coale-McNeil marriage model and some related hyper-parameterized specifications to data on the first marriage patterns of American women. Because the model is parametric, it can be used to estimate the parameters of the marriage process, free of censoring bias, for cohorts that have yet to complete their first marriage experience. Empirical evidence from three surveys is reported on the ability of the model to replicate and proje...
Makarov, Daniil
2012-01-01
The thesis covers the phenomenon of business model innovation. It provides the theoretical background of the concept, based on the works of several scientists who stand at the beginnings of the discipline. The paper also introduces the principles of design thinking applied to business model innovation, in order to obtain superior results and to serve as a guideline for ideation processes and for presenting enhancements to existing business models. The practical part is devoted to applying the described ...
Affine General Equilibrium Models
Bjørn Eraker
2008-01-01
No-arbitrage models are extremely flexible modelling tools but often lack economic motivation. This paper describes an equilibrium consumption-based CAPM framework based on Epstein-Zin preferences, which produces analytic pricing formulas for stocks and bonds under the assumption that macro growth rates follow affine processes. This allows the construction of equilibrium pricing formulas while maintaining the same flexibility of state dynamics as in no-arbitrage models. In demonstrating the a...
Modeling Transient Trace Data
Mathur, Anup; Abrams, Marc
1996-01-01
This paper introduces a novel technique to construct an empirical workload model fitting time-varying (transient) trace data. The trace can be a categorical or numerical time-series. We model the trace as a Piecewise Independent stochastic process. To estimate the parameters for our model we first build a Rate Evolution Graph from the trace data. Piecewise linear regression is then used to construct a joint time-dependent probability mass function for the trace data. Two methods are propo...
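The Rate Evolution Graph idea can be sketched directly: plot cumulative event count against time, then fit straight lines to segments, so each slope estimates the arrival rate over that interval. In this sketch the breakpoint is assumed known and the trace is synthetic; the paper estimates the segmentation from the data.

```python
def reg(times):
    """Cumulative event count at each event time: the Rate Evolution Graph."""
    return [(t, i + 1) for i, t in enumerate(sorted(times))]

def slope(points):
    """Least-squares slope of a list of (t, count) pairs."""
    n = len(points)
    mt = sum(t for t, _ in points) / n
    mc = sum(c for _, c in points) / n
    num = sum((t - mt) * (c - mc) for t, c in points)
    den = sum((t - mt) ** 2 for t, _ in points)
    return num / den

# Synthetic trace: ~2 events/s for 5 s, then ~0.5 events/s afterwards.
events = [0.5 * k for k in range(1, 11)] + [5.0 + 2.0 * k for k in range(1, 11)]
graph = reg(events)
fast, slow = slope(graph[:10]), slope(graph[10:])
print(round(fast, 2), round(slow, 2))
```

The recovered slopes (2.0 and 0.5 events per second) are the piecewise rates that would parameterize the time-dependent probability mass function.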
Assessing DSGE model nonlinearities
S. Borağan Aruoba; Luigi Bocola; Frank Schorfheide
2013-01-01
We develop a new class of nonlinear time-series models to identify nonlinearities in the data and to evaluate nonlinear DSGE models. U.S. output growth and the federal funds rate display nonlinear conditional mean dynamics, while inflation and nominal wage growth feature conditional heteroskedasticity. We estimate a DSGE model with asymmetric wage/price adjustment costs and use predictive checks to assess its ability to account for nonlinearities. While it is able to match the nonlinear infla...
Modelling manufacturing inventories
John D. Tsoukalas
2005-01-01
This paper presents and applies a stage-of-fabrication inventory model to the UK manufacturing sector. The model emphasises the interaction between input (raw materials and work-in-process) and output (finished goods) inventories. This interaction is an important empirical regularity and proves critical for the ability of the model to fit the data. Decisions about input and output inventory investment cannot be considered in isolation from each other, but must be analysed jointly. Overall, th...
Multivariate Rotated ARCH models
Shephard, Neil; Sheppard, Kevin; Noureldin, Diaa
2012-01-01
This paper introduces a new class of multivariate volatility models which is easy to estimate using covariance targeting, even with rich dynamics. We call them rotated ARCH (RARCH) models. The basic structure is to rotate the returns and then to fit them using a BEKK-type parameterization of the time-varying covariance whose long-run covariance is the identity matrix. The extension to DCC-type parameterizations is given, introducing the rotated conditional correlation (RCC) model. Inference f...
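The rotation step behind covariance targeting can be sketched as follows: standardize the returns by the inverse square root of their sample covariance, so the rotated series has identity long-run covariance. This is only the pre-processing step, shown for the bivariate case with a closed-form 2x2 matrix square root and synthetic data; the BEKK/DCC dynamics of the paper are not implemented.

```python
import random

def cov2(x, y):
    """Sample covariance matrix of two series, returned as (a, b, c) = [[a,b],[b,c]]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((u - mx) ** 2 for u in x) / n
    c = sum((v - my) ** 2 for v in y) / n
    b = sum((u - mx) * (v - my) for u, v in zip(x, y)) / n
    return a, b, c

def inv_sqrt2(a, b, c):
    """Inverse square root of the SPD matrix [[a, b], [b, c]] (2x2 closed form)."""
    s = (a * c - b * b) ** 0.5               # sqrt(det)
    t = (a + c + 2.0 * s) ** 0.5             # sqrt(trace + 2*sqrt(det))
    ra, rb, rc = (a + s) / t, b / t, (c + s) / t   # matrix square root
    det = ra * rc - rb * rb
    return rc / det, -rb / det, ra / det     # its inverse

random.seed(0)
x = [random.gauss(0.0, 2.0) for _ in range(5000)]
y = [0.8 * u + random.gauss(0.0, 1.0) for u in x]   # correlated "returns"
a, b, c = cov2(x, y)
ia, ib, ic = inv_sqrt2(a, b, c)
rx = [ia * u + ib * v for u, v in zip(x, y)]        # rotated returns
ry = [ib * u + ic * v for u, v in zip(x, y)]
print(cov2(rx, ry))  # sample covariance of the rotated series: identity
```

After this rotation, fitting a BEKK-type recursion whose long-run covariance is pinned to the identity is what makes estimation with covariance targeting easy.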
Modelling Realized Covariances
Xin Jin; John M Maheu
2009-01-01
This paper proposes a new dynamic model of realized covariance (RCOV) matrices based on recent work in time-varying Wishart distributions. The specifications can be linked to returns for a joint multivariate model of returns and covariance dynamics that is both easy to estimate and forecast. Realized covariance matrices are constructed for 5 stocks using high-frequency intraday prices based on positive semi-definite realized kernel estimates. We extend the model to capture the strong persiste...
Behavioral Modeling of Memcapacitor
D. Biolek
2011-04-01
Two behavioral models of the memcapacitor are developed and implemented in SPICE-compatible simulators. Both models are related to the charge-controlled memcapacitor, the capacitance of which is controlled by the amount of electric charge conveyed through it. The first model starts from the state description of the memcapacitor, whereas the second one uses the memcapacitor constitutive relation as the only input data. Results of transient analyses clearly show the basic fingerprints of the memcapacitor.
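A toy version of the state-description approach can be simulated in a few lines: take the inverse memcapacitance to depend linearly on the internal state, the time integral of charge. The parameter values and the linear state dependence here are illustrative assumptions, not the SPICE models of the paper.

```python
import math

C0, k = 1.0, 0.5        # base capacitance and state coupling (assumed values)
dt = 1e-3
steps = int(2 * math.pi / dt)
sigma = 0.0             # internal state: time integral of the charge
trace = []
for n in range(steps):
    q = math.sin(n * dt)            # driving charge waveform
    sigma += q * dt                 # state update
    v = q * (1.0 / C0 + k * sigma)  # voltage = inverse memcapacitance * charge
    trace.append((q, v))

# Memory fingerprint: at q = 0 the q-v loop is pinched (v = 0 exactly),
# while equal charges reached at different times give different voltages.
q_i, v_i = trace[steps // 4]        # near the first charge peak
print(round(q_i, 2), round(v_i, 2))
```

Plotting v against q over one period would show the pinched hysteresis loop that is the memcapacitor's basic fingerprint.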
Klein, Jacques; Kienzle, J. (ed.); Morin, B.; Jézéquel, J.-M.
2009-01-01
Since software systems need to be continuously available, their ability to evolve at runtime is a key issue. The emergence of models@runtime, combined with Aspect-Oriented Modeling techniques, is a promising approach to tame the complexity of adaptive systems. However, with no support for aspect unweaving, these approaches are not agile enough in an adaptive system context. In case of small modifications, the adapted model has to be generated by again weaving all the aspects, even those uncha...
Williams David K; Bursac Zoran; Tabatabai Mohammad A; Singh Karan P
2007-01-01
A new two-parameter probability distribution called hypertabastic is introduced to model survival or time-to-event data. A simulation study was carried out to evaluate the performance of the hypertabastic distribution in comparison with popular distributions. We then demonstrate the application of the hypertabastic survival model by applying it to data from two motivating studies. The first one demonstrates the proportional hazards version of the model by applying it to a data se...
Shcherbina, Masha; Tirozzi, Brunello
1995-01-01
The Hopfield model in a transverse field is investigated in order to clarify how quantum fluctuations affect the macroscopic behavior of neural networks. Using the Trotter decomposition and the replica method, we find that the $\alpha$ (the ratio of the number of stored patterns to the system size)-$\Delta$ (the strength of the transverse field) phase diagram of this model in the ground state resembles the $\alpha$-$T$ phase diagram of the Hopfield model quantitatively, within the replica-sym...
Acher, Mathieu; Heymans, Patrick; Collet, Philippe; Quinton, Clément; Lahire, Philippe; Merle, Philippe
2012-01-01
Feature models are a widespread means to represent commonality and variability in software product lines. As is the case for other kinds of models, computing and managing feature model differences is useful in various real-world situations. In this paper, we propose a set of novel differencing techniques that combine syntactic and semantic mechanisms, and automatically produce meaningful differences. Practitioners can exploit our results in various ways: to understan...
Thomas Breuer; Imre Csiszar
2013-01-01
We propose to interpret distribution model risk as sensitivity of expected loss to changes in the risk factor distribution, and to measure the distribution model risk of a portfolio by the maximum expected loss over a set of plausible distributions defined in terms of some divergence from an estimated distribution. The divergence may be relative entropy, a Bregman distance, or an $f$-divergence. We give formulas for the calculation of distribution model risk and explicitly determine the worst...
Engineering Delta Modeling Languages
Haber, Arne; Hölldobler, Katrin; Kolassa, Carsten; Look, Markus; Müller, Klaus; Rumpe, Bernhard; Schaefer, Ina
2014-01-01
Delta modeling is a modular, yet flexible approach to capture spatial and temporal variability by explicitly representing the differences between system variants or versions. The conceptual idea of delta modeling is language-independent. But, in order to apply delta modeling for a concrete language, so far, a delta language had to be manually developed on top of the base language leading to a large variety of heterogeneous language concepts. In this paper, we present a process that allows der...
Dave Clarke; Michiel Helvensteijn; Ina Schaefer
2011-01-01
Delta modeling is an approach to facilitate automated product derivation for software product lines. It is based on a set of deltas specifying modifications that are incrementally applied to a core product. The applicability of deltas depends on feature-dependent conditions. This paper presents abstract delta modeling, which explores delta modeling from an abstract, algebraic perspective. Compared to previous work, we take a more flexible approach with respect to conflicts between modificatio...
Modelling airport congestion charges
Janić, Milan
2012-01-01
This article deals with modelling congestion charges at an airport. In this context, congestion charging means internalizing the cost of the marginal delays that a flight imposes on other flights due to congestion. The modelling includes estimating congestion and flight delays, the cost of these delays, and the efficiency of particular flights following the introduction of a congestion charge. The models are applied to an airport (New York LaGuardia) to illustrate their ability to handle mor...
Modeling Frequency Comb Sources
Li Feng
2016-06-01
Frequency comb sources have revolutionized metrology and spectroscopy and found applications in many fields. Stable, low-cost, high-quality frequency comb sources are important to these applications. Modeling of frequency comb sources helps the understanding of the operation mechanism and the optimization of the design of such sources. In this paper, we review the theoretical models used and recent progress in the modeling of frequency comb sources.
Hsu, H. M.
1980-01-01
A mesoscale numerical model of the Florida peninsula was formulated and applied to a dry, neutral atmosphere. The prospective use of the STAR-100 computer for the submesoscale model is discussed. The numerical model presented is tested under synoptically undisturbed conditions. Two cases, differing only in the direction of the prevailing geostrophic wind, are examined: a prevailing southwest wind and a prevailing southeast wind, both 6 m/sec at all levels initially.
Togelius, Julian; Shaker, Noor; Yannakakis, Georgios N.
2013-01-01
We argue for the use of active learning methods for player modelling. In active learning, the learning algorithm chooses where to sample the search space so as to optimise learning progress. We hypothesise that player modelling based on active learning could result in vastly more efficient learning, but will require big changes in how data is collected. Some example active player modelling scenarios are described. A particular form of active learning is also equivalent to an influential forma...
Combustion theory and modeling
Buckmaster, J; Clavin, Paul; Liñán Martínez, Amable; Matalon, M.; Peters, N; Sivashinsky, G.; Williams, F. A.
2005-01-01
In honor of the fiftieth anniversary of the Combustion Institute, we are asked to assess accomplishments of theory in combustion over the past fifty years and prospects for the future. The title of our article is chosen to emphasize that development of theory necessarily goes hand-in-hand with specification of a model. Good conceptual models underlie successful mathematical theories. Models and theories are discussed here for deflagrations, detonations, diffusion flames, ignition, propellant ...
Mobile Services Adoption Model
Abu Ghannam, Bashar
2011-01-01
This research presents an explanatory model for consumers' adoption of mobile services. The model uses the Unified Theory of Acceptance and Use of Technology presented by Venkatesh in 2003 as a baseline and integrates Perceived Enjoyment, Mobile Affinity, Perceived Price of Service and Frequency of Mobile Usage to investigate the Attitude toward and the Intention to Use mobile services. The proposed model was empirically tested using data collected from a field survey where 1095 responde...
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
Business processes are the key asset of every organization. The design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. The paper attempts to define a new conceptual model of an IT service provider; it can be viewed as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.
Next Generation Relativistic Models
Furnstahl, R. J.
2003-01-01
The current generation of covariant mean-field models has had many successes in calculations of bulk observables for medium to heavy nuclei, but there remain many open questions. New challenges are confronted when trying to systematically extend these models to reliably address nuclear structure physics away from the line of stability. In this lecture, we discuss a framework for the next generation of relativistic models that can address these questions and challenges. We interpret nuclear me...
The Office of Civilian Radioactive Waste Management and the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC) have supported the development of the Analytical Repository Source-Term (AREST) code at Pacific Northwest Laboratory. AREST is a computer model developed to evaluate radionuclide release from an underground geologic repository. The AREST code can be used to calculate/estimate the amount and rate of each radionuclide that is released from the engineered barrier system (EBS) of the repository. The EBS is the man-made or disrupted area of the repository. AREST was designed as a system-level model to simulate the behavior of the total repository by combining process-level models for the release from an individual waste package or container. AREST contains primarily analytical models for calculating the release/transport of radionuclides to the host rock that surrounds each waste package. Analytical models were used because of their small computational overhead, which allows all the input parameters to be drawn from statistical distributions. Recently, a one-dimensional numerical model was also incorporated into AREST, to allow for more detailed modeling of the transport process with arbitrary-length decay chains. The next step in modeling the EBS is to develop a model that couples the probabilistic capabilities of AREST with a more detailed process model. This model will need to look at the reactive coupling of the processes that are involved in the release process. Such coupling would include: (1) the dissolution of the waste form, (2) the geochemical modeling of the groundwater, (3) the corrosion of the container overpacking, and (4) the backfill material, to name a few. Several of these coupled processes are already incorporated in the current version of AREST.
Young, Michael F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-07-01
Aerosol particles that deposit on surfaces may be subsequently resuspended by air flowing over the surface. A review of models for this liftoff process is presented and compared to available data. Based on this review, a model that agrees with existing data and is readily computed is presented for incorporation into a system-level code such as MELCOR.
Audestad, Jan Arild
2015-01-01
In this text, we study the temporal behavior of markets using models expressible as ordinary differential equations. The markets studied are those where each customer buys only one copy of the good, for example, subscriptions to smartphone service, journals and newspapers, and goods such as books, music and games. The underlying model is the diffusion model of Frank Bass. The evolution of markets with no competitors and markets with several competitors is analyzed where, in particular, the effect...
Bergen, Benjamin Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-07-07
This is the PDF of a PowerPoint presentation from a teleconference on Los Alamos programming models. It starts by listing the assumptions for the programming models and then details a hierarchical programming model at the System Level and Node Level. It then describes how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
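The core idea, Bayesian estimation regularizing a logistic default model when defaults are scarce, can be sketched as MAP estimation with a Gaussian prior on the coefficients. This is a minimal one-feature sketch on synthetic data; the paper's piecewise structure and Box-Cox transform of the credit score are omitted.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def map_logistic(xs, ys, prior_var=1.0, lr=0.1, iters=2000):
    """Gradient ascent on log-likelihood plus a Gaussian log-prior (one feature)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(iters):
        gw = sum((y - sigmoid(w * x + b)) * x for x, y in zip(xs, ys)) - w / prior_var
        gb = sum(y - sigmoid(w * x + b) for x, y in zip(xs, ys)) - b / prior_var
        w += lr * gw / n
        b += lr * gb / n
    return w, b

random.seed(2)
# Synthetic low-default portfolio: default probability rises with score x.
xs = [random.uniform(-2.0, 2.0) for _ in range(400)]
ys = [1 if random.random() < sigmoid(2.0 * x - 3.0) else 0 for x in xs]
w, b = map_logistic(xs, ys)
print(round(w, 2), round(b, 2))
```

The prior shrinks the estimates toward zero, which is what stabilizes probability-of-default estimates when the portfolio contains very few defaults.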
Suvorova, A V; Kuznetsov, S N
1999-01-01
A review of empirical, data-based models of the magnetopause and a comparative analysis are given, with special attention to the dynamics of the dayside boundary. Recently, different research groups have presented new magnetopause models as an alternative to the model of Roelof and Sibeck (J. Geophys. Res. 94, 15,125). All models have a greater parametric extent than the model of Roelof and Sibeck and allow prediction of the magnetopause location during extreme solar wind and IMF conditions. The models developed using classic multi-factor regression analysis (J. Geophys. Res. 102, 9497-9511) are two-dimensional and bivariate. The model created using artificial neural networks (ANNs) is three-dimensional and contains multiple parameters. A statistical study of Kuznetsov et al., confirmed by the ANN modeling of Dmitriev et al., has shown that the shape of the dayside magnetopause has a dawn-dusk asymmetry. The uncertainty in the determination of the dayside magnetopause position is practically the same for these models in spite ...
Multifamily Envelope Leakage Model
Faakye, O. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Griffiths, D. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)
2015-05-01
The objective of the 2013 research project was to develop the model for predicting fully guarded test results (FGT), using unguarded test data and specific building features of apartment units. The model developed has a coefficient of determination R2 value of 0.53 with a root mean square error (RMSE) of 0.13. Both statistical metrics indicate that the model is relatively strong. When tested against data that was not included in the development of the model, prediction accuracy was within 19%, which is reasonable given that seasonal differences in blower door measurements can vary by as much as 25%.
Valgas, Helio Moreira; Pinto, Roberto del Giudice R.; Franca, Carlos [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil); Lambert-Torres, Germano; Silva, Alexandre P. Alves da; Pires, Robson Celso; Costa Junior, Roberto Affonso [Escola Federal de Engenharia de Itajuba, MG (Brazil)
1994-12-31
Accurate dynamic load models allow more precise calculations of power system controls and stability limits, which are critical mainly in the operation planning of power systems. This paper describes the development of a computer program (software) for static and dynamic load model studies using the measurement approach for the CEMIG system. Two dynamic load model structures are developed and tested. A procedure for applying a set of measured data from an on-line transient recording system to develop load models is described. (author) 6 refs., 17 figs.
Parametric Models of Periodogram
P. Mohan; A. Mangalam; S. Chattopadhyay
2014-09-01
The maximum likelihood estimator is used to determine fit parameters for various parametric models of the Fourier periodogram, followed by the selection of the best-fit model amongst competing models using the Akaike information criterion. This analysis, when applied to light curves of active galactic nuclei, can be used to infer the presence of quasi-periodicity and break or knee frequencies. The extracted information can be used to place constraints on the mass, spin and other properties of the putative central black hole and the region surrounding it through theoretical models involving disk and jet physics.
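A minimal sketch of the workflow just described, under the common Whittle approximation (periodogram ordinates exponentially distributed about the true spectrum): fit two parametric spectral models by maximum likelihood and select between them with AIC. The power-law form, the grid search, and all numbers are illustrative assumptions, not the paper's actual models:

```python
import math, random

random.seed(1)

# Synthetic periodogram: ordinates I(f_j) are exponentially distributed
# with mean S(f_j), the true power spectral density (Whittle approximation).
# True PSD is a power law S(f) = N * f^(-alpha); values are illustrative.
freqs = [j / 1024.0 for j in range(1, 513)]
N_true, alpha_true = 1.0, 1.5
I = [N_true * f ** (-alpha_true) * random.expovariate(1.0) for f in freqs]

def whittle_loglik(S):
    # log-likelihood of exponentially distributed ordinates with means S_j
    return -sum(math.log(s) + i / s for s, i in zip(S, I))

# Model 1: power law, parameters (N, alpha) fitted by a crude grid search.
best = (-float("inf"), None, None)
for ai in range(0, 61):
    alpha = ai * 0.05
    # given alpha, the MLE of N is the mean of I_j * f_j^alpha
    N = sum(i * f ** alpha for i, f in zip(I, freqs)) / len(I)
    ll = whittle_loglik([N * f ** (-alpha) for f in freqs])
    if ll > best[0]:
        best = (ll, N, alpha)
ll_pl, N_hat, alpha_hat = best

# Model 2: flat spectrum (white noise), one parameter c with MLE c = mean(I).
c = sum(I) / len(I)
ll_flat = whittle_loglik([c] * len(I))

# Akaike information criterion: AIC = 2k - 2 ln L; lower is better.
aic_pl = 2 * 2 - 2 * ll_pl
aic_flat = 2 * 1 - 2 * ll_flat
print(alpha_hat, aic_pl < aic_flat)
```

The same recipe extends to the richer models the abstract mentions (broken power laws, knees, Lorentzian quasi-periodic components) by swapping in the corresponding S(f) and fitting its extra parameters.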
The first Seminar on Groundwater Modelling was arranged by VTT (Reactor Laboratory) in Espoo, Finland, in May 1991. The one-day seminar dealt with both the modelling of groundwater geochemistry and transport and the mathematical methods used for modelling. The seminar concentrated on giving a broad picture of the applications of groundwater modelling, e.g. nuclear waste, groundwater resources (including artificial groundwater) and pollution. The participants came from research institutes and universities as well as engineering companies. Articles are published in Finnish with English abstracts.
Delocalization in polymer models
Jitomirskaya, S Yu; Stolz, G
2003-01-01
A polymer model is a one-dimensional Schrödinger operator composed of two finite building blocks. If the two associated transfer matrices commute, the corresponding energy is called critical. Such critical energies appear in physical models, an example being the widely studied random dimer model. Although the random models are known to have pure-point spectrum with exponentially localized eigenstates for almost every configuration of the polymers, the spreading of an initially localized wave packet is here proven to be at least diffusive for every configuration.
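The commutation condition can be made concrete in a toy tight-binding setting (an assumed convention, not spelled out in the abstract): the one-step transfer matrix at energy E over a site with potential v is T(E, v) = [[E - v, -1], [1, 0]], and a "dimer" block repeats the same site twice. At E equal to one of the site potentials, that block's matrix squares to minus the identity and so commutes with every other block, which is a critical energy in the sense used above:

```python
# Toy check of commuting transfer matrices in a random-dimer-style model.
# Convention assumed: T(E, v) = [[E - v, -1], [1, 0]] per lattice site.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transfer(E, v):
    return [[E - v, -1.0], [1.0, 0.0]]

def block(E, v):
    T = transfer(E, v)
    return matmul(T, T)          # dimer block: two identical sites

def commutator_norm(A, B):
    AB, BA = matmul(A, B), matmul(B, A)
    return max(abs(AB[i][j] - BA[i][j]) for i in range(2) for j in range(2))

va, vb = 0.5, -0.5               # illustrative dimer potentials
# At E = va, T(E, va)^2 = -Identity, so the two blocks commute exactly.
crit = commutator_norm(block(va, va), block(va, vb))
# At a generic energy the commutator is nonzero.
generic = commutator_norm(block(0.1, va), block(0.1, vb))
print(crit, generic)
```

At such a critical energy the polymer chain is transparent in a way that produces the delocalization effects the abstract describes, even though generic energies remain localized.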
Juhl, Joakim
This thesis is about mathematical modelling and technology development. While mathematical modelling has become widely deployed within a broad range of scientific practices, it has also gained a central position within technology development. The intersection of mathematical modelling and...... generated. Structured around the intersections of certainty, agency, and dependences, the thesis’ findings are in chapter 9 extended to a discussion of the theoretical fundament through which we interpret the regulation project and its use of modelling. I demonstrate a novel framework that I term...
Tanwir, Savera
2014-01-01
There has been a phenomenal growth in video applications over the past few years. An accurate traffic model of Variable Bit Rate (VBR) video is necessary for performance evaluation of a network design and for generating synthetic traffic that can be used for benchmarking a network. A large number of models for VBR video traffic have been proposed in the literature for different types of video in the past 20 years. Here, the authors have classified and surveyed these models and have also evaluated the models for H.264 AVC and MVC encoded video and discussed their findings.
Brown-VanHoozer, S. A.
1999-06-02
Conscious awareness of our environment is based on a feedback loop comprised of sensory input transmitted to the central nervous system, leading to the construction of our "model of the world" (Lewis et al., 1982). We then assimilate the neurological model at the unconscious level into information we can later consciously consider useful in identifying belief systems and behaviors for designing diverse systems. Thus, we can avoid potential problems based on our open-to-error perceived reality of the world. By understanding how our model of reality is organized, we allow ourselves to transcend content and develop insight into how effective choices and belief systems are generated through sensory-derived processes. These are the processes which give the designer the ability to meta-model (build a model of a model) the user, consequently matching the mental model of the user with that of the designer and, coincidentally, forming rapport between the two participants. The information shared between the participants is neither assumed nor generalized; it is closer to equivocal, thus minimizing error through a sharing of each other's model of reality. How to identify individual mental mechanisms or processes, how to organize the individual strategies of these mechanisms into useful patterns, and how to formulate these into models for success and knowledge-based outcomes is the subject of the discussion that follows.
Computer Modeling and Simulation
Pronskikh, V. S. [Fermilab
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance, and they have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction must be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach, generally applicable in practice and based on differences in epistemic strategies and scopes, to alleviating the Duhem problem for verification and validation.
Faraway, Julian J
2014-01-01
A Hands-On Way to Learning Data Analysis. Part of the core of statistics, linear models are used to make predictions and to explain the relationship between the response and the predictors. Understanding linear models is crucial to a broader competence in the practice of statistics. Linear Models with R, Second Edition explains how to use linear models in physical science, engineering, social science, and business applications. The book incorporates several improvements that reflect how the world of R has greatly expanded since the publication of the first edition. New to the Second Edition: Reorganiz
Business Model Innovation Leadership
Lindgren, Peter; Rasmussen, Ole Horn
2012-01-01
Leading business model (BM) strategizing through “the field of innovation” has not yet been covered in the business model and innovation leadership literature. This is a bit peculiar considering that there has been an increased focus on BM innovation (BMI) by academics and industry since 2011. ...... This emphasizes the importance of questioning: How is BM innovation leadership (BMIL) carried out in companies in relation to various BM(s) and BMI tasks and throughout their business model innovation process? And how can innovation leadership be related to BMI? A framework model for BMIL based on case...
Croatian Cadastre Database Modelling
Zvonko Biljecki
2013-04-01
The Cadastral Data Model has been developed as part of a larger programme to improve the products and production environment of the Croatian Cadastral Service of the State Geodetic Administration (SGA). The goal of the project was to create a cadastral data model conforming to the relevant standards and specifications in the field of geoinformation (GI) adopted by the international organisations for standardisation competent for GI (ISO/TC 211 and OpenGIS) and their implementations. The main guidelines during the project were object-oriented conceptual modelling of the updated users' requests and a "new" cadastral data model designed by the SGA - Faculty of Geodesy - Geofoto LLC project team. The UML of the conceptual model is given for all feature categories and is described only at class level. The next step was the UML technical model, which was developed from the UML conceptual model. The technical model integrates different UML schemas into one united schema. XML (eXtensible Markup Language) was applied for the XML description of the UML models, and the XML schema was then transferred into a GML (Geography Markup Language) application schema. With this procedure we have completely described the behaviour of each cadastral feature and the rules for the transfer and storage of cadastral features into the database.
Inside - Outside Model Viewing
Nikolov, Ivan Adriyanov
2016-01-01
components of the model, their proportions compared to each other and the overall design. A variety of augmented reality(AR) applications have been created for overall visualization of large scale models. For tours inside 3D renderings of models many immersive virtual reality (VR) applications exist. Both...... types of applications have their limitation, omitting either important details in the AR case or the full picture in the case of VR. This paper presents a low-cost way to demonstrate models using a hybrid virtual environment system (HVE), combining virtual reality and augmented reality visualization...
Prusinkiewicz, Przemyslaw; Rolland-Lagan, Anne-Gaëlle
2006-02-01
Applications of computational techniques to developmental plant biology include the processing of experimental data and the construction of simulation models. Substantial progress has been made in these areas over the past few years. Complex image-processing techniques are used to integrate sequences of two-dimensional images into three-dimensional descriptions of development over time and to extract useful quantitative traits. Large amounts of data are integrated into empirical models of developing plant organs and entire plants. Mechanistic models link molecular-level phenomena with the resulting phenotypes. Several models shed light on the possible properties of active auxin transport and its role in plant morphogenesis. PMID:16376602
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes and...... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
Højsgaard, Søren; Lauritzen, Steffen
2012-01-01
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In add
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is Model Driven Architecture being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Kühn, Michael
In order to deal with the complexity of natural systems, simplified models are employed to illustrate the principal and regulatory factors controlling a chemical system. Following the aphorism of Albert Einstein, "Everything should be made as simple as possible, but not simpler", models need not be completely realistic to be useful (Stumm and Morgan 1996), but they need to strike a successful balance between realism and practicality. Properly constructed, a model is neither so simplified that it is unrealistic nor so detailed that it cannot be readily evaluated and applied to the problem of interest (Bethke 1996). The results of a model have to be at least partially observable or experimentally verifiable (Zhu and Anderson 2002). Geochemical modeling theories are presented here in a sequence of increasing complexity, from geochemical equilibrium models to kinetic, reaction path, and finally coupled transport and reaction models. The description is far from complete but provides what is needed for the set-up of reactive transport models of hydrothermal systems, as done within subsequent chapters. Extensive reviews of geochemical models in general can be found in the literature (Appelo and Postma 1999, Bethke 1996, Melchior and Bassett 1990, Nordstrom and Ball 1984, Paschke and van der Heijde 1996).
Modeling magnetic pulse compressors
In this paper, the author considers the problem of modeling the dynamic performance of high-average-power, high repetition-rate magnetic pulse compressors. The author is particularly concerned with developing system models suitable for studying output pulse stability in high repetition rate applications. To this end, the author presents a magnetic switch model suitable for system studies and discusses a modeling tool being developed to perform these studies. The author concludes with some preliminary results of efforts to simulate the MAG1D compressor performance
Quantum Group $\\sigma$ Models
Frishman, Yitzhak; Zakrzewski, W J
1993-01-01
Field-theoretic models for fields taking values in quantum groups are investigated. First we consider the $SU_q(2)$ $\sigma$ model ($q$ real) expressed in terms of basic notions of noncommutative differential geometry. We discuss the case in which the $\sigma$-model fields are represented as products of conventional $\sigma$ fields and of the coordinate-independent algebra. An explicit example is provided by the $U_q(2)$ $\sigma$ model with $q^N=1$, in which case the quantum matrices $U_q(2)$ are realised as $2N\times 2N$ unitary matrices. Open problems are pointed out.
Quantum Group $\\sigma$ Models
Frishman, Y.; Lukierski, J.; Zakrzewski, W. J.
1992-01-01
Field-theoretic models for fields taking values in quantum groups are investigated. First we consider the $SU_q(2)$ $\sigma$ model ($q$ real) expressed in terms of basic notions of noncommutative differential geometry. We discuss the case in which the $\sigma$-model fields are represented as products of conventional $\sigma$ fields and of the coordinate-independent algebra. An explicit example is provided by the $U_q(2)$ $\sigma$ model with $q^N=1$, in which case quantum matrices $U_q(2)$ ar...
He, Y.; Jejjala, V.
2003-01-01
Inspired by a formal resemblance of certain q-expansions of modular forms and the master field formalism of matrix models in terms of Cuntz operators, we construct a Hermitian one-matrix model, which we dub the ``modular matrix model.'' Together with an N=1 gauge theory and a special Calabi-Yau geometry, we find a modular matrix model that naturally encodes the Klein elliptic j-invariant, and hence, by Moonshine, the irreducible representations of the Fischer-Griess Monster group.
Bahr, Benjamin [DAMTP, Centre for Mathematical Sciences, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Hellmann, Frank; Kaminski, Wojciech; Kisielowski, Marcin; Lewandowski, Jerzy [Instytut Fizyki Teoretycznej, Uniwersytet Warszawski, ul. Hoza 69, 00-681 Warszawa (Poland)
2011-05-21
The goal of this paper is to introduce a systematic approach to spin foams. We define operator spin foams, that is foams labelled by group representations and operators, as our main tool. A set of moves we define in the set of the operator spin foams (among other operations) allows us to split the faces and the edges of the foams. We assign to each operator spin foam a contracted operator, by using the contractions at the vertices and suitably adjusted face amplitudes. The emergence of the face amplitudes is the consequence of assuming the invariance of the contracted operator with respect to the moves. Next, we define spin foam models and consider the class of models assumed to be symmetric with respect to the moves we have introduced, and assuming their partition functions (state sums) are defined by the contracted operators. Briefly speaking, those operator spin foam models are invariant with respect to the cellular decomposition, and are sensitive only to the topology and colouring of the foam. Imposing an extra symmetry leads to a family we call natural operator spin foam models. This symmetry, combined with assumed invariance with respect to the edge splitting move, determines a complete characterization of a general natural model. It can be obtained by applying arbitrary (quantum) constraints on an arbitrary BF spin foam model. In particular, imposing suitable constraints on a spin(4) BF spin foam model is exactly the way we tend to view 4D quantum gravity, starting with the BC model and continuing with the Engle-Pereira-Rovelli-Livine (EPRL) or Freidel-Krasnov (FK) models. That makes our framework directly applicable to those models. Specifically, our operator spin foam framework can be translated into the language of spin foams and partition functions. Among our natural spin foam models there are the BF spin foam model, the BC model, and a model corresponding to the EPRL intertwiners. Our operator spin foam framework can also be used for more general spin
Proactive Quality Guidance for Model Evolution in Model Libraries
Ganser, Andreas; Lichter, Horst; Roth, Alexander; Rumpe, Bernhard
2014-01-01
Model evolution in model libraries differs from general model evolution. It limits the scope to the manageable and allows to develop clear concepts, approaches, solutions, and methodologies. Looking at model quality in evolving model libraries, we focus on quality concerns related to reusability. In this paper, we put forward our proactive quality guidance approach for model evolution in model libraries. It uses an editing-time assessment linked to a lightweight quality model, corresponding m...
A Model-Driven Engineering Framework for Constrained Model Search
Kleiner, Mathias
2009-01-01
This document describes a formalization, a solver-independent methodology and implementation alternatives for realizing constrained model search in a model-driven engineering framework. The proposed approach combines model-driven engineering tools ((meta)model transformations, models to text, text to models) and constraint programming techniques. Based on previous research, motivations for model search are first introduced together with objectives and background context. A theory of model sear...
Comparative Analysis of Parametric Engine Model and Engine Map Model
Zeeshan Ali Memon; Sadiq Ali Shah; Muhammas Saleh Jumani
2015-01-01
Two different engine models, a parametric engine model and an engine map model, are employed to analyze the dynamics of an engine during gear shifting. The models are analyzed under critical transitional manoeuvres to investigate their appropriateness for vehicle longitudinal dynamics. The simulation results for both models have been compared. The results show that the engine map model matches the parametric model well and can be used for the vehicle longitudinal dynamics model. The proposed ap...
Biosphere Process Model Report
J. Schmitt
2000-05-25
To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor
This paper describes the methodology behind the Hanford Defined Wastes (HDW) model for estimating the contents of Hanford high level waste (HLW) tanks. The HDW model is based on historical process and transaction histories for each tank and has four major components: Waste Status and Transaction Record Summary (WSTRS), Tank Layer Model (TLM), Supernatant Mixing Model (SMM), and HDW Compositions. Three examples of the application of HDW model estimates are described, including comparisons with global site inventories, comparisons with per-tank assays, and comparisons of HDW TOC (Total Organic Carbon) estimates with existing hydrogen watch list tanks. The HDW model provides a cross-check on existing assumptions for the global site inventory of wastes. Note that existing inventories for Hanford are based on much the same source information as the HDW model: chemicals used and process flowsheet data. Despite that, the HDW model predicts that the sodium inventory for Hanford tanks is 40,300 MT (metric tonnes), which is only 58% of the previous baseline estimate of 69,000 MT. There are other significant differences for inventories of chromium, iron, and nitrate as well. There are two causes for these differences: (1) previous neglect of chemical inventory placed into the ground at Hanford; and (2) double counting attributed to tank inventory that was retrieved, reprocessed, and returned to the tanks. This double-counted inventory was counted once when it first went into the tanks and then again after it was reprocessed. The HDW model estimates also can provide a basis for targeting tanks for organic safety issues. In particular, the HDW model has shown that 88% of flammable gas watch list tanks have HDW-estimated organic concentrations in excess of 0.64 wt% TOC. Derivation of variabilities for the HDW model estimates and other potential uses will also be outlined
Magretta, Joan
2002-05-01
"Business model" was one of the great buzz-words of the Internet boom. A company didn't need a strategy, a special competence, or even any customers--all it needed was a Web-based business model that promised wild profits in some distant, ill-defined future. Many people--investors, entrepreneurs, and executives alike--fell for the fantasy and got burned. And as the inevitable counterreaction played out, the concept of the business model fell out of fashion nearly as quickly as the .com appendage itself. That's a shame. As Joan Magretta explains, a good business model remains essential to every successful organization, whether it's a new venture or an established player. To help managers apply the concept successfully, she defines what a business model is and how it complements a smart competitive strategy. Business models are, at heart, stories that explain how enterprises work. Like a good story, a robust business model contains precisely delineated characters, plausible motivations, and a plot that turns on an insight about value. It answers certain questions: Who is the customer? How do we make money? What underlying economic logic explains how we can deliver value to customers at an appropriate cost? Every viable organization is built on a sound business model, but a business model isn't a strategy, even though many people use the terms interchangeably. Business models describe, as a system, how the pieces of a business fit together. But they don't factor in one critical dimension of performance: competition. That's the job of strategy. Illustrated with examples from companies like American Express, EuroDisney, WalMart, and Dell Computer, this article clarifies the concepts of business models and strategy, which are fundamental to every company's performance. PMID:12024761
Crop rotation modelling - A European model intercomparison
Kollas, Chris; Kersebaum, Kurt C; Nendel, Claas;
2015-01-01
Diversification of crop rotations is considered an option to increase the resilience of European crop production under climate change. So far, however, many crop simulation studies have focused on predicting single crops in separate one-year simulations. Here, we compared the capability of fifteen crop growth simulation models to predict yields in crop rotations at five sites across Europe under minimal calibration. Crop rotations encompassed 301 seasons of ten crop types common to European agriculture and a diverse set of treatments (irrigation, fertilisation, CO2 concentration, soil types...). For a sound representation of crop rotations, further research is required to synthesise existing knowledge of the physiology of intermediate crops and of carry-over effects from the preceding to the following crop, and to implement/improve the modelling of processes that condition these effects.
Probabilistic models of perception.
Ennis, D.M.
1991-01-01
Mental representations of objects may fluctuate or change from moment to moment. Many models of similarity, identification, classification, and preferential choice are deterministic. These models cannot formally account for perceptual fluctuations. In this thesis, it is assumed that there exists a p
Zhu, Zhifan
2016-01-01
Under the NASA-KAIA-KARI ATM research collaboration agreement, the SOSS ICN model has been developed for Incheon International Airport. This presentation describes the model validation work in the project and shows the results and analysis of the validation.
This report documents a numerical simulation model of the natural gas market in Germany, France, the Netherlands and Belgium. It is part of a project called ''Internationalization and structural change in the gas market'', aiming to enhance the understanding of the factors behind the current and upcoming changes in the European gas market, especially the downstream part of the gas chain. The model takes European border prices of gas as given and adds transmission and distribution costs and profit margins as well as gas taxes to calculate gas prices. The model includes demand sub-models for households, the chemical industry, other industry, the commercial sector and electricity generation. Demand responses to price changes are assumed to take time, and the long-run effects are significantly larger than the short-run effects. For the household sector and the electricity sector, the dynamics are modeled by distinguishing between energy use in the old and new capital stock. In addition to prices and the activity level (GDP), the model includes the extension of the gas network as a potentially important variable in explaining the development of gas demand. The properties of numerical simulation models are often described by dynamic multipliers, which trace the behaviour of important variables when key explanatory variables are changed. At the end, the report shows the results of a model experiment in which the costs of transmission and distribution were reduced. 6 refs., 9 figs., 1 tab
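The short-run versus long-run demand dynamics described above are commonly captured by a partial-adjustment scheme, in which demand closes only a fraction of the gap to its long-run level each period. The sketch below assumes that form with illustrative constant elasticities; the report's actual equations and parameter values are not given here:

```python
# Partial-adjustment demand sketch: x_t = x_{t-1} + lam * (x* - x_{t-1}),
# so the short-run response to a price change is lam times the long-run one.
# All elasticities and numbers are illustrative assumptions.
def long_run_demand(price, income, a=100.0, eps_p=-0.6, eps_y=0.8):
    # constant-elasticity long-run demand x* = a * p^eps_p * y^eps_y
    return a * price ** eps_p * income ** eps_y

def simulate(prices, income=1.0, lam=0.2):
    x = long_run_demand(prices[0], income)   # start in long-run equilibrium
    path = []
    for p in prices:
        target = long_run_demand(p, income)
        x += lam * (target - x)              # partial adjustment each period
        path.append(x)
    return path

# dynamic multiplier of a permanent 10% price increase in period 6
path = simulate([1.0] * 5 + [1.1] * 45)
print(path[4], path[5], path[-1])  # pre-shock, first-period, long-run level
```

Printing the whole path gives exactly the "dynamic multipliers" the report uses to characterise model behaviour: a small immediate response that accumulates toward the larger long-run effect.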
Pathological Gambling: Psychiatric Models
Westphal, James R.
2008-01-01
Three psychiatric conceptual models: addictive, obsessive-compulsive spectrum and mood spectrum disorder have been proposed for pathological gambling. The objectives of this paper are to (1) evaluate the evidence base from the most recent reviews of each model, (2) update the evidence through 2007 and (3) summarize the status of the evidence for…
Modeling volcanic ash dispersal
CERN. Geneva
2010-01-01
The assessment of volcanic fallout hazard is an important scientific, economic, and political issue, especially in densely populated areas. From a scientific point of view, considerable progress has been made during the last two decades through the use of increasingly powerful computational models and capabilities. Nowadays, models are used to quantify hazard...
Flannery, Maura C.
1997-01-01
Addresses the most popular models currently being chosen for biological research and the reasons behind those choices. Among the current favorites are zebra fish, fruit flies, mice, monkeys, and yeast. Concludes with a brief examination of the ethical issues involved, and why some animals may need to be replaced in research with model systems.…
Perelson, Alan; Conway, Jessica; Cao, Youfang
A large effort is being made to find a means to cure HIV infection. I will present a dynamical model of post-treatment control (PTC), or "functional cure", of HIV infection. Some patients treated with suppressive antiviral therapy have been taken off therapy and then spontaneously control HIV infection such that the amount of virus in the circulation remains undetectable by clinical assays for years. The model explains PTC occurring in some patients by having a parameter regime in which the model exhibits bistability, with both a low and a high steady-state viral load being stable. The model makes a number of predictions about how to attain the low PTC steady state. Bistability in this model depends upon the immune response becoming exhausted when overstimulated. I will also present a generalization of the model in which immunotherapy can be used to reverse immune exhaustion, and compare model predictions with experiments in SIV-infected macaques given immunotherapy and then taken off antiretroviral therapy. Lastly, if time permits, I will discuss one of the hurdles to true HIV eradication, latently infected cells, and present clinical trial data and a new model addressing pharmacological means of flushing out the latent reservoir. Supported by NIH Grants AI028433 and OD011095.
Multilevel Mixture Factor Models
Varriale, Roberta; Vermunt, Jeroen K.
2012-01-01
Factor analysis is a statistical method for describing the associations among sets of observed variables in terms of a small number of underlying continuous latent variables. Various authors have proposed multilevel extensions of the factor model for the analysis of data sets with a hierarchical structure. These Multilevel Factor Models (MFMs)…
Business Model Innovation Leadership
Lindgren, Peter
2012-01-01
When SMEs practice business model (BM) innovation (BMI), leading BMs strategically through the innovation process can be the difference between success and failure for a BM. Business Model Innovation Leadership (BMIL) is, however, extremely complex to carry out, especially for small and medium size...
Rat Endovascular Perforation Model
Sehba, Fatima A.
2014-01-01
Experimental animal models of aneurysmal subarachnoid hemorrhage (SAH) have provided a wealth of information on the mechanisms of brain injury. The Rat endovascular perforation model (EVP) replicates the early pathophysiology of SAH and hence is frequently used to study early brain injury following SAH.
The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs
Stochastic Control - External Models
Poulsen, Niels Kjølstad
2005-01-01
This note is devoted to control of stochastic systems described in discrete time. We are concerned with external descriptions or transfer function models, where we have a dynamic model for the input-output relation only (i.e., no direct internal information). The methods are based on LTI systems and...
Modelling Hyperboloid Sound Scattering
Burry, Jane; Davis, Daniel; Peters, Brady; Ayres, Phil; Klein, John; Pena de Leon, Alexander; Burry, Mark
The Responsive Acoustic Surfaces workshop project described here sought new understandings about the interaction between geometry and sound in the arena of sound scattering. This paper reports on the challenges associated with modelling, simulating, fabricating and measuring this phenomenon using both physical and digital models at three distinct scales. The results suggest hyperboloid geometry, while difficult to fabricate, facilitates sound scattering.
Rutten, R. J.
2002-12-01
This contribution honoring Kees de Jager's 80th birthday is a review of "one-dimensional" solar atmosphere modeling that followed on the initial "Utrecht Reference Photosphere" of Heintze, Hubenet & de Jager (1964). My starting point is the Bilderberg conference, convened by de Jager in 1967 at the time when NLTE radiative transfer theory became mature. The resulting Bilderberg model was quickly superseded by the HSRA and later by the VAL-FAL sequence of increasingly sophisticated NLTE continuum-fitting models from Harvard. They became the "standard models" of solar atmosphere physics, but Holweger's relatively simple LTE line-fitting model still persists as a favorite of solar abundance determiners. After a brief model inventory I discuss subsequent work on the major modeling issues (coherency, NLTE, dynamics) listed as to-do items by de Jager in 1968. The present conclusion is that one-dimensional modeling recovers Schwarzschild's (1906) finding that the lower solar atmosphere is grosso modo in radiative equilibrium. This is a boon for applications regarding the solar atmosphere as one-dimensional stellar example - but the real sun, including all the intricate phenomena that now constitute the mainstay of solar physics, is vastly more interesting.
Dynamic modelling of windmills
Akhmatov, Vladislav; Knudsen, Hans
1999-01-01
An empirical dynamic model of windmills is set up based on analysis of measured Fourier spectra of the active electric power from a wind farm. The model is based on the assumption that eigenswings of the mechanical construction of the windmills excited by the phenomenon of vortex tower interaction...
New Mexico Univ., Albuquerque. American Indian Law Center.
The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…
Radiation risk estimation models.
Hoel, D. G.
1987-01-01
Cancer risk models and their relationship to ionizing radiation are discussed. There are many model assumptions and risk factors that have a large quantitative impact on the cancer risk estimates. Other health end points such as mental retardation may be an even more serious risk than cancer for those with in utero exposures.
A Situational Maintenance Model
Luxhoj, James T.; Thorsteinsson, Uffe; Riis, Jens Ove
1997-01-01
An overview of trends in maintenance management and presentation of a situational model and analytical tools for identification of managerial efforts in maintenance.
Generalized gamma frailty model.
Balakrishnan, N; Peng, Yingwei
2006-08-30
In this article, we present a frailty model using the generalized gamma distribution as the frailty distribution. It is a power generalization of the popular gamma frailty model. It also includes other frailty models such as the lognormal and Weibull frailty models as special cases. The flexibility of this frailty distribution makes it possible to detect a complex frailty distribution structure which may otherwise be missed. Due to the intractable integrals in the likelihood function and its derivatives, we propose to approximate the integrals either by Monte Carlo simulation or by a quadrature method and then determine the maximum likelihood estimates of the parameters in the model. We explore the properties of the proposed frailty model and the computation method through a simulation study. The study shows that the proposed model can potentially reduce errors in the estimation, and that it provides a viable alternative for correlated data. The merits of the proposed model are demonstrated in analysing the effects of sublingual nitroglycerin and oral isosorbide dinitrate on angina pectoris of coronary heart disease patients based on the data set in Danahy et al. (sustained hemodynamic and antianginal effect of high dose oral isosorbide dinitrate. Circulation 1977; 55:381-387). PMID:16220516
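The Monte Carlo approximation of otherwise intractable frailty integrals can be illustrated with the ordinary gamma frailty case, where a closed form exists to check against. This is a minimal sketch of the general idea, not the authors' implementation:

```python
import math
import random

# Marginal survival under a shared frailty Z:  S(t) = E[exp(-Z * H(t))],
# where H(t) is the cumulative baseline hazard. For gamma frailty with
# mean 1 and variance theta the closed form is (1 + theta*H)^(-1/theta);
# we check a Monte Carlo estimate of the integral against it.

def mc_marginal_survival(H, theta, n=200_000, seed=1):
    rng = random.Random(seed)
    shape = 1.0 / theta          # shape * scale = 1 (mean-one frailty)
    total = 0.0
    for _ in range(n):
        z = rng.gammavariate(shape, theta)
        total += math.exp(-z * H)
    return total / n

H, theta = 0.7, 0.5
closed = (1.0 + theta * H) ** (-1.0 / theta)
approx = mc_marginal_survival(H, theta)
```

For the generalized gamma frailty of the article no such closed form is available, which is exactly why the authors resort to Monte Carlo or quadrature.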
Object-Oriented Programming has been used extensively to model the LBL Advanced Light Source 1.5 GeV electron storage ring. This paper describes the present status of the class library construction, with emphasis on dynamic modeling
Olesen, H. R.
1998-01-01
Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.
Generalized simplicial chiral models
Using the auxiliary field representation of the simplicial chiral models on a (d-1)-dimensional simplex, the simplicial chiral models are generalized through replacing the term Tr(AA†) in the Lagrangian of these models by an arbitrary class function of AA†, V(AA†). This is the same method used in defining the generalized two-dimensional Yang-Mills theories (gYM2) from ordinary YM2. We call these models the 'generalized simplicial chiral models'. Using the results of the one-link integral over a U(N) matrix, the large-N saddle-point equations for the eigenvalue density function ρ(z) in the weak (β > β_c) and strong (β < β_c) regions are computed. In d=2, where the model is in some sense related to the gYM2 theory, the saddle-point equations are solved for ρ(z) in the two regions, and the explicit value of the critical point β_c is calculated for V(B) = Tr B^n (B = AA†). For V(B) = Tr B^2, Tr B^3, and Tr B^4, the critical behaviour of the model at d=2 is studied, and by calculating the internal energy, it is shown that these models have a third-order phase transition
Lumped Thermal Household Model
Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob;
2013-01-01
...pump portfolio. Following, we illustrate two disadvantages of individual models: first, that it requires much computational effort to optimize over a large portfolio, and second, that it is difficult to accurately model the houses in certain time periods due to local disturbances. Finally, we propose a ... significantly reduced. Further, the individual disturbances will smooth out as the number of houses in the portfolio increases.
Model Minority Stereotype Reconsidered.
Kobayashi, Futoshi
This paper explores the origin and historical background of the "model minority" stereotype. It includes evidence illustrating problems resulting from the artificial grouping of Asian Americans as one ethnic group and the stereotype's influence on young Asian Americans. In the 1960s, the U.S. media began to portray the model minority through…
Metaphorical Models in Chemistry.
Rosenfeld, Stuart; Bhusan, Nalini
1995-01-01
What happens when students of chemistry fail to recognize the metaphorical status of certain models and interpret them literally? Suggests that such failures lead students to form perceptions of phenomena that can be misleading. Argues that the key to making good use of metaphorical models is a recognition of their metaphorical status. Examines…
These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs
Galle, Per
2000-01-01
In preparation of an analysis of product modelling in terms of communication, this report presents a brief analysis of symbols; that is, the entities by means of which communication takes place. Symbols are defined in such a way as to admit artefacts and models (the latter including linguistic...
Mas, André; Pumo, Besnik
2005-01-01
We introduce and study a new model for functional data. The ARHD is an autoregressive model in which the first order derivative of the random curves appears explicitly. Convergent estimates are obtained through a double penalization method. A simulation and a real case study follow, as well as comparisons with other recent techniques.
Hejlesen, Aske K.; Ovesen, Nis
2012-01-01
This paper presents an experimental approach to teaching 3D modelling techniques in an Industrial Design programme. The approach includes the use of tangible free form models as tools for improving the overall learning. The paper is based on lecturer and student experiences obtained through...
Bounding species distribution models
Thomas J. STOHLGREN; Catherine S. JARNEVICH; Wayne E. ESAIAS; Jeffrey T. MORISETTE
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
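The clamping idea, bounding new predictor values to the range seen during model fitting before prediction, can be sketched in a few lines. Variable names and data below are illustrative, not from the paper:

```python
# "Clamping" a model extrapolation: any predictor value outside the range
# observed in the training data is bounded to the training min/max before
# being fed to the fitted model.

def clamp_predictors(new_rows, train_rows):
    mins = [min(col) for col in zip(*train_rows)]
    maxs = [max(col) for col in zip(*train_rows)]
    return [
        [min(max(v, lo), hi) for v, lo, hi in zip(row, mins, maxs)]
        for row in new_rows
    ]

train = [[10.0, 0.2], [25.0, 0.8], [18.0, 0.5]]   # e.g. temperature, NDVI
new = [[30.0, 0.1], [12.0, 0.6]]
clamped = clamp_predictors(new, train)            # first row is bounded
```

The fitted CART or Maxent model is then evaluated on the clamped rows instead of the raw ones, so it is never asked to extrapolate beyond the environmental bounds of its training data.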
Parks, Melissa
2014-01-01
Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…
Lybarger, Scott; Smith, Craig R.
1996-01-01
Reconstructs Lloyd Bitzer's situational model to serve as a guide for the generation of multiperspectival critical assessments of rhetorical discourse. Uses two of President Bush's speeches on the drug crisis to illustrate how the reconstructed model can account for such modern problems as multiple audiences, perceptions, and exigencies. (PA)
Correia, Sebastiao; Polonyi, Janos; Richert, Jean
2000-01-01
A simple model is presented for the calculation of the quenched average over impurities which are rendered static by setting their mass equal to infinity. The path integral formalism of the second quantized theory contains annealed averages only. The similarity with the Gaussian quenched potential model is discussed.
Franco de los Rios, Camilo Andres; Hougaard, Jens Leth; Nielsen, Kurt
This paper extends the Weighted Overlap Dominance (WOD) model (initially presented in J.L. Hougaard, K. Nielsen. Weighted Overlap Dominance - A procedure for interactive selection on multidimensional interval data. Applied Mathematical Modelling 35, 2011, 3958 - 3969), as an outranking approach for...
Ifenthaler, Dirk; Seel, Norbert M.
2013-01-01
In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…
A deterministic model for transport of radionuclides in rivers was used to predict the activity concentration of radionuclides in scenarios such as the Clinch-Tennessee rivers and the Dnjepr river, for which experimental data were provided in a VAMP subgroup. Different runs of the calculation with data fitting and adaptation of parameters led to improved results. The model gives reasonable agreement with experimental data
Models for Dynamic Applications
Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina;
2011-01-01
This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor. T...
On some electroconvection models
Constantin, Peter; Ignatova, Mihaela; Vicol, Vlad
2015-01-01
We consider a model of electroconvection motivated by studies of the motion of a two-dimensional annular suspended smectic film under the influence of an electric potential maintained at the boundary by two cylindrical electrodes. We prove that this electroconvection model has unique, global-in-time smooth solutions.
Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming;
In this document, we consider a specific Chinese Smart Grid implementation and address the verification problem for certain quantitative properties, including performance and battery consumption. We employ a stochastic model checking approach and present our modelling and analysis study using...
Acid rain: Microphysical model
Dingle, A. N.
1980-01-01
A microphysical model was used to simulate the case of a ground cloud without dilution by entrainment and without precipitation. The numerical integration techniques of the model are presented. The droplet size spectra versus time and the droplet molalities for each value of time are discussed.
Normal or abnormal operation subjects the nuclear power plant's components to cyclic loading (pressure, temperature gradients), which can produce progressive strain accumulation on each loading cycle. This ratchet (cyclic strain accumulation) can produce excessive deformation or aggravate damage mechanisms such as thermal fatigue. For some components, a fine modelling of the material's behaviour is necessary to study their mechanical strength. The modelling of cyclic plasticity has made great progress during the past 20 years; ratcheting is one of the last phenomena for which numerical models still have to be improved. We give in this paper the present state of research on modelling ratcheting effects. We then use experimental results on the austenitic stainless steel 316L at 20 deg C and 300 deg C to study the capabilities of the TAHERI and the BURLET and CAILLETAUD models. The cyclic constitutive law with a discrete memory variable developed by TAHERI leads to a satisfying description of ratcheting phenomena in uniaxial loadings. With the modification of kinematic hardening proposed by Burlet and Cailletaud in the Chaboche model we get a good description of ratcheting in biaxial loadings. These two models have been integrated into a 3D structural mechanics software, the F.E. code ASTER. We present here the calculation of a tubular structure with a thickness transition subjected to thermal cycling. (authors). 11 figs., 3 tabs., 22 refs
Kobayashi, T.; Lim, C. S.
1994-01-01
We study CP symmetry in orbifold models. It is found that the orbifolds always have some automorphisms that serve as CP symmetries. The symmetries are restricted non-trivially by the geometrical structure of the orbifolds. Explicit analysis of Yukawa couplings also shows that CP is not violated in orbifold models.
Jansen, J.D.; Fonseca, R.M.; Kahrobaei, S.; Siraj, M.; Van Essen, G.M.; Van den Hof, P.M.J.
2013-01-01
The "Egg Model" is a synthetic reservoir model consisting of an ensemble of 101 relatively small three-dimensional realizations of a channelized reservoir produced under water flooding conditions with eight water injectors and four producers. It has been used in numerous publications to demonstrate
Erpylev, N. P.; Smirnov, M. A.; Bagrov, A. V.
A night sky model is proposed. It includes different components of light pollution, such as solar twilight, moon-scattered light, zodiacal light, the Milky Way, airglow and artificial light pollution. The model is designed for calculating the efficiency of astronomical installations.
A normalized form of the point kinetics equations, a prompt jump approximation, and the Nordheim-Fuchs model are used to model nuclear systems. Reactivity feedback mechanisms considered include volumetric expansion, thermal neutron temperature effect, Doppler effect and void formation. A sample problem of an excursion occurring in a plutonium solution accidentally formed in a glovebox is presented
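As a rough illustration of the kind of excursion such models describe, here is a minimal prompt-kinetics sketch with an energy-proportional negative reactivity feedback, in the spirit of the Nordheim-Fuchs model. Delayed neutrons are neglected and all parameter values are invented for illustration, not taken from the report:

```python
# Super-prompt-critical power burst terminated by reactivity feedback.
# rho0 > beta, so power grows promptly until the energy-proportional
# feedback (e.g. thermal expansion, Doppler) shuts the burst down.

def fuchs_burst(rho0=0.008, beta=0.0065, gen_time=1e-4, alpha=1e-5,
                p0=1.0, dt=1e-4, steps=200_000):
    p, energy, peak = p0, 0.0, 0.0
    for _ in range(steps):
        rho = rho0 - alpha * energy               # feedback term
        p += ((rho - beta) / gen_time) * p * dt   # prompt kinetics (Euler)
        energy += p * dt
        peak = max(peak, p)
    return peak, energy

peak_power, total_energy = fuchs_burst()
# The Nordheim-Fuchs result for the total released energy is about
# 2 * (rho0 - beta) / alpha = 300 units for these parameters.
```

The burst is self-limiting: once the deposited energy has driven the reactivity below prompt critical, power decays and the total energy converges near the analytic value.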
This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the Photochemical Box Model (PBM). The PBM is a simple stationary single-cell model with a variable height lid designed to provide volume-integrated hour averages of O3 and other ph...
ASSO : Behavioural specialization modelling
Andolina, Rosanna; Locuratolo, Elvira
1996-01-01
An approach to behavioural modelling based on the specialization concept is proposed within ASSO, a formal database design methodology which combines features from database design with formal methods. This approach preserves the semantics of the current behavioural modelling while producing benefits in the phases of the methodology.
Brozoski, Thomas J; Bauer, Carol A
2016-08-01
Presented is a thematic review of animal tinnitus models from a functional perspective. Chronic tinnitus is a persistent subjective sound sensation, emergent typically after hearing loss. Although the sensation is experientially simple, it appears to have a central nervous system substrate of unexpected complexity that includes areas outside of those classically defined as auditory. Over the past 27 years animal models have significantly contributed to understanding tinnitus' complex neurophysiology. In that time, a diversity of models have been developed, each with its own strengths and limitations. None has clearly become a standard. Animal models trace their origin to the 1988 experiments of Jastreboff and colleagues. All subsequent models derive some of their features from those experiments. Common features include behavior-dependent psychophysical determination, acoustic conditions that contrast objective sound and silence, and inclusion of at least one normal-hearing control group. In the present review, animal models have been categorized as either interrogative or reflexive. Interrogative models use emitted behavior under voluntary control to indicate hearing. An example would be pressing a lever to obtain food in the presence of a particular sound. In this type of model animals are interrogated about their auditory sensations, analogous to asking a patient, "What do you hear?" These models require at least some training and motivation management, and reflect the perception of tinnitus. Reflexive models, in contrast, employ acoustic modulation of an auditory reflex, such as the acoustic startle response. An unexpected loud sound will elicit a reflexive motor response from many species, including humans. Although involuntary, acoustic startle can be modified by a lower-level preceding event, including a silent sound gap. Sound-gap modulation of acoustic startle appears to discriminate tinnitus in animals as well as humans, and requires no training or
Non-commutative standard model: model building
A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U*(n)) gauge theory we cannot have non-commutative SU(n) and (2) the charges in non-commutative QED are quantized to just 0,±1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U*(3) x U*(2) x U*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)
Australia's Next Top Fraction Model
Gould, Peter
2013-01-01
Peter Gould suggests Australia's next top fraction model should be a linear model rather than an area model. He provides a convincing argument and gives examples of ways to introduce a linear model in primary classrooms.
Ford, Dominic
2012-01-01
This paper presents a hands-on introduction to the medieval astrolabe, based around a working model which can be constructed from photocopies of the supplied figures. As well as describing how to assemble the model, I also provide a brief explanation of how each of its various parts might be used. The printed version of this paper includes only the parts needed to build a single model prepared for use at latitudes around 52°N, but an accompanying electronic file archive includes equivalent images which can be used to build models prepared for use at any other latitude. The vector graphics scripts used to generate the models are also available for download, allowing customised astrolabes to be made.
Jones, Katherine A.; Finley, Patrick D.; Moore, Thomas W.; Nozick, Linda Karen; Martin, Nathaniel; Bandlow, Alisa; Detry, Richard Joseph; Evans, Leland B.; Berger, Taylor Eugen
2013-09-01
Infectious diseases can spread rapidly through healthcare facilities, resulting in widespread illness among vulnerable patients. Computational models of disease spread are useful for evaluating mitigation strategies under different scenarios. This report describes two infectious disease models built for the US Department of Veteran Affairs (VA) motivated by a Varicella outbreak in a VA facility. The first model simulates disease spread within a notional contact network representing staff and patients. Several interventions, along with initial infection counts and intervention delay, were evaluated for effectiveness at preventing disease spread. The second model adds staff categories, location, scheduling, and variable contact rates to improve resolution. This model achieved more accurate infection counts and enabled a more rigorous evaluation of comparative effectiveness of interventions.
Sparse Nonparametric Graphical Models
Lafferty, John; Wasserman, Larry
2012-01-01
We present some nonparametric methods for graphical modeling. In the discrete case, where the data are binary or drawn from a finite alphabet, Markov random fields are already essentially nonparametric, since the cliques can take only a finite number of values. Continuous data are different. The Gaussian graphical model is the standard parametric model for continuous data, but it makes distributional assumptions that are often unrealistic. We discuss two approaches to building more flexible graphical models. One allows arbitrary graphs and a nonparametric extension of the Gaussian; the other uses kernel density estimation and restricts the graphs to trees and forests. Examples of both methods are presented. We also discuss possible future research directions for nonparametric graphical modeling.
Compound semiconductor device modelling
Miles, Robert
1993-01-01
Compound semiconductor devices form the foundation of solid-state microwave and optoelectronic technologies used in many modern communication systems. In common with their low frequency counterparts, these devices are often represented using equivalent circuit models, but it is often necessary to resort to physical models in order to gain insight into the detailed operation of compound semiconductor devices. Many of the earliest physical models were indeed developed to understand the 'unusual' phenomena which occur at high frequencies. Such was the case with the Gunn and IMPATT diodes, which led to an increased interest in using numerical simulation methods. Contemporary devices often have feature sizes so small that they no longer operate within the familiar traditional framework, and hot electron or even quantum mechanical models are required. The need for accurate and efficient models suitable for computer aided design has increased with the demand for a wider range of integrated devices for operation at...
Pedersen, Rasmus
... two main purposes: first, various phenomena related to product platforms are investigated, and secondly it is investigated how some of these phenomena can be visually modelled in order to support decision making in industrial platform projects. The investigation of platform phenomena is based on the ... encapsulation can be visually modelled during product platform projects. A fundamental hypothesis in this project is that decision makers and important stakeholders have to be able to see the platform in order to manage it. Consequently, the thesis also investigates how visual models of important phenomena can support decision makers during a product platform project. The reaction from stakeholders in the case companies indicates that the decision base is improved by means of visual models. Another finding is that the sometimes rather theoretical and intangible phenomena can be instantiated in models and ...
Calum, Henrik; Høiby, Niels; Moser, Claus
2014-01-01
Severe thermal injury induces immunosuppression, involving all parts of the immune system, especially when large fractions of the total body surface area are affected. An animal model was established to characterize the burn-induced immunosuppression. In our novel mouse model a 6 % third-degree burn ... with infected burn wound compared with the burn wound only group. The burn mouse model resembles the clinical situation and provides an opportunity to examine or develop new strategies, like new antibiotics and immune therapy, in handling burn wound victims much...
Linear models: permutation methods
Cade, B.S.
2005-01-01
Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well-known that estimates of the mean in linear model are extremely sensitive to even a single outlying value of the dependent variable compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution or responses, providing a more complete statistical picture that has relevance to many biological questions [6]...
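The permutation approach described above can be sketched in a few lines; the following snippet is an illustrative example (not the authors' code) that tests the slope of a simple linear model by shuffling the response under the null hypothesis of no association:

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_test_slope(x, y, n_perm=5000, rng=rng):
    """Permutation test for the slope of y = a + b*x.

    Under H0 (b = 0), responses are exchangeable with respect to x,
    so shuffling y and recomputing the slope builds the null distribution.
    """
    def slope(x, y):
        xc = x - x.mean()
        return (xc @ (y - y.mean())) / (xc @ xc)

    observed = slope(x, y)
    null = np.array([slope(x, rng.permutation(y)) for _ in range(n_perm)])
    # two-sided p-value, with +1 so the p-value is never exactly zero
    p = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
    return observed, p

# toy data with a genuine effect (true slope = 2)
x = np.linspace(0.0, 1.0, 40)
y = 2.0 * x + rng.normal(0.0, 0.5, size=40)
b, p = perm_test_slope(x, y)
```

The same scheme extends to the robust estimators mentioned above (e.g. quantile regression) simply by swapping the `slope` statistic for the alternative estimator.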
The Finite Element Method (FEM) is a numerical technique for finding approximate solutions to boundary value problems. While FEM is commonly used to solve solid mechanics equations, it can be applied to a large range of BVPs from many different fields. FEM has been used for reactor fuels modelling for many years. It is most often used for fuel performance modelling at the pellet and pin scale, however, it has also been used to investigate properties of the fuel material, such as thermal conductivity and fission gas release. Recently, the United States Department of Energy Nuclear Energy Advanced Modelling and Simulation Program has begun using FEM as the basis of the MOOSE-BISON-MARMOT Project that is developing a multi-dimensional, multi-physics fuel performance capability that is massively parallel and will use multi-scale material models to provide a truly predictive modelling capability. (authors)
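The element-by-element assembly at the heart of any FEM code can be shown with a minimal, illustrative 1D example (a generic sketch, not drawn from the fuel-performance codes mentioned above): piecewise-linear elements for -u'' = f on (0, 1) with homogeneous Dirichlet boundary conditions:

```python
import numpy as np

def fem_poisson_1d(n, f=1.0):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with n linear elements."""
    h = 1.0 / n
    K = np.zeros((n + 1, n + 1))   # global stiffness matrix
    b = np.zeros(n + 1)            # global load vector
    for e in range(n):
        ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h  # element stiffness
        fe = np.array([0.5, 0.5]) * f * h              # element load
        K[e:e + 2, e:e + 2] += ke
        b[e:e + 2] += fe
    # enforce the Dirichlet boundary conditions u(0) = u(1) = 0
    K[0, :] = 0.0; K[0, 0] = 1.0; b[0] = 0.0
    K[-1, :] = 0.0; K[-1, -1] = 1.0; b[-1] = 0.0
    return np.linalg.solve(K, b)

u = fem_poisson_1d(10)
# exact solution u(x) = x(1-x)/2, so the midpoint value is 0.125
```

For this 1D problem the nodal values coincide with the exact solution, which makes it a convenient smoke test for an assembly routine.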
Caferra, Ricardo; Peltier, Nicholas
2004-01-01
This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.
Auxiliary Deep Generative Models
Maaløe, Lars; Sønderby, Casper Kaae; Sønderby, Søren Kaae;
2016-01-01
Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on MNIST (0.96%), SVHN (16.61%) and NORB (9.40%) datasets.
Brown, R. G.
1984-01-01
The formulation of appropriate state-space models for Kalman filtering applications is studied. Such a model is completely specified by four matrix parameters and the initial conditions of the recursive equations. Once these are determined, the die is cast, and the way in which the measurements are weighted is determined forever after. Thus, finding a model that fits the physical situation at hand is all important. Also, it is often the most difficult aspect of designing a Kalman filter. Formulation of discrete state models from the spectral density and ARMA random process descriptions is discussed. Finally, it is pointed out that many common processes encountered in applied work (such as band-limited white noise) simply do not lend themselves very well to Kalman filter modeling.
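The four-matrix specification described above can be made concrete with a small sketch; the constant-velocity tracking model and all numeric values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical constant-velocity model: state = [position, velocity].
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition
Q = np.array([[0.25, 0.5], [0.5, 1.0]]) * 0.01   # process noise covariance
H = np.array([[1.0, 0.0]])                       # we observe position only
R = np.array([[1.0]])                            # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle; the four matrices fix the weighting."""
    x = F @ x                        # predict state
    P = F @ P @ F.T + Q              # predict covariance
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)          # update with the measurement
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2) * 10.0   # diffuse initial conditions
for z in (np.array([1.1]), np.array([2.0]), np.array([2.9])):
    x, P = kalman_step(x, P, z)
```

With measurements rising by roughly one unit per step, the filter settles toward a position near 3 and a velocity near 1, illustrating how the choice of F, Q, H and R determines the weighting once and for all.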
Mellor, Vincent
2011-01-01
This paper extends the subjects discussed in the Data Analysis and Dynamical Systems courses by looking at the subject of modelling data. This task is nontrivial as the underlying process could be non-linear. In the paper some common methods, including global and local polynomial fitting, are discussed in terms of their applicability, level of computation and accuracy. One example method, Measure based Reconstruction, has been investigated in greater detail and experimentation is carried out to evaluate the method. In this project we shall be looking at the different ways one can model chaotic time series. The reason we are going to look at a range of methods is that different methods are "good" for different applications. As the "goodness" of a model is subjective to the task one wishes to do, we will investigate selected models and compare the predictions to see how one goes about testing a model.
Cosmological models and stability
Andersson, Lars
2013-01-01
Principles in the form of heuristic guidelines or generally accepted dogma play an important role in the development of physical theories. In particular, philosophical considerations and principles figure prominently in the work of Albert Einstein. As mentioned in the talk by Jiri Bicak at this conference Einstein formulated the equivalence principle, an essential step on the road to general relativity, during his time in Prague 1911-1912. In this talk, I would like to discuss some aspects of cosmological models. As cosmology is an area of physics where "principles" such as the "cosmological principle" or the "Copernican principle" play a prominent role in motivating the class of models which form part of the current standard model, I will start by comparing the role of the equivalence principle to that of the principles used in cosmology. I will then briefly describe the standard model of cosmology to give a perspective on some mathematical problems and conjectures on cosmological models, which are discussed...
Boyd-Graber, Jordan
2010-01-01
The syntactic topic model (STM) is a Bayesian nonparametric model of language that discovers latent distributions of words (topics) that are both semantically and syntactically coherent. The STM models dependency parsed corpora where sentences are grouped into documents. It assumes that each word is drawn from a latent topic chosen by combining document-level features and the local syntactic context. Each document has a distribution over latent topics, as in topic models, which provides the semantic consistency. Each element in the dependency parse tree also has a distribution over the topics of its children, as in latent-state syntax models, which provides the syntactic consistency. These distributions are convolved so that the topic of each word is likely under both its document and syntactic context. We derive a fast posterior inference algorithm based on variational methods. We report qualitative and quantitative studies on both synthetic data and hand-parsed documents. We show that the STM is a more pred...
The University of Maine conducted this study for Pacific Northwest Laboratory (PNL) as part of a global climate modeling task for site characterization of the potential nuclear waste repository site at Yucca Mountain, NV. The purpose of the study was to develop a global ice sheet dynamics model that will forecast the three-dimensional configuration of global ice sheets for specific climate change scenarios. The objective of the third (final) year of the work was to produce ice sheet data for glaciation scenarios covering the next 100,000 years. This was accomplished using both the map-plane and flowband solutions of our time-dependent, finite-element gridpoint model. The theory and equations used to develop the ice sheet models are presented. Three future scenarios were simulated by the model and results are discussed
Crowdsourcing Business Model Innovation
Waldner, Florian; Poetz, Marion
Successfully adapting existing business models or developing new ones significantly influences a firm's ability to generate profits and develop competitive advantages. However, business model innovation is perceived as a complex, risky and uncertain process and its success strongly depends on whether or not firms are capable of understanding and addressing their customers' needs. This study explores how crowdsourcing-based search approaches can contribute to the process of business model innovation. Drawing on data from a crowdsourcing initiative designed to develop ideas for new business models in the podcast industry, we provide first exploratory insights into the value of crowdsourcing for innovating a firm's business model, and discuss which characteristics of crowd-contributors increase the quantity and quality of the outcome.
Integrated Modeling of Telescopes
Andersen, Torben
2011-01-01
With increasingly complex and costly opto-mechanical systems, there is a growing need for reliable computer modeling and simulation. The field of integrated modeling, combining optics, mechanics, control engineering, and other disciplines, is the subject of this book. Although the book primarily focuses on ground-based optical telescopes, the techniques introduced are applicable also to other wavelengths and to other opto-mechanical applications on the ground or in space. Basic tools of integrated modeling are introduced together with concepts of ground-based telescopes. Modeling of optical systems, structures, wavefront control systems with emphasis on segmented mirror control, and active and adaptive optics are described together with a variety of noise sources; many examples are included in this book. Integrated Modeling of Telescopes is a text for physicists and engineers working in the field of opto-mechanical design and wavefront control, but it will also be valuable as a textbook for PhD students.
Cardiovascular modeling and diagnostics
Kangas, L.J.; Keller, P.E.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States)
1995-12-31
In this paper, a novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.
Hundebøl, Jesper
A wave of new building information modelling tools demands further investigation, not least because of industry representatives' somewhat coarse parlance: Now the word is spreading - 3D digital modelling is nothing less than a revolution, a shift of paradigm, a new alphabet... Research questions. Based on empirical probes (interviews, observations, written inscriptions) within the Danish construction industry this paper explores the organizational and managerial dynamics of 3D Digital Modelling. The paper intends to - Illustrate how the network of (non-)human actors engaged in the promotion (and arrest) of 3D Modelling (in Denmark) stabilizes - Examine how 3D Modelling manifests itself in the early design phases of a construction project with a view to discuss the effects hereof for i.a. the management of the building process. Structure. The paper introduces a few, basic methodological concepts...
Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.
2011-03-01
To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendents representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.
A series of models is being built and used as tools in the design of the SNUPPS Standard Power Block. The modelling programme includes both preliminary and final design models, a construction sequence model, and additional models used to study various features of the design. The design of a standard power block unit has necessitated design definition which is more detailed than that customarily used in the design of nuclear power stations. One innovation is the use of engineering models as a primary design tool in the layout of process piping, preparation of isometric drawings, and design of small components which are customarily designed in the field during construction. Development of a standard construction sequence and construction work plan is another innovation. (author)
Trelles, J P; Vardelle, A; Heberlein, J V R
2013-01-01
Arc plasma torches are the primary components of various industrial thermal plasma processes involving plasma spraying, metal cutting and welding, thermal plasma CVD, metal melting and remelting, waste treatment and gas production. They are relatively simple devices whose operation implies intricate thermal, chemical, electrical, and fluid dynamics phenomena. Modeling may be used as a means to better understand the physical processes involved in their operation. This paper presents an overview of the main aspects involved in the modeling of DC arc plasma torches: the mathematical models including thermodynamic and chemical non-equilibrium models, turbulent and radiative transport, thermodynamic and transport property calculation, boundary conditions and arc reattachment models. It focuses on the conventional plasma torches used for plasma spraying that include a hot-cathode and a nozzle anode.
Peskin, M.E.
1997-05-01
These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders.
Hubbard, W B
2016-01-01
In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen-helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses, and a hydrogen-helium-rich envelope with...
Macroscopic Models of Superconductivity
Chapman, S. J.
After giving a description of the basic physical phenomena to be modelled, we begin by formulating a sharp-interface free-boundary model for the destruction of superconductivity by an applied magnetic field, under isothermal and anisothermal conditions, which takes the form of a vectorial Stefan model similar to the classical scalar Stefan model of solid/liquid phase transitions and identical in certain two-dimensional situations. This model is found sometimes to have instabilities similar to those of the classical Stefan model. We then describe the Ginzburg-Landau theory of superconductivity, in which the sharp interface is 'smoothed out' by the introduction of an order parameter, representing the number density of superconducting electrons. By performing a formal asymptotic analysis of this model as various parameters in it tend to zero we find that the leading order solution does indeed satisfy the vectorial Stefan model. However, at the next order we find the emergence of terms analogous to those of 'surface tension' and 'kinetic undercooling' in the scalar Stefan model. Moreover, the 'surface energy' of a normal/superconducting interface is found to take both positive and negative values, defining Type I and Type II superconductors respectively. We discuss the response of superconductors to external influences by considering the nucleation of superconductivity with decreasing magnetic field and with decreasing temperature respectively, and find there to be a pitchfork bifurcation to a superconducting state which is subcritical for Type I superconductors and supercritical for Type II superconductors. We also examine the effects of boundaries on the nucleation field, and describe in more detail the nature of the superconducting solution in Type II superconductors--the so-called 'mixed state'. Finally, we present some open questions concerning both the modelling and analysis of
Antedependence models for longitudinal data
Zimmerman, Dale L
2009-01-01
The First Book Dedicated to This Class of Longitudinal Models. Although antedependence models are particularly useful for modeling longitudinal data that exhibit serial correlation, few books adequately cover these models. By gathering results scattered throughout the literature, Antedependence Models for Longitudinal Data offers a convenient, systematic way to learn about antedependence models. Illustrated with numerous examples, the book also covers some important statistical inference procedures associated with these models. After describing unstructured and structured antedependence models an
Traffic flow modeling: a Genealogy
Van Wageningen-Kessels, F.L.M.; Hoogendoorn, S.P.; Vuik, C.; Lint, van J. W. C.
2014-01-01
80 years ago, Bruce Greenshields presented the first traffic flow model at the Annual Meeting of the Highway Research Board. Since then, many models and simulation tools have been developed. We show a model tree with four families of traffic flow models, all descending from Greenshields' model. The tree shows the historical development of traffic flow modeling and the relations between models. Based on the tree we discuss the main trends and future developments in traffic flow modeling and si...
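Greenshields' original model, the root of the model tree described above, is simple enough to state in a few lines; the free-flow speed and jam density used below are hypothetical values chosen purely for illustration:

```python
def greenshields_speed(k, v_free=100.0, k_jam=150.0):
    """Greenshields' linear speed-density relation: v = v_free * (1 - k/k_jam).

    k is density (veh/km), v_free the free-flow speed (km/h),
    k_jam the jam density at which traffic stands still.
    """
    return v_free * (1.0 - k / k_jam)

def flow(k, **kw):
    """Flow q = k * v(k); a parabola peaking at k = k_jam / 2."""
    return k * greenshields_speed(k, **kw)
```

With these parameters the capacity is q_max = v_free * k_jam / 4 = 3750 veh/h, reached at half the jam density, which is the classic "fundamental diagram" that later model families refine.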
SPAR Model Structural Efficiencies
John Schroeder; Dan Henry
2013-04-01
The Nuclear Regulatory Commission (NRC) and the Electric Power Research Institute (EPRI) are supporting initiatives aimed at improving the quality of probabilistic risk assessments (PRAs). Included in these initiatives are the resolution of key technical issues that have been judged to have the most significant influence on the baseline core damage frequency of the NRC's Standardized Plant Analysis Risk (SPAR) models and licensee PRA models. Previous work addressed issues associated with support system initiating event analysis and loss of off-site power/station blackout analysis. The key technical issues were: • Development of a standard methodology and implementation of support system initiating events • Treatment of loss of offsite power • Development of a standard approach for emergency core cooling following containment failure. Some of the related issues were not fully resolved. This project continues the effort to resolve outstanding issues. The work scope was intended to include substantial collaboration with EPRI; however, EPRI has had other higher priority initiatives to support. Therefore this project has addressed SPAR modeling issues. The issues addressed are: • SPAR model transparency • Common cause failure modeling deficiencies and approaches • AC and DC modeling deficiencies and approaches • Instrumentation and control system modeling deficiencies and approaches
Fire accident in a containment is a serious threat to nuclear reactors. Fire can cause substantial loss to life and property. The risk posed by fire can also exceed the risk from internal events within a nuclear reactor. Numerous research efforts have been performed to understand and analyze the phenomenon of fire in nuclear reactors and its consequences. Modeling of fire is an important subject in the field of fire safety engineering. Two approaches which are commonly used in fire modeling are zonal modeling and field modeling. The objective of this work is to compare the zonal and field modeling approaches against a pool fire experiment performed in a well-confined compartment. Numerical simulations were performed against experiments, which were conducted within the PRISME program under the framework of OECD. In these experiments, the effect of ventilation flow rate on heat release rate in a confined and mechanically ventilated compartment is investigated. Time dependent changes in gas temperature and oxygen mass fraction were measured. The trends obtained by numerical simulations performed using the zonal model and the field model compare well with experiments. Further validation is needed before this code can be used for fire safety analyses. (author)
Generalized simplicial chiral models
Alimohammadi, M
2000-01-01
Using the auxiliary field representation of the simplicial chiral models on a (d-1)-dimensional simplex, we generalize the simplicial chiral models by replacing the term $\mathrm{Tr}(AA^{\dagger})$ in the Lagrangian of these models by an arbitrary class function of $AA^{\dagger}$, $V(AA^{\dagger})$. This is the same method that has been used in defining the generalized two-dimensional Yang-Mills theories (gYM_2) from ordinary YM_2. We call these models the "generalized simplicial chiral models". With the help of the results of the one-link integral over a U(N) matrix, we compute the large-N saddle-point equations for the eigenvalue density function $\rho(z)$ in the weak ($\beta > \beta_c$) and strong ($\beta < \beta_c$) regions. In d=2, where the model somehow relates to gYM_2 theory, we solve the saddle-point equations and find $\rho(z)$ in the two regions, and calculate the explicit value of the critical point $\beta_c$ for $V(B)=\mathrm{Tr}\,B^n$ ($B=AA^{\dagger}$). For $V(B)=\mathrm{Tr}\,B^2$, $\mathrm{Tr}\,B^3$ and $\mathrm{Tr}\,B^4$, we study the critical behaviour of the model at d=2, and by calculating t...
Stochasticity Modeling in Memristors
Naous, Rawan
2015-10-26
Diverse models have been proposed over the past years to explain the observed behavior of memristors, the fourth fundamental circuit element. The models varied in complexity, ranging from descriptions of physical mechanisms to more generalized mathematical modeling. Nonetheless, stochasticity, a widely observed phenomenon, has been immensely overlooked from the modeling perspective. This inherent variability within the operation of the memristor is a vital feature for the integration of this nonlinear device into the stochastic electronics realm of study. In this paper, experimentally observed innate stochasticity is modeled in a circuit-compatible format. The model proposed is generic and could be incorporated into variants of threshold-based memristor models in which apparent variations in the output hysteresis convey the switching threshold shift. Further application as a noise injection alternative paves the way for novel approaches in the field of neuromorphic circuit design. On the other hand, extra caution needs to be paid to variability-intolerant digital designs based on non-deterministic memristor logic.
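The threshold-shift idea described above can be sketched as follows; this is a generic, illustrative toy (not the paper's circuit-compatible model), in which every switching attempt sees a Gaussian-perturbed threshold, mimicking cycle-to-cycle variability:

```python
import random

class StochasticMemristor:
    """Toy threshold-based memristor with a stochastic switching threshold.

    All parameter values (v_th, sigma, r_on, r_off) are hypothetical,
    chosen only to illustrate the threshold-shift mechanism.
    """
    def __init__(self, v_th=1.0, sigma=0.1, r_on=100.0, r_off=10000.0, seed=42):
        self.v_th, self.sigma = v_th, sigma
        self.r_on, self.r_off = r_on, r_off
        self.r = r_off                      # start in the high-resistance state
        self.rng = random.Random(seed)      # seeded for reproducibility

    def apply(self, v):
        """Apply voltage v; switch if |v| exceeds this cycle's noisy threshold."""
        th = self.v_th + self.rng.gauss(0.0, self.sigma)
        if v >= th:
            self.r = self.r_on              # SET to low resistance
        elif v <= -th:
            self.r = self.r_off             # RESET to high resistance
        return v / self.r                   # current through the device

m = StochasticMemristor()
i = m.apply(1.5)   # well above the nominal threshold, so the device sets
```

Near the nominal threshold the switching outcome becomes probabilistic, which is exactly the property that makes such devices usable as noise injectors in stochastic electronics, and hazardous in variability-intolerant digital logic.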
Cestari, Andrea
2013-01-01
Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice with the introduction of several nomograms dealing with the main fields of onco-urology. PMID:23423686