Conceptualising Business Models: Definitions, Frameworks and Classifications
Directory of Open Access Journals (Sweden)
Erwin Fielt
2013-12-01
The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements, which need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.
What Is A Homosexual? A Definitional Model.
Berger, Raymond M.
1983-01-01
Presents a definitional model to explain homosexuality and discusses its implications for practice. Contends that social workers must discard the traditional binary model of heterosexual versus homosexual for one incorporating relevant psychosocial factors including life experiences, social reaction, and association with others. (Author/JAC)
Formal Definition of Measures for BPMN Models
Reynoso, Luis; Rolón, Elvira; Genero, Marcela; García, Félix; Ruiz, Francisco; Piattini, Mario
Business process models are currently attaining more relevance, and more attention is therefore being paid to their quality. This situation led us to define a set of measures for the understandability of BPMN models, which was presented in a previous work. We focus on understandability since a model must be well understood before any changes are made to it. These measures were originally defined informally in natural language. As is well known, natural language is ambiguous and may lead to misunderstandings and misinterpretation of the concepts captured by a measure and of the way in which the measure value is obtained. This has motivated us to provide a formal definition of the proposed measures using OCL (Object Constraint Language) upon the BPMN (Business Process Modeling Notation) metamodel presented in this paper. The main advantages and lessons learned (obtained both from the current work and from previous work on the formal definition of other measures) are also summarized.
Current definition and a generalized Federbush model
International Nuclear Information System (INIS)
Singh, L.P.S.; Hagen, C.R.
1978-01-01
The Federbush model is studied, with particular attention being given to the definition of currents. Inasmuch as there is no a priori restriction of local gauge invariance, the currents in the interacting case can be defined more generally than in QED. It is found that two arbitrary parameters are thereby introduced into the theory. Lowest order perturbation calculations for the current correlation functions and the fermion propagators indicate that the theory admits a whole class of solutions dependent upon these parameters, with the closed solution of Federbush emerging as a special case. The theory is shown to be locally covariant, and a conserved energy-momentum tensor is displayed. One finds in addition that the generators of gauge transformations for the fields are conserved. Finally it is shown that the general theory yields the Federbush solution if suitable Thirring-model-type counterterms are added.
Translating building information modeling to building energy modeling using model view definition.
Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei
2014-01-01
This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
Weak Memory Models: Balancing Definitional Simplicity and Implementation Flexibility
Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Arvind
2017-01-01
The memory model for RISC-V, a newly developed open source ISA, has not been finalized yet and thus offers an opportunity to evaluate existing memory models. We believe RISC-V should not adopt the memory models of POWER or ARM, because their axiomatic and operational definitions are too complicated. We propose two new weak memory models: WMM and WMM-S, which balance definitional simplicity and implementation flexibility differently. Both allow all instruction reorderings except overtaking of...
Concrete syntax definition for modeling languages
Fondement, Frédéric; Baar, Thomas
2008-01-01
Model Driven Engineering (MDE) promotes the use of models as primary artefacts of a software development process, as an attempt to handle complexity through abstraction, e.g. to cope with the evolution of execution platforms. MDE follows a stepwise approach, by prescribing to develop abstract models further improved to integrate little by little details relative to the final deployment platforms. Thus, the application of an MDE process results in various models residing at various levels of a...
Moving towards maturity in business model definitions
DEFF Research Database (Denmark)
Nielsen, Christian; Lund, Morten; Bukh, Per Nikolaj
2014-01-01
The field of business models has, as is the case with all emerging fields of practice, slowly matured through the development of frameworks, models, concepts and ideas over the last 15 years. New concepts, theories and models typically transcend a series of maturity phases. For the concept of Bus...
[Safety culture: definition, models and design].
Pfaff, Holger; Hammer, Antje; Ernstmann, Nicole; Kowalski, Christoph; Ommen, Oliver
2009-01-01
Safety culture is a multi-dimensional phenomenon. The safety culture of a healthcare organization is high if the organization shares a common stock of knowledge, values and symbols with regard to patient safety. The article first defines safety culture and then demonstrates its effects. We present the model of safety behaviour and show how safety culture can affect behaviour and produce safe behaviour. In the third step we look at the causes of safety culture and present the safety culture model. The main hypothesis of this model is that the safety culture of a healthcare organization strongly depends on its communication culture and its social capital. Finally, we investigate how the safety culture of a healthcare organization can be improved. Based on the safety culture model, six measures to improve safety culture are presented.
The infinitesimal model: Definition, derivation, and implications.
Barton, N H; Etheridge, A M; Véber, A
2017-12-01
Our focus here is on the infinitesimal model. In this model, one or several quantitative traits are described as the sum of a genetic and a non-genetic component, the first being distributed within families as a normal random variable centred at the average of the parental genetic components, and with a variance independent of the parental traits. Thus, the variance that segregates within families is not perturbed by selection, and can be predicted from the variance components. This does not necessarily imply that the trait distribution across the whole population should be Gaussian, and indeed selection or population structure may have a substantial effect on the overall trait distribution. One of our main aims is to identify some general conditions on the allelic effects for the infinitesimal model to be accurate. We first review the long history of the infinitesimal model in quantitative genetics. Then we formulate the model at the phenotypic level in terms of individual trait values and relationships between individuals, but including different evolutionary processes: genetic drift, recombination, selection, mutation, population structure, and so on. We give a range of examples of its application to evolutionary questions related to stabilising selection, assortative mating, effective population size and response to selection, habitat preference and speciation. We provide a mathematical justification of the model as the limit as the number M of underlying loci tends to infinity of a model with Mendelian inheritance, mutation and environmental noise, when the genetic component of the trait is purely additive. We also show how the model generalises to include epistatic effects. We prove in particular that, within each family, the genetic components of the individual trait values in the current generation are indeed normally distributed with a variance independent of ancestral traits, up to an error of order 1/M. Simulations suggest that in some cases the convergence...
Weak Memory Models with Matching Axiomatic and Operational Definitions
Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Lustig, Dan; Arvind
2017-01-01
Memory consistency models are notorious for being difficult to define precisely, to reason about, and to verify. More than a decade of effort has gone into nailing down the definitions of the ARM and IBM Power memory models, and yet there still remain aspects of those models which (perhaps surprisingly) remain unresolved to this day. In response to these complexities, there has been somewhat of a recent trend in the (general-purpose) architecture community to limit new memory models to being ...
Spectra of definite type in waveguide models
Czech Academy of Sciences Publication Activity Database
Lotoreichik, Vladimir; Siegl, Petr
2017-01-01
Roč. 145, č. 3 (2017), s. 1231-1246 ISSN 0002-9939 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords: spectral points of definite type and of type π * weakly coupled bound states * perturbations of essential spectrum * PT-symmetric waveguide Subject RIV: BE - Theoretical Physics OBOR OECD: Applied mathematics Impact factor: 0.679, year: 2016
Building a Shared Definitional Model of Long Duration Human Spaceflight
Orr, M.; Whitmire, A.; Sandoval, L.; Leveton, L.; Arias, D.
2011-01-01
In 1956, on the eve of human space travel, Strughold first proposed a simple classification of the present and future stages of manned flight that identified key factors, risks and developmental stages for the evolutionary journey ahead. As we look to optimize the potential of the ISS as a gateway to new destinations, we need a current shared working definitional model of long duration human space flight to help guide our path. An initial search of the formal and grey literature was augmented by liaison with subject matter experts. The search strategy focused on the terms 'long duration mission' and 'long duration spaceflight', as well as broader related current and historical definitions and classification models of spaceflight. The related sea and air travel literature was subsequently explored with a view to identifying analogous models or classification systems. There are multiple definitions and classification systems for spaceflight, covering phase and type of mission, craft and payload, and related risk management models. However, the frequently used concepts of long duration mission and long duration spaceflight are infrequently operationally defined by authors, and no commonly referenced classical or gold standard definition or model of these terms emerged from the search. The categorization (Cat) system for sailing was found to be of potential analogous utility, with its focus on understanding the need for crew and craft autonomy at various levels of potential adversity and inability to gain outside support or return to a safe location, due to factors of time, distance and location.
A consensus definition of cataplexy in mouse models of narcolepsy.
Scammell, Thomas E; Willie, Jon T; Guilleminault, Christian; Siegel, Jerome M
2009-01-01
People with narcolepsy often have episodes of cataplexy, brief periods of muscle weakness triggered by strong emotions. Many researchers are now studying mouse models of narcolepsy, but definitions of cataplexy-like behavior in mice differ across labs. To establish a common language, the International Working Group on Rodent Models of Narcolepsy reviewed the literature on cataplexy in people with narcolepsy and in dog and mouse models of narcolepsy and then developed a consensus definition of murine cataplexy. The group concluded that murine cataplexy is an abrupt episode of nuchal atonia lasting at least 10 seconds. In addition, theta activity dominates the EEG during the episode, and video recordings document immobility. To distinguish a cataplexy episode from REM sleep after a brief awakening, at least 40 seconds of wakefulness must precede the episode. Bouts of cataplexy fitting this definition are common in mice with disrupted orexin/hypocretin signaling, but these events almost never occur in wild type mice. It remains unclear whether murine cataplexy is triggered by strong emotions or whether mice remain conscious during the episodes as in people with narcolepsy. This working definition provides helpful insights into murine cataplexy and should allow objective and accurate comparisons of cataplexy in future studies using mouse models of narcolepsy.
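The consensus criteria translate directly into a scoring rule: at least 10 s of theta-dominant nuchal atonia preceded by at least 40 s of wakefulness. The sketch below is illustrative only; the per-second epoch representation and state names are hypothetical, not from the paper:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical per-second scoring of a mouse recording; the Epoch fields
# and state labels are illustrative assumptions, not the paper's data format.
@dataclass
class Epoch:
    state: str            # "wake", "atonia", "rem", "nrem"
    theta_dominant: bool  # EEG theta dominance during this second

def find_cataplexy(epochs: List[Epoch],
                   min_atonia_s: int = 10,
                   min_prior_wake_s: int = 40) -> List[range]:
    """Return index ranges of bouts meeting the consensus definition."""
    bouts, i, n = [], 0, len(epochs)
    while i < n:
        if epochs[i].state == "atonia":
            j = i
            while j < n and epochs[j].state == "atonia":
                j += 1
            long_enough = (j - i) >= min_atonia_s
            theta = all(e.theta_dominant for e in epochs[i:j])
            # distinguishes cataplexy from REM sleep after a brief awakening
            prior_wake = (i >= min_prior_wake_s and
                          all(e.state == "wake" for e in epochs[i - min_prior_wake_s:i]))
            if long_enough and theta and prior_wake:
                bouts.append(range(i, j))
            i = j
        else:
            i += 1
    return bouts

# 50 s of wakefulness followed by 12 s of theta-dominant atonia: one bout
rec = [Epoch("wake", True)] * 50 + [Epoch("atonia", True)] * 12
print(len(find_cataplexy(rec)))  # 1
```

The same rule rejects an identical atonia episode preceded by only 30 s of wakefulness, mirroring the REM-sleep exclusion in the definition.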
A Model-Free Definition of Increasing Uncertainty
Grant, S.; Quiggin, J.
2001-01-01
We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences. Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a...
Promoting Model-based Definition to Establish a Complete Product Definition.
Ruemler, Shawn P; Zimmerman, Kyle E; Hartman, Nathan W; Hedberg, Thomas; Feeny, Allison Barnard
2017-05-01
The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain-specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey was administered to industry professionals from various sectors. Based on the results of the survey, a Common Information Model could not be established. However, the results provided valuable insight that will help in further investigation of the Common Information Model.
Deliverable D1.2 of the PERSEE project: Perceptual-Modelling-Definition-of-the-Models
Wang , Junle; Bosc , Emilie; Li , Jing; Ricordel , Vincent
2011-01-01
Deliverable D1.2 of the ANR PERSEE project. This report was produced within the framework of the ANR PERSEE project (no. ANR-09-BLAN-0170); specifically, it corresponds to deliverable D1.2 of the project. Its title: Perceptual-Modelling-Definition-of-the-Models.
Basic definitions for discrete modeling of computer worms epidemics
Directory of Open Access Journals (Sweden)
Pedro Guevara López
2015-01-01
Information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that worldwide organization (governments and corporations) depends on it; if these computers stopped working for a long time, the consequences would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection; this is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed for describing 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and quantitative research on their epidemiological models.
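As a baseline for the kind of epidemiological models the definitions are aimed at, a minimal discrete-time SIR recursion can be sketched (the infection and recovery rates are illustrative values, not from the paper):

```python
# Minimal discrete-time SIR recursion often used as a baseline for worm
# epidemics; beta (infection rate) and gamma (patch/clean-up rate) are
# illustrative assumptions, not parameters from the paper.
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One time step over host-population fractions (s + i + r == 1)."""
    new_inf = beta * s * i   # susceptible hosts contacted by infected hosts
    new_rec = gamma * i      # infected hosts patched or disinfected
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.999, 0.001, 0.0  # one infected host per thousand at t = 0
history = [(s, i, r)]
for _ in range(300):
    s, i, r = sir_step(s, i, r)
    history.append((s, i, r))

peak = max(state[1] for state in history)
print(f"peak infected fraction: {peak:.3f}, final susceptible: {s:.3f}")
```

With beta/gamma = 3, the recursion reproduces the familiar epidemic curve: roughly a third of hosts infected at the peak and only a small susceptible remainder at the end.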
Fuzzy Entropy: Axiomatic Definition and Neural Networks Model
Institute of Scientific and Technical Information of China (English)
QING Ming; CAO Yue; HUANG Tian-min
2004-01-01
The measure of uncertainty is adopted as a measure of information. The measures of fuzziness are known as fuzzy information measures. The measure of the quantity of fuzzy information gained from a fuzzy set or fuzzy system is known as fuzzy entropy. Fuzzy entropy has been the focus of study by many researchers in various fields. In this paper, firstly, the axiomatic definition of fuzzy entropy is discussed. Then, a neural networks model of fuzzy entropy is proposed, based on the computing capability of neural networks. Finally, two examples are discussed to show the efficiency of the model.
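As a concrete instance of an axiomatically defined fuzzy entropy, the classic De Luca-Termini measure can be sketched as follows (shown here in normalized form; the example membership vectors are hypothetical):

```python
import math

def fuzzy_entropy(memberships):
    """Normalized De Luca-Termini fuzzy entropy.

    Satisfies the usual axioms: 0 for crisp sets (all memberships 0 or 1),
    maximal (1) when every membership equals 0.5, and increasing as
    memberships move toward 0.5 (the 'sharpness' axiom).
    """
    def h(mu):
        if mu in (0.0, 1.0):
            return 0.0
        return -(mu * math.log2(mu) + (1 - mu) * math.log2(1 - mu))
    return sum(h(mu) for mu in memberships) / len(memberships)

print(fuzzy_entropy([0.0, 1.0, 1.0]))  # crisp set -> 0.0
print(fuzzy_entropy([0.5, 0.5]))       # maximally fuzzy -> 1.0
print(round(fuzzy_entropy([0.1, 0.9, 0.5]), 3))
```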
HIV lipodystrophy case definition using artificial neural network modelling
DEFF Research Database (Denmark)
Ioannidis, John P A; Trikalinos, Thomas A; Law, Matthew
2003-01-01
OBJECTIVE: A case definition of HIV lipodystrophy has recently been developed from a combination of clinical, metabolic and imaging/body composition variables using logistic regression methods. We aimed to evaluate whether artificial neural networks could improve the diagnostic accuracy. METHODS: The database of the case-control Lipodystrophy Case Definition Study was split into 504 subjects (265 with and 239 without lipodystrophy) used for training and 284 independent subjects (152 with and 132 without lipodystrophy) used for validation. Back-propagation neural networks with one or two middle layers were trained and validated. Results were compared against logistic regression models using the same information. RESULTS: Neural networks using clinical variables only (41 items) achieved consistently superior performance to logistic regression in terms of specificity, overall accuracy and area under...
Sustainable geothermal utilization - Case histories; definitions; research issues and modelling
International Nuclear Information System (INIS)
Axelsson, Gudni
2010-01-01
Sustainable development by definition meets the needs of the present without compromising the ability of future generations to meet their own needs. The Earth's enormous geothermal resources have the potential to contribute significantly to sustainable energy use worldwide as well as to help mitigate climate change. Experience from the use of numerous geothermal systems worldwide lasting several decades demonstrates that by maintaining production below a certain limit the systems reach a balance between net energy discharge and recharge that may be maintained for a long time (100-300 years). Modelling studies indicate that the effect of heavy utilization is often reversible on a time-scale comparable to the period of utilization. Thus, geothermal resources can be used in a sustainable manner either through (1) constant production below the sustainable limit, (2) step-wise increase in production, (3) intermittent excessive production with breaks, or (4) reduced production after a shorter period of heavy production. The long production histories that are available for low-temperature as well as high-temperature geothermal systems distributed throughout the world provide the most valuable data available for studying sustainable management of geothermal resources, and reservoir modelling is the most powerful tool available for this purpose. The paper presents sustainability modelling studies for the Hamar and Nesjavellir geothermal systems in Iceland, the Beijing Urban system in China and the Olkaria system in Kenya as examples. Several relevant research issues have also been identified, such as the relevance of system boundary conditions during long-term utilization, how far-reaching interference from utilization is, how effectively geothermal systems recover after heavy utilization and the reliability of long-term (more than 100 years) model predictions.
Scoring predictive models using a reduced representation of proteins: model and energy definition.
Fogolari, Federico; Pieri, Lidia; Dovier, Agostino; Bortolussi, Luca; Giugliarelli, Gilberto; Corazza, Alessandra; Esposito, Gennaro; Viglino, Paolo
2007-03-23
Reduced representations of proteins have been playing a key role in the study of protein folding. Many such models are available, with different representation detail. Although the usefulness of many such models for structural bioinformatics applications has been demonstrated in recent years, there are few intermediate-resolution models endowed with an energy model capable, for instance, of detecting native or native-like structures among decoy sets. The aim of the present work is to provide a discrete empirical potential for a reduced protein model termed here PC2CA, because it employs a PseudoCovalent structure with only 2 Centers of interaction per Amino acid, suitable for protein model quality assessment. All protein structures in the set top500H have been converted into reduced form. The distributions of pseudobonds, pseudoangles, pseudodihedrals and distances between centers of interaction have been converted into potentials of mean force. A suitable reference distribution has been defined for non-bonded interactions which takes into account excluded volume effects and protein finite size. The correlation between adjacent main chain pseudodihedrals has been converted into an additional energetic term which is able to account for cooperative effects in secondary structure elements. Local energy surface exploration is performed in order to increase the robustness of the energy function. The model and the energy definition proposed have been tested on all the multiple decoy sets in the Decoys'R'us database. The energetic model is able to recognize, for almost all sets, native-like structures (RMSD less than 2.0 Å). These results and those obtained in the blind CASP7 quality assessment experiment suggest that the model compares well with scoring potentials with finer granularity and could be useful for fast exploration of conformational space. Parameters are available at the url: http://www.dstb.uniud.it/~ffogolari/download/.
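The core step in such knowledge-based potentials, converting an observed distribution into a potential of mean force by Boltzmann inversion against a reference distribution, can be sketched as follows (the histograms and the pseudocount scheme are hypothetical illustrations, not the paper's parameterization):

```python
import math

def potential_of_mean_force(obs_counts, ref_counts, kT=1.0, pseudo=1.0):
    """Boltzmann inversion with pseudocounts: E_bin = -kT * ln(p_obs / p_ref).

    obs_counts/ref_counts are per-bin histograms (e.g. of pseudo-atom
    distances); bins enriched in the observed set get negative energies.
    """
    n_obs = sum(obs_counts) + pseudo * len(obs_counts)
    n_ref = sum(ref_counts) + pseudo * len(ref_counts)
    energies = []
    for o, r in zip(obs_counts, ref_counts):
        p_obs = (o + pseudo) / n_obs
        p_ref = (r + pseudo) / n_ref
        energies.append(-kT * math.log(p_obs / p_ref))
    return energies

# Hypothetical distance histogram: bins where native-like structures are
# enriched relative to a flat reference come out favourable (negative).
obs = [2, 40, 30, 8]
ref = [20, 20, 20, 20]
E = potential_of_mean_force(obs, ref)
print([round(e, 2) for e in E])
```

The pseudocount keeps empty bins from producing infinite energies, a standard practical concern with statistical potentials of this kind.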
Persuasive Game Design : A model and its definitions
Visch, V.T.; Vegt, N.J.H.; Anderiesen, H.; Van der Kooij, K.
2013-01-01
The following position paper proposes a general theoretical model for persuasive game design. This model combines existing theories on persuasive technology, serious gaming, and gamification. The model is based on user experience, gamification design, and transfer effects.
Integral definition of transition time in the Landau-Zener model
International Nuclear Information System (INIS)
Yan Yue; Wu Biao
2010-01-01
We give a general definition for the transition time in the Landau-Zener model. This definition allows us to compute numerically the Landau-Zener transition time at any sweeping rate without ambiguity in both diabatic and adiabatic bases. With this new definition, analytical results are obtained in both the adiabatic limit and the sudden limit.
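The Landau-Zener model itself is straightforward to integrate numerically, which is what makes a sweep-rate-independent definition of transition time testable in practice. A minimal sketch (hbar = 1; the Hamiltonian convention and parameter values are illustrative assumptions) compares the final diabatic survival probability with the closed-form Landau-Zener result:

```python
import math

# Two-level Landau-Zener sweep with H(t) = [[v*t, a], [a, -v*t]], hbar = 1.
# For a long enough sweep, the survival probability in the initial diabatic
# state approaches the Landau-Zener formula exp(-pi * a**2 / v).
v, a = 1.0, 0.25
t0, t1, dt = -60.0, 60.0, 0.001

def deriv(t, c1, c2):
    # Schrodinger equation i dc/dt = H(t) c  =>  dc/dt = -i H(t) c
    return (-1j * (v * t * c1 + a * c2),
            -1j * (a * c1 - v * t * c2))

c1, c2 = 1.0 + 0j, 0.0 + 0j   # start in one diabatic state
t = t0
for _ in range(int((t1 - t0) / dt)):   # classic RK4 for the two amplitudes
    k1 = deriv(t, c1, c2)
    k2 = deriv(t + dt / 2, c1 + dt / 2 * k1[0], c2 + dt / 2 * k1[1])
    k3 = deriv(t + dt / 2, c1 + dt / 2 * k2[0], c2 + dt / 2 * k2[1])
    k4 = deriv(t + dt, c1 + dt * k3[0], c2 + dt * k3[1])
    c1 += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
    c2 += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    t += dt

p_numeric = abs(c1) ** 2
p_lz = math.exp(-math.pi * a ** 2 / v)
print(f"numeric: {p_numeric:.4f}  Landau-Zener: {p_lz:.4f}")
```

Watching how the diabatic population settles toward its asymptotic value during such a sweep is precisely the quantity a transition-time definition has to pin down.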
Integrated source-risk model for radon: A definition study
International Nuclear Information System (INIS)
Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.
1993-10-01
The purpose of a source-risk model is to support policy making on radon mitigation by comparing the effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of processes and sources to be included in the source-risk chain, the models presently available in the Netherlands were investigated. The models were screened for completeness, validation and operational status. The investigation made clear that, by choosing for each part of the source-risk chain the most convenient model, a source-risk chain model for radon may be realized. However, the calculation of dose from radon concentrations and the status of the validation of most models should be improved. Calculations with the proposed source-risk model will at present give estimates with a large uncertainty. For further development of the source-risk model an interaction between the source-risk model and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended. The other models are operated and administered by the model owners. The model owners execute their models for a combination of input parameters. The output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs
A Model-Driven Approach for Telecommunications Network Services Definition
Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.
The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.
A definitional framework for the human/biometric sensor interaction model
Elliott, Stephen J.; Kukula, Eric P.
2010-04-01
Existing definitions for biometric testing and evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract (FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model developed by Mansfield and Wayman [1].
Time domain series system definition and gear set reliability modeling
International Nuclear Information System (INIS)
Xie, Liyang; Wu, Ningxiang; Qian, Wenxue
2016-01-01
Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in its load transmission path, system-component relationship, system functioning manner, and time-dependent system configuration. The present paper first defines the time-domain series system, for which the traditional series system reliability model is inadequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, and the treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated through several numerical examples. - Highlights: • A new type of series system, the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical analysis based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.
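The contrast the abstract draws between the traditional independent-component series model and statistically dependent, load-sharing failures can be sketched as follows; the distributions and values are illustrative assumptions, not the paper's models:

```python
import random

def series_reliability_independent(r, n):
    """Traditional series-system model: n independent, identical
    components; the system survives only if every component survives."""
    return r ** n

def series_reliability_shared_load(strength, load_sampler, trials=20_000):
    """Monte Carlo sketch of load-strength interference when every
    tooth pair sees the same random load: failures are statistically
    dependent, so system reliability collapses to P(load < strength)
    rather than the independent product above. Values illustrative."""
    survive = sum(load_sampler() < strength for _ in range(trials))
    return survive / trials

random.seed(1)
print(series_reliability_independent(0.99, 20))  # ~0.818
# With a fully shared load, adding tooth pairs does not multiply the
# failure probability the way the independent model predicts.
print(series_reliability_shared_load(100.0, lambda: random.gauss(80, 10)))
```
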
TAPWAT: Definition structure and applications for modelling drinking water treatment
Versteegh JFM; Gaalen FW van; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; Technische Universiteit Delft; LWD
2001-01-01
The 'Tool for the Analysis of the Production of drinking WATer' (TAPWAT) model has been developed for describing drinking-water quality in integral studies in the context of the Environmental Policy Assessment of the RIVM. The model consists of modules that represent individual steps in a treatment
TAPWAT: Definition structure and applications for modelling drinking water treatment
Versteegh JFM; van Gaalen FW; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; LWD
2001-01-01
The TAPWAT model ('Tool for the Analysis of the Production of drinking WATer') has been developed to describe drinking-water quality in integral studies in the context of the Environment and Nature Policy Assessment (planbureau Milieu en Natuur) of the RIVM. The model consists of modules that represent the individual treatment steps of the
Business process model abstraction : a definition, catalog, and survey
Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.
2012-01-01
The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been
Towards a Definition of Role-related Concepts for Business Modeling
Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria
2010-01-01
While several role-related concepts are central to business modeling, their definitions, relations, and use differ greatly between languages, papers, and reports. Because of this, the knowledge captured by models is not transferred correctly, and models are incomparable. In this
Health literacy and public health: A systematic review and integration of definitions and models
LENUS (Irish Health Repository)
Sorensen, Kristine
2012-01-25
Background: Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models of health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods: A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results: The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion settings, respectively. Conclusions: Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.
Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models
Dawson, Phillip
2014-01-01
More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…
A new multidimensional model with text dimensions: definition and implementation
Directory of Open Access Journals (Sweden)
MariaJ. Martin-Bautista
2013-02-01
We present a new multidimensional model with textual dimensions based on a knowledge structure extracted from the texts, in which any textual attribute in a database can be processed, not only XML texts. This dimension makes it possible to treat textual data in the same way as non-textual data, automatically and without user intervention, so all the classical operations of the multidimensional model can be defined for the textual dimension. While most of the models dealing with texts found in the literature are not implemented, in this proposal the multidimensional model and the OLAP system have been implemented in a software tool, so the model can be tested on real data. A case study with medical data is included in this work.
Directory of Open Access Journals (Sweden)
Piet Swanepoel
2011-10-01
ABSTRACT: This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions raised are how to define and identify lexical sets, how lexical conceptual models (LCMs) can support definitional consistency and coherence in defining members of lexical sets, and what the ideal content and structure of LCMs could be. Although similarity of meaning is proposed as the defining feature of lexical sets, similarity of meaning is only one dimension of the broader concept of lexical coherence. The argument is presented that numerous conceptual lexical models (e.g. taxonomies, folk models, frames, etc.) in fact indicate, justify or explain how lexical items cohere (and thus form sets). In support of Fillmore's (2003) suggestion that definitions of the lexical items of cohering sets should be linked to such explanatory models, additional functionally-orientated arguments are presented for the incorporation of conceptual lexical models in electronic monolingual learners' dictionaries. Numerous resources exist to support the design of LCMs which can improve the functionality of definitions of members of lexical sets. A few examples are discussed of how such resources can be used to design functionally justified LCMs.
SUMMARY: Improving the functionality of dictionary definitions for lexical sets: the role of definition matrices, definitional consistency, definitional coherence and the incorporation of lexical conceptual models. This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions posed are how lexical sets should be defined and identified, and how lexical conceptual models (LCMs) can support definitional consistency and coherence in defining members
Biomass Scenario Model Scenario Library: Definitions, Construction, and Description
Energy Technology Data Exchange (ETDEWEB)
Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.
2014-04-01
Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, we have the following policy-focused scenarios in our library: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.
A formal definition of data flow graph models
Kavi, Krishna M.; Buckles, Bill P.; Bhat, U. Narayan
1986-01-01
In this paper, a new model for parallel computations and parallel computer systems that is based on data flow principles is presented. Uninterpreted data flow graphs can be used to model computer systems including data driven and parallel processors. A data flow graph is defined to be a bipartite graph with actors and links as the two vertex classes. Actors can be considered similar to transitions in Petri nets, and links similar to places. The nondeterministic nature of uninterpreted data flow graphs necessitates the derivation of liveness conditions.
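A minimal executable sketch of such a bipartite data flow graph, with Petri-net-like firing semantics (an actor is enabled when every input link holds a token), is shown below; the class and method names are illustrative, not the paper's formal notation:

```python
class DataFlowGraph:
    """Bipartite graph of actors (cf. Petri-net transitions) and
    links (cf. places). Links hold tokens; firing an enabled actor
    consumes one token per input link and produces one per output."""

    def __init__(self):
        self.tokens = {}   # link name -> token count
        self.actors = {}   # actor name -> (input links, output links)

    def add_link(self, name, tokens=0):
        self.tokens[name] = tokens

    def add_actor(self, name, inputs, outputs):
        self.actors[name] = (list(inputs), list(outputs))

    def enabled(self, actor):
        ins, _ = self.actors[actor]
        return all(self.tokens[link] > 0 for link in ins)

    def fire(self, actor):
        if not self.enabled(actor):
            raise ValueError(f"actor {actor!r} is not enabled")
        ins, outs = self.actors[actor]
        for link in ins:
            self.tokens[link] -= 1
        for link in outs:
            self.tokens[link] += 1

g = DataFlowGraph()
g.add_link("a", tokens=1)
g.add_link("b")
g.add_actor("f", inputs=["a"], outputs=["b"])
g.fire("f")
print(g.tokens)  # {'a': 0, 'b': 1}
```
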
Analyzing differences in operational disease definitions using ontological modeling
Peelen, Linda; Klein, Michel C.A.; Schlobach, Stefan; De Keizer, Nicolette F.; Peek, Niels
2007-01-01
In medicine, there are many diseases which cannot be precisely characterized but are considered as natural kinds. In the communication between health care professionals, this is generally not problematic. In biomedical research, however, crisp definitions are required to unambiguously distinguish
Ecosystem models are by definition simplifications of the real ...
African Journals Online (AJOL)
to calculate changes in total phytoplankton vegetative biomass with time ... into account when modelling phytoplankton population dynamics. ... Then, the means whereby the magnitude of ..... There was increased heat input and slight stratification from mid to ... conditions must be optimal and the water should be extremely ...
The modified turning bands (MTB) model for space-time rainfall. I. Model definition and properties
Mellor, Dale
1996-02-01
A new stochastic model of space-time rainfall, the Modified Turning Bands (MTB) model, is proposed which reproduces, in particular, the movements and developments of rainbands, cluster potential regions and raincells, as well as their respective interactions. The ensemble correlation structure is unsuitable for practical estimation of the model parameters because the model is not ergodic in this statistic, and hence it cannot easily be measured from a single real storm. Thus, some general theory on the internal covariance structure of a class of stochastic models is presented, of which the MTB model is an example. It is noted that, for the MTB model, the internal covariance structure may be measured from a single storm, and can thus be used for model identification.
Defined Contribution Model: Definition, Theory and an Application for Turkey
Metin Ercen; Deniz Gokce
1998-01-01
Based on a numerical application that employs social and economic parameters of the Turkish economy, this study attempts to demonstrate that the current collapse in the Turkish social security system is not unavoidable. The present social security system in Turkey is based on the defined benefit model of pension provision. On the other hand, recent proposals for reform in the social security system are based on a multipillar system, where one of the alternatives is a defined contribution pens...
Ports: Definition and study of types, sizes and business models
Directory of Open Access Journals (Sweden)
Ivan Roa
2013-09-01
Purpose: In the world today there are thousands of port facilities of different types and sizes, competing to capture market share of seaborne freight. This article aims to determine the most common port type and size, in order to find out which business model is applied in that segment and what the legal status of the companies managing such infrastructure is. Design/methodology/approach: To achieve this goal, we carried out research on a representative sample of 800 ports worldwide, which handle 90% of containerized port loading, and then determined the legal status of the companies that manage them. Findings: The results indicate a dominant port type and size, mostly managed by companies subject to a concession model. Research limitations/implications: In this research, we study only those ports that handle freight (basically containerized), ignoring other activities such as fishing, military, tourism or recreation. Originality/value: This investigation shows that the vast majority of port facilities in the studied segment are governed by a similar corporate model and subject to pressure from markets that increasingly demand efficiency and service. Consequently, terminals tend to be concessioned to private operators in a process that might be called privatization, although in the strictest sense of the term this is not entirely accurate, because ownership of the land never ceases to be public.
Spädtke, P
2013-01-01
Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary, as well as magnetic fields caused by coils or permanent magnets, have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources are presented together with suitable models of the underlying physics. Electron guns are covered, as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$ sources), together with some remarks on beam transport.
Effect of the MCNP model definition on the computation time
International Nuclear Information System (INIS)
Šunka, Michal
2017-01-01
The presented work studies the influence of the method of defining the geometry in the MCNP transport code and its impact on the computational time, including the difficulty of preparing an input file describing the given geometry. Cases using different geometric definitions, including the use of basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)
Description logics with approximate definitions precise modeling of vague concepts
Schlobach, Stefan; Klein, Michel; Peelen, Linda
2007-01-01
We extend traditional Description Logics (DL) with a simple mechanism to handle approximate concept definitions in a qualitative way. Often, for example in medical applications, concepts are not definable in a crisp way but can fairly exhaustively be constrained through a particular sub- and a
Ports: Definition and study of types, sizes and business models
Ivan Roa; Yessica Peña; Beatriz Amante; María Goretti
2013-01-01
Purpose: In the world today there are thousands of port facilities of different types and sizes, competing to capture some market share of freight by sea, mainly. This article aims to determine the type of port and the most common size, in order to find out which business model is applied in that segment and what is the legal status of the companies of such infrastructure.Design/methodology/approach: To achieve this goal, we develop a research on a representative sample of 800 ports worldwide...
Making the Case for a Model-Based Definition of Engineering Materials (Postprint)
2017-09-12
MBE relies on digital representations, or a model-based definition (MBD), to define a product throughout design, manufacturing and sustainment ... discovery through development, scale-up, product design and qualification, manufacture and sustainment have changed little over the past decades. This ... testing data provided a certifiable material definition, so as to minimize risk and simplify procurement of materials during the design, manufacture, and
Multiple organ definition in CT using a Bayesian approach for 3D model fitting
Boes, Jennifer L.; Weymouth, Terry E.; Meyer, Charles R.
1995-08-01
Organ definition in computed tomography (CT) is of interest for treatment planning and response monitoring. We present a method for organ definition using a priori information about shape encoded in a set of biometric organ models--specifically for the liver and kidney--that accurately represents patient population shape information. Each model is generated by averaging surfaces from a learning set of organ shapes previously registered into a standard space defined by a small set of landmarks. The model is placed in a specific patient's data set by identifying these landmarks and using them as the basis for model deformation; this preliminary representation is then iteratively fit to the patient's data based on a Bayesian formulation of the model's priors and CT edge information, yielding a complete organ surface. We demonstrate this technique using a set of fifteen abdominal CT data sets for liver surface definition both before and after the addition of a kidney model to the fitting; we demonstrate the effectiveness of this tool for organ surface definition in this low-contrast domain.
Hammer, K A; Janes, F R
1995-01-01
The objectives for developing the participative method of subject definition were to gain all the relevant information to a high level of fidelity in the earliest stages of the work and so be able to build a realistic model at reduced labour cost. In order to better integrate the two activities--information acquisition and mathematical modelling--a procedure was devised using the methods of interactive management to facilitate teamwork. This procedure provided the techniques to create suitable working relationships between the two groups, the informants and the modellers, so as to maximize their free and accurate intercommunication, both during the initial definition of the linen service and during the monitoring of the accuracy and reality of the draft models. The objectives of this project were met in that the final model was quickly validated and approved, at a low labour cost.
TAME - the terrestrial-aquatic model of the environment: model definition
International Nuclear Information System (INIS)
Klos, R.A.; Mueller-Lemans, H.; Dorp, F. van; Gribi, P.
1996-10-01
TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs
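The mass-balance bookkeeping described above can be illustrated with a toy donor-controlled compartment model; the compartment names, transfer rates and time step below are placeholders, not TAME's actual parameters:

```python
def step_compartments(inventory, transfer_rates, dt):
    """One explicit time step of donor-controlled inter-compartment
    transfer (the kind of mass balance TAME's transport sub-model
    formalizes). transfer_rates[i][j] is the fractional transfer rate
    from compartment i to compartment j per unit time."""
    n = len(inventory)
    new = list(inventory)
    for i in range(n):
        for j in range(n):
            if i != j:
                # Flux is proportional to the donor compartment's
                # current inventory, so total mass is conserved.
                flux = transfer_rates[i][j] * inventory[i] * dt
                new[i] -= flux
                new[j] += flux
    return new

# Two compartments (e.g. soil and water), all activity initially in soil.
rates = [[0.0, 0.1],    # soil -> water
         [0.02, 0.0]]   # water -> soil
inv = [1.0, 0.0]
for _ in range(10):
    inv = step_compartments(inv, rates, dt=1.0)
print(inv)  # activity redistributes; total stays 1.0 (up to rounding)
```
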
TAME - the terrestrial-aquatic model of the environment: model definition
Energy Technology Data Exchange (ETDEWEB)
Klos, R.A. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Mueller-Lemans, H. [Tergoso AG fuer Umweltfragen, Sargans (Switzerland); Dorp, F. van [Nationale Genossenschaft fuer die Lagerung Radioaktiver Abfaelle (NAGRA), Baden (Switzerland); Gribi, P. [Colenco AG, Baden (Switzerland)
1996-10-01
TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs.
Energy Technology Data Exchange (ETDEWEB)
Zagonel, Aldo A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Engineering & Analysis; Andersen, David F. [University in Albany, NY (United States). The Rockefeller College of Public Affairs & Policy
2007-03-01
Based upon participant observation in group model building and content analysis of the system dynamics literature, we postulate that modeling efforts have a dual nature. On one hand, the modeling process aims to create a useful representation of a real-world system. This must be done, however, while aligning the clients’ mental models around a shared view of the system. There is significant overlap and confusion between these two goals and how they play out on a practical level. This research clarifies these distinctions by establishing an ideal-type dichotomy. To highlight the differences, we created two straw men: “micro world” characterizes a model that represents reality and “boundary object” represents a socially negotiated model. Using this framework, the literature was examined, revealing evidence for several competing views on problem definition and model conceptualization. The results are summarized in the text of this article, substantiated with strikingly polarized citations, often from the same authors. We also introduce hypotheses for the duality across the remaining phases of the modeling process. Finally, understanding and appreciation of the differences between these ideal types can promote constructive debate on their balance in system dynamics theory and practice.
Integration Of Externalized Decision Models In The Definition Of Workflows For Digital Pathology
Directory of Open Access Journals (Sweden)
J. van Leeuwen
2016-06-01
We propose a workflow solution enabling the representation of decision models as externalized executable tasks in the process definition. Our approach separates the task implementations from the workflow model, ensuring scalability and allowing the inclusion of complex decision logic in the workflow execution. We depict a simplified model of a pathology diagnosis workflow (starting with the digitization of the slides), represented according to the BPMN modeling conventions. The example shows a workflow sequence that automatically orders a HER2 FISH test when IHC is borderline according to defined, customizable thresholds. The process model integrates an image analysis algorithm that scores images. Based on the score and the thresholds, the decision model evaluates the condition and recommends the pre-ordering of an additional test when the score falls between the two thresholds. This leads to faster diagnosis and allows balancing the cost of an additional test against the overhead on the pathologist through the choice of threshold values.
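The borderline-test decision described above reduces to a simple threshold rule; the function name and threshold values below are illustrative placeholders, not the system's actual configuration:

```python
def recommend_her2_fish(ihc_score, low=1.5, high=2.5):
    """Externalized decision logic sketched from the described
    workflow: pre-order a HER2 FISH test only when the image-analysis
    IHC score is borderline, i.e. falls between two configurable
    thresholds. Threshold values here are placeholders."""
    return low <= ihc_score <= high

print(recommend_her2_fish(2.0))  # True  (borderline -> pre-order FISH)
print(recommend_her2_fish(3.0))  # False (clearly positive -> no extra test)
```

Widening or narrowing `low`/`high` is exactly the cost-versus-overhead trade-off the abstract describes: wider thresholds trigger more pre-ordered tests but less pathologist rework.
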
DEFF Research Database (Denmark)
Jensen, Morten B; Guldberg, Trine L; Harbøll, Anja
2017-01-01
the microscopic tumor cell spread. Gliomas favor spread along the white matter fiber tracts. Tumor growth models incorporating the MRI diffusion tensors (DTI) make it possible to account more consistently for glioma growth. The aim of the study was to investigate the potential of a DTI-driven growth model to improve target definition in glioblastoma (GBM). MATERIAL AND METHODS: Eleven GBM patients were scanned using T1w, T2w FLAIR, T1w + Gd and DTI. The brain was segmented into white matter, gray matter and cerebrospinal fluid. The Fisher-Kolmogorov growth model was used assuming uniform proliferation......
Development of a definition, classification system, and model for cultural geology
Mitchell, Lloyd W., III
The concept for this study is based upon a personal interest by the author, an American Indian, in promoting cultural perspectives in undergraduate college teaching and learning environments. Most academicians recognize that merged fields can enhance undergraduate curricula. However, conflict may occur when instructors attempt to merge social science fields such as history or philosophy with geoscience fields such as mining and geomorphology. For example, ideologies of Earth structures derived from scientific methodologies may conflict with historical and spiritual understandings of Earth structures held by American Indians. Specifically, this study addresses the problem of how to combine cultural studies with the geosciences into a new merged academic discipline called cultural geology. This study further attempts to develop the merged field of cultural geology using an approach consisting of three research foci: a definition, a classification system, and a model. Literature reviews were conducted for all three foci. Additionally, to better understand merged fields, a literature review was conducted specifically for academic fields that merged social and physical sciences. Methodologies concentrated on the three research foci: definition, classification system, and model. The definition was derived via a two-step process. The first step, developing keyword hierarchical ranking structures, was followed by creating and analyzing semantic word meaning lists. The classification system was developed by reviewing 102 classification systems and incorporating selected components into a system framework. The cultural geology model was created also utilizing a two-step process. A literature review of scientific models was conducted. Then, the definition and classification system were incorporated into a model felt to reflect the realm of cultural geology. A course syllabus was then developed that incorporated the resulting definition, classification system, and model. This
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
contexts; • To all attributes with a defined datatype indicating a measure datatype; • To all properties and quantities with a defined datatype indicating a measure datatype and with no local unit definitions provided. 3.2.2.3 Project context: A project representation context indicates the
Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition
2013-06-01
attributes with a defined datatype indicating a measure datatype; • To all properties and quantities with a defined datatype indicating a measure datatype and with no local unit definitions provided. 3.2.3.4 Project context: A project representation context indicates the coordinate system orienta
Defining epidemics in computer simulation models: How do definitions influence conclusions?
Directory of Open Access Journals (Sweden)
Carolyn Orbann
2017-06-01
Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in the number of deaths and in timing variables. These results are important for those who use models to inform public health policy, where questions about the timing or implementation of interventions might be answered using findings from computer simulation models.
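The effect of the epidemic cutoff can be illustrated with a toy run classifier over a small stochastic SEIR model; the parameter values and model structure below are illustrative assumptions, not taken from the paper:

```python
import random

def seir_final_size(n=200, beta=0.3, sigma=0.2, gamma=0.1, seed=None):
    """Small discrete-time stochastic SEIR sketch. Returns the
    fraction of the population ever infected by the end of the run.
    Parameter values are illustrative placeholders."""
    rng = random.Random(seed)
    s, e, i, r = n - 1, 1, 0, 0
    while e + i > 0:
        new_e = sum(rng.random() < beta * i / n for _ in range(s))
        new_i = sum(rng.random() < sigma for _ in range(e))
        new_r = sum(rng.random() < gamma for _ in range(i))
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
        r += new_r
    return (n - s) / n

def epidemic_fraction(final_sizes, cutoff):
    """Share of runs counted as 'epidemics' under a given cutoff
    (minimum fraction of the population ever infected)."""
    return sum(f >= cutoff for f in final_sizes) / len(final_sizes)

# Early stochastic die-out makes final sizes bimodal, so the chosen
# cutoff decides how many runs enter the analysis.
runs = [seir_final_size(seed=k) for k in range(50)]
for cutoff in (0.0, 0.05, 0.15):
    print(cutoff, epidemic_fraction(runs, cutoff))
```
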
DEFF Research Database (Denmark)
Karlshøj, Jan
2012-01-01
The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not accustomed to formalizing the flow … of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from buildingSMART, which is also an ISO standard (29481 … Part 1), and the Model View Definition (MVD) methodology developed by buildingSMART and BLIS. The research also includes a review of concrete IDM development projects that have been developed over the last five years. Although the study has identified interest in the IDM methodology in a number …
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data … requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many …
Bruce Bagwell, C
2018-01-01
This chapter outlines how to approach the complex tasks associated with designing models for high-dimensional cytometry data. Unlike gating approaches, modeling lends itself to automation and accounts for measurement overlap among cellular populations. Designing these models is now easier because of a new technique called high-definition t-SNE mapping. Nontrivial examples are provided that serve as a guide to create models that are consistent with data.
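As an illustration of the kind of dimensionality reduction involved, here is a generic t-SNE embedding (not the chapter's specific high-definition t-SNE mapping) applied to synthetic "cytometry-like" data with three overlapping populations; the population counts, means, and marker dimensions are assumptions for illustration.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# three hypothetical cell populations measured on 10 markers, with overlap
pops = [rng.normal(loc=mu, scale=1.0, size=(150, 10))
        for mu in (0.0, 2.5, 5.0)]
X = np.vstack(pops)

# embed the 10-dimensional events into 2-D for visualisation/modeling
emb = TSNE(n_components=2, perplexity=30, init="pca",
           random_state=0).fit_transform(X)
```

Each row of `emb` is a 2-D coordinate for one event; well-separated populations form distinct islands in the map, which is what makes such maps useful scaffolding for downstream population models.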
Directory of Open Access Journals (Sweden)
А. Лопатьєв
2017-09-01
Full Text Available The objective is to systematize and adapt the basic definitions and concepts of the systems approach, mathematical modeling and information technologies to sports science. Materials and methods. The research has studied the availability of appropriate terms in shooting sports, which would meet the requirements of modern sports science. It has examined the compliance of the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions with the modern requirements and principles. Research results. The paper suggests the basic definitions adapted to the requirements of technical sports and sports science. The research has thoroughly analyzed the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions. The paper offers options to improve the training program in accordance with the modern tendencies of training athletes. Conclusions. The research suggests systematizing and adapting the basic definitions and concepts of the systems approach, mathematical modeling and information technologies using the example of technical sports.
Modeling, Simulation, and Analysis of Novel Threshold Voltage Definition for Nano-MOSFET
Directory of Open Access Journals (Sweden)
Yashu Swami
2017-01-01
Full Text Available Threshold voltage (VTH) is an indispensable parameter in MOSFET design, modeling, and operation. Diverse definitions and extraction methods exist to model the on-off transition characteristics of the device. The governing criteria for an efficient threshold voltage definition and extraction method can be itemized as clarity, simplicity, precision, and stability across operating conditions and technology nodes. The outcomes of extraction methods diverge from the exact values due to various short-channel effects (SCEs) and nonidealities present in the device. A new approach to define and extract the real value of VTH of a MOSFET is proposed in the manuscript. The resulting novel, SCE-independent VTH extraction method, named the "hybrid extrapolation VTH extraction method" (HEEM), is elaborated, modeled, and compared with several prevalent MOSFET threshold voltage extraction methods for validation of the results. All results are verified by extensive 2D TCAD simulation and confirmed analytically at various technology nodes.
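For reference, the common linear-extrapolation idea that hybrid methods like HEEM build on can be sketched as follows: take the tangent to the transfer curve at the point of maximum transconductance and read off its intercept with Id = 0. The device model, smoothing parameter, and the true Vt = 0.6 V below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def extrapolate_vth(vg, id_):
    """Linear-extrapolation VTH: tangent at max transconductance, Id = 0 intercept."""
    gm = np.gradient(id_, vg)          # transconductance dId/dVg
    k = int(np.argmax(gm))             # bias point of maximum gm
    return vg[k] - id_[k] / gm[k]      # x-intercept of the tangent line

# synthetic transfer curve: smoothed linear turn-on with assumed Vt = 0.6 V
vg = np.linspace(0.0, 2.0, 401)
s = 0.05                                        # subthreshold smoothing (V)
id_ = 1e-4 * s * np.log1p(np.exp((vg - 0.6) / s))
vth = extrapolate_vth(vg, id_)
```

On this idealised curve the extrapolated value lands very close to the assumed 0.6 V; on real devices, SCEs and mobility degradation shift the intercept, which is what motivates corrected methods such as HEEM.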
An integrated operational definition and conceptual model of asthma self-management in teens.
Mammen, Jennifer; Rhee, Hyekyun; Norton, Sally A; Butz, Arlene M; Halterman, Jill S; Arcoleo, Kimberly
2018-01-19
A previous definition of adolescent asthma self-management was derived from interviews with clinicians/researchers and published literature; however, it did not incorporate perspectives of teens or parents. Therefore, we conducted in-depth interviews with teens and parents and synthesized present findings with the prior analysis to develop a more encompassing definition and model. Focal concepts were qualitatively extracted from 14-day self-management voice-diaries (n = 14) and 1-hour interviews (n = 42) with teens and parents (28 individuals) along with concepts found in the previous clinical/research oriented analysis. Conceptual structure and relationships were identified and key findings synthesized to develop a revised definition and model of adolescent asthma self-management. There were two primary self-management constructs: processes of self-management and tasks of self-management. Self-management was defined as the iterative process of assessing, deciding, and responding to specific situations in order to achieve personally important outcomes. Clinically relevant asthma self-management tasks included monitoring asthma, managing active issues through pharmacologic and non-pharmacologic strategies, preventing future issues, and communicating with others as needed. Self-management processes were reciprocally influenced by intrapersonal factors (both cognitive and physical), interpersonal factors (family, social and physical environments), and personally relevant asthma and non-asthma outcomes. This is the first definition of asthma self-management incorporating teen, parent, clinician, and researcher perspectives, which suggests that self-management processes and behaviors are influenced by individually variable personal and interpersonal factors, and are driven by personally important outcomes. Clinicians and researchers should investigate teens' symptom perceptions, medication beliefs, current approaches to symptom management, relevant outcomes, and
Draft Common Frame of Reference. Principles, Definitions and Model Rules of European Private Law
AA.VV; IUDICA G.
2009-01-01
European private law in principles, definitions and model rules. The volumes contain the results of the work of the Study Group on a European Civil Code (the “Study Group”) and the Research Group on Existing EC Private Law (the “Acquis Group”). The former Commission on European Contract Law (the “Lando Commission”) provided the basis for much of Books II and III; it was on their Principles of European Contract Law (PECL)1 that the Study Group and the Acquis Group built. The Acquis Group ...
A hierarchical modeling methodology for the definition and selection of requirements
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane-tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, when the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
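At the core of ANP, as of AHP, is the derivation of priority weights from pairwise-comparison judgments via the principal eigenvector of the comparison matrix. A minimal sketch with hypothetical UAV criteria (range, payload, cost) and Saaty-scale judgments chosen purely for illustration:

```python
import numpy as np

def priority_vector(pairwise, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix via power iteration."""
    a = np.asarray(pairwise, dtype=float)
    w = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iters):
        w = a @ w          # power-iteration step
        w /= w.sum()       # renormalise so weights sum to 1
    return w

# hypothetical criteria comparisons on Saaty's 1-9 scale:
# range vs payload = 3, range vs cost = 5, payload vs cost = 2
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = priority_vector(A)   # priority weights for (range, payload, cost)
```

In full ANP these local priority vectors populate the columns of a supermatrix that also captures dependencies between clusters, which is what provides the qualitative-to-quantitative traceability the dissertation relies on.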
Linking definitions, mechanisms, and modeling of drought-induced tree death.
Anderegg, William R L; Berry, Joseph A; Field, Christopher B
2012-12-01
Tree death from drought and heat stress is a critical and uncertain component in forest ecosystem responses to a changing climate. Recent research has illuminated how tree mortality is a complex cascade of changes involving interconnected plant systems over multiple timescales. Explicit consideration of the definitions, dynamics, and temporal and biological scales of tree mortality research can guide experimental and modeling approaches. In this review, we draw on the medical literature concerning human death to propose a water resource-based approach to tree mortality that considers the tree as a complex organism with a distinct growth strategy. This approach provides insight into mortality mechanisms at the tree and landscape scales and presents promising avenues into modeling tree death from drought and temperature stress. Copyright © 2012 Elsevier Ltd. All rights reserved.
Exploratory analysis regarding the domain definitions for computer based analytical models
Raicu, A.; Oanta, E.; Barhalescu, M.
2017-08-01
Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called 'simple' shapes. Through generalisation, the class of 'simple' shapes came to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as the equilibrium diagrams of metal alloys, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data-processing software instruments already developed. The software to be subsequently developed will be modularised and generalised so that it can be used in upcoming projects that require rapid development of computer-based models.
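The idea of composing a cross-section property from polygonal subdomains can be sketched with the shoelace formula, where a hole is encoded as a clockwise (negatively oriented) subdomain so that Boolean subtraction reduces to summing signed areas; the rectangle coordinates are illustrative.

```python
def shoelace_area(pts):
    """Signed area of a polygon given as [(x, y), ...]; CCW is positive."""
    n = len(pts)
    s = 0.0
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return 0.5 * s

# composite cross section: outer rectangle minus a rectangular hole
outer = [(0, 0), (4, 0), (4, 2), (0, 2)]          # CCW, area +8
hole  = [(1, 0.5), (1, 1.5), (3, 1.5), (3, 0.5)]  # CW, area -2
area = shoelace_area(outer) + shoelace_area(hole)  # net area of the section
```

The same signed-subdomain convention extends naturally to first and second moments of area, and each subdomain can carry extra attributes (e.g. material data) exactly as the general definition in the paper proposes.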
Escorpizo, Reuben; Reneman, Michiel F; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C H
2011-06-01
The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational rehabilitation (VR) based on the ICF. We presented the ICF as a model for application in VR and the rationale for the integration of the ICF. We also briefly reviewed other work disability models. Five essential elements were found towards a conceptual definition of VR: engagement or re-engagement with work; a work continuum; health conditions or events leading to work disability; a patient-centered, evidence-based approach; and multi-professional or multidisciplinary delivery. VR refers to a multi-professional approach that is provided to individuals of working age with health-related impairments, limitations, or restrictions in work functioning, and whose primary aim is to optimize work participation. We propose that the ICF and VR interface be explored further through empirical and qualitative work, encouraging stakeholders' participation.
Psychoacoustic and cognitive aspects of auditory roughness: definitions, models, and applications
Vassilakis, Pantelis N.; Kendall, Roger A.
2010-02-01
The term "auditory roughness" was first introduced in the 19th century to describe the buzzing, rattling auditory sensation accompanying narrow harmonic intervals (i.e. two tones with frequency difference in the range of ~15-150Hz, presented simultaneously). A broader definition and an overview of the psychoacoustic correlates of the auditory roughness sensation, also referred to as sensory dissonance, is followed by an examination of efforts to quantify it over the past one hundred and fifty years and leads to the introduction of a new roughness calculation model and an application that automates spectral and roughness analysis of sound signals. Implementation of spectral and roughness analysis is briefly discussed in the context of two pilot perceptual experiments, designed to assess the relationship among cultural background, music performance practice, and aesthetic attitudes towards the auditory roughness sensation.
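A minimal sketch of a roughness calculation of this family can be given for a pair of pure tones. The parameterisation below is a Sethares-style pair-roughness curve with approximate constants from the sensory-dissonance literature; it is an illustration of the approach, not necessarily the authors' own model.

```python
import math

def pair_roughness(f1, f2, a1=1.0, a2=1.0):
    """Roughness contribution of two simultaneous pure tones (Sethares-style)."""
    fmin, df = min(f1, f2), abs(f2 - f1)
    # scale the frequency difference by the critical bandwidth near fmin
    s = 0.24 / (0.0207 * fmin + 18.96)
    return a1 * a2 * (math.exp(-3.5 * s * df) - math.exp(-5.75 * s * df))

r_unison = pair_roughness(440, 440)   # no beating at unison
r_rough  = pair_roughness(440, 470)   # ~30 Hz apart: inside the rough band
r_wide   = pair_roughness(440, 880)   # octave: far outside the rough band
```

Summing such pairwise terms over all partials of two complex tones gives a spectrum-level roughness estimate, which is the kind of automated spectral-and-roughness analysis the application described above performs.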
Model of observed stochastic balance between work and free time supporting the LQTAI definition
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
2008-01-01
A balance differential equation between free time and money-producing work time on the national economy level was formulated in a previous paper in terms of two dimensionless quantities: the fraction of work time and the total productivity factor, defined as the ratio of the Gross Domestic Product … to the total salary paid in return for work. Among the solutions there is one relation that compares surprisingly well with the relevant sequences of Danish data spanning from 1948 to 2003, and also with similar data from several other countries except for slightly different model parameter values. Statistical … significant systematically balance-influencing parameters on the macro-economic level than those considered in the definition of the Life Quality Time Allocation Index in the previous paper.
Evans, Natalie; Meñaca, Arantza; Koffman, Jonathan; Harding, Richard; Higginson, Irene J; Pool, Robert; Gysels, Marjolein
2012-07-01
Cultural competency is increasingly recommended in policy and practice to improve end-of-life (EoL) care for minority ethnic groups in multicultural societies. It is imperative to critically analyze this approach to understand its underlying concepts. Our aim was to appraise cultural competency approaches described in the British literature on EoL care and minority ethnic groups. This is a critical review. Articles on cultural competency were identified from a systematic review of the literature on minority ethnic groups and EoL care in the United Kingdom. Terms, definitions, and conceptual models of cultural competency approaches were identified and situated according to purpose, components, and origin. Content analysis of definitions and models was carried out to identify key components. One-hundred thirteen articles on minority ethnic groups and EoL care in the United Kingdom were identified. Over half (n=60) contained a term, definition, or model for cultural competency. In all, 17 terms, 17 definitions, and 8 models were identified. The most frequently used term was "culturally sensitive," though "cultural competence" was defined more often. Definitions contained one or more of the components: "cognitive," "implementation," or "outcome." Models were categorized for teaching or use in patient assessment. Approaches were predominantly of American origin. The variety of terms, definitions, and models underpinning cultural competency approaches demonstrates a lack of conceptual clarity, and potentially complicates implementation. Further research is needed to compare the use of cultural competency approaches in diverse cultures and settings, and to assess the impact of such approaches on patient outcomes.
Energy Technology Data Exchange (ETDEWEB)
Atkinson-Hope, Gary; Stemmet, W.C. [Cape Peninsula University of Technology, Cape Town Campus, Cape Town (South Africa)
2006-07-01
The purpose of this paper is to assess the DIgSILENT PowerFactory software power definitions (indices) in terms of phase and sequence components for balanced and unbalanced networks when harmonic distortion is present, and to compare its results to hand calculations done following recommendations made by the IEEE Working Group on this topic. This paper also includes the development of a flowchart for calculating power indices in balanced and unbalanced three-phase networks when non-sinusoidal voltages and currents are present. A further purpose is to determine how two industrial-grade harmonic analysis software packages (DIgSILENT and ERACS) model three-phase harmonic sources used for current penetration studies, and to compare their results when applied to a network. From the investigations, another objective was to develop a methodology for modelling harmonic current sources based on a spectrum obtained from measurements. Three case studies were conducted, and the assessment and developed methodologies were shown to be effective. (Author)
A Gaussian mixture model for definition of lung tumor volumes in positron emission tomography
International Nuclear Information System (INIS)
Aristophanous, Michalis; Penney, Bill C.; Martel, Mary K.; Pelizzari, Charles A.
2007-01-01
The increased interest in 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET) in radiation treatment planning in the past five years necessitated the independent and accurate segmentation of gross tumor volume (GTV) from FDG-PET scans. In some studies the radiation oncologist contours the GTV based on a computed tomography scan, while incorporating pertinent data from the PET images. Alternatively, a simple threshold, typically 40% of the maximum intensity, has been employed to differentiate tumor from normal tissue, while other researchers have developed algorithms to aid the PET based GTV definition. None of these methods, however, results in reliable PET tumor segmentation that can be used for more sophisticated treatment plans. For this reason, we developed a Gaussian mixture model (GMM) based segmentation technique on selected PET tumor regions from non-small cell lung cancer patients. The purpose of this study was to investigate the feasibility of using a GMM-based tumor volume definition in a robust, reliable and reproducible way. A GMM relies on the idea that any distribution, in our case a distribution of image intensities, can be expressed as a mixture of Gaussian densities representing different classes. According to our implementation, each class belongs to one of three regions in the image: the background (B), the uncertain (U) and the target (T), and from these regions we can obtain the tumor volume. User interaction in the implementation is required, but is limited to the initialization of the model parameters and the selection of an "analysis region" to which the modeling is restricted. The segmentation was developed on three and tested on another four clinical cases to ensure robustness against differences observed in the clinic. It also compared favorably with thresholding at 40% of the maximum intensity and a threshold determination function based on tumor-to-background image intensities proposed in a recent paper. The parts of the
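The GMM-based intensity segmentation into the three classes described above (background B, uncertain U, target T) can be sketched on synthetic one-dimensional intensity data; the class means, spreads, and voxel counts below are assumptions for illustration, not values from the study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic voxel intensities for three hypothetical classes: B, U, T
intens = np.concatenate([
    rng.normal(20, 5, 2000),    # background (B): low uptake
    rng.normal(60, 10, 500),    # uncertain (U): intermediate uptake
    rng.normal(120, 15, 300),   # target (T): high uptake
]).reshape(-1, 1)

# fit a 3-component GMM and identify components by increasing mean intensity
gmm = GaussianMixture(n_components=3, random_state=0).fit(intens)
order = np.argsort(gmm.means_.ravel())      # B < U < T by mean
labels = gmm.predict(intens)
target = labels == order[-1]                # voxels assigned to the T class
tumor_fraction = target.mean()
```

In the actual method the fit is restricted to a user-selected analysis region and applied to 3-D image intensities, but the class-assignment logic is the same: the tumor volume is the set of voxels assigned to the highest-mean component.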
Improving fire season definition by optimized temporal modelling of daily human-caused ignitions.
Costafreda-Aumedes, S; Vega-Garcia, C; Comas, C
2018-07-01
Wildfire suppression management is usually based on fast control of all ignitions, especially in highly populated countries with pervasive values-at-risk. To minimize values-at-risk loss by improving response time of suppression resources it is necessary to anticipate ignitions, which are mainly caused by people. Previous studies have found that human-ignition patterns change spatially and temporally depending on socio-economic activities, hence, the deployment of suppression resources along the year should consider these patterns. However, full suppression capacity is operational only within legally established fire seasons, driven by past events and budgets, which limits response capacity and increases damages out of them. The aim of this study was to assess the temporal definition of fire seasons from the perspective of human-ignition patterns for the case study of Spain, where people cause over 95% of fires. Humans engage in activities that use fire as a tool in certain periods within a year, and in locations linked to specific spatial factors. Geographic variables (population, infrastructures, physiography and land uses) were used as explanatory variables for human-ignition patterns. The changing influence of these geographic variables on occurrence along the year was analysed with day-by-day logistic regression models. Daily models were built for all the municipal units in the two climatic regions in Spain (Atlantic and Mediterranean Spain) from 2002 to 2014, and similar models were grouped within continuous periods, designated as ignition-based seasons. We found three ignition-based seasons in the Mediterranean region and five in the Atlantic zones, not coincidental with calendar seasons, but with a high degree of agreement with current legally designated operational fire seasons. Our results suggest that an additional late-winter-early-spring fire season in the Mediterranean area and the extension of this same season in the Atlantic zone should be re
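A single daily model of the kind described can be sketched as a logistic regression of ignition occurrence on geographic covariates; the covariates, coefficients, and sample below are synthetic assumptions, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
# hypothetical daily records per municipal unit: two standardised covariates,
# e.g. population density and road density
n = 2000
X = rng.normal(size=(n, 2))

# assumed true model: ignition odds rise with both covariates
logit = -1.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = human-caused ignition

day_model = LogisticRegression().fit(X, y)
coefs = day_model.coef_.ravel()
```

Fitting one such model per calendar day and grouping days whose coefficient patterns are similar is, in essence, how the ignition-based seasons above are delineated.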
Jensen, Morten B; Guldberg, Trine L; Harbøll, Anja; Lukacova, Slávka; Kallehauge, Jesper F
2017-11-01
The clinical target volume (CTV) in radiotherapy is routinely based on gadolinium contrast-enhanced T1-weighted (T1w + Gd) and T2-weighted fluid-attenuated inversion recovery (T2w FLAIR) magnetic resonance imaging (MRI) sequences, which have been shown to over- or underestimate the microscopic tumor cell spread. Gliomas favor spread along the white matter fiber tracts. Tumor growth models incorporating the MRI diffusion tensors (DTI) allow one to account more consistently for the glioma growth. The aim of the study was to investigate the potential of a DTI-driven growth model to improve target definition in glioblastoma (GBM). Eleven GBM patients were scanned using T1w, T2w FLAIR, T1w + Gd and DTI. The brain was segmented into white matter, gray matter and cerebrospinal fluid. The Fisher-Kolmogorov growth model was used, assuming uniform proliferation and a white-to-gray-matter diffusion ratio of 10. The tensor directionality was tested using an anisotropy weighting parameter set to zero (γ0) and twenty (γ20). The volumetric comparison was performed using the Hausdorff distance, the Dice similarity coefficient (DSC) and the surface area. The median of the standard CTV (CTVstandard) was 180 cm³. The median surface area of CTVstandard was 211 cm². The median surface areas of CTV γ0 and CTV γ20 significantly increased to 338 and 376 cm², respectively. The Hausdorff distance was greater than zero and significantly increased for both CTV γ0 and CTV γ20, with respective medians of 18.7 and 25.2 mm. The DSC for both CTV γ0 and CTV γ20 was significantly below one, with respective medians of 0.74 and 0.72, which means that 74 and 72% of CTVstandard were included in CTV γ0 and CTV γ20, respectively. DTI-driven growth models result in CTVs with a significantly increased surface area, a significantly increased Hausdorff distance and decreased overlap between the standard and model-derived volume.
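The Fisher-Kolmogorov growth model used here can be sketched in one dimension, with a higher diffusion coefficient inside a "white matter" band to mimic faster spread along fiber tracts; all numerical values below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# 1-D Fisher-Kolmogorov sketch: dc/dt = d/dx(D dc/dx) + rho*c*(1 - c)
nx, dx = 200, 0.05            # assumed grid: 10 length units at 0.05 resolution
D = np.full(nx, 0.001)        # "gray matter" diffusion coefficient
D[80:120] = 0.01              # "white matter" band diffuses 10x faster
rho = 0.05                    # assumed proliferation rate
dt = 0.4 * dx**2 / D.max()    # explicit-Euler stability limit
c = np.zeros(nx)
c[100] = 1.0                  # seed tumor cell density at the band's center

for _ in range(2000):
    flux = D[:-1] * np.diff(c) / dx                  # staggered diffusive flux
    c[1:-1] += dt * (np.diff(flux) / dx
                     + rho * c[1:-1] * (1 - c[1:-1]))  # logistic proliferation
```

The travelling front advances at roughly 2*sqrt(D*rho), so the tumor spreads faster along the high-diffusion band; this preferential spread along tracts is exactly why the DTI-driven CTVs above acquire larger, more irregular surfaces than the standard isotropic margin.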
Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra
2014-01-01
Background: The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly whe...
A process-based model for the definition of hydrological alert systems in landslide risk mitigation
Directory of Open Access Journals (Sweden)
M. Floris
2012-11-01
Full Text Available The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence.
In this paper we report the results from a process-based model used to define a hydrological alert system for the Val di Maso landslide, located in the northeastern Italian Alps in the Province of Vicenza (Veneto region, NE Italy). The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and the whole of NE Italy. Up to 500 mm of rainfall cumulated over 3 days generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for examining in depth the specific phenomenon that occurred at Val di Maso.
Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of Val di Maso landslide, where a retrogressive evolution by multiple rotational slides is expected. Empirical models helped in the identification and in the evaluation of recurrence of critical rainfall events, while physically-based modelling was essential to verify the effects on the slope stability of determined rainfall depths. Empirical relationships between rainfall and landslide consist of the calculation of rainfall
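A minimal empirical rainfall-threshold check of the kind used in such alert systems can be sketched with a Caine-type intensity-duration curve, I = a*D**(-b). The classic worldwide constants a = 14.82, b = 0.39 are used purely for illustration; they are not the thresholds derived for Val di Maso.

```python
def exceeds_threshold(duration_h, rain_mm, a=14.82, b=0.39):
    """True if the event's mean intensity (mm/h) exceeds I = a * D**(-b)."""
    intensity = rain_mm / duration_h           # mean rainfall intensity
    return intensity > a * duration_h ** (-b)  # compare against the curve

# the November 2010 event: ~500 mm in 3 days (72 h) => ~6.9 mm/h mean intensity
event_critical = exceeds_threshold(72, 500.0)
```

In an operational system the duration-intensity pair of an ongoing event is tracked continuously, and crossing the calibrated curve raises the hydrological alert; physically-based slope-stability modelling then serves to verify what such an empirical crossing implies for the specific slope.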
Definition of an Object-Oriented Modeling Language for Enterprise Architecture
Lê, Lam Son; Wegmann, Alain
2005-01-01
In enterprise architecture, the goal is to integrate business resources and IT resources in order to improve an enterprise's competitiveness. In an enterprise architecture project, the development team usually constructs a model that represents the enterprise: the enterprise model. In this paper, we present a modeling language for building such enterprise models. Our enterprise models are hierarchical object-oriented representations of the enterprises. This paper presents the foundations of o...
2013-12-31
… features to be excluded from certification, verification, and enforcement testing as long as specific … class* that must be tested: Self-Contained Open Refrigerators, 2 Basic Models; Self-Contained Open Freezers, 2 Basic Models; Self-Contained Closed Refrigerators, 2 Basic Models; Self-Contained …
Modelling SDL, Modelling Languages
Directory of Open Access Journals (Sweden)
Michael Piefel
2007-02-01
Full Text Available Today's software systems are too complex to implement and model using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages has thus become the most challenging task in the development of tools. We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract, reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactical peculiarities. We present a set of common concept classes that describe the structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL. We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
International Nuclear Information System (INIS)
Stephansson, O.
1987-05-01
Existing knowledge of crustal stresses for Fennoscandia is presented. Generic, two-dimensional models are proposed for vertical and planar sections of a traverse having a direction NW-SE in Northern Fennoscandia. The proposed traverse will include the major neotectonic structures at Lansjaerv and Paervie, respectively, and also the study site for storage of spent nuclear fuel at Kamlunge. The influence of glaciation, deglaciation, glacial rebound on crustal rock mechanics and stability is studied for the modelling work. Global models, with a length of roughly 100 km, will increase our overall understanding of the change in stresses and deformations. These can provide boundary conditions for regional and near-field models. Properties of strength and stiffness of intact granitic rock masses, faults and joints are considered in the modelling of the crustal rock mechanics for any of the three models described. (orig./HP)
2013-10-22
... transformers, electric motors, and small electric motors to use AEDMs to rate their non-tested combinations... electric storage water heaters • Commercial gas-fired and oil-fired storage water heaters •... Electric Water Heaters: 2 Basic Models. Heat Pump Water Heaters: 2 Basic Models. Unfired Hot Water Storage...
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the
Laffan, Shawn W; Wang, Zhaoyuan; Ward, Michael P
2011-12-01
The definition of the spatial relatedness between infectious and susceptible animal groups is a fundamental component of spatio-temporal modelling of disease outbreaks. A common neighbourhood definition for disease spread in wild and feral animal populations is the distance between the centroids of neighbouring group home ranges. This distance can be used to define neighbourhood interactions, and also to describe the probability of successful disease transmission. Key limitations of this approach are (1) that a susceptible neighbour of an infectious group with an overlapping home range - but whose centroid lies outside the home range of an infectious group - will not be considered for disease transmission, and (2) the degree of overlap between the home ranges is not taken into account for those groups with centroids inside the infectious home range. We assessed the impact of both distance-based and range overlap methods of disease transmission on model-predicted disease spread. Range overlap was calculated using home ranges modelled as circles. We used the Sirca geographic automata model, with the population data from a nine-county study area in Texas that we have previously described. For each method we applied 100 model repetitions, each of 100 time steps, to 30 index locations. The results show that the rate of disease spread for the range-overlap method is clearly less than the distance-based method, with median outbreaks modelled using the latter being 1.4-1.45 times larger. However, the two methods show similar overall trends in the area infected, and the range-overlap median (48 and 120 for cattle and pigs, respectively) falls within the 5th-95th percentile range of the distance-based method (0-96 and 0-252 for cattle and pigs, respectively). These differences can be attributed to the calculation of the interaction probabilities in the two methods, with overlap weights generally resulting in lower interaction probabilities. The definition of spatial
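The contrast between the two neighbourhood definitions above can be sketched with circular home ranges (the radii, weighting scheme, and function names here are illustrative assumptions, not the Sirca implementation):

```python
import math

def circle_overlap_area(d, r1, r2):
    """Area of intersection of two circles with centre distance d (lens formula)."""
    if d >= r1 + r2:
        return 0.0                      # disjoint home ranges
    if d <= abs(r1 - r2):
        r = min(r1, r2)
        return math.pi * r * r          # one range entirely inside the other
    a1 = r1**2 * math.acos((d**2 + r1**2 - r2**2) / (2 * d * r1))
    a2 = r2**2 * math.acos((d**2 + r2**2 - r1**2) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def distance_weight(d, r_inf):
    """Distance-based rule: a neighbour counts only if its centroid
    lies inside the infectious group's home range."""
    return 1.0 if d <= r_inf else 0.0

def overlap_weight(d, r_inf, r_sus):
    """Range-overlap rule: weight proportional to the shared area."""
    return circle_overlap_area(d, r_inf, r_sus) / (math.pi * r_sus**2)

# A susceptible group whose centroid lies outside the infectious range
# (d = 1.5, radii = 1) is invisible to the distance rule but not to overlap:
print(distance_weight(1.5, 1.0))            # 0.0
print(round(overlap_weight(1.5, 1.0, 1.0), 3))
```

This reproduces the abstract's two key limitations: overlapping ranges with an outside centroid get zero weight under the distance rule, and the overlap rule generally yields lower interaction weights, consistent with the slower predicted spread.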
Torian, J. G.
1977-01-01
Consumables models required for the mission planning and scheduling function are formulated. The relation of the models to prelaunch, onboard, ground support, and postmission functions for the space transportation systems is established. An analytical model, consisting of an orbiter planning processor with a consumables database, is developed. A method of recognizing potential constraint violations in both the planning and flight operations functions is presented, together with a flight data file for storage/retrieval of information over an extended period, which interfaces with a flight operations processor for monitoring of the actual flights.
International Nuclear Information System (INIS)
Fritsch, Daniel; Yu Liyun; Johnson, Valen; McAuliffe, Matthew; Pizer, Stephen; Chaney, Edward
1996-01-01
Purpose/Objective: Current clinical methods for defining normal anatomical structures on tomographic images are time-consuming and subject to intra- and inter-user variability. With the widespread implementation of 3D RTP, conformal radiotherapy, and dose escalation, the implications of imprecise object definition have assumed a much higher level of importance. Object definition and volume-weighted metrics for normal anatomy, such as DVHs and NTCPs, play critical roles in aiming, shaping, and weighting beams. Improvements in object definition, including computer automation, are essential to yield reliable volume-weighted metrics and gains in human efficiency. The purpose of this study was to investigate a probabilistic approach using deformable models to automatically recognize and extract normal anatomy in tomographic images. Materials and Methods: Object models were created from normal organs that were segmented by an interactive method which involved placing a cursor near the center of the object on a slice and clicking a mouse button to initiate computation of structures called cores. Cores describe the skeletal and boundary shape of image objects in a manner that, in 2D, associates a location on the skeleton with the width of the object at that location. A significant advantage of cores is stability against image disturbances such as noise and blur. The model was composed of a relatively small set of extracted points on the skeleton and boundary. The points were carefully chosen to summarize the shape information captured by the cores. Neighborhood relationships between points were represented mathematically by energy functions that penalize warping of the model against the "goodness" of match between the model and the image data at any stage during the segmentation process. The model was matched against the image data using a probabilistic approach based on Bayes' theorem, which provides a means for computing a posteriori (posterior) probability from 1) a
Ising model of a randomly triangulated random surface as a definition of fermionic string theory
International Nuclear Information System (INIS)
Bershadsky, M.A.; Migdal, A.A.
1986-01-01
Fermionic degrees of freedom are added to randomly triangulated planar random surfaces. It is shown that the Ising model on a fixed graph is equivalent to a certain Majorana fermion theory on the dual graph. (orig.)
Transport spatial model for the definition of green routes for city logistics centers
International Nuclear Information System (INIS)
Pamučar, Dragan; Gigović, Ljubomir; Ćirović, Goran; Regodić, Miodrag
2016-01-01
This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is given as a problem of optimization in terms of the parameters of: the environment, health, use of space and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.
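The core of the routing step, a weighted linear combination of normalised criteria feeding a shortest-path search, can be sketched as follows (the criteria names, weights, and toy network are assumptions for illustration; the real TSDSM derives its criteria and weights in the GIS):

```python
import heapq

# Hypothetical per-link criteria scores, already normalised to [0, 1]:
# emissions, health impact, use of space, logistics operating cost.
WEIGHTS = {"emissions": 0.4, "health": 0.3, "space": 0.1, "cost": 0.2}

def wlc_cost(criteria):
    """Weighted Linear Combination of normalised criteria for one link."""
    return sum(WEIGHTS[k] * v for k, v in criteria.items())

def dijkstra(graph, source, target):
    """Classic Dijkstra over WLC link costs; returns (cost, path)."""
    queue = [(0.0, source, [source])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, criteria in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + wlc_cost(criteria), nxt, path + [nxt]))
    return float("inf"), []

# Toy network: logistics centre "LC" to user "U" via two candidate links.
graph = {
    "LC": {"A": {"emissions": 0.9, "health": 0.8, "space": 0.5, "cost": 0.2},
           "B": {"emissions": 0.2, "health": 0.1, "space": 0.4, "cost": 0.6}},
    "A":  {"U": {"emissions": 0.1, "health": 0.1, "space": 0.1, "cost": 0.1}},
    "B":  {"U": {"emissions": 0.3, "health": 0.2, "space": 0.3, "cost": 0.3}},
}
cost, route = dijkstra(graph, "LC", "U")
print(route)  # ['LC', 'B', 'U'] — the greener route wins despite higher cost
```

The environmental weights dominate here, so the route through "B" is preferred even though its operating cost score is worse, which is the intended effect of a green-routing objective.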
Transport spatial model for the definition of green routes for city logistics centers
Energy Technology Data Exchange (ETDEWEB)
Pamučar, Dragan, E-mail: dpamucar@gmail.com [University of Defence in Belgrade, Department of Logistics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Gigović, Ljubomir, E-mail: gigoviclj@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Ćirović, Goran, E-mail: cirovic@sezampro.rs [College of Civil Engineering and Geodesy, The Belgrade University, Hajduk Stankova 2, 11000 Belgrade (Serbia); Regodić, Miodrag, E-mail: mregodic62@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia)
2016-01-15
This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is given as a problem of optimization in terms of the parameters of: the environment, health, use of space and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.
International Nuclear Information System (INIS)
Kimura, Mitsuhiro
2006-01-01
This paper proposes a method of assessing software vulnerability quantitatively. By expanding the concept of the IPO (input-program-output) model, we first define software vulnerability and construct a stochastic model. Then we evaluate the software vulnerability of the sendmail system by analyzing actual security-hole data, which were collected from its release notes. We also show the relationship between the estimated software reliability and the vulnerability of the analyzed system.
Sidorov, Vladimir P.; Melzitdinova, Anna V.
2017-10-01
This paper presents methods for determining thermal constants from data on the weld width under a normal-circular heat source. The method is based on isoline contouring of "effective power - temperature conductivity coefficient". Determining these coefficients makes it possible to set requirements on the precision of welding-parameter control with sufficient accuracy for engineering practice.
Directory of Open Access Journals (Sweden)
A.A. Malykh
2017-08-01
Full Text Available In this paper, the concept of locally simple models is considered. Locally simple models are arbitrarily complex models built from relatively simple components. A lot of practically important domains of discourse can be described as locally simple models, for example, business models of enterprises and companies. Up to now, research in human reasoning automation has been mainly concentrated around the most intellectually intensive activities, such as automated theorem proving. On the other hand, the retailer business model is formed from "jobs", and each "job" can be modelled and automated more or less easily. At the same time, the whole retailer model as an integrated system is extremely complex. In this paper, we offer a variant of the mathematical definition of a locally simple model. This definition is intended for modelling a wide range of domains. Therefore, we also must take into account perceptual and psychological issues. Logic is elitist, and if we want to attract as many people as possible to our models, we need to hide this elitism behind some metaphor to which 'ordinary' people are accustomed. As such a metaphor, we use the concept of a document, so our locally simple models are called document models. Document models are built in the paradigm of semantic programming. This allows us to achieve another important goal: to make the document models executable. Executable models are models that can act as practical information systems in the described domain of discourse. Thus, if our model is executable, then programming becomes redundant. The direct use of a model, instead of coding it in programming modules, brings important advantages, for example, a drastic cost reduction for development and maintenance. Moreover, since the model is sound and not dissolved within programming modules, we can directly apply AI tools, in particular, machine learning. This significantly expands the possibilities for automation and
A topo-graph model for indistinct target boundary definition from anatomical images.
Cui, Hui; Wang, Xiuying; Zhou, Jianlong; Gong, Guanzhong; Eberl, Stefan; Yin, Yong; Wang, Lisheng; Feng, Dagan; Fulham, Michael
2018-06-01
It can be challenging to delineate the target object in anatomical imaging when the object boundaries are difficult to discern due to low contrast or overlapping intensity distributions from adjacent tissues. We propose a topo-graph model to address this issue. The first step is to extract a topographic representation that reflects multiple levels of topographic information in an input image. We then define two types of node connections: nesting branches (NBs) and geodesic edges (GEs). NBs connect nodes corresponding to initial topographic regions and GEs link the nodes at a detailed level. The weights for NBs are defined to measure the similarity of regional appearance, and weights for GEs are defined with geodesic and local constraints. NBs contribute to the separation of topographic regions and the GEs assist the delineation of uncertain boundaries. Final segmentation is achieved by calculating the relevance of the unlabeled nodes to the labels by the optimization of a graph-based energy function. We tested our model on 47 low contrast CT studies of patients with non-small cell lung cancer (NSCLC), 10 contrast-enhanced CT liver cases and 50 breast and abdominal ultrasound images. The validation criteria were the Dice similarity coefficient and the Hausdorff distance. Student's t-tests showed that our model outperformed the graph models with pixel-only, pixel and regional, neighboring and radial connections (p-values <0.05). Our findings show that the topographic representation and topo-graph model provide improved delineation and separation of objects from adjacent tissues compared to the tested models. Copyright © 2018 Elsevier B.V. All rights reserved.
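The final step, computing the relevance of unlabeled nodes to the labels by optimizing a graph-based energy, can be illustrated with a simplified harmonic label propagation (a generic stand-in under assumed weights; the paper's energy function over NBs and GEs is richer):

```python
# Minimal harmonic-energy label propagation on a weighted graph:
# labelled seed nodes are clamped, and each unlabelled node iteratively
# takes the weighted average of its neighbours, which minimises a
# quadratic graph energy sum_ij w_ij * (x_i - x_j)^2.
def propagate(edges, seeds, n_nodes, iters=500):
    adj = {i: [] for i in range(n_nodes)}
    for i, j, w in edges:
        adj[i].append((j, w))
        adj[j].append((i, w))
    x = [seeds.get(i, 0.5) for i in range(n_nodes)]   # 1 = object, 0 = background
    for _ in range(iters):
        for i in range(n_nodes):
            if i in seeds:
                continue
            num = sum(w * x[j] for j, w in adj[i])
            den = sum(w for _, w in adj[i])
            if den:
                x[i] = num / den
    return x

# 4-node chain: node 0 is an object seed, node 3 a background seed.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]
relevance = propagate(edges, {0: 1.0, 3: 0.0}, 4)
print([round(v, 2) for v in relevance])  # → [1.0, 0.67, 0.33, 0.0]
```

Each node's final value is its relevance to the object label; thresholding at 0.5 would give the segmentation.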
Russell, Richard A.; Waiss, Richard D.
1988-01-01
A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...
Escorpizo, Reuben; Reneman, Michiel F.; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E.; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C. H.
Background The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational
Directory of Open Access Journals (Sweden)
Hucka Michael
2015-06-01
Full Text Available Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
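A minimal skeleton of such a document can be sketched and parsed with the Python standard library. This is an abridged illustration, not a conforming model: the namespace URI is assumed to follow SBML's level/version pattern, and a valid file needs the full attribute and unit machinery defined in the specification.

```python
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level2/version5"   # assumed URI pattern

# Abridged SBML Level 2 document: one compartment, two species.
doc = """<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="{ns}" level="2" version="5">
  <model id="toy_pathway">
    <listOfCompartments>
      <compartment id="cell" size="1"/>
    </listOfCompartments>
    <listOfSpecies>
      <species id="S1" compartment="cell" initialAmount="10"/>
      <species id="P1" compartment="cell" initialAmount="0"/>
    </listOfSpecies>
  </model>
</sbml>
""".format(ns=SBML_NS)

root = ET.fromstring(doc)
model = root.find(f"{{{SBML_NS}}}model")
species = model.findall(f".//{{{SBML_NS}}}species")
print(model.get("id"), len(species))   # → toy_pathway 2
```

Because every element lives in the SBML namespace, any consuming tool sees the same unambiguous structure, which is the interoperability point the abstract makes.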
High definition geomagnetic models: A new perspective for improved wellbore positioning
DEFF Research Database (Denmark)
Maus, Stefan; Nair, Manoj C.; Poedjono, Benny
2012-01-01
represent the main magnetic field originating in the Earth's liquid core, but the new models additionally account for crustal magnetic anomalies, which constitute a significant source of error in directional drilling. NGDC maintains a public archive of global ship and airborne magnetic field measurements...
An evaluation model for the definition of regulatory requirements on spent fuel pool cooling systems
International Nuclear Information System (INIS)
Izquierdo, J.M.
1979-01-01
A calculation model is presented for establishing regulatory requirements for the spent fuel pool cooling system (SFPCS). The major design factors, regulatory and design limits, and key parameters are discussed. A regulatory position for internal use is proposed. Finally, associated problems and experience are presented. (author)
Mathematical models for the definition of cell manufacturing layout. Literature review
Directory of Open Access Journals (Sweden)
Gustavo Andrés Romero Duque
2015-11-01
Full Text Available This review article discusses approaches to the layout problem of cell manufacturing (LCM) in a descriptive form: first considering the problem and its variations, then the elements of the mathematical models, subsequently presenting the solution methods used; finally, some future perspectives on this topic are considered.
de Soria-Santacruz Pich, Maria; Jun, Insoo; Evans, Robin
2017-09-01
The empirical AP8/AE8 model has been the de facto Earth's radiation belts engineering reference for decades. The need from the community for a better model incubated the development of AP9/AE9/SPM, which addresses several shortcomings of the old model. We provide additional validation of AP9/AE9 by comparing in situ electron and proton data from Jason-2, Polar Orbiting Environmental Satellites (POES), and the Van Allen Probes spacecraft with the 5th, 50th, and 95th percentiles from AE9/AP9 and with the model outputs from AE8/AP8. The relatively short duration of Van Allen Probes and Jason-2 missions means that their measurements are most certainly the result of specific climatological conditions. In low Earth orbit (LEO), the Jason-2 proton flux is better reproduced by AP8 compared to AP9, while the POES electron data are well enveloped by AE9 5th and 95th percentiles. The shape of the South Atlantic anomaly (SAA) from Jason-2 data is better captured by AP9 compared to AP8, while the peak SAA flux is better reproduced by AP8. The <1.5 MeV inner belt electrons from Magnetic Electron Ion Spectrometer (MagEIS) are well enveloped by AE9 5th and 95th percentiles, while AE8 overpredicts the measurements. In the outer radiation belt, MagEIS and Relativistic Electron and Proton Telescope (REPT) electrons closely follow the median estimate from AE9, while AP9 5th and 95th percentiles generally envelop REPT proton measurements in the inner belt and slot regions. While AE9/AP9 offer the flexibility to specify the environment with different confidence levels, the dose and trapped proton peak flux for POES and Jason-2 trajectories from the AE9/AP9 50th percentile and above are larger than the estimates from the AE8/AP8 models.
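The percentile-envelope check used throughout this comparison can be sketched as follows (the flux numbers below are hypothetical placeholders, not mission data):

```python
# Fraction of in-situ measurements that fall inside a model's
# 5th-95th percentile envelope, point by point.
def envelopment_fraction(measured, p5, p95):
    inside = sum(1 for m, lo, hi in zip(measured, p5, p95) if lo <= m <= hi)
    return inside / len(measured)

measured  = [120.0, 95.0, 310.0, 40.0, 180.0]   # hypothetical flux samples
model_p5  = [50.0] * 5                          # model 5th percentile
model_p95 = [250.0] * 5                         # model 95th percentile
print(envelopment_fraction(measured, model_p5, model_p95))  # → 0.6
```

A fraction near 0.9 would be expected if the model's percentiles were well calibrated for the observed climatological conditions; lower values flag measurements, like the 310 and 40 samples here, that escape the envelope.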
Directory of Open Access Journals (Sweden)
Alba Sandyra Bezerra Lopes
2012-01-01
Full Text Available Motion estimation is the most complex module in a video encoder, requiring a high processing throughput and high memory bandwidth, especially when the focus is high-definition video. The throughput problem can be solved by increasing the parallelism of the internal operations. The external memory bandwidth may be reduced using a memory hierarchy. This work presents a memory hierarchy model for a full-search motion estimation core. The proposed memory hierarchy model is based on a data reuse scheme that exploits the features of the full search algorithm. The proposed memory hierarchy greatly reduces the external memory bandwidth required for the motion estimation process, and it provides a very high data throughput for the ME core. This throughput is necessary to achieve real time when processing high-definition videos. In the worst bandwidth scenario, this memory hierarchy reduces the external memory bandwidth by a factor of 578. A case study of the proposed hierarchy, using a 32×32 search window and an 8×8 block size, was implemented and prototyped on a Virtex 4 FPGA. The results show that it is possible to reach 38 frames per second when processing full HD frames (1920×1080 pixels) using nearly 299 Mbytes per second of external memory bandwidth.
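The scale of the saving from data reuse can be shown with back-of-the-envelope arithmetic (generic full-search figures under the stated 8×8 block and 32×32 window; the 578× figure comes from the paper's specific hierarchy and worst-case scenario, which also reuses data across blocks):

```python
# Full-search motion estimation: 8x8 blocks, 32x32 search window.
BLOCK, WIN = 8, 32
candidates = (WIN - BLOCK + 1) ** 2          # 625 candidate positions

# Without reuse: every candidate re-reads one BLOCKxBLOCK reference patch
# from external memory.
naive_reads = candidates * BLOCK * BLOCK     # pixels fetched per coded block

# With window-level reuse: the whole search window is fetched once into
# on-chip memory and all candidates read it locally.
reuse_reads = WIN * WIN

print(naive_reads, reuse_reads, naive_reads / reuse_reads)
```

Even this single-level reuse cuts external traffic by roughly 39×; hierarchies that also share window data between neighbouring blocks push the factor far higher, in the direction of the paper's reported 578×.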
Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M
2014-01-01
Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical data of 1yr on commercial spring calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat
Model for definition of heat transfer coefficient in an annular two-phase flow
International Nuclear Information System (INIS)
Khun, J.
1976-01-01
Near-wall heat exchange in a vertical tube at high vapor velocity in a two-phase vapor-liquid flow is investigated. The flow divides inside the tube into a near-wall liquid film and a vapor core containing liquid droplets, with uniform boundaries between them. The liquid film thickness determines the main resistance to heat transfer between the wall and the vapor core. The theoretical model presented is verified in vaporization experiments with water, the R12 refrigerant, and certain hydrocarbons. The frictional pressure loss is determined by the Lockhart-Martinelli method. The approximately universal von Kármán velocity profile is used to evaluate the velocity in the film and, based on this, the film thickness is determined. The parameter ranges were Re(vap) = 10^4-3×10^6 and Re(liq) = 0.9-10. The theoretical model shows good agreement with the experiments.
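The Lockhart-Martinelli step can be sketched with the textbook Chisholm correlation (a standard form under assumed pressure gradients; the paper's exact correlations are not given in the abstract):

```python
import math

def martinelli_parameter(dpdz_liquid, dpdz_vapor):
    """X = square root of the ratio of the single-phase frictional
    pressure gradients of the liquid and vapor flowing alone."""
    return math.sqrt(dpdz_liquid / dpdz_vapor)

def two_phase_multiplier_sq(x, c=20.0):
    """Chisholm form: phi_l^2 = 1 + C/X + 1/X^2
    (C = 20 for turbulent liquid / turbulent vapor)."""
    return 1.0 + c / x + 1.0 / x**2

x = martinelli_parameter(200.0, 800.0)    # hypothetical gradients, Pa/m
phi_l_sq = two_phase_multiplier_sq(x)
dpdz_two_phase = phi_l_sq * 200.0         # two-phase frictional gradient, Pa/m
print(round(x, 2), round(dpdz_two_phase))  # → 0.5 9000
```

The two-phase multiplier scales the single-phase liquid gradient up sharply at small X, reflecting the extra friction the vapor core imposes on the thin liquid film.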
Economics definitions, methods, models, and analysis procedures for Homeland Security applications.
Energy Technology Data Exchange (ETDEWEB)
Ehlen, Mark Andrew; Loose, Verne William; Vargas, Vanessa N.; Smith, Braeton J.; Warren, Drake E.; Downes, Paula Sue; Eidson, Eric D.; Mackey, Greg Edward
2010-01-01
This report gives an overview of the types of economic methodologies and models used by Sandia economists in their consequence analysis work for the National Infrastructure Simulation & Analysis Center and other DHS programs. It describes the three primary resolutions at which analysis is conducted (microeconomic, mesoeconomic, and macroeconomic), the tools used at these three levels (from data analysis to internally developed and publicly available tools), and how they are used individually and in concert with each other and other infrastructure tools.
Gibbs, Sheena Simpkins; Kulig, Judith C
2017-09-01
The world's population is getting older, which will inevitably increase the demand for nurses to provide high-quality care to this demographic. Attitudes have been shown to influence the quality of care that older adults receive. It is therefore important to gain a better understanding of what influences nursing students' attitudes towards older adults. This article reports on one of three inter-connected research questions of a mixed methods study that explored the relationship between clinical instructors' attitudes and nursing students' attitudes towards older adults. Semi-structured interviews were conducted with 6 clinical instructors and 13 nursing students. Interview data were analyzed using thematic analysis. A conceptual model was developed from the research findings, which revealed that nursing instructors are seen as strong role models for their students, and as role models, they influence students through demonstrations, expectations and support. As a result, nursing students mirror the attitudes of their instructors towards older adults. Findings from this study highlight the strong connection between nursing instructors' and students' attitudes. This has important implications for nursing education, including strategies that instructors can employ to enhance students' attitudes towards older adults. Insights from this study also have the potential to improve the quality of care that future nurses provide to older adults. Copyright © 2017 Elsevier Ltd. All rights reserved.
Definition and implementation of a fully coupled THM model for unsaturated soils
International Nuclear Information System (INIS)
Haxaire, A.; Galavi, V.; Brinkgreve, R.B.J.
2012-01-01
Document available in extended abstract form only. The governing equations of a coupled thermo-hydro-mechanical (THM) model are presented. They extend the previous work of Galavi (2011), in which a coupled THM model based on Biot's consolidation theory was developed for saturated and partially saturated soils. This study is based on the assumption of local thermodynamic equilibrium, meaning that all phases have the same temperature at a given point of the multiphase porous medium. The model is implemented in a research version of PLAXIS 2D. The non-isothermal partially saturated flow is modeled using the water mass balance described in Rutqvist et al. (2001), in which the flux is decomposed into water advection and vapor diffusion. A limiting hypothesis is that the gas pressure is assumed constant over the entire domain; the air flow as a separate phase is therefore neglected. This choice leaves only one independent unknown in the fluid mass balance equation, the water pressure. However, vapor diffusion can still be modeled. It depends on temperature by means of a decomposition into a water-pressure-gradient part and a temperature-gradient part. The vapor density follows the psychrometric law. The water storage is described using the degree of saturation of the soil, the mechanical volumetric strains, and the variation of the skeleton density. To model the influence of water flow in the heat transport equation, the flux term is split into an averaged conductive term and a water advection term. The other quantities, such as heat capacity and density, are also averaged. This yields a simple yet accurate representation of the interactions between water flow and temperature. The non-isothermal deformation is formulated in terms of Bishop stresses, with the effective saturation taken as the Bishop coefficient. This provides better accuracy compared with experimental results. The anisotropic thermal expansion tensor is used for the drained linear
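The psychrometric law mentioned above is commonly stated as a Kelvin-type relation linking vapor density to suction and temperature; a hedged sketch in generic notation (symbols are illustrative and not necessarily the paper's own):

\[
\rho_v \;=\; \rho_{vs}(T)\,\exp\!\left(\frac{-s\,M_w}{\rho_w\,R\,T}\right)
\]

where \(\rho_{vs}(T)\) is the saturated vapor density, \(s\) the suction, \(M_w\) the molar mass of water, \(\rho_w\) the liquid water density, \(R\) the universal gas constant and \(T\) the absolute temperature.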
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all available data. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and to combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection: it consists of selecting sites similar to the target site, using clustering methods based on observable hydrogeological features. The second step is data assimilation: it consists of assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R package, facilitating easy use by other practitioners.
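The two-step prior-building idea can be sketched as follows; the site features, parameter values and the simple pooled-normal prior are all invented stand-ins (the paper uses clustering plus a full Bayesian hierarchical model, not this simplification):

```python
import math

# Hypothetical site records: observable features plus measured
# log10 hydraulic conductivity values (illustrative numbers only).
sites = [
    {"features": (0.30, 1.2), "logK": [-3.1, -3.4, -2.9]},
    {"features": (0.32, 1.1), "logK": [-3.0, -3.3]},
    {"features": (0.80, 4.0), "logK": [-6.2, -5.9]},
]
target_features = (0.31, 1.15)

# Step 1: data selection -- keep sites whose features lie close to the
# target's (a crude stand-in for clustering of similar sites).
similar = [s for s in sites
           if math.dist(s["features"], target_features) < 0.5]

# Step 2: data assimilation -- pool the selected sites' measurements into
# a normal prior for the target site's parameter (the hierarchical model
# would additionally separate inter-site from within-site variability).
pooled = [x for s in similar for x in s["logK"]]
mu = sum(pooled) / len(pooled)
sigma = math.sqrt(sum((x - mu) ** 2 for x in pooled) / (len(pooled) - 1))
print(len(similar), round(mu, 2), round(sigma, 2))
```

With these toy numbers, the two hydraulically similar sites are selected and the dissimilar one is excluded from the prior.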
Barbagallo, Simone; Corradi, Luca; de Ville de Goyet, Jean; Iannucci, Marina; Porro, Ivan; Rosso, Nicola; Tanfani, Elena; Testi, Angela
2015-05-17
The Operating Room (OR) is a key resource of all major hospitals, but it also accounts for up to 40% of resource costs. Improving cost effectiveness while maintaining quality of care is a universal objective. These goals imply an optimization of the planning and scheduling of the activities involved. This is highly challenging due to the inherently variable and unpredictable nature of surgery. Business Process Model and Notation (BPMN 2.0) was used for the representation of the "OR process" (defined as the sequence of all of the elementary steps between "patient ready for surgery" and "patient operated upon") as a general pathway ("path"). The path was standardized as much as possible while keeping all of the key elements that allow one to address the other steps of planning and the wide variability in terms of patient specificity. The path was used to schedule OR activity, room by room and day by day, feeding the process from a "waiting list database" and using a mathematical optimization model with the objective of arriving at an optimized plan. The OR process was defined with special attention paid to flows, timing and resource involvement. Standardization addressed the dynamics of the operation and defined an expected operating time for each operation. The optimization model has been implemented and tested on real clinical data. Comparison of the results with the real data shows that the optimization model allows about 30% more patients to be scheduled than in actual practice, and better exploits OR efficiency, increasing the average operating room utilization rate by up to 20%. The optimization of OR activity planning is essential in order to manage the hospital's waiting list. Optimal planning is facilitated by defining the operation as a standard pathway where all variables are taken into account. By allowing a precise scheduling, it feeds the process of
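The planning idea can be illustrated with a toy heuristic: pack waiting-list cases, each with a standardized expected operating time, into daily room sessions. The case list and the least-loaded-room rule below are invented for illustration; the paper itself uses a mathematical optimization model, not this greedy sketch:

```python
SESSION_MIN = 480  # one 8-hour OR session, in minutes

# Invented waiting list: (patient, standardized expected operating time).
waiting_list = [("p1", 200), ("p2", 150), ("p3", 120), ("p4", 90),
                ("p5", 240), ("p6", 60)]

def plan(cases, n_rooms):
    """Longest-case-first greedy: assign each case to the currently
    least-loaded room if it still fits in that room's session."""
    rooms = [[] for _ in range(n_rooms)]
    load = [0] * n_rooms
    for name, minutes in sorted(cases, key=lambda c: -c[1]):
        i = min(range(n_rooms), key=lambda r: load[r])
        if load[i] + minutes <= SESSION_MIN:
            rooms[i].append(name)
            load[i] += minutes
    return rooms, load

rooms, load = plan(waiting_list, 2)
utilization = sum(load) / (2 * SESSION_MIN)
print(rooms, load, round(utilization, 2))
```

A real formulation would additionally handle surgeon and equipment constraints, waiting-list priorities and the stochastic variability of operating times.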
Automatic Conversion of a Conceptual Model to a Standard Multi-view Web Services Definition
Directory of Open Access Journals (Sweden)
Anass Misbah
2018-03-01
Full Text Available Information systems are becoming more and more heterogeneous, and with this comes the need for more generic transformation algorithms and more automatic generation meta-rules. In fact, the large number of terminals, devices, operating systems, platforms and environments requires a high level of adaptation. Therefore, it is becoming more and more difficult to validate, generate and implement models, designs and codes manually. Web services are one of the technologies that are used massively nowadays; hence, they are among the technologies that most require automatic rules for validation and automation. Many previous works have dealt with Web services by proposing new concepts such as Multi-view Web services, standard WSDL implementation of Multi-view Web services and, further, generic meta-rules for automatic generation of Multi-view Web services. In this work we propose a new way of generating Multi-view Web services, based on an engine algorithm that takes as input both an initial Conceptual Model and a user's matrix and then unrolls a generic algorithm to dynamically generate a validated set of points of view. This set of points of view is then transformed into a standard WSDL implementation of Multi-view Web services by means of the automatic transformation meta-rules.
Leach, Colin Wayne; van Zomeren, Martijn; Zebel, Sven; Vliek, Michael L W; Pennekamp, Sjoerd F; Doosje, Bertjan; Ouwerkerk, Jaap W; Spears, Russell
2008-07-01
Recent research shows individuals' identification with in-groups to be psychologically important and socially consequential. However, there is little agreement about how identification should be conceptualized or measured. On the basis of previous work, the authors identified 5 specific components of in-group identification and offered a hierarchical 2-dimensional model within which these components are organized. Studies 1 and 2 used confirmatory factor analysis to validate the proposed model of self-definition (individual self-stereotyping, in-group homogeneity) and self-investment (solidarity, satisfaction, and centrality) dimensions, across 3 different group identities. Studies 3 and 4 demonstrated the construct validity of the 5 components by examining their (concurrent) correlations with established measures of in-group identification. Studies 5-7 demonstrated the predictive and discriminant validity of the 5 components by examining their (prospective) prediction of individuals' orientation to, and emotions about, real intergroup relations. Together, these studies illustrate the conceptual and empirical value of a hierarchical multicomponent model of in-group identification.
Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun
2017-07-01
In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy-four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5-fold cross validation, with the area under the receiver operating characteristic curve (AUC) = 0.65 (p = 0.004), 0.73 (p = 0.026), and 0.66 (p = 0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC = 0.68 (p = 0.029), whereas the models for ACM and DM did not reach statistical significance but showed predictive power comparable to the 5-fold cross validation, with AUC = 0.60 (p = 0.092) and 0.65 (p = 0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated with mortality, LF, and DM on 5-fold cross validation in a relatively uniform single-institution cohort. The LF model also retained
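The reported AUC values can be understood via the rank-based (Mann-Whitney) formulation of the area under the ROC curve; a minimal, self-contained sketch with toy labels and scores (not the study's data):

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 3 events (label 1) and 3 non-events (label 0).
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.6, 0.4, 0.2]
print(auc(labels, scores))
```

An AUC of 0.5 corresponds to chance; values such as the study's 0.65-0.73 indicate moderate discriminative power.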
Semantic Building Information Modeling and high definition surveys for Cultural Heritage sites
Directory of Open Access Journals (Sweden)
Simone Garagnani
2012-11-01
Full Text Available In recent years, digital technology devoted to building design has experienced significant advancements, making it possible to reach, by means of Building Information Modeling, goals only imagined since the mid-Seventies of the last century. The BIM process, which brings several advantages to the actors and designers who implement it in their workflow, can be employed even in various case studies related to interventions on the existing architectural Cultural Heritage. The semantics typical of classical architecture, so pervasive in the European urban landscape, as well as the features of Modern or Contemporary architecture, map well onto the self-conscious structure made of “smart objects” proper to BIM, which proves to be an effective system for documenting component relationships. However, the translation of existing buildings' geometric information, acquired using the common techniques of laser scanning and digital photogrammetry, into BIM objects is still a critical process, which this paper aims to investigate, describing possible methods and approaches.
Definition of a 5MW/61.5m wind turbine blade reference model.
Energy Technology Data Exchange (ETDEWEB)
Resor, Brian Ray
2013-04-01
A basic structural concept of the blade design that is associated with the frequently utilized “NREL offshore 5-MW baseline wind turbine” is needed for studies involving blade structural design and blade structural design tools. The blade structural design documented in this report represents a concept that meets basic design criteria set forth by IEC standards for the onshore turbine. The design documented in this report is not a fully vetted blade design which is ready for manufacture. The intent of the structural concept described by this report is to provide a good starting point for more detailed and targeted investigations such as blade design optimization, blade design tool verification, blade materials and structures investigations, and blade design standards evaluation. This report documents the information used to create the current model as well as the analyses used to verify that the blade structural performance meets reasonable blade design criteria.
A Mediated Definite Delegation Model allowing for Certified Grid Job Submission
Schreiner, Steffen; Grigoras, Costin; Litmaath, Maarten
2012-01-01
Grid computing infrastructures need to provide traceability and accounting of their users' activity and protection against misuse and privilege escalation. A central aspect of multi-user Grid job environments is the necessary delegation of privileges in the course of a job submission. With respect to these generic requirements, this document describes an improved handling of multi-user Grid jobs in the ALICE ("A Large Ion Collider Experiment") Grid Services. A security analysis of the ALICE Grid job model is presented with derived security objectives, followed by a discussion of existing approaches to unrestricted delegation based on X.509 proxy certificates and the Grid middleware gLExec. Unrestricted delegation has severe security consequences and limitations, most importantly allowing for identity theft and forgery of delegated assignments. These limitations are discussed and formulated, both in general and with respect to an adoption in line with multi-user Grid jobs. Based on the architecture of the ALICE...
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
DEFF Research Database (Denmark)
Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens
2011-01-01
term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence...
A univocal definition of the neuronal soma morphology using Gaussian mixture models
Directory of Open Access Journals (Sweden)
Sergio eLuengo-Sanchez
2015-11-01
Full Text Available The definition of the soma is fuzzy, as there is no clear line demarcating the soma of the labeled neurons and the origin of the dendrites and axon. Thus, the morphometric analysis of the neuronal soma is highly subjective. In this paper, we provide a mathematical definition and an automatic segmentation method to delimit the neuronal soma. We applied this method to the characterization of pyramidal cells, which are the most abundant neurons in the cerebral cortex. Since there are no benchmarks with which to compare the proposed procedure, we validated the goodness of this automatic segmentation method against manual segmentation by experts in neuroanatomy to set up a framework for comparison. We concluded that there were no significant differences between automatically and manually segmented somata, i.e., the proposed procedure segments the neurons more or less as an expert does. It also provides univocal, justifiable and objective cutoffs. Thus, this study is a means of characterizing pyramidal neurons in order to objectively compare the morphometry of the somata of these neurons in different cortical areas and species.
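The role of the Gaussian mixture model can be illustrated with a minimal 1-D sketch (synthetic data and a stdlib EM fit only; the actual method operates on 3-D soma morphology). Once the two components are fitted, an objective, univocal cutoff can be placed where their posterior probabilities are equal:

```python
import math, random

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def fit_gmm2(xs, iters=200):
    """EM for a two-component 1-D Gaussian mixture (illustration only)."""
    mu = [min(xs), max(xs)]   # crude initialization at the data extremes
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point.
        r0 = []
        for x in xs:
            p0 = w[0] * norm_pdf(x, mu[0], sd[0])
            p1 = w[1] * norm_pdf(x, mu[1], sd[1])
            r0.append(p0 / (p0 + p1))
        # M-step: re-estimate weights, means and standard deviations.
        n0 = sum(r0)
        n1 = len(xs) - n0
        w = [n0 / len(xs), n1 / len(xs)]
        mu = [sum(r * x for r, x in zip(r0, xs)) / n0,
              sum((1 - r) * x for r, x in zip(r0, xs)) / n1]
        sd = [max(1e-3, math.sqrt(sum(r * (x - mu[0]) ** 2
                                      for r, x in zip(r0, xs)) / n0)),
              max(1e-3, math.sqrt(sum((1 - r) * (x - mu[1]) ** 2
                                      for r, x in zip(r0, xs)) / n1))]
    return w, mu, sd

random.seed(0)
# Synthetic radial distances from two populations, e.g. "soma surface"
# vs. "dendrite origin" (purely illustrative values).
xs = [random.gauss(5, 1) for _ in range(200)] + \
     [random.gauss(15, 2) for _ in range(200)]
w, mu, sd = fit_gmm2(xs)
print([round(m, 1) for m in mu])
```

In the real setting, the equal-posterior surface of the fitted mixture yields the justifiable, observer-independent soma boundary described in the abstract.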
A univocal definition of the neuronal soma morphology using Gaussian mixture models
Luengo-Sanchez, Sergio; Bielza, Concha; Benavides-Piccione, Ruth; Fernaud-Espinosa, Isabel; DeFelipe, Javier; Larrañaga, Pedro
2015-01-01
The definition of the soma is fuzzy, as there is no clear line demarcating the soma of the labeled neurons and the origin of the dendrites and axon. Thus, the morphometric analysis of the neuronal soma is highly subjective. In this paper, we provide a mathematical definition and an automatic segmentation method to delimit the neuronal soma. We applied this method to the characterization of pyramidal cells, which are the most abundant neurons in the cerebral cortex. Since there are no benchmarks with which to compare the proposed procedure, we validated the goodness of this automatic segmentation method against manual segmentation by neuroanatomists to set up a framework for comparison. We concluded that there were no significant differences between automatically and manually segmented somata, i.e., the proposed procedure segments the neurons similarly to how a neuroanatomist does. It also provides univocal, justifiable and objective cutoffs. Thus, this study is a means of characterizing pyramidal neurons in order to objectively compare the morphometry of the somata of these neurons in different cortical areas and species. PMID:26578898
Universality Conjecture and Results for a Model of Several Coupled Positive-Definite Matrices
Bertola, Marco; Bothner, Thomas
2015-08-01
The paper contains two main parts: in the first part, we analyze the general case of matrices coupled in a chain subject to Cauchy interaction. Similarly to the Itzykson-Zuber interaction model, the eigenvalues of the Cauchy chain form a multi-level determinantal point process. We first compute all correlation functions in terms of Cauchy biorthogonal polynomials and locate them as specific entries of a matrix-valued solution of a Riemann-Hilbert problem. In the second part, we fix the external potentials as classical Laguerre weights. We then derive strong asymptotics for the Cauchy biorthogonal polynomials when the support of the equilibrium measures contains the origin. As a result, we obtain a new family of universality classes for multi-level random determinantal point fields, which include the Bessel universality for 1 level and the Meijer-G universality for 2 levels. Our analysis uses the Deift-Zhou nonlinear steepest descent method and the explicit construction of an origin parametrix in terms of Meijer G-functions. The solution of the full Riemann-Hilbert problem is derived rigorously only for p = 3, but the general framework of the proof can be extended to the Cauchy chain of arbitrary length p.
Definition, modeling and simulation of a grid computing system for high throughput computing
Caron, E; Tsaregorodtsev, A Yu
2006-01-01
In this paper, we study and compare grid and global computing systems and outline the benefits of having a hybrid system called dirac. To evaluate the dirac scheduling for high throughput computing, a new model is presented and a simulator was developed for many clusters of heterogeneous nodes belonging to a local network. These clusters are assumed to be connected to each other through a global network, and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. Next, we do the comparison with a real batch system and we obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and describes well the behaviour of a large-scale system. Thus we can study the scheduling of our system called dirac in a high throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralize...
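The analytical side of such a validation can be sketched for the M/M/4 case: the mean response time of an M/M/c queue follows from the Erlang C delay probability. The sketch below uses the generic queueing formulas with invented rates, not the dirac simulator itself:

```python
import math

def mmc_mean_response(lam, mu, c):
    """Mean response time (waiting + service) of an M/M/c queue,
    computed via the Erlang C probability that an arrival must wait."""
    rho = lam / (c * mu)
    assert rho < 1, "queue must be stable"
    a = lam / mu  # offered load in Erlangs
    tail = a ** c / (math.factorial(c) * (1 - rho))
    p0_inv = sum(a ** k / math.factorial(k) for k in range(c)) + tail
    erlang_c = tail / p0_inv          # probability of waiting
    wait = erlang_c / (c * mu - lam)  # mean waiting time in queue
    return wait + 1 / mu              # add mean service time

# Example: 4 servers, service rate 1 job/min, arrivals at 3 jobs/min.
print(round(mmc_mean_response(3.0, 1.0, 4), 3))
```

Comparing such closed-form values against seeded simulation runs is exactly the kind of sanity check the abstract describes before moving to the real batch system.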
DEFF Research Database (Denmark)
Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M
2018-01-01
INTRODUCTION: The aim of the study was to build a model of first failure site and lesion-specific failure probability after definitive chemo-radiotherapy for inoperable non-small cell lung cancer (NSCLC). METHODS: We retrospectively analyzed 251 patients receiving definitive chemo… …-regional failure, multivariable logistic regression was applied to assess the risk of each lesion being the first site of failure. The two models were used in combination to predict lesion failure probability accounting for competing events. RESULTS: Adenocarcinoma had a lower hazard ratio (HR) of loco-regional (LR…
FORECASTING MODELS IN MANAGEMENT
Sindelar, Jiri
2008-01-01
This article deals with the problems of forecasting models. The first part of the article is dedicated to the definition of the relevant areas (the vertical and horizontal pillars of the definition), and then the forecasting model itself is defined; as the article presents the theoretical background for further primary research, this definition is crucial. Finally, the position of forecasting models within the management system is identified. The paper is a part of the outputs of FEM CULS grant no. 1312/11/3121.
Chen, Denise
2009-01-01
Currently, the rules for defining tool kits vary and are oriented mostly toward the engineer's perspective. However, the definition of a tool kit is a trade-off problem between cost and service performance. This project is designed to develop a model that can integrate the engineer's preferences
Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra
2014-02-22
The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and
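The notion of a non-canonical splice junction can be illustrated by inspecting the intron boundary dinucleotides: GT..AG is the canonical donor-acceptor pair, with GC..AG and AT..AC as well-known minor variants. The intron sequences below are invented examples, not E. huxleyi data:

```python
# Canonical and common minor splice-site dinucleotide pairs.
CANONICAL = {("GT", "AG")}
MINOR = {("GC", "AG"), ("AT", "AC")}

def classify_intron(intron_seq):
    """Classify an intron by its first and last two bases."""
    donor = intron_seq[:2].upper()
    acceptor = intron_seq[-2:].upper()
    if (donor, acceptor) in CANONICAL:
        return "canonical"
    if (donor, acceptor) in MINOR:
        return "minor"
    return "non-canonical"

introns = ["GTAAGTCTGCAG", "GCAAGTCTGCAG", "CTAAGTCTGCAC"]
print([classify_intron(i) for i in introns])
```

A transcriptome with an unusually high fraction of "non-canonical" calls, as reported here, confounds aligners and assemblers whose splice models assume the canonical pair.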
International Nuclear Information System (INIS)
Kainz, Wolfgang; Christ, Andreas; Kellom, Tocher; Seidman, Seth; Nikoloski, Neviana; Beard, Brian; Kuster, Niels
2005-01-01
This paper presents new definitions for obtaining reproducible results in numerical phone dosimetry. Numerous numerical dosimetric studies have been published about the exposure of mobile phone users, which reached conflicting conclusions. However, many of these studies lack reproducibility due to shortcomings in the description of the phone positioning. The new approach was tested by two groups applying two different numerical program packages to compare the specific anthropomorphic mannequin (SAM) to 14 anatomically correct head models. A novel definition for the positioning of mobile phones next to anatomically correct head models is given along with other essential parameters to be reported. The definition is based solely on anatomical characteristics of the head. A simple up-to-date phone model was used to determine the peak spatial specific absorption rate (SAR) of mobile phones in SAM and in the anatomically correct head models. The results were validated by measurements. The study clearly shows that SAM gives a conservative estimate of the exposure in anatomically correct head models for head-only tissue. Depending on frequency, phone position and head size, the numerically calculated 10 g averaged SAR in the pinna can be up to 2.1 times greater than the peak spatial SAR in SAM. Measurements in small structures, such as the pinna, will significantly increase the uncertainty; therefore SAM was designed for SAR assessment in the head only. Whether SAM provides a conservative value for the pinna depends on the pinna SAR limit of the safety standard considered.
Bartlett, Marcus A.; Liang, Tao; Pu, Liang; Schaefer, Henry F.; Allen, Wesley D.
2018-03-01
The n-propyl + O2 reaction is an important model of chain branching reactions in larger combustion systems. In this work, focal point analyses (FPAs) extrapolating to the ab initio limit were performed on the n-propyl + O2 system based on explicit quantum chemical computations with electron correlation treatments through coupled cluster single, double, triple, and perturbative quadruple excitations [CCSDT(Q)] and basis sets up to cc-pV5Z. All reaction species and transition states were fully optimized at the rigorous CCSD(T)/cc-pVTZ level of theory, revealing some substantial differences in comparison to the density functional theory geometries existing in the literature. A mixed Hessian methodology was implemented and benchmarked that essentially makes the computations of CCSD(T)/cc-pVTZ vibrational frequencies feasible and thus provides critical improvements to zero-point vibrational energies for the n-propyl + O2 system. Two key stationary points, n-propylperoxy radical (MIN1) and its concerted elimination transition state (TS1), were located 32.7 kcal mol-1 and 2.4 kcal mol-1 below the reactants, respectively. Two competitive β-hydrogen transfer transition states (TS2 and TS2') were found separated by only 0.16 kcal mol-1, a fact unrecognized in the current combustion literature. Incorporating TS2' in master equation (ME) kinetic models might reduce the large discrepancy of 2.5 kcal mol-1 between FPA and ME barrier heights for TS2. TS2 exhibits an anomalously large diagonal Born-Oppenheimer correction (ΔDBOC = 1.71 kcal mol-1), which is indicative of a nearby surface crossing and possible nonadiabatic reaction dynamics. The first systematic conformational search of three hydroperoxypropyl (QOOH) intermediates was completed, uncovering a total of 32 rotamers lying within 1.6 kcal mol-1 of their respective lowest-energy minima. Our definitive energetics for stationary points on the n-propyl + O2 potential energy surface provide key benchmarks for future studies
Directory of Open Access Journals (Sweden)
P. Grimaldi
2012-07-01
Full Text Available Stereometric modelling means modelling achieved with: – the use of a pair of virtual cameras, with parallel axes, positioned at a mutual distance of on average 1/10 of the camera-object distance (in practice, the realization and use of a stereometric camera in the modelling program); – the visualization of the shot in two distinct windows; – the stereoscopic viewing of the shot while modelling. Since the definition "3D vision" is inaccurately used to refer to the simple perspective of an object, the word stereo must be added, so that "3D stereo vision" stands for a true three-dimensional view in which the width, height and depth of the surveyed image can be measured. This is achieved through the development of a stereometric model, either real or virtual, by the "materialization", either real or virtual, of the optical stereometric model, made visible with a stereoscope. A continuous online updating of the cultural heritage record is feasible with the help of photogrammetry and stereometric modelling. The catalogue of the Architectonic Photogrammetry Laboratory of Politecnico di Bari is available online at: http://rappresentazione.stereofot.it:591/StereoFot/FMPro?-db=StereoFot.fp5&-lay=Scheda&-format=cerca.htm&-view
Freeman, Thomas J.
This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…
ten Cate, Jacob M
2015-01-01
Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine which lead to dental caries. It aimed to deepen our understanding of the fluoride mode of action and was also utilized for the formulation of oral care products. In addition, we made use of intra-oral (in situ) models to study other features of the oral environment that drive the de/remineralization balance in individual patients. This model addressed basic questions, such as how enamel and dentine are affected by challenges in the oral cavity, as well as practical issues related to fluoride toothpaste efficacy. The observation that fluoride is perhaps not sufficiently potent to reduce dental caries in present-day society triggered us to expand our knowledge of the bacterial aetiology of dental caries. For this we developed the Amsterdam Active Attachment biofilm model. Different from studies on planktonic ('single') bacteria, this biofilm model captures bacteria in a habitat similar to dental plaque. With data from the combination of these models, it should be possible to study the separate processes which together may lead to dental caries, and to evaluate products and novel agents that interfere with either of the processes. With these separate models in place, a suggestion is made to design computer models encompassing the available information. Models, but also role models, are of the utmost importance in advancing and guiding research and researchers. © 2015 S. Karger AG, Basel
DEFF Research Database (Denmark)
Damkjær, Sidsel; Thomsen, Jakob B; Petersen, Svetlana I
2017-01-01
prescribed the same PTV mean dose. Rectal NTCP grade ≥2 was evaluated with the Lyman-Kutcher-Burman model and TCP was estimated by a logistic model using the combined MRI positive volume in SV and prostate as region-of-interest. RESULTS: Fourteen of twenty-one patients were classified as MRI positive, six...
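The abstract above is truncated, but the rectal NTCP evaluation it mentions uses the Lyman-Kutcher-Burman model: a generalized equivalent uniform dose (gEUD) is computed from the dose-volume histogram and passed through a probit function. A minimal sketch; the parameter values (TD50, m, n) and the DVH are placeholders, not the study's fitted values:

```python
import math

def geud(doses, volumes, n):
    """Generalized EUD from a dose-volume histogram (doses in Gy,
    fractional volumes summing to 1); n is the volume-effect parameter."""
    a = 1.0 / n
    return sum(v * d**a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP: probit of (gEUD - TD50) / (m * TD50)."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Placeholder rectal DVH and parameters (illustrative only).
doses = [70.0, 50.0, 30.0]     # Gy
volumes = [0.2, 0.3, 0.5]      # fractional volumes
p = lkb_ntcp(doses, volumes, td50=76.9, m=0.13, n=0.09)
```

With a small n the gEUD is dominated by the hottest DVH bins, reflecting the serial behaviour usually assumed for the rectum.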
Hsp90 inhibitors, part 1: definition of 3-D QSAutogrid/R models as a tool for virtual screening.
Ballante, Flavio; Caroli, Antonia; Wickersham, Richard B; Ragno, Rino
2014-03-24
The multichaperone heat shock protein (Hsp) 90 complex mediates the maturation and stability of a variety of oncogenic signaling proteins. For this reason, Hsp90 has emerged as a promising target for anticancer drug development. Herein, we describe a complete computational procedure for building several 3-D QSAR models used as the ligand-based (LB) component of a comprehensive LB and structure-based (SB) virtual screening (VS) protocol to identify novel molecular scaffolds of Hsp90 inhibitors. By application of the 3-D QSAutogrid/R method, eight SB PLS 3-D QSAR models were generated, leading to a final multiprobe (MP) 3-D QSAR pharmacophoric model capable of recognizing the most significant chemical features for Hsp90 inhibition. Both the monoprobe and multiprobe models were optimized, cross-validated, and tested against an external test set. The statistical results obtained confirmed the models to be sufficiently robust and predictive for use in a subsequent VS.
DEFF Research Database (Denmark)
Carlson, Kerstin
The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY...
Moghaddasi, L; Bezak, E; Harriss-Phillips, W
2016-05-07
Clinical target volume (CTV) determination may be complex and subjective. In this work a microscopic-scale tumour model was developed to evaluate current CTV practices in glioblastoma multiforme (GBM) external radiotherapy. Previously, a Geant4 cell-based dosimetry model was developed to calculate the dose deposited in individual GBM cells. Microscopic extension probability (MEP) models were then developed using Matlab-2012a. The results of the cell-based dosimetry model and MEP models were combined to calculate survival fractions (SF) for CTV margins of 2.0 and 2.5 cm. In the current work, oxygenation and heterogeneous radiosensitivity profiles were incorporated into the GBM model. The genetic heterogeneity was modelled using a range of α/β values (linear-quadratic model parameters) associated with different GBM cell lines. These values were distributed among the cells randomly, taken from a Gaussian-weighted sample of α/β values. Cellular oxygen pressure was distributed randomly taken from a sample weighted to profiles obtained from literature. Three types of GBM models were analysed: homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic. The SF in different regions of the tumour model and the effect of the CTV margin extension from 2.0-2.5 cm on SFs were investigated for three MEP models. The SF within the beam was increased by up to three and two orders of magnitude following incorporation of heterogeneous radiosensitivities and hypoxia, respectively, in the GBM model. However, the total SF was shown to be overdominated by the presence of tumour cells in the penumbra region and to a lesser extent by genetic heterogeneity and hypoxia. CTV extension by 0.5 cm reduced the SF by a maximum of 78.6 ± 3.3%, 78.5 ± 3.3%, and 77.7 ± 3.1% for homogeneous and heterogeneous-normoxic, and heterogeneous hypoxic GBMs, respectively. Monte-Carlo model was developed to quantitatively evaluate SF for genetically
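The survival-fraction calculation described above rests on the linear-quadratic model with cell-by-cell sampled radiosensitivities. A simplified sketch under stated assumptions (Gaussian-sampled alpha, fixed alpha/beta ratio of 10 Gy, hypoxia modelled crudely as a division of the effective dose by an oxygen enhancement ratio; all parameter values are illustrative, not those of the paper):

```python
import math
import random

random.seed(1)

def lq_survival(dose, alpha, beta):
    """Linear-quadratic cell survival probability for a given dose (Gy)."""
    return math.exp(-(alpha * dose + beta * dose**2))

def tumour_survival_fraction(n_cells, dose, oer=1.0):
    """Mean SF over a cell population with Gaussian-sampled alpha and a
    fixed alpha/beta ratio; hypoxia is approximated by dividing the
    effective dose by an oxygen enhancement ratio (OER)."""
    sf = 0.0
    for _ in range(n_cells):
        alpha = max(1e-3, random.gauss(0.3, 0.05))  # Gy^-1, illustrative
        beta = alpha / 10.0                          # alpha/beta = 10 Gy
        sf += lq_survival(dose / oer, alpha, beta)
    return sf / n_cells

sf_normoxic = tumour_survival_fraction(10000, dose=2.0)
sf_hypoxic = tumour_survival_fraction(10000, dose=2.0, oer=2.5)
```

As in the paper's qualitative finding, the hypoxic population survives a 2 Gy fraction at a substantially higher rate than the normoxic one.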
Butler, T.; Graham, L.; Estep, D.; Dawson, C.; Westerink, J. J.
2015-04-01
The uncertainty in spatially heterogeneous Manning's n fields is quantified using a novel formulation and numerical solution of stochastic inverse problems for physics-based models. The uncertainty is quantified in terms of a probability measure and the physics-based model considered here is the state-of-the-art ADCIRC model although the presented methodology applies to other hydrodynamic models. An accessible overview of the formulation and solution of the stochastic inverse problem in a mathematically rigorous framework based on measure theory is presented. Technical details that arise in practice by applying the framework to determine the Manning's n parameter field in a shallow water equation model used for coastal hydrodynamics are presented and an efficient computational algorithm and open source software package are developed. A new notion of "condition" for the stochastic inverse problem is defined and analyzed as it relates to the computation of probabilities. This notion of condition is investigated to determine effective output quantities of interest of maximum water elevations to use for the inverse problem for the Manning's n parameter and the effect on model predictions is analyzed.
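The measure-theoretic stochastic inverse formulation can be illustrated with a toy one-parameter example: prior samples of Manning's n are pushed through a surrogate map to a maximum water elevation, and an observed density on that output is pulled back by weighting each sample with the ratio of observed to pushforward density. The surrogate map, densities, and all numbers below are invented for illustration:

```python
import math
import random

random.seed(0)

def water_elevation(n):
    """Toy surrogate: maximum water elevation (m) as a decreasing
    function of Manning's roughness n (purely illustrative)."""
    return 3.0 - 20.0 * n

# Prior: uniform samples of Manning's n.
samples = [random.uniform(0.01, 0.09) for _ in range(50000)]
outputs = [water_elevation(n) for n in samples]

# Observed density on the output quantity: Normal(mean=2.0 m, sd=0.1 m).
def obs_pdf(q):
    return math.exp(-0.5 * ((q - 2.0) / 0.1) ** 2) / (0.1 * math.sqrt(2 * math.pi))

# Pushforward density of the prior through the map, estimated by histogram.
lo, hi, nbins = min(outputs), max(outputs), 50
width = (hi - lo) / nbins
counts = [0] * nbins
for q in outputs:
    counts[min(nbins - 1, int((q - lo) / width))] += 1

def push_pdf(q):
    return counts[min(nbins - 1, int((q - lo) / width))] / (len(outputs) * width)

# Update: accept prior samples with probability proportional to
# observed density / pushforward density (rejection sampling).
weights = [obs_pdf(q) / push_pdf(q) for q in outputs]
wmax = max(weights)
posterior = [n for n, w in zip(samples, weights) if random.random() < w / wmax]
post_mean = sum(posterior) / len(posterior)
```

With this surrogate the observed elevation of 2.0 m pulls the solution back to n near 0.05, and the spread of the accepted samples reflects the observed uncertainty, not an assumed prior shape.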
D'Ulivo, Alessandro
2016-05-01
A reaction model describing the reactivity of metal and semimetal species with aqueous tetrahydridoborate (THB) has been drawn up taking into account the mechanism of chemical vapor generation (CVG) of hydrides, recent evidence on the mechanism of interference and formation of byproducts in arsane generation, and other evidence in the field of the synthesis of nanoparticles and catalytic hydrolysis of THB by metal nanoparticles. The new "non-analytical" reaction model is of more general validity than the previously described "analytical" reaction model for CVG. The non-analytical model is valid for the reaction of a single analyte with THB and for conditions approaching those typically encountered in the synthesis of nanoparticles and macroprecipitates. It reduces to the previously proposed analytical model under conditions typically employed in CVG for trace analysis (analyte below the μM level, borane/analyte ≫ 10^3 mol/mol, no interference). The non-analytical reaction model is not able to explain all the interference effects observed in CVG, which can be achieved only by assuming interaction among the species of the reaction pathways of different analytical substrates. The reunification of CVG, the synthesis of nanoparticles by aqueous THB and the catalytic hydrolysis of THB within a common frame contributes to the rationalization of the complex reactivity of aqueous THB with metal and semimetal species.
Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M; Kjær, Andreas; Langer, Seppo W; Aznar, Marianne C; Persson, Gitte F; Bentzen, Søren M
2018-04-01
The aim of the study was to build a model of first failure site- and lesion-specific failure probability after definitive chemoradiotherapy for inoperable NSCLC. We retrospectively analyzed 251 patients receiving definitive chemoradiotherapy for NSCLC at a single institution between 2009 and 2015. All patients were scanned by fludeoxyglucose positron emission tomography/computed tomography for radiotherapy planning. Clinical patient data and fludeoxyglucose positron emission tomography standardized uptake values from primary tumor and nodal lesions were analyzed by using multivariate cause-specific Cox regression. In patients experiencing locoregional failure, multivariable logistic regression was applied to assess risk of each lesion being the first site of failure. The two models were used in combination to predict probability of lesion failure accounting for competing events. Adenocarcinoma had a lower hazard ratio (HR) of locoregional failure than squamous cell carcinoma (HR = 0.45, 95% confidence interval [CI]: 0.26-0.76, p = 0.003). Distant failures were more common in the adenocarcinoma group (HR = 2.21, 95% CI: 1.41-3.48, p failure showed that primary tumors were more likely to fail than lymph nodes (OR = 12.8, 95% CI: 5.10-32.17, p failure (OR = 1.26 per unit increase, 95% CI: 1.12-1.40, p failure site-specific competing risk model based on patient- and lesion-level characteristics. Failure patterns differed between adenocarcinoma and squamous cell carcinoma, illustrating the limitation of aggregating them into NSCLC. Failure site-specific models add complementary information to conventional prognostic models. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
Collett, David
2002-01-01
INTRODUCTION: Some Examples; The Scope of this Book; Use of Statistical Software.
STATISTICAL INFERENCE FOR BINARY DATA: The Binomial Distribution; Inference about the Success Probability; Comparison of Two Proportions; Comparison of Two or More Proportions.
MODELS FOR BINARY AND BINOMIAL DATA: Statistical Modelling; Linear Models; Methods of Estimation; Fitting Linear Models to Binomial Data; Models for Binomial Response Data; The Linear Logistic Model; Fitting the Linear Logistic Model to Binomial Data; Goodness of Fit of a Linear Logistic Model; Comparing Linear Logistic Models; Linear Trend in Proportions; Comparing Stimulus-Response Relationships; Non-Convergence and Overfitting; Some other Goodness of Fit Statistics; Strategy for Model Selection; Predicting a Binary Response Probability.
BIOASSAY AND SOME OTHER APPLICATIONS: The Tolerance Distribution; Estimating an Effective Dose; Relative Potency; Natural Response; Non-Linear Logistic Regression Models; Applications of the Complementary Log-Log Model.
MODEL CHECKING: Definition of Re...
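The fitting of the linear logistic model to binomial data listed in the contents can be sketched with a small Newton-Raphson (IRLS) routine; the dose-response data below are hypothetical:

```python
import math

def fit_logistic(doses, n_trials, n_events, iters=25):
    """Fit the linear logistic model logit(p) = b0 + b1*x to grouped
    binomial data by Newton-Raphson on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        # Score vector (u0, u1) and 2x2 observed information matrix.
        u0 = u1 = i00 = i01 = i11 = 0.0
        for x, n, y in zip(doses, n_trials, n_events):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = n * p * (1.0 - p)
            u0 += y - n * p
            u1 += x * (y - n * p)
            i00 += w
            i01 += x * w
            i11 += x * x * w
        det = i00 * i11 - i01 * i01
        b0 += (i11 * u0 - i01 * u1) / det
        b1 += (-i01 * u0 + i00 * u1) / det
    return b0, b1

# Hypothetical bioassay data: dose, group size, number responding.
doses = [1.0, 2.0, 3.0, 4.0, 5.0]
n_trials = [40, 40, 40, 40, 40]
n_events = [4, 10, 20, 30, 36]
b0, b1 = fit_logistic(doses, n_trials, n_events)
ed50 = -b0 / b1  # dose giving a 50% response probability
```

The ED50 here is simply the dose at which the fitted linear predictor crosses zero, the "effective dose" estimate discussed in the bioassay chapter.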
Herrera-Vega, Javier; Montero-Hernández, Samuel; Tachtsidis, Ilias; Treviño-Palacios, Carlos G.; Orihuela-Espina, Felipe
2017-11-01
Accurate estimation of brain haemodynamics parameters such as cerebral blood flow and volume, as well as oxygen consumption (metabolic rate of oxygen), with functional near infrared spectroscopy (fNIRS) requires precise characterization of light propagation through head tissues. An anatomically realistic forward model of the human adult head with unprecedentedly detailed specification of the 5 scalp sublayers, to account for blood irrigation in the connective tissue layer, is introduced. The full model consists of 9 layers, accounts for optical properties ranging from 750 nm to 950 nm and has a voxel size of 0.5 mm. The whole model is validated by comparing the predicted remitted spectra, using Monte Carlo simulations of radiation propagation with 10^8 photons, against continuous wave (CW) broadband fNIRS experimental data. As the true oxy- and deoxy-haemoglobin concentrations during acquisition are unknown, a genetic algorithm searched for the vector of parameters that generates a modelled spectrum optimally fitting the experimental spectrum. Differences between experimental and model-predicted spectra were quantified using the root mean square error (RMSE). RMSE was 0.071 +/- 0.004, 0.108 +/- 0.018 and 0.235 +/- 0.015 at 1, 2 and 3 cm interoptode distance, respectively. The parameter vector of absolute concentrations of haemoglobin species in scalp and cortex retrieved with the genetic algorithm was within histologically plausible ranges. The new model's capability to estimate the contribution of the scalp blood flow shall permit incorporating this information into the regularization of the inverse problem for a cleaner reconstruction of brain haemodynamics.
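The genetic-algorithm spectral fit described above can be sketched in miniature: a toy two-parameter attenuation curve stands in for the nine-layer head model, and a simple elitist GA minimizes the RMSE against a synthetic "experimental" spectrum. Everything here (model form, parameter values, GA settings) is invented for illustration:

```python
import math
import random

random.seed(7)

def model_spectrum(params, wavelengths):
    """Toy two-parameter spectrum model standing in for the full
    layered-head forward model (purely illustrative)."""
    c1, c2 = params
    return [c1 * math.exp(-c2 * (w - 750) / 200.0) for w in wavelengths]

def rmse(a, b):
    """Root mean square error between two spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

wavelengths = list(range(750, 951, 10))
target = model_spectrum((0.8, 1.5), wavelengths)  # synthetic "experiment"

# Minimal elitist genetic algorithm: keep the 10 best, refill the
# population with Gaussian mutations of randomly chosen elites.
pop = [(random.uniform(0, 2), random.uniform(0, 3)) for _ in range(40)]
for _ in range(60):
    scored = sorted(pop, key=lambda p: rmse(model_spectrum(p, wavelengths), target))
    elite = scored[:10]
    pop = elite + [
        (max(0.0, random.choice(elite)[0] + random.gauss(0, 0.05)),
         max(0.0, random.choice(elite)[1] + random.gauss(0, 0.05)))
        for _ in range(30)
    ]
best = min(pop, key=lambda p: rmse(model_spectrum(p, wavelengths), target))
best_err = rmse(model_spectrum(best, wavelengths), target)
```

In the actual study the search space is the vector of haemoglobin concentrations and the forward evaluations are Monte Carlo photon simulations; only the optimization loop is analogous.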
Product models for the Construction industry
DEFF Research Database (Denmark)
Sørensen, Lars Schiøtt
1996-01-01
Different types of product models for the building sector were elaborated and grouped, and some discussion of the different models was given. A "definition" of product models was also given.
Nazemi, A.; Wheater, H. S.
2015-01-01
Human activities have caused various changes to the Earth system, and hence the interconnections between human activities and the Earth system should be recognized and reflected in models that simulate Earth system processes. One key anthropogenic activity is water resource management, which determines the dynamics of human-water interactions in time and space and controls human livelihoods and economy, including energy and food production. There are immediate needs to include water resource management in Earth system models. First, the extent of human water requirements is increasing rapidly at the global scale and it is crucial to analyze the possible imbalance between water demands and supply under various scenarios of climate change and across various temporal and spatial scales. Second, recent observations show that human-water interactions, manifested through water resource management, can substantially alter the terrestrial water cycle, affect land-atmospheric feedbacks and may further interact with climate and contribute to sea-level change. Due to the importance of water resource management in determining the future of the global water and climate cycles, the World Climate Research Programme's Global Energy and Water Exchanges project (WCRP-GEWEX) has recently identified gaps in describing human-water interactions as one of the grand challenges in Earth system modeling (GEWEX, 2012). Here, we divide water resource management into two interdependent elements, related firstly to water demand and secondly to water supply and allocation. In this paper, we survey the current literature on how various components of water demand have been included in large-scale models, in particular land surface and global hydrological models. Issues of water supply and allocation are addressed in a companion paper. The available algorithms to represent the dominant demands are classified based on the demand type, mode of simulation and underlying modeling assumptions. We discuss
2013-05-28
... Model EMB-550 airplane is the first of a new family of jet airplanes designed for corporate flight... inadvertent overspeed conditions as well. Section 25.335(b)(1) is intended as a conservative enveloping... or conservative aerodynamic data are used. (a) From an initial condition of stabilized flight at V C...
2013-01-24
.... The Model EMB-550 airplane is the first of a new family of jet airplanes designed for corporate flight... inadvertent overspeed conditions as well. Section 25.335(b)(1) is intended as a conservative enveloping... or conservative aerodynamic data are used. (a) From an initial condition of stabilized flight at V C...
Meijerink, Arjan; Molisch, Andreas F.
2014-01-01
The physical motivation and interpretation of the stochastic propagation channel model of Saleh and Valenzuela are discussed in detail. This motivation mainly relies on assumptions on the stochastic properties of the positions of transmitter, receiver and scatterers in the propagation environment,
Energy Technology Data Exchange (ETDEWEB)
Grousson, F.
2000-10-03
This study was carried out to improve direct-injection engine control using internal model control strategies. Its aim is to optimise engine performance and to decrease polluting emissions through better dynamic control. The use of internal model control brings robustness against engine parameter dispersion and allows great improvements in control calibration thanks to a shorter tuning time. The first part gives the outlines of thermal engine operation and focuses on modelling with the final control objective in view. The second part tackles the implementation of the regulation algorithms. First, the air-path control combines state-feedback linearization with predictive control. Second, torque control of the driver's requests is performed by static inversion using the Jacobian matrix. Finally, a simplified predictive control makes it possible to solve idle-speed regulation problems. The last part is devoted to real-time and fast-prototyping tests. The main simulation results have been validated through experimental tests on a direct-injection car. (author)
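The internal-model-control idea summarized above can be sketched for a first-order discrete plant: the controller inverts the internal model through a first-order robustness filter and feeds back only the plant-model mismatch, which yields offset-free tracking despite deliberate model error. The plant and model coefficients below are invented, not engine data:

```python
def simulate_imc(setpoint=1.0, steps=200, lam=0.7):
    """Internal model control (IMC) of a first-order discrete plant
    y[k+1] = a*y[k] + b*u[k]. The controller is the model inverse
    cascaded with a first-order filter (pole lam); the plant/model
    output mismatch is fed back, giving offset-free tracking."""
    a_p, b_p = 0.90, 0.10   # "true" plant
    a_m, b_m = 0.85, 0.12   # internal model (deliberately mismatched)
    y = ym = 0.0
    for _ in range(steps):
        e = setpoint - (y - ym)               # feedback of the mismatch only
        ym_next = lam * ym + (1.0 - lam) * e  # filtered target for the model
        u = (ym_next - a_m * ym) / b_m        # invert the model dynamics
        y = a_p * y + b_p * u                 # true plant update
        ym = ym_next                          # internal model update (exact)
    return y

y_final = simulate_imc()
```

The filter pole lam is the single tuning knob: closer to 1 gives a slower but more robust loop, which is the calibration simplification the abstract alludes to.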
Sapriza-Azuri, Gonzalo; Gamazo, Pablo; Razavi, Saman; Wheater, Howard S.
2018-06-01
Arctic and subarctic regions are amongst the most susceptible regions on Earth to global warming and climate change. Understanding and predicting the impact of climate change in these regions require a proper process representation of the interactions between climate, carbon cycle, and hydrology in Earth system models. This study focuses on land surface models (LSMs) that represent the lower boundary condition of general circulation models (GCMs) and regional climate models (RCMs), which simulate climate change evolution at the global and regional scales, respectively. LSMs typically utilize a standard soil configuration with a depth of no more than 4 m, whereas for cold, permafrost regions, field experiments show that attention to deep soil profiles is needed to understand and close the water and energy balances, which are tightly coupled through the phase change. To address this gap, we design and run a series of model experiments with a one-dimensional LSM, called CLASS (Canadian Land Surface Scheme), as embedded in the MESH (Modélisation Environmentale Communautaire - Surface and Hydrology) modelling system, to (1) characterize the effect of soil profile depth under different climate conditions and in the presence of parameter uncertainty; (2) assess the effect of including or excluding the geothermal flux in the LSM at the bottom of the soil column; and (3) develop a methodology for temperature profile initialization in permafrost regions, where the system has an extended memory, by the use of paleo-records and bootstrapping. Our study area is in Norman Wells, Northwest Territories of Canada, where measurements of soil temperature profiles and historical reconstructed climate data are available. Our results demonstrate a dominant role for parameter uncertainty, that is often neglected in LSMs. Considering such high sensitivity to parameter values and dependency on the climate condition, we show that a minimum depth of 20 m is essential to adequately represent
Anaïs Schaeffer
2012-01-01
By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. Figure: average transverse momentum (pT) as a function of rapidity loss Δy; black dots represent LHCf data and red diamonds represent SPS experiment UA7 results; the predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99); among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m either side of the ATLAS interaction point. "The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known …
International Nuclear Information System (INIS)
Alsaed, A.
2004-01-01
The "Disposal Criticality Analysis Methodology Topical Report" (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, "Models", in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, "Design Calculations and Analyses". The "Criticality Model" is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. The criticality
Directory of Open Access Journals (Sweden)
Koen Degeling
2017-12-01
Full Text Available Abstract Background Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Methods Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case-study. Results Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Conclusions Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
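The non-parametric bootstrap approach recommended above can be sketched for a single time-to-event distribution; the exponential choice and the data are illustrative, not from the study:

```python
import random
import statistics

random.seed(42)

# Hypothetical individual-patient time-to-event data (months).
true_rate = 0.10
times = [random.expovariate(true_rate) for _ in range(200)]

def fit_exponential(sample):
    """MLE of the exponential rate is 1 / mean event time."""
    return 1.0 / (sum(sample) / len(sample))

# Non-parametric bootstrap: refit the distribution on resampled data,
# so parameter uncertainty is carried by the collection of refits.
boot_rates = []
for _ in range(1000):
    resample = [random.choice(times) for _ in times]
    boot_rates.append(fit_exponential(resample))

point = fit_exponential(times)        # point estimate of the rate
spread = statistics.stdev(boot_rates)  # parameter uncertainty around it
```

In a patient-level simulation, each probabilistic sensitivity analysis iteration would then sample event times using one bootstrapped rate, so that parameter uncertainty propagates into the stochastic sampling rather than being fixed at the point estimate.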
Energy Technology Data Exchange (ETDEWEB)
Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-13
These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
DEFF Research Database (Denmark)
Beauquier, Maxime; Schürmann, Carsten
2011-01-01
In this paper, we present a model based on relations for bigraphical reactive systems [Milner09]. Its defining characteristic is that validity and reaction relations are captured as traces in a multi-set rewriting system. The relational model is derived from Milner's graphical definition...
International Nuclear Information System (INIS)
Ellis, R.J.
2000-01-01
The US Department of Energy (USDOE) has contracted with Duke Engineering and Services, Cogema, Inc., and Stone and Webster (DCS) to provide mixed-oxide (MOX) fuel fabrication and reactor irradiation services in support of USDOE's mission to dispose of surplus weapons-grade plutonium. The nuclear station units currently identified as mission reactors for this project are Catawba Units 1 and 2 and McGuire Units 1 and 2. This report is specific to Catawba Nuclear Station Units 1 and 2, but the details and materials for the McGuire reactors are very similar. The purpose of this document is to present a complete set of data about the reactor materials and components to be used in modeling the Catawba reactors to predict reactor physics parameters for the Catawba site. Except where noted, Duke Power Company or DCS documents are the sources of these data. These data are being used with the ORNL computer code models of the DCS Catawba (and McGuire) pressurized-water reactors
Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya
2017-05-01
Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
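A minimal sketch of the kind of provenance capture that "Thoughtflow" argues for: each analysis step records its inputs, outputs, and rationale in an auditable trail. This is not the DDMoRe standard or its prototype software; the class, field names, and hashing scheme are assumptions made for illustration only:

```python
import datetime
import hashlib
import json

def fingerprint(obj):
    """Content hash so any later change to data or settings is detectable."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

class ProvenanceLog:
    """Minimal audit trail: each step records inputs, outputs, and rationale."""
    def __init__(self):
        self.steps = []

    def record(self, name, inputs, outputs, rationale):
        self.steps.append({
            "step": name,
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "inputs": fingerprint(inputs),     # hash, not the raw data
            "outputs": fingerprint(outputs),
            "rationale": rationale,            # assumptions and decisions
        })

log = ProvenanceLog()
log.record("fit_base_model",
           inputs={"dataset": "pk_data_v2", "model": "one_compartment"},
           outputs={"ofv": 1234.5},
           rationale="Starting point per analysis plan; assumption: linear CL.")
print(len(log.steps), log.steps[0]["step"])
```

Hashing inputs and outputs rather than storing them keeps the trail light while still making any undocumented change to data or settings detectable on audit.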
International Nuclear Information System (INIS)
Lundberg, Jonas; Johansson, Björn JE
2015-01-01
It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment and resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics and for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
Business processes are a key asset for every organization. The design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider; it can be regarded as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.
International Nuclear Information System (INIS)
Pinedo, P.
2002-01-01
The long lifetime of high-level waste and its 'possible' releases from the repository over wide time frames in the far future make it difficult to forecast actual doses. Similar difficulties were found when trying to establish or recommend protection criteria for the environment and human health. The stochastic nature of the whole problem, from the causes that initiate radionuclide releases to the nature of the environmental conditions where impact is evaluated, complicates the treatment of radionuclide transport models and the analysis of radiological impact. The application of radiological protection principles to this management option was also seen as different from other present-day practices. All this gave rise to the diversification of research lines towards new areas that allow for the analysis of radionuclide transport, dose calculations and criteria in this new situation. The approach for the biosphere system based on the 'reference' concept, in essence the same idea as the 'Reference Man' concept, was promoted internationally, first within the BIOMOVS II Project and afterwards in the IAEA BIOMASS Programme. In parallel to the participation in these projects and based on their conclusions, CIEMAT has been developing for ENRESA a methodology which has to be updated and completed with recent developments from BIOMASS Theme 1, notably, for the Justification and Identification step, the Description of Critical Groups and the use of the Data Protocol. An application of this methodology was performed and published in 1998, and its results and conclusions are summarised in the paper. The paper also includes the main conclusions from the biosphere modelling applied in the last ENRESA2000 Spanish PA exercise, and the difficulties found in achieving consistency between the scenario generation procedure, the treatment of the interface and the source term, and the use of the reference biosphere concept. (author)
DEFF Research Database (Denmark)
Larsen, Lars Bjørn; Vesterager, Johan
This report provides an overview of the existing models of global manufacturing, describes the required modelling views and associated methods and identifies tools which can provide support for this modelling activity. The model adopted for global manufacturing is that of an extended enterprise s...
Guerrero, J M; Martínez-Tomás, R; Rincón, M; Peraita, H
2016-01-01
Early detection of Alzheimer's disease (AD) has become one of the principal focuses of research in medicine, particularly when the disease is incipient or even prodromic, because treatments are more effective in these stages. Lexical-semantic-conceptual deficit (LSCD) in the oral definitions of semantic categories for basic objects is an important early indicator in the evaluation of the cognitive state of patients. The objective of this research is to define an economic procedure for cognitive impairment (CI) diagnosis, which may be associated with early stages of AD, by analysing cognitive alterations affecting declarative semantic memory. Because of its low cost, it could be used for routine clinical evaluations or screenings, leading to more expensive and selective tests that confirm or rule out the disease accurately. It should necessarily be an explanatory procedure, which would allow us to study the evolution of the disease in relation to CI, the irregularities in different semantic categories, and other neurodegenerative diseases. On the basis of these requirements, we hypothesise that Bayesian networks (BNs) are the most appropriate tool for this purpose. We have developed a BN for CI diagnosis in mild and moderate AD patients by analysing the oral production of semantic features. The BN causal model represents LSCD in certain semantic categories, both of living things (dog, pine, and apple) and non-living things (chair, car, and trousers), as symptoms of CI. The model structure, the qualitative part of the model, uses domain knowledge obtained from psychology experts and epidemiological studies. Further, the model parameters, the quantitative part of the model, are learnt automatically from epidemiological studies and Peraita and Grasso's linguistic corpus of oral definitions. This corpus was prepared with an incidental sampling and included the analysis of the oral linguistic production of 81 participants (42 cognitively healthy elderly people and 39
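The inference at the heart of such a diagnostic Bayesian network can be sketched compactly by exact enumeration. The structure below (cognitive impairment as a parent of per-category semantic deficits) follows the abstract, but the probability tables are invented placeholders, not the parameters learnt from the Peraita and Grasso corpus:

```python
# Hypothetical CPTs (NOT the paper's learned parameters): prior on cognitive
# impairment (CI) and probability of a lexical-semantic deficit given CI status.
P_CI = {True: 0.30, False: 0.70}
P_deficit = {  # P(deficit observed in this semantic category | CI status)
    "dog":   {True: 0.80, False: 0.10},
    "chair": {True: 0.70, False: 0.15},
}

def posterior_ci(evidence):
    """P(CI | observed deficits) by exact enumeration over the tiny network."""
    weights = {}
    for ci in (True, False):
        w = P_CI[ci]
        for category, observed in evidence.items():
            p = P_deficit[category][ci]
            w *= p if observed else (1.0 - p)
        weights[ci] = w
    return weights[True] / (weights[True] + weights[False])

# Deficits observed in both a living ("dog") and a non-living ("chair") category.
print(round(posterior_ci({"dog": True, "chair": True}), 3))  # 0.941
```

With deficits in both categories the posterior probability of impairment rises sharply above the prior, which is the qualitative behaviour such a screening model relies on; a real network would have many more categories and parameters estimated from the corpus.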
International Nuclear Information System (INIS)
Borges, C.; Zarza-Moreno, M.; Heath, E.; Teixeira, N.; Vaz, P.
2012-01-01
Purpose: The most recent Varian micro multileaf collimator (MLC), the High Definition (HD120) MLC, was modeled using the BEAMNRC Monte Carlo code. This model was incorporated into a Varian medical linear accelerator, for a 6 MV beam, in static and dynamic mode. The model was validated by comparing simulated profiles with measurements. Methods: The Varian Trilogy (2300C/D) accelerator model was accurately implemented using the state-of-the-art Monte Carlo simulation program BEAMNRC and validated against off-axis and depth dose profiles measured using ionization chambers, by adjusting the energy and the full width at half maximum (FWHM) of the initial electron beam. The HD120 MLC was modeled by developing a new BEAMNRC component module (CM), designated HDMLC, adapting the available DYNVMLC CM and incorporating the specific characteristics of this new micro MLC. The leaf dimensions were provided by the manufacturer. The geometry was visualized by tracing particles through the CM and recording their position when a leaf boundary is crossed. The leaf material density and abutting air gap between leaves were adjusted in order to obtain a good agreement between the simulated leakage profiles and EBT2 film measurements performed in a solid water phantom. To validate the HDMLC implementation, additional MLC static patterns were also simulated and compared to additional measurements. Furthermore, the ability to simulate dynamic MLC fields was implemented in the HDMLC CM. The simulation results of these fields were compared with EBT2 film measurements performed in a solid water phantom. Results: Overall, the discrepancies, with and without MLC, between the open-field simulations and the measurements using ionization chambers in a water phantom, for the off-axis profiles are below 2% and in depth-dose profiles are below 2% after the maximum dose depth and below 4% in the build-up region. Under the conditions of these simulations, this tungsten-based MLC has a density of 18.7 g
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
Directory of Open Access Journals (Sweden)
Jordi Palacín
2012-06-01
This work proposes the detection of red peaches in orchard images based on the definition of different linear color models in the RGB vector color space. The classification and segmentation of the pixels of the image is then performed by comparing the color distance from each pixel to the different previously defined linear color models. The methodology proposed has been tested with images obtained in a real orchard under natural light. The peach variety in the orchard was the paraguayo (Prunus persica var. platycarpa) peach with red skin. The segmentation results showed that the area of the red peaches in the images was detected with an average error of 11.6%; 19.7% in the case of bright illumination; 8.2% in the case of low illumination; 8.6% for occlusion up to 33%; 12.2% in the case of occlusion between 34 and 66%; and 23% for occlusion above 66%. Finally, a methodology was proposed to estimate the diameter of the fruits based on an ellipsoidal fitting. A first diameter was obtained by using all the contour pixels and a second diameter was obtained by rejecting some pixels of the contour. This approach enables a rough estimate of the fruit occlusion percentage range by comparing the two diameter estimates.
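The pixel classification step described above, measuring the distance from each pixel to a linear color model (a line in RGB space), can be sketched as follows. The specific color line and threshold are illustrative assumptions, not the values calibrated in the study:

```python
import numpy as np

def distance_to_color_line(pixels, anchor, direction):
    """Euclidean distance from RGB pixels to a linear color model:
    a line in RGB space through `anchor` along `direction`."""
    v = np.asarray(direction, float)
    v /= np.linalg.norm(v)
    d = np.asarray(pixels, float) - np.asarray(anchor, float)
    proj = d @ v                       # scalar projection onto the line
    perp = d - np.outer(proj, v)       # component perpendicular to the line
    return np.linalg.norm(perp, axis=1)

def classify_red_peach(pixels, models, threshold=30.0):
    """Label a pixel as peach if it lies close to any linear color model.
    The models and threshold here are illustrative, not the paper's values."""
    dists = np.min([distance_to_color_line(pixels, a, v) for a, v in models],
                   axis=0)
    return dists < threshold

# One hypothetical 'red peach' color line: from dark red towards bright red.
models = [((60, 10, 15), (1.0, 0.15, 0.2))]
pixels = np.array([[150, 25, 35],   # reddish pixel, close to the line
                   [40, 120, 50]])  # green (leaf) pixel, far from the line
print(classify_red_peach(pixels, models))
```

A line rather than a single reference color lets one model capture the same hue across the illumination range, which is why the paper defines several linear models instead of fixed RGB thresholds.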
Calizzani, Gabriele; Menichini, Ivana; Candura, Fabio; Lanzoni, Monica; Profili, Samantha; Tamburrini, Maria Rita; Fortino, Antonio; Vaglio, Stefania; Marano, Giuseppe; Facco, Giuseppina; Oliovecchio, Emily; Franchini, Massimo; Coppola, Antonio; Arcieri, Romano; Bon, Cinzia; Saia, Mario; Nuti, Sabina; Morfini, Massimo; Liumbruno, Giancarlo M; Di Minno, Giovanni; Grazzini, Giuliano
2014-04-01
Due to the increase in life expectancy, patients with haemophilia and other inherited bleeding disorders are experiencing age-related comorbidities that present new challenges. In order to meet current and emerging needs, a model for healthcare pathways was developed through a project funded by the Italian Ministry of Health. The project aimed to prevent or reduce the social-health burden of the disease and its complications. The National Blood Centre appointed a panel of experts comprising clinicians, patients, and National and Regional Health Authority representatives. Following an analysis of the scientific and regulatory references, the panel drafted a technical proposal containing recommendations for Regional Health Authorities, which has been formally submitted to the Ministry of Health. Finally, a set of indicators to monitor haemophilia care provision has been defined. In the technical document, the panel of experts proposed the adoption of health policy recommendations summarised in areas such as: multidisciplinary integrated approach for optimal healthcare provision; networking and protocols for emergency care; home therapy; registries/databases; replacement therapy supply and distribution; recruitment and training of experts in bleeding disorders. The recommendations became the content of a proposal for an agreement between the Government and the Regions. Monitoring and evaluation of haemophilia care through the set of established indicators was only partially performed due to limited available data. The project provided recommendations for the clinical and organisational management of patients with haemophilia. Particular attention was given to those areas that play a critical role in the prevention of comorbidities and complications. The recommendations are expected to harmonise healthcare delivery across regional networks and to build the foundation for the national haemophilia network.
Healy, Richard W.; Scanlon, Bridget R.
2010-01-01
Simulation models are widely used in all types of hydrologic studies, and many of these models can be used to estimate recharge. Models can provide important insight into the functioning of hydrologic systems by identifying factors that influence recharge. The predictive capability of models can be used to evaluate how changes in climate, water use, land use, and other factors may affect recharge rates. Most hydrological simulation models, including watershed models and groundwater-flow models, are based on some form of water-budget equation, so the material in this chapter is closely linked to that in Chapter 2. Empirical models that are not based on a water-budget equation have also been used for estimating recharge; these models generally take the form of simple estimation equations that define annual recharge as a function of precipitation and possibly other climatic data or watershed characteristics. Model complexity varies greatly. Some models are simple accounting models; others attempt to accurately represent the physics of water movement through each compartment of the hydrologic system. Some models provide estimates of recharge explicitly; for example, a model based on the Richards equation can simulate water movement from the soil surface through the unsaturated zone to the water table. Recharge estimates can be obtained indirectly from other models. For example, recharge is a parameter in groundwater-flow models that solve for hydraulic head (i.e. groundwater level). Recharge estimates can be obtained through a model calibration process in which recharge and other model parameter values are adjusted so that simulated water levels agree with measured water levels. The simulation that provides the closest agreement is called the best fit, and the recharge value used in that simulation is the model-generated estimate of recharge.
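The two families of models mentioned above, water-budget residual estimates and simple empirical estimation equations, can be sketched in a few lines. The empirical coefficients below are placeholders, not values from any calibrated model:

```python
def annual_recharge(precip, et, runoff, delta_storage=0.0):
    """Residual water-budget estimate of recharge (all terms in mm/yr):
    R = P - ET - runoff - change in storage."""
    return precip - et - runoff - delta_storage

def empirical_recharge(precip, a=0.15, p0=400.0):
    """Hypothetical empirical model: a fixed fraction `a` of precipitation
    above a threshold p0 becomes recharge (coefficients are illustrative)."""
    return max(0.0, a * (precip - p0))

print(annual_recharge(800.0, 550.0, 150.0))  # 100.0 mm/yr
print(empirical_recharge(800.0))             # 60.0 mm/yr
```

The residual form inherits the errors of every other budget term, which is why such estimates are usually cross-checked against independent methods; the empirical form is only as good as the calibration data behind its coefficients.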
International Nuclear Information System (INIS)
Buchler, J.R.; Gottesman, S.T.; Hunter, J.H. Jr.
1990-01-01
Various papers on galactic models are presented. Individual topics addressed include: observations relating to galactic mass distributions; the structure of the Galaxy; mass distribution in spiral galaxies; rotation curves of spiral galaxies in clusters; grand design, multiple arm, and flocculent spiral galaxies; observations of barred spirals; ringed galaxies; elliptical galaxies; the modal approach to models of galaxies; self-consistent models of spiral galaxies; dynamical models of spiral galaxies; N-body models. Also discussed are: two-component models of galaxies; simulations of cloudy, gaseous galactic disks; numerical experiments on the stability of hot stellar systems; instabilities of slowly rotating galaxies; spiral structure as a recurrent instability; model gas flows in selected barred spiral galaxies; bar shapes and orbital stochasticity; three-dimensional models; polar ring galaxies; dynamical models of polar rings
Model-model Perencanaan Strategik [Models of Strategic Planning]
Amirin, Tatang M
2005-01-01
The process of strategic planning, formerly called long-term planning, consists of several components, including strategic analysis, setting strategic direction (covering mission, vision, and values), and action planning. Many writers develop models representing the steps of the strategic planning process, e.g. the basic planning model, the problem-based planning model, the scenario model, and the organic or self-organizing model.
DEFF Research Database (Denmark)
Bækgaard, Lars
2001-01-01
The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse...... are dynamic and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects events are phenomena that can be observed and described. For example, borrow events in a library can...
DEFF Research Database (Denmark)
Ashauer, Roman; Albert, Carlo; Augustine, Starrlight
2016-01-01
The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test...
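A sketch of the reduced stochastic-death variant commonly discussed under GUTS (scaled damage driven by time-variable exposure, with hazard accruing above a threshold). The equations follow the widely published GUTS-RED-SD form, but the parameter values and the simple Euler discretisation are illustrative simplifications, not the calibrations from these studies:

```python
import math

def guts_red_sd_survival(conc_series, dt, kd, z, kk, hb=0.0):
    """Simplified GUTS-RED-SD: scaled damage D follows first-order kinetics
    toward the (time-variable) external concentration; hazard accrues at rate
    kk * max(0, D - z) plus background hazard hb. Euler integration."""
    D, cum_hazard, survival = 0.0, 0.0, []
    for c in conc_series:
        D += dt * kd * (c - D)                        # toxicokinetics
        cum_hazard += dt * (kk * max(0.0, D - z) + hb)  # toxicodynamics
        survival.append(math.exp(-cum_hazard))
    return survival

# Pulsed exposure: 5 time units at concentration 10, then clean water.
conc = [10.0] * 50 + [0.0] * 50
s = guts_red_sd_survival(conc, dt=0.1, kd=0.5, z=2.0, kk=0.3)
print(round(s[-1], 3))
```

Because damage keeps decaying after the pulse ends, hazard stops accruing once D falls back below the threshold z, which is how the model accounts for time-variable exposure rather than assuming constant concentrations.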
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Piccolo, Chiara; Heitzig, Martina
2011-01-01
covered, illustrating several models such as the Wilson equation and NRTL equation, along with their solution strategies. A section shows how to use experimental data to regress the property model parameters using a least squares approach. A full model analysis is applied in each example that discusses...... the degrees of freedom, dependent and independent variables and solution strategy. Vapour-liquid and solid-liquid equilibrium is covered, and applications to droplet evaporation and kinetic models are given....
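The Wilson-equation example and the least-squares regression of property model parameters mentioned above can be sketched as follows. The coarse grid-search "optimiser" and the synthetic data are simplifications for illustration; a real regression would use measured activity coefficients and a proper nonlinear least-squares routine:

```python
import math

def wilson_gammas(x1, L12, L21):
    """Activity coefficients for a binary mixture from the Wilson equation."""
    x2 = 1.0 - x1
    s12 = x1 + L12 * x2
    s21 = x2 + L21 * x1
    term = L12 / s12 - L21 / s21
    g1 = math.exp(-math.log(s12) + x2 * term)
    g2 = math.exp(-math.log(s21) - x1 * term)
    return g1, g2

def fit_wilson(data, grid=None):
    """Least-squares regression of (L12, L21) against measured gamma1 values
    by coarse grid search, a stand-in for a proper optimiser."""
    grid = grid or [i / 10.0 for i in range(1, 31)]
    def sse(L12, L21):
        return sum((wilson_gammas(x1, L12, L21)[0] - g1_obs) ** 2
                   for x1, g1_obs in data)
    return min(((L12, L21) for L12 in grid for L21 in grid),
               key=lambda p: sse(*p))

# Synthetic 'measurements' generated from known parameters (0.5, 1.2).
data = [(x1 / 10.0, wilson_gammas(x1 / 10.0, 0.5, 1.2)[0])
        for x1 in range(1, 10)]
print(fit_wilson(data))  # recovers (0.5, 1.2) on this grid
```

The same degrees-of-freedom discipline described in the text applies here: with two adjustable parameters and nine data points the problem is over-determined, so the fit minimises a residual rather than solving exactly.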
International Nuclear Information System (INIS)
Anon.
1982-01-01
Testing the applicability of mathematical models with carefully designed experiments is a powerful tool in the investigations of the effects of ionizing radiation on cells. The modeling and cellular studies complement each other, for modeling provides guidance for designing critical experiments which must provide definitive results, while the experiments themselves provide new input to the model. Based on previous experimental results the model for the accumulation of damage in Chlamydomonas reinhardi has been extended to include various multiple two-event combinations. Split dose survival experiments have shown that models tested to date predict most but not all the observed behavior. Stationary-phase mammalian cells, required for tests of other aspects of the model, have been shown to be at different points in the cell cycle depending on how they were forced to stop proliferating. These cultures also demonstrate different capacities for repair of sublethal radiation damage
DEFF Research Database (Denmark)
Ravn, Anders P.; Staunstrup, Jørgen
1994-01-01
This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two....... The model describes both functional and timing properties of an interface...
Hydrological models are mediating models
Babel, L. V.; Karssenberg, D.
2013-08-01
Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It follows from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction deserves closer attention, as it is rarely explicitly presented in peer-reviewed literature. We believe that devoting
Zaytsev, V.; Pierantonio, A.; Schätz, B.; Tamzalit, D.
2014-01-01
The evolution of a software language (whether modelled by a grammar or a schema or a metamodel) is not limited to development of new versions and dialects. An important dimension of a software language evolution is maturing in the sense of improving the quality of its definition. In this paper, we
Fernandez, R.; Deveaux, V.
2010-01-01
We provide a formal definition and study the basic properties of partially ordered chains (POC). These systems were proposed to model textures in image processing and to represent independence relations between random variables in statistics (in the latter case they are known as Bayesian networks).
Business Model Process Configurations
DEFF Research Database (Denmark)
Taran, Yariv; Nielsen, Christian; Thomsen, Peter
2015-01-01
, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...
International Nuclear Information System (INIS)
Phillips, C.K.
1985-12-01
This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. Three distinct types of modelling codes, whose validity and limitations will be contrasted, are discussed: discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs
Marra, Francesco P; Barone, Ettore; La Mantia, Michele; Caruso, Tiziano
2009-09-01
This study, as a preliminary step toward the definition of a carbon budget model for pistachio trees (Pistacia vera L.), aimed at estimating and evaluating the dynamics of respiration of vegetative and reproductive organs of the pistachio tree. Trials were performed in 2005 in a commercial orchard located in Sicily (370 m a.s.l.) on five bearing 20-year-old pistachio trees of cv. Bianca grafted onto Pistacia terebinthus L. Growth analyses and respiration measurements were done on vegetative (leaf) and reproductive (infructescence) organs during the entire growing season (April-September) at biweekly intervals. Results suggested that the respiration rates of pistachio reproductive and vegetative organs were related to their developmental stage. Both for leaf and for infructescence, the highest values were observed during the earlier stages of growth corresponding to the phases of most intense organ growth. The sensitivity of respiration activity to temperature changes, measured by Q(10), showed an increase throughout the transition from immature to mature leaves, as well as during fruit development. The data collected were also used to estimate the seasonal carbon loss by respiration activity for a single leaf and a single infructescence. The amount of carbon lost by respiration was affected by short-term temperature patterns, organ developmental stage and tissue function.
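The Q10 temperature-sensitivity relation used in this kind of respiration study can be stated in a couple of lines. The numeric values below are illustrative, not measurements from the pistachio trials:

```python
def q10_rate(r_ref, t_ref, t, q10):
    """Respiration rate at temperature t, given the rate r_ref measured at
    t_ref, using the Q10 relation: R(t) = r_ref * q10 ** ((t - t_ref) / 10)."""
    return r_ref * q10 ** ((t - t_ref) / 10.0)

# With Q10 = 2, a 10 degree C rise doubles the respiration rate.
print(q10_rate(1.5, 20.0, 30.0, 2.0))  # 3.0
```

A rising Q10 through organ development, as reported in the abstract, means the same temperature excursion costs a mature organ proportionally more carbon than an immature one.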
Modelling in Business Model design
Simonse, W.L.
2013-01-01
It appears that business model design might not always produce a design or model as the expected result. However, when designers are involved, a visual model or artefact is produced. To assist strategic managers in thinking about how they can act, the designer's challenge is to combine strategy and
Limitations of JEDI Models | Jobs and Economic Development Impact Models
JEDI models draw on multiplier datasets from the IMPLAN accounting software, which are updated every two years with the best available data. Input-output modeling remains a widely used methodology for measuring economic development activity. Results depend on the definition of the geographic area under consideration. Datasets of multipliers from IMPLAN are available at
International Nuclear Information System (INIS)
Michel, F.C.
1989-01-01
Three existing eclipse models for the PSR 1957 + 20 pulsar are discussed in terms of their requirements and the information they yield about the pulsar wind: the interacting wind from a companion model, the magnetosphere model, and the occulting disk model. It is shown that the wind model requires an MHD wind from the pulsar, with enough particles that the Poynting flux of the wind can be thermalized; in this model, a large flux of energetic radiation from the pulsar is required to accompany the wind and drive the wind off the companion. The magnetosphere model requires an EM wind, which is Poynting flux dominated; the advantage of this model over the wind model is that the plasma density inside the magnetosphere can be orders of magnitude larger than in a magnetospheric tail blown back by wind interaction. The occulting disk model also requires an EM wind so that the interaction would be pushed down onto the companion surface, minimizing direct interaction of the wind with the orbiting macroscopic particles
International Nuclear Information System (INIS)
Yang, H.
1999-01-01
The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future
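The core heat-removal calculation behind such a ventilation analysis can be sketched with a simple sensible-heat balance on the ventilation air. The mass flow and temperatures below are invented for illustration and are not design data from the AMR:

```python
def heat_removal_rate(m_dot, t_in, t_out, cp=1006.0):
    """Sensible heat removed by ventilation air, in watts:
    Q = m_dot * cp * (T_out - T_in), with cp for dry air in J/(kg K).
    The flow and temperature values used below are illustrative only."""
    return m_dot * cp * (t_out - t_in)

# Hypothetical drift: 15 kg/s of air entering at 25 C and leaving at 45 C.
q_watts = heat_removal_rate(15.0, 25.0, 45.0)
print(q_watts / 1000.0)  # ~301.8 kW removed
```

A full model such as the one described above would couple this air-side balance to time-varying wall heat flux from the decaying waste, and, beyond this heat-only sketch, to moisture transport.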
DEFF Research Database (Denmark)
Blomhøj, Morten
2004-01-01
Developing competences for setting up, analysing and criticising mathematical models are normally seen as relevant only from and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical...... roots for the construction of important mathematical concepts. In addition competences for setting up, analysing and criticising modelling processes and the possible use of models is a formative aim in this own right for mathematics teaching in general education. The paper presents a theoretical...... modelling, however, can be seen as a practice of teaching that place the relation between real life and mathematics into the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive...
2016-01-01
This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.
Bottle, Neil
2013-01-01
The Model : making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model : making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Frampton, Paul H.
1998-01-01
In this talk I begin with some general discussion of model building in particle theory, emphasizing the need for motivation and testability. Three illustrative examples are then described. The first is the Left-Right model which provides an explanation for the chirality of quarks and leptons. The second is the 331-model which offers a first step to understanding the three generations of quarks and leptons. Third and last is the SU(15) model which can accommodate the light leptoquarks possibly seen at HERA
Modeling Documents with Event Model
Directory of Open Access Journals (Sweden)
Longhui Wang
2015-08-01
Full Text Available Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way in which the brain deals with images and speech. In the field of NLP, a topic model is one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning is learning the semantics of words based on deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.
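The abstract above describes representing a document as a low-dimensional vector via a linear map. As a hedged illustration of that general idea only (a generic linear projection of bag-of-words counts; the vocabulary, projection and names are ours, not the paper's Event Model procedure):

```python
import numpy as np

# Hypothetical sketch: map bag-of-words counts to a low-dimensional
# document vector with a fixed linear projection. This illustrates the
# generic "linear reduction" idea, not the specific Event Model method.
rng = np.random.default_rng(0)

vocab = {"event": 0, "object": 1, "word": 2, "model": 3}
projection = rng.standard_normal((len(vocab), 2))  # vocab_dim -> 2 dims

def doc_vector(tokens):
    counts = np.zeros(len(vocab))
    for t in tokens:
        if t in vocab:
            counts[vocab[t]] += 1
    return counts @ projection  # linear map to a 2-d document vector

v = doc_vector(["event", "model", "model"])
print(v.shape)  # (2,)
```

In a real system the projection would be learned (e.g. from word embeddings) rather than random; the sketch only shows the linear-map structure.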
DEFF Research Database (Denmark)
Gøtze, Jens Peter; Krentz, Andrew
2014-01-01
In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...
Jongerden, M.R.; Haverkort, Boudewijn R.H.M.
2008-01-01
The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However,
DEFF Research Database (Denmark)
Højgaard, Tomas; Hansen, Rune
The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful...
Kempen, van A.; Kok, H.; Wagter, H.
1992-01-01
In Computer Aided Drafting three groups of three-dimensional geometric modelling can be recognized: wire frame, surface and solid modelling. One of the methods to describe a solid is by using a boundary based representation. The topology of the surface of a solid is the adjacency information between
Poortman, Sybilla; Sloep, Peter
2006-01-01
Educational models describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in
International Nuclear Information System (INIS)
V. Chipman
2002-01-01
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, i.e. the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the "Multiscale Thermohydrologic Model" (BSC 2001), use the wall heat fractions output by the Ventilation Model to initialize their postclosure analyses.
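The wall heat fraction defined in the abstract above is simple arithmetic; a minimal sketch (the function name and example heat rates are illustrative, not values from the report):

```python
def wall_heat_fraction(q_removed_by_air, q_decay):
    """Fraction of decay heat conducted into the surrounding rock mass.

    heat_removal = fraction of heat carried away by the ventilation air;
    wall heat fraction = 1 - heat_removal, as defined in the abstract.
    """
    heat_removal = q_removed_by_air / q_decay
    return 1.0 - heat_removal

# Illustrative numbers: 8.6 kW removed by air out of 11.8 kW decay heat
f_wall = wall_heat_fraction(8.6, 11.8)
print(round(f_wall, 3))  # -> 0.271, the fraction conducted to rock
```

In the actual model this fraction varies in time and along the drift; the sketch shows only the pointwise definition.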
DEFF Research Database (Denmark)
Kindler, Ekkart
2009-01-01
There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts, these notations have been extended in order to increase expressiveness and to be more competitive. This resulted in an increasing number of notations and formalisms for modelling business processes and in an increase of the different modelling constructs provided by modelling notations, which makes it difficult to compare modelling notations and to make transformations between them. One of the reasons is that, in each notation, the new concepts are introduced in a different way by extending the already existing constructs. In this chapter, we go the opposite direction: We show that it is possible to add most......
Modeling complexes of modeled proteins.
Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A
2017-03-01
Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å C α RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and that template-based docking is much less sensitive to inaccuracies of protein models than free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
DEFF Research Database (Denmark)
Kreiner, Svend; Christensen, Karl Bang
Rasch models; Partial Credit models; Rating Scale models; Item bias; Differential item functioning; Local independence; Graphical models
International Nuclear Information System (INIS)
Woosley, S.E.; California, University, Livermore, CA); Weaver, T.A.
1981-01-01
Recent progress in understanding the observed properties of type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the Ni-56 produced therein is reviewed. The expected nucleosynthesis and gamma-line spectra for this model of type I explosions and a model for type II explosions are presented. Finally, a qualitatively new approach to the problem of massive star death and type II supernovae based upon a combination of rotation and thermonuclear burning is discussed. While the theoretical results of existing models are predicated upon the assumption of a successful core bounce calculation and the neglect of such two-dimensional effects as rotation and magnetic fields, the new model suggests an entirely different scenario in which a considerable portion of the energy carried by an equatorially ejected blob is deposited in the red giant envelope overlying the mantle of the star.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Indian Academy of Sciences (India)
School of Water Resources, Indian Institute of Technology, Kharagpur ... the most accepted method for modelling LULCC using current .... We used UTM coordinate system with zone 45 .... need to develop criteria for making decisions about.
National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of all...
Energy models characterize the energy system, its evolution, and its interactions with the broader economy. The energy system consists of primary resources, including both fossil fuels and renewables; power plants, refineries, and other technologies to process and convert these r...
Searle, Shayle R
2012-01-01
This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.
Skaaret, Eimund
Calculation procedures used in the design of ventilating systems are addressed; they are especially suited for displacement ventilation, in addition to linking it to mixing ventilation. The two-zone flow model is considered, and the steady-state and transient solutions are addressed. Different methods of supplying air are discussed, and different types of air flow are considered: piston flow, plane flow and radial flow. An evaluation model for ventilation systems is presented.
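The steady-state solutions mentioned above can be illustrated with the textbook well-mixed dilution relation C = C_supply + S/Q (a standard single-zone relation, not Skaaret's two-zone model itself; variable names and numbers are illustrative):

```python
def steady_state_concentration(c_supply, source, airflow):
    """Standard well-mixed dilution relation: C = C_supply + S/Q.

    c_supply: contaminant concentration in the supply air [mg/m^3]
    source:   contaminant source strength in the zone [mg/h]
    airflow:  ventilation airflow through the zone [m^3/h]
    """
    return c_supply + source / airflow

# 500 mg/h source, 1000 m^3/h airflow, clean supply air
print(steady_state_concentration(0.0, 500.0, 1000.0))  # -> 0.5 mg/m^3
```

A two-zone displacement model would apply a balance of this kind to the lower and upper zones separately, coupled through the plume flow between them.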
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
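The idea of assigning probabilities to models can be sketched with a simple probability-weighted combination of model predictions (a Bayesian-model-averaging style calculation; the numbers and names are illustrative, not from the paper):

```python
import numpy as np

# Hedged sketch: combine predictions from candidate models using
# weights interpreted as model probabilities. The weighted mean is the
# combined prediction; the weighted spread reflects model uncertainty.
model_predictions = np.array([4.2, 5.0, 3.8])  # three candidate models
model_probs = np.array([0.5, 0.3, 0.2])        # must sum to 1

combined = float(model_probs @ model_predictions)
variance = float(model_probs @ (model_predictions - combined) ** 2)
print(combined)  # -> 4.36, the probability-weighted prediction
```

The interpretation difficulty the abstract alludes to is precisely what these weights mean when no candidate model is "true".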
Modelling synergistic effects of appetite regulating hormones
DEFF Research Database (Denmark)
Schmidt, Julie Berg; Ritz, Christian
2016-01-01
We briefly reviewed one definition of dose addition, which is applicable within the framework of generalized linear models. We established how this definition of dose addition corresponds to effect addition in case only two doses per compound are considered for evaluating synergistic effects. The link between definitions was exemplified for an appetite study where two appetite hormones were studied.
Model Hadron asymptotic behaviour
International Nuclear Information System (INIS)
Kralchevsky, P.; Nikolov, A.
1983-01-01
The work is devoted to the problem of solving a set of asymptotic equations describing the model hadron interaction. More specifically, an iterative procedure consisting of two stages is proposed, and the first stage is exhaustively studied here. The principle of contracting transformations has been applied for this purpose. Under rather general and natural assumptions, solutions in a series of metric spaces suitable for physical applications have been found. For each of these spaces a uniquely determined solution is found. (authors)
International Nuclear Information System (INIS)
Fryer, M.O.
1984-01-01
The temperature measurements provided by thermocouples (TCs) are important for the operation of pressurized water reactors. During severe inadequate core cooling incidents, extreme temperatures may cause the type K TCs used for core exit temperature monitoring to perform poorly. A model of TC electrical behavior has been developed to determine how TCs react under extreme temperatures. The model predicts the voltage output of the TC and its impedance. A series of experiments was conducted on a length of type K thermocouple to validate the model. Impedance was measured at several temperatures between 22 °C and 1100 °C and at frequencies between dc and 10 MHz. The model was able to accurately predict impedance over this wide range of conditions. The average percentage difference between experimental data and the model was less than 6.5%. Experimental accuracy was ±2.5%. There is a striking difference between impedance-versus-frequency plots at 300 °C and at higher temperatures. This may be useful in validating TC data during accident conditions.
Kallman, T.
2010-01-01
Warm absorber spectra are characterized by the many lines from partially ionized intermediate-Z elements, and iron, detected with the grating instruments on Chandra and XMM-Newton. If these ions are formed in a gas which is in photoionization equilibrium, they correspond to a broad range of ionization parameters, although there is evidence for certain preferred values. A test for any dynamical model for these outflows is to reproduce these properties, at some level of detail. In this paper we present a statistical analysis of the ionization distribution which can be applied both to the observed spectra and to theoretical models. As an example, we apply it to our dynamical models for warm absorber outflows, based on evaporation from the molecular torus.
Smith, J. A.; Cooper, K.; Randolph, M.
1984-01-01
A classical description of the one dimensional radiative transfer treatment of vegetation canopies was completed and the results were tested against measured prairie (blue grama) and agricultural canopies (soybean). Phase functions are calculated in terms of directly measurable biophysical characteristics of the canopy medium. While the phase functions tend to exhibit backscattering anisotropy, their exact behavior is somewhat more complex and wavelength dependent. A Monte Carlo model was developed that treats soil surfaces with large periodic variations in three dimensions. A photon-ray tracing technology is used. Currently, the rough soil surface is described by analytic functions and appropriate geometric calculations performed. A bidirectional reflectance distribution function is calculated and, hence, available for other atmospheric or canopy reflectance models as a lower boundary condition. This technique is used together with an adding model to calculate several cases where Lambertian leaves possessing anisotropic leaf angle distributions yield non-Lambertian reflectance; similar behavior is exhibited for simulated soil surfaces.
Eck, Christof; Knabner, Peter
2017-01-01
Mathematical models are the decisive tool to explain and predict phenomena in the natural and engineering sciences. With this book readers will learn to derive mathematical models which help to understand real world phenomena. At the same time a wealth of important examples for the abstract concepts treated in the curriculum of mathematics degrees are given. An essential feature of this book is that mathematical structures are used as an ordering principle and not the fields of application. Methods from linear algebra, analysis and the theory of ordinary and partial differential equations are thoroughly introduced and applied in the modeling process. Examples of applications in the fields electrical networks, chemical reaction dynamics, population dynamics, fluid dynamics, elasticity theory and crystal growth are treated comprehensively.
Cardey, Sylviane
2013-01-01
In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language(s) in a microscopic manner by means of intra/inter‑language norms and divergences, going progressively from languages as systems to the linguistic, mathematical and computational models, which being based on a constructive approach are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro‑systems. The abstract model, which contrary to the current state of the art works in int
Directory of Open Access Journals (Sweden)
Aarti Sharma
2009-01-01
Full Text Available The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease the harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modeling. Molecular modeling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.
International Nuclear Information System (INIS)
Woosley, S.E.; Weaver, T.A.
1980-01-01
Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the 56 Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae based upon a combination of rotation and thermonuclear burning is discussed
Principles of models based engineering
Energy Technology Data Exchange (ETDEWEB)
Dolin, R.M.; Hefele, J.
1996-11-01
This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.
Baart, F.; Donchyts, G.; van Dam, A.; Plieger, M.
2015-12-01
The emergence of interactive art has blurred the line between electronics, computer graphics and art. Here we apply this art form to numerical models and show how the transformation of a numerical model into an interactive painting can both provide insights and solve real-world problems. The cases used as examples include forensic reconstructions, dredging optimization, and barrier design. The system can be fed by any source of time-varying vector fields, such as hydrodynamic models. The cases used here, the Indian Ocean (HYCOM), the Wadden Sea (Delft3D Curvilinear), and San Francisco Bay (3Di subgrid and Delft3D Flexible Mesh), show that the method is suitable for different temporal and spatial scales. High-resolution numerical models become interactive paintings by exchanging their velocity fields with a high-resolution (>=1M cells) image-based flow visualization that runs in an html5-compatible web browser. The image-based flow visualization combines three images into a new image: the current image, a drawing, and a uv + mask field. The advection scheme that computes the resultant image is executed on the graphics card using WebGL, allowing for 1M grid cells at 60 Hz performance on modest graphics cards. The software is provided as open source software. By using different sources for the drawing one can gain insight into several aspects of the velocity fields. These aspects include not only the commonly represented magnitude and direction, but also divergence, topology and turbulence.
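The advection scheme at the heart of image-based flow visualization can be sketched in a few lines. This is a minimal NumPy illustration of the idea (a nearest-neighbour semi-Lagrangian backtrace; the real system runs a more refined scheme in WebGL, and all names here are ours):

```python
import numpy as np

# Minimal sketch: move image values along a velocity field (u, v) by
# backtracing each pixel to where its value came from one step earlier.
def advect(image, u, v, dt=1.0):
    ny, nx = image.shape
    ys, xs = np.mgrid[0:ny, 0:nx].astype(float)
    # backtrace: source location of each pixel's value
    src_x = np.clip(xs - dt * u, 0, nx - 1)
    src_y = np.clip(ys - dt * v, 0, ny - 1)
    return image[src_y.round().astype(int), src_x.round().astype(int)]

img = np.zeros((4, 4)); img[1, 1] = 1.0
u = np.ones((4, 4)); v = np.zeros((4, 4))  # uniform flow to the right
print(advect(img, u, v)[1, 2])  # -> 1.0: the bright pixel moved right
```

Repeating this step at 60 Hz while continuously blending in a drawing image is what turns a velocity field into an "interactive painting".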
Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.
This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…
DEFF Research Database (Denmark)
Nash, Ulrik William
2014-01-01
Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory of probabil...
International Nuclear Information System (INIS)
Michel, F.C.
1989-01-01
This paper addresses the question of what, if one overlooks their idiosyncratic difficulties, could be learned from the various models about the pulsar wind. The wind model requires an MHD wind from the pulsar, namely, one with enough particles that the Poynting flux of the wind can be thermalized. Otherwise, there is no shock and the pulsar wind simply reflects like a flashlight beam. Additionally, a large flux of energetic radiation from the pulsar is required to accompany the wind and drive the wind off the companion. The magnetosphere model probably requires an EM wind, which is Poynting flux dominated. Reflection in this case would arguably minimize the intimate interaction between the two flows that leads to tail formation and thereby permit a weakly magnetized tail. The occulting disk model also would point to an EM wind, so that the interaction would be pushed down onto the companion surface (to form the neutral fountain) and so as to also minimize direct interaction of the wind with the orbiting macroscopic particles.
African Journals Online (AJOL)
Simple analytic polynomials have been proposed for estimating solar radiation in the traditional Northern, Central and Southern regions of Malawi. There is a strong agreement between the polynomials and the SSE model with R2 values of 0.988, 0.989 and 0.989 and root mean square errors of 0.061, 0.057 and 0.062 ...
Lomnitz, Cinna
Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.
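The bootstrap approach referred to above can be sketched briefly: estimate the variance of a statistic by recomputing it on many resamples of the data drawn with replacement (an illustrative sketch only, not Tichelaar and Ruff's actual procedure; the sample data are made up):

```python
import random

# Hedged sketch of the bootstrap: the variance of a statistic (here
# the sample mean) is estimated from its spread across resamples
# drawn with replacement from the original data.
def bootstrap_variance(data, statistic, n_boot=2000, seed=42):
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        stats.append(statistic(resample))
    mean = sum(stats) / n_boot
    return sum((s - mean) ** 2 for s in stats) / (n_boot - 1)

depths = [10.2, 11.5, 9.8, 12.1, 10.9, 11.3]  # hypothetical depths, km
var = bootstrap_variance(depths, lambda xs: sum(xs) / len(xs))
print(var > 0)  # a positive variance estimate for the mean depth
```

The point of the comment above is that for such problems this unconventional estimate tends to agree with the conventional analytic one.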
International Nuclear Information System (INIS)
Norgett, M.J.
1980-01-01
Calculations, drawing principally on developments at AERE Harwell, of the relaxation about lattice defects are reviewed with emphasis on the techniques required for such calculations. The principles of defect modelling are outlined and various programs developed for defect simulations are discussed. Particular calculations for metals, ionic crystals and oxides, are considered. (UK)
DEFF Research Database (Denmark)
Stubkjær, Erik
2005-01-01
to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders...
DEFF Research Database (Denmark)
About the reconstruction of Palle Nielsen's (b. 1942) work The Model from 1968: a gigantic playground for children in the museum, where they can freely romp about, climb in ropes, crawl on wooden structures, work with tools, jump in foam rubber, paint with finger paints and dress up in costumes....
International Nuclear Information System (INIS)
Wenzel, W.J.; Gallegos, A.F.; Rodgers, J.C.
1985-01-01
The BIOTRAN model was developed at Los Alamos to help predict short- and long-term consequences to man from releases of radionuclides into the environment. It is a dynamic model that simulates on a daily and yearly basis the flux of biomass, water, and radionuclides through terrestrial and aquatic ecosystems. Biomass, water, and radionuclides are driven within the ecosystems by climate variables stochastically generated by BIOTRAN each simulation day. The climate variables influence soil hydraulics, plant growth, evapotranspiration, and particle suspension and deposition. BIOTRAN has 22 different plant growth strategies for simulating various grasses, shrubs, trees, and crops. Ruminants and humans are also dynamically simulated by using the simulated crops and forage as intake for user-specified diets. BIOTRAN has been used at Los Alamos for long-term prediction of health effects to populations following potential accidental releases of uranium and plutonium. Newly developed subroutines are described: a human dynamic physiological and metabolic model; a soil hydrology and irrigation model; limnetic nutrient and radionuclide cycling in fresh-water lakes. 7 references
1975-01-01
... fiducial marks should be constant and the edges sharply defined. A surface neither hydrophobic nor hydrophilic is better for routine model testing. Before each launching in
Indian Academy of Sciences (India)
Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article, Resonance – Journal of Science Education, Volume 9, Issue 5, May 2004, pp. 51-60.
National Research Council Canada - National Science Library
Ha, Yonghoon
2000-01-01
.... This approach requires the user to have installed both Matlab and Fortran compilers. The MMPE model and associated acoustic processing tools are now rewritten in the object-oriented language Java...
Building Models and Building Modelling
DEFF Research Database (Denmark)
Jørgensen, Kaj; Skauge, Jørn
2008-01-01
The report's introductory chapter describes the primary concepts relating to building models and sets out some fundamental aspects of computer-based modelling. In addition, the difference between drawing programs and building-modelling programs is described. Important aspects of comp...
DEFF Research Database (Denmark)
2012-01-01
The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident...... In drawing upon both historical and contemporary perspectives this book provides evidence of the ways in which relations between representation and the represented continue to be reconsidered...... On this subject, this book makes essential reading for anyone considering new ways of thinking about architecture. It also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience......
Barr, Michael
2002-01-01
Acyclic models is a method heavily used to analyze and compare various homology and cohomology theories appearing in topology and algebra. This book is the first attempt to put together in a concise form this important technique and to include all the necessary background. It presents a brief introduction to category theory and homological algebra. The author then gives the background of the theory of differential modules and chain complexes over an abelian category to state the main acyclic models theorem, generalizing and systemizing the earlier material. This is then applied to various cohomology theories in algebra and topology. The volume could be used as a text for a course that combines homological algebra and algebraic topology. Required background includes a standard course in abstract algebra and some knowledge of topology. The volume contains many exercises. It is also suitable as a reference work for researchers.
Directory of Open Access Journals (Sweden)
Aarti Sharma
2009-12-01
Full Text Available
DEFF Research Database (Denmark)
Pedersen, Mogens Jin; Stritch, Justin Michael
2018-01-01
Replication studies relate to the scientific principle of replicability and serve the significant purpose of providing supporting (or contradicting) evidence regarding the existence of a phenomenon. However, replication has never been an integral part of public administration and management … research. Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study … contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application …
DEFF Research Database (Denmark)
Lasrado, Lester Allan; Vatrapu, Ravi
2016-01-01
Recent advancements in set theory and readily available software have enabled social science researchers to bridge the variable-centered quantitative and case-based qualitative methodological paradigms in order to analyze multi-dimensional associations beyond the linearity assumptions, aggregate … effects, unicausal reduction, and case specificity. Based on the developments in set theoretical thinking in social sciences and employing methods like Qualitative Comparative Analysis (QCA), Necessary Condition Analysis (NCA), and set visualization techniques, in this position paper, we propose … and demonstrate a new approach to maturity models in the domain of Information Systems. This position paper describes the set-theoretical approach to maturity models, presents current results and outlines future research work.
DEFF Research Database (Denmark)
Bork Petersen, Franziska
2013-01-01
For the presentation of his autumn/winter 2012 collection in Paris and subsequently in Copenhagen, Danish designer Henrik Vibskov installed a mobile catwalk. The article investigates the choreographic impact of this scenography on those who move through it. Drawing on Dance Studies, the analytical … focus centres on how the catwalk scenography evokes a ‘defiguration’ of the walking models and to what effect. Vibskov’s mobile catwalk draws attention to the walk, which is a key element of models’ performance but which usually functions in fashion shows merely to present clothes in the most … advantageous manner. Stepping on the catwalk’s sloping, moving surfaces decelerates the models’ walk and makes it cautious, hesitant and shaky: suddenly the models lack exactly the affirmative, staccato, striving quality of motion, and the condescending expression that they perform on most contemporary …
Minimalistic Neutrino Mass Model
De Gouvêa, A; Gouvea, Andre de
2001-01-01
We consider the simplest model which solves the solar and atmospheric neutrino puzzles, in the sense that it contains the smallest amount of beyond the Standard Model ingredients. The solar neutrino data is accounted for by Planck-mass effects while the atmospheric neutrino anomaly is due to the existence of a single right-handed neutrino at an intermediate mass scale between 10^9 GeV and 10^14 GeV. Even though the neutrino mixing angles are not exactly predicted, they can be naturally large, which agrees well with the current experimental situation. Furthermore, the amount of lepton asymmetry produced in the early universe by the decay of the right-handed neutrino is very predictive and may be enough to explain the current baryon-to-photon ratio if the right-handed neutrinos are produced out of thermal equilibrium. One definitive test for the model is the search for anomalous seasonal effects at Borexino.
Abreu, Orlando; Alvear, Daniel
2016-01-01
This book presents an overview of modeling definitions and concepts, theory on human behavior and human performance data, available tools and simulation approaches, model development, and application and validation methods. It considers the data and research efforts needed to develop and incorporate functions for the different parameters into comprehensive escape and evacuation simulations, with a number of examples illustrating different aspects and approaches. After an overview of basic modeling approaches, the book discusses benefits and challenges of current techniques. The representation of evacuees is a central issue, including human behavior and the proper implementation of representational tools. Key topics include the nature and importance of the different parameters involved in ASET and RSET and the interactions between them. A review of the current literature on verification and validation methods is provided, with a set of recommended verification tests and examples of validation tests. The book c...
DEFF Research Database (Denmark)
Arnoldi, Jakob
The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics among other things claim can lead to market manipulation. Drawing … The article analyses these challenges and argues that we witness a new post-social form of human-technology interaction that will lead to a reconfiguration of professional codes for financial trading.
Vincent, Julian F V
2003-01-01
Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more compl...
Energy Technology Data Exchange (ETDEWEB)
McIllvaine, C M
1994-07-01
Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO{sub 2}), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically nitrogen oxides (NO{sub x}) and non-methane organic compounds (NMOC) in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NO{sub x} concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e. background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NO{sub x} coordinates of the point, known as the NMOC/NO{sub x} ratio. Results obtained by the described model are presented.
International Nuclear Information System (INIS)
McIllvaine, C.M.
1994-01-01
Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO 2 ), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically nitrogen oxides (NO x ) and non-methane organic compounds (NMOC) in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NO x concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e. background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NO x coordinates of the point, known as the NMOC/NO x ratio. Results obtained by the described model are presented.
Walker, Ellen A
2010-01-01
As clinical studies reveal that chemotherapeutic agents may impair several different cognitive domains in humans, the development of preclinical animal models is critical to assess the degree of chemotherapy-induced learning and memory deficits and to understand the underlying neural mechanisms. In this chapter, the effects of various cancer chemotherapeutic agents in rodents on sensory processing, conditioned taste aversion, conditioned emotional response, passive avoidance, spatial learning, cued memory, discrimination learning, delayed-matching-to-sample, novel-object recognition, electrophysiological recordings and autoshaping are reviewed. At first glance, the effects of the cancer chemotherapy agents in these many different models appear inconsistent. However, a literature is emerging that reveals subtle or unique changes in sensory processing, acquisition, consolidation and retrieval that are dose- and time-dependent. As more studies examine cancer chemotherapeutic agents alone and in combination during repeated treatment regimens, the animal models will become more predictive tools for the assessment of these impairments and the underlying neural mechanisms. The eventual goal is to collect enough data to enable physicians to make informed choices about therapeutic regimens for their patients and to discover new avenues of alternative or complementary therapies that reduce or eliminate chemotherapy-induced cognitive deficits.
Energy Technology Data Exchange (ETDEWEB)
Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas
2005-11-01
Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.
Energy Technology Data Exchange (ETDEWEB)
Ahmed E. Hassan
2006-01-24
Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation
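The acceptance check this abstract alludes to, counting how many stochastic realizations conform with the validation data, can be sketched as follows. This is an illustrative simplification, not the paper's actual five measures or decision tree; the RMSE metric, the threshold value, and the sample data are assumptions made here for the sketch:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between one realization's predictions and field data."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def acceptable_fraction(realizations, obs, threshold):
    """Fraction of stochastic realizations whose RMSE against the
    validation data falls below an acceptance threshold."""
    accepted = sum(1 for r in realizations if rmse(r, obs) <= threshold)
    return accepted / len(realizations)

# Hypothetical field observations and three model realizations
obs = [1.0, 2.0, 3.0]
realizations = [
    [1.1, 2.0, 2.9],   # close to the field data -> accepted
    [1.0, 2.1, 3.1],   # accepted
    [3.0, 0.5, 5.0],   # far from the field data -> rejected
]
frac = acceptable_fraction(realizations, obs, threshold=0.5)
```

A decision rule of the kind described would then compare `frac` against a required proportion of acceptable realizations before moving to the next measure in the hierarchy.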
Energy Technology Data Exchange (ETDEWEB)
Chandler, Graham
2011-03-15
Ken Dedeluk is the president and CEO of Computer Modeling Group (CMG). Dedeluk started his career with Gulf Oil in 1972, working in computer-assisted design; he then joined Imperial Esso and Shell, where he became VP of international operations, and finally joined CMG in 1998. CMG made a decision that turned out to be the company's turning point: it decided to provide intensive support and service to help customers make better use of its technology. Thanks to this service, customer satisfaction grew, as did revenues.
Model-based software process improvement
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
International Nuclear Information System (INIS)
Fawaz, S.; Khan, Zulfiquar A.; Mossa, Samir Y.
2006-01-01
A new definition is proposed for analyzing the consultation in primary health care. It integrates other models of consultation and provides a framework by which general practitioners can apply the principles of consultation, using communication skills to reconcile the respective agendas and the autonomy of both doctor and patient into a negotiated, agreed plan that covers both the management of health problems and health promotion. The success of a consultation depends on time and on the mutual cooperation between patient and doctor shown in the doctor-patient relationship. (author)
Stickler, Leslie; Sykes, Gary
2016-01-01
This report reviews the scholarly and research evidence supporting the construct labeled modeling and explaining content (MEC), which is measured via a performance assessment in the "ETS"® National Observational Teaching Examination (NOTE) assessment series. This construct involves practices at the heart of teaching that deal with how…
Energy Technology Data Exchange (ETDEWEB)
Andrade, Jose Geraldo Pena de; Koelle, Edmundo; Luvizotto Junior, Edevar [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Civil. Dept. de Hidraulica e Saneamento
1997-07-01
This paper presents a complete mathematical and computer model which allows simulating a generic hydroelectric power plant under steady state and transitory regimes, in the extensive time, and also the analysis of the oscillating flows resulting from excitation sources present in the installation, such as vortices in the suction pipe during partial load operation.
International Nuclear Information System (INIS)
Choi, Sang Hyoun
2007-08-01
Ajou University School of Medicine made the serially sectioned anatomical images from the Visible Korean Human (VKH) Project in Korea. The VKH images, which are high-resolution color photographic images, show the organs and tissues in the human body very clearly at 0.2 mm intervals. In this study, we constructed a high-quality voxel model (VKH-Man) with a total of 30 organs and tissues by manual and automatic segmentation using the serially sectioned anatomical image data from the VKH project. The height and weight of the VKH-Man voxel model are 164 cm and 57.6 kg, respectively, and the voxel resolution is 1.875 x 1.875 x 2 mm 3 . However, this voxel phantom can be used to calculate the organ and tissue doses of only one person. Therefore, in this study, we adjusted the voxel phantom to the 'Reference Korean' data to construct a voxel phantom that represents the radiation workers in Korea. The height and weight of the finally developed voxel model (HDRK-Man) are 171 cm and 68 kg, respectively, and the voxel resolution is 1.981 x 1.981 x 2.0854 mm 3 . The VKH-Man and HDRK-Man voxel models were implemented in a Monte Carlo particle transport simulation code for calculation of the organ and tissue doses in various irradiation geometries. The calculated values were compared with each other to see the effect of the adjustment, and also compared with other computational models (KTMAN-2, ICRP-74 and VIP-Man). According to the results, the adjustment of the voxel model was found to hardly affect the dose calculations, and most of the organ and tissue equivalent doses showed some differences among the models. These results show that differences in figure and organ topology affect the organ doses more than organ size. The calculated values of the effective dose from VKH-Man and HDRK-Man according to ICRP-60 and the upcoming ICRP recommendation were compared. For the other radiation geometries (AP, LLAT, RLAT) except for PA
PATHS groundwater hydrologic model
Energy Technology Data Exchange (ETDEWEB)
Nelson, R.W.; Schur, J.A.
1980-04-01
A preliminary evaluation capability for two-dimensional groundwater pollution problems was developed as part of the Transport Modeling Task for the Waste Isolation Safety Assessment Program (WISAP). Our approach was to use the data limitations as a guide in setting the level of modeling detail. PATHS Groundwater Hydrologic Model is the first level (simplest) idealized hybrid analytical/numerical model for two-dimensional, saturated groundwater flow and single component transport; homogeneous geology. This document consists of the description of the PATHS groundwater hydrologic model. The preliminary evaluation capability prepared for WISAP, including the enhancements that were made because of the authors' experience using the earlier capability is described. Appendixes A through D supplement the report as follows: complete derivations of the background equations are provided in Appendix A. Appendix B is a comprehensive set of instructions for users of PATHS. It is written for users who have little or no experience with computers. Appendix C is for the programmer. It contains information on how input parameters are passed between programs in the system. It also contains program listings and test case listing. Appendix D is a definition of terms.
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Cannerfelt, B; Nystedt, J; Jönsen, A; Lätt, J; van Westen, D; Lilja, A; Bengtsson, A; Nilsson, P; Mårtensson, J; Sundgren, P C
2018-06-01
Aim The aim of this study was to evaluate the extent of white matter lesions, atrophy of the hippocampus and corpus callosum, and their correlation with cognitive dysfunction (CD), in patients diagnosed with systemic lupus erythematosus (SLE). Methods Seventy SLE patients and 25 healthy individuals (HIs) were included in the study. To evaluate the different SLE and neuropsychiatric SLE (NPSLE) definition schemes, patients were grouped both according to the American College of Rheumatology (ACR) definition, as well as the more stringent ACR-Systemic Lupus International Collaborating Clinics definition. Patients and HIs underwent a 3 Tesla brain MRI and a standardized neuropsychological test. MRI data were evaluated for number and volume of white matter lesions and atrophy of the hippocampus and corpus callosum. Differences between groups and subgroups were evaluated for significance. Number and volume of white matter lesions and atrophy of the hippocampus and corpus callosum were correlated to cognitive dysfunction. Results The total volume of white matter lesions was significantly larger in SLE patients compared to HIs (p = 0.004). However, no significant differences were seen between the different SLE subgroups. Atrophy of the bilateral hippocampus was significantly more pronounced in patients with NPSLE compared to those with non-NPSLE (right: p = 0.010; left: p = 0.023). Significant negative correlations between cognitive test scores on verbal memory and number and volume of white matter lesions were present. Conclusion SLE patients have a significantly larger volume of white matter lesions on MRI compared to HIs and the degree of white matter lesion volume correlates to cognitive dysfunction, specifically to verbal memory. No significant differences in the number or volume of white matter lesions were identified between subgroups of SLE patients regardless of the definition model used.
1989-01-01
A wooden model of the ALEPH experiment and its cavern. ALEPH was one of 4 experiments at CERN's 27km Large Electron Positron collider (LEP) that ran from 1989 to 2000. During 11 years of research, LEP's experiments provided a detailed study of the electroweak interaction. Measurements performed at LEP also proved that there are three – and only three – generations of particles of matter. LEP was closed down on 2 November 2000 to make way for the construction of the Large Hadron Collider in the same tunnel. The cavern and detector are in separate locations - the cavern is stored at CERN and the detector is temporarily on display in Glasgow physics department. Both are available for loan.
Directory of Open Access Journals (Sweden)
Robert F. Love
2001-01-01
Full Text Available Distance predicting functions may be used in a variety of applications for estimating travel distances between points. To evaluate the accuracy of a distance predicting function and to determine its parameters, a goodness-of-fit criterion is employed. AD (Absolute Deviations), SD (Squared Deviations) and NAD (Normalized Absolute Deviations) are the three criteria most often employed in practice. In the literature some assumptions have been made about the properties of each criterion. In this paper, we present statistical analyses performed to compare the three criteria from different perspectives. For this purpose, we employ the ℓkpθ-norm as the distance predicting function, and statistically compare the three criteria by using normalized absolute prediction error distributions in seventeen geographical regions. We find that there exist no significant differences between the criteria. However, since the criterion SD has desirable properties in terms of distance modelling procedures, we suggest its use in practice.
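The three criteria can be illustrated with a small sketch. The distance predicting function below is a weighted ℓp norm with an optional axis rotation, a simplified stand-in for the paper's ℓkpθ-norm; the parameter values, point pairs, and "actual" distances are illustrative assumptions, not fitted results from the study:

```python
import math

def lp_distance(a, b, k=1.2, p=1.7, theta=0.0):
    """Weighted l_p distance between planar points a and b, with an
    optional rotation of the coordinate axes by theta (radians).
    The k, p, theta values here are illustrative, not fitted."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    # rotate the coordinate axes by theta before applying the norm
    rx = dx * math.cos(theta) + dy * math.sin(theta)
    ry = -dx * math.sin(theta) + dy * math.cos(theta)
    return k * (abs(rx) ** p + abs(ry) ** p) ** (1.0 / p)

def fit_criteria(pairs, actual, **params):
    """Return (AD, SD, NAD) for predicted vs actual distances."""
    pred = [lp_distance(a, b, **params) for a, b in pairs]
    ad = sum(abs(ph - d) for ph, d in zip(pred, actual))       # Absolute Deviations
    sd = sum((ph - d) ** 2 for ph, d in zip(pred, actual))     # Squared Deviations
    nad = sum(abs(ph - d) / d for ph, d in zip(pred, actual))  # Normalized Absolute Deviations
    return ad, sd, nad

# Hypothetical point pairs and observed (e.g. road) distances
pairs = [((0, 0), (3, 4)), ((1, 1), (4, 5)), ((0, 0), (6, 8))]
actual = [5.8, 5.9, 11.6]
ad, sd, nad = fit_criteria(pairs, actual, k=1.15, p=2.0)
```

Fitting would then minimize one of the three criteria over k, p and theta; which criterion is minimized is exactly the choice the paper examines.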
International Nuclear Information System (INIS)
Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M
2014-01-01
The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and using the network more intelligently.
Progress in modeling and simulation.
Kindler, E
1998-01-01
For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented on a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), an outline of their applications, and of their further development. Given that computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.
Controversy around the definition of waste
CSIR Research Space (South Africa)
Oelofse, Suzanna HH
2009-11-20
Full Text Available This paper presents information concerning the definition of waste, discussing the importance of a clear definition, ongoing debates, the broad definition of waste, problems with the broad definition, its interpretation, and the current waste management model...
Comparison: Binomial model and Black Scholes model
Directory of Open Access Journals (Sweden)
Amir Ahmad Dar
2018-03-01
Full Text Available The Binomial Model and the Black-Scholes Model are popular methods used to solve option pricing problems. The Binomial Model is a simple statistical method, while the Black-Scholes model requires the solution of a stochastic differential equation. Pricing European call and put options is a difficult task for actuaries. The main goal of this study is to compare the Binomial model and the Black-Scholes model using two statistical tests, the t-test and the Tukey test, at one period. The result showed that there is no significant difference between the means of the European option prices obtained using the two models.
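The two pricing approaches compared in this abstract can be sketched side by side. This is a minimal illustration of the standard Cox-Ross-Rubinstein binomial tree against the closed-form Black-Scholes formula for a European call; the parameter values are assumptions chosen for the example, and the convergence check is ours, not the paper's t-test/Tukey analysis:

```python
import math

def black_scholes_call(S, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def binomial_call(S, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial price of a European call with n steps."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    q = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    price = 0.0
    for j in range(n + 1):
        prob = math.comb(n, j) * q ** j * (1 - q) ** (n - j)
        payoff = max(S * u ** j * d ** (n - j) - K, 0.0)
        price += prob * payoff
    return math.exp(-r * T) * price

# Illustrative at-the-money parameters: S=K=100, r=5%, sigma=20%, T=1 year
bs = black_scholes_call(100, 100, 0.05, 0.2, 1.0)
bi = binomial_call(100, 100, 0.05, 0.2, 1.0, 500)
```

As the number of binomial steps grows, the tree price converges to the Black-Scholes price, which is consistent with the abstract's finding of no significant difference between the two models' means.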
International Nuclear Information System (INIS)
Schreckenberg, M
2004-01-01
This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and a more accurate citation!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)
Computational Modeling | Bioenergy | NREL
Computational Modeling: NREL uses computational modeling to increase the … cell walls and are the source of biofuels and biomaterials; our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers …
Agent oriented modeling of business information systems
Vymetal, Dominik
2009-01-01
Enterprise modeling is the abstract definition of the processes running in an enterprise using process, value, data and resource models. There are two perspectives on business modeling: the process perspective and the value chain perspective. Both have advantages and disadvantages. This paper proposes a combination of both perspectives into one generic model. The model also takes the social part of the enterprise system into consideration and pays attention to disturbances influencing the enterprise system....
Computational Modeling of Culture's Consequences
Hofstede, G.J.; Jonker, C.M.; Verwaart, T.
2010-01-01
This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, and (b) knowledge acquisition based on a dimensional theory of culture,
Kyriacou, Chris; Sutcliffe, John
1978-01-01
A definition and model of teacher stress is presented which conceptualizes teacher stress as a response syndrome (anger or depression) mediated by (1) an appraisal of threat to the teacher's self-esteem or well-being and (2) coping mechanisms activated to reduce the perceived threat. (Author)
Marginal Models for Categorical Data
Bergsma, W.P.; Rudas, T.
2002-01-01
Statistical models defined by imposing restrictions on marginal distributions of contingency tables have received considerable attention recently. This paper introduces a general definition of marginal log-linear parameters and describes conditions for a marginal log-linear parameter to be a smooth
Essays on model uncertainty in financial models
Li, Jing
2018-01-01
This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the
Criterion of Semi-Markov Dependent Risk Model
Institute of Scientific and Technical Information of China (English)
Xiao Yun MO; Xiang Qun YANG
2014-01-01
A rigorous definition of the semi-Markov dependent risk model is given. This model is a generalization of the Markov dependent risk model. A criterion and necessary conditions for the semi-Markov dependent risk model are obtained. The results make the relations between elements of the semi-Markov dependent risk model clearer and are also applicable to the Markov dependent risk model.
Energy Technology Data Exchange (ETDEWEB)
Blanchard, Miran [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Shim, Kevin G. [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Grams, Michael P. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Rajani, Karishma; Diaz, Rosa M. [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Furutani, Keith M. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Thompson, Jill [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Olivier, Kenneth R.; Park, Sean S. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Markovic, Svetomir N. [Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Department of Medical Oncology, Mayo Clinic, Rochester, Minnesota (United States); Pandha, Hardev [The Postgraduate Medical School, University of Surrey, Guildford (United Kingdom); Melcher, Alan [Leeds Institute of Cancer Studies and Pathology, University of Leeds, Leeds (United Kingdom); Harrington, Kevin [Targeted Therapy Laboratory, The Institute of Cancer Research, London (United Kingdom); Zaidi, Shane [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Targeted Therapy Laboratory, The Institute of Cancer Research, London (United Kingdom); Vile, Richard, E-mail: vile.richard@mayo.edu [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Leeds Institute of Cancer Studies and Pathology, University of Leeds, Leeds (United Kingdom)
2015-11-01
Purpose: The oligometastatic state is an intermediate state between a malignancy that can be completely eradicated with conventional modalities and one in which a palliative approach is undertaken. Clinically, high rates of local tumor control are possible with stereotactic ablative radiation therapy (SABR), using precisely targeted, high-dose, low-fraction radiation therapy. However, in oligometastatic melanoma, virtually all patients develop progression systemically at sites not initially treated with ablative radiation therapy that cannot be managed with conventional chemotherapy and immunotherapy. We have demonstrated in mice that intravenous administration of vesicular stomatitis virus (VSV) expressing defined tumor-associated antigens (TAAs) generates systemic immune responses capable of clearing established tumors. Therefore, in the present preclinical study, we tested whether the combination of systemic VSV-mediated antigen delivery and SABR would be effective against oligometastatic disease. Methods and Materials: We generated a model of oligometastatic melanoma in C57BL/6 immunocompetent mice and then used a combination of SABR and systemically administered VSV-TAA viral immunotherapy to treat both local and systemic disease. Results: Our data showed that SABR generates excellent control or cure of local, clinically detectable, and accessible tumor through direct cell ablation. Also, the immunotherapeutic activity of systemically administered VSV-TAA generated T-cell responses that cleared subclinical metastatic tumors. We also showed that SABR induced weak T-cell-mediated tumor responses, which, particularly if boosted by VSV-TAA, might contribute to control of local and systemic disease. In addition, VSV-TAA therapy alone had significant effects on control of both local and metastatic tumors. Conclusions: We have shown in the present preliminary murine study using a single tumor model that this approach represents an effective, complementary
International Nuclear Information System (INIS)
Blanchard, Miran; Shim, Kevin G.; Grams, Michael P.; Rajani, Karishma; Diaz, Rosa M.; Furutani, Keith M.; Thompson, Jill; Olivier, Kenneth R.; Park, Sean S.; Markovic, Svetomir N.; Pandha, Hardev; Melcher, Alan; Harrington, Kevin; Zaidi, Shane; Vile, Richard
2015-01-01
Vector models and generalized SYK models
Energy Technology Data Exchange (ETDEWEB)
Peng, Cheng [Department of Physics, Brown University,Providence RI 02912 (United States)
2017-05-23
We consider the relation between SYK-like models and vector models by studying a toy model where a tensor field is coupled with a vector field. By integrating out the tensor field, the toy model reduces to the Gross-Neveu model in 1 dimension. On the other hand, a certain perturbation can be turned on and the toy model flows to an SYK-like model at low energy. A chaotic-nonchaotic phase transition occurs as the sign of the perturbation is altered. We further study similar models that possess chaos and enhanced reparameterization symmetries.
A model for persistency of egg production
Grossman, M.; Gossman, T.N.; Koops, W.J.
2000-01-01
The objectives of our study were to propose a new definition for persistency of egg production and to develop a mathematical model to describe the egg production curve, one that includes a new measure for persistency, based on the proposed definition, for use as a selection criterion to improve
Hauksson, Hilmar
2013-01-01
Interest in business models and business modeling has increased rapidly since the mid-1990s, and there are numerous approaches used to create business models. The business model concept has many definitions, which can lead to confusion and slower progress in the research and development of business models. A business model ontology (BMO) was created in 2004, in which the business model concept was conceptualized based on an analysis of the existing literature. A few years later the Business Model Can...
Directory of Open Access Journals (Sweden)
Marcus Alessi Bittencourt
2016-07-01
Full Text Available An indisputable cornerstone of the Western music tradition, the dialectic opposition between the major and minor grammatical modal genders has been present in the imagination of musicians and music theorists for centuries. Such a dialectic of opposition is especially important in the context of nineteenth-century harmonic dualism, with its ideas of tonicity and phonicity. These concepts serve as the main foundation for the way harmonic dualism conceives the major and minor worlds: two worlds with equivalent rights and properties, but with opposed polarities. This paper presents a redefinition of the terms tonicity and phonicity, translating those concepts to the context of post-tonal music theory. The terminologies of generatrix, tonicity, root, phonicity, vertex, and azimuth are explained in this paper, followed by propositions of mathematical models for those concepts, which spring from Richard Parncutt's root-salience model for pitch-class sets. In order to demonstrate the possibilities of using modal gender as a criterion for the study and classification of the universe of Tn-types, we will present a taxonomy of the 351 transpositional set types, which comprises the categories of tonic (major), phonic (minor) and neutral (genderless). In addition, there will be a small discussion of the effect of set symmetries and set asymmetries on the tonic/phonic properties of a Tn-type.
Business model elements impacting cloud computing adoption
DEFF Research Database (Denmark)
Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek
The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...
Modelling and Forecasting Multivariate Realized Volatility
DEFF Research Database (Denmark)
Halbleib, Roxana; Voev, Valeri
2011-01-01
This paper proposes a methodology for dynamic modelling and forecasting of realized covariance matrices based on fractionally integrated processes. The approach allows for flexible dependence patterns and automatically guarantees positive definiteness of the forecast. We provide an empirical appl...
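The positive-definiteness guarantee mentioned in the abstract can be illustrated with a small sketch: if the dynamics are modeled on the Cholesky factors of the realized covariance matrices rather than on the matrices themselves, any forecast reconstructed from a factor is positive (semi-)definite by construction. The data and the perturbation standing in for a forecast below are purely illustrative, not the paper's fractionally integrated model.

```python
import numpy as np

# Sketch of why modeling Cholesky factors guarantees a valid forecast:
# any matrix of the form L @ L.T is positive (semi-)definite, so a forecast
# of L, however produced, reconstructs to a usable covariance matrix.
# The perturbation below is a stand-in for a dynamic forecast, not the
# paper's fractionally integrated model.
rng = np.random.default_rng(0)

X = rng.standard_normal((100, 3))
rc = X.T @ X / 100                       # a "realized covariance" observation

L = np.linalg.cholesky(rc)               # model/forecast the factor, not rc
L_forecast = L + 0.01 * np.tril(rng.standard_normal((3, 3)))

rc_forecast = L_forecast @ L_forecast.T  # positive definite by construction
```

Forecasting the factor elements directly is what removes the need for any post-hoc correction of the forecast matrix.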
Monica, Ratti Maria; Delli Zotti, Giulia Bruna; Spotti, Donatella; Sarno, Lucio
2014-01-01
Chronic Kidney Disease (CKD) and dialytic treatment have a significant psychological impact on patients, their families, and the medical-nursing staff. The psychological aspects linked to the chronic condition of kidney disease generate the need to integrate a psychologist into the healthcare team of the Nephrology, Dialysis and Hypertension Operative Unit, in order to offer specific and professional support to the patient during the different stages of the disease, to their caregivers, and to the medical team. The aim of this collaboration project between Nephrology and Psychology is to create a global and integrated healthcare model, one that attends not only to the physical dimension of patients affected by CKD but also to the emotional-affective, cognitive and social dimensions and to the health environment.
Tryby, M.; Fries, J. S.; Baranowski, C.
2014-12-01
Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff made within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current format, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts to the utility and customers served from flooding and stormwater issues.
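The kind of adjustment described, scaling historical design-storm depths by downscaled projection change factors before they enter SWMM, reduces to simple arithmetic; all depths and multipliers below are made-up illustrations, not CREAT data:

```python
# Illustrative sketch: scaling historical 24-hour design-storm depths by
# downscaled climate-projection change factors before they feed a SWMM
# design storm. All depths and multipliers are hypothetical stand-ins for
# IDF-curve data and CREAT projections.
historical_depth_in = {5: 3.2, 25: 4.6, 100: 6.1}   # return period (yr) -> inches
projection_factor = {5: 1.08, 25: 1.12, 100: 1.15}  # hypothetical change factors

projected_depth_in = {
    rp: round(depth * projection_factor[rp], 2)
    for rp, depth in historical_depth_in.items()
}
```

The projected depths would then be distributed over a 24-hour hyetograph in the usual way to form the design storm input.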
Directory of Open Access Journals (Sweden)
David Lo Buglio
2012-12-01
Full Text Available With the arrival of digital technologies in the field of architectural documentation, many tools and methods for data acquisition have been considerably developed. However, these developments primarily concern the recording of the colorimetric and dimensional properties of the objects processed. The actors in the disciplines concerned with 3D digitization of architectural heritage are thus faced with a large quantity of data, leaving the survey far from its cognitive dimension. In this context, it seems necessary to provide innovative solutions to increase the informational value of the representations produced by strengthening the relations between the "multiplicity" of data and the "intelligibility" of the theoretical model. To address this methodological gap, this article offers an approach to the creation of representation systems that articulate the digital instance with the geometric/semantic model.
Modeling styles in business process modeling
Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.
2012-01-01
Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording
The IMACLIM model; Le modele IMACLIM
Energy Technology Data Exchange (ETDEWEB)
NONE
2003-07-01
This document provides annexes to the IMACLIM model, offering an updated description of IMACLIM, a model that supports the design of a tool for evaluating greenhouse gas reduction policies. The model is described in a version coupled with POLES, a technical and economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)
From Product Models to Product State Models
DEFF Research Database (Denmark)
Larsen, Michael Holm
1999-01-01
A well-known technology designed to handle product data is Product Models. Product Models are in their current form not able to handle all types of product state information. Hence, the concept of a Product State Model (PSM) is proposed. The PSM and in particular how to model a PSM is the Research...
MOS modeling hierarchy including radiation effects
International Nuclear Information System (INIS)
Alexander, D.R.; Turfler, R.M.
1975-01-01
A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits
International Nuclear Information System (INIS)
Serre, S.
2010-01-01
This research thesis first describes the problem of the effects of natural radiation on micro- and nano-electronic components and the radiative stress due to atmospheric neutrons of cosmic origin: the issue of 'single event upsets' and present knowledge of the atmospheric radiative environment induced by cosmic rays. The author then presents neutron detection and spectrometry using the Bonner sphere technique: the principle of moderating spheres, the definition and mathematical formulation of neutron spectrometry using Bonner spheres, active sensors of thermal neutrons, the response of a conventional Bonner sphere system, and the extension to the high-energy range. He then reports the development of a Bonner sphere system extended to the high-energy range for the spectrometry of atmospheric neutrons: definition of a conventional system, Monte Carlo calculation of response functions, development of the response matrix, representation and semi-empirical verification of the fluence response, uncertainty analysis, extension to high energies, and measurement tests of the spectrometer. Finally, he reports the use of Monte Carlo simulation to characterize the spectrometer response in the high-energy range.
Modelling live forensic acquisition
CSIR Research Space (South Africa)
Grobler, MM
2009-06-01
Full Text Available This paper discusses the development of a South African model for Live Forensic Acquisition - Liforac. The Liforac model is a comprehensive model that presents a range of aspects related to Live Forensic Acquisition. The model provides forensic...
Functional Decomposition of Modeling and Simulation Terrain Database Generation Process
National Research Council Canada - National Science Library
Yakich, Valerie R; Lashlee, J. D
2008-01-01
.... This report documents the conceptual procedure as implemented by Lockheed Martin Simulation, Training, and Support and decomposes terrain database construction using the Integration Definition for Function Modeling (IDEF...
Stochastic dynamical models for ecological regime shifts
DEFF Research Database (Denmark)
Møller, Jan Kloppenborg; Carstensen, Jacob; Madsen, Henrik
the physical and biological knowledge of the system, and nonlinearities introduced here can generate regime shifts or enhance the probability of regime shifts in the case of stochastic models, typically characterized by a threshold value for the known driver. A simple model for light competition between...... definition and stability of regimes become less subtle. Ecological regime shifts and their modeling must be viewed in a probabilistic manner, particularly if such model results are to be used in ecosystem management....
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
The Shuttle Cost and Price model
Leary, Katherine; Stone, Barbara
1983-01-01
The Shuttle Cost and Price (SCP) model was developed as a tool to assist in evaluating major aspects of Shuttle operations that have direct and indirect economic consequences. It incorporates the major aspects of NASA Pricing Policy and corresponds to the NASA definition of STS operating costs. An overview of the SCP model is presented and the cost model portion of SCP is described in detail. Selected recent applications of the SCP model to NASA Pricing Policy issues are presented.
Mobility Models for Systems Evaluation
Musolesi, Mirco; Mascolo, Cecilia
Mobility models are used to simulate and evaluate the performance of mobile wireless systems and the algorithms and protocols that underpin them. The definition of realistic mobility models is one of the most critical and, at the same time, most difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
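As a concrete illustration of a synthetic model, the classic random-waypoint pattern, one of the simplest equation-based mobility models, can be sketched in a few lines; the area, speed range, and seed below are illustrative parameters, not values from the text:

```python
import random

def random_waypoint(steps, area=(1000.0, 1000.0), speed=(1.0, 5.0), seed=42):
    """Minimal sketch of the classic random-waypoint synthetic mobility model:
    a node repeatedly picks a random destination and speed, then moves toward
    it in unit time steps. Parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
    v = rng.uniform(*speed)
    trace = []
    for _ in range(steps):
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= v:  # waypoint reached: pick a new destination and speed
            x, y = dest
            dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
            v = rng.uniform(*speed)
        else:          # otherwise advance one step along the segment
            x, y = x + v * dx / dist, y + v * dy / dist
        trace.append((x, y))
    return trace

trace = random_waypoint(100)
```

A trace like this can then be fed to a simulator in place of measured location logs.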
Stochastic Subspace Modelling of Turbulence
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Pedersen, B. J.; Nielsen, Søren R.K.
2009-01-01
Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper, based on the positive definite cross-spectral density matrix, a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order, and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modelling method.
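The idea of obtaining a turbulence time series as a linear filtration of Gaussian white noise can be sketched by shaping the noise amplitude spectrum in the frequency domain; the target spectrum and all constants below are illustrative assumptions, not the paper's state space method:

```python
import numpy as np

# Illustrative sketch (not the paper's state space method): generate a
# turbulence-like time series as a linear filtration of Gaussian white
# noise by shaping its amplitude spectrum in the frequency domain.
rng = np.random.default_rng(1)
n, dt = 4096, 0.1                  # sample count and time step (assumed)
freq = np.fft.rfftfreq(n, dt)

# Assumed one-sided target spectrum with a turbulence-like power-law decay.
spectrum = 1.0 / (1.0 + (10.0 * freq) ** 2) ** (5.0 / 6.0)

white = np.fft.rfft(rng.standard_normal(n))      # white noise, frequency domain
series = np.fft.irfft(white * np.sqrt(spectrum), n)
```

For the multivariate case treated in the paper, the scalar square root is replaced by a matrix factor of the cross-spectral density matrix, which is why its positive definiteness matters.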
Improved Maximum Parsimony Models for Phylogenetic Networks.
Van Iersel, Leo; Jones, Mark; Scornavacca, Celine
2018-05-01
Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that permits to model biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, the "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.
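For readers unfamiliar with parsimony scoring, the tree-based small-parsimony computation that the hardwired and softwired network definitions generalize is the classic Fitch algorithm; a minimal sketch follows (names and data are illustrative, not the authors' algorithms):

```python
def fitch_score(tree, leaf_states):
    """Classic Fitch small-parsimony count on a rooted binary tree, the
    tree-based baseline that network parsimony definitions generalize.
    `tree` is a nested tuple of leaf names; `leaf_states` maps leaf -> state."""
    changes = 0

    def state_set(node):
        nonlocal changes
        if isinstance(node, str):          # leaf: its observed state
            return {leaf_states[node]}
        left, right = (state_set(child) for child in node)
        inter = left & right
        if inter:                          # children agree on a state
            return inter
        changes += 1                       # disagreement forces one change
        return left | right

    state_set(tree)
    return changes

# ((A,B),(C,D)) with states A=C=0 and B=D=1 requires 2 changes.
score = fitch_score((("A", "B"), ("C", "D")),
                    {"A": "0", "B": "1", "C": "0", "D": "1"})
```

On a network, the same character must additionally be scored over the reticulation edges, which is where the hardwired/softwired distinction arises.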
Models in architectural design
Pauwels, Pieter
2017-01-01
Whereas architects and construction specialists used to rely mainly on sketches and physical models as representations of their own cognitive design models, they rely now more and more on computer models. Parametric models, generative models, as-built models, building information models (BIM), and so forth, they are used daily by any practitioner in architectural design and construction. Although processes of abstraction and the actual architectural model-based reasoning itself of course rema...
International Nuclear Information System (INIS)
Tozini, A.V.
1984-01-01
A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models have Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to how Friedmann's model generalizes Einstein's. (L.C.) [pt
Concept Modeling vs. Data modeling in Practice
DEFF Research Database (Denmark)
Madsen, Bodil Nistrup; Erdman Thomsen, Hanne
2015-01-01
This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models. We also show how to map from the various elements in the terminological ontology to elements in the data models, and explain the differences between the models. Finally the usefulness of terminological ontologies as a prerequisite for IT development and data modeling is illustrated with examples from...
Model-to-model interface for multiscale materials modeling
Energy Technology Data Exchange (ETDEWEB)
Antonelli, Perry Edward [Iowa State Univ., Ames, IA (United States)
2017-12-17
A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and comparing it with our own molecular dynamics code.
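The idea of a fixed set of export/import functions can be sketched in a few lines: both models implement the same small interface, so the coupling code never touches their internals. All class and field names below are hypothetical illustrations, not the actual interface or its LAMMPS implementation:

```python
# Hypothetical sketch of a standard export/import interface for coupling
# independent models without changing their internals. Every name here is
# an illustrative assumption, not the paper's actual function set.
class ModelInterface:
    def export_schema(self):        # describe what this model can provide
        raise NotImplementedError
    def export_data(self):
        raise NotImplementedError
    def import_data(self, payload):
        raise NotImplementedError

class AtomisticModel(ModelInterface):
    def __init__(self):
        self.wall_velocity = 0.0
    def export_schema(self):
        return {"stress": "Pa"}
    def export_data(self):
        return {"stress": 1.5e5}    # stand-in for an MD-computed wall stress
    def import_data(self, payload):
        self.wall_velocity = payload["velocity"]

class ContinuumModel(ModelInterface):
    def export_schema(self):
        return {"velocity": "m/s"}
    def export_data(self):
        return {"velocity": 0.25}   # stand-in for a fluid-flow boundary value
    def import_data(self, payload):
        self.boundary_stress = payload["stress"]

def couple(a, b):
    """One exchange step: each model imports the other's exported fields."""
    b.import_data(a.export_data())
    a.import_data(b.export_data())

md, fluid = AtomisticModel(), ContinuumModel()
couple(md, fluid)
```

Because `couple` depends only on the interface, either side could be swapped for a different code with no change to the other model.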
Cognitive models embedded in system simulation models
International Nuclear Information System (INIS)
Siegel, A.I.; Wolf, J.J.
1982-01-01
If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context
Finsler Geometry Modeling of an Orientation-Asymmetric Surface Model for Membranes
Proutorov, Evgenii; Koibuchi, Hiroshi
2017-12-01
In this paper, a triangulated surface model is studied in the context of Finsler geometry (FG) modeling. This FG model is an extended version of a recently reported model for two-component membranes, and it is asymmetric under surface inversion. We show that the definition of the model is independent of how the Finsler length of a bond is defined. This leads us to understand that the canonical (or Euclidean) surface model is obtained from the FG model such that it is uniquely determined as a trivial model from the viewpoint of well-definedness.
Model Manipulation for End-User Modelers
DEFF Research Database (Denmark)
Acretoaie, Vlad
End-user modelers are domain experts who create and use models as part of their work. They are typically not Software Engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often requires such experience. These languages are therefore only used by a small subset of the modelers that could, in theory, benefit from them. The goals of this thesis are to substantiate this observation, introduce the concepts and tools required to overcome it, and provide empirical evidence in support... ...and transformations using their modeling notation and editor of choice. The VM* languages are implemented via a single execution engine, the VM* Runtime, built on top of the Henshin graph-based transformation engine. This approach combines the benefits of flexibility, maturity, and formality. To simplify model editor...
Air Quality Dispersion Modeling - Alternative Models
Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.
Topological massive sigma models
International Nuclear Information System (INIS)
Lambert, N.D.
1995-01-01
In this paper we construct topological sigma models which include a potential and are related to twisted massive supersymmetric sigma models. Contrary to a previous construction these models have no central charge and do not require the manifold to admit a Killing vector. We use the topological massive sigma model constructed here to simplify the calculation of the observables. Lastly it is noted that this model can be viewed as interpolating between topological massless sigma models and topological Landau-Ginzburg models. ((orig.))
Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher
2014-01-01
The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...
Function Model for Community Health Service Information
Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong
In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard extended from the Structured Analysis and Design Technique (SADT) and now a widely used function modeling method, was used to classify its information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information was then established, comprising 4 super-classes, 15 classes and 28 sub-classes of business function, 43 business processes and 168 business activities. This model can facilitate information management system development and workflow refinement.
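The top-down decomposition and coding described above can be sketched as a small tree structure. This is an illustrative sketch only: the class names and the A0/A1/A11 numbering convention below follow common IDEF0 practice, not the paper's actual model.

```python
class Function:
    """One node in a top-down function decomposition (IDEF0-style)."""

    def __init__(self, name, code="A0"):
        self.name, self.code, self.children = name, code, []

    def add(self, name):
        """Attach a sub-function; codes follow the A0 / A1 / A11 convention."""
        base = "" if self.code == "A0" else self.code[1:]  # strip leading 'A'
        child = Function(name, f"A{base}{len(self.children) + 1}")
        self.children.append(child)
        return child

    def walk(self):
        """Yield (code, name) for this node and all descendants, top-down."""
        yield self.code, self.name
        for child in self.children:
            yield from child.walk()


# Hypothetical fragment of a CHS function hierarchy
root = Function("Community health service information")
reg = root.add("Resident registration")      # coded A1
reg.add("Create health record")              # coded A11
```

Walking the tree then enumerates every function with its hierarchical code, which is the property an IDEF0 decomposition relies on.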
THERMODYNAMIC MODEL OF GAS HYDRATES
Недоступ, В. И.; Недоступ, О. В.
2015-01-01
Interest in gas hydrates has grown in recent years, making reliable computational and theoretical methods for determining their properties necessary. A thermodynamic model of gas hydrates is described in which the central place is occupied by the behaviour of the guest molecule in its cell. Equations for the interaction of the hydrate-forming gas molecule with the cell are derived, and the enthalpy and the energy of escape of the molecule from a cell are determined. The equation for the calculation of thermody...
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence onto the higher-dimensional and the multi-species cases remain essentially open.
Streamline Your Project: A Lifecycle Model.
Viren, John
2000-01-01
Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…
Extendable linearised adjustment model for deformation analysis
Hiddo Velsink
2015-01-01
Author supplied: "This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices
Extendable linearised adjustment model for deformation analysis
Velsink, H.
2015-01-01
This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices and correlation
A Formal Model for Context-Awareness
DEFF Research Database (Denmark)
Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan
There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The Conawa calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new construc...
[Bone remodeling and modeling/mini-modeling].
Hasegawa, Tomoka; Amizuka, Norio
Modeling, which adapts structures to loading by changing bone size and shape, often takes place in bone at the fetal and developmental stages, while bone remodeling, the replacement of old bone with new bone, is predominant in the adult stage. Modeling can be divided into macro-modeling (macroscopic modeling) and mini-modeling (microscopic modeling). In the cellular process of mini-modeling, unlike bone remodeling, bone lining cells, i.e., resting flattened osteoblasts covering bone surfaces, become active osteoblasts and then deposit new bone onto the old bone without mediating osteoclastic bone resorption. Among the drugs for osteoporosis treatment, eldecalcitol (a vitamin D3 analog) and teriparatide (human PTH[1-34]) can show mini-modeling based bone formation. Histologically, mature, active osteoblasts are localized on the new bone induced by mini-modeling; however, only a few cell layers of preosteoblasts form over the newly formed bone, and accordingly, few osteoclasts are present in the region of mini-modeling. In this review, the histological characteristics of bone remodeling and modeling, including mini-modeling, are introduced.
A Model of Trusted Measurement Model
Ma Zhili; Wang Zhihao; Dai Liang; Zhu Xiaoqin
2017-01-01
A model of trusted measurement supporting behavior measurement based on the trusted connection architecture (TCA), with three entities and three levels, is proposed, and a frame to illustrate the model is given. The model synthesizes three trusted measurement dimensions, namely trusted identity, trusted status and trusted behavior, satisfies the essential requirements of trusted measurement, and unifies the TCA with three entities and three levels.
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question
Macroeconomic models and energy transition
International Nuclear Information System (INIS)
Douillard, Pierre; Le Hir, Boris; Epaulard, Anne
2016-02-01
As a new policy for energy transition has just been adopted, several questions emerge about the best way to reduce CO 2 emissions, about policies which enable this reduction, and about their costs and opportunities. This note discusses the contribution macro-economic models may have in this respect, notably in the definition of policies which trigger behaviour changes, and those which support energy transition. The authors first discuss the stakes of the assessment of energy transition, and then describe macro-economic models which can be used for such an assessment, give and comment some results of simulations performed for France by using four of these models (Mesange, Numesis, ThreeME, and Imaclim-R France). The authors finally draw lessons about the way to use these models and to interpret their results within the frame of energy transition
Model for paramagnetic Fermi systems
International Nuclear Information System (INIS)
Ainsworth, T.L.; Bedell, K.S.; Brown, G.E.; Quader, K.F.
1983-01-01
We develop a model for paramagnetic Fermi liquids. This model has both direct and induced interactions, the latter including both density-density and current-current response. The direct interactions are chosen to reproduce the Fermi liquid parameters F_0^s, F_0^a and F_1^s and to satisfy the forward scattering sum rule. The parameters F_1^a and F_l^{s,a} for l>1 are determined self-consistently by the induced interactions; they are checked against experimental determinations. The model is applied in detail to liquid 3He, using data from spin-echo experiments, sound attenuation, and the velocities of first and zero sound. Consistency with experiments gives definite preferences for values of m. The model is also applied to paramagnetic metals. Arguments are given that this model should provide a basis for calculating effects of magnetic fields.
International Symposia on Scale Modeling
Ito, Akihiko; Nakamura, Yuji; Kuwana, Kazunori
2015-01-01
This volume thoroughly covers scale modeling and serves as the definitive source of information on scale modeling as a powerful simplifying and clarifying tool used by scientists and engineers across many disciplines. The book elucidates techniques used when it would be too expensive, or too difficult, to test a system of interest in the field. Topics addressed in the current edition include scale modeling to study weather systems, diffusion of pollution in air or water, chemical process in 3-D turbulent flow, multiphase combustion, flame propagation, biological systems, behavior of materials at nano- and micro-scales, and many more. This is an ideal book for students, both graduate and undergraduate, as well as engineers and scientists interested in the latest developments in scale modeling. This book also: Enables readers to evaluate essential and salient aspects of profoundly complex systems, mechanisms, and phenomena at scale Offers engineers and designers a new point of view, liberating creative and inno...
Network model of security system
Directory of Open Access Journals (Sweden)
Adamczyk Piotr
2016-01-01
Full Text Available The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the use of an algorithm describing the process of identifying assets, vulnerabilities and threats in a given context. The aim of the article is to show how this algorithm reduces the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result, it was possible to build a network model corresponding to reality.
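The pruning step the abstract describes, eliminating components that have no links to any other component, can be sketched as follows. The component and link names are hypothetical; the article's actual algorithm and asset model are not reproduced here.

```python
def prune_isolated(components, links):
    """Return only the components that appear in at least one link.

    Components with no links cannot contribute to the network model,
    so dropping them reduces the size of the problem.
    """
    connected = {c for link in links for c in link}
    return [c for c in components if c in connected]


# Illustrative asset list and link set for a toy risk analysis
components = ["server", "firewall", "printer", "workstation"]
links = [("server", "firewall"), ("workstation", "firewall")]

# "printer" has no links, so it is eliminated from the base model
reduced = prune_isolated(components, links)
```

The reduced component list is then the starting point for building the network model used in the risk analysis.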
A Method for Model Checking Feature Interactions
DEFF Research Database (Denmark)
Pedersen, Thomas; Le Guilly, Thibaut; Ravn, Anders Peter
2015-01-01
This paper presents a method to check for feature interactions in a system assembled from independently developed concurrent processes as found in many reactive systems. The method combines and refines existing definitions and adds a set of activities. The activities describe how to populate the definitions with models to ensure that all interactions are captured. The method is illustrated on a home automation example with model checking as analysis tool. In particular, the modelling formalism is timed automata and the analysis uses UPPAAL to find interactions.
BUDAYA TRI HITA KARANA DALAM MODEL UTAUT
Directory of Open Access Journals (Sweden)
Dodik Ariyanto
2017-08-01
Full Text Available Abstract: Tri Hita Karana Culture in the UTAUT Model. This study explores the definitions and question indicators that represent Tri Hita Karana (THK) culture in the Unified Theory of Acceptance and Use of Technology (UTAUT) model. The study uses a literature study (to derive definitions) and a field test (to validate them). It finds Social Factor Culture (FSB) as a new indicator in the UTAUT model. FSB is defined as individuals' perceptions of what is considered important in the adoption, utilization, and use of Accounting Information Systems. FSB is influenced by important people around the individual, individual thinking, and the level of spirituality.
DEFF Research Database (Denmark)
Ivang, Reimer; Hinson, Robert; Somasundaram, Ramanathan
2006-01-01
Purpose: Seeks to argue that there are problems associated with e-market definitive efforts and consequently proposes a new e-market model. Design/Methodology/Approach: Paper based largely on a literature survey and an assessment of the existing e-market conceptualizations. Findings: Based on the literature survey and identification of gaps in the present e-market definitive models, the authors postulate a preliminary e-market reference model. Originality/Value: Through synthesizing the e-market literature, and by taking into account contemporary e-market developments, key dimensions that define...
Tavasszy, L.A.; Jong, G. de
2014-01-01
Freight Transport Modelling is a unique new reference book that provides insight into the state-of-the-art of freight modelling. Focusing on models used to support public transport policy analysis, Freight Transport Modelling systematically introduces the latest freight transport modelling
Semantic Business Process Modeling
Markovic, Ivan
2010-01-01
This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.
DEFF Research Database (Denmark)
Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik
1997-01-01
This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application...
DEFF Research Database (Denmark)
Könemann, Patrick
just contain a list of strings, one for each line, whereas the structure of models is defined by their meta models. There are tools available which are able to compute the diff between two models, e.g. RSA or EMF Compare. However, their diff is not model-independent, i.e. it refers to the models...
Haiganoush Preisler; Alan Ager
2013-01-01
For applied mathematicians forest fire models refer mainly to a non-linear dynamic system often used to simulate spread of fire. For forest managers forest fire models may pertain to any of the three phases of fire management: prefire planning (fire risk models), fire suppression (fire behavior models), and postfire evaluation (fire effects and economic models). In...
Tree-Structured Digital Organisms Model
Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo
Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model may not be the only way to describe a digital organism, though it is very simple for a computer-based model. Thus we propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. With our model, a life process is a combination of various functions, as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through mutual interaction of functions. We verified by simulation that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.
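A minimal sketch of the tree idea, assuming nothing about the authors' implementation: an organism is a tree of functions rather than a linear code sequence, and its "life process" is evaluated by composing child results, as in genetic programming. Node and operator names here are illustrative.

```python
import operator

# The function set available to organisms (illustrative)
OPS = {"add": operator.add, "mul": operator.mul}

def evaluate(node, env):
    """Evaluate an organism tree.

    Leaves are numbers or variable names looked up in env; inner nodes
    are (op, child, child, ...) tuples that fold OPS[op] over the
    evaluated children.
    """
    if isinstance(node, (int, float)):
        return node
    if isinstance(node, str):
        return env[node]
    op, *children = node
    results = [evaluate(c, env) for c in children]
    value = results[0]
    for r in results[1:]:
        value = OPS[op](value, r)
    return value


# An organism computing x * (x + 1): a tree, not a flat instruction list
organism = ("mul", "x", ("add", "x", 1))
```

Because sub-trees are themselves valid organisms, mutation and crossover can swap whole functional units, which is what makes the hierarchical structure convenient for evolutionary computation.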
Environmental Satellite Models for a Macroeconomic Model
International Nuclear Information System (INIS)
Moeller, F.; Grinderslev, D.; Werner, M.
2003-01-01
To support national environmental policy, it is desirable to forecast and analyse environmental indicators consistently with economic variables. However, environmental indicators are physical measures linked to physical activities that are not specified in economic models. One way to deal with this is to develop environmental satellite models linked to economic models. The system of models presented gives a frame of reference where emissions of greenhouse gases, acid gases, and leaching of nutrients to the aquatic environment are analysed in line with - and consistently with - macroeconomic variables. This paper gives an overview of the data and the satellite models. Finally, the results of applying the model system to calculate the impacts on emissions and the economy are reviewed in a few illustrative examples. The models have been developed for Denmark; however, most of the environmental data used are from the CORINAIR system implemented in numerous countries
Geologic Framework Model Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Geologic Framework Model Analysis Model Report
International Nuclear Information System (INIS)
Clayton, R.
2000-01-01
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and
Natural climate variability in a coupled model
International Nuclear Information System (INIS)
Zebiak, S.E.; Cane, M.A.
1990-01-01
Multi-century simulations with a simplified coupled ocean-atmosphere model are described. These simulations reveal an impressive range of variability on decadal and longer time scales, in addition to the dominant interannual El Niño/Southern Oscillation signal that the model was originally designed to simulate. Based on a very large sample of century-long simulations, it is nonetheless possible to identify distinct model parameter sensitivities, which are described here in terms of selected indices. Preliminary experiments motivated by general circulation model results for increasing greenhouse gases suggest a definite sensitivity to model global warming. While these results are not definitive, they strongly suggest that coupled air-sea dynamics figure prominently in global change and must be included in models for reliable predictions.
DEFF Research Database (Denmark)
De Giovanni, Domenico
2010-01-01
prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...
DEFF Research Database (Denmark)
De Giovanni, Domenico
prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...
DEFF Research Database (Denmark)
Silvennoinen, Annastiina; Teräsvirta, Timo
This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...
Collaborative networks: Reference modeling
Camarinha-Matos, L.M.; Afsarmanesh, H.
2008-01-01
Collaborative Networks: Reference Modeling works to establish a theoretical foundation for Collaborative Networks. Particular emphasis is put on modeling multiple facets of collaborative networks and establishing a comprehensive modeling framework that captures and structures diverse perspectives of
DEFF Research Database (Denmark)
Juhl, Joakim
This thesis is about mathematical modelling and technology development. While mathematical modelling has become widely deployed within a broad range of scientific practices, it has also gained a central position within technology development. The intersection of mathematical modelling and technol...
D'Souza, Austin
2013-01-01
Presentation given on 13 May 2013 at the meeting "Business Model Canvas Challenge Assen".
The Business Model Canvas was designed by Alex Osterwalder. The model is very clearly organized and consists of nine building blocks.
CSIR Research Space (South Africa)
Osburn, L
2010-01-01
Full Text Available The construction industry has turned to energy modelling in order to assist it in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...
Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...
Mathematical Modeling Using MATLAB
National Research Council Canada - National Science Library
Phillips, Donovan
1998-01-01
.... Mathematical Modeling Using MATLAB acts as a companion resource to A First Course in Mathematical Modeling, with the goal of guiding the reader to a fuller understanding of the modeling process...
Modelling and evaluation of surgical performance using hidden Markov models.
Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo
2006-10-01
Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
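A toy sketch of the scoring idea under stated assumptions: training and topology selection are omitted, a fixed discrete HMM with made-up parameters stands in for the paper's "expert model", and a quantised gesture sequence is scored by the forward algorithm's log-likelihood, higher meaning closer to expert motion. The paper itself uses continuous kinematic data; everything below is illustrative.

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log P(obs | model) for a discrete HMM via the forward algorithm.

    obs   : sequence of symbol indices
    start : initial state probabilities
    trans : trans[i][j] = P(state j | state i)
    emit  : emit[i][k]  = P(symbol k | state i)
    """
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [
            sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
            for s in range(n)
        ]
    return math.log(sum(alpha))


# Hypothetical 2-state "expert" model over two quantised motion symbols
start = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit = [[0.9, 0.1], [0.2, 0.8]]

smooth_like = forward_loglik([0, 0, 1], start, trans, emit)  # expert-like gesture
jerky_like = forward_loglik([1, 0, 1], start, trans, emit)   # atypical gesture
```

Comparing log-likelihoods under the expert model is the discrimination step; in practice the model would first be trained on expert trajectories (e.g. by Baum-Welch), which is not shown here.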
Analytic Modeling of Insurgencies
2014-08-01
Keywords: Counterinsurgency, Situational Awareness, Civilians, Lanchester. 1. Introduction. Combat modeling is one of the oldest areas of operations research, dating... Army. The ground-breaking work of Lanchester in 1916 [1] marks the beginning of formal models of conflict, where mathematical formulas and, later... Warfare model [3], which is a Lanchester-based mathematical model (see more details about this model later on), and McCormick's Magic Diamond model [4
Computational neurogenetic modeling
Benuskova, Lubica
2010-01-01
Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol
Designing a Sustainable Future with Mental Models
Bernotat, Anke; Bertling, Jürgen; English, Christiane; Schanz, Judith
2017-01-01
Inspired by the question of the Club of Rome as to whether Design could help translate the ubiquitous knowledge on sustainability into daily practice, and by Peter Senge's belief that mental models are a limiting factor in the implementation of systemic insight (Senge 2006), we explored working with mental models as a sustainable design tool. We propose a definition for design uses. At the 7th Sustainable Summer School we collected general unsustainable mental models and "designed" sustainable ones. These me...
Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...
Finch, W Holmes; Kelley, Ken
2014-01-01
A powerful tool for analyzing nested designs in a variety of fields, multilevel/hierarchical modeling allows researchers to account for data collected at multiple levels. Multilevel Modeling Using R provides you with a helpful guide to conducting multilevel data modeling using the R software environment.After reviewing standard linear models, the authors present the basics of multilevel models and explain how to fit these models using R. They then show how to employ multilevel modeling with longitudinal data and demonstrate the valuable graphical options in R. The book also describes models fo
Cosmological models without singularities
International Nuclear Information System (INIS)
Petry, W.
1981-01-01
A previously studied theory of gravitation in flat space-time is applied to homogeneous and isotropic cosmological models. There exist two different classes of models without singularities: (i) ever-expanding models, (ii) oscillating models. The first class contains models with a hot big bang. For these models there exist at the beginning of the universe, in contrast to Einstein's theory, very high but finite densities of matter and radiation, with a big bang of very short duration. After a short time these models pass into the homogeneous and isotropic models of Einstein's theory with spatial curvature equal to zero and cosmological constant Λ ≥ 0. (author)
National Aeronautics and Space Administration — CLAIRE MONTELEONI*, GAVIN SCHMIDT, AND SHAILESH SAROHA* Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...
4K Video Traffic Prediction using Seasonal Autoregressive Modeling
Directory of Open Access Journals (Sweden)
D. R. Marković
2017-06-01
Full Text Available From the perspective of the average viewer, high definition video streams such as HD (High Definition) and UHD (Ultra HD) are increasing their internet presence year over year. This is not surprising, given the expansion of HD streaming services such as YouTube, Netflix etc. High definition video streams are therefore starting to challenge network resource allocation with their bandwidth requirements and statistical characteristics. Analysis and modeling of this demanding video traffic is essential for better quality of service and experience support. In this paper we use an easy-to-apply statistical model for prediction of 4K video traffic. Namely, seasonal autoregressive modeling is applied to the prediction of 4K video traffic encoded with HEVC (High Efficiency Video Coding). Analysis and modeling were performed within the R programming environment using over 17,000 high definition video frames. It is shown that the proposed methodology provides good accuracy in high definition video traffic modeling.
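The paper's analysis is done in R with full seasonal ARIMA machinery on real HEVC traces; as a hedged illustration only, the Python sketch below fits a single seasonal-lag regression to a synthetic frame-size series with a period-4, GOP-like pattern, showing how a seasonal lag captures the periodic structure that makes such traffic predictable.

```python
def fit_seasonal_ar(x, s):
    """Least-squares fit of x[t] = a * x[t-s] + b (one seasonal lag)."""
    pairs = [(x[t - s], x[t]) for t in range(s, len(x))]
    n = len(pairs)
    mx = sum(p for p, _ in pairs) / n
    my = sum(q for _, q in pairs) / n
    cov = sum((p - mx) * (q - my) for p, q in pairs)
    var = sum((p - mx) ** 2 for p, _ in pairs)
    a = cov / var
    return a, my - a * mx


# Synthetic frame sizes with a period-4 pattern (large I-frame, small P-frames)
traffic = [100, 40, 42, 41, 101, 39, 43, 40, 99, 41, 42, 42]

a, b = fit_seasonal_ar(traffic, s=4)
next_frame = a * traffic[-4] + b  # one-step-ahead seasonal prediction
```

Since the last observed frame four steps back is an I-frame, the prediction lands near the large-frame level; a real SARIMA model would add non-seasonal AR/MA terms and differencing on top of this seasonal component.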
ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT
International Nuclear Information System (INIS)
Clinton Lum
2002-01-01
The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) conversion of the input data (laboratory-measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) re-sampling and merging of data sets; (3) development of geostatistical simulations of porosity; (4
Directory of Open Access Journals (Sweden)
Nadina Yedid
2013-12-01
The article reviews the findings and opinions of theorists and researchers who have dedicated themselves to analyzing the phenomenon of folksonomies. We highlight the main definitions of the folksonomy concept, its characteristics, the different types of existing folksonomies, and the differences from traditional indexing models based on controlled vocabularies, analyzing the advantages and disadvantages of this new model. We conclude that folksonomies can offer great advantages in information retrieval, especially if they are used to complement indexing with controlled vocabularies.
A Comparative of business process modelling techniques
Tangkawarow, I. R. H. T.; Waworuntu, J.
2016-04-01
In this era there are many business process modelling techniques. This article reports research on the differences between business process modelling techniques. For each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented for Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.
Traceability in Model-Based Testing
Directory of Open Access Journals (Sweden)
Mathew George
2012-11-01
The growing complexity of software and the demand for shorter time to market are two important challenges facing today's IT industry. These challenges demand an increase in both the productivity and the quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models help to navigate from one model to another, and to trace back to the respective requirements and the design model when a test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose a relation definition markup language (RDML) for defining the relationships between models.
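The navigation described above, from a failing test back to the design element and the requirement, can be pictured with plain data structures. The identifiers below (TC-01, REQ-7, ClassOrder) are invented for illustration and do not come from RDML:

```python
# Trace links between test, design, and requirement models (hypothetical names).
links = {
    "TC-01": {"design": "ClassOrder", "requirement": "REQ-7"},
    "TC-02": {"design": "ClassInvoice", "requirement": "REQ-9"},
}

def trace_back(failed_test):
    """When a test fails, navigate back to its design element and requirement."""
    rel = links[failed_test]
    return (rel["design"], rel["requirement"])

print(trace_back("TC-01"))  # -> ('ClassOrder', 'REQ-7')
```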
Minimum-complexity helicopter simulation math model
Heffley, Robert K.; Mnich, Marc A.
1988-01-01
An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.
Integrated Site Model Process Model Report
International Nuclear Information System (INIS)
Booth, T.
2000-01-01
The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM
ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL
Климак, М.С.; Войтко, С.В.
2016-01-01
Theoretical and applied aspects of the development of simulation models to predict the optimal development of production systems that create tangible products and services are considered. It is argued that the process of inventory control needs economic and mathematical modeling in view of the complexity of theoretical studies. A simulation model of stock control is presented that allows management decisions to be made with production logistics.
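The kind of stock-control simulation the abstract alludes to can be sketched as a periodic-review (s, S) policy. The demand distribution, parameter values, and instantaneous replenishment below are illustrative assumptions, not the authors' model:

```python
import random

def simulate_sS(s, S, periods, seed=1):
    """Simulate a periodic-review (s, S) stock-control policy (illustrative).
    When on-hand stock falls to s or below, order up to S."""
    random.seed(seed)
    stock, lost, orders = S, 0, 0
    for _ in range(periods):
        demand = random.randint(0, 9)     # hypothetical per-period demand
        sold = min(stock, demand)
        lost += demand - sold             # unmet demand is lost
        stock -= sold
        if stock <= s:
            orders += 1
            stock = S                     # instantaneous replenishment for simplicity
    return lost, orders

print(simulate_sS(s=10, S=40, periods=1000))
```

Because the reorder point (10) exceeds the maximum per-period demand (9) and replenishment is instantaneous, this particular policy never loses a sale; varying s and S trades ordering frequency against stockout risk.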
Modelling bankruptcy prediction models in Slovak companies
Directory of Open Access Journals (Sweden)
Kovacova Maria
2017-01-01
An intensive research effort from academics and practitioners has addressed models for bankruptcy prediction and credit risk management. In spite of numerous studies forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend towards machine learning models (support vector machines, bagging, boosting, and random forest) to predict bankruptcy one year prior to the event. Comparing the performance of these unconventional approaches with results obtained by discriminant analysis, logistic regression, and neural network applications, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that all prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of old and well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability in specific conditions. Furthermore, these models will be modelled according to new trends by calculating the influence of the elimination of selected variables on their overall prediction ability.
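As a toy counterpart to the statistical techniques mentioned (logistic regression in particular), the following sketch trains a two-feature logistic classifier by gradient ascent on invented "liquidity/debt" data; it is not any of the Slovak-company models discussed:

```python
import math

def train_logreg(data, lr=0.1, epochs=2000):
    """Toy logistic-regression bankruptcy classifier (illustrative only).
    Each row: (liquidity_ratio, debt_ratio, bankrupt 0/1) -- made-up features."""
    w = [0.0, 0.0, 0.0]                      # bias, liquidity, debt weights
    for _ in range(epochs):
        for x1, x2, y in data:
            z = w[0] + w[1] * x1 + w[2] * x2
            p = 1.0 / (1.0 + math.exp(-z))   # predicted bankruptcy probability
            for i, xi in enumerate((1.0, x1, x2)):
                w[i] += lr * (y - p) * xi    # gradient ascent on log-likelihood
    return w

def predict(w, x1, x2):
    return 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0

# Hypothetical training set: low liquidity + high debt -> bankrupt (y = 1).
data = [(1.8, 0.3, 0), (2.1, 0.4, 0), (1.5, 0.5, 0),
        (0.4, 0.9, 1), (0.6, 0.8, 1), (0.5, 1.1, 1)]
w = train_logreg(data)
print([predict(w, x1, x2) for x1, x2, _ in data])  # -> [0, 0, 0, 1, 1, 1]
```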
Better models are more effectively connected models
Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John
2016-04-01
The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. The discussion will focus on the different approaches through which connectivity
Model Validation in Ontology Based Transformations
Directory of Open Access Journals (Sweden)
Jesús M. Almendros-Jiménez
2012-10-01
Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
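The soundness idea, syntactic plus semantic requirements on source and target models, can be miniaturised as follows. The ER-to-relational mapping, the dictionary encoding, and the pre/post predicates are invented Python stand-ins for the paper's logic-programming machinery:

```python
# A tiny ER model (invented example): entities with attributes, plus relationships.
er = {"entities": {"Student": ["id", "name"], "Course": ["code", "title"]},
      "relationships": [("Student", "Course", "enrolled")]}

def pre(m):
    """Source-model requirement: every relationship links known entities."""
    return all(a in m["entities"] and b in m["entities"]
               for a, b, _ in m["relationships"])

def transform(m):
    """ER -> RM: entities become tables; each relationship becomes a join table."""
    tables = {e: list(attrs) for e, attrs in m["entities"].items()}
    for a, b, name in m["relationships"]:
        tables[name] = [a.lower() + "_id", b.lower() + "_id"]
    return tables

def post(tables):
    """Target-model requirement: every table has at least one column."""
    return all(cols for cols in tables.values())

assert pre(er)          # validate the source model before transforming
rm = transform(er)
assert post(rm)         # validate the target model after transforming
print(sorted(rm))       # -> ['Course', 'Student', 'enrolled']
```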
Preliminary Multi-Variable Parametric Cost Model for Space Telescopes
Stahl, H. Philip; Hendrichs, Todd
2010-01-01
This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. It discusses the methodology for collecting the data, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.
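A single-variable cost model of the kind mentioned is often a power law fitted in log-log space. The sketch below recovers a·x^b from synthetic data; the "aperture vs. cost" numbers are made up, not the presentation's data:

```python
import math

def fit_power_law(points):
    """Fit cost = a * x**b by linear least squares in log-log space (sketch).
    points: (parameter value, cost) pairs -- hypothetical data."""
    logs = [(math.log(x), math.log(c)) for x, c in points]
    n = len(logs)
    sx = sum(u for u, _ in logs)
    sy = sum(v for _, v in logs)
    sxx = sum(u * u for u, _ in logs)
    sxy = sum(u * v for u, v in logs)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope = exponent
    a = math.exp((sy - b * sx) / n)                 # intercept = scale factor
    return a, b

# Synthetic "aperture vs. cost" data generated from cost = 2 * d**1.5.
data = [(d, 2 * d**1.5) for d in (0.5, 1.0, 2.0, 4.0, 8.0)]
a, b = fit_power_law(data)
print(round(a, 3), round(b, 3))  # -> 2.0 1.5
```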
Generalized latent variable modeling multilevel, longitudinal, and structural equation models
Skrondal, Anders; Rabe-Hesketh, Sophia
2004-01-01
This book unifies and extends latent variable models, including multilevel or generalized linear mixed models, longitudinal or panel models, item response or factor models, latent class or finite mixture models, and structural equation models.
Target-Centric Network Modeling
DEFF Research Database (Denmark)
Mitchell, Dr. William L.; Clark, Dr. Robert M.
In Target-Centric Network Modeling: Case Studies in Analyzing Complex Intelligence Issues, authors Robert Clark and William Mitchell take an entirely new approach to teaching intelligence analysis. Unlike any other book on the market, it offers case study scenarios using actual intelligence reporting formats, along with a tested process that facilitates the production of a wide range of analytical products for civilian, military, and hybrid intelligence environments. Readers will learn how to perform the specific actions of problem definition modeling, target network modeling, and collaborative sharing in the process of creating a high-quality, actionable intelligence product. The case studies reflect the complexity of twenty-first century intelligence issues by dealing with multi-layered target networks that cut across political, economic, social, technological, and military issues…
Business process modeling in healthcare.
Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd
2012-01-01
The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.
Mathematical modelling in solid mechanics
Sofonea, Mircea; Steigmann, David
2017-01-01
This book presents new research results in multidisciplinary fields of mathematical and numerical modelling in mechanics. The chapters treat the following topics: mathematical modelling in solid, fluid and contact mechanics; nonconvex variational analysis with emphasis on nonlinear solid and structural mechanics; numerical modelling of problems with non-smooth constitutive laws, approximation of variational and hemivariational inequalities, numerical analysis of discrete schemes, numerical methods and the corresponding algorithms, and applications to mechanical engineering; numerical aspects of non-smooth mechanics, with emphasis on developing accurate and reliable computational tools; mechanics of fibre-reinforced materials; behaviour of elasto-plastic materials accounting for microstructural defects; definition of structural defects based on differential geometry concepts or on an atomistic basis; interaction between phase transformation and dislocations at nano-scale; energetic arguments; bifurcation and post-buckling a...
Energy Technology Data Exchange (ETDEWEB)
D. W. Wu
2003-07-16
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Energy Technology Data Exchange (ETDEWEB)
M. A. Wasiolek
2003-10-27
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
International Nuclear Information System (INIS)
D. W. Wu
2003-01-01
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)
Rahmani, Fouad Lazhar
2010-11-01
The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling, as reported by Isham [6].
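A minimal instance of such transmission models is the textbook SIR system integrated with forward Euler. The parameter values below are illustrative, and this is not the specific HIV/AIDS model of the paper:

```python
def simulate_sir(beta, gamma, s0, i0, days, dt=0.1):
    """Forward-Euler integration of a basic SIR-type transmission model
    (an illustrative textbook model with made-up parameters).
    s, i are the susceptible and infectious population fractions."""
    s, i = s0, i0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i            # mass-action incidence
        s += dt * (-new_inf)              # susceptibles become infected
        i += dt * (new_inf - gamma * i)   # infecteds recover at rate gamma
    return s, i

# Basic reproduction number R0 = beta/gamma = 5: a large epidemic burns through.
s, i = simulate_sir(beta=0.5, gamma=0.1, s0=0.99, i0=0.01, days=100)
print(round(s, 3), round(i, 3))
```

With R0 = 5 the epidemic dies out only after most of the susceptible fraction has been depleted, matching the classical final-size behaviour.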
DEFF Research Database (Denmark)
Ayres, Phil
2012-01-01
This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...
Wenger, Trey V.; Kepley, Amanda K.; Balser, Dana S.
2017-07-01
HII Region Models fits HII region models to observed radio recombination line and radio continuum data. The algorithm includes the calculations of departure coefficients to correct for non-LTE effects. HII Region Models has been used to model star formation in the nucleus of IC 342.
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
A lumped-parameter model represents the frequency-dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models, and Advanced lumped-parameter models.
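For the simplest spring-damper-mass member of this model family, the frequency-dependent impedance can be evaluated directly. The formula K(ω) = k − mω² + icω is the standard single-degree-of-freedom form, and the coefficients below are invented, not the report's calibrated models:

```python
import cmath, math

def dynamic_stiffness(k, c, m, omega):
    """Impedance K(w) = k - m*w^2 + i*c*w of a spring-damper-mass
    lumped-parameter model (standard SDOF form, illustrative only)."""
    return complex(k - m * omega**2, c * omega)

k, c, m = 1.0e8, 2.0e6, 5.0e4          # hypothetical stiffness, damping, mass
wn = math.sqrt(k / m)                  # undamped natural frequency [rad/s]
# At the natural frequency the elastic and inertial terms cancel,
# so the response is governed purely by damping.
print(round(wn, 2), abs(dynamic_stiffness(k, c, m, wn)))
```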
DEFF Research Database (Denmark)
Larsen, Bjarke Alexander; Andkjær, Kasper Ingdahl; Schoenau-Fog, Henrik
2015-01-01
This paper proposes a new relation model, called "The Moody Mask model", for Interactive Digital Storytelling (IDS), based on Francesco Osborne's "Mask Model" from 2011. This, mixed with some elements from Chris Crawford's Personality Models, is a system designed for dynamic interaction between ch...
Efficient polarimetric BRDF model.
Renhorn, Ingmar G E; Hallberg, Tomas; Boreman, Glenn D
2015-11-30
The purpose of the present manuscript is to present a polarimetric bidirectional reflectance distribution function (BRDF) model suitable for hyperspectral and polarimetric signature modelling. The model is based on a further development of a previously published four-parameter model that has been generalized in order to account for different types of surface structures (generalized Gaussian distribution). A generalization of the Lambertian diffuse model is presented. The pBRDF functions are normalized using numerical integration. Using directional-hemispherical reflectance (DHR) measurements, three of the four basic parameters can be determined for any wavelength. This considerably simplifies the development of multispectral polarimetric BRDF applications. The scattering parameter has to be determined from at least one BRDF measurement. The model deals with linearly polarized radiation; as with e.g. the facet model, depolarization is not included. The model is very general and can inherently model extreme surfaces such as mirrors and Lambertian surfaces. The complex mixture of sources is described by the sum of two basic models, a generalized Gaussian/Fresnel model and a generalized Lambertian model. Although the physics-inspired model has some ad hoc features, the predictive power of the model is impressive over a wide range of angles and scattering magnitudes. The model has been applied successfully to painted surfaces, both dull and glossy, and also to metallic bead-blasted surfaces. The simple and efficient model should be attractive for polarimetric simulations and polarimetric remote sensing.
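As a crude scalar caricature of a specular-plus-diffuse reflectance model with a generalized-Gaussian lobe, one might write the following. Every parameter name and value is invented, Schlick's approximation stands in for the full Fresnel term, and this is not the published four-parameter pBRDF:

```python
import math

def brdf(theta_i, theta_r, rho_d=0.2, k_s=0.5, s=0.15, alpha=1.5, f0=0.04):
    """Crude unpolarized specular-plus-diffuse sketch: a generalized-Gaussian
    lobe around the specular direction plus a Lambertian floor.
    All parameters are illustrative assumptions, not the paper's model."""
    off = theta_r - theta_i                                   # in-plane offset from specular
    fresnel = f0 + (1 - f0) * (1 - math.cos(theta_i)) ** 5    # Schlick approximation
    lobe = math.exp(-abs(math.tan(off) / s) ** alpha)         # generalized Gaussian
    return k_s * fresnel * lobe + rho_d / math.pi             # specular + diffuse

# The specular peak at theta_r = theta_i dominates the off-specular wing.
peak = brdf(math.radians(40), math.radians(40))
wing = brdf(math.radians(40), math.radians(70))
print(peak > wing)  # -> True
```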
Fundamental concepts of modeling
International Nuclear Information System (INIS)
Garland, W.J.
1990-01-01
This paper addresses the roles of simulation in science and engineering: the extended calculator, the prototyper and the intuition generation medium. Science and engineering involve thought processes which transcend the rational. Simulation has emerged as a third wing of science that is orthogonal to experimentation and theory. In the pedestrian role of the extended calculator, simulation provides a numerical bridge between symbolic theory and hard experimental data. In this role, discovery has been assisted. But simulation has proved to be more than a super calculator. The nuclear industry and computational fluid dynamics are but two examples of areas that use simulation to replace experimentation (prototyping) for cost and danger reasons. Further, there is an emerging role of graphics and artificial intelligence in the discovery process. Simulation is clearly becoming not only a tool that reduces the tedium, but one that enhances the creative process. The paper looks at thermalhydraulics and considers the emerging trends. A general modelling scheme is proposed and a systems view is used to suggest the criteria for optimum simulation models for the working environment. The basic theme proposed is to reduce the proportion of rational mental time we spend on perspiration so as to allow more time for non-rational inspiration. Our QUEST then is the Quintessential Eureka Stimulator. The QUEST involves the use of existing tools on the microcomputer to enhance problem setup and post-run analysis. The key to successful questing lies, however, in the development of a simulation environment that is seamless through the whole of the process, from problem definition to presentation of results. The framework for this mythical environment is introduced.
Multivariable Wind Modeling in State Space
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Pedersen, B. J.
2011-01-01
Turbulence of the incoming wind field is of paramount importance to the dynamic response of wind turbines. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical … Since the succeeding state space and ARMA modeling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem with the non-positive definiteness of such matrices is at first addressed and suitable treatments regarding it are proposed. From the adjusted positive definite cross… … for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modeling method…
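The positive-definiteness repair mentioned above can be illustrated on a real symmetric 2×2 matrix by clipping negative eigenvalues to zero. The full method operates on complex cross-spectral matrices frequency by frequency, so this pure-Python sketch shows only the core idea:

```python
import math

def nearest_psd_2x2(a, b, c):
    """Clip negative eigenvalues of the symmetric matrix [[a, b], [b, c]] to
    zero, an elementary illustration of repairing a non-positive-definite
    (cross-)spectral matrix."""
    if abs(b) < 1e-12:                             # already diagonal
        return [[max(a, 0.0), 0.0], [0.0, max(c, 0.0)]]
    mean, half = (a + c) / 2.0, (a - c) / 2.0
    r = math.hypot(half, b)
    out = [[0.0, 0.0], [0.0, 0.0]]
    for lam in (mean + r, mean - r):               # the two eigenvalues
        vx, vy = b, lam - a                        # an eigenvector (b != 0 here)
        n = math.hypot(vx, vy)
        vx, vy = vx / n, vy / n
        w = max(lam, 0.0)                          # clip a negative eigenvalue
        out[0][0] += w * vx * vx                   # rebuild w * v v^T
        out[0][1] += w * vx * vy
        out[1][1] += w * vy * vy
    out[1][0] = out[0][1]
    return out

m = nearest_psd_2x2(1.0, 2.0, 1.0)    # eigenvalues 3 and -1 -> the -1 is clipped
print(m)
```

The result is the closest positive semidefinite matrix in the Frobenius sense for this simple symmetric case.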
International Nuclear Information System (INIS)
Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.
1994-05-01
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.
International Nuclear Information System (INIS)
Ogava, S.; Savada, S.; Nakagava, M.
1983-01-01
Composite models of hadrons are considered. The main attention is paid to the Sakata model. In the framework of the model it is presupposed that the proton, the neutron and the Λ particle are the fundamental particles. Theoretical studies of unknown fundamental constituents of a substance have led to the creation of the quark model. In the framework of the quark model, using the theory of SU(6) symmetry, the classification of mesons and baryons is considered. Using the quark model, relations between hadron masses, their spins and their electromagnetic properties are explained. The problem of the three-colour model with many flavours is briefly presented.
Modeller af komplicerede systemer
DEFF Research Database (Denmark)
Mortensen, J.
This thesis, "Modeller af komplicerede systemer", represents part of the requirements for the Danish Ph.D. degree. Assisting professor John Nørgaard-Nielsen, M.Sc.E.E., Ph.D. has been principal supervisor and professor Morten Lind, M.Sc.E.E., Ph.D. has been assisting supervisor. The thesis is concerned with conceptual modeling in relation to process control. Its purpose is to present, classify and exemplify the use of a set of qualitative model types. Such model types are useful in the early phase of modeling, where no structured methods are at hand. Although the models are general in character, this thesis emphasizes their use in relation to technical systems. All the presented models, with the exception of the types presented in chapter 2, are non-theoretical non-formal conceptual network models. Two new model types are presented: 1) The System-Environment model, which describes the environment's interaction…
Molenaar, Peter C M
2017-01-01
Equivalences of two classes of dynamic models for weakly stationary multivariate time series are discussed: dynamic factor models and autoregressive models. It is shown that exploratory dynamic factor models can be rotated, yielding an infinite set of equivalent solutions for any observed series. It also is shown that dynamic factor models with lagged factor loadings are not equivalent to the currently popular state-space models, and that restriction of attention to the latter type of models may yield invalid results. The known equivalent vector autoregressive model types, standard and structural, are given a new interpretation in which they are conceived of as the extremes of an innovating type of hybrid vector autoregressive models. It is shown that consideration of hybrid models solves many problems, in particular with Granger causality testing.
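The Granger-causality testing mentioned at the end of this abstract reduces to comparing a restricted regression (lags of the target series only) with an unrestricted one (adding lags of the candidate cause) via an F-statistic. A minimal sketch on simulated data follows; all parameter values and the one-lag setup are assumptions for illustration, not taken from Molenaar's paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1) in which x1 drives x2 but not vice versa.
T = 500
x = np.zeros((T, 2))
for t in range(1, T):
    x[t, 0] = 0.5 * x[t-1, 0] + rng.normal()
    x[t, 1] = 0.4 * x[t-1, 1] + 0.6 * x[t-1, 0] + rng.normal()

def granger_f(y, z, p=1):
    """F-test: do p lags of z help predict y beyond p lags of y?"""
    Y = y[p:]
    n = len(Y)
    Xr = np.column_stack([np.ones(n), y[:-p]])   # restricted: own lags only
    Xu = np.column_stack([Xr, z[:-p]])           # unrestricted: add lags of z
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0])**2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    q = Xu.shape[1] - Xr.shape[1]                # number of restrictions
    return ((rss_r - rss_u) / q) / (rss_u / (n - Xu.shape[1]))

f_12 = granger_f(x[:, 1], x[:, 0])   # x1 -> x2: expected large
f_21 = granger_f(x[:, 0], x[:, 1])   # x2 -> x1: expected near 1
```

With the simulated coupling, `f_12` comes out far above any conventional critical value while `f_21` does not, matching the data-generating structure.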
DEFF Research Database (Denmark)
Justesen, Lise; Overgaard, Svend Skafte
2017-01-01
This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices, as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open-ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored...
Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P
2008-01-01
Contents include: Introduction and Examples (Introduction; Examples of data sets); Basic Model Fitting (Introduction; Maximum-likelihood estimation for a geometric model; Maximum-likelihood for the beta-geometric model; Modelling polyspermy; Which model?; What is a model for?; Mechanistic models); Function Optimisation (Introduction; MATLAB: graphs and finite differences; Deterministic search methods; Stochastic search methods; Accuracy and a hybrid approach); Basic Likelihood Tools (Introduction; Estimating standard errors and correlations; Looking at surfaces: profile log-likelihoods; Confidence regions from profiles; Hypothesis testing in model selection; Score and Wald tests; Classical goodness of fit; Model selection bias); General Principles (Introduction; Parameterisation; Parameter redundancy; Boundary estimates; Regression and influence; The EM algorithm; Alternative methods of model fitting; Non-regular problems); Simulation Techniques (Introduction; Simulating random variables; Integral estimation; Verification; Monte Carlo inference; Estimating sampling distributi...
International Nuclear Information System (INIS)
Ahlers, C.F.; Liu, H.H.
2001-01-01
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the AMR Development Plan for U0035 Calibrated Properties Model REV00 (CRWMS M and O 1999c). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models, as well as models used by Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and to predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
International Nuclear Information System (INIS)
Ahlers, C.; Liu, H.
2000-01-01
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the "AMR Development Plan for U0035 Calibrated Properties Model REV00". These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models, as well as models used by Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and to predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
Business Models and Business Model Innovation
DEFF Research Database (Denmark)
Foss, Nicolai J.; Saebi, Tina
2018-01-01
While research on business models and business model innovation continues to exhibit growth, the field is still, even after more than two decades of research, characterized by a striking lack of cumulative theorizing and an opportunistic borrowing of more or less related ideas from neighbouring...
Wake modelling combining mesoscale and microscale models
DEFF Research Database (Denmark)
Badger, Jake; Volker, Patrick; Prospathospoulos, J.
2013-01-01
In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake paramet...
Energy Technology Data Exchange (ETDEWEB)
Rubio, F. Javier; Siddiqui, Afzal S.; Marnay, Chris; Hamachi,Kristina S.
2000-03-01
This effort represents a contribution to the wider distributed energy resources (DER) research of the Consortium for Electric Reliability Technology Solutions (CERTS, http://certs.lbl.gov) that is intended to attack and, hopefully, resolve the technical barriers to DER adoption, particularly those that are unlikely to be of high priority to individual equipment vendors. The longer term goal of the Berkeley Lab effort is to guide the wider technical research towards the key technical problems by forecasting some likely patterns of DER adoption. In sharp contrast to traditional electricity utility planning, this work takes a customer-centric approach and focuses on DER adoption decision making at, what we currently think of as, the customer level. This study reports on Berkeley Lab's second year effort (completed in Federal fiscal year 2000, FY00) of a project aimed to anticipate patterns of customer adoption of distributed energy resources (DER). Marnay, et al., 2000 describes the earlier FY99 Berkeley Lab work. The results presented herein are not intended to represent definitive economic analyses of possible DER projects by any means. The paucity of data available and the importance of excluded factors, such as environmental implications, are simply too important to make such an analysis possible at this time. Rather, the work presented represents a demonstration of the current model and an indicator of the potential to conduct more relevant studies in the future.
Introduction to Adjoint Models
Errico, Ronald M.
2015-01-01
In this lecture, some fundamentals of adjoint models will be described. This includes a basic derivation of tangent linear and corresponding adjoint models from a parent nonlinear model, the interpretation of adjoint-derived sensitivity fields, a description of methods of automatic differentiation, and the use of adjoint models to solve various optimization problems, including singular vectors. Concluding remarks will attempt to correct common misconceptions about adjoint models and their utilization.
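The derivation of a tangent linear model and its adjoint from a parent nonlinear model can be sketched in a few lines. The toy two-variable map below is an illustrative assumption, not an example from the lecture; the point is the adjoint identity ⟨M'(x)dx, y⟩ = ⟨dx, M'(x)ᵀy⟩, which any correct tangent/adjoint pair must satisfy:

```python
import numpy as np

# Parent nonlinear model: a simple two-variable map M(x) (assumed example).
def M(x):
    return np.array([x[0] * x[1], x[0] + np.sin(x[1])])

def jac(x):
    # Jacobian of M at x.
    return np.array([[x[1], x[0]],
                     [1.0,  np.cos(x[1])]])

def tangent(x, dx):
    # Tangent linear model: Jacobian applied to a perturbation dx.
    return jac(x) @ dx

def adjoint(x, y):
    # Adjoint model: transposed Jacobian applied to a sensitivity y.
    return jac(x).T @ y

x0 = np.array([1.2, 0.7])
dx = np.array([1e-3, -2e-3])   # small perturbation
y = np.array([0.5, -1.0])      # arbitrary sensitivity vector

# Adjoint identity: <M'(x0) dx, y> == <dx, M'(x0)^T y> (to rounding error).
lhs = tangent(x0, dx) @ y
rhs = dx @ adjoint(x0, y)
```

The tangent linear output also agrees with the finite difference M(x0 + dx) − M(x0) up to second-order terms, which is the standard sanity check before using the adjoint for sensitivity analysis.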
Zagorsek, Branislav
2013-01-01
A business model describes the company's most important activities, its proposed value, and the compensation for that value. Business model visualization makes it possible to simply and systematically capture and describe the most important components of the business model, while the standardization of the concept allows comparison between companies. There are several ways to visualize a business model. The aim of this paper is to describe the options for business model visualization and business mod...
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre
2005-01-01
...parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the NB (naive Bayes) model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....
Geochemistry Model Validation Report: External Accumulation Model
International Nuclear Information System (INIS)
Zarrabi, K.
2001-01-01
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation
Ferrofluids: Modeling, numerical analysis, and scientific computation
Tomas, Ignacio
This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to Rosensweig's much more complex model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for Rosensweig's model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
Pavement Aging Model by Response Surface Modeling
Directory of Open Access Journals (Sweden)
Manzano-Ramírez A.
2011-10-01
Full Text Available In this work, surface course aging was modeled by Response Surface Methodology (RSM). The Marshall specimens were placed in a conventional oven for time and temperature conditions established on the basis of the environmental factors of the region where the surface course is constructed with AC-20 from the Ing. Antonio M. Amor refinery. Volatilized material (VM), load resistance increment (ΔL) and flow resistance increment (ΔF) models were developed by the RSM. Cylindrical specimens with real aging were extracted from the surface course pilot to evaluate the error of the models. The VM model was adequate; in contrast, the ΔL and ΔF models were almost adequate, with an error of 20%, which was associated with the other environmental factors that were not considered at the beginning of the research.
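Fitting a second-order response surface of the kind RSM produces amounts to a least-squares fit of a quadratic polynomial in the design factors. The sketch below uses invented oven time/temperature data and coefficients, not the study's measurements; it only illustrates the mechanics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical aging data: oven time t (h), temperature T (deg C), and a
# volatilized-material response vm (%). All values are illustrative only.
t = rng.uniform(2, 48, size=30)
T = rng.uniform(60, 135, size=30)
vm = 0.2 + 0.05 * t + 0.01 * T + 0.0004 * t * T + rng.normal(scale=0.05, size=30)

# Second-order response surface:
# VM = b0 + b1*t + b2*T + b3*t^2 + b4*T^2 + b5*t*T
X = np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])
beta, *_ = np.linalg.lstsq(X, vm, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((vm - pred)**2) / np.sum((vm - vm.mean())**2)
```

The fitted surface can then be evaluated at untried time/temperature settings, which is how RSM models are used to interpolate aging behavior within the design region.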
Modelling of an homogeneous equilibrium mixture model
International Nuclear Information System (INIS)
Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Ghidaglia, J.M.
2014-01-01
We present here a model for two phase flows which is simpler than the 6-equations models (with two densities, two velocities, two temperatures) but more accurate than the standard mixture models with 4 equations (with two densities, one velocity and one temperature). We are interested in the case when the two-phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture Model (HEM) that we present is dealing with both mixture and relative quantities, allowing in particular to follow both a mixture velocity and a relative velocity. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)
Wind Shear Modeling for Aircraft Hazard Definition.
1978-02-01
should give a valid representation of most terminal areas. For airports located near unusual terrain features such as mountains or cliffs...
Wind Shear Modeling for Aircraft Hazard Definition
1977-03-01
Fichtl, "Rough to Smooth Transition of an Equilibrium Neutral Constant Stress Layer," NASA TM X-3322 (1975). Geiger, Rudolf, The Climate Near the... Roy Steiner and K. G. Pratt, "Dynamic Response of Airplanes to Atmospheric Turbulence Including Flight Data on Input and Response," NASA TR R-199
Establishing model credibility involves more than validation
International Nuclear Information System (INIS)
Kirchner, T.
1991-01-01
One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the test of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)
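The quantitative performance test described above typically reduces to a few summary statistics comparing predictions against independent observations. A minimal sketch on synthetic data (not drawn from this or any cited study; the bias and noise levels are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical validation data: independent field observations and the
# corresponding model predictions, here simulated with a slight model bias.
obs = rng.normal(loc=10.0, scale=2.0, size=50)
pred = obs + rng.normal(loc=0.3, scale=1.0, size=50)

bias = np.mean(pred - obs)                     # systematic over/under-prediction
rmse = np.sqrt(np.mean((pred - obs)**2))       # overall prediction error
corr = np.corrcoef(pred, obs)[0, 1]            # pattern agreement
```

As the abstract cautions, good values of such statistics support but do not by themselves establish credibility; the structural validity of the model's postulates must be judged separately.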
Hébert, Hélène; Abadie, Stéphane; Benoit, Michel; Créach, Ronan; Frère, Antoine; Gailler, Audrey; Garzaglia, Sébastien; Hayashi, Yutaka; Loevenbruck, Anne; Macary, Olivier; Marcer, Richard; Morichon, Denis; Pedreros, Rodrigo; Rebour, Vincent; Ricchiuto, Mario; Silva Jacinto, Ricardo; Terrier, Monique; Toucanne, Samuel; Traversa, Paola; Violeau, Damien
2014-05-01
TANDEM (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) is a French research project dedicated to the appraisal of coastal effects due to tsunami waves on the French coastlines, with a special focus on the Atlantic and Channel coastlines, where French civil nuclear facilities have been operating for about 30 years. This project aims at drawing conclusions from the 2011 catastrophic tsunami and will allow, together with a Japanese research partner, the design, adaptation and validation of numerical methods of tsunami hazard assessment, using the outstanding database of the 2011 tsunami. The validated methods will then be applied to estimate, as accurately as possible, the tsunami hazard for the French Atlantic and Channel coastlines, in order to provide guidance for risk assessment on the nuclear facilities. The project TANDEM follows the recommendations of the International Atomic Energy Agency (IAEA) to analyse the tsunami exposure of the nuclear facilities, as well as the recommendations of the French Nuclear Safety Authority (Autorité de Sûreté Nucléaire, ASN) in the aftermath of the 2011 catastrophe, which required the licensees of nuclear facilities to conduct complementary safety assessments (CSA), also including "the robustness beyond their design basis". The tsunami hazard deserves an appraisal in the light of the 2011 catastrophe, to check whether any unforeseen tsunami impact can be expected for these facilities. TANDEM aims at defining the tsunami effects expected for the French Atlantic and Channel coastlines, essentially by means of numerical modeling, through adaptation and improvement of numerical methods, in order to study tsunami impacts down to the interaction with coastal structures (thus sometimes using 3D approaches) (WP1). The methods will then be tested to better characterize and quantify the associated uncertainties (in the source, the propagation and the coastal impact) (WP2). The project will
High resolution extremity CT for biomechanics modeling
International Nuclear Information System (INIS)
Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.
1995-01-01
With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high resolution data set for use in biomechanics modeling
High resolution extremity CT for biomechanics modeling
Energy Technology Data Exchange (ETDEWEB)
Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.
1995-09-23
With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high resolution data set for use in biomechanics modeling.
Model Validation Status Review
International Nuclear Information System (INIS)
E.L. Hardin
2001-01-01
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Modeling for Battery Prognostics
Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick
2017-01-01
For any battery-powered vehicles (be they unmanned aerial vehicles, small passenger aircraft, or assets in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health as well as performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanisms has been advancing. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles. Electrical circuit equivalent models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model has its advantages and disadvantages. The former type of model has the advantage of being computationally efficient, but has limited accuracy and robustness due to the approximations used in the developed model, and as a result of such approximations cannot represent aging well. The latter type of model has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures effects of aging, and is computationally efficient
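A minimal sketch of the electrical-circuit-equivalent class of empirical models mentioned above: a Thevenin-style model with an open-circuit-voltage source, a series resistance, and one RC pair for transient response. All parameter values and the OCV curve are illustrative assumptions, not fitted cell data:

```python
# Thevenin-style equivalent-circuit battery model (illustrative parameters).
capacity_As = 2.0 * 3600           # assumed 2 Ah cell capacity, in A*s
R0, R1, C1 = 0.05, 0.02, 1000.0    # series resistance and RC pair (assumed)
dt, I = 1.0, 2.0                   # time step (s), constant discharge current (A)

def ocv(soc):
    # Assumed monotone open-circuit-voltage curve: 3.2 V empty, 4.1 V full.
    return 3.2 + 0.9 * soc

soc, v_rc = 1.0, 0.0
voltages = []
while soc > 0.05:                  # march until near end of discharge
    # RC branch: dv/dt = I/C1 - v/(R1*C1); v relaxes toward I*R1.
    v_rc += dt * (I / C1 - v_rc / (R1 * C1))
    voltages.append(ocv(soc) - I * R0 - v_rc)   # terminal voltage
    soc -= I * dt / capacity_As    # coulomb counting
```

Such a model is computationally cheap enough for online EOD prediction, but, as the abstract notes, its fixed parameters cannot represent aging unless they are themselves tracked and updated.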
DEFF Research Database (Denmark)
Cameron, Ian T.; Gani, Rafiqul
This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.
DEFF Research Database (Denmark)
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these criteria... the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
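A generic example of a dimension-determination criterion (not necessarily one of the eight treated in this paper): retain enough principal components to explain a fixed fraction of the variance. The synthetic data, rank, and 95% threshold below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data with a known low-rank structure plus noise.
n, p, true_rank = 200, 10, 3
scores = rng.normal(size=(n, true_rank)) * np.array([5.0, 3.0, 2.0])
loadings = rng.normal(size=(true_rank, p))
X = scores @ loadings + rng.normal(scale=0.3, size=(n, p))

# Cumulative explained variance from the singular values of centered data.
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
explained = np.cumsum(s**2) / np.sum(s**2)

# Model dimension: smallest number of components explaining >= 95%.
dim = int(np.searchsorted(explained, 0.95) + 1)
```

Variance-based thresholds are only one family of criteria; cross-validation and prediction-error measures often select a different, usually smaller, dimension.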
Model Validation Status Review
Energy Technology Data Exchange (ETDEWEB)
E.L. Hardin
2001-11-28
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Modeling volatility using state space models.
Timmer, J; Weigend, A S
1997-08-01
In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
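The separation of dynamic and observational noise described above can be sketched with a scalar linear state-space model and a Kalman filter: hidden log-volatility follows an AR(1) driven by dynamic noise, and is observed only through added observational noise. The parameters below are assumptions for illustration, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate hidden log-volatility h (AR(1), dynamic noise q) and noisy
# observations y (observational noise r), both variances assumed.
T = 2000
phi, q, r = 0.98, 0.02, 0.5
h = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t-1] + rng.normal(scale=np.sqrt(q))
y = h + rng.normal(scale=np.sqrt(r), size=T)

# Kalman filter for the scalar state-space model.
m, P = 0.0, 1.0
est = np.empty(T)
for t in range(T):
    m_pred, P_pred = phi * m, phi**2 * P + q     # predict
    K = P_pred / (P_pred + r)                    # Kalman gain
    m = m_pred + K * (y[t] - m_pred)             # update
    P = (1 - K) * P_pred
    est[t] = m

err_raw = np.mean((y - h)**2)    # treating observations as the state
err_kf = np.mean((est - h)**2)   # after filtering out observational noise
```

The filtered estimate recovers the hidden state far better than the raw observations, which is why, as the abstract argues, models that ignore observational noise (plain autoregressions on y) badly misestimate the relaxation time of volatility shocks.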
Empirical Model Building Data, Models, and Reality
Thompson, James R
2011-01-01
Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m
Modeling Guru: Knowledge Base for NASA Modelers
Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.
2009-05-01
Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the
Models for Dynamic Applications
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina
2011-01-01
This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor...... be applied to formulate, analyse and solve these dynamic problems and how in the case of the fuel cell problem the model consists of coupled meso and micro scale models. It is shown how data flows are handled between the models and how the solution is obtained within the modelling environment.
Geller, Michael; Telem, Ofri
2015-05-15
We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.
International Nuclear Information System (INIS)
Harvey, M.; Khanna, F.C.
1975-01-01
The general problem of what constitutes a physical model and what is known about the free nucleon-nucleon interaction are considered. A time independent formulation of the basic equations is chosen. Construction of the average field in which particles move in a general independent particle model is developed, concentrating on problems of defining the average spherical single particle field for any given nucleus, and methods for construction of effective residual interactions and other physical operators. Deformed shell models and both spherical and deformed harmonic oscillator models are discussed in detail, and connections between spherical and deformed shell models are analyzed. A section on cluster models is included. 11 tables, 21 figures
Geller, Michael; Telem, Ofri
2015-05-01
We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.
Directory of Open Access Journals (Sweden)
Luiz Carlos Bresser-Pereira
2012-03-01
Full Text Available Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classification that has a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2011-01-01
description of biological phosphorus removal, physical-chemical processes, hydraulics and settling tanks. For attached growth systems, biofilm models have progressed from analytical steady-state models to more complex 2D/3D dynamic numerical models. Plant-wide modeling is set to advance further the practice......The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2008-01-01
description of biological phosphorus removal, physical–chemical processes, hydraulics, and settling tanks. For attached growth systems, biofilm models have progressed from analytical steady-state models to more complex 2-D/3-D dynamic numerical models. Plant-wide modeling is set to advance further......The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...
Microsoft tabular modeling cookbook
Braak, Paul te
2013-01-01
This book follows a cookbook style with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models. This book is designed for developers who wish to develop powerful and dynamic models for users as well as those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling.
Hunt, W. D.; Brennan, K. F.; Summers, C. J.; Yun, Ilgu
1994-01-01
Reliability modeling and parametric yield prediction of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiodes (APDs), which are of interest as an ultra-low noise image capture mechanism for high definition systems, have been investigated. First, the effect of various doping methods on the reliability of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiode (APD) structures fabricated by molecular beam epitaxy is investigated. Reliability is examined by accelerated life tests, monitoring dark current and breakdown voltage. Median device lifetime and the activation energy of the degradation mechanism are computed for undoped, doped-barrier, and doped-well APD structures. Lifetimes for each device structure are examined via a statistically designed experiment. Analysis of variance shows that dark current is affected primarily by device diameter, temperature and stressing time, and breakdown voltage depends on the diameter, stressing time and APD type. It is concluded that the undoped APD has the highest reliability, followed by the doped well and doped barrier devices, respectively. To determine the source of the degradation mechanism for each device structure, failure analysis using the electron-beam induced current method is performed. This analysis reveals some degree of device degradation caused by ionic impurities in the passivation layer, and energy-dispersive spectrometry subsequently verified the presence of ionic sodium as the primary contaminant. However, since all device structures are similarly passivated, sodium contamination alone does not account for the observed variation between the differently doped APDs. This effect is explained by the dopant migration during stressing, which is verified by free carrier concentration measurements using the capacitance-voltage technique.
Flight Dynamic Model Exchange using XML
Jackson, E. Bruce; Hildreth, Bruce L.
2002-01-01
The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
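The exchange scheme described can be sketched with Python's standard `xml.etree` module. The element and attribute names below (`aeroModel`, `variable`, `functionTable`) are illustrative assumptions for this sketch, not the actual schema of the AIAA simulation standard:

```python
import xml.etree.ElementTree as ET

# Build a minimal, hypothetical aerodynamic model description.
# Element names here are assumptions, not the AIAA standard's schema.
model = ET.Element("aeroModel", name="exampleLiftModel")
ET.SubElement(model, "variable", id="alpha", units="rad", role="independent")
ET.SubElement(model, "variable", id="CL", units="nd", role="dependent")
table = ET.SubElement(model, "functionTable", dependent="CL", independent="alpha")
table.text = "0.0 0.0  0.1 0.55  0.2 1.10"  # breakpoint/value pairs

# Serialize for transfer between simulation facilities...
xml_text = ET.tostring(model, encoding="unicode")

# ...and parse the same text back into a model definition on the receiving side.
parsed = ET.fromstring(xml_text)
ids = [v.get("id") for v in parsed.findall("variable")]
```

The point of the standard is that both sides agree on the vocabulary: the receiving facility recovers the independent/dependent variable definitions and function tables without facility-specific conversion code.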
Collision models in quantum optics
Ciccarello, Francesco
2017-12-01
Quantum collision models (CMs) provide advantageous case studies for investigating major issues in open quantum systems theory, and especially quantum non-Markovianity. After reviewing their general definition and distinctive features, we illustrate the emergence of a CM in a familiar quantum optics scenario. This task is carried out by highlighting the close connection between the well-known input-output formalism and CMs. Within this quantum optics framework, usual assumptions in the CMs' literature - such as considering a bath of noninteracting yet initially correlated ancillas - have a clear physical origin.
Serpentinization reaction pathways: implications for modeling approach
Energy Technology Data Exchange (ETDEWEB)
Janecky, D.R.
1986-01-01
Experimental seawater-peridotite reaction pathways to form serpentinites at 300°C, 500 bars, can be accurately modeled using the EQ3/6 codes in conjunction with thermodynamic and kinetic data from the literature and unpublished compilations. These models provide both confirmation of experimental interpretations and more detailed insight into hydrothermal reaction processes within the oceanic crust. The accuracy of these models depends on careful evaluation of the aqueous speciation model, use of mineral compositions that closely reproduce compositions in the experiments, and definition of realistic reactive components in terms of composition, thermodynamic data, and reaction rates.
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
1984-01-01
. The statistical uncertainty due to lack of information can e.g. be taken into account by describing the variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis...... is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
Energy Technology Data Exchange (ETDEWEB)
D.W. Wu; A.J. Smith
2004-11-08
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
International Nuclear Information System (INIS)
D.W. Wu; A.J. Smith
2004-01-01
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)
Modelling of Innovation Diffusion
Directory of Open Access Journals (Sweden)
Arkadiusz Kijek
2010-01-01
Full Text Available Since the publication of the Bass model in 1969, research on the modelling of the diffusion of innovation has resulted in a vast body of scientific literature consisting of articles, books, and studies of real-world applications of this model. The main objective of the diffusion model is to describe the pattern of spread of an innovation among potential adopters in terms of a mathematical function of time. This paper assesses the state of the art in mathematical models of innovation diffusion and procedures for estimating their parameters. Moreover, theoretical issues related to the models presented are supplemented with empirical research. The purpose of the research is to explore the extent to which the diffusion of broadband Internet users in 29 OECD countries can be adequately described by three diffusion models, i.e. the Bass model, the logistic model and the dynamic model. The results are ambiguous: they do not single out one model that best describes the diffusion pattern of broadband Internet users, but in most cases the dynamic model proves inappropriate for describing that pattern. Issues related to the further development of innovation diffusion models are discussed and some recommendations are given. (original abstract)
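The Bass model that anchors this literature has a closed-form cumulative-adoption curve, F(t) = (1 - e^(-(p+q)t)) / (1 + (q/p) e^(-(p+q)t)). A minimal sketch follows; the parameter values are illustrative assumptions, not estimates from the paper's OECD data:

```python
import math

def bass_cumulative(t, p, q, m):
    """Cumulative adopters at time t under the Bass diffusion model.

    p: coefficient of innovation, q: coefficient of imitation,
    m: market potential (total eventual adopters).
    """
    e = math.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

# Illustrative parameter values (assumed for this sketch)
p, q, m = 0.03, 0.38, 1_000_000
adopters = [bass_cumulative(t, p, q, m) for t in range(16)]
```

The innovation coefficient p drives early adoption from external influence, while the imitation coefficient q produces the characteristic S-shaped takeoff through word of mouth.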
Nonlinear Modeling by Assembling Piecewise Linear Models
Yao, Weigang; Liou, Meng-Sing
2013-01-01
To preserve nonlinearity of a full order system over a parameter range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model remains robust and accurate for inputs that differ considerably from the base trajectory in form and magnitude. This modeling preserves nonlinearity of the problems considered in a rather simple and accurate manner.
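The assembly described can be sketched as follows, assuming Gaussian radial basis functions for the weights (the paper's exact basis choice and normalization are not specified here):

```python
import numpy as np

def rbf_weights(x, centers, width):
    """Gaussian radial-basis weights over the sampling states, normalized to sum to one."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * width ** 2))
    return w / w.sum()

def blended_model(x, centers, values, jacobians, width):
    """Blend first-order Taylor expansions about each sampled state."""
    w = rbf_weights(x, centers, width)
    local = np.array([values[i] + jacobians[i] @ (x - centers[i])
                      for i in range(len(centers))])
    return w @ local

# Hypothetical 1-D system sampled at three states
centers = np.array([[0.0], [1.0], [2.0]])
values = np.array([0.0, 2.0, 4.0])      # response at each center
jacobians = [np.array([2.0])] * 3       # local gradient at each center
prediction = blended_model(np.array([0.7]), centers, values, jacobians, width=0.5)
```

Because each local model is a first-order Taylor expansion about its sampling state, the blend reproduces a globally linear response exactly, while the radial-basis weights let it track nonlinear behavior between samples.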
Integrated Medical Model – Chest Injury Model
National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...
Traffic & safety statewide model and GIS modeling.
2012-07-01
Several steps have been taken over the past two years to advance the Utah Department of Transportation (UDOT) safety initiative. Previous research projects began the development of a hierarchical Bayesian model to analyze crashes on Utah roadways. De...
OPEC model : adjustment or new model
International Nuclear Information System (INIS)
Ayoub, A.
1994-01-01
Since the early eighties, the international oil industry went through major changes : new financial markets, reintegration, opening of the upstream, liberalization of investments, privatization. This article provides answers to two major questions : what are the reasons for these changes ? ; do these changes announce the replacement of OPEC model by a new model in which state intervention is weaker and national companies more autonomous. This would imply a profound change of political and institutional systems of oil producing countries. (Author)
Solid Waste Projection Model: Model user's guide
International Nuclear Information System (INIS)
Stiles, D.L.; Crow, V.L.
1990-08-01
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab
Emissions Modeling Clearinghouse
U.S. Environmental Protection Agency — The Emissions Modeling Clearinghouse (EMCH) supports and promotes emissions modeling activities both internal and external to the EPA. Through this site, the EPA...
Radiobiological cell survival models
International Nuclear Information System (INIS)
Zackrisson, B.
1992-01-01
A central issue in clinical radiobiological research is the prediction of responses to different radiation qualities. The choice of cell survival and dose-response model greatly influences the results. In this context the relationship between theory and model is emphasized. Generally, the interpretations of experimental data depend on the model. Cell survival models are systematized with respect to their relations to radiobiological theories of cell kill. The growing knowledge of biological, physical, and chemical mechanisms is reflected in the formulation of new models. The present overview shows that recent modelling has been more oriented towards the stochastic fluctuations connected to radiation energy deposition. This implies that the traditional cell survival models ought to be complemented by models of stochastic energy deposition processes and repair processes at the intracellular level. (orig.)
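As a concrete instance of the cell survival models surveyed, the widely used linear-quadratic model gives the surviving fraction S(D) = exp(-(alpha*D + beta*D^2)). The parameter values below are illustrative assumptions for this sketch, not values taken from the overview:

```python
import math

def lq_survival(dose, alpha, beta):
    """Surviving fraction under the linear-quadratic (LQ) model:
    S(D) = exp(-(alpha*D + beta*D**2)), dose D in Gy."""
    return math.exp(-(alpha * dose + beta * dose ** 2))

# Illustrative (assumed) parameters, alpha/beta = 10 Gy
alpha, beta = 0.3, 0.03
surviving = [lq_survival(d, alpha, beta) for d in (0, 2, 4, 6, 8)]
```

The linear term dominates at low dose and the quadratic term at high dose, which is why the alpha/beta ratio is the quantity usually quoted when comparing tissue responses.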
Pruneau, Diane; Chouinard, Omer; Arsenault, Charline
1998-01-01
Reports on a model of environmental education that aims to encourage greater attachment to the bioregion of Arcadia. The model results from cooperation within a village community and addresses the environmental education of people of all ages. (DDR)
National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...
International Nuclear Information System (INIS)
Pulkkinen, U.
2004-04-01
The report describes a simple comparison of two CCF models, the ECLM and the Beta-model. The objective of the comparison is to identify differences in the results of the models by applying them to some simple test data cases. The comparison focuses mainly on theoretical aspects of the above-mentioned CCF models. The properties of the model parameter estimates in the data cases are also discussed. The practical aspects of using and estimating CCF models in a real PSA context (e.g. data interpretation, properties of computer tools, model documentation) are not discussed in the report. Similarly, the qualitative CCF analyses needed in using the models are not discussed in the report. (au)
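The Beta-model compared in the report follows the standard beta-factor decomposition, in which a fraction beta of each component's total failure rate is attributed to common cause. The numbers below are illustrative, not the report's test data:

```python
def beta_factor_rates(total_rate, beta):
    """Split a component failure rate into independent and common-cause parts
    under the beta-factor model: lambda_ccf = beta * lambda_total."""
    ccf = beta * total_rate
    independent = (1.0 - beta) * total_rate
    return independent, ccf

# Illustrative (assumed) values: total rate 1e-4 per hour, beta = 0.1
lam, beta = 1.0e-4, 0.1
ind, ccf = beta_factor_rates(lam, beta)
```

In a redundant group this single beta parameter forces all components to fail together in a common-cause event, which is exactly the kind of structural simplification a comparison against a richer model such as the ECLM is meant to expose.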
2014-01-01
This study developed a new snow model and a database which warehouses geometric, weather and traffic data on New Jersey highways. The complexity of the model development lies in considering variable road width, different spreading/plowing pattern...
International Nuclear Information System (INIS)
Rahm, L.; Nyberg, L.; Gidhagen, L.
1990-01-01
A dispersion model to be used off coastal waters has been developed. The model has been applied to describe the migration of radionuclides in the Baltic sea. A summary of the results is presented here. (K.A.E)
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Subsequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check practicability (feasibility) of software architecture models.
Laboratory of Biological Modeling
Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Directory of Open Access Journals (Sweden)
Oleg Svatos
2013-01-01
Full Text Available In this paper we analyze the complexity of time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the process modeling languages discussed. The analysis shows that contemporary process modeling languages support capturing of time limits only partially, which causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfying results, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.
Modeling Philosophies and Applications
All models begin with a framework and a set of assumptions and limitations that go along with that framework. In terms of fracing and RA, there are several places where models and parameters must be chosen to complete hazard identification.
Bounding species distribution models
Directory of Open Access Journals (Sweden)
Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS,Jeffrey T. MORISETTE
2011-10-01
Full Text Available Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].
Bounding Species Distribution Models
Stohlgren, Thomas J.; Jarnevich, Cahterine S.; Morisette, Jeffrey T.; Esaias, Wayne E.
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
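The "clamping" alteration the authors test amounts to bounding each predictor at the extremes observed in the training data, so the model never extrapolates beyond its environmental envelope. A minimal sketch with hypothetical temperature and precipitation layers:

```python
import numpy as np

def clamp_predictors(predictors, train_min, train_max):
    """Clamp environmental predictor values to the range seen in training,
    so model extrapolation never runs beyond the data's environmental bounds."""
    return np.clip(predictors, train_min, train_max)

# Hypothetical predictor grid: rows are map cells, columns are
# (temperature in degrees C, precipitation in mm); bounds are assumed values.
grid = np.array([[35.0, 120.0],
                 [5.0, 900.0]])
lo = np.array([10.0, 200.0])   # per-predictor training minima
hi = np.array([30.0, 800.0])   # per-predictor training maxima
bounded = clamp_predictors(grid, lo, hi)
```

Cells outside the training envelope are pulled to the nearest bound before the CART or Maxent model scores them, which is what keeps the resulting habitat map conservative.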
Modelling of wastewater systems
DEFF Research Database (Denmark)
Bechmann, Henrik
to analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events. Furthermore, the model is used to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space models in continuous time......In this thesis, models of pollution fluxes in the inlet to 2 Danish wastewater treatment plants (WWTPs) as well as of suspended solids (SS) concentrations in the aeration tanks of an alternating WWTP and in the effluent from the aeration tanks are developed. The latter model is furthermore used...... at modelling the fluxes in terms of the multiple correlation coefficient R². The model of the SS concentrations in the aeration tanks of an alternating WWTP as well as in the effluent from the aeration tanks is a mass balance model based on measurements of SS in one aeration tank and in the common outlet...
DEFF Research Database (Denmark)
Højsgaard, Søren; Edwards, David; Lauritzen, Steffen
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many...... of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition......, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...
Modeling EERE deployment programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Bennett, Joan
1998-01-01
Recommends the use of a model of DNA made out of Velcro to help students visualize the steps of DNA replication. Includes a materials list, construction directions, and details of the demonstration using the model parts. (DDR)
Modelling arithmetic operations
Energy Technology Data Exchange (ETDEWEB)
Shabanov-Kushnarenk, Yu P
1981-01-01
The possibility of modelling finite alphabetic operators using formal intelligence theory is explored, with models of a 3-digit adder and a multidigit subtractor set up as examples. 2 references.
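The adder example can be made concrete with a short sketch. The following Python code is an illustrative reconstruction only, not the paper's formal-intelligence-theory notation: it models a multidigit decimal adder as a cascade of single-digit operators.

```python
def ripple_add(a_digits, b_digits):
    """Add two equal-length decimal digit lists (most significant digit first),
    modelling a multidigit adder as a cascade of single-digit adder cells."""
    carry = 0
    out = []
    for a, b in zip(reversed(a_digits), reversed(b_digits)):
        s = a + b + carry      # one single-digit adder cell
        out.append(s % 10)     # digit output of the cell
        carry = s // 10        # carry passed to the next cell
    out.append(carry)          # final carry becomes the leading digit
    return list(reversed(out))

print(ripple_add([2, 4, 7], [3, 8, 5]))  # 247 + 385 -> [0, 6, 3, 2], i.e. 0632
```

A multidigit subtractor follows the same pattern with a borrow in place of the carry.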
The CRAFT Fortran Programming Model
Directory of Open Access Journals (Sweden)
Douglas M. Pase
1994-01-01
Full Text Available Many programming models for massively parallel machines exist, and each has its advantages and disadvantages. In this article we present a programming model that combines features from other programming models that (1) can be efficiently implemented on present and future Cray Research massively parallel processor (MPP) systems and (2) are useful in constructing highly parallel programs. The model supports several styles of programming: message-passing, data parallel, global address (shared) data, and work-sharing. These styles may be combined within the same program. The model includes features that allow a user to define a program in terms of the behavior of the system as a whole, where the behavior of individual tasks is implicit from this systemic definition. (In general, features marked as shared are designed to support this perspective.) It also supports an opposite perspective, where a program may be defined in terms of the behaviors of individual tasks, and a program is implicitly the sum of the behaviors of all tasks. (Features marked as private are designed to support this perspective.) Users can exploit any combination of either set of features without ambiguity and thus are free to define a program from whatever perspective is most appropriate to the problem at hand.
Modeling and Simulation for Safeguards
International Nuclear Information System (INIS)
Swinhoe, Martyn T.
2012-01-01
The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R and D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) Calculate amounts of material (plant modeling); (2) Calculate signatures of nuclear material etc. (source terms); and (3) Detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amount of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
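The material-balance step mentioned above can be sketched in a few lines. The identity and the quadrature error propagation below are the standard safeguards bookkeeping; the numbers are purely illustrative and are not taken from FACSIM or any plant model.

```python
import math

def muf(begin_inv, additions, removals, end_inv):
    """Material Unaccounted For: book inventory minus measured ending inventory."""
    return begin_inv + additions - removals - end_inv

def muf_sigma(*sigmas):
    """1-sigma uncertainty of MUF, assuming independent measurement errors
    that combine in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

# Illustrative kg values: a MUF of 1 kg with a 5 kg measurement uncertainty
# would not be a statistically significant indication of diversion.
print(muf(100.0, 50.0, 40.0, 109.0), muf_sigma(3.0, 4.0))
```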
Object-oriented biomedical system modelling--the language.
Hakman, M; Groth, T
1999-11-01
The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions, the language also includes formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input-, output- and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically, the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.
International Nuclear Information System (INIS)
Tashiro, Tohru
2014-01-01
We propose a new model about diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we utilize the model to fit the iPod sales data, and a better agreement is obtained than with the Bass model.
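For reference, the classical Bass model that the abstract extends can be sketched as a plain Euler integration of the standard ODE; the parameter values below are illustrative and are not fitted to the iPod data.

```python
def bass_curve(p, q, m, steps, dt=1.0):
    """Euler integration of the Bass diffusion equation
       dN/dt = (p + q * N / m) * (m - N),
    where p is the innovation rate, q the imitation rate and m the market potential."""
    n, path = 0.0, []
    for _ in range(steps):
        n += (p + q * n / m) * (m - n) * dt
        path.append(n)
    return path

# Cumulative adopters follow the familiar S-shaped curve towards m.
adopters = bass_curve(p=0.03, q=0.38, m=100.0, steps=20)
```

The memory effect proposed in the abstract would replace the constant p and q with terms depending on each non-adopter's past contacts.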
2006-01-01
This is the version 1.1 of the TENCompetence Domain Model (version 1.0 released at 19-6-2006; version 1.1 at 9-11-2008). It contains several files: a) a pdf with the model description, b) three jpg files with class models (also in the pdf), c) a MagicDraw zip file with the model itself, d) a release
Optimization modeling with spreadsheets
Baker, Kenneth R
2015-01-01
An accessible introduction to optimization analysis using spreadsheets Updated and revised, Optimization Modeling with Spreadsheets, Third Edition emphasizes model building skills in optimization analysis. By emphasizing both spreadsheet modeling and optimization tools in the freely available Microsoft® Office Excel® Solver, the book illustrates how to find solutions to real-world optimization problems without needing additional specialized software. The Third Edition includes many practical applications of optimization models as well as a systematic framework that il
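A spreadsheet Solver model always has the same three parts: decision cells, an objective cell and constraint cells. The toy product-mix linear program below (a standard textbook example, not taken from this book) shows the same structure in Python by checking the vertices of the feasible region.

```python
# Maximize 3x + 5y subject to x <= 4, 2y <= 12, 3x + 2y <= 18, x, y >= 0.
# For a two-variable linear program the optimum lies at a vertex of the
# feasible region, so it suffices to evaluate the objective at each vertex.
vertices = [(0, 0), (4, 0), (4, 3), (2, 6), (0, 6)]
best = max(vertices, key=lambda v: 3 * v[0] + 5 * v[1])
print(best, 3 * best[0] + 5 * best[1])  # optimal plan and objective value
```

In a spreadsheet, Solver performs the equivalent search automatically once the decision, objective and constraint cells are declared.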
Model Checking Feature Interactions
DEFF Research Database (Denmark)
Le Guilly, Thibaut; Olsen, Petur; Pedersen, Thomas
2015-01-01
This paper presents an offline approach to analyzing feature interactions in embedded systems. The approach consists of a systematic process to gather the necessary information about system components and their models. The model is first specified in terms of predicates, before being refined to t...... to timed automata. The consistency of the model is verified at different development stages, and the correct linkage between the predicates and their semantic model is checked. The approach is illustrated on a use case from home automation....
International Nuclear Information System (INIS)
Cheney, J.A.
1981-01-01
The problems of satisfying similarity between a physical model and the prototype in rock, wherein fissures and cracks play a role in physical behavior, are explored. The need for models of large physical dimensions is explained, but testing of models of the same prototype over a wide range of scales is also needed to ascertain the influence of a lack of similitude of particular parameters between prototype and model. A large-capacity centrifuge would be useful in that respect
Dorofeenko, Victor; Lee, Gabriel; Salyer, Kevin; Strobel, Johannes
2016-01-01
Within the context of a financial accelerator model, we model time-varying uncertainty (i.e. risk shocks) through the use of a mixture Normal model with time variation in the weights applied to the underlying distributions characterizing entrepreneur productivity. Specifically, we model capital producers (i.e. the entrepreneurs) as either low-risk (relatively small second moment for productivity) or high-risk (relatively large second moment for productivity), and the fraction of both types is...
Tashiro, Tohru
2014-03-01
We propose a new model about diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter met, where (non-)adopters mean people (not) possessing the product. This effect is lacking in the Bass model. As an application, we utilize the model to fit the iPod sales data, and a better agreement is obtained than with the Bass model.
DEFF Research Database (Denmark)
Thoft-Christensen, Palle
Modelling of corrosion cracking of reinforced concrete structures is complicated, as a great number of uncertain factors are involved. To get a reliable modelling, a physical and mechanical understanding of the process behind corrosion is needed.
GARCH Modelling of Cryptocurrencies
Jeffrey Chu; Stephen Chan; Saralees Nadarajah; Joerg Osterrieder
2017-01-01
With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
GARCH Modelling of Cryptocurrencies
Directory of Open Access Journals (Sweden)
Jeffrey Chu
2017-10-01
Full Text Available With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
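For readers unfamiliar with the model class, a GARCH(1,1) return series can be simulated in a few lines. This is an illustrative sketch with arbitrary parameter values, unrelated to the paper's fitted cryptocurrency models.

```python
import math
import random

def simulate_garch11(omega, alpha, beta, n, seed=1):
    """Simulate GARCH(1,1) returns: r_t = sigma_t * z_t with
       sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start from the unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

rets = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=500)
```

Volatility clustering arises because a large return feeds back into the next period's variance through the alpha term; estimation reverses this by maximising the likelihood of the observed returns.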
Artificial neural network modelling
Samarasinghe, Sandhya
2016-01-01
This book covers theoretical aspects as well as recent innovative applications of Artificial Neural networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity 2) Advances in Modelling Biological and Environmental Systems and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling.
Differential models in ecology
International Nuclear Information System (INIS)
Barco Gomez, Carlos; Barco Gomez, German
2002-01-01
Mathematical models written as differential equations are used to describe the population behavior of animal species through time. These models can be linear or nonlinear. The differential models for a single species include the exponential model of Malthus and the logistic model of Verhulst. The linear differential models describing the interaction between two species include competition, predation and symbiosis relationships
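The single-species logistic (Verhulst) model mentioned above can be integrated numerically in a few lines; the parameter values are illustrative.

```python
def verhulst(r, K, x0, steps, dt=0.01):
    """Euler integration of the logistic equation dx/dt = r * x * (1 - x / K),
    where r is the growth rate and K the carrying capacity."""
    x = x0
    for _ in range(steps):
        x += r * x * (1.0 - x / K) * dt
    return x

# A population of 5 grows towards the carrying capacity of 100.
x_final = verhulst(r=0.5, K=100.0, x0=5.0, steps=5000)
```

The Malthus model is the special case obtained by dropping the (1 - x/K) factor, giving unbounded exponential growth.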
Competing through business models
Casadesus-Masanell, Ramon; Ricart, Joan E.
2007-01-01
In this article a business model is defined as the firm's choices on policies, assets and governance structure of those policies and assets, together with their consequences, be they flexible or rigid. We also provide a way to represent such business models to highlight the dynamic loops and to facilitate understanding of interactions with other business models. Furthermore, we develop some tests to evaluate the goodness of a business model both in isolation as well as in interaction with other bus...
Petrone, Giovanni; Spagnuolo, Giovanni
2016-01-01
This comprehensive guide surveys all available models for simulating a photovoltaic (PV) generator at different levels of granularity, from cell to system level, in uniform as well as in mismatched conditions. Providing a thorough comparison among the models, engineers have all the elements needed to choose the right PV array model for specific applications or environmental conditions matched with the model of the electronic circuit used to maximize the PV power production.
Model description and evaluation of model performance: DOSDIM model
International Nuclear Information System (INIS)
Lewyckyj, N.; Zeevaert, T.
1996-01-01
DOSDIM was developed to assess the impact to man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
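The first-order compartment transfers described in the abstract have the generic form sketched below. This two-compartment example is illustrative only; it is not DOSDIM's actual compartment structure or parameter values.

```python
def two_compartment(k12, k21, c1, c2, steps, dt=0.01):
    """Euler integration of first-order transfer between two compartments:
       dc1/dt = -k12 * c1 + k21 * c2
       dc2/dt =  k12 * c1 - k21 * c2
    The symmetric update conserves the total inventory by construction."""
    for _ in range(steps):
        flow = (k12 * c1 - k21 * c2) * dt
        c1, c2 = c1 - flow, c2 + flow
    return c1, c2

# All activity starts in compartment 1 and relaxes towards the equilibrium
# where k12 * c1 = k21 * c2.
c1, c2 = two_compartment(k12=0.3, k21=0.1, c1=1.0, c2=0.0, steps=2000)
```

Equilibrium transfer factors, as used for routine releases, correspond to reading off this steady-state ratio directly instead of integrating the dynamics.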
Modelling MIZ dynamics in a global model
Rynders, Stefanie; Aksenov, Yevgeny; Feltham, Daniel; Nurser, George; Naveira Garabato, Alberto
2016-04-01
Exposure of large, previously ice-covered areas of the Arctic Ocean to the wind and surface ocean waves results in the Arctic pack ice cover becoming more fragmented and mobile, with large regions of ice cover evolving into the Marginal Ice Zone (MIZ). The need for better climate predictions, along with growing economic activity in the Polar Oceans, necessitates climate and forecasting models that can simulate fragmented sea ice with greater fidelity. Current models are not fully fit for the purpose, since they neither model surface ocean waves in the MIZ, nor account for the effect of floe fragmentation on drag, nor include sea ice rheology that represents both the now thinner pack ice and MIZ ice dynamics. All these processes affect the momentum transfer to the ocean. We present initial results from the global ocean model NEMO (Nucleus for European Modelling of the Ocean) coupled to the Los Alamos sea ice model CICE. The model setup implements a novel rheological formulation for sea ice dynamics, accounting for ice floe collisions, thus offering a seamless framework for pack ice and MIZ simulations. The effect of surface waves on ice motion is included through wave pressure and the turbulent kinetic energy of ice floes. In the multidecadal model integrations we examine MIZ and basin scale sea ice and oceanic responses to the changes in ice dynamics. We analyse model sensitivities and attribute them to key sea ice and ocean dynamical mechanisms. The results suggest that the effect of the new ice rheology is confined to the MIZ. However, with the current increase in summer MIZ area, which is projected to continue and may become the dominant type of sea ice in the Arctic, we argue that the effects of the combined sea ice rheology will be noticeable in large areas of the Arctic Ocean, affecting sea ice and ocean. With this study we assert that to make more accurate sea ice predictions in the changing Arctic, models need to include MIZ dynamics and physics.
Towards Clone Detection in UML Domain Models
DEFF Research Database (Denmark)
Störrle, Harald
2013-01-01
Code clones (i.e., duplicate fragments of code) have long been studied, and there is strong evidence that they are a major source of software faults. Anecdotal evidence suggests that this phenomenon occurs similarly in models, suggesting that model clones are as detrimental to model quality...... as they are to code quality. However, programming language code and visual models have significant differences that make it difficult to directly transfer notions and algorithms developed in the code clone arena to model clones. In this article, we develop and propose a definition of the notion of “model clone” based...... we believe that our approach advances the state of the art significantly, it is restricted to UML models, its results leave room for improvements, and there is no validation by field studies....
DEFF Research Database (Denmark)
Andresen, Mette
2007-01-01
-authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students? work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...
Crushed Salt Constitutive Model
International Nuclear Information System (INIS)
Callahan, G.D.
1999-01-01
The constitutive model used to describe the deformation of crushed salt is presented in this report. Two mechanisms -- dislocation creep and grain boundary diffusional pressure solution -- are combined to form the basis for the constitutive model governing the deformation of crushed salt. The constitutive model is generalized to represent three-dimensional states of stress. Upon complete consolidation, the crushed-salt model reproduces the Multimechanism Deformation (M-D) model typically used for the Waste Isolation Pilot Plant (WIPP) host geological formation salt. New shear consolidation tests are combined with an existing database that includes hydrostatic consolidation and shear consolidation tests conducted on WIPP and southeastern New Mexico salt. Nonlinear least-squares model fitting to the database produced two sets of material parameter values for the model -- one for the shear consolidation tests and one for a combination of the shear and hydrostatic consolidation tests. Using the parameter values determined from the fitted database, the constitutive model is validated against constant strain-rate tests. Shaft seal problems are analyzed to demonstrate model-predicted consolidation of the shaft seal crushed-salt component. Based on the fitting statistics, the ability of the model to predict the test data, and the ability of the model to predict load paths and test data outside of the fitted database, the model appears to capture the creep consolidation behavior of crushed salt reasonably well.
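The least-squares fitting step can be illustrated on a much simpler creep law. The sketch below fits a power-law creep rate by linearising in log space; the data are synthetic and the one-term law is far simpler than the M-D model used in the report.

```python
import math

# Synthetic "measurements" generated from rate = A * stress**n with A = 1e-10, n = 5.
stress = [5.0, 10.0, 15.0, 20.0]
xs = [math.log(s) for s in stress]
ys = [math.log(1e-10 * s ** 5) for s in stress]

# Ordinary least squares on the linearised law log(rate) = log(A) + n * log(stress).
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
A = math.exp(my - n * mx)
print(n, A)  # recovers the exponent and prefactor of the synthetic law
```

Fitting the real multi-mechanism model requires iterative nonlinear least squares rather than this one-shot linearisation, but the residual-minimisation idea is the same.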
E. Gregory McPherson; Paula J. Peper
2012-01-01
This paper describes three long-term tree growth studies conducted to evaluate tree performance, because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...
DEFF Research Database (Denmark)
Borlund, Pia
2003-01-01
An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation ...
Bogiages, Christopher A.; Lotter, Christine
2011-01-01
In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…
International Nuclear Information System (INIS)
Zuber, A.
1983-01-01
A review and discussion is given of mathematical models used for interpretation of tracer experiments in hydrology. For the dispersion model, different initial and boundary conditions are related to different injection and detection modes. Examples of applications of various models are described and commented upon. (author)
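As an example of the dispersion-model family discussed above, the one-dimensional advection-dispersion response to an instantaneous unit injection has a closed form. The sketch is an illustrative textbook solution; the velocity, dispersion coefficient and geometry are assumptions, not values from the review.

```python
import math

def tracer_impulse(x, t, v, D):
    """1-D advection-dispersion response to an instantaneous unit injection:
       C(x, t) = exp(-(x - v*t)**2 / (4*D*t)) / sqrt(4*pi*D*t),
    with mean flow velocity v and dispersion coefficient D."""
    return math.exp(-(x - v * t) ** 2 / (4.0 * D * t)) / math.sqrt(4.0 * math.pi * D * t)

# The concentration peak travels with the flow: at t = 10 it sits at x = v*t = 10.
peak = tracer_impulse(x=10.0, t=10.0, v=1.0, D=0.5)
```

Different injection and detection modes change the boundary conditions and hence the exact form of this solution, which is the point made in the abstract.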
Kelderman, Hendrikus
1984-01-01
Existing statistical tests for the fit of the Rasch model have been criticized, because they are only sensitive to specific violations of its assumptions. Contingency table methods using loglinear models have been used to test various psychometric models. In this paper, the assumptions of the Rasch
International Nuclear Information System (INIS)
Thomas, A.W.
1981-01-01
Recent developments in the bag model, in which the constraints of chiral symmetry are explicitly included, are reviewed. The model leads to a new understanding of the Δ-resonance. The connection of the theory with current algebra is clarified and implications of the model for the structure of the nucleon are discussed
Energy Technology Data Exchange (ETDEWEB)
Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology
1996-12-31
In recent years, the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe
The nontopological soliton model
International Nuclear Information System (INIS)
Wilets, L.
1988-01-01
The nontopological soliton model introduced by Friedberg and Lee, and variations of it, provide a method for modeling QCD which can effectively include the dynamics of hadronic collisions as well as spectra. Absolute color confinement is effected by the assumed dielectric properties of the medium. A recently proposed version of the model is chirally invariant. 32 refs., 5 figs., 1 tab
International Nuclear Information System (INIS)
Martin Llorente, F.
1990-01-01
Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, elimination and chemical reactions of atmospheric contaminants. These models operate on contaminant emission data and produce an estimate of air quality in the area. Such models can be applied to several aspects of atmospheric contamination
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes...
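The factorisation that makes Bayesian networks compact can be shown with a minimal two-node example; the network and its numbers are an illustrative sketch, not taken from the cited text.

```python
# Two-node network Rain -> WetGrass: the joint distribution factorises as
# P(rain, wet) = P(rain) * P(wet | rain).
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# Inference by enumeration: P(rain | wet) via Bayes' rule.
p_wet = joint(True, True) + joint(False, True)
p_rain_given_wet = joint(True, True) / p_wet
```

With many variables, the graph's conditional-independence structure keeps each local table small, which is what makes the representation compact.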
Intermittency in branching models
International Nuclear Information System (INIS)
Chiu, C.B.; Texas Univ., Austin; Hwa, R.C.; Oregon Univ., Eugene
1990-01-01
The intermittency properties of three branching models have been investigated. The factorial moments show power-law behavior as function of small rapidity width. The slopes and energy dependences reveal different characteristics of the models. The gluon model has the weakest intermittency. (orig.)
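The scaled factorial moments used in such intermittency analyses can be computed directly from bin counts. The sketch below is illustrative, not the models' actual analysis code; it shows how clustering raises F_2 relative to uniform counts.

```python
def factorial_moment(counts, q):
    """Scaled factorial moment F_q = <n(n-1)...(n-q+1)> / <n>**q,
    averaged over the rapidity bins whose multiplicities are `counts`."""
    def falling(n, k):
        out = 1
        for i in range(k):
            out *= (n - i)
        return out
    mean_fact = sum(falling(n, q) for n in counts) / len(counts)
    mean_n = sum(counts) / len(counts)
    return mean_fact / mean_n ** q

# Uniform bins suppress F_2; clustering the same particles enhances it.
f2_flat = factorial_moment([3, 3, 3, 3], 2)
f2_clustered = factorial_moment([6, 0, 6, 0], 2)
```

Intermittency is diagnosed by computing F_q at successively smaller bin widths and checking for power-law growth of the moments.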
DEFF Research Database (Denmark)
Gudiksen, Sune Klok; Poulsen, Søren Bolvig; Buur, Jacob
2014-01-01
Well-established companies are currently struggling to secure profits due to the pressure from new players' business models as they take advantage of communication technology and new business-model configurations. Because of this, the business model research field is currently flourishing; however, t...
International Nuclear Information System (INIS)
Sazykina, T.G.; Kryshev, I.I.
1996-01-01
The main purpose of the model is a more detailed description of the radionuclide transfer in food chains, including the dynamics in the early period after accidental release. Detailed modelling of the dynamics of radioactive depositions is beyond the purpose of the model. Standard procedures are used for assessing inhalation and external doses. 3 figs, 2 tabs
Fedorov, Alexander
2011-01-01
The author supposed that media education models can be divided into the following groups: (1) educational-information models (the study of the theory, history, language of media culture, etc.), based on the cultural, aesthetic, semiotic, socio-cultural theories of media education; (2) educational-ethical models (the study of moral, religions,…
Energy Technology Data Exchange (ETDEWEB)
Fortelius, C; Holopainen, E; Kaurola, J; Ruosteenoja, K; Raeisaenen, J [Helsinki Univ. (Finland). Dept. of Meteorology
1997-12-31
In recent years, the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe
DEFF Research Database (Denmark)
Andreasen, Martin Møller; Meldrum, Andrew
This paper studies whether dynamic term structure models for US nominal bond yields should enforce the zero lower bound by a quadratic policy rate or a shadow rate specification. We address the question by estimating quadratic term structure models (QTSMs) and shadow rate models with at most four...
Automated Simulation Model Generation
Huang, Y.
2013-01-01
One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and the more widespread use of computer-aided systems, more data has become
Modeling EERE Deployment Programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
Engineering of products and processes is increasingly “model-centric”. Models in their multitudinous forms are ubiquitous, being heavily used for a range of decision making activities across all life cycle phases. This chapter gives an overview of what is a model, the principal activities in the ...